Curl is great because it’s stable and universally available.
But for interactive CLI usage, I always find its command-line options non-intuitive. Take this for example: I don't think I can really remember the difference between `-d`, `--data-urlencode`, and `--url-query` if I don't use them daily. I think that's the price to pay for backward compatibility.
For that reason, I would recommend httpie as a replacement, which provides a more intuitive interface.
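For what it's worth, the semantic difference between those flags is roughly: `-d`/`--data` puts the data in the request body (and implies POST), `--data-urlencode` additionally percent-encodes it, and `--url-query` appends it to the URL's query string instead. A rough Python sketch of the body-vs-query distinction (the hostname and parameter are placeholders, not anything curl-specific):

```python
from urllib.parse import urlencode
from urllib.request import Request

data = {"name": "Dani Ç"}

# Like `curl -d` / `--data-urlencode`: the pair is percent-encoded and
# sent in the request body, which switches the method to POST.
post = Request("https://example.com/api", data=urlencode(data).encode())
print(post.get_method())  # POST
print(post.data)          # b'name=Dani+%C3%87'

# Like `curl --url-query`: the same pair is appended to the URL's
# query string and the request stays a GET.
get = Request("https://example.com/api?" + urlencode(data))
print(get.get_method())   # GET
print(get.full_url)
```

Same key/value pair, two different destinations in the request; that is the whole distinction the flag names are trying (and arguably failing) to communicate.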
When I use curl, it's mostly when I need to check headers or to trace requests on the network, for which I just use -vvv. Simple one-off stuff.
So I wonder how much demand there actually is for advanced features like this.
When I need to make POST requests with queries and data, I'd just use Python, because I most likely need to process the response anyway (parse HTML, store JSON output in a database, etc.).
Adding advanced features is nice, but IMO this might lead to over-engineering and could be hard to maintain down the line.
> Adding advanced features is nice, but IMO this might lead to over-engineering and could be hard to maintain down the line.
curl's not exactly a new project -- this is 'down the line' for it.
I personally welcome the change. I won't use it often, but when I do, I'll be quite happy it's there. There are a number of other features I'd give up in its stead.
More than a few times, I've run into data APIs that return raw text/CSV, especially when taking data from older systems.
Writing a scraper for those, bash and curl come in handy without the extra cruft of file-writing ceremonies (e.g. `with open(…)`); just pipe straight to a file.
I thought this was going to be about base64-encoded JSON query params: instead of POSTing base64 or making a bunch of different params, do a GET with one param of base64-encoded JSON. Easy for the client, easy for the server, easy to share. The only downside I see is that it makes things a bit obtuse for users, but an extension or browser affordance could fix that.
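That scheme is a round-trip of two standard transforms. A minimal sketch (the URL, parameter name `q`, and payload shape are made-up examples, not any particular API; real code would also percent-encode or strip the `=` padding):

```python
import base64
import json

# Hypothetical structured state a client might want to share via URL.
params = {"query": "curl", "page": 3, "tags": ["http", "cli"]}

# Client side: JSON -> base64url -> a single GET query parameter.
token = base64.urlsafe_b64encode(json.dumps(params).encode()).decode()
url = "https://example.com/search?q=" + token

# Server side: reverse the two steps to recover the original structure.
recovered = json.loads(base64.urlsafe_b64decode(url.split("?q=", 1)[1]))
assert recovered == params
```

Using the URL-safe alphabet (`-` and `_` instead of `+` and `/`) is what keeps the token from colliding with URL metacharacters.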
> Easy for the client, easy for the server, easy to share, only downside that I see is that it makes things a bit obtuse for users
There are more downsides:
- processing and creating the payload can be harder, as you're passing plaintext with a ~33% size increase on average
- base64-encoded assets (e.g. SVGs) would cause a huge performance drop in WebKit (this has probably improved since the last time I ran benchmarks)
- since you're encoding the content and not its location, you can't easily update the underlying resource (so to update my version of Pong, I'd need to create a new unique URL); IIRC IPFS suffers (or used to) from a similar issue
- not an issue now, but it still was 1-2 years ago: some browsers would allow only 2 or 4 KB of text in a URL
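The size overhead in the first point is fixed by the encoding itself: base64 maps every 3 input bytes to 4 output characters, so the blow-up is exactly 4/3 (plus up to two padding characters). Quick check:

```python
import base64

# Any 300 bytes of binary data; the content doesn't matter, only the length.
payload = b"\x00" * 300
encoded = base64.b64encode(payload)

# 300 bytes -> 400 characters: a one-third increase.
print(len(payload), len(encoded))  # 300 400
```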
Yes, I think this is how Wordle migrated to the NY Times without losing game history. The original site redirected to nytimes.com/something/wordle?data=X, where X was the game history. Quite clever, I thought.
That does raise an interesting question though. If one wanted to create a command-line tool (or, rather, toolset) as comprehensive as curl, but with a much smaller API surface, how would they go about it? Sure, some of these flags are just shorthands, and some of them could be removed entirely (-o → stdout redirection), but that still leaves a lot of options.
curl inferring the HTTP method from options is just bad design. It should always have been explicit via `-X`, but weirdly enough curl sometimes warns when you use that option.
Also, some curl behaviours seem to live in a past where only the GET and POST verbs existed, ignoring others like PUT and PATCH.
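For reference, my understanding of the inference being complained about (a toy model, not curl's actual source): no body flags means GET, `-d`/`--data` implies POST, `-G` moves the data onto the URL and keeps GET, `-I` means HEAD, and `-X` overrides whatever was inferred:

```python
def inferred_method(data=False, get=False, head=False, explicit=None):
    """Toy model of how curl picks an HTTP method from its flags."""
    if explicit:           # -X <METHOD> always wins
        return explicit
    if head:               # -I / --head
        return "HEAD"
    if data and not get:   # -d implies POST unless -G moves data to the URL
        return "POST"
    return "GET"

assert inferred_method() == "GET"
assert inferred_method(data=True) == "POST"
assert inferred_method(data=True, get=True) == "GET"
assert inferred_method(data=True, explicit="PUT") == "PUT"
```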
pmoriarty | 3 years ago
Here are some org-format notes I took on curl, mostly from the contents of a video on it [1]: [2]
[1] - https://www.youtube.com/watch?v=I6id1Y0YuNk
[2] - https://paste.grml.org/plain/3559
selcuka | 3 years ago
Well, one person [1] has been maintaining it since 1998. It's a good thing that HTTP doesn't change that often.
[1] https://daniel.haxx.se/
nickcw | 3 years ago
And also (yes, rclone has a command to grep its flags: `rclone help flags <regex>`!)
UI_at_80x24 | 3 years ago
This is one seriously underappreciated piece of code, and I am so grateful that it exists.
chrsig | 3 years ago
Backwards compatibility is a wonderful thing, but it leads to anachronisms 20 years later.
`-X` exists for anyone who doesn't want curl's divination.