item 33545786

Append data to the url query

87 points | soheilpro | 3 years ago | daniel.haxx.se | reply

32 comments

[+] blahgeek|3 years ago|reply
Curl is great because it’s stable and universally available.

But for interactive CLI usage, I always find its command-line options unintuitive. Take this for example: I don't think I can really remember the difference between -d, --data-urlencode, and --url-query if I don't use them daily. I think that's the price to pay for backward compatibility.

For that reason, I would recommend httpie as a replacement, which provides a more intuitive interface.
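For anyone else who mixes these up: -d puts the data in the request body (and switches the method to POST), --data-urlencode does the same but URL-encodes the value first, and --url-query URL-encodes the value and appends it to the URL's query string instead. A rough Python sketch of what --url-query does to a URL (hypothetical helper name; note that urlencode encodes spaces as + where curl emits %20):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def append_query(url, params):
    # Roughly what curl's --url-query does: URL-encode each name=value
    # pair and append it to the URL's existing query string.
    parts = urlsplit(url)
    extra = urlencode(params)  # encodes spaces as '+'; curl emits '%20'
    query = f"{parts.query}&{extra}" if parts.query else extra
    return urlunsplit(parts._replace(query=query))

print(append_query("https://example.com/search?lang=en", {"q": "hello world"}))
# https://example.com/search?lang=en&q=hello+world
```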

[+] kh1|3 years ago|reply
When I use curl, it's mostly when I need to check headers or trace requests on the network, for which I just use -vvv. Simple one-off stuff.

So I wonder how much demand there actually is for advanced features like this. When I need to make POST requests with queries and data, I would just use python because I most likely need to process the response (parse html, store json output to database, etc.).

Adding advanced features is nice, but IMO this might lead to over-engineering, and it might be hard to maintain down the line.
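For reference, the parse-and-store step of that workflow really is only a few lines of Python; a minimal sketch, with a made-up JSON body standing in for whatever the request would actually return:

```python
import json
import sqlite3

# Hypothetical response body, standing in for data fetched with curl/requests.
body = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'

rows = json.loads(body)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO items VALUES (:id, :name)", rows)
print(db.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 2
```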

[+] chrsig|3 years ago|reply
> Adding advanced features is nice, but IMO this might lead to over-engineering, and it might be hard to maintain down the line.

curl's not exactly a new project -- this is 'down the line' for it.

I personally welcome the change. I won't use it often, but when I do, I'll be quite happy that it's there. There are a number of other features I'd give up in its stead.

[+] selcuka|3 years ago|reply
> Adding advanced features is nice, but IMO this might lead to over-engineering, and it might be hard to maintain down the line.

Well, one person [1] has been maintaining it since 1998. It's a good thing that HTTP doesn't change that often.

[1] https://daniel.haxx.se/

[+] bfung|3 years ago|reply
More than a few times, I've run into data APIs that return raw text/CSV, especially when pulling data from older systems.

Writing a scraper for those, bash and curl come in handy without the extra cruft of file-writing ceremonies (e.g. with open…); just pipe straight to a file.

[+] csours|3 years ago|reply
I thought this was going to be about base64-encoded JSON query param(s): instead of POSTing the base64 or making a bunch of different params, do a GET with one param of base64-encoded JSON. Easy for the client, easy for the server, easy to share, only downside that I see is that it makes things a bit obtuse for users, but an extension or browser affordance could fix that.
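A quick sketch of that pattern (the endpoint and payload here are made up):

```python
import base64
import json
from urllib.parse import quote

# Hypothetical payload: one base64 param instead of many individual ones.
payload = {"filters": {"status": "open"}, "page": 2}
encoded = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
url = f"https://example.com/api?data={quote(encoded)}"
print(url)

# Server side: reverse the two steps.
decoded = json.loads(base64.urlsafe_b64decode(encoded))
assert decoded == payload
```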
[+] rpastuszak|3 years ago|reply
If you're into this stuff, you might enjoy my "Twitter as a CDN tool", here's pong in a single Tweet: https://twitter.com/rafalpast/status/1316836397903474688

> Easy for the client, easy for the server, easy to share, only downside that I see is that it makes things a bit obtuse for users

There are more downsides:

- processing and creating the payload can be harder, as you're passing plaintext with ca. 30% size increase on average

- base64-encoded assets (e.g. SVGs) would cause a huge performance drop in WebKit (this probably has improved since the last time I ran benchmarks).

- since you're encoding content and not its location, you can't easily update the underlying resource (hence to update my version of Pong, I'd need to create a new unique URL). IIRC IPFS suffers (suffered?) from a similar issue

- not an issue now, but it still was 1-2 years ago: some browsers would only allow 2 or 4 KB of text in a URL
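The size overhead is easy to verify: base64 emits 4 output bytes for every 3 input bytes, i.e. ~33% growth, before any URL-escaping on top. A quick check:

```python
import base64

# 3000 input bytes encode to exactly 4000 base64 bytes (4/3 ratio).
raw = b"x" * 3000
enc = base64.b64encode(raw)
print(len(raw), len(enc), round(len(enc) / len(raw), 2))  # 3000 4000 1.33
```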

[+] r_c_a_d|3 years ago|reply
Yes, I think this is how Wordle did the migration to NY Times without losing game history. The original site did a redirect to nytimes.com/something/wordle?data=X where X was the game history. Quite clever, I thought.
[+] jrochkind1|3 years ago|reply
> This is curl’s 249th command line option
[+] nickcw|3 years ago|reply
This made me smile!

And also

    $ rclone help flags . | grep "^ *--"  | wc -l
    665
(yes rclone has a command to grep its flags `rclone help flags <regex>`!)
[+] ainar-g|3 years ago|reply
Almost a quarter of a thousand. Yay!

That does raise an interesting question though. If one wanted to create a command-line tool (or, rather, toolset) as comprehensive as curl, but with a much smaller API surface, how would they go about it? Sure, some of these flags are just shorthands, and some of them could be removed entirely (-o → stdout redirection), but that still leaves a lot of options.

[+] UI_at_80x24|3 years ago|reply
Wow. Every time I've tried to dig into the man pages to find an option I need, it's been overwhelming; and now I know why!

This is one seriously underappreciated piece of code, and I am so grateful that it exists.

[+] ape4|3 years ago|reply
The article says the new option replaces the -G option. But, of course, for compatibility they cannot remove -G.
[+] pablobaz|3 years ago|reply
I wonder if they'll have a party when they hit 250?
[+] silverwind|3 years ago|reply
curl inferring the HTTP method from options is just bad design. It should have always been explicit via `-X`, but weirdly enough it sometimes warns when using that option.

Also, it seems some curl behaviours live in a past where only the GET and POST verbs existed, ignoring others like PUT and PATCH.

[+] chrsig|3 years ago|reply
They didn't live in the past when implemented :)

Backwards compatibility is a wonderful thing, but leads to anachronisms 20 years later.

`-X` exists for anyone that doesn't want curl's divination