top | item 4261263

HTTPie: A cURL-like tool for humans

312 points | jkbr | 13 years ago | github.com | reply

83 comments

[+] javajosh|13 years ago|reply
This is a nice curl and wget replacement that handles a bunch of modern use-cases without a lot of hard-to-remember command-line flags.

That said, there is a broader problem of "hard-to-remember command-line flags" which I have personally solved using snippet management (I use notational velocity or command history, whichever is handiest).

There is no doubt httpie's interface is a lot better, but it creates another problem (again, which is somewhat universal) of installing, learning and remembering to use a new tool. This is a non-trivial problem that is a key concern for anyone evaluating a new tool, and it's a problem that only really gets solved with ubiquity.

Finally, an observation that so many of our "traditional" command line tools pay no attention to usability because, at least back in the day, the problem they solved was hard. People had a choice: either put up with an (admittedly) bad interface or write their own version in C. The individual cost of learning a bad interface outweighed the cost of rewriting the tool, and so standard tools were born.

And now, decades later, new generations are stuck having to learn needlessly obtuse interfaces to standard tools. We have a situation where newcomers pay the cost of developer UI laziness in perpetuity. This is, of course, a terrible outcome and it's projects like this one that are trying to change it.

So I applaud the effort and hope it catches on and becomes ubiquitous, so I can finally take the curl and wget snippets out of NV.

[+] wheels|13 years ago|reply
> People had a choice: either put up with an (admittedly) bad interface or write their own version in C.

That's totally false. libwww-perl was created 17 years ago and predates curl by a few years. There's nothing new about capable scripting languages and Unix.

[+] chernevik|13 years ago|reply
DELETED

This was a warning of a hazard to navigation, when a more diligent effort to remove the hazard is called for.

[+] fromhet|13 years ago|reply
Everything on HN is just "hey, learn to use this complicated complex thing - in just 30 minutes! You'll be so productive with all your creative startups!"

"Learn VIM essentials in this blog post!" "Never bother reading 'man curl'!" "Learn the basics of C in three easy steps!"

Sometimes HN feels like a lifestyle magazine for people who dream of being PG.

EDIT: Don't get me wrong, I too dream of having the same success as PG. Why else would I be writing here?

[+] dschobel|13 years ago|reply
I'm also not sure what your complaint is. Working at the right level of abstraction is fundamental to being a good engineer.
[+] skeletonjelly|13 years ago|reply
What's your criticism exactly? What kind of content would you personally like to see more of?
[+] slurgfest|13 years ago|reply
"for humans" seems to have no better meaning than "for the OS X sensibility".

In other words: this is a style change rather than a productivity gain. And the superiority of the style is not obvious - unless you just HATE the style of existing tools and need to be set apart.

Most humans don't operate the command line or write scripts to begin with. Those who do can usually handle wget "http://foo/bar". It took me all of a few seconds to start using wget and all of 10 minutes to get access to the fancier features. (But the truth is that a certain level of complexity really just wants a script rather than ad hoc commands.)

So here is a new tool, and it looks nice. But it doesn't at all relieve me from having to learn syntax and conventions - I still have to go to a doc/manpage and read that same kind of technical prose. So the only effective difference is that now I am using different punctuation, like @filename and -b. But the use of this "@" character is not really consistent with anything else.

So the tool is fine and I am sure people will use it but the competitive advantage is incredibly thin and the project smacks of NIH.

If curl and wget are not for humans then what are they for? People who do not have that magical design sensibility. Lame code-monkeys without vision, who are not creative and different. Soulless agents of the man.

This emphasis on branding over substance irks me quite a bit.

[+] javajosh|13 years ago|reply
Gosh, what a horrible comment! The simple fact is that curl and wget's interfaces are bad, and everyone who learns to use them spends that 10 minutes. And then, if you're like me, you respend that 10 minutes every time you need to do something fancy again. If a new tool saves 5 minutes (which this one does), that's 5 minutes saved on every use, which is probably a total of a few hours for me personally and, spread over the population of future users, a few lifetimes.

Rather than admit that even incremental usability improvements are not only useful, but continue to pay dividends long after the tool is produced, you lambast the author and the effort.

Not cool.

P.S. You should watch Bret Victor again, talking about how much easier it is to crush an idea than to support it and nurture it. http://vimeo.com/36579366

[+] azinman2|13 years ago|reply
You're totally off here. curl, wget, etc. are tools that overcomplicate the most common use cases. There are 900 million command-line flags, so figuring out how to do a POST and set a particular header can take 10 minutes. 10 minutes is an awfully long time to spend learning how to perform an HTTP request! At least on the GitHub page I can learn the 5 things I'd actually want to do in about 3 seconds. And it colorizes the JSON output to make it more legible? Sweet. That's called productivity.

Interestingly enough, people seem to search for curl tutorials much more than wget: http://www.google.com/trends/?q=curl+tutorial,+wget+tutorial...

Having something that makes life easier with less complication IS what tools are all about. That's called evolution. Why have the web when you could have gopher? Because it's _better_, even if you can 'accomplish the same things.'
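For a concrete comparison (a sketch; httpbin.org stands in for a real endpoint and the token is made up), here's a JSON POST with a custom header in both tools:

```shell
# curl: the method, the body, and each header need their own flag
curl -X POST \
     -H "Content-Type: application/json" \
     -H "X-API-Token: abc123" \
     -d '{"name": "john"}' \
     http://httpbin.org/post

# HTTPie: key=value pairs become a JSON body, Header:value sets a header
http POST httpbin.org/post name=john X-API-Token:abc123
```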

[+] maratd|13 years ago|reply
> Most humans don't operate the command line or write scripts to begin with.

I'm not a huge fan of cURL, but most people who use cURL don't use the command line either. They use the cURL library and access that functionality through a high level language (PHP, C, C++, whatever).

[+] pqdbr|13 years ago|reply
I couldn't disagree more. I wish I could disagree more, though. Your comment shows a complete disregard for user experience and user interface design, and this disregard is the reason so much stuff simply "sucks" nowadays.
[+] jcromartie|13 years ago|reply
If by "for humans" you mean "for programmers who mostly use JSON". I have to say I'm not sending JSON with cURL too often, compared to any other payload. And when I do send JSON it's more complex than a flat set of keys and values.
[+] jkbr|13 years ago|reply
There is more to it than that. It provides an expressive syntax to construct all sorts of requests. You can submit forms, upload files and set headers without having to use flags. You also get syntax highlighting, ability to pipe in request data, etc.

If you are sending complex JSON, it's probably stored in a file or it's the output of another program:

    http PUT httpbin.org/put @/data/test.json

    http -b localhost:8888/couchdb/ | http PUT httpbin.org/put
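A couple more sketches of the same idea (httpbin.org as a stand-in endpoint; flag names as in the README):

```shell
# set a request header without any flags: Header:value
http httpbin.org/get User-Agent:my-test-client

# send the fields as a URL-encoded form instead of JSON
http --form POST httpbin.org/post name=john
```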
[+] raffi|13 years ago|reply
I think the tool looks useful. I'd appreciate something like this for manually poking web servers when conducting a security assessment.

[Edit: removed HN snarky comment]

[+] mjs|13 years ago|reply
The UI for curl is awful (--request to change the method??) and wget's is only slightly better, but they do have the advantage of ubiquity, and it's often useful to email/Skype complete curl or wget command lines about the place to explain how to use an API, or demonstrate problems.

(e.g. Stripe and others document their API in terms of curl commands: https://stripe.com/docs/api.)

I do wish the curl UI was better, but I can't see it being trivially replaced. (It's a similar issue with git: bad UI, but every git question and answer is described in terms of the CLI, so even if you prefer a GUI client, say, you still need to be able to formulate your problem in CLI terms for anyone on stackoverflow to understand you.)

[+] SoftwareMaven|13 years ago|reply
It is possible for another tool to become ubiquitous (and in this case, that really would be for the best). To get there, though, you need a better tool, so this is a great start toward making that happen.
[+] brown9-2|13 years ago|reply
You can also change the request method with -X.
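The two spellings are interchangeable; --request is just the long form of -X (httpbin.org used as a stand-in endpoint):

```shell
# equivalent ways to change the HTTP method in curl
curl --request DELETE http://httpbin.org/delete
curl -X DELETE http://httpbin.org/delete
```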
[+] jparise|13 years ago|reply
I'd also recommend Curlish (http://packages.python.org/curlish/). It performs nice JSON highlighting and also handles OAuth 2.0 token authentication.

It simply wraps curl(1), so all of the familiar arguments and recipes continue to work just fine, as well.

[+] benatkin|13 years ago|reply
It looks great! I see no reason to slight cURL, though. Its CLI was intended for humans.
[+] jkbr|13 years ago|reply
It's not meant as an insult to cURL. cURL is a great library/tool and supports far more than HTTP, but its command-line interface simply isn't as convenient for common HTTP tasks as it could be. The "for humans" slogan is borrowed from the underlying python-requests library and is meant to communicate that good UX is one of the project's top priorities. Glad you like it!
[+] pkulak|13 years ago|reply
This is a great idea. Sometimes when something is not working, the last thing I want to do is pore through the curl man page before I can even get started figuring it out.
[+] akvlad|13 years ago|reply
I was looking for something like this a while back and found a useful Firefox plugin called Poster (https://addons.mozilla.org/en-us/firefox/addon/poster/). It's useful for testing a RESTful API without writing any front-end code to handle the requests. Basically anything you can do with cURL, just simpler.
[+] barrkel|13 years ago|reply
My main difficulties in using wget are in organizing output location (things like -nH, --cut-dirs, -P), choosing between -nc / -c / default (rename), error / retry policy (-T / -t), logging (-a vs -o), etc.

This tool doesn't really solve any of my actual problems. YMMV. It's less a cURL replacement than a web API invocation tool.
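For reference, the flags listed above combine roughly like this (a sketch; the URL, paths, and retry counts are made up):

```shell
# save under ./mirror (-P), drop the hostname directory (-nH) and two
# leading path components (--cut-dirs), resume partial files (-c),
# retry up to 3 times (-t) with a 10s timeout (-T), append the log (-a)
wget -P mirror -nH --cut-dirs=2 -c -t 3 -T 10 -a fetch.log \
     http://example.org/a/b/c/file.tar.gz
```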

[+] masto|13 years ago|reply
I like it, though I got around curl's painful syntax by wrapping it in a few shell scripts. For example, here's my api_post script (meant to be used like "api_post users/123 first_name=Foo last_name=Bar"; pardon my incompetent shell scripting and the redaction of company internals):

  #!/bin/bash
  
  resource="$1"
  shift
  
  declare -a post
  while [ "$1" ]; do
      post=("${post[@]}" "-F" "$1")
      shift
  done
  
  if [ -z "${API_BASE:=}" ]; then
      API_BASE=http://localhost:3000/api/v1/
      echo "No API_BASE set.  Using $API_BASE."
  fi
  
  verbose=""
  [ -n "$API_VERBOSE" ] && verbose="-v"
  
  if [ -z "${API_COOKIES:=}" ]; then
      cookies=~/.api.cookies
  else
      cookies="$API_COOKIES"
  fi
  
  curl -0 -k -s -S $verbose -b "$cookies" -c "$cookies" -X POST "${post[@]}" "${API_BASE}${resource}"
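The array-accumulation loop is the part worth stealing; in isolation (bash rather than POSIX sh, since it relies on arrays):

```shell
#!/bin/bash
# accumulate "-F key=value" argument pairs the way the script above does
declare -a post
for kv in first_name=Foo last_name=Bar; do
    post+=("-F" "$kv")  # bash's += append, equivalent to post=("${post[@]}" "-F" "$kv")
done
printf '%s\n' "${post[@]}"
```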
[+] mixmastamyk|13 years ago|reply
I was prepared to not be impressed, but this looks nice to use. Installed.
[+] ww520|13 years ago|reply
This is a good tool. I wish there were an editor-integrated interactive HTTP tool.

OT: Is there an Emacs package that can do interactive invocation of http? Like having a text buffer to hold all the urls. Hitting Ctrl-E on a url to invoke it, and display the http response headers and result on separate buffers.

[+] ubasu|13 years ago|reply
For website scraping, something like CasperJS/PhantomJS or Selenium is more suitable, since they emulate a browser and evaluate JavaScript, which curl cannot do (and it's not clear whether this tool can).

What are the use cases where curl is more suitable than phantomjs or selenium?

[+] exogen|13 years ago|reply
APIs. Lots of HTTP usage is not browser-related.
[+] icefox|13 years ago|reply
phantomjs doesn't just emulate a browser, it uses WebKit.
[+] vlucas|13 years ago|reply
If you are doing cURL calls as a part of testing functionality, you may want to consider using a tool like Frisby.js ( http://frisbyjs.com/ ) to create a suite of automated tests that involve HTTP calls.
[+] bmuon|13 years ago|reply
I really like how writing little tools like this in Node is so simple:

    node -e "require('request')('http://www.asdf.com/').pipe(process.stdout)"