I really love this idea: a standard place to find the authors and the tools (!) used. But I'm not a big fan of the name. Calling it humans.txt to mirror robots.txt doesn't make much sense to me, even as a joke. I think it should be named something direct and comprehensible, like credits.txt.
"Google is built by a large team of engineers, designers, researchers, robots ..."
Wait, does that say "robots"? This is how it starts people, with a robot creating a humans.txt text file, posing as a friendly Googler. Bill Joy must feel so vindicated now.
I think humans.txt is great, but it would be even better if the "standard" was to use a human/machine-readable format like YAML. The example on the website is really close to that.
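The humanstxt.org example lays entries out as `/* SECTION */` headers over "Key: value" lines, which is indeed already close to machine-readable. As a rough illustration of the point (the parser and the sample data below are my own sketch, not part of any standard), a few lines of Python can turn that layout into a dict:

```python
def parse_humans_txt(text):
    """Parse /* Section */ blocks of "Key: value" lines into nested dicts.

    This is a hypothetical sketch of what "machine-readable humans.txt"
    could look like; the section and key names are invented examples.
    """
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("/*") and line.endswith("*/"):
            # New section header, e.g. "/* TEAM */"
            current = line[2:-2].strip()
            sections[current] = {}
        elif ":" in line and current is not None:
            # "Key: value" line inside the current section
            key, _, value = line.partition(":")
            sections[current][key.strip()] = value.strip()
    return sections


sample = """\
/* TEAM */
Name: Jane Doe
Site: example.com

/* SITE */
Standards: HTML5, CSS3
"""

print(parse_humans_txt(sample))
```

With a YAML-flavored convention like this, tools could read the file without giving up the plain-text look the site already uses.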
So basically, Google has too many people to list them all. Or Google didn't want to try to list them, for fear of missing someone, or of exposing them to poaching.
That seems to be a fundamental problem with humans.txt: the bigger and more interesting a project gets, the more reasons it has, out of conflicts of interest, to list its contributors only vaguely rather than give full credit to the team behind the site.
Sukotto | 15 years ago
cromulent | 15 years ago
http://news.ycombinator.com/item?id=2131692
stephth | 15 years ago
ry0ohki | 15 years ago
arturadib | 15 years ago
InfinityX0 | 15 years ago
dude_abides | 15 years ago
mrspeaker | 15 years ago
eschulte | 15 years ago
Gentlemen, stay within 79 characters of the start of a line.
bradya | 15 years ago
https://github.com/paulirish/html5-boilerplate/blob/master/h...
Andrex | 15 years ago
nametoremember | 15 years ago
riffraff | 15 years ago
unknown | 15 years ago
[deleted]
dexen | 15 years ago
(also, the comments in the source of the root page, http://www.hasthelhcdestroyedtheearth.com/)
nhebb | 15 years ago
iwwr | 15 years ago
ma2rten | 15 years ago
jarin | 15 years ago
Yes, I know it's an ironic request.
paulirish | 15 years ago
Personally I say fuck it. While machine-parseable would be nice, that's not the point of this file.
You get more creativity without some sort of YAML constraint. In the HTML5 Boilerplate ours has effing stars, bro: https://github.com/paulirish/html5-boilerplate/blob/master/h...
autalpha | 15 years ago
Unknown.
---- Awww... :(
drKarl | 15 years ago
nametoremember | 15 years ago
lukifer | 15 years ago
tlrobinson | 15 years ago
notJim | 15 years ago
PhatBaja | 15 years ago
gabrielroth | 15 years ago
slouch | 15 years ago
jschuur | 15 years ago
sixtofour | 15 years ago
bauchidgw | 15 years ago
jonah | 15 years ago
rmccue | 15 years ago
cellis | 15 years ago
kennymeyers | 15 years ago