Oh wow. So I read her novel "The Bug" just after it came out (14 years ago), and it's such a poignant read about someone being slowly driven mad because he can't find a very peculiar bug that unpredictably haunts the salespeople demoing their product. It's a great read and I highly recommend it to anyone here. The technical accuracy is hilarious too, but you don't need to be a programmer whatsoever to understand the novel. I won't spoil the reveal of the actual bug or the ending, but it's very much worth it.
After reading it I loaned my hardcover copy out to a guy named Boris after telling him how much I liked it. Boris: I don't know where you are or what you're up to nowadays, but if you somehow end up reading this, I kinda want it back. Ping me.
> The beauty of language is that we can be imprecise and still be understood.
And the terrifying thing about language is that we can still strive to be precise and yet (perhaps even unknowingly) be wildly misunderstood...perhaps malevolently. I've heard someone describe programming as telling a very precise story to a very stupid human. However, I think it would be better to describe a compiler as a more humble listener. One who is willing to say "I'm sorry, you just told me to multiply a string, and I think I'm misunderstanding you." Thinking of compiler errors as an expression of humility can explain the comfort one finds in a stronger type system.
It is the same comfort one finds by not speaking publicly on Twitter but speaking to a person who has the charity to say "huh? I think you just said something that sounds wildly offensive, and I think I'm misunderstanding you."
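The "humble listener" idea above can be sketched in a few lines. This is a hypothetical illustration, not code from the article: even Python, which checks types only at runtime, refuses to multiply two strings rather than guess at a meaning (a statically typed compiler would voice the same objection before the program ever runs).

```python
# A sketch of the "humble listener": asked to do something nonsensical,
# the language objects instead of silently guessing what was meant.
def humble_multiply(a, b):
    try:
        return a * b
    except TypeError as err:
        # Python's runtime plays the role of the polite compiler here.
        return f"I'm sorry, I think I'm misunderstanding you: {err}"

print(humble_multiply(6, 7))             # a sensible request
print(humble_multiply("price", "qty"))   # multiplying two strings is refused
```

The point is not the error message itself but that the refusal happens at all; a stronger type system just moves that conversation earlier, from runtime to compile time.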
I would love to see an article on this topic - this one is not it though. The reason: we struggle at many major universities to determine this balance. On the one hand, students need a lot of CS in order to become mildly proficient; there is just so much to learn. On the other, school is a last chance to expose students to some of the big ideas of the world, the stuff (as Keating says in Dead Poets) that makes life worth living. How can you do both in 4 short years?
High school is where students should be getting their breadth. At some point you have to learn what you will do, and balance that out with a realization that more and more school just equals more and more debt.
I would actually like to see the opposite...get rid of all elective coursework and try to graduate students in two years. They'll have an incomplete education, but we all have incomplete educations.
If anything, we as a society and as individuals are creating more harm by pointlessly overeducating ourselves.
She strikes me as a person who has a lot of interesting things to say. I haven't read her books, so I thank the author for the exposure.
However, if I had one criticism it would be that the interview and resulting article focus quite a bit on her being a victim of prejudice rather than focusing on the great work she has done. But maybe that's me.
In the article, she never directly states that hackers need to study the humanities. What she was basically saying - in the part about Vacca - is that there needs to be human oversight of algorithms that affect public policy (school districting, police paroles, ...). So the implication, in that example, is that programmers should work with people in the social sciences to deliver solutions that may not be algorithmically optimal, but better meet the needs of the community.
She is only asked about it in the last question, and mentions a New York City politician wanting open source algorithms:
> Algorithms make decisions all through his borough, the East Bronx, including where children go to school, where police patrols are assigned, and the schedule for garbage pickup.
Which seems a bit bizarre; open source algorithms would be nice, but the tie-in to 'the humanities' seems tenuous at best.
This is a nice little interview, but the title is very misleading! The word "humanities" only appears once outside the title and the answer doesn't really address the title's claim.
I think it's an interesting topic, though. Do tech companies need more humanities grads? I'm skeptical because plenty of other industries have caused their own social problems, and the humanities haven't saved them. Look at advertising, which was originally built by artists rather than hackers.
It may well be true that big tech companies need to be more socially aware and environmentally conscious, but that doesn't mean "the humanities" is necessarily the answer.
I don't believe she's necessarily calling for the hiring of humanities grads.
What would help, as a start, is an appreciation for the values (and value) of the humanities.
On HN specifically, you frequently see something akin to Schadenfreude when referring to students of the liberal arts serving fries to tech grads. That is to some degree just factually wrong, considering many liberal arts undergrads go on to rather lucrative careers in law, or powerful careers in politics.
But it is also short-sighted, because history has always been a two-step process, with technology enabling us to do new things, and other disciplines enabling us to use these capabilities for something resembling the "common good".
I always like this thought experiment: would you rather live in a world with the Middle Ages' social/political/legal system and today's technology - or the other way around?
This doesn't mean that social and political progress are the exclusive domain of those who studied Greek mythology. In fact, this article explicitly argues that they shouldn't be: hackers should engage with that community.
But what it does take is, sometimes, a bit of humility. Too often, I feel the tech community is looking at technological solutions to social and political problems. See cryptocurrencies, or the encryption debate as examples where you can feel cynicism, bordering on disdain, for the political process.
The problem with that is, first, that it's wrong: a lot of power rests in politics, and China is a living example that the internet community is currently unable to win a direct confrontation with the power of the state. It is also wrong in that it portrays all governments as equally bad and all laws as subject to being changed or broken when expedient, when clearly there are differences in the rule of law between countries, and differences in the willingness of politicians to act with integrity.
Secondly, using technology to create facts is somewhat undemocratic in itself. The Federal Reserve may be committing all sorts of errors. But it is hard to deny that it derives legitimacy from a transparent, tested process far more democratic than bitcoin's. It is also an illusion to suggest that bitcoin is free of ideology just because that ideology takes the form of an algorithm decided in the past, ticking away with almost no way to influence it.
Actually, modern advertising (post-WW2) was built by psychologists - the artists just dressed up the intentions. Edward Bernays pivoted from propaganda to advertising and virtually created the foundation for Madison Avenue as we understand it. Until social media and its dopamine-manipulation strategies, advertising was focused on the manipulation of the viewer.
Everyone needs the humanities. You go to Princeton and you're gonna find Brian Kernighan teaching a computer science and digital humanities crossover class and explaining tech basics to the 'educated layman'. The people at the top of the field know this; why haven't the rest of us figured it out yet?
mcphage | 8 years ago:
Most of the CS students learn in school will be obsolete in a few years, and they probably won’t be using it in their job anyway.
But the humanities they learn will be useful their whole lives.
dang | 8 years ago:
I changed the title to a phrase from the article which doesn't cover the whole thing but at least covers some of it.