Social networking as conversation is a good analogy, and one that Gabler doesn't follow through on in this article. Yes, the vast majority of conversations are banal, and always have been. The only difference is that, because of social networking, we now have a lasting record of how mundane our conversations are. Most conversations are not three-hour monologues about the End of History (and thank God for that). That doesn't mean that the quality of discussion has declined, or that we are in a post-idea era. It just means that throwaway small talk is located on the same global platform as the big ideas.
A positive side effect is that the interesting conversations are now available to everyone. For example, most chatter in a Philadelphia bar in 2011 is of no interest to anyone except the people having the conversation, but chatter in a Philadelphia bar in 1776 would make a fascinating read. I sure wish we had a record of that. Take Twitter as an example, since Gabler singles it out for criticism. Most of the time, I don't care what Muhammad Everyman in Cairo is tweeting about, because the quality level is the same as that of ordinary conversation. But when a revolution breaks out, I can listen in on tens of thousands of conversations talking about what is happening on a street-by-street level. Conversation is mundane, except when it isn't.
As for the comment about the Internet following a sort of Gresham's Law, with trivial information pushing out the big ideas, I think it's off the mark. Nothing is getting pushed out because there is nothing to get pushed out of. More bytes of kitten photos do not mean fewer bytes of Nietzsche. The amount and availability of big ideas and trivial data are increasing concurrently. The issue is one of attention: You can seek out and pay attention to big ideas, or you can look at cats. Most people look at cats. But that's always been true.
> As for the comment about the Internet following a sort of Gresham's Law, with trivial information pushing out the big ideas, I think it's off the mark. Nothing is getting pushed out because there is nothing to get pushed out of.
I disagree. What it gets pushed out of is human thought: both the thought of individual humans and our collective processing of ideas.
I notice it in my own life for sure. I read more now, but I read a lot fewer books, and I spend less time thinking about what I read: I tend to leap to the next shiny thing.
I'm using tools like LeechBlock to claw back some of that time. And the time spent on lighter-weight web stuff isn't totally wasted; it allows me to maintain a much larger social network, which is both enjoyable and professionally useful. But I'm starting to feel about the web like I feel about modern grocery stores: I don't mind that they have candy, but I kinda resent having to run a junk-food gauntlet to get to the broccoli.
Every generation has said something exactly like this about the generation before (modern, postwar, postmodern, etc.). However, I agree that we are in an age where most (if not all) big fuzzy ideas have already been thought of. People have already suggested that "God is dead" (or nonexistent) or "God is all-knowing and all-powerful," and everything in between (even that a flying spaghetti monster controls all). Nobody is going to come up with something more radical on that spectrum. It's now on to morality and efficacy. Morally, there are still active debates on topics such as gay marriage, abortion, the scope of government, and cloning. Everything else is about efficacy, and what actually works. Big ideas are the easy part. Proving they work requires intellectual rigor, experimentation, data, and analysis over long periods of time. If big ideas are dead, it's only because we are on to the good stuff: figuring out whether they were worth anything.
I, for one, welcome the post-idea world -- consider that the rise of Nazism and Communism was primarily driven by the so-called big ideas that helped the masses "make sense of the world," in much the way the article seems to espouse.
I've known people consumed by ideas (nearly all of my college friends who were into politics). They were antagonistic, insincere, subtly hypocritical, and appeared simply brainwashed. You would not want them as partners in either a business or a family. Not a pretty picture.
I think you're half right here, both on your examples and on your conclusion. Communism was definitely built around an idea. Nazism, though, wasn't; it's just tribalism dressed up a little.
By generalizing from a few data points without looking for other examples, you come to a false conclusion. The fight against both communism and fascism was helped greatly by fighting for ideals like democracy, individual freedom, and the right of peoples to avoid oppression.
What we should learn from this isn't that ideas are bad. It's that ideas are very powerful, and that we need to be extremely careful with the ideas we support. That in turn argues for the sort of careful thinking that the article recommends.
> antagonistic, insincere, subtly hypocritical, and appeared simply brainwashed.
You could say the same for plenty of folks in the tech scene. I'd say it's a pretty sweeping generalization, no?
The concepts of capitalism & free markets that were incorporated into the framework of the United States as a governing body were pretty big ideas as well; heretical by the standards of human civilization at a period of time which knew nothing but monarchy & tyranny.
Some "big ideas" turn out to be cynical, evil and horrible for society. Doesn't mean thinking big (politically, business-wise or otherwise) should be discouraged or driven away from our collective consciousness.
> They were antagonistic, insincere, subtly hypocritical, and appeared simply brainwashed.
That's true for a lot of people. Most people don't want to think for themselves, they want to fight for an idea that makes sense for them, and they lack the mental tools to actually think or challenge the idea.
“We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.”
Knowing is also what gets you brownie points from academia & from most cubicle jobs. Rote memorization has become the litmus test by which many seem to measure intelligence. And most on the receiving end oblige because hey, do what you're told, and while society might suffer as a result of your lack of imagination & ideas, at least you'll get that A. Or the promotion you so desperately want. So to me, our learning institutions and a great bulk of corporate America are just as much to blame for big ideas dying on the vine as new media.
The author makes some salient points though, particularly with the concept of “information narcissism.” That's certainly something I can see becoming more & more prevalent with the rise of the social graph.
Ideas such as the death of God or the end of history, ideas that can't be nailed down, much less verified, serve as food for enlightening discussion. And we're tired of discussion. We can't take any more. We want ideas that can be settled in labs or supercolliders or computer simulations. Why don't we want to talk about ideas any more? Actually, it was a single idea that killed our appetite for discussion, and hence our appetite for big, open-ended ideas: the idea that everything has political implications, and that because of our concern for the well-being of others we are morally obligated to prioritize politics above all else.
Because of that belief, we cannot indulge in honesty, good faith, or curiosity in any intellectual discussion, lest the enemy steal a march on us. Lest a child starve, a woman be raped, a citizen be robbed of the fruit of his labor, or a nation decay, we must treat every discussion as a political battlefield. Talking with someone you disagree with on their own terms isn't stimulating or educational; it's legitimizing a harmful ideology. Conceding a point isn't part of a healthy interchange of ideas; it's emboldening a dangerous mentality. Even the goal of changing your conversation partner's mind is self-indulgent, since what's important is not changing the mind of the person you're talking with but convincing the spectators that you're winning. It doesn't even matter whether your partner thinks you are sincere. Since the spectators are the point, he is obligated to argue with you no matter how uncivil or insincere you are. It would be immoral of him to concede the field to you, because of the damage you could do with it.
This view of conversation as political struggle, which ruthlessly extends into every sphere of life through the habits we cultivate in public, has banished good faith and curiosity from intellectual discussion and turned it into an onerous moral obligation. Ideas that stimulate conversation are an unwelcome call to arms for bored and weary warriors. Good riddance to them!
Perhaps we don't need to consolidate information into ideas (theories) as much anymore, because the information is available directly (via the internet). We don't need to find the underlying pattern.
Also, The History of every major Galactic Civilization tends to pass through three distinct and recognizable phases, those of Survival, Inquiry and Sophistication, otherwise known as the How, Why, and Where phases. For instance, the first phase is characterized by the question 'How can we eat?' the second by the question 'Why do we eat?' and the third by the question 'Where shall we have lunch?'
I would suggest an experiment. All intellectuals complaining about the dumbing down of the generation that follows their own should instead compare pairs of past generations and see if they still think that the later generation always has less substance, fewer great thinkers, etc. Because if you read what people write when comparing their own generation (or the one before) to the next you'd have to think the world has been in an intellectual death spiral basically from the start.
It may just take time for great ideas to turn out as such or be interpreted as such. It would be interesting to do an empirical study to see whether the age of ideas at a time when they were first called "great ideas" has changed over time. It would be difficult to do because the concept of "greatness" may itself have gone through fashion cycles.
But maybe he is right and our society suffers from a little bit of "big idea" fatigue after the ideological disasters of the past two centuries that cost many millions of lives.
His complaints about social media seem totally misplaced. Does he really think that the Facebook or Twitter event stream is somehow competing with deeper thought in people who would otherwise be thinking more? I don't believe that at all.
Just because a lot of chatter is now public doesn't mean fewer ideas of substance are published or perceived nowadays. Publishing just isn't filtered and curated as hierarchically as it used to be. That may influence the perception of greatness quite a lot.
I agree with the author in that there seems to be a general lack of thought in society as a whole. However, I would say that the information glut we're exposed to is only one of the reasons - and different reasons apply to different people.
I would add the "rat race" or "daily grind" to this pool of reasons for a lack of critical thought. So much of our brain power is demanded for tasks that are unoriginal, uninspired, and uninnovative that when (or even if) we have a chance to sit and think critically, our brains are sapped and it becomes easier to let a pundit or a blogger form our thoughts for us.
Another big reason is the lack of incentive, as put forth in the article. An innovative idea has little to no value to society if it isn't immediately monetizable.
Ironic that the article is light on fact and heavy on anecdote and unsubstantiated claims.
"For one thing, social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. For another, social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show."
This might have the sound of truth, but the article really gives me no reason to believe it's actually true.
It seems like there is a pretty big assumption here, not supported in any significant way. I don't see much evidence that big ideas aren't being generated right now. Not all big ideas make waves right away, for one. And some big ideas might not be obvious even to those paying attention. How do you even define a big idea? By audience? This post did little in that regard.
"The collection itself is exhausting: what each of our friends is doing at that particular moment and then the next moment and the next one; who Jennifer Aniston is dating right now; which video is going viral on YouTube this hour; what Princess Letizia or Kate Middleton is wearing that day."
To say this is a modern construct is false. People may not have been following Jennifer Aniston, but they were doing whatever the equivalent was at the time. As with any generation, there will be your rare innovators as well as your 'average chumps.'
Great points about the information flood of banality - not just 'facts', but factoids, op-ed pretending to be fact - that has reduced the depth of public discourse to that of a 3-day puddle. And thereby made our society far more prone to twisting in the wind of every blow-hard shock-jock and politician willing to hammer their talking points to death (IF that politician is the recipient of Murdoch press support, of course, otherwise, talking points vanish into the ether).
Far more prone to acceding to corporate fascism (witness the slide of the USA and UK in this direction) dressed up as 'the all-knowing Market'. God is dead (which was Nietzsche's question, not his finding, by the way), long live The Market!
Uncomfortable, unsettling, mold-breaking ideas - AND a passionate discussion of them in their wake, placing them in context, searching out their salience, noting their blind-spots, etc - and popular curiosity and interest in the writers and thinkers who produce them, are life-blood to a civilization.
Ideas are still at large and very much alive in many non-anglophone countries. Is this the saving grace of NOT being native speakers of the English that rules the internet?
I'm interested in seeing information technology become idea technology. Any ideas about that?
Meanwhile, the internet reminds me of nothing so much as one of those bowls you have somewhere around the house into which miscellaneous oddities accrue over time, none of which will ever be used, but it seemed good at the time to put them somewhere... You know what I mean?
I mean a constant 'feed' of junk...Exhausting indeed, to host it anywhere in your actual head. And so much is coming in that memory is buckling under the strain and having to be downloaded to Google, which just further strip-mines the mind in its effort to retain the component parts that build genuine, socially valuable knowledge.
Love how the line right after the last one - "Think about that." is "Connect with The New York Times on Facebook."
On a related note to the article, I agree with the author that the web is filled with information exchanges, and that this is one reason why the web can end up being anti-ideas.
However, what I love about Hacker News is that it is one of the few such exchanges I know of that seems to have a healthy balance between information and big ideas. I hope it stays this way.
It is certainly no accident that the post-idea world has sprung up alongside the social networking world.
. . .
Entrepreneurs have plenty of ideas, and some, like Steven P. Jobs of Apple, have come up with some brilliant ideas in the “inventional” sense of the word. Still, while these ideas may change the way we live, they rarely transform the way we think.
Pretty good read, thanks for posting this!
Also, who says we don't have and discuss big ideas in social media? http://www.reddit.com/r/AskReddit/comments/jitk9/how_the_fuc...
2: patterns are the least resistant way to think -- they are habits.
3: breaking habits is uncomfortable.
So: radical ideas are uncomfortable.
And this is why I'm writing a programming language that isn't Turing complete.
How much of an effect did Rawls and Nozick have on the practical political debates of their time?