There is a phenomenon in finance called Post Earnings Announcement Drift (http://en.wikipedia.org/wiki/Post-earnings-announcement_drif...), meaning a company's stock tends to move too much or too little after it announces earnings, with the price eventually snapping back to a more suitable level.
I spent the better part of 3 months trying all sorts of models to figure out how to trade this.
The most profitable model I could come up with....
Take each analyst who follows the company, rank the analysts by their historical success at predicting earnings, and then weight each analyst's estimate by that rank.
This grade 8 level math beat out my "fancy" regression models.
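A minimal sketch of that rank-and-weight idea. The analyst names, the error metric (historical error of past estimates, lower is better), and the 1/rank weighting are all illustrative assumptions here, not the commenter's actual model:

```python
# Hypothetical sketch of the rank-weighted consensus estimate described
# above. Names, error metric, and the 1/rank weighting are assumptions.

def rank_weighted_estimate(history, estimates):
    """history: analyst -> historical prediction error (lower is better)
    estimates: analyst -> current earnings estimate"""
    # Rank analysts by past accuracy: rank 1 = smallest historical error.
    ranked = sorted(estimates, key=lambda a: history.get(a, float("inf")))
    # Weight each estimate by the reciprocal of its analyst's rank.
    weights = {a: 1.0 / (rank + 1) for rank, a in enumerate(ranked)}
    total = sum(weights.values())
    return sum(estimates[a] * weights[a] for a in estimates) / total

history = {"alice": 0.02, "bob": 0.10, "carol": 0.05}
estimates = {"alice": 1.50, "bob": 1.20, "carol": 1.40}
print(round(rank_weighted_estimate(history, estimates), 4))  # 1.4182
```

The consensus lands closest to the most historically accurate analyst, which is the whole point of the scheme.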
I was pretty humbled by that discovery:)
Hopefully my blunder can save someone else here 3 months of their life:)
The only caveat: beware of crowds when guessing costs nothing. When guesses are free, the results can become useless very quickly.
The analyst rankings work because a financial analyst has exactly one currency: their reputation for predicting earnings. When that goes, so does their job:)
I just want to start off any discussion of this with an accurate understanding of Surowiecki's catchphrase.
"The wisdom of crowds" doesn't refer to a general tendency for large groups of people to be right about anything, under any circumstances. It refers to the fact that crowds with market-like dynamics tend to be surprisingly good at ascertaining information accurately.
It suggests that the whole converging-on-the-right-price aspect of markets might be a specific instance of a more general phenomenon. For instance, gamblers are also quite good at assessing probabilities, and when country fairs hold competitions to see who can guess the weight of a cow, the guesses tend to cluster around the correct answer.
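The clustering effect described above is easy to see in a toy simulation. The cow's weight, the number of guessers, and the error distribution below are made-up parameters, assuming only that individual guesses are independent and unbiased:

```python
import random

random.seed(0)

TRUE_WEIGHT = 543  # hypothetical cow weight in kg

# Each fair-goer guesses independently with unbiased error.
guesses = [TRUE_WEIGHT + random.gauss(0, 60) for _ in range(800)]

crowd_error = abs(sum(guesses) / len(guesses) - TRUE_WEIGHT)
typical_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

# The crowd's average is far closer than the typical individual guess.
print(crowd_error < typical_error)
```

Independent, unbiased errors cancel in the average, which is why the aggregate can beat almost every individual guesser.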
Obviously, as with anything related to economics, there are a ton of caveats, and people who don't understand the phrase's intent have misused it wildly.
> when country fairs hold competitions to see who can guess
> the weight of a cow, the guesses tend to cluster around the
> correct answer
Do they really? I've seen this a bunch of times, but never seen a citation ... and even if it's true, it strikes me that guesses at the weight of a cow at a country fair are likely to be expert guesses. I wonder how well a New York crowd would do at the same thing?
Another thing that strikes me about the wisdom of crowds is that the most convincing examples are ones where the guessers are basically guessing what other people will guess. That's the essence of markets.
Good points. Under the right circumstances, the "wisdom of crowds" can be really compelling. A couple of years back, I ran the data from an Oscar contest that had run over the course of about 30 years and the consensus pick each year was soundly the long-term winner. [1] However, it's an approach that doesn't seem to have been hugely generalizable.
[1] http://bitmason.blogspot.com/2012/04/crowdsourcing-predictio...
> For instance, gamblers are also quite good at assessing probabilities, and when country fairs hold competitions to see who can guess the weight of a cow, the guesses tend to cluster around the correct answer.
Does each guess submission cost money? If so, then that's a pretty standard market. People who know more about cow weights are more likely to spend money because they have more to gain. Of course, even with free tickets, it could just be that people at rural fairs know a reasonable amount about cow weights.
What are they saying? Can somebody explain like I'm 5?
"the idea that aggregating or averaging the imperfect, distributed knowledge of a large group of people can often yield better information than canvassing expert opinion."
What? Often? If I have a pain in my lower right abdomen and I poll the human population, I'm going to get a better answer than a doctor?
If I poll the human population about a physics problem, I'm going to get a better answer than a physicist?
"But as Surowiecki himself, and many commentators on his book, have pointed out, circumstances can conspire to undermine the wisdom of crowds. In particular, if a handful of people in a population exert an excessive influence on those around them, a “herding” instinct can kick in, and people will rally around an idea that could turn out to be wrong."
Conspire? Undermine the wisdom? Wow...
So when somebody realized blood-letting wasn't helping, I guess those folk 'conspired' to 'exert an excessive influence on those around them' until a 'herding instinct' kicked in...
Brain explodes from the stupidity
When it's wrong, somebody conspired, when it's right, oh, well, don't mention that it's the exact same process.
As tessierashpool notes, it's not a statement that a random bunch of people will always give a better answer than an expert, especially if that expert knows the correct answer with high confidence. Rather, it's an observation that crowds who know something about the question being asked will often do better than experts who are also "guessing."
It requires a particular set of circumstances and tends to be most associated with people independently guessing numbers about things they can hazard an intelligent guess about.
You could. It is why people prefer a second opinion even when dealing with an expert. There are programs where everyone can join and try to diagnose disease. See CrowdMed (https://crowdmed.com) -- a YC company (http://venturebeat.com/2013/04/16/yc-startup-bets-crowdsourc...)
> If I poll the human population about a physics problem, I'm going to get a better answer than a physicist?
I think it was Newton who said that anyone could be as successful as him, but they just didn't think hard enough. But 10 smart physicists could probably think "as hard" as a single Newton. This means there is no need to chase the expert. This removes inaccuracies, since groups have more variance in opinion than individuals. It is harder for groups to be all wrong. Group intelligence benefits science.
> Conspire? Undermine the wisdom? Wow...
See it as game theory with possibly adversarial agents going against the group consensus. If there is no incentive to go against the grain, or if there is damage possible to your reputation (and thus energy), people will not disagree with top management or thought leaders. All errors trickle down into the group without filters. This is dangerous as it mimics group intelligence, but is really the opinion of a single individual inside a group of yes-men. This contributes to economic crashes and hype-y bubbles.
Blood-letting is a good example of variant group behavior. We need people "insane" enough to try that. We need to be wrong many times for someone to say "this is not right" and help fix it. Blood-letting may even have had some benefit -- it would be rare for a nearly 3000-year-old practice to survive while being completely without merit.
> When it's wrong, somebody conspired, when it's right, oh, well, don't mention that it's the exact same process.
You want to label a spam dataset. You can use one person for this, or a room with a hundred people. When the one person is wrong, it may be because he has a different/quirky opinion on what it means to be spam. When the whole room is wrong, you start to wonder if the e-mail was really ham to begin with. One person can only be wrong or right. A hundred people can (and preferably should) disagree, while still converging to a better solution than any individual in that group could have thought of. Groups remove arguments from authority.
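The spam-labeling example above amounts to majority voting over many imperfect labelers. A toy version, where the 70% labeler accuracy and the labels are made-up assumptions:

```python
import random
from collections import Counter

# Toy version of the labeling example above: many imperfect labelers,
# aggregated by majority vote. The 70% accuracy figure is made up.

def majority_label(votes):
    """Return the most common label among a list of votes."""
    return Counter(votes).most_common(1)[0][0]

random.seed(1)
TRUE_LABEL = "spam"
# 101 labelers, each correct only 70% of the time.
votes = [TRUE_LABEL if random.random() < 0.7 else "ham" for _ in range(101)]

print(majority_label(votes))
```

Any single 70%-accurate labeler is wrong 3 times in 10; the majority of 101 of them is wrong only a tiny fraction of the time, as long as their errors are independent.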
If I remember correctly, the book also described another phenomenon where if the guessers were allowed to see everyone else's guesses, there was a good chance that everyone else would end up guessing closer to what the first few people guessed. This decreased the accuracy of the crowd because it placed higher weight on the first couple guesses and removed most of the variance in the guesses resulting in decreased accuracy.
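The anchoring phenomenon described above can also be sketched in a small simulation. The blending rule (later guessers average their private signal with the running mean of public guesses) and all parameters are my own illustration, not anything from the book:

```python
import random
from statistics import mean, pstdev

def simulate(n, truth, noise, herd_weight, seed):
    """Each guesser draws a private noisy signal; with herd_weight > 0
    they blend it with the running average of earlier public guesses."""
    random.seed(seed)
    guesses = []
    for _ in range(n):
        signal = truth + random.gauss(0, noise)
        if guesses:
            anchor = mean(guesses)
            guesses.append(herd_weight * anchor + (1 - herd_weight) * signal)
        else:
            guesses.append(signal)
    return guesses

independent = simulate(500, 100.0, 20.0, herd_weight=0.0, seed=2)
herded = simulate(500, 100.0, 20.0, herd_weight=0.9, seed=2)

# Public guesses collapse the spread: later guessers cluster near the
# early ones instead of contributing independent information.
print(pstdev(herded) < pstdev(independent))
```

With the spread collapsed, the crowd average is dominated by the first few guesses, which is exactly the loss of accuracy the comment describes.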
My finance professor told my class a story about a time he and his wife were at a fair and encountered one of those games where you have to guess the number of jelly beans in a jar. His wife has a consulting background, while he has a background in finance. She observed the jar, tried counting the jelly beans visible at the bottom, and extrapolated an estimate from that. He, on the other hand, just looked through the first couple of cards others had turned in and picked a number in line with their estimates.
He got closer to the actual value.
The wisdom-of-crowds approach is alive and well in finance. For better or worse, it allows an analysis with much less work than the engineering approach. But more importantly, it serves as a kind of defense mechanism for the person making the actual prediction, especially if he has no skin in the game: if you're way off, at least you're in good company. That's fine for counting jelly beans, but it has the potential to be disastrous in other situations.
Well that's a relief. I only read that article as I've usually relied on a "wisdom of the crowds" approach to large decisions. I tend to reach out to a number of people inside and outside of my immediate "herd" and then aggregate their opinions to help influence my own.
Had I just learned that the wisdom of the crowds was not a thing I would have had to seriously re-evaluate my life!
> For instance, he says, if you notice that a Chinese restaurant in your neighborhood is always half-empty, and a nearby Indian restaurant is always crowded, then information about what percentages of people prefer Chinese or Indian food will tell you which restaurant, if either, is of above-average or below-average quality.
The situation described here wouldn't tell you anything about the quality of the restaurants. It might tell you what kind of food that neighborhood prefers. It might tell you which restaurant is more affordable. It might tell you which restaurant has a better public image. Or maybe it just tells you which restaurant is renting out its parking lot to a nearby car repair shop.
To wit, quality cannot be determined by observing human behavior. Quality is determined exclusively by predefined parameters. Loudness, brightness, velocity, purity, etc., are measures of quality. In other words, quality is antipodal to popularity. Every human relation to an object, such as cost, availability, relevance, etc., has priority over quality at all times.
It is even more accurate to say that quality is never a direct factor in choice. A person does not choose a knife, for example, because it is sharper than other knives; a person chooses the knife because it is sharp enough to meet their needs and, more importantly, because it is accessible to them. The only exception is the case where a person seeks out the highest quality option for the sake of quality itself, i.e. "I want to find the sharpest knife in the world." In this exceptional scenario, however, one learns nothing by studying the quality-seeking crowd that could not have been learned by simply measuring and comparing objects, i.e. it is much easier to find the sharpest knife by measuring knives for sharpness than by trying to determine which humans buy knives strictly because they are sharper than the alternatives (in such a case you would still have to measure the knives to confirm any conclusions you came to).
All of this being said, I can't believe people at MIT studying human behavior would make such an absurd analogy. It indicates a complete misunderstanding of not only human beings but of logic as well.
However, this model assumes that the people are actually sincerely trying to help you. In practice, any use of the wisdom of crowds really boils down to dealing with abuse/trolls/spam.
Does anyone remember the name of the wise-crowd app that just launched in beta? It's a GUI that lets groups pick things like basketball game results or the Oscars?
http://pages.stern.nyu.edu/~ilobel/bayesian-learning-social-...