I really dislike this narrative that facts don't change our minds and we are all irrational.
The thing is, truth does exert a pull on our beliefs. It's a slow force. It may take years for people to come around to it. Sometimes it even happens on a generational scale. But we are approaching the truth. Everything in history and everything in our daily experience tell us this. A couple of experiments where researchers manage to fool the people in their studies do not disprove this overall trend.
What scares me about this narrative is that people are using it to discredit democracy. "Look how stupid people are! We have to spoon-feed them the cherry-picked facts that lead them to the right beliefs. We have to decide everything for them."
Whether you like it or not has no bearing on whether it's true. In fact, that's motivated reasoning right there: presented with evidence of something you find unpalatable, you reject it outright purely because you don't like it. Personally, I'd rather know.
The point isn't that evidence has no power, it's that it has dramatically less power than most people think. However there are strategies for getting us out of the personal bias quagmire, such as the scientific method and the approach described in the article of providing an account or explanation of your position and the reasons for it. The debating rule of first explaining your opponent's position in your own words, but in a form they accept as being accurate, before trying to rebut it is also hugely powerful. These do seem to work and help lead us to better outcomes, so this is valuable and actually useful work.
> I really dislike this narrative that facts don't change our minds and we are all irrational.
I don't think that is the real "narrative" here, although reading this article may make it seem like that. The "narrative", or rather the modern scientific understanding which this article tries to present to a lay audience, is that real rational thinking is not our default, even though it seems to us that way.
But we can think more rationally. It just takes a lot more work. We can do all of the following...
* Subject our thinking to a rigorous framework such as the scientific method, in which we have to declare what evidence would falsify our argument (hypothesis) as we make it.
* Study cognitive biases to become more aware of their effects on our thinking and hopefully "immunize" our mind against some of their effects.
* Train our capacity for meta-cognition with mindfulness practices to become more aware of why we think what we think as we think it.
As for using the limitations of human rationality as an argument against democracy... I don't think that's a logical conclusion at all since leaders and lawmakers are subject to these limitations no matter how they come into power. But it is an argument that we still need to improve the processes by which policy is decided and that we need to watch out for and guard against those who would abuse the specific ways that humans can be tricked because of these factors (such as the Cambridge Analytica crowd).
> I really dislike this narrative that facts don't change our minds and we are all irrational.
To mirror simonh's comment, it's rather ironic that you're responding to a claim of fact with an opinion. Whether it's true or not is a question of science, not wishful thinking, and you've not given a solid reason to reject the findings of this research.
It's like the way the theory of evolution remains true whether or not some nasty elements of the far right try to use it to justify an atrocious ideology like 'social Darwinism'.
> people are using it to discredit democracy
Who does this? I don't see researchers like Dan Ariely [0] lurching to the far right when they make discoveries about our psychology. (It's odd that neither Ariely nor the field of behavioural economics [1] are mentioned in the article.)
Nothing about this research indicates that non-democratic systems of government are the best way to run things after all.
> truth does exert a pull on our beliefs
Broadly speaking mankind seems to get less ignorant over time, but sometimes the pull on our beliefs can act in the opposite direction. [2]
The article doesn't say facts never change minds. Clearly, facts do change minds. Just not always. The article simply focuses on the latter case, because that's a more interesting thing to read about.
It's like the way boring headlines don't get voted up on Hacker News. It has to be something interesting. Facts change people's minds. Facts don't change people's minds. Both are true. But only one is interesting.
Only certain truth exerts a pull on our beliefs: the truth that we use to justify our beliefs post hoc. It's well established in psychology; from books I've read like "Thinking, Fast and Slow" and "The Righteous Mind", the studies point to people forming their beliefs first, THEN finding facts to justify them.
I do agree that, over generations, the correct and truthful views tend to gain the upper hand. This arises from each generation absorbing a new set of facts in school, when they are young and their belief systems haven't formed yet. However, if we allowed all children to attend school at their place of worship from 5 to 18, we'd find college students remarkably unwilling to learn many more facts.
So, more broadly, why does it bother you that facts don't change our minds and we're all irrational? We are Homo sapiens, a mammalian primate who made the jump from the jungle to the savanna and learned to work together to gather food and hunt game. We haven't left behind our animal software; it is still active in, and exploited by, our modern society.
> But we are approaching the truth. Everything in history, and everything in our daily experience tells us this.
I think this needs substantiation. You present this like a fact, but it looks entirely like an opinion: your interpretation of history.
It presents a sense of inevitability that I find extremely dangerous.
The only thing that keeps us from losing what we have today - as many civilizations have done in the past - is our actions. Presenting progress as historical inevitability not only cheapens the meaning of our actions but also discourages people from seeing just how important active effort is.
Truths in other fields, like medicine, psychiatry, and nutrition, are really conflicting. People build cultures and tribes around their truths, each backed by science. Get some keto people, vegetarians, and run-of-the-mill nutrition experts to sit around debating and your head will spin.
There's such complexity there: conflicting studies, poorly done studies. Finding a "truth" about how we should eat, how often, etc., is near impossible.
Maybe part of it is expectation of instant gratification. But given how things change over time, slow adoption may not be worse than quick disruptive adoption that needs recalibration.
I think there is a more holistic point of view here that resolves the tension you are worried about. Simply accept that the beliefs people openly espouse are not what they actually believe: there is outward cognitive dissonance, but inwardly people are making sense of the world in a fairly rational way, one consistent with their own goals and needs rather than with objective, mathematically consistent reality. It only seems like irrational denial of facts because you incorrectly assume (a) you understand their needs and goals, and (b) what they say they believe is about a position being objectively true or false, not a model of reality that works for them. If you want to "change people's minds," what you need to do is understand their needs and goals, see how the most brutal version of "the truth" does or doesn't support those needs, and then "convince people" by creating coherent world views that are both "true" and allow them psychological coherence and safety (i.e., empathy and diplomacy).
I tend to agree. The problem is that the elites weaponize stuff like this, because they think it applies only to the masses, not to the highly educated. Instead of asking why somebody might have different views (and instead of actually learning what those different viewpoints are), they just assume that other, less educated people fail to analyze the data.
And even as the facts indicate that it is true, you will continue not to believe them. As you said, you dislike the narrative - whether it is true or not never really mattered to you, you just didn't like the implications.
You are doing a remarkably effective job at demonstrating this phenomenon.
Generational shifts are literally driven by indoctrinating children with increasingly less stupid sets of beliefs, as understood by minorities of experts in each field.
I'm more cynical than that. I believe that facts do change people's minds, but most people harbor hidden agendas that they try to adjust convenient facts to while ignoring inconvenient ones.
I don't know if it's to discredit democracy so much as humanism. A lot of people right now seem to be fantasizing about a collectivist democracy, which can be as oppressive and destructive as an autocracy.
I'm reminded of Tetlock and Gardner's excellent book 'Superforecasting', which was essentially a study of people who consistently score at the top of forecasting tournaments. One key thing these 'superforecasters' had in common was that any new information caused them to update their model of the world, but no single piece caused them to update it very much; typical forecasters either didn't update their model at all or updated it too much in response to new facts.
I think it makes a lot of sense, when one is trying to identify patterns in information, that it's easy to over- or undervalue novel information. We don't necessarily know what a new fact means, so ignoring it is one common error while paying too much attention to it is another.
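The incremental-updating habit described above is essentially Bayes' rule applied with honest likelihoods. As a toy sketch (the function name and the numbers are mine, not from the book), a single weakly diagnostic observation should nudge a belief, not leave it untouched and not flip it to near-certainty:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a hypothesis after one observation,
    given how likely that observation is under each hypothesis."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Weakly diagnostic evidence (60% likely if the hypothesis is true,
# 40% if false) moves a 0.5 prior only modestly:
posterior = bayes_update(0.5, 0.6, 0.4)  # -> 0.6

# Ignoring the evidence would leave the belief at 0.5; jumping to,
# say, 0.95 would overweight it. The calibrated answer is in between.
```

Both of the failure modes in the comment above, under-updating and over-updating, show up as deviations from this calculation.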
We also rarely even know if a "new fact" is actually true. So many studies don't replicate that it makes sense to hold off on updating core beliefs whenever "new facts" seem unlikely or in contradiction with previously known (and reliable) facts.
SSC had a nice article (now gone) that discussed this for a scientific theory that had literally hundreds of confirming studies done for it. All wrong. The "new facts" were bullshit. So even with tons of studies, it's reasonable to be skeptical in some situations.
It's also great that, eventually, science was able to figure out the "new facts" were bullshit. Yay, science. But it also means that people aren't being irrational when they don't immediately alter their fundamental beliefs while the ink is still dry, especially "new facts" that seem in contradiction with everything else we know…
IMO the conclusion "facts don't change our minds" is a stronger conclusion than the first two experiments show. On my reading, the first two experiments show that:
1. if I have a uniform/undefined prior (how the fuck should I know how risky/conservative firefighters are?)
2. and then I'm given an anchor
3. and then told the anchor is bunk
4. the anchor still affects me
But I suspect this hinges very heavily on the fact that our initial prior is basically non-existent. By contrast, if you:
1. picked a topic where I actually have some prior belief (What country is colder: Sweden or Germany?)
2. gave me some information ("Germany is actually colder on average than Sweden because of a weird atmospheric thing that affects the Nordics")
...then I suspect the bogus anchor would have much less staying power, because there's an existing prior to push back against it.
Specific facts are orthogonal to the actual underlying positions people hold, which are presented outwardly as other positions for the sake of political cover; hence the illusion of facts not changing minds. What's needed is an understanding of the actual underlying, usually hidden, positions, and then presenting facts to disrupt those positions.
Why not? There are many strange facts in the world. Some of them are even true. It is very difficult, when recalling a strange fact, to remember whether it was one of the true ones or not. So naturally, things we've heard will tend to exert a pull, even if we later found out they were wrong.
I saw that Yuri Bezmenov interview[1] ages ago and didn't really think of it until now, when crime statistics are openly denied almost as if crime doesn't really exist at all.
Then I thought back to that Bezmenov interview with what he said about "demoralization". When a population is demoralized, they cannot discern true information when it is staring them in the face.
I think ignoring facts has less to do with some kind of esoteric psychological process and more to do with raising multiple generations to believe that they've been lied to and the whole "system" is evil.
The general public has been lied to, to a significant degree. People who have full trust in entities that have and continue to publish untruths seem more irrational to me than supposedly irrational skeptics, etc (it is unknowable what the aggregate rationality of a given group is, but good luck finding anyone rational enough to realize that).
For example, in the study where the participant's own answer was disguised as that of another person, we can't discount the result so easily. That's also true of the studies where participants downgraded their confidence when asked to give an account of it.
On the invented studies, bear in mind that the point wasn't to measure changing the participant's mind, only for them to rate the value of a study that either supported or contradicted their initial position. Their only basis for evaluating the value of either study was their own pre-existing bias, so objectively they had no reason to evaluate them differently.
That's quite different from expecting them to change their minds, as the reasons for holding their position might not even have been addressed by the study. For example, someone who opposes capital punishment on moral grounds may not care whether it is an effective deterrent, so they'd have no reason to doubt a study finding that it is.
What politicians know that the authors of this study don't seem to realize, is that if we are told the same story repeatedly for long enough, no matter how absurd, we'll start believing it. If you throw in some scary outcome if we don't believe the story, we'll come around sooner. It seems that fear will cause us to re-examine our beliefs and values.
According to 19th century German philosopher Arthur Schopenhauer, “All truth passes through three stages: First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as self-evident.”
The problem with that quote is that it is all too often adopted by people who wish to use it to prove that their pet "ridiculed" or "violently opposed" idea is one of those that will become self-evident, when in fact, it states nothing of the kind.
Many theories that are ridiculed deserve it.
Many ideas that are violently opposed should never see the light of day again.
Very, very few of those that reach either the first or the second stage ever make it to the third, and it is a classic logical fallacy to argue that being ridiculed implies that an idea will be proven true in the end.
Oh, that's a fact right there. Take any piece of evidence or information: if it has been ridiculed or violently opposed for ages, but no one has forgotten it, then it's probably true or at least partially true.
First of all have all these psychological studies been replicated?
Part of the reason "facts" don't change our minds is that a lot of "facts" aren't really facts like those of physics, but are rather the result of statistical games.
Finally, and I think the biggest issue, is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost. Think about vaccines. Back in the 1950s, you probably knew or heard of someone who died from polio. Your mom might have had a sibling that died from one of the other vaccine-preventable illnesses. The doctor recommending the vaccines was seen as a trusted friend. He (it was usually a he back then) probably spent his whole life in your town. He knew your grandparents. Maybe he delivered your parents. He would spend hours at the bedside of a sick child or a dying grandparent. Maybe he was the one who delivered your children as well. Now when he says that he recommends you give your child this vaccine, you are going to listen.
Now forward to modern times. You book your appointment. You go to the office where you wait for hours. The pediatrician comes in and rushes through a 15 minute visit. Says your kid should get vaccinated. On the way home you listen to an investigative report of how doctors are paid by big pharma to prescribe drugs. By the way, you have never heard of anyone you know getting one of these vaccine preventable illnesses.
Now the gap between the educated elites and regular people in this country is widening. They don't interact much socially. They don't even live together. In the United States, the non-college-educated have seen a steady decline in their real wages and well-being. Of course they are going to distrust "facts" put out by an elite who are seen as out of touch.
I say this as someone who totally believes in vaccines and who has persuaded many of my friends to have their children vaccinated. The growing gap between the rich and poor in this country is at the root of many issues.
>First of all have all these psychological studies been replicated?
According to the article, yes, many times; it describes many examples of similar experiments along these lines.
This evolutionary function of reason, and the resulting flaws in our implementation of it, support my belief that in the grand scheme of things we are actually only just barely sentient. That is, we're at the very lowermost bound of the set of possible intelligences capable of technological civilisation. I think this because, well, we only just recently evolved enough intelligence to actually do it. If we'd become intelligent enough earlier, we'd have done it earlier.
If that's true then sure, it would be natural to expect that our reasoning powers are still impaired by flaws and fallacious tendencies. The scientific method then is a procedural set of rules we've invented to prevent our naturally somewhat irrational tendencies to mess up our ability to determine accurate actionable information. Yay us!
> Finally, and I think the biggest issue is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost.
That can explain non-movement of opinion when presented with contrary fact, but not movement away from the fact. The article here notes the experiment when students were presented with dueling articles on capital punishment: the ambiguous data acted to bolster their original position no matter the original stance.
A lack of trust in authority is one thing, but to use the authority's agreement with your pre-existing opinion to determine trust in that same evaluation is inherently circular -- even if it is human.
Do you believe that the vaccines contain mercury and aluminum and that those metals are causing problems in people who take those vaccines? Because it's a fact that they contain those metals, and they hurt people. I have a daughter who went from speaking to requiring 5 years of therapy to start speaking again. I'm not allowed to sue about that, and no one in the medical establishment would acknowledge that the only cause of my daughter's sickness could have been the vaccines. The medical establishment is a joke; I'm through with giving them any trust. There is propaganda about vaccines being harmless and it needs to end.
Interesting read. They're basically proposing that our anti-rational behavior emerged as a type of 'hyper-socialization'. I can believe it, and if true, it would point to why things like shifting the Overton window [1] and other mass public-perception shifts change individual perception.
I don't think it's the only way to change peoples minds and I hesitate to dive into "just employ emotional reasoning" as that seems dangerous.
From personal experience, another effective way to change people's minds is by giving them "skin in the game".
I've tried, over the years, to convince friends of the solution to the Monty Hall [2] problem. After explaining the solution and them either not believing it or not understanding it, I then play the game with them with 100 doors and revealing 98 after the first pick. Once this game is played a couple times, they understand the solution much more readily.
My take on this is that they suddenly have a personal stake in the game, even if it's weak. There's a personal cost that takes the form as social shame or loss aversion, even for a game that's played between friends with no money involved, that gives them a stake. Once they start wanting to actively avoid losing, they're much more willing to listen to reason.
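For anyone who wants to run the 100-door demonstration without rounding up friends, here's a minimal simulation (the function names are my own; the host is assumed to open every losing door except one after the first pick):

```python
import random

def play(num_doors=3, switch=True):
    """One round of Monty Hall: after the first pick, the host opens
    every non-winning door except one, then the player may switch."""
    car = random.randrange(num_doors)
    pick = random.randrange(num_doors)
    if switch:
        if pick == car:
            # Switching away from the car lands on the one remaining goat door.
            pick = random.choice([d for d in range(num_doors) if d != car])
        else:
            # The host must leave the car door closed, so switching wins.
            pick = car
    return pick == car

def win_rate(trials, num_doors=3, switch=True):
    return sum(play(num_doors, switch) for _ in range(trials)) / trials

# Switching wins (n-1)/n of the time: about 2/3 with 3 doors, about 99% with 100.
```

With 100 doors the effect is hard to argue with, which is exactly why that version of the demonstration works on skeptical friends.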
The article points out that our anti-rational behavior is at odds with survival but I would bet there's a level of abstraction below which our survival minded rationality kicks in and above which we don't have enough of a stake in the answer to use our rationality to good effect.
> strong feelings about issues do not emerge from deep understanding
I've thought about this too on my own strong feelings. The more I know about something, the more I understand its nuances, pros and cons, etc, the less I feel strongly about it. Now when I spot myself with a strong feeling about something I try to remind myself that I'm most likely missing something.
We see this constantly in the dev world. Younger devs feel very strongly about languages, libraries, frameworks, etc, probably because they have a shallower understanding of the thing.
It takes constant training and energy to follow where the facts lead you. Feynman used different approaches as a way to keep himself focused on the facts and not exclusively what he “knew” was true. He said the easiest person to fool is yourself.
Mostly people want to validate their intuition and gut feelings and don’t want to experience the discomfort of finding out that their intuition is not magically correct.
The fundamental problem is that our beliefs become part of our identity, and thus most of the time we're not actually seeking the "truth". This is obviously true when it comes to religion, and almost as bad when it comes to politics. And these days, a lot of "science" has become hyper political: race, climate, gender, evolution. Forget changing anyone's mind on those topics, no matter what facts you have in your arsenal.
Truth is only important to us as long as it contributes positively to our well-being. This sort of mushrooms is edible and this one is poisonous - everyone would agree on that.
As far as more abstract truths are concerned: people believed for centuries that the Earth is flat. Many still do. If you said otherwise, society would probably burn you for heresy, so the cost of truth was hugely negative.
Were The New Yorker honest, it would title this: "Why the Uneducated Don't Understand That You're Right." Which is a shame. This type of information should be used to better the reader by asking them to understand their own blind spots, not to indulge the reader by telling them that their adversary is ignorant and irrational.
Slightly misleading headline. The study tested how much a lie persists in someone's mind even after they're told the truth.
The study found that facts do indeed change people's minds, just not as much as we'd like, because the initial impression sets expectations. Cialdini talks about this in some of his books on persuasion.
The Stanford experiment didn't account for the fact that the students could have used the fake score they first received as a useful prior on how difficult the task was. It does not show that "Facts Don't Change Our Minds".
The New Yorker can't help itself, can it? Reasonably fair article, but then suddenly veers into:
"When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration."
And:
"(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)"
The thing with studies like this is that they're used by people on the losing side of elections to start complaining about "low information voters," with the subtext being "If only everyone was as clever as me and all my friends who think the same, then [thing I disagree with] would never win elections." Ironically, this also lets them avoid any introspection as to whether they lost because there are defects in their policy positions.
Exactly. Many times following the facts can paint BOTH sides as wrong, and those who espouse "follow the facts" often only mean "follow the facts I want you to follow and discard the rest".
> it's used by people on the losing side of elections to start complaining about "low information voters" with the subtext being "If only everyone was as clever as me and all my friends that think the same then [thing I disagree with] would never win elections."
It's pretty well backed up by evidence (and, honestly, by attending a Trump rally) that the average Trump voter is less educated, much more prone to misinformation, and simply holds a ton of trivially wrong beliefs about the state of the world.
That's without making a value judgment about the voters or saying they shouldn't have their vote, which of course they should, since there's no qualification requirement for voting in a democracy. But it seems silly to pretend that such a thing as an uninformed group of voters does not exist, or even cannot exist because it would be offensive in some way.
Autocrats and corrupt leaders have banked on them throughout history, and measured, intelligent, truthful discourse is not always found in the majority. If we're concerned with truth, then "they keep losing elections" or might-makes-right style arguments hold no value; in fact, they're quite dangerous.
It's downvoted and greyed out because of a general hate of Jordan Peterson (which I've found is really independent of political affiliation), but it's a good clip in my opinion.
I very much enjoyed the article, but I do prefer apolitical content when possible. I'm unsure why it was necessary to reference Trump in the vaccine portion. People (authors included) who can't keep themselves from injecting politics where it doesn't naturally belong are becoming more and more irritating, IMO.
> unsure why it was necessary to reference trump in the vaccine portion.
It wasn't necessary, however it gave the authors the opportunity to test in just one line if the summary was true, and I guess it worked.
I also don't want politics injected into scientific topics, but the role of politicians is to govern for people's good and to speak with extreme caution and responsibility because of the trust people place in them. When a high-profile politician says "this is good," a lot of people will follow the advice blindly. So when a politician puts people's lives at risk by saying, for example, that hydroxychloroquine works as a cure for the coronavirus (to date, at least one person dead and one poisoned after following that advice), it's politics actually harming lives with dangerous information, which makes it everyone's duty to inject common sense back into the debate. If only because scientists don't have the same exposure, and it becomes hard or even impossible for them to undo the damage done by clueless politicians who talk about things they don't know squat about.
BTW. I would have the same exact opinion even in the case it was Obama or Clinton doing what Trump did.
im3w1l|5 years ago
The thing is, truth does exert a pull on our beliefs. It's a slow force. It may take years for people to come around to it. Sometimes it even happens on a generational scale. But we are approaching the truth. Everything in history, and everything in our daily experience tells us this. A couple of experiments where researchers manage to fool the people in their studies does not disprove this overall trend.
What scares me about this narrative, is that people are using it to discredit democracy. "Look how stupid people are! We have to spoonfed them the cherrypicked facts that lead them to the right beliefs. We have to decide everything for them."
simonh|5 years ago
The point isn't that evidence has no power, it's that it has dramatically less power than most people think. However there are strategies for getting us out of the personal bias quagmire, such as the scientific method and the approach described in the article of providing an account or explanation of your position and the reasons for it. The debating rule of first explaining your opponent's position in your own words, but in a form they accept as being accurate, before trying to rebut it is also hugely powerful. These do seem to work and help lead us to better outcomes, so this is valuable and actually useful work.
jbotz|5 years ago
I don't think that is the real "narrative" here, although reading this article may make it seem like that. The "narrative", or rather the modern scientific understanding which this article tries to present to a lay audience, is that real rational thinking is not our default, even though it seems to us that way.
But we can think more rationally. It just takes a lot more work. We can do all of the following...
* Subject our thinking to a rigorous framework such as the scientific method, in which we have to declare what evidence would falsify our argument (hypothesis) as we make it.
* Study cognitive biases to become more aware of their effects on our thinking and hopefully "immunize" our mind against some of their effects.
* Train our capacity for meta-cognition with mindfulness practices to become more aware of why we think what we think as we think it.
As for using the limitations of human rationality as an argument against democracy... I don't think that's a logical conclusion at all since leaders and lawmakers are subject to these limitations no matter how they come into power. But it is an argument that we still need to improve the processes by which policy is decided and that we need to watch out for and guard against those who would abuse the specific ways that humans can be tricked because of these factors (such as the Cambridge Analytica crowd).
MaxBarraclough|5 years ago
To mirror simonh's comment, it's rather ironic that you're responding to a claim of fact, with an opinion. Whether it's true or not is a question of science, not wishful thinking, and you've not given a solid reason to reject the findings of this research.
It's like the way the theory of evolution remains true whether or not some nasty elements of the far right try to use it to justify an atrocious ideology like 'social Darwinism'.
> people are using it to discredit democracy
Who does this? I don't see researchers like Dan Ariely [0] lurching to the far right when they make discoveries about our psychology. (It's odd that neither Ariely nor the field of behavioural economics [1] are mentioned in the article.)
Nothing about this research indicates that non-democratic systems of government are the best way to run things after all.
> truth does exert a pull on our beliefs
Broadly speaking mankind seems to get less ignorant over time, but sometimes the pull on our beliefs can act in the opposite direction. [2]
[0] https://en.wikipedia.org/wiki/Dan_Ariely
[1] https://en.wikipedia.org/wiki/Behavioral_economics
[2] https://en.wikipedia.org/wiki/Confirmation_bias#backfire_eff...
Lendal|5 years ago
It's like boring headlines don't get voted up on Hacker News. It has to be something interesting. Facts change people's minds. Facts don't change people's minds. Both are true. But only one is interesting.
WhompingWindows|5 years ago
I do agree that, over generations, the correct and truthful views tend to gain the upper hand. This arises from each generation downloading a new set of facts and learning in school, when they are young and their belief systems haven't formed yet. However, if we allowed all children to attend school at their place of worship from 5 to 18, we'd find college students remarkably unwilling to learn many more facts.
So, more broadly, why does it bother you that facts don't change our minds and we're all irrational? We are Homo Sapiens, a mammalian primate who made the jump from the jungle to the Savannah and learned to work together to gather food and hunt game. We haven't left behind our animal software, it is still active in and exploited by our modern society.
majormajor|5 years ago
I think this needs substantiation. You present this like a fact, but it looks entirely like an opinion: your interpretation of history.
It presents a sense of inevitability that I find extremely dangerous.
The only thing that keeps us from losing what we have today - as many civilizations have done in the past - is our actions. Presenting it as historical inevitability not only cheapens the meaning of our actions but also discourages people from seeing just how important active effort is.
loudtieblahblah|5 years ago
Truths in other areas, like medicine, psychiatry, nutrition, etc., are really conflicting. And people build cultures and tribes around their truths, each backed by science. Get some keto people, vegetarians, and run-of-the-mill nutrition experts to sit around debating and your head will spin.
There's such complexity there: conflicting studies, poorly done studies. Finding a "truth" in how we should eat, how often, etc. is near impossible.
And that's just that subject.
mc32|5 years ago
brightball|5 years ago
1. The person learning the fact trusts the source
2. The fact can be easily proven if the person doesn’t trust the source
3. There are no other facts which provide context that are missing
These are all critical in how the general public receives “facts”.
eagsalazar2|5 years ago
btmoney06|5 years ago
santoshalper|5 years ago
You are doing a remarkably effective job at demonstrating this phenomenon.
michaelmrose|5 years ago
desipis|5 years ago
What is the nature of this force? Where does it come from and how does it influence our minds?
commandlinefan|5 years ago
I'm more cynical than that. I believe that facts do change people's minds, but most people harbor hidden agendas that they bend convenient facts to fit while ignoring inconvenient ones.
fullshark|5 years ago
SuoDuanDao|5 years ago
I think it makes a lot of sense, when one is trying to identify patterns in information, that it's easy to over- or undervalue novel information. We don't necessarily know what a new fact means, so ignoring it is one common error while paying too much attention to it is another.
erichocean|5 years ago
We also rarely even know if a "new fact" is actually true. So many studies don't replicate that it makes sense to hold off on updating core beliefs whenever "new facts" seem unlikely or in contradiction with previously known (and reliable) facts.
SSC had a nice article (now gone) that discussed this for a scientific theory that had literally hundreds of confirming studies done for it. All wrong. The "new facts" were bullshit. So even with tons of studies, it's reasonable to be skeptical in some situations.
It's also great that, eventually, science was able to figure out the "new facts" were bullshit. Yay, science. But it also means that people aren't being irrational when they don't immediately alter their fundamental beliefs while the ink is still dry, especially "new facts" that seem in contradiction with everything else we know…
roter|5 years ago
I guess we just need to tune our relaxation factors [0] or, perhaps better, recalibrate our Kalman filters.
[0] https://en.wikipedia.org/wiki/Successive_over-relaxation
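roter's joke actually sketches well in code. Here's a toy illustration of my own (not from the thread), in the spirit of the successive over-relaxation link [0]: a relaxed fixed-point iteration where the relaxation factor omega controls how far each new "observation" f(x) is allowed to move the current estimate, and tuning it badly makes convergence to the truth painfully slow.

```python
import math

def relaxed_fixed_point(f, x0, omega, tol=1e-10, max_iter=10000):
    """Iterate x <- (1 - omega)*x + omega*f(x).
    omega tunes how much each new 'fact' f(x) moves the current belief x."""
    x = x0
    for i in range(max_iter):
        x_new = (1 - omega) * x + omega * f(x)
        if abs(x_new - x) < tol:
            return x_new, i + 1
        x = x_new
    return x, max_iter

# Under-relaxed: cautious updates, slow convergence to the fixed point of cos.
slow, n_slow = relaxed_fixed_point(math.cos, 1.0, omega=0.1)

# Better-tuned omega: the same answer, reached in far fewer iterations.
fast, n_fast = relaxed_fixed_point(math.cos, 1.0, omega=0.7)

print(slow, n_slow)
print(fast, n_fast)
```

Both runs settle on the Dottie number (the fixed point of cosine, about 0.7391); only the iteration counts differ.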
dlkf|5 years ago
1. if I have a uniform/undefined prior (how the fuck should I know how risky/conservative firefighters are?)
2. and then I'm given an anchor
3. and then told the anchor is bunk
4. the anchor still affects me
But I suspect this hinges very heavily on the fact that our initial prior is basically non-existent. By contrast, if you:
1. picked a topic where I actually have some prior belief (What country is colder: Sweden or Germany?)
2. gave me some information "Germany is actually colder on average than Sweden because of a weird atmospheric thing that affects the nordics"
3. told me that 2 was BS
I highly doubt you'd be able to replicate 4.
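The intuition in steps 1-4 above can be sketched with a toy Bayesian update (my own illustration, with made-up numbers): under a normal-normal conjugate update, a near-flat prior gets dragged almost entirely to the anchor, while a strong prior barely moves.

```python
def posterior_mean(prior_mean, prior_var, obs, obs_var):
    """Normal-normal conjugate update: a precision-weighted average
    of the prior mean and the observed 'anchor'."""
    w = (1 / prior_var) / (1 / prior_var + 1 / obs_var)
    return w * prior_mean + (1 - w) * obs

# Near-flat prior (huge variance): the anchor at 50 dominates entirely.
flat = posterior_mean(prior_mean=0.0, prior_var=1e6, obs=50.0, obs_var=1.0)

# Strong prior (tiny variance): the same anchor barely moves the estimate.
strong = posterior_mean(prior_mean=0.0, prior_var=0.01, obs=50.0, obs_var=1.0)

print(flat)    # close to 50
print(strong)  # close to 0
```

With essentially no prior (the firefighter case), the anchor is all the information you have, so of course it sticks; with a real prior (Sweden vs Germany), the retracted claim barely registers.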
eagsalazar2|5 years ago
syrrim|5 years ago
olah_1|5 years ago
Then I thought back to that Bezmenov interview [1] and what he said about "demoralization". When a population is demoralized, they cannot discern true information even when it is staring them in the face.
I think ignoring facts has less to do with some kind of esoteric psychological process and more to do with raising multiple generations to believe that they've been lied to and the whole "system" is evil.
[1]: https://www.youtube.com/watch?v=wYaR7mWxuf8
mistermann|5 years ago
trabant00|5 years ago
So why was it expected of the participants to change their minds? Nothing they could verify disproved their initial position.
For me all this proves is what I already knew: "garbage in, garbage out".
edit: as the comment below pointed out, this might not be a problem with the studies but with how the article uses them to prove its point.
simonh|5 years ago
On the invented studies, bear in mind that the point wasn't to measure changing the participant's mind, only for them to rate the value of a study that either supported or contradicted their initial position. Their only basis for evaluating the value of either study was their own pre-existing bias, so objectively they had no reason to evaluate them differently.
That's quite different from expecting them to change their minds, as the reasons for holding their position might not even have been addressed by the study. For example, someone who disagrees with capital punishment on moral grounds may not care whether it is an effective deterrent, and so may not have any reason to doubt a study showing that it is.
TopHand|5 years ago
Majromax|5 years ago
RoutinePlayer|5 years ago
danaris|5 years ago
Many theories that are ridiculed deserve it.
Many ideas that are violently opposed should never see the light of day again.
Very, very few of those that reach either the first or the second stage ever make it to the third, and it is a classic logical fallacy to argue that being ridiculed implies that an idea will be proven true in the end.
rafaelvasco|5 years ago
jstanley|5 years ago
RcouF1uZ4gsC|5 years ago
Part of the reason “facts” don’t change our minds is that a lot of “facts” aren’t really facts in the way physics facts are, but rather the result of statistical games.
Finally, and I think the biggest issue, is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost. Think about vaccines. Say back in the 1950’s, you probably knew or heard of someone who died from polio. Your mom might have had a sibling that died from one of the other vaccine-preventable illnesses. The doctor recommending the vaccines was seen as a trusted friend. He (it was usually a he back then) probably spent his whole life in your town. He knew your grandparents. Maybe he delivered your parents. He would spend hours at the bedside of a sick child or a dying grandparent. Maybe he was the one who delivered your children as well. Now when he says that he recommends you give your child this vaccine, you are going to listen.
Now forward to modern times. You book your appointment. You go to the office where you wait for hours. The pediatrician comes in and rushes through a 15 minute visit. Says your kid should get vaccinated. On the way home you listen to an investigative report of how doctors are paid by big pharma to prescribe drugs. By the way, you have never heard of anyone you know getting one of these vaccine preventable illnesses.
Now the gap between the educated elites and regular people in this country is widening. They don't interact much socially. They don't even live in the same places. In the United States, the non-college-educated have seen a steady decline in their real wages and well-being. Of course they are going to distrust “facts” put out by an elite who are seen as out of touch.
I say this as someone who totally believes in vaccines and has persuaded many friends that they should have their children vaccinated. The growing gap between the rich and poor in this country is at the root of many issues.
treeman79|5 years ago
Facts are closely related to statistics. It’s possible to be both true and a complete lie at the same time.
Abusive people will often use “facts” to control victims. You learn to be very mistrustful after a while.
simonh|5 years ago
According to the article they have many times, yes, it describes many examples of similar experiments along these lines.
This evolutionary function of reason, and the resulting flaws in our implementation of it, support my belief that in the grand scheme of things we are actually only just barely sentient. That is, we're at the very lowermost bound of the set of possible intelligences capable of technological civilisation. I think this because, well, we only just recently evolved enough intelligence to actually do it. If we'd become intelligent enough earlier, we'd have done it earlier.
If that's true then sure, it would be natural to expect that our reasoning powers are still impaired by flaws and fallacious tendencies. The scientific method then is a procedural set of rules we've invented to prevent our naturally somewhat irrational tendencies to mess up our ability to determine accurate actionable information. Yay us!
Majromax|5 years ago
That can explain non-movement of opinion when presented with contrary fact, but not movement away from the fact. The article here notes the experiment when students were presented with dueling articles on capital punishment: the ambiguous data acted to bolster their original position no matter the original stance.
A lack of trust in authority is one thing, but to use the authority's agreement with your pre-existing opinion to determine trust in that same evaluation is inherently circular -- even if it is human.
byte1918|5 years ago
> Thousands of subsequent experiments have confirmed (and elaborated on) this finding.
ghthor|5 years ago
abetusk|5 years ago
I don't think it's the only way to change peoples minds and I hesitate to dive into "just employ emotional reasoning" as that seems dangerous.
From personal experience, another effective way to change people's minds is by giving them "skin in the game".
I've tried, over the years, to convince friends of the solution to the Monty Hall [2] problem. After explaining the solution and them either not believing it or not understanding it, I then play the game with them with 100 doors and revealing 98 after the first pick. Once this game is played a couple times, they understand the solution much more readily.
My take on this is that they suddenly have a personal stake in the game, even if it's a weak one. There's a personal cost, in the form of social shame or loss aversion, even for a game played between friends with no money involved, that gives them a stake. Once they start wanting to actively avoid losing, they're much more willing to listen to reason.
The article points out that our anti-rational behavior is at odds with survival but I would bet there's a level of abstraction below which our survival minded rationality kicks in and above which we don't have enough of a stake in the answer to use our rationality to good effect.
[1] https://en.wikipedia.org/wiki/Overton_window
[2] https://en.wikipedia.org/wiki/Monty_Hall_problem
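For anyone who wants to try the 100-door version without rounding up friends, a quick simulation (my own sketch; the door numbering and host logic are just one convenient encoding) makes the gap obvious: switching wins whenever the first pick was wrong, i.e. about 99% of the time.

```python
import random

def monty_hall_trial(n_doors=100, switch=True):
    """One round of the n-door Monty Hall game [2]."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if not switch:
        return pick == car
    # The host opens n-2 goat doors, leaving our pick and one other door.
    # If we picked a goat, the remaining door must hide the car; if we
    # picked the car, the remaining door is some arbitrary goat.
    remaining = car if pick != car else (pick + 1) % n_doors
    return remaining == car

def win_rate(n_doors, switch, trials=20000):
    wins = sum(monty_hall_trial(n_doors, switch) for _ in range(trials))
    return wins / trials

print(win_rate(100, switch=True))   # roughly 0.99
print(win_rate(100, switch=False))  # roughly 0.01
```

Seeing the two numbers side by side tends to land harder than the 3-door argument, for the same "skin in the game" reason: the loss is no longer abstract.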
082349872349872|5 years ago
pier25|5 years ago
I've thought about this too on my own strong feelings. The more I know about something, the more I understand its nuances, pros and cons, etc, the less I feel strongly about it. Now when I spot myself with a strong feeling about something I try to remind myself that I'm most likely missing something.
We see this constantly in the dev world. Younger devs feel very strongly about languages, libraries, frameworks, etc, probably because they have a shallower understanding of the thing.
Isamu|5 years ago
Mostly people want to validate their intuition and gut feelings and don’t want to experience the discomfort of finding out that their intuition is not magically correct.
dang|5 years ago
Why they didn't at the time: https://news.ycombinator.com/item?id=13810764
iconjack|5 years ago
mD5pPxMcS6fVWKE|5 years ago
btmoney06|5 years ago
SmokeyHamster|5 years ago
The study found that facts do indeed change people's minds, just not as much as we'd like, because the initial impression sets expectations. Cialdini talks about this in some of his books on persuasion.
bigpumpkin|5 years ago
gadders|5 years ago
"When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration."
And:
"(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)"
The thing with studies like this is that they're used by people on the losing side of elections to start complaining about "low information voters", with the subtext being "If only everyone was as clever as me and all my friends who think the same, then [thing I disagree with] would never win elections." Ironically this also lets them avoid any introspection as to whether they lose because there are defects in their policy positions.
noetic_techy|5 years ago
Barrin92|5 years ago
It's pretty well backed up by evidence (and honestly by attending a Trump rally) that the average Trump voter is less educated, much more prone to misinformation, and simply holds a ton of trivially wrong beliefs about the state of the world.
That's without making a value judgement about those voters or saying they shouldn't have their vote, which of course they should, since there's no knowledge requirement for voting in a democracy. But it seems silly to pretend that such a thing as an uninformed group of voters does not exist, or even cannot exist because acknowledging it would be offensive.
Autocrats and corrupt leaders have banked on them throughout history, and measured, intelligent, truthful discourse is not always found in the majority. If we're concerned with truth, then "they keep losing elections" or might-makes-right style arguments hold no value; in fact they're quite dangerous.
thisrod|5 years ago
rbecker|5 years ago
troughway|5 years ago
war1025|5 years ago
dutch3000|5 years ago
squarefoot|5 years ago
It wasn't necessary, however it gave the authors the opportunity to test in just one line if the summary was true, and I guess it worked.
I also don't want politics injected into scientific topics, but the role of politicians is to govern for people's good and to speak with extreme caution and responsibility because of the trust people place in them. When a high-profile politician says "this is good", a lot of people will follow the advice blindly. So when a politician puts people's lives at risk by claiming, for example, that hydroxychloroquine works as a cure for the coronavirus (to date, at least one person dead and one poisoned after following that advice), it's politics actually harming lives with dangerous information, which makes it everyone's duty to inject common sense back into the debate. If only because scientists don't have the same exposure, so it becomes hard or even impossible for them to undo the damage done by clueless politicians who talk about things they don't know squat about.
BTW. I would have the same exact opinion even in the case it was Obama or Clinton doing what Trump did.
tribeofone|5 years ago
Because it's the New Yorker. Facts are optional, bias is required.
williesleg|5 years ago
[deleted]
unknown|5 years ago
[deleted]
MaxBarraclough|5 years ago
dang|5 years ago
baxtr|5 years ago
wizzwizz4|5 years ago
willvarfar|5 years ago