I maintain that it isn't just hard; it is computationally impossible.
We should all know that, given a belief about the world and some evidence, Bayes' Theorem describes how to update that belief.
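To make the single-belief case concrete, here's a minimal sketch (the test numbers are invented for illustration):

    def posterior(prior, p_e_given_h, p_e_given_not_h):
        # Bayes' Theorem: P(H | E) = P(E | H) * P(H) / P(E)
        num = p_e_given_h * prior
        return num / (num + p_e_given_not_h * (1 - prior))

    # A 1% prior and a test with 90% sensitivity and a 5% false-positive rate:
    print(posterior(0.01, 0.90, 0.05))  # ~0.154, so most positives are still false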
But what if we have a network of interrelated beliefs? That's called a Bayesian net, and Bayes' Theorem again prescribes a unique answer. Unfortunately, working out that answer is NP-hard.
OK, you say, we can come up with an approximate answer. Sorry, no: coming up with an approximation that is guaranteed to be within 0.5 - ε of the true probability, for any ε > 0, is ALSO NP-hard. It is literally true that under the right circumstances a single data point logically should be able to flip our entire world view, and figuring out which data point does it is computationally intractable.
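To get a feel for why exact inference scales so badly, here's a toy sketch with a made-up three-variable chain and invented probabilities: brute-force inference sums over the full joint, which has 2**n entries for n binary variables.

    from itertools import product

    # Toy chain A -> B -> C (all numbers invented for illustration).
    def joint(a, b, c):
        p_a = 0.3 if a else 0.7
        p_b = (0.9 if b else 0.1) if a else (0.2 if b else 0.8)
        p_c = (0.8 if c else 0.2) if b else (0.1 if c else 0.9)
        return p_a * p_b * p_c

    # P(A=1 | C=1): sum the joint over the hidden variable B.
    num = sum(joint(1, b, 1) for b in (0, 1))
    den = sum(joint(a, b, 1) for a, b in product((0, 1), repeat=2))
    print(num / den)  # ~0.57; with n variables these sums have 2**n terms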
Therefore our brains use a bunch of heuristics, with a bunch of known failure modes. You can read all the LessWrong you want. You can read Thinking, Fast and Slow and learn why we fail as we do. But the one thing that we cannot do, no matter how much work or effort we put into it, is have the sheer brainpower required to actually BE rational.
The effort of doing better is still worthwhile. But the goal itself is unachievable.
There are some good bits in here. I love the subtitle especially: "The real challenge isn’t being right but knowing how wrong you might be." Knowing when not to provide an answer is hard. A big part of my job is communicating statistical findings and giving a good non-answer is much harder than giving a good answer, both technically speaking and socially speaking.
One thing I'll add that drives me nuts is the fetishization of Bayesian reasoning I sometimes see here on HN. There are times when Bayesian reasoning is helpful and times when it isn't. Specifically, when you don't trust your model, Bayes' rule can mislead you badly (frequently when it comes to missing/counterfactual data). It's just a tool. There are others. It makes me crazy when it's someone's only hammer, so everything starts to look like a nail. Sometimes, more appropriate tools leave you without an answer.
Apparently that's not something we're willing to live with.
Thinking, Fast and Slow left me with a feeling of despair about the human inability to reason effectively about statistics.
I like to tell people that charts work better for asking questions than answering them. Once people know you look for answers there, the data changes. More so than it does for question asking (people will try to smooth the data to avoid awkward questions).
Studying logical fallacies and behavioral economics biases has been the best way for me to become more rational. I'm constantly calling myself out for confirmation bias, home country bias, and the recency effect in my internal investment thought process.
Learning about logical fallacies and identifying them in conversations is great. Don't tell the counterparty about their logical fallacies in conversation, because that's off-putting. Just note them internally for a more rational inner dialogue.
Learning other languages and cultures is another way to learn about how different societies interact with objective truth. Living other places taught me a lot about how denial works in different places.
Thinking rationally is quite hard and I've learned how to abandon it in a lot of situations in favor of human emotions. How someone feels is more important than how they should feel.
> Learning other languages and cultures is another way to learn about how different societies interact with objective truth. Living other places taught me a lot about how denial works in different places.
This also had a rather frustrating side effect. It is true that not just traveling intensely (not in the sightseeing way), but also actually living in several different countries and cultures, broadened my horizons a lot. It definitely had the effect you talk about.
But then what? You cannot tell your partner in discussion "you would not think like that if you had traveled/lived outside of your culture", and it's also impossible to send everyone off to travel in order to experience the same. Much less in the US, where for most of the country you cannot just hop into a train for a few hours to encounter a completely different language and culture. (I grew up in Europe and moved to the US as an adult, but I've also lived in several different European countries before, and traveled to far away places like Asia.)
The sunk cost fallacy is particularly important to learn about and teach your children about.
I see it everywhere, from my own decision making process to international politics. Just this morning I was thinking about it as I read the news about the US leaving Afghanistan, and last week talking with a friend who is staying at a bad job.
In an attempt to catch myself in the act of committing logical fallacies, I have a flash card app on my phone. One of the sets I have is of logical fallacies. Educating myself has helped make me more aware of them and of when I fall victim to them.
It's not an easy task. But 10 minutes a day can add up and reinforce that information.
A related idea is cognitive distortion. It's basically an irrational thought pattern that perpetuates negative emotions and a distorted view of reality. One example many here can relate to is imposter syndrome. But to feel like an imposter you have to overlook your achievements and assets and cherry-pick negative data points.
“Logical fallacies”
are mostly Boolean/Aristotelian, and identifying them is completely useless and/or counterproductive in 99% of real-world scenarios. Most of your reasoning should be Bayesian, not Boolean, and under Bayesian reasoning a lot of “fallacies” like sunk cost, slippery slope, etc. are actually powerful heuristics for expected-value optimization.
Some are grateful to have them pointed out, after a bit of initial discomfort and resistance. Didn’t work out so well for Socrates of course, but we’re more enlightened now.
Big business wants people to buy things they don't need, with money they don't have, to impress people they don't like
Politicians want people who will drink the Kool-Aid and follow what they (the politicians) say (and not what they do)
Religions... well, same.
And so all messaging, from advertising to movies, TV, and narrative, is about hijacking people's feelings and suppressing rationality. Common sense is no longer common, and doesn't make much sense.
It's worse than that. The problem is that being truly rational is hard, unpleasant work that few people want to do.
If you read an article that makes your political opponents look bad, you can't just feel smugly superior, you have to take into account that you are predisposed to believe convenient sounding things, so you have to put extra effort into checking the truth of that claim.
If you follow the evidence instead of tribal consensus, you will probably end up with some beliefs that your friends and relatives won't like, etc.
I think this is connected to another reason why so many seem to reject "rationality" today.
They are rejecting the authorities that in the past have tried to associate themselves with "rationality". The political think tanks. The seminaries. The universities. Government agencies. Capitalist CEOs following the "invisible hand" of the market.
All of these so-called elites have biases and agendas, so of course none of them should be accepted at face value.
I think what's missed is that rationality is not about trusting people and organizations, but about trusting a process. Trusting debates over lectures. Trusting well-designed studies over trusting scientists. Trusting free speech and examining a broad range of ideas over speech codes and censorship. Trusting empirical observation over ideological purity.
This is the value system of the so-called "classical liberals", and they are an ever more lonely and isolated group. There is a growing embrace of authoritarianism and defense of tribal identity on both the "left" and the "right" taking its place.
Sometimes you want to deal with rational people - for example if you want things fixed and to work. I'd like a rational doctor, plumber and government. But I see your point that there are major incentives for encouraging irrationality in your customers.
My problem in everyday work is that so often I have to deal with so-called software engineers who fancy themselves quite the scientific thinkers but whose irrationality borders on delusional. A lot of them believe "I'm very smart, so I am therefore the most rational", which is obviously not true at all. This will probably make a lot of so-called software engineers angry, but I tend to think of the non-technical folk as the rational ones, and much easier to deal with as a result. Purely anecdotal though.
It's really hard (for me, and I imagine, for everyone else) to not put myself into my views and opinions. Like, when someone shows me that I'm wrong, it's natural for me to feel attacked, instead of just taking it as a learning moment. Noticing when this happens and working with it has been my main struggle in learning how to be more rational. Those views and opinions really don't need to be a part of what I consider "myself."
Rationality, to me, is really about an open-minded approach to beliefs. Allowing multiple beliefs to overlap, to compete, to adapt, without interfering too much with the process.
I find the distinction between emotions and logic to be quite synthetic. Emotion is nothing but logic, just hard-coded, subconscious, and hard to trace back from the inside. A lot of "rational" thought falls into a similar category, as the emotionally pre-chosen outcome is just decorated with "rational" arguments. The reason ultimately is the same as everywhere in life: economics. In this case, energy economics. Heuristics and early-outs are more desirable than a long, energy-intensive search of a complex space that comes to an indecisive conclusion, wandering between local maxima.
The really interesting thing here is the answer to why emotions work as they do, and what the patterns and bits are that trigger them. To turn over that particular rock is to go to some deeply disturbing places. And to lose the illusion that emotions make one more "human": if one's reaction is more hard-coded, shouldn't it be considered more machine-like?
It is hard to be rational in the way the New Yorker intends because we are constantly being lied to and having information hidden from us by institutions and so we have lost trust in them.
President Dwight D. Eisenhower put it succinctly in his farewell address to the nation:
"The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded. Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific technological elite."
Can rationality exist outside of our minds? Is it just another mental heuristic?
In meditation, a common teaching is to examine an object for a long period, really just stare at it and allow your mind to focus on it fully. I see a coffee mug; it has a handle and writing on it; it's off-white and has little coffee stains. This descriptive mind goes a mile a minute normally, but eventually you can break through that and realize: this is just a collection of atoms, this is something reflecting photons and pushing back electrically against my skin's atoms. Even deeper, it's just part of the environment, all the things I can notice, like everything else we care about.
Such exercises can help reveal the nature of mind. There are many layers of this onion, and many separate onions vying for our attention at once. Rationality relies upon peeling back these superficial layers of the thought onion to get towards "the truth." That means peeling back biases, emotions, hunches, instincts, and all the little mental heuristics that are nice "shortcuts" for a biologically limited thinker.
But outside our minds, how is there any rationality left? It feels like another program or heuristic we use to make decisions to help us survive and reproduce.
> In a recent interview, Cowen—a superhuman reader whose blog, Marginal Revolution, is a daily destination for info-hungry rationalists—told Ezra Klein that the rationality movement has adopted an “extremely culturally specific way of viewing the world.” It’s the culture, more or less, of winning arguments in Web forums.
This matches my observations, too.
> Cowen suggested that to understand reality you must not just read about it but see it firsthand; he has grounded his priors in visits to about a hundred countries, once getting caught in a shoot-out between a Brazilian drug gang and the police.
One of my many pet peeves is people who travel to more than a hundred countries to get "experiences". It feels misguided, wasteful, excessive, and done to impress others, as a sort of status symbol. I bet he wouldn't be able to name all those countries and cities that he's been to. A deep and meaningful experience requires way more than a superficial visit.
I think there is a simpler explanation that draws from evolutionary theory: being excessively rational is not a good survival strategy, be it in the distant past or today.
If our ancestors had made the rational assessment that there was unlikely to be a predator hiding behind the bush, that would have worked only as long as it worked, until one day they got eaten.
Irrationally overestimating threats and risks is not optimal in any single moment, but as long as you survive, it can be optimal in the long run.
Using irrational stories to enable group cohesion and coordination is similarly irrational, but it is an intrinsic way of being that also provides an evolutionary advantage.
Rationality, however, is an incredible optimization tool when operating in domains that are well understood, like the example of stereo equipment that the author gave in the article. It can also help in the process of expanding knowledge, by helping us systematically compare and contrast signals.
But it doesn't prevent the lion from eating you or the religious or temporal authority from ostracizing you from the safety of the settlement, and it may even make both of those outcomes more likely.
> If our ancestors had made the rational assessment that there was unlikely to be a predator hiding behind the bush, that would have worked only as long as it worked, until one day they got eaten.
That wouldn't have been a rational assessment, because it wouldn't have been an accurate assessment of the risks of being wrong, and the behavior required to avoid them.
If there's only a 1% chance that a predator is behind a bush, and that predator might eat you, it's absolutely rational to act as though there is a predator. You'll be seeing lots of bushes in your life, and you can't escape from those 1% chances for long.
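To put rough numbers on "you can't escape from those 1% chances for long" (purely illustrative):

    p = 0.01  # assumed chance any given bush hides a predator
    for bushes in (10, 100, 500):
        print(bushes, round((1 - p) ** bushes, 3))
    # probability that none of the bushes had a predator:
    # 10 -> 0.904, 100 -> 0.366, 500 -> 0.007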
The same thinking is why it would have been rational to try and avoid global warming 30 years ago. Even if the science was not settled, in the worst-case scenario, you'd have "wasted" a bunch of money making green energy production. In the best-case scenario, you saved the planet.
Humans operate by doing, then rationalizing, and many of the attempts at rational thought here demonstrate how easy it is to fool ourselves into thinking we are being rational, when really we are acting on feelings and delusions and then constructing what feels like a rational argument that we originally had, but that falls apart upon analysis.
In the past, it was a rational concern to be worried about being jumped by a predator from behind a bush, and if you don't know whether or not there is a predator, it is perfectly rational to be worried about such a concern!
Same with diseases when you don't know what is causing them, etc.
There's a tendency to dismiss as irrational older concerns from a time when there was a severe lack of information. But when you know your limits and can see the results, there is no other rational way to behave except to be concerned about or avoid those things. And while it is not rational to believe clearly contradictory religious dogma covering the topic, it is rational to follow or support it when it clearly aligns with the visibly effective methods encoded in it for avoiding disease and other problems.
This is also captured in the “midwit phenomenon”, where people who are just smart enough to start applying “rationality” make worse decisions than stupid people. This is because stupid people are operating off of hard-earned adaptations (encoded as traditions, folk wisdom, etc.). Midwits are smart enough to realize that the putative justifications for these adaptations are wrong, and therefore they toss out the adaptations. People who think about it even harder realize that these adaptations were mostly there for good reasons, and getting rid of them isn’t a good idea even if the relevant just-so stories explaining them don’t hold up to “rational” scrutiny.
Just because there's a 99% chance the bush has no predators behind it does not make it rational to assume there are no predators.
In Bayesian decision theory, you'd choose the action (walk directly by the bush; walk by the bush but have your guard up; steer clear of the bush) that minimizes your loss function (e.g. probability of dying or probability of your blood line dying out). You'd end up picking a path that balances the risk of being eaten by a lion with the cost of having to walk further (and thus having less time and energy to gather food; or tripping and cutting yourself and dying of infection; or whatever).
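A minimal sketch of that kind of expected-loss calculation, with every probability and cost invented for illustration:

    P_LION = 0.01  # assumed chance a lion is behind the bush

    # action -> (P(death | lion), P(death | no lion), energy cost of caution)
    actions = {
        "walk directly": (0.50, 0.000, 0.0),
        "guard up":      (0.10, 0.000, 0.1),
        "steer clear":   (0.01, 0.001, 0.3),  # longer path: tripping, infection
    }

    def expected_loss(p_death_lion, p_death_no_lion, energy):
        p_death = P_LION * p_death_lion + (1 - P_LION) * p_death_no_lion
        return 100 * p_death + energy  # weight dying far above walking effort

    print(min(actions, key=lambda a: expected_loss(*actions[a])))  # "guard up"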
I like this, but I also like the even simpler explanation. Our bodies/minds require energy and "skipping" to conclusions (irrationally) expends less energy.
It's impossible to be absolutely rational. I feel like there are so many different levels and viewpoints that there is no right answer.
Simple example:
Let's say the same pair of shoes is available in two different shops, but in one shop it's more expensive. It seems more rational to buy it in the cheaper shop. However, what if you've heard that the cheaper shop is very unethical in how it conducts its business? Is it still more rational to buy the shoes there?
And then you might also start considering this situation "in the grand scheme of things" - in the grand scheme of things does it make any difference if I buy it in shop A or B?
And at which point does it become irrational to be overthinking simple things in order to try to be rational? What if trying to always be rational is stressing you out, and turns out to be worse in the long run?
I try to avoid mind viruses, or ideas that can hijack your decisions and thought process and take over. Think of a mind virus as a sort of dangerous meme that underpins everything you do. This is why first principles and making decisions based on sound foundations is better, absent of some sort of virulent dogma.
There's a YouTube channel (1) called Street Epistemology which has a guy interview members of the public and ask them if they have a belief they hold to be true such as "the supernatural exists" or "climate change is real" or "x is better than y".
He then asks them to estimate how certain they are that it's true.
Then they talk. The interviewer asks a question and makes notes, then tries to summarise the reply. He questions how they know what they think they know and at the end he asks them to again say how confident they are that what they said is true.
It's fascinating to see people actually talk about and discuss what are usually unsaid thoughts, and it shows some glaring biases and logical fallacies.

(1) https://youtube.com/c/AnthonyMagnabosco210
Glad to hear you aren't the only person thinking of the mind virus idea!
Exactly what you said. Once you accept one toxic thought, it tends to branch out into other decisions. Unfortunately there are many, many memes out there ready to cause an infection.

These things can be fatal.
Sci-fi novel "Lexicon" by Max Barry explores the idea of words used for persuasion to the extent of actually hacking the brain via spoken word to take control of the subject's thoughts and actions.
I am seeing a lot of "institutions lied to us and are actively keeping information from us" when people try to justify acting irrationally. I don't agree with this premise at all. What do you mean, they keep information from you? This assumes that information can be contained, which in most cases is impossible. There is always leakage.
Now, to be more generous, I will assume that people are actually criticizing how "institutions impose a mainstream view that is difficult to replace even when the facts say it should be". To that I say: fine. But even in this case, there should be enough resources to form a rational opinion on the matter (with probabilistic reasoning). Hell, I have a lot of non-orthodox opinions that are so far outside the Overton window that I can rarely discuss them. And even in these cases, the internet and Google Scholar/Sci-hub were sources that helped me explore them.
So, I have no sympathy for this "institutions lied to us, let me believe now whatever I want" bullshit.
I think part of the problem is that most people are conditioned into many beliefs from a young age.
I know a guy who hates foo (using a placeholder). In fact he's downright foophobic. He is pretty convinced he has a natural, unbiased hate of foo and is being rational when he expresses it.
To me as an outsider it is pretty obvious that his hate of foo is the result of cultural conditioning. To him it is perfectly rational to hate foo and to me it is totally irrational, especially since he can't give any concrete reason for it.

So who is right and who is being rational?
The ultimate issue is that underpinning every action is a value system. Value systems are opinions and are fundamentally not rational.
Virtually every political disagreement is based on values, though most of the time people don't recognize it.
Values determine priorities and priorities underpin action.
For example some people feel that liberty (e.g. choice) is more important than saving lives when it comes to vaccines.
Some people feel that economic efficiency is less important than reducing suffering.
Some people feel that the life of an unborn child is worth less than the ability to choose whether to have that child.
Even in the article, is a stereo that sounds better actually better than a stereo that looks better? That is a value judgement and there is no right or wrong.
No one is actually wrong, since everything is a value judgement. Many people believe in a universal view of ethics/morality, but there is almost no universal set of ethics/morality if you look across space and time.
However, some values allow a culture to outcompete other cultures, causing the "inferior" values to disappear. New mutations are constantly being created. Most are neutral and have no impact on societal survival. Some are negative and some are positive.
I came to say something similar, that rational decision making is really a poorly posed problem at some level.
Take money for example. You can create a theoretical decision-making dilemma involving certain sums of money and work out what the most rational strategy is, but in reality the value of a given sum of money is going to differ between people, depending on their value systems and competing interests. So you get into a scenario where 1 unit of money means something different to different people (the value you put on 1 € is going to be different from the value I put on it; exchange rates are sort of an average over all these valuations), which might throw off the relevance of the theoretical scenario for reality, or change the optimal decision.
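One toy illustration of that point (my own sketch, nothing rigorous): under logarithmic utility, a standard stand-in for diminishing marginal value, the same 100 € is worth far less to a wealthier person.

    import math

    def utility_gain(wealth, gain=100):
        # change in log-utility from receiving `gain` on top of `wealth`
        return math.log(wealth + gain) - math.log(wealth)

    for wealth in (500, 5_000, 50_000):
        print(wealth, round(utility_gain(wealth), 4))
    # 500 -> 0.1823, 5000 -> 0.0198, 50000 -> 0.002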
The other issue, besides the one you're relating to (the subjectivity of the weights assigned to different outcomes, the Achilles' heel of utility theory), is uncertainty not just about the values in the model, but about whether the model is even correct at all. That is, you can build a case that some course of action is more rational, but what happens when there's some nontrivial probability that the whole framework is incorrect? Your decision about A and B, then, shouldn't just be modeled in terms of whatever is in your model, but also all the other things you're not accounting for. Maybe there are other decisions, C and D, which you're not even aware of, or someone else is, but you have to choose B to get to them.
Just yesterday I read this very well-reasoned, elegant, rational explanation by an epidemiologist about why boosters aren't needed. But about 3/4 of the way through I realized it was all based on an assumption that is very suspect, and which throws everything out the window. There are still other things their arguments were missing. So by the end of it I was convinced of the opposite conclusion.
Rationality as a framework is important, but it's limited and often misleading.
> is a stereo that sounds better actually better than a stereo that looks better? That is a value judgement and there is no right or wrong.
Disagree; value systems are the inputs to rationality. The only constraint is that you do the introspection in order to know what it is that you value. In that sense buying a stereo based on appearance is the right decision if you seek status among peers or appreciate aesthetics. It's the wrong decision if you want sound quality or durability.
I think the real issue is that people don't do the necessary introspection, and instead just glom onto catch-phrases or follow someone else's lead. That's why so many people hold political views that are contrary to their own interests.
Yes, and I think when people claim to be describing what a "rational actor" would do, what they often leave out are the normative assumptions inherent in their rational analysis. Moreover, I suspect the omission at times is not accidental.
Jim Keller (famous CPU designer; Lex Fridman interview) [1]: "Really? To get out of all your assumptions, you think that's not going to be unbelievably painful?" "Imagine 99% of your thought process is protecting your self conception, and 98% of that's wrong". "For a long time I've suspected you could get better [...] think more clearly, take things apart [...] there are lots of examples of that, people who do that". "I would say my brain has this idea that you can question first [sic] assumptions, and but I can go days at a time and forget that, and you have to kind of like circle back to that observation [...] it's hard to keep it front and center [...]".

[1] https://www.youtube.com/watch?v=Nb2tebYAaOA&t=4962s
Because we didn't evolve to be rational. We evolved to reproduce as often as possible, not to think as precisely as possible. We're not thinking machines, we're reproduction machines.
That we are able to think somewhat rational-ish is only because we adapted by running extensive modeling simulations. The fundamental function of these simulations is to simulate other beings, primarily humans. And in that, our brainware is lazy as hell, because, to quote evolution: why do perfect, when you can do good enough? Saves a ton of energy.
The wetware we employ was never expected to rationally solve differential equations or do proper statistical analysis. At best it was expected to guess the parabola of a thrown stone or spear, or estimate the best way to mate without facing repercussions from the tribe.
So, really, it's not that thinking is hard. It's that we're just not equipped to do it.