I used to think this checklist was misguided. My mental model of improving my rationality was something like acquiring lots and lots of impressive and clever tools and then training myself to apply them in context appropriate places. I've come around to a model much more in line with this checklist though. Something like:
1. Noticing behavioral patterns
2. Thinking hard and doing research on the pattern: whether it leads to good or bad outcomes, and whether modifying it is cost-effective
3. Using habit formation techniques to modify things
4. Reviewing and reflecting on what is and isn't working for things I have modified using the above steps
Old me would have considered this very laborious. You only get to make small changes this way and you aren't guaranteed they work. However, it doesn't take that many real improvements in fundamental counter-productive patterns before the benefits start stacking on top of each other. A few weeks of intense focus on one particular pattern is actually fairly high leverage considering that you can instill habits that will last a lifetime.
Your comment reminded me of the Myers-Briggs model's concepts of introverted and extraverted thinking. The extraverted thinker (NTJ) seeks out common, objective methodologies, while the introverted thinker (NTP) finds it easier to become comfortable with a more subjective, philosophical approach. You might then expect an INTJ to learn and hold to rules that seem to have authority or at least some critical thinking behind them, while an INTP could be expected to back off and start poking holes in those rules.
An important part of INTJ maturity is accepting that their own approach may fall short and examining the basis of other beliefs regarding the system or framework they instinctively favor. Or just learning why the lack of adherence to such a framework might be acceptable. INTJs, for example, are urged to learn about different personality preferences so they don't grow impatient with just about every other type. It's easy for INTJs to wonder why their ENFP friend just won't act logically.
In the case of the INTJ, the role of extraverted thinking, commonly called systems thinking, is crucial: it can replace their instinctive (and at times compulsive or otherwise destructive) emotional response to circumstances that make them feel anxiety. So when a more mature INTJ is feeling anxious about a situation or circumstance, they usually fall back on their extraverted thinking to help them examine it rationally and build a model around it. Otherwise they might find themselves falling into compulsive behaviors, losing their grip on the situation.
I'm distressed by how unscientific a lot of this stuff is. It's all good advice, but some of it seems to come from some pretty naive perceptions of how consciousness and the brain work. Take the example in 3.1:
> Recent example from Anna: Jumping off the Stratosphere Hotel in Las Vegas in a wire-guided fall. I knew it was safe based on 40,000 data points of people doing it without significant injury, but to persuade my brain I had to visualize 2 times the population of my college jumping off and surviving. Also, my brain sometimes seems much more pessimistic, especially about social things, than I am, and is almost always wrong.
Fear of heights is a very primal, very understandable and even reasonably well understood response. Our brains have specific hardware for this kind of circumstance, and it exists for good (even "rational") reasons. Sweeping it into the same generic bucket with all other "irrationality" is just silly.
I think you missed the point. It's not that the fear is itself intrinsically wrong, it's that the fear is leading to an irrational belief: that I will likely die if I do this wire-guided fall. That's just not a belief that comports with reality.
The existence of a mechanism for processing fear may be rational, but this doesn't mean fear is rational.
Similarly, if you are trying to destroy a small target and you know only that it lies somewhere in a 10 km^2 area, it may be optimal to employ a carpet-bombing strategy over the whole area. This doesn't mean that any of the individual bombs are optimized: most of them will fall nowhere near the target, and they are 'irrational' in support of a broader, rational strategy.
This has to do with the fact that you only have enough information to narrow the target to such a wide area. With more information, you can optimize the individual bombs and maybe switch strategies away from carpet bombing.
Our fear processing machinery is a carpet bombing strategy that drops specific fears in our way that we have to deal with. Overall it is better than nothing, but since we are trying to _improve_ our strategy, we are forced to conclude that it is not an ideal methodology. We have to collect more information, and we have to ensure that this new information is accounted for in the decision-making process. This means fighting your reflexes. You cannot do better than nature by idolizing it.
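The trade-off can be put in toy numbers (the grid size and intel figures here are invented for illustration, not taken from the original argument):

```python
# Toy model of the bombing analogy: a target hides in one cell of a
# grid, and you must drop one bomb per cell you cannot rule out.

def bombs_needed(candidate_cells):
    # Without better information, every candidate cell gets a bomb.
    return len(list(candidate_cells))

no_info = bombs_needed(range(100))   # target could be in any of 100 cells
with_info = bombs_needed(range(4))   # better information: only 4 cells left

wasted_no_info = no_info - 1         # all but one bomb miss entirely
wasted_with_info = with_info - 1     # far less waste with more information
```

Each individual bomb is still 'irrational' in isolation; gathering information doesn't make any single bomb smarter, it just shrinks how many of them the overall strategy needs.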
Fear of heights evolved long before wire-guided falls existed. It's irrational to be afraid of one of those while, say, being unafraid of taking a car ride.
It's exactly because we've got this corrupted evolved hardware that we must actively seek out all irrationality in our minds.
After all, things like availability bias were also developed for good reasons. There's no sense being afraid of tigers and avoiding the forest for good if no one you know has ever seen a tiger. That's just as good a heuristic as "be scared when going to high places".
I found "Predictably Irrational" to repeat a lot of the content about biases from TFTS. It's well written and not a bad book by any means, but I regretted the time investment after TFTS.
I found "Decisive: How to Make Better Choices in Life and Work" to be very practical, aiming to give some real tools to use, some of which I hadn't come across before.
I feel as though most of the psychological issues and poor decision making I experienced when I was younger were because I thought of myself as a fundamentally rational person, and refused to examine the irrational things my brain was doing.
This made it more or less impossible for me to address harmful emotions (because emotions are irrational) or problematic behaviours (because I don't make irrational decisions), which led to some bad times, to say the least.
Our brains are basically not wired to be objective or rational. While the human brain has developed the ability to think rationally, there is still a considerable amount of the "gut level" machinery lying around that kept us alive before we developed this ability.
Without careful introspection, and a willingness to admit that many of our thoughts and behaviours are going to be deeply irrational no matter how much we believe in rationalism, we tend to act on our irrationality to our own detriment, and often to the detriment of others as well. (Remember when Colbert trolled Bush by saying they were both "guys who think with [their] guts"?)
Just remember that the human brain is basically not set up for objectivity. Plenty of people who consider themselves rational are capable of making decisions that you would call irrational. It doesn't mean you need to live your life through a checklist, but it helps to be critical of thoughts like "I am a very rational person".
A lot of irrational behavior stems from simple biases in our brains' usually good-enough-for-daily-life heuristics. Attempts to avoid those biases seem to include training to turn off (or at least notice and interrupt) your heuristics altogether as much as possible, resulting in a much slower (and more introspective) way of going through the world.
As a mathematician, rationality is very important to me. (It is sufficiently important that I believe in determining my own criteria for rationality, rather than using someone else's list.)
In this spirit, I have an infuriating (to others) habit of reflexively arguing against someone else's position. (It was originally born out of the "search for counterexamples" mindset of a mathematician, but now it's an unconscious behaviour.) This habit is so comically ingrained that, if the person to whom I make this argument switches his or her position, then I will in turn switch to arguing for the original position.
I don't know whether this counts as highly rational, highly contrarian, or just annoying.
As a scientist, empiricism is very important to me as rationality alone will lead you into the thickets of solipsism.
Is there an expected useful outcome for challenging every statement? "My favorite dessert is chocolate cake" can be countered with "I've seen you choose apple pie instead of chocolate cake for dessert the last 3 of 5 times". But so what? Why do you think your limited observations trump the speaker's more complete knowledge enough to force an argument at that point?
It doesn't sound like you've used feedback from your previous arguments in order to determine if you indeed have enough information to have a meaningful counter-argument. Doubt fuels empiricism, not rationalism.
There's also an opportunity cost. "The Vikings made it to North America" can be challenged with "but the Norse of Greenland weren't really Vikings", inviting a competitive but ultimately bland argument about various etymological nuances, when the speaker actually wanted to bring up the latest archeological finds on Baffin Island. The speaker may be less likely to talk with you in the future about interesting ideas, to avoid being dragged into inane debates.
There's also the possibility that you are being deliberately distracted. A reflexive challenge means you only really consider the surface arguments, and not the underlying premise. If someone says that Agile is better than Waterfall, your response could be to point out cases where Waterfall is better, without realizing how that discussion tends to assume the creation myth of Agile, constructed on a false narrative of history. (See http://www.infoq.com/articles/bossavit-agile-ten-years-on .)
Statements are data points, not debate points. You may feel like you need to respond to them at once. Or you can collect them, use them to test and refine your hypotheses about what the other person is saying, and estimate whether it's useful enough to voice a contrary opinion.
Please be aware you have interpreted contrarian behaviour, which can be very negative and disruptive when unchecked, as having a virtuous cause.
Other causes worth examining: ego (desire to prove one's intelligence), one-up-manship (desire to harm/contradict/take down arguments of others), deficiency (not observing feelings of others or not addressing the end-goal of the task in favour of hair splitting), hedonism (argument gives you pleasure no matter the consequences for the task or others) etc.
Contrarians can be intolerable team members, so it requires care to harness and channel an analytical mind and not let it be driven by base and counter-productive motivations. It's very easy not to notice this and to stick with a seductive, virtuous justification for bad behaviour.
When I look for bugs in code, that is exactly the method I use. "This code postulates A, B and C. Can I find a case where any of A, B or C fails to hold?" If I can, there is a bug.
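A minimal sketch of that method in Python, using a made-up toy function (the bug and the postulate are my own example, not from the comment):

```python
def buggy_abs(x):
    # The code postulates: buggy_abs(x) >= 0 for every integer x.
    return x if x > 0 else x  # bug: the negative branch forgot to negate

# Don't argue that the postulate looks right; search for a counterexample.
counterexample = next((x for x in range(-10, 11) if buggy_abs(x) < 0), None)
# counterexample is not None, so the postulate fails and there is a bug.
```

Property-based testing tools automate exactly this counterexample search over much larger input spaces.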
Applying the method to a verbal argument: "You never listen!" "That's not true, I'm listening now!" Yes, I think it is very annoying. :)
> I notice when my mind is arguing for a side (instead of evaluating which side to choose), and flag this as an error mode. (Recent example from Anna: Noticed myself explaining to myself why outsourcing my clothes shopping does make sense, rather than evaluating whether to do it.)
Now I know this is not going to sound "rational", but...
I saw the example early on the page about evaluating whether or not to outsource their clothes shopping, and just stopped. Anyone for whom this is even a valid concept is so far removed from who I am that I just am not going to value their advice.
Who told them that we have any kind of "parallel threads of consciousness" to run "checkers" for these "rational heuristics" or "pattern matchers" on an input stream? Another variety of "multitasking" nonsense.
On paper it is all logical and "rational". Now try to run this in your mind in the so-called state of "flow". Don't have flow? That means your attention isn't focused.
All the classic attention experiments, like the one with the gorilla, reveal the limitations and "single-threadedness" of our minds.
It seems that the only kind of parallelism a human mind can run is one "conscious" thread with many "subconscious" ones - 1:N threads, or 0:N ;)
Btw, pthreads (and, perhaps, lock-based sharing in general) is a flawed concept in the first place. Imagine a deadlock with breathing or a race condition on heart rate.
Certainly all of this would be completely impractical if it required a "parallel thread of consciousness" (where did you get that phrase?). Thankfully, recognizing situations isn't something that has to take conscious attention. If I'm walking down the street and someone I pass is wearing a gorilla costume, that fact will be promoted to my conscious attention by other parts of my brain without me having to consciously ask "is this a gorilla?" every time I see someone. The idea here is to train yourself to notice when, say, you're confused the same way you'd notice a gorilla costume, and only then deal with the issue consciously.
Of course automatic recognition isn't perfect as the video you're referring to shows. But recognizing something most of the time is far superior to never recognizing it.
And I think a better model for the human brain is a hard core with a single thread, embedded in a much larger FPGA fabric that can issue it interrupts. Only a tiny fraction of the sense data that enters our brain actually impinges on our conscious awareness, and the directives that our consciousness issues, such as "pick up that cup", are multiplied hugely in complexity as they are translated into precisely calibrated exertion rates for dozens of separate muscles over time.
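That 1:N architecture can be sketched crudely in Python (the recognizer, the toy input stream, and the "salience" queue are all my own invention, just to make the interrupt model concrete):

```python
import queue
import threading

# One "conscious" thread, many "subconscious" recognizers: background
# threads scan the input stream cheaply and only promote salient items
# (the gorilla) to the single conscious loop via a queue of interrupts.
salient = queue.Queue()

def recognizer(stream, pattern):
    for item in stream:
        if pattern in item:      # cheap, automatic pattern matching
            salient.put(item)    # "interrupt" the conscious thread

stream = ["person", "car", "gorilla costume", "dog"]
t = threading.Thread(target=recognizer, args=(stream, "gorilla"))
t.start()
t.join()

noticed = salient.get_nowait()   # conscious attention handles only this
```

The conscious loop never inspects "person" or "car" at all; it pays attention only when a recognizer raises something, which is the point of the interrupt analogy.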
This is just an amusing aside: Where I work, we sometimes have these sessions conducted by outside consultants. At one such session, after an introduction about how we can be fooled by our expectations, we were shown the gorilla video.
When the trainer asked if we noticed something unusual in the video, all but one or two of us raised our hands.
We had all seen the gorilla video.
So I think the gorilla video actually teaches a new concept, which is that you should assume that the people you are dealing with have the same access to information as you have unless you have a reason to believe that it's a secret.
The website promotes a reading list (http://rationality.org/reading/). I have only read "Thinking, Fast and Slow" and I am looking for HN users' favorites.
I am confident that any attempt to rule out the third possibility (that it is just annoying) will fail.