RationalDino | 2 years ago
That said, I would advocate a more systematic version of the article's conclusion that we should accept that we can't think. Instead, borrow an idea from Superforecasting: one way to get better at predictions is to start by constructing both an outside view and an inside view.
Suppose we want to predict the odds of a thing happening. For the outside view, we first make a tally of roughly comparable past situations: in how many of them did the comparable thing happen? That frequency is our starting estimate.
The inside view is a detailed analysis of how the thing could happen this time. You can sharpen this into a Bayesian analysis, treating the outside view as your prior and the case-specific details as evidence.
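A minimal sketch of what that Bayesian inside-view analysis might look like. The function name and all the numbers here are illustrative assumptions, not anything from Superforecasting: the outside-view frequency serves as the prior, and each case-specific detail contributes a likelihood ratio.

```python
def bayes_update(prior, likelihood_ratios):
    """Update a prior probability by a list of likelihood ratios,
    working in odds form: odds = p / (1 - p)."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr  # each piece of evidence multiplies the odds
    return odds / (1 + odds)

# Made-up example: a 20% base rate, plus two details each judged
# twice as likely in worlds where the thing happens.
p = bayes_update(0.20, [2.0, 2.0])  # 0.5
```

The danger the next paragraph describes is exactly that people systematically judge those likelihood ratios too far from 1.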
The problem with the inside view is that, confronted with a vivid way something could happen, we overestimate all of the probabilities and wind up too certain that it will. Conversely, if we don't think of a way it could happen, we wind up too certain that it can't.
So the inside view has too much variance. We fix that by adjusting the inside view toward the outside view. How much to adjust takes some practice to figure out, but actually doing this makes your predictions more accurate.
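The adjustment step can be sketched as a simple blend. This is only one way to operationalize it; the function name, the even 0.5 weight, and the example numbers are my own assumptions, not a recipe from the book:

```python
def adjust_toward_outside(inside, outside, weight=0.5):
    """Shrink an inside-view probability toward the outside-view
    base rate. weight=1.0 trusts the inside view fully;
    weight=0.0 falls back entirely on the base rate."""
    return weight * inside + (1 - weight) * outside

# Made-up numbers: a vivid story says 90%, but only 30% of
# comparable past cases panned out. An even blend lands at 60%.
adjusted = adjust_toward_outside(0.90, 0.30, weight=0.5)  # 0.6
```

The practice the comment describes is essentially learning, from feedback on past predictions, what weight to use.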
Instead of calling this epistemic learned helplessness, I would call the need to adjust toward the outside view "estimated epistemic uncertainty". And yes, people who practice doing this really do become a lot better at predicting what will prove true in uncertain circumstances.