04. Changing Your Mind

Dated Jan 2, 2022; last modified on Sun, 02 Jan 2022

How to Be Wrong

Tetlock found that experts were barely able to forecast better than random chance. However, a small subset of people (dubbed “superforecasters”) did better. In a competition, they beat teams of top professors and professional CIA analysts. These superforecasters were not smarter than everyone else, nor did they have more knowledge or experience; they were great at being wrong.

Change your mind a little at a time. Seeing the world in shades of grey is less stressful, because encountering evidence against one of your beliefs is no longer high stakes.

Try to understand why you were wrong. Identify the faulty reasoning that led you to the wrong conclusion. Maybe you can learn how to avoid a similar error next time.

Tetlock found justifications used by non-superforecasters, e.g. “I was almost right.” Superforecasters, in contrast, were happy to analyze their errors to hone their technique. In each year of the contest, the superforecasters’ average accuracy improved by about 25%, while the other forecasters didn’t improve.
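Tetlock’s tournaments scored forecasters with Brier scores: the mean squared error between the probabilities a forecaster stated and what actually happened, where lower is better. The sketch below illustrates the metric; the sample forecasts are invented, not data from the contest.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.

    0.0 is a perfect score; always guessing 50% earns 0.25; lower is better.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)


# Hypothetical forecaster: probabilities assigned to four events that
# did (1) or did not (0) occur.
print(brier_score([0.9, 0.2, 0.7, 0.1], [1, 0, 1, 0]))  # → 0.0375
```

Confident, well-calibrated forecasts are rewarded, and confident wrong ones are punished heavily, which is why analyzing your misses pays off under this metric.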

Reading about biases and fallacies is good and all, but such knowledge doesn’t really become part of you until you’ve derived them for yourself when analyzing why you were wrong.

Admit that you were wrong. There’s no shame in it: people are frequently wrong, especially at hard things like forecasting.

There’s a difference between “admitting a mistake” and “updating”. If you were negligent in some way, then the former is appropriate because you messed up. However, if you come to a new conclusion after learning new information, then the latter applies, and you have nothing to be defensive about or to self-flagellate for.
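“Updating” is Bayesian in spirit: the new conclusion follows mechanically from the new information, so there is nothing to be defensive about. A minimal sketch of one such update (the prior and likelihoods are made-up numbers for illustration):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(H | E) via Bayes' rule, for a binary hypothesis H."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)


# You believed H at 70%, then saw evidence that is twice as likely
# if H is false (0.6) as if H is true (0.3). The honest move is to
# lower your credence -- no self-flagellation required.
posterior = bayes_update(0.70, 0.3, 0.6)
print(round(posterior, 3))  # 0.538
```

The drop from 70% to ~54% isn’t a “mistake”; the earlier belief was the right one to hold given the earlier information.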

Tetlock’s climb to fame was decisively winning IARPA’s ACE forecasting tournament. How much better are forecasters in the GJP than those in prediction markets? How well does their commercial venture do?

Lean in to Confusion

We tend to explain away observations that conflict with our worldview. For example, during the cholera outbreaks of the 1850s, London’s officials omitted the London Homeopathic Hospital’s low cholera mortality rate of 18% from their survey (homeopathy was seen as a disgrace to proper medicine). However, the homeopaths sterilized the blankets of the sick before reusing them (good hygiene) and recommended that cholera patients drink whey (akin to oral rehydration). Neither of these practices was rooted in homeopathy, and had the city officials been more curious, they’d have incorporated them into standard cholera treatment sooner and saved more lives.

Kuhn’s The Structure of Scientific Revolutions looks like a good book to read. It’d revise my opinions on the closer-to-truth nature of hard sciences compared to social sciences. Galef cites one of its tenets: a paradigm shift starts with a core belief, or paradigm, that everyone assumes is true. Gradually, some people observe anomalies that don’t seem to fit that paradigm. At first, scientists shrug off those anomalies as exceptions or mistakes, or modify their paradigm a little at a time to accommodate the new observations. But the more anomalies accumulate, the more confused the scientists become, until someone eventually develops a new paradigm that makes everything make sense again.

More often than not, it’s the accumulation of many puzzling observations over time that changes your mind. But for that to happen, you need to be willing to stay confused for a while, instead of explaining away anomalies.
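One way to model this accumulation is to treat each anomaly as weak evidence and multiply likelihood ratios in the odds form of Bayes’ rule: no single anomaly moves you much, but their product eventually can. This is a toy sketch with invented numbers, not anything from the book.

```python
def sequential_update(prior, likelihood_ratios):
    """Fold independent pieces of evidence into a belief via the odds form
    of Bayes' rule. Each ratio is P(anomaly | paradigm wrong) /
    P(anomaly | paradigm right); ratios > 1 favor 'paradigm wrong'.
    """
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)


# Start at 5% credence that the paradigm is wrong. Each of ten anomalies
# is only 1.5x likelier under 'paradigm wrong' -- individually shruggable,
# but jointly they carry you past 75%.
p = sequential_update(0.05, [1.5] * 10)
print(round(p, 2))  # 0.75
```

Explaining away an anomaly amounts to forcing its ratio back to 1, which is why staying confused (keeping the anomaly on the books) matters.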

For example, Jerry Taylor, a professional climate change skeptic (he believed the consequences of a warmer climate would be on the lower end of the outcomes outlined by the IPCC), was initially prodded by a claim that he was misrepresenting evidence (it turned out Taylor had trusted a colleague who misrepresented the evidence to him). From then on, he paid closer attention to the talking points and to the finer details of the supporting research.

Galef finds Jerry Taylor noteworthy because professional climate change skeptics rarely switch sides.

Escape Your Echo Chamber

Blindly listening to the other side tends not to work. Bail et al. found that conservatives who spent a month reading liberal tweets became dramatically more conservative, while liberals who spent a month reading conservative tweets became slightly (though not statistically significantly) more liberal.

Instead, engage with people from the other side whom you like, respect, share intellectual common ground with, or share common goals with. For example, Taylor was finally convinced by Litterman’s argument that catastrophic climate change is a nondiversifiable risk. Taylor, describing Litterman: “instant credibility with people like me… from Goldman Sachs… kind of a soft Libertarian… intelligent risk management.”

Changing each other’s views is hard:

  • We may misunderstand each other’s views, e.g. “Oh, thank goodness [you’re an atheist because of the errors, atrocities and contradictions in the Holy Books], I was afraid you were one of those crazies who believed that monkeys transformed into humans.”

  • Bad arguments inoculate us against good arguments, e.g. London officials reflexively dismissing the London Homeopathic Hospital.

  • Our views are interdependent, so changing one may require changing others, e.g. “climate change isn’t real”, “climate change skeptic media outlets are more trustworthy”, “smart people don’t buy the climate change consensus”, etc.


  1. The Scout Mindset: Why Some People See Things Clearly and Others Don't. Chapter 10: Changing Your Mind. Julia Galef. 2021. ISBN: 9780735217553.
  2. Superforecasting: The Art and Science of Prediction. Philip E. Tetlock; Dan Gardner. 2016.
  3. The Good Judgment Project. Philip E. Tetlock; Barbara Mellers. en.wikipedia.org. goodjudgment.com. www.gjopen.com. Accessed Jan 2, 2022.
  4. Good judgment in forecasting international affairs (and an invitation for season 3) - The Washington Post. Michael Horowitz. www.washingtonpost.com. Nov 26, 2013. Accessed Jan 2, 2022.
  5. The Scout Mindset: Why Some People See Things Clearly and Others Don't. Chapter 11: Lean in to Confusion. Julia Galef. 2021. ISBN: 9780735217553.
  6. The Structure of Scientific Revolutions. Thomas Kuhn. en.wikipedia.org. 1962. ISBN: 9780226458113.
  7. The Scout Mindset: Why Some People See Things Clearly and Others Don't. Chapter 12: Escape Your Echo Chamber. Julia Galef. 2021. ISBN: 9780735217553.
  8. Exposure to opposing views on social media can increase political polarization. Christopher A. Bail; Lisa P. Argyle; Taylor W. Brown; John P. Bumpus; Haohan Chen; MB Fallin Hunzaker; Jaemin Lee; Marcus Mann; Friedolin Merhout; Alexander Volfovsky. Proceedings of the National Academy of Sciences, Vol. 115, No. 37 (2018): 9216-9221. doi.org. scholar.google.com.
  9. Talking Snakes: A Cautionary Tale - LessWrong. Scott Alexander. www.lesswrong.com. Mar 12, 2009. Accessed Jan 2, 2022.
  10. #3: Jerry Taylor of Niskanen Center Has New View On Climate Change — Important, Not Important. Jerry Taylor; Quinn Emmett; Brian Colbert Kennedy. www.importantnotimportant.com. Feb 9, 2018. Accessed Jan 2, 2022.