The Knowledge Illusion: Book Summary, Review, Notes

The Knowledge Illusion

The Book In A Single Sentence

Knowledge works in very different and counterintuitive ways; perhaps holding knowledge personally isn’t very important at all.

Personal Thoughts

The Knowledge Illusion is a great primer on how knowledge works. Why do we need knowledge? How do we think? What do we get wrong? These are the topics the book discusses. If you’re already knowledgeable about this subject, the book may admittedly be more example-driven than you would like. Otherwise, this is arguably the best place to start.

Check out this article for more notes and thoughts.

Book Notes

  • Our point is not that people are ignorant. It’s that people are more ignorant than they think they are. We all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meagre.
  • Causal reasoning is our attempt to use our knowledge of causal mechanisms to understand change. It helps us guess what will happen in the future by reasoning about how mechanisms will transform causes into effects.
  • People ignore alternative causes when reasoning from cause to effect because their mental simulations have no room for them, and because we’re unable to run mental simulations backward in time from effect to cause.
  • “There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.”
  • We ignore complexity by overestimating how much we know about how things work, by living life in the belief that we know how things work even when we don’t. We tell ourselves that we understand what’s going on, that our opinions are justified by our knowledge, and that our actions are grounded in justified beliefs even though they are not. We tolerate complexity by failing to recognize it. That’s the illusion of understanding.
  • The illusion of explanatory depth enables people to hold much stronger positions than they can support. People rate their understanding of an issue far higher than their ability to actually explain it. When asked to explain how a policy works, their positions became less extreme as they realised how little they knew.
  • Storytelling is our natural way of making causal sense of sequences of events. That’s why we find stories everywhere.
  • The idea in hockey is that a team will score more goals when a good player is on the ice and the other team will score fewer. So the quality of a player is indicated by a plus-minus score, the number of goals that the player’s team scored while the player was on the ice minus the number of goals that were scored against the player’s team. One could measure the contribution of a thinker to a group’s problem-solving in a similar way. […] Intelligence is no longer a person’s ability to reason and solve problems; it’s how much the person contributes to a group’s reasoning and problem-solving process.
  • Scientists care about the truth, but what drives their day-to-day behaviour isn’t the search for truth as much as the social life entailed by a community of knowledge. Jane Doe’s success as a researcher is only indirectly related to how many important findings she discovers in her laboratory. She’ll get tenure at Harvard and be allowed to stay there only if she publishes those findings in high-profile outlets. So her job is as much to persuade others of the importance of her work as it is to actually do the work.
  • When academics encounter a new idea that doesn’t conform to their preconceptions, there’s often a sequence of three reactions: first dismiss, then reject, and finally declare it obvious.
  • Individuals don’t make decisions by themselves. Other people formulate options for them, other people present those options, and other people give them advice. Moreover, people sometimes copy decisions that are made by others (for example, when stock market guru Warren Buffett makes a decision to buy a stock, many people copy him).
  • Because so much of our financial knowledge is possessed by the community and not by us individually, we need to radically scale back our expectations of how much complexity people can tolerate. We need to give people the opportunity to understand and evaluate products and then decide for themselves.
  • We often think of social skills and intelligence as being negatively correlated. Turn on almost any eighties movie and you’ll find a stereotypical nerd character who is great at math or physics, but can’t carry on a simple conversation with a member of the opposite sex. These depictions belie the deep connection between individual and group intelligence. As we’ll soon see, the smartest among us—in the sense of being the most successful—may well be those who are best able to understand others.
  • Decision rules might be more effective if they are supplemented with a short, clear explanation that gives people an understanding of why the rule is a good one. Giving people a correct intuition about the benefit of diversifying, the power of compound interest, or other core financial principles might make them more likely to apply rules correctly and stick to them.
  • We are like Goldilocks: We have a sweet spot for explanatory detail, not too little and not too much.
  • We also suffer from the knowledge illusion because we confuse what experts know with what we ourselves know. The fact that I can access someone else’s knowledge makes me feel like I already know what I’m talking about.
  • Ignorance is not bliss, but it doesn’t have to be misery. For humans, ignorance is inevitable: It’s our natural state. There’s too much complexity in the world for any individual to master. Ignorance can be frustrating, but the problem is not ignorance per se. It’s the trouble we get into by not recognizing it.
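
The plus-minus metric mentioned in the hockey note above is simple enough to sketch as code. This is a minimal illustration of that calculation; the player names and goal events are hypothetical, not from the book.

```python
# Sketch of a hockey plus-minus score: goals scored by a player's team
# while that player was on the ice, minus goals scored against the
# player's team while they were on the ice.
# Player names and events below are hypothetical illustrations.

def plus_minus(events, player):
    """events: list of (scoring_team, set of our players on the ice)."""
    score = 0
    for scoring_team, on_ice in events:
        if player in on_ice:
            # +1 if our team scored, -1 if the opposing team scored
            score += 1 if scoring_team == "us" else -1
    return score

events = [
    ("us",   {"Doe", "Smith"}),  # our team scores; Doe and Smith on the ice
    ("them", {"Doe"}),           # opponents score while Doe is on the ice
    ("us",   {"Smith"}),         # Doe is off the ice: no effect on Doe
]

print(plus_minus(events, "Doe"))    # → 0  (+1 for, -1 against)
print(plus_minus(events, "Smith"))  # → 2  (on the ice for both of our goals)
```

As the book suggests, the same bookkeeping could score a thinker's contribution to a group: count the problems the group solved with that person participating against those it failed to solve.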

Get The Knowledge Illusion on Amazon.