To be fair, this sort of learning is a lot harder than we think.
For well over five decades, the Nobel laureate Daniel Kahneman has been examining and explaining human attitudes and behavior. His disarmingly simple experiments and profoundly expert analysis have dramatically altered the way we see human reason. Philosophers and social scientists had assumed for centuries that humans are inherently rational. Kahneman’s powerful legacy (largely created with his late colleague Amos Tversky) comes in two parts. The first is that we are not nearly as rational as we tend to assume. Relatedly, we also aren’t as smart and skilled as we readily assume. We are routinely burdened by the “hubris hypothesis” (for example, as an important study found, physicians “who were ‘completely certain’ of the diagnosis ante-mortem were wrong 40 percent of the time”). Thus, as Martha Deevy, director of the Financial Security Division at Stanford’s Center on Longevity, points out, “investment fraud works best on highly educated men, who think they’re too smart to be scammed.”
The second part of Kahneman’s powerful legacy is more insidious and more vexing still. Kahneman’s great memoir of his life’s work, Thinking, Fast and Slow, opens with the following: “The premise of this book is that it is easier to recognize other people’s mistakes than your own.” Or, in the careful prose of scientific research, “people who were aware of their own biases were not better able to overcome them.”
Kahneman admits as much even about himself: “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.” We might grudgingly concede that we hold views that are wrong. The problem comes in identifying current examples. Worse still is the unfortunate and shocking reality that the smarter and more self-aware we are, the more vulnerable we are to these sorts of errors. Bias blindness impedes us all.