Mental Mistakes We Commonly Make

 

[Image: My collage, made in POTUS 45’s first week in office.]

I am not alone in asking, “How the hell did we get to this point?” I am searching for convincing explanations of how the United States elected an undisciplined, unfocused President who completely lacks humility, empathy, and curiosity.

I have read brain-science articles suggesting that our political affiliations light up the part of the brain that identifies with our tribe. That response is not a reasoned one.

The books I have been reading either address cultural divides, such as Hillbilly Elegy, or rebut “rational man” theories. I have never been a fan of theories that require a rational actor; during a single one-hour economics lecture, I could think of two dozen decisions I had made for reasons other than my own best interest.

After listening to interviews with author Michael Lewis on several podcasts, I bought his hardback book (high compliment!), The Undoing Project. It profiles two Israeli psychologists whose work on human judgment created a new branch of psychology, launched behavioral economics, and influenced fields as far afield as military strategy. Daniel Kahneman and Amos Tversky each possessed a brilliant mind, but their collaboration was genius. Lewis tells the story of how each developed independently, how they met, and how their unique working partnership unfolded.


Both Kahneman and Tversky were fascinated by common mental errors, and Kahneman was especially observant of people’s tendencies to make mistakes. As a combatant and psychologist on the front lines of multiple Israeli wars, he improved how the army selected officers by questioning conventional wisdom and tracking candidates’ actual performance against the impressions they had made during the screening process. His method made such an impression that the Israeli army still uses it, with minor adjustments.

Biases toward managing with criticism or praise: We all tend to believe that one is more effective than the other. While helping the Israeli Air Force train pilots, Kahneman noticed that the instructors believed criticism was more useful than praise. “They’d explained to Danny that he only needed to see what happened after they praised [a pilot] for performing especially well, or criticized him for performing especially badly. The pilot who was praised always performed worse the next time out, and the pilot who was criticized always performed better.” (p. 125) After watching the training, Kahneman explained what was really going on: “The pilot who was praised because he’d flown exceptionally well, like the pilot who was chastised after he had flown exceptionally badly, simply were regressing to the mean. They’d have tended to perform better (or worse) even if the teacher had said nothing at all. An illusion of the mind tricked teachers—and probably many others—into thinking that their words were less effective when they gave pleasure than when they gave pain.”
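
Regression to the mean needs no psychology to appear. Here is a minimal Python sketch of the pilots’ situation, with numbers that are entirely my own assumptions rather than anything from the book: each pilot’s score is fixed skill plus luck, nobody is praised or criticized, and yet the worst performers “improve” while the best performers “decline” on the next flight.

```python
import random

random.seed(42)

# Each pilot's landing score = stable skill + random luck on the day.
# No praise, no criticism: nothing changes between the two flights.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
flight1 = [s + random.gauss(0, 1) for s in skill]
flight2 = [s + random.gauss(0, 1) for s in skill]

# Split pilots by their FIRST flight: the bottom 10% would have been
# criticized, the top 10% praised.
order = sorted(range(N), key=lambda i: flight1[i])
worst, best = order[: N // 10], order[-(N // 10):]

def mean(indices, scores):
    return sum(scores[i] for i in indices) / len(indices)

print(f"worst 10%: flight 1 {mean(worst, flight1):+.2f} -> flight 2 {mean(worst, flight2):+.2f}")
print(f"best 10%:  flight 1 {mean(best, flight1):+.2f} -> flight 2 {mean(best, flight2):+.2f}")
# The "criticized" group improves and the "praised" group slips back,
# with no feedback at all: pure regression to the mean.
```

The instructors mistook this statistical inevitability for the effect of their words.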

[Image: Daniel Kahneman, who won the Nobel Prize and shared credit posthumously with Tversky.]

There were many fascinating ideas presented in The Undoing Project, but the most telling for our current situation are the four heuristics, or rules of thumb, that Kahneman and Tversky developed to describe the fallibility of human judgment.

  1. Representativeness: when people make judgments, they compare whatever they are judging to some model in their minds; we compare the specific case to the parent population. “Our thesis is that in many situations, an event A is judged to be more probable than an event B whenever A appears more representative than B.” They had a hunch that people, when they formed judgments, weren’t just making random mistakes—that they were doing something systematically wrong. (examples pp. 186-187; see the worked sketch after this list)
  2. Availability: a heuristic for judging frequency and probability. “The more easily people can call some scenario to mind—the more available it is to them—the more probable they find it to be. Any fact or incident that was especially vivid, or recent, or common—or anything that happened to preoccupy a person—was likely to be recalled with special ease, and so be disproportionately weighted in any judgment.” (examples pp. 191-192) Human judgment was distorted by the memorable.
  3. Anchoring: people can be anchored with information that is totally irrelevant to the problem they are being asked to solve. (examples p. 192)
  4. Simulation: the power of unrealized possibilities to contaminate people’s minds. As they move through the world, people run simulations of the future and base their judgments and decisions in part on these imagined scenarios. Yet not all scenarios are equally easy to imagine; they are constrained, much in the way that people’s minds seem constrained when they “undo” some tragedy. Discover the mental rules that the mind obeys when it undoes events after they occur, and you might find, in the bargain, how it simulates reality before it occurs. (p. 300) Danny called this “the state of the ‘undoing project.’” This reinforces the importance of framing: simply by changing the description of a situation and making a gain seem like a loss, you can cause people to completely flip their attitude toward risk, turning them from risk-avoiding to risk-seeking.
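
Kahneman and Tversky’s famous librarian-versus-farmer question shows how representativeness crowds out base rates. Below is a minimal Python sketch of the underlying arithmetic; the specific numbers are my own illustrative assumptions, not figures from the book.

```python
# Hypothetical base rates, chosen only for illustration: suppose there are
# 20 librarians for every 1,000 farmers, and a "meek and tidy" description
# fits 90% of librarians but only 10% of farmers.
librarians, farmers = 20, 1_000
p_desc_given_librarian = 0.90
p_desc_given_farmer = 0.10

# Bayes' rule: weigh how well the description fits each group by how
# common that group is in the population.
fits_librarian = librarians * p_desc_given_librarian   # expected matches: 18
fits_farmer = farmers * p_desc_given_farmer            # expected matches: 100
p_librarian = fits_librarian / (fits_librarian + fits_farmer)

print(f"P(librarian | description) = {p_librarian:.2f}")   # about 0.15
# Representativeness says "librarian" because the description matches the
# stereotype; the base rates make "farmer" more than five times likelier.
```

The judgment feels like a comparison to a mental model; the math says the parent population should dominate.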

Examples of these rules at work are ubiquitous. One of Lewis’s illustrations of how they confine people’s thinking made a big impression on me: “It’s far easier for a Jew living in Paris in 1939 to construct a story about how the German army will behave much as it had in 1919, for instance, than to invent a story in which it behaves as it did in 1941, no matter how persuasive the evidence might be that, this time, things are different.” (p. 195)


Because of our faulty assumptions and thinking, we need to have some humility about what we know and what can be known. Leaders must remain curious and open to new knowledge.

I’ll let Amos Tversky have the last word: “The handwriting was on the wall, it was just the ink that was invisible.”