Mental Mistakes We Commonly Make

 

My collage made in POTUS 45’s first week in office.

I am not alone in asking, “How the hell did we get to this point?” I am searching for powerful explanations as to how the United States elected an undisciplined, unfocused President who is completely lacking in humility, empathy or curiosity.

I have read brain science articles suggesting that our political affiliations light up the part of the brain that identifies with our tribe. That response is not a reasoned one.

The books I have been reading either address cultural divides, such as Hillbilly Elegy, or rebut "rational man" ideas. I've never been a fan of theories that require a rational actor. During a single one-hour economics lecture, I could think of two dozen decisions I had made for reasons other than my own best interest.

After listening to interviews with author Michael Lewis on several podcasts, I bought his hardback book (high compliment!), The Undoing Project. His latest book is a profile of two Israeli psychologists whose work created a new branch of psychology, now often called behavioral economics, and reshaped other fields, including economics and military strategy. Daniel Kahneman and Amos Tversky each possessed a brilliant mind, but their collaboration was genius. Lewis tells the interesting story of how each developed independently, how they met, and then describes their unique working partnership.


Both Kahneman and Tversky were fascinated by common mental errors. Kahneman was especially observant of people's tendencies to make mistakes. As a combatant and psychologist on the front lines of multiple Israeli wars, he improved how the army selected officers by questioning conventional wisdom and tracking candidates' actual performance against the impressions those same candidates had made during the screening process. His improved method made such an impression that the Israeli army still uses it, with minor adjustments.

Bias toward managing with criticism or praise: we all tend to believe that one is more effective than the other. While helping the Israeli Air Force train pilots, Kahneman noticed that the trainers believed criticism was more useful than praise. "They'd explained to Danny that he only needed to see what happened after they praised him for performing especially well, or criticized him for performing especially badly. The pilot who was praised always performed worse the next time out, and the pilot who was criticized always performed better." (p 125) After observing, Kahneman explained what was really going on: "The pilot who was praised because he'd flown exceptionally well, like the pilot who was chastised after he had flown exceptionally badly, simply were regressing to the mean. They'd have tended to perform better (or worse) even if the teacher had said nothing at all. An illusion of the mind tricked teachers—and probably many others—into thinking that their words were less effective when they gave pleasure than when they gave pain."
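Regression to the mean is easy to demonstrate with a quick simulation. The sketch below is my own illustration, not from the book; the pilot counts, the skill-plus-noise model, and the decile cutoffs are all assumptions chosen for clarity. No feedback of any kind is modeled, yet the "criticized" group improves and the "praised" group declines on the next flight:

```python
import random

random.seed(42)

def performance(skill):
    # Observed performance = stable skill + random luck on the day.
    return skill + random.gauss(0, 1)

n_pilots = 10_000
skills = [random.gauss(0, 1) for _ in range(n_pilots)]

first = [performance(s) for s in skills]   # flight 1
second = [performance(s) for s in skills]  # flight 2 (no feedback effect)

# Split out the worst 10% ("criticized") and best 10% ("praised") of flight 1.
cutoff_low = sorted(first)[n_pilots // 10]
cutoff_high = sorted(first)[-(n_pilots // 10)]
worst = [(f, s2) for f, s2 in zip(first, second) if f <= cutoff_low]
best = [(f, s2) for f, s2 in zip(first, second) if f >= cutoff_high]

def avg(xs):
    return sum(xs) / len(xs)

print(f"worst 10%: flight 1 = {avg([f for f, _ in worst]):+.2f}, "
      f"flight 2 = {avg([s for _, s in worst]):+.2f}")
print(f"best 10%:  flight 1 = {avg([f for f, _ in best]):+.2f}, "
      f"flight 2 = {avg([s for _, s in best]):+.2f}")
```

Because each extreme score is partly luck, the extremes drift back toward each pilot's true skill on the next flight, exactly the pattern the trainers credited to their criticism and praise.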

Daniel Kahneman won the Nobel Prize and shared the credit with Tversky, who had died before the prize was awarded.

The Undoing Project presents many fascinating ideas, and the most telling for our current situation are the four heuristics, or rules of thumb, that Kahneman and Tversky identified to explain the fallibility of human judgment.

  1. Representativeness: when people make judgments, they compare whatever they are judging to some model in their minds. We compare the specific case to the parent population. "Our thesis is that in many situations, an event A is judged to be more probable than an event B whenever A appears more representative than B." They had a hunch that people, when they formed judgments, weren't just making random mistakes—that they were doing something systematically wrong. (examples p 186-187)
  2. Availability: a heuristic for judging frequency and probability. “The more easily people can call some scenario to mind—the more available it is to them—the more probable they find it to be. Any fact or incident that was especially vivid, or recent, or common—or anything that happened to preoccupy a person—was likely to be recalled with special ease, and so be disproportionately weighted in any judgment.”(examples p 191-192) Human judgment was distorted by the memorable.
  3. Anchoring: people can be anchored with information that is totally irrelevant to the problem they are being asked to solve. (examples p 192)
  4. Simulation: the power of unrealized possibilities to contaminate people's minds. As they moved through the world, people ran simulations of the future. They based their judgments and decisions in part on these imagined scenarios. And yet not all scenarios were equally easy to imagine; they were constrained, much in the way that people's minds seemed constrained when they "undid" some tragedy. Discover the mental rules that the mind obeyed when it undid events after they occurred and you might find, in the bargain, how it simulated reality before it occurred. (p 300) Danny called this "the undoing project." This heuristic reinforces the importance of framing: simply by changing the description of a situation, and making a gain seem like a loss, you could cause people to completely flip their attitude toward risk, turning them from risk-avoiding to risk-seeking.

Examples of these profound rules are ubiquitous. One of Lewis' illustrations of how these rules confine people's thinking made a big impression: "It's far easier for a Jew living in Paris in 1939 to construct a story about how the German army will behave much as it had in 1919, for instance, than to invent a story in which it behaves as it did in 1941, no matter how persuasive the evidence might be that, this time, things are different." (p 195)


Because of our faulty assumptions and thinking, we need to have some humility about what we know and what can be known. Leaders must remain curious and open to new knowledge.

I’ll let Amos Tversky have the last word: “The handwriting was on the wall, it was just the ink that was invisible.”

 

Predicting Surprises


A colleague recommended Predictable Surprises, a book by Max H. Bazerman and Michael D. Watkins; then it was discussed in the Human-Centered Design course; and finally another colleague encountered it in his management short course at Harvard. So I read the book.

Predictable surprises are disasters you should have seen coming–events or a set of events that take an individual or group by surprise, despite prior awareness of all the information necessary to anticipate the events and their consequences. (p 1)

The book focuses on three themes: cognitive failures, organizational failures, and political failures. There are six general characteristics of predictable surprises:

  1. Leaders know a problem exists and that it will not solve itself.
  2. Predictable surprises can be expected when organization members recognize that a problem is getting worse over time.
  3. Fixing the problem would incur significant costs in the present, while the benefits of action would be delayed.
  4. Addressing the predictable surprise typically requires incurring costs, while the reward is avoiding a cost that is uncertain but likely to be much larger. And perhaps more importantly, leaders know they can expect little credit for prevention.
  5. Decision makers, organizations and nations often fail to prepare for predictable surprises because of the natural tendency to maintain the status quo.
  6. A small, vocal minority benefits from inaction and is motivated to subvert the actions of leaders for their own private benefit.


I can think of many predictable surprises that challenge me as a leader and challenge our society at large. Working on Delta solutions is a case study in predictable surprises. The authors offer examples such as Hurricane Katrina and its aftermath, and they also cover the meltdown of the financial system in 2007-8. If you want to skip reading the book, then please go see The Big Short. The movie does a terrific job of explaining what happened. Just do not believe the hype–it is a tragedy, not a comedy.

If you are involved in trying to solve a problem such as climate change or even something narrower such as leading a change initiative in a company, I recommend Predictable Surprises.

The book does not offer many solutions for avoiding predictable surprises, although recognizing them is itself a management advantage. My conclusion is that it gives another strong reason to create and actively maintain a risk register and to make managing risk a discipline.
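A risk register can be as simple as a ranked list of risks with owners and review dates. The sketch below is my own minimal illustration, not anything from the book; the fields, the 1-5 scales, and the sample entries are all assumptions to show the discipline in miniature:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Risk:
    """One entry in a simple risk register."""
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (minor) .. 5 (catastrophic)
    owner: str
    mitigation: str
    review_date: date

    @property
    def score(self) -> int:
        # A common convention: rank risks by likelihood x impact.
        return self.likelihood * self.impact

# Hypothetical entries for illustration only.
register = [
    Risk("Levee failure during a 100-year flood", 2, 5,
         "Infrastructure lead", "Fund levee inspection and reinforcement",
         date(2017, 6, 1)),
    Risk("Key supplier insolvency", 3, 3,
         "Procurement lead", "Qualify a second supplier",
         date(2017, 3, 1)),
]

# Review the register regularly, highest-scoring risks first.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:>2}  {risk.description}  (owner: {risk.owner})")
```

The value is less in the data structure than in the habit: revisiting the list on a schedule is what keeps a worsening problem from becoming a predictable surprise.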

 

Responsible Communication: Choosing Our Words

Our choices make up the sum of our leadership. A mature person realizes they are always “in choice.” This includes responsible speech.

Snarky business owner’s sign at the AMGEN Tour of California 2015 City of Lodi finish.

There is much confusion about free speech in the USA, especially during this election cycle, with money being called speech and with lies that once would have ended campaigns drawing nothing but headlines. Leaders are usually held to a higher standard than the entitlement to say whatever they like. Leaders exercise responsibility when they act and are careful with their words. Oh, where are the leaders today?

Marilyn Chandler McEntyre's thesis in her excellent book, Caring for Words in a Culture of Lies, is "if language is to retain its power to nourish and sustain our common life, we have to care for it in something like the way good farmers care for the soil." (p 3) Years of hyperbolic advertising, yellow journalism, misrepresentations in political speech, and fraud in business have depleted and polluted the English language. As English is the dominant language of the internet (80% of information is in English) and of business, it is urgent to address the decline in literacy and commitment to truth.

She makes the case that to be good stewards of our language we need to do three things: 1) deepen and sharpen our reading skills; 2) cultivate habits of speaking and listening that foster precision and clarity; and 3) practice poesis–be makers and doers of the word. (p 9-10)

McEntyre gives 12 strategies to steward the English language:

  1. Love words.
  2. Tell the truth.
  3. Don’t tolerate lies.
  4. Read well.
  5. Stay in conversation.
  6. Share stories.
  7. Love the long sentence.
  8. Practice poetry.
  9. Attend to translation.
  10. Play.
  11. Pray.
  12. Cherish silence.

The book has inspired me to make truth my word for the year. I intend to focus on reducing my own tendency toward hyperbolic enthusiasm, to take a Great Course on crafting better sentences, and to memorize poetry. It is a start.

There is an urgency that I hope you share with me. I just watched The Big Short at the movie theater. It can only be described as a comedy if you like black humor. High levels of deceit (and greed) in the world’s banking system led to a complete meltdown in 2008. The complicity of the regulatory and government agencies resulted in no one being held accountable and nothing enacted to avoid a repetition of the same calamity. The stakes are high on so many fronts.