Flaws in How We Evaluate Leaders (from Kahneman’s THINKING, FAST AND SLOW)

I’m reading a really fascinating book: Thinking, Fast and Slow, by the Nobel-Prize-winning “behavioral economist” Daniel Kahneman. In fact, I’m reading it for the second time, which is testimony to how much this book contains, and to how unusually rewarding it is to review such an interesting and substantial body of knowledge.

That body of knowledge concerns one major theme: although human beings are endowed with a level of intelligence unmatched by any other life-form, we are also endowed with a whole variety of cognitive tendencies that lead us to misunderstand and misjudge the world. Kahneman reports on many decades of research, done by himself and many others, that exposes the nature of our tendencies toward error.

I would like to quote a passage from the book that points to something deeply relevant to the political world. People always say that “hindsight is 20/20,” but of course it is anything but. People fight forever over such questions as “Who lost China?” and what the South was fighting for in the American Civil War.

And in politics, we are always trying to evaluate how well our leaders have navigated the particular set of circumstances with which they were presented. Kahneman shows how systematically we are likely to misjudge in making such evaluations. I imagine it could be useful for those of us who work at evaluating political decision makers to be at least aware of the errors we are predisposed to make.

One last bit of background: prior to the passage I’m about to quote, Kahneman has already shown that people tend to greatly underestimate the unpredictability of events, the degree of uncertainty under which we therefore must operate, and the extent to which luck plays a role in human affairs.

One Reason Hindsight is Not 20/20

(Passage from pp. 202-203)

Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. Consider a low-risk surgical intervention in which an unpredictable accident occurred that caused the patient’s death. The jury will be prone to believe, after the fact, that the operation was actually risky and that the doctor who ordered it should have known better. This outcome bias makes it almost impossible to evaluate a decision properly – in terms of the beliefs that were reasonable when the decision was made.

Hindsight is especially unkind to decision makers who act as agents for others – physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall – forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight. Based on an actual legal case, students in California were asked whether the city of Duluth, Minnesota, should have shouldered the considerable cost of hiring a full-time bridge monitor to protect against the risk that debris might get caught and block the free flow of water. One group was shown only the evidence available at the time of the city’s decision; 24% of these people felt that Duluth should take on the expense of hiring a flood monitor. The second group was informed that debris had blocked the river, causing major flood damage; 56% of these people said the city should have hired the monitor, although they had been explicitly instructed not to let hindsight distort their judgment…

Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions – and to an extreme reluctance to take risks…

Although hindsight and the outcome bias generally foster risk aversion, they also bring undeserved rewards to irresponsible risk seekers, such as a general or an entrepreneur who took a crazy gamble and won. Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.

Preferring Overconfidence to Admission of Ignorance

And here is a passage from later in the book (p. 263) that highlights another — somewhat related — problem with how people evaluate leaders (and others who are compelled to act, in some way, in the face of uncertainty):

Overconfidence … appears to be endemic in medicine. A study of patients who died in the ICU compared autopsy results with the diagnosis that physicians had provided while the patients were still alive. Physicians also reported their confidence. The result: “clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.” Here again, expert overconfidence is encouraged by their clients: “Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.” Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality – but it is not what people and organizations want. (Emphasis added.)
