Rethinking the “Think Again” decision framework


Think Again: Why Good Leaders Make Bad Decisions and How to Keep it From Happening to You
by Sydney Finkelstein, Jo Whitehead and Andrew Campbell (2009, Harvard Business Press)

The authors of Think Again, impeccably credentialed and well versed in management strategy, are eminently qualified to scrutinize the performance of executives and senior managers in making organizational decisions. In their book they discuss numerous cases involving high-ranking decision makers. It is quite sobering, though not at all surprising, to see so many atrocious decisions consistently being made by people who are supposed to be masters of that craft. Evidently, these professionals are nowhere near as proficient as they are usually deemed to be. In view of the prevalence of this situation, it is hard to avoid concluding that, on the whole, top decision makers are no better at doing their job —making the right decision— than a randomly selected employee drawn from the ranks of their own organization would be. Even more troubling, no other professional field of endeavor seems to suffer so chronically the consequences wrought by such appalling practitioners.

The book tackles this disconcerting problem by proposing a framework consisting of three parts: a description of how the brain makes decisions and how it can be tricked into false judgments, an explanation of four posited conditions under which flawed thinking is likely to occur, and a set of safeguards prescribing how to counterbalance the four sources of error. The brain is presented as a pattern-recognition apparatus that employs emotional tagging and one-plan-at-a-time processing to make sense of what’s going on in the world and devise a response to the perceived challenges. Most of that processing, however, is conducted beyond the realm of consciousness, so the hapless (and ostensible) decision maker is in an extremely weak position to question the validity of the brain’s verdicts or its torrent of neural decrees. The clinical evidence sustaining this point is striking: V.S. Ramachandran’s notable work in behavioral neurology is cited on several occasions (see Phantoms in the Brain: Probing the Mysteries of the Human Mind, 1998).

Decisions go wrong, state the authors, because of two factors: (1) an individual or group makes an error of judgment (which follows from the above) and (2) the decision process fails to correct the error. Four sources of error, called red flag conditions, are identified: misleading experiences, misleading prejudgments, inappropriate self-interest, and inappropriate attachments (yes, of the type denounced by Siddhartha Gautama). The authors then advance four categories of safeguard to counter the inevitable errors: provide decision makers with new experiences or data and analysis, create group debates which challenge biases, institute governance teams to protect against flawed judgments, and set up extra monitoring processes to track the progress of important decisions.

That is all very fine. But is it an adequate description of and, more importantly, a reliable solution to the problem of faulty managerial decision making? Let’s see.

Consider one of the cases discussed in the book: John F. Kennedy’s handling of the Cuban missile crisis of October 1962. According to the authors, the preceding year’s Bay of Pigs fiasco taught Kennedy a lot, namely, how not to wage war or rattle sabers. This time around, “Kennedy recognized the red flag conditions and created a process that reduced the risk of a flawed decision.” Specifically, “Kennedy set up a decision process to create room for rigorous and multifaceted debate” by forming the ExComm committee of senior advisers. “President Kennedy rejected early options involving air strikes or invasion, asking ExComm to think again to see whether there was a solution that reduced the risks of nuclear war. As a result, they came up with what proved to be the best option: a blockade of Cuba.” (The quotations are from pp. 159 and 160.)

Proved to be the best option?

The 40th Anniversary Conference of the Cuban Missile Crisis, held in Havana, Cuba, on 10-12 October 2002, revealed the following (Source: National Security Archive, George Washington University):

1. US intelligence never located the nuclear warheads for the Soviet missiles in Cuba during the crisis, and located only 33 of what photography later showed to be a total of 42 medium-range ballistic missiles.

2. The US Navy dropped a series of “signaling depth charges” (equivalent to hand grenades) on a Soviet submarine at the quarantine line. According to the Soviet signals intelligence officer on the receiving end inside submarine B-59, Vadim Orlov, the depth charges felt like “sledgehammers on a metal barrel.” Unbeknownst to the Navy, the submarine carried a nuclear-tipped torpedo with orders that allowed its use if the submarine was “hulled” (hole in the hull from depth charges or surface fire).

3. Exhausted by weeks undersea in difficult circumstances and worried that the U.S. Navy’s practice depth charges were dangerous explosives, senior officers on several of the submarines, notably B-59 and B-130, were rattled enough to talk about firing their nuclear torpedoes, whose 15-kiloton explosive yield approximated that of the bomb that devastated Hiroshima in August 1945.

That nuclear war was averted was due to extraordinary prudence on the part of the Soviet leadership and naval commanders, mixed with an abundance of sheer luck — not to a debate-based decision process on the American side, which in several respects was clueless as to pivotal facts. One cannot conclude that, just because the outcome turned out to be fortunate, the decision —or the decision maker(s)— was therefore correct. Had Washington or New York been blown off the map, this book would almost certainly not have declared that ExComm “came up with what proved to be the best option.” The correctness of a decision cannot be predicated on the uncontrollable occurrence of a specific favorable outcome.

The authors go on to claim: “Kennedy found a way of allowing Khrushchev to back down without losing face, by using backdoor Russian contacts to secure a trade: the withdrawal of US missiles stationed in Turkey for Soviet agreement to dismantle the missiles in Cuba.” (p. 160) They repeat that claim on page 168: “… helped him [Kennedy] come up with the idea of trading the missiles in Turkey for those in Cuba.” Those assertions are incorrect. In his letter to Kennedy of 27 October 1962, Khrushchev states (Source: Letter From Chairman Khrushchev to President Kennedy, October 27, 1962, John F. Kennedy Presidential Library & Museum):

“Your missiles are located in Britain, are located in Italy, and are aimed against us. Your missiles are located in Turkey.

“You are disturbed over Cuba. You say that this disturbs you because it is 90 miles by sea from the coast of the United States of America. But Turkey adjoins us; our sentries patrol back and forth and see each other. Do you consider, then, that you have the right to demand security for your own country and the removal of the weapons you call offensive, but do not accord the same right to us? You have placed destructive missile weapons, which you call offensive, in Turkey, literally next to us. How then can recognition of our equal military capacities be reconciled with such unequal relations between our great states?”

“I therefore make this proposal: We are willing to remove from Cuba the means which you regard as offensive. We are willing to carry this out and to make this pledge in the United Nations. Your representatives will make a declaration to the effect that the United States, for its part, considering the uneasiness and anxiety of the Soviet State, will remove its analogous means from Turkey. Let us reach agreement as to the period of time needed by you and by us to bring this about. And, after that, persons entrusted by the United Nations Security Council could inspect on the spot the fulfillment of the pledges made.”

The point to be made here is this: even the authors, who —individually and as a monitoring team— are deliberately focusing their best efforts on explaining and promoting the Think Again framework as the recommended means of safeguarding against errors of judgment, failed to catch and correct the error. Why should one expect things to be any different in the executive suite? This lapse calls into question the credibility of the entire framework, particularly in real-time situations where the available information is rarely unambiguous, complete, properly structured, sufficiently precise or demonstrably accurate.

Consider another case from the book: Paul Wolfowitz’s dalliance with questionable ethics at the World Bank. True to their framework, the authors attribute Wolfowitz’s conduct to inappropriate attachments (pp. 129-34). Perhaps this was far too generous a judgment. Other possibilities spring to mind, including outright corruption and, if Wolfowitz’s role in instigating the still ongoing war in Iraq is allowed to figure in the assessment, plain old managerial incompetence. The possibility arises that by pigeonholing the faculty of reason with preconfigured templates of red flags and safeguards, the Think Again framework may actually hinder the process of procuring accurate interpretations of reality necessary for unbiased and efficacious decision making.

That might explain why the authors’ judgment of Admiral Yamamoto Isoroku in the Battle of Midway (chapter 4) seems biased. Their portrayal of Yamamoto as an inflexible strategist bent on carrying out his pet plan irrespective of the concerns of his superiors (which was a factor, though by no means the only one nor the most critical, as shown below) is compatible with the framework’s one-plan-at-a-time assumption. But never is it mentioned or taken into account (1) that Lt. Col. Jimmy Doolittle’s air raid on Tokyo and other Japanese cities seven weeks before Midway had suddenly made the destruction of the American carriers —and therefore, the Midway operation— a top priority throughout the Imperial Japanese military establishment; (2) that the Battle of the Coral Sea, four weeks before Midway, “the first naval engagement in history in which the participating ships never sighted or fired directly at each other,” had conclusively proved that naval supremacy now depended on the aircraft carrier; and (3) that Yamamoto had no reason to suspect that the Allies had recently cracked the Japanese JN-25 naval code — the sole reason why the American ambush at the Battle of Midway ever came to take place at all. The reality of the situation was much more complex than the facile interpretation given in the book, which, as would be expected, happens to conform to the framework’s a priori worldview. In addition, the authors’ judgments once again show the influence of hindsight. (Sources: Wikipedia; Naval Historical Center, US Navy)

If hindsight serves to prove anything, it would be that Yamamoto’s preoccupation with eliminating the American carriers as soon as possible was indeed rationally justified.

These observations suggest that the Think Again framework could benefit from certain improvements. Interestingly, the main candidate is already listed in the framework, although it is not given the full attention it deserves. The first of the safeguards calls for providing decision makers with “new experiences or data and analysis.” By “data” I understand the authors to mean information. Analysis, however, is the key concept that should be stressed. The authors mention the point now and then, but only once is there a forceful assertion of its importance: “‘Objective strategic analysis’ is close to useless if the key decision makers are not part of that analysis.” Bravo! “Leaders have the responsibility to ensure that those involved in an important decision are [their italics] part of the analysis.” (p. 203)

Executive decision making has traditionally relied much too heavily on judgment and intuition —to wit, personal opinions— which are of limited value when dealing with the complex problems typically faced by modern organizations. Management thinking and practice must adapt to the demands of modern times. It is no longer possible for senior managers to go on acting like brightly garbed generals flaunting plumed headgear while commanding their troops atop handsome white steeds. Those days are gone. Modern-day managers must roll up their shirtsleeves, learn how to use the array of instruments available in the modern cockpit, and learn to think like analysts by becoming analytically involved. Resistance is futile. In a fiercely competitive Darwinian environment, ignoring the new rules of the game will only result in more atrocious decisions, which invariably lead to extinction.

That said, I should add that the book is a worthy and enjoyable read, and should figure on any manager’s reading list. There’s plenty of practical advice for the responsible decision maker who values professional competence.

Note: Michael Dobbs’s One Minute to Midnight: Kennedy, Khrushchev, and Castro on the Brink of Nuclear War (2008) re-examines the Cuban missile crisis in the light of declassified American and Russian documents. Hiroyuki Agawa’s The Reluctant Admiral: Yamamoto and the Imperial Navy (2000) is quite possibly the best biography of the man who planned the attacks on Pearl Harbor and Midway. It is now out of print, but used copies are available.


