Intelligent Failure

By Dominic Tierney

Secretary of Defense James Mattis reportedly said: "I don't lose any sleep at night over the potential for failure. I cannot even spell the word." To paraphrase Trotsky, American generals may not be interested in failure, but failure is interested in them. In recent decades, the United States has suffered a number of stalemates and defeats in Vietnam, Iraq and Afghanistan. Despite the recurrent experience of military fiascos, there is a puzzling discrepancy in how U.S. officials think about past versus future loss. When leaders learn from historical cases, debacles often loom large and powerfully shape policy. But when officials plan prospective operations, they tend to neglect the possibility of disaster. As a result, military planners focus too much on avoiding a repeat of prior reversals, and not enough on the possibility that the new strategy will itself unravel.

One solution is to take inspiration from the business realm, where the school of "intelligent failure" encourages a healthier relationship with loss. By adopting the right set of tools, the military can become more adaptable and resilient.

Civilian and military officials in the U.S. national security community tend to learn more from past failure than from past success. By failure we mean military operations that did not achieve their intended core aims, or whose balance sheet was skewed toward costs rather than benefits. Leaders, for example, often draw historical analogies with prior policies to clarify the strategic stakes in a current issue or to suggest the optimum path forward. Strikingly, these analogies are overwhelmingly negative (do not repeat past errors) rather than positive (copy past successes). No more Munichs. No more Vietnams. No more Iraqs. And so on.

What's more, failure is the primary catalyst for organizational or doctrinal change. It often takes a fiasco to delegitimize standard procedures. For example, America's negative experience in Vietnam in the 1960s and 1970s, as well as in Lebanon in 1982-1984, spurred the Weinberger-Powell doctrine, which outlined a set of principles to assess the wisdom of military operations. More recently, the desire to avoid a repetition of the Iraq War lay at the core of the Obama doctrine.

The tendency to learn more from failures than from successes is rooted in what psychologists call "negativity bias," a core predisposition of the human mind in which bad is stronger than good. Negative factors loom larger than positive factors in almost every realm of psychology, including cognition, emotion and information processing, as well as memory and learning. Bad events are recalled more easily than good events, prompt more intense reflection and "why" questions, and have a far more enduring impact. "Prosperity is easily received as our due, and few questions are asked concerning its cause or author," observed David Hume. "On the other hand, every disastrous accident alarms us, and sets us on enquiries concerning the principles whence it arose."

Recent failures are especially salient because of the "availability heuristic," by which people readily recall vivid events that have just happened and then mistakenly judge them to be representative or likely to recur. For example, purchases of earthquake insurance tend to spike immediately after an earthquake and then drop off as people forget the disaster.

Given that past failure is salient in memory and learning, we might expect that planning for future military operations would also highlight the possibility of loss. But, in fact, the opposite happens. When considering prospective uses of force, officials tend to downplay the possibility of disaster and focus instead on taking the first steps in the strategy of victory. Put simply, past failure is illuminated in bright lights whereas future failure is hidden.

U.S. military war games, for example, often neglect the potential for loss. A 1971 review of education in the U.S. Army found that war games and other exercises were "generally euphoric in nature--the U.S. Army always wins with relative ease." By 2001, the war games were more sophisticated, but the outcome was the same. According to a study by Robert Haffa and James Patton, "the good guys win convincingly and no one gets hurt."

When war games do provide a cautionary tale, the results may simply be ignored. In the early 1960s, before the United States sent ground troops to Vietnam, the RAND Corporation ran a series of war games called SIGMA to simulate a multi-year U.S. campaign in Southeast Asia. Chairman of the Joint Chiefs Maxwell Taylor led the communist side to a crushing victory. Despite this outcome, the United States pressed ahead with an intervention in Vietnam and Taylor maintained confidence that the United States would win--perhaps because the communists would lack the benefit of his leadership.

Preparation for real war may also neglect the possibility of failure. Planning for the Iraq War, for example, was overly optimistic about the stabilization phase and underestimated the risks of disorder and insurgency. The special inspector general for Iraq reconstruction concluded that "when Iraq's withering post-invasion reality superseded [official] expectations, there was no well-defined 'Plan B' as a fallback and no existing government structures or resources to support a quick response."

Why do officials downplay the possibility of future failure...
