Planning for Surprises

A company doesn't need a crystal ball to see impending disasters. Harvard Business School professor Max H. Bazerman and INSEAD professor Michael D. Watkins explain how to foresee and avoid predictable surprises.
by Martha Lagace

The train wreck that was Enron's collapse is only one big, blatant example of how some disasters catch us unawares—but shouldn't. In fact, according to Max H. Bazerman and Michael D. Watkins, many surprises in all types and sizes of organizations are predictable and avoidable. Predictable surprises, say Bazerman and Watkins, are a common form of leadership failure. "Predictable surprises happen when leaders had all the data and insight they needed to recognize the potential, even the inevitability, of major problems, but failed to respond with effective preventative action," they say. Here's the good news: There are reasons why leaders fail to prevent predictable surprises and there are ways to identify trouble while there is still time to stop it.

As authors of a new book from Harvard Business School Press, Predictable Surprises: The Disasters You Should Have Seen Coming and How to Prevent Them, Bazerman and Watkins recently collaborated on the following e-mail interview with HBS Working Knowledge.

Martha Lagace: What distinguishes a predictable surprise from any event seen with 20/20 hindsight?

Max Bazerman and Michael Watkins: Bad surprises really do happen to good leaders. But there must be a point where we hold leaders accountable for their failure to prevent predictable surprises. There must be a point at which we conclude that leaders have been misguided or negligent or both.

When should we hold our leaders accountable for failing to take actions to prevent predictable surprises, and when should we give them a pass? If they have made reasonable efforts to scan the environment and tap into available information, yet still fail to see the piano dropping, we should let leaders off the hook. They likewise get a pass if they do what a reasonable person would do to make prevention of a potential disaster a priority in the face of competing priorities, and to mobilize a response in the face of concerted opposition from special interests.

Q: Why are predictable surprises so common?

A: Our research shows that there are psychological, organizational, and political factors that conspire to keep us from dealing with problems that are worthy of our attention.

Psychological vulnerabilities have to do with well-recognized biases in the way people think, such as self-serving illusions and overcommitment, as well as the tendency to stick with the status quo and to discount the future.

Organizational vulnerabilities arise because of structural barriers to the effective collection, processing, and dissemination of information, such as the division of organizations into independently operating silos and the filtering of information as it passes up through the hierarchies.

Political vulnerabilities contribute to predictable surprises when a small number of individuals and organizations are able to "capture" the political system or organization for their own benefit.

The area of decision bias has grown as an important lens of analysis in many areas of business, from finance to marketing to negotiations. We also believe that cognitive biases explain why we allow predictable surprises to occur. For example, people tend to exist in a state of denial that leads them to undervalue risks. In addition, people overly discount the future, reducing their willingness to invest in the present to prevent some disaster that may be quite distant.

People also try to maintain the status quo, creating a barrier to the dramatic changes that are needed to address predictable surprises. Finally, many people are more willing to run the risk of incurring a large but small-probability loss in the future than to accept a smaller, yet certain, loss now. We don't want to invest in preventing a problem that we have not experienced and can hardly bear to imagine. Thus, far too often, we only address problems after the surprise has occurred.

Q: Your book uses examples of widely known surprises in the public record, including the 9/11 attacks. As a business example, you use what you call the looming crisis of frequent-flyer programs. Why do you see frequent-flyer programs as a "house of cards"?

A: Predictable surprises loom in most organizations. Frequent-flyer programs are simply one example that affects a lot of people. Most of us collect our miles and even think of them as an asset that we can rely on using in the future. We think this optimism is misplaced. For many U.S. airlines, the debt owed to customers in the form of miles is significantly larger than the airlines' market capitalization. This is simply not sustainable. Airlines have already reduced the value of miles by making seats less and less available. But a larger predictable surprise still awaits. We stick by our recommendation: Use your miles!

Q: What other looming, predictable surprises have you noticed since the book manuscript was completed?

A: The flu vaccine crisis is a classic example of a predictable surprise. As The New York Times noted in the October 17, 2004 Health section, "The shortage caught many Americans by surprise, but it followed decades of warnings from health experts who said the nation's system for vaccine supply and distribution was growing increasingly fragile." There have been numerous disruptions in the supply of vaccines for flu and other diseases. In addition to disruptions in the flu vaccine supply in each of the past four years, the Times notes that there have been shortages in eight of the eleven vaccines for childhood diseases in the U.S.

Why were we vulnerable to this predictable surprise? Because the U.S. has become dependent on just two suppliers, while Great Britain maintains five suppliers to reduce the risk of supply disruptions. While fear of lawsuits has played a role, the fundamental problem is that the economics of vaccine production are unattractive for pharmaceutical companies in comparison to other opportunities. So the private sector has responded rationally, by exiting the business. Of the twenty-five companies that made vaccines in the U.S. thirty years ago, only five remain today. The federal government has, regrettably, permitted this to happen by not treating vaccines as a critical public good and providing appropriate subsidies for their production. One contributing problem is the absence of a single agency responsible for overseeing the U.S. vaccine supply.

Q: You write in the book, "By deliberately assuming a veil of ignorance, individuals can learn to see beyond themselves and more effectively ward off a predictable surprise." What's a veil of ignorance and how can it help executives avoid predictable surprises?

A: American philosopher John Rawls developed the concept of a veil of ignorance to encourage us to think about what the situation would look like without the partisan perceptions that we bring to most issues. We always have our own views. But the key issue here is the difference between hypotheses and assumptions.

In practice it is difficult to assume complete ignorance. But it is essential to try to be conscious of the assumptions you are making about what is possible and critically what is not possible. To the extent that you can treat these as hypotheses to be rigorously challenged and tested, rather than as assumptions that are taken for granted, you reduce the potential to be predictably surprised. People always learn with a point of view; the key is to be open to altering that point of view in the face of reality.

For example, we question whether the Bush administration considered the issue of weapons of mass destruction (WMD) in Iraq from an objective standpoint (under a veil of ignorance), or whether the administration started with a partisan perception and then avoided any reasonable challenge to that view. Richard Clarke, the former counterterrorism czar under Presidents Clinton and Bush, made a compelling case that the administration never escaped its desire to see the data as it wanted to see it.

Q: At the organizational level, you make the point that special interest groups can really interfere with efforts to avert a predictable surprise. What is the power of special interest groups?

A: Any time reform would yield broad but modest gains and deep but narrow costs, you can expect the potential losers to become strongly mobilized to oppose any change. They can often do this successfully until the wall begins to collapse and a crisis allows leaders to overcome their resistance.

Far too often, a select group is successful in lobbying the U.S. Congress to the detriment of society. In our book, we document the dysfunctional role of the airlines in weakening airline security throughout the '90s and into the twenty-first century. We also show the dysfunctional role of the lead auditing firms in keeping auditor independence from becoming a reality. Our book is another call for meaningful campaign finance reform in this country. Until this occurs, special interest groups will continue to cost society in terms of the loss of jobs and the growth of the deficit, and by extension in the lives of citizens.

About the Author

Martha Lagace is senior editor of Working Knowledge.