26 Aug 2002  Research & Ideas

High-Stakes Decision Making:
The Lessons of Mount Everest

On May 10, 1996, five mountaineers from two teams perished while climbing Mount Everest. Is there anything business leaders can learn from the tragedy? HBS professor Michael A. Roberto used the tools of management to find out. Plus: Q&A with Michael Roberto


Editor's Note— What went wrong on Mount Everest on May 10, 1996? That day, twenty-three climbers reached the summit. Five climbers, however, did not survive the descent. Two of these, Rob Hall and Scott Fischer, were extremely skilled team leaders with much experience on Everest. The world's tallest mountain has never been forgiving: 148 people have lost their lives attempting to reach the summit since 1922.

Newspaper and magazine articles and books—most famously, Jon Krakauer's Into Thin Air: A Personal Account of the Mount Everest Disaster—have attempted to explain how events got so out of control that particular day. Several explanations compete: human error, weather, all the dangers inherent in human beings pitting themselves against the world's most forbidding peak.

A single cause of the 1996 tragedy may never be known, says HBS professor Michael A. Roberto. But perhaps the events that day hold lessons, some of them for business managers. Roberto's new working paper describes how. Here follows an excerpt from "Lessons From Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity."

Implications for leaders

This multi-lens analysis of the Everest case provides a framework for understanding, diagnosing, and preventing serious failures in many types of organizations. However, it also has important implications for how leaders can shape and direct the processes through which their organizations make and implement high-stakes decisions. The Everest analysis suggests that leaders must pay close attention to how they balance competing pressures in their organizations, and how their words and actions shape the perceptions and beliefs of organization members. In addition, the case provides insight regarding how firms approach learning from past failures.

Balancing competing forces

The Everest case suggests that leaders need to engage in a delicate balancing act with regard to nurturing confidence, dissent, and commitment within their organizations. First, executives must strike a balance between overconfidence on the one hand and insufficient confidence on the other. Leaders must act decisively when faced with challenges, and they must inspire others to do so as well. A lack of confidence can enhance anticipatory regret, or the apprehension that individuals often experience prior to making a decision. High levels of anticipatory regret can lead to indecision and costly delays.[71] This anxiety can be particularly problematic for executives in fast-moving industries. Successful management teams in turbulent industries develop certain practices to cope with this anxiety. For instance, some leaders develop the confidence to act decisively in the face of considerable ambiguity by seeking the advice of one or more "expert counselors," i.e., highly experienced executives who can serve as confidants and sounding boards for various ideas.[72] Naturally, too much confidence can become dangerous as well, as the Everest case clearly demonstrates. To combat overconfidence, leaders must seek out information that disconfirms their existing views, and they should discourage subordinates from hiding bad news. Leaders also must take great care to separate facts from assumptions, and they must encourage everyone to test critical assumptions vigorously to root out overly optimistic projections.

Fostering constructive dissent poses another challenge for managers. As we see in the Everest case, insufficient debate among team members can diminish the extent to which plans and proposals undergo critical evaluation. Flawed ideas remain unchallenged, and creative alternatives are not generated. On the other hand, when leaders arrive at a final decision, they need everyone to accept the outcome and support its implementation. They cannot allow continued dissension to disrupt the effort to turn that decision into action. As Cyrus the Great once said, leaders must balance the need for "diversity in counsel, unity in command." To accomplish this, leaders must ensure that each participant has a fair and equal opportunity to voice their opinions during the decision process, and they must demonstrate that they have considered those views carefully and genuinely. Moreover, they must clearly explain the rationale for their final decision, including why they chose to accept some input and advice while rejecting other suggestions.[73] By doing so, leaders can encourage divergent thinking while building decision acceptance.

Finally, leaders must balance the need for strong buy-in against the danger of escalating commitment to a failing course of action over time. To implement effectively, managers must foster commitment by providing others with ample opportunities to participate in decision making, ensuring that the process is fair and legitimate, and minimizing the level of interpersonal conflict that emerges during the deliberations. Without strong buy-in, they risk numerous delays, including efforts to re-open the decision process after implementation is underway. However, leaders must be aware of the dangers of over-commitment to a flawed course of action, particularly after employees have expended a great deal of time, money, and effort. The ability to "cut your losses" remains a difficult challenge as well as a hallmark of courageous leadership. Simple awareness of the sunk cost trap will not prevent flawed decisions. Instead, leaders must be vigilant about asking tough questions such as: What would another executive do if he assumed my position today with no prior history in this organization?[74] Leaders also need to question themselves and others repeatedly about why they wish to make additional investments in a particular initiative. Managers should be extremely wary if they hear responses such as: "Well, we have put so much money into this already. We don't want to waste all of those resources." Finally, leaders can compare the benefits and costs of additional investments with several alternative uses of those resources. By encouraging the consideration of multiple options, leaders may help themselves and others recognize how over-commitment to an existing project may be preventing the organization from pursuing other promising opportunities.

Shaping perceptions and beliefs

The Everest case also demonstrates how leaders can shape the perceptions and beliefs of organization members, and thereby affect how these individuals will interact with one another and with their leaders in critical situations. Hall and Fischer made a number of seemingly minor choices about how the teams were structured that had an enormous impact on people's perceptions of their roles, status, and relationships with other climbers. Ultimately, these perceptions and beliefs constrained the way that people behaved when the groups encountered serious obstacles and dangers.

The ability to "cut your losses" remains a difficult challenge as well as a hallmark of courageous leadership.
— Michael A. Roberto

Leaders can shape the perceptions and beliefs of others in many ways. In some cases, the leaders' words or actions send a clear signal as to how they expect people to behave. For instance, Hall made it very clear that he did not wish to hear dissenting views while the expedition made the final push to the summit. Most leaders understand the power of these very direct commands or directives. However, this case also demonstrates that leaders shape the perceptions and beliefs of others through subtle signals, actions, and symbols. For example, the compensation differential among the guides shaped people's beliefs about their relative status in the expedition. It is unlikely that the expedition leaders recognized that their compensation decisions would affect perceptions of status, and ultimately, the likelihood of constructive dissent within the expedition teams. Nevertheless, this relatively minor decision did send a strong signal to others in the organization. The lesson for managers is that they must recognize the symbolic power of their actions and the strength of the signals they send when they make decisions about the formation and structure of work teams in their organizations.

Learning from failure

Often, when an organization suffers a terrible failure, others attempt to learn from the experience. Trying to avoid repeating the mistakes of the past seems like an admirable goal. Naturally, some observers attribute the poor performance of others to human error of one kind or another. They blame the firm's leaders for making critical mistakes, at times even going so far as to accuse them of ignorance, negligence, or indifference. Attributing failures to the flawed decisions of others has certain benefits for outside observers. In particular, it can become a convenient argument for those who have a desire to embark on a similar endeavor. By concluding that human error caused others to fail, ambitious and self-confident managers can convince themselves that they will learn from those mistakes and succeed where others did not.[75]

The lesson for managers is that they must recognize the symbolic power of their actions and the strength of the signals they send.
— Michael A. Roberto

This research demonstrates a more holistic approach to learning from large-scale organizational failures. It suggests that we cannot think about individual, group, and organizational levels of analysis in isolation. Instead, we need to examine how cognitive, interpersonal, and systemic forces interact to affect organizational processes and performance. System complexity, team structure and beliefs, and cognitive limitations are not alternative explanations for failures, but rather complementary and mutually reinforcing concepts.

Business executives and other leaders typically recognize that equifinality characterizes many situations. In other words, most leaders understand that there are many ways to arrive at the same outcome. Nevertheless, we have a natural tendency to blame other people for failures, rather than attributing the poor performance to external and contextual factors.[76] We also tend to pit competing theories against one another in many cases, and try to argue that one explanation outperforms the others. The Everest case suggests that both of these approaches may lead to erroneous conclusions and reduce our capability to learn from experience. We need to recognize multiple factors that contribute to large-scale organizational failures, and to explore the linkages among the psychological and sociological forces involved at the individual, group, and organizational system level. In sum, all leaders would be well-served to recall Anatoli Boukreev's closing thoughts about the Everest tragedy: "To cite a specific cause would be to promote an omniscience that only gods, drunks, politicians, and dramatic writers can claim."[77]

Five Questions for Michael A. Roberto

Why study Mount Everest? Professor Roberto described what managers can learn from mountain climbing in an e-mail interview with HBS Working Knowledge senior editor Martha Lagace.

Lagace: In your new research, you tried to learn from a tragic episode on Mount Everest. You've applied a variety of theories from management to study why events on May 10, 1996 went horribly wrong. What interested you in the Everest case, and why did you decide to delve further using the tools of management?

Roberto: When I read Jon Krakauer's best-selling account of this tragedy, entitled Into Thin Air, I became fascinated with the possibility of using this material as a tool for teaching students about high-stakes decision making. After all, here you had two of the most capable and experienced high-altitude climbers in the world, and they both perished during one of the deadliest days in the mountain's history. It struck me that the disastrous consequences had more to do with individual cognition and group dynamics than with the tactics of mountain climbing.

In addition, I am always searching for material from outside of the business environment that can be used in our classrooms at HBS. I believe that there are important lessons that we can learn by examining case studies from other fields. Students find the material refreshing, and they enjoy trying to learn about management by studying experts in other domains.

Q: In hindsight, it is very easy to point a finger and assign blame to individuals involved in the climb. You resist that temptation. Why?

A: If we simply attribute the tragedy to the inadequate capabilities of a few climbers, then we have missed an opportunity to identify broader lessons from this episode. Many of us often fall into the trap of saying to ourselves, "That could never happen to me," when we observe others fail. The fact is that there may be powerful reasons why many people would fail under similar circumstances. It seemed that this might be the case here, and that's what motivated me to consider several different conceptual explanations for the tragedy.

Q: Overconfidence, an unwillingness to "cut one's losses," and a reliance on the most recent information are all psychological factors that can play into high-stakes decisions. You suggest that people dealing with risk—be they expedition leaders or executives—are very susceptible to these emotions. How might they have applied on Mount Everest that day?

A: First and foremost, I would advocate strict adherence to a turn-around time. In this case, the climbers ignored the conventional wisdom, which suggests that they should turn back if they cannot reach the summit by one o'clock in the afternoon. A strictly enforced rule would help protect them against the sunk cost effect, i.e., the tendency to continue climbing because of the substantial prior commitment of time, money, and other resources.

As for the overconfidence bias, I would suggest that expeditions assign someone with a great deal of credibility and experience to be the contrarian during the climb. That person would be responsible for identifying risks, questioning the judgment of other guides and climbers, and reminding everyone of the reasons why many people have died on the slopes of Everest.

Finally, I think the climbers should maintain radio communication with some expert hikers who are not involved in their expedition. Their emotional distance from the effort may enable these experts to offer unbiased guidance and to provide a more balanced assessment of the risks involved in particular situations.

Q: You also looked at the Everest tragedy through the lens of group dynamics. How, in a nutshell, do you think group dynamics could have influenced climbers' actions that day?

A: I would argue that the groups developed a climate that was hostile to open discussion and constructive dissent. One expedition leader went so far as to say, "I will tolerate no dissension...my word will be absolute law." Not surprisingly, people suppressed their concerns and doubts about some of the poor judgment and choices that were made during the climb.

For instance, one survivor lamented that he did not "always speak up when maybe I should have." One factor that contributed to the lack of candid discussion was the perceived differences in status among expedition members. For example, one climber said that he did not speak up when things began to go wrong because he "was quite conscious of his place in the expedition pecking order."

The unwillingness to question team procedures and exchange ideas openly prevented the group from revising and improving their plans as conditions changed.

Q: Many pieces of a puzzle need to interlock successfully for a team to climb a mountain or execute a high-pressure business decision. What is often the role of complexity in these kinds of situations?

A: The idea here is that climbing Everest entails a complex system of activities and behaviors. Two characteristics of this system—complex interactions and tight coupling—enhanced the likelihood of a serious accident.

First, complex interactions means that different elements of the system interacted in ways that were unexpected and difficult to perceive or comprehend in advance. This led to a series of small, but interconnected, breakdowns and failures that became part of a dangerous "domino effect."

Second, tight coupling means that there was a fairly rigid sequence of time-dependent activities, one dominant path to achieving the goal, and very little slack in the system. These characteristics made it easier for a problem in one area to quickly trigger failures in other aspects of the climb.

Footnotes:

71. For a more extensive discussion of anticipatory regret, see I. Janis & L. Mann, Decision Making: A Psychological Analysis of Conflict, Choice, and Commitment (New York: Free Press, 1977).

72. For more on the issue of developing confidence to make decisions quickly in turbulent environments, see: K. Eisenhardt, "Making Fast Strategic Decisions in High-Velocity Environments," Academy of Management Journal, 32 (1989): 543-576.

73. See A. Korsgaard, D. Schweiger, & H. Sapienza, "Building Commitment, Attachment, and Trust in Strategic Decision-Making Teams: The Role of Procedural Justice," Academy of Management Journal, 38 (1995): 60-84.

74. In the famous story of Intel's exit from the DRAM business, this is exactly what Gordon Moore and Andrew Grove asked themselves as they were contemplating whether to continue investing in the loss-making DRAM business.

75. Jon Krakauer has cautioned that this could occur quite easily with respect to the Everest tragedy. In his book, he wrote, "If you can convince yourself that Rob Hall died because he made a string of stupid errors and that you are too clever to repeat those same errors, it makes it easier for you to attempt Everest in the face of some rather compelling evidence that doing so is injudicious." (pp. 356-357).

76. E. Jones and R. Nisbett, "The Actor and the Observer: Divergent Perceptions of the Causes of Behavior," in E. Jones, D. Kanouse, H. Kelley, R. Nisbett, S. Valins, and B. Weiner, eds., Attribution: Perceiving the Causes of Behavior (General Learning Press, 1971).

77. Boukreev and DeWalt, op. cit., pp. 226-227.

Excerpted with permission from the working paper "Lessons From Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity," Michael A. Roberto, 2002.