High-Stakes Decision Making:
The Lessons of Mount Everest
On May 10, 1996, five mountaineers from two teams perished while climbing Mount Everest. Is there anything business leaders can learn from the tragedy? HBS professor Michael A. Roberto used the tools of management to find out.
Editor's Note— What went wrong on Mount Everest on May 10, 1996? That day, twenty-three climbers reached the summit. Five climbers, however, did not survive the descent. Two of these, Rob Hall and Scott Fischer, were extremely skilled team leaders with much experience on Everest. The world's mightiest mountain has never been a cakewalk: since 1922, 148 people have lost their lives attempting to reach the summit.
Newspaper and magazine articles and books—most famously, Jon Krakauer's Into Thin Air: A Personal Account of the Mount Everest Disaster—have attempted to explain how events got so out of control that particular day. Several explanations compete: human error, weather, all the dangers inherent in human beings pitting themselves against the world's most forbidding peak.
A single cause of the 1996 tragedy may never be known, says HBS professor Michael A. Roberto. But perhaps the events that day hold lessons, some of them for business managers. Roberto's new working paper describes how. Here follows an excerpt from "Lessons From Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity."
Implications for leaders
This multi-lens analysis of the Everest case provides a framework for understanding, diagnosing, and preventing serious failures in many types of organizations. However, it also has important implications for how leaders can shape and direct the processes through which their organizations make and implement high-stakes decisions. The Everest analysis suggests that leaders must pay close attention to how they balance competing pressures in their organizations, and how their words and actions shape the perceptions and beliefs of organization members. In addition, the case provides insight regarding how firms approach learning from past failures.
Balancing competing forces
The Everest case suggests that leaders need to engage in a delicate balancing act with regard to nurturing confidence, dissent, and commitment within their organizations. First, executives must strike a balance between overconfidence on the one hand and insufficient confidence on the other. Leaders must act decisively when faced with challenges, and they must inspire others to do so as well. A lack of confidence can enhance anticipatory regret, or the apprehension that individuals often experience prior to making a decision. High levels of anticipatory regret can lead to indecision and costly delays.71 This anxiety can be particularly problematic for executives in fast-moving industries. Successful management teams in turbulent industries develop certain practices to cope with this anxiety. For instance, some leaders develop the confidence to act decisively in the face of considerable ambiguity by seeking the advice of one or more "expert counselors," i.e., highly experienced executives who can serve as confidants and sounding boards for various ideas.72 Naturally, too much confidence can become dangerous as well, as the Everest case clearly demonstrates. To combat overconfidence, leaders must seek out information that disconfirms their existing views, and they should discourage subordinates from hiding bad news. Leaders also must take great care to separate facts from assumptions, and they must encourage everyone to test critical assumptions vigorously to root out overly optimistic projections.
Fostering constructive dissent poses another challenge for managers. As we see in the Everest case, insufficient debate among team members can diminish the extent to which plans and proposals undergo critical evaluation. Flawed ideas remain unchallenged, and creative alternatives are not generated. On the other hand, when leaders arrive at a final decision, they need everyone to accept the outcome and support its implementation. They cannot allow continued dissension to disrupt the effort to turn that decision into action. As Cyrus the Great once said, leaders must balance the need for "diversity in counsel, unity in command." To accomplish this, leaders must ensure that each participant has a fair and equal opportunity to voice their opinions during the decision process, and they must demonstrate that they have considered those views carefully and genuinely. Moreover, they must clearly explain the rationale for their final decision, including why they chose to accept some input and advice while rejecting other suggestions.73 By doing so, leaders can encourage divergent thinking while building decision acceptance.
Finally, leaders must balance the need for strong buy-in against the danger of escalating commitment to a failing course of action over time. To implement effectively, managers must foster commitment by providing others with ample opportunities to participate in decision making, ensuring that the process is fair and legitimate, and minimizing the level of interpersonal conflict that emerges during the deliberations. Without strong buy-in, they risk numerous delays, including efforts to re-open the decision process after implementation is underway. However, leaders must be aware of the dangers of over-commitment to a flawed course of action, particularly after employees have expended a great deal of time, money, and effort. The ability to "cut your losses" remains a difficult challenge as well as a hallmark of courageous leadership. Simple awareness of the sunk cost trap will not prevent flawed decisions. Instead, leaders must be vigilant about asking tough questions such as: What would another executive do if he assumed my position today with no prior history in this organization?74 Leaders also need to question themselves and others repeatedly about why they wish to make additional investments in a particular initiative. Managers should be extremely wary if they hear responses such as: "Well, we have put so much money into this already. We don't want to waste all of those resources." Finally, leaders can compare the benefits and costs of additional investments with several alternative uses of those resources. By encouraging the consideration of multiple options, leaders may help themselves and others recognize how over-commitment to an existing project may be preventing the organization from pursuing other promising opportunities.
Shaping perceptions and beliefs
The Everest case also demonstrates how leaders can shape the perceptions and beliefs of organization members, and thereby affect how these individuals will interact with one another and with their leaders in critical situations. Hall and Fischer made a number of seemingly minor choices about how the teams were structured that had an enormous impact on people's perceptions of their roles, status, and relationships with other climbers. Ultimately, these perceptions and beliefs constrained the way that people behaved when the groups encountered serious obstacles and dangers.
Leaders can shape the perceptions and beliefs of others in many ways. In some cases, the leaders' words or actions send a clear signal as to how they expect people to behave. For instance, Hall made it very clear that he did not wish to hear dissenting views while the expedition made the final push to the summit. Most leaders understand the power of these very direct commands or directives. However, this case also demonstrates that leaders shape the perceptions and beliefs of others through subtle signals, actions, and symbols. For example, the compensation differential among the guides shaped people's beliefs about their relative status in the expedition. It is hard to believe that the expedition leaders recognized that their compensation decisions would impact perceptions of status, and ultimately, the likelihood of constructive dissent within the expedition teams. Nevertheless, this relatively minor decision did send a strong signal to others in the organization. The lesson for managers is that they must recognize the symbolic power of their actions and the strength of the signals they send when they make decisions about the formation and structure of work teams in their organizations.
Learning from failure
Often, when an organization suffers a terrible failure, others attempt to learn from the experience. Trying to avoid repeating the mistakes of the past seems like an admirable goal. Naturally, some observers attribute the poor performance of others to human error of one kind or another. They blame the firm's leaders for making critical mistakes, at times even going so far as to accuse them of ignorance, negligence, or indifference. Attributing failures to the flawed decisions of others has certain benefits for outside observers. In particular, it can become a convenient argument for those who have a desire to embark on a similar endeavor. By concluding that human error caused others to fail, ambitious and self-confident managers can convince themselves that they will learn from those mistakes and succeed where others did not.75
This research demonstrates a more holistic approach to learning from large-scale organizational failures. It suggests that we cannot think about individual, group, and organizational levels of analysis in isolation. Instead, we need to examine how cognitive, interpersonal, and systemic forces interact to affect organizational processes and performance. System complexity, team structure and beliefs, and cognitive limitations are not alternative explanations for failures, but rather complementary and mutually reinforcing concepts.
Business executives and other leaders typically recognize that equifinality characterizes many situations. In other words, most leaders understand that there are many ways to arrive at the same outcome. Nevertheless, we have a natural tendency to blame other people for failures, rather than attributing the poor performance to external and contextual factors.76 We also tend to pit competing theories against one another in many cases, and try to argue that one explanation outperforms the others. The Everest case suggests that both of these approaches may lead to erroneous conclusions and reduce our capability to learn from experience. We need to recognize multiple factors that contribute to large-scale organizational failures, and to explore the linkages among the psychological and sociological forces involved at the individual, group, and organizational system levels. In sum, all leaders would be well-served to recall Anatoli Boukreev's closing thoughts about the Everest tragedy: "To cite a specific cause would be to promote an omniscience that only gods, drunks, politicians, and dramatic writers can claim."77
72. For more on the issue of developing confidence to make decisions quickly in turbulent environments, see: K. Eisenhardt, "Making Fast Strategic Decisions in High-Velocity Environments," Academy of Management Journal, 32 (1989): 543-576.
73. See A. Korsgaard, D. Schweiger, & H. Sapienza, "Building Commitment, Attachment, and Trust in Strategic Decision-Making Teams: The Role of Procedural Justice," Academy of Management Journal, 38 (1995): 60-84.
74. In the famous story of Intel's exit from the DRAM business, this is exactly what Gordon Moore and Andrew Grove asked themselves as they were contemplating whether to continue investing in the loss-making DRAM business.
75. Jon Krakauer has cautioned that this could occur quite easily with respect to the Everest tragedy. In his book, he wrote, "If you can convince yourself that Rob Hall died because he made a string of stupid errors and that you are too clever to repeat those same errors, it makes it easier for you to attempt Everest in the face of some rather compelling evidence that doing so is injudicious." (pp. 356-357).
76. E. Jones and R. Nisbett, "The Actor and the Observer: Divergent Perceptions of the Causes of Behavior," in E. Jones, D. Kanouse, H. Kelley, R. Nisbett, S. Valins, and B. Weiner, eds., Attribution: Perceiving the Causes of Behavior (General Learning Press, 1971).
Excerpted with permission from the working paper "Lessons From Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity," Michael A. Roberto, 2002.