What if businesses could learn from their worst mistakes without actually making them? How might the same progress and innovation occur, without firms incurring the costs associated with such errors?
A recent study of close calls in health care suggests that when people feel secure about speaking up at work, incidents in which catastrophe is narrowly averted rise to the surface, spurring important growth and systems improvement.
“People don't pay enough attention, especially in the business world, to the potential goldmine of near-misses,” says Harvard Business School Professor Amy C. Edmondson, who studies psychological safety and organizational learning.
Incidents that almost result in loss or harm often pass unnoticed, in part because workers worry about being associated with vulnerability or failure. But when leaders frame near misses as free learning opportunities and express the value of resilience to their teams, the likelihood that workers will report such incidents increases.
That was the main finding of “Resilience vs. Vulnerability: Psychological Safety and Reporting of Near Misses with Varying Proximity to Harm in Radiation Oncology,” a study by Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School, and Olivia Jung, a doctoral student at HBS. Co-authors on the paper, which was published in The Joint Commission Journal on Quality and Patient Safety, included UCLA physicians Palak Kundu, John Hegde, Michael Steinberg, and Ann Raldow, and medical physicist Nzhde Agazaryan.
A spectrum of close calls
The research team wanted to understand the role of psychological safety—defined as “the shared belief that interpersonal risk-taking is safe”—in determining how likely employees in a radiation oncology department are to report near misses, and whether that likelihood changes based on the nature of the incident.
"A near miss ... can also be thought of as a success, where they say, 'Whew, we caught the error and delivered great care.'"
“What's interesting about a near miss is that it can be thought of as a failure, where people say, ‘Oh, we almost made a huge mistake,’” explains Jung. “That interpretation highlights a vulnerability in the care-delivery processes. But it can also be thought of as a success, where they say, ‘Whew, we caught the error and delivered great care,’ which highlights resilience of care delivery systems.”
To unravel this complexity, the research team surveyed 78 radiation oncology professionals at the University of California, Los Angeles. First, they asked the group about their perceived psychological safety in the department. Overall, they found that individuals felt accountable to each other and comfortable speaking up, but responses varied significantly by position, with higher-ranked employees, like physicians, generally feeling safer to speak their minds than lower-ranked employees, like nurses and therapists. This, says Edmondson, has been a consistent finding in research on teams and psychological safety across a variety of industries.
“Higher-status people are more likely to feel confident that their voice is welcomed,” she says.
Next, the researchers devised a spectrum of hypothetical near misses based on real-life practice. For example, providers must check cancer patients undergoing radiation for pacemakers, which can malfunction during treatment. Employees were asked to rate the likelihood that they would report the following near-miss scenarios, which become progressively more threatening to the patient:
- Could have happened. The pacemaker status of a patient was not checked at initial consultation. By chance, the patient did not have a pacemaker and received radiation without any harm afterwards.
- Fortuitous catch. The pacemaker status was not checked. The patient had a pacemaker, but by chance, a team member noticed this, and the patient’s treatment was postponed until they received clearance.
- Nearly happened. The pacemaker status was not checked. The patient had a pacemaker and received radiation, but, by chance, the patient did not experience complications.
When overlaid with the results of the first survey, the data showed that the closer the situation got to causing patient harm, the more important psychological safety became in determining whether the employees would report the near-miss event.
“With near misses that we characterize as ‘could have happened,’ where the chance event is far from patient harm, and therefore highlights resilience, we find that the role of psychological safety on people's willingness to report is almost negligible,” explains Jung. “But for near misses that we characterize as ‘nearly happened,’ which highlight vulnerability, we find there's a huge effect of psychological safety on people’s willingness to report.”
"Near misses that highlight things that are scary to talk about ... require more psychological safety."
In other words, “near misses that highlight things that are scary to talk about—things that were broken in the system—require more psychological safety as opposed to near misses that are framed as successes,” she says.
The ‘goldmine’ of avoided catastrophes
By framing near misses as important learning opportunities, as well as by fostering psychological safety, business leaders can increase the chances that employees will discuss close calls, potentially helping to avoid costly errors immediately and in the future, says Edmondson.
“If you are a leader and you are framing good catches as examples of vigilance and resilience, and telling people, ‘It’s great when we speak up and catch problems, because nobody's perfect; things do go wrong,’ then you're more likely to hear about them,” she says. “But if you're framing them as failures and screw-ups, you're less likely to hear about them, because everybody knows, if you're the person associated with screw-ups, you get in trouble.”
Oftentimes, high levels of psychological safety go hand-in-hand with a strong sense of shared organizational purpose, Edmondson explains, and that makes for a powerful combination. In some settings, such as radiation oncology, the team is highly motivated to get it right—the stakes could not be higher. The lives of seriously ill patients depend on high-quality care.
"When people don't recognize near-misses as this goldmine, then they're not going to take advantage of them."
In other settings, it may be up to managers to communicate a clear and compelling purpose and to make sure that employees feel their contributions are valued. When employees feel free to express their ideas and concerns, the whole group benefits, particularly when it comes to close calls, Edmondson says.
“In organizations like Toyota, where they recognize the richness of the almost-failure and recognize that those are free learning opportunities, people are more likely to speak up, and everyone learns,” says Edmondson. “But when people don't recognize near-misses as this goldmine, then they're not going to take advantage of them, because people quite often won't even mention them.”
About the Author
Kristen Senz is the growth editor of Harvard Business School Working Knowledge.