What is the right mix between intuition and analysis? Several clear themes characterized responses to this month's column. Dominant among these was that the best way to reach a decision depends on a number of factors, including the nature of the decision, the nature of the decider, the information available, history, experience, the number of deciders, and so forth.
Nevertheless, several comments reflected an uneasy fondness for a good dose of intuition in the mix. Guy Gould-Davies' comment was particularly insightful: "The idea of using feeling in the context of decision making makes many people highly uncomfortable which is why intuition gets a bad rap. (It implies) … emotion … a lack of discipline and robustness in analysis … the lack of control (replicability)." Pallavi Marathe put it this way: "'Careful Decisions' is a paradox …. If there is past data available to help predict the future, it may be a good idea to refer to it. But in most cases, the decision maker is posed with a unique challenge." Vanitha Rangganathan, arguing for the role of intuition in the creative process, commented that "Experience makes us personally wiser …. 'Wisdom of crowds' breeds convenient conformity and creativity is often lost in the process."
At the other end of the intuition-analysis spectrum, R. C. Saxena opined, "I believe intuition ought not to play any part …. Sincere effort to harness all the collective wisdom coupled with a commitment to deliver the Complete Solution ought to be the key."
Most argued for a process involving intuition based on analysis and experience. Rowland Freeman commented, "A great deal depends on the magnitude of the decision…. The lesser the impact, go with experience and intuition." As Marlis K. put it, "… the question should not be rational decision making OR intuition, but rather … how to combine both." David Kendall said, "In the most difficult case of no-time and high-risk, reliance on 'rational intuition' may be a preferred way to minimize direct and/or collateral damage if the decision goes wrong." Luis X. B. Mourao opined that "… while (a model) helps to diligently collect and analyze relevant data, it only gets you so far. Add experience and it will get you a step further."
Some of the most interesting comments raised questions about whether we should instead concentrate on ways to make our own decision-making processes more transparent to others and to ourselves. Edward Hare put it this way: "Openness and honesty are the sunlight that's needed to make … decision making as effective and efficient as it ought to be …." This may require a certain amount of self-awareness. Maree Conway said that "Our worldview conditions what we accept and don't accept as real, and this conditions how we make decisions. Being aware of your worldview and biases is the first step to wise decision making …." Jeremy Stunt commented that "I have learned that it is helpful to host a 'conversation' between my rational/analytical side and my intuitive side." Phil Clark's advice provides a useful close: "If you are seeking the perfect solution, you likely are making no decisions and letting life pass you by. That may be the saddest decision anyone makes." What do you think?
In his book Blink, discussed in this column in February 2005, Malcolm Gladwell advised us to place faith in intuition based on experience when deciding many things quickly. Now Michael Mauboussin, with his book Think Twice, makes the case for a more careful approach, suggesting that we place too much emphasis on intuition and personal experience as opposed to the "wisdom of crowds," mathematical models, and systematically collected data. He argues that "blink" serves us well in stable environments where feedback from previous decisions is clear and where cause-and-effect relationships can be identified. Unfortunately, in his view, these conditions are increasingly rare. As he puts it, "intuition is losing relevance in an increasingly complex world … more is different." What's new here, you ask? Perhaps these sound like "dog bites man" assertions.
I'll risk oversimplifying a complex set of arguments this way: citing a wide range of examples and research, Mauboussin argues that we rely on experts (as opposed to diverse "crowds") too frequently, and that we too often fail to identify the nature of the problem, match solution techniques to problems, seek diversity in our feedback, and use technology where possible.
Among other things, he argues, we overlook the subtle biases that our experiences impose on our independence as decision-makers, we decide too frequently based on our emotional reactions to risk (playing the lottery even when we know better, for example), and we succumb to pressure to follow the group. As decision-makers, we are products of our environment to a greater degree than we realize. We take credit for things outside our control while blaming others for failure in similarly uncontrollable circumstances. We hire "stars," only to watch them burn out in a new and different managerial environment. We look for "best practice" (à la Jim Collins in Good to Great and others) in highly complex situations where there is little comparability and therefore no best practice, only "it all depends." Worse yet, we are often not conscious of these influences.
Mauboussin maintains that we too often underestimate the importance of luck in the outcomes of our decisions, employing Nobel Prize-winner Daniel Kahneman's observation that success requires some talent and some luck, while great success requires some talent and a lot of luck. The importance of this observation is that systems involving significant amounts of luck (as investing does for many people) revert to the mean for the group over time, a fact that can be used to make better decisions without the influences described above. It's why, for example, some successful investors simply choose stocks of the bottom companies in the Dow Jones average in the preceding year when making their investments for the coming year.
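The selection rule described above can be sketched in a few lines of code. This is a minimal illustration, not an implementation of any investor's actual strategy: the tickers and returns below are invented for the example, and the rule is simply "rank last year's performers and pick the worst n."

```python
# Hedged sketch of the mean-reversion selection rule mentioned above:
# rank components by prior-year return and pick the n worst performers.
# All tickers and return figures here are hypothetical.

def bottom_performers(prior_year_returns, n=5):
    """Return the n tickers with the lowest prior-year returns."""
    ranked = sorted(prior_year_returns.items(), key=lambda kv: kv[1])
    return [ticker for ticker, _ in ranked[:n]]

# Invented prior-year returns for a handful of hypothetical tickers.
returns_last_year = {
    "AAA": 0.12, "BBB": -0.08, "CCC": 0.03,
    "DDD": -0.15, "EEE": 0.21, "FFF": -0.02,
}

picks = bottom_performers(returns_last_year, n=3)
print(picks)  # the three worst performers: ['DDD', 'BBB', 'FFF']
```

The point of the sketch is its mechanical simplicity: the rule consults only last year's data, so none of the psychological influences Mauboussin describes can leak into the choice.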
Is intuition losing its relevance in an increasingly complex world? Will we need to turn increasingly to such things as quantitative models, "prediction markets" (where people bet on their views), the wisdom of crowds, and even such things as models based on "system dynamics" developed at MIT in the 1960s? And should we rely less on so-called "experts" and "stars"? In short, should we be spending more time examining our true decision-making abilities and the things that influence our results, i.e., more time "thinking twice" than "blinking"? What do you think?
To read more:
Michael J. Mauboussin, Think Twice: Harnessing the Power of Counterintuition (Boston: HBS Press, 2009).