20 Apr 2011  Research & Ideas

Blind Spots: We’re Not as Ethical as We Think

Even when we think we are making principled decisions, recent research reveals we are not as ethical as we would like to believe. Professor Max H. Bazerman discusses his new book, Blind Spots: Why We Fail to Do What's Right and What to Do about It. Plus: Book excerpt. Key concepts include:

  • Good people do bad things without being aware that they are doing anything wrong.
  • Motivational blindness is the tendency to not notice the unethical actions of others when it is against our own best interests to notice.
  • The "want" self—that part of us that behaves according to self-interest and, often, without regard for moral principles—is silent during the planning stage of a decision but typically emerges and dominates at the time of the decision.
  • Organizations can monitor how they are creating institutions, structures, and incentives that increase the likelihood of unethical actions, while individuals can "precommit" to intended ethical choices.

 

Think back to recent events when people making unethical decisions grabbed the headlines. How did auditors approve the books of Enron and Lehman Brothers? How did feeder funds sell Bernard Madoff's investments? We would never act as they did, we think. We operate under a higher standard.

But the fact is that while we like to think of ourselves as fair, moral, and lawful, recent science shows us that we are quite capable of committing unethical acts, or approving of the dishonest acts of others, even as we believe we are doing the right thing.

Recognizing why we do this and how we can get out of the trap is the subject of the new book, Blind Spots: Why We Fail to Do What's Right and What to Do about It, by Max H. Bazerman, a professor at Harvard Business School, and Ann E. Tenbrunsel, a professor of business ethics at the University of Notre Dame.

In short, there is a gap between intended and actual behavior, according to the authors. The rapidly developing field of behavioral ethics has described a decision-making process whereby we recognize what we should do—give equal weight to job candidates of all races, for example—but in the end do what we want to do—hire just white candidates.

Such actions are not without consequences. The Challenger space shuttle explosion, steroid use in major league baseball, and the financial crash are all results of unethical decision-making, even though the participants at the time may have seen themselves acting in the right.

We asked Bazerman to discuss some of the ideas behind the book. A book excerpt follows.

Sean Silverthorne: Why did you write this book, and who should read it?

Max Bazerman: Research over the last two decades has documented that good people do bad things without being aware that they are doing anything wrong. Yet, training in ethics and corporate programs focus on intentional acts. We saw an opportunity to contribute to our understanding of how so many unethical acts occur.

Q: Why don't traditional approaches to thinking about ethics work?

A: Most ethicists define ethics to involve intentional action. Yet, if unethical actions are occurring without intent, we need to solve those problems as well. Our book is an attempt to move in this direction.

Q: What are ethical blind spots, and how do they influence our decisions?

A: There are many. But a few that will illustrate the point include having gender or race biases without knowing that you have these biases, overclaiming credit without meaning to do so, being affected by conflicts of interest, and favoring an in-group—such as universities often do when they give preferential treatment to the children of alumni. All these unethical actions can occur without the people involved realizing that they are doing anything wrong.

Q: What is motivational blindness?

A: Motivational blindness is the tendency to not notice the unethical actions of others when it is against our own best interests to notice—such as auditors who fail to notice the faulty accounting practices of their clients, who have the power to fire them if they do notice.

Q: The book is full of examples of people making decisions that they thought were ethical but clearly violated their own standards for ethical behavior—for example, decisions that set the stage for the subprime mortgage crisis. Is there anything in the news more recently that illustrates some of your points?

A: Sure. Despite attempts at auditor reform a decade ago, we see Ernst & Young charged with contributing to the fall of Lehman Brothers. One can certainly ask whether Ernst & Young didn't notice what was wrong with Lehman's books because noticing was not in Ernst & Young's interest. We can tell the same story with the securities rating agencies and their role in our recent financial collapse.

Q: The book is a little down on organizational programs designed to encourage ethical behavior. You say in general they don't work, and in fact can cause unethical behavior. What's the problem?

A: I would say that we see current programs as limited, due to the limited attention given to bounded ethicality—or the ways in which good people do bad things without knowing that they are doing so.

Q: What are some steps individuals and organizations can take to make decisions that are truly in line with their own ethical views?

A: Organizations can monitor how they are creating institutions, structures, and incentives that increase the likelihood of bounded ethicality. When industries allow conflicts of interest to remain, the leaders are responsible for the boundedly unethical actions that follow.

Q: How do you become aware of your blind spots?

A: By looking at the data. If you firmly believe that you want to give women and minorities greater opportunities in your organization, but the data show that you always seem to see the white male as the best candidate, this might provide a hint.

Book excerpt from Blind Spots: Why We Fail to Do What's Right and What to Do about It.

Preparing to Decide: Anticipating the "Want" Self

The "want" self—that part of us which behaves according to self-interest and, often, without regard for moral principles—is silent during the planning stage of a decision but typically emerges and dominates at the time of the decision. Not only will your self-interested motives be more prevalent than you think, but they likely will override whatever "moral" thoughts you have. If you find yourself thinking, "I'd never do that" and "Of course I'll choose the right path," it's likely your planning efforts will fail, and you'll be unprepared for the influence of self-interest at the time of the decision.

One useful way to prepare for the onslaught of the "want" self is to think about the motivations that are likely to influence you at the time you make a decision, as Ann [Tenbrunsel] and her colleagues have demonstrated in their research. Drawing on the sexual harassment study discussed in chapter 4, participants were asked to predict how they would react if a job interviewer asked questions that qualified as sexual harassment. Participants who were induced to think about the motivation they likely would experience at the time of the decision—the desire to get the job—were significantly less likely to predict that they would confront the harasser and more likely to predict that they would stay silent (just as those in the actual situation did) than were those who were not asked to think about the motivation they would experience at the time of the decision. As this study suggests, thinking about your motivations at the time of a decision can help bring the "want" self out of hiding during the planning stage and thus promote more accurate predictions.

Narrowing the Gap

To help our negotiation students anticipate the influence of the "want" self on decisions that have an ethical dimension, we ask them to prepare for the very question they hope won't be asked. When preparing for a job negotiation, for example, we encourage them to be ready to field questions about other offers they may have. Otherwise, when a potential employer asks "What's your other salary offer?" an applicant's "want" self might answer "$90,000," when the truthful answer is $70,000. If an applicant has prepared for this type of question, her "should" self will be more assertive during the actual interview, leading her to answer in a way that's in harmony with her ethical principles, yet still strategic: "I'm afraid I'm not comfortable revealing that information."

"You might also precommit to your intended ethical choice by sharing it with an unbiased individual."

Similarly, rehearsing or practicing for an upcoming event, such as a work presentation or exams, may help you focus on concrete details of the future situation that you might otherwise overlook. In her book Giving Voice to Values, Mary Gentile offers a framework to help managers prepare for difficult ethical decisions by practicing their responses to ethical situations. When you are able to project yourself into a future situation, almost as if you were actually in it, you can better anticipate which motivations will be most powerful and prepare to manage them.

The point of increasing your accuracy in the planning stage of decision making isn't to recognize that you will be influenced by self-interested motives and admit defeat to the "want" self. Rather, it's to arm you with accurate information about your most likely response so that you can engage in proactive strategies to reduce that probability. Knowing that your "want" self will exert undue pressure at the time of the decision and increase the odds that self-interest will dominate can help you use self-control strategies to curb that influence.

One such strategy involves putting in place precommitment devices that bind you to a desired course of action. In one example, Philippine farmers who saved their money by putting it in a "lockbox" that they could not access were able to save more money than those who did not, even factoring in the small cost of the lockbox. By eliminating the farmers' ability to spend their money immediately, the lockbox effectively constrained the "want" self. Ann's teaching assistant used a similar precommitment strategy to constrain her "want" self during finals week. Knowing she should study but would be tempted to procrastinate by spending time on Facebook, she had her roommate change her password so that she could not access the social networking site. By doing so, the student constrained her "want" self from acting and allowed her "should" self to flourish. Such precommitment devices explain the popularity of personal trainers at health clubs. By making appointments with a trainer (who might charge up to $100 an hour) with the threat of a cancellation fee, clients precommit to their "should" self, ensuring that they will work out rather than giving in to the strong pull of the "want" self and watching TV instead.

When faced with an ethical dilemma, we can use similar strategies to keep our "want" self from dominating more reasoned decision making. Research on the widespread phenomenon of escalation of commitment—our reluctance to walk away from a chosen course of action—shows that those who publicly commit to a decision in advance are more likely to follow through with the decision than are those who do not make such a commitment. You might also precommit to your intended ethical choice by sharing it with an unbiased individual whose opinion you respect and whom you believe to be highly ethical. In doing so, you can induce escalation of commitment and increase the likelihood that you will make the decision you planned and hoped to make.

About the author

Sean Silverthorne is editor-in-chief of HBS Working Knowledge.

Excerpted with permission of Princeton University Press. Blind Spots: Why We Fail to Do What's Right and What to Do about It, by Max H. Bazerman and Ann E. Tenbrunsel. Copyright © 2011 by Princeton University Press. All rights reserved.


 

Comments

    • Al Watts
    • Owner, inTEgro, Inc.

    Good to know about this book; thanks for the excerpts. I've noticed another way that blind spots get in the way of ethical performance: blindness to aspects of our character or organizational culture that will lead to trouble if not known and managed. Using the Hogan Development Survey, for example, a CEO with excessive "Boldness" who is unaware of its downsides could easily lead an organization down an overly risky, "damn the torpedoes" path (e.g., BP or the banks that operated more like casinos).

     
     
     
    • Sudheer Thaakur
    • Academic, BITS, Pilani, India

    Want will continue to trump morality as long as we continue with a culture that celebrates results over process and path. Morality is like hygiene: nobody notices its presence, but its absence is felt. Can we have a culture that rewards morality just as it rewards results? I doubt it very much. At a personal level we can learn to be more moral as, in more and more situations, we consciously lock out "want" options, as in the Facebook example quoted. As an individual progresses on this path of lockout, s/he will find it increasingly easier to take recourse to the lockout strategy in ever more difficult and complex situations.
    But in organisational situations it becomes dicey. We often believe that organisations have no morals but are wedded to results only. CEOs succumb easily to this because the reward system is so skewed.

     
     
     
    • David Physick
    • Consultant, Glowinkowski International Limited

    This is excellent stuff, especially the comments concerning 'unconscious bias' or, as Max terms it, 'blind spots'. My friend and colleague, Ian Dodds, iandodds@iandoddsconsulting.com, has done lots of work in this arena and is worth tapping into. In terms of gender differences, a recent report we issued highlights the psychological differences between men and women and why the predominance of the 'alpha-male' has got us into the difficulties we presently endure. See http://www.glowinkowski.com/img/Approach-and-Toolkit/GPI/Impact-of-Gender.pdf

     
     
     
    • Paul Metler
    • Pastor, Heritage Fellowship

    Thanks for the work. I look forward to reading more. At the personal level, "accountability" is an overused word that seldom has any real teeth. Your insight is very practical. I appreciate the methods you suggest to overcome the powerful sway of the "want" self. Especially, I believe that your description of partnering with someone who is "unbiased" shows great promise. Unfortunately, our "want" self will tend to gravitate toward individuals who will tell us what we want to hear. For executive leaders, it requires a significant investment to build a relationship where "truth-telling" is safe. Executive leaders often have very few people in their "inner circle" who are truly unbiased.

     
     
     
    • Balachandran T S
    • Deputy General Manager - Purchase Quality & Supplier Development, Bosch

    I agree fully that blind spots can come in the way of ethical behaviour. However, any individual will always bring along with him/her subjectivity-related bias ....it can also be a "don't want" self. This can even stem from their own very recent experience - pleasant or unpleasant - which can lead to temporary bias clouding their decision. Using inner circles as sounding boards will help. One can also look at consciously inducing an opposing bias in a transparent way... The next five hires will be women, or of a particular race, or whatever. However, this may lead to meritocracy taking the rear seat. So what should we do to introspect ethically prior to firming up on a decision and be driven by our own conviction?

     
     
     
    • G.P.Rao.
    • Founder Chairman., Spandan (Foundation for Human Values in management and society), India.

    The concepts of the Want self and Should self and their relevance to the theme at hand are interesting, ingenious and instructive. One possible reason is separation of personal/individual Want and Should selves from organisational/institutional Want and Should selves: Want Self: individual; Should Self: institutional.

    An act which is unethical individually is rationalized - sincerely at that - as needed by the organisation, as in its interest, as congruent to the social ambience in which the organisation is located and functioning, and on the grounds that trying to defy such social ambience may endanger the very existence of the organisation. This perhaps explains why many organisations, highly professional and reputed otherwise, resort to 'accommodating' the authorities and agencies concerned by offering bribes through consultants. G.P.Rao.

     
     
     
    • Anonymous

    The subject of "Want Self" is operating under political, which comes under the sub-categories of many psychological feelings such as doubt, greed, jealousy, revenge, and super-ego. These situations cannot be communicated in any language.

     
     
     
    • Ashraf Khan
    • Governance advisor, De Nederlandsche Bank

    Great reading, but...

    The described "lockout strategy" essentially puts negative incentives on giving in to the "want" self (the Facebook example). As Mr Thaakur, comment no. 2, points out, this can become more difficult to maintain, especially in organisations.

    What is needed instead is a positive incentive structure: rewards for not giving in to the "want" self, instead of punishments. A personal example: I want to go to the gym because it's healthy. But it's so darn boring. My incentive, however, is that I work my way through the necessary machines whilst listening to the latest music on my mp3 player - something I don't get to do much anymore. I am now actually looking forward to going to the gym.

    And, because I (like most humans) have an "action bias" (I value results based on actions more than similar results based on non-actions), my "going-to-the-gym-because-I-actually-want-to" is going to make me happier (light up my prefrontal cortex :-)) and more likely to do it again (even if the positive incentive disappears).

     
     
     
    • Lindsay Thompson
    • Associate Professor, Carey Business School

    The "should" and "want" self are a version of the Socratic charioteer's horses -- with some updated research from behavioral science that is very useful. I would also like to suggest, however, that the rich trove of literature on cultural/individual reflexivity, especially in the myth/ritual domain, is worth a look. (e.g., for starters, Douglas, Leach, Foucault, Anthony Cohen) This literature elucidates the way culture shapes the "want" and "should" selves -- and also demonstrates that the "want" self is influenced by affiliational bonds. I would suggest that we also think more about how informal rituals of organizational cultures are very "sticky" and can be either positive or negative influences on the "want" self. When they align and resonate with the "should" self, they can be a powerful force for the good. When they contradict or misalign, they can be a source of sabotage.

     
     
     
    • Roy Bhikharie
    • Managing-Director, Papaya Media Counseling

    Blind spots, unconscious bias, motivational blindness etc. are just labels to justify serving self-interest while lacking empathy: "The Devil wears Prada!"

     
     
     
    • Vasudev Das
    • Doctoral researcher of Applied Management and Decision Sciences, Walden University., ISKCON

    Re: Ethical Decision-making

    First, I would like to thank Professor Max Bazerman for generating the theme "Blind Spots: We're Not as Ethical as We Think" for discussion because it is very relevant to effective and sustainable development. Every organization needs ethical decision-making to thrive (Velasquez, Moberg, Meyer, Shanks, McLean, DeCosse, Andre, & Hanson, 2010).

    There are organismic and sociological factors which militate against ethical decision-making. Discernibly, endogenous factors such as bias, anger, greed, lust, the mind's demands, integration of the three gunas or qualitative modes of material nature (namely, the modes ignorance, passion, and goodness), identity crisis, pathological state of consciousness, low self-control, phenomenological mindset, and maladjustment play some role in ethical decision-making.

    Essentially, it is germane to be sure that our decision-making process has passed the litmus test of social and psychological sanity so as to be effectively positioned on the pivot of ethical thinking inasmuch as decision-makers imbued with phenomenological mindset, maladjustment, pathological state of consciousness, bias, identity crisis, etc. may not facilitate ethical decision-making in organizations.

    Policy should be promulgated that mandates policy formulators to undergo psychological tests to ensure that such decision-makers have zero insanity. A people-oriented mindset of decision-makers will undoubtedly enhance ethical decision-making. Furthermore, since sonic therapeutic intervention has been proven as an antidote to depression, stress, lust, anger, low self-control, etc., it is worthwhile for decision-makers to be administered with sonic therapy, inasmuch as it would improve ethical decision-making. Cultivation of transcendental knowledge in self-realization enhances emancipation from identity crisis. If decision-makers rise above their mundane or bodily consciousness, and see all and sundry or at least members of their constituency as supramundane entities with intrinsic connectedness, then that will help to invoke ethical decision-making. Ethical decision-making necessitates that decision-makers would have to walk their talk, and that requires high self-control. General Arjuna (Prabhupada, 2011) posed a query to his mentor as to why people are impelled to act against their better judgment.

    According to applied Vedic science (Prabhupada, 2011), endogenous lust is the root cause of unethical decisions. Applied Vedic science (Prabhupada, 2011) asseverates that sonic therapeutic intervention is a guaranteed strategy to check lust, which is a major impediment to ethical decision-making.

    References

    Prabhupada, A.C.B.S. (2011). Bhaktivedanta VedaBase. Los Angeles, CA: Bhaktivedanta Book Trust International.

    Velasquez, M., Moberg, D., Meyer, M.J., Shanks, T., McLean, M.R., DeCosse, Andre, C. & Hanson, K.L. (2010). A framework for thinking ethically. Santa Clara University. Retrieved on May 10, 2011, from http://www.scu.edu/ethics/practicing/decision/framework.html