    The Hidden Cost of Buying Information

    11/8/2004
    New research from Harvard Business School's Francesca Gino suggests that when we pay for information, we tend to give it more weight than its actual value warrants.

    by Sean Silverthorne, Editor, HBS Working Knowledge

    We all need good information to make decisions—that is why consulting is an industry that never goes out of style. But paying for information can carry a hidden cost: We may give it more weight in our decision making than it deserves.

    That's one of the conclusions drawn by Francesca Gino, a Harvard Business School post-doctoral fellow in the Technology and Operations Management Unit. She recently published a working paper, "Getting Advice from the Same Source but at a Different Cost: Do We Overweigh Information Just Because We Paid for It?"

    Gino's results are based on an experiment in which subjects answered sets of questions about American history and were offered the chance to receive both free advice and costly advice—the same advice, as it turned out. Gino's conclusion: When the advice is costly, subjects are more inclined to take it into consideration and use it. And that conclusion can have profound consequences for consumers, managers, and organizations in their decision making, she says.

    The experiment is part of her wider interest in what might be considered a paradox in today's "Information Age": More information isn't always better. "Past research largely focused on the adverse effects of insufficient information. My interest, instead, lies in the potentially harmful effects of too much information, i.e., the conditions under which an additional or excessive amount of information might be detrimental to decision making," she says.

    In this e-mail interview, Gino discusses her research and its implications for managers and other decision makers.

    Sean Silverthorne: Could you describe your initial hypothesis, your research, and what you found?

    Francesca Gino: A common assumption in formal theories of decision making is that if information is costly to acquire but free to dispose of, then a rational decision maker should acquire information up to the point at which the marginal cost of acquiring additional information equals the marginal benefit. Under the standard model, a decision maker should be willing (and able) to discard (i.e., ignore) information that would not improve the quality of the decision. One of my experimental studies shows that this assumption does not hold: Once one pays for information, it is hard to ignore it.
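    As a rough sketch (the notation here is added for illustration and is not taken from the paper), the rational benchmark amounts to a simple stopping rule:

    \[
    \underbrace{\mathbb{E}\big[\,U(\text{decision with the next piece of information}) - U(\text{decision without it})\,\big]}_{\text{marginal benefit}} \;\ge\; \underbrace{c}_{\text{price of that information}}
    \]

    Keep acquiring information only while this inequality holds; and once c has been paid, it is sunk, so the weight given to the information should depend only on its quality and relevance, not on its price. Gino's experiment tests that second clause.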


    Often information comes to decision makers in the form of advice or, more broadly, as information about others' opinions on the issue at hand. My experimental study focuses on this specific type of information. Drawing on aspects of behavioral decision theory, I argue that the cost of advice, independent of its quality, will affect how it is used. Despite prior theory on the irrational use of information and on the role of advice in decision making, research has not examined the effect of free versus costly advice on the influence of that advice. This is the goal of my study. Thus, my initial hypothesis is that advice is weighed differently depending on whether it is costly or free. In particular, I hypothesize that costly advice is assigned a significantly greater weight than free advice.

    In the experiment, subjects are asked to answer different sets of questions about American history and to provide estimates privately, without communicating with other participants. Before answering some of the questions, they have the opportunity to get advice on the correct answer. In the free-advice treatment, subjects can get the advice for free, while in the costly-advice treatment they can receive it by paying a certain amount of money. In both treatments the advice comes from the same source, i.e., subjects are told explicitly that the quality of the advice remains constant across treatments. Advice is a specific type of information: information in the form of an opinion about a course of action. Specifically, in my experiment, the advice is an estimate of the correct answer to the question at hand.

    Using a within-subjects design (i.e., each subject experiences both the free-advice and the costly-advice treatments), I find that decision makers weigh the advice they get significantly more when they pay for it than when they get it for free. Equivalently, individuals discount others' opinions significantly more when advice is free than when it is costly. I refer to this bias as "information overweight."

    Q: What role does advice play in decision making? Are there real-world examples that illustrate your findings?

    A: When you think about it, most decisions made by consumers or managers are influenced to some degree by advice. For instance, individual consumers often take advice from friends, relatives, or experts when buying products such as a laptop or a car; when choosing a doctor; or when choosing a job or reflecting on a career path. When faced with medical decisions, people commonly solicit second opinions.

    And it is hard to think of important management decisions made without soliciting or receiving advice from colleagues, supervisors, project team members, or external experts such as consultants. This advice often has a strong influence on which decisions are actually made. Seeking advice usually entails some cost in time, effort, or money. If you travel to another city to seek a second medical opinion, you likely incur a significant cost. And if your company hires a consulting firm or investment bank to provide advice on a strategic decision, there are significant costs involved.

    There are no studies yet documenting this tendency in real-world settings; that is something I hope to do in my future research. However, there are situations where one gets a sense that this tendency to overweigh costly information is at work. A dietician once told me that his patients were more inclined to follow a diet (and thus lose weight!) if they had paid a high cost for their visit than if they had obtained the diet for free. A similar effect might be at play in financial planning or investment advice. And one often hears managers lament their companies' tendency to listen to the advice of consultants over the advice of internal people. In real-world settings, of course, many factors may be at work. In some cases the second medical opinion may in fact be of better quality, or the consultant's advice may be based on better information or a more objective view. So the mere fact that costly advice is followed is not by itself evidence of overweighting. That's why I plan to do more careful empirical studies to sort out the effects.

    Q: Although we are likely to give more weight to advice we pay for, does it follow that we are also more likely to act upon that advice?

    A: To capture the impact of advice on subjects' behavior (i.e., whether subjects follow the advice they get), I use a measure developed by Ilan Yaniv (2003) and labeled "weight of advice," or WOA. The measure reflects how much weight a subject gives to the advice she receives, and therefore how much she uses it. Thus, in my experiment, how much weight participants give to advice is equivalent to how much use they make of it—how much they act upon it.
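    As a minimal illustration (the definition below follows Yaniv's standard formulation of the measure; the function name and numbers are hypothetical, not from the experiment), WOA is the fraction of the distance toward the advice that a subject moves when revising her estimate:

        # Weight of advice (WOA): |final - initial| / |advice - initial|
        # WOA = 0 means the advice was ignored; WOA = 1 means it was fully adopted.
        def weight_of_advice(initial, advice, final):
            """Fraction of the distance toward the advice that the subject moved."""
            if advice == initial:
                return None  # advice coincides with the subject's own estimate; WOA is undefined
            return abs(final - initial) / abs(advice - initial)

        # Hypothetical example: a subject first guesses 1850 for a history question,
        # receives advice of 1870, and revises to 1855 (advice largely discounted)
        # or to 1865 (advice weighted heavily).
        print(weight_of_advice(1850, 1870, 1855))  # 0.25
        print(weight_of_advice(1850, 1870, 1865))  # 0.75

    Gino's finding is that, on average, this ratio is significantly higher in the costly-advice treatment than in the free-advice treatment.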

    Q: Not only do people in your study give more weight to paid information, but they give significantly more weight. Why?

    A: There are two factors that might be at work. One is the effect of "sunk costs." In traditional economic theory, decision makers should not worry about sunk costs. Bygones are bygones. However, there is now quite a bit of evidence showing that people tend to take into account prior investments as they consider the next course of action. This is known as the "sunk cost fallacy." My results on information overweight are highly consistent with a sunk cost fallacy. Once people have paid for information, they feel a greater need to use that information so as not to "waste" the initial investment.

    A second possible explanation is cognitive dissonance, which refers to the discomfort felt when what one already knows or believes conflicts with new information or a new interpretation. That is, cognitive dissonance arises when there is a need to accommodate new information or ideas (in my study, information about the opinions of others).

    Thus, if subjects pay for advice, they might justify such payment by using the new information to update the beliefs they had prior to receiving advice.

    Q: Sometimes we pay for information with means other than money: time, effort, and so on. Although not a subject of your work, do you have a sense whether information overweight would occur in these non-monetary situations as well?

    A: I do believe information overweight would occur in non-monetary situations. Often the role money plays in decision making is not different from the one played by time or effort. Think for instance about the psychology of sunk costs. Whether the investment that has been made is in money, effort, or time, the fact that it has been made results in a tendency to continue the endeavor. Similarly, it is reasonable to expect that if we spend a lot of time looking for information, once we have it available we are more inclined to use it.

    Q: Are there implications for consultancies from your work? What are the implications for decision makers?

    A: Well, first let me say that the work I've done so far needs much more empirical validation before we can place much confidence in its implications. However, if my findings stand up to additional testing and validation, they do have some interesting implications both for managers and for those who are in the business of providing advice, such as consultants.

    It is really important for managers to understand the biases they may fall prey to in either soliciting or taking advice. Just being aware that you may have a tendency to overweigh some kinds of advice, and that this may lead to bad decisions, could be a helpful first step.

    For consultants, investment bankers, lawyers, and other professional advice givers, the implications of my results are less clear. Perhaps they could look at my results as a justification for their high fees! Seriously, the results suggest that fee structures may well influence the extent to which clients respond to the advice they give. This might be very relevant for companies that offer consulting as an ancillary service to their core business. These companies often struggle with how to price their consulting services. Should they charge full market rates, or should they offer these services at a discount to help create demand for their core business? If there is a tendency to overweigh costly advice, the discounting strategy may not actually be effective.

    Q: What are the limits of your study? For example, your experiments focused on quantitative rather than qualitative estimates. What didn't you learn about this subject?

    A: Whether the information overweight bias occurs when information comes in the form of qualitative advice instead of quantitative estimates is a matter I'd like to explore in future work. In my study I used numerical estimates as advice, i.e., "quantitative" advice. One might wonder whether the results would change in the case of qualitative advice. Qualitative advice takes different forms: an opinion about how to choose among several options, about how to play a certain game (as in the studies by [Andrew] Schotter at New York University and his colleagues), or about matters of taste, e.g., opinions about a product or an entertainment activity.

    Admittedly, my experimental study is not immune from limitations. For instance, it is open to the critiques about external validity commonly leveled at laboratory experiments. As I mentioned before, the outside world is more complicated than what we simulate in the lab. Yet, in favor of an experimental approach, laboratory experiments allow one to examine relevant attributes of subjects, specify parameters, and control for individual differences on relevant dimensions. My view is that experimental and empirical studies are both necessary and complementary.

    Q: What are you working on next?

    A: Even though advice-taking and advice-giving are common activities across a wide range of contexts, the conditions under which people use the advice they have received or gathered are not well understood. The study I conducted is only a first step toward a better understanding of when, how, and why individuals take advice.

    The study suggests several lines of future research. First, I need to develop a better understanding of decision makers' sensitivity to the cost incurred to acquire advice. What happens if one either increases or decreases the cost of advice? The effect I found on the weight assigned to the advice suggests that how much subjects paid was salient, but the fee was not so high that participants declined to buy that piece of information. However, I need a more comprehensive understanding of how the size of the cost people incur to acquire advice affects both their willingness to buy it and measures such as the weight of advice or the weight of one's own estimate. The experimental data do not yet indicate the extent to which the size of the cost matters.

    There are other questions I'd like to explore in future work. For instance: Are people interested in knowing where the advice comes from? Do people behave differently toward advice based on what they know about their advisors? Suppose people are given information about their advisors that is irrelevant to the task they are asked to solve: Do they pay attention to such information?

    Finally, let me conclude by saying a few words about my work in progress. As part of my current research on advice-taking, Don Moore (of the Tepper School of Business at Carnegie Mellon University) and I are exploring whether people behave differently toward advice depending on the task they are facing. That is, does advice-taking depend on task difficulty? And if so, do people weigh others' opinions more or less than they should? Does this change with the difficulty of the task?
