Working Knowledge
Business Research for Business Leaders
Racism and Digital Design: How Online Platforms Can Thwart Discrimination

Research & Ideas | 28 Jul 2020 | by Kristen Senz

Poor design decisions contribute to racial discrimination on many online platforms. Michael Luca and colleagues offer tips, used by Airbnb and other companies, for reducing the risk.

    Having uncovered the scope of discrimination taking place against Black guests and hosts on Airbnb, researcher Michael Luca and his colleagues put together a toolkit to aid managers in recognizing and mitigating discrimination on online platforms.

    Platforms like Airbnb and Uber can reduce the prevalence of discrimination among their users by making careful design choices, says Luca, the Lee J. Styslinger III Associate Professor of Business Administration at Harvard Business School, who has been studying platform design for more than a decade.

    "The choices you make as a manager can have a profound impact on the inclusivity of the systems you are creating," says Luca, who has also coauthored a new case study, Racial Discrimination on Airbnb: The Role of Platform Design. "Platform companies should think through the implications of design choices, including unintended consequences like discrimination."

    A 2015 study by Luca and colleagues Benjamin Edelman, now an economist at Microsoft, and Dan Svirsky, an economist at Uber, used booking requests by 20 mock Airbnb profiles without photos to gauge discrimination among 6,400 Airbnb hosts in five American cities. To isolate the issue of race, half the fake profiles were given names that, according to birth records, are common among whites, while half had names common among African Americans.

    "The choices you make as a manager can have a profound impact on the inclusivity of the systems you are creating."

    Requests made from profiles with African American-sounding names were about 16 percent less likely to be accepted. According to the study, discrimination was pervasive across price points and in a variety of neighborhoods, but most of the rejections came from hosts who had never hosted a Black guest. In a related paper, researchers found that African American hosts in New York City typically charged less than other hosts on the platform for similar listings.
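The headline figure is a relative gap in acceptance rates between the two name groups. A minimal sketch of the arithmetic, using hypothetical counts chosen to produce a 16 percent relative gap (these are not the study's raw data), shows how such a gap and its statistical significance could be computed:

```python
import math

# Hypothetical audit counts (illustrative only; not the study's raw data).
# Each booking request came from a profile whose name signals a racial group.
requests = {"white-sounding": 3200, "black-sounding": 3200}
accepted = {"white-sounding": 1600, "black-sounding": 1344}

p_w = accepted["white-sounding"] / requests["white-sounding"]  # 50.0%
p_b = accepted["black-sounding"] / requests["black-sounding"]  # 42.0%

# Relative gap: 42% is 16% lower than 50%, matching "16 percent less likely".
relative_gap = (p_w - p_b) / p_w

# Two-proportion z-test for whether the gap could plausibly be chance.
pooled = sum(accepted.values()) / sum(requests.values())
se = math.sqrt(pooled * (1 - pooled)
               * (1 / requests["white-sounding"] + 1 / requests["black-sounding"]))
z = (p_w - p_b) / se

print(f"acceptance: white-sounding {p_w:.1%}, black-sounding {p_b:.1%}")
print(f"relative gap: {relative_gap:.0%}, z = {z:.1f}")
```

With samples of this size, a gap this large is far outside what sampling noise would produce, which is why the audit design with paired fictitious profiles is persuasive evidence of discrimination.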

    "The research demonstrates that African American guests were getting rejected more often than white guests," Luca says. "More broadly, evidence points to disparities on both sides of the market—both African American guests and hosts face discrimination."

    After the research was made public, policymakers took note. Users began to push back. The hashtag #AirbnbWhileBlack started trending. In 2016, Airbnb CEO Brian Chesky responded to the findings and publicly acknowledged that the potential for discrimination on the platform hadn't occurred to him or his two cofounders prior to the site's launch, a blind spot he attributed in part to the fact that all three founders of the company were white males.

    Airbnb set up a task force to determine the extent of the problem and evaluate proposed fixes, some of which Luca had recommended and described in a Harvard Business Review article coauthored with Ray Fisman. The company has since made a series of announcements about its efforts to measure and mitigate racial discrimination among users, most recently through Project Lighthouse.

    Poor design invites discrimination

Airbnb's early design choices were aimed at building users' trust in both the platform and one another, but some of those choices had the unintended consequence of enabling discrimination. For example, when the site launched in 2008, would-be guests' names and profile pictures appeared prominently on the booking requests hosts saw before accepting. This design contrasted with existing platforms such as eBay, where images of products dominated pages and users remained anonymous.

    At the time, Chesky was quoted as saying: "Access is built on trust, and trust is built on transparency. When you remove anonymity, it brings out the best in people. We believe anonymity has no place in the future of Airbnb or the sharing economy."

    But as it turned out, putting users' photos front and center made discrimination easier, according to Luca's research. In response to the research and growing public pressure, hosts now only see a guest's profile photo once a booking is confirmed. Additionally, the company updated its policies and terms of service. Luca also points to increased use of Airbnb's "instant book" feature, which allows guests to book rentals before hosts view their profiles, as a move in the right direction.

    Through experimentation, Luca says, platform companies like Airbnb can surface design elements that align with both firm performance goals and inclusivity.

"If you're running an online marketplace," he says, "you should think about whether discrimination is likely to be a problem, and what design choices might mitigate the risk of discrimination. If you're optimizing only for short-run growth metrics and ignoring the potential for discrimination, you might be creating a blind spot."

    A manager’s toolkit for platform design

    The first step to building inclusive online platforms, says Luca, is for designers and decision-makers to recognize the potential for discrimination to occur.

    Luca and Svirsky outline a framework for making inclusive design choices in a forthcoming article in the journal Marketing Intelligence Review. What follows is a condensed version:

    Build awareness. Digital platform builders must recognize how their design choices and algorithms can lead to discrimination in a marketplace. Managers can be proactive about investigating and tackling the problem. For example, Uber created a cross-functional Fairness Working Group made up of economists, data scientists, lawyers, and product managers to explore discrimination issues.

    Measure discrimination. Many platforms do not know the racial, ethnic, or gender composition of their transaction participants. A regular report on problems and successes among users who are at risk of being discriminated against can help companies reveal and confront issues.

    Withhold sensitive data. In many cases, a simple but effective change involves withholding potentially sensitive user information, such as race and gender, in the early stages of engagement with the platform.

Automate with awareness of algorithmic bias. Automation and algorithms can help reduce bias, as in the case of Airbnb's instant booking feature. However, discrimination can also occur through algorithms themselves. Algorithms can be debiased by altering their inputs, but doing so requires managers to think through their goals around diversity and fairness. For example, LinkedIn redesigned its recruitment search tool to ensure that the gender breakdown of search results matches the gender breakdown of that occupation as a whole: if 30 percent of data scientists are women, a recruiter searching for data scientists will see results that are 30 percent women.
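The re-ranking idea behind the LinkedIn example can be sketched as a greedy interleave of two relevance-ranked lists, keeping the running share of one group near a target. This is a generic illustration of representative re-ranking, not LinkedIn's actual implementation; all names and numbers are hypothetical:

```python
def rerank(candidates, target_share):
    """Greedily merge group-A and group-B candidates (each already sorted
    by relevance) so that, at every prefix of the result list, the share
    of group-A candidates stays as close as possible to target_share."""
    group_a = [c for c in candidates if c[1] == "A"]
    group_b = [c for c in candidates if c[1] == "B"]
    out = []
    while group_a or group_b:
        a_so_far = sum(1 for _, g in out if g == "A")
        # Adding an A next keeps the running share at or below the target?
        fits = (a_so_far + 1) / (len(out) + 1) <= target_share + 1e-9
        if group_a and (fits or not group_b):
            out.append(group_a.pop(0))
        else:
            out.append(group_b.pop(0))
    return out

# Hypothetical search results: 3 women, 7 men, target share 30 percent.
women = [("W1", "A"), ("W2", "A"), ("W3", "A")]
men = [(f"M{i}", "B") for i in range(1, 8)]
ranked = rerank(women + men, target_share=0.3)
print([name for name, _ in ranked])
```

The design choice worth noting is that the constraint applies to every prefix, not just the full list: recruiters who only look at the first page still see a representative slate.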

    Think like a choice-architect. The principles of choice architecture can help reduce discrimination. For example, people tend to use whatever option is set up as the default, so resetting default options with inclusivity in mind can be a useful strategy. Companies can also consider increasing the prominence of their anti-discrimination policies to help raise awareness.

    Experiment to measure effects. Platforms can incorporate efforts to measure discrimination into their experimental testing to understand the impact of different design choices.
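Folding discrimination metrics into A/B testing amounts to tracking the between-group acceptance gap in each experimental arm and asking whether a design change narrows it. A minimal sketch with entirely hypothetical numbers:

```python
# Hypothetical A/B results: does a design change (e.g., delaying profile
# photos until after booking) narrow the acceptance gap between groups?
# Each entry is (accepted, total_requests); all figures are illustrative.
arms = {
    "control":   {"group_a": (420, 1000), "group_b": (500, 1000)},
    "treatment": {"group_a": (480, 1000), "group_b": (505, 1000)},
}

def gap(arm):
    """Acceptance-rate gap: group_b rate minus group_a rate."""
    (a_acc, a_n), (b_acc, b_n) = arm["group_a"], arm["group_b"]
    return b_acc / b_n - a_acc / a_n

for name, arm in arms.items():
    print(f"{name}: acceptance gap = {gap(arm):.1%}")

# Difference-in-differences: how much did the design change shrink the gap?
effect = gap(arms["control"]) - gap(arms["treatment"])
print(f"gap reduction attributable to the change: {effect:.1%}")
```

Reporting this gap alongside standard growth metrics is what keeps inclusivity from becoming, in Luca's phrase, a blind spot.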

    Be transparent. Platforms should make their work on issues of discrimination transparent and open up lines of communication with managers and designers. It is also essential to evaluate methods for measuring discrimination and associated design changes over time.

    About the Author

    Kristen Senz is a writer and social media editor for Harvard Business School Working Knowledge.



    Related Reading

    • Airbnb Hosts Discriminate Against African-American Guests
    • Uncovering Racial Discrimination in the ‘Sharing Economy’
    • The Case Against Racial Colorblindness
