    Why Technology Alone Can't Solve AI's Bias Problem

    Research & Ideas | 16 Dec 2022 | by Michael Blanding
    Engineers designed "fair-ranking algorithms" to prevent artificial intelligence from marginalizing certain groups. While these tools help, research by Himabindu Lakkaraju finds that they can't completely override the most stubborn source of bias: people.

    In a cluttered online world, few can resist the convenience of an automated ranking when deciding what movie to watch on Netflix or which seafood restaurant looks promising in a Google search. But when it comes to finding a job candidate or someone to do a basic household task, there’s often a human toll to letting algorithms do the work.

    Searches on popular recruiting sites might seem like a neutral way to find prospective candidates, but their underlying technology can reinforce biases by excluding underrepresented groups, including women. For instance, research shows that women receive fewer employment reviews on the popular online freelancing site TaskRabbit compared to men with the same experience—and this lack of reviews can lower the rankings of women in talent search algorithms.

    “Maybe there is a bias from people who have been traditionally hiring men,” explains Himabindu Lakkaraju, an assistant professor at Harvard Business School. “They review men and give high rankings to men, and then men are always showing up higher on the list—even when you have women who can do the job just as well.”

    To combat ranking biases, web developers have created “fair-ranking algorithms” that try to serve up a more equitable list of relevant results in a search query. For example, a fair-ranking algorithm might ensure women and other underrepresented groups, including people of color, are represented in proportion to their presence in the wider pool of qualified candidates. A fair algorithm may also change the applicants ranked highest, theoretically giving more opportunities for a variety of candidates to make a hiring company’s short list.
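    To make the mechanics concrete, here is a minimal sketch of a proportional fair re-ranking in Python. It is illustrative only, not the specific algorithms evaluated in this research: it assumes each candidate carries a relevance score and a binary group label, and it enforces a minimum share p of underrepresented candidates in every prefix of the ranking, falling back to pure score order whenever that constraint is already satisfied.

```python
# Minimal sketch of a proportional fair re-ranking (illustrative only;
# not the specific algorithms studied in the paper). Each candidate is
# a (score, is_underrepresented) pair; `p` is the minimum share of
# underrepresented candidates required in every prefix of the ranking.
import math

def fair_rerank(candidates, p):
    minority = sorted((c for c in candidates if c[1]), reverse=True)
    majority = sorted((c for c in candidates if not c[1]), reverse=True)
    ranking, n_minority = [], 0
    while minority or majority:
        k = len(ranking) + 1
        # The prefix of length k must contain at least floor(p * k)
        # underrepresented candidates; otherwise rank by score alone.
        must_pick_minority = n_minority < math.floor(p * k)
        if minority and (must_pick_minority or not majority
                         or minority[0][0] >= majority[0][0]):
            ranking.append(minority.pop(0))
            n_minority += 1
        else:
            ranking.append(majority.pop(0))
    return ranking

pool = [(0.9, False), (0.8, False), (0.75, True),
        (0.7, False), (0.6, True), (0.5, False)]
for rank, (score, underrep) in enumerate(fair_rerank(pool, p=1/3), 1):
    print(rank, score, "underrepresented" if underrep else "majority")
```

    With p set to one-third, for instance, the constraint guarantees at least one underrepresented candidate in the top three slots, even when pure score ordering would have pushed them lower.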

    With companies increasingly focused on hiring more equitably and diversifying their workforces, more firms are scrutinizing the results of conventional algorithms on recruiting websites. But do fair rankings actually weed out gender bias and allow more women to rise to the top of talent searches?

    Employer biases still creep in

    That is the question Lakkaraju set out to test, along with HBS research assistant Tom Sühr and Harvard computer science doctoral student Sophie Hilgard, in research published last year in the Proceedings of the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.

    “The fair-ranking researcher is always concerned with defining what is a fair distribution of attention in a ranking, and then developing an algorithm that achieves this distribution,” says Sühr.

    The research team found that fair-ranking algorithms are effective, but only to a point: While women are elevated in job applicant searches compared to regular algorithms, fair rankings are limited by the responses of employers who still express biases based on the type of job and the profiles of the candidates being considered. In other words, a hiring manager’s bias toward hiring men for certain jobs can still creep in.

    “Our analysis revealed that fair-ranking algorithms can be helpful in increasing the number of underrepresented candidates selected. However, their effectiveness is dampened in those job contexts where employers have a persistent gender preference,” the researchers write.

    How employers view job candidates

    To test the efficacy of the algorithms, the researchers set up an online experiment with more than 1,000 participants, who were asked to imagine they were employers hiring on TaskRabbit for one of three jobs: shopping, event staffing, or moving assistance. The researchers then presented participants with a set of 10 candidates and asked them to rank their top three choices.

    Some of these sets employed standard algorithms, which generally included only three women who ranked near the bottom of the list; others used fair-ranking algorithms, which represented women proportionately and randomized their positions on the list.
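    As a rough illustration of these two conditions (candidate names and counts here are hypothetical, not the study's actual materials), the lists might be assembled like this:

```python
# Hypothetical sketch of the two list conditions described above;
# names and counts are illustrative, not the study's materials.
import random

def standard_list(men, women):
    """Biased baseline: only a few women, clustered near the bottom."""
    return men[:7] + women[:3]

def fair_list(men, women, n=10, share=0.42):
    """Fair condition: women in proportion to the candidate pool
    (about 42 percent on the site), with positions randomized
    rather than fixed at the bottom."""
    n_women = round(n * share)
    picks = women[:n_women] + men[:n - n_women]
    random.shuffle(picks)
    return picks

men = [f"M{i}" for i in range(1, 8)]
women = [f"W{i}" for i in range(1, 5)]
print(standard_list(men, women))  # women fixed in slots 8-10
print(fair_list(men, women))      # 4 women in random slots
```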

    Lakkaraju and her colleagues found that the fair rankings significantly improved the chances of women being included among the top four candidates, and especially their odds of being ranked in the first slot. For example, for moving assistance, 10 percent of participants viewing the traditional algorithm chose a woman as their first choice, and 23 percent included a woman in their top four. Meanwhile, with the fair algorithm, 23 percent chose a woman as their first choice, and nearly 29 percent included a woman in their top four.

    However, the researchers found smaller increases when using fair ranking for the shopping and event staffing tasks—with jumps between 2.5 and 13 percent—and they say that might be because participants associated these tasks with women. Ultimately, those tasks had a higher percentage of women among the top four candidates overall: 32 and 33 percent, respectively.

    The limitations of fair rankings

    When surveyed after the experiment, some participants admitted they specifically looked for male candidates for the moving assistance tasks, then tried to balance out their choices by seeking women for the other tasks. This shows that despite the fair algorithm’s attempts to provide a more inclusive mix of candidates, people’s gender biases still factored into their decisions, Lakkaraju says.

    “People said, OK, for moving candidates, I chose only males, but don’t worry, I made up for it in other tasks,” she says.

    The research team found other limitations to fair-ranking algorithms. For instance, candidate profiles were a sticking point: the algorithms didn’t consistently place women high in rankings if they had fewer positive reviews or fewer completed jobs than men.

    “We find that fair ranking is more effective when underrepresented candidate profiles are similar to those of the majority class,” the researchers write.

    Moreover, the mix of men and women ranked high was never truly equitable overall: women make up about 42 percent of candidates on the site, but they were never proportionately represented in the top rankings.

    Can incentives help?

    For that reason, the paper’s authors say, fair-ranking algorithms can be a good first step toward counteracting gender bias, but they don’t completely eliminate it.

    “Computational scientists have a way of thinking they can come up with something that will solve the whole problem,” says Lakkaraju, “but the truth is, if there is a human sitting there who is likely to make choices based on bias, you can’t say you have completely solved it.”

    Instead, she suggests using a combination of approaches, including offering incentives that might address hiring managers’ behavioral habits. For example, managers could receive “bonus points,” redeemable later for a free service, for choosing underrepresented candidates.

    “This work essentially says you can’t design these algorithms in isolation,” Lakkaraju says. “You’ve also got to think about people’s behavior and incorporate other strategies to achieve a better solution in the end.”


    Image: iStockphoto/Sylverarts
