
Stories

01 Mar 2023

Is AI OK?

Retooling the algorithms to reduce—not perpetuate—human bias
Re: Seke Ballard (MBA 2010); Frida Polli (MBA 2012); Shunyuan Zhang (Assistant Professor of Business Administration); By: Alexander Gelfand
Topics: Technology-Artificial Intelligence; Psychology-Prejudice and Bias; Innovation-Technological Innovation

Illustrations by Dan Bejar


When she began investigating Airbnb’s smart-pricing algorithm several years ago, Assistant Professor Shunyuan Zhang wasn’t looking for evidence that it generated discriminatory outcomes, a problem known as “algorithmic bias” that has dogged the use of AI in everything from facial recognition to predictive sentencing. She found it nonetheless.

The online rental platform uses its pricing tool to increase host revenues by dynamically adjusting rental prices according to demand. Zhang, who uses machine-learning methods to study the sharing economy, wanted to see how the algorithm performed. What she found surprised her: Although Airbnb had designed the pricing tool to be race-blind, the algorithm nonetheless widened the revenue gap between white and Black hosts.

The reasons for this were complicated, and perhaps even counterintuitive. For one thing, Black hosts were underrepresented in the data that was used to train the algorithm. What’s more, says Zhang, societal bias may result in different demand curves for Black-owned properties versus white-owned ones. Renters may be more price-sensitive when considering the former, and willing to pay more for the latter.

Unfortunately, by purposefully ignoring real differences in what consumers were willing to pay for Black-owned versus white-owned rentals, and by basing its calculations on data drawn predominantly from white-owned properties, the algorithm wound up recommending rental prices for Black hosts that were less likely to optimize their revenues.
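The mechanism can be illustrated with a toy calculation (all numbers here are synthetic, chosen only to show the effect): a race-blind model that minimizes average error over pooled training data dominated by one group lands near that group's revenue-maximizing price, far from the underrepresented group's optimum.

```python
# Toy illustration of the underrepresentation problem described above.
# All numbers are synthetic and chosen only to show the mechanism.

# Suppose the revenue-maximizing nightly price differs by host group:
optimal_majority = 120.0   # 90% of the training data
optimal_minority = 80.0    # 10% of the training data

# A race-blind model that minimizes average error over the pooled data
# ends up recommending a single price weighted toward the majority:
pooled_price = 0.9 * optimal_majority + 0.1 * optimal_minority

print(pooled_price)                          # 116.0
print(abs(pooled_price - optimal_minority))  # 36.0 off the minority optimum
```

The recommended price is nearly optimal for the well-represented group but far from what would maximize revenue for the underrepresented one, exactly the gap-widening effect Zhang observed.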

“There is no more biased instrument than the human brain.”


Fixing such problems will also be complicated, Zhang says. A good place to start would be a multipronged approach that combines better data science (e.g., unbiased data and algorithms), better implementation (e.g., understanding the target audience and market conditions), and better policies (e.g., auditing algorithms to ensure they perform in an equitable manner).
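The auditing prong Zhang mentions can be sketched as a simple outcome comparison across groups. The revenue figures, group labels, and the 10 percent tolerance below are all hypothetical; a real audit would use the platform's actual outcome data and a policy-determined threshold.

```python
# Minimal sketch of an algorithmic audit: compare the average outcomes a
# model produces for two groups and flag any gap above a tolerance.
# The revenue figures and the 10% tolerance are hypothetical.

def group_mean(xs):
    return sum(xs) / len(xs)

def audit(outcomes_a, outcomes_b, tolerance=0.10):
    """Return (relative_gap, passed). A gap above `tolerance` fails."""
    mean_a, mean_b = group_mean(outcomes_a), group_mean(outcomes_b)
    gap = abs(mean_a - mean_b) / max(mean_a, mean_b)
    return gap, gap <= tolerance

# Hypothetical per-host revenues under an algorithm's recommended prices:
group_a = [120, 110, 130, 125]
group_b = [90, 95, 100, 85]

gap, passed = audit(group_a, group_b)
print(f"relative gap: {gap:.1%}, passed: {passed}")
```

A recurring audit like this does not explain *why* a gap exists, but it turns "perform in an equitable manner" into a measurable, enforceable check.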

For Frida Polli (MBA 2012), cofounder of the AI-based recruitment startup pymetrics, solving the bias problem meant first acknowledging that humans are the most biased decision-makers of all. As a neuroscientist who spent 10 years studying the bias systems wired into our brains, Polli knew that algorithmic bias paled in comparison with the human sort, which can’t be fixed through better coding or inputs. “There is no more biased instrument than the human brain,” she observes.

At HBS, Polli says she witnessed a recruiting process that was rife with bias and bad data. Employers screened job seekers using outdated psychological assessments that overwhelmingly favored white males, hunted for clues to candidates’ soft skills over dinner or coffee, and hired people based on résumés that revealed little about job fit. Yet Polli knew that scientists possessed a battery of tests for accurately and objectively evaluating cognitive abilities and soft skills: memory and attention, generosity and fairness. She cofounded pymetrics to capitalize on those existing tools.

Polli and her partners developed a collection of gamified assessments and used them to evaluate top performers in various roles, like sales and marketing. They then employed machine-learning techniques to analyze that data and build an algorithm that successfully predicted job fit while minimizing bias—matching candidates to the right jobs based on their traits and aptitudes, for example, without discriminating on the basis of gender or ethnicity.

Auditing reveals that pymetrics’ algorithmic approach to evaluating job candidates is virtually bias-free when compared with traditional methods: Men and women perform almost identically on the game-based assessments, with only minute variations occurring across racial and ethnic groups. As a result, enterprise clients have come to rely on pymetrics to help them meet their diversity goals. And the firm’s impact is likely to spread, thanks to its recent acquisition by Harver, an automated recruiting platform with global reach.

Seke Ballard (MBA 2010), founder of the fintech startups Good Tree Capital and BetaBank, took a similar approach to weeding bias out of banking and lending, essentially removing from the picture human beings and their inherently biased decision-making processes. He understood that his father, a Black business owner in the Deep South, was repeatedly denied loans for reasons that had little to do with his company’s financials. Ballard also knew that Black men were nearly three times more likely to be denied a loan than white men, and that the figures weren’t much better for women, Latinos, or immigrants. So he challenged a group of data scientists and software developers to find a way to evaluate creditworthiness on purely financial and operational grounds, omitting the prejudice endemic among loan officers.

The team used machine-learning techniques to sift through loan records from the Small Business Administration, identifying and weighting the factors that most strongly predicted the likelihood of default. They then constructed an algorithm that was purpose-built to ignore the factors that tend to bias human decision-making but that don’t have any real bearing on whether a business will be able to repay a loan, such as postal codes, which can raise red flags simply due to their association with communities of color.
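The idea of a purpose-built model can be sketched as follows. The feature names and weights are hypothetical, invented for illustration; the point is structural: proxy fields such as postal code may be present in the application record, but the score never reads them.

```python
# Sketch of a score restricted to financial and operational inputs:
# proxy fields such as postal code can exist in the application record
# but are never consumed. Feature names and weights are hypothetical.

ALLOWED_FEATURES = ("years_operating", "monthly_revenue", "debt_ratio")

# Hypothetical weights a trained model might assign; the sign indicates
# whether a factor raises (+) or lowers (-) the default-risk score.
WEIGHTS = {
    "years_operating": -0.4,
    "monthly_revenue": -0.0001,
    "debt_ratio": 2.0,
}

def default_risk(application):
    """Weighted sum over allowed features; all other fields are ignored."""
    return sum(WEIGHTS[f] * application[f] for f in ALLOWED_FEATURES)

app = {
    "years_operating": 3,
    "monthly_revenue": 20000,
    "debt_ratio": 0.5,
    "postal_code": "60601",   # present in the record, never used
}
print(round(default_risk(app), 2))   # -2.2
```

Excluding a feature at the input layer is only a first step; as Zhang's Airbnb findings show, bias can still leak in through correlated features or skewed training data, which is why auditing the outputs matters too.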

When used to power Good Tree Capital, a Chicago-based firm that provides deposit accounts and loans to state-licensed cannabis companies, the algorithm helped produce a diverse pool of approved borrowers—55 percent of which are minority-owned businesses; 45 percent, women-owned—with zero defaults. The company’s track record is especially impressive given that most of its customers themselves don’t have track records: The vast majority of Good Tree’s clients have never run an enterprise before.

Now Ballard is building BetaBank, a state-chartered, online-depository institution focused on small- and medium-sized businesses that will take a similar approach to evaluating prospective customers. BetaBank will rely on its technology to reject applicants who plainly don’t qualify for a loan or an account and to approve those who do, with human bankers intervening only in more difficult edge cases. Most people, Ballard says, will be able to complete the application process in minutes without ever having to deal with a human being.

Ballard is developing BetaBank’s automated systems and online platform in partnership with Google and Deloitte; he anticipates regulatory approval by next spring; and he is in the process of raising $5 million—every penny of which will go toward building a bias-free bank. “I want to create a model where everyone, irrespective of their socioeconomic demographics, is measured by the same yardstick,” he says.

Entrepreneurs like Ballard and Polli are proving that it is possible to turn algorithms into instruments for reducing bias and inequity, rather than perpetuating them. At least for now, Zhang sees little evidence that the largest tech companies are doing the same, but she is confident that more businesses will follow their lead. “I may be biased,” she says, “but I’m quite optimistic.”


Featured Alumni

Seke Ballard
MBA 2010
Frida Polli
MBA 2012


Featured Faculty

Shunyuan Zhang
Assistant Professor of Business Administration
