
Google’s sexist algorithms offer an important lesson in diversity

By Stacey Higginbotham
July 8, 2015, 12:17 PM ET
File photo: an iPad displaying the Google search engine home page. Photo credit: Chris Radburn, PA Wire/AP

Google is the latest company to be caught out by software behaving like a jerk: researchers from Carnegie Mellon University showed that the company displays listings for more prestigious jobs to men than to women. MIT's Technology Review reported that the researchers built a tool called AdFisher and created fake profiles with male and female names that browsed for jobs.
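
To make the kind of comparison AdFisher automates more concrete, here is a minimal sketch with made-up numbers, not the study's actual data or code: count how many simulated male and female browsing sessions were shown a particular high-paying-job ad, then test whether a gap that large could plausibly be chance.

```python
# Hypothetical sketch, not AdFisher itself: compare how often an ad for a
# high-paying job appeared for simulated male vs. female profiles.
from scipy.stats import fisher_exact

# Made-up counts out of 500 automated browsing sessions per group.
shown_to_male, not_shown_to_male = 402, 98
shown_to_female, not_shown_to_female = 60, 440

contingency = [[shown_to_male, not_shown_to_male],
               [shown_to_female, not_shown_to_female]]

odds_ratio, p_value = fisher_exact(contingency)
print(f"odds ratio: {odds_ratio:.1f}, p-value: {p_value:.1e}")
# A vanishingly small p-value means a gap this large is very unlikely to
# be random noise in how the ads happened to be served.
```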

The research examined the targeting of ads Google serves on third-party websites, which is where it found the problem with job listings. It also found that the Ad Settings page Google offers so users can see what the company has inferred about them doesn't give a complete picture. Visiting webpages associated with substance abuse changed the ads the researchers were shown to substance-abuse-related ones, but the Ad Settings page didn't reflect the change.

This isn't the first time Google has had trouble with its algorithms. The facial-recognition algorithms in its photo software recently labeled a photo of two African-American people as gorillas, and the company was also called out for showing only images of men in response to a search for "CEO." Facebook and other companies trying to build smart machines have had similar problems with their algorithms.

Stories like this aren't just fodder to titillate your friends on Facebook (a site governed by its own algorithms); they should drive home two important lessons. The first is that as our machines get smarter and we trust them to do more of our thinking, diversity in tech becomes more important. The reason these algorithms make these mistakes isn't necessarily malice; it's that they reflect the biases of the people who built them and of the data used to train them. For example, if you train facial-recognition software on photos of your employees and you have few black employees, your software won't recognize African-Americans as well.
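
A toy illustration of that training-data point, using purely synthetic data rather than anything from Google's photo system: train a simple classifier on examples drawn overwhelmingly from one group, and its accuracy on the under-represented group suffers.

```python
# Synthetic sketch: a model trained mostly on one group generalizes
# poorly to an under-represented group whose data looks different.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, threshold):
    # Each group's features cluster around its own decision threshold,
    # so a boundary learned from one group doesn't fit the other.
    X = rng.normal(loc=threshold, scale=1.0, size=(n, 2))
    y = (X[:, 0] > threshold).astype(int)
    return X, y

# Training set: 950 samples from group A, only 50 from group B.
X_a, y_a = make_group(950, threshold=0.0)
X_b, y_b = make_group(50, threshold=2.0)
model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.concatenate([y_a, y_b]))

# Fresh samples from each group show the accuracy gap.
X_a_test, y_a_test = make_group(1000, threshold=0.0)
X_b_test, y_b_test = make_group(1000, threshold=2.0)
print("accuracy, well-represented group:", model.score(X_a_test, y_a_test))
print("accuracy, under-represented group:", model.score(X_b_test, y_b_test))
```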

When you build software to decide which people see which job listings, there are any number of ways biases can creep in, from an assumption that men are more interested in C-level jobs to filtering on salary range, which can eliminate women from some listings because they are historically paid less. Having employees who can bring these systemic biases to the forefront will help tech firms (and any firms building machine-learning algorithms) avoid these mistakes.
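
As a hedged illustration of that salary-range point (a made-up targeting rule, not anything Google has described), a filter that never mentions gender can still have a gendered effect when it keys on a variable where women are historically underpaid:

```python
# Hypothetical ad-targeting rule: show an executive-job ad only to users
# whose inferred salary clears a bar. Gender never appears in the rule,
# but a historical pay gap makes its effect fall along gender lines.
profiles = [
    {"id": 1, "gender": "F", "inferred_salary": 88_000},
    {"id": 2, "gender": "M", "inferred_salary": 112_000},
    {"id": 3, "gender": "F", "inferred_salary": 95_000},
    {"id": 4, "gender": "M", "inferred_salary": 105_000},
]

targeted = [p for p in profiles if p["inferred_salary"] >= 100_000]
print([(p["id"], p["gender"]) for p in targeted])
# Only the two male profiles clear the bar, even though the filter
# itself says nothing about gender.
```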

The second big lesson is that this software has real consequences, and we need to make sure we don't trust it blindly. Discrimination by software can have tangible effects on people's lives, from downgrading their credit scores to tweaking the value of their homes to shaping whom they see as friends on social networks. We therefore need ways to monitor how algorithms affect people, which is why research like Carnegie Mellon's is so important.

Google was notified of the findings but didn't respond to the team. That's unfortunate, because engaging could have led to better research and, ultimately, better algorithms. In many cases, all researchers are hoping for is more transparency from the companies they study. From the MIT Technology Review story:

However, [Roxana Geambasu, an assistant professor at Columbia University] says that the results from both XRay and AdFisher are still only suggestive. “You can’t draw big conclusions, because we haven’t studied this very much and these examples could be rare exceptions,” she says. “What we need now is infrastructure and tools to study these systems at much larger scale.” Being able to watch how algorithms target and track people to do things like serve ads or tweak the price of insurance and other products is likely to be vital if civil rights groups and regulators are to keep pace with developments in how companies use data, she says.

Google responded to the news stories about the study’s findings with the following statement:

Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed. We provide transparency to users with ‘Why This Ad’ notices and Ad Settings, as well as the ability to opt out of interest-based ads.

This is a complicated issue, made worse by a general lack of understanding of computer science and of how we are training computers to think and see. For now, public shaming may be enough to force fixes, but likely only for hot-button issues such as racism and sexism. Problematic algorithms that discriminate by class, or that label people as drug addicts based on their search terms, will be tougher to draw attention to. But they are just as harmful.

The Obama administration is looking at the problem, as are others. Suggestions include the aforementioned public shaming, government legislation, and teaching people some kind of algorithmic literacy. But the first step is to hire with diversity in mind.

This story was updated to add Google’s comment.
