A.I. for Hire: 4 Ways Algorithms Can Boost Diversity in Hiring
Artificial intelligence can be a “black box”—mysterious and more than a little intimidating. Meanwhile, new permutations of the tech are sprouting up like mushrooms, especially for recruiting and hiring. Yet as employers have increasingly tried to make their workforces more diverse and inclusive, the A.I. industry itself has taken some flak for being almost exclusively white and male. For instance, a recent study by New York University researchers points out that at tech giants like Facebook and Google, such tiny percentages of employees are female or nonwhite that the whole business is suffering a “diversity crisis.”
The irony there is that A.I., used correctly, has “a shot at being better at decision-making than we humans are, particularly in hiring,” says Aleksandra Mojsilovic. A research fellow in A.I. at IBM, Mojsilovic holds 16 patents in machine learning, and helped develop algorithms that can check other algorithms for unintended bias. An essential part of using A.I. to encourage diversity, she notes, is making sure the teams that build what goes into the black box are themselves a diverse group, with a variety of backgrounds and points of view.
“Any A.I. tool can only be as good—and as impartial—as the data we put in,” Mojsilovic says. “It’s not about replacing human intelligence, but rather about complementing it.”
A.I. has helped companies find and attract new hires of all sexes, ages, and ethnicities. Here are four main ways it’s helped them to do that:
A.I. knows how to speak to your best candidates
The words in job postings matter, not least because they often unwittingly discourage some potential hires from applying. “We as humans take our best guess at what will resonate with job seekers, but we’re often wrong,” notes Kieran Snyder, cofounder and CEO of the A.I. firm Textio.
Using a dataset of about 500 million actual job ads, and A.I. that analyzes the real-life responses they got, Textio advises companies on which words to use—and avoid. At client eBay, for instance, the phrase “prior experience” drew a 50% increase in male applicants. “But the phrase ‘demonstrated ability’—even though it means essentially the same thing—attracted 40% more women,” Snyder says.
Language that is neutral across sexes, races, and ethnicities “changes rapidly. There is no ‘use-these-10-words’ list,” she adds. “But the right word at the right moment does attract the most diverse possible group of applicants.”
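Textio's models are proprietary, so the details aren't public, but the core idea—compare how applicant pools responded to different phrasings and rank the phrasings by how balanced those pools were—can be sketched in a few lines of Python. The two phrases below come from the article; the applicant counts are invented purely for illustration:

```python
from collections import defaultdict

# Hypothetical historical data: (phrase used in the ad, applicant pool by gender).
# The counts are illustrative, not Textio's actual figures.
ads = [
    ("prior experience",     {"women": 120, "men": 280}),
    ("prior experience",     {"women": 100, "men": 240}),
    ("demonstrated ability", {"women": 210, "men": 190}),
    ("demonstrated ability", {"women": 230, "men": 200}),
]

def balance_by_phrase(ads):
    """Aggregate applicant counts per phrase and return the share of women."""
    totals = defaultdict(lambda: {"women": 0, "men": 0})
    for phrase, pool in ads:
        for group, count in pool.items():
            totals[phrase][group] += count
    return {
        phrase: counts["women"] / (counts["women"] + counts["men"])
        for phrase, counts in totals.items()
    }

shares = balance_by_phrase(ads)
# Rank phrasings by how close their applicant pools came to gender parity.
for phrase, share in sorted(shares.items(), key=lambda kv: abs(kv[1] - 0.5)):
    print(f"{phrase!r}: {share:.0%} women applicants")
```

A production system would work over hundreds of millions of ads and control for job type, seniority, and region, but the ranking step is the same: measure the response, not the writer's intuition.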
A.I. widens the pool of eligible workers
A.I. also has the power to cast a wider net, across geographies no recruiting team could cover on its own. Take, for example, campus recruiting. Employers can send only so many humans to a limited number of campuses—but what if the perfect hire skips the job fair, or attends a different school entirely?

“A student at an obscure college where you’d never send a recruiter could be every bit as good as, or better than, graduates of the ‘right’ schools,” observes Loren Larsen, chief technology officer at A.I. firm HireVue, which lists Intel, Oracle, Dow Jones, Dunkin’ Brands, and many others among its clients.
In the old days, says Larsen, this student wouldn't have gotten a first look, let alone a second. But by sourcing the leads with A.I., and using modern tools like video chatting, you can reach them with ease. "This way, a lot more people are let into the system on their merits, so you get to 'meet' and assess a much more diverse group of candidates," adds Larsen.
A.I. has an eye for talent—and skill sets
Resumes are nice, but “if you focus on what it says on someone’s resume, you risk overlooking huge numbers of people,” says Irina Novoselsky, CEO of CareerBuilder, whose top leadership is now 70% women and minorities—up from 40% when Novoselsky joined in 2017.
The site uses A.I. to help employers and job hunters find the best match, with a database that includes more than 2.3 million job postings, 10 million job titles, and 1.3 billion skills. The algorithms zero in on exactly what skills a job requires, and find promising candidates who have them—but who may, based on their background, be applying for a different job altogether.
“Someone’s resume headline or most recent role may not necessarily translate into what else they can do,” says Novoselsky. Customer service reps need, for instance, patience and problem-solving ability, and “we’ve found that home health care workers share those skills. Without A.I., making those matches would have been impossible.”
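CareerBuilder hasn't published its matching algorithms, but the title-blind idea Novoselsky describes—score candidates on the skills a job requires, ignoring what their last role was called—can be sketched with sets. The job, candidates, and skill lists below are made up for illustration:

```python
# A job defined by required skills, not by title.
job = {
    "title": "Customer Service Representative",
    "skills": {"patience", "problem solving", "active listening", "communication"},
}

# Hypothetical candidates with very different resume headlines.
candidates = [
    {"title": "Home Health Aide",
     "skills": {"patience", "problem solving", "communication", "scheduling"}},
    {"title": "Call Center Agent",
     "skills": {"communication", "data entry"}},
]

def coverage(required, possessed):
    """Fraction of the job's required skills the candidate already has."""
    return len(required & possessed) / len(required)

# Rank candidates purely on skill overlap; the job title never enters the score.
ranked = sorted(candidates,
                key=lambda c: coverage(job["skills"], c["skills"]),
                reverse=True)
for c in ranked:
    print(c["title"], f"{coverage(job['skills'], c['skills']):.0%}")
```

In this toy ranking the home health aide outscores the call center agent, despite the less obvious job title—the kind of non-obvious match Novoselsky describes. A real system would also weight skills by importance and infer unlisted skills from related roles.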
A strict focus on skills "naturally leads to more diversity, because the hiring criteria are exactly the same for each and every candidate, regardless of sex, race, ethnicity, age, or anything else. A.I. strips out all that extraneous stuff," says HireVue's Larsen. Reams of research confirm that so-called structured interviews, where interviewers ask precisely the same questions of each candidate and look for precisely the same checklist of answers, work best at eliminating unconscious biases.
The catch is, human interviewers rarely do them. “We get bored, or we’re distracted, or we have a toothache,” Larsen notes. “A.I. never does.”
A.I. can correct its own biases
People can’t help bringing their own experiences, assumptions, and preferences with them to work in the morning, and some of those quirks—especially when they lurk in the subconscious—are notoriously slow to change. By contrast, even the smartest machines (at least so far) can learn and apply only what programmers install in them. That can include an emphasis on welcoming the best-qualified candidates of all ages, sexes, and colors.
“Humans often can’t fully explain their decisions, because they’re going partly on ‘gut feel’,” says Larsen. “But with algorithms, we can pinpoint exactly where an unintentional bias has sneaked in.”
At one client company, HireVue's team tried out an algorithm that turned out to be biased toward job applicants with deep voices; in preliminary testing, it kept selecting men over equally qualified women. Meanwhile, other, earlier A.I. systems have drawn fire for favoring light skin tones over darker ones in video interviews.
Larsen says programmers have learned to spot—and fix—that sort of thing, adding that “data-driven technology gives us the chance to keep getting more fair in ways that weren’t possible before.”
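One common way to "pinpoint where an unintentional bias has sneaked in" is a disparate-impact audit: compare each group's selection rate against the best-off group's, and flag any group that falls below a threshold (the EEOC's well-known "four-fifths" rule of thumb uses 0.8). Toolkits such as IBM's open-source AI Fairness 360 include checks along these lines; here is a minimal sketch, with invented screening numbers:

```python
def selection_rates(outcomes):
    """outcomes: {group: (selected, total)} -> {group: selection rate}."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def disparate_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the
    highest group's rate (the 'four-fifths' rule of thumb)."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()
            if rate / best < threshold}

# Illustrative results from a hypothetical screening model, not real data:
# 90 of 200 men advanced, but only 54 of 200 women.
outcomes = {"men": (90, 200), "women": (54, 200)}

flagged = disparate_impact(outcomes)
for group, ratio in flagged.items():
    print(f"{group}: impact ratio {ratio:.2f} — below the 0.8 bar, investigate")
```

Here women's selection rate (27%) is only 0.6 times the men's (45%), so the audit flags the model for review. Running a check like this on every retrained model is what makes the fix-and-verify loop Larsen describes possible.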
None of this means A.I. will push human resources professionals and hiring managers to the sidelines. The tasks of managing company policy on inclusion, building great relationships with promising candidates, and making sure that A.I. is doing its job can only be done by people.