A strategy to minimize bias in A.I. that any company can use
Some Olympic athletes are using their platforms to raise issues of identity and rights, and a new academic paper explains why corporate promises on diversity and inclusion are unlikely to succeed. All that, plus Jonathan Vanian reports on a simple tool that can help root out bias in A.I.
But first, here’s your Olympics week in review in Haiku.
Zeus made of fame and
glory. So beautiful, so
strong, loved by countries
earth. Yet we cheer, cry
along with them and pray for
a gentle landing.
Before I go: I want to hear from you! Please take a few minutes to complete this short raceAhead survey.
Wishing you a winning weekend.
As more companies incorporate artificial intelligence, the greater the risk that their software discriminates against people of color and other underrepresented groups.
So much has been researched and written about algorithmic bias over the past few years that this shouldn’t come as a surprise. Machine learning software is only as good as the data it feeds on, and if that data is representative of only certain populations, it will not present an accurate picture.
But just because A.I. has a propensity toward bias doesn’t mean there is nothing companies can do about the problem. In fact, managers can borrow lessons from the cybersecurity industry on how to mitigate A.I.’s bias problems.
Camille François is a cybersecurity expert and co-lead of the Algorithmic Justice League's (AJL) Community Reporting of Algorithmic Harms project. The Algorithmic Justice League is a technology advocacy research group founded by Joy Buolamwini, known for her trailblazing work inspecting popular facial-recognition software for bias issues.
François and her colleagues recently published a comprehensive report about creating so-called bug bounty programs to help spot problems with software that relies on algorithms and machine learning to take actions.
Companies use bug bounty programs to enlist altruistic hackers to spot security flaws in their products and IT systems. Businesses pay the hackers a financial “bounty” when they discover the bug, and then presumably patch the software.
Similar to how third parties can inspect an organization’s tools for security flaws, the AJL’s report details how the same process could be adapted for algorithmic bias problems. Twitter, for instance, recently held an algorithmic bug bounty program in which it invited outsiders to spot bias problems with one of its tools that automatically crops photos for users. Through the bug bounty process, third parties found many different bias problems with the tool, such as the algorithm’s tendency to crop out people wearing turbans, hijabs, and other head coverings.
Although Twitter debuted its algorithmic bug bounty program with a tool that had already been flagged for bias problems (and that the company had already decommissioned), François noted its significance.
“Twitter did it, Twitter’s fine,” François says bluntly, addressing the public backlash companies might fear when debuting an algorithmic bug bounty program of their own.
Indeed, many companies will likely fear hosting bug bounty programs intended to highlight bias problems with their A.I. software. But there’s no shame in admitting that machine learning doesn’t always work as expected, and getting input from outsiders doesn’t have to be a major public relations disaster.
The hope is that organizations get more comfortable acknowledging bias issues with their software so that the issues can be addressed, rather than burying their heads in the sand and hoping the problems blow over.
On point, Olympics
Gu made gold and stirred up some passionate conversation in the process. Eighteen-year-old freestyle skier Eileen Gu made Olympic history by winning two golds and a silver medal, but also for her decision to compete for China. The San Francisco-born athlete has a mother who is Chinese. “When I’m in the U.S., I’m American, but when I’m in China, I’m Chinese,” Gu has often said. For lots of Chinese Americans, this is a familiar sentiment, writes Ashley Wong. “I think what I’m seeing is somebody who isn’t afraid to love her identities and share that with people,” Sarah Belle Lin, 28, told Wong. “I think it’s so brave, actually, for her to speak about that on a public platform.” But it’s complicated. “Some of those interviewed said they viewed the questioning of her loyalty as a troubling reminder of ongoing Orientalist stereotypes of Asian Americans as ‘perpetual foreigners’ with the potential to undermine the United States, even though many of them call the country their home,” she writes.
New York Times
Meet the three Indigenous women who made ice hockey history While the decades-long rivalry between the U.S. and Canada women’s ice hockey teams is a story of its own, this year, three Indigenous women were also making history. Jocelyne Larocque and Jamie Lee Rattray, both of the Métis Nation, played for the Canadian team, and Abby Roque, Ojibway from Wahnapitae First Nation, played for the U.S. It was the largest contingent of Indigenous athletes in the Olympic Games. Roque is a serious star and was featured on Sports Illustrated’s Olympic preview cover. It was a real nailbiter, but Canada won 3-2 for the gold.
Indian Country Today
This edition of raceAhead was edited by Ashley Sylla.
On background, the research edition
Why corporate DEI efforts are not really working This excellent paper by Michael W. Kraus and Brittany Torrez from the Yale School of Management and LaStarr Hollie from the University of Massachusetts, Amherst makes an important contribution to an increasingly distressing conundrum: What’s taking so long? “How organizations make such public commitments to DEI goals across time without making good on them involves a complex set of interlocking social, economic, and psychological processes,” they write. In this paper, they focus on the idea of racial progress that is embedded in the overall culture in the U.S. and beyond. It feels like a form of race magic, if you will. “Specifically, members of the workforce, much like members of American society, adhere to beliefs that racial equality will naturally unfold across time, and it is this belief in the automatic unfolding of racial progress that makes actual organizational policy change in the service of DEI more unlikely to be competently executed.” The pdf is $27.95 for non-practitioners, and well worth every penny.
Current Opinion in Psychology
A once-controversial book finds a new audience Our Separate Ways: Black and White Women and the Struggle for Professional Identity generated alarm when it was published in 2001: specifically, concern that professors Ella Bell Smith and Stella Nkomo—regarded as “organizational behavior royalty”—were putting their tenure in jeopardy by publishing it. The book studies the experiences of 120 Black and white female managers to explore how race and economic status, and not just gender, play a role in their career outcomes. Now, twenty years later, an updated version of the should-have-been iconic book drops on August 10. Wharton management professor Stephanie Creary signed petitions to get the book re-issued and speaks with the authors in this episode of her Leading Diversity at Work podcast series. (H/T Enable Leaders on LinkedIn. Thanks, Jonathan!)
Knowledge @ Wharton
Black and Latinx workers and the quest for equity While the Silicon Valley bubbles ebb and flow, one thing remains constant: When a valuable employee has equity in a successful company, the exit event can be life-changing. But, points out J.J. McCorvey, substantial research shows that people of color hold far less equity in their companies than do their white peers. “At the same time, research suggests the equitable distribution of company stock could be a significant tool in narrowing the racial wealth gap,” he says.
Wall Street Journal