Algorithms won’t end racism—they’re making it worse

June 23, 2020, 10:40 AM UTC

This is the web version of Business x Design, a newsletter on the power of design. Sign up here to get it in your inbox.

“Computational design” and “inclusive design” have been, for at least a decade now, among the design world’s buzziest terms.

The former has been used to denote a new category of designers and engineers who exploit the power of computers, data, algorithms, and the Internet to roll out complex services to millions at lightning speed. The latter began as a mostly academic effort to make products accessible to users with disabilities, then evolved into a broader discussion about the importance of ethnic, racial and gender diversity in design and how to make large organizations more representative of the communities they serve.

For many years, it was widely assumed that there was a natural affinity between the two approaches. After all, it was often asserted, computers, data, and algorithms are “objective”—tools of logic that could be used to make decisions and serve users without the messy prejudices of humans.

It was hoped that the new methods of design would help promote diversity in a variety of fields including education, job recruitment, credit scoring, criminal justice, and the distribution of health care services.

Alas, just as the Internet, once hailed as a technology that would bring the world together, is now widely reviled for driving us apart, so algorithms increasingly are vilified for turbocharging racism and sexism.

Safiya Umoja Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism, this week offered a curated reading list on Pocket, exploring how technology can replicate and reinforce racist and sexist attitudes and skew outcomes in nearly all the fields it was meant to improve. I recommend every item. A few highlights:

  • Google’s image-ranking algorithms reinforce negative stereotypes of “black girls” as sex objects and “black teenagers” as criminals. (Time)
  • Algorithmic risk scoring in the U.S. criminal justice system has led to racially biased outcomes. One such algorithm, documented by ProPublica, classified black defendants as having a “high recidivism risk” at disproportionately higher rates than white defendants even though the algorithm did not explicitly use race as a variable. (The Boston Review, ProPublica)
  • Algorithms used by hospitals and physicians to guide decisions such as who receives heart surgery, who needs kidney care, and who should try to give birth vaginally are racially biased. (STAT)
  • Speech recognition systems from Amazon, Apple, Google, IBM and Microsoft misidentify words spoken by black people 35% of the time compared to 19% for white people. (The New York Times)

Tech optimists assert the long-run benefits to society of embracing artificial intelligence far outweigh its current shortcomings. A host of practitioners argue that the use of algorithms remains in its early days, and that racial and gender bias can be tweaked and optimized out of these systems, yielding more neutral decisions in the future.

But Noble and other critics insist algorithms are inherently biased and bound to perpetuate prejudice—not least because the data upon which they feed is the product of past discrimination.
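To see how that can happen, consider a minimal, hypothetical sketch (not drawn from any of the systems reported above): a toy risk model is trained on synthetic arrest records in which one group was historically over-policed. Race never appears as a feature, yet the model still scores that group higher, because the skewed labels leak in through a correlated proxy such as neighborhood. All names, numbers, and data below are invented for illustration.

# Illustrative only: a toy "risk score" model, not any vendor's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical setup: two groups with identical underlying reoffense rates.
group = rng.integers(0, 2, n)                 # 0 = group A, 1 = group B
true_reoffend = rng.random(n) < 0.30          # same base rate for both groups

# Proxy feature correlated with group (e.g., neighborhood), not with behavior.
neighborhood = (group + (rng.random(n) < 0.2)).clip(0, 1)

# Historical label: re-arrest, which over-records group B (heavier policing).
rearrest_prob = np.where(true_reoffend, 0.60, 0.05) + 0.15 * group
label = rng.random(n) < rearrest_prob

# Train on features that exclude group/race entirely.
prior_record = true_reoffend.astype(float) + rng.normal(0, 0.5, n)
X = np.column_stack([prior_record, neighborhood])
model = LogisticRegression(max_iter=1000).fit(X, label)

scores = model.predict_proba(X)[:, 1]
print("mean risk score, group A:", round(scores[group == 0].mean(), 3))
print("mean risk score, group B:", round(scores[group == 1].mean(), 3))
# Group B scores higher on average: the biased historical label leaks in
# through the correlated proxy, despite identical true reoffense rates.

In this contrived setup the two groups reoffend at identical rates, so any gap in the printed averages comes entirely from the skewed labels and the proxy feature, which is the dynamic Noble and other critics describe.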

Is it possible to teach machines to be truly inclusive? Are we capable of devising algorithms that overcome, not amplify, injustice? The early evidence isn’t encouraging.

More design news below.

Clay Chandler
— clay.chandler@fortune.com

NEWS BY DESIGN

Apple's new core

Apple unveiled a slew of changes at its developers’ conference yesterday, including a redesign of the iPhone’s home screen and a plan to ditch Intel chips for ones designed in-house.

"Cultural catastrophe"

Research commissioned by the Creative Industries Federation warns that 400,000 creative jobs could be lost in the U.K. this year due to the coronavirus pandemic, equivalent to 20% of the sector’s jobs. The loss marks a stark reversal from the pre-Covid era, when the British creative sector was growing at five times the rate of other sectors.

"Systemic racism is by design"

Ideo apologized for sharing an Instagram post of “anti-racism resources” after receiving criticism for not doing enough to directly support black creatives and designers. The concept of “design thinking,” popularized by Ideo, has been criticized for perpetuating racist structures through its suggestion that designers are capable of absolute empathy.  

Guggenheim redesign

An open letter, signed by the curatorial department of the Guggenheim Museum and sent to the institution's leadership, called out the museum for maintaining an “inequitable work environment that enables racism, white supremacy and other discriminatory practices.”

Sometimes low-tech is best

The U.K. government abandoned its attempt to develop a “test and trace” app for monitoring the spread of Covid-19. After months of trying to go it alone, the U.K. now says it will use the Google and Apple model. Some of the most effective containment measures have been achieved without an app.

Fraudster found

Inigo Philbrick—the young art dealer who allegedly swindled millions from the art world by stealing work, selling collections he didn’t own, and other fraudulent schemes—was arrested in the Pacific island nation of Vanuatu. Philbrick, who went on the lam last year, was flown to Guam, where he is now being held in U.S. federal custody.

EVENTS BY DESIGN

July: Christie’s is planning a semi-virtual auction for July 10. The “first of its kind” event will livestream auctions from four cities: Hong Kong, New York, Paris, and London. D&AD’s New Blood Festival, a celebration of up-and-coming talent in design, will run online this year, July 6-10. The digital festival marks New Blood’s 40th anniversary.

September: Art Basel in Switzerland—initially rescheduled from June to September—has been cancelled and is set to return in June 2021.

Ongoing: Dezeen’s Virtual Design Festival has been extended to July 10.

QUOTED BY DESIGN

“The role of social robots like Paro is becoming more important, especially as we see this sector of our population [the elderly] targeted by this virus.”

Sandra Petersen, program director for the University of Texas at Tyler nursing department, discusses the use of robots in social care. Paro, a robotic baby seal designed by Japan’s National Institute of Advanced Industrial Science and Technology, was introduced in 2003 but is finding new pertinence in the era of social distancing. Researchers say Paro has helped lower blood pressure and dependence on drugs, and has sparked emotional recognition in Alzheimer’s patients.

This week’s edition of BxD was curated by Eamon Barrett. Email him at eamon.barrett@fortune.com