By Ellen McGirt
Updated: October 12, 2017 2:53 PM ET

Mark Zuckerberg was forced to apologize on Tuesday after live-streaming a video of himself taking a voyeuristic virtual reality tour of post-hurricane Puerto Rico.

A cartoon avatar version of Zuckerberg, accompanied by a cartoon version of Rachel Franklin, a member of Facebook’s virtual reality team, appeared embedded into a 360-degree video of the ravaged island. While some of their conversation touched on the company’s efforts to help, the true purpose of the video appeared to be to show off the “amazing” new Facebook Spaces, a product that allows users to create a 3-D virtual avatar of themselves to use with an Oculus Rift headset.

“One of the things that’s really magical about virtual reality is you can get the feeling that you’re really in a place,” said Zuckerberg, while virtually standing in a place without electricity, potable water, or food, and with a rising death toll and growing rates of deadly and preventable disease.

He responded swiftly to the immediate outcry. “One of the most powerful features of VR is empathy. My goal here was to show how VR can raise awareness and help us see what’s happening in different parts of the world. I also wanted to share the news of our partnership with the Red Cross to help with the recovery,” said Zuckerberg. “Reading some of the comments, I realize this wasn’t clear, and I’m sorry to anyone this offended.”

I think that Zuckerberg is sincere in his apology and believes in the empathy-scaling potential of VR. And it’s clear the company is doing plenty of other helpful-sounding things, like donating over $1.5 million to Puerto Rican relief efforts and using artificial intelligence to better help aid workers.

But the problem isn’t about messaging. It’s about prioritizing product while erasing people. What ends up scaling is something else entirely.

Of course, it’s not just Facebook. We’ve seen it time and time again throughout history: Cars designed to optimize the safety of men at the expense of women; algorithms that bake biases into the criminal justice system; film that can’t see black people; and now technology platforms that allow the president of a country to say untrue and inflammatory things without penalty, but routinely block the accounts of advocates and activists.

The disconnect between celebrating a “shiny new thing” and understanding the potential negative impact the thing can have on people who haven’t been fully considered is more than an empathy gap. It’s a global emergency.

In an extended interview published today with Mike Allen, co-founder of Axios, Facebook’s COO Sheryl Sandberg was unable to answer definitively whether Facebook was a media company, but did concede that the Russian-financed ads and “fake news” on Facebook were a completely “new threat” and that “things happened on our platform that shouldn’t have happened” before the 2016 election.

Though I fully appreciate all the things that Facebook does right, in no universe I’m aware of is propaganda a new development, and I’m having a hard time understanding how this type of predictable mischief could have been overlooked or ignored. We need a better sense of how the technology that dominates the way we communicate behaves, and a real plan for holding it accountable.

In a complex world, not everything needs to move fast if what gets broken is us.
