By Jeff John Roberts
September 19, 2017

Facebook is caught in a new political crossfire, this time over Myanmar, where the government is allegedly conducting a campaign to ethnically cleanse the Southeast Asian country of its Rohingya minority.

As the Daily Beast reports, several Facebook users who posted reports to draw attention to the ethnic cleansing (which Human Rights Watch says has been confirmed by satellite photos) had their posts removed by the social network. Such posts included photographs, but also, in one case, a poem protesting the government’s actions.

The pattern of removals has prompted suspicions that some of them came in response to demands from the Myanmar government or its supporters.

In response, Facebook acknowledged that it made “errors” in removing posts about the crackdown, but said it is addressing them.

“In response to the situation in Myanmar, we are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action,” said the company in an email statement to Fortune. “We are carefully reviewing content against our Community Standards and, when alerted to errors quickly resolving them and working to prevent them from happening again.”


The Myanmar controversy echoes earlier ones, including the time Facebook’s policy on explicit images led the company to censor posts featuring the iconic Vietnam War photograph of a naked girl burned by napalm. Facebook has also come under fire for helping governments smother dissent in places like Turkey and China.

According to a person close to Facebook, who spoke on condition of anonymity, Facebook did not remove the posts about the crackdown in response to pressure from the Myanmar government. Instead, the removals occurred because of the difficulty of rapidly discerning whether a graphic image was posted gratuitously, as an incitement to violence, or to raise awareness of the brutality.

In some situations, tech companies resort to automated tools to try to parse large volumes of images, though it’s unclear if this occurred in the case of the posts about the Rohingya.

The person familiar with Facebook said the company is working with Burmese speakers to help its human moderators identify legitimate posts about the violence, and said it is restoring posts that were removed in error.

All of this calls attention, once again, to Facebook’s immense power as a media platform, and to how it can be used to spread both awareness and propaganda. Even as it faces criticism for letting authoritarian governments abuse its platform overseas, Facebook is also confronting questions from U.S. lawmakers over how Russian entities used fake accounts to meddle in the presidential election.
