Facebook has admitted being too slow in combating the use of its platform to perpetuate violence in Myanmar.
Online vitriol has long accompanied and accelerated ethnic violence in Myanmar, and investigations by the United Nations and multiple media outlets have shown that Facebook isn’t helping. Religious groups have been fighting in Myanmar since 2012, drawing in the national army and displacing 140,000 people.
“The ethnic violence in Myanmar is horrific and we have been too slow to prevent misinformation and hate,” Facebook product manager Sara Su wrote in a Wednesday statement. The company said it will hire more Burmese speakers and invest in artificial intelligence to help it rein in hate speech in Myanmar and several other countries where “false news has had life or death consequences.”
Facebook dominates Internet use in Myanmar, where widespread use of Internet-enabled mobile devices only began in 2014. Many Burmese consider Facebook the only entry point to the Internet and consider posts there to be as reliable as independent news, according to a 2016 GSMA report.
But Facebook has no offices in the country and outsources content moderation for Myanmar to a third-party company based in Malaysia. Despite warnings from alarmed Myanmar officials, academics, and human rights activists, and a promise from Facebook CEO Mark Zuckerberg to review hate speech within 24 hours, the social media giant left more than 1,000 hateful posts online, some dating back years, a new Reuters investigation found.
Facebook has banned some hate groups and individual members, the company said, and is hiring a non-profit human rights group, Business for Social Responsibility, to assess its impact on human rights in Myanmar.