New Zealand is exerting pressure on Facebook (FB) following the live-streaming of Friday’s deadly mosque shootings in Christchurch.
“We cannot simply sit back and accept that these platforms just exist and what is said is not the responsibility of the place where they are published,” Prime Minister Jacinda Ardern said. “They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”
Speaking to New Zealand’s parliament Tuesday as part of a tribute to the victims, Ardern said she had been in communication with Facebook COO Sheryl Sandberg, according to ABC.
The prime minister also implored people to speak of the victims, not of the perpetrator. “He sought many things from his act of terror, but one of those was notoriety,” Ardern said. “That is why you will never hear me mention his name.”
A total of 50 people died in the attacks in Christchurch, which were documented in a 17-minute Facebook Live video streamed by the shooter. New Zealand police have charged 28-year-old Australian Brenton Tarrant with murder, while an unnamed 18-year-old man has been charged with inciting violence by distributing footage of the attack.
Tarrant has been banned from accessing any media — newspapers, TV or radio — so that he cannot consume coverage of the attacks while in jail, the New Zealand Herald reports. He has also been moved from Christchurch to a maximum security prison in Auckland, where he is under 24-hour supervision.
Meanwhile, Facebook has released further details about how far the live-streamed video of the attack spread and what steps the company is taking to prevent more harm.
“We have been working directly with the New Zealand Police to respond to the attack and support their investigation,” said Facebook VP and Deputy General Counsel Chris Sonderby, noting that some information cannot be released amid the active investigation. Among the new revelations:
- The 17-minute shooting video was viewed fewer than 200 times during the live broadcast. No users reported the video during this time.
- The video was viewed about 4,000 times in total before being removed from Facebook.
- The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.
- Before Facebook was alerted to the video by police, a user on 8chan posted a link to a copy of the video on a filesharing site.
- Facebook has identified more than 800 visually distinct videos related to the attack and shared the fingerprints via the Global Internet Forum to Counter Terrorism — including Google, Twitter and Microsoft — to better combat the spread of screen recordings or edited versions of the video.
Facebook reported Sunday that it had removed 1.5 million videos of the mosque shooting from its servers in the 24 hours following the attack, many of them blocked at the point of upload. Despite this, multiple New Zealand businesses said they would pull their ad dollars from Facebook, accusing the social network of not doing enough to combat hate.