An act of public service or a case of old media defending its ancient privileges while it still can?
The Times of London reported Thursday that Facebook is at risk of prosecution in the U.K. after failing to remove content promoting terrorism and child pornography from its social network, even after being alerted to it.
The accusations strike an increasingly familiar theme for Facebook and Google, which was embroiled in a similar controversy last month over the automated placement of digital ads alongside extremist content on YouTube. Such episodes add to the evidence that social media companies care less about what they publish than about bolstering audience figures to maximize their advertising revenue.
Read: Google Promises Closer Policing of Websites After YouTube Ad Flap in Britain
The Times said Facebook failed to take down dozens of images and videos such as “one showing an Islamic State beheading, several violent paedophilic cartoons, a video of an apparent sexual assault on a child and propaganda posters glorifying recent terrorist attacks in London and Egypt.” The material included an official news bulletin posted by Islamic State praising last weekend’s bombing of two Coptic churches in Egypt, which killed 91 “Christian warriors.”
It noted that Facebook’s algorithms even promoted some of the material by inviting users to join the groups and profiles that published it.
Read: Germany Gives Social Media Companies a Break Over Hate Speech
The Times said it has handed its material to the Metropolitan Police, but the police wouldn't confirm whether they would investigate. Under U.K. law, it is an offense to disseminate terrorist material "intentionally or recklessly."
The newspaper's reporters had uncovered the material and alerted Facebook through a standard user profile. Facebook took the content down only when the paper asked it for comment ahead of publishing its story.
It quoted Facebook’s vice-president of global operations Justin Osofsky as saying: “We are sorry that this occurred. It is clear that we can do better and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.”
It isn't that Facebook can't remove images quickly and forcefully when it chooses to. Last year, it suspended the account of Norway's leading newspaper, Aftenposten, after the paper posted an iconic Vietnam War photograph of a naked girl fleeing a napalm attack, which Facebook classified as child pornography. It also took down a repost of the image by Norwegian Prime Minister Erna Solberg after she tried to support the newspaper.
Read: Here’s Why Facebook Removing That Vietnam War Photo Is So Important
The Times, like The Wall Street Journal, is owned by Rupert Murdoch's News Corp, once the dominant force in the U.K. media market but now under serious threat from the likes of Facebook and Google. In an editorial published last week, News Corp CEO Robert Thomson cited data showing the two companies now control two-thirds of the digital advertising market.
“It is beyond risible that Google and its subsidiary YouTube, which have earned many billions of dollars from other people’s content, should now be lamenting that they can’t possibly be held responsible for monitoring that content,” Thomson said.
Solberg came to a similar conclusion about Facebook, saying “They are editing and then they have to be honest on the editing.”