The British government has been making noises for a while now about introducing new online safety laws, and its proposals are finally here.
On Monday, the government published an “online harms white paper” that is open for public consultation until July 1. In it, the government proposes making online platforms liable for the protection of their users, children in particular.
A regulator—either a new one or the existing communications watchdog, Ofcom—would be able to hit companies with fines or even blockages of their services, if they don’t protect users from things like “extremist content,” intimidation, disinformation, hate crimes, child sexual abuse material, revenge pornography, and violent content.
“The Internet can be brilliant at connecting people across the world—but for too long these companies have not done enough to protect users, especially children and young people, from harmful content,” said Prime Minister Theresa May in a statement. “That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on Internet companies to keep people safe.”
The move is extremely controversial because, while some of those “online harms” are plainly illegal—such as distributing child sexual abuse material—others, such as disinformation and intimidation, are not. Free expression activists see a very slippery slope, while child safety advocates see the government’s push as long overdue.
‘Implications for legal content’
“The government’s proposals would create state regulation of the speech of millions of British citizens. We have to expect that the duty of care will end up widely drawn, with serious implications for legal content that is deemed potentially risky, whether it really is or not,” said Jim Killock, the executive director of the Open Rights Group, a prominent digital rights organization. “The government is using Internet regulation as a blunt tool to try and fix complex societal problems. Its proposals lack an assessment of the risk to free expression and omit any explanation as to how it would be protected.”
On the other side of the debate, Peter Wanless, CEO of the National Society for the Prevention of Cruelty to Children, said social networks had for too long “failed to prioritize children’s safety and left them exposed to grooming, abuse, and harmful content.”
“So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so,” Wanless said.
However, the proposals don’t just cover social networks. In its statement, the government said the new laws would “apply to any company that allows users to share or discover user generated content or interact with each other online.” That covers everything from search engines and file-hosting sites to messaging services and public forums.
Technological solutions
The government, which maintains that it is possible to introduce new protections while also making the U.K. “the best place in the world to start and grow a digital business,” referred approvingly to technological solutions such as Apple’s anti-addiction Screen Time feature and Google’s Family Link parental-controls app. However, it also proposed a media-literacy push to help regular people more easily spot “deceptive and malicious behaviors online.”
Perhaps the biggest target here is Facebook, accused in a recent parliamentary report of behaving like a “digital gangster.” The social network responded to that report—which called for roughly the same things that were proposed on Monday—by saying it was “open to meaningful regulation” regarding issues such as privacy and disinformation.
Mark Zuckerberg likely had the U.K.’s imminent proposals in mind when he wrote about the risk of fragmented rules in his recent Washington Post op-ed. “Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own,” he wrote, agreeing that “Internet companies should be accountable for enforcing standards on harmful content.”
In a Monday statement, Facebook’s U.K. public policy chief, Rebecca Stimson, referred back to Zuckerberg’s op-ed, saying the company shares “the government’s commitment to tackling harmful content online.”
“While we’ve tripled the team working to identify harmful content and protect people to 30,000 and invested heavily in technology to help prevent abuse of our platform, we know there is much more to do. We are continually reviewing our policies with experts and working to ensure our reporting, artificial intelligence and machine learning systems remain industry-leading,” Stimson said. “New rules for the Internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.”
“These are complex issues to get right and we look forward to working with the Government and Parliament to ensure new regulations are effective,” she said.
Twitter’s U.K. public policy chief, Katy Minshall, said the platform had been “an active participant” in talks between the government and industry on the issue of online safety, and was looking forward to “engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the Internet.”
Google, meanwhile, said it “look[s] forward to looking at the detail of these suggestions and working in partnership to ensure a free, open and safer internet that works for everyone,” according to a statement from Claire Lilley, Google’s U.K. public policy manager.
Funding the regulator
It remains an open question who would enforce the new rules. Ofcom is quite keen to take on the role, in addition to its existing remit of regulating telecommunications firms and broadcasters.
Whether it ends up being Ofcom or a new regulator, the government said Monday that industry should provide the funding in the “medium term,” and “the Government is exploring options such as an industry levy to put [the regulator] on a sustainable footing.”
A parliamentary report last month also called for the likes of Facebook and Google to pay a 0.5% tax on their U.K. profits, in order to fund research into social-media addiction—one of the many issues that the government hopes to tackle in its new proposals.