Cybersecurity experts and privacy activists are horrified at a new proposal for an online crackdown on child sexual abuse material in the European Union.
On Thursday, the European Commission proposed new legislation that would oblige social networks and messaging services “to detect, report and remove child sexual abuse material on their services.”
“Detection, reporting and removal of child sexual abuse online is…urgently needed to prevent the sharing of images and videos of the sexual abuse of children, which retraumatizes the victims often years after the sexual abuse has ended,” said Home Affairs Commissioner Ylva Johansson.
This stated aim is not what has triggered a massive backlash—rather, it’s the methods that tech firms would be forced to deploy under the proposed law, and the possibility that they could be misused in the future.
The Commission wants the likes of Facebook and Signal to use technology to detect not only known images of child sexual abuse when users send them—something that can be done automatically by comparing the images with those held on official databases—but also new images that were not previously known.
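The "known images" half of that comparison is mechanically simple, which is why it is already widespread. A minimal sketch of the idea, using a hypothetical hash list and Python's standard library (real systems use perceptual hashes such as Microsoft's PhotoDNA, which also match resized or re-encoded copies, rather than the exact-match cryptographic hash used here for simplicity):

```python
import hashlib

# Hypothetical stand-in for an official database of hashes of known
# abusive images. A plain SHA-256 hash, as used here, only matches
# byte-identical files; deployed systems use perceptual hashing.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-image list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

is_known_image(b"test")         # matches the entry above -> True
is_known_image(b"other bytes")  # unknown image -> False
```

Detecting *new*, previously unseen images is the harder half of the Commission's demand: there is no hash list to check against, so providers would have to infer abusive content from the content itself, typically with machine-learning classifiers.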
What’s more, it wants them to scan users’ content and messages for indications that an adult is “grooming” a child for abuse, for example by soliciting indecent pictures.
Experts and activists alike say this is impossible without severely compromising privacy and opening the door for abuse by future authoritarian regimes.
“The new proposal creates major risks to the privacy, security and integrity of private communications, not just in the EU, but globally,” warned EDRi, the European digital rights organization.
“The proposal is not only privatizing law enforcement tasks for combatting [child sexual abuse material], but also requiring service providers to use mass surveillance measures, e.g. scanning everyone’s private communications, which law enforcement authorities are legally barred from using.”
“This would be the worst surveillance mechanism ever established outside of China, and all in the pretext of protecting children,” said Matthias Pfau, CEO of encrypted-email service Tutanota, in an emailed statement.
“This document is the most terrifying thing I’ve ever seen,” tweeted Johns Hopkins cryptography professor Matthew Green, a leading voice in cybersecurity who has seen many proposals in a similar vein.
In many ways, this new debate is a rehash of the one that has played out many times since the Clinton administration tried to give law enforcement and spy agencies access to people’s voice and data messages in the 1990s. (This “Clipper chip” program met an untimely end when the underlying technology was shown to be insecure, and phone manufacturers declined to use it in their products.)
The essential problem, experts say, is that it is impossible to reduce the security of a system only for certain targets, without weakening that security for all users.
In the case of many of the services that the new EU proposal would cover, such as WhatsApp for example, users’ communications are protected by end-to-end encryption that makes it impossible for even the service provider to read or see what is being sent.
As has been repeatedly pointed out over the years, there’s no middle ground here.
“End-to-end encryption is either end-to-end encrypted or it’s not end-to-end encrypted,” said Alan Woodward, a cybersecurity professor at the University of Surrey in England, and an adviser to Europol. “You either weaken it so you can look inside it, or only the two end recipients can look at it.”
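Woodward's point can be made concrete with a toy example. In the sketch below, only the two endpoints hold the shared key; the provider merely relays ciphertext it cannot read. (The "cipher" here is a deliberately insecure stand-in built from SHA-256 so the example is self-contained — real messengers use vetted constructions such as AES-GCM under the Signal protocol.)

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from SHA-256 in counter mode. Illustrative only --
    NOT a secure cipher; a stand-in for real authenticated encryption."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# The key exists only on the two endpoints' devices.
shared_key = b"negotiated-by-the-two-endpoints-only"
plaintext = b"meet at noon"

ciphertext = xor(plaintext, keystream(shared_key, len(plaintext)))   # sender
# ...the provider stores and forwards `ciphertext`; without the key,
# it is opaque bytes -- there is nothing in the middle to "look inside"...
recovered = xor(ciphertext, keystream(shared_key, len(ciphertext)))  # recipient

assert recovered == plaintext
```

Any mechanism that lets the provider read the content — an extra key, a scanning hook, a deliberate weakness — necessarily breaks the property that only the two endpoints can decrypt, which is the "no middle ground" experts keep pointing to.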
The Commission’s proposal—which could change significantly as it goes through the EU’s legislative process—explicitly denies trying to disincentivize the use of end-to-end encryption, which it describes as “an important tool to guarantee the security and confidentiality of the communications of users, including those of children.”
An associated Q&A acknowledges the technical problems that the Commission faces, saying the EU executive is working “to support research that identifies technical solutions to scale up and feasibly and lawfully be implemented by companies to detect child sexual abuse in end-to-end encrypted electronic communications in full respect of fundamental rights.”
But the proposal ultimately leaves the problem to the tech firms themselves.
“They’ve put the onus on the tech companies to say that you will use some as-yet-undefined technology to make sure that this material is detected and taken down,” said Woodward, who added that very much the same approach was being taken in the U.K.’s Online Safety Bill.
“Companies can make use of any technology of their choosing that is privacy-protective and meets the strict standards,” a Commission spokesperson told Fortune. “Where no technologies are available that meet the standards, no [detection] order can be issued. On the other hand, leaving out encrypted conversations entirely would mean leaving wide open the channel that currently affords easy access to children.”
According to Woodward, there is only one way tech firms can possibly comply with the Commission’s proposal without ditching end-to-end encryption and therefore putting everyone’s messages at risk: a method called client-side scanning, where the provider’s app scans the messages and images before they are encrypted, or after they are decrypted.
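In outline, client-side scanning works like this sketch, where the app checks content against a hash list before the encryption step (the hash list, the placeholder "encryption," and the reporting channel are all hypothetical):

```python
import hashlib

# Hypothetical hash list pushed to the app by the provider. Woodward's
# question -- "which database is it being compared against?" -- is about
# exactly this set: whoever controls it controls what gets flagged.
FLAGGED_HASHES = {
    hashlib.sha256(b"known-abusive-image").hexdigest(),
}

reports = []  # stand-in for the provider's out-of-band reporting channel

def encrypt(data: bytes) -> bytes:
    """Placeholder marking where the app's real E2E encryption happens."""
    return data[::-1]  # NOT encryption; illustrative only

def send_image(image_bytes: bytes) -> bytes:
    """Client-side scanning: the plaintext is checked BEFORE encryption,
    so the E2E channel is untouched -- but the scan sees everything."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in FLAGGED_HASHES:
        reports.append(digest)   # match reported outside the E2E channel
    return encrypt(image_bytes)  # message still leaves "encrypted"

send_image(b"known-abusive-image")  # triggers a report
send_image(b"holiday photo")        # passes through silently
len(reports)  # -> 1
```

The mechanism leaves the encrypted channel formally intact, which is why it is the one route to compliance — and why critics focus on the hash list itself: nothing in the architecture prevents it from later being filled with other material.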
This is essentially what Apple last year proposed doing on its iPhones, before an enormous privacy outcry forced it to largely shelve the plans.
Apple’s 2021 proposal was eviscerated by a who’s who of cybersecurity experts in a paper that said the tech “can fail, can be evaded, and can be abused.”
“The fundamental issue that privacy advocates have is once you have something like that on the device, who is to say it’s not being abused by the authorities to look for something else?” asked Woodward. “Which database is it being compared against?”
Such a mechanism might be installed to combat child sexual abuse material, he said, but it could then serve to scout for information that could be cross-checked against a database of terrorist or extremist material. And the same detection technology, once installed on people’s phones, could also be abused by oppressive regimes such as those in Iran and Russia.
“You’ve got to trust the authorities, basically,” Woodward said, “and one of the very reasons end-to-end encryption [gained popularity] was that Edward Snowden showed the authorities were conducting bulk surveillance…[The new proposals are] kind of taking a step back towards that. It has the potential to be turned into a mass surveillance system again, and consequently people are very upset about that.”
Woodward said Pfau was “not wrong” to compare the EU’s proposal to what happens in China, where state spyware is on everyone’s phones. “That’s a short step to what’s being proposed here,” he said.
The Computer & Communications Industry Association—a major lobbying group for Silicon Valley—said Wednesday that it would work with EU lawmakers “to develop effective and workable rules” that don’t undermine encryption and that respect the EU’s long-standing ban on generalized surveillance of online platforms.
“[The Commission is] setting the tech companies an impossible task: ‘Just nerd harder. There’s an answer there somewhere,’” said Woodward. “They’re putting these companies in a no-win situation, which I imagine is going to get tied up in lawsuits for years. Meanwhile the children are still going to be abused.”
Woodward said it would be more effective to increase funding for social workers and specialist policing to stop child abuse, along with better education to dissuade kids from transmitting illegal imagery.
“Something like 70% of the material that is sent is actually sent by youngsters themselves,” he noted.
“I worry people are losing sight of the original objective,” Woodward said.