Facebook whistleblower Frances Haugen tells lawmakers the only way to fix the company is to partially destroy its business model

Facebook should “slow down” how people use its platform in order to counteract disinformation, but it doesn’t want to do so because it would mean a tiny reduction in profit, whistleblower Frances Haugen told a parliamentary committee in the United Kingdom on Monday.

Haugen is a former product manager on Facebook’s civic integrity team who quit earlier this year, taking with her thousands of internal documents that have fueled media coverage over the last month. She testified to the U.S. Senate earlier this month and is now embarking on a European tour of sorts, starting with Monday’s appearance before a U.K. committee that is considering new online-safety legislation.

Her message to those lawmakers: Give Facebook the incentives to prioritize safety over the user engagement that makes it so profitable, because “engagement-based ranking amplifies polarizing content” and “anger and hate is the easiest way to grow [popularity] on Facebook.”

For example, she suggested introducing mechanisms that ask people if they are sure they want to share pieces of content to large groups. According to Haugen, such examples of “slowing the platform down” would be a no-go at Facebook right now. “They don’t want to lose that growth. They don’t want 1% shorter sessions, because that’s 1% less revenue,” she said. “They’re not willing to sacrifice little slivers of profit.”

Haugen also said large Facebook groups—she suggested 10,000 members as the threshold—should have to provide their own moderators, to ensure that they aren’t being misused to amplify disinformation.

“Selective friction”

The former Facebook employee argued that adding “selective friction” would make Facebook safer than relying on A.I. systems to spot disinformation and abuse, or trying to remove users’ anonymity. A.I. has trouble differentiating between glorification of terrorism and counterterrorist messages, she said, and it would be easy for bad actors to get around limits on anonymity.

“Any system where the solution is A.I. is a system that’s going to fail,” Haugen said, noting that Facebook’s automated systems were often bad at handling languages other than English. “Instead we need to focus on making the system slow down, making it human scale.”

“I think if you make Facebook safer and more pleasant it will be a more profitable company in 10 years’ time,” she added.

Haugen also said Facebook needed to be forced to be more transparent about its work on safety—to publish its risk assessments and publicly suggest solutions—because it was otherwise duty-bound to serve only its shareholders: “Today Facebook is scared that if they freely disclose information, there might be a shareholder lawsuit.”

The whistleblower also said her views on end-to-end encryption had been “mischaracterized” in recent reports. She said she supported and used open-source end-to-end encryption, but said she did not trust Facebook to tell the truth about the security of its products. “We need public oversight of anything Facebook does around end-to-end encryption, because they’re making people feel safe when they may be in danger,” she said.

Media strategy

Haugen’s trove first formed the basis of a series of “Facebook Files” articles from the Wall Street Journal, starting in September. The Journal’s revelations covered Facebook’s failure to deal with Instagram’s harmful effects on teenage girls; its efforts to attract preteen users; its exemption of certain high-profile accounts from its usual enforcement measures; its weak response to abuse by drug cartels and human traffickers in developing countries; and the platform’s weaponization by anti-vaccine activists, despite CEO Mark Zuckerberg’s efforts to promote vaccination.

However, last week it emerged that Haugen—who revealed her identity a few days into the series—had also shared the documents with a range of other news outlets, which have now also started publishing articles based on them.

One of the first pieces to emerge, based on this unusual consortium approach, showed Facebook staffers were outraged at the platform’s effects when the Jan. 6 insurrection occurred. Another delved further into Facebook’s notoriously weak moderation of hate speech and disinformation in India. There has also been widespread coverage of Facebook’s loss of popularity among teenagers and young adults—Haugen has filed a formal complaint with the U.S. Securities and Exchange Commission alleging that Facebook “has misrepresented core metrics to investors and advertisers.” She said on Monday that as many as 60% of the new accounts created on the platform were “not actually new people.”

Many more revelations are expected this week, based on her stash of documents.

“We need to steel ourselves for more bad headlines in the coming days, I’m afraid,” warned Facebook’s global affairs chief Nick Clegg, the former British deputy prime minister, in a Saturday post to staffers that was reported by Axios. “We should keep our heads held high and do the work we came here to do.”

Haugen has accepted help from Luminate and WhistleBlower Aid, both nonprofits backed by eBay cofounder Pierre Omidyar, but only in the form of legal support and coverage of her travel and certain other expenses. She told the New York Times that she is otherwise financially self-sufficient, “because I [bought] crypto at the right time.” It should also be noted that Luminate is funding a British nonprofit called The Citizens, which is behind the so-called Real Facebook Oversight Board (RFOB)—an independent counterpart to Facebook’s official Oversight Board, whose own members are complaining about the company’s lack of transparency.

“Critical moment”

Haugen’s appearance before the parliamentary committee comes as the British Parliament considers an Online Safety Bill that would aim to protect people—kids in particular—from harmful material, some but not all of which is already illegal. Companies flouting the rules could face fines of up to 10% of global annual revenue, and enforcement would be entrusted to a beefed-up iteration of Ofcom, the existing U.K. media regulator. The European Union is also working to force Facebook and other Big Tech companies to clean up their platforms, with a piece of legislation called the Digital Services Act.

Haugen said the U.K. bill was deficient in that it focused only on individual harm—“It is a grave danger to societies around the world to omit societal harm”—and would not regulate paid-for advertisements on the platform. However, she said she was “incredibly excited and proud of the U.K. for taking such a world-leading stance with regard to regulating social media platforms.”

“This is a critical moment for the U.K. to stand up and make sure these platforms are in the public good,” she said.

Meanwhile, Facebook is desperately trying to call the world’s attention to its ambitions in virtual reality. In the past week, it has said it will hire 10,000 workers in Europe to realize its “metaverse” dreams and has even teased a name change for the company as a whole.

“Instead of having 10,000 engineers to build the metaverse, we need 10,000 engineers to make us safer,” Haugen countered.

It remains to be seen whether Facebook’s metaverse campaign will deflect European regulators’ gaze. After her Monday testimony in London, Haugen will go on to meet with French, German, and other EU lawmakers over the next couple of weeks.

