Facebook still has an election interference problem, and it looks much worse two years after the 2016 presidential election, according to research conducted by Jonathan Albright at the Tow Center for Digital Journalism.
In the days and weeks leading up to the 2018 midterms, Facebook has faced scrutiny over reports that organizations could submit political ads while hiding their identities.
Last week, VICE reported that it was able to buy fake political ads on behalf of Vice President Mike Pence, as well as ISIS. Facebook’s most recent ad policy—requiring groups to include a “paid for by” disclosure—was announced in May, but the feature does little in the way of accountability. Facebook even revealed last month that an Iranian influence campaign had purchased ads on the social networking site, which underscored how easy it is for foreign actors to exploit the platform’s ad technology for political gain, Mother Jones reported.
The new report from Albright shows that disinformation campaigns are increasingly homegrown, and may pose as significant a problem as foreign election meddling, or an even greater one. Over a period of three months, Albright collected more than 250,000 posts, 5,000 political ads, and the engagement metrics of hundreds of Facebook pages.
Through this process, Albright found that “an alarming number” of verified and influential Facebook pages running U.S. political ad campaigns over the last six months were managed from outside the country. The company doesn’t regularly re-examine pages after their initial verification, and several influential pages run by foreign accounts ran targeted political ads for up to four months without including a “paid for by” disclosure.
“In other words, Facebook’s political ad transparency tools — and I mean all of them — offer no real basis for evaluation,” Albright wrote.
Beyond Facebook pages, private groups on the platform—where people share “hate content, outrageous news clips, and fear-mongering political memes”—are becoming a breeding ground for domestic influence campaigns that seek to push false and damaging news and narratives. According to Albright, who shared these findings in a second Medium post, these groups facilitate the spread of hateful content by “enabling a new form of shadow organizing,” mixing sensationalist political commentary with hateful conspiracy theories.
Facebook even allowed advertisers to target audiences with an interest in the white supremacist conspiracy theory of “white genocide,” as the Intercept discovered last week. Selecting that option would have exposed 168,000 users to a given political ad.
A Facebook spokesperson told Gizmodo: “This targeting option has been removed, and we’ve taken down these ads. It’s against our ads principles and never should have been in our system to begin with. We deeply apologize for this error.”
“It’s like the worst-case scenario from a hybrid of 2016-era Facebook and an unmoderated Reddit,” Albright said.
Fortune has reached out to Facebook for comment and will update this article if we hear back.