Facebook Warns Groups and Pages It Could Proactively Shut Them Down for Being Fake News Networks
Facebook issued a warning on Wednesday that it will proactively shut down some groups and pages, even if they haven’t been found to be in violation of community guidelines.
The move is designed to keep page managers from skirting Facebook bans by using pages they already manage to re-post the content Facebook removed from their shuttered pages and groups. Facebook’s crackdown on misinformation networks comes one week after it announced the removal of 364 pages that originated in Russia and were engaged in “coordinated, inauthentic behavior.”
In a blog post, Facebook said it will enforce the policy by looking at “a broad set of information,” including whether the page has a similar name or the same administrators as a page that has been removed.
Facebook rules don’t allow people to recreate pages or groups that look similar to ones that have been banned, but bad actors have managed to stay one step ahead by converting other pages in their network into vehicles for fake news and other content that violates Facebook’s community guidelines.
It’s the latest step Facebook has taken in its never-ending quest to catch up to the shape-shifting tactics of bad actors. Last year, Facebook added a new “history” feature to pages and groups, allowing everyone to see when a group was created, its previous names, and the advertisements it is running, adding an extra level of transparency in the fight against nefarious activity on the site.
Facebook also announced on Wednesday it will debut a new “page quality” tab in the administrative portal for groups and pages. The new feature will offer more insight into content that has either been removed for violating community guidelines or has been demoted by Facebook’s algorithms.
The “page quality” tab won’t be a comprehensive report of a page’s infractions, though, and will leave out minor “slap on the wrist” offenses, such as posting clickbait or spam. The goal, however, is to offer more transparency and, with it, gently nudge administrators toward complying with Facebook’s guidelines.