Hate Pages Banned as Reddit Tightens Rules on Violent Content

October 28, 2017, 8:07 PM UTC

Reddit, a large internet community that has sometimes been accused of incubating hate groups, announced on Wednesday that its content policy would become more restrictive. Several pages dedicated to hateful speech or ideologies were quickly banned.

Reddit’s new policy was outlined in statements on the site’s Help page and on a message board for the volunteer moderators of the site’s thousands of community pages, known as subreddits. The change expands a prior ban on “inciting” violence to include “any content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people” or against animals.

The broadened restrictions appear intended, in part, to clamp down on hate groups and racist ideology on the site. The New York Times reported that at least three subreddits were swiftly banned after the announced changes: r/NationalSocialism, r/Nazi, and r/Far_Right.


Reddit has previously banned a number of subreddits, particularly those glorifying or enabling sexual crimes or targeted harassment. But the latest wave of shutdowns comes amid a broader move away from free-speech absolutism on platforms including Twitter and Facebook, both of which have announced plans to further restrict content considered hateful.

Those efforts have shown mixed results, and Reddit’s new approach is likely to be even more constrained, simply because of the company’s size and resources. Reddit was recently valued at $1.8 billion, compared to around $16 billion for Twitter and $516 billion for Facebook. Facebook employs thousands of content screeners, while Reddit’s entire staff numbers fewer than 300.
