Facebook is stepping up efforts to police content that supports terrorism, which includes removing profiles linked to extremists from its roughly 1.6 billion users.
The social network has reportedly assembled a team dedicated to looking at any material that backs terrorist groups, and has sped up removal of users that are involved in or seemingly support such activities, according to the Wall Street Journal.
This comes after the site removed the profile of one of the two attackers involved in the San Bernardino shooting last December, who was later discovered to have pledged allegiance to the Islamic State on Facebook. In early January, government officials met with a team from the company, including chief operating officer Sheryl Sandberg, over how social media networks like Facebook (FB) could help curb extremist views online.
“If it’s the leader of Boko Haram and he wants to post pictures of his two-year-old and some kittens, that would not be allowed,” Monika Bickert, Facebook’s head of global policy management, told the Journal.
Company executives reportedly began tightening their review process about a year ago, after meetings with academics led them to conclude that terrorists typically operate in groups.
Last week, Twitter (TWTR) said it had suspended over 125,000 accounts since mid-2015 for “threatening or promoting terrorist acts, primarily related to ISIS.”