Shootings, torture, and child abuse stream live over the Web.
Facebook’s live-streaming service can be used to share important life events or teach others. But it also has a dark side, a new study from BuzzFeed News suggests.
Since December 2015, Facebook Live has been used to broadcast at least 45 acts of violence, including shootings, murders, child abuse, and torture, BuzzFeed News is reporting. The news outlet, which performed its own analysis to arrive at the figure, said Facebook Live is averaging two acts of violence per month.
In 2015, CEO Mark Zuckerberg pitched Facebook Live as a new way for his company’s billion users to communicate. The service allows users to go live from their Facebook apps and stream video to their friends over the social network. While in the vast majority of cases it’s been used for positive reasons, like sharing a memory with friends, it didn’t take long for users to stream violence to others.
Facebook Live’s violence problems have been well-documented over the last several months as more grisly acts have gone viral. And there’s some debate over just how widespread the violence is. Tracking Facebook Live usage can be exceedingly difficult due to the sheer size of the social network’s user base. In March, for instance, the Wall Street Journal found in its own analysis that there had been at least 50 acts of violence on Facebook Live—several more than BuzzFeed News’ own tally.
Regardless, both critics and Facebook itself have acknowledged the live-streaming service has a violence problem. And as acts of violence continued to crop up on the service, Zuckerberg responded in May by saying his company would add 3,000 people to its operations team to police video on the social network.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote on Facebook. “We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down.”
Zuckerberg added that Facebook receives “millions of reports” each week flagging potentially inappropriate video content for review.
Whether that effort will help Facebook reduce the number of violent acts streamed over the service remains to be seen. However, BuzzFeed News said it found nine violent streams in April and three in May, suggesting the additional team members might already have made an impact.
Facebook did not immediately respond to a Fortune request for comment on the report.