Facebook will add 3,000 people over the next year to monitor reports of inappropriate material on the social media network and remove videos such as murders and suicides, Chief Executive Officer Mark Zuckerberg said on Wednesday.
Zuckerberg, the company’s co-founder, said in a Facebook post that the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.
The hiring spree represents an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of broadcasts on Facebook Live, a service that has been marred since its launch last year by instances of people streaming violence.
Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook (FB) removed the video. Other videos from places such as Chicago and Cleveland have also shocked viewers with their violence.
Zuckerberg said: “We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down.”
Facebook is due to report quarterly revenue and earnings later on Wednesday, after markets close in New York.