In an effort to prevent suicides on its live-streaming service, Facebook is expanding its suicide prevention tools.
On Wednesday, the company announced a new tool that will let users report a Facebook Live video if someone in it appears to be at risk of suicide. From there, the person in the video will be offered live-chat support through services like the National Suicide Prevention Lifeline and the Crisis Text Line via Facebook Messenger, the company said. Resources will also be provided to the person who reported the Live video so they can personally assist their friend or loved one.
What’s more, Facebook (FB) is also planning to use artificial intelligence to test a “streamlined reporting process.” The AI tool will apply pattern recognition to posts that have previously been reported for suicide. The goal, according to Facebook, is to make the option to report posts that suggest “suicide or self injury” more prominent.
The company plans to expand the test over the next several months.
“Facebook is in a unique position — through friendships on the site — to help connect a person in distress with people who can support them,” the company said in a statement. “It’s part of our ongoing effort to help build a safe community on and off Facebook.”
Facebook has offered suicide prevention services for about a decade and expanded those tools last year. Still, the decision to strengthen them further comes after a 14-year-old girl named Naika Venant live-streamed her suicide in January. And last year, a Chicago man was shot and killed while filming himself on Facebook Live.