By Emily Price
March 15, 2019

Facebook thinks it has come up with a way to stop revenge porn from being posted on its platform. The social network announced Friday that it has developed a new tool that uses artificial intelligence to detect revenge porn before a post has even been reported.

Using machine learning and artificial intelligence, Facebook says it can detect nude or near-nude images and videos shared on Facebook or Instagram and route them to its Community Operations Team for review, the company explained in a blog post. If the content violates Facebook’s community standards, it will be removed, and in many cases Facebook will also disable the account that posted it (although it offers an appeals process if someone believes Facebook has made a mistake).
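
Facebook hasn’t published implementation details, but what it describes is a standard image-classification triage step: a model scores each upload, and anything above a confidence threshold is queued for human review rather than removed automatically. Here is a minimal Python sketch of that idea, in which the model, threshold, and queue names are all hypothetical:

    from dataclasses import dataclass

    # Hypothetical threshold; Facebook has not disclosed how its model is tuned.
    REVIEW_THRESHOLD = 0.85

    @dataclass
    class Upload:
        upload_id: str
        image_bytes: bytes

    def score_nudity(image_bytes: bytes) -> float:
        """Stand-in for a trained nude/near-nude classifier; returns a score in [0, 1]."""
        raise NotImplementedError("placeholder for a real ML model")

    def triage(upload: Upload, review_queue: list[str]) -> None:
        """Queue likely-violating uploads for human moderators, who make the final call."""
        if score_nudity(upload.image_bytes) >= REVIEW_THRESHOLD:
            review_queue.append(upload.upload_id)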

The tool is in addition to a new pilot program Facebook is running with victim advocacy organizations, which gives them an emergency option to securely report a photo to Facebook. That program allows Facebook to create a “digital fingerprint” of the image and prevent it from being shared on the platform to begin with. Facebook says the pilot has been successful so far, and it plans to expand it over the coming months.
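
A “digital fingerprint” in this sense is generally a hash-matching scheme: the reported image is reduced to a compact hash, and every new upload is hashed and compared against a blocklist, so the image itself never has to be stored or re-shared. A hedged sketch of the idea in Python, using a perceptual hash from the open-source imagehash library (Facebook has not said which hashing technique it actually uses):

    import imagehash
    from PIL import Image

    # Fingerprints of reported images; the images themselves are never kept.
    blocked_hashes: set[imagehash.ImageHash] = set()

    def fingerprint(path: str) -> imagehash.ImageHash:
        # Perceptual hashing: visually similar images produce similar hashes,
        # so the fingerprint survives minor edits such as resizing or recompression.
        return imagehash.phash(Image.open(path))

    def report_image(path: str) -> None:
        """Emergency reporting path: store only the image's fingerprint."""
        blocked_hashes.add(fingerprint(path))

    def is_blocked(path: str, max_distance: int = 5) -> bool:
        """Reject uploads whose hash falls within a small Hamming distance
        (ImageHash subtraction) of any reported fingerprint."""
        h = fingerprint(path)
        return any(h - blocked <= max_distance for blocked in blocked_hashes)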

It’s also launching “Not Without My Consent,” a victim-support hub in Facebook’s Safety Center where people who have had images shared without their consent can find organizations and resources to help them, including information about how to remove content from Facebook.
