Planning on posting an image on Bumble? Keep your pants on.
The dating app says it will roll out a new A.I. tool in June that will automatically blur suspected lewd images.
Dubbed Private Detector—and, yes, the name refers to exactly what you think it does—it’s meant to make the app more friendly to women, its initial target audience. (Bumble requires women to send the first message to people they’re interested in learning more about.) More than half of millennial women have received nude images electronically—and three-quarters of those did so without asking, according to a 2017 YouGov survey.
Company officials say Private Detector achieved a 98% accuracy rate in testing. Bumble already restricts photos that include underwear and guns, among other content.
The dating app has some high-profile investors, including Serena Williams and Priyanka Chopra. The company hasn't been afraid to engage in courtroom tussles with larger competitors, including Match.com. It's also considering an IPO, and the rollout of Private Detector could be a first step toward making the company more palatable to investors.
Beyond dating, the Bumble app also offers a "women only" job-hunting tool and a friend-finding service. The company has taken steps on user privacy as well, doing away with its Facebook login requirement about a year ago.