The shape of your face is as distinct as your fingerprint. That’s why a growing number of organizations — from police forces to schools to Wal-Mart — are using facial recognition software to identify you in online photos and in real-world locations.
But facial recognition technology is beginning to pose a major privacy threat, which has led researchers to explore ways to counteract it. One of them is Joey Bose, a computer engineering student at the University of Toronto.
Bose claims he has developed a tool to “break” facial recognition systems by adding extra elements to photos before they are uploaded to the Internet. The photos don’t look any different to the naked eye, but the hidden features thwart detection systems. Here’s an image that shows Bose’s tool in action:
“It adds specially-crafted noise to the face images. It’s trained to attack facial recognition software,” he said. “Think of it as an Instagram filter.”
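Bose hasn’t published the details of his method here, but the general idea he describes — adversarial noise crafted against a known model — can be sketched in a few lines. The toy below is my own illustration, not his code: a single logistic unit stands in for a face detector, and a small gradient-sign nudge to each pixel collapses its confidence. Note that the sketch needs the model’s weights to compute the gradient, which is why publicly available training models matter.

```python
import numpy as np

# Toy stand-in for a face detector: one logistic unit scoring how
# "face-like" a 64-pixel image vector looks. A minimal sketch of the
# gradient-sign adversarial-noise idea, NOT Bose's actual tool.
rng = np.random.default_rng(0)
w = rng.normal(size=64)        # pretend these are learned detector weights
image = 1.5 * w / (w @ w)      # an input the model confidently calls a face

def score(x):
    """Sigmoid confidence that x contains a face."""
    return 1.0 / (1.0 + np.exp(-(x @ w)))

# For a sigmoid unit, the gradient of the score w.r.t. the input
# is s * (1 - s) * w.
s = score(image)
grad = s * (1 - s) * w

# Adversarial step: shift every pixel a small amount against the gradient.
epsilon = 0.05
adversarial = image - epsilon * np.sign(grad)

print(f"clean: {score(image):.2f}  perturbed: {score(adversarial):.2f}")
```

Each pixel moves by at most `epsilon`, yet the detector’s confidence drops sharply — the same asymmetry (tiny per-pixel change, large effect on the model) that makes perturbed photos look unchanged to the naked eye.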
Bose told Fortune the tool will soon be available as a phone app or plug-in for web browsers, and that he has shared the underlying code on his GitHub page.
This opportunity to thwart facial recognition will likely be welcomed by many people at a time when the technology is becoming more pervasive.
Private companies, for instance, have seized on anxiety over school shootings to sell face-detection systems to school districts — even as skeptics pan this as “security theater” that’s unlikely to prevent more shootings. Meanwhile, Forbes this week reported that Amazon Web Services is selling facial recognition technology to all comers for as little as $10.
Bose’s tool could thus slow the spread of the technology by reducing the number of faces available to companies that make the detection software.
His tool, however, is only a pre-emptive measure and does not address situations where a company already has an image of someone’s face and uses a camera to detect them in the real world. To prevent this sort of recognition, Bose says, people can employ tactics like wearing glasses with special patterns that fool the detection mechanisms, or even putting small stickers on their face.
A Cat-and-Mouse Game
For now, Bose’s tool only works to thwart certain types of facial recognition software. Specifically, it can break the software if the training model — the machine learning data set used to train the software — is publicly available.
While a number of facial detection systems sold by security companies rely on these publicly available data sets, other companies, notably Facebook, have their own proprietary versions that Bose’s tool can’t defeat.
Bose suspects that Facebook uses an ensemble of different facial recognition techniques to overcome counter-measures like the one he developed. But he predicts his facial recognition duping tool will eventually be able to thwart Facebook too, setting off a cat-and-mouse game between developers seeking to detect faces and those seeking to disguise them.
This raises the question of whether companies will seek to commercialize tools that thwart facial recognition. Bose says he has already been approached by a number of venture capitalists, but that he’s decided to pass for now. Instead, he says he plans to continue his research in a PhD program at McGill University starting this fall.