
How New Technology Could Create New Legal Problems

July 8, 2016, 12:30 PM UTC
A developer works on code during the Facebook F8 Developers Conference in San Francisco, April 30, 2014. Photograph by Erin Lubin/Bloomberg via Getty Images

This essay originally appeared in Data Sheet, Fortune’s daily tech newsletter. Sign up here.

The next time your company is about to splurge on the latest hot tool for crunching data, take a deep breath and channel your inner lawyer. Cutting-edge technology could conflict with old-school laws.

Garry Mathiason, a longtime litigator at the labor and employment law firm Littler Mendelson, can remember a key moment that cemented his interest in how fast-changing technologies intersect with law.

A few years ago, Japanese technology company NEC unveiled a robot named Sophie that its inventors said could conduct job interviews. Sophie not only asked job candidates questions but also studied their faces for changes in expression that could indicate whether they might be fibbing. The robot could even be programmed to monitor changes in a candidate’s blood pressure and perspiration, just in case someone happened to have a poker face.

“All of a sudden, what you got is a lie detector,” Mathiason said. “And so then, the lie detector laws come into effect,” which could put companies at risk of violating laws that ban subjecting job candidates to polygraph tests.

Additionally, Sophie was outfitted with the ability to sift through data to identify the key traits of a company’s best workers. The robot could then try to determine whether the candidates it interviewed shared those traits.

Although this “sounds excellent and in fact highly useful,” Mathiason explained, the technology could lead to a legal headache. If Sophie determined that more successful workers lived in certain areas, and it asked people where they lived, it could end up weeding out candidates who lived in poorer or minority-heavy areas.
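A small sketch can make the mechanism concrete. The following is a hypothetical illustration, not NEC’s actual system: all the data, ZIP codes, and the screening rule are invented. It shows how a model that keys on where candidates live can act as a proxy for neighborhood demographics, quietly reproducing past hiring patterns.

```python
# Hypothetical toy data: (zip_code, hired) pairs from past hiring.
# Suppose historical hiring happened to favor one neighborhood.
history = [("94110", True)] * 8 + [("94110", False)] * 2 \
        + [("94601", True)] * 3 + [("94601", False)] * 7

def hire_rate(zipcode):
    """Fraction of past applicants from this ZIP who were hired."""
    outcomes = [hired for z, hired in history if z == zipcode]
    return sum(outcomes) / len(outcomes)

def screen(candidate_zip):
    """Naive rule: advance candidates from ZIPs with a past hire
    rate above 50%. Location becomes a stand-in for 'good worker'."""
    return hire_rate(candidate_zip) > 0.5

# Two equally qualified candidates fare differently based only on address:
print(screen("94110"))  # True
print(screen("94601"))  # False
```

Equally qualified applicants are sorted by address alone, which is exactly the pattern that can draw a disparate-impact discrimination claim.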


Sophie, like many predictive analytics technologies, is not immune to bias, and that bias could invite discrimination lawsuits. That doesn’t mean Sophie or similar data-crunching technologies should be avoided at all costs. It means companies must do their due diligence and consult with their legal and human resources teams to ensure they don’t unwittingly create problems down the road.