By Jeff John Roberts
November 8, 2017

The FBI hopes a cell phone belonging to the gunman who killed 26 people in a Texas church might provide clues about the crime, but has run into a familiar problem: The agency can’t get around the phone’s password protection to unlock it.

Special Agent Christopher Combs yesterday told reporters the FBI has flown the phone of the late Devin Kelley to Quantico, Va., for analysis, but a forensic team has been unable to open it. Combs did not specify whether the device was an Apple iPhone or another model.

Combs’s announcement appears to foreshadow another stand-off between law enforcement and the tech industry over the encryption techniques used to protect many smartphones. The issue gained prominence in 2016 when the FBI obtained a court order demanding that Apple unlock an iPhone belonging to a dead terrorist who had murdered 14 people in San Bernardino, Calif.

The highly publicized court case ended quietly, however, when the FBI found a technique to break into the phone on its own. But the underlying legal issue—which turns on whether companies like Apple must provide law enforcement with so-called “back doors” into devices—remains unresolved.

Legal observers have speculated that law enforcement has been on the lookout for another high-profile test case, and the Texas church shooting may fit the bill. (In the recent vehicle rampage in New York City, the phone of the dead terrorist was reportedly unlocked.)


The FBI’s announcement this week also comes amid ongoing complaints from the Justice Department that tech companies’ security measures are an obstacle to solving crimes. Manhattan District Attorney Cy Vance has likewise been a vocal critic, saying last year that hundreds of locked iPhones are piling up in the city’s evidence rooms.

Civil liberties advocates retort that people have a right to secure their devices in a way that prevents others from prying into them. Meanwhile, they and many in the tech industry fear that police-mandated “back doors” will quickly be discovered by criminals and foreign governments, and used to spy on citizens.

Some legal scholars, meanwhile, question whether forcing companies to build backdoors could amount to an illegal form of “compelled speech” by forcing engineers to write code against their will. This argument is based on the fact that Apple’s encryption software is built so that it can’t be broken—unless the company wrote new code to get around the encryption.

Right now, Apple’s password protection can’t be defeated because the phone imposes a delay after every incorrect guess and locks permanently after 10 wrong guesses. This design means law enforcement can’t use so-called brute-force techniques to guess the passcode.
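The mechanism the article describes can be sketched in a few lines of Python. This is a simplified illustration, not Apple’s implementation: on real iPhones the delays and the attempt cap are enforced in hardware by the Secure Enclave, and the class name, back-off schedule, and return values below are all hypothetical.

```python
MAX_ATTEMPTS = 10
# Hypothetical back-off schedule: seconds the device makes you wait
# after the Nth wrong guess. Real devices enforce timings in firmware;
# these values are purely illustrative.
DELAYS = {4: 60, 5: 300, 6: 900, 7: 3600, 8: 3600, 9: 3600}

class PasscodeGuard:
    """Illustrative rate-limited passcode check with a hard attempt cap."""

    def __init__(self, correct_passcode):
        self._passcode = correct_passcode
        self._failures = 0
        self.locked = False

    def try_unlock(self, guess):
        """Return (unlocked, seconds the caller must wait before retrying).

        A delay of None means the device is permanently locked and no
        further guesses are accepted.
        """
        if self.locked:
            return (False, None)
        if guess == self._passcode:
            self._failures = 0
            return (True, 0)
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.locked = True  # permanent lock: brute force is cut off
            return (False, None)
        return (False, DELAYS.get(self._failures, 0))
```

Because each wrong guess adds minutes or hours of forced waiting, and the tenth wrong guess ends the game entirely, exhaustively trying even a four-digit passcode’s 10,000 combinations becomes impossible.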

For practical purposes, however, law enforcement often has other ways to get into phones. These include issuing subpoenas for backup data stored in cloud services like iCloud or, in the case of older phones, applying software hacks to break the encryption.
