The four-digit PIN code to a terrorist's iPhone 5C has become the flashpoint for a long-simmering conflict. The FBI wants Apple to unlock it. Apple has refused to help. The Department of Justice took its case to a federal judge. Tim Cook took Apple's (aapl) case to the public.
The dispute has erupted into a national referendum on the role of strong encryption in a free society—a debate complicated by the fact that newer iPhones are even more secure than the one used by the San Bernardino shooter.
I've heard arguments, weak and strong, from opinion leaders on both sides. These caught my eye.
For the FBI:
Justice Department Spokesperson: "It is unfortunate that Apple continues to refuse to assist the Department in obtaining access to the phone of one of the terrorists involved in a major terror attack on U.S. soil."
Josh Earnest, White House spokesperson (via Reuters): "It's important to recognize that the government is not asking Apple to redesign its product or 'create a new backdoor' to its products."
Cyrus Vance, Manhattan District Attorney: "Decisions about who can access key evidence in criminal investigations should be made by courts and legislatures, not by Apple and Google."
Shane Harris, Daily Beast: "A 2015 court case shows that the tech giant has been willing to play ball with the government before—and is only stopping now because it might 'tarnish the Apple brand.'"
Tom Cotton, Senator (R) Arkansas: "Apple chose to protect a dead ISIS terrorist's privacy over the security of the American people... Apple is becoming the company of choice for terrorists, drug dealers, and sexual predators of all sorts."
Donald Trump, On CNN: "To think that Apple won't allow us to get into her (sic) cell phone. Who do they think they are?"
Editorial Board, Washington Post: "To what extent is it reasonable to force companies to write new code and harm their international reputation for data security — and, therefore, their business models — in order to help the U.S. government hack into suspects' phones?"
Editorial Board, New York Times: "Congress would do great harm by requiring such back doors. Criminals and domestic and foreign intelligence agencies could exploit such features to conduct mass surveillance and steal national and trade secrets. There's a very good chance that such a law, intended to ease the job of law enforcement, would make private citizens, businesses and the government itself far less secure."
Editorial Board, Wall Street Journal: "The CEO has a strong case when he says that backdoors create more problems than they solve. Security vulnerabilities introduced so that third parties like cops and spooks can use them as needed can also be exploited by hackers, crooks and spies. Nations can mandate backdoors, but there will always be some encrypted channels outside of their jurisdiction where the likes of ISIS can plot. The result would be weaker products for law-abiding consumers that leave U.S. companies less competitive, with little security benefit."
Sundar Pichai, CEO, Google: "Forcing companies to enable hacking could compromise users' privacy. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that's wholly different than requiring companies to enable hacking of customer devices & data."
Kurt Opsahl, Electronic Frontier Foundation: "Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security."
Alex Abdo, American Civil Liberties Union: "This is an unprecedented, unwise, and unlawful move by the government. Apple deserves praise for standing up for its right to offer secure devices to all of its customers."
Sophia Cope, EFF: "This kind of technological power will become ripe for abuse not only by the government, but also by identity thieves, estranged spouses or anyone who would benefit from snooping into your life."
Lance Ulanoff, Mashable: "Forget the technology. If the FBI successfully forces Apple to create a new OS just to brute-force hack its own product, it’s the first step through a very dark one-way door — for all of us."
Edward Snowden, via Twitter: "The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around."
John Gruber, Daring Fireball: "By fighting this, Apple is doing something risky and difficult. It would be easier, and far less risky, if they just quietly complied with the FBI. That's what makes their very public stance on this so commendable."
Mark Cuban, Shark Tank: "Amen. A standing ovation. They did the exact right thing... Encryption is easy. It is like wearing a seatbelt in your car. For years we didn't. Then we did and it was smart."
Ben Thompson, Stratechery: "I'm just a tiny bit worried about Tim Cook drawing such a stark line in the sand with this case: the PR optics could not possibly be worse for Apple... Then again, I can see the other side: a backdoor is a backdoor."
From the New York Times: "Asked about Apple's opposition to the court order, representatives of Microsoft, Twitter and Facebook declined to comment. A spokesman for Amazon, which is not in the [Reform Government Surveillance] coalition, also declined to comment."
More as they come in.