Should a government get into a smartphone?
In other words, should a government have access to data inside a smartphone?
Yeah, we know. Your first reaction is, “No, of course not; that’s a violation of privacy.”
But what if the smartphone the government wanted inside was the smartphone of a criminal? A murderer? A dead terrorist? A live terrorist?
Does that make the question different? Does that alter your opinion on the access by government of smartphone data?
It’s not an easy question. But it’s the issue at hand: a federal magistrate judge ordered Apple on Feb. 16 to help the FBI unlock an iPhone used by Syed Farook, one of the two shooters who gunned down 14 people inside the Inland Regional Center in San Bernardino, California, on Dec. 2.
Police have been unable to crack the iPhone’s passcode to retrieve what they believe will be valuable information linking Farook to other criminals and terrorists. They want Apple’s help, and the federal judge said, in essence, that Apple can be required to create a “backdoor” giving police access to the phone’s data.
In response, Apple CEO Tim Cook said no.
In an unusual open letter posted to Apple’s website, Cook said his company – and other online, phone, and tech companies – must stand firm against government demands for smartphone access.
“The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers,” Cook wrote. “We oppose this order, which has implications far beyond the legal case at hand.”
So, we can guess what you might be thinking now. Something along the lines of: “Yeah, but this is a murderer, a criminal, a terrorist. If breaking into his phone will help save the lives of others, isn’t it worth it?”
It’s a legitimate question to ask. But notice the part where Cook points out that “this order…has implications far beyond the legal case at hand.”
“Smartphones…have become an essential part of our lives,” Cook writes. “People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
“All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.
“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
“For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”
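When Cook says Apple has “put that data out of our own reach,” he is describing encryption keyed to something only the user knows. As a loose illustration (a toy sketch only – real iPhones use hardware-backed AES tied to a device key, not this simplified scheme), data scrambled with a key derived from the user’s passcode is unreadable to anyone without that passcode, including the company that stores or ships the software:

```python
# Toy illustration of passcode-derived encryption. The salt, passcodes,
# and XOR cipher here are hypothetical stand-ins, not Apple's design.
import hashlib

def derive_key(passcode: str, salt: bytes, length: int) -> bytes:
    """Stretch a passcode into a keystream. Without the passcode,
    the derived key -- and thus the data -- is out of reach."""
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), salt, 200_000, dklen=length
    )

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR against the keystream: the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, key))

salt = b"device-unique-salt"   # hypothetical per-device value
secret = b"private conversations"
key = derive_key("123456", salt, len(secret))

ciphertext = xor_cipher(secret, key)
# Correct passcode recovers the data...
assert xor_cipher(ciphertext, key) == secret
# ...a wrong passcode yields gibberish.
wrong_key = derive_key("000000", salt, len(secret))
assert xor_cipher(ciphertext, wrong_key) != secret
```

The point of the design is in that last assertion: because the key never leaves the user, there is no copy the company could hand over even if ordered to – which is why the FBI’s request is for new software that bypasses the lock, not for a key Apple holds.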
Cook points out in his letter that Apple has helped the police in this case: it has provided the FBI with information it requested and has complied with subpoenas and search warrants.
“Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation,” he writes. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
“We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
“While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
Does any of this argument change your opinion? Back to the original question: should a government be allowed inside a person’s smartphone?
We’d like to know what you think. Comment below or, if you prefer, reach out to us through any of our other channels.