Tim Cook Opposes Order for Apple to Unlock iPhone, Setting Up Showdown


NY Times:


SAN FRANCISCO — Apple said on Wednesday that it would oppose and challenge a federal court order to help the F.B.I. unlock an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December.

On Tuesday, in a significant victory for the government, Magistrate Judge Sheri Pym of the Federal District Court for the Central District of California ordered Apple to bypass security functions on an iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.

Judge Pym also ordered Apple to provide related technical assistance and to build special software that would essentially act as a skeleton key capable of unlocking the phone.
But hours later, in a statement by its chief executive, Timothy D. Cook, Apple announced its refusal to comply. The move sets up a legal showdown between the company, which says it is eager to protect the privacy of its customers, and the law enforcement authorities, who say that new encryption technologies hamper their ability to prevent and solve crime.

In his statement, Mr. Cook called the court order an “unprecedented step” by the federal government. “We oppose this order, which has implications far beyond the legal case at hand,” he wrote.

The Justice Department did not immediately respond publicly to Apple’s resistance.
The F.B.I. said that its experts had been unable to access data on Mr. Farook’s iPhone, and that only Apple could bypass its security features. Because of those features, F.B.I. experts have said, they risk permanently losing the data after 10 failed attempts to enter the passcode.
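The wipe-after-ten-failures behavior described above can be sketched in a few lines. This is purely illustrative pseudologic, not Apple's actual implementation (the real mechanism destroys hardware-bound encryption keys inside the device and also imposes escalating delays between attempts):

```python
# Illustrative sketch of a "wipe after N failed passcode attempts" policy.
# NOT Apple's real code: the real iPhone erases hardware-bound encryption
# keys, which is what makes the data unrecoverable.

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False  # once True, the data is gone for good

    def try_unlock(self, guess):
        if self.wiped:
            return False  # keys destroyed; nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True  # tenth failure triggers the erase
        return False

d = Device("1234")
for _ in range(10):
    d.try_unlock("0000")
print(d.wiped)               # True: ten failures trigger the wipe
print(d.try_unlock("1234"))  # False: even the right code no longer helps
```

This is why the government asked Apple to build modified software: with the failure counter and wipe disabled, the passcode could be brute-forced at leisure.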

The Justice Department had secured a search warrant for the phone, owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health, which consented to the search.



Encryption backdoors are a horrendous idea in every conceivable way.


Why should a phone be more private than my house? If they really needed to get into my home they can do it. What makes a phone special?


They already have the legal authority to search the phone, if they are capable of it. In this case, they are trying to fundamentally destroy encryption because it isn’t easy for them to search it.

In your example, it would be like requiring all door locks to have a second key made and sent to the government just in case they feel like searching your house in the future. Except this is far more disastrous, because backdooring encryption software weakens it permanently, and you have to hope that nobody else figures it out as they did with the idiotic Clipper chip idea in the 90s.


This is a classic case of “Be careful what you ask for; you just might get it.”

Everybody wants privacy as long as it is THEIR privacy. But suppose, for example, someone kidnapped your child and the kidnapper accidentally dropped his cell phone. How would you feel if the police told you “Sorry, we can’t get any information off the phone (which could save your child’s life) because of privacy protections.”?


It isn’t as if the government can’t find out everything about me anyway. Not so very long ago, I looked into some websites about skid-steer loaders; Bobcat, to be specific. Ever since then, almost everything I bring up has a Bobcat ad on it.

If I look up a particular company’s financials, all kinds of ads start popping up for brokers and money funds and I start getting emails from them.

If every commercial enterprise can track me like that (and probably a lot more than that) then how much privacy do I really have?

If it’s a matter of the government being able to see what I transmit to my daughter and the reason for it is to save lives, I think the benefits of the latter outweigh my privacy in the former.


Certainly, China could employ 10,000 computer-trained people to break the code, and eventually, after thousands of destroyed phones, they would crack it. But our own government can’t have access to it to protect American lives. Seems an ill-conceived position on Apple’s part to me.


As they say, “hard cases make bad law”. I’m not sure the value of having the police able to act in the incredibly rare situation in which a kidnapper drops a phone with actionable information on it is worth surrendering my expectation of privacy.




I believe Apple’s position has nothing to do with protecting our privacy. I believe this is a sales ploy. By refusing to help the Feds, Apple makes all the Apple-bots out there feel even more impressed with Apple.

Sales will go up because Apple is fighting for us. Blah blah bla–


They could and would, but it is much easier than that. It took only 2 years for independent researchers to figure out how to eavesdrop on the Clipper chip in real time. That wasn’t with an enormous nation-state funding them either.

Look, I have written encryption software. It’s incredibly hard to do correctly when you’re trying to do it correctly, and even harder to verify that you actually did it correctly. We see a constant stream of vulnerabilities in crypto code like OpenSSL and others, and they’re not trying to introduce flaws on purpose. Trying to allow some people access but not others, under certain conditions but not usually, introduces an absurd amount of complexity to an already very fragile process.
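The structural problem with escrowed ("backdoored") encryption can be shown with a toy sketch. This is not real cryptography (XOR key-wrapping here stands in for proper algorithms, and the names are invented), but the shape mirrors the Clipper chip's LEAF field: every message's session key is wrapped once for the user and once under a single escrow master key, so one leaked master key unlocks everyone's traffic forever:

```python
# Toy key-escrow sketch. XOR is NOT real encryption; this only
# illustrates the structure: one shared escrow key unwraps everything.

import os

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

MASTER_KEY = os.urandom(16)  # single escrowed key shared by ALL users

def encrypt(msg16, user_key):
    """Encrypt a 16-byte message; wrap the session key twice."""
    session = os.urandom(16)
    return {
        "ct": xor(msg16, session),
        "wrap_user": xor(session, user_key),      # for the owner
        "wrap_escrow": xor(session, MASTER_KEY),  # the backdoor field
    }

def decrypt_with(pkt, key, which):
    session = xor(pkt[which], key)
    return xor(pkt["ct"], session)

alice_key = os.urandom(16)
pkt = encrypt(b"meet at midnight", alice_key)

# Alice decrypts with her own key...
print(decrypt_with(pkt, alice_key, "wrap_user"))
# ...but whoever obtains the one master key reads everyone's traffic.
print(decrypt_with(pkt, MASTER_KEY, "wrap_escrow"))
```

The extra `wrap_escrow` path is exactly the added complexity: it creates a second way in that must be defended forever, and compromising that single key retroactively breaks every message ever sent.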


Exactly. It’s a Pandora’s box that does not need to be opened. And it’s not just Apple. Google and all the other major players oppose this. And on a fundamental level it’s going to be hard for the government to compel a company to create software that doesn’t exist and that it has no interest in creating. If it’s so important to the state, the state should figure out how to get into the phone.


Agreed. For a company whose software allows advertisers to target their ads to potential customers based on their proximity to certain retail establishments to argue for privacy is absurd.

[quote=irishpatrick]I believe this is a sales ploy. By refusing to help the Feds, Apple makes all the Apple-bots out there feel even more impressed with Apple.[/quote]


If so, it’s a remarkably tone-deaf strategy.

“Even terrorists can count on Apple” is not the message Apple wants to promote right now. I think they will rethink this.


Perhaps I am missing something here, but doesn’t Apple’s resistance to becoming involved imply that the ability to go “backdoor” already exists and is within the power of Apple to do?

So it comes down to whom you trust, Apple or government security agencies? Apple has shown its willingness to become political and weigh in on certain issues, in Indiana and Texas, among other places. So why would we presume to allow Apple to have access where we fear government agencies might go?

Why hand over our privacy to cell phone technology in the first place? Why place that kind of trust in the technology? Isn’t that presumption fraught with issues to begin with?


Judge Pym also ordered Apple to provide related technical assistance and to build special software that would essentially act as a skeleton key capable of unlocking the phone.

Let’s say, arguendo, that Apple has really created encryption that even they can’t break. Will they be held in contempt of court? Will the government say, “I don’t believe you.” and charge them with obstruction?

I would go with the analogy of a safe that can’t be cracked. I or a manufacturer doesn’t have any obligation to make it easier for Uncle Sam to break in.
Maybe people shouldn’t be allowed to have steel security doors (or panic rooms), just wooden ones that the police can take out with a battering ram.


:thumbsup: Yes, I agree. I am not an Apple-bot and don’t plan on being one. Go ahead, read my phone. If you care what is for dinner, what time I am leaving to come see you and to see pics of my garden, take a look. :wink: My phone is not my life and does not contain anything important at all. And it never will.


LOL–I am with you on that. :slight_smile:


It must be our Irish connection. :irish3:


Indeed. :thumbsup:


The judge ordered Apple to “build special software that would essentially act as a skeleton key capable of unlocking the phone”, so I am assuming that in its current state, iOS’s encryption scheme does not contain such a backdoor. Apple could certainly modify it and maybe push it to the device as an update, which is what I am assuming the judge means by “build special software”. They probably have the ability to add a backdoor, but I doubt that one currently exists.

As for your other points, I don’t trust either Apple or security agencies. That’s why the encryption should be so designed that neither has access and both would be forced to break it. If encryption is done properly, this is the case (and appears to be so with iOS).

DISCLAIMER: The views and opinions expressed in these forums do not necessarily reflect those of Catholic Answers. For official apologetics resources please visit www.catholic.com.