What is going on in the Apple-FBI fight over encryption


Yesterday a federal judge ruled against the FBI in its effort to get Apple to unlock the phone of a suspected drug dealer. This ruling has an impact on the other case, in which the FBI has demanded that Apple unlock the phone of the San Bernardino shooter. The government is using a 227-year-old law known as the All Writs Act as the basis for its claim. (Neil Richards and Woodrow Hartzog explain the origins and purpose of this law.)

A federal judge on Monday rejected an FBI request to order Apple to open the iPhone of a drug dealer in a major setback to the US government’s increasingly heated efforts to force the company to help unlock an iPhone used by a San Bernardino terrorist.

The ruling late on Monday by magistrate judge James Orenstein rejected the US Justice Department’s attempt to gain access to the iPhone of accused crystal meth dealer Jun Feng, whose case is ongoing, though Feng has pleaded guilty. He will be sentenced in April.

The ruling comes just hours before Apple and justice department officials are set to clash in Congress over a court ruling calling on Apple to weaken the password protection of an iPhone belonging to San Bernardino killer Syed Farook.

Tim Cook, the head of Apple, has been engaged in a very public fight with the FBI over its demand that the company allow the government to access the contents of the phone used by one of the San Bernardino shooters. The Intercept has published a series of articles explaining what is at issue in this complicated case. Basically it boils down to the government asking Apple to weaken the encryption that it has built into its software.

As Dan Froomkin and Jenna McLaughlin write:

The more we learn about the FBI’s demand that Apple help it hack into a password-protected iPhone, the more it looks like part of a concerted, long-term effort by the government to find new ways around unbreakable encryption — rather than try to break it.

The court order Apple is fighting would require it to come up with a new way to hack into an iPhone 5c belonging to San Bernardino killer Syed Rizwan Farook.

The fact is that Apple couldn’t break the encryption scrambling the phone’s data if it tried. But the FBI doesn’t have to worry about that if it can just open the phone with the right password.

As Apple CEO Tim Cook put it, in his rebellious public response to the court order: “The ‘key’ to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it.”

McLaughlin examines Apple’s strong response to the court order.

The response Apple lawyers filed Thursday to a court order that the company write software to defeat its own security protocols is exhaustive, fiery, accessible, and full of memorable passages.

The lawyers were asking a federal magistrate judge to vacate what they called her “unprecedented and oppressive” order demanding that Apple design and build software to hack into an iPhone used by San Bernardino killer Syed Rizwan Farook.

And they were relentless.

McLaughlin also writes that the FBI likely knows that there is nothing worth getting from the phone but that they are using this case, with all its anti-terror fearmongering, to establish their right to get backdoors to encryption.

A locked phone used by a dead terrorist initially may have seemed like the perfect test case for law enforcement to argue that it needs ways to get around advanced device security.

But authorities may have picked the wrong phone after all. It’s becoming increasingly clear that law enforcement doesn’t really think there’s any important data on San Bernardino killer Syed Rizwan Farook’s iPhone and that it has more precedent-setting value than investigative value.

“I’ll be honest with you, I think that there is a reasonably good chance that there is nothing of any value on the phone,” San Bernardino Police Chief Jarrod Burguan told NPR reporter Steve Inskeep on Friday.

Ted Rall makes the point that this is really a case of the FBI against what Edward Snowden has unleashed: Apple is fighting this request, with the support of other tech giants, because of the heightened public awareness that Snowden created.

A few years ago, no one — left, right, libertarian — would have supported Apple’s refusal to cooperate with a federal investigation of a terrorist attack associated with a radical Islamist group, much less its decision to fight a court order to do so. If investigators hadn’t combed through the data on the phone used by Syed Farook before he slaughtered 14 people, it would have been seen as dereliction of duty. Obviously the authorities need to learn everything they can about Farook, such as whether he ever had direct communications with ISIS or if there were any coconspirators. Looking at evidence like that is what law enforcement is for.

Rather than face Uncle Sam alone, Apple’s defiance is being backed by Facebook, Google, Microsoft, Twitter and Yahoo — companies who suffered disastrous blows to their reputations, and billions of dollars in lost business, after NSA whistleblower Edward Snowden revealed that they spent years voluntarily turning over their customers’ data to the spy agency in its drive to “hoover up” every email, phone call, text message and video communication on the planet, including those of Americans.

The NSA almost certainly has the contents of Farook’s iPhone — and yours, and mine — on a server at its massive data farm in Bluffdale, Utah. Thanks to a court order and inside-the-Beltway turf battles, however, the NSA can’t/won’t turn them over to the FBI.

This is what happens when government treats citizens with contempt. Citizens return the favor.

Today Apple will be testifying before Congress about its refusal to do what the FBI is demanding. Its top attorney, Bruce Sewell, will be putting these questions to the legislators:

“Do we want to put a limit on the technology that protects our data, and therefore our privacy and our safety, in the face of increasingly sophisticated cyber attacks?”

“Should the FBI be allowed to stop Apple, or any company, from offering the American people the safest and most secure product it can make?”

“Should the FBI have the right to compel a company to produce a product it doesn’t already make, to the FBI’s exact specifications and for the FBI’s use?”

The defenders of the FBI keep saying that they are only asking for this one phone in this one instance but that is not true.

Micah Lee describes what the FBI is asking Apple to do and why, and how it can be foiled.

Yesterday, Apple CEO Tim Cook published an open letter opposing a court order to build the FBI a “backdoor” for the iPhone.

Cook wrote that the backdoor, which removes limitations on how often an attacker can incorrectly guess an iPhone passcode, would set a dangerous precedent and “would have the potential to unlock any iPhone in someone’s physical possession,” even though in this instance, the FBI is seeking to unlock a single iPhone belonging to one of the killers in a 14-victim mass shooting spree in San Bernardino, California, in December.

It’s true that ordering Apple to develop the backdoor will fundamentally undermine iPhone security, as Cook and other digital security advocates have argued. But it’s possible for individual iPhone users to protect themselves from government snooping by setting strong passcodes on their phones — passcodes the FBI would not be able to unlock even if it gets its iPhone backdoor.
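Lee's point about strong passcodes can be made concrete with a rough back-of-the-envelope calculation. The sketch below assumes roughly 80 ms per passcode attempt, a figure based on Apple's published description of its hardware-enforced key-derivation delay; treat the results as order-of-magnitude estimates for the scenario where the FBI's requested software has removed the retry limits.

```python
# Rough worst-case time to brute-force an iPhone passcode, assuming a
# hardware-enforced delay of ~80 ms per guess and no other retry limits
# (i.e., the FBI's requested "backdoor" is in place).

SECONDS_PER_GUESS = 0.08  # assumed hardware-enforced delay per attempt

def worst_case(alphabet_size: int, length: int) -> float:
    """Worst-case seconds to try every possible passcode."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),          # falls in minutes
    ("6-digit PIN", 10, 6),          # falls in under a day
    ("8-char alphanumeric", 62, 8),  # hundreds of thousands of years
]:
    secs = worst_case(alphabet, length)
    print(f"{label}: {secs / 86400:,.1f} days")
```

The arithmetic is the whole argument: a short numeric PIN falls quickly once guess limits are gone, while a long alphanumeric passcode stays out of reach even with the backdoor.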

The NSA has portrayed itself as a behemoth capable of great technological feats, with an array of massive computers and skilled cryptographers that can break into any system. But the reality seems to be that while it can and does vacuum up all the communication traffic, strong encryption can foil it from making sense of the data.

Comments

  1. doublereed says

    The NSA has portrayed itself as a behemoth capable of great technological feats, with an array of massive computers and skilled cryptographers that can break into any system. But the reality seems to be that while it can and does vacuum up all the communication traffic, strong encryption can foil it from making sense of the data.

    Ex-NSA, CIA chief Michael Hayden on this issue. It’s surprising to see that someone so against privacy rights is actually siding with Apple.

    Another good resource is this ACLU article on the consequences of this case.

  2. doublereed says

    Oh, the reason I quoted that section above is just so you don’t conflate the FBI and NSA. Generally speaking, the NSA is against official backdoors unless they’ve been deemed “Nobody But Us” (for whatever reason they think others won’t be able to exploit it). But the FBI doesn’t even pretend to care about people’s security whatsoever.

  3. felicis says

    So… I have some points of disagreement here. First, let us note that in the single case of Syed Rizwan Farook’s iPhone we have a court-ordered search warrant backed up with a court-ordered writ directing Apple to aid the FBI in their search.

    It is disingenuous of Apple to pretend that they are doing this out of some desire to protect people’s privacy. As with most large corporations, they would harvest our organs if they thought it would help their bottom line. That a bunch of other large corporations are supporting Apple in this fight is not necessarily something we should view as being any better than the nutjobs supporting Ammon Bundy in his occupation of the Malheur Wildlife Refuge -- both groups are challenging Federal Court rulings that they don’t want to comply with -- it’s just that the giant corporations are more likely to get their way (setting a different, and, to my mind, a more dangerous, precedent).

    Let us also be clear on what it is that Apple is being asked to do -- help the FBI access the data on the phone. While the FBI suggested a method, the actual order does NOT require Apple to follow the FBI’s suggestion -- if there is another way to get at the data, Apple is welcome to use it. Nor does Apple have to give the method to the government. Indeed, Apple may keep the phone and method and only give the FBI remote access to the data and still be complying with the order.

    So -- NOT “the government asking Apple to weaken the encryption that it has built into its software”. That is an absolutely incorrect way of looking at this situation. NOR is Apple being required to build an OS with a ‘backdoor’ that will allow anyone to bypass the encryption.

    Also -- Apple has the option of explaining to the court why the request is too burdensome.

    Of course -- the CEO immediately released an open letter instead.

    That makes these questions:
    “Do we want to put a limit on the technology that protects our data, and therefore our privacy and our safety, in the face of increasingly sophisticated cyber attacks?”

    “Should the FBI be allowed to stop Apple, or any company, from offering the American people the safest and most secure product it can make?”

    “Should the FBI have the right to compel a company to produce a product it doesn’t already make, to the FBI’s exact specifications and for the FBI’s use?”

    seem a little more misleading. Because none of that is what Apple is actually being asked to do. (I notice that, in all the noise about this, none of the articles link to the actual court order).

    Let us also look at another of Apple’s objections, “And the DoJ’s continued use of the All Writs Act, rather than the boilerplate warrant Apple provides for phones used in at least eight of the cases, seems to suggest an effort to set precedent.”

    Again -- in this case, there is already a warrant in place -- the All Writs Act notice is to require Apple to comply with the warrant already issued. Is this about ‘setting precedent’ or about Apple’s continued refusal to respond to warrants? Which -- in itself -- is a kind of precedent; if Apple can successfully refuse to obey a warrant *once*, that makes it easier for them to do so again.

    Should we trust that our government has our best interests in mind? No -- but let us not forget that Apple doesn’t *either*, and of the two, I trust the government far more than I trust what Apple is saying publicly.

  4. Dunc says

    While the FBI suggested a method, the actual order does NOT require Apple to follow the FBI’s suggestion – if there is another way to get at the data, Apple is welcome to use it.

    There isn’t.

  5. felicis says

    Dunc #4

    How do you know? Especially if Apple is given the physical device (the order specifically allows that Apple can have the phone and give the FBI only remote access to the data) -- generally, having physical access allows a *lot*.

    Certainly -- you’re going to need a bit more than flat denial, or do you believe that because Apple said so?

  6. snoeman says

    I’m glad you posted this. I was hoping for a post where it would be relatively on-topic to pose a question for Marcus Ranum (if he reads this today), who I understand is an expert on cyber security. Namely, is the latest version of the iPhone and iOS likely to be as secure as Apple claims?
    I read Apple’s white paper on the security architecture of the iPhone, and while terms such as “entanglement”, “Secure Element” and “Secure Enclave” all sound impressive, I’m utterly unqualified to make any judgments on the merits of what they’ve done to make the iPhone secure. Is there any reason to believe the iPhone is as secure as Apple says, the capabilities of the NSA notwithstanding?

  7. Friendly says

    Let us also be clear on what it is that Apple is being asked to do – help the FBI access the data on the phone.

    Let us also be clear on what it is that the government cannot be allowed to make accepted practice, whether it’s against ordinary citizens or even nasty and soulless corporations: “You are aware of these individuals’ crimes against the state. Refusing to help us by taking an active role in gathering evidence against them and their co-conspirators is *also* a crime against the state.”

  8. Blood Knight in Sour Armor says

    Giving the FBI backdoor access is at the very least abhorrent customer service and does a lot to hurt the brand (as already happened to the other mentioned megacorps).

  9. Dunc says

    How do you know? … Certainly – you’re going to need a bit more than flat denial, or do you believe that because Apple said so?

    I believe that because I know some stuff about encryption and information security, which is entirely congruent with what Apple have said on the matter since long before this particular case came up. In order for there to be another way to get at the data, Apple’s security design would have to be so badly flawed that it could only reasonably be explained by a deliberate choice to break it, and then to lie about its capabilities in all of their communications and marketing over a period of several years, all in order to pick a fight with the FBI that they might well lose. That doesn’t seem like a particularly likely scenario. I believe that it’s far more likely that they’ve been at least minimally competent in their security design, and that it does actually function in the way they’ve described ever since it was first introduced, in which case there is no other way around it, unless you happen to have a truly revolutionary quantum computer handy.

    Physical access does not allow you to magically bypass a well-designed cryptographic system.
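Dunc's argument rests on how modern iPhones derive their encryption keys: the key comes from *both* the user's passcode and a secret fused into the device's hardware, so guesses can only be tried on the phone itself. A minimal sketch of that "entanglement" idea follows, using Python's PBKDF2 as a stand-in for Apple's actual hardware-bound derivation; the `device_uid` here is a hypothetical stand-in for the fused hardware secret, not Apple's implementation.

```python
# Sketch of key "entanglement": the data-encryption key is derived from
# both the passcode AND a per-device secret, so the derivation can only
# be run on the physical device. PBKDF2 stands in for Apple's real,
# hardware-bound key-derivation function.
import hashlib
import os

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # The UID acts as a salt fused into the silicon: the same passcode on
    # a different device yields a different key, and captured ciphertext
    # cannot be brute-forced offline without the physical device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

device_uid = os.urandom(32)  # hypothetical stand-in for the hardware UID

key = derive_key("123456", device_uid)
assert key == derive_key("123456", device_uid)      # same device, same key
assert key != derive_key("123457", device_uid)      # wrong passcode, wrong key
assert key != derive_key("123456", os.urandom(32))  # wrong device, wrong key
```

This is why physical access alone doesn't help much here: even with the phone in hand, every guess still has to go through the device's own (deliberately slow, rate-limited) derivation.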

  10. Dunc says

    I was hoping for a post where it would be relatively on-topic to pose a question for Marcus Ranum (if he reads this today), who I understand is an expert on cyber security. Namely, is the latest version of the iPhone and iOS likely to be as secure as Apple claims?

    Well, I’m not nearly in the same ballpark as Marcus, and I would also be very interested to hear his point of view (and I’m sure he’ll weigh in at some point), but in my understanding, I’d say that it does look to be really very good from the security point of view, at least to the eyes of this interested amateur.

    Of course, if someone can get your passcode by whatever means, then all the security in the world is useless…

  11. felicis says

    Friendly #8 — Exactly how is this a case of the government attempting to make accepted practice that “You are aware of these individuals’ crimes against the state. Refusing to help us by taking an active role in gathering evidence against them and their co-conspirators is *also* a crime against the state.”

    This is pursuant to a search warrant already issued. This is not an arbitrary search.

  12. felicis says

    Dunc #9 — “I know some stuff about encryption and information security” is still not terribly convincing. And you seem to be deliberately misunderstanding my point about having physical access to the device -- I do not claim that it “allow[s] you to magically bypass a well-designed cryptographic system”; I said, “generally, having physical access allows a *lot*.” Meaning that it makes breaking into a (however well-designed) system *easier*. That ‘easier’ may only be marginal, but don’t pretend that there’s no difference!

  13. Friendly says

    felicis @12: The FBI’s warrant is not demanding the *disclosure* of information, *handing over* of records, etc. It is demanding that a third party become an *active collaborator* with the government’s security apparatus. If that kind of demand is allowed to stand, then the FBI can not only force Brinks to crack any safe they’re interested in, they can also haul in you or me or anyone else off the street and force you to wear a wire when you’re talking to acquaintances engaged in “suspicious activity.” If you don’t see a problem with government bodies conscripting individuals and organizations as agents -- however hypocritical or self-serving such individuals or organizations might be -- I have nothing further to say to you.

  14. felicis says

    Friendly -- #15 -- Slippery slope argument -- your hypotheticals: “the FBI can not only force Brinks to crack any safe they’re interested in, they can also haul in you or me or anyone else off the street and force you to wear a wire when you’re talking to acquaintances engaged in ‘suspicious activity.’” The All Writs Act has been law since 1789, and in its current form since 1911. Yet none of what you claim could happen has.

    Further -- this case is not a parallel to ‘haul[ing] in you or me or anyone else off the street and force[ing] you to wear a wire’, nor is it forcing (in this case) Apple to act as a agent.

    The closest you get is ‘force Brinks to crack any safe they’re interested in’ — which I have no problem with so long as there is (as in this case) a valid search warrant for what is in the safe.

    So -- perhaps explain why individuals or organizations without privileged relationships (as a confessor, doctor, or writer) with someone should be able to ignore a search warrant at will? Should a landlord be able to refuse to allow police to search an apartment if they have a valid warrant? Is it unreasonable of the government to require a landlord to open the apartment? And if the landlord opens the apartment, how does that make them an ‘agent’?

  15. doublereed says

    @felicis

    You are under a lot of misunderstandings about this case. Almost everything you have said is completely and entirely incorrect.

    So – NOT “the government asking Apple to weaken the encryption that it has built into its software”. That is an absolutely incorrect way of looking at this situation. NOR is Apple being required to build an OS with a ‘backdoor’ that will allow anyone to bypass the encryption.

    What the FBI is asking for is essentially use of Apple’s signing key, which allows the device to recognize the code as legitimate when it loads it into the device. The FBI is requesting Apple actively write code to disable security features of the device using their signing key.

    This is a backdoor. That’s what a backdoor is. The code that Apple writes will be effective on any device of the same model. Writing such code and signing it with Apple’s key is a massive security risk as it threatens all devices.
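doublereed's point about the signing key can be sketched in miniature. The snippet below uses HMAC as a deliberately simplified stand-in for Apple's real (asymmetric) code-signing scheme; the key value and firmware strings are illustrative only. It shows why the signing key, not the code itself, is the crux: anyone can write a weakened OS, but the device will only load one that carries Apple's signature.

```python
# Simplified illustration of code-signing: a device boots firmware only if
# its signature verifies against the vendor's key. HMAC-SHA256 stands in
# for Apple's actual asymmetric signature scheme.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"held only by Apple"  # hypothetical stand-in

def sign(firmware: bytes) -> bytes:
    """What only the key-holder (Apple) can do."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The check the phone performs before loading any firmware."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

normal_os = b"iOS with passcode retry limits"
weakened_os = b"iOS with retry limits removed"

assert device_will_boot(normal_os, sign(normal_os))
# The FBI (or anyone) can write the weakened OS, but without Apple's key
# the device rejects it:
assert not device_will_boot(weakened_os, sign(normal_os))
# Which is why the order compels Apple to write AND sign the code:
assert device_will_boot(weakened_os, sign(weakened_os))
```

The design choice at issue follows directly: once a signed, security-stripped build exists, it validates on any device of the same model, which is what makes it a backdoor rather than a one-off unlock.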

    Also – Apple has the option of explaining to the court why the request is too burdensome.

    Apple is not claiming that they cannot write the code. They are arguing the principle of the matter.

    Schneier gives a good run-down of the technical aspects of the case.

    I don’t understand why you think Apple is lying. They have no desire to make the FBI look bad or have some grudge against the FBI. The FBI does have desire to make Apple look stubborn, as they want to use the precedent for much more far-reaching cases. Once again, I link to an ACLU article on the consequences of the case.

  16. says

    The FBI is arguing for unrestricted access to your phone without a warrant. And anyone who resents or resists is a “threat”, ergo the FBI must force phone makers to give them access. Maybe British drug dealers were right, buying up old Nokias that have old software and no GPS on them.

    If you don’t trust apple or other companies, you could make your own phone.

    http://www.instructables.com/id/Build-Your-Own-Smartphone/

    https://www.raspberrypi.org/blog/piphone-home-made-raspberry-pi-smartphone/

    The NSA and FBI’s petty squabbling over territory is nothing new. Some writers I’ve read claim various branches of the US military used to do the same thing in the past, sometimes sabotaging other agencies’ attempts to gather intelligence. I bet the Soviets knew and were laughing.

  17. doublereed says

    @felicis

    So – NOT “the government asking Apple to weaken the encryption that it has built into its software”. That is an absolutely incorrect way of looking at this situation. NOR is Apple being required to build an OS with a ‘backdoor’ that will allow anyone to bypass the encryption.

    I wanted to quote this again, because this is astonishingly incorrect. The Apple devices have a vulnerability, and the FBI is asking for Apple-signed code that exploits the vulnerability and bypasses the security features of the phone at the OS level.

    Yes, Apple is being required to build an OS that backdoors the device. The court order is actually quite specific. That’s what is happening. The fact that you actively deny that this is what is happening expresses not just ignorance, but willful ignorance of the situation.

    Apple has PR touting their security features, and they have done many good things, including closing this vulnerability in newer devices. It is frankly laughable that you think they have some vested interest in not complying with the FBI. What for? What’s the end game? What’s the motivation? You think Apple is part of some criminal syndicate or something? This is just senseless posturing to advertise for Apple maybe? Makes no sense.

  18. doublereed says

    So – perhaps explain why individuals or organizations without privileged relationships (as a confessor, doctor, or writer) with someone should be able to ignore a search warrant at will? Should a landlord be able to refuse to allow police to search an apartment if they have a valid warrant? Is it unreasonable of the government to require a landlord to open the apartment? And if the landlord opens the apartment, how does that make them an ‘agent’?

    Let’s say I’m a safe-maker. It is not my responsibility to come up with ways to break into the safe, in case the FBI wants in. The whole point of me making safes is that they can’t be broken into. I designed them to be that way.

    Designing a master key into my safe (just in case the FBI wants in) weakens the security of all my safes. It makes everyone less secure. A precedent in court would mean anyone who makes security products has to deliberately weaken their security for the FBI. This is a dangerous precedent for all American security products.

  19. moarscienceplz says

    McLaughlin also writes that the FBI likely knows that there is nothing worth getting from the phone but that they are using this case, with all its anti-terror fearmongering, to establish their right to get backdoors to encryption.

    This is the only explanation that makes any sense at all. What data could there possibly be that would be worth anything to the FBI except how to break into more phones? There were only two people in the attacks and they are dead. Enough time has now passed that any sort of “automatic” attack (time bomb, anthrax in the mail, etc.) would have happened if it was going to. What else could there be -- the identity of a couple of Muslims who cheered them on? Big whoop, I’m sure the FBI knows thousands of people like that already, and probably doesn’t surveil most of those due to lack of resources.

  20. Friendly says

    felicis@16:

    The All Writs Act has been law since 1789, and in its current form since 1911. Yet none of what you claim could happen has.

    Until recently, we’ve had few federal governments that have shown themselves to be as astoundingly willing to trample on civil liberties in the name of state security as (in particular) the Bush and Obama administrations. Donald Trump recently made this statement about The Washington Post (which has criticized him) and its owner Jeff Bezos: “Believe me, if I become president, oh do they have problems. They’re gonna have such problems.” This makes it very clear that Trump sees no problem with using the apparatus of the state against his perceived personal enemies. Worse might be coming, and we don’t need any precedents to be set that would make it easier for future autocrats in the Executive Branch.

    this case is not […] forcing […] Apple to act as a [sic] agent.

    I find it hard to believe that you could possibly write that with a straight face.

    The closest you get is ‘force Brinks to crack any safe they’re interested in’ — which I have no problem with so long as there is (as in this case) a valid search warrant for what is in the safe.

    As my friend Ashton Sydney has asked, “What happens if Apple are forced to write this code, go to their head programmer and say ‘Write this code because we’re forced to order you to do so’ — and he says ‘I WON’T’?
    Are Apple supposed to fire their valuable employees for not doing something that they themselves don’t want to do?
    Are the courts going to order an individual who has nothing to do with a crime to do specific work to help them? [To which I would add, ‘And subject them to contempt of court proceedings if they don’t?’] Doesn’t that count as involuntary servitude under the Thirteenth Amendment?”

    perhaps explain why individuals or organizations without privileged relationships (as a confessor, doctor, or writer) with someone should be able to ignore a search warrant at will? Should a landlord be able to refuse to allow police to search an apartment if they have a valid warrant? Is it unreasonable of the government to require a landlord to open the apartment? And if the landlord opens the apartment, how does that make them an ‘agent’?

    First off, refusing to comply with an unconstitutional warrant while contesting it in court is not “ignoring” it. Second, a landlord *owns* the space that a tenant is renting; any information stored there is *on his property*, and it’s not unreasonable for the government to seek access to someone’s property to conduct a lawful search. In this case, Apple might have *made* the phone in question, but *it is not their property*; attempting to compel them to break into it is an abuse of power — plain and simple, full stop — and given that you won’t stop defending that abuse, I’m done here.

  21. says

    I haven’t looked through all the comments yet but there’s a very good likelihood that this is a side-show and that the FBI is only asking Apple because:
    a) they already fucked up their forensic process, are stupid, and are taking the easy path, which is to pressure Apple.
    b) they are typically on bad terms with the NSA, who almost certainly have the cell phone messages in PRISM databases out in Utah. But the NSA would want the FBI to beg, and the FBI would rather pressure Apple.

    Whether the device is encrypted well or not (I am almost certain not, for reasons I can explain when I have more time) the data would have been exposed at the juncture-points between providers and carrier edge networks. AT&T and Verizon have never balked at giving that data to the FBI -- even in bulk -- and I see no reason to imagine they would have suddenly stopped. It’s a profit centre for them.

  22. felicis says

    Doublereed -- #17 --

    “Almost everything you have said is completely and entirely incorrect.
    So – NOT “the government asking Apple to weaken the encryption that it has built into its software”. That is an absolutely incorrect way of looking at this situation. NOR is Apple being required to build an OS with a ‘backdoor’ that will allow anyone to bypass the encryption.”

    I refer you to the actual court order:
    http://www.ndaa.org/pdf/SB-Shooter-Order-Compelling-Apple-Asst-iPhone.pdf

    Note especially paragraph 4 -- while the government is asking for Apple to unlock the phone in a particular way, Apple may do so in any manner it wants, or even (Paragraph 7) show that this places an undue burden on Apple and not do it at all.

    “What the FBI is asking for is essentially use of Apple’s signing key, which allows the device to recognize the code as legitimate when it loads it into the device. The FBI is requesting Apple actively write code to disable security features of the device using their signing key.”

    Close -- the court order specifies a signed file, however (see Paragraph 3) the file may be loaded at an Apple facility and not provided to the government at all -- so long as the government has remote access to the phone so it can try passwords until it is unlocked. So -- sure, the FBI is asking them to either unlock the phone or at least allow them infinite tries so they can unlock the phone in a reasonable amount of time. That is not the same as handing over the key or the OS. Especially if Apple can keep the phone.

    And note Paragraph 4 again -- Apple does not have to build in any particular weakness at all if there is any alternative to accessing the data.

  23. felicis says

    Doublereed #20

    “Let’s say I’m a safe-maker. It is not my responsibility to come up with ways to break into the safe, in case the FBI wants in. The whole point of me making safes is that they can’t be broken into. I designed them to be that way.”

    Every safe can be broken into. And you would have technical expertise in being able to get into your safe -- the State has a process to compel your aid in executing a search warrant. You can argue that the very concept of being required to help the state is unjust; you can argue that this instance is overly burdensome on yourself; but arguing that this master key (which you don’t have to give the state -- if you open the safe and then destroy the key, that would meet the letter of the order) is a danger to the security of everyone who has one of your safes is stretching things a bit.

    “Designing a master key into my safe (just in case the FBI wants in) weakens the security of all my safes. It makes everyone less secure. A precedent in court would mean anyone who makes security products have to deliberately weaken their security for the FBI. This is a dangerous precedent for all American security products.”

    The security weakness already exists -- so how does this make it worse? Are you sure that no one else can use this to get into your iPhone? They aren’t being asked to deliberately weaken security on iPhones in general, nor to provide the FBI with a means of doing so generally.

  24. felicis says

    Friendly #22

    Yes -- Trump has publicly stated that he will do things (well, try to do things) that are blatantly unconstitutional. That’s one reason we have courts. Here we have two court orders -- one for a search warrant and one ordering aid in implementing that search warrant.

    Aiding is not the same as acting as an agent.

    “As my friend Ashton Sydney has asked, “What happens if Apple are forced to write this code, go to their head programmer and say ‘Write this code because we’re forced to order you to do so’ — and he says ‘I WON’T’?
    Are Apple supposed to fire their valuable employees for not doing something that they themselves don’t want to do?”

    It is possible he could be held in contempt -- he can appeal that and I expect that Apple’s legal department would help him do so. There’s certainly nothing in the order requiring Apple to fire anyone. (Nor would it be reasonable to compel them to do so -- that would probably be found to be placing an undue burden on them, at least I would see it that way).

    “Doesn’t that count as involuntary servitude under the Thirteenth Amendment?”

    Possible -- I would be willing to listen to an argument to that effect. If Apple wants to refuse to perform this service, I don’t see why they could not make such an argument -- however, if Apple agrees to perform the service and assigns that work to one of their employees who then refuses, the employee would probably not have any similar recourse (since they are effectively refusing to do their job).

    However, that’s not the argument Apple is making (and it is unclear whether or not a corporation *could* make that argument -- is it a ‘person’ or an ‘individual’ who is protected? I’ve seen both used in descriptions of involuntary servitude).

    “In this case, Apple might have *made* the phone in question, but *it is not their property*; attempting to compel them to break into it is an abuse of power — plain and simple, full stop.”

    No -- it is not their property, it is, however, property subject to a search warrant -- just how is this an abuse of power? But -- it looks like you won’t answer, so I suppose we’re done here.

  25. lanir says

    This all sounds like magic bullshit.

    FBI is not asking for search warrants -- it is asking to deputize unwilling people. Pretty sure the legal process for that is different.

    FBI is playing Mr. Wizard and talking to the masses about Things We Know And You Don’t. Their statements sound quite compelling and sincere if you have no idea how the underlying technology works. They do their incantation, they utter the magic word “terrorism” and wait for the spell to work.

    Techie goo follows:

    When computer drives are encrypted, there’s a small part that isn’t encrypted. It’s required to start things up and have something that can run to ask for your password. I’m not sure how smartphones do it; from the sound of it, the firmware handles that, so to replace it you’d need something that does all the normal bootup tasks, like knowing how to talk to all the hardware, in addition to providing the different password functionality the government wants. They need that because it sounds like the password was changed, so Apple doesn’t have it anymore, and I’m guessing the phone needs some confirmation before allowing a remote password change. That’s why they need Apple to help and why it has to be done this specific way.

  26. says

    Dunc@#4 --
    There isn’t.

    I wouldn’t be so sure of that. When you restore an iPhone from the cloud settings, what is required? Just the PIN and the Apple ID. That tells you one of two things:
    -- there’s an unencrypted version of the cloud data sitting around
    -- the encryption is based on the Apple ID’s password/PIN

    It also tells you there is an offline copy of the data sitting around that can be attacked offline using dictionary attacks or even a brute-force search (since the offline copy won’t be programmed to wipe itself).

    You can always tell you’re dealing with mediocre-at-best encryption if the data can be restored in the event of a device failure. That always means that the encryption is not keyed to something on the device; the keys are exported to some other place and are probably encrypted with a master per-user key or even a global key. Global keys would (of course) be supreme incompetence, but I have seen several mobile device security systems that had global keys embedded in the code. Let’s imagine Apple cares enough not to do that. But, honestly, that’s a stretch for me.
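
    The point above can be sketched in a few lines. This is a hypothetical illustration, not Apple’s documented scheme: PBKDF2 stands in for whatever KDF the real service uses, and the salt and wordlist are made up. The key property is that a password-derived backup key lets anyone holding a copy of the backup grind through candidate passwords offline, with no attempt limit and no self-wipe.

```python
import hashlib

# Hypothetical sketch: if the backup key is derived from the account
# password, the backup is only as strong as that password.
def derive_backup_key(password: str, salt: bytes) -> bytes:
    # PBKDF2 stands in for whatever KDF the real service uses.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def dictionary_attack(wordlist, salt, target_key):
    """Offline dictionary attack: the backup can't wipe itself,
    so every candidate password can be tried."""
    for candidate in wordlist:
        if derive_backup_key(candidate, salt) == target_key:
            return candidate
    return None

salt = b"per-user-salt"
stolen_key = derive_backup_key("hunter2", salt)  # key the backup was wrapped with
print(dictionary_attack(["letmein", "password1", "hunter2"], salt, stolen_key))  # → hunter2
```

    The iteration count only slows each guess down linearly; against a weak password drawn from a dictionary, that buys very little.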

    Don’t believe me? Have you ever purchased a new iPhone and restored all the settings and text messages from iCloud using just your Apple password? What does that tell you?

    Sorry, I’m a cynical old dog who’s coded a number of the security systems that many of you probably relied on if you were doing anything securely on the internet in the mid-late 90s. There is a great deal of bullshit surrounding this story.

    What I believe is going on is that Apple has carefully picked its battle to get what it hopes will be a favorable ruling in this area. Major companies like Blackberry (remember them?) and Facebook and Google have to deal with “national security letters”, in which the government violates the 4th Amendment by issuing warrants and classifying the warrant so it cannot be discussed. The government has established secret courts, which are arguably unconstitutional or depend on jesuitical parsing of the Constitution. I think Apple thinks this is a good enough case that they can fight this particular position. I tend to agree with them. It appears that the FBI has screwed up its forensic process and is going to put itself on the stand as being a bunch of idiots.

  27. says

    I’m not sure how smartphones do it; from the sound of it, the firmware handles that, so to replace it you’d need something that does all the normal bootup tasks, like knowing how to talk to all the hardware, in addition to providing the different password functionality the government wants.

    You pull the encrypted blob off the memory (it’s in accessible memory) and you whack it with brute force attacks against all the PINs. There aren’t that many PINs. There are plenty of known plaintexts. The FBI is too incompetent and too lazy to do an offline known plaintext attack. In fact if they had just gotten busy doing that, they’d have unlocked it all by now.
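
    The offline known-plaintext search described above can be sketched as follows. This is a toy stand-in, not the phone’s real crypto: a SHA-256 counter keystream replaces AES, the PBKDF2 iteration count is deliberately low so the demo runs quickly, and the salt and PIN are invented. The point it illustrates is the search space, not the cipher.

```python
import hashlib
from itertools import product

def derive_key(pin: str, salt: bytes) -> bytes:
    # Low iteration count for the demo; a real device uses far more work.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

def keystream(key: bytes, n: int) -> bytes:
    # Toy stream cipher: SHA-256 over key || counter.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Many files start with known magic bytes -- a ready-made known plaintext.
KNOWN_HEADER = b"SQLite format 3\x00"

def crack(blob, salt):
    # Only 10,000 four-digit PINs: enumerate them all offline.
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if xor(blob, keystream(derive_key(pin, salt), len(blob))) == KNOWN_HEADER:
            return pin
    return None

salt = b"device-salt"
blob = xor(KNOWN_HEADER, keystream(derive_key("2580", salt), len(KNOWN_HEADER)))
print(crack(blob, salt))  # → 2580
```

    With the blob copied off the device, nothing rate-limits or wipes anything; the whole 4-digit space falls in seconds even on a laptop.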

    The FBI probably realizes there’s nothing worth anything on the phone, anymore. They’re just setting up so that next time there’s a terrorist attack they can drag out people like Sam Harris to talk about ticking time bomb scenarios and argue for backdoors in everything. Like they did with CALEA. The FBI will do anything to get out of having to do actual detective work.

  28. doublereed says

    @felicis

    Note especially paragraph 4 – while the government is asking for Apple to unlock the phone in a particular way, Apple may do so in any manner it wants, or even (Paragraph 7) show that this places an undue burden on Apple and not do it at all.

    Once again, Apple is not claiming that it cannot do what the FBI is asking. Please stop talking about “undue burden.” Apple is fighting this on principles, not technicalities. Once again, you are completely misunderstanding the issue at hand.

    Stop talking about undue burden. The issue is whether tech companies can essentially be conscripted into law enforcement on demand to weaken the security of all their customers.

    Every safe can be broken into. And you would have technical expertise in being able to get into your safe – the State has a process to compel your aid in executing a search warrant – you can argue that the very concept of being required to help the state is unjust; you can argue that this instance is overly burdensome on yourself; but arguing that this master key (which you don’t have to give the state, if you open the safe, then destroy the key – that would meet the letter of the order), is a danger to the security of everyone who has one of your safes is stretching things a bit.

    The government can ask for my aid in the technical aspects of my safe. They can’t compel me to design a master key for it. (And saying “every safe can be broken into” is a nonsense, pointless thing to say, and has nothing to do with this case).

    And how is that stretching things at all? Once the precedent is set, what is to stop the FBI from asking to use it again and again and again and on other devices? Do you not understand how the law works and what a precedent is? Do you think the FBI attorneys don’t understand how court orders work?

    To say that this is “stretching things” is shockingly naive. Gross incompetence and lack of critical thinking. I have no more words.

    The security weakness already exists – so how does this make it worse? Are you sure that no one else can use this to get into your iPhone? They aren’t being asked to deliberately weaken security on iPhones in general, nor to provide the FBI with a means of doing so generally.

    Once again, you show complete and total ignorance of the technical security issues at play here. They are being asked to deliberately weaken the security of all iPhones of the same model. They would all use the same code-signing key. That’s how the code works.

    And yes, other people can exploit the vulnerability to get into your iPhone. It’s just easier to get Apple to exploit it directly because they have the key. The FBI could do it themselves if they have the power (maybe they already have, and are using this case just to get the legal precedent).

  29. doublereed says

    @felicis

    I want to make this clear: in terms of privacy and security, you are going further than even ex-NSA chief Michael Hayden on this issue. You do not know what you are talking about.

  30. EnlightenmentLiberal says

    Fascinating. Part of the decryption key is hardwired into the iPhone processor itself. Ingenious!

    Now, if only the processor itself was hardwired to force a wait of 5 seconds after a failed decrypt operation. Perhaps that’ll be the next feature on the chip design for Apple.

  31. EnlightenmentLiberal says

    Oh, they already did that in the iPhone 5S from what I’m reading. Incredibly fascinating. I need to read more about this.

  32. EnlightenmentLiberal says

    Nope. I take that back. The iPhone 5S can still be hacked because the decrypt delay is not hardwired in the chip itself where the decrypt key is also internally stored. As I said, I hope Apple has a bug request open with their chip designer, lol.

  33. lanir says

    The FBI is too incompetent and too lazy to do an offline known plaintext attack. In fact if they had just gotten busy doing that, they’d have unlocked it all by now.

    Okay, I can believe that. I’d always been told the FBI had some very skilled technical people. That particular illusion lasted less than 2 minutes into the first conversation I had with one of their agents. I would have brushed it off as them needing people with all sorts of different skillsets except he was responding to a very technical part of an investigation and he clearly thought my colleagues and I would do all his work for him.

  34. says

    Part of the decryption key is hardwired into the iPhone processor itself. Ingenious!

    Always look at how backup keys are managed for backup sequestration or for data migration to a new system. Usually when rubes are looking at the first couple pages of math, their eyes glaze over and they don’t catch the part where the secret key generated by the TPM gets encrypted with a public key embedded in the software and appended, hashed, into the message integrity checksum. Or something like that.

    Carl Ellison published a paper at Crypto ’95, I think it was, based on my observation that it ought to be possible to modulate a secret key exchange with a known secret such that it would be indistinguishable from a real DH exchange, but the secret key could later be applied to extract the exchanged key. One cryptographer I know referred to that technique as “satanic”, whereas the commandant of the US cryptological academy told me it was “obvious” -- you make whatever inferences from that, be my guest.

  35. says

    doublereed:
    you are going further than even ex-NSA chief Michael Hayden on this issue

    Hayden was lying when he gave that testimony and has lied a great deal since then. So, unless your point was that suckers should believe what they’re told by the secret police, I don’t see what you’re saying.

    Hayden was lying about lots of things (before you ask), the most obvious of which was his head-fake definition of “metadata”: defining “looking at” something as meaning “human beings observing it”, and “collecting” as meaning “human beings observing the collection of it.”

  36. says

    PS -- watch carefully when an NSA or CIA person uses the word “investigation”; it’s a lot like when a cop uses the term “probable cause” and means your colon is going to get turned inside out.

    Where things get interesting is when you have a bunch of data being run through a load of processors and a scoring system, then getting applied against clusters of messages (still not “searched”), then “matched” for content probability against linguistic codexes. You know what? You don’t even have to decode or match a data set if you can subset-match the contents closely enough. It’s amazing.

  37. Dunc says

    Marcus, @28: I was referring specifically to the encrypted file system on the device itself, which (as far as I understand it) is what the Feds are actually going for in this case. You are of course completely correct to point out that any backup that can be restored without access to the device key is a much softer target -- which only serves to reinforce the impression that they’re less interested in the actual data on this specific device than the general principle that they should be able to break encryption.

    I think it’s worth remembering that consumer-level security isn’t really intended to protect people from governments. The security on the iPhone is intended to keep your data secure if your phone is lost or stolen and falls into the hands of ordinary criminals or the merely curious -- it’s not intended to protect you from a determined attack by a national security agency. I’m a big fan of James Mickens’ “MOSSAD / not-MOSSAD” threat model in this respect.

    (For the uninitiated, the “MOSSAD / not-MOSSAD” threat model breaks threats down into two simple categories -- “not-MOSSAD” type threats -- i.e. nosy partners and common-or-garden cybercriminals, which can be relatively easily defeated by ordinary means such as strong passwords and off-the-shelf encryption, and “MOSSAD” type threats -- which would certainly include the NSA and GCHQ -- which cannot be defeated and will own you and your data sixteen ways from Sunday if they bother to put their minds to it. The key message being that there’s really no point in worrying about MOSSAD -- if they want you it’s game over, and all the encryption in the world won’t help you.)

  38. Dunc says

    Just looking at the white paper again, and I see that there are several classes of data which are always encrypted using the device UID, even during backup, and so cannot be restored to a different device. Could be that they’re after one of those…

  39. doublereed says

    @38 Marcus

    My point was that even someone as disregarding-of-privacy-rights and police-state-enabling as Michael Hayden is on Apple’s side of the case. He’s no raging civil libertarian. But even he said James Comey was simply wrong.

    I wasn’t referring to his testimony, I was referring to an interview he gave here.

  40. says

    My point was that even someone as disregarding-of-privacy-rights and police-state-enabling as Michael Hayden is on Apple’s side of the case

    Hayden is not on any “side” of anything; he’s going to say whatever is convenient.
    The point is that you can’t assume people like Hayden are going to say the opposite of the truth or the truth at any given time. They aren’t like republicans, who you can expect to always lie in the same way on any given issue; they’re not that easy to predict. Especially now that Hayden’s a revenue-sharing member of the military/industrial/intelligence complex.

  41. says

    there are several classes of data which are always encrypted using the device UID, even during backup, and so cannot be restored to a different device

    Well, anyone who’s ever restored an iPhone will tell you that all their message-streams reappear as they were when the device was being sync’d. That tells you that nothing device-dependent can possibly be being used to protect the cloud version of the data.

    This whole thing is a sideshow.

    One popular way hackers will monitor a person’s cell phone is to buy a blank phone and enroll it via iCloud using their credential. Then, it becomes a perfect tap. You have to delete the email messages they’ll get that notify them there’s a new device enrolled, of course. There is no way the FBI does not know about that technique because they’ve had to investigate its use. And even though they are barely competent to tie their own shoes, they can always find a hacker who’ll plea-bargain their way through it. Remember how they flipped ‘Sabu’ into an unpaid FBI agent?

    This whole thing is a sideshow. And it’s a badly chosen sideshow for the FBI. As you probably can guess, that flinty ball of neutronium I call a heart does not bleed for the FBI.

  42. Dunc says

    Well, anyone who’s ever restored an iPhone will tell you that all their message-streams reappear as they were when the device was being sync’d. That tells you that nothing device-dependent can possibly be being used to protect the cloud version of the data.

    Sure, none of that stuff is device-dependent. It’s only certain very specific items that are locked to the device.

  43. says

    It’s only certain very specific items that are locked to the device.

    Yep. Stuff the FBI doesn’t care about. Contacts, messages, app data, etc -- will usually be backed up to the cloud. The FBI doesn’t need to crack the DRM on the guy’s music downloads.

    Apple fans tend to fall for the “oo! encryption!” marketing. That’s why apple does it, after all.

  44. Dunc says

    Well, I think it’s more a question of stuff that users don’t care about… People care far more about maintaining access to their data than security. (Me, I’m a Windows guy… ;))

  45. deepak shetty says

    @felicis

    I trust the government far more than I trust what Apple is saying publicly.

    You do not seem to be following current events -- nor do you seem to be willing to learn from history.

  46. doublereed says

    Hayden is not on any “side” of anything; he’s going to say whatever is convenient.

    That’s a fair point.

  47. EnlightenmentLiberal says

    To Marcus

    Always look at how backup keys are managed for backup sequestration or for data migration to a new system. Usually when rubes are looking at the first couple pages of math, their eyes glaze over and they don’t catch the part where the secret key generated by the TPM gets encrypted with a public key embedded in the software and appended, hashed, into the message integrity checksum. Or something like that.

    Sorry. I never meant to imply that it’s actually secure against MOSSAD threats. I just thought that it was a fascinating and ingenious idea to bake part of the decryption key into non-volatile memory which is embedded in the actual physical chip, and built-in encrypt and decrypt processor instructions, with no method for outside access of the key (short of opening the physical chip very carefully, and using an electron microscope or some such).

    As you so eloquently say, there are probably a bazillion holes in the actual security setup of the iPhone, such as cloud backups. It would probably take someone on the level of MOSSAD to develop a phone and apps that are secure against MOSSAD.

    I thank you for your knowledgeable contributions.

  48. Dunc says

    You’ve got to remember that security is always a matter of trade-offs -- in this case, the trade-off between the security of your data and the ability to recover it. For the vast majority of people, “what if I lose my phone?” is a very real and pressing concern, whilst “what if MOSSAD want to read my messages?” is so far down the threat matrix that it’s not worth thinking about.

  49. EnlightenmentLiberal says

    whilst “what if MOSSAD want to read my messages?” is so far down the threat matrix that it’s not worth thinking about.

    What is on the radar is if the common person wants to give the middle finger to the US NSA and CIA. If everyone uses this kind of encryption, then no one stands out as the obvious suspect, and the NSA and CIA do not have the human resources to follow up on everyone.

  50. StevoR says

    @ ^ EnlightenmentLiberal : And who in effect does that help and who does it harm?

    Doesn’t it occur to y’all that we have law enforcement and counter-terrorism agencies for a reason (or fifty!) and those reasons are often good reasons and that we should support -- not uncritically but still -- support such law enforcement and counter-terrorist groups as part of the whole social contract deal much the same as big companies should pay their taxes, we should have laws and courts and elections in the first place and so on.

  51. Holms says

    And we’ll all be playing the smallest violins we can find, out of sympathy for said organisations having to fall back on old-fashioned detective work.

  52. says

    Doesn’t it occur to y’all that we have law enforcement and counter-terrorism agencies for a reason (or fifty!) and those reasons are often good reasons and that we should support – not uncritically but still – support such law enforcement and counter-terrorist groups as part of the whole social contract deal

    They’re outside of the social contract -- unless you can recall when you were asked whether you needed the FBI and whether you wanted it in the particular form it’s taken. What about “consent of the governed” is not clear to you? You’re an authoritarian (the government is right because it’s the government), not a liberal (explain to me why…)

    I suggest you consider Pournelle’s “Iron Law of Bureaucracy” -- it’s actually a pretty fair assessment of what’s gone on in law enforcement and intelligence pretty much always. There is no reason at all for you to assume that law enforcement and anti-terror agencies are there for your good; they are much more likely to be there for their own good.

  53. says

    there are probably a bazillion holes in the actual security setup of the iPhone, such as cloud backups.

    Slate has a good(ish) piece on the topic here: http://www.slate.com/blogs/future_tense/2016/03/01/fbi_director_comey_seemed_lost_at_congressional_hearing_about_apple_iphone.html
    One quote that warmed my heart was Rep Issa asking the right question -- the one I immediately asked:

    Issa: Did you receive the source code from Apple? Did you demand the source code?

    Comey: Did we ask Apple for their source code? Not that I’m aware of.

    Issa: Does the 5C have a non-volatile memory in which all of the encrypted data and the selection switches for the phone settings are all located in that encrypted data?

    Comey: I don’t know.

    Issa: Well, it does. Take my word for it for now. So that means that you can in fact remove from the phone all of its non-volatile memory, its disk drive if you will, and set it over here and have a true copy of it that you could conduct an infinite number of attacks on. Let’s assume that you can make an infinite number of copies once you make one copy, right?

    Comey: I have no idea.

    Issa: I’m doing this because I came out of the security business and this befuddles me that you haven’t looked at the source code and you don’t really understand the disk drive.

    The source code is unnecessary. As I said in my earlier posting -- if the FBI had started an offline decryption process, it would have been finished by now. Very few people use a PIN that’s longer than 4 digits (I use 6, but I wear a tin foil hat, as someone has rightly pointed out). In fact, if the FBI had any brains at all (they don’t), they would have gotten the guy’s ATM card PIN from his bank and tried that. That would give them a ~70% chance of cracking the phone on the first try.
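
    Some back-of-envelope arithmetic supports the “would have been finished by now” claim. As a deliberately pessimistic assumption, suppose each offline guess costs as much as Apple’s on-device key derivation (roughly 80 ms per try, the figure Apple’s iOS security documentation gives for on-device attempts); offline hardware would typically do much better.

```python
# Pessimistic assumption: ~80 ms per guess, matching the on-device
# key-derivation cost rather than faster offline hardware.
per_try_s = 0.080

for digits in (4, 6):
    space = 10 ** digits                 # all possible PINs of this length
    worst_h = space * per_try_s / 3600   # exhaustive sweep, worst case
    print(f"{digits}-digit PINs: {space:,} tries, worst case ~{worst_h:.2f} hours")
```

    Even at that rate, the entire 4-digit space falls in well under an hour, and the 6-digit space in about a day -- which is why the wipe-after-10-failures and rate-limit features matter so much more than the cipher itself.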

    The cloud backups are another funny thing. For one thing, they are backups which are backed up, themselves. There is a very, very good chance -- bordering on 100% -- that the phone’s state at any time since it came online can be recovered from past backups. And, historically, Apple, Google, Yahoo!, Microsoft, and other companies have never failed to give that information when presented with a proper warrant. My suspicion is that Apple’s reaction in this case is because the FBI dramatically overreached itself and is getting a well-deserved rap on the knuckles from the industrial sector that really runs the country.

    Now, it’s possible that the suspect had all the cloud services turned off (I sure as fuck do, yet there is a vast amount of chatter between Apple and my iPhone that I suspect includes application and browser data). If the suspect had iCloud turned off, then the backups are going to be sitting there on a desktop machine running iTunes, and even if they are encrypted there, the password will be (on average) predictably crackable. People who do cyber investigations (including yours truly) know how to do this stuff and would have counselled the FBI on how to do it in this case, except it appears that some room-temperature intellect at the FBI started firing from the hip in the general direction of his own foot.

    It would probably take someone on the level of MOSSAD to develop a phone and apps that is secure against MOSSAD.

    Mossad are just another bureaucratic cluster of fuckwits like the NSA and CIA -- only marginally better because they’re smaller and less well-funded. (There seems to be a rule that the more money you give cops to do their job, the dumber they get.) And they would be exactly the wrong people to develop that stuff. Usually cryptographers can’t get out of their own way because they don’t do a good job with practical stuff. There are existing systems and techniques that can be used today in such a way that Mossad, the NSA, and the CIA would probably have deep, angstful heartache over them. You can build them out of off-the-shelf parts, but what you’ll find you have is something not particularly convenient to use. I described a few techniques relevant to the time (1997) in tutorial materials for a class I gave at USENIX on the topic:
    http://ranum.com/security/computer_security/archives/secure-communications.pdf
    In today’s tech, it’s a matter of correctly using the cloud to ensure that your data is in the wrong place for the state to get at it.

    The cypherpunks have pretty much died down and gotten stock options, and it’s hard to rail against the state when it’s paying you so much. Also, they’ve realized that encryption only solves a fairly small (and relatively unimportant) part of the big problem, which is communication security.

    Here’s a rough rule of thumb to bear in mind: if it’s convenient, it’s not very secure.

    That should tell you most of what you need to know about Apple’s stuff, other than that “it sure is shiny!” If you want secure comms, though, you can set up a PPTP VPN over some bulk encryption and do voice-chat with a point-to-point chat application like bitwise, teamview, openfire, etc. Use a great big pre-exchanged key generated from hashing video samples of a lava lamp. Use your own distro of an operating system like openBSD and build your own hardware. Then discover: you actually have nothing to talk about.
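
    The “pre-exchanged key generated from hashing video samples of a lava lamp” step above can be sketched as follows. This is only the gist: a real system would feed the digest through a proper KDF (e.g. HKDF) rather than truncating a hash, and here random bytes stand in for the video footage.

```python
import hashlib
import os
import tempfile

def key_from_sample(path: str, key_len: int = 32) -> bytes:
    """Boil a large, unpredictable file down to a fixed-size shared key
    by hashing it in chunks."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.digest()[:key_len]

# Demo: 1 MB of random bytes stands in for the lava-lamp video.
fd, path = tempfile.mkstemp()
os.write(fd, os.urandom(1 << 20))
os.close(fd)
key = key_from_sample(path)
os.remove(path)
print(len(key))  # → 32
```

    Both parties hash the same sample (exchanged out of band) and get the same key; the inconvenience of that exchange is exactly the “if it’s convenient, it’s not very secure” trade-off.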

  54. EnlightenmentLiberal says

    Here’s a rough rule of thumb to bear in mind: if it’s convenient, it’s not very secure.

    Thumbs up. Well said.

    For example, I could run OpenBSD and try to be paranoid and secure, but then I can’t use my computer for things that I normally use my computer for.

  55. EnlightenmentLiberal says

    To Marcus
    Question: Am I right about the following? I believe that I’m right.

    Take the Secure Enclave feature. Beef up the internal key to 256-bit AES (the largest key size AES supports). Bake an escalating timeout into the internal wiring of the chip itself, up to 1 day after 10 failed attempts, reset after one good attempt. I assume that this can be done -- alternatively, do it in software, but require signed firmware / OS, and have a hardware requirement that updating the firmware / OS will destroy the stored 256-bit AES key.

    Voila. You can remove the disk drive and copy the contents for all the good that it will do you, which is about none, assuming even a moderately secure PIN. You would need to open the chip and use an electron microscope to read the 256-bit AES key, or obtain the user’s PIN, or obtain the data via some other method, like a cloud backup.
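
    The escalating-timeout policy described above can be sketched like this. The numbers are mine, not Apple’s, and in the hardware proposal the delay and counter would live inside the chip next to the key, not in software.

```python
import time

DAY = float(24 * 3600)

def delay_for(failures: int) -> float:
    """Illustrative policy: a few free tries, then the delay doubles
    per failure, capped at one day."""
    if failures < 5:
        return 0.0
    return min(2.0 ** (failures - 5), DAY)

class PinGuard:
    """In the hardware proposal this state lives inside the chip."""
    def __init__(self, pin: str):
        self._pin = pin
        self.failures = 0

    def try_pin(self, guess: str, sleep=time.sleep) -> bool:
        sleep(delay_for(self.failures))   # enforced before the attempt
        if guess == self._pin:
            self.failures = 0             # good attempt resets the counter
            return True
        self.failures += 1
        return False

g = PinGuard("2580")
for bad in ("0000", "1111", "2222"):
    g.try_pin(bad)                        # early failures cost nothing
print(delay_for(g.failures), delay_for(10), delay_for(40))  # → 0.0 32.0 86400.0
```

    With delays like these, even the full 4-digit PIN space takes years to sweep on-device, which is the whole point of keeping the counter where software updates can’t reach it.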

    Of course, you’d have to worry about the hardware at every step of the way to ensure that unencrypted stuff is not persisted in any non-volatile hardware cache of any kind, and any unencrypted data in volatile cache is flushed according to some predefined time period, like 1 minute(?).

    You’d also have to have near-perfect software, and so forget about downloading and using any apps except the one or two that you got an expert hacking team to examine for weaknesses for several months / years (and even then…).

    Again, I would emphasize, as you would, that this only accomplishes making the encryption impossible to beat directly. There are still a plethora of indirect attacks, such as just keeping you in prison until you give up the user PIN, or installing keyloggers (software or hardware) of some kind. For example, I’m particularly impressed by some work showing that, by reading power fluctuations on nearby power lines, you can decode the key presses of a keyboard connected to that power line. Stuff like that.
