Interesting developments in thwarting government spying


According to recent reports, Apple’s new operating system, iOS 8, makes it impossible for the company to comply with NSA requests to hand over people’s data.

Apple said Wednesday night that it is making it impossible for the company to turn over data from most iPhones or iPads to police — even when they have a search warrant — taking a hard new line as tech companies attempt to blunt allegations that they have too readily participated in government efforts to collect user information.

The move, announced with the publication of a new privacy policy tied to the release of Apple’s latest mobile operating system, iOS 8, amounts to an engineering solution to a legal quandary: Rather than comply with binding court orders, Apple has reworked its latest encryption in a way that prevents the company — or anyone but the device’s owner — from gaining access to the vast troves of user data typically stored on smartphones or tablet computers.

The key is the encryption that Apple mobile devices automatically put in place when a user selects a passcode, making it difficult for anyone who lacks that passcode to access the information within, including photos, e-mails and recordings. Apple once maintained the ability to unlock some content on devices for legally binding police requests but will no longer do so for iOS 8, it said in the new privacy policy.
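
To get a concrete sense of what “encryption tied to the passcode” means, here is a toy sketch in Python (nothing here comes from Apple’s code, and all the names are made up): a key is derived from the passcode and used to encrypt the data, so anyone who lacks the passcode simply cannot decrypt it. Apple’s actual design is reported to also mix in a secret unique to each device’s hardware, which this sketch leaves out.

    # Toy illustration only -- not Apple's code: derive an encryption key from a
    # passcode and use it to encrypt data, so decryption fails without the passcode.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # A slow key-derivation function makes brute-forcing short passcodes costlier.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)

    def lock(passcode: str, plaintext: bytes) -> tuple:
        salt, nonce = os.urandom(16), os.urandom(12)
        ciphertext = AESGCM(derive_key(passcode, salt)).encrypt(nonce, plaintext, None)
        return salt, nonce, ciphertext

    def unlock(passcode: str, salt: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
        # Raises an InvalidTag error for anyone who supplies the wrong passcode.
        return AESGCM(derive_key(passcode, salt)).decrypt(nonce, ciphertext, None)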

But Cory Doctorow quotes John Gilmore, who says that Apple is asking us to take its claims on faith without giving us the information we would need to judge whether those promises can be trusted. And Andy Greenberg points out that the government still has ways of getting your private information off your phone.

Apple’s announcement comes on the heels of Google and Dropbox announcing steps of their own to make secure systems easier for consumers to use.

Google and file-hosting service Dropbox announced the creation of Simply Secure on Thursday, an organization that aims to make security tech easier to use.

“While consumer-facing security tools exist and are technically effective, they often have low adoption rates because they’re inconvenient or too confusing for the average person to operate. Even well-known features like two-factor authentication, offered by many online services, are not widely used,” the companies said in a statement.
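
As an aside, the two-factor codes mentioned in that statement are simple under the hood. The sketch below is a rough Python rendering of the standard TOTP algorithm (RFC 6238) that authenticator apps implement, with a made-up shared secret; it is only meant to show the idea.

    # Rough sketch of time-based one-time passwords (TOTP, RFC 6238), the scheme
    # behind most six-digit second-factor codes. Uses only the standard library.
    import base64, hashlib, hmac, struct, time

    def totp(shared_secret_b32: str, period: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(shared_secret_b32, casefold=True)
        counter = struct.pack(">Q", int(time.time()) // period)   # 30-second time step
        digest = hmac.new(key, counter, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                                 # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Example with a made-up secret; the phone and the server compute the same code.
    print(totp("JBSWY3DPEHPK3PXP"))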

The file-sharing service Dropbox is particularly vulnerable to customer defection after Edward Snowden said that the company was “hostile to privacy” and that people should use SpiderOak instead, which is what I now use.

While I welcome these moves by the big companies to improve privacy protections, I am also a little wary of accepting them at face value. Big corporations have an ugly history of pretending to be on the side of their customers while secretly colluding with governments.

Comments

  1. Who Cares says

    The key word in what Apple is doing is “physical.”
    Given the tendency of Apple devices to sync to the cloud, you need to disable that as well before they truly can’t comply.

  2. says

    I don’t believe it for a second.

    When you consider the kind of pressure that the government has been willing to apply to Yahoo! (and to classify that pressure), what do you think they’ll do to Apple?

    Reading between the lines, it sounds like they’re encrypting some of the containers on the devices, which will protect you in the event that someone finds your iDevice where you dropped it and decides to take it apart rather than selling it on eBay. That would only be valuable for someone who never uses their iDevice to communicate at all. I.e.: it won’t help anyone, but it sounds good.

  3. Marcus says

    (If any vendor actually wants to do something useful, what they’ll do is have the device generate a strong encryption key and use that to bulk encrypt anything before it is sent to the cloud; then decrypt it when it’s pulled down. This could be done easily and reliably, with the sole downside being that if a user managed to forget their key-encrypting key, their data is irretrievably gone. That could be further addressed by adding an option in which a copy of the key was encrypted to a public key provided by the vendor, with a statement attached that:
    -- the user understands that the vendor’s key recovery service has potential access to their data
    -- the user declares and attests that they have an Expectation Of Privacy with respect to their key and their data

    The fact that no major vendor has actually made any significant effort to protect their users’ data ought to be extremely telling, because it’s easier to dime their customers out to the government. That being the US government, the same one that complains that China is naughty for monitoring and restricting its users’ internet access. Go figure.)

    [A rough sketch of what this scheme could look like appears below, after the comments.]

  4. Dunc says

    I’ve been thinking about this a little more recently, and I think the real, fundamental problem is that we (by which I mean “users in general”) want to do two entirely contradictory things with our data simultaneously: we want to be able to share stuff widely and frictionlessly, and yet retain control over where it ends up. You can have one or the other, but absolutely not both at the same time.

    Marcus’ suggestion of doing end-to-end encryption for cloud storage would be fine (and some such services are already available), as long as you don’t want to share your stuff with anybody else. But sharing is one of the killer apps (possibly the killer app) for the cloud in the first place, and the whole idea of “promiscuous” sharing is deeply embedded into the current app development culture.
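
Since the scheme Marcus sketches in comment 3 (and that Dunc responds to) may be abstract, here is a minimal illustration of that “encrypt on the device before uploading” idea: a random data key encrypts the files, and a key derived from the user’s passphrase wraps the data key, so forgetting the passphrase really does mean the data is gone. Everything here is illustrative; it is not any vendor’s actual API.

    # Illustrative only -- client-side encryption before cloud upload, as in comment 3.
    # Requires the third-party "cryptography" package; no real vendor API is involved.
    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def make_wrapped_key(passphrase: str) -> dict:
        data_key = os.urandom(32)                  # generated on the device
        salt, nonce = os.urandom(16), os.urandom(12)
        kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
        wrapped = AESGCM(kek).encrypt(nonce, data_key, None)
        # Only the wrapped blob (plus salt and nonce) would ever leave the device.
        # A vendor key-recovery option would mean also encrypting a copy of data_key
        # to a public key the vendor publishes, as the comment describes.
        return {"data_key": data_key, "wrapped": wrapped, "salt": salt, "nonce": nonce}

    def encrypt_for_upload(data_key: bytes, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)
        return nonce + AESGCM(data_key).encrypt(nonce, plaintext, None)

    def decrypt_after_download(data_key: bytes, blob: bytes) -> bytes:
        return AESGCM(data_key).decrypt(blob[:12], blob[12:], None)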
