The currency of computer security is Trust – the degree to which you can believe that your system is doing what you expect it to. There are a lot of properties that comprise trust, including integrity, reliability, etc., each of which is made up of smaller properties like non-repudiation, auditability, resistance to replay attacks, ad infinitum. We talk about trust loosely; it’s like Liberty or Good Cinematography – it’s a useful concept for describing the relationship between ourselves and the systems we use – whether they work right for any given notion of “right.”
One of the other properties of a trustworthy system is that it does what you want it to do, and it doesn’t do things you don’t want it to. The category of “things I don’t want” is infinite in potential, so we shorten that to “do only what I want and nothing else.” A system that does other stuff you didn’t ask it to is not trustworthy – it may be a mildly undesirable feature like Microsoft’s animated paper clip in Office, or it could be a highly undesirable feature like occasionally selling your entire stock portfolio and investing in dodgy penny stocks. From a high-level perspective, ‘malware’ is just an unwanted feature that is doing stuff for someone else, like aggressively sharing your private data.
That’s another aspect of a trustworthy system: it protects the data you want protected, publishes the data you want published, and doesn’t make those sorts of broad policy decisions for you (because: how can it get that right?) A system that starts sharing data you want protected is untrustworthy, insecure – or, you could just say it’s compromised.
From a high-level security perspective, then, you can immediately see that a system that works based on “opt-out” is more likely to force you into making a bad policy decision than one based on “opt-in.” More precisely, security types would talk about “fail-graceful defaults” – the system should always tend to do the least damaging thing unless it’s explicitly told to do something risky. In a fail-graceful system, your phone would not export all its data without asking you first: “hey, shall I make your contact list available to your friends?” or whatever.
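The difference between the two models fits in a few lines of code. Here is a minimal sketch – all names are hypothetical illustrations, not any real phone API – showing why deny-by-default (opt-in) fails gracefully and allow-by-default (opt-out) fails dangerously:

```python
# Minimal sketch of fail-graceful (deny-by-default) vs. opt-out permission
# checks. The resource names and sets here are invented for illustration.

# Opt-in: only what the user explicitly agreed to; everything else is denied.
opted_in = {"camera"}

def may_access(resource: str) -> bool:
    """Fail-graceful: deny by default, allow only explicit opt-ins."""
    return resource in opted_in

# Opt-out inverts the logic -- and the risk: everything is allowed unless
# the user found the setting and turned it off.
opted_out = {"contacts"}

def may_access_opt_out(resource: str) -> bool:
    """Opt-out: allow by default; data leaks unless the user acts."""
    return resource not in opted_out

print(may_access("contacts"))          # False -- the safe default
print(may_access_opt_out("location"))  # True -- leaks unless you opted out
```

The point is that in the opt-in model, forgetting to configure something costs you a feature; in the opt-out model, forgetting to configure something costs you your data.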
None of that deals with outright malicious and deceptive systems. Imagine I had an app and I wanted to give a copy of your contact list to my marketing team. What if it automatically added me to your contact list as “foozum.com official site” and then asked “shall I make your contact list available to your friends?” If you click OK, then – since I am now your friend – you just gave me your contact list. That’s a silly example, except that it isn’t: horrible skeevy marketing weasels use that sort of trick all the time. It’s why, for a while (and still, a lot), virtually every business wanted you to install their app: they get to have their code running on your system, and then they can either outright compromise your security, or sneakily do it using legalistic dodges like the one with friend-lists. When you combine that sort of thing with the tracking apps running in your browser [stderr], your device is completely untrustworthy: it is not doing what you want; it is doing what a whole slew of unknown people want. If you thought that you were using your smart phone to do stuff that you want, you’re wrong – most of the cycles and traffic your smart phone generates are not spent doing what you want.
In a very real sense, it’s not your phone. It’s owned by a collection of marketing weasels (and a few government agencies) and it only incidentally does a few things for you now and then.
Think I’m kidding?
More than three in four Android apps contain at least one third-party “tracker”, according to a new analysis of hundreds of apps.
The study by French research organisation Exodus Privacy and Yale University’s Privacy Lab analysed the mobile apps for the signatures of 25 known trackers, which use various techniques to glean personal information about users to better target them for advertisements and services.
Among the apps found to be using some sort of tracking plugin were some of the most popular apps on the Google Play Store, including Tinder, Spotify, Uber and OKCupid. All four apps use a service owned by Google, called Crashlytics, that primarily tracks app crash reports, but can also provide the ability to “get insight into your users, what they’re doing, and inject live social content to delight them.”
Delight me. (Picture me saying that in Samuel L. Jackson’s voice) What they are doing is exporting your location, your browsing activity, what other apps you have installed, which apps you are running (so they can see what you run the most), your bookmarks, and – anything else they can fool you into giving the app permission to access.
They tell you it’s OK because you gave permission, but it’s not – it makes your system untrustworthy. A trustworthy system would do a little pop-up with full disclosure, and an opt-in. A message like that, you will never, ever see: because the people who are building these harvesting apps are not honest about what they are doing. A disclosure-box would be honest; they’re sneaky. What does that tell you? They know they’re not trustworthy, too.
Other less widely-used trackers can go much further. One cited by Yale is FidZup, a French tracking provider with technology that can “detect the presence of mobile phones and therefore their owners” using ultrasonic tones. FidZup says it no longer uses that technology, however, since tracking users through simple wifi networks works just as well.
The Yale researchers said: “FidZup’s practices closely resemble those of Teemo (formerly known as Databerries), the tracker company that was embroiled in scandal earlier this year for studying the geolocation of 10 million French citizens, and SafeGraph, who ‘collected 17tn location markers for 10m smartphones during [Thanksgiving] last year.’ Both of these trackers have been profiled by Privacy Lab and can be identified by Exodus scans.”
Remember: if they were being honest, they’d ask. But they’re not – they’re hiding this stuff in apps and gaming your approval with fine print in the End User Agreement – an End User Agreement that says, like Proctoscope’s: “If you grant us that access we will sell everything you have to anyone who asks us nicely.”
This is not an edge-case scenario: Uber was tracking customers even after their trips ended (they have allegedly stopped). Hint to smart phone users: when you’re done with any app that has access to your location, turn that access off immediately. Besides, your phone will run faster and your battery will last longer because it’s not constantly updating Uber (or a dozen other providers) on what you’re up to.
FidZup and Teemo’s apps used to turn your microphone on and sample it constantly, listening for ultrasonic chirps that were output by location-tracking points. When the app heard a chirp, it would push the ID of the chirp up to Teemo’s cloud, which then knew your location very precisely. And you were wondering why your new phone’s battery life sucked: it was burning its CPU like crazy doing a frequency analysis of everything coming in the microphone. Or, may I say, “micropwn”?
Don’t feel better that you’re using an iPhone [though Android is manifestly worse] – Apple’s app store is not there to keep untrustworthy apps off your phone; it’s there to keep untrustworthy apps that crash your phone or make Apple look bad off your phone. That’s for a simple reason: Apple sells phones, Google sells ads. Google’s platform is optimized for Google’s purposes and Apple’s platform is optimized for Apple’s. Microsoft, of course, does the same things but for its own purposes, as does Facebook, etc.
Here’s another way of thinking about it: you already have an ‘app’ – it’s called “a browser.” If some organization wants you to run their app, that’s because they want to do something that your browser doesn’t facilitate. In some cases, that may be something really cool (I’m a fan of Hipstamatic, for example) but in most cases it’s that they want to collect a bunch of stuff that your browser would protect you from giving up so freely. Every app that ‘phones home’ to see if you’ve got messages (Snapchat, Hipstamatic, Instagram, Facebook, Spotify…) – when they connect up, what information are they transferring? At the very least, your location can be determined from the network address of your current access point (there are companies that sell that information).

Remember when Google’s street view cars went around making pictures for the maps? They also mapped all the wifi access points they saw. So, if you’re coming from a carrier network, they have that data (they know the carrier), and if you’re using your meth-cooking buddy’s home WiFi, they know that too, and will give that information to the FBI when the FBI asks about the devices that used your buddy’s access point. “Get insight into your users” is all marketing lingo for “we figure out where you are and sell it to anyone.” When they say “increase customer satisfaction” their customer is a marketing weasel – you’re not the customer, you’re the commodity.
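Why does “which access point did you connect from” equal “where are you”? Because a street-survey (wardriving) database maps access point hardware addresses to the coordinates where they were observed. A toy sketch – the addresses and coordinates below are invented, and real services index billions of access points, not two:

```python
# Sketch of WiFi-based geolocation: a survey database maps access point
# BSSIDs (MAC addresses) to coordinates where they were observed.
# All entries below are invented for illustration.
wifi_survey_db = {
    "a4:2b:8c:11:22:33": (40.4406, -79.9959),   # hypothetical: a cafe AP
    "de:ad:be:ef:00:01": (47.6062, -122.3321),  # hypothetical: a home AP
}

def locate(bssid: str):
    """Return (lat, lon) for an access point seen during a street survey."""
    return wifi_survey_db.get(bssid)

# An app doesn't need GPS permission to do this -- it only has to report
# the BSSIDs your phone can currently see.
print(locate("de:ad:be:ef:00:01"))  # (47.6062, -122.3321)
```

That is why a “harmless” network permission can leak your location as effectively as the GPS can.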
So, there is a vast infrastructure of sneaky, nasty, deceptive code that is deployed by marketers to infect your browser so they can track everything you are doing. This reduces your ability to trust your browser tremendously, since you (naturally!) have no idea what it’s doing: it is not your browser. And, there is a similar vast infrastructure of evil running on your smartphone, sucking your battery life, tracking your location, monitoring the sounds around you, and eating your bandwidth and performance to transmit all that to dozens of companies: it is not your smartphone.
You’re just paying for it.
Android: first off, it is an operating system platform produced by a marketing company. Naturally, it’s going to facilitate the delivery of advertisements. But beyond that, there is a fascinating tale of deep technical cluelessness and hubris. The smart guys at Google (and they have some smart guys!) (excepting James Damore) came up with one of the worst possible operating system distribution models for smart phones: they release the entire core software and let device makers add whatever they want, without a clear dividing line between that which is Android and that which is device-maker-modified Android. So the device maker adds some fancy thingie specific to their device, and codes a couple of security holes into it: now there is a security flaw that’s not Google’s problem and (since it’s not Google’s code) it’s only going to get fixed if the device maker gets around to it. Worse, what if the device maker alters something that Google also alters – which version do you wind up with? That’s an open question. What if Google fixed a bug in a cryptographic processing routine and the device manufacturer had their own version and doesn’t patch the Google fix back into their code-base? Essentially, it’s the world’s worst operating system distribution model since the original BSD UNIX distributions – where everyone hacked away at it and sort of converged on a standardized wad of bugs, eventually. But that only happened after the UNIX operating system-based vendors had so thoroughly crapped all over the market that Microsoft wound up becoming the dominant player. In cynical moments I think Google actually did Android as a way of driving a bunch of cell phone makers completely crazy (look what it did to Amazon’s phone). I simply cannot believe Google did Android’s software model by accident, when they basically solved system administration for their own servers using a distributed form of configuration management.
They could have done that with the phone O/S, but it would have taken a bunch of additional thinking: identify and lock down which interfaces the device makers could play with and which they couldn’t, then devise a binary patching system atop that, so that the device makers could patch their crap and Google could patch their crap and all the crap would be patched.
“Privacy Lab:” Why have a privacy lab? There is no privacy anymore. It’s like having a Dodo Bird Lab.
Hey, here’s one:
- Collect underpants: use the accelerometers on the phone to have an app that figures out when you’re having sex. Based on, you know, characteristic movements. Then you can push that to the cloud and sell it to companies that home-deliver pizza, cigarettes, and red wine. And you can cross-match for nearby phones and figure out who the person’s partners are, then sell that to Amazon as input into their “Big Data” mine.