Much attention has been focused on how the British firm Cambridge Analytica used various dirty tricks to try to influence elections in several countries. The level of success they achieved is hard to gauge. Unlike advertisements aimed at selling products, where there is a measurable outcome, electoral influence is difficult to measure because we don’t know how individuals actually voted, and so many factors go into how people vote that drawing a straight line from cause to effect is hard. Pointing to the fact that winners in some close elections, like Donald Trump in the US and Uhuru Kenyatta in Kenya, were clients of CA is suggestive, but we don’t know how many of their other clients lost.
What is indisputable, though, is that they claimed to their clients that they could microtarget voters effectively enough to influence them, and they were caught admitting a willingness to supplement those efforts with dirty tricks such as entrapping politicians in compromising situations and secretly recording them. That Cambridge Analytica’s sleazy activities were exposed by a similar sting is a fitting irony.
But for microtargeting to work even in principle, the targeter needs vast amounts of information about individuals, and this is where Facebook comes in. CA apparently had access to Facebook’s enormous trove of data, and now Facebook is in the hot seat, having to explain how it came to be colluding with CA. What is emerging is that Facebook has been quite cavalier about the way it divulges people’s information to third parties.
Clive Thompson writes that apparently almost any company that wanted to develop apps and games for Facebook or its related sites was given access to the database so that it could develop algorithms to target individuals with ads, the same methods CA used to produce targeted videos and ads. One such game developer, Ian Bogost, describes how, even though his goal was not to monetize user information, he was given access to the Facebook database when he developed Cow Clicker, a parody of games like FarmVille. He admits that while his game was not great work, to his surprise it became inexplicably popular. What was interesting was how much data he was able to access so easily from unknowing users of his game.
Cow Clicker is not an impressive work of software. After all, it was a game whose sole activity was clicking on cows. I wrote the principal code in three days, much of it hunched on a friend’s couch in Greenpoint, Brooklyn. I had no idea anyone would play it, although over 180,000 people did, eventually. I made a little money from the whole affair, but I never optimized it for revenue generation. I certainly never pondered using the app as a lure for a data-extraction con. I was just a strange man making a strange game on a lark.
And yet, if you played Cow Clicker, even just once, I got enough of your personal data that, for years, I could have assembled a reasonably sophisticated profile of your interests and behavior. I might still be able to; all the data is still there, stored on my private server, where Cow Clicker is still running, allowing players to keep clicking where a cow once stood, before my caprice raptured them into the digital void.
He goes on to describe how apps get made and published on Facebook, and how the users of these apps are not made aware that all their personal data is going to some third party rather than being retained by Facebook.
In essence, Facebook was presenting apps as quasi-endorsed extensions of its core service to users who couldn’t have been expected to know better. That might explain why so many people feel violated by Facebook this week—they might never have realized that they were even using foreign, non-Facebook applications in the first place, let alone ones that were siphoning off and selling their data. The website always just looked like Facebook.
Millions of apps had been created by 2012, when I hung up my cowboy hat. Not only apps apparently designed with duplicity in mind, like Aleksandr Kogan’s personality-quiz, which extracted data that was then sold to Cambridge Analytica. But hundreds of thousands of creators of dumb toys, quizzes, games, and communities that might never have intended to dupe or violate users surely did so anyway, because Facebook rammed their data down our throats. On the whole, none of us asked for your data. But we have it anyway, and forever.
And that is the problem. Facebook cannot get that data back even if it wanted to. Facebook’s business model seems to depend on getting more and more people to spend more and more time on its site, and placing greater restrictions on app developers would undermine that.
Marcus Ranum says
There’s a subtle point: this stuff, in the US, is generally ‘protected’ contractually. There’s no technological fix for what to do about use and dissemination of data once someone’s got it. So, the US’ love of “click to use contracts” where things can be embedded deep down in the terms of service -- that’s what’s really come back to haunt everyone on this issue. From my reading of what happened (I’ve also just interviewed an honest to god data scientist who knows something about the affair, for my column over at Search Security) the “stolen” data was collected through an API that Facebook allowed people some distant menu to opt out of, but which was collected and distributed under Facebook’s contract that users are required to agree to. In other words, according to contract law, capitalists be all like: “lol, fuck you.”
Marcus Ranum says
users who couldn’t have been expected to know better
I do not accept that argument. Anyone who is smart enough to use a computer at all is smart enough to realize that Facebook (just like every big business built on ‘eyeballs’) was selling data about them.
It is unreasonable to join “WeAreWorth$15BillionBecauseWeSellAds.Com” and claim that one expected one’s personal information to be kept private. That argument is as unreasonable as, oh, say, walking into a meeting with Russians to collude on obtaining stolen emails, and then complaining that the Russians didn’t give stolen emails, therefore there was no attempt to collude. Nobody’d make an argument that absurd.
I think a lot of advertising capital has gone into pushing a certain view of Facebook and other corporate scumbags. I think the truth of what they do to people who trust them even a little is out there if you go looking for it. The million dollar question is: What triggers an average person to go looking and how would they recognize the truth when they saw it? Would they have the necessary grounding in logic and background to understand what they were seeing?
And the bonus question: If you’re inundated with a constant barrage of lies (let’s call them “advertisements” for convenience) and generally lack an adequate education in how to unravel them, are you completely responsible for acting on those lies? Or is this more like the lies the tobacco companies sold and it’s really time for the whole game to change?
Don’t have good answers myself, just think the questions are worth asking.