Much attention has been focused on how the British firm Cambridge Analytica used various dirty tricks to try to influence elections in several countries. Their level of success is hard to gauge: unlike advertisements aimed at selling products, where there is a measurable outcome, electoral influence is difficult to measure because we don't know how individuals actually voted, and so many factors go into how people vote that drawing a straight line from cause to effect is hard. Pointing out that the winners of some close elections, like Donald Trump in the US and Uhuru Kenyatta in Kenya, were clients of CA is suggestive evidence of their effectiveness, but we don't know how many of their other clients lost.
What is indisputable, though, is that they claimed to their clients that they could microtarget voters effectively enough to influence them, and were caught admitting a willingness to supplement those efforts with dirty tricks such as entrapping politicians in compromising situations and secretly recording them. That Cambridge Analytica's sleazy activities were exposed by a similar sting is a fitting irony.
But for microtargeting to work even in principle, the targeter needs vast amounts of information about individuals, and this is where Facebook comes in. CA apparently had access to Facebook's vast trove of data, and now Facebook is in the hot seat, having to explain how it came to be colluding with CA. What is emerging is that Facebook has been quite cavalier about the way it divulges people's information to third parties.
Clive Thompson writes that almost any company that wanted to develop apps and games for use on Facebook or its related sites was given access to the database so that it could develop algorithms to target individuals with ads, the same methods CA used to produce its targeted videos and ads. One such game developer, Ian Bogost, describes how, even though his goal was not to monetize user information, he was given access to the Facebook database when he developed his Cow Clicker game, a parody of games like FarmVille. He admits that while his game was not great work, it became, to his surprise, inexplicably popular. What was interesting was how much data he was able to access so easily from the unknowing users of his game.
Cow Clicker is not an impressive work of software. After all, it was a game whose sole activity was clicking on cows. I wrote the principal code in three days, much of it hunched on a friend’s couch in Greenpoint, Brooklyn. I had no idea anyone would play it, although over 180,000 people did, eventually. I made a little money from the whole affair, but I never optimized it for revenue generation. I certainly never pondered using the app as a lure for a data-extraction con. I was just a strange man making a strange game on a lark.
And yet, if you played Cow Clicker, even just once, I got enough of your personal data that, for years, I could have assembled a reasonably sophisticated profile of your interests and behavior. I might still be able to; all the data is still there, stored on my private server, where Cow Clicker is still running, allowing players to keep clicking where a cow once stood, before my caprice raptured them into the digital void.
He goes on to describe how apps get made and published on Facebook, and how the users of these apps are not made aware that all their personal data goes to some third party rather than being retained by Facebook.
In essence, Facebook was presenting apps as quasi-endorsed extensions of its core service to users who couldn’t have been expected to know better. That might explain why so many people feel violated by Facebook this week—they might never have realized that they were even using foreign, non-Facebook applications in the first place, let alone ones that were siphoning off and selling their data. The website always just looked like Facebook.
Millions of apps had been created by 2012, when I hung up my cowboy hat. Not only apps apparently designed with duplicity in mind, like Aleksandr Kogan’s personality-quiz, which extracted data that was then sold to Cambridge Analytica. But hundreds of thousands of creators of dumb toys, quizzes, games, and communities that might never have intended to dupe or violate users surely did so anyway, because Facebook rammed their data down our throats. On the whole, none of us asked for your data. But we have it anyway, and forever.
And that is the problem. Facebook cannot get that data back even if it wanted to. Its business model seems to depend on getting more and more people to spend more and more time on the site, and placing greater restrictions on app developers would undermine that.