The intelligence state has not been collecting data for its own entertainment; it has to find a use for it. Remember when the Bush administration’s secret interception program was disclosed? Probably the most painful bit of news was that the NSA wasn’t (yet) capable of doing anything particularly interesting or useful with the data.
If you’ve worked in computing at all in the last few years, you will have noticed there is a big push surrounding artificial intelligence and “big data” – these things are all interconnected:
- Collect data into a big pile
- Realize it is far too much to understand, let alone search through
- I know! Let’s get some software to do it!
- Machine learning
Machine learning/AI does work, but its learning depends on having a measurable success criterion. It’s like the joke:
An AI walks into a bar and the bartender asks “what do you want?” The AI points to the guy next to him and says “I’ll have what he’s having.” Another AI walks into the bar, and points at the first AI and says “I’ll have what he’s having.”
None of that matters, anymore. Now that the police state has authorized itself to collect and process whatever data it can collect, that’s what it’s going to do. And damn the consequences – we must prove that the effort was successful, for budgetary reasons.
Reporting in The Verge: [verge]
According to Ronal Serpas, the department’s chief at the time, one of the tools used by the New Orleans Police Department to identify members of gangs like 3NG and the 39ers came from the Silicon Valley company Palantir. The company provided software to a secretive NOPD program that traced people’s ties to other gang members, outlined criminal histories, analyzed social media, and predicted the likelihood that individuals would commit violence or become a victim. As part of the discovery process in Lewis’ trial, the government turned over more than 60,000 pages of documents detailing evidence gathered against him from confidential informants, ballistics, and other sources – but they made no mention of the NOPD’s partnership with Palantir, according to a source familiar with the 39ers trial.
The program began in 2012 as a partnership between New Orleans Police and Palantir Technologies, a data-mining firm founded with seed money from the CIA’s venture capital firm. According to interviews and documents obtained by The Verge, the initiative was essentially a predictive policing program, similar to the “heat list” in Chicago that purports to predict which people are likely drivers or victims of violence.
This is cops doing what cops always do: they identify a ‘bad neighborhood’ and ‘bad actors’ and watch who they talk to and what they do. The only difference is that there is a piece of machine-learning in there, now, to help add new links into the analysis based on probability.
When we say that “racists will train racist AIs” this is exactly what we’re talking about. What is Palantir going to find in those areas? More of what cops have already been finding there. In the places where cops turn a blind eye to the peccadilloes of the wealthy, the data will show areas that look peaceful and well-maintained. In the places where cops look hard for gang activity, the data will show gang activity. Then, the software will magnify it.
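The feedback loop is easy to demonstrate. Here’s a toy simulation – all numbers invented, not drawn from any real deployment – of two neighborhoods with the same underlying offense rate, where patrols are simply sent wherever the historical arrest data points:

```python
# Toy model of the predictive-policing feedback loop: two neighborhoods
# with the SAME real offense rate, but one starts with more recorded
# arrests (the biased prior). Patrols chase the data; arrests chase the
# patrols; the data "confirms" itself.

TRUE_RATE = 0.05                  # identical real offense rate everywhere
recorded = {"A": 100, "B": 10}    # historical arrest counts (biased prior)

for year in range(10):
    # The "predictive" step: send the patrols where the data says crime is.
    target = max(recorded, key=recorded.get)
    arrests = 1000 * TRUE_RATE    # arrests scale with patrol presence
    recorded[target] += arrests

share_a = recorded["A"] / sum(recorded.values())
print(f"A's share of recorded crime: {share_a:.0%}")
```

Neighborhood A starts with a 91% share of the recorded crime purely because of the biased prior; after ten years of data-driven patrol allocation it has a 98% share, and the system has “proven” its own prediction.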
I’m not particularly concerned that this is going to turn into a magical big brother system – I predict it’s mostly a boondoggle that is going to suck a bunch of money into a basically useless (but pretty) system. Here’s where I get a bit worried:
Predictive policing technology has proven highly controversial wherever it is implemented, but in New Orleans, the program escaped public notice, partly because Palantir established it as a philanthropic relationship with the city through Mayor Mitch Landrieu’s signature NOLA For Life program. Thanks to its philanthropic status, as well as New Orleans’ “strong mayor” model of government, the agreement never passed through a public procurement process.
Reading that carefully, it appears that New Orleans’ dictator mayor tossed a bunch of procurement dollars to a defense/intelligence company under the table, and called it a humanitarian gesture. That’s just plain old garden-variety cronyism and corruption. But it sounds like Palantir is mostly doing this as part of its sales cycle – get in the door, get the software in use, do some demos, charge some dollars.
“Palantir is a great example of an absolutely ridiculous amount of money spent on a tech tool that may have some application,” the former official said. “However, it’s not the right tool for local and state law enforcement.”
The concern, to me, is that machine learning is going to amplify the existing expectations of the NOLA police; it’s a self-fulfilling prophecy system.
And then there’s the pesky issue that the cops just started sharing all that data about their citizens, with no control or oversight. Remember the Strava heatmap disaster? As police and intelligence agencies get their hands on more and more data, there will be many more stories like that – except they will be personal. The capabilities of such systems are already much greater than most people realize. Want to find out where a certain person spends their time? Consult the license-plate scanner systems, locate them roughly, then search for anything related in the target area. If you have the identifier of someone’s personal tracking device smart phone, you can heat-map them just like with Strava. The line between accessing and sharing data is getting blurred more and more, as companies cheerfully hand over data in response to polite requests from government. The police no longer need a warrant to place a tracking device in your car if they can get your location data from your Waze or Google Maps server requests. Data is now being kept on multiple platforms with different resolutions, and combining it to build a very precise picture gets easier and easier. And it’s all happening with, basically, no oversight or regulation.
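To see how little machinery that takes, here’s a sketch in Python – the coordinates and device pings are entirely made up – of heat-mapping a single device from a pile of location reports, the same basic operation the Strava map performed in aggregate:

```python
import math
from collections import Counter

# Hypothetical pings for one device identifier -- the kind of (lat, lon)
# records that accumulate in a maps app's server logs or a license-plate
# scanner network. All coordinates here are invented.
pings = [
    (29.9511, -90.0715), (29.9512, -90.0717),
    (29.9510, -90.0716), (29.9511, -90.0714),  # one tight cluster ("home"?)
    (29.9620, -90.0810), (29.9621, -90.0812),  # a second cluster ("work"?)
]

def grid_cell(lat, lon, res=0.01):
    """Bucket a coordinate into a roughly 1 km grid cell."""
    return (math.floor(lat / res), math.floor(lon / res))

# The "heat map" is just a histogram over grid cells.
heat = Counter(grid_cell(lat, lon) for lat, lon in pings)
top_cell, hits = heat.most_common(1)[0]
print(f"most-visited cell: {top_cell}, {hits} of {len(pings)} pings")
```

Even at this coarse resolution, a handful of pings is enough to separate “home” from “work”; combining feeds with different resolutions only sharpens the picture.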
Someone needs to tell those cops “Hey we just spent your raises on Palantir!” Because, basically, that’s what’s going on. Everyone’s got their hand in the till but Palantir’s got a clever sales rep who made his quarter with that “philanthropy” trick. Want to bet this is going to crop up elsewhere?
By the way, when you see articles about Palantir, they often say something about how the company has tried to expand into commercial markets with less success. What they are really saying is: “It doesn’t actually do very much, but when the intelligence guys use it they classify it, so they don’t have to disclose the fact that they bought a bunch of expensive software that doesn’t do anything useful.”