This is one of those “it’s worse than you think” things that won’t really hurt much. It’ll just cost money. And a few lives.
As the police state division of the military/industrial complex rockets out of control, there are a lot of con-artists attracted to the sweet sweet smell of that great big trough of poorly overseen spending. In its simplest form, this fraud/waste/abuse shows itself as buying expensive tactical rifles for urban cops (who will virtually never need to sniper-kill anyone) or buying landmine resistant trucks for them to drive around in.
You’ve probably already heard about the jerks who were selling dowsing rods for explosive detection, at $16,000 – $50,000 apiece, knowing they didn’t actually do anything at all. [nyt] Fortunately, at least one of them is in jail now, and not running around enjoying the canapés and champagne.
I can imagine how it’s useful for detecting mines: you just walk around and when your existence goes blank and everyone else hears a catastrophic explosion: you’ve successfully found a mine. It’s called the ADE651 and it uses the same principles of operation as its manufacturer’s previous product: a dowsing rod for finding lost golf balls. 
But that’s not the pernicious stupidity we’re here to talk about today. Since we were just talking about facial recognition, [stderr] it’s depressing to see that someone is trying to implement Sam Harris’ profiling system[sh] in software.
We should profile Muslims, or anyone who looks like he or she could conceivably be Muslim, and we should be honest about it.
That’s actually a realistic recommendation, because “he or she could conceivably be Muslim” covers all humans – which is, in fact, the typical recommendation of security experts. But there’s a company that’s trying to use facial features to profile personalities: [faception]
It’s ironic to see a company founded by Israelis re-implementing the failed ideas of eugenicists and nazis. Actually, it’s not: it’s just disgusting and sad.
There are twin studies that show that some parts of human behavior appear to be genetic. But genetic determinism remains a shibboleth, in spite of nazis’ and racists’ best efforts to support it with cherry-picked statistics.
The face that I’m reading this with is shaped by my DNA, but it’s also shaped by an accident that occurred in 2013, in which my jaw was badly broken and my bite was changed, so I tend to do most of my chewing on the right side. My face also changes rather dramatically as a result of the weight I put on and shed, thanks to the metabolic problems I had after starving for two months with my jaw wired shut (food obsession, binge dieting, lather rinse repeat). The face I am reading this with sometimes has glasses on it and other times contacts. Right now, my face has got a tremendously grumpy-looking frown on it as I contemplate the stupidity that is Faception.com. I’m not sure whether they were pandering to Sam Harris by making “High IQ” look like him, although “High IQ” also looks suspiciously like Faception’s “Chief Profiler” …
The self-inflicted thrashing of their credibility continues:
Their own “Thecnologist” [sic], Tal Alufi, looks a lot like a “terrorist”; perhaps they’re just trolling themselves. By the way, this is what a pedophile looks like:
The problem is, of course, that they’re promoting this garbage for security applications including border control.
Since the types of analysis that are performed on law enforcement image databases are kept largely secret, it is entirely possible that this sort of garbage will find its way into use.
You’ll notice that the people in the examples are all displayed in white against a baby-shit yellow background. That should give you pause to think, too: what’s one of the primary things that face recognition software is going to key off of? If you guessed skin color, you’re a winner! Other things that will be prominent are facial hair, hair texture, and glasses/non-glasses.
To understand how these systems work – all of these systems, including the ones in our heads – you need to realize that they’re just pattern-matching rules. Those rules are adaptive, based on our experience (and the experience of the software as it’s trained), so, for example, if you have important childhood memories of a person with a beard who was angry a lot, you’re going to be more likely to interpret people with beards as “angry” than someone who doesn’t. That’s not DNA, that’s experience and memory. When a human programs an AI, they program their experiences into it as well, in the form of the training sets. If you train an AI using stereotypical imagery – let’s say pedophiles from Hollywood and television – you’re actually getting a distillation of Hollywood’s stereotype.

Hollywood character designers deliberately play on stereotypes as part of dramatic characterization: the picture of Epstein above might work fine for a stereotype of a wealthy playboy, but it’d be a surprise character reveal if the plot twists had him turn out to be a pedophile. In fact, if you look at mug shots, you’ll notice that character stereotypes – such as the ones this AI is trying to apply – simply do not apply.
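To make that concrete, here’s a toy sketch – hypothetical features and labels, nothing to do with any real product – of how a crude nearest-centroid “classifier” trained on stereotyped examples learns the superficial cue (the beard) rather than the thing you’d think matters (the expression):

```python
# Toy illustration with made-up data: a classifier trained on stereotyped
# examples learns the stereotype, not the trait.
# Feature vector: (has_beard, wears_glasses, is_smiling)

def train_centroids(examples):
    """Average the feature vectors for each label (nearest-centroid 'AI')."""
    sums, counts = {}, {}
    for features, label in examples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
    return {label: [v / counts[label] for v in vec] for label, vec in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# "Training set" built from a Hollywood-style stereotype: every "angry"
# example has a beard, whatever else varies.
stereotyped_training = [
    ((1, 0, 0), "angry"),     # bearded, frowning
    ((1, 1, 0), "angry"),     # bearded, glasses, frowning
    ((1, 0, 1), "angry"),     # bearded and smiling (the menacing villain grin)
    ((0, 0, 1), "friendly"),  # clean-shaven, smiling
    ((0, 1, 1), "friendly"),  # clean-shaven, glasses, smiling
]
model = train_centroids(stereotyped_training)

# A smiling man with glasses AND a beard: the beard dominates.
print(classify(model, (1, 1, 1)))  # -> angry
```

The model never saw a smiling, bespectacled, bearded face, but the beard is the only feature that perfectly separates the training labels, so it carries all the weight. Swap “beard” for “skin color” and you have the training-set problem in one screenful.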
Worse, as Bruce Schneier pointed out when he was trying to school the stubbornly obtuse Harris on airport security[sch]: if someone knows your stereotypic profile, they can game it. In fact that is exactly why the 9/11 killers made a point of looking like typical professional office-workers as they went through airport security in Boston. You will notice the conspicuous lack of beards, lack of Saudi head-scarf and thawb, lack of sandals, etc.
The reason this is problematic is that AI is the new savior the police state is looking to, to help it sort out the gigantic mountains of data it’s collecting. IBM is coy about it, but Watson would certainly be worth the NSA having a look at – and the best it can possibly be is an artificial intelligence, which means that, just like a ‘real’ intelligence, it’s garbage in/garbage out. If you grew up after 9/11 you might see those two guys as “terrorists” but if you grew up in the 70s this is what a “terrorist” probably looks like:
Facial recognition is not a panacea, nor are any automated rule-based matching systems, whether they are implemented in silicon or in a brain.
“We kill people based on metadata” – Michael Hayden, former Director of NSA
I guess it’s not “just metadata” if it’s being used to kill people. Michael Hayden (who, by the way, looks like the “pedophile” profile from Faception) [wikipedia] defended the NSA’s use of scoring systems to determine who to strike in Pakistan and Afghanistan.
Last year, The Intercept published documents detailing the NSA’s SKYNET programme. According to the documents, SKYNET engages in mass surveillance of Pakistan’s mobile phone network, and then uses a machine learning algorithm on the cellular network metadata of 55 million people to try and rate each person’s likelihood of being a terrorist. [theintercept]
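Some back-of-the-envelope arithmetic shows why scoring 55 million people for a rare trait is doomed. The threat count and accuracy figures below are assumptions for illustration – nobody outside the program knows the real numbers – and even these generous ones produce a flood of flagged innocents:

```python
# Base-rate arithmetic (assumed numbers, for illustration only): even an
# implausibly accurate classifier drowns in false positives when the thing
# it hunts for is rare.

population = 55_000_000      # people scored, per The Intercept's SKYNET figure
actual_threats = 5_000       # assumed; nobody knows the real number
true_positive_rate = 0.99    # assumed, generously accurate
false_positive_rate = 0.01   # assumed, generously low

flagged_correctly = actual_threats * true_positive_rate
flagged_wrongly = (population - actual_threats) * false_positive_rate

# Of everyone the system flags, what fraction is actually a threat?
precision = flagged_correctly / (flagged_correctly + flagged_wrongly)

print(f"{flagged_wrongly:,.0f} innocent people flagged")       # 549,950 innocent people flagged
print(f"{precision:.1%} of flagged people are real threats")   # 0.9% of flagged people are real threats
```

Under these assumptions, more than 99% of the people the algorithm flags are innocent – and that is before you consider that real-world accuracy is nowhere near 99%.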
I worry that, as the mountains of data pile up, and the police state is already unable to deal with all that they collect, they will resort to extremely inaccurate automated methods. This will be possible, for them, because there is no downside cost for them if they are wrong. The best way to rein the problem in, in fact, is to make it expensive for law enforcement to make a mistake with facial recognition. At present, the most plausible route for that is via lawsuits for wrongful detainment – which is why that is exactly the fulcrum-point of the current profiling debate: the police state wants to be indemnified from harm caused when it incorrectly or illegally detains, scares, or harms someone. If there’s one wish I have it’s that a billionaire or two would pay all the legal bills for anyone who has a gun pointed at them by a law enforcement officer, or who is detained and searched, to sue them personally and their state or government agency.
We must resist this dangerous bullshit, because it’s the root of the intellectual tree that allows cops to gun down black people and say “they looked scary.” The emphasis on appearance over fact and actual behavior is dangerous.
By the way, isn’t it weird and self-defeating that Faception’s illustration includes someone wearing a mask? That’s sort of a tacit admission that their whole programme is trivially easy to defeat.
This is scary bullshit. But then it’s social psychology: a field that should be burned down and the earth sown with salt. I’ll spare you the details so you don’t have to watch it unless you need a purgative: people in pictures who are smiling or who have widened eyes are rated as more interested/friendly than people who aren’t. That’s an amazing discovery, I admit. But let’s not call in air strikes on everyone who’s frowning, because we frowners are threatening and look like “terrorists”!