You can be as sure as water’s wet: if someone doesn’t tell cops “don’t intrude on people” (you know, the way the Constitution tried to…) they’re going to explore the grey zones around people’s rights. And by “grey zones” I mean “areas where they can pretend not to understand” or “it looks grey to me.”
As Sam Harris says, in defense of profiling: [sh]
We should profile Muslims, or anyone who looks like he or she could conceivably be Muslim, and we should be honest about it. And, again, I wouldn’t put someone who looks like me entirely outside the bull’s-eye (after all, what would Adam Gadahn look like if he cleaned himself up?) But there are people who do not stand a chance of being jihadists, and TSA screeners can know this at a glance.
What he doesn’t realize is that racist profiling is inevitably going to be automated, and will result in an automatic robo-hassle or robo-shakedown of the target. [eng] In China, Uighurs are geo-fenced using face-recognition systems so that they can’t travel:
China is adding facial recognition to its overarching surveillance systems in Xinjiang, a Muslim-dominated region in the country’s far west that critics claim is under abusive security controls. The geo-fencing tools alert authorities when targets venture beyond a designated 300-meter safe zone, according to an anonymous source who spoke to Bloomberg.
Managed by a state-run defense contractor, the so-called “alert project” matches faces from surveillance camera footage to a watchlist of suspects. The pilot forms part of the company’s efforts to thwart terrorist attacks by collecting the biometric data of millions of citizens (aged between 12 and 65), which is then linked to China’s household registration ID cards.
Don’t laugh at China, though – the US is actually quite a bit ahead of China in deploying such systems; it’s just been hidden using America’s tried and favored technique: public/private cooperation. First, the FBI primed its database using driver’s license photos from every state that electronically captures drivers’ faces (i.e.: all of them, now). The FBI face recognition database contains photos of half the US population and (surprise!) the recognizers are only about 15% accurate – probably because driver’s license photos are standardized in ways that real-world surveillance images aren’t. Guess what else is unexpected: it’s more likely to be wrong about black people. Guess what else is unexpected? Congress was “shocked” at the size of the FBI’s deployment, but didn’t do a damn thing about it. Naturally.
By the way, the FBI didn’t ask. They just did it. So that way nobody could say “no” because it’s already operational and anyone who wants to take down such a system is helping terrorists and criminals, naturally.
I joked “Wait until Mississippi gets this” – referencing the fact that it’ll give racist cops an excuse to pull over anyone who’s driving while black, and probably shoot them besides – “the face recognition system said he was an Arab terrorist!” Suddenly a 15% chance takes on significance. It’s also going to be a terrific excuse for abusive policing at borders. I’m upper-middle-class and white, a TSA ‘known’ traveler, and I am usually carrying a ton of computer gear when I travel – customs/border patrol usually just wave me through. But if a face recognition system meant that 15% of the time I was going to miss my connection because I was stuck in a huge line at the border? Yeah, I’d be vocally unhappy. I’d get my privilege all bent out of shape. This system is going to further increase racial and class separations in the US; and – as I said – the American South is where it’s going to be adopted, first.
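That 15% compounds fast for anyone who travels regularly. Here’s a back-of-the-envelope, assuming each screening is an independent coin-flip (the trip count is my invention, not anything in the FBI’s numbers):

```python
# Back-of-the-envelope: if each screening independently carries a 15% chance
# of a false match, how often does a frequent flyer get flagged at least once?
error_rate = 0.15        # rough per-screening error chance, per the FBI figure above
trips_per_year = 10      # a modest travel schedule (my assumption)

p_never_flagged = (1 - error_rate) ** trips_per_year
print(f"P(at least one false flag per year): {1 - p_never_flagged:.0%}")  # -> 80%
```

Ten trips and you’re more likely than not to get robo-hassled at least once – which is exactly why the people this lands on won’t experience it as a “15% chance.”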
Oh, looky y’all: 
ORLANDO (CNN) – Orlando International has become the first United States airport to commit to using facial recognition technology.
It involves using facial biometric cameras that can be put near places like departure gates.
The cameras verify a passenger’s identity in less than 2 seconds, and they’re 99 percent accurate.
Officials say the system will be used to process the 5 million international travelers that go through the airport every year.
That story doesn’t even make sense. International travelers don’t come in at departure gates; they arrive at Customs, inbound. Outbound travelers don’t matter much, really. And somehow the system is being claimed to be 99% accurate. And, by the way, there is no such thing as a “biometric camera” – they mostly use cheap 3K webcams (except they probably pay $1000 apiece for them) – the magic is all done in a server farm, somewhere.
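And even if you take the claimed numbers at face value, “99 percent accurate” at airport scale means a steady stream of mistakes (the per-day figure assumes errors spread evenly across the year, which they won’t):

```python
# Rough arithmetic, taking the airport's own claims at face value.
travelers_per_year = 5_000_000   # "5 million international travelers"
claimed_accuracy = 0.99          # "99 percent accurate"

errors_per_year = travelers_per_year * (1 - claimed_accuracy)
print(f"{errors_per_year:,.0f} misidentifications/year")   # -> 50,000
print(f"{errors_per_year / 365:,.0f} per day")             # -> 137
```

That’s a hundred-plus people per day, every day, at one airport, under the rosiest reading of the vendor’s own number.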
And you can bet that the FBI will ask the airports, politely, “can we have a copy of that?” After all, it’s so they can help make sure Sam Harris misses 15% of his future connections.
We’ve already seen that the unregulated sharing of user data can cause problems (e.g.: the Strava heatmaps revealing secret US bases) – don’t think that this point has not sunk in; it’s just being ignored. For example:
Waze, the ubiquitous navigation app cherished by millions of traffic-harried drivers and begrudged by others who live along suggested shortcuts, is being tapped by the federal government to try to make roads safer.
The U.S. Transportation Department ingested a trove of anonymous data from Wazers, as the company calls its users, and put it up against meticulously collected crash data from the Maryland State Police. [wp]
Waze may even believe that their data is anonymized. Did they consider, for a second, that there are license-plate scanners all over the place now, and that every scanner at every tollbooth and airport parking lot – as well as many other locations – shares that information with the State Police, who are making it available to the FBI via “fusion centers”? I’m not saying “it’d be a simple perl script to match up Waze data with license-plate scanner data and de-anonymize drivers” because I don’t code in perl; otherwise, I guess it’d take a day or two to build that capability: you track each anonymous ID, and whenever it passes a place that records license plates, you intersect the set of plates seen in that geofence with the sets of plates seen everywhere else that ID has appeared – any plate missing from one of those sets drops out, and pretty quickly one plate is left. The FBI and local police forces also have stingrays up all over the place, tracking phone numbers and sweeping up text messages – that gives an excellent second source to merge with the license-plate scanners. Best of all, it can be done asynchronously and it parallelizes beautifully. Ooh, big secret algorithm disclosed there. Perhaps the airport’s facial recognition systems are getting the claimed 99% accuracy because they are cross-checking against the passenger lists.
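For what it’s worth, the intersection trick I just described fits in a dozen lines of (not-perl) code. A minimal sketch, with made-up sites and plates standing in for real scanner feeds:

```python
# Minimal sketch of set-intersection de-anonymization. "sightings" maps each
# license-plate scanner site to the plates it recorded during the window when
# the anonymous track passed through that geofence. All data here is made up.

def deanonymize(track_sites, sightings):
    """Intersect the plates seen at every site the anonymous track visited.

    track_sites: scanner sites the anonymous ID passed through, in order
    sightings:   dict mapping site -> set of plates seen there at that time
    Returns the plates present at *every* site -- ideally exactly one.
    """
    candidates = set(sightings[track_sites[0]])
    for site in track_sites[1:]:
        candidates &= sightings[site]   # keep only plates also seen here
    return candidates

# Three scanner sites; only plate "ABC-123" appears at all of them.
sightings = {
    "tollbooth_7":  {"ABC-123", "XYZ-999", "JKL-432"},
    "airport_lot":  {"ABC-123", "QRS-777"},
    "tollbooth_12": {"ABC-123", "XYZ-999"},
}

print(deanonymize(["tollbooth_7", "airport_lot", "tollbooth_12"], sightings))
# -> {'ABC-123'}
```

Each additional site shrinks the candidate set multiplicatively, and each intersection is independent of the others – which is why it parallelizes so well.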
It’s a weird mixture of hyper-optimism and short-sightedness.
Meanwhile, there is the “AI will figure out everything” crowd, who do seem to be putting their effort into making that happen: [osf]
Description: We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.
Got that? They think that gay people look different enough that an AI can detect them. It’s not as though anything the AI is detecting might be culturally determined, right? I checked my copy of Vaught’s Practical Character Reader and there are no entries for gay people. I guess I’m just stuck with the double helping of social prejudice Vaught’s offers on other topics.
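The pipeline the abstract describes – fixed embeddings from a pretrained face network, fed into a logistic regression – is about the most generic classifier setup there is. A sketch with random stand-in data (scikit-learn, the embedding size, and the sample count are my assumptions, not the paper’s):

```python
# The abstract's pipeline: deep-net facial features -> logistic regression.
# Everything here is a stand-in: random vectors instead of real embeddings,
# random labels instead of real ones.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_faces, dim = 1000, 128                 # hypothetical embedding dimensions
X = rng.normal(size=(n_faces, dim))      # stand-in for deep-net features
y = rng.integers(0, 2, size=n_faces)     # stand-in binary labels

clf = LogisticRegression(max_iter=1000).fit(X, y)

# With real embeddings, whatever accuracy you get reflects whatever signal the
# features happen to encode -- the regression is indifferent to whether that
# signal is "fixed morphology" or grooming, glasses, camera angle, and culture.
print(f"training accuracy on pure noise: {clf.score(X, y):.2f}")
```

Nothing in this setup distinguishes innate traits from culturally determined ones; the regression just weights whatever happens to separate the classes.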
The Government Accountability Office (GAO) has some nasty and accurate things to say about FBI’s database: [gao]
The Department of Justice’s (DOJ) Federal Bureau of Investigation (FBI) operates the Next Generation Identification-Interstate Photo System (NGI-IPS) – a face recognition service that allows law enforcement agencies to search a database of over 30 million photos to support criminal investigations. NGI-IPS users include the FBI and selected state and local law enforcement agencies, which can submit search requests to help identify an unknown person using, for example, a photo from a surveillance camera.
Prior to deploying NGI-IPS, the FBI conducted limited testing to evaluate whether face recognition searches returned matches to persons in the database (the detection rate) within a candidate list of 50, but has not assessed how often errors occur. FBI officials stated that they do not know, and have not tested, the detection rate for candidate list sizes smaller than 50, which users sometimes request from the FBI. By conducting tests to verify that NGI-IPS is accurate for all allowable candidate list sizes, the FBI would have more reasonable assurance that NGI-IPS provides leads that help enhance, rather than hinder, criminal investigations. Additionally, the FBI has not taken steps to determine whether the face recognition systems used by external partners, such as states and federal agencies, are sufficiently accurate for use by FACE Services to support FBI investigations.
The office recommended that the attorney general determine why the FBI did not obey the disclosure requirements, and that it conduct accuracy tests to determine whether the software is correctly cross-referencing driver’s licenses and passport photos with images of criminal suspects.