Accuracy Is Not The Objective


This is a sad one; I’m afraid we’re looking at more police malfeasance.

Police in Wales are claiming that their face-recognition system has been working pretty well in the last 9 months; they have arrested over 450 people.

The problem is that their facial recognition system has a 92% false positive rate. [ars] I’m afraid they are missing the point: it gives the police an excuse to stop and question anyone, anytime.

A “false positive” isn’t even (quite) the right term. If you’ve got a test that returns true or false, a “false positive” is when the system returns “true” and the correct answer is “false.” For something like facial recognition in a police application, the right test isn’t a single true/false question; it’s two questions:

  1. Who is this?
  2. Is that person wanted for a crime?
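
A minimal sketch of that two-step framing (all names and face data here are invented for illustration; a real recognizer would return a name or “unknown” instead of doing a dict lookup):

```python
# Invented illustration of the two-step test: first identify,
# then check the wanted list. All data below is made up.

KNOWN_FACES = {"face_a": "Alice", "face_b": "Max The Pirate"}
WANTED = {"Max The Pirate"}

def identify(face):
    # Stand-in for a recognizer: a name, or None if unknown.
    return KNOWN_FACES.get(face)

def should_stop(face):
    who = identify(face)
    if who is None:
        return False       # unknown face: no identification, no stop
    return who in WANTED   # stop only on a positive ID of a wanted person

print(should_stop("face_a"))  # False: identified, but not wanted
print(should_stop("face_b"))  # True: identified and wanted
print(should_stop("face_c"))  # False: not identified at all
```

The point of the structure is that “I don’t know who that is” is a distinct outcome, and it never leads to a stop.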

Presumably, what they are describing is a false identification. The way Ars describes it:

New data about the South Wales Police’s use of the technology obtained by Wired UK and The Guardian through a public records request shows that of the 2,470 alerts from the facial recognition system, 2,297 were false positives. In other words, nine out of 10 times, the system erroneously flagged someone as being suspicious or worthy of arrest.

Did the system mis-identify passers-by as Max The Pirate, wanted for piracy on the high seas? Or did it correctly identify the passers-by as who they were, and incorrectly flag them as “stop and frisk!”? There’s a really bad failure mode hidden in there: the system appears to be identifying a lot of people incorrectly but it’s identifying them as someone worth a stop and frisk. That’s highly suspicious; I’d expect the system to identify someone incorrectly, then fail. I.e.: “I don’t know who that is” which never leads to a “stop and frisk.” Someone is lying.

In other words, the cops are just lying. The facial recognition truck is a way of having an excuse to pull over anyone based on the inaccurate recognizer; they could just replace it with a camera that rings “ding dong” whenever it sees motion. Time for an ID check!

Wait ’til LAPD gets their hands on one of these. They’ll just shoot everyone they see and say “it said ‘shoot on sight.’” They’ll love that.

“Of course, no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems, which means false positives will continue to be a common problem for the foreseeable future,” the police wrote. “However, since we introduced the facial recognition technology, no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained.”

Nobody has complained? I doubt that. But perhaps the population is just so happy that they’re not going to get shot, that they are grateful to escape.

Comments

  1. cvoinescu says

    Knowing the police around here, it’s also quite possible that what happened was something along these lines: the camera in the van checked some 5,000,000 frames against the database. In 2,400 cases, it rang the bell, showing the captured image and the photo and details of the wanted person, side by side. In 2,300 of those cases, the police in the van looked at that screen, said “that lady on the bike doesn’t really look like this 50-year-old guy, does she?”, rolled their eyes, and pressed the “nope” button. In 100 cases, they went “Bloody hell, the bloody thing actually works! Get the cuffs!” and nabbed someone they were actually looking for.

    You’ll excuse me if I’m an optimist. They really do behave much better here than in the US (faint praise, I know).

  2. Dunc says

    Police in Wales are claiming that their face-recognition system has been working pretty well in the last 9 months; they have arrested over 450 people.

    The problem is that their facial recognition system has a 92% false positive rate. [ars] I’m afraid they are missing the point: it gives the police an excuse to stop and question anyone, anytime.

    Sounds like it’s working perfectly then. [/s]

    That’s highly suspicious; I’d expect the system to identify someone incorrectly, then fail. I.e.: “I don’t know who that is” which never leads to a “stop and frisk.” Someone is lying.

    Thing is, they don’t have a database of everybody (well, not one that they’re going to admit to and use for this purpose), they have a database of targets. It’s not a two-step process where first you identify somebody from a global list and then ask if they’re on the target list; it’s just a question of “does this face match* anybody on the target list?”. Which is a perfect recipe for false positives, because you’re multiplying the individual false positive rate (the number of times person A is incorrectly identified as person B) by the number of people on the list – and I’m betting it’s a pretty long list. And since they don’t have a list of people they’re not looking for, there is never an option for positive elimination (“we know who this is and we’re not looking for them”). Either you “match”* somebody, in which case you get stopped, or you’re a null result.

    What I’m wondering now is whether the system just picks the strongest match*, or if there’s a possibility for one person to be incorrectly identified as multiple targets. Oh shit, it’s Max The Pirate and Dick Dastardly!

    (* Matching in this sense is not a strict binary yes or no, it’s a probability. Presumably the match threshold is tunable – anybody want to guess how they’ve tuned it? I call dibs on “as wide as they think they can get away with”.)

    they could just replace it with a camera that rings “ding dong” whenever it sees motion

    Well, yeah, but that would be (a) a bit obvious, and (b) a lot cheaper. I bet somebody’s making good money out of this. Probably the same bastards that sold those dowsing rods to the Iraqis as bomb detectors…
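
    Dunc’s compounding argument can be sketched numerically. The per-pair false-match rate and watchlist sizes below are invented (the real figures aren’t public), and the sketch assumes each comparison errs independently, which is a simplification:

    ```python
    # Chance that one innocent face falsely "matches" at least one
    # entry on a watchlist, assuming independent comparisons that each
    # err with the same (invented) per-pair probability.

    def p_any_false_match(per_pair_rate, list_size):
        return 1.0 - (1.0 - per_pair_rate) ** list_size

    # Even a tiny per-pair error rate blows up on a long target list:
    for n in (10, 100, 500, 1000):
        print(n, round(p_any_false_match(0.001, n), 3))
    ```

    With a 1-in-1,000 per-pair rate, a thousand-entry list already gives a better-than-even chance of a spurious hit on any given face.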

  3. says

    Presumably the match threshold is tunable – anybody want to guess how they’ve tuned it? I call dibs on “as wide as they think they can get away with”.

    I think that’s exactly the problem. After all, the police will face no serious consequences if they stop someone innocent, but if they let one known terrorist get through, there’ll be hell to pay. Consequently, it’s in their best interest to cast as wide a net as possible, not to minimize false hits.

  4. says

    Tabby Lavalamp@#4:
    of all the possible dystopias, somehow we’ve ended up in the most ridiculous

    Yes, it’s as though Cthulhu was real, but decided to make our world like Fawlty Towers.

  5. says

    LykeX@#3:
    After all, the police will face no serious consequences if they stop someone innocent, but if they let one known terrorist get through, there’ll be hell to pay.

    It’s the “perverse incentives” from hell.
    That the cops suffer no consequences for harassing citizens: that’s a big piece of the problem.

  6. jrkrideau says

    Did I calculate this correctly? The police stopped 5,625 people to arrest 450?

    Just in terms of the police time wasted discovering that they have stopped the wrong person for the Nth time that day, the system has to be a failure.

    I wonder what the False Negative rate is? Something that generates that high a False Positive rate may be just as bad at generating False Negatives.
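
    The back-of-the-envelope works if you take the arrest count as a proxy for true alerts, as jrkrideau does:

    ```python
    # If 92% of alerts are false, only 8% of stops can yield a valid
    # arrest, so ~450 arrests implies roughly 450 / 0.08 stops.

    false_alert_rate = 0.92
    arrests = 450

    stops = arrests / (1 - false_alert_rate)
    print(round(stops))  # 5625
    ```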

  7. jrkrideau says

    @ 6 Marcus
    the cops suffer no consequences for harassing citizens

    This is in Wales not the USA. I wonder how many citizens after seeing that piece are talking to their local councillors and their MPs? With that False Positive rate the police may well be hitting some of the local dignitaries, possibly including local councillors. Well, we can hope.

  8. says

    jrkrideau@#7:
    Did I calculate this correctly? The police stopped 5,625 people to arrest 450?

    Sounds right. Impressive, huh?

    With that False Positive rate the police may well be hitting some of the local dignitaries, possibly including local councillors.

    Not a chance. The powerful always have ways of signalling their importance so they can be left alone. Most of this stuff would sort itself out if it impacted policy-makers. But the people doing the annoying know that, and are careful not to interfere with the ruling class.

  9. Hatchetfish says

    I would bet Dunc has the situation down. It makes no effort to ‘identify’, it just compares to the entire target list and hands the fuzz the best match. It has no mechanism for negatives, true or false, it’s really just a carnival game: “Which British person with a warrant looks the most like you.”

    The system doesn’t have an “I don’t know who that is” state, just the silent “no one with a warrant looks enough like that person to alert on” state.

  10. says

    Hatchetfish@#11:
    The system doesn’t have an “I don’t know who that is” state, just the silent “no one with a warrant looks enough like that person to alert on” state.

    I would say the real purpose of the system is revealed through its design.

  11. says

    cvoinescu@#12:
    But…
    “But police have defended its use and say additional safeguards are in place.”
    So, it’s OK. Additional safeguards are in place!

  12. says

    sonofrojblake@#15:
    Yeah, but:
    “False positive rate of 98% doesn’t count, say police, because ‘checks and balances'”

    See? Checks and balances. Fuhgeddaboudit.