A Feedback Loop Feeds Back: “Predictive AI” is predictably bad


A Florida sheriff’s office is repeatedly harassing certain families in the county, abusing its authority by filing nuisance charges like “disorderly conduct” against the same people over and over.  The sheriff claims his “predictive AI” says certain people are likely to commit crimes, so he’s targeting them.  That’s bad enough, but the resulting arrests are then fed back into that AI as “data” proving those people are committing more crimes.

That’s not “evidence”, that’s a self-fulfilling prophecy.  It’s the ignorant assuming they have knowledge when they don’t even understand how it’s supposed to work.
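To see how little it takes for that prophecy to fulfil itself, here’s a toy sketch.  It is not the sheriff’s actual system (whose internals aren’t public); the group names, rates, and scoring rule are all my own assumptions.  Two groups offend at exactly the same rate, the “prediction” is just the arrest count so far, patrols go wherever that count is highest, and every write-up goes straight back into the data:

```python
import random

# Toy sketch of the feedback loop -- not the actual system (its internals
# aren't public).  Two groups, A and B, with the SAME true offence rate.
# The "prediction" for each group is just its recorded arrest count;
# deputies patrol whichever group scores higher, and every arrest they
# generate is fed back in as new "data".
TRUE_OFFENCE_RATE = 0.05       # identical for both groups (assumed)
NUISANCE_RATE = 0.20           # extra charges a patrol can always find
arrests = {"A": 1, "B": 0}     # one early arrest in group A, none in B

for week in range(200):
    target = max(arrests, key=arrests.get)    # "the AI says" patrol here
    if random.random() < TRUE_OFFENCE_RATE + NUISANCE_RATE:
        arrests[target] += 1                  # the charge becomes training data

print(arrests)   # roughly {'A': 50, 'B': 0} in a typical run
```

The model never sees the group it doesn’t patrol, so one early arrest snowballs into all of the attention even though the underlying behaviour of the two groups never differs at all.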

Everyone who reads this story keeps saying “Minority Report”, but I hated that movie.  I prefer to quote Arthur Conan Doyle from “The Sign of Four”:

You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.

Predictions and trends are only guides, not proof.  And no matter how good the model is, or how much you know about human behaviour, you cannot convict people for things they haven’t done.  A serial criminal who has committed the same crime a hundred times still can’t be assumed guilty the 101st time; you have to prove they did it.  It is possible for repeat offenders to stop.

Some of those harassed in Florida had no criminal history or arrest record before this.

A sheriff launched an algorithm to predict who might commit a crime. Dozens of people said they were harassed by deputies for no reason.

  • A Florida sheriff’s office deployed a futuristic algorithm that uses crime data to predict who is likely to commit another crime.

  • In a sweeping six-month investigation published this week, the Tampa Bay Times reported that the algorithm relied on questionable data and arbitrary decisions and led to the serial harassment of people without any evidence of specific crimes.

  • According to the report, former sheriff’s office employees said officers went to the homes of people singled out by the algorithm, charged them with zoning violations, and made arrests for any reason they could. Those charges were fed back into the algorithm.

 – – – – – – – – – – – – – – – – – – – –

I can imagine a conversation with a journalist going something like this:

Reporter: “Why did you arrest them?”

Sheriff: “Our AI said to because they were arrested before.”

Reporter: “Why were they arrested previously?”

Sheriff: “Because our AI said they would commit crimes.”

Reporter: “But what crimes did they actually commit in the past?”

Sheriff: “That doesn’t matter, this is about prevention.”

Reporter: “So you’re arguing that you can arrest people because you’ve arrested them before?  That your unjustified arrests in the past are justification for repeatedly arresting them in the future?”

Sheriff: “Maybe I should feed your name into the database.”

I was expecting Marcus Ranum to post this story first.

Comments

  1. Marcus Ranum says

    I was expecting Marcus Ranum to post this story first.

    My invective tank was empty at the time when I saw it, and I had just ordered some novichok-grade invective from Russia but it appears to have been mis-delivered. So I’m glad you caught this story.

    These AI-based systems don’t exist to do anything but confirm the cops’ preconceptions or, worse, to serve as “probable cause” for violating someone’s constitutional rights. (I don’t think it is constitutional to search without a warrant based on a cop’s assessment of reasonableness.) The scam works simply because the cops don’t ask the AI about everyone; they just point it at the black people or whatever, and it says “suspicious!”

    Use of these systems ought to be banned until they can demonstrate adequate accuracy, which they are not even close to; police departments using them are engaging in judicial malpractice.

    Unfortunately, police now seem to be taking the approach: “you should just be glad we’re not shooting you. Yet.”

  2. jrkrideau says

    It is amazing the crap that one can sell to the police, and then have the junk science accepted by a court.

    Of course forensic science is pretty much an oxymoron at any time.