

Jun 16 2013

The security illusion

No system of detection is perfect, whether it be in medicine or police work. There is always the chance of false positives and the wider you make your dragnet, the larger the number of false positives that you are going to get. While the massive secret databases of the NSA are touted as an efficient means of detecting patterns to thwart terrorist attacks, it is simply a statistical fact that any pattern matching software will throw up false positives.
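The statistical point here is the base rate problem: when the condition you are screening for is extremely rare, even a highly accurate detector will flag far more innocent people than guilty ones. A back-of-the-envelope sketch makes this concrete (all the numbers below are illustrative assumptions, not actual figures about any real program):

```python
# Hypothetical numbers chosen for illustration only.
population = 300_000_000       # people swept into the dragnet
terrorists = 1_000             # actual threats (a deliberately generous guess)
sensitivity = 0.99             # P(flagged | terrorist)
false_positive_rate = 0.001    # P(flagged | innocent) -- an optimistic 0.1%

true_positives = terrorists * sensitivity
false_positives = (population - terrorists) * false_positive_rate

# Bayes' theorem: P(terrorist | flagged)
posterior = true_positives / (true_positives + false_positives)

print(f"Innocent people flagged: {false_positives:,.0f}")
print(f"Chance a flagged person is a real threat: {posterior:.2%}")
```

Under these assumptions the system flags roughly 300,000 innocent people, and a flagged individual has well under a one percent chance of actually being a threat. Making the detector wider (a higher false positive rate) only makes this worse.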

With the revelation of the NSA’s massive surveillance program that targets pretty much everyone, it is guaranteed that many wholly innocent people will find themselves in the crosshairs of the government as potential terrorists and will suffer harassment even though they have done nothing to warrant such suspicion. But what makes this worse is that since the program is kept secret and people do not have the chance to clear their names in open court, they will have little or no means of proving their innocence. We have already seen many cases of people being put on no-fly lists (created using name-matching software) and prevented from boarding planes with no reasons given, leaving them no way to clear their names.

But it can and does get even worse. Gail Collins reminds us of the story of Brandon Mayfield as a cautionary tale of how the national security state can get things horribly wrong and ruin people’s lives. He was picked up as a suspect in the 2004 bombing of a commuter train in Spain as a result of a faulty match in a fingerprint database. Worse still, he was married to an Egyptian and his daughter was named (gasp!) Sharia! What more evidence of terrorist sympathies do you need? You have to read Collins’s account of the hell that he and his family went through (arrests, secret searches of their home and offices, etc.) to truly appreciate how bad it can get.

Fortunately the Spanish authorities felt, reasonably enough, that a crime on Spanish soil was unlikely to have been committed by someone who had never been to that country, and so they continued their search and found the person whose fingerprint truly matched. That was what resulted in Mayfield being ‘cleared’, though you can be sure that he remains on the watch list.

Cory Doctorow describes a more recent case of a faulty match of patterns.

You should care about privacy because if the data says you’ve done something wrong, then the person reading the data will interpret everything else you do through that light. Naked Citizens, a short, free documentary, documents several horrifying cases of police being told by computers that someone might be up to something suspicious, and thereafter interpreting everything they learn about that suspect as evidence of wrongdoing. For example, when a computer programmer named David Mery entered a tube station wearing a jacket in warm weather, an algorithm monitoring the CCTV brought him to the attention of a human operator as someone suspicious. When Mery let a train go by without boarding, the operator decided it was alarming behaviour. The police arrested him, searched him, asked him to explain every scrap of paper in his flat. A doodle consisting of random scribbles was characterised as a map of the tube station. Though he was never convicted of a crime, Mery is still on file as a potential terrorist eight years later, and can’t get a visa to travel abroad. Once a computer ascribes suspiciousness to someone, everything else in that person’s life becomes sinister and inexplicable.

Of course, these cases will not convince those who have been conditioned to think that they are in imminent danger of a terrorist threat and that we need to give the government all these secret powers to keep them safe. Most people also have an unreasonably high opinion of computers. They think that the chances of their being mistakenly identified as a potential terrorist are small. And they are correct. But for those few who do happen to be out of luck, it can destroy their lives.

People seem surprisingly willing to give up their freedoms as long as someone else is likely to pay the price.

2 comments

  1. 1
    slc1

    As one wag put it, a conservative is a liberal who has been mugged, a liberal is a conservative who has been indicted.

  2. 2
    Jeffrey Johnson

    This article is actually touching directly on what the real problem is. The problem is not the gathering of the data. The problem is the impact it can have on individual lives if the data is used stupidly or carelessly. We all react viscerally to the gathering of the data because we can see immediately how it can turn us into an unfortunate character in a Kafka novel.

    But if we step away and unpack it for a moment, and ask what the possible solutions to this problem are, the simplest and crudest solution is to block the gathering of the data. But is that the best solution?

    This crude solution entirely throws out any value that might come from the ability to analyze this data. This immediately suggests that this simple-minded solution is possibly not the optimal one.

    Is our freedom really so trivial that it depends on the ability to mask what we are doing, that it depends on a perfect anonymity that can only be observed by those we choose? Isn’t our freedom something grander that depends on something more important than stealth and concealment? Isn’t part of freedom the ability to present ourselves publicly without fear?

    So what alternative solution is there to protecting the freedom of innocent people, other than granting them the same kind of secrecy and safety from prying eyes that every criminal and terrorist desires?

    In my view the answer lies not in blocking the gathering of the data, but in working hard as a society to eliminate the secrecy that could allow abuse to go unchecked. The handling of the data needs to be scrutinized and defined in detail, with citizens having the power to review how their data has been accessed and by whom. Citizens need to be able to challenge the security state when it makes mistakes, and be given the presumption of innocence. In other words, our Constitutional powers to face our accusers, to not incriminate ourselves, and to be granted the presumption of innocence are what need to be reaffirmed and strengthened. Calling this information our “property” and asserting merely the 4th amendment doesn’t seem a very resounding assertion of freedom to me.

    Computer systems can be constructed so that every action is audited and logged, and access to data is recorded on a per-record, per-user, per-access basis. The picture people have of every young Snowden being able to look up anyone’s address or private correspondence on a whim, if true, can be stopped using technological approaches that still enable properly warranted criminal investigations.
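    The per-record, per-user, per-access auditing the commenter describes is a standard pattern. A minimal sketch (names and storage choices here are hypothetical; a real system would write to an append-only, tamper-evident store rather than an in-memory list):

```python
import datetime
import functools

# In a real system this would be an append-only, tamper-evident log,
# reviewable by oversight bodies -- a plain list is used here only for illustration.
audit_log = []

def audited(fn):
    """Record who accessed which record, when, and how, before running the access."""
    @functools.wraps(fn)
    def wrapper(user_id, record_id, *args, **kwargs):
        audit_log.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "who": user_id,
            "record": record_id,
            "action": fn.__name__,
        })
        return fn(user_id, record_id, *args, **kwargs)
    return wrapper

@audited
def read_record(user_id, record_id):
    # Placeholder for the actual data retrieval.
    return f"<contents of {record_id}>"

read_record("analyst_007", "rec_42")
print(audit_log[0]["who"], "accessed", audit_log[0]["record"])
```

    Every lookup leaves a trace tied to a specific user and record, which is exactly what makes after-the-fact review of abuses possible.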

    As I see it, simply ending the discussion by saying we need absolute privacy is throwing the baby out with the bathwater. Rather we should take the harder but better course of defining carefully how and under what circumstances such data can be accessed, and putting in place technological accountability that provides transparent review capable of exposing any abuses. Was Brandon Mayfield’s real problem that his fingerprint came up as a false positive, or was it the shoddy and unprofessional bias with which he was treated after that event? I think the problem we need to address is the failure of protections for those who become suspects, either validly or in error.

    Khalid Al-Masri, Maher Arar, Binyamin Mohammed et al., Nasser Awlaki, and many others have been denied access to the courts to address grievances against our government. This denial was based on an overly broad assertion of the state secrets privilege. This is the really bad problem we have that we should be focusing on, not the gathering of data. If we block the gathering of data we still must win this larger fight against secrecy, and if we win the fight against such excessive abuse of secrecy, the data gathering becomes a much less dangerous problem.

    The abuse of the state secrets privilege expanded post-9/11 beyond simply allowing certain evidence to be rendered inadmissible. Overnight it turned into the ability to block entire cases if any relevant evidence was asserted by the government to compromise national security. This is far too broad a power. There is a proposed legal solution to this problem called the State Secrets Protection Act, which has languished in the Senate since 2007. This Act provides judicial review of executive assertions of the national security privilege, and it requires the government, under judicial review, to provide reasonable facsimiles of evidence relevant to a trial whenever a document or other source contains classified information.

    Rather than complaining about this potentially useful data gathering technology, we should be fixing and strengthening those aspects of the law that really protect us from abuse. No amount of privacy will stop armed SWAT teams from invading your home, or unreasonable arrests or detentions without recourse. It only makes it a bit harder for them to find you.
