Police IT staff checked wrong box, deleted 25% of body cam footage
In 2014, Oakland Police Dept. made fateful error, hadn’t set up backups, either.
Maybe they hired their IT staff from Hillary Clinton's shop, or from www.RoseMaryWoods.com IT staffing.
One-quarter of all body-worn camera footage from the Oakland, California, police was accidentally deleted in October 2014, according to the head of the relevant unit.
According to the San Francisco Chronicle, Sgt. Dave Burke testified Tuesday at a murder trial that this was, in fact, a mistake.
This incident marks yet another setback in the efforts to roll out body-worn cameras to police agencies nationwide.
In late August 2016, the Seattle Police Department reported a similar IT glitch involving body camera footage.
“Nothing should have ever been lost from the system,” Burke said in court, later adding, “The settings were set to never delete.”
(Source: Ars Technica)
Back in 2009 I did some consulting for Taser regarding security architecture for data transmission between the wearable cop-camera point-of-presence (Taser calls it the Axon) and their cloud service. Taser’s concern was that their customers – the police departments – needed assurance that the data in the cloud storage was going to be available, tamper-proof, and backed up. The premise of the cloud service, of course, was that the cop-cameras’ data didn’t need to be stored locally, which meant no IT footprint at all; everything would be neatly managed from the cloud. And of course, this is my thing! My bread and butter! All accesses were going to be recorded, there would be identification and authentication (I&A) and access control, and high security at the cloud data center, with backups going to Iron Mountain.
I have discussed this several times in this blog: if you cannot do basic data management, you need to step aside and let Google and Amazon and other grown-ups do it. In formal security-ese that means “availability”, “access control”, and “integrity” – it is utter bullshit that someone forgot to check a box and suddenly a whole lot of data was erased – and, oh, conveniently, there were no backups either. “A backup system had been purchased but not installed” and the erasure glitch was a result of a software upgrade.
We are supposed to believe that there exist chucklefucks in the world so utterly incompetent that they do a software upgrade on a production system without backups? That strains my credulity. Unless Donald Rumsfeld has become a system administrator, it is simply impossible that anyone would be so incompetent. I know junior systems administrators, fresh to root on their first Linux machine, who would never, ever, make a mistake like that.
Elsewhere in the article, it scopes the size of the video archive as “dozens of terabytes.” Oh, I am so impressed. I’m a hobbyist who plays with digital media – digital photography, drone footage, slow motion, yadda-yadda – and my desktop computer is spinning 16tb. In addition to all the other stuff I do, I manage that in triplicate. I even post lengthy blog postings on how. And when I do, some rational person asks “Why not just use Carbonite?” which is a great question! You mean the cloud? Like Taser suggested?
What a great idea!
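For what it’s worth, “managing it in triplicate” mostly boils down to keeping checksum-verified copies. A minimal sketch of the integrity check (the function names are mine, not from any particular backup tool):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    # hash the raw bytes; with real files you'd stream them in chunks
    return hashlib.sha256(data).hexdigest()

def replicas_consistent(copies) -> bool:
    """True when every replica hashes identically -- the cheap integrity
    check behind keeping media 'in triplicate'."""
    return len({sha256_of(c) for c in copies}) == 1
```

Run that periodically across the three copies and a silently corrupted (or silently deleted) replica announces itself, instead of being discovered a year later in a courtroom.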
Technical neeping aside, let’s get to the money shot, shall we?
Though officers activated body-worn cameras when they arrived at Fern Street near Fairfax Avenue, no footage from the July 24, 2013, slaying could be found because of the reported data erasure one year later.
Ah, so the data was erased a year after the killing, and the backup system still wasn’t installed a year later. And the deletion went undiscovered. So an IT specialist was doing a software install a year on, still with no backups, and nobody ran any backups on the production system in the year between the incident and the erasure? The backup system just sat in the state of “hasn’t been installed yet” for a year?
Burke was asked to explain the deletions in court because Annie Beles, the attorney for defendant Mario Floyd, has said that footage taken from officers who arrived at the scene of Salamon’s killing would contradict witness statements.
Beles said the footage would likely show that no trash bins were knocked over in the street, which is notable because Ford, the prosecutor, has argued that Floyd threw a trash bin at Salamon as he demanded her phone minutes before co-defendant Stephon Lee allegedly fired three shots, killing her.
Pull the other one. It’s got bells on.
Cops tampering with evidence in a judicial murder case? That’d never happen. Right?
I never got back to the Taser folks because I got pulled off in other directions, but there’s a part in my consultant’s report where I mentioned that there may be “externalities” governing various police forces’ behavior, which might not have anything to do with whether the cloud storage was secure or not. One of the points I emphasized was that having everything up in the cloud would dramatically reduce the likelihood that someone might leak video – you know, if Officer Porko tells his buddy “Hey, I saw $hot_young_actress puking her guts up and she failed a breathalyzer and I could totally see her underpants” there’s no way for them to go get it, because the cloud service would log and audit all accesses, and a supervisor would have to counter-approve any access once the video had been uploaded to the cloud. I felt that the concern that cloud administrators might sit there watching the videos was not significant, because there’d be too much to watch; unless they knew which camera Officer Porko was wearing and that the incident occurred at all, it’d just be a needle in a haystack of data to them.
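A toy sketch of that access-control scheme – an append-only audit log plus mandatory supervisor counter-approval before any fetch – might look like this (all class and method names here are hypothetical, not Taser’s actual API):

```python
import time

class EvidenceStore:
    """Toy sketch: audited video access with supervisor counter-approval.
    Names are hypothetical -- the real cloud service's API is not public."""

    def __init__(self):
        self.audit_log = []      # every access attempt gets recorded
        self.approvals = set()   # (officer, video_id) pairs a supervisor signed off on

    def approve(self, supervisor, officer, video_id):
        # counter-approval must happen before the video becomes viewable
        self.approvals.add((officer, video_id))
        self.audit_log.append((time.time(), supervisor, "APPROVE", officer, video_id))

    def fetch(self, officer, video_id):
        if (officer, video_id) not in self.approvals:
            # denied attempts are logged too -- that's the whole point
            self.audit_log.append((time.time(), officer, "DENIED", video_id))
            raise PermissionError("access requires supervisor counter-approval")
        self.audit_log.append((time.time(), officer, "FETCH", video_id))
        return f"<video {video_id}>"
```

Officer Porko’s fishing expedition fails loudly and leaves a log entry with his name on it; that is the deterrent.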
The cloud storage option should be mandatory for all cop departments. We can’t trust them. We shouldn’t trust them.
Oh, one last thing: what kind of software upgrade gives you the option to delete 25% of your data? I’ve built software upgrade processes, and run/installed countless application upgrades. I’ve never encountered a software upgrade that gives the user the option to delete terabytes of data without clicking “OK” and “YES I AM SURE” a couple of times. Software upgrades are one of the trickiest bits of product design out there, because you have to migrate site-specific settings and application data forward extremely carefully, specifically to prevent that kind of thing from happening. No vendor builds a system that does that; they’d tell the customer “if you want a fresh install without the data, install it on a new system and swap over to it once it’s up and running.” But, in that case, you’d have those old hard drives in a media safe somewhere, wouldn’t you?
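Any sane upgrade process puts guard rails between the operator and the data. A minimal sketch of what those guard rails look like (hypothetical names, not any vendor’s actual code; `confirm` stands in for the interactive prompt):

```python
def safe_upgrade(data_files, backup_verified, confirm):
    """Sketch of the guard rails a sane upgrade puts around production data.
    `confirm` stands in for the interactive 'YES I AM SURE' prompt."""
    if not backup_verified:
        # refuse to touch production data without a verified backup
        raise RuntimeError("refusing to upgrade: no verified backup of application data")
    if confirm != "YES I AM SURE":
        # destructive paths require explicit, unambiguous confirmation
        raise RuntimeError("destructive operation not confirmed")
    # forward-migrate site-specific settings and data instead of wiping them
    return [f"migrated:{name}" for name in data_files]
```

The Oakland story requires both checks to have been absent at once: no backup gate, and no confirmation gate.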
In 2013 a 2tb hard drive was “the thing” and they cost about $200 apiece. So let’s say 24tb is $2400. If you’re putting that in a rackmount with 24 bays, you’re looking at $3000 for the rack – call it $10,000 all told in 2013 pricing. Slap BSD and ZFS on it, fill the remaining bays to bring the storage up to capacity (48tb) for another $2400 – call it $15,000 for the server – and an intermediate-level systems admin would have it up and running in a couple of days. A novice in a week. An old grumpy badger would have it working by lunchtime. A 48tb rack-mount tape library was about $5,000 in 2013 costs, and it can do unattended backups since the sizeof(tape) is equal to or greater than the sizeof(storage). Setting up a tape backup library is maybe a couple of days’ work. Actually, the way I roll, I’d forgo the tape backup and have two of the $15,000 storage servers in two locations, one of them colo’d at a service provider, or I’d let Iron Mountain do it. Problem solved. OH, AND TURN ON SYSTEM LOGGING ON THE FUCKING THING.
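The back-of-envelope arithmetic above, spelled out (all figures are the 2013 estimates from the text, nothing more authoritative than that):

```python
# 2013 back-of-envelope pricing, numbers straight from the text above
drive_tb, drive_cost = 2, 200              # 2tb drives at ~$200 apiece
drives_for_24tb = 24 // drive_tb           # 12 drives
disks_24tb = drives_for_24tb * drive_cost  # $2,400 of raw disk
rack = 3_000                               # 24-bay rackmount enclosure
server_all_told = 10_000                   # padded: enclosure, disks, CPU, RAM, cabling
fill_remaining_bays = drives_for_24tb * drive_cost  # another $2,400 -> 48tb raw
server_48tb = 15_000                       # "call it $15,000" per the text
tape_library = 5_000                       # 48tb rack-mount tape library
two_mirrored_servers = 2 * server_48tb     # forgo tape: two servers, two sites
```

Thirty grand, fully mirrored across two sites, for a problem a murder conviction may now hinge on.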