Destroying Data


Data destruction is a part of good systems administration; you should design it into your understanding of how you use your systems.

I did a piece back in August 2016 about data management, backups, and system administration. There are a few too many topics in that article to do them justice, so I’m going to talk just about data segregation and why you want to think about your data that way.

Back in 2005, I bought a refurbished laptop on eBay (for a friend) and when I got it, I discovered that the former owner had not zeroized the hard drive and reinstalled Windows – they had simply deleted their ‘My Files’ folder. Meanwhile, their ‘Favorites’ folder was untouched – and they had some pretty interesting ‘Favorites’ indeed: I could tell the seller was a stoner and liked very particular kinds of porn. So I zeroized the hard drive and reinstalled Windows.

Back around 2002, I was on a technology advisory board for a company called Trust Digital, which did endpoint encryption on cell phones, and I suggested that we do a little marketing piece by buying a dozen or so cell phones on eBay and running a forensics tool similar to EnCase on them. We did, and there was some scary, scary stuff on them. There were also some pretty cool pictures of horny naked executives and their significant others, but none of Anthony Wiener – though you can bet a forensic examination of his storage would be interesting!

How do you keep this from happening?

Option 1: You never let your media leave your control.

Option 2: You destroy any media that’s about to leave your control.

Option 3: You encrypt your data and periodically de-key it, so if it goes out of your control, it reverts to a cloud of bits.

Option 4: You don’t care at all. Go Directly to Boardwalk, collect $200.

None of those options is very good unless you can segregate your data. Segregating your data means being able to tell the difference between your data and disposable data. For example, if I look over my shoulder here at my gaming computer, it has two SSDs in it: one is devoted to Windows 10; the other is partitioned into two segments, E:\Steam, which contains downloadable game assets, and F:\Whatever, which contains screenshots, settings, and DVD data I’m ripping and encoding. If I adopted a simple backup and data management strategy of not segregating my data – i.e., if it were all on C:\ – then I’d have to back up and manage 600gb of data instead of the 35gb in the one partition I have defined as “mine.” Designing your system with data segregation means you’re saving yourself time and effort by defining what matters to you, and how it matters. Further, since the stuff on the F:\Whatever drive is not sensitive, I don’t bother to encrypt it.

In this example, I’m using machine-level segregation! My important data is on my desktop server, a few feet over from the gaming system. I spent extra money to have two separate systems so I could let Steam and Microsoft control the software on the gaming machine, and otherwise ignore it. The desktop server contains more sensitive stuff, which is also segregated: some partitions are backed up, some are encrypted and backed up, and others, like C:\Windows, are disposable. If I had to send my computer out for repair,* or if I got a cool new piece of Equation Group malware on the system drive, I could pull out all the other hard drives that contain my data and be fairly confident that the stuff in C:\Windows mostly doesn’t matter. (I’m sure Microsoft leaves data-turds all over the place, and there is probably data in the paging areas that I don’t want leaving my house, but, hypothetically, I might be willing to let C:\Windows leave my hands for a while.)

my desktop server’s partition map

If you look at Disk 0 – my main drive – you’ll see it’s partitioned into C:\ (Windows) and E:\ (home). They’re separate segments on the same drive (it’s a nice big SSD) so I get fast performance on both filesystems, but I only worry about the data on E:\ since Microsoft will cheerfully give me another 100gb of crap any time I ask them for it; there’s no need for me to lovingly preserve a copy of Windows.

If you’re a systems administrator with a lot of experience, you’ll know to think about your drive layout when you build your system. If you’re a more typical computer user, you go to Best Buy or whatever and come home with a system that already has an operating system installed. And it’s usually been installed in a way that is most convenient for the operating system, not for the user. They don’t want to have to explain partitions to you, so you get one big partition, “have a nice day.” In that case you have three options:

Option 1: Add another hard drive. Call it E:\Mystuff and put your stuff on it. If you ever need to send the system somewhere, take it out. Consider buying an Icy Dock removable drive bay so you can pop drives in and out if you need to. Or use an external USB drive (this has the added benefit of allowing you to turn it off if you want your files inaccessible).

Option 2: Reinstall the operating system more intelligently. Assume 200gb or so for the operating system and programs (this will change over O/S releases and software load-out: measure what you currently use and add a large fudge factor of overhead) and partition the root drive so that you’ve restricted the amount of space Microsoft feels it can crap all over. Put your stuff in the secondary partition and Microsoft and whatnot in the primary partition – though if the hard drive fails, you lose the whole mess together.

Option 3: Create a container partition. A container partition is a virtual partition that exists within a single file in a filesystem. So, imagine you have a drive partition called C:\Windows and a 200gb file called C:\Windows\Home.TC – you could make that Home.TC file look like another hard drive: the operating system simply treats the space in the file as it would space on a hard drive, and off it goes. You’ll experience some minor performance impact from doing this, but generally you’ll not notice it – operating systems have gotten pretty good at mapping virtual filesystems, because of the degree to which cloud systems are virtualized.
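
To make the “it’s just a file” point concrete, here is a minimal Python sketch that preallocates a container-sized file full of random bytes. The path and size are the hypothetical ones from the example above, and this does no encryption at all; TrueCrypt/VeraCrypt (or Windows’ own VHD support, described further down) is what actually turns a file like this into a mountable drive.

```python
import os

# Hypothetical path and size, borrowed from the example above.
# This only demonstrates that a "container" is an ordinary file;
# the encryption tool does the real work later.
CONTAINER = r"C:\Windows\Home.TC"
SIZE = 200 * 1024**3          # 200gb -- use something much smaller to experiment
CHUNK = 1024 * 1024           # write 1mb at a time so we never hold it all in RAM

with open(CONTAINER, "wb") as f:
    remaining = SIZE
    while remaining > 0:
        n = min(CHUNK, remaining)
        f.write(os.urandom(n))   # random bytes, so it already looks like crypto-noise
        remaining -= n

print(f"{CONTAINER}: {os.path.getsize(CONTAINER)} bytes")
```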

When I buy a laptop, I use Option 3, because it can be hard to add a whole new drive to a laptop, and Option 2 is unattractive because laptop device drivers are notoriously squirrelly and trying to get all the devices working after a bare-metal install can make you pull your hair out. Also, since I travel a lot and my laptop might be examined crossing national borders, having a virtual encrypted volume seems like a decent idea for annoying the secret police. Just make sure you remember your passphrase when they waterboard you, and de-key it when you leave your hotel (don’t de-key it while under video surveillance at an airport!)

tc-1

Adding a new virtual partition is pretty simple if you use TrueCrypt: you just follow any of a number of walkthroughs for it. But: whenever someone mentions TrueCrypt in a blog, they get emails from marketing people saying “TrueCrypt has been discontinued for security reasons, you should tell people to use our product!” That’s true. Feel free to use something else. But we’re mostly using TrueCrypt, here, for segregation, not trying to keep the secret police away from our stash of nuclear secrets.

To create a volume you just click “Create Volume” and then walk through the prompts.

tc-2

So I’m creating a large file that’s going to be called E:\temp\freethoughtblogs.secret, which will contain a file system that will appear to be a hard drive.

tc-3

You tell it the size you want (I told it a measly 500mb, since my archive of freethoughtblogs secrets is remarkably small). Do not use a password like “freethoughtblogs” that anyone might guess. Also, do not use a password like “you can waterboard me until I die, I will never tell you!” because you might find yourself yelling that at the secret police some day, and they don’t like surrealism.

tc-4

Then, it formats the drive, and you now have a 500mb file (or whatever) of crypto-noise.
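
If you want to convince yourself that it really is crypto-noise, a quick sanity check is to measure the byte entropy of the file: well-encrypted data comes out very close to 8 bits per byte, while text, executables, or a mostly-empty disk image score noticeably lower. A minimal sketch, assuming the container path from this example:

```python
import math
from collections import Counter

# Path from the example above -- adjust to wherever your container lives.
PATH = r"E:\temp\freethoughtblogs.secret"

counts = Counter()
total = 0
with open(PATH, "rb") as f:
    while chunk := f.read(1024 * 1024):
        counts.update(chunk)      # tally each byte value 0..255
        total += len(chunk)

# Shannon entropy in bits per byte: ~8.0 for good crypto-noise,
# noticeably lower for text, executables, or mostly-empty disk images.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"{PATH}: {entropy:.4f} bits/byte over {total} bytes")
```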

tc-5

Yup, looks like noise. Don’t delete that file – it’s your virtual hard drive. You can mount it using TrueCrypt to attach it as a filesystem:

tc-6

Above, you can see I selected freethoughtblogs.secret as the container file holding the encrypted filesystem. When I give the password and tell it to mount on drive F:, it appears on my system:

tc-7

And Windows sees it! (I had to rename it “Freethought” because, reasons)

tc-8

Now, if I only had one drive in my system (C:\) configured the way it came from Best Buy, I could use this trick to make a container file that’s encrypted well enough that, if I sold the computer, all I’d have to do is unmount the container file and forget the password (“freethought”) and it’s just a cloud of crypto-bits.
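
If you would rather script the mount/unmount cycle than click through the GUI, TrueCrypt has a command-line interface on Windows (VeraCrypt kept the same switches). The sketch below drives it from Python; the install path, container, drive letter, and password are just the ones from this walkthrough, and you should check the switches against your version’s documentation before relying on them.

```python
import subprocess

# Assumptions: TrueCrypt.exe is in its default install location, and the
# container/letter/password are the example ones from the walkthrough above.
TRUECRYPT = r"C:\Program Files\TrueCrypt\TrueCrypt.exe"
CONTAINER = r"E:\temp\freethoughtblogs.secret"
LETTER = "F"

def mount(password: str) -> None:
    # /quit performs the requested action and exits instead of leaving the
    # GUI open. Passing /password on the command line is convenient but
    # visible in process listings -- leave it off and let TrueCrypt prompt
    # you if that bothers you.
    subprocess.run([TRUECRYPT, "/volume", CONTAINER, "/letter", LETTER,
                    "/auto", "/password", password, "/quit"], check=True)

def dismount() -> None:
    # After this, the container is back to being a cloud of crypto-bits
    # until someone supplies the passphrase again.
    subprocess.run([TRUECRYPT, "/dismount", LETTER, "/quit"], check=True)

if __name__ == "__main__":
    mount("freethought")   # the terrible, example-only password from above
    # ... read and write files on F:\ here ...
    dismount()
```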

I want to emphasize this one more time:
I recommend this as a systems administration technique, not a security technique.

It’s a great way of making sure your data dies with you. Which, if you’re getting on in years, is something else to think about.**

Destroying hard drives is also a good excuse for an upgrade, if you’re running out of space and have been eyeing those new 1tb SSDs:

video by Marcus Ranum, taken with an Edgertronic slow-motion camera at 2000fps; shot with a .44 magnum JHP

For all intents and purposes, when your container is dismounted, it’s just crypto-bits and you don’t need to worry about what happens to the drive. If you use some kind of backup system, you can back up the contents of the encrypted volume to another medium (if it’s mounted, the contents are decrypted when you access them through the file system), or you can back up the encrypted container itself (which means you’ll back up the whole container each time).

I used to keep a storage network server on my home network, which had several large container files that I would mount with TrueCrypt – my accesses back and forth across the network were encrypted at the block level by TrueCrypt, and I didn’t have to worry if somehow someone stole my storage server (unlikely, but …). By the same token, when I finally decommissioned the storage server, I simply bagged it and dropped it in a dumpster; I knew the data was all in the containers and was all encrypted at the container level, and there was nothing on the server that was recoverable without the passphrases for the container files.
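
If you go the “back up the whole container” route, the nice property is that the backup job never sees plaintext. Here is a minimal sketch of that idea; the source and destination paths are hypothetical, and the container must be dismounted first so you are not copying a file that is still changing underneath you.

```python
import hashlib
import shutil
from pathlib import Path

# Hypothetical source and destination -- dismount the container before
# copying, otherwise the copy may be inconsistent.
SRC = Path(r"E:\temp\freethoughtblogs.secret")
DST = Path(r"G:\backups\freethoughtblogs.secret")

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(1024 * 1024):
            h.update(chunk)
    return h.hexdigest()

DST.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(SRC, DST)             # copies the whole container every time

if sha256(SRC) != sha256(DST):
    raise SystemExit("backup does not match source -- don't trust it")
print(f"backed up {SRC} -> {DST} ({SRC.stat().st_size} bytes, still encrypted)")
```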

Now, you can create a container file using Windows’ built-in capabilities, except – since it’s Windows – it’s much more painful than it needs to be. Also, if you want to use Windows’ built-in encryption (BitLocker) you need Windows Pro. Lastly, since the FBI never screamed bloody murder about Microsoft adding BitLocker to Windows, I assume that BitLocker’s backdoored. Remember, I am a professional paranoid. But also, the US Government has a deep and rich history of backdooring software; it seems absurd to imagine they’d refrain from pressuring Microsoft when they pressure AT&T to backdoor their own networks, Facebook to backdoor their messaging, Google to backdoor their email systems, Apple to backdoor their phones, etc.

If you want to know how to do a container file in Windows: you go into the disk management subsystem and create what’s called a VHD (Virtual Hard Disk):

vhd-1

Another small fixed-size virtual drive. Then you have to use Windows’ extremely awkward partition management to put a partition on it and format the virtual drive (be SURE you are not formatting a real drive by accident). Then Windows will see the drive:

vhd-2

That’s New Volume (I:) that I just created. So now I have a TrueCrypt container and a Windows VHD.

vhd-3
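
If you would rather not click through Disk Management, the same VHD can be built from a script: diskpart reads its commands from a file, so you can drive it from Python. The path, size, and drive letter below are hypothetical, it has to run from an elevated (administrator) prompt, and the “be SURE” warning above applies doubly, because diskpart will cheerfully format a real disk if you select the wrong thing.

```python
import subprocess
import tempfile

# Hypothetical path, size, and letter -- change them, and read the script
# twice before running it: diskpart has no undo.
VHD = r"C:\Users\mjr\secrets.vhd"
SIZE_MB = 500
LETTER = "I"

DISKPART_SCRIPT = f"""\
create vdisk file="{VHD}" maximum={SIZE_MB} type=fixed
select vdisk file="{VHD}"
attach vdisk
create partition primary
format fs=ntfs label="New Volume" quick
assign letter={LETTER}
"""

# diskpart takes a command file via /s and needs an administrator prompt.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write(DISKPART_SCRIPT)
    script_path = f.name

subprocess.run(["diskpart", "/s", script_path], check=True)
print(f"VHD attached as {LETTER}: -- turn on BitLocker on it if your Windows is Pro enough")
```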

I’m not going to illustrate the “turn on BitLocker” setup process because this machine is not running a version of Windows that’s professional enough to have BitLocker. The How-To Geek article [1] has a very detailed walkthrough, if you care.

There is one really nice feature of using VHDs and BitLocker, which is that Windows treats the container file as special and blocks all access to it while it’s mounted, so you can’t delete the container accidentally.

Wrapping up:

So, this has turned into a rather meandering voyage. The question was: “How do you handle data on devices, if you need to send them away for repair?” and my answer is necessarily “it depends.” Or “it’s complicated.” It depends on the device and whether the hardware is accessible or not. When I break an iPhone screen, which happens about once every couple of years, the last thing I do with the device as it’s dying is to go into the settings and reset it to factory defaults. Is that good? You’ve got to trust Apple, or put a bullet through the thing.*** If it’s a hard drive in a desktop, I’d pull it out before I sent it out of my hands (I already have backups of everything).

Really, you can’t ever answer a security question without taking into account your threat model: who’s your adversary and how strongly motivated are they to come after your data? Once you’ve established your threat model, then you can reason about what paths they are likely to pursue to come at you. I.e.: an angry internet troll is going to adopt very different attack strategies from NSA hackers, who are going to adopt very different attack strategies from the Russian mafia.****


(* That’s a hypothetical. I do my own repairs.)

(** My accountant is my executor, and has orders to physically destroy all of my hard drives by taking them to a machine shop and having someone take an oxyacetylene torch to them)

(*** And you’d better know where the memory is in the device so you hit it with the bullet. Or use an oxyacetylene torch.)

(**** When they ask for your password, you’re probably going to wind up in several dumpsters, in different parts of town, so it’s irrelevant: you’ll give it to them. Not that doing so will help you at all, or smooth your final moments.)

Bruce Schneier: Recovering Data From Cell Phones

Microsoft: How to Turn On BitLocker Drive Encryption

[1] How-To Geek: How to Create an Encrypted Container File With BitLocker on Windows

Peter Gutmann: Secure Deletion of Data from Magnetic and Solid-State Memory (Peter makes an interesting discovery: the NSA’s recommended data-wiping technique was good enough to prevent anyone but the NSA from reading a wiped disk. Coincidence! This was from 1995, and of course drive densities and different track-encoding techniques have probably changed the underlying assumptions completely. I remember when Peter gave this talk in ’95, and it brought down the house.)

Comments

  1. colnago80 says

    I haven’t used Windows for a long time but can one make an external USB drive bootable in Windows? This is no problem on a Mac (I have 7 external drives daisy-chained together using Firewire, 1 of which has a backup of OSX6, 2 of which have OSX10, 3 of which have OSX11 and 1 of which has OSX12; probably won’t be able to run off of OSX13 when it comes out later this year).

  2. johnson catman says

    “TrueCrypt has been discontinued for security reasons, you should tell people to use our product!” That’s true. Feel free to use something else.

    So, is there a current encryption software that you use or recommend?

  3. Dunc says

    I haven’t used Windows for a long time but can one make an external USB drive bootable in Windows?

    Yes, you can.

  4. Turi says

    @2 The source code of TrueCrypt has been picked up and is now developed under the name VeraCrypt. So far I have not heard anything that would discourage the use of VeraCrypt. TrueCrypt was also audited and has been found to be in an OK place, so you can use the old version of it as long as your system is still compatible.

  5. blf says

    As I recall, one of the colored books was about data destruction, and the recommendation in the book for destroying hard drives was to first grind it up into small pieces (they even specified the maximum “size” of a piece, something on the order of millimetre or two square, as I now vaguely recall), and then incinerate the lot. I assume they also specified the minimum incineration (e.g., time and temperature), but now only recall that a secure incineration facility be used.

    There were also recommendations if you couldn’t do the full grind-and-incinerate, but I have no specific recollections.

    All this was for ANY drive, not just those which may have at one point contained classified material, albeit drives known to be entirely unclassified their entire lives could be destroyed just by, as I now recall, either incineration or grinding, though both was still the recommendation.

    Overwriting, whilst useful, wasn’t much help (at least for the full set of concerns of the colored book, which included worrying about potential recovery of Top Secret material by sophisticated agents) — slow, too error-prone, and (at least for the magnetic drives of the time) it was still possible to recover the contents despite repeated overwritings (there were traces of the older contents at the margins of each sector, as I now recall). If memory serves me right, even a complete track reformatting (recreation of the sectors) was inadequate, as at least the inner and outer side-margins of the older sectors could be analyzed.

    (This was in the magnetic disc era, SSDs were then truly exotic esoteric expensive creatures used by Cray supercomputers and similar. As I sortof recall, the book also talked about magnetic tape destruction, but I have no memory of optical media (e.g., CDs, which were new-ish then). I think printer listings and punched cards (this was a long time ago!) fell under the destruction of paper materials (different set of guidelines)…)

  6. Marcus Ranum says

    johnson catman@#2:
    Use VeraCrypt, or an old version of TrueCrypt (you can find them). VeraCrypt is fine.

    The story of TrueCrypt’s shutdown is still shrouded in mystery but many of us suspect that the authors chose to shut it down rather than respond to a national security letter ordering them to make certain modifications to their software. That’s just strategic speculation; however, you might find this supportive of the theory:
    http://www.newyorker.com/tech/elements/how-the-government-killed-a-secure-e-mail-company

  7. Marcus Ranum says

    blf@#5:
    As I recall, one of the colored books was about data destruction, and the recommendation in the book for destroying hard drives was to first grind it up into small pieces (they even specified the maximum “size” of a piece, something on the order of millimetre or two square, as I now vaguely recall), and then incinerate the lot.

    I believe it was the light blue one. But I never read any of them carefully; they’re a cure for insomnia and a cause of amnesia.

    The specific degaussing guide Peter Guttmann was referring to is here:
    https://fas.org/irp/nsa/degausse.pdf
    Meanwhile, as you said, NSA’s approach to dealing with media was to shred it then melt it. Nobody ever figured out a way to reconstitute data off a melted puddle of aluminum chips that had been intermixed – perhaps Maxwell’s Demon could do it, but it’d be hard even for that wee beastie.

    In case you are ever having trouble sleeping, the whole series is here:
    https://fas.org/irp/nsa/rainbow.htm

    When I was doing my early work at TIS I was on the Trusted Mach kernel project for about 3 weeks (while I desperately tried to talk someone who had extended me a job offer elsewhere into re-opening the position) and I read the orange book stuff and was going “WTF” – it’s a way to build unbuildable useless F-35s, basically.

  8. Marcus Ranum says

    colnago80@#1:
    I apologize for not talking about Macs; I don’t know anything about Mac encryption options since the last time I touched one, which was 1992. So I didn’t want to say anything inaccurate.

    I assume that the Mac’s model is “Trust Apple” and that most Mac users do that. It seems to be the approach in iOS (which I do have and use) (And I am skeptical of Apple’s claims that it’s difficult to break into an iOS device’s files even with Apple’s help, because cloud backups are perfect for offline attacks on keys.)

  9. Sunday Afternoon says

    @blf #5, and to amplify Marcus’ comment about the underlying assumptions behind Peter Guttmann’s observations changing.

    The density of magnetic recording has gone up by over 1000 times, from less than 1 gigabit per square inch (Gbpsi) to routinely over 1000 Gbpsi for experimental results reported in the literature. The recovery technique described by Guttmann is magnetic force microscopy (https://en.wikipedia.org/wiki/Magnetic_force_microscope) – there’s no surprise that the illustration in that article shows measurements of areas of hard drive platters!

    Back in 1995, the read head sensitivity was low, especially compared with MFM tips. While most magnetically overwritten data could not be recovered by the read head, enough residual magnetization as you describe could remain for detailed MFM study to enable data recovery. Current tunneling magnetic resistance heads, coupled with reductions in spacing between head and recording medium, make the sensors much closer to MFM tips in terms of both sensitivity and resolution. My suspicion is that this reduction in the difference between read-sensor and MFM-tip sensitivity (even for the latest carbon nano-tube tips) makes overwriting old data so that the read sensor cannot detect it more secure now than it was in 1995.

  10. Pierce R. Butler says

    … none of Anthony Wiener – though you can bet a forensic examination of his storage would be interesting!

    Only for those interested in [*ahem*] self-portraits of Anthony Weiner: he doesn’t seem to have scored with anybody else – with the possible exception of his ex-wife – through his, ah, digital efforts.

    Marcus Ranum @ # 8: I assume that the Mac’s model is “Trust Apple” and that most Mac users do that.

    Maybe they do, but (particularly on a basic buyer-seller basis) the World’s Largest Market-Cap Corporation® has long since invalidated that approach forever – and leaks from the iCloud have already drowned several celebs’ privacy.

  11. colnago80 says

    Re Pierce Butler

    I don’t use the Cloud. That’s why I have 12 Terabytes of external disk storage divided up amongst 7 external hard drives.

  12. Marcus Ranum says

    Pierce R. Butler@#10:
    leaks from the iCloud have already drowned several celebs’ privacy.

    Interesting sub-point:*
    iCloud didn’t “leak”; it functioned as it was supposed to. The problem was with how it was supposed to function. It’s a problem lots of computers and devices have – namely that their “restore” function amounts to a recovery path that allows all the data in the device to be stolen. If you’ve got your pictures in the cloud and I can social-engineer you into resetting your password (or I can guess it), I can buy another device and then tell the cloud “this is my new device,” and it cheerfully does what it’s supposed to do, and gives me a copy of all your files.

    (* Well, interesting to a security practitioner)

  13. Marcus Ranum says

    colnago80@#11:
    That’s why I have 12 Terabytes of external disk storage divided up amongst 7 external hard drives.

    If it’s important to you, I hope you have another copy in a safe deposit box in a bank somewhere.