Remember the Y2K panic? There were fears that computers that came into widespread use in the mid-twentieth century, when the year 2000 seemed very far away, had been programmed with internal clocks that assumed the first two digits of the year were 19. The concern was that when the year 2000 rolled around, many systems would crash or go awry because the computers would think the year had suddenly reverted to 1900. There were fears of planes crashing, power systems going down, the banking and financial sectors going haywire, and so on.
There was a lot of activity among computer professionals to solve the problem before the end of 1999. I personally did not do anything, since I am not a computer professional. I also tend not to wait up on December 31 for the New Year to roll around, but that year I decided to stay up until midnight to see what might happen. And there was … nothing. Everything was just as before.
As a result, people started speculating that Y2K was an overhyped threat, that there had never been any serious danger, and that the only thing the panic did was provide plenty of opportunities for computer engineers to find work. But in a letter to the editor in a recent issue of The New Yorker magazine, someone who was intimately involved in the efforts to prevent a disaster says that that is the wrong lesson to take away from the episode.
Indeed, in the past twenty-four years, Y2K has become a stand-in for any much-hyped threat that turns out to be insignificant.
In reality, however, Y2K is a rare example of a known problem that was successfully addressed in advance through the combined efforts of government and private industry. I am one of the many programmers who worked on Y2K preparation. Like my colleagues, I waited at home on New Year’s Eve, 1999, watching anxiously, and then triumphantly, as the lights stayed on in country after country while clocks across the globe struck midnight. These efforts could have been turned into an example of how to coördinate catastrophe prevention, if they hadn’t been so successful that the general public now believes the threat didn’t really exist.
This is often the case. We tend to remember the disasters that happen and forget the ones that are successfully prevented. The Covid pandemic, for example, will be remembered for a long time, but there have been other pandemic scares in the past, such as SARS, swine flu, avian flu, and Ebola. I recall my university holding all manner of meetings to warn of the dangers of some of those and recommending actions to at least mitigate them. In the end, those diseases did not get completely out of control, or were limited to relatively small geographical areas.
Was it the actions taken that prevented a much larger catastrophe? Or was the threat overhyped in the first place? It is hard to tell because we cannot do a controlled experiment.
It is like taking vaccines to prevent diseases such as Covid. If we do not get the disease, was it the vaccine that prevented it? Or (as vaccine skeptics claim) was there never any virus at all, or did it never pose any real threat, or did some random treatment promoted by random people prevent them from getting it? It is hard to convince such people of the efficacy of vaccines precisely when the vaccines succeed in suppressing a disease for a long time. We now have people thinking that measles, whooping cough, and other childhood diseases are not really that serious, because vaccines have been so successful for so long that people have no idea how serious a threat to children's health those diseases are.
Things that do not happen are harder to study than things that do.
flex says
IIRC, the most common fix for the Y2K bug was to add a line of code saying that if the two-digit year was equal to or less than 30, the software should treat it as 20XX, and if it was greater than 30, as 19XX.
This was done on the assumption that by 2030 no critical software would remain in use that relies on a two-digit year.
I wonder how true that assumption is? Well, we may find out in six years.
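A minimal sketch of that windowing rule (in Python rather than the COBOL most of these systems actually ran; the pivot of 30 comes from the comment above, and real systems chose various cutoffs):

    PIVOT = 30  # two-digit years at or below this are treated as 20xx

    def expand_year(yy: int) -> int:
        # Expand a two-digit year to four digits using a fixed pivot.
        return 2000 + yy if yy <= PIVOT else 1900 + yy

    assert expand_year(5) == 2005   # "05" -> 2005
    assert expand_year(99) == 1999  # "99" -> 1999

The fix never touches the stored data; it just reinterprets it on the way through, which is why it quietly expires once real years pass the pivot.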
mikey says
I probably worked twice as many hours in ’99 as in other years, mitigating the Y2K bug. (Paid off my house, too!) There is good reason not much bad stuff happened.
xohjoh2n says
@1, then another 8 years after that for the 2038 bug…
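For anyone unfamiliar: many Unix-derived systems store time as a signed 32-bit count of seconds since January 1, 1970, which runs out early on January 19, 2038. A quick Python illustration:

    from datetime import datetime, timezone

    # A signed 32-bit time_t tops out at 2**31 - 1 seconds after the
    # Unix epoch (1970-01-01 00:00:00 UTC).
    INT32_MAX = 2**31 - 1
    print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00 -- one second later, a 32-bit clock wraps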
JM says
I know a couple of older programmers who came out of retirement to work on Y2K stuff because they were pulling in ridiculous hourly rates. Work on Y2K for a couple of months and they could afford to move into a nicer house.
I think a lot of people don't realize just how dangerous SARS was, either. SARS is why China had such a strict Covid lockdown. SARS was pretty deadly -- the official fatality rate is more than 10% -- but not very infectious. By locking down the areas with SARS, everybody with the disease either died or recovered in short order. That was the wrong strategy for Covid, and China didn't change course even long after this was clear.
Marcus Ranum says
The systems that control power grids, etc., were pretty quickly fixed in their upgrade cycle. The big concern was billing systems, medical insurance and records, auto licenses and registration -- little things like "will the automated parking lot robot start refusing to let people out?" Very quickly it changed from "civilization may collapse" to "people could be inconvenienced" -- I personally know a guy who made sure that a major hotel chain's computers would not treat guests' stays as running from 0099 AD to 2000 AD -- 2000 years at $200/night would take a lot of time for front desk teams to fix, so he made sure the computers did not let it happen.
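Back-of-envelope for that scenario, just to show the scale of the bug (a sketch, not the hotel chain's actual numbers):

    # A one-night stay misread as running from year 99 AD to 2000 AD.
    years = 2000 - 99          # 1901 years
    nights = years * 365       # 693,865 nights
    bill = nights * 200        # at $200/night
    print(f"{nights:,} nights -> ${bill:,}")
    # 693,865 nights -> $138,773,000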
The problem was that it took a while to teach the idiots in the media the significance of the problem, and once they understood they had a potential scare story, they stopped listening to the tech workers who were saying "we're fixing it." They just kept banging on with the end-of-the-world story. Y2K was when I realized that the media are largely lazy and stupid.
TGAP Dad says
Computers have never assumed the date was within the 20th century. From my perspective, the primary problem was that the languages used to produce software often provided a shortcut command to get the date that, by default, used only the last two digits of the year. A COBOL example is:
MOVE CURRENT-DATE TO MY-REPORT-DATE, where CURRENT-DATE is the system's shortcut, and came formatted as mm/dd/yy in the US. There were other ways to get a proper date, including a 4-digit year, but they were more cumbersome.
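The same shortcut exists in most languages' date formatting; here is a Python analogue of the two- versus four-digit choice (illustrative only -- the COBOL above is what the mainframes actually ran):

    from datetime import datetime

    now = datetime.now()
    print(now.strftime("%m/%d/%y"))  # 12/31/99-style: the convenient default
    print(now.strftime("%m/%d/%Y"))  # 12/31/1999: one character more honest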
Rob Grigjanis says
My second career as a programmer started in 1997, and the Y2K stuff started shortly after that. I remember one concern was some banking code which used (19)99 as meaning the client had expired. Interesting times, and a steep learning curve for a newcomer. In hindsight, the only bummer was that consultants were being paid two or three times my salary for doing about the same level of work I was doing after a few months. But I found it very interesting, sometimes even fun.
Matt G says
Vaccines are the poster child (children?) for the phrase “a victim of its own success.”
Pierce R. Butler says
I worked for an organization which had a mini-crisis circa June 30, 1998, when credit cards with a Y2K expiration date started getting rejected. The banks got to work and fixed that PDQ.
What worried me more, a year later, were reports that millions of regulatory processors throughout the electric grid might all start popping "divide by zero" errors when their little internal clocks hit a date by which the engineers had figured they'd long since have been replaced. Not sure how that got addressed, but I suspect somebody took a few into a lab, artificially accelerated their clocks, and heaved the mother of all sighs of relief when they kept ticking.
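One plausible shape for that failure, purely as illustration (I don't know what those processors actually computed):

    def average_rate(total_flow: float, start_yy: int, end_yy: int) -> float:
        # Elapsed time from two-digit years: 99 -> 00 gives -99, and code
        # that clamps "impossible" negative intervals ends up with zero.
        elapsed = max(0, end_yy - start_yy)
        return total_flow / elapsed  # ZeroDivisionError at the rollover

    average_rate(1000.0, 99, 0)  # blows up the moment '99 ticks over to '00

Any rate-per-unit-time calculation whose denominator comes from two-digit year arithmetic has that landmine built in.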
Raging Bee says
I was working for Raytheon in the late 1990s, and I often heard my boss’s boss saying we should get well ahead of the curve by making our IT systems “Y10K compatible.” Not sure what that would have entailed beyond the Y2K efforts; but at least it would have sounded good as a marketing boast…though I’m not sure why a weapons-system built in the 21st Century would have to be Y10K compatible…unless we’re fighting invaders from the future…?
John Morales says
[ Was a joke, RB. (10)base2 = 2(base 10) ]
angoratrilobite says
It would have been a disaster for us if we hadn’t changed every single date in every database we had. Our FICO would have been a nightmare to untangle and our A/R would have freaked.
It was a hierarchical database so the unload and reload was a pain in the ass but we got it done. I still remember COBOL fondly.
chigau (違う) says
This was never a problem for Apple.
Raging Bee says
So Apple/Mac geeks just sneered down at all us Microsoft/Intel shmucks that whole time? Why didn’t I hear about that sooner?
chigau (違う) says
Dunno.
Maybe your computers had a
SYNTAX ERROR
SYNTAX ERROR
SYNTAX ERROR
Raging Bee says
I’m a writer! How can I get syntax errors?!
lanir says
I think the Y2K thing was just the first time a lot of companies realized that back-end software they paid for once might have to be rewritten or have serious updates applied.
Not everybody learned their lesson then, either. There was a similar issue around software with a web interface written specifically for Internet Explorer 6 or 7. I saw it myself at a company I worked for -- in 2014, several years after anyone had any business running IE 6 at all -- and they were still using it when I left for a new job in mid-2014.
sonofrojblake says
There’s the flipside: Scott “get the hell away from Black people” Adams, the guy who used to draw Dilbert and now shills its supposed continuation to his fans on some private site, had a Law of Slow-Moving Disasters, of which Y2K was one. Back when he had a hugely popular cartoon and an international platform, he could get his message out. His thesis is that if you see something coming from far enough away, human ingenuity and market forces combine to make sure it won’t be a problem, because scientists and engineers will work way hard and just solve it, so there’s no need to worry. Y2K is the ultimate demonstration of this: it WAS a big threat, but the people who mattered knew that, and could fix it so that it could seem like it never was.
You can guess where this is going, maybe.
If you guessed “climate change”, pat yourself on the back. No need to worry about it folks, it’s a Slow Moving Disaster. So it’ll never get here, because scientists and engineers have seen it coming far enough out that it’ll just get fixed. Hurray!
(I’m going to tag those last two sentences as sarcasm, because I’m keenly aware there are those here who won’t recognise it as such without help.)
DrVanNostrand says
It’s not just Y2K. I’ve had climate deniers use the ozone hole as an excuse. “What ever happened to that? The big problem all the paranoid nerds obsessed about just disappeared.” I’d tell them to their face that it’s only because we successfully eliminated the vast majority of CFC use through international treaty, AND that the ozone layer still hasn’t fully recovered. They said it was all lies.
sonofrojblake says
Oh, the ozone hole, don’t get me started, I’ve had the same conversation. It’s actually a really good example.
“It just went away”. Yes, it did. You know why? Because we identified a single, specific cause (CFCs) and just… stopped. Completely banned almost* all uses of them, practically overnight, and introduced MORE IMMEDIATELY DANGEROUS alternatives instead.
Yes, you read that right -- before the Montreal protocol, an aerosol can would just spray deodorant onto your armpit -- the CFC propellant was, by design, practically inert. Post-Montreal we went back to using butane as a propellant, and your body spray can, if you need it to and have a lighter handy, double as a handy flamethrower. In industrial settings, liquid ammonia -- which is horribly toxic and smelly -- is a commonly used alternative refrigerant. You can tell if there’s a small leak because a few ppm in the air will make your eyes water at 50 metres -- it’s very nasty stuff.
I don’t remember much public backlash at the time. And, over the course of years, the problem… went away, because the cause just stopped completely. So you want the same for climate change? Great. Stop burning oil and gas RIGHT NOW. I mean yeah, ideally stop completely forty or a hundred years ago, but leaving fantasy aside if you want a solution like we had for the ozone layer, STOP, completely RIGHT NOW, overnight, globally. No more petrol or diesel engines. No more coal or gas-fired power stations. No more fossil-fuelled airliners or freight trains or ships with engines, RIGHT NOW. That’s what it would take to put us in a position where climate change MIGHT start to repair itself DECADES from now, if we’re lucky.
*Not all use of CFCs was banned. The raw material for the manufacture of PTFE (“Teflon”) is chlorodifluoromethane (R22), a commonly used refrigerant gas. Post-Montreal, its use as a refrigerant was banned, but since it is chemically transformed in the process into a non-ozone-depleting substance, its use for PTFE manufacture was not. The only problem was, PTFE manufacture used some tiny percentage of the global capacity for R22 production. Plants that had been turning out hundreds of thousands of tonnes of the stuff (and selling perhaps a couple of thousand tonnes for PTFE) suddenly saw 99% of their trade made illegal. Bad for them. But also bad for the PTFE manufacturer, who went from being a tiny customer of a high-quality local source that delivered reliably to having to find an international source with much less control over quality and delivery schedules. The modern world is built almost entirely on complex, interdependent and very fragile webs like this. It’s amazing it works as well as it does.
Holms says
Your post reminds me of the bellyaching workplaces sometimes see about the safety rules. “Why does this warehouse ban lifting anything over 15kg without assistance? I’ve been working here X years and I’ve never seen a single back injury!” -- the safety rule has prevented the thing it is designed to prevent, duh, and so the rule’s necessity is not visible.
Jazzlet says
sonofrojblake @20
I don’t think there was much backlash, but most of the required action didn’t directly affect the public: it involved manufacturers changing what they did, and as far as the public were concerned the fridges went on working and the aerosols went on spraying. Action on climate change will change people’s lives in far more obvious ways. There has long been research showing that this makes a big difference in how people respond to proposed environmental action -- “sure, plant more forests, but don’t take away my internal combustion engine”.
jenorafeuer says
Necro-ing this thread a bit (I was on vacation for three weeks) but just to add my own comment on this as someone who worked as a programmer during that time…
Was it overhyped? Yes, and no.
Yes, it was overhyped in that a lot of the more panic-inducing stories (Planes will fall from the sky! Your coffee maker will explode!) were clearly never going to happen.
No, it wasn’t overhyped, because until those panic-inducing stories became public the big business (and mostly tech-ignorant) C-suite people who control the purse strings weren’t going to bother actually budgeting to fix the problem.
So really, the problem only got fixed because of the irrational overhyped stories that anybody with any sense knew were false; it took those to make the people without sense see that the problem might actually affect them.
When my father asked me, during the early build-up to the panic, what I thought would be a good business to invest in, my response was ‘payroll outsourcing’. One of the biggest real issues was going to be all those old big corporate payroll databases that had been built on machines twenty years before, which everybody had assumed would be long since replaced. (The original programmers were still thinking in the old mainframe mode, where code had to be rewritten for each new machine; by the time the 1970s and 1980s came around, more standardized computing environments meant you could just recompile the exact same code without rewriting it for the new machine, so you had a generation of people porting the code to new equipment who’d never actually had to read or understand it.)