According to David Lareau, Medicomp’s then COO (and current CEO), everyone was pretty calm at midnight, despite the anxious worldwide buildup preceding the date: January 1, 2000, the start of the year known as Y2K.
Stefan Weitz, on Microsoft’s IT team at the time, left his network operations center around 1 am West Coast time, feeling like the turn of the century had been “a little anticlimactic.”
Bill Huber, then a chief procurement officer at a big bank, was home with his family. He didn’t get any worrying calls.
Not a bad evening for the IT practitioner, considering some economists and corporate execs had warned that widespread computer failures could lead to a recession or even fatal catastrophes when machines mishandled the approaching date. The fear was that a two-digit formatting decision made during computing’s early days would lead to machines misinterpreting the year “2000” as the year “1900,” disrupting payroll systems, aircraft, power plants, and any other technology that relied on accurate calendar data.
Organizations facing the new century had to make sure their systems’ code could handle the number 2000, and most orgs succeeded, minus a few disruptions: a power-plant timing issue here, a spy-satellite shutdown there. IT pros who spoke with us planned for the event well in advance, taking inventory and updating outdated code.
Some saw Y2K as an event where everyone overreacted.
“I think emotionally, it felt that way. But really, what it was was that we did come together in a very quick period of time, analyze the problem, figure out what we needed to do, and most people did figure it out,” Roger A. Grimes, data-driven defense evangelist at KnowBe4 (who even had a Y2K consulting business), told us.
Crash course. One IT pro who felt some relief after all the confetti settled was Danny Allan, then a senior consulting analyst at Canada’s Carleton University. In addition to upgrading the school’s thousands of machines with vendors’ updated-for-Y2K software (which often required a campus ride on a golf cart, he remembers), he and the team had to prepare for the new century by scanning the code of a six-foot-tall mainframe. The tractor-trailer-sized computer housed the university’s research, grades, and accounting—essentially the school’s ERP, Allan said.
“If the university couldn’t actually access the customer records and the payment records, that was going to be a significant problem,” he told us.
Eighteen months in advance, Allan and his IT colleagues looked for code variables that stored dates and made sure the years in them used four digits, not the two digits that would lead to a “subtraction problem.”
The subtraction problem. Programming languages like COBOL, to save memory space, gave coders the option to represent years with two digits, nixing the “19” in 20th-century figures. A year like “1970” might be stored in data structures as “70.”
If a program had to make calculations based on dates, suddenly “00” could mean 1900 or 2000, which led to some mathematical confusion. If that same program had to calculate someone’s age by subtracting their birth year from the present year, the correct interpretation of “00” became important.
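To see the arithmetic go wrong, here’s a minimal Python sketch (a hypothetical illustration, not code from any of the systems described here) comparing an age calculation done with two-digit years against one done with four:

```python
# Hypothetical illustration of the Y2K "subtraction problem":
# years stored as two digits wrap around at the century boundary.

def age_two_digit(birth_year_2d: int, current_year_2d: int) -> int:
    """Age computed the pre-Y2K way, with two-digit years."""
    return current_year_2d - birth_year_2d

def age_four_digit(birth_year: int, current_year: int) -> int:
    """Age computed with full four-digit years."""
    return current_year - birth_year

# Someone born in 1970, checked in 1999: both approaches agree.
print(age_two_digit(70, 99))        # 29
print(age_four_digit(1970, 1999))   # 29

# The same person checked in 2000: "00" minus "70" goes badly wrong.
print(age_two_digit(70, 0))         # -70
print(age_four_digit(1970, 2000))   # 30
```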
Many date fields had to be widened from two characters to four. Allan updated COBOL code from a terminal computer, a sandbox environment that connected to the mainframe.
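As a rough sketch of what that widening looked like in practice, here’s a hypothetical Python version of two common remediation strategies of the era, full field expansion and “windowing” (guessing the century for legacy two-digit values with a pivot year); the actual work described here was done in COBOL, and the pivot value below is an assumption for illustration:

```python
# Hypothetical sketch of two Y2K remediation strategies:
# full field expansion and, where expansion wasn't feasible, windowing.

PIVOT = 50  # assumed cutoff: 00-49 -> 2000s, 50-99 -> 1900s

def expand_year(two_digit_year: int, pivot: int = PIVOT) -> int:
    """Convert a legacy two-digit year to four digits using a window."""
    if not 0 <= two_digit_year <= 99:
        raise ValueError("expected a two-digit year")
    century = 2000 if two_digit_year < pivot else 1900
    return century + two_digit_year

def expand_record_date(yymmdd: str) -> str:
    """Rewrite a legacy YYMMDD record field as YYYYMMDD."""
    yy, rest = int(yymmdd[:2]), yymmdd[2:]
    return f"{expand_year(yy):04d}{rest}"

print(expand_record_date("700101"))  # 19700101
print(expand_record_date("000101"))  # 20000101
```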
And the remediation problem was not limited to COBOL; if it had been, Y2K prep would’ve been a breeze, Grimes told us in a follow-up email. Coders looking to save a few seconds could store two-digit years in other languages, too, and, leading up to New Year’s Eve, IT pros had to look closely at how dates were stored and interpreted in all their programs and scripts.
More prep. A miscalculation of age would have been an especially troublesome scenario for Dan Lohrmann, CIO for Michigan’s Department of Technology, Management, and Budget from 1997 to 2000. Beginning in 1997, he and his team of 200 reviewed network infrastructure, software, and the state’s retirement system, which ran on lots and lots of COBOL code.
“Upgrading the COBOL code: That was probably 90% of our problem,” Lohrmann said. And the years of work paid off; he said New Year’s Eve was a non-event. “It was more of a New Year’s Eve party,” he said.
Weitz spent much of 1999 testing Microsoft’s internal applications, like payroll and messaging. As midnight hit New Zealand, Weitz had checks to do: Could you log into the network? Could you send an email to New Zealand’s Exchange servers? Could you send instant messages?
Those tests had to happen for each time zone as it approached midnight. He started to feel relief as the East Coast began its celebrations.
“If our first US core set of services didn’t tip over, likely we were probably out of the woods,” he said.
Weitz remembers his Y2K prep as a “solid year-long project” of “voluminous” test cases: ripping code from production, placing it in a testing environment, running the clock forward, and seeing what worked or didn’t. Anything that broke went to a development team for fixes, then a retest.
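A modern, stripped-down version of that kind of clock-forward test might look like the hypothetical Python sketch below, where the function under test takes the current date as a parameter so a test can simulate the rollover without touching the system clock (this is an illustrative sketch, not Microsoft’s actual harness):

```python
# Hypothetical "run the clock forward" test: the function under test
# accepts the current date, so tests can simulate December 31, 1999
# and January 1, 2000 without changing the machine's real clock.
from datetime import date
import unittest

def membership_expired(expiry: date, today: date) -> bool:
    """Return True if a record's expiry date has already passed."""
    return today > expiry

class RolloverTest(unittest.TestCase):
    def test_before_rollover(self):
        # December 31, 1999: a mid-2000 expiry date is still in the future.
        self.assertFalse(membership_expired(date(2000, 6, 30), date(1999, 12, 31)))

    def test_after_rollover(self):
        # January 1, 2000: the same record must still be treated as valid.
        self.assertFalse(membership_expired(date(2000, 6, 30), date(2000, 1, 1)))

if __name__ == "__main__":
    unittest.main()
```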
On January 1, the team girded for a crisis that didn’t arrive.
“Nothing happened, which is a testament to all the work that people did,” Weitz said.
Countdown. Twenty-five years later, IT pros face unpredictable threats like ransomware that can take infrastructure down all at once; many healthcare orgs, for example, experienced costly, unexpected downtime in 2024. Emerging technologies like AI may require developers to modify existing code. And if that’s not enough: the National Institute of Standards and Technology urges orgs to update their encryption by 2035 to defend against quantum-computing attacks, an effort Deloitte is calling “Y2Q.”
While Y2K did not lead to a large-scale IT crisis, the pros supporting preparations had an advantage: a clear deadline.
Y2K was a success, Grimes said—a demonstration that “we could do big, grand things together as a world.”
Threats like cyberattacks or quantum encryption breakers won’t occur at an agreed-upon time, which hinders the kind of large-scale coordination on display in the years before 2000. (Or 00, if you’re still speaking COBOL.)
“Having a defined date was probably a blessing,” he said.
This is one of the stories of our Quarter Century Project, which highlights the various ways industry has changed over the last 25 years. Check back each month for new pieces, and explore our timeline featuring the ongoing series.