Peter de Jager had New Year’s Eve 1999 all mapped out.
The Canadian computer consultant had spent the past few years traveling the world — visiting 45 countries and conducting over 2,000 media interviews to warn of the imminent dangers of the “Y2K” computer programming problem.
Now, with his work all but done, he was all set to head to Ireland’s west coast to enjoy a pint in his favorite bar, O’Connor’s in Doolin, County Clare.
“It’s my happy place,” he told The Post.
But as New Year’s Eve drew ever nearer, the world’s fears of what might occur — ranging from the catastrophic to the comedic — grew exponentially: People everywhere were worried that bank accounts would be wiped out, energy supplies would be cut off, people would die in hospitals and, of course, personal computers would explode instantaneously.
So de Jager changed tack.
“People kept saying the planes were going to fall out of the sky. So I decided to take a flight from Chicago to London Heathrow, making sure I was in the air as the clocks turned midnight,” he recalled. “So I called United Airlines. They gave me a ticket — I didn’t even buy it.
“And it was just to demonstrate that I had no concerns about flying.”
What he thought mattered — after all, he had fueled a lot of the fear.
Thirty years ago, on September 6, 1993, de Jager wrote a landmark article in Computerworld magazine highlighting the scale of the challenges that needed tackling before one century turned to the next at midnight on Jan. 1, 2000.
Entitled “Doomsday 2000,” the story clearly and concisely set out some of the likely consequences of the date change and the huge efforts it would take to try to mitigate them.
But many didn’t get it, perhaps because of what de Jager told them when they asked what was at stake.
“They’d say, ‘What’s the worst that could happen?’” de Jager recalled. “And I’d say, ‘Well, people could die.’”
The US government established the Center for Year 2000 Strategic Stability with the Russian Federation so they could work jointly on mitigating any false positive readings in each of their nuclear missile early warning systems, lest they inadvertently bomb each other.
John Hamre, the US Deputy Secretary of Defense, drew comparisons with catastrophic natural disasters.
“The Y2K problem is the electronic equivalent of the [1998 storm] El Niño and there will be nasty surprises around the globe,” he said.
For others, Y2K was a sign of something much bigger.
From radical religious groups to extreme right-wing organizations, Y2K seemed to signal the end of the world they had long been anticipating.
At the Preparedness Expo in Atlanta in June 1998, sales of survival equipment for the coming apocalypse boomed, while one militia leader, Norm Olson, predicted Y2K would “be the worst time for humanity since the Noahic flood.”
On Pat Robertson’s Christian Broadcasting Network, Y2K stories like “Countdown to Chaos: Prophecy for 2000” and “The Year 2000: A Date With Disaster” filled the airwaves.
“It was seen as the coming of the end times and used by the preppers and survivalists,” de Jager told The Post.
The Y2K computer issue, as de Jager explained it, was the result of cost-cutting and convenience in the early days of programming.
With computer memory prohibitively expensive, early programmers abbreviated year dates to just two digits — “99” for “1999” — to save disk space and money.
The problem, however, was that when 1999 became 2000, computers would then recognize the year as 1900.
And that meant potential chaos.
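To make the mechanics concrete, here is a minimal sketch of that shortcut in modern Python. (The real systems were largely COBOL and other mainframe code; the function names below are purely illustrative.)

```python
# A toy illustration of the Y2K shortcut: only the last two digits of the year
# are stored, and the century is assumed rather than recorded.

def store_year(year: int) -> str:
    """Save a year as two digits, as many early systems did to conserve memory."""
    return f"{year % 100:02d}"

def load_year(two_digits: str) -> int:
    """Read the year back, assuming the 1900s -- the assumption that broke in 2000."""
    return 1900 + int(two_digits)

print(load_year(store_year(1999)))  # 1999 -- works as intended
print(load_year(store_year(2000)))  # 1900 -- the Y2K rollover
```

Any age calculation, interest charge or schedule built on those stored values would then treat the year 2000 as a date a full century in the past.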
But the scale of Y2K’s potential danger was massively underestimated at first.
“Many organizations just didn’t realize how big the problem was. They would say, ‘So just put a couple of extra digits in,’” said de Jager. “But some of these systems have over a billion lines of code and you had to go through all of them.
“And it’s like pick-up sticks; you can’t change one line of code without also thinking about the impact on any other line of code.”
Within two years of de Jager’s article, the problem would get a branding makeover that caught the public’s attention.
David Eddy, a computer programmer from Massachusetts, termed the problem Y2K, a snappy alternative to what others were calling Century Date Change (CDC) and Faulty Date Logic (FADL).
Others, meanwhile, were calling it the Millennium Bug.
That was not only misleading but also failed to convey the seriousness of just what was at play.
“A computer ‘bug’ refers typically to an error, mistake or flaw in a computer program,” said Robin Guenier, the Executive Director of Taskforce 2000, the United Kingdom’s response unit designed to raise awareness of the Y2K problem.
“And Y2K was not a bug — nor was it a ‘virus.’”
The global panic was born of consumer ignorance and a media sensing a potentially cataclysmic story, according to Michael Halvorson, a technical writer and history professor at Pacific Lutheran University.
“Y2K raised alarms because computers were still relatively new and computer programming was not well understood by the general public,” Halvorson told The Post.
“There were front covers of Time and Newsweek predicting the end of the world — IT people were not doing that,” de Jager added.
Although it’s remembered as something of a joke now, when 1999 became 2000, there were, inevitably, some issues.
Some, like 150 slot machines going haywire at a racetrack in Delaware, were comparatively minor.
Others — such as the Naval Observatory, the nation’s official timekeeper, and presidential candidate Al Gore’s website both displaying the year as “19100” — were soon corrected.
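The odd-looking “19100” has a well-known explanation, recreated below in Python: in C’s standard time library, the tm_year field counts years since 1900, and code that pasted a literal “19” in front of that number worked right up until the count reached 100. Whether this was the exact code behind those two sites is an assumption, but it was the textbook cause of that display.

```python
# Recreating the classic "19100" glitch. In C's <time.h>, struct tm's tm_year
# counts years since 1900, so on Jan. 1, 2000 it held the value 100.

years_since_1900 = 2000 - 1900           # the value tm_year held at the rollover

print("19" + str(years_since_1900))      # "19100" -- the buggy, concatenated display
print(str(1900 + years_since_1900))      # "2000"  -- the correct construction
```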
At a Super Video rental store in Colonie, NY, one customer was charged an overdue fee of $91,250 for failing to return their copy of the John Travolta thriller “The General’s Daughter” on time.
Thanks to Y2K, the store’s system calculated that the rental was actually 100 years overdue, even though the movie had only been released in 1999.
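The arithmetic behind that bill: a century overdue is roughly 36,500 days, and $91,250 spread across 36,500 days works out to $2.50 a day, presumably the store’s standard late fee.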
And there were some more shocking consequences.
In the UK, failure to address Y2K on a National Health Service PathLAN computer meant 154 pregnant women in the English city of Sheffield were given incorrect Down syndrome test results. Two pregnancies were terminated as a result, and four babies with Down syndrome were born to mothers who had been told they were in the low-risk group.
According to de Jager, the reason that Y2K appeared to pass without more incidents was that most organizations at risk invested the necessary resources into remediation work, even though it wasn’t cheap or easy.
“It’s incredibly difficult to put an estimate on it, but it’s likely $300-$400 billion was spent on it globally,” he said. “And half of the money spent on Y2K was just planning what to do.”
That’s not to say it won’t happen again.
Since 2000, there have been many examples of computer systems failing to cope with date changes.
Remember when the parking meters in NYC wouldn’t take credit cards in January 2020?
That was because the software had an end date of January 1, 2020, and had never been updated.
The big one to look out for, however, is 2038.
That’s when older computers and embedded systems that store time as a signed 32-bit number will hit their limit. Unix systems count time as the seconds elapsed since January 1, 1970, and a 32-bit counter of those seconds maxes out at 03:14:07 UTC on January 19, 2038; one second later it overflows, and affected machines suddenly read dates decades in the past.
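The boundary is easy to see with a few lines of Python, which is itself unaffected because it does not squeeze timestamps into 32 bits:

```python
from datetime import datetime, timedelta, timezone

# Unix time counts seconds since Jan. 1, 1970. A signed 32-bit counter of those
# seconds tops out at 2**31 - 1; one second later it overflows.
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_32BIT_SECONDS = 2**31 - 1

print(EPOCH + timedelta(seconds=MAX_32BIT_SECONDS))  # 2038-01-19 03:14:07+00:00

# After the overflow, a signed counter wraps to -2**31, which maps back to 1901:
print(EPOCH + timedelta(seconds=-(2**31)))           # 1901-12-13 20:45:52+00:00
```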
“I’ve been asked about 2038 ever since my work on Y2K,” said de Jager. “Is it going to cause a problem? Maybe. Should someone be looking at it? Definitely.
“But let me tell you, it won’t be me. I’ve done my shift.”
This story originally appeared on NYPost.