A lazy fix 20 years ago means the Y2K bug is taking down computers now

naib

https://www.newscientist.com/articl...ans-the-y2k-bug-is-taking-down-computers-now/


Parking meters, cash registers and a professional wrestling video game have fallen foul of a computer glitch related to the Y2K bug.

The Y2020 bug, which has taken many payment and computer systems offline, is a long-lingering side effect of attempts to fix the Y2K, or millennium bug.

Both stem from the way computers store dates. Many older systems express years using two numbers – 98, for instance, for 1998 – in an effort to save memory. The Y2K bug was a fear that computers would treat 00 as 1900, rather than 2000.


Programmers wanting to avoid the Y2K bug had two broad options: entirely rewrite their code, or adopt a quick fix called “windowing”, which would treat all dates from 00 to 20 as from the 2000s, rather than the 1900s. An estimated 80 per cent of computers fixed in 1999 used the quicker, cheaper option.

“Windowing, even during Y2K, was the worst of all possible solutions because it kicked the problem down the road,” says Dylan Mulvin at the London School of Economics.


Coders chose 1920 to 2020 as the standard window because of the significance of the midpoint, 1970. “Many programming languages and systems handle dates and times as seconds from 1970/01/01, also called Unix time,” says Tatsuhiko Miyagawa, an engineer at cloud platform provider Fastly.
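
To make the windowing fix concrete, here's a rough sketch of the idea in Python (my own illustration, not code from the article; the names are made up, with a pivot of 20 as described):

```python
PIVOT = 20  # two-digit years below this are assumed to be 20xx

def expand_year(yy: int) -> int:
    """Expand a two-digit year using a fixed window (the 'windowing' fix)."""
    if yy < PIVOT:
        return 2000 + yy   # 00-19 -> 2000-2019
    return 1900 + yy       # 20-99 -> 1920-1999

print(expand_year(99))  # 1999
print(expand_year(5))   # 2005
print(expand_year(20))  # 1920 <- the Y2020 bug: "20" rolls back a century
```

The failure mode is obvious once it's written out: the moment the real year reaches the pivot, every new date lands a hundred years in the past.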
 
In all fairness, no one programming these things thought they'd still be using the same systems 20 years later. These things were already old in 2000.

I got hired to work on some Y2K stuff, and even back in the mid-'90s the guy whose code I was fixing figured he'd be long retired by the time Y2K came up, so it wouldn't be his problem.

But yeah, a lot of people think their code is going to be rewritten, but more often than not it just turns into patch after patch on top of the original code. It's why, at quite a few of the companies I've worked at, we were still modifying programs from the 1960s. At some point they become too big to rewrite, and you can't replace the COBOL programmer who originally wrote them. It comes down to either spending a massive fortune and taking a few years just to get everything back to where it is today, or continuing to upgrade.
 
Thanks for posting this. We had some computer-driven equipment go down at the stroke of midnight on New Year's Eve where I work, and we had no idea why. It was obviously date-related, but we couldn't figure out why 2020. Guess now I know.
 
While I'm sure most programmers wouldn't expect the code to keep running till 2020, I wonder why they chose to make the range 1920-2020 instead of, say, 1950-2050?
 
While I'm sure most programmers wouldn't expect the code to keep running till 2020, I wonder why they chose to make the range 1920-2020 instead of, say, 1950-2050?
That's mentioned in the article:

Coders chose 1920 to 2020 as the standard window because of the significance of the midpoint, 1970. “Many programming languages and systems handle dates and times as seconds from 1970/01/01, also called Unix time,” says Tatsuhiko Miyagawa, an engineer at cloud platform provider Fastly.
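
You can see the epoch directly from any language with a time library; here's a quick Python check (my own example, not from the article):

```python
from datetime import datetime, timezone

# Unix time counts seconds elapsed since 1970-01-01 00:00:00 UTC
print(datetime.fromtimestamp(0, tz=timezone.utc))
# 1970-01-01 00:00:00+00:00

# 2020-01-01 lands exactly 50 years (1,577,836,800 seconds) later
print(int(datetime(2020, 1, 1, tzinfo=timezone.utc).timestamp()))
# 1577836800
```

So 1970 splits the 1920-2020 window exactly in half, which presumably made it look like a natural midpoint.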
 
A lot of developers struggle to write code that'll still work 6 months down the line, forget 20 years. It's only getting worse.
 
I forget who, but someone was telling me that back before Y2K hit, one of the problems was that the large company he was working for, which ran back-end systems for the financial sector, didn't even have the source code for the system. It had been written at some point in the '70s and then just never maintained.
 
In all fairness, no one programming these things thought they'd still be using the same systems 20 years later. These things were already old in 2000.
When you're updating code that was written 40 years earlier (and you know those programmers never expected their code to last 40 years), it's a really dumb assumption to think it won't stick around for 20 more. That said, this is also a company problem: I believe these solutions were done as a stopgap to make the Y2K deadline, on the assumption that work on a permanent fix would continue. But of course, with the immediate disaster averted, companies said, OK, work on something else now.
 
While I'm sure most programmers wouldn't expect the code to keep running till 2020, I wonder why they chose to make the range 1920-2020 instead of, say, 1950-2050?

The company I worked at at the time chose 30 as the cutoff point, so the window is 1930 to 2029.

But I think the window year was parameterized, so you can update it if the same program is still running in 2029.
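
Something like this, presumably; a rough Python sketch of a parameterized cutoff (the names and the config mechanism are my own guesses, not the actual code):

```python
import os

# Keep the pivot in configuration rather than hard-coded in the program,
# so it can be bumped later without touching the code itself
PIVOT = int(os.environ.get("YEAR_WINDOW_PIVOT", "30"))

def expand_year(yy: int) -> int:
    """Two-digit years below the pivot map to 20xx, the rest to 19xx."""
    return (2000 if yy < PIVOT else 1900) + yy

print(expand_year(29))  # 2029
print(expand_year(30))  # 1930 -- someone still has to bump the pivot before 2030
```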
 
The company I worked at at the time chose 30 as the cutoff point, so the window is 1930 to 2029.

But I think the window year was parameterized, so you can update it if the same program is still running in 2029.

...if anyone remembers what you did 30 years ago when 2029 comes around, or if they're blissfully ignorant until everything comes to a screeching halt....


I don't know what the solution is here.

You don't want to waste money replacing systems that are working, but at the same time these types of issues inevitably creep up.

Best practice would be to do a full code review of every custom piece of software every X years, and at a minimum replace it with something new every Y years?

Not sure what the appropriate numbers are. 2 and 10?
 
...if anyone remembers what you did 30 years ago when 2029 comes around, or if they're blissfully ignorant until everything comes to a screeching halt....


I don't know what the solution is here.

You don't want to waste money replacing systems that are working, but at the same time these types of issues inevitably creep up.

Best practice would be to do a full code review of every custom piece of software every X years, and at a minimum replace it with something new every Y years?

Not sure what the appropriate numbers are. 2 and 10?

That's not going to happen, though. As I previously mentioned, no company is going to want to spend millions of dollars and a few years of effort with zero visible return while their current products stall until the effort is complete. You can rewrite portions in pieces, which is what many companies are doing, but that turns a 3-year project into 15+ years.

The problem is, as technology gets cheaper and faster, it also becomes significantly more complex. And while writing it correctly in the first place is often a huge step in the right direction, keep in mind that what's correct today might not have been correct yesterday, and may not be correct tomorrow. That's how we got into the Y2K debacle in the first place.

Memory was exorbitantly expensive. In 1980, a gigabyte of hard drive space cost $40,000, and 1 MB of RAM would have cost $6,500. (Yes, the 16 GB of RAM in your computer would have cost roughly $105 million back then. Source: https://jcmit.net/memoryprice.htm) So storing everything in 2 digits made sense. Representing 4 digits requires 14 bits, whereas 2 digits requires only 7. Stored as text instead, 2 digits take 16 bits and 4 digits take 32. And today, your typical integer, which is what most people use to hold numbers, takes either 32 or 64 bits. That means in the space a modern programmer might use for one date, a programmer back then could fit 4-9 dates, saving valuable memory and, thus, tons of money.
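The bit counts are easy to sanity-check in Python (my own quick check, not anything from the thread's sources):

```python
# Smallest number of bits that can hold each decimal range
print((9999).bit_length())   # 14 -> a 4-digit year stored as binary
print((99).bit_length())     # 7  -> a 2-digit year stored as binary

# Stored as ASCII text, every digit costs a full 8-bit byte
print(len("99") * 8)         # 16 bits for "99"
print(len("1999") * 8)       # 32 bits for "1999"

# One modern 32- or 64-bit integer's worth of space holds several packed 2-digit years
print(32 // 7, 64 // 7)      # 4 9
```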

This is also why it just isn't an easy fix. Today, when memory is dirt cheap, it's an easy fix (well, at least an easier one). Because we're not worried about compression, your typical 32-bit integer ranges from -2 billion to +2 billion, and a 64-bit integer from -9 quintillion to +9 quintillion. Unless you're doing scientific applications, there's not much need to worry about compression. But when you're dealing with code from an era that required compression, you either need to go through what is probably an extremely complex program and rewrite massive portions of it, or find a few key compression functions and add a quick check: if the 2-digit number is less than 20, display it as 20XX rather than 19XX.

And while it's not Y2K-related, take a look at the VVVVVV source that was released last week, I believe: https://github.com/TerryCavanagh/VVVVVV/ It's actually fairly typical of the quality you'll see in many commercial applications, but it's a total nightmare from a programming standpoint. There's a huge divide between academia and commercial code. You can do something right, fast, or cheap, but you can't do all 3.
 
Who's still using shit from the '90s? Besides Fry's Electronics' cash register systems, lmao, what a shit business.
 
Who's still using shit from the '90s? Besides Fry's Electronics' cash register systems, lmao, what a shit business.


Hardware may (or may not) have been upgraded, but a lot of the custom-written software never gets updated. It just moves from running natively to running in VMs, but stays old.

That, and then there's all the old middleware tying major databases and systems together, which was a hack at best when it was new, and now it's 20 years old.
 
Hardware may (or may not) have been upgraded, but a lot of the custom-written software never gets updated. It just moves from running natively to running in VMs, but stays old.

That, and then there's all the old middleware tying major databases and systems together, which was a hack at best when it was new, and now it's 20 years old.

Yeah, I was never really a middleware or general hardware IT guy. I've always specialized in network engineering, with a focus on routing and switching, every topology imaginable, and all the associated protocols. Never have I been a fan of break/fix IT. Makes me want to vomit. I had a temp job once when I was between jobs. Dude was like, hey man, we need you to help this lady get her email working on her iPad because she called in and said she was bored waiting at the airport for her flight. I said nope, you've got this, quit right then and there, and just took unemployment for 2 months until my next big job started.
 
Yeah, I was never really a middleware or general hardware IT guy. I've always specialized in network engineering, with a focus on routing and switching, every topology imaginable, and all the associated protocols. Never have I been a fan of break/fix IT. Makes me want to vomit. I had a temp job once when I was between jobs. Dude was like, hey man, we need you to help this lady get her email working on her iPad because she called in and said she was bored waiting at the airport for her flight. I said nope, you've got this, quit right then and there, and just took unemployment for 2 months until my next big job started.

You sound fun.
 
You sound fun.

When I say this, I say it with the utmost sincerity: I would rather take a job at Home Depot stacking lumber than do break/fix IT. Nothing pains me more than having to explain how your torrent-downloading habit on your work PC is the reason your whole network got a virus, then having to fix each machine, then having to explain in plain English what happened, then having to restore bare-metal shit on the machines that were broken, then having to explain why the bill is so high, then having to explain to the employee downloading torrents why this happened, then having to explain why the job took so long because the company I work for refuses to let me consult with the customers about replacing their Core 2 Duos from 15 years ago when it's 2020 and the machines are slow as molasses on a cold day. Then I get to hear them bitch because I have other clients in the ticket system who need my time just as much, but I'm not allowed to get a second person to help because the billing would be too high for the mom-and-pop shop. Meanwhile the owner is mad this week because another client bitched, so I take the brunt of his anger in some unrelated bullshit rant, and he remembers my face from the rant even though I had nothing to do with it. Then the customer I just fixed bitches because their invoice is too high and I billed too many hours, right after I was yelled at for not billing enough hours overall to meet quota. FUCK BREAK FIX.

At another job out west I was hired to be a full-fledged Cisco network engineer, certified and all at the time, and after 5 months I walked out/got fired because to that date they had me work on ZERO engineering projects. That week I had a client literally cry and give me a hug because they were having constant network outages on their LAN: someone back at the office had decided he KNEW what spanning tree and trunking were without reading a single thing about them, and screwed everything up. I logged in, took 5 minutes of initiative to fix the damned HP switches, and everything was perfect. She called the company I worked for and praised the living shit out of me. Turns out the owner, who thought he could do no wrong, found out I had touched the network gear that he personally fucked up, and told me I was fired. This was at the same time I handed him my written notice to quit. Fuck break/fix. I mean it.

It's a vicious cycle of constantly fixing braindead fuckups by even more braindead bosses who hire you for your special talents, then forget the reason they hired you and give you a nice 9-hour-a-day desk with a phone, constant program error fixes, and phone calls about how to set up fucking Gmail when you're a CCNA Routing and Switching certified employee.

I used to do it, and I hated it.

Needless to say, I don't do IT anymore. The only fun part was working in the colo, in the lovely hum of racks and racks of chilled-water fans and the beautiful, almost laminar flow of fiber optic and Ethernet cabling running everywhere.
 