Why was Y2K specifically a big deal if computers actually store their numbers in binary? Why would a significant decimal date have any impact on a binary number?


I understand the number would have still overflowed *eventually*, but why was it specifically New Year's 2000 that would have broken it, when binary numbers don't tend to align very well with decimal ones?

EDIT: A lot of you are simply answering by explaining what the Y2K bug is. I am aware of what it is; I am wondering specifically why the number 99 (`01100011` in binary) rolling over to 100 (`01100100` in binary) would actually cause any problems, since all the math would be done in binary and decimal would only be used for the display.

EXIT: Thanks for all your replies, I got some good answers, and a lot of unrelated ones (especially that one guy with the illegible comment about politics). Shutting off notifications, peace ✌



Anonymous

The biggest assumption a developer makes is that everything their software relies on works as expected.

Usually this is fine, because at the time the software is written, everything DOES work as expected. It's tested.

But because everything works, developers go with the easiest solution.

Need to compare the current date to one that was input by the user? Well, here's a little utility that outputs the current date in an easy-to-parse format! A little string parsing, and you're good to go! (Something like the sketch below.)
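Here's a hypothetical sketch of what that "easiest solution" often looked like, in C. The utility name and the exact format are made up for illustration, but the pattern (a two-digit year as text, with the caller bolting the century back on) was everywhere:

```c
#include <stdio.h>
#include <time.h>

/* Hypothetical "little utility": formats today's date as "MM/DD/YY".
   The year is squeezed to two digits because, when this was written,
   two digits were obviously plenty. */
static void get_date_string(char *buf, size_t n) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);
    snprintf(buf, n, "%02d/%02d/%02d",
             t->tm_mon + 1, t->tm_mday, t->tm_year % 100);
}

int main(void) {
    char date[16];
    int month, day, yy;

    /* The caller's "little string parsing"... */
    get_date_string(date, sizeof date);
    sscanf(date, "%d/%d/%d", &month, &day, &yy);

    /* ...plus one tiny assumption: the century is 19. */
    printf("full year: %d\n", 1900 + yy);
    return 0;
}
```

Run it today and it cheerfully reports a year in the 1900s, which is exactly the kind of quiet wrongness the rest of this answer is about.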

Sounds lovely, right?

Well…

Sometimes one of the lower-level components doesn't work right. Sometimes that's caused by an update, and sometimes that's caused by reality slipping out of supported bounds.

The broken component in this case is that date utility. It thinks the year is 99… but it's got a choice to make. Is it 00? 100? 100, but with the leading 1 spilling past the memory reserved for it? Depends on how it was written.

Let's say they used 100, because it's simple: calculate the year as an int, then convert it to a string.

The program above it gets 1/1/100 as the date. The parser sees that, bolts its assumed "19" century onto the front, and goes "OK, it's January 1st, 19100. So January 1st, 1980 was 17120 years ago." Computers are not exactly known for checking themselves, so a date 20 years in the past really is treated as if it were over seventeen thousand years in the past by every other utility.
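That 100-then-19100 path isn't invented for this answer, by the way. C's standard `localtime()` hands back a `tm_year` field that is defined as years since 1900 (99 in 1999, 100 in 2000), and programs that glued a literal "19" onto it really did greet the new millennium as "19100". A minimal sketch of that failure mode:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    /* tm_year is "years since 1900": 99 in 1999, 100 in 2000.
       Gluing "19" onto the front printed "1999"... then "19100". */
    char year_text[32];
    snprintf(year_text, sizeof year_text, "19%d", t->tm_year);
    printf("the year is %s\n", year_text);

    /* Downstream math happily runs with it: 19100 - 1980 = 17120. */
    int parsed;
    sscanf(year_text, "%d", &parsed);
    printf("1980 was %d years ago\n", parsed - 1980);
    return 0;
}
```

Run in January 2000, this prints "the year is 19100" and "1980 was 17120 years ago".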

And I do mean every other utility. If there's a point down the line where that date becomes a binary number, the program is gonna try to store it regardless of whether enough space was allocated (32 bits is NOT enough space for a date that far out), and unless protections were added (and why would they have been?), you're gonna corrupt whatever data happens to sit next to it, overwriting it with part of this massive date.
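To put a number on "32 bits is NOT enough": counting seconds from the usual Unix 1970 epoch (a common convention, not necessarily what any particular program used), the year 19100 lands a couple of hundred times past what a signed 32-bit value can hold. A back-of-the-envelope check:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* ~31.6 million seconds per year (rough average, good enough here). */
    const int64_t SECS_PER_YEAR = 31556952;

    /* Seconds from the 1970 epoch out to "year 19100". */
    int64_t way_out = (int64_t)(19100 - 1970) * SECS_PER_YEAR;

    printf("seconds to year 19100: %lld\n", (long long)way_out);
    printf("largest signed 32-bit: %lld\n", (long long)INT32_MAX);
    printf("off by a factor of:   ~%lld\n", (long long)(way_out / INT32_MAX));
    return 0;
}
```

That works out to roughly 540 billion seconds against a ceiling of about 2.1 billion, an overshoot of around 250x.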

Y2K just happened to be a very predictable form of this issue, and plenty of developers had prepared defences to ensure it didn’t cause actual disaster.
