In the 1960s and 1970s, computer memory and storage were expensive. One cost-cutting measure was to store the year as two digits instead of four, so a computer would record the year 1974 as just "74". This worked fine until the year 2000, which the computer would render as "00" and mistake for 1900.

Why is this a problem? Well, let's say there's a library with a non-Y2K compliant computer. You check out a book on December 21st, 1999. The book is due on January 3rd, 2000. The next day, you return the book and the library computer says you owe them millions of dollars in late fees. How can this be? Well, the computer recognized the current date as 12/22/99, the date the book was due as 01/03/00, and concluded the book was 99 years overdue. Now imagine the same kind of errors affecting other "date-sensitive" programs, such as the computers in charge of bank accounts.
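
For illustration, here is a minimal Python sketch of how that miscalculation happens, assuming a program that keeps only the last two digits of the year and prepends "19" when it rebuilds the date. The daily fine amount is made up for the example.

    from datetime import date

    DAILY_FINE = 0.25  # hypothetical late fee per day

    def parse_two_digit_date(mm_dd_yy):
        """Interpret 'MM/DD/YY' the way a non-compliant system would: year = 1900 + YY."""
        mm, dd, yy = (int(part) for part in mm_dd_yy.split("/"))
        return date(1900 + yy, mm, dd)  # "00" becomes 1900, not 2000

    due_date = parse_two_digit_date("01/03/00")   # meant as Jan 3, 2000; parsed as Jan 3, 1900
    returned = parse_two_digit_date("12/22/99")   # Dec 22, 1999

    days_overdue = (returned - due_date).days     # about 36,500 days, i.e. roughly 99 years "late"
    print(days_overdue, "days overdue, fine:", days_overdue * DAILY_FINE)

A Y2K-compliant system would work with the full four-digit year, in which case the return date falls before the due date and no fine is owed at all.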

You see, those systems were intended for short-term use; no one expected them to still be running by the year 2000, since they would surely be outdated by then. But many of them were still in operation and had to be updated to handle four-digit years. Billions of dollars were spent correcting the problem.

The problem was overhyped to the public as a technological apocalypse, leading to the popular perception that Y2K would cause planes to fall from the sky and other such nonsense. To this day, debate rages as to whether the money spent on Y2K remediation prevented the crisis or whether there was never much of a crisis in the first place.
