Basically, the only way the Y2K doomsday scenario would have played out is if we were still using computers from back in the '50s and '60s. The BIOS (Basic Input/Output System) in computers from the 1980s onward is where your computer keeps track of time, and those BIOS clocks were always set so far ahead that the machines would have been replaced long before the clocks ran out. So once again, business problem-solving prevails. Just as an example, my old Pentium 75 MHz machine's clock went up to the year 2224, and it was made back in 1992-1993. It also had a 200-year span from whatever start date you set it to.

The programming problem
The underlying programming problem was quite real. In the 1960s, computer memory and storage were scarce and expensive, and most data processing was done on punch cards which represented text data in 80-column records. Programming languages of the time, such as COBOL and RPG, processed numbers in their ASCII or EBCDIC representations. They occasionally used an extra bit called a "zone punch" to save one character for a minus sign on a negative number, or compressed two digits into one byte in a form called binary-coded decimal, but otherwise processed numbers as straight text. Over time the punch cards were converted to magnetic tape and then disk files and later to simple databases like ISAM, but the structure of the programs usually changed very little. Popular software like dBase continued the practice of storing dates as text well into the 1980s and 1990s.
Saving two characters for every date field was significant in the 1960s. Since programs at that time were mostly short-lived affairs written to solve a specific problem or control a specific hardware setup, most programmers of that era did not expect their programs to remain in use for many decades. The realisation that databases were a new type of program with different characteristics had not yet come, and hence most did not consider the two-digit year a significant problem. There were exceptions, of course; the first person known to publicly address the problem was Bob Bemer, who had noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the US government and the ISO care about the problem, with little result. This included the recommendation that the COBOL PICTURE clause should be used to specify four-digit years for dates. This could have been done by programmers at any time from the initial release of the first COBOL compiler in 1961 onwards. However, lack of foresight, the desire to save storage space, and overall complacency prevented this advice from being followed. Despite magazine articles on the subject from 1970 onwards, the majority of programmers only started recognizing Y2K as a looming problem in the mid-1990s, but even then, inertia and complacency caused it to be mostly ignored until the last few years of the decade.
Storage of a combined date and time within a fixed binary field is often considered a solution, but the possibility for software to misinterpret dates remains, because such date and time representations must be relative to a defined origin. Roll-over of such systems is still a problem but can happen at varying dates and can fail in various ways. For example:
The typical Unix timestamp stores a date and time as a 32-bit signed integer number representing, roughly speaking, the number of seconds since January 1, 1970, and will roll over in 2038 and cause the year 2038 problem.
The popular spreadsheet Microsoft Excel stores a date as a number of days since an origin (often erroneously called a Julian date). Such a date stored in a 16-bit integer will overflow after 65,536 days (approximately 179 years). Unfortunately, some releases of the program start counting at 1900, others at 1904.
In the C programming language, the standard library function to get the current year originally did have the problem that it returned only the year number within the 20th century, and for compatibility's sake still returns the year as year minus 1900. Many programmers in C, and in Perl and JavaScript, two programming languages widely used in Web development that use the C functions, incorrectly treated this value as the last two digits of the year. On the Web this was a mostly harmless bug, but it did cause many dynamically generated webpages to display January 1, 2000, as "1/1/19100", "1/1/100", or variations of that depending on the format.
Even before January 1, 2000 arrived, there were also some worries about September 9, 1999 (albeit lesser compared to those generated by Y2K). This date could also be written in the numeric format 9/9/99, which is somewhat similar to the end-of-file code, 9999, in old programming languages. It was feared that some programs might unexpectedly terminate on that date. This is actually an urban legend, because computers do not store dates in that manner. In practice, the date would be stored as 090999 or 9/9/99, to prevent confusion at the month-day boundary.
Another related problem for the year 2000 was that it was a leap year, even though years ending in "00" are normally not leap years. (A year is a leap year if it is divisible by 4, unless it is divisible by 100 and not also divisible by 400.) Fortunately, as with Y2K, most programs were fixed in time.
http://en.wikipedia.org/wiki/Y2K
:comp: :badpc:
Now to how computer companies banked on it. Now that we know Y2K was a scare, let's get to the beneficiaries of the incident: Dell, HP, Compaq, IBM, etc. All of them at the time started selling their personal computers for less than $1,000, when in all the preceding years you couldn't have touched one for less than $2,000. But even in the midst of all this information, they took advantage of the Y2K scare by putting [Y2K compliant stickers] on their machines and advertising that their computers were Y2K compliant, pushing people who already had a computer (basically telling them their computer was obsolete) along with new buyers into the computer market. Of course it wouldn't have mattered, because a computer's BIOS clock is set in 200-year increments. America is just a little older than that, if that gives you an idea of how far back in time you would have to go to find a non-Y2K-compliant computer.