
Y2K was another "the end of the world" conspiracy?

stsburns

American Infidel
DP Veteran
The programming problem
The underlying programming problem was quite real. In the 1960s, computer memory and storage were scarce and expensive, and most data processing was done on punch cards which represented text data in 80-column records. Programming languages of the time, such as COBOL and RPG, processed numbers in their ASCII or EBCDIC representations. They occasionally used an extra bit called a "zone punch" to save one character for a minus sign on a negative number, or compressed two digits into one byte in a form called binary-coded decimal, but otherwise processed numbers as straight text. Over time the punch cards were converted to magnetic tape and then disk files and later to simple databases like ISAM, but the structure of the programs usually changed very little. Popular software like dBase continued the practice of storing dates as text well into the 1980s and 1990s.
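To make the problem concrete, here is a minimal C sketch of why two-digit text years break at the century boundary (the record layout and field names are invented for illustration): a record dated in 2000 compares as older than one dated in 1999.

#include <stdio.h>
#include <string.h>

/* A punched-card style record: the date is plain text with a two-digit year. */
struct record {
    char date[7];   /* "YYMMDD" plus terminating NUL */
};

int main(void) {
    struct record old_rec = { "991231" };   /* 31 Dec 1999 */
    struct record new_rec = { "000101" };   /*  1 Jan 2000 */

    /* Comparing the text fields says the newer record is "earlier",
       because "00" sorts before "99". */
    if (strcmp(new_rec.date, old_rec.date) < 0)
        printf("000101 compares as earlier than 991231\n");
    return 0;
}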

Saving two characters for every date field was significant in the 1960s. Since programs at that time were mostly short-lived affairs programmed to solve a specific problem, or control a specific hardware-setup, most programmers of that time did not expect their programs to remain in use for many decades. The realisation that databases were a new type of program with different characteristics had not yet come, and hence most did not consider fixing two digits of the year a significant problem. There were exceptions, of course; the first person known to publicly address the problem was Bob Bemer who had noticed it in 1958, as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the US government and the ISO care about the problem, with little result. This included the recommendation that the COBOL PICTURE clause should be used to specify four digit years for dates. This could have been done by programmers at any time from the initial release of the first COBOL compiler in 1961 onwards. However lack of foresight, the desire to save storage space, and overall complacency prevented this advice from being followed. Despite magazine articles on the subject from 1970 onwards, the majority of programmers only started recognizing Y2K as a looming problem in the mid-1990s, but even then, inertia and complacency caused it to be mostly ignored until the last few years of the decade.

Storage of a combined date and time within a fixed binary field is often considered a solution, but the possibility for software to misinterpret dates remains, because such date and time representations must be relative to a defined origin. Roll-over of such systems is still a problem but can happen at varying dates and can fail in various ways. For example:

The typical Unix timestamp stores a date and time as a 32-bit signed integer number representing, roughly speaking, the number of seconds since January 1, 1970, and will roll over in 2038 and cause the year 2038 problem.
The popular spreadsheet Microsoft Excel stores a date as a number of days since an origin (often erroneously called a Julian date). A Julian date stored in a 16-bit integer will overflow after 65,536 days (approximately 179 years). Unfortunately, some releases of the program start at 1900, others at 1904.
In the C programming language, the standard library function for getting the current date returns the year as a count of years since 1900, which during the 20th century happened to match the last two digits of the year. Many programmers in C, and in Perl and JavaScript, two programming languages widely used in Web development that use the C functions, incorrectly treated this value as the last two digits of the year. On the Web this was a mostly harmless bug, but it did cause many dynamically generated webpages to display January 1, 2000, as "1/1/19100", "1/1/100", or variations of that depending on the format.
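The "19100" display bug described above is easy to reproduce; this is just an illustrative C sketch, not code from any particular website:

#include <stdio.h>
#include <time.h>

int main(void) {
    /* Build a broken-down time for 1 January 2000. tm_year counts years
       since 1900, so for the year 2000 it holds 100. */
    struct tm t = {0};
    t.tm_year = 2000 - 1900;
    t.tm_mon  = 0;      /* January */
    t.tm_mday = 1;

    /* Buggy: treats tm_year as the last two digits of the year,
       so this prints "1/1/19100". */
    printf("%d/%d/19%d\n", t.tm_mon + 1, t.tm_mday, t.tm_year);

    /* Correct: add 1900 to get the full year, printing "1/1/2000". */
    printf("%d/%d/%d\n", t.tm_mon + 1, t.tm_mday, t.tm_year + 1900);
    return 0;
}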
Even before January 1, 2000 arrived, there were also some worries about September 9, 1999 (albeit lesser compared to those generated by Y2K). This date could also be written in the numeric format, 9/9/99, which is somewhat similar to the end-of-file code, 9999, in old programming languages. It was feared that some programs might unexpectedly terminate on that date. This is actually an urban legend, because computers do not store dates in that manner. In this case, the date would be stored 090999 or 9/9/99, to prevent confusion of the month-day boundary.

Another related problem for the year 2000 was that it was a leap year even though years ending in "00" are normally not leap years. (A year is a leap year if it is divisible by 4 unless it is both divisible by 100 and not divisible by 400.) Fortunately, like Y2K, most programs were fixed in time.
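The rule in parentheses translates directly into a few lines of C; a minimal sketch:

#include <stdio.h>

/* Gregorian leap-year rule: divisible by 4, except century years,
   unless the century year is also divisible by 400. */
int is_leap_year(int year) {
    return (year % 4 == 0) && (year % 100 != 0 || year % 400 == 0);
}

int main(void) {
    printf("1900: %d\n", is_leap_year(1900));   /* 0 - not a leap year */
    printf("2000: %d\n", is_leap_year(2000));   /* 1 - a leap year     */
    return 0;
}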
http://en.wikipedia.org/wiki/Y2K
Basically the only way the Y2K doomsday scenario would've played out is if we were still using computers from back in the '50s and '60s. The current computer BIOS (Basic Input/Output System), from the 1980s onward, is where your computer keeps track of time. But the BIOS clocks of the 1980s and later were always set so far ahead that the computers would have been replaced long before the clocks went bad. So once again, business problem-solving prevails. Just as an example, my old Pentium 75 MHz machine's clock went up to the year 2224, and it was made back in 1992-1993. It also had a 200-year span from whatever start date you set it to.

:comp: :badpc:

Now to how computer companies banked on it. Now that we know Y2K was a scare, let's get to the beneficiaries of the incident: Dell, HP, Compaq, IBM, etc. All of them at the time started selling their personal computers for less than $1,000. In all preceding years you couldn't even have touched one for less than $2,000. But in the midst of all this information they took advantage of the Y2K scare by putting [Y2K compliant stickers] on machines and advertising that their computers were Y2K compliant, pushing people who already had a computer (basically telling them their computer was obsolete) and new buyers into the computer market. But of course it wouldn't matter, because a computer's BIOS clock is set in 200-year increments. America is just a little older than that, if that gives you an idea of how far back in time you would have to go to get a non-Y2K-compliant computer.
 
stsburns said:
Basically the only way the Y2K doomsday scenario would've played out is if we were still using computers from back in the '50s and '60s. The current computer BIOS (Basic Input/Output System), from the 1980s onward, is where your computer keeps track of time. But the BIOS clocks of the 1980s and later were always set so far ahead that the computers would have been replaced long before the clocks went bad. So once again, business problem-solving prevails. Just as an example, my old Pentium 75 MHz machine's clock went up to the year 2224, and it was made back in 1992-1993. It also had a 200-year span from whatever start date you set it to.

Y2K was not just about BIOS; in fact, that was a minor point. Every program that used dates was at risk. I worked for a large company that processed stock exchange data in real time, which included adding and subtracting dates and day counts. E.g.: a stock at its lowest level on 4 Jan 2000, lowest since 12 Dec 1999, therefore lowest for (4 Jan 2000 minus 12 Dec 1999) days, or, incorrectly, 4 Jan 00 minus 12 Dec 99.

Nothing to do with the BIOS.
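A rough C sketch of the kind of arithmetic paulmarkj describes (the helper function and values are illustrative, not his company's actual code): with four-digit years the stock has been at its lowest for 23 days; with two-digit years blindly expanded to 19xx, the answer comes out as a large negative number.

#include <stdio.h>

/* Convert a Gregorian calendar date to a Julian Day Number (integer arithmetic). */
static long day_number(int year, int month, int day) {
    long a = (14 - month) / 12;
    long y = year + 4800 - a;
    long m = month + 12 * a - 3;
    return day + (153 * m + 2) / 5 + 365 * y + y / 4 - y / 100 + y / 400 - 32045;
}

int main(void) {
    /* Correct: four-digit years. 4 Jan 2000 minus 12 Dec 1999 = 23 days. */
    printf("%ld\n", day_number(2000, 1, 4) - day_number(1999, 12, 12));

    /* Buggy: two-digit years expanded by blindly prefixing "19", so "00"
       becomes 1900 and the result is a large negative number of days. */
    printf("%ld\n", day_number(1900 + 0, 1, 4) - day_number(1900 + 99, 12, 12));
    return 0;
}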
 
paulmarkj said:
Y2K was not just about BIOS; in fact, that was a minor point. Every program that used dates was at risk. I worked for a large company that processed stock exchange data in real time, which included adding and subtracting dates and day counts. E.g.: a stock at its lowest level on 4 Jan 2000, lowest since 12 Dec 1999, therefore lowest for (4 Jan 2000 minus 12 Dec 1999) days, or, incorrectly, 4 Jan 00 minus 12 Dec 99.

Nothing to do with the BIOS.
The firmware runs the software? And also, to the point: how many machines actually did fail because of the date? Name one! :doh Think about it?
 
stsburns said:
The firmware runs the software? And also, to the point: how many machines actually did fail because of the date? Name one! :doh Think about it?

I didn't say "the firmware runs the software" at any point. I said that the BIOS was not our great concern; it was the applications that ran on the systems, applications that had their own date calculations separate from the BIOS.

I totally agree with you that the problem was hyped up, and I tried to persuade my company not to waste the $100,000,000 (yes, 8 zeros) on the problem, but they had customers saying they were employing lawyers ready to sue in case there was downtime. Customers wanted us to demonstrate to them that we were Y2K compliant. I had to demonstrate to customers that we were ready, so I know what the pressures were.

As for buying new hardware: we did NOT buy new PCs or any other equipment because we had to test the existing equipment. New equipment would have just compounded the problem because of extra testing. (A sticker saying "y2k compliant" cut no ice with us).

We did find problems, which of course we fixed before 2000 (in fact, all done by Q1 1999). But the exercise was more to do with satisfying the customer. Imagine how it would have looked if we had had a problem on Jan 1 (or Jan 4, the first working day); as minor as it may have been, our customers would rightly have said, "You've had years to fix it. How can we trust you?" We would have looked incompetent.

As for PC prices coming down, they've come down every year between 1985 and 2005!

(Or should I say: as for PC prices coming down, they've come down every year between 1985 and 2005! :doh )
 
paulmarkj said:
We did find problems, which of course we fixed before 2000 (in fact, all done by Q1 1999). But the exercise was more to do with satisfying the customer. Imagine how it would have looked if we had had a problem on Jan 1 (or Jan 4, the first working day); as minor as it may have been, our customers would rightly have said, "You've had years to fix it. How can we trust you?" We would have looked incompetent.

(Or should I say: as for PC prices coming down, they've come down every year between 1985 and 2005! :doh )
Tell me more about the Y2K problem? I'm curious what kind of problems you ran into. As for PCs, they have been coming down, but I do think Y2K just sold books and PCs. Even though your company didn't upgrade, imagine how many companies did? Or how many people bought new PCs out of fear of Y2K! :mrgreen:
 
stsburns said:
Tell me more about the Y2K problem? I'm curious what kind of problems you ran into. As for PCs, they have been coming down, but I do think Y2K just sold books and PCs. Even though your company didn't upgrade, imagine how many companies did? Or how many people bought new PCs out of fear of Y2K! :mrgreen:

The company as a whole used data from over 100 countries around the world, and from various sources within those countries. In our department we took data from over 150 sources, each source in a different format, so each source had to be interpreted in a different way.

I can't remember everything (I don't work there any more, so I can't check the records), but here are a couple:

1) We had a process that calculated how old data was. This process was generally OK, but in certain circumstances it threw up problems, in particular when a correction to data was sent over the millennium threshold.

For example:

Data: UPDATE 121212 123.456, time sent: 13:45:21 29-Dec-1999
Time received: 13:45:23 29-Dec-1999, so the data is 2 secs old.
Original received time is stored as 13:45:23 29-Dec-99
Correction: CORRECTION 121212 123.458, time received: 13:52 4-Jan-2000

Meaning: the correction says the original update 121212 is wrong; replace the data with 123.458.

But because we've stored the original received time as 29-Dec-99, we calculate how late the correction message arrived as:

13:52 4-Jan-2000 minus 13:45:23 29-Dec-1999

and with the stored year in its two-digit format, this was converted to:

13:52 4-Jan-00 minus 13:45:23 29-Dec-99

Of course, this would never be a problem unless the two dates were in separate millennia. But after testing threw up the problem, we changed the algorithm to fix it.

2) Most of the 150 data readers (each different, to match its source) were OK, but one was erroneous: it could not read 00 as a year, assumed the data was wrong, and so deleted the year. So, when it worked OK, it would do this:

22 Sep 1999 was coded as:

22-SEP-99
dd-MMM-yy: dd=22, mmm=SEP, yy=99

When it went wrong it did this, e.g. for 01 May 2000:

01-MAY-00 The "unrecognizable" year is dropped, and the string becomes:
01-MAY-
dd-MMM-yy

So, dd=01, mmm=MAY, yy=<blank>

Which would have caused havoc if not corrected.
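A toy C version of that kind of reader (the function name and the validation rule are invented for illustration) shows how blanking an "unrecognizable" year of 00 produces exactly the output above:

#include <stdio.h>
#include <string.h>

/* Toy reader for "dd-MMM-yy" fields (hypothetical; not the real feed code). */
static void read_date(const char *field, char *dd, char *mmm, char *yy) {
    sscanf(field, "%2[0-9]-%3[A-Za-z]-%2[0-9]", dd, mmm, yy);

    /* The faulty "sanity check": a year of 00 was assumed to be bad data,
       so the year was blanked out instead of being kept. */
    if (strcmp(yy, "00") == 0)
        yy[0] = '\0';
}

int main(void) {
    char dd[3] = "", mmm[4] = "", yy[3] = "";

    read_date("22-SEP-99", dd, mmm, yy);
    printf("dd=%s mmm=%s yy=%s\n", dd, mmm, yy);   /* dd=22 mmm=SEP yy=99 */

    read_date("01-MAY-00", dd, mmm, yy);
    printf("dd=%s mmm=%s yy=%s\n", dd, mmm, yy);   /* dd=01 mmm=MAY yy=   */
    return 0;
}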

3) We had, sent to us, the date one year on from 29-Feb-2000 given as 29-Feb-2001.

4) We had, sent to us, the date one day after 28 Feb 2000 given as 1 Mar 2000.
 