#11 - 28th February 2016, 09:16 PM
GenericBox (Join Date: Jan 2009; Posts: 2,486)
Re: Y2K Revisited

I don't think the 2 digit year became universal due to incompetent programmers.

Even as late as 1990 you were paying about $1,000 USD per 20-40 MB of storage -- and in the '80s upwards of $5,000 per 20 MB.

So if you're a bank or large business with 100,000 customer records, saving 2 characters on every date means massive efficiency/cost benefits. Small businesses with more modern, fluid, upgradeable systems may not have had this problem because they could keep their hardware/software matched to current trends.

Not only that -- but I work with modern languages (and never touched COBOL or any of the lower-level languages) and I wouldn't even entertain the thought of my code needing to last 20+ years.

I daresay programmers in more ... important ... fields than mine have been taught the lessons of years past and design for decades of use -- but still, it would take serious commitment from any business to approve budget for future-proofing software 30+ years out.
#12 - 19th August 2017, 09:11 AM
northodox (Join Date: Oct 2010; Location: Wollongong; Posts: 12)
Re: Y2K Revisited

Having been in the IT industry since 1983, I can attest to the potential risks the Y2K "bug" could have wrought.
Old business systems, written in COBOL, PL/1 or FORTRAN, might store numbers in Binary Coded Decimal (BCD). That's where 4 bits are used to store a decimal digit. Four bits can hold sixteen values (0-15), but BCD uses only 0 through to 9, as follows:
0000
0001
0010
...
1001

And the values 1010 through to 1111 never appear.
BTW, 4 bits is referred to as a nibble.

So in those days, a date such as 311299 (format DDMMYY) might be stored as:
0011 0001 0001 0010 1001 1001
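
Just to make the packing concrete, here's a minimal C sketch (pack_bcd is my own throwaway name, not from any real system):

[CODE]
#include <stdio.h>
#include <stddef.h>

/* Pack a string of decimal digits into BCD: one digit per nibble,
   two digits per byte. ndigits must be even. */
static void pack_bcd(const char *digits, unsigned char *out, size_t ndigits)
{
    for (size_t i = 0; i < ndigits; i += 2)
        out[i / 2] = (unsigned char)(((digits[i] - '0') << 4)
                                     | (digits[i + 1] - '0'));
}

int main(void)
{
    unsigned char bcd[3];
    pack_bcd("311299", bcd, 6);   /* 31 Dec (19)99, format DDMMYY */

    /* A nice property of BCD: printed in hex, each byte reads as its
       two decimal digits - this prints "31 12 99". */
    for (int i = 0; i < 3; i++)
        printf("%02X ", bcd[i]);
    printf("\n");
    return 0;
}
[/CODE]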

Calculating the difference between dates (where the bug could really bite) needed a mini-algorithm over those date elements (allowing for leap years as well).
Obviously, finding the difference between 060100 (6 Jan 2000) and 311299 would fail: the program would treat year 00 as 1900 and the difference would come out as a large negative number. If it didn't error out first.
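
To see that failure concretely, here's a toy C sketch - a deliberately crude day count, not anyone's real production algorithm (real code also handled month lengths and leap years properly):

[CODE]
#include <stdio.h>

/* Crude day number from a DDMMYY date, hard-wiring the "19" century
   exactly as the old code did. Months are approximated at 30 days -
   it's only here to show the sign flip. */
static long day_number(int dd, int mm, int yy)
{
    return (long)yy * 365 + (long)mm * 30 + dd;   /* year 19yy assumed */
}

int main(void)
{
    /* 060100 (6 Jan 2000) minus 311299 (31 Dec 1999) should be about
       6 days, but with 00 read as 1900 it comes out hugely negative. */
    long diff = day_number(6, 1, 0) - day_number(31, 12, 99);
    printf("days between: %ld\n", diff);   /* roughly -36000, not 6 */
    return 0;
}
[/CODE]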

As these systems were originally written in the '80s, who would have thought the code would still be in use in 2000? "Anyway, I'll be retired by then", I heard occasionally.

There were two common solutions:
1) Change YY to YYYY. That is, the code needs changing to handle the four-digit format and the dates need to be stored in databases in the extended format. This is the best solution - although more effort - as it will last well into the future. It's amazing how many "ancient" COBOL and FORTRAN programs are still ticking over in the business world.
2) A simpler code change: wherever a 2 digit year is used, add logic so that any year greater than 50 is treated as 1900 plus the year and any year less than 51 is treated as 2000 plus the year (see the sketch below). This is less effort but introduces a Y2051 bug. Plus, that cutover will differ for different purposes. For a birth date, in Y2000, the cutover might need to be 10; for a loan start date, it might be 60. The inconsistency makes for messiness.
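
Boiled down, that windowing logic looks like this in C (expand_year and the pivot of 50 are just for illustration):

[CODE]
#include <stdio.h>

/* "Windowing": expand a two-digit year against a fixed pivot.
   Years above the pivot land in the 1900s, the rest in the 2000s -
   which is exactly where the Y2051 bug comes from when the pivot
   is 50. */
static int expand_year(int yy, int pivot)
{
    return (yy > pivot) ? 1900 + yy : 2000 + yy;
}

int main(void)
{
    printf("%d\n", expand_year(99, 50));   /* 1999 */
    printf("%d\n", expand_year(0, 50));    /* 2000 */
    printf("%d\n", expand_year(51, 50));   /* 1951 - the Y2051 trap */
    return 0;
}
[/CODE]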

Which solution is used depends on the criticality of the program and the likely future of the software. These days, we laugh at software lasting longer than 5 years, but that wasn't always the case. Software has become commoditised along with so many other products.

So, Y2K could have caused serious issues but it is hard to imagine utilities failing to function.
Nowadays, dates are commonly stored in enterprise systems as full binary, using 32 or 64 bits - some even 128. Some systems store dates as the number of seconds since 1 January 1970. That makes it easy to find the difference between two dates (down to the nearest second!), but systems storing that count in a signed 32 bit field will experience a Y2038 bug: the seconds since 1-Jan-1970 overflow 2^31 on 19 January 2038. Old systems using two digit years with a cutover above 38 would outlast them!
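
The arithmetic is easy to show in C (the timestamp constants below are worked out by hand for the example, not pulled from any particular system):

[CODE]
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Epoch timestamps make date differences a plain subtraction. */
    int64_t y2k = 946684800;   /* 2000-01-01 00:00:00 UTC */
    int64_t eve = 946598400;   /* 1999-12-31 00:00:00 UTC */
    printf("difference: %lld seconds (%lld day)\n",
           (long long)(y2k - eve), (long long)((y2k - eve) / 86400));

    /* The Y2038 limit: a signed 32-bit counter tops out at
       2^31 - 1 = 2147483647 seconds past the epoch, which lands at
       03:14:07 UTC on 19 January 2038. One tick later it wraps
       negative. */
    printf("last 32-bit second: %d\n", INT32_MAX);
    return 0;
}
[/CODE]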
#13 - 19th August 2017, 05:37 PM
142857 (Join Date: Jun 2015; Location: Sydney; Posts: 1,198)
Re: Y2K Revisited

Quote:
Originally Posted by northodox
As these systems were originally written in the '80s, who would have thought the code would still be in use in 2000? "Anyway, I'll be retired by then", I heard occasionally.
Y2K was on the radar - kinda - by the late 80s. Nobody was doing much about it though.

Some COBOL systems I've worked on dated back to the mid 60s. At the time software developers barely expected those systems to last much past 1970.
__________________
".....If there are gods, but unjust, then you should not want to worship them. If there are no gods, then you will be gone, but...will have lived a noble life that will live on in the memories of your loved ones."

- Marcus Aurelius (claimed)