Communications of the ACM

Inside Risks: Ten Myths About Y2K Inspections


As I write this, the alarmist reports about Y2K are being replaced with more comforting statements. Repeatedly, I hear, "We have met the enemy and fixed the bugs." I would find such statements comforting if I had not heard them before when they were untrue. How often have you seen a product, presumably well-tested, sent to users full of errors? By some estimates, 70% of first fixes are not correct. Why should these fixes, made to old code by programmers who are not familiar with the systems, have a better success rate?

The Y2K mistake would never have been made if programmers had been properly prepared for their profession. There were many ways to avoid the problem without using more memory. Some of these were taught 30 years ago and are included in software design textbooks. The programmers who wrote this code do not have my confidence, but we are now putting a great deal of faith in many of the same programmers. Have they been re-educated? Are they now properly prepared to fix the bugs, or even to know whether they have fixed them? In discussing this problem with a variety of programmers and engineers, I have heard a few statements that strike me as unprofessional urban folklore. Each is false, yet I have heard each of them used to declare victory over a Y2K problem.
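One such memory-neutral technique can be sketched briefly (this sketch is mine, not the column's; the base year of 1900 is an illustrative assumption): instead of storing two decimal digits of the year, store the year as a binary offset from a fixed base. The same single byte then covers a 256-year range instead of 100 years.

```python
# Sketch: a one-byte year field stored as a binary offset from a base year.
# Two decimal digits waste the byte (0-99); a binary offset covers 0-255,
# i.e. the years 1900-2155 with the assumed base below.
BASE_YEAR = 1900  # illustrative assumption, not from any particular system

def pack_year(year):
    """Encode a full year into one byte as an offset from BASE_YEAR."""
    offset = year - BASE_YEAR
    if not 0 <= offset <= 255:
        raise ValueError("year outside representable range")
    return offset

def unpack_year(byte):
    """Decode a one-byte offset back into a full year."""
    return BASE_YEAR + byte

print(unpack_year(pack_year(2001)))  # round-trips years past 1999
```

The point is not this particular encoding but that the fix costs no extra storage; the representable range merely had to be chosen sensibly in the first place.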

Myth 1. Y2K is a software problem. There is no problem if the hardware is not programmable.
Obviously, hardware that stores dates can have the same problems.

Myth 2. There is no Y2K problem if the system does not have a real-time clock.
Systems that simply relay a date from one system with a clock to other systems can have problems.

Myth 3. There is no Y2K problem if the system does not have a battery to maintain date/time during a power outage.
Date information may enter the system from other sources and cause problems.

Myth 4. There is no Y2K problem if the software does not process dates.
The software may depend on data produced by software that does process dates, such as the operating system or software in another computer.

Myth 5. Software that does not need to process dates is immune to Y2K problems.
Software incorporated through reuse may process dates even though it need not do so.

Myth 6. Systems can be tested one-at-a-time by specialized teams. If each system is fixed, the combined systems will work correctly.
It is possible to fix two communicating systems for Y2K so that each works in isolation, yet the two are incompatible with each other. Many of today's fixes simply move the 100-year window. Not only will the problem reappear when people are even less familiar with the code, but two systems whose windows were moved differently may disagree about the same data. Where two such systems communicate, each may pass its tests with flying colors, yet the pair may still fail.
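The windowing incompatibility can be sketched in a few lines (the pivot values below are illustrative assumptions, not taken from any particular system):

```python
def expand_year(yy, pivot):
    """Windowing 'fix': a two-digit year below the pivot is read as 20xx,
    otherwise as 19xx. Each system picks its own pivot."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Two independently "fixed" systems chose different pivots.
year_in_a = expand_year(40, pivot=30)   # system A reads "40" as 1940
year_in_b = expand_year(40, pivot=50)   # system B reads "40" as 2040

# Each system is internally consistent and passes its own tests,
# yet they disagree about the very same two-digit field.
print(year_in_a, year_in_b)
```

Each function is correct against its own specification; the failure exists only in the combination, which is exactly what per-system testing cannot see.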

Myth 7. There is no Y2K problem if no date-dependent data flows in or out of a system while it is running.
Date information may enter the system during a build, on EPROMs or diskettes, among other routes.

Myth 8. Date stamps in files don't matter.
Some of the software in the system may process the date stamps, for example, to make sure the latest version of a module is being used, when doing backups, and so forth.

Myth 9. Planned testing using critical dates is adequate.
As Harlan Mills used to say, "Planned testing is a source of anecdotes, not data." Programmers who overlook a situation or event may also fail to test it.

Myth 10. You can rely on keyword scan lists.
Companies are assembling long lists of words that may be used as identifiers for date-dependent data. They seem to be built on the assumption that programmers are monolingual English speakers who never misspell a word.
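A minimal sketch of why such scans fall short (the scan list and the source fragment are invented for illustration):

```python
import re

# An illustrative keyword scan list of the kind the column describes.
KEYWORDS = {"date", "year", "yy", "yymmdd"}

source = """
    int fecha_inicio;   /* Spanish identifier */
    int yaer_end;       /* transposed letters */
    int expiry_year;
"""

def flagged(code):
    """Return identifiers that contain any keyword as a substring."""
    identifiers = set(re.findall(r"[A-Za-z_]\w*", code))
    return {i for i in identifiers
            if any(k in i.lower() for k in KEYWORDS)}

# Finds expiry_year, but misses both the non-English identifier and
# the misspelled one -- precisely the assumption the myth rests on.
print(flagged(source))
```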

As long as I hear such statements from those who are claiming victory over Y2K, I remain concerned. I was a skeptic when the gurus were predicting disaster. I remain a skeptic now that they are claiming success.

David L. Parnas holds the NSERC/Bell Industrial Research Chair in Software Engineering, and is Director of the Software Engineering Programme in the Department of Computing and Software at McMaster University, Hamilton, Ontario.

©1999 ACM  0002-0782/99/0500  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.


