Do you know of any rule of thumb for how often a piece of software should need maintenance? I am not thinking about bug fixes, since bugs are there from the moment the code is written, but about the constant refactoring that seems to go on in code. Sometimes I feel as if programmers use refactoring as a way of keeping their jobs, rather than offering any real improvement. Is there a "best used by" date for software?
I definitely like the idea of software coming with a freshness mark like you see on milk cartons. As with other perishable products, software does seem to go bad after a while, and I am often reminded of the stench of spoiled milk when I open certain source files in my editor. No wonder so many programmers grimace whenever they go to fix bugs.
I think that a better analogy for software is that of infrastructure. Anyone who has lived long enough to see new infrastructure built, neglected, and then repaired should understand this analogy. Consider the highways built in the U.S. in the 1950s. When these roads were first built they were considered a marvel, helping commuters get into and out of large cities. Everyone loves something new, and the building of this infrastructure was heralded with a good deal of fanfare, speeches, and other celebratory events that you associate with large projects.
Once completed, however, the process of neglect sets in: cost cutting, slow repairs, and major design flaws ignored until bits of the roadway fall down. Finally, the highway is so poorly maintained that it is a menace, and then, unless you get lucky and an earthquake destroys the hideous thing, you come to the usual engineering decision: repair or rebuild.
The difference with software is that if code is used in the same way, day in and day out, and never extended or changed (other than fixing previously existing bugs), it should not wear out. Not wearing out depends on a few things, especially that the hardware does not advance. A working system delivered in 1980 (on, say, a classic minicomputer such as the VAX) should, if the same hardware is present, work the same today as it did when it was built.
The problems of software maintenance arise because things change.
While the original libraries used to build a system do not wear out in any physical sense, the code they interact with changes over time as idiots (oops, I meant to say marketers) demand new features and as the speed and complexity of hardware advance. Efforts at portability are noble and often worthwhile, but there is simply no way that a piece of code that ran on a 1-MIPS CISC (complex instruction set computing) computer is going to run, without significant retesting and change, on a modern processor with modern peripherals. Operating systems and device drivers can go only so far to hide the underlying changes from applications.
While I have seen plenty of navel-gazing exercises masquerading as refactoring, there comes a time in the life of all software when the design decisions it expresses must be reexamined. There is no hard and fast limit for this. If the code was a "prototype" (you know, code that management swore up and down they would never use, and then did), it is going to go bad sooner rather than later.
Programs that were written in a more reasonable style and without ridiculous schedules imposed from above maintain their freshness longer. I consider my own code to have a "best by date" of one year from when I complete the project. If I have not looked at some code in a year, I have probably forgotten how it worked, anyway.
I have been upgrading some Python 2 code to Python 3 and ran across the following change in the language. It used to be that division (/) of two integers resulted in an integer, but to get that functionality in Python 3, I need to use //. There is still a /, but that is different. Why would anyone in their right mind have two similar operations that are so closely coded? Don't they know this will lead to errors?
Divided by Division
Python is not the first (and I am quite sure it will not be the last) language to use visually similar symbols to mean different things. Consider C and C++, where bitwise and logical operations use very similar images to mean totally different operations: | for the bitwise or operation and || for the logical one, for example. I also recently discovered this change in Python 3, and my coworkers discovered it just after I did, as I was quite vocal in my reaction.
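Python itself has the same hazard as the C example, which makes for a handy illustration: | operates on bits, while the keyword or is the logical operation, and they produce different answers for the same innocent-looking operands. A minimal sketch, with illustrative values only:

```python
# Two similar-looking "or" operations with very different meanings.
a, b = 1, 2

bitwise = a | b    # bitwise or: 0b01 | 0b10 == 0b11
logical = a or b   # logical or: short-circuits, returns the first truthy operand

print(bitwise)  # 3
print(logical)  # 1
```

Swap one for the other in a conditional and the code may even appear to work, since nonzero integers are truthy, which is precisely how such bugs survive until 3 A.M.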
The problem of not having visually distinctive images in programming goes back to the problem, alluded to by Poul-Henning Kamp ("Sir, Please Step Away from the ASR-33!," Communications, November 2010), of the character set we use to create our languages. Language designers have only the character set shown in the accompanying figure to work with when they are looking for something to represent a shortcut to an operation. Many of the characters already have well-established meanings outside of programming, such as the arithmetic operations +, -, *, and /, and the language designer who decides to change their meanings should be severely punished.
It is certainly possible to forgo shortcuts and to make everything a function, such as (plus a b) for functional syntax, or to create a large list of reserved words, as in a equals b plus c for Algol-like languages. The fact is, as programmers, we like compact syntax and would balk at using something as bulky as the examples I have just given.
Another alternative is to throw away ASCII encoding and move to something richer in which we can have more distinct images to which we can attach semantic meanings. The problem then arises of how to type in that code. Modern computer keyboards are meant to allow programmers to type ASCII. Ask Japanese programmers whether they use a Japanese keyboard or an American one, and nine out of 10 will tell you an American one. They choose the U.S. version because the "programmer keys," the ones that represent the glyphs shown in the figure, are in the easiest-to-use placement. Extending our character set to allow for complex glyphs will slow the process of entering new code, and we all know that typing speed is the biggest indicator of code quality. Many years ago there was a language called APL that required a special keyboard. That language is mostly dead; look at the keyboard shown here to find out why.
That brings us to where we are now with / meaning one thing and // meaning another. I am quite sure many bugs will result from this conflation of images, and I am sure they are going to occur when the person working on the code has just been awakened from a deep sleep by a panicked telephone call. In the light of day, it is easy to tell / from //, but in the dim light of reawakening, it is not so easy.
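For the record, the trap the correspondent tripped over is easy to demonstrate in a few lines of Python 3:

```python
# Python 3: "/" is true division and always yields a float;
# "//" is floor division and rounds toward negative infinity.
print(7 / 2)    # 3.5
print(7 // 2)   # 3
print(-7 // 2)  # -4: floor division rounds down, not toward zero

# In Python 2, "/" applied to two ints behaved like today's "//",
# which is exactly why porting old arithmetic code needs care.
```

Note the negative case: // is a floor, not a truncation, so -7 // 2 is -4, another subtlety that will not be obvious in the dim light of reawakening.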
The Digital Library is published by the Association for Computing Machinery. Copyright © 2013 ACM, Inc.