Every opportunity to look forward is also an invitation to look back. We have little hope of understanding where we are going except within the context of where we have been. My own views on the future of computing and software grow from nearly four decades of work in the field. I do not want to drift into nostalgic reverie, but to put this into perspective: when I started my career at MIT, LISP had just been invented by McCarthy, Corbató was developing the first practical multiuser timesharing system, and Digital Equipment Corporation had donated a low serial-number PDP-1 for research use. On it, I would learn the art of programming with reusable components by writing routines for image analysis of physics experiments.
It was, like today, an era of anticipation when the possibilities seemed boundless, constrained only by the imagination and the limited speed and capacity of our computing machinery. I remember one of my professors, Marvin Minsky, remarking to a group of students that real-time computer translation of spoken natural language was within striking distance. The theory, the algorithms, and the software issues were thought to be well understood; all that was needed was faster computers with more memory. History has shown that the problem is not just a matter of hardware. Despite the fact that computers have become more powerful by many orders of magnitude, solutions to this and other problems that once seemed so tractable continue to elude us today.
Indeed, the decades since my apprenticeship have seen a staggering progression of developments on the hardware side of computing. In this light, it is easy to get excited about the bright future of computing and great fun to engage the imagination in unfettered speculation about future devices of ever increasing power and decreasing size.
Although I have designed computer hardware in my day, the software side of computing has always been my primary interest. On that side, it is much harder to point to anything like genuine progress. When we look back over the evolution of software, we see fewer clearly defined landmarks, and what dominates the landscape is the relentless trend toward ever larger, more complex, and less reliable systems loaded down with a panoply of bewildering features of questionable utility.
Wait a minute, Doc. Are you telling me you built a time machine out of a DeLorean?
—Marty McFly to Dr. Brown in the 1985 film Back to the Future
While change in software is evident everywhere, it is far more difficult to document genuine advancement in the usable capability that software delivers to its users. For the basic operations that account for 90% of what I do regularly, the differences are minuscule between the version of Microsoft Word I use today under Windows and the WordStar program I used in 1980 running on CP/M. There has been little increase in basic abilities or performance from the user perspective. In fact, today's application leviathans often take as much time to launch from our ultra-fast hard drives as those lean but effective programs of yesteryear took to load from pitifully slow 8-inch floppy disks.
Ironically, even as hardware has become increasingly reliable and dependable, software has become far less so. It has been years since I've had to deal with a disk crash, yet hardly a day passes without the operating system and application software conspiring to crash one or more of the machines in my office. A six-year-old machine that serves as our firewall has sat with its disk spinning away 24/7 for years with nary a glitch, yet Windows goes brain-dead if it is not rebooted at least once a week.
We have been peppered for decades with claims about the accelerating pace of change, yet many of the processes that shape the practices of computer science and software engineering grind along at a glacial pace. Today, for instance, the core software engineering concepts of coupling and cohesion are cited in nearly every basic text and are taught in colleges and universities around the world, yet it took nearly a decade to get anything on them published in an academically respectable journal and another decade before significant academic adoption occurred.
Ultimately, the true pace of change is not dictated by the evolution of science or technology or of ideas, but by the capacities of humans and human social systems to accommodate change. A product, a service, a practice, or a perspective—however new and innovative—can have no impact without acceptance; no significance without change in people and their institutions.
Hiding in Hardware
The true problem with software is hardware. We have been seduced by the promise of more and more and have become entranced under the spell of Moore’s Law. Continued progress in hardware is not a friend, but our nemesis. We have been shielded by hardware advances from confronting our own incompetence as software professionals and our immaturity as an engineering profession.
Contemporary programmers will point to the operating systems and protest that programming environments today are enormously more complex than those of yesteryear, but the real problem is in how we deal with this situation, in the discipline—or its lack—through which we attempt to overcome complexity.
Some years ago when one of the then-leading computing companies surveyed its own internal software engineering practices, the most mature, systematic, and disciplined programming processes were found among application programmers producing business software for internal consumption. Next in line were those creating engineering applications. On down the line and rock-bottom last were the so-called professionals writing the core operating system and its utilities. Where discipline counted for the most, it was least evident.
The story has changed little today. Our profession produces monstrously convoluted operating systems delivered with more than 100,000 known bugs. Leading commercial software systems are produced without benefit of analysis or design and without the guidance of models or diagrams. Bill Gates publicly declares he does not believe in diagrams and does not want his programmers doing design. So, we make him a hero and buy his software.
Gates is not alone. The vendors who sell us the modeling tools that make disciplined design possible do not use the tools or the methods themselves. They cobble together disparate pieces obtained through business acquisitions and market the pastiche as an integrated solution.
Many of the people who proudly display the title of software engineer on their business cards have never engineered anything in their lives. A title more reflective of what they do in practice might be "rapid code construction technician." The profession continues to elevate high-speed hacking over design and engineering.
It is possible to do better. Several million lines of code power the network laser printer that gets heavy daily use in our office. The machine is on continuously and prints on demand from anywhere in the network, yet it has never crashed and no detectable bugs have ever been evident.
The programming practices found within such niches give me cause for hope. Where memory is tight and processing speed is limited, programmers rise to the occasion, demonstrating all those good practices of advanced design and disciplined coding at the foundations of our profession. They subdivide and conquer complexity, they create reusable components that are highly cohesive and loosely coupled, they develop sound algorithms and data structures implemented in reliable code.
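What "highly cohesive and loosely coupled" means in code is easy to state concretely. As a minimal sketch (the class and the running-statistics example are my own illustration, not drawn from any particular embedded system), a cohesive component does one job completely, and a loosely coupled one knows nothing about its callers' world, no files, no printers, no operating system, only the plain values passed in and out:

```python
class RunningStats:
    """Single responsibility: accumulate count, mean, and variance.

    Cohesive: every method serves that one purpose.
    Loosely coupled: callers feed in plain numbers and read plain
    numbers back; the class has no other dependencies.
    """

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # running sum of squared deviations (Welford's method)

    def add(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    def variance(self):
        # Sample variance; defined only once we have at least two values.
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0


stats = RunningStats()
for value in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.add(value)
print(stats.n, round(stats.mean, 2), round(stats.variance(), 2))
```

Because the component touches nothing outside itself, it can be reused unchanged in a printer firmware, a test harness, or a desktop application, which is precisely the property the disciplined programmers above exploit.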
Those who work on desktop applications, software tools, or operating environments will all protest about unfair parallels, about the differences between the controlled and limited environment of embedded systems programming and the vastly more complex world of Windows programming. Although there is some validity in this excuse, the truth is that most of the protesting programmers neither practice the needed discipline nor use the systematic techniques of construction that such complexity demands.
Continued and rapid growth in the power of hardware has not only enabled new applications and capabilities, but has permitted sloppy, unprofessional programming to become the virtual standard of business and industry. Hardware has allowed the software profession to avoid growing up, to remain in an irresponsible adolescence in which unstable products with hundreds of thousands of bugs are shipped and sold en masse. We have multiplied our sins by conditioning our customers and our management to accept this deplorable state as the norm.
The true golden age of software engineering lies ahead, then, when the ability of hardware engineers to deliver doubled and redoubled capability begins its decline, when our ability to deliver more to a demanding marketplace will depend on something more sophisticated than unbridled code-crunching.
Fortunately, we already have the tools at hand in the established foundations of good software engineering and programming practice. All we will need to do is remember them. The bottom line is simply this: we will begin to mature as a profession when we cease being seduced by hardware, when we no longer think of computing as being about machinery, and when we stop looking to hardware to solve software problems. We need to grow up and accept mature professional responsibility not only for the software products we design and produce, but also for doing a better job of passing on to each new generation of software professionals the accumulated body of principles and practices that are our historical heritage.
If we do these things, the future of software looks bright indeed.