Jeff Kramer's view, expressed in his article "Is Abstraction the Key to Computing?" (Apr. 2007), that abstraction is indeed a key concept in computing, especially in software design, is correct but far from new. It's a lesson I learned from the late E.W. Dijkstra 40 years ago, and one that underlies every software development method proposed since then. Dijkstra said many useful things. Among them is the most useful definition of "abstraction" I know: "An abstraction is one thing that represents several real things equally well." This positive definition is more useful than the more typical ones Kramer quoted, which emphasize the elimination of information; Dijkstra's definition clarifies what must remain.
Dijkstra's definition allows us to distinguish between an abstraction and a lie. When a model makes assumptions that are not true of a real object (such as infinite memory), these assumptions are often defended by saying "It is an abstraction." Using Dijkstra's definition, such models are not abstractions. Rather than represent several things equally well, they represent nothing at all. Because they embody unrealistic assumptions, one cannot trust the conclusions that might be drawn from them.
Models that are not abstractions in Dijkstra's sense may provide insight or understanding but can also mislead. Programs based on them may not work, and theories based on them may yield results not relevant in the real world.
Dijkstra's work showed that two distinct skills are related to abstractions: the ability to work with a given abstraction, analyzing it and drawing sound conclusions from it, and the ability to develop an abstraction appropriate to the problem at hand.
Mathematics courses teach us how to work with abstractions but not usually how to develop appropriate ones. Many researchers I know can analyze formal models, deriving properties and proving theorems, but do not seem to notice (or care) when a model is based on an impractical design or makes assumptions that are not true in reality. Both skills are important, but teaching the second is much more difficult and is the essence of design.
Many computer science courses fail to teach students how to develop abstractions because they use models that are not abstractions but lies. Students must be taught the implications of an idea often attributed to Albert Einstein: "Everything should be as simple as possible but not simpler." Finding the simplest model that is not a lie is the key to better software design.
David Lorge Parnas
Abstraction is certainly the key to computing, as Jeff Kramer said (Apr. 2007), but mathematics is not the key to abstraction. Abstraction is all about ideas and finding the essence of something. The key to computing is figuring out what is important in an incompletely understood situation and developing a conceptual framework that captures its essence.
Mathematics, especially as it is taught at the undergraduate level, has nothing to do with helping students get to the essence of something. Undergraduate mathematics is primarily taught as instruction in a foreign language. Students are required to learn previously defined concepts and a new and often opaque notation in which these concepts are expressed. As typically taught, undergraduate mathematics may give students powerful tools but has nothing to do with learning the process of abstraction.
Worse, mathematics is sometimes taught strictly as a formalism; if the symbols can be manipulated properly, one doesn't have to care what they mean.
A nice way to teach computer science students about mathematics as an abstraction process would be to present it as analogous to software design patterns: a useful way to think about the abstract aspects of problems, just as patterns give students useful ways to think about the abstract aspects of software. Taught this way, mathematics helps computer science students develop their intuition about abstraction at the same time they are mastering mathematical tools and concepts.
I couldn't agree more with Jeff Kramer's article (Apr. 2007) on the value of teaching abstraction to students of computer science and software engineering. Mastering abstraction is not straightforward, perhaps due to human psychology; reasoning about abstract ideas without knowing all their details can be difficult. Some of us are naturally "top-down" thinkers, while others are "bottom-up" thinkers uncomfortable with black-box thinking. However, solving complex engineering problems requires that we think top-down, establishing various levels of abstraction without concern for the details of implementing each abstraction.
Using recursion to solve problems is an excellent way to master the concept of abstraction, because recursion solves a problem by using the solution to a smaller instance of the same problem. We thus live with the abstraction that the smaller problem has been solved, and the job is now to assemble the solutions of the smaller problems into a solution of the larger one.
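To illustrate (a minimal sketch in Python, my choice of language, not the letter writer's), merge sort treats the two half-size sorts as already solved and concentrates only on combining their results:

```python
def merge_sort(xs):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    # Abstraction step: treat the two smaller problems as solved...
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # ...and focus only on assembling their solutions.
    return merge(left, right)

def merge(a, b):
    # Combine two sorted lists into one sorted list.
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]
```

The recursive calls are black boxes: while writing `merge`, one never thinks about how the half-lists got sorted.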
Years ago, I taught myself the BASIC programming language. However, it was not until I took an introductory computer science course in college that I fully understood the use of abstraction to conquer complexity. The course used the textbook Structure and Interpretation of Computer Programs (mitpress.mit.edu/sicp/) to help us learn to use a dialect of the LISP programming language to solve problems. Since LISP lends itself naturally to recursion, we used recursion to solve problems. Other courses helped me further master the concept of abstraction. For example, a digital design course required us to build digital devices using transistor-transistor logic (TTL) on a circuit board. We had to view each piece of TTL as an abstraction, without concern for how TTL works, focusing instead on assembling the various abstractions into other, higher-level abstractions.
Building abstractions allowed us to focus on one manageable piece of abstraction at a time while assuming all the other abstractions the current one depends on were resolved. Problem solving through abstraction enables our minds to scale as the size and complexity of the problem increase. Abstract thinking is teachable. Learning to solve problems through recursion and black-box solutions helps us master the concept of abstraction.
Zhen Hua Liu
San Mateo, CA
I am not convinced that the fact that a pencil falls when dropped proves that "...in the realm of causes it is a proof for the existence of God," as Tom Croy said in his "Forum" comment "Stop Chasing the AI Illusion" (Apr. 2007). The following remark is attributed to Alan Turing, our beloved father of AI: "When machines know how to think we won't know how they do it." If you're the believing kind, this might be a more celebrated authority. AI is, to me, nothing more than a programming system for certain classes of problems.
David H. Brandin
Former President of ACM (1982–1984)
Los Altos, CA
I disagree with Frederick Hills's "Forum" comment "Let Everyone Be a Programmer" (Apr. 2007) for two reasons. The first is that, while it is true that average users in the pre-Windows 3.1 world spent more time and effort programming their systems, they did so out of necessity and were more "techie" than the average user is today. The second is more of an observation. Microsoft has a long history of being sued for including out-of-the-box features in Windows. Recall the media-player circus? If Microsoft were to bundle a free development environment with Windows, the silence in Redmond would be shattered by legions of lawyers experiencing cardiac arrest.
Many programming tools are free and easy to learn, including Express editions of Microsoft's development suite. There are also Perl, Tcl/Tk, Ruby, and a host of free BASIC and C/C++ tools in the wild, all available to whoever wants them.
Those who want to learn to program usually do learn to program. However, most users don't want to; they want PCs to be commodities. Even today's supposedly tech-savvy kids are not really that keen on hacking computers (in the old sense) but on using them for games and socializing.
The reason for the shortage of programmers in the U.S., as well as here in South Africa, is probably a cultural issue. In a world where the media shows little apparent respect for scientists and engineers, and the money seems to be elsewhere, too few quality students take science, engineering, or IT courses. Who can blame them?
Moreover, in my experience at least, many IT students come to a course with their heads full of visions of Mr. Gates's billions, believing that IT is the way to easy money. The result is disillusioned students, high failure and dropout rates, and lowered standards for those with the aptitude and disposition for the work. Discussions with peers in the U.K., Australia, and the U.S. lead me to believe I am not the only one to have had this experience.
Gauteng, South Africa
In his "Forum" comment "Fewer Degrees of Separation Means More Precision in Software Projects" (Nov. 2006), Jeffrey A. Rosenwald addressed Phillip G. Armour's assertion in "The Business of Software" column ("Software: Hard Data," Sept. 2006) that "software is developed iteratively through a closed-loop feedback system." Rosenwald's position may be close to reality in many environments, but the rest of his comment did little to support his argument.
Saying that errors simply accumulate, while dismissing the possibility that errors in one step might offset errors in a subsequent step, seems plain wrong. For example, he offered no rationale for choosing the function x' = x² - 2 as representative of any software development process, yet expects us to be surprised when values from nearby starting points diverge after a few iterations. If the iterations of a software development process are to be described by a simple algebraic function, it should be one with a bit of negative feedback, modeling the tendency of reasonable practitioners to learn from their own mistakes.
For example, x' = x - (x² - Y)/(2x) converges rapidly to one of the square roots of Y from any starting value other than zero. I don't suggest that this is a model of any particular project evolution, but it is not much more complicated than Rosenwald's function. It also reflects the common practice of basing actions on how far one is from one's objective, rather than doing exactly the same thing over and over again.
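The contrast is easy to check numerically; here is a minimal Python sketch (the function names are my own, purely illustrative):

```python
def quadratic_step(x):
    # Rosenwald's function x' = x^2 - 2: no corrective feedback.
    return x * x - 2

def newton_step(x, y):
    # A step with negative feedback: Newton's method for sqrt(y),
    # i.e. x' = x - (x^2 - y)/(2x).
    return x - (x * x - y) / (2 * x)

# Two nearby starting points under x' = x^2 - 2 soon drift apart...
a, b = 0.5, 0.50001
for _ in range(30):
    a, b = quadratic_step(a), quadratic_step(b)

# ...while the Newton iteration reaches sqrt(2) from very different starts.
c, d = 0.5, 100.0
for _ in range(30):
    c, d = newton_step(c, 2.0), newton_step(d, 2.0)
print(c, d)  # both approximately 1.41421356...
```

Rerun with other starting values and the pattern holds: the quadratic map is exquisitely sensitive to its input, while the feedback step forgets where it started.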
George Santayana famously said "Those who cannot learn from history are doomed to repeat it." Rosenwald appears to believe that software teams are doomed.
©2007 ACM 0001-0782/07/0600 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2007 ACM, Inc.