Hans Moravec’s "Robots, After All" (Oct. 2003) seemed to me to be yet another version of "Someday computers will wake up," resting on the same intellectually faulty arguments and the same foundational quicksand from which AI has always suffered.
It included a nice graphical depiction of the argument, plotting mental power and MIPS on the same scale, but that was a case of good news and bad news. The visual presentation of Moravec’s argument was so effective it distracted from the underlying fallacies. Just because you put a picture of a chimp and a human on the same graph with a Dell computer doesn’t mean that chimp, or human, capabilities can be meaningfully measured in MIPS.
Moravec was operating with a description of human behavior that is standard in much of the AI community, drawing (or leaping to) some extravagant conclusions from it. That description has been at the heart of AI’s predictions, and failures, for perhaps 40 years: a person is a machine; the behavior of a person is machine behavior, or processes; all we have to do is model the processes.
Moravec’s presentation followed the pattern of all too much work in AI. He cited modest successes in limited tasks, assumed success in more complex tasks requires nothing more than more MIPS, ignored the question of the nature of that complexity, ignored several huge unsolved problems, and then made startling predictions. Here are a number of examples: assuming 10K MIPS is equivalent to a lizard mind; assuming it is appropriate to model a mind as a set of processes, the assumption underpinning the entire MIPS measure; assuming lizard-level capability is adequate for the chores of guarding homes, taking inventory, and playing games; ignoring or dismissing the aspects of physical work (acting on concepts and exercising skills) where AI has demonstrated almost no progress; assuming trainability will somehow, with no indication of how, be an attribute of the supposed 300K-MIPS mouse brain; assuming the old reinforcement theory of learning will, with assumed "conditioning modules," produce real learning; and assuming that learning "skill[s] through imitation will afford a kind of consciousness."
Left entirely undefined, unspecified, and unaddressed are the following questions: What is behavior? What is a mind? What is consciousness? What are "psychological factors" and "cultural factors"? What is reasoning? What role does reasoning (of any kind) play in behavior, life, and consciousness? What is involved in that enormous range of human life dismissively categorized as "everyday actions"? Is behavior reducible to underlying processes? Is a person a machine?
AI has been all too ready to make startling predictions based on modest successes and huge theoretical assumptions. I find it difficult to understand why a continuation of this tradition is considered worthy of publication in Communications.
H. Joel Jeffrey
DeKalb, IL
Author Responds:
Prof. Jeffrey is correct: 40 years of effort proved insufficient to recapitulate what took 500 million years to evolve the first time. May he forgive those of us who persist in the effort and try to plan ahead. I make no apology for the mechanistic assumptions; they are counterintuitive and at variance with much of Western philosophy, let alone religion, but are supported by more and more hard evidence, especially, in recent years, from molecular biology. The burden of proof is now on those who would claim life cannot be explained as molecular-scale machinery.
Mandate Safety-Critical Software
Peter G. Neumann’s "Information System Security Redux" (Inside Risks, Oct. 2003) proposed that we "Just Say No" to security-deficient software—a slogan and attitude that won’t succeed.
Years ago we acknowledged that automobile seat belts would save lives. Auto manufacturers offered safety as an option, but few buyers responded. Drivers do not wear seat belts without laws mandating their use; manufacturers cannot be expected to build what consumers will not buy. That’s why "Just Say No" cannot succeed. Today, we must view software as a critical resource, and, as we did with automobile seat belts, mandate safety.
I thus propose the creation of a National Software Safety Board (NSSB) that would issue a template describing how safety is to be measured, the criteria for safety that must be attained, and a schedule for attaining it. Moreover, the NSSB would test critical software to assure us that safety goals are indeed being attained.
The scope of the NSSB would be limited to Internet protocols and operating systems. Safety-critical operating systems exist, mainly in the embedded software domain. Software engineers know how to build secure safety-critical software. We now need the discipline of law to make them do it.
So, buckle up your buffer overflows; it’s time we closed this vulnerability.
Joseph Frisina
Wayne, NJ
Limited Math Limits the CS Curriculum
The scientific foundation of computer science cannot be soft (Sept. 2003).
Historically, CS was part of the mathematics curriculum. The earliest CS curricula rested heavily on mathematics. As the discipline and technology of CS matured, the field came into its own, drifting away from its mathematical roots.
Now we see a number of CS curricula with fewer math courses than good physics and engineering programs have. Why are curriculum designers removing mathematics from CS programs? Is it to improve enrollment and retention?
Mathematical maturity is necessary for studying many applications of CS, including digital signal processing, image processing, machine vision, and computer graphics.
Creating mathematically impoverished curricula, we limit access not only to courses but to course topics as well. For example, if a CS student knows only linear algebra, then I dare not teach radiosity in my computer graphics class. And I probably cannot require a third course in university physics, as such courses often cover Maxwell’s equations.
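The radiosity example is worth spelling out. At its heart is an integral equation (given here in one standard form; notation varies by text):

$$B(x) = E(x) + \rho(x) \int_S B(x')\, F(x, x')\, dA',$$

where $B$ is radiosity, $E$ is emission, $\rho$ is reflectivity, and the kernel $F$ encodes the geometric relationship between surface points $x$ and $x'$. Linear algebra enters only after discretization, as $B_i = E_i + \rho_i \sum_j F_{ij} B_j$; a student without integral calculus can solve the resulting linear system but cannot see where it came from.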
Who, then, will devise the algorithms based on mathematics? What will happen if we continue to rob CS curricula of math? Will computer scientists have to invoke algorithms implemented by engineers? Will they even be able to understand such algorithms?
Douglas Lyon
Fairfield, CT
I do not disagree with Kim B. Bruce et al.’s "Why Math?" as far as it goes (Sept. 2003). But in the context of a university education, a central reason for studying mathematics was not mentioned: Math is a cultural artifact. It has beauty and permeates history. I would not want students to graduate without, say, having read a haiku by Issa, seen a drawing by Dürer, shared Darwin’s understanding of speciation, or grasped some aspect of Einstein’s contribution to our knowledge of the world. Neither would I want them to leave school without being exposed to the wonder of Euclid’s proof that there are an infinite number of primes or Cantor’s exploration of infinite sets. All are remarkable achievements.
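Euclid’s argument, for instance, takes only a few lines in modern notation: suppose $p_1, p_2, \ldots, p_n$ were all of the primes, and let $N = p_1 p_2 \cdots p_n + 1$. Dividing $N$ by any $p_i$ leaves remainder 1, yet $N > 1$ must have some prime factor; that factor lies outside the list, so no finite list of primes can be complete. A student can absorb the whole idea in a minute, and it has lost nothing in 2,300 years.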
The article by Peter B. Henderson rejects, for a CS curriculum, the need for the classical mathematics of continuity in favor of discrete mathematics. I do not disagree about the importance of discrete mathematics; it is useful and deep. But those of us who read Game Developer magazine, along with Communications, find its articles full of partial differential equations and integrals; until I started reading them, I had no idea of their technical depth. Few are the CS students who are not attracted to game design, and seeing the need for continuous mathematics in games can be strong motivation. But students not adept at calculus will find many doors closed. The beauty and power of the concepts of calculus are in and of themselves worthwhile.
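A single frame of game physics makes the point. The sketch below (my own minimal illustration, not drawn from any of the cited articles) integrates the equations of motion $dv/dt = g$, $dy/dt = v$ with an explicit Euler step; the constants are arbitrary:

```python
# Explicit Euler integration of dv/dt = g, dy/dt = v:
# the differential equation hiding inside every game physics loop.
dt = 1.0 / 60.0    # one frame at 60 Hz
g = -9.8           # gravitational acceleration, m/s^2
y, v = 100.0, 0.0  # initial height (m) and vertical velocity (m/s)

while y > 0.0:
    v += g * dt    # accumulate acceleration into velocity
    y += v * dt    # accumulate velocity into position

print(f"impact at roughly v = {v:.1f} m/s")
```

A student who has never seen a derivative can type this in and watch it work, but cannot say why a smaller dt gives a more accurate trajectory, or when the scheme goes unstable. That is exactly the calculus the impoverished curriculum has dropped.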
Keith Devlin’s introduction to the section leaned in this direction but did not quite come out and say it. I understand we must balance the burden of learning a wealth of mathematics against the many other demands on students, but I would never recommend ignoring the aesthetics and history of mathematics or the calculus.
Jef Raskin
Pacifica, CA
The illustrative taxi-scheduling anecdote in Kim B. Bruce et al.’s "Why Math?" concluded: "Mathematical proofs are the only way to distinguish among the alternatives [algorithms]." There are, in fact, at least three other ways: search the literature; create a prototype and measure its performance as data points are added; and ask someone knowledgeable for an assessment.
Customers of consultants don’t require, or pay for, proofs. Moreover, a proof is useless in explaining to a client that the attractive-sounding option isn’t practical.
Consulting has more analytical, synthetic, and social aspects than mathematical ones. I’ve never worked with a new CS graduate who had the mental tools for learning the architecture of a large system and understanding how to extend it without doing damage to it. I’ve often wished my colleagues had this ability, but never for more math skills.
John Craig
Orem, UT
Hands-On Is How We Learn
I loved Philip Armour’s "Closing the Learning Application Gap" (The Business of Software, Sept. 2003). One thing that has always annoyed me about teaching engineers is their proclivity for, as Armour put it, thinking (and talking) about how they would build whatever I was trying to teach them. It irritated me to have to put up with it gracefully. The article helped me understand that this is how they learn. I now see better why those of us with "engineer’s minds" never understand the theory until we get a couple of labs under our belts.
In my current position at Boeing, I’m responsible for a massive training effort; the Global Positioning System is being re-hosted from an IBM mainframe to Sun Microsystems servers, and our software engineers (about 60) need to be trained in Unix (Solaris) and new languages. I’ve always been a proponent of hands-on training to the extent possible, but Armour’s column gave me new ways to view training approaches. I’ll especially encourage our engineers to develop their own "mushrooms." That will surely make my job easier.
Joy Getha
Colorado Springs, CO
Not Enough Joy of Computing
I agree with Peter Denning’s "Great Principles of Computing" (The Profession of IT, Nov. 2003) that many students have turned to cheating and plagiarism to pass programming courses and that they do not experience the joy of computing. However, I hardly think students whose logical prowess does not permit them to master conditional statements would appreciate the intellectual principles of computer science. The U.S. college population has grown to include many students with a less intellectual bent.
Process algebra is the intellectual foundation for many aspects of current computing practice. In his Introduction to Process Algebra, Wan Fokkink writes that the book grew out of an undergraduate course at the University of Wales Swansea. Meanwhile, Communicating Sequential Processes by C.A.R. Hoare is a classic. Students who have to cheat to pass a programming course would certainly not love reading either of these books, or even simpler, less-elegant works.
Personally, I experience great joy in developing engineering programs to solve interesting problems. For example, I assigned a project to implement LZW compression and loved doing it myself, creating dracula.zip, the compressed version of another classic, and uncompressing to restore the original text. I recently showed my class how the GIF format uses LZW, tracing a GIF file together. Fun. But students who cannot program are not ready or able to appreciate it. First things first.
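For anyone who wants to share the fun, here is a minimal LZW sketch in Python. It is my own illustration, not the classroom version; GIF’s variable-width code packing and table-size limit are omitted:

```python
def lzw_compress(data: bytes) -> list[int]:
    """Compress a byte string into a list of LZW codes."""
    # Seed the dictionary with one entry per possible byte value.
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    phrase = b""
    codes = []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in table:
            phrase = candidate           # keep extending the current phrase
        else:
            codes.append(table[phrase])  # emit the longest known phrase
            table[candidate] = next_code # learn the new phrase
            next_code += 1
            phrase = bytes([byte])
    if phrase:
        codes.append(table[phrase])
    return codes


def lzw_decompress(codes: list[int]) -> bytes:
    """Invert lzw_compress, rebuilding the dictionary on the fly."""
    table = {i: bytes([i]) for i in range(256)}
    next_code = 256
    prev = table[codes[0]]
    out = [prev]
    for code in codes[1:]:
        # The one subtle case: the code may refer to the phrase
        # being defined right now (the classic KwKwK situation).
        entry = table[code] if code in table else prev + prev[:1]
        out.append(entry)
        table[next_code] = prev + entry[:1]
        next_code += 1
        prev = entry
    return b"".join(out)


text = b"TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_compress(text)
assert lzw_decompress(codes) == text
print(len(text), "bytes ->", len(codes), "codes")
```

Tracing how "TOBE" earns a single code the second time it appears is the same exercise we did with the GIF file, minus the bit packing.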
Art Gittleman
Long Beach, CA