
Shine the Light of Computational Complexity


Regarding Moshe Y. Vardi’s view of computational complexity in his Editor’s Letter "On P, NP, and Computational Complexity" (Nov. 2010), I’d like to add that the goal of computational complexity is to explore the potential and limitations of efficient computation. While P vs. NP is central to that exploration, the field is not reducible to it alone; nevertheless, my comments here are limited to P vs. NP.

P vs. NP refers to the relative difficulty of finding solutions to computational problems in comparison to checking the correctness of solutions to these problems. Common sense suggests that finding solutions is more difficult than checking their correctness, and it is widely believed that P is different from NP. Vardi advocated the study of P vs. NP, saying that knowing is different from believing and warning that beliefs are sometimes wrong.
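
To make the finding-vs.-checking contrast concrete, here is a minimal Python sketch (my illustration, not part of the letter) for Boolean satisfiability, the canonical NP problem: verifying a proposed assignment takes one polynomial-time pass over the clauses, while the only obvious way to find one is exhaustive search over all 2^n assignments.

    # Checking vs. finding for Boolean satisfiability (SAT).
    # A formula is a list of clauses; a clause is a list of literals,
    # where +i means "variable i is true" and -i means "variable i is false."
    from itertools import product

    def check(formula, assignment):
        # Polynomial time: one pass over the clauses suffices.
        return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in formula)

    def find(formula, num_vars):
        # Exhaustive search: up to 2**num_vars candidates. No essentially
        # faster general method is known; if P != NP, no polynomial-time
        # algorithm exists at all.
        for bits in product([False, True], repeat=num_vars):
            assignment = dict(enumerate(bits, start=1))
            if check(formula, assignment):
                return assignment
        return None

    # (x1 or not x2) and (x2 or x3)
    print(find([[1, -2], [2, 3]], num_vars=3))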

The ability to prove a central result is connected to obtaining a much deeper understanding of the main issues at the core of a field. Thus, a proof that P is different from NP is most likely to lead to a better understanding of efficient computation, and such a theoretical understanding is bound to have a significant effect on computer practice. Furthermore, even the ideas developed along the way in attempts to address P vs. NP influence computer practice; see, for example, SAT solvers.

This does not dispute the claim that there is a gap between theory and practice; theory is not supposed to replace but rather inform practice. One should not underestimate the value of good advice or good theory; neither should one overestimate it. Real-life problems are solved in practice, but good practice benefits greatly from good theory.

One should also realize that the specific formulation of the P vs. NP question (in terms of polynomial running time) is merely the simplest formulation of a more abstract question. Ditto with respect to the focus on worst-case complexity. In either case, the current formulation should be viewed as a first approximation, and it makes sense to study and understand it before moving forward.

Unfortunately, we lack good theoretical answers to most natural questions regarding efficient computation—not because we ask the wrong questions but because answering is so difficult.

Despite our limited understanding relative to the questions we ask, we have made significant progress compared with what we knew several decades ago. Moreover, this theoretical progress has influenced computer practice (such as in cryptography). It makes sense that most of computer science deals with doing the best it can at the moment—developing the best computational tools, given the current understanding of efficient computation—rather than waiting for sufficient progress in some ambitious but distant project. It also makes sense that theoretical computer science (TCS) helps meet today’s practical challenges. But it is crucial for a particular aspect of TCS—complexity theory—to devote itself to understanding the possibilities and limitations of efficient computation.

Oded Goldreich, Rehovot, Israel

Hold Manufacturers Liable

In his Viewpoint "Why Isn’t Cyberspace More Secure?" (Nov. 2010), Joel F. Brenner erroneously dismissed the value of making software manufacturers liable for defects, with this misdirected statement: "Deciding what level of imperfection is acceptable is not a task you want your Congressional representative to perform." But Congress doesn’t generally make such decisions for non-software goods. The general concept of "merchantability and fitness for a given application" applies to all other goods sold and likewise should be applied to software; the courts are available to resolve any dispute over whether an acceptable level of fitness has indeed been met.

In no other commercial realm do we tolerate the incredible level of unreliability and insecurity characteristic of today’s consumer software. Better engineering is more challenging, and the software industry could experience dislocations as its developers learn to follow basic good engineering practices in every product they bring to market, but that difficulty does not excuse the harm done to consumers when such practices are not employed.

L. Peter Deutsch, Palo Alto, CA

Author’s Response:

The challenge is in writing standards that would improve security without destroying creativity. "Basic good engineering" is not a standard. A "merchantability and fitness" standard works for, say, lawnmowers, where everyone knows what a defect looks like. It doesn’t work for software because defining "defect" is so difficult, and the stuff being written is flying off the shelves; that is, it’s merchantable. It’s also sold pursuant to enforceable contracts. So while courts are indeed available to resolve disputes, they usually decide them in favor of the manufacturer. Deutsch and I both want to see more secure and reliable software, but, like it or not, progress in that direction won’t be coming from Congress.

Joel F. Brenner, Washington, D.C.

Code Syntax Is Understanding

In his article "Sir, Please Step Away from the ASR-33!" (Nov. 2010), Poul-Henning Kamp was deliberately provocative regarding programming language syntax, but his arguments were confused and off the mark. To call attention to new directions available to language designers, he focused on syntax and, surprisingly, complained about the "straightjacket" imposed by ASCII, but he also made a good point regarding the importance of "expressing our intentions clearly." Still, he distorted the role of "syntax," which involves critical goals besides "expressivity," including machine interpretation, amenability to formal analysis, efficiency (in many dimensions), and persistence over time. A computer language also concerns communicating with the computer.

Kamp seemed to conflate the formal syntax of a language with a variety of presentation and communication issues in programming environments, rather than with the language itself. His examples even demonstrated my point; no matter what the formal syntax, contemporary tools can overlay useful semantics, making it much easier for humans to express their ideas. Why in the world would we want to enshrine the vagaries of human perception and cognition in the syntax of a computer language?

I also have reservations about many of Kamp’s suggested improvements. He clearly privileges "expression" over "communication," and his reference to using color and multi-column layouts is highly problematic. These concepts make assumptions about the technical capabilities available to users that are as likely to change as the perceived technical constraints that led to the syntax of C. Despite his intention to be provocative, Kamp was also quite conservative in his technological assumptions, staying with two dimensions, eschewing sound, ignoring handheld technology, and generally expecting WIMP interfaces on commodity PCs.

I find the biggest problem with programming languages involves understanding, not expression. I’m probably typical in that I’ve read far more code than I’ve written, most of it by strangers in contexts I can only guess. Many of my toughest challenges involve unraveling the thoughts of these other programmers based on limited evidence in the code. For the reader, adding color coding, complex nonlinear texts, and thousands of glyphs means only more cognitive load. It’s difficult enough to "execute C/Java/etc. in my head"; mapping complex, multi-colored hypertext to formal logic only makes it more difficult.

Kamp made a great case for why humans should never see programming languages, just like they should never see XML or RDF. They should express themselves through environments that promote communication by humans, with the added ability for the machine to generate correct code (in multiple languages) as needed. While such communication could be implemented through a programming language, the language itself would need features not generally found in conventional languages, including semantics (not easy to define or express) that model displays and layouts.

All in all, I thank Kamp for his comments, but ASCII isn’t the heart of the matter. The real heart is the design of programming environments.

Robert E. McGrath, Urbana, IL

Author’s Response:

As desirable as McGrath’s vision might be, in a trade where acrophobia is the norm, giant leaps may be ineffective. So while others work on the far future, I am happy to start the stairs that will eventually get us there.

Poul-Henning Kamp, Slagelse, Denmark

Credit Due for Tandem’s Hardware and OS

Joe Armstrong explained in his article "Erlang" (Sept. 2010) how a programming language originally developed for "building high-performance telecom switches" is today used for a range of high-availability, scalable applications, but I would like to clarify two parts of that explanation. The first is the statement: "This technique was used by Jim Gray [2] in the design of the fault-tolerant Tandem computer." Gray made major contributions at Tandem, but by late 1980, when he joined the company, its fundamental hardware and operating-system fault-tolerance techniques were already established. Assigning credit accurately for the various aspects of a large and complex system design (such as Tandem’s NonStop systems) may be tricky, but Bartlett [1] and Katzman [4] were key contributors to the operating system and the hardware, respectively. Bartlett’s paper acknowledged Dennis McEvoy, Dave Hinders, Jerry Held, and Robert Shaw as contributing to design, implementation, and testing. Finally, the co-inventors listed on the first patent granted on the Tandem system, 4,228,496, October 14, 1980, filed September 7, 1976, were: Katzman, James A. (San Jose, CA); Bartlett, Joel F. (Palo Alto, CA); Bixler, Richard M. (Sunnyvale, CA); Davidow, William H. (Atherton, CA); Despotakis, John A. (Pleasanton, CA); Graziano, Peter J. (Los Altos, CA); Green, Michael D. (Los Altos, CA); Greig, David A. (Cupertino, CA); Hayashi, Steven J. (Cupertino, CA); Mackie, David R. (Ben Lomond, CA); McEvoy, Dennis L. (Scotts Valley, CA); Treybig, James G. (Sunnyvale, CA); and Wierenga, Steven W. (Sunnyvale, CA).

Armstrong also said, "Adding transactions is easy," then sketched an implementation. While the implementation may be useful in the telecom world, it does not provide the durability of database transactions; see, for example, Gray and Reuter [3].
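
For readers unfamiliar with the distinction, durability means a committed transaction survives a crash, conventionally achieved by forcing a log record to stable storage before acknowledging the commit. Here is a minimal Python sketch of that write-ahead discipline (my illustration, not McJones’s or Armstrong’s code; Gray and Reuter [3] describe the real thing):

    import json, os

    def commit(log_path, txn_id, updates):
        # Durability: the commit record must reach stable storage
        # before the commit is acknowledged to the caller.
        record = json.dumps({"txn": txn_id, "updates": updates})
        with open(log_path, "a") as log:
            log.write(record + "\n")
            log.flush()
            os.fsync(log.fileno())  # force the bytes to disk
        return "committed"          # only now is it safe to acknowledge

    print(commit("wal.log", 1, {"alice": -25, "bob": 25}))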

Paul McJones, Mountain View, CA

Author’s Response:

McJones is correct in saying the transactions described in my article are nondurable. Database transactions with the ACID properties—atomicity, consistency, isolation, durability—are provided by the mnesia database included in the Erlang distribution. The purpose of that section was not to survey kinds of transactions but to show how easily a particular type of nondurable transaction, one that on failure reverts in-memory state to an earlier value, can be accomplished through single-assignment variables.
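
To illustrate the pattern (in Python rather than Erlang, and as my own sketch rather than the article’s code): if the function attempting the transaction builds a new state instead of mutating the old one, failure requires no undo at all; the old binding is simply kept, which is what single-assignment variables guarantee in Erlang.

    def transaction(state, fun):
        try:
            return fun(state)    # fun builds a new state; it never mutates
        except Exception:
            return state         # on failure, the old state is still intact

    def transfer(state, frm, to, amount):
        if state[frm] < amount:
            raise ValueError("insufficient funds")
        new_state = dict(state)  # copy, emulating single assignment
        new_state[frm] -= amount
        new_state[to] += amount
        return new_state

    accounts = {"alice": 10, "bob": 0}
    accounts = transaction(accounts, lambda s: transfer(s, "alice", "bob", 25))
    print(accounts)  # {'alice': 10, 'bob': 0} -- the transfer aborted cleanly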

Joe Armstrong, Stockholm

References

    1. Bartlett, J.F. A 'nonstop' operating system. In Proceedings of the Hawaii International Conference on System Sciences (1978), 103–119.

    2. Gray, J. Why Do Computers Stop and What Can Be Done About It? Technical Report 85.7. Tandem Computers, Inc., 1985.

    3. Gray, J. and Reuter, A. Transaction Processing: Concepts and Techniques. Morgan Kaufmann, 1993.

    4. Katzman, J.A. System architecture for NonStop computing. CompCon (1977), 77–80.

    Communications welcomes your opinion. To submit a Letter to the Editor, please limit your comments to 500 words or less and send to letters@cacm.acm.org.

    DOI: http://doi.acm.org/10.1145/1897816.1897818
