Forum

When It Comes to Software, Don't Just Do It

The things Peter G. Neumann described as risks in his "Inside Risks" column ("Risks Relating to System Compositions," July 2006) I would generally characterize as stupidity.

Unfortunately, too many software-oriented people, especially hackers and other free spirits, take an approach that is the equivalent of playing Russian roulette, inevitably causing most of their own problems. Who hasn’t witnessed cowboys "improving" their code, only to break other modules and delay final integration and testing? Rigorous configuration management (CM) would prevent this, but most projects use flawed forms of CM, requirements handling, and other processes. Worse, they sometimes take the Nike approach, that is, they "Just Do It."

The "risks" noted by Neumann are easily handled by a disciplined systems process that includes systems architecture and systems engineering supported by rigorous infrastructure processes (such as CM).

Unfortunately, too many projects, even when they do not omit architecture completely or simply jump to a solution, confuse software architecture with systems architecture. Indeed, most software developers view software as the (entire) system, not just as a component of a larger entity.

Systems architecture must include all necessary technical architectures (such as those for hardware, software, security, communications, information, and operations) while considering the context and constraints that determine solution alternatives for analysis and selection. A key result of the systems architecture is the selection of the nonfunctional requirements that drive solution feasibility.

Most of what Neumann called composition problems are handled well by systems approaches to what are usually called "emergent properties." The other problem types are handled through other systems approaches.

Admittedly, some systems engineers omit architecture, do not understand requirements, and otherwise fail to take a professional approach to solving problems. This is not a fault of systems architecture or systems engineering but reflects a lack of practitioner training and experience, the blame for which belongs to a program’s management.

Moreover, the creation system (such as an enterprise’s executive management) can also cause problems (such as allowing faulty project management and failing to provide training or adequate resources).

There are certainly large, complex, nonlinear, unprecedented systems with extreme constraints and difficult contexts for which software engineering is unable to guarantee solutions. But systems methods offer the best chance for success, while the traditional software approaches may actually guarantee failure.

William Adams
Springfield, VA

Author’s Response:

Adams’s comments only strengthen my conclusions. In theory, composition may be utterly trivial, like snapping together Lego blocks. In practice, however, composition is not at all trivial. Practical difficulties in achieving composability include flawed approaches (disregarding systems engineering, requirements, architecture, sensible software development, interface design, and assurance), inadequate foresight (particularly short-term optimization), and self-serving business practices (such as monolithic nondecomposable systems, built-in incompatibilities, and overreliance on patch management). Needed are improved education and major culture shifts in development.

Peter G. Neumann
Menlo Park, CA


A Single Vested Source Is Not Data

We’re extremely concerned about Kallol Bagchi et al.’s article "Global Software Piracy: Can Economic Factors Alone Explain the Trend?" (June 2006), wondering how a peer-reviewed magazine from a serious professional organization would permit publication of a report that predicated its research on data from a single source with a vested interest in magnifying apparent losses due to activities it characterizes as "piracy." (For a critique of the Business Software Alliance, see "BSA or Just BS?" in The Economist, May 21, 2005.)

We do not doubt that unauthorized copying of copyrighted materials takes place, but estimating its true scale is difficult. Multiple approaches and sources are needed and must be fully open to review.

John C. Nash
Dru Lavigne
Russell McOrmond
Charles McDonald
Michael Richardson
Raymond Wood
Ottawa, Ontario, Canada

Author’s Response:

While multiple sources and approaches are preferred, estimates obtained through regression are standard and viable. The Business Software Alliance and International Data Corporation are primary sources of piracy data used extensively by researchers. We conducted our preliminary study to identify possible factors (economic, technical, regulatory, and societal) associated with piracy, not to identify individual nations as "pirates." If the data is exaggerated, all nations are equally affected, so trends are still detectable.
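
To see why a uniform exaggeration would not obscure a trend, consider a toy numeric sketch (illustrative only, with invented figures, not data from the study): multiplying every reported loss by the same factor multiplies a least-squares slope by that factor and leaves the correlation coefficient unchanged, so the direction and relative strength of the trend survive.

    // Toy sketch (invented numbers): scaling all y-values by a constant k
    // scales the least-squares slope by exactly k and leaves Pearson's r
    // unchanged, so the trend's sign and relative strength are preserved.
    function slope(x: number[], y: number[]): number {
      const n = x.length;
      const mx = x.reduce((a, b) => a + b, 0) / n;
      const my = y.reduce((a, b) => a + b, 0) / n;
      let sxy = 0, sxx = 0;
      for (let i = 0; i < n; i++) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) ** 2;
      }
      return sxy / sxx;
    }

    function pearson(x: number[], y: number[]): number {
      const n = x.length;
      const mx = x.reduce((a, b) => a + b, 0) / n;
      const my = y.reduce((a, b) => a + b, 0) / n;
      let sxy = 0, sxx = 0, syy = 0;
      for (let i = 0; i < n; i++) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) ** 2;
        syy += (y[i] - my) ** 2;
      }
      return sxy / Math.sqrt(sxx * syy);
    }

    // Hypothetical predictor (say, GDP per capita) vs. reported losses.
    const gdp = [5, 10, 20, 40];
    const losses = [8, 6, 4, 3];
    const inflated = losses.map(v => v * 2.5); // uniform exaggeration

    console.log(slope(gdp, losses), slope(gdp, inflated));     // second is exactly 2.5x the first; same sign
    console.log(pearson(gdp, losses), pearson(gdp, inflated)); // identical values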

Kallol Bagchi
Peeter Kirs
El Paso, TX
Robert Cerveny
Boca Raton, FL


How to Welcome Uninvited Guests

Peter J. Denning’s "The Profession of IT" column ("Infoglut," July 2006) began with a thorough assessment of the information-overload problem, but its later exploration of related solutions was seriously off the mark. The "smart push" approach Denning advocates, for example, would solve, at best, only one aspect of the problem, and not even the most serious one. If you find that too many people have been coming to your house uninvited and that some of them are stealing your property, would you consider posting a list of "desirable guests" at the door a reasonable way of dealing with the problem?

Behrooz Parhami
Santa Barbara, CA

Author’s Response:

Parhami did not adequately frame the problem he seems most intent on addressing. I assume that problem is spam, adware, and malware, all uninvited guests designed to steal your attention, if not your data. Although I cited this problem in the column, in my enthusiasm I jumped immediately to a less-studied area that might be called "friendly-fire infoglut." It is caused by our well-intentioned friends, coworkers, and service providers who, in their own enthusiasm to help, individually and collectively send data at rates far beyond our capacity to process it. I apologize to readers for whom this transition was too abrupt.

Friendly-fire infoglut is a known problem in hastily formed networks, enterprises, large organizational networks, and subscriber networks. It represents a growing problem for publishers, newsletter distributors, bloggers, discussion groups, and other information sources to which people want to subscribe. The "value information at the right time," or VIRT, idea deals with the subtle but critical problem of determining what information recipients care about and sending only that information. It represents a potentially disruptive technology that promises to enrich the network experience for everyone.
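
As a minimal sketch of the idea (a toy illustration only; VIRT is a research program, not this code, and all names here are invented), a subscriber states a condition describing what it cares about, and the source pushes only items satisfying it:

    // Hypothetical condition-based "smart push": subscribers register the
    // conditions they care about; only matching items are delivered.
    type Condition<T> = (item: T) => boolean;

    class VirtChannel<T> {
      private subs: { cond: Condition<T>; deliver: (item: T) => void }[] = [];

      subscribe(cond: Condition<T>, deliver: (item: T) => void): void {
        this.subs.push({ cond, deliver });
      }

      publish(item: T): void {
        // Push each item only to subscribers whose condition it satisfies.
        for (const s of this.subs) {
          if (s.cond(item)) s.deliver(item);
        }
      }
    }

    // Usage: a planner who cares only about closures on route 12.
    interface Report { kind: string; route: number; text: string }
    const channel = new VirtChannel<Report>();
    channel.subscribe(
      r => r.kind === "closure" && r.route === 12,
      r => console.log("relevant:", r.text),
    );
    channel.publish({ kind: "closure", route: 12, text: "Bridge out" });    // delivered
    channel.publish({ kind: "traffic", route: 9, text: "Slow near exit" }); // filtered out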

Peter Denning
Monterey, CA


Skip Toy-Program Thinking in Real-World Projects

Fred Martin’s "Technical Opinion" ("Toy Projects Considered Harmful," July 2006) was right that class assignments typically fail to reflect many of the realities of software development on the job, but it fell into one of the worst aspects of toy-program thinking when it decried languages with strong static rules.

To think that dynamic languages help by making programs "just work," one has to be in denial about the limitations of testing and the realities of programs that will be used by a large community. We’ve all heard the theoretical views of combinatorics researchers on the infeasibility of exhaustive testing. In practice, those views are overly pessimistic, but the fact remains that an exhausting regimen of test-case development and automation is required for a released program to have a residual bug rate low enough to go live and be considered stable or mature.

In contrast, static programming-language rules catch all the bugs they are able to catch in a single run. Far more important, they do so without the added work of producing test cases and expected results and checking actual results. Where applicable, they are the most effective tool for removing bugs. Their limitation is that an enormous number of programming errors still cannot be detected through static rules. That is all the more reason to exploit them where possible, freeing limited developer time for the bugs that cannot be caught statically.
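
To make the contrast concrete, consider a toy sketch in TypeScript (my own illustration; neither the column nor this letter names a particular language). The compiler rejects the mismatch on every build, with no test case, expected result, or test run required:

    // A static type rule catches this bug at compile time, on every build,
    // with no test case written and no program run. (Invented example.)
    function totalPrice(unitPrice: number, quantity: number): number {
      return unitPrice * quantity;
    }

    const qtyFromForm = "3"; // user input arrives as a string

    // Compile-time error: Argument of type 'string' is not assignable to
    // parameter of type 'number'. The bug never reaches testing.
    const total = totalPrice(9.99, qtyFromForm);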

Martin seemed to be saying that a dynamic language makes it possible to program without thinking much about the type of values a variable might hold. Quite the opposite. You have to think much more about data types when your language doesn’t help, if your program is to have any chance of making a passable showing beyond the handful of immediate cases you try this way.
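
A toy example (again my own, with invented names) of the same discipline working in reverse: with the static rules switched off, the one case tried "just works" while a neighboring input silently produces garbage, leaving the type analysis entirely in the programmer's head.

    // Similar logic with checking disabled (parameters typed 'any', i.e.,
    // plain JavaScript semantics). Invented example.
    function addShipping(subtotal: any, shipping: any) {
      return subtotal + shipping;
    }

    console.log(addShipping(20, 5));   // 25    -- the one tried case "just works"
    console.log(addShipping(20, "5")); // "205" -- silent concatenation, no error raised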

Nobody would deny that coding a few lines, then giving them a test case or two, seeing that the cases "just work," declaring this bit of function done, and moving on to something else provides instant gratification. It may also be a good way to engage students early in their education.

Unfortunately, this pleasant experience is the epitome of a toy-project process. It’s like making a separate trip to the grocery store for every meal. For a given meal, it may put food on the table quicker. But after a few days, the time spent on shopping trips is at least several times greater than that of stocking up for a week all at once.

Producing a stable program of significant complexity requires the maturity to do as much as possible about all inputs, states, and outputs before going live with it. A static language entails deferred gratification: all static errors must be fixed before seeing any execution at all, right or wrong. But it is also the most effective known way to reduce the number of bugs quickly. Programmers can then spend more time on the bugs that no language would ever be able to catch.

Rodney M. Bates
Wichita, KS


Call It Problem Solving, Not Computational Thinking

I wholeheartedly support the idea behind Jeannette M. Wing’s "Viewpoint" ("Computational Thinking," Mar. 2006). The characteristics she identified for her pedagogical idea, particularly that it should be "about ideas and not artifacts" and "for everyone, everywhere," were compelling. But I disagree just as strongly with her calling it "computational thinking." The traditional and still-meaningful name for the idea is "problem solving." I have several reasons for objecting to the word "computational":

  • It has not, for at least several decades, described the work of "computers." Computers rarely compute but do manipulate information;
  • The implication is that computational thinking is a course computer scientists should teach. Computer science concepts can certainly be part of such a course, but problem solving is a universal activity, and many disciplines are capable of teaching it; and
  • Problem solving is a centuries-old discipline. While computers are a powerful new tool for doing it, the underlying discipline should be focused on problems and solutions, not just on solution approaches.

Robert L. Glass
Brisbane, Australia
