Architecture and Hardware Forum


  1. To Produce a Good Plan, Start with a Good Process
  2. Who Defined the Requirements?
  3. Proof Depends on Evidence
  4. End the Tragedy of the Wireless Commons
  5. Look Further for Domain Name Persistence
  6. Before Coding, Learn Disciplined Engineering

To Produce a Good Plan, Start with a Good Process

While Phillip G. Armour’s "The Business of Software" column ("Counting Boulders and Measuring Mountains," Jan. 2006) was based on good ideas, many of them were overshadowed by errors and omissions that made it difficult to understand what he was getting at. A systems perspective is necessary, but even systems engineers can miss the wider view, being wrapped up in possibly faulty processes or lacking the experience to see beyond their own domain.

Failing to see the big picture, software engineers can end up making suboptimal if not outright bad decisions. Who hasn’t seen a programmer make an unauthorized "improvement" only to cause other working, tested, accepted modules to begin to fail or delay testing and increase costs?

Armour didn’t make a clear distinction between planning and execution. For example, were the project’s estimates for budget planning or for execution? Were they for initial sizing/planning or for an actual bid to a customer? Risk also must be considered, along with strategic objectives. Were they purely profit or perhaps to gain experience from bidding on or doing the project? Did the developers have to achieve a given result and know the resources—time, money, people—they needed to achieve it? Would the project have been further complicated if it involved an absolute deadline? Or was it a need-it-yesterday contract following a 9/11-type emergency? The latter would be closer to a research effort than to an engineering project. If that were the case, time would be imperative and costs essentially irrelevant. Yet other projects might minimize resources by trading off schedule delay. And each primary consideration would require a different planning and execution approach.

To execute an engineering project, the fine resolution and detailed planning Armour objected to are actually vital to success. One should not assume that because they are unnecessary in certain projects they can be unilaterally expunged from all projects, as the agile advocates would have us believe.

No rational executive would give the green light to a project based on an unsubstantiated guesstimate of total costs. Risk may not be eliminated through detailed planning but can be minimized when the plan is based on more than wishful thinking. Without a good process that selects the right approach, mistakes are likely, and tools, not users, are likely to be blamed for the result.

Armour correctly noted that many requirements are not properly defined before a project gets under way. A project can legitimately begin without detailed requirements only when it is a research effort or possibly a feasibility study to aid in planning. If so, it should not be confused with the effort needed to create a system or product. Moreover, the software engineer is not responsible for determining system requirements; the systems engineer is supposed to provide them to the software engineer.

Many practitioners do not understand functional requirements or the effects of nonfunctional components on total requirements; they may be unstated yet still return to bite everyone later on. Practitioners may not even understand the difference between a real requirement and a tentative solution specification (often erroneously described as "requirements") that can be modified for the benefit of the total system through the mutual agreement of all affected parties. Such specifications may be changed unilaterally by software developers to optimize some relatively minor component yet wreak havoc on the overall system.

William Adams
Springfield, VA

Author Responds:

The column addressed the kind of estimation needed for winning commitment rather than for planning or running a project. Detailed planning is necessary—when it is time for planning. During a project’s early stages, we sometimes confuse planning with commitment and may not have the detail needed to make a work breakdown structure (WBS) approach effective or timely. Risk must be addressed, but WBS tends to make risk less visible and thus less usable in supporting early project commitment decisions.

Phillip G. Armour
Deer Park, IL


Who Defined the Requirements?

The article "Agile Project Management: Steering from the Edges" by Sanjiv Augustine et al. (Dec. 2005) was most welcome but would have been even better if it had provided more detail, including: how the requirements were defined and who owned them; what the quality assurance team did (testing isn’t QA); what the SWAT team did that couldn’t have been done by the developers; and the number of defects reported in the year following release.

Peter Farrell-Vinay
Croydon, U.K.

Authors Respond:

Requirements were defined in agile style through user stories and owned by the business teams; business analysts worked with users and customers to define the stories. The QA team consisted mainly of testers responsible for both test plans and the execution of system and performance testing. The SWAT team consisted of developers working to ensure that system code integrated across multiple teams was shippable at the end of every iteration. We are, however, unsure of the number of defects reported in the year following release.

Sanjiv Augustine
Fairfax, VA
Bob Payne
Washington, D.C.
Susan Woodcock
Fairfax, VA


Proof Depends on Evidence

At first I was happy to see Amy S. Bruckman’s "Viewpoint" ("Student Research and the Internet," Dec. 2005); that students and researchers today tend to stop looking after a few clicks and ignore older work has long been a concern of mine. However, I was shocked to read, "An idea is not much of a fact if only one lonely person believes it." Many ideas widely accepted as fact were once ideas believed by only one person.

For an example not widely known today, consider the life of Oliver Heaviside (1850–1925), a British electrical engineer who studied radio waves. In computing, it was once widely believed that Algol 60’s parameter passing would not allow a general "swap" procedure, because the obvious approaches fail. Popularity is not an indicator of truthfulness, and it is counterproductive to suggest it might be.
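The Algol 60 folklore Parnas mentions can be made concrete. Under call-by-name, an argument expression is re-evaluated at every use inside the procedure, so swap(i, A[i]) misbehaves: once the procedure assigns a new value to i, a later reference to A[i] indexes a different element. A rough Python sketch, simulating call-by-name with getter/setter thunks (all names here are illustrative, not Algol):

```python
i = 1
A = [10, 20, 30]

def swap(a, b):
    # a and b are (get, set) pairs simulating Algol 60 call-by-name:
    # every use re-evaluates the original argument expression.
    t = a[0]()        # t := a
    a[1](b[0]())      # a := b
    b[1](t)           # b := t

def get_i():
    return i

def set_i(v):
    global i
    i = v

def get_Ai():
    return A[i]       # re-evaluates A[i] with the *current* i

def set_Ai(v):
    A[i] = v

try:
    swap((get_i, set_i), (get_Ai, set_Ai))
except IndexError:
    # t := i sets t = 1; i := A[i] sets i = 20; then b := t means
    # A[20] := 1, which is out of range -- the "swap" fails.
    print("swap failed: A[%d] does not exist" % i)
```

The failure is not a bug in this sketch but the point of the example: the intended exchange of i and A[i] cannot be expressed this way, which is exactly the belief that was, for a time, held by very few people.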

What I missed from Bruckman were a few of the simple rules long known in science, including:

Proof. If a proposed fact is mathematical in nature, one must look at its proof and ensure it strictly follows the rules; one must also identify the assumptions (axioms) and rules of inference, including them in the statement of the theorem; and

Observation. If a proposed fact is empirical, one must examine how it relates to observations. Were the measurements made with care? Were all factors that might affect the measurements identified and controlled? Were there enough measurements? And is the "fact" consistent with all known observations?

Requiring students to "cite at least one book" in a science-based field is horrifying—for what it says about our students and what it says about our own understanding of science and engineering. We must teach them how to scrutinize what they find in any media.

David Lorge Parnas
Limerick, Ireland

Author Responds:

Parnas advocates an objective view of reality. This is but one of many competing views about the nature of truth, on a spectrum from objective to subjective. One reasonable compromise is the pragmatic-realist notion that objective reality exists but is knowable only through our subjective perceptions. These are fascinating questions to which philosophers devote their lives. To deal with the increasingly complex environment of people and ideas created by the Internet, students need some introduction to these issues. Recognizing that multiple views exist can help when trading ideas with someone whose epistemology diverges from one’s own.

Amy S. Bruckman


End the Tragedy of the Wireless Commons

There is a solution to the tragedy of the wireless commons described by Jan Damsgaard et al. in "Wireless Commons: Perils in the Common Good" (Feb. 2006). Consider a wireless network in which each node has three characteristics:

  • Attenuates its power in proportion to the proximity of an interlocutor (such as a cell phone);
  • Acts as a hub and a router for its neighbors; and
  • Experimentally interrogates any new node to ensure it conforms to the first two characteristics; if it doesn’t, its signals are not relayed.

A network of nodes with these characteristics can grow almost without limit, as adding new nodes also adds bandwidth. A selfish attempt to insert a node that does not act as a repeater is automatically rendered pointless. The result would be a common pasture that grows with every cow that comes to graze.
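Bowyer’s third rule, interrogating a newcomer and refusing to relay for non-cooperators, amounts to a simple admission check. A minimal sketch in Python (the class and method names here are purely illustrative, not from any real wireless stack):

```python
class Node:
    """Toy model of a cooperative wireless node, per the letter's rules."""

    def __init__(self, name, cooperates=True):
        self.name = name
        self.cooperates = cooperates  # does it relay for neighbours?
        self.peers = []               # nodes whose signals we relay

    def will_relay(self, packet):
        # Rule 2: a well-behaved node acts as hub and router for neighbours.
        return self.cooperates

    def admit(self, newcomer):
        # Rule 3: experimentally interrogate the newcomer with a probe;
        # if it declines to relay, its signals are not relayed.
        if newcomer.will_relay("probe"):
            self.peers.append(newcomer)
            return True
        return False

hub = Node("hub")
good = Node("laptop")
selfish = Node("freeloader", cooperates=False)
print(hub.admit(good))     # the cooperative node is accepted
print(hub.admit(selfish))  # the selfish node is shut out
```

Because admission is conditioned on demonstrated cooperation, a node that refuses to repeat traffic gains nothing from joining, which is what renders the selfish strategy pointless.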

Adrian Bowyer
Bath, U.K.


Look Further for Domain Name Persistence

While a handle system is needed to provide permanent domain names, the solution proposed by Michael J. O’Donnell in "Separate Handles from Names on the Internet" (Dec. 2005) was more a proof of concept than a workable, long-lasting solution. In particular, building atop today’s Domain Name System architecture to provide persistent domain names would provide only a partial answer to the growing demand for information persistence (see "The Decay and Failures of Web References" by Diomidis Spinellis, Communications, Jan. 2003). To the list of ongoing efforts O’Donnell outlined in his conclusions, I would add Robert E. Kahn’s Handle System (www.handle.net), an implementation of which provides the foundation for the Digital Object Identifier System (www.doi.org).

Alessandro Berni
La Spezia, Italy


Before Coding, Learn Disciplined Engineering

John C. Knight and Nancy G. Leveson’s "Inside Risks" column ("Software and Higher Education," Jan. 2006) reminded me of the recent graduates entering my field, embedded systems: they are smart and eager but uneducated about software engineering. They usually come out of electrical engineering programs, since embedded systems is where hardware and software meet. They know a language or two and can write some badly formatted code.

Back in the days of 4KB 8051 programs, being able to code was fine. Today’s million-line applications demand developers who are expert at software engineering, which has little to do with coding. Few colleges teach much about process (any process), yet disciplined engineering is required to build large, reliable, secure systems. They don’t even teach great coding.

If the goal is to turn out great novelists, the English department would require that students know how to read and ensure they had read many great novels before starting one of their own. The music department would assume aspiring composers had listened to lots of music and read many scores. Yet the EE and CS departments take a different tack, describing how, say, a for loop works. Assigned to write code, most students must invent their own styles and adopt some half-baked (if any) process.

They should instead be taught a language only in their senior year, after they’ve learned how to build programs. We need a breed of developer who’s revolted by today’s undisciplined approach. We need grads who refuse to accept jobs in the typical heroic shops because they realize these outfits are mired in obsolete approaches that usually fail.

Jack Ganssle

