My observations relate to two columns from the January 2021 issue of Communications: Michael A. Cusumano's "Technology Strategy and Management" and Thomas Haigh's "Historical Reflections." Reading one column right after the other, I could not help but notice the stark contrast between Haigh's and Cusumano's accounts with respect to the engineering profession. Whereas, on one side, there is a glorification of the "pure and noble" ethos, on the other side there is perhaps the saddest statement, when engineers brag in email about how they "'tricked' the FAA regulators."
While there are certainly human idiosyncrasies involved, this discrepancy shows me that engineers adapt their belief system to the frame in which they operate: if engineers cannot get recognition from management for classical engineering skills, such as safety and longevity, they—in desperation?—will adapt their ethos accordingly.
With the financial crises, we have seen how rapidly the perception of a profession can deteriorate: the word "bankster" can now be found in the most important German dictionary, the Duden. Let's hope we won't see "en-cheat-eer" anytime soon. Unfortunately, with the German automakers' diesel manipulations and Boeing's 737 MAX, we are already a good way down that path. I am afraid the public's changing perception and the critical scrutiny of (computer) engineers will come to haunt our profession in the future…
Holger Kienle, Berlin, Germany
I fully agree with Kienle's comment that "engineers adapt their belief system to the frame in which they operate." Sometimes companies require this to create a successful business, but the consequences can be deadly.

Let's look at Thomas Haigh's column on the book, The Soul of a New Machine, in that light. Yes, the book is an inspiring account of the startup culture at Data General and the development of a new minicomputer model. However, what the book and column do not discuss is that, essentially, the project built the wrong product at the wrong time. The minicomputer business was already under threat from high-end workstations and soon would be permanently disrupted by personal computers. Data General had to shift to making data storage equipment and then was acquired by EMC. We see no evidence in this story that the engineers and managers understood the business context and how technology and markets were changing.

We might view the 737 MAX debacle in this context, but without making excuses for Boeing's mistakes. Boeing was struggling to catch up with Airbus and retrofitted an old product for a hot new market segment. Both managers and engineers made decisions that would cost hundreds of lives and billions of dollars in losses, and severely damage Boeing's reputation for engineering excellence and safety.

I think the lesson is that both managers and engineers need to understand the financial or competitive pressures in their business, but, in these situations, engineers in particular need to resist compromising their training. Technical experts know how to estimate risk and the probability of catastrophic failure, even though they are subject to the same human frailties as everyone else.
Michael A. Cusumano, Cambridge, MA, USA
Even in Kidder's romantic portrayal, the Data General engineers rushed the design process to get the machines out the door quickly, but the minicomputers they were producing were not as safety-critical as the systems aeronautical engineers deal with. Maybe the problem at Boeing was management's urge to treat planes like other devices.
Thomas Haigh, Milwaukee, WI, USA
I read with great interest Thomas Haigh's discussion of The Soul of a New Machine by Tracy Kidder in the January 2021 issue of Communications. I have fond memories of being in a packed auditorium at Purdue University in the early 1980s to hear Kidder discuss his recent book. Because his appearance was organized by the English Department, the head of the English Department introduced him and told us how he selected Kidder as a guest speaker. He had read a review of the book and gone to the local bookstore to buy it only to find that the book was out of stock. When he asked who was buying the book, he was told that "computer people" were the customers. If the book was so popular at Purdue, he reasoned, hosting Kidder as a guest speaker would definitely help the English Department.
Kidder was an excellent speaker. I remember two comments he made with respect to the book. He said when he talked to the Eagle project's engineers, he discovered every one of them had been required to take one or more English or literature courses in college. Kidder added that he, as an English major, had not been required to take any computer science or engineering courses when he was in college and said that, if he were in charge, he would make a computer science or engineering course a requirement for every English major.
The second comment involved his process for writing the book. In exchange for being allowed access to the Eagle project, Kidder agreed that the project members could review the manuscript before it was published. Many of the engineers took advantage of this opportunity to review the manuscript and suggested changes. Kidder was pleasantly surprised that not one of the changes involved the descriptions of the people he had written about, even when he may have presented them in an unflattering manner. His engineer-manuscript-readers only suggested changes to improve the technical accuracy of the book, and Kidder felt that this attention to technical accuracy was a major contributor to the book's success.
Herbert Schwetman, Austin, TX, USA, Member of ACM since 1965
Information compression (IC) is a surprising omission from the features of the human mind described by Gary Marcus and Ernest Davis (Communications, Jan. 2021). Research into the role of IC in human learning, perception, and cognition was pioneered by Fred Attneave, Horace Barlow, and others in the 1950s and has continued up to the present. There is a recent review in Wolff.3
A possible reason for IC to be overlooked as a unifying principle in the human mind is that it can be hidden in plain sight.
It is true, as Marcus has argued,1 that in many respects the human mind is a kluge, perhaps because of the haphazard nature of evolution. But it is possible, at the same time, that IC can be a unifying principle in understanding the human mind.
Gerry Wolff, Menai Bridge, U.K.
Regarding "Disputing Dijkstra" by Mark Guzdial (March 2021), although I am not enough of a philosopher to confirm his argument for metaphors in any academic sense, there does seem to be considerable value in the concept. And there is of course the universal metaphor—"It's turtles all the way down!"
An interesting point raised by Dijkstra is the notion that software spans many layers of abstraction. I have often thought of software as a way to "make your own physics"; that is, you get to construct the behavior of objects from the lowest to the highest levels. You can change the charge on an electron a little, if you like, and, further, you can change it only on Tuesdays, or when the sun is shining and the date is a prime number. This then raises the problem of layering violations, which allow some relatively small change in behavior at the low level to inordinately affect behavior at the high level.
While we can identify areas of physics, biology, and other sciences where this occurs—for example nuclear radiation can interrupt the biological hierarchy—in software it seems to be everywhere. Analogies to physical models such as this can be useful in assessing, for example, system structure and modularity.
So, while the idea that metaphors are an essential guide seems well-established, looking for ways in which software and computer science do not fit into our common mental models is, I think, an important and useful exercise. We don't necessarily need to do so with an eye toward abolishing the metaphorical models, but we should be on the lookout for new metaphors that help us describe and explicate the behavior of our systems.
Larry Stabile, Cambridge, MA, USA
3. Wolff, J.G. Information compression as a unifying principle in human learning, perception, and cognition. Complexity (2019), Article ID 1879746; doi.org/10.1155/2019/1879746.
Communications welcomes your opinion. To submit a Letter to the Editor, please limit your comments to 500 words or less, and send to firstname.lastname@example.org
©2021 ACM 0001-0782/21/7
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from email@example.com or fax (212) 869-0481.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.