With cloud, big data, supercomputing, and social media, it's clear that computing has an eye on the future. But these days the computing profession also has an unusual engagement with history. Three recent books articulating the core principles or essential nature of computing place the field firmly in history. Purdue University has just published an account of its pioneering effort in computer science.4 Boole, Babbage, and Lovelace are in the news, with bicentennial celebrations in the works. Communications readers have been captivated by a specialist debate over the shape and emphasis of computing's proper history.a And concerning the ACM's role in these vital discussions, our organization is well situated with an active History Committee and full visibility in the arenas that matter.
Perhaps computing's highly visible role in influencing the economy, reshaping national defense and security, and creating an all-embracing virtual reality has prompted some soul searching. Clearly, computing has changed the world—but where has it come from? And where might it be taking us? The tantalizing question of whether computing is best considered a branch of the mathematical sciences, one of the engineering disciplines, or a science in its own right remains unresolved. History moves to center stage in Subrata Dasgupta's It Began with Babbage: The Genesis of Computer Science.1 Dasgupta began his personal engagement with history in conversation with Maurice Wilkes and David Wheeler. Babbage, Lovelace, Hollerith, Zuse, Aiken, Turing, and von Neumann, among others, loom large in his pages.
Two recent books further suggest that computing is historically grounded. Peter Denning and Craig Martell's Great Principles of Computing2 builds on Denning's 30-year quest to identify and codify "principles" as the essence of computing. The authors readily acknowledge the origins of the Association for Computing Machinery, initially coupled to the study and analysis of computing machines. In their perspective on computing as science, they approvingly quote Edsger Dijkstra's quip that "computer science is no more about computers than astronomy is about telescopes." Dijkstra and others in the founding generation, closely connected to studies in logic, computability, and numerical analysis, naturally saw computing as a mathematical or theoretical endeavor and resisted a focus on engineering questions and technological manifestations. Similarly, Denning and Martell look beyond the 42 ACM-recognized computing domains, such as security, programming languages, graphics, or artificial intelligence, to discern common principles that guide or constrain "how we manipulate matter and energy to perform computations," their apt description of the field. For each of their six principles—communication, computation, coordination, recollection, evaluation, and design—historical cases and historical figures shape the exposition. Communication is Claude Shannon, Harry Nyquist, and Richard Hamming. These are historical principles.
In Great Principles, the closer the authors get to cutting-edge science, the less their findings resemble the science-fair model of hypothesis, data collection, and analysis. They start from Dijkstra's view that "programming is one of the most difficult branches of applied mathematics." But programming is more than math. Programming languages from Fortran (1957) to Python (1991) are expressions of algorithms in an artificial language with its own syntax, often tailored for specific applications. Programmers with varied levels of skill work with compilers or interpreters, debugging tools, and version control, and grapple with different means of avoiding errors. The practice of programming, however, is not a cut-and-dried application of known laws. "Good programming is an artisan skill developed with good training and years of practice," the authors affirm.
Design as a core computing principle emerges from the authors' treatment of ENIAC and EDVAC in the 1940s through the information protection principles of Saltzer and Schroeder (1975) and forward to the design hints of Butler Lampson (1983). Judgment, intuition, and sense of history come to the fore. "Success of a design ... depends on knowledge of history in the designer's field, which informs the designer on what works and what does not work." Design returns powerfully in their conclusion, which emphatically places "designers and their work at the center of the progress and innovation in computing." Great Principles does not stand apart from history; it embraces historical examples and historical thinking. And with design at its core, computing is history.
Matti Tedre's The Science of Computing: Shaping a Discipline5 examines three broad historical debates about the nature of computing: about computing as a distinctive theoretical field (starting in the 1930s), as an engineering field, and as a science in its own right. Tedre writes in the shadow of Denning's principles, with due tribute. His engagement with history is long and deep. Tedre sets out the prehistory in Leibniz, Boole, and Frege and closely examines the "decision problem" that animated Church and Turing, arriving at a surprising conclusion. He suggests, unmistakably, that "Turing's mathematical ideas had little if any influence on the invention of the modern computer." At Princeton in the mid-1930s the pieces were there—but they did not gel: Turing gives a seminar on his just-published computable-numbers paper, aided by Alonzo Church, but "there was rather bad attendance." With just two reprint requests, Turing despairs. And in a fellowship recommendation that von Neumann wrote for Turing in June 1937—just where you would expect a line about computability or the decision problem—the great mathematician and soon-to-be namesake of the von Neumann architecture praises instead Turing's "good work" in quasi-periodic functions! At this critical juncture Turing's influence on von Neumann is, at best, indirect and elusive.b
Tedre also closely examines the rival visions for "computer science" in the 1960s and the shifting emphases in ACM's model curricula. Three distinct debates engagingly frame the emerging scientific character of computing: formal verification, in which advocates such as C.A.R. Hoare (1985) sought to formally prove program correctness and derive computing from axioms; software engineering, which unsettled the theoretical and mathematical foundations of the pioneers; and experimental computer science, which it seems everyone loved but no one quite practiced. Tedre gives a balanced treatment of each debate, attending to its intellectual and institutional dimensions, as people sought funding from the NSF, staked out disciplinary identity, and struggled to create educational coherence. Computing emerges as a science, but there is no unfolding of a singular Newtonian paradigm.
Turing's complex legacy is of enhanced importance today with the expansion of the A.M. Turing Award, given for "major contributions of lasting importance to computing." The Turing Award recipients are dramatis personae for each of these books. Tedre, especially, heavily cites their contributions in Communications. The ACM History Committee, created in 2004, recently concluded a major revamping of the Turing Award website (http://amturing.acm.org). Michael R. Williams, professor emeritus at the University of Calgary, expanded the individual entries, beginning with Alan Perlis in 1966, aiming at in-depth coverage for ACM members as well as accessible treatments that might spread the word. The History Committee has just launched a major oral-history initiative to ensure there are interviews with each of the 42 living Turing laureates, creating a compelling video record where interviews are still needed.c These oral histories, continued year by year, will complement the ongoing work on the Turing website, now overseen by Thomas Haigh.
The History Committee connects the ACM membership with professional historians of computing. Committee members represent research centers and museums, libraries and academic departments, industry and government laboratories, and varied ACM committees.3 Since 2009 the History Committee has supported 22 research projects on ACM's storied history. So far the results include five completed Ph.D. dissertations, two published books, and a bevy of conference papers and other contributions. We responded to the ACM membership's curiosity about archival principles and methods with a workshop at the Charles Babbage Institute in May 2014.d This month we will hold an ACM history workshop at the annual meetings of the Society for the History of Technology and the SIGCIS history-of-computing group.e Oral-history methods and SIG-centered history, both of keen interest to ACM members, are on the docket.
The computing-history gap that troubled Donald Knuth and that Thomas Haigh anatomized might be tractable.f Despite the clear challenges of doing professional history with rigorous computing content, we have evident successes. In her 2012 History Committee-supported Ph.D. dissertation ("Turing Award Scientists: Contribution and Recognition in Computer Science"), Irina Nikiforova of Georgia Tech investigated intellectual and institutional patterns in which fields of computer science, and which computer scientists, were likely awardees. In another dissertation, completed in 2013 ("A House with the Window to the West: The Akademgorodok Computer Center (1958–1993)"), Princeton's Ksenia Tatarchenko follows Andrei Ershov and his colleagues' efforts to build computer science in Soviet Russia and forge professional ties—across the "iron curtain"—to the ACM community. Jacob Gaboury's 2014 New York University dissertation ("Image Objects: Computer Graphics at the University of Utah") investigates the prolific Evans and Sutherland network. Books done with ACM support are out from Cambridge University Press and forthcoming from ACM Books.g In funding original research on ACM, as with enhanced publicity for the Turing awardees, we see many opportunities for constructive collaboration and professional dialogue in the years to come.
b. Andrew Hodges, Alan Turing: The Enigma (Simon & Schuster, 1983), quotes "bad attendance" and "good work." Dasgupta1 largely agrees (p. 58), then hedges (p. 113). By contrast, Martin Davis in The Universal Computer (2000) and George Dyson in Turing's Cathedral (2012) suggest a close connection between Turing and von Neumann.
c. See ACM History Committee interviews at http://history.acm.org/content.php?do=interviews.
d. See "ACM History Committee Archiving Workshop," ACM SIGSOFT Software Engineering Notes, http://dl.acm.org/citation.cfm?doid=2693208.2693215, and http://history.acm.org/public/public_documents/ACM-archiving-workshop_2014-05.pdf.
g. With ACM funding, Andrew Russell completed a set of interviews with European networking pioneers that led to his book Open Standards and the Digital Age (Cambridge University Press, 2014). ACM funding also supported Bernadette Longo's biography of ACM founder Edmund Berkeley: Edmund Berkeley and the Social Responsibility of Computer Professionals (ACM Books, forthcoming 2015).
The Digital Library is published by the Association for Computing Machinery. Copyright © 2015 ACM, Inc.