Every discipline that comes of age consecrates its own roots in the process. In footnotes, anecdotes, and names of departmental buildings, occasions are found to remember and celebrate personalities and ideas that a discipline considers its own. A discipline needs heroes to help create a narrative that legitimizes and fortifies its own identity. Such a narrative hardly reflects the complexity of historical reality. Rather, it echoes the set of preferences and programmatic choices of those in charge of a discipline at a given moment in a given place. Each name that gets integrated into an officialized genealogy is the result of discussions and negotiations, of politics and propaganda.
To the general public, the genealogies of physics and mathematics are probably more familiar than that of computer science. For physics we go from Galileo via Newton to Einstein. For mathematics we begin with Euclid and progress through Descartes, Leibniz, Euler, and Gauss up to Hilbert. Computer science, by contrast, is a relatively young discipline. Nevertheless, it is already building its own narrative, in which Alan Turing plays a central role.
In the past decades, and especially during the 2012 centenary celebration of Turing, his life and legacy have received an increasing amount of attention. Recently, Communications published two columns in which Turing’s legacy is put into a more historical context.7,9 We continue this line of research by focusing on how Turing functioned as a hero in the formation of computer science. We will do so here by comparing the consecration of Turing with that of Gauss in mathematics.
Making Gauss a Hero
In the early 19th century, the Prussian minister Wilhelm von Humboldt sought to introduce mathematics as a discipline per se in higher education. To do so, he needed an icon to represent German mathematics. He turned to the one German who had been praised in a report on the progress of mathematics to emperor Napoleon: Carl Friedrich Gauss (1777–1855). Moreover, the new generation of mathematicians favored a conceptual approach over computations and saw Gauss as the herald of this new style of mathematics. Gauss thus became synonymous with German mathematics for political as well as more internal reasons.
Toward the end of the 19th century, the prominent mathematician Felix Klein developed this Gauss image into a programmatic vision. From 1886 onward, he actively transformed Göttingen’s mathematics department into the world’s foremost mathematical center. He promoted a close alliance between pure and applied mathematics and set cooperation with industry in motion. On a national scale, he worked for the professionalization of mathematics education. To shape this disciplinary empire, Klein, too, used Gauss.
In his 1893 address to the first International Congress for Mathematics in Chicago, Klein talked about the latest developments in mathematics and spoke of: a return to the general Gaussian programme [but] what was formerly begun by a single master-mind […] we must now seek to accomplish by united efforts and cooperation.10
The edition of Gauss’ collected works (1869–1929) provided an abundance of historical material that Klein used to build an image of Gauss supporting his personal vision on mathematics. Klein portrayed Gauss as the lofty German who was able to pursue practical studies because of his theoretical research, a portrayal that, although very influential, was biased nonetheless.
In the 20th century, Klein’s interpretation of Gauss was picked up by the international mathematical community and modified accordingly. In the U.S., following Klein’s 1893 address, Gauss’s fertile combination of pure and applied mathematics struck a chord with a mathematical community that often worked closely in alliance with industry.11 In France, after World War II, the Bourbaki group emphasized the abstraction of Gauss’s work, which transcended national boundaries and had helped pave the way for their structural approach to mathematics. However, in contrast with the Kleinian "pure mathematician," Gauss was also "rediscovered" after the birth of the digital computer as a great calculator and explorer of mathematical discourse.4
Making Turing a Hero
Just as Gauss was instrumental to Humboldt and Klein in furthering the institutionalization of mathematics, Turing played a similar role in the professionalization of the ACM in the 1960s. This goes back to the 1950s, when some influential ACM members, including John W. Carr III, Saul Gorn, and Alan J. Perlis, wanted to connect their programming feats to modern logic. Stephen Kleene’s Introduction to Metamathematics (1952), which contained a recast account of Turing’s 1936 paper "On computable numbers," was an important source.
In 1954, Carr recommended that programmers deal with "the generation of systems rather than the systems themselves" and with "the ‘generation’ of algorithms by other algorithms," and hence with concepts akin to metamathematics.3 Similarly, around 1955, Gorn became accustomed to viewing a universal Turing machine as a conceptual abstraction of the modern computer (see, for example, Gorn8). By the end of the 1950s, Carr and Gorn explicitly used Turing’s universal machine to express the fundamental interchangeability of hardware and language implementations. Turing’s 1936 theory thus helped ACM members articulate a theoretical framework that could accommodate what programmers had been accomplishing independently of metamathematics.6
In 1965, ACM Vice President Anthony Oettinger (who had known Turing personally), and the rest of ACM’s Program Committee proposed that an annual "National Lecture be called the Allen [sic] M. Turing Lecture."1 Lewis Clapp, the chairman of the ACM Awards Committee, collected information on the award procedures "in other professional societies." In 1966 he wrote: [a]n awards program […] would be a fitting activity for the Association as it enhances its own image as a professional society. […] [I]t would serve to accentuate new software techniques and theoretical contributions. […] The award itself might be named after one of the early great luminaries in the field (for example, "The Von Neuman [sic] Award" or "The Turing Award", etc.)2
ACM’s first Turing Awardee in 1966 was Perlis, a well-established computer scientist, former president of the ACM, and close colleague of Carr and Gorn. In hindsight, the decision to honor Perlis is thus rather unsurprising. Turing, by contrast, was not well known in computing at large, even though his 1936 universal machine had become a central concept for those few who wanted to give computer programming a theoretical impetus and also a professional status.a
The first wave of recognition that Turing received posthumously with the Turing Award in 1966 is but a ripple when compared to the second wave. This started in the 1970s with the disclosure of some of Turing’s war work for the Allies, followed by Andrew Hodges’ authoritative 1983 biography, which also added a personal dimension to Turing’s story: his life as a gay man in a homophobic world. This made Turing known outside of computer science as well. The second wave culminated in the 2012 Turing centenary celebrations, which nurtured the perception of Turing as the inventor of the modern computer and artificial intelligence. Some even claim Turing anticipated the Internet and the iPhone.
The year 2012 was full of activities: there were over 100 academic meetings, plaques, documentaries, exhibitions, performances, theater shows, and musical events. The celebrations also brought together a group of people with diverse backgrounds and promoted computer science to the general public, an achievement whose longer-term impact remains to be seen.12 A discipline has its heroes for good reasons.
As Hodges’ biography shows, Turing’s work was multifaceted. Not only did Turing contribute in 1936 to the foundations of mathematics, which later proved fundamental for theoretical computer science, he also worked at Bletchley Park during World War II to help break the Enigma. He became an experienced programmer of the Ferranti Mark I, for which he wrote a programmer’s manual, and he even designed a computer, known as the ACE. He reflected on thinking machines and contributed to the field of morphogenesis.
It is therefore not surprising that for many today the multidisciplinary nature of computer science is personified in Turing who achieved all these different things in one short lifespan. Along these lines, Barry Cooper, the driving force behind the Turing centenary, said the following in 2012: The mission of [the Turing Centenary] was to address concerns about how science was fragmenting. We wanted to return to more joined-up thinking about computability and how it affects everyone’s life. More generally, too, the Turing Year was important in highlighting the need for fundamental thinking.12
From this perspective, Turing’s theoretical work gives new impetus to the sciences as a whole, not just to computer science per se. The recent volume Alan Turing—His Work and Impact5—Turing’s collected papers cum essays from renowned scientists—also seeks to bring this point home. The point resonates even at the political level. The House of Commons has considered naming the new Technology and Innovation elite centers after Turing. According to the chairman of the Science and Technology Committee, "There isn’t a discipline in science that Turing has not had an impact upon." As such, computer science, and especially theoretical computer science with its focus on computability, becomes the connecting discipline among the other sciences, and thereby turns into a fundamental science, not unlike mathematics.
The focus on computability and fundamental thinking is certainly not accidental. To a large extent the drive behind the Turing Year came from theoreticians. They do not deny that Turing also did engineering work. However, many of them argue that Turing must have invented the computer because of his theoretical 1936 paper. According to this view of science and technology, also present in Klein’s interpretation of Gauss, theory precedes practice.
Looking Backward into the Future
Over the past century, the one-dimensional image of Gauss has been replaced by a multitude of images. This shows that a discipline in constant evolution assesses its own identity through its heroes and allows for a multiplicity of readings. Certainly, each reading may further the agenda of a particular community, but the diversity of all images taken together, all grounded in some way in Gauss’ legacy, positively stimulates the openness and generosity of a field.
Is Turing for computer science what Gauss is for mathematics? Computer science, as its histories show, has many origins, and this should be fostered. In this sense, the variety of topics and the diversity of approaches in Turing’s work, embracing both the practical and the theoretical, reflect an essential aspect of computer science. However, if one celebrates Turing mainly because of his theoretical work, one runs the risk of deepening already existing divides. Instead of favoring one reading of Turing and crowding out others, why not view Turing’s own accomplishments as an invitation? The historian could integrate Turing into a more complex historical account. The computer scientist could look back and reflect on the state of computer science, finding new ways of rapprochement between the many branches of computer science, between theory and practice.