"Computers may be the best repository of all time for information—as long as the operating system or storage medium is not out of date—but they are unable to record or reproduce the sensual presence of a material work of art. Unlike the qualities of material works of art, games and arbitrary interaction do not qualify the computer as a medium for memories and recollections."1
In common with most senior academics, I am required from time to time to offer assessments on applications for research funding. Over recent years, I have been pleased to see funding agencies increasingly asking applicants to pay formal attention to the means by which the outputs of publicly funded research may be preserved over the medium to long term. A second, although clearly related, concern is how applicants plan to ensure that ongoing access to research results may be achieved. These are good questions, and the fact they are being asked represents real progress, but unfortunately, for the most part, they are not well answered.
Based on my experience, a number of misunderstandings appear to be very widespread. Chief among these is that preservation may be said to have been achieved if a proportion (which may or may not be a significant proportion) of the digital outputs of a project are backed up and stored at some point during the project’s lifetime. A related misunderstanding is the notion that all that is required to assure access is that the project team should develop a website and keep it active for a year or so after the conclusion of a project. There is no obvious consensus on what constitutes the ‘medium’ or ‘long’ term, and very little comprehension that preservation and access need to be considered even before a project begins. The choices we make about file formats, data models, hardware, and software all impact on how easy it will be to actively preserve information for future use over the years and decades ahead. Not all the material generated by research projects is digital, of course, and digital preservation techniques often need to be supplemented by longer established techniques which are widely understood in the gallery, library, archive, and museum (GLAM) domains, but are perhaps less well known elsewhere. The more obvious lacunae in preservation awareness among applicants for public funding are, in principle at least, relatively easy to address, and call for little more than taking advantage of the numerous sources of information which are publicly and freely available. However, as one begins to engage seriously with preservation issues, it does not take long to come up against problems that are much more difficult to address, and which are increasingly common.
I have recently been concerned by the preservation challenges presented when dealing with interactivity and ephemerality. The drive toward providing ever more complex and nuanced forms of computer interactivity is not new. Indeed, it has been around right from the earliest days of computing. Ivan Sutherland’s Sketchpad represented a significant early step, and Sutherland is now usually remembered as a founding father of computer interaction. The history can be traced through Doug Engelbart’s visionary effort, begun in the 1950s, to "augment the human intellect" by making available a vast amount of human knowledge via highly responsive workstations. Engelbart’s so-called NLS (oNLine System), which was developed courtesy of an ARPA grant facilitated by Sutherland (working under the direction of J.C.R. Licklider), was first demonstrated in 1968 in what came to be called the "Mother of All Demos."a This remarkable event featured the introduction of the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor.
During the 1970s and 1980s, Xerox PARC and then Apple did a great deal of work to move interactive technology out of the laboratory and into the living room, and today the ideas put forward by Licklider, Sutherland, and Engelbart are considered commonplace. Development continues, of course, and Virtual Reality systems are just one recent manifestation of the increasingly sophisticated ways we are able to interact with computers.
From the perspective of the arts, Oliver Grau1 has argued that the development of what he calls ‘illusionary visual space’ is part of the overall art history of illusion and immersion, and draws connections with interactive art, interface design, agents, telepresence, and image evolution.
The existence of complex interactivity affordances in modern computer systems has a significant impact on how we think, and can think, about computers, and this in turn has consequences for our conception of what we are trying to preserve and later make accessible for future generations.
Among computer scientists, the tendency has traditionally been to concentrate on the tangible or the physical.b Thus, preserving (say) the financial records of a company would typically be treated as a series of tasks involving ensuring the bits that comprise the company accounts package and its associated data are saved on a stable medium, and stored in a safe place. These in turn would need to be periodically updated to run on new generations of hardware when they become available (migration), or might be subject to virtualization or emulation. While the implied processes of careful analysis, storage, preservation management, and access provision usually work well for financial systems and text documents, they serve us much less well when human-computer interaction offers users the opportunity to influence fundamentally the performance of software. Grau’s notion of illusionary visual space draws attention to the fact that sometimes the most significant features of an object, and those that we would be the most concerned to preserve, are not tangible features at all, but rather lie outside the materiality of the object.
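To make the bit-level, ‘tangible’ side of that workflow concrete, the sketch below (not drawn from any particular preservation system, and assuming nothing more than a directory of preserved files) shows the kind of fixity check that typically underpins such work: a checksum manifest recorded at ingest can later confirm that migrated or emulated copies remain byte-for-byte identical to the originals.

# A minimal sketch of bit-level fixity checking, assuming a simple directory
# of preserved files. Illustrative only, not a prescribed workflow.
import hashlib
from pathlib import Path

def build_manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(path.relative_to(root)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in sorted(root.rglob("*")) if path.is_file()
    }

def verify(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the paths whose bits are missing or no longer match the manifest."""
    return [
        rel for rel, digest in manifest.items()
        if not (root / rel).is_file()
        or hashlib.sha256((root / rel).read_bytes()).hexdigest() != digest
    ]

# At ingest:               manifest = build_manifest(Path("accounts_archive"))
# After migration, expect: verify(Path("accounts_archive"), manifest) == []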
Brenda Laurel suggests that interactivity can be thought of as sitting on three axes: ‘frequency’, ‘range’, and ‘significance’.3 Frequency represents the number of occasions on which the user interacts with the computer. Range represents the number of distinct choices that are available, while significance represents the degree to which the choices made by the user alter the outcome. Thus: "A not-so interactive computer game judged by these standards would only let you do something once in a while, only give you a few things to choose from, and the things you could choose wouldn’t make much difference to the whole action (or produce significant changes to the state of the underlying system). A very interactive computer game (or desktop or flight simulator) would let you do something that really mattered at any time, and it could be anything you could think of."3
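Purely by way of illustration, Laurel’s three axes can be expressed as a small data structure; the 0-to-1 scales and the example scores below are assumptions made for this sketch, not anything Laurel herself proposes.

# Illustrative only: Laurel's axes as a simple data structure.
# The 0-1 scales and example scores are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Interactivity:
    frequency: float     # how often the user can act (0 = rarely, 1 = at any moment)
    range: float         # how many distinct choices are on offer
    significance: float  # how much those choices alter the outcome

    def is_highly_interactive(self, threshold: float = 0.7) -> bool:
        # "Very interactive" only if every axis clears the threshold.
        return min(self.frequency, self.range, self.significance) >= threshold

kiosk = Interactivity(frequency=0.2, range=0.1, significance=0.1)
flight_simulator = Interactivity(frequency=0.9, range=0.8, significance=0.9)
print(kiosk.is_highly_interactive(), flight_simulator.is_highly_interactive())  # False True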
As we reflect on the complexities of preserving the key characteristics of artifacts with a high degree of meaningful interactivity, it becomes clear that interaction gives the operation of computers a certain degree of ephemerality and variability, which in turn produces highly individualized, and potentially unique, behaviors in computer systems. Thus, the behavior of computers may well be impossible to reproduce in full on subsequent occasions, even supposing the software designer/author were motivated to do so. Laurel characterizes the operation of computers as ‘performative’2 and comparable to theater. So just as no two performances in a theater are (or can be) exactly alike, computer systems whose core behaviors depend on the variable input of users may also be unique. The degree to which this is actually so in practice depends crucially on the sort of interaction the computer system in question permits.
In many cases it is more sensible to think of software in terms of what it does, rather than in terms of the lines of code that constitute a program, or the hardware on which the software runs. Construed this way, the human component of human-computer interaction significantly affects the process of preservation. For example, if we are trying to preserve a flight simulator, we need to consider not only the program as written but the affordances it offers, and how these are experienced in practice, not all of which may be clear even to the programmer. Even the notion that there is a single ‘object’ of preservation cannot be treated as a given. Just as there is no single, entirely authoritative performance of Hamlet that can stand for all others, so there is no single authoritative run of a flight simulator which, once preserved, means we require no other. In both cases, serious curatorial skill needs to be brought to bear to decide what exactly ought to be preserved. In the language of mainstream preservation activity, we need to establish, as far as it is possible to do so, the "significant properties" of the object of preservation, and capture those for future generations.
Although he was writing about artworks, Grau could just as easily have been thinking about highly interactive computer systems, and he expressed the essence of the problem well: "The strength of material works of art, both past and present, lies principally in their function as illuminating and vibrant testimonies of the social memory of humankind. For only fixed artworks are able to preserve ideas and concepts enduringly and conserve the statements of individuals or an epoch. An open work, which is dependent on interaction with a contemporary audience, or its advanced variant that follows game theory—the work is postulated as a game and the observers, according to the ‘degrees of freedom’, as players—effectively means that images lose their capability to be historical memory and testimony. In its stead, there is a durable technical system as framework and transient, arbitrary, non-reproducible, and infinitely manipulable images. The work of art as a discrete object disappears."1
For anyone charged with the preservation of highly interactive computer systems, Grau’s view of matters, particularly as expressed in the quotation with which I began this column, is rather too bleak.
There are a number of approaches we can take that do much to ameliorate matters. First, we should, where appropriate, abandon the idea that there is a single authoritative ‘object’ of preservation, in favor of recognizing that sometimes the essence of an object, its most significant features, if you will, lies outside the materiality of the object. That is not to say the hardware and software do not matter—they clearly do—but there is, in addition, something intangible that must also be preserved and documented. There is no simple formula for determining which aspects of user "experience" need to be captured, as this will vary from situation to situation. This is an appraisal and documentation task of the greatest complexity and one which takes a great deal of skill to perform effectively. It might involve not only recording archetypal examples of interaction but potentially also noting how things might otherwise have been.
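By way of a sketch only, the fragment below shows one hypothetical shape such documentation might take; the field names and example values are invented for illustration and follow no particular metadata standard.

# Hypothetical sketch of a documentation record for an interactive work.
# Field names and values are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class InteractionSession:
    description: str           # what the user did and experienced
    recording_uri: str         # screen capture, video, or interaction log
    paths_not_taken: str = ""  # notable choices the user could have made but did not

@dataclass
class PreservationRecord:
    title: str
    environment: str                   # hardware/software needed to run the work
    significant_properties: list[str]  # the features judged essential to preserve
    sessions: list[InteractionSession] = field(default_factory=list)

record = PreservationRecord(
    title="Example flight simulator",
    environment="x86-64 build, OpenGL 3.3, standard joystick",
    significant_properties=["real-time control response", "variable weather"],
)
record.sessions.append(InteractionSession(
    description="Novice pilot attempts a crosswind landing",
    recording_uri="sessions/landing-01.mkv",
    paths_not_taken="Autopilot was available but never engaged",
))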
It may not always be possible to preserve complex interactive computer systems in such a way that they can be reproduced perfectly in the future. This might be due to financial, technical, or other reasons. In such cases, careful documentation of what are considered the most important elements of the interaction represents a valuable resource for future generations. It is not possible to reproduce and experience at firsthand David Garrick’s first performance of Hamlet in Dublin during the 1742 season, but it remains instructive and valuable to know that his performance(s) aroused such excitement that the Irish capital was said to have been gripped by "Garrick Fever."
The corollary to abandoning the notion of a single authoritative ‘object’ of preservation is to abandon the idea of there being a single authoritative act of preservation. As the complexity of digital objects increases, and the digital impinges on more and more aspects of our lives, so does the need to see preservation and curation as an interdisciplinary team effort.c The make-up of preservation teams will vary depending on the object in question, but might involve social scientists, computer scientists, artists,d and ethnographers, in addition to traditional curators and historians. Digital preservation involves much more than simply saving bits, or putting up a website. Dealing with complex interactive software highlights the role of nuanced, detailed, and patient documentation in passing on to future generations a rich understanding not only of human-machine interaction, but also of human-human interaction, past and present.
As someone who spends most of his working life dealing with curators of one kind or another, I find it rather satisfying that as computer systems become ever more complex and nuanced, the prospects for complete automation of the preservation process are, at the bleeding edge of technology at least, getting further away, if anything.