I have been thinking about scaffolding as I watch a huge new hotel and condo complex being constructed adjacent to the building where my office is located. It is a long process, starting with digging a deep hole whose edges are braced by steel beams pounded into the ground, with thick wooden slats between the beams to keep the sides of the hole from collapsing. A foundation is laid and, from there, a collection of temporary struts is erected to hold up the next floor as the rebar and concrete are placed. I was struck by the similarity to catalysts in chemical reactions, which promote the reaction but are not consumed in the process. So it is with temporary scaffolding that holds things in place until it is no longer needed and is removed to be reused again and again.
In the early days of Internet development, the teams working on it made heavy use of its predecessor, the Arpanet, which supported the work with email, remote timesharing access, file transfers, and other applications. Once the Internet protocols had been specified and tested, the Arpanet scaffolding (that is, the so-called NCP protocols) was taken down and replaced with TCP/IP, and suitable modifications were made to the earlier Arpanet applications to make them compatible with the new Internet protocols. This same process was used to develop the MCI Mail application in the early 1980s. During MCI Mail development, the implementation team coordinated its work using a Digital Equipment Corporation email service called All-in-One. Once we had MCI Mail working, we were able to retire the earlier system in the same way scaffolding is removed from a completed building.
This must be a common paradigm for many other kinds of software development in which a temporary support system is adopted, or perhaps even developed, to facilitate the next evolutionary step. I am seeing a similar process taking place in the development of the Solar System Internet, which is being tested on the existing Internet using mostly terrestrial (and some low Earth orbit or other near-Earth) assets. Artificial mechanisms that introduce delay or disruption allow the space-oriented Bundle Protocol Suite to be tested in preparation for its deployment in non-Internet environments in space.
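To illustrate the idea, and only as a minimal sketch rather than the actual Bundle Protocol Suite or any real testbed software, here is how artificial delay and disruption might be imposed on a store-and-forward agent in Python; the class names, parameters, and simple retransmission scheme are illustrative assumptions.

```python
import random

# Toy illustration only: a store-and-forward "bundle agent" sending over a link
# whose delay and loss we impose artificially, the way a terrestrial testbed can
# stand in for deep-space links. Names and numbers are made up for the sketch.

random.seed(42)


class ImpairedLink:
    """Delivers a bundle after a long one-way delay, or drops it entirely."""

    def __init__(self, one_way_delay_s, loss_probability):
        self.one_way_delay_s = one_way_delay_s
        self.loss_probability = loss_probability

    def transmit(self, bundle, send_time_s):
        if random.random() < self.loss_probability:
            return None  # the bundle was lost to "disruption"
        return send_time_s + self.one_way_delay_s  # simulated arrival time


class BundleAgent:
    """Stores a bundle and retransmits it until the link finally delivers it."""

    def __init__(self, link, retransmit_interval_s):
        self.link = link
        self.retransmit_interval_s = retransmit_interval_s

    def send_until_delivered(self, bundle):
        clock_s, attempts = 0.0, 0
        while True:
            attempts += 1
            arrival_s = self.link.transmit(bundle, clock_s)
            if arrival_s is not None:
                return arrival_s, attempts
            clock_s += self.retransmit_interval_s  # wait out the timeout, retry


# Emulate a Mars-like link: roughly 12.5 minutes one way, with 20% loss.
link = ImpairedLink(one_way_delay_s=750.0, loss_probability=0.2)
agent = BundleAgent(link, retransmit_interval_s=900.0)
arrival_s, attempts = agent.send_until_delivered("science-data-bundle")
print(f"delivered at t={arrival_s:.0f}s after {attempts} attempt(s)")
```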
This line of thinking also drew my attention to work on compiler-compilers, which I once used to instrument FORTRAN programs by running them through a special compiler that injected additional source code into the program being compiled so that performance measurements could be taken during execution. It was sometimes challenging to think about what would happen at compiler-compiler time (creating the code that would produce a compiler), which, at compile time, would produce source code including the specially injected measurement code, which would then be turned into executable code to take measurements at execution time. Remembering which phase would produce which output was sometimes confusing and required a few diagrams to keep straight what happened and when, not unlike those science fiction stories involving time travel in which you must figure out what happens and when, as in the Back to the Future film series.
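To make the phases concrete, here is a minimal sketch in Python rather than FORTRAN, with made-up names such as instrument and __timed and no attempt to reproduce a real compiler-compiler: one phase rewrites the source of a program, injecting measurement code, and a later phase executes the rewritten program, which then reports its own timings.

```python
# The program to be "compiled" -- a stand-in for the original FORTRAN source.
ORIGINAL_SOURCE = """\
def work(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

print(work(100_000))
"""

# Measurement code to be injected ahead of the program.
MEASUREMENT_PREAMBLE = """\
import time

def __timed(f):
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = f(*args, **kwargs)
        print(f"{f.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper
"""


def instrument(source):
    """'Compile time': rewrite the source, injecting a timing hook at each function."""
    lines = []
    for line in source.splitlines():
        if line.startswith("def "):
            lines.append("@__timed  # injected measurement code")
        lines.append(line)
    return MEASUREMENT_PREAMBLE + "\n".join(lines) + "\n"


# 'Execution time': run the instrumented program; the injected code reports timings.
instrumented = instrument(ORIGINAL_SOURCE)
exec(compile(instrumented, "<instrumented>", "exec"), {})
```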
These reminiscences have persuaded me that tools for building new systems often require temporary scaffolding to reach the point where the intended system can support itself. Alan Kay's Smalltalk language eventually was able to compile itself with a compiler written in its own language, if my aging memory serves. I would be interested to hear from readers about their own experiences creating software scaffolds that have enabled the development of new systems. I wonder whether this is like what happens when a new species is born of an earlier one but the two are distinct and non-interoperable.