Moshe Y. Vardi’s Editor’s Letter "Moore’s Law and the Sand-Heap Paradox" (May 2014) took me back to my computer engineering education in the 1970s, a period of transition from expensive to (relatively) inexpensive hardware. My classes, both theory and practice, required that I understand microcode and software execution environments well enough to avoid gross inefficiencies. If Moore’s Law is indeed winding down, as Vardi said, software practitioners must focus even more than they already do on developing efficient code.
In my more than 35 years as a software professional, I have noted with dismay the bloatware phenomenon, fueled by the expectation that Moore’s Law would mask inefficient software. Developing resilient, secure, efficient software requires more skill, time, and money than we see invested in the commercial software industry today, along with a different mind-set.
No one should count on a breakthrough on the hardware side in the face of Moore’s Law’s impending demise, although I will be delighted if proven wrong. Software researchers and engineers alike (and the organizations funding them) must reset their expectations vis-à-vis hardware advances. As Vardi said, "new algorithms and systems" and better use of existing resources through virtualization and software parallelism can help mitigate the slowdown in hardware advances.
David K. Hemsath, Round Rock, TX
Deeper Roots of Transactional Programming
Vincent Gramoli and Rachid Guerraoui’s article "Democratizing Transactional Programming" (Jan. 2014) suggested transaction theory was first formalized in 1976. As with many early developments in computing, the concepts of transaction management for data storage and retrieval emerged in engineering practice before appearing in research papers or industry standards. Even so, the article dated the formalization of transactions at least a couple of years too late. Its Figure 1 ("History of Transactions") identified an article by Eswaran et al. (including Jim Gray), "Notions of Consistency and Predicate Locks in a Database System," in Communications (Nov. 1976) as the earliest published work on transactions. In fact, that article condensed the authors’ substantively identical IBM Research technical report On the Notions of Consistency and Predicate Locks in a Data Base System (Dec. 30, 1974).
Reflecting the considerable development of industrial data management systems by the early 1970s, at least some essential features of the transaction concept had already been presented in an earlier published work, the CODASYL Data Base Task Group April 71 Report, which was indeed the first industry standard for concurrent database management. It introduced the now-ubiquitous terms "data definition language" and "data manipulation language" to enable the declaration of a schema (and views upon it) and specified the statements needed to retrieve, insert, and update data elements.
In 1974, Eswaran et al. wrote: "While fairly static locking schemes are acceptable in a conventional operating system, a particular data base transaction may lock an arbitrary logical subset of the data base." A transaction that establishes priority claims over access to a set of shared, related data items enables concurrent programs to execute while effecting consistent (application-meaningful) state transitions.
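The predicate lock is the report’s key generalization: a transaction claims an arbitrary logical subset of the database, described by a predicate, rather than a fixed page or record. The following toy Python sketch illustrates the idea under the simplifying assumption of a finite key universe, so that predicate overlap is decidable by enumeration; all names and values here are hypothetical illustrations, not Eswaran et al.’s algorithm.

# Toy predicate-lock conflict test over a finite key universe.
# A sketch of the idea only, not Eswaran et al.'s scheme.
from dataclasses import dataclass
from typing import Callable

KEYS = range(100)  # finite universe of record keys keeps overlap decidable

@dataclass
class PredicateLock:
    txn: str                     # owning transaction
    pred: Callable[[int], bool]  # which records the lock covers
    exclusive: bool              # True for writers

def conflicts(a: PredicateLock, b: PredicateLock) -> bool:
    """Locks conflict if some record satisfies both predicates and at
    least one lock is exclusive (checked here by brute enumeration)."""
    if a.txn == b.txn or not (a.exclusive or b.exclusive):
        return False
    return any(a.pred(k) and b.pred(k) for k in KEYS)

held = [PredicateLock("T1", lambda k: k < 50, exclusive=True)]
request = PredicateLock("T2", lambda k: 40 <= k < 60, exclusive=False)
print(any(conflicts(request, h) for h in held))  # True: keys 40-49 overlap

Because both predicates can be satisfied by keys 40 through 49, the second request must wait, even though neither lock named any individual record.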
Although the CODASYL report did not include the term "transaction," it did include rules for managing locking and contention among "run-units," or independent program executions accessing a single data store. The CODASYL data manipulation language verbs OPEN and CLOSE demarcated units of work within which record sets and individual records could be locked, and competing run-units could be granted degrees of access or excluded entirely until the run-unit closed. The CODASYL report also contemplated rollback semantics in the face of failures.
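To give the flavor of such a demarcated unit of work in modern terms, here is a minimal sketch using Python’s sqlite3 module; the accounts table, the transfer function, and the amounts are hypothetical, and BEGIN/COMMIT/ROLLBACK stand in loosely for the unit-of-work and rollback semantics described above rather than reproducing the CODASYL interface.

import sqlite3

con = sqlite3.connect(":memory:", isolation_level=None)  # explicit txns
con.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER)")
con.executemany("INSERT INTO accounts VALUES (?, ?)",
                [("alice", 100), ("bob", 0)])

def transfer(src, dst, amount):
    """Debit and credit succeed or fail together: a consistent,
    application-meaningful state transition."""
    try:
        con.execute("BEGIN IMMEDIATE")  # take a write lock, excluding
                                        # competing "run-units" until commit
        con.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                    (amount, src))
        (balance,) = con.execute("SELECT balance FROM accounts WHERE id = ?",
                                 (src,)).fetchone()
        if balance < 0:
            raise ValueError("insufficient funds")
        con.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                    (amount, dst))
        con.execute("COMMIT")
    except Exception:
        con.execute("ROLLBACK")  # undo the partial update on failure
        raise

transfer("alice", "bob", 60)      # commits: alice 40, bob 60
try:
    transfer("alice", "bob", 60)  # would overdraw; rolls back cleanly
except ValueError:
    pass
print(dict(con.execute("SELECT id, balance FROM accounts")))

The failed second transfer leaves no partial debit behind, which is precisely the rollback behavior the report contemplated.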
Maurice Wilkes, an early pioneer of computing, used the term "transaction" in his 1972 article "On Preserving the Integrity of Data Bases" in The Computer Journal, describing "transaction journals" as recoverable records of "a condensed summary of the transaction recorded immediately before the update is made." Two years later, R.A. Davenport’s "Design of Transaction-Oriented Systems Employing a Transaction Monitor" in Proceedings of the ACM Annual Conference recalled the role of early transaction processing monitors in crystallizing the transaction abstraction.
Notwithstanding these earlier efforts, Eswaran et al.’s 1974 technical report took the transaction concept from a semi-codified expression of practical knowledge to a rigorous formalization, intersecting with relational database theory in its early years. It stands today as a landmark publication in the history of automated data management.
Alastair Green, London, U.K.