March 1987 - Vol. 30 No. 3
Features
Computer science: the emergence of a discipline
The continued rapid development of computer science will require an expansion of the science base and an influx of talented new researchers. Computers have already altered the way we think and live; now they will begin to elevate our knowledge of the world.
Algorithm design
The quest for efficiency in computational methods yields not only fast algorithms, but also insights that lead to elegant, simple, and general problem-solving methods.
An interview with the 1986 A. M. Turing Award recipients—John E. Hopcroft and Robert E. Tarjan
In the following interview, which took place at the 1986 Fall Joint Computer Conference in Dallas, Texas, John Hopcroft and Robert Tarjan discuss their collaboration and its influence on their separate research today. They also comment on supercomputing and parallelism, particularly with regard to statements by FJCC keynote speakers Kenneth Wilson, Nobel laureate and director of Cornell University's Supercomputer Center, and C. Gordon Bell, chief architect on the team that designed DEC's VAX and now with the National Science Foundation. Finally, the Turing Award winners air their views on the direction of computer science as a whole and on funding and the Strategic Defense Initiative.
Automatic correction to misspelled names: a fourth-generation language approach
Using an information-theoretic likeness measure defined as an inner product on a data space created from a table of valid names, this 4GL procedure searches the database space for the nearest correctly spelled name.
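The article's actual 4GL procedure is not reproduced here, but the idea in the abstract can be sketched in a few lines: build a vector space from the table of valid names (here, using hypothetical n-gram features weighted by log-inverse frequency as a stand-in for the paper's information-theoretic weighting), and return the valid name whose normalized inner product with the misspelled input is largest. The feature choice and weighting are assumptions for illustration only.

```python
from collections import Counter
import math

def ngrams(name, n=2):
    # Pad with spaces so word boundaries contribute features too.
    s = f" {name.lower()} "
    return [s[i:i + n] for i in range(len(s) - n + 1)]

def build_weights(valid_names):
    # Weight each n-gram by the log of its inverse document frequency:
    # rare n-grams carry more information, so they count more.
    df = Counter(g for name in valid_names for g in set(ngrams(name)))
    total = len(valid_names)
    return {g: math.log(total / df[g]) + 1.0 for g in df}

def vector(name, weights):
    counts = Counter(ngrams(name))
    vec = {g: c * weights.get(g, 1.0) for g, c in counts.items()}
    norm = math.sqrt(sum(x * x for x in vec.values()))
    return {g: x / norm for g, x in vec.items()}

def correct(misspelled, valid_names):
    # Nearest valid name under the inner product on the name space.
    weights = build_weights(valid_names)
    q = vector(misspelled, weights)
    def score(name):
        v = vector(name, weights)
        return sum(q[g] * v.get(g, 0.0) for g in q)
    return max(valid_names, key=score)
```

For example, `correct("Jonson", ["Johnson", "Smith", "Williams"])` returns `"Johnson"`, since the shared bigrams dominate the inner product.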
Instead of limiting functionality, usability complements functionality. It affects how and with what effectiveness a system is used, and even whether or not it is used at all.
Pairing heaps: experiments and analysis
The pairing heap has recently been introduced as a new data structure for priority queues. Pairing heaps are extremely simple to implement and seem to be very efficient in practice, but they are difficult to analyze theoretically, and open problems remain. It has been conjectured that they achieve the same amortized time bounds as Fibonacci heaps, namely, O(log n) time for delete and delete_min and O(1) for all other operations, where n is the size of the priority queue at the time of the operation. We provide empirical evidence that supports this conjecture. The most promising algorithm in our simulations is a new variant of the twopass method, called auxiliary twopass. We prove that, assuming no decrease_key operations are performed, it achieves the same amortized time bounds as Fibonacci heaps.
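The simplicity the abstract refers to is easy to see in code. The following is a minimal min-pairing-heap sketch (not the authors' implementation) supporting insert, find_min, and delete_min, with delete_min using the basic two-pass combining rule: pair up the children left to right, then meld the pairs right to left.

```python
class PairingHeap:
    """Min pairing heap: a heap-ordered tree where each node
    keeps a list of children; melding links two roots."""

    class Node:
        __slots__ = ("key", "children")
        def __init__(self, key):
            self.key = key
            self.children = []

    def __init__(self):
        self.root = None
        self.size = 0

    @staticmethod
    def _meld(a, b):
        # Link two roots: the root with the larger key
        # becomes a child of the root with the smaller key.
        if a is None:
            return b
        if b is None:
            return a
        if b.key < a.key:
            a, b = b, a
        a.children.append(b)
        return a

    def insert(self, key):
        self.root = self._meld(self.root, self.Node(key))
        self.size += 1

    def find_min(self):
        return self.root.key

    def delete_min(self):
        # Two-pass combining: pair up the children left to right,
        # then meld the resulting pairs right to left.
        smallest = self.root.key
        kids = self.root.children
        paired = [self._meld(kids[i], kids[i + 1] if i + 1 < len(kids) else None)
                  for i in range(0, len(kids), 2)]
        new_root = None
        for h in reversed(paired):
            new_root = self._meld(h, new_root)
        self.root = new_root
        self.size -= 1
        return smallest
```

Repeated delete_min calls yield the keys in sorted order; the auxiliary twopass variant studied in the article refines this combining step, and decrease_key (omitted here) is where the open analysis questions lie.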
The effect of data structures on the logical complexity of programs
The logical complexity of a program is a measure of the effort required to understand it. We hypothesize that the logical complexity of a program increases as the relationship between the physical data structures used in the program and their corresponding abstract data types becomes more opaque. The results of an experiment conducted to investigate this hypothesis are reported. Documentation techniques for making programs that use complex data structures easier to understand are discussed. Data structure diagrams, data structure invariants, stepwise transformation of data structures, and formal specification of the mapping between abstract and concrete data structures are illustrated using two nontrivial examples.
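Two of the documentation techniques named above, a data structure invariant and a formal mapping from concrete representation to abstract data type, can be illustrated with a small hypothetical example (not one of the article's two): a queue, whose abstract value is a sequence, implemented as a fixed-size circular buffer.

```python
class CircularQueue:
    """A queue (abstract value: a sequence) implemented
    as a fixed-size circular buffer."""

    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0    # index of the oldest element
        self.count = 0   # number of stored elements

    def _invariant(self):
        # Data structure invariant relating the concrete fields.
        assert 0 <= self.head < len(self.buf)
        assert 0 <= self.count <= len(self.buf)

    def abstraction(self):
        # Mapping from the concrete buffer to the abstract sequence:
        # the elements in arrival order, reading circularly from head.
        return [self.buf[(self.head + i) % len(self.buf)]
                for i in range(self.count)]

    def enqueue(self, x):
        assert self.count < len(self.buf), "queue full"
        self.buf[(self.head + self.count) % len(self.buf)] = x
        self.count += 1
        self._invariant()

    def dequeue(self):
        assert self.count > 0, "queue empty"
        x = self.buf[self.head]
        self.head = (self.head + 1) % len(self.buf)
        self.count -= 1
        self._invariant()
        return x
```

Stating the invariant and the abstraction function explicitly makes the otherwise opaque head/count arithmetic checkable, which is precisely the kind of mapping documentation the article argues reduces logical complexity.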