February 1986 - Vol. 29 No. 2
Features
Combinatorics, complexity, and randomness
The 1985 Turing Award winner presents his perspective on the development of the field that has come to be called theoretical computer science.
To illustrate the "remarkable extent to which complexity theory operates by means of analogs from computability theory," Richard Karp created this conceptual map or jigsaw puzzle. To lay out the puzzle in the plane, he used a "graph planarity algorithm." The more distantly placed parts might not at first seem related, "but in the end, the theory of NP-completeness does bring them all together," Karp says.
The upper right portion of the puzzle shows concepts related to combinatorial explosions and the notion of a "good" or "efficient" algorithm. In turn, "Complexity" connects these concepts to the upper left portion, which represents the concerns of early computability theorists.
The traveling salesman problem is closer to the upper right corner because it is probably intractable. It therefore borders on "NP-completeness" and "Combinatorial explosion."
To some extent, however, certain divisions blur. "Linear programming," for example, has an anomalous status—the most widely used algorithms for solving such problems in practice are not good in the theoretical sense, and those that are good in the theoretical sense are often not good in practice. One example is the ellipsoid method, the object of so much attention six years ago. It ran in polynomial time, but the polynomial was of such high degree that the method proved good in the technical sense yet ineffective in practice. "The reason is that our notion of polynomial-time algorithms doesn't exactly capture the notion of an intuitively efficient algorithm," Karp explains. "When you get up to n⁵ or n⁶, then it's hard to justify saying that it is really efficient. So Edmonds's concept of a good algorithm isn't quite a perfect formal counterpart of good in the intuitive sense." Further, the simplex algorithm is good in every practical sense, Karp says, but not good according to the standard paradigm of complexity theory. The most recent addition to linear programming solutions, an algorithm devised by Narendra Karmarkar that some think challenges the simplex algorithm, is good in the technical sense and also appears to be quite effective in practice, says Karp.
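To make the distinction concrete, here is a rough back-of-the-envelope illustration, not drawn from the article, of why a bound like n⁶, though polynomial, stops feeling efficient; the step counts simply assume n^k elementary operations:

    # Rough illustration (not from the article): an n^6 step count is
    # polynomial, yet grows out of practical reach long before any
    # exponential bound needs to be invoked.
    for n in (10, 100, 1000):
        print(f"n = {n:>5}:  n^2 = {n**2:>12,}   n^6 = {n**6:>22,}")

    # At 10^9 operations per second, n = 1000 under an n^6 bound needs about
    # 10^18 steps -- on the order of 30 years -- even though the algorithm is
    # formally "good" in the polynomial-time sense.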
The "good algorithm" segment is adjacent to "Heuristics" because a heuristic algorithm may work well, but lack a theoretical pedigree. Some heuristic algorithms are always fast, but sometimes fail to give good solutions. Others always give an optimal solution, but are not guaranteed to be fast. The simplex algorithm is of the latter type.
"Undecidability, " "Combinatorial explosion," and "Complexity" are on the same plane because they are analogs of one another; undecidability involves unbounded search, whereas combinatorial explosions are by definition very long but not unbounded searches. Complexity theory bridges the gap because, instead of asking whether a problem can be solved at all, it poses questions about the resources needed to solve a problem.
The lower left region contains the segments that Karp has been concerned with most recently and that contain open-ended questions. "Randomized algorithm," for example, is situated opposite "Probabilistic analysis" because both are alternatives to worst-case analyses of deterministic algorithms. Randomized algorithms might be able to solve problems in polynomial time that deterministic ones cannot, and that could mean an extension of the notion of good algorithms. Perhaps, through software designs for non-von Neumann machines, parallelism can make algorithms more efficient in practice. Finally, some parts of the puzzle are not yet defined. Says Karp, "They correspond to the unknown territory that remains to be explored in the future."
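For a flavor of what a randomized algorithm looks like, here is a small illustrative sketch, not taken from the article, of the Fermat primality test, which trades a tiny probability of error for speed; the function name and trial count are arbitrary choices:

    import random

    def probably_prime(n, trials=20):
        # Illustrative randomized test (not from the article). A single random
        # base a with pow(a, n-1, n) != 1 proves n composite; if no such
        # witness turns up after the trials, n is declared "probably prime."
        # Carmichael numbers can fool this particular test, so it is a sketch
        # of the idea only, not a production primality check.
        if n < 4:
            return n in (2, 3)
        for _ in range(trials):
            a = random.randrange(2, n - 1)
            if pow(a, n - 1, n) != 1:
                return False      # certainly composite
        return True               # probably prime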
Complexity and parallel processing: an interview with Richard Karp
In the following interview, which took place at ACM 85 in Denver, Karp discusses the relation of his work to leading-edge computing topics like parallel processing and artificial intelligence. Tracing his experience as a pioneer in highly theoretical computer science, Karp describes how the decision to go against established wisdom led to the work for which he is best known and how a colleague's findings led him to see links between two previously unrelated areas. Throughout, he stresses the exchange of ideas with colleagues that helped yield fundamental insights.
The advantages of user-defined distfix operators—a syntactic convenience that enhances the readability of programs—can be obtained as an extension of almost any programming language without requiring dynamic changes to the parser.
A note on the Berry-Meekings style metric
A modification of the Berry-Meekings "style metric"—applied to software from the corporate environment—finds little relationship between this style metric and error proneness.
Program style analysis: a natural by-product of program compilation
Analyzing program style may be properly considered an integral part of program compilation—as successfully implemented in two style-analysis tools for Fortran 77: AUTOMARK and ASSESS. The authors look forward to the day when all major language compilers routinely provide a standard style-analysis vector for users to interpret as they see fit.
Dynamic initial allocation and local reallocation procedures for multiple stacks
Two new procedures for manipulating multiple stacks that share sequential memory locations are discussed. The first is the dynamic initial allocation procedure, in which each stack is allocated space when its first element arrives rather than being preallocated at the outset. The second is the local reallocation procedure: when a stack overflows, only its neighboring stacks, rather than the entire memory area, are reorganized, provided that a certain condition is satisfied. Simulation results suggest that these new approaches improve operational performance in many applications. With appropriate modifications, these concepts may also be applied to other types of multiple linear lists (e.g., multiple queues) sharing sequential memory locations.
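For readers unfamiliar with the setting, the following is a minimal, hypothetical sketch of k stacks packed into one sequential array, with a simple borrow-from-the-right-neighbor rule on overflow. The class name, the even initial partition, and the one-slot shift are illustrative placeholders; they do not reproduce the article's allocation condition or reallocation procedures.

    class SharedStacks:
        """Hypothetical sketch only: k stacks packed into one array.
        Stack i occupies slots base[i] .. base[i+1]-1, and top[i] is its
        next free slot. Unlike the article's dynamic initial allocation,
        the array here is simply split evenly at the start."""

        def __init__(self, size, k):
            self.mem = [None] * size
            self.k = k
            self.base = [i * size // k for i in range(k + 1)]
            self.top = list(self.base[:k])

        def push(self, i, value):
            if self.top[i] == self.base[i + 1]:        # stack i is full
                if not self._local_realloc(i):
                    raise MemoryError("no room after local reallocation")
            self.mem[self.top[i]] = value
            self.top[i] += 1

        def pop(self, i):
            if self.top[i] == self.base[i]:
                raise IndexError("pop from empty stack")
            self.top[i] -= 1
            return self.mem[self.top[i]]

        def _local_realloc(self, i):
            # Crude stand-in for local reallocation: if the right-hand
            # neighbor has slack, shift its contents up one slot and move
            # the shared boundary, freeing one cell for stack i.
            j = i + 1
            if j < self.k and self.top[j] < self.base[j + 1]:
                self.mem[self.base[j] + 1 : self.top[j] + 1] = \
                    self.mem[self.base[j] : self.top[j]]
                self.base[j] += 1
                self.top[j] += 1
                return True
            return False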
Organizational power and the information services department: a reexamination
In a recent application of the theory of strategic contingencies in three large multinational firms, Lucas found that information services departments were perceived by others as having low levels of power and influence, and he suggested a variety of reasons for these results. This note continues the application of the theory of strategic contingencies to the information services department by describing a study of intraorganizational power that uses basically the same procedures as the Lucas study and obtains similar results. In an effort to stimulate future power-related research in the information systems area, this note concludes by suggesting several reasons, beyond those given by Lucas, for the levels of power attributed to information services departments.