February 1986 - Vol. 29 No. 2


Features

Research and Advances

Piecing together complexity

To illustrate the "remarkable extent to which complexity theory operates by means of analogs from computability theory," Richard Karp created this conceptual map or jigsaw puzzle. To lay out the puzzle in the plane, he used a "graph planarity algorithm." The more distantly placed parts might not at first seem related, "but in the end, the theory of NP-completeness does bring them all together," Karp says. The upper right portion of the puzzle shows concepts related to combinatorial explosions and the notion of a "good" or "efficient" algorithm. In turn, "Complexity" connects these concepts to the upper left portion, which represents the concerns of early computability theorists. The traveling salesman problem is closer to the upper right corner because it is probably intractable; it therefore borders on "NP-completeness" and "Combinatorial explosion."

To some extent, however, certain divisions blur. "Linear programming," for example, has an anomalous status: the most widely used algorithms for solving such problems in practice are not good in the theoretical sense, and those that are good in the theoretical sense are often not good in practice. One example is the ellipsoid method that was the object of so much attention six years ago. It ran in polynomial time, but the polynomial was of such a high degree that the method proved good in the technical sense yet ineffective in practice. "The reason is that our notion of polynomial-time algorithms doesn't exactly capture the notion of an intuitively efficient algorithm," Karp explains. "When you get up to n⁵ or n⁶, then it's hard to justify saying that it is really efficient. So Edmonds's concept of a good algorithm isn't quite a perfect formal counterpart of good in the intuitive sense." Further, the simplex algorithm is good in every practical sense, Karp says, but not good according to the standard paradigm of complexity theory. The most recent addition to linear programming solutions, an algorithm devised by Narendra Karmarkar that some think challenges the simplex algorithm, is good in the technical sense and also appears to be quite effective in practice, says Karp.

The "good algorithm" segment is adjacent to "Heuristics" because a heuristic algorithm may work well but lack a theoretical pedigree. Some heuristic algorithms are always fast but sometimes fail to give good solutions; others always give an optimal solution but are not guaranteed to be fast. The simplex algorithm is of the latter type. "Undecidability," "Combinatorial explosion," and "Complexity" are on the same plane because they are analogs of one another: undecidability involves unbounded search, whereas combinatorial explosions are by definition very long but not unbounded searches. Complexity theory bridges the gap because, instead of asking whether a problem can be solved at all, it poses questions about the resources needed to solve a problem.

The lower left region contains the segments Karp has been concerned with most recently and that contain open-ended questions. "Randomized algorithm," for example, is situated opposite "Probabilistic analysis" because both are alternatives to worst-case analyses of deterministic algorithms. Randomized algorithms might be able to solve problems in polynomial time that deterministic ones cannot, which would extend the notion of a good algorithm. And, perhaps through software designs for non-von Neumann machines, algorithms can be made more efficient in practice by exploiting parallelism.
Finally, some parts of the puzzle are not yet defined. Says Karp, "They correspond to the unknown territory that remains to be explored in the future."
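Karp's caveat about degree-5 and degree-6 polynomials is easy to make concrete. The short Python fragment below is an illustration added here, not part of the article: it tabulates how n², n⁵, and 2ⁿ grow, with the figures read as hypothetical operation counts rather than measurements of any real algorithm.

    # Illustrative growth comparison: "polynomial time" covers both the
    # comfortably practical (n^2) and the barely usable (n^5), while the
    # exponential 2^n is hopeless for all but tiny inputs.
    for n in (10, 100, 1000, 10000):
        print(f"n={n:>6}   n^2={n**2:>16,}   n^5={n**5:>28,}   2^n has {len(str(2 ** n)):>5} digits")

At n = 10,000 the degree-5 polynomial already calls for on the order of 10²⁰ steps, which is why a polynomial-time bound alone, Edmonds's formal notion of "good," does not guarantee efficiency in the intuitive sense.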
Research and Advances

Complexity and parallel processing: an interview with Richard Karp

In the following interview, which took place at ACM 85 in Denver, Karp discusses the relation of his work to leading-edge computing topics like parallel processing and artificial intelligence. Tracing his experience as a pioneer in highly theoretical computer science, Karp describes how the decision to go against established wisdom led to the work for which he is best known and how a colleague's findings led him to see links between two previously unrelated areas. Throughout, he stresses the exchange of ideas with colleagues that helped yield fundamental insights.
Research and Advances

Program style analysis: a natural by-product of program compilation

Analyzing program style can properly be considered an integral part of program compilation, as demonstrated by two style-analysis tools for Fortran 77: AUTOMARK and ASSESS. The authors look forward to the day when all major language compilers routinely provide a standard style-analysis vector for users to interpret as they see fit.
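The abstract does not describe the metrics AUTOMARK and ASSESS actually compute. Purely as a hypothetical sketch, a compiler pass might emit a handful of easily gathered measures per source file along the following lines; Python is used only for brevity, and the metric names are invented here.

    import re

    def style_vector(source: str) -> dict:
        """Hypothetical "style-analysis vector" for a fixed-form Fortran 77 file.
        The metrics are illustrative only, not those of AUTOMARK or ASSESS."""
        lines = source.splitlines() or [""]
        # In fixed-form Fortran 77, a 'C' or '*' in column 1 marks a comment line.
        comment_lines = sum(1 for l in lines if l[:1].upper() in ("C", "*"))
        identifiers = re.findall(r"[A-Za-z][A-Za-z0-9]*", source)
        return {
            "comment_density": comment_lines / len(lines),
            "avg_line_length": sum(len(l) for l in lines) / len(lines),
            "avg_identifier_length": sum(map(len, identifiers)) / len(identifiers) if identifiers else 0.0,
        }

    example = "C     COMPUTE THE MEAN OF A AND B\n      SUM = A + B\n      XMEAN = SUM / 2.0\n"
    print(style_vector(example))

A user, instructor, or downstream tool could then interpret such a vector however seems appropriate, which is the arrangement the authors advocate.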
Research and Advances

Dynamic initial allocation and local reallocation procedures for multiple stacks

Two new procedures for manipulating multiple stacks that share sequential memory locations are discussed. The first is the dynamic initial allocation procedure, in which each stack is allocated when its first element arrives rather than being preallocated at the very beginning of the process. The second is the local reallocation procedure: when a stack overflows, only its neighboring stacks, rather than the entire memory area, are reorganized, provided that a certain condition is satisfied. Simulation results suggest that these new approaches improve operational performance in many applications. With appropriate modifications, these concepts may also be applied to other types of multiple linear lists (e.g., multiple queues) sharing sequential memory locations.
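The abstract gives the ideas but not the procedures themselves, so the following Python sketch is only one hypothetical way to realize both: stacks share a single array, a stack is carved out at its first push rather than up front, and an overflow is resolved by shifting just the immediate right-hand neighbor when that neighbor has slack. The paper's actual allocation positions and reallocation condition may differ.

    class SharedStacks:
        """Hypothetical sketch: n stacks growing upward in one shared array.
        Not the authors' procedures; allocation and shifting rules are simplified."""

        def __init__(self, n: int, size: int):
            self.mem = [None] * size
            self.n = n
            self.base = [None] * n    # None => stack not yet allocated
            self.top = [None] * n     # index of the next free cell of stack i
            self.order = []           # allocated stacks, in memory order

        def push(self, i, x):
            if self.base[i] is None:
                # Dynamic initial allocation: the stack comes into existence at its
                # first push, placed just above the highest cell handed out so far.
                start = self.top[self.order[-1]] if self.order else 0
                if start >= len(self.mem):
                    raise MemoryError("no room to create a new stack")
                self.base[i] = self.top[i] = start
                self.order.append(i)
            if self.top[i] == self._limit(i) and not self._local_realloc(i):
                raise MemoryError("overflow; a fuller reallocation would be needed")
            self.mem[self.top[i]] = x
            self.top[i] += 1

        def pop(self, i):
            if self.base[i] is None or self.top[i] == self.base[i]:
                raise IndexError(f"stack {i} is empty")
            self.top[i] -= 1
            return self.mem[self.top[i]]

        def _limit(self, i):
            # Upper boundary of stack i: the base of the next stack in memory
            # order, or the end of the shared array if i is the topmost stack.
            k = self.order.index(i)
            return self.base[self.order[k + 1]] if k + 1 < len(self.order) else len(self.mem)

        def _local_realloc(self, i):
            # Local reallocation: instead of repacking the whole memory area,
            # shift only the neighboring stack up by one cell, provided it has
            # at least one free cell of its own (the "certain condition" here).
            k = self.order.index(i)
            if k + 1 == len(self.order):
                return False              # no right-hand neighbor to shift
            j = self.order[k + 1]
            if self.top[j] >= self._limit(j):
                return False              # neighbor has no slack either
            self.mem[self.base[j] + 1:self.top[j] + 1] = self.mem[self.base[j]:self.top[j]]
            self.base[j] += 1
            self.top[j] += 1
            return True

    s = SharedStacks(n=3, size=8)
    for v in range(4):
        s.push(0, v)                      # stack 0 fills cells 0-3
    s.push(1, "a"); s.push(2, "x")        # stacks 1 and 2 allocated on first push
    s.push(1, "b")                        # overflow of stack 1: stack 2 is shifted up locally
    print(s.pop(1), s.pop(2), s.pop(0))   # -> b x 3

Where the local condition fails, this sketch simply raises an error; presumably the authors' scheme falls back to reorganizing a larger portion of the memory area at that point.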
Research and Advances

Organizational power and the information services department: a reexamination

In a recent application of the theory of strategic contingencies in three large multinational firms, Lucas found that information services departments were perceived by others as having low levels of power and influence and suggested a variety of reasons for the results. This note continues the application of the theory of strategic contingencies to the information services department by describing a study of intraorganizational power that uses basically the same procedures as the Lucas study and obtains similar results. In an effort to stimulate future power-related research in the information systems area, this note concludes by suggesting several reasons, beyond those given by Lucas, for the levels of power attributed to information services departments.
