August 1990 - Vol. 33 No. 8
Features
Computer perspectives: the bandwidth famine
A number of years ago I visited a young and successful computer company and was given a tour of the facilities by one of the directors. We passed offices in which people were working at computer terminals; I was told they constituted the accounts department. A little further on we came to the software department; here people were also using terminals. “Now,” said my host, “I will show you the hardware department.” To my surprise, instead of the oscilloscopes and waveform analyzers that I had expected to find, I saw more computer terminals. This was a sign of the times.

Today, the designer of printed circuit boards, such as those intended to be plugged into PCs, has a great variety of software tools from which to choose: programs for layout and routing; programs for checking that the design rules have been followed correctly; logic simulators for checking that the design is functionally correct; and more elaborate simulators for checking the timing. These tools make it possible to produce a working prototype without performing any experimental work on a laboratory bench.

In the dark ages of electronics, the debugging of circuits was carried out using experimental versions made by screwing the components down on a piece of wood and connecting them. The behavior of the circuit was checked using an oscilloscope. These experimental versions were known as breadboards, because someone had compared them to the boards on which bread was sliced. The wooden board has been obsolete for years, but the term has survived. A modern version of the breadboard (still used occasionally) is the wire-wrapped prototype. It is, however, costly in time and money, and the effort spent on it does not advance the physical design of the final product. Moreover, in certain respects, particularly as regards electrical interference and cross talk, the wire-wrapped version may give misleading information.
Legally speaking: should program algorithms be patented?
In the Legally Speaking column last May [6], we reported on a survey conducted at last year's ACM-sponsored Conference on Computer-Human Interaction in Austin, Tex. Among the issues about which the survey inquired was whether the respondents thought patent protection should be available for various aspects of computer programs. The 667 respondents overwhelmingly supported copyright protection for source and object code, although they strongly opposed copyright or patent protection for “look and feel” and most other aspects of programs. Algorithms were the only aspect of programs for which there was more than a small minority of support for patent protection. Nevertheless, more than half of the respondents opposed either copyright or patent protection for algorithms. However, nearly 40 percent of the respondents regarded algorithms as appropriately protected by patents. (Another eight percent would have copyright law protect them.)

We should not be surprised that these survey findings reflect division within the technical community about patents as a form of protection for this important kind of computer program innovation. A number of prominent computer professionals who have written or spoken about patent protection for algorithms or other innovative aspects of programs have either opposed or expressed reservations about this form of protection for software [2, 4, 5].

This division of opinion, of course, has not stopped many firms and some individuals from seeking patent protection for algorithms or other software innovations [8]. Although the Refac Technology patent infringement lawsuit against Lotus and other spreadsheet producers may be in some jeopardy, it and other software patent lawsuits have increased awareness of the new availability of software patents. This situation, in turn, has generated some heated discussion over whether this form of legal protection will be in the industry's (and society's) long-term best interests.

The aim of this column is to acquaint readers with the legal debate on patent protection for algorithms and other computer program innovations, an issue that seems to be as divisive among lawyers as it is among those in the computer field [3, 9].
Cyc: toward programs with common sense
Cyc is a bold attempt to assemble a massive knowledge base (on the order of 10^8 axioms) spanning human consensus knowledge. This article examines the need for such an undertaking and reviews the authors' efforts over the past five years to begin its construction. The methodology and history of the project are briefly discussed, followed by a more developed treatment of the current state of the representation language used (epistemological level), techniques for efficient inferencing and default reasoning (heuristic level), and the content and organization of the knowledge base.
Knowledge and natural language processing
KBNL is a knowledge-based natural language processing system that is novel in several ways, including the clean separation it enforces between linguistic knowledge and world knowledge, and its use of knowledge to aid in lexical acquisition. Applications of KBNL include intelligent interfaces, text retrieval, and machine translation.
Natural language understanding and speech recognition
Natural language understanding must be an integral part of any automatic speech recognition system that attempts to deal with interactive problem solving. The methods for representing and integrating knowledge from different sources may be valuable for the understanding process as well as for speech recognition.
Cache considerations for multiprocessor programmers
Although caches in most computers are invisible to programmers, they significantly affect program performance. This is particularly true for cache-coherent, shared-memory multiprocessors. This article presents recent research into the performance of parallel programs and its implications for programmers who may know little about caches.
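One classic way coherent caches surprise programmers is “false sharing”: logically independent variables that happen to occupy the same cache line bounce between processors, because each write by one thread invalidates the other's cached copy. The sketch below is our illustration, not the article's code; the 64-byte line size, the pthreads API, and line alignment of the array are all assumptions. Padding each thread's counter out to a full line keeps the updates from interfering:

    #include <pthread.h>

    #define LINE 64                      /* assumed cache-line size */

    struct padded_counter {
        volatile long value;
        char pad[LINE - sizeof(long)];   /* keep neighbors on other lines */
    };

    static struct padded_counter counters[2];   /* one line per counter */

    static void *worker(void *arg)
    {
        struct padded_counter *c = arg;
        for (long i = 0; i < 10000000L; i++)
            c->value++;                  /* no cross-thread line bouncing */
        return 0;
    }

    int main(void)
    {
        pthread_t t0, t1;
        pthread_create(&t0, 0, worker, &counters[0]);
        pthread_create(&t1, 0, worker, &counters[1]);
        pthread_join(t0, 0);
        pthread_join(t1, 0);
        return 0;
    }

Without the pad array, both counters would typically share one cache line, and the same loop can run several times slower on a cache-coherent multiprocessor even though the threads never touch each other's data.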
A bridging model for parallel computation
The success of the von Neumann model of sequential computation is attributable to the fact that it is an efficient bridge between software and hardware: high-level languages can be efficiently compiled on to this model; yet it can be efficiently implemented in hardware. The author argues that an analogous bridge between software and hardware is required for parallel computation if that is to become as widely used. This article introduces the bulk-synchronous parallel (BSP) model as a candidate for this role, and gives results quantifying its efficiency both in implementing high-level language features and algorithms and in being implemented in hardware.
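To make the superstep structure at the heart of the model concrete, the following sketch (our illustration, not the article's notation) computes a sum in two bulk-synchronous steps, with POSIX threads and a barrier standing in for the abstract machine: each process computes on local data, deposits its result in a shared communication area, and synchronizes in bulk before reading the others' contributions.

    #include <pthread.h>
    #include <stdio.h>

    #define P 4                        /* number of BSP processes */

    static long partial[P];            /* communication area between steps */
    static pthread_barrier_t barrier;

    static void *bsp_process(void *arg)
    {
        int pid = (int)(long)arg;

        /* Superstep 1: local computation, then communication. */
        long sum = 0;
        for (int i = pid; i < 1000; i += P)
            sum += i;
        partial[pid] = sum;
        pthread_barrier_wait(&barrier);    /* bulk synchronization */

        /* Superstep 2: all partial results are now visible. */
        long total = 0;
        for (int i = 0; i < P; i++)
            total += partial[i];
        if (pid == 0)
            printf("total = %ld\n", total);    /* prints 499500 */
        return 0;
    }

    int main(void)
    {
        pthread_t t[P];
        pthread_barrier_init(&barrier, 0, P);
        for (long i = 0; i < P; i++)
            pthread_create(&t[i], 0, bsp_process, (void *)i);
        for (int i = 0; i < P; i++)
            pthread_join(t[i], 0);
        return 0;
    }

In the BSP model, the cost of such a superstep is the maximum local computation plus the communication volume scaled by a machine-dependent throughput factor, plus the barrier latency.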
The willingness of one scientist to share data with another scientist continues to be influenced by a number of economic, social, psychological and political factors.
Physical design equivalencies in database conversion
As relational technology becomes increasingly accepted in commercial data processing, conversion of some of the huge number of existing navigational databases to relational databases is inevitable. It is thus important to understand how to recognize physical design modifications and enhancements in the navigational databases and how to convert them to equivalent relational terms as applicable.
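As a toy illustration of one such equivalence (ours, not the article's; the record names are hypothetical), a CODASYL-style owner-member set implemented as a pointer chain corresponds, in relational terms, to a foreign-key column on the member table:

    /* Navigational form: set membership is a chain of pointers. */
    struct order;                         /* member record type */

    struct customer {                     /* owner record */
        int cust_no;
        struct order *first_order;        /* head of the set's chain */
    };

    struct order {
        int order_no;
        struct order *next_in_set;        /* next member of the same set */
    };

    /* Relational form: the chain becomes a foreign-key column,
       e.g. CREATE TABLE orders (order_no INT,
                                 cust_no INT REFERENCES customers). */
    struct order_row {
        int order_no;
        int cust_no;                      /* foreign key replaces pointers */
    };

Deciding which pointer structures in a navigational schema merely implement set membership, and which encode further physical design decisions such as ordering or clustering, is the kind of recognition problem the article addresses.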
A very fast substring search algorithm
This article describes a substring search algorithm that is faster than the Boyer-Moore algorithm. The algorithm does not depend on scanning the pattern string in any particular order. Three variations are given, each using a different pattern scan order: (1) a “Quick Search” algorithm; (2) a “Maximal Shift” algorithm; and (3) an “Optimal Mismatch” algorithm.
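As an illustration, here is a hedged C sketch of the “Quick Search” idea: after every attempt, the window shifts by an amount determined by the text character just past the window, so the pattern characters themselves may be compared in any order (memcmp's left-to-right order below is an arbitrary choice, not part of the algorithm).

    #include <stddef.h>
    #include <string.h>

    #define ALPHABET 256

    /* Returns a pointer to the first occurrence of pat[0..m-1] in
       text[0..n-1], or NULL.  A sketch of the Quick Search idea only. */
    const char *quick_search(const char *text, size_t n,
                             const char *pat, size_t m)
    {
        size_t shift[ALPHABET];

        if (m == 0 || m > n)
            return m == 0 ? text : NULL;

        /* Default shift: one position past the whole pattern. */
        for (size_t i = 0; i < ALPHABET; i++)
            shift[i] = m + 1;
        /* Rightmost occurrence of each pattern character wins. */
        for (size_t i = 0; i < m; i++)
            shift[(unsigned char)pat[i]] = m - i;

        for (size_t pos = 0; pos + m <= n; ) {
            if (memcmp(text + pos, pat, m) == 0)
                return text + pos;            /* match found */
            if (pos + m == n)
                break;                        /* no character past window */
            pos += shift[(unsigned char)text[pos + m]];
        }
        return NULL;
    }

Because the character just beyond the window always participates in the next alignment, a single shift can be as large as m + 1, which is what makes this family of methods fast on large alphabets.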
Inside risks: insecurity about security?
A few highly visible cases of computer system exploitations have raised general awareness of existing vulnerabilities and the considerable risks they entail.