August 1984 - Vol. 27 No. 8
Features
We are entering a new era in very high performance computing, one that will be dominated by systems with parallel architectures. It is critical for the United States to maintain its leadership as this new era, with its broadened applications, evolves over the next decade. Toward this end, the National Science Foundation sponsored a workshop in November 1983 to focus the collective strength of universities, industry, and government on projects for the development of knowledge-intensive industries.
Data that are missing, lost, or incomplete can be very significant in certain situations. Programs and languages need to be able to identify and manage such data effectively.
Reflections on software research
Can the circumstances at Bell Labs that nurtured the UNIX project be produced again?
Reflections on trusting trust
To what extent should one trust a statement that a program is free of Trojan horses? Perhaps it is more important to trust the people who wrote the software.
Visibility aspects of programmed dynamic data structures
Unlike static structures, dynamic Pascal-like data structures often suffer visibility problems because of the unrestricted use of the general pointer mechanism. A methodology for achieving improved visibility is proposed, based on classifying these structures and identifying the different kinds of pointers they employ.
Pass-algorithms: a user validation scheme based on knowledge of secret algorithms
Pass-algorithms validate users who demonstrate knowledge of a secret algorithm rather than of a password. A superset of the secondary-password technique, they offer considerable flexibility and ease of implementation, and they represent an attractive alternative to more costly systems-security features and equipment.
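The article's own scheme is not reproduced here, but the flavor of validating a user by challenge and response against a secret algorithm can be sketched in a few lines of Python. The particular algorithm below (reverse the challenge and append its digit sum) is an invented example, not one from the article:

    import secrets

    # A hypothetical secret algorithm shared between system and user.
    # Its secrecy, not its complexity, is what the scheme relies on.
    def secret_algorithm(challenge: str) -> str:
        digit_sum = sum(int(c) for c in challenge)
        return challenge[::-1] + str(digit_sum)

    def issue_challenge(length: int = 6) -> str:
        """Generate a random numeric challenge string."""
        return "".join(str(secrets.randbelow(10)) for _ in range(length))

    def validate(response: str, challenge: str) -> bool:
        """Accept the user only if the response matches the secret algorithm."""
        return response == secret_algorithm(challenge)

    challenge = issue_challenge()
    print("Challenge:", challenge)
    # A legitimate user computes the response mentally or on paper.
    print("Validated:", validate(secret_algorithm(challenge), challenge))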
Improved interpretation of UNIX-like file names embedded in data
When the data processed by a program span several files, the common practice of including file names as data in some of the files makes it difficult to move or share those data. In systems using tree-structured directories, this problem can be solved by making a syntactic distinction between absolute and relative file names.
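A minimal Python sketch of the syntactic convention the abstract describes (the function and path names are ours, not the paper's): names beginning with "/" are taken as absolute, and all others are resolved against the directory of the file in which they are embedded, so a group of files can be moved together without breaking.

    from pathlib import PurePosixPath

    def resolve_embedded_name(name: str, containing_file: str) -> str:
        """Interpret a file name found inside a data file."""
        if name.startswith("/"):
            return name  # absolute: use as-is
        # relative: resolve against the containing file's directory
        return str(PurePosixPath(containing_file).parent / name)

    # A file /project/data/index lists "records/part1" and "/etc/passwd":
    print(resolve_embedded_name("records/part1", "/project/data/index"))
    # -> /project/data/records/part1
    print(resolve_embedded_name("/etc/passwd", "/project/data/index"))
    # -> /etc/passwd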
The EAS-E application development system: principles and language summary
EAS-E is based on the entity-attribute-set view of system description—a useful formalism for system modeling and planning even when programming is done in languages other than EAS-E.
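For readers unfamiliar with the formalism, a rough Python sketch of the entity-attribute-set view: entities carry attributes, and entities own sets whose members are other entities. The class and the machine/job example are illustrative only, not EAS-E syntax:

    class Entity:
        def __init__(self, **attributes):
            self.__dict__.update(attributes)  # attributes of the entity
            self.sets = {}                    # sets this entity owns

        def owns(self, set_name):
            """Return (creating if needed) a named set owned by this entity."""
            return self.sets.setdefault(set_name, [])

    machine = Entity(name="lathe-1", status="busy")
    job = Entity(job_id=17, due_date="1984-09-01")
    machine.owns("queue").append(job)  # the job files into the machine's queue

    for waiting in machine.owns("queue"):
        print(waiting.job_id, waiting.due_date)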
Training wheels in a user interface
New users of high-function application systems can become frustrated and confused by the errors they make in the early stages of learning. A training interface for a commercial word processor was designed to make typical and troublesome error states “unreachable,” thus eliminating the sources of some new-user learning problems. Creating a training environment from the basic function of the system itself afforded substantially faster learning coupled with better learning achievement and better performance on a comprehension post-test. A control group spent almost a quarter of their time recovering from the error states that the training interface blocked off. We speculate on how this training strategy might be refined, and more generally, on how function should be organized in a user interface.
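The blocking idea can be illustrated with a small, hypothetical command dispatcher; in training mode, advanced commands are not executed but answer with a notice, so their error states become unreachable. The command names and message text are invented, not taken from the study:

    # Commands withheld from new users while training wheels are on.
    BLOCKED_IN_TRAINING = {"merge", "pagination", "print-queue"}

    def dispatch(command: str, training: bool, handlers: dict) -> str:
        """Run a command, unless training mode has made it unreachable."""
        if training and command in BLOCKED_IN_TRAINING:
            return f"{command} is not available while you are learning."
        handler = handlers.get(command)
        if handler is None:
            return f"unknown command: {command}"
        return handler()

    handlers = {"type": lambda: "entering typing mode",
                "merge": lambda: "merging documents"}
    print(dispatch("merge", training=True, handlers=handlers))  # blocked
    print(dispatch("type", training=True, handlers=handlers))   # allowed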
The weighted median filter
The median filter is well known [1, 2]. However, if a user wishes to predefine a set of feature types to remove or retain, the median filter does not necessarily satisfy the requirements. A more general filter, called the weighted median filter, of which the median filter is a special case, is described. It enables filters to be designed with a wide variety of properties. Particular cases of filter requirements are discussed, and the corresponding filters are derived. The notion of a minimal weighted median filter, the simplest member of a subclass of filters that act identically, is introduced and discussed. The question of how many distinct ways a class of filters can act is considered and solved for some classes.
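A brute-force Python rendering of the idea, assuming positive integer weights: each sample in the window is replicated according to its weight, and the ordinary median of the resulting multiset is taken. The standard median filter is the special case in which every weight is 1; the weights and signal below are our own example:

    from statistics import median

    def weighted_median(window, weights):
        """Median of the multiset in which each sample is replicated
        according to its (positive integer) weight."""
        expanded = []
        for x, w in zip(window, weights):
            expanded.extend([x] * w)
        return median(expanded)

    def wm_filter(signal, weights):
        """Slide an odd-length weighted median window over the signal,
        padding the ends by replicating the boundary samples."""
        k = len(weights) // 2
        padded = signal[:1] * k + list(signal) + signal[-1:] * k
        return [weighted_median(padded[i:i + len(weights)], weights)
                for i in range(len(signal))]

    # With weights (1, 1, 3, 1, 1) an isolated impulse is removed while
    # a step edge is preserved:
    print(wm_filter([1, 1, 9, 1, 1, 9, 9, 9], [1, 1, 3, 1, 1]))
    # -> [1, 1, 1, 1, 1, 9, 9, 9]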
Memory occupancy patterns in garbage collection systems
Some programming languages and computer systems use dynamic memory allocation with garbage collection. It would be useful to understand how the utilization of memory depends on the stochastic parameters describing the size and life distributions of the cells. We consider a class of dynamic storage allocation systems that use a first-fit strategy to allocate cells and perform noncompacting garbage collections to recover free memory space when memory becomes fully occupied. A formula is derived for the expected number of holes (available cells) in memory immediately following a garbage collection; it specializes to an analogue of Knuth's 'Fifty Percent' rule for systems without garbage collection. Simulations confirm the rule for exponentially distributed cell lifetimes. Other lifetime distributions are discussed. The memory-size requirements for noncompacting garbage collection are also analyzed.
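A toy Python simulation conveys the flavor of the fifty percent rule for first-fit allocation: at equilibrium, when exact fits are rare, roughly half as many holes as allocated blocks. The memory size, block sizes, and churn model below are invented for illustration and are much simpler than the paper's garbage-collected setting:

    import random

    MEMORY, N, STEPS = 10_000, 200, 20_000
    random.seed(1)
    blocks = []  # (start, size), kept sorted by start address

    def first_fit(size):
        """Place a block in the first gap that fits; return success."""
        prev_end = 0
        for i, (start, s) in enumerate(blocks):
            if start - prev_end >= size:
                blocks.insert(i, (prev_end, size))
                return True
            prev_end = start + s
        if MEMORY - prev_end >= size:
            blocks.append((prev_end, size))
            return True
        return False

    def count_holes():
        """Count maximal free gaps between blocks (plus the tail gap)."""
        holes, prev_end = 0, 0
        for start, s in blocks:
            holes += start > prev_end
            prev_end = start + s
        return holes + (prev_end < MEMORY)

    for _ in range(N):                      # reach a steady population
        first_fit(random.randint(1, 20))
    for _ in range(STEPS):                  # churn: free one, allocate one
        blocks.pop(random.randrange(len(blocks)))
        first_fit(random.randint(1, 20))

    print(len(blocks), "blocks in use,", count_holes(), "holes")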
Determinants of program repair maintenance requirements
Considerable resources are devoted to the maintenance of programs, including those required to correct errors not discovered until after the programs are delivered to the user. A number of factors are believed to affect the occurrence of these errors, e.g., the complexity of the programs, the intensity with which programs are used, and the programming style. Several hundred programs making up a manufacturing support system are analyzed to study the relationships between the number of delivered errors and measures of the programs' size and complexity (particularly as measured by software science metrics), frequency of use, and age. Not surprisingly, program size is found to be the best predictor of repair maintenance requirements. Repair maintenance is more highly correlated with the number of lines of source code in a program than with software science metrics, a surprise in light of previously reported results. The actual error rate is found to be much higher than would be predicted from program characteristics.
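For reference, the best-known software science (Halstead) metric, program volume, is computed from operator and operand counts: with n1 distinct operators, n2 distinct operands, and N1, N2 total occurrences, the vocabulary is n = n1 + n2, the length is N = N1 + N2, and the volume is V = N log2(n). A one-function Python sketch with made-up counts:

    import math

    def halstead_volume(n1, n2, N1, N2):
        """Halstead program volume V = N * log2(n)."""
        n = n1 + n2   # vocabulary: distinct operators + distinct operands
        N = N1 + N2   # length: total operator + operand occurrences
        return N * math.log2(n)

    # Illustrative counts for a small program (numbers are invented):
    # n = 32, N = 140, so V = 140 * log2(32) = 700.0
    print(halstead_volume(n1=12, n2=20, N1=80, N2=60))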