May 1987 - Vol. 30 No. 5
Features
A mathematically focused curriculum for computer science
A curriculum is proposed that is flexible enough to suit all students studying computer science, and that reintegrates the theoretical aspects of the field with its more practical, vocational aspects.
Taking “computer literacy” literally
Computer literacy implies an analogy between computer-related skills and linguistic literacy that has not been seriously explored. Recent studies of linguistic literacy can illuminate the definition of computer literacy and can suggest new ways of teaching it.
This report summarizes the activities of the Computing Sciences Accreditation Board from its inception in 1984 through its first accreditation cycle completed in June 1986. The major activities during this period were directed at developing the CSAB structure necessary to carry out the accreditation process, and at conducting the first round of accreditation visits and actions.
Intelligent information-sharing systems
The Information Lens system is a prototype intelligent information-sharing system that is designed to include not only good user interfaces for supporting the problem-solving activity of individuals, but also good organizational interfaces for supporting the problem-solving activities of groups.
Distribution of mathematical software via electronic mail
A large collection of public-domain mathematical software is now available via electronic mail. Messages sent to "netlib@anl-mcs" (on the Arpanet/CSNET) or to "research!netlib" (on the UNIX® network) wake up a server that distributes items from the collection. The one-line message "send index" causes a library catalog to be sent by return mail.
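The request protocol described above can be exercised from any Unix shell; the sketch below composes the one-line "send index" body given in the abstract. The `mail(1)` invocations are shown as comments and are an assumption — any mailer that can deliver to the listed addresses would work.

```shell
# Compose the one-line request body the netlib server understands
# (per the abstract, "send index" returns the library catalog).
REQUEST="send index"
printf '%s\n' "$REQUEST"

# Deliver it with any mailer that reaches the addresses in the abstract, e.g.:
#   printf '%s\n' "$REQUEST" | mail netlib@anl-mcs      # Arpanet/CSNET
#   printf '%s\n' "$REQUEST" | mail research!netlib     # UNIX network
```

The server replies by return mail, so no interactive session or special client is needed — only the ability to send a plain-text message.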
A software architecture for supporting the exchange of electronic manuscripts
As electronic-manuscript exchange becomes more prevalent, problems arise in translating among the wide variety of electronic representations. The optimum solution is a system that can support both the use and the creation of translation tools.
An empirical validation of software cost estimation models
Practitioners have expressed concern over their inability to accurately estimate costs associated with software development. This concern has become even more pressing as development costs continue to rise. As a result, considerable research attention is now directed at gaining a better understanding of the software-development process, as well as at constructing and evaluating software cost estimation tools. This paper evaluates four of the most popular algorithmic models used to estimate software costs (SLIM, COCOMO, Function Points, and ESTIMACS). Data on 15 large completed business data-processing projects were collected and used to test the accuracy of the models' ex post effort estimates. One important result was that Albrecht's Function Points effort estimation model was validated by the independent data provided in this study [3]. The models not developed in business data-processing environments showed a significant need for calibration. As models of the software-development process, all of the models tested failed to sufficiently reflect the underlying factors affecting productivity. Further research will be required to develop understanding in this area.
An analysis of variance (ANOVA) model is developed for determining whether significant differences exist among strategies employing heuristics. Use of the model is illustrated in an application involving capacity assignment for networks using the dynamic hierarchy architecture, in which the apex node is reassigned in response to changing environments. The importance of the model lies in the structure it provides for the evaluation of heuristics, a major need in assessing the benefits of artificial-intelligence applications. A nested three-factor design with fixed and random effects provides a numerical example of the model.