GEORGE 3—A general purpose time sharing and operating system
An operating system is described which runs on a wide variety of configurations of the I.C.T. 1900 and can handle a large number of online console users while simultaneously running several offline (background) jobs. The system is not oriented towards either mode: depending on how the installation adjusts the Schedulers, it can behave as a batch processing system (such as the ATLAS Supervisor, IBSYS, or GECOS), as a multiaccess system (resembling, to the user, CTSS or MULTICS), or as both at once.
Both online users and offline jobs use a common Command Language. The system includes a Multilevel device-independent File Store.
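The abstract does not specify the Scheduler interface, so purely as an illustration of the idea of an installation-adjustable scheduler, a Python sketch might look as follows. All names and the weighting scheme are hypothetical; this is not GEORGE 3's actual mechanism.

```python
import heapq

class Job:
    def __init__(self, name, online):
        self.name = name
        self.online = online  # True for a console session, False for background work

class TunableScheduler:
    """Toy scheduler: the installation sets `online_weight` to bias
    dispatching towards console users (multiaccess) or towards
    background jobs (batch). Hypothetical, not GEORGE 3's algorithm."""
    def __init__(self, online_weight=0.5):
        self.online_weight = online_weight
        self.queue = []
        self.counter = 0  # tie-breaker so equal-priority jobs run FIFO

    def submit(self, job):
        # Lower key = dispatched sooner.
        bias = self.online_weight if job.online else 1.0 - self.online_weight
        self.counter += 1
        heapq.heappush(self.queue, (1.0 - bias, self.counter, job))

    def next_job(self):
        return heapq.heappop(self.queue)[2] if self.queue else None

sched = TunableScheduler(online_weight=0.9)   # a multiaccess-leaning installation
sched.submit(Job("payroll", online=False))
sched.submit(Job("console-user-7", online=True))
print(sched.next_job().name)                  # console-user-7 is served first
```

Setting `online_weight` near 0 instead would make the same machinery behave as a batch shop, which is the sense in which one system can be "either or both".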
An experimental model of System/360
The problem of predicting the performance of modern computer systems is formidable. One general technique which can ease this problem is macroscopic simulation. This paper reports on the applicability of that technique to System/360.
The paper describes an experimental model of System/360—its hardware, software, and its environment. The measures of system performance produced by the model consist of statistics relating to turnaround time, throughput, hardware utilization, software utilization, and queueing processes.
The model is mechanized in SIMSCRIPT and consists of some 1750 statements. An auxiliary program, the Job Generator, automatically creates the characteristics of the System/360 jobs to be simulated.
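The paper's SIMSCRIPT model is not reproduced here, but the flavour of macroscopic simulation, a job generator feeding a model that reports turnaround and utilization statistics, can be sketched in a few lines of Python. All parameters and distributions below are invented for illustration.

```python
import random

random.seed(1)

# Toy "Job Generator": interarrival and service times are invented
# parameters, not those of the paper's model.
def generate_jobs(n, mean_interarrival=4.0, mean_cpu=3.0):
    t = 0.0
    for i in range(n):
        t += random.expovariate(1.0 / mean_interarrival)
        yield (i, t, random.expovariate(1.0 / mean_cpu))

def simulate(jobs):
    """Single-server macroscopic simulation: each job queues for the
    CPU, runs to completion, and we record turnaround and utilization."""
    cpu_free_at, busy, turnarounds = 0.0, 0.0, []
    for _, arrival, service in jobs:
        start = max(arrival, cpu_free_at)
        cpu_free_at = start + service
        busy += service
        turnarounds.append(cpu_free_at - arrival)
    makespan = cpu_free_at
    return sum(turnarounds) / len(turnarounds), busy / makespan

mean_turnaround, utilization = simulate(generate_jobs(1000))
print(f"mean turnaround {mean_turnaround:.2f}, CPU utilization {utilization:.2f}")
```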
The “information explosion” noted in recent years makes it essential that storage requirements for all information be kept to a minimum. This paper describes a fully automatic and rapid three-part compressor which can be used with “any” body of information to greatly reduce slow external storage requirements and to increase the rate of information transmission through a computer. The system will also automatically decode the compressed information on an item-by-item basis when it is required.
The three component compressors, which can be used separately to accomplish their specific tasks, are discussed: NUPAK for the automatic compression of numerical data, ANPAK for the automatic compression of “any” information, and IOPAK for further compression of information to be stored on tape or cards.
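The NUPAK, ANPAK, and IOPAK algorithms themselves are not given in the abstract; the Python sketch below only illustrates two generic ideas of the same kind, difference coding for numerical data and run-length coding for arbitrary symbols. The routines shown are hypothetical stand-ins, not the actual compressors.

```python
def delta_encode(values):
    """NUPAK-style idea (sketch only): store the first value plus
    successive differences, which are small for slowly varying data."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def delta_decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

def run_length_encode(data):
    """ANPAK-style idea (sketch only): collapse runs of repeated symbols."""
    out, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        out.append((data[i], j - i))
        i = j
    return out

readings = [1000, 1001, 1001, 1003, 1004]
assert delta_decode(delta_encode(readings)) == readings
print(run_length_encode("aaabccccd"))   # [('a', 3), ('b', 1), ('c', 4), ('d', 1)]
```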
Methods for analyzing data from computer simulation experiments
This paper addresses itself to the problem of analyzing data generated by computer simulations of economic systems. We first turn to a hypothetical firm, whose operation is represented by a single-channel, multistation queueing model. The firm seeks to maximize total expected profit for the coming period by selecting one of five operating plans, where each plan incorporates a certain marketing strategy, an allocation of productive inputs, and a total cost.
The results of the simulated activity under each plan are subjected to an F-test, two multiple comparison methods, and a multiple ranking method. We illustrate, compare, and evaluate these techniques. The paper adopts the position that the particular technique of analysis (possibly not any one of the above) chosen by the experimenter should be an expression of his experimental objective: The F-test tests the homogeneity of the plans; multiple comparison methods quantify their differences; and multiple ranking methods directly identify the one best plan or best plans.
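As a concrete illustration of the first of these techniques, the one-way F-test of homogeneity across plans can be computed as below. The five "plans" and their profit figures are invented for the example and are not the paper's data.

```python
import random

random.seed(2)

# Hypothetical profit observations from 20 simulated periods under
# each of five operating plans (invented numbers, illustration only).
plans = {f"plan {k}": [random.gauss(mu, 5.0) for _ in range(20)]
         for k, mu in zip("ABCDE", (100, 102, 101, 108, 99))}

def one_way_f(groups):
    """One-way ANOVA F-statistic: between-plan variance over
    within-plan variance. A large F rejects homogeneity of the plans."""
    all_obs = [x for g in groups for x in g]
    grand = sum(all_obs) / len(all_obs)
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

print(f"F = {one_way_f(list(plans.values())):.2f}")  # compare with the F(4, 95) critical value
```

A significant F only says the plans differ; as the abstract notes, identifying which plan is best requires the multiple comparison or multiple ranking methods.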
Reduction in the size of decision tables can be accomplished by several techniques. The techniques considered in this paper concern the parsing of decision tables with regard to horizontal and vertical data structures, job identity, hardware and job priorities, and context relationships. Such parsing rests upon some conventions for the linkage of decision tables.
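The paper's parsing conventions are not detailed in the abstract; one elementary reduction technique, merging rules of a limited-entry decision table that differ in a single condition, can be sketched as follows. The representation and names are hypothetical.

```python
# A limited-entry decision table: each rule maps condition entries
# ('Y', 'N', or '-' for don't-care) to an action. Rule merging is only
# one of several reduction techniques; the paper's parsing conventions
# are not reproduced here.
def merge_rules(rules):
    """Merge pairs of rules with the same action that differ in exactly
    one condition entry, replacing that entry with a don't-care."""
    changed = True
    rules = set(rules)
    while changed:
        changed = False
        for a in list(rules):
            for b in list(rules):
                if a is b or a[1] != b[1]:
                    continue
                diffs = [i for i, (x, y) in enumerate(zip(a[0], b[0])) if x != y]
                if len(diffs) == 1:
                    merged = list(a[0])
                    merged[diffs[0]] = '-'
                    rules -= {a, b}
                    rules.add((tuple(merged), a[1]))
                    changed = True
                    break
            if changed:
                break
    return rules

table = {(('Y', 'Y'), 'approve'), (('Y', 'N'), 'approve'), (('N', 'Y'), 'reject')}
# Two rules remain: (('Y', '-'), 'approve') and (('N', 'Y'), 'reject').
print(merge_rules(table))
```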
A comparison of batch processing and instant turnaround
A study of the programming efforts of students in an introductory programming course is presented and the effects of having instant turnaround (a few minutes) as opposed to conventional batch processing with turnaround times of a few hours are examined. Among the items compared are the number of computer runs per trip to the computation center, program preparation time, keypunching time, debugging time, number of runs, and elapsed time from the first run to the last run on each problem. Even though the results are influenced by the fact that “bonus points” were given for completion of a programming problem in less than a specified number of runs, there is evidence to support “Instant” over “Batch”.
The most fundamental underlying problem in sophisticated software systems involving elaborate, changing data structures is dynamic storage allocation for flexible problem modeling. The Free Storage Package of the AED-1 Compiler System allows blocks of available storage to be obtained and returned for reuse. The total available space is partitioned into a hierarchy of free storage zones, each of which has its own characteristics. Blocks may be of any size, and special provisions allow efficient handling of selected sizes, control of shattering and garbage collection, and sharing of physical space between zones. The routines of the package perform high-level functions automatically, but also allow access to and control of fine internal details.
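As a rough sketch of the obtain/return discipline described, consider a toy size-classed free list in Python. AED-1's zone hierarchy, shattering control, and garbage collection are far richer than this; all names here are hypothetical.

```python
class Zone:
    """Toy free-storage zone: a free list of buffers grouped by size
    class. Shows only the obtain/return idea, not the AED-1 package."""
    def __init__(self, name, block_sizes):
        self.name = name
        self.free = {size: [] for size in block_sizes}

    def obtain(self, size):
        # Round up to the smallest size class that fits the request.
        fit = min((s for s in self.free if s >= size), default=None)
        if fit is None:
            raise MemoryError(f"no size class >= {size} in zone {self.name}")
        return self.free[fit].pop() if self.free[fit] else bytearray(fit)

    def give_back(self, block):
        # Returned blocks are kept on the free list for reuse
        # rather than released to the system.
        self.free[len(block)].append(block)

zone = Zone("list-nodes", block_sizes=(16, 64, 256))
b = zone.obtain(40)            # served from the 64-byte size class
zone.give_back(b)
assert zone.obtain(64) is b    # the same block is reused
```

Grouping blocks into per-zone size classes is one simple way to keep repeated obtain/return cycles of a few favoured sizes from shattering storage into unusable fragments.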