Telefile: a case study of an online savings bank application

The development of an on-line computer system for a savings bank institution is traced from the bank's early conceptual needs to the completion of the design by The Teleregister Corporation. The bank and equipment criteria that led to the development of the Telefile System are specified. Operation of the on-line and off-line programs is described, and statistics are cited for the reliability and performance of the system. Benefits to the bank are discussed from the banker's point of view, and an indication of future trends in the on-line savings bank field is given.

Some legal implications of the use of computers in the banking business

The introduction of computers into the banking business has a wide variety of legal implications that merit careful attention at this very early stage. The industry is highly regulated by government and, hence, is subject to many statutes and regulations. It is also affected by important common law rules established by courts. The legal ramifications involve not only the mechanization itself, but also the very significant, economically attractive phenomenon of off-premises processing. It is essential to identify and provide for the many legal aspects now, before systems and practices crystallize, in order to avoid the later impact of unanticipated physical complications and expense. The legal aspects of computerization in the banking business are especially diverse. In some states there might be the basic question whether banks are authorized by law to invest in the new facilities, either directly or through cooperatives. More challenging are questions relating to off-premises processors, particularly with respect to the obligation not to disclose information concerning a bank's customers, the adequacy of fidelity bond coverage, the extent of liability for improper refusal to pay a check, and susceptibility to regulation by government agencies. Also pertinent is the propriety of data processing by banks for nonbank entities, and particularly of rendering that service without charge for bank depositors.

A serial technique to determine minimum paths

The need to determine minimum paths through a maze arises very often in such fields as traffic, transportation, communication and network studies. Computer analysis of these maze problems has in many cases been hampered by the excessive size of the network under consideration. A technique has been developed to handle networks of very large magnitude by processing the network serially and repeatedly until only minimum paths remain.
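
The abstract does not give the procedure itself, but the repeated-serial-pass idea can be shown with a minimal sketch in Python. The function name serial_minimum_paths and the tiny network are invented for illustration; a Bellman-Ford style relaxation stands in for whatever serial process the paper actually uses. Each pass sweeps the arc list in storage order and shortens any label it can, and the passes repeat until only minimum path lengths remain.

```python
def serial_minimum_paths(edges, source):
    """Sweep the arc list serially, relaxing every arc, and repeat the
    sweep until no distance label can be improved."""
    dist = {source: 0}
    improved = True
    while improved:
        improved = False
        for tail, head, length in edges:      # one serial pass over the network
            if tail in dist and dist[tail] + length < dist.get(head, float("inf")):
                dist[head] = dist[tail] + length
                improved = True
    return dist

# A small example; a very large network would simply be re-read pass by pass.
edges = [("A", "B", 3), ("A", "C", 7), ("B", "C", 2), ("C", "D", 1)]
print(serial_minimum_paths(edges, "A"))       # {'A': 0, 'B': 3, 'C': 5, 'D': 6}
```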

Report of a visit to discuss common programming languages in Czechoslovakia and Poland, 1963

Early in June 1963 there was a meeting in Berlin [1] of the Subcommittee for Programming Languages, SC5 of TC97, the Technical Committee for Standardization of Computers and Information Processing [2]. Taking advantage of the proximity of Poland and Czechoslovakia, which are interested in the subject but have not actively participated in SC5, a small group representing the Secretariat of SC5 visited those countries. The major purpose of the visit was to discuss such topics as the state of the art of programming languages in each country (both development and use), any national standardization activity, participation in international standardization, and the present state and future prospects of international standardization. A formal report was made to SC5 in Berlin after the visit. The present report is not an official report of the visit; it is a private report of the group, intended for public dissemination. It includes some material not directly relevant to the official purpose of the visit and omits some material pertinent only to the official ISO activity, as well as parts of the discussions which it would be premature or discourteous to publish at this time.

Recent improvements in MADCAP

MADCAP is a programming language admitting subscripts, superscripts and certain forms of displayed formulas. The basic implementation of this language was described in a previous paper [MADCAP: A scientific compiler for a displayed formula textbook language, Comm. ACM 4 (Jan. 61), 31-36]. This paper discusses recent improvements in the language in three areas: complex display, logical control, and subprogramming. In the area of complex display, the most prominent improvements are a notation for integration and for the binomial coefficients. In the area of logical control the chief new feature is a notation for variably nested looping. The discussion of subprogramming is focused on MADCAP's notation for and use of “procedures.”

Flexible abbreviation of words in a computer language

An increasing number of computer programs are designed to accept and translate a symbolic, English-like language which facilitates communication between the user and the computer. A common feature of such programs is a pre-determined vocabulary of expressions for specifying the input to the program. This note describes a generalized technique for permitting flexible abbreviation of such expressions in order to further simplify the task of writing source programs.
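The note's actual abbreviation rule is not stated in the abstract; as a rough illustration only, one common scheme accepts any token that is an unambiguous prefix of exactly one vocabulary word. The helper expand_abbreviation and the sample vocabulary below are hypothetical.

```python
def expand_abbreviation(token, vocabulary):
    """Return the full vocabulary word if `token` is an exact match or an
    unambiguous prefix of exactly one word; otherwise raise an error."""
    token = token.upper()
    if token in vocabulary:
        return token
    matches = [word for word in vocabulary if word.startswith(token)]
    if len(matches) == 1:
        return matches[0]
    raise ValueError(f"'{token}' is {'ambiguous' if matches else 'unknown'}")

VOCAB = {"TRANSFER", "TRANSLATE", "PRINT", "PUNCH"}
print(expand_abbreviation("PRI", VOCAB))      # PRINT
print(expand_abbreviation("TRANSF", VOCAB))   # TRANSFER
# expand_abbreviation("TRAN", VOCAB) would raise: ambiguous between two words
```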

Optimizing bit-time computer simulation

A major component of a bit-time computer simulation program is the Boolean compiler. The compiler accepts the Boolean functions representing the simulated computer's digital circuits, and generates corresponding sets of machine instructions which are subsequently executed on the “host” computer. Techniques are discussed for increasing the sophistication of the Boolean compiler so as to optimize bit-time computer simulation. The techniques are applicable to any general-purpose computer.
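The sketch below is only loosely analogous to such a compiler: Python's compile and eval stand in for generating host machine instructions, and the signal names, equations, and register width are invented. It shows the basic shape of the idea, namely that each Boolean function of the simulated circuits is turned once into executable host code, which is then run every simulated bit-time using the host's own AND, OR, and XOR instructions.

```python
MASK = 0xFFFF   # width of the simulated signals (an arbitrary choice here)

def compile_equations(equations):
    """Translate {signal: Boolean-expression string} into a step function
    mapping current signal values to their values one bit-time later."""
    code = {sig: compile(expr, f"<{sig}>", "eval") for sig, expr in equations.items()}
    def step(state):
        return {sig: eval(c, {"__builtins__": {}}, dict(state)) & MASK
                for sig, c in code.items()}
    return step

# Invented example: the sum and carry equations of a one-bit adder stage.
equations = {"sum": "a ^ b ^ cin", "carry": "(a & b) | (cin & (a ^ b))"}
step = compile_equations(equations)
print(step({"a": 0b1010, "b": 0b0110, "cin": 0b0000}))   # {'sum': 12, 'carry': 2}
```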

Length of strings for a merge sort

Detailed statistics are given on the length of the maximal sorted strings which result from the first (internal sort) phase of a merge sort onto tapes. It is shown that the strings produced by an alternating method (i.e., one which produces ascending and descending strings alternately) tend to be only three-fourths as long as those produced by a method yielding only ascending strings, contrary to statements which have appeared previously in the literature. A slight modification of the read-backward polyphase merge algorithm is therefore suggested.
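
The ascending-only baseline behind that three-fourths figure can be checked empirically. The sketch below is not taken from the paper; it implements ordinary replacement selection (the names and parameters are ours) and, for random input, produces maximal sorted strings averaging about twice the size of the working store, which is the figure the alternating method is compared against.

```python
import heapq, random, statistics

def run_lengths_replacement_selection(data, memory_size):
    """Lengths of the ascending runs written by replacement selection
    using a working store of `memory_size` records."""
    records = iter(data)
    heap = [x for _, x in zip(range(memory_size), records)]
    heapq.heapify(heap)
    frozen, lengths, current = [], [], 0
    while heap:
        smallest = heapq.heappop(heap)
        current += 1
        nxt = next(records, None)
        if nxt is not None:
            if nxt >= smallest:
                heapq.heappush(heap, nxt)   # still fits into the current run
            else:
                frozen.append(nxt)          # held over for the next run
        if not heap:                        # current run exhausted; start the next
            lengths.append(current)
            current = 0
            heap, frozen = frozen, []
            heapq.heapify(heap)
    return lengths

random.seed(1)
runs = run_lengths_replacement_selection(
    (random.random() for _ in range(200_000)), 1000)
print(round(statistics.mean(runs)))   # close to 2000, i.e. twice the store size
```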

A description of the APT language

The APT (Automatically Programmed Tools) language for numerical control programming is described using the metalinguistic notation introduced in the ALGOL 60 report. Examples of APT usage are included. Presented also are an historical summary of the development of APT and a statement concerning its present status.

Coding clinical laboratory data for automatic storage and retrieval

A series of clinical laboratory codes has been developed to accept and store urinalysis, blood chemistry, and hematology test results for automatic data processing. The codes, although constructed as part of a computerized hospital simulation, have been able to handle the results of every laboratory test they have encountered. The unique feature of these codes is that they can accept conventionally recorded qualitative as well as quantitative test results. Consequently, clinical test results need not be arbitrarily stratified, standardized, or altered in any way to be coded. This paper describes how the codes were developed and presents a listing of the urinalysis codes. Five criteria used in developing the codes are outlined, the problem of multiple synonymous terminology is discussed, and a solution to that problem is described. Flexible, computer-produced, composite laboratory reports are also discussed, along with a reproduction of such a report. The paper concludes that even though many problems remain unsolved, the next ten years could witness the emergence of a practical automated information system in the laboratory.
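
The code tables themselves appear in the paper, not in this abstract. Purely as a hypothetical illustration of the qualitative/quantitative point and of the synonym problem, a coded record might be formed along the following lines; the synonym table, field names, and function are invented.

```python
# Map synonymous qualitative wordings onto a single coded term (invented table).
SYNONYMS = {"neg": "NEGATIVE", "negative": "NEGATIVE", "none seen": "NEGATIVE",
            "tr": "TRACE", "trace": "TRACE"}

def code_result(test_name, reported):
    """Return (test, kind, value): numeric results pass through unaltered,
    qualitative ones are reduced to one coded term per meaning."""
    try:
        return (test_name, "quantitative", float(reported))
    except ValueError:
        term = SYNONYMS.get(str(reported).strip().lower(), str(reported).upper())
        return (test_name, "qualitative", term)

print(code_result("urine glucose", "neg"))       # ('urine glucose', 'qualitative', 'NEGATIVE')
print(code_result("blood urea nitrogen", "14"))  # ('blood urea nitrogen', 'quantitative', 14.0)
```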

An error-correcting parse algorithm

During the past few years, research into so-called “Syntax Directed Compiler” and “Compiler Compiler” techniques [1, 2, 3, 4, 5, 6] has given hope that constructing computer programs for translating formal languages may not be as formidable a task as it once was. However, the glow of the researchers' glee has obscured to a certain extent some very perplexing problems in constructing practical translators for common programming languages. The automatic parsing algorithms indeed simplify compiler construction but contribute little to the production of “optimized” machine code, for example. An equally perplexing problem for many of these parsing algorithms has been what to do about syntactically incorrect object strings. It is common knowledge that most of the ALGOL or FORTRAN “programs” which a compiler sees are syntactically incorrect. All of the parsing algorithms detect the existence of such errors. Many have considerable difficulty pinpointing the location of the error, printing out diagnostic information, and recovering enough to move on to other correct parts of the object string. It is the author's opinion that those algorithms which do the best job of error recovery are those which are restricted to simpler forms of formal languages.
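For concreteness, one simple recovery tactic (panic mode: report the error, discard tokens up to a synchronizing symbol, then continue) is sketched below for a toy statement language. This illustrates the problem the abstract raises; it is not the algorithm presented in the paper, and the grammar and helper names are invented.

```python
def parse_program(tokens):
    """Parse a sequence of `identifier = identifier ;` statements, collecting
    diagnostics and resynchronizing on ';' after each error."""
    errors, i = [], 0
    while i < len(tokens):
        j = parse_assignment(tokens, i, errors)
        if j is None:                        # error: skip to the next ';'
            while i < len(tokens) and tokens[i] != ";":
                i += 1
            i += 1
        else:
            i = j
    return errors

def parse_assignment(tokens, i, errors):
    """Accept one statement; return the index past ';' or None on error."""
    for kind in ("id", "=", "id", ";"):
        ok = i < len(tokens) and (tokens[i] == kind if kind in ("=", ";")
                                  else tokens[i].isidentifier())
        if not ok:
            errors.append(f"expected {kind!r} at token {i}")
            return None
        i += 1
    return i

tokens = ["x", "=", "y", ";", "z", "=", "=", ";", "w", "=", "v", ";"]
print(parse_program(tokens))   # one diagnostic for the bad statement; parsing resumes at 'w'
```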

Recursive programming in Fortran II

An oft-mentioned advantage of ALGOL over FORTRAN is the recursion capability of the former. FORTRAN adherents often belittle this advantage by claiming that all recursive relations can be reduced to recurrence or iterative relations, or that no recursive relations exist which are worth coding as recursive relations. The question of the truth of this must be left unanswered here. It is hoped that the technique described below will draw the poison from the FORTRANers' wounds by allowing them to recurse to their complete satisfaction. There is an hereditary resemblance between this technique and the MAD (Michigan Algorithm Decoder) recursion technique.
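Whatever the specific FORTRAN II bookkeeping described in the paper, the general device such techniques rest on is a hand-managed stack of pending calls in place of the language's own call mechanism. A brief sketch follows, written in Python rather than FORTRAN II for readability and not claiming to be the paper's scheme; the Towers of Hanoi stands in for any genuinely recursive computation.

```python
def hanoi_moves(n, src="A", dst="C", via="B"):
    """List the Towers of Hanoi moves without language-level recursion:
    pending 'calls' are pushed onto an explicit stack and popped in turn."""
    moves, stack = [], [(n, src, dst, via)]
    while stack:
        n, src, dst, via = stack.pop()
        if n == 1:
            moves.append((src, dst))
        else:
            # push in reverse so the three sub-tasks are performed in order
            stack.append((n - 1, via, dst, src))
            stack.append((1, src, dst, via))
            stack.append((n - 1, src, via, dst))
    return moves

print(len(hanoi_moves(4)))   # 15 moves, exactly what the recursive version gives
```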

USA participation in an international standard glossary on information processing

A considerable number of glossaries in the area of information processing have been produced in the USA in the last ten years [1, 2]. In some cases the glossaries were reworked versions of earlier glossaries, while in other cases major new contributions were made. All told, the glossary effort has cost thousands of man-hours of work. Several years ago the ASA X3 sectional committee sponsored by BEMA was established to prepare standards for the USA in the information processing field. (See Appendix 1 for the meaning of abbreviations and acronyms. See also [3].) ASA X3.5 was assigned the double scope of advising the other X3.n subcommittees on the establishment of definitions required for their proposed standards and of establishing a standard glossary, pASGIP, for general use. At the same time there was important British standardization activity. After reworking a number of earlier drafts, the BSI released the “Glossary of Terms Used in Automatic Data Processing,” British Standard 3527: 1962. The British effort differed in at least one very important respect from the USA glossaries. It was organized along subject rather than alphabetical lines. This was to have important consequences, as we shall see.
