Research and Advances

Special section on operating systems

This special collection of articles examines recent research in operating systems, focusing on particular systems and projects that are seminal and representative. The areas these articles cover include distributed operating systems, heterogeneous computer systems, distributed programming, and real-time systems.
Research and Advances

Impact of system response time on state anxiety

Recent research has shown that user satisfaction and productivity are affected by system response time. Our purpose is to report the results of empirical research on system response time and its effect on state anxiety. Test subjects were classified as Type A or Type B personality in order to determine whether personality type had any effect on the relationship between system response time and state anxiety. The results show that both Type A and Type B personalities exhibit a statistically significant increase in state anxiety during poor or variable system response times.
Research and Advances

Computer technology and jobs: an impact assessment model

A model is proposed that associates the impact of computer technology on a job with the set of underlying characteristics that describe the activities performed on the job. An empirical test of the model has been undertaken. One thousand and thirty-five experts assessed the impact of computer technology that they believed would occur on 306 jobs over a three-year period. Job characteristics data were obtained from prior analyses of the jobs, using the Position Analysis Questionnaire (PAQ). Six job dimensions derived from analysis of the PAQ data were significant predictors of the technological impact ratings provided by the experts: engaging in physical activities; being aware of the work environment; performing clerical-related functions; working in an unpleasant or hazardous environment; performing service-related activities; and performing supervising, directing, and estimating functions.
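In effect, the model regresses expert impact ratings on job-dimension scores. Below is a minimal sketch of that kind of analysis, assuming entirely hypothetical scores, ratings, and weights (the six predictors merely stand in for the PAQ dimensions; none of the numbers come from the study, and the authors' actual estimation procedure may differ):

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: 306 jobs scored on six PAQ-style dimensions.
    n_jobs, n_dims = 306, 6
    X = rng.normal(size=(n_jobs, n_dims))                 # job-dimension scores
    true_w = np.array([-0.8, -0.3, 1.2, -0.5, 0.6, 0.9])  # invented weights
    ratings = X @ true_w + rng.normal(scale=0.5, size=n_jobs)  # expert ratings

    # Ordinary least squares: impact rating as a linear function of the dimensions.
    X1 = np.column_stack([np.ones(n_jobs), X])            # add an intercept column
    coef, *_ = np.linalg.lstsq(X1, ratings, rcond=None)
    print("intercept:", round(coef[0], 3))
    print("dimension weights:", np.round(coef[1:], 3))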
Research and Advances

A fair share scheduler

Central-processing-unit schedulers have traditionally allocated resources fairly among processes. By contrast, a fair share scheduler allocates resources so that users receive their fair share of the machine over a long period.
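The contrast is between per-process fairness at each instant and per-user fairness over time. Here is a minimal sketch of the underlying idea, assuming a decaying per-user usage counter that feeds back into dispatch decisions (the decay constant and the tick accounting are illustrative assumptions, not the scheduler the article describes):

    # Fair-share sketch: each user's accumulated CPU usage decays over time, and
    # the runnable process whose owner has the least recent usage is run next.
    DECAY = 0.9  # illustrative per-epoch decay factor (an assumption)

    class FairShareScheduler:
        def __init__(self):
            self.usage = {}  # user -> decayed CPU usage

        def pick_next(self, runnable):
            """Pick from a list of (process_id, user) pairs."""
            return min(runnable, key=lambda proc: self.usage.get(proc[1], 0.0))

        def charge(self, user, ticks):
            """Account CPU ticks consumed by one of the user's processes."""
            self.usage[user] = self.usage.get(user, 0.0) + ticks

        def end_epoch(self):
            """Let old usage fade so shares balance out over the long run."""
            for user in self.usage:
                self.usage[user] *= DECAY

    sched = FairShareScheduler()
    sched.charge("alice", 80)
    sched.charge("bob", 10)
    print(sched.pick_next([("p1", "alice"), ("p2", "bob")]))  # ('p2', 'bob')

Because old usage decays, a user who has been idle gradually regains priority, which is what lets the fair share emerge over a long period rather than instant by instant.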
Research and Advances

Computer backup pools, disaster recovery, and default risk

Computer backup pools, in which a few members share the ownership of, or the right to service from, a computer center, are growing in popularity. Such a center stands by to provide the lost computing capacity of a member suffering a computer breakdown, enabling disaster recovery. The efficiency of such a solution may be examined from various points of view, such as cost, response time, and reliability. We focus on the reliability of such an arrangement. Two types of default risk are discussed: the probability that the center itself will break down when needed, so that it would be unable to provide service (similar to the traditional measure of a "probability of ruin"), and a "perceived probability of ruin" (the probability that a member will be affected by the failure of the center). We borrow the concept of probability of ruin from the risk management and insurance literature in order to derive explicit relationships between these probabilities and the pricing of a mutual computer pool. It is shown that the membership fee for each participant must be a function of both the payments of all members and their loss (call for service) distributions, thereby reflecting the simultaneity and mutual interdependence of members.
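The two risk measures can be made concrete with a small Monte Carlo sketch, assuming members break down independently with a common probability and a center that can absorb only a limited number of simultaneous failures (the parameters and the independence assumption are hypothetical, not taken from the article):

    import random

    # Hypothetical pool: n members each break down in a period with probability p;
    # the shared center can absorb at most `capacity` simultaneous failures.
    def simulate(n=10, p=0.05, capacity=1, trials=100_000, seed=1):
        rng = random.Random(seed)
        center_overloaded = 0  # analogue of the "probability of ruin"
        member_affected = 0    # analogue of the "perceived probability of ruin"
        for _ in range(trials):
            others = sum(rng.random() < p for _ in range(n - 1))
            tagged_fails = rng.random() < p  # follow one particular member
            demand = others + tagged_fails
            if demand > capacity:
                center_overloaded += 1
                if tagged_fails:
                    member_affected += 1  # tagged member fails during an overload
        return center_overloaded / trials, member_affected / trials

    ruin, perceived = simulate()
    print(f"P(demand exceeds capacity)         ~ {ruin:.4f}")
    print(f"P(member fails during an overload) ~ {perceived:.4f}")

The second estimate is an upper bound on a member's chance of being left unserved, since an overloaded center might still happen to serve that particular member.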
Research and Advances

Generality in artificial intelligence

My 1971 Turing Award Lecture was entitled "Generality in Artificial Intelligence." The topic turned out to have been overambitious in that I discovered I was unable to put my thoughts on the subject in a satisfactory written form at that time. It would have been better to have reviewed my previous work rather than attempt something new, but such was not my custom at that time. I am grateful to ACM for the opportunity to try again. Unfortunately for our science, although perhaps fortunately for this project, the problem of generality in artificial intelligence (AI) is almost as unsolved as ever, although we now have many ideas not available in 1971. This paper relies heavily on such ideas, but it is far from a full 1987 survey of approaches for achieving generality. Ideas are therefore discussed at a length proportional to my familiarity with them rather than according to some objective criterion.

It was obvious in 1971 and even in 1958 that AI programs suffered from a lack of generality. It is still obvious; there are many more details. The first gross symptom is that a small addition to the idea of a program often involves a complete rewrite beginning with the data structures. Some progress has been made in modularizing data structures, but small modifications of the search strategies are even less likely to be accomplished without rewriting. Another symptom is that no one knows how to make a general database of commonsense knowledge that could be used by any program that needed the knowledge. Along with other information, such a database would contain what a robot would need to know about the effects of moving objects around, what a person can be expected to know about his family, and the facts about buying and selling. This does not depend on whether the knowledge is to be expressed in a logical language or in some other formalism.

When we take the logic approach to AI, lack of generality shows up in that the axioms we devise to express commonsense knowledge are too restricted in their applicability for a general commonsense database. In my opinion, getting a language for expressing general commonsense knowledge for inclusion in a general database is the key problem of generality in AI. Here are some ideas for achieving generality proposed both before and after 1971. I repeat my disclaimer of comprehensiveness.
Research and Advances

Issues in the pragmatics of qualitative modeling: lessons learned from a xerographics project

The photocopier is one of the most complex of machines because xerography involves many types of physical phenomena. ARIA is a qualitative simulation of xerography that is intended to teach technicians the reasons behind some of the subtle problems that occur in copiers. This effort to model xerography exposed shortcomings in the techniques of qualitative modeling as applied to complex systems and helped us better understand the impact of certain basic modeling decisions.
Research and Advances

The relationship of MIS steering committees to size of firm and formalization of MIS planning

Top-level management and directors of management information systems (MIS) have been urged to oversee information systems (IS) development efforts using MIS steering committees. Although such committees have been shown to assist with planning problems facing the MIS director, little is known about just how such committees relate to the planning process. Drawing on organizational theory literature, a conceptual model is presented that relates firm size and MIS steering committees to four IS planning practices. The results of this survey of 456 MIS directors suggest that use of an MIS steering committee is associated with more formalized and systematic MIS planning. However, some of these relationships are stronger in large firms.
Research and Advances

Profiles in computing: Donald E. Knuth: scholar with a passion for the particular

"Age 30 is kind of appropriate because I got the first copy of volume 1 from the publisher nine days after my 30th birthday. So, a large part of the work had been done when I was 30 years old. They already were working on typesetting the second volume." Commenting on his books' influence, Knuth says, "It's been phenomenal from my point of view. In 1976 a study was done of how many people writing papers on computer science made a reference to my book somewhere in their articles, and it was found that about 30 percent of the papers in Communications, Journal of the ACM, and SIAM Journal on Computing cited the book. So it has an impact in that way." What about sales? Knuth notes that publishers may joke about professors whose books never sell, but they don't apply here. "I know that people buy the book. I don't know how many read it. But the sales have been incredible. I think something between 1000 and 2000 copies [have been sold] per month for 20 years."
News

Fifteen years ACM

This account of ACM's development years, given in 1962 by founding member and former president Franz L. Alt, depicts the players and progress of an organization committed to sharing computing knowledge and skills.
Research and Advances

Profiles in computing: Allan L. Scherr

"Most of the work I've done has been done to break things into existence that didn't exist before. . . . In a sense, my whole career's been about building organizations that didn't exist before, creating processes to do things that have never been done before, and solving technical problems that hadn't been solved before. The work I did at MIT was that way as well. There was no real foundation to build on, and I had to make it up as I went along. That's characterized, if not my whole career, at least the parts of my career that I consider the most rewarding." "Pioneers are also the people that get arrows shot through them. That's the downside, and I've had my share of arrows pulled out of my hide."
