Opinion: The Profession of IT

Fifty Years of Operating Systems

A recent celebration of 50 years of operating system research yields lessons for all professionals in designing offers for their clients.

Operating systems are a major enterprise within computing. They are hosted on a billion devices connected to the Internet. They were a $33 billion global market in 2014. The number of distinct new operating systems each decade is growing, from nine introduced in the 1950s to an estimated 350 introduced in the 2010s.a

Operating systems became the subject of productive research in the late 1950s. In 1967, the leaders of operating systems research organized SOSP (the Symposium on Operating Systems Principles), starting a tradition of biennial SOSP conferences that has continued for 50 years. The early identification of operating system principles crystallized support in 1971 for operating systems to become part of the computer science core curriculum (see the sidebar).

In October 2015, as part of SOSP-25, we celebrated 50 years of OS history. Ten speakers and a panel discussed the evolution of major segments of OS, focusing on the key insights that were eventually refined into OS principles (see http://sigops.org/sosp/sosp15/history). A video record is available in the ACM Digital Library. I write this summary not only because we are all professional users of operating systems, but also because these 50 years of operating systems research yield important lessons for all computing professionals who design systems for customers.

Timeline

A remarkable feature of our history is that the purposes and functions of an operating system have changed so much, encompassing four stages:

  • Batch systems: one job at a time (1950–1960);
  • Interactive systems: many users on multiple systems constantly interacting, communicating, and sharing resources (1960–1975);
  • Desktop systems: immersive, personalizable, distributed systems to manage work in an office (1975–2005); and
  • Cloud-mobile systems: immersive, personalizable systems to manage all aspects of one's life, work, and social relations (2005 onward).

The accompanying figure depicts a memory layout of an early batch operating system.

The Great Confluence of 1965

The very first operating systems were little more than "manual operating procedures" for the first computers in the 1950s. These procedures established a queue of jobs waiting to be executed; an operator put the jobs on the machine one by one and returned output to the requesting users. These procedures were soon automated in the late 1950s; IBM's 1401 front end to the IBM 709x number crunchers was the best known of the commercial "spooling" systems. From that time on, computer system engineers became interested in automating all aspects of computing, including in-execution job scheduling, resource allocation, and user interaction, as well as pre-execution job design, preparation, testing, and debugging. By 1965, their experiments had yielded a set of eight principles that became the starting point for a new generation of operating systems (a toy sketch of the spooling idea follows the list):

  • Interactive computing (time-sharing)
  • Hierarchical file systems
  • Fault tolerant structures
  • Interrupt systems
  • Automated overlays (virtual memory)
  • Multiprogramming
  • Modular programming
  • Controlled information sharing
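
To make the spooling idea concrete, the short sketch below (in modern Python, purely for illustration) models a first-come, first-served batch queue of the kind those early front ends automated. The names Job, BatchSpooler, submit, and run are hypothetical; they do not come from any historical system.

# Toy FIFO batch spooler: jobs are queued as they arrive and run to completion
# one at a time, mimicking the operator procedures that 1950s spooling automated.
from collections import deque
from dataclasses import dataclass

@dataclass
class Job:
    name: str       # identifies the requesting user or program
    runtime: int    # pretend execution time, in arbitrary ticks

class BatchSpooler:
    def __init__(self):
        self.queue = deque()            # jobs wait here in arrival order

    def submit(self, job: Job):
        self.queue.append(job)          # spool the job onto the input queue

    def run(self):
        clock = 0
        while self.queue:
            job = self.queue.popleft()  # take the next job, one at a time
            clock += job.runtime        # the machine is dedicated to it until done
            print(f"{job.name} finished at tick {clock}")

if __name__ == "__main__":
    spooler = BatchSpooler()
    for j in (Job("payroll", 5), Job("inventory", 3), Job("simulation", 8)):
        spooler.submit(j)
    spooler.run()                       # runs jobs strictly in submission order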

The MIT Multics project (http://multicians.org) and the IBM System/360 project were the first to bring forth systems with all these characteristics; Multics emphasized interactivity and community, System/360 a complete line of low- to high-end machines with a common instruction set. Moreover, Multics used a high-level language (a subset of PL/I) to program the operating system because the designers did not want to tackle a system of such size in assembly language. Developed from 1964 to 1968, these systems had an enormous influence on later generations of operating systems.

Dennis Ritchie and Ken Thompson at Bell Labs loved the services available from Multics but loathed its size and cost. They extracted the best ideas and blended them with a few of their own to produce Unix (1971), which was small enough to run on a minicomputer and was written in a new portable language, C, that was close enough to the machine to be efficient and high level enough to manage OS program complexity. Unix became a ubiquitous standard in the configuration interfaces of operating systems and in the middleware of the Internet. In 1987, Andy Tanenbaum released Minix, a student-oriented version of Unix; Linus Torvalds, then a university student, launched Linux from Minix.

OS Principles

By the late 1960s OS engineers believed they had learned a basic set of principles that led to reliable and dependable operating systems. The SOSP institutionalized their search for OS principles. In my own work, I broadened the search for principles to include all computing3,4 (see http://greatprinciples.org).

I am often asked, “What is an OS (or CS) principle?” A principle is a statement either of a law of computing (Box 1) or of design wisdom for computing (Box 2).

*  Box 1. Examples of Laws

*  Box 2. Examples of Design Wisdom

Of the many possible candidates for principle statements, which ones are worth remembering? Our late colleague Jim Gray proposed a criterion: a principle is great if it is "Cosmic," that is, timeless and incredibly useful. Operating systems contributed nearly one-third of the 41 great principles listed in a 2004 survey (see http://greatprinciples.org). The accompanying table gives examples; OS research is truly a great contributor to the CS field.

Lessons

As I looked over the expanse of results achieved by the more than 10,000 people who have participated in OS research over the past 50 years, I saw some lessons that apply to our daily work as professionals.


Even though it seems that research is academic and does not apply to professional work, a closer look at what actually happens reveals a great deal of overlap. Both the researcher and the professional seek answers to questions. The one aims to push the frontier of knowledge, the other to make a system more valuable to a customer. If we want to find out what it is like to explore a question, our main sources are academic research papers; there are very few written professional case studies. The typical research paper tells a tidy story of an investigation and a conclusion. But the actual investigation is usually untidy, uncertain, and messy. The uncertainty is a natural consequence of numerous contingencies and unpredictable circumstances through which the investigator must navigate. We can never know how a design proposal will be received until we try it and see how people react.

You can see this in the presentations of the speakers at the conference, as they looked back on their struggles to find answers to the questions they asked. They were successful because they allowed themselves to be beginners constantly searching for what works and what does not work: building, tinkering, and experimenting. From this emerged many insights.

The results of their work were almost always systems that others could use and experiment with. After the messy process of learning what worked, they wrote neat stories about what they learned. Before they produced theories, they first produced prototypes and systems.

Professionals do this too. When sitting around the fire spinning yarns about what they did for their customers, they tell neat stories and graciously spare their clients the messy details of their struggles with the design.


Figures

Figure. Memory layout of an early batch operating system.

Tables

Table. Examples of computing principles contributed by operating systems.
