Opinion

Information Security: 50 Years Behind, 50 Years Ahead

Trust among people and organizations will be even more critical in securing communications and commerce in the future networked environment.

What was the state of information security—the combination of computer and communication security—as Communications first went to press? Cryptography was both secret and primitive, able to protect the confidentiality of communications but unable to perform most of the functions we ask of it today. Computer security was nonexistent.

Information security today is a vast field, with more money, publications, and practitioners than all of computer science had a half-century ago. Cryptography is largely public and becoming a standardized part of the infrastructure. Computer security is not so settled but has made great strides since its birth in the 1960s and is an important aspect of day-to-day computer operations.

Where is information security going? Away. Today it would be possible to say that you did a computation securely if you did it entirely on your own computers and if you protected them appropriately.

But we live at the end of the era of isolated computation. Within the next decade Web services will have created a vast infrastructure of companies and other organizations that can perform your most important computations faster and cheaper than you could ever do them for yourself, just as Google can search better than you can. You’ll be unable to stay in business without using them but also unable to conceal your secrets from them. All the cryptography, bounds checking, and packet filtering will still be there, but the main mechanism of information security will be contractual.

How did we get to this situation? In 1958 computer security would have been very difficult to distinguish from the security of the computer itself. Computer rooms were guarded, operators and users were vetted, and card decks and printouts were locked in safes—all physical or administrative measures. Process confinement, kernelized operating systems, and formally specified programs were all a decade in the future.

Cryptography, in the limited quarters in which it was known and practiced, would have been more recognizable but very primitive. From World War I, when mechanized cryptography got its real start, through World War II, most military cryptography was mechanical, performed by electromechanical machines whose action combined a number of table lookups with modular arithmetic. A character to be enciphered was put through a sequence of table lookups interspersed with the addition of keying characters—a slow process capable of encrypting teletype traffic but utterly inadequate for coping with voice.
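To make that mechanism concrete, here is a minimal sketch, in Python, of the pattern of table lookups alternating with the modular addition of keying characters. The substitution table and key are invented for illustration; the sketch corresponds to no particular historical machine.

    # A toy cipher in the rotor-machine style: substitution-table lookups
    # alternating with addition of keying characters mod 26.
    # Illustrative only; it models no particular historical machine.
    import string

    ALPHABET = string.ascii_uppercase
    TABLE = str.maketrans(ALPHABET, "QWERTYUIOPASDFGHJKLZXCVBNM")  # one fixed "rotor" wiring

    def encipher(plaintext, key):
        out = []
        for i, ch in enumerate(plaintext):
            c = ch.translate(TABLE)                      # table lookup
            k = key[i % len(key)]                        # next keying character
            c = ALPHABET[(ALPHABET.index(c) + ALPHABET.index(k)) % 26]  # add key char mod 26
            out.append(c.translate(TABLE))               # another lookup
        return "".join(out)

    print(encipher("ATTACKATDAWN", "LEMON"))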

The 1950s were dominated by the effort to bring the speed of encryption closer to the speed of the modern world. Military cryptography—to the degree it had gone beyond rotor machines—consisted primarily of what are called long-cycle systems. The backbones of these systems were linear feedback shift registers of maximal period. Two techniques were used to make the output (which was to be XORed with the plain text) nonlinear. The registers stuttered (paused or skipped states about half the time), and the output came from tapping a number of stages and combining them into one bit with nonlinear combinational logic.
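A rough sketch of such a keystream generator appears below. The taps follow a textbook maximal-period 16-bit LFSR; the nonlinear filter over a few tapped stages is purely illustrative, and the stuttering described above is omitted, so this conveys the flavor of the design rather than any fielded system.

    # Minimal linear feedback shift register (LFSR) keystream sketch.
    # Taps (0, 2, 3, 5) give the textbook maximal-period 16-bit register;
    # the nonlinear filter over tapped stages is illustrative only, and the
    # "stuttering" of fielded systems is omitted.
    def lfsr_keystream(state, nbits=16, taps=(0, 2, 3, 5)):
        while True:
            # Nonlinear filter: AND of two tapped stages, XORed with a third.
            yield (((state >> 1) & (state >> 4)) ^ (state >> 7)) & 1
            feedback = 0
            for t in taps:
                feedback ^= (state >> t) & 1             # linear feedback
            state = (state >> 1) | (feedback << (nbits - 1))

    ks = lfsr_keystream(0xACE1)                          # nonzero seed
    plaintext_bits = [1, 0, 1, 1, 0, 0, 1, 0]
    ciphertext_bits = [p ^ next(ks) for p in plaintext_bits]  # XOR with the keystream
    print(ciphertext_bits)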

Only one small laboratory, the Air Force Cambridge Research Center (military but out of the mainstream of cryptography), had begun looking at the ancestors of the U.S. Data Encryption Standard and many other systems. It was working on cryptographic techniques for identification friend or foe (IFF), the mechanism by which a fire-control radar recognizes that an incoming plane is friendly and should not be fired on: the radar sends the plane a challenge; if the plane decrypts the challenge, modifies it in an agreed-upon way, and reencrypts it correctly, the radar tells the gun to hold its fire.

The process of recognizing a signal by its correct encryption is one to which the stream ciphers of communications are ill suited. Rather than a system in which each bit of the message depends on one bit of key, with which it is XORed, what is needed is a system in which every bit of output depends on every bit of input. Today we call such systems block ciphers or electronic code books.
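The following sketch suggests why. It uses HMAC-SHA256 only as a convenient stand-in for a keyed transform in which every output bit depends on every input bit (a real IFF system would use a block cipher), and contrasts it with the bit-for-bit malleability of an XOR keystream; the key, challenge, and keystream values are invented.

    # Why identification-by-encryption wants every output bit to depend on
    # every input bit. HMAC-SHA256 stands in here for the keyed transform;
    # a real IFF system would use a block cipher. Key and challenge are invented.
    import hmac, hashlib

    KEY = b"shared-iff-key"                              # hypothetical shared secret

    def respond(challenge):
        return hmac.new(KEY, challenge, hashlib.sha256).digest()

    print(respond(b"\x01\x02\x03\x04").hex())
    print(respond(b"\x01\x02\x03\x05").hex())            # one changed bit scrambles the whole response

    # A stream cipher, by contrast, XORs key bits one for one, so flipping a
    # ciphertext bit flips exactly one plaintext bit -- easy to forge in transit.
    keystream = bytes([0x5A, 0xC3, 0x99, 0x10])
    ct = bytes(c ^ k for c, k in zip(b"\x01\x02\x03\x04", keystream))
    forged = bytes([ct[0] ^ 0x01]) + ct[1:]              # alters the hidden plaintext undetected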

Over the past 50 years, both computer security and cryptography have made great strides, and CACM has played an important role in the growth of each. Computer security as we think of it today was the offspring of time sharing and multiprocessing. Once a computer could run jobs on behalf of several users at a time, guarding the computer room was no longer sufficient. It was necessary to guarantee that each individual process inside the computer was not spying on another such process.

Time sharing was born in the early 1960s and by the late 1960s was a major force. It was in use in computing laboratories around the world and offered commercially by service bureaus that, five years earlier, had been running one program at a time for their customers who submitted decks of cards. The turn from the 1960s to the 1970s marked the birth of both computer security and the modern era in cryptography.

Computer security came first. The introduction of time sharing had been particularly disruptive to the culture of military laboratories. Time sharing allowed those doing unclassified work to move into a crude approximation of the environment we enjoy today (15-character-per-second Model 35 teletypes, then primitive cathode-ray-tube screens, rather than high-speed flat-screen displays), with interactive work within one’s own office during normal working hours. Those dependent on classified computing found themselves ghettoized into working in the computer area for a few hours in the evening after the others had gone home. The result was a major program to produce far more secure computers. The formula was simple, starting with writing better code, which, as we envisioned it then, meant code mathematically proven to be correct. But as not all of one’s code can be one’s best code, less-trusted code had to be confined so it could not do any damage. These are problems on which much time has been expended but which still have no fully satisfactory solution.

Curiously, computer security in the late 20th century was rescued by another great development of computer science—networking—particularly client-server computing. Networking brought forth the need for cryptography, a subject kept secret from and neglected by the computer science community at the time.

The 1970s saw the development of public-key cryptography, a new approach to secure communication that surmounted a long-accepted obstacle to the broad use of cryptography: the requirement that to communicate securely you must share a secret at the outset. Public-key cryptography made a major improvement in key management, eliminating most of the need to transport secret keys, and made possible digital signatures, which fostered the growth of Internet commerce in the 1990s. The appearance of public-key cryptography also sparked an explosion of public, business, and government interest in more conventional forms of cryptography, catapulting cryptography to the position of best-understood and most satisfactory part of information security.
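A toy key agreement in the Diffie-Hellman style illustrates the point: the two parties exchange only public values yet arrive at the same shared secret, with no secret key transported. The modulus and generator below are chosen solely for illustration and are far too small for real use.

    # Toy Diffie-Hellman key agreement: no secret key is ever transported,
    # yet both parties compute the same shared value. The modulus and
    # generator are illustrative only; real systems use 2,048-bit or larger
    # primes, or elliptic-curve groups.
    import secrets

    p = 4294967291                                       # a small prime, for illustration only
    g = 5                                                # public generator

    a = secrets.randbelow(p - 2) + 1                     # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1                     # Bob's private exponent

    A = pow(g, a, p)                                     # sent in the clear
    B = pow(g, b, p)                                     # sent in the clear

    assert pow(B, a, p) == pow(A, b, p)                  # identical shared secret on both sides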

Today, public-key cryptography has given birth to a second generation of systems, replacing the modular arithmetic of the first generation with arithmetic on elliptic curves. The U.S. Data Encryption Standard of the 1970s, an algorithm of moderate strength, has been replaced with the Advanced Encryption Standard, which may be the most secure and carefully studied algorithm in the world. Technical developments have been accompanied by a change of heart at the National Security Agency, which has embraced the public developments by adopting a “Suite B” of public algorithms as satisfactory for all levels of classified traffic. Cryptography is now the soundest aspect of information security, and modern security systems are rarely penetrated by confronting the cryptography directly.

Computer security has not fared as well. Progress developing high-assurance systems, though substantial, has not been as great as expected or required. Implementation of features in most commercial computing has taken precedence over security, and the state of Internet security is widely lamented. Real progress awaits major improvements in both construction and evaluation of computer programs.

Despite cryptography’s solid technical status, and despite the fact that the Secure Sockets Layer is the most widely deployed cryptographic mechanism of all time, SSL’s effectiveness is limited. The weak computer-security foundation on which cryptography must be implemented has made it difficult to scale the key-management system to Internet size.

Coupled with this is a serious human-factors failure of all security systems. The Internet is a medium in which users want to talk to both people and machines they trust, as well as to those they don’t trust. Unfortunately, the means to recognize those who are trustworthy (and, say, accept upgrades to programs from them) is not available. As a result, cryptography has failed to protect us from a network in which a quarter of the computers have been captured by bot networks and half the email is spam.

What will happen over the next half century? Two great challenges loom:

True computer, communications, and network security are seen by police and intelligence agencies as an obstacle to the prevention of terrorism. Although attempts to block the use of good cryptography subsided at the end of the 1990s, a program of building in ubiquitous wiretapping is being carried out at a pace that does not inspire confidence that the interception facilities will be secure against capture and misuse by parties unknown.

More fundamental is the growth of Web services. Today, even the most security-conscious companies cannot avoid trusting their engineering and marketing directions to Google and its trade-secret techniques. The query stream reveals all of our interests, and only Google’s good practices and reputation guarantee they are not put at the service of competitors. Much sooner than the next half century, Web services will have destroyed locality in computing.

No significant corporate computation will take place on any one organization’s machines. Programs will look at various yellow pages and advertisements and choose the most cost-effective providers for their most intensive computations. Image rendering, heat flow, marketing campaign modeling, and a host of services not yet imagined will be provided by myriad companies offering proprietary solutions.

When this happens, what we call secure computation today—you did it on your own computer and protected it adequately—may be gone forever.


Figures

UF1 Figure. British cryptologists used an electromechanical device called the Bombe, designed by Alan Turing and rebuilt here for the Bletchley Park Museum, to help break coded signals from the German Enigma machine during World War II.
