
The NSA and Snowden: Securing the All-Seeing Eye

How good security at the NSA could have stopped him.


Edward Snowden, while a contractor for the U.S. National Security Agency (NSA) at Booz Allen Hamilton in Hawaii, copied up to 1.7 million top-secret and above documents, smuggling copies on a thumb drive out of the secure facility in which he worked and releasing many of those documents to the press.2 This has altered the relationship of the U.S. government with the American people, as well as with other countries. This article examines the computer-security aspects of how the NSA could have prevented what is perhaps the most damaging breach of secrets in U.S. history.19 The accompanying sidebar looks at the Constitutional, legal, and moral issues.

According to Presidential Executive Order 13526, “ ‘Top Secret’ shall be applied to information, the unauthorized disclosure of which reasonably could be expected to cause exceptionally grave damage to the national security.”24 There are clearance levels above top secret, such as SCI (sensitive compartmented information), SAP (special access programs), and CNWDI (critical nuclear weapon design information).9 The British equivalent of top secret is most secret.


What Did Snowden Do?

Snowden was a computer system administrator. Guarding against rogue system administrators (a.k.a. sys admins) is more difficult than guarding against users, but it can be done. Note that the NSA has an almost unlimited budget and resources, and thus could have been following good security practices all along. In the words of White House cybersecurity adviser Richard Clarke, “If you spend more on coffee than on IT security, you will be hacked. What’s more, you deserve to be hacked.”20

National Public Radio’s “All Things Considered” reported on December 17, 2013, that the stolen documents were on Microsoft’s SharePoint document-management system. Of the 1.7 million documents likely copied, Snowden shared up to 200,000 with reporters; the NSA did not dispute this.2,19 Rick Ledgett, head of the NSA’s task force assessing the “damage” done by Snowden, claimed “system administrators…have passwords that give them the ability to go around those… security measures, and that’s what Snowden did.”19

It is puzzling that the NSA’s Ledgett claims to be unaware of the past 30 years of computer-security techniques and technology for preventing a system administrator from stealing data.10,15,29 This is discussed later in the section “Orange Book and Two-Person Authorization.” The NSA no longer uses SharePoint for this purpose, which raises the question: why did the NSA abandon Orange Book compliance and other good security practices for computer systems that handle classified data?

In an interview with CBS’s “60 Minutes” on December 15, 2013, General Keith B. Alexander, director of the NSA, admitted that part of Snowden’s job was to transfer large amounts of classified data between NSA computer systems.19 Snowden then copied files to a USB memory stick and concealed it on his person to smuggle vast amounts of data out of the NSA.11,26 A simple one-minute scan on the way out with a handheld metal detector—“wanding,” as used by the Transportation Security Administration (TSA) and at courthouses—would have found any flash memory device.


Rings of Security

Let’s digress briefly to discuss the important concept of rings of security, my term for the industry-standard but less obvious term security in depth. This means having multiple concentric rings of security so that if attackers get through the first, outermost ring they encounter, then, one hopes, the second or third or fourth ring will stop them; no single security measure is 100% effective. These rings are mostly about authentication and are unrelated to what a user is allowed to do once authenticated. Consider how rings of security might apply to an ordinary network; this “ordinary” level of security is insufficient where very high security is needed, such as at the NSA, at banks, or in systems handling large numbers of Social Security or credit-card numbers.


Suppose we want a network in which sys admins are able to SSH (Secure Shell) into a server from home. In the first ring, the firewall might allow SSH access only from a short list of IP addresses belonging to the sys admins’ home systems. Thus, instead of being able to attack from any of a billion systems on the Internet, someone would have to launch her attack from one of, perhaps, a dozen system administrators’ home networks, a vastly reduced vulnerability profile. Modern TCP/IP implementations, on which SSH relies, are highly resistant to IP spoofing. Combined with end-to-end encryption, person-in-the-middle attacks are virtually eliminated.

The second ring might allow SSH authentication only via public/private keys on these home Linux or Unix systems. Prohibiting SSH from accepting passwords prevents password-guessing risks and thus access from unauthorized systems. The third ring would monitor log files for attacks and block those IPs, preferably automatically. The fourth ring would be a strong passphrase on that SSH private key. A fifth ring could require sys admins’ home systems (and, of course, all systems at the office) to lock the screen after a few minutes of inactivity.
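The third ring, automated log monitoring and blocking, is the kind of safeguard a small script can provide. The following is a minimal sketch in Python, not a production tool: the log path, failure threshold, and firewall command are illustrative assumptions, and a real deployment would more likely rely on a dedicated tool such as Fail2Ban.

    # Minimal sketch of the third ring: scan the SSH log and block addresses
    # that repeatedly fail authentication. The log location, threshold, and
    # blocking command are assumptions for illustration only.
    import re
    import subprocess
    from collections import Counter

    AUTH_LOG = "/var/log/auth.log"   # assumed log location
    THRESHOLD = 5                    # failed attempts before an IP is blocked

    failed = Counter()
    pattern = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")

    with open(AUTH_LOG) as log:
        for line in log:
            match = pattern.search(line)
            if match:
                failed[match.group(1)] += 1

    for ip, count in failed.items():
        if count >= THRESHOLD:
            # Drop all further traffic from the offending address.
            subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"],
                           check=True)
            print(f"blocked {ip} after {count} failed logins")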


Stopping Snowden

There are a number of security methods the NSA could have used that would have stopped Snowden. Many of these have been in use for a decade or more, yet the NSA did not use them.

Islands of Security. The obvious place to start in this case is with preventing sys admins or others from getting into unauthorized systems. The islands-of-security concept is a safeguard in case someone manages to penetrate the network. In a high-security organization, different segments, even different systems, should be treated as islands of security that do not trust each other or the network in the vast ocean of systems. This means different systems should have different root passwords, different user passwords, different SSH passphrases, and almost all traffic between systems should be encrypted. Systems should have encrypted file systems and encrypted backups.

Physical Security. Each island of security should be physically protected against attack. This certainly would include the systems and peripherals and the network carrying any unencrypted confidential data. Even large commercial colocation facilities have steel cages around some systems and video cameras watching these areas. The Payment Card Industry (PCI) security standard requires such protection for large credit-card processors. High-security operations should install video cameras and keep the recordings for a long time.

One simple safeguard is to put two high-security locks on each cage, each lock needing a different key possessed by a different person. Thus, two people must be present when the hardware is accessed. Similarly, networking cables could be secured (for example, inside of steel pipe), or the data encrypted before sending it around the LAN or WAN. There is no indication that Snowden took advantage of any lack of physical security, although it is critical for protection.

Prevent Unauthorized Copying. The ability to plug in a USB memory stick or insert a blank DVD for writing should be disabled. Most DVD burners and USB jacks should be removed as well. Cameras, recorders, mobile phones, and any other unauthorized storage devices should be forbidden and guarded against. Metal detectors at doors would detect violators. Radio frequency (RF) emissions should be monitored, and Faraday cages could be incorporated to block RF emissions. None of these techniques is expensive.

Two-Factor Authentication. Even Snowden’s top-secret clearance was not sufficient to allow him access to some of the documents he stole. The NSA admitted that Snowden used the higher-than-top-secret clearances of the accounts of some top NSA officials. This was possible because he had created these accounts, or had used his sys admin privileges to modify them, and could then access even more highly classified documents remotely over NSAnet, the NSA’s classified intranet.13 Snowden’s access to accounts with higher security clearance than his own violated the long-accepted security policy that the system should prevent anyone from accessing data with a higher clearance than the user’s. It would have been trivial for the computer to prevent this, instead requiring a system administrator with that higher clearance level to adjust those accounts as needed.

This also violated the concept of two-factor authentication. Authentication is the ability of a computer (or security guard or even a store clerk) to determine if you really are who you claim to be. Typically, an authentication method consists of what you know (password or PIN), what you have (credit card or RFID-equipped badge issued to employees and consultants or USB dongle), or what you are (your signature or fingerprint or retina scan or your picture on a hard-to-forge document such as a driver’s license, employee badge, or passport). Each of these is called a factor. None of these methods is expensive, and all are effective. While fingerprints can be faked with some effort, this is more difficult with modern high-quality fingerprint readers, which are available commercially.

Many organizations use the very popular two-factor authentication to grant access to computers or facilities or money, requiring, for example, that one does not get access without providing a password or an RFID-equipped badge and a fingerprint. Three-factor authentication would be even better.

Had the NSA required good two-factor authentication, such as a fingerprint and password compared against central databases to which Snowden did not have administrative access, it would have prevented him from impersonating others to use their accounts—which is how he obtained documents above his security clearance. Collecting these factors for the databases would be done by two different sets of people, neither being the set that manages classified documents as Snowden did. This separation of authority is critical for good security as it requires multiple people to effect a compromise.
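To make the idea concrete, here is a minimal sketch in Python of an authentication check that consults two independently administered stores, a password database and a fingerprint-template database; neither factor alone grants access. The store contents, the hashing scheme, and the fingerprint-matching step are placeholder assumptions, not a description of any actual NSA system.

    # Sketch of two-factor authentication: both the password check and the
    # fingerprint check must pass. Each store would be administered by a
    # different group of people, neither of which manages classified documents.
    import hashlib
    import hmac

    # (A real system would salt and stretch the password hash.)
    PASSWORD_DB = {"julia": hashlib.sha256(b"correct horse battery staple").hexdigest()}
    FINGERPRINT_DB = {"julia": "template-0042"}   # opaque template ID (placeholder)

    def check_password(user, password):
        stored = PASSWORD_DB.get(user)
        offered = hashlib.sha256(password.encode()).hexdigest()
        return stored is not None and hmac.compare_digest(stored, offered)

    def check_fingerprint(user, scanned_template):
        stored = FINGERPRINT_DB.get(user)
        return stored is not None and hmac.compare_digest(stored, scanned_template)

    def authenticate(user, password, scanned_template):
        # Neither factor alone is sufficient.
        return check_password(user, password) and check_fingerprint(user, scanned_template)

    print(authenticate("julia", "correct horse battery staple", "template-0042"))  # True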

Even if the person managing users’ passwords went rogue, she would not have access to the fingerprint database. The password manager could be prevented from seeing the user entering his password by having the user enter a separate inner room via a one-person mantrap to which the person managing password changes does not have access. That room would have a virtual keyboard on a physically hardened touchscreen, making rogue use of a keystroke logger difficult. Lack of space here does not allow discussion of deeper exploits such as spoofing fingerprints, guarding against keyloggers, TEMPEST (the NSA’s own set of security standards for radio frequency leakage of information), social engineering, and more.

Social engineering occurs when an attacker tricks someone into revealing information that should not be revealed. Email messages falsely claiming to be from your bank, asking you to click on a link and provide your password, or offering to share stolen money with you, are examples. Snowden used social engineering to obtain the password of at least one NSA employee, who subsequently resigned; the technique has been addressed extensively in other papers and books. Good recurrent education and a strict policy forbidding sharing one’s passwords, badge, or dongle under any circumstance might have prevented this part of Snowden’s breach.

Orange Book and Two-Person Authorization. Someone is less likely to do something dishonest if someone else is watching. This is why many stores have at least two people working and why armored car services use two people. It also is why you see “Two signatures required for amounts over $5,000” at the bottom of some checks.

The NSA created the Orange Book specification for Trusted Computer System Evaluation Criteria 30 years ago, requiring the federal government and contractors to use it for computers handling data with multiple levels of security classification. This author enhanced one Orange Book-compliant Unix system to have additional security capabilities. Such a computer would prevent, say, a user with only secret clearance from viewing a top-secret document. One also could create different “compartments” in which to keep separate sets of documents. Only someone allowed access to a particular named compartment could access documents in that compartment, even if that person otherwise has sufficient security clearance.

This approach is known as “compartmentalized security” (a.k.a. “need to know”). An important aspect of protecting a body of secrets is that very few people should have access to more than a small portion of them. A person working with one critical compartment should be barred from accessing other critical compartments. Those who know many of the secrets, such as General Alexander, get constant Secret Service protection.

One compartment might be “spying on Americans’ phone records without a valid warrant.” Another might be “listening to Americans’ domestic phone conversations and reading email without a valid warrant.”3,12,17,22 A third might be “hacking the phones of leaders of allied countries.” As Snowden should not have been involved in any of those projects and thus should lack sufficient clearance, he would not have been able to access those programs’ documents or even know that they existed. In reality, however, the NSA allowed one person, Snowden, unfettered, unmonitored access to 1.7 million documents.

Also important is the Orange Book concept of not trusting any one system administrator. Instead, a role-1 sys admin queues system changes, such as new accounts or changes to existing accounts. A second person, in role 2, cannot initiate such requests but must approve the queued requests before they can take effect. An Orange Book OS also prevents use of a login simulator by displaying, when soliciting a password, a special symbol that no other program can display. Snowden may have used a login simulator.
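A sketch of the role-1/role-2 workflow in Python might look like the following; the request format and the privileged “apply” step are hypothetical placeholders. The essential property is that the approver must be a different person from the requester.

    # Sketch of two-person authorization: a role-1 admin queues a change and a
    # different, role-2 admin must approve it before it takes effect.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ChangeRequest:
        requested_by: str
        action: str                          # e.g., "create account julia"
        approved_by: Optional[str] = None

    class TwoPersonQueue:
        def __init__(self):
            self.pending = []

        def submit(self, admin, action):
            request = ChangeRequest(requested_by=admin, action=action)
            self.pending.append(request)
            return request

        def approve(self, admin, request):
            if admin == request.requested_by:
                raise PermissionError("approver must differ from requester")
            request.approved_by = admin
            self.pending.remove(request)
            apply_change(request)

    def apply_change(request):
        # Placeholder for the privileged operation (creating the account, etc.).
        print(f"applied: {request.action} "
              f"(requested by {request.requested_by}, approved by {request.approved_by})")

    queue = TwoPersonQueue()
    req = queue.submit("role1_admin", "create account julia")
    queue.approve("role2_admin", req)        # a second, independent administrator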

How expensive might this two-person authorization have been? In 2013, the NSA had approximately 40,000 employees and perhaps 40,000 contractors, including 1,000 system admins.8,25 Adding another 1,000 system administrators to watch the first set would have increased the payroll by a trivial 1%.

Given this, is the NSA going to adopt two-person authorization and the Orange Book policy that it created? No, the NSA is going to fire 90% of its system administrators to limit human access and put most of the servers in the NSA’s own cloud.1 A cloud is just another name for a set of computers remotely accessible over a network and typically managed by others, usually a vendor (a.k.a. contractor). Maybe it will hire Booz Allen, Snowden’s former employer, to manage this cloud.

Log Events and Monitor. The NSA should monitor how many documents each user accesses and at what rate, and then detect and limit abnormal activity. It is astonishing, both in the NSA’s breach and in similar huge thefts of data such as Target’s late-2013 loss of data for 40 million credit cards (including mine), that nobody noticed and did anything. Decent real-time monitoring and automated response to events would have detected both breaches early on and could have prevented most of the damage in each.

The open source Logcheck and Logwatch programs will generate alerts of abnormal events in near real time, and the Fail2Ban program will lock out the attacker. All are free and can easily be customized to detect excessive quantities of document downloads. There are many comparable commercial applications, and the NSA certainly has the budget to create its own.
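As an illustration of the kind of detection involved, the short Python sketch below flags any account that retrieves more than a threshold number of documents within a sliding one-hour window. The log format (a CSV of user, timestamp, document), the window, and the threshold are assumptions chosen for the example.

    # Sketch of a download-rate monitor: alert on any user who fetches an
    # abnormal number of documents within a sliding time window.
    import csv
    from collections import defaultdict
    from datetime import datetime, timedelta

    WINDOW = timedelta(hours=1)
    THRESHOLD = 100          # documents per hour considered abnormal

    def scan(access_log_path):
        downloads = defaultdict(list)            # user -> list of timestamps
        with open(access_log_path) as f:
            for user, timestamp, _document in csv.reader(f):
                downloads[user].append(datetime.fromisoformat(timestamp))

        alerts = []
        for user, times in downloads.items():
            times.sort()
            start = 0
            for end, t in enumerate(times):
                while t - times[start] > WINDOW:
                    start += 1
                if end - start + 1 > THRESHOLD:
                    alerts.append((user, t))
                    break
        return alerts

    for user, when in scan("document_access.csv"):
        print(f"ALERT: {user} exceeded {THRESHOLD} downloads per hour at {when}")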

No Internet Access or Homework Whatsoever. Obviously, this policy is meant to prevent classified data from leaving a secure building. For after-hours problems, a sys admin must either drive to the office or be on-site at all times. One former CIA director was nearly fired for taking classified data home to work on, violating a strict policy against it. (He was not stealing the data; he just wanted to work at home.) Snowden took classified material home and worked on it with a hood covering him and the computer so that his girlfriend could not see it.19 Clearly, then, he could have photographed the screen.

Prevent Removable Media from Leaving the Building. Recall the rings of security. One ring would prevent removable media from leaving the building. Every gas-station owner has figured this out, attaching a large object to each restroom key. The NSA could put each thumb drive inside a large steel box, or it could replace the standard USB connectors and those of the computers with custom-designed connectors that are difficult to duplicate.

Creatively Use Encryption. Consider that one of Snowden’s jobs was copying large amounts of classified data from one computer to a thumb drive and then connecting that thumb drive to another computer and downloading the data. He likely secreted the thumb drive on his person after downloading the data he wanted and took it home. This theft could have been prevented rather easily with the use of public-key encryption.33 In public-key encryption there are two related keys: a public key and a secret key, also called a private key. If the original “clear text” is encrypted with the public key, then it can be decrypted only with the secret key, not with the public key used to encrypt the data.

The NSA should have had a public/secret-key pair created for each sys admin needing to transfer data, and a separate account on each computer for each sys admin to transfer this data. The person generating this encrypted data on the source computer (for example, Snowden) would have to provide the ID of the public key of a different sys admin—say, Julia—to the custom program allowed to write to the USB thumb drive; the software would not allow his own public key to be used. The set of sys admins allowed to transfer data would have no members in common with the set of sys admins who have root access on the source and destination computers. In other words, a “Data Transfer System Administrator” such as Snowden would not have root or physical access to the computers, and sys admins having root or physical access would be prohibited from transferring data between systems. This separation of responsibilities is critical. Only that custom program, not sys admins, would be allowed to write to the thumb drive. That computer would encrypt the data with Julia’s public key and write that encrypted data to the thumb drive.

Snowden then would download the encrypted data to the destination computer via the thumb drive, using a custom program on the destination computer (with that program having sole access to the USB drive), after he had logged into his account. That program would prompt Snowden for the account to which to transfer that encrypted data (for example, Julia’s), and then move the encrypted file to her account. Julia would log in to the destination computer and provide the passphrase that unlocks her encrypted secret key, along with her fingerprint or RFID-equipped badge, to that custom program, which then would decrypt that data into Julia’s account. After that, she could move the data to its final location on the destination computer. The implementation is trivial.
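A sketch of the source-side transfer program, written in Python around the standard GPG command-line tool, is shown below. The key names, file paths, and policy check are hypothetical; the point is that the program refuses to encrypt to the operator’s own key, so whatever reaches the thumb drive is unreadable to the person who copied it.

    # Sketch of the source-side transfer program: encrypt outgoing data to a
    # different administrator's public key, never to the operator's own.
    import subprocess

    def encrypt_for_transfer(source_file, operator_key, recipient_key, output_file):
        if recipient_key == operator_key:
            raise PermissionError("data may not be encrypted to the operator's own key")
        # GPG encrypts with the recipient's public key; only the matching
        # secret key (held by the recipient, not the operator) can decrypt it.
        subprocess.run(
            ["gpg", "--batch", "--yes",
             "--recipient", recipient_key,
             "--output", output_file,
             "--encrypt", source_file],
            check=True)

    # The operator (for example, Snowden) must name another administrator,
    # such as Julia, as the recipient. All names and paths here are hypothetical.
    encrypt_for_transfer("classified_batch.tar",
                         operator_key="transfer-admin-1",
                         recipient_key="transfer-admin-2",
                         output_file="/media/usb/classified_batch.tar.gpg")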


Needless to say, the sys admins tasked with this data transfer would not have the root (administrative) access to these computers that would allow getting around this custom program’s restrictions, and these computers would be running modern versions of Orange Book-compliant operating systems that would require two system administrators for privileged access in any case. Furthermore, Snowden would not have Julia’s fingerprint or passphrase or, if used, her badge for authentication. The open source GNU Privacy Guard (GPG) stores private keys on disk or elsewhere in an encrypted form that can be decrypted only by providing a passphrase or other authentication.15

Thus, no sys admin acting alone could decrypt data that he or she encrypted to a thumb drive. This would have prevented Snowden’s theft by thumb drive. These custom programs (which would run on the source and destination computers) could be written in a day or two using the open source GPG encryption program by a substantial percentage of those reading this article. Thus, even if a USB drive was smuggled out of a secure NSA facility, it would have no value.

Similarly, there could be an additional ring of file-level encryption for highly classified files with separate public/secret key pairs. Only those users entitled to read these documents (and not even sys admins tasked with copying files) would have the secret keys to decrypt them. Those using the destination system (after legitimate copying by Snowden and Julia) would be able to decrypt the files. The system administrator, however, never would have seen the decrypted documents even by reading the raw disk. By itself, this simple precaution would have prevented the wholesale theft of many documents by Snowden. Combined with the use of public-key encryption for transferring data between systems, Snowden would have had to defeat two extremely challenging rings of security to steal data. Using encrypted file systems or whole-disk encryption on all computers handling classified data would offer an additional ring of security.

Plan for Break-in to Minimize Damage. The NSA’s Ledgett acknowledges, “We also learned for the first time that part of the damage assessment considered the possibility that Snowden could have left a bug or virus behind on the NSA’s system[s], like a time bomb.”19 The agency should have planned for a possible break-in to minimize the harm and quickly and reliably assess the damage. For example, it could be prepared to compare a system’s current state with a trusted backup taken before the break-in. This comparison could be run on a different and trusted system.29 The use of islands of security and not putting all of its eggs in one basket would have minimized the damage greatly. It could have been running a file-system integrity checker all along to detect tampering with files.
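A file-system integrity check of this kind can be as simple as comparing cryptographic hashes of the current files against a manifest recorded from a trusted state. The Python sketch below illustrates the idea; the paths and manifest format are assumptions, and production sites would more likely use a dedicated tool such as AIDE or Tripwire, ideally run from a separate trusted system.

    # Sketch of a file-integrity checker: hash every file under a directory and
    # compare against a manifest saved from a trusted (pre-break-in) state.
    import hashlib
    import json
    import os

    def hash_file(path):
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def build_manifest(root):
        manifest = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                manifest[path] = hash_file(path)
        return manifest

    def compare(trusted_manifest_path, root):
        with open(trusted_manifest_path) as f:
            trusted = json.load(f)
        current = build_manifest(root)
        for path, digest in current.items():
            if path not in trusted:
                print(f"NEW FILE:  {path}")
            elif trusted[path] != digest:
                print(f"MODIFIED:  {path}")
        for path in trusted:
            if path not in current:
                print(f"MISSING:   {path}")

    compare("trusted_manifest.json", "/etc")    # paths are illustrative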

Periodic Security Audits. Security is an ongoing process. An outside security audit performed quarterly or annually would have found the NSA’s problems and, perhaps, fixed them in time to stop Snowden. Such an audit is quite common and considered good practice. This is similar to the outside financial audit of large companies required by… the U.S. government. The report should be reviewed by the highest levels of management to avoid lower levels simply ignoring inconvenient findings.


Summary

The NSA seemingly had become lax in following even the most important, simple, and cheap computer-security practices, with predictable consequences, even though it has virtually unlimited resources and access—if it wants it—to the best computer-security experts in the country.

Most of the good security practices covered here were discussed in the author’s Real World Linux Security, first published in 2000.29 The most important of these security practices were also discussed in the author’s article, “The Seven Deadly Sins of Linux Security,” published in the May/June 2007 issue of ACM Queue.

I am honored there are autographed copies of my book in the NSA’s headquarters. The vast majority of NSA employees and contractors are eminently talented law-abiding dedicated patriots. It is unfortunate that a tiny percentage no doubt ignored warnings that these security problems desperately needed fixing to avoid a serious breach.

Related articles
on queue.acm.org

Communications Surveillance: Privacy and Security at Risk
Whitfield Diffie and Susan Landau
http://queue.acm.org/detail.cfm?id=1613130

More Encryption Is Not the Solution
Poul-Henning Kamp
http://queue.acm.org/detail.cfm?id=2508864

Four Billion Little Brothers?: Privacy, mobile phones, and ubiquitous data collection
Katie Shilton
http://queue.acm.org/detail.cfm?id=1597790
