Communications of the ACM, July 1999 (Vol. 42, No. 7)

# High-Confidence Design for Security: Don't Trust, Verify

The widespread use of networks makes information security a major concern when the underlying network (for example, the Internet) is assumed to be insecure. Systems with security requirements typically must operate with a high degree of confidence; they must be highly assured. The task of designing and building secure systems raises a fundamental question: *how do we know with confidence that our designs will be secure?* Having confidence in a secure system requires having confidence in the strength of the cryptographic algorithms, having confidence in the correctness of the hardware and software implementations, and knowing that the implementation supports a security model.

This article describes methods for establishing confidence that implementations meet their specifications and security requirements. These methods are rigorous in nature, rely on mathematical logic, and are accessible to engineering students at the master's level. As is typical in systems engineering, various methods are used depending on what level of design is being addressed.

The people, keys, processes, and machines that send and receive information and that access information resources (databases, processors, printers) are called *principals*. Security properties typically deal with the ability of principals to access information or resources. Key security properties include:

- Privacy or confidentiality: knowing which principals can read data;
- Integrity: detecting corruption of information;
- Authentication: knowing the identity of a principal or the source of information;
- Access control: restricting the use of a resource to privileged principals;
- Non-repudiation: preventing principals from denying their actions; and
- Availability of service: guaranteeing authorized principals access to services.

The *Handbook of Applied Cryptography* [8] describes each of these properties in detail. The means for achieving these properties depends on the collection of security mechanisms that supply security services, the correct implementation of these mechanisms, and how the mechanisms are used. Some common security mechanisms used to achieve these security properties are:

- Encryption: hiding information in a reversible way; that is, for a message *m* and keys *K* and *K*^{-1}, *decrypt(K*^{-1}*, encrypt(K, m)) = m*, such that it is computationally infeasible to determine the keys *K* and *K*^{-1};
- Hash functions: one-way functions that map messages of arbitrary length to a fixed number of bits, where for any hash value *hash(m*_{1}*)* it is computationally infeasible to find another message *m*_{2} such that *hash(m*_{1}*) = hash(m*_{2}*)*, or to find *m*_{1} given *hash(m*_{1}*)*;
- Digital signatures: a means to associate information with principals;
- Certification: vouching for the accuracy of information by a recognized authority; and
- Authorization: granting permission to do something or to act in a particular role.

Encryption is used to meet confidentiality requirements. There are two kinds of encryption: symmetric (or secret key) encryption, and asymmetric (or public key) encryption. In symmetric key cryptography, the same key is used as a parameter of a cryptographic function such as DES (Data Encryption Standard) [8] to encrypt or decrypt data. DES is an instance of a Feistel cipher [8]. Feistel ciphers have a repetitive structure where each stage or *round* of calculations splits the input into two halves (*L*_{i-1}, *R*_{i-1}) and computes *L*_{i} = *R*_{i-1} and *R*_{i} = *L*_{i-1} ⊕ *f*(*R*_{i-1}, *K*_{i}), where *f* is the round function and *K*_{i} is the key for round *i*. Because each round is invertible given its round key, decryption runs the same structure with the round keys in reverse order.
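The round structure can be sketched as follows; the round function, key schedule, and 16-bit halves below are toy assumptions for illustration, not DES itself:

```python
# Minimal sketch of a Feistel network (illustrative only; the round
# function, keys, and widths are toy assumptions, not DES).

def feistel_encrypt(left, right, round_keys, f):
    """Each round: L_i = R_{i-1}, R_i = L_{i-1} XOR f(R_{i-1}, K_i)."""
    for k in round_keys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, f):
    """Inverts the rounds by applying the round keys in reverse order."""
    for k in reversed(round_keys):
        left, right = right ^ f(left, k), left
    return left, right

# Toy round function on 16-bit halves (an assumption for illustration).
def f(half, key):
    return (half * 31 + key) & 0xFFFF

keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]
l, r = feistel_encrypt(0x1234, 0xABCD, keys, f)
assert feistel_decrypt(l, r, keys, f) == (0x1234, 0xABCD)
```

Note that decryption works regardless of what *f* is; the Feistel structure, not any property of the round function, guarantees invertibility.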

Asymmetric key cryptography uses two keys. A *public key* is available to everyone. A *private key* is known only by a single principal. Thus a private key in some sense uniquely identifies a principal if that key is not compromised.

An example of a public key encryption algorithm is RSA by Rivest, Shamir, and Adleman [9]. Encryption and decryption are done by exponentiation. If *M* is the value of a message block, and *(e, n)* is a public key where *e* and *n* are integers, then the encrypted value of *M* is *M*^{e} mod *n*. Decryption with the corresponding private key *(d, n)* recovers the message by computing (*M*^{e})^{d} mod *n* = *M*.
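A minimal sketch of the exponentiation, using small textbook primes that are far too small for real use:

```python
# Toy RSA with tiny primes (insecure; shown only to make the
# exponentiation concrete).
p, q = 61, 53
n = p * q                           # modulus shared by both keys
e = 17                              # public exponent, coprime to (p-1)(q-1)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent: d*e ≡ 1 mod (p-1)(q-1)

M = 65                              # a message block, 0 <= M < n
C = pow(M, e, n)                    # encryption: M^e mod n
assert pow(C, d, n) == M            # decryption: C^d mod n recovers M
```

The security rests on the difficulty of recovering *d* from *(e, n)* without knowing the factorization of *n*; with these toy primes, of course, factoring is trivial.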

Digital signatures are used to meet the needs for integrity, authentication, access control, and non-repudiation. One method for accomplishing both encryption for confidentiality and generating digital signatures is shown in the figure appearing here. A signed and encrypted message (*ciphertext, signature*) consists of *ciphertext*, the plaintext encrypted with a data encryption key (DEK), and *signature*, the message hash encrypted with a signature key. Recalling the properties of hash functions and public key encryption, the hash value of the plaintext identifies a message and the signature key identifies the sender. DES can be used to encrypt the plaintext and RSA can be used to sign the hash value of the plaintext.

Given the pair (*ciphertext, signature*), the plaintext is extracted by decrypting *ciphertext* using the inverse of the DEK, and the integrity and authenticity are checked by comparing the hash of the decrypted message with the decrypted signature received with the encrypted message. If the message arrives intact, then the owner of the private key (assuming the private key of the sender has not been compromised) is the only one who could have sent the message.
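The scheme can be sketched end to end; the XOR stream cipher and toy modular signature below stand in for DES and RSA (assumptions for illustration only):

```python
import hashlib

# Sketch of the (ciphertext, signature) scheme. Real systems use DES/RSA;
# here a toy XOR stream stands in for the DEK cipher and a toy modular
# signature stands in for RSA.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Symmetric: applying it twice with the same key recovers the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

p, q, e = 61, 53, 17                # toy RSA parameters (insecure)
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)             # "encrypt" the hash with the private key

def verify(ciphertext: bytes, signature: int, dek: bytes) -> bool:
    plaintext = xor_cipher(ciphertext, dek)   # undo the DEK
    h = int.from_bytes(hashlib.sha256(plaintext).digest(), "big") % n
    return pow(signature, e, n) == h          # compare the two hash values

dek = b"data-encryption-key"
msg = b"attack at dawn"
pair = (xor_cipher(msg, dek), sign(msg))
assert verify(*pair, dek)
assert not verify(xor_cipher(b"attack at dusk", dek), pair[1], dek)
```

The final assertion shows the integrity property: a different message fails the check because its hash no longer matches the decrypted signature.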

What gives us confidence in the security of a system, and how do we go about establishing that confidence? Considering the security properties and mechanisms described in the previous sections, three questions come to mind: Are the cryptoalgorithms and hash functions good enough? Are the mechanisms, such as those shown in the figure, correctly implemented in hardware and software? Does the implementation support some security or trust model? Each of these questions is addressed here.

**Confidence in algorithms.** Confidence in the underlying cryptographic algorithms requires a detailed analysis of the algorithms themselves using number theory, statistics, and complexity theory. This is to substantiate the expectation that the algorithms are resistant to cryptanalysis used by attackers. An important part of having confidence in an algorithm is whether it has been thoroughly examined by analysts. In most cases, implementers do not create new cryptoalgorithms, nor are they responsible for assessing the cryptographic strength of those cryptoalgorithms. Rather, most system designers use cryptoalgorithms as components and are responsible for their correct implementation and use. This brings us to the questions of correct implementation and maintaining trust and security.

**Confidence in implementations.** Having confidence in the implementation of security mechanisms requires knowledge that the implementations are correct. Besides the normal testing and simulation to check correctness for specific test cases, formal verification using mathematical logic is possible and becoming more practical.

Temporal logic and reasoning tools based on model checking [7] have verified properties of microprocessors and controllers. As model checkers explore the properties of state machines, they are particularly appropriate for verifying properties of controllers and control paths. Example properties include checking for the absence of deadlock and the reachability of certain states. Freedom from deadlock is expressed as "for all states it is possible to make a transition to another state." Eiriksson in [1] used the Cadence Berkeley Labs SMV (Symbolic Model Verifier) model checker [7] to verify the correctness of a one-million gate application-specific integrated circuit (ASIC) implementing a cache coherence protocol.
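A minimal sketch of these two checks on an explicit state graph (a stand-in for what a model checker such as SMV does symbolically over far larger state spaces):

```python
# Reachability and deadlock-freedom checks on an explicit
# state-transition graph (illustrative; real model checkers work
# symbolically on much larger state spaces).
from collections import deque

def reachable(init, transitions):
    """Breadth-first search for the set of states reachable from init."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

def deadlock_free(init, transitions):
    # "For all reachable states it is possible to move to another state."
    return all(transitions.get(s) for s in reachable(init, transitions))

# A toy protocol (state names are illustrative assumptions).
ok = {"idle": ["trying"], "trying": ["critical"], "critical": ["idle"]}
bad = {"idle": ["trying"], "trying": ["stuck"], "stuck": []}

assert deadlock_free("idle", ok)
assert not deadlock_free("idle", bad)
assert "stuck" in reachable("idle", bad)   # a reachability query
```

A model checker answers the same kinds of questions, but over state spaces far too large to enumerate, by representing sets of states symbolically.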

Higher-order logic, logic that allows functions to be parameters of functions, is well suited for describing and verifying data paths and their specifications. The ability to use functions as parameters means security mechanisms are specified, described, and verified without having to specify the specific hash or encryption function used in a particular instance. The correctness of the mechanisms is established based on the general properties of hash and encryption functions. For example, in the integrity checking portion of the mechanism in the figure, the pair *(ciphertext,signature)* is deemed *OK* if there is a match between the hash of the decrypted ciphertext and the decrypted signature.

The properties of the integrity check depend on the properties of the hash and decryption functions. If the decryption functions invert the corresponding encryption functions, then the integrity check is true if and only if the received message is equivalent to the message that was sent, where equivalence is defined by having the same hash value. Specification and verification of integrity checking at the abstract level and at lower levels using specific message structures is possible using higher-order logic [3] and automated tools [12].
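The parameterization can be sketched in code, with the hash and decryption functions passed as parameters just as in the higher-order-logic treatment; the concrete instance below uses toy functions chosen only for illustration:

```python
import hashlib

# The integrity check specified once, for ANY hash and decryption
# functions with the right properties: the functions themselves are
# parameters, mirroring the higher-order-logic specification.

def integrity_ok(ciphertext, signature, decrypt, unsign, hash_fn) -> bool:
    # OK iff the hash of the decrypted ciphertext matches the
    # decrypted signature.
    return hash_fn(decrypt(ciphertext)) == unsign(signature)

# One concrete instance (toy functions, chosen only for illustration):
key = 0x5A
decrypt = lambda c: bytes(b ^ key for b in c)   # toy XOR "decryption"
hash_fn = lambda m: hashlib.sha256(m).digest()
unsign = lambda s: s    # identity stands in for signature decryption

msg = b"hello"
ct = bytes(b ^ key for b in msg)
assert integrity_ok(ct, hash_fn(msg), decrypt, unsign, hash_fn)
assert not integrity_ok(ct, hash_fn(b"hullo"), decrypt, unsign, hash_fn)
```

The point of the higher-order style is that `integrity_ok` never mentions SHA-256, DES, or RSA; its correctness argument depends only on the stated properties of its function parameters.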

To give a sense of how lower-level implementation descriptions are related to higher-level behavioral specifications, consider the addition operation. What must be proved is the implementation at the bit level corresponds to correct arithmetic-level behavior given a particular mapping from bit vectors to integers (consider the unsigned interpretation of bit vectors).

For example, say we have a 4-bit adder *Add*(*In*_{1}, *In*_{2}, *Cin*, *Sum*, *Cout*) implemented as a network of gates. What must be proved is that *bv2int*(*Sum*) + 16·*Cout* = *bv2int*(*In*_{1}) + *bv2int*(*In*_{2}) + *Cin*, where *bv2int* is the unsigned mapping from bit vectors to integers.

Relating low-level implementations to higher-level specifications (such as gate-level implementations to arithmetic behavior) relates models or descriptions at differing levels of abstraction. This is done at a variety of levels. For example, Tahar and Kumar have verified, using higher-order logic, that a pipelined implementation of the DLX RISC processor satisfies its instruction set architecture as seen by users [10].
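For a 4-bit adder, the correspondence between gate level and arithmetic level can even be checked exhaustively; a minimal sketch, assuming a ripple-carry implementation and the unsigned interpretation of bit vectors:

```python
# Exhaustive check that a gate-level 4-bit ripple-carry adder satisfies
# its arithmetic specification under the unsigned interpretation
# (a theorem prover establishes this by proof rather than enumeration).

def full_adder(a, b, cin):
    # Gate-level: XORs for the sum bit, AND/OR for the carry.
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add4(x_bits, y_bits, cin):
    bits, carry = [], cin
    for a, b in zip(x_bits, y_bits):    # least-significant bit first
        s, carry = full_adder(a, b, carry)
        bits.append(s)
    return bits, carry

def bv2int(bits):
    return sum(b << i for i, b in enumerate(bits))

def int2bv(n, width=4):
    return [(n >> i) & 1 for i in range(width)]

# Spec: bv2int(Sum) + 16*Cout == bv2int(In1) + bv2int(In2) + Cin
for x in range(16):
    for y in range(16):
        for cin in (0, 1):
            s, cout = add4(int2bv(x), int2bv(y), cin)
            assert bv2int(s) + 16 * cout == x + y + cin
```

Enumeration works here because a 4-bit adder has only 512 input cases; for wide datapaths the state space explodes, which is why the proof-based approach of higher-order logic matters.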

While the preceding examples have a hardware flavor to them, software verification is becoming practical as well, especially in the area of model checking of software. Examples using model checking for software include checking a telephone switching application using Lucent Technologies' VeriSoft tool [2], and checking safety-critical systems using the Naval Research Laboratory's SCR (Software Cost Reduction) tools [4].

**Confidence in security.** The cryptographic algorithms and security mechanisms described in the previous sections establish that messages and data were encrypted and signed using a key *K*.

A critical question is *whom does K represent?* This question raises a number of issues that extend beyond the cryptographic strength of algorithms and the correct implementation of hardware and software, including:

- What is the basis for associating keys with principals?
- Who are the recognized authorities that vouch for principals, their information, and their privileges?
- How do we decide if we should honor a request to access a resource such as a file?

These questions point to a need to reason about trust relationships, authentication, and authority using some model of security or trust. This is particularly important for distributed systems where system components are geographically distant. Generally, these questions are handled as follows:

- Keys are associated with principals using *certificates*. Certificates document the association of public keys with principals. They are unforgeable because they are cryptographically signed by the certification authority issuing the certificate. An example is the X.509 certificate standard [5].
- Certain principals and their keys are explicitly recognized as having the authority to certify the validity of associating keys with principals.
- Access to resources is determined by consulting an access control list after authenticating who is making a request and in what role the person is making the request.
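The access-control step can be sketched as follows; the resource names, roles, and list structure are illustrative assumptions:

```python
# Sketch of the access-control step: after authentication establishes
# which principal, acting in which role, is making the request, the
# resource's access control list is consulted. All names here are
# illustrative assumptions.

acl = {
    "payroll.db": {
        ("alice", "admin"):  {"read", "write"},
        ("bob", "auditor"):  {"read"},
    },
}

def access_granted(resource, principal, role, operation):
    """Grant iff the (principal, role) pair holds the operation."""
    return operation in acl.get(resource, {}).get((principal, role), set())

assert access_granted("payroll.db", "bob", "auditor", "read")
assert not access_granted("payroll.db", "bob", "auditor", "write")
```

Note that the ACL lookup is only as trustworthy as the authentication that precedes it, which is why the reasoning about keys and principals below matters.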

The challenge is to have a means to reason consistently about who is making requests. One way to address this need is to create specialized logical systems to reason about trust, authentication, and authority. These systems have a specialized language and inference rules for reasoning about principals and those for whom they speak. The advantage of having such a system is that it maintains logical consistency when making decisions on authentication and access control. Of course, the issue of whether the inference rules themselves make sense for the application at hand must be examined closely.

One example is the logic created by Lampson et al. to reason about authentication in distributed systems [6]. In this system, all communication is done using *channels*; a channel is an implementation of what is shown in the figure. When a statement is decrypted with key *K* and its integrity is verified, the channel has made the statement *K says s*.

Certificates are signed statements saying that a key *speaks for* a principal. (Note: in this logic, principals are people, processes, or keys.) A certificate has the form *K*_{CA} says (*K*_{A} speaks for *A*): the certification authority's key *K*_{CA} states that key *K*_{A} speaks for principal *A*.

The logic has (among others) the following inference rules, where *P*_{1} and *P*_{2} are principals and *s* is a statement:

- If (*P*_{1} speaks for *P*_{2}) and (*P*_{1} says *s*), then *P*_{2} says *s*.
- If *P*_{1} says (*P*_{2} speaks for *P*_{1}), then *P*_{2} speaks for *P*_{1}.

Assume that principal *B* has received the certificate *K*_{CA} says (*K*_{A} speaks for *A*), and that *B* operates under two assumptions: first, that *CA* speaks for *A* (the certification authority is authoritative about *A*'s keys), and second, that *K*_{CA} speaks for *CA*.

The chain of reasoning is as follows. Using the first rule, the second assumption, and the received certificate, *B* concludes *CA* says (*K*_{A} speaks for *A*). Applying the first rule again with the first assumption gives *A* says (*K*_{A} speaks for *A*), and the second rule then yields *K*_{A} speaks for *A*. Thus *B* is justified in treating statements signed by *K*_{A} as statements made by *A*.
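This chain of reasoning can be sketched as forward chaining over the two rules; the tuple encoding below is an illustrative simplification, not the formalism of Lampson et al.:

```python
# Forward chaining over the two speaks-for rules. Facts are 3-tuples:
# ("speaks_for", P1, P2) or ("says", P, statement), where a statement
# may itself be a speaks_for tuple. The encoding is an illustrative
# assumption, not the original formalism.

def close(facts):
    """Compute the closure of the facts under the two inference rules."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        speaks = {(p1, p2) for (k, p1, p2) in facts if k == "speaks_for"}
        for (kind, p, s) in facts:
            if kind == "says":
                # Rule 1: P1 speaks for P2 and P1 says s  =>  P2 says s
                for (p1, p2) in speaks:
                    if p1 == p:
                        new.add(("says", p2, s))
                # Rule 2: P1 says (P2 speaks for P1) => P2 speaks for P1
                if isinstance(s, tuple) and s[0] == "speaks_for" and s[2] == p:
                    new.add(("speaks_for", s[1], s[2]))
        if not new <= facts:
            facts |= new
            changed = True
    return facts

cert = ("says", "K_CA", ("speaks_for", "K_A", "A"))
assumptions = [("speaks_for", "CA", "A"), ("speaks_for", "K_CA", "CA")]
derived = close([cert] + assumptions)

# The intermediate and final conclusions of the chain of reasoning:
assert ("says", "CA", ("speaks_for", "K_A", "A")) in derived
assert ("speaks_for", "K_A", "A") in derived
```

Mechanizing the rules this way is exactly what makes the logic attractive: every access-control conclusion is traceable to the certificates and assumptions that produced it.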

The logic of Lampson et al. contains many rules for reasoning about delegation of authority, principals acting in roles, and authentication with multiple certification authorities. The logic was used to design an authentication system as part of Digital Equipment Corporation's Taos operating system [11]. Taos featured a distributed file system that authenticated principals and their privileges.

Having confidence that a system is secure requires confidence in the cryptographic algorithms, confidence that the security mechanisms are implemented properly, and that the mechanisms themselves support a security model. While the mathematical analysis necessary to reason about the cryptographic strength of algorithms may be beyond most system engineers, the type of mathematical logic used to analyze the correctness of implementations and trust relationships is not. In fact, these reasoning methods are taught to engineering students at the master's level. Formal reasoning and tools are being adopted by industry, particularly by integrated circuit manufacturers.

This article is not comprehensive in its treatment of all security concerns and formal reasoning techniques. For example, the analysis of covert channels has not been addressed. Nevertheless, the methods discussed in this article do cover a wide variety of concerns that must be addressed to establish confidence. These reasoning methods and tools can and should be part of the repertoire of designers of secure systems.

1. Eiriksson, A.T. The formal design of a 1M-gate ASIC. In *Formal Methods in Computer-Aided Design*, G. Gopalakrishnan and P. Windley, Eds., *Lecture Notes in Computer Science 1522*, Springer-Verlag (1998), 49-63.

2. Godefroid, P., Hanmer, R.S., and Jagadeesan, L.J. Systematic software testing using VeriSoft: An analysis of the 4ESS heart-beat monitor. *Bell Labs Technical Journal 3*, 2 (Apr.-June 1998).

3. Gordon, M.J.C. A proof generating system for higher-order logic. In G. Birtwistle and P.A. Subramanyam, Eds., *VLSI Specification, Verification, and Synthesis*. Kluwer, 1987.

4. Heitmeyer, C., Kirby, J., Labaw, B., and Bharadwaj, R. SCR: A toolset for specifying and analyzing software requirements. In *Proceedings of the 10th Annual Conference on Computer-Aided Verification (CAV'98)*, (Vancouver, 1998).

5. ITU-T. Information Processing Systems - Open Systems Interconnection - The Directory: Authentication Framework. ITU-T Recommendation X.509, 11/93. Also ISO/IEC 9594-8.

6. Lampson, B., Abadi, M., Burrows, M., and Wobber, E. Authentication in distributed systems: Theory and practice. *ACM Trans. Computer Systems 10*, 4 (Nov. 1992), 265-310.

7. McMillan, K.L. *Symbolic Model Checking*. Kluwer Academic Publishers, Boston, 1993.

8. Menezes, A.J., van Oorschot, P.C., and Vanstone, S.A. *Handbook of Applied Cryptography*. CRC Press, Boca Raton, FL, 1996.

9. Rivest, R.L., Shamir, A., and Adleman, L. A method for obtaining digital signatures and public-key cryptosystems. *Commun. ACM 21*, 2 (Feb. 1978), 120-126.

10. Tahar, S. and Kumar, R. A practical methodology for the formal verification of RISC processors. *Formal Methods in System Design 13*, 2 (1998), 159-225.

11. Wobber, E., Abadi, M., Burrows, M., and Lampson, B. *Authentication in the Taos Operating System*. Report 117, Digital Equipment Corporation Systems Research Center, Palo Alto, CA (Dec. 1993).

12. Zhou, D. and Chin, S.K. Verifying privacy enhanced mail functions with higher order logic. *DIMACS Series in Discrete Mathematics 38* (1998), 11-20.

**©1999 ACM 0002-0782/99/0700 $5.00**
