
High-Confidence Design For Security: Don't Trust—Verify

Don't trust—verify.

The widespread use of networks makes information security a major concern when the underlying network (for example, the Internet) is assumed to be insecure. Systems with security requirements typically must operate with a high degree of confidence—they must be highly assured. The task of designing and building secure systems raises a fundamental question: how do we know with confidence that our designs will be secure? Having confidence in a secure system requires confidence in the strength of the cryptographic algorithms, confidence in the correctness of the hardware and software implementations, and knowledge that the implementation supports a security model.

This article describes methods for establishing confidence that implementations meet their specifications and security requirements. These methods are rigorous in nature, rely on mathematical logic, and are accessible to engineering students at the master’s level. As is typical in systems engineering, various methods are used depending on what level of design is being addressed.


Security Properties and Mechanisms

The people, keys, processes, and machines that send and receive information and that access information resources (databases, processors, printers) are called principals. Security properties typically deal with the ability of principals to access information or resources. Key security properties include:

  • Privacy or confidentiality: knowing which principals can read data;
  • Integrity: detecting corruption of information;
  • Authentication: knowing the identity of a principal or the source of information;
  • Access control: restricting the use of a resource to privileged principals;
  • Non-repudiation: preventing principals from denying their actions; and
  • Availability of service: guaranteeing authorized principals access to services.

The Handbook of Applied Cryptography [8] describes each of these properties in detail. The means for achieving these properties depends on the collection of security mechanisms that supply security services, the correct implementation of these mechanisms, and how the mechanisms are used. Some common security mechanisms used to achieve these security properties are:

  • Encryption: hiding information in a reversible way, that is, for message m and keys K and K^-1: decrypt(K^-1, encrypt(K, m)) = m, such that it is computationally infeasible to determine the keys K and K^-1 (a small sketch of this property and the next appears after this list);
  • Hash functions: one-way functions that map messages of arbitrary length to a fixed number of bits, where for any message m1 it is computationally infeasible to find another message m2 such that hash(m2) = hash(m1), or to recover m1 given only hash(m1);
  • Digital signatures: a means to associate information with principals;
  • Certification: vouching for the accuracy of information by a recognized authority; and
  • Authorization: granting permission to do something or to act in a particular role.
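To make the first two mechanisms concrete, the following minimal Python sketch illustrates the round-trip property decrypt(K^-1, encrypt(K, m)) = m and the fixed-length output of a hash function. The XOR keystream and the key are toy stand-ins chosen only for illustration (so K^-1 = K); they are not secure ciphers, and SHA-256 from Python's hashlib stands in for a generic hash function.

```python
import hashlib

def encrypt(key: bytes, message: bytes) -> bytes:
    """Toy 'encryption': XOR the message with a repeated key.
    A stand-in used only to illustrate reversibility; not secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

def decrypt(key_inverse: bytes, ciphertext: bytes) -> bytes:
    """For the XOR toy cipher, the decryption key equals the encryption key."""
    return encrypt(key_inverse, ciphertext)

m = b"wire transfer: $100 to account 42"
K = b"not-a-real-key"

# Reversibility: decrypt(K^-1, encrypt(K, m)) = m
assert decrypt(K, encrypt(K, m)) == m

# A hash maps messages of arbitrary length to a fixed number of bits.
print(hashlib.sha256(m).hexdigest())         # 256-bit digest
print(hashlib.sha256(m * 1000).hexdigest())  # much longer message, still 256 bits
```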

Encryption is used to meet confidentiality requirements. There are two kinds of encryption: symmetric (or secret key) encryption, and asymmetric (or public key) encryption. In symmetric key cryptography, the same key is used as a parameter of a cryptographic function such as DES (Data Encryption Standard) [8] to encrypt or decrypt data. DES is an instance of a Feistel cipher [8]. Feistel ciphers have a repetitive structure in which each stage, or round, splits its input into two halves (L_{i-1}, R_{i-1}), passes the right input R_{i-1} to the left output L_i, and computes the right output R_i as the bitwise exclusive-or of the left input L_{i-1} with a function F(R_{i-1}, K_i) of the right input R_{i-1} and subkey K_i. The function F is a complex function that permutes the order of bits in a block and substitutes one block of bits for another. DES uses 16 rounds with specific functions defining substitutions, permutations, and subkey generation.
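The round structure itself is short enough to write down. The Python sketch below is a generic Feistel network, not DES: the round function F and the subkeys are arbitrary placeholders, and real DES adds initial and final permutations and carefully designed substitution tables. What the sketch does show is why the structure is invertible regardless of F: decryption simply applies the same rounds with the subkeys in reverse order.

```python
from typing import Callable, List, Tuple

Half = int  # each half-block modeled as a 32-bit integer

def feistel_encrypt(block: Tuple[Half, Half],
                    subkeys: List[int],
                    F: Callable[[Half, int], Half]) -> Tuple[Half, Half]:
    """Generic Feistel rounds: L_i = R_{i-1}; R_i = L_{i-1} XOR F(R_{i-1}, K_i)."""
    L, R = block
    for K in subkeys:
        L, R = R, L ^ F(R, K)
    return L, R

def feistel_decrypt(block: Tuple[Half, Half],
                    subkeys: List[int],
                    F: Callable[[Half, int], Half]) -> Tuple[Half, Half]:
    """Invert the rounds by applying the subkeys in reverse order."""
    L, R = block
    for K in reversed(subkeys):
        L, R = R ^ F(L, K), L
    return L, R

# Toy round function and subkeys -- placeholders, nothing like real DES.
F = lambda right, key: ((right * 0x9E3779B1) ^ key) & 0xFFFFFFFF
subkeys = [0x0F0F0F0F, 0x12345678, 0xCAFEBABE, 0xDEADBEEF]

plaintext = (0x01234567, 0x89ABCDEF)
ciphertext = feistel_encrypt(plaintext, subkeys, F)
assert feistel_decrypt(ciphertext, subkeys, F) == plaintext
```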

Asymmetric key cryptography uses two keys. A public key is available to everyone. A private key is known only by a single principal. Thus a private key in some sense uniquely identifies a principal if that key is not compromised.

An example of a public key encryption algorithm is RSA by Rivest, Shamir, and Adleman [9]. Encryption and decryption are done by exponentiation. If M is the value of a message block, and (e, n) is a public key where e and n are integers, then the encrypted value of M is M^e mod n, that is, M raised to the power e modulo n. (Recall that 2^3 mod 7 = 8 mod 7 = 1.) A decryption key (d, n) is the inverse of an encryption key (e, n), that is, (M^e)^d mod n = M. The original message is retrieved when the encrypted value is raised to the power d.
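The arithmetic can be tried directly. The sketch below uses textbook RSA with toy parameters (far too small to be secure) chosen only so the numbers are easy to follow; Python's built-in pow performs the modular exponentiation, and pow(e, -1, phi) (Python 3.8+) computes the private exponent.

```python
# Textbook RSA with toy parameters -- illustration only, far too small to be secure.
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent 2753, since e*d = 1 (mod phi)

M = 65                     # a message block encoded as an integer, 0 <= M < n
C = pow(M, e, n)           # encrypt: M^e mod n
assert pow(C, d, n) == M   # decrypt: (M^e)^d mod n = M
```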

Digital signatures are used to meet the needs for integrity, authentication, access control, and non-repudiation. One method for accomplishing both encryption for confidentiality and generating digital signatures is shown in the figure appearing here. A signed and encrypted message (ciphertext, signature) consists of ciphertext—the plaintext encrypted with a data encryption key (DEK), and signature—the message hash encrypted with a signature key. Recalling the properties of hash functions and public key encryption, the hash value of the plaintext identifies a message and the signature key identifies the sender. DES can be used to encrypt the plaintext and RSA can be used to sign the hash value of the plaintext.

Given the pair (ciphertext, signature), the plaintext is extracted by decrypting ciphertext using the inverse of the DEK, and the integrity and authenticity are checked by comparing the hash of the decrypted message with the decrypted signature received with the encrypted message. If the message arrives intact, then the owner of the private key (assuming the private key of the sender has not been compromised) is the only one who could have sent the message.
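A minimal Python sketch of this sign-and-encrypt mechanism and its integrity check follows. It reuses toy stand-ins rather than real algorithms: an XOR keystream plays the role of DES for the DEK, the tiny RSA parameters from the previous sketch sign the hash, and SHA-256 is reduced modulo the toy modulus only so the numbers fit; every name here is illustrative, not part of any real protocol.

```python
import hashlib

def sym_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (XOR keystream) standing in for DES."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

sym_decrypt = sym_encrypt          # XOR is its own inverse

n, e, d = 3233, 17, 2753           # toy RSA signing key pair (see previous sketch)

def digest(data: bytes) -> int:
    # SHA-256 reduced modulo the toy modulus -- for illustration only.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign_and_encrypt(plaintext: bytes, dek: bytes):
    """Produce the (ciphertext, signature) pair from the figure."""
    ciphertext = sym_encrypt(dek, plaintext)
    signature = pow(digest(plaintext), d, n)   # hash signed with the private key
    return ciphertext, signature

def integrity_ok(pair, dek: bytes) -> bool:
    """Decrypt, then compare the hash of the recovered plaintext with the
    signature decrypted under the sender's public key."""
    ciphertext, signature = pair
    plaintext = sym_decrypt(dek, ciphertext)
    return digest(plaintext) == pow(signature, e, n)

dek = b"session-key"
pair = sign_and_encrypt(b"pay Bob 10 dollars", dek)
print(integrity_ok(pair, dek))     # True: the message arrived intact

tampered = (pair[0][:-1] + bytes([pair[0][-1] ^ 1]), pair[1])
print(integrity_ok(tampered, dek)) # False (with overwhelming probability): corrupted
```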


Establishing Confidence

What gives us confidence in the security of a system and how do we go about establishing that confidence? Considering the security properties and mechanisms described in the previous section and shown in Figure 1, the following questions come to mind: Are the cryptoalgorithms and hash functions good enough? Are the mechanisms, such as those shown in the figure, correctly implemented in hardware and software? Does the implementation support some security or trust model? Each of these questions is addressed here.

Confidence in algorithms. Confidence in the underlying cryptographic algorithms requires a detailed analysis of the algorithms themselves using number theory, statistics, and complexity theory. This analysis substantiates the expectation that the algorithms resist the cryptanalytic techniques available to attackers. An important part of having confidence in an algorithm is whether it has been thoroughly examined by analysts. In most cases, implementers do not create new cryptoalgorithms, nor are they responsible for assessing the cryptographic strength of those cryptoalgorithms. Rather, most system designers use cryptoalgorithms as components and are responsible for their correct implementation and use. This brings us to the questions of correct implementation and maintaining trust and security.

Confidence in implementations. Having confidence in the implementation of security mechanisms requires knowledge that the implementations are correct. Besides the normal testing and simulation to check correctness for specific test cases, formal verification using mathematical logic is possible and becoming more practical.


Temporal logic and reasoning tools based on model checking [7] have been used to verify properties of microprocessors and controllers. Because model checkers explore the properties of state machines, they are particularly appropriate for verifying properties of controllers and control paths. Example properties include the absence of deadlock and the reachability of certain states. Freedom from deadlock is expressed as “for all states it is possible to make a transition to another state.” Eiriksson [1] used the Cadence Berkeley Labs SMV (Symbolic Model Verifier) model checker [7] to verify the correctness of a one-million-gate application-specific integrated circuit (ASIC) implementing a cache coherence protocol.
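Tools such as SMV explore the state space symbolically, but the property itself is easy to illustrate with a small explicit-state sketch in Python. The state machine below is made up for illustration; the check enumerates every state reachable from the initial state and confirms that each one has at least one outgoing transition, which is the deadlock-freedom property quoted above.

```python
from collections import deque

# A made-up request/grant state machine: each state maps to its successors.
transitions = {
    "idle":    {"request"},
    "request": {"grant", "idle"},
    "grant":   {"release"},
    "release": {"idle"},
}

def reachable(initial, succ):
    """Breadth-first exploration of every state reachable from `initial`."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for t in succ.get(s, ()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

def deadlock_free(initial, succ):
    """'For all (reachable) states it is possible to make a transition.'"""
    return all(succ.get(s) for s in reachable(initial, succ))

print(deadlock_free("idle", transitions))   # True for this toy machine
```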

Higher-order logic, logic that allows functions to be parameters of functions, is well suited for describing and verifying data paths and their specifications. The ability to use functions as parameters means security mechanisms can be specified, described, and verified without fixing the particular hash or encryption function used in a given instance. The correctness of the mechanisms is established based on the general properties of hash and encryption functions. For example, in the integrity checking portion of the mechanism in the figure, the pair (ciphertext, signature) is deemed OK if there is a match between the hash of the decrypted ciphertext and the decrypted signature.

The properties of the integrity check depend on the properties of hash and decryption functions. If the decryption functions invert the corresponding encryption functions, then the integrity check is true if and only if the received message is equivalent to the message that was sent, where equivalence is defined by having the same hash value. Specification and verification of integrity checking at the abstract level and lower levels using specific message structures is possible using higher-order logic [3] and automated tools [4].
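This use of functions as parameters has an informal executable analogue in any language with higher-order functions. The Python sketch below writes the integrity check once, over arbitrary hash, decrypt_data, and decrypt_signature parameters; the names and the trivial instantiation are invented for illustration and are not the higher-order-logic formalization cited in [3].

```python
def integrity_ok(pair, hash_fn, decrypt_data, decrypt_signature):
    """Generic integrity check: (ciphertext, signature) is deemed OK iff the
    hash of the recovered plaintext equals the recovered hash value.
    No particular hash or cipher is fixed; all three are parameters."""
    ciphertext, signature = pair
    return hash_fn(decrypt_data(ciphertext)) == decrypt_signature(signature)

# A trivial instantiation, just to exercise the definition: the 'ciphertext'
# is the plaintext itself and the 'signature' is its (unencrypted) hash.
identity = lambda x: x
pair = (b"hello", hash(b"hello"))
print(integrity_ok(pair, hash, identity, identity))   # True
```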

To give a sense of how lower-level implementation descriptions are related to higher-level behavioral specifications, consider the addition operation. What must be proved is that the bit-level implementation corresponds to correct arithmetic-level behavior given a particular mapping from bit vectors to integers (consider the unsigned interpretation of bit vectors).


For example, say we have a 4-bit adder Add(In1, In2, Sum) that adds In1 and In2 to get Sum. An example at the bit level is 0011+0101=1000. At the arithmetic level, if we use an unsigned interpretation of 4-bit words, we have 3+5=8, where value(0011)=3, value(0101)=5, and value(1000)=8. If all input and output bit vectors allowable by adder Add(In1, In2, Sum) satisfy the relationship value(In1)+value(In2)=value(Sum), then Add satisfies the arithmetic specification of addition for unsigned numbers.
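For a 4-bit adder this relationship can even be checked by exhaustive enumeration, which is what the Python sketch below does: it builds a gate-level ripple-carry adder from XOR, AND, and OR operations and confirms value(In1) + value(In2) = value(Sum) for every pair of inputs. The sketch keeps the carry-out as a fifth Sum bit so the relationship holds for all inputs; production verification tools establish this kind of result symbolically rather than by enumeration.

```python
from itertools import product

def full_adder(a, b, cin):
    """One gate-level full adder built from XOR, AND, and OR."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def ripple_add(in1, in2):
    """4-bit ripple-carry adder; bit vectors are tuples, most significant bit
    first. Returns a 5-bit Sum with the carry-out as the top bit."""
    carry, out = 0, []
    for a, b in zip(reversed(in1), reversed(in2)):   # least significant bit first
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return tuple(reversed(out + [carry]))

def value(bits):
    """Unsigned interpretation of a bit vector (most significant bit first)."""
    v = 0
    for b in bits:
        v = 2 * v + b
    return v

# Exhaustive check of the arithmetic-level specification.
for in1 in product((0, 1), repeat=4):
    for in2 in product((0, 1), repeat=4):
        assert value(in1) + value(in2) == value(ripple_add(in1, in2))
print("value(In1) + value(In2) = value(Sum) for all 4-bit inputs")
```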

Relating low-level implementations to higher-level specifications (such as gate-level implementations to arithmetic behavior) means relating models or descriptions at differing levels of abstraction, and this is done at a variety of levels. For example, Tahar and Kumar have verified, using higher-order logic, that the pipelined DLX RISC processor implementation satisfies its instruction set architecture as seen by users [10].

While the preceding examples have a hardware flavor to them, software verification is becoming practical as well, especially in the area of model checking of software. Examples include checking a telephone switching application using Lucent Technologies' VeriSoft tool [2], and checking safety-critical systems using the Naval Research Laboratory's SCR (Software Cost Reduction) tools [4].

Confidence in security. The cryptographic algorithms and security mechanisms described in the previous sections establish that messages and data were encrypted and signed using a key K. Consider the case where the mechanisms in the figure are implemented correctly and the pair (ciphertext,signature) is deemed OK. If ciphertext decrypts to statement s and the key used to sign hash(s) is K, then we can say K says s.

A critical question is: whom does K represent? This question raises a number of issues that extend beyond the cryptographic strength of algorithms and the correct implementation of hardware and software, including:

  • What is the basis for associating keys with principals?
  • Who are the recognized authorities that vouch for principals, their information, and their privileges?
  • How do we decide if we should honor a request to access a resource such as a file?

These questions point to a need to reason about trust relationships, authentication, and authority using some model of security or trust. This is particularly important for distributed systems where system components are geographically distant. Generally, these questions are handled as follows:

  • Keys are associated with principals using certificates. Certificates document the association of public keys with principals. They are unforgeable as they are cryptographically signed by the certification authority issuing the certificate. An example is the X.509 certificate standard [5].
  • Certain principals and their keys are explicitly recognized as having the authority to certify the validity of associating keys with principals.
  • Access to resources is determined by consulting an access control list after authenticating who is making a request and in what role the person is making the request.
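The last point reduces, in the simplest case, to a table lookup once the requester has been authenticated. The Python sketch below is a minimal illustration with made-up principals, roles, and resources; real systems add groups, delegation, and auditing on top of this basic check.

```python
# Access-control list: resource -> the principals and roles allowed to use it.
acl = {
    "payroll.db": {"role:accountant", "alice"},
    "printer":    {"role:employee"},
}

def authorized(principal: str, role: str, resource: str) -> bool:
    """Grant access if the authenticated principal, or the role in which the
    request was made, appears on the resource's access control list."""
    allowed = acl.get(resource, set())
    return principal in allowed or f"role:{role}" in allowed

print(authorized("alice", "employee", "payroll.db"))  # True: named directly
print(authorized("bob", "employee", "payroll.db"))    # False: not on the list
print(authorized("bob", "employee", "printer"))       # True: allowed via role
```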

The challenge is to have a means to reason consistently about who is making requests. One way to address this need is to create specialized logical systems to reason about trust, authentication, and authority. These systems have a specialized language and inference rules for reasoning about principals and those for whom they speak. The advantage of having such a system is it maintains logical consistency when making decisions on authentication and access control. Of course, the issue of whether the inference rules themselves make sense for the application at hand must be examined closely.

One example is the logic created by Lampson et al. to reason about authentication in distributed systems [6]. In this system, all communication is done using channels—a channel is an implementation of what is shown in the figure. When a statement is decrypted and its integrity is verified, the channel has made the statement K says s.

Certificates are signed statements saying that a key speaks for a principal. (Note: in this logic, principals are people, processes, or keys.) A certificate has the form K_CA says (K_A speaks for A), where CA is a certificate authority and A is a principal whose public key is K_A. This formula is interpreted as the statement “K_A speaks for A” signed by K_CA. If it is accepted that K_CA speaks for CA (K_CA is CA's key), where CA is a recognized certification authority, then we conclude that K_A speaks for A (K_A is A's key). As a simple illustration of how the logic of Lampson et al. is used, we go through the chain of reasoning that enables principal B to conclude K_A is A's key.

The logic has (among others) the following inference rules where P1 and P2 are any principals and s is any statement:

  1. If (P1 speaks for P2) and (P1 says s), then P2 says s.
  2. If P1 says (P2 speaks for P1), then P2 speaks for P1.

Assume that principal B has received the certificate K_CA says (K_A speaks for A). If B assumes that (1) CA speaks for Anyone (that is, CA is trusted to certify key bindings for any principal) and (2) K_CA speaks for CA, then B can conclude K_A speaks for A using the preceding two rules, the two assumptions, and the certificate.

The chain of reasoning is as follows. Using the first rule, the second assumption, and the received certificate, B concludes CA says (K_A speaks for A). Instantiating the first assumption at A, B concludes CA speaks for A, and using the first rule again concludes that A says (K_A speaks for A). Finally, using the second rule, B determines K_A speaks for A. If our authentication and access control mechanisms act consistently with the inference rules, then we can have confidence in the security of the system, assuming the security model as reflected by the inference rules is appropriate for the application.
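This chain of reasoning can be replayed mechanically. The Python sketch below encodes just the two inference rules above as forward chaining over a set of believed formulas, starting from B's two assumptions (with "CA speaks for Anyone" already instantiated at A) and the received certificate; it is a toy for this one example, not an implementation of the full logic of Lampson et al.

```python
# Formulas are tuples:  ("speaksfor", P, Q)  means "P speaks for Q"
#                       ("says", P, s)       means "P says s"
K_CA, CA, K_A, A = "K_CA", "CA", "K_A", "A"

beliefs = {
    ("speaksfor", CA, A),                   # assumption 1, instantiated at A
    ("speaksfor", K_CA, CA),                # assumption 2: K_CA is CA's key
    ("says", K_CA, ("speaksfor", K_A, A)),  # the certificate B received
}

def step(facts):
    """Apply the two inference rules once to every matching fact (or pair)."""
    new = set(facts)
    for f in facts:
        # Rule 2: if P1 says (P2 speaks for P1), then P2 speaks for P1.
        if (f[0] == "says" and isinstance(f[2], tuple)
                and f[2][0] == "speaksfor" and f[2][2] == f[1]):
            new.add(("speaksfor", f[2][1], f[1]))
        # Rule 1: if (P1 speaks for P2) and (P1 says s), then P2 says s.
        if f[0] == "speaksfor":
            p1, p2 = f[1], f[2]
            for g in facts:
                if g[0] == "says" and g[1] == p1:
                    new.add(("says", p2, g[2]))
    return new

# Forward-chain to a fixed point, then check B's conclusion.
while True:
    extended = step(beliefs)
    if extended == beliefs:
        break
    beliefs = extended

print(("speaksfor", K_A, A) in beliefs)   # True: K_A speaks for A
```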

The logic of Lampson et al. contains many rules for reasoning about delegation of authority, principals acting in roles, and authentication with multiple certification authorities. The logic was used to design an authentication system as part of Digital Equipment Corporation’s Taos operating system [11]. Taos featured a distributed file system that authenticated principals and their privileges.


Conclusions

Having confidence that a system is secure requires confidence in the cryptographic algorithms, confidence that the security mechanisms are implemented properly, and that the mechanisms themselves support a security model. While the mathematical analysis necessary to reason about the cryptographic strength of algorithms may be beyond most system engineers, the type of mathematical logic used to analyze the correctness of implementations and trust relationships is not. In fact, these reasoning methods are taught to engineering students at the master’s level. Formal reasoning and tools are being adopted by industry, particularly by integrated circuit manufacturers.

This article is not comprehensive in its treatment of all security concerns and formal reasoning techniques. For example, the analysis of covert channels has not been addressed. Nevertheless, the methods discussed in this article do cover a wide variety of concerns that must be addressed to establish confidence. These reasoning methods and tools can and should be part of the repertoire of designers of secure systems.


Figures

Figure 1. Channels with encryption and digital signatures.

References

    1. Eiriksson, A.T. The formal design of 1M-gate ASICs. In Formal Methods in Computer-Aided Design, G. Gopalakrishnan and P. Windley, Eds., Lecture Notes in Computer Science 1522, Springer-Verlag (1998), 49–63.

    2. Godefroid, P., Hanmer, R.S., and Jagadeesan, L.J. Systematic software testing using VeriSoft: An analysis of the 4ESS heart-beat monitor. Bell Labs Technical Journal 3, 2 (Apr.–June 1998).

    3. Gordon, M.J.C. A proof generating system for higher-order logic. In G. Birtwistle and P.A. Subramanyam, Eds., VLSI Specification, Verification, and Synthesis. Kluwer, 1987.

    4. Heitmeyer, C., Kirby, J., Labaw, B., and Bharadwaj, R. SCR: A toolset for specifying and analyzing software requirements. In Proceedings of the 10th Annual Conference on Computer-Aided Verification (CAV'98), (Vancouver, 1998).

    5. ITU-T. Information Processing Systems—Open Systems Interconnection—The Directory Authentication Framework, ITU-T Recommendation X.509. 11/93. Also ISO/IEC 9594-8

    6. Lampson, B., Abadi, M., Burrows, M., and Wobber, E. Authentication in distributed systems: Theory and practice. ACM Trans. Computer Systems 10, 4 (Nov. 1992), 265–310.

    7. McMillan, K.L. Symbolic Model Checking. Kluwer Academic Publishers, Boston, 1993.

    8. Menezes, A.J., van Oorschot, P.C., and Vanstone, S.A. Handbook of Applied Cryptography. CRC Press, NY, 1996.

    9. Rivest, R.L., Shamir, A., and Adleman, L. A method for obtaining digital signatures and public-key cryptosystems. Commun. ACM 21, 2 (Feb. 1978), 120–126.

    10. Tahar, S. and Kumar, R. A practical methodology for the formal verification of RISC processors. Formal Methods in System Design 13, 2 (1998), 159–225.

    11. Wobber, E., Abadi, M., Burrows, M., and Lampson, B. Authentication in the TAOS Operating System. Report 117, Digital Equipment Corporation Systems Research Center, Palo Alto, CA (Dec. 1993).

    12. Zhou, D. and Chin, S.K. Verifying privacy enhanced mail functions with higher order logic. DIMACS Series in Discrete Mathematics 38 (1998), 11–20.
