Ultimate Cryptography


Predict the state of cryptography in 1,000 years. The thought is daunting. The very narrowness of the field makes the problem that much more difficult. It is one thing to try to decide whether sentient life on Earth will be made of carbon or silicon, quite another to speculate on whether it will need to keep secrets and how it will go about doing it.

We have barely any evidence that cryptography existed 3,000 years ago, just some puzzles on Egyptian tombstones in the Valley of the Kings. About 2,000 years ago, transposition ciphers had made the barest appearance and Julius Caesar carried on correspondence in a very simple substitution cipher. About 1,000 years ago, simple substitution ciphers were familiar enough for their weaknesses to be understood and for people to begin exploring more complex systems intended to counter those weaknesses.
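
Caesar’s system is simple enough to state in a couple of lines of code. A minimal sketch in Python (for concreteness), using the shift of three traditionally attributed to him:

```python
# A minimal sketch of the Caesar cipher: shift each letter a fixed number
# of places through the alphabet (Caesar is said to have used three).
def caesar(text, shift):
    return "".join(
        chr((ord(ch) - 65 + shift) % 26 + 65) if ch.isalpha() else ch
        for ch in text.upper()
    )

print(caesar("VENI VIDI VICI", 3))    # -> YHQL YLGL YLFL
print(caesar("YHQL YLGL YLFL", -3))   # shifting back decrypts
```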

Today, cryptography is an elaborate electronic, mathematical, and computational edifice that transforms trillions of bits into or out of ciphertext every second. The concepts commonplace in this theory—polyalphabetic systems and the distinction between systems and keys—all arose late in the millennium just ended.

What could a prophet in any of those millennial years have predicted about the state of cryptography 1,000 years later? Even to have predicted in 1901 what the state of cryptography would be in 2001 would have required foresight worthy of the Delphic Oracle.

The most accessible predictions are tactical, asking: What will happen to the problems we understand and that concern us today? In cryptography, these problems fall into three general areas: mathematics, computational complexity, and computing technology.


The mathematical problems lie largely in the area of algebraic geometry, the branch of mathematics that deals with solving sets of algebraic equations over very general arithmetic structures. In many respects, cryptography consists of choosing or designing arithmetic systems meant to make the equations difficult to solve. For cryptography, the problem is spiced up by demanding not necessarily the solutions to sets of equations but rather the most likely solutions, given some assumptions about the unknown elements: keys and plaintext.
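
A toy example conveys the flavor (a hypothetical miniature, nothing like the systems cryptographers actually worry about): in an affine cipher over the integers mod 26, assuming just two letters of plaintext reduces recovery of the key to solving two linear equations:

```python
# Toy illustration of cryptanalysis as equation-solving over an arithmetic
# structure: an affine cipher c = a*p + b (mod 26). Assuming a fragment of
# plaintext (the "assumptions about ... plaintext" above) yields two linear
# equations in the two unknown key elements a and b.

def affine_encrypt(text, a, b):
    return "".join(chr((a * (ord(ch) - 65) + b) % 26 + 65) for ch in text)

def recover_key(p1, c1, p2, c2):
    # Subtracting one equation from the other eliminates b:
    #   (c1 - c2) = a * (p1 - p2)  (mod 26)
    dp, dc = (p1 - p2) % 26, (c1 - c2) % 26
    a = dc * pow(dp, -1, 26) % 26     # needs gcd(p1 - p2, 26) == 1
    b = (c1 - a * p1) % 26
    return a, b

ciphertext = affine_encrypt("ATTACK", a=5, b=8)   # -> "IZZISG"
# Guess that the message begins "AT" (p = 0, 19) and solve for the key:
print(recover_key(0, ord(ciphertext[0]) - 65,
                  19, ord(ciphertext[1]) - 65))   # -> (5, 8)
```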

If we date modern mathematics from the development of calculus in the late 17th century, it is little more than 300 years old. Noting the field’s steadily increasing rate of growth during this period, we may imagine that mathematics will make at least comparable progress over the next few hundred years. It does not seem rash to suspect that the majority of problems now confounding us will seem elementary long before another millennium has passed, and that we will know with great assurance whether or not the types of cryptographic systems in use today can be made secure.

Mathematics generally exhibits a very permissive attitude toward the issue of when a problem can be solved, taking little account of efficiency. The theory of computational complexity concerns how difficult it is to solve mathematical problems. Complexity theory is a far more recent development than the traditional mathematical disciplines vital to cryptography, having arisen in close association with computing, primarily in the latter half of the 20th century. Its many theorems notwithstanding, complexity theory is an infant that founders on the simplest of cryptographic problems. Nonetheless, its very youth is reason to hope and expect that before many more centuries have passed, it will have undergone a transformation similar to the one between 17th-century and 20th-century mathematics.

The progress of computing is the most predicted and perhaps least certain of the three areas. For decades, we have lived in the sunlight of Moore’s Law, which, in generalized form, says the cost of computing halves every 18 months. Were Moore’s Law to continue for the next 1,000 years, some 666 18-month periods, the cost of computation would decline to 1/2^666 of its current value. At present, it is feasible to undertake cryptanalytic tasks requiring approximately 2^64 operations and popular to regard systems as secure if their solution requires the square of this figure, or 2^128 operations. This suggests that in 1,000 years, cryptographic systems, even optimally efficient ones, might require upward of 730 bits of key.
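
The arithmetic behind these figures is easily checked (a sketch, taking the 2^64-operation baseline and 18-month halving period above as given):

```python
# Assumptions from the text above: tasks of ~2**64 operations are feasible
# today, and generalized Moore's Law halves the cost of computing every
# 18 months.
periods = 1000 * 12 // 18        # 18-month periods in 1,000 years
print(periods)                   # -> 666

# After 666 halvings, an attacker's budget buys 2**666 times as much work,
# so a key must supply 666 extra bits of difficulty beyond today's 64.
print(64 + periods)              # -> 730 bits of key
```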

It is clear both that Moore’s Law cannot go on forever and that reports of its demise are premature. Moore’s Law in its purer form is about the size of computing elements. Let us therefore ask how small they might possibly get. The smallest elements envisioned by contemporary physics are strings. If we accept the physicists’ optimism that they are headed for a final theory, it seems plausible that strings represent the smallest possible computing elements. Let us also accept the technologists’ optimism that whatever is not impossible will be done and assume that computers in which the gates are individual strings will appear one day.

Sizes in the string world are around 10^-35 meters, or 10^28 times smaller than the smallest feature sizes on today’s chips. The density of elements (ignoring such inconveniences as heat) might plausibly grow as the cube of the feature size, as speeds grow in inverse proportion to that size. A string computer might therefore be 10^112, or roughly 2^370, times as fast as current machines. Keys a mere 434 bits long might suffice.
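
Again, the bookkeeping can be reproduced directly (a sketch assuming the 10^-35-meter string scale and roughly 10^-7-meter chip features; the 434-bit figure follows from rounding 10^112 down to 2^370):

```python
import math

# Assumptions from the text above: strings at ~1e-35 m, current chip
# features at ~1e-7 m; element density grows as the cube of the shrink,
# and switching speed grows in inverse proportion to it.
shrink = 1e-7 / 1e-35            # -> 1e28-fold reduction in feature size
speedup = shrink ** 3 * shrink   # density (cube) times clock speed
print(math.log10(speedup))       # -> 112.0, i.e., 10**112 times as fast

# 10**112 is about 2**372; rounding down to 2**370 and adding today's
# 64 feasible bits gives the 434-bit key figure quoted above.
print(math.log2(speedup))        # -> ~372
```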

Physicists today are working on quantum computing, an ambitious attempt to expand parallelism in computing by computing in quantum superposition, a phenomenon in which a computer’s state evolves toward all possible outcomes simultaneously. Although quantum computing appears threatening to popular public-key cryptosystems, its effect on more general cryptanalytic problems seems only to be square-rooting the costs, or doubling the size of keys. Quantum computing alone might force us to use keys of 200 or 250 bits.

Some forms of physical analysis yield even more conservative estimates. Seth Lloyd, a professor of mechanical engineering at MIT, recently argued that no computer comparable in size and mass to a contemporary laptop could ever be more than 10^40, or 2^132, times faster than contemporary machines. These figures suggest that keys might need to be only 196 bits—a bit less than the 256-bit keylength of the Advanced Encryption Standard, recently announced by the National Institute of Standards and Technology.
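
The last two estimates follow the same pattern (a sketch; the square-rooting is the standard effect of quantum search on exhaustive key trial, and the 100- and 125-bit classical security levels fed to it here are illustrative assumptions):

```python
import math

# Lloyd's bound as quoted above: a laptop-sized computer can be at most
# 1e40 (about 2**132) times faster than today's machines.
print(math.log2(1e40))           # -> ~132.9 extra bits of attacker power
print(64 + 132)                  # -> 196-bit keys, as above

# Quantum search square-roots a 2**k exhaustive search to 2**(k/2), so
# holding an attacker to the same work factor doubles the key length.
# The 100- and 125-bit classical levels below are assumed targets.
def quantum_safe_bits(classical_bits):
    return 2 * classical_bits

print(quantum_safe_bits(100), quantum_safe_bits(125))   # -> 200 250
```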

Although these various estimates are in a sense fairly far apart, they are qualitatively quite close. Neither quantum computing nor any other foreseeable development seems to make a critical qualitative change in the problem: either reducing non-deterministic polynomial time to polynomial time, thus making all problems easy in the view of contemporary complexity theory, or giving us an oracle for the halting problem, thus uniting computational complexity with the theory of recursive functions.

We might also ask a deeper question about cryptography: What capabilities will be demanded of it in the future, and what will it offer in return? A prophet in the year 1001 would hardly have been likely to imagine shift-register sequences, public-key cryptography, or zero-knowledge proofs, so our hopes of predicting cryptographic requirements 1,000 years hence are none too bright. Nonetheless, some things do suggest themselves. In the 1950s, a new capability—spread spectrum, a form of communication that could be detected only by receivers in possession of secret information—was added to the world of secure communication. In 1970, it was joined by a proposal for communications that could be detected but not received without secret information. (Unfortunately, although this field—quantum cryptography—has prospered, it has evolved away from its original promise and now provides an alternative to number theory as a basis for key management systems.) What basically new—and seemingly impossible—capabilities might we hope for?

Although spread-spectrum communications are difficult for those without the key to detect, anyone who has the key can locate a spread-spectrum transmitter through ordinary direction-finding techniques. One attractive addition to our toolkit would thus be a scheme for unlocatable communications. It seems, however, to require that the energy of the transmission be the same at every point in space, and thus infinite in total, which is impossible. But who knows? Another seeming impossibility is messages that can be understood but not copied, a capability whose development the copyright mavens will be happy to fund.

Perhaps the most fundamental question of all is: Will there be any kind of cryptography at all? Whether intelligence in 3001 is based on carbon, silicon, or strings, it may be much more closely organized than intelligent life today. Cryptography is a security measure suited to protecting information against independent intelligent opponents. Improved communications seem likely to bind the human species or its descendants into a much closer-knit organization, while the prospect that star travel will bring us into contact with new competitors is far from a certainty. In a highly integrated environment, there will still be security, but cryptography may not be an appropriate security mechanism.

Within our bodies, we find cryptography-like authentication in the immune system but not the encoding of messages for secrecy that constitutes the hard core of the subject. For this or one of myriad other reasons, cryptography may be merely a distant memory in 3001.


Figures

UF1 Figure. Using interactive ray tracing to explore a maximum-intensity projection of the Gigabyte Visible Female data set from the National Library of Medicine. Created by Steven G. Parker, Scientific Computing and Imaging Institute, University of Utah.
