Thought you all might like to read this article by a co-worker of mine:
--
By Gerald Gilbert
After the great chess master Garry Kasparov was defeated by IBM's Deep Blue computing machine in a famous match in May 1997, he displayed genuine amazement (and perfectly human anger) at the outcome. Amazing as this outcome was, it is even more amazing when one considers that Deep Blue, and indeed all computing machines in use today, ranging from palm devices to supercomputers, are in a sense glorified abacuses!
How can this be? The information that is manipulated, stored, and transformed in both a massively parallel electronic computer and in a wooden abacus is classical information. Although a modern computer is mind-numbingly fast compared to an abacus, the underlying principles of information science that govern both machines are identical: the physical states that represent the information, the dynamic evolution of those states, and therefore the information itself, are all described by the equations of classical mechanics.
The paradigm for information science is, however, beginning to change. "All things must pass," opined George Harrison in his pretty song of that title, and in a sense, this is just what is occurring today at the scientific frontier defined by the intersection of physics, engineering, and computer science. Worldwide research over the last few years has explored the possibilities for hardware and software development inherent in the explicit use of the physical properties of quantum mechanical systems, such as atoms and their components.
Emerging from the work is the nascent concept of a quantum computer. Although current computers incorporate semiconductors and thus rely upon quantum effects in their underlying technology, their design principles are purely classical. The memory of a computing machine is composed of some number of registers, which can be thought of as boxes that house bit values. A given classical bit can describe precisely one or the other of two exclusive information states, usually referred to as 0 and 1. At any particular time, an n-bit register in a classical computer can describe a maximum of n bit values. After all, how could a bit represent both a 0 and a 1 at the same time?
Actually, it can, if we consider the terrain of the very small, where the rules of quantum mechanics hold sway. If we imagine making use of elementary particles, such as electrons, or perhaps the nuclei of atoms, as the building blocks of computer memories, we find a new realm of possibilities. The analogue of an n-bit register in a quantum computer represents all of the 2^n information states that are potentially associated with n distinct classical bits. This behavior of quantum information is so different from that of classical information that we distinguish it with a new concept, that of a quantum bit (or qubit, as it has come to be called), the generalization of a classical bit. At any instant, the quantum computer is in a linear superposition of all its possible bit patterns. If we can eventually build one, such a machine will be no abacus. It will not even be the mother of all abacuses. It will represent a new world of computing, a terra incognita the outlines of which we are only beginning to understand and explore today.
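The 2^n count can be made concrete with a short classical simulation (my sketch, not the author's; the register size n = 3 and the equal-superposition state are illustrative assumptions):

```python
# Illustrative sketch: a classical simulation of an n-qubit register as a
# table of 2**n complex amplitudes, one per bit pattern. A classical n-bit
# register holds exactly one of these patterns at a time; the quantum
# register carries an amplitude for every one of them simultaneously.
import itertools

n = 3  # number of qubits (assumed for illustration)

# An equal superposition: every one of the 2**n bit patterns gets the same
# amplitude, normalized so the squared magnitudes sum to 1.
amplitude = 1 / (2 ** n) ** 0.5
state = {"".join(bits): amplitude
         for bits in itertools.product("01", repeat=n)}

print(len(state))  # 2**n = 8 basis states tracked at once
```

Note how quickly this classical bookkeeping grows: doubling n squares the size of the table, which is precisely why simulating quantum registers classically becomes intractable.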
Quantum computers are being developed within the context of quantum information science, an emerging interdisciplinary field in which the basic, underlying principles and possibilities of the science and technology of information are being discovered and applied. The essence of the new field is captured in the statement "information is physical": the data with which we work are encoded in the states of some physical objects. As such, the processes and paths that the data can participate in and follow are dictated by the laws of physics, which at their root are provided by quantum mechanics. Quantum mechanics underlies all current physical theory. The Standard Model (Weinberg-Salam model) of elementary particles, which correctly accounts for three of the four fundamental forces of nature (the electromagnetic force and the strong and weak nuclear forces), is a quantum field theory, i.e., a mathematically consistent combination of quantum mechanics and special relativity. String theory, a promising approach to describing the remaining fundamental force (gravity), unaccounted for in the Standard Model, along with the other three forces, is a quantum mechanical generalization of Einstein's general theory of relativity. Although many have looked, no experimental contradictions to the predictions of quantum mechanics have been discovered thus far, and we have had 75 years during which to discover them.
So there is a deep and fundamental reason to carry out research in quantum information science: the experimental evidence tells us that nature is not really classical, but is instead quantum mechanical. This fact alone impels one to think about the possibility of computing, and communicating, with quantum mechanics, and brings to mind other questions that we will address below. However, there are purely practical motivations as well. It is already known that there are certain classes of computational problems for which one can obtain solutions much more efficiently with quantum computers than with classical computers. In some cases one obtains a quadratic speedup (Lov Grover's quantum algorithm for searching a database), and in other cases an exponential speedup (Peter Shor's algorithm for period-finding and factoring); the latter, in particular, permits the solution of problems that are effectively impossible to solve with classical computers. Considering the application of quantum information more generally to communications tasks, we already know that under certain circumstances it is possible to increase the capacity of a communications channel beyond what classical information theory allows; this is called quantum superdense coding. In other circumstances it is possible to carry out a task that has no classical counterpart and is otherwise physically impossible. One such task is quantum cryptography, which allows unconditionally secret communications between remotely located parties (see Pushing the Frontier of Science).
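To make the period-finding connection concrete, here is a hedged sketch of the classical number theory at the core of Shor's algorithm. The quantum speedup lies entirely in finding the period efficiently; this sketch finds it by brute force, which is exactly the step that classical machines do slowly. The function names and the tiny example (N = 15) are mine, chosen only for illustration:

```python
# Factoring via period-finding: the reduction Shor's algorithm exploits.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1. Brute force stands in here for
    the quantum subroutine, which finds r exponentially faster."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Recover a nontrivial factor of N from the period of a mod N.
    Works when the period r is even and a**(r//2) is not -1 mod N;
    returns None otherwise (one then retries with a different a)."""
    r = find_period(a, N)
    if r % 2:
        return None
    y = pow(a, r // 2, N)
    f = gcd(y - 1, N)
    return f if 1 < f < N else None

print(shor_factor(15, 2))  # period of 2 mod 15 is 4, yielding the factor 3
```

The base a must be chosen coprime to N; for bad choices the routine simply fails and is retried, which is also how the full quantum algorithm proceeds.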
Among the possible uses to which quantum computers might be applied, special attention (not to mention substantial, and growing, funding) has been directed to the subject of cryptanalysis, the science and art of breaking codes. Immediately upon Peter Shor's discovery of an algorithm for period-finding and factoring, it became apparent that quantum computers represent both an opportunity and a specter: much of current cryptography will be rendered useless if and when powerful quantum computers are built. The development of powerful quantum computers would pose a distinct threat to all communications and other information processing systems that rely in an important way for their cryptographic security on public key cryptography methods.
It is important to note here that all of the known methods of public key cryptography are vulnerable to quantum computers. This includes methods based on the difficulty of factoring very large numbers, exemplified by Rivest-Shamir-Adleman (RSA) encryption, as well as methods based on elliptic curves. The impact of this vulnerability would extend across the military, economic, and diplomatic domains precisely to the extent that public key cryptography is embedded in the various systems employed in these domains. It is now an official, announced policy of the Department of Defense (DOD) to embed and implement public key cryptography very widely throughout many important DOD information and communication systems.
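As an illustration of what is at stake, here is textbook RSA with deliberately tiny, insecure numbers (the specific primes, exponents, and message are a standard toy example, not anything from the article). The private exponent d is computed directly from the factors of N, which is exactly what Shor's algorithm would hand an adversary:

```python
# Toy RSA (wholly insecure, for illustration only): the public key (N, e)
# protects messages only as long as N's factorization stays hidden.
p, q = 61, 53                       # the secret primes
N = p * q                           # public modulus: 3233
e = 17                              # public exponent, coprime to (p-1)*(q-1)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, derived from the factors

m = 65                              # a message, encoded as a number < N
c = pow(m, e, N)                    # anyone can encrypt with (N, e)
print(pow(c, d, N))                 # only the factor-holder decrypts: prints 65
```

Anyone who can factor N = 3233 back into 61 and 53 recomputes d in one line, which is why efficient factoring at scale would unravel every system built on this scheme.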
Adversaries equipped with powerful quantum computers who are also able to intercept the relevant information should be able to exploit this. Any intercepted ciphertext that an adversary stores today for future analysis will likewise become transparent, if and when powerful quantum computers become available, wherever it has been protected by public key cryptography. Current doctrine, concept-of-operations, order-of-battle information, and the like for future military operations, to the extent that they are protected today using public key encryption, should be considered just as vulnerable to decryption by a quantum-equipped adversary as transmissions intercepted after quantum computers eventually appear.
Quite apart from these and other possible practical applications of quantum computing, we must confront quantum information science in any event.
The developmental trend in classical computer design and manufacture discerned by Gordon Moore, universally referred to as Moore's Law, consists in the observation that the memory capacity of integrated circuit chips doubles roughly every 18 months for a given chip size. The implication of this observation is that every 18 months only half as many atoms are needed as before to store a bit of information.
As a consequence of Moore's Law, classical computing will inevitably come into direct conflict with the basic physical laws of nature. If current trends in miniaturization and energy efficiency continue as they have during the period that provided the basis for Moore's Law, computer technology must reach the quantum level at some time between 2010 and 2020. At that point individual transistors will have shrunk to the mesoscopic scale (about 10 angstroms, or 1 nanometer). Large-scale quantum fluctuations will then set in across transistors, circuits, and switching units, replacing deterministic values with quantum probabilistically distributed ones; every aspect of computer operation will be dominated by quantum mechanical effects, and computer design will have to be modified accordingly. Our knowledge of the physics of quantum mechanical systems leads us to the conclusion that the design principles of classical computers must then be replaced with the new design principles of quantum mechanical computing machines.
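The extrapolation can be sketched numerically. If memory capacity doubles every 18 months at fixed chip area, the linear size of a storage element shrinks by a factor of the square root of 2 per doubling. Starting from an assumed 180-nanometer feature size around the year 2000 (my figure for illustration, not the article's), the 1-nanometer mesoscopic scale arrives in the early 2020s, just past the upper end of the article's 2010-2020 window:

```python
# Back-of-the-envelope extrapolation (illustrative assumptions throughout):
# a doubling of density at fixed area means each storage element's linear
# dimension shrinks by sqrt(2) every 18 months.
feature_nm = 180.0   # assumed process scale circa 2000, for illustration
year = 2000.0

while feature_nm > 1.0:      # 1 nm: the mesoscopic scale named in the text
    feature_nm /= 2 ** 0.5   # area halves, so length shrinks by sqrt(2)
    year += 1.5              # one doubling every 18 months

print(int(year))  # prints 2022 under these assumptions
```

The exact year depends entirely on the assumed starting point and rate; the robust conclusion is only that a steady exponential shrink hits atomic dimensions within a couple of decades.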