New York Times science writer Johnson (Strange Beauty, 1999, etc.) explains why quantum computers are expected to be the next major breakthrough.
The author begins by recalling his youthful disappointment when he received a build-it-yourself computer and discovered how simple it was. But that anticlimax revealed a central truth: all digital computers are in essence bundles of on-off switches. The logical destination of the trend toward miniaturization is a computer in which each switch is a single atom. There is more to the quantum computer, however, than mere compactness, as Johnson makes clear in a quick summary of quantum theory. The beauty of the “qubit” (as scientists have dubbed the quantum bit) is that it can exist in superposed states: not just “on” or “off,” but both at once. Thus, all 1,024 numbers from 0 through 1,023 can be represented at once by ten quantum switches. Put into practice, this capability enables a stunning increase in speed, essential for tackling such problems as the factoring of very large numbers, on whose difficulty modern cryptography depends. Johnson spends some time examining ways in which the simple switches that are the basis of computers could be built from quantum parts. He doesn't minimize the difficulties of the task. To give just one example, capturing atoms (or molecules, or subatomic particles) and training them to act as switches requires cooling them to near absolute zero, impractical for desktop applications. Nor are the qubits anywhere near as stable as one would like, with a few seconds the best working lifetime so far achieved. Still, the potential of the nascent technology is fascinating, and if successful, its development is likely to be one of the most closely watched scientific stories of the new century.
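The claim that ten quantum switches can hold 1,024 numbers at once can be made concrete with a small simulation. The following sketch (not from the book; a standard textbook illustration using NumPy) builds the state vector of a ten-qubit register and applies a Hadamard gate to every qubit, producing an equal superposition over all 2^10 bit patterns:

```python
import numpy as np

n = 10
dim = 2 ** n  # 1024 basis states for ten "quantum switches"

# A register of n qubits is described by a vector of 2**n complex
# amplitudes, one per bit pattern. Start in the state |00...0>.
state = np.zeros(dim)
state[0] = 1.0

# The Hadamard gate puts a single qubit into an equal mix of 0 and 1;
# its n-fold tensor (Kronecker) product applies it to every qubit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
gate = H
for _ in range(n - 1):
    gate = np.kron(gate, H)

state = gate @ state

print(state.shape)  # (1024,) -- one amplitude per number 0..1023
print(np.allclose(state, 1 / np.sqrt(dim)))  # True: all weighted equally
```

The point of the illustration is the reviewer's: a classical ten-switch register holds one of these 1,024 patterns at a time, whereas the quantum register carries an amplitude for every pattern simultaneously, which is what quantum algorithms such as Shor's factoring method exploit.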
A tantalizing glimpse of how the uncertainties of quantum theory may yet be tamed for work of the highest precision.