April 20, 2017

Every now and then we see a flurry of news about quantum computing. But one recent piece rather focused the mind: a company claimed to have an encryption algorithm you will soon need, when quantum computers render today’s encryption algorithms useless.

Now to the skeptic, such a claim raises questions. Right away there is curiosity. What is quantum computing (Figure 1)? Is it real? If so, how does it work? And what does it have to do with cryptography? Then there are more personal questions. Is quantum computing going to force me to change my design practices? Am I going to have to learn this stuff?

**Figure 1. Even in artists’ renderings, quantum computing elements don’t look like anything from the digital hardware world.**

It turns out these are not easy questions to explore. The literature is mostly in one of two genres. The first form is aimed at general readers, and treats quantum mechanics rather like a satanic cult: dark, possibly dangerous, and you wouldn’t understand anyway. This makes it rather tricky to draw conclusions.

The second genre is entirely different but equally helpful, written by experts to impress other experts. This form is easily identified by several touchstones: Turing machines, Richard Feynman’s name, Hilbert space, and Hadamard gates, all mentioned within about 75 words, followed by a blizzard of equations with unfamiliar and unexplained nomenclature. Of course you remember what |0> means!

**Three Parallel Universes**

One reason this gets so complicated is that important contributions to quantum computing have come from three disciplines with very different vocabularies and interests. The idea began with theoretical physicists. Notably in 1980, physicist Paul Benioff of Argonne National Laboratory described how some quantum-mechanical effects could be used to implement a Turing machine. Two years later, iconic physicist Richard Feynman also raised the question of a computer using quantum behavior.

The thread was picked up by an entirely different group: computer scientists and mathematicians. Taking the basic ideas of a quantum bit (a qbit) and of reversible unitary transforms (which they called quantum gates, or qgates) from physics, computer scientists explored what kinds of computing you might be able to do if ideal qbits and qgates actually existed. They found cases where such theoretical computers could be far faster than conventional digital computers.

This result prompted experimental physicists—a third group entirely—to begin trying to build physical devices that could approximate ideal qbits and qgates. That has been a long, resource-eating pursuit that still has not proved that a useful quantum computer is physically possible. But it has offered tantalizing hints.

**Some Explanations**

So what is this theoretical computer, and why should we be interested? For clarity, let’s first eliminate some things that it is not. A quantum computer is not a conventional computer simulating quantum-mechanical phenomena. Nor is it a conventional digital computer built from some post-Moore’s-Law transistors so tiny that they store or switch individual quanta of energy.

Instead, quantum computers are machines based on unique behaviors predicted by quantum mechanics and utterly unlike the behavior of classical systems. One is the ability to constrain a particle or group of particles so that some aspect of it has only two discrete quantum base states—call them 0 and 1. (We will dispense with the funny-looking brackets here.) The aspect might be the spin of an electron, the polarization of a photon, or the charge on a quantum dot, for example.

Second, quantum computing depends on superposition—the counterintuitive ability of a quantity to be in a combination of both 0 and 1 base states simultaneously, until you measure it. Once you measure it, the state reverts to either 0 or 1, and all the rest of the information disappears. Quantum mechanics expresses the superposed state as a sum of the two base states, each multiplied by a complex coefficient, with the total magnitude always exactly 1. This can also be imagined as a unit vector starting at the origin and ending somewhere on a sphere called the Bloch Sphere (Figure 2). A key point here is that the squared magnitude of the complex coefficient on the 0 base state represents the probability that you will measure the qbit to be in the 0 base state, and likewise for the 1 base state. And when you make a measurement, you will always come up with either exactly a 0 state or exactly a 1 state.
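
The arithmetic here can be sketched in ordinary Python—the coefficient values below are made up for illustration and do not come from any real quantum library:

```python
import math

# A hypothetical single-qbit state: two complex coefficients, one for
# the 0 base state and one for the 1 base state.
alpha = complex(math.sqrt(0.25), 0)   # coefficient of the 0 base state
beta = complex(0, math.sqrt(0.75))    # coefficient of the 1 base state

# The squared magnitude of each coefficient is the probability of
# measuring that base state.
p0 = abs(alpha) ** 2   # probability of measuring 0 (here 0.25)
p1 = abs(beta) ** 2    # probability of measuring 1 (here 0.75)

# A valid superposition always has total magnitude 1, so the two
# probabilities must sum to exactly one.
assert abs((p0 + p1) - 1.0) < 1e-12
```

Any single measurement still returns only a 0 or a 1; the coefficients only tell you the odds.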

*Figure 2. The Bloch Sphere is one way to visualize quantum superposition in a qbit.*

This is important because it allows a qbit to be, conceptually, both a 0 and a 1 simultaneously. And hence a register made up of n qbits can simultaneously “contain” all the possible binary numbers n bits long. This allows a quantum computer to perform a single operation not on just one n-bit integer, but on all possible n-bit integers at once—very substantial parallelism as n gets large.
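
A minimal sketch of this bookkeeping, again in plain Python: an n-qbit register is described by 2^n complex coefficients, one per n-bit binary pattern, so making them all equal "loads" every n-bit number at once.

```python
# Describe an n-qbit register as a list of 2**n amplitudes, where index
# k holds the coefficient of the base state for the binary number k.
n = 3
dim = 2 ** n
amp = (1 / dim) ** 0.5      # equal coefficient for every base state
state = [amp] * dim         # all 2**n patterns "contained" at once

# Each n-bit pattern would be measured with probability 1/2**n, and the
# probabilities over all patterns still sum to 1.
probs = [abs(a) ** 2 for a in state]
assert abs(sum(probs) - 1.0) < 1e-12
```

Of course a classical list of 2^n numbers is exactly what a quantum register avoids storing—this sketch only shows why the parallelism grows so fast with n.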

Third, quantum computing depends on the ability of a qgate to change those coefficients—and hence your probability of measuring any particular number—in predictable ways. If you start out with all the coefficients in all the qbits equal, and then measure all the qbits in the register, you are equally likely to come up with any string of bits between all zeros and all ones, inclusive. But by running this initial state through a carefully designed sequence of qgates, the quantum computer can manipulate the coefficients so that the pattern you are most likely to measure at the output qbits represents the result of a computation—for instance, it might be highly probable that you will measure the bits of a number that is a perfect square.
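
One concrete one-input qgate often used to set up that equal starting state is the Hadamard gate mentioned earlier. As a sketch (simulating the 2x2 matrix in plain Python, not real hardware):

```python
import math

# The Hadamard gate as a 2x2 matrix acting on a qbit's two coefficients.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix into a 2-element coefficient vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

zero = [1.0, 0.0]                  # a qbit prepared in the pure 0 state
superposed = apply_gate(H, zero)   # now an equal mix of 0 and 1

# After the gate, both outcomes are equally likely: probability 0.5 each.
p0, p1 = (abs(a) ** 2 for a in superposed)
assert abs(p0 - 0.5) < 1e-12 and abs(p1 - 0.5) < 1e-12
```

A sequence of such gates is what steers the coefficients toward the answer you want to measure.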

**A Computer on Paper**

But what does all this have to do with real computing? To answer that, we have to turn our attention from the theoretical physicists to the computer scientists and mathematicians. To do useful work, we would have to be able to place a register of qbits into a known superposition of states. We would need qgates, maybe wires, and some kind of output.

All this is easy for computer scientists—they can just assume that these ideas exist in real life. They do have to make concessions to quantum mechanics, though. In order not to violate the laws of quantum physics, computer scientists must require that qgates be reversible—you can feed a result back into the output and recover the correct values at the input. And they must insist that the qgates be unitary transforms. Sparing the matrix algebra, this roughly means that when you put the state of a qbit through a qgate, the state you get out will certainly measure as either a 0 or a 1: the sum of the probabilities—the squared magnitudes of those coefficients—still has to be exactly one.
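
What "unitary" buys you can be checked in ordinary Python: the conjugate transpose of the gate matrix is its inverse, so probabilities are preserved and the gate can always be run backward. Here is a rough check using the Hadamard gate as the example:

```python
import math

# The Hadamard gate, a standard one-input qgate.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def matmul2(a, b):
    """Multiply two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(m):
    """Conjugate transpose of a 2x2 matrix."""
    return [[m[j][i].conjugate() for j in range(2)] for i in range(2)]

# Unitarity: H-dagger times H is the identity matrix, so no probability
# is created or destroyed, and H-dagger undoes H exactly.
prod = matmul2(dagger(H), H)
identity = [[1.0, 0.0], [0.0, 1.0]]
assert all(abs(prod[i][j] - identity[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```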

Notice that these qgates, even in theory, are very unlike normal Boolean logic gates. Most Boolean functions aren’t reversible, for instance. There is no way to infer the inputs to a NAND gate from its output unless the output happens to be 0. And of course logic gates only work with 1s and 0s, while the qgates work by rotating the vector inside the Bloch sphere. There really is no close analogy beyond the names.

The computer scientists worked out that a very small set of qgates is sufficient to emulate a Turing machine—just the set of one-input qgates and a single two-input qgate. The most often used example of the two-input qgate is the controlled-NOT (CNOT) gate. This reversible function either flips the vector state of a qbit or leaves it unchanged, depending on the state of a second qbit. It is roughly the quantum analog of an XOR gate.
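
Restricted to the four pure base states, the CNOT truth table can be sketched classically—real CNOT gates act on superpositions, but this shows the flip rule and the reversibility:

```python
# CNOT on pure base states, written as (control, target) bit pairs:
# the target is XORed with the control, the control passes through.
def cnot(control, target):
    return (control, target ^ control)

# Control 1 flips the target; control 0 leaves it alone.
assert cnot(1, 0) == (1, 1)
assert cnot(0, 1) == (0, 1)

# Reversibility: applying CNOT twice recovers the original input pair.
for c in (0, 1):
    for t in (0, 1):
        assert cnot(*cnot(c, t)) == (c, t)
```

Contrast this with NAND: knowing a NAND output of 1 tells you nothing definite about the inputs, so there is no way to run it backward.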

**Useful Work**

We still haven’t really answered the question of how you would use one of these things. The answer is that if you string enough qgates together in the right pattern, you can prepare the input qbits to represent all possible numbers in your domain of inputs, and at the output of the qgate array you can, in theory, measure the bits that represent values of some useful function.

An example might help. In 1994, mathematician Peter Shor, then at Bell Labs, developed an algorithm for factoring very large numbers using quantum subroutines. Factoring is a vital problem in applied math because there is no known analytical solution: the only way is trial and error, and you can only make the algorithm faster by choosing more cleverly the numbers to try. So when you make the input number very large, the amount of trial and error becomes enormous. This is the basis of RSA-like cryptography algorithms, not at all incidentally. RSA is hard to crack specifically because it is so hard to factor huge numbers, and elliptic-curve ciphers rest on a discrete-logarithm problem that is hard for similar reasons.
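
The classical baseline is nothing more than trial division. A minimal sketch—real factoring attacks are far cleverer about which candidates to try, but all of them still get painfully slower as N grows:

```python
# Classical factoring by trial and error: test candidate divisors in
# order until one divides n. The cost grows with the size of n, which
# is exactly what RSA-style ciphers rely on.
def smallest_factor(n):
    """Return the smallest prime factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # no divisor found: n itself is prime

assert smallest_factor(15) == 3
assert smallest_factor(91) == 7    # 91 = 7 x 13
assert smallest_factor(97) == 97   # 97 is prime
```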

Shor’s algorithm combined some traditional computing with two quantum functions that directly accelerate the trial-and-error part by, in effect, trying all possible numbers at the same time (Figure 3). One of these quantum functions performs modular exponentiation, and the other does a quantum version of a fast Fourier transform. For reasons only a mathematician could love, if we put in a set of n qbits prepared so that together they represent all the possible binary numbers up to length n, then in the qgates various superimposed states cancel each other—much like interference between two coherent light beams—and we are left with a distinct pattern of states in the output register.

*Figure 3. Shor’s algorithm depends on a core of quantum subroutines to do modulo exponentiation and FFT operations. (drawing courtesy of Tyson Williams)*

This pattern isn’t a prime factor—it is only an intermediate step that allows us to calculate a possible prime factor. That calculation is done by measuring the qbits—remembering that we are only likely, not certain, to measure the most probable state of each qbit—and then applying a lot of conventional computing in an ordinary CPU to produce the possible factor, and then testing to see if it is really a factor or a false result.
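
That conventional post-processing step can be sketched classically. In Shor's algorithm the quantum subroutines exist only to find the period r of a^x mod N; here we find r by brute force instead (the quantum machine's whole job), then the ordinary gcd step extracts the factors. N = 15 and a = 7 are illustrative values:

```python
import math

N, a = 15, 7   # toy-sized example: factor 15 using base 7

# The period r is the smallest r > 0 with a**r mod N == 1. Finding r is
# the part a quantum computer would accelerate; brute force stands in.
r = 1
while pow(a, r, N) != 1:
    r += 1
assert r % 2 == 0   # the method needs an even period; else retry with new a

# Candidate factors come from gcd(a**(r/2) +/- 1, N). These can still be
# trivial (1 or N), in which case the whole run is repeated.
f1 = math.gcd(pow(a, r // 2) - 1, N)
f2 = math.gcd(pow(a, r // 2) + 1, N)
assert {f1, f2} == {3, 5}   # 15 = 3 x 5
```

This is also where the "test to see if it is really a factor" step lives: a simple modulo check on an ordinary CPU either confirms the result or sends you back for another quantum run.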

All of this may sound hopelessly complex and infeasible. But the ability of the quantum exponentiation and FFT algorithms to work on all possible powers of 2 simultaneously to find the largest prime factor makes Shor’s algorithm faster than conventional computing for large numbers using even rather slow theoretical quantum subroutines.

Shor’s algorithm makes a dramatic example because it is both bafflingly unlike conventional computing and potentially enormously important. But it is not alone. The US National Institute of Standards and Technology (NIST) maintains a large library of quantum computing algorithms in its Quantum Algorithm Zoo, at math.nist.gov/quantum/zoo/.

Are these algorithms just math exercises? It is too early to tell for sure. But in practice, researchers have actually built laboratory quantum calculators with a few working qbits. These machines have successfully factored the number 15 (at IBM in 2001), unsurprisingly finding 3 and 5, and the current world’s record, 21 (done by a cross-institutional team in 2012). So for small numbers the idea works. Usefully large numbers will have to wait for machines with many more qbits. And that brings up the question of practicality.

**The Path to Realization**

In order to construct workable quantum computing devices, we have to achieve a number of implementation milestones. We have to build working qbits—not just five, but thousands. We have to figure out the quantum gates, and the equivalent of wires—unless we can make the gates act directly on the state in the input quantum register. All of these are huge undertakings, and the schedule for their achievement is unpredictable.

The challenges, unfortunately, come not so much from the novelty of the problems as from the laws of quantum mechanics and classical physics. Perhaps the foremost, and least familiar, is called decoherence. The job of a qbit is to hold a physical thing—an ion, a packet of photons, or an electron, perhaps—in place so that we can manipulate and eventually measure a quantized quantity like charge or spin. For that quantity to behave in a quantum way instead of a classical-physics way, we have to be able to restrict it to a superposition of the two pure base states we’ve been calling 0 and 1.

But the nature of quantum systems is to couple to things around them, vastly increasing the number of possible base states. Physicists call that blurring away from the pure states decoherence. An analogy might be a coherent laser beam in a waveguide striking an impurity and blurring from a superposition of two modes into completely incoherent light. The job of the physical qbit design is to stave off decoherence for as long as possible.

This means, in effect, that even a single qbit is a demanding laboratory apparatus, possibly involving lasers or RF transmitters, closely controlled electric and magnetic fields, exact dimensions, special materials, and maybe cryogenic cooling. Using it is essentially performing an elaborate experimental procedure. Even with all this work, today “as long as possible” is measured in tens of microseconds. So you have very little time to perform quantum computations before your qbits have lost their coherence. It is as if the information leaks out.

Today these limitations preclude large quantum registers or computations that require more than a few microseconds. But research is underway to implement far more extensive arrays of qbits and qgates using microelectronic fabrication techniques.

The work is itself somewhat incoherent, though, because there is no agreement on what physical phenomenon to use to hold the quantum state. There are qbit designs that quantize the polarization of photons, the charge on electrons trapped in quantum dots, the net spin of super-cooled trapped ions, the charge in a device called a transmon, and various other approaches.

The kind of qbit you choose will naturally determine how you implement quantum gates. For instance you might use the interaction of RF pulses with internal spins in trapped molecules, or the interaction of beam splitters with photon modes in waveguides. Clearly the subject lies deep within the boundaries of experimental physics. And as mentioned, implementation of qbits or qgates requires considerable external equipment, from digital logic to lasers or RF transmitters and antennas to cryogenic coolers.

Qbit implementation will also dictate how you measure the state of the qbit. You may need a super-sensitive photometer or bolometer, a resistance bridge, or some other incredibly sensitive device to measure the qbits and resolve the superposition state into a base state. And this process of measuring the qbit state brings up another issue unfamiliar to conventional computing: getting the wrong answer.

**Doubt**

There are two main kinds of problems with extracting a base state from a qbit. The first is that you are measuring a quantum superposition, not a classical quantity. Assuming the qbit has remained coherent, you will get one or the other of the base states, but you can’t be sure which: you can only be sure that the probability of your getting a particular state will be the square of the coefficient of that state in the superposition. If you measure a qbit in exactly the same state a hundred times, you should expect to get a distribution of 0s and 1s that converges toward the squares of the coefficients.
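
This statistical behavior is easy to simulate classically. The sketch below repeatedly "measures" a qbit whose squared coefficients give probability 0.25 for the 0 state and 0.75 for the 1 state; each trial yields exactly a 0 or a 1, and only the accumulated statistics reveal the superposition:

```python
import random

random.seed(42)    # fixed seed so the run is repeatable

p0 = 0.25          # squared coefficient of the 0 base state
trials = 100_000

# Each "measurement" collapses to a definite 0 or 1; a single trial
# tells you nothing certain about the underlying coefficients.
zeros = sum(1 for _ in range(trials) if random.random() < p0)

# Over many trials, the observed fraction of 0s converges toward p0.
assert abs(zeros / trials - p0) < 0.01
```

This is why a single run of a quantum computation is never conclusive: you are sampling from a distribution, not reading a register.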

So you don’t know that the base state you measured on any given try is the one that had the highest probability. After you have read out a quantum output register by measuring the bits—thereby setting them all to base states—you have three choices. You can bet that you have the correct answer and go ahead. You can check through conventional computing, as Shor’s algorithm does, to see if the number you read is in fact a valid solution. Or you can repeat the computation a large number of times, either sequentially or in parallel, and take the most frequent result. It is also possible to design your computation so that the answer you want is the probability distribution of the base states rather than a particular binary number. In that case too, repetition is in order.

That is true even in a theoretical perfect quantum computer. But in a real implementation, there is another issue: good old classical noise. Even if everything goes well, with no qbits decohering and the computation designed to resolve the answer you want with very high probability, you are still observing the qbits by attempting to measure very, very small physical quantities. Noise happens. Once again, the only solution is either to detect an error by further calculation, or to perform the computation so many times that you are willing to accept any remaining uncertainty in the result. The concept of a guaranteed right answer is rather foreign to quantum computing.

If this is not painting a rosy picture of the future of quantum computing, take heart. Discovery is under way to find the best choice for qbits—though the answer may turn out to be algorithm-dependent. Microelectronics folk are working on miniaturization of quantum components through new materials and structures—allowing very large arrays of quantum computing devices that could be mass-produced like CPU chips. Computer scientists are developing the equivalent of assemblers and compilers that can convert an algorithm into an arrangement of quantum registers and qgates in a particular technology.

Is it worth it? Here is one data point. Shor estimated that a modest hybrid quantum/conventional computer could crack a powerful RSA cipher faster than a conventional computer could encrypt it. There have been similar results projected for tasks like sorting, and untangling other similarly hard math-based problems. So yes, there is enough promise here that researchers will keep working. But it might not be good to hold your breath.

CATEGORIES: Computing / AUTHOR: Ron Wilson

**For Further Reading**

For comparison purposes, explore an FFT IP block for FPGAs.

## 20 comments to “Who Cares About Quantum Computing?”

Thanks for a well written introduction to quantum computing for non-physicists. I was certainly skeptical when I saw that an FPGA guy was going to try to explain QC, but I was pleasantly surprised.

As an engineer who has worked on error correction for quantum computing, getting started in this subject required a lot of head-scratching. I did a LOT of reading of the two types of articles you mention, and it took a long time to piece together a working understanding. Your description will help speed up this discovery process for newbies trying to wrap their heads around this fascinating subject.

Ron,

Thank you for advancing my quantum understanding, at least a little. Good job. I wholeheartedly agree with your assessment of the current state of the literature.

I’ve just finished “Decoding the Universe” by Charles Seife. It’s a great read, but definitely towards the pop-sci end of the literature spectrum. And it’s ten years old. Does anyone know of a similar, but more current, book? And, does anyone know of a book aimed at someone with a good understanding of college-level physics and computer science? If not, please start writing.

dB

I put my life experience in a book titled “New Age Quantum Physics.” It is in two parts. The first part chronicles the development of quantum physics starting over 2000 years ago. Then the book attempts to leverage that material into understanding what Quantum Physics is really about. It is available on Amazon. Well, you asked.

My first point is that the physics community does not understand quantum mechanics.

Therefore, I doubt a quantum computer means anything.

Second, quantum computing depends on something called collapse.

This, again, is a totally misunderstood phenomenon.

The major error in all of this is that the physics community hangs their hat on the idea that the universe is made out of small hard balls that interact in strange ways. Once they realize that particles are not hard balls, they will make progress.

In essence when a quantum field collapses it appears to mathematically be in an infinite number of combinations. In using this to do calculations, any question has all the answers. So it is true that the answers appear instantaneously. The problem is locating the right one of the bunch. I have a physics degree and have been studying quantum physics all my life. I am old so if you wish you can just claim this note is the mutterings of an old man.

Please connect with me Al!

Excellent introduction, thank you!

A good introduction. I think it’s a good idea to note two of Einstein’s quotes to see why some people struggle with quantum theory:

The more successful quantum theory gets the stupider it looks.

Since mathematicians seized the theory of relativity I no longer understand it.

You must bear in mind that quantum theory is only a mathematical model and is no more true than 3×3=9 or 100+111=1011, it’s just useful definitions and conventions.

Theorizing is, after all, the process of complicating the obvious !

It should be noted that Albert Einstein, Max Planck, and David Bohm held that the indeterminism or uncertainty is an epistemological problem of human ignorance. Quantum events have natural causes and follow determined laws, but the causes have not yet been discovered. So the quantum indeterminism is not ontological, it is epistemic: not describing the way things actually are, but instead reflecting our current limited knowledge and our perception of how things exist. Now the research may turn out to produce a new mathematical or computational approach to predictability resulting in a new approach to computing, but turn out to be entirely mistaken insofar as the ontology of physical phenomena is concerned. But again, it may all be for naught simply because it doesn’t really reflect anything but the limitation of human knowledge at this time about nature itself.

Physicists agree on the mathematics of quantum theory, but disagree on the interpretation. The approach used in this research is based on the interpretation of the Copenhagen theorists, who like Heisenberg, interpreted the “ghost in the box” as ontological indeterminacy characterizing the material universe at the most fundamental level believing that there are no natural causes or laws that determine the quantum behavior. It is not merely an epistemic limitation on knowledge, but fundamentally describes the uncertainty of the way things are at the quantum level. Not a temporary epistemic problem but reflects an inherent indeterminacy of the physical world itself.

Whether one finds a place for causality or determinism in the quantum world depends on one’s philosophical understanding of causality. Thus if there is no indeterminism in the physical world, all this research might produce productive computational models but not reflect anything that actually exists. A lot of time, money, and research is potentially grounded on the epistemic limitation of what physicists know about the laws that ontologically govern what only appears to be indeterminism because of an interpretation and a limitation of knowledge. The entire effort is built on a philosophical presupposition: that the data actually reflects the way the physical world is at the quantum level.

The belief that quantum events are uncaused is not grounded in science but in philosophy. When science reaches the limits of what it can measure, all it can say scientifically, in accordance with its method, is that it has reached the limits of what it can measure. Now it may become generally accepted that what is unmeasured is uncaused or unreal, but only if one assumes that causes must always be measurable or that reality must always be quantifiable. These are philosophical assumptions, however, not findings of empirical science. They are philosophical beliefs, if you will.

So, we must wait to see if a) all this research produces anything viable and tangible, b) the philosophical understanding of indeterminism where some x can be both 0 and 1 simultaneously is simply erroneous or theoretical nonsense based merely on mathematical models, and c) the presupposition that there is no natural deterministic cause for what merely appears to be indeterminate is really simply a fallacy, argumentum ad ignorantiam.

Argument from ignorance (from Latin: argumentum ad ignorantiam), also known as appeal to ignorance (in which ignorance represents “a lack of contrary evidence”), is a fallacy in informal logic. It asserts that a proposition is true because it has not yet been proven false (or vice versa). … unknown between true or false.

Excellent article. A few gaps but very well written simplifying a complex subject.

What can you do with quantum computing?

Quick answer: all the things listed in the Quantum Algorithm Zoo (http://math.nist.gov/quantum/zoo/). But these things may not be obviously useful if you aren’t an applied mathematician. Most proposed applications seem to use a combination of conventional computing and quantum computing.

ron

I’m just starting with Quantum Cryptography and it was a really nice article that appeared as newsletter in my mailbox in just the right time. I’m looking forward to more articles about Quantum Computing from Intel FPGA 🙂

Gnanam Elumalai: An excellent article demystifying some of the steps after superposition.

Excellent article, I understood the concept of quantum computing. Thank you

Great article, Ron!

Regarding “And this process of measuring the qbit state brings up another issue unfamiliar to conventional computing: getting the wrong answer”.

When manufacturers learn to get a stable result of QC, one should not forget and use this property in chips for parallel automatic generation of FALSE electromagnetic noise, which radiates any computer. Then passwords, encryption and other methods of protecting information will really make sense …

To answer David’s question, try “Schrodinger’s Killer App,” by Jonathan P. Dowling.

A very good White Paper on the risk quantum computing poses to security systems and quantum-safe approaches.

http://www.etsi.org/images/files/ETSIWhitePapers/QuantumSafeWhitepaper.pdf

Thank you for your excellent article. As an electronic engineer I read several books on the subject and I got immediately bored by the involved math, which isn’t in general linked to real applications, much less to the possible use of QC in a real environment. Perfect and elegant theories but – at least from my point of view – totally useless except to the mathematicians. Moreover I can’t figure out how FPGAs can be linked to QC, at least in the foreseeable future. In any case I agree that electronic engineers should be aware of the subject and, as in my case, ready to explain to their university students what it is all about. But engineers are practical people who must solve real problems, design real systems and... make them work. May I suspect that QC in this respect will probably be totally useless for general industry and of interest only to research institutions?

Excellent article Ron. What are your thoughts about quantum entanglement?

I share the same skepticism about the practicality and usefulness of QC, at least in the near future, with both T. Smith and Giovani Neri. I really enjoyed reading both comments, especially T. Smith’s.

This article is timely since I teach QC in a graduate course, “Advanced Computer Architecture”, from high-level perspective covering: a) the history and motivations, b) current status, systems, and applications, and c) challenges and future trends. It is an attempt to keep our students here at the University of Kansas (KU) up-to-date with the latest technological trends. The response has been very positive but lots of questions remain unanswered.

One of these fundamental issues is that decoherence of quantum objects could be an essential property/cause for separating the measurable effects of the sub-atomic micro-world from those of our daily-life macro-world. How could this essential property be eliminated or at least minimized and yet achieve some quantum usefulness, or even claim quantumness?

I would also like to hear all your thoughts about the quantumness of the D-Wave machine (https://www.dwavesys.com), which is claimed to be the first and only existing quantum machine.

Esam, thank you for the kind words. My thought about entanglement was that, while it is an important issue in QC, if I even mentioned it in the article I would myself become hopelessly entangled. Is there such a thing as a quantum rat-hole one could be absorbed into? I am told that entanglement is important to some algorithms, but I didn’t feel I had the time or resources to pursue the subject further, so I left it out. (Along with a lot of other things I still don’t understand.)

I think you make another very interesting point. We mostly assume that making useful QCs is just a matter of successive refinement of the implementation, such as we have come to expect from microelectronics. But that may not be true. Might it turn out that there is a fundamental principle, perhaps as you say having to do with decoherence, that prevents QCs from reaching the scale necessary to solve real problems?

ron

Thanks Ron for your response.

Regarding entanglement and the currently available systems, I mentioned the D-Wave as the machine claimed to be the “FIRST and ONLY” quantum computer; here are some more claims from the Chinese.

Last year, the Chinese launched their QC satellite that utilizes entanglement for providing “Hack-Proof” quantum communication and claimed some success there (how verifiable this is, I don’t know!):

http://www.space.com/33760-china-launches-quantum-communications-satellite.html

A couple of days ago, the Chinese are also claiming the “FIRST and ONLY” QC based on quantum dots:

http://wallstreetpit.com/113417-worlds-first-quantum-computer/

I guess there will be more to come.

However, I still believe that decoherence is an essential property based on which our sensible macro-world phenomena are distinguishable from those quantum phenomena of the micro-world. Therefore, scaling to real-life usefulness would in my opinion remain a fundamental problem. I think we, or at least QC enthusiasts, need to be more modest and realistic in our claims.