From the article:

**Hit: Quantum Computing Goes Commercial**

In May, D-Wave Systems sold the world’s first quantum computer. The buyer was Lockheed Martin Corporation, which did not disclose how it intends to use the machine. The system, named D-Wave One, employs a 128-qubit chip called Rainier and uses superconducting technology to perform “adiabatic quantum computing” (which some claim is not true quantum computing). The cost of the system was not disclosed, but this is undoubtedly one of those cases in which, if you have to ask, you probably can’t afford it.

It still bothers me (marginally) that the claim that AQC is not “true quantum computing” festers in the collective consciousness. One of the things I’ve learned over the past ten years is that dogma is extremely difficult to dislodge. Opinions and beliefs have tremendous inertia — even ones that are wrong and/or harmful.

I think the gate model of quantum computing set back the field of actually building real quantum computers by 20 years or so. I can imagine a parallel universe where the ideas of experimental condensed matter physicists drove the underlying theory of quantum computation, instead of theoretical computer scientists and mathematicians. In that parallel universe, by now we’d likely have dozens of real working quantum computers of all sorts of types. The main problem with the gate model is that, while it is beautiful for theoretical computer scientists, it is astronomically horrible from the implementation side. Somehow we got into a situation where experimental physicists (i.e., the implementers) bought the story that the gate model was “real” quantum computing and other ideas weren’t.

Someone (I think maybe it was Eric) had a classic line that I sometimes think about when this subject comes up. When questioned about whether what we’ve built was a “real” quantum computer, he said “How about we race our 25,000 Josephson junction superconducting adiabatic quantum computer against your powerpoint deck [editor's comment: powerpoint deck == most advanced gate model quantum computer ever built] and see who wins.”

The point is that no gate model quantum computer has ever been built. I have speculated for some time that no useful gate model quantum computer will *ever* be built, because of a long list of interrelated challenges that no one — even though lots of smart people have tried — has even the faintest notion of how to solve.

So do you think we’ll have dozens of different types of quantum computers in the next 20 years?

(or will the dwave architecture become the “IBM-PC” of quantum computing?)

@nn: Our objective is to provide to our customers and partners the fastest and most efficient computing systems on earth. If other computing systems (quantum or not) are developed, we’ll do our best to make sure that our gear obliterates them.

It is true that the article was in error to cast doubt on adiabatic quantum computing as “true quantum computing.” There is no doubt in the theoretical computer science community that adiabatic quantum computing (just like topological quantum computing using universal anyons, or measurement-based computing on cluster states, or others) is universal for BQP, and is therefore fully equivalent to the gate model and deserving to be called true quantum computation.

There is a far more serious error in the article, however, when it states that the D-Wave One accomplishes adiabatic quantum computing, which it does not. D-Wave may yet build a true (i.e. scalable and universal) adiabatic quantum computer if it is able to implement tunable ZZ (or YY) coupling in addition to XX, and is able to improve decoherence times enough to remain in the exact ground state avoiding thermal transitions between states while evolving slowly enough to avoid Landau-Zener transitions. The first group to build a true adiabatic quantum computer, whoever it is, will be able to translate and run any quantum algorithm from any model of quantum computation, including for example Shor’s algorithm to break RSA. Whether the hardware implements the gate model or some other universal quantum architecture such as adiabatic quantum computing is irrelevant; any true quantum computer can factor large composite numbers, and any machine that cannot factor large composite numbers is not a quantum computer.
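For reference, the textbook adiabatic-evolution setup behind these conditions can be sketched as follows (a generic schematic, not D-Wave’s specific Hamiltonian):

```latex
% Interpolate from an easy driver Hamiltonian H_B to the problem
% Hamiltonian H_P over total annealing time T:
H(s) = (1 - s)\, H_B + s\, H_P, \qquad s = t/T \in [0, 1]
% The adiabatic theorem keeps the system in the instantaneous ground
% state provided T is large compared with the inverse square of the
% minimum spectral gap \Delta_{\min} along the path:
T \gg \frac{\max_s \lVert \partial_s H(s) \rVert}{\Delta_{\min}^{2}}
```

Evolving too fast relative to this bound is the Landau–Zener failure mode the comment describes, while thermal excitation out of the ground state is the finite-temperature failure mode.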

Recent progress has convinced me that the D-Wave’s machines do indeed employ quantum and not just classical effects. But that does not make any of them a “true quantum computer” any more than a bank of rectifiers is made a “true computer” from the fact that it depends vitally on electronic components (diodes) with a non-linear response, which is also the key to the operation of transistors. The accurate statement would be something like: “The D-Wave One may not be a quantum computer, but it is a powerful optimization engine, of a type never built before, that depends crucially on quantum tunneling for its correct operation.” Or, as a soundbite: “It’s not a quantum computer, but it is a new kind of computer that exploits quantum effects.” I’d also accept “quantum annealing machine” and might even go so far as to countenance “special-purpose adiabatic quantum calculator”.

The progress in superconducting circuits made by your group and others has been very impressive, to the point that if I had to guess I would now predict that the first scalable architecture for universal quantum computation that ever exists will consist of a network of Josephson junctions. (My favored scenario is a sort of quantum metamaterial: a repeating pattern of loops whose interactions define a local Hamiltonian, with a spectral gap wider than the operating temperature, whose elementary excitations have the statistics of Fibonacci anyons.) Whatever progress has been made, though, the day of the true quantum computer is not yet here, and it does the field no service to claim it prematurely in the public media.

I’ve argued that computers built to run a specific quantum algorithm (like ours) should be referred to as quantum computers. Ultimately this just boils down to what you define as a quantum computer — the definition I prefer is related to the previous point. If a machine can run a quantum algorithm then it deserves to be called a quantum computer.

While universal quantum computers are quite interesting for a variety of reasons, practically there isn’t a whole lot of reason to try to implement one right now. There is an enormous gap between the difficulty of building a practical quantum computer that implements quantum annealing for optimization and that of building a practical machine for, say, generic quantum simulation (which seems to be the only commercially useful application of a universal QC identified so far).
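To make “quantum annealing for optimization” concrete: the problems such a machine targets are typically posed as Ising-model energy minimization. A minimal classical sketch of the objective (the instance and names are my own, and exhaustive search stands in for the annealer):

```python
# Toy illustration: quantum annealing targets Ising minimization,
#   E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j,   s_i in {-1, +1}.
from itertools import product

def ising_energy(s, h, J):
    e = sum(h[i] * s[i] for i in range(len(s)))
    e += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e

def brute_force_ground_state(h, J):
    # Exhaustive search over all 2^n spin configurations (fine for tiny n).
    return min(product((-1, 1), repeat=len(h)),
               key=lambda s: ising_energy(s, h, J))

# A 3-spin instance: ferromagnetic couplings favor aligned spins, and the
# field on spin 0 picks the overall orientation.
h = [-1.0, 0.0, 0.0]
J = {(0, 1): -1.0, (1, 2): -1.0}
print(brute_force_ground_state(h, J))  # -> (1, 1, 1)
```

An annealer attacks the same objective by physical relaxation rather than enumeration, which is why hardware qubit count maps directly onto problem size.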

By the way it may be possible to implement an efficient factoring algorithm using our system (not Shor’s — a different algorithm for factoring using quantum annealing) — we’ll see.
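The comment doesn’t say which annealing-friendly factoring formulation is meant, but one common illustrative reduction (not necessarily D-Wave’s) casts factoring N as minimizing the penalty (N − p·q)² over binary-encoded candidate factors. A classical sketch of that objective, with the search done by enumeration where an annealer would anneal:

```python
# Hypothetical sketch: factor N by minimizing (N - p*q)^2 over small odd
# candidates p, q. Cost 0 at the minimum means p*q = N exactly.
def factor_by_minimization(N, bits=4):
    best, best_cost = None, float("inf")
    for p in range(3, 2**bits, 2):        # odd candidates only
        for q in range(p, 2**bits, 2):
            cost = (N - p * q) ** 2
            if cost < best_cost:
                best, best_cost = (p, q), cost
    return best, best_cost

print(factor_by_minimization(35))  # -> ((5, 7), 0)
```

Whether such a landscape can be annealed efficiently at cryptographic sizes is exactly the open question the comment hedges on.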

We do know how to implement XZ couplers in our processor architecture, and have in fact designed and built some, but there is currently no compelling reason to try to build a universal AQC — there are simply no practically useful algorithms for one (except for optimization, and you don’t need XZ for that, at least not yet). If we had a uAQC now, the only thing we’d know how to use it for would be optimization.

While in principle you can map gate model algorithms into universal AQC, the overhead makes doing so impractical. In order for uAQC to be worth doing, algorithms solving valuable problems would need to be developed.

Another point to consider is that the T=0 version of quantum annealing is not as computationally powerful as the finite T version. Adding thermal transitions can substantially increase the success probability of quantum annealing algorithms.
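A classical caricature of that point (a toy with a made-up landscape, not D-Wave’s dynamics): zero-temperature greedy descent stays stuck in a local minimum, while finite-temperature Metropolis moves can hop the barrier.

```python
import math
import random

def energy(x):
    # Double well on integers 0..20: local minimum at x=3, global at x=15.
    wells = {3: -1.0, 15: -3.0}
    return min(abs(x - c) + d for c, d in wells.items())

def anneal(T, steps=5000, seed=0):
    rng = random.Random(seed)
    x = 3                                    # start in the local minimum
    for _ in range(steps):
        x_new = min(20, max(0, x + rng.choice((-1, 1))))
        dE = energy(x_new) - energy(x)
        # At T=0 only downhill moves are accepted; at T>0 uphill moves
        # are accepted with Metropolis probability exp(-dE/T).
        if dE <= 0 or (T > 0 and rng.random() < math.exp(-dE / T)):
            x = x_new
    return x

print(anneal(T=0.0))   # greedy descent stays trapped at x = 3
print(anneal(T=1.0))   # thermal hops can reach the deeper well near x = 15
```

The analogy is loose (real quantum annealing also tunnels through barriers rather than only hopping over them), but it shows why a strictly T=0 protocol can be the weaker one.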

I personally enjoy the comparisons of the D-Wave One system to the Altair 8800 and of the general QC community climate to the 1970s before the personal computing boom. Though I believe we are still far, far away from operating systems or quantum mobile apps, the enthusiasm in the field is just as genuine as when Paul Allen flew to New Mexico to test Altair BASIC. Naysayers and proponents of the gate model will come and go, but I love to see that D-Wave has a clear mission and a plan to continue developing processors with higher qubit counts. Like the lead horse in a race, keep running in stride; history is never made by those living in the past. Best wishes – Nashid

Doesn’t it bother you, Geordie, that you yourself don’t know how strong the decoherence in your quantum computer is? Even small decoherence raises the probability of a wrong result exponentially. Some people say that if your computer is classical, it can give almost as good an answer as classical annealing, so why do so many count on classical annealing? An analog computer with 100–1000 qubits/adders is worthless. Say you need to add 128 real numbers in double precision: for a desktop computer that is a very, very easy task, taking only about 0.0000001 s. It seems your computer can’t do anything more than that if it is just some super-analog computer. There is also latency in reading out the answer, limited by the speed of light, because the Intel chip is not very near the cold hardware you call the quantum chip (the Rainier processor), so additional time is wasted inputting the problem and reading the answer. If you had 10^12 qubits summing numbers, it could be as fast as 1000 desktop computers summing 10^12 (a trillion) real numbers. So how have you not yet figured out whether your computer is classical, quantum, or super-analog (classical annealing, which I hear is supposed to obscure whether it is simple analog nonsense or really quantum)?

So do you admit that, if your computer is classical (analog or not), it is still fast, or must it in that case be very slow and worthless (if it is classical and not even a little bit a real quantum computer)?

Keep in mind that an Intel i7 CPU spends most of its transistors on cache, so it has only about 4 cores, and each core has about four 32-bit slots per vector instruction. Maybe with the new 256-bit AVX instructions (instead of 128-bit SSE) that becomes 8 slots for numbers to be added or multiplied. I won’t even count cores, because many programming tools (Free Pascal, etc.) use only a single core, so there is only one 128-bit-wide place for numbers (yes, that is the best a CPU can do with all that SSE–SSE4 power). At 3 GHz that is about 10^9–10^10 real-number additions or multiplications per second, maybe up to 10^11 additions per second in single precision. With your computer, setting a qubit somewhere between 0 and +1 (like 0.2) or between 0 and -1, plus the field applied to the qubit, amounts to another multiplication over the -1 to +1 range. So it doesn’t seem like much computing power can come from classical tricks with magnetic fields on ‘qubits’; it must still be something like an analog computer, like my example of summing real numbers (of course, with a trillion real numbers, analog addition should be very imprecise, but as you say an approximate answer may do the trick, and who knows, maybe analog can sum even a trillion numbers quite precisely).
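For what it’s worth, the back-of-envelope throughput figure in the comment above can be reproduced (assumed, not measured, numbers: one vector add per cycle on a single 3 GHz core with 256-bit AVX):

```python
# Peak-throughput arithmetic under the stated assumptions.
clock_hz = 3e9
lanes_sp = 256 // 32   # 8 single-precision floats per 256-bit AVX register
lanes_dp = 256 // 64   # 4 doubles per 256-bit AVX register

peak_sp_adds = clock_hz * lanes_sp   # one vector add per cycle, one core
peak_dp_adds = clock_hz * lanes_dp

print(f"{peak_sp_adds:.1e} single-precision adds/s")  # -> 2.4e+10
print(f"{peak_dp_adds:.1e} double-precision adds/s")  # -> 1.2e+10
```

This lands in the 10^10 range the commenter quotes, before counting multiple cores or fused operations.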

But 128 ‘qubits’ in an analog classical computer, excuse me, cannot even compare to a desktop computer, because that is very little computing power. So why don’t you stop saying that your computer is very fast (computationally powerful and faster than a desktop computer) even without quantum effects? Or can you still tell me that a 128-qubit Rainier without quantum effects is faster than a desktop computer (say, an Intel Core i7)?

Geordie, could you reveal what the next D-Wave processor after the 512-qubit one is going to be: 1024 or 2048 qubits? Do you expect D-Wave to be able to keep doubling the qubit count every year, or will there be a slowdown?

-kasper

Hi Kasper, the historical trend has been doubling the number every year. This has held now for about 8 years.