First look at some results from Washington chips

Colin Williams recently presented some new results in the UK. Here you can see an advance look at the first results on chips with up to 933 qubits. These are very early days for the Washington generation, and things will get a lot better before this one is released (Rainier and Vesuvius both took seven generations of iteration before they stabilized), but there are already some good results from the first few prototypes.

One of the interesting things we’re playing with now is the following idea (it starts at around 22:30 of the presentation linked to above). Instead of measuring the time to find the ground state of a problem with some probability, measure the difference between the ground state energy and the median energy of the samples returned, as a function of time and problem size. If we do this, we find that the median distance from the ground state scales like \sqrt{|E|+N}, where N is the number of qubits and |E| is the number of couplers in the instance (proportional to N for the current generation). More importantly, the scaling with time flattens out and becomes nearly constant. This is consistent with the main error mechanism being mis-specification of problem parameters in the Hamiltonian (what we call ICE, or Intrinsic Control Errors).

In other words, the first sample from the processor (i.e. in constant time) will, with high probability, be no further than O(\sqrt{N}) from the ground state. That’s pretty cool.
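To make this measurement concrete, here is a minimal sketch (not the actual analysis code) of how you might compute the median distance from the ground state for a set of returned samples and fit the scaling exponent. The `runs` dictionary in the usage comment is hypothetical; for the \sqrt{|E|+N} behaviour above, the fitted exponent should come out near 0.5.

```python
import numpy as np

def median_residual(sample_energies, ground_energy):
    """Median distance of the returned sample energies from the ground state energy."""
    return float(np.median(np.asarray(sample_energies) - ground_energy))

def fit_scaling(sizes, residuals):
    """Fit residual ~ c * size**alpha by least squares in log-log space."""
    alpha, logc = np.polyfit(np.log(sizes), np.log(residuals), 1)
    return alpha, np.exp(logc)

# Hypothetical usage: `runs` maps the size |E| + N of each instance to
# (sample energies returned by the hardware, known ground-state energy).
# runs = {num_couplers + num_qubits: (energies, e_ground), ...}
# sizes = sorted(runs)
# residuals = [median_residual(*runs[s]) for s in sizes]
# alpha, c = fit_scaling(sizes, residuals)   # expect alpha close to 0.5
```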

Discrete optimization using quantum annealing on sparse Ising models

Another paper, demonstrating some interesting techniques for overcoming practical problems in using D-Wave hardware. (Apologies, Diana, for the continuing lack of interpretation of these results :-) ). These techniques were applied to Low Density Parity Check problems.

Discrete optimization using quantum annealing on sparse Ising models

  • D-Wave Systems, Burnaby, BC, Canada
  • Department of Computer Science, Joint Center for Quantum Information and Computer Science, University of Maryland, College Park, MD, USA

This paper discusses techniques for solving discrete optimization problems using quantum annealing. Practical issues likely to affect the computation include precision limitations, finite temperature, bounded energy range, sparse connectivity, and small numbers of qubits. To address these concerns we propose a way of finding energy representations with large classical gaps between ground and first excited states, efficient algorithms for mapping non-compatible Ising models into the hardware, and the use of decomposition methods for problems that are too large to fit in hardware. We validate the approach by describing experiments with D-Wave quantum hardware for low density parity check decoding with up to 1000 variables.
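The abstract doesn’t spell out what the decomposition looks like in practice, so here is a rough illustration only: a generic block-coordinate scheme (not necessarily the authors’ algorithm) that repeatedly clamps most spins of a sparse Ising model and re-optimizes a small block, with brute force standing in for the hardware call.

```python
import itertools
import random

def ising_energy(h, J, s):
    """Energy of spin assignment s (dict i -> +/-1). h must list every spin (0.0 if no field)."""
    e = sum(h[i] * s[i] for i in h)
    e += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e

def solve_block(h, J, s, block):
    """Brute-force the spins in `block` with all other spins clamped.

    This subproblem is what would be handed to the hardware; brute force
    stands in here so the sketch is self-contained.
    """
    best = None
    for assignment in itertools.product([-1, 1], repeat=len(block)):
        trial = dict(s)
        trial.update(zip(block, assignment))
        e = ising_energy(h, J, trial)
        if best is None or e < best[0]:
            best = (e, trial)
    return best[1]

def decompose_and_solve(h, J, block_size=10, sweeps=20, seed=0):
    """Greedy block decomposition for an Ising model too large to solve in one shot."""
    rng = random.Random(seed)
    variables = list(h)
    s = {i: rng.choice([-1, 1]) for i in variables}
    for _ in range(sweeps):
        rng.shuffle(variables)
        for start in range(0, len(variables), block_size):
            s = solve_block(h, J, s, variables[start:start + block_size])
    return s, ising_energy(h, J, s)
```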

Reexamining classical and quantum models for the D-Wave One processor

More science on data from the D-Wave One system at USC.

Reexamining classical and quantum models for the D-Wave One processor

We revisit the evidence for quantum annealing in the D-Wave One device (DW1) based on the study of random Ising instances. Using the probability distributions of finding the ground states of such instances, previous work found agreement with both simulated quantum annealing (SQA) and a classical rotor model. Thus the DW1 ground state success probabilities are consistent with both models, and a different measure is needed to distinguish the data and the models. Here we consider measures that account for ground state degeneracy and the distributions of excited states, and present evidence that for these new measures neither SQA nor the classical rotor model correlate perfectly with the DW1 experiments. We thus provide evidence that SQA and the classical rotor model, both of which are classically efficient algorithms, do not satisfactorily explain all the DW1 data. A complete model for the DW1 remains an open problem. Using the same criteria we find that, on the other hand, SQA and the classical rotor model correlate closely with each other. To explain this we show that the rotor model can be derived as the semiclassical limit of the spin-coherent states path integral. We also find differences in which set of ground states is found by each method, though this feature is sensitive to calibration errors of the DW1 device and to simulation parameters.
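For readers wondering what the “classical rotor model” refers to: in this picture each qubit is replaced by a planar rotor with angle θ_i, the transverse-field term acts on sin θ_i and the problem Hamiltonian on cos θ_i, and the whole thing is annealed classically. Here is a minimal Metropolis sketch of that kind of model (my own illustration with an assumed linear schedule, not the authors’ simulation code).

```python
import math
import random

def rotor_energy(h, J, theta, a, b):
    """Energy of the classical rotor model at annealing point (a, b).

    h must list every qubit (0.0 if no field); J maps (i, j) -> coupling.
    """
    e = -a * sum(math.sin(t) for t in theta.values())
    e += b * sum(h[i] * math.cos(theta[i]) for i in h)
    e += b * sum(Jij * math.cos(theta[i]) * math.cos(theta[j])
                 for (i, j), Jij in J.items())
    return e

def rotor_anneal(h, J, steps=1000, temperature=0.2, seed=0):
    """Metropolis annealing of the rotor model with a linear schedule a = 1 - s, b = s."""
    rng = random.Random(seed)
    theta = {i: rng.uniform(0.0, math.pi) for i in h}
    for step in range(steps):
        s = step / (steps - 1)
        a, b = 1.0 - s, s
        for i in theta:
            old = theta[i]
            e_old = rotor_energy(h, J, theta, a, b)
            theta[i] = rng.uniform(0.0, math.pi)
            e_new = rotor_energy(h, J, theta, a, b)
            if e_new > e_old and rng.random() > math.exp((e_old - e_new) / temperature):
                theta[i] = old  # reject the move
    # Project each rotor to an Ising spin via the sign of cos(theta_i).
    return {i: 1 if math.cos(theta[i]) >= 0 else -1 for i in theta}
```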

Quantum annealing correction for random Ising problems

A new paper from users of the D-Wave Two at USC. Here’s the abstract:

We demonstrate that the performance of a quantum annealer on hard random Ising optimization problems can be substantially improved using quantum annealing correction (QAC). Our error correction strategy is tailored to the D-Wave Two device. We find that QAC provides a statistically significant enhancement in the performance of the device over a classical repetition code, improving as a function of problem size as well as hardness. Moreover, QAC provides a mechanism for overcoming the precision limit of the device, in addition to correcting calibration errors. Performance is robust even to missing qubits. We present evidence for a constructive role played by quantum effects in our experiments by contrasting the experimental results with the predictions of a classical model of the device. Our work demonstrates the importance of error correction in appropriately determining the performance of quantum annealers.
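QAC encodes each logical qubit into several physical problem qubits plus a penalty qubit, and hardware samples are decoded by majority vote over the physical copies. As a rough illustration of the decoding step only (the encoding details and the comparison against the classical repetition code are in the paper), a majority-vote decoder over returned spin samples could look like this.

```python
import numpy as np

def majority_vote_decode(samples, copies):
    """Decode physical spin samples into logical spins by majority vote.

    samples: array of shape (num_samples, num_physical) with entries +/-1.
    copies:  copies[k] lists the column indices of the physical qubits
             encoding logical qubit k.
    Ties (possible for an even number of copies) are broken toward +1 here.
    """
    samples = np.asarray(samples)
    logical = np.empty((samples.shape[0], len(copies)), dtype=int)
    for k, cols in enumerate(copies):
        votes = samples[:, cols].sum(axis=1)
        logical[:, k] = np.where(votes >= 0, 1, -1)
    return logical

# Hypothetical usage: three physical copies per logical qubit.
# raw = np.array([[+1, +1, -1, -1, -1, -1],
#                 [+1, -1, +1, +1, +1, -1]])
# print(majority_vote_decode(raw, copies=[[0, 1, 2], [3, 4, 5]]))
```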

Two interesting papers from the Ames crew

Hi everyone! Sorry for being silent for a while. Working. :-)

Two interesting papers appeared on the arXiv this week, both from people at Ames working on their D-Wave Two.

First: A Quantum Annealing Approach for Fault Detection and Diagnosis of Graph-Based Systems

Second: Quantum Optimization of Fully-Connected Spin Glasses

Enjoy!