First look at some results from Washington chips

Colin Williams recently presented some new results in the UK, including an advance look at the first results on chips with up to 933 qubits. These are very early days for the Washington generation. Things will get a lot better before it’s released (Rainier and Vesuvius each took 7 generations of iteration before they stabilized), but these are good results for the first few prototypes.

One of the interesting ideas we’re playing with now is the following (it starts at around 22:30 of the presentation linked above). Instead of measuring the time to find the ground state of a problem with some probability, measure the difference between the ground state energy and the median energy of the samples returned, as a function of time and problem size. When we do this, we find that the median distance from the ground state scales like \sqrt{|E|+N}, where N is the number of qubits and |E| is the number of couplers in the instance (proportional to N for the current generation). More importantly, the scaling with time flattens out and becomes nearly constant. This is consistent with the main error mechanism being mis-specification of the problem parameters in the Hamiltonian (what we call ICE, or Intrinsic Control Errors).

In other words, the first sample from the processor (i.e., constant time) will, with high probability, return a sample no further than O(\sqrt{N}) from the ground state. That’s pretty cool.
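The metric described above is easy to compute from any set of returned samples. Here is a minimal sketch in Python; the function names and the synthetic sample energies are mine, invented for illustration, and the toy data does not come from real hardware.

```python
import math
import random


def median_residual(sample_energies, ground_energy):
    """Median distance of the sampled energies from the ground-state energy."""
    residuals = sorted(e - ground_energy for e in sample_energies)
    n = len(residuals)
    mid = n // 2
    if n % 2:
        return residuals[mid]
    return 0.5 * (residuals[mid - 1] + residuals[mid])


def scaling_scale(num_qubits, num_couplers):
    """The sqrt(|E| + N) scale the median residual is observed to follow."""
    return math.sqrt(num_couplers + num_qubits)


# Toy demonstration with synthetic energies (not hardware data):
random.seed(0)
ground = -10.0
samples = [ground + abs(random.gauss(0.0, 2.0)) for _ in range(1001)]
print(median_residual(samples, ground))  # typical distance from the ground state
print(scaling_scale(num_qubits=933, num_couplers=2800))  # the sqrt(|E|+N) scale
```

In practice one would plot the median residual against problem size and annealing time; the claim above is that the curve tracks \sqrt{|E|+N} and is nearly flat in time.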

Lockheed Martin piece about D-Wave technology

Here is a short piece describing what Lockheed Martin and scientists at USC think about D-Wave technology.

Here is a great quote from the piece.

It’s a game changer for the corporation, it’s a game changer for our customers, and ultimately it’s a game changer for humanity. Computationally this is the equivalent of the Wright brothers at Kitty Hawk.

- Greg Tallant, Research Engineering Manager, Flight Control & VMS Integration – FW, Advanced Development Programs, Lockheed Martin Aeronautics Company