The issue of NP-complete and NP-hard problems is not a red herring. Popular (not peer-reviewed) magazines such as New Scientist, Scientific American, and IEEE publications, and of course the news media, state outright or imply that quantum computers can solve these sorts of problems. A great many people believe that this would be possible.

By the way, have you seen the 2012 feature film “Traveling Salesman”? It is an intellectual thriller depicting a conversation among four top mathematicians, hired by the government, who have solved the P versus NP problem.

Here are some things to consider:

0. In your definition of ‘solve’ you need to specify what you mean by ‘right’. Recall that the central premise of quantum computation is that you get a sample from a probability distribution that you are attempting to programmatically define. If I take one sample from a distribution, and I want it to be, say, the Boltzmann distribution because I’m training deep neural nets, you’d need to give me a prescription for determining whether that sample is ‘right’ (hint: you can’t). You’re thinking about global optimization, which isn’t what quantum computers do at all.
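The point that a single sample can’t be judged ‘right’ can be illustrated with a toy sketch (the four-state system, energies, and temperature here are made up for illustration; this is not any particular hardware’s distribution):

```python
import math
import random

# Hypothetical toy system: a Boltzmann distribution over four states.
energies = [0.0, 0.5, 1.0, 2.0]
kT = 1.0
weights = [math.exp(-e / kT) for e in energies]
Z = sum(weights)                       # partition function
probs = [w / Z for w in weights]       # target distribution

# Every state has nonzero probability under the target distribution,
# so no single draw can be labeled 'right' or 'wrong' -- only the
# empirical frequencies of many draws can be compared to `probs`.
sample = random.choices(range(len(energies)), weights=probs)[0]
assert probs[sample] > 0  # any outcome is consistent with the target
```

The takeaway: correctness of a sampler is a statement about the distribution of many samples, not about any individual sample.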

1. No-one thinks or expects that quantum computation will ever give worst case polynomial scaling for NP-complete problems, or for finding global optima for NP-hard problems. This is not relevant to the discussion and never has been — it’s a red herring.

2. Quantum computers return samples from a probability distribution. This distribution is complicated, but we know certain things about it. One is that any sample returned is guaranteed to be close to the global optimum in energy. This follows from the relative energy scales of the temperature and the problem Hamiltonian: the Hamiltonian’s energy scales are much larger than the temperature, so the annealing process strongly favors solutions within kT of the ground state. There may be many of these (worst case, exponentially many), but kT is MUCH less than the width of the total energy spectrum. So even in the worst case you are guaranteed to get a ‘good’ solution in this sense.

3. Your statement about ‘the overwhelming part of … ‘ etc. does not seem relevant to anything. What’s your point? The same is true of any new processor, and it seems especially obvious for a special-purpose piece of hardware such as a neuromorphic chip (of which ours is one of many).

4. What the paper you’re quoting showed is that in our second commercial generation we reached parity with fine-tuned algorithms run on GPUs. I’m extremely proud of that accomplishment, and of the team that pulled it off. Spinning this as a disappointment seems particularly perverse; it’s a historic achievement. Things will continue to get exponentially better on our side, as they have for 10+ years, and over time this technology will grow into a major competitor to silicon-based solutions for many different tasks. The time for superconducting processors isn’t quite here yet, but it’s coming.

In 2014, a group of researchers from ETH Zurich, Google, and Microsoft published a report on how to define quantum speedup, and reported that they were not able to measure a quantum speedup with the D-Wave Two device.
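Benchmarks in that literature commonly compare how time-to-solution (TTS) scales with problem size rather than raw wall-clock times. A minimal sketch of the standard TTS-at-99%-confidence metric (function name and numbers are my own; the formula is the usual repeat-until-success calculation):

```python
import math

def time_to_solution(t_anneal, p_success, target=0.99):
    """Total time to observe the ground state at least once with
    probability `target`, given per-run success probability
    `p_success` and per-run anneal time `t_anneal`."""
    if p_success >= target:
        return t_anneal  # a single run already suffices
    # Number of independent repetitions needed:
    # 1 - (1 - p_success)^reps >= target
    reps = math.log(1.0 - target) / math.log(1.0 - p_success)
    return t_anneal * math.ceil(reps)

# E.g., a 50% per-run success rate needs 7 repetitions for 99%
# confidence, so TTS is 7x the single-run anneal time.
```

A “quantum speedup” in this framework means the TTS of the quantum device grows more slowly with problem size than that of the best classical solver, which is why single-size comparisons are not conclusive either way.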
