What do I mean by red herring, you might ask?

The etymology of the phrase may be the practice of saving a hunted fox by dragging a smoked herring across its trail – creating a new, useless scent trail. When smoked, herring turns bright red and is quite odoriferous. The latter trait made it possible to deliberately leave a strong trail on the ground to facilitate training hounds to track a scent. Having been so trained, hounds would readily follow the scent of the fish over that of the fox, allowing their quarry to escape.

Analogy clarification: fish –> factoring; fox –> every other application of QCs.

Scott Aaronson over at Shtetl-Optimized recently posted a great description of Shor’s algorithm for non-specialists, in response to a request from JR Minkel, a Scientific American writer. Even though Scott has an alarming lack of musical taste, he has few peers when it comes to describing complex technical issues to non-specialists.

The premise of Shor’s algorithm is this: take one (large) QC and one large product of two primes, shake well, and out come the factors.

The algorithm is interesting because it produces this output (the factors) much faster than you can without a QC. It is also possibly interesting from the applications perspective because the problem it solves is important for certain types of code-breaking.
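To see where the quantum speedup actually lives: the only quantum part of Shor’s algorithm is finding the period of a^x mod N; everything else is classical number theory. Below is a minimal sketch in Python, with the period found by brute force as a stand-in for the quantum subroutine (the function names are mine, chosen for illustration):

```python
from math import gcd

def period(N, a):
    """Brute-force period finding: the smallest r with a**r = 1 (mod N).
    This is the step Shor's algorithm replaces with a
    polynomial-time quantum subroutine."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factors_from_period(N, a, r):
    """Classical post-processing: given the period r of a**x mod N,
    recover nontrivial factors of N (may fail for an unlucky a,
    in which case you retry with a different a)."""
    if r % 2 != 0:
        return None               # odd period: retry
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: retry
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N and 1 < p < N else None

r = period(15, 7)                     # r = 4
print(factors_from_period(15, 7, r))  # -> (3, 5)
```

The brute-force `period` above takes exponential time in the number of digits of N, which is exactly why the quantum replacement matters.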

There is a particular type of encryption in use that uses the “hardness” of factoring large products of prime numbers to keep data secure. This particular approach would become vulnerable in a world with large QCs.
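The dependence on factoring is easy to see in a toy RSA round trip (deliberately tiny primes here; real keys use primes hundreds of digits long). Computing the private exponent is trivial if you know the factors of n, and believed hard classically if you don’t:

```python
# Toy RSA key generation and round trip.
p, q = 61, 53
n = p * q                  # public modulus: 3233; its factors are the secret
phi = (p - 1) * (q - 1)    # computable only if you know p and q
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)          # anyone can encrypt with (e, n)
assert pow(cipher, d, n) == message  # only the keyholder can decrypt
```

A machine that factors n recovers p and q, hence phi, hence d, and the scheme is broken.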

So let’s say we lived in a world with factoring machines available at Walmart. Would internet commerce collapse? Of course not. The most likely outcome is that encryption software providers–the people who keep your credit card numbers secure–would change the “hard” part of their encryption to something that (1) wasn’t factoring and (2) was impervious to QCs. Is this technically possible? Absolutely. Also, over time it’s likely that encryption — even retail encryption — will migrate to quantum encryption.

So you might ask why the NSA cares about factoring at all. I can only speculate, but here are three possible reasons: (1) they have a mandate to be at the leading edge of understanding all things cryptanalytic, regardless of practical application; (2) the possibility that messages encrypted today have been stored somewhere and will be decrypted in 5-10 years with some new advanced technology (QC); (3) they are concerned that new non-factoring crypto algorithms will be developed.

Alright, so why am I going on about this? The reason is that when asked what QCs are good at, many QC experts respond with the factoring example, when there are several other perfectly great quantum algorithms that would also fit the bill. Why the focus on factoring? I’m not entirely sure, but here are two possible reasons: (1) how the algorithm works is technically very interesting (when people understand it, they usually say, man, that’s cool!); (2) many researchers are financed by the NSA, so it’s self-serving to put a line in the intro of your paper listing code-breaking as an application; then every paper written about QCs starts with the same sentence, and the meme takes hold.

Why does it matter that experts focus on factoring? There are two reasons. The first is that factoring isn’t interesting from the applications/commercial perspective. It is hard to imagine how a company could become successful building machines that did nothing but factor products of primes.

The second reason is that the application is very negative in the sense that it makes the world worse than it was before the tech was introduced. There is no social benefit to adding factoring to the list of things humans can do well (or if there is, I haven’t heard anyone mention it). This “no social benefit” issue is often ignored by scientists and technologists, but I think this is a huge mistake. Most people feel great about supporting technologies with a clear social benefit (cures for cancer, etc.); not so much for weapons.

I think it would do the whole QC field a lot of good if the people “out there” who aren’t experts could hear about the algorithms/applications that stand a chance of creating real value, like quantum simulation, or possibly applications of the recent NAND tree algorithm, or optimization.

Whenever people ask what QCs can do, there’s a tendency to respond with the factoring example, which I think is making it very difficult for non-NSA people (i.e., pretty much everybody) to get excited about QCs. It’s hard to get really excited about something that’s both commercially worthless and has no positive social upside.

If the scientists and management at Merck/Pfizer/Exxon/DE Shaw/Amgen/Corning/Dow etc. understood (in the same way the NSA understands Shor’s algorithm) that QCs could exponentially speed up their quantum chemistry calculations, there would be hundreds of millions of dollars of investment from these sources flowing into practical QC R&D. This is a wonderful thing called market pull, which would benefit everybody in the QC field, even if you don’t personally care about actually getting these machines into the hands of end users.

In contrast to factoring, quantum simulation is both extremely commercially valuable (a large fraction of the world’s supercomputer cycles are currently spent solving the Schrödinger equation) and offers the possibility of huge social good (new clean energy sources, better medicines, cleaner chemical plants, etc.).

Maybe it’s time to put away the fish.

Geordie

I agree for all the reasons stated above that factoring is one of the least impressive applications for a quantum computer. However, is one of the reasons it is thrown out there so much that it shows a greater speedup over classical algorithms than other quantum algorithms do? Does, for instance, Grover’s algorithm show exponential speedup?

Quantum computing science would be better served if other applications were used to highlight the benefits of quantum computers. However, when explaining just how fast some quantum algorithms are to laymen, Shor’s algorithm will continue to be a useful example. We’ll just have to continue to grit our teeth every time someone mentions RSA in the same sentence as quantum computing.

Chris: A variety of quantum simulation algorithms are exponentially sped up. For example, calculating the energy of a molecule as a function of its nuclear positions (which is a good definition of what chemistry is) is exponentially sped up. Memory requirements are also exponentially decreased.
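The memory point is easy to make concrete: a classical state-vector simulation must store 2^n complex amplitudes for n qubits, while the quantum machine just has the n qubits. A back-of-the-envelope sketch (the function name is mine, for illustration):

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a full classical state-vector simulation:
    2**n complex amplitudes (complex128 = 16 bytes each)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50, 100):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**30:.3g} GiB")
```

Thirty qubits already needs 16 GiB, which is roughly where classical simulation tops out; 50 qubits needs petabytes, and 100 qubits exceeds any conceivable classical memory.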

From my point of view the most important QC algorithm is the Abrams-Lloyd algorithm http://arxiv.org/PS_cache/quant-ph/pdf/9807/9807070.pdf . It is far more important than Shor’s algorithm and receives very little press. Abrams-Lloyd is at the heart of most useful quantum simulation algorithms.

Geordie

Can you check the link? It appears to be broken.

Ah… had an extra period in the hyperlink.

How can the NAND tree algorithm be applied?

http://scottaaronson.com/blog/?p=207 I see Scott Aaronson’s explanation of the NAND tree and the original paper.

Is it better for finding some optimization of some aspect of complex decision trees?

http://en.wikipedia.org/wiki/Decision_tree
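For a rough picture of what the quantum algorithm speeds up: a balanced binary NAND tree over N leaves models a two-player game tree, and the quantum-walk algorithm evaluates it in roughly sqrt(N) steps, versus about N^0.753 queries for the best classical randomized method. Here is a sketch of the straightforward classical (exhaustive) evaluation; the function name is mine, for illustration:

```python
def eval_nand_tree(leaves):
    """Exhaustively evaluate a balanced binary NAND tree over the
    given leaf bits (len(leaves) must be a power of two). Each level
    NANDs adjacent pairs until one value remains at the root."""
    level = list(leaves)
    while len(level) > 1:
        level = [int(not (level[i] and level[i + 1]))
                 for i in range(0, len(level), 2)]
    return level[0]

# Four-leaf tree: NAND(NAND(1,1), NAND(0,1)) = NAND(0, 1) = 1
print(eval_nand_tree([1, 1, 0, 1]))  # -> 1
```

This exhaustive version touches every leaf; the interest in the quantum algorithm is precisely that it decides the root value while effectively querying far fewer of them.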

Geordie, I don’t normally like to be a troll but the Abrams-Lloyd algorithm has a very big problem with it.

Before the algorithm begins you need to have access to a state (V_a in their paper) that is promised to have a non-negligible component of the ground state. They assume in this paper that this is easy to find, which is true for some systems and not at all true for others.

As far as I understand it, being promised access to the state V_a in certain 2-local Hamiltonians allows you to solve QMA-complete problems (this was shown by Kempe, Kitaev and Regev). This suggests that finding the state V_a will be very hard for some systems, or, on the upside, if it is easy, Scott has to eat crow and become an experimentalist.

This doesn’t mean that all hope is lost, though. If you can find a way of making these states, then you might be able to apply the Abrams-Lloyd algorithm. Also, even if it is hard to find a state like V_a for a system, it might still be easier to perform the simulation on a quantum simulator than on a classical computer. At the moment we are stuck at an upper limit of about 30 qubits for simulating simple stuff that is remotely interesting. We might be able to find spectrum information on simulated systems with 100-200 qubits that we can’t get on a classical computer, because we can explicitly simulate the Hamiltonians that we are interested in (even though this might typically be a hard task). Maybe this sentence doesn’t make sense, but it is possible to find solutions to instances of hard computational problems, and that might be good enough for some of the problems that physicists and chemists care about.

Generally, it is thought around the theory community that QCs are hot for simulating the dynamic properties of systems but aren’t so great for working out the static properties of systems. You can show pretty easily that QCs can simulate any k-local Hamiltonian dynamics (given finite-dimensional particles). There is also heaps of work out there in the QI community showing which Hamiltonian systems can simulate which others, and which ones are better than others at these sorts of tasks.

Mick: The point about finding a suitable V_a is a good one, although there is at least one proposal that looks good to me, which we showed in the Science paper in the sidebar on quantum chemistry.

The idea is to use an AQC-like approach to put yourself in a ground state of a Hamiltonian where the overlap of this ground state with the true ground state is large enough.

There may be certain cases where the approach doesn’t work, which is fine–in practice we care more about the average case and high throughput on huge numbers of cases.

Also, re: the number of qubits and usefulness, the chemists tell us that 50 qubits would let us do calculations beyond anything that’s been done for some specific types of quantum chemistry calculations.

By the way I think that statics are going to turn out to be a lot easier to calculate than dynamics.

Geordie – Maybe I wasn’t clear enough: it will work in certain instances, i.e., those instances where you can actually find a V_a. The problem is that in many cases this might turn out to be a killer problem. On the other hand, we might get a whole lot smarter and work out that there are lots of interesting physical systems (like the one you point out) where it is relatively easy to find V_a. It’s just that a quantum computer can’t generally do this quickly – and when I say generally I mean for all Hamiltonian systems.

BTW, I disagree on the whole statics vs. dynamics thing, but I think it’s a religious issue as there is almost no proof one way or the other. It’s just that a lot of the problems we encounter when simulating static properties (i.e., questions like “produce the Gibbs state for this system”) on a classical computer seem to also be problems in the quantum case.

Quantum encryption my ass. Quantum encryption isn’t a replacement for public key cryptography. A successful QC can clearly render RSA worthless through factoring. It isn’t clear that QC algorithms couldn’t eventually be developed to tackle discrete logarithms, and perhaps even discrete logarithms over elliptic curves, and that would pretty much cover all of public key encryption, rendering it worthless. Without public key algorithms, SSL doesn’t work, certificates don’t work, signatures don’t work. Unless a QC-resistant public key algorithm is identified, QC pretty much ruins security for everyone.

So yes, despite the positive implications for things like simulations, the fact that QC can break RSA is a pretty bad omen for communications security, and a harbinger of times to come.

Denis: It’s relatively easy to find asymmetric functions to replace factoring & discrete log for public key crypto. The reason is that QCs are quite limited in what they are good at. While the current asymmetric methods may be vulnerable, it won’t be difficult for software companies to change from factoring to something else that is probably secure against QCs.

Also don’t discount quantum encryption so quickly. I don’t see any fundamental reason why in the long term key distribution couldn’t happen using it. I think you’re wrong, it could become a replacement for public key crypto over the long term, especially if different classes of internet are introduced.

“… especially if different classes of internet are introduced.”

such as this.

Actually I was thinking more of something like http://www.bc.net/ .

Does BCNET already have an operational QKD network up and running? To be clear, the link to the network is here and here, white paper here. The Acin paper outlines new quantum communications protocols to take advantage of entanglement chains in these networks.

No, BCNET (and Canarie) are extremely high bandwidth networks separate from the internet. We use them to hatch plots related to Canadian World Domination and the like.

Factoring is interesting because it is one of the oldest (if not THE oldest) computational problems! Euclid knew of it, Gauss and Euler worked on it, and it is perhaps the central problem in number theory. More great minds have attempted this problem than perhaps any in history. To write it off as a “red herring” is either ignorant or dishonest.

Furthermore, you say “It’s relatively easy to find asymmetric functions to replace factoring & discrete log for public key crypto.” Care to give a concrete example of a realistic public-key encryption system that is not vulnerable to quantum computing? The only remaining options that I know of are lattice-based systems. While they are near and dear to my heart, they are still far far far from ready for prime-time. And the jury is still out on whether lattice problems really are hard for quantum computers.

In fact, it is not at all trivial to devise public-key systems that are resistant to quantum computers, and this is more than just an idle curiosity.