A great question

In my last post, a reader asked an excellent question that I thought I’d provide my perspective on. This particular issue is complex, so bear with me. Here’s the question:

If quantum computing is eventually bound to change and transform the computing business landscape completely, how come companies with deep pockets like Intel and IBM are not planning any version of a quantum computer any time soon, but still focus only on traditional computing? If your work is worthwhile, why can a company such as Intel, which can throw US$4 billion into a fab, not spend a hundredth of this money to finance DWave (editor’s note: or any effort, including an internal one)?

There are at least three good reasons why a big company would decide not to invest in quantum computing as it’s currently perceived. The first is an economic argument based on the time value of money. I have a great article on this by HP’s Stan Williams that anyone interested in this question should read. Here it is:

stan_williams_qc_investment.pdf

The punchline is that any opportunity sufficiently far away and sufficiently risky isn’t a good investment, regardless of the size of that opportunity.
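
A minimal sketch of the time-value-of-money argument behind this punchline; the payoff size, hurdle rate and timelines below are made-up illustrations, not figures from Williams’ article.

```python
# Illustrative only: discount rate, payoff size and timelines are made up,
# not taken from the Williams article.

def present_value(payoff, years, discount_rate):
    """Value today of a payoff received `years` from now."""
    return payoff / (1 + discount_rate) ** years

hurdle_rate = 0.50   # a high risk-adjusted discount rate for a very risky venture
payoff = 10e9        # a hypothetical $10B opportunity

# Conventional wisdom: competitive QCs are ~15 years out.
print(present_value(payoff, 15, hurdle_rate))  # ~$23M today: not worth a big bet

# If a focused effort compresses that to ~3 years (the point of reason #2 below):
print(present_value(payoff, 3, hurdle_rate))   # ~$3B today: suddenly interesting
```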

The second good reason is covered in Clay Christensen’s The Innovator’s Dilemma, which is another must-read for understanding the dynamics of technology strategy in big companies. The main point is that businesses acting rationally produce products that their customers ask for. It’s not rational to produce something that no one wants. Major disruptive technologies are sometimes things that no one is asking for (like QCs). This is related to the first reason. People working on QCs seem to think that companies like IBM etc. will try to build these just because they are cool. That’s not the way this type of thing works. People invest in technology development because they believe people will want to buy it. What argument would someone inside IBM use to justify a long-term, high-risk, high-expense internal QC effort? If there were clear large-market applications where you could definitely say “if we built this we would own a new $10B/year market with 100% margins,” then maybe you could justify the investment. But currently there is an extreme lack of clarity “out there” on the issue of what you’d do with a QC if you had one.

The third good reason was pointed out by a respondent to the original question. If you’re an IBM, HP, etc., it makes a lot of sense to argue as follows: Look, we don’t know at least two things. One, we don’t know what the roadmap for QC development looks like. Some say QCs will never be built. Most people say machines competitive with conventional approaches are 20 years away. Some people say 5. Some (ie me) point out that QCs are already being sold (NMR QCs) at high margins. Two, we don’t know what the market looks like. What are the applications? Who are the customers? What’s the size of the opportunity over time? So given both enormous technical and market risk, let’s do the following. Let’s wait and see what happens. If a competitor or a start-up can actually demonstrate (1) sufficient reduction of technical risk, (2) a clear path to real scalability (note: I don’t mean scalability in the sense that it’s sometimes used in research articles, ie. in principle scalable, I mean practically scalable), and (3) clear market information, most likely in the form of paying customers and a high margin business model with high growth (~40%+ year over year revenue growth), then these big guys figure that they can always get in on the action by partnering, acquiring, licensing, etc. It is rational to pay more for less risk. If I were the CEO of IBM, given the information currently publicly available, that’s certainly the strategy I would use.

Given these types of arguments, you might ask why it makes sense to try to build QCs in a business context at all. In other words, why would a start-up attempt to do this? Here are some reasons:

  1. There are several well-known case studies of situations where the prevailing wisdom about how long it would take to develop a new technology or achieve a scientific milestone was wrong. Here are two: sequencing the human genome (Celera) and producing synthetic insulin (Genentech). Here’s a Genentech case study very relevant to QC & D-Wave: scott_stern_1.pdf
  2. If, as in the case of Celera or Genentech, the prevailing wisdom as to timescales to deployment could be shortened by a factor of 5 by a focused effort, then we’re talking about QCs competitive with conventional approaches in 1-4 years, not 5-20. This drastically changes the conclusions of the time value of money argument (reason #1 above).
  3. Most of the people working in quantum information science aren’t even remotely interested in applications. There is a lot of interest in algorithms, and sometimes people interchangeably use the two terms, but there is a major fundamental difference between the two. This lack of interest in applications is a green field opportunity for a start up. If there are big market applications for QCs in practice, the fact that there simply aren’t a lot of people looking to connect technology to users means that a small focused effort stands a good chance at identifying these in advance of competitors. Therefore reason #2 above related to the innovator’s dilemma is actually an extremely valuable advantage for a start up. We can prospect without worrying about competition from incumbents.
  4. Building QCs, especially in a business context, is extremely hard. The only way any effort has a chance of succeeding is by having a lot of A+ people in a wide range of roles. A first mover has a big advantage here. If someone wants to be involved in a really serious, long-term, well-financed effort with the kind of infrastructure required to do this very hard thing, there simply aren’t a lot of options for where to go. As an example, would I personally have joined Celera or Genentech in their early days if I had the chance? You bet I would. The same dynamic is at work now in QC development.

31 thoughts on “A great question”

  1. Geordie,

    When was the HP article written? Because if I’m not mistaken, both HP and IBM are doing quantum information research:

    http://www.hpl.hp.com/research/qip/

    http://www.research.ibm.com/quantuminfo/

    It appears to me that the difference may be the approach (optical in HP’s case) or that the company is focusing more on theory than on playing with real qubits (as seems to be IBM’s case). Again, both of these approaches seem driven by the reason quoted above: risk. Playing with photons and theory seems pretty cheap when compared to manufacturing superconducting chips and cooling them down to mK ranges. Not to mention tracking down and hiring the staff needed to do so.

    With this risk involved, it seems that the better funded companies are only willing to take little steps until quantum computing science can be proven to be on more stable ground. I’m not sure what it would take to make them commit more. Will DWave’s presentation give them a wake-up call? Or will it only be after profitable quantum computational applications are made and run?

    I whole-heartedly agree with reason 2 as to why a start-up is better suited to make advances. To put it simply, HP and IBM will continue to make and sell printers/servers even if their quantum computational groups fail to come up with anything. DWave’s whole team, on the other hand, is focused on producing and using one thing: quantum computing.

  2. Chris:

    There is a big difference between doing research and committing to delivering products. The latter requires a real strategic commitment, a large investment, good leadership and extreme focus on one path forward. The former allows the companies to stay abreast of what’s going on via their scientists’ links to the academic / research communities and come up with some important innovations along the way, but that’s probably it. The efforts that you reference, although they contain excellent people working on good ideas, are insufficiently supported to be able to pull this type of thing off.

    As an example of what a “real” project looks like: the Cell chip cost in excess of $1.7B to develop. While it looks like QCs are 1-2 orders of magnitude cheaper to build than this, you get the picture. If an IBM or HP really wanted to build QCs those are the numbers we’re talking about, not a few million per year or whatever they are up to now.

    I was at a conference a while back and someone referred to the race to build a QC as a horse race. Someone replied that it was more like a donkey race, which is kind of funny and sort of true.

  3. Geordie,
    first of all, let me thank you for your very informative answer.
    I will bother you again for 2 minutes..

    I have read your reply carefully many times, but I cannot agree with two things you say:

    1) ” then these big guys figure that they can always get in on the action by partnering, acquiring, licensing, etc. ”
    In my (again) very humble opinion, the big guys are at least three: IBM, Intel and HP (which currently has its own team working on QC), with a fourth (Samsung) also on the list. The above companies have market caps of some tens of billions of USD, if not a hundred or more, and are competitors. So, if anyone outside comes up with a good technology which could provide relevant cash inflow and redefine their business landscape, they would have to engage in a bidding war to get an exclusive license, or they would have to share it.
    This would be good news for D-Wave, which could be bought for some billions, but costly for the company which has to buy the technology, and even worse for the other companies, which could lose market share.
    In the case of partnership/licensing/etc., the above analysis does not change much.

    If IBM had been developing its own QC technology, this, while riskier, would have meant not having to share the technology with anyone and not having to engage in a costly bidding war with any other competitor.
    I do not have to mention that IBM has state-of-the-art labs in Zurich and some other places around the world.

    2) Also, you do not seem to mention the fact that current silicon technology is headed for a dead end sometime soon.
    The major player in the field, Intel, has a 45nm process almost polished, a 32nm process which is, according to Gargini, “in good shape” and due out in two years’ time, and a 22nm process still in the research/development phase.
    The 22nm is due out in 2011, according to Intel’s roadmap, and in 2015, according to the ITRS roadmap.
    After that, there is a void: tunneling is likely to halt traditional silicon design, and nobody has a clear path (carbon nanotubes? spintronics?) for what comes afterwards.
    A 15nm technology (the would-be next logical step) is not even mentioned in any real roadmap of any of the big companies (except maybe the would-be ITRS roadmap, which is just a “hope” of a roadmap and not much more).
    And 2011 is only four years away.
    If there is a good technology coming out which could, at least in principle, keep Moore’s law going, it seems strange that the big guys have not given this technology a chance.
    And I did not see any real news of your imminent presentation of a QC tech on the following geek sites, which I browse every day:
    http://www.theinquirer.net
    http://www.dailytech.com
    http://www.tgdaily.com
    http://www.geek.com
    http://www.eetimes.com
    http://www.anandtech.com
    ..

    I see three possible explanations for how you could be the first to develop a QC, before anyone else does:

    1) your QC works, but lacks the capability to provide real-world solutions which could be marketed in a few years, and cannot be even a partial substitute for current silicon tech, not even for supercomputers, so the big guys look at D-Wave as just an interesting experiment, but not much more;

    2) you really caught the whole semiconductor business with its pants down: you have come up with a real technology that could have a huge impact on the semicon business, you will become a billionaire soon, and we will soon have protein folding and Heisenberg’s equations solved for all the atoms of Mendeleev’s table. This would maybe lead to world-changing applications such as nanomachines, new materials, plasma behaviour understood (nuclear fusion), weather predictions a month out, and a computer which gives you “checkmate in 28”;

    3) no idea..

    I hope that the real answer is no. 2 (no irony here, I am a big fan of you and D-Wave).

    We (Chris and I, maybe) may sound a little too suspicious, but it is just because what you are doing seems “too good to be true”.

    So, keep up the good work; I have already marked the day of your presentation in my calendar.

    Anyway, I will send you a Japanese kimono soon ( XL size? )

  4. Ah,
    The distinction here was between research and development. I thought you were saying that big companies weren’t investing at all. My mistake.

    As for the donkey race: I’ll take $20 on DWave to win, please =P.

    Another comment on the pony show. When do you expect the big companies to show up and enter the race? At what point in development does DWave expect to have to deal with big-money competitors? Or does DWave hope to be far enough out of the gate that it won’t be an issue for a long while when they do show up? Just curious.

    Thanks again for the prompt responses.

  6. Geordie,
    I am now looking at your video presentation:
    “So you’ve built a quantum computer… now what?”.

    You have made a mistake when you compared the market cap of three industries (pharma, chemical and biotech) with the GDP of the US in 2004.
    You say:
    ” so, if you sum all the wealth and value of these three industries, you get like a significant fraction of the total GDP of the US “.

    While I agree that the three industries can be seen as enormous, it is a mistake to compare the market cap of some industries with the GDP of a country.

    The annual GDP is the annual Gross Domestic Product, that is, the value of all the goods and services produced within a nation in a certain time.
    http://en.wikipedia.org/wiki/Gross_domestic_product

    The market cap (market capitalization) of a company is the value of that company according to the market:
    http://en.wikipedia.org/wiki/Market_cap

    Comparing the GDP of a nation with the market cap of a company is misleading.
    It would be better to compare the GDP of a nation with the revenue of a company ( or an industry ).
    The revenue of a company is (usually) much less than the market cap.

  7. Matteo:

    I’m going to respond to some of your points when I free up some bandwidth, but I thought I’d just clarify this GDP issue.

    The comparison wasn’t meant to suggest that the market cap of a company and the GDP of a country are equivalent concepts. Of course they’re not.

    However the market cap of a company is generally a linear function of trailing revenue; for example 2x trailing revenues is often used as a rule of thumb.

    If this rule of thumb holds, then the three sectors I highlighted had roughly $1.5 trillion of revenue in 2004 (given the time, we could go and check what the actual number is). This means that the “value of all the goods and services produced by these companies” (note the similarity to GDP language) was about $1.5 trillion in 2004.
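
    A minimal sketch of that arithmetic; the combined market-cap figure below is a placeholder for illustration, not the number from the presentation.

    ```python
    # Back-of-envelope version of the rule of thumb: market cap ~ 2x trailing revenue.
    combined_market_cap = 3.0e12       # placeholder: pharma + chemical + biotech, 2004
    cap_to_revenue_multiple = 2.0      # the rule of thumb quoted above

    implied_revenue = combined_market_cap / cap_to_revenue_multiple
    print(f"Implied 2004 revenue: ${implied_revenue / 1e12:.1f} trillion")
    # ~$1.5 trillion, comparable to the GDP of a mid-sized country.
    ```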

    This number is comparable to the GDPs of many countries in 2004. So I stand by my comparison–it’s legit.

  8. Geordie,
    yes, sorry for seeming overly critical.
    That was just a very minor criticism of your presentation; it seemed to me that, when you make that comparison, speaking about national GDP and company revenues would be more appropriate.

    But the real fact is that the potential of a QC which could be used for drug testing, material testing, etc. is huge.
    Cannot wait for presentation day.

    Keep up the good work!!

  9. I don’t think any of these companies has any reason to be interested in QCs. The real competitors with QCs are supercomputers. The business models of IBM, Intel etc. will not be disrupted by QC for at least 20 years, and most probably a lot longer. Companies like Intel, IBM and HP are in the business of selling software and computers to businesses. Most businesses do not need supercomputers. They need better security, greater productivity, better software, cheaper computers, greater reliability. QCs will not solve their problems.

    I see QCs as a niche market in the short term which will be used in very specific applications and problem domains. In the long term QC modules may become parts of larger and more conventional processors, kind of like graphics cards or physics cards for gaming.

    Actually the real revolution and the big leap will not be QCs. It will almost certainly be optics. The reason is that to truly get faster speeds you need to use higher-frequency electromagnetic waves, which basically means eventually going into the optical domain. Also, optical processors will be needed for switching and processing data by telecommunications companies which have all-optical infrastructures. As the cost of these processors goes down they will eventually become a consumer technology. Of course the QCs of the future will be optical too.

  10. Jaja,
    I am desperately waiting for the 15th of February (or is it the 13th?) to see how Geordie’s public demonstration goes.
    I do not know where this quantum computing is leading; I saw the presentation “You built a QC, now what?”.
    It seems that Geordie was pointing to the big companies working in the pharma and genetics businesses.
    I really hope that DWave can make a machine which could crack NP-complete problems, which otherwise will not be solved in 50 years, not even with supercomputers in the exaflop range.
    If you think about it, there is no conceivable computing method, other than QC, that can scale up with the complexity of NP-complete problems.

    I do not see how optical computing can solve NP-complete problems much faster than conventional computers.
    I have been hearing about this optical chip debate since I was about 15 (I am 34 now), reading Jerry Pournelle’s column in Byte magazine; still, the optical chip is nowhere in sight.

    Other “exotic” microarchitectures (DNA computing, carbon nanotube chips, …) may provide huge enhancements in speed (if and when they become viable products), but they will not solve the fundamental problem that they cannot compute the way quantum computers do.

  11. Matteo,

    Optical computers will not help you solve NP-complete problems, but so what? Solving NP-complete problems is only important in certain areas, for instance scheduling, place-and-route, etc. In these areas there are already approximation algorithms that are good enough for most purposes. My basic point is that solving NP-complete problems is a small niche market. QCs are not that important for most mainstream computing.

    As for optics, you do realize that there has been a revolution in this field. Theory, experimental techniques and practical applications have been continuously advancing. DVDs, CDs, and most importantly high-speed optical communications are all the result of this. In 1987 the highest possible bit rate on optical fibers was about 1.7 Gb/s. It is now 14 Tb/s. The reason this has not affected anybody is that the last-mile problem has not yet been solved and most people are still receiving copper or cable connections instead of optical fibers. The capacity of these networks is so incredible that the electrical components are now the limiting part of the network.

  12. Jaja,
    I do not agree with you.
    NP-complete problems do not represent a niche market. There may be a niche market for the algorithms that currently attack NP-complete problems, since they are just not fit for those problems, but if you look at the whole list of problems that are NP and are intractable by today’s algorithms/computers, the list is very long:

    – weather forecasting;
    – many financial models require NP-complete algorithms;
    – the behaviour of atoms at the nanoscale;
    – protein folding;
    – even chess (I mean computer chess) has algorithms that are almost NP-complete;

    and the list ends here only because many problems which are NP have not even been attacked yet; the possibilities of working with a QC have probably not even been fully considered.

    As for optics, the trend is the same as that of the semicon.

    The first IBM-compatible computer I bought was a 286 with 1 MB of RAM and a 10 MB HDD, and it was considered quite powerful at the time.

    Now, HDD capacity has increased 100,000-fold (we are at the TB level), memory “only” 1,000-fold, and I do not remember how many MFLOPS my computer was supposed to run at back then, but I imagine there has been a jump of a few orders of magnitude in computing power as well.

    Still, many of the problems that were intractable in the 80s, are intractable now, and will be intractable in the foreseeable future.

    There are problems, essential for simulating drug behaviour or for understanding the behaviour of particles at the nanoscale, that require literally millions of years of computing time on Blue Gene, the 100+ TFLOP monster that dominates the current supercomputer list.

    That means that even if we get a 100,000-fold increase in computing power in the next 20 years (which is improbable, as we are reaching some dead ends inherent in the physics), and you get to work on a yottaFLOP supercomputer (kilo, mega, giga, tera, peta, exa, yotta; did I get it right? No time to check), still many of the problems that are intractable today will be intractable tomorrow.
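
    To put rough numbers on this point (the 100,000-fold figure is the one from the paragraph above; the rest is generic arithmetic, not anything specific to Blue Gene or to D-Wave):

    ```python
    # For a problem whose runtime grows like 2^n, a hardware speedup of S only
    # buys about log2(S) extra problem size.
    import math

    speedup = 1e5                  # the hypothetical 100,000-fold increase above
    extra_n = math.log2(speedup)   # ~17 additional variables
    print(f"A {speedup:.0e}x speedup handles only ~{extra_n:.0f} more variables")
    # So a problem that is hopeless at n = 100 today stays hopeless tomorrow.
    ```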

  13. The Stan Williams article was written in 1999. It seems disingenuous to reference an 8-year-old article about “future technologies.”

  14. Geordie,

    One of the implicit historical reasons why there is little superconducting (SC) circuit activity for supercomputers of any kind (incl. QC) is the well-known failure at both IBM and Univac to produce any useful products with SC technologies…

    Now, part of this history is properly attributed to the lead/lead-oxide tunnel junctions that suffered miserable yield-killing defects from temperature cycling the soft lead. And that is the conventional wisdom…

    But even when Hypres was founded by IBM veteran Sadeg Faris to pursue niobium tunnel junction fabrication, no one using their foundry could successfully make a JJ-based CPU. This is well documented in a public report on supercomputer technology pathfinding at a government agency, undertaken to forestall perceived Moore’s law limits looming soon.

    Clearly the Nb JJs at Hypres did not suffer from stress-related defects of film temperature cycling to the degree that lead-based circuits did, and that is what drove Dr. Faris to pursue what IBM gave up on… But neither Faris nor his R&D customers apparently ever made useful high-integration CPU devices, which were the holy grail of the JJ crowd.

    The question becomes: how will you keep the same subtle SC defects from limiting your integration scaling, even if the circuit operation is not quite the same as in a digital logic device? Will the errors in the QC be harder to detect in computational results, even if the circuit-operation defects or parametric shifts are harder to discern?

    Just curious about the historical context in fabrication, yield and scalability (and possibly subtle computational errors in larger arrays).

  15. Frankly I am very eager for you guys to succeed, simply because such processing power could open doors to tons of problems that are practically unsolvable today. I will be looking forward to documents from your presentation; hopefully they will be available on your website.
    You claimed in a comment, if I remember correctly, that you would reach 1000 qubits by 2008 (even 100 qubits is way more than most apps need). That indicates that you have found a way to sidestep the current limitations in QC technology. As someone said here before, it sounds too good to be true, but I really do hope that it is true 🙂

  16. Read the file: stan_williams_qc_investment.pdf

    I think there are two points missing:

    1) Intel and IBM are not building cars.
    That is, if they do not produce new and more powerful chips, the demand for their products is zero.
    If you produce clothes, cars, furniture, etc., you do not need to completely change your product every two years to remain competitive.
    The engine of modern cars has been basically the same for the last 100 years, and clothes too, with some variations, have not changed that much.
    This is because cars, clothes, etc. wear out after a few years of use and need to be replaced.
    The same cannot be said of computers.
    You do not change your computer because the keyboard or the screen is no longer usable, but because there are new models around which are 5X more powerful (or 3X, or 10X, …).

    2) Moore’s Law is close to an end.
    http://mrtmag.com/news/intel_ibm_chip_020207/
    “We know 32nm is going to happen and we are very optimistic about 22nm,” he said. “When you get to 15nm, there are more questions.”
    Fundamental limitations on shrinking a chip beyond the 22nm node are becoming so relevant that current silicon technology cannot (in many experts’ opinion) go much further than the 22nm node (which is only 4 years away).
    So, the big guys in the industry have to come out with something new pretty soon, otherwise they will not be able to scale transistor performance much more, and that will lead to the end of Moore’s law, the leveling of the market, and the end of the huge profits that Intel and IBM (semicon division) are making.

    So, with the “end” near and no clear prospect in sight, it would, in my opinion, make sense for everyone in the industry to invest heavily in QC.

    Maybe they are investing heavily in some other technology (carbon nanotubes, I suspect).
    http://www.reed-electronics.com/semiconductor/article/CA6413746?spacedesc=news
    ” Where IBM is investing a lot of time and money, Chen said, is with carbon nanotubes. “We think this is the most realistic alternative to a charge-transport-based system,” he said. “We don’t want to totally disrupt the existing infrastructure, and we think the carbon nanotube is the closest in that respect.” Although spintronics is very promising, he added, that infrastructure would require a considerable change from what the industry is doing today. ”

    I assume the reason IBM, for example, is not investing enough in QC is that they are investing a lot in other technologies which are closer to the silicon processes currently in use.

  17. Matteo,

    I wish to make a few points that maybe you’re already aware of, but I will make them anyway:

    1) QCs cannot really solve NP-complete/hard problems. QCs only offer a quadratic speedup for NP-complete problems. The square root of an exponential function is still exponential, so even for QCs NP-complete problems are in practice intractable. All QCs could do is solve the problem a bit faster, assuming that QCs ran at the same clock speed (see the numeric sketch after point 2 below).

    2) For weather prediction you will be limited by the accuracy of the measurement of the initial conditions. In fact, weather predictions are probably already limited by this. So once again QCs won’t really help you. Also, I don’t think weather prediction is NP; I think it’s in P.
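
    A numeric sketch of point 1 (generic arithmetic only; no particular QC architecture is assumed):

    ```python
    # A quadratic (Grover-type) speedup turns a 2^n search into roughly 2^(n/2) steps.
    # Both columns below still grow exponentially with n.
    for n in (40, 60, 80, 100):
        print(f"n={n:3d}  classical ~ 2^{n} = {2.0 ** n:.1e}   "
              f"with quadratic speedup ~ 2^{n // 2} = {2.0 ** (n // 2):.1e}")
    ```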

  18. Pingback: ValleyProofs » Quantum Computing Demo at the Computer History Museum

  19. Your Celera example isn’t a very appropriate one. Celera produced a draft Whole Genome Shotgun of the human genome. There wasn’t any confusion at the time about how long it would take to sequence the genome. The public effort’s goals and timelines were related to producing an accurate and complete genome sequence. Celera’s goal was to produce some amount of coverage over most of the genome. There were few technological or scientific challenges that needed to be overcome to produce a draft genome sequence (even at the time) because, by the nature of the product, you ignore the hard or confusing stuff.

    For a more current example you can look at the mouse sequencing project, which produced a WGS and assembly just over 4 years ago. Where is the complete mouse sequence? It is just about here now. The WGS approach (Celera) only changes the type of collection to broad coverage across the genome, rather than deep coverage of a specific portion of the genome in the BAC-by-BAC approach. And let me tell you, if you want a complete genome after you’ve spent the money producing a WGS draft, it is 10-50x the work of producing a complete sequence from a BAC-by-BAC approach in the first place. This doesn’t mean that WGS isn’t applicable to a wide range of projects, and because of the lower perceived cost it is used for many projects today; it just means that WGS is another tool that has its appropriate uses, and the human genome sequence really shouldn’t have been one of them.

  20. > All QC’s could do is solve the problem a bit faster assuming that
    > QC’s ran at the same clock speed

    Maybe quadratic speedup is not great for pure theoreticians, but it is not a BIT faster; it allows solving problems that are practically unsolvable by any classical linear-time search.
    Consider 2^35 ≈ 34×10^9: this one is easily enumerated by today’s computers.
    Now consider 2^70 ≈ 12×10^20.
    Even if you multiply current clock speeds by 100 and run it on 1000 processors, it will not help.
    And the difference between 35 and 70 variables in an NPC problem is extremely significant for many real NPC graph problems.
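
    Spelling out the gap in that comparison (the 100x clock and 1000 processors are the figures from the comment above; the rest is generic arithmetic):

    ```python
    # Going from 35 to 70 variables multiplies the search space by 2^35,
    # far beyond any plausible hardware speedup.
    needed_speedup = 2 ** 70 / 2 ** 35   # = 2^35, about 3.4e10
    hardware_speedup = 100 * 1000        # 100x clock speed on 1000 processors

    print(f"needed: {needed_speedup:.1e}x, hardware offers: {hardware_speedup:.0e}x")
    # A quadratic (Grover-type) speedup, by contrast, brings the 2^70 search down
    # to roughly 2^35 steps, which is back in enumerable territory.
    ```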

  21. Also, I agree with Matteo that improving solution performance on NPC problems is definitely not a niche market. Many graph problems are NPC.
    As a Ph.D. student who works on natural language processing, I encounter such problems at every turn, since for now many language processing algorithms, from parsing to semantic processing, work on relatively small subgraphs. For improving semantic learning and processing, approximations are good but not good enough.

    Another domain which is filled with NPC problems is logical inference. Essentially, many useful theorems could be proved if we could use such a “small” quadratic speedup. For now, logical inference crawls slowly from pure research to practical applications, since it is not easily scalable. If they succeed in finding QC algorithms which quadratically speed up some types of logical inference, there should be a huge market for this coming from database processing, web search and renewed expert systems.

    Can’t wait 2 years to see if you indeed succeed in scaling it up 🙂

  22. Thank you for your article. Your clarifications reminded me of the following remark (paraphrased): “if we ask the customers what they want, they’d ask for a faster horse (instead of a car)”. I don’t know whom to attribute this to.

  23. I think that IBM and HP make most of their revenue from selling machines to do really simple things with almost linear complexity (we’ll say N.log(N) is about the worst case allowable)… but with very high reliability and well thought-out back-up and disaster recovery plans.

    e.g. automating the accounting at world banks, or keeping inventories for the military, such that they can process thousands of transactions per second, 24/365, even if there is an earthquake or a fire at one of the record centres. This is where most of the market for large computers is today, and a quantum computer is never going to compete for that application.

    If I were someone at IBM or HP looking to make an extra billion of sales in five or ten years’ time, I’d be asking myself whether I wanted to spend that $50M today on researching quantum computing, or to use that money to try to win some of the existing market share off my competitors by improving my existing products.

  24. Richard George wrote:
    “and a quantum computer is never going to compete for that application”

    I cannot see why…

    “If I were someone at IBM or HP looking to make an extra billion of sales in five or ten years’ time, I’d be asking myself whether I wanted to spend that $50M today on researching quantum computing, or to use that money to try to win some of the existing market share off my competitors by improving my existing products.”

    I think you are missing a few points:
    1) $50 million today may well be worth a few “extra billions” of sales in 5 or 10 years’ time;
    2) Moore’s Law is in danger of stopping in 3-4 years, at the 22nm node, as nobody at the moment has a clear idea of how to proceed to a hypothetical 15nm node. Current silicon tech is probably not stretchable to 15nm, and no clear alternative has appeared so far;
    3) quantum computing does not seem 10 years away, as D-Wave is working on a 1,000-qubit QC on a 1.5-year timeframe;
    4) if $50M means $50M per year, D-Wave seems to have built a QC with less money than that.

  25. Pingback: 11 questions for Geordie D. Wave « we don’t need no “sticking” room 408


