
The truth and the hype of Quantum Computers

Quantum computers are complex machines, both to build and to operate. And a number of scientists feel that what we have today is still far from being a “quantum computer”. In the photo, a detail of an IBM quantum computer. Image credit: IBM

I am no expert in quantum computers, nor in quantum computing. I read a lot about both, and I have attended quite a number of meetings where quantum was discussed (including those at the IEEE Future Directions Committee, where there is a Quantum Initiative), yet I remain uncertain about the real current status of QC.

On the one hand, I know that companies like D-Wave have been selling QC for a few years now, and companies like IBM have created online labs that researchers can access to work on QC and perfect algorithms… On the other hand, I keep hearing scientists state that what we have today is a tiny bit of a QC, and that we have not yet found ways to scale that “tiny” into something that can be used.
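To give a concrete idea of what “working on QC to perfect algorithms” means on those online labs, here is a minimal sketch in Python using IBM’s open-source Qiskit library (an assumption on my part: the exact package layout, here qiskit plus qiskit-aer, varies across versions). It prepares a two-qubit Bell state and samples it on a local simulator, which is typically the first step before submitting the same circuit to real hardware.

    # Minimal sketch: prepare a Bell state and sample it, the kind of
    # toy experiment researchers run on IBM's online quantum labs.
    # Assumes the qiskit and qiskit-aer packages are installed.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)             # Hadamard puts qubit 0 into superposition
    qc.cx(0, 1)         # CNOT entangles the qubits: (|00> + |11>)/sqrt(2)
    qc.measure([0, 1], [0, 1])

    # Run on a local noiseless simulator. On real hardware the counts
    # would also include '01' and '10' because of noise and decoherence.
    counts = AerSimulator().run(qc, shots=1024).result().get_counts()
    print(counts)       # e.g. {'00': 515, '11': 509}

On the ideal simulator only ‘00’ and ‘11’ appear; the gap between that and what today’s noisy devices return is precisely the gap the scientists quoted above are pointing at.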

We have been talking (most of us just listening to others talk) about QC for well over 30 years. During this time “normal” computers have made steady and amazing progress. Not so for QC. They sit at the end of the rainbow, promising a pot of gold that seems within reach yet is never actually reached.

18 months ago PC Magazine published an article titled “Quantum Computing: a bubble ready to burst?”, referring to the huge investment in the area of QC that was not really delivering. Indeed, a lot of money has been, and is being, poured into QC research and start-ups, a good portion of it (at least in Europe) being public money.

The reasoning goes like this: when QC becomes a reality it will take the world by storm, generating huge revenues. We cannot afford to be outpaced by other companies/countries.

What is a given is that QC can solve computational problems that are beyond the reach of today’s computers (integer factorisation, the target of Shor’s algorithm, and the simulation of quantum systems in chemistry are the classic examples). What is not a given is that the solution to such problems will attract/generate huge amounts of money, far greater than what is generated by today’s computers. Notice that QC will not, at least for the foreseeable future, replace today’s computers: they will flank them. Hence, we are not facing a situation where the successful implementation of a general-purpose, affordable QC will result in the replacement of current computers.

This will not happen because:

  • QC work well in narrow, specific domains of computation
  • QC are far, far away from being miniaturised, hence they won’t be able to replace the microchips that are ubiquitous today
  • QC cost a lot of money. Sure, we know that technology costs are bound to decrease, but it takes time for that to happen. Current chips have an extremely low cost relative to their performance, but it took 50 years to get there. Even under the most optimistic assumptions, it will take several decades to make QC as affordable as today’s processing. Also take into account that today’s chips will continue to improve, hence any technology aiming to replace them should target not their current performance but the performance they will deliver 10, 20, 30 years from now.

In addition, it is not a given that we are going to have an affordable, multi-purpose QC any time soon!

Warnings are being raised by many scientists and researchers that QC is being hyped well beyond its “practical” capabilities. An article published by MIT Technology Review, “Quantum computing has a hype problem”, is surely worth reading. The article is particularly significant because it is written by a well-renowned “quantum” supporter with deep technical knowledge of the field. Because of that, its evaluation of the present situation is all the more credible, with its emphasis on promises of reaching the holy grail without real substance to support the claims.

Sure, in a way we already have QC: big companies from Google to IBM are investing in QC and have published roadmaps of expected progress (look at the IBM one). In spite of that, QC remains an elusive target when compared to what has been achieved in classical computation (Moore’s law).

I personally feel that we are going (probably, you are going; I am likely not to be alive when this comes to pass) to see fusion come of age before quantum does. And, just to be clear, by “come of age” I mean a situation where QC is as common and normal as conventional processing is today.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master’s course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.