The Quantum Advantage remains a fuzzy area

It looks futuristic and indeed it is! A quantum computer dressed up in a blue casing. Image credit: Xanadu

We have been hearing about Quantum Computers for several decades now. In the last decade, the promise of finally getting a Quantum Computer that could be used for real applications seemed within reach, to the point that when John Preskill coined the term Quantum Supremacy, most lay people assumed that at some point QCs would replace classical computers. That was not Preskill's intention: he actually defined Quantum Supremacy as the point when a QC would be able to perform a specific computation that classical computers could not carry out in practical terms (say, one that would take a billion years of computing time).

More recently, Quantum Supremacy was renamed Quantum Advantage (to avoid association with White Supremacy, a no-no word), and even more recently Quantum Computational Advantage. This latter term is interesting, since it has been used to point out that the "advantage" may not be a practical one, something you can use in a needed application, but merely a demonstration that certain types of computation are feasible with a QC and basically unfeasible with classical computing.

I found the article in Wired very clear in examining the current status of QC. One interesting point raised in the article is that this Quantum Computational Advantage is not a clear dividing line but rather a fuzzy area that keeps moving. As soon as some research team shows how much faster a QC is with respect to a classical (super)computer, as Google researchers did in 2019, another team comes up showing that a classical (super)computer can achieve a comparable level of performance.

This might also be the case with the Quantum Computational Advantage announced by Xanadu: a simulation of the quantum computer on a supercomputer, or the discovery of a new algorithm that can run on a supercomputer, may prove that the advantage is not there. So far that has been the case.

However, what is really crucial, at least to me, is that industry is looking at QC for practical, everyday applications (like Pharma discovering and experimenting with new molecules), and we are not there yet (although some niche areas are already using QCs, like the ones produced by D-Wave).

One thing, however, is clear: QCs are not expected to replace classical computers; rather, we might find a QC chip embedded in some supercomputer to perform specific computations.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.