Scientists have been dreaming of and discussing quantum computers for at least 50 years (for a nice timeline of the ideas and implementations of Quantum Computing click here), but the construction of an effective QC has proved elusive. Big players, like Google, IBM and NASA in the US, national scientific organisations in Japan, South Korea (which in 2019, after earlier investments, started a five-year programme on QC), and Russia (the Russian Quantum Center), are putting significant effort into turning QC into reality, and a vibrant worldwide scientific community is exchanging ideas and experiences (the IEEE FDC has a Quantum Computing Initiative).
We have seen the development of a number of QCs, most in labs but a few, like D-Wave's, becoming marketable products (although the discussion is still going on whether these are really QCs or just a good approximation). They have been put to work in very narrow areas, however, and none was advanced enough to compute something that couldn't have been computed by a classical, von Neumann computer.
This is what Quantum Supremacy is all about: having a QC that can perform a computation that, for all practical purposes, would be impossible (would take far too long) on a classical computer.
As you have surely seen in the newspapers, this last week a rumor circulated that a Google QC had achieved this quantum supremacy. The rumor is based on a paper that appeared on a NASA website but was immediately removed. Of course, once you put something on the web you lose control over it, and even if you remove it there is a good probability that it can be found somewhere else. In this particular case you can find the paper here 😉
According to the (copy of the) paper, the joint work of Google and NASA has produced a QC consisting of 54 qubits, 53 of them operational, and that QC has performed in 200 seconds a computation that would have taken our best classical computers over 10,000 years. That would indeed be a situation where one could claim quantum supremacy.
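To get a feel for why 53 qubits is such a hard target for classical simulation, consider a back-of-the-envelope calculation (my own illustration, not from the paper): a brute-force simulation must track 2^n complex amplitudes for an n-qubit state, so memory alone grows exponentially.

```python
# Rough memory estimate for a full state-vector simulation of an
# n-qubit quantum computer on a classical machine.
# An n-qubit state has 2**n complex amplitudes; storing each as a
# complex number with two 64-bit floats takes 16 bytes.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes needed to hold the full 2**n_qubits state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 53):
    gib = state_vector_bytes(n) / 2**30  # convert bytes to GiB
    print(f"{n:2d} qubits -> {gib:,.0f} GiB")
```

At 30 qubits the state vector fits in 16 GiB, a laptop's worth of RAM; at 53 qubits it balloons to roughly 128 PiB, far beyond any existing machine, which is why classical simulations of such circuits must resort to clever (and very slow) tricks instead.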
Other people, however, are saying that such quantum supremacy is very narrow and of little practical use, because it applies only to one very specific computation. Change the problem and the QC can no longer deliver a solution.
IBM announced (and I posted a comment on that) a QC with 50 qubits; soon after, Intel announced their own QC chip with 49 qubits, and Google hinted they were working on a 72-qubit one (Bristlecone). The problem with all these announcements is turning the hardware (and software) into something that can be used to address a variety of problems (this is something that D-Wave lets you do).
Personally, I remain in "waiting mode": I do not think we have reached the pot of gold. It is still there, at the end of the rainbow.
For a nice overview of QC, who is working on it, and what we will be able to do once we have it, take a look at the video.