Quantum computation is progressing both at the theoretical level and in the lab, in terms of prototypes. There are also a few “products” on the market focussing on very specific applications. Google and IBM are among the leaders in quantum computing and in the race to implement a practical system.
A few days ago IBM presented to the public, in Stuttgart, Germany, the first IBM System One (watch the clip). China and the US have been working on quantum computers for years. Now Europe hopes to catch up. Ten German companies – including BMW, Siemens and Bosch – have teamed up in a joint initiative called “Qutac” to develop applications for the computer. IBM’s System One will run the applications.
Google is aiming to deliver a “useful” quantum computer by 2029 (and the crucial word here is “useful”).
In a recently published article, IBM researchers claim to have demonstrated for the first time the practical advantage of quantum computers over classical computers in a real-life scenario (that is, an advantage not restricted to specific classes of problems). They provided both a mathematical proof and an experimental demonstration of the advantage of using a quantum computer over a classical computer for scratch storage (the storage used within a computation as temporary space). This is a general problem that applies to any sort of computation.
In their paper they consider current quantum computers, with noisy q-bits (meaning they are not “perfect” q-bits), and demonstrate that even with the q-bits achievable with today’s technology there is an advantage over classical computers.
This is very nice, although it cannot be seen as game over for classical computers. Google’s already mentioned expectation that quantum computers will become generally useful in 2029 is a case in point.
Intel is working on cryogenic quantum chips (cryogenic because they operate at extremely low temperatures to preserve coherence, i.e. keep noise low), and we might expect that by the end of this decade we will start to see a merging of the classical computer architecture (chip/software) with quantum computing architectures (chip/software). This is the type of evolution I would bet on: progress in classical computing and in quantum computing eventually resulting in a merging of the two, as we have seen happening recently with the merging of CPUs and GPUs, like in the Apple M1 chip.