Is there a Moore's Law for qubits, meaning do they increase exponentially with time? If so, we are halfway there (on a log scale) from when we started with 1 qubit: log(133)/log(10000) is approximately 0.5.
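A quick sanity check of the arithmetic (133 qubits today against a hypothetical 10,000-qubit target, both figures from the comment above, neither from any official roadmap):

```python
import math

# "Halfway there" on a log scale: progress from 1 qubit toward a
# hypothetical 10,000-qubit machine, given 133 qubits today.
progress = math.log(133) / math.log(10000)
print(round(progress, 2))  # → 0.53
```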
Unfortunately the qubit count is not really a good parameter to track progress by. Anyone could just copy-paste their qubits across the chip today and claim to have the largest quantum computer. Making qubits interact with a low enough error rate is the hardest part.
I really wish there were a better parameter, one that would also tell us when these machines become useful for breaking crypto.
I agree, this was done some time ago. It's still rather difficult to come up with a good measure because:
* the number of qubits that can be used to construct a circuit is not always the same as the number of qubits on the processor
* the number of iterations that a circuit can be run for is often obscured
* the amount of time and effort required to construct a circuit is often obscured
* error rates - as you mention
Simple formulas over these numbers are not a good measure: if an algorithm needs 10,000 iterations to work, then increasing the number of qubits 100x while the maximum number of iterations stays at 100 will not get you a working machine.
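A toy model (my own illustration, not a standard benchmark) of why qubit count alone isn't a good measure: assuming independent per-gate errors, the probability that a circuit runs with no error decays exponentially in the total gate count, so adding qubits without lowering error rates makes things worse, not better.

```python
def run_success_prob(n_qubits: int, depth: int, gate_error: float) -> float:
    """Probability that a circuit of `depth` layers over `n_qubits` qubits
    executes with no gate error, assuming independent errors per gate."""
    gates = n_qubits * depth
    return (1 - gate_error) ** gates

# At a 0.1% per-gate error rate, a 100-qubit, depth-100 circuit almost
# never runs cleanly:
print(run_success_prob(100, 100, 1e-3))    # ≈ 4.5e-05

# 100x more qubits at the same error rate only makes it (much) worse:
print(run_success_prob(10000, 100, 1e-3))  # effectively zero
```

The numbers (100 qubits, depth 100, 0.1% error) are arbitrary, chosen only to show the scaling.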
The real problem is that this is all live science being done as commercial development. None of these machines are close to being useful (as in 20 years away). The science is brilliant, though, and the skills being developed in building these devices will be very useful in the future. It's just that the money is going into it under false pretenses. For China and the USA this is actually a good thing: it's driving the basic science, and that needs to happen somehow. There will be dividends in the future for all of us. For places like Canada, France, Japan and the UK it's bad, though. Those economies need to reap benefits from their current investments within the next decade. In that sense QC money is just poured out onto the ground.
>The real problem is that this is all live science being done as commercial development.
That's been my observation regarding quantum computing since I was first exposed to it around a decade ago: these are really cool science experiments (in a very literal sense) that are being billed as early-stage product development by the companies involved. It's giving the public the impression that quantum computing is at the stage of Woz wiring together the first Apple I in his garage, when in fact we're at the stage of the research done by Geissler and Crookes in the 1850s that would lead to the first vacuum tubes 50 years later.
I think that’s a gross overstatement. My impression is that these are PoCs demonstrating certain properties but not actually useful for computation. It’s an important milestone, but I think it’s too early to tell how close we are to cracking 1024-bit RSA, assuming it’s even possible to achieve anything near the theoretical speed-up in reality. Remember: the theory is based on our current models, but in practice real-world physics comes into play and starts limiting the ability to scale this up to perform computations faster than classical approaches. It’s entirely possible that there’s a missing link in our model that would prevent us from realizing anything like the idealized quantum computer that could solve BQP/EQP problems efficiently.
So we are 1% of the way there!