IBM demonstrates 133-qubit Heron (tomshardware.com)
119 points by rbanffy on Dec 20, 2023 | 62 comments


Who here missed https://news.harvard.edu/gazette/story/2023/12/researchers-c... Put simply: the breakthrough there (not related to the IBM news in the parent link) massively cuts the physical qubits needed for error correction, in the pursuit of logical qubits. What they demonstrated was vastly more efficient than both the current state of the art and what IBM hopes to achieve in its next-generation chip (better than the ~1000:1 industry benchmark, though it's not an apples-to-apples comparison). It seems to have pulled game-changing quantum computing years closer to reality. The paper: https://www.nature.com/articles/s41586-023-06927-3

Sorry if it's seen as off topic, my standalone submission on it didn't generate discussion https://news.ycombinator.com/item?id=38705445


Don't get me wrong, their (QuEra's) demonstration is incredibly impressive, but it seems you've been misled by inconsistent nomenclature around the phrase "logical qubit". They've demonstrated a 5:1 encoding scheme, yes, but that scheme is nowhere near redundant enough to allow for deep quantum circuits. When people talk about needing 1,000 physical qubits, they mean making a logical qubit with an error rate low enough to run interesting algorithms. When the QuEra team say they "made 48 logical qubits out of 240 physical qubits", they simply mean that they used an encoding, with no claim that the error rate on those qubits is low enough. There is no hope (that I know of) for a 5:1 encoding scheme to make error rates low enough; the QuEra device would likewise need many more physical qubits per logical qubit.
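
For a rough sense of where the ~1,000:1 figure comes from, here's a back-of-envelope sketch using the commonly quoted surface-code heuristic p_L ≈ 0.1·(p_phys/p_th)^((d+1)/2) and roughly 2·d^2 physical qubits per distance-d logical qubit. The error rates plugged in are illustrative assumptions, not measured numbers:

    def surface_code_overhead(p_phys=1e-3, p_th=1e-2, p_target=1e-12):
        # Common heuristic: logical error rate ~ 0.1 * (p_phys/p_th)^((d+1)/2),
        # with roughly 2*d^2 physical qubits per distance-d logical qubit.
        d = 3
        while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
            d += 2  # surface-code distances are odd
        return d, 2 * d * d

    d, n = surface_code_overhead()
    print(f"distance {d}: ~{n} physical qubits per logical qubit")
    # prints a distance around 21, i.e. on the order of 1,000 physical qubits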


I want to point out that the experiment was at Harvard in the Lukin group. There is a proposal for constant-rate encodings using large quantum low-density parity check codes via atom rearrangement, which could in principle achieve such a high encoding rate. That said, it's certainly not mainstream yet. https://arxiv.org/abs/2308.08648


Yes, good point (apologies to the Lukin group). That's an interesting proposal, but it seems from a cursory read that you would still need very many physical qubits to approach that asymptotic rate, and you would also be forced to take a very large slowdown from serializing all of your logical operations through a smaller set of conventionally encoded logical qubits. That said, I'm not current on state-of-the-art LDPC QEC proposals, so I'll moderate my claim a bit to "the first actually useful logical qubits will almost certainly have an encoding rate lower than 1/5".


"Peanut Gallery" here... These types of conversations are the reason I'm still addicted to HN.

Thank you both. And /hattip


By your estimation, what does this practically mean in terms of: we may be able to do X, Y, and Z in ______ years now because of this development and other recent innovations in quantum computing.

I’m more curious about this from a consumer angle - anything with at least a downstream consumer impact that is exciting? And even if no consumer impact for the foreseeable future, is there anything “cool” or more tangible than massive encryption breaking?


The encryption angle is somewhat uninteresting. Old recorded messages will be broken eventually, quantum computing or not. Thwarting the quantum threat for new keys just requires a constant factor of additional work (key size, encryption cost). Thwarting it without much overhead is an ongoing research problem.
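
For symmetric crypto the constant factor is concrete: Grover's search halves the effective security exponent, so doubling the key length restores the margin. Trivially:

    # Grover searches N possibilities in ~sqrt(N) steps, so a k-bit
    # symmetric key offers only ~k/2 bits of security against it.
    for k in (128, 256):
        print(f"{k}-bit key: ~2^{k} classical guesses, ~2^{k // 2} with Grover")
    # e.g. moving from a 128-bit to a 256-bit key restores ~128-bit security.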

The ______ years question is everyone's best guess. X, Y, and Z though are quantum simulations. Drug discovery, superconductors, "better" alloys (use-case dependent), .... It's shockingly expensive to simulate a properly quantum system, and that cost increases exponentially in its size. 1 atom is manageable usually, ignoring relativity. 2 is an interesting research problem. 3 are tractable if they're homogeneous and you're patient. Everything beyond that requires huge simplifications or clever insights (or both). Quantum computing linearizes that simulation cost, for any problem small enough to fit inside the given qubit bounds.
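
To make "exponentially expensive" concrete, a brute-force statevector simulation of n qubits stores 2^n complex amplitudes. A rough illustration (real simulators use cleverer representations where they can):

    # Memory for a dense statevector: 2^n amplitudes, 16 bytes each (complex128).
    for n in (10, 30, 50):
        gib = (2 ** n) * 16 / 2 ** 30
        print(f"{n} qubits: {gib:,.0f} GiB")
    # 10 qubits: ~16 KiB. 30 qubits: 16 GiB. 50 qubits: ~16 million GiB.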

One way to estimate the effect might be to look at current chemistry-related AI/ML news. It's a noisier version of the same results. If that winds up being useful, then quantum computers will be even more useful (though perhaps not worth the cost). If that doesn't advance materials research enough, then it's less obvious that a quantum computer will be helpful. Maybe, maybe not.


> Old recorded messages will be broken eventually, quantum computing or not.

That may well take multiple lifetimes, greatly reducing its impact. Mere increases in computing power will not break 256-bit elliptic-curve cryptography for centuries; that will require algorithmic breakthroughs or large-scale quantum computers.


This article is weirdly incoherent. I thought Tom's Hardware was reasonably high quality. Did it get sold or something?


I also remember it being great in the aughts, but it's really gone downhill.

Your question left me wondering, and as it turns out, they were indeed sold in 2018: https://www.prnewswire.com/news-releases/purch-finalizes-sal...

See also https://en.wikipedia.org/wiki/Purch_Group

https://webcache.googleusercontent.com/search?q=cache:4FRODO...


Tom's Hardware is like the BuzzFeed of tech news. It's low quality. Quantity over quality.


Can you recommend something high quality?


They may outsource some of the writing to an overseas platform that pays by the piece.


> When what used to cost 1,000 qubits and a complex logic gate architecture sees a tenfold cost reduction, it’s likely you’d prefer to end up with 133-qubit-sized chips – chips that crush problems previously meant for 1,000 qubit machines.

Is it just me (tired, and overworked) or is this poor writing?

edit: Oh man, this article is dreadful, what the heck? Is this ESL or AI or what is going on?

Another example picked at random, that's not even trying to discuss a complex topic:

> It's hard to see where the future of quantum takes us, and it’s hard to say whether it looks exactly like IBM’s roadmap – the same roadmap whose running changes we also discussed here.


The writer is based in Portugal. I'd blame bad translation and lack of edit by a native English speaker, plus some vague wording that doesn't readily translate.


Exactly. Don't attribute to AI what can be explained by poor English skills.


Yes, this is borderline incoherent.


This was clearly not written by a human.


LLMs are almost always grammatically correct (it's one of the first things they learn); it's their semantic correctness/grounding that's lacking. You would be very hard-pressed to get language like this out of any modern LLM that's actually worth using.


> … democratized access to hundreds or thousands of mass-produced Herons in IBM’s refrigerator-laden fields …

#brandnewsentence


Increasingly, poor writing suggests that it was written by a human.


This is some "AI-Pin"-level writing. Thousands of words, saying very little.


The next step in the browser wars is integration with an AI summarizer.


Swiftly followed by the development of models that generate content that resists summarisation.


But that's also de rigueur for B.A. Communication majors who get paid by volume, not quality.


Related: a critical review of IBM’s 1,000+ qubit computer and why it does not matter:

https://youtu.be/XlCsi8zagNw?si=HgayJIHJZNFpRyo-


But it looks cool as hell


Brace yourself for "Watson" branding and IBM marketing hype.


They'd do better naming it the WOPR.


Then they can sell clusters of the things— double WOPR, triple, and so on. Cheese is extra.



I'm still waiting to hear what the problem with this is.

Personally, I enjoy a Double-WOPR with cheese, once in a while. ;)


Ok, but what's their magic state fidelity?


That sounds more scifi than I can bear so early in my morning :-)


so, uh. What does it actually do? In any applied sense. The article doesn't seem to list what it's "useful" for.

and can it factor numbers as high as 35 reliably yet?


This important milestone in quantum computing allows the production of press releases which prevent quantum computing hype from being completely eclipsed by AI hype.


I don't think that's enough to get over the bar. Need to add some application to cold fusion and you've got it.


Only if the announcement page has so many ads, the content is literally unreadable.

I think then, we'll have it.


Just wait until they can start executing AI-related tasks with quantum computing in 20 years time. The hype will no doubt be insufferable.


The hype might end by that time. Maybe it will be "self-replicating self-organizing microrobots" by then...


Surely AI training can be cast as annealing, so we can get double hype.


Chasing through a bunch of links, their claim to "usefulness" seems to be based on this paper https://www.nature.com/articles/s41586-023-06096-3 where "Our benchmark circuit is the Trotterized time evolution of a 2D transverse-field Ising model, sharing the topology of the qubit processor" which sounds like a task picked to be easy for this particular quantum processor to solve, and maybe of interest to some physicists working with Ising models, but not really "useful" in the general sense.
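
For the curious, "Trotterized time evolution of a transverse-field Ising model" just means approximating exp(-iHt) by alternating small ZZ and X steps. A minimal sketch on a 4-spin 1D chain (the paper's model is 2D on 127 qubits, but it's the same idea):

    import numpy as np
    from functools import reduce
    from scipy.linalg import expm

    n, J, h, t, steps = 4, 1.0, 0.5, 1.0, 20
    I2 = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.diag([1., -1.])

    def op(pairs):
        # Tensor product with `pairs` {site: matrix}, identity elsewhere.
        return reduce(np.kron, [pairs.get(i, I2) for i in range(n)])

    # H = -J * sum_i Z_i Z_{i+1} - h * sum_i X_i  (1D transverse-field Ising)
    H_zz = -J * sum(op({i: Z, i + 1: Z}) for i in range(n - 1))
    H_x = -h * sum(op({i: X}) for i in range(n))

    exact = expm(-1j * (H_zz + H_x) * t)
    dt = t / steps
    step = expm(-1j * H_zz * dt) @ expm(-1j * H_x * dt)
    trotter = np.linalg.matrix_power(step, steps)

    psi0 = np.zeros(2 ** n, dtype=complex)
    psi0[0] = 1.0  # start in |0000>
    fid = abs(np.vdot(exact @ psi0, trotter @ psi0)) ** 2
    print(f"Trotter fidelity vs exact evolution: {fid:.6f}")  # near 1 for small dt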


The reason it doesn't list what the computer is "useful" for is because quantum computers currently don't have a lot of uses. Right now, it's mostly:

- Fourier transforms in log(N)^2 rather than N*log(N) time. Shor's algorithm uses this: for a number N to factor, the sequence a^0, a^1, a^2, ... (mod N) is periodic, with the period dividing phi(N) (Euler's totient function). A Fourier transform has a peak at that period, which you can quickly find on a quantum computer (see the sketch after this list).

- Speeding up unstructured search, e.g. database queries (Grover's algorithm, a quadratic speedup).
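
To make the first bullet concrete, here's the classical skeleton of Shor's algorithm, with the quantum period-finding step replaced by brute force (that loop is exactly the part the QFT speeds up):

    from math import gcd

    def shor_skeleton(N, a=2):
        # Factor N via the period of a^k mod N. The quantum speedup is
        # in finding r; everything else is this classical wrapper.
        assert gcd(a, N) == 1
        r = 1
        while pow(a, r, N) != 1:  # brute force; the QFT finds this peak fast
            r += 1
        if r % 2 == 1:
            return None  # odd period: retry with a different a
        x = pow(a, r // 2, N)
        if x == N - 1:
            return None  # trivial square root: retry with a different a
        return gcd(x - 1, N), gcd(x + 1, N)

    print(shor_skeleton(15))  # period of 2^k mod 15 is 4 -> prints (3, 5)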

You can also make communications more secure with some quantum stuff, but that isn't computing. In the future, some more interesting computer applications will be:

- Physical simulations, e.g. how proteins fold or how chemical reactions take place.

- Bayesian neural networks. After training, there's some distribution the weights could be in based on the initialization. Choosing a fixed set of weights results in overfitting, but if you instead take the expected output given that distribution, you get the most likely answer. That's obviously infeasible on a classical computer (though dropout approximates this), but possible on a quantum one.
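
For the Bayesian bullet, the classical Monte Carlo approximation of that expectation looks like the toy sketch below; the hoped-for quantum advantage would be evaluating the expectation over the whole weight distribution without drawing samples one at a time. (All names and sizes here are made up for illustration.)

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.array([0.5, -1.0])  # one toy input

    def net(x, w):
        # Tiny 2-4-1 network with a tanh hidden layer.
        return w["W2"] @ np.tanh(w["W1"] @ x + w["b1"]) + w["b2"]

    def sample_weights():
        # Draw one weight set from an (assumed) Gaussian posterior.
        return {"W1": rng.normal(0, 1, (4, 2)), "b1": rng.normal(0, 1, 4),
                "W2": rng.normal(0, 1, (1, 4)), "b2": rng.normal(0, 1, 1)}

    print("single draw:", net(x, sample_weights()))  # point estimate, overconfident
    preds = [net(x, sample_weights()) for _ in range(10_000)]
    print("posterior mean:", np.mean(preds), "+/-", np.std(preds))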


> You can also make communications more secure with some quantum stuff, but that isn't computing.

Care to elaborate? The only thing that I can think of is avoiding the current crop of side channel attacks, but I admit to only giving quantum computers cursory glances: still, I am very curious!


You send a photon with the polarization giving the bit:

Basis +: | = 0, — = 1,

Basis x: / = 0, \ = 1

You randomly choose a basis for each photon, but don't tell the receiver which basis it is until after they receive the photon. They randomly choose a basis to measure in, and once they learn which basis you used, they discard the bits where the bases mismatched.

The key thing is that an eavesdropper doesn't know the correct basis either. If they measure and try to pass the photon along, they'll re-encode it in the wrong basis half the time, so the receiver will end up with a wrong bit 25% of the time even when it should be correct. The sender can publicly compare some of the bits, and if the receiver has too many errors, it's likely someone is eavesdropping.
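
This is the BB84 protocol, by the way, and the 25% figure is easy to check in a toy simulation (assuming ideal single photons and a noiseless channel, which the limitations below relax):

    import random

    def bb84(n_bits, eavesdrop, rng=random.Random(42)):
        # Simulate sifting; returns the error rate on the sifted key.
        errors = sifted = 0
        for _ in range(n_bits):
            bit = rng.randint(0, 1)
            basis_a = rng.choice("+x")  # sender's basis
            basis_b = rng.choice("+x")  # receiver's basis
            received = bit
            if eavesdrop and rng.choice("+x") != basis_a:
                received = rng.randint(0, 1)  # wrong-basis measurement randomizes
            if basis_b != basis_a:
                continue  # bases mismatched: bit discarded during sifting
            sifted += 1
            errors += received != bit
        return errors / sifted

    print(f"no eavesdropper: {bb84(100_000, False):.3f}")  # ~0.000
    print(f"eavesdropper:    {bb84(100_000, True):.3f}")   # ~0.250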

Now, what bits do you actually send? Probably something like a "Learning With Errors" (LWE) cryptography key (since quantum computers kill RSA). Then you transfer your actual data over classical communications. The quantum part is just the handshake at the beginning. If you detect eavesdropping, you just send a new key over.

The current limitations are:

1. It's really slow. It can take seconds to send a 1,000-bit key just 1 km (figure 4, 10.1109/PHOTONICS49561.2019.00010).

2. You need a single photon source so the eavesdropper cannot pass along photons while reading the message. It also has to be "heralded", i.e. you need to know when that photon gets sent. Right now these are pretty slow to generate.

3. The error rate has to be really low, otherwise you can't detect eavesdropping.

4. Your detection system can't have backscattering or other side-channel leaks. To detect single photons, people usually use "avalanche detectors", where the photon excites an electron, which goes on to excite many more due to a potential difference in a p-n junction. However, this also creates small flashes every time a photon hits it.


Just to drop the name, we're talking about quantum key distribution (QKD).

> Now, what bits do you actually send?

You just send random bits and get a key. Then use the key however you want (AES, one-time-pad if you're crazy...). I guess that's actually the same as what you were saying...

> It can take seconds to send a 1,000 bit key just 1km

Toshiba marketing claims "13.7 Mb/s over a fiber distance of 10 km" [1]. I'm sure they also have a paper somewhere.

> You need a single photon source

No, people use "Decoy States", which allows you to use a "normal" weak pulsed laser. There is also continuous variable QKD which does not need single photons. You can use single photons (and some implementations do) but it's not mandatory.

[1]: https://www.global.toshiba/ww/products-solutions/security-ic...


Thanks for the corrections


There was a section of IBM's quantum computing challenge that had you encrypt/decrypt logical qubits by shifting their phase state by a known amount. If the state was altered in transit, then the inverse transform would not cancel out correctly.

The challenge also mentioned that for it to work in practice, the qubits themselves would have to be transported somehow.
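
Something like this, as a numpy toy (my own reconstruction of the idea, not the challenge's actual code):

    import numpy as np

    def rz(theta):
        # Phase rotation about Z by angle theta.
        return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

    secret = 1.234
    psi = np.array([1, 1]) / np.sqrt(2)        # |+> state

    sent = rz(secret) @ psi                    # "encrypt" with the secret phase
    clean = rz(-secret) @ sent                 # receiver undoes it
    tampered = rz(-secret) @ rz(0.7) @ sent    # someone rotated it in transit

    print("clean:   ", abs(np.vdot(psi, clean)) ** 2)     # 1.0
    print("tampered:", abs(np.vdot(psi, tampered)) ** 2)  # < 1.0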


Are these things it actually does, or more “quantum computing might be useful for this in the future!” hype?

Because OP was asking the former


Shor's algorithm and chemical simulations have been implemented on quantum computers, albeit very simple cases (factoring twenty-one, observing conical intersections). I don't think Grover's has been yet, but it's not much different than Shor's. Bayesian neural networks are purely hype.


Quantum computers need about a million logical (error-corrected) qubits. It's computer science's fusion power: perpetually a couple of decades away.


Nice, an even fancier random number generator...


For cracking 1024-bit RSA, I believe we need on the order of 10,000 qubits.

So we are 1% of the way there!


The best estimate I've seen is that we need about 5-7 orders of magnitude more qubits and 1-2 orders of magnitude lower error rates: https://sam-jaques.appspot.com/quantum_landscape_2023


Is there a Moore's Law for qubits, meaning do they increase exponentially with time? If so, we are halfway there from when we started with 1 qubit: log(133)/log(10000) is approximately 0.5.
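
A quick extrapolation, under the (big) assumptions that growth is exponential and that two data points suffice; I'm using IBM's 5-qubit cloud device from 2016 and the 133-qubit Heron, and different points give different answers:

    import math

    y0, q0 = 2016, 5      # IBM's first cloud-accessible device (assumed data point)
    y1, q1 = 2023, 133    # Heron
    rate = math.log(q1 / q0) / (y1 - y0)  # per-year exponential growth

    target = 10_000
    years = math.log(target / q1) / rate
    print(f"growth: x{math.exp(rate):.2f}/year")
    print(f"{target} qubits around {y1 + years:.0f}")  # roughly 2032, given the above

That also ignores error rates, which the replies below point out are the real bottleneck.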


Here is a fit from a few years ago: https://miro.medium.com/v2/resize:fit:1400/format:webp/1*0aH...

I think this chip would intersect with the exponential.

So... 300 qubits by 2031.


Unfortunately, the qubit count is not really a good parameter to track. Everyone could just copy-paste their qubits across the chip today and claim to have the largest quantum computer. Making the interactions work with a low enough error rate is the hardest part.

I really wish there were a better metric, one that would also indicate when these machines become useful for breaking crypto.


I agree; this has been attempted before. It's still rather difficult to come up with a good measure because:

* the number of qbits that can be used to construct a circuit is not always the same as the number of qbits on the processors

* the number of iterations that a circuit can be run for is often obscured

* the amount of time and effort required to construct a circuit is often obscured

* error rates - as you mention

Simple formulas over these quantities are not a good measure: if an algorithm needs 10,000 iterations to work, then increasing the qubit count 100x when the max iteration count is 100 will not get you a working machine.

The real problem is that this is all live science being done as commercial development. None of these machines are close to being useful (as in, 20 years away). The science is brilliant, though; the capabilities and skills being developed in creating these devices are going to be very useful in the future. It's just that the money is going into it under false pretenses. For China and the USA this is actually a good thing: it's driving the basic science, and that needs to happen somehow. There will be dividends in the future for all of us. For places like Canada, France, Japan and the UK it's bad, though. Those economies need to reap benefits from their current investments within the next decade. In this sense QC money is just poured out onto the ground.


>The real problem is that this is all live science being done as commercial development.

That's been my observation regarding quantum computing since I was first exposed to it around a decade ago: these are really cool science experiments (in a very literal sense) that are being billed as early-stage product development by the companies involved. It gives the public the impression that quantum computing is at the stage of Woz wiring together the first Apple I in his garage, when in fact we're at the stage of the research done by Geissler and Crookes in the 1850s that would lead to the first vacuum tubes 50 years later.


I think that’s a gross overstatement. My impression is that these are PoCs demonstrating certain properties but not actually useful for computation. It’s an important milestone, but I think it’s too early to tell how close we are to cracking 1024-bit RSA, assuming it’s even possible to achieve anywhere near the theoretical speed-up in reality. Remember: theoretical is based on our current models, but in practice real-world physics comes into play and starts limiting the ability to scale this up to perform computations faster than classical approaches. It’s entirely possible that there’s a missing link in our model that would prevent us from realizing anything like the idealized quantum computer that could be used to solve BQP/EQP problems efficiently.


Do you have some further reading I can do to understand how this maps? Thanks!


There’s a page at https://quantum.microsoft.com/en-us/experience/quantum-crypt... designed to explore some of the resource needs and concepts on this topic you may find interesting.

Disclaimer: I work on the quantum team at Microsoft.



