Hacker News | DrNosferatu's comments

The math might be beautiful, but I'm very skeptical that practical quantum computers will ever deliver on their promise.

Forever is a long time, but I agree: people who assert reality is the model are almost always eventually proven incorrect.

There is some interesting work being done, but it will never match the excessive hype. =3

"The Genius of Computing with Light"

https://www.youtube.com/watch?v=rbxcd9gaims


I would say optical computing will be practical well before QC is.

Time will tell.


Not sure what is going on in the QC world; with this ACM prize it has become even murkier.

As Sabine Hossenfelder (theoretical physicist) points out, companies involved with QC are seeing a surge in investment and marketing. It is as if somebody knows something that the "common public" doesn't - https://www.youtube.com/watch?v=gBTS7JZTyZY

I don't know enough about the science/technology to form an opinion but have recently started down the path of trying to understand it - https://news.ycombinator.com/item?id=46599807


> It is as if somebody knows something that the "common public" doesn't

oooorrr - and hear me out - investments are inherently hype-based and irrational, and there is too much money flying around to make actually smart decisions


Nope.

Quantum Computing (QC) is unlike previous technologies, which were all mostly "logical structures" (i.e. the underlying physics/technologies were well known). For QC, the viability of both the core physics itself and its realization through technology is questioned by some physicists/technologists themselves. Yet in 2024/2025 many governments and companies started investing heavily in QC. Moreover, the advanced countries have implemented export controls on QC technology, prohibiting the export of quantum computers above 34 qubits.

And now the ACM prize for something done long ago in quantum information.

Finally, note that QC algorithms can be simulated (for small numbers of qubits) on conventional computers, and current AI technologies may also play a part here, i.e. implementing QC algorithms on the "cloud supercomputer" with the help of AI technologies.
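As a sketch of what such classical simulation involves, here is a minimal statevector simulation in numpy preparing a Bell state; the gates and state layout are the textbook ones, not any particular vendor's API:

```python
import numpy as np

# Single-qubit Hadamard gate and 2x2 identity.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)

# CNOT: flips the target qubit when the control qubit is |1>.
# Basis order: |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>: a 2-qubit state is a vector of 2**2 = 4 amplitudes.
state = np.zeros(4, dtype=complex)
state[0] = 1.0

state = np.kron(H, I2) @ state   # Hadamard on the first qubit
state = CNOT @ state             # entangle -> Bell state (|00> + |11>)/sqrt(2)

probs = np.abs(state) ** 2       # measurement probabilities
print(probs)                     # ~[0.5, 0, 0, 0.5]
```

Each extra qubit doubles the length of the state vector (2^n amplitudes), which is why classical simulation runs out of memory so quickly.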

The logical inference is that there has been one or more technological breakthroughs in the realization of QC qubit technologies, QC algorithms running efficiently on the cloud, AI usage for QC, etc. Nothing else explains all of the above facts.

See also: The Case Against Quantum Computing by Mikhail Dyakonov (Professor of Physics) - https://spectrum.ieee.org/the-case-against-quantum-computing


> Quantum Computing (QC) is unlike previous technologies

aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny

right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in the sector and because established players (like Google) don't want to lose face

viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

governments are only investing the same way as into particle accelerators - in form of research grants

simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough). Bringing in "AI technologies" only shows you as a gullible idiot who still parrots the AI bubble without understanding the exact details

If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest in. The moment such breakthroughs reach the market, knowledge of their very existence spreads - and yet all currently known progress is dull.

The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive, in preparation for geopolitics and far-future tech. Error correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain with base tech


> aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny

You have not understood the first thing about what I pointed out.

> right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in the sector and because established players (like Google) don't want to lose face

While there has been hype, in the last couple of years things seem to have changed, now culminating in the awarding of the ACM Turing Award. Do you know anything about the physics/mathematics behind qubits (e.g. probabilities/superposition/phase/noise etc.) and/or how they have been realized via technology (e.g. superconducting/photonics/trapped ions etc.)? People are looking at "hybrid" quantum computers, i.e. conventional+quantum (e.g. IBM, Fujitsu), and at shuttling qubits on silicon (e.g. Hitachi), which allows existing foundry technology to be used for QC. This is huge.

> viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

Non sequitur.

> governments are only investing the same way as into particle accelerators - in form of research grants

No, governments are actively funding startups in this area and including technology research/transfers in their Free Trade Agreements with other governments.

> simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough).

Simulation of QC is not "extremely trivial"; it requires HPC technology. Datacenter/cloud technologies are also utilized here. Generally only around 30-50 qubits have been simulated, with 50+ qubits being exponentially prohibitive in terms of compute power/memory.

> Bringing in "AI technologies" only shows you as a gullible idiot who still parrots the AI bubble without understanding the exact details

To use your own language: this right here shows that you are just a clueless idiot about this domain. AI is a tool applied to various domains, e.g. AlphaFold for protein structures in biology, which solved an almost intractable problem. People are doing the same with QC+AI. There are a bunch of papers on this; for your edification, start with Quantum Computing and Artificial Intelligence: Status and Perspectives - https://arxiv.org/abs/2505.23860

> If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.

This demonstrates your gullibility. Since one of the best-studied use cases for QC is cryptography, if there has been a breakthrough in some lab (govt/academia/company, all of whom have secrets), the powers-that-be would not want it to be widespread, mainly for security reasons. But hints might have been given and investments encouraged. Almost all QC companies have a govt. tie-up, and cryptographic technologies have already been subject to export controls from the very beginning. Another scenario is defense applications. There are plenty more, but these two are the main ones.

> The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive, in preparation for geopolitics and far-future tech. Error correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain with base tech

That is the obvious, superficial take. Given what I have written above: what if semiconductor technology, i.e. the "hybrid" QC+conventional approach, allows one to simulate 100+ qubits easily now? What if there have been breakthroughs from using AI on QC algorithms, both existing and new? Have any formerly intractable problems in physics/chemistry/biology/mathematics been made tractable now due to AI usage? How many of these can be implemented on a QC? Etc.

To summarize: you have to look at the whole complex picture before drawing conclusions. Merely parroting trivialities like "hype" is meaningless.


Commercialization can bring in speculators and hype. And I'd argue that speculation is necessary for accelerating market development. Commercialization brings with it unique forcing functions that don't exist in academic settings, and this has historically accelerated functional products. The first step is building a quantum computer to learn how to build a quantum computer. That step is done; while research continues in many areas, the remaining commercialization challenges are largely engineering in nature.

I've only seen 34-qubit simulators (e.g. AWS SV1). My understanding is that a 34-qubit statevector takes roughly 256GB of RAM (2^34 amplitudes at 16 bytes each), and each additional qubit doubles the requirement. So a 50-qubit simulation would require roughly 16.8M GB of RAM.
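The doubling is easy to check directly, assuming a dense statevector with 16 bytes per complex128 amplitude and ignoring any simulator overhead:

```python
# A dense n-qubit statevector holds 2**n complex amplitudes,
# each taking 16 bytes as complex128.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 34, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits: 16 GiB
# 34 qubits: 256 GiB
# 40 qubits: 16,384 GiB
# 50 qubits: 16,777,216 GiB
```

At 50 qubits that is about 16 PiB, which is why 50-ish qubits is where full statevector simulation on classical hardware hits a wall.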

100 logical qubits seems to be the minimum threshold for interesting/useful quantum computing, albeit with very limited use cases; classical still beats most. Quantinuum expects to hit that number in 2027, and IonQ (often cited as being a hype machine) expects to have 800 logical qubits in 2027.

The industry is moving out of the NISQ era (noisy intermediate-scale quantum) and into the fault-tolerant QC (FTQC) era. NISQ is experimental; FTQC is commercial (i.e. reliable, repeatable).


Nice, an informative, meaningful comment. From your userid, would I be correct in deducing that you are affiliated with the QuBOBS project? - https://qubobs.irif.fr/portfolio/

You are certainly right that commercialization (and speculation does play an important role here) serves as a forcing function to accelerate the development of products. But this needs to be done somewhat in sync with, or a little ahead of, the actual science and engineering. When the subject is inherently difficult to understand (as is the case with QC), it can very easily get out of hand and become just snake oil, exploited by hustlers/grifters/charlatans.

Do you have any links to more information on the points you make above? Specifically on hybrid quantum-classical systems and silicon-based shuttling qubits which can use current foundry technology? To me, this seems to be the future, since both scaling and availability are taken care of.

As regards scaling of qubits, Caltech recently achieved a 6,100(!)-qubit array - https://www.caltech.edu/about/news/caltech-team-sets-record-...

Wikipedia also has a list of quantum processors and their specs - https://en.wikipedia.org/wiki/List_of_quantum_processors


Thank you :). I'm not affiliated with the QuBOBS project; it's just a name I picked.

QC has had its share of hype cycles. Businesses need to have a vision (which is where hype seeps in), and must be honest about what's possible today.

I am building quop.ai, a service which makes quantum computing accessible to technical and non-technical folks who don't have degrees in quantum physics.

I am not super familiar with silicon spin, but QuTech is making progress: https://qutech.nl/2026/02/12/rolling-out-the-carpet-for-spin....

Oxford Ionics (acquired by IonQ) has a CMOS trapped ion approach: https://www.oxionics.com/blogs/unveiling-oxford-ionics-devel...

Intel is also involved in silicon spin / quantum dots with their Tunnel Falls milestone: https://quantumcomputingreport.com/argonne-national-laborato...

The Caltech work is super interesting! Neutral atoms look to be a compelling modality. QuEra and Pasqal are commercial players worth looking into.


Thanks for the pointers. I had seen them before, except for the QuTech one. I firmly believe the silicon approach to qubits is the way to go, but we will see how the market (and technology viability) settles everything. These are exciting times for hard physics.

Also had a look at your quop.ai; it seems pretty interesting, though I need to explore it a bit more.

You might want to think about posting quop.ai to HN and get some feedback ;-)


Exciting times indeed! Will be fun to see how this all plays out.

Thanks for taking a look at Quop. If you explore it more, I'd be curious what you think as someone who follows the hardware side closely. I'll post something soon.


Also, there's a Caltech-affiliated startup called Oratomic Inc that is building a Rydberg system. I think it's the same team that demonstrated the 6,100 qubits. And I hear Preskill is an advisor.

Check out Eric Weinstein's latest theory about how frontier physics has moved "dark" (take it with a grain of salt; some of the other things he says might tempt you to discount him completely)

Then soon there will be a port for the N64!


The killer app for the Data Discman would have been early use (early 90s) as a storage medium for digital cameras.

Huge lost opportunity for Sony.


The EU must leave NATO and start its own Military Alliance - without the US.

This time the alliance's mandate, instead of fighting Communism, should be enforcing ICC rulings.


The EU already has stricter clauses for mutual military aid than NATO does


Awesome! Anyone for a port to the MSX?

A web version would also be cool.


Prototype in Matlab, production in C.


What about Scilab?


Excellent piece of technology. Now in 2026.0 version.

It gets better and better all the time.


Very good!

But I expected a humorous touch in stating that fusion is still 10 years away…


But, for example, isn't Cannonball (the SEGA OutRun source port) open source?

https://github.com/djyt/cannonball


No, it is not. There is no license in that repository.

Relevant: https://github.com/orgs/community/discussions/82431

> When you make a creative work (which includes code), the work is under exclusive copyright by default. Unless you include a license that specifies otherwise, nobody else can copy, distribute, or modify your work without being at risk of take-downs, shake-downs, or litigation. Once the work has other contributors (each a copyright holder), “nobody” starts including you.

https://choosealicense.com/no-permission/


There is a license: https://github.com/djyt/cannonball/blob/master/docs/license....

...but it's very clearly not an open source license.


Ah, thanks, you're right - I didn't think to look in subfolders. Genuinely never seen a license in a subfolder before.


It doesn't even have to be there; you could state what the license is on a website or in an e-mail. Sometimes you can find it in the headers of source files (as seen here as well). Having a "LICENSE" or "COPYING" file in the root of the repository is just a common pattern, made even more common by all the tools that can consume it automatically (including GitHub's UI).


More than an overview, a step-by-step tutorial on this would be awesome!

