Hacker News

If work can be done with fewer GPUs, then people will buy fewer of them. This is what's leading to Nvidia's fall.


But there isn’t a finite amount of work. If we can do something with less compute, then we should be able to do significantly more with more compute by applying the same optimizations.


I'm out here waiting for model compute costs to drop so I can run model ensembles and immediately improve accuracy.
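The ensemble idea can be sketched as simple majority voting across independently run models: cheaper per-model inference makes it affordable to query several models and take the consensus. A minimal illustration (the toy "models" below are hypothetical stand-ins for real inference calls):

```python
from collections import Counter

def ensemble_predict(models, x):
    """Query every model on the same input and return the
    majority-vote label among their predictions."""
    votes = [m(x) for m in models]
    return Counter(votes).most_common(1)[0][0]

# Toy stand-in models: each maps an integer to an "is even" label.
# Two are correct; one always answers True. Majority voting
# suppresses the weak model's errors.
good_a = lambda x: x % 2 == 0
good_b = lambda x: x % 2 == 0
weak   = lambda x: True  # always claims "even"

print(ensemble_predict([good_a, good_b, weak], 3))  # False: the two good models outvote the weak one
```

With uncorrelated errors, adding more models generally raises the chance that the majority is right, which is why falling per-query compute cost translates directly into accuracy.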


I agree in principle. Do note, though, that there's a finite amount of training data, and humankind is already close to that limit.


Only if deepseek-r1 has achieved 100% AGI and the marginal benefit of more compute is decreasing; otherwise there is no reason to use less compute.


AI being more available and locally runnable will induce demand for GPUs.


I don't think this is true. There's a limit to how "good" the average person needs their AI to be, assuming the average person deliberately uses AI much at all (thinking of people like my mother here).



