Hacker News

I think it'll mostly still be centralized around providers

I don't see good local models happening on mobile devices anytime soon, and the majority of desktop users will use whatever is best and most convenient. Competitive open-source models running on your average laptop? Seems unlikely.



Doesn’t seem that unlikely to me. Ollama on Mac can already run decent DeepSeek/Llama distillations. For a lot of tasks it’s already good enough.

And 2029 is only four years away. Four years ago, LeetCode benchmarks still meant something and OpenAI was telling us GPT-3 was too dangerous to release.



