Mac Minis are well suited to running demanding models locally, because Apple Silicon's unified memory lets the GPU treat ordinary system RAM as VRAM.
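
A minimal sketch of what that looks like in practice, assuming llama-cpp-python (any Metal-enabled runtime behaves the same way); the model path and context size are placeholders:

    # Sketch only: llama-cpp-python on Apple Silicon. "Offloading to the GPU"
    # here just maps the weights into the same unified memory pool the CPU
    # uses, so ordinary system RAM is what actually holds the model.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/qwen2.5-32b-instruct-q4_k_m.gguf",  # hypothetical path
        n_gpu_layers=-1,  # offload every layer to the Metal backend
        n_ctx=8192,       # context window; tune to the RAM you can spare
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Summarize this repo's README."}]
    )
    print(out["choices"][0]["message"]["content"])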



But people don't use OpenClaw with local models.

They definitely do. A common configuration is running a supervisor model in the cloud and a much smaller model locally to churn through long-running tasks. This frees OpenClaw up to iterate lavishly on tool building without burning through too many tokens.
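
A rough sketch of that split, assuming OpenAI-compatible endpoints on both ends rather than OpenClaw's actual configuration; the model names and the localhost port are placeholders:

    # Sketch only: a big cloud model plans, a small local model (served on
    # localhost by llama.cpp, Ollama, etc. behind an OpenAI-compatible API)
    # grinds through the long-running steps so cloud tokens aren't burned.
    from openai import OpenAI

    cloud = OpenAI()  # supervisor; reads OPENAI_API_KEY from the environment
    local = OpenAI(base_url="http://localhost:8080/v1", api_key="unused")

    plan = cloud.chat.completions.create(
        model="gpt-4o",  # assumed supervisor model
        messages=[{"role": "user", "content": "Break this refactor into numbered steps."}],
    ).choices[0].message.content

    for step in plan.splitlines():
        if not step.strip():
            continue
        local.chat.completions.create(
            model="local-model",  # whatever the local server exposes
            messages=[{"role": "user", "content": f"Carry out this step:\n{step}"}],
        )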

Unless you're running a large local model on 192GB+ of RAM, this setup just isn't ideal, based on real-world experience.


