Hacker News

Yes, it's true that LLMs hallucinate facts, but there are ways to mitigate that. Despite the challenges, they can produce perfectly functional code to spec. So for me it's not too much of a stretch to think one would do a reasonably good job of defending simple cases.

