
The biggest weakness of genetic algorithms is that they can't make use of gradients, meaning they have no idea which direction to 'move' toward a solution. They end up guessing and refining their guesses, which makes them much slower to converge.

Their advantage is that they don't require gradients (so the fitness function doesn't need to be differentiable), but I don't think they're going to be the next big thing.
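
To make the 'guess and refine' point concrete, here's a minimal sketch (my own illustration, not something from the thread) of a GA evolving bit strings toward a hypothetical target. The fitness is a plain match count with no useful gradient, so the algorithm only needs to rank candidates, which is exactly the trade-off described above.

    import random

    # Hypothetical non-differentiable fitness: count of bits matching a target.
    # Gradient descent has nothing to work with here; a GA only compares scores.
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0]

    def fitness(genome):
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.05):
        return [1 - g if random.random() < rate else g for g in genome]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    # Guess-and-refine loop: no gradient step, just selection pressure
    # applied to random variation.
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        parents = population[:10]  # keep the best guesses
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(len(population) - len(parents))]
        population = parents + children

    print(generation, fitness(population[0]), population[0])

Nothing in the loop knows how far off a candidate is or in what direction to adjust it; it just keeps the better guesses and perturbs them, which is why convergence tends to be slow compared with gradient methods when a gradient is available.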


