Hacker News

I asked GPT-4 and it had some counterpoints:

Reasoning blends learned skills and natural cognition. It integrates new information, not just past memories. Reasoning is adaptable, not rigidly algorithmic. Emotions and context also shape reasoning.

which seemed to make sense.



I hope this will be found in history books, and some students will point out the irony that people are relying on GPT-4's arguments about reasoning in a thread where it's proclaimed that said model can't reason.


In fact, it is neither absurd nor weird. The model does not need to be capable of x/reasoning to produce knowledge about x/reasoning. A book with a chapter on x/reasoning doesn't reason either.



