There are many reasons ChatGPT could neither ask clarifying questions, nor draw on examples of such questions, nor interpret all the levels and purposes of code that had never been written before.
Language does not capture most business processes, efficiency goals, or even security restrictions; people and logic do. Little of that knowledge exists in the written word. It is transmitted mostly through failures and through historically obvious cultural practice.
Emulation does not mean 'comprehension' of all the purposes involved in the craftsmanship of handling data, whether for business or even for computation.
Most AI-solved problems amount to search reduction over spoon-fed, well-defined problem areas of data.