
Sure, but the sources list is generated by the same system that generated the text, so it’s equally subject to hallucinations. There are some examples here:

https://dkb.blog/p/bing-ai-cant-be-trusted

To answer the question above, these systems cannot provide sources because they don’t work that way. Their source for everything is, basically, everything: they are trained on a huge corpus of text, and every output depends on that entire training set.

They have no way to tell which piece of the training data was the “actual” or “true” source of what they generated. It’s like asking “which drop caused the flood” or “which pebble caused the landslide”.
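
To make that concrete, here is a minimal sketch of what generation actually is (using the Hugging Face transformers library, with gpt2 as a stand-in model; any causal LM works the same way). The only inputs to the loop are the frozen weights and the prompt tokens; no per-document provenance exists anywhere in it, so there is nothing to cite:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "gpt2"  # stand-in; the argument applies to any causal LM
    tok = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)

    ids = tok("The capital of France is", return_tensors="pt").input_ids
    for _ in range(10):
        logits = model(ids).logits[:, -1, :]           # scores over the vocabulary
        next_id = logits.argmax(dim=-1, keepdim=True)  # greedily pick the top token
        ids = torch.cat([ids, next_id], dim=-1)        # append and continue
    print(tok.decode(ids[0]))  # text, with no record of where it "came from"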



The issues are good to know about, but

> Their source for everything is, basically, everything. They are trained on a huge corpus of text data and every output depends on that entire training.

Bing Chat explicitly pulls in extra data at query time: it runs web searches and feeds the results into its prompt. That’s a distinctly different setup from ChatGPT, as the sketch below illustrates.
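
A rough sketch of that kind of retrieval-augmented setup (web_search and llm_complete are hypothetical stand-ins here, not Bing’s actual pipeline):

    def web_search(query: str) -> list[dict]:
        # Stand-in: a real system would call a search API here.
        return [{"url": "https://example.com/a", "snippet": "Example snippet."}]

    def llm_complete(prompt: str) -> str:
        # Stand-in: a real system would call a hosted model here.
        return "Example answer citing [1]."

    def answer_with_sources(question: str) -> str:
        docs = web_search(question)  # extra data, fetched at query time
        context = "\n".join(
            f"[{i + 1}] {d['url']}\n{d['snippet']}" for i, d in enumerate(docs)
        )
        prompt = (
            "Answer using ONLY the numbered sources below, citing them as [n].\n"
            f"{context}\n\nQuestion: {question}\nAnswer:"
        )
        # The citations refer to documents in the prompt, not to training
        # data -- though the model can still misread or misattribute them.
        return llm_complete(prompt)

So the cited sources actually exist and can be checked, which a bare language model can’t offer; but as the dkb.blog examples show, the model can still garble what those sources say.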



