jqpabc123|1 month ago
You and I understand this but an LLM doesn't.
An LLM doesn't understand the difference between fact and fiction.
It just uses probability to choose the next word. Hopefully, there are more facts in its database that can serve as a guide. But if not, it will just as readily use fiction to produce something that sounds plausible.
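The "choose the next word by probability" idea can be sketched in a few lines. This is a toy illustration, not a real LLM: the candidate words and their probabilities are invented for the example, and the point is only that the sampler has no notion of truth, just weights.

```python
import random

# Made-up distribution over next-word candidates for the prompt
# "The capital of France is ...". Frequencies stand in for what a
# model might have absorbed from training text.
next_word_probs = {
    "Paris": 0.6,      # well-attested fact, so high weight
    "Lyon": 0.3,       # plausible-sounding but wrong
    "Atlantis": 0.1,   # pure fiction, yet still a valid candidate
}

def sample_next_word(probs, rng=None):
    """Pick a word at random, weighted by probability only."""
    rng = rng or random.Random()
    words = list(probs)
    weights = [probs[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

print("The capital of France is", sample_next_word(next_word_probs))
```

Nothing in the sampler distinguishes the factual candidate from the fictional one; a lower-probability word still gets emitted some fraction of the time, and every output "sounds plausible" by construction.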
Anything an LLM produces simply cannot be trusted and is a poor example of "intelligence".