Humans update their model of the world as they receive new information.
simianwords|1 month ago
imtringued|1 month ago
LLMs have static weights, so they cannot have a concept of truth. If the world changes, they keep insisting on whatever was in their training data. Nothing forces an LLM to track reality.
usefulcat|1 month ago
ETA:
To elaborate a bit: based on your response, it seems like you don't consider my question a valid one.
If you don't think it's a valid question, I'm curious to know why not.
If you do think it's a valid question, I'm curious to know your answer.