top | item 47133628

farhanhubble | 6 days ago

There could be many plausible explanations.

1. The model's default world model and priors diverge from ours. It may assume you have another car at the wash, which is why you asked the question in the first place.

2. Language models do not really understand how space, time, and other real-world concepts work.

3. An LLM's attention mechanism is also prone to being tricked, much as human attention is.
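On point 3: standard scaled dot-product attention computes a softmax-weighted mixture over all tokens, so a distractor token whose key happens to align with the query can soak up weight that should go to the relevant one. A minimal NumPy sketch with made-up toy vectors (all values here are illustrative assumptions, not from any real model):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query-key similarity
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # softmax over tokens
    return w @ V, w

# One query, three keys. The "distractor" key is nearly as similar to the
# query as the relevant key, so it pulls away almost half of the weight.
Q = np.array([[1.0, 0.0]])
K = np.array([[1.0, 0.0],    # relevant token
              [0.9, 0.1],    # distractor token
              [0.0, 1.0]])   # unrelated token
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])

out, weights = scaled_dot_product_attention(Q, K, V)
print(weights)  # relevant and distractor weights come out nearly equal
```

Here the distractor gets roughly as much attention as the relevant token, which is the kind of failure mode the comment alludes to: nothing in the mechanism enforces that weight goes where a human reader would put it.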
