
eevilspock | 1 year ago

You are implicitly anthropomorphizing LLMs by implying that they (can) have intent in the first place. They have no intent, so they can't lie, make a claim, or confabulate. They are just a very complex black box that takes input and spits out output. Searle's Chinese Room thought experiment applies here.

No comments yet.