top | item 46812232


usefulcat | 1 month ago

How is knowing what word is most likely to come next in a series of words remotely the same as having "the concept of truth and facts"?
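"Most likely next word" can be made concrete with a toy bigram model: count which word follows which in a corpus, then predict the most frequent follower. This is a deliberately minimal sketch (the corpus and function names are invented here), nothing like a real LLM, but it shows the bare mechanism the comment refers to:

```python
from collections import Counter, defaultdict

# Tiny invented corpus for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follower_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follower_counts[prev][nxt] += 1

def most_likely_next(word):
    """Return the word most frequently observed after `word`."""
    return follower_counts[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" follows "the" twice, more than any other word
```

Real models replace the count table with learned parameters over long contexts, but the objective is still next-token prediction.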


simianwords | 1 month ago

how would you prove that a human has it?

imtringued | 1 month ago

Humans update their model of the world as they receive new information.

LLMs have static weights, so they cannot have a concept of truth. If the world changes, they keep insisting on the information that was in their training data. Nothing forces an LLM to track reality.
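The "static weights" point can be sketched with a toy model whose parameters are frozen at training time: inference only reads them, so a fact that changes after training stays wrong until a new training run. All names here are invented for illustration; this says nothing about fine-tuning or retrieval, which the comment does not address.

```python
def train(facts):
    # "Training" bakes a snapshot of the world into the weights.
    return dict(facts)

def infer(model, query):
    # Inference only reads the weights; nothing here updates them.
    return model.get(query, "unknown")

# Weights frozen with a fact that was true in 1950.
weights = train({"capital_of_brazil": "Rio de Janeiro"})

# The world has since changed (Brasília, 1960), but the frozen
# model keeps answering from its training snapshot.
print(infer(weights, "capital_of_brazil"))  # Rio de Janeiro

# Only a fresh training run produces updated weights.
weights = train({"capital_of_brazil": "Brasília"})
print(infer(weights, "capital_of_brazil"))  # Brasília
```

A human, by contrast, would update the answer on hearing the news, without being "retrained" from scratch.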

usefulcat | 1 month ago

Whataboutism is almost never a compelling argument, and this case is no exception.

ETA:

To elaborate a bit: based on your response, it seems like you don't think my question is a valid one.

If you don't think it's a valid question, I'm curious to know why not.

If you do think it's a valid question, I'm curious to know your answer.