item 47100220

mettamage | 8 days ago

> Isn't it strange that we expect them to act like humans even though after a model was trained it remains static?

An LLM is more akin to interacting with a quirky human with anterograde amnesia: it can't form new long-term memories, so it can only follow you within a single, long-ish conversation.
