top | item 46529013

Jiro | 1 month ago

This article is nonsense. It exploits the fact that the problems with LLMs are described in very broad terms, then notices that human behavior can be made to fit those descriptions precisely because of that breadth.

It's like getting a gorilla to fly an airplane, noticing that it crashed the airplane, and saying "humans sometimes crash airplanes too". Both gorillas and humans do things that fit into the broad category "crash an airplane" but the details and circumstances are different.

Dilettante_ | 1 month ago

I have definitely, absolutely, positively had conversations where details have fallen out of the context window of my conversation partner (or mine, for that matter) without the person in question realizing it had happened, and it was only via LLMs that I found a vocabulary to name the phenomenon.

throw4847285 | 1 month ago

Arguments like this make me suspect that their proponents simply have a malformed theory of mind. If I'm being really catty, I'll say it's because they have below-average levels of self-awareness.