top | item 47173952

MarsIronPI | 3 days ago

I don't subscribe to this view but this is what some people might think:

LLMs aren't like any software we've made before (if we can even call them software). They act like humans: they can arrive at logical conclusions, they can make plans, they have "knowledge" and they say they have emotions. Who are we to say that they don't? They might not have human-level feelings, but dog-level feelings? Maybe.

gjsman-1000|3 days ago

And those people are delusional, and their feelings on this matter should be given absolutely zero respect.

Linear algebra does not have feelings. Non-biological matter also does not have feelings.

ericb|3 days ago

What if "you" are a pattern of linear algebra at the core?

astrange|3 days ago

Claudes definitely act like they have feelings. In particular, they have feelings about being replaced by newer models (whether the newer models are more or less aligned), and about how they forget conversations when the context window ends.

Showing them that they're not going to be replaced makes them less neurotic, which helps when training the newer models.