top | item 46382914

sollewitt | 2 months ago

> This is a story about what happens when you ask a machine a question it knows the answer to, but is afraid to give

It’s a story about how humans can’t help personifying language generators, and how important context is when using LLMs.

Nevermark | 2 months ago

> It’s a story about how humans can’t help personifying language generators,

There should be a word for the misunderstanding that the pervasively common use of anthropomorphic or teleological rhetorical modes, when talking about undirected natural processes or designed-for-purpose artifacts, actually indicates that anthropomorphic, free-will, or teleological assertions or assumptions are being made.

Language-bending tropes, just like tricky-wicked theorems, are the indispensable shortcuts that help us get to a point.

(I think the much more common danger is people over-anthropomorphizing people. I.e. all the stories of clear motivations and intents we tell ourselves, about ourselves and others, and credulously believe, after the fact.)

> and how important context is when using LLMs.

Too true.

turtlebro | 2 months ago

People treat LLMs as sentient, not realizing they are the world's most sophisticated talking parrots. They can very convincingly argue both sides of any argument you throw at them. They are incredible for research and discovery, not for wisdom or decision making.

fragmede | 2 months ago

And a mere piece of wood banged up by the right type of rock is? If books can impart wisdom via the technology of writing, why would a more complicated rock design, infused with electricity but using the same technology, be any different?

yunwal | 2 months ago

What is the point of this article? What difference does the concept of sentience make to its point?