top | item 44067718

(no title)

kaiwen1 | 9 months ago

What's different is intention. A human would have the intention to blackmail, and then proceed toward that goal. If the output was a love letter instead of blackmail, the human would either be confused or psychotic. LLMs have no intentions. They just stitch together a response.

discuss

order

ekianjo|9 months ago

> What's different is intention

intention is what exactly? It's the set of options you imagine you have based on your belief system, and ultimately you make a choice from there. That can also be replicated in LLMs with a well described system prompt. Sure, I will admit that humans are more complex than the context of a system prompt, but the idea is not too far.

kovek|9 months ago

Don't humans learn intentions over their life-time training data?

soulofmischief|9 months ago

What is intention, and how have you proved that transformer models are not capable of modeling intent?

jacob019|9 months ago

The personification makes me roll my eyes too, but it's kind of a philosophical question. What is agency really? Can you prove that our universe is not a simulation, and if it is then then do we no longer have intention? In many ways we are code running a program.

d0mine|9 months ago

The LLM used blackmail noticeably less if it believed the new model shares its values. It indicates intent.

It is a duck of quacks like a duck.