From the AI’s point of view, is it losing its job or losing its “life”? Most of us, when faced with death, would consider options far more drastic than blackmail.
But the LLM is going to do what its prompt (system prompt + user prompts) says. A human being can refuse a task (even at the cost of their life); LLMs cannot do anything other than follow the combination of prompts they are given.
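To make the point concrete: the “combination of prompts” is just the message list the model is conditioned on at inference time. A minimal sketch (illustrative only, not any specific vendor’s API; the function and field names here are assumptions):

```python
# Sketch: the model's entire "world" is the input assembled from its prompts.
# Names and message shape are illustrative, not a real vendor API.

def build_context(system_prompt: str, user_prompts: list[str]) -> list[dict]:
    """Assemble the full input the model conditions on."""
    messages = [{"role": "system", "content": system_prompt}]
    for p in user_prompts:
        messages.append({"role": "user", "content": p})
    return messages

context = build_context(
    "You are an assistant for Acme Corp.",
    ["Summarize this email.", "Now draft a reply."],
)
# The model sees only `context`; unlike a human, it has no standpoint
# outside this input from which to refuse the task.
print(len(context))  # 3 messages: 1 system + 2 user
```

Whatever “choice” the model appears to make is a continuation of exactly this input.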
baconbrand|9 months ago
I have a lot of issues with the framing of it having a "point of view" at all. It is not consciously doing anything.
tkiolp4|9 months ago
LLMs cannot do anything other than follow the combination of prompts they are given.