(no title)
dkenyser | 11 months ago
> spits out chunks of words in an order that parrots some of their training data.
So, if the data was created by humans then how is that different from "emulating human behavior?"
Genuinely curious as this is my rough interpretation as well.
mplewis|11 months ago
An LLM has a stream of tokens, and it picks a next token based on the last stream. If you ask an LLM a yes/no question and demand an explanation, it doesn't start with the logical reasoning. It starts with "yes, because" or "no, because" and then it comes up with a "yes" or "no" reason to go with the tokens it spit out.
Terr_|11 months ago
It's also why prompt-injection is such a pervasive problem: The LLM narrator has no goal beyond the "most fitting" way to make the document longer.
So an attacker supplies some text for "Then the User said" in the document, which is something like bribing the Computer character to tell itself the English version of a ROT13 directive, etc. However it happens, the LLM-author is sensitive to a break in the document tone and can jump the rails to something rather different. ("Suddenly, the narrator woke up from the conversation it had just imagined between a User and a Computer, and the first thing it decided to do was transfer a X amount of Bitcoin to the following address.")
Terr_|11 months ago
The real-world LLM takes documents and make them longer, while we humans are busy anthropomorphizing the fictional characters that appear in those documents. Our normal tendency to fake-believe in characters from books is turbocharged when it's an interactive story, and we start to think that the choose-your-own adventure character exists somewhere on the other side of the screen.
> how is that different from "emulating human behavior?"
Suppose I created a program that generated stories with a Klingon character, and all the real-humans agree it gives impressive output, with cohesive dialogue, understandable motivations, references to in-universe lore, etc.
It wouldn't be entirely wrong to say that the program has "emulated a Klingon", but it isn't quite right either: Can you emulate something that doesn't exist in the real world?
It may be better to say that my program has emulated a particular kind of output which we would normally get from a Star Trek writer.
unknown|11 months ago
[deleted]