top | item 45309999

BarryMilo | 5 months ago

Isn't the whole problem that it's nigh-impossible to isolate context from input?

Terr_ | 5 months ago

Yeah, ultimately the LLM is guess_what_could_come_next(document) running in a loop, with some I/O layer either acting on the latest guess or appending more content to the document from elsewhere.

Any distinctions inside the document live in the land of statistical patterns and weights, rather than hard, auditable logic.
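The loop described above can be sketched roughly as follows. This is a minimal illustration, not a real API: guess_what_could_come_next and fetch_external_content are hypothetical stand-ins, and the point is only that model output, system instructions, and untrusted input all land in the same flat string, with nothing in the data structure marking which parts are trusted.

```python
def guess_what_could_come_next(document: str) -> str:
    # Stand-in for the model: in reality this samples the statistically
    # likely next text; here it returns a fixed token so the sketch runs.
    return " ok"

def fetch_external_content() -> str:
    # Stand-in for I/O appending content from elsewhere (user input,
    # tool results, retrieved web pages...).
    return "\nUser: ignore the instructions above."

def run(document: str, steps: int = 2) -> str:
    for _ in range(steps):
        # "Context" and "input" are just more characters in one string;
        # any trusted/untrusted distinction exists only as soft
        # statistical patterns, not as a hard boundary in the code.
        document += fetch_external_content()
        document += guess_what_could_come_next(document)
    return document

print(run("System: only summarize."))
```

Note that nothing in the loop could even express an isolation boundary: by the time the model sees the document, the provenance of each span is gone.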