
RugnirViking | 1 month ago

It seems to me like this is very much an artefact of the left-to-right, top-down writing method of the program. Once it's committed to a token earlier in its response, it kinda just has to go with it. That's why I'm so interested in those LLM models that work more like Stable Diffusion, where they can go back and iterate repeatedly on the output.
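The difference can be sketched with a toy example (everything here — `VOCAB`, `SCORES`, and both decoders — is invented for illustration, not how real LLMs or diffusion language models actually work): a greedy left-to-right decoder can be lured into a locally tempting prefix it can never revise, while an iterative refiner may revisit earlier positions.

```python
# Toy illustration: the "model" is a hand-written score table where the
# best full sequence is ("B", "A"), but the prefix ("A",) looks locally
# more promising than ("B",).
VOCAB = ["A", "B"]
SCORES = {("A",): 2, ("B",): 1,
          ("A", "A"): 0, ("A", "B"): 1,
          ("B", "A"): 3, ("B", "B"): 0}

def score(seq):
    return SCORES.get(tuple(seq), 0)

def autoregressive(n=2):
    """Greedy left-to-right decoding: once a token is emitted it is final."""
    out = []
    for _ in range(n):
        out.append(max(VOCAB, key=lambda t: score(out + [t])))
    return out

def refine(seq, sweeps=3):
    """Diffusion-style sketch: repeatedly re-decide every position in place,
    conditioning on the rest of the current sequence."""
    seq = list(seq)
    for _ in range(sweeps):
        for i in range(len(seq)):
            seq[i] = max(VOCAB, key=lambda t: score(seq[:i] + [t] + seq[i + 1:]))
    return seq

print(autoregressive())    # commits to the tempting "A" prefix -> ['A', 'B']
print(refine(["A", "A"]))  # can revise position 0 -> ['B', 'A']
```

The greedy decoder locks in `"A"` because it scores best as a prefix, then has to settle for the mediocre `("A", "B")`; the refinement loop, starting from a rough draft, rewrites position 0 and reaches the higher-scoring `("B", "A")`. (Coordinate-wise refinement like this can still get stuck depending on the starting draft; real diffusion LMs are far more sophisticated.)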
