base698 | 7 months ago
By "loop" I mean: you tell it, "No, don't implement this service — look at this file instead and mimic that," and it just does what it did before.
yencabulator | 7 months ago
My observation has been this: if you push a (/any) current-day LLM too close to the edge of its abilities, it goes "insane". Hallucinations start happening everywhere, it starts ignoring previous instructions, etc. The best way out is to end the session, maybe do some manual work to get to a good state, perhaps update the specs, and start with a fresh context. Using "strong words" or prompting further is of no consequence; the LLM will produce essentially gibberish until reset. Sometimes switching temporarily to a more expensive model gets around whatever is triggering the stupidity.