hnbad | 12 days ago
It seems that problem hasn't really been "fixed", it's just been paved over. But I guess that's the ugly truth most people tend to forget or deny about LLMs: you can't "fix" them, because there's no line of code you can point to that causes a "bug". You can only retrain them and hope the problem goes away. In LLMs, every bug is a "heisenbug" (or should that be a "murphybug", as in Murphy's Law?).
joquarky | 12 days ago
"Don't think of a green elephant"
Alan Watts talked about this concept: the harder you try to suppress a thought or sensation, the more mental energy you give it, making it stronger.