top | item 44888351


pahkah | 6 months ago

This seems like a case of tunnel vision and confirmation bias, the nasty combo that sycophantic LLMs make easy to fall prey to. Someone gets an idea, asks about it, and the LLM doesn’t ask about the context or say that it doesn’t make sense; it just plays along, “confirming” that the idea was correct.

I’ve caught myself doing this a few times: I sort of suggest a technical solution that, in hindsight, was the wrong way to approach the problem. The LLM will try to find a way to make it work without taking a step back and suggesting that I misunderstood the problem I was looking at.
