item 47037853


vbezhenar | 13 days ago

It makes no sense to walk, so the whole question makes no sense: there's no real choice. It seems the LLM assumes "good faith" on the user's side and tries to model a situation where the question actually makes sense, producing an answer for that situation.

I think that's a valid problem with LLMs. They should recognize nonsense questions and answer "wut?".

lukeasch21 | 13 days ago

That's one of the biggest shortcomings of AI: models can't suss out when the entire premise of a prompt is inherently problematic or unusual. Guardrails are a band-aid fix, as evidenced by the proliferation of jailbreaks. I think this is just fundamental to the technology. Grandma would never tell you her dying wish was that you learn how to assemble a bomb.

bean469 | 12 days ago

> So the whole question makes no sense as there's no real choice.

Of course there is a choice; it's even provided to the LLM directly.