item 47041862

cracki | 13 days ago

Absolutely!

I've been wondering for years how to get an LLM to ask me clarifying questions instead of just filling the gaps with assumptions and sprinting off.

User-configurable agent instructions haven't worked consistently for this. The vendor's system prompt may even contain instructions *not* to ask questions.

Sure, there's a practical limit to how much clarification it ought to request, but never asking at all is just annoying.

Nition | 13 days ago

Yeah, nothing I've put in the instructions like "ask me if you're not sure!" has ever had a noticeable effect. The only thing that works well is:

- Ask question

- Get answer

- Go back and rewrite the initial question to include a clarification for the thing the AI got wrong
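The loop above can be sketched in a few lines. This is a hypothetical illustration, not a real API: `ask_llm` is a stand-in stub for whatever model call you'd actually make, and it "guesses wrong" unless the prompt already contains the clarification.

```python
def ask_llm(prompt: str) -> str:
    # Stub standing in for a real model call. It silently assumes
    # spaces unless the prompt explicitly says otherwise -- mimicking
    # an LLM that fills gaps instead of asking.
    if "use tabs" in prompt:
        return "Formatted with tabs."
    return "Formatted with spaces."

prompt = "Reformat this file."
answer = ask_llm(prompt)

# Step 3: the model guessed wrong, so fold the clarification back
# into the original prompt and re-run from scratch, rather than
# continuing the same conversation.
if "spaces" in answer:
    prompt += " Clarification: use tabs, not spaces."
    answer = ask_llm(prompt)

print(answer)  # prints "Formatted with tabs."
```

The point of the third step is that the *rewritten* prompt, not the back-and-forth, becomes your reusable artifact: next time you start from the version that already carries the clarification.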