dwallin | 5 days ago
The question itself never specifies that the car you would be driving is the same one that needs to be washed. The car that needs washing could already be waiting in the car wash's parking lot. It doesn't state that you plan to wash your car at the car wash, either. Perhaps the car wash sells cleaning equipment that you could bring home to wash your car there?
The question is designed to be ambiguous so that the LLM answers it in a way that seems facially absurd to the people who are in on the scheme. What it actually shows is a failure of imagination on the part of those asking the question.
Do you want your chatbot to be suspicious of you trying to trick it? To me this seems patently unhelpful outside of LLMs tuned for roleplay or to operate in a highly adversarial environment.
Do you want it to assume you are an idiot asking the question because you didn't realize you need to have the car at the car wash to wash it?
Or do you want it to make the best-faith assumption about what you are asking and try to be as helpful as possible given the poor question?
E-Reverance | 5 days ago