top | item 47128963

streetfighter64 | 6 days ago

How is that a "subliminal message"? It's just a simple example of common sense, which LLMs fail because they can't reason, not because they are "overthinking". If somebody asks, "What's 2+2?", they might be insulting you, but that doesn't mean the answer is anything other than 4.

mattclarkdotnet | 6 days ago

2+2 might well not equal 4, since you haven’t specified the base of the numbers or the modulus of the addition.

And what if it’s a full service car wash and you’ve parked nearby because it’s full so you walk over and give them the keys?

Assumptions make asses of us all…
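For illustration, a minimal Python sketch of the point above: "2 + 2" only comes out as "4" once you fix the modulus and the base. The function names and the particular moduli/bases here are made up for the example, not anything from the thread.

```python
def add_mod(a, b, modulus=None):
    """Add two integers, optionally reducing the sum modulo `modulus`."""
    s = a + b
    return s if modulus is None else s % modulus

def to_base(n, base):
    """Render a non-negative integer as a digit string in the given base."""
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits)) or "0"

print(add_mod(2, 2))       # 4  -- ordinary integer addition
print(add_mod(2, 2, 3))    # 1  -- modulo 3
print(add_mod(2, 2, 4))    # 0  -- modulo 4
print(to_base(2 + 2, 3))   # 11 -- the same quantity written in base 3
```

Same two twos every time; only the unstated conventions change.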

streetfighter64 | 5 days ago

So you're saying it would be useful for an "AI assistant" to ask you for the base each time you give it a math problem? Do you also want it to ask you if you're using the conventional definitions of "2" and "+"? For the car wash, would you like it to ask if you're on Earth or on Mars? Do you have air in your tires? Is the car actually a toy car?

Some assumptions are always necessary and reasonable; that's why I'm saying the "AI" lacks common sense.

hmokiguess | 6 days ago

It’s common sense to ask a question in riddle format? What’s the goal of the person asking the question? To challenge the other person? In what way? See if they get the obvious? Asking for clarification isn’t valid?

streetfighter64 | 6 days ago

It's common sense to know that you need to have your car with you to wash it. Yes, asking the question is a challenge on the obvious. If you asked an AI "what's 2+2" and it said 3, would you argue that the question was a trick question?