top | item 38091552

siffland | 2 years ago

I have no idea why this is fun, but on AI chat-bots, i always test

2 + 2(2-2)

The answer should be 2, however on different bots (not just GPT) i have gotten 0, 2, 4 and 6 (all of which i can understand except the 6).

So yeah.......math messes with some bots, who would of guessed.
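Spelled out with the implicit multiplication made explicit (assuming `2(2-2)` is read as `2*(2-2)`, the standard convention), operator precedence gives 2; a quick sanity check in Python:

```python
# 2 + 2(2-2), evaluated by standard precedence:
# parentheses first: 2 - 2 = 0
# multiplication next: 2 * 0 = 0
# addition last: 2 + 0 = 2
result = 2 + 2 * (2 - 2)
print(result)  # → 2
```

The wrong answers the bots gave correspond to misapplying precedence, e.g. evaluating left to right as `(2 + 2) * (2 - 2)` gives 0, and dropping the multiplication as `2 + 2 + (2 - 2)` gives 4.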

nomel|2 years ago

For GPT-4, the Wolfram Alpha plugin is great for any maths.

Sharlin|2 years ago

"would have" I believe you meant.

There's no way even GPT-3.5 fails to solve that ridiculously simple piece of arithmetic. Honestly I'd be surprised if GPT-2 got that wrong. GPT-4 can single-handedly solve vastly more difficult math problems, even though it's handicapped by being merely a language model.

IanCal|2 years ago

3.5 & 4 are fine, yes, explaining step by step how to solve it (when prompted purely with "2 + 2(2-2)"). GPT-2, completing "2 + 2(2-2) = ", returns " 1.5 + 1.5 + 1.5".