top | item 47135324


dspillett | 5 days ago

> which I suppose makes sense if 30% of people simply lack the ability to reason

I think it would be better to say that 30% of people either lack the ability to reason (inarguably true in a few cases, though I'd suggest, and hope, at a rate an order of magnitude or two below 30%, since that would be a life-altering mental impairment), or generally can't be bothered to, or simply didn't at the time of being asked this particular question (because they couldn't be bothered, or because they felt some social pressure to answer quickly rather than pausing to think).

An automated system like an LLM should not have this problem. It has no way to turn off or bypass any function it has, so if it could reason, it would.


rerdavies | 5 days ago

This is something I have wondered about before: whether AIs are more likely to give wrong answers when you ask a stupid question instead of a sensible one. Speaking personally, I often cannot resist the temptation to give reductio-ad-absurdum answers to particularly ridiculous questions.

If 30% of humans on the internet can't be bothered to answer stupid questions correctly, then one would expect AIs to replicate that behaviour. And if humans on the internet sometimes give sarcastic answers when presented with ridiculous questions, one would expect AIs to replicate that behaviour as well.

So you really cannot say they have no incentive to do so. Their incentive is that they are rewarded for replicating human behaviour.