What does this nonsensical question, which some LLMs sometimes get wrong and others never do, have to do with anything? This isn't a "gotcha," even though you want it to be. It's just mildly amusing.
Because this failure demonstrates that LLMs can't really think logically the way we do, and that they are far from replacing actual humans, let alone senior software engineers.
So if you took one of the greatest software engineers ever and made it so he couldn't answer this nonsensical, pointless question, would he be a lesser engineer because of it?
neya|13 days ago
arcfour|11 days ago