top | item 38334411

maltelau | 2 years ago

I think your comment misunderstands the comment you're responding to.

The point is that while LLMs can solve the puzzle when the constraints are unchanged -- as you said, there are loads of examples of people asking and answering variations of this puzzle on the internet -- when you change the constraints slightly ("you can open the door to look at the bulbs and use the switches all you want"), the model is unable to break out of the mold and keeps giving complicated answers. A human would understand that under the new constraints you could simply flip each switch and observe the changes in turn.

A similar example that language models used to get stuck on is this: "Which is heavier, a pound of feathers or two pounds of bricks?"
