terminalcommand | 1 year ago

Well, it is a bit like satire. You have to explain the universe to an unspecialized GPT, as you would to a layman. There are custom GPTs that come preloaded with that universe explanation.
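
Concretely, that preloaded "universe explanation" is roughly what a system prompt does. A minimal sketch using the OpenAI Python client (the model name, persona, and question are made up for illustration):

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # The "universe explanation" lives in the system message, so every
    # conversation starts from the background a layman would need.
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "You are a medieval historian. Assume the reader "
                        "knows nothing about the period and explain context."},
            {"role": "user",
             "content": "Why did succession disputes cause so many wars?"},
        ],
    )
    print(response.choices[0].message.content)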

In addition, don't ask an LLM for facts. Give it a list of, say, 1000 kings of a country and then ask it to name 20 of those.

If you ask it outright for 25 kings of some country, you are testing knowledge, not intelligence.
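
A minimal sketch of the two kinds of question in Python (the list and the wording are placeholders, not real data):

    # Stand-in for a real list of 1000 kings supplied by the user.
    kings = [f"King {i}" for i in range(1, 1001)]

    # Knowledge test: the model must recall facts from its training data.
    knowledge_prompt = "Name 25 kings of Sweden."

    # Intelligence test: the facts are supplied in context; the model only
    # has to select and repeat items from the list it was given.
    selection_prompt = (
        "Here is a list of 1000 kings:\n"
        + "\n".join(kings)
        + "\n\nName any 20 kings from the list above."
    )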

I see LLMs as a talking rubber duck. The point where I write a successful explanation is also the point where I understand the problem.

latexr | 1 year ago

I can’t believe I’m having to explain this, but the point I’m making isn’t about the content of the list but the numbers.

> like you would do to a layman.

I have never encountered a person so lay that I had to explain that 20 is smaller than 30 and 25.

> The point where I write a successful explanation is also the point where I understand the problem.

You have demonstrated repeatedly that you don’t know when you have explained a point successfully to an LLM, and thus you have no way to evaluate whether you have understood it.

But you seem to firmly believe you did, which could be quite dangerous.

sph | 1 year ago

Careful: explain too much and you end up programming its behaviour rather than having an intelligent actor that learns by itself. Otherwise one could say a regular computer is intelligent, provided you explain (in code) every single rule of the game.