
benchmarkist | 1 year ago

And how do they do that? Be very specific.

killerstorm | 1 year ago

Alright. Suppose "meaning" (or "understanding") is something which exists in a human head.

It might seem like a complete black box, but we can get some information about it by observing human interactions.

E.g. suppose Alice does not know English, but she has a bunch of cards with instructions like "Turn left", "Bring me an apple", etc. If she shows one of these cards to Bob and Bob wants to help her, Bob can carry out the instruction on the card. If they play this game, Alice can observe the meaning each card induces in Bob's head, and so she will be able to map these cards to meanings in her own head.

So there's a way to map meaning which is mediated by language.

Now, from a math perspective: if we can estimate semantic similarity between utterances, we might be able to embed them into a latent "semantic" space.
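To make that concrete, here's a toy sketch. The bag-of-words "embedding" and the example utterances are my own stand-ins, not anything from a real LLM; the point is only that once utterances live in a vector space, a similarity function (cosine here) can order them by semantic closeness.

```python
import math
from collections import Counter

def embed(utterance):
    # Toy embedding: a bag-of-words count vector.
    # (A real model would learn a dense vector instead.)
    return Counter(utterance.lower().split())

def cosine(u, v):
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[w] * v[w] for w in u)
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

a = embed("bring me an apple")
b = embed("bring me a pear")
c = embed("turn left")

# The two fetch-instructions overlap; the turn-instruction shares nothing.
print(cosine(a, b))  # 0.5
print(cosine(a, c))  # 0.0
```

Even this crude similarity separates the "fetch" cards from the "turn" card; a trained model just does the same thing with a far richer notion of similarity.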

If you accept that the process of LLM training captures some aspects of the meaning of the language, you can also see how it leads to some degree of self-awareness. If you believe that meaning cannot be modeled with math, then there's no way anyone can convince you.

benchmarkist | 1 year ago

How does math encode meaning if there is no Alice and Bob? You should quickly realize the absurdity of your argument once you take people out of the equation.