
tcsenpai | 2 months ago

> But what if our neurobiological reality includes a system that behaves something like an LLM?

It almost seems like we got inspiration from our brain to build neural networks!


seanmcdirmid | 2 months ago

It isn’t clear though. Neural networks were inspired by the brain, but transformers? It is totally plausible but do we really think just in words?

coldtea | 2 months ago

>Neural networks were inspired by the brain, but transformers? It is totally plausible but do we really think just in words?

LLMs may be trained on words, but under the hood transformers are not just for words.

They're for high dimensional structured sequences. To make an analogy, transformers are not working on:

  Vector<Word>
but

  Vector<ContextualizedEmbedding>
where words just happen to be a handy training set we use.
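A minimal numpy sketch of that point (toy sizes, made-up vocabulary, plain dot-product attention as a stand-in for a real transformer layer): by the time attention runs, the "words" are gone and the model only sees a matrix of embedding vectors.

```python
import numpy as np

# Hypothetical toy vocabulary and embedding size for illustration only.
vocab = {"the": 0, "cat": 1, "sat": 2}
d_model = 4  # embedding dimension (toy size)

rng = np.random.default_rng(0)
embedding_table = rng.standard_normal((len(vocab), d_model))

tokens = ["the", "cat", "sat"]
ids = [vocab[t] for t in tokens]   # the Vector<Word> stage: just ids
x = embedding_table[ids]           # -> (seq_len, d_model) float matrix

# One self-attention step mixes the rows, yielding "contextualized"
# embeddings -- still just vectors; no words anywhere in sight.
scores = x @ x.T / np.sqrt(d_model)
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
contextualized = weights @ x       # the Vector<ContextualizedEmbedding> stage

print(contextualized.shape)        # (3, 4)
```

Any sequence you can embed this way (audio frames, image patches, protein residues) goes through the same machinery, which is the sense in which words are only one handy training signal.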

And we, too, might not think in words, but I'd bet we do think using multi-dimensional sequences/vectors.

SAI_Peregrinus | 2 months ago

> It is totally plausible but do we really think just in words?

I find that proposition totally implausible. Some people certainly report only thinking in words & having a continuous inner monologue, but I'm not one of them. I think, then I describe my thoughts in words if I'm speaking or writing or thinking about speaking or writing.

coldtea | 2 months ago

We've been making the same metaphor ("that's how the brain works") with each new major technology we come up with...