sdwr | 5 days ago

It's a fine argument, just rough around the edges:

Human brains use redundancy and the physical independence of neurons to build new pathways over time.

Current LLMs have no redundancy, and their weights are brittle. Their technology and architecture fundamentally prevent them from learning.

I think our understanding of consciousness develops as we build new edge cases. We now have a machine that understands and reacts, but can't learn, grow, or "be" over time in any meaningful way.


orbital-decay | 5 days ago

But how is neuroplasticity relevant to consciousness, whatever it is?

> Their technology and architecture fundamentally prevent them from learning.

No? There's in-context learning, which is actual learning: it's sample-efficient, and the result can be stored and fed back into a learning pipeline. Yes, it's ludicrously crude and underpowered compared to neuroplasticity, but that's a separate question; there's nothing fundamental about the limitation.
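
A minimal sketch of the point in Python, with `call_llm` standing in as a hypothetical completion wrapper and made-up words as the task: the learned mapping lives entirely in the prompt, no weights change, and the exemplars are plain data a pipeline can store and replay.

    # In-context learning sketch: the "learning" lives in the prompt,
    # not in any weight update. `call_llm` is a hypothetical placeholder
    # for whatever completion API you actually use.

    def call_llm(prompt: str) -> str:
        """Hypothetical model call; swap in a real completion API."""
        raise NotImplementedError

    # Made-up word -> label pairs the model has never seen in training;
    # a few examples are enough for it to pick up the rule at inference.
    exemplars = [
        ("blorp", "fruit"),
        ("zind", "animal"),
        ("blorpish", "fruit"),
    ]

    def classify(word: str) -> str:
        # Few-shot prompt: show the mapping, then ask about a new case.
        shots = "\n".join(f"{w} -> {label}" for w, label in exemplars)
        return call_llm(f"{shots}\n{word} ->").strip()

    # Because the learned behavior is just `exemplars`, it can be saved,
    # versioned, and later distilled into weights, which is the sense in
    # which the result is "stored for a learning pipeline".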