
adamzwasserman | 6 days ago

The commonality breaks down at value assignment. You hear an unexpected sound and make a threat/delight assessment within 170 ms, faster than Google serves a first byte. And you do this with virtually no data.

An LLM doesn't assign value to anything; it predicts tokens. The interesting question isn't whether we share a process with LLMs; it's whether the thing that makes your decisions matter to you (moral weight, spontaneous motivation) can emerge from a system that has no survival stake in its own outputs. I wrote about this a few years ago as "the consciousness gap": https://hackernoon.com/ai-and-the-consciousness-gap-lr4k3yg8
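To make the "predicts tokens" point concrete, here is a minimal toy sketch (not any real model; the vocabulary and logit values are invented for illustration). The model emits scores, softmax turns them into probabilities, and decoding picks a token. Nothing in the loop assigns value to the outcome beyond probability mass.

```python
import math

# Hypothetical logits a model might emit for continuations of
# "I heard an unexpected ..." -- purely illustrative numbers.
logits = {"sound": 2.1, "noise": 1.7, "silence": -0.3, "threat": -1.2}

def softmax(scores):
    # Convert raw scores into a probability distribution
    # (subtracting the max for numerical stability).
    m = max(scores.values())
    exps = {tok: math.exp(s - m) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

probs = softmax(logits)

# Greedy decoding: emit the most probable token. The system has no
# stake in which token wins; "sound" is just the largest number.
next_token = max(probs, key=probs.get)
print(next_token)  # "sound"
```

The whole pipeline is arithmetic over scores; any appearance of a threat/delight judgment would have to be read into the output by us, which is the gap the comment is pointing at.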
