(no title)
sonink|1 year ago
The model is interesting. This is similar in parts to what we are building at nonbios. For example, sensory inputs are not required to simulate a model of a mind: if a human cannot see, the human mind is still clearly human.
tsimionescu|1 year ago
To be clear, my reasoning is that this is the only plausible explanation for the extreme difference between how much data an individual human needs to learn language and how much data an LLM needs to reach its level of simulation. Humanity collectively probably needed a similar amount of data as LLMs do to get here, but it was spread across a billion years of evolution from simple animals to Homo sapiens.
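
A minimal back-of-envelope sketch of that data gap, in Python; the word and token counts below are rough, commonly cited order-of-magnitude assumptions, not figures from this thread:

    # Rough data budgets for learning language (order-of-magnitude assumptions).
    child_words_by_age_5 = 30e6   # ~10-50 million words heard by a young child
    llm_training_tokens = 1e13    # ~10 trillion tokens for a large modern LLM

    # How many times more text the LLM consumes than the child.
    ratio = llm_training_tokens / child_words_by_age_5
    print(f"LLM/child data ratio: ~{ratio:,.0f}x")  # ~333,333x

Under these assumptions the gap is five to six orders of magnitude, which is the gap the comment attributes to evolutionary "pre-training".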
sonink|1 year ago
If that were the case, people who were born blind would demonstrate markedly reduced intelligence. I don't think that is the case, but you can correct me if I am wrong. A blind person might take longer to truly 'understand' and 'abstract' something, but there is little evidence to believe their capability for abstraction isn't as good as that of people who can see.
I agree that sensory inputs and interaction were absolutely critical to how minds evolved, but when we talk about AI, model training replaces that part, not just the evolution.
Evolution made us express emotions when we are hungry, for example. But your laptop will also let you know when its battery is out of juice. Human design inspired by evolution can create systems that mimic its behaviour and function.
dboreham|1 year ago
Hard disagree. Evolution made a bigger/better neural processor, and it made better/different I/O devices and I/O pre-processing pipelines. But it didn't store any information in the DNA of the kind you're proposing. That's not how it works. The brain is entirely "field programmable", in all animals (I assert). There is no "pre-training".