ai4ever|1 year ago
These folks should all take a course in epistemology to realize that all knowledge is built up of symbolic components, not spit out by a probabilistic machine.
Gradually, reality will sync (intentional misspelling) in, and such imaginings will be seen for what they are: futile manic episodes.
visarga|1 year ago
Knowledge ends up as symbolic representation, but it ultimately comes from the environment. Science is search, searching the physical world or other search spaces, but always about an environment.
I think many people here almost forget that the training set of GPT was the hard work of billions of people over history, who researched and tested ideas in the real world and built up to our current level. Imitation can only take you so far. For new discoveries the environment is the ultimate teacher. It's not a symbolic processing thing, it's a search thing.
Everything is search - protein folding? search. DNA evolution? search. Memory? search. Even balancing while walking is search - where should I put my foot? Science - search. Optimizing models - search for best parameters to fit the data. Learning is data compression and search for optimal representations.
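The "optimizing models - search for best parameters to fit the data" framing can be made concrete with a minimal sketch: a random hill-climbing search that recovers the parameters of a line from data. The data, step count, and perturbation scale below are illustrative choices, not anything from the thread.

```python
import random

# Toy data generated by y = 3x + 2; the "search" tries to recover (3, 2).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 2 for x in xs]

def loss(a, b):
    """Sum of squared errors for candidate parameters (a, b)."""
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, ys))

def random_search(steps=20000, scale=0.1, seed=0):
    """Hill climbing: keep a random perturbation only if it lowers the loss."""
    rng = random.Random(seed)
    a, b = 0.0, 0.0
    best = loss(a, b)
    for _ in range(steps):
        na, nb = a + rng.gauss(0, scale), b + rng.gauss(0, scale)
        cand = loss(na, nb)
        if cand < best:
            a, b, best = na, nb, cand
    return a, b, best

a, b, err = random_search()
```

No gradients, no symbols about the problem itself, just trial against an "environment" (the loss): the same loop shape covers evolution, protein folding heuristics, and parameter fitting.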
Symbolic representations are very important in search: they quantize our decisions and make it possible to choose in complex spaces. Symbolic representations can be copied, modified and transmitted; without them we would not get far. Even DNA uses its own language of "symbols".
Symbols can encode both rules and data, and more importantly, can encode rules as data, so syntax becomes object of meta-syntax. It's how compilers, functional programming and ML models work - syntax creating syntax, rules creating rules. This dual aspect of "behavior and data" is important for getting to semantics and understanding.
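The "rules as data" point can be sketched in a few lines: a tiny interpreter whose rules live in an ordinary dictionary, so the rules themselves can be copied and modified like any other data. The rule table and expression format here are invented for illustration.

```python
import operator

# The "rules" are plain data: a table mapping symbols to behavior.
RULES = {"+": operator.add, "*": operator.mul, "-": operator.sub}

def evaluate(expr, rules=RULES):
    """Interpret a nested-tuple expression using a rule table passed as data."""
    if not isinstance(expr, tuple):
        return expr
    op, left, right = expr
    return rules[op](evaluate(left, rules), evaluate(right, rules))

# Syntax as data: ("+", 1, ("*", 2, 3)) means 1 + (2 * 3)
print(evaluate(("+", 1, ("*", 2, 3))))  # prints 7

# Because rules are data, a modified copy yields different behavior:
swapped = dict(RULES, **{"+": operator.mul, "*": operator.add})
print(evaluate(("+", 1, ("*", 2, 3)), swapped))  # prints 5
```

This is the dual aspect in miniature: the same dictionary is inert data to one function and executable behavior to another, which is the move compilers and meta-programming systems make at scale.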
defamation|1 year ago
Who's to say that a model can't eventually be trained to work within the parameters the real world operates under, and produce novel ideas and inventions much as a human does, just at a larger scope?
ai4ever|1 year ago
https://www.siliconrepublic.com/machines/deepmind-ai-study-c...
DeepMind is overselling their AI hand when they don't have to.
"Who's to say that..." could be the leading question for any "possibility" in the AI religion.
"Who's to say that God doesn't exist?" and the like are questions for which there are no tests, and hence fall outside the realm of science and into the realm of religion.
gessha|1 year ago
Judge a technology based on what it’s currently capable of and not what it promises to be.