top | item 25586996

acdc4life|5 years ago

My brain does NLP better than any system out there. I’m also able to ride bicycles and do motor control better than Boston Dynamics’ robots. I can also construct and prove math, do physics, and code all of this in Matlab and C. My brain handles this wide range of tasks almost seamlessly on just 15 watts of power, while Silicon Valley’s supercomputers can barely do 1% of it.

bumby|5 years ago

To be a fair comparison, wouldn’t you have to include all the energy used to train “your” brain through generations of evolutionary training? Your latest model is like taking an already-trained BERT model and adding a few tweaks.

acdc4life|5 years ago

I see BERT as nonsensical. You need to be scientific and have a mathematical theory of how humans learn language, which is a multidisciplinary task requiring physicists, mathematicians, neuroscientists, cognitive psychologists, and linguists. Benchmarks are useless; theories, models, experiments, and testable predictions are how science progresses. You’re making a comment on cognitive science, and trying to imply that language in humans isn’t learned, but pre-baked. The psychological, linguistic, evolutionary-biology, and neuroscience evidence doesn’t seem to corroborate that. The evidence points more strongly to humans having general learning and problem-solving abilities. For instance, there was no evolutionary pressure for humans to be good at math or programming. I was not born knowing English or calculus or probability theory; these were learned abilities. Evolution favoured brain mechanisms that led to behaviour for success in a rapidly changing world. Had I been born in ancient Rome as a farmer, I would have learned to speak Latin and how to be a successful farmer, instead of the physics, math, probability, computing, driving, and reading skills that I learned in my lifetime.

Peritract|5 years ago

Only if you're going to include all the time & energy spent creating BERT's precursors as well when calculating its cost.

omgwtfbyobbq|5 years ago

To be fair, it takes years of energy to train our brains to the point where they can do all those things, and our brains aren't extensible in the same way hardware/software is. I guess there's also a lot more variation in yields. ;)

acdc4life|5 years ago

Well, that’s not entirely true. AlphaGo self-played 29 million games of Go, which is infeasible for humans. Assuming a game of Go takes 5 minutes, it would take a human roughly 276 years to achieve the same, assuming this person doesn’t sleep, eat, or do anything else other than play Go. Compute time is orders of magnitude faster than real time, especially on GPU clusters.
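As a rough sanity check, the back-of-the-envelope arithmetic can be sketched in a few lines of Python (a minimal sketch, assuming the figures in the comment: 29 million games at 5 minutes each, played nonstop):

```python
# Back-of-the-envelope check of the AlphaGo self-play comparison.
games = 29_000_000        # self-played games of Go (figure from the comment)
minutes_per_game = 5      # assumed average length of one game

total_minutes = games * minutes_per_game
years = total_minutes / (60 * 24 * 365)  # no sleep, no eating, no breaks

print(f"{years:.0f} years")  # works out to roughly 276 years of nonstop play
```

Under these assumptions the total comes to about 276 years, so the order of magnitude of the claim holds regardless of the exact per-game length assumed.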

oh_sigh|5 years ago

Yeah, but your clone() function is very wasteful, and we can't stick 1 million of you in a dark room that just looks at people's Google searches and figures out what they're really going for.

And it's not the whole picture to say the brain only uses 15 watts, when there are all sorts of necessary support systems that it couldn't run without. So it's closer to 100 watts (2000 kcal/day).

acdc4life|5 years ago

Not Google search, but many companies are exploiting cheap labor overseas for different industrial use cases because our algos suck. Not just in manufacturing; there are tech companies outsourcing labeled data for tasks like object detection (Hive.ai as an example). Whether it makes you depressed or not, humans are cheaper than algorithms, and I don’t see that changing unless we abandon the current paradigm for AI/ML.

wongarsu|5 years ago

Comparing your pre-trained brain (which carries a lot of structure, i.e. training, from evolution) with the training costs of a new algorithm isn't really fair.

All in, you require about 100 W (2000 kcal/day), maybe three times that when doing a lot of physical activity. Boston Dynamics' Spot uses about 400 W. I can probably outperform it in some disciplines while it would beat me in others. That would be a fair comparison.
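The kcal-to-watts conversion behind this comparison is straightforward; a minimal Python sketch (assuming 2000 kcal/day for the human and the ~400 W figure for Spot quoted in the comment):

```python
# Convert daily food energy to average power draw (kcal/day -> watts).
KCAL_TO_JOULES = 4184      # 1 kcal = 4184 J
SECONDS_PER_DAY = 86_400

def kcal_per_day_to_watts(kcal_per_day):
    """Average power in watts from a daily energy intake in kcal."""
    return kcal_per_day * KCAL_TO_JOULES / SECONDS_PER_DAY

human_watts = kcal_per_day_to_watts(2000)  # ~96.9 W, i.e. roughly 100 W
spot_watts = 400                           # Boston Dynamics Spot (figure from the comment)

print(f"human: {human_watts:.0f} W, Spot: {spot_watts} W")
```

So 2000 kcal/day works out to just under 100 W of continuous power, about a quarter of the quoted figure for Spot.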

acdc4life|5 years ago

> Comparing your pre-trained brain

You’re making sweeping assertions that would require domain experts in several different disciplines. The scientific evidence points to the contrary. It has been demonstrated that humans have a general ability to learn a wide range of things without being genetically programmed to. None of us evolved to drive cars. Yet nearly everyone in my grandparents’ generation was able to learn this totally new skill, despite cars being a recent invention that left no time for evolution to act. They weren’t genetically evolved to drive; it was learned within their lifetime and generation. There are tribal humans in different parts of the world that haven’t developed written language. Yet you can teach them written language. Where’s your “pre-trained brain” theory there?

juanbyrge|5 years ago

Human brains are also the culmination of billions of years of iterative development. Computers were only invented in the last century. I would not be surprised if computers could catch up given a few tens or hundreds of years more.

absolutelyrad|5 years ago

Yeah, I'd give 30(realistically 15) years tops for AGI. And then we'll call it the end of history.

That is if we don't kill ourselves by making stupid mistakes.

acdc4life|5 years ago

I think this is false. My general impression is that computer scientists and engineers (especially in Silicon Valley) lack the scientific training that one gets in other disciplines like physics or the other hard sciences. Psychology, linguistics, cognitive science, and neuroscience have generated rich, diverse experimental data in the past 50 years, and now is the prime time for a theory to emerge that connects everything. To make what I’m saying clearer: you needed Newtonian mechanics, Maxwell, Planck, and Coulomb to develop and discover the science of electricity and atoms in order for us to engineer the silicon transistor. Without the original scientific knowledge, building such devices would have been impossible. This machine learning, AI, deep learning, and obsession with benchmarks are, in my opinion, a hindrance to AGI.