top | item 29124594

yaroslavvb | 4 years ago

Realistic simulation of neurons is expensive. Back in my grad school days we ran GENESIS and could afford at most 10k neurons; each neuron requires substantial compute to solve the corresponding differential equations. However, it's unclear how to translate this into requirements for artificial neural networks -- the type of computation is too different.

A different metric is a more relevant goalpost -- the number of synapses. If each of the ~125 trillion synapses in the brain can adjust its strength independently of the others, each one loosely corresponds to a parameter in a neural network. So if we get 100-trillion-parameter networks training but still no human intelligence, we'll know conclusively that the bottleneck is something else. Currently, training 1T-parameter networks seems feasible.
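The scale argument above can be made concrete with a back-of-the-envelope calculation. A minimal sketch, using only the figures quoted in the comment (125 trillion synapses, 1 trillion trainable parameters) and its loose assumption that one independently adjustable synapse corresponds to one parameter:

```python
# Rough scale comparison between the brain's synapse count and a
# current large model, using the numbers quoted in the comment.
# The one-synapse-one-parameter correspondence is the comment's
# loose assumption, not an established equivalence.

SYNAPSES_IN_BRAIN = 125e12   # ~125 trillion synapses
FEASIBLE_PARAMS = 1e12       # ~1 trillion parameters, quoted as trainable today

gap = SYNAPSES_IN_BRAIN / FEASIBLE_PARAMS
print(f"remaining scale-up factor: {gap:.0f}x")  # prints 125x
```

Under that assumption, current models sit roughly two orders of magnitude below the brain's parameter count.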

marmaduke | 4 years ago

If you collapse things to just synapses, you've lost the complexity of dendritic arbors. The article doesn't mention gap junctions, but there are networks of those too, with different properties.

It seems to me that mean field models, which could be deep networks internally, are a much more parsimonious computational approach.
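To illustrate what a mean-field model looks like, here is a minimal sketch of a Wilson-Cowan-style rate model, where two population-averaged firing rates (excitatory and inhibitory) stand in for many individual neurons. All weights, the input drive `P`, and the step count are illustrative values, not fitted to any data:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def simulate(steps=2000, dt=0.01, P=1.25):
    """Euler-integrate a two-population Wilson-Cowan-style rate model.

    E and I are population-averaged firing rates in [0, 1]; the four
    weights couple the excitatory and inhibitory populations. Values
    are illustrative only.
    """
    E, I = 0.1, 0.1
    w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 10.0, 2.0
    for _ in range(steps):
        dE = -E + sigmoid(w_ee * E - w_ei * I + P)
        dI = -I + sigmoid(w_ie * E - w_ii * I)
        E += dt * dE
        I += dt * dI
    return E, I

E, I = simulate()
print(f"population rates after integration: E={E:.3f}, I={I:.3f}")
```

Two equations replace what a spiking simulation would model with thousands of coupled ODEs, which is the parsimony being argued for.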

skyde | 4 years ago

We already know that a biological neural network, like that of the worm C. elegans, is more intelligent than an artificial neural network of the same size.

Isn't that sufficient proof that the bottleneck is elsewhere?