yaroslavvb | 4 years ago
A different metric is a more relevant goalpost -- the number of synapses. If each of the brain's ~125 trillion synapses can adjust its strength independently of the others, each one loosely corresponds to a parameter in a neural network. So if we get 100-trillion-parameter networks training and still see no human-level intelligence, we'll know conclusively that the bottleneck is something else. Training 1T-parameter networks currently seems feasible.
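The arithmetic behind that goalpost can be sketched in a few lines -- a rough, order-of-magnitude comparison using only the figures from the comment (125T synapses, ~1T parameters feasible today, a 100T-parameter target):

```python
# Back-of-the-envelope comparison of brain synapse count vs. neural
# network parameter counts, using the comment's rough figures.

SYNAPSES = 125e12          # ~125 trillion synapses in the human brain
FEASIBLE_TODAY = 1e12      # ~1T-parameter networks trainable today
TARGET = 100e12            # the 100T-parameter goalpost

# How far today's feasible scale is from the goalpost:
gap = TARGET / FEASIBLE_TODAY
print(f"Scale-up needed to reach 100T parameters: {gap:.0f}x")

# If each synapse maps loosely to one parameter, a 100T-parameter
# network covers this fraction of the brain's synapse count:
coverage = TARGET / SYNAPSES
print(f"Fraction of synapse count at 100T params: {coverage:.0%}")
```

So the goalpost sits two orders of magnitude above today's feasible scale, yet still slightly below the raw synapse count.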
marmaduke | 4 years ago
It seems to me that mean field models, which could be deep networks internally, are a much more parsimonious computational approach.
skyde | 4 years ago
Isn't that sufficient proof that the bottleneck is elsewhere?