As a neuroengineer working in the field, I find this quite accurate. Understanding the compute architecture goes a long way: after all, acoustic RSA key extraction (https://www.tau.ac.il/~tromer/acoustic/) is possible. By contrast, we're not even sure how the brain is supposed to compute in theory, other than that it's parallelized to a degree we don't quite fathom. The electronics explosion has come primarily out of computational motifs that rely on the near-lightspeed switching of semiconductor gates and on heavily sequential processing, but the brain doesn't work this way at all.

An important concept here is the 100-steps rule (https://www.teco.edu/~albrecht/neuro/html/node7.html): neurons are SLOW. You can out-jog most unmyelinated neural signals, and the vast majority of sensory and motor computations finish on the order of 100 "clock cycles".
Write me a computer vision algorithm with enough parallelism to complete in 100 cycles, and then we can talk about understanding the biological brain's compute structure and true brain-computer interfaces.
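To make the challenge concrete, here is a minimal sketch (my own toy illustration, not anything from the linked pages) of how one might count "clock cycles" in the 100-steps sense for a vision pipeline: the sequential *depth* of dependent steps, assuming unlimited parallelism with one processor per neuron, as opposed to the total operation count. The layer sizes and the naive convolution are arbitrary choices for illustration.

```python
# Toy illustration of the 100-steps rule: a small feed-forward vision
# pipeline has a sequential depth of only a handful of dependent steps,
# even though it performs millions of multiply-adds, because every output
# within a layer could in principle be computed in parallel.
import numpy as np

rng = np.random.default_rng(0)

def conv_layer(x, n_filters, k=3):
    """Naive 'same' 3x3 convolution + ReLU. Counts as ONE parallel step:
    all output units depend only on the previous layer, not on each other."""
    h, w, c = x.shape
    filt = rng.standard_normal((k, k, c, n_filters)) * 0.1
    pad = np.pad(x, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h, w, n_filters))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + k, j:j + k, :]
            out[i, j] = np.tensordot(patch, filt, axes=3)
    ops = h * w * k * k * c * n_filters  # multiply-adds, all mutually independent
    return np.maximum(out, 0), ops

x = rng.standard_normal((32, 32, 3))   # toy "retina" image
depth, total_ops = 0, 0
for n in (16, 32, 64):                 # three conv stages
    x, ops = conv_layer(x, n)
    x = x[::2, ::2, :]                 # strided 2x2 pooling
    depth += 2                         # conv+ReLU = 1 step, pooling = 1 step
    total_ops += ops
logits = x.reshape(-1) @ rng.standard_normal((x.size, 10))  # linear readout
depth += 1

print(f"sequential depth: {depth} steps, total work: {total_ops:,} multiply-adds")
```

Running this gives a depth of 7 dependent steps against roughly 2.8 million multiply-adds, which is the shape of the trade the brain appears to make: enormous fan-out per step, very few steps. The catch, of course, is that on real silicon each of those "steps" decomposes into many sequential instructions unless you have hardware as wide as the layer.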