We know quite a lot. For example, we know that brains have various different neuromodulatory pathways. Take, for example, the dopamine reward mechanism that is being talked about more openly these days. Dopamine is secreted by several different parts of the brain and affects different pathways.
I don't think it is anywhere near feasible to emulate anything resembling this in a computational neural network with fixed input and output neurons.
Keep in mind that our brains also have a great deal of built-in structure trained by evolution. So even if we understood exactly how a brain learns, we might still not be able to replicate it if we can't figure out the highly optimized initial state from which it starts in a fetus.
I concur. It might not be feasible in terms of the computational power available, but I don't think there is anything fundamentally stopping us from applying those training mechanisms, unless the whole neural-net paradigm is fundamentally incompatible with those learning methods.
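For what it's worth, neuromodulation does get modeled in artificial networks, at least in toy form. Here is a minimal sketch (all names and constants are my own, purely illustrative) of a three-factor, reward-modulated Hebbian update, one common simplified way a global dopamine-like signal is represented: the weight change depends on presynaptic activity, postsynaptic activity, and a global scalar reward that gates plasticity.

```python
# Illustrative sketch only, not a claim about how brains actually learn:
# a "three-factor" Hebbian update where a global scalar reward signal
# (a stand-in for a neuromodulator like dopamine) gates weight changes.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 2
W = rng.normal(scale=0.1, size=(n_out, n_in))  # synaptic weights

def step(W, x, reward, lr=0.1):
    """One plasticity step. The update is the product of three factors:
    presynaptic activity x, postsynaptic activity y, and a global
    reward signal that can strengthen or weaken the active pathway."""
    y = np.tanh(W @ x)                      # postsynaptic activity
    W = W + lr * reward * np.outer(y, x)    # reward-gated Hebbian term
    return W, y

x = rng.normal(size=n_in)
W, y_before = step(W, x, reward=1.0)   # rewarded: reinforce this pathway
W, y_after = step(W, x, reward=-1.0)   # punished: weaken it again
```

The point of the sketch is just that a single broadcast scalar is a very crude abstraction of the real thing: actual dopamine is released from multiple nuclei onto distinct pathways with different receptor types, which is exactly the richness the comment above says is hard to capture with fixed inputs and outputs.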
How much of cognition, especially "higher-level cognition" like language, is encoded genetically is highly controversial, and the pendulum has swung substantially in the last decade or two towards only general mechanisms being innate. E.g. the cortex may be in an essentially "random state" prior to getting input.