top | item 44703011

dbagr | 7 months ago

You need recursion at some point: a fixed-depth network can't account for all possible combinations of scenarios, as that would require an infinite number of layers.

crystal_revenge | 7 months ago

> infinite number of layers

That’s not as impossible as it seems: Gaussian Processes are equivalent to a neural network with infinitely many hidden units, and any multilayer NN can be approximated by one with a single, larger hidden layer.
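A toy illustration of the single-wide-hidden-layer claim (my own sketch, not from any paper discussed here): fit sin(x) with one layer of random tanh features plus a linear readout solved by least squares. Widening the hidden layer steadily improves the fit, which is the intuition behind the universal approximation result being invoked.

```python
import numpy as np

# One hidden layer of random tanh features + linear least-squares readout.
# This "random features" shortcut is just for illustration; real networks
# would train W and b too.
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

n_hidden = 500                                  # widen this and the fit improves
W = rng.normal(scale=2.0, size=(1, n_hidden))   # fixed random input weights
b = rng.normal(scale=2.0, size=n_hidden)        # fixed random biases
H = np.tanh(x @ W + b)                          # hidden activations, (200, 500)
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)   # solve the linear readout

err = np.max(np.abs(H @ w_out - y))             # worst-case error on the grid
print(err)
```

With a few hundred random features the worst-case error on the sample grid drops far below 0.01; with only a handful of features it does not, which is the width/depth trade-off the comment is pointing at.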

topspin | 7 months ago

"a single, larger layer of hidden units"

Does this not mean that the entire model must cycle to operate any given part? Dividing it into concurrent "modules" (the term used in this paper) affords optimizing frequency independently and intentionally.

Also, what certainty is there that everything is best modelled with multilayer NN? Diversity of algorithms, independently optimized, could yield benefits.

Further, can we hope that modularity will create useful points of observability? The inherent serialization that develops between modules could be analyzed, and possibly reveal great insights.

Finally, isn't there a possibility that AGI could be achieved more rapidly by factoring the various processes into discrete modules, as opposed to solving every conceivable difficulty in a monolithic manner, whatever the algorithm?

That's a lot of questions. Identifying possible benefits seems easy enough that this approach is worth exploring. We shall see, I suppose. At the very least we know the modularization of HRM has a valid precedent: real biological brains.

advael | 7 months ago

I mean, recurrence is an attempt to allow approximation of recursive processes, no?
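To make that concrete, here is a toy sketch (my example, not from the thread): the same quantity computed by a recursive definition and by a recurrence that "unrolls" it for a fixed number of steps. The recurrence matches the recursion only while the unroll budget exceeds the true depth, mirroring how a recurrent net can emulate recursion only up to its unroll length.

```python
def depth_recursive(node):
    """Depth of a nested-list tree, defined recursively."""
    if not isinstance(node, list):
        return 0
    return 1 + max(map(depth_recursive, node), default=0)

def depth_recurrent(node, steps):
    """Same quantity as an iterated recurrence: peel one level per step."""
    frontier, depth = [node], 0
    for _ in range(steps):
        # Replace the frontier with the children of its list nodes.
        frontier = [c for n in frontier if isinstance(n, list) for c in n]
        if not frontier:
            return depth          # bottomed out: exact answer
        depth += 1
    return depth                  # truncated: ran out of unroll steps

tree = [[[1], 2], [3]]
print(depth_recursive(tree))          # exact
print(depth_recurrent(tree, steps=10))  # enough steps: agrees
print(depth_recurrent(tree, steps=2))   # too few steps: truncated
```

The truncated case is the crux of the thread: a fixed unroll (or fixed layer count) bounds the recursion depth you can express, which is why genuinely unbounded problems push you toward recursion or adaptive computation.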