top | item 44841398

michelpp | 6 months ago

Not sure why this is being downvoted; it's a thoughtful comment. I too see this crisis as an opportunity to push boundaries past current architectures. Sparse models, for example, show a lot of promise and more closely track real biological systems. The human brain has an estimated graph density of 0.0001 to 0.001. Advances in sparse computing libraries and new hardware architectures could be key to achieving this kind of efficiency.
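A rough sketch of why that density figure matters for efficiency (illustrative only; the matrix size and COO layout are my assumptions, not from the comment): at a density of 1e-4, storing only the nonzero connections is orders of magnitude smaller than a dense weight matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000            # units ("neurons") in this toy example
density = 1e-4      # brain-like graph density from the comment
nnz = int(n * n * density)  # number of nonzero connections

# Dense representation: every possible connection stored as a float32.
dense_bytes = n * n * 4

# COO-style sparse representation: (row, col, value) per nonzero connection.
rows = rng.integers(0, n, nnz).astype(np.int32)
cols = rng.integers(0, n, nnz).astype(np.int32)
vals = rng.standard_normal(nnz).astype(np.float32)
sparse_bytes = rows.nbytes + cols.nbytes + vals.nbytes

print(f"dense:  {dense_bytes / 1e6:.1f} MB")
print(f"sparse: {sparse_bytes / 1e3:.1f} KB")
```

The memory gap (here roughly 16 MB vs. a few KB) is the kind of win sparse libraries and hardware are chasing; the harder part, as the replies note, is exploiting it without losing throughput on dense-optimized accelerators.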

lazide | 6 months ago

Memristors have been tried for literally decades.

If the poster's other guesses pay out at the same rate, this will likely never play out.

kelipso | 6 months ago

There was a bit of noise about spiking neural networks a few years ago, but I'm not seeing it mentioned so often anymore.

ilaksh | 6 months ago

Other technologies that were tried for decades before becoming huge: neural-network AI, electric cars, mRNA vaccines, solar photovoltaics, LED lighting.

hyperbovine | 6 months ago

> Sparse models for example show a lot of promise and more closely track real biological systems.

I think sparsity is a consequence of some other fundamental properties of brain function that we've yet to understand. Just sparsifying the models we've got is not going to lead anywhere, IMO. For example, it's estimated that current AI models are already within 1%-10% of a human brain in terms of "number of parameters" (https://www.beren.io/2022-08-06-The-scale-of-the-brain-vs-ma...).
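The 1%-10% figure can be sanity-checked with a back-of-envelope calculation (the specific numbers here are common ballpark estimates I'm assuming, not taken from the linked post): the human brain is often estimated at ~10^14 synapses, while the largest current models are on the order of 10^12 parameters.

```python
# Back-of-envelope: ratio of model parameters to brain synapses.
# Both figures are rough, commonly cited estimates, not measurements.
brain_synapses = 1e14   # ~100 trillion synapses (ballpark estimate)
model_params = 1e12     # ~1 trillion parameters (largest current models)

ratio = model_params / brain_synapses
print(f"{ratio:.0%}")   # prints "1%"
```

On those assumptions the largest models sit at the low end of the quoted 1%-10% range, which is the point being made: raw parameter count may no longer be the bottleneck.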