top | item 45238965

cpldcpu|5 months ago

I believe the argument is that you can also encode information in the time domain.

If we just look at spikes as a different numerical representation, then they are clearly inferior. For example, encoding the number 7 as a rate requires seven consecutive pulses on a single spiking line. Encoding the same number in binary requires one pulse on each of three parallel lines (7 = 0b111).

Binary encoding wins 7x in speed (one time slot instead of seven) and 7/3 ≈ 2.33x in power efficiency (three pulses instead of seven)...

On the other hand, if we assume that we are able to encode information in the gaps between pulses, then things quickly change.
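To make the comparison above concrete, here is a toy sketch (the encodings and cost model are my own illustration, not from the thread) counting lines, pulses, and time slots needed to transmit the value 7 under rate coding, binary coding, and a simple inter-spike-gap coding:

```python
# Toy cost model: how many wires, pulse events, and time slots does
# each scheme need to transmit the value 7?

def rate_code(n):
    # one line, n consecutive pulses: n events, n time slots
    return {"lines": 1, "pulses": n, "slots": n}

def binary_code(n):
    # one parallel line per bit, one time slot; a pulse per set bit
    bits = max(n.bit_length(), 1)
    return {"lines": bits, "pulses": bin(n).count("1"), "slots": 1}

def interval_code(n):
    # two pulses delimiting a gap of n empty slots: the value lives
    # in the time domain, so the pulse count stays constant at 2
    return {"lines": 1, "pulses": 2, "slots": n + 1}

for enc in (rate_code, binary_code, interval_code):
    print(enc.__name__, enc(7))
```

Under this (admittedly crude) model, gap coding spends more time but constant pulse energy per value, which is the "things quickly change" point: the energy cost no longer scales with the magnitude being encoded.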

HarHarVeryFunny|5 months ago

I think the main benefit of a neuromorphic design would be to make it dataflow driven (asynchronous event driven - don't update neuron outputs unless their inputs change) rather than synchronous, which is the big power efficiency unlock. This doesn't need to imply a spiking design though - that seems more of an implementation detail, at least as far as dataflow goes. Nature seems to use spike firing rates to encode activation strength.

In the brain the relative timing/ordering of different neurons asynchronously activating (A before B, or B before A) is also used (spike-timing-dependent plasticity - STDP) as a learning signal to strengthen or weaken connection strengths, presumably to learn sequence prediction in this asynchronous environment.

STDP also doesn't imply that spikes or single neuron spike train inter-spike timings are necessary - an activation event with a strength and timestamp would seem to be enough to implement a digital dataflow design, although ultimately a custom analog design may be more efficient.
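A minimal sketch of that last idea, assuming the "activation event with a strength and timestamp" representation the comment proposes (all names and numbers here are mine): neurons sit idle and are only recomputed when an input event arrives, processed in timestamp order from an event queue.

```python
# Event-driven (dataflow) update sketch: only the neuron whose input
# changed gets recomputed, instead of updating everything every tick.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    time: float
    target: int
    strength: float = field(compare=False)

def run(events, weights, thresholds):
    """Process timestamped activation events in time order.
    weights[i] -> list of (downstream neuron, weight) pairs."""
    queue = list(events)
    heapq.heapify(queue)
    fired = []
    potential = {}  # accumulated input per neuron
    while queue:
        ev = heapq.heappop(queue)
        potential[ev.target] = potential.get(ev.target, 0.0) + ev.strength
        # threshold check happens only on input change (dataflow)
        if potential[ev.target] >= thresholds.get(ev.target, 1.0):
            fired.append((ev.time, ev.target))
            potential[ev.target] = 0.0
            # propagate a new timestamped event downstream
            for dst, w in weights.get(ev.target, []):
                heapq.heappush(queue, Event(ev.time + 1.0, dst, w))
    return fired
```

For example, `run([Event(0.0, 0, 1.5)], {0: [(1, 1.2)]}, {})` fires neuron 0 at t=0 and neuron 1 at t=1; the relative firing times in the returned list are exactly the signal an STDP-style rule could consume.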

GregarianChild|5 months ago

Can you explain the benefit of renaming dataflow as 'neuromorphic'?

You do understand that dataflow architectures have been tried many, many times? See [1] for a brief history. MIT had a big dataflow lab for many years (led by the recently deceased Arvind). What is the benefit of re-inventing dataflow architectures by complete amateurs who are not at all aware of the half-century research tradition on dataflow architecture, and of the very clear and concrete reasons why this architecture has so far failed whenever it was tried for general-purpose processors?

We cannot even apply Santayana's "those who forget their history are condemned to repeat it", because the 'neuromorphic' milieu doesn't even bother to understand this history.

[1] https://csg.csail.mit.edu/Dataflow/talks/DennisTalk.pdf

dist-epoch|5 months ago

> you can also encode information in the time domain.

Also known as a serial interface. They are very successful: PCIe lane, SATA, USB.

cpldcpu|5 months ago

These interfaces use serialized binary encoding.

SNNs are more similar to pulse density modulation (PDM), if you are looking for an electronic equivalent.
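To illustrate the PDM analogy (a minimal first-order delta-sigma encoder of my own, not anything specific to SNN hardware): the signal level is represented by the *density* of pulses over time, not by their binary weighting.

```python
# First-order PDM (delta-sigma) encoder sketch: an accumulator emits
# a pulse each time it crosses 1, so the running density of 1s in the
# bitstream tracks the input level.

def pdm_encode(samples):
    """samples in [0, 1]; returns a bitstream of 0s and 1s."""
    acc = 0.0
    bits = []
    for x in samples:
        acc += x
        if acc >= 1.0:
            bits.append(1)
            acc -= 1.0
        else:
            bits.append(0)
    return bits

bits = pdm_encode([0.25] * 16)
print(sum(bits) / len(bits))  # 0.25: four pulses in sixteen slots
```

As with spikes, the information is in how often pulses occur, so a constant input produces a steady pulse rate rather than a binary word.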

nickpsecurity|5 months ago

"I believe the argument is that you can also encode information in the time domain."

Brain research showed that's happening, too. You'll see many models like this if you DuckDuckGo for "spiking" "temporal" "encoding", or substitute "time" for "temporal". You can further add "neural" "network" or "brain" to focus it on sub-fields.