Perception seems to be one of the main constraints on LLMs, and one where not much progress has been made. Perhaps not surprising, given that perception is something evolution has worked on since the inception of life itself. It is likely much, much more computationally expensive than it gets credit for.
Workaccount2|3 months ago
stalfie|3 months ago
(Source: "The mind is flat" by Nick Chater)
recitedropper|3 months ago
orly01|3 months ago
nomel|3 months ago
Physical analog chemical circuits whose physical structure directly *is* the network, using chemistry/physics directly for the computations. For example, a sum is usually represented as the number of physical ions present within a space, not computed by an ALU that takes in two binary numbers, each with some large number of bits, shifting electrons to and from buckets through a bunch of clocked logic operations.
There are a few companies working on more "direct" implementations of inference, like Etched AI [1] and IBM [2], for massive power savings.
[1] https://en.wikipedia.org/wiki/Etched_(company)
[2] https://spectrum.ieee.org/neuromorphic-computing-ibm-northpo...
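A toy sketch of the contrast being drawn here (purely illustrative — neither function resembles how any real chip, analog or digital, is actually built): a digital adder has to push bits through repeated logic operations, while an "analog" sum is just the total quantity present, like counting ions in a compartment.

```python
def digital_add(a: int, b: int) -> int:
    """Addition using only bitwise logic, mimicking the clocked
    shuffling of bits through an ALU: XOR gives the carryless sum,
    AND finds the carry positions, and we loop until no carries remain."""
    while b:
        carry = (a & b) << 1  # bit positions where both inputs are 1
        a = a ^ b             # sum ignoring carries
        b = carry
    return a

def analog_add(charges: list[float]) -> float:
    """'Analog' sum: the answer simply *is* the accumulated quantity,
    with no bit-level machinery at all."""
    total = 0.0
    for q in charges:
        total += q
    return total

print(digital_add(13, 29))           # 42
print(analog_add([0.5, 1.5, 2.0]))   # 4.0
```

The point of the toy is that `digital_add` needs several clocked passes over every bit, while the analog version collapses the whole computation into one physical accumulation step.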
recitedropper|3 months ago
My armchair take would be that watt usage probably isn't a good proxy for computational complexity in biological systems. A good piece of evidence for this comes from C. elegans research, which has found that the configuration of ions within a neuron--not just the electrical charge on the membrane--records computationally relevant information about a stimulus. There are probably many more hacks like this that allow the brain to handle enormous complexity without it showing up in our measurements of its power consumption.