kidel001 | 1 year ago
But I am curious about something else. I'm not a statistical mechanics person, but my understanding from information theory is that something genuinely refined emerges once you apply a threshold (assuming it operates on SOME real signal), and the energy required to enforce that threshold is what allows lower-entropy systems to emerge. Isn't this the whole principle behind Maxwell's demon? If you could open a little door between two gas canisters at equal temperature, you could sort the faster molecules from the slower ones and paradoxically create a temperature difference out of nothing. But to open the door only for fast molecules (i.e., to threshold them), the demon has to measure each molecule and eventually erase that record, which costs energy (Landauer's principle), so it is no free lunch. The threshold effectively carves discrete categories out of a continuous distribution. I guess what I am asking is: isn't there a fundamental importance to thresholds in generating information? Isn't that how neurons work? Isn't that how AI models work?
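To make my question concrete, here's a toy numerical sketch (my assumptions: 1-D velocities drawn from a Gaussian as in an ideal gas, an arbitrary speed threshold, and the Landauer bound of k_B·T·ln 2 per erased bit — the threshold value and molecule mass are just illustrative):

```python
import math
import random

random.seed(0)
kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # gas temperature, K
m = 4.65e-26       # mass of an N2 molecule, kg (illustrative choice)
N = 100_000

# 1-D velocities in thermal equilibrium are Gaussian with variance kB*T/m.
sigma = math.sqrt(kB * T / m)
v = [random.gauss(0.0, sigma) for _ in range(N)]

# The demon thresholds on speed: fast molecules to one side, slow to the other.
threshold = sigma  # arbitrary cutoff; any fixed speed works for the sketch
fast = [x for x in v if abs(x) > threshold]
slow = [x for x in v if abs(x) <= threshold]

def temp_1d(vs):
    """Effective 1-D temperature from equipartition: kB*T = m*<v^2>."""
    return m * sum(x * x for x in vs) / (len(vs) * kB)

print(f"T_fast ~ {temp_1d(fast):.0f} K, T_slow ~ {temp_1d(slow):.0f} K")

# The catch: the demon stored one bit per molecule (fast/slow). Erasing
# those N bits dissipates at least N * kB * T * ln 2 of work (Landauer).
landauer_cost = N * kB * T * math.log(2)
print(f"Minimum erasure cost for {N} bits: {landauer_cost:.3e} J")
```

Running this, the "fast" side comes out hotter than 300 K and the "slow" side colder, so the threshold really does manufacture a temperature difference — but only by accumulating bits whose erasure has a minimum thermodynamic price.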