
jotakami|5 years ago

I’m currently working on a PhD in cryptography and I ran into this particular entry a few months ago while trying to wrap my head around entropy as an information theoretic concept. To be honest, it triggered a cascade of revelation that I had not felt since the last time I took psychedelics.

I read another article recently about the unexpectedly large role that randomness plays in embryonic development, and an idea clicked into place:

Life is about sustaining order amongst chaos, negentropy in a sea of entropy. But how does evolution lead to larger and larger pockets of negentropy that are capable of sustaining themselves in increasingly hostile environments? How exactly does evolution lead to more and more “advanced” life forms?

Enter the magic of randomized algorithms. Randomized algorithms can often solve hard computational problems very efficiently, with the tradeoff that they have a small chance of failure. We can envision evolutionary leaps as computational problems, such as finding just the right folded protein to catalyze a particular cellular reaction. The magic of evolution is not just in building stable order, but also in harnessing randomness/entropy to solve environmental problems and then bootstrapping those solutions to solve higher level problems. Think about how just enough randomness is allowed into the process of meiosis to create perfectly functioning new humans that are wonderfully unique.
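As a toy sketch of this idea (entirely my own construction, with arbitrary parameters and no biological fidelity), here is a minimal (1+1) evolutionary algorithm: nothing but random bit-flip mutation and selection, yet it reliably finds a target "solution":

```python
import random

# Toy (1+1) evolutionary algorithm. The target string and mutation
# rate are arbitrary illustration values, not anything biological.
random.seed(42)  # seeded so the run is reproducible
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

def fitness(genome):
    # How well the genome "fits" its environment: matching positions.
    return sum(g == t for g, t in zip(genome, TARGET))

genome = [random.randint(0, 1) for _ in TARGET]
steps = 0
while fitness(genome) < len(TARGET):
    # Mutation: flip each bit independently with small probability.
    child = [1 - g if random.random() < 0.1 else g for g in genome]
    # Selection: keep the child only if it is at least as fit.
    if fitness(child) >= fitness(genome):
        genome = child
    steps += 1

print(genome, steps)
```

The algorithm never "knows" the answer; it just lets randomness propose variations and lets selection filter them, which is the tradeoff in miniature: fast progress on a hard search problem, bought with a stochastic (but here almost-surely terminating) process.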

DNA and RNA are the non-volatile memory of the biological computer. Central nervous systems eventually reached a level of complexity that allowed them to persist memories, which opened up an even higher order problem solving mechanism. We humans have taken it even further with a cerebral cortex capable of abstraction, leading to complex language and the technology to record that language permanently.


burnte|5 years ago

My favorite take on life and entropy is that life is the most efficient way to burn energy, to increase entropy. Life is inevitable in the universe simply as a method of increasing overall entropy efficiently. A bunch of chemicals may eventually decompose into photons and electrons, but it's a lot faster if something eats them.

willis936|5 years ago

Entropy increase is a stochastic process though. The rules of nature do not result in a system that maximizes entropy at the maximum possible rate.

viklove|5 years ago

This take seems similar to inferring intentionality from evolution.

OrderlyTiamat|5 years ago

> I ran into this particular entry a few months ago while trying to wrap my head around entropy as an information theoretic concept. To be honest, it triggered a cascade of revelation that I had not felt since the last time I took psychedelics.

To be honest, I've tried to wrap my head around entropy a few times (both in information theory and physics), but I've never really understood it well. It's related to, but not (completely) the same as, a measure of chaos; it's related to, but not the same as, the number of potential states; and so on.
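Here's roughly where I get stuck: as I understand it, Shannon entropy depends on the probabilities, not just on the count of states (toy numbers below are my own):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits, over the nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Same number of potential states (two), very different entropies:
print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits
print(shannon_entropy([1.0]))       # certainty: 0.0 bits
```

So "number of states" only coincides with entropy in the uniform case, and I don't have good intuition for what it means beyond that.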

Could people direct me to a good introduction/explanation of entropy in an information theory sense? I feel like I'd really enjoy biting into this topic, but I haven't found a good entry point yet.

jotakami|5 years ago

Also, just as our biological machinery is an engine of negentropy, carving out order from chaos, I think our brains continue that same work in the information theoretic sense of entropy. Humans are insatiably curious because we are alive, and to be alive is to be an engine of order among chaos.

KhoomeiK|5 years ago

> DNA and RNA are the non-volatile memory of the biological computer. Central nervous systems eventually reached a level of complexity that allowed them to persist memories, which opened up an even higher order problem solving mechanism. We humans have taken it even further with a cerebral cortex capable of abstraction, leading to complex language and the technology to record that language permanently.

Capitalism is to Natural Selection as the Brain is to the Genome. Capitalism is the same Darwinian entropy engine as Natural Selection, just abstracted to the plane of higher thought rather than raw biochemistry.

jotakami|5 years ago

I like where you’re going with this but I’m not sure the analogy is quite right. If you’ve read Dawkins, the genome is composed of genes which are basically the quanta of self-replication. He goes on to coin the term “meme” to refer to the analogous units of self-replication in mindspace. In other words, the brain is not analogous to the genome, rather the particular collection of abstractions (memes) which dominate how a particular brain processes and reacts to the world would be its cerebral “genome”.

I think we can say that natural selection operates across all levels of abstraction, because in the end the only objective reality is the biochemical one. Everything else is simulation. There is certainly a strong parallel between capitalism and Darwinian natural selection but I can’t immediately see a way to state that relationship in a clear analogy.