_hark | 1 year ago

Entropy is not absolute!

The entropy of some data is well defined only with respect to a model, and the model choice is free: different models will assign different entropies to the same data.
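
A minimal sketch of this in Python (the bit string and the two model parameters are illustrative, not from anything above): the same data gets a different codelength, i.e. a different "entropy", depending on which model you code it against.

    import math

    def codelength_bits(data, p_one):
        # Shannon codelength of `data` under a Bernoulli(p_one) model.
        return sum(-math.log2(p_one if bit else 1 - p_one) for bit in data)

    data = [1, 1, 1, 0, 1, 1, 0, 1]      # 6 ones, 2 zeros
    print(codelength_bits(data, 0.5))    # 8.00 bits under a fair-coin model
    print(codelength_bits(data, 0.75))   # ~6.49 bits under Bernoulli(0.75)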

And how do we choose a model? Well, formally, by minimizing the information needed to describe both the model and the data: the sum of the model's complexity and the data's entropy under the model [1].
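
Here is a toy two-part-code sketch of that idea (the candidate grid and its uniform model code are my own assumptions): each model costs log2(grid size) bits to name, plus the data's codelength under it, and we pick the minimizer.

    import math

    def codelength_bits(data, p_one):
        return sum(-math.log2(p_one if bit else 1 - p_one) for bit in data)

    def mdl_choice(data, grid):
        # Two-part code: bits to name the model (uniform code over the grid)
        # plus bits for the data under that model.
        model_bits = math.log2(len(grid))
        return min(grid, key=lambda p: model_bits + codelength_bits(data, p))

    data = [1, 1, 1, 0, 1, 1, 0, 1]
    print(mdl_choice(data, [0.1, 0.25, 0.5, 0.75, 0.9]))  # -> 0.75

In richer model classes the naming cost differs across candidates, so MDL genuinely trades fit against complexity rather than just maximizing likelihood.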

You might argue that's all too information-theoretic, and that in physics there is simply an objective count of the state space, a maximum entropy, and so on. Alas, there is not even general consensus on whether there is a locally finite number of degrees of freedom.

[1]: https://en.wikipedia.org/wiki/Minimum_description_length

beagle3 | 1 year ago

But it is closer to absolute than you make it sound here. There are information-theoretic models which are “universal” with respect to a class; that is, they are essentially as good as any model in that class, on every individual case you apply them to, even if different cases are best described by distinct models from that class.

E.g. the KT estimator is, for each individual Bernoulli sequence, as good as the best Bernoulli model for that sequence, with at most a 1/2-bit difference (independent of sequence length).
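
A sketch of that comparison (the KT predictor formula is the standard one; the test sequence is made up): code the sequence sequentially with KT, then compare against the maximum-likelihood Bernoulli model chosen in hindsight for that same sequence.

    import math

    def kt_codelength_bits(data):
        # Sequential KT predictor: P(1 | a ones, b zeros seen) = (a + 1/2) / (a + b + 1).
        bits, ones, zeros = 0.0, 0, 0
        for bit in data:
            p_one = (ones + 0.5) / (ones + zeros + 1)
            bits += -math.log2(p_one if bit else 1.0 - p_one)
            ones += bit
            zeros += 1 - bit
        return bits

    def hindsight_bernoulli_bits(data):
        # Codelength under the best (maximum-likelihood) Bernoulli model
        # fit to this very sequence.
        n, k = len(data), sum(data)
        if k == 0 or k == n:
            return 0.0
        p = k / n
        return -(k * math.log2(p) + (n - k) * math.log2(1 - p))

    data = [1, 1, 0, 1, 1, 1, 0, 1] * 4
    print(kt_codelength_bits(data) - hindsight_bernoulli_bits(data))  # KT's overhead in bits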

And you have a “universally universal” model: Kolmogorov complexity. It is uncomputable, and only well defined up to an additive constant, but in that sense, entropy IS an absolute.