extra__tofu | 8 years ago

If we are flipping a fair coin, the "surprise" we have at the result of any individual flip should be the same for all individual flips. Therefore, the entropy of the event, call it "A", of 92 fair coin tosses each resulting in heads is the same as the entropy of any other specific sequence of 92 fair coin tosses, "B". The sum of the individual event entropies must be the same, so entropy(A) = entropy(B).
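
To make that concrete, here is a minimal sketch in Python (the function name and the fair-coin setup are my own illustration): the surprisal of any specific 92-flip sequence is -log2(2^-92) = 92 bits, whether it is all heads or looks random.

    import math

    def surprisal_bits(p):
        # Self-information of an outcome with probability p, in bits.
        return -math.log2(p)

    # Any specific sequence of 92 fair flips has probability 2**-92,
    # so its surprisal is 92 bits -- all heads and a "random-looking"
    # sequence alike.
    print(surprisal_bits(0.5 ** 92))  # 92.0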

If you are talking about information-theoretic entropy and saying a message of a billion bits composed of all ones has lower entropy than a "random" billion-bit message, then sure. This is like saying we can compress the billion ones and send fewer bits but the same amount of information. But I don't think this is synonymous with the above. It would be like saying that in event A each successive individual flip has less entropy than the previous one -- we aren't dealing with a fair coin anymore.
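
The compression point can be sketched quickly (zlib here is just a stand-in for an ideal coder, and the sizes are rough orders of magnitude, not figures from the source):

    import os, zlib

    megabyte = 10 ** 6
    ones = b"\xff" * megabyte      # all-ones bits: maximal redundancy
    rand = os.urandom(megabyte)    # incompressible with high probability

    print(len(zlib.compress(ones)))  # on the order of 1 kB
    print(len(zlib.compress(rand)))  # slightly more than 1 MB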

szemet | 8 years ago

That is true if you do not have memory (a memoryless channel with a memoryless observer ;)

But if you do have memory, then you are free to choose to see "92 consecutive throws of a fair coin" as one event.

In that case, Kolmogorov complexity describes perfectly well why you should be surprised at low-entropy outcomes: simply because those are rare events (using our new definition of 'event').
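
A back-of-the-envelope version of that counting argument, with illustrative numbers I am supplying (n = 92 flips, sequences compressible by c = 20 bits):

    # There are fewer than 2**k descriptions shorter than k bits, so of
    # the 2**n equally likely n-flip sequences, fewer than 2**(n - c)
    # can be described in n - c bits or fewer.
    n, c = 92, 20
    p_compressible = 2 ** (n - c) / 2 ** n  # bounded by 2**-c
    print(p_compressible)                   # ~9.5e-07: a rare event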