(no title)
extra__tofu | 8 years ago
If you are talking about information-theory entropy and saying a message of a billion bits composed of all "ones" has lower entropy than a "random" billion-bit message, then sure. This is like saying we can compress the billion bits of ones and send fewer bits but the same amount of information. But I don't think this is synonymous with the above. It would be like saying for event A, each subsequent individual event has less entropy than the previous -- we aren't dealing with a fair coin anymore.
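The compressibility contrast is easy to check empirically. A minimal sketch using `zlib` as a stand-in compressor (sizes scaled down to ~1 million bits for speed; the billion-bit case behaves the same way):

```python
import os
import zlib

n_bytes = 125_000                  # ~1 million bits, standing in for the billion

ones = b"\xff" * n_bytes           # the all-"ones" message
rand = os.urandom(n_bytes)         # a "random" message of the same length

ones_c = len(zlib.compress(ones))
rand_c = len(zlib.compress(rand))

# The all-ones message collapses to a tiny description ("125000 ones"),
# while random data is essentially incompressible.
print(ones_c)   # a few hundred bytes
print(rand_c)   # close to n_bytes
```

Note `zlib` only upper-bounds the true (uncomputable) Kolmogorov complexity, but the gap it shows is the point being made.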
szemet | 8 years ago
But if you do have memory, then you are free to choose to see "92 consecutive throws with a fair coin" as one event.
In that case Kolmogorov complexity describes perfectly well why you should be surprised at low-entropy outcomes -- simply because those are rare events (using our new definition of 'event').
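The rarity claim follows from a simple counting argument, sketched below (the threshold `k` is an illustrative choice, not from the comment): there are at most 2^k - 1 binary programs shorter than k bits, so at most that many of the 2^92 equally likely outcomes can have a description shorter than k bits.

```python
# Among all 2**92 equally likely outcomes of 92 fair-coin throws,
# at most 2**k - 1 can be produced by a program shorter than k bits.
k = 30                      # illustrative description-length threshold
total = 2 ** 92             # number of possible 92-flip sequences
simple = 2 ** k - 1         # upper bound on "simple" sequences

fraction = simple / total
print(fraction)             # < 2**-62: almost every outcome is incompressible
```

So seeing a highly compressible outcome (all heads, say) really is witnessing a rare event under this framing, which is exactly why it is surprising.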