item 40780680

klauspost | 1 year ago

Looking at it quickly, it seems like you're just moving entropy into the frequency table.

If the frequency table isn't included in the count, I could just make an "infinite" compressor by storing the frequency table and cutting one byte off the end. From the frequency table I could then deduce what the last byte should be.
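That deduction can be sketched concretely (the helper name is hypothetical): if the receiver gets the exact frequency table "for free", the byte cut from the end is the one whose observed count comes up short of the table.

```python
from collections import Counter

def recover_last_byte(truncated, full_counts):
    """Deduce the missing final byte of a message truncated by one,
    given the exact frequency table of the original message. It is
    the symbol whose count in the truncated data is one short."""
    seen = Counter(truncated)
    for sym, n in full_counts.items():
        if seen[sym] == n - 1:
            return sym
    raise ValueError("table does not match data")

msg = b"hello world"
table = Counter(msg)            # the "free" side channel
assert recover_last_byte(msg[:-1], table) == msg[-1]
```

This is why the table's size has to be charged against the compressed output: it carries real information.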

> typical entropy encoders (Huffman, ANS, etc) would require 1 bit per symbol,

No. ANS (and range/arithmetic coding) allows for probabilities to be stored with fractional bits.
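The fractional-bit point follows from Shannon entropy, which these coders approach in the limit; a minimal sketch of the math for an i.i.d. source:

```python
import math

def entropy_bits_per_symbol(probs):
    """Shannon entropy in bits per symbol: the average rate an ideal
    arithmetic/range/ANS coder approaches for an i.i.d. source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A skewed binary source needs well under 1 bit per symbol on average,
# even though each individual symbol is one of two values.
h = entropy_bits_per_symbol([0.9, 0.1])   # ~0.469 bits/symbol
```

Huffman, by contrast, must assign each symbol a whole number of bits, so a two-symbol alphabet always costs it 1 bit per symbol.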

peter-ebert | 1 year ago

ANS requires 1 bit per symbol if the ratio is 1:1; you can confirm this here: https://kedartatwawadi.github.io/post--ANS/
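The 1:1 case can be checked directly from the entropy formula: a uniform binary source has exactly 1 bit of entropy per symbol, so no entropy coder can average below that here.

```python
import math

# Per-symbol entropy of a 1:1 (p = 0.5) binary source is exactly 1 bit.
p = 0.5
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
assert h == 1.0
```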

foobarqux | 1 year ago

The value of the ratio is information that needs to be transmitted. You are pretending that it's free.

gliptic | 1 year ago

Not if the counts are updated correctly.
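That is, with an adaptive model no table is sent at all. A minimal sketch of the idea (names are illustrative): both sides start from the same pseudocounts and a symbol's count is bumped only *after* it has been coded (encoder) or recovered (decoder), so the two tables can never diverge.

```python
from collections import Counter

def model_states(stream, alphabet=b"ab"):
    """Replay an adaptive frequency model over a symbol stream.
    Starts from uniform pseudocounts; each symbol's count is
    incremented only after that symbol is coded/decoded, so the
    encoder and decoder always agree on the table in effect."""
    counts = Counter({s: 1 for s in alphabet})
    states = []
    for s in stream:
        states.append(dict(counts))  # table used while coding `s`
        counts[s] += 1               # update afterwards, on both sides
    return states, counts

msg = b"abbaab"
enc_states, enc_final = model_states(msg)   # encoder walks the original
dec_states, dec_final = model_states(msg)   # decoder walks what it decoded
assert enc_states == dec_states and enc_final == dec_final
```

The cost doesn't vanish, though: early symbols are coded under poor (near-uniform) estimates, which is where the "table" information effectively gets paid for.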