top | item 44957250


ram_rar | 6 months ago

I’m a bit underwhelmed by the quality of articles coming out of Netflix. 100 million records per entity is nothing for Redis, even without the RAW Hollow-style compression techniques (bit-packing, dedup, and dictionary encoding are pretty standard stuff) [1].

Framing this as a hard scaling problem (Tudum seems mostly static; please correct me if that’s not the case) feels like pure resume-driven engineering. Makes me wonder: what stage was this system at when they felt the need to build this?

[1] https://hollow.how/raw-hollow-sigmod.pdf
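To make the "pretty standard stuff" claim concrete, here is a minimal sketch of two of the techniques named above, dictionary encoding plus bit-packing, in plain Python. All names and the sample data are illustrative; this is not Hollow's actual API, just the general idea those papers describe.

```python
def dict_encode(values):
    """Dictionary encoding: replace each value with a small integer id.
    Returns (ids, dictionary) where dictionary[id] recovers the value."""
    dictionary = []
    index = {}
    ids = []
    for v in values:
        if v not in index:
            index[v] = len(dictionary)
            dictionary.append(v)
        ids.append(index[v])
    return ids, dictionary

def bit_pack(ids, bits):
    """Bit-packing: store each id in exactly `bits` bits, little-endian."""
    buf, nbits = 0, 0
    out = bytearray()
    for i in ids:
        buf |= i << nbits
        nbits += bits
        while nbits >= 8:
            out.append(buf & 0xFF)
            buf >>= 8
            nbits -= 8
    if nbits:
        out.append(buf & 0xFF)
    return bytes(out)

def bit_unpack(data, bits, count):
    """Inverse of bit_pack: recover `count` ids of `bits` bits each."""
    buf, nbits = 0, 0
    ids = []
    it = iter(data)
    for _ in range(count):
        while nbits < bits:
            buf |= next(it) << nbits
            nbits += 8
        ids.append(buf & ((1 << bits) - 1))
        buf >>= bits
        nbits -= bits
    return ids

# Hypothetical sample data: repeated strings compress to 2 bits per record.
titles = ["Stranger Things", "Wednesday", "Stranger Things", "Arcane", "Wednesday"]
ids, dictionary = dict_encode(titles)
bits = max(1, (len(dictionary) - 1).bit_length())  # 3 distinct values -> 2 bits
packed = bit_pack(ids, bits)
decoded = [dictionary[i] for i in bit_unpack(packed, bits, len(ids))]
assert decoded == titles
```

The round trip shows why 100 million records of low-cardinality data is small: each record costs a few bits plus a shared dictionary, rather than a full copy of the string.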
