patrickmeenan | 1 year ago
In the web case, it mostly only makes sense if users are using a site for more than one page (or over a long time).
Some of the common places where it can have a huge impact:
- Delta-updating WASM or JS/CSS code between releases, like the YouTube player JavaScript or Adobe Web's WASM code. Instead of downloading the whole thing again, the version in the user's cache can be used as a "dictionary" and just the deltas for the update can be delivered. Typically this is 90-99% smaller than using Brotli with no dictionary.
- Lazy-loading a site-specific dictionary for the HTML content. Pages after the first one can use the dictionary and load just the page-specific content (the headers, template, common phrases, logos, inline SVGs, data URIs, etc. compress away). This usually makes the HTML 60-90% smaller depending on how much unique content is in the HTML (there is a LOT of site boilerplate).
- JSON APIs can load a dictionary containing the keys and common values, yielding essentially a binary format on the wire for JSON data, with all of the verbosity compressed out.
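To see why a keys-and-common-values dictionary shrinks JSON so much, here is a minimal sketch using the preset-dictionary feature of Python's stdlib zlib, which is conceptually the same mechanism as Brotli's shared dictionaries (the real feature uses Brotli or Zstandard, not DEFLATE, and the dictionary contents here are made up for illustration):

```python
# Illustrative sketch: dictionary-based compression with stdlib zlib.
# The dictionary and payload are invented examples, not a real API.
import json
import zlib

# "Dictionary" seeded with the keys and boilerplate of a JSON API response.
dictionary = b'{"status":"ok","user":{"id":,"name":"","email":""},"items":['

payload = json.dumps(
    {"status": "ok",
     "user": {"id": 42, "name": "Ada", "email": "ada@example.com"},
     "items": []},
    separators=(",", ":"),
).encode()

# Compress without and with the preset dictionary.
plain = zlib.compress(payload, 9)

comp = zlib.compressobj(level=9, zdict=dictionary)
with_dict = comp.compress(payload) + comp.flush()

# The decompressor must be given the same dictionary.
decomp = zlib.decompressobj(zdict=dictionary)
assert decomp.decompress(with_dict) == payload

print(f"raw={len(payload)} plain={len(plain)} with_dict={len(with_dict)}")
```

The dictionary-compressed output only has to encode the values that are unique to this response (42, Ada, the email address); everything else becomes cheap back-references into the dictionary.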
I expect we're still just scratching the surface of how they will be used but the results are pretty stunning if you have a site with regular user engagement.
FWIW, the dictionaries are not "pre-shared", so it doesn't help the first visit to a site. Sites can use existing responses as dictionaries for delta updates, or load dictionaries on demand, but it is up to the site to create them and load them.
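To make the "not pre-shared" mechanics concrete: with Compression Dictionary Transport, an ordinary response opts in as a dictionary for later requests, and the browser then advertises a hash of what it has cached. A rough sketch of the exchange (the URL pattern and hash placeholder are illustrative; the header names and the `dcb`/`dcz` content encodings are from the spec):

```http
# 1. The v1 bundle ships normally and opts in as a dictionary:
HTTP/1.1 200 OK
Content-Type: text/javascript
Use-As-Dictionary: match="/static/app-*.js"

# 2. On a later fetch of v2, the browser advertises the cached dictionary:
GET /static/app-v2.js HTTP/1.1
Accept-Encoding: gzip, br, zstd, dcb, dcz
Available-Dictionary: :<base64 SHA-256 of the cached v1 bundle>:

# 3. The server responds with a delta, brotli-compressed against v1:
HTTP/1.1 200 OK
Content-Encoding: dcb
```

So the first fetch pays full price; every subsequent fetch that matches the pattern can be a small delta against whatever the browser already has.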
It will probably fall over if it gets hit too hard, but there is some tooling here that can generate dictionaries for you (using the brotli dictionary generator) and let you test the effectiveness: https://use-as-dictionary.com/
jiggawatts | 1 year ago
Ah, gotcha: this is a new Google standard that helps Google sites when browsed using Google Chrome.
Everyone else will discover that keeping the previous version of a library around at build time doesn’t fit into the typical build process and won’t bother with this feature.
Only Facebook and maybe half a dozen similar other orgs will enable this and benefit.
The Internet top-to-bottom is owned by the G in FAANG with FAAN just along for the ride.
patrickmeenan | 1 year ago
Just about any e-commerce site will involve a few pages to go through search, product pages and checkout.
Most news sites will likely see at least 2-3 pages loaded by a given user over the span of a few months. Heck, even this site could shave 20-50% for each of the pages a user visits by compressing out the common HTML: https://use-as-dictionary.com/generate/result.php?id=d639194...
Any place you'd justify building a SPA, by definition, could also be a multi-page app with dictionaries.
If you have a site where visitors always bounce immediately and only ever visit one page, then it can't help, but those tend to be a lot less common.
It doesn't have to be a huge site to benefit, it just needs users that regularly visit to show large gains.
As CDNs implement support, the effort involved in supporting it will also drop, likely to the point where it can just be a checkbox to generate and use dictionary-based compression.
uf00lme | 1 year ago