Sample size of one, but on mobile, my number one pet peeve (apart from popover ads and autoplay videos!) is layouts that change as the page loads. Progressive encoding with multiplexing should help make sure newly loaded images don't make article text jump around as I try to read it.
Then again, maybe this is a laziness issue, since it seems to me that defining a layout independent of the actual image content is something you could already do in the '90s.
This is the only research that claims progressive images aren't a good user experience, though. I can't confirm with electrodes attached to my brain, but personally I like progressive images better than baseline, and they're smaller too!
A single piece of research that measures vitals instead of just asking users which one they liked more isn't very convincing. It could even be A/B tested by measuring how long people stay on the page, or the bounce rate above the fold.
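The A/B test suggested above could be sketched as a two-proportion z-test on bounce rates. The visit and bounce counts below are invented for illustration; a real test would also need a power calculation up front.

```javascript
// Compare bounce rates between a progressive-image variant (A) and a
// baseline-image variant (B) with a two-proportion z-test.
function bounceRateZ(bouncesA, visitsA, bouncesB, visitsB) {
  const pA = bouncesA / visitsA;
  const pB = bouncesB / visitsB;
  // Pooled proportion under the null hypothesis (no difference).
  const pooled = (bouncesA + bouncesB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pA - pB) / se;
}

// Hypothetical numbers: 420/1000 bounces on A vs. 480/1000 on B.
// |z| > 1.96 would indicate a difference significant at the 5% level.
const z = bounceRateZ(420, 1000, 480, 1000);
console.log(z.toFixed(2)); // "-2.70"
```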
Sample size: three (and counting). It's an awful problem on mobile and laptop/desktop alike: my greatest gripe is that I may think the page is fully loaded, click to highlight text and, boom, I'm sent off to some video or advertising page through a link that abruptly displaced the target I aimed at.
> it is possible to flag individual scan layers of progressive JPEGs with high priority and making the server push those scan layers into the client browsers’ Push cache even before the request for the respective image is initiated
Whoa, I had no idea this is possible. Isn't this a crazy layering violation (why should HTTP/2 know about progressive JPEGs)? The links don't seem to provide any more information about it.
Edit: it looks like HTTP/2 only talks about streams, so it's too strong to say that you can flag "individual scan layers with high priority." You can't change the order of scan layers within a JPEG file, or send the file in anything other than its natural byte order, so this seems to have the same limitations as HTTP/1.x.
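The point about byte order can be seen in the file format itself: each scan layer of a progressive JPEG begins with an SOS (Start of Scan, 0xFFDA) marker, and the scans sit in the file in a fixed sequence, so a server can at best prioritize a byte-range prefix (the first scans). A minimal sketch, using a synthetic byte sequence rather than a real image:

```javascript
// Find the byte offsets of SOS (0xFFDA) markers in a JPEG byte stream.
// NOTE: a real parser must skip stuffed 0xFF00 bytes inside entropy-coded
// data; this naive scan is only safe on the synthetic input below.
function findScanOffsets(bytes) {
  const offsets = [];
  for (let i = 0; i + 1 < bytes.length; i++) {
    if (bytes[i] === 0xff && bytes[i + 1] === 0xda) offsets.push(i);
  }
  return offsets;
}

// Synthetic stand-in for a progressive JPEG with two scans (not a real image).
const fakeJpeg = Uint8Array.from([
  0xff, 0xd8,                               // SOI
  0xff, 0xda, 0x00, 0x04, 0x01, 0x00,       // SOS #1 header
  0x12, 0x34,                               // entropy-coded data, scan 1
  0xff, 0xda, 0x00, 0x04, 0x01, 0x00,       // SOS #2 header
  0x56,                                     // entropy-coded data, scan 2
  0xff, 0xd9,                               // EOI
]);

console.log(findScanOffsets(fakeJpeg)); // [ 2, 10 ]
```

The scans are simply consecutive segments; nothing in the container lets a sender tag scan 1 differently from scan 2, which is why any prioritization has to happen at the transport layer on byte ranges.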
> The best way to counter negative effects of loading image assets is image compression: using tools such as Kornel Lesiński‘s ImageOptim, which utilizes great libraries like mozjpeg and pngquant, we can reduce image byte size without sacrificing visual quality.
The standard metrics these tools provide (JPEG quality or PSNR) are not enough to preserve visual quality across a variety of images. I'm working on a project to actually do that [1].
My favorite example is the actual Google logo [2]. It's 13,504 bytes and can be reduced by almost half, to 7,296 bytes, probably saving gigabytes of bandwidth each day.
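A back-of-the-envelope check of that claim, using the commenter's byte counts and an invented request volume (the 1 million loads/day figure is an assumption, not a measurement):

```javascript
// Figures from the comment above.
const originalBytes = 13504;
const optimizedBytes = 7296;
const savedPerRequest = originalBytes - optimizedBytes; // 6208 bytes saved

// Hypothetical traffic: even at just 1 million loads per day...
const requestsPerDay = 1e6;
const savedGBPerDay = (savedPerRequest * requestsPerDay) / 1e9;

console.log(savedPerRequest, savedGBPerDay); // 6208 6.208
```

So at a million daily loads the saving is already ~6 GB/day; real traffic for the Google logo is presumably orders of magnitude higher.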
Image compression is a great first step. Imagemin is a great JavaScript-based tool that lets you incorporate a bunch of optimization tools (mozjpeg, pngquant, jpegtran, optipng, gifsicle, etc.) all in one (if your project can easily use JavaScript modules). And there is a plugin [0] (that I wrote) for webpack to make it happen without any thought.
Another potentially massive step is to use the srcset [1] attribute of the <img> tag. It lets you provide a bunch of different resolutions for the same image, and the browser will choose the best one to download and render based on the physical screen pixel density, zoom level, and possibly in the future even things like bandwidth preferences or battery level.
Combine the imagemin plugin with a webpack-loader [2] that will auto-generate 5-ish different downscaled versions of an image as srcset, and you get a pretty perfect setup.
My web apps now always use the highest-resolution image I have available by default (within reason; I do cut it down to a realistic value), then automatically provide 5 downscaled versions alongside it in the srcset, all run through a battery of optimizations to compress them as well as possible. And the browser will only download the largest one it can realistically use. Everyone gets high-quality images, and nobody wastes bandwidth just because higher-res screens are supported.
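The "5-ish downscaled versions" setup can be sketched as a small helper that emits a srcset attribute value. The file-naming scheme and `buildSrcset` function here are made up for illustration; in practice a loader like responsive-loader generates the variants and the attribute for you.

```javascript
// Build a srcset attribute value with evenly spaced downscaled widths,
// e.g. for <img src="hero-2000w.jpg" srcset="..." sizes="100vw">.
function buildSrcset(baseName, maxWidth, steps = 5) {
  const entries = [];
  for (let i = 1; i <= steps; i++) {
    const w = Math.round((maxWidth * i) / steps);
    // Width descriptor ("400w") tells the browser each candidate's size.
    entries.push(`${baseName}-${w}w.jpg ${w}w`);
  }
  return entries.join(', ');
}

console.log(buildSrcset('hero', 2000));
// hero-400w.jpg 400w, hero-800w.jpg 800w, ... hero-2000w.jpg 2000w
```

The browser then picks the smallest candidate that still covers the layout size times the device pixel ratio.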
If it would be sufficient to have high PSNR only in the 'regions of interest', you could use something like this [1], which uses a CNN model to predict a quality map, and multiple JPEG encodings to achieve variable-quality compression.
I never liked progressive images. Just loading vertically is far less tantalizing than having to check the pixelation to see whether you're really looking at the final product.
Progressive JPEGs are actually a worse experience on browsers that don't render them progressively, like Safari (including iOS). This is why many major sites, such as Flickr, don't use progressive JPEGs (last time I checked, which was a few years ago).
I'm curious; this would probably be better than using an image preload/loading icon, as far as delay or even blank spots/placeholders go.
I think I've seen this before, but I've used an overlay loading GIF that was shown over the image while it loaded; once the image finished loading, the GIF would be hidden.
I'm always happy when sites don't mess with <img> like this, since it completely breaks down for non-JS browsers. :/ There are ways to do this without breaking, but it's not always done. Apparently.
If only some side-by-side video of the page rendering with the two methods were available.
I like progressive images, and the whole idea, but a catchy video of a page re-laying itself out multiple times as images arrive over traditional HTTP with plain JPEGs would be the most convincing.
I think at this point a better solution is to use a polyfill for WebP.
The polyfill is pretty light because the underlying image format is just a single frame of VP8. There shouldn't be much of a performance hit, since it will just render as a single frame of video on polyfilled browsers.
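Before decoding anything, such a polyfill first has to recognize WebP files. A WebP file is a RIFF container: bytes 0-3 spell "RIFF" and bytes 8-11 spell "WEBP", with the VP8 frame mentioned above living in the chunk that follows. A minimal sniffing sketch:

```javascript
// Check the RIFF/WEBP magic bytes at the start of a buffer.
function isWebP(bytes) {
  const tag = (offset) => String.fromCharCode(...bytes.slice(offset, offset + 4));
  return tag(0) === 'RIFF' && tag(8) === 'WEBP';
}

// Minimal synthetic header: "RIFF" + 4 size bytes + "WEBP" + chunk tag.
const header = new TextEncoder().encode('RIFF\x10\x00\x00\x00WEBPVP8 ');
console.log(isWebP(header)); // true
```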
philbo | 9 years ago
http://www.webperformancetoday.com/2014/09/17/progressive-im...
rebuilder | 9 years ago
inian | 9 years ago
rockostrich | 9 years ago
SmkyMt | 9 years ago
millstone | 9 years ago
the8472 | 9 years ago
sedatk | 9 years ago
vladdanilov | 9 years ago
[1] http://getoptimage.com
[2] https://www.google.ru/images/branding/googlelogo/2x/googlelo...
Klathmon | 9 years ago
[0] https://github.com/Klathmon/imagemin-webpack-plugin
[1] https://css-tricks.com/responsive-images-youre-just-changing...
[2] https://github.com/herrstucki/responsive-loader
iamaaditya | 9 years ago
[1] https://github.com/iamaaditya/image-compression-cnn
the8472 | 9 years ago
eknkc | 9 years ago
angry-hacker | 9 years ago
dgreensp | 9 years ago
See: http://calendar.perfplanet.com/2012/progressive-jpegs-a-new-...
simonlc | 9 years ago
unknown | 9 years ago
[deleted]
ge96 | 9 years ago
nothrabannosir | 9 years ago
kodfodrasz | 9 years ago
79d697i6fdif | 9 years ago
Eric_WVGG | 9 years ago
rimantas | 9 years ago
[deleted]
dang | 9 years ago
We detached this subthread from https://news.ycombinator.com/item?id=13270113 and marked it off-topic.
woof | 9 years ago