
24-bit color sucks

65 points | bhauer | 13 years ago | tiamat.tsotech.com | reply

89 comments

[+] twelvechairs|13 years ago|reply
Missing the wood for the trees. RGB displays are deliberately made to be 'good enough' rather than 'great'. None of them can display whole areas of the visible spectrum [1]. Before taking issue with tiny amounts of visible banding, just maybe it would be better if monitors could actually display a reasonable amount of the visible spectrum...

[1] For a reasonable graphic example http://en.wikipedia.org/wiki/File:CIExy1931_srgb_gamut.png where grey is the entire visible spectrum and the coloured areas are the standard RGB colour space.

[+] 0x0|13 years ago|reply
24-bit is far from enough to prevent banding, and in many cases it's not just "a tiny amount" either. For example, a CSS gradient from #333 to #666:

http://i.imgur.com/9xeDT.png?1

It's quite obvious that for a gradient from 0x33 to 0x66 (one component), there are simply not enough distinct values to prevent banding. In this case, the range is 51 different values, for a gradient that's maybe spanning 500-1000 pixels. It really looks quite terrible.
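The arithmetic behind this is easy to check (a rough back-of-envelope sketch, assuming one uniform band per representable level, using the pixel widths mentioned above):

```python
# A one-channel gradient from 0x33 to 0x66 spans 51 steps,
# i.e. 52 representable 8-bit values.
start, end = 0x33, 0x66
levels = end - start + 1  # 52 distinct values

for width_px in (500, 1000):
    band_px = width_px / levels
    # ~9.6px bands at 500px, ~19.2px bands at 1000px: easily visible.
    print(f"{width_px}px wide -> ~{band_px:.1f}px per band")
```

At nearly 20 pixels per band on a wide gradient, no viewing distance hides the steps.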

[+] Xcelerate|13 years ago|reply
I should point out that the area of a region in that diagram does not correspond with the ability to distinguish colors in that region. The diagram is misleading, but your point still stands.
[+] cromwellian|13 years ago|reply
It's overreach to say high DPI displays eliminate the need for anti-aliasing. You will still get shimmering, pixel popping, and temporal aliasing, it will just be harder to notice. If you have thin shapes like hanging cables or chain-link fences being displayed, even at high resolution, you will get pixels coming in and out of existence. Higher sampled AA tends to add spatial "stability" so that slight shifts of viewpoint won't trigger this.
[+] bhauer|13 years ago|reply
Author here. You're right, high-DPI does not actually eliminate aliasing and I've toned down the wording slightly (added "nearly"). Certainly high-DPI is a huge step in the right direction even if several problems with aliasing remain.

I was simply attempting to say "thank you" to Apple for innovating with high-DPI displays so that I could follow that by saying more or less what you did: many things remain to address image clarity.

[+] ck2|13 years ago|reply
What on earth is with all the motion on the website.

Just because you can do animation doesn't mean you should.

[+] fuzzix|13 years ago|reply
It made the point about being able to distinguish fine colour gradients virtually impossible to verify - the background kept changing.

Also, it slowed site scrolling to a crawl on this reasonably beefy desktop machine.

[+] Flenser|13 years ago|reply
I think it wouldn't have been so bad if it had been slower, like clouds on a not-very-breezy day. As it is, it felt like it was trying to hypnotize me.
[+] bhauer|13 years ago|reply
Author here. You're completely right. I did it for fun, not because I "should."

I'm not a designer; just someone who enjoyed playing with SVG and SMIL to make a subtle background effect that I liked.

You can turn off the animation with the menu at the bottom right. Apologies to everyone for burning your CPU cycles so needlessly. Just turn it off if you don't like it.

[+] moreati|13 years ago|reply
Agreed. There's a control in the bottom-right corner to change the speed or stop it, but the animation shouldn't be there to start with.
[+] iMark|13 years ago|reply
The cpu utilisation of that page is quite impressive.
[+] 1331|13 years ago|reply
> Academics have told us that human eyes can't distinguish between the 16,777,216 colors provided by 24-bit depth, so we believe that even though it can't be true.

This is a fallacy. That the human eye cannot distinguish N different colours does not imply that a given colour space of >N colours contains every colour that the human eye can distinguish.

To give an equivalent example that is easier to understand, consider a colour space with 16.7 million shades of red. 16.7 million colours is more than the human eye can distinguish, but there are clearly many colours that the human eye can distinguish that are not in that colour space (notably shades of green, shades of blue, and combinations of red, green, and blue).
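The counting fallacy fits in a couple of lines of arithmetic (an illustrative sketch of the example above, not from the thread):

```python
# Two hypothetical "colour spaces" with identical sizes:
# one covers a single hue, the other a full RGB cube.
reds_only = 2 ** 24   # 16,777,216 shades of red
rgb_cube = 256 ** 3   # 16,777,216 (R, G, B) triples
print(reds_only == rgb_cube)  # True: same count, wildly different coverage
```

The number of colours says nothing about which colours are covered.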

[+] 4ad|13 years ago|reply
You are in violent agreement with the author. This is what he says as well.
[+] Xcelerate|13 years ago|reply
I agree. Displays in general suck. The display on the RMBP is a step in the right direction, but there's still a lot of work to be done. Anyone who settles for "good enough" may as well live life like it's the 16th century. It was "good enough" then too.

My vision for the future? Specifications that fully exceed the human capacity for discernibility.

2880x1800? Nope, I can still see aliasing (particularly in Terminal when I'm coding).

IPS? Nope. Move your head slightly up and down and -- while the chromaticity stays (roughly) the same -- the luminance does not.

Black levels? I can still distinguish a black screen from the bezel, so it needs some work too.

Color depth? See the provided banding examples. Not to mention that the three primaries in most LCD panels form a very small triangle in the chromaticity diagram.

Refresh rates also stink. It's 2012 -- motion should be so fluid it looks real by now.

Anyway, there's a lot of potential for improvement, but I'm afraid that unless you're OCD (like me) or a color scientist, you probably just don't care too much.

[+] wazoox|13 years ago|reply
Huh, don't get me started on image quality and such. Look how people constantly watch 4:3 movies horribly stretched on 16:9 screens. Look how bad the motion is on 99% of Blu-ray films (to the point of being unbearable): action scenes stutter and jitter except on the most recent hardware.
[+] donaldc|13 years ago|reply
While we're at it, let's also add 3D to displays.
[+] kevingadd|13 years ago|reply
The number of bytes of memory available on GPUs is hardly the biggest issue when considering rendering at more than 8 bits per channel of color precision. Even producing a 30-bit framebuffer (as has been supported for a while on many desktop GPUs) comes at a performance penalty and is incompatible with many pieces of software and hardware. Modern GPUs actually allow 16 bits per channel of precision when rendering, but it comes at a significant cost - various features no longer function, memory bandwidth is devoured, etc. You can't simply say 'we can spare the bits' and wave away all the technological challenges here, especially when the advantage gained from all those costs is comparatively minuscule.

To be fair, this sort of applies to retina displays as well: The hardware put into some of the Retina macs can barely handle the bandwidth demands of realtime rendering at such high resolutions. Until that problem is addressed, you certainly shouldn't be running around demanding >8bpc color precision.

Dithering for gradients could certainly make a minor difference in rendered quality, but I don't think most browser vendors are interested in making rendering performance slower right now - they're quite busy trying to make it faster. As evidenced by the fact that the OP's site runs like complete garbage in even modern browsers. Even if you spend the cycles and the power to dither, you're basically approximating something like another bit of precision. Is it really worth the cost just for another bit? You could support the argument for dithering by showing a side by side comparison, at least.
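The framebuffer cost is easy to estimate (a back-of-envelope sketch; the formats and the Retina-class resolution are illustrative choices, not figures from the comment):

```python
# Size of a single framebuffer at 2880x1800, no multisampling.
w, h = 2880, 1800
formats = [
    ("8 bpc RGBA8", 4),      # 32 bits per pixel
    ("10 bpc RGB10_A2", 4),  # 30-bit colour still packs into 32 bits
    ("16 bpc RGBA16F", 8),   # half-float doubles memory and bandwidth
]
for name, bytes_per_px in formats:
    print(f"{name}: {w * h * bytes_per_px / 2**20:.1f} MiB")
```

Notably, a packed 30-bit format costs no extra memory over 24-bit-plus-alpha; the real costs kevingadd describes are feature and compatibility losses, while 16 bpc is where the bandwidth genuinely doubles.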

[+] bhauer|13 years ago|reply
You're right. I trivialize the challenges, and I admit that.

However, my point is that it has been some fifteen years since 24-bit "True Color" arrived. Certainly since then we've increased computing performance to the point where we can manage to throw a few more bits around, yes?

Also, you're right that browser vendors are dutifully concerned about performance. In fact, one of the reasons I enjoyed adding a subtly animated background (see previous reply apologizing to those who hate it) was a bit of evidence to back up another point I've made elsewhere: in 2012 our computers that process billions of CPU operations per second paired with high-powered GPUs can still get chunky with relatively trivial 2D animation in a web browser.

Incidentally, on that point, if you happen to have IE 10, check out how well it uses your GPU to do CSS transitions. It doesn't do the SVG/SMIL animation used in the background, but I find it fascinating how effortlessly it executes the animation it does support: http://tiamat.tsotech.com/ie-10-is-no-joke

I'd really like to see Chrome and Firefox catch up with that degree of GPU acceleration.

But in this particular blog entry, I'm asking that some attention return to rendering quality. I'd love to see color banding in gradients disappear soon.

[+] mrb|13 years ago|reply
Actually dithering makes a huge difference. I see zero banding whatsoever in the examples posted in this thread, because my sw/hw config does automatic dithering (Xorg fbdev 0.4.2, AMD Fusion E-350 GPU, Lenovo X120e laptop).

Contrast this with other posters who I presume don't have dithering, and write comments like "this gradient looks ugly"...
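Ordered dithering of the kind mrb's driver applies can be sketched in a few lines (a minimal illustrative sketch, not the actual Xorg fbdev implementation):

```python
# Quantize a smooth 0..1 ramp to `levels` values, optionally nudging
# each pixel across the quantization boundary with a 4x4 Bayer offset
# so that neighbouring pixels interleave instead of forming hard bands.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize(value, x, y, levels, dither=True):
    offset = (BAYER4[y % 4][x % 4] + 0.5) / 16.0 if dither else 0.5
    return min(int(value * (levels - 1) + offset), levels - 1)

width = 32
ramp = [x / (width - 1) for x in range(width)]
hard = [quantize(v, x, 0, levels=4, dither=False) for x, v in enumerate(ramp)]
soft = [quantize(v, x, 0, levels=4, dither=True) for x, v in enumerate(ramp)]
print(hard)  # long runs of identical values: visible bands
print(soft)  # values interleave near each step, hiding the bands
```

The dithered row trades hard edges for high-frequency noise the eye averages out, which is why the same 24-bit gradient can look banded on one machine and smooth on another.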

[+] fusiongyro|13 years ago|reply
Your site design is cool, but the motion is interfering with readability.
[+] graue|13 years ago|reply
Not only that, it's such a processor hog it makes my MacBook Air fans whir at full volume.

I disabled JavaScript on the site, hoping that would make it behave, and got... no blog, only this:

> This blog uses a little JavaScript. Nothing dodgy, though, and nothing hosted at third-party sites. Just some jQuery and animation bits. So please, if you'd be so kind, ask Noscript to call off the hounds.

Congratulations, those “animation bits”, which you won't let me turn off, make your site unbearably annoying to read.

[+] nwh|13 years ago|reply
I'm actually having a hard time keeping my eyes on the text. It's almost sickening trying to focus.

I hope this developer isn't making content on real websites.

[+] Luyt|13 years ago|reply
You are not the only one. I couldn't keep my eyes focused on the text due to the rotating background, and closed the website after reading the first two paragraphs.
[+] comex|13 years ago|reply
It also lags down scrolling.
[+] haxxorfreak|13 years ago|reply
And it makes the fans on my rMBP go nuts.
[+] yen223|13 years ago|reply
Am I the only person who couldn't tell the difference in that horizontal gradient example?
[+] nwh|13 years ago|reply
I can't on my MacBook or external screen. I tried on my iPhone—assuming that the screen might be lower quality—sadly the website blew up and there was no gradient to be found.
[+] jamesaguilar|13 years ago|reply
I could, but barely. Not enough that this website made me care about the issue.
[+] jopt|13 years ago|reply
Safari renders both sides as rgba(72, 72, 72, 1.0). I have no idea why. Perhaps (ironically) a dithering algorithm gone wrong. In Chrome the divide is visible.
[+] barrkel|13 years ago|reply
I see banding in gradients quite often; more often in dynamic scenes in games than in more tightly designed static content online.

Of course much of the media we consume has quantized colour; streamed video especially doesn't really stand up to any kind of close examination on today's displays, never mind futuristic ones. Banding from limited bit depth is not the big problem here; there are much bigger elephants.

[+] nairteashop|13 years ago|reply
Oddly enough, I can see it in Chrome but not in Safari. I'm on OS X.
[+] Someone|13 years ago|reply
Try fiddling with your brightness/contrast settings.
[+] wildranter|13 years ago|reply
Vertical line right in the middle. Like the OP said, once you see it you can't stop seeing it.
[+] lucian1900|13 years ago|reply
I don't see any difference between those two bars, and my eyes are generally pretty good.

24bit is fine. It's <200dpi that has to die.

[+] devsatish|13 years ago|reply
With all due respect, I don't see the gradient break. Maybe because of that gray animation rotating behind that div? This post reminds me of yesterday's HN parody.
[+] raverbashing|13 years ago|reply
An important item is missing in this discussion.

Several monitors turn your 8-bit-per-channel image into a 6-bit-per-channel image.
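What that 6-bit truncation does to a gradient is easy to show (a minimal sketch; real panels usually mask this with temporal dithering/FRC):

```python
# An 8-bit value shown on a 6-bit panel: the low two bits are dropped,
# so four adjacent input levels collapse into one output level and
# every band in a gradient becomes four times wider.
def to_6bit(v8):
    v6 = v8 >> 2                   # keep the top 6 bits
    return (v6 << 2) | (v6 >> 4)   # re-expand to the full 8-bit range

distinct = len({to_6bit(v) for v in range(256)})
print(distinct)  # 64 distinct output levels instead of 256
```

So a display chain advertised as "24-bit" can quietly behave like an 18-bit one.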

[+] brudgers|13 years ago|reply
What problem does 30 bit color depth solve?

The obvious answer (or strawman) might be aesthetics. Yet two bits is sufficient to create art. Each medium has its limitations. Pen and ink, watercolor, clay - why should a computer screen be seen as inherently different?

We tend to view computer art on our own often-miscalibrated displays, under variable lighting conditions, and at a variety of resolutions. Computer art is generally mass-produced. Yes, it is behind glass, but not in the manner of the Mona Lisa.

I'm not saying that "deep color" isn't worthwhile. Only that a coherent case for its practical advantages wasn't made. The problem wasn't obvious on my screen, and I am biased toward content over form.

[+] 4ad|13 years ago|reply
Let's only use two bits then!
[+] seanalltogether|13 years ago|reply
Green is always the worst offender

https://dl.dropbox.com/u/1437645/banding.png

[+] mrb|13 years ago|reply
I don't see anything wrong with your image. The gradient looks perfect on my monitor.

Edit: it looks perfect because my sw/hw config (driver Xorg fbdev 0.4.2, AMD Fusion E-350, Lenovo X120e laptop) does automatic dithering. I can see a tiny bit of dithering on the white end of the other gradient posted by nwh: http://file.st/KWb7XS7x Other than that, no banding in either image, and the dithering is invisible in yours. Dithering 24-bit colors really helps.

[+] nwh|13 years ago|reply
You can go further. Undithered black-white gradients look even worse. Often they look green or yellow in bands — http://file.st/KWb7XS7x
[+] berkut|13 years ago|reply
Possibly because the human eye is more sensitive to green light.
[+] gdg92989|13 years ago|reply
Ugh, you're right, that looks terrible.
[+] haberman|13 years ago|reply
I'm missing the part about what awesome thing would be possible to do with higher-bit displays. The answer for higher-DPI displays is obvious: everything looks sharper, all the time. While it may be true that 30-bit color would allow the gradient between those two grays to be gradual instead of a single step, when does someone actually want to draw a gradient between such similar shades of gray?
[+] sukuriant|13 years ago|reply
What's even more fun is when monitors, especially LCD displays, are only 5 bits deep for each color :)
[+] lflux|13 years ago|reply
I'm not sure what's with the background, but it turned my laptop into a hairdryer.
[+] georgraphics|13 years ago|reply
"With high-DPI displays, the aliasing problem caused by insufficient pixels has been extinguished." that's not true. Aliasing is an omnipresent problem in discretized data every rasterized display is based on. Even though a retina display samples at a much higher rate, it needs anti aliasing for optimal results.

I'd like 64-bit displays too. The problem is that there's virtually no content and also no content pipeline. Someone has to take the first step here.

[+] Superpig|13 years ago|reply
I don't see the gradient switch either.

And having 2GB of VRAM does not imply that you'll need a 268-million pixel display to use it all. In modern games, the vast majority of that VRAM is used by textures (and we need as much of it as we can get!)

[+] verroq|13 years ago|reply
Who are these people who only test their sites in Chrome?