
lcrs | 9 years ago

10-bit with a log or gamma encoding is widespread in film and video work, and I've never seen a banding problem, even with purely generated gradients. The Rec. 2020 UHD standard does recommend 12-bit gamma encoding to deal with its ludicrously wide gamut.
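A quick sketch of why gamma encoding staves off banding at 10 bits (my own illustration, using a plain power-law gamma of 2.4 as a stand-in for whatever transfer curve a real system uses): the code-value step near black corresponds to a far smaller step in linear light than a linear 10-bit encoding gives, and near black is exactly where the eye is most sensitive.

```python
def gamma_decode(code, bits=10, gamma=2.4):
    """Decode an integer code value to linear light with a simple power-law
    gamma. Illustrative only; real transfer functions differ in detail."""
    return (code / (2 ** bits - 1)) ** gamma

# Linear 10-bit encoding: every step is the same absolute size in light.
linear_step = 1 / 1023

# Gamma 10-bit encoding: the same one-code step near black is a much
# smaller step in linear light, so dark gradients don't band.
gamma_step_near_black = gamma_decode(2) - gamma_decode(1)

print(f"linear 10-bit step:           {linear_step:.2e}")
print(f"gamma 10-bit step near black: {gamma_step_near_black:.2e}")
```

The trade-off is coarser steps near white, where the eye tolerates them.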

For HDR, a PQ (perceptual quantisation) encoding curve is already standardised by SMPTE as ST 2084 - page 8 of these slides goes through how it was worked out, right from human visual system basics: https://www.smpte.org/sites/default/files/2014-05-06-EOTF-Mi...
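For concreteness, here is a sketch of the PQ curve in the encoding direction (the inverse EOTF), with the rational constants as published in the ST 2084 spec; this is written from memory of the public standard, so check it against the document itself before relying on it.

```python
# PQ (SMPTE ST 2084) constants, expressed as the spec's exact fractions.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.8438
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.8516
C3 = 2392 / 4096 * 32    # ~18.6875

def pq_encode(nits):
    """Map absolute luminance (0..10000 cd/m^2) to a 0..1 PQ signal value."""
    y = max(nits, 0.0) / 10000.0        # normalise to the 10,000-nit ceiling
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2

# Classic 100 cd/m^2 SDR peak white lands just above mid-scale:
print(f"{pq_encode(100):.3f}")          # prints 0.508
```

Note how the curve spends roughly half its code range on the bottom 1% of the luminance range - that is the "perceptual" part doing its work.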

Given the abundance of log or gamma encodings for display imagery you might wonder why true linear 16-bit float is so common in CG production - why not log encode those 16 bits and get loads smoother gradients? Maybe the answer is that during production those linear files often also encode non-image data like vertex positions and normals, and perceptually "good" quantisation of those could lead to unexpected precision problems...
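It's worth noting that a 16-bit float is already quasi-logarithmic by construction: with 10 explicit mantissa bits, the spacing between adjacent values scales with magnitude, so the relative step stays near 0.1% across the normal range. A small sketch of this (my own illustration; it computes the spacing analytically from the format's exponent/mantissa layout and ignores subnormals):

```python
import math

def float16_ulp(x):
    """Spacing between adjacent IEEE 754 half-precision values near x,
    for x in the normal range: 2^(exponent - 10), from the 10 explicit
    mantissa bits. Sketch only; subnormals are not handled."""
    e = math.floor(math.log2(abs(x)))
    return 2.0 ** (e - 10)

# Absolute spacing varies hugely, but relative spacing is roughly constant:
for x in (0.01, 1.0, 100.0):
    print(f"x={x:>7}: ulp={float16_ulp(x):.3e}  relative={float16_ulp(x) / x:.3e}")
```

That near-constant relative precision is reasonable for both light values and geometry, whereas a log encoding tuned to perception would redistribute precision in a way that means nothing for vertex positions or normals - which fits the speculation above.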
