Google, not content with existing image formats that support 10-bit, like HEIC (used by Apple) and AVIF (based on AV1, a codec Google helped design that's better than JPEG), decided in all their wisdom to make Ultra HDR for Android phones: an incompatible standard built on top of JPEG, and separate again from JPEG XL.
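For context on what "built on top of JPEG" means here: Ultra HDR keeps an ordinary SDR JPEG as the base image and embeds a secondary greyscale "gain map" plus metadata; an HDR-aware viewer multiplies the base pixels by the decoded gain to recover the HDR rendition, while everything else just shows the plain JPEG. Below is a minimal Python sketch of that reconstruction step, assuming linear-light inputs; the function name and default boost values are illustrative, not taken from the spec, and the real format also carries gamma and offset parameters that this omits.

    import numpy as np

    def apply_gain_map(sdr_linear, gain_map, min_boost=1.0, max_boost=4.0, weight=1.0):
        """Recover an HDR rendition from an SDR base image plus a gain map (sketch).

        sdr_linear: (H, W, 3) float array, the base image in linear light, 0..1
        gain_map:   (H, W) float array, normalized 0..1 (stored in the file as a
                    small greyscale JPEG alongside the base image)
        min_boost, max_boost: linear boost range relative to SDR white; these
                    defaults are illustrative -- the real values come from metadata
        weight:     0..1, how much of the gain the target display can apply
        """
        # Interpolate the per-pixel boost between min_boost and max_boost in log
        # space, then multiply the SDR pixels by it.
        log_boost = (1.0 - gain_map) * np.log2(min_boost) + gain_map * np.log2(max_boost)
        return sdr_linear * np.exp2(weight * log_boost)[..., np.newaxis]

The appeal of this scheme is the weight knob: a display with only, say, 2x headroom over SDR white can apply part of the gain and degrade gracefully instead of clipping, and a non-HDR viewer never has to know the gain map exists.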
Now Samsung has released Super HDR, without any information about that standard or if it relates to Google's Ultra HDR. Sigh.
Edit: I forgot about WebP/WebP2, which was also developed by Google as a JPEG replacement.
Considering JPEG XL and Ultra HDR are both based on JPEG, couldn't they be combined into one standard? Wouldn't it be better for everyone if the whole industry could eventually agree on a single standard?
Apple's HEIC is very annoying since it's not really supported by anything non-Apple. Would certainly be nice to see that go away.
I don't understand how Google can be consistently on the back foot of the "tech world hivemind" for going on 7 (?) years now and have zero shakeup, not just of culture but at least of PR.
No one gave a crap about this format on this site until Google decided not to add it to Chrome. No one used it, and no posts about it were upvoted. It only became a thing when it was yet another reason to rant at Google.
Note how no one is asking Mozilla why Firefox won't support it, or actually building websites that use it.
Just reminding everyone that it is now 2024, and it is still impossible to send an HDR still image to a group of people who aren't all in the same ecosystem.
Apple, for example, “supports” the JPEG XL format, but decodes it to sRGB SDR irrespective of the source image's gamut.
As of today, Adobe Lightroom running on an Apple iDevice can edit a RAW camera image in HDR, can export the result in three formats… none of which can be viewed as HDR on the same device.
Windows 11, with all the latest updates, can open basically none of the new formats, and shows garbage half the time when it can open them.
Linux is still stuck in the teletype era and will catch up to $399 Aldi televisions from China any decade now.
I should create an “Are we HDR yet?” page and track this stuff.
These are trillion dollar companies acting like children fighting over a toy.
“No! Use my format! I don’t want to play with your format! It’s yucky!”
I've edited photos using Photoshop and Lightroom with HDR support and was able to immediately view them in Preview on my M1 Mac mini. Of course they looked different than when I viewed them in multiple browsers, or even in the Canary, Dev, beta, and release branches of those browsers, if they supported the image format at all. But I could tell they were displaying correctly, because these weren't professional shots, just deliberate test images: viewed on something that doesn't handle HDR correctly, you just see a bunch of blown-out white, but when it displays correctly you can actually see there's detail in that full-bright section of the image.
I also verified this by transferring the images to my NAS and then pulling them up on my Pixel 5 at the time, and on the Pixel Fold I use now. On both of those you can tell immediately when the HDR transform kicks in: there's a split second as the image shows up on screen where you can tell it's tone mapping it or engaging the HDR display mode or something.
And I know we aren't talking about video, but way back when Doom Eternal came out I recorded a full playthrough using a capture card that lets me capture in H.265 with proper HDR metadata. It was a messy setup, because my only HDR monitor is my TV and my computer's in my living room, so I had to string an HDMI cable from my TV to the capture box input, another HDMI cable to my computer monitor, and the USB-C cable to my computer. I beat the entire game in the AVerMedia preview window, then edited each level in DaVinci Resolve and exported it with all the correct settings after reading the roughly 4,000-page manual to make sure I was doing it just right. The whole time I was editing I wasn't sure it would come out right, because my computer monitor is a 6-bit panel that dithers to 8-bit and isn't an HDR monitor at all.
But in the end, my M1 Mac mini could watch it on YouTube in HDR at 4K. My TCL 4K HDR TV could watch it using the built-in YouTube app. Basically anything I had that I could attach to a screen that would enable HDR mode played that video correctly, including the Pixel 5. I even moved things around in my living room to check my Windows 10/11 machine (I was on Insider Preview around the transition, so it was kind of a hybrid of both), and it also played the video correctly, both natively and on YouTube.
I think things are more compatible than the compatibility listings suggest. If you have a computer with parts as much as seven years old running a modern operating system, and, this is the kicker, a screen with a 10-bit or 12-bit panel that is genuinely rated for 1000 nits, then you have something that can legitimately display the minimum standard for most HDR technical specifications.
If you're trying to look at HDR content and it doesn't seem to be working correctly, you might not actually have a monitor that meets the video or film industry's minimum standard. It's okay to have a cheap TV like mine, with an 8-bit panel that uses dithering to get to 10-bit (mine, for whatever reason, can even go up to 12-bit in Windows). But you have to have that 10-bit minimum, and you have to have Rec. 2020 with the ST 2084 transfer function and DCI-P3/D65 coverage, along with 1000 nits peak brightness.
A gaming monitor that calls itself an HDR display at something like 600 nits isn't meeting a standard; it's a marketing term that company made up so it could say it's HDR, because all laypeople think HDR means is brighter. What's the point of having 1,024 brightness levels per color if your screen's brightness range can't show that full dynamic range?
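As a rough illustration of what those 1,024 code values and 1000 nits actually mean, here's a minimal Python sketch of the SMPTE ST 2084 (PQ) EOTF, the "2084" transfer function mentioned above. The constants are the standard's published values; the helper name and the example code values are just mine.

    import numpy as np

    # SMPTE ST 2084 (PQ) EOTF constants
    M1 = 2610 / 16384          # 0.1593017578125
    M2 = 2523 / 4096 * 128     # 78.84375
    C1 = 3424 / 4096           # 0.8359375
    C2 = 2413 / 4096 * 32     # 18.8515625
    C3 = 2392 / 4096 * 32     # 18.6875

    def pq_code_to_nits(code, bits=10):
        """Map a full-range integer code value to absolute luminance in cd/m^2 (nits)."""
        n = code / (2 ** bits - 1)        # normalize the code value to 0..1
        p = n ** (1.0 / M2)
        lum = (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)
        return 10000.0 * lum              # PQ tops out at 10,000 nits

    # A few 10-bit code values: mid-scale (~512) lands around 90 nits, i.e. SDR-ish,
    # while 1000 nits needs a code value of roughly 769 -- the top quarter of the
    # range only matters on a display that can actually get that bright.
    for code in (0, 512, 769, 1023):
        print(code, round(float(pq_code_to_nits(code)), 1))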
muragekibicho|2 years ago
It's really weird that Google deprecated the format despite contributing engineers to help build JPEG XL. I guess it's office politics.
scorpio241|2 years ago
https://www.fsf.org/blogs/community/googles-decision-to-depr...
scorpio241|2 years ago
Also they have a separate JPEG XL article: https://r2.community.samsung.com/t5/CamCyclopedia/JPEG-XL-Im...
lifthrasiir|2 years ago
> Additionally, storage capacity has been reduced while maintaining image quality by providing JPEG XL format.