Quick summary: HDMI 2.0 no longer "exists", and HDMI 2.1-only features are now optional according to the certifying body. Manufacturers are supposed to indicate which features they support.
Whelp, I guess we should just stick to DisplayPort 1.4
The same is done with the Bluetooth standard. Manufacturers proudly claim their product uses Bluetooth 5.0, reviewers assume that means higher speed and lower power, but the products don't actually have to support all sections of the standard and can pretty much just be Bluetooth Classic with 5.0 slapped on the label.
Seems kind of reasonable. DisplayPort is exactly the same - just because your display supports DisplayPort 1.4 doesn't mean it is required to support 10-bit colour, VRR, etc.
First thing: this isn't the first time Xiaomi has done something like this. Earlier this year they launched a "WiFi 6E" router, and every single press outlet ran their PR without fact-checking it against the data. The router doesn't even support 6GHz, along with a few other optional features. After hours of searching and reading through the spec (apparently no one on the internet gives a damn), I wrote to the Wi-Fi Alliance for clarification, and Xiaomi finally retracted their PR and relabelled the router as WiFi 6 only.
The second baffling thing is how HNers react to USB-C and HDMI so differently. With USB-C / USB 4, more than half of HN expects consumers to know which cables and ports offer which features. It is the consumer's fault for not choosing the right cable or knowing what their port does. I don't see how that is different to HDMI here. All it takes is one bad actor in the market. And it will be a race to the bottom.
That is partly why Apple brought back MagSafe. You don't tell the user to go and find a USB-C 120W (or 240W) capable cable that may or may not be compliant to do quick charging. You tell them to use a MagSafe cable.
Weird, I haven’t seen HNers blaming the user for USB confusion. In fact I am sure I’ve seen articles here getting shared and upvoted which discuss the USB confusion and express sympathy for users while trying to disambiguate the situation.
> The second baffling thing is how HNers react to USB-C and HDMI so differently. With USB-C / USB 4, more than half of HN expects consumers to know which cables and ports offer which features. It is the consumer's fault for not choosing the right cable or knowing what their port does. I don't see how that is different to HDMI here. All it takes is one bad actor in the market. And it will be a race to the bottom.
With USB-C I get something out of the confusion - a really versatile cable standard. With HDMI there is no downside (to the consumer) in just calling HDMI 2.0 what it is.
LTT did some manual testing of HDMI cables [0] in hopes of answering the last question of this article, "how do consumers know if a cable supports v2.1 features?"
Does anyone know of other tests or more comprehensive data sets?
[0] https://linustechtips.com/topic/1387053-i-spent-a-thousand-d...
Has anyone ever seen a device that actually uses Ethernet over HDMI? The thought of being able to plug a single network cable into the back of your display and then anything plugged into that has a wired connection is lovely, but as far as I can tell absolutely nothing actually supports it, despite the ever growing set of internet connected devices sitting underneath people's TVs.
Ethernet over HDMI is used by newer AV receivers to support eARC (enhanced audio return channel). The older ARC spec would work with any HDMI cable, but bandwidth limitations only allowed compressed 5.1 surround sound. eARC uses the higher bandwidth from Ethernet over HDMI, allowing uncompressed 7.1 surround and Dolby Atmos streams.
(If you're not familiar with ARC/eARC, this lets the TV send audio from its native inputs back to the AV receiver over an HDMI cable. Without ARC, you need to plug everything directly into the AV receiver.)
I went down this rabbit hole the other night and found a German Blu-ray receiver, the T+A K8 [0] from 2012, that supports the HDMI Ethernet Channel. I have not found, however, the other piece of equipment, which I can only suspect would be some sort of HDMI IP injector.
> Ethernet switch: distribution of an Ethernet uplink connection to BluRay player, streaming client, TV monitor and up to 3 source devices (via HEC), up to 2 more external devices via LAN cable (e.g. playing console)
from the manual

[0](https://www.homecinemachoice.com/content/ta-k8-blu-ray-recei...)
My understanding is that Ethernet over HDMI is still used by consumer devices, just no longer for the original dream of switching wired internet given the modern ubiquity of WiFi. More recent standards such as ARC [Audio Return Channel; used for a number of surround sound setups] and CEC [Consumer Electronics Control; used for passing remote/controller data between devices] both piggyback on the Ethernet pins, and I believe they entirely interfere with using the Ethernet pins as Ethernet (though maybe only in the available bandwidth/speed?).
I tried to use this once in a theatre to connect a camera watching the stage to a greenroom backstage. It worked sometimes, but was super unreliable. Latency was often several hundred milliseconds, and sometimes the image would just straight up disappear. It may be that we had bad HDMI<->Ethernet devices, but that’s the thing: It’s not a “works or doesn’t” kind of thing, it’s a “varies with the quality of all the devices in the chain” kind of thing.
This is a stellar example of how catering to everyone results in the destruction of a brand. “HDMI 2.1” will be with us for years and it’s essentially irrelevant now, and they aren’t willing to admit they were wrong, so their only hope is to release an HDMI 2.2 that is just “all of the optional features of HDMI 2.1 are now required”, which will cause howls of outrage and confusion. I’m guessing they are too captive to manufacturers to have the courage to do that. Oh well.
It wasn't an oversight to make the features optional. They're deliberately optional so device manufacturers aren't forced into a ridiculous all or nothing situation.
HDMI has always been a shitshow. I spent years of my life playing whack-a-mole for a major set top box manufacturer because of incomplete or braindead implementations. CEC added a whole 'nother layer of shit to go wrong. There is a reason CE mfgs go to "plug fests"(https://www.cta.tech/Events/CTA-861-PlugFest) instead of being able to trust their implementation will work with other devices as long as they follow the rules.
I'm still living this dream today. It's amazing how terrible brand spankin' new "hospitality" (hotel) TVs are when it comes to poor implementations. It is still whack-a-mole and often has to be compensated for on the set-top side because the TV manufacturers are even worse at firmware.
Whoa I had never heard of a plug fest before, the concept seems pretty interesting. Silly of me to assume the mega corps just buy their competitor’s products internally.
When the new MacBook Pro came out this year, everyone was puzzled as to why the newly included HDMI port was only 2.0.
Well it turns out they lied. It’s 2.1 after all! \s
Jokes aside, it’s actually only 2.0 because internally they transmit a DisplayPort video signal and convert it to HDMI using an MCDP2900 chip[0], which is the same chip usually seen inside USB-C hubs.
So the new MacBook basically got rid of the HDMI dongle by integrating it inside the laptop.
This also breaks DDC/CI on that port and now I get a ton of support emails for Lunar (https://lunar.fyi) because people can’t control their monitor brightness/volume and think that the app is broken.
[0] https://www.kinet-ic.com/mcdp2900/

Damn, that sounds like something a no-name 'made in China' brand would do. What is the possible justification for this on a premium device?
HDMI in 2021 is a disaster. I'm using a Sony flagship TV, a PC, and a popular Sony 7.1 receiver.
I had to update my graphics card to get 4k120 444 10bit and eARC.
Except eARC is totally broken - audio often doesn't work at all without restarting the PC/TV/receiver a few times. And then once it "works" it will randomly cut out for 5-10 seconds at a time.
HDR on Windows is also totally broken. It was a nightmare to get something that correctly rendered full 10-bit HDR video (I ended up having to use MPC-HC with madVR and a ton of tweaking). You also have to turn off Windows HDR support and use D3D exclusive mode. After updating my TV to get DRR, the audio for this setup stopped working.
Linux also has zero HDR support. Didn't have luck getting 5.1 or 7.1 working either.
macOS can at least handle HDR on Apple displays - not sure if it works on normal displays. Haven't tried surround sound.
> HDR on Windows is also totally broken. It was a nightmare to get something that correctly rendered full 10-bit HDR video (I ended up having to use MPC-HC with madVR and a ton of tweaking). You also have to turn off Windows HDR support and use D3D exclusive mode. After updating my TV to get DRR, the audio for this setup stopped working.
Odd, I've got 3 different HDR monitors across 2 Windows computers (one Nvidia, one AMD) and also 2 different HDR TVs occasionally connected to each and I've never had to turn HDR off or do tweaking in madvr to get it to look right. Having HDR disabled in Windows globally makes me wonder if the adjustments you're talking about are referring to tonemapping HDR content to an exclusive SDR output.
One thing I will say is TVs default to some god-awful HDR settings. Well, I guess you could say that about picture settings on TVs in general, but it's even worse for HDR. It took me a solid 40 minutes to figure out how to get the picture settings for a Samsung Q900R to display HDR properly instead of "world's darkest picture mode" (turns out the same values take different actual effect depending on the type of source you identify it as, and "PC" is not what you want even though you specify HDR picture mode elsewhere and it detects an HDR signal...). Also for SDR content you'll need to adjust the SDR brightness slider depending on the actual brightness of your display.
Were you able to get 4k120Hz 444 working on Linux? What GPU do you have? I can only do 4k60 444 or 4k120 420 on my LG C1 connected to my Radeon RX 6900xt.
Not to be offensive, but -- first world problems: where did you find a new graphics card, for starters?
Now, on a more serious note: this is all bleeding edge. And combining multiple recent developments together is a recipe for corner cases and untested combinations.
That said, did you try Variable Refresh Rate with that? Blur reduction technologies (backlight strobing) are also interesting, but thankfully they require little software interaction (for now).
My design would remove all ambiguity by having different rules for describing port vs. device capabilities.
1. HDMI ports should be described as "HDMI X.XX Compatible". This indicates that the ports themselves (and the device that contains them) will work when connected to any other HDMI X.XX device, and that the description is of a port.
This is the low bar on 2.1 devices. Their ports will sensibly negotiate optional features with other HDMI 2.1 devices.
2. HDMI devices should be described as "Full HDMI X.XX" or "Limited HDMI X.XX up to [limitations]" to distinguish devices that support all features of X.XX that apply to the device, or have applicable limitations. (Audio devices would not be considered "Limited" by non-applicable features such as image resolution.)
The ""Limited HDMI X.XX up to ..." disclosure would need to be prominently displayed in any description. One place where all limitations can be found.
Limitation phrases (like "up to 4k resolution") would have standard wording supplied by the HDMI standards body.
Additional references to HDMI versions, such as product listing titles, can be shortened to "Limited HDMI X.XX" without any limitations listed. So all limitations listed, or none, to avoid any confusion.
Done! Next problem, Internet? ...
My design would be HDMI 3.Y where Y is a variable-length Base32 encoding of the value of a bitmask representing the presence of the underlying features of an HDMI port/cable/device.
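Roughly, the idea looks like this (a toy Python sketch; the feature list and bit assignment below are made up for illustration, not taken from any HDMI spec):

    import base64

    # Hypothetical feature flags and bit order; purely illustrative, the real
    # spec defines many more features and no official bit assignment exists.
    FEATURES = ["FRL", "VRR", "ALLM", "eARC", "DSC", "QFT", "QMS", "HEC"]

    def version_suffix(supported):
        mask = 0
        for i, name in enumerate(FEATURES):
            if name in supported:
                mask |= 1 << i
        # Variable-length: use only as many bytes as the bitmask needs.
        raw = mask.to_bytes(max(1, (mask.bit_length() + 7) // 8), "big")
        return base64.b32encode(raw).decode().rstrip("=")

    print("HDMI 3." + version_suffix({"VRR", "ALLM", "eARC"}))  # prints "HDMI 3.BY"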
This article doesn't mention the most infuriating aspect of HDMI - it's not an open standard! It's a closed, proprietary standard that requires licensing fees and prevents any open source drivers from existing. This is why the Linux AMD open source drivers don't support HDMI 2.1 - so you can't display 120Hz at 4K on Linux with AMD.
This HDMI bandwidth calculator helped me understand HDMI 1.4/2.0/2.1 far better than anything else. It is worth noting that many resolution/depth/rate configurations easily fit within TMDS limitations, especially if DSC is enabled (which it should be). FRL isn't required unless you want to be certain you can support all HDMI 2.1 situations (mostly over 8K or 240Hz+).
https://www.murideo.com/cody.html
If you want to compare video bandwidth requirements not only for HDMI but also for DisplayPort and some other video transports, you can also use my video timings calculator: https://tomverbeure.github.io/video_timings_calculator.
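For a rough sense of the math involved, here is a back-of-the-envelope Python sketch. The ~8% blanking overhead and the 14.4 / 42.7 Gbps effective link rates are approximations of commonly cited figures, not exact numbers from the spec:

    # Estimate uncompressed 4:4:4 video bandwidth and compare it against the
    # approximate effective link rates of HDMI 2.0 (TMDS) and HDMI 2.1 (FRL).
    TMDS_GBPS = 14.4   # ~18 Gbps raw minus 8b/10b coding overhead
    FRL_GBPS = 42.7    # ~48 Gbps raw minus 16b/18b coding overhead

    def required_gbps(width, height, hz, bits_per_component, blanking=1.08):
        bits_per_pixel = 3 * bits_per_component   # RGB / YCbCr 4:4:4
        return width * height * hz * blanking * bits_per_pixel / 1e9

    for label, mode in [("4K60 8-bit", (3840, 2160, 60, 8)),
                        ("4K120 10-bit", (3840, 2160, 120, 10))]:
        need = required_gbps(*mode)
        print(f"{label}: ~{need:.1f} Gbps, "
              f"fits TMDS: {need <= TMDS_GBPS}, fits FRL: {need <= FRL_GBPS}")

So 4K60 8-bit squeezes into HDMI 2.0, while 4K120 10-bit 4:4:4 needs FRL (or DSC), which matches the upgrade headaches described elsewhere in the thread.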
As someone who works on a lot of standards, I can say there are two common misconceptions:
1. Typically, standards and certification are entirely different.
2. Standards may not follow semantic versioning - e.g. a major version may not indicate a loss of either backward or forward compatibility.
The protocols defined in USB 1 are still allowed in USB4, as are the cables and connectors. HDMI is the same way. Saying something is USB4 or HDMI 2.2 compliant is not a stronger statement than saying they are USB 1.0 or HDMI 1.0 compliant.
Likewise, statements like "USB 3.2 Gen 2x2" are garbage. The correct terminology is "SuperSpeed USB 20Gbps". Why do so many products use the incorrect, more confusing terminology while omitting the official marketing name? Often because they are not certified products.
I remember when USB2 came out and similar mischief ensued. All the hardware manufacturers got together and pushed the standards body to re-brand USB 1.1 hardware as USB 2.0 (full-speed vs. high-speed). It allowed hardware retailers to empty their shelves, while consumers thought they were getting the latest technology.
https://arstechnica.com/uncategorized/2003/10/2927-2/
Same thing exists for USB3. Every time a new version is released, all cables and products suddenly support that revision. They just don't have any new features.
Not to mention that I've _never_ had a cable identify what it is capable of. Thus USB is a shitshow of crappiness.
USB, HDMI, what can we screw up next?

Is it incompetence? Malice? I'd really like to see an in-depth investigation of this phenomenon.
I almost always err on the side of "never attribute to malice that which can be adequately explained by incompetence". However, the "standards" bodies' ability to repeatedly make a complete pig's ear of every single interconnect system makes me assume the opposite.
I wish all cables were equal too, but c'est la vie.
If it becomes too big of a problem, each cable and device will be required to have a challenge-response Obscure Brand Inc. proprietary U9 chip burned with a valid secret key and serial number at the factory that must return a valid response for the link to be enabled.
"Word of advice. Agents are bad, but whatever you do, stay the hell away from Marketing."

- Thomas A. Anderson
When I had a very old Samsung TV, my Nvidia GeForce video card produced a nice image on the TV and Dolby AC3 sound to my even older surround set via a nice HDMI-to-optical converter in between.
Now I have a not-so-old Philips TV and suddenly I can't get Dolby AC3 sound anymore. Why? Because the GeForce communicates with the TV and the TV responds that it only has stereo. The surround set has no HDMI input or output, so it cannot communicate with the GeForce.
I have tried everything from hacking drivers to changing EDID with some weird devices. Nothing works. Stereo only. Very frustrating.
I was recommended to replace my surround set or my TV. Both pretty expensive solutions for some weird HDMI communication bug/standard.
So I bought a $20 USB sound device to get Dolby AC3 sound to my surround set.
All because I replaced my old TV, which couldn't communicate with the GeForce about its speaker setup.
Just the thought that I'll have to learn about this whole HDMI disaster in order to not get burned gives me anxiety.

Also, I never quite liked HDMI when it came out, but from what I'm reading they've really outdone themselves over the years.
The fundamental problem is a lack of supply chain integrity. Customers can buy a million cables or laptop batteries directly from (country that shall not be named), but they have no idea if they're getting fakes or not.
The fix isn't "authorized" suppliers only, but requiring a reputable someone in the supply chain to maintain evidence of continually testing products advertising trademarked standards for compliance. If it's too much work, then boohoo, sad day for them, they don't get to be traded or sold in country X.
In all honesty, flooding a market with cheap, substandard products claiming standards they don't comply with is dumping.
https://en.wikipedia.org/wiki/Dumping_(pricing_policy)
And what was most interesting is that price and quality didn't always correlate at all.
I blame Sony. They pushed expensive HDMI 2.1-compatible TVs and receivers alongside the PS5, strongly implying that without a 2.1-compatible living room you might as well be playing your PS5 through a black-and-white TV.
Of course the PS5 can't really exploit any of the features of HDMI 2.1, and then it turned out that most/all 2.1-compatible receivers and TVs have glaring errors and incompatibilities that render them effectively useless.