Wow, that is a lot of bits. Seriously, that is a LOT of bits. It's almost like they said, "We're sick and tired of people complaining the cable is holding them back; here, beat that, suckers." :-) I guess the margins on 4K televisions have done better than expected; certainly as a 'feature' driving upgrades they have outperformed '3D'.
For me it starts to push up against the question of whether monitor resolution has hit the top of the s-curve. A 9600 x 5400 (10K) screen at 200 PPI is 48" x 27" -- a 55" high-DPI screen. And as glorious and amazing as that would be, I'm wondering whether it would be "mainstream" any time soon.
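Back-of-the-envelope, the geometry works out like this (a minimal Python sketch; the resolution and 200 PPI figures are the ones above):

```python
import math

def screen_dimensions(h_px, v_px, ppi):
    """Physical width, height and diagonal (in inches) of a panel with
    h_px x v_px pixels at a given pixel density (PPI)."""
    width = h_px / ppi
    height = v_px / ppi
    return width, height, math.hypot(width, height)

# The 10K example above: 9600 x 5400 at 200 PPI
w, h, d = screen_dimensions(9600, 5400, 200)
print(f'{w:.0f}" x {h:.0f}", {d:.0f}" diagonal')  # -> 48" x 27", 55" diagonal
```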
I think 4K has ended up being a bit of a whiff in most respects. I just bought a 60" TV for my living room, where we sit about 7 feet from it, which is pretty close (it's the narrow dimension of my rectangular living room), and I can't tell much difference between putatively 4K content and clearly-1080p content (from Blu-ray). It's hard to be sure, because all my putatively-4K content is streaming-based, and there aren't that many sources willing or able to ship enough bits to clearly distinguish between 1080p and 4K.
That is, in a 1080p v. 4K streaming contest, if you dedicated the same larger number of bits to the 1080p content as to the 4K content, you'd get a better 1080p signal, too. With lossy compression, pixels aren't directly comparable; they have a quality, and low-quality 1080p pixels v. low-quality 4K pixels make it hard to tell which differences come from the format and which are just the low quality in general.
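A rough illustration of that "pixel quality" point (the bitrates below are illustrative guesses, not measurements of any particular service or disc):

```python
def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average compressed bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

# Bitrates here are assumptions for the sake of the comparison.
streams = [("streamed 4K", 16, 3840, 2160),
           ("streamed 1080p", 16, 1920, 1080),
           ("Blu-ray 1080p", 35, 1920, 1080)]
for label, mbps, w, h in streams:
    print(f"{label:>14}: {bits_per_pixel(mbps, w, h, 24):.2f} bits/pixel")
# Same bitrate spread over 4x the pixels leaves each 4K pixel far fewer bits.
```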
(In other news, I consider the idea that streaming is obviously superior to discs to be a bit crazy. What discs have is higher-quality pixels, and I expect that to continue to be true for a while. I've seen Blu-ray 1080p streams that are clearly superior to putatively-4K over-the-internet streams. And I've got DVDs that upscale to higher quality than putatively-1080p streams of the same content.)
Far more important is the increased color gamut and HDR support, which does provide a noticeable increase in image quality, even from those aforementioned dubious-quality streaming sources; the higher-gamut HDR images are a visible improvement over the NTSC sunglasses we were all wearing without realizing it.
Suppose your desk is 36" deep. A 48" wide screen at the back of it occupies 41 degrees of your world. The THX recommendation for cinema is 40 degrees. This is as immersive as you're going to get without strapping a screen to your head.
20/20 vision is about 60 pixels per degree, depending on which study you want to believe. 41x60 is 2460 pixels across -- you've already reached the max required pixels with a 3840x2160 display.
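A minimal sketch of that arithmetic (the 60 px/degree acuity figure and the 41-degree angle come from the two comments above; the 60-inch eye-to-screen distance in the last line is just an assumed example):

```python
import math

PIXELS_PER_DEGREE = 60  # rough 20/20-acuity figure from the comment above

def horizontal_fov_deg(screen_width_in, eye_distance_in):
    """Angle a flat, centered screen subtends horizontally at the eye."""
    return 2 * math.degrees(math.atan((screen_width_in / 2) / eye_distance_in))

def pixels_needed(fov_deg):
    """Horizontal pixels required to saturate 20/20 acuity over that angle."""
    return fov_deg * PIXELS_PER_DEGREE

print(pixels_needed(41))                                  # -> 2460, matching the comment
# For other setups (the 60" eye-to-screen distance here is only an assumption):
print(round(pixels_needed(horizontal_fov_deg(48, 60))))   # -> ~2616
```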
Higher resolution and higher refresh rates are still needed for VR. VR requires both a relatively high refresh rate (90 Hz) and pretty high resolutions (current-generation devices are 2160x1200). I believe that uses most of the bandwidth of HDMI 1.3; I'm not sure whether any headsets use HDMI 2.0 yet.
And if you've tried VR, we're still at the point where pixels are obvious and higher densities would be an improvement. The usual comment I get when friends try my Vive is that it looks like an N64.
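A rough sanity check on that bandwidth claim (the panel figures are the ones in the comment above; the 20% blanking allowance is a guess, and the HDMI data-rate figures are approximate):

```python
def uncompressed_gbps(width, height, refresh_hz, bpp=24, blanking_overhead=1.2):
    """Rough uncompressed video bandwidth in Gbit/s; the 20% allowance
    for blanking intervals is an assumption, not a spec figure."""
    return width * height * refresh_hz * bpp * blanking_overhead / 1e9

# First-generation Vive/Rift panel: 2160x1200 at 90 Hz
print(f"{uncompressed_gbps(2160, 1200, 90):.1f} Gbit/s")  # -> ~6.7 Gbit/s
# For reference: HDMI 1.3/1.4 carries roughly 8.2 Gbit/s of video data, HDMI 2.0 ~14.4 Gbit/s
```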
That's tough to justify for home viewing distances and sizes. I sit about six feet away from a 4k TV (at the closest) and feel like I'd have a hard time justifying greater resolution than that.
If I were guessing, this standard would be only for 1) large-format projectors and 2) really large and expensive boardroom/home theater LCDs.
I used to work for a high end home theater company.
The boss's son was a total audiophile and was constantly bragging about all the high-end audio they sold and put in their clients' home theaters.
I used to listen to a set of $15K speakers and then to a set of $8K speakers, and I just couldn't tell the difference. I think it's kind of the same with monitor resolution: at some point, can the human eye really tell the difference between all the high-end resolutions out there right now?
I doubt most people can, TBH, but I'd love some evidence that we're not already beyond what the human brain and eyes can actually differentiate.
Where this becomes very useful is 3D 360° video. 120 FPS allows two 60 FPS streams (one per eye), while transmitting the entire 360° at 10K allows reasonable resolution for the portion the user is actually looking at.
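Roughly how those numbers work out (taking "10K" as about 10,000 pixels across the full panorama and a ~100-degree headset field of view; both figures are assumptions):

```python
def visible_pixels(panorama_width_px, headset_fov_deg, panorama_angle_deg=360):
    """Horizontal pixels that fall inside the viewer's field of view when a
    full panorama of panorama_width_px spans panorama_angle_deg."""
    return panorama_width_px * headset_fov_deg / panorama_angle_deg

width = 10_000  # "10K" taken as roughly 10,000 px across the full 360° (assumed)
fov = 100       # typical headset horizontal FOV in degrees (assumed)
print(f"{width / 360:.1f} px/deg, {visible_pixels(width, fov):.0f} px in view")
# -> ~27.8 px/deg and ~2778 px across the slice the user actually sees
```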
I'm looking forward to standing as close or as far away as I'd like from some sweet zooming fractals, and it looking awesome in every case. If the screen is the size of a wall, even better :)
"Among other things, the new standard will support the game mode variable refresh rate (GM VRR). Nowadays, AMD already supports FreeSync-over-HDMI using a custom mode and with the HDMI 2.1 everything gets a little easier."
Oh, that'll be nice. Now that almost every interface is using 3D rendering tech under the hood, regardless of how it looks, I've noticed tearing has spread from something only gamers see very much of to something I'm seeing everywhere. Hopefully this helps drive the tech to push this everywhere and tearing will, in perhaps 10 years or so, be a thing of the past.
A somewhat related question -- why is the industry able to agree on a single standard for how data will move over a wire from a device to a display, yet can't come up with a single standard for doing it without a wire?
I can take my Mac laptop or my PC laptop and roll up to my TV and with a single cable make pictures and sound come out of the TV.
Yet neither one can talk to the TV without a cable, unless I get a special device (which is different for each and, ironically, uses the same cable).
And even worse, if I had a Samsung phone, it could connect to the TV wirelessly.
Why can't we have a single wireless display standard? Is there a technology problem I'm missing, or is it really just walled gardens?
Miracast is such a standard. Thousands of devices support it. However, two major players (Apple and Google) don't. Apple has traditionally been a closed system, so that is not a big surprise. What surprises me is that Google also removed support for Miracast from Android and started promoting their proprietary casting protocol.
Because doing it without a wire was, until relatively recently (about 5 years ago), something only iPhones did well, or at all.
Consumers have been used to cables for decades, so they demand something that just works in this area. The same goes for data: Apple tried Thunderbolt and it failed, because consumers already had something that just works and is compatible in this area (USB), and they wouldn't let go of it. Wifi streaming is relatively new compared to cables, but once we get used to it Just Working, there will be no going back for consumers.
It's almost the same with Wifi voice calls, where, unfortunately, consumers don't demand something that just works, but are left with two incompatible protocols (FaceTime and Hangouts, or whatever Google's version is called these days).
Does this mean TB3 (with "only" 40Gbps bandwidth) won't be able to support all the features/resolutions of HDMI 2.1? This is of course assuming that the HDMI 2.1 spec is designed to fully utilize the hardware level bandwidth.
Also I wonder what exactly is making the 48G cable special. Is it using additional pins while maintaining backwards compatibility?
USB 3.1 Gen 2 over USB-C can do DisplayPort 1.2 as an alt mode in some cases (if everything supports it). That's kind of the solution, not HDMI. Fun times we live in.
I wonder which causes more pain for consumers: new cables with new connectors (USB-C) or new cables with backward compatible connectors (HDMI 2.1).
While the former definitely makes consumers unhappy in the short term, it seems like in the long term the latter's confusion around what cable you have vs what cable you need is worse.
Can somebody please introduce the HDMI Forum to semver. While I'm looking forward to what this will enable, it's a little overzealous for a point release.
Agreed. As a consumer, it's a bit odd that HDMI 1.x and 2.0 used the same cables, but 2.1 requires a new cable. That seems like a good reason to bump the major version number.
I've always thought 2.0 was a relatively disappointing upgrade over the previous standard. It felt like it should've had at least twice the bandwidth it ended up having. This is looking much better.
Also, considering how disappointing 2.0 was, yet it still got the "2.0" name, this should be at least a 3.0, especially with a new cable coming. The only reason I see them using "2.1" is because they don't want TV and laptop makers to yell at them for making their devices look awfully obsolete just a few years later. But it's probably going to confuse users and it's going to hurt adoption of 2.1-enabled devices.
As for the new cable business, is there a way for the devices to detect the version of the cable (maybe resistor strapping or something)?
The reason I ask is because HDMI cables are passive, so any cable with all of the pins populated should be able to work, provided it's able to satisfy the bandwidth requirements. In other words, I'm wondering if these new HDMI 2.1 cables are just regular HDMI cables with better electrical characteristics. If so, then I bet older HDMI cables will work as long as they’re kept short.
I also wonder how this variable refresh rate thing relates to the recent AMD FreeSync over HDMI thing...
I imagine it is like PCI Express, where they link up at the lowest bandwidth possible (x1 in the case of PCIe) and then keep attempting higher bandwidth modes (x4, x8, x16) until the transceivers can no longer link, and then step down one mode.
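Something like this, as a toy sketch (this models the PCIe-style behaviour speculated above, not the actual HDMI 2.1 link-training procedure; the rates and cable limit are hypothetical):

```python
def negotiate_link(lane_rates_gbps, link_trains_ok):
    """Toy model of stepped rate negotiation: try each rate from slowest to
    fastest and keep the last one that trains successfully."""
    best = None
    for rate in sorted(lane_rates_gbps):
        if link_trains_ok(rate):
            best = rate
        else:
            break  # a cable that fails here will fail the faster rates too
    return best

# Hypothetical per-lane rates and a cable that can't sustain more than ~8 Gbit/s per lane
print(negotiate_link([3, 6, 12], lambda rate: rate <= 8))  # -> 6
```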
It's not written in the article how the cables are different. If the new ones use more differential lanes (like USB 3 does over USB 2), that could of course be detected by the transceivers. If it's still the same number of lanes but with tighter electrical requirements, then the old cables could still work.
The cables aren't versioned. They're just rated in categories for attenuation, exactly like copper phone wiring is, and named for the connector type, just like USB-A, USB-C, micro-USB, etc. are. You don't have HDMI 1.3 and HDMI 1.4 cables, though; cables bought years before an HDMI version is announced can still work with any of that new version's features that fit within the bandwidth the cable can carry. The version is for the HDMI spec which the receiver must support. HDMI isn't just a cable; it's also the hardware encoding and decoding signals, and the different versions define new and different ways of encoding and decoding digital media.
As for 'headphone jack 1.4' and 'RCA jack 7.6' -- back in the day we had analogue audio and video signals and invented new types of jack, connector, and cable to carry them. You had RCA jacks, and 3.5mm jacks, and 1/4" audio jacks, and component video jacks, and S-video jacks, and SCART jacks, and coax jacks. Now we have digital audio and video and consistently use HDMI to carry it; instead of inventing new connectors we update the HDMI spec and say "Systems meeting the 1.3 spec can only transmit 1080p over their HDMI cables, while systems meeting the 1.4 spec can transmit 4K over those same cables." Isn't that better, and simpler? Instead of 20 types of cable with different features, we just use one type of cable and add features to it over time.
So they are announcing variable data rate streams and Display Stream Compression support. I thought until now HDMI had a fixed data rate except for the control channel, unlike DisplayPort which is fully packetized. Can someone more knowledgeable chime in on whether HDMI will now be packetized too, or how they support variable data rate modes?
How does this relate to Apple's advanced USB connectors on the new MBPs? I thought they were supposed to replace all other types of connectors for video. Will we have HDMI for TVs and USB for monitors?
They're not "Apple’s advanced USB connectors" - they're the USB-C standard. They support (per the standard) what are called alternate modes, HDMI being one of them. HDMI 1.4b is currently supported over HDMI alt mode, which is unsurprising given that USB-C has been in the wild for a while and HDMI 2.1 is brand new.
Still no daisy chaining, though, unlike with DisplayPort, which uses packetized data. Why isn't DP replacing it faster? Is it too expensive to become ubiquitous?
It's a matter of targeted use case. DP is for PCs, while HDMI is for point-to-point AV. There aren't many non-PC HDMI sources that support multiple displays (mirroring or expanding), and this is by design. Chaining makes no sense in the home theater or building video distribution use cases.
[edit] I too wish USB-C were the single connector, but it is such a confusing standard, with all the Alternate Modes and different cables on the market.
This is going to suck.
http://www.informationweek.com/mobile/mobile-devices/usb-typ...
Oh.... :-/
DisplayPort 1.4: 32 Gbit/s
HDMI 2.1: 38 Gbit/s
Thunderbolt 3: 40 Gbit/s
So, no, not 144Hz.
Since these things tend to be backwards compatible I'd assume 2.1 would still provide power.
I suppose it may not be mandatory.
As such, I don't find it "wow" at all. Maybe more "just in time."
Imagine if back in the day it was like: "ohhh headphone jack 1.3 eh? Well these headphones only work on 1.4 or greater"
Or: "Wow you only have RCA 7.6 you'll have to get a TV with rev 8 to hook up your NES"
It's a friggin' cord with some connectors and some pins; how did we wind up with different versions even being possible?!
You might actually need to use a whole wall, which leads into immersive environments.