Luckily, with mid- and high-end monitors, DisplayPort is the standard. I don't understand why TVs don't come with even a single DisplayPort, though. Maybe they are paid off by the HDMI Forum?
What I always liked about HDMI is that it's basically backwards compatible all the way to DVI, just using passive adapters. I think that's the reason why HDMI dominates on laptops, where you're expected to move around and use projectors, which could then be very old and only have DVI inputs, and it "just works" with a plain old adapter. You could even go the other way round from a DVI output to a modern HDMI projector using the very same adapter, but I think laptops pretty much skipped DVI and went from VGA to HDMI. So in practice it doesn't really matter, especially with DP++ being supported by almost every device, which effectively gives you HDMI output using an almost passive adapter.
When I was working at a place that makes TVs, we were trying to reduce the number of physical buttons to cut costs. I imagine HDMI just works, and everything that has a DisplayPort output probably also has an HDMI output.
It sounds like there aren't weird Hollywood content protections on DisplayPort.
My guess is that it's not a technical or even a user experience issue. It's probably a money issue with the deals tv manufacturers make with the media industry.
i.e. Netflix won't allow their app on a device that can circumvent HDCP.
Someone without contracts with the HDMI Forum can probably release a working driver. However, AMD obviously needs to license the logo for their GPU boxes and their legal team have decided this prevents them releasing any source code based on the private specification regardless of what the code is called.
>Video Out port: DVI-D signal in 640×480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations. The first letter is H, and the last one is I.
Sounds about right. Flipper Devices just released the Video Game Module, which has a very recognisable port that is never referred to by its more common name anywhere: https://blog.flipper.net/introducing-video-game-module-power...
Of course, they don't do HDMI 2.1 or anything advanced like that, but I guess the reason for the name not appearing anywhere is the same as you're discussing here.
> proprietary formats and specifications such as ... DisplayPort
DisplayPort is not proprietary; it is a VESA standard.
> Lightning
If you owned any iPhone between 2012 and the present, you had to use Lightning, without choice. And for some inexplicable reason, Americans love iPhones.
> Aren't we able to create and use open standards?
In theory, sure. In practice you'd have to construct a financially sustainable organization that can motivate all interested parties to chip in, certify implementations, and at the same time avoid internal corruption (e.g. C-level compensation so high it becomes unsustainable). I think there are few-to-no precedents for that in the open source space in general, and even fewer when it comes to standards bodies maintaining a standard at that level of complexity.
In most domains proprietary specifications form the backbone of everything. A lot of governments refer to ISO standards, which by default are not open access.
It sounds like you're not aware of the reason. I personally believe it's that there are many entities pushing against open standards, since open standards would make copying easier and DRM harder.
> Aren't we able to create and use open standards?
You can, but you need monitor/graphics chip companies to use it. These are mostly the members/creators of the HDMI standard; they are also probably best placed to create a standard that others will use: https://hdmiforum.org/members/
The computer-related companies should have given the middle finger to the media MAFIAA long ago and told them to STFU if they ever wanted their content displayed on computers with PC-based standards rather than MPEG.
Once you run a libre OS you can dump the contents of the bus, or fake whatever HDMI hardware is out there, to get pristine audio and video frames. Also, current Hollywood movies are very subpar compared to what we had in the 90s, so who cares.
My SO has an Amazon Prime account, and yet they want to show adverts in the middle of media she already paid, in theory, to have displayed without ads. So you are paying them twice. Thus, I don't consider BitTorrent piracy when you legally paid for a service but the streamers can break the rules at any time.
Sure.
Try dumping SkyShowTime shows/movies then. (refuses to run on GNU/Linux).
The average power user won't be able to run SkyShowTime on Linux. The idea is locking everyone into Windows, OS X, or Linux with Secure Boot and a verifiable boot chain if they want to watch movies or TV shows.
Ars Technica has to be the most careless tech news outlet, and I stopped reading them long ago because of how much they get factually wrong, thanks to woefully inexperienced writers and attention-grabbing.
Their first sentence is "Any Linux user trying to send the highest-resolution images to a display at the fastest frame rate is out of luck for the foreseeable future, at least when it comes to an HDMI connection", but that's plainly not true. Hardware with closed-source drivers, such as the standard Nvidia ones, does support those modes, because those drivers don't have this legal limitation. Then they even end with the possibility of a closed-source AMD driver, and didn't bother asking whether anyone else has already done this.
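The "highest-resolution images at the fastest frame rate" claim can be sanity-checked with quick arithmetic. A minimal sketch of why a mode like 4K@120 with 10-bit color needs HDMI 2.1: the 20% blanking allowance is a rough assumption (real CVT-RB timings vary), and 18/48 Gbit/s are the nominal HDMI 2.0/2.1 link rates, not exact effective throughput:

```python
def data_rate_gbps(width, height, hz, bits_per_channel, blanking=1.2):
    """Approximate uncompressed video data rate in Gbit/s.

    blanking=1.2 is a rough allowance for horizontal/vertical
    blanking intervals; real timings vary by mode.
    """
    bits_per_pixel = 3 * bits_per_channel  # RGB, no chroma subsampling
    return width * height * hz * bits_per_pixel * blanking / 1e9

rate_4k120 = data_rate_gbps(3840, 2160, 120, 10)
print(f"4K@120 10-bit needs roughly {rate_4k120:.1f} Gbit/s")
print("fits HDMI 2.0 (18 Gbit/s nominal)?", rate_4k120 < 18)
print("fits HDMI 2.1 (48 Gbit/s nominal)?", rate_4k120 < 48)
```

The result lands in the mid-30s of Gbit/s: comfortably beyond HDMI 2.0's TMDS link, which is exactly the territory where the open driver is blocked.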
You're right, but on the other hand, I like that they mentioned DisplayPort as a viable alternative that doesn't suffer from the problem at hand. Not too thrilled about the exact phrasing, describing DP as "the likely best option", which is too opinionated for my taste.
But I'd appreciate if other publications did this, so that e.g. every article about Microsoft's subscription model for Office 365 mentioned LibreOffice, every article about Steam users losing access to some of their games mentioned GOG, and so on. Especially when the alternatives are not well-known among one's readers.
On a somewhat related note, it's very surprising how hard it is to get DisplayPort out of a USB-C connection. Although DP is natively there, all the cheap adapters tend to be HDMI, even though that requires extra hardware to create the complex HDMI signalling out of the native DP output. So although monitors tend to have DP inputs, and almost everything now has a DP output in the form of USB-C, it's actually hard to connect the two.
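On Linux you can see which flavor of output the kernel actually detects, since USB-C alt mode shows up as a DP connector. A minimal sketch reading the standard sysfs DRM layout (the example names in the docstring are illustrative, not guaranteed):

```python
from pathlib import Path

def drm_connectors():
    """Return {connector_name: status} from /sys/class/drm, e.g.
    {'card0-DP-1': 'connected', 'card0-HDMI-A-1': 'disconnected'}."""
    base = Path("/sys/class/drm")
    if not base.is_dir():  # not Linux, or no GPU visible
        return {}
    result = {}
    for conn in sorted(base.glob("card*-*")):
        status = conn / "status"
        if status.is_file():
            result[conn.name] = status.read_text().strip()
    return result

if __name__ == "__main__":
    conns = drm_connectors()
    if not conns:
        print("no DRM connectors found")
    for name, state in conns.items():
        print(f"{name}: {state}")
```

Running this on a laptop driving a monitor over USB-C typically shows the link as a `DP` connector, which makes the "it's really DisplayPort underneath" point concrete.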
Maybe I'm missing something, but it looks to me like Amazon even makes their own off-brand USB-C to DisplayPort cable, so it doesn't seem like it's that hard: https://www.amazon.com/AmazonBasics-Aluminum-USB-C-DisplayPo... They also have an adapter: https://www.amazon.com/AmazonBasics-Aluminium-DisplayPort-Ad...
I was pleasantly surprised that my new Dell monitor connects with a USB-C cable to my laptop, and then lets me daisy-chain another, older Dell monitor off the first one with a DisplayPort cable. I've got power and two monitors, plus a full USB-C hub, with just one cable into the laptop. So at least Dell has figured it out.
The source code is not a clean-room re-implementation of the HDMI 2.1 spec and as such contains IP owned in one way or another by the HDMI Forum. This would almost certainly preclude them from distributing source code to non-members under any license, according to what I could glean from the article.
> you can't make an open source HDMI 2.1 driver because some people said so
I don't understand. Fuck whatever committee said whatever crap. Open source is open source. Just make the damn driver and give the suits 2 middle fingers.
As a member of the HDMI Forum, AMD would surely break its membership contract by open-sourcing its implementation of a specification restricted to members.
Also, "HDMI" itself is a trademark whose use is only allowed to members (like "Wi-Fi"), so even if a non-member did an open-source HDMI implementation against the will of the HDMI Forum, they would likely not be allowed to call it "HDMI" (similar to how "WLAN" is used by companies without Wi-Fi Alliance membership).
It was all about SCART, which is probably the only port worse than USB for having to rotate the connector several times before getting the correct orientation to insert it
I don't miss all the variants of DVI. DVI-D, DVI-I, DVI-A, single-link, dual-link. But people would just see the cable and think, "ah, I know this, it's a DVI cable"
The connector (resp. port) was too chunky for laptops and you didn't get sound over DVI, but other than that it worked just fine each and every time. Never had issues upgrading stuff from VGA to DVI in an industrial environment back then.
Few people connect a PC to their TV, and assuming they do, PCs all have HDMI anyway. As for everything else most people would connect to a TV, it all has HDMI and precisely none of it has DisplayPort. Given that, supporting DisplayPort is an unnecessary expenditure on bill of materials and labor for TV manufacturers.
https://hdmiforum.org/about/hdmi-forum-board-directors/
I see people from Apple, Panasonic, Sony, Nvidia, Samsung, etc. Hardware companies. Maybe you have to buy your way into the club.
Almost anti-competitive that the HDMI 2.1 spec people won't allow an open implementation.
That they don't even allow an open implementation should have been a red flag to all of us that HDMI 2.1 has not been subjected to sufficient review.
Have any of you sufficiently reviewed an actual implementation of this spec? Only with black-box testing because it's closed source?
I normally run it at 120 Hz over DP, but it will work fine over HDMI 1.1 at 60 Hz. My (5+ year old) TV runs at 1080p/120Hz just fine too.