I've worked on something similar while in college. HDMI is actually a fairly simple protocol, based on VGA. It still has scanlines and such, but the signals are digital instead of analog. So instead of a RAMDAC, you have an HDMI encoder chip. You typically need to control some I2C interface to put the HDMI encoder chip in the correct mode, but that's about it.
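To give a sense of what "control some I2C interface" looks like in practice, here is a minimal Python sketch. The register addresses, values, and chip address below are hypothetical stand-ins; real encoder register maps are vendor-specific and usually under NDA'd datasheets:

```python
# Hypothetical register map -- real encoder chips have vendor-specific
# registers, so treat these names and values as illustrative only.
REG_SW_RESET   = 0x04  # software reset control
REG_INPUT_FMT  = 0x70  # input color format / bus width
REG_AFE_ENABLE = 0x61  # TMDS driver power

def encoder_init_sequence(rgb444=True):
    """Return the (register, value) writes an init routine might issue
    over I2C to bring a DVI/HDMI encoder out of reset."""
    return [
        (REG_SW_RESET, 0x3C),                       # assert software reset
        (REG_SW_RESET, 0x00),                       # release reset
        (REG_INPUT_FMT, 0x00 if rgb444 else 0x40),  # RGB444 vs YCbCr422 input
        (REG_AFE_ENABLE, 0x10),                     # power up TMDS drivers
    ]

# With real hardware you would replay these via e.g. smbus2:
#   for reg, val in encoder_init_sequence():
#       bus.write_byte_data(0x4C, reg, val)   # 0x4C: assumed chip address
```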
It's worth mentioning that HDMI encoder chips are actually quite difficult to get hold of as a hobbyist; the HDMI association prohibits their sale to anybody who has not signed the relevant NDAs, paid membership fees and whatnot. The project shown here gets around this by using a freely available DVI encoder chip instead, which works fine (the HDMI specification mandates backwards compatibility with DVI) but lacks support for a number of HDMI-only features such as audio and higher resolutions.
The only HDMI encoder I have seen so far with easily accessible (i.e. leaked) documentation is the CAT6613/IT6613 from ITE, which also happens to be available for purchase in single quantities from a number of Chinese retailers. It seems to be used in the OSSC and several FPGA development boards, so it's about as close to being an unofficial standard for open source projects as it could be.
Based on what you said, you didn't actually look at the HDMI protocol, only at the protocol exposed by your HDMI encoder chip. You could have the same kind of protocol on a Thunderbolt 3 encoder chip.
FWIW, yes, HDMI is still pretty simple, but not as simple as you describe it. Even though there are 3 data pairs, it's not one pair R, one pair G, one pair B (and the highest-bandwidth HDMI modes use 4 pairs); it's just one data bus. The data pairs don't carry only color data; they also convey audio and InfoFrames (which include various metadata such as HDR or VRR). And of course there is the matter of DRM: the content will often be encrypted (though negotiation of that encryption happens separately, over I2C).
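To make the "one data bus" point concrete, here is a sketch of the first stage of TMDS, the line code HDMI inherits from DVI. This shows only the transition-minimization step; the DC-balancing stage that produces the final 10-bit symbol is omitted:

```python
def tmds_minimize_transitions(d):
    """First stage of TMDS encoding: turn an 8-bit pixel byte into a
    9-bit transition-minimized word (DC-balancing stage omitted)."""
    bits = [(d >> i) & 1 for i in range(8)]  # LSB first
    ones = sum(bits)
    # Spec rule: XOR-chain the bits, unless the byte has many ones,
    # in which case XNOR chaining produces fewer transitions.
    use_xnor = ones > 4 or (ones == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        if use_xnor:
            q.append(1 - (q[i - 1] ^ bits[i]))
        else:
            q.append(q[i - 1] ^ bits[i])
    q.append(0 if use_xnor else 1)  # bit 8 records which rule was used
    return q  # least-significant bit first
```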
If you only want to drive the monitor in DVI compatibility mode, you don't need very much because the interface is (as you describe) fairly simple and electrically compatible - if you actually want "real" HDMI then it's much more complex.
I imagine a PC with a Graphics Gremlin and either a Snark Barker or Snood Bloober for audio is going to be quite the hipster toy to have one of these days... all these great old retro PC hardware designs!
I have a couple of vintage computers. One is my original first computer. While there is a cheap expansion that would allow me to hook up any VGA screen to it, part of the experience is the screen.
I've been hooked on retro computing for a while, and my enthusiasm has flattened out somewhat. It all depends on what you want to do. If you want to play games or run software, I suggest you take a good deep look at the emulators. I had one task for my XT, and that was pulling the data off it. I did some original programming for that purpose. I spent days loading software from the internet onto it, minutes at a time, which was kind of miraculous. I programmed some graphics demos on it. I upgraded it with an XT-CF, etc. In the end there is no need to keep that machine up and running on a desk somewhere. The best purpose would be aesthetic, because it is really beautiful as a package, but even if it were in mint condition, I wouldn't risk running it for a couple of hours daily for a useless purpose. Although it would be nice to have an 80s-style terminal displaying the current weather, RSS feeds, etc., it's just a bad way to run your historic machine out of its working hours.
My main trouble with the emulators/virtual machines is that late-90s and early-00s Windows/DirectX games are a huge dead spot.
There are a ton of titles from this era that just don't work on current Windows, even with dgVoodoo. I just want to be able to comfortably get rid of this Win98 SE / WinXP dual boot box I have lying around...
> The frequencies and connectors used by CGA and MDA are no longer supported by modern monitors hence it is difficult for older PCs of the 1980s era to have modern displays connected to them without external adapters.
Some CGA ran over composite, and there's plenty of modern small TVs with a composite input. It's perfectly fine to use a TV as a computer monitor. (I do!)
> This analog-to-digital conversion will also lead to an inevitable loss in video quality.
Oh, now we're splitting hairs! This is super-low resolution with a tiny color space. CGA was (at most) 16 discrete colors. In many situations it was 4 colors with 2 palettes to choose from. The CGA port was also digital, so I don't understand where the "loss in video quality" argument comes from.
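For reference, the two canonical CGA mode-4 palettes, using the standard RGBI color values (the exact rendering varied by monitor):

```python
# The two CGA graphics-mode (mode 4) palettes, as RGB triples.
# Brown ("dark yellow") appears in palette 0 because the RGBI brown
# entry has its green component halved by special monitor circuitry.
CGA_PALETTE_0 = {  # green / red / brown
    0: (0x00, 0x00, 0x00),  # background (reprogrammable)
    1: (0x00, 0xAA, 0x00),  # green
    2: (0xAA, 0x00, 0x00),  # red
    3: (0xAA, 0x55, 0x00),  # brown ("dark yellow")
}
CGA_PALETTE_1 = {  # cyan / magenta / white
    0: (0x00, 0x00, 0x00),  # background (reprogrammable)
    1: (0x00, 0xAA, 0xAA),  # cyan
    2: (0xAA, 0x00, 0xAA),  # magenta
    3: (0xAA, 0xAA, 0xAA),  # light gray ("white")
}
```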
IMO: I don't "get" this. You're no longer running "vintage" hardware; yet a lot of vintage hardware has limited lifespan and may become unrepairable if/when there's degradation inside the chips themselves.
If someone is going to go through all this trouble, it makes a lot more sense to emulate the whole computer.
Count me in as someone who likes vintage hardware but modern displays. In my case it's old game consoles.
The drop in video quality with composite is real. It has less to do with the resolution and more with the fact that hardware upscaling it to an HD or 4K panel has to make an educated guess about where pixels start and end, and gets it wrong.
In practice it looks quite ugly, and switching to something with crisp pixels is usually well worth it.
For old game consoles it's often enough to switch to RGB or Component and you don't have to go full digital. Composite (and RF) are quite bad.
This is not an audiophile type of distinction, it's very visible and obvious to almost anyone.
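A tiny sketch of why that "educated guess" matters: nearest-neighbour resampling of a scanline is clean at an integer factor, but at a non-integer factor (or with the sampling phase guessed wrong) pixels come out unevenly wide:

```python
def upscale_nearest(row, out_width):
    """Nearest-neighbour resample of one scanline of pixel values."""
    in_w = len(row)
    return [row[i * in_w // out_width] for i in range(out_width)]

# A 1-pixel-wide alternating pattern, 8 source pixels wide:
row = [0, 1] * 4
print(upscale_nearest(row, 16))  # integer 2x: every pixel cleanly doubled
print(upscale_nearest(row, 12))  # 1.5x: some pixels doubled, some not
```

At 2x the pattern stays regular (`0, 0, 1, 1, ...`); at 1.5x some pixels occupy two output columns and others only one, which is exactly the shimmer and unevenness a scaler produces when it misjudges pixel boundaries.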
FWIW, the lifetime of a CRT display is generally less than the lifetime of the rest of the PC. I don't think it's much trouble to install a card on a PC that didn't happen to have composite out, or to want something better than composite on a modern LCD display. You don't have to get it. It's not for you. That's OK.
So, this is probably a super NIMBY moment, but the growing excitement around "retrocomputing" (really just old machines) means a lot of hardware is becoming super expensive. Good luck finding a working IDE hard drive under 500 MB: people are pulling them out of working old machines, putting them on eBay for hundreds or even thousands of USD, then shipping them in afterthought packaging that has them fall apart on the way. This hobby is fast becoming a rich man's game for speculators and hucksters.
It's a tragedy, really. I feel like efforts like the OP's are great because they pull pressure off the genuinely limited stock, making the speculators go elsewhere (mostly Sega Saturn games and the like).
Are there any recommended made-to-order PCB services if you just want a card? Googling around turns up a lot of options all over the world, but maybe some HNers have experience with some of them.
The retrocomputing community has struggled with making display controllers because it's not so practical to build one out of 74xx/54xx parts (which have just a few gates per chip). I once saw an ad, circa 1978 in Byte Magazine, for a display board about as big as an IBM PC expansion card where both sides of the board were packed with chips. Something like that costs about the same today as it did in 1978.
Home computers/game consoles of the time mainly had ASIC display controllers, but projects like https://www.commanderx16.com/ don't really have the volume to justify making an ASIC, so they wind up using FPGAs (like this card) or microcontrollers as display controllers. Note that the super low-end https://en.wikipedia.org/wiki/ZX80 did not have a video ASIC but instead tricked the microprocessor into functioning as a video controller, which meant it could only show video when it was done thinking; see https://www.tinaja.com/ebooks/cvcb1.pdf. That technique can still be used today to turn a (secondary) microprocessor into a display controller.
Related development for those interested in using old hardware with new displays: there is an ISA HDMI add-on card in development that will add an HDMI port to your existing old display card: https://www.vogons.org/viewtopic.php?t=92512
I’d love to see something like this for late 90s through mid 00s computers, because many of those machines can still find modern uses but are awkward to use with modern high-resolution displays, even if they can sometimes be adapted digitally (DVI-equipped machines). Of course you can always grab an old monitor to pair with said machine, but that comes with heavy picture quality concessions in the case of LCDs, and good CRT monitors are becoming rare and expensive.
So for example it’d be super cool to be able to drop a new GPU into a PowerMac G4 tower and allow it to drive a modern 2560x1440 display under both OS 9 and OS X.
Just be aware that old machines can bog way down when you start asking them to push millions of pixels. Their old graphics subsystems were never designed to handle that much data; bus widths are a major bottleneck.
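As rough arithmetic (assumed round figures for an AGP 4x machine like a PowerMac G4), merely refreshing a 2560x1440 32-bit framebuffer at 60 Hz eats most of the bus's theoretical peak before any drawing happens:

```python
# Back-of-envelope: bandwidth needed just to refresh a framebuffer,
# versus the theoretical peak of an AGP 4x bus (assumed round figures;
# attainable real-world throughput is lower still).
width, height, bytes_per_px, hz = 2560, 1440, 4, 60
refresh_bytes = width * height * bytes_per_px * hz  # bytes/second
agp4x_peak = 1066 * 10**6                           # ~1066 MB/s theoretical

print(refresh_bytes)               # 884736000, i.e. ~885 MB/s
print(refresh_bytes / agp4x_peak)  # ~0.83 of the bus, before any drawing
```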
Next in line is BPF, which is also unclear.
https://jlcpcb.com/ https://oshpark.com/
I understand what it means in an RGB context, but it is the first time I see someone mentioning dark yellow as a color.
Sorry if I'm asking a duplicate question, but have you considered submitting this to Hackaday?
https://hackaday.com/2023/09/10/upgraded-graphics-gremlin-ad...