The picture on ATSC 3 is high-quality 1080i/720p with an aggressive surround soundtrack which sounds great for sports events on my home theater upstairs. I wind up watching the ATSC 1 channels more because: ATSC 1 signals have subchannels like Comet (I am a sucker for Quantum Leap, Baby Godzilla, Time Bandits, ...), the new audio codec in ATSC 3 doesn't work on every device, and the ATSC 3 signal is a little less robust than most of the ATSC 1 signals.
I think Sinclair Broadcast Group (the owner of the transmitter) is more interested in embedding sports betting into sports events as an interactive application than they are in 4K. I think 4K will wait until they start turning off ATSC 1 transmitters and turning on more ATSC 3 transmitters, but that will take a long time.
I looked into this recently when I moved houses and was without cable for a few weeks. I thought about purchasing an ATSC 3 box to try it out instead of just using an antenna with my TV's internal ATSC 1 decoder.
I will never adopt ATSC 3 because they have built targeted advertising and viewer tracking into the spec. See for example https://rampedup.us/atsc-3-0-marketing-opportunity/. It's really too bad, but to me this is the final nail in the coffin for OTA programming.
I like that OTA is still even a thing, but I cannot help but think that this is going to fail: who the hell is even using OTA, let alone tech-savvy enough to be UPGRADING their boxes for 4K?
I'm still using one of the last plasma TVs ever made (bought it used for cheap), which does have a built-in tuner but is obviously 1080p only. At my last apartment I was line of sight and only a few miles from where everything is transmitted from, so I had great reception. Now it is the opposite, and I would need to run a cable up the side of my apartment building to get an antenna working.
Is ATSC 3 really not backwards compatible? It seems like it would be crazy to upgrade to a system that then will not work on older TVs, considering that most new TVs sold these days do not have a tuner built in at all!
> You either need to get a new TV set or an ATSC 3 decoder to tune it in. [...] until they start turning off ATSC 1 transmitters
How did things get this bad? In the 90s and early 00s, you could watch virtually any OTA television channel with a television made half a lifetime earlier in the 1950s. New technology like color television was introduced, but remained compatible with old TVs.
In the 50s through 90s virtually nobody gave a damn about e-waste. Today everybody pays lip service to the idea, yet we're obsoleting recently purchased technology faster than ever before, and for what? So you can see individual pores on some ball-kicker's nose?
I wish they had made audio more robust in ATSC, even if it would mean making video less robust. For probably most programs a video dropout without an audio dropout is much less disruptive than an audio dropout without a video dropout.
> MPEG is not free. A license is required for using MPEG.
For MPEG-1 this is not true anymore. All patents have expired. It's also far less computationally expensive and easier to implement/integrate than more modern codecs. This makes MPEG-1 a "good enough" solution for (e.g.) short video clips in a game.
Disclaimer: I wrote pl_mpeg[1], so my opinion is biased.

[1] https://github.com/phoboslab/pl_mpeg
Fun fact: The PlayStation 2 (and maybe PS1?) had an "MDEC" coprocessor that was basically a JPEG decoder chip. So, in one game we used low-res MJPEG streams for the characters' pop-up, talking-head conversational clips. They were small enough to keep in the 32MB of main RAM for zero latency, and free to decode any frame on the fly as needed.
>But even MPEG-1 supports a maximum resolution of 4095×4095, and a maximum bandwidth of 100 mbit. So it is technically possible to encode 4k (3840×2160) content even in MPEG-1, at decent quality.
The reality is that despite MPEG-1 and MPEG-2 supporting those resolutions, none of the mainstream implementations support them (although you could always use a software implementation, as MPEG-2 is now patent-free). Hence when you go 4K, only newer MPEG codecs are used.
The article is also missing MPEG-5 with EVC. And not only does MPEG-4 contain different codecs; AVC itself introduces many different profiles whose compression efficiency differs greatly.
True. I work with media server playback for live events etc. Back in the day, we only used an MPEG-2 constant-bitrate codec because it allowed us to sync playback across multiple systems. The first 4K videos we could play were just a slightly hacked version of MPEG-2. The software I used adopted the extension .mxl and claimed to have sort of invented a new thing, but truthfully it was just an MPEG-2 encoder/decoder that ignored that 4095 rule.
Now we use the HAP family (DXT/S3-based) or NotchLC (a specialized codec for media servers that is fairly encumbered with patents).
MPEG-4 Part 2 and H.263 also support multiple profiles. MPEG-4 Part 2 is based on H.263, but with basically all the features turned on and its own additional features available.
It's annoying that the author doesn't seem to understand that multiple standards bodies exist. The H.26x codecs are specified by the ITU. MPEG is a totally different organization. The MPEG-x standards are specific video encoding and delivery standards. The ITU also has delivery specifications, but those are mostly about telecommunications delivery.
Video deinterlacing is one of those topics that I always felt would be a good way to teach myself deep learning techniques. The idea that you have a previous frame and a next frame with information to help you fill in the gaps of the current frame is really intuitive and easy to visualize. I'm sure this has been solved really well by now, but it seems like a good problem to learn with.
I wouldn't say it has been solved all that well. The problem is the temporal difference between the two fields, so moving objects have a "comb" distortion while more static parts of the image mostly don't.
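As a toy version of the gap-filling problem described above (a sketch only: real deinterlacers combine spatial interpolation with motion-compensated temporal prediction, and the function name here is made up):

```python
import numpy as np

def bob_deinterlace(even_field: np.ndarray) -> np.ndarray:
    """Rebuild a full frame from one field by linearly interpolating
    the missing scanlines ("bob" deinterlacing, the simplest baseline).

    even_field has shape (H//2, W) and holds scanlines 0, 2, 4, ...
    """
    h2, w = even_field.shape
    frame = np.empty((h2 * 2, w), dtype=np.float64)
    frame[0::2] = even_field                                  # known even lines
    frame[1:-1:2] = (even_field[:-1] + even_field[1:]) / 2.0  # interpolate odd lines
    frame[-1] = even_field[-1]                                # repeat the last known line
    return frame
```

A learned deinterlacer would replace the fixed vertical average with a model that predicts the missing lines from the surrounding field *and* the previous/next fields in time, which is what makes it a nice self-contained learning exercise.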
Interlacing should never have been a part of a digital-TV standard, but broadcasters pushed a bunch of FUD and lies during the progressive-vs.-interlaced debate. They whined that going progressive "would break their pipeline," when that's obviously BS because they'd been playing film-originated material for decades. Acquisition should always have taken place in a progressive format, even if transmission continued to be interlaced for a while.
Now we're stuck with a bunch of historical footage that looks like shit.
And then there's the asinine perpetuation of non-integer frame rates; those too should have been abolished with digital TV.
That's not necessarily only deinterlacing: lots of modern codecs (pretty much all of them) compress based on how much change there is frame to frame. But almost nothing is interlaced these days. Although of course, interlacing itself was a simple form of what you are describing above, in a way.
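The frame-to-frame idea above can be made concrete with a toy measure (illustrative only; real codecs use motion-compensated prediction plus a transform and entropy coding, not a raw difference):

```python
import numpy as np

def residual_energy(prev_frame: np.ndarray, cur_frame: np.ndarray) -> float:
    """Sum of squared differences between consecutive frames: a crude
    proxy for how much information an inter-coded (P) frame must carry."""
    diff = cur_frame.astype(np.float64) - prev_frame.astype(np.float64)
    return float(np.sum(diff * diff))

# A static scene has near-zero residual, so its P-frames are nearly free;
# a scene cut has a huge residual, which is when encoders insert I-frames.
static_scene = residual_energy(np.full((8, 8), 128), np.full((8, 8), 128))
cut_to_black = residual_energy(np.full((8, 8), 128), np.zeros((8, 8)))
```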
> If we want to get a bit retro here, convincing video playback on a consumer PC more or less started when hardware cards such as the Video Blaster arrived on the market.
A bit odd to snub QuickTime here, which was released in 1991.
This article was amazing! I struggled through learning to use ffmpeg as a C++ library, and then I found out how restrictive their licensing was and about MPEG patent laws, which was a huge bummer. I'm building a video editor for mathematical animations; it's open source, and I wanted to distribute a single executable as a paid option if you don't want to compile it yourself. Unfortunately this is borderline impossible with ffmpeg's restrictive licensing.
It looks like AV1 will be a great replacement, and seems like it will be widely supported in the future. The BSD-3 license also seems way more permissive, which is a huge plus. Does anyone have any experience using it as a library? I'll definitely be experimenting with it, but I wanted to see if anyone else had a good experience with it as opposed to ffmpeg or something :)
Edit: I just wanted to add that my mis-capitalization of ffmpeg is completely intentional ;)
It does not have as many tunable options as ffmpeg, but it served me better for multimonitor screen capture (~7k by 3k total resolution), precisely because it flings DXGI surfaces directly, without the need to copy pictures from GPU to RAM and back for encoding.
1. As GL1zdA already pointed out in the blog comments, the 1992 Creative Video Blaster CT6000 was a single-frame screen grabber, not a video encoding/decoding accelerator. The YT video "Creative Videoblaster CT6000 in action" https://www.youtube.com/watch?v=5LrwSGRflPE shows what it does. http://web.archive.org/web/20000829095056/http://www.ctlsg.c... says it "converts a standard full-motion video image (composite video source) into a format for display on a computer graphics monitor", emphasis on the _image_. "It can position and scale the video image onto the output display and allows the video image to be merged with computer graphics for interactive multimedia application." means it overlays analog video on a VGA monitor.
The first true video decoding accelerator on the PC market came out of the David Sarnoff Research Center, owned by RCA, later purchased by GE, with the video tech finally carved out and sold to Intel. Called Digital Video Interactive https://en.wikipedia.org/wiki/Digital_Video_Interactive, it did in fact do "real time audio and video decompression, 30 frames per second with a resolution of 320x240". Intel repackaged it into one chip, the i750, and called it Indeo. Creative released a product based on it in mid-1994, the Video Blaster RT300. The PCI bus was already mainstream in 1994, as were Pentiums. You most likely blurred those two together into one product.
2. "so the quality could be very high. Streams of 8 to 12 mbit of data for a single channel were no exception."
The SD video DVD bitrate is up to ~10 Mbit, and here we have OTA stations pushing 720p/1080i at the same or lower bandwidth using the same codec (US MPEG-2). The whole multiplexer in ATSC offers ~20 Mbit, and then TV stations cram 3-4 signals in there. Gough Lui's "DRW2022 – Pt5: Digital TV (DVB-T), Digital Radio (DAB+) & FM Radio Analysis" https://goughlui.com/2022/06/05/drw2022-pt5-digital-tv-dvb-t... blog post offers a good overview of the OTA TV broadcasting shitshow, as seen from an Australian perspective. 3-4 Mbit is the average. There is nothing "very high quality" about OTA HD TV broadcast.
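The arithmetic behind that claim is easy to check. The 19.39 Mbit/s figure is the standard ATSC 1.0 payload rate; the subchannel counts are the assumption, and audio plus transport overhead eat into each share further:

```python
# An ATSC 1.0 8-VSB multiplex carries about 19.39 Mbit/s of MPEG-2
# transport stream, shared by every subchannel plus audio and overhead.
ATSC1_PAYLOAD_MBIT = 19.39

for subchannels in (1, 2, 3, 4):
    per_channel = ATSC1_PAYLOAD_MBIT / subchannels
    print(f"{subchannels} subchannel(s): ~{per_channel:.1f} Mbit/s each")
```

Split three or four ways, each HD stream gets well under the 8 to 12 Mbit the article cites, which is the commenter's point.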
"‘gold standard’ of 4k 60 fps video, streaming services is where you’ll find it"
you mean Netflix 4K 24 fps at 15Mbit? :-) That is the reality we are living in now. Apple is an outlier doing 25-40Mbit, everything else is bad.
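One way to compare those numbers across resolutions is bits per pixel (a rough sketch; codec generations differ in efficiency, so the figures are not directly comparable in quality terms):

```python
def bits_per_pixel(mbit_per_s: float, width: int, height: int, fps: float) -> float:
    """Average coded bits spent on each displayed pixel per frame."""
    return mbit_per_s * 1e6 / (width * height * fps)

netflix_4k = bits_per_pixel(15, 3840, 2160, 24)  # ~0.075 bpp
dvd_sd     = bits_per_pixel(8, 720, 480, 30)     # ~0.77 bpp, roughly 10x more
```

Newer codecs need far fewer bits per pixel than MPEG-2, but a ~10x gap in bit budget is why low-bitrate 4K streams can still look worse than a well-encoded disc.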
3. "CRT has some afterglow after the ray has scanned a given area, the even field was still visible somewhat as the odd field was drawn. So the two fields would blend somewhat together"
This is simply not how a TV works: there is no somewhat, there is no afterglow. A CRT goes dark after about 10 lines have been scanned in.
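A toy exponential-decay model makes the "~10 lines" claim concrete. All constants here are assumptions for illustration; real phosphor persistence varies a lot by phosphor type:

```python
import math

LINE_US = 63.5            # one NTSC scanline takes ~63.5 microseconds
PERSIST_10PCT_US = 60.0   # assumed: this phosphor falls to 10% in ~60 us

# Convert "10% remaining after PERSIST_10PCT_US" into an exponential time constant.
TAU_US = PERSIST_10PCT_US / math.log(10)

def relative_brightness(lines_later: float) -> float:
    """Fraction of peak brightness after the beam has moved on N scanlines."""
    return math.exp(-lines_later * LINE_US / TAU_US)

# Under these assumed numbers, the spot is a negligible fraction of its
# peak brightness by the time ~10 more lines have been scanned.
```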
"An LCD screen does not have the same ‘afterglow’ that a CRT has"
An LCD does in fact retain the whole picture until the next update: "full afterglow", as you would put it.
"so there’s no natural ‘deinterlacing’ effect that blends the screens together."
The natural deinterlacing effect happens in the viewer's brain, not on the screen. You got this entirely backwards. The whole reason an interlaced picture doesn't work on an LCD is because the previous frame stays on the screen. I highly recommend "How a TV Works in Slow Motion - The Slow Mo Guys" https://www.youtube.com/watch?v=3BJU2drrtCM
4. "HD, another new standard was required, as DVD was limited to SD."
Sony used HD as an excuse to push a new format with new licensing fees. Nothing other than corporate greed necessitated a new storage medium. H.264 was already on the market, as were other efficient codecs. There were even failed attempts like the Versatile Multilayer Disc, which used a standard DVD with VC-1 coding.
5. 4K "special kind of confusion"
Who? This is the first time I've heard about people associating 4K with new video codecs.
I was huge into following the latest sound cards and video cards in the mid-to-late 90s, owned multiple SoundBlaster cards, and I didn't even remember a product called the Video Blaster. I also checked the archive links, and aside from one model, these products were for encoding, not decoding.

From my recollection, FMV on computers was a novelty reserved for short clips in games and encyclopedias like Encarta. You could go to a computer store and see a demo of a video playing from a CD-ROM on a 486/Windows 3.1/MPEG-certified video card around 1993. But video on the PC wasn't consistently even close to VHS quality until Windows 95 and MacOS System 7, when hardware of that era had decent support. If you had Windows 95 and even the slowest Pentium, you could watch a good-quality MPEG file. Granted, no content was available except for "weezer.mpg", a music video easter egg on the Windows 95 CD-ROM, and you could download short high-quality clips such as clever commercials. For streaming an entire TV show, you could only find grainy RealVideo streams by word of mouth.

A side note is that they did work better in one sense than streaming today: I remember watching Star Wars in realtime with IRC friends. The serving person had a cable modem and at least some of the viewers had 56 kbps modems. These days, my phone plan throttles to 100 kbps when I run out of data, and I can load HN, barely load a Google search, can't load most news sites without timing out, and wouldn't even be able to navigate to any kind of video, let alone stream one.
I'm also still disappointed in the quality of movies on streaming services. 4K often looks worse than 1080p. I live in Japan, and broadcast TV at 1080p looks so, so much better than most streaming at any resolution. The Mandalorian is the exception that comes to mind. It's annoying that anytime I've checked out 4K and 8K TVs in an electronics store, I've seen the same demo videos for the last few years: great-looking but utterly boring content like a man swinging ropes around in a puddle, or paint bouncing around on some kind of speaker cone or drumhead. It really drives home the point that content that actually takes advantage of the (amazing) 4K+, HDR screens is lacking.
On the other hand, I appreciate that the floor of quality is quite high, and as a person who grew up loving "This Old House" as a kid, I can open YouTube and see incredibly high-quality content on any subject, whether broad or deep. And most of it is created, starred in, produced, and edited by one person. That's a totally different discussion though.
PaulHoule | 3 years ago
In Syracuse (about 50 miles away from me) the three network broadcasters collaborate to run one ATSC 3 transmitter, which I can tune in with
https://www.silicondust.com/product/hdhomerun-flex-4k/
brigade | 3 years ago
That said, I don't think there's been much work incorporating new research from the current resurgence of deep learning.
rasz | 3 years ago
QuickTime Road Pizza https://en.wikipedia.org/wiki/Apple_Video was one of the first viable universal video codecs on the desktop. It's not a coincidence that Premiere started out on the Mac https://en.wikipedia.org/wiki/Adobe_Premiere#History
Aardwolf | 3 years ago
The low resolution pixels are visible, but I don't see many blocking / MPEG artefacts. I've seen much worse 240p videos.