top | item 24745564

A 20-year-old CRT monitor can be better than a 4K LCD (2019)

257 points | MrJagil | 5 years ago | vice.com

284 comments

[+] ChuckMcM|5 years ago|reply
Heh, everything is better on Tubes whether it is guitar amps or gaming monitors? :-)

I can pretty much assure you that with a 240Hz refresh rate 4K LCD monitor it will look better than your CRT :-) But it is perfectly valid to say "What is the best experience I can get for $X?" and find that CRT solutions outperform LCD solutions at various price points.

That said, I suspect it is less about the "superiority" of the CRT than it is about the corners cut by the LCD manufacturer in terms of display fidelity. A lot of the early "high res" displays got there by sacrificing video image quality.

That those monitors aren't great for gaming is not surprising. It is also not surprising that, if one of them was the only monitor you had ever gamed on, you would be impressed when you first saw gaming on a better display.

If you consider the amount of RAM and processing power you have to have inside the monitor at 4K resolution, you start to understand why there is a thing like nVidia's G-Sync technology. That is a lot of bits to throw around. Similarly, a monitor that processes the 4K video stream down to 1080p, and has 10 or even 12 bit dynamic range on the pixels with full motion emulation, might give you a better-looking display than a 4K display.
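As a back-of-the-envelope sketch of just how many bits "a lot of bits" is (ignoring blanking intervals and link encoding, which push the real figures higher still):

```python
# Uncompressed video bandwidth at 8 bits per channel, RGB.
# Ignores blanking intervals and link encoding (8b/10b etc.),
# so real link rates are higher.
def bandwidth_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K@240Hz:   {bandwidth_gbps(3840, 2160, 240):.1f} Gbit/s")  # ~47.8
print(f"1080p@60Hz: {bandwidth_gbps(1920, 1080, 60):.1f} Gbit/s")   # ~3.0
```

A factor of sixteen between the two is the kind of gap that makes in-monitor processing (scaling, overdrive, adaptive sync) a real engineering problem at 4K.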

So many ways to optimize for particular markets.

[+] imstate|5 years ago|reply
The CRT monitors do have a faster response time and higher refresh rate (I've seen up to 640x480@480Hz <1ms).

But you would be sacrificing fidelity for this "competitive advantage".

[+] denkmoon|5 years ago|reply
That 240hz panel will have terrible colour reproduction compared to the CRT. Only TN panels are clocked that fast, and TN are ugly.

If you want nicer colours you go for IPS, which has a very slow response time (especially when compared to a CRT).

The only modern display technology that comes close to CRT in terms of colour, contrast and responsiveness is OLED. And that's great (OLED with scanline emulation and black-frame insertion is incredible!) but OLED has the issues everyone knows about regarding degradation of the organic compounds (burn-in).

I want to have my cake and eat it too: great colours and contrast and a stupidly fast response time, with a panel that won't burn in my desktop.

[+] robomartin|5 years ago|reply
> If you consider the amount of RAM and processing power you have to have inside the monitor at 4K resolution

You need nearly zero memory inside an LCD monitor of any resolution. Technically all you need is 256 bytes for EDID and that’s it. You can drive the panel directly from the graphics card from DVI, HDMI and DisplayPort, often with a single format conversion chip.
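For illustration: EDID's base block is just 128 bytes (an extension block brings it to the 256 mentioned above), with a fixed 8-byte header and a trailing checksum. A minimal validity check, sketched from the VESA EDID layout, might look like:

```python
# Minimal EDID base-block check: 128 bytes, a fixed 8-byte header,
# and a final checksum byte chosen so the block sums to 0 mod 256.
# (A sketch; real EDID parsing extracts timings, serial number, etc.)
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    return (len(block) == 128
            and block.startswith(EDID_HEADER)
            and sum(block) % 256 == 0)

# A dummy block: header + zero padding + the checksum that balances it.
payload = EDID_HEADER + bytes(119)
block = payload + bytes([(-sum(payload)) % 256])
print(edid_block_valid(block))  # True
```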

In fact, Apple’s HD Cinema Displays were built using LG panels with a DVI interface built right into them (which isn’t the norm).

Source: I designed custom FPGA-based LCD panel interfaces for many years.

[+] badsectoracula|5 years ago|reply
> I can pretty much assure you that with a 240Hz refresh rate 4K LCD monitor it will look better than your CRT :-)

Since you bring up the refresh rate, I'd assume that you include motion in the "look better" - in which case, I can easily respond with "no, it absolutely does not".

I have a CRT next to me which can do 120Hz at 640x480 (a low resolution, but the CRT is also very small and I use it on a mid-2000s PC for playing some older games, so it doesn't bother me). It is a Samtron, which is basically a poor man's Trinitron, so not even among the best out there (I also have a Trinitron, but that was not among the best out there either... it was one of the cheaper models).

Despite that, motion on this thing at 120Hz can only be described as liquid butter smooth. Just moving the mouse around makes you want to... well, keep moving the mouse around because it feels so good. FPS games feel amazing.

It is so good that I decided to buy a high refresh rate monitor for my main PC. I had avoided that for a long time for two reasons: a) good monitors use high resolutions like 2560x1440, often at huge sizes like 27", and I do not really like high resolutions or huge monitors (huge in terms of viewable area), and b) all flat panels have persistence issues, so chances were they wouldn't be that good.

But you know, having that CRT next to me and using it from time to time really made me want a similar experience on my main PC. So I decided to find something that would be close enough and bought a (rather expensive) 165Hz monitor. I mean, ok, how bad can it be?

I rarely get disappointed with new purchases, and I can't say I was completely disappointed, but I can easily say that if I hadn't spent years using a decent CRT (unlike many who praise modern display tech - assuming they ever experienced a CRT at all, things aren't getting younger) I'd probably be much more enthusiastic.

The thing is, however, I had experienced a CRT, and my brand new expensive monitor is far from being as good as the CRT I bought for barely 15 euros.

It is night and day. Not just something you need to go back and forth to compare - I realized how much worse the new monitor was the moment I moved some windows around and launched a game I had also played on the older PC, and that even though I hadn't used the older PC in a while. All it takes is using both once to realize how much better the CRT is.

And it isn't that the new monitor doesn't feel smoother than 60Hz - it just isn't as good as the old CRT I have next to me. At best it brings back some of the responsiveness I lost when Windows forced a vsync'd compositor on me. But I never used vsync in games, so that sort of in-game responsiveness wasn't something I had lost - I've been disabling vsync for almost two decades now, so any tearing not only doesn't bother me, it barely registers.

Also, aside from motion, CRTs (at least the decent ones) have much better contrast than any tech outside OLED (which isn't available in PC monitor form, at least not at a non-ridiculous size, non-ridiculous resolution and non-ridiculous price).

Sadly it isn't just a matter of older or cheaper monitors. Modern monitor tech simply sucks at most things besides being flat and having high resolutions. It is fine for office work, etc., which I guess is how it became popular, but for gaming it just isn't as good as the better CRTs. Sure, there were many crappy CRTs out there - and I am certain that anyone who complains about CRT flicker used a crappy one - but people who say they prefer CRTs are not referring to the crappy ones, just as I'm certain that people who say things are better nowadays are not referring to crappy TNs with washed-out colors either.

> So many ways to optimize for particular markets.

I'd like an optimization for a high end CRT please :-/. The article mentions $500; I'd actually pay $1500 for a brand new (not old stock, I mean truly new) good CRT like those mentioned in the article.

[+] skybrian|5 years ago|reply
It seems like 4K is a pretty darned wasteful part of the hedonic treadmill. At a more reasonable resolution, this processing power could be used for higher-end graphics algorithms, better AI, longer battery life, and so on.
[+] m463|5 years ago|reply
Yeah but could you play duck hunt?

And in the old old days you could skip the scanning and just go draw what you wanted where you wanted with vector graphics. (as long as there weren't too many "what"s)

There were even tricks like defocusing the beam.

:)

[+] louhike|5 years ago|reply
"I can pretty much assure you that with a 240Hz refresh rate 4K LCD"

It depends what you plug into it. CRTs are considered the best solution for playing unmodded old consoles (up to the PS2/Dreamcast/Xbox/GameCube generation, and even the Wii) or VHS tapes. Games and consoles were made to look good on CRTs, and to take advantage of specific features of those TVs.

[+] BearOso|5 years ago|reply
> On a CRT monitor, the screen is coated in millions of phosphor dots, with one red, green, and blue dot for every individual pixel.

Not true. The number of dot triads is usually greater than the number of pixels. The distance between them is specified as the “dot pitch,” which serves as a physical resolution cap. In aperture grilles they’re continuous RGB vertical lines, with the dot pitch being the horizontal distance between them.

Framebuffer pixels don’t align exactly with the dots, and that’s one reason why CRTs are so blurry. The other reason is that the DAC analog output doesn’t transition discretely between pixels. As the refresh rate and resolution get higher, the DAC has to spend less time on each pixel, so the output becomes blurred horizontally.
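The horizontal blur can be made concrete by estimating how long the DAC gets per pixel (a rough sketch; blanking intervals make the real pixel clock, and thus the squeeze, roughly 20-30% worse):

```python
# Approximate nanoseconds of DAC output time per pixel.
# Ignores blanking intervals, which raise the actual pixel clock.
def pixel_time_ns(width, height, hz):
    return 1e9 / (width * height * hz)

print(f"640x480@60Hz:   {pixel_time_ns(640, 480, 60):.1f} ns/pixel")
print(f"1600x1200@85Hz: {pixel_time_ns(1600, 1200, 85):.1f} ns/pixel")
```

At the higher mode the analog output has only a few nanoseconds to settle on each pixel's level before the next one arrives, which is exactly the horizontal smearing described above.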

[+] nottorp|5 years ago|reply
I have a different problem with LCDs. I still think they're too bright. With a CRT you could set everything to white on black and you had almost no light coming out of the screen. Also brightness meant brightness and you could turn it way down for when working in dim light or darkness.

Of course i'm talking about text work here.

Maybe when OLED monitors become affordable we'll go back to monitors that aren't basically a lamp shining into your eyes all day. At least the OLED on my phone looks like it doesn't put out light where it shouldn't.

For gaming, i'm not latency sensitive, but then i don't play twitch shooters any more. I'm more the kind that buys mid-low range video cards and turns on the fps limiter where it's available.

[+] ashtonkem|5 years ago|reply
I’d recommend adding bias lighting to your monitor. It won’t make the monitor less bright, but it’ll light up the wall behind your monitor so there’s less of a contrast between the wall and your monitor, which reduces eye strain.
[+] bserge|5 years ago|reply
Yeah I have no idea why the lowest levels on monitors are so bright. Are there models that can go really low without the PWM flicker effect kicking in (that's another problem I have with monitors, I can notice the damn cheap PWM backlight)?

Laptop (and phone) displays don't seem to have this problem, the brightness goes really low...

[+] imhoguy|5 years ago|reply
I am still waiting for a decent size e-ink screen (22"+) without backlight which should be enough for low-FPS stuff like backend coding, slack, terminal and text content web pages. Existing solutions are still too small - Dasung and Onyx Boox are 13.3" with HDMI.
[+] userbinator|5 years ago|reply
I suspect that people leaving LCDs at their eye-burning default brightness (looks great in a store demo competing against the lighting and other monitors, but not at all for long-term work) is at least part of the reason for all the excessively low-contrast websites.
[+] srtjstjsj|5 years ago|reply
It's mildly annoying but easily fixed with a touch of ambient lighting or a software tool like f.lux.

Brightness isn't nearly as bad for eye strain as contrast between the screen and ambient light.

[+] tuxracer|5 years ago|reply
Please don't let CRTs come back in style. After a while they tend to develop this headache inducing high pitched tone almost akin to tinnitus that is emitted constantly while they're powered on. It seems to be so high pitched most people cannot hear it at all but if you're one of the lucky few who can it can actually be really disruptive. Unfortunately it's also often loud enough to hear through doors, walls, etc... Please be mindful of this before setting up a CRT if you go down this path. For example a house might be better for this setup vs an apartment or condo.
[+] bserge|5 years ago|reply
My eyes used to hurt a lot on CRTs. A few hours every day in front of one and I could not go outside without sunglasses, anything would make my eyes hurt.

Everything was fine if I went a week without using a computer.

With LCDs, that's not a problem anymore, and I like it.

[+] nickjj|5 years ago|reply
CRTs were really good. They really were.

I remember back in the day the HardOCP forum[0] had massive threads about people buying the FW900 in the mid-2000s. It definitely achieved legendary status. Sadly I never had a chance to use one.

I remember having a 21" NEC that let me play Quake 2 / 3 at 120Hz at 640x480 in the mid-late 1990s, and I think I paid like $120 for it back then from one of those refurbished monitor sites. It also did 1600x1200 at 60Hz for non-gaming.

I still don't know how those refurbished sites stayed in business because they offered free shipping, but a decently sized CRT back then used to weigh like 60 pounds (30 kg).

Back then I remember waiting so many years to get an LCD because the input latency, refresh rates and color accuracy were horrendous for so long despite being 3x the price.

[0]: https://hardforum.com/threads/24-widescreen-crt-fw900-from-e...

[+] sevensor|5 years ago|reply
Many a year ago, in college, I was helping a friend move. He had a giant 21" CRT, which seemed like it weighed eighty pounds. It was the dead of night, in the middle of winter. We were walking on ice, carrying it together. (We both had late-90s computer nerd biceps.) He slipped, and with a great exertion I managed to grab his side and prevent a fall to the pavement. Neither of us had much in the world at the time, and that sweet monitor was one of his most valuable possessions. Needless to say, he was grateful that I'd rescued his 1600x1200 view into the future.
[+] taneq|5 years ago|reply
Gaming LCDs these days boast response times in the low single digit millisecond range. Even at 144Hz that’s less than a frame.

If there’s framebuffer-to-photons lag, I’d point my finger at DVI decoding before the physical movement of liquid crystals.

Also, for all the focus on “physical changes” there’s no mention of phosphor fade rate, which is a physical property of the CRT screen with a built-in trade-off between latency and flicker.

Please let’s not have CRTs become the new Monster Cables.

[+] R0b0t1|5 years ago|reply
The gray-to-gray response time is heavily gamed. There's been some tech-press investigation showing that most monitors which game the GTG response time have slower overall response than monitors with a ~7ms time.
[+] yardie|5 years ago|reply
I used to have one of these (GDM-FW900) at an old job, a long time ago. They were great monitors for the time, top of the line. They weighed almost 300lbs, IIRC. And like all CRTs they can put out some heat.

LCDs at the time still left much to be desired so VFX houses weren't rushing to replace them with inferior, at the time, technology.

I remember running mine at a higher refresh rate. 75Hz and 100Hz were easily reachable. At 120Hz you started to drop resolution and the coils would sing. I miss the wide range of resolutions you could run. LCDs still haven't been able to replicate that without making everything a blurry, boxy mess.

I was able to take the broken smaller GDM-F500 (21") home. It was dim, and after a bit of research I found out that replacing a capacitor and a few resistors would bring it back. But it was terrifying working directly on the HV board. Once it was working I had so much screen that it was practically worth almost getting electrocuted.

[+] dreamofkoholint|5 years ago|reply
Yes! I think the ability to watch lower-res video (even 480p) on them and still have a decent picture is what I miss the most.

It would be quite comical if the 300lb figure were correct, having lugged one around before. They hovered around 92lbs, though.

[+] pritovido|5 years ago|reply
I had one of those; a programmer who worked for me sold it to me.

It had cost him something like $6000-9000 new.

The image was incredible, but it had lots of drawbacks:

1. It was huge and heavy. Most people who complain about "people's obsession with flatness" have never experienced a screen that is as deep as it is wide and tall. Attaching it to the wall was a nightmare.

2. It had problems with magnets. I put a big magnet like 1 meter away and it affected the screen.

3. It emitted X-Rays directly to your eyes. Not good.

In the end replacing it with LCDs was a great decision. Much better for the programming or CAD that I do.

OLED is great, as the article says, if you can somehow recover the cost.

[+] AtlasBarfed|5 years ago|reply
> 3. It emitted X-Rays directly to your eyes. Not good.

What's your source for this? Was it limited to the Sony?

[+] SomeoneFromCA|5 years ago|reply
No display made since at least the late '90s would emit X-rays.
[+] tomc1985|5 years ago|reply
Ugh. Keyboard hipsters are bad enough. Now we're going to have CRT hipsters!

It's like, name some piece of newly vintage tech, find its fans, start a movement, sell folks their shit nostalgia, lather, rinse, repeat

[+] visarga|5 years ago|reply
Being able to choose your desired mechanical response of the keyboard switches is useful for serious work.
[+] peatmoss|5 years ago|reply
I wonder if there’ll eventually be enough demand for new CRTs, and if manufacturing technology has improved enough to make a CRT that is less of an ecological menace. If cost were only partly a constraint, how good could we build a CRT today?

I am still mesmerized by new display technologies / display technologies that I haven’t seen in a long time. Like the vector CRTs that exist in old Asteroids cabinets. The phosphors are really bright—brighter than you can imagine an LCD being.

I also remember the first time I saw e-ink and being shocked to see something so inert looking shift to a new image.

This article really captured my imagination. I’ve not seen a modern GPU driving high frame rates on a CRT, but now I’m very curious to do so. I’d imagine the experience would defy my intuitions the same as it appears to have done for the authors here.

[+] ptx|5 years ago|reply
They took up a huge amount of desk space though, which was quite inconvenient. But perhaps they could be made flatter?
[+] badsectoracula|5 years ago|reply
> I’ve not seen a modern GPU driving high frame rates on a CRT, but now I’m very curious to do so.

You can probably find some decent CRTs on Facebook Marketplace for very cheap (e.g. the other day I found a bunch of them for less than 10 euros each) - they won't be as good as those mentioned in the article (unless you get really lucky), but anything that can do 120Hz or above should be enough to let you see that.

A bigger problem would be connecting them to a modern GPU. Nvidia removed the DAC from their GPUs after the GTX 9xx series, and even that one wasn't great (AMD also removed theirs some time before). So you'll need to find some way to convert the digital signal to an analog VGA signal, with a good DAC at that. I think there is a thread on the HardOCP forums about it, but personally I haven't gone down that rabbit hole (yet :-P).

[+] Yizahi|5 years ago|reply
Next will be hobbyists longing for the resurrection of plasma TVs :)
[+] andrewzah|5 years ago|reply
CRTs are still heavily used in the Smash Bros. Melee gaming community, as the Nintendo GameCube natively outputs 480i.

Using a modern 720p+ display involves upscaling, which causes latency. This latency is wildly inconsistent across tvs, which makes competitive play untenable.

There are monitors with extremely low response times (~1ms); BenQ was the go-to for a while. It used to be that the cost was prohibitive, but now finding CRTs is the prohibitive part... especially finding a good Trinitron.

[+] cyrialize|5 years ago|reply
I have a tiny Sony Trinitron (I think) CRT. The screen is the size of a banana diagonally. I keep it around specifically for practicing SSBM.

Many people nowadays do just use their PC with a good monitor. There are a ton of hacks that you can do to reduce the lag that you get from a monitor: https://www.youtube.com/watch?v=J6B4t5fCEbQ

The video above is from Hax, a prominent smasher who has been part of the SSBM scene for a very long time.

[+] hnick|5 years ago|reply
I was going to mention this. I've heard stories about players scouring the streets on council cleanup day in the hopes of snagging an old CRT.
[+] WalterBright|5 years ago|reply
> Taylor said in an interview that he's willing to pay up to $500.

I wonder what the shipping costs on those heavy monsters are.

Me, I've used CRTs for 25 years. I have zero nostalgia for them. I don't want to ever use one again. There's nothing about them I prefer.

[+] Matthias247|5 years ago|reply
Reminds me of a story from when I was in university, around 2005. In our Usenet group some people found a sale of used FW900s (definitely for less than 500€ each). They then organized a group buy, which drew around 20 interested people.

Then they wondered how they would actually get all of those transported across half the country. I think in the end they rented a truck or trailer to pick them up. It was definitely a bigger feat than getting an LCD shipped from your favorite online store.

[+] sosborn|5 years ago|reply
I feel the same way about CRTs. Things are so much better now (to me anyway). I also feel the same way about loud keyboards. The current fetish for "clackety clack" keyboards cracks me up.
[+] vidanay|5 years ago|reply
There was a really good article shared a few weeks ago about video output on old systems that didn't use a frame buffer; instead, the video data was generated in real time from only a couple of bytes of state. This had the effect of nearly zero lag, because a sprite's location could be calculated right up until the moment its first row was drawn to the screen.

I wish I could remember exactly what that article was.

[+] deergomoo|5 years ago|reply
This was quite a popular technique to achieve fancy video effects on low powered hardware.

A good example is the various parallax and scrolling effects in the intro cinematic to Link’s Awakening on the Game Boy/Color, achieved by changing various hardware scroll registers in the horizontal blanking interval (the time between one line finishing and the next starting). The main limitation is that you can only do horizontal effects, not vertical.
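The trick can be sketched in Python (function names here are purely illustrative - real hardware exposes this as memory-mapped scroll registers, and the details differ per platform):

```python
# "Racing the beam" sketch: rather than composing a finished
# framebuffer, the scroll state is re-read in each horizontal
# blanking interval, so every scanline can shift independently.
WIDTH, HEIGHT = 160, 144  # Game Boy-sized screen

def render_frame(scroll_for_line, pixel_at):
    frame = []
    for line in range(HEIGHT):
        # HBlank: game code may have changed the scroll value
        # since the previous line - this is the parallax hook.
        sx = scroll_for_line(line)
        frame.append([pixel_at((x + sx) % WIDTH, line) for x in range(WIDTH)])
    return frame

# Example: lower horizontal bands scroll further -> layered parallax.
frame = render_frame(lambda line: line // 48,
                     lambda x, y: (x + y) % 4)
```

Because `scroll_for_line` is consulted once per scanline rather than once per frame, each band of the screen can move at its own speed - the essence of the intro's parallax.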

I’m wondering if the article you’re thinking of was the Wired piece on the Atari 2600? That famously ran all of its game logic in the H- and V-blank intervals.

[+] tasty_freeze|5 years ago|reply
Color convergence on large CRT monitors is problematic. Back in the early 90s I spent $1200 out of my own pocket to buy a 20" CRT. At the time I was designing 3D rasterization algorithms and circuits and would spend hundreds of hours tweaking constants (how much quality loss does a 5-bit interpolation fraction introduce vs 6-bit, etc.) and A/B testing to get the most cost-effective, high-quality results.
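That sort of A/B question can be simulated in a few lines (a sketch, not the original circuits): the worst-case error on an 8-bit channel when the blend fraction is quantized to n bits.

```python
# Worst-case error (in 8-bit levels) from quantizing the linear
# interpolation fraction to frac_bits, sampled over many positions.
def lerp_error(frac_bits, samples=1024):
    levels = (1 << frac_bits) - 1
    worst = 0.0
    for i in range(samples):
        t = i / (samples - 1)
        q = round(t * levels) / levels          # quantized fraction
        worst = max(worst, abs(q - t) * 255.0)  # error on a 0..255 ramp
    return worst

print(f"5-bit fraction: ~{lerp_error(5):.2f} levels")
print(f"6-bit fraction: ~{lerp_error(6):.2f} levels")
```

Each extra fraction bit roughly halves the worst-case error, so the decision comes down to whether that last couple of levels of error is visible in the A/B test.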

I bought and returned two and just lived with the defects of the 3rd one when I realized it was impossible to get good convergence across the display. I would confine my A/B tests to a specific area of the screen where convergence was best.

BTW, I forget the exact details, but color convergence at the factory was set with the monitor facing some particular direction, let's say north. If you set up the monitor facing east, the convergence would be off a little bit.

When quality LCDs became available, convergence was a non-issue, but they used dithering to attempt to increase the color gradations, so it didn't really help my kind of work anyway.

[+] nitrogen|5 years ago|reply
In my youth I bought one of those high-end monitors second hand. It had 5 BNC inputs and required a resolution-tweaking utility to change the sync polarity.

I spent hours tweaking convergence on the 9 million trim pots to get it just right. Best color gamut of any display I ever owned. Sadly, something inside the display physically broke when I lost the included plastic screwdriver and tried using a metal one - the internal heads of the trim pots were apparently connected to the circuit.

The display was fixed at exactly 1024x768@75Hz (though you could squeeze in a bit more with modeline tweaking), but the colors were great.

[+] elric|5 years ago|reply
I remember reading something by John Carmack back in the early Oculus days, where he was basically saying he wished he could use small CRTs in VR headsets for many of the same reasons the article highlights.
[+] liquidise|5 years ago|reply
As a former epileptic, the strobing that comes with CRTs is something I’m happy to be rid of. It wasn’t something I could see so much as perceive, but it’s a feeling my body still remembers.

That said, modern monitor tech still has big problems. Quantum dot has a chance of shifting the field in the next 10 years, but right now any given monitor effectively excels at only 2-3 of: color gamut, color accuracy, pixel density, refresh rate, response time. HDR is a whole different topic that the industry hasn’t figured out.

[+] simcop2387|5 years ago|reply
I'm curious if you've ever had the chance to see an OLED monitor in person, and whether it affects you the way a CRT does. They have a global refresh instead of a rasterizing scan, so it might not have the same effect, and they don't have the phosphor fade of a CRT either. I'm also curious about the LCD monitors that strobe the backlight to reduce apparent motion blur (basically they shut the backlight off while each pixel is transitioning between colors in each frame).
[+] bleepblorp|5 years ago|reply
The problem with monitors is much more with the display industry's preference for finding new ways to scam customers instead of commercializing display technology improvements.

The history of LCD monitors and TVs is a history of the industry increasing prices while finding new corners to cut by removing functionality that reviewers and users didn't know they needed to test for.

When users first realized that IPS was vastly superior to TN for most purposes, the industry responded by making low-quality 6-bit IPS panels, which the industry happily described as 'IPS' without disclosing that the panels can't display color gradients--and often flickered badly due to FRC. Early 6-bit panels were worse than TN in everyday use.
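For context on that flicker: FRC fakes the two missing bits by alternating a pixel between adjacent 6-bit levels over successive frames so that they average out to the requested 8-bit value. A rough sketch (real controllers also dither spatially):

```python
# Frame rate control (FRC) sketch: a 6-bit panel approximates an
# 8-bit target by showing the higher adjacent 6-bit level on some
# fraction of frames. The temporal alternation is the flicker source.
def frc_frames(level_8bit, cycle=4):
    lo = level_8bit >> 2          # nearest 6-bit level below
    hi = min(lo + 1, 63)
    frac = level_8bit & 0b11      # how many quarters of 'hi' we need
    return [hi if f < frac else lo for f in range(cycle)]

def perceived_8bit(frames):
    # The eye integrates the alternation back to (roughly) 8 bits.
    return sum(f * 4 for f in frames) / len(frames)

frames = frc_frames(130)
print(frames, perceived_8bit(frames))  # [33, 33, 32, 32] 130.0
```

When the alternation is slow, or a large area of the screen sits on the same in-between level, the averaging stops being invisible - which is the badly-flickering behavior described above.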

6-bit IPS is now essentially unavoidable in smaller, non-4K screens.

Another fun scam the display industry concocted was releasing one production run of good quality 8-bit IPS monitors, to get positive professional reviews and user word of mouth, then making subsequent production runs under the same model number with 6-bit fraud-IPS, PVA, or even TN panels.

The low-frequency PWM backlighting scam (which probably saves all of ten cents per display) is now known well enough that reviewers will test for it, but the industry has developed other ways of preventing users from purchasing good-quality hardware at a fair price.

Dynamic contrast is a particularly insidious problem. This piece of joy from the display industry changes the backlight intensity and RGB pixel values depending on screen contents. As a result, the color tint of the display changes depending on screen contents. These constant color shifts make dynamic contrast monitors unusable for even non-professional design/photo/video work and prevents color calibration entirely. This technology is even applied, with no user option to disable it, to monitors sold as sRGB pre-calibrated and marketed towards mid-range color work. Officially, dynamic contrast is intended to save energy, but the actual intent is likely to force people who want stable colors to buy professional displays at ten times the price.

A new scam that's emerged since the start of the pandemic is color gamut restriction; it's now very difficult to get displays--especially laptop displays--that support more than 50% of the sRGB gamut. Displaying actual red is now something the industry has decided to exclusively paywall into professional panels targeted at the design market.

The problem with displays isn't technology; it's an industry that's built on the premise of ripping off consumers.

[+] kristopolous|5 years ago|reply
Unless you play video games, the space, power and heat aren't worth it. I didn't have much desk space and higher resolutions became really cheap to come by.

So maybe 8 years ago I got rid of my 2 FW900s and now use rotatable 4Ks that I put side by side in portrait mode, essentially getting a ~4Kx4K square (18:16). It's $750 or whatever 2 go for now, well spent. Most modern laptops can even drive two monitors as long as you get the adapters right. It's a really good work setup. Highly recommended.

I'd honestly say it's the most significant thing that's affected how I interact with computers (and I've done foot pedals, gestural systems, custom made input devices I've designed myself, repurposed midi device for UX, my own window manager, etc ... 2 4k monitors in portrait is the top of the list, really)

[+] ddingus|5 years ago|reply
I love CRT displays myself. The standard 50/60Hz is a bit rough, unless one is working with a high-persistence phosphor. Older monochrome amber and green screens often have that phosphor.

I like how simple they are to drive too. Managing a high resolution analog stream takes far fewer resources.

For personal electronics and retro fun, I recommend one.

Glowing phosphor in a glass tube just rocks!

There were variations too. I got to work on one of these in the 80's:

https://m.youtube.com/watch?v=T-F7ZySfgZ0

In a dim room, these are beautiful. Up to 4K resolution in the '70s, no display buffer. Surprisingly techy looking and feeling.

I will miss CRTs when they are no longer available. They should be made in 16:9 for a while longer. People would use them.

All that said, current flat panels continue to improve. They are fine for the majority of things most of us do.

[+] ddingus|5 years ago|reply
Just a late edit:

My favorite thing to watch on CRT displays is well produced SD programs. They look really good.

At the peak of SD, I had tuned a great Sony to near what a PVM can do. Watching DVD movies on that via RGB or component in a dark room was a good experience. Huge dynamic range. The occasional trail from a bright object in the program. Real black.

In many ways, many of us did not experience what was lost in the tech change. This revival makes sense in some ways.

Look at vinyl. It is a similar thing. The overall experience is really good. Gratifying.

Fact is, where there are limits, there is art. A great vinyl production is something I appreciate a lot. A similar one done digitally actually sounds better, but the art is not there.

The CRT is art. We had limits and the CRT bubbles up out of that just nailing it.