top | item 44551773

hinterlands|7 months ago

That slo-mo video is somewhat misleading, though. The phosphor glows for a good while, so there is a reasonable chunk of the image that's visible at any given time.

The problem in that video is that the exact location the beam is hitting is momentarily very bright, so they calibrated the exposure to that and everything else looks really dark.

layer8|7 months ago

The phosphor still drops off very quickly [0][1][2], roughly within a millisecond. That’s why you would need a 1000 Hz LCD/OLED screen with really high brightness (and strobing logic) to approximate CRT motion clarity. On a traditional NTSC/PAL CRT, 1 ms is just under 16 lines, but the latest line is already much brighter than the rest. The slow-motion recording showing roughly one line at a time therefore seems accurate.

[0] https://blurbusters.com/wp-content/uploads/2018/01/crt-phosp...

[1] https://www.researchgate.net/figure/Phosphor-persistence-of-...

[2] https://www.researchgate.net/figure/Stimulus-succession-on-C...
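Quick sanity check on the "just under 16 lines" figure, assuming standard NTSC timing (525 total scanlines at ~29.97 frames/s, i.e. the familiar ~15.734 kHz horizontal rate):

```python
# NTSC line rate: 525 total scanlines per frame at 30000/1001 frames per second
total_lines = 525
frame_rate = 30000 / 1001              # ~29.97 Hz
line_rate = total_lines * frame_rate   # ~15734 lines per second

lines_per_ms = line_rate / 1000
print(f"{lines_per_ms:.2f} lines per millisecond")  # ~15.73
```

So in 1 ms the beam sweeps about 15.7 scanlines, counting blanking lines as well as visible ones.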

hinterlands|7 months ago

I'm not quite sure what you're saying here. My assertion is that a visible image persists on the screen longer than it appears in the slo-mo clip. You can just point a camera with an adjustable shutter speed at a CRT and see it for yourself. Here's an example (might need to copy the URL and open in a new tab, they don't like hotlinking):

https://i.sstatic.net/5K61i.png

The brightly-lit band is the part of the frame scanned by the beam while the shutter was open. The part above is the afterimage, which, while not as bright, is definitely there.

bgnn|7 months ago

I'm not sure about this calculation though. Phosphor decays exponentially with a time constant of roughly 5 ms (according to HP [1]). That means when a new frame arrives at a 60 Hz refresh rate, a few percent (e^(−16.7/5) ≈ 4%) of the previous frame's excitation is still present. That introduces a considerable amount of nonlinearity, so the performance is arguably even worse than that of 10 ms LCD/OLED displays.

Genuine question: why do you think CRTs are better?

[1] https://hpmemoryproject.org/an/pdf/an_115.pdf
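The residual is easy to sketch under that exponential model (5 ms time constant taken from the HP app note, 60 Hz refresh):

```python
import math

tau = 5.0               # phosphor decay time constant in ms (per the HP app note)
frame_time = 1000 / 60  # ~16.67 ms between refreshes at 60 Hz

# Fraction of the previous frame's excitation still glowing when the next frame lands
residual = math.exp(-frame_time / tau)
print(f"{residual:.1%}")  # ~3.6%
```

A shorter or longer time constant (phosphor chemistry varies a lot) shifts this number considerably, since it sits in the exponent.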

perching_aix|7 months ago

> The phosphor still drops off very quickly [0][1][2], roughly within a millisecond.

It's phosphor-chemistry dependent. Even the different color phosphors on the same glass decay at different rates. But yeah, 1 ms is a good lower bound, although when I last researched this, it was definitely the best-case scenario for CRTs. I'm fairly sure the ~500 Hz OLEDs that are already floating around beat the more typical CRTs of old.

> That’s why you would need a 1000 Hz LCD/OLED screen with really high brightness (and strobing logic) to approximate CRT motion clarity.

At 1000 Hz you wouldn't need the strobing anymore (I believe?), that's the whole point of going that fast. We're kinda getting there btw! Hopefully with HDMI 2.2 out, we'll see something cool.

> On a traditional NTSC/PAL CRT, 1 ms is just under 16 lines, but the latest line is already much brighter than the rest.

That doesn't really math for me. NTSC would be 480 visible lines at 60 Hz, and so 480 lines / ~16.6 ms = 28.8 lines/ms (6% of the screen). Note that of course PAL works out to the same number: 576 lines / 20 ms = 28.8 lines/ms (just 5% of the screen here though!).
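The arithmetic, counting only visible lines over the full frame period:

```python
# Visible scanlines drawn per millisecond for each standard
ntsc = 480 / (1000 / 60)  # 480 visible lines over a ~16.67 ms frame
pal = 576 / (1000 / 50)   # 576 visible lines over a 20 ms frame

print(f"NTSC: {ntsc:.1f} lines/ms, PAL: {pal:.1f} lines/ms")  # both 28.8
```

(Using the total 525/625 scanlines over the same periods would instead give the ~15.7 lines/ms the parent quoted, which is where the discrepancy comes from.)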

wincy|7 months ago

I definitely like my new 240 Hz 4K OLED HDR monitor, though. They're getting there! The data rate it pushes through the DisplayPort cable for uncompressed 4K HDR is something like 80 Gb/s. Absolutely mind-boggling. Huge upgrade from my 1440p 165 Hz IPS monitor, which had huge amounts of smearing when playing games.
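Rough back-of-the-envelope for that figure, assuming 10-bit-per-channel HDR and counting only active pixels (blanking intervals and link-layer encoding overhead come on top):

```python
# Raw active-pixel payload for a 4K 240 Hz 10-bit HDR signal
width, height = 3840, 2160
refresh = 240        # Hz
bits_per_pixel = 30  # 10 bits for each of R, G, B

gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gb/s raw")  # ~59.7 Gb/s before blanking and overhead
```

The blanking and encoding overhead push the total link rate up toward DisplayPort 2.1's 80 Gb/s (UHBR20) ceiling.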

f1shy|7 months ago

And still it was possible, as a side-channel attack, to recover a near-perfect image just by looking at the reflected brightness of the screen.