top | item 40555736

foxhill | 1 year ago

> An application with 10 frames of latency will be faster on a 1 kHz display than a perfectly coded application on a 60 Hz display.

that's actually not true. you seem to be implying that the best a 60hz display can manage is 16.6ms of latency. indeed, that's the worst-case value - but you should consider that early graphics technologies involved changing display modes mid-scan.

it’s actually not ridiculous to suggest that old platforms had sub-millisecond latency; they did. if the scanline was on, or just before, the line where you would interact (i.e., the prompt line), the text you enter would appear immediately.

of course, “vsync”, tear-free compositing, and similar approaches “fixed” this - necessarily adding at least a frame’s worth of buffering, and with it perceptible latency.

it’s an oft-overlooked aspect of refresh rates. a 60hz CRT without vsync still has a lower latency lower bound than a 120hz display - perhaps even a 240hz one.
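a rough sketch of the arithmetic (python, with a deliberately simplified scanout model - row and beam positions are fractions of the refresh period, scanout speed is assumed uniform, and vblank time is ignored):

```python
def latency_ms(hz, row, beam, vsync):
    """ms from a framebuffer write until the pixel at `row` is scanned out.

    `row` and `beam` are fractions in [0, 1): how far down the screen the
    target pixel is, and where the beam is when the write happens.
    """
    period = 1000.0 / hz
    if vsync:
        # the write is held until the next vblank, then the beam
        # scans down from the top of the screen to the target row
        return (1.0 - beam) * period + row * period
    # no vsync: the beam picks the change up on this pass if it
    # hasn't reached the row yet, otherwise on the next pass
    if beam <= row:
        return (row - beam) * period
    return (1.0 - beam + row) * period

# write just above the prompt line on a 60hz display, no vsync:
print(latency_ms(60, row=0.50, beam=0.49, vsync=False))   # ~0.17ms
# the same write on a vsynced 240hz display:
print(latency_ms(240, row=0.50, beam=0.49, vsync=True))   # ~4.2ms
```

under these assumptions the unsynced 60hz display's best case is sub-millisecond, beating the vsynced 240hz display's - which is exactly the "lower bound" point above.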

i’ve used two 240hz displays for years now. i’ll never go slower than that.

modeless|1 year ago

> you seem to be implying that the best a 60hz display can manage is 16.6ms of latency

Yes, if you control the whole software stack it is possible to do beam racing to get lower than one frame of latency (assuming low-latency input hardware and display panel scanout). But I'm talking about desktop/mobile applications. In general, operating systems do not do this, and many actually make it impossible. Only very recently has it become possible to do beam racing in a windowed application (not using fullscreen exclusive mode) on Windows, with recent graphics hardware supporting multiplane overlays, and very, very few people have attempted it. I believe it is strictly impossible to do beam racing for windowed applications on macOS and Linux/Wayland. Not sure about iOS and Android.
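The core idea of beam racing can be sketched in a few lines (a toy model, not a real API - on actual Windows hardware the beam position would come from something like `D3DKMTGetScanLine`):

```python
def strip_to_render(beam_row, height, strips):
    """Beam racing: render the strip just ahead of the scanout position,
    so its pixels are written to the front buffer right before the beam
    reads them. Worst-case latency then drops to roughly one strip's
    scanout time, (refresh period) / strips, instead of a whole frame."""
    rows_per_strip = height // strips
    return (beam_row // rows_per_strip + 1) % strips

# beam is inside strip 2 of 8 on a 1080-row display -> render strip 3 next
print(strip_to_render(beam_row=300, height=1080, strips=8))  # 3
```

With 8 strips at 60 Hz that bounds added latency at roughly 16.6/8 ≈ 2ms, which is why emulator projects use exactly this trick.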

foxhill|1 year ago

you don't need to "beam race" to achieve sub-frame latency - you don't need to be that accurate. switching off vsync should, in principle, be enough.

otherwise, yes, modern APIs go out of their way to avoid the possibility of this (the dreaded "tearing" artifacts you see when the framebuffer changes midway through transmission of the video signal to the monitor). i don't believe older techniques like the ones you've mentioned are possible at all today; they only really made sense when analogue displays were the norm.

IshKebab|1 year ago

I think you should assume that we aren't talking about CRTs. Come on.

foxhill|1 year ago

apologies, i wasn't being specific; none of what i said necessitates a CRT - i only used one as an example of how an older technology had lower latency.

if a modern 60Hz LCD/OLED display couldn't get beneath 16.6ms of latency, then what exactly is tearing?
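to make that concrete, here's a toy model of an unsynced buffer swap landing mid-scanout:

```python
def source_frame(row, tear_row):
    """An unsynced swap happens while the beam is at `tear_row`:
    rows already scanned this pass still show the old frame, while
    every row below the tear shows the new frame on this same pass -
    i.e. new content reaches the screen in under one refresh period."""
    return "old" if row < tear_row else "new"

# swap lands mid-screen: the bottom half updates on this very pass
print(source_frame(row=800, tear_row=540))  # new
print(source_frame(row=100, tear_row=540))  # old
```

the tear line is precisely the proof that pixels below it updated in less than one 16.6ms refresh.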