top | item 46896862


Aardwolf|25 days ago

> Gaming was supposed to be one of the best drivers for 8K adoption.

While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be visible, since the pixels are already invisible at 4K.

However, the framerate drop would be very noticeable...

OTOH, afaik for VR headsets you may still want higher resolutions due to the much larger field of view
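A quick sketch of why field of view matters here, in pixels per degree (all numbers are illustrative assumptions, not the specs of any real TV setup or headset):

```python
# Rough pixels-per-degree (PPD) comparison. Assumed numbers: a 4K TV
# filling ~40 degrees of your view, vs. a headset spreading a
# hypothetical 2160-pixel-wide per-eye panel across ~100 degrees.
tv_ppd = 3840 / 40    # ~96 pixels per degree
vr_ppd = 2160 / 100   # ~21.6 pixels per degree

print(f"TV: {tv_ppd:.0f} PPD, headset: {vr_ppd:.1f} PPD")

# To match the TV's pixel density across a 100-degree FOV, the headset
# would need roughly this many horizontal pixels per eye:
print(f"Pixels per eye for TV-like density: {int(tv_ppd * 100)}")
```

Same panel resolution stretched over a much wider view means far fewer pixels per degree, which is why headsets keep chasing resolution even as TVs plateau.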


tombert|25 days ago

I usually still play at 1080p on my Steam box because my TV is like nine feet away and I cannot tell a difference between 1080p and 4k for gaming, and I would rather have the frames.

I doubt I’m unique.

ksynwa|25 days ago

AAA games have been having really bad performance issues for the last few years while not looking much better. If you wanna game in 8K you are gonna need something like a NASA supercomputer.

laughing_man|24 days ago

AAA games are struggling for a lot of reasons, and consoles are struggling as well. PC gamers tend to use a more traditional monitor setup and won't buy a gigantic television. At least, not for gaming.

xxs|24 days ago

Even with a supercomputer it'd be difficult to render the frames in time with low latency.

bluescrn|25 days ago

We can’t render modern games at decent frame rates at 4k without going down the path of faking it with AI upscaling and frame generation.

There was no hope of actual 8k gaming any time soon even before the AI bubble wrecked the PC hardware market.

Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million, and Doom/Quake were great with just 64000. Let's have more frames instead?

(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
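The pixel counts above check out; a quick sketch of the arithmetic:

```python
# Pixels per frame for the resolutions mentioned above.
resolutions = {
    "Doom-era (320x200)": 320 * 200,
    "1080p": 1920 * 1080,
    "4K": 3840 * 2160,
    "8K": 7680 * 4320,
}
for name, px in resolutions.items():
    print(f"{name:>18}: {px:>10,} pixels")

# 8K is 4x the pixels of 4K and 16x those of 1080p -- per frame,
# before any increase in frame rate.
print(resolutions["8K"] / resolutions["4K"])      # 4.0
print(resolutions["8K"] / resolutions["1080p"])   # 16.0
```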

teamonkey|24 days ago

Yeah, not only the huge required jump in raw fill rate, but to get the most out of an 8K TV you need higher-detail models and textures, and that means you also need a huge jump in VRAM, which never materialised.

alkonaut|24 days ago

I don't see a future in which we play at 4K at top settings either without AI upscaling/interpolation. Even if it were theoretically possible to do so, the performance budget the developers have going forward will be assuming that frame generation and upscaling is used.

So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games that are a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like lowering AA settings: something only done by very niche players, like the CS community today, where some play at 4:3 and lower AA settings for maximum visibility rather than fidelity, and so on.

charcircuit|25 days ago

VR headsets won't use the same panels that a TV would use. Any growth in the XR headset space won't help the TV industry.

Doublon|25 days ago

> While the step from 1080p 1440p to 4K is a visible difference

I even doubt that. My experience is that on a 65" TV, 4K becomes indistinguishable from 1080p beyond 3 meters. I even tested this with friends on The Mandalorian, and we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.

Of course YMMV if you have a bigger screen, or a smaller room.
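A back-of-the-envelope check of that claim, comparing the angular size of a pixel against the ~1 arcminute commonly quoted for 20/20 visual acuity (the acuity figure and the 16:9 panel shape are assumptions):

```python
import math

# Angular size of one pixel on a 65" 16:9 TV seen from 3 m.
def pixel_arcmin(diagonal_inches, horizontal_pixels, distance_m):
    # Panel width from the diagonal, assuming a 16:9 aspect ratio.
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)
    pixel_m = width_m / horizontal_pixels
    return math.degrees(math.atan2(pixel_m, distance_m)) * 60

print(f"1080p: {pixel_arcmin(65, 1920, 3.0):.2f} arcmin")  # ~0.86
print(f"4K:    {pixel_arcmin(65, 3840, 3.0):.2f} arcmin")  # ~0.43
# Both fall below ~1 arcmin at 3 m, so the eye is already near its
# limit at 1080p -- consistent with 4K adding little at that distance.
```

Move closer or enlarge the screen and the 1080p pixel climbs back above an arcminute, which is where the difference becomes visible again.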

alex43578|25 days ago

If your Mandalorian test was via streaming, that's also a huge factor. 4K streaming has very poor quality compared to 4K Blu-ray, for instance.

yieldcrv|25 days ago

There are so many tricks you can do as well. Resolution was never really the issue; sharpness and fidelity aren't the same as charming and aesthetically pleasing.

DharmaPolice|25 days ago

The person was referring to gaming where most PC players are sitting closer than 3 metres from their screen.

tokyobreakfast|25 days ago

> While the step from 1080p 1440p to 4K is a visible difference

It really isn't.

What you are likely seeing is HDR which is on most (but not all!) 4K content. The HDR is a separate layer and unrelated to the resolution.

4K versions of films are usually newly restored with modern film scanning - as opposed to the aging masters created for the DVD era that were used to churn out 1st generation Blu-Rays.

The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.

The "visible difference" is mostly better source material, and HDR.

Of course people will convince themselves what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and $350 gold-plated videophile Ethernet cable makes the audio and video really "pop".

scratcheee|25 days ago

I know the thread is about TVs, but since gaming has come up, it's worth noting that at computer viewing distances the differences between 1080p/1440p and 4K really are very visible (though in my case I have a 4K monitor for media and a 1440p monitor for gaming, since there's zero chance I can run games at 4K anyway).

FeepingCreature|25 days ago

I can confirm that on a PC monitor, 1080p and 4K are very easy to tell apart.

dtech|25 days ago

For a TV maybe, but you're replying to a comment about gaming, and that's definitely done on a monitor, laptop or handheld.