
bluescrn | 24 days ago

We can’t render modern games at decent frame rates at 4K without going down the path of faking it with AI upscaling and frame generation.

There was no hope of actual 8K gaming any time soon even before the AI bubble wrecked the PC hardware market.

Attempting to render 33 million pixels per frame seems like utter madness when 1080p is a mere 2 million and Doom/Quake were great with just 64,000. Let’s have more frames instead?

(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
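Those pixel counts are easy to verify with quick arithmetic (a sketch; the Doom figure assumes its 320x200 software-rendered mode):

```python
# Pixels per frame at the resolutions mentioned above.
resolutions = {
    "Doom (320x200)": (320, 200),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# Doom comes out at exactly 64,000; 1080p at ~2.07 million;
# 8K at ~33.2 million -- a 16x jump over 1080p.
```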

teamonkey | 24 days ago

Yeah, and it’s not only the huge required jump in raw fill rate: to get the most out of a 4K TV you need higher-detail models and textures, which means you also need a huge jump in VRAM, which never materialised.

bluescrn | 24 days ago

The frame buffers/render targets alone for 8K are massive.

Basically 400 MB at 12 bytes/pixel (64-bit HDR RGBA + 32-bit depth/stencil)

vs the 64,000 bytes that Doom had to fill...
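The 400 MB figure checks out under the per-pixel layout the comment describes (a sketch; the formats assumed are an RGBA16F colour target at 8 bytes/pixel plus a 4-byte depth/stencil buffer):

```python
# Assumed per-pixel cost, per the comment above:
# 64-bit HDR RGBA (e.g. RGBA16F) = 8 bytes, 32-bit depth/stencil = 4 bytes.
BYTES_PER_PIXEL = 8 + 4

def render_target_bytes(width: int, height: int) -> int:
    """Size of one colour + depth/stencil render target pair."""
    return width * height * BYTES_PER_PIXEL

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("4K", (3840, 2160)),
                     ("8K", (7680, 4320))]:
    print(f"{name}: {render_target_bytes(w, h) / 1e6:.0f} MB")
# 8K lands at ~398 MB -- and a real renderer typically holds several
# such targets at once (G-buffer, post-processing chains, etc.).
```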

alkonaut | 24 days ago

I don't see a future in which we play at 4K at top settings either without AI upscaling/interpolation. Even if it were theoretically possible to do so, the performance budget the developers have going forward will be assuming that frame generation and upscaling is used.

So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like someone lowering AA settings or whatever: something only done by very niche players, like the CS community does today, where some play at 4:3 aspect ratios and lower AA settings for maximum visibility rather than fidelity.

xxs | 24 days ago

In most cases you don't need anti-aliasing at 4K.