If that's true, how would one rank pumping supercritical liquid nitrogen at high rates through a heatsink? Super-grandmaster?
Seems like the heat flow would be substantially impeded by any boiling of the LN2.
Or, for that matter, simply using a chilled copper ingot as the heat sink? There must be some threshold at which the limiting problem is getting the heat out of the die, not getting the heat out of the chip's package.
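As a rough sanity check on the ingot idea, here is a back-of-envelope sketch (Python; the ingot mass, chill temperature, and CPU power are numbers picked purely for illustration):

    # Back-of-envelope: how much CPU heat can a chilled copper ingot absorb?
    # Every figure below is an illustrative assumption, not a measurement.
    C_COPPER = 385.0                 # J/(kg*K), specific heat of copper
    mass_kg = 10.0                   # assumed ingot mass
    t_start, t_stop = -50.0, 20.0    # assumed chill temp / give-up temp, deg C
    cpu_power_w = 300.0              # assumed sustained package power

    energy_j = mass_kg * C_COPPER * (t_stop - t_start)
    minutes = energy_j / cpu_power_w / 60.0
    print(f"~{energy_j / 1000:.0f} kJ of headroom -> ~{minutes:.0f} min at {cpu_power_w:.0f} W")
    # ~270 kJ -> ~15 min of soak time, so thermal mass is plenty for a
    # benchmark run; getting the heat out of the die is the harder problem.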
What is interesting from the screenshot is that the game is actually CPU-bound, contrary to a belief often held in high-end video game optimization circles.
No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.
Something is always coming down the pipe. Fortunately you have no need to walk the treadmill, and if you're only just getting a 4K screen now, then you're probably not the kind of insane early adopter who is.
12k is possible but it's hardly around the corner. Even 4K has both content and hardware issues around it. 8K is going to be the next mass market push but we're not even done with the 4K party.
Also, 4K display devices are available at a modest price point now. The bigger issue is content. We're mostly there with mass-market media, but if you want to drive a AAA video game at 4K resolution you have to make compromises and spend a lot on the hardware to drive it.
They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.
12k? Never heard of it. Let's go with 8k, which is 7680x4320.
Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care that much about your monitor, your required bandwidth becomes:
7680 * 4320 * 3 * 1000 = 99532800000
99532800000 / 1024 / 1024 / 1024 ≈ 93 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!
To give a comparison, here's 4k@60hz:
3840 * 2160 * 3 * 60 = 1492992000
1492992000 / 1024 / 1024 ≈ 1424 MB/s.
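The same arithmetic as a small Python sketch, under the same assumptions (3 bytes per pixel, no compression):

    # Uncompressed display bandwidth: width * height * bytes/pixel * refresh
    def display_bw(width, height, hz, bytes_per_px=3):
        return width * height * bytes_per_px * hz

    GIB, MIB = 1024 ** 3, 1024 ** 2
    print(f"8K @ 1000 Hz: {display_bw(7680, 4320, 1000) / GIB:.1f} GiB/s")  # ~92.7
    print(f"4K @ 60 Hz: {display_bw(3840, 2160, 60) / MIB:.1f} MiB/s")      # ~1423.8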
Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression, but they think you won't notice) and other tricks aimed at not actually submitting full frames all the time.
Forget your 12k. It will only be useful to increase your energy bill.
At some point, it’s not worth upgrading resolution. I don’t know what that point is for you, but eyes only have a certain angular resolution, beyond which any additional pixel density is meaningless.
For me, that’s a bit more than 1440p at 3 feet at 27”.
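For anyone who wants to plug in their own numbers, a minimal sketch of that threshold (Python; it assumes the common one-arcminute acuity rule of thumb, which varies from person to person):

    import math

    # PPI ceiling implied by a ~1 arcminute visual-acuity rule of thumb
    # (both the acuity figure and the viewing distance are assumptions).
    def ppi_limit(distance_in, arcmin=1.0):
        pitch_in = distance_in * math.tan(math.radians(arcmin / 60.0))
        return 1.0 / pitch_in

    def panel_ppi(w_px, h_px, diag_in):
        return math.hypot(w_px, h_px) / diag_in

    print(f"eye limit at 36 in: ~{ppi_limit(36):.0f} ppi")              # ~95
    print(f"27 in 1440p panel: ~{panel_ppi(2560, 1440, 27):.0f} ppi")   # ~109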
I also learned that literally pouring liquid nitrogen over a CPU from a cup is a grandmaster overclocker move.
Of course, the real grandmaster move is to use liquid helium - its boiling point is about 70C colder than nitrogen :)
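For reference, a quick check of that figure against the normal boiling points (standard textbook values):

    # Normal boiling points at 1 atm, from standard references
    BP_LN2_K = 77.4   # liquid nitrogen
    BP_LHE_K = 4.2    # liquid helium
    print(f"LHe boils ~{BP_LN2_K - BP_LHE_K:.0f} K (= deg C) colder than LN2")
    # -> ~73 K, hence "about 70C colder"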
Edit: seems mobile and desktop have a different crop of that image. Here is the image that shows 720: https://images.ctfassets.net/rporu91m20dc/1XYHhlYZzNI1NxRRJl...
It weighed a tonne and took up quite a lot of desk.
An RTX 2080 Ti can't even push 144fps at 1440p at max settings, much less 4k.