Your GPU is 6 years, 7 months old. As an analogy, consider someone in 2007 objecting that their 2001 GeForce3 Ti500 can't run Crysis/Mass Effect/etc. The PS4 generation really messed with the usual conventions and expectations of PC upgrade cycles.
(I appreciate that GPU prices have crept up and up over time, though.)
For most people, Moore's law died around 2013 with Haswell or even Ivy Bridge. In 2007, 2001 was forever ago. In particular, 2001-07 spanned the jump from the space-heater Pentium 4 to proper 64-bit multicore chips such as the Athlon 64 X2 and the Intel Core 2 Duo/Quad, as well as the growth of dedicated gaming video cards: even the GeForce3 was uncommon in 2001, while by 2007 everyone had a video card.
Arainach|2 years ago
In 2007, CPUs and GPUs were still perceptibly getting twice as fast each year. That hasn't been true for a while. Other than lacking a TPM, my i7-3770k machine with a GTX 970 runs as fast in desktop use as my i9-9900k + 2070 Super, and the newer machine (which dates to... 2019, I think?) still plays new game releases at 1440p just fine.
Recall that most games are designed for the Xbox Series X (released 2020) and PS5 (released 2020) and still target that caliber of GPU performance.
The mesh shader issue isn't a Moore's-law raw-performance issue; it's the hardware not supporting specific features of the graphics pipeline. The software fallback is slower than dedicated hardware, just as a CPU with AES-NI is many times faster at AES than one computing it in software. That's why the parent comment likens it to newer GPUs' hardware raytracing features.
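As a toy sketch of what that fallback means in practice (hypothetical names, not any real engine's API): the engine checks whether the GPU supports the feature and picks either the fast hardware path or a slower emulated one.

```python
# Toy sketch of feature-based path selection (hypothetical names, not a
# real engine API). The point: an old GPU may still run the game, just
# via a slower fallback path rather than dedicated hardware.

def supports_mesh_shaders(gpu_arch: str) -> bool:
    # Rough per-architecture support, per the thread: Turing/RDNA2 and
    # newer have hardware mesh shaders; Pascal (10-series) and RDNA1 don't.
    return gpu_arch in {"turing", "ampere", "rdna2", "rdna3"}

def pick_geometry_path(gpu_arch: str) -> str:
    if supports_mesh_shaders(gpu_arch):
        return "hardware mesh-shader path"
    return "software fallback (slower)"

print(pick_geometry_path("pascal"))  # software fallback (slower)
print(pick_geometry_path("turing"))  # hardware mesh-shader path
```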
ThatPlayer|2 years ago
"Caliber of GPU" is not just performance but also features. The Xbox Series supports mesh shaders (and the PS5 has its equivalent); Nvidia 10-series, AMD RDNA1, and older GPUs do not. This YouTube video compares two GPUs released around the same time, one with less performance but mesh shader support and one with more performance but no mesh shader support: https://youtu.be/UiduP4Y7RSw
paulmd|2 years ago
> Recall that most games are designed for the Xbox Series X (released 2020) and PS5 (released 2020) and still target that caliber of GPU performance.
This cycle is extremely unusual, though: the length of the cross-gen period has been dramatically extended by COVID shortages, by COVID disruptions to the game-development pipeline, and by the war in Ukraine, among other things.
Once developers are no longer forced to validate for literal base-tier PS4 hardware from 2013 (8 Jaguar netbook cores and an underclocked 7850, oh my), GTX 970-tier hardware is going to drop off a cliff immediately. And frankly, even Pascal is not going to age well. There were lots of improvements and features in Pascal's successors that reviewers downplayed from 2018-2022, and they are only now really starting to come into play (and would have done so earlier had it not been for the pandemic).
You can hardly say we’ve even seen next-gen games at this point, tbh. Even CP2077 is a cross-gen title - which in many ways functionally means “last-gen”.
wincy|2 years ago
I went on a quest last year to reduce latency on my desktop and while coding, using an 8 kHz polling-rate mouse, a pro-gamer keyboard, and two 1440p 165 Hz monitors. I read a fascinating article here on Hacker News about it (in November 2022; I'd have to look up the article). Anyway, it feels great. If you want things to feel faster, consider these upgrades! I highly recommend them. For the first few weeks it felt like I had started typing before I'd actually started typing, which was a weird experience. I'd just become so used to a ton of latency.
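The arithmetic behind those upgrades is easy to check: an event lands, on average, halfway through a polling or refresh interval, so the mean added delay at each stage is half the period. A quick back-of-envelope:

```python
def avg_wait_ms(rate_hz: float) -> float:
    # Mean added delay for a stage sampled at rate_hz: half the period.
    return 1000.0 / rate_hz / 2

# Mouse polling: common 125 Hz default vs an 8 kHz gaming mouse
print(avg_wait_ms(125))   # 4.0 ms
print(avg_wait_ms(8000))  # 0.0625 ms

# Display refresh: 60 Hz vs 165 Hz
print(round(avg_wait_ms(60), 2))   # 8.33 ms
print(round(avg_wait_ms(165), 2))  # 3.03 ms
```

This is only the sampling component, of course; debounce, OS scheduling, compositing, and panel response add their own delays on top.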
The lesson here is that you need to bite the bullet and get top of the line at least once.
yieldcrv|2 years ago
Because then you always have some capital within the hardware to upgrade again without breaking the bank.
But if you upgrade mid-life-cycle, or even worse, to a mid-range card that's already mid-life-cycle, then you're always getting less lifespan and having to do full upgrades again.
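One hedged way to read that argument, with made-up numbers purely for illustration (not market data): if you resell when you upgrade, the card's resale value is the "capital within the hardware", and what matters is net cost per year of use.

```python
def net_cost_per_year(purchase: float, resale: float, years: float) -> float:
    # Net cost of ownership per year, assuming you resell when you upgrade.
    return (purchase - resale) / years

# Hypothetical figures: a top-tier card bought at launch and kept a full
# cycle vs. a mid-range card bought mid-cycle that holds less value and
# needs replacing sooner.
top_tier = net_cost_per_year(purchase=1000, resale=400, years=5)
mid_cycle = net_cost_per_year(purchase=500, resale=100, years=2.5)

print(top_tier)   # 120.0
print(mid_cycle)  # 160.0
```

Whether the comparison actually comes out this way depends entirely on real resale values and how long each card stays viable.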