Whinner | 3 years ago
Intel Arc A750s are under $300 and are decent cards for 1080p, and they do well at 1440p. DX9 support has greatly improved since release.
Going up a little in price, AMD's 6650 and 6750 are $300-400.
The 6800 XT is under $600.
pprotas|3 years ago
> 60% of Steam users have a 1080p monitor as their primary display
elabajaba|3 years ago
4 years ago you could get a 1070ti in Canada for $400 new.
The exchange rate is about the same as well.
qball|3 years ago
Fortunately, used 1070s and RX 580s are ~150 CAD (~100 USD).
The combination of board prices being completely out of whack and the performance delta of the 3080 over the 3070/2080 Ti being as large as it is (made worse by the fact that the 4090 is that leap again over the 3080) means the value proposition in the middle has disappeared.
It's not that the 3080 isn't worth 700 USD, or that the current 4000-series cards have a bad price/performance ratio; those are priced appropriately. It's that the cheaper new cards (especially the 3070) have a far worse ratio than the expensive ones do. This is also partially why the most popular gaming GPU on Steam is the GTX 1650.
And with current-generation console games targeting the equivalent of that 1070/2060, buying anything less than a 3080 is an objectively bad decision, unless you play a lot of competitive shooters and can benefit from an intermediate card for high-framerate reasons. (The fact that most of those games aren't particularly fun to play also hurts the hardware industry.)
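The price/performance argument above can be made concrete with a tiny sketch. The performance-index numbers below are hypothetical placeholders (not benchmark results), chosen only to illustrate how a cheap card and an expensive card can both beat the mid-range card sitting between them:

```python
# Hypothetical relative-performance indices (NOT benchmark data),
# paired with rough street prices in USD as discussed in the thread.
cards = {
    "GTX 1650": (150, 100),   # (price_usd, perf_index)
    "RTX 3070": (500, 250),
    "RTX 3080": (700, 400),
}

# Performance per dollar: with these illustrative numbers, the ratio
# dips in the middle of the stack rather than falling monotonically.
for name, (price_usd, perf_index) in cards.items():
    print(f"{name}: {perf_index / price_usd:.2f} perf/$")
```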
paulmd|3 years ago
That space is currently filled by older, slower, less efficient, less-featured last-gen products. Both companies have some significant amounts of inventory they want to burn through after the mining thing and it's going slow because of the general declines in shipments.
Generally, though, I think people are remembering the past through rose-colored glasses. Not saying OP said this in particular, but a lot of people have latched onto the idea of the "$300 x70 tier", and the x70 tier has literally never had a $300 MSRP for the entire time it's existed. It bounced back and forth between $350 and $400 even 10 years ago; $329 is the lowest price it has ever launched at, and people have latched onto that one as the price x70 has to match forever, plus a little extra. The GTX 680 (full-die GK104, a ~300 mm^2 chip) was $499 10 years ago, the GTX 670 was a GK104 cutdown for $399, and the GTX Titan was where you got the full GK110, at a mere $999 (in 2012 dollars).
https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces...
Ampere was somewhat below that baseline; using Samsung was an attempt to make cheaper cards and push costs down. So with $499 being a "bargain" price for the 3070 before the pandemic cost spirals got too bad, and factoring in the more expensive TSMC 5nm node, I think the realistic price for a 4070 (AD104 cutdown, whatever you call it) is probably $600-700 at this point.
So there's definitely some gouging taking place, but a lot of people are fixated on that $300 number, and that's just not going to happen. Costs have spiraled a lot more than people realize. Pascal was not cheap either, and that was 7 years ago (!) this June, and everything since then has been on older, cheaper nodes to help keep costs down... until now. Throw in the pandemic blowing up a lot of costs generally, and yeah, things are expensive now.
And yes, the 4850/4870 were good cheap cards, but AMD could do that because they got onto 55nm ahead of NVIDIA, and that was back in the days when shrinking first was a real advantage: you could match a high-end card with a cheap midrange card if you got to a newer node first. That's not how it works anymore. Higher wafer costs and R&D costs mean newer nodes are better, but they're not really cheaper, even considering you get more chips per wafer. Costs are growing fast enough to eat up the increases in density.
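The "more chips per wafer doesn't make them cheaper" point is easy to see with first-order arithmetic. The wafer prices below are illustrative assumptions (actual foundry pricing is not public); the dies-per-wafer formula is the standard usable-area-minus-edge-loss estimate:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate: wafer area over die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost_usd: float, wafer_diameter_mm: float,
                 die_area_mm2: float, yield_rate: float) -> float:
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate
    return wafer_cost_usd / good_dies

# Illustrative numbers only -- real wafer prices and yields are assumptions.
old_node = cost_per_die(6_000, 300, 300, 0.8)   # mature node, ~$6k/wafer (assumed)
new_node = cost_per_die(17_000, 300, 200, 0.8)  # leading node, ~$17k/wafer (assumed)
print(f"old node: ${old_node:.0f}/die, new node: ${new_node:.0f}/die")
```

With these assumptions, even shrinking the die by a third, the cost per die roughly doubles on the newer node, which is the dynamic described above.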