top | item 34765572

Whinner | 3 years ago

$1000 as the floor? Maybe if you’re talking 4K gaming but that’s the very high end.

Intel's Arc A750 is under $300 and is a decent card for 1080p; it does well at 1440p too. DX9 support has greatly improved since release.

Going up a little in price, AMD's 6650 and 6750 are $300-400.

The 6800 XT is under $600.

krisroadruck|3 years ago

I haven't bought a non-4K monitor in over 6 years. I honestly don't know anyone who is still using 1080p monitors as daily drivers if they are also using the machine for productivity or media work. But your point is not invalid.

bryanlarsen|3 years ago

Modern games let you set separate rendering and display resolutions, so you can get most of the benefit of a 4K display without a video card that can render every pixel. The new upscaling techniques are really good.
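As a rough sketch of the saving this comment is describing: rendering internally at a lower resolution and upscaling to the display shades only a fraction of the display's pixels. The resolutions below are just the standard 1080p/1440p/4K pixel dimensions, not figures from the thread:

```python
# Sketch: how much rendering work is saved by rendering at a lower
# internal resolution than the display and upscaling. Resolutions are
# the standard pixel dimensions for each common display size.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def pixels(name):
    w, h = RESOLUTIONS[name]
    return w * h

def render_fraction(render_res, display_res):
    """Fraction of the display's pixels actually rendered each frame."""
    return pixels(render_res) / pixels(display_res)

if __name__ == "__main__":
    # Rendering at 1440p and upscaling to 4K shades only ~44% of the pixels.
    frac = render_fraction("1440p", "4K")
    print(f"1440p -> 4K renders {frac:.0%} of the display's pixels")
```

So a card that can comfortably render 1440p can drive a 4K panel at less than half the per-frame shading cost, which is the point being made about not needing to "render every pixel."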

goosedragons|3 years ago

1440p monitors aren't that expensive either and for a lot of people 4K at 100% scaling is too small to be comfortable at a typical 27" size. For PC gaming a high refresh rate 1080p or 1440p display is a better buy than a 60Hz 4K one at roughly the same price.

scarface74|3 years ago

This is so out of touch with reality it might as well be “do people still watch TV? I haven’t owned a TV in 10 years”

IntelMiner|3 years ago

Hi, I use a pair of 1080p monitors on my primary machine, for work, leisure, and my hobby of editing videos for YouTube.

TaylorAlexander|3 years ago

Still using my old Dell 1440p 27" monitor to edit my 4K YouTube videos. I briefly considered buying a 4K monitor this year but I spent my money on a NAS instead. I use three monitors on my desktop, the other two being 1080p. I use the 27" as my main monitor and the others are for docs and videos. I haven't bought a monitor in like 10 years because these things just keep on going. I do have a 4K monitor at work and it's nice, but it does not feel significantly different from my old 1440p monitor. If I had more money I probably wouldn't think much about an upgrade, but I work for a tech nonprofit and live in the Bay Area, so I am not out buying new stuff that often. The NAS was a long-needed upgrade to serve as a backup for my important media!

ChuckNorris89|3 years ago

At my job we all got 1080p monitors for dev work.

tbrownaw|3 years ago

On in-office days I'm stuck with a pair of 1080p monitors (at home, yes, it's a pair of 4Ks). It's kind of annoying.

cammikebrown|3 years ago

144Hz is way more important to me than resolution. I have a 4K TV I can hook my computer up to if I really want.

anthomtb|3 years ago

I am on a pair of 1080p’s. I stare at text all day. What is the benefit of upgrading to 4K?

krisroadruck|3 years ago

@ChuckNorris89 man... why do they hate their devs? That's just mean =/ Hopefully they don't have you all on a bunch of $400 Dell OptiPlexes too. Pouring one out for you, brother.

elabajaba|3 years ago

The cheapest 3050 in Canada is $389. It is ~10% slower than a 1070ti.

4 years ago you could get a 1070ti in Canada for $400 new.

The exchange rate is about the same as well.
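As a back-of-the-envelope check on the comparison above, here's a sketch in Python. Only the $389/$400 CAD prices and the "~10% slower" figure come from the comment; normalizing the 1070 Ti's performance to an index of 100 is an assumption for illustration:

```python
# Sketch of the comparison above: the 3050 costs about the same as a
# 1070 Ti did four years earlier but is ~10% slower, so its price per
# unit of performance is worse. Prices in CAD are from the comment;
# the performance index is an assumed normalization (1070 Ti = 100).
def price_per_perf(price_cad, perf_index):
    """Dollars paid per unit of performance; lower is better value."""
    return price_cad / perf_index

old_1070ti = price_per_perf(400, 100)  # $400, baseline performance
new_3050 = price_per_perf(389, 90)     # $389, ~10% slower

print(f"1070 Ti: {old_1070ti:.2f} CAD per perf point")
print(f"3050:    {new_3050:.2f} CAD per perf point")  # higher = worse value
```

With these inputs, the new card works out to about 8% more dollars per unit of performance than the four-year-old one, which is the regression being pointed out.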

qball|3 years ago

>The cheapest 3050 in Canada is $389. It is ~10% slower than a 1070ti.

Fortunately, used 1070s and RX580s are 150CAD (100USD).

The combination of board prices being completely out of whack, and the performance delta of the 3080 over the 3070/2080Ti (made worse by the fact that the 4090 is that leap again over the 3080) being as large as it is, means the value proposition in the middle has disappeared.

It's not that the 3080 isn't worth $700 or that the current 4000-series cards don't have a similar price/performance ratio; they are priced appropriately. It's that the cheaper new cards (especially the 3070) have a far worse ratio than the expensive ones do. This is also partially why the most popular gaming GPU on Steam is the GTX 1650.

And with current-generation console games targeting the equivalent of that 1070/2060, buying anything less than a 3080 is an objectively bad decision, unless you're someone who plays a lot of competitive shooters and thus can benefit from an intermediate card for high-framerate reasons. (The fact that most of those games aren't particularly fun to play also hurts the hardware industry.)
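The "far worse ratio in the middle" argument can be sketched as a break-even calculation: at what price would a slower card match the 3080's dollars-per-performance? The 3080's $700 price is from the thread; the "70% of a 3080" performance figure below is an illustrative assumption, not a benchmark:

```python
# Sketch of the break-even logic behind "the cheaper cards have a far
# worse ratio": the price at which a slower card would match the
# 3080's price/performance. $700 is from the thread; the relative
# performance inputs are illustrative assumptions.
REFERENCE_PRICE = 700.0  # 3080 price (USD), from the thread
REFERENCE_PERF = 1.0     # normalize the 3080's performance to 1.0

def breakeven_price(relative_perf):
    """Price at which a card matches the 3080's dollars-per-performance."""
    return REFERENCE_PRICE * (relative_perf / REFERENCE_PERF)

# A hypothetical midrange card at 70% of a 3080's performance would need
# to cost about $490 to offer the same value; anything above that is a
# worse deal per frame than just buying the 3080.
print(f"${breakeven_price(0.70):.0f}")
```

Any midrange card priced above its break-even point delivers fewer frames per dollar than the flagship, which is the sense in which the value proposition in the middle has disappeared.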

paulmd|3 years ago

Grandparent seems to have forgotten about the 4070 Ti (only $800, what a bargain!), but yeah, $800 is currently the floor for current-gen hardware in the sense that nothing has launched below $800 despite being almost 6 months into this product cycle. AMD's cheapest is a $900 MSRP (but starting to fall below that) and NVIDIA's cheapest is an $800 MSRP.

That space is currently filled by older, slower, less efficient, less-featured last-gen products. Both companies have some significant amounts of inventory they want to burn through after the mining thing and it's going slow because of the general declines in shipments.

Generally though I think people are remembering the past with rose-colored glasses. Not saying OP said this in particular, but a lot of people have latched onto the idea of the "$300 x70 tier", and the x70 tier has literally never been $300 MSRP for the entire time it's existed. It bounced back and forth between $350 and $400 even 10 years ago; $329 is the lowest price it has ever launched at, and people have latched onto that one as the price x70 has to match forever, plus a little extra. The GTX 680 (full-die GK104) was $499 10 years ago, for a 300mm^2 chip; the GTX 670 was a GK104 cutdown for $399, for example; and the GTX Titan was where you got the full GK110, at a mere $999 (in 2012 dollars).

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces...

Ampere was somewhat below the baseline: using Samsung was an attempt to make cheaper cards and push down the cost. With $499 being a "bargain" price (for the 3070) before pandemic cost spirals got too bad, and factoring in the more expensive TSMC 5nm node, I think the realistic price for a 4070 (AD104 cutdown, whatever you call it) is probably $600-700 at this point.

So there's definitely some gouging taking place, but a lot of people are fixated on that $300 number, and that's just not going to happen. Costs have spiraled a lot more than people realize. Pascal was not cheap either, and that was 7 years ago (!) this June, and everything since then has been on older, cheaper nodes to help keep costs down... until now. Throw in the pandemic generally blowing a lot of costs up, and yeah, things are expensive now.

And yes, the 4850/4870 were good cheap cards, but AMD could do that because they got onto 55nm ahead of NVIDIA, and that was back in the days when shrinking first was a real advantage: you could match a high-end card with a cheap midrange card if you got to a newer node first. That's not how it works anymore; higher wafer costs and R&D costs mean newer nodes are better, but they're not really cheaper even considering you get more chips per wafer. Costs are growing fast enough to eat up the increases in density.