
Up to 600 watts of power for graphics cards with the new PCIe 5.0 power connector

86 points | jsiepkes | 4 years ago | igorslab.de | reply

176 comments

[+] myself248|4 years ago|reply
That looks for all the world like a Micro-Fit 3.0 drawing, whose contacts are normally rated at 8.5A, but here they're claiming 9.2A each. Also, the largest Micro-Fit terminal is made for 18AWG, but the article mentions 16AWG.

https://www.molex.com/molex/products/family/microfit_30?pare...

I suspect what's happening is Molex is making a new version of the terminal with higher clamping force or better plating, to keep the temperature rise down at higher current, and larger wire grips, to accommodate the thicker conductor. (This will also allow more heat flow away from the contact.)

However, the contact pitch is unchanged from 3.0mm, meaning the cavities in the housing can't grow any more, so the wire insulation thickness will be limited. That's not a big deal electrically since it's only 12 volts, but it's a consideration mechanically since the wires will be less protected against abrasion, pinching, and other damage.
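
A quick sanity check of those numbers (assuming the 600W is carried at 12V across 6 supply contacts, which is my reading of the pinout rather than anything stated in the drawing):

    # Back-of-the-envelope check of per-contact current for a 600 W, 12 V connector.
    # Assumes the load is shared evenly across 6 supply contacts (my assumption).
    power_w = 600.0        # advertised connector limit
    voltage_v = 12.0       # PC supply rail
    supply_contacts = 6    # assumed number of current-carrying supply pins

    total_current_a = power_w / voltage_v               # 50 A total
    per_contact_a = total_current_a / supply_contacts   # ~8.3 A per contact

    print(f"Total current: {total_current_a:.1f} A")
    print(f"Per contact:   {per_contact_a:.2f} A (claimed rating: 9.2 A)")

So a 9.2A rating would leave only a modest margin over the roughly 8.3A each contact carries at full load, which fits with needing a beefed-up terminal.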

[+] LeifCarrotson|4 years ago|reply
The current deratings are down to 5.5A for 12-circuit wire-to-board applications:

https://www.molex.com/pdm_docs/ps/PS-43045-001.pdf

But that allows a maximum temperature rise of 30°C, it uses 18AWG wire as you say, and it's supposed to be capable of 600V AC/DC; it's not supposed to break down or leak more than 5mA when exposed to 2200V.

All of that is overkill for inside of a PC. If you're willing to limit the operating voltage to 12V and can guarantee that your board side conductors have big, multi-plane copper traces pulling heat out of the contacts to an actively cooled heat sink, you can get away with a lot that you couldn't if you were using the connector as, say, an in-line disconnect in some conduit for a 480V servo drive.
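
For comparison, a rough sketch of what that conservative datasheet derating would permit at 12V, again assuming 6 current-carrying supply contacts (my assumption):

    # Power deliverable at 12 V while respecting the 5.5 A / 30 C-rise datasheet derating.
    derated_current_a = 5.5   # per-contact derating for 12-circuit wire-to-board use
    voltage_v = 12.0
    supply_contacts = 6       # assumed number of supply pins

    max_power_w = derated_current_a * supply_contacts * voltage_v
    print(f"Power at datasheet derating: {max_power_w:.0f} W")   # ~396 W, well short of 600 W

That gap is why the relaxed, PC-specific operating conditions above make the difference.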

[+] xxpor|4 years ago|reply
At what point would it make sense to move to a 24V or 48V power standard in PCs? The DC-DC conversion world has improved a lot since ATX was invented.
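
To illustrate the appeal: for the same 600W, the current (and the I²R loss in a given cable) drops quickly as the rail voltage rises. A rough sketch:

    # Current required to deliver 600 W at different rail voltages.
    # For a fixed cable, resistive loss scales with I^2; shown relative to the 12 V case.
    power_w = 600.0
    baseline_current_a = power_w / 12.0

    for voltage_v in (12.0, 24.0, 48.0):
        current_a = power_w / voltage_v
        relative_loss = (current_a / baseline_current_a) ** 2
        print(f"{voltage_v:4.0f} V: {current_a:5.1f} A, relative cable loss x{relative_loss:.2f}")
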
[+] kayson|4 years ago|reply
I've always wondered why CPU AIO coolers are so prevalent, but GPUs almost exclusively use air coolers. We're nearing a point where GPUs will have a TDP >2X the hungriest CPU (Threadripper), but the cooling used by most is far less effective.

It's especially noticeable in smaller builds like those using mini-ITX. Most of those GPU coolers draw air in from the bottom and exhaust out the sides, which doesn't fit well with the usual PC case airflow setup of drawing cool air in at the front and exhausting out the back and top. It also seems like blower-style coolers are getting harder to find.

I'm hoping my next PC can have an AIO at the intake connected to the GPU, and a large air cooler for the CPU. That makes way more sense to me than having the AIO on the CPU and an air cooler on the GPU...

[+] ajoy39|4 years ago|reply
The difference is that with GPUs nearly every model has a different layout of components around the actual chip, and those components need to be cooled too, so making a universal cold plate for them can be really challenging. The AIO I bought for my last computer could be used with any CPU, AMD or Intel, released in the past decade. The water block I bought for the GPU I put in my desktop two computers ago could only be used with reference designs of the AMD R9 290 or 290x.

There are models of GPUs with AIOs: EVGA makes their Hydro series, Gigabyte has their Waterforce series. They just usually aren't worth the cost, since you have to design a new full-cover block not just for every generation but usually for different cards within the same generation too.

Lastly, TDP is not a standardized metric; each manufacturer defines it differently. Intel, for example, specifies TDP at base frequency, so turbo boost can push actual power draw well beyond it. How that translates to heat output depends on architecture and process node. Intel's most recent chips, for example, are rated at just over 100 watts TDP, but they're known to need 200+ watts of heat dissipation for sustained workloads.

[+] sbradford26|4 years ago|reply
A big part is that CPU sockets are very standardized and you are only cooling the CPU, not the memory and power regulators. For graphics cards, the cooler needs to cool the GPU along with the memory and power regulators, which can be in different places depending on the manufacturer. This usually means you need a block made specifically for your model of card. You can buy those blocks and do custom water cooling, but the cost is incredibly high. Finally, it comes down to the fact that AIO coolers really aren't better than good air coolers: AIOs tend to be quieter and move the heat to a different location, but on pure performance they don't crush air coolers.
[+] jl6|4 years ago|reply
I’ve always thought of gaming as a virtuous low-impact form of entertainment, insofar as it displaces real-world pursuits like foreign ski holidays, gasoline-powered road trips, or land-hogging golf courses.

But burning nearly a kilowatt on a gaming rig feels like a step too far. Sure, the power could be sourced from wind or solar plants, but there’s still a level of excessive conspicuous consumption going on here that doesn’t sit right with me.

[+] evandijk70|4 years ago|reply
Definitely agree that a kilowatt seems very high for gaming and starts to have an environmental impact. Still, driving a car at 100kph takes around 20 kW of power, so driving somewhere is still a bigger burden on the environment than gaming.
[+] nitrogen|4 years ago|reply
> excessive conspicuous consumption

Energy is a proxy for accomplishment and development. Humanity's #1 goal should be to maximize energy use per capita while minimizing energy use per activity, such that every human is doing as many desirable activities as possible.

Gaming at 1kW for an hour costs 30 cents or less in electricity. As others have mentioned indirectly, going for a Sunday drive to see the changing leaves uses more energy. A space heater uses more power.

[+] stefan_|4 years ago|reply
It's a trivial amount of power and even better, we have already electrified all the components! Computers don't require us to go out and replace all the combustion engine vehicles still around. Clean up the grid and that's it.
[+] unglaublich|4 years ago|reply
Why? Your home heating consumes 5-50kW. In colder climates, the GPU's power consumption will be offset by reduced home heating.
[+] jayd16|4 years ago|reply
Probably not worth judging the entire sector by the peak theoretical output of one connector that might be relegated to industrial use.
[+] turtlebits|4 years ago|reply
Lots of things are a huge waste of electricity. Ever use your oven? At 12% efficiency you're wasting over 1kW of electricity.

Unless energy prices rise, it's literally pennies.

[+] dragontamer|4 years ago|reply
> But burning nearly a kilowatt on a gaming rig feels like a step too far.

How much energy does a golf cart use? Because those golfers are driving a golf cart for miles when their legs would do perfectly well (and golf carts aren't allowed in professional play).

[+] sz4kerto|4 years ago|reply
It's really low impact though. Even if your rig consumes a kW, you're still not driving anywhere (in a car that consumes at least an order of magnitude more), or going out to a restaurant (that's air conditioned), etc.
[+] tiotempestade|4 years ago|reply
Exactly why I decided to comment, only to find you'd already touched on the topic :)

This is getting out of control!

Nvidia/AMD: build more efficient GPUs!

[+] chaosbutters314|4 years ago|reply
If you crunch the numbers on how much solar (Desertec has a fun graphic on this for the interested) and wind power we could produce if politicians actually cared about fighting climate change, there is a nearly infinite supply of power for humanity.

This isn't even including fossil fuels, nuclear/fusion, biomass/trash, and hydro.

https://en.wikipedia.org/wiki/Desertec#/media/File:Fullneed....

[+] hulitu|4 years ago|reply
Why so negative? I think gaming is the future. We can heat houses and have fun; if only we were able to store the heat. Why make better chips when users don't care?
[+] rowanG077|4 years ago|reply
Rest assured that almost no gamer will have a card that uses that much power.
[+] DeathArrow|4 years ago|reply
Any human endeavor consumes energy.
[+] httpz|4 years ago|reply
Assuming you can install two GPUs using 600W, that's basically running a hair dryer in your desktop since most of the energy is converted to heat.
[+] noobermin|4 years ago|reply
Why is this being downvoted? If you have a kilowatt (I wanted to type that out instead of writing kW, to emphasize it) coming out of the wall, that energy has to go somewhere. Literally warm your room during winter and turn down your heater.
[+] ksec|4 years ago|reply
This got me thinking. (I must be missing something or have a wrong assumption in my high-level overview of the issue.)

Do crypto people use their GPU-generated heat as a heater, assuming they live in a place that requires heating anyway?

Doesn't that equate to free money assuming they could actually mine crypto?

[+] denton-scratch|4 years ago|reply
All of the energy is converted to heat. Some energy from your rig is emitted as light; but that's your monitor, and your monitor has its own power supply (which means that 600W is an underestimate).
[+] oatmeal_croc|4 years ago|reply
Well, all the energy is eventually converted to heat.
[+] worrycue|4 years ago|reply
The RTX 3080 already uses 320W (peak?). By the next GPU generation, who knows.

Good lord, just two generations back, GPUs like the 1070 Ti consumed 180W. Even if you add a CPU of that era like the i7-7700 with its 65W draw, that's just 245W. Now a video card alone can draw more than that.

[+] avian|4 years ago|reply
Why does the author/publisher feel the need to overlay their own logo over drawings they copied from someone else's datasheet?
[+] causi|4 years ago|reply
To hide the fact he's violating Amphenol ICC's copyright by using their images and stripping their copyright notice and publishing info off of them.
[+] skhr0680|4 years ago|reply
If someone wants to copy and paste his article, they’ll at least have to go to the effort of finding the images themselves
[+] bob1029|4 years ago|reply
Everyone tries to be fancy with wiring exotic networking in their homes, but the real performance kings are installing NEMA 14-50R in their offices right now.
[+] neither_color|4 years ago|reply
Anecdote: A relative of mine who does home remodeling told me that over the past year most of his clients plan for dedicated office rooms now, rooms that would've been planned as guest rooms and whatnot before. They're also running 12-2 wire with 20 amp breakers to these rooms now. So yeah, not quite the laundry plugs you're describing but getting closer.
[+] timw4mail|4 years ago|reply
This looks like some abomination of the ancient Molex drive connector crossed with part of an ATX main connector.

At which point does the GPU get its own dedicated power supply?

[+] stefan_|4 years ago|reply
Why would it ever? It's much more efficient to scale up the one AC mains voltage -> 12V DC supply you already have.

You think in datacenters they are just stacking up PSUs? No, all of this stuff gets vastly more efficient and reliable if you can build just one big PSU to supply say 48V DC and distribute that everywhere.

[+] AnIdiotOnTheNet|4 years ago|reply
We seem to be reaching the point where I expect it'll soon make more sense to consider the GPU as the heart of the computer and the CPU as a peripheral.
[+] cogman10|4 years ago|reply
The real question is at what point do gamers start choosing lower power parts because the GPU is too power hungry? Do we get into a world where professional gamers start having 240V lines installed for their PCs?

A US power outlet is typically rated at 120VAC @ 15A; that's an 1800W budget if the only thing on the circuit is a PC.
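
Roughly, treating a long gaming session as a continuous load (the 80% figure is the standard NEC continuous-load rule; the build numbers are illustrative assumptions only):

    # Rough budget for a dedicated 15 A / 120 V circuit feeding a gaming PC.
    breaker_a = 15.0
    voltage_v = 120.0

    nominal_w = breaker_a * voltage_v     # 1800 W nameplate
    continuous_w = 0.8 * nominal_w        # 1440 W for loads running 3+ hours (NEC 80% rule)

    # Hypothetical high-end build (illustrative numbers only):
    gpu_w, cpu_w, other_w = 600.0, 250.0, 100.0
    psu_efficiency = 0.90
    wall_draw_w = (gpu_w + cpu_w + other_w) / psu_efficiency

    print(f"Continuous budget:   {continuous_w:.0f} W")
    print(f"Estimated wall draw: {wall_draw_w:.0f} W")
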

[+] scrooched_moose|4 years ago|reply
Will Smith (of Tested/Maximum PC) mentioned in one of his recent podcast episodes [1] that GPU vendors tried to add dedicated power supplies to cards, with their own AC connection, a while back (mid 2000s or so if I remember right). Consumer/industry pushback killed it before they hit shelves and the vendors focused on efficiency instead. I don't see the same pushback happening these days.

[1] https://techpod.content.town/

[+] vmception|4 years ago|reply
> At which point does the GPU get its own dedicated power supply?

This has already been true for several years in many machines, if you look at the machines the hard-to-find GPUs are actually in.

I wouldn’t be surprised if that is a market force behind this new PCIe standard.

[+] noobermin|4 years ago|reply
I'll be honest, it's very hard for me to imagine what I would do that would demand anywhere near 600 watts for a graphics card alone. I mean my PC can draw at most 120 watts and that feels like a lot, although the PC is pretty old by now. All of this for crypto? How many games are there out there that draw anywhere near this level of wattage just for graphics?
[+] ksec|4 years ago|reply
This is long overdue; it's something I have been questioning since 2016. (Our GPUs are severely TDP-limited.)

It is also interesting that on one hand you have Apple pushing CPU and GPU integration for maximum efficiency, where Apple could make a potential 300W TDP SoC for their Mac Pro. On the other hand you have CPUs pushing to 400W (and higher in the future) while GPUs push to 600W.

[+] DeathArrow|4 years ago|reply
I wonder what the MSRP for the 3090 Ti will be, and what the actual price in stores will be. Even mid-tier GPUs like the 3070 cost a lot of money.
[+] nottorp|4 years ago|reply
This is insane. Give me back the 75W GPUs instead. No extra power connector, just the slot.
[+] bruce343434|4 years ago|reply
APUs/integrated GPUs have taken up that role.
[+] sp332|4 years ago|reply
It's been a few years since the release of the GTX 1650 but it's still a pretty serviceable card. Cheap, too.
[+] bartwe|4 years ago|reply
We're going to see laws dictating max wattages for consumer computers real soon.
[+] brenelson|4 years ago|reply
It doesn't make any sense. Why do you need to have 600W to power a graphics card? Why do they need an independent power connector?
[+] gambiting|4 years ago|reply
I'm not sure I understand your question, but I will try.

>>Why do you need to have 600W to power a graphics card?

You don't. There is no GPU currently that needs that much. But as cards are comfortably approaching 400W and more, a new connector is necessary so you don't end up with GPUs taking 3 or 4 existing PCIe 8-pin power connectors. This single 55-amp compatible connector allows for significantly easier routing of paths on circuit boards.

>>Why do they need an independent power connector?

Because the PCIe slot alone can only provide 75W of power. And if it could provide more, you'd need to deliver that power to the motherboard, so you'd just have moved the connector from one place to another.
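
To put rough numbers on that (the 75W slot limit and the 150W limit per 8-pin connector are from the PCIe specs; the rest is just arithmetic):

    import math

    # Why 600 W doesn't fit the existing connector scheme.
    target_w = 600.0
    slot_w = 75.0          # what the PCIe slot itself can supply
    eight_pin_w = 150.0    # spec limit for one 8-pin PCIe power connector
    voltage_v = 12.0

    eight_pins_needed = math.ceil((target_w - slot_w) / eight_pin_w)   # -> 4
    new_connector_current_a = target_w / voltage_v                     # -> 50 A in one plug

    print(f"8-pin connectors needed on top of the slot: {eight_pins_needed}")
    print(f"Current through a single new connector:     {new_connector_current_a:.0f} A")
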

[+] jscipione|4 years ago|reply
This is still nowhere near as elegant as Apple’s MPX connector which eliminates the extra cabling altogether.
[+] denton-scratch|4 years ago|reply
600 watts? That's one bar of a two-bar electric fire; in my youth, that two-bar fire was the only thing keeping some families warm.

I mean, that power all gets turned into heat anyway in the end; so sure, if you need more heating, forget the two-bar fire, and buy a graphics controller instead - playing games with the two-bar fire isn't a good plan.

I don't get why they can't make efficient GPUs. I mean, I do get that graphics depends on computation, and that all computation has an intrinsic minimum energy cost; but half a kilowatt, to make a moving picture? That's more than the power supply for my gaming rig (which I retired, because the fan noise was excessive).