That looks for all the world like a Micro-Fit 3.0 drawing, whose contacts are normally rated at 8.5A, but here they're claiming 9.2A each. Also, the largest Micro-Fit terminal is made for 18AWG wire, but the article mentions 16AWG.

https://www.molex.com/molex/products/family/microfit_30?pare...
I suspect what's happening is Molex is making a new version of the terminal with higher clamping force or better plating, to keep the temperature rise down at higher current, and larger wire grips, to accommodate the thicker conductor. (This will also allow more heat flow away from the contact.)
However, the contact pitch is unchanged at 3.0mm, meaning the cavities in the housing can't grow any larger, so wire insulation thickness will be limited. That's not a big deal electrically, since it's only 12 volts, but it's a consideration mechanically, since the wires will be less protected against abrasion, pinching, and other damage.
https://www.molex.com/pdm_docs/ps/PS-43045-001.pdf

But that spec allows a maximum temperature rise of 30C, it uses 18AWG wire as you say, and it's supposed to be capable of 600V AC/DC; it's not supposed to break down or leak more than 5mA when exposed to 2200V.
All of that is overkill for inside of a PC. If you're willing to limit the operating voltage to 12V and can guarantee that your board-side conductors have big, multi-plane copper traces pulling heat out of the contacts to an actively cooled heat sink, you can get away with a lot that you couldn't if you were using the connector as, say, an in-line disconnect in some conduit for a 480V servo drive.
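To put the temperature-rise tradeoff in rough numbers, here is a minimal back-of-the-envelope sketch in Python; the per-contact resistance is an assumed, illustrative figure, not a Molex number:

    # Rough per-contact I^2*R heating at the claimed rating.
    # R_contact is an assumption (a few milliohms is typical for
    # crimp terminals of this size), not a datasheet value.
    I = 9.2             # amps per contact, per the article
    R_contact = 0.003   # ohms, assumed
    P = I**2 * R_contact
    print(f"~{P:.2f} W dissipated per contact")  # ~0.25 W

A quarter watt per contact sounds small, but with a dozen contacts packed into one small housing, it's enough that plating, crimp quality, and board-side copper all start to matter.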
I've always wondered why AIO coolers are so prevalent for CPUs while GPUs almost exclusively use air coolers. We're nearing a point where GPUs will have a TDP more than twice that of the hungriest CPU (Threadripper), yet the cooling most of them use is far less effective.
It's especially noticeable in smaller builds like those using mini-ITX. Most GPU coolers draw air in from the bottom and exhaust out the sides, which doesn't fit well with the usual PC case airflow setup of drawing cool air in the front and exhausting out the back and top. Blower-style coolers also seem to be getting harder to find.
I'm hoping my next PC can have an AIO at the intake connected to the GPU, and a large air cooler for the CPU. That makes way more sense to me than having the AIO on the CPU and an air cooler on the GPU...
The difference is that with GPUs nearly every model has a different layout of components around the actual chip, and those components need to be cooled too, so making a universal cold plate for them can be really challenging. The AIO I bought for my last computer could be used with any CPU, AMD or Intel, released in the past decade. The water block I bought for the GPU I put in my desktop two computers ago could only be used with reference designs of the AMD R9 290 or 290x.
There are GPU models with AIOs: EVGA makes its Hydro series, and Gigabyte has its WaterForce series. They just usually aren't worth the cost, since you have to design a new full-cover block not just for every generation but usually for different cards within the same generation too.
Lastly, TDP is not a defined standard, and each manufacturer defines it differently. Intel, for example, ties TDP to sustained power draw at base frequency rather than peak consumption. How that translates to actual heat output depends on architecture and process node: Intel's most recent chips all max out at just over 100 watts of rated TDP, but they're known to need 200+ watts of heat dissipation under sustained workloads.
A big part is that CPU sockets are very standardized, and you are only cooling the CPU, not the memory and power regulators. For graphics cards, the cooler needs to cool the GPU along with the memory and power regulators, which can be in different places depending on the manufacturer. This usually means you need a block made specifically for your model of card. You can buy those blocks and do custom water cooling, but the cost is incredibly high. And it finally comes down to the fact that AIO coolers really aren't better than good air coolers: AIOs tend to be quieter and move the heat to a different location, but on pure performance they don't crush air coolers.
I’ve always thought of gaming as a virtuous low-impact form of entertainment, insofar as it displaces real-world pursuits like foreign ski holidays, gasoline-powered road trips, or land-hogging golf courses.
But burning nearly a kilowatt on a gaming rig feels like a step too far. Sure, the power could be sourced from wind or solar plants, but there’s still a level of excessive conspicuous consumption going on here that doesn’t sit right with me.
Definitely agree that a kilowatt seems very high for gaming and starts to have an environmental impact. Still, driving a car at 100kph takes around 20 kW of power, so driving somewhere is still a bigger burden on the environment than gaming.
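As a quick sanity check on those figures (the 20 kW cruise number is the commenter's estimate, not a measured value):

    # Energy used in one hour of each activity, from the figures above.
    drive_kw, game_kw = 20, 1
    print(f"Driving: {drive_kw} kWh, gaming: {game_kw} kWh, "
          f"ratio: {drive_kw // game_kw}x")  # 20x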
Energy is a proxy for accomplishment and development. Humanity's #1 goal should be to maximize energy use per capita while minimizing energy use per activity, such that every human is doing as many desirable activities as possible.
Gaming at 1kW for an hour costs 30 cents or less in electricity. As others have mentioned indirectly, going for a Sunday drive to see the changing leaves uses more energy. A space heater uses more power.
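A minimal sketch of that cost claim, using assumed residential electricity rates (actual prices vary widely by region):

    # Cost of one hour of gaming on a 1 kW rig at sample rates.
    kwh = 1.0  # energy drawn in one hour at 1 kW
    for rate in (0.10, 0.15, 0.30):   # $/kWh, illustrative only
        print(f"${rate:.2f}/kWh -> ${kwh * rate:.2f}/hour")

At typical US rates, the "30 cents or less" figure holds up.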
It's a trivial amount of power and, even better, we have already electrified all the components! Computers don't require us to go out and replace all the combustion engine vehicles still around. Clean up the grid and that's it.

Unless energy prices rise, it's literally pennies.
> But burning nearly a kilowatt on a gaming rig feels like a step too far.
How much energy does a golf cart use? Because those golfers ride a golf cart for miles when their legs are perfectly capable (and golf carts are illegal in professional play).
It's really low impact though. Even if your rig consumes a kW, you're still not driving anywhere (in a car that consumes at least an order of magnitude more), or going out to a restaurant (that's air conditioned), etc.

This is getting out of control!
Nvidia/AMD: build more efficient GPUs!
If you crunch the numbers on how much solar (Desertec has a fun graphic on this for the interested) and wind power we could produce if politicians actually cared about fighting climate change, there is a nearly infinite supply of power for humanity.
This isn't even including fossil fuels, nuclear/fusion, biomass/trash, and hydro.

https://en.wikipedia.org/wiki/Desertec#/media/File:Fullneed....
Why so negative? I think gaming is the future: we can heat houses and have fun, if only we could store the heat. Why make better chips when users don't care?
Why is this being downvoted? If you have a kilowatt (I wanted to type that out instead of writing kW, to emphasize it) coming out of the wall, that energy has to go somewhere. Literally warm your room during winter and turn down your heater.
All of the energy is converted to heat. Some energy from your rig is emitted as light, but that's your monitor, and your monitor has its own power supply (which means that 600W is an underestimate).

Do crypto people use their GPU-generated heat as a heater, assuming they live in a place that requires heating anyway?
Doesn't that equate to free money, assuming they could actually mine crypto?
The RTX 3080 already uses 320W (peak?). By the next GPU generation, who knows.
Good lord, just two generations back, GPUs like the 1070 Ti consumed 180W. Even if you add a CPU of that era like the i7-7700 with its 65W draw, that's just 245W. Now a video card alone can draw more power.
Everyone tries to be fancy with wiring exotic networking in their homes, but the real performance kings are installing NEMA 14-50R in their offices right now.
Anecdote: A relative of mine who does home remodeling told me that over the past year most of his clients plan for dedicated office rooms now, rooms that would've been planned as guest rooms and whatnot before. They're also running 12-2 wire with 20 amp breakers to these rooms now. So yeah, not quite the laundry plugs you're describing but getting closer.
At which point does the GPU get its own dedicated power supply?

Why would it ever? It's much more efficient to scale up the one AC mains voltage -> 12V DC supply you already have.
You think in datacenters they are just stacking up PSUs? No, all of this stuff gets vastly more efficient and reliable if you can build just one big PSU to supply say 48V DC and distribute that everywhere.
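A small sketch of why higher-voltage distribution wins: for a fixed delivered power, cable current scales as 1/V and resistive loss as I^2*R, so moving from 12V to 48V cuts cable loss by a factor of 16. The cable resistance here is an assumed figure for illustration:

    # I^2*R loss for the same 600 W delivered over the same cable.
    P = 600.0   # watts delivered to the load
    R = 0.01    # ohms, assumed round-trip cable resistance
    for v in (12, 48):
        i = P / v
        print(f"{v:>2} V bus: {i:5.1f} A, {i**2 * R:5.2f} W lost")
    # 12 V bus:  50.0 A, 25.00 W lost
    # 48 V bus:  12.5 A,  1.56 W lost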
We seem to be reaching the point where I expect it'll soon make more sense to consider the GPU as the heart of the computer and the CPU as a peripheral.

This has been true in many machines for several years already, if you look at the machines the hard-to-find GPUs are actually in. I wouldn't be surprised if that is a market force behind this new PCIe standard.
The real question is at what point do gamers start choosing lower power parts because the GPU is too power hungry? Do we get into a world where professional gamers start having 240V lines installed for their PCs?
A US power outlet is typically rated at 120VAC @ 15A; that's an 1800W budget if the only thing on the circuit is the PC.
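Worth noting that the full nameplate rating isn't usable for a sustained load: the NEC treats loads running three hours or more as continuous and caps them at 80% of the breaker rating. A quick check:

    # US 120 V / 15 A branch circuit budget.
    peak = 120 * 15            # 1800 W nameplate
    continuous = peak * 0.80   # 1440 W continuous-load ceiling
    print(peak, continuous)    # 1800 1440.0

So a 600 W GPU plus the rest of a high-end rig, a monitor, and anything else sharing the circuit eats meaningfully into that 1440 W.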
Will Smith (of Tested/Maximum PC) mentioned in one of his recent podcast episodes [1] that GPU vendors tried to add dedicated power supplies to cards, with their own AC connection, a while back (mid-2000s or so, if I remember right). Consumer/industry pushback killed it before they hit shelves, and the vendors focused on efficiency instead. I don't see the same pushback happening these days.

[1] https://techpod.content.town/
I'll be honest, it's very hard for me to imagine what I would do that would demand anywhere near 600 watts for a graphics card alone. I mean, my PC can draw at most 120 watts, and that feels like a lot, although the PC is pretty old by now. Is all of this for crypto? How many games out there draw anywhere near this level of wattage just for graphics?
This is long overdue, and something I have been questioning since 2016. (Our GPUs are severely TDP limited.)
It is also interesting that on one hand you have Apple pushing CPU and GPU integration for maximum efficiency, where Apple could make a potential 300W TDP SoC for their Mac Pro, while on the other hand you have CPUs pushing to 400W (and higher in the future) and GPUs pushing to 600W.
I'm not sure I understand your question, but I will try.
>>Why do you need to have 600W to power a graphics card?
You don't. There is no GPU currently that needs that much. But as cards are comfortably approaching 400W and more, a new connector is necessary so you don't end up with GPUs taking 3 or 4 existing PCIe 8-pin power connectors. This single 55-amp compatible connector allows for significantly easier routing of paths on circuit boards.
>>Why do they need an independent power connector?
Because the PCIe slot alone can only provide 75W of power. And if it could provide more, you'd need to deliver that power through the motherboard, so you'd just have moved the connector from one place to another.
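A quick sanity check of the budgets in this comment; the six-contact figure follows the new connector's reported layout (six 12V contacts plus six grounds) and should be read as an assumption here, not a spec quote:

    # Current needed vs. available for a 600 W, 12 V connector.
    print(600 / 12)            # 50.0 A required at 12 V
    print(f"{6 * 9.2:.1f}")    # 55.2 A from six 9.2 A contacts
    # Today's alternative: 75 W from the slot, 150 W per 8-pin.
    print((600 - 75) / 150)    # 3.5 -> four 8-pin connectors

That's where the 55-amp figure above comes from, and why a 600W card would otherwise sprout a row of four 8-pin plugs.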
600 watts? That's one bar of a two-bar electric fire; in my youth, that two-bar fire was the only thing keeping some families warm.
I mean, that power all gets turned into heat anyway in the end; so sure, if you need more heating, forget the two-bar fire, and buy a graphics controller instead - playing games with the two-bar fire isn't a good plan.
I don't get why they can't make efficient GPUs. I mean, I do get that graphics depends on computation, and that all computation has an intrinsic minimum energy cost; but half a kilowatt, to make a moving picture? That's more than the power supply for my gaming rig (which I retired, because the fan noise was excessive).
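For what it's worth, the intrinsic minimum being gestured at is the Landauer limit, kT*ln2 of energy per erased bit. A quick calculation (throughput and power are ballpark figures from this thread's era, e.g. an RTX 3080 at roughly 30 TFLOP/s and 320W) shows real hardware sits many orders of magnitude above that floor:

    import math
    k, T = 1.380649e-23, 300         # Boltzmann constant (J/K), room temp (K)
    landauer = k * T * math.log(2)   # ~2.9e-21 J per erased bit
    per_op = 320 / 30e12             # ~1.1e-11 J per operation, assumed GPU
    print(f"{landauer:.1e} J/bit vs {per_op:.1e} J/op, "
          f"gap ~{per_op / landauer:.0e}x")

In other words, physics allows vastly more efficient GPUs; the half kilowatt is an engineering and market choice, not a hard floor.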