Up to 600 watts of power for graphics cards with the new PCIe 5.0 power connector
That looks for all the world like a Micro-Fit 3.0 drawing, whose contacts are normally rated at 8.5A, but here they're claiming 9.2A each. Also, the largest Micro-Fit terminal is made for 18AWG wire, but the article mentions 16AWG.
https://www.molex.com/molex/products/family/microfit_30?pare...
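As a sanity check on the 9.2A figure: assuming the reported 12VHPWR layout of six 12V supply pins plus returns (the pin count is my assumption, not something in the drawing), the rating works out with a little headroom:

    # Back-of-envelope power budget for the new connector.
    # Assumes six 12 V supply pins, per the reported 12VHPWR layout.
    PINS = 6
    AMPS_PER_PIN = 9.2                   # claimed rating per contact
    VOLTS = 12.0
    print(PINS * AMPS_PER_PIN * VOLTS)   # 662.4 W, ~10% margin over 600 W

At the stock 8.5A rating you'd get only 6 x 8.5 x 12 = 612W, with essentially no margin, which would explain why they need an uprated terminal.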
I suspect what's happening is Molex is making a new version of the terminal with higher clamping force or better plating, to keep the temperature rise down at higher current, and larger wire grips, to accommodate the thicker conductor. (This will also allow more heat flow away from the contact.)
However, the contact pitch is unchanged from 3.0mm, meaning the cavities in the housing can't grow any more, so the wire insulation thickness will be limited. That's not a big deal electrically since it's only 12 volts, but it's a consideration mechanically since the wires will be less protected against abrasion, pinching, and other damage.
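Rough numbers on why the move to 16AWG matters (the resistances are standard annealed-copper values; 9.2A is the claimed contact rating):

    # Per-metre resistive loss in the wire at the claimed current.
    I = 9.2                  # A, claimed per-contact rating
    R_18AWG = 0.02095        # ohm/m, standard annealed copper
    R_16AWG = 0.01317        # ohm/m
    print(I**2 * R_18AWG)    # ~1.77 W per metre of 18AWG
    print(I**2 * R_16AWG)    # ~1.11 W per metre of 16AWG

So the thicker conductor cuts wire heating by roughly a third, on top of wicking more heat away from the contact itself.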
I've always wondered why CPU AIO coolers are so prevalent, but GPUs almost exclusively use air coolers. We're nearing a point where GPUs will have a TDP >2X the hungriest CPU (Threadripper), but the cooling used by most is far less effective.
It's especially noticeable in smaller builds like those using mini-ITX. Most GPU coolers draw air in from the bottom and exhaust out the sides, which doesn't fit well with the usual PC case airflow setup of drawing cool air in the front and exhausting out the back and top. It seems like blower-style coolers are also getting harder to find.
I'm hoping my next PC can have an AIO at the intake connected to the GPU, and a large air cooler for the CPU. That makes way more sense to me than having the AIO on the CPU and an air cooler on the GPU...
I’ve always thought of gaming as a virtuous low-impact form of entertainment, insofar as it displaces real-world pursuits like foreign ski holidays, gasoline-powered road trips, or land-hogging golf courses.
But burning nearly a kilowatt on a gaming rig feels like a step too far. Sure, the power could be sourced from wind or solar plants, but there’s still a level of excessive conspicuous consumption going on here that doesn’t sit right with me.
Assuming you can install two GPUs each using 600W, that's basically running a hair dryer inside your desktop, since essentially all of that energy ends up as heat.
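To put numbers on the hair dryer comparison (using the conversion 1W = 3.412 BTU/h):

    # Two hypothetical 600 W GPUs, expressed as room heating.
    watts = 2 * 600
    print(watts * 3.412)   # ~4094 BTU/h, space-heater territory

And that's before counting the CPU and the rest of the system.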
Why does the author/publisher feel the need to overlay their own logo over drawings they copied from someone else's datasheet?
Everyone tries to be fancy with wiring exotic networking in their homes, but the real performance kings are installing NEMA 14-50R in their offices right now.
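Only half joking, since the arithmetic works: a NEMA 14-50R in North America is rated 240V at 50A, and even after the usual 80% continuous-load rule there is plenty of room:

    # Capacity of a NEMA 14-50R circuit (North American ratings).
    volts, amps = 240, 50
    print(volts * amps)         # 12000 W peak rating
    print(volts * amps * 0.8)   # 9600 W continuous per the 80% rule

Enough continuous headroom for a stack of these 600W cards, PSU losses aside.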
This looks like some abomination of the ancient Molex drive connector crossed with part of an ATX main connector.
At which point does the GPU get its own dedicated power supply?
I'll be honest, it's very hard for me to imagine what I would do that would demand anywhere near 600 watts for a graphics card alone. My PC can draw at most 120 watts, and that already feels like a lot, although it is pretty old by now. Is all of this for crypto? How many games out there draw anywhere near this level of wattage just for graphics?
This is long overdue; it's something I have been questioning since 2016. (Our GPUs are severely TDP limited.)
It is also interesting that, on one hand, you have Apple pushing CPU and GPU integration for maximum efficiency, where Apple could make a potential 300W TDP SoC for their Mac Pro, while on the other hand you have CPUs pushing to 400W (and higher in the future) and GPUs pushing to 600W.
I wonder what the MSRP for the 3090 Ti will be, and what the actual price in stores will be. Even mid-tier GPUs like the 3070 cost a lot of money.
This is insane. Give me back the 75W GPUs instead. No extra power connector, just the slot.
We're going to see laws dictating max wattages for consumer computers real soon.
It doesn't make any sense to me. Why do you need 600W to power a graphics card? And why does it need an independent power connector?
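Part of the answer is that the slot itself only supplies 75W, and the established auxiliary connectors come in fixed steps (75W per 6-pin, 150W per 8-pin in the PCIe CEM spec), so a 600W card would otherwise need an awkward stack of plugs:

    # Established PCIe power steps (per the PCIe CEM spec).
    SLOT = 75          # W from the x16 slot
    EIGHT_PIN = 150    # W per 8-pin auxiliary connector
    # 8-pin connectors needed to reach 600 W alongside the slot:
    print((600 - SLOT) / EIGHT_PIN)   # 3.5 -> four 8-pin plugs

One new plug replacing four 8-pins is the actual pitch here.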
This is still nowhere near as elegant as Apple’s MPX connector which eliminates the extra cabling altogether.
600 watts? That's one bar of a two-bar electric fire; in my youth, that two-bar fire was the only thing keeping some families warm.
I mean, that power all gets turned into heat anyway in the end; so sure, if you need more heating, forget the two-bar fire, and buy a graphics controller instead - playing games with the two-bar fire isn't a good plan.
I don't get why they can't make efficient GPUs. I mean, I do get that graphics depends on computation, and that all computation has an intrinsic minimum energy cost; but over half a kilowatt, to make a moving picture? That's more than the entire power supply of my gaming rig (which I retired, because the fan noise was excessive).
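On the "intrinsic minimum energy cost" point, the gap between theory and practice is enormous. Landauer's principle puts the floor at kT·ln 2 per bit erased:

    import math
    # Landauer limit: minimum energy to erase one bit at room temperature.
    k_B = 1.380649e-23    # J/K, Boltzmann constant
    T = 300.0             # K, roughly room temperature
    e_bit = k_B * T * math.log(2)
    print(e_bit)          # ~2.87e-21 J per bit
    print(500 / e_bit)    # ~1.7e23 bit-erasures/s for a 500 W budget

So half a kilowatt is nowhere near any thermodynamic wall; the draw is all engineering trade-offs (clocks, voltage, die size), not physics.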