What's y'all's read on the 12-pin PCIe rumor surrounding the 3080ti?

I wonder if the ultimate goal is to avoid issues caused by users using 6- and 8-pin splitters to power higher-end cards. I seem to vaguely recall that some of the RX 5700 crash issues were being blamed on how the auxiliary power was being delivered to the card.
Doubt it. Splitters will be around regardless. Jerry-rigged eBay China crap won't go away just because of a new plug...
 
I can see it being worse, like a dual-SATA to 12-pin, or even a single SATA to 12-pin.

Case in point: up to 150W through pins rated for 54W total.
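For context, a rough sketch of the arithmetic behind that claim (assuming the typical 1.5A rating per SATA power contact and a card pulling its auxiliary power from the 12V pins only):

```python
# Rough numbers behind "150W through pins rated for 54W total".
# Assumes the typical 1.5A rating per SATA power contact and that the
# card draws its auxiliary power from the 12V pins only.

SATA_12V_PINS = 3        # a SATA power connector has three 12V contacts
AMPS_PER_PIN = 1.5       # typical per-contact rating
RAIL_VOLTS = 12.0

sata_capacity_w = SATA_12V_PINS * AMPS_PER_PIN * RAIL_VOLTS
gpu_draw_w = 150.0       # what a single 8-pin's worth of load can demand

print(f"SATA 12V capacity: {sata_capacity_w:.0f} W")             # 54 W
print(f"Overload factor:   {gpu_draw_w / sata_capacity_w:.1f}x")  # ~2.8x
```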

Ugh. And of course Amazon won't let anyone who hasn't bought that little fire-starter post an answer to the "can I safely use it" questions other people have asked.

Upvoted the 1-star "it charred" review, added a comment explaining why the magic smoke came out, and flagged all the answers claiming it's safe as not helpful.
 
I wonder if the ultimate goal is to avoid issues caused by users using 6- and 8-pin splitters to power higher-end cards. I seem to vaguely recall that some of the RX 5700 crash issues were being blamed on how the auxiliary power was being delivered to the card.


And you're acting like this is going to fix shitty PSU adapters?

People aren't going to suddenly have a change of heart and buy a new PSU. They're just going to go buy new adapters.

Unless the power consumption of these new cards doubles, you can't expect people to treat this new connector any more seriously than the last one.

And increasing the power consumption of midrange GPUs to 400W is totally going to sell in gaming laptops, right? You realize how impossible this whole thing is for Nvidia's consumer market, right?

You do know which gaming market is Nvidia's highest margin, right?

I'm not saying that this connector doesn't exist, but it won't appear on consumer cards anytime soon. This was likely an engineering sample.

The other reason this is horseshit: the spec for the 8-pin PCIe plug shipped in Jan 2007, while the first graphics card with the plug shipped six months later (the 2900 XT).

https://www.anandtech.com/show/2231/18

We don't even have this spec released yet, so we're at least a year away.
 
And you're acting like this is going to fix shitty PSU adapters?

People aren't going to suddenly have a change of heart and buy a new PSU. They're just going to go buy new adapters.

Unless the power consumption of these new cards doubles, you can't expect people to treat this new connector any more seriously than the last one.

And increasing the power consumption of midrange GPUs to 400W is totally going to sell in gaming laptops, right? You realize how impossible this whole thing is for Nvidia's consumer market, right?

You do know which gaming market is Nvidia's highest margin, right?

I'm not saying that this connector doesn't exist, but it won't appear on consumer cards anytime soon. This was likely an engineering sample.

The other reason this is horseshit: the spec for the 8-pin PCIe plug shipped in Jan 2007, while the first graphics card with the plug shipped six months later (the 2900 XT).

https://www.anandtech.com/show/2231/18

We don't even have this spec released yet, so we're at least a year away.

The speculation is that Nvidia created the 12-pin spec and is petitioning to have it added to PCIe and ATX. More than likely they've designed it for some high-power datacenter products (e.g. two or more GPUs on one card or slot, powered by a single cable) and are just pushing it down the product stack to reduce costs.
 
The speculation is that Nvidia created the 12-pin spec and is petitioning to have it added to PCIe and ATX. More than likely they've designed it for some high-power datacenter products (e.g. two or more GPUs on one card or slot, powered by a single cable) and are just pushing it down the product stack to reduce costs.


Of course -- how else is Nvidia supposed to power their 400W-rated Ampere supercomputer GPUs?

It's still not on the official spec sheet yet, which means this is an engineering sample; the 8-pin PCIe power standard's definition predated the release of the HD 2900 XT by five months, so this connector will not be involved in consumer Ampere.

I agree that this may be the future of video card power, but it's still a long way from that.
 
Of course -- how else is Nvidia supposed to power their 400W-rated Ampere supercomputer GPUs?

It's still not on the official spec sheet yet, which means this is an engineering sample; the 8-pin PCIe power standard's definition predated the release of the HD 2900 XT by five months, so this connector will not be involved in consumer Ampere.

I agree that this may be the future of video card power, but it's still a long way from that.

Two 8-pin or three 6-pin connectors aren't allowed by the PCIe spec, yet cards have them. ATX12VO was just released, but OEMs have been producing systems with 12V-only PSUs for quite a few years now. High-powered USB charging was being implemented long before there was a USB power spec. So there's nothing stopping Nvidia from doing what they want and trying to push the industry towards their designs (something they've done time and time again).
 
Of course -- how else is Nvidia supposed to power their 400W-rated Ampere supercomputer GPUs?

It's still not on the official spec sheet yet, which means this is an engineering sample; the 8-pin PCIe power standard's definition predated the release of the HD 2900 XT by five months, so this connector will not be involved in consumer Ampere.

I agree that this may be the future of video card power, but it's still a long way from that.

The A100 PCIe is 250 Watt TDP.
The A100 HGX-2 is 400 Watt TDP (no Molex).

The datasheets are public.

So to answer your question:
HGX-2.
 
The A100 PCIe is 250 Watt TDP.
The A100 HGX-2 is 400 Watt TDP (no Molex).

The datasheets are public.

So to answer your question:
HGX-2.

Well, you realize that they capped a 400W card at 250W to fit within PCIe power specs? I'm sure this connector was intended to remove that cap.
 
Well, you realize that they capped a 400W card at 250W to fit within PCIe power specs? I'm sure this connector was intended to remove that cap.

Datacenter != consumer space... no Molexes in the datacenter.
They did the same with Volta, FYI.

Look at their DGX.
 
No way they give us cards that take much more power than they do now... the stock HSF would have to be huge, unless they go WC-only and then the thing would be $3K.
 
No way they give us cards that take much more power than they do now... the stock HSF would have to be huge, unless they go WC-only and then the thing would be $3K.
Have you seen the 3000 series coolers? The stock HSFs are huge.
 
If Big Navi is 30-50% over the 2080 Ti and Nvidia knows it, they will jump on more watts to get as much as they can to be competitive or better. The information about this 12-pin connector leads me to think in this direction.

But we just need to wait and we will see soon :)
 
No way they give us cards that take much more power than they do now... the stock HSF would have to be huge, unless they go WC-only and then the thing would be $3K.

Stock cooling is huge and water cooling is cheap
 
HA! Not a rumour, and I was right - the adapter requires you to have two 8-pins available to begin with. So, Nvidia is just changing stuff for no reason.

https://videocardz.com/newz/seasonic-confirms-nvidia-rtx-30-series-12-pin-adapter

I disagree entirely on it being for no reason. As power demands change, it makes sense to change and simplify the connector. By that logic, we should have just been sticking three or four Molexes on cards all along instead of coming out with the PCI-E 6-pin at all.

There is little to no downside here. Power is easily and cheaply adaptable. An adapter will come with every GPU sold, probably for the next decade or more (we're still occasionally seeing Molex adapters with cards, what, 14 years on now?). This alienates absolutely no one and irritates only the absolute worst of cable snobs. As power supply manufacturers update their products, adapters will no longer be necessary, and life will go on. The 6+2 connectors are a pain in the ass. Two of them is a double pain in the ass. Seeing them reduced to a single 12-pin cable is a lot more convenient.
 
Not entirely no reason. They want the power delivery, but not the associations three connectors' worth of power would bring.

Bonus is that it keeps cabling cleaner.
Where are you getting "3 connectors?" This replaces two 8-pins, not three. The 12-pin - which has six power pins and six ground pins - is capable of the same power delivery as two 8-pins - which also have six power pins and six ground pins.
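A quick back-of-the-envelope check of that equivalence (assuming the same Mini-Fit-class terminals on both layouts, taken here at an assumed 8.5A per pin; the exact rating varies by terminal series):

```python
# Two 8-pin PCIe plugs vs one 12-pin: both layouts end up with six 12V
# conductors, so with identical terminals the raw capacity is the same.

RAIL_VOLTS = 12.0
AMPS_PER_PIN = 8.5   # assumed Mini-Fit-class terminal rating (varies by series)

def theoretical_watts(power_pins: int) -> float:
    """Raw capacity, ignoring spec limits and derating."""
    return power_pins * AMPS_PER_PIN * RAIL_VOLTS

print(f"2x 8-pin (3 power pins each): {theoretical_watts(2 * 3):.0f} W")
print(f"1x 12-pin (6 power pins):     {theoretical_watts(6):.0f} W")
# Both print the same figure; the difference is packaging, not capability.
```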

The single-cable "clean" thing is a non-starter, because it's only "cleaner" if you order custom cables, else you'll be using an adapter from Nvidia, and if you're ordering custom cables already then you can get two clean 8-pins with combs.

It's a minor gripe, to be sure, but there was no technical reason they needed to do this.
 
As power demands change, it makes sense to change and simplify the connector.
Agreed, but power demands haven't changed here. The fact that the adapter turns two 8-pins into a single 12-pin proves that two 8-pins would have supplied enough.
 
Agreed, but power demands haven't changed here. The fact that the adapter turns two 8-pins into a single 12-pin proves that two 8-pins would have supplied enough.

Perhaps they are looking to the future? Maybe Nvidia thinks two 8-pins will someday be insufficient? If so, perhaps better to introduce a change before it's necessary and let power supply manufacturers adapt proactively? Maybe it's better from an engineering or manufacturing standpoint? I certainly don't know, I'm just speculating, but I fail to see a downside. Like I said above, if for nothing else at all, I hate fighting with those stupid 6+2 connectors. Maybe Nvidia is just sick of them too? Maybe it's something power supply manufacturers or system builders have been looking for?
 
Where are you getting "3 connectors?" This replaces two 8-pins, not three. The 12-pin - which has six power pins and six ground pins - is capable of the same power delivery as two 8-pins - which also have six power pins and six ground pins.

The single-cable "clean" thing is a non-starter, because it's only "cleaner" if you order custom cables, else you'll be using an adapter from Nvidia, and if you're ordering custom cables already then you can get two clean 8-pins with combs.

It's a minor gripe, to be sure, but there was no technical reason they needed to do this.
The actual spec allows for more amperage per power wire, so in theory this new 12 pin is capable of more amps than two 8 pin connectors.

Seriously this is about reducing connector counts and nothing more.
 
Perhaps they are looking to the future? Maybe Nvidia thinks two 8-pins will someday be insufficient? If so, perhaps better to introduce a change before it's necessary and let power supply manufacturers adapt proactively? Maybe it's better from an engineering or manufacturing standpoint? I certainly don't know, I'm just speculating, but I fail to see a downside. Like I said above, if for nothing else at all, I hate fighting with those stupid 6+2 connectors. Maybe Nvidia is just sick of them too? Maybe it's something power supply manufacturers or system builders have been looking for?
Personally, I find the 6+2 connectors a lot less of a hassle than an adapter I have to put between the PSU and GPU, but I can concede that's personal taste. To me, that's the downside - my PSU is already equipped with connectors that are capable of delivering enough power, but Nvidia is manufacturing a card that they can't plug into.
The actual spec allows for more amperage per power wire, so in theory this new 12 pin is capable of more amps than two 8 pin connectors.

Seriously this is about reducing connector counts and nothing more.
That makes zero sense, electrically. If the 12-pin were rated for more current, then Nvidia could not safely include a 2x8-pin to 1x12pin adapter - the same amount of current has to flow through the entire circuit, so if the 2x8pins were insufficient before, adding an adapter that physically rearranges the pins doesn't magically increase the ampacity of the wire or connectors.

The only point I can see to this is eliminating the sense pins, since those apparently are used little enough in application that Nvidia determined they weren't needed.

These aren't warpath posts, by the way. I'm still buying a 3080ti/3090, and I'll eventually get the Cablemod set to make my RM850 "natively" compatible with the card. I'm annoyed at making changes to standardized systems without a pretty compelling technical reason, and I just don't see one here.
 
Personally, I find the 6+2 connectors a lot less of a hassle than an adapter I have to put between the PSU and GPU, but I can concede that's personal taste. To me, that's the downside - my PSU is already equipped with connectors that are capable of delivering enough power, but Nvidia is manufacturing a card that they can't plug into.

That makes zero sense, electrically. If the 12-pin were rated for more current, then Nvidia could not safely include a 2x8-pin to 1x12pin adapter - the same amount of current has to flow through the entire circuit, so if the 2x8pins were insufficient before, adding an adapter that physically rearranges the pins doesn't magically increase the ampacity of the wire or connectors.

The only point I can see to this is eliminating the sense pins, since those apparently are used little enough in application that Nvidia determined they weren't needed.

These aren't warpath posts, by the way. I'm still buying a 3080ti/3090, and I'll eventually get the Cablemod set to make my RM850 "natively" compatible with the card. I'm annoyed at making changes to standardized systems without a pretty compelling technical reason, and I just don't see one here.
It's not that they can't supply more power; the spec doesn't allow for it.
 
It's not that they can't supply more power; the spec doesn't allow for it.
Again - if the spec on two 8-pins doesn't allow for more power delivery, then adding an adapter to the end of two 8-pins doesn't change the 8-pins or the spec. If the 12-pin allows for more, that's one thing - the spec may have been written with more stringent requirements on pin type, crimp, wire stranding or gauge, etc. - that's believable. But if Nvidia are willing to just throw an adapter in the box, that proves that two 8-pins could deliver sufficient power.
 
That makes zero sense, electrically. If the 12-pin were rated for more current, then Nvidia could not safely include a 2x8-pin to 1x12pin adapter - the same amount of current has to flow through the entire circuit, so if the 2x8pins were insufficient before, adding an adapter that physically rearranges the pins doesn't magically increase the ampacity of the wire or connectors.

The only point I can see to this is eliminating the sense pins, since those apparently are used little enough in application that Nvidia determined they weren't needed.

These aren't warpath posts, by the way. I'm still buying a 3080ti/3090, and I'll eventually get the Cablemod set to make my RM850 "natively" compatible with the card. I'm annoyed at making changes to standardized systems without a pretty compelling technical reason, and I just don't see one here.
Again - if the spec on two 8-pins doesn't allow for more power delivery, then adding an adapter to the end of two 8-pins doesn't change the 8-pins or the spec. If the 12-pin allows for more, that's one thing - the spec may have been written with more stringent requirements on pin type, crimp, wire stranding or gauge, etc. - that's believable. But if Nvidia are willing to just throw an adapter in the box, that proves that two 8-pins could deliver sufficient power.
I think the main thing will come down to whether they change the spec from 20AWG wire to 16AWG or 18AWG. If they didn't change it to at least 16AWG, then it was a change about saving pennies rather than adding tangible benefits. With 16AWG they get less resistance (well under half per foot), more than double the copper, and added protection against bends or wire breaks inside the jacket.
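For the curious, a small sketch comparing those gauges from copper resistivity (solid-wire approximation at room temperature; real stranded cable will differ slightly):

```python
# Approximate per-foot resistance of 20AWG vs 18AWG vs 16AWG copper wire,
# computed from resistivity and the nominal AWG diameter formula.

import math

RHO_COPPER = 1.72e-8     # ohm*metre at ~20C
METRES_PER_FOOT = 0.3048

def awg_diameter_m(awg: int) -> float:
    # Standard AWG formula: diameter_inches = 0.005 * 92**((36 - awg) / 39)
    return 0.005 * 92 ** ((36 - awg) / 39) * 0.0254

def milliohms_per_foot(awg: int) -> float:
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_COPPER * METRES_PER_FOOT / area * 1000

for awg in (20, 18, 16):
    print(f"{awg}AWG: ~{milliohms_per_foot(awg):.1f} mOhm/ft")
# 20AWG lands near 10 mOhm/ft and 16AWG near 4 mOhm/ft, so the thicker
# wire has well under half the resistance: less drop, less heat.
```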
 
Again - if the spec on two 8-pins doesn't allow for more power delivery, then adding an adapter to the end of two 8-pins doesn't change the 8-pins or the spec. If the 12-pin allows for more, that's one thing - the spec may have been written with more stringent requirements on pin type, crimp, wire stranding or gauge, etc. - that's believable. But if Nvidia are willing to just throw an adapter in the box, that proves that two 8-pins could deliver sufficient power.

The pins used in all PCIe power connectors are rated for 9 amps each, which means they can handle up to 108 watts per pin, even in the 6-pin connector. So yes, two 8-pins are capable of supplying the same power as this 12-pin... however, the spec was either short-sighted or has simply outlived its assumptions, and it didn't allow for it. If the card draws more power than the spec allows, it won't be certified. Without certification, they can't say it's a PCIe device (which is why I believe that the next round of consumer products will not have this connector).
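Putting rough numbers on that (the 9A per-pin figure is the one claimed above; 75W and 150W are the familiar PCIe spec ceilings for the 6-pin and 8-pin):

```python
# Theoretical connector capacity at 9A per 12V pin vs the much lower
# limits the PCIe spec actually certifies.

RAIL_VOLTS = 12.0
AMPS_PER_PIN = 9.0

connectors = {
    "6-pin PCIe": {"power_pins": 3, "spec_limit_w": 75},   # sometimes only two 12V positions populated
    "8-pin PCIe": {"power_pins": 3, "spec_limit_w": 150},
    "12-pin":     {"power_pins": 6, "spec_limit_w": None},  # no published spec yet
}

for name, c in connectors.items():
    theoretical = c["power_pins"] * AMPS_PER_PIN * RAIL_VOLTS
    spec = c["spec_limit_w"]
    note = f"{spec} W spec limit" if spec else "no spec limit published"
    print(f"{name}: ~{theoretical:.0f} W theoretical, {note}")
```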
 
I think the main thing will come down to whether they change the spec from 20AWG wire to 16AWG or 18AWG. If they didn't change it to at least 16AWG, then it was a change about saving pennies rather than adding tangible benefits. With 16AWG they get less resistance (well under half per foot), more than double the copper, and added protection against bends or wire breaks inside the jacket.

20AWG Cu wire has an ampacity of 11A and a resistance of roughly 0.01 ohms per foot. Two feet of wiring will drop 12.0V to about 11.83V at 600W. It's not going to be an issue unless you're already running at the lower end of the spec, in which case you need a better supply.
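A minimal sketch of that drop calculation (supply-side wires only, assuming six 12V conductors sharing the load and roughly 10 mOhm per foot for 20AWG):

```python
# Voltage drop for a 600W load on the 12V rail, shared across six 20AWG wires.

LOAD_W = 600.0
RAIL_VOLTS = 12.0
POWER_WIRES = 6
OHMS_PER_FOOT = 0.010    # approximate for 20AWG copper
LENGTH_FT = 2.0

amps_per_wire = (LOAD_W / RAIL_VOLTS) / POWER_WIRES      # ~8.3 A per wire
drop_v = amps_per_wire * OHMS_PER_FOOT * LENGTH_FT       # ~0.17 V
print(f"{amps_per_wire:.1f} A per wire, {drop_v:.2f} V drop, "
      f"{RAIL_VOLTS - drop_v:.2f} V at the card")
# The ground return adds a comparable drop on top of this supply-side figure.
```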
 
Y'all missed the part where it says Nvidia on the box... instead of using an open standard, all PSU manufacturers probably have to pay Nvidia to use that cable now...
 
Y'all missed the part where it says Nvidia on the box... instead of using an open standard, all PSU manufacturers probably have to pay Nvidia to use that cable now...
I don't think Nvidia would be very successful in trying to charge royalties for other manufacturers using Molex© Mini-fit™ connectors. It likely says Nvidia on the Seasonic©™® box because Nvidia is the only company using that particular configuration in the PC hardware space right now.
 
4x the power of a 6-pin with only twice the size? Seems like a win.

Wonder if there will be any older PSUs that need to run 4x 6-pin to 2x 8-pin to 1x 12-pin....

Concentrate all firepower to that Super 3090!
 
So I first wanted to start my response by adding this article from Tom's Hardware on the subject: https://www.tomshardware.com/news/s...ector-lists-850w-psu#xenforo-comments-3639319 It's a good read and adds a little more to the conversation.

My response starts with Seasonic already confirming the cable. Likely there are two ways this cable is going to be made available, at least at the start: either you order the cable from a PSU manufacturer directly and they ship it to you, or (as many have already stated) it comes as an adapter with the new RTX 30 series cards. What Seasonic also added in the Tom's Hardware article is that "the cable is still in testing" at the moment. The article also makes clear that you need at least an 850W PSU at this time to be able to use the cable.

So while they have confirmed the cable and it's in testing, there are a lot of questions on my mind. They start with this: when you receive the cable, will you be able to plug it into a current modular PSU? Next, how many RTX 30 cards will use the new power connector? Certainly the 3090 will, but how many more? Finally, how long after the RTX 30 series launches will this cable just come standard in the cable package for PSUs? That's what is on my mind, and until next time, I am out!
 
Well, Nvidia can always pull an Apple (monitor stand -> $999) or a Monster Cables: buy the right cable from the Nvidia store for $49... no, $79... no, the gold-plated, supreme GeForce design for $119. I guess we will find out once Nvidia launches the cards what Nvidia will want us to believe.
 
Not sure if this image is real, but considering how many other things have leaked, it probably is.

[attached image]


Honestly, if it's that small and that much more bendable, then I'm all for it. The most annoying part of cable management right now is the GPU cables anyway.
I doubt that I'll ever buy a non-modular PSU ever again, and I'm sure they can figure out some way to make this work with my current PSU.
 