Nvidia RTX 4090 power connectors melting?

It looks good, but we can't see any bends past the sheath around the cable. The recommendation is no bends for the first 3.8 cm from the connector.
If you carefully remove the tape/sheath that is around all of the wires (it can be put back on later) and then slide up the individual wire sheaths, you can see whether it's a 150V-rated or 300V-rated cable. The adapters with the bad design were made with 150V-rated wires. (The voltage rating isn't the issue itself, but it can be used to identify suspect adapters.) You only need to find the label on a single wire. Instructions:

I don't know that any further visual analysis can be made without tearing further into it.

The connectors manufactured like the one on the left are the ones that have been identified as failing; the telltale sign is that the melting happens on the sides. I don't believe any failures have been reported for adapters made like the one on the right.
[Attached image: comparison photo of the two adapter constructions]
 
You aren't going to get an arc off 12V. It needs around 400V to jump even a 0.03mm gap.

Here's a nifty calculator:

https://www.cirris.com/learning-center/calculators/50-high-voltage-arc-gap-calculator
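If anyone wants to see where a number like that comes from, here's a minimal Paschen's-law sketch for a small air gap. The gas coefficients and the secondary-emission value are generic textbook assumptions, and real connector geometry (sharp edges, contamination) will shift the result, so treat it as order-of-magnitude only:

```python
import math

A = 15.0      # ionisation constant for air [1/(cm*Torr)] (textbook value, assumed)
B = 365.0     # field constant for air [V/(cm*Torr)] (textbook value, assumed)
GAMMA = 0.01  # assumed secondary-electron emission coefficient of the cathode

def paschen_breakdown_voltage(gap_cm: float, pressure_torr: float = 760.0) -> float:
    """Approximate DC breakdown voltage of a uniform-field air gap (Paschen's law)."""
    pd = pressure_torr * gap_cm
    return B * pd / (math.log(A * pd) - math.log(math.log(1.0 + 1.0 / GAMMA)))

gap_mm = 0.03
v_bd = paschen_breakdown_voltage(gap_mm / 10.0)   # 0.03 mm = 0.003 cm
print(f"~{v_bd:.0f} V to break down a {gap_mm} mm air gap")   # ~415 V
# A 12 V rail is nowhere near that, so an arc across an open gap isn't the
# failure mode; heat at a poor contact (or contacts separating under load) is.
```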

The image posted above showing a crooked connection is more likely to be the cause of your atypical resistance problem. The GPU is going to pull a given wattage through that wire, but due to poor contact the overall resistance at that junction is much higher. It will pull more current through fewer contact points; in the case of the burnt connector, that's at the tip of the plug, which matches where the melting is occurring. I think that, due to the cheaper split female terminal, any lateral load spreads the split wider, letting the pin sit crooked as in the picture and creating the hot spots.
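To put rough numbers on that hot-spot idea, here's a quick sketch of the heat generated right at one pin contact. The per-pin current assumes a 600 W load shared evenly across six 12 V pins, and the milliohm values for a healthy versus a badly mated terminal are assumptions, not measurements:

```python
# I^2 * R heating at a single pin contact, nominal vs. degraded mating.
# The contact resistances are illustrative assumptions, not measurements.
RAIL_V = 12.0
LOAD_W = 600.0
POWER_PINS = 6                                   # 12VHPWR carries 12 V on six pins

i_per_pin = LOAD_W / RAIL_V / POWER_PINS         # ~8.3 A if shared evenly

for label, r_contact in [("healthy contact, ~5 mOhm (assumed)", 0.005),
                         ("poor contact,  ~50 mOhm (assumed)", 0.050)]:
    p_contact = i_per_pin ** 2 * r_contact       # watts dissipated right at the contact
    print(f"{label}: {p_contact:.2f} W at the pin")
# A few watts concentrated in one tiny terminal inside a nylon housing is
# plenty to soften the plastic over time, which fits the localized melting.
```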
As a TIG welder, I can tell you that you don't need 400V to jump even a much larger gap than 0.03mm. The better TIG welders use a high-frequency start that ionizes the gas (air, argon, etc.) in the gap between electrode and base metal to allow current to flow and arc. While the power to the card is DC, a TIG welder on carbon or stainless steel is DC too, and you can still superimpose a high-frequency voltage fluctuation to strike an arc without touching the electrode to the material, which gives a cleaner weld (leaving no tungsten particles, or far fewer).

How that relates to a GPU, lol, is anyone's guess; the point is that an arc can be struck across a larger gap at lower voltages. And while the GPU is DC, it does have very fast fluctuations of current flow, i.e. high-frequency noise. Just saying/pointing out.

The other factor I have not seen discussed is cycling: heating up and cooling down causes expansion and contraction. Regardless of what people think, those pins do move around in the socket as they heat up and cool down, even if the movement isn't visible. If the pins and socket are made of different metals or alloys, that compounds the movement. These connectors are smaller and yet pass even more current, so they heat up much more, and the smaller size leaves less surface area to dissipate the heat. So I suspect their expansion and contraction is much greater than in a larger, lower-amperage connector.
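For a rough sense of scale on the expansion/contraction point, here's a sketch using assumed materials, an assumed engagement length, and an assumed temperature swing between idle and full load; only the order of magnitude matters:

```python
# Rough thermal-expansion numbers for one heat/cool cycle of a mated terminal.
# Alloys, engagement length and temperature swing are all assumed here, purely
# to get an order of magnitude.
ALPHA_PIN    = 19.0e-6   # 1/K, brass (assumed pin material)
ALPHA_SOCKET = 17.8e-6   # 1/K, phosphor bronze (assumed socket material)
ENGAGEMENT_M = 0.005     # ~5 mm of mated contact length (assumed)
DELTA_T      = 40.0      # K, assumed swing between idle and full GPU load

growth_pin    = ALPHA_PIN * ENGAGEMENT_M * DELTA_T
growth_socket = ALPHA_SOCKET * ENGAGEMENT_M * DELTA_T

print(f"pin grows    ~{growth_pin * 1e6:.1f} um per cycle")
print(f"socket grows ~{growth_socket * 1e6:.1f} um per cycle")
print(f"relative movement between the dissimilar metals "
      f"~{abs(growth_pin - growth_socket) * 1e6:.2f} um per cycle")
```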
 
As a TIG welder, I can tell you that you don't need 400V to jump even a much larger gap than 0.03mm. The better TIG welders use a high-frequency start that ionizes the gas (air, argon, etc.) in the gap between electrode and base metal to allow current to flow and arc. While the power to the card is DC, a TIG welder on carbon or stainless steel is DC too, and you can still superimpose a high-frequency voltage fluctuation to strike an arc without touching the electrode to the material, which gives a cleaner weld (leaving no tungsten particles, or far fewer).

How that relates to a GPU, lol, is anyone's guess; the point is that an arc can be struck across a larger gap at lower voltages. And while the GPU is DC, it does have very fast fluctuations of current flow, i.e. high-frequency noise. Just saying/pointing out.

The other factor I have not seen discussed is cycling: heating up and cooling down causes expansion and contraction. Regardless of what people think, those pins do move around in the socket as they heat up and cool down, even if the movement isn't visible. If the pins and socket are made of different metals or alloys, that compounds the movement. These connectors are smaller and yet pass even more current, so they heat up much more, and the smaller size leaves less surface area to dissipate the heat. So I suspect their expansion and contraction is much greater than in a larger, lower-amperage connector.

You can also get arcs just from a crappy connection. Not like it needs to jump to a completely different pin.
 
How that relates to a GPU, lol, is anyone's guess; the point is that an arc can be struck across a larger gap at lower voltages. And while the GPU is DC, it does have very fast fluctuations of current flow, i.e. high-frequency noise. Just saying/pointing out.

I've shot sparks with toy cars running on AAs. A zot here, a zot there, and yeah, melted connector insulation, makes sense.

And I do believe it's enough to be a fire hazard on a long enough timeline.
 
is mine plugged in sufficiently?

---
"According to Corsair PSU Expert Jon Gerow (Johnguru), the issue of melting power connectors may be as simple as wrong connection between the cable/adapter and the connector on the graphics card. There should be no gap between the cable and connector, as shown below." https://videocardz.com/newz/mind-th...mmends-how-to-insert-12vhpwr-cables-correctly

[Attachment 525484: photo of the poster's 12VHPWR connection]
I checked mine last night and it looks similar to yours. I noticed it had a slight gap and the retention clip on the adapter wasn't fully clipped. I tried to push it in more but it wouldn't budge. Then I tried to reseat it and still couldn't get it to go in further. I didn't want to push it any harder so I left it how it was. The adapter and connector were fine.
 
I checked mine last night and it looks similar to yours. I noticed it had a slight gap and the retention clip on the adapter wasn't fully clipped. I tried to push it in more but it wouldn't budge. Then I tried to reseat it and still couldn't get it to go in further. I didn't want to push it any harder so I left it how it was. The adapter and connector were fine.
thanks for the confirmation
 
I checked mine last night and it looks similar to yours. I noticed it had a slight gap and the retention clip on the adapter wasn't fully clipped. I tried to push it in more but it wouldn't budge. Then I tried to reseat it and still couldn't get it to go in further. I didn't want to push it any harder so I left it how it was. The adapter and connector were fine.
Dielectric grease to lube up the connector.
 
I'm surprised I haven't seen this suggested before.

I feel stupid for not thinking of it, too.
C'mon man, this is [H]ardOCP. We lube up every inch of a PC with dielectric grease around these parts! Get [H]ard or get out! :ROFLMAO:

But seriously, that's not a terrible idea, unless you still have a warranty; then I imagine a manufacturer might void it if they found grease in the power receptacle. :LOL:
 
C'mon man, this is [H]ardOCP. We lube up every inch of a PC with dielectric grease around these parts! Get [H]ard or get out! :ROFLMAO:

But seriously, that's not a terrible idea, unless you still have a warranty; then I imagine a manufacturer might void it if they found grease in the power receptacle. :LOL:
what about using some degreaser for the warranty support?
 
what about using some degreaser for the warranty support?
Maybe if EVGA started selling degreaser, then perhaps other manufacturers might be lenient on warranty support. Could be their chance to stay relevant in the PC enthusiast space.
 
Dielectric grease to lube up the connector.
Absolutely this. As someone who works on cars probably more than computers, dielectric grease is a wonderful thing. It prevents arcing, which is what causes melting at electrical connection points. It also keeps connections from corroding over time where weather might affect them. Plus it has the expected properties of grease. I ended up purchasing the CableMod 4-1 adapter for my 4090, which is awesome in my huge case. I'm not worried. But all that being said, if wires are crossed you're screwed. If you're applying pressure on connection points that could potentially break and touch, even dielectric grease won't help. But this isn't anything new. The same thing would happen with any PSU connector that isn't up to par. I don't know where I'm going with this other than that I'm a huge proponent of dielectric grease. Anytime I take anything electrical apart in my 1989 Land Cruiser it gets some.
 
for your consideration of how to properly use dielectric grease
https://rxmechanic.com/what-is-dielectric-grease/
Best write-up on dielectric grease I have ever read.

Also, if you are going to use this on a 12VHPWR plug, do NOT put the grease on the actual metal electrical connections. Just use a tiny bit on the outside edges of the male plastic connector housing. It may very well help you get a full insertion.
 
Best write-up on dielectric grease I have ever read.

Also, if you are going to use this on a 12VHPWR plug, do NOT put the grease on the actual metal electrical connections. Just use a tiny bit on the outside edges of the male plastic connector housing. It may very well help you get a full insertion.
Everyone read this out loud and see what happens lmao
 
From what I understand, the 12VHPWR standard itself, which nvidia surely had a major part in designing, runs closer to the limit of what the cables are rated for. The classic 8 pins are officially rated at 150W but can theoretically pull 300W, giving a safety margin of 100%. This 12VHPWR is officially rated at 600W but can theoretically pull 684W, giving a safety margin of 15%. Couple that with the terrible dual-split connector pin on the nvidia adapter, which reduces contact in the pins when there's a cable bend, and the whole thing is just calling for trouble. Anything dealing with that kind of current should be overbuilt to cover QC variance and consumers messing things up (bending too much, not inserting it 100%, etc.), and it really isn't. This is probably what's causing some OEMs to mess up.
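Putting the numbers from that paragraph into a quick sketch (the 684 W figure corresponds to the commonly cited ~9.5 A per-pin terminal rating across six 12 V pins):

```python
# Rated vs. theoretical connector power, using the figures from the post above.
# "Theoretical" means every +12 V pin run at its commonly cited terminal rating.
def headroom_pct(rated_w: float, theoretical_w: float) -> float:
    return (theoretical_w - rated_w) / rated_w * 100.0

print(f"8-pin PCIe: rated 150 W, theoretical 300 W -> {headroom_pct(150, 300):.0f}% headroom")
print(f"12VHPWR:    rated 600 W, theoretical 6 pins * 9.5 A * 12 V = {6 * 9.5 * 12:.0f} W "
      f"-> {headroom_pct(600, 6 * 9.5 * 12):.0f}% headroom")   # ~14%, rounded to 15% above
```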

The connector should really be rated at 450W IMO, which is around 50% safety margin. 15% is way below acceptable and frankly BS. On well-built 2x8pins, it's really not a problem to send 400-450W, as they are electrically similar to the 12VHPWR connector but inherently stronger, with fatter pins and the option of custom sleeving with thicker wires, etc. But no, they are rated for 150W only, hence nvidia needs four of them on one end but a tiny little connector on the other end, which electrically makes zero sense, but that's the way these connectors are officially rated. All of this results in nvidia creating this stupid half-octopus monstrosity, which is the ugliest thing I've seen on a GPU in decades, and pointing the connector on the FE cards straight towards the side of the case, then prompting the user not to bend the cable, so obviously you have to leave the side panel open so it's on full display.

I'm actually a bit surprised that there has been no official statement from nvidia whatsoever on these reports of melting cables. For me, they are just as much to blame as the shoddy adapter manufacturers, if not more. People are still posting pretty much every day about their adapters melting even though they inserted it all the way and didn't bend it too much etc. This whole situation is a shit show.
 
I just don't get why they didn't add 4 pins to the existing 8-pin PCIe, or use the same pins and connector dimensions. It would've been a bit wider than the current 12VHPWR, but the 8- and 6-pin PCIe connectors are a known quantity and reliable.
 
..The classic 8 pins are officially rated at 150W but can theoretically pull 300W, giving a safety margin of 100%. This 12VHPWR is officially rated at 600W but can theoretically pull 684W, giving a safety margin of 15%....

The connector should really be rated at 450W IMO, which is around 50% safety margin. 15% is way below acceptable and frankly BS.

The rating you reference already has the safety margin built in. So what you are suggesting is a safety margin on top of the safety margin.
Running 15% below a 'rating' that already has a safety margin built in is fine.
 
MSI made an instagram post showing people how to properly install the cables (album w/ 3 pics).
Weird thing is, they didn't even show the GPU-side connector. They messed up and showed the PSU-side connector for the 35mm and full insertion pics.

Point #1: Fully insert your PSU into power sockets of the graphics card and the power supply unit.⁠
Point #2: Insert the power socket parallel to the graphics card and the power supply unit.⁠
Point #3: Reserve at least 35mm of space for the power supply cable for cable management.⁠

 
If you have to make a guide like that for these things not to melt - Good luck. It’s going to melt because you didn’t design a standard that is idiot proof to a certain degree.

If I were a board maker I’d go back to 8pin and design my boards with 4 8pins.
 
They also messed up and showed the PSU-side connector for the 35mm and full insertion pics.
I think a lot of these manufacturers are tiptoeing around the whole connector issue until Nvidia actually comes out with a formal statement on their findings, not just that they're investigating. So it doesn't surprise me that they didn't show the GPU side of the connector, because if they showed the GPU side, that would give people the sense that plugging the cable in fully is the solution to the problem.
I still believe that people not plugging in the connector fully is the root of the problem, but I can understand why the card manufacturers are not fully committing to any theory yet.
 
The rating you reference already has the safety margin built in. So what you are suggesting is a safety margin on top of the safety margin.
Running 15% below a 'rating' that already has a safety margin built in is fine.
That 15% safety margin has to take into account QC variance, end-user errors in terms of connection, abnormally high ambient temps, sharp bends, etc. With 8-pins we were used to a 100% safety margin, and GPUs were pulling 300W from 2x8 pins when they could theoretically pull 600W or more through them, so it was perfectly safe. With the 12VHPWR, nvidia is pulling 600W when it could theoretically pull 684W. It's funny to think people were being extra hard on AMD for pulling 400W through 2x8 pins a few years ago with the 295X2. Looking back at it, that wasn't anywhere near as close to the limit as this new standard.
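The same numbers expressed as current per pin, assuming an even split across the +12 V pins (three power pins per 8-pin connector, six on the 12VHPWR); the wattages are the ones discussed above:

```python
# Current per +12 V pin for the cases mentioned above, assuming an even split
# (three power pins on an 8-pin PCIe connector, six on a 12VHPWR).
RAIL_V = 12.0

def amps_per_pin(watts: float, connectors: int, pins_per_connector: int) -> float:
    return watts / RAIL_V / (connectors * pins_per_connector)

print(f"2x8-pin at 300 W (typical load):   {amps_per_pin(300, 2, 3):.1f} A/pin")
print(f"2x8-pin at 400 W (295X2 case):     {amps_per_pin(400, 2, 3):.1f} A/pin")
print(f"12VHPWR at 600 W (rated):          {amps_per_pin(600, 1, 6):.1f} A/pin")
print(f"12VHPWR at 684 W (theoretical):    {amps_per_pin(684, 1, 6):.1f} A/pin")
```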

In consumer electronics, a 15% safety margin is almost unheard of, especially when it comes to the innards of expensive stuff. In military applications, sure, there are safety margins under 15%, but that equipment is built to much, much stricter tolerances and installed by professionals.

If a 12VHPWR connector is very well built, sure, it should withstand a sustained 600W load or even more, as they are technically rated a bit higher than that. The problem is that not all adapters will be built the same, and there will be end-user issues; a combination of those is what's being seen in many reports.

Honestly it would make a tiny bit more sense if they were gaining a lot of performance for all that extra power. Turns out it's around 2%, and I just can't make any sense of what nvidia was thinking. They decided to make a terrible and finicky adapter with no regard for aesthetics or ease of use, forced users and AIB partners to use it, and caused people's connectors to melt. It's been less than a month and reports already seem to point at more than 100 cases, if not a lot more. It's absolutely bonkers and makes me think twice about purchasing their cards in the future. I've already had a 3090 brick itself from the stupid driver/game issue a year back, and now this 4090 issue makes me glad I didn't pick one up even when I had the chance. I wouldn't feel safe having that in my system, not by a long shot.
 
If you have to make a guide like that for these things not to melt - Good luck. It’s going to melt because you didn’t design a standard that is idiot proof to a certain degree.

If I were a board maker I’d go back to 8pin and design my boards with 4 8pins.
Does the new connector save board space for circuitry or just the connectors? I know a big driver of all the new usb connections was to save pcb space for just the connector. Curious if the new connector allows power circuitry to be smaller as well.
 
Does the new connector save board space for circuitry or just the connectors? I know a big driver of all the new usb connections was to save pcb space for just the connector. Curious if the new connector allows power circuitry to be smaller as well.
Depends on the board. On the reference design? Yeah, it likely helps out big time, since that board is a weird shape and ends well before the full length of the heatsink. However, as you can see with the unreleased EVGA 4090, they extend the board all the way out and keep it rectangular, and there is plenty of room to fit four 8-pins there.

So it's easily possible, but likely not with the reference PCB.

To me, this 16-pin connector exists only to support bottom-line cost-cutting measures. That's it. There is no other benefit to the consumer.

Also, you don't technically need four. They could have easily just made it three 8-pins, and you'd just hit the slightly lower 450-watt power target, which has very minimal performance loss anyway.

See - https://www.techpowerup.com/review/nvidia-rtx-4090-450w-vs-600w/

It really sucks that the 4080 is going to use this connector, because it would be perfectly fine with two normal 8 pins, and it's not as if two 8 pins use that much more room than a single new 16-pin.
 
Just a reminder that we've already had two native MSI cables melt, as well as a regular Seasonic cable. So all the goofs, gaffes, and memes shitting on Nvidia's adapter are basically misinformation at this point.
It could be an issue with the spec (in which case all cables will be at risk of melting). Or if we still want to blame Nvidia, they could've messed something up with the card-side implementation of the plug itself.

The fact of the matter is, nearly everyone running a 4090 is using the supplied Nvidia adapter, so nearly all instances of melted cables will be with that adapter.
As more people get their hands on 3rd party cables like MODDIY and CableMod, I suspect we will see instances of those cables melting within the next month or two. All bets are off when somebody posts a melted CableMod cable.
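A quick base-rate sketch of that point; the usage shares and failure rate are made-up illustrative numbers, and the only takeaway is that even with identical per-cable risk, almost every report would involve the bundled adapter:

```python
# Base-rate sketch: if nearly everyone uses the bundled adapter, nearly every
# melt report will involve it even if the per-cable risk were identical.
# The usage shares and failure rate below are made-up illustrative numbers.
FAILURE_RATE = 0.001                                  # assumed equal risk per cable
usage_share = {"Nvidia adapter": 0.95,
               "native / 3rd-party cable": 0.05}

total = sum(share * FAILURE_RATE for share in usage_share.values())
for cable, share in usage_share.items():
    print(f"{cable}: {share * FAILURE_RATE / total:.0%} of melt reports")
```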
 
I'd be curious what power supplies people are running with these melted cables. I know a few of the MSI cables "melted" with their new ATX 3.0 supplies and their single cable.
 
I'd be curious what power supplies people are running with these melted cables. I know a few of the MSI cables "melted" with their new ATX 3.0 supplies and their single cable.
Adding on to that, how do multi-rail power supplies respond when the different rails are shunted together in the connector? We need a power supply guru here.

From what I observed, previous GPUs with multiple PCIe connectors had separate power circuitry for each connector, while the new connector ties everything together, so there's one power circuit on the card behind the connector. The 3090 Ti didn't have this issue, so maybe it's a moot point. Unless its connector was split per 8-pin cable rather than tied together like on the 4090. I don't know.
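Not a PSU engineer either, but for the "everything tied together on one circuit" part, here's a sketch of how current divides when all six 12 V paths land on a common plane; the path resistances are assumed values, chosen only to show that one poor contact pushes its share onto the remaining pins:

```python
# Current sharing when all six +12 V paths feed one common plane on the card.
# Path resistances (pin contact + wire) are assumed values for illustration.
TOTAL_A = 50.0                              # ~600 W / 12 V
paths_ohm = [0.010] * 5 + [0.100]           # five healthy paths, one poor one (assumed)

conductances = [1.0 / r for r in paths_ohm]
g_total = sum(conductances)

for idx, (r, g) in enumerate(zip(paths_ohm, conductances), start=1):
    amps = TOTAL_A * g / g_total            # share carried by this path
    print(f"path {idx}: {amps:.2f} A, {amps * amps * r:.2f} W dissipated along it")
# The high-resistance path mostly offloads its current onto the other five,
# which then run at ~9.8 A each, above the ~9.5 A per-pin rating.
```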
 
Adding on to that, how do multi-rail power supplies respond when the different rails are shunted together in the connector? We need a power supply guru here.

From what I observed, previous GPUs with multiple PCIe connectors had separate power circuitry for each connector, while the new connector ties everything together, so there's one power circuit on the card behind the connector. The 3090 Ti didn't have this issue, so maybe it's a moot point. Unless its connector was split per 8-pin cable rather than tied together like on the 4090. I don't know.
That's an interesting idea. Now that I think about it in this light, it would be humorous if it came out that the actual cause of this drama was the PSUs. Food for thought, but I would also love to hear from an actual PSU guru or engineer about this.
 