4090 owners, how are you dealing with the power connector fiasco?

magda

You bought an RTX 4090, and because Nvidia wanted to save a couple of bucks, they made a small power connector, and someone didn't check the current levels for this design.

How are you dealing with this problem? Do you have a fire extinguisher next to your case?
 
You bought an RTX 4090, and because Nvidia wanted to save a couple of bucks, they made a small power connector, and someone didn't check the current levels for this design.

How are you dealing with this problem? Do you have a fire extinguisher next to your case?
Is this the thing that was solved with an angled connector? I'm sorta out of the loop.
 
No issues here, but I'm running a native PCIe 5.0 PSU on my Liquid Suprim.

edit: and with no contact or excessive bending of anything
 
Not wrestling with it; I'm getting the native Corsair 12VHPWR cable for my RMX1000 once they are available again. I also don't plan on pushing anything more than the stock 450W through my MSI; everything is maxed out at 3440 at 175Hz pretty much, so what's the point?
 
I ended up ordering the Molex connector and am going to make my own.
 
Not wrestling with it; I'm getting the native Corsair 12VHPWR cable for my RMX1000 once they are available again. I also don't plan on pushing anything more than the stock 450W through my MSI; everything is maxed out at 3440 at 175Hz pretty much, so what's the point?
If two 8-pins can do 600 watts, why does my 3080 have three 8-pin connectors?
And why does the Nvidia dongle have four?
[Attached image: Corsair's 600W 12VHPWR cable with two PSU-side connectors]
 
If two 8-pins can do 600 watts, why does my 3080 have three 8-pin connectors?
And why does the Nvidia dongle have four?
This is made by the power supply manufacturer only for their compatible power supplies, whereas the NVIDIA adapter is made for any jackass off the street connecting the 650W Bronze PSU they pulled from their HP corporate desktop to it.

EDIT: they're also not "8-pins". Those are the cable's connectors to the power supply itself, not standard PCIe 8-pins.
 
This is made by the power supply manufacturer only for their compatible power supplies, whereas the NVIDIA adapter is made for any jackass off the street connecting the 650W Bronze PSU they pulled from their HP corporate desktop to it.

EDIT: they're also not "8-pins". Those are the cable's connectors to the power supply itself, not standard PCIe 8-pins.
It's the same as a PCIe 8-pin with an upper and lower key difference. I just plugged my Corsair 12VHPWR PSU cable into the Nvidia dongle with a little force.
 
If two 8-pins can do 600 watts, why does my 3080 have three 8-pin connectors?
And why does the Nvidia dongle have four?
Also, your numbers are incorrect. An 8-pin is rated for 150W, a 6-pin for 75W, and the PCIe slot for 75W. So with three 8-pins plus the slot, you could theoretically pull 525W, not 600.
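If anyone wants to sanity-check that, here's a throwaway Python sketch of the math; the 150W/75W figures are the commonly cited PCIe limits from above, nothing measured on my end:

```python
# Rough board power budget for a card fed by the PCIe slot plus N 8-pin connectors.
# Uses the commonly cited limits: 150 W per standard 8-pin, 75 W from the slot.
PCIE_8PIN_W = 150
PCIE_SLOT_W = 75

def power_budget(num_8pin: int) -> int:
    """Theoretical max draw from the slot plus num_8pin standard 8-pin plugs."""
    return PCIE_SLOT_W + num_8pin * PCIE_8PIN_W

print(power_budget(3))  # 525 W for a 3x 8-pin card like the 3080, not 600 W
```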
 
Also, your numbers are incorrect. An 8-pin is rated for 150W, a 6-pin for 75W, and the PCIe slot for 75W. So with three 8-pins plus the slot, you could theoretically pull 525W, not 600.
That Corsair 600-watt cable I posted above only uses two 8-pin connectors to supply 600 watts to the 4090.
 
That Corsair 600-watt cable I posted above only uses two 8-pin connectors to supply 600 watts to the 4090.
Those 8-pins on that adapter are not rated the same as the 3x 8-pins your 3080 has. They may both be 8-pins, but they are not the same, nor are they rated for the same power delivery.

Your card can only handle 150W per 8-pin.
 
Those 8-pins on that adapter are not rated the same as the 3x 8-pins your 3080 has. They may both be 8-pins, but they are not the same, nor are they rated for the same power delivery.

Your card can only handle 150W per 8-pin.
Yes, it's a Corsair "Type 4" connection. It only connects to a compatible Corsair power supply. It is not to be connected to peripherals.
 
Those 8-pins on that adapter are not rated the same as the 3x 8-pins your 3080 has. They may both be 8-pins, but they are not the same, nor are they rated for the same power delivery.

Your card can only handle 150W per 8-pin.
This ^ It's a Corsair-specific Type 4 adapter with wire gauging that meets the 600W requirement; the number of pins is immaterial if the wires are of the proper gauge/quality to carry the necessary current. A standard 3-prong 120V cable can be rated for 5A or 15A, as an example.
 
I think the multiple plugs on the adapter are there in case the power supply has multiple +12V rails that are each below 600W. If you have a single-rail PSU, then the limiting factors would be the gauge of the wires and the pins in the connectors. An 18 AWG wire can carry between 10 and 15 amps depending on length and temperature. The metal pins in a PCIe connector are rated for about 9 amps each. 9A x 12V = 108W per wire, and three +12V lines per PCIe plug is a little over 300W per plug.
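Same back-of-the-envelope in Python, if it helps; the 9 A per pin and three +12V lines per plug are my assumptions from the paragraph above, not numbers pulled from any spec sheet:

```python
# Per-plug power estimate from the pin current rating, as described above.
PIN_CURRENT_A = 9      # assumed rating of a single PCIe connector pin
RAIL_VOLTAGE_V = 12
LINES_PER_PLUG = 3     # +12V wires in one PCIe plug

watts_per_wire = PIN_CURRENT_A * RAIL_VOLTAGE_V       # 108 W per wire
watts_per_plug = watts_per_wire * LINES_PER_PLUG      # 324 W, "a little over 300 W" per plug
print(watts_per_wire, watts_per_plug)
```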
 
I have no issues. I have a lot of room available in the case, so there were no tight bends on the cable on either the PSU or the 4090 end. I am going to order the Seasonic native cable as soon as they have more available, which will take care of some of the bulk that restricts airflow, but even as-is I am fine.
 
No issues with mine but I have received an ATX 3.0 PSU so I will be installing that tomorrow and getting rid of the dongle.
 
Not an owner, but if you followed anything on this, you know that those affected are a tiny tiny fraction out there (but Nvidia is definitely taking the issue seriously). You might as well say, if you have a Samsung phone, destroy it before it blows up (likely not going to happen).
 
Picked up a CableMod cable for my EVGA PSU. It's cleaner looking now. The Nvidia adapter was in the system 24/7 for a week and I had a lot of gaming sessions on it. The adapter's plug doesn't show any signs of overheating.
 
I have the Fasgear cable from Amazon. I didn’t have any issues with the adapter other than the bulky size of it. The Fasgear cable cleaned things up nicely and hasn’t burnt yet, but I haven’t pushed it to 600w and probably won’t.
 
I picked up a new PSU that has the single 12VHPWR connector. No splitter for me; individually wired cable.
 
So, I just bought a cheapo Amazon cable that should come in today. Mine is not melting as far as I can tell, but I've kept things off for the most part to avoid any problems later. I just took my "good" cable out; pictures below if you are interested:

i b b . co /album/mS0m0M

(remove spaces above, think ibb link is hard blocked)

I do wonder what is causing the indentations in the plastic bottom layer though, maybe from when they crimped it together?
 
I thought it was strange that the BQ adapter was just sitting in California for a full week. Newegg finally updated my shipping to show a one-week delay.
As a backup, someone recommended the adapter from the 3090 Ti, and I originally had that ordered from Amazon. But they too pushed back shipping by a week - and I already got the refund for that. Maybe I should've kept that order, hah.

Wonder if that's a coincidence lol.
 
I like the Seasonic connector.

They are giving it out for free to Seasonic power supply owners.
  1. Much better soldered
  2. Much better crimped
  3. Has a 90-degree angle
  4. Relocates the wire-combiner into a sleeve 2 inches away from the connector.
    (Keeps connector stress away from the wire-merging point -- aka redistribution of stress -- so crimps/solder/whatever are less likely to detach)
It's not perfect, but better than what I've seen included with RTX 4090s at this stage.

I don't know if it's a higher-temperature plastic, but (A) high-temperature plastic should start being used, IMHO, for 600-watt DC connectors of all kinds. And (B) GPU card designs should have a thermistor at the connector for a temperature alarm. Both (A) and (B) would give more self-protection from bad connectors, as well as better protection over long-term use (corrosion/rust over the years).

[Attached image: Seasonic's native 12VHPWR adapter cable]
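For what the temperature-alarm idea in (B) could look like in principle, here's a minimal Python sketch; the thermistor constants and the alarm threshold are made-up placeholder values for a generic 10k NTC, not anything an actual 4090 exposes:

```python
import math

# Hypothetical NTC thermistor sitting at the 12VHPWR connector.
# Beta-equation constants below are typical 10k NTC values, chosen for illustration.
R_NOMINAL = 10_000.0   # ohms at 25 C
T_NOMINAL = 298.15     # 25 C in kelvin
BETA = 3950.0
ALARM_C = 90.0         # arbitrary trip point for this sketch

def ntc_to_celsius(r_ohms: float) -> float:
    """Convert a measured NTC resistance to temperature via the Beta equation."""
    inv_t = 1.0 / T_NOMINAL + math.log(r_ohms / R_NOMINAL) / BETA
    return 1.0 / inv_t - 273.15

def check_connector(r_ohms: float) -> None:
    temp_c = ntc_to_celsius(r_ohms)
    if temp_c >= ALARM_C:
        print(f"ALARM: connector at {temp_c:.1f} C - throttle or shut down")
    else:
        print(f"Connector OK at {temp_c:.1f} C")

check_connector(8000.0)  # ~30 C, normal reading
check_connector(900.0)   # ~91 C, would trip the alarm
```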
 
The plug is the same on the card end, but the 4090 requires up to 600W, which requires 4 legs (instead of 3 like the 3090) to get full power and boost levels.
The 4090 adapter cable only has 2 legs connected.
 
So I have been watching the various video card review guys (Jayz, GamersNexus, Igor, etc.), and the latest guess as to what is happening comes from JohnnyGuru. His supposition is that a loose connection, or the connector not being completely seated, is the problem. The incomplete mating of the connectors increases the resistance, which increases the temperature. I tried it on mine yesterday, keeping the connector on but loose, and it was 20C hotter after 10 minutes (running TimeSpy) versus being completely seated. Obviously this is not a scientific test, but the IR thermometer did see a significant increase with the incomplete mating. It could be any or all of the possible issues that everyone has supposed, of course, but it is another avenue of exploration for minds greater than my own :)
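That theory is consistent with plain I²R heating; here's a quick back-of-the-envelope in Python. The milliohm values are made-up illustrative contact resistances, not measurements from anyone's connector:

```python
# Heat dissipated in the connector from extra contact resistance at 600 W total draw.
TOTAL_POWER_W = 600
RAIL_VOLTAGE_V = 12
POWER_PINS = 6                     # 12VHPWR carries the +12V over six pins

amps_per_pin = TOTAL_POWER_W / RAIL_VOLTAGE_V / POWER_PINS   # ~8.3 A per pin

for contact_milliohm in (1, 5, 20):   # well seated vs. poorly mated (illustrative)
    heat_w = amps_per_pin**2 * (contact_milliohm / 1000) * POWER_PINS
    print(f"{contact_milliohm} mOhm per pin -> ~{heat_w:.1f} W heating the connector")
```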
 
Not an owner, but if you followed anything on this, you know that those affected are a tiny tiny fraction out there (but Nvidia is definitely taking the issue seriously). You might as well say, if you have a Samsung phone, destroy it before it blows up (likely not going to happen).
The problem with that thinking is that with a problem like this, if it does happen, your house might burn down. This isn't something like the 2080 Ti space invaders issue; this is something that could end up killing people. So to expect people not to be constantly checking/worrying about this is absurd.
 
The problem with that thinking is that with a problem like this, if it does happen, your house might burn down. This isn't something like the 2080 Ti space invaders issue; this is something that could end up killing people. So to expect people not to be constantly checking/worrying about this is absurd.
Throw that Samsung phone away now!!!!
 
Throw that Samsung phone away now!!!!
You realize Samsung recalled those phones and replaced them with phones that didn't have the fire issue, correct? Yes, you should throw that Samsung phone away if you didn't get with Samsung to replace it for free. Only a complete idiot would be so dense as to ignore the warnings and hang onto those specific phones that had the issue.
 
You realize Samsung recalled those phones and replaced them with phones that didn't have the fire issue, correct? Yes, you should throw that Samsung phone away if you didn't get with Samsung to replace it for free. Only a complete idiot would be so dense as to ignore the warnings and hang onto those specific phones that had the issue.
You're right, of course.
 
The problem with that thinking is that with a problem like this, if it does happen, your house might burn down. This isn't something like the 2080 Ti space invaders issue; this is something that could end up killing people. So to expect people not to be constantly checking/worrying about this is absurd.
Depends on how many cases there are versus cards. Things are being sensationalized. There are also a lot of idiots out there.

If it were so pervasive, the YouTubers would be able to replicate it - but they’ve had difficulty doing so.

It’s a perfect storm as it’s a halo card that most don’t buy - so lots of piling on out there.
 