Worth upgrading to an ATX 3.0 PSU for Nvidia RTX 4000 series, or use what you have?

xDiVolatilX

2[H]4U
Joined
Jul 24, 2021
Messages
2,385
Looking for all RTX 4000 series owners to chime in here.
Are you wanting/needing to upgrade to an ATX 3.0 power supply, or are you fine with whatever you're using and see no reason to upgrade?

Has anyone experienced any issues using a non-ATX 3.0 power supply?

Trying to decide whether spending the extra $300-400 for ATX 3.0 is worth it or not.
 
I don't think it really matters. Do you hate adapters to the point where you want to spend considerable extra money to avoid using an adapter? Because ultimately the card will get the power it needs, even with the adapter.

I used a Thermaltake Toughpower 1000W PSU from ~2007 to late 2021. It saw a LOT of different hardware over that period of time. In late 2021 I finally "upgraded" to a Super Flower Leadex III Gold 850W. I got a great deal, only paid about $70 for it after lining up a few promo codes / coupons. It ran my 5900X + 2080 fine for a year, and runs my 5800X3D + 4080 fine now (with the included adapter). That also includes 4 mechanical hard drives, 3 NVMe SSDs, 14 120-140mm fans, and a 2nd small video card for my 5th and 6th monitors.

If my "cheap" 850W PSU can handle all of that, it's difficult for me to understand why anyone would pay $300-400 for a PSU. I'm 99.9% certain that my 850W would still have been just fine even if I had gone with a 4090 instead of a 4080. If you are paying that much for a PSU, you'd better have some very special power needs (mining rig with 4 video cards, etc.) or you are getting taken for a ride.
 
Also, if it is modular, some vendors have adapter cables. Seasonic has them for their Prime series, so you can get a cable that goes from the plugs on the PSU straight to a 12VHPWR connector.

As for why someone would pay that much: that's what a top-end PSU costs. Do you NEED top end? No, but if you want it, that's the price you're talking about. A Seasonic Prime Titanium 1000W runs about $330.
 
I would do what I already did... run what I have and go from there. I was going to order a Seasonic Vertex GX-1200 along with my 4090. Thankfully it was on back-order and I just checked out with the 4090. It's been running perfectly fine on my 750W EVGA SuperNOVA and I saved myself about $300. Nothing in my rig is overclocked. I just don't see the need for it these days, with both CPUs and GPUs more or less boosting to just shy of where they become unstable or grossly inefficient.

Played Warzone 2 for a few hours last night and MW2 multiplayer most of this morning/afternoon, all without a hitch. Also loaded up Cyberpunk with all the RTX fixings enabled to get those Tensor cores firing. Even did a FurMark stress test @ 106% power limit. I daily drive this thing at 70% power limit since the performance difference is only 2-3%.

A lot of people have an unfounded fear of "frying" their new card or entire system, but that really doesn't happen unless you're running a really crappy PSU that sends a surge through everything instead of simply tripping OCP. So basically you should be OK unless you're running those Gigabyte PSUs that blow up.
 
Well, the 13900KS can pull over 325W, maybe more. The 4090 can pull 450W with the triple cable; how much with the quadruple cable? That's almost 800W not including anything else. Kind of cutting it close? I know this is the worst case, but if I overclock the CPU that's over 300W all day, and if I let the GPU run wild at 100% or more that's 450W, perhaps more, under full load at 4K high frame rates. Let's not forget all the other shit like 15 fans and RGB bullshit everywhere, plus pumps and M.2s and peripherals, etc. Am I OK with 1050W? Probably. Am I reaching the upper 80% or 90% of the power supply's max? I think so. It'll be close to the limit even if it never hits the absolute max.
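Those rough numbers add up as a quick back-of-the-envelope budget. Every component figure below is an illustrative assumption pulled from the ballpark values in this post, not a measurement:

```python
# Rough worst-case DC power budget (illustrative numbers, not measurements).
loads_w = {
    "13900KS (heavy OC load)": 325,
    "RTX 4090 (stock power limit)": 450,
    "fans / RGB / pumps": 60,
    "drives + M.2 + peripherals": 40,
    "motherboard / RAM / misc": 75,
}

total = sum(loads_w.values())
psu_w = 1050
print(f"Estimated peak draw: {total} W ({total / psu_w:.0%} of a {psu_w} W PSU)")
# → Estimated peak draw: 950 W (90% of a 1050 W PSU)
```

Even with generous padding, a 1050W unit lands right around that 90%-of-rated-capacity mark this post is worried about.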
 
I'm not suggesting you buy an 800-watter. If you know you're going to buy something, I'd buy more than what you think you need. But if you're wondering whether what you have will work, that's a very easy thing to figure out: try it and find out. If you're too apprehensive to give it a go, then this question isn't really serving a purpose.
 
I'm not suggesting you buy an 800-watter. If you know you're going to buy something, I'd buy more than what you think you need. But if you're wondering whether what you have will work, that's a very easy thing to figure out: try it and find out. If you're too apprehensive to give it a go, then this question isn't really serving a purpose.
I have the Kill-A-Watt device right here; I just need to get another surge protector so I can fit the Kill-A-Watt and then the PC power into it. It doesn't fit the way the outlet is plugged in right now. I've been meaning to get one, and I will very soon. Then I'll have an exact reading of how much power my rig is pulling. I'll report back my findings.
 
The main benefit of using an ATX 3.0 PSU is the native 12VHPWR cable. It's a lot easier to deal with 1 cable vs. those adapters that make you plug more than just 1 cable into your PSU. Makes cable management a lot easier.
 
I'm using the same old EVGA 1300 G2 (basically a Super Flower Leadex rebadge) that I've had for what feels like a decade already, and it still powers cards like the RX 7900 XTX and RTX 4080 like a champ. Just gotta deal with that unsightly adapter, that's all.

Fortunately, there are third-party cables that are made for EVGA modular PSUs, so I can just order one of those to tidy things up.

I wouldn't worry about it too much on a decent modular ATX PSU. Maybe if you're thinking far in the future where the added sense pins might actually be required, it'd be a concern, but that time certainly isn't now when the current Ada Lovelace cards don't even use two of those sense pins on their bundled adapters.
 
I'm using the same old EVGA 1300 G2 (basically a Super Flower Leadex rebadge) that I've had for what feels like a decade already, and it still powers cards like the RX 7900 XTX and RTX 4080 like a champ. Just gotta deal with that unsightly adapter, that's all.

Fortunately, there are third-party cables that are made for EVGA modular PSUs, so I can just order one of those to tidy things up.

I wouldn't worry about it too much on a decent modular ATX PSU. Maybe if you're thinking far in the future where the added sense pins might actually be required, it'd be a concern, but that time certainly isn't now when the current Ada Lovelace cards don't even use two of those sense pins on their bundled adapters.
The 3 into 1 adapters that come with the 4000 series graphics cards don't utilize the 4 additional sense pins?
 
The 3 into 1 adapters that come with the 4000 series graphics cards don't utilize the 4 additional sense pins?
To be more exact, only two of them are used (SENSE0 and SENSE1), which are simple open/grounded signal pins that wouldn't require any special circuitry on the PSU side for the GPU to determine how much power it's allowed to use. Think of 'em as pre-soldered DIP switches.

It's the other two pins (CARD_PWR_STABLE and CARD_CBL_PRES#) that aren't implemented on anything yet. They're an optional part of the spec at the moment and completely unused by the current Ada Lovelace cards, but I can't rule out the possibility that GPUs in the future might need those signals implemented on a newer ATX 3.0 PSU to work properly for whatever reason.
 
I'm debating upgrading my PSU. Not because it would be an ATX 3.0, but because I might need more power down the line. After running OCCT power tests and various game benchmarks across 3x 4K displays (GPU 100% utilized!), I'm currently peaking at 770W with a 3900X & 4090 on a high-quality 1000W power supply. I doubt my test covers any additional load that hard drives, fans, AIO pumps, etc. could add as things heat up.

After comparing online reviewers' power figures, moving from a 3900X to a 13900K (or similar) might add another 200W of peak load. 1000W might not cut it.
 
I'm debating upgrading my PSU. Not because it would be an ATX 3.0, but because I might need more power down the line. After running OCCT power tests and various game benchmarks across 3x 4K displays (GPU 100% utilized!), I'm currently peaking at 770W with a 3900X & 4090 on a high-quality 1000W power supply. I doubt my test covers any additional load that hard drives, fans, AIO pumps, etc. could add as things heat up.

After comparing online reviewers' power figures, moving from a 3900X to a 13900K (or similar) might add another 200W of peak load. 1000W might not cut it.
Get a Kill-A-Watt meter. You'll know exactly, and it's easy to use.
 
Get a Kill-A-Watt meter. You'll know exactly, and it's easy to use.
Yep! I've had one for ages. The 770W reading was the highest it caught during a power stress test with OCCT. What I can't tell is how much more load could be placed on it if hard drives, RAM, chipset fan, or other components started pulling a few extra watts during operation. Up to 825W seems to be a reasonable wager ... then add the increased delta from moving from a 3900X to a 13700K/13900K/13900KS later ...
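That wager can be sanity-checked with quick arithmetic. The 55W margin for un-exercised components and the ~92% PSU efficiency figure are illustrative assumptions; the 200W CPU delta is the reviewer-sourced estimate mentioned earlier in the thread:

```python
# Project future wall draw from a measured Kill-A-Watt peak (assumed figures).
measured_wall_w = 770        # highest reading caught during the OCCT stress test
margin_w = 55                # guess for drives/RAM/fans not exercised by the test
cpu_upgrade_delta_w = 200    # assumed 3900X -> 13900K peak increase (reviewer data)

projected_wall_w = measured_wall_w + margin_w + cpu_upgrade_delta_w

# Wall watts include PSU conversion loss; the DC load the PSU rating refers to
# is lower. Assume ~92% efficiency at this load for a decent Gold/Platinum unit.
projected_dc_w = projected_wall_w * 0.92
print(f"Projected wall draw ~{projected_wall_w} W, DC load ~{projected_dc_w:.0f} W")
# → Projected wall draw ~1025 W, DC load ~943 W
```

Under those assumptions the DC load alone is brushing up against the 1000W rating, which supports the "might not cut it" worry above.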
 