Having trouble choosing a 4090 manufacturer (or whether to get it)

Starfield is going to be an absolute riot lol.

I'm looking forward to Starfield, but I'm curious to see what the system requirements are really like, considering the gameplay we've seen so far doesn't look much better than Fallout 4's.
 
Different engine, but MSFS is a similar game that really taxes the CPU. The trick to optimizing that one is to turn up all the settings that would stress your GPU and turn down ones that would stress the CPU. Being GPU limited makes for a much more pleasant experience in general. It's almost as if these games need to offer unnecessarily taxing graphical settings for people with high-end GPUs so the GPU becomes the bottleneck.

One nice feature of the 4XXX series is that frame generation seems to work quite well for games such as Hogwarts to smooth out the experience.

When purchasing a card this expensive, my thoughts are that things such as frame generation and lowering settings should be a last resort, not an expected outcome. And to be exact, they would be, if the CPU was "sufficient" for the poorly coded engine. As it is currently the only settings I changed to get 70+ FPS in most areas in Hogwarts (on my 3080 Ti) was turning off the other types of Ray Tracing except shadows, and then setting the Ray Tracing intensity to medium. I didn't even turn it off. With a 4090, for $1700, while keeping this CPU, all it would let me do is probably... MAYBE... turn on one or two more settings and MAYBE turn Ray Tracing to High (not Ultra).

I think I've reiterated this ad nauseum as probably self justification to myself as to why I returned it, but again, my train of thought, as a flowchart:

I did not get the expected uplift -> The reason is apparently my CPU -> I already paid $1700 -> To make the GPU actually worth the upgrade, I would have to spend a further $700-800++ on top of this to completely change platforms (and rebuild) -> I was not willing to do so -> The GPU is not worth it to me because it does not provide the uplift I wanted for the price I was actually willing to pay, and the effort I was willing to put in.

Simple as that honestly. I think anyone considering a 4090 at this point needs a 13700k or preferably a 7800X3D at the minimum, because this will only get worse (in my opinion).

Look at this chart clipped from a YouTube video:

[attached image: CPU comparison chart from the video]

And this is on a 5800X3D lol. Doing some head math, I think that's around a 40% uplift. The 1% MINIMUM on the 7800X3D is 5 FPS away from the AVERAGE on the 5800X3D...
 
Honestly, people have tossed around a huge variety of explanations for why games are behaving like Hogwarts, all the way up to "AMD must be paying the devs to do this" (and to have sloppy FSR/console performance, which are AMD platforms).

I think their basis for it was the chart in another video by the same people:

[attached image: 4090 vs 7900 XTX benchmark chart at 1080p/1440p/4K]


As you can see, with Ray Tracing at Ultra at 1080p, the 4090 is heavily CPU bottlenecked. Despite being significantly worse at Ray Tracing, the 7900 XTX is actually neck and neck with it. I kind of wish they would show CPU and GPU utilization on these charts, because I think that would paint a very telling picture. At 1440p things start looking more normal, but it takes until 4K for the 4090 to show its proper advantage over the 7900 XTX. Obviously this is no "proof" that Nvidia's drivers specifically are more CPU dependent; maybe they just are for this game and its ray tracing implementation.
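For what it's worth, if you want to eyeball that utilization picture on your own rig, here's a rough sketch that logs CPU vs. GPU load side by side while a game runs. It assumes the third-party psutil and pynvml (nvidia-ml-py) packages are installed, and keep in mind the overall CPU% can look low even when one game thread is pegged, so per-core numbers are worth checking too:

```python
# Rough sketch: print CPU and GPU utilization once a second while a game runs.
# High CPU load with the GPU sitting well under ~99% usually points at a CPU bottleneck.
import psutil   # third-party: pip install psutil
import pynvml   # third-party: pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU in the system

try:
    while True:
        cpu = psutil.cpu_percent(interval=1)            # averaged over all cores for 1 s
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
        print(f"CPU {cpu:5.1f}% | GPU {util.gpu:3d}%")
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Not a replacement for proper frame-time charts, but it makes the bottleneck pretty obvious at a glance.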

I think Intel's "gaming" solution to a "gaming" CPU should be a 4 or even 2 core CPU that's just clocked high as balls. In many of these poorly coded games, it would probably even beat the 7800X3D.
 
Catching up on this thread.

I'll say it. Custom loop is not cost effective. It never will be unless something crazy happens.

But it looks cool, right?

At the same time, going from a 3080 Ti to a 4090 isn't worth it either! Listen, I know.....I did it myself. Though I have no kids and my fiancée and I make enough to pay for our stupid hobbies.

Hogwarts Legacy was the reason. I wasn't happy with the performance so I built a whole new PC over it....spoiler alert: it was better, but not enough to merit the cost. Do I care? Nah, had the money, didn't hit me at all.....and I got my fiancée to build her first PC with my old parts. It was a win-win.

So, you're on this forum full of enthusiasts, wondering what the best route to take is...and I'll be one of many to say...DO WHAT YOU WANT.

Don't worry about justifying your purchase; just go and do it! (Assuming you can afford it)

Look how sexy this bigass card looks in my machine:
[attached photo: 4090 installed in the build]


Which version to get? HA, go [H]ard or go home.

This PC handles any workload I throw at it...I never have to give a care about benchmarks or anything...I know I have the best.

Do I have more money than sense? Probably. But that's my modus operandi.

I'll say that no matter what system you have...that 4090 is going to bottleneck in games. Get the latest and greatest everything, I can attest...it'll bottleneck. I OC'd the hell out of my poor 12th gen i7 and it still was like "k" on every 3DMark test I threw at it, till the system was unstable.

All that to say, is it necessary? No. You're just looking to burn some money for the dopamine. Is that wrong? NO, I do it all the time. It's fun! Just go for the new GPU; it'll last you a good while considering how things are going.
 
So, you're on this forum full of enthusiasts, wondering what the best route to take is...and I'll be one of many to say...DO WHAT YOU WANT.

Don't worry about justifying your purchase; just go and do it! (Assuming you can afford it)
I mean, he did exactly what he wanted to do (pocket $1700), and you're kind of trying to talk him back into spending a boatload of money for....reasons.
 
I mean, he did exactly what he wanted to do (pocket $1700), and you're kind of trying to talk him back into spending a boatload of money for....reasons.
You missed the entire point of my post lol.


Also I missed the part where he made the decision to pocket 1700 so there's that.
 
You missed the entire point of my post lol.


Also I missed the part where he made the decision to pocket 1700 so there's that.

I mean... your entire point kind of boiled down to "I'm Yolo Swaggins and I do what I want and you should, too". While that is a good point, I couldn't justify my purchase to myself. It's not money I would be missing too much, but it wasn't worth it over my 3080 Ti. I can also think of a lot of things to use that money on instead, like a newer stove and other appliances, which I also don't need but would certainly provide me with more benefit than the card.

Also, frankly, if you're that yolo with your money, why are you not just buying the 7800X3D? I showed you the charts. Your 12th gen i7 is probably doing even worse than (or maybe at parity with) my 5950X, which does worse than the 5800X3D. Like, you would literally go up probably 40-50%++ in performance in that game (and any derivative games) by moving up processors.
 
I'm on a 4090 paired with a heavily tweaked 5950X and it's been the best video card experience I have had since I jumped to my 1080s in SLI back in the day. Half these "benchmarks" just leave configurations at default (as that's what most people do). If you are just using PBO and loose memory timings on Zen 3, you are holding yourself back quite a bit (I've had numerous posts about this here showing the graphs).

All that being said, the 4090 is 100% a 4K card; at any lower resolution you are better off with a 4080 or 4070 Ti and are holding the 4090 back by quite a bit.

As for "Coil Whine", it can be the card or power supply. My old eVGA 3090FTW3U had coil whine in my main gaming PC for the time I used it (does not bother me at all, barely hear coil whine past fans), when I moved it into my TV Streaming PC when I got my 4090, the 3090 now has 0 coil whine. Cause? Power Supply in my gaming PC. I can stress the hell out of that old 3090 still in the TV PC and never hear a single whine. My 4090 has some coil whine, but like before, hardly enough to bother me when gaming (needs to be pretty quiet to hear it under load). If I put the 4090 in my TV PC guess what? No coil whine! I'd swap the power supply in my gaming PC, but honestly, not worth the cable hassle for me. It is not under sized either, it's 1200W.
 
I mean... your entire point kind of boiled down to "I'm Yolo Swaggins and I do what I want and you should, too". While that is a good point, I couldn't justify my purchase to myself. It's not money I would be missing too much, but it wasn't worth it over my 3080 Ti. I can also think of a lot of things to use that money on instead, like a newer stove and other appliances, which I also don't need but would certainly provide me with more benefit than the card.

Also, frankly, if you're that yolo with your money, why are you not just buying the 7800X3D? I showed you the charts. Your 12th gen i7 is probably doing even worse than (or maybe at parity with) my 5950X, which does worse than the 5800X3D. Like, you would literally go up probably 40-50%++ in performance in that game (and any derivative games) by moving up processors.
I mean I pointed out pretty directly it wasn't a logical use of money.


I'll say it. Custom loop is not cost effective. It never will be unless something crazy happens.

But it looks cool, right?

At the same time, going from a 3080 Ti to a 4090 isn't worth it either! Listen, I know.....I did it myself. Though I have no kids and my fiancée and I make enough to pay for our stupid hobbies.

Hogwarts Legacy was the reason. I wasn't happy with the performance so I built a whole new PC over it....spoiler alert: it was better, but not enough to merit the cost. Do I care? Nah, had the money, didn't hit me at all.....and I got my fiancée to build her first PC with my old parts. It was a win-win.

Not sure how this was misconstrued as "YOLO, go buy the thing". I specifically pointed out it wasn't worth it in any sense, but there was a side benefit that made it worth it for me. Also worth saying I was being tongue-in-cheek in the latter half of the post, though of course that's hard to convey precisely in text. The main point being made was "Hey, no, it's not worth it, but if you're just looking to burn money and can afford it, have fun," and also, indirectly, "I'm not the guy you want to take financial advice from on this." We can blame any misunderstandings on the fact that bourbon was involved in the making of that post ;)

Keep in mind a huge portion of the posts on here are people who already want the "thing" being discussed, and are just looking for affirmation. You initially were looking at custom loop water cooling....already a rather useless and frivolous purchase when considering cost/benefit, and went to considering a 4090...so you can see where I would get that impression.

As for the processor, let's not look at my Microcenter cart right now :p . But really, 14th gen Intel processors are literally around the corner, so I'm holding out to see how that goes.
 
Simple as that honestly. I think anyone considering a 4090 at this point needs a 13700k or preferably a 7800X3D at the minimum, because this will only get worse (in my opinion).

I'm not seeing a trend with games like Hogwarts, yet. It seems to be more the exception than the rule. The majority of games are still able to fully utilize the GPU, so at 4K resolutions CPU really doesn't matter much. I'm likely going to stick with my 5900X for now and upgrade once Zen 5 comes out.

If Hogwarts was the main reason for buying a 4090, I'd have been frustrated too.
 
I was trying to do some of my own testing, but the stupid 12VHPWR adapter that came with my 4090 is bad. I never noticed because I have an MSI 1000W PSU with a native 12VHPWR cable out of the box, so I don't need the adapter, but trying to use the adapter in my other systems is a no-go; the card just stays there with a red light on. I've got a new one coming from Amazon tomorrow.
 
I'm not seeing a trend with games like Hogwarts, yet. It seems to be more the exception than the rule. The majority of games are still able to fully utilize the GPU, so at 4K resolutions CPU really doesn't matter much. I'm likely going to stick with my 5900X for now and upgrade once Zen 5 comes out.

If Hogwarts was the main reason for buying a 4090, I'd have been frustrated too.

I'm not going to say it was the "main" one, but it definitely was a big one. I was getting reports that Hogwarts used over 12GB of VRAM, which had me nervous about my 3080 Ti's longevity, because it was likely a sign of AAA games to come. Then I got the 4090 and found out that Hogwarts only goes over 12GB of VRAM with settings that my 3080 Ti can't push anyway (and even then, only barely over 12GB), and that the 4090 couldn't push those settings either, on my CPU. The other reason was Stable Diffusion, but then I found out that my 3080 Ti under Linux would almost match (well, not quite, but yeah) the 4090 running under Windows, which gave me the idea to just use my 2080 under Linux as my Stable Diffusion card. That has been working great, too. That left very little reason to keep the 4090: it disappointed me when gaming on one of the titles I got it for, and it wasn't necessary for anything else.
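For anyone who wants to sanity-check that kind of Stable Diffusion comparison on their own hardware, here's a rough timing sketch using the Hugging Face diffusers library. To be clear, this is just an illustration, not the exact script I ran; the model ID and prompt are placeholders, and it assumes diffusers and a CUDA build of torch are installed:

```python
# Rough it/s comparison sketch for Stable Diffusion (placeholder model/prompt; assumes
# the "diffusers" and "torch" packages are installed and a CUDA GPU is available).
import time
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # placeholder checkpoint
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a photo of a mountain lake at sunrise"  # placeholder prompt
steps = 20

# Warm-up pass so first-run initialization doesn't skew the numbers.
pipe(prompt, num_inference_steps=steps)

start = time.perf_counter()
pipe(prompt, num_inference_steps=steps)
elapsed = time.perf_counter() - start
print(f"~{steps / elapsed:.2f} it/s for a single 512x512 image")
```

Run the same script on each box (3080 Ti under Linux, 4090 under Windows, whatever) and compare the numbers; it's crude, but it's enough to see the kind of gap, or lack of one, I'm talking about.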

On the other hand, 4090 prices seem to be rising right now (slightly). Microcenter actually discounted many of them, but on PCPartPicker, as of a few moments ago, the Gigabyte I got (and returned) is $1689 as opposed to the $1623 I got it for. Which leads me to this video I referenced earlier, which was another large impetus for trying it:

https://youtu.be/gQyWm6Aosy8

I was worried it was a "now or never" type of scenario, and that I would need it to play games going forward. We'll see if they keep rising (and Jayz is right), or if it's just minor market fluctuations, since the Zotac/PNY/Galax cards seem to be the same price. I don't think even if there was a shortage on the horizon that it would get me to actually switch though. After trying a 4090, I would say I'm actually more satisfied with my 3080 Ti than I was before, rather than the other way around. Was good to see the grass on the other side, but this is fine. I think I would upgrade my CPU before the 4090. We'll see what Intel has coming up.
 
Seems some poor game and setting choices led to different than normal decision making. Oh well. To each their own.
 
I was worried it was a "now or never" type of scenario, and that I would need it to play games going forward. We'll see if they keep rising (and Jayz is right), or if it's just minor market fluctuations, since the Zotac/PNY/Galax cards seem to be the same price. I don't think even if there was a shortage on the horizon that it would get me to actually switch though. After trying a 4090, I would say I'm actually more satisfied with my 3080 Ti than I was before, rather than the other way around. Was good to see the grass on the other side, but this is fine. I think I would upgrade my CPU before the 4090. We'll see what Intel has coming up.

Prices do seem to have bottomed out a few weeks ago. Lenovo was selling the MSI Suprim Liquid for under 90% of MSRP, that was the best price I saw. Since then prices are up a bit. The fact the 4090 Ti was cancelled, and Blackwell was pushed to 2025, may have increased demand slightly from gamers. It definitely factored into my decision. I was on a 3080 (non-Ti) and was going to wait for Blackwell, but decided to pull the trigger on a 4090 instead. I wasn't too impressed with most of the 40XX series other than the 4090, which is an absolute beast and the best value despite being the most expensive. So far the returns are positive. In RDR2 on the 3080 I'd tweaked the settings to get a fairly consistent 60fps at 4K. On the 4090, I cranked absolutely everything to the max and it's still noticeably smoother. I don't think it's ever dipping under 60fps now.

Selfishly, I'm hoping for prices to go up again so I can sell my 3080 for a better price. We may see a slight uptick in demand from gamers as there are a lot of AAA titles coming out in the near future, then the holiday season will roll around.
 
Prices do seem to have bottomed out a few weeks ago. Lenovo was selling the MSI Suprim Liquid for under 90% of MSRP, that was the best price I saw. Since then prices are up a bit. The fact the 4090 Ti was cancelled, and Blackwell was pushed to 2025, may have increased demand slightly from gamers. It definitely factored into my decision. I was on a 3080 (non-Ti) and was going to wait for Blackwell, but decided to pull the trigger on a 4090 instead. I wasn't too impressed with most of the 40XX series other than the 4090, which is an absolute beast and the best value despite being the most expensive. So far the returns are positive. In RDR2 on the 3080 I'd tweaked the settings to get a fairly consistent 60fps at 4K. On the 4090, I cranked absolutely everything to the max and it's still noticeably smoother. I don't think it's ever dipping under 60fps now.

Selfishly, I'm hoping for prices to go up again so I can sell my 3080 for a better price. We may see a slight uptick in demand from gamers as there are a lot of AAA titles coming out in the near future, then the holiday season will roll around.

I think if I was on a 3080, I would have been more likely to keep the 4090. The extra 2GB of VRAM on the Ti makes a surprising amount of difference if games keep pushing just slightly over 10GB at that graphics level.

As far as Red Dead Redemption 2, there's actually a section towards the bottom of the graphics settings that you have to unlock yourself, which lets you push additional graphics settings to max. It contains a bunch of values that you have to manually set to "Ultra" or the equivalent. With that enabled, even with the 4090 I was almost in a slideshow. The game did look pretty good, though (just not 1.7k-for-22-FPS good lol).

Seems some poor game and setting choices led to different than normal decision making. Oh well. To each their own.

Yeah I feel bad for giving y'all the runaround after you gave various types of input, but at the end of the day my values couldn't justify the card. Maybe that'll change, but at least I'll have more realistic expectations for what it will provide now. I usually don't buy a graphics card at this price level. Anywhere near 1k is considered a large amount to me, so for 1.7k perhaps my expectations were simply too high. Or perhaps games these days are just broken. Maybe both.
 
As far as Red Dead Redemption 2, there's actually a section towards the bottom of the graphics settings that you have to unlock yourself, which lets you push additional graphics settings to max. It contains a bunch of values that you have to manually set to "Ultra" or the equivalent. With that enabled, even with the 4090 I was almost in a slideshow. The game did look pretty good, though (just not 1.7k-for-22-FPS good lol).

Hmmm, something may have been up with your 4090. I maxed out all those settings as well and am getting an easy 60fps. I'll play on my 144Hz monitor without Vsync tonight and see what the averages are.

Is it possible you turned up the resolution scaling so you were rendering at 8K or similar?
 
Hmmm, something may have been up with your 4090. I maxed out all those settings as well and am getting an easy 60fps. I'll play on my 144Hz monitor without Vsync tonight and see what the averages are.

Is it possible you turned up the resolution scaling so you were rendering at 8K or similar?

Ah right, actually I turned up the resolution scaling to 2x. So it was effectively 6880x2880, I suppose.

After all, it is supposedly a 4K graphics card, right? This is only effectively 4K*1.9 since it's ultrawide. Actually, I'm trying to remember if it even let me set it to that resolution. I think I tried out one scaling option and the card ran out of VRAM lol. I only have a few pictures from when I tried the options. The game looked like this, unfortunately with no FPS numbers:

[attached screenshots: RDR2 at high resolution scaling]
 
When I switched from a 3090 to a 4090, the only game where I didn't see a significant uptick in frames and smoothness was Hogwarts Legacy. It only felt like a slight upgrade for that game. Cyberpunk 2077 and MSFS were pretty noticeable upgrades in frame rates. Everything runs at 4K, and those are the heaviest hitters I've tried on my rig.
 
Seems like a cool story... bro.

I got tons. Wanna hear more?

When I switched from a 3090 to a 4090, the only game where I didn't see a significant uptick in frames and smoothness was Hogwarts Legacy. It only felt like a slight upgrade for that game. Cyberpunk 2077 and MSFS were pretty noticeable upgrades in frame rates. Everything runs at 4K, and those are the heaviest hitters I've tried on my rig.
If time proves that it was just an exception, I might grab it again. If not... yeah I'm not going to pay that much to upgrade for such a slight gain. Unless I get a better CPU by the time I'm ready again, I guess.

Better off just waiting for the 5000 series.
That's probably a solid plan. The 50 series is rumored to be an even bigger upgrade over the 40 series than the 40 series was over the 30 series. Of course... who knows?
 
Dude! Yes! Why do you think I'm following this thread?
Dang, I didn't even need to tell you to click Like and Subscribe down below.

Anyway, the reason I gave them those pictures is simply that I wasn't sure myself what upscaling settings I was using, so I hoped they could try dialing in something similar to confirm... but I guess that would be difficult since I don't even have the map pulled up, so they can't find the same spot... so it was a bit of a long shot.

I know there was another game, by the way, that was recently released and had terrible performance on PC, as I recall. Star Wars something or other? Did it have the same issues as Hogwarts Legacy, or were they different?
 
Jedi Survivor. But luckily someone made a DLSS mod for it that greatly improves picture quality plus performance. On page 2 I put multiple screenshot comparisons: https://hardforum.com/threads/modder-implements-dlss-3-frame-gen-dlss-2-in-jedi-survivor.2029003/
...

* Is it me or is this boost clock kind of low, even at stock? While maxing Cyberpunk, it's pretty much pinned at 2790 MHz. Is this just a dud card? =\
...
I don't think so. Nor with that Gigabyte.
You just needed Afterburner: set the core to +100 and the VRAM to +1000, and leave the power cap at default. Then you would probably have seen 2890 MHz. Both of those numbers are conservative, and you could probably get more.
My 4090 works 100% stable at +130 on the core at 100% power target, and I can get +140 on the core at 105% power target. The VRAM can do +1500 at 100% power target. I've seen others push the VRAM to +2000 without issue.
Anything beyond 105% power target didn't do anything for the max recorded GPU frequency, so silicon lottery-wise, +140 is the max I get. At 133% power target, the card was pulling 592W. It might have been lifting the 1% lows, but I didn't notice, so I just put it back to 100% power target. What's great about these cards is you could set a 70% power target and only lose maybe 10% performance while running at ~315W. And with the overengineered cooling, they are pretty quiet and cool at lower power targets. Do some searches on YouTube; people have done power target tests, and it's pretty interesting.
In GPU-Z, click on the number field for the GPU clock sensor, toggle it to show the max, then play games; it will record the highest value seen.
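If you'd rather log that from a script than keep GPU-Z open, here's a minimal sketch of the same idea using nvidia-smi (assuming it's on your PATH; the query field names below are the standard ones, but double-check them against nvidia-smi --help-query-gpu for your driver version):

```python
# Rough sketch: poll nvidia-smi once a second and keep the highest graphics clock seen,
# similar to GPU-Z's "max" toggle. Assumes nvidia-smi is on PATH; Ctrl+C to stop.
import subprocess
import time

max_clock = 0.0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=clocks.gr,power.draw",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        clock_mhz, power_w = (float(x) for x in out.split(","))
        max_clock = max(max_clock, clock_mhz)
        print(f"now: {clock_mhz:.0f} MHz / {power_w:.0f} W | max seen: {max_clock:.0f} MHz")
        time.sleep(1)  # one sample per second while the game or benchmark runs
except KeyboardInterrupt:
    print(f"Highest clock recorded: {max_clock:.0f} MHz")
```

Run it in a second window while the game loops, then Ctrl+C to see the peak. (If a machine has more than one GPU, nvidia-smi prints one line per card, so you'd want to add -i 0 to pick a specific one.)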
 
Alright, so it's been a long time for this thread, but I'm going to do an update. Ever since I got that really good deal on my 7800X3D upgrade, the idea of the GPU has been on my mind fairly frequently. So then I saw another Suprim X Liquid open box at Microcenter for $1616... and I went for it. The idea is that I'm probably going to put a replacement plan on it, and then the cost of the card plus the replacement plan will pretty much equal what it would have cost if I had purchased it new.

The only thing is I'm trying to decide if I want the 2-year plan ($150) or the 3-year plan ($250). I'm leaning towards the former. Supposedly, this thing is still covered under MSI anyway, but frankly I think every company these days, now that EVGA is gone, is going to be pretty much a scummy shithead about RMA policies. With this, I wouldn't be dealing with MSI if anything went wrong, I would be dealing with Microcenter. Considering my local Microcenter (well, one of them anyway; the other one is a bit more scummy) has always been good to me, I think that would be the best course of action for my peace of mind.

Anyway, this new card... actually turned out to probably be the one that I returned earlier in the thread. I recognized the baggie that the radiator screws came in. Was kind of funny. I certainly didn't need it, but I guess I just wanted it enough. It's a shitty deal and an impulsive buy, but it is what it is.

Anyway, it's running pretty good. I changed the configuration slightly and everything looks great:
[attached screenshot: monitoring readout]

Maximum memory temp that I saw was 76C. I'm kind of fine with that, to be honest.

The only problem is that I can't quite get the side doors to close without bending the adapter cabling too much for comfort, so I guess I'll just leave it open. It's going to look kind of jank, but oh fucking well. It is what it is; I don't want any weird shit happening because of it bending oddly, and I'm not getting one of those right-angle or U-turn adapters, I've heard too many horror stories about them. Here's a quick pic so you all can shit on it lol:
[attached photo: the case with the side panel open]

As you can see, I tied the whole bundle of 8-pin connectors to the radiator tubing, and it works out pretty nicely. They look neater, the adapter plugging into the GPU has a nice, gradual curve to it, and it's very stable mechanically. Since this GPU is much shorter than an air-cooled 4090, I might also be able to reinstall the metal bracket that holds the wires back; it got in the way of my other GPUs before.
 
The only problem is that I can't quite get the side doors to close without bending the adapter cabling too much for comfort, so I guess I'll just leave it open. It's going to look kind of jank, but oh fucking well. It is what it is; I don't want any weird shit happening because of it bending oddly, and I'm not getting one of those right-angle or U-turn adapters, I've heard too many horror stories about them. Here's a quick pic so you all can shit on it lol:

As you can see, I tied the whole bundle of 8-pin connectors to the radiator tubing, and it works out pretty nicely. They look neater, the adapter plugging into the GPU has a nice, gradual curve to it, and it's very stable mechanically. Since this GPU is much shorter than an air-cooled 4090, I might also be able to reinstall the metal bracket that holds the wires back; it got in the way of my other GPUs before.

What power supply do you have? I got a cable (not an extension, an actual cable from the PSU to the card) from Corsair for my RM1000X that plugs two 8-pins into the single 12VHPWR, and it was like $14 on Amazon direct from their store, so not a Chinese knockoff. It works great and is super flexible. A lot of brands have them now.

[attached photo: Corsair 12VHPWR PSU cable]
 
What power supply do you have? I got a cable (not an extension, an actual cable from the PSU to the card) from Corsair for my RM1000X that plugs two 8-pins into the single 12VHPWR, and it was like $14 on Amazon direct from their store, so not a Chinese knockoff. It works great and is super flexible. A lot of brands have them now.


It's an EVGA P2 1200W. Great power supply, but unfortunately I don't think EVGA made any cables like that for it, nor do I think they will ever show up. Technically CableMod has a 4x8-pin to 12VHPWR cable available to buy, but they want $100 for that and I'm like... no lol. I kind of want to stick to the OEM solution, too. I think what I'll do is just buy some of this:
https://www.amazon.com/Plastic-Punching-Dustproof-Dust-Proof-Equipment/dp/B08P1C43KL
And just use it to seal in the gaps. That'll keep it mostly dust free, and I don't really care about looks so that's good enough for me anyway. Maybe some time down the line, I'll look for a wider case to put it into, but thus far I haven't seen many cases that I would want to "upgrade" to.

And I'm kind of getting tired of moving parts around. On the bright side, my 3080 Ti is now pumping out Stable Diffusion images a hell of a lot faster than my 2080 could manage lol, and my bedroom computer is quieter because the Gigabyte 2080 doesn't spin up fans unless it gets hot.

Still deciding on what replacement plan to buy for it. They said I have 15 days from purchase to decide if I want one. I'm still leaning towards the 2 year one. I'm pretty confident that this is the 4090 that I returned for "coil whine" and then it looks like they sent it off to MSI because it did have an unopened adapter (though everything else was as I left it), so essentially I'm pretty much the only owner of this thing.
 
It's an EVGA P2 1200W. Great power supply, but unfortunately I don't think EVGA made any cables like that for it, nor do I think they will ever show up. Technically CableMod has a 4x8-pin to 12VHPWR cable available to buy, but they want $100 for that and I'm like... no lol. I kind of want to stick to the OEM solution, too. I think what I'll do is just buy some of this:
https://www.amazon.com/Plastic-Punching-Dustproof-Dust-Proof-Equipment/dp/B08P1C43KL
And just use it to seal in the gaps. That'll keep it mostly dust free, and I don't really care about looks so that's good enough for me anyway. Maybe some time down the line, I'll look for a wider case to put it into, but thus far I haven't seen many cases that I would want to "upgrade" to.

And I'm kind of getting tired of moving parts around. On the bright side, my 3080 Ti is now pumping out Stable Diffusion images a hell of a lot faster than my 2080 could manage lol, and my bedroom computer is quieter because the Gigabyte 2080 doesn't spin up fans unless it gets hot.

Still deciding on what replacement plan to buy for it. They said I have 15 days from purchase to decide if I want one. I'm still leaning towards the 2 year one. I'm pretty confident that this is the 4090 that I returned for "coil whine" and then it looks like they sent it off to MSI because it did have an unopened adapter (though everything else was as I left it), so essentially I'm pretty much the only owner of this thing.

One of these?
 

Maybe I was looking at a more expensive version? The one that I saw was indeed $100. Weird. Anyway, that has a lot of negative reviews:
https://www.amazon.com/product-revi...r&reviewerType=all_reviews#reviews-filter-bar

The one that I would buy would probably be the 4x8-pin to 12VHPWR, which would be about $30... but there are a lot of 1-star ratings that say their card wouldn't work properly with it, and switching to the OEM adapter completely made the issue go away... I dunno, I'll probably just stick to OEM. Thanks for the link though.
 
Maybe I was looking at a more expensive version? The one that I saw was indeed $100. Weird. Anyway, that has a lot of negative reviews:
https://www.amazon.com/product-revi...r&reviewerType=all_reviews#reviews-filter-bar

The one that I would buy would probably be the 4x8-pin to 12VHPWR, which would be about $30... but there are a lot of 1-star ratings that say their card wouldn't work properly with it, and switching to the OEM adapter completely made the issue go away... I dunno, I'll probably just stick to OEM. Thanks for the link though.

Haha, the OEM one is the only one I've had that didn't work. When I had the Asus TUF card I had to buy an aftermarket one; the card would stay unpowered with the adapter that came with it, with the red light at the plug on. I didn't really like the quality of the aftermarket one, so I ended up buying an MSI 1050W power supply with a 12VHPWR connector on both ends of the cable, and that let it run. Unfortunately the coil whine was so extreme I couldn't stand it and sent it back.

As long as it works and you can do your stuff, it's all good!
 
Haha, the OEM one is the only one I've had that didn't work. When I had the Asus TUF card I had to buy an aftermarket one; the card would stay unpowered with the adapter that came with it, with the red light at the plug on. I didn't really like the quality of the aftermarket one, so I ended up buying an MSI 1050W power supply with a 12VHPWR connector on both ends of the cable, and that let it run. Unfortunately the coil whine was so extreme I couldn't stand it and sent it back.

As long as it works and you can do your stuff, it's all good!

You clearly have pretty bad luck with ASUS... not sure why you keep buying them. o_0; I don't touch them with a yard stick because of the warranty horror stories, and overall they seem overpriced this gen anyway.

Anyway, I've read that there are two variants of the 12VHPWR adapter that can ship with cards. One of them is supposedly worse, so maybe you got a dud. I don't really do the brand loyalty thing, but MSI seems to be pretty solid lately. By which I mean the motherboards I've gotten from them have been pretty stable and solid boards.
 
You clearly have pretty bad luck with ASUS... not sure why you keep buying them. o_0; I don't touch them with a yard stick because of the warranty horror stories, and overall they seem overpriced this gen anyway.

Anyway, I've read that there are two variants of the 12VHPWR adapter that can ship with cards. One of them is supposedly worse, so maybe you got a dud. I don't really do the brand loyalty thing, but MSI seems to be pretty solid lately. By which I mean the motherboards I've gotten from them have been pretty stable and solid boards.

That's not really on them; the card was an open box. I mean, coil whine, sure, but all the big cards have it to varying degrees. I have the FE now and it's amazingly quiet. Also, it's probably the best-built card I have ever laid my hands on; it's really quite impressive.
 
You clearly have pretty bad luck with ASUS... not sure why you keep buying them. o_0; I don't touch them with a yard stick because of the warranty horror stories, and overall they seem overpriced this gen anyway.

Anyway, I've read that there are two variants of the 12VHPWR adapter that can ship with cards. One of them is supposedly worse, so maybe you got a dud. I don't really do the brand loyalty thing, but MSI seems to be pretty solid lately. By which I mean the motherboards I've gotten from them have been pretty stable and solid boards.
The 30 series seemed okay? I would declare my 3080 TUF the best GPU I've ever had to date (although maybe I have a small sample size), but it was quiet and cool running, with decent temps, and way better than my EVGA 3060. And I don't just mean because it was a higher tier; the EVGA card was loud, and I hated it. ;)
I think the 40 series Asus TUF cards were expensive at first, but I've seen some prices come down, and it depends on the vendor. Also, it seems to be the most purchased model, so it's much easier to find a used one, and considering ASUS offers a transferable warranty (in Canada, at least), I would definitely consider one.
I've read a few horror stories about coil whine, but considering they seem to sell a lot of them, it could just be the large sample of purchased cards, so some out there are bound to have it. Other brands seem to have coil whine too; I am not sure which brands have the 'least coil whine' anymore. I don't think anyone has really looked into that.
You would think one of the tech tubers would try to, but no, it's always just a test of game vs. game, or sometimes comparing to an AMD card.
I am more interested in coil whine/fan noise/temps; I want those compared. :)
 
Had a bit of a scare. I tried registering this card on MSI's website, and it kept giving me an error. At first I panicked and started preparing to return the card to Microcenter, but then I decided to try out their chat support line. Technically, their chat support is only 24/7 if you're contacting them about an all-in-one PC or a laptop, but the person still got me sorted pretty fast even though they were a laptop tech.


[attached screenshots: MSI chat support conversation]


Tbh, I'm kinda tempted to just take the extra savings I got from open box and not even get the Microcenter replacement plan. Looks like their support is pretty legit. I'm pretty sure the reason it initially failed to register was that Microcenter already had it registered to send it back to MSI to test for the coil whine that I complained about (?). Unless another user did actually have it in between when I returned it and when they got it, but it had a brand new 12VHPWR adapter in the box, so I'm pretttttyyy sure this was my card that I returned lol, considering I recognized the screw zip bag I used for the radiator mounting.


The 30 series seemed okay? I would declare my 3080 TUF the best GPU I've ever had to date (although maybe I have a small sample size), but it was quiet and cool running, with decent temps, and way better than my EVGA 3060. And I don't just mean because it was a higher tier; the EVGA card was loud, and I hated it. ;)
I think the 40 series Asus TUF cards were expensive at first, but I've seen some prices come down, and it depends on the vendor. Also, it seems to be the most purchased model, so it's much easier to find a used one, and considering ASUS offers a transferable warranty (in Canada, at least), I would definitely consider one.
I've read a few horror stories about coil whine, but considering they seem to sell a lot of them, it could just be the large sample of purchased cards, so some out there are bound to have it. Other brands seem to have coil whine too; I am not sure which brands have the 'least coil whine' anymore. I don't think anyone has really looked into that.
You would think one of the tech tubers would try to, but no, it's always just a test of game vs. game, or sometimes comparing to an AMD card.
I am more interested in coil whine/fan noise/temps; I want those compared. :)

The going price on almost everything ASUS for the 4090 is higher than all the other competitors. The TUF tends to stabilize at around $1800+ (USD), though I have seen it on sale for $1599 direct from ASUS (briefly). It's not really going up or down much.

[attached screenshot: 4090 price listings]


As far as coil whine, this card definitely does have some, but I would essentially have to have my ear facing the card's exhaust, a foot or less away, to hear it (and most of that might actually be the pump, tbh). From where I'm sitting it's inaudible, even if I turn my aquarium equipment down and mute my speakers. The Gigabyte had it anyway, as I said, so I'm still chalking it up to my home's wiring, considering all of my cards from every manufacturer have some coil whine. Regardless, I'm pretty sure that I'm going to commit to this card. I got it for a good deal and I've got it registered on their website with a legit 3-year warranty despite it being open box. So I'm happy with that, considering it's a higher-end model.
 