4090 Reviews are up.

I was thinking about snagging the MSI SUPRIM X 4090 (air or water cooled) when it comes out. Leaning toward air... I'd be curious how well that Gaming Trio performs for $100 less.
They didn't have the Suprim X either, so I took the Gaming Trio. If it's not great I'll probably return it and get the liquid AIO one.
 
They didn't have the Suprim X either, so I took the Gaming Trio. If it's not great I'll probably return it and get the liquid AIO one.
If you can, follow up on how it performs and how good (or bad) the cooling is, as I am sure the Suprim X air cooling would suffer from similar issues. I know the AIO "should" cool better, but I have always been wary of an AIO on a graphics card, as that will likely be a point of failure. I've never had issues with air-cooled setups.
 
Seeing as how the 50W-higher out-of-the-box Strix OC was 3% faster than the FE, and overclocking netted the same 4.5% increase as the FE... it seems like EVGA got out at the right time. Nvidia's FE makes getting a more costly AIB card silly, unless you don't have the board space and get a watercooled version.

Heck, even the AIO coolers appear to be no better than the FE cooler (besides the slot width & card length).
 
If you can, follow up on how it performs and how good (or bad) the cooling is, as I am sure the Suprim X air cooling would suffer from similar issues. I know the AIO "should" cool better, but I have always been wary of an AIO on a graphics card, as that will likely be a point of failure. I've never had issues with air-cooled setups.
I would not be wary of AIOs anymore; they are pretty good. I converted a 2080 Ti to a hybrid and installed a used AIO (bought here) that is still chugging along today.
 
Would one game be enough to justify spending $1,600? That would be a very hard sell, at least for me. For content creators and 3D artists, I'd like to see some V-Ray results and other GPU renderers at play. Maybe the fully unlocked chip will have DP 2.0 for professional displays as well.
I know that the sticker price for gaming can be a lot to deal with initially, but the ROI from gaming is a great value. Look at it this way: you spend $1,000 on a GPU, and that's a lot of money for most people, but think of how many THOUSANDS of hours you can game on that GPU. Whereas if you go to the movies and maybe out for dinner one night, hell, you can spend $100 for just a few hours of entertainment. I have probably gamed for months straight on my trusty 1080 Ti, and it cost me $700 new. Not many devices in life give you that ROI.
[this is the spiel I give to my wife anytime I want to upgrade, feel free to use it guys]
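The cost-per-hour argument above can be put into numbers. A minimal sketch; the dollar figures are from the post, but the hour counts are illustrative guesses, not data:

```python
# Back-of-the-envelope cost per hour of entertainment.

def cost_per_hour(price, hours):
    """Dollars spent divided by hours of use."""
    return price / hours

gpu = cost_per_hour(1000, 2000)    # $1000 GPU, assume 2000 hours of gaming
night_out = cost_per_hour(100, 4)  # dinner + movie, roughly 4 hours

print(f"GPU:       ${gpu:.2f}/hour")        # $0.50/hour
print(f"Night out: ${night_out:.2f}/hour")  # $25.00/hour
```

Even doubling the GPU price or halving the hours leaves it far cheaper per hour than a night out.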
 
So basically, none of the cards bring much more performance at all, and all the GPUs are being held back because you cannot increase voltages.

Might as well get an FE model and ignore all the AIB models. EVGA is looking very smart right now.
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
Being a 4K/120Hz gamer, I can tell you quite a few games can benefit from a 4090: CP2077, Age of Conan, Fallout 76. None of those games maxed out can hold a constant 120 fps on a 3090. Those are the games I currently play.

I can imagine quite a few other games finally getting good framerate maxed out at 4k now with a 4090.
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
Had a 4K120 CX OLED display for 2 years now. Older games have no issue. Newer titles are a pipe dream for 4K120. The 4090 gets us closer to that pipe dream.

At first, I was massively against the 4090, but after seeing that it offers a 2x increase in performance over my 3080 at 4K, I'm sold. I was expecting the typical 30-40% uptick, but this is way more than that.
 
Look at it this way, you spend $1000 on a GPU, that's a lot of money for most people, but think of how many THOUSANDS of hours you can game on that GPU. Where if you go to the movies and maybe out for dinner one night, hell you can spend $100 for just a few hours of entertainment. I have probably gamed for months straight on my trusty 1080ti, and it cost me $700 new. Not that many devices in life give you that ROI.

Agreed. Since I got my 2080 I've used it for over 5000 hours in the main game I play, and that's not even counting other games.
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
I am mainly upgrading for better performance in RTX/ray-tracing experiences at 4K. Almost all of them could use the boost. I like Amid Evil, Quake II RTX, and the many RTX mods of other old-school games like Doom, Quake 1, Half-Life, Portal, etc. It's neat to see the facelift of those older games, and the performance bump should be quite nice for 4K RTX. I also get on a VR kick once in a while, and in that space you can use every drop of extra performance.
 
After seeing the very low noise levels and temps for the FE, it seems like Nvidia could have made the cards even shorter, kept nearly the same noise level, and still had solid temps.
 
After seeing the very low noise levels and temps for the FE, it seems like Nvidia could have made the cards even shorter, kept nearly the same noise level, and still had solid temps.
Maybe they are going to use the same cooler for the 4090 Ti, and it was just easier to manufacture them all to the same specs.
 
After seeing the very low noise levels and temps for the FE, it seems like Nvidia could have made the cards even shorter, kept nearly the same noise level, and still had solid temps.
Almost every card you are seeing had a cooler designed for 600W. That was spec for the GPU when the coolers went into design. NV pulled back due to seeing PSU and other customer cooling problems. Simply put, those coolers are WAY over-built for 450W cards, which is a good thing for enthusiasts...if you can fit it into your case.
 
Almost every card you are seeing had a cooler designed for 600W. That was spec for the GPU when the coolers went into design. NV pulled back due to seeing PSU and other customer cooling problems. Simply put, those coolers are WAY over-built for 450W cards, which is a good thing for enthusiasts...if you can fit it into your case.
This was my biggest issue. I'm limited to just over 13" in length, so anything above that and I'm getting a new case... which I'm probably doing anyway because I hate my current case.
 
Almost every card you are seeing had a cooler designed for 600W. That was spec for the GPU when the coolers went into design. NV pulled back due to seeing PSU and other customer cooling problems. Simply put, those coolers are WAY over-built for 450W cards, which is a good thing for enthusiasts...if you can fit it into your case.
That explains a lot! So those rumors of 600W were actually true; Nvidia just realized it was not going to work and went back to 450W.

So basically, the 4090 cooler will be perfect for the 4090 Ti/Titan model.
 
Glad I didn't pay up for the other models then. Looks like there is very little performance increase for the other AIB models.
 
That explains a lot! So those rumors of 600W were actually true; Nvidia just realized it was not going to work and went back to 450W.

So basically, the 4090 cooler will be perfect for the 4090 Ti/Titan model.
Looking at AIB (and even FE) reviews, many cards out there allow a power increase to 600W through overclocking software. Looks like some AIBs stopped at 520W, though.
 
What I wonder about after reading reviews: why the hell didn't they test this card in 8K?
Otherwise the card looks OK. You can even de-OC it to get most of the performance with much better performance per watt.

Too bad the price indicates a very bad trend in the industry.
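The performance-per-watt point from "de-OCing" can be sketched with simple arithmetic. The 95%-performance-at-350W figure below is an illustrative assumption, not a measured result:

```python
# Illustrative efficiency gain from power-limiting a card below stock.

def perf_per_watt(relative_perf, watts):
    """Relative performance (1.0 = stock) per watt of board power."""
    return relative_perf / watts

stock   = perf_per_watt(1.00, 450)   # stock 450 W power limit
limited = perf_per_watt(0.95, 350)   # assumed: 95% of performance at 350 W

gain = limited / stock - 1
print(f"Efficiency gain from power limiting: {gain:.0%}")  # 22%
```

The shape of the argument is what matters: a small performance loss against a large power cut yields a big jump in performance per watt.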
 
What I wonder about after reading reviews: why the hell didn't they test this card in 8K?
Otherwise the card looks OK. You can even de-OC it to get most of the performance with much better performance per watt.

Too bad the price indicates a very bad trend in the industry.

I would love to see some 8K results, since 4K is where the 4090 really shines against the 3090/Ti. If the performance gap continues to increase with resolution as it does up to 4K, maybe we have the first truly 8K-capable card (assuming it's 70-80% faster at 8K).
 
I would love to see some 8K results, since 4K is where the 4090 really shines against the 3090/Ti. If the performance gap continues to increase with resolution as it does up to 4K, maybe we have the first truly 8K-capable card (assuming it's 70-80% faster at 8K).

The lead won't keep expanding. TechSpot's testing at 4K all looked to be clear of CPU and game-engine bottlenecks. They had a consistent 60% performance increase for the 4090 over the 3090 Ti, so you won't see much change to that at 8K, at least at native. Improved DLSS may see even higher gains, though.
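For context on why native 8K is such a heavy ask, here's a naive fill-rate-bound sketch (real games rarely scale this cleanly, but it frames the problem):

```python
# 8K pushes 4x the pixels of 4K, so a purely pixel-bound estimate
# quarters the frame rate.

def pixels(w, h):
    return w * h

p4k = pixels(3840, 2160)   # 8,294,400
p8k = pixels(7680, 4320)   # 33,177,600
print(p8k // p4k)          # 4

def naive_8k_fps(fps_4k):
    """Frame rate at 8K assuming perfectly pixel-bound scaling."""
    return fps_4k * p4k / p8k

print(naive_8k_fps(120))   # 30.0
```

So even a card that holds 120 fps at 4K would, in the worst pixel-bound case, land around 30 fps at native 8K; being "60% faster at 4K" does not by itself make 8K playable.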
 
The lead won't keep expanding. TechSpot's testing at 4K all looked to be clear of CPU and game-engine bottlenecks. They had a consistent 60% performance increase for the 4090 over the 3090 Ti, so you won't see much change to that at 8K, at least at native. Improved DLSS may see even higher gains, though.
It's not going to be great for 8K gaming in some regards, since it uses DisplayPort 1.4. So the 4090 limits itself at higher resolutions and refresh rates.

Kind of blows my mind that Nvidia didn't go with DisplayPort 2.0.
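To put rough numbers on the DP 1.4 complaint: a quick bandwidth check of raw pixel data against each link's effective data rate (the standard figures for DP 1.4 HBR3 and DP 2.0 UHBR20 after line coding). This ignores blanking overhead, so real requirements are somewhat higher, and DSC compression lowers them:

```python
# Uncompressed bandwidth needed for a mode vs. DisplayPort link capacity.

DP14_GBPS = 25.92   # DP 1.4 HBR3 x4 lanes, effective after 8b/10b encoding
DP20_GBPS = 77.37   # DP 2.0 UHBR20 x4 lanes, effective after 128b/132b

def required_gbps(w, h, hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking."""
    return w * h * hz * bits_per_pixel / 1e9

modes = {
    "4K120 10-bit": (3840, 2160, 120, 30),
    "8K60  10-bit": (7680, 4320, 60, 30),
}
for name, args in modes.items():
    need = required_gbps(*args)
    print(f"{name}: {need:5.1f} Gbit/s  "
          f"fits DP1.4: {need <= DP14_GBPS}  fits DP2.0: {need <= DP20_GBPS}")
```

Both modes overshoot DP 1.4's effective rate uncompressed (hence the reliance on DSC or lower refresh), while DP 2.0 carries them natively.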
 
This was my biggest issue. I'm limited to just over 13" in length, so anything above that and I'm getting a new case... which I'm probably doing anyway because I hate my current case.
Looks like plenty of room in the Fractal Torrent, even with the EK Elite:
(attached image: ELOXinu.jpeg)
 
8K gaming.

I'm still on 1080 and 1440. haha
Well, the 4090 is being held back at 1440p; it's basically made for 4K+ resolutions. Sure, you can get higher refresh rates, but you are limited by the CPU/game engine at that point.
 
It's not going to be great for 8K gaming in some regards, since it uses DisplayPort 1.4. So the 4090 limits itself at higher resolutions and refresh rates.

Kind of blows my mind that Nvidia didn't go with DisplayPort 2.0.
Not enough customers out there trying to play high-refresh games at 4K to justify spending the money on 2.0 (assuming it costs more to provide than 1.4), if you ask me. We'll probably see DP 2.0 on next-gen cards.
 
It's not going to be great for 8K gaming in some regards, since it uses DisplayPort 1.4. So the 4090 limits itself at higher resolutions and refresh rates.

Kind of blows my mind that Nvidia didn't go with DisplayPort 2.0.
Can't let you have too much fun, apparently.
 
Not enough customers out there trying to play high-refresh games at 4K to justify spending the money on 2.0 (assuming it costs more to provide than 1.4), if you ask me. We'll probably see DP 2.0 on next-gen cards.
You are correct. Looks like RDNA 3 cards will have DisplayPort 2.0 and PCIe 5.0.
 
It's not going to be great for 8K gaming in some regards, since it uses DisplayPort 1.4. So the 4090 limits itself at higher resolutions and refresh rates.

Kind of blows my mind that Nvidia didn't go with DisplayPort 2.0.

There's also running 8K downsampled to a 4K monitor, which you can easily do with Nvidia DSR factors. It does look better/sharper from the few times I've tried it on my own 4K monitor. It's not a huge difference, but it's definitely there. My 3090 really chokes when doing it, though.

And then for VR purposes, I'd want to see how 4K x2 (4K per eye), or any similar total pixel count (roughly 6K resolution), performs as well.
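The VR comparison above can be made concrete with pixel counts. A small sketch using standard 16:9 resolutions (actual VR headset panels and render targets vary):

```python
# Total megapixels rendered: two 4K eye buffers vs. single flat panels.

def mpix(w, h, eyes=1):
    """Megapixels for a resolution, optionally doubled for stereo VR."""
    return w * h * eyes / 1e6

flat_4k = mpix(3840, 2160)          # ~8.3 MP
vr_dual = mpix(3840, 2160, eyes=2)  # ~16.6 MP
flat_5k = mpix(5120, 2880)          # ~14.7 MP
flat_6k = mpix(6144, 3456)          # ~21.2 MP

print(f"4K per eye x2: {vr_dual:.1f} MP "
      f"(between flat 5K at {flat_5k:.1f} MP and 6K at {flat_6k:.1f} MP)")
```

So "4K per eye" lands between flat 5K and 6K in raw pixel load, which is why ~6K benchmarks are a reasonable proxy for that VR workload.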
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
Quake 2 RTX Edition.
 
What I wonder about after reading reviews: why the hell didn't they test this card in 8K?
Otherwise the card looks OK. You can even de-OC it to get most of the performance with much better performance per watt.

Too bad the price indicates a very bad trend in the industry.
Could be just time, but humans are also quite influenced by feedback; some reviewers got ridiculed for 8K testing the previous time around.

Some did:


At pure raster, more than doubling the 3090 becomes the norm at 8K.

8K TVs being HDMI 2.1 could start to be an issue on the next generation of cards with DLSS frame generation.

Considering the number of 8K TVs/monitors out there, it is more pure tech porn, and pretty much all the reviews I saw were rushed, incomplete affairs with a part 2 coming, as samples arrived close to embargo. With how much more complex these cards are to test, that pushes people toward already-prepared setups and toward what people are most interested in: 1440p and 4K.
 
What games are you folks playing that are at 4k? CP2077? I figured that game sort of fizzled out by now. We're sort of in a drought with games. I can't even think of a game that is currently worth the investment.
As I am on a 4K monitor, I play ALL my games at 4K! Right now, mostly BF2042, soon COD MW2, Forza Horizon 5 (a beautiful-looking game at 4K with HDR), and MSFS 2020. I may revisit CP2077 at some point once I get a 4090 (I beat it when it came out).
 
How much money could they really save with this? Considering the kind of margins involved, and the presence of Intel's Arc GPUs that were supposed to be released last year, it is hard to understand.
 
How much money could they really save with this? Considering the kind of margins involved, and the presence of Intel's Arc GPUs that were supposed to be released last year, it is hard to understand.
I don't know the technical details, but I imagine it is beyond a simple "switch the port" type deal... the silicon would have to support it, as well as the actual PCB/connector. Who knows when Nvidia was developing the architecture and what restrictions there "might" have been. Perhaps people are right that it's a forced upgrade two years from now, even if the card is otherwise ready for those higher refresh rates at 4K. It does seem really stupid considering all of the benchmarks I have been looking at today on these cards. There are already 4K 144Hz HDR monitors out, granted expensive, and OLED versions will be coming soon as well, so why not just add the support from the start?
 
Has anyone figured out which manufacturer has a warranty that can compete with EVGA's Advanced RMA? I can't seem to find a good alternative. Would love any recs.
 