4090 Reviews are up.

Definitely sitting out this round. My 3090 is doing just fine at 4K. I am curious to see what AMD has to offer with the 7900XT series....
That's exactly what I plan to do. I've been using this 3090 for 2 years now. The 4090 is a perfect 4K/120Hz gaming card....no doubt about it. It would totally be badass for my OLED. But I think I will wait a while and see what AMD has. Plus, as of right now, there isn't a game I want to upgrade for.
 
I am still gaming on mine!!!! LOL...
 
Definitely sitting out this round. My 3090 is doing just fine at 4K. I am curious to see what AMD has to offer with the 7900XT series....

I could use more performance in Destiny 2 and Cyberpunk 2077. An RTX 3090 or even the 3090 Ti can't do 120FPS+ in every game consistently.
 
Maybe this was already mentioned, but with the number of problems some reviewers saw and the giant gap in performance between, say, Paul and Linus Tech Tips (especially in the 1% lows), maybe it depends on the Windows version used? The crashes with DLSS, etc.

Maybe all the talk of pushing the release back to please AIBs and let Ampere stock clear was more of an "it will do that as well," with the real reason being a Windows/software stack that wasn't ready.

It feels like a significant amount of performance is being left on the table, even at 4K, in many titles, and that in six months, on a 13900K with fresh driver, Windows, and game updates, that Cyberpunk type of result will be seen in a couple of other places.
 
Have we ever seen same-day reviews from mainstream reviewers for a "mainstream" product vary that much?

It could be CPU-bound even at 4K in many games, which would make the game selection change the results by a giant amount, or it could be other issues.
 
Have we ever seen same-day reviews from mainstream reviewers for a "mainstream" product vary that much?

It could be CPU-bound even at 4K in many games, which would make the game selection change the results by a giant amount, or it could be other issues.
yes
 
Have we ever seen same-day reviews from mainstream reviewers for a "mainstream" product vary that much?

It could be CPU-bound even at 4K in many games, which would make the game selection change the results by a giant amount, or it could be other issues.
Yep, it is. Checking out the charts at https://babeltechreviews.com/rtx-4090-performance-45-games-vr-pro-apps-benchmarked/4/, their 12900K system with DDR5 is bottlenecking in some 4K gaming tests. A prime example is Spider-Man, where it's still happening even with ray tracing. Many reviewers are using inadequate CPUs too.
 
If I'm reading his benchmarks correctly, at 1440p the 4090 is 28% faster than the 3090 Ti, and at 4K it's 33% faster than the 3090 Ti. The rumored 60% faster figure for the RTX 4090 was, I think, meant in reference to the RTX 3080.
Those numbers are significantly lower than elsewhere; it seems to be a small sample of games, with a significant share of them being CS:GO and Valorant, which were already doing over 500 fps at 4K on the previous generation.


Thanks for posting that. I forgot about them! The DLSS 3 page on there is eye-opening...
Indeed, it does show how much latency you add with it in a game with a giant rendering workload like Cyberpunk.
[Charts: DLSS 3 frame generation latency in Cyberpunk 2077 at 4K Ultra and 4K ray tracing]


We will have to see, but maybe at first it will be useful exclusively when you already have a good frame rate above 60 and want an even higher frame rate on a high-refresh monitor, not for getting you out of a situation where your 1% lows are under 40 fps.
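That squares with a simplified way of thinking about frame generation latency: the interpolated frame can only be built once the next native frame exists, so the display lags by roughly one native frame time on top of everything else. A rough sketch under that assumption (this is not a measurement of the actual DLSS 3 pipeline, which also involves Reflex and generation cost):

```python
# Simplified model: the newest native frame is held back until the following
# frame exists so an intermediate frame can be interpolated, adding roughly
# one native frame time of latency (generation cost itself is ignored).
# This is an assumption-based sketch, not a measurement of DLSS 3.

def added_latency_ms(native_fps: float) -> float:
    return 1000.0 / native_fps  # ~one native frame of extra delay

for native_fps in (30, 60, 120):
    print(f"{native_fps} fps native -> ~{added_latency_ms(native_fps):.1f} ms "
          f"added latency, displayed at ~{native_fps * 2} fps")
```

Which is why it looks a lot more attractive at 120 fps native than as a rescue for sub-40 fps 1% lows.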
 
One thing the 4090 does is actually make me want to upgrade my monitor from 1440p to 4K...before, 1440p was the sweet spot, as 4K was still too performance-limited when maxed out...I never had any interest in gaming at 4K for that reason, as maxing things out was always important to me...the 4090 might actually make 4K gaming the new sweet spot.
Yeah, that is how I've always felt: I upgrade my system to match my monitor, as having a monitor that is not being taken full advantage of is leaving performance on the table.
overall only a tiny fraction of people even game at 4K and these cards make no sense at 1080p or 1440p...I can't see these flying off the shelves

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
This is a chicken-or-the-egg type of issue: maybe no one games at 4K because there were no GPUs that could really push 4K at high frame rates. Now apparently there is one, so perhaps those 4K monitors will make more sense now.
I know that if I owned a CX or C2 I would be all over these cards, as you can now stay locked at 120Hz/120fps at 4K. Today marks a shift towards 4K being the new 1080p, something that will (one day in the near future) become the standard.
 
Something strange I am learning from those reviews: a 6900 XT was consuming significantly more power than a 3090 while playing Cyberpunk (25% more). I'm not sure why I thought otherwise; could it be a mix of the focus on peak power consumption and how much higher the GDDR6X VRAM consumption is?


overall only a tiny fraction of people even game at 4K and these cards make no sense at 1080p or 1440p...I can't see these flying off the shelves

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
Almost 4% are on 3440x1440 through 3840x2160, plus some others and some connected TVs that aren't the main monitor Steam catches, I imagine. On a base of 120 million users, that's a lot of people relative to the 4090's possible market share.

Their small numbers could fly off the shelves; it is a niche product aimed at the millions with those types of monitors.

if you want super high frames with 4K, then why is the 4090 not a DisplayPort 2.0 card?
Considering Intel cards that were built a while ago have it, and the Raptor Lake iGPU seems to have it, it makes little sense to me. Why DLSS 3.0 without DP 2.0? That said, some of the 4K/240Hz monitors out there:
https://www.pcmag.com/news/samsungs-4k-240hz-gaming-monitor-arrives-june-6

https://www.samsung.com/ca/monitors/gaming/odyssey-neo-g8-g85nb-32-inch-ls32bg852nnxgo/

I wonder how much and how well compression can push 48 Gbps; we can assume NVIDIA knows.... It is HDMI 2.1a and DisplayPort 1.4a at least.
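To put rough numbers on that, here is a minimal bandwidth estimate, ignoring blanking intervals and link/encoding overhead and assuming 10-bit-per-channel RGB, so the exact figures are approximations:

```python
# Rough uncompressed bandwidth for 4K at 240 Hz vs. the HDMI 2.1 48 Gbps
# link rate. Blanking intervals and FRL/encoding overhead are ignored, so
# the real requirement is somewhat higher than shown here.

def video_bandwidth_gbps(width: int, height: int, refresh_hz: int,
                         bits_per_pixel: int) -> float:
    return width * height * refresh_hz * bits_per_pixel / 1e9

uncompressed = video_bandwidth_gbps(3840, 2160, 240, 30)  # 10 bpc RGB
hdmi_2_1 = 48.0  # Gbps

print(f"4K/240Hz, 10 bpc uncompressed: ~{uncompressed:.1f} Gbps")
print(f"HDMI 2.1 link rate:            {hdmi_2_1:.1f} Gbps")
# DSC advertises visually lossless compression up to roughly 3:1, which is
# how current 4K/240Hz monitors run over HDMI 2.1 or DisplayPort 1.4a.
print(f"Required compression (rough):  {uncompressed / hdmi_2_1:.2f}:1")
```

So on paper a fairly modest DSC ratio covers 4K/240Hz over the 48 Gbps link, which is presumably NVIDIA's reasoning.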
 
This is a chicken-or-the-egg type of issue: maybe no one games at 4K because there were no GPUs that could really push 4K at high frame rates. Now apparently there is one, so perhaps those 4K monitors will make more sense now.
I have had a 4K display for years now. Top-end cards push it just fine, especially with Adaptive Sync. I do get what you are saying, though. However, as I said earlier, if you want super high frames with 4K, then why is the 4090 not a DisplayPort 2.0 card?
 
Judging from der8auer's video, this seems like a good card to use for distributed computing, extrapolating from the Time Spy Extreme result. I posted more detail in our [H] DC forum here.

Well, the GPU price is another story, but an important factor in decision making too.
 
The performance gains are pretty nice across the board. It makes me interested in seeing what both "4080" variants end up turning in for performance.

overall only a tiny fraction of people even game at 4K and these cards make no sense at 1080p or 1440p...I can't see these flying off the shelves

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
While it is a tiny fraction, the raw percentages don't tell the whole story.

As of 2020 there were an average of 120 million active Steam users every month. 2.46% is nearly 2.95 million people. While this card is definitely overkill at anything below 4K, we're still going to see people running UW1440 (1.63 million) and 1440p (13.5 million) purchase it. Plus there's just the fact that it's new and exciting. While I wouldn't be surprised if it doesn't sell out like we've seen with a lot of prior cards, I also wouldn't be surprised if it sells at least decently well at launch.
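A quick sanity check of those counts, assuming the ~120 million monthly-active-users figure and shares consistent with the numbers quoted above (the actual survey percentages shift month to month):

```python
# Sanity check of the user counts above. The MAU figure and the shares are
# assumptions taken from this thread; actual survey numbers drift monthly.

monthly_active_users = 120_000_000

resolution_share = {
    "3840x2160 (4K)": 0.0246,      # 2.46% as quoted
    "3440x1440 (UW1440)": 0.0136,  # share consistent with ~1.63 million
    "2560x1440 (1440p)": 0.1125,   # share consistent with ~13.5 million
}

for name, share in resolution_share.items():
    print(f"{name}: ~{monthly_active_users * share / 1e6:.2f} million users")
```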
 
I could use more performance in Destiny 2 and Cyberpunk 2077. An RTX 3090 or even the 3090 Ti can't do 120FPS+ in every game consistently.

Rich or poor, I think everyone needs to be responsible in deciding if they need the shiny new thing. For one, it's unnecessary e-waste. Second, God forbid you have a tragedy in your life or lose your job, you would really be kicking yourself in the rear when you are now struggling to pay your mortgage after just buying a card that gets you 300 fps instead of 200 fps in the games you play most.

That all said, there are definitely use cases where it is worth it for some. Case in point: most RT games at 4K.
 
Yeah, that is how I've always felt: I upgrade my system to match my monitor, as having a monitor that is not being taken full advantage of is leaving performance on the table.

This is a chicken-or-the-egg type of issue: maybe no one games at 4K because there were no GPUs that could really push 4K at high frame rates. Now apparently there is one, so perhaps those 4K monitors will make more sense now.
I know that if I owned a CX or C2 I would be all over these cards, as you can now stay locked at 120Hz/120fps at 4K. Today marks a shift towards 4K being the new 1080p, something that will (one day in the near future) become the standard.

The problem is I'm not sure the jump from 1440p to 4K is all that big...1080p to 1440p or 4K seems like a bigger jump in terms of image quality...I have an LG 4K OLED but I don't game on that, I use it for TV, streaming, and 4K Blu-rays...I have a dedicated 1440p 144Hz G-Sync monitor.
 
This is a fairly large performance jump. Here's hoping this scales down to the other cards: the 80, 80 16GB, 70, and 60 classes.

If they're going to price it high, it may as well come with the performance.

From the FPS Review:

[Chart: Shadow of the Tomb Raider results from The FPS Review]
 
The problem is I'm not sure the jump from 1440p to 4K is all that big...1080p to 1440p or 4K seems like a bigger jump in terms of image quality...I have an LG 4K OLED but I don't game on that, I use it for TV, streaming, and 4K Blu-rays...I have a dedicated 1440p 144Hz G-Sync monitor.
It is hard to find tests about this online (or my Google skills are failing); considering how much consoles get away with, it would be an interesting test to do (and native vs. non-native rendering can make actual testing really hard, which could be the reason). I imagine it depends on how good your eyes are the minute you are even slightly removed from the monitor.

Back in the day, it was surprising how close someone with 20/20 vision had to sit to the giant screen in a movie theater to see a difference between 2K and 4K.

One thing that is nice: if you work on the same monitor you play on, you can work in 4K, where the gain seems very obvious, and still have enough performance to play at full native resolution on that monitor.
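On the theater point, the usual rule of thumb is that 20/20 vision resolves roughly one arcminute of detail, which gives a back-of-the-envelope distance beyond which extra pixels stop being visible. A simplified sketch of that calculation (the one-arcminute figure and the example screen sizes are assumptions, and real perception also depends on content, contrast, and the scaler):

```python
import math

# Rule-of-thumb distance at which individual pixels stop being resolvable
# for 20/20 vision (~1 arcminute of acuity). Screen sizes are example
# values; content, contrast, and scaler quality all matter in practice.

ARCMINUTE_RAD = math.radians(1 / 60)

def max_useful_distance_m(diagonal_inches: float, horizontal_pixels: int,
                          aspect: float = 16 / 9) -> float:
    """Distance beyond which one pixel subtends less than one arcminute."""
    diagonal_m = diagonal_inches * 0.0254
    width_m = diagonal_m * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = width_m / horizontal_pixels
    return pixel_pitch_m / ARCMINUTE_RAD  # small-angle approximation

for diag in (42, 65):
    d_1440 = max_useful_distance_m(diag, 2560)
    d_4k = max_useful_distance_m(diag, 3840)
    print(f'{diag}": 1440p pixels resolvable within ~{d_1440:.1f} m, '
          f'4K pixels within ~{d_4k:.1f} m')
```

At typical desk distances you sit well inside those thresholds, which is why the difference is obvious on a large monitor but fades from the couch.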
 
It is hard to find tests about this online (or my Google skills are failing); considering how much consoles get away with, it would be an interesting test to do (and native vs. non-native rendering can make actual testing really hard, which could be the reason). I imagine it depends on how good your eyes are the minute you are even slightly removed from the monitor.

Back in the day, it was surprising how close someone with 20/20 vision had to sit to the giant screen in a movie theater to see a difference between 2K and 4K.

One thing that is nice: if you work on the same monitor you play on, you can work in 4K, where the gain seems very obvious, and still have enough performance to play at full native resolution on that monitor.
4K TVs work well for streaming. Any time I sit close to the TV for a sports game you can tell how it's not even close to 4K quality. Sitting many feet away on the couch makes it much less noticeable.
 
1080p to 4K is interesting because I didn't think it was that huge of a jump initially. I think 1080p is the start of diminishing returns, and it felt like such a giant jump over 720p and the 4:3 resolutions that everyone was rocking for so long. When you see things side by side (especially on a TV or large monitor), it's a huge difference in some content. I think you mainly notice it in huge panorama-style scenes where you have lots of detail in all directions. It's a monumental jump for content like that. However, if you're playing something that's more up close and personal, you might not even notice at all.
 
4K TVs work well for streaming. Any time I sit close to the TV for a sports game you can tell how it's not even close to 4K quality. Sitting many feet away on the couch makes it much less noticeable.
Movie and TV signals are so compressed that I am not sure how low on the list resolution would rank in terms of quality. Under 50 Mbit/s it seems quite pointless to go over 1080p, at least before AV1, in a blind test.
 
This is a fairly large performance jump. Here's hoping this scales down to the other cards: the 80, 80 16GB, 70, and 60 classes.

If they're going to price it high, it may as well come with the performance.
We are also forgetting that the FPS benchmarks for the 4090 are based on games that are available TODAY. One to two years from now, there could be games released that cripple anything below a 3080. I get the feeling the 4090 is the new 1080 Ti, not in the cost/performance aspect but more in the sense of performance that stays relevant for 5+ years, relevant for the next generation of games.
 
1080p to 4K is interesting because I didn't think it was that huge of a jump initially. I think 1080p is the start of diminishing returns, and it felt like such a giant jump over 720p and the 4:3 resolutions that everyone was rocking for so long. When you see things side by side (especially on a TV or large monitor), it's a huge difference in some content. I think you mainly notice it in huge panorama-style scenes where you have lots of detail in all directions. It's a monumental jump for content like that. However, if you're playing something that's more up close and personal, you might not even notice at all.
I think it depends greatly on the display type. On a smaller display, I don't usually see a difference in games. 28"-32" or whatever. In other things it really depends. At 40"+, I think you'd see a pretty big difference comparing 2560x1440 to 3840x2160.
 
What I'm seeing are massive increases in every situation where the CPU isn't holding back the GPU. If your CPU is holding back your GPU, you can't blame your GPU for that. Being CPU limited in a certain config doesn't mean that the GPU is "slower in that game".

Besides, if I get a 4090 I'm going to be using it for at least 4 years, during that time I'll probably upgrade my CPU at least once.
 
I think it depends greatly on the display type. On a smaller display, I don't usually see a difference in games. 28"-32" or whatever. In other things it really depends. At 40"+, I think you'd see a pretty big difference comparing 2560x1440 to 3840x2160.
It's true: on my 42" 4K monitor, 1440p looks ugly and 4K is so much clearer. Even with the settings turned down to low at 4K, 42" almost needs 4K.
 
We are also forgetting that the FPS benchmarks for the 4090 are based on games that are available TODAY. One to two years from now, there could be games released that cripple anything below a 3080. I get the feeling the 4090 is the new 1080 Ti, not in the cost/performance aspect but more in the sense of performance that stays relevant for 5+ years, relevant for the next generation of games.
Unlikely. Pretty much all major devs will use the consoles as the baseline and the 3080 should be just fine in the coming years.
 
It's true: on my 42" 4K monitor, 1440p looks ugly and 4K is so much clearer. Even with the settings turned down to low at 4K, 42" almost needs 4K.
Maybe it has changed over time and with resolution, but those technologies often have a non-native vs. native scaling cost that can create a difference people attribute to the lower resolution itself looking much worse (text would also be a special case).
 
I think it depends greatly on the display type. On a smaller display, I don't usually see a difference in games. 28"-32" or whatever. In other things it really depends. At 40"+, I think you'd see a pretty big difference comparing 2560x1440 to 3840x2160.

I think that many who are not currently running 4K monitors should give DLDSR a try. Even if you have a 1440p monitor, DLDSR allows you to render the game at a higher resolution than your monitor, and then the tensor cores are used to downsample that to your monitor's native resolution, with higher quality than regular DSR. It puts almost any in-game AA option to shame and genuinely looks better in most cases. With previous cards there was a question of whether it was worth the performance impact. With something like a 4090 I think you could get away with using DLDSR to play at 4K in basically every single game. So even if you don't have a 4K monitor, it's still possible to put the power of the 4090 to use.
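For reference, the DLDSR factors exposed in the driver are 1.78x and 2.25x of the native pixel count, and 2.25x of 2560x1440 works out to a 3840x2160 render, which is why it is effectively 4K on a 1440p panel. A quick check of that arithmetic (the driver may round the reported resolutions slightly differently):

```python
import math

# DLDSR factors are multiples of the native pixel count (1.78x and 2.25x in
# the driver UI), so each axis scales by the square root of the factor.
# The driver may round the reported resolutions slightly differently.

def dldsr_render_resolution(native_w: int, native_h: int,
                            pixel_factor: float) -> tuple[int, int]:
    scale = math.sqrt(pixel_factor)
    return round(native_w * scale), round(native_h * scale)

native = (2560, 1440)
for factor in (1.78, 2.25):
    w, h = dldsr_render_resolution(*native, factor)
    print(f"DLDSR {factor}x on {native[0]}x{native[1]} -> render at ~{w}x{h}")
```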
 
This is a fairly large performance jump. Here's hoping this scales down to the other cards: the 80, 80 16GB, 70, and 60 classes.

If they're going to price it high, it may as well come with the performance.

From the FPS Review:

[Chart: Shadow of the Tomb Raider results from The FPS Review]
The 3 announced 4xxx-series cards have 73, 43, and 35.5 TFLOPS. The 3090, 3080 10GB, and 3060 Ti have 29, 17.5, and 14 TFLOPS. The corresponding 4xxx cards all have about 150% more TFLOPS than the listed 3xxx cards.

The 4090 was about 65% faster than the 3090 at 4K with that ~150% TFLOPS boost. So, best guess, the 4080 16GB will be about 65% faster than the 3080 10GB (midpoint between the 3090 Ti and 4090), and the 4080 12GB will be about 65% faster than the 3060 Ti (near the 3080 Ti). Scaling on the lower-end cards will be nearly as good at 1440p, since there are fewer bottlenecks.

I realize TFLOPS aren't everything and you can't compare across architectures, but it should be fairly accurate.
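For what it's worth, here is that estimate spelled out, using the TFLOPS figures quoted above and treating the measured 4090-vs-3090 gain (~65% at 4K, per this thread) as the scaling anchor; it's a back-of-the-envelope sketch, not a prediction:

```python
# Back-of-the-envelope scaling using the TFLOPS figures quoted above.
# Assumption: the other 40-series cards convert extra TFLOPS into frame
# rate about as efficiently as the 4090 did against the 3090 at 4K (~65%
# faster from ~150% more TFLOPS). Numbers are from this thread, not specs.

tflops = {
    "RTX 4090": 73.0, "RTX 4080 16GB": 43.0, "RTX 4080 12GB": 35.5,
    "RTX 3090": 29.0, "RTX 3080 10GB": 17.5, "RTX 3060 Ti": 14.0,
}

observed_4090_gain = 0.65
tflops_gain_4090 = tflops["RTX 4090"] / tflops["RTX 3090"] - 1  # ~1.52
efficiency = observed_4090_gain / tflops_gain_4090              # ~0.43

pairs = [("RTX 4080 16GB", "RTX 3080 10GB"),
         ("RTX 4080 12GB", "RTX 3060 Ti")]

for new, old in pairs:
    tflops_gain = tflops[new] / tflops[old] - 1
    est = tflops_gain * efficiency
    print(f"{new} vs {old}: +{tflops_gain:.0%} TFLOPS -> ~+{est:.0%} estimated fps")
```

Both comparisons land right around the ~65% guess above.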
 