4090 Reviews are up.

Maybe it has changed over time and varies with resolution, but those technologies often have a non-native vs. native cost that can create a difference people attribute to the lower resolution simply looking much worse (text would also be a special case).
Unfortunately, you can only really get a direct comparison at 1080p, as there are some 40" 1080p displays out there. At 4K, the newer displays will obviously look better regardless of the content, but even that's not a fair comparison. As far as I know, there are no 40"+ native 2560x1440 options on the market.

What I was generally talking about was 28-32" displays. At that size, whether I notice the difference depends on the content. Take Cyberpunk 2077 for example: you need a lot more DLSS at 4K for decent frame rates with ray tracing than you do at 2560x1440. (You still need DLSS for decent performance even at that resolution.) But at 2560x1440 you can use the higher-quality DLSS settings rather than the performance or balanced modes you need at 4K. Given the same display size range, in that scenario I think 2560x1440 is the better-looking option. All things being equal, 4K looks better if you can achieve good frame rates. That's the rub, as good frame rates at 4K aren't always possible, and not just in titles that support ray tracing. Destiny 2 is a good example of this: sure, I can hit 120FPS+ a lot of the time, but I also see drops into the high 80s frequently enough that I notice it.
 
The 4090 should be good for a while for me. It can run a lot of games at 4K 120fps without DLSS, and with DLSS I figure I'll be good for 4K 120 for a long time, at least until I can find a 30-32" 4K OLED monitor with a 240Hz refresh rate.
 
The 4090 should be good for a while for me. It can run a lot of games at 4K 120fps without DLSS, and with DLSS I figure I'll be good for 4K 120 for a long time, at least until I can find a 30-32" 4K OLED monitor with a 240Hz refresh rate.
I won't say that. I'll want whatever comes next after the RTX 4090 or 4090 Ti. Unless the successor to those GPUs provides virtually no improvement in performance, I almost always upgrade to the next generation.
 
The 4090 should be good for a while for me. It can run a lot of games at 4K 120fps without DLSS, and with DLSS I figure I'll be good for 4K 120 for a long time, at least until I can find a 30-32" 4K OLED monitor with a 240Hz refresh rate.
I'd be happy with a 27-32" 144Hz OLED at this point. We might see some models come out next year.
 
The 4090 should be good for a while for me. It can run a lot of games at 4K 120fps without DLSS, and with DLSS I figure I'll be good for 4K 120 for a long time, at least until I can find a 30-32" 4K OLED monitor with a 240Hz refresh rate.
Yeah, exactly. The 4090 feels like this is "it": this card will cover you regardless of what monitor tech you shoot for in the next few years. 8K 60? Sure, why not. 4K 240Hz? Go for it!
 
I won't say that. I'll want whatever comes next after the RTX 4090 or 4090 Ti. Unless the successor to those GPUs provides virtually no improvement in performance, I almost always upgrade to the next generation.
I am buying the 4090 mostly because my 3090 died in my SFF rig and I just want 4K120 locked. I don't play games as much lately. Like most here, I always get the itch to upgrade, but I've been spending way too much money and time on my other hobbies in the last year or so.
 
I'm not saying it will easily handle those resolutions, but by lowering settings or adjusting DLSS settings the 4090 probably could.
 
I have had a 4K display for years now. Top-end cards push it just fine, especially with Adaptive Sync. I do get what you are saying, though. However, as I said earlier, if you want super high frame rates at 4K, then why isn't the 4090 a DisplayPort 2.0 card?
You make a very good point on the DisplayPort version... huh. I also wonder if not being PCIe 5.0 could hurt performance in some situations as well.
 
4K TVs work well for streaming. Any time I sit close to the TV for a sports game, you can tell it's not even close to 4K quality. Sitting many feet away on the couch makes it much less noticeable.

ESPN 4K is native; all the other 4K sports content (in the USA) is upscaled 1080p. You should check out the ESPN 4K college football or college basketball games. They look stunning, much better than the FOX 4K (upscaled) games.
 
ESPN 4K is native; all the other 4K sports content (in the USA) is upscaled 1080p. You should check out the ESPN 4K college football or college basketball games. They look stunning, much better than the FOX 4K (upscaled) games.
Eh, I don't watch much college sports (especially football), so that wouldn't be of much use to me. Also, YouTube TV charges about 10 more bucks a month for just a few 4K channels last I checked; terrible value.
 
ESPN 4K is native; all the other 4K sports content (in the USA) is upscaled 1080p. You should check out the ESPN 4K college football or college basketball games. They look stunning, much better than the FOX 4K (upscaled) games.
If the 1080p signal were 80Mbps like ESPN's 4K college football, maybe it would look even better (for example, the replays apparently were 1080p).

Those comparisons need to be at equal bitrate, IMO, in an "is 4K worth it" type of discussion. A huge amount of "4K looks better" talk is simply "a much bigger video file looks better", and that would still be true, sometimes even more so, at a lower resolution.

That said, with DirecTV's 80Mbps 4K from beginning to end (except for the replays, which I guess gives some clue to the importance of 4K itself), it is certainly possible that 4K is worth it. At the 15-30Mbps rates of regular TV streaming, it seems like all marketing.
 
Techgage: https://techgage.com/article/nvidia-geforce-rtx-4090-the-new-rendering-champion/

He found something interesting: the 4090 has ECC, which can be enabled in the drivers. He checked with Nvidia and they said the 3090 Ti also has it.
I don't remember anyone else talking about this...
Yepper, the 3090 Ti does indeed have a "Change ECC State" option in the Nvidia Control Panel. It really wasn't discussed much at all, likely because it doesn't offer much benefit outside of machine learning / AI tasks.
 
Too bad the 4090 is only DisplayPort 1.4a. All those frames, but it is limited to 120Hz at 4K unless you want to drop to chroma subsampling. I do not understand this choice on a card with this much performance to dish out.
My thoughts exactly. That performance could drive the next generation of 4K/5K monitors (my ideal) with up to 12-bit color at 144Hz+ on OLED. HDMI 2.1 is limited without DSC.

I just have not read or seen much dealing with DSC, 10-bit/12-bit HDR, and motion clarity analysis. Is that standard based on single images? Motion? 8-bit? 10-bit? The "perceptually equivalent" standard is not something I see current reviewers examining. Then again, I have not looked too hard for this information; I just have not seen current content makers look at it. How DSC really compares in motion is my real concern, at high bit depths on ultra-fast-response-time monitors that maintain pixel clarity (OLED-class display technology).
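To put rough numbers on why DP 1.4 forces that 120Hz/chroma/DSC trade-off at 4K, here is a back-of-the-envelope sketch in Python. The ~5% blanking overhead is an assumption loosely based on CVT reduced-blanking timings, not an exact figure:

```python
DP14_GBPS = 25.92  # DP 1.4 HBR3 payload rate after 8b/10b encoding, Gb/s

def video_gbps(w, h, hz, bpc, blanking=1.05):
    """Uncompressed RGB bandwidth in Gb/s; ~5% blanking overhead assumed (CVT-RB-ish)."""
    return w * h * hz * 3 * bpc * blanking / 1e9

for hz, bpc in [(120, 8), (120, 10), (144, 10)]:
    need = video_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= DP14_GBPS else "needs DSC or chroma subsampling"
    print(f"4K {hz}Hz {bpc}-bit RGB: {need:.1f} Gb/s -> {verdict}")
```

4K 120Hz 8-bit squeaks under the link's limit, while 10-bit at 120Hz or anything at 144Hz does not, which lines up with the 120Hz RGB vs. 144Hz chroma/DSC compromises people run into on these monitors.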
 
Overall, only a tiny fraction of people even game at 4K, and these cards make no sense at 1080p or 1440p... I can't see these flying off the shelves.
We'll have to see for new titles once drivers, etc. stabilize (and the next CPUs and RAM).

Because in high-demand games at 1080p-1440p you can see a huge jump:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/34.html

From a 3090
Control goes from 104 to 191 fps at 1080p, and 69 to 132 at 1440p
Cyberpunk goes from 43 to 82 fps at 1440p
F1 from 84 to 162
Metro Exodus from 112 to 198
RE Village from 127 to 256

Would need to see the quality difference between those low FPS on a 3090 versus a perfectly good frame rate too, but it is possible that in some 2022-2023 games, having a 4090 will make a difference to some people at 1080p or 1440p.

Maybe that's a big reason behind ray tracing: you can make sure any new video card will show a relevant boost forever, as cards will never easily be strong enough for it. You just have to throw more rays and enable more surface types, with, I would imagine, a couple of variable changes in your code.
 
The 4090 FE looks very strong and well designed. In my gaming case it would cover one PCIe x1 slot, so it's doable, in other words, for the most part. I'm using two PCIe x16 slots now in that case; most likely a 10Gb network card and possibly a fifth SSD (in the second x16 slot, which comes from the chipset, not the CPU) in the future.

The added FPS doesn't matter if you can't see it on the display you have. The Vive Pro 2 would be my only display that could potentially push the 4090 if I gamed on it, e.g. on Bigscreen for non-VR titles. The DLSS 3.0 analysis will be interesting, but like DLSS, it's something I would probably have to actually test to see whether it's beneficial. When I initially used DLSS, it was much worse than what I expected and had read, and that was version 2.0: unusable in Control for me, or any other game that had it. Later, the opposite; DLSS came into its own and was usable and beneficial.

So far the only air-cooled 4090 I would consider is the FE, due to how well it actually cools and how reasonably it would fit. As for power and spikes, it really looks like it would be better than my 3090 for spikes, with only a small uptick in power, really not much more in actual use. Overclocking with this much performance seems rather useless until display tech catches up and games actually have settings that improve the experience in ways one would care about. The lack of DP 2.0, though, is rather concerning for long-term usage.
 
An awesome graphics card in search of a killer app. NVIDIA delivered it seems. I'd be all over this if there was something to play!

This. I was all ready to buy one until I spent some time looking at my Steam library and upcoming games lists and saw nothing interesting...
 
An awesome graphics card in search of a killer app. NVIDIA delivered it seems. I'd be all over this if there was something to play!
Would one game be enough to justify spending $1600? That would be a very hard sell, at least for me. For content creators and 3D artists, I'd like to see some V-Ray results and other GPU renderers at play. Maybe the fully unlocked chip will have DP 2.0 for professional displays as well.
 
Too bad the 4090 is only DisplayPort 1.4a. All those frames but it is limited to 120Hz at 4K unless you want to go Chroma. I do not understand this choice on a card with this much performance to dish out.
Yeah, not sure I understand this choice from Nvidia either, but it does support DSC over DP, so you can go 144Hz without chroma subsampling if your monitor supports DSC. An example is the PG32UQX, which is 4K HDR 1400 and can hit 144Hz at 10-bit color settings. However, from 120Hz to 144Hz it runs at 8-bit + FRC rather than true 10-bit, but that's worlds better than chroma 4:2:2. I would be surprised if you could find a human with the eye to spot that FRC difference in a still image outside artwork, much less in a game in motion. TBH, I have used chroma 4:2:2 on my PG27UQ before to get 144Hz in a couple of games and can't say it was noticeable while playing, but I have ultimately settled on 120Hz RGB 4:4:4 these days for almost all games.
 
However, from 120Hz to 144Hz it runs at 8-bit + FRC rather than true 10-bit, but that's worlds better than chroma 4:2:2.
What do you mean?
Pretty much all 10-bit LCD panels are actually 8-bit + A-FRC.
If it's the GPU that sends an 8-bit signal and does its own dithering, it wouldn't really matter all that much, though it would still be better if dithering were done once rather than twice. But I've never read that DSC dithers 10-bit.

Not implementing the new DP standard is very Nvidia.
They need a reason for people to get new GPUs, after all...
 
Yeah, exactly. The 4090 feels like this is "it": this card will cover you regardless of what monitor tech you shoot for in the next few years. 8K 60? Sure, why not. 4K 240Hz? Go for it!
Lol, not with DP 1.4. Not sure why Nvidia gimped it like that.
 
Lol, not with DP 1.4. Not sure why Nvidia gimped it like that.
The Samsung Odyssey does 240Hz 4K with DP 1.4 and/or HDMI 2.1; not sure if it involves some compromise or if compression just got good enough. Maybe it will be common for 4K 240Hz monitors to support it via those older connectors.

The Odyssey G9 seems to do 10-bit at 240Hz, 5120x1440, with DP 1.4.

If there is no compromise, needing more than 240Hz at 4K could take a very long time, especially if going above that isn't supported by the best-selling video cards or the consoles.
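As a rough sanity check on how a 4K 240Hz panel can run over DP 1.4, here's a sketch; the ~5% blanking overhead and the commonly cited ~3:1 "visually lossless" DSC ratio are assumptions, not measured figures for these monitors:

```python
DP14_GBPS = 25.92                             # DP 1.4 HBR3 payload rate, Gb/s
raw = 3840 * 2160 * 240 * 30 * 1.05 / 1e9     # 10-bit RGB (30 bpp), ~5% blanking, Gb/s
ratio = raw / DP14_GBPS
print(f"{raw:.1f} Gb/s raw -> needs ~{ratio:.1f}:1 compression")
```

That works out to roughly 2.4:1, comfortably inside the ~3:1 range DSC is usually rated for, which would explain 4K 240Hz working on the older connector without obvious compromise.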
 
What do you mean?
Pretty much all 10-bit LCD panels are actually 8-bit + A-FRC.
If it's the GPU that sends an 8-bit signal and does its own dithering, it wouldn't really matter all that much, though it would still be better if dithering were done once rather than twice. But I've never read that DSC dithers 10-bit.

Not implementing the new DP standard is very Nvidia.
They need a reason for people to get new GPUs, after all...
I "could" be wrong, but I believe everything I have read states the PG32UQX is a TRUE 10-Bit panel, but the dithering is used from 120 ~ 144Hz for bandwidth reasons (even with DSC). The PG27UQ (which I have) is a 10bit panel that is 8bit + FRC, but it does not have DSC, so after 120Hz, I have to use YCrCb 422 vs. the RGB 444 I use at 120Hz and below.
 
Yep, this is why I tried my best to warn people that buying a 3090 Ti in the last 1-2 months was a terrible idea.
Yep, I have a friend at work who just got into PC gaming and built his first PC a month or so ago. I warned him to wait and see what the new GPUs would be, but he had to get a 3090 Ti. Not sure what he paid, but it was way too much. Sucks to be him now, I bet; sure, he has a banging system, but for the money, no way, IMO.
 
Yeah, not sure I understand this choice from Nvidia either, but it does support DSC over DP, so you can go 144Hz without chroma subsampling if your monitor supports DSC. An example is the PG32UQX, which is 4K HDR 1400 and can hit 144Hz at 10-bit color settings. However, from 120Hz to 144Hz it runs at 8-bit + FRC rather than true 10-bit, but that's worlds better than chroma 4:2:2. I would be surprised if you could find a human with the eye to spot that FRC difference in a still image outside artwork, much less in a game in motion. TBH, I have used chroma 4:2:2 on my PG27UQ before to get 144Hz in a couple of games and can't say it was noticeable while playing, but I have ultimately settled on 120Hz RGB 4:4:4 these days for almost all games.
I can attest to the Asus PG32UQX; it is an amazing monitor. I scored it for $1400 from Craigslist here in the Sarasota area of Florida. (Sorry for the mess on my desk, it's been a busy week.)
 

This has me wondering how AMD will do. I'm wondering if Nvidia was expecting AMD to release something good enough to threaten the performance crown. The 4090 is a good bit better than I had expected, even without DLSS and ray tracing.
 
With this level of performance and all the talk about being CPU bound, Nvidia should be pushing for more DLAA adoption than DLSS Super Resolution at this point.

I guess that's also the reason why Nvidia's marketing highlights that DLSS3 frame generation is useful in CPU bound situations.
 
I think I'll probably pick up one of these or the 7k version later this year; the power/performance at a 60-65% power target from Roman's video looks great, which will be nice for not overheating my room.
 
Would one game be enough to justify spending $1600? That would be a very hard sell, at least for me. For content creators and 3D artists, I'd like to see some V-Ray results and other GPU renderers at play. Maybe the fully unlocked chip will have DP 2.0 for professional displays as well.
Techgage has some V-Ray benchmarks and others as well. https://techgage.com/article/nvidia-geforce-rtx-4090-the-new-rendering-champion/
The 4090 seems like a monster for that type of work. It cuts rendering times basically in half compared to a 3090 for the V-Ray tests.
 
I'm sure this series was made to be a monster for AI, scientific, and rendering purposes first of all; the improvements in gaming are just a side effect.
 
Going to need to be putting your computer on a 20A circuit pretty soon at this rate.
 
Going to need to be putting your computer on a 20A circuit pretty soon at this rate.

Every review I've seen shows the 4090 as being easier on a PSU compared to the 3090 Ti. They revamped the power section, getting rid of the problematic transient power spikes and overall full-load consumption is slightly lower also.
 
Going to need to be putting your computer on a 20A circuit pretty soon at this rate.
Huh, that's something I never even thought about. If I ever get back into the game, I may just do that. Three monitors, cooling system, cards, etc. Wouldn't be that bad of an idea.
 
Every review I've seen shows the 4090 as being easier on a PSU compared to the 3090 Ti. They revamped the power section, getting rid of the problematic transient power spikes and overall full-load consumption is slightly lower also.
The Gamers Nexus review shows 666.6W power draw when overclocked on the 4090 FE. My 3090 Ti FTW3 Ultra Hybrid hits around 516W.
 
Every review I've seen shows the 4090 as being easier on a PSU compared to the 3090 Ti. They revamped the power section, getting rid of the problematic transient power spikes and overall full-load consumption is slightly lower also.
True, but it's using around 500W under load and close to 700W if overclocked. 15A circuits should max out at around 1440W for continuous load. Add in monitors, CPU, etc., and anything else plugged into that circuit, and you could run into issues.
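The arithmetic behind that 1440W figure, as a quick sketch; the 120V supply and the example system load are assumptions, not measurements:

```python
volts, breaker_amps = 120, 15
continuous_limit_w = volts * breaker_amps * 0.8   # NEC 80% rule for continuous loads

# Hypothetical worst case: overclocked 4090 + CPU/rest of system + monitors
system_w = 700 + 250 + 150
print(f"limit {continuous_limit_w:.0f} W, load {system_w} W, "
      f"headroom {continuous_limit_w - system_w:.0f} W")
```

The example system fits, but the headroom shrinks fast once anything else shares the circuit.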
 
Microcenter had plenty. I ended up getting the gaming trio since they didn’t have the liquid.
I was thinking about snagging the MSI SUPRIM X 4090 (air- or water-cooled) when it comes out. Leaning toward air... I'd be curious how well that Gaming Trio performs for $100 less.
 