4090 Reviews are up.

I have the Asus TUF OC edition. Just curious what others' experience has been.
Have you tried lowering the power % to see if the coil whine decreases/goes away? It's generally much more noticeable if you have something that can run really high FPS.
 
Have you tried lowering the power % to see if the coil whine decreases/goes away? It's generally much more noticeable if you have something that can run really high FPS.
Thanks, this is a good suggestion.
I had been messing around with Cyberpunk at maxed-out settings with Performance DLSS, which in the scene I was in was hitting around 90 fps. I'll have to test it out with more variety later.

The whine's unnoticeable to me under 50% power. I think 100% is the peak, and it actually gets less noticeable going higher; it's significantly less noticeable maxed out at 133%. Could be due to the way the frequency of the whine changes. Turning DLSS off, which drops the framerate down to around 37 fps, also quieted the whine significantly. I'll have to look more into it.

And again just want to emphasize the whine isn't horrible. It's just something I picked up on more than any other more recent card I've owned.
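
If anyone wants to repeat that power-limit experiment in a controlled way, here's a minimal sketch using nvidia-smi (run from an elevated prompt; the wattages below are just illustrative, and -pl only accepts values inside the range the first command reports):

nvidia-smi -q -d POWER   # show current, default, and min/max enforceable power limits
nvidia-smi -pl 225       # cap the board at 225 W while you listen for whine
nvidia-smi -pl 450       # restore the default limit when done

MSI Afterburner's power-limit slider is the GUI equivalent if you'd rather avoid the command line.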
 
Sorry if already posted: they tried a 4090 on different PCI-Express generations

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/28.html
"Raptor Lake" PC builders can breathe a huge sigh of relief—we're happy to report that the GeForce RTX 4090 loses a negligible, inconsequential amount of performance in PCI-Express 3.0 x16 (Gen 4 x8-comparable) mode.

Relative performance vs. a 4090 running at PCIe 4.0 x16:
At 1080p
x16 1.0: 82%
x16 2.0: 94%
x16 3.0: 97%
x16 4.0: 100%

At 1440p
x16 1.0: 80% (still faster than a 3090 Ti)
x16 2.0: 94%
x16 3.0: 98%
x16 4.0: 100%

At 4K
x16 1.0: 81% (still much faster than a 3090 Ti)
x16 2.0: 92%
x16 3.0: 98%
x16 4.0: 100%

That said, there is a notable exception:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/22.html

Metro Exodus benefits massively from PCI Express 4.0, so there is now at least one clear reason to use PCIe 4.0 for your GPU if you go for a 4090.

We'll see with the Ryzen 8xxx whether that holds up, or whether the small differences are partly down to the system being CPU-limited at the moment. But maybe splitting the PCIe x16 into x8 will again not be a major issue for this generation of cards.
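
For context on why the deltas stay so small, here's a quick sketch lining up the raw per-direction bandwidth of an x16 link at each generation (standard PCIe figures) against the 4K numbers above; the Python below is just illustrative:

# Approximate usable per-direction bandwidth of a PCIe x16 link, in GB/s.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use the denser 128b/130b, hence the jump.
bandwidth_x16 = {"1.0": 4.0, "2.0": 8.0, "3.0": 15.75, "4.0": 31.5}
relative_perf_4k = {"1.0": 81, "2.0": 92, "3.0": 98, "4.0": 100}

for gen in bandwidth_x16:
    print(f"PCIe {gen} x16: {bandwidth_x16[gen]:5.2f} GB/s -> {relative_perf_4k[gen]}% of Gen 4 at 4K")

An 8x bandwidth cut (Gen 4 down to Gen 1) only costs about 19%, which suggests most games barely touch the bus outside of asset streaming.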
 
Sorry if already posted: they tried a 4090 on different PCI-Express generations

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/28.html

[scaling tables snipped; see the post above]

Metro Exodus benefits massively from PCI Express 4.0, so there is now at least one clear reason to use PCIe 4.0 for your GPU if you go for a 4090.
Very interesting, as I haven't paid attention to the PCIe lane situation on the new motherboards. Needless to say, if Metro Exodus is an omen of what's to come in other games, you could be in a bit of a pickle on the Raptor Lake platform if you also want to run an NVMe 5.0 PCIe SSD (the CPU only has 16 Gen 5 lanes, so a Gen 5 drive means bifurcating the GPU slot down to x8). On the other hand, the new AM5 platform is on the expensive side, no doubt.
 
How is everyone's experience with DLSS 3 so far? Do you notice the crappy frames it generates?
 
How is everyone's experience with DLSS 3 so far? Do you notice the crappy frames it generates?
On the guru forums, people are praising it, saying it works very well. I also saw in TheFPSReview's coverage that it works well in their review. https://www.thefpsreview.com/2022/1...90-founders-edition-review/7/#ftoc-heading-29

No worse latency than native, and it's hard to see artifacts outside of 60 fps videos at 400 percent zoom and 3 percent speed. The 4090 owners' thread on guru is enlightening.

It'll only get better from here.
 
For those who've gotten a 4090, what's been your experience with coil whine? It's not horrible, but it's more noticeable than on any card I've had in a number of generations. Just something unavoidable with the power it's pulling?
I haven't had a quiet card (regarding coil whine) since Turing arrived. My EVGA RTX 2060 is VERY loud. The EVGA RTX 2080 is very loud. The AMD reference 6700 XT is loud. The Gigabyte 6600 XT has noticeable coil whine, but it's not too bad.
 
HUB's and DF's (later) takes on DLSS 3 FG are constructive criticism that I'm sure Nvidia will take note of to further improve the technology.
 
HUB's and DF's (later) takes on DLSS 3 FG are constructive criticism that I'm sure Nvidia will take note of to further improve the technology.
HUB built up a reputation of being biased against Nvidia because they spent years downplaying raytracing and DLSS. It got so bad that after their Ampere/RDNA2 reviews in 2020, Nvidia tried to blacklist them... So I doubt Nvidia much cares about HUB's opinions.
I haven't watched a full review of theirs in a year or two. Are they still trying to pretend raytracing doesn't exist, despite nearly every AAA game utilizing it?
 
HUB's and DF's (later) takes on DLSS 3 FG are constructive criticism that I'm sure Nvidia will take note of to further improve the technology.
If it was not already in the plan, I am not sure why they would not work on something to keep output below the monitor's refresh rate when frame generation is on (so VSYNC, G-Sync, and FreeSync keep working well).

Even if it is not official (and it could be), I am sure many employees watch such videos.

In the 2014 Sony leak, we could see they had people compiling lists of comments, complaints, etc. about their Blu-ray players and other home theatre items from users on home theatre message boards. I would imagine that remarks from the field's massive YouTube commentators get noted the same way.

Trying to blacklist-control them sounds more like proof of caring about their opinions than the other way around.

Are they still trying to pretend raytracing doesn't exist, despite nearly every AAA game utilizing it?


They somehow manage to test a 3090 Ti, of all cards, with the majority of the games they use for benchmarks supporting raytracing (Far Cry 6, Watch Dogs, Tomb Raider, F1 2021, Doom, Hitman 3, Cyberpunk), and yet I'm not sure it is even mentioned.

The 4090 coverage did seem to mark a turnaround; I imagine this is the first time there's enough performance headroom to spare for RTX to be worth bothering with.
 
HUB built up a reputation of being biased against Nvidia because they spent years downplaying raytracing and DLSS. It got so bad that after their Ampere/RDNA2 reviews in 2020, Nvidia tried to blacklist them... So I doubt Nvidia much cares about HUB's opinions.
I haven't watched a full review of theirs in a year or two. Are they still trying to pretend raytracing doesn't exist, despite nearly every AAA game utilizing it?
Best video card reviews online IMO.

Nvidia apologized openly after that fiasco.
 
Anyone think the 4090 might be the next 1080 Ti? My 1080 Ti lasted me a whopping 4 years…
 
Anyone think the 4090 might be the next 1080 Ti? My 1080 Ti lasted me a whopping 4 years…
With it being a cut-down die, not really: there are still plenty of CUDA cores left for a 4090 Ti (the 4090 uses 16,384 of AD102's 18,432) if AMD's 7000-series GPUs are faster in raster.
 
Anyone think the 4090 might be the next 1080 Ti? My 1080 Ti lasted me a whopping 4 years…

I think it definitely is, especially considering the node jump from the crappy Samsung process to TSMC's latest. I can't imagine needing more power to game at 4K two years from now. One thing the 4090 did expose is the need for a better CPU to handle it.
 
There is one single item holding the 4090 back from becoming the next 8800 GTX or 1080 Ti: price. Both the 8800 GTX and 1080 Ti were expensive, but not insanely so. The RTX 4090, though... woof.
Ya, that's true; the 1080 Ti was reasonably priced. The 4090 should still be a decent card in 3-4 years' time, but it's not the smoking deal the 1080 Ti was in retrospect.
 
Anyone think the 4090 might be the next 1080 Ti? My 1080 Ti lasted me a whopping 4 years…
Depends in which sense. They priced it high enough that the 4090 won't make high-priced 5070s and 5080s look terrible in performance per dollar, so it doesn't become some "thorn" from the past.

In terms of being a massive performance jump over anything remotely mainstream that was available before launch, and of being long-lived (it will take a long time for an xx60 to beat it), it very well could be.

TSMC's 2nm roadmap, with N2 targeted for 2025:
https://www.tomshardware.com/news/tsmc-reveals-2nm-fabrication-process

N2 could bring 54% less power usage at the same performance versus N5, and the AI world moves fast, which could maybe lead to a 225-watt class of GPU with similar performance, especially for a company with that much compute and data (how much AI will help in the actual design of the GPUs themselves, and starting when, etc.). Let alone the software side of things: chances are the next try at frame generation will be the "oh, now it works well enough" moment, a la DLSS 2.0.
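(To unpack the 225-watt figure, presumably starting from the 4090's 450 W limit: 450 W × (1 − 0.54) ≈ 207 W, i.e., roughly a 225 W class at the same performance, taking that 54% at face value.)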

It could very well be, and it looks like it will. One issue for NVIDIA going forward could be the shift from raw memory bandwidth to big caches, since cache (SRAM) barely scales on newer TSMC nodes. That's one of the big attractions of AMD's chiplet route: putting the cache on a cheaper, different node lets you spend more of the newer process on the relevant logic, getting higher yield with less silicon, or more silicon at the same yield and price.
 
There is one single item holding the 4090 back from becoming the next 8800 GTX or 1080 Ti: price. Both the 8800 GTX and 1080 Ti were expensive, but not insanely so. The RTX 4090, though... woof.
USD $599 and $699, respectively. I think I paid $629 each for my pair of BFGTech 8800GTX OC2.
 
USD $599 and $699, respectively. I think I paid $629 each for my pair of BFGTech 8800GTX OC2.
$599 in 2006 is now $881 in 2022 dollars. Let's add tariffs, continued inflation, increased R&D costs, much more GDDR, bigger boards, and higher wafer prices. That probably puts the high-end GPU at a minimum of $1200-$1300 these days.
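
If you want to sanity-check the inflation math, it's just a CPI ratio. A minimal sketch (the CPI-U annual averages below are approximations I'm assuming; picking specific months is what nudges the result toward the quoted $881):

# Adjust a 2006 price into 2022 dollars using the CPI ratio.
cpi_2006 = 201.6   # approximate CPI-U annual average for 2006 (assumed)
cpi_2022 = 292.7   # approximate CPI-U annual average for 2022 (assumed)
price_2006 = 599
print(round(price_2006 * cpi_2022 / cpi_2006))  # -> 870, in the same ballpark as $881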
 
Don't forget the 8800 Ultra, which started at $830 in spring 2007. I'm not sure if that's a better comparison to the xx90/Titan cards of the recent past, but it was only a bump in clock and memory speeds.
 
$599 in 2006 is now $881 in 2022 dollars. Let's add tariffs, continued inflation, increased R&D costs, much more GDDR, bigger boards, and higher wafer prices. That probably puts the high-end GPU at a minimum of $1200-$1300 these days.
I kinda wish that was true, but if it were, that would mean they were losing money on every 3080 they sold.
 
I kinda wish that was true, but if it were, that would mean they were losing money on every 3080 they sold.
No, I used the 2006 price of the highest-end card available at the time. You have to compare apples to apples: the x80 isn't the highest-end card now. Does anyone really think the highest-end cards could still launch with an MSRP under $1000 anymore?
 
I look forward to AMD's "the people's card" at only $599 for RTX 4090 performance. Of course it won't happen... these vendors are businesses.
The last time AMD/ATi did a proper pricing undercut was with the HD 4870... NVIDIA had to cut the price of the GTX 260 from $399 to $299, and that was only 1-2 months after launch. Looking at how AMD priced their new AM5 Ryzen CPUs... those days are long gone. I would have gone out and gotten the RTX 4090 had they priced it at $1199.
 
Some really like to downplay just how much of a fleecing Nvidia is doing on these cards.
I don't think Nvidia is fleecing anyone. They created a product that is superior to the previous-gen 3090 (by a lot), and they're selling it for about the same price once you include inflation. The only reason I think the 3080 existed at the price it did is that Samsung gave Nvidia a sweetheart deal on their 8nm node, and Nvidia gave them a massive order. Nvidia could probably cut the price of the 4090 by $600 and still make a decent amount of money, but then their investors would fire Mr. Leather Jacket.

I'm not standing up for Nvidia, btw. We all want a cheaper product... but this is what happens when you create a product that absolutely decimates anything that came before it: you control the pricing.

Now... fleecing? That's the 3090 Ti. I'm still not sure why anyone bought that hunk of hot garbage for $2K. I bet they all feel quite foolish since the 4090 dropped about 6 months later.
 
I don't think Nvidia is fleecing anyone. They created a product that is superior to the previous-gen 3090 (by a lot), and they're selling it for about the same price once you include inflation. The only reason I think the 3080 existed at the price it did is that Samsung gave Nvidia a sweetheart deal on their 8nm node, and Nvidia gave them a massive order. Nvidia could probably cut the price of the 4090 by $600 and still make a decent amount of money, but then their investors would fire Mr. Leather Jacket.

I'm not standing up for Nvidia, btw. We all want a cheaper product... but this is what happens when you create a product that absolutely decimates anything that came before it: you control the pricing.
Eh that's fair. I'm just not into downplaying the profit margin on these things. It's the premier product, so more power to them.

Wrong term for it probably, my bad.
 
Now... fleecing? That's the 3090 Ti. I'm still not sure why anyone bought that hunk of hot garbage for $2K. I bet they all feel quite foolish since the 4090 dropped about 6 months later.
Couldn't agree more - and I bought two of them - just sold them for about half what I paid for them. Still not too fussed. When I look at it - it costs me about $700 to get a performance upgrade to 4090 levels. Worth it every time.
 
Couldn't agree more - and I bought two of them - just sold them for about half what I paid for them. Still not too fussed. When I look at it - it costs me about $700 to get a performance upgrade to 4090 levels. Worth it every time.
Ouch. I'm sorry brother.
 