4090 Reviews are up.

Seems like Unreal Engine 5 is going to usher in a new era of PC gaming in terms of graphical fidelity, and that's where something like the 4090 shines... you don't really need it for today's games, but next-gen is coming up fast...

 
I'm guessing they released the 4090 to get the sales in before the (real) 4080 hits the shelves at a much more reasonable (although still far too high) price... it'll be interesting to see how the crippled 4080 (really a 4070) does.
 
Are these 4000 series cards actually going to hit shelves, or is it bots for the next eight months while the FOMOs pay 100% MSRP?
 
Are these 4000 series cards actually going to hit shelves, or is it bots for the next eight months while the FOMOs pay 100% MSRP?
The reason the 3080/3090 were so popular was that you could make money with them via crypto mining. Since crypto has crashed, the only people who want these cards are gamers and some content creators. With the carrot of the card paying for itself no longer dangling in front of us, scalpers can grab these at their own peril. They may sell one or two if they're lucky, but more than likely they'll just be sitting on them and eventually have to sell them at or below MSRP to get rid of them.

Also, a looming recession puts the 4090 deep into "luxury goods" territory. Nobody needs one for competitive gaming; this is strictly for eye candy. Add to that the fact that this card is so fast you really need a 4K120 display to take advantage of it... unless you have one, the 4090 isn't going to seem much faster than the 3090. Even 1440p seems to be pretty CPU-limited on this GPU.

Completely different economic scenario compared to the RTX 3080/3090 launch.
 
At least a couple of the reviews I've seen have specifically mentioned that availability isn't expected to be an issue. I think the 3090 rolled out in a perfect storm of crypto, COVID, and hyped games.
 
One thing the 4090 does is actually make me want to upgrade my monitor from 1440p to 4K... before, 1440p was the sweet spot, as 4K was still too performance-limited with everything maxed out... I never had any interest in gaming at 4K for that reason, since maxing things out was always important to me... the 4090 might actually make 4K gaming the new sweet spot.
 
One thing the 4090 does is actually make me want to upgrade my monitor from 1440p to 4K... before, 1440p was the sweet spot, as 4K was still too performance-limited with everything maxed out... I never had any interest in gaming at 4K for that reason, since maxing things out was always important to me... the 4090 might actually make 4K gaming the new sweet spot.
I've had my LG CX OLED and used 4K120 for a couple of years now... I will never go back to 1440p.
 
The 3090 Ti was ridiculed and knocked because of its crazy power draw.
This is something that never made any sense to me. It was a high performance part. Performance isn't free. It has to come from somewhere.
So the RTX-4090 being similar is not a good thing.
Actually, it is. Managing to achieve more performance for the same power cost is a good thing.
I thought hardware was supposed to progress to smaller and more efficient and still have faster performance.
This is a very common misconception. Allow me to dispel it. This has never been the case. CPU and GPU power consumption has done nothing but increase over the last two decades. Today's GPUs consume far more power than the GeForce 2 GTS 64MB cards of yesteryear, or even the venerable 8800 GTX. Today's CPUs are orders of magnitude more powerful than what was considered high performance 25 years ago. They also consume far more power. In the 1990s, motherboards didn't have VRMs capable of hundreds of watts of power output.

Case in point: the TDP of a Pentium 166MHz CPU is about 14.5W. In contrast, an AMD Ryzen 9 7950X has a TDP of 170W and an actual maximum power draw of around 230W under the right conditions. If the design goal were purely efficiency with no performance uplift, chips would get more efficient, but not likely much faster. If you want more performance, the cost is power. That's true of every technology I can think of. Want a more powerful firearm? Then you either rechamber it in a more powerful round, which means something bigger, heavier, or faster, or you increase its fire rate. Either way, you need an increase in some resource. Want a faster car? Fuel efficiency goes down. The 5.0L V8 in a Mustang GT is far more powerful than what comes in a Chevy Cruze or Honda Civic, but its fuel consumption is considerably higher in turn. Performance isn't free. There are simple ASICs that replace legacy parts and do get shrunk with lower power consumption, but that isn't true of anything designed to increase performance.

At best, you get more performance for similar power consumption. That's how manufacturers choose to use their efficiency gains. Sure, they could make a Pentium 166MHz CPU that consumes maybe a tenth of what it did back in the 1990s. Who would buy it? What purpose would it serve? And of course, NVIDIA probably could have made a more efficient version of the RTX 3090 Ti, but we wouldn't have seen much of a performance uplift, if any. Such a part might have had some overclocking headroom, but the second people started overclocking it to outperform the older RTX 3090s, the efficiency gains would be lost. People want performance. Power consumption is secondary at best for the gaming crowd and high-performance enthusiasts. As a result, efficiency gains are best spent on increasing performance, not on reducing thermals or your electric bill (see the rough numbers below).

Let's look at other examples. Gulftown was a die shrink of the existing Core i7 900 series CPUs. What did Intel do with it? Added more cache and increased the core count. You could clock it higher too, but again that costs power and generates more heat. Die shrinks on GPUs have inevitably led to increased clocks and, at best, similar power consumption to previous models. Sure, game-changing breakthroughs happen once in a while that reset the playing field a bit, but power consumption and heat climb again from that point in order to increase performance. This has been true even in the mobile and server markets, where performance per watt is prioritized over absolute performance.
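To put rough numbers on "efficiency gains are best spent on performance," here's a minimal sketch with purely hypothetical figures: the 450W power budget and the 40% perf/W uplift below are illustrative, not measurements of any real part.

```python
# Rough illustration of how an efficiency gain can be "spent" (hypothetical numbers,
# not measurements of any real GPU): hold performance and cut power, or hold power
# and raise performance.

def spend_efficiency_gain(base_perf, base_power_w, perf_per_watt_gain):
    """Return (power at same performance, performance at same power) after a perf/W uplift."""
    new_perf_per_watt = (base_perf / base_power_w) * (1 + perf_per_watt_gain)
    power_same_perf = base_perf / new_perf_per_watt
    perf_same_power = new_perf_per_watt * base_power_w
    return power_same_perf, perf_same_power

# Hypothetical flagship: 100 "performance units" at 450 W, with a 40% perf/W improvement.
power_w, perf = spend_efficiency_gain(100, 450, 0.40)
print(f"Same performance: {power_w:.0f} W instead of 450 W")
print(f"Same power:       {perf:.0f} performance units instead of 100")
```

Under those assumptions the vendor can either ship the same performance at roughly 321W or roughly 40% more performance at the same 450W, and history says they take the performance.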
 
Skimming through a bunch of the reviews: no doubt the 4090 is a beast of a card with absolutely excellent performance.

But with tax it's $1,750+, which is just insanity to see. Same with the 3090 launch; that $1,500 price was outrageous, especially because the 3080 was only 10% to 15% slower but half the price at $799.

But if you game at 4K and play the newest titles, yeah, this is the card to get. That's the thing, though: all the games I play are a bit older and not as demanding, so the amazing performance of the RTX 4090 would be wasted on my games and my 1440p resolution.
 
Who would buy it? What purpose would it serve?

Hey!

Some of us have been gaming since the early 1980s. That being said...

Industry relies on scads of old chips in new industrial equipment. They are cheap, cheap, cheap.

Not everything is gaming.

But as an aside, I can play Doom (the original) on the controller of the six-axis precision mill we have here. In that case it's a $2.5 million gaming machine.

Phht
 
The reason the 3080/3090 were so popular was that you could make money with them via crypto mining. Since crypto has crashed, the only people who want these cards are gamers and some content creators. With the carrot of the card paying for itself no longer dangling in front of us, scalpers can grab these at their own peril. They may sell one or two if they're lucky, but more than likely they'll just be sitting on them and eventually have to sell them at or below MSRP to get rid of them.

Also, a looming recession puts the 4090 deep into "luxury goods" territory. Nobody needs one for competitive gaming; this is strictly for eye candy. Add to that the fact that this card is so fast you really need a 4K120 display to take advantage of it... unless you have one, the 4090 isn't going to seem much faster than the 3090. Even 1440p seems to be pretty CPU-limited on this GPU.

Completely different economic scenario compared to the RTX 3080/3090 launch.
Yeah, I don't see much of a scalper market this time around with crypto out of the picture. You may see a few FOMO purchases on eBay in the next 2-3 weeks from those who missed the initial drop, but it does sound like supply is good (rumor has it Nvidia has been stockpiling these since June and only waited to clear out Ampere inventory).

What is definitely lingering is the $1,200+ hangover from the Turing 2080 Ti launch, where Nvidia somehow convinced everyone that this pricing was the new normal for non-Titan GPUs. The 3090 got a pass because the 3080 existed and the 3090 was, in all respects, the new Titan (double the RAM, BFGPU), and everyone thought it was a bargain at $1,500 thanks to the warped reality field of the crypto/COVID stimmy bubble of 2020/2021. The gut punch this time around is another price hike on the new Titan-class 4090, while the only fallback is the hiked-up 4080 (4070 Ti or whatever it really is) at $1,200 and then last-gen Ampere parts at a discount. So yeah, the pricing on these initial Ada parts is meant to clear out old inventory; what happens to pricing after that is anyone's guess.

The downside is that since this card is launching during the "holiday shopping window," I'm pretty sure it enjoys the very relaxed return rules from big retailers, so Jan 15th/31st return dates instead of your typical 15-30 day window. But yeah, that's how long I'd expect scalpers to try to sit on these and move them: long enough to get that first credit card statement. If they don't sell, they'll be lining up to return them ASAP.
 
The FPS Review was the first one I found this morning. My schedule tomorrow doesn't allow any computer time during the release (seems to happen to me every time, LOL). With all these glowing reviews it will be interesting to see how long stock lasts. It will perhaps make for amusing reading when I arrive home.
 
Right now a 3080 can pull 60fps in nearly all major titles at 4K, at least as long as you're willing to use DLSS... although usually only the milder DLSS settings are needed.

I see these 40-series cards as the ones to buy for people still rocking 10-series or 20-series cards who are gaming at 4K, especially someone with a newly acquired G-Sync 4K TV/monitor, since you'll be able to push above 100fps. If I were in that boat, I'd probably either buy one of these ASAP or at least plan on getting the 16GB 4080. For 30-series owners (like myself), I dunno. I guess if you can sell your current card it's worthwhile. If not, you'd either have to hate DLSS or have a monitor that can push 4K/144Hz or higher.
 
The FPS Review was the first one I found this morning. My schedule tomorrow doesn't allow any computer time during the release (seems to happen to me every time, LOL). With all these glowing reviews it will be interesting to see how long stock lasts. It will perhaps make for amusing reading when I arrive home.
I suspect they'll last on shelves a long while. Not many folks spend that kind of cheddar on GPUs. At least now that they can't make any money with them.
 
Too bad the 4090 is only DisplayPort 1.4a. All those frames, but it's limited to 120Hz at 4K unless you want to drop to chroma subsampling. I do not understand this choice on a card with this much performance to dish out.
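For anyone curious, here's a rough back-of-the-envelope check of that limit. It's approximate: it ignores blanking and audio overhead and assumes no DSC, and it takes DP 1.4a's HBR3 link as roughly 25.92 Gbit/s of payload after 8b/10b encoding.

```python
# Back-of-the-envelope DisplayPort 1.4a bandwidth check (ignores blanking/audio overhead
# and DSC, so real requirements are somewhat higher than the raw pixel rate shown here).

DP14A_PAYLOAD_GBPS = 8.1 * 4 * 8 / 10   # HBR3: 4 lanes x 8.1 Gbit/s, 8b/10b encoding -> 25.92

def pixel_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

for bpc in (8, 10):
    need = pixel_rate_gbps(3840, 2160, 120, bpc)
    verdict = "fits" if need <= DP14A_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K120 {bpc}-bit RGB: ~{need:.1f} Gbit/s -> {verdict} in {DP14A_PAYLOAD_GBPS:.2f} Gbit/s")
```

By this rough math, 4K120 at 8-bit RGB squeaks in, but 10-bit color or higher refresh rates push you toward DSC, chroma subsampling, or a lower refresh rate, which is the trade-off mentioned above.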
 
Curious what the scalper market on these new cards will look like. If stock is plentiful, or at least only back-ordered a couple of weeks and then replenished, the scalper market will be non-existent, which is great.

And people looking for an "affordable" card can pick up a 3080 Ti for a good price, probably like $600.
 
I suspect they'll last on shelves a long while. Not many folks spend that kind of cheddar on GPUs. At least now that they can't make any money with them.
There are going to be LOTS of these in NA. There is more inventory for this product than for almost any card I have ever seen launched in NA, at any price. I guess NV was betting that Ethereum's switch to proof of stake would be pushed off again. Watching inventories will be interesting.
 
Curious what the scalper market on these new cards will look like. If stock is plentiful, or at least only back-ordered a couple of weeks and then replenished, the scalper market will be non-existent, which is great.

And people looking for an "affordable" card can pick up a 3080 Ti for a good price, probably like $600.

Enjoy it while you can....

When there's another run-up in crypto, we're gonna get the same thing all over again.
 
Overall only a tiny fraction of people even game at 4K, and these cards make no sense at 1080p or 1440p... I can't see these flying off the shelves.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
It will be interesting for sure. This is the first time a GPU isn't being maxed out even at 1440p. Potential 4090 buyers are going to be disappointed that they can't max out their GPU.

Nvidia just kicked the high refresh rate 4K market into gear.
 
https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/32.html

45 percent faster at 4K, no ray tracing or DLSS. :) (100 divided by 69 relative performance = ~45 percent faster than the 3090 Ti.) Same power draw, too.

No kidding. Not entirely sure where they keep getting "30%" from. The Hardware Unboxed video (second one posted) shows 50% or more at 4K, and that's without "software trickery".

producing 30% @ 1440p
depends on what/who you're looking at...
now who's using 30%?! ;)
 
Sure, if you have $1600-2000 to blow on a GPU.

Nice card, exorbitant price.

Just put it on your credit card, lol.

Seriously tho, I can afford this card, but I'm very happy with 1080p gaming, so I'll wait for a 4060 to replace my RTX 2060.
 
No kidding. Not entirely sure where they keep getting "30%" from. The Hardware Unboxed video (second one posted) shows 50% or more at 4K, and that's without "software trickery".
Gamer math, as always. They see the 3090 Ti at 70% while the 4090 is at 100%. Public education tells them that's a 30% increase instead of 43%.
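For anyone who wants to sanity-check the figures being thrown around: relative-performance charts normalize the fastest card to 100%, and "X% faster" comes from the ratio of the two values, not from the difference in chart points. A quick check:

```python
# Converting relative-performance chart values (fastest card = 100%) into "% faster".
# The 30-point gap is how much slower the 3090 Ti is, not how much faster the 4090 is.

def percent_faster(new_rel, old_rel):
    return (new_rel / old_rel - 1) * 100

print(f"{percent_faster(100, 69):.1f}")   # TechPowerUp 4K chart: ~44.9% faster than the 3090 Ti
print(f"{percent_faster(100, 70):.1f}")   # 70% vs 100%: ~42.9% faster, not 30%
print(100 - 70)                           # 30 points is the 3090 Ti's deficit, read the other way
```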
 
If you're a 4K gamer, then the 4090 makes sense and will help tremendously in getting high fps. But it goes downhill from there: for a 3440x1440 gamer, yeah, it's a good card, but what, 30% to 40% faster than the 3090 Ti? And at 1440p it's 30% faster? And at 1080p the 4090 is pure overkill and silly.
 
For the $2K price of the 4090, you can get the one video card, or buy a brand new 55" OLED TV, a PS5, and a 5.1 surround sound setup.
 
If you're a 4K gamer, then the 4090 makes sense and will help tremendously in getting high fps. But it goes downhill from there: for a 3440x1440 gamer, yeah, it's a good card, but what, 30% to 40% faster than the 3090 Ti? And at 1440p it's 30% faster? And at 1080p the 4090 is pure overkill and silly.

Stop. Just stop. You already made the following points:
1. It's only 30% faster at 4K when using a suite with frame-capped games and public education math.
2. It's only 30% faster at 1440p due to other bottlenecks.
3. That's way more power than YOU need right now, so the product is stupid.

Did I miss anything?
 
Really seems like a card for RTX enthusiasts @ 4K. Otherwise just overkill.
This would be a godsend for DCS World in VR. I don't get around to actually playing as much as I'd like, so I don't know that I'll actually buy one, but a combination of this and a CPU with AMD's 3D V-Cache technology would be a very welcome upgrade.

Currently, I don't think it's possible to get a combination that can stay out of reprojection at all times, and when it does use reprojection, it's often at the worst possible time. For players committed enough to have a sim cockpit somewhere in their house, I can imagine the cost of upgrading to a 4090 being worthwhile.
 
Stop. Just stop. You already made the following points:
1. It's only 30% faster at 4K when using a suite with frame-capped games and public education math.
2. It's only 30% faster at 1440p due to other bottlenecks.
3. That's way more power than YOU need right now, so the product is stupid.

Did I miss anything?

I'm not really knocking the 4090; it's a great performer, and especially at 4K it's a big upgrade. For Cyberpunk 2077 it's a massive upgrade.

I think we need new game releases to push this card and really show its power, like Starfield, S.T.A.L.K.E.R. 2, the upcoming new Quake game, the upcoming MMO Ashes of Creation, etc.
 
This would be a godsend for DCS World in VR. I don't get around to actually playing as much as I'd like, so I don't know that I'll actually buy one, but a combination of this and a CPU with AMD's 3D V-Cache technology would be a very welcome upgrade.

Currently, I don't think it's possible to get a combination that can stay out of reprojection at all times, and when it does use reprojection, it's often at the worst possible time. For players committed enough to have a sim cockpit somewhere in their house, I can imagine the cost of upgrading to a 4090 being worthwhile.
Yep. Properly feeding a Pimax 8KX for VR (and the 12X next year), plus a massive uplift for Topaz AI video upscaling, are the two killer apps that make the 4090 a no-brainer for me. VR's GPU appetite is bottomless because even if an individual title doesn't use 100% of the GPU, you can crank supersampling higher to eat the headroom (quick math below).

But there's nothing more mind-blowing in all of gaming than the sense of speed flying low through a valley in an F-16 with a wide-FOV VR headset.
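On the supersampling point, the math behind "bottomless" is simple: if the slider scales resolution per axis, the rendered pixel count grows with the square of the factor (some runtimes label the slider as a total pixel multiplier instead, so check which convention yours uses). A minimal sketch with an illustrative per-eye resolution:

```python
# Why VR supersampling soaks up any GPU headroom: with a per-axis supersampling factor,
# the rendered pixel count grows with the square of that factor.
# The base per-eye resolution below is illustrative, not a specific headset's spec.

base_w, base_h = 3840, 2160   # hypothetical per-eye render target

for ss in (1.0, 1.25, 1.5, 2.0):
    pixels = int(base_w * ss) * int(base_h * ss) * 2   # both eyes
    print(f"SS {ss:.2f}x per axis -> {pixels / 1e6:.1f} MP per frame (~{ss * ss:.2f}x the work)")
```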
 
I'm not really knocking the 4090; it's a great performer, and especially at 4K it's a big upgrade. For Cyberpunk 2077 it's a massive upgrade. I think we need new game releases to push this card and really show its power, like Starfield, S.T.A.L.K.E.R. 2, the upcoming new Quake game, the upcoming MMO Ashes of Creation, etc.

It's a halo product, not really meant for 99% of users today, but it's still exciting to see the performance uplifts. The lower cards will obviously be cheaper, so we could easily get a 4070 that performs like a 3090 Ti at half the price... or a 4060 at half that of a 3080.
 