$2,500 RTX 5090 (60% faster than 4090)

They might do both: a close-to-full-die 512-bit-bus 32GB 5090, then a 5080 Ti made by cutting down the same chip. Maybe 24GB on a 384-bit bus?
If they do this, it would be wise for them not to wait too long to launch the 5090 Ti.

Conceptual launch strategy:
Launch the 5080, then launch the 5090 ($1,500-$1,800), then the 5090 Ti after the 5090 sells out (at around $2,000-$2,500).

It would be a decent launch strategy to lure people who didn't snag a 5090 at launch into buying a 5090 Ti at a large markup. It would likely be a very limited release, because they wouldn't want a repeat of the 3090 Ti disaster, where they had to slash the price to $1,000 three months after launch because they overproduced them. But it depends on their yields; if yields are bad, I don't think they will do a 5090 Ti.

I don't expect them to do it like the 3000 series; the 3090 Ti didn't sell well compared to the 3090 because it came out 1.5 years later.

There's no way Nvidia is releasing a full-die 5090. They're going to laser off some of the cores, cut down the bus, and hard-lock the BIOS power limit.

And your options will be to buy it or not; there's no competition to consider.
I believe this is more likely: the 5090 is going to be more limited and more cut down this generation, but it might still have a 40%+ performance uplift. I don't think it will beat the 64% uplift from the 3090 to the 4090, because I suspect they will release it as a 300-400W card, since the rumors point to a dual-slot design.
 
I believe this is more likely: the 5090 is going to be more limited and more cut down this generation.
Unlike with the 4090, the "free lunch" from the node gain could be smaller and on a very mature node, with what that implies for yields; the 4090 has 16,384 of 18,432 cores enabled, or roughly 88.9%.

It could depend on whether it ends up with GDDR7 and whether that uplift makes it possible. The 4090 shifted expectations about how big a gen-on-gen leap can be at that tier, and going from TSMC 4 to a better, more mature TSMC 4, I am not sure they can afford to cut it down more... we could see it go the other way, with only a ~10% cut-down imo; by then, the yield of a 92%-enabled die on that TSMC 5-class node could be the same as or better than the 88.9% was back then.
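For anyone who wants to play with those enable-rate numbers, here is a trivial sketch. The 4090 figures are the ones quoted above; the full-die count and the ~92% cut for a hypothetical 5090 are placeholders for illustration, not confirmed specs, and the 128-cores-per-SM granularity is carried over from Ada as an assumption.

Code:
# Back-of-the-envelope: what fraction of the full die is enabled?
# 4090 numbers are the known AD102 figures; the "5090" numbers are
# illustrative placeholders only, not confirmed specs.

def enabled_fraction(enabled_cores: int, full_die_cores: int) -> float:
    """Share of the full die's CUDA cores that are enabled."""
    return enabled_cores / full_die_cores

# RTX 4090: 16,384 of AD102's 18,432 cores enabled
print(f"RTX 4090: {enabled_fraction(16_384, 18_432):.1%}")        # -> 88.9%

# Hypothetical ~92%-enabled Blackwell SKU (full-die count is an assumption)
full_die = 24_576                                # placeholder, not a confirmed figure
cut_sku = round(full_die * 0.92 / 128) * 128     # keep it a whole number of 128-core SMs
print(f"Hypothetical 5090: {cut_sku} cores, {enabled_fraction(cut_sku, full_die):.1%}")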
 
Unlike with the 4090, the "free lunch" from the node gain could be smaller and on a very mature node, with what that implies for yields; the 4090 has 16,384 of 18,432 cores enabled, or roughly 88.9%.

It could depend on whether it ends up with GDDR7 and whether that uplift makes it possible. The 4090 shifted expectations about how big a gen-on-gen leap can be at that tier, and going from TSMC 4 to a better, more mature TSMC 4, I am not sure they can afford to cut it down more... we could see it go the other way, with only a ~10% cut-down imo; by then, the yield of a 92%-enabled die on that TSMC 5-class node could be the same as or better than the 88.9% was back then.
Same, kind of in line with the "leaks" so far; I see the lower cards being cut down more and the flagship remaining the same or going to the full die. They won't get too complacent, or they will start making needless errors.
 
Unlike with the 4090, the "free lunch" from the node gain could be smaller and on a very mature node, with what that implies for yields; the 4090 has 16,384 of 18,432 cores enabled, or roughly 88.9%.

It could depend on whether it ends up with GDDR7 and whether that uplift makes it possible. The 4090 shifted expectations about how big a gen-on-gen leap can be at that tier, and going from TSMC 4 to a better, more mature TSMC 4, I am not sure they can afford to cut it down more... we could see it go the other way, with only a ~10% cut-down imo; by then, the yield of a 92%-enabled die on that TSMC 5-class node could be the same as or better than the 88.9% was back then.
The 4090 could benefit from more bandwidth (e.g., memory overclocks yield bigger gains than core overclocks), so I do think the 5090 will see a major uplift from bandwidth alone. Even if the 5090 is cut down by 15%, it could still provide a 60% performance increase over the 4090. It will also be interesting to see if the increase in L2 cache makes a difference; it probably played a decent role in the performance uplift from the 3090 to the 4090. But yeah, I agree with you, it all depends on whether GDDR7 makes that much of a difference.

I'll chart it out between the 3090, 4090, and rumored 5090 for visual reference. (The 5090 boost clock is a conservative guess; it could be higher if they're really raising the base clock that much, since judging by the base clocks, the new architecture can hit much higher clocks at a likely similar voltage.) There's also a short sketch at the end of this post showing how the percentage columns and the rumored bandwidth figure are computed.

Code:
### RTX 4090 vs RTX 5090 (Rumored) -- Maybe 60%+ performance uplift??

| Specification                  | RTX 4090                    | RTX 5090 (Rumored)           | Percentage Increase       |
|--------------------------------|-----------------------------|------------------------------|---------------------------|
| CUDA Cores                     | 16,384                      | ≈24,576                      | 50%                       |
| Streaming Multiprocessors (SMs)| 128                         | ≈192                         | 50%                       |
| Ray Tracing Cores              | 128                         | ≈192                         | 50%                       |
| Tensor Cores                   | 512                         | 768                          | 50%                       |
| Base Clock                     | 2.23 GHz                    | 2.9 GHz                      | 30%                       |
| Boost Clock                    | 2.52 GHz                    | 3.2 GHz                      | 26.98%                    |
| L2 Cache                       | 72MB                        | 128MB                        | 78%                       |
| Memory Bandwidth               | 1,008 GB/s                  | 1,568 GB/s                   | 56%                       |
| VRAM                           | 24GB GDDR6X                 | 28GB GDDR7                   | 16.7%                     |
| Memory Bus                     | 384-bit                     | 448-bit                      | 16.7%                     |
| TDP                            | 450W                        | 500W                         | 11%                       |

### RTX 3090 vs RTX 4090 -- Ended up being 64% average performance uplift.

| Specification                  | RTX 3090                    | RTX 4090                    | Percentage Increase        |
|--------------------------------|-----------------------------|-----------------------------|----------------------------|
| CUDA Cores                     | 10,496                      | 16,384                      | 56%                        |
| Streaming Multiprocessors (SMs)| 82                          | 128                         | 56%                        |
| Ray Tracing Cores              | 82                          | 128                         | 56%                        |
| Tensor Cores                   | 328                         | 512                         | 56%                        |
| Base Clock                     | 1.4 GHz                     | 2.23 GHz                    | 59%                        |
| Boost Clock                    | 1.7 GHz                     | 2.52 GHz                    | 48.23%                     |
| L2 Cache                       | 6MB                         | 72MB                        | 1100%                      |
| Memory Bandwidth               | 936 GB/s                    | 1,008 GB/s                  | 8%                         |
| VRAM                           | 24GB GDDR6X                 | 24GB GDDR6X                 | 0%                         |
| Memory Bus                     | 384-bit                     | 384-bit                     | 0%                         |
| TDP                            | 350W                        | 450W                        | 29%                        |

Edit: corrected the error on the 4090 boost clock; previously I had listed the max boost.
Edit 2: updated 5090 TDP based on Seasonic information.
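For reference, the percentage columns above are just (new - old) / old, and the rumored 1,568 GB/s falls straight out of the bus width and per-pin data rate. A quick sketch; the 448-bit bus and 28 Gbps GDDR7 inputs are rumors, not confirmed specs.

Code:
# How the "Percentage Increase" columns are derived, plus the rumored bandwidth.
def pct_increase(old: float, new: float) -> float:
    return (new - old) / old * 100

print(f"CUDA cores:  {pct_increase(16_384, 24_576):.1f}%")    # 50.0%
print(f"Boost clock: {pct_increase(2.52, 3.2):.2f}%")         # 26.98%
print(f"L2 cache:    {pct_increase(72, 128):.1f}%")           # 77.8%

# Memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
bus_bits, gbps = 448, 28                  # rumored 5090 values (assumption)
bandwidth = bus_bits / 8 * gbps
print(f"Bandwidth:   {bandwidth:.0f} GB/s")                   # 1568 GB/s
print(f"vs 4090:     {pct_increase(1_008, bandwidth):.1f}%")  # ~55.6%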
 
They won't get too complacent,
Especially if they want to try to make the four-figure xx80-tier price stick while using TSMC 4NP (i.e., a TSMC 5nm-class node) in 2026 and maybe 2027... they could need to push it as far as they can while still being able to sell it in China on that one...

For the xx60/xx70 tier, in my opinion they need to make an impressive, Turing-type performance jump this time, and they won't have leftover 4060 stock and future sales to protect the way they did with the 3060; considering how small and low-bandwidth the AD107 was, it would be very easy to do...

Well, maybe not a "2060 being 60% faster than the 1060 6GB" level of jump, but at least 35%, if they want to keep prices up.
 
The 4090 could benefit from more bandwidth (e.g., memory overclocks yield bigger gains than core overclocks), so I do think the 5090 will see a major uplift from bandwidth alone. Even if the 5090 is cut down by 15%, it could still provide a 60% performance increase over the 4090. It will also be interesting to see if the increase in L2 cache makes a difference; it probably played a decent role in the performance uplift from the 3090 to the 4090. But yeah, I agree with you, it all depends on whether GDDR7 makes that much of a difference.

I'll chart it out between the 3090, 4090, and rumored 5090 for visual reference. (The 5090 boost clock is a conservative guess; it could be higher if they're really raising the base clock that much, since judging by the base clocks, the new architecture can hit much higher clocks at a likely similar voltage.)

Code:
### RTX 4090 vs RTX 5090 (Rumored) -- Maybe 50-60% performance uplift??

| Specification                  | RTX 4090                    | RTX 5090 (Rumored)           | Percentage Increase       |
|--------------------------------|-----------------------------|------------------------------|---------------------------|
| CUDA Cores                     | 16,384                      | ≈24,576                      | 50%                       |
| Streaming Multiprocessors (SMs)| 128                         | ≈192                         | 50%                       |
| Ray Tracing Cores              | 128                         | ≈192                         | 50%                       |
| Tensor Cores                   | 512                         | 768                          | 50%                       |
| Base Clock                     | 2.23 GHz                    | 2.9 GHz                      | 30%                       |
| Boost Clock                    | 2.52 GHz                    | 3.2 GHz                      | 26.98%                    |
| L2 Cache                       | 72MB                        | 128MB                        | 78%                       |
| Memory Bandwidth               | 1,008 GB/s                  | 1,568 GB/s                   | 56%                       |
| VRAM                           | 24GB GDDR6X                 | 28GB GDDR7                   | 16.7%                     |
| Memory Bus                     | 384-bit                     | 448-bit                      | 16.7%                     |
| TDP                            | 450W                        | 350W-450W                    | -22% to 0%                |

### RTX 3090 vs RTX 4090 -- Ended up being 64% average performance uplift.

| Specification                  | RTX 3090                    | RTX 4090                    | Percentage Increase        |
|--------------------------------|-----------------------------|-----------------------------|----------------------------|
| CUDA Cores                     | 10,496                      | 16,384                      | 56%                        |
| Streaming Multiprocessors (SMs)| 82                          | 128                         | 56%                        |
| Ray Tracing Cores              | 82                          | 128                         | 56%                        |
| Tensor Cores                   | 328                         | 512                         | 56%                        |
| Base Clock                     | 1.4 GHz                     | 2.23 GHz                    | 59%                        |
| Boost Clock                    | 1.7 GHz                     | 2.52 GHz                    | 48.23%                     |
| L2 Cache                       | 6MB                         | 72MB                        | 1100%                      |
| Memory Bandwidth               | 936 GB/s                    | 1,008 GB/s                  | 8%                         |
| VRAM                           | 24GB GDDR6X                 | 24GB GDDR6X                 | 0%                         |
| Memory Bus                     | 384-bit                     | 384-bit                     | 0%                         |
| TDP                            | 350W                        | 450W                        | 29%                        |

Edit: corrected the error on the 4090 boost clock; previously I had listed the max boost.
Looks like Seasonic might have just leaked the 5090 having a ~500W TDP. That bodes well for a decent performance uplift... at the cost of heat and power.

https://wccftech.com/nvidia-geforce...500w-5080-350w-5070-220w-5060-170w-5050-100w/
 
Makes sense: same node size but bigger chips, so more power. Now, do I really want a 500W+ space heater?
Space heater is a stretch. As long as they keep the 2.5+ slot coolers, they'll be fine and quiet. (I'm part of the minority that likes the big coolers and cards.)
 
Space heater is a stretch. As long as they keep the 2.5+ slot coolers, they'll be fine and quiet. (I'm part of the minority that likes the big coolers and cards.)
Well, I had CrossFire and SLI setups that drew more power than that, so all should be golden. Just hope it's a real launch where you have at least 15 minutes to order one rather than waiting months.
 
Space heater is a stretch. As long as they keep the 2.5+ slot coolers, they'll be fine and quiet. (I'm part of the minority that likes the big coolers and cards.)
Funny enough, "space heater" is quite apt. Most electric heaters can't draw more than 10A, so in North America the highest setting on an out-of-the-box space heater is ~1,200W, and 500W is a significant fraction of that.
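Quick sanity check on that, assuming a 120V North American circuit and the ~10A figure above:

Code:
# P = V * I: max draw of a typical plug-in heater vs. a rumored 500W GPU.
volts, amps = 120, 10          # assumed North American circuit, ~10A heater limit
heater_watts = volts * amps
gpu_watts = 500                # rumored 5090 TDP

print(f"Max heater draw: {heater_watts} W")                      # 1200 W
print(f"500W GPU: {gpu_watts / heater_watts:.0%} of a heater")   # ~42%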
 
Funny enough, "space heater" is quite apt. Most electric heaters can't draw more than 10A, so in North America the highest setting on an out-of-the-box space heater is ~1,200W, and 500W is a significant fraction of that.
I have a portable oil radiator with 300W, 500W, and 800W settings.
 
I think the 5090 will cost $2,500 minimum. No question. A card that powerful will command money. I am skipping the 50 series unless I hit the lottery.
 
Funny enough, "space heater" is quite apt. Most electric heaters can't draw more than 10A, so in North America the highest setting on an out-of-the-box space heater is ~1,200W, and 500W is a significant fraction of that.
Yeah, a PC is, after all, a glorified electric heater (one that costs much more and does much more than just heat).

I literally ripped out one of my electric heaters and replaced it with the (4090-based) gaming PC, since I don't heat much in winter anyway and that room is tiny. Combined with the 48" OLED, various peripherals, and the 5.1 speaker setup, it's often over 1,000W on that outlet (though they're rated at over 3,000W here anyway; Europe thing).
 
No thanks on 500 watts; sounds like it's just an overclocked 4090 with GDDR7.
 
The performance would need to be something like +60% for me to say yes to 500W. But even then I would probably skip that gen and wait for a more efficient node.
 
No thanks on 500 watts; sounds like it's just an overclocked 4090 with GDDR7.
You're completely neglecting that the chip's specs are roughly 50% higher across the board. If these rumors are true, the 5090 will smoke the 4090, and the 4090 is my current favorite GPU of all time.

The performance would need to be something like +60% for me to say yes to 500W. But even then I would probably skip that gen and wait for a more efficient node.
Yeah, with this TDP information, I'm now thinking 60%+. In the back of my mind I originally thought it could be a 350W card, since it was rumored to be dual-slot.
 
You're completely neglecting that the chip's specs are roughly 50% higher across the board. If these rumors are true, the 5090 will smoke the 4090, and the 4090 is my current favorite GPU of all time.


Yeah, with this TDP information, I'm now thinking 60%+. In the back of my mind I originally thought it could be a 350W card, since it was rumored to be dual-slot.

I put little faith in rumors; it's easier when a company leaks actual facts. Chip specs 50% higher across the board is unlikely.
 
I put little faith in rumors; it's easier when a company leaks actual facts. Chip specs 50% higher across the board is unlikely.
I agree with you on this, but kopite7kimi has been very accurate. Occasionally the leakers from Chiphell are inaccurate; MLID is probably the least accurate leaker.

But yeah, I agree.
 
If they do this, it would be wise for them not to wait too long to launch the 5090 Ti.

Conceptual launch strategy:
Launch the 5080, then launch the 5090 ($1,500-$1,800), then the 5090 Ti after the 5090 sells out (at around $2,000-$2,500).

It would be a decent launch strategy to lure people who didn't snag a 5090 at launch into buying a 5090 Ti at a large markup. It would likely be a very limited release, because they wouldn't want a repeat of the 3090 Ti disaster, where they had to slash the price to $1,000 three months after launch because they overproduced them. But it depends on their yields; if yields are bad, I don't think they will do a 5090 Ti.

I don't expect them to do it like the 3000 series; the 3090 Ti didn't sell well compared to the 3090 because it came out 1.5 years later.


I believe this is more likely: the 5090 is going to be more limited and more cut down this generation, but it might still have a 40%+ performance uplift. I don't think it will beat the 64% uplift from the 3090 to the 4090, because I suspect they will release it as a 300-400W card, since the rumors point to a dual-slot design.
There was no 4090 Ti. If we're again going into next gen without competition at the halo end, then don't expect there to be a 5090 Ti either. A Titan, perhaps, but I really don't think you'll see a 5090 Ti.
 
There was no 4090 Ti. If we're again going into next gen without competition at the halo end, then don't expect there to be a 5090 Ti either. A Titan, perhaps, but I really don't think you'll see a 5090 Ti.

I doubt there will be a Titan, because Titans typically use the fully enabled die, and why would Nvidia put the fully enabled die in a Titan card they can sell for maybe $3,000-$4,000 when they can use that same die in an AI card that sells for WAY more?
 
The size of the cooler dictates how hot the GPU runs. It does not dictate the amount of heat produced. Temperature and heat are not the same thing.
A major part of a space heater's... "heating" is being bad at dissipating heat, so if my 500W card is running at 40C, it isn't a space heater in my book ;)
 
A major part of a space heater's... "heating" is being bad at dissipating heat, so if my 500W card is running at 40C, it isn't a space heater in my book ;)
Your book is not relevant. The heat all gets dumped into the same place whether your 500 watts comes from a space heater, a 40C video card, or a 90C video card. The only variable is the rate at which that heat gets dumped into the room. A cooler-running GPU consuming 500 watts will actually heat the room faster than a warmer-running GPU consuming the same 500 watts.
 
Your book is not relevant. The heat all gets dumped into the same place whether your 500 watts comes from a space heater, a 40C video card, or a 90C video card. The only variable is the rate at which that heat gets dumped into the room. A cooler-running GPU consuming 500 watts will actually heat the room faster than a warmer-running GPU consuming the same 500 watts.
That's only true if all things are equal. If the 40C GPU is also more efficient at consuming the 500 watts, then it will raise the room temp less. Keep it lighthearted in here, man :)
 
That's only true if all things are equal. If the 40C GPU is also more efficient at consuming the 500 watts, then it will raise the room temp less. Keep it lighthearted in here, man :)

More efficient at consuming 500 watts? What does that even mean, lol. Let's say GPU A pulls 300 watts and GPU B pulls 500 watts, and both GPUs are kept cooled to 40C. Where is that extra 200 watts of heat dissipation from GPU B going to go? Into your room, meaning your room is receiving an extra 200 watts of heat, so yeah, that thing is going to be a space heater.
 
That's only true if all things are equal. If the 40C GPU is also more efficient at consuming the 500 watts, then it will raise the room temp less. Keep it lighthearted in here, man :)
If only computers had any efficiency of that sort, but almost none of the energy is "eaten" by the computing itself; i.e., if they consume 500 watts, virtually 500 watts of heat comes out.

There is no GPU temperature per se (how hot they run is a result of how they're cooled), and the GPU temperature does not matter at all here; wattage is all you need to know.
 
That's only true if all things are equal. If the 40C GPU is also more efficient at consuming the 500 watts, then it will raise the room temp less. Keep it lighthearted in here, man :)
Lighthearted is also not relevant. When people talk about efficiency as it relates to microprocessors, they are talking about performance per watt. That does not translate into reduced heat. A GPU pulling 500 watts while rendering 4K/30 will dump the same heat as a GPU pulling 500 watts while rendering 4K/120.

Now, if you're doing professional work, the card that can do 4K/120 will finish the task faster and dump that heat for a shorter period of time. But over equal lengths of time, like when gaming, neither is more efficient than the other from a power consumption or heat perspective.

EDIT: On the flip side, if you actually want more efficiency as far as heat goes, you can certainly buy a 500-watt card that does 4K/120 but cap your frames at 4K/30; then you'll get the same performance as the weaker 500-watt card while pulling significantly less than 500 watts.
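To put rough numbers on the "watts in = heat out" point, here's a toy sketch of the energy dumped into the room over a session. The wattages and session length are made up purely for illustration; the only real rule is that average electrical draw times time equals heat released.

Code:
# Heat released into the room depends only on average power draw and time,
# not on GPU temperature. All numbers below are illustrative, not measurements.

def heat_dumped_kwh(avg_watts: float, hours: float) -> float:
    """Energy released into the room, in kWh (1 kWh = 3.6 MJ)."""
    return avg_watts * hours / 1000

session_hours = 3
uncapped_watts = 500    # hypothetical card running flat out
capped_watts = 300      # same card with a frame cap (illustrative)

print(f"Uncapped: {heat_dumped_kwh(uncapped_watts, session_hours):.1f} kWh into the room")  # 1.5 kWh
print(f"Capped:   {heat_dumped_kwh(capped_watts, session_hours):.1f} kWh into the room")    # 0.9 kWh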
 
the card that can do 4K/120 ...
EDIT: On the flip side, if you actually want more efficiency as far as heat goes, you can certainly buy a 500-watt card that does 4K/120 but cap your frames at 4K/30; then you'll get the same performance as the weaker 500-watt card while pulling significantly less than 500 watts.

For that class of card, even more so than the 4090 (where it's already the case), in a lot of older games capped at 120 Hz the card will not reach 100% usage and will not consume much of its total power envelope, even if you change no settings, simply because it would hit 200 fps without v-sync and so ends up running at around 80% of what it can do.
 
500W does put off a lot of heat, but it's more important to have proper ventilation in the room; once you have that, the 500W is irrelevant.

It matters only if you're crypto mining or doing something that maxes the GPU out 24/7 (that's when you want to undervolt to an optimal value). My 4090 pulled around 270-370W most of the time while gaming, and I had the power limit at 600W. A 5090, even with a 500W limit, will pull even less until we get more graphically demanding games. It's nice to have the increased power limit; hopefully they allow us to raise the 5090 to 600W for instances where you're GPU-limited and need the additional power, since it improves the gaming experience substantially.
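If anyone wants to check or change the limit from a script rather than a GUI tool, here's a minimal sketch using the stock nvidia-smi CLI. Setting the limit needs admin/root rights, the VBIOS clamps the allowed range, exact supported fields can vary by driver, and the 360W example value is just ~80% of the 4090's stock 450W, purely illustrative.

Code:
# Query and (optionally) lower the board power limit via nvidia-smi.
import subprocess

def current_power_limits() -> str:
    """Return current draw and configured/max limits as CSV text."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit,power.max_limit",
         "--format=csv"],
        capture_output=True, text=True, check=True)
    return result.stdout

def set_power_limit(watts: int) -> None:
    """Set the board power limit in watts (needs admin/root, clamped by VBIOS)."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

print(current_power_limits())
# Example: drop a 450W-default card to roughly 80%:
# set_power_limit(360)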
 
500W does put off a lot of heat, but it's more important to have proper ventilation in the room; once you have that, the 500W is irrelevant.

It matters only if you're crypto mining or doing something that maxes the GPU out 24/7 (that's when you want to undervolt to an optimal value). My 4090 pulled around 270-370W most of the time while gaming, and I had the power limit at 600W. A 5090, even with a 500W limit, will pull even less until we get more graphically demanding games. It's nice to have the increased power limit; hopefully they allow us to raise the 5090 to 600W for instances where you're GPU-limited and need the additional power, since it improves the gaming experience substantially.

Does it really make that much of a difference? When I got my 4090 FE, I did a bunch of testing between the 133% power limit (600 watts) and the 80% power limit, and I saw practically zero difference in gaming, except for about 2 fps more at the 133% setting; even the 1% lows showed no noticeable difference. I've been running the 80% power limit ever since, it's been almost 2 years now, and I've never felt like I was losing out on anything. Also, if you've been running a 600-watt power limit this whole time, could that be why your 4090 kicked the bucket early?
 
Does it really make that much of a difference? When I got my 4090 FE, I did a bunch of testing between the 133% power limit (600 watts) and the 80% power limit, and I saw practically zero difference in gaming, except for about 2 fps more at the 133% setting; even the 1% lows showed no noticeable difference. I've been running the 80% power limit ever since, it's been almost 2 years now, and I've never felt like I was losing out on anything. Also, if you've been running a 600-watt power limit this whole time, could that be why your 4090 kicked the bucket early?
It makes a difference in frametime consistency (when GPU-limited). In general I agree; it's only about a 1-2% performance difference. The only game I've played that actually pulled nearly 600W was Metro Exodus Enhanced Edition, at about 570W. I have seen games like Cyberpunk 2077 pull around 500W at times. When games actually use that power, it's only for short stretches when it's needed.

Not sure if this is what killed my 4090; honestly, I think the 1.1V might have done it. The reason I think this is that they started releasing 4090s with a 1.07V limit shortly after launch, IIRC.
 
I'm currently on an 11900K/3090 with a custom loop. Very happy with the performance, but this thing heats up my 100 sq ft office fast under load, and my office is kind of hard to keep cool in the summer.

I want to upgrade to 15th-gen Intel and a 5090, which could mean an extra 200-300W or more of heat. I'm willing to deal with it when I need it, but I also feel like I need to learn how to configure my system for more control over clock and power settings; the old way of just overclocking, keeping it cool, and forgetting it isn't going to work.

Part of me also thinks I should upgrade to next-gen parts that match my current performance (say, a 15600K/5070) but cut my power consumption way down.

At a certain point, when we can all push 4K and 5K resolutions at 240 fps at max detail, I know I'm just going to want a system that can do all that without turning my office into a sauna. I don't think we're quite there yet, as there is still more performance to gain; maybe by the time the 7090 is a thing, if Nvidia doesn't just stop making GPUs (which seems like a real possibility).

TLDR: I want the 45" 5120x2160 OLED and a 5090 next year.
 
I'm currently on an 11900K/3090 with a custom loop. Very happy with the performance, but this thing heats up my 100 sq ft office fast under load, and my office is kind of hard to keep cool in the summer.

I want to upgrade to 15th-gen Intel and a 5090, which could mean an extra 200-300W or more of heat. I'm willing to deal with it when I need it, but I also feel like I need to learn how to configure my system for more control over clock and power settings; the old way of just overclocking, keeping it cool, and forgetting it isn't going to work.

Part of me also thinks I should upgrade to next-gen parts that match my current performance (say, a 15600K/5070) but cut my power consumption way down.

At a certain point, when we can all push 4K and 5K resolutions at 240 fps at max detail, I know I'm just going to want a system that can do all that without turning my office into a sauna. I don't think we're quite there yet, as there is still more performance to gain; maybe by the time the 7090 is a thing, if Nvidia doesn't just stop making GPUs (which seems like a real possibility).

TLDR: I want the 45" 5120x2160 OLED and a 5090 next year.
I'd say the 15600K/5070 (maybe consider going AMD) to cut power consumption is the better plan. Any time you are chasing max performance, you are going to sacrifice by getting the most power-hungry hardware.
 
It makes a difference in frametime consistency (when GPU-limited). In general I agree; it's only about a 1-2% performance difference. The only game I've played that actually pulled nearly 600W was Metro Exodus Enhanced Edition, at about 570W. I have seen games like Cyberpunk 2077 pull around 500W at times. When games actually use that power, it's only for short stretches when it's needed.

Not sure if this is what killed my 4090; honestly, I think the 1.1V might have done it. The reason I think this is that they started releasing 4090s with a 1.07V limit shortly after launch, IIRC.
There's virtually no chance an extra 0.03V is what killed it. That's less than a 3% difference.
 
I'd say the 15600K/5070 (maybe consider going AMD) to cut power consumption is the better plan. Any time you are chasing max performance, you are going to sacrifice by getting the most power-hungry hardware.
Part of the issue is resolution and screen size.

I like a big panel (38-45" ultrawide) for work and MMOs/driving games/etc., but for FPS games I'm OK with something a little smaller (and ideally faster). It's hard to make a functional desk with two different-sized monitors, though, and I strongly prefer to run a single panel.

If the next-gen LG ultrawides make it easy to run a fractional resolution, I think that would help a lot, but at the same time I feel like the 5090 is the culmination of 25 years of GPU dreams and I don't want to miss out. I might just have to get better A/C for my office...
 