RTX 5xxx / RX 8xxx speculation

Nebell

Continuing on for the third time :)
Last thread RTX 4xxx / RX 7xxx

New high end probably about 30-35% faster at 4K, and cheaper. But not at 3080 prices. I think the 5080 will land closer to $999.
AMD more competitive this time around.
This gen was an anomaly; Nvidia released the 4090 because they expected AMD to be more competitive.
New cards coming out in April-May 2025.
 

Rumor has it AMD already dropped out of the high-end race, which I would not be surprised by given the lackluster performance-per-watt of the chiplet design. Now your 5080 can cost $1599 and the 5090 can be $2599, aren't you excited?
 
New high end probably about 30-35% faster at 4K, and cheaper.
If it is only 30-35% faster and keeps the same VRAM amount, maybe it will be cheaper. For reference, the 4080 seems to be about 50% faster than a 3080, and the 4090 about 66% faster than a 3090.
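A quick sanity check of what those "X% faster" figures mean as fps ratios; the fps numbers below are made up for illustration, not benchmarks:

```python
# "X% faster" as an fps ratio; the fps values are illustrative, not benchmarks.
def uplift_pct(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

print(round(uplift_pct(90, 60), 1))   # 50.0 -> the "4080 ~50% over 3080" kind of claim
print(round(uplift_pct(100, 60), 1))  # 66.7 -> the "4090 ~66% over 3090" kind of claim
print(round(uplift_pct(81, 60), 1))   # 35.0 -> what a 30-35% gen-on-gen bump looks like
```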

They could want to shrink the gap between the 50/60/70 and the 80/90 tiers, and if that's the case it would probably involve not upgrading the 80/90 as much as Ampere and Lovelace did, rather than really pushing the 60/70 SKUs.

Going from GDDR6 to GDDR7 could help 4K performance quite a bit too; the 4080 seems to be limited by memory bandwidth at that resolution.
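Rough bandwidth math behind that point; the 4080's 256-bit bus and 22.4 Gbps GDDR6X are public specs, while the 32 Gbps GDDR7 rate is a rumored figure I'm assuming here:

```python
# Peak memory bandwidth in GB/s = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(f"{bandwidth_gb_s(256, 22.4):.0f} GB/s")  # 717 GB/s: RTX 4080, 256-bit GDDR6X
print(f"{bandwidth_gb_s(256, 32.0):.0f} GB/s")  # 1024 GB/s: same bus, assumed 32 Gbps GDDR7
```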

I can see them using that boatload of money, the demands of modern UE5 games, the expectation that peak AI demand will have passed, and Lovelace's meh reception as reasons not to do a small 30-35% upgrade at the 5090 level. Instead they could try to cement the really high prices by delivering a massive upgrade à la 3090 to 4090; that would completely change the narrative.
 
The 4090 was 75 percent faster than the 3090 but cost about the same?
Those days are over, they'll charge as much as they can now that AI is here. I don't see things slowing down before this launch, they're not going to need to move consumer gpus.
 
They could go with 48 GB or more of VRAM, good AI performance, and interconnectivity for shared-VRAM workloads on the 5090 series and charge a lot. It is possible, but far from certain; I feel it will be only 32 GB and, like the 4090, have no NVLink at all, pushing pros toward the pro product lines.

For one, Hopper GPUs went as high as $80,000 in China, I think. Regardless of the details of what they actually managed to sell the H100 for, they could be tempted to make sure gaming GPUs and AI/ML ones stay separated as much as possible.

Selling Hopper is much better than selling a $2500 4090.

They will not want their regular 5090 to compete too much with this:
https://samagame.com/en/latest/nvid...the-same-as-they-had-with-one-big-difference/

or even with old H100s in the field.

A bit like this generation: at least at launch, multi-GPU work on the 4090 was not reliable at all (and the models did not make it that obvious to stack them).

And because of this, the 5090 will not be that good for enterprises wanting to do ML; like the 4090 now, it will be easy to find in stores, while a completely different tier of product, with large HBM memory, superb interconnectivity, and better AI/ML performance, will be offered separately.

2025... maybe TSMC's Arizona 3nm comes online that year (I imagine it will not arrive in 2024 like planned), but Intel's "1.8nm" fabs could also be online by the end of 2025. There could be a glut of supply, as a bit of an overreaction and a lot of public subsidy from 2020 start to show up by 2025.
 
Those days are over, they'll charge as much as they can now that AI is here. I don't see things slowing down before this launch, they're not going to need to move consumer gpus.
AI was already there years ago, see DLSS... So what about the 3060 Ti ($400) matching the 2080 Ti ($1200)?
 
I honestly don't expect a massive leap this time around. Nvidia went from Samsung 8nm to TSMC 4nm, so they had a huge jump in process node, which is something that won't be as big this time around going from TSMC 4nm to TSMC 3nm. Maybe GDDR7 VRAM can make up the bulk of the gains, though IDK, I'm just talking out of my ass here, but I'm not expecting another +60% performance jump.
 
Free node gains could be less impressive. Apple products could be a clue when they roll out their 3nm; the claims are quite impressive, with 70% density over 5nm (or 60% over Nvidia's special 5nm), which is not that dissimilar to the ~70% density gain Lovelace had over Ampere. But there could be diminishing returns, with more and more parts of the GPU, and not just the I/O, not translating that density into as much performance gain.
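A toy model of that diminishing-returns point: if only part of the die is logic that actually scales, the headline density gain shrinks fast. The 70% figure comes from the claim above; the 70% scaling fraction is purely my assumption:

```python
# Amdahl-style shrink: only the scaling fraction of the die benefits from the
# new node; I/O and analog stay roughly the same size.
headline_density_gain = 1.70  # claimed 3nm-vs-5nm logic density gain (from above)
scaling_fraction = 0.70       # assumed share of die area that actually scales

new_relative_area = scaling_fraction / headline_density_gain + (1 - scaling_fraction)
effective_gain = 1 / new_relative_area
print(f"effective whole-die density gain: {effective_gain:.2f}x")  # ~1.40x, not 1.70x
```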

- Like mentioned above, rumors seem to indicate a doubling of memory bandwidth at the top end, so maybe there is no need to spend die budget on increased cache size this generation.
- Could be just an overstated impression, but Lovelace did look to me a lot like a refined Ampere, with new cache and some tricks on a better node. This could be a new architecture that Nvidia has been working on since 2020; the rumours talk about it being a significant architectural change.


I expect a massive leap this time around

- Lovelace could have leaped and did not, by choice. The 4060 could have been 70% better than the 3060; it is not, because it is a 159mm² chip with 75% of the bandwidth (see the quick check after this list), made during huge demand for that wafer space, which they thought would sell well and probably planned to launch at $340-$350 back when that generation's plans were sketched, a couple of months before the 4090 announcement. So the jump Blackwell makes should be even more of a choice.
- They do seem to focus quite a bit on the percentage of gamers that are still below a 3060-level GPU and see them as a target. Two ways to get them: lower the price of the 5060, or make it an impressive value to them because of the performance jump.
- I have the feeling, regarding the point above, that they will want to keep prices high. If you want the xx70-xx80 to keep ridiculously high prices and don't want to cut xx60 pricing, but instead give it an impressive jump, you need to make an impressive jump in the higher SKUs too. And like the 4060, those were small chips on strangling memory bandwidth, so easy to beat if you wanted to.
- 27-30 months in between, or so, instead of 2 years? A larger time gap calls for a larger performance gap.
- Lots of money pouring in, and new technology when it comes to designing the chips.
- A brand narrative to turn around, and a CEO that does the presentations. The guy/company much prefers going on a big stage in full-leather-jacket mode saying "my Pascal friends, it is time to upgrade" to a silent, ghosted 4060 Ti release.
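A quick check of that 4060-versus-3060 bandwidth claim from the first bullet, using the public memory configs of the two cards:

```python
# Peak bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps), in GB/s.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

bw_3060 = bandwidth_gb_s(192, 15)  # RTX 3060: 192-bit GDDR6 @ 15 Gbps = 360 GB/s
bw_4060 = bandwidth_gb_s(128, 17)  # RTX 4060: 128-bit GDDR6 @ 17 Gbps = 272 GB/s
print(f"4060/3060 bandwidth: {bw_4060 / bw_3060:.0%}")  # ~76%, matching the claim
```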

And maybe more importantly, by 2025 peak AI demand should have passed, or at least Nvidia will no longer be certain of selling products before even making them. There will be a large field of used H100 systems in the wild from failed startups, or even successful ones that have refined their training and need fewer of them, plus a lot of competition.

This could be Ampere all over again: +50-70% across the stack, more so at 4K because of the memory bandwidth, but with a large VRAM boost instead of stagnation, and, because of Lovelace pricing, at a price that will look OK, a bit like Ampere did after Turing pricing.
 
Personally I think any discussion about next-gen cards must include Intel; I have a feeling that AMD in particular is in for something of an existential crisis.
 

I think Intel will have to prove themselves as a reliable option to make serious inroads. Right now their prices have made people roll the dice on them but that will be a limited number of people. If they can make it a whole year without serious issues with drivers then they could become a serious player.
 
Personally I think any discussion about next-gen cards must include Intel
If drivers continue to get better and pre-DX12 titles get less relevant... it is 362mm² of TSMC 4nm, and, being a new architecture, maybe there was low-hanging fruit with major gains to be made, at least on the inconsistency and issues side of things.

If Intel can deliver over 4070 Ti levels of performance in the best case, well before the end of 2024 and well before the Blackwell launch, and at an aggressive price, that could be interesting.
 
They made a few mistakes with Arc, but driver updates showed just how good the hardware is, and there may still be gains to be had. Intel isn't doing this for lulz; it is my feeling that they are looking to muscle in and upset the status quo.
Let's be frank: Nvidia and AMD have had it way too easy for way too long. Complacency was bound to set in as both brands settled into their own particular roles, Nvidia the dominant force and AMD the multi-billion-dollar underdog.

Of course both companies have a lot of experience, however sometimes complacency can result in tunnel vision, and an inability to adapt and react to a new threat quickly enough.

I don't think Intel is fucking around, I think they smell blood in the water.
 

I don't think Intel smells blood in the water; I think they are afraid of falling behind. Apple/Qualcomm aside, both Nvidia and AMD have more than just GPUs. AMD especially is doing very well in the CPU department. Nvidia is focusing heavily on AI, and they have dabbled in some rather successful CPU design (Tegra).
Intel has CPUs and is taking heavy fire from AMD, and they are actually losing that battle.
 
I am certain you will need a 55 gallon drum of lube to buy the Nvidia high end card. Might only need a 40 gallon drum if AMD also releases a Halo card. I can see the lube on the wall here...
Get ready to bend over, fellas... I know my butthole is puckering.

My crystal ball says regarding lower prices next gen: "Outlook not good..." Not unless AMD or Intel step up to the plate with competition at the high end. Until then, nVIDIA gonna charge what they gonna charge. Which is whatever the hell Jensen decides when he gets on stage...

That said, I for one would welcome a $799 xx80-series card again (which after inflation really isn't that far from a $600 price point, which I can live with).
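For what that inflation aside is worth, here is the rough compounding math. The start year and the ~4% average rate are my assumptions, not CPI data:

```python
# Compound a ~2017-era $600 card price forward at an assumed average inflation rate.
price_then = 600     # assumed late-2010s xx80-class price point, $
years = 6            # assumed ~2017 -> 2023
annual_rate = 0.04   # assumed average annual inflation over that stretch

price_now = price_then * (1 + annual_rate) ** years
print(f"${price_now:.0f}")  # ~$759, so $799 is indeed in the same ballpark
```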
 
Card prices will stay where they are, even as 3nm cost goes up over 4nm by allegedly as much as 25%. You can only bleed folks so much; NVDA isn't insane. I think MLID is half right about reticle size, as NVDA can stack in layers to get past that max size, as they have done repeatedly in the past. Cache size will stay as is. Blackwell's 5090 will see a slightly bigger jump than Ada over Ampere in raster, thanks mostly to GDDR7, but in RT it will near double, instead of the 45-50% jumps we have been getting. As Dr. Jim Keller very recently reiterated (and Jensen, never mind what Huang said right after Ada launched about Moore's Law, that was to shush us on pricing), Moore's Law is not dead; many years yet, said Jim just months ago. I think they know way more about IBM and TSMC than the MLID channel does.
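Toy per-die math for that 25% wafer-cost claim; the wafer price and die counts below are illustrative assumptions, not actual quotes:

```python
# Cost per die = wafer price / good dies per wafer. If a shrink packs more
# dies onto the pricier wafer, it can offset the wafer-cost increase.
wafer_4nm = 16_000            # assumed 4nm wafer price, $
wafer_3nm = wafer_4nm * 1.25  # the alleged +25%
dies = 90                     # assumed good big-GPU dies per 300mm wafer

same_size_4nm = wafer_4nm / dies       # ~$178 per die on 4nm
same_size_3nm = wafer_3nm / dies       # ~$222: full 25% hit at equal die size
shrunk_3nm = wafer_3nm / (dies * 1.3)  # ~$171 if the shrink yields 30% more dies
print(f"{same_size_4nm:.0f} {same_size_3nm:.0f} {shrunk_3nm:.0f}")
```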
 
It always makes me laugh that the general attitude seems to be that these companies are just choosing to price high. Making these cards, the research and development, costs a LOT. Making graphics cards isn't like, "Hey, the last card was great, let's just go make an even faster one with the latest node and RAM tech!". These have become highly complex pieces of engineering and it all just costs a lot to design and manufacture. Sure, the final markup of a given card makes it look like a stinking greedy profit, but it has to be that high in order to cover the initial investment.

I'm not saying these cards couldn't be cheaper, sure they can, but not by as much as we'd normally like. The fact that there's been no real price war, even when Intel showed up, says it all. Intel was in the perfect position to say, "Hey, all that R&D was just us getting into the market, now let's throw out dirt cheap cards, disrupt the market, gain market share!". But they didn't do that.
 

I take it you have never heard the term "unsustainable". You can build anything if money is no object, but if you want to sell it, then your price had better reflect what people will pay for it, and based on sales numbers so far it looks like they are crossing that point.

Intel cards are pretty cheap considering what they were trying to target performance-wise, but their cards come with issues in older games and an unknown track record on driver updates; right now they are a lower-end gamble. Right now Intel is losing money on every card they sell and they still can't move that many cards. Intel still needs quite a bit of refinement before I would consider getting a card from them.
 
I feel like most 4090 owners probably won't buy the 5090. The 4090 is right at 4k120 for most games, so I don't really see the need to upgrade unless the 5090 was 70%+ faster and cheaper than the 4090.
 
Maybe the 5000 series will drop the price of a used 3090 Ti to the point I can replace my 3070. Already planning the next upgrade I can afford :) :)
 
Guess I'm not most. I'll be upgrading my 4090 to a 5090 if it's a good performance boost.
 

Nah many 3090 owners went and bought the 4090 even though it wasn't cheaper than a 3090 and didn't offer over a 70% performance boost (although on average the gains are close to that). Even if the 5090 "only" offers a 50% boost over the 4090 while costing more I'm sure many would still jump on it.
 
The 4090 is right at 4k120 for most games
If Remnant 2 is an indication (it does not even have Lumen on), games running at 4k 30fps/60fps on a 4090 could start to come out (if we assume that consumers tend to try a game with the settings set to max):

[chart: Remnant 2 GPU performance at 3840x2160]


And there are a lot of "I want the best out there regardless" buyers among them.


3090 and didn't offer over a 70% performance boost (although on average the gains are close to that)
https://www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/32.html

65% in raster, 77% with RT on, in that suite of games TPU used there.
 

Given that many games still aren't releasing with RT, I didn't want to include RT numbers, hence why I said the gains are not above 70% but close to it (65% raster). I also agree that the 4090 may be enough for 4k120 now but definitely won't be in 2 more years. Everyone thought the same thing about the 3090 back in 2020 and now it's more of a 4k60-4k80 card instead. Hell, people are even starting to classify 3090-level performance as being more appropriate for 1440p now (4070 Ti).
 
Nah many 3090 owners went and bought the 4090 even though it wasn't cheaper than a 3090 and didn't offer over a 70% performance boost (although on average the gains are close to that). Even if the 5090 "only" offers a 50% boost over the 4090 while costing more I'm sure many would still jump on it.
I was a 3090 owner who bought a 4090, but mostly because a 3090 was doing close to 4k120 on anything I wanted to play and the 3090 in my main gaming rig died.
Certainly, there will be games that knock the 4090 off the 4k120 mark, but I think with Remnant it's more bad design and optimization than the game being that demanding on the hardware. As a 3090 and 4090 owner I feel like skipping the 5090 for once and waiting for the 6090. There will always be people who want the best all the time, but if GPU sales trends have shown anything lately, the 5000 series won't be great sellers.

EDIT: Well, unless they make the 5090 closer to $1-1.2k vs $2k and it's 70%+ faster at the same time.
 
I think with Remnant it's more bad design and optimization than the game being that demanding on the hardware.
Maybe, but we only have 2 Unreal Engine 5 games, and the first one should have been well optimized, being an Epic demo of the tech made to sell it, and a fast-paced, high-fps action game meant to run on computers worldwide:


[chart: Fortnite GPU performance]


It too was closer to 60 than 120 at 4k on average, never mind the 1-5% lows. Maybe games will get way better at using it, or UE 5.x updates will get smoother, but if the 2 games that use it are an indication, I would not worry too much if my 4k monitor supported only 120fps for those future games.
 

Those are some strange reasons for upgrading... your 3090 was already close to doing 4k120 and it died, so you upgraded? No 3090 should be over 3 years old yet, so you should have been under warranty for it; you could have just done an RMA and gotten your 4k120 performance without spending extra money.
 
It really wasn't all that close to 4k120 unless I was using DLSS. Cold War was ~90-110 fps with DLSS; really not all that close to me if without it I am more in the 60-70 range. The 4090 for comparison is closer to 200 fps with DLSS and 120+ fps without it. Haven't played a ton of games lately, but most of the games I play do 140-200 fps with DLSS right now at 4k.

I bought my 3090 from Microcenter with an extended warranty, so they gave me my money back. I also got the 3090 Strix when it was $2200, so it was an easy upgrade for me when the 4090 Liquid Supreme was $1750. I am never getting an Asus product over $1k without a warranty anymore (Asus warranty is a nightmare); glad I got the warranty lately too. Just had 2 G15s die on me at 1 and 2 years old, plus the 3090 Strix.
 