RTX 4xxx / RX 7xxx speculation

Benchmarks need to be retested with 6900 XTXH.
That card is faster than RTX 3090 in most games.
 
I really don't believe we'll see a new GPU series any time soon. Why would they release something if the current gen is selling like water in the desert?
 
I really don't believe we'll see a new GPU series any time soon. Why would they release something if the current gen is selling like water in the desert?
Because a new gen may be cheaper or slightly easier to produce: smaller node, and possibly fewer component supply dependencies = better margins and greater volume. When the 3000 series was taped out, component supply considerations were different from today's.

There are a ton of factors to profitability beyond just coasting. Doing the same thing for too long means falling behind after a certain point.
 
Well this was inevitable. Nvidia has too much die space reserved for non-raster performance. Even if AMD was stuck on the same node with the same die size as Nvidia, their strategy would beat Nvidia in gaming easily.
 
It will be interesting to see if GDDR7 will be ready by that time. If not, we will likely see more use of GDDR6X in the midrange for Nvidia, or perhaps a form of game cache akin to AMD's Infinity Cache.

If we do see GDDR6X, I predict we will still have 12 GB of VRAM on the very high end 384-bit gaming cards (4080 Ti). On the 256-bit options (4070 or 4080), we will likely see 8 and 16 GB options.

The upper midrange should get 12 GB of GDDR6X at 192 bits (4060 Ti), likely faster than a 3070. The lower midrange (4060) may get 6 and 12 GB of generic GDDR6 running at 192 bits.
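To make the bus-width arithmetic behind those capacity guesses explicit, here's a rough sketch assuming the usual 32-bit interface per GDDR6/6X chip and either 1 GB or 2 GB chips; nothing here is a confirmed spec, just the math behind the guesses:

```python
# Rough sketch: which VRAM capacities a given bus width allows, assuming the
# usual 32-bit interface per GDDR6/6X chip and either 1 GB or 2 GB chips.
# Nothing here is a confirmed spec; it's just the arithmetic behind the guesses.

CHIP_INTERFACE_BITS = 32

def vram_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // CHIP_INTERFACE_BITS
    # maps chip density (GB) -> total VRAM (GB)
    return {density: chips * density for density in densities_gb}

for bus in (192, 256, 384):
    print(f"{bus}-bit -> {vram_options(bus)}")
# 192-bit -> {1: 6, 2: 12}
# 256-bit -> {1: 8, 2: 16}
# 384-bit -> {1: 12, 2: 24}
```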
 
Well this was inevitable. Nvidia has too much die space reserved for non-raster performance. Even if AMD was stuck on the same node with the same die size as Nvidia, their strategy would beat Nvidia in gaming easily.
It will be interesting to see if NV continues this push on "things other than raw rasterization performance".
My guess - yes, and double down on getting DLSS integrated in games. They are really good about things like that, to the point of even dedicating their own engineers to integrate it for you if needed (if you're a player).

The combo of DLSS + raytracing IS really great - if and only if they're used well.

And that's the rub. At this point, I'd say 3 titles do this very well. Others may have a "raytracing" checkbox which does very little. Far Cry 6 is a great example. You can turn on raytracing. It doesn't have a crippling impact on performance, so yay, but also - an unbelievably small improvement. Of course that's an AMD sponsored title. Draw your own conclusions.

The next 2 generations will be very interesting in terms of how they allocate die space, what becomes the priority. And of course, those decisions are already made and the designs are in-flight right now.
Sweet lord am I glad I am not making these calls. Although - I'd be making a lot more, possibly with a golden parachute.
 
With Russia invading Ukraine and the Western response being very, very weak, I'm 90% sure China will attempt the same with Taiwan.
Which means you might want to stock up on cheap €1500 RTX 3080 GPUs and sell them later for €5000 so you can afford that €10,000 4090/7900XT, because we're in for another high price, low availability ride.
 
With Russia invading Ukraine and the Western response being very, very weak, I'm 90% sure China will attempt the same with Taiwan.
Which means you might want to stock up on cheap €1500 RTX 3080 GPUs and sell them later for €5000 so you can afford that €10,000 4090/7900XT, because we're in for another high price, low availability ride.
That's not how it works. The West won't let China take Taiwan for obvious reasons and China knows that.

If prices get more out of hand in the next couple of years, it's time to stop gaming.
 
It will be interesting to see if NV continues this push on "things other than raw rasterization performance".
My guess - yes, and double down on getting DLSS integrated in games. They are really good about things like that, to the point of even dedicating their own engineers to integrate it for you if needed (if you're a player).

The combo of DLSS + raytracing IS really great - if and only if they're used well.

And that's the rub. At this point, I'd say 3 titles do this very well. Others may have a "raytracing" checkbox which does very little. Far Cry 6 is a great example. You can turn on raytracing. It doesn't have a crippling impact on performance, so yay, but also - an unbelievably small improvement. Of course that's an AMD sponsored title. Draw your own conclusions.

The next 2 generations will be very interesting in terms of how they allocate die space, what becomes the priority. And of course, those decisions are already made and the designs are in-flight right now.
Sweet lord am I glad I am not making these calls. Although - I'd be making a lot more, possibly with a golden parachute.
From what I've seen, most games with DLSS 2.0+ (way more than 3 titles) look as good as or better than native res, with the added advantage of not needing additional AA for the most part. It's a huge perk for me, and at the very least it beats the crap out of FSR, which isn't really even comparable tech anyway. Even in games where DLSS has some minor artifacting or slight degradation in IQ, if it's a faster-paced game (like Control, which was pretty well implemented but had some artifacting on certain edges), the performance and IQ trade-off is still more than worth it compared to just running the game at whatever internal resolution you would have to drop it to in order to gain a 30%+ performance improvement.

But even not factoring in DLSS, AMD still has a bit of a gap to make up with Nvidia in raw RT performance. I really hope they at least catch up in that respect as RT becomes more standard across most games.
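For context on what "whatever internal resolution you would have to drop it to" looks like in practice, here's a rough sketch using the commonly cited DLSS 2.x per-axis scale factors; treat these as assumed defaults, since individual titles can and do override them:

```python
# Rough sketch of the internal resolutions DLSS 2.x typically renders at before
# upscaling, using the commonly cited per-axis scale factors. These are assumed
# defaults; individual titles can and do override them.

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(native_w, native_h, mode):
    s = DLSS_SCALE[mode]
    return round(native_w * s), round(native_h * s)

native_w, native_h = 3840, 2160  # 4K output as an example
for mode in DLSS_SCALE:
    w, h = internal_res(native_w, native_h, mode)
    saved = 1 - (w * h) / (native_w * native_h)
    print(f"{mode:17s} renders at {w}x{h} (~{saved:.0%} fewer pixels shaded)")
```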
 
That's not how it works. The West won't let China take Taiwan for obvious reasons and China knows that.

If prices get more out of hand in the next couple of years, it's time to stop gaming.
The US has done a terrible job dealing with Russia so far. I don't see anything that would deter the commies.

As far as gaming goes, it has been a long time since a new graphical effect made a game more fun to play. If we have to wait longer to get to photorealism, I'm OK with that.
 
Ignoring the rumors these are my predictions of the enthusiast level cards. I think Nvidia will try to double dip this cycle and AMD will have the performance lead.

My prediction for the RTX4080Ti:
16GB GDDR6+
30% Faster than 3090
350w TDP
DLSS 3.0 at Launch
$1299
Late August, Early Sept

My prediction for the RX7900XT:
16GB GDDR6+
50% Faster than RX6900XT
330w TDP
FSR 2.0 (2 months after launch)
$1399
November

My Prediction for the RTX TITAN (Likely some other name to generate hype):
24GB GDDR6+
15% faster than the 4080Ti
450w TDP
$2000
Late Q1 2023
 
Ignoring the rumors these are my predictions of the enthusiast level cards. I think Nvidia will try to double dip this cycle and AMD will have the performance lead.

My prediction for the RTX4080Ti:
16GB GDDR6+
30% Faster than 3090
350w TDP
DLSS 3.0 at Launch
$1299
Late August, Early Sept
The 80 Ti class card as a 256-bit card? Ok...
 
Hey, I'm not a memory nerd, how would I know?
Ah, no worries. Typically each VRAM chip is connected via a 32-bit interface, so a 3070 having 8 GDDR6 chips makes it a 256-bit card. 32x8 = 256.

Now you could do a 512-bit 16GB card, but that would necessitate 16 chips. Look at the PCB of an R9 290X for an example of this. The challenge here is that you need a lot of PCB real estate. Haven't really seen Nvidia do anything bigger than 384-bit (12 chips) since the 200-series.

This is a super dumbed-down explanation, but I hope that helps. There are exceptions to this rule (see the GTX 970 debacle), and there are HBM cards out there. But typically that's how it works for gaming cards.

The 6900 XT as a 16GB card uses eight 2GB GDDR6 chips, so it is again a 256-bit card, but obviously they have the extra Infinity Cache to help compensate. You want bigger bus widths for higher end cards intended for 4K unless you have some other way to compensate. Given how Nvidia's designs have been, I wouldn't expect a north-of-$1k card to be on a 256-bit bus, so a 16GB card is unlikely unless they go bigger, like 512-bit.
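Putting that same rule of thumb into a tiny sketch (the chip counts and densities below are just the examples from this post, not spec-sheet values):

```python
# Tiny sketch of the rule above: each GDDR chip hangs off a 32-bit slice of the
# bus, so bus width = chips * 32 and capacity = chips * chip density. The chip
# counts/densities below are just the examples from this post, not spec sheets.

def memory_config(chips, density_gb):
    return {"bus_bits": chips * 32, "vram_gb": chips * density_gb}

print("RTX 3070 (8x 1GB)          :", memory_config(8, 1))   # 256-bit, 8 GB
print("RX 6900 XT (8x 2GB)        :", memory_config(8, 2))   # 256-bit, 16 GB
print("hypothetical 16-chip board :", memory_config(16, 1))  # 512-bit, 16 GB
```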
 
I dunno, with the higher prices becoming more normal, I wouldn't be surprised to see HBM back on a high end consumer card.
 
That's not how it works. The West won't let China take Taiwan for obvious reasons and China knows that.

If prices gets more out of hand in the next couple of years it's time to stop gaming.

Putin is literally messing around in the middle of Europe and doing whatever he wants.
I live in Sweden; people are hysterical across the whole EU, everyone is arming up.
If Russia can invade a country in Europe, I don't see why China wouldn't find an excuse to finally take over Taiwan.
They already did take over Hong Kong. And the world watched.
Say China finds an excuse and decides to take Taiwan. Do you honestly think the US will start a world war?
 
Putin is literally messing around in the middle of Europe and doing whatever he wants.
I live in Sweden; people are hysterical across the whole EU, everyone is arming up.
If Russia can invade a country in Europe, I don't see why China wouldn't find an excuse to finally take over Taiwan.
They already did take over Hong Kong. And the world watched.
Say China finds an excuse and decides to take Taiwan. Do you honestly think the US will start a world war?
I don't really engage in politics here on hardforum and prefer the forums being somewhat clean of political nonsense, but nonetheless I'm going to reply one last time.

Putin is not doing "whatever" he wants, even though it may seem like it. Ukraine is a sovereign country that isn't a member of NATO, the EU, etc., and Ukraine has a sort of mixed culture with a lot of Russian and Ukrainian heritage. So no, he isn't just picking random countries to attack, but he is trying to take out the buffer between Europe and Russia, which is disturbing.

To be honest, the European countries should have been arming up way before this started, but they have been sitting on their hands acting like all these dictatorships weren't a threat. The fact that two countries have become dictatorships in the last 5 years, and one of them is a superpower, is incredibly disturbing. Xi Jinping, Erdoğan and Putin are very extreme leaders who care more about their legacy than their people.

You have to understand that you can't compare Ukraine to Taiwan in any way. The USA and the West have no stake in Ukraine, but have a big dependency on Taiwan when it comes to technology, and that technology will not be allowed into Chinese hands. Yes! The USA will defend Taiwan if it comes to that, and Xi Jinping knows that as well, which is why he has not tried anything other than mouthing off.
 
Putin is literally messing around in the middle of Europe and doing whatever he wants.
I live in Sweden; people are hysterical across the whole EU, everyone is arming up.
If Russia can invade a country in Europe, I don't see why China wouldn't find an excuse to finally take over Taiwan.
They already did take over Hong Kong. And the world watched.
Say China finds an excuse and decides to take Taiwan. Do you honestly think the US will start a world war?
I don't really engage in politics here on hardforum and prefer the forums being somewhat clean of political nonsense, but nonetheless I'm going to reply one last time.

Putin is not doing "whatever" he wants, even though it may seem like it. Ukraine is a sovereign country that isn't a member of NATO, the EU, etc., and Ukraine has a sort of mixed culture with a lot of Russian and Ukrainian heritage. So no, he isn't just picking random countries to attack, but he is trying to take out the buffer between Europe and Russia, which is disturbing.

To be honest, the European countries should have been arming up way before this started, but they have been sitting on their hands acting like all these dictatorships weren't a threat. The fact that two countries have become dictatorships in the last 5 years, and one of them is a superpower, is incredibly disturbing. Xi Jinping, Erdoğan and Putin are very extreme leaders who care more about their legacy than their people.

You have to understand that you can't compare Ukraine to Taiwan in any way. The USA and the West have no stake in Ukraine, but have a big dependency on Taiwan when it comes to technology, and that technology will not be allowed into Chinese hands. Yes! The USA will defend Taiwan if it comes to that, and Xi Jinping knows that as well, which is why he has not tried anything other than mouthing off.
Take it to the soapbox please.

Back on topic: I don't see them going HBM for future cards anytime soon. The cost and availability are pretty bad ATM and will only get worse, so they will keep reserving it for their high end compute cards. They are going to have to double up the memory across the board; 12 GB is not going to cut it for xx80 cards and up.
 
I don't have any inside info, just speculation on my part, but I can see the Nvidia halo card (whatever that will be) going all out and using HBM.

GDDR6X is more expensive than GDDR6 and requires more precise board soldering (they have to backdrill the through-board vias because of signal reflections with PAM4 signaling... that's my simpleton way of understanding it).
Anyhow, HBM is still more expensive, but not that much more. Of course that is assuming there won't be more supply chain nonsense.

The 3090 came about because Nvidia was surprised at how well Navi 21 performed. The 3090 was a "we've got to beat AMD no matter what it takes" type decision.

If Navi 31 happens to be as good as Navi 21 was, then I can see Nvidia pulling out all the stops to keep the performance crown.
 
I don't have any inside info, just speculation on my part, but I can see the Nvidia halo card (whatever that will be) going all out and using HBM.

GDDR6X is more expensive than GDDR6 and requires more precise board soldering (they have to backdrill the through-board vias because of signal reflections with PAM4 signaling... that's my simpleton way of understanding it).
Anyhow, HBM is still more expensive, but not that much more. Of course that is assuming there won't be more supply chain nonsense.

The 3090 came about because Nvidia was surprised at how well Navi 21 performed. The 3090 was a "we've got to beat AMD no matter what it takes" type decision.

If Navi 31 happens to be as good as Navi 21 was, then I can see Nvidia pulling out all the stops to keep the performance crown.


Now that the GDDR6 spec has been pushed to 24 Gbps speeds, it's going to be a hard sell to continue spending the premium on GDDR6X.

https://videocardz.com/newz/samsung-confirms-its-gddr6-20gbps-and-24gbps-memory-is-now-sampling

I imagine the RTX 4080 will ship with GDDR6 (but it's anyone's guess whether they will use GDDR6X on the 4090 - can we assume Micron will push PAM4 faster than Samsung's new 24 Gbps modules?)
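Rough napkin math on why 24 Gbps GDDR6 makes GDDR6X a hard sell, assuming peak bandwidth is simply bus width × per-pin data rate / 8, and using the 21 Gbps GDDR6X and 24 Gbps GDDR6 figures discussed above (the bus widths are just common examples, not known 40-series specs):

```python
# Napkin math: peak memory bandwidth (GB/s) = bus width (bits) * per-pin data
# rate (Gbps) / 8. The 21 Gbps GDDR6X (3090 Ti) and 24 Gbps GDDR6 (Samsung,
# sampling) figures are the ones discussed above; bus widths are just examples.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

for bus in (256, 384):
    g6x = bandwidth_gbs(bus, 21)  # current top GDDR6X speed grade
    g6 = bandwidth_gbs(bus, 24)   # Samsung's sampling 24 Gbps GDDR6
    print(f"{bus}-bit: GDDR6X @ 21 Gbps = {g6x:.0f} GB/s, GDDR6 @ 24 Gbps = {g6:.0f} GB/s")
# 256-bit: 672 vs 768 GB/s
# 384-bit: 1008 vs 1152 GB/s
```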
 
From what I've seen, most games with DLSS 2.0+ (way more than 3 titles) look as good as or better than native res, with the added advantage of not needing additional AA for the most part. It's a huge perk for me, and at the very least it beats the crap out of FSR, which isn't really even comparable tech anyway. Even in games where DLSS has some minor artifacting or slight degradation in IQ, if it's a faster-paced game (like Control, which was pretty well implemented but had some artifacting on certain edges), the performance and IQ trade-off is still more than worth it compared to just running the game at whatever internal resolution you would have to drop it to in order to gain a 30%+ performance improvement.

But even not factoring in DLSS, AMD still has a bit of a gap to make up with Nvidia in raw RT performance. I really hope they at least catch up in that respect as RT becomes more standard across most games.

I was pleasantly surprised at how well DLSS worked in COD Vanguard and Warzone. I played with it for a couple of weeks, then switched back to native + Filmic SMAA. I immediately noticed the excessive blur Filmic SMAA adds, so I switched back to DLSS permanently, at least for those two games. There is occasional artifacting in distant shadowy areas, but that's been the only downside I've noticed. Performance is notably improved and, at least to my eyes, IQ is superior.

Prior to actually using it, I always told myself that if I couldn't get the performance I wanted at native + some form of AA, I'd upgrade my card. But when implemented properly, I can see DLSS adding a significant amount of longevity to GPUs. Good news considering the cost and availability of GPUs today and for the foreseeable future.
 
I was pleasantly surprised at how well DLSS worked in COD Vanguard and Warzone. I played with it for a couple of weeks, then switched back to native + Filmic SMAA. I immediately noticed the excessive blur Filmic SMAA adds, so I switched back to DLSS permanently, at least for those two games. There is occasional artifacting in distant shadowy areas, but that's been the only downside I've noticed. Performance is notably improved and, at least to my eyes, IQ is superior.
The current versions of DLSS are truly incredible. I was fiddling with it in Dying Light 2, and it's really rare that I can spot anything which doesn't look as good as native.
Yes, sometimes there's something which doesn't go well, but it is really rare, and fleeting in actual gameplay.
 
It will be interesting to see if NV continues this push on "things other than raw rasterization performance".
My guess - yes, and double down on getting DLSS integrated in games. They are really good about things like that, to the point of even dedicating their own engineers to integrate it for you if needed (if you're a player).

The combo of DLSS + raytracing IS really great - if and only if they're used well.
I wish Nvidia's raytracing quality and visual impact were better for RTX 4xxx.
I'd like to see more RTGI features like caustics running at satisfying framerates in games, and available on Arc Alchemist as well.

 
That's not how it works. The West won't let China take Taiwan for obvious reasons and China knows that.

If prices gets more out of hand in the next couple of years it's time to stop gaming.
Lots of people are saying this, but you don't need a GeForce 3090 to enjoy games. I used to game on an Apple IIe, and the vast majority of games now run fine on an absolute potato. I am here happily gaming on a video card from 2015, most games still on max settings, getting 75-90 fps.
 
Lots of people are saying this, but you don't need a GeForce 3090 to enjoy games. I used to game on an Apple IIe, and the vast majority of games now run fine on an absolute potato. I am here happily gaming on a video card from 2015, most games still on max settings, getting 75-90 fps.
what games?
what resolution?
what 7 year old GPU?

This post should come with a long list of asterisks. Most old games, sure, but there are plenty of current titles, and many not-so-current titles, that will run like dog shit on a 7-year-old GPU at max settings, particularly if we are talking about 1080p/1440p.
 
what games?
what resolution?
what 7 year old GPU?

This post should come with a long list of asterisks. Most old games, sure, but there are plenty of current titles, and many not-so-current titles, that will run like dog shit on a 7-year-old GPU at max settings, particularly if we are talking about 1080p/1440p.
AMD R9 Fury 4GB. World of Warcraft, Call of Duty, World of Warships, DCS, IL-2, Diablo 3, MechWarrior Online, CS:GO, Overwatch. I fully realize there are some AAA games that I couldn't do at max settings, but if you look at the Steam library, probably 90% of the games run fine on absolute low-end stuff. I use a 1440p 165 Hz screen, though for games like World of Warships and WoW it doesn't matter. You realize all the games that run on an Xbox One, PS4, etc. will run fine on a 5-year-old GPU. My argument is that it is ridiculous to think you need a $3000+ video card to enjoy a video game. To my eyes there is very little difference in gaming enjoyment between 1080p and 4K. I will admit DCS looks glorious on my 6900 XT all maxed out, but you can still enjoy it and be competitive at 1080p on medium.
 
AMD R9 Fury 4GB. World of Warcraft, Call of Duty, World of Warships, DCS, IL-2, Diablo 3, MechWarrior Online, CS:GO, Overwatch. I fully realize there are some AAA games that I couldn't do at max settings, but if you look at the Steam library, probably 90% of the games run fine on absolute low-end stuff. I use a 1440p 165 Hz screen, though for games like World of Warships and WoW it doesn't matter. You realize all the games that run on an Xbox One, PS4, etc. will run fine on a 5-year-old GPU. My argument is that it is ridiculous to think you need a $3000+ video card to enjoy a video game. To my eyes there is very little difference in gaming enjoyment between 1080p and 4K. I will admit DCS looks glorious on my 6900 XT all maxed out, but you can still enjoy it and be competitive at 1080p on medium.
"Will run" and "will run at max settings getting 75-90fps @1440p" are two vastly different claims. One is accurate, the other is not. You aren't going to max out COD 2019 and newer on a 4GB Fury card @ 1440p, and maintain anywhere near 75-90 fps.

You can max out the game
You can run it at 1440p
You can get 75-90fps
You can do all these things on a Fury, you just cannot do them at the same time.

1080Ti is about the only card in existence that can come close to actually achieving the claim of maxing out games on a 5+ year old GPU @ 1440p and maintain 75-90fps.
 
Lots of people are saying this, but you don't need a GeForce 3090 to enjoy games. I used to game on an Apple IIe, and the vast majority of games now run fine on an absolute potato. I am here happily gaming on a video card from 2015, most games still on max settings, getting 75-90 fps.
I'm still gaming on a 1070/1070 Ti, but if you want to game at 4K it doesn't cut it. I wouldn't be surprised if these cards have another year or two in them for 1080p, but I can't see them going much beyond that in terms of performance.
 
"Will run" and "will run at max settings getting 75-90fps @1440p" are two vastly different claims. One is accurate, the other is not. You aren't going to max out COD 2019 and newer on a 4GB Fury card @ 1440p, and maintain anywhere near 75-90 fps.

You can max out the game
You can run it at 1440p
You can get 75-90fps
You can do all these things on a Fury, you just cannot do them at the same time.

1080Ti is about the only card in existence that can come close to actually achieving the claim of maxing out games on a 5+ year old GPU @ 1440p and maintain 75-90fps.
Again, you're thinking about Cyberpunk 2077 or BF2042 (which is a crap game), and I am talking about 90% of the actual gaming library in existence. Blizzard's Overwatch, for example: you can max the settings at 1440p with this 7-year-old card just fine. CS:GO, World of Warcraft, etc., etc. The actual point we are making here is you do not need a $3000 video card to "ENJOY" gaming. 1080p vs 4K makes almost no difference to my eyes, and good graphics do not increase the enjoyment of gameplay over good story and game mechanics. Don't forget a huge swath of the population is gaming on a PS4 or OG Xbox One, which is basically the equivalent of an AMD 460 2GB card, and they aren't complaining.
 
I'm still gaming on a 1070/1070 Ti, but if you want to game at 4K it doesn't cut it. I wouldn't be surprised if these cards have another year or two in them for 1080p, but I can't see them going much beyond that in terms of performance.
Of course it doesn't cut it, but who cares about 4K gaming? It doesn't improve the game at all visually unless you are on a 65" screen with a viewing distance of 18. All it does is kill your FPS.
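For what it's worth, whether 4K is visible at all comes down to pixels per degree. Here's a rough sketch for a 65-inch 16:9 panel; the distances and the ~60 PPD "can't resolve more" threshold are just commonly cited ballpark figures, not hard facts:

```python
import math

# Rough sketch: pixels per degree (a proxy for whether extra resolution is even
# visible) for a 65-inch 16:9 panel, comparing 1080p vs 4K at a few viewing
# distances. The distances and the ~60 PPD threshold are commonly cited
# ballpark figures, not hard facts.

def pixels_per_degree(h_pixels, diag_in, distance_in, aspect=(16, 9)):
    width_in = diag_in * aspect[0] / math.hypot(*aspect)  # physical screen width
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / fov_deg

for dist in (36, 72, 120):  # 3 ft, 6 ft, 10 ft
    p1080 = pixels_per_degree(1920, 65, dist)
    p2160 = pixels_per_degree(3840, 65, dist)
    print(f'{dist}" away: 1080p = {p1080:.0f} PPD, 4K = {p2160:.0f} PPD')
```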
 
Of course it doesn't cut it, but who cares about 4K gaming? It doesn't improve the game at all.

People on a message board called HardForum, in a thread speculating about the next generation of video cards? It is not so much about gaming, it is about gaming hardware.
 
People on a message board called HardForum, in a thread speculating about the next generation of video cards? It is not so much about gaming, it is about gaming hardware.
You're preaching to the choir, I have a 6900 XT after all. It's just that I'm sick of seeing people saying "WELP, I better find a new hobby." Just saying, if all you can afford is a $250 GPU, you can still enjoy it.
 
You're preaching to the choir, I have a 6900 XT after all. It's just that I'm sick of seeing people saying "WELP, I better find a new hobby." Just saying, if all you can afford is a $250 GPU, you can still enjoy it.
Outside of trolling, I imagine the hobby was gaming hardware: talking about it, playing with it, and chasing the AAA titles pushing it, not necessarily gaming itself.
 
Again, you're thinking about Cyberpunk 2077 or BF2042 (which is a crap game), and I am talking about 90% of the actual gaming library in existence. Blizzard's Overwatch, for example: you can max the settings at 1440p with this 7-year-old card just fine. CS:GO, World of Warcraft, etc., etc. The actual point we are making here is you do not need a $3000 video card to "ENJOY" gaming. 1080p vs 4K makes almost no difference to my eyes, and good graphics do not increase the enjoyment of gameplay over good story and game mechanics. Don't forget a huge swath of the population is gaming on a PS4 or OG Xbox One, which is basically the equivalent of an AMD 460 2GB card, and they aren't complaining.
Well, if you are only going to mention esports titles that run on a glorified calculator, sure. If you want, you can add a couple dozen indie titles to that too, but recent AAA games are another story.

Like those folks at PC World: "you can game fine on integrated AMD graphics, if you drop to 720p medium-to-low details for 45-60 FPS", while you try to find a decently priced GPU.
 
Again, you're thinking about Cyberpunk 2077 or BF2042 (which is a crap game), and I am talking about 90% of the actual gaming library in existence. Blizzard's Overwatch, for example: you can max the settings at 1440p with this 7-year-old card just fine. CS:GO, World of Warcraft, etc., etc. The actual point we are making here is you do not need a $3000 video card to "ENJOY" gaming. 1080p vs 4K makes almost no difference to my eyes, and good graphics do not increase the enjoyment of gameplay over good story and game mechanics. Don't forget a huge swath of the population is gaming on a PS4 or OG Xbox One, which is basically the equivalent of an AMD 460 2GB card, and they aren't complaining.

That's not what I'm thinking, and there's no reason to guess what I'm thinking either since I specifically mentioned COD, one of the same games you mentioned.

You aren't going to max that out @ 1440p and maintain 75-90fps on a Fury. Period.

I'm also not talking about enjoying a game. You keep moving the goalposts.

I'm specifically talking about your claimed performance (75-90fps) in COD at your claimed resolution (1440p) and claimed settings (maxed out) using a Fury card.
 
Of course it doesn't cut it, but who cares about 4K gaming? It doesn't improve the game at all visually unless you are on a 65" screen with a viewing distance of 18. All it does is kill your FPS.
I moved to couch gaming on a big screen a while back. When I redo my office I'll have the ability to switch between keyboard/mouse + monitor and controller + TV.

For now I'm all controller and TV.
 