RTX 5xxx / RX 8xxx speculation

Where do you think that funding is going to come from?

Right now AMD's focus is on getting a big chunk of that high markup GPU market with the MI325 and MI350 releases.
 
Are any of you guys actually gonna drop 2.5k for a 5090? I am in the hell no department. My 4090 should last me 3 years hopefully.

Without question skipping the 5 series. I have no use for it with my 120 Hz panel. I'll wait until refresh rates get higher on 42-inch-plus displays. Until then I am good with my current setup.
 
I"m planning for the 5090 to be ~$2k for non founder's edition variants.$2.5k is most likely not happening unless its acutally the full chip
 
I don't see the 5090 going for less than $1995. Nvidia never had an issue selling 4090s until maybe now, so why would they sell them cheaper? There will be no competition, not even for the 4090, from anyone else. This time around the 5080 price of $1200 will probably stick too, leaving room for a Ti version later if need be. Then again, what game will even need a 5090's capability to play? Besides pushing RT more, for what little benefit to gameplay. Anyway, I would be very surprised if the 5090 was less than that; it should be a very fun card to play with if one can get their hands on one.
 
Then again, what game will even need a 5090's capability to play? Besides pushing RT more, for what little benefit to gameplay.
They seem to be pushing a lot of titles that can make a 5080 work hard.

Even with hardware RT off:
[benchmark chart: Cine-Na-2160p-p.jpg]


And path tracing has a solid impact on the visuals in that one. (As for benefit to gameplay, it's not clear how often better models, trees, and textures actually benefit gameplay; in some games better graphics are like better-looking cinematography in a movie.)

The next Star Wars game:
[benchmark chart: -3840x2160-nvidia-dlss-3-5-desktop-gpu-performance.png]

30 fps on a 4090....

When you look at Black State:

https://www.youtube.com/watch?v=PaNYLP5dJAw

Avowed and some others.

Those also feel like titles that only a 5090 will run comfortably with all the bells and whistles (probably not at native 4K until the 6090, but with Quality DLSS + RR). Unreal Engine 5 titles using Nvidia's path-traced fork could be relatively common by the 5090 launch; they seem to be spending money on devs to make sure halo games exist for which a 5080/5090 will make a difference versus a 4080 Super/4090, and they probably want a Cyberpunk-type showcase to be at least around the corner.
 

Been benchmarking WuKong on a 7900 XTX, which shows that, even for RTX cards, RT may not be the best setting or method. With RT off, Very High settings with Cinematic textures, at 4K, 75% FSR with frame gen, I'm getting around 100 fps, which makes the game playable. Depending on whether it's a fast shooter or latency matters, frame gen may not be usable. In the benchmark it looks OK.

In my view, the best quality image for playability (loosely applied here, since it's a benchmark) is with hardware RT off. Using hardware RT requires too many other compromises (resolution, more aggressive upscaling, degraded frame rates), resulting in an overall poorer visual experience combined with less smooth gameplay, which forces yet more compromises in settings. Full RT can improve a single frame, but combined with all the other aspects of the game it just degrades the overall experience. With Unreal Engine 5 it's visually stunning anyway without hardware RT; after playing 10 minutes or so (in the actual game, I would think), I couldn't care less about the difference in the image presented. Even if the 5090 is 60% faster here, who would even bother using full RT? Not many overall in the scheme of things, since most would not have a 4080-level capable card.
 

I'm playing WuKong with Full RT on :). DLSS version swapped to v3.7, using Performance mode with Preset E, mix of Cinematic + High settings following HUB's optimized setting video, and FG on. Getting 100fps on avg and it is flippen awesome.
 
We will probably stop raising both FPS and resolution for RT-type games for a long time too; there will be a lot of power to shoot rays at 1440p upscaled to 4K targeting 60 fps by the time a 7080/7090 arrives.
 
With Unreal Engine 5 it's visually stunning anyway without hardware RT; after playing 10 minutes or so (in the actual game, I would think), I couldn't care less about the difference in the image presented. Even if the 5090 is 60% faster here, who would even bother using full RT? Not many overall in the scheme of things, since most would not have a 4080-level capable card.
Apparently this is Nvidia's version of UE 5, meant to showcase RT. It will work smoothly only on a 5090 Ti.
 
Apparently this is Nvidia's version of UE 5, meant to showcase RT. It will work smoothly only on a 5090 Ti.

I mean right now the 4090 will only get like 25fps at native 4K when using maxed Full RT. Even if the 5090 Ti was twice as fast as the 4090 in path tracing, it still wouldn't hit 60fps at native 4K at these same settings so some tweaking will still be needed if you want more than 60fps.
 
I mean at max settings. Otherwise why would you buy a 5090 ti
Max settings could be a 6090-era thing; even with path tracing off, websites don't seem to test native resolution much, as the cost of running native doesn't seem worth it.

For all the talk of diminishing returns on ray tracing, going above 1440p resolution is not that different.
 
With GTA 6 (PC Ver) being a 2026/2027 release and the Witcher 4 + Witcher 1 remake possibly being even further away, I feel like waiting for the 7090 :cry: (PAIN)
 
How about this:

4 RDNA 4 cards

SKU#1 16GB (~4070 Ti Super) — $600
SKU#2 16GB (~>4070 Super) — $500
SKU#3 12GB (>4070 GDDR6) — $400
SKU#4 8GB (~4060 Ti 8GB) — $300



SKU    Chip     Hardware          VRAM                  PCB         Possible sales name
SKU#1  Navi 48  64 CU @ 256-bit   16GB GDDR6 @ 20Gbps   102-G28201  Radeon RX 8800 XT
SKU#2  Navi 48  56 CU @ 256-bit   16GB GDDR6 @ 18Gbps   102-G28211  Radeon RX 8800 or Radeon RX 8700 XT
SKU#3  Navi 48  ~48 CU @ 192-bit  12GB GDDR6 @ 19Gbps   102-G28501  Radeon RX 8700 XT or Radeon RX 8600 XT
SKU#4  Navi 44  32 CU @ 128-bit   8GB GDDR6             102-C48701  Radeon RX 8600 or Radeon RX 8500

https://m-3dcenter-org.translate.go...erie?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB

Launching Q4 2024

7700 XT power consumption
7900 XT raster
4070 Ti Super RT
4070 Super price



https://m.youtube.com/watch?v=777_DEw3l5g&t=3322s
 
From VideoCardz (so take with large dose of NaCl):

MSI is adding an extra power connector for multi-GPU configurations. For now, it is marketed as a feature for AI compute which uses multiple graphics cards at once, but according to the media who were briefed about this new feature at Gamescom, there is clearly something happening behind the scenes that wasn’t discussed yet. More details are likely to be revealed soon, possibly when the new 800-series motherboards become available.

Could Crossfire be making a comeback together with the 800-series boards? :confused:

That might explain AMD not bothering with high-end monolithic cards (i.e., they think RDNA 4 can compete with multi-GPU setups).
 
I think AMD's plan ultimately is to have a driver-side mGPU-like solution (transparent to games) and use chiplets. There were dual-GCD rumors for Navi3 and I think Navi2 as well. Since an MCM solution would have the GCDs communicating over Infinity Fabric links, it's not out of the question that such an arrangement might also work with external IF links. We can see what that might look like with Radeon Instinct cards, which have IF edge connectors where CF/SLI fingers would have been on cards of olde.

The advantage of multi-card vs multi-chiplet would be reduced packaging cost and putting the scaling on the customer side. The disadvantages would be reduced bandwidth and increased latency.

Is it plausible? Dunno. I follow the GPU rumor mill pretty closely and have yet to see anything at all about mGPU for RDNA4. It is interesting, though, that we're seemingly seeing more x8+x8 PCIe bifurcation from the CPU lanes on upcoming X870/Z890 motherboards. That was something overwhelmingly eschewed on X670/Z790, but I've noticed a handful of next-gen mobos that appear to feature split PCIe lanes.

Dual 8800 XT would be pretty neat. If scaling were good under such a hypothetical scenario, performance would be a fair bit better than a 4090. Not 5090-level based on Blackwell rumors, but solidly high-end.

I'm not holding my breath tho. My expectations for RDNA4 have been tempered and I'm anticipating another Polaris / RDNA1 situation.
 
Interesting if it's mGPU that is transparent to programs; otherwise, blah. I just can't see this as anything other than another version of CrossFire if true, probably more in line with using more than one card in applications, not games per se. It would be a pleasant surprise if this is true and can actually be used in games effectively, but that's very wishful thinking.
 
I do miss the bling factor of dual-GPU builds. It is a shame most games don't even support it anymore. Don't think any new games support it at all.
 
I mean right now the 4090 will only get like 25fps at native 4K when using maxed Full RT. Even if the 5090 Ti was twice as fast as the 4090 in path tracing, it still wouldn't hit 60fps at native 4K at these same settings so some tweaking will still be needed if you want more than 60fps.
This is why I don't get people saying "The 4090 is so fast!!"

Like hey, it is fast, but can it "max out literally any game at max RT settings at 8K 480 FPS"? No? Then it's replaceable and something faster will come out.


Also, all this talk about $2500, I don't think it will be that cheap.

We've shown Nvidia that we are more than happy to pay whatever prices they ask as long as they have the fastest card. The 4090 is SOLD OUT in many places. It's obviously not a "luxury good"; it costs less than a nice set of tires for your midlife-crisis sports car. Sure, $2500 costs more than any video card in consumer history, but compared to other hobbies, $2500 is the TAX on what those hobbies cost. A good set of mid-to-high-end golf clubs? A new exhaust system on a European sports car? A nice billiard/pool table? Re-surfacing your pool? A decent gas barbecue?

I can go on, but $2500 is chicken shit in the world of "you can survive without it". You (as in, gamers) will pay $2500 for a 5090, or maybe even $3000. And the 6090 will cost $5000 and you'll still buy that, because if someone is willing to spend tens of thousands on a small fishing boat, someone JUST as passionate about gaming is going to spend that on a PC setup.

Don't get me wrong, I hate it! I'm not wealthy enough to afford a 4090 let alone a 5090, but there are enough people willing to pay those prices or more that Nvidia is willing to make just enough to cater to those folks and make a MINT out of it.

They can sell something for $2500 with $1600 profit to a million people, and they would rather do that than sell something for $900 with $100 profit to ten million people.

Even though they'd only sell 1/10th of the units and make less overall revenue, they'd make 60% more profit selling higher to fewer people.
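Quick sanity check on that math (a throwaway Python sketch using the hypothetical unit counts and per-unit margins above, not real Nvidia figures):

# Hypothetical scenarios from the paragraph above, not actual Nvidia numbers.
scenarios = {
    "halo ($2500 card)": {"price": 2500, "profit_per_unit": 1600, "units": 1_000_000},
    "mass ($900 card)": {"price": 900, "profit_per_unit": 100, "units": 10_000_000},
}

for name, s in scenarios.items():
    revenue = s["price"] * s["units"]
    profit = s["profit_per_unit"] * s["units"]
    print(f"{name}: revenue ${revenue / 1e9:.1f}B, profit ${profit / 1e9:.1f}B")

# halo ($2500 card): revenue $2.5B, profit $1.6B
# mass ($900 card): revenue $9.0B, profit $1.0B
# -> 1/10th the units and less revenue, but ~60% more profit.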

And don't feed me that BS about "betraying their audience" or "forgetting who got them to where they are". What are you going to do, buy the competition?
 
"WE showed nvidia we'll pay anything"
"not me, im too poor"
lol im with ya, i wish others would have resisted a bit though
 
$2500, if it's the full-fat chip, is not unthinkable... but if it is cut down to non-"Titan" card status, I think we will see another $1999 MSRP.
 
Everyone's got their limits. I was willing to pay $1400 for my 3080 Ti and $1600 for my 4090, but I would stop at $2000. $2000 is the most I've paid for any piece of hardware so far, that being the Acer X27 4K FALD IPS. More expensive monitors have come out, such as the PG32UQX at $3000, and there are TVs that are more expensive, easily costing double if you want, say, a flagship 75-inch OLED or something that can easily run you $4000-$5000, but I've never dropped that much cash on a single piece of hardware, and I wouldn't start doing it now if the 5090 was over $2000.
 
I'm sure Nvidia will sell you a 5080 with 12GB of VRAM for $1,949 to suit your more budget-oriented tastes.
 

Nope, never downgraded my VRAM and don't plan to now. I'll just skip a gen or two, it ain't that complicated lol. Nobody NEEDS a 5090 to game; if no worthwhile upgrade comes next gen at my price point, then I just won't bother, it's that simple.
 
But that's the big issue here: AI models are requiring bigger and bigger GPUs, but their memory needs aren't moving very quickly. This means Nvidia can't ship cards with much more than 20GB of memory or people will start buying them for AI use, and Nvidia has special cards that have more than 20GB for AI, and they cost $5000+.

So Nvidia has to keep memory low, OR raise prices to match their enterprise stuff. So the 5090 will probably have 24GB of memory just like the 3090 and 4090; any more and it would cut into their big-boy AI sales. Which means the entire stack's memory will be allocated to suit a top tier that costs $2500 and has 24GB of memory.
 
Nobody NEEDS a 5090 to game.
For VR tho, it eats any GPU you throw at it.
 
What do you guys think, will the RTX 5090 top out at a $5000 price? I am still waiting for this card.
 

Nvidia can do whatever they want but if I don't think it's worth the price I'll just pass on it.
 
This thread keeps delivering comedy gold.
it costs less than a nice set of tires for your midlife-crisis sports car
My tires were $1,200, thank you very much.
 
$1200 is my limit; the day an 80-series card exceeds that, it's over for me and I'll focus on my other hobbies instead. I am hoping the 5080 is in that ballpark, and if not, it's a used 4090 for me in the $900 range?
 