RTX 4xxx / RX 7xxx speculation

Well, there goes the end of SFF builds. I managed to squeeze a 3-slot 3080 into my mini PC, but it looks like with the 4-slot monstrosities it's back to medium tower cases...
 
Recent rumors say their gaming GPUs will not have multiple graphics dies. Only the workstation parts will.
Yeah, waiting to see about this. Leakers like MLID and kopite7kimi were talking about how RDNA 3 (Navi 31) was going to be MCM for over a year; now, all of a sudden, only a few months from release, they are going to be monolithic.
 
Yeah, waiting to see about this. Leakers like MLID and kopite7kimi were talking about how RDNA 3 (Navi 31) was going to be MCM for over a year; now, all of a sudden, only a few months from release, they are going to be monolithic.

I don't think they're going to be monolithic at the top end; they'll just be using a single GPU die. They will have other chiplets for other functions, like the I/O die.

Again, rumored to be.
 
I'm actually hoping the power numbers are fairly accurate so I can absolutely not care about owning any of these cards. Maybe the 70 at most. However, I am interested in seeing what gains in performance are there. I should be able to just read all about it and comfortably watch these Nvidia cards from the sidelines.
 
I'm actually hoping the power numbers are fairly accurate so I can absolutely not care about owning any of these cards. Maybe the 70 at most. However, I am interested in seeing what gains in performance are there. I should be able to just read all about it and comfortably watch these Nvidia cards from the sidelines.
I'm not concerned about the 450W usage. I already have a 3080 running the 450W BIOS. Not a problem to pull that heat out of my case.
 
My limitation is SFX power supplies; I'm not gonna get an ATX case just for moderately faster performance. But a 450W GPU isn't off the table for me if I can undervolt it to 350W and keep most of the performance. I am really curious what AMD has: they are claiming a 50% performance-per-watt increase, and if it is true I'm very interested in this product. We haven't seen gains like that in six years (since Pascal).
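For the undervolt-to-350W idea, here's a minimal sketch of the scriptable approximation: capping the board power limit through nvidia-smi. A true curve undervolt (Afterburner and friends) usually keeps more performance per watt, but power limiting gets most of the way there from a script. The GPU index and the 350W target here are assumptions for illustration.

```python
# Hedged sketch: cap a hypothetical 450 W card at 350 W via nvidia-smi's
# power limiter. Requires admin rights; the limit resets on reboot unless
# persistence mode is enabled.
import subprocess

def query_power_limits(gpu: int = 0) -> dict:
    """Read the current/min/max board power limits, in watts."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.limit,power.min_limit,power.max_limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    cur, lo, hi = (float(x) for x in out.split(","))
    return {"current": cur, "min": lo, "max": hi}

def set_power_limit(watts: float, gpu: int = 0) -> None:
    """Clamp the requested limit to the board's supported range, then apply it."""
    lims = query_power_limits(gpu)
    target = max(lims["min"], min(watts, lims["max"]))
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(target)], check=True)

if __name__ == "__main__":
    set_power_limit(350)  # the 450 W -> 350 W cap discussed above
```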
 
Yeah, I'm concerned more about how it affects the ambient room temperature, not cooling the card. Happy with my undervolted 3080 still. Not interested in the 4xxx series right now, especially if the power numbers are true.
This is why I'm curious about switching to the RX series, whether new or current. Prices for the 6000s are better, and regarding their next-gen cards, I've heard nothing about the insane power consumption rumored to [dare I say] plague the RTX 4000s.

My limitation is SFX power supplies; I'm not gonna get an ATX case just for moderately faster performance. But a 450W GPU isn't off the table for me if I can undervolt it to 350W and keep most of the performance. I am really curious what AMD has: they are claiming a 50% performance-per-watt increase, and if it is true I'm very interested in this product. We haven't seen gains like that in six years (since Pascal).
I agree with everything after your first sentence. Not sure what case you have now, but a ton of ITX cases these days can support ATX power supplies. I'd imagine undervolting to 350W would cause a considerable drop in performance if you run your cards at load, but I'd love to be wrong about this!
 
I'm not blowing out 450W of heat into my PC room just for a GPU. No way.
Yeah, I'm the same, I guess. I'm on 350W right now, but I think I'll wait for the following gen so that I may still get a significant performance increase while lowering the wattage to maybe 250W or so. I'm sure a 350W (maybe even 300W) RTX 4xxx will be an upgrade already, but I'm afraid it'll only be a small one.
 
This is why I'm curious about switching to the RX series, whether new or current. Prices for the 6000s are better, and regarding their next-gen cards, I've heard nothing about the insane power consumption rumored to [dare I say] plague the RTX 4000s.


I agree with everything after your first sentence. Not sure what case you have now, but a ton of ITX cases these days can support ATX power supplies. I'd imagine undervolting to 350W would cause a considerable drop in performance if you run your cards at load, but I'd love to be wrong about this!
I have a Lian Li/DAN A4-H2O (11L). My 3090 is undervolted to 750 mV with minimal performance loss; I'm running it at 1695 MHz, and it usually pulls between 120W and 270W depending on the game. Some AAA games will pull around 300W.
 
I'm not blowing out 450W of heat into my PC room just for a GPU. No way.
Agree. Will need to see how well it undervolts. My 3080 only draws about 235W right now, and in most games it's only a frame or two slower than stock; some, like HZD, are actually faster. Plus the frame rate stays consistent.

If AMD is moving away from a chiplet design, that will be really disappointing. I suspect that we will hear about 2X and 3X the speed, and in the end both the Nvidia and AMD cards will be 40% faster, like always. Likely skipping this generation unless somehow one of them has much higher performance gains and I can run it quietly.
 
If AMD is moving away from a chiplet design, that will be really disappointing.

They aren't. Even in the rumors that say the 7900-series parts will only have one GPU die (Coreteks is still saying it will have more than one GPU chiplet), the card is still expected to use multiple dies to reduce costs and improve power efficiency.
 
Supposing this is the case, then a short-supply launch of the paper variety seems plausible.
 
Seeing that some shops over here still charge €1,200 and more for RTX 3080s, and up to €2,600 for 3090 Tis, I'm not surprised they don't sell. For a lot of cards there has been stock for well over a year, but prices don't come down, not everywhere anyway.
 
Seeing that some shops over here still charge €1,200 and more for RTX 3080s, and up to €2,600 for 3090 Tis, I'm not surprised they don't sell. For a lot of cards there has been stock for well over a year, but prices don't come down, not everywhere anyway.
At those prices you'd be better off importing one from the U.S.
 
Yeah, we've been at the point where the $1K-plus buyer has already picked something up. I'm expecting stupidly high prices, and the need for a new PSU to go with the 4080/90 will make it even harder to swallow. Another reason they will probably produce fewer cards and keep prices high.
 
Thermaltake has a 1650W PSU with dual 16-pin PCI-E auxiliary power cables incoming.

https://videocardz.com/newz/thermal...-gen5-16-pin-connectors-custom-rtx-4090-ready

EVGA has you covered with a converter for using five 8-pin cables if your PSU doesn't support the new PCI-E 5.0 specification (made for the 3090 Ti K|NGP|N).



RIP anybody still on 15A breakers.
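Back-of-the-envelope on that breaker comment, assuming a North American 120 V circuit and ~90% PSU efficiency at load (both assumptions; efficiency varies by model and load point):

```python
# Wall draw vs. a 15 A / 120 V circuit, for the whole-PSU worst case.
BREAKER_AMPS = 15
MAINS_VOLTS = 120
PSU_RATED_WATTS = 1650      # DC output the Thermaltake unit is rated for
PSU_EFFICIENCY = 0.90       # assumed figure; varies with load and model

circuit_watts = BREAKER_AMPS * MAINS_VOLTS       # 1800 W absolute limit
continuous_watts = circuit_watts * 0.80          # 1440 W continuous rule of thumb
wall_draw = PSU_RATED_WATTS / PSU_EFFICIENCY     # ~1833 W at full rated load

print(f"Circuit: {circuit_watts} W ({continuous_watts:.0f} W continuous)")
print(f"PSU at full load: ~{wall_draw:.0f} W from the wall -> over budget")
```

The PSU will rarely sit at its full rating, but a 450W GPU plus the rest of the system leaves a lot less headroom than people are used to.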
 
jeebus... a 2750 MHz clock on the 4090 should obliterate the 3090 in pure rasterization performance. What do you guys think re: 4090 vs 3090 Ti? 30% better? 50% better?

Same garbage launch schedule is going to mean the F5 key will get destroyed. Again. Sigh...
 
jeebus... a 2750 MHz clock on the 4090 should obliterate the 3090 in pure rasterization performance. What do you guys think re: 4090 vs 3090 Ti? 30% better? 50% better?

Same garbage launch schedule is going to mean the F5 key will get destroyed. Again. Sigh...
Me, I think 0%-5% gaming-experience improvement. In other words, there are more useful and more interesting items to buy first over something where I would not notice a gameplay improvement. Still, I will see what comes from AMD and Nvidia and then determine if it is worth whatever cost comes with it.
 
Me, I think 0%-5% gaming-experience improvement. In other words, there are more useful and more interesting items to buy first over something where I would not notice a gameplay improvement. Still, I will see what comes from AMD and Nvidia and then determine if it is worth whatever cost comes with it.
If there are no per-clock (instruction throughput) improvements in the architecture, then the clock speed alone would equal a roughly 60% improvement in performance over the 3090.
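Rough arithmetic behind that claim, taking the rumored 2750 MHz from earlier in the thread against the 3090's official 1695 MHz boost clock. Identical per-clock throughput and unit counts are assumed (and real 3090s boost above their official clock in games), so treat this as a sketch, not a prediction:

```python
# Clock-scaling-only estimate, all else assumed equal.
RUMORED_4090_MHZ = 2750   # rumored clock cited in the thread
STOCK_3090_MHZ = 1695     # the 3090's official boost clock

speedup = RUMORED_4090_MHZ / STOCK_3090_MHZ - 1
print(f"Clock scaling alone: +{speedup:.0%}")   # ~ +62%
```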
 
If there are no per-clock (instruction throughput) improvements in the architecture, then the clock speed alone would equal a roughly 60% improvement in performance over the 3090.
What does that have to do with the gaming experience? For example, on a 120 Hz monitor, going from 200 fps to 400 fps in a given game is a 100% improvement in performance, yet I doubt even a few would consider the gaming experience any better in that particular case.

Blanket assumptions without considering the whole system can also lead one astray. Take Far Cry 6, an unusual case, maybe just poor usage of available CPU cores. In any case, look at the performance difference between the 5800X3D and the 5600, particularly the 1% lows (where it really counts), with data from Hardware Unboxed: a 36% improvement (1% lows) with the same GPU. When you're talking very high frame rates, the CPU really does matter. If one is going to get the presumed 50%-60% increase in performance from the 4090 (450W), it's probably best to have a 5800X3D or a good Alder Lake system to support it. AMD may need something even stronger (Zen 4 V-Cache?).

[Chart: Hardware Unboxed Far Cry 6 data, 5800X3D vs. 5600, average and 1% low FPS]

In my case the 5800X3D has been an eye opener, maybe because going from a 3900X to a 5800X3D in Far Cry, at 4K, was definitely noticeable in smoothness. Where the 3900X would drop to sub-40 fps (briefly, but it was there, noticeable), the 5800X3D maintains over 60 fps. Of course with RT and Ultra settings, using a 6900XT. Anyway, the 5800X3D will get a new MSI motherboard and will change places with the 3960X (the CPU that has been holding back the 3090).

Those who are going to foot the bill for the 4090 will, I say, need to max out the rest of the system as much as possible, including a monitor that shows the fps it can produce. For me, uber FPS (beyond 100, I know, kind of a low standard for some) adds very little gaming-experience value. I'm not a competitive gamer; those who are may have a real treat in the upcoming GPUs from Nvidia and AMD.
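As an aside, here's an illustrative sketch of how a "1% low" figure like the one above is typically derived: average the slowest 1% of frame times and convert back to FPS. The frame times below are made-up numbers, not the Hardware Unboxed data:

```python
# Why averages hide stutter: 1% lows report the slowest frames as FPS.
def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the worst 1% of frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 990 smooth frames at 8 ms (125 fps) plus 10 stutters at 25 ms (40 fps):
frame_times_ms = [8.0] * 990 + [25.0] * 10
print(f"Average FPS: {1000 * len(frame_times_ms) / sum(frame_times_ms):.0f}")  # ~122
print(f"1% low FPS:  {one_percent_low_fps(frame_times_ms):.0f}")               # 40
```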
 
Nice points. I'm on a 6800XT and have run it with a 5600X and a 12700K, and I can see in some instances where the 5600X leaves some frames on the table compared to the 12700K. The lows are really where it counts, for sure. I frame-limit faster games to 120 fps and slower strategy-type games to 90 or even 60. Not a competitive gamer anymore. At 1440p, plenty of GPUs can do this. As an enthusiast I'm always interested in the next thing and any tech or performance advances, but these halo cards are getting into the realm of e-peen and edge-case use here.
 
Yeah... gonna do the 5800X3D. It'll replace the 5800X... which will replace a 3700X... which will replace a 2700X. The AMD waterfall... ;)

Meantime, I may get a placeholder 6600xt and see what this fall/winter brings for GPUs. If there's something exciting (and available and not overpriced), it'll go into the flagship machine and the rest will waterfall down, just like the CPU.
 
What does that have to do with the gaming experience? For example, on a 120 Hz monitor, going from 200 fps to 400 fps in a given game is a 100% improvement in performance, yet I doubt even a few would consider the gaming experience any better in that particular case.

Blanket assumptions without considering the whole system can also lead one astray. Take Far Cry 6, an unusual case, maybe just poor usage of available CPU cores. In any case, look at the performance difference between the 5800X3D and the 5600, particularly the 1% lows (where it really counts), with data from Hardware Unboxed: a 36% improvement (1% lows) with the same GPU. When you're talking very high frame rates, the CPU really does matter. If one is going to get the presumed 50%-60% increase in performance from the 4090 (450W), it's probably best to have a 5800X3D or a good Alder Lake system to support it. AMD may need something even stronger (Zen 4 V-Cache?).

[Chart: Hardware Unboxed Far Cry 6 data, 5800X3D vs. 5600, average and 1% low FPS]

In my case the 5800X3D has been an eye opener, maybe because going from a 3900X to a 5800X3D in Far Cry, at 4K, was definitely noticeable in smoothness. Where the 3900X would drop to sub-40 fps (briefly, but it was there, noticeable), the 5800X3D maintains over 60 fps. Of course with RT and Ultra settings, using a 6900XT. Anyway, the 5800X3D will get a new MSI motherboard and will change places with the 3960X (the CPU that has been holding back the 3090).

Those who are going to foot the bill for the 4090 will, I say, need to max out the rest of the system as much as possible, including a monitor that shows the fps it can produce. For me, uber FPS (beyond 100, I know, kind of a low standard for some) adds very little gaming-experience value. I'm not a competitive gamer; those who are may have a real treat in the upcoming GPUs from Nvidia and AMD.
Depends on what you play and your setup, in terms of gaming-experience improvement. The 3090 Ti won't max out Cyberpunk with ray tracing and DLSS at 4K 120. I'd wager the experience improvement there will be big. Who knows what future titles will require? I'm getting the impression games like Stalker 2 will bring the 3090 down a peg or two. I personally won't be buying a 3090 or a 4090, but the 4070 rumours have my ears perked.
 
In the last few weeks, we went from "40 series launches in July" to "only the 4090 will launch this year."

https://twitter.com/greymon55/status/1547805136210509824

AD102:2022
AD103/104/106:2023
They have an abundance of surplus Ampere cards to move, coupled with a massive downturn in discretionary spending. Not surprising the launch continues to slip.

I'm guessing paper launch for the 4090 in September with actual retail launch in October. But who knows at this point, may be later.
 
They have an abundance of surplus Ampere cards to move, coupled with a massive downturn in discretionary spending. Not surprising the launch continues to slip.

I'm guessing paper launch for the 4090 in September with actual retail launch in October. But who knows at this point, may be later.
Yeah I figure we will be lucky to see the 4080 by the end of the year. I almost expect paper launches at this point.
 
They will definitely have 4090 retail availability this October. The rest will depend on what AMD shows, I think.
 