RTX 4xxx / RX 7xxx speculation

No, they definitely brute-forced it with the 290X. 95°C die temps under load were considered "normal and by design." They did beat the OG Titan with it, though, hence the release of the 780 Ti shortly thereafter.

The stock cooling was bad, no doubt, but perf per watt wasn't actually much different in Uber mode vs. stock, which shows it had overclocking headroom that wasn't even pushing it beyond the efficient part of the power curve.

If it had been, the 390X would not even have been possible as a SKU, since *that* was actually the 290X pushed past the normal efficiency curve.
 
Apparently the next gen will be some 80-120% faster than current gen.
Well, considering this recent news, it seems you could be quite right:
https://www.anandtech.com/show/17327/nvidia-hopper-gpu-architecture-and-h100-accelerator-announced

Lovelace will probably be on 4nm like Hopper, instead of 5nm, and the H100 is a little more than 3x the A100 in performance (breaking the 1,000 TFLOPS bar at FP16 and 2,000 at INT8) while less than doubling the power (on a same-size or slightly smaller die).

Going from 400W to 700W is also in line with a massive 500-600-something-watt GPU being possible here (if they keep a similar ratio; though considering the datacenter and gaming architectures will be more split this time than with Ampere, maybe not).
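If those numbers hold, a quick back-of-envelope perf-per-watt check (my own sketch; the ~312 dense FP16 tensor TFLOPS for the A100 is the commonly cited spec, the rest are the figures quoted above) looks like this:

```python
# Rough perf/watt comparison, A100 vs H100, using the figures from the post above.
a100_tflops, a100_watts = 312, 400    # A100 SXM: ~312 dense FP16 tensor TFLOPS, 400 W
h100_tflops, h100_watts = 1000, 700   # H100 as quoted above: ~1000 TFLOPS FP16, 700 W

perf_ratio = h100_tflops / a100_tflops    # ~3.2x the throughput
power_ratio = h100_watts / a100_watts     # 1.75x the power
print(f"perf: {perf_ratio:.1f}x, power: {power_ratio:.2f}x, "
      f"perf/watt: {perf_ratio / power_ratio:.1f}x")
```

So even with the 700W ceiling, perf per watt still improves by roughly 1.8x, if the quoted numbers are right.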
 
600W is nothing new since people ran SLI configs that easily hit 600 watts and more with prior generations. That being said, it sucked then, and it'll suck now unless you live in a cold climate.
 
600W is nothing new since people ran SLI configs that easily hit 600 watts and more with prior generations. That being said, it sucked then, and it'll suck now unless you live in a cold climate.
It is new for a single card. I think the last dual-chip card was the 295X2, and that was a 500W TDP card.
 
I retrofitted my office last week with a return air duct and put a better booster in my forced-air duct in anticipation of these 600W cards. lol
My eventual goal is the PC in the basement with cables run upstairs... eventually.
 
How does 500 or 600 watts concentrated all in one area not melt plastic or do long-term damage? I touched the backplate on my 3080 Ti while it was using around 350 watts and it was fucking scorching hot, and the amount of heat coming off around the card was insane.
 
How does 500 or 600 watts concentrated all in one area not melt plastic or do long-term damage? I touched the backplate on my 3080 Ti while it was using around 350 watts and it was fucking scorching hot, and the amount of heat coming off around the card was insane.
It depends on how it's directed and dissipated. Sure, if it was concentrated in a very small area that would be an issue. We can operate 1000w+ heaters with no issues unless you literally stick something right in front of them.
 
It depends on how it's directed and dissipated. Sure, if it was concentrated in a very small area that would be an issue. We can operate 1000w+ heaters with no issues unless you literally stick something right in front of them.
I don't know; it just seems like anything in contact with that backplate, or in the immediate vicinity, would eventually be cooked. The difference in heat generated in the PC case between a 3070 and a 3080 Ti is absolutely massive, and that's only about a 120W difference, so I can't even imagine going from 350W to 600W. And at what point does all this nonsense end? It just seems like everything is actually getting more inefficient and more power-hungry when it comes to PC gaming.
 
I don't know; it just seems like anything in contact with that backplate, or in the immediate vicinity, would eventually be cooked. The difference in heat generated in the PC case between a 3070 and a 3080 Ti is absolutely massive, and that's only about a 120W difference, so I can't even imagine going from 350W to 600W.
I know I will enjoy seeing casual SFF builds cope with 450W+ GPUs lol.
 
How does 500 or 600 watts concentrated all in one area not melt plastic or do long-term damage?

It will. I warped a wall with my triple-Crossfire setup back in the day. The carpet got all stiff, too, and that was on top of a wood PC case stand.

Far Cry 2 was unbelievable, though. Just mind-blowing.
 
It just seems like everything is actually getting more inefficient

I think that's an illusion caused by the fact that the efficiency increase doesn't fully keep up with how much more performance is wanted. Look at consoles over time, for example: the hardware is not that different from PC hardware, and the power draw from generation to generation doesn't really move up (I think? Could be wrong.)

A 1080 Ti under gaming load is around 210-240W, with 300W peaks.
A 3070 peaks at around 208W under load; well, it changes from source to source, but they all show roughly that:
https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/31.html

With the 3070 using less power than a 1080 Ti, it is about 30% more efficient.

But that does show that efficiency did not rise all that much in five years, so even without getting more inefficient, a lot of the gain comes from more power.
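To put rough numbers on that (my own quick sketch; the ~1.30 performance ratio is the approximate TechPowerUp average lead of the 3070 over the 1080 Ti, and the wattages are the ones quoted above):

```python
# Perf/watt gain of a 3070 over a 1080 Ti, using the ballpark figures above.
perf_ratio = 1.30               # ~30% faster on average (approximate)
w_3070 = 208                    # gaming-load wattage quoted above
for w_1080ti in (210, 240):     # low and high end of the 1080 Ti range quoted above
    gain = perf_ratio * (w_1080ti / w_3070) - 1
    print(f"vs a {w_1080ti} W 1080 Ti: ~{gain:.0%} better perf/watt")
```

Depending on which 1080 Ti power figure you take, that lands somewhere around 30-50%: a real gain, but modest for a few years between launches.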



But my guess is that in 2022 you can build a gaming machine more powerful than a 2016 one using the same wattage.
 
Care or not, 600W cards are going in the wrong direction. Maxwell was great: more performance and less power draw.

This is for a top tier card though. People buying a 4090 will pay a huge premium for it, and Nvidia will crank everything to ridiculous levels because it’s a halo product and they need the raw benchmark numbers. This doesn’t mean that the 4080, 4070, etc, will necessarily be huge power hogs vs previous generations.
 
This is for a top tier card though. People buying a 4090 will pay a huge premium for it, and Nvidia will crank everything to ridiculous levels because it’s a halo product and they need the raw benchmark numbers. This doesn’t mean that the 4080, 4070, etc, will necessarily be huge power hogs vs previous generations.
This is my feeling. The 90 series / Titan are the "gloves off" offerings - and it is good to have such critters.

Most of the others have to go into some pretty unimpressive OEM offerings, so are constrained far more than halo / boutique selections.
 
<shortening post since mine ended up longer than I thought>

For what it's worth, Tom's Hardware has a performance per watt table:
https://www.tomshardware.com/features/graphics-card-power-consumption-tested

Apparently the 6800 is the most efficient card in the current generation, and most newer cards are more efficient than their older counterparts (until you get up to Nvidia's 3080s vs. the 2080s). It doesn't include the 3080 Ti, but I don't know if that would skew anything.

My take: if these 400-600W GPUs are actually 2x+ as powerful, it would... I guess... be worth upgrading. I would need to upgrade my PSU from 850W to 1kW to run the GPU and my 5950X at the same time. Hopefully they'll just stay within a reasonable price bracket, and it will actually be worth upgrading.
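For what it's worth, here's roughly how I'd budget that (all of these numbers are my own ballpark assumptions, not measurements):

```python
# Rough PSU sizing for a 600 W GPU plus a 5950X (ballpark assumptions only).
gpu_w  = 600    # hypothetical next-gen flagship board power
cpu_w  = 142    # 5950X stock package power limit, approximately
rest_w = 100    # motherboard, RAM, drives, fans, USB (guess)
margin = 1.25   # headroom for transient spikes and PSU efficiency sweet spot

steady = gpu_w + cpu_w + rest_w
print(f"steady state ~{steady} W, suggested PSU ~{steady * margin:.0f} W")
```

Which comes out right around the 1kW mark.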

Yeah, right. Did you enjoy that joke? With the current state of GPU prices, I'm just still going to stick with my 2080. I'm not even sure what I would use the 4xxx to play. My games look good enough to me. If the 3080 Ti prices ever plummeted, I would much sooner upgrade to that instead. That would be far more than good enough for me at 3440x1440 for the foreseeable future. I guess I've just gotten a bit jaded and stopped caring about a few more smoke particles in my game.
 
This is for a top tier card though. People buying a 4090 will pay a huge premium for it, and Nvidia will crank everything to ridiculous levels because it’s a halo product and they need the raw benchmark numbers. This doesn’t mean that the 4080, 4070, etc, will necessarily be huge power hogs vs previous generations.
From the rumors I've heard, it sounds like everything is going up. But I'm happy to be wrong there.
 
This is for a top tier card though. People buying a 4090 will pay a huge premium for it, and Nvidia will crank everything to ridiculous levels because it’s a halo product and they need the raw benchmark numbers. This doesn’t mean that the 4080, 4070, etc, will necessarily be huge power hogs vs previous generations.

Well, there hasn't been a single-chip card at that level, though. Even the base 3090 was 350W. We are talking 600W base. From all the rumors, it's because Nvidia is in desperation mode. It looks like RDNA3 is a monster, and Nvidia wants to get as close as they can no matter what, with that extra 10-20% in performance costing them some crazy power.
 
600 watts is great. Hope it is 2X faster than 3090. Then we are really talking.

It has nothing to do with the 3090. They are pushing it hard because of RDNA3. The rumor is RDNA3 is going to be around 450W or less with more performance. Nvidia was there too, but they had to get an extra 10-20% of performance, and it's going to cost them a shitload of power. The 4080 will likely be the sweet spot.
 
Apparently the new 3090 Ti can consume 450 watts.

I feel like it's just built to make you feel like, "Holy shit, we got all this performance increase for only a 150W increase with the 4090? Wow."

Nvidia is just settling that into everyone's minds lmao.
 
I feel like it's just built to make you feel like, "Holy shit, we got all this performance increase for only a 150W increase with the 4090? Wow."

Nvidia is just settling that into everyone's minds lmao.
The crazy part is GamersNexus clocked the FTW3 3090 Ti at 503 watts stock and 533 watts overclocked.
 
I feel like it's just built to make you feel like, "Holy shit, we got all this performance increase for only a 150W increase with the 4090? Wow."

Nvidia is just settling that into everyone's minds lmao.
Add the MSRP pricing element, and I think that yes, in significant part that product could exist more to set up the press release and comparison graphs at the Lovelace launch than to make a lot of money on its own.
 
The crazy part is GamersNexus clocked the FTW3 3090 Ti at 503 watts stock and 533 watts overclocked.

Yeah, it seems like this is a lead-in to next gen and a product for suckers. Now the 4090 might give you an 80% boost, but hey, it only took about 100 more watts, so it's so efficient lmao.
 
I feel like it's just built to make you feel like, "Holy shit, we got all this performance increase for only a 150W increase with the 4090? Wow."

Nvidia is just settling that into everyone's minds lmao.
Sounds like you have it all figured out. Buy what you like.
 
*me checks prices on 1kW+ PSUs*

Maybe buying the AX1600i in 2018 wasn't so crazy after all... And I got years of a completely silent PSU for the vast majority of the time.

I do wonder where the limit will be for most consumers. 2kW? 3kW? Even if the 40xx series is being cranked to the limit just to get performance supremacy over RDNA3, the fact is power requirements seem to have really spiked in recent years. And the bigger issue is the transient peak loads that have overwhelmed even the recommended PSU wattages for certain power supplies.
 
I do wonder where the limit will be for most consumers. 2kW? 3kW? Even if the 40xx series is being cranked to the limit just to get performance supremacy over RDNA3, the fact is power requirements seem to have really spiked in recent years. And the bigger issue is the transient peak loads that have overwhelmed even the recommended PSU wattages for certain power supplies.
For a start, in a lot of North America a 15A, 120V circuit would be an absolute hard limit on mainstream total computer power, i.e. a 1600W PSU will be the absolute ceiling for a long time.

And you have to expect monitors and all the rest to be on the same circuit, so even before talking about the cooling challenge, actual power from the wall will be the limit well before 2kW; before 1,250 watts, I would imagine.
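The wall-outlet math is pretty stark (a quick sketch; the 80% figure is the usual continuous-load rule of thumb, and the 100W for everything else on the circuit is my own guess):

```python
# What a standard North American 15 A / 120 V circuit can actually sustain.
breaker_amps, volts = 15, 120
peak_w = breaker_amps * volts     # 1800 W absolute
continuous_w = peak_w * 0.80      # 1440 W under the 80% continuous-load rule
other_loads_w = 100               # monitor, speakers, etc. on the same circuit (guess)
print(f"peak {peak_w} W, continuous {continuous_w:.0f} W, "
      f"left for the PC ~{continuous_w - other_loads_w:.0f} W")
```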
 
For a start, in a lot of North America a 15A, 120V circuit would be an absolute hard limit on mainstream total computer power, i.e. a 1600W PSU will be the absolute ceiling for a long time.
Could always run a 220V line. Relatively easy work.
 
Could always run a 220V line. Relatively easy work.
I am assuming that most consumers (that was an important part of the message) would not be interested in doing electrical work for their computer; being able to just use their current outlet is a must.

Lots of consumers use Wi-Fi even for devices that never move (or use a laptop around their own house); imagine asking them to run a 220V line.

Maybe the question was meant as "most consumers of 3090-type cards," who might actually do that; you could certainly have a point that they are far from being most PC consumers. In the next decade, with electric cars charging at home, 400-amp service and higher voltages reaching more places could become more common.
 
Yeah, there is the 15A, 120V limit for the majority of households. Interestingly enough, though, there was a PSU brand called x3 that advertised a 1600W PSU over a decade ago (it was even reviewed here at HardOCP). Thing is, it required a 20-amp circuit to hit that power output. In reality, it was a 1200W PSU for the majority of NA households, one that could do 1600W if you really wanted/needed it and had the wiring for it.
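That 20-amp requirement follows directly from the wall-side draw once you factor in PSU efficiency (a sketch; the ~90% efficiency is my own assumption):

```python
# Why a 1600 W PSU outgrows a 15 A circuit: wall draw vs. sustained circuit capacity.
psu_output_w = 1600
efficiency = 0.90                          # assumed conversion efficiency
wall_draw_w = psu_output_w / efficiency    # ~1780 W pulled from the outlet
for amps in (15, 20):
    continuous_w = amps * 120 * 0.80       # 80% continuous-load rule of thumb
    verdict = "fits" if wall_draw_w <= continuous_w else "too much"
    print(f"{amps} A circuit: {continuous_w:.0f} W continuous -> {verdict}")
```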

Is that the future? Want more power then you gotta upgrade your household circuit?

Ok, enough of the digression and silly speculation. Back on topic.

I just hope mining activity stays relatively low and the supply chain stabilizes so that insane demand doesn't evaporate stock of the 4000/7000 GPUs this fall. I'm curious where Nvidia/AMD (and Intel to a lesser extent) will price the cards. The Pascal GPUs were priced decently, but then GPU mining took off in a big way, causing the first shortages. Nvidia tried to correct for that with Turing, but those were priced too high and landed right as mining fell off. Then Nvidia tried to fix that with good pricing on Ampere and ended up on the wrong end of the demand curve yet again. After being burned three times, I'm wondering what direction Jensen Huang will go this time.
 