NVIDIA GeForce RTX 4070: up to 30% faster than RTX 3090 in gaming

Don't forget the 150-300W heat output of the bag of meat between keyboard and chair while gaming. More if also drinking Red Bull.
 
I think we've about hit the peak of what humanity is capable of when it comes to high-end silicon-based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little it's progressed since Sandy Bridge - and now we're only going to see a 25% performance improvement after two years with GPUs, and not only that, it comes with 25% more power consumed and 25% larger coolers. And that's after the jump from the 2080 Ti to the 3090 wasn't that big either (around 30%, if I recall correctly?). At least 4K (per eye) VR at 90fps is within reach with where we are now.
 
I think we've about hit the peak of what humanity is capable of when it comes to high-end silicon-based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little it's progressed since Sandy Bridge - and now we're only going to see a 25% performance improvement after two years with GPUs, and not only that, it comes with 25% more power consumed and 25% larger coolers. And that's after the jump from the 2080 Ti to the 3090 wasn't that big either (around 30%, if I recall correctly?). At least 4K (per eye) VR at 90fps is within reach with where we are now.
2080 Ti to 3090 was 50%.
 
I think we've about hit the peak of what humanity is capable of when it comes to high-end silicon-based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little it's progressed since Sandy Bridge - and now we're only going to see a 25% performance improvement after two years with GPUs,
Seems to be around 65% for the same die size, with about 2.7 times more transistors per mm²:
https://wccftech.com/nvidia-geforce...k-2077-dlss-3-cuts-gpu-wattage-by-25-percent/

We'll see with the final results, but the 7950X seems to be a massive jump from the previous 5950X:

[Chart: AMD Ryzen 9 7950X overall CPU benchmark results]


The 3090's average FPS was about 44% higher than the 2080 Ti's, from a significantly smaller die (628mm² vs. 754mm²):
https://www.techspot.com/review/2105-geforce-rtx-3090/

I am not sure how much stronger the GPU itself needs to be to gain 44% (since there are elements other than purely the GPU involved - maybe 50-55% more powerful? - and the giant memory bandwidth boost could be a major part of that performance gain).
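As a rough sketch of the per-area gain implied by those numbers (ignoring memory bandwidth, clocks, and node differences, so take it as a ballpark only):

```python
# Rough per-mm² comparison of the RTX 3090 vs. the RTX 2080 Ti, using the numbers above.
# This ignores memory bandwidth, clocks and node differences, so treat it as a sketch only.

rtx_2080_ti = {"relative_fps": 1.00, "die_mm2": 754}  # 2080 Ti as the baseline
rtx_3090    = {"relative_fps": 1.44, "die_mm2": 628}  # ~44% higher average FPS

perf_gain = rtx_3090["relative_fps"] / rtx_2080_ti["relative_fps"]   # 1.44x
area_ratio = rtx_3090["die_mm2"] / rtx_2080_ti["die_mm2"]            # ~0.83x
perf_per_mm2_gain = perf_gain / area_ratio                           # ~1.73x

print(f"raw gain: {perf_gain:.2f}x, die area ratio: {area_ratio:.2f}x, "
      f"performance per mm²: {perf_per_mm2_gain:.2f}x")
```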

If we add Apple silicon and others (like what Amazon does) I am really unsure, and on the CPU side the stagnation claim needs to be restricted to single-threaded performance: a 5950X's PassMark score is about 46k, while a Sandy Bridge-era 3970X that launched at $1,000 in 2012 dollars sits around 8.4k. That is still a 5x gain, and some Threadrippers double that for a 10x gain.

The recent Nvidia Hopper reached 2,000 TFLOPS in some workloads; they put ~80B transistors on a die with 3 TB/s of memory bandwidth, and it tripled the performance of the two-year-old A100 in many ways while less than doubling the power used. Maybe silicon with regular binary logic is at the end of its rope, but maybe not: 2nm and 1nm are coming up, and possibly another 10x down after that to 0.1nm, according to some in the field.

And if some of the challenges of chiplet design and hardware-software parallelization get worked out, that opens the door to a whole new life.
 
Small room, large room, same heat.

How much AC? Lol wtf are you talking about? That's not how things work.
Nifty thing about CPUs and GPUs: if it pulls 450W, then it dissipates 450W.

Lifespan is not calculable by any review or influencer. This would literally have to be taken at the manufacturer's word.

Bake expensive hardware? 🤣

At any power draw above or below 300w, there is a cost to run. Why did you decide to set 300w as this arbitrary point? Do you get free electricity if your GPU draws ≤300w?

Your final statement shows your lack of understanding of how these things work.

Should probably just delete your whole post.
I'm not an idiot. Yes, it's the same heat. I think everyone, including you, knew what I was getting at, whether or not the post was phrased the best way. And yes, I picked an arbitrary point for power draw based on what USED to be considered a high-wattage system setup. The constant draw possible with a 3080 Ti and an i9-12900K is frankly huge. I also based my number on what I considered a high electric bill in the US to run that system at around $0.15 per kilowatt-hour for 8 hours a day. I was actually using 450W as a total-system constant average power draw number. Many gaming systems can easily hit this number today when playing an AAA game. Mine can.
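For reference, the quick back-of-the-envelope math behind that number, as a rough sketch (450W average system draw, 8 hours a day, $0.15/kWh):

```python
# Daily and monthly electricity cost for the scenario described above:
# 450 W average whole-system draw, 8 hours of gaming per day, $0.15/kWh.

avg_draw_watts = 450
hours_per_day = 8
rate_per_kwh = 0.15  # USD

kwh_per_day = avg_draw_watts / 1000 * hours_per_day  # 3.6 kWh
cost_per_day = kwh_per_day * rate_per_kwh            # ~$0.54
cost_per_month = cost_per_day * 30                   # ~$16.20

print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day, ~${cost_per_month:.2f}/month")
```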

Add in my 47" and 32" montors and speaker system into the calculation... and well... the 450w number I picked becomes conservative.

If it makes you happy, I should have just left my arbitrary number out. It muddied the conversation for no purpose.

Lifespan can be discussed in the context of potential failure rates when you give average-Joe consumers hardware with stupidly high heat-dissipation requirements, then consider all the ways a system can be slightly mis-built or misused. For example, I fixed one for a customer today with a 3080 Ti and a Ryzen CPU where one of the two CPU fans was mounted backward. How often is this kind of thing happening, and what is the long-term effect on all these new super-high-TDP CPUs and GPUs relative to their previous versions?

There are all sorts of ways hot-running high-end computers end up inadequate for the task:
Built in a case that just doesn't have quite enough cooling capacity for the hardware put in it,
Has enough airflow but gets put somewhere that blocks all the intake and venting,
Case fans installed completely wrong for airflow direction,
Gets very clogged with dust,
Buried in papers,
Coated in cat hair,
Set next to the heat register output in a room, etc., etc.

I see it every day.

Up until now, the majority of "enthusiast" PC hardware that people might buy at retail has been remarkably tolerant of poor operating environments and moderate build mistakes. But I've been building custom PCs for customers for decades. In the last 2-3 years, more and more people in the strata I would call average gamers - or enthusiast gamers who don't make any kind of hobby of computer hardware - now have systems with max power draws hundreds of watts higher than anything they were ever likely to purchase before. These aren't [H] enthusiasts like here. They are just plain bog-standard USERS who don't care how it works, and they now have systems that may need serious care and attention occasionally. That's what I'm talking about needing to be mentioned in reviews.

How many of these new GPUs and CPUs are possibly "too much hardware" for many of the people being marketed to?

Maybe there won't be an issue... but I'm sure going to be watching to see how many RTX 3080-and-above systems are dying at the 4-year mark instead of the 7-10 year mark. If the hardware is going to be this expensive, people are going to EXPECT it to last, whether or not that expectation is realistic.

The average power draw of these cards is going up each generation, not down. You don't buy a 3080 Ti to lock the FPS at 60 and play at 1080p to try to save power when rendering. You're buying one to run 1440p or 4K as fast as you can with as much eye candy as you can.

Back to the room heat issue.

In a smaller room it takes less time to kick the temp up 10 or even 15 degrees. For me a whole lot less than one gaming session. It's not unusual for me to look up during a game and suddenly realize my office is at 85F. Hence the need for window AC frequently or constantly. And not all houses are perfectly efficient. This room leaks just enough that I can get away with it longer in winter as long as the central vent is closed and the room is never heated by central air. But I have to have a window AC available to supplement even when central cooling is active because the central air thermostat is obviously basing off the rest of the house, not just this room. The central air system turns on and off based on the temp and needs of the rest of the house. The office keeps getting heat dumped into it from the PC for however long a game is running.

If your PC happens to be in a fortunate place in your house for room size, airflow, and position relative to your central air intake then you may just have no noticeable problem.
 
I think we've about hit the peak of what humanity is capable of when it comes to high-end silicon-based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little it's progressed since Sandy Bridge - and now we're only going to see a 25% performance improvement after two years with GPUs, and not only that, it comes with 25% more power consumed and 25% larger coolers. And that's after the jump from the 2080 Ti to the 3090 wasn't that big either (around 30%, if I recall correctly?). At least 4K (per eye) VR at 90fps is within reach with where we are now.
Yeah, I definitely agree here.

In summer 2004, I bought an ATI Radeon 9600 XT for about $200 USD.

In late 2005, I bought an ATI Radeon X1800 XL for about $400 USD, and it was about 4x faster than the 9600 XT (PassMark ~140-150 pts vs. 35 pts).

In late 2010, I bought a GeForce GTS 450 for about $100 USD, which was about 10x faster than the X1800 XL (PassMark ~1,300 vs. ~150).

In summer 2022, I bought a GeForce GTX 1650 Super for about $250 USD, which was about 8x faster than the GTS 450 (PassMark ~10,000 vs. 1,300).

I know it's not a perfect comparison since these cards are not all at the same price point, but if components had kept increasing in speed between 2010 and 2022 at the same rate as between 2005 and 2010, then my GTX 1650 Super should have been somewhere on the order of 100x faster than my GTS 450, when it is only about 8 times faster. It's much more than just Moore's Law being broken; PC speed increases have been decelerating rapidly over the past 20 years. It's unclear to me whether PC speeds will just hit a ceiling at some point and stop increasing altogether, or whether the rate of increase will stabilize at some vastly lower level than what we were used to in the past.
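To put a rough number on that deceleration, here is the annualized rate implied by those PassMark scores (dates rounded to whole years, so it's only a sketch):

```python
# Annualized speed-up implied by the PassMark scores above.
# Dates are rounded to whole years, so this is only a rough illustration of the trend.

upgrades = [
    ("9600 XT -> X1800 XL",       2004, 2005,    35,    145),
    ("X1800 XL -> GTS 450",       2005, 2010,   145,  1_300),
    ("GTS 450 -> GTX 1650 Super", 2010, 2022, 1_300, 10_000),
]

for name, y0, y1, score0, score1 in upgrades:
    years = y1 - y0
    total = score1 / score0
    annual = total ** (1 / years) - 1
    print(f"{name}: {total:.1f}x over {years} years (~{annual:.0%}/year)")
```

At the ~55%/year pace of 2005-2010, twelve years would have compounded to well over 100x, so if anything that "100x" guess was conservative.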
 
Generally more like 75-150W; but yeah, it's non-zero.
For a 150 lb, 25-35-year-old male, it's 80W while asleep. Even casual conversation can bring that up to 120W. I assume any game where fps matters would involve a higher state of engagement than that. Not a ton of heat, but at least as much as an AMD card for every 150 lbs of meat in the room (don't forget the 75 lb dog). Heat load just from the metabolism of room occupants is an actual input to HVAC system design. Don't forget the 60-100W from each screen, 150W from the rest of the system, and 20W from room lighting. We're at almost 500W before we add in the GPU. With his 300W baseline, we're now at 800W of heat in the room. At this point, what difference does an extra 150W actually make? The room is going to heat up regardless (unless you use a fan to exchange air between the room and the rest of the home - and then this all becomes moot).

Oh, almost forgot: There's a window and you're playing on a sunny day? Add 300W per window to the tally.
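For the curious, a minimal tally of those estimates (assuming one engaged occupant and two screens; all wattages are the rough figures above):

```python
# Rough heat-load tally for a small office, using the estimates above.
# Everything the occupants and electronics in the room draw ends up as heat in the room.

baseline_watts = {
    "occupant (engaged)": 120,
    "two screens (~80 W each)": 160,
    "rest of the system": 150,
    "room lighting": 20,
}
gpu_watts = 300           # the 300 W baseline used above
sunny_window_watts = 300  # optional solar gain, per window

before_gpu = sum(baseline_watts.values())  # ~450 W ("almost 500 W")
with_gpu = before_gpu + gpu_watts          # ~750-800 W
print(f"before GPU: {before_gpu} W, with GPU: {with_gpu} W, "
      f"add a sunny window: {with_gpu + sunny_window_watts} W")
```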

I also based my number on what I considered a high electric bill in the US to run that system at around $0.15 per kilowatt-hour for 8 hours a day.
If you're having trouble paying the $0.54/day for the electricity used by a 450W GPU while gaming for 8 hours every day, you could also try getting a job.

Of course, one could ask how you were able to buy that $3k gaming setup to begin with. An unemployed fella is much more likely to have the 250W card, which not only makes your own room-heating argument disappear but is also still massively faster than the 250W card of the previous generation.

How many of these new GPUs and CPUs are possibly "too much hardware" for many of the people being marketed to?

Maybe there won't be an issue... but I'm sure going to be watching to see how many RTX 3080-and-above systems are dying at the 4-year mark instead of the 7-10 year mark. If the hardware is going to be this expensive, people are going to EXPECT it to last, whether or not that expectation is realistic.

The average power draw of these cards is going up each generation, not down. You don't buy a 3080 Ti to lock the FPS at 60 and play at 1080p to try to save power when rendering. You're buying one to run 1440p or 4K as fast as you can with as much eye candy as you can.

Back to the room heat issue.

In a smaller room it takes less time to kick the temp up 10 or even 15 degrees. For me a whole lot less than one gaming session. It's not unusual for me to look up during a game and suddenly realize my office is at 85F. Hence the need for window AC frequently or constantly. And not all houses are perfectly efficient. This room leaks just enough that I can get away with it longer in winter as long as the central vent is closed and the room is never heated by central air. But I have to have a window AC available to supplement even when central cooling is active because the central air thermostat is obviously basing off the rest of the house, not just this room. The central air system turns on and off based on the temp and needs of the rest of the house. The office keeps getting heat dumped into it from the PC for however long a game is running.

If your PC happens to be in a fortunate place in your house for room size, airflow, and position relative to your central air intake then you may just have no noticeable problem.
I've gotta say, this is one of the biggest reaches for an anti-progress argument that I've seen in a long time. People have cats and cats clog up filters and clogged filters increase case temps and increased case temps lead to increased component temps and increased component temps make everything die instantly and nobody anywhere over the last 30 years has thought to maybe have components self-throttle when temps cross certain thresholds, so therefore technological progress should be halted!

Meanwhile, watts per frame has actually been going down with each new generation....
 
Seems to be around 65% for the same die size, with about 2.7 times more transistors per mm²:
https://wccftech.com/nvidia-geforce...k-2077-dlss-3-cuts-gpu-wattage-by-25-percent/

We'll see with the final results, but the 7950X seems to be a massive jump from the previous 5950X:

[Chart: AMD Ryzen 9 7950X overall CPU benchmark results]

The 3090's average FPS was about 44% higher than the 2080 Ti's, from a significantly smaller die (628mm² vs. 754mm²):
https://www.techspot.com/review/2105-geforce-rtx-3090/

I am not sure how much stronger the GPU itself needs to be to gain 44% (since there are elements other than purely the GPU involved - maybe 50-55% more powerful? - and the giant memory bandwidth boost could be a major part of that performance gain).

Things have been misleading for a while now because most of the reviews out there are "stock vs. stock" and tend to ignore things like power draw (although it is reported). The TDP jumped pretty massively from the 2080 Ti to the 3090. When both are operating at about the same power draw (let's say you overclock both to ~400W, which puts a 2080 Ti around ~2200MHz and a 3090 around maybe ~1900MHz), you're only around 25-30% faster on the 3090 (that's about what I witnessed making the jump).

The 7950X looks great at first glance, but again we're looking at significantly increased power draw to pull off the bulk of the reported performance increase. When you're operating both the 5950X and the 7950X at ~250W, the performance gap isn't anywhere near as large.

We're about to have ~500W GPUs and ~250W CPUs become the norm (they already are for those tweaking and overclocking), and then things will "slow down", as we won't be able to just throw another ~20% power draw at the following generation to make the "out of the box" benchmarks look better than they really are.
 
I am not sure the ability of a chip to receive and use more power at the same or even better efficiency isn't progress too. I'm not sure we had heatsink-less x86 chips back in the day for efficiency's sake rather than because they simply couldn't make a hot chip.
 
Our home office has my computer, a 5950X (220W+) and an RTX 3080 (350W+), as well as my partner's PC with a 3900X (190W+) and a 1080 Ti (250W+). Add 5 monitors between the two of us, and we still manage to keep the room cool enough for extended co-op gameplay sessions.

We don't even have AC in the office. We just use box fans in windows (mostly because a window AC unit combined with the PC power draw was tripping breakers. 😅)
🤷

Anyway, performance per watt is an important metric. My overclocking these days is mostly based around seeing the highest amount of performance I can get before a 50MHz increase takes like 10W all by itself.
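Conceptually something like this - the clock/power points below are made up purely to illustrate the idea of stopping before the efficiency knee:

```python
# Walk up a (clock, power) curve and stop where the marginal cost of the next
# 50 MHz step exceeds a watt budget. The sample points are hypothetical,
# purely to illustrate the 'stop before the efficiency knee' approach.

samples = [  # (core clock in MHz, whole-card power in W)
    (1800, 280), (1850, 286), (1900, 293),
    (1950, 301), (2000, 312), (2050, 328), (2100, 350),
]
max_watts_per_step = 10  # give up once +50 MHz costs more than this

best = samples[0]
for prev, cur in zip(samples, samples[1:]):
    if cur[1] - prev[1] > max_watts_per_step:
        break
    best = cur

print(f"stop at {best[0]} MHz / {best[1]} W")  # -> stop at 1950 MHz / 301 W
```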

Do I think my next GPU is gonna use over 400 watts? Probably. But I'm also in a situation where I'm finding uses for excessive amounts of VRAM and 24GB is sounding mighty tempting, even more than the performance increases.
 
I want you to reply to FrgMstr's reply to you.

You speak with an authoritative voice, but you aren't an authority. The one you attempt to rebut, however, is. So what's up? What do you know that a 30-year+ industry veteran doesn't? What's your source?
Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol.

Looks like I was 100% correct regarding 600W as well. The stock BIOS allows you to go +33% on TDP. Can anyone here tell me what 133% of 450W is? I keep coming up with 598.5W, but everybody seems to be telling me I'm wrong.

It's almost like I've actually designed electronics before. Go figure.
 
Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol.

Looks like I was 100% correct regarding 600W as well. The stock BIOS allows you to go +33% on TDP. Can anyone here tell me what 133% of 450W is? I keep coming up with 598.5W, but everybody seems to be telling me I'm wrong.

It's almost like I've actually designed electronics before. Go figure.
You didn't forget about anything. You chose not to answer.
And you were disputing FrgMstr's knowledge of the 600W figure. You said there was no way that would be happening. So what the hell are you on about?
 
You didn't forget about anything. You chose not to answer.
And you were disputing FrgMstr's knowledge of the 600W figure. You said there was no way that would be happening. So what the hell are you on about?
You are 100% correct. I said that it was never designed to run at 600W at stock settings. That's simply not how design specs work. When you intend to load a bolt to 4,000lbs, you don't spec in a bolt with a strength of only 4,000lbs. You spec one that can hold 8,000lbs or more.

Power delivery on a board works exactly the same way. When you need 450W of continuous power delivery, you have to spec a maximum capacity that is well above that 450W. This is because you have to account for component tolerances, momentary spikes, running inside of a case with no air flow, the impact of 10 years of aging and abuse, component thermals, minimizing coil whine, cooler assembly tolerances, allowing enthusiasts some overclocking headroom, etc etc.

When your design spec is exactly your intended use case, your design will fail. Guaranteed.
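As a toy sketch of that sizing logic: the 4,000 lb bolt and the 450W / +33% numbers come from this thread, while the 1.4x electrical margin is just an illustrative stand-in, not a real spec:

```python
# Toy sketch of 'spec above the intended load'. The 4,000 lb bolt example and the
# 450 W / +33% numbers come from this thread; the 1.4x electrical margin is made up.
import math

def required_rating(intended_load, safety_factor):
    """Capacity you actually spec for a given continuous load."""
    return intended_load * safety_factor

bolt_lbs = required_rating(4_000, 2.0)          # bolt example: spec ~8,000 lbs
board_watts = required_rating(450 * 1.33, 1.4)  # ~600 W worst case, hypothetical 1.4x margin

print(f"bolt: spec for {bolt_lbs:.0f} lbs of strength")
print(f"power delivery: spec for ~{math.ceil(board_watts)} W of capacity")
```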
 
This is not true. And I speak on this as an expert in an industry that has a long list of design specifications, rules, laws, and engineering.
Unless a perfect 0% tolerance is achieved in every element, I feel that is almost trivially true, no? As the smallest error in the wrong direction will make you fail.

If you are saying the design takes those 5-10% tolerances into account, and that with a target percentage of products working correctly at the end of the chain it uses X amount of that tolerance in the worst-case scenario, then you are probably both saying the exact same thing.
 
Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol.

Looks like I was 100% correct regarding 600W as well.
What's your explanation for the increase in cooler size, and the resulting comically gigantic models, for a card that uses about the same or less power than a 3090 Ti?
 
You are 100% correct. I said that it was never designed to run at 600W at stock settings. That's simply not how design specs work. When you intend to load a bolt to 4,000lbs, you don't spec in a bolt with a strength of only 4,000lbs. You spec one that can hold 8,000lbs or more.

Power delivery on a board works exactly the same way. When you need 450W of continuous power delivery, you have to spec a maximum capacity that is well above that 450W. This is because you have to account for component tolerances, momentary spikes, running inside of a case with no air flow, the impact of 10 years of aging and abuse, component thermals, minimizing coil whine, cooler assembly tolerances, allowing enthusiasts some overclocking headroom, etc etc.

When your design spec is exactly your intended use case, your design will fail. Guaranteed.
LOL! Yeah. "600W is your design spec." "OK, it will break when it hits 601W." "We good."
 
I fail to see the concern with power on enthusiast hardware. 600W+ is nothing compared to systems of the past with SLI or CrossFire and 2, 3, or 4 cards. Pushing over 1000W with multiple GPUs was easily done; some systems used multiple power supplies.

If you need more A/C, get it, or buy the lower-power options; maybe a console is in order for you.

I pushed more watts in the past; today's hardware is much faster and lower power than what I had before.
 
I fail to see the concern with power on enthusiast hardware. 600W+ is nothing compared to systems of the past with SLI or CrossFire and 2, 3, or 4 cards. Pushing over 1000W with multiple GPUs was easily done; some systems used multiple power supplies.

If you need more A/C, get it, or buy the lower-power options; maybe a console is in order for you.

I pushed more watts in the past; today's hardware is much faster and lower power than what I had before.

Bitches gonna bitch. Hoes gonna hoe.
 
I fail to see the concern with power on enthusiast hardware. 600W+ is nothing compared to systems of the past with SLI or CrossFire and 2, 3, or 4 cards. Pushing over 1000W with multiple GPUs was easily done; some systems used multiple power supplies.

If you need more A/C, get it, or buy the lower-power options; maybe a console is in order for you.

I pushed more watts in the past; today's hardware is much faster and lower power than what I had before.

600 watts for one piece of hardware is a new issue. Also, very few people did SLI, and only the tiniest fraction ever had 3 cards or more. You also have to deal with all this heat and electrical demand, and I don't feel like redoing my house because I upgraded my computer. The only reason you used more power back in the day was that you were stuffing your PC with more hardware; now just one piece is sucking down the power. To me it just shows how much of a wall they are starting to hit in increasing performance now.
 
This isn't new. Not sure why all the whining. How soon we forget the 295X2. A 500W beast. Not to mention the $1,500 MSRP.

Halo gonna halo.
CrossFire/SLI solutions like the 295X2 had latency and consistency issues, and many of the halo cards (Titan, 3090) had a relatively insignificant performance gain over the "reasonable" ultra-high-priced regular gaming card, making them easy not to care about; it was a bit of a luxury tax on people who want to have the best, with very little gained per dollar.

The 4090, as of now, does feel different. It is not an almost-irrelevant-for-gaming product next to the announced 4080, the kind you lose almost nothing by not getting and can leave to people with money who value the PC-building hobby - nice for them.

It increases the number of people who want it, and increases the number of people who care about the price tag.

A bit like if, one day, some feature regular online people care about appeared only on the $600-$1,000 class of motherboard, instead of those boards being just tech porn whose price tags are easy to ignore outside of the very rich who like fancy computer items and/or people who need 10-gig networking or a Threadripper Pro platform, and for whom those prices make sense given the money the computer makes.

We'll see how it all shakes out, but would it be the first halo gaming product we have ever seen that offers an actual giant gain in gaming performance over everything else available? That would set it a bit apart from the usual extravagances that exist primarily for marketing and are not really sold in volume.
 
600 watts for one piece of hardware is a new issue. Also, very few people did SLI, and only the tiniest fraction ever had 3 cards or more. You also have to deal with all this heat and electrical demand, and I don't feel like redoing my house because I upgraded my computer. The only reason you used more power back in the day was that you were stuffing your PC with more hardware; now just one piece is sucking down the power. To me it just shows how much of a wall they are starting to hit in increasing performance now.
If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo.

Often people bitching about the power requirement are the ones who will never buy one anyway, so what?
 
I fail to see the concern with power on enthusiast hardware. 600W+ is nothing compared to systems of the past with SLI or CrossFire and 2, 3, or 4 cards. Pushing over 1000W with multiple GPUs was easily done; some systems used multiple power supplies.
What a great comparison.
 
If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo.

Often people bitching about the power requirement are the ones who will never buy one anyway, so what?
I take the stance that if their house is scared of a high-wattage gaming PC, they probably don't run a vacuum much either lol :p
 
What a great comparison.
Insert a sarcasm tag, I'm sure.

If the comparison is bad for you, move down one or two posts and you'll see other halo cards that pulled significant power at their release nearly a decade ago.
 
If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo.

Often people bitching about the power requirement are the ones who will never buy one anyway, so what?
All this talk about 600W when even AIB 4090 cards are using less power than the FE 3090 Ti.
 
All this talk about 600W when even AIB 4090 cards are using less power than the FE 3090 Ti.
No kidding. When the 4090 reviews leaked and the 4090 was walking all over the 3090ti AND using similar or less power, I was impressed.

And then out of the woodwork come the usual "OMMGGGG POWER USE QQ".

New gen, significantly faster, same or less power. I don't see the issue.
 
No kidding. When the 4090 reviews leaked and the 4090 was walking all over the 3090ti AND using similar or less power, I was impressed.

And then out of the woodwork come the usual "OMMGGGG POWER USE QQ".

New gen, significantly faster, same or less power. I don't see the issue.

This used to be [H]ardOCP; now it's [ i ]mma-turn-this-down OCP.

I do think it's time to start moving gaming PCs to 220/240V lines. I just won't feel like a badass until I have to unplug my heat pump to plug in my PC for some Crysis eye candy.
 
Insert a sarcasm tag, I'm sure.

If the comparison is bad for you, move down one or two posts and you'll see other halo cards that pulled significant power at their release nearly a decade ago.
I must have missed the single-GPU examples that pushed anywhere near 600W. Oh, but my dual-socket quad-SLI setup pulled some l337 watts, man, for real, yo!
 
If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo.

Often people bitching about the power requirement are the ones who will never buy one anyway, so what?

Last I checked, most people on here overclock, so it's going to be closer to 600 watts that a 4090 is using, plus an overclocked CPU, and yeah, people have reason to be concerned. Just because you're not doesn't mean everyone else should feel the same way as you. I'm just not sure why you and a few others feel the need to defend it as being OK, when previous examples were also called out for their excessive power draw. It's a continuing, concerning trend that can't be sustained forever.

No crime was committed by pointing out it's a power hog; my 6900 XT is not exactly thrifty either, but it doesn't need to be pushed to run my stuff at 1440p.
 
No crime was committed by pointing out it's a power hog,
Well, only if someone overclocks it for very little gain; whether it is a power hog or not is quite subjective here. A bit like the new Ryzen, it is a power hog in a way, but it also seems to be one of the most efficient-per-watt GPUs ever.

If the talk of an 850-watt PSU being what is needed for many models is true, I do wonder how large the market is for a $1,600+ USD video card among people who do not already have an 850-watt or higher PSU, or who wouldn't mind finally replacing their 600-750 watt model.

That said, actual heat would still be an issue even if taming the power peaks reduced the PSU side of the problem.

A 300-watt 4090 FE doubles the performance of a 6900 XT, a 320-330 watt card.

I fully get that paying that much for a card and then losing some performance (any amount) by limiting its power would not be realistic for almost everyone - you will want all of it - but going to 600 watts via an overclock seems like a self-imposed issue for people who actively want that; it is already borderline whether you need to push it to 450-460W instead of 300-350W or 400W max.


4K average FPS at different power targets:
https://allinfo.space/2022/10/11/nv...y-fast-even-with-less-tdp-and-without-dlss-3/
  • GeForce RTX 4090 FE @ 450 Watt: 99.9
  • GeForce RTX 4090 FE @ 350 Watt: 96.8
  • GeForce RTX 4090 FE @ 300 Watt: 90.6
  • GeForce RTX 3090 Ti @ 450 Watt: 58.9
  • GeForce RTX 3090 Ti @ 350 Watt: 56.1
  • Asus GeForce RTX 3090 Strix OC: 55.9
  • Nvidia GeForce RTX 3080 Ti FE: 52.6
  • GeForce RTX 3090 Ti @ 300 Watt: 49.8
  • Nvidia GeForce RTX 3080 FE: 47.0
  • XFX Radeon RX 6900 XT Black: 45.5
  • AMD Radeon RX 6900 XT: 43.1
  • AMD Radeon RX 6800 XT: 39.1
  • AMD Radeon RX 6800: 33.0
  • Nvidia GeForce RTX 3070 FE: 28.5
  • AMD Radeon RX 6700 XT: 25.7
  • Nvidia GeForce RTX 3060 Ti FE: 25.2
  • Asus GeForce RTX 3090 Ti TUF OC: 0.0
The jump from 50% more power, 300W to 450W, seems of limited use (we see a good jump for the first 50W and not much else for the next 100W), and I would like to see what the smaller 33% jump from 450W to 600W, in a far less efficient part of the power band, would do.
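To make the diminishing returns concrete, here is the same 4090 FE data reduced to extra power vs. extra fps (a quick sketch using the numbers above):

```python
# 4090 FE 4K average FPS at the power targets listed above, reduced to
# 'extra power vs. extra performance' relative to the 300 W result.

fps_at_watts = {300: 90.6, 350: 96.8, 450: 99.9}

base_watts = 300
base_fps = fps_at_watts[base_watts]
for watts, fps in sorted(fps_at_watts.items())[1:]:
    extra_power = watts / base_watts - 1
    extra_fps = fps / base_fps - 1
    print(f"{base_watts} W -> {watts} W: +{extra_power:.0%} power for +{extra_fps:.1%} fps")
```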

It is obviously worth worrying about the power envelope, but sometimes people overdo it - like with CPUs under the light workloads of playing a video game, where even the 12900K didn't use that much power, and it's not really an issue for the new Ryzen either (at least for a user of a message board with Hard in the name).
 
Well, only if someone overclocks it for very little gain; whether it is a power hog or not is quite subjective here. A bit like the new Ryzen, it is a power hog in a way, but it also seems to be one of the most efficient-per-watt GPUs ever.

If the talk of an 850-watt PSU being what is needed for many models is true, I do wonder how large the market is for a $1,600+ USD video card among people who do not already have an 850-watt or higher PSU, or who wouldn't mind finally replacing their 600-750 watt model.

That said, actual heat would still be an issue even if taming the power peaks reduced the PSU side of the problem.

A 300-watt 4090 FE doubles the performance of a 6900 XT, a 320-330 watt card.

I fully get that paying that much for a card and then losing some performance (any amount) by limiting its power would not be realistic for almost everyone - you will want all of it - but going to 600 watts via an overclock seems like a self-imposed issue for people who actively want that; it is already borderline whether you need to push it to 450-460W instead of 300-350W or 400W max.


4K average FPS at different power targets:
https://allinfo.space/2022/10/11/nv...y-fast-even-with-less-tdp-and-without-dlss-3/
  • GeForce RTX 4090 FE @ 450 Watt: 99.9
  • GeForce RTX 4090 FE @ 350 Watt: 96.8
  • GeForce RTX 4090 FE @ 300 Watt: 90.6
  • GeForce RTX 3090 Ti @ 450 Watt: 58.9
  • GeForce RTX 3090 Ti @ 350 Watt: 56.1
  • Asus GeForce RTX 3090 Strix OC: 55.9
  • Nvidia GeForce RTX 3080 Ti FE: 52.6
  • GeForce RTX 3090 Ti @ 300 Watt: 49.8
  • Nvidia GeForce RTX 3080 FE: 47.0
  • XFX Radeon RX 6900 XT Black: 45.5
  • AMD Radeon RX 6900 XT: 43.1
  • AMD Radeon RX 6800 XT: 39.1
  • AMD Radeon RX 6800: 33.0
  • Nvidia GeForce RTX 3070 FE: 28.5
  • AMD Radeon RX 6700 XT: 25.7
  • Nvidia GeForce RTX 3060 Ti FE: 25.2
  • Asus GeForce RTX 3090 Ti TUF OC: 0.0
The jump from 50% more power, 300W to 450W, seems of limited use (we see a good jump for the first 50W and not much else for the next 100W), and I would like to see what the smaller 33% jump from 450W to 600W, in a far less efficient part of the power band, would do.

It is obviously worth worrying about the power envelope, but sometimes people overdo it - like with CPUs under the light workloads of playing a video game, where even the 12900K didn't use that much power, and it's not really an issue for the new Ryzen either (at least for a user of a message board with Hard in the name).

I would also agree the new Ryzen is a bit of a power hog as well. Plus, we all know a Ti version is coming at some point, which will likely use even more power. I think more people are concerned by the amount of heat all that power generates; I know my office is the warmest room in the house due to my computer.
 