NVIDIA GeForce RTX 4070: up to 30% faster than RTX 3090 in gaming

From Pascal until today, AMD has never really been ahead in performance-per-watt in any meaningful way. AMD offers less total watts in exchange for less performance. Performance-per-watt is a loser's argument anyway. You know who brags about gas mileage during a race? Not the guy who's winning it.
Tell that to basically everybody from the 1985 Grand Prix ...
 
Well, the performance per watt thing becomes an issue if the 'new' product needs 500w at full load compared to 350w of the prior product, and is only 20-30% faster, and we've got skyrocketing energy costs that people need to think about.
 
Did Nvidia gain any performance per watt going from the 2000 series to Ampere? IIRC Ampere's improvements were basically in line with the power increase. This seems similar; just wondering if anything has been gained over the last two years.

AMD showed drastic improvements per watt, so I'm wondering if Nvidia is going off the rails here, so to speak, just to stay competitive.
The 3080 Ti uses 30% more power than the 2080 Ti for something like 50% more gaming performance, depending on what you're looking at. That is a 15% improvement in performance-per-watt, which lines up with Thunderbolt's post about OctaneBench.
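A quick way to sanity-check these perf-per-watt claims, using the 3080 Ti / 2080 Ti figures above plus the hypothetical 500W-at-20-30%-faster case from a few posts up (the 25% midpoint there is my assumption, not from the thread):

```python
# Perf-per-watt ratio between a new card and an old one, given relative
# performance and relative power. Figures come from this thread; the 25%
# midpoint for the "20-30% faster" hypothetical is my own assumption.

def perf_per_watt_ratio(perf_gain, power_gain):
    """>1.0 means the new card does more work per watt than the old one."""
    return perf_gain / power_gain

# 3080 Ti vs 2080 Ti: ~50% more performance for ~30% more power
print(f"{perf_per_watt_ratio(1.50, 1.30):.2f}x")       # ~1.15x, i.e. about +15%

# Hypothetical from a few posts up: ~25% faster, 500W vs 350W
print(f"{perf_per_watt_ratio(1.25, 500 / 350):.2f}x")  # below 1.0: perf/watt would regress
```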
 
I'll sell a kidney for the 4090.

 
Who wants a 550W GPU? Personally I don't, but I guess some people would. I don't even enjoy the temperatures in my office when my GPU is at 360W; it's like a sauna, so I run it at 290W in the summer. I may need to move to Alaska if I want the 4090. I think these rumors are false, but I wouldn't be surprised if it's true and marketed as a 4K/240Hz or 8K/60Hz card.
 
Who wants a 550W GPU? Personally I don't, but I guess some people would. I don't even enjoy the temperatures in my office when my GPU is at 360W; it's like a sauna, so I run it at 290W in the summer. I may need to move to Alaska if I want the 4090. I think these rumors are false, but I wouldn't be surprised if it's true and marketed as a 4K/240Hz or 8K/60Hz card.
Hopefully performance improves down the stack, so you won't need a 550W GPU to get an upgrade.
 
Who wants a 550W GPU? Personally I don't, but I guess some people would. I don't even enjoy the temperatures in my office when my GPU is at 360W; it's like a sauna, so I run it at 290W in the summer. I may need to move to Alaska if I want the 4090. I think these rumors are false, but I wouldn't be surprised if it's true and marketed as a 4K/240Hz or 8K/60Hz card.

https://www.notebookcheck.net/Micro...rs-at-a-US-development-facility.531599.0.html

And the people buying that:
https://www.tomshardware.com/news/nvidia-dgx-station-320g

Two 550-watt GPUs could well be better for them than four GPUs running at 350 watts.

More and more, I imagine exotic high-end GPUs will be whatever can be made from silicon designed with something else in mind. How relevant are the RTX 3090 sales that actually get used for gaming? Those cards might not exist in the first place if gaming were the only market, especially if the "reasonable" 4070 really beats the RTX 3090, which could otherwise have been the top-of-the-line product.
 
If power is a problem, just get the lesser RTX 4050 or 4060 model that probably has RTX 3090 performance with superior perf/watt. I'm all for GPU companies going all out and not holding back on their higher-end models.
 
If power is a problem, just get the lesser RTX 4050 or 4060 model that probably has RTX 3090 performance with superior perf/watt. I'm all for GPU companies going all out and not holding back on their higher-end models.
Yeah, I know what you mean. I've already been thinking about returning to water cooling and running the tubing through the floor so the radiator could go in the basement. Just not looking forward to it. Power isn't the real issue; laziness is.
 
The AMD leaks show them in the same boat. We’re just not at a stage where 4K is an easy thing to do.
Competition is great! One reason Pascal topped out at 250W (1080 Ti) is that Nvidia didn't have to push the envelope. AMD had nothing to challenge the high end.
 
Oh god, this spells doom and gloom for my laptop gaming dreams...

In other, better news, I'm pretty confident that this is far, far from the optimal voltage range of the chip; this is likely Nvidia squeezing out as much performance as possible in internal testing.

Will we actually see such a monster SKU appear? If AMD lays a stinker, probably not, and we'll at best have 350W 4080s for a good while.
 
Nvidia's really making me consider their 3080 tier GeForce Now. Seems like such a sweet deal, especially if they update the performance to *80 tier every generation.
 
Well, the performance per watt thing becomes an issue if the 'new' product needs 500w at full load compared to 350w of the prior product, and is only 20-30% faster, and we've got skyrocketing energy costs that people need to think about.
How much does 150W extra cost to run? Well, if we go with the most expensive electricity in the world ($0.99/kWh in the Solomon Islands), then we're looking at an additional $0.15/hr to run a 500W GPU compared to a 350W GPU. If you're gaming on it 2 hours per day every single day, then that's $8.91 per month.

If you have a $3000 gaming rig and can't afford an additional $8.91/mo to run it, the problem is not the power draw of your GPU.
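For anyone who wants to plug in their own numbers, here's a minimal sketch of that arithmetic; the rate, hours per day, and a flat 30-day month are the only inputs (the 2 hours/day figure is the assumption used above):

```python
# Back-of-the-envelope cost of an extra 150W of GPU draw, as in the post above.
# Assumes a flat 30-day month; swap in your local rate and gaming hours.

def extra_monthly_cost(extra_watts, rate_per_kwh, hours_per_day, days_per_month=30):
    """Additional electricity cost per month for the extra GPU draw."""
    extra_kwh = extra_watts / 1000 * hours_per_day * days_per_month
    return extra_kwh * rate_per_kwh

# Solomon Islands rate ($0.99/kWh), 2 hours of gaming per day:
print(f"${extra_monthly_cost(150, 0.99, 2):.2f}/month")  # $8.91
```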
 
How much does 150W extra cost to run? Well, if we go with the most expensive electricity in the world ($0.99/kWh in the Solomon Islands), then we're looking at an additional $0.15/hr to run a 500W GPU compared to a 350W GPU. If you're gaming on it 2 hours per day every single day, then that's $8.91 per month.

If you have a $3000 gaming rig and can't afford an additional $8.91/mo to run it, the problem is not the power draw of your GPU.
That's actually pretty fair. Most places are $0.25/kWh in the States. Even more reasonable. I spent more than that on stupid things yesterday.
 
That's actually pretty fair. Most places are $0.25/kWh in the States. Even more reasonable. I spent more than that on stupid things yesterday.
The national average in the US is $0.1375/kWh, so the 500w card would cost an additional $1.24/mo.

According to EIA, the most expensive continental state in December was Rhode Island (not what I was expecting) at $0.2511/kWh. A gamer in RI would expect to spend an additional $2.26/mo to run that 500W card.

This is a devastatingly huge financial burden and must be budgeted for before plunking down $3000 on a top end gaming rig. /s
 
In other news, I foresee NVIDIA splitting their market offerings between crypto and gamers. I see these high-power 600W cards being aimed at the crypto crowd; most gamers will go for the mid-range (<= 350W) and low-range options. It only makes (corporate profit) sense to reap profit from the crypto craze while still providing value to gamers.

Since the high inflation in North America has been caused by a multitude of things (e.g., crypto, supply chain disruptions, gov spending, cost of energy, etc.), I think NVIDIA has lost some gaming customers to AMD. I would love to see some sales (and re-sales) numbers to see if my intuition is right or wrong here...
 
In other news, I foresee NVIDIA splitting their market offerings between crypto and gamers. I see these high-power 600W cards being aimed at the crypto crowd; most gamers will go for the mid-range (<= 350W) and low-range options. It only makes (corporate profit) sense to reap profit from the crypto craze while still providing value to gamers.

Since the high inflation in North America has been caused by a multitude of things (e.g., crypto, supply chain disruptions, gov spending, cost of energy, etc.), I think NVIDIA has lost some gaming customers to AMD. I would love to see some sales (and re-sales) numbers to see if my intuition is right or wrong here...
I'm not sure NVidia lost any sales to AMD; if anything, it's the opposite. Lots of sources are claiming that AMD sold fewer cards to consumers in 2020 and 2021 than in previous years. They sold more APUs, CPUs, and consoles than ever before. But fewer discrete cards; they just made fewer of them to make room for their higher-profit items. Their margins right now are insane.
 
How much does 150W extra cost to run? Well, if we go with the most expensive electricity in the world ($0.99/kWh in the Solomon Islands), then we're looking at an additional $0.15/hr to run a 500W GPU compared to a 350W GPU. If you're gaming on it 2 hours per day every single day, then that's $8.91 per month.

If you have a $3000 gaming rig and can't afford an additional $8.91/mo to run it, the problem is not the power draw of your GPU.

Perf per watt only really matters to people when they need a metric to show their product is that much better than the competition, or when they're actually doing a serious SFF build. The whole "space heater" argument about 125W AMD FX CPUs is the perfect example. The same people complaining about that turned around and overclocked their CPUs and GPUs, wrecking their energy-usage curves. They only cared about raw performance, not power usage. Somehow my 125W FX-8350 being $50 cheaper than the equivalent i5 didn't matter. When Intel makes some hot chips, it suddenly becomes all about raw GHz and single/dual-core performance.
 
Oh boy! May have to up the small bedroom (computer room) A/C unit from the ton and a half to something more...

Some forget it is not just the increase in wattage to the GPU; the cooling cost to keep your body from melting also comes into play. The power supply will also consume more than just the increase in GPU power: 500W / 0.9 = 556W from the wall. Anyway, this sounds utterly nuts as a thought, if Nvidia actually goes this route.
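Rough wall-draw numbers behind that 556W figure, using the 90% PSU efficiency assumed in the post above (real efficiency varies with load and the 80 Plus rating):

```python
# Wall draw for a given GPU power budget, using the ~90% PSU efficiency
# assumed above. All of it ultimately ends up as heat in the room.

def wall_draw(component_watts, psu_efficiency=0.90):
    """Watts pulled from the outlet to deliver component_watts inside the case."""
    return component_watts / psu_efficiency

for gpu_watts in (350, 500, 550):
    print(f"{gpu_watts}W GPU -> ~{wall_draw(gpu_watts):.0f}W at the wall")
# 350W -> ~389W, 500W -> ~556W, 550W -> ~611W
```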
 
I'm kind of doubting such performance jumps. I can see them wanting to make ray tracing standard (while it still is a marketing gimmick), but they don't really have to. There is such huge demand as is. I would expect a good jump, to re-generate interest in GPUs after poor availability for the past few years. But I am not seeing this big of a jump.

Generally I don't mind power or heat, but if these things do offer this performance I think many of us will need new PSUs and cases designed for something this hot.
 
I'm kind of doubting such performance jumps. I can see them wanting to make ray tracing standard (while it still is a marketing gimmick), but they don't really have to. There is such huge demand as is. I would expect a good jump, to re-generate interest in GPUs after poor availability for the past few years. But I am not seeing this big of a jump.

Generally I don't mind power or heat, but if these things do offer this performance I think many of us will need new PSUs and cases designed for something this hot.
It also defeats the purpose of DLSS.
The original rumor states the entire line-up is faster than the 3090, even the 4060.

The 3090's average fps at 1440p is 167.
https://www.techpowerup.com/review/msi-geforce-rtx-3080-suprim-x-12-gb/30.html

Even in Cyberpunk, it's 79.
https://www.techpowerup.com/review/msi-geforce-rtx-3080-suprim-x-12-gb/11.html

The only way I can think for Nvidia to market these GPUs is "Making 4K accessible for everyone". I guess we're at that point now.
 
I see there being a performance level that most people would accept; added performance with huge increases in cost will become invisible or even laughable to many. For example, a 3090 at 1080p or even 1440p is ridiculous for the cost, for most people. People may have excess money now, but that may not be true in the future, and Nvidia may have a rather hard time selling anything at very high prices.
 
The only way I can think for Nvidia to market these GPUs is "Making 4K accessible for everyone". I guess we're at that point now.

Could be that as well. But I don't think they'd go straight to 4K. I can see them wanting to make 1920x1080 obsolete and making 2560x1440 the new 1080p (low-end cards, or extremely high frame rates for competitive gamers), which would be a good thing.

But if the price goes up 30-80%, kind of a moot point.
 
Seeing this just makes me realize how "lucky" we were when we hosted LAN parties and only had to worry about people still carrying CRT monitors. I couldn't imagine hosting a LAN today with folks showing up with 1.5-kilowatt+ systems.

I had that same conversation with a friend recently about the LAN parties that we would go to in the early 2000s. We had 13 guys in my buddy's basement one year. Man, do I miss those days!

As you said though, at that time, a 300W power supply was a decent size. Completely different power requirements now. I still remember the first time I had to hook in power from the power supply directly to the GPU instead of it just using the power it was pulling from the AGP slot.
 
If they sell as fast as they get made, and they stop making them in May, then that means they get more expensive until the 4000 series launches and the miners sell off their 3000s to replace them with the 4000s.

3000s are made by Samsung so no reason to stop production.
 
3000s are made by Samsung so no reason to stop production.
Never mind, NVidia clarified that the 3000 and 4000 series will coexist. Before that, they had insinuated that they would be stopping production of the 3000 series a few months before the launch of the 4000 series.
The new complaint is that the 4000 series is so expensive that they need the 3000 series around so people have something reasonable to buy.
 
They continued making Pascal after the Turing release and continued to make 16xx and 20xx cards after Ampere (it's been a while since a launch with zero supply issues across all price points and all new SKUs, I guess), so I imagine we could expect that.

Insinuated that they would be stopping production of the 3000 series a few months before the launch of the 4000 series.

Who at Nvidia insinuated that?
 
Last edited:
They continued making Pascal after the Turing release and continued to make 16xx and 20xx cards after Ampere (it's been a while since a launch with zero supply issues across all price points and all new SKUs, I guess), so I imagine we could expect that.

Insinuated that they would be stopping production of the 3000 series a few months before the launch of the 4000 series.

Who at Nvidia insinuated that?
Well, supposedly the NVidia leaks say that the Ampere cards will stop production in May, which is mentioned in the slides shown in the article linked in the post that started this thread.

But when asked about it, NVidia's CFO said this:

“Even during this period of COVID and supply constraints, it’s been interesting because it’s given us the opportunity for gaming to continue to sell both the current generation as well as the Turing generation,” Kress said. “So we’ve been doing that to provide more and more supply to our gamers in that. And we may see something like that continue in the future.”
 
From Pascal until today, AMD has never really been ahead in performance-per-watt in any meaningful way. AMD offers less total watts in exchange for less performance. Performance-per-watt is a loser's argument anyway. You know who brags about gas mileage during a race? Not the guy who's winning it.

I will just leave this here.
Shadow of the Tomb Raider, 2560x1440, Highest settings.
With the power limit turned down a little, I can go to ~260W with a few fps less.

[Attached image: 6800 XT vs 3080 benchmark comparison]
 
I will just leave this here.
Shadow of the Tomb Raider, 2560x1440, Highest settings.
With the power limit turned down a little, I can go to ~260W with a few fps less.

[Attached image: 6800 XT vs 3080 benchmark comparison]
Tweaks like this are how they make these things viable in laptops. The 3080 laptop chips are very well power tuned: they lose 18% of their max performance when matched against the desktop counterpart, but do so while sipping away at 135-155W.

The way I see it, 4K is just too brutal; it's glorious and beautiful, but the hardware just isn't there in a way that allows for good power consumption. The 4000 series may change this, but I suspect we're still another two generations out before the hardware is in a place where 4K on a decent power curve is going to be possible.
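To put a rough number on how well tuned that laptop chip is: assuming the desktop 3080 sits around 320W (my assumption, not from this thread) and the laptop part keeps ~82% of the performance at roughly the 145W midpoint of that range, it works out to nearly double the work per watt:

```python
# Rough perf-per-watt comparison for the laptop vs desktop 3080 numbers above.
# The 320W desktop figure and the 145W midpoint are assumptions for illustration.

desktop_perf, desktop_watts = 1.00, 320  # desktop 3080 as the baseline
laptop_perf, laptop_watts = 0.82, 145    # ~18% slower at ~135-155W (midpoint)

ratio = (laptop_perf / laptop_watts) / (desktop_perf / desktop_watts)
print(f"Laptop perf/watt vs desktop: ~{ratio:.1f}x")  # ~1.8x
```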
 