NVIDIA GeForce RTX 4070: up to 30% faster than RTX 3090 in gaming

Actually it is Small Form Factor... can't tell if you were being serious or not.
Half serious? When Shuttle first started releasing systems in that size I was working with AOpen on a pretty big deployment of them, and the name just stuck. Yeah, now it's small form factor, but back in the late 90s ITX wasn't a format yet, so the Shuttle systems were the only way to get things that small, and the name has stuck with me ever since.
 
This seems viable. They are paying TSMC a fortune to get in on the 5nm production lines for these, and they are going to be monolithic, so $$$, but I expect good things from the 4060 series. I am hoping I can get a reasonable upgrade from that over my 2080 Ti. I am not unhappy with my 2080 Ti at 1440p, and since I am not getting a new monitor any time soon, that is still my target performance metric. I just want something that generates less heat.
Maybe cool it better? I have a 2080 Ti also paired with a 1440p 144Hz monitor and I don't see myself needing an upgrade for a long time yet.
 
Maybe cool it better? I have a 2080 Ti also paired with a 1440p 144Hz monitor and I don't see myself needing an upgrade for a long time yet.
It's cooled fine, it just heats my office too much. In the winter that's a bonus; in the summer it most certainly isn't. I could spring for AC, but a cooler-running PC would be the more environmental approach.
 
On the positive side, 3xxx cards may finally be affordable once the new cards drop. But really, those are some massive projected jumps in performance and TDP.
Not likely... they will end production. If anything, the price will go up. lol
 
So now it's not enough to spend $1400 on a GPU just to hit 60fps at 4K; now you need to spend $250 on a kilowatt PSU that will double as an induction cooktop...
 
So now it's not enough to spend $1400 on a GPU just to hit 60fps at 4K; now you need to spend $250 on a kilowatt PSU that will double as an induction cooktop...
I mean, there's a reason why supercomputers don't run on a 100W power supply. You don't get something for nothing.

Interesting thought you bring up. I wonder how long until we run liquid cooling loops through flooring or water heaters to try to redirect some of that heat.
 
At the rate things are going... high-end home PCs will be a thing of the past. I mean, 1000 watt power supplies are probably the end of the line for most... in some places it won't even be legal to run a 1500 watt gaming box, never mind trying to pay the power bill. lol
When your gaming box starts drawing more power than the laundromat down the street, it might be time to embrace cloud gaming. haha
 
At the rate things are going... high-end home PCs will be a thing of the past. I mean, 1000 watt power supplies are probably the end of the line for most... in some places it won't even be legal to run a 1500 watt gaming box, never mind trying to pay the power bill. lol
When your gaming box starts drawing more power than the laundromat down the street, it might be time to embrace cloud gaming. haha
I think cooling and PSU limitations will keep Nvidia inside the 750-850W PSU envelope for most GPUs.
There aren't many people who already own 1000W units, and the price jump for those PSUs is a lot more than for the 750/850W units. I could see a flagship 4090 with a stock AIO requiring a 1000W PSU, though.
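For what it's worth, some napkin math on PSU sizing (a sketch with assumed component draws, not spec-sheet figures) shows why a roughly 1000W unit starts to make sense for a hypothetical 450W flagship:

```python
# Napkin-math PSU sizing. All component draws below are assumptions for illustration,
# not spec-sheet numbers.
gpu_w = 450              # hypothetical 450W flagship board power
cpu_w = 250              # high-end desktop CPU under load
rest_w = 100             # board, RAM, drives, fans, pump
transient_factor = 1.3   # cushion for short GPU power spikes
headroom = 1.2           # keep sustained load comfortably under the PSU rating

peak_w = gpu_w * transient_factor + cpu_w + rest_w
suggested_psu_w = peak_w * headroom
print(f"Estimated peak draw: {peak_w:.0f} W -> suggested PSU: {suggested_psu_w:.0f} W")
```

That lands around 1100W on these assumptions, so a 1000W unit is not a crazy ask for a top-end card even if most of the stack stays on 750/850W supplies.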
 
I think cooling and PSU limitations will keep Nvidia inside the 750-850W envelope for most GPUs.
There aren't many people who already own 1000W units, and the price jump for those PSU's is a lot more than the 750/850 units. I could see a flagship 4090 with a stock AIO requiring a 1000W PSU, though.
Or it will force them to externally power the card with a brick or maybe just supplement the power with it.
 
Or it will force them to externally power the card with a brick or maybe just supplement the power with it.
Exclusive leaked photo of the 4090

[Image: 3090plug.png]
 
Well... a lot of people are putting in plugs for their electric cars.
Noted... if I pull the trigger, I'll get them to install two plugs and over-spec the breaker.
 
I saw some guy on the Redshift 3D forums saying he cut power to 70% and it only added 1 second to his render time. I am hoping I can do that when I get 2x 4080 Ti.

"Yes MSI Afterburner, 70% gives 1m:19s render time in Windows when 100% is 1m:18s"

Hell, I would want to see how much power you can cut to add 5 seconds to each frame; I am just a home user. But still, cut power down to 70% and only add 1 second to the render time... shit.
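For anyone who wants to try the same thing outside MSI Afterburner, here is a minimal sketch using the NVML Python bindings (pynvml); the 70% figure simply mirrors the quote above, and the set call needs admin rights:

```python
# Sketch: cap the GPU board power limit to ~70% of default via NVML.
# Needs the nvidia-ml-py package (import pynvml) and admin/root rights for the set call;
# the 70% figure just mirrors the quoted Redshift example.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
target_mw = int(default_mw * 0.70)

# Note: the driver enforces a minimum limit, so very low targets may be rejected.
print(f"Default limit: {default_mw / 1000:.0f} W, requesting {target_mw / 1000:.0f} W")
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

pynvml.nvmlShutdown()
```

The same cap can also be set from the command line with nvidia-smi -pl.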
 
He said it generates less heat. It doesn't matter what you cool it with; the energy is still dissipated into the surrounding room.
Yeah, the 2080 Ti and the 3900X, both OC'd, are on a custom loop running through a top-mounted 360mm radiator that blows outward. The loop never gets above 45 degrees and neither the CPU nor the GPU thermal throttles, so it's clearing the heat out. But you can see the heat waves coming off it like it was a cartoon.
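To put a rough number on it: essentially every watt the system draws from the wall ends up as heat in the room, no matter how good the loop is. A quick conversion, assuming a ~500W sustained load for a box like that (illustrative, not measured):

```python
# Every watt the box pulls from the wall eventually ends up as heat in the room.
# 500 W sustained draw is an assumed figure for an OC'd 2080 Ti + 3900X under load.
system_draw_w = 500
btu_per_hr = system_draw_w * 3.412   # 1 W = 3.412 BTU/hr
print(f"{system_draw_w} W ≈ {btu_per_hr:.0f} BTU/hr "
      f"({system_draw_w / 1500:.0%} of a typical 1500 W space heater)")
```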
 
My first SLI rig was dual GTX 480s - both using the massive Arctic air coolers.

They tripped the protection in my Silverstone 1000W PSU when I ran Furmark...

Those were the days lol...
Hey, I bet you didn't need a heater for the winter :D
 
Yeah, I'm gonna keep my 6900 XT so I can sell it for a can of baked beans in a year.

Can? A CAN? It'll get you 1 baked bean without sauce and you'll like it. If you want a can, you'll need one of these. A gallon of gas is 20 more.

[Image: Econpic7.jpg]


The 4070 will be an affordable $499,99999,999,9999.95
 
I saw some guy on the Redshift 3D forums saying he cut power to 70% and it only added 1 second to his render time. I am hoping I can do that when I get 2x 4080 Ti.

"Yes MSI Afterburner, 70% gives 1m:19s render time in Windows when 100% is 1m:18s"

Hell, I would want to see how much power you can cut to add 5 seconds to each frame; I am just a home user. But still, cut power down to 70% and only add 1 second to the render time... shit.
My rigs mostly run Octane render jobs, but it's the same thing. I run the 3090s at 77% power (300W) and the difference in my OB scores is basically just lost in the noise. That roughly aligns with my A6000s, which are 300W stock.

Even when gaming, I stopped setting the TDP back to 100% and now just leave it at 77%. There is probably a reduction in frame rate but I can't tell. I'm sure I could adjust the clocks and get most of that difference back, too.

Most of the rumors around 4090 power consumption are just stupid linear projections based on rumored core count. What that fails to account for is that there are more sources of power draw on the board than just the processor. The VRAM alone on a 3090 can pull over 100W (I've read as high as 150W). If 150 is correct, and the 4090 uses the same VRAM, then we'd be looking at more like 450W on a hypothetical 4090 FE if it had 50% more cores (of all types) than a 3090. Of course, there's also a node shrink happening, so the GPU will be more power efficient. I'd wager that the 4090 FE will be at most 400W.
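Spelling that estimate out as a quick calculation, using the same assumed numbers:

```python
# The 450W figure from the paragraph above, spelled out. These are the post's
# assumptions (rumored core counts, estimated VRAM draw), not measurements.
total_3090_w = 350                       # 3090 FE board power
vram_and_misc_w = 150                    # assumed upper bound for VRAM + board overhead
core_w_3090 = total_3090_w - vram_and_misc_w   # ~200 W left for the GPU itself
core_scale = 1.5                         # rumored ~50% more cores of all types

naive_4090_w = core_w_3090 * core_scale + vram_and_misc_w   # ignores the node shrink
print(f"Naive projection: {naive_4090_w:.0f} W before any 5nm efficiency gains")
```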

And you know what? I'm ok with that. It's a $2k flagship card. It's currently $20 extra to get a 1200W EVGA PSU instead of an 850W one (which would be plenty for a 400W GPU anyway). If $20 is a dealbreaker, then you were never going to buy the card in the first place and are just whining because you like the sound.


Don't worry everyone, while the card is expected to be 30% faster, it's also expected to be 90% less available... so really, why worry...
I wouldn't be so sure of that. No doubt that launch day will be a mess, but after that? ETH may finally be switched to PoS at that point, plus these cards will have the next generation of LHR. On top of that, I suspect we will see further neutering of the GeForce lineup to make those cards less appealing for workstation use. The 30-series cards are computational monsters, which has made them extremely appealing for workstation and "off-label" enterprise use. There are full farms of them being used for rendering because they outperform their enterprise Ax000 counterparts at a fraction of the price and with zero drawbacks for most users. I would be surprised if Nvidia leaves so much money on the table this time around.
 
My rigs mostly run Octane render jobs, but it's the same thing. I run the 3090s at 77% power (300W) and the difference in my OB scores is basically just lost in the noise. That roughly aligns with my A6000s, which are 300W stock.

Even when gaming, I stopped setting the TDP back to 100% and now just leave it at 77%. There is probably a reduction in frame rate but I can't tell. I'm sure I could adjust the clocks and get most of that difference back, too.

Most of the rumors around 4090 power consumption are just stupid linear projections based on rumored core count. What that fails to account for is that there are more sources of power draw on the board than just the processor. The VRAM alone on a 3090 can pull over 100W (I've read as high as 150W). If 150 is correct, and the 4090 uses the same VRAM, then we'd be looking at more like 450W on a hypothetical 4090 FE if it had 50% more cores (of all types) than a 3090. Of course, there's also a node shrink happening, so the GPU will be more power efficient. I'd wager that the 4090 FE will be at most 400W.

And you know what? I'm ok with that. It's a $2k flagship card. It's currently $20 extra to get a 1200W EVGA PSU instead of an 850W one (which would be plenty for a 400W GPU anyway). If $20 is a dealbreaker, then you were never going to buy the card in the first place and are just whining because you like the sound.


I wouldn't be so sure of that. No doubt that launch day will be a mess, but after that? ETH may finally be switched to PoS at that point, plus these cards will have the next generation of LHR. On top of that, I suspect we will see further neutering of the GeForce lineup to make those cards less appealing for workstation use. The 30-series cards are computational monsters, which has made them extremely appealing for workstation and "off-label" enterprise use. There are full farms of them being used for rendering because they outperform their enterprise Ax000 counterparts at a fraction of the price and with zero drawbacks for most users. I would be surprised if Nvidia leaves so much money on the table this time around.
GPU prices aren't coming down any time soon with the current world situation and the world basically splitting into two economic blocs. Again, people keep blaming this problem solely on crypto and aren't seeing the forest for the trees.
 
Did Nvidia gain any performance per watt with Ampere over the 2000 series? IIRC Ampere's improvements were basically in line with the power increase. This seems similar; I'm just wondering if it has gained anything over the last 2 years.

AMD showed drastic improvements per watt, so I'm wondering if Nvidia is going off the rails here, so to speak, just to stay competitive.
 
Did Nvidia gain any performance per watt with Ampere over the 2000 series? IIRC Ampere's improvements were basically in line with the power increase. This seems similar; I'm just wondering if it has gained anything over the last 2 years.

AMD showed drastic improvements per watt, so I'm wondering if Nvidia is going off the rails here, so to speak, just to stay competitive.
Yes, the 3080 Ti/3090 use 100W more but are substantially faster. Even more so when you look at ray tracing, where the extra RT capability of those cards is a huge upgrade.
 
My rigs mostly run Octane render jobs, but it's the same thing. I run the 3090s at 77% power (300W) and the difference in my OB scores is basically just lost in the noise. That roughly aligns with my A6000s, which are 300W stock.

Even when gaming, I stopped setting the TDP back to 100% and now just leave it at 77%. There is probably a reduction in frame rate but I can't tell. I'm sure I could adjust the clocks and get most of that difference back, too.

Most of the rumors around 4090 power consumption are just stupid linear projections based on rumored core count. What that fails to account for is that there are more sources of power draw on the board than just the processor. The VRAM alone on a 3090 can pull over 100W (I've read as high as 150W). If 150 is correct, and the 4090 uses the same VRAM, then we'd be looking at more like 450W on a hypothetical 4090 FE if it had 50% more cores (of all types) than a 3090. Of course, there's also a node shrink happening, so the GPU will be more power efficient. I'd wager that the 4090 FE will be at most 400W.

And you know what? I'm ok with that. It's a $2k flagship card. It's currently $20 extra to get a 1200W EVGA PSU instead of an 850W one (which would be plenty for a 400W GPU anyway). If $20 is a dealbreaker, then you were never going to buy the card in the first place and are just whining because you like the sound.


I wouldn't be so sure of that. No doubt that launch day will be a mess, but after that? ETH may finally be switched to PoS at that point, plus these cards will have the next generation of LHR. On top of that, I suspect we will see further neutering of the GeForce lineup to make those cards less appealing for workstation use. The 30-series cards are computational monsters, which has made them extremely appealing for workstation and "off-label" enterprise use. There are full farms of them being used for rendering because they outperform their enterprise Ax000 counterparts at a fraction of the price and with zero drawbacks for most users. I would be surprised if Nvidia leaves so much money on the table this time around.

And here is the 1300W for $204!

https://smile.amazon.com/EVGA-Super...47013147&sprefix=evga+1200,aps,96&sr=8-2&th=1
 
Did Nvidia gain any performance per watt with Ampere over the 2000 series? IIRC Ampere's improvements were basically in line with the power increase. This seems similar; I'm just wondering if it has gained anything over the last 2 years.

AMD showed drastic improvements per watt, so I'm wondering if Nvidia is going off the rails here, so to speak, just to stay competitive.
I care much more about performance per dollar. Turing was shit - the only upgrade from a 1080 Ti was roughly twice as expensive. Nvidia even acknowledged it at the Ampere rollout - "Those of you on Pascal, it's time to upgrade."

Ampere would have been great at MSRP, but we all know how that worked out.

I haven't seen anybody comment on the line that states Nvidia is trying to get "close" to RDNA3 on raster performance. That's a good sign.

Edit to add: Nvidia sent TSMC a big chunk of $$$ for 5nm wafers, and 5nm has been in full production for a good while. Compare to Samsung's 8nm where Nvidia was the first customer and yields were allegedly a problem. If Lovelace runs out of cards day 1, or is scalpable, that's squarely on Nvidia.
 
Did Nvidia gain any performance per watt with Ampere over the 2000 series? IIRC Ampere's improvements were basically in line with the power increase. This seems similar; I'm just wondering if it has gained anything over the last 2 years.

AMD showed drastic improvements per watt, so I'm wondering if Nvidia is going off the rails here, so to speak, just to stay competitive.
Ampere made huge gains in performance per watt compared to Turing. Looking at OctaneBench performance with RTX on, the 3090 comes in at 652 points from 350W, or 1.86pts per watt. The Titan RTX comes in at 361 points from 280W, or 1.29pts per watt. This has the 3090 scoring 44% higher per watt. If we look at the results with RTX off, then we get 1.5pts per watt for the 3090 vs 1.23pts per watt for the Titan RTX - a boost of 22%. The 3090 is so much more efficient that it even gets better performance per watt with RTX off than the Titan RTX does with RTX on.

Note also that these are peak TDP ratings rather than actual draw during a render. From my own measurements, the gap in points per actual watt drawn is even wider.
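For reference, the points-per-watt arithmetic behind those figures:

```python
# Points-per-watt from the OctaneBench numbers above (rated TDP, not measured draw).
cards = {
    "RTX 3090 (RTX on)":  (652, 350),
    "Titan RTX (RTX on)": (361, 280),
}
ppw = {name: score / watts for name, (score, watts) in cards.items()}
for name, value in ppw.items():
    print(f"{name}: {value:.2f} pts/W")

gain = ppw["RTX 3090 (RTX on)"] / ppw["Titan RTX (RTX on)"] - 1
print(f"Ampere advantage: {gain:.0%}")   # ~44%
```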

From Pascal until today, AMD has never really been ahead in performance-per-watt in any meaningful way. AMD offers fewer total watts in exchange for less performance. Performance-per-watt is a loser's argument anyway. You know who brags about gas mileage during a race? Not the guy who's winning it.

I haven't seen anybody comment on the line that states Nvidia is trying to get "close" to RDNA3 on raster performance. That's a good sign.
That's because the line is 100% MLID dicksuck and a perfect example of why he's not a respectable news or rumor source. AMD is a generation behind Nvidia. Nvidia would never hope to get "close" to their previous generation's performance.
 
I would be impressed if the 4070 was 30% faster than a 3090. But since this is Nvidia, I expect the 4070 will cost the same as the 3090 Ti once it is released!
 