NVIDIA GeForce RTX 4070: up to 30% faster than RTX 3090 in gaming

Tweaks like this are how they make these things viable in laptops. The 3080 laptop chips are very well power tuned: they lose about 18% of their max performance when matched against the desktop counterpart, but do so while sipping away at 135-155W.
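For a rough sense of what that tuning buys, here's a back-of-the-envelope perf-per-watt sketch; the ~320W desktop board power is my assumption, while the 135-155W and 18% figures are from the post above:

```python
# Rough perf-per-watt comparison: RTX 3080 laptop vs. desktop.
# Assumptions: desktop board power ~320 W; the laptop runs at 135-155 W
# and delivers ~82% of desktop performance (an 18% loss), per the post above.

desktop_power_w = 320.0           # assumed desktop 3080 board power
laptop_power_w = (135 + 155) / 2  # midpoint of the quoted 135-155 W range
laptop_perf = 0.82                # relative performance vs. desktop (= 1.0)

desktop_perf_per_watt = 1.0 / desktop_power_w
laptop_perf_per_watt = laptop_perf / laptop_power_w

print(f"Laptop perf/W advantage: {laptop_perf_per_watt / desktop_perf_per_watt:.2f}x")
# -> roughly 1.8x the performance per watt at ~145 W
```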

The way I see it, 4K is just too brutal: it's glorious and beautiful, but the hardware just isn't there in a way that allows for good power consumption. The 4000 series may change this, but I suspect we're still another 2 generations out before the hardware is in a place where 4K on a decent power curve is going to be possible.

Exactly. But I do think Nvidia is finally trying to put 1920x1080 in the grave, which would be a good thing assuming these rumors hold true. That has been a "standard" resolution for so long now, and while most mid range gamers have moved on, plenty still play at 1920x1080.
 
Exactly. But I do think Nvidia is finally trying to put 1920x1080 in the grave, which would be a good thing assuming these rumors hold true. That has been a "standard" resolution for so long now, and while most mid range gamers have moved on, plenty still play at 1920x1080.
I don't know about "in the grave" per se, as they recently worked really hard to get a 320Hz refresh rate for 1080p into their G-Sync specs.
 
And ray tracing is arguably a bit of a step back versus climbing resolution (though I think that stops being true if you go very high, because demand increases logarithmically rather than linearly as resolution goes up). What a card maker wants is for old cards not to be good enough for new games or for the common way people play them; I imagine they are quite agnostic about whether that comes from desired refresh rates, shader effects, ray tracing, resolution, more bits per pixel with HDR, or bigger games.

I am not sure what killing 1080p would actually mean, but it would happen over a long window; it would need 3070 Ti-level cards to become the cheap ones, and games would have to not become harder to run above 1080p during that time (or DLSS would have to become so good that it is a no-brainer to use in hard-to-run games).

On the latest Steam hardware survey, resolutions above 1080p or 1920x1200 were around 15%, which is close to the percentage of video cards at a 2080/3060 Ti level or higher on the survey (14.5%).
 
The national average in the US is $0.1375/kWh, so the 500W card would cost an additional $1.24/mo.

According to EIA, the most expensive continental state in December was Rhode Island (not what I was expecting) at $0.2511/kWh. A gamer in RI would expect to spend an additional $2.26/mo to run that 500W card.

This is a devastatingly huge financial burden and must be budgeted for before plunking down $3000 on a top end gaming rig. /s
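Sarcasm aside, here's a minimal sketch of the math for anyone who wants to plug in their own numbers; the hours-per-day figure is my assumption (the quoted $1.24 and $2.26 both work out to roughly 9 kWh of extra draw per month), while the rates are the ones above:

```python
# Monthly electricity cost of extra GPU draw.
# extra_watts and hours_per_day are assumptions to tune to your own usage;
# the rates are the US average and Rhode Island figures quoted above.

def monthly_cost(extra_watts: float, hours_per_day: float, rate_per_kwh: float,
                 days: float = 30.0) -> float:
    kwh = extra_watts / 1000.0 * hours_per_day * days
    return kwh * rate_per_kwh

for label, rate in [("US average", 0.1375), ("Rhode Island", 0.2511)]:
    cost = monthly_cost(extra_watts=500, hours_per_day=0.6, rate_per_kwh=rate)
    print(f"{label}: ${cost:.2f}/mo")
# At ~0.6 h/day of full 500 W draw (about 9 kWh/mo) this reproduces the $1.24
# and $2.26 figures above; more gaming hours scale the cost linearly.
```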
You forgot to add AC cooling to compensate for that 500W of heat.
Having the PC in another room would help with the "feel" of the heat in the room.
My setup easily raises our den area 2-3 degrees, and that's with only an RX 5700 non-XT.
Running our central HVAC to compensate for that one room (it would be more ideal to have a small dedicated unit in the same room) will easily add $1 to our daily usage (I get daily alerts of my power usage) if I game for more than 3 hours, while making the rest of the house an icebox lol.
Not really a problem for me, I can handle an extra $31 a month worst case, but I would say that the increase in wattage from the GPU is negligible compared to the increase in cooling you will need/want to stay comfortable, especially if you only have central AC like most in the US.
Our unit just got replaced with a modern one, so maybe it will be a lot better now lol.
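As a rough illustration of the cooling side, here's a minimal sketch assuming the AC removes heat with a coefficient of performance of about 3 (a typical value for central air); none of these numbers are from the thread, it's just to show the scale:

```python
# Rough estimate of the extra AC electricity needed to pump out GPU heat.
# Assumptions: every watt the PC draws ends up as heat in the room, and the
# AC removes ~3 W of heat per 1 W of electricity it uses (COP ~= 3).

gpu_heat_w = 500.0     # heat dumped into the room by the GPU
cop = 3.0              # assumed coefficient of performance of the AC
hours = 3.0            # assumed length of a gaming session
rate_per_kwh = 0.1375  # US average rate quoted earlier in the thread

ac_power_w = gpu_heat_w / cop  # extra electrical draw of the AC itself
session_kwh = (gpu_heat_w + ac_power_w) / 1000.0 * hours
print(f"AC adds ~{ac_power_w:.0f} W; session total ~{session_kwh:.2f} kWh "
      f"(~${session_kwh * rate_per_kwh:.2f})")
# -> dedicated cooling adds ~167 W (about a third on top of the GPU itself),
#    which is why cooling one room directly beats cranking the whole-house HVAC.
```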
 
You forgot to add AC cooling to compensate for that 500W of heat.
Having the PC in another room would help with the "feel" of the heat in the room.
My setup easily raises our den area 2-3 degrees, and that's with only an RX 5700 non-XT.
Running our central HVAC to compensate for that one room (it would be more ideal to have a small dedicated unit in the same room) will easily add $1 to our daily usage (I get daily alerts of my power usage) if I game for more than 3 hours, while making the rest of the house an icebox lol.
Not really a problem for me, I can handle an extra $31 a month worst case, but I would say that the increase in wattage from the GPU is negligible compared to the increase in cooling you will need/want to stay comfortable, especially if you only have central AC like most in the US.
Our unit just got replaced with a modern one, so maybe it will be a lot better now lol.

Yeah, I've seen my rig raise my computer room's temp a good 6 degrees when encoding lol

Have to turn the rest of the place into an ice box like you said
 
Imagine running a 3090 + 3070 (>430W)
or twin 1080 Tis (~400W)
today... it throws off some heat.
What the card(s) consume is just a part (usually not even half) of the energy consumption equation.
 
Imagine running a 3090 + 3070 (>430W)
or twin 1080 Tis (~400W)
today... it throws off some heat.
What the card(s) consume is just a part (usually not even half) of the energy consumption equation.

That's dependable from the played game.
There are games where computation will be half or less, there are games that will hit the watt wall.

As I show in post 119 SotTR can stay at max power limit most of the time, so the heat will spread from the PC most of the time during gameplay.
 
Imagine running a 3090 + 3070 (>430W)
or twin 1080 Tis (~400W)
today... it throws off some heat.
What the card(s) consume is just a part (usually not even half) of the energy consumption equation.
Remember when we made fun of the R9 390 and RX Vegas for power consumption? As we've seen from the fact that nobody can buy a GPU for anywhere near MSRP, nobody cares about power consumption. Not even the miners.
 
Remember when we made fun of the R9 390 and RX Vegas for power consumption? As we've seen from the fact that nobody can buy a GPU for anywhere near MSRP, nobody cares about power consumption. Not even the miners.

Very true.
If mining still pays the costs and produces a profit... they will continue.
Heck, I just read an article yesterday about the state of Kentucky ASKING for miners to come to KY!
 
Remember when we made fun of the R9 390 and RX Vegas for power consumption? As we've seen from the fact that nobody can buy a GPU for anywhere near MSRP, nobody cares about power consumption. Not even the miners.
We made fun of its power consumption because it was barely better than a 970 while using considerably more power. The R9 390 released 9 months after the 970 with only 5% more relative performance while using a whopping 60% more power for that measly improvement, at the exact same MSRP.

Time really has a way of messing with some people's heads.
 

Attachments: perfrel_1920_1080.png (relative performance at 1920x1080), power_maximum.png (maximum power draw).
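Putting those two deltas together, a minimal sketch of the efficiency gap implied by the numbers above (only the +5%/+60% figures are used; the absolute wattages aren't needed):

```python
# Relative performance-per-watt of the R9 390 vs. the GTX 970,
# using only the deltas quoted above: +5% performance for +60% power.

perf_ratio = 1.05   # R9 390 performance relative to the GTX 970
power_ratio = 1.60  # R9 390 power draw relative to the GTX 970

efficiency_ratio = perf_ratio / power_ratio
print(f"R9 390 perf/W vs. 970: {efficiency_ratio:.2f}x "
      f"({(1 - efficiency_ratio) * 100:.0f}% worse)")
# -> ~0.66x, i.e. roughly a third less performance per watt at the same MSRP
```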
The card peaking at 646 watts in the screenshot above... that had an external cooling system and a reasonable noise level too, I think, right?

Maybe they could simply have a simple tower-like cooler.
 
The card peaking at 646 watts in the screenshot above... that had an external cooling system and a reasonable noise level too, I think, right?

Maybe they could simply have a simple tower-like cooler.

So a 1kW PSU just for that... ouch... add another 200-400W for the motherboard and what's on it... eek, that's a big, spendy PSU in the 1600W range.
 
So a 1kW PSU just for that... ouch... add another 200-400W for the motherboard and what's on it... eek, that's a big, spendy PSU in the 1600W range.
CPUs while gaming do not tend to reach their peak possible wattage; at least back in the day, reviewers were using 1,200W PSUs without issues. I guess there is always a possibility that a rare power peak of the GPU and the CPU happens at the same time, but the usual total system load while gaming was around 700 watts.

https://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review/5
Corsair AX1200i with a Core i7-4960X @ 4.2GHz

Maybe for a 12900K today you would need to upgrade.
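A minimal sketch of that sizing logic, using the ~700 W gaming load mentioned above and the 1,200 W unit from the linked review; the peak-margin factor is my own rule-of-thumb assumption, not something from Anandtech:

```python
# Back-of-the-envelope PSU headroom check for a heavy dual-GPU-style load.
# typical_load_w is the ~700 W gaming figure mentioned above; peak_margin is
# an assumed allowance for brief combined CPU+GPU spikes.

typical_load_w = 700.0   # usual total system draw while gaming (quoted above)
peak_margin = 1.4        # assumed multiplier for transient power spikes
psu_rating_w = 1200.0    # e.g. the Corsair AX1200i used in the linked review

estimated_peak_w = typical_load_w * peak_margin
headroom_w = psu_rating_w - estimated_peak_w
print(f"Estimated peak ~{estimated_peak_w:.0f} W, headroom {headroom_w:.0f} W "
      f"({psu_rating_w / estimated_peak_w:.2f}x the estimated peak)")
# -> ~980 W of estimated peak against a 1,200 W unit, which is why those builds
#    ran fine without anything like a 1,600 W supply.
```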
 
Plenty being almost 70% of single-monitor Steam users... I don't see it going away any time soon.

Could be, but I wouldn't put too much thought into that. I too use Steam with a 1920x1080 monitor, but not to game. I occasionally sign in to chat with people on a laptop. I also did say "mid range". Fact is, most gamers use lower end GPUs, especially when factoring in laptops. I consider "mid range" to be GTX/RTX **60 and above. Most people using those aren't on 1920x1080 these days.

But most gamers use **50s and the laptop versions of those. Those people overwhelmingly game at 1920x1080.
 
Could be, but I wouldn't put too much thought into that. I too use Steam with a 1920x1080 monitor, but not to game. I occasionally sign in to chat with people on a laptop. I also did say "mid range". Fact is, most gamers use lower end GPUs, especially when factoring in laptops. I consider "mid range" to be GTX/RTX **60 and above. Most people using those aren't on 1920x1080 these days.

LOL yes, I'm sure that the reason that 1920x1080 is the top resolution BY FAR is because people are using Steam as a chat platform and not because that's what they game at.

But most gamers use **50s and the laptop versions of those. Those people overwhelmingly game at 1920x1080.

6/10 of the most popular cards are **60 and only 3/10 are **50 (plus 1/10 **70) so I think it's pretty safe to say that most people are still running 1920x1080, including those with **60 cards.
 
I think hanging out solely on enthusiast forums/boards insulates some people from what normal people do. Shoot, my brother wants to get a 6600/x for 1080p just so that it's up to par for 5+ years or so.

Most people don't upgrade every generation. 1080p will stay the most-used gaming resolution. It really does look just fine at 21-24", and that's plenty big enough for people sitting at an average desk.
 
This right here ^. I have friends who game almost daily whom I helped put together Haswell and Skylake builds for when those were current. They are just now asking me for upgrade advice. Other than adding SSDs for more games, they haven't done any upgrades or even paid attention to what's been happening in the hardware space since building those.
 
With how fast cheap 1080p monitors are these days, I'd imagine even a 3070 will struggle to provide above 100 FPS at 1080p in all games, especially once you turn on RT features. DLSS/FSR helps to varying degrees, but isn't bulletproof.

Until we can run games at like 200+ FPS, 1080p won't go anywhere; there's no need for it to. I wouldn't be surprised if we see a 1000Hz 1080p monitor before 1440p becomes the new baseline.
 
With how fast cheap 1080p monitors are these days, I'd imagine even a 3070 will struggle to provide above 100 FPS at 1080p in all games, especially once you turn on RT features. DLSS/FSR helps to varying degrees, but isn't bulletproof.

Until we can run games at like 200+ FPS, 1080p won't go anywhere; there's no need for it to. I wouldn't be surprised if we see a 1000Hz 1080p monitor before 1440p becomes the new baseline.
1440p is a weird resolution; desktops tend to go 1080p or 4K. 1440p I mostly see in nicer laptops, usually paired with G-Sync.

I have it on my desktop with what I perceive as a good screen. And I'm basically staying there until the xx70 $600 CAD market can do better. I just don't game as much as I used to and my eyes are just tired. I'm not even sure I could tell the difference between them during actual gameplay, FPS notwithstanding. I mean, yeah, I am going to be able to tell the difference between 100 and 20 FPS, but at 75+ on any of them, unless I get really close I'm going to have to nitpick it. For now 1440p at 28+ inches is my go-to; it does well for both gaming and productivity.
 
1440p is a weird resolution; desktops tend to go 1080p or 4K. 1440p I mostly see in nicer laptops, usually paired with G-Sync.

I have it on my desktop with what I perceive as a good screen. And I'm basically staying there until the xx70 $600 CAD market can do better. I just don't game as much as I used to and my eyes are just tired. I'm not even sure I could tell the difference between them during actual gameplay, FPS notwithstanding. I mean, yeah, I am going to be able to tell the difference between 100 and 20 FPS, but at 75+ on any of them, unless I get really close I'm going to have to nitpick it. For now 1440p at 28+ inches is my go-to; it does well for both gaming and productivity.

I think 1440p is very popular now - I'm shopping monitors and it seems like 27" 1440p is the most common spec. It's a nice mix - 1080p is too pixelated at 27" and 4K is rare at 27". 32" monitors are kind of large for a typical desk.

I used to prefer 1920 x 1200 at 24" but now I prefer 27" 1440p.
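The pixel-density math backs that up; here's a minimal sketch using the sizes and resolutions mentioned in this exchange:

```python
# Pixel density (PPI) for the monitor sizes/resolutions discussed above.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Diagonal resolution in pixels divided by diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

configs = [
    ('1920x1080 @ 27"', 1920, 1080, 27),
    ('2560x1440 @ 27"', 2560, 1440, 27),
    ('3840x2160 @ 27"', 3840, 2160, 27),
    ('1920x1200 @ 24"', 1920, 1200, 24),
]
for label, w, h, d in configs:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")
# -> ~82, ~109, ~163 and ~94 PPI respectively: 1080p at 27" is noticeably
#    coarser than 1440p at the same size, with 4K far denser again.
```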
 
With how fast cheap 1080p monitors are these days, I'd imagine even a 3070 will struggle to provide above 100FPS at 1080p in all games, especially once you turn on RT features. DLSS/FSR helps to varying degrees, but isn't bulletproof.

Until we can run games at like 200+ FPS 1080p won't go anywhere, there's no need to. I wouldn't be surprised if we see a 1000Hz 1080p monitor before 1440p becomes the new baseline.

I think you wildly overestimate the number of people who actually care about >200Hz. Or are even that invested in >100Hz. All they care about is "feels smooth". Every 144Hz adaptive-sync display on the market can achieve that for >99% of the population.

The only people impressed by >200Hz displays are CS:GO lifers. And CS:GO will cruise past 200 FPS running on a half-decade-old potato.
 
LOL yes, I'm sure that the reason that 1920x1080 is the top resolution BY FAR is because people are using Steam as a chat platform and not because that's what they game at.

What is a strawman?

6/10 of the most popular cards are **60 and only 3/10 are **50 (plus 1/10 **70) so I think it's pretty safe to say that most people are still running 1920x1080, including those with **60 cards.

Many of those **60s are on par with the current generation **50 in terms of performance. I don't think you can, with a straight face, say a GTX 1060 is a mid range GPU in 2022. An X700 Pro was mid range about 17 or so years ago. It certainly isn't now. Then consider that many mobile GPUs are notably slower than their desktop counterparts. Most RTX 3060s get frame rates a lot lower than the desktop version, as most are Max-Q variants.

Again, I wouldn't put too much thought into what Steam shows, but these are the top 8 GPUs on Steam:


NVIDIA GeForce GTX 1060
NVIDIA GeForce GTX 1650
NVIDIA GeForce GTX 1050 Ti
NVIDIA GeForce RTX 2060
NVIDIA GeForce GTX 1050
NVIDIA GeForce GTX 1660 Ti
NVIDIA GeForce GTX 1660 SUPER
NVIDIA GeForce RTX 3060 Laptop GPU

I'm not really sure I'd consider a GTX 1060, 1650, or 1050 Ti to be "mid range".
 
We made fun of its power consumption because it was barely better than a 970 while using considerably more power. The R9 390 released 9 months after the 970 with only 5% more relative performance while using a whopping 60% more power for that measly improvement, at the exact same MSRP.

Time really has a way of messing with some people's heads.
Don't forget the 970 had 1GB less ram or more depending on which R9 390 we're talking about. Point is that modern GPU's use even more power and nobody batts an eye. If you just play games on these GPU's then the power difference in terms of cost isn't big. Miners seem to almost not even care.
 
Don't forget the 970 had 1GB less ram or more depending on which R9 390 we're talking about. Point is that modern GPU's use even more power and nobody batts an eye. If you just play games on these GPU's then the power difference in terms of cost isn't big. Miners seem to almost not even care.
0.5GB. Also, today's cards != yesterday's market on power draw. Keep being an AMD citizen. Lisa Su loves you all... and your money too!
 
You forgot to add AC cooling to compensate for that 500W of heat.
Having the PC in another room would help with the "feel" of the heat in the room.
My setup easily raises our den area 2-3 degrees, and that's with only an RX 5700 non-XT.
Running our central HVAC to compensate for that one room (it would be more ideal to have a small dedicated unit in the same room) will easily add $1 to our daily usage (I get daily alerts of my power usage) if I game for more than 3 hours, while making the rest of the house an icebox lol.
Not really a problem for me, I can handle an extra $31 a month worst case, but I would say that the increase in wattage from the GPU is negligible compared to the increase in cooling you will need/want to stay comfortable, especially if you only have central AC like most in the US.
Our unit just got replaced with a modern one, so maybe it will be a lot better now lol.
Or you could just turn on a fan.
 
You realize fans move air while adding heat, correct? Not everyone lives in a place where turning on a fan will do anything, at all, besides make it hotter.
I’ll give you the benefit of the doubt and just assume I’m missing your sarcasm.
 
I’ll give you the benefit of the doubt and just assume I’m missing your sarcasm.
Where does the energy that powers the fan go in a closed system (your house) when it's already hotter than the surface of Venus outside? Sure, the fan helps you a little because you're sweating balls, but it's not going to magically negate the 500W GPU blowing into the house.
 
I'm no thermodynamics engineer, but if you have a heat gradient in the air between 2 rooms (or 1 room and the rest of the house), it seems that circulating the air between the 2 areas will reduce the delta T.

If a PC is putting out 500+ watts of heat, how significant is the heat produced by a fan?

Edit: As a corollary to that, are you going to get rid of your case/radiator fans because they "add heat to the system"?
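For scale, here's a minimal sketch comparing the two heat sources; the fan wattages are typical values I'm assuming, not figures from the thread:

```python
# How much heat does a fan add relative to a ~500 W PC?
# The fan power figures below are assumed typical values for illustration.

pc_heat_w = 500.0
fans_w = {"case fan": 3.0, "desk fan": 40.0, "box fan": 75.0}

for name, watts in fans_w.items():
    print(f"{name}: {watts:.0f} W = {watts / pc_heat_w * 100:.1f}% of the PC's heat")
# -> roughly 0.6%, 8% and 15%: a fan does add heat, but it's small next to the
#    PC, and moving air between rooms still evens out the delta T it creates.
```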
 
You realize fans move air while adding heat, correct? Not everyone lives in a place where turning on a fan will do anything, at all, besides make it hotter.

Yes, but it is not fully a closed-system argument, as he is responding to someone talking about the feel in one particular room of a larger house (not only can moving air change the feel, but if there is a delta T between rooms, circulation can diminish it). A fan can help in that specific scenario even if it does not change the global amount of heat in the house. The context of the response is also the suggestion of putting the computer in a different room; a fan pushing the hot air from the computer into a different room would partially do the same.
 
Yes, but it is not fully a closed-system argument, as he is responding to someone talking about the feel in one particular room of a larger house; a fan can help in that specific scenario even if it does not change the global amount of heat in the house.
Too true. Anyone who has ever lived in a house/apartment with 12' high ceilings can attest to just how much having fans helps with keeping things both cool in the summer and warm in the winter by simply circulating air, even when the HVAC system reaches temp and shuts off.
 
Too true. Anyone who has ever lived in a house/apartment with 12' high ceilings can attest to just how much having fans helps with keeping things both cool in the summer and warm in the winter by simply circulating air, even when the HVAC system reaches temp and shuts off.
Yep, ceiling fans work very well...
We have a small "tri-level" house and purposefully put the computer area in the "basement" since it will always be the coldest part of the house, except when gaming lol. It easily gets hotter than the upstairs area (the hottest part of the house).
 