NVIDIA GeForce RTX 4070: up to 30% faster than RTX 3090 in gaming

1300/365 = $3.56 a day to use the card. Hell, I know people who spend $10 a day on fast food or Starbucks coffee with no complaints.
Cool? I guess? Weird way to rationalize it when you're not paying for it a day at a time and instead it's a large upfront purchase.
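For what it's worth, here's that amortization math as a quick sketch (the $1300 price and the one-year window are the quoted poster's assumptions, not mine):

```python
# Simple amortization of an up-front GPU purchase into a daily figure.
# The $1300 price and the time horizons are illustrative assumptions.
def cost_per_day(price_usd, days_of_use):
    return price_usd / days_of_use

print(f"${cost_per_day(1300, 365):.2f}/day over 1 year")       # $3.56
print(f"${cost_per_day(1300, 3 * 365):.2f}/day over 3 years")  # $1.19
```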
 
It is not monolithic. You can try to spin it any way you like, but it is still a chiplet design. This is how AMD is able to have fewer defects at TSMC and make more money. Yes, there will still be an I/O die. But if you call RDNA3 monolithic, then you are saying the Ryzen CPUs are also monolithic.
Ryzen CPUs can come in configurations with multiple CCXs where compute is distributed across them, presented to the system as a single chip, with no discernible performance difference between them. That isn't something they are going to be offering with this generation of GPUs. But yes, their GPUs will be cheaper and likely half the physical size of the Nvidia ones, which will drastically cut costs and improve yields, so there will be a much better margin there.
This is why I said they are but they aren't: technically it is a chiplet design, but all the computing is still happening on a single chip, which is monolithic.

If you want a fully chiplet-based GPU you have to look at the Instinct MI250 and MI250X, which are only now starting to get out into the wild, and so far their benchmarks are rather impressive for only using 600w.
I am pretty sure AMD ran into the same problems with the RDNA 3 chiplet designs that they did with the Instinct ones, in that GDDR6 is too slow to handle the split, so they need HBM2+ or GDDR7 to make it work smoothly, but using either would destroy any cost savings the chiplet design granted them.
 
After reading this at TechPowerUp I 100% believe the 4090 was designed as a 600w card and is underclocked so they don't end up with a huge number of end-user issues. I know for a fact that the only design kit submitted by Nvidia for builds in China was 600w.

While the RTX 4090 operates at 450 W by default, the power delivery capability allows you to increase the power limit up to 600 W for overclocking.

https://www.techpowerup.com/299096/...-cooler-pcb-design-new-power-spike-management
 
It is not monolithic. You can try to spin it any way you like, but it is still a chiplet design. This is how AMD is able to have fewer defects at TSMC and make more money.
That sounds pretty convincing until you look at AMD's published financials and realize that they actually lose money on every GPU they sell. In a shock to MLID idolators, it turns out that there is more that goes into GPU cost than simple silicon area.
 
1300/365 = $3.56 a day to use the card. Hell, I know people who spend $10 a day on fast food or Starbucks coffee with no complaints.
Yeah, and these are the same people who drop $5k on a system but then complain about paying $150 for a Windows license on top of it, yet will pre-order the next top AAA title for $200 to get an ugly plastic figure and play a crap-tastic, buggy game at launch that is nothing like what the company promised.
 
After reading this at TechPowerUp I 100% believe the 4090 was designed as a 600w card and is underclocked so they don't end up with a huge number of end-user issues. I know for a fact that the only design kit submitted by Nvidia for builds in China was 600w.

While the RTX 4090 operates at 450 W by default, the power delivery capability allows you to increase the power limit up to 600 W for overclocking.

https://www.techpowerup.com/299096/...-cooler-pcb-design-new-power-spike-management
Maybe the AIBs will use a 600 watt BIOS right away on some $2000 ROG 600 watt version out of the box, something NVIDIA itself never planned to do, but it was simpler to have only one card design.

Or maybe yes: before the GPU demand collapse they planned a ridiculous 600 watt halo card, then demand collapsed, AMD started talking up RDNA 3's efficiency so much that they lost any fear of the 4090 not being the top dog, and they decided to release the 450 watt version and leave the much more expensive card open for a future release.
 
After reading this at TechPowerUp I 100% believe the 4090 was designed as a 600w card and is underclocked so they don't end up with a huge number of end-user issues. I know for a fact that the only design kit submitted by Nvidia for builds in China was 600w.

While the RTX 4090 operates at 450 W by default, the power delivery capability allows you to increase the power limit up to 600 W for overclocking.

https://www.techpowerup.com/299096/...-cooler-pcb-design-new-power-spike-management
The standard Nvidia vBIOS allows +23% on the TDP slider. That takes a 450W card and allows it to hit 554W. 600W would be a pretty reasonable safety factor given that.
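As a quick back-of-envelope check of that slider math (the 450 W default and +23% slider are the figures from this post; the 600 W design spec is the rumored number being discussed):

```python
# Back-of-envelope check of the vBIOS power-limit headroom.
# 450 W default and +23% slider are from the post above; 600 W is the rumored spec.
STOCK_TDP_W = 450
SLIDER_HEADROOM = 0.23
RUMORED_DESIGN_SPEC_W = 600

max_user_limit = STOCK_TDP_W * (1 + SLIDER_HEADROOM)
margin = (RUMORED_DESIGN_SPEC_W - max_user_limit) / max_user_limit

print(f"Max user-settable limit: {max_user_limit:.0f} W")     # ~554 W
print(f"Headroom left to a 600 W design spec: {margin:.1%}")  # ~8.4%
```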

Factor in the potential for an AIB's halo card (RIP Kingpin Edition😢), and a 600W design spec seems pretty reasonable with the matching FE being only 450W.

There's also the question of why a CM would have a copy of the design spec. My understanding is that Nvidia does their own layouts rather than simply sending specs and schematics to their CM. If my understanding is correct, then the CM never would have gotten the design spec. If they had a TDP number, it would have been from them calculating it based off of the design. Power draw isn't determined by the theoretical max the hardware can survive. Without the firmware up and running, the CM never would have been able to figure out the actual design target for power aside from it being less than the maximum the hardware can survive.

MLID targeted a weakness in his fanbase and they ate it right up. There is nothing more to the 600W story than that.
 
The standard Nvidia vBIOS allows +23% on the TDP slider. That takes a 450W card and allows it to hit 554W. 600W would be a pretty reasonable safety factor given that.

Factor in the potential for an AIB's halo card (RIP Kingpin Edition😢), and a 600W design spec seems pretty reasonable with the matching FE being only 450W.

There's also the question of why a CM would have a copy of the design spec. My understanding is that Nvidia does their own layouts rather than simply sending specs and schematics to their CM. If my understanding is correct, then the CM never would have gotten the design spec. If they had a TDP number, it would have been from them calculating it based off of the design. Power draw isn't determined by the theoretical max the hardware can survive. Without the firmware up and running, the CM never would have been able to figure out the actual design target for power aside from it being less than the maximum the hardware can survive.

MLID targeted a weakness in his fanbase and they ate it right up. There is nothing more to the 600W story than that.
It was designed as a 600W card from day one. I know it was, 100%. I saw the documentation.
 
I believe there was a 600w model, even if just as something to keep in reserve in case they needed to release it later against AMD, and they ended up not needing it, so it never sees the light of day.

 
Or maybe yes: before the GPU demand collapse they planned a ridiculous 600 watt halo card, then demand collapsed, AMD started talking up RDNA 3's efficiency so much that they lost any fear of the 4090 not being the top dog, and they decided to release the 450 watt version and leave the much more expensive card open for a future release.
Bingo.
 
I believe there was a 600w model, even if just as something to keep in reserve in case they needed to release it later against AMD, and they ended up not needing it, so it never sees the light of day.
I believe what we are seeing are AIB designs based on the 600W design spec that NVIDIA submitted. There are tons of reasons for not bringing out "stock" 600w cards, but the biggest is simply that the needed PSU infrastructure is not in the channel to fully support that. The ATX 3.0 spec allows for transient spikes of 200% of the PSU's rating.
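A rough PSU-sizing sketch of what that 200% transient allowance implies, taking the figure in this post at face value (the GPU spike factor and system wattages below are illustrative assumptions, not spec values):

```python
# Rough PSU sizing given a transient allowance (assumed 200% of the rating,
# per the post above) and an assumed GPU spike factor. Illustrative only.
def min_psu_rating(gpu_tdp_w, rest_of_system_w,
                   gpu_spike_factor=2.0, transient_allowance=2.0):
    """Smallest PSU rating (W) that covers sustained load and momentary GPU spikes."""
    sustained = gpu_tdp_w + rest_of_system_w
    spike = gpu_tdp_w * gpu_spike_factor + rest_of_system_w
    # the PSU may ride out spikes up to transient_allowance x its rating
    return max(sustained, spike / transient_allowance)

print(min_psu_rating(600, 250))  # 850 -> the sustained load dominates
print(min_psu_rating(450, 250))  # 700
```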
 
I believe what we are seeing are AIB designs based on the 600W design spec that NVIDIA submitted. There are tons of reasons for not bringing out "stock" 600w cards, but the biggest is simply that the needed PSU infrastructure is not in the channel to fully support that. The ATX 3.0 spec allows for transient spikes of 200% of the PSU's rating.
The biggest reason for not putting out stock 600w designs is cooling. In enterprise, both AMD and Nvidia already sell 500-600w cards and they are HOT, but when your box is running on 30A 240V, with fans moving a few thousand cubic feet of air per minute into an air-conditioned room, that is manageable. A 600w card in a home tower that is moving maybe a few hundred cubic feet into the space under somebody's desk in a poorly ventilated corner office? I mean, I have plenty of servers running dual 1600-watt Titanium PSUs that are able to handle those spikes, but most people's homes just can't handle that sort of load and not catch fire.
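To put rough numbers on that airflow point, a simple steady-state sketch using textbook air properties (the CFM figures are illustrative assumptions):

```python
# How much the air carrying the heat away warms up, for a given heat load
# and airflow. Air properties are textbook values; CFM figures are assumed.
AIR_DENSITY = 1.2       # kg/m^3 at roughly room temperature
AIR_CP = 1005.0         # J/(kg*K)
CFM_TO_M3S = 0.000471947

def air_temp_rise_c(heat_w, airflow_cfm):
    mass_flow = AIR_DENSITY * airflow_cfm * CFM_TO_M3S  # kg/s
    return heat_w / (mass_flow * AIR_CP)

print(f"{air_temp_rise_c(600, 3000):.1f} C rise")  # server aisle airflow: ~0.4 C
print(f"{air_temp_rise_c(600, 100):.1f} C rise")   # under-desk tower: ~10.5 C
```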
 
The biggest reason for not putting out stock 600w designs is cooling. In enterprise, both AMD and Nvidia already sell 500-600w cards and they are HOT, but when your box is running on 30A 240V, with fans moving a few thousand cubic feet of air per minute into an air-conditioned room, that is manageable. A 600w card in a home tower that is moving maybe a few hundred cubic feet into the space under somebody's desk in a poorly ventilated corner office? I mean, I have plenty of servers running dual 1600-watt Titanium PSUs that are able to handle those spikes, but most people's homes just can't handle that sort of load and not catch fire.
Yep. Lots more reasons...
 
The biggest reason for not putting out stock 600w designs is cooling. In enterprise, both AMD and Nvidia already sell 500-600w cards and they are HOT, but when your box is running on 30A 240V, with fans moving a few thousand cubic feet of air per minute into an air-conditioned room, that is manageable. A 600w card in a home tower that is moving maybe a few hundred cubic feet into the space under somebody's desk in a poorly ventilated corner office? I mean, I have plenty of servers running dual 1600-watt Titanium PSUs that are able to handle those spikes, but most people's homes just can't handle that sort of load and not catch fire.

Exactly. Heat dumping / noise is gradually getting a bit higher every few years. Nice in the winter, sucks in the summer.
 
It was designed as a 600W card from day one. I know it was, 100%. I saw the documentation.
If we pretend for a moment that this is as correct as you say and that it truly was intended for standard operation at 600W, then we should end up seeing 20-30% gains via overclocking just with the stock coolers. That would net out to performance that's 3-6x as good as a 3090. I fail to see how this is a bad thing.

Further, if this is as correct as you say and that it truly was intended for standard operation at 600W, then we should see an additional 10-15% performance boost with the cards hitting 800W on aftermarket coolers. That puts it at 3.5-7x as good as a 3090, which would make it suitable for 120fps gaming at 8K (pretending such a monitor and connector existed). Once again, I fail to see how this is a bad thing.
 
Exactly. Heat dumping / noise is gradually getting a bit higher every few years. Nice in the winter, sucks in the summer.

Nice in the winter, until you get the electric bill. It's not a very efficient way to generate heat. In my backup computer I am still using 3x GTX680 in SLI. Those are 195w TDP cards stock, so pretty close to 600w TDP with all three combined. In a benchmark that fully loads all three I can literally feel the heat coming out from behind my desk. My RTX 2080 is 215w TDP in comparison. It's crazy to think that some of these new cards are more than double that.

If we pretend for a moment that this is as correct as you say and that it truly was intended for standard operation at 600W, then we should end up seeing 20-30% gains via overclocking just with the stock coolers.

I doubt it. The stock coolers are already huge (3.5 slot is insane) and likely being pushed close to their limits. Not wanting to put even bigger heatsinks on there is probably one of the reasons why they kept it to 450w.

That would net out to performance that's 3-6x as good as a 3090. I fail to see how this is a bad thing.

How the hell did you come up with "3-6x"? Did you gobble up that 2-4x nvidia marketing nonsense and tack on another "20-30%"?

Further, if this is as correct as you say and that it truly was intended for standard operation at 600W, then we should see an additional 10-15% performance boost with the cards hitting 800W on aftermarket coolers.

I wonder how many slots those coolers would take up? 5-6?

That puts it at 3.5-7x as good as a 3090

lol..... please tell me that your post was bait and that you're not actually that gullible.
 
Lmao. Now we're at 6x 3090? Come on. I never want to hear anyone criticize AMD graphs ever again.
 
They both deserve all the derision they receive for their marketing.
Except apparently the AMD claims have been pretty on point lately. All I'm saying is this isn't the first person I've seen go all in with these 3x, 4x, whatever-times-a-3090 claims.
 
If we pretend for a moment that this is as correct as you say and that it truly was intended for standard operation at 600W, then we should end up seeing 20-30% gains via overclocking just with the stock coolers.

How does that make any sense? Increasing power usage by 20% (random number) doesn't mean the performance jumps by exactly 20%. And even if the cards were designed to run at 600W initially, the coolers they are shipping with most likely would not be capable of properly cooling such a card. They were designed for something closer to 450W, or whatever the AIBs intend to run them at.

This is the same reason a **50 has a weaker cooler than a **80 or **90 card. Throwing a massive cooler on a card that doesn't need it just increases costs.
 
If we pretend for a moment that this is as correct as you say and that it truly was intended for standard operation at 600W, then we should end up seeing 20-30% gains via overclocking just with the stock coolers. That would net out to performance that's 3-6x as good as a 3090. I fail to see how this is a bad thing.

Further, if this is as correct as you say and that it truly was intended for standard operation at 600W, then we should see an additional 10-15% performance boost with the cards hitting 800W on aftermarket coolers. That puts it at 3.5-7x as good as a 3090, which would make it suitable for 120fps gaming at 8K (pretending such a monitor and connector existed). Once again, I fail to see how this is a bad thing.
Did I say it was a bad thing? Pretend all you want. I know the truth. I read the documents.

Seems like you want to argue about something. I am just giving you facts, and my track record on that front is stellar.
 
I doubt it. The stock coolers are already huge (3.5 slot is insane) and likely being pushed close to their limits. Not wanting to put even bigger heatsinks on there is probably one of the reasons why they kept it to 450w.
[snip]
I wonder how many slots those coolers would take up? 5-6?
According to lore, they would be the same coolers we're seeing now.

How the hell did you come up with "3-6x"? Did you gobble up that 2-4x nvidia marketing nonsense and tack on another "20-30%"?
I'm going off of Nvidia's release + a pair of tweets from an independent third party which has already been running the cards.
https://twitter.com/OTOY/status/1572268322481405954
https://twitter.com/JulesUrbach/status/1572281595788423168

Sure, Nvidia would tend to push numbers that are favorable to them. But when they say "2-4x," the real numbers sure as heck aren't going to be 1-1.2x. My guesstimated boost is based off of historical underclocking performance. It's a figurative estimate, not a literal measurement. The point is that the boost would be substantial, and it would be great for overclockers (especially with Nvidia on the record saying they have cards hitting over 3 GHz in their lab - a 20+% increase over the published boost clocks).
 
when they say "2-4x," the real numbers sure as heck aren't going to be 1-1.2x.

Those are cherry-picked numbers using DLSS 3.0. If you actually spend most of your time playing one of the few games that actually supports that then great. For other games, Jensen himself said that we're looking at closer to a 25% increase. So yeah, 1.25x
 
Those are cherry-picked numbers using DLSS 3.0. If you actually spend most of your time playing one of the few games that actually supports that then great. For other games, Jensen himself said that we're looking at closer to a 25% increase. So yeah, 1.25x
Do you have a link to the 25%?

It's difficult for me to believe that a 55% increase in core count combined with a 25% increase in boost clock leads to a total performance gain of only 25%. To have those objective inarguable stats but then provide only a 25% increase in performance would mean that the card is sitting there idle about 40% of the time while in the middle of playing a game.
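As a rough sanity check of that argument, using the post's own percentages and ignoring memory bandwidth or any other bottleneck:

```python
# Sanity check using the figures quoted above (55% more cores, 25% higher
# boost clock) against a claimed 25% overall gain. Assumes perfect scaling.
core_gain = 1.55
clock_gain = 1.25
claimed_gain = 1.25

theoretical = core_gain * clock_gain        # ~1.94x raw throughput
utilization = claimed_gain / theoretical

print(f"Theoretical scaling: {theoretical:.2f}x")
print(f"Implied utilization of the extra hardware: {utilization:.0%}")  # ~65%
print(f"Implied idle fraction: {1 - utilization:.0%}")                  # ~35%
```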
 
Do you have a link to the 25%?

It's difficult for me to believe that a 55% increase in core count combined with a 25% increase in boost clock leads to a total performance gain of only 25%.
Depends how much of that hardware is going towards non-RTX/DLSS stuff.
 
Depends how much of that hardware is going towards non-RTX/DLSS stuff.
No. No, it doesn't. I'm comparing CUDA cores and CUDA cores alone. The massive increases in tensor and RT hardware are in addition to the massive increase in CUDA performance.
 
Speculating on performance gains gets everybody nowhere. Reviews will be out soon!
Let's all just be happy that GPUs actually exist now beyond just giving us those warm feelings like waking up after a good dream as it slowly drifts away forgotten.
The last 2 years have been a very dark time for the gaming industry.
 
Of course it was designed as a 600 watt card. This isn't really a gaming design. Nvidia is in the supercomputer business, and as Lakados said, when it comes to enterprise server racks, power and cooling ceilings are very different.

Did they think they could release a consumer version at 600 watts? Probably. It is crazy, but probably. My guess would be they fully expect they can spike to that if they end up needing it to win the benchmark war. Hopefully they don't start too many fires. lol
 
Reviewers need to open serious discussion and add an entire section to each video card review where they talk about the real world ramifications of the power draw, how hot it gets, how much heat it dumps into a small room, how much air conditioning you need to cool it, potential lifespan issues and how much case and fan maintenance is needed to make sure you don't screw up and bake expensive hardware. And frankly also cost to run, because above 300w you have to actually calculate yearly electricity usage if you use it every day.

I'm not buying a 450w video card. They can just screw off with that. I have enough thermal issues as it is and my office gets hot enough I have to run AC in that room most of the year even part of winter. And that's with a 3080 and 5900x. I'm not going to go even one watt past where I am now for heat generation. Not happening. This IS my practical limit. I have to live with the computer every day.
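For the cost-to-run point above, a rough yearly estimate; the hours per day and electricity rate are illustrative assumptions, so plug in your own:

```python
# Rough yearly electricity cost of the GPU alone. Hours/day and $/kWh are
# illustrative assumptions; actual draw varies with workload.
def yearly_cost_usd(card_watts, hours_per_day=4, usd_per_kwh=0.15):
    kwh_per_year = card_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

for w in (300, 450, 600):
    print(f"{w} W card: ~${yearly_cost_usd(w):.0f}/year")
# 300 W -> ~$66, 450 W -> ~$99, 600 W -> ~$131
```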
 
Reviewers need to open serious discussion and add an entire section to each video card review where they talk about the real world ramifications of the power draw, how hot it gets, how much heat it dumps into a small room, how much air conditioning you need to cool it, potential lifespan issues and how much case and fan maintenance is needed to make sure you don't screw up and bake expensive hardware. And frankly also cost to run, because above 300w you have to actually calculate yearly electricity usage if you use it every day.

I'm not buying a 450w video card. They can just screw off with that. I have enough thermal issues as it is and my office gets hot enough I have to run AC in that room most of the year even part of winter. And that's with a 3080 and 5900x. I'm not going to go even one watt past where I am now for heat generation. Not happening. This IS my practical limit. I have to live with the computer every day.
Yup. This is why I get tired of people going "lol just get a bigger psu". That's never been an issue. Component cooling has never been an issue. Heat dissipation into an enclosed room is a very real issue.

Like, it doesn't even help in winter. People talk about free heat, why? The house has central air. Still going to be that same 10 degrees warmer.
 
Heat dissipation into an enclosed room is a very real issue.

That's very true, but this also isn't some unprecedented issue that has never occurred before. There were plenty of multi-GPU setups in the past that had a combined TDP equal or greater than that of the 4090. Anyone who could tolerate the heat output of a high-end SLI or Crossfire setup will probably be just fine with a 4090. Everyone has to figure out what their limits are. The amount of time I spend gaming, I would be okay with the heat output if performance is high enough. Not happy about it, but ok with it.
 
Reviewers need to open serious discussion and add an entire section to each video card review where they talk about the real world ramifications of the power draw, how hot it gets, how much heat it dumps into a small room, how much air conditioning you need to cool it, potential lifespan issues and how much case and fan maintenance is needed to make sure you don't screw up and bake expensive hardware. And frankly also cost to run, because above 300w you have to actually calculate yearly electricity usage if you use it every day.

I'm not buying a 450w video card. They can just screw off with that. I have enough thermal issues as it is and my office gets hot enough I have to run AC in that room most of the year even part of winter. And that's with a 3080 and 5900x. I'm not going to go even one watt past where I am now for heat generation. Not happening. This IS my practical limit. I have to live with the computer every day.

Small room, large room, same heat.

How much AC? Lol wtf are you talking about? That's not how things work.
Nifty thing about CPUs and GPUs: if it pulls 450w, then it dissipates 450w.

Lifespan is not calculable by any review or influencer. This would literally have to be taken at the manufacturer's word.

Bake expensive hardware? 🤣

At any power draw above or below 300w, there is a cost to run. Why did you decide to set 300w as this arbitrary point? Do you get free electricity if your GPU draws ≤300w?

Your final statement shows your lack of understanding of how these things work.

Should probably just delete your whole post.
 
Anyone who could tolerate the heat output of a high-end SLI or Crossfire setup will probably be just fine with a 4090.
Never had one, probably wouldn't want one lol. My wife and I have some fairly efficient components and it already ratchets the PC room up 8+ degrees.
 