RTX 3xxx performance speculation

The PCB doesn't dissipate much power, so its size has little relevance.
What affects overclocking more is how automated overclocking and power control have become.
As time goes on we get less manual control; more of it is automated.
The problem is we don't get potentially high clocks for free any more; we have to pay for them.
Everything is performance binned.
All true, but packing all those components together in such a small space does not bode well for heat transfer. And they're already pushing 325 W or 350 W? Oof. I don't think Samsung has left a lot of headroom in these chips. Asus is already touting a 400 W model. I would think that is about topped out.

Have they posted measurements yet? I think I only have about an inch of length over my current Aorus 2080 Ti before the card would be hitting my side-mounted CPU cooler. I'll be a sad boi if I can't fit the 3090.
3090/3080/3070
[image attachment]


I will say it makes me wonder a bit, because none of the AIB partners have released clock speeds yet for their factory-overclocked cards (as far as I have seen). And these cards are ostensibly about 2-4 weeks away.
No clocks, no prices.....hmmm.

Most AIB cards look like 2.5-to-triple-slot coolers, so it will be interesting to see what the OCs end up being for each brand.
There will be 3+ slot cards as well, I was told.
 
The first dual slot triple fan 3090 gets my money...if that doesn't happen then...ugh...time for a new case.
 
All true, but packing all those components together in such a small space does not bode well for heat transfer. And they're already pushing 325 W or 350 W? Oof. I don't think Samsung has left a lot of headroom in these chips. Asus is already touting a 400 W model. I would think that is about topped out.
The cooler defines heat transfer; it doesn't diminish with PCB size.
If anything, the cooler can be larger with a smaller PCB.
 
The cooler defines heat transfer; it doesn't diminish with PCB size.
If anything, the cooler can be larger with a smaller PCB.
Roger that. :) It is only going to be an issue on FE cards anyway. And there will not be many of those for sale, so....
 
Roger that. :) It is only going to be an issue on FE cards anyway. And there will not be many of those for sale, so....
Shame, it's a great looking design. Gonna be interesting to see those thermals reviewed.
 
Roger that. :) It is only going to be an issue on FE cards anyway. And there will not be many of those for sale, so....
I understand better what you were trying to say: it wasn't about GPU power, but heat from the smaller components packed into a small space.
Nvidia takes enough care to ensure airflow through the cooler over components that get too warm; I wouldn't worry.
If anything still gets too warm, it will have a small heatsink on it (or a pad to the large heatsink), or a larger component will be chosen that dissipates the heat over a larger volume (surface area).
They don't cock up that often :)
 
I understand better what you were trying to say: it wasn't about GPU power, but heat from the smaller components packed into a small space.
Nvidia takes enough care to ensure airflow through the cooler over components that get too warm; I wouldn't worry.
If anything still gets too warm, it will have a small heatsink on it (or a pad to the large heatsink), or a larger component will be chosen that dissipates the heat over a larger volume (surface area).
They don't cock up that often :)
Thanks for letting me know. :) I think these new coolers look great. I just need to see how fast one warms my office up and how loud it sounds. I am already having 290X flashbacks.
 
Thanks for letting me know. :) I think these new coolers look great. I just need to see how fast one warms my office up and how loud it sounds. I am already having 290X flashbacks.
lol, I fitted an Accelero Xtreme III cooler to my 290X; it worked a treat but didn't change the amount of heat generated.
Expect more heat from the 3080 and 3090; their design power specs are higher than the 290X's.
I doubt they will get as warm, though, thanks to better coolers and all that.
 
I wonder how the reference cooler would perform in an inverted-motherboard case that exhausts out the top.
 
Didn't AMD claim a 50% performance-per-watt increase for RDNA2? I think it's pretty impressive that NVIDIA was able to get a 90% perf-per-watt increase. NVIDIA was helped by going from 12 nm to 8 nm, and AMD doesn't have that boost as much this time since they are already on 7 nm. Improvements on the AMD side will likely be more architectural.
Not sure how that will work out. Jensen stated that 3070 performance basically equaled a 2080 Ti; the 3070 is rated at 220 W, the 2080 Ti at 250 W. A 90% perf-per-watt increase looks more like 10-15%. Must have gotten the 90% and 10% backwards.

ASUS spending extra money to put flashing light indicators on their 400 W 3000 series Ampere (AMPS and more AMPS) card, selling it as a needed feature :ROFLMAO: based on power transients which may overload the power supply, is rather entertaining. I wonder if they will include an audible alarm to go with it?
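The back-of-envelope math above can be sanity-checked in a couple of lines. This sketch assumes, as Jensen's claim implies, that the 3070 exactly matches a 2080 Ti in performance, and uses only the rated board powers quoted above:

```python
# Rough perf-per-watt comparison from the rated board powers quoted above.
# Assumes 3070 performance exactly equals a 2080 Ti (per Jensen's claim).
perf = 1.0        # normalized performance, taken as identical for both cards
tdp_3070 = 220    # watts, rated
tdp_2080ti = 250  # watts, rated

gain = (perf / tdp_3070) / (perf / tdp_2080ti) - 1
print(f"perf-per-watt improvement: {gain:.0%}")  # ~14%, nowhere near 90%
```

On these numbers the improvement is just the TDP ratio, 250/220, about 14%; much closer to the 10-15% figure than to 90%.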
 
I wonder how many people actually paid attention to power supply reviews on here. Part of the reason for a bigger power supply than you need is that they are most efficient around 50% to 60% of their capacity. Thus I can see why an 850 W power supply is being recommended. It will also be interesting to see these in reviewers' hands, as I don't have a ton of trust in manufacturer bar charts.
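That sizing logic can be sketched in a few lines. The 450 W full-load draw below is a hypothetical system total chosen for illustration, not a measured or official figure:

```python
# Size the PSU so the expected full-load draw lands in the 50-60%
# efficiency sweet spot. The 450 W draw is an assumed example figure.
system_draw = 450  # watts, hypothetical full-load system draw

psu_min = system_draw / 0.60  # smallest PSU keeping load at or below 60%
psu_max = system_draw / 0.50  # largest PSU keeping load at or above 50%
print(f"target PSU size: {psu_min:.0f}-{psu_max:.0f} W")  # 750-900 W
```

An 850 W unit lands comfortably inside that 750-900 W window for a draw in that ballpark.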
 
I wonder how many people actually paid attention to power supply reviews on here. Part of the reason for a bigger power supply than you need is that they are most efficient around 50% to 60% of their capacity. Thus I can see why an 850 W power supply is being recommended. It will also be interesting to see these in reviewers' hands, as I don't have a ton of trust in manufacturer bar charts.
And, we actually built our own equipment to test transient power response. I do not think any other review site ever covered that, and that is what ASUS is focusing on. You can ask @[spectre]; he knows a lot more about this than I do.
 
And, we actually built our own equipment to test transient power response. I do not think any other review site ever covered that, and that is what ASUS is focusing on. You can ask @[spectre]; he knows a lot more about this than I do.

This is why I really do miss your reviews. I know you won't ever get back into it, since this is the day and age of YouTube tech reviews and the like. But very few places did in-depth reviews like you did.

I would LOVE to see a 3090 review from good ole [H] ;)
 
All true, but packing all those components together in such a small space does not bode well for heat transfer. And they're already pushing 325 W or 350 W? Oof. I don't think Samsung has left a lot of headroom in these chips. Asus is already touting a 400 W model. I would think that is about topped out.


3090/3080/3070
[image attachment]


No clocks, no prices.....hmmm.


There will be 3+ slot cards as well I was told.
Annnnnd now I need a new case, too.
 
Placing hot, high-temperature-rated components very close to components with lower temperature ratings can decrease lifespan due to heat transfer. Temperature is probably the #1 cause of shortened lifespan in electrical equipment in general. The other concern I would have is the 12-layer board in a very tight configuration, with each layer acting like insulation and rather high-amperage components being fed through it; the natural resistance generates heat that can build up. It is definitely an interesting design, but is that design appropriate for this use case? I would suspect a higher failure rate over the lifetime of these boards unless Nvidia has really hit the super-high-quality stage for everything.

Those with air coolers on their CPU will get a new buddy: a fan dumping hot air straight into it from a 320-350 W non-OC GPU.
 
Hold on a sec. In the video he said that the players testing the 3090 were using an LG 8K OLED display. But the smallest 8K OLED on the market right now is like 77 inches.

Did he inadvertently unveil a prototype 8K gaming display? It didn't seem to me that they were using 77-inch displays, but I suppose at 8K it might be possible to play at that distance.
 
Hold on a sec. In the video he said that the players testing the 3090 were using an LG 8K OLED display. But the smallest 8K OLED on the market right now is like 77 inches.

Did he inadvertently unveil a prototype 8K gaming display? It didn't seem to me that they were using 77-inch displays, but I suppose at 8K it might be possible to play at that distance.
Considering the new-found partnership they have with LG and their OLED TVs, I wouldn't be surprised if they were using a prototype.
 
Might be problematic for CPU temperature performance, as I'm still using old-fashioned air cooling (Noctua NH-U12S).

Good for the card temps but bad for my CPU setup. Gotta see what cooling systems the other AIBs are designing.

[image attachment]
 
Not sure how that will work out. Jensen stated that 3070 performance basically equaled a 2080 Ti; the 3070 is rated at 220 W, the 2080 Ti at 250 W. A 90% perf-per-watt increase looks more like 10-15%. Must have gotten the 90% and 10% backwards.

ASUS spending extra money to put flashing light indicators on their 400 W 3000 series Ampere (AMPS and more AMPS) card, selling it as a needed feature :ROFLMAO: based on power transients which may overload the power supply, is rather entertaining. I wonder if they will include an audible alarm to go with it?

Put up the perf/Watt slide and I will explain to you why you fail physics...
 
Seasonic's PSU calculator has been updated to include the 30 series GPUs. It just uses their TGP values (350 W, 320 W, etc.), so I don't know how helpful that is.

I got 564 W with a 3080 (494 W with a 1080 Ti, exactly 70 W less).
That actually seems a lot higher than reality. I am pretty sure it just adds TDPs together (max output).
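If the calculator really is just summing rated board powers, the two quoted totals are easy to reproduce. The 244 W "rest of system" figure below is inferred from the quoted totals, not an official breakdown:

```python
# Reproduce the quoted Seasonic totals by summing rated board powers.
# The 244 W rest-of-system number is inferred, not an official figure.
rest_of_system = 244   # watts, whatever makes the quoted totals line up
rtx_3080_tgp = 320     # watts, Nvidia's rated TGP
gtx_1080ti_tdp = 250   # watts, Nvidia's rated TDP

print(rtx_3080_tgp + rest_of_system)    # 564, the quoted 3080 result
print(gtx_1080ti_tdp + rest_of_system)  # 494, the quoted 1080 Ti result
```

The 70 W gap between the two results is exactly the 320 W vs 250 W difference in GPU ratings, which fits the add-the-TDPs theory.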
 
The EVGA XC looks like it is exactly what you want. Someone posted a pic of it in the EVGA thread; can't tell if it uses a custom PCB.
I think pretty much every AIB card will be using a "custom" PCB.
 
Might be problematic for CPU temperature performance, as I'm still using old-fashioned air cooling (Noctua NH-U12S).

Good for the card temps but bad for my CPU setup. Gotta see what cooling systems the other AIBs are designing.

[image attachment]

Where do you think the heat goes with triple open fans?

This is better because it gets the heat out faster. If it actually works that way...
 
Where do you think the heat goes with triple open fans?

This is better because it gets the heat out faster. If it actually works that way...
So you think having a direct stream of GPU/heat airflow blowing directly on the CPU area is the same as being diffused with airflow coming into the case? (Serious question.) I can see several different ways to look at this and how air travels through the case and exactly how case airflow is set up.
 
Where do you think the heat goes with triple open fans?
The hot-air flow distribution at the CPU heatsink's intake would differ between these two GPU cooling setups, especially in a closed PC case. That would be my guess.
 
The hot-air flow distribution at the CPU heatsink's intake would differ between these two GPU cooling setups, especially in a closed PC case. That would be my guess.
Exactly what I was thinking. Put a blow dryer an inch from your hand, then pull it back several inches... diffusion matters.
 
Only in 2020 could a GPU fan that blows hot air directly through your CPU cooler be a selling point!

They must assume everyone has some kind of watercooler on the CPU.
 
.....
In general you need to take Nvidia's perf-per-watt claims with a grain of salt. They also claimed Turing was 50% better, but in reality it was probably closer to 10-15% once you actually measured power draw at the wall versus performance.

Well... when new cards launch, comparing across different architectures is not as straightforward as one might think, because so many attributes and factors are at play when designing a graphics card. For a new card model, there are many decisions to make, and the specific goals and targets can have large impacts on characteristics such as power consumption.

Case in point: Fermi. GF100 seemed awful, but people praised GF104. An architecture can look a lot worse when it's pushed too hard in the cards using it.

But speaking on Turing specifically: there is a decent bump in performance going from a PCIe-slot-powered GTX 1050 Pascal low-profile card to the Turing GTX 1650 low-profile cards. Without any extra power connectors, within the sub-75 W limit of the PCIe slot, the GTX 1650 gives close to 50% better frame rates than the GTX 1050 for about the same power usage. It's only 35% higher fps versus my 1050 Ti, but the 1650 also pulls a few watts less than the MSI 1050 Ti, like 4-5 W less, which I will happily take since it's all on the PCIe slot. The 1050 Ti ran right up close to the 75 W limit.
 
Only in 2020 could a GPU fan that blows hot air directly through your CPU cooler be a selling point!

They must assume everyone has some kind of watercooler on the CPU.
I would argue this is even worse for water-cooled systems. There is more likely to be dead air around the CPU and the power components in water-cooled systems than in air-cooled ones. (Not in mine, but yours. ;) )
 
So you think having a direct stream of GPU/heat airflow blowing directly on the CPU area is the same as being diffused with airflow coming into the case? (Serious question.) I can see several different ways to look at this and how air travels through the case and exactly how case airflow is set up.

With a conventional open fan card, the hot GPU air is mostly just recirculated inside the box. It's eventually going to heat up everything, including the CPU.

This new design is a half blower card, so something close to half the heat is exhausted immediately out the back of the case. The second fan will blow the exhaust into the path of the CPU, but this is the next fastest path to get it out of the box, and yes overall it should be better, though probably just neutral for the CPU. Unless you are only thinking in terms of starting up in the first minute or two before the open fan GPU has a chance to heat up the box.

Both fans in this new design are ultimately getting heat out of the box faster, and that is what matters for longer runs.
 
I would argue this is even worse for water-cooled systems. There is more likely to be dead air around the CPU and the power components in water-cooled systems than in air-cooled ones. (Not in mine, but yours. ;) )

Yep. Even with just an AIO on the CPU, you have to get air moving over the VRMs or some exhaust going for the chassis. Custom water seems like the way to really tame these 300+ W cards. Something like 2x 360 mm rads for a CPU + GPU loop, I think.
 
Might be problematic for CPU temperature performance, as I'm still using old-fashioned air cooling (Noctua NH-U12S).

Good for the card temps but bad for my CPU setup. Gotta see what cooling systems the other AIBs are designing.

[image attachment]
Honestly, I expect it to make only a small impact on CPU temperatures compared to current cards, because some of the heat is exiting out the back of the card, not into the case. Sure, the right fan in the picture above exhausts toward the CPU fan, but think about current air-cooled card designs for a second: where does all of the heat go? Out the top and bottom of the card, bouncing off the motherboard and case side panel into the CPU cooler's fan, and ultimately out the back case fan(s). Yes, these new cards will create more heat because of higher wattage compared to current-gen cards, but again, this extra heat output is lessened by the fact that some of the heat (say, 30%?) exits the rear of the card directly and not into the case. So you have 70% of 350 W = 245 W of heat entering the case with a 3080 versus 100% of 225 W = 225 W of heat entering the case with a 2080 Ti. An extra 20 W of heat will increase CPU temps by what, 1-3 degrees C with a decent air cooler (NH-D15 class)? If you have a decent airflow case and you aren't near any thermal limits for the CPU, then it shouldn't matter anyway. Obviously, if only 10% of the heat exits the rear of the card, then a lot more heat will be exhausted into the case and into the CPU cooler's fan, which would impact CPU temps to a higher degree.
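The estimate above can be written out explicitly. The 30% exhaust fraction is a guess, as the post itself notes, and the helper function below is just an illustration of the arithmetic:

```python
# Heat left inside the case when a fraction of the card's board power
# is exhausted straight out the rear bracket. The 30% figure is a guess.
def heat_into_case(board_power_w, exhaust_pct):
    """Watts of heat dumped into the case interior."""
    return board_power_w * (100 - exhaust_pct) // 100

ampere = heat_into_case(350, 30)  # 3080-class card, ~30% exhausted out the rear
turing = heat_into_case(225, 0)   # open-fan card, all heat stays inside
print(ampere, turing, ampere - turing)  # 245 225 20
```

Under these assumptions the new card adds about 20 W of heat to the case interior; drop the exhaust fraction to 10% and the gap grows to roughly 90 W.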
 
This new design is a half blower card, so something close to half the heat is exhausted immediately out the back of the case.
So you know that "half" of the heat is dissipated by each fan path? I think that NVIDIA has been lacking transparency here for a reason.

The second fan will blow the exhaust into the path of the CPU, but this is the next fastest path to get it out of the box, and yes overall it should be better, though probably just neutral for the CPU.
Maybe, maybe not. Reviewers testing in cases are the only way to tell, short of having one in hand personally.

Unless you are only thinking in terms of starting up in the first minute or two before the open fan GPU has a chance to heat up the box.
No, not at all. A diffused airflow after case heatsoak is actually what I am thinking.
 
So you know that "half" of the heat is dissipated by each fan path? I think that NVIDIA has been lacking transparency here for a reason.

I said "something close to half". Granted, I have no idea how close to or far from half it might be. Obviously no one outside of some Nvidia engineers knows how much is exhausted out the back, and I doubt we will get a definitive answer on that.

But it's obviously exhausting more out the back than your typical open-fan card.

Definitely interested in seeing this tested against a typical open-fan AIB card, though it probably won't be top of the agenda for many reviewers, who will be racing to get standard reviews up.
 