RTX 3xxx performance speculation

Which way does the back fan blow? Toward the RAM on the motherboard, or through the fins, pushing hot air toward the GPU-fan side? This design doesn't make much sense to me. If this cooler costs $150 and removes 350 W+, what can AIBs actually improve on cooling-wise? They can't really make it much bigger; water cooling is the only clear improvement I can see, and my first thought is that anything else AIBs do will be equal to it at best. If custom boards with more traditional cooling were better, I don't think Nvidia would have wasted their time on this design. Only a few more days before we get some better answers.
 
These furnace TDPs worry me. I mean, is the 3080 actually faster than a 2080 Ti in the same power envelope, or is it just a 2080 Ti with faster RAM? And don't even get me started on the 3090. For people in ITX setups that are temperature limited, I really wonder what we're looking at once you reduce the power limit to accommodate a weaker cooling setup. Maybe 10-15% faster than a 2080 Ti?
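(For what it's worth, you can already test that scenario today: assuming the existing nvidia-smi interface carries over, something like "nvidia-smi -pl 250" caps the board power at 250 W, so it should be easy to see how much a 320-350 W card loses when held to 2080 Ti-class power.)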
 
I think I'll wait to see the full details, and then the user experience and the trend. So far it looks like a disaster, and I'm hoping that card design is more of a joke than what will actually be released. Then again, if it's all good and it runs like a champ, sign me up for one. I doubt it will be two this time around. I can also wait to see AMD's hand.
 
It's happening.

https://twitter.com/9550pro/status/1297457860222152705

 
According to VCZ it's FE only, and Nvidia is including an adapter for existing cables.
Why even bother using the 12-pin...
 
According to VCZ it's FE only, and Nvidia is including an adapter for existing cables.
Why even bother using the 12-pin...
For me, I would use it because I don't like adapters. Weird, I know, but I'll make a custom cable before using one.
 
According to VCZ it's FE only, and Nvidia is including an adapter for existing cables.
Why even bother using the 12-pin...

Not only that: using adapters is objectively worse. They add another connection point that can fail, and they increase the cable length, which increases resistance and heating.
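As a rough sketch of how much that matters (the ~350 W on the 12 V rail and the ~5 mOhm of extra adapter resistance below are assumed figures, not measurements):

# back-of-envelope only; both input figures are assumptions
power_w = 350                                 # assumed board power on the 12 V rail
rail_v = 12.0
extra_r_ohm = 0.005                           # assumed extra resistance from adapter contacts + added length
current_a = power_w / rail_v                  # ~29 A total across the 12 V pins
extra_loss_w = current_a ** 2 * extra_r_ohm   # ~4.3 W dissipated in the adapter
print(f"{current_a:.1f} A, {extra_loss_w:.1f} W extra heat in the adapter")

A few extra watts isn't much on its own, but it's concentrated right at the connector, which is the last place you want more heat.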
 
Why would you need 850 W minimum? Let's say 400 W worst case for the GPU, 200 W for the CPU, and maybe 50 W at most for everything else at peak. Unless you run applications that stress an overclocked Intel CPU and the GPU at 100% simultaneously, I don't see a reason to require 850 W.
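Putting rough numbers on that (same assumed peaks as above, not measurements):

# worst-case sum from the figures above; all assumed, not measured
gpu_w, cpu_w, rest_w = 400, 200, 50
total_w = gpu_w + cpu_w + rest_w              # 650 W
print(total_w, f"{total_w / 850:.0%} of 850 W", f"{total_w / 750:.0%} of 750 W")

Even a decent 750 W unit would only sit around 87% load in that worst case, and real workloads rarely pin the CPU and GPU at 100% at the same time.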
 
Well, I'm not going to buy another PSU for this card, so I guess I'm going for the 3080. I own three 750-watt PSUs; one is a brand-new EVGA G5.
 
Damn that 3090 really looks juicy. I usually wait to see what the requirements for the next Battlefield are. Maybe GTA 6 will be a killer app for this?

I’m sort of worried that if I buy into this card now, in a year or two there will be games built for RTX.
If you are able to hold off on a purchase, do so; you will be happier waiting to see how the launches play out. An additional bonus is that you could wait until there is actual supply to go along with the cards that rock the benchmarks in the games that interest you. You will not say "I wish I had bought on day 1" again like with the 1080 Ti; that anomaly happens once in a lifetime.

Three times in my lifetime: 9700 Pro, 8800 GTX, and the 1080 Ti.
 
Damn that 3090 really looks juicy. I usually wait to see what the requirements for the next Battlefield are. Maybe GTA 6 will be a killer app for this?

I’m sort of worried that if I buy into this card now, in a year or two there will be games built for RTX.


Three times in my lifetime: 9700 Pro, 8800 GTX, and the 1080 Ti.

I suspect that Cyberpunk 2077 will be used as the "game" to promote Ampere's performance.
 
Cyberpunk 2077 was supposed to be out in April but got delayed, so I'm not sure if the delay was Nvidia's idea.
 
I suspect that Cyberpunk 2077 will be used as the "game" to promote Ampere's performance.
I can see that. I remember them showing it running just fine without ray tracing on a 1080 Ti for the initial play demo. With the last play sessions described as struggling with ray tracing on a 2080 Ti, they're signaling it will need some serious power to fix that, even with optimization. I'm also curious how far behind AMD's ray tracing implementation will be compared to Ampere. Common sense says they're at a deficit; the question is just how far. I wouldn't be surprised if they were a step behind Turing.
 
Why would you need 850 W minimum? Let's say 400 W worst case for the GPU, 200 W for the CPU, and maybe 50 W at most for everything else at peak. Unless you run applications that stress an overclocked Intel CPU and the GPU at 100% simultaneously, I don't see a reason to require 850 W.
They probably do this for people that get crappy 850-watt PSUs for $50.

I would be shocked if my EVGA 750-watt 80+ Gold that I've had for 5 years couldn't handle this beast.
 
Damn that 3090 really looks juicy. I usually wait to see what the requirements for the next Battlefield are. Maybe GTA 6 will be a killer app for this?

I’m sort of worried that if I buy into this card now, in a year or two there will be games built for RTX.


Three times in my lifetime: 9700 Pro, 8800 GTX, and the 1080 Ti.


I've done it a few times... and usually went overboard and bought two for SLI or CrossFire.

I held out for the two 1080 Tis in my system right now.

Skipped the last generation as none of the offerings could replicate the performance in a single card.

Hoping one of the 3xxx cards will be able to.

Not sweating the power requirements, my case is the form factor of a small fridge and I have not had a power supply rated at less than 1000W in my system for many years.

Now let's make with the release already!
 
Anyone with a 2080 Ti upgrading? I want the best possible experience with CP2077 so I feel like I am going to go with whatever the top tier card will be. Hard to get around a possible $1500 card though...
 
Which way does the back fan blow? Toward the RAM on the motherboard, or through the fins, pushing hot air toward the GPU-fan side? This design doesn't make much sense to me. If this cooler costs $150 and removes 350 W+, what can AIBs actually improve on cooling-wise? They can't really make it much bigger; water cooling is the only clear improvement I can see, and my first thought is that anything else AIBs do will be equal to it at best. If custom boards with more traditional cooling were better, I don't think Nvidia would have wasted their time on this design. Only a few more days before we get some better answers.

It looks like the fans blow into the heatsink from each side, and the ‘X’ pattern of the fins channels the heat away from the GPU into the surroundings. Meaning 400 W of heat dumped into your case. Toaster time.

I could be wrong, of course.
 
I just thought of something with the new adapter. My PSU uses cables with capacitors. Would using a cable or adapter without capacitors potentially cause any issues with my PSU?
 
It looks like the fans blow into the heatsink from each side, and the ‘X’ pattern of the fins channels the heat away from the GPU into the surroundings. Meaning 400 W of heat dumped into your case. Toaster time.

I could be wrong, of course.

The prevailing opinion is that it takes in cool air above the card and blows it below, and the second fan is a partial blow-through that exhausts some of it:
https://twitter.com/cataclysmza/status/1297228015378870274
 
I just thought of something with the new adapter. My PSU uses cables with capacitors. Would using a cable or adapter without capacitors potentially cause any issues with my PSU?

You are either using adapters, in which case you still have the capacitors in your cables, or getting purpose-built cables from your PSU company, which would presumably have capacitors similar to the ones they replace.
 
I’m totally fine with the wattages when the performance is there. I ran a custom water-cooled Vega 64 and of course it drank wattage when overclocked. It’s VERY likely that the 3090 will make good use of the power draw. We haven’t seen the cooler in action, so I reserve judgment until the facts come in. The 2080 Super I have now is significantly stronger than the 64 for a 250-265 W draw and worth the juice.

edit: I really thought I hit reply to a post on this one. Oh well...
 
Found an interesting interview.
Jensen is hyping us up, but he can't do that unless he has something to show for it, or his reveal will be met with disappointment.
Well, I'm hyped.

GamesBeat: Is it a good guess that you’re going to reveal Ampere chips for games and desktops on September 1?

Huang:
Well, on September 1, I’ll have some really exciting news to tell you. But I don’t want to ruin it for you. I have to have some surprises for you. You’re hard to surprise. I’ve gotta surprise you.

GamesBeat: That must feel good because it’s a validation of the AI strategy, the data strategy.

Huang:
The AI strategy and the datacenter strategy are working out well. We were right that AI processing is going to require acceleration, and the Ampere architecture is the biggest generational leap we’ve ever had. It was a home run.


https://venturebeat.com/2020/08/22/how-nvidia-ceo-motivates-himself-with-funny-paranoia/amp/
 
Big Boi: https://videocardz.com/newz/nvidia-geforce-rtx-3090-graphics-card-pictured

Kind of seems like Samsung 8nm is going to produce some Fermi-like wattages and temps. At least the performance will likely be insane, but I guarantee everyone that buys one is going to be pissed about a 7nm TSMC refresh next year.
Jesus fuck no they won't. The only people that believe that are the jealous fucks that can't afford one themselves. This is brought up every time a refresh comes out and I don't see the outrage. People are perfectly fine buying the top-tier card at release. People would never upgrade if they worried about what is coming next.
 
Jesus fuck no they won't. The only people that believe that are the jealous fucks that can't afford one themselves. This is brought up every time a refresh comes out and I don't see the outrage. People are perfectly fine buying the top-tier card at release. People would never upgrade if they worried about what is coming next.

Normal people would enjoy the performance and if a refresh comes out you can either sell what you have and put it towards the refresh or wait until next gen. Who TF gets upset and cries in a corner when something new comes out that is a hair faster? I can’t picture someone doing so. Keyboard heroes and their vibrant imaginations 😂.
 
These furnace TDPs worry me. I mean, is the 3080 actually faster than a 2080 Ti in the same power envelope, or is it just a 2080 Ti with faster RAM? And don't even get me started on the 3090. For people in ITX setups that are temperature limited, I really wonder what we're looking at once you reduce the power limit to accommodate a weaker cooling setup. Maybe 10-15% faster than a 2080 Ti?


Same for me. I'm tired of the space heater under my desk, solely because of the GPU.
Went from 220W GTX 570 to a 250W GTX 780, then to dual 780 SLI, and now a single 250W 980Ti.

I was hoping that the upper range offerings of this new 3000 series would peak around 200-220W TDP for the 3080Ti/3090, so I'm a bit shocked to see them weighing in around (a speculated) 320-350W.

If the TDP ratings turn out to be true, then I may just forego nVidia and get an AMD next-gen, if their Navi2 thermals and TDP are substantially better.
 
Same for me. I'm tired of the space heater under my desk, solely because of the GPU.
Went from 220W GTX 570 to a 250W GTX 780, then to dual 780 SLI, and now a single 250W 980Ti.

I was hoping that the upper range offerings of this new 3000 series would peak around 200-220W TDP for the 3080Ti/3090, so I'm a bit shocked to see them weighing in around (a speculated) 320-350W.

If the TDP ratings turn out to be true, then I may just forego nVidia and get an AMD next-gen, if their Navi2 thermals and TDP are substantially better.
Keep buying 225 W+ GPUs and you will have heat. If it bothers you that much, liquid cool or downgrade your GPUs in the future.
If AMD's thermals are "substantially better", there will most likely be a performance penalty.
 
I do not buy that there will be a TSMC 7nm refresh next year... When is the last time NVIDIA did a refresh, and would it make any economic sense at all? I suspect that the relatively minimal performance gain would not justify the material development cost.
 
If the TDP ratings turn out to be true, then I may just forego nVidia and get an AMD next-gen, if their Navi2 thermals and TDP are substantially better.
You might as well just go with a less expensive Nvidia product then. If AMD's TDPs are that much lower, their performance will be even worse.
Keep buying 225 W+ GPUs and you will have heat. If it bothers you that much, liquid cool
It should be said that unless the tubing for the liquid cooling runs into another room, it's going to dump the same heat into the same room, just more efficiently.
 