RTX 3xxx performance speculation

Thought this was an interesting statement by ASUS about PSUs not being able to keep up with Ampere GPUs and their switching speeds. The video is timestamped to the relevant mark.



Doesn't mention Ampere at that timestamp. It just says that if your PSU is old, it might no longer handle GPU transients.
 
The 3090 is going to be an INSANE scientific computing card. I will be buying two of them immediately and dumping my four RTX 2080 Tis.
For scientific computing applications they will pay for themselves so fast it's not even funny. One of the pieces of software I use charges about $10k/yr per GPU to enable GPU acceleration. Dumping my four RTX 2080 Tis and moving to two RTX 3090s will save me $20k a year in software licensing costs PLUS more than double my compute performance.

That is why they priced it where they did. It's a bargain for the machine learning and scientific computing crowd.
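To put numbers on that, here's a minimal sketch of the licensing math. The $10k/yr-per-GPU figure comes from the post above; the software itself isn't named, so everything else is illustrative:

```python
# Back-of-envelope licensing savings from consolidating GPUs.
# Assumes a flat per-GPU license fee, as described above.

PER_GPU_LICENSE_USD_PER_YEAR = 10_000  # quoted figure from the post

def annual_license_cost(num_gpus: int) -> int:
    """Yearly licensing cost for a machine with num_gpus accelerators."""
    return num_gpus * PER_GPU_LICENSE_USD_PER_YEAR

old_cost = annual_license_cost(4)  # 4x RTX 2080 Ti -> $40,000/yr
new_cost = annual_license_cost(2)  # 2x RTX 3090    -> $20,000/yr
print(f"Annual savings: ${old_cost - new_cost:,}")  # -> Annual savings: $20,000
```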
 
Doesn't mention Ampere at that timestamp. It just says that if your PSU is old, it might no longer handle GPU transients.
Yeah....this being announced today has nothing to do with Ampere and new 400W cards from ASUS. These are also not the droids you are looking for.
 
Yeah....this being announced today has nothing to do with Ampere and new 400W cards from ASUS. These are also not the droids you are looking for.

There is a difference between you inferring something, and it being stated. You claimed they stated it. But they didn't.
 
There is a difference between you inferring something, and it being stated. You claimed they stated it. But they didn't.
You are 100% correct. And you would be 100% wrong to state this is not in response to Ampere power usage. Don't bury your head in the sand.
 
Yeah. I am sure they are quaking in their boots. ;)

It's pretty simple to look at the 5700 XT and extrapolate what AMD could do with that on a bigger die, and then box them in with performance and pricing.

I don't see it so much as worried, as it is laughing maniacally while they cut off AMD's oxygen supply. :D

Hehe, post saved until Big Navi's reveal. Uncle Jensen isn't so generous that he'd give 2080 Ti performance for $500 UNLESS he felt the heat coming from the competition. I remember when some people here stated almost matter-of-factly that Ampere would at best be 30% faster than the 2080 Ti, and I said they were wrong and we'd see a minimum of 50%; looks like I was right. I'm confident Big Navi will compete very nicely, and the same people are saying it won't.
 
I'm selling after I get the 3090. With 30-day returns I'd rather not deal with someone who realizes they overpaid and has the option to send it back.

I don't use eBay; I've been selling on Mercari. It seems to be a lot more seller-friendly, but it works a bit differently. You don't get the funds until the buyer receives the item and verifies it is as expected, but the buyer also isn't able to back out once they commit to buying, or to return it after the fact. They take a flat 10% and funds go right to your bank, no PayPal involved. Cost is comparable to eBay + PayPal fees.

That all said, I've yet to encounter a situation where the buyer lies about the item they received to try and force a return. I was VERY thorough about documenting the card working and about packing adequately, so that I'm prepared to fight any false DOA claims.
 
It's cool that a lot of people are getting the 3090, but I just want to make sure everyone knows there will be a 3080 Ti in 6 months with 95% of the performance at 60% of the price.

We always seem to have some surprised and even angry reactions when this happens. Just making sure everyone understands this.

So you think it's gonna be like the 780 era, with multiple cut-down SKUs (780, 780 Ti, Titan, Titan Black) from the Kepler Titan?
 
Didn't AMD claim a 50% performance-per-watt increase for RDNA2? I think it's pretty impressive that NVIDIA was able to get a 90% perf/watt increase. NVIDIA was helped by going from 12nm to 8nm, and AMD doesn't have that kind of node jump this time since they are already on 7nm. Improvements on the AMD side will likely be more architectural.
 
2080 Tis started popping up on my local CL for $800.
Figure $600 when the 3090 starts shipping?
Yes, I'm aware the 3070 is there to jam up 2080 Ti used values; just spitballing what owners who didn't dump before today will be trying to get.
 
You are 100% correct. And you would be 100% wrong to state this is not in response to Ampere power usage. Don't bury your head in the sand.

I am sure they would like to cash in on FUD around power usage. Gamers Nexus just had a PSU story this morning about how most people go well into overkill on PSU, probably to preempt a lot of the power FUD floating around.
 
I am sure they would like to cash in on FUD around power usage. Gamers Nexus just had a PSU story this morning about how most people go well into overkill on PSU, probably to preempt a lot of the power FUD floating around.
Yep, that is it. Nothing to do with actual power usage and the switching speeds required. Nothing to see here. Move along.
 
Didn't AMD claim a 50% performance-per-watt increase for RDNA2? I think it's pretty impressive that NVIDIA was able to get a 90% perf/watt increase. NVIDIA was helped by going from 12nm to 8nm, and AMD doesn't have that kind of node jump this time since they are already on 7nm. Improvements on the AMD side will likely be more architectural.

That's going to depend on which SKUs you compare. The Digital Foundry 3080-to-2080 comparison implies more of a 40-50% perf/watt increase, if we use the ~215W average power consumption for the 2080, take the 320W TDP as representative of typical draw for the 3080, and assume the 3080 is run closer to the top of its power efficiency curve than the 2080 was.

In general you need to take NVIDIA's perf/watt claims with a grain of salt. They also claimed Turing was 50% better, but in reality it was probably closer to 10-15% once you actually measured power draw at the wall against performance.
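Since perf/watt numbers get thrown around a lot here, a minimal sketch of the underlying arithmetic: the gain is just a performance ratio divided by a power ratio. The 215W/320W figures are the assumptions from this post; the 1.75x uplift is a placeholder, not a measurement, so treat the output as illustrative:

```python
# Perf/watt gain = (performance ratio) / (power ratio) - 1.
# Power figures below are the assumptions discussed above;
# the performance uplift is a placeholder -- plug in measured numbers.

def perf_per_watt_gain(perf_ratio: float, power_old: float, power_new: float) -> float:
    """Fractional perf/watt improvement of the new card over the old."""
    return perf_ratio / (power_new / power_old) - 1.0

# Hypothetical: a 75% faster 3080 at 320W vs a 2080 at ~215W
gain = perf_per_watt_gain(perf_ratio=1.75, power_old=215.0, power_new=320.0)
print(f"perf/watt improvement: {gain:.0%}")  # ~18% with these inputs
```

Small changes in the assumed power draw swing the result a lot, which is exactly why wall measurements matter.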
 
I am sure they would like to cash in on FUD around power usage. Gamers Nexus just had a PSU story this morning about how most people go well into overkill on PSU, probably to preempt a lot of the power FUD floating around.

I dunno about that; 750W seems pretty common for the average custom build. With a 400W card and a 125W+ CPU, I'd think you're getting into some dangerous territory.
 
I dunno about that; 750W seems pretty common for the average custom build. With a 400W card and a 125W+ CPU, I'd think you're getting into some dangerous territory.

Using PCPartPicker's system builder tool I spec'd out a future build (adding 400 watts for an Ampere GPU) and the total wattage came to 600 watts... I always like to leave 100 watts of headroom... so 700 watts.

I'm debating picking up an 850 watt PSU, but I think 750W should be fine.
 
Using PCPartPicker's system builder tool I spec'd out a future build (adding 400 watts for an Ampere GPU) and the total wattage came to 600 watts... I always like to leave 100 watts of headroom... so 700 watts.

I'm debating picking up an 850 watt PSU, but I think 750W should be fine.
Our rule of thumb for recommendations: calculate your wattage and add 20%.
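A minimal sketch of that rule of thumb; the component wattages below are hypothetical, not a build recommendation:

```python
import math

# PSU sizing per the rule of thumb above: sum the estimated component
# draw, then add 20% headroom.

def recommended_psu_watts(component_watts: list[int], headroom: float = 0.20) -> int:
    """Estimated total draw plus a safety margin, rounded up."""
    return math.ceil(sum(component_watts) * (1.0 + headroom))

# Hypothetical build: 400W GPU, 125W CPU, ~75W for board/RAM/drives/fans
print(recommended_psu_watts([400, 125, 75]))  # -> 720, so a 750W unit fits
```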
 
And the part of the video I pointed to said nothing close to that. Jesus, are you such an NVIDIA fanboy that you get bent out of shape this easily?

Probably a good reminder for everyone to make sure they have a quality PSU. I'm not sure why anyone ever skimps on the PSU, it's generally a part you can keep for >5 years and multiple builds.
 
Seems to me ASUS is saying that, because of the sensitive nature of newer hardware, the days of skimping on a PSU are over. You better have a GOOD PSU if you're going to run Ampere. No-name brands need not apply, and even name brands with marginal voltage regulation are out. Do your research and buy a well-reviewed PSU.

Gots to pay to play.
 
Guess this 11-year-old Corsair AX1200 is gonna play a game of fuck around and find out.

My AX1200i is about 7 years old now but I monitor the rails and they’re solid so I’m not worried. This thing was built to last.
 
Hehe, post saved until Big Navi's reveal. Uncle Jensen isn't so generous that he'd give 2080 Ti performance for $500 UNLESS he felt the heat coming from the competition. I remember when some people here stated almost matter-of-factly that Ampere would at best be 30% faster than the 2080 Ti, and I said they were wrong and we'd see a minimum of 50%; looks like I was right. I'm confident Big Navi will compete very nicely, and the same people are saying it won't.

For some reason your post has a Yahtzee Croshaw tone to it.
 
For some reason your post has a Yahtzee Croshaw tone to it.

(image attachment)
 
I am sure they would like to cash in on FUD around power usage. Gamers Nexus just had a PSU story this morning about how most people go well into overkill on PSU, probably to preempt a lot of the power FUD floating around.
When purchasing a PSU with a 10-year warranty, it makes sense to plan ahead somewhat.
I like to have a PSU with close to 50% headroom, to be sure it will last through 10 years of newer, more powerful hardware without issue!
Another benefit is the fan staying off or at low speed; pushing a small PSU hard can add too much noise.
 
My AX1200i is about 7 years old now but I monitor the rails and they’re solid so I’m not worried. This thing was built to last.
How are you monitoring the rails?
My GPU has a voltmeter breakout cable, so I suppose I can monitor the 12V from that.
 
I'm rocking a 1200 P2 because that's all that was available over 650W the day my 1080 Ti choked out my old Seasonic 760.
Height of crypto mining.
It was either the usual Seasonic 650 for $89 (because all the discounts went away) or $109 for the EVGA 1200.
Wonder what PSU shopping will be like in 7-8 years.
 
My AX1200i is about 7 years old now but I monitor the rails and they’re solid so I’m not worried. This thing was built to last.

LOL, I haven't had a 1000 watt PSU since my SLI OVENS, aka GTX 480 Fermis. Rocking a new Seasonic 650W for my current 9700K / 2080 Super build.

Side note, I'm honestly shocked they are saying the $499 3070 is faster than the 2080 Ti. What a time to be alive.
 
LOL, I haven't had a 1000 watt PSU since my SLI OVENS, aka GTX 480 Fermis. Rocking a new Seasonic 650W for my current 9700K / 2080 Super build.

Side note, I'm honestly shocked they are saying the $499 3070 is faster than the 2080 Ti. What a time to be alive.

I think the reason for the pricing is that they're preparing for AMD's RDNA2.
 
How are you monitoring the rails?
My GPU has a voltmeter breakout cable, so I suppose I can monitor the 12V from that.
I doubt a voltage breakout cable from a gfx card is meant to measure the incoming voltage; that's easily achieved directly at the PSU.
It will be for a voltage that's otherwise difficult to capture, such as the one fed to the GPU.
I could be wrong, but I'd be surprised.
 
I think the reason for the pricing is that they're preparing for AMD's RDNA2.


Even if that is the case, as much as I want to believe the assholes over at AMD will finally, truly have something great that competes with these cards, I believe they won't. I hope I'm wrong, but AMD has a track record of letting us down. This news from NVIDIA today is beyond what anyone could have imagined. I mean, holy shit. Faster than the 2080 Ti at $499??? This might bring EVERYONE over to the PC Master Race.
 
Even if that is the case, as much as I want to believe the assholes over at AMD will finally, truly have something great that competes with these cards, I believe they won't. I hope I'm wrong, but AMD has a track record of letting us down. This news from NVIDIA today is beyond what anyone could have imagined. I mean, holy shit. Faster than the 2080 Ti at $499??? This might bring EVERYONE over to the PC Master Race.
I dare not be a test driver with their drivers!
Been there, tried it a few times, crashed and burned.
 
(image attachment)
Useful info. This will probably make cable management kind of shitty, since most 8-pins have a 6+2-pin attached to them.
 
I dare not be a test driver with their drivers!
Been there, tried it a few times, crashed and burned.

Which is exactly why I am going Nvidia. Been burnt too many times by AMD GPUs. Now their CPUs, well, you will need Ryzen 3000 with PCIe 4.0 for the big boys (3080, 3090). Otherwise you will have some performance loss staying with Intel and PCIe 3.0 if you plan to game at 1440p and 4K+.
 
Which is exactly why I am going Nvidia. Been burnt too many times by AMD GPUs. Now their CPUs, well, you will need Ryzen 3000 with PCIe 4.0 for the big boys (3080, 3090). Otherwise you will have some performance loss staying with Intel and PCIe 3.0 if you plan to game at 1440p and 4K+.

Wait, god damn it!

So my 9700K on PCIe 3.0 will degrade the performance of a 3080/3090?
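For context on what that claim hinges on, here's a quick back-of-envelope calculation of x16 link bandwidth from the published per-lane signaling rates. Whether a GPU actually saturates the link in games is a separate question, and real-world gaming differences between the two have typically been small:

```python
# Usable PCIe x16 bandwidth (one direction) from signaling rate
# and line encoding. Gen3: 8 GT/s, 128b/130b; Gen4: 16 GT/s, same.

def x16_bandwidth_gb_s(gt_per_s: float, encoding: float = 128 / 130) -> float:
    """Approximate GB/s for a x16 link in one direction."""
    return gt_per_s * encoding * 16 / 8  # 16 lanes, 8 bits per byte

print(f"PCIe 3.0 x16: {x16_bandwidth_gb_s(8):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x16: {x16_bandwidth_gb_s(16):.2f} GB/s")  # ~31.51 GB/s
```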
 