RTX 3xxx performance speculation

Gigabyte breakout power boxes have been failing on users while the /nvidia mods are doing their PR thing and deleting all the posts with pictures. Took screenshots of this thread, which linked to a larger one that was deleted while I was reading through the comments. Someone said this wasn't an extra point of failure. Seems that wasn't true. It's a simple repair, but now you have people sending cards back because the connectors on the breakout box are being pushed through.

Lol Gigabyte messing up a simple connector. It doesn't have to be a point of failure, but when you go cheap enough anything can be.
 
Gigabyte breakout power boxes have been failing on users while the /nvidia mods are doing their PR thing and deleting all the posts with pictures. Took screenshots of this thread, which linked to a larger one that was deleted while I was reading through the comments. Someone said this wasn't an extra point of failure. Seems that wasn't true. It's a simple repair, but now you have people sending cards back because the connectors on the breakout box are being pushed through.


Huh, it's almost as if I said this was a dumb idea from the moment I saw it...
 
Huh, it's almost as if I said this was a dumb idea from the moment I saw it...
And another one. But they get points for not fighting the fact that they made a bonehead move, and it seems they're processing the RMAs smoothly. They probably wouldn't have needed to if they hadn't cheaped out, though.
 
I imagine it's a combo of not enough parts and wanting to be sure to have a lot of stock on hand. They know this SKU is going to sell like crazy.
Yep, the delay is disappointing, but they don't want a situation where only 4% of people are able to get these cards. Especially if they had dropped on the 15th and gone OOS immediately, with AMD having options at the end of the month as well.
 
I think AMD got something big this time around. Probably a 3080 competitor.

Honestly, even if they did have something with 3080-level performance, I'm not sure I'd be interested. Even with equal raster performance, I think most people are expecting AMD's RT to be worse than Nvidia's. And while I don't care a whole lot about RT, the fact that it's in both of the new consoles means we're probably going to see much more frequent implementations of it going forward. The big thing for me, though, is DLSS. IMO, that is game-changing technology. Have there been any rumors that AMD has an answer to this? Without it, I feel like even with equally performing GPUs, Nvidia has the edge, given that DLSS will both look and perform better, and as adoption improves it means one card potentially has a lot more longevity than the other. If AMD launches a 16GB card with 3080 performance at $499, maybe they have my attention (but we all know that's not going to happen). If AMD launches that same card at $699, for equal money I'd rather have Nvidia's feature set and superior drivers over AMD's extra 6GB of VRAM.

I'm certainly not rooting against AMD... if there is one thing we've seen by comparing this generation to the last... it's that we absolutely need competition. I hope they release a solid card that puts a much needed fire under Nvidia's ass. I still don't see myself buying it.
 
Honestly, even if they did have something with 3080-level performance, I'm not sure I'd be interested. Even with equal raster performance, I think most people are expecting AMD's RT to be worse than Nvidia's. And while I don't care a whole lot about RT, the fact that it's in both of the new consoles means we're probably going to see much more frequent implementations of it going forward. The big thing for me, though, is DLSS. IMO, that is game-changing technology. Have there been any rumors that AMD has an answer to this? Without it, I feel like even with equally performing GPUs, Nvidia has the edge, given that DLSS will both look and perform better, and as adoption improves it means one card potentially has a lot more longevity than the other. If AMD launches a 16GB card with 3080 performance at $499, maybe they have my attention (but we all know that's not going to happen). If AMD launches that same card at $699, for equal money I'd rather have Nvidia's feature set and superior drivers over AMD's extra 6GB of VRAM.

I'm certainly not rooting against AMD... if there is one thing we've seen by comparing this generation to the last... it's that we absolutely need competition. I hope they release a solid card that puts a much needed fire under Nvidia's ass. I still don't see myself buying it.

Same exact feeling here. Zen 3 I am excited about, Big Navi not so much. But I really do hope they are competitive for everyone's sake.
 
^^ Doesn't look like there will be real inventory on the 3080 or 3090 this month, and that's me being optimistic. Reality is it will likely be much longer.

I'm starting to be in the "might as well wait for AMD" camp.

Much as I would like to upgrade... nothing is motivating me off the 2x 1080 Tis right this second other than the upgrade bug and wanting a single-card solution.
 
Honestly, even if they did have something with 3080-level performance, I'm not sure I'd be interested. Even with equal raster performance, I think most people are expecting AMD's RT to be worse than Nvidia's. And while I don't care a whole lot about RT, the fact that it's in both of the new consoles means we're probably going to see much more frequent implementations of it going forward. The big thing for me, though, is DLSS. IMO, that is game-changing technology. Have there been any rumors that AMD has an answer to this? Without it, I feel like even with equally performing GPUs, Nvidia has the edge, given that DLSS will both look and perform better, and as adoption improves it means one card potentially has a lot more longevity than the other. If AMD launches a 16GB card with 3080 performance at $499, maybe they have my attention (but we all know that's not going to happen). If AMD launches that same card at $699, for equal money I'd rather have Nvidia's feature set and superior drivers over AMD's extra 6GB of VRAM.

I'm certainly not rooting against AMD... if there is one thing we've seen by comparing this generation to the last... it's that we absolutely need competition. I hope they release a solid card that puts a much needed fire under Nvidia's ass. I still don't see myself buying it.

AMD has its own DLSS implementation and since AMD GPUs are in both Xbox and PS5, I'm willing to bet it will have a wider adoption rate than DLSS.
I wouldn't dismiss AMD this time around.
 
AMD has its own DLSS implementation and since AMD GPUs are in both Xbox and PS5, I'm willing to bet it will have a wider adoption rate than DLSS.
I wouldn't dismiss AMD this time around.

I did some quick and dirty googling and couldn't find anything other than speculation. Got any additional info?
 
FidelityFX

Isn't FidelityFX more or less a sharpening filter? That's definitely not the same as DLSS. It may produce a better-looking image, but DLSS produces a better-looking image while also substantially improving performance. I don't really have data to support this, but my guess is that on a 4K display, DLSS @ 1440p would look better than setting a game to render at 1440p and enabling FidelityFX.

Also, my understanding is that FidelityFX is GPU-agnostic. Can't it be run on Nvidia hardware (and in conjunction with DLSS on titles that support both)? Obviously I've never used it, but from all the information I can find, it's not comparable to DLSS.
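
For a rough sense of why rendering internally at 1440p and upscaling saves so much over native 4K, here's some purely illustrative pixel math (not benchmark data, and not specific to either technique):

```python
# Pixel counts only; actual DLSS / FidelityFX cost depends on the game and GPU.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1440p = 2560 * 1440  # 3,686,400 pixels

print(f"1440p renders {internal_1440p / native_4k:.0%} of the pixels of native 4K")  # ~44%
```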
 
Interesting take using data available on the internet: performance increase per generation, power increase per generation, and performance/$ increase per generation across many Nvidia GPU generations. The gist is that Turing and Ampere don't look too good:

 
Nvidia delaying 3070 to the 29th: maybe it has something to do with AMD's Big Navi release date the day before, on the 28th?
 
Interesting take using data available on the internet: performance increase per generation, power increase per generation, and performance/$ increase per generation across many Nvidia GPU generations. The gist is that Turing and Ampere don't look too good:


Look at it this way: you can overclock a 2080 Ti to 320W and it gains about 10% on the 3080. Or you can downclock the 3080 to 250W and it loses about 10% performance.

The 3080 effectively comes from the factory with a massive OC, which destroys efficiency gains but gives a minor performance boost. The real generational jump is about 20%. It's not as good as Pascal, but it's also unfair to act like every generation that doesn't match Pascal is a disappointment.
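
A rough sanity check on that ~20% figure, using the downclock number from this post plus one assumption of my own (that a stock 3080 is around 30% faster than a stock 2080 Ti, roughly what 4K launch reviews showed); a sketch, not measured data:

```python
# Assumption (mine, not from the post): stock 3080 is ~30% faster than a stock 2080 Ti.
stock_gap = 1.30            # stock 3080 vs. stock 2080 Ti
downclock_keep_3080 = 0.90  # 3080 limited to ~250 W keeps ~90% of its performance (per the post)

# Power-limited 3080 vs. stock 2080 Ti, both at roughly 250 W board power.
print(f"matched at ~250 W: {stock_gap * downclock_keep_3080 - 1:.0%} faster")  # ~17%
```

That lands close enough to the ~20% estimate above.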
 
Interesting take using available data on the internet, performance increase per generation, power increase per generation, performance/$ increase per generation, Turing and Ampere does not look too good is the jest for many Nvidia generation of GPUs:


I tried to point this out earlier. The basic math of it is that the 3080/3090 specifically are some of the worst generational gains (especially in terms of power) that Nvidia has ever had. It's math, people, not an opinion. Whatevs, I'll just wait and see what happens.
 
Look at it this way: you can overclock a 2080 Ti to 320W and it gains about 10% on the 3080. Or you can downclock the 3080 to 250W and it loses about 10% performance.

The 3080 effectively comes from the factory with a massive OC, which destroys efficiency gains but gives a minor performance boost. The real generational jump is about 20%. It's not as good as Pascal, but it's also unfair to act like every generation that doesn't match Pascal is a disappointment.

A 3080 @ 250W is closer to a 5% performance loss. You can go all the way down to 210W before hitting the 10% mark (0.743V / 1710MHz).
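
Taking those figures at face value, here's the perf/W they imply (stock 3080 assumed at 320W board power and normalized to 100; illustrative only):

```python
# watts -> relative performance, using the numbers from the post above.
points = {320: 100.0, 250: 95.0, 210: 90.0}

base = points[320] / 320
for watts, perf in points.items():
    ppw = perf / watts
    print(f"{watts} W: perf/W {ppw:.3f} ({ppw / base - 1:+.0%} vs stock)")
# 250 W works out to roughly +22% perf/W, and 210 W to roughly +37%.
```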
 
A 3080 @ 250W is closer to a 5% performance loss. You can go all the way down to 210W before hitting the 10% mark (0.743V / 1710MHz).
So Nvidia added 70-100W for a 5% performance gain??? That actually sounds rather stupid: adding that much power for virtually nothing, while increasing the cost of the cooler, the size, the electrical components needed to support it . . .
 
So Nvidia added 70-100W for a 5% performance gain??? That actually sounds rather stupid: adding that much power for virtually nothing, while increasing the cost of the cooler, the size, the electrical components needed to support it . . .
what are you comparing it to?
 
Look at it this way: you can overclock a 2080 Ti to 320W and it gains about 10% on the 3080. Or you can downclock the 3080 to 250W and it loses about 10% performance.

The 3080 effectively comes from the factory with a massive OC, which destroys efficiency gains but gives a minor performance boost. The real generational jump is about 20%. It's not as good as Pascal, but it's also unfair to act like every generation that doesn't match Pascal is a disappointment.
Did you watch the video? 20% is not remotely the average generational jump. He went with the top consumer-die products and mapped out the generational jumps. Turing was the worst and Ampere is the third-worst generational jump.
 
So Nvidia added 70-100W for a 5% performance gain??? That actually sounds rather stupid: adding that much power for virtually nothing, while increasing the cost of the cooler, the size, the electrical components needed to support it . . .

Samsung 8nm is at least half a node behind TSMC 7nm, so they probably felt pressured to present as much up-front performance as possible. They probably suspected that otherwise they would lose mindshare. After all, most media outlets mindlessly present stock performance and rarely give a nuanced explanation of the tradeoffs.
 
Samsung 8nm is at least half a node behind TSMC 7nm, so they probably felt pressured to present as much up-front performance as possible. They probably suspected that otherwise they would lose mindshare. After all, most media outlets mindlessly present stock performance and rarely give a nuanced explanation of the tradeoffs.
Well, that would be a gamble that falls apart if AMD actually has something decent. Of course, comparisons can also be difficult. DLSS, for example, and the performance boost from that tech can be utterly amazing. If, say, 30% or more of newly launched games included DLSS, a batch of titles where that tech improves not only performance but also IQ would dramatically change the picture on performance uplifts. The current issue is the near-total lack of support for it, with only a handful of games that support it well. Then again, AMD may have an upscaler usable in, say, all DX12 or Vulkan games, with a better performance uplift but not as good IQ, yet more broadly usable. Apples-to-apples comparisons may become unfair, if not very difficult, when dealing with different techs and outcomes. As for which process is used, that is unimportant to most buyers; it is the result or outcome that matters most.
 
AMD has its own DLSS implementation and since AMD GPUs are in both Xbox and PS5, I'm willing to bet it will have a wider adoption rate than DLSS.
I wouldn't dismiss AMD this time around.


MLID said this week they would not. He said RTG would have a newer "Sharpening Tool" like the one from 2019. This will not be a real-time AI image-reconstruction tool that does upscaling and also raises frame rates like DLSS does. I hope AMD comes correct this time out, but unless you have better data than he does, let's not assume.
 
MLID said this week they would not. He said RTG would have a newer "Sharpening Tool" like the one from 2019. This will not be a real-time AI image-reconstruction tool that does upscaling and also raises frame rates like DLSS does. I hope AMD comes correct this time out, but unless you have better data than he does, let's not assume.
It's in the consoles. Just don't know if AMD has an implementation of it. Sony specifically has patents.

Sony patents tech similar to Nvidia's DLSS, could include it in PS5 consoles to achieve higher FPS counts in 4K games
 
Hey, let's not forget it took Nvidia almost exactly one year after the launch of Turing to implement real DLSS in two games. And a year after that, only a few games have it.

It's cool tech. But it's barely there in terms of real-world meaning to the customer.

And as cool as it is, it definitely has drawbacks. I much prefer Death Stranding without DLSS.
 
Hey, let's not forget it took Nvidia almost exactly one year after the launch of Turing to implement real DLSS in two games. And a year after that, only a few games have it.

It's cool tech. But it's barely there in terms of real-world meaning to the customer.

And as cool as it is, it definitely has drawbacks. I much prefer Death Stranding without DLSS.
How is DS with DLSS?
 
How is DS with DLSS?
I've been meaning to make a video. IMO, places like Digital Foundry curiously avoided some of the negative aspects. The only negative they cited is that objects without motion vectors programmed into the engine show a blur trail with DLSS. In the case of Death Stranding, that works in its favor, as this visual artifact actually resembles other visual effects in the game.

In a nutshell: DLSS in Death Stranding results in amazing-quality anti-aliasing, which means better model detail and stability during motion. The general impression is indeed a higher-resolution image, not just an upscale.

However, once you pay attention to other aspects, some clear issues arise.

1. The overall impression reminds me of H.264 compression. There is a certain character imparted to the overall image, which makes it look unlike what you get when you actually render the game directly. I don't think the vegetation fares particularly well in this regard, and it's everywhere. It lowers the visual depth of the image and I don't like the way it looks. Similarly, distant objects can lose some of their impression of depth.

2. It also tends to blunt certain parts of the image. Raindrops are much skinnier with DLSS, to the point that heavy rain "feels" much lighter. And any kind of shiny visual effect is very blunted. The main character has a jacket with a metallic-flake stripe on it. With regular rendering, it sparkles and gives a convincing impression of metal-flake paint or something like that. With DLSS, the sparkle is notably blurred and blunted. To be fair, TAA also blurs this effect, but DLSS puts an extra bluntness on it. Later in the game there are other shiny and sparkling effects which are similarly affected.

Overall, I prefer the in-game FXAA and then just letting my GPU upscale the image. Aliasing is worse, but the visual effects and depth are better, IMO.

They did do an update to DLSS in Death Stranding like two weeks ago. I haven't checked it out yet. I know it added an option to scale up from an even lower base resolution, but I don't know if there were any quality improvements.
 
I tried to point this out earlier. The basic math of it is that the 3080/3090 specifically are some of the worst generational gains (especially in terms of power) that Nvidia has ever had. It's math, people, not an opinion. Whatevs, I'll just wait and see what happens.

Just based on power/perf, the 3000 series is possibly Nvidia's worst release.
https://videocardz.com/newz/nvidia-shares-official-geforce-rtx-3070-performance-figures

The 3070 gets a 220W TDP versus 250W on the similar-performing 2080 Ti. A GPU's $/perf will always get better over time, but there is nothing that can be done about watts/perf. If all we get is a 10-15% efficiency gain after a node shrink and architecture update, we will be hitting performance limits VERY fast.
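
As a quick back-of-the-envelope check on that 10-15% figure (TDPs from above, equal performance assumed; a sketch, not measured data):

```python
# If the 3070 really matches a 2080 Ti, the perf/W gain is just the TDP ratio.
tdp_2080ti = 250  # W
tdp_3070 = 220    # W

print(f"~{tdp_2080ti / tdp_3070 - 1:.0%} better perf/W at equal performance")  # ~14%
```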

If AMD can't capitalize on Nvidia's back-to-back stumbles, they deserve to lose.
 
Off-topic, but it's been over two weeks and there still isn't a decent EVGA review. XC3, FTW3, 3080, 3090... nothing.
 
Off-topic, but it's been over two weeks and there still isn't a decent EVGA review. XC3, FTW3, 3080, 3090... nothing.

I think GN is working on one - they had an FTW3 card, I believe, and they should eventually be releasing reviews for all of the cards they had. The first one that went up was the Gigabyte Eagle, and I believe they had an Asus TUF as well.
 
Gigabyte breakout power boxes have been failing on users while the /nvidia mods are doing their PR thing and deleting all the posts with pictures. Took screenshots of this thread, which linked to a larger one that was deleted while I was reading through the comments. Someone said this wasn't an extra point of failure. Seems that wasn't true. It's a simple repair, but now you have people sending cards back because the connectors on the breakout box are being pushed through.

Are all the Gigabyte 3080 cards affected by this, or only the Gaming OC model?

Seems like every day a new issue pops up with the Ampere cards.
 