NVIDIA RTX 3090 discussion

When I saw the RTX 3090 was rendering Control at 1440p... :sick:

Bet NVIDIA didn't exactly want them showing that bit.
 
For those looking for a review of the 3090 before launch day, enjoy:


The clock speed isn't shown for the 3090 when it's compared to the 3080, even though the 3080's clock speed is displayed. It does show the clock speed when the card is shown alone, and it seems to be running between 1650 MHz and 1875 MHz, which seems pretty low to me. Also, the VRAM usage never exceeds 10 GB when the card is shown alone. The benchmark seems faked to me.

We'll only know for sure tomorrow.
 
Well, the Titan has always featured a 384-bit memory bus, the widest NVIDIA offers on its consumer products.

The 3090 is also 384-bit... however, the Titan used to be ~50% faster than the x080 of its generation, while the 3090 is only ~20% faster than the 3080.
The difference is that NVIDIA is being very aggressive with the 3080: it features a 320-bit bus instead of 256-bit (the 980, 1080, and 2080 were all 256-bit).
So it's not that the 3090 is slower; it's that the 3080 is faster.

My theory is that NVIDIA did not use the Titan naming scheme for the 3090 because the performance increase does not meet the level associated with a Titan.
So, I think that the 3090 is technically equivalent to the Titan but not from a marketing perspective.

NVIDIA hasn't been afraid to let its less expensive Ti models nip at the heels of past Titans, or even outperform them in some gaming workloads.
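To put rough numbers on the bus-width point, here's a minimal sketch of the bandwidth math. The bus widths and per-pin data rates below are the commonly cited specs for these cards, treated here as assumptions rather than verified figures:

```python
# Rough peak-bandwidth estimate: bus width (bits) / 8 * per-pin data rate (Gbps) -> GB/s.
# Bus widths and data rates are the commonly cited specs, treated as assumptions.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3090 (384-bit GDDR6X @ 19.5 Gbps)": (384, 19.5),
    "RTX 3080 (320-bit GDDR6X @ 19.0 Gbps)": (320, 19.0),
    "RTX 2080 (256-bit GDDR6  @ 14.0 Gbps)": (256, 14.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(width, rate):.0f} GB/s")
# ~936, ~760, and ~448 GB/s respectively -- the 3080's unusually wide bus is
# what shrinks its gap to the 3090 compared to past x080-vs-Titan gaps.
```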
 
Only it's not running Control at 4K; it's running it at 2560x1440 with a filter.

Also, look at the settings: they're all turned down to a PS4/Xbox One-era equivalent, except for the light ray tracing features in Control.

Running first-gen PS4 games at 8K isn't really that impressive. If it could run Metro Exodus on max settings at native 8K 60 fps, then we'd be getting somewhere. If it can barely run last-gen games at 4K with reduced settings, it's already obsolete for 8K.
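Just to put those resolutions in perspective, here's a minimal sketch of the raw pixel counts involved (native frame sizes only; no claims about how the upscaling filter works internally):

```python
# Raw pixel counts for the resolutions being discussed.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "4K   (3840x2160)":  3840 * 2160,
    "8K   (7680x4320)":  7680 * 4320,
}

pixels_8k = resolutions["8K   (7680x4320)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:4.1f} MP ({pixels / pixels_8k:.0%} of native 8K)")
# 1440p is ~3.7 MP, only ~11% of the ~33.2 MP a native 8K frame requires,
# which is why upscaled "8K" and native 8K are very different workloads.
```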

Feel free to disagree, but I don't go along with marketing that makes something appear capable of things it cannot do. Just like when AMD said the Fury was a true 4K video card. Also, lifting the NDA on benchmarks at the same time the card goes on sale, when everyone knows it will sell out, is horse shit. As much as I'd love the card, I cannot support this assault on me as a consumer.

For those looking for a review of the 3090 before launch day, enjoy:


If someone looks at this objectively, it's easy to see past NVIDIA's smoke and mirrors and realize what an absolute disaster Ampere has been for them. There is no 2080 Ti equivalent for Ampere, nor will there be, because we can clearly see (and NVIDIA admits) that the 3090 doesn't scale well with its increased CUDA cores vs. the 3080, so a 3080 Ti would be pointless.

This left NVIDIA with a huge dilemma: they needed to create segmentation and maintain their high ~60% margins, so how do they do that with a part that has underwhelming next-generation performance?

Simple, a few things:

1. Market the 3080 against the 2080, calling it the biggest generational leap ever, and take a hit on margins by launching a tiny handful of cards at $699, thus winning the hearts and minds of gamers and looking like the good guy.
2. Give the green light to AIBs to create 20 GB variants and PRAY that AMD doesn't spoil the party with a competitive part. If AMD doesn't succeed, use heavy marketing for the 20 GB variants, price them $150-200 more than the 10 GB 3080, and flood the market with them. Consumers will then be hard-pressed to find the cheaper $700 cards and will opt to buy the 60%-margin 20 GB cards.
3. If AMD comes out with cards that have equal performance but more VRAM than the 10 GB 3080, flood the market with shills, talk up NVIDIA features, and put a positive spin on it. Enlist streamers and tech YouTubers to get the job done.
4. Profit

When the 3080 is pitted against the 2080 Ti, with 1 GB less VRAM and only 15-25% higher performance depending on resolution, you see what a huge failure Ampere really is, because its performance gain is worse than Turing's was over Pascal. If the 2080 Ti had been released with similarly deceptive marketing at $700 and only later pushed to $1,000, NVIDIA would've been loved by all and Turing would be remembered as one of NVIDIA's best GPU releases.

Edit: Looks like Ampere is also suffering from some serious driver issues, enough that it's become a meme:
 
Multiple EVGA 3080 models on Newegg just switched over to Out Of Stock (no auto notify) and "may or may not restock" status.
Jacob will confirm or deny at some point, I imagine.
I think you may be onto something. Now all the EVGA 3080 cards are pulled from Newegg. While refreshing my page of "EVGA 3080," it only shows the 3090s.

https://www.newegg.com/p/pl?d=evga+3080

Someone could correct me. I do have my moments.
 
You're correct. All the EVGA 3080s are gone from Newegg. They're gone from Amazon as well. I don't see anything related on the EVGA forum.

Edit: just saw Jacob's comment on Reddit.
 
Edit: Looks like Ampere is also suffering from some serious driver issues, enough that it's become a meme:

Apparently the cards can't hold their out-of-the-box clocks; in a lot of cases people are having to downclock them to get them stable. Imagine paying $1,500 for a 3080 on eBay and having it crash to desktop every 10 minutes... :)
 
Apparently the cards can't hold their out-of-the-box clocks; in a lot of cases people are having to downclock them to get them stable. Imagine paying $1,500 for a 3080 on eBay and having it crash to desktop every 10 minutes... :)
Or a 3090. So? I don't wanna have nightmares and be a beta tester.
 
Apparently the cards can't hold their out-of-the-box clocks; in a lot of cases people are having to downclock them to get them stable. Imagine paying $1,500 for a 3080 on eBay and having it crash to desktop every 10 minutes... :)

Yeah, well, those guys who paid scalpers that kind of money deserve it. Between EVGA having fan problems (it seems EVGA always has some issues), crashes to desktop because of aggressive AIB overclocks, stutters in games because of shoddy drivers, and a lack of inventory, this release is shaping up to be one of NVIDIA's worst. It's funny and sad seeing it go from super hyped to deflated. This just goes to show we can't trust day-one reviewers; none of them brought any of these issues to light. It was left to end users to be the guinea pigs, as usual.
 
Yeah, well, those guys who paid scalpers that kind of money deserve it. Between EVGA having fan problems (it seems EVGA always has some issues), crashes to desktop because of aggressive AIB overclocks, stutters in games because of shoddy drivers, and a lack of inventory, this release is shaping up to be one of NVIDIA's worst. It's funny and sad seeing it go from super hyped to deflated. This just goes to show we can't trust day-one reviewers; none of them brought any of these issues to light. It was left to end users to be the guinea pigs, as usual.
From what I have seen, the hype has far from deflated. Some people have fallen off the hype train, but I'd still say the vast majority of people who intended to purchase a 30X0 are still planning to do so as soon as they can.
 
From what I have seen, the hype has far from deflated. Some people have fallen off the hype train, but I'd still say the vast majority of people who intended to purchase a 30X0 are still planning to do so as soon as they can.

There are the psychos on r/nvidia's Discord who will buy Jensen's turd if they're told it tastes good. But I think level-headed consumers overall are quickly learning that Ampere isn't what it was promised to be. The cards are OK, but not the "generational leap" we were promised either.

They are garbage cards. Everyone should cease buying them immediately, please and thank you.
Especially from Newegg, that site is the worst.

OK, I'll pass your message on to the NV Discord. :ROFLMAO:
 
If someone looks at this objectively, it's easy to see past NVIDIA's smoke and mirrors and realize what an absolute disaster Ampere has been for them. There is no 2080 Ti equivalent for Ampere, nor will there be, because we can clearly see (and NVIDIA admits) that the 3090 doesn't scale well with its increased CUDA cores vs. the 3080, so a 3080 Ti would be pointless.

This left NVIDIA with a huge dilemma: they needed to create segmentation and maintain their high ~60% margins, so how do they do that with a part that has underwhelming next-generation performance?

Simple, a few things:

1. Market the 3080 against the 2080, calling it the biggest generational leap ever, and take a hit on margins by launching a tiny handful of cards at $699, thus winning the hearts and minds of gamers and looking like the good guy.
2. Give the green light to AIBs to create 20 GB variants and PRAY that AMD doesn't spoil the party with a competitive part. If AMD doesn't succeed, use heavy marketing for the 20 GB variants, price them $150-200 more than the 10 GB 3080, and flood the market with them. Consumers will then be hard-pressed to find the cheaper $700 cards and will opt to buy the 60%-margin 20 GB cards.
3. If AMD comes out with cards that have equal performance but more VRAM than the 10 GB 3080, flood the market with shills, talk up NVIDIA features, and put a positive spin on it. Enlist streamers and tech YouTubers to get the job done.
4. Profit

When the 3080 is pitted against the 2080 Ti, with 1 GB less VRAM and only 15-25% higher performance depending on resolution, you see what a huge failure Ampere really is, because its performance gain is worse than Turing's was over Pascal. If the 2080 Ti had been released with similarly deceptive marketing at $700 and only later pushed to $1,000, NVIDIA would've been loved by all and Turing would be remembered as one of NVIDIA's best GPU releases.

Edit: Looks like Ampere is also suffering from some serious driver issues, enough that it's become a meme:

If we filter out all the inaccuracies in this post, all we're really left with is you saying that Ampere's top end is a worse value than Turing's top end (2080 Ti vs. 3090).

You say there's no 2080 Ti equivalent, then go on to compare it with a card that costs $500 less but outperforms it. That sounds a whole lot better than buying a 2080 and only matching the performance of the 1080 Ti while having 3 GB less memory and paying more for it.

Even a +$200 premium for a 20 GB 3080 makes the value WAY better than Turing. Now you have a card that's not only cheaper than the 2080 Ti, but faster and with 9 GB more memory.

In what world is that worse than Turing, a generation where you had no choice but to spend $1,200+ to get something better than a 1080 Ti?
 
There are the psychos on r/nvidia's Discord who will buy Jensen's turd if they're told it tastes good. But I think level-headed consumers overall are quickly learning that Ampere isn't what it was promised to be. The cards are OK, but not the "generational leap" we were promised either.
Even on this forum, I'd say the split is still better than 50/50 in favor of people who intended to buy a card and still want one.

There are mainly two types of people who are vocal about this launch: those who are still really excited and want to get a 30X0 ASAP, and those who are relentlessly whining about every aspect of the launch and the product. There seem to be few people in between.

Obviously the launch is a mess and the cards aren't as great as Jensen made them sound (what a surprise, marketing in a nutshell). But holy shit are some people just relentlessly whiny.
 
Even on this forum, I'd say the split is still better than 50/50 in favor of people who intended to buy a card and still want one.

There are mainly two types of people who are vocal about this launch: those who are still really excited and want to get a 30X0 ASAP, and those who are relentlessly whining about every aspect of the launch and the product. There seem to be few people in between.

Obviously the launch is a mess and the cards aren't as great as Jensen made them sound (what a surprise, marketing in a nutshell). But holy shit are some people just relentlessly whiny.

I'm the type who went from very excited about Ampere (pre-reveal) to "OK, the 10 GB sucks, but this card is still great" to "OK, wow, not very good at all." I'll probably still end up with an Ampere card, whether it's the 10 GB or 20 GB one, but the hype has certainly subsided for me. If I didn't care at all about NVIDIA filters (which I basically require in Warzone), I'd definitely give AMD a shot with Big Navi and see how things go on the other side of the fence (grass always greener and all that).
 
I'm not sure why it's showing pounds instead of dollars on the NVIDIA site?

£1,399.00
 
If we filter out all the inaccuracies in this post, all we're really left with is you saying that Ampere's top end is a worse value than Turing's top end (2080 Ti vs. 3090).

You say there's no 2080 Ti equivalent, then go on to compare it with a card that costs $500 less but outperforms it. That sounds a whole lot better than buying a 2080 and only matching the performance of the 1080 Ti while having 3 GB less memory and paying more for it.

Let me quote the most salient part which you ignored:
When the 3080 is pitted against the 2080 Ti, with 1 GB less VRAM and only 15-25% higher performance depending on resolution, you see what a huge failure Ampere really is, because its performance gain is worse than Turing's was over Pascal. If the 2080 Ti had been released with similarly deceptive marketing at $700 and only later pushed to $1,000, NVIDIA would've been loved by all and Turing would be remembered as one of NVIDIA's best GPU releases.

The 3080 is the flagship GPU, with the 3090 being touted as the exotic "sometimes prosumer, sometimes 8K, sometimes AI, we can't make up our minds" card. There's no 3080 Ti to compare against the 2080 Ti here, so NVIDIA instead opted to compare the 3080 against the 2080 to get bigger marketing numbers. If they had compared it against the 2080 Ti, the performance jump would've seemed pretty abysmal. Yes, the price is lower; it should be, considering it's two years newer and on a Samsung 8 nm node for which NVIDIA probably got a handsome discount. But that doesn't negate the fact that the card is a disappointment for a flagship product performance-wise and is only acceptable because of its FE launch price (good for the dozen or so scalpers who managed to get one).
 
Let me quote the most salient part which you ignored:


The 3080 is the flagship GPU, with the 3090 being touted as the exotic "sometimes prosumer, sometimes 8K, sometimes AI, we can't make up our minds" card. There's no 3080 Ti to compare against the 2080 Ti here, so NVIDIA instead opted to compare the 3080 against the 2080 to get bigger marketing numbers. If they had compared it against the 2080 Ti, the performance jump would've seemed pretty abysmal. Yes, the price is lower; it should be, considering it's two years newer and on a Samsung 8 nm node for which NVIDIA probably got a handsome discount. But that doesn't negate the fact that the card is a disappointment for a flagship product performance-wise and is only acceptable because of its FE launch price (good for the dozen or so scalpers who managed to get one).

Ignored? Most of my post was in response to just that.

Let's simplify it... the 2080 Ti was about a 30% average performance jump over the 1080 Ti for an extra $500. The 3080 is a 20% jump over the 2080 Ti for $500 less.

If you think this equates to a worse value, you need a calculator.
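Here's a minimal sketch of that arithmetic, using the percentages quoted above and approximate launch prices as assumptions (1080 Ti ~$700, 2080 Ti ~$1,200, 3080 ~$700), not official figures:

```python
# Generational uplift and rough value, using the percentages quoted above and
# approximate launch prices as assumptions (not official figures).
cards = {
    "GTX 1080 Ti": {"perf": 1.00,               "price": 700},
    "RTX 2080 Ti": {"perf": 1.00 * 1.30,        "price": 1200},  # ~30% over the 1080 Ti
    "RTX 3080":    {"perf": 1.00 * 1.30 * 1.20, "price": 700},   # ~20% over the 2080 Ti
}

for name, c in cards.items():
    per_1000_usd = c["perf"] / c["price"] * 1000
    print(f"{name}: {c['perf']:.2f}x 1080 Ti perf, {per_1000_usd:.2f} perf per $1000")
# Roughly 1.43, 1.08, and 2.23 perf per $1000 -- by these numbers the 3080 is
# a much better value step than the 2080 Ti was, which is the point being made.
```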
 
Ignored? Most of my post was in response to just that.

Let's simplify it... the 2080 Ti was about a 30% average performance jump over the 1080 Ti for an extra $500. The 3080 is a 20% jump over the 2080 Ti for $500 less.

If you think this equates to a worse value, you need a calculator.

DID I MENTION VALUE? I'm talking about the performance jump. For a card advertised as NVIDIA's greatest generational leap, it's a complete lie when comparing flagship vs. flagship (3080 vs. 2080 Ti). There is no 3080 Ti, and from what I can tell, no leaked info that NVIDIA plans one either. Oh, by the way, the 3080 yields an 18% average gain over the 2080 Ti at 1080p, which is much worse than what the 2080 Ti got over the 1080 Ti. Even at 1440p it's worse, and it only comes close to parity at 4K. Let me guess, the good old CPU bottleneck excuse?
 
DID I MENTION VALUE? I'm talking about the performance jump. For a card advertised as NVIDIA's greatest generational leap, it's a complete lie when comparing flagship vs. flagship (3080 vs. 2080 Ti). There is no 3080 Ti, and from what I can tell, no leaked info that NVIDIA plans one either. Oh, by the way, the 3080 yields an 18% average gain over the 2080 Ti at 1080p, which is much worse than what the 2080 Ti got over the 1080 Ti. Even at 1440p it's worse, and it only comes close to parity at 4K. Let me guess, the good old CPU bottleneck excuse?

You mentioned a $700 price that won't exist and a +$200 premium for an extra 10 GB... Why would you mention cost at all if you were only concerned with performance?

But since you're NOT talking about value, you have the 3090 with an extra ~10% over the 3080, which will, at a minimum, equal the jump from the 1080 Ti to the 2080 Ti.

Sounds like either way, you don't really have much to complain about.
 
You mentioned a $700 price that won't exist and a +$200 premium for an extra 10 GB... Why would you mention cost at all if you were only concerned with performance?

But since you're NOT talking about value, you have the 3090 with an extra ~10% over the 3080, which will, at a minimum, equal the jump from the 1080 Ti to the 2080 Ti.

Sounds like either way, you don't really have much to complain about.

The 3090's 10% jump is the best case; it ranges from 4% to ~10% based on that TecLabs review.
 
I initially had a hard-on for the 3090 when it was announced, then came to my senses when the performance numbers were revealed. Not sure what I was expecting, I guess.

Honestly I just want a 3080 for now to be able to use HDMI 2.1.
 
I initially had a hard-on for the 3090 when it was announced, then came to my senses when the performance numbers were revealed. Not sure what I was expecting, I guess.

Honestly I just want a 3080 for now to be able to use HDMI 2.1.

Same. I was dead set on the 3090, but now, with this performance confirmation from NVDA, I'll be OK with a 3080 for two years. Even at 4K, 10 GB should be enough for the next two years; if I have to dial down shadows or whatnot a bit in a title or two, so be it.
 