Nvidia RTX 3090 discussion

mnewxcv

[H]F Junkie
Joined
Mar 4, 2007
Messages
8,991
Let's talk about performance, not availability. I've been thinking of this card as a Titan replacement. However, the Titans have always been Nvidia-produced, not AIB. It looks like all the regular AIBs are on board for the 3090. Do we think it is really a Titan, then? No other generation has had double the VRAM on the flagship vs. the second-best card except the Titans, correct?

For those considering one, what's your use case?
 
I'm considering it purely for 4K gaming, knowing full well it's poor value vs. the 3080. Planning to keep it for 4 years, and with any luck the 20% headroom should prove useful at some point.

I also expect a true Titan to drop next year with a full GA102 and 48GB RAM.
 
Let's talk about performance, not availability. I've been thinking of this card as a Titan replacement. However, the Titans have always been Nvidia-produced, not AIB. It looks like all the regular AIBs are on board for the 3090. Do we think it is really a Titan, then? No other generation has had double the VRAM on the flagship vs. the second-best card except the Titans, correct?

For those considering one, what's your use case?

No, I do not think that the 3090 is a Titan replacement. I think NVIDIA is segmenting the market more than it has in the past. The RTX 3090 is likely for gamers who would spend the money on Titan cards just to play games. It's just my opinion, but I think a true, prosumer-level Titan is likely to be released at a later date with features more in line with usage scenarios not restricted to gaming.

Unless the 3090 really only performs about 10% faster than a 3080, I'm going with a 3090. We don't know on that front yet, as the leaked benchmarks lack crucial context and verification. I plan on going back to 4K gaming now that options with G-Sync and 120Hz exist. When I got my RTX 2080 Ti, that wasn't really possible. Not only did such displays not really exist, but the card just wasn't capable of achieving that level of performance.
 
I've heard these things make amazing paperweights... or just weights in general.
 
No, I do not think that the 3090 is a Titan replacement. I think NVIDIA is segmenting the market more than it has in the past. The RTX 3090 is likely for gamers who would spend the money on Titan cards just to play games. It's just my opinion, but I think a true, prosumer-level Titan is likely to be released at a later date with features more in line with usage scenarios not restricted to gaming.

Unless the 3090 really only performs about 10% faster than a 3080, I'm going with a 3090. We don't know on that front yet, as the leaked benchmarks lack crucial context and verification. I plan on going back to 4K gaming now that options with G-Sync and 120Hz exist. When I got my RTX 2080 Ti, that wasn't really possible. Not only did such displays not really exist, but the card just wasn't capable of achieving that level of performance.
If a Titan comes out at $2500, do you think gamers will still buy it? Some people just want the best. If the rumors of a 3080 20GB are true, the 3080 10GB is a weird card, especially if a 3070 is really comparable to a 2080 Ti.
 
Well, the Titan has always featured a 384-bit memory bus, the widest NVIDIA offers on its consumer products.

The 3090 is also 384-bit... however, the Titan used to be ~50% faster than the x80 card, while the 3090 is only 20% faster than the 3080.
The difference is that NVIDIA is being very aggressive with the 3080, as it features a 320-bit bus instead of 256-bit (the 980, 1080, and 2080 were all 256-bit).
So, it is not the 3090 that is slower; it is the 3080 that is faster.

My theory is that NVIDIA did not use the Titan naming scheme for the 3090 because the performance increase does not meet the level associated with a Titan.
So, I think that the 3090 is technically equivalent to the Titan but not from a marketing perspective.
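For reference, here's the back-of-the-envelope bandwidth math behind that bus-width point (a rough sketch; the per-pin data rates are the published GDDR6/GDDR6X speeds):

```python
# bandwidth (GB/s) ~= bus width (bits) / 8 * per-pin data rate (Gbps)
cards = {
    "RTX 2080  (256-bit, 14 Gbps GDDR6)":    (256, 14.0),
    "Titan RTX (384-bit, 14 Gbps GDDR6)":    (384, 14.0),
    "RTX 3080  (320-bit, 19 Gbps GDDR6X)":   (320, 19.0),
    "RTX 3090  (384-bit, 19.5 Gbps GDDR6X)": (384, 19.5),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# -> 448, 672, 760, 936 GB/s respectively
```

The 3080 already closes most of the bandwidth gap to the 3090, which fits the "it's the 3080 that is faster" framing.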
 
I don't know. If a user found value in a $1200+ 2080 Ti, why would they not find value in a $1500 card that is 40% to 50% faster than the 2080 Ti in games and even faster at RT? I would say that is roughly the same market segment, but adding another segment to it: you can do some pretty amazing rendering with something like V-Ray, and that 24GB would be a very good asset to have at a huge discount over the previous Titan.

Nvidia already offers the Studio drivers, which makes the 20GB 3080 an even cheaper Titan. Nvidia has some great RT and AI capability -> the uses for these GPUs are widespread. The 3090 has the NVLink bridge, which hopefully allows two 3090s to share memory for 48GB => just can't see why this would not sell to small studios like candy. Plus, buy one now and one later as needed.
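For what it's worth, here's a quick way to sanity-check whether two cards can actually see each other's memory over the bridge (a minimal PyTorch sketch, assuming a CUDA build and two GPUs installed; actual pooling still depends on the application supporting NVLink, e.g. V-Ray GPU's NVLink support):

```python
import torch  # assumes a CUDA-enabled PyTorch install

# Peer-to-peer access is the mechanism NVLink memory pooling rides on;
# if this reports False, apps can't treat the two cards' VRAM as one pool.
if torch.cuda.device_count() >= 2:
    ok = torch.cuda.can_device_access_peer(0, 1)
    print("GPU0 -> GPU1 peer access:", ok)
else:
    print("Need at least two GPUs to test peer access")
```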

So it almost comes down to: whatever you do, Nvidia has you covered from just about any angle. Nvidia is just starting to widen its software stack using RT and AI in programs that people actually use. From broadcasting to image processing, rendering, and gaming, you can say Nvidia has your back, or soon will. Not in a bad way, as in a knife in it, so to speak.

So if AMD wins the gaming benchmarks, they still may not win the sales.

As for me, if I ever get off my ass and do serious 3D work again => the 3090 would be very much relevant, and the cost is actually rather good if not great => that is, if one can actually buy one.
 
This is the card I want to upgrade to, but I’m waiting on reviews. If it’s not 50-60% faster than a 2080Ti, I might not bother.
 
I got a Project Scope I wish I could post here.
It's a BI and DB excavation and push to AWS.
So there is a lot of sizing based on capturing Yd interactions I'll split over to Kinesis to replay as a test stream.
Existing card decks of "logic" turn into a Step Functions log ride down into a GPU-processed lava chute.

I'd be looking for what could be processed in the timeframe by commodity resources and what absolutely has to run down a near-real-time, GPU-assisted lane.

You guys would be amused by the wording because it sounds like they're an Oracle DBA from the mid '90s as far as jargon goes.

So yeah, I'd have to get an HEDT box and vaporware 3090s; otherwise they'll have to eat the AWS bill just for initial sizing.
 
I got a Project Scope I wish I could post here.
It's a BI and DB excavation and push to AWS.
So there is a lot of sizing based on capturing Yd interactions I'll split over to Kinesis to replay as a test stream.
Existing card decks of "logic" turn into a Step Functions log ride down into a GPU-processed lava chute.

I'd be looking for what could be processed in the timeframe by commodity resources and what absolutely has to run down a near-real-time, GPU-assisted lane.

You guys would be amused by the wording because it sounds like they're an Oracle DBA from the mid '90s as far as jargon goes.

So yeah, I'd have to get an HEDT box and vaporware 3090s; otherwise they'll have to eat the AWS bill just for initial sizing.
If Titan RTX cards come down in price, they might be an option for people who need the VRAM and are OK with used cards.
 
If Titan RTX cards come down in price, they might be an option for people who need the VRAM and are OK with used cards.

It's all up to this client.
I wouldn't want to find out the gap between Turing and Ampere core speed with regard to how blank space interleaving works in Ampere.

You can shred nine years' worth of a DB2 regional store changing GPU generations on AWS GPU instances.

I don't see a lot of this type of testing on white-box builds, mainly because of GPU cost; it's just proving performance with consumer SKUs, not translatable into a deployment.
 
Well, the Reddit guy has a much worse setup.
He does say "I've managed a minor overclock of +75 Core, +200 Memory". It's unclear whether his reported numbers are stock or overclocked; that makes a difference if comparing to stock 3080 numbers.
 
I'm considering it purely for 4K gaming, knowing full well it's poor value vs. the 3080. Planning to keep it for 4 years, and with any luck the 20% headroom should prove useful at some point.

I also expect a true Titan to drop next year with a full GA102 and 48GB RAM.

I think you will be disappointed on both, based on what has been leaked and rumored so far. You're unlikely to see a 20% gain on the 3090, and I doubt you will see a Titan drop either, unless Nvidia is willing to pull some of their TSMC manufacturing over for the gaming crowd, and that is unlikely.
 
I want to see render times vs. the 3080, as I will not be buying the next-gen cards. That 3090 blower is looking nice for a Dan A4 case I can take offshore.
 
I'm considering it purely for 4K gaming, knowing full well it's poor value vs. the 3080. Planning to keep it for 4 years, and with any luck the 20% headroom should prove useful at some point.

I also expect a true Titan to drop next year with a full GA102 and 48GB RAM.

You do you with your money, but poor value is poor value. Cheaper alternative approaches:
1) Wait for the 3080 Ti; it should provide ~95% of the performance for significantly less.
2) Get a 3080 now; in two years, sell it and use the proceeds for a 4080 or 4080 Ti, which should easily beat the 3090. Not only is this much cheaper, but you would likely end up with higher FPS on average over the next 4 years. Also, the additional performance of the 3090 over the 3080 may not be a big deal in the short term...
 
IMO there's really no point waiting for an eventual 3080 Ti. You can buy a 3090 now and game on it for the next 6-8 months (or more). The 3080 Ti will probably be $999, so waiting almost a year to save $500 sounds like a miserable idea.

This is coming from the perspective of a 2080 Ti owner, though, who was unhappy with its 4K performance and has already waited 2 years for something better. I'm tired of waiting, TBH.
 
No, I do not think that the 3090 is a Titan replacement. I think NVIDIA is segmenting the market more than it has in the past. The RTX 3090 is likely for gamers who would spend the money on Titan cards just to play games. It's just my opinion, but I think a true, prosumer-level Titan is likely to be released at a later date with features more in line with usage scenarios not restricted to gaming.

Unless the 3090 really only performs about 10% faster than a 3080, I'm going with a 3090. We don't know on that front yet, as the leaked benchmarks lack crucial context and verification. I plan on going back to 4K gaming now that options with G-Sync and 120Hz exist. When I got my RTX 2080 Ti, that wasn't really possible. Not only did such displays not really exist, but the card just wasn't capable of achieving that level of performance.

Given the 3090's stock TDP limit is less than 10% higher than the 3080's, I think a card with a raised TDP limit is going to help. I imagine that's the context you are talking about. Stock vs. stock, the 3090 is more power-limit constrained.
 
Any chance NV is holding back some performance on the 3090 that they will roll out in a driver update once AMD is out, in order to blow them out of the water?
 
Any chance NV is holding back some performance on the 3090 that they will roll out in a driver update once AMD is out, in order to blow them out of the water?
That would be strange: hobble yourself when Nvidia is pushing out publicity blasts to focus consumers squarely on their offerings?
3070 and 3060 seem held back to fire shots at AMD.

Also, the EVGA 3090 prices off Newegg, if correct, would be ludicrous if anything major were being held back.

The 3090 has to kill it in light of the 2080 Ti devaluation.

 
You do you with your money, but poor value is poor value. Cheaper alternative approaches:
1) Wait for the 3080 Ti; it should provide ~95% of the performance for significantly less.
2) Get a 3080 now; in two years, sell it and use the proceeds for a 4080 or 4080 Ti, which should easily beat the 3090. Not only is this much cheaper, but you would likely end up with higher FPS on average over the next 4 years. Also, the additional performance of the 3090 over the 3080 may not be a big deal in the short term...

Well, that's the thing. The cheaper options involve waiting and hoping, neither of which is worth the money.
 
Is it actually letting you purchase? I can't seem to bring that page up.

** nm, got that page up but no add to cart option :)
 
Does an RTX 3090 make sense? I expect the 3090 to be about 15-20% faster than the 3080 and would thus say it comes down to whether you need more than the 10 GB of VRAM on the 3080.

Or look at it this way: say the RTX 3080 with 20 GB of VRAM costs around $1000. Would it make sense to add those $500 to get 15-20% more compute power, NVLink for memory sharing, and, most importantly, to get it now-ish instead of playing the waiting game for the 3080 20GB? :whistle:
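Rough dollars-per-performance math on that framing (the prices and the 15-20% uplift are the assumptions from above, nothing confirmed):

```python
# Hypothetical price per unit of 3080-class performance.
cards = {
    "RTX 3080 10GB":           (700, 1.00),
    "RTX 3080 20GB (rumored)": (1000, 1.00),   # same GPU, just more VRAM
    "RTX 3090":                (1500, 1.175),  # midpoint of the 15-20% guess
}
for name, (price_usd, rel_perf) in cards.items():
    print(f"{name}: ${price_usd / rel_perf:.0f} per 3080-equivalent")
# -> $700, $1000, $1277 respectively
```

On raw value you pay a premium either way; the 3090's case is really the VRAM plus getting it now.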
 
Does an RTX 3090 make sense? I expect the 3090 to be about 15-20% faster than the 3080 and would thus say it comes down to whether you need more than the 10 GB of VRAM on the 3080.

Or look at it this way: say the RTX 3080 with 20 GB of VRAM costs around $1000. Would it make sense to add those $500 to get 15-20% more compute power, NVLink for memory sharing, and, most importantly, to get it now-ish instead of playing the waiting game for the 3080 20GB? :whistle:

The 3090 doesn't make any sense from a value perspective. The 20GB 3080 probably won't either if it's just adding 10GB of VRAM with no other changes. Both of those cards will likely perform very similarly to a 10GB 3080.
 
I honestly believe the 3090 is the Ti replacement. It makes sense to rebrand it because people would have flipped their shit if they had released a 3080 Ti for $1500. There will be a Titan coming early next year too, I believe. I hope I am wrong; I would like a 3080 Ti that is close to 3090 performance with 22GB of RAM at $999 by the end of the year. It could happen if AMD brings the fire with Big Navi.
 
The 3090 doesn't make any sense from a value perspective. The 20GB 3080 probably won't either if it's just adding 10GB of VRAM with no other changes. Both of those cards will likely perform very similarly to a 10GB 3080.

Absolutely. I'm in the market for a vanilla RTX 3080, so no argument from me. Just offering another way to look at it for those considering an RTX 3090.
 
The 3090 doesn't make any sense from a value perspective. The 20GB 3080 probably won't either if it's just adding 10GB of VRAM with no other changes. Both of those cards will likely perform very similarly to a 10GB 3080.
I think the value-add for the 3080 20GB will be longevity, hearing how people are upgrading from 900-series cards for this gen. 10GB is plenty now, but probably won't be in 5 years, with increased game detail paralleling higher mainstream resolutions.
 
The reviews are not available today? They're available starting tomorrow, once the card goes on sale? I wonder why they would not give us at least a day to look at the numbers...
 
The 3090 is the real 3080, the 3080 is the real 3070, etc. Nvidia raised prices this generation but just didn't want to admit it.

The 3080 doesn't have enough VRAM for next generation games at 4k, and the 3090 is a bit too expensive for what it is.

Just skimming through that Linus video, it's obvious that the games aren't hitting 60 fps at 8K (the Doom Eternal framerate looks like shit, honestly), which is more support for the "only 20% more performance max" data we have so far.

If the 3090 had been like 50% faster than the 3080, I probably would have bought it, but now I'm thinking about just waiting for AMD or the 20GB 3080 variants. These products all look like a set of bad tradeoffs.
 
My use case is 100% gaming, and I have my eye on the RTX 3090. Probably an EVGA FTW3, or a Hybrid/Kingpin if I decide to go the AIO route.

Honestly, the RTX 3090 is my only real upgrade path worth anything going from my 2080 Ti. Mine has a really nice overclock, and comparing it with just the RTX 3080, I was really not far behind it, so the RTX 3080 is out of the race for me. I'll have to wait and see what the RTX 3090 numbers are... luckily I can afford to wait this out for a bit, as my 2080 Ti is kicking ass at 4K in all games I play right now (even when using RT & DLSS).
 
Just saw this video. That's pretty amazing that the 3090 can run those games at 8K with good FPS.
Only it's not running Control at 4K; it's running it at 2560x1440 with a filter.

Also, look at the settings: they are all turned down to a PS4/Xbox One-era equivalent, except the light ray-tracing features in Control.

If all we want is to run first-gen PS4 games at 8K, that's not really that impressive. If it could run Metro Exodus, on max, at 8K native 60 fps, then we would be getting somewhere. If it can barely run last-gen games at 4K with reduced settings, it's already obsolete for 8K.
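The pixel math behind the "1440p with a filter" point (assuming that 8K demo used DLSS Ultra Performance, which renders at one third of the output resolution per axis):

```python
# How many output pixels each internally rendered pixel has to cover.
out_w, out_h = 7680, 4320        # "8K" output resolution
render_w, render_h = 2560, 1440  # internal render resolution
ratio = (out_w * out_h) / (render_w * render_h)
print(f"{ratio:.0f} output pixels per rendered pixel")  # -> 9
```

Roughly one ninth of the advertised resolution is actually being rendered; the rest is reconstruction.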

Feel free to disagree, but I don't put up with marketing fuckery that makes something appear able to do something it cannot. Just like when AMD said Fury was a true 4K video card. Also, lifting the benchmark NDA at the same time the card goes on sale, when everyone knows it will sell out, is horse shit. As much as I'd love the card, I cannot support this assault on me as a consumer.

For those looking for a review of the 3090 before launch day, enjoy:
 