Nvidia 3000 Series pricing and specs

I agree that if anything is holding the 3080 back, it's the 10GB of memory, and that is one of the reasons I am leaning towards the 3090 if I can squeeze it into my case. While I think 24GB is overkill for what I ever plan to use it for, it would be interesting to see if developers find ways to leverage the extra headroom. I think 16GB would have been the sweet spot, if it were possible.

16GB is not a reasonable option on a 320-bit bus. You want even multiples of 10GB.

Rumor says there will be 20GB AIB cards, though you will likely split much of the price difference with the 3090 then.

Remember, Apple charges $200 for 8GB of slow DDR4 system memory.

This is the fanciest high-speed GDDR available, and it will be a high-margin option. I'd expect at least $1,000 for the 20GB model, if not more.
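For anyone curious about the bus math, here's a minimal sketch of why a 320-bit bus lands on 10GB or 20GB (assuming one 32-bit GDDR6X chip per channel and the 8Gb/16Gb per-chip densities; clamshell and mixed-density configs are ignored):

```python
# Why a 320-bit bus gives 10GB or 20GB, under the assumptions above.
BUS_WIDTH_BITS = 320
CHIP_WIDTH_BITS = 32

channels = BUS_WIDTH_BITS // CHIP_WIDTH_BITS  # 10 memory channels

for density_gbit in (8, 16):                  # per-chip density in gigabits
    capacity_gb = channels * density_gbit // 8
    print(f"{density_gbit}Gb chips -> {capacity_gb}GB total")
# 8Gb chips  -> 10GB total
# 16Gb chips -> 20GB total
```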
 
Yes, I do realize that on a 320-bit bus memory comes in multiples of 10GB. My comment about 16GB being ideal was about the amount I can see myself using and what would be best for my needs, not what is possible on the memory bus.

I do also agree that a 20GB model will most likely carry a pretty decent price premium for the extra 10GB. This is why I feel the $1,500 for the 3090 FE isn't really that bad of a price, considering it has 24GB of GDDR6X memory on top of an increase in cores. The Titan RTX commanded a $1,000 price premium over that with the same amount of memory (albeit GDDR6).
 
Whew... Specs say the RTX 3080 is 285mm long... My case can take up to 290mm. Gonna be a tight fit, but whew! It will fit.
 
So the Dan A4 case says up to 300mm, and from the 285mm you posted we should be good to go, I hope. I hope you are right, Golden Tiger. The web page says those specs, but damn, I still hope I can fit it in even if I have to have a custom side cover made.
 
I post in PCMR on Reddit, and for about 2-3 months I kept saying to wait for the 3000 cards. People still went ahead and bought the 2080 Ti or 2080. I don't know if it was just a lack of patience, or people didn't have faith in Nvidia's offerings. But this gen was a huge leap.
 
I paid $550 for my 1080 Ti in 2018. Now it's either $400 for a 2080 Ti or $600 for a 3080. Hmm, I don't know which is the better deal at 1440p.
 
I'm curious if the Big Navi announcement was waiting on today. I'm thinking AMD will announce before the Ampere cards are available. If not, they're probably not competitive. Thoughts?

Feliz NAVI-DEAD...... Seems DOA.

https://wccftech.com/report-nvidia-geforce-rtx-30-gpus-to-be-in-short-supply-until-2021/

The last quote in the article (toss in Scott Herkelman's raised eyebrow on Twitter minutes after the presentation ended) seems to hint RTG are shitting bricks and were blindsided by yesterday's launch, as we ALL were. RTX 3080 for me!! I own many RTG cards but cannot see RTG taking down a 2080 Ti for $499 with DLSS 2.0 and all the other extras thrown in.
 
While I think 24GB is overkill for what I ever plan to use it for, it would be interesting to see if developers find ways to leverage the extra headroom.

I think they had to make some hard decisions on this card. They didn't have a substantially different die setup for the 3090; it's actually quite a small increase in cores if you compare it to the huge bump going from 3070 to 3080. Also remember they are actually double counting all of their cores, the real number is half what they are officially saying due to the '2 threads per CUDA core' Ampere revision, which would need everything else on the card to be doubled in order to take advantage of it to the fullest.

Where the 24GB of VRAM comes into play is 8K. We're talking 4x 4K resolution, or 16x 1080p displays. I haven't tried gaming at 8K on a modern game like Tomb Raider or Control, but I can't imagine its VRAM usage is timid. The only downside is the availability/price of 8K monitors and displays. If you could buy a passable 8K display for 300 bucks, it would make stomaching the 3090 a bit more palatable, but alas, we're far from that. Besides, anyone know what size screen you would need to really benefit from 8K? A lot of respected gamers barely see 4K making any sense under 32".

Also, this card is really being targeted at content creators, as was alluded to by Jensen in his reveal. Working with 4K and 8K editing really chews up your GPU. For people trying to get a handle on 8K graphics work, they need so much more VRAM right now. And yeah, they know there are gamers out there that will just throw money at whatever is the best, even if it's just like 15% faster than a 3080 at twice the price.
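Just to put numbers on the "4x 4K / 16x 1080p" comparison, here's a quick pixel-count check (standard resolutions assumed):

```python
# Pixel counts behind the "8K = 4x 4K = 16x 1080p" comparison.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8K"] / pixels["4K"])      # 4.0  -> 8K pushes 4x the pixels of 4K
print(pixels["8K"] / pixels["1080p"])   # 16.0 -> and 16x the pixels of 1080p
```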
 
Also remember they are actually double counting all of their cores, the real number is half what they are officially saying due to the '2 threads per CUDA core' Ampere revision, which would need everything else on the card to be doubled in order to take advantage of it to the fullest.

Please don't spread misinformation. There is no 2 threads per CUDA core nonsense. The 2 FP32 pipes work on different warps in parallel.

Of course you’re not going to see the full performance benefit if the bottleneck is elsewhere. But that goes for any architecture.
 
8K seems like a non-starter to me. There are barely any displays and even now with smaller 4K displays we are reaching the limit of being able to discern individual pixels. By the time if/when it becomes mainstream, the 3090 will likely be obsolete.
 
Being that I game on a 4K monitor as well as do VR, I just feel that 10GB is cutting it a little too close with the direction games are heading if I want to play at 4K with high/max settings (or max settings and oversampling for VR). Then tack on ray tracing and some of the other newer capabilities, and memory usage will be jumping even more. I saw a mention on Nvidia's developer tips site (not sure how current it is) that ray tracing alone in games can be expected to consume another 1-2GB of memory (they didn't mention what resolution they were referencing).

With that in mind, my options are slim if I want an RTX 30xx to meet the memory needs of what I want to do today and still be able to manage four years from now (the average span I go between GPU upgrades). There are talks of a 20GB 3080, and the decent price gap between the 3080 and 3090 hints at a possible placeholder for something that sits in between, but those are still ifs with no definitive whens, and I am building a new PC now that just needs a GPU, a CPU water block (on order, awaiting delivery), and a pump/reservoir. This makes me lean more towards the 3090. A "set it and forget it" kind of choice, but at a premium. Good thing I don't upgrade frequently.

However, I do think the 8-10GB cards will be perfect for those gaming at 2K or planning to lean on DLSS, and they still give enough padding for growth and for features like ray tracing.
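To make the headroom argument concrete, here's a toy budget sketch. The game-usage number is a placeholder you'd measure yourself (e.g. with a monitoring overlay), and the 1.5GB ray-tracing figure just splits the 1-2GB range mentioned above:

```python
# Toy VRAM headroom estimate (all inputs are illustrative placeholders,
# not measured data; the RT overhead is the rough 1-2GB range cited above).
def vram_headroom_gb(card_gb, game_gb, rt_overhead_gb=1.5, desktop_gb=0.5):
    """VRAM left over after the game, ray tracing, and desktop/compositor use."""
    return card_gb - (game_gb + rt_overhead_gb + desktop_gb)

print(vram_headroom_gb(card_gb=10, game_gb=7.0))  # 1.0 GB spare on a 10GB card
print(vram_headroom_gb(card_gb=24, game_gb=7.0))  # 15.0 GB spare on a 24GB card
```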
 
The only thing I will want at 8K or above resolutions is VR.

That's true, good point. However I think VR tends to be more CPU-bound in a lot of games given the complex physics going on. Currently, anyway, but I guess that could change once higher-performance cards are released.
 
Pretty sure Kyle disproved that in BFV and it was my fault he went down that rabbit hole lol.
 
Please don't spread misinformation. There is no 2 threads per CUDA core nonsense. The 2 FP32 pipes work on different warps in parallel.

Of course you’re not going to see the full performance benefit if the bottleneck is elsewhere. But that goes for any architecture.
This is what was put out in the official launch event, captions on:

[Attached image: Ampere shader operations per clock slide]
It can now do two FP32 operations, or one FP32 and one integer operation, per clock in each CUDA core, as I understand it. Nvidia added the second FP32-per-clock capability to the CUDA core for Ampere.

Nvidia states the 3090 has 36 TFLOPS of FP32 performance with 10496 CUDA cores. If it indeed has 10496 CUDA cores, each able to do 2 FP32 calculations per clock, the theoretical floating-point performance is calculated as:
  • # CUDA cores x frequency x 2 FP32 per clock
  • 10496 x 1.7GHz x 2 = 35.7 TFLOPS
Really, I can only conclude that it actually does have that number of CUDA cores if that FP32 spec is right. How in the hell did they double the number of CUDA cores? It's like AMD doubling the number of CUs without the chip size changing that much. Hopefully the white paper will explain this better.
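For what it's worth, the arithmetic above checks out in a quick sketch (the ~1.7GHz boost clock and the 3080 figures are taken from published spec sheets, so treat them as assumptions):

```python
# Theoretical FP32 throughput: CUDA cores x boost clock x 2 FP32 ops per clock.
# (cores x GHz gives giga-operations/sec; /1000 converts GFLOPS to TFLOPS)
def fp32_tflops(cuda_cores, boost_clock_ghz, fp32_per_clock=2):
    return cuda_cores * boost_clock_ghz * fp32_per_clock / 1000.0

print(fp32_tflops(10496, 1.70))  # RTX 3090: ~35.7 TFLOPS
print(fp32_tflops(8704, 1.71))   # RTX 3080: ~29.8 TFLOPS (assumed boost clock)
```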
 
8K seems like a non-starter to me. There are barely any displays and even now with smaller 4K displays we are reaching the limit of being able to discern individual pixels. By the time if/when it becomes mainstream, the 3090 will likely be obsolete.
It is a non-starter, but that 8K LG OLED is so sexy... I need a 77" version of it.
 
It can now do two FP32 operations, or one FP32 and one integer operation, per clock in each CUDA core, as I understand it. Nvidia added the second FP32-per-clock capability to the CUDA core for Ampere.

They simply added more cores. I don't know where this "multiple instructions per CUDA core" stuff came from. I posted Nvidia's explanation below in another thread.

Really, I can only conclude that it actually does have that number of CUDA cores if that FP32 spec is right. How in the hell did they double the number of CUDA cores? It's like AMD doubling the number of CUs without the chip size changing that much. Hopefully the white paper will explain this better.

A CUDA core isn't the same as a CU. Nvidia's CU equivalent is an SM, and a CUDA core is just one of the many execution units in the SM. They added 50% more transistors for only 17% more SMs compared to TU102. So clearly each SM is much bigger, and part of that increase is the addition of another FP32 pipe.

[Attached image: quoted Nvidia explanation of the Ampere SM]
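One way to see where the doubling comes from (SM counts and per-SM FP32 unit counts pulled from public specs, so take them as assumptions): each Ampere SM exposes 128 FP32 "CUDA cores" versus 64 in Turing, so the headline core count doubles per SM even though the SM count barely moves.

```python
# Rough sanity check of the headline CUDA core counts.
AMPERE_FP32_PER_SM = 128   # two 64-wide FP32 datapaths per Ampere SM
TURING_FP32_PER_SM = 64    # one 64-wide FP32 datapath per Turing SM

cards = {
    "RTX 3090 (GA102, 82 SMs)":    82 * AMPERE_FP32_PER_SM,  # 10496
    "RTX 3080 (GA102, 68 SMs)":    68 * AMPERE_FP32_PER_SM,  # 8704
    "RTX 2080 Ti (TU102, 68 SMs)": 68 * TURING_FP32_PER_SM,  # 4352
}
for name, cores in cards.items():
    print(f"{name}: {cores} CUDA cores")
```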
 
For what, mate? 🙄 If you sit 6ft away, 4K will be quite sufficient. If you will be 6 inches away, well...

I really can't believe anyone would seriously consider 8K. It's the kind of thing that would be useful if you were inspecting X-rays, so you can get inches away from portions of the screen and still have good resolution, but when taking in the screen as a whole, 4K will be overkill in most instances.
 
I wanted the 3090 to up the resolution in VR. But since the 3080 turned out to be so much faster and so much cheaper than expected, I'm just going to get it, then sell it and buy a 3080 Ti when it comes out in 8-9 months.
 
I really can't believe anyone would seriously consider 8K. It's the kind of thing that would be useful if you were inspecting X-rays, so you can get inches away from portions of the screen and still have good resolution, but when taking in the screen as a whole, 4K will be overkill in most instances.

I had a 32" 4K monitor when I got my 1080ti years ago and I sold it after a couple months because it was useless. Any resolution less than 4k also looked blurry. So it was native 4K for EVERYTHING or bust. And some older games don't have UI scaling so text was impossible to read... so yeah 4K is totally worthless. I got a 1440p 144hz monitor a while ago and i'll never look back. The resolution is perfect for looks and performance, and it's also my first monitor that goes over 60hz and I've really missed out all those years. 1440p/2k is the best resolution for gaming. Period. Especially when considering many games now have resolution scaling if you want to bump the render resolution past your native screen res.
 
Bet some people be salty.

Well, they bought a Gen 1.0 product... if they didn't know this was obviously going to happen, that's on them. Even this Gen 2.0 will be seriously surpassed by Gen 3.0 ray tracing cards. At least with Gen 2.0 we now get decent performance, unlike with Gen 1.0.
 
Yeah I read that this morning, and again the very first question regarding the 10GB of memory for the 3080 really highlights how much people don't know what they need and how much of it they need. 10GB is more than enough for 99.9% of games.

Yeah there is some FUD surrounding that topic for sure. Even in instances where a game will take more than 8 GB or so, it's often not necessary for performance to be good.
 
Good read. Definitely makes me feel better about getting the 3080 for 4k120.

The 3080 is going to be a really hot item. I doubt most of us will be able to get our hands on one this year. I bet resellers will be going crazy with them too. They'll be selling on eBay for more than a 3090 by the end of this month.
 
Unfortunately, you are probably right. Good thing I still have a 2070 Super to tide me over until I can snag one. I will not be paying scalper prices.
 
As someone with a 2080ti myself I am curious why you are so determined to sell your 2070 Super if it's meeting your needs right now (unless it's not)? Are there games that aren't performing where you want them to right now, or are you just trying to cut your losses and sell your 2070 Super sooner than later since the prices for the 20 series cards dive-bombed and you want to upgrade to be more future-proof?

I personally am still really happy with my 2080ti. I used EVGA's step-up program over a year ago to get it and in total paid $1200 for the card and while it's frustrating to see my card performance:value be totally decimated I don't feel any need whatsoever to upgrade. All my games perform above my expectations and I suspect the 2080ti will still be great for at least 2-3 more years. So I am in no way going to upgrade right now especially because of how pricey this card was and I want to get my money's worth by holding onto it as long as the games I want to play are meeting my performance expectations.
 
LG OLEDs need the Nvidia 30 series to do optimal 4K @ 120Hz due to HDMI limitations on previous cards.
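A rough back-of-envelope on why 4K @ 120Hz needs HDMI 2.1 (blanking intervals and exact coding overheads are simplified, and the effective-bandwidth figures are approximate):

```python
# Approximate uncompressed video data rate vs. HDMI link budgets.
def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

rate_4k120 = uncompressed_gbps(3840, 2160, 120, 24)  # ~23.9 Gbps before blanking

HDMI_2_0_PAYLOAD_GBPS = 14.4  # ~18 Gbps TMDS minus 8b/10b coding overhead
HDMI_2_1_PAYLOAD_GBPS = 42.6  # ~48 Gbps FRL minus 16b/18b coding overhead

print(rate_4k120 > HDMI_2_0_PAYLOAD_GBPS)  # True  -> won't fit on HDMI 2.0
print(rate_4k120 < HDMI_2_1_PAYLOAD_GBPS)  # True  -> fits on HDMI 2.1
```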
 
As someone with a 2080ti myself I am curious why you are so determined to sell your 2070 Super if it's meeting your needs right now (unless it's not)? Are there games that aren't performing where you want them to right now, or are you just trying to cut your losses and sell your 2070 Super sooner than later since the prices for the 20 series cards dive-bombed and you want to upgrade to be more future-proof?

I personally am still really happy with my 2080ti. I used EVGA's step-up program over a year ago to get it and in total paid $1200 for the card and while it's frustrating to see my card performance:value be totally decimated I don't feel any need whatsoever to upgrade. All my games perform above my expectations and I suspect the 2080ti will still be great for at least 2-3 more years. So I am in no way going to upgrade right now especially because of how pricey this card was and I want to get my money's worth by holding onto it as long as the games I want to play are meeting my performance expectations.
I'm not selling it. It will become my backup card. I am a little disappointed in its performance right now because it struggles in certain titles. CoD: MW, Control, and Borderlands 3 all tank when you crank up the settings at 4K.

As MissJ84 mentioned, I have an LG CX OLED, and the only way to realize the full potential of this display is with HDMI 2.1. 4k120 is the holy grail for high resolution displays at the moment, and I want a video card that can make this LG CX shine.
 
The 3080 10GB should be plenty to power my 27" 1440p 144Hz monitor. If the 3070 is equal to or better than the 2080 Ti in traditional raster/RT, there's no telling how much better the 3080 is! Hope I can snag one at Best Buy on the 17th!
 