GeForce RTX 3080 Ti Reportedly Shows Up In Nvidia Driver

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
11,048
"On the memory front, the GeForce RTX 3080 Ti seemingly features 20GB of GDDR6X memory that runs at 19.5 Gbps across a 320-bit memory interface. That means that graphics card could offer a maximum memory bandwidth up to 780 GBps, just 2.6% more than the GeForce RTX 3080. The GeForce RTX 3090 still delivers up to 20% higher bandwidth though.

If you look at it closely, the performance difference between the GeForce RTX 3090 and the RTX 3080 Ti should be minimal in workloads where the amount of memory and memory bandwidth aren't important factors. The price tags, however, are as different as chalk and cheese.


The Radeon RX 6900 XT sports a $999 price tag, and it's still a big price to pay for a graphics card. However, the Navi-powered graphics card looks attractive beside the GeForce RTX 3090, which sells for $1,499. If the GeForce RTX 3080 Ti were to debut at $999, it would definitely give the Radeon RX 6900 XT a run for its money."


https://www.tomshardware.com/news/nvidia-driver-geforce-rtx-3080-ti-ampere-gpu
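The bandwidth math in the quoted article checks out: peak bandwidth is the per-pin data rate times the bus width in bits, divided by 8 bits per byte. A quick sketch to verify the figures (the 3080 and 3090 numbers are published specs; the Ti numbers are the article's rumored ones):

```python
def peak_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gbps) * bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3080":            (19.0, 320),  # published spec
    "RTX 3080 Ti (rumor)": (19.5, 320),  # figures from the article
    "RTX 3090":            (19.5, 384),  # published spec
}

for name, (rate, bus) in cards.items():
    print(f"{name}: {peak_bandwidth_gbps(rate, bus):.0f} GB/s")

# RTX 3080: 760 GB/s
# RTX 3080 Ti (rumor): 780 GB/s  -> 780/760 = +2.6% over the 3080
# RTX 3090: 936 GB/s             -> 936/780 = +20% over the rumored Ti
```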
 
Aligns with all the rumors/leaks reported thus far. A $999 MSRP makes sense; guessing the AIB versions will mostly be $1,100-1,200 (probably where most 6900 XT AIB prices will also land). The question now is when it'll come out. The same rumors are reporting January, from what I remember.
 
I'm not complaining, and I'll probably buy some version of a 3080 if one becomes actually available - but it's interesting how NV/AMD have just 'turned' the $1000 price point into yeah, whatever, that's an actual price bracket now.

I know there have been Titans, 2080 Tis, etc. for a long time, but now we have actual competition, and all that did make people talk about which $1,000 card is the one to buy.

From what I see from my friends, forum posts, Slickdeals, etc., people are still super interested in the $200 price point. This battle in the $600, $700, $800, $1,000 area is just fantasy land.
 
I'm not complaining, and I'll probably buy some version of a 3080 if one becomes actually available - but it's interesting how NV/AMD have just 'turned' the $1000 price point into an actual price bracket now. [...]
Yeah, it's unfortunate, but it's just the reality of the situation now. I remember when the 3080 debuted: its price tag, plus AMD being competitive, was supposed to drive down prices and make it a win for everyone. Seems the opposite happened instead. The hardware is amazing, though, and I can't wait till I get my hands on them.
 
At least AMD has something in the halo category. But yeah, once a 3080 Ti releases, if it's $999 the 6900 XT will need a price cut to attract buyers... unless there continue to be massive shortages. Then all of these 6900s and Tis will sell at the street price, simply by being available.
 
[...] From what I see from my friends, forum posts, Slickdeals, etc., people are still super interested in the $200 price point.
RX 580/1650 Super for $200... same old tech. If you want "new" you have to spend $400 and up or go previous gen, so it seems.

Maybe RTX 3050 for $200 and RTX 3050 Ti for $300?

3050 = GTX 1070 performance. Expect a 4-6GB card.
3050 Ti = GTX 1080 performance. Expect a 6GB card.
 
[...] unless there continue to be massive shortages. Then all of these 6900s and Tis will sell at the street price, simply by being available.
If I understand the reports from various suppliers along the chain, Nvidia has the clear capacity to outbuild AMD right now. AMD is stretched very thin trying to supply a lot of parts to a lot of people.
 
I probably won't be jumping to 4K before cards get faster anyway, so the extra VRAM doesn't seem worth the money.

Plus, I am interested to see what RTX I/O brings, and if that makes any difference for VRAM usage.
 
Maybe RTX 3050 for $200 and RTX 3050 Ti for $300? [...]

$300 is more likely the 3060; the 3050s will be $200 and $250 if they stick to $50 increments.
 
You sure you don't wanna wait for the 3080 Ti Super Mega OC Deluxe Founders Edition EXTREME X3 GAMRZ RGB Wallet Buster Edition?

I hear it comes with a free Funko Pop Anime figurine...

Sounds tempting. Do you think you could connect me with a scalper bot who will be able to acquire one before the RTX 4000 series launch?
 
$999 seems inevitable. Nvidia has a huge hole in their lineup between the $700 3080 and $1500 3090.

I'll be interested to see what benefits the extra memory brings. My 3080 runs Cyberpunk well, but not at 4K. I'm guessing that's more of an overall horsepower thing and not tied directly to VRAM. If VRAM is not the limiting factor, the performance bump might be fairly minor.
 
[...] My 3080 runs Cyberpunk well, but not at 4K. [...]
How much VRAM does Cyberpunk use on your 3080?
 
I know how to check VRAM allocation but I don't know of a good way to see actual VRAM usage. Happy to check if anyone can suggest a way to do that.
You can use MSI Afterburner; it has an option in the settings to show how much VRAM is being used, both as a percentage and in megabytes. Tick "Show in OSD".
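If you'd rather script it than use an overlay, NVML (the library behind nvidia-smi) exposes device memory through the pynvml Python bindings. Note this reports memory allocated on the device, not what a game is actively using. A minimal sketch, assuming pynvml is installed:

```python
# pip install nvidia-ml-py  (provides the pynvml module)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0

# Device-level allocation: same caveat as Afterburner's counter.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB "
      f"({100 * mem.used / mem.total:.1f}%)")

pynvml.nvmlShutdown()
```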
 
[attached screenshot: 1607642284006.png]

While using a 3090.
 
[...] it's interesting how NV/AMD have just 'turned' the $1000 price point into yeah, whatever, that's an actual price bracket now.

The USD isn't worth what it was years ago, especially with imported goods.
 
There is "usage" and then there is "allocation". Semantics, I know. Just realize that using tools like Afterburner only ever guarantees vram allocation. There's not enough info there to say anything about how much allocated is actually being used

You pretty much need the game itself to confirm usage.
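That caveat holds even one level deeper: NVML's per-process counters also report allocated memory, not touched memory. A sketch of that per-process view (pynvml again; games show up in the graphics process list):

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# usedGpuMemory is still *allocation* per process; only the engine itself
# knows how much of that it actually touches each frame.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    mib = (proc.usedGpuMemory or 0) / 2**20  # can be None on some platforms
    print(f"pid {proc.pid}: {mib:.0f} MiB allocated")

pynvml.nvmlShutdown()
```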
 
If it appears in the drivers already, does that mean it'll be released soon? I'm just a few days into my 3-month Step-Up window with EVGA.
 
Yeah, this is why I'm kind of glad I haven't been able to get a 3080, TBH. I'd waste the money and get one if I could, but gaming at 4K, that 10GB of VRAM isn't going to hold up as time goes on. Really wish they had released the 3080 with like 12GB.
Given the performance gap between the 3080 and the 3090, I'm not terribly sure the extra memory would do as much as people would want without added clock speeds.
 
The USD isn't worth what it was years ago, especially with imported goods.
When Pascal released in April 2016:
https://www.x-rates.com/historical/?from=USD&amount=1&date=2016-04-04
https://www.x-rates.com/historical/?from=USD&amount=1&date=2020-12-10

Not sure how relevant that is to 3080 Ti pricing; a quick look shows the USD is worth more against almost every currency now (Canadian dollar, pound, euro, Australian dollar, etc.), and the price increases seem to have been quite similar across regions. I think GPUs getting used by wealthier people for more commercial reasons, plus the change in mentality when the mining craze shifted perceptions, all contributed to this.

There is "usage" and then there is "allocation".
And how much of a cost is there, actually, to flirting with your VRAM limit (or going just a bit over it)? If those numbers are usage and not allocated RAM, then according to the guru article:

The 3070 / 3060 Ti with 8GB of VRAM vs. the 11GB cards:

At 1440p (fps):
2080 Ti: 63
3070: 60
3060 Ti: 53
1080 Ti: 36

At 4K (fps):
3070: 31
2080 Ti: 31
3060 Ti: 26
1080 Ti: 18

The relative performance of the 8GB Ampere cards versus the previous 11GB cards (2080 Ti, 1080 Ti) actually improves at 4K in that title, which is more reassuring that VRAM won't be a big issue than the other way around, IMO.

Same for the 10GB 3080 vs. the 16GB AMD card:

1440p:
3080: 76
6800 XT: 68 (10.5% slower)

4K:
3080: 40
6800 XT: 34 (15% slower)

It is still early to lean on that title (there was a scenario where the 3080-vs-3090 ratio dropped off, indicating a potential lack of VRAM causing a performance issue, but at that level the 3090 couldn't hold a playable framerate either; again reassuring, since any scenario where VRAM starts to be an issue may be one the card isn't strong enough for anyway).
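For anyone who wants to redo the arithmetic behind that argument: if the 8GB card lost ground going from 1440p to 4K, that would hint at a VRAM wall; instead it gains ground. A quick sketch using the fps figures quoted above:

```python
# fps figures from the post; ratio = 8GB card relative to the 11GB 2080 Ti.
fps = {
    "1440p": {"2080 Ti (11GB)": 63, "3070 (8GB)": 60},
    "4K":    {"2080 Ti (11GB)": 31, "3070 (8GB)": 31},
}

for res, r in fps.items():
    ratio = r["3070 (8GB)"] / r["2080 Ti (11GB)"]
    print(f"{res}: 3070 runs at {ratio:.1%} of the 2080 Ti")

# 1440p: 95.2%  ->  4K: 100.0%
# The 8GB card scales *better* at 4K here, so VRAM doesn't look like
# the limiting factor in this title yet.
```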
 