RTX 3xxx performance speculation

Nice, thanks. Or maybe I'll buy a 3090/3090 Ti, not sure yet. :) But I'm asking about drivers in case I don't end up selling my 2080 Ti. ;)

It was hard selling my 2080 Ti, but I always have seller's remorse when selling a GPU for 70% of what I paid for it. Eventually I forget and just accept the price of early adoption.

I bought my 980 Ti through EVGA with a 10-year warranty, since it can still play anything at 1080p at high fps. I might do the same for the 3090, as it should hold me at 120-144 fps at 1440p for virtually all games.
 
So DLSS 3.0 will not be on the 2080 Ti? Should I be worried? :)

The 2080 Ti will remain a great card for another few years... the highest-end cards always keep their value longer... if I had a 2080 Ti I don't think I would even want to upgrade to Ampere (unless I had a 4K monitor).
 
Who said 3070 gets by on 2080 super bandwidth? We don’t have any info on the 3070. All the leaks have been on GA102.


So you're assuming super-expensive GDDR6X memory on the entire lineup?

The last time we made that universal jump was with the 6600 GT and up, but that was because GDDR3 really was mass-produced and standardized by then (before GDDR2, actually).

But they still had a pre-launch test card (to build up manufacturing):

https://techreport.com/review/6477/nvidias-geforce-fx-5700-ultra-with-gddr3/

Just like last time, who else besides Micron is building this hack job?

Either they put the 3060 on 192-bit Jesus RAM, or they go with 256-bit 14 Gbps GDDR6. But if GDDR6X is as expensive as their last hack, it won't be worth it. 2070 Super performance is going to be hard to do on less than 400 GB/s, so it has to be one of those.
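A quick sanity check on those bandwidth numbers (a rough sketch; the 19 Gbps GDDR6X figure for a hypothetical 192-bit 3060 is my own assumption, not a leak):

# Bandwidth in GB/s = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(256, 14))  # 256-bit GDDR6 @ 14 Gbps          -> 448 GB/s
print(bandwidth_gbs(192, 19))  # 192-bit GDDR6X @ 19 Gbps (guess) -> 456 GB/s
# Either option clears ~400 GB/s, i.e. roughly what the 2070 Super gets from its 256-bit 14 Gbps bus.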
 
So you're assuming super-expensive GDDR6X memory on the entire lineup?

The last time we made that universal jump was with the 6600 GT and up, but that was because GDDR3 really was mass-produced and standardized by then.

But they still had a pre-launch test card (to build up manufacturing):

https://techreport.com/review/6477/nvidias-geforce-fx-5700-ultra-with-gddr3/

Just like last time, who else besides Micron is building this hack job?

I’m not assuming anything since I don’t know Nvidia’s performance target for the 3070. They’re certainly going to use the cheapest memory they can get away with though.
 
I’m not assuming anything since I don’t know Nvidia’s performance target for the 3070. They’re certainly going to use the cheapest memory they can get away with though.

My guesswork: they will launch with gaps in the lineup for Super/Ti models to follow later.

So they might start with GDDR6 on the RTX 3070, upgrading to GDDR6X and more cores for the eventual 3070 Super/Ti.

The 3080, meanwhile, could get another GDDR6X memory channel (11 GB/22 GB) for the 3080 Super/Ti.

Who knows if there is anything left in the tank for a 3090 Super/Ti; the 2080 Ti never got a Super upgrade because there was no competing AMD part that required one.
 
The 2080 Ti will still be as fast as, or at worst a hair slower than, the incoming second-tier card, the RTX 3080. This has been the pattern for a while now: the new second-tier cards (GTX 1080, RTX 2080, etc.) are just as fast as the previous high-end Ti model.

The one area where performance may be substantially better is ray tracing, so it's possible even the 3070 could have better RT performance than the 2080 Ti. But don't forget the 3070 will also have cut-down RT hardware, so I'm doubtful it'll beat the 2080 Ti at RT.

Where are you getting this info? The 3080 should be 20% faster than the 2080 Ti, right?
 
We won't know exact numbers till we get release drivers and hardware. Anything else is just speculation at this point.


With a 60% increase in bandwidth over the 2080 Ti, I would expect at least a 40% performance bump in normal games.

RTX games could see even more.

If NVIDIA were not anticipating a well-over-50% increase in RTX performance, there would be no point in designing new memory to power this thing. With ONLY a 40% RTX improvement as your performance target, you could just rely on a plain 16 Gbps RAM upgrade (on a full 384-bit bus) and get roughly a 25% bandwidth bump.

The RTX 2080 Ti actually has tons of spare bandwidth: the MSI card with 16 Gbps RAM only performs about 3% faster (same core clock, but 16 Gbps versus 14 Gbps memory):

https://www.reddit.com/r/nvidia/comments/giu5d5/msi_geforce_rtx_2080_ti_gaming_z_trio_tested/

So a 25% real-world bandwidth increase would cover up to a total 50% RTX improvement.

We're assuming there will be a higher RTX improvement, because optimizing first-generation RT units should be a lot easier than adding more raw shader units. So if they are feeding the 3090 60% higher bandwidth, I would expect up to a 50% increase in normal performance, and up to 100% in RTX.
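Rough math behind those percentages (a sketch; the 384-bit bus and ~21 Gbps GDDR6X for the 3090 are the rumored figures, not confirmed):

# Back-of-the-envelope bandwidth comparison against the 2080 Ti (352-bit @ 14 Gbps)
def bw(bus_bits, gbps):
    return bus_bits * gbps / 8  # GB/s

rtx_2080ti = bw(352, 14)    # ~616 GB/s, the shipping config
plain_16gbps = bw(384, 16)  # ~768 GB/s: just a full 384-bit bus with 16 Gbps GDDR6
rumored_3090 = bw(384, 21)  # ~1008 GB/s: rumored GDDR6X config (assumption)

print(f"16 Gbps GDDR6 option: +{plain_16gbps / rtx_2080ti - 1:.0%}")  # ~ +25%
print(f"Rumored 3090 GDDR6X:  +{rumored_3090 / rtx_2080ti - 1:.0%}")  # ~ +64%, i.e. the "60%+" figure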
 
Tuesday can't come soon enough.
But with every passing day I'm more and more leaning towards getting a PS5 and a powerful laptop (Alienware m15 R3 with a 2070 Super or Alienware m17 R3 with a 2080 Super) and going down from 43" 4K/120 Hz to 27" 1440p/60 Hz. It will be enough for light gaming and MMORPGs.
 
Tuesday can't come soon enough.
But with every passing day I'm more and more leaning towards getting a PS5 and a powerful laptop (Alienware m15 R3 with a 2070 Super or Alienware m17 R3 with a 2080 Super) and going down from 43" 4K/120 Hz to 27" 1440p/60 Hz. It will be enough for light gaming and MMORPGs.

Is this a response to rising GPU prices?
You are going to buy a $3,000-$4,000 laptop plus a $600 console to save money?

If you already have a PC, a sub-$1,000 3080 will utterly destroy a laptop with a 2080 Super or a console; pretty sure even a 3070 will do that...
 
Is this a response to rising GPU prices?
You are going to buy a $3,000-$4,000 laptop plus a $600 console to save money?

If you already have a PC, a sub-$1,000 3080 will utterly destroy a laptop with a 2080 Super or a console; pretty sure even a 3070 will do that...

My MSI GS65 with a 2080 Max-Q gets destroyed by my 8700K and GTX 1080.
 
Is this a response to rising GPU prices?
You are going to buy a $3,000-$4,000 laptop plus a $600 console to save money?

If you already have a PC, a sub-$1,000 3080 will utterly destroy a laptop with a 2080 Super or a console; pretty sure even a 3070 will do that...

Lower power draw
Portability
Space saving
Money saving (Dell usually runs some pretty big discounts)

And yes, it's partly a response to rising GPU prices, but mainly the above.
 
Time for some speculation about 2000/3000 performance and cost comparisons. I'm bored at work today so I used this TweakTown article that shows the cost breakdowns and a performance comparison:

Tweet:
  • "GeForce RTX 3070 has around the same level of performance of the GeForce RTX 2080 Ti"
  • "You all can do the math on how much stronger the 3080 & 3060 are relative to this based on the previous gen"
Using TechPowerUp's GeForce 2070 FE page as a guide, you can start to work out how powerful the 2070 was compared to the entire 2000-series product stack. Granted, this is only at 1080p, but it's good enough for a quick comparison. Why use the 2070? Because it's the same x70 tier, but I guess you could use any 2000-series product as long as you kept the comparison math the same.

Where does this leave us? With the chart below. We have the regular 3060 (FE?) being just a hair better than the 2080 FE for the same estimated $399 MSRP as last gen. The x60 keeps the same price but, yet again, we see a price increase on the x70 and x80 tiers: they are $100 more than the 2000 gen, which was already $100 more than the 1000 gen. Great news if you own stock in Nvidia (I do), and the idiots will still go out and buy it.

Yes, you have a $599 part equaling last gen's $1,000 part. But how is this different from the 1000-to-2000 transition? Look at that same TPU link: the 2070 is about 90% of the 1080 Ti. And 900 series to 1000 series? The 980 Ti is about 8% slower than the 1070. So the new card has landed equal to the old flagship, give or take 10%, for the past few generations; nothing really to write home about. Except for the higher prices, that is!
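If you want to redo the estimate yourself, the method is just ratio scaling: anchor the 3070 at 2080 Ti level, then space out the rest of the 30-series stack using last gen's relative-performance numbers. A minimal Python sketch (the spacing values below are placeholders for illustration, not real TPU figures):

# Ratio-scaling sketch: estimate the 30-series stack from the "3070 ~= 2080 Ti" leak.
# The spacing values are PLACEHOLDERS; fill them in from TechPowerUp's relative-performance chart.
tier_spacing = {"3060": 0.75, "3070": 1.00, "3080": 1.20}  # each tier vs. its generation's x70 card
perf_2080ti_vs_2070 = 1.35                                 # placeholder: look this up on TPU

perf_3070 = perf_2080ti_vs_2070  # the leak: the 3070 lands at 2080 Ti level
for card, spacing in tier_spacing.items():
    print(f"Estimated {card}: {perf_3070 * spacing:.2f}x an RTX 2070")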

I tried to attach the Excel file (zipped and renamed to .zzz) in case you all wanted to check my math, but I can't seem to attach it. Weird.

[Attachment: estimated 2000-vs-3000 price/performance comparison chart]

I'd expect the Super version prices to come in a bit higher, and not sure we'll ever see those Ti models, but all in all, I'd concur with those estimates.
 
I'd expect the Super version prices to come in a bit higher, and not sure we'll ever see those Ti models, but all in all, I'd concur with those estimates.
Historically, yes, the Ti or refresh models do come in a bit higher. However, for the 2000 gen, due to AMD, the Super models came in at the same price, and the non-Supers were then price-cut.
 
Historically, yes, the Ti or refresh models do come in a bit higher. However, for the 2000 gen, due to AMD, the Super models came in at the same price, and the non-Supers were then price-cut.

Every situation is different. This time AMD is likely releasing RDNA2 very soon after Ampere, so NVidia may just be forced to do a quick price adjustment, which is the more typical response. Then a Ti/Super is just a later update.

I wouldn't be surprised if AMD also released more Big Navi info right after the NVidia event on Sept 1st to convince more people to wait on the sidelines for RDNA2. It's a reasonable tactic.
 
Every situation is different. This time AMD is likely releasing RDNA2 very soon after Ampere, so NVidia may just be forced to do a quick price adjustment, which is the more typical response. Then a Ti/Super is just a later update.

I wouldn't be surprised if AMD also released more Big Navi info right after the NVidia event on Sept 1st to convince more people to wait on the sidelines for RDNA2. It's a reasonable tactic.
If AMD doesn't try to piss in Nvidia's Cheerios by announcing something, even just a reveal date, or releasing benchmarks to tempt people waiting on the fence, they don't deserve to increase market share. Look at what Jeep did when Ford announced the Bronco: they announced updates and gave people reasons to think about getting a Jeep instead. No more passes for being poorly run.
 
Why would I buy a card with 10 GB of VRAM when my current card, a 1080 Ti, has 11? I have seen them do this before with the 980: they give you the horsepower, but not the VRAM, to do incredible things. Then they offer the 3090 with 24 GB?

There is probably going to be buyer's remorse with the 3080. Blatant planned obsolescence.

I promise that after the launch you will start hearing about high VRAM requirements in upcoming games.

3090, or wait for a 3080 Ti.
 
Why would I buy a card with 10 GB of VRAM when my current card, a 1080 Ti, has 11?
What if you only use 7GB or 8GB?

Remember that the 1080 Ti and 2080 Ti have 11 GB because Nvidia was either cutting back or being generous relative to some specific point; that's just what happens with specific memory controller configurations.

Then they offer the 3090 with 24 GB?
This is very clearly prosumer targeted. Perhaps a different targeting than the Titan line, even, probably more toward lower-level content creation or entry-level machine learning and so on.
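For what it's worth, the 11 GB / 10 GB / 24 GB numbers all fall straight out of memory-controller arithmetic. A rough sketch (the 320-bit 3080 and 384-bit 3090 bus widths are rumored/assumed here, not confirmed):

# VRAM capacity = number of 32-bit memory channels x capacity per channel
def vram_gb(bus_width_bits, gb_per_channel):
    return (bus_width_bits // 32) * gb_per_channel

print(vram_gb(352, 1))  # 1080 Ti / 2080 Ti: 352-bit, 1 GB per channel                  -> 11 GB
print(vram_gb(320, 1))  # rumored 3080: 320-bit, 1 GB per channel (assumption)          -> 10 GB
print(vram_gb(384, 2))  # rumored 3090: 384-bit, 2 GB per channel, clamshell or         -> 24 GB
                        # double-density chips (assumption)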
 
If you're waiting for GPU prices to fall, you first need to hit 88mph...

I don't expect GPU prices to fall. However, this time around, I have other options.
It might be a good tactic to get the new console and a laptop for editing photos and light 1440p gaming, and see how the whole ray tracing battle between Nvidia and AMD plays out. Especially with Intel waving its fat hand and telling us about high-end GPUs in 2021 (which I highly doubt, but there should definitely be some info next year).
 
I'd expect the Super version prices to come in a bit higher, and not sure we'll ever see those Ti models, but all in all, I'd concur with those estimates.

Please, God, let's not do the Super name again. 3060, 3070, 3080, and 3090 this year, then 3060 Ti, 3070 Ti, 3080 Ti, and 3090 Ti next year. A 3060 Ti would be close to 3070 performance at the 3060 price, and so on.

Simple.
 
Why would I buy a card with 10 GB of VRAM when my current card, a 1080 Ti, has 11? I have seen them do this before with the 980: they give you the horsepower, but not the VRAM, to do incredible things. Then they offer the 3090 with 24 GB?

There is probably going to be buyer's remorse with the 3080. Blatant planned obsolescence.

I promise that after the launch you will start hearing about high VRAM requirements in upcoming games.

3090, or wait for a 3080 Ti.

Why would you buy a 2080 or 2080S when both are only 8 GB? Amazingly enough, the 2080 cards do better in nearly all scenarios.

Game VRAM requirements increase, but so do GPU demands, so it's a wash.

If the 3080 is able to play 4K ultra using close to 10 GB in today's games, it will probably only be fast enough for 4K medium or 1440p high four years from now. Both of those 2024 scenarios will most likely use less VRAM than today's situation.

Really not sure why so many can't grasp this. It played out the same way with the 4 GB Fiji.
 
Toying with whether to sell my 2080 Ti or not; I had an offer at $970 on fleabay but let it expire. I can sell it, net around $830 (about what I paid Zotac for it), get a 2060 as a placeholder, and then get a 3080. Or just keep on enjoying Horizon Zero Dawn and see where the chips fall next week, lol. Given how long the 1080 Tis held their value, I figure I may be better off enjoying the 2080 Ti until at least the holiday season and then doing a properly timed swap. In the end I think it boils down to whether I should card-hop to save a potential $200, the age-old time vs. money question, and time is something I have very little of. :)

I do not see 2080 Tis dropping below $600 (more likely around $800 on average) any time soon...
 
Selling my 2080 Ti for $1k cash. It was hard to do, but I think moving to the RTX 3080 will be an all-around better move for me, especially if the performance is the same or better. Hopefully it's around $800. The 3090 is too rich for my blood.
 
I used to avoid Founders Edition models, but the Ampere ones look really good from a design standpoint... they look great visually as well...
 
I used to avoid Founders Edition models, but the Ampere ones look really good from a design standpoint... they look great visually as well...
I just want a push-pull 140mm AIO for it so that it doesn't rise above ambient noise at stock ;)
 
I don't see any comments...have they been removed?

Weird, they are there for me. Do you have a script blocker enabled?

I'm on my phone so I can't screenshot the whole thread easily, but they are just speculating that the 3080 and 3070 have double the number of models compared to the 3090 in each category. Four share one letter in the name and four share another. It could therefore be that they are offering two different memory options, or they could just be placeholders.

They also pulled up 20-series cards to compare, and all of the 8 GB cards have 'P2' in the same spot as some of the 30-series cards, so the guess is that the 3060 and 3070 have 8 GB models, with the 3070 getting a 16 GB option.
 
Weird, they are there for me. Do you have a script blocker enabled?

I'm on my phone so I can't screenshot the whole thread easily, but they are just speculating that the 3080 and 3070 have double the number of models compared to the 3090 in each category. Four share one letter in the name and four share another. It could therefore be that they are offering two different memory options, or they could just be placeholders.


Quite possible, I'm at work now. Guess I'll check it in an hour.
 
A 3070 with 16 GB, instead of the stock 8 GB, would have to have double-density memory chips on it. Those will not be inexpensive.

Right, and I expect the PS5 will be launching with sixteen 8 Gbit GDDR6 chips. Much like with the PS4, there will eventually be a cost-reduced 8-chip model.


But those 16 Gbit chips have been available since 2018, so they have had time to work out any production issues:

https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory

They have to become economical at some point. Right after the PS4 released with 2 Gbit chips, Big Maxwell jumped up to double the density!

So right about the time a console decides a new memory density is cheap enough to mass-produce is exactly the time the next density step starts to get affordable for high-end cards.
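The capacity math, for reference (the chip counts and the 256-bit 3070 bus are the guesses from this post and the one above, not confirmed specs):

# Total memory = chip count x per-chip density (Gbit), divided by 8 to get GB
def total_gb(num_chips, gbit_per_chip):
    return num_chips * gbit_per_chip / 8

print(total_gb(16, 8))  # PS5 launch guess above: sixteen 8 Gbit chips              -> 16 GB
print(total_gb(8, 16))  # cost-reduced follow-up: eight 16 Gbit chips               -> 16 GB
print(total_gb(8, 8))   # stock 3070 guess: eight 8 Gbit chips on a 256-bit bus     -> 8 GB
print(total_gb(8, 16))  # 16 GB 3070: same bus, double-density 16 Gbit chips        -> 16 GB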
 
Toying with whether to sell my 2080 Ti or not; I had an offer at $970 on fleabay but let it expire. I can sell it, net around $830 (about what I paid Zotac for it), get a 2060 as a placeholder, and then get a 3080. Or just keep on enjoying Horizon Zero Dawn and see where the chips fall next week, lol. Given how long the 1080 Tis held their value, I figure I may be better off enjoying the 2080 Ti until at least the holiday season and then doing a properly timed swap. In the end I think it boils down to whether I should card-hop to save a potential $200, the age-old time vs. money question, and time is something I have very little of. :)

I do not see 2080 Tis dropping below $600 (more likely around $800 on average) any time soon...

Makes no sense to sell a 2080 Ti just to buy an overpriced 2060 to tide you over. Just sell it after you buy a 3080/3090. Selling a 2080 Ti only made sense for people who (for whatever reason) happened to have a 1070-class backup card sitting around or are happy gaming on some old console for the next two months.
 
Great Scott, you're onto something, Marty.

Can someone go back to circa 2010 and fix ATI/AMD?

Selling my 2080 Ti for $1k cash. It was hard to do, but I think moving to the RTX 3080 will be an all-around better move for me, especially if the performance is the same or better. Hopefully it's around $800. The 3090 is too rich for my blood.

I see you also have access to a time machine, to be able to sell a used 2080 Ti for $1k.
 
Makes no sense to sell a 2080 Ti just to buy an overpriced 2060 to tide you over. Just sell it after you buy a 3080/3090. Selling a 2080 Ti only made sense for people who (for whatever reason) happened to have a 1070-class backup card sitting around or are happy gaming on some old console for the next two months.
I would have just returned said overpriced 2060 to one of our trillion-dollar establishments getting billions from the Fed daily, but nah, too much hassle to deal with, even if I lose a couple of hundred in the end.
 