RTX 3xxx performance speculation

So... I have been thinking a lot.

The memory situation might have been done in an effort to make the launch have as big a splash as possible. Surely Nvidia was keenly aware of the biggest gripe with their RTX 2000 series, which by far was the price increase. I would expect any company driven toward success to put effort into controlling its image.

It seems logical that the memory configurations aren't accidental, and that it comes down to cost. So the memory is the result of cost-cutting choices, and that fits pretty well with what I said above. There are always trade-offs, especially in making graphics cards.

Nvidia straight-up said that the 10 GB was to give a good balance of performance and cost, given the expensive GDDR6X memory.
 
They showed performance for both.

Also:

When the slide says the RTX 3070 is equal to or faster than the 2080 Ti, are we talking about traditional rasterization or DLSS/RT workloads? It would be very important to clear this up, since no traditional rasterization benchmarks were shown, only RT/DLSS-supporting games.

https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/

I am going a bit number-crunchy, because the relative lack of CUDA cores and memory BW makes it hard to see how the 3070 matches the 2080 Ti, with the 3080 only up to 1.8x the 2080.
Code:
Name     CUDA   Mem BW    Performance
3080     8704   760       1.8x perf (demonstrated at DF)
2080     2944   448

3070     5888   448/512?  1.0x perf claim
2080 Ti  4352   616

3080 vs 2080 has 8704/2944 = 2.96x the CUDA cores, compared to 1.8x perf, so 1.8/2.96 = 0.61x per-CUDA-core scaling factor vs Turing - looks like you need a LOT more CUDA.
3080 vs 2080 has 760/448 = 1.70x the memory BW, compared to 1.8x perf, so 1.8/1.70 = 1.06x per-BW scaling factor - looks like you need similar BW.

Now apply factors derived from 3080/2080 performance against the 3070/2080Ti:

3070 vs 2080 Ti has 5888/4352 = 1.35x the CUDA cores; multiply by the 0.61x factor = 0.83, i.e. 83% of 2080 Ti performance if it gets the same benefit per CUDA core as the 3080 does...
3070 vs 2080 Ti has 512/616 = 0.83x the memory BW; multiply by 1.06 = ~0.88, i.e. 88% of 2080 Ti performance if it gets the same benefit from memory BW as the 3080. Only ~77% if the BW is really 448, as reported elsewhere.

Yeah, wall of numbers, but the point is: from the 3080's 1.8x you can figure out how much each Ampere CUDA core and each GB/s of bandwidth contributes, then use those factors to estimate how close the 3070 gets to 1.0x vs the 2080 Ti. Hopefully that gets the gist across.

It really looks like, unless the 3070 is much better than the 3080 at utilizing its CUDA cores and memory BW in gaming, it will have a hard time matching the 2080 Ti in games as claimed.
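The estimate above can be sketched in a few lines of Python. This is a rough back-of-the-envelope sketch, assuming the spec numbers from the table and the 1.8x Digital Foundry figure; the 512 GB/s value for the 3070 is the uncertain one:

```python
# Rough scaling estimate from the quoted specs (CUDA core counts, GB/s
# memory bandwidth) and the demonstrated 1.8x 3080-vs-2080 result.
specs = {
    "3080":    {"cuda": 8704, "bw": 760},
    "2080":    {"cuda": 2944, "bw": 448},
    "3070":    {"cuda": 5888, "bw": 512},  # 448 if the lower BW figure is right
    "2080 Ti": {"cuda": 4352, "bw": 616},
}

perf_3080_vs_2080 = 1.8  # demonstrated at Digital Foundry

# Per-unit scaling factors derived from the 3080 vs 2080 comparison
cuda_factor = perf_3080_vs_2080 / (specs["3080"]["cuda"] / specs["2080"]["cuda"])  # ~0.61
bw_factor = perf_3080_vs_2080 / (specs["3080"]["bw"] / specs["2080"]["bw"])        # ~1.06

# Apply those factors to the 3070 vs 2080 Ti spec ratios
est_by_cuda = (specs["3070"]["cuda"] / specs["2080 Ti"]["cuda"]) * cuda_factor  # ~0.82
est_by_bw = (specs["3070"]["bw"] / specs["2080 Ti"]["bw"]) * bw_factor          # ~0.88

print(f"CUDA-based estimate: {est_by_cuda:.2f}x of 2080 Ti")
print(f"BW-based estimate:   {est_by_bw:.2f}x of 2080 Ti")
```

Swapping the 3070 bandwidth to 448 drops the BW-based estimate to roughly 0.77x, which is the "only 77%" case above.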

TLDR: 3070 numbers make me very skeptical it can really match 2080Ti in most cases.
 

Well... I've also wondered how much they had to lower the clocks to keep the wattage reasonable (which could mean good OC headroom). Someone on one of my Discords said these OC very well if you have the power budget. I guess we'll find out soon.

We’re close enough I am trying not to spend *too* much time on theoreticals. I am debating 3080 vs 3090 mainly lol.
 
2070 Super at $499 compared to a 3070 at $499:
Past generational difference
Some observations/notes (opinions): Turing (I think of it as Turding) had very poor separation between its top-tier cards. Basically, the 2070 Super was 1080 Ti performance, the 2080 an OC version of the 1080 Ti, and the 2080 Super a slightly faster OC'd 1080 Ti, while the 2080 Ti was 16% faster than the 2080 Super at 1440p.
Other generations had much wider performance spreads among their top-tier cards. So by comparing the 3070 to the 2080 Ti, and throwing in that stupidly high $1300 price, Jensen's marketing-manipulation talk has people going WOW! Turing was a very poor generational leap in performance (with new features), and we are looking at roughly a 35% generational leap with Ampere over Turing, regardless of the huge increase in advertised CUDA cores and the price comparisons to the 2080 Ti.

The 3070 should have been compared to the 2070 Super: same price, same memory amount and bandwidth. For that $499 you are getting basically a 35% jump in performance, if Jensen's general performance comment is accurate. Is that really astounding? No.
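A minimal sketch of that same-price arithmetic, assuming the post's figure that the 2080 Ti is roughly 35% faster than the 2070 Super at 1440p, and Nvidia's claim that the 3070 matches the 2080 Ti:

```python
# Same-price generational jump, 2070 Super -> 3070, both at $499.
price = 499

ti_over_2070s = 1.35    # 2080 Ti vs 2070 Super at 1440p (figure used in the post)
rtx3070_over_ti = 1.00  # Jensen's "3070 equals 2080 Ti" claim

# Chain the two ratios: 3070 vs 2070 Super
jump = ti_over_2070s * rtx3070_over_ti
print(f"3070 vs 2070 Super at ${price}: ~{(jump - 1) * 100:.0f}% faster")  # ~35%
```

If the benchmark figure is 31% instead (as quoted later in the thread), the same chain gives ~31%, which is where the 31-37% range comes from.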
 

Not sure what people were expecting (another Pascal, I guess) when Turing was still 16nm.

GPUs are facing the same issue CPUs have been for the last few years: smaller and smaller performance increases.
 
Turing was “12nm” and overall just shit value. I think DLSS and similar technologies will help with larger performance “gains” in the future.
 
Glad you put that in quotes. It was tweaked 16nm, and "12nm" was marketing speak. Not saying it wasn't better than 16nm, but it was really 16nm+.
 
Proadvantage has preorders going: PNY right now, 3080 and 3090. Eww, that price though. I've seen other brands lower.
 
I'm torn. I really like and want the FE model this time around, but I also want to go EVGA to cover my ass if 20GB models are released within a few months of the 10GB models...
 
I'ma get what I can get. Either way seems like a win. I kinda doubt the higher VRAM cards will be out within 90 days of the release, but who knows.
 
EVGA does not have any 2-slot air-cooled 3080 cards, so it looks like FE if I buy one. I don't want to pay extra for their water-cooled versions, and I'm not sure when those will become available. Seeing listings from AIBs gives some hope there will be more cards available than expected. The Nvidia FE may be really limited; not sure about the AIB versions.
 
The XC3 is a 2.2-slot card that is 2-slot without the backplate, according to the EVGA mod on their forum. I didn't know they had multiple versions of it (Black, Gaming, Ultra Gaming); not sure if it's just clocks or PCB differences as well.
 
Unfortunately, that would be enough to block the PCIe x16 (x8) slot on my motherboard that I use with a PCIe extension cable, leaving one less usable slot.
 
A work of art. You put a lot of effort into this painting; it almost seems as if you're trying to make Ampere look as bad as you possibly could. Which is a little strange at this point, given we have yet to see independent testing of true performance. For all we know, the 3070 could end up below expectations, consistently lower when independently tested. It seems like a card whose results could vary depending on the title.

But what I find really puzzling is how one can manipulate and force set up comparisons against the RTX 3070 to write off the entire Ampere architecture, framing the situation to fit your desire to insist on a poor "generational jump".

Well, I thought it would be awesome to use your exact methods to make my own dubious claims.
By comparing the GTX 1050 to the GTX 950, we had a measly, pathetic 15% "generational jump". Using your style of framing and your exact logic, I can claim that the jump from Maxwell to Pascal was ridiculously tiny, an awful 15%, then ask why anybody would think that is great or how anybody could be astonished.

Then I can take the result from your framing and intentional rigging: Turing to Ampere, 31-37%.

vs what I got using those same tricks and methods: an awful 15% from Maxwell to Pascal.

Hmmm, your Ampere jump (2070 Super to 3070) at 31-37% crushes my tiny 15% Pascal jump (GTX 950 to GTX 1050), like literally double and then some.

Wow... Ampere jump is over double what the Pascal generation jump was? Holy smokes!!!

But... not really. That's just how manipulating a result works. You did it by framing the entire Ampere generation around your 2070 Super vs 3070 comparison, and I didn't even have to look hard or use a REFRESH special Ti card... just the GTX 1050 vs the original 2GB GTX 950.

If it's still not clear... cherry-picking one card model to put down an entire generation is just funny business.
 
I wonder how many fps Quake 2 RTX will get on an RTX 3090. On my current 2080 Ti Aorus WF, without dynamic resolution scaling, I get 48-50 fps at 1440p. Despite that fps, I see jitter/stutter when moving the mouse fast. I guess it's the engine.
So with an RTX 3090 it will maybe be 80-90 fps? :)D

This is with everything maxed out.
 
Yeah, I was kinda long-winded; sorry to make you wade through all those words.

Should have just said: per known benchmarks, the 2080 Ti is 31% faster than a 2070 Super at 1440p. Jensen said, as did his PowerPoint slide, that the 3070 and 2080 Ti are around the same performance, unless he stated something wrong. So the 3070 is around 31% faster than a 2070 Super at the same price at 1440p.

I picked all Nvidia reference models. Maybe I was unclear on the context: 2070 Super (Turing) to 3070 (Ampere).

Thank you
 
I was all set on the 3080, but the 3070 16GB is really tempting at 1440p/144Hz... so 3080 20GB vs 3070 16GB is my dilemma...
 
The unboxing NDA lifts at 9. Lots of showcases coming.

[Image: NVIDIA GeForce RTX 3090 in a case]
 
I'm sorry for asking, but when is the NDA lifted for actual reviews/benchmarks?
 
His presentation can be goofy at times, but don't confuse that with a lack of knowledge... the guy knows more about hardware/tech than a lot of the forum 'experts'.
He dropped a 2080 Ti... Think of the children, man!!!

(joking)
 