Rumor: 3080 31% faster than 2080 Ti

On videocardz they mention the model is unknown. Could be the RTX 3090, 3080 Ti, or 3060. Who knows.
 
Shaping up to be an exciting year for video cards. Can’t wait to see what Intel has in store too. Hopefully it isn’t shit and we get 3 competitors in this space.
 
Makes sense for the 3080Ti. 30% for the regular 3080 would be pretty nuts. Would be awesome if true, though.
Yup, this would fall in line with previous generation leaps. I’m guessing the RTX 3080 will be 5-10% better than the RTX 2080 Ti in rasterization (probably even more in RT performance) and the RTX 3080 Ti will be 30-40% better than the RTX 2080 Ti. I anticipate larger RT performance jumps than rasterization performance jumps for this next generation.
 
What a waste to use it that way.

I'm thinking more along the lines of finally getting consistently decent frame rates at 4K Ultra.

I will be pairing mine with the LG CX 48" OLED display, which I already pre-ordered. Combined with DLSS, I am really hoping that the 3080 Ti or 3090 or whatever the high end is called will be able to drive 4K at max settings at an acceptable frame rate.
 
What the hell is wrong with some of you people? 31% would be absolutely pathetic. Even Turing did that and that was pretty much a joke. I guess some of you are just absolutely forgetting the jump the 980ti had over the 780ti and then the 1080ti jump over the 980ti. Both of those were 75% or better.
 
He literally has no clue if it's a 3080 or something higher up.

"In my opinion, what you are looking at, is the RTX 3080 or the RTX 3080 TI/Super/3090 mainstream flagship ". And can't say for sure it's not the titan either (although he has reason to think that's less likely). Of course speculation is fun, but to be taken with a grain of salt. Looking forward to real benchmarks. If they are a good amount faster, maybe current RTX cards will come down in price... either that or they'll just raise the price on these to compensate and leave pricing for 2000 series the same.
 
What the hell is wrong with some of you people? 31% would be absolutely pathetic. Even Turing did that and that was pretty much a joke. I guess some of you are just absolutely forgetting the jump the 980ti had over the 780ti and then the 1080ti jump over the 980ti. Both of those were 75% or better.

As we get closer and closer to the limits of silicon, the performance improvements generation over generation are going to become smaller and smaller. This became evident in CPUs first, as they are not as able to scale with process nodes as extremely parallelized GPUs are, but as we are seeing, it inevitably hits all things silicon.

While GPU manufacturers like to talk about how great their architectural improvements are and claim that's why the next generation performs so much better than the one before it, in reality, generation-over-generation performance has tracked much more closely with the predicted theoretical gains from shrinking the process node.

Essentially any chip design tends to wind up being heat limited at the max, and process node heat output/power use historically has scaled pretty linearly with process node feature size.

This means that going from TSMC's 12FFN to, let's say, TSMC's 7FF process, assuming they lied equally about the true minimum feature size on both of those nodes, should yield a theoretical max performance improvement of about 41.67%.
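For what it's worth, here's a quick sketch of the math behind that 41.67% figure. It's purely the linear-scaling assumption above, treating the marketing node names as if they were true feature sizes:

```python
# Back-of-envelope math: assume the performance ceiling scales linearly
# with the nominal feature-size reduction (a big assumption, see below).
old_node_nm = 12   # TSMC 12FFN (marketing name, not true feature size)
new_node_nm = 7    # TSMC 7FF (same caveat)

theoretical_max_gain = (old_node_nm - new_node_nm) / old_node_nm
print(f"Theoretical max improvement: {theoretical_max_gain:.2%}")  # 41.67%
```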

That theoretical max - however - doesn't include the fact that linear scaling with process node size pretty much stopped at 32nm. You still see an improvement as you shrink the dies further, but it is now less of an improvement than would be expected from a linear relationship.

I'd argue that if Nvidia are going to get into the 30's percent performance improvement going to next gen, they have done an exceptional job, given the current state of the technology.

Keep in mind that back when we were seeing huge generation-over-generation performance improvements, things were different. For one, there was more architectural low-hanging fruit left to pick. In 2020 they do some minor tweaks from generation to generation, but there just isn't as much left to do to improve things there. That's why you see Nvidia shifting its efforts to new features like RT instead of trying to improve the architecture for traditional raster graphics. All the low-hanging fruit is gone, and further improvements would be extremely difficult and expensive to accomplish, and wouldn't yield as much performance as in the past.

Overall, die shrinks just aren't as effective today as they used to be. This doesn't always track linearly. Nvidia was on 28nm for quite some time, and still they saw performance improvements within that process node, but these were largely based on process yields allowing for better bins.

Anyway, there is a reason more and more chipmakers are dropping out of the race to smaller process nodes. It is very difficult. By some estimates, each successive process node is 4x more difficult and expensive to get working right than the one before it. GPU designs can be as great as the engineers want them to be, but in the end they have to be manufactured on these increasingly complex and expensive processes.

Add to all of this that there are fewer manufacturers for each new node. We are now down to two working on the latest gen nodes. TSMC and Samsung. Still the same number of buyers out there though. Supply and demand and all that.

I would continue to expect each successive generation to have smaller and smaller improvements, on average, than the one before it. Not doing so is just plain unrealistic, and shows a complete lack of understanding of the underlying technology.
 
What the hell is wrong with some of you people? 31% would be absolutely pathetic. Even Turing did that and that was pretty much a joke. I guess some of you are just absolutely forgetting the jump the 980ti had over the 780ti and then the 1080ti jump over the 980ti. Both of those were 75% or better.

Some of us live in the real world and know the diminishing returns of current process technology.
 
Some of us live in the real world and know the diminishing returns of current process technology.
Oh what a load of crap so get over yourself. And I'm referring back to people that are actually acting like 30% is a normal improvement which it absolutely is not. I gave clear examples of that with the 1080ti and 980ti over their predecessors. And are you actually silly enough to think AMD is just going to make a 30% improvement?
 
Oh what a load of crap so get over yourself. And I'm referring back to people that are actually acting like 30% is a normal improvement which it absolutely is not. I gave clear examples of that with the 1080ti and 980ti over their predecessors. And are you actually silly enough to think AMD is just going to make a 30% improvement?

If you look back from the GTX 480 to now, and leave out Pascal, the average generational gain was under 40%. If Pascal had been more in line with the average, Turing would have been seen as a normal release. But anything returning to normalcy after Pascal was bound to be a disappointment to anyone expecting similar gains.

If you are looking back on Pascal and expecting that kind of gain again, you are going to be disappointed.
 
Oh what a load of crap so get over yourself. And I'm referring back to people that are actually acting like 30% is a normal improvement which it absolutely is not. I gave clear examples of that with the 1080ti and 980ti over their predecessors. And are you actually silly enough to think AMD is just going to make a 30% improvement?
It is pathetic to cry about the performance of a leaked benchmark for an unknown GPU.
 
Some of us live in the real world and know the diminishing returns of current process technology.
Or we upgrade every other generation depending on needs. I have yet to see any company order me to buy their new card at gun point. Some people act like they are forced to buy whatever gets released latest. :D
 
Or we upgrade every other generation depending on needs. I have yet to see any company order me to buy their new card at gun point. Some people act like they are forced to buy whatever gets released latest. :D

Yeah, 30% every generation is still a VERY nice upgrade. Skip a generation, you are looking at a ~70% jump.
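The math behind that, assuming a flat 30% gain each generation (the gains compound, so two generations gets you just shy of 70%):

```python
# How a ~30% per-generation gain compounds if you skip a generation.
per_gen_gain = 0.30

gain_after_two_gens = (1 + per_gen_gain) ** 2 - 1
print(f"Gain after two generations: {gain_after_two_gens:.0%}")  # 69%, i.e. ~70%
```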
 
It is pathetic to cry about the performance of a leaked benchmark for an unknown GPU.
What an asinine thing to say. I'm not complaining about the benchmark, genius. I'm complaining about the people that act like 30% would be a huge improvement.
 
Yeah, 30% every generation is still a VERY nice upgrade. Skip a generation, you are looking at a ~70% jump.
Are you actually saying that waiting 4 to 5 years to get a 70% improvement is somehow impressive?
 
30% jump over the 2080 Ti is impressive if it's the 3080.

30% jump over the 2080 Ti if it's the 3090, and I'll be unimpressed yet again.

I think the former is MUCH more likely based on the rumors.
 
Oh what a load of crap so get over yourself. And I'm referring back to people that are actually acting like 30% is a normal improvement which it absolutely is not. I gave clear examples of that with the 1080ti and 980ti over their predecessors. And are you actually silly enough to think AMD is just going to make a 30% improvement?

I really think you need to scroll up and read the wall of text Zarathustra has wasted his time writing for you. You have educated folks explaining things to you while you cast insults and, frankly, remain clueless.

Of course we'd all love to see a glorious 75% performance leap. Shoot, I'll take 2000% too.

Issue is that's not going to happen so we take what we can get. 2080ti is an utter behemoth 754mm^2 die on tsmc 12nm. A new node isn't going to suddenly double performance, especially when 7nm hasn't resulted in a leap in clock frequency. We'll be lucky to even get a 3090 on such a titanic die as it is. That can't be an easy yield.

Nobody here who understands the technology is surprised to hear 30%. I was expecting 5-10% on a smaller die with a nice drop in tdp honestly. We're not saying it's an unbelievably impressive life changing leap or celebrating. We're saying it is what it is.
 
The best case scenario is that this is indeed the 3080 and that the 3080 Ti/3090 will be another 30% on top of it, which would compound to roughly the ~70% performance leap over the 2080 Ti we've been hoping for (1.31 × 1.30 ≈ 1.70). I can easily see this being the 3080, since we recently got leaks of what the 3080 might look like; it might be closer to release than the 3080 Ti/3090, hence more performance leaks on it.

I just hope pricing will normalize but I'm doubtful of it happening.
 
hopefully Big Navi will force Nvidia to be more competitive as far as pricing goes... the 3080 isn't even the flagship and it sounds like a $1000 card based on Nvidia's history
 
The best case scenario is that this is indeed the 3080 and that the 3080 Ti/3090 will be another 30% on top of it, which would compound to roughly the ~70% performance leap over the 2080 Ti we've been hoping for (1.31 × 1.30 ≈ 1.70). I can easily see this being the 3080, since we recently got leaks of what the 3080 might look like; it might be closer to release than the 3080 Ti/3090, hence more performance leaks on it.

I just hope pricing will normalize but I'm doubtful of it happening.
Pricing is only going to "normalize" at the high end if 1) AMD has a competitive high end product and 2) they undercut NVIDIA again, and I don't see it happening quite to the extent it has in the past. Lisa Su has pivoted AMD away from trying to be the "second-tier provider". With AMD currently beating out Intel in the CPU game, I just can't see them dramatically undercutting NVIDIA if the Big Navi chips end up competitive with Ampere. Maybe a bit less ($100 or so).

If both companies end up competitive with each other, best case scenario is we get a price war going. Let's hope.
 
Yeah, 30% every generation is still a VERY nice upgrade. Skip a generation, you are looking at a ~70% jump.
Yeah, look at CPUs: how long were we getting sub-10% increases in performance? And people are complaining about 30% for GPUs. We are running into limits, and those 50% increases aren't going to be the norm.
 
This will probably be the 3080 Ti, and the 3090 might give another 10-15% raw performance. So with drivers, clocks, and a wallet full of cash, my prediction is that you might get a 3090 averaging 40-50% faster than the 2080 Ti, for $1500.
 
The extent of any price war will likely be what you saw when the 5700 series launched. NVidia launched the Super cards the same week as the 5700, and AMD dropped their prices by up to $100 to be competitive with NVidia's Super cards. Or, the way AMD tells it, they "Jebaited" NVidia. :rolleyes:

There isn't going to be a big margin destroying price war, that would only hurt both companies, and they both want healthy margins. AMD will undercut NVidia a bit, and NVidia will let them.
 
Are you actually saying that waiting 4 to 5 years to get a 70% improvement is somehow impressive?
Sure... what CPU are you running? How many generations would you have had to skip to get a 70% increase? Why is a 10% CPU increase great while a 30% GPU increase is considered crap? Transistors aren't getting cheaper like they used to, and they can only make the die so big before heat removal becomes a problem. You want a 70% increase in performance? Sadly, you'd probably need a 70% increase in price to support a die large enough, plus some crazy cooling.
 
30% jump over the 2080 Ti is impressive if it's the 3080.

30% jump over the 2080 Ti if it's the 3090, and I'll be unimpressed yet again.

I think the former is MUCH more likely based on the rumors.


Nvidia is, without a doubt, sticking with its proven tactic of having each new model match the performance of the previous generation's next tier up. Just as the 2080 was to the 1080 Ti, the 3080 will closely match a 2080 Ti in performance. And further, a 3090 will be ~30% faster than a 2080 Ti.

Now, as far as pricing goes, my guess is that 2080 Ti-level performance, i.e. the 3080, will still cost $900-1000, and the 3090 is going to cost upwards of $1300. The price/performance needle won't move much with Ampere, that's assured.
 
If you look back from the GTX 480 to now, and leave out Pascal, the average generational gain was under 40%. If Pascal had been more in line with the average, Turing would have been seen as a normal release. But anything returning to normalcy after Pascal was bound to be a disappointment to anyone expecting similar gains.

If you are looking back on Pascal and expecting that kind of gain again, you are going to be disappointed.

The problem wasn't necessarily that Turing didn't have a Pascal generational gain. The problem was that it didn't have a Pascal generational gain AND the 2080Ti cost 72% more than the 1080Ti (Founders to Founders).
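For reference, that 72% is just the two Founders Edition launch prices compared, assuming the usual $699 and $1199 figures:

```python
# Founders-to-Founders price jump, assuming launch prices of
# $699 (GTX 1080 Ti FE) and $1199 (RTX 2080 Ti FE).
gtx_1080_ti_fe = 699
rtx_2080_ti_fe = 1199

price_increase = rtx_2080_ti_fe / gtx_1080_ti_fe - 1
print(f"Founders-to-Founders increase: {price_increase:.0%}")  # 72%
```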
 
For those that think 30% is not enough of a performance increase, the easy answer is to skip this gen.

If the 3070 has a 30% increase compared to my 2070, I'll take it. Gladly. But more than likely I'll just take the 3080 :)

30% is fine.
 
For those that think 30% is not enough of a performance increase, the easy answer is to skip this gen.

If the 3070 has a 30% increase compared to my 2070, I'll take it. Gladly. But more than likely I'll just take the 3080 :)

30% is fine.
30% is a joke of a performance increase after waiting over two years.
 
The problem wasn't necessarily that Turing didn't have a Pascal generational gain. The problem was that it didn't have a Pascal generational gain AND the 2080Ti cost 72% more than the 1080Ti (Founders to Founders).

AMD will be glad to sell you something less. I don't see the problem. Oh, I see they already did. Going from a 1080 Ti to a 5700 was a great idea.
 