Confirmed: AMD's Big Navi launch to disrupt 4K gaming

2080ti = Titan with RT Cores
2080 = 1080Ti with RT Cores
And so on. There was no real performance difference between the generations overall; they just added a feature that was unused at the time, relabeled the products, and charged more for them.

However, the 5700 and 5700 XT are faster than the previous Vega 56 and Vega 64, use significantly less power, and cost less at MSRP than the Vegas as well.
Performance-wise there isn't a huge difference, but if you are using those extra features, at this stage the difference is night and day. At launch there wasn't a big gap, but with things like DLSS 2.0, which is going to become quite common, there are plenty of cases where the 2060 outpaces the 1080 Ti while looking better. nVidia had to get the tech out somehow and they couldn't take a loss on it, so yes, early adopters foot the bill as always, but now that it is here it is going to dominate.
 
Yes, they are; performance is virtually identical. You forgot that the RT cores, which were not particularly useful at release, were added. Physical die sizes are not relevant, because otherwise, by your definition, the 5700 XT should be far slower than the Vega 64.

Why do you have to lie to support your position? Maybe because your position is a lie.

The 2080 Ti is 15-30% faster than the 1080 Ti without using RTX features; their performance is not identical, although the margin does shrink at 1080p:

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-GTX-1080-Ti/4027vs3918

https://www.trustedreviews.com/news/nvidia-rtx-2080-ti-vs-gtx-1080-ti-3536639

https://www.tweaktown.com/articles/9420/geforce-ti-showdown-gtx-980-vs-1080-rtx-2080/index6.html
 
Performance-wise there isn't a huge difference, but if you are using those extra features, at this stage the difference is night and day. At launch there wasn't a big gap, but with things like DLSS 2.0, which is going to become quite common, there are plenty of cases where the 2060 outpaces the 1080 Ti while looking better. nVidia had to get the tech out somehow and they couldn't take a loss on it, so yes, early adopters foot the bill as always, but now that it is here it is going to dominate.

DLSS 2.0 has yet to prove it will be a common thing. It would be nice if it did, but I think we need to see a bit more proof that more games are actually adding it. Cool tech, and I hope it becomes more common than it currently is.
 
DLSS 2.0 has yet to prove it will be a common thing. It would be nice if it did, but I think we need to see a bit more proof that more games are actually adding it. Cool tech, and I hope it becomes more common than it currently is.
The only way it doesn't become common is if a cross-platform solution (likely within DirectX) is developed and adopted. That will likely happen eventually, and like DX9 being prototyped on the ATi 9700 Pro and DX10 on the Nvidia 8800GTX, such a standard would likely be prototyped on Ampere.

Until that happens, though, it's a benefit that's just very hard to pass up. Nvidia dominates the AIC gaming market, and rightly so for many reasons, so the addition of DLSS to improve a game's performance on the desktop would only be passed up for ideological reasons (see id Software) or due to severe and/or unexpected constraints on developer resources.
 
DLSS 2.0 has yet to prove it will be a common thing. It would be nice if it did, but I think we need to see a bit more proof that more games are actually adding it. Cool tech, and I hope it becomes more common than it currently is.
A buddy who works on Frostbite is already getting heat from management about not having it working. A number of UE4 titles have started implementing it and the Epic forums are very positive about it, and Unity has stated it is in the works and they hope to have full support for it shortly. That is going to cover a lot of the playing field right there once they get it out, which shouldn't be too long.
 
A buddy who works on Frostbite is already getting heat from management about not having it working
I assume that this refers to the greater source code tree, since it's been mostly implemented in Battlefield V?
 
Why do you have to lie to support your position? Maybe because your position is a lie.

The 2080 Ti is 15-30% faster than the 1080 Ti without using RTX features; their performance is not identical, although the margin does shrink at 1080p:

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-GTX-1080-Ti/4027vs3918

https://www.trustedreviews.com/news/nvidia-rtx-2080-ti-vs-gtx-1080-ti-3536639

https://www.tweaktown.com/articles/9420/geforce-ti-showdown-gtx-980-vs-1080-rtx-2080/index6.html

LOL! Nvidia cards at release were exactly as I have described, no matter the opinions of others. No one cares how the 2080 Ti compares to a 1080 Ti, since the 2080 Ti is essentially a relabeled Titan with RT cores for more money. I also clearly said the 1080 Ti is the same as the 2080 except for RT cores, so there was no upgrade in that generation at all.
 
Performance-wise there isn't a huge difference, but if you are using those extra features, at this stage the difference is night and day. At launch there wasn't a big gap, but with things like DLSS 2.0, which is going to become quite common, there are plenty of cases where the 2060 outpaces the 1080 Ti while looking better. nVidia had to get the tech out somehow and they couldn't take a loss on it, so yes, early adopters foot the bill as always, but now that it is here it is going to dominate.

The 2060 is much slower than the 1080 Ti, and in that case RT cores mean nothing, since the 1080 Ti does not have them. Otherwise, you would have to conclude that the 2060 and 2080 are equal, and of course they are not.
 
I assume that this refers to the greater source code tree, since it's been mostly implemented in Battlefield V?
I didn't ask for too many details. We were playing D&D and having some beers when his boss was texting him, so I asked what was up and got "Boss is being pissy because we need DLSS fully working yesterday, but they're not giving us... blah blah blah." We were a few beers in and the audio over Roll20 isn't the best.
 
The only way it doesn't become common is if a cross-platform solution (likely within DirectX) is developed and adopted. That will likely happen eventually, and like DX9 being prototyped on the ATi 9700 Pro and DX10 on the Nvidia 8800GTX, such a standard would likely be prototyped on Ampere.

Until that happens, though, it's a benefit that's just very hard to pass up. Nvidia dominates the AIC gaming market, and rightly so for many reasons, so the addition of DLSS to improve a game's performance on the desktop would only be passed up for ideological reasons (see id Software) or due to severe and/or unexpected constraints on developer resources.

I agree that a standard will likely be created out of it, as AMD is likely to be using something similar for upscaling on the consoles. The only thing I question is how far they are going to go with it, as you start risking your upper-tier sales if, let's say, a mid-tier card will get it done for 4K. If the rumors of much more powerful RT hardware in the next-gen cards are true, then I wonder how much it might hurt sales at the top tier. It will be interesting to see how it turns out over the next year or two.
 
LOL! Nvidia cards at release were exactly as I have described, no matter the opinions of others. No one cares how the 2080 Ti compares to a 1080 Ti, since the 2080 Ti is essentially a relabeled Titan with RT cores for more money. I also clearly said the 1080 Ti is the same as the 2080 except for RT cores, so there was no upgrade in that generation at all.

So tests and evidence mean nothing to you, only your personal feelings. I should not be surprised, but evidence trumps feelings. You lose, again.

The 1080 Ti was superseded by the 2080 Ti, not the 2080. Also, I went from the Titan Xp to the 2080 Ti and had noticeable improvements in FPS; I went from janky 3440x1440 to buttery smooth.

Let's see you move that goal post again...

 
The 2060 is much slower than the 1080 Ti, and in that case RT cores mean nothing, since the 1080 Ti does not have them. Otherwise, you would have to conclude that the 2060 and 2080 are equal, and of course they are not.
A 2060 using DLSS 2.0 at 1440p crushes a 1080 Ti at 1440p while looking better. So the RT and tensor cores can do great things and are worth it if you can use them, as the 2060 is otherwise a slower card.
 
So tests and evidence mean nothing to you, only your personal feelings. I should not be surprised, but evidence trumps feelings. You lose, again.

The 1080 Ti was superseded by the 2080 Ti, not the 2080. Also, I went from the Titan Xp to the 2080 Ti and had noticeable improvements in FPS; I went from janky 3440x1440 to buttery smooth.

Let's see you move that goal post again...

And the 2080 Ti is $500 more and simply a relabeled Titan, as I said, with RT cores added. The 2080 is equal to the 1080 Ti at more or less the same price, if not slightly more. It is not all that complicated and you know that. Only the Super variants were an actual upgrade.
 
A 2060 using DLSS 2.0 at 1440p crushes a 1080 Ti at 1440p while looking better. So the RT and tensor cores can do great things and are worth it if you can use them, as the 2060 is otherwise a slower card.

Yes, except that the 2060 is not actually running at 1440p all the time, but at variable resolutions instead. It's doubtful that it looks better than the 1080 Ti, unless you want to take a single image and try to dissect it. There is no crushing at all, and as a matter of fact, the game has to support it first, and almost no games do at this time.
 
Yes, except that the 2060 is not actually running at 1440p all the time, but at variable resolutions instead. It's doubtful that it looks better than the 1080 Ti, unless you want to take a single image and try to dissect it. There is no crushing at all, and as a matter of fact, the game has to support it first, and almost no games do at this time.
It looks better than native 1440p while running at better FPS than native 1440p, for half the price of a 1080 Ti. I would call that a successful implementation of a software optimization that runs on specialized hardware. A few of the titles I am playing do support it, many more are adding support for it, and a bunch of others will have it. Don't take my word for it: go look at reviews of games running DLSS 2.0 and see for yourself; there are lots of them. The image quality looks better and it feels smoother.
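
To put rough numbers on why the FPS goes up, here's the pixel budget at 1440p, assuming the commonly cited ~2/3 per-axis render scale for the Quality preset (that scale factor is my assumption for illustration, not something pulled from a spec sheet):

```
// Back-of-the-envelope pixel budget for DLSS 2.0 "Quality" mode at a 1440p
// output target. The ~2/3 per-axis render scale is an assumption used here
// for illustration. Plain host code; no GPU kernel needed for the arithmetic.
#include <cstdio>

int main() {
    const int out_w = 2560, out_h = 1440;            // native 1440p target
    const double scale = 2.0 / 3.0;                  // assumed Quality-mode scale per axis
    const int in_w = static_cast<int>(out_w * scale + 0.5);
    const int in_h = static_cast<int>(out_h * scale + 0.5);
    const double ratio = double(in_w) * in_h / (double(out_w) * out_h);
    std::printf("internal render %dx%d = %.0f%% of native pixels\n",
                in_w, in_h, ratio * 100.0);          // ~1707x960, roughly 44%
    return 0;
}
```

Shading well under half the pixels and reconstructing the rest on the tensor cores is where the FPS headroom comes from; whether the reconstruction looks better than native is the part people argue about.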
 
Prove it, with actual facts, not your opinion. And no, benchmarks don't count either, as they do not prove your claim.

Of course benchmarks do not count, since they clearly show what I have been saying all along. Nvidia has gotten away with charging whatever they want and you guys are willing to pay for it, even though that generation gave no upgrade over the previous one. In fact, there are many Nvidia fans who have straight up said that the Supers should have been what was released in the first place.
 
I agree that likely there will be a standard created out of it as AMD is likely to be using something similar as well for upscaling on the consoles. The only thing I question is how far are they going to go with it, as you start risking your upper tier sales if lets say a mid tier card will get it done for 4K. If the rumors are true of much more powerful RT hardware in the next gen cards, then I wonder how much it might hurt sales at the top tier. Will be interesting to see how it turns out over the next year or two.
Agreed, though when talking about '4K' we should stop assuming that it means 60Hz / 60FPS (I'm guilty of this as well). There is still demand for higher resolutions and higher framerates, on the desktop and even more so for VR. There's demand for plenty more performance than what's available now, and I doubt that either vendor is going to have an issue moving products.
 
Of course benchmarks do not count, since they clearly show what I have been saying all along. Nvidia has gotten away with charging whatever they want and you guys are willing to pay for it, even though that generation gave no upgrade over the previous one. In fact, there are many Nvidia fans who have straight up said that the Supers should have been what was released in the first place.

No, benchmarks don’t count because similar performance does not prove your claim. You have to provide definitive proof that Turing is just Pascal with tensor and RT cores. That cannot be done with benchmarks.
 
Of course benchmarks do not count, since they clearly show what I have been saying all along. Nvidia has gotten away with charging whatever they want and you guys are willing to pay for it, even though that generation gave no upgrade over the previous one. In fact, there are many Nvidia fans who have straight up said that the Supers should have been what was released in the first place.
The Supers were a result of process improvements more than anything else. They probably could have managed cards with Super-level performance out of the gate, but it probably would have added at least $100 to each card.
 
No, benchmarks don’t count because similar performance does not prove your claim. You have to provide definitive proof that Turing is just Pascal with tensor and RT cores. That cannot be done with benchmarks.

LOL! I guess you have no issue paying slightly more for the same performance from generation to generation. Oh well, I am not going to prove anything to you and that is fine. I have said what I have said and it is clear you are never going to agree with me. A better question would be: if it is different, then why didn't people rush out to buy a 2080 if they already had a 1080 Ti, or a 2080 Ti if they already had a Titan?
 
The Supers were a result of process improvements more than anything else. They probably could have managed cards with Super-level performance out of the gate, but it probably would have added at least $100 to each card.

I cannot say that I agree; the Supers were released because of the competition from AMD.
 
The Supers were a result of process improvements more than anything else. They probably could have managed cards with Super-level performance out of the gate, but it probably would have added at least $100 to each card.

No real sign of process improvement. Just a few tweaks for marketing purposes, to better compete with Navi.
 
You’re moving goal posts and using personal attacks to distract from your ridiculous claim.

*Sigh* I have changed nothing of what I have said, nor moved any goal posts. It's not my fault if someone cannot score on a wide-open net. :D
 
And the 2080 Ti is $500 more and simply a relabeled Titan, as I said, with RT cores added.
:rolleyes:

You routinely spout nonsense, and this is just another example.

Turing is a significantly different architecture, even if there were a sensible reason to ignore the RT and tensor cores.

The SM architecture is significantly changed, with parallel INT/FP operations, a unified cache, and new shader modalities like mesh shading.

https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/
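
To make the parallel INT/FP point concrete, here's a minimal sketch (illustrative only, not code from that devblog) of the kind of kernel games and compute workloads run constantly. The index/address math is INT32 and the shading math is FP32, and Turing's SM can issue the two down separate pipes at the same time instead of serializing them:

```
// Illustrative sketch: a typical kernel mixes integer index/address math with
// floating-point shading math. Turing's separate INT32 pipe lets the scheduler
// overlap the two streams; the code itself doesn't change, the hardware does.
__global__ void shade(const float* __restrict__ in, float* __restrict__ out,
                      int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;  // INT32: index math
    int y = blockIdx.y * blockDim.y + threadIdx.y;  // INT32: index math
    if (x >= width || y >= height) return;

    int idx = y * width + x;                        // INT32: addressing
    float v = in[idx];                              // memory load
    out[idx] = v * 0.75f + 0.125f;                  // FP32: the "real" shading math
}
// launched as e.g. shade<<<grid, block>>>(d_in, d_out, w, h);
```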
 
:rolleyes:

You routinely spout nonsense, and this is just another example.

Turing is a significantly different architecture, even if there were a sensible reason to ignore the RT and tensor cores.

The SM architecture is significantly changed, with parallel INT/FP operations, a unified cache, and new shader modalities like mesh shading.

https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/

There is nothing nonsensical about it; the 2080 Ti and the previous Titan's performance are nearly identical. The 2080 and the 1080 Ti's performance are also nearly identical. Not really an issue if you think that is nonsensical, but hey...
 
No real sign of process improvement. Just a few tweaks for marketing purposes, to better compete with Navi.
TSMC managed to squeeze out some improvements to the 12nm node towards the end, and that got us the higher clocks. Otherwise you could just OC the non-Supers and hit the same clocks, but the Supers push beyond those. Similar for the memory clocks.
 
There is nothing nonsensical about it; the 2080 Ti and the previous Titan's performance are nearly identical. The 2080 and the 1080 Ti's performance are also nearly identical. Not really an issue if you think that is nonsensical, but hey...

By your warped "if it performs the same, it must be the same" logic, the Vega 64 must really be a renamed Nvidia 1080.
 
TSMC managed to squeeze out some improvements to the 12nm node towards the end, and that got us the higher clocks. Otherwise you could just OC the non-Supers and hit the same clocks, but the Supers push beyond those. Similar for the memory clocks.

It's more about the extra cores than the overclocks. They could have done both any time they wanted to. They boosted performance when they needed to, to compete with the 5700 series.
 
It's more about the extra cores than the overclocks. They could have done both any time they wanted to. They boosted performance when they needed to, to compete with the 5700 series.
Well shit, I didn't realize the Supers had actual chip changes. I thought it was just clock increases and a bit more RAM. Well, f'ing-L.
 
Well shit, I didn't realize the Supers had actual chip changes. I thought it was just clock increases and a bit more RAM. Well, f'ing-L.
They produce the same set of dies: the same part numbers with the same physical number of cores, etc. However, the Super GPUs did have more cores unlocked or, in at least one case (the 2070 Super moving from TU106 to TU104), the next-larger die.
 
By your warped "if it performs the same, it must be the same" logic, the Vega 64 must really be a renamed Nvidia 1080.

LOL! What the heck? :D That does not make any sense. You see, when the RX 5700 XT was released it was faster than the Vega 64, used less power, and launched at a lower MSRP than the Vega 64. The other point I have been sticking with, however...
 
LOL! What the heck? :D That does not make any sense. You see, when the RX 5700 XT was released it was faster than the Vega 64, used less power, and launched at a lower MSRP than the Vega 64. The other point I have been sticking with, however...
...because it was actually designed for gaming instead of HPC?

Do you have a valid, on-topic point?
 
At the end of the day, nVidia hasn't been asleep at the switch like Intel was for what, a decade?

If AMD could take the crown from nVidia, all nVidia would likely do is release another 'Super' variant, drop the price to slightly below or on par with AMD's latest, and it would cut AMD off again.

AMD is sitting in a good spot now, being the better bang for your buck in the mid-range, and nVidia is largely ignoring this.
 
There is nothing nonsensical about it, the 2080 Ti and the previous Titan performance are nearly identical. The 2080 and the 1080Ti performance are also near Identical. Not really an issue if you think that is nonsensical but hey.......

Similar performance does not mean it is the same chip. Jesus fucking Christ, this is one of the dumbest arguments I have ever fucking seen.

Edit: And that's ignoring the fact that you're wrong. The 2080 Ti and Titan Xp do NOT have the same performance. On average, the 2080 Ti will be noticeably faster than the Xp.
 
Similar performance does not mean it is the same chip. Jesus fucking Christ, this is one of the dumbest arguments I have ever fucking seen.
Don't lie, you have seen far, far worse arguments made on the forum; this doesn't even crack the top 100. But the performance increase did fall short of the hype; they are not wrong on that.
 
Don't lie, you have seen far, far worse arguments made on the forum; this doesn't even crack the top 100. But the performance increase did fall short of the hype; they are not wrong on that.

I can't think of too many off the top of my head, but it has been a very long day. As for performance, yeah, to a point. After Pascal's incredible gains, getting a fairly average increase in rasterization performance (and a huge increase in price) was definitely a letdown. It was enough of a jump for me, obviously, but Turing has definitely been the most "meh" of generations in quite a long time.
 