First RTX 2080 Benchmarks Hit the Web with DLSS

Oh, and if this much of a storm happens over this pricing structure even before the cards are available?

Imagine what will happen if NVIDIA pushes for a 7nm refresh of these cards only a year from now. At these kinds of price levels, NVIDIA almost needs to promise gamers it will be 3 years before the next card launch, just to justify the amortization of cost.
 
Well, at these prices they will make a bunch, then refresh them at 7nm and make everyone buy the new refresh again. I can almost bet we see 7nm cards launched by this time next year.
 
I don't feel this slide has solved the issue that Kyle mentioned. Let's turn off all the new BS and run a raw head-to-head comparison with all the same settings, 1080Ti to 2080Ti. If it was all bananas awesome, they would be telling us, instead of hiding behind weird ray tracing advertising and weird slides with no real apples-to-apples comparisons. We could still be looking at a 5% performance bump plus a bunch of new features that this card can't really push yet?
 
I am excited for ray tracing, but without a lot of PC-exclusive AAA games coming out, I kind of don't want to upgrade my GTX 970 yet.
 
The point is there are people in some very different camps when it comes to the non-raytraced non-dlss scenarios.

In regards to the 1080Ti vs the 2080Ti:

If you assume the SMs are effectively unchanged from Pascal, the 2080Ti is, from the down-and-dirty numbers, at least ~20% faster.
If you're in the camp that believes in a ~10-20% improvement at the SM level, you're looking at up to 40% faster.
If you believe the CEO of NVIDIA, then the purported SM improvement is 50%, which would mean the 2080Ti is a whopping 78% faster.
And then there are the people who believe that DLSS speed improvements (or whatever else is in there) can be applied to everything...

And those are the "numbers" we have without the "trickery"

So the question becomes: is $1,200, a 100% price increase, worth 20% more performance? All the way up to: is a 100% price increase worth 78% more performance?
The answer to this question (for most) ultimately hinges on how much the SMs have improved (rough arithmetic below).

I would say in the most extreme case, is 78% worth 100% price increase? That's actually a steal in terms of value. The high end product stack is usually absolutely atrocious in terms of value. The 1080Ti was an absolute steal in terms of value.
Is a 100% price increase worth 20%? For most people that'd be a resounding no.

And this is the easy part. The nasty part is when you compare the 1080Ti to the 2080, and the 1080 to the 2070.

Note that there's a big distinction between value and budget.
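
Rough back-of-envelope of where those brackets come from, using the spec-sheet shader counts and reference boost clocks; the per-SM improvement factors are pure assumptions on my part:

[CODE]
# Back-of-envelope for the 1080Ti -> 2080Ti brackets above.
# Core counts / boost clocks are the published spec-sheet numbers;
# the per-SM improvement factors are assumptions, not measurements.

cores_1080ti, boost_1080ti = 3584, 1582   # MHz
cores_2080ti, boost_2080ti = 4352, 1545   # MHz (reference boost)

# Raw throughput ratio if the SMs were effectively unchanged from Pascal
baseline = (cores_2080ti * boost_2080ti) / (cores_1080ti * boost_1080ti)

for sm_gain in (1.00, 1.20, 1.50):        # none / ~20% / Jensen's 50%
    total = baseline * sm_gain
    print(f"SM gain {sm_gain:.2f}x -> ~{(total - 1) * 100:.0f}% faster than a 1080Ti")

# Prints roughly 19%, 42%, and 78% -- the ~20% / ~40% / ~78% camps above.
[/CODE]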
 
I saw the chart, but it still seemed like the numbers were being gamed (and for the record, I'm a big fan of RTX in concept and can't wait for RT to be everywhere). The gains seemed to be more about "equivalent image quality" than real FPS. This was particularly true with DLSS... how can an anti-aliasing technology, which requires more work, actually improve FPS? It can't, unless you change the metric. If, with DLSS turned on, we can do less work during the raster phase but the resulting image looks the same, then sure, FPS could go up. In other words, DLSS increases FPS by reducing the raster workload, which on its own would result in a worse image, but DLSS compensates (a toy frame-time model of this is sketched below).

End result, FPS goes up and image quality stays the same. That's at least how I read the numbers.

Edit: Also that's HDR and didn't Pascal take a rather large hit when doing HDR? Perhaps Turing doesn't and that's some of the gain? Also, is RT emulation turned on for the 1080Ti to get a baseline of 1? If yes, that's really going to suck the performance of the 1080Ti down vs the 2080Ti.
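
To make that concrete, here's the toy frame-time model of the trade-off. Every number in it is invented for illustration; none of them are measurements:

[CODE]
# Toy model: DLSS shrinks the raster workload, then pays a fixed
# tensor-core upscaling cost. All numbers are invented for illustration.

native_4k_ms = 25.0    # hypothetical cost to raster/shade a native 4K frame
render_scale = 0.44    # ~1440p is roughly 44% of 4K's pixel count
dlss_cost_ms = 3.0     # hypothetical fixed cost of the DLSS upscale pass

dlss_frame_ms = native_4k_ms * render_scale + dlss_cost_ms

print(f"native 4K: {1000 / native_4k_ms:.1f} FPS")
print(f"with DLSS: {1000 / dlss_frame_ms:.1f} FPS")

# FPS only goes up because less raster work is done per frame;
# whether the reconstructed image really matches native is the open question.
[/CODE]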
 
Image quality actually takes a hit. I have a YouTube link to the presentation where they put up a side-by-side (on page 3).

Now, is the quality good enough? That's a more subjective question.

The other part of the equation is how often you can use DLSS. From all accounts, it needs specific support.
 
I personally think NVIDIA talked all about the shader improvements during the presentation. It can now do parallel INT and FP issues. I expect nothing more, but I don't have any data as to how much that will help. I suspect it will help out a few performance killing edge cases, but isn't going to dramatically improve the cores or NVIDIA would already have done it. My guess is overall, true apples to apples 25% gain. Any RT though and it's massive gain.
 
If the parallel FP and INT stuff has to be done by developers, then that improvement is going to vary wildly game-to-game.
 
I'm actually quite hopeful about DLSS; I mean, it may even be better than all other types of AA... or it may not, lol. Sure, it's all speculation, but if you look at what the tensor cores can do with slow motion etc., it could really change things up.

https://news.developer.nvidia.com/transforming-standard-video-into-slow-motion-with-ai/

The tensor cores can do a lot of magic, outside the confines of a game.

Before the bar chart release, I had estimated that DLSS would provide upwards of a 90% performance boost; I thought that would have been awesome (despite the fact that it only had limited application).

That 90% figure was predicated on zero SM improvement.

In an "unfortunate" twist, there is mounting evidence suggesting a 10-20% improvement in SM capabilities... and that, coupled with the bar chart, cut a magical 90% improvement from DLSS down to ~30%.
 
"Guys, we know you saw some framerate issues yesterday when we showcased the gameplay, but, I mean, look at the graph!"
*gestures to graph*
"It used to be at one, now it's two! Please address any additional questions to the graph. Thank you."
 
The tensor cores can do a lot of magic, outside the confines of a game.

They have those very expensive Quadro cards, and the $3,000 TITAN V also has Tensor cores, so I doubt that NVIDIA will allow the Tensor cores in their GeForce RTX lineup to be used for anything more than gaming. Turing was primarily designed for image processing, and NVIDIA is trying to pass it off as a gaming GPU as well. NVIDIA will increase GPU complexity even more as the process nodes improve, think 7nm and lower, but nothing released in the future will have gaming at its heart. Basically, NVIDIA will force game developers to adapt to their new architectures. Good games are not just about eye candy, but also about gameplay. Heck, if it was all about eye candy, then Nintendo would have been out of business a long time ago.
 
The tensor cores can do a lot of magic, outside the confines of a game.

Yes indeed. Video processing/interpolating frames is very different from realtime 3D rendering in a video game, where latency is hugely important. Sigh. 9/20 can't come soon enough. XD
 
Going to wait and see what the actual reviews say on the performance comparison. Still not buying one for $1,199; I'll stick to my two 1080 Tis.
 
Meanwhile, from the perspective of one Wall Street "gamer" (as excerpted):

Source: [Zacks Investment Research, by way of Yahoo "News"] NVIDIA (NVDA) Gaming Drives the Deep Learning-AI Revolution

Quotes:

One of my favorite ways of seeing how GPU chips are forging the 4th industrial revolution is this graphic from NVIDIA, as the exponential power of GPU computing speed is leapfrogging brute force calculating and re-igniting Moore’s Law… Jensen showed us an updated version of this graphic on Monday with four of NVIDIA’s core GPU workhorses that have created the exponential advance. And he also implied tongue-half-in-cheek that the big reveal — Turing RTX technology — just leapfrogged his own designs by a decade. I’m still trying to figure out how serious he was.

...

As I type this on Monday afternoon, NVIDIA CEO Jensen Huang is on stage at Gamescom — the world’s largest gaming expo — being held this year in Cologne, Germany. He’s “wowing” the gamer/developer crowd with amazing views, stats and demos on NVIDIA’s record-breaking new “deep learning” architecture called Turing, featuring RTX image and light reconstruction/simulation powers that computers teach themselves.

...

I didn’t see anything about the forthcoming Call of Duty: Black Ops 4 from Activision (NASDAQ:ATVI), but then I’m not much of a gamer. I’m more interested in what the technology can do to create incredible games, and so much more.

...

“Turing opens up a new golden age of gaming, with realism only possible with ray tracing, which most people thought was still a decade away,” said the founder of NVIDIA. “The breakthrough is a hybrid rendering model that boosts today’s computer graphics with the addition of lightning-fast ray-tracing acceleration and AI. RTX is going to define a new look for computer graphics. Once you see an RTX game, you can’t go back.”

--end excerpted quotes--

Jensen Huang, all of the folks at NVIDIA, and Wall Street itself thank you, early adopters.

Notes: The rest of the article contains additional quote-worthy content; the article's author owns shares of NVDA for the Zacks TAZR Trader portfolio (disclosed at the end of the article).
 
Jim has come out with his analysis on YouTube (AdoredTV). Great video.

ADorkedTV is so biased for AMD and against NVIDIA that no one with an IQ above a wet brick can take him seriously...
 
You could not be more wrong! He called out Vega and AMD big time and called it like it is. Did you see his videos? He wasn't trash talking Nvidia at all. I think people just assume he hates Nvidia. Go watch his videos; he doesn't. He just says what he thinks is right. Just because you think he is biased doesn't mean he is. And I have an IQ well above a wet brick's, for sure.
 
ADorkedTV is so biased for AMD and against NVIDIA that no one with an IQ above a wet brick can take him seriously...

You should actually watch the video. His take is not only fair, but does a good job breaking down the potential performance of the cards.
 
"Guys, we know you saw some framerate issues yesterday when we showcased the gameplay, but, I mean, look at the graph!"
*gestures to graph*
"It used to be at one, now it's two! Please address any additional questions to the graph. Thank you."

In graph we trust!

 
You should actually watch the video. His take is not only fair, but does a good job breaking down the potential performance of the cards.

"Potential"....no thanks, I had a "debate" with him awhen he did his "Polaris" videos, he is technical cluess and his repsonse was basically "I have more viewers than you - buhu" when confronted with his ignorance.
The end result: I was right, he was wrong, but fans of AMD gobble up his hoghwash up like it's the second comming.

So not clicking and supporting a muppet speewing ignorance.
 
"Potential"....no thanks, I had a "debate" with him awhen he did his "Polaris" videos, he is technical cluess and his repsonse was basically "I have more viewers than you - buhu" when confronted with his ignorance.
The end result: I was right, he was wrong, but fans of AMD gobble up his hoghwash up like it's the second comming.

So not clicking and supporting a muppet speewing ignorance.

You have the right to disagree with him, but that doesn't make him an AMD lover and an Nvidia hater. He could have talked shit about the RTX cards, but he didn't.
 
The RTX series is too shrouded in mystery and obfuscation to give me the warm fuzzies about it. The price point only amplifies that.
 
I hate these goddamn "ratio" graphs they put out...just give the fucking FPS already.

That said, 4k 60+ FPS in those games on the 2080 (not even the Ti) seems pretty decent. Not $800 decent, but still...


Take it easy. The graph works great for comparing multiple titles so that all of the lines are about the same length. Is it so hard to multiply? If a game gets 50 FPS with a 1080 and this shows 1.5x performance, then you can expect around 75 fps. There are WAY too many variables anyhow to push out an FPS claim.
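
If it helps, the conversion really is just a multiply (the baseline FPS figures below are placeholders, not benchmark results):

[CODE]
# Turning the relative-performance bars back into FPS estimates.
# Baseline FPS values below are placeholders, not benchmark results.

baseline_fps = {"Game A": 50.0, "Game B": 72.0}   # hypothetical GTX 1080 numbers
chart_ratio  = {"Game A": 1.5,  "Game B": 1.4}    # bar heights read off the chart

for game, fps in baseline_fps.items():
    print(f"{game}: ~{fps * chart_ratio[game]:.0f} FPS expected")
[/CODE]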
 
Exactly. Jensen is just pissing on the consumers with this pricing/naming scheme. They could have EASILY gone with:

$999/$1199 = Titan RTX

$699/$799 = 2080 Ti

$499/$599 = 2080

The Titan isn't anything special; it's just marketing BS that people fell for when they switched to Kepler.
 
Saw that, but it doesn't help much without context. If everyone was saying terrible performance, then this is good. If everyone was saying great performance, this is bad... However, the most likely scenario is that there were a lot of comments saying both good and bad, and therefore we really didn't learn anything, as per usual, because a lot could then be assigned to both groups...
 
They are obviously responding to all the 30 FPS 1080p RTX stories. So they are hinting it's a lot faster than people are thinking. I am expecting the 2080Ti to be around 15% faster than the $3,000 Titan V that launched just nine months ago. From a top tier perspective, $1,200 isn't bad at all.
 
I've read pretty much every story about these cards on a shitload of websites, and here's my take on it:
I cannot afford any of these new cards... so none of this matters to me... and you all need to know that. It's a pretty important point!

Full disclosure... I don't read any other websites, as that would involve reading more!

p.s. ... mebbe if the 2070 is faster than the 1080Ti... I could possibly make the jump....

p.p.s. ... I don't like bar charts... gimme some pies
 
Meanwhile, from the perspective of one Wall Street "gamer" (as excerpted):

Source: [Zacks Investment Research, by way of Yahoo "News"] NVIDIA (NVDA) Gaming Drives the Deep Learning-AI Revolution
Was he masturbating while writing that?
 
You have the right to disagree with him, but that doesn't make him an AMD lover and an Nvidia hater. He could have talked shit about the RTX cards, but he didn't.
Did he not just say that the new nVidia RTX cards would only be 15-20% faster than the Pascal cards, despite nVidia just releasing benchmarks and info showing a 50% increase? Sounds like talking shit to me.
 
Saw that, but it doesn't help much without context. If everyone was saying terrible performance, then this is good. If everyone was saying great performance, this is bad... However, the most likely scenario is that there were a lot of comments saying both good and bad, and therefore we really didn't learn anything, as per usual, because a lot could then be assigned to both groups...
The trend I'm seeing is that everyone is complaining about the price. Others are saying there will be a small performance increase in non-RTX scenarios. So if he's saying people will be eating their words, that's great news IMO. I have a feeling the RTX 2080 Ti is an absolute monster.
 
Did he not just say that the new nVidia RTX cards would only be 15-20% faster than the Pascal cards, despite nVidia just releasing benchmarks and info showing a 50% increase? Sounds like talking shit to me.
If you look at their specifications, 15-20% sounds very reasonable to me. 50% isn't out of the question, but with just a few more shaders and (assuming) similar clock speeds, you won't get a huge bump in performance. The big bump would then have to come from faster memory (when that is an issue) and the new rtx cores, but those have to be utilized first.
 
They provide exact FPS in the second slide. You also don't need data labels everywhere in the chart. That is what an accompanying data table is for. A chart is just supposed to be a quick way to see trending or relative comparison. Admittedly, they could have included horizontal axis lines to better read it, but it's not a big deal. I can see that FFXV is about 40% faster, PUBG a little less, Shadow of the Tomb Raider is about 25% faster, etc.

Yup. These numbers are NOT impressive for a $1200 card! At $700? Sure, but not at $1200 or even $1000!
 
If you look at their specifications, 15-20% sounds very reasonable to me. 50% isn't out of the question, but with just a few more shaders and (assuming) similar clock speeds, you won't get a huge bump in performance. The big bump would then have to come from faster memory (when that is an issue) and the new rtx cores, but those have to be utilized first.

It sounds reasonable to me at $750 or so. Not at $1000-$1200!
 
We have not yet seen how the RTX cards perform in SotTR without ray tracing.

Given the thin portfolio of games that will have ray tracing in the near future, I think "how the cards perform without it" is much more relevant overall.

Sure, the screenshots I've seen so far with reflections and stuff look really really nice, but what about all the games that are never going to support that?
 
Did he not just say that the new nVidia RTX cards would only be 15-20% faster than the Pascal cards, despite nVidia just releasing benchmarks and info showing a 50% increase? Sounds like talking shit to me.

Um, they showed "UP to 50%" with a 2080ti vs the 1080. The advantage over the 1080ti will be 15-20% based on what we know of the cards SO FAR. Additionally, you are choosing to believe a MARKETING SLIDE that has very little real data. We know almost NOTHING about how they tested, what settings were used, etc. So you choosing to believe this as gospel tells us more about YOUR biases than it speaks intelligently to the debate.
 
If you look at their specifications, 15-20% sounds very reasonable to me. 50% isn't out of the question, but with just a few more shaders and (assuming) similar clock speeds, you won't get a huge bump in performance. The big bump would then have to come from faster memory (when that is an issue) and the new rtx cores, but those have to be utilized first.
nVidia just released benchmarks showing a 50% increase. These Adored guys are talking out their ***.
 
I'm sure that there will be scenarios where you can get the 50% indicated in the marketing slides. Whether that 50% increase is useful (going from 10 FPS to 15 FPS is still terrible), or whether it only shows up in exceptionally constrained circumstances, are completely different questions.

If the 2080Ti actually is 50% faster than a 1080Ti at 1440p in non-RT games, then I'll get one. I really don't care at all about their RT performance, because there's only a handful of announced games, and there's likely to be another generation of cards with another performance uplift before the feature is mainstream enough for me to consider it as a purchase input.

If standard graphics performance is only 20% faster with a 2080Ti versus a 1080Ti, then it greatly depends on OC headroom on these new cards, because the 1080Ti can easily make up a good chunk of that gap if you have good cooling. The slides that show a 50% increase indicate 4K HDR, and since DLSS is showing a large performance increase, they are likely using a lot of AA on both cards. It is a complete unknown at this time how well that translates down to no/lower AA and lower resolutions. I'm not saying the new cards are duds, I'm saying wait and see. There are far too many unknowns and assumptions right now to make a rational judgment call on relative performance.
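
To put a rough number on "a good chunk of that gap", here's a quick sketch; the overclock scaling figures are assumptions about typical headroom, not measurements:

[CODE]
# How much of a 20% stock-vs-stock lead an overclocked 1080Ti claws back.
# All overclock scaling figures are assumptions, not measurements.

stock_gap = 1.20        # assumed 2080Ti vs 1080Ti at stock, non-RT, non-DLSS
oc_1080ti = 1.10        # assumed real-world gain from a well-cooled 1080Ti OC

for oc_2080ti in (1.03, 1.10):   # little headroom vs Pascal-like headroom
    remaining = stock_gap * oc_2080ti / oc_1080ti
    print(f"2080Ti OC {oc_2080ti:.2f}x -> remaining lead ~{(remaining - 1) * 100:.0f}%")

# With little headroom on Turing the lead shrinks to ~12%;
# with Pascal-like headroom it stays at ~20%.
[/CODE]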
 