First RTX 2080 Benchmarks Hit the Web with DLSS

True, but it shouldn't be hard for them to just make a framerate comparison graph in the first place. It's just extra work for the end user for no real reason.

I would also argue that a percentage increase doesn't give you the full story when it comes to the playability of games. A 50% increase means something a lot different if the game started at 30 FPS versus 80 FPS.
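To put toy numbers on that (a minimal sketch; the baseline framerates are purely illustrative):

```python
# Same relative gain, very different playability outcomes.
# The baseline framerates here are made up for illustration.
for base_fps in (30, 80):
    boosted = base_fps * 1.5  # apply a 50% uplift
    print(f"{base_fps} fps -> {boosted:.0f} fps (+{boosted - base_fps:.0f} fps)")
# 30 -> 45 fps: still short of a 60 Hz target.
# 80 -> 120 fps: unlocks high-refresh territory.
```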
Really?
 
But...but...maths. :p

I get why they presented the performance numbers this way, though. Giving direct fps numbers for those titles would make it a lot easier to note that the 2080 seems to be only marginally better than a 1080ti in current graphical workloads. That jump in the past was reserved for the x70 cards, not the x80 cards.
True: the 1070 traded blows with the 980 Ti, and the 970 traded blows with the 780 Ti. I think the 2070 is about 10% slower than the 1080 Ti; with DLSS, the 2070 may beat it in those use cases where DLSS can be turned on. The 2080 looks to be about 15-20% faster than the 1080 Ti, giving a better 4K60 experience at higher settings, but I bet when [H] gets their hands on it, the image quality with DLSS will show some issues.
 
This sounds great, but we need a real review from Kyle. So those of us running 1080 Tis in SLI need to get two 2080 Tis to make it worthwhile? So yeah, 2400 bucks to make this a worthy upgrade. That's assuming this metric is true, whereby one 2080 Ti is roughly equal to two 1080 Tis in SLI. That's insane. New tech aside, and I mean it looks great, 1200 bucks to get roughly what I have now in regular games sucks. Am I looking at this wrong?
 
lol, you know nothing about NVIDIA development costs or TSMC production costs for large chips. You're just a noob complaining about price, which seems to be pretty common these days.

Maybe the forum for you is at www.Softforum.com

I have no problem affording multiple 2080Ti's and a WX threadripper. However, just because you can afford something doesn't make it worth it for the money. I think I must have hurt your little fanboi feelings.
 
Not super-impressed with the performance/price bracketing based on these slides. It may come down to how well these cards OC, as I can easily get 50%-plus over a generic 1080 (in some situations, YMMV, etc.) with my overclocked 1080 Ti. This also raises all sorts of questions about resolution scaling, what settings everything is running at (presumably with heavy AA, or they couldn't count the DLSS as a big performance bump), and so on. This is a marketing slide, which is supposed to make you want to buy stuff, but it honestly makes me even less eager to put in a preorder. I'm not upgrading monitors from my current 144Hz 1440p w/GSync until the 4K models come out of orbit, so those are the only numbers that matter to me. Others will have other perspectives, but quoting performance on what is almost certainly the smallest possible segment of the userbase (4K HDR monitors) doesn't seem like a great fit to move units.
 
Call me skeptical of this chart... coming direct from NVIDIA... coming directly on the heels of the poor performance reports... I'll wait for something a bit more unbiased... this sounds like pure marketing.

Ah, but that is because it is pure marketing. I do find the reactions from people here interesting: if this were an AMD marketing slide, most of the posts would be about how it can't be trusted.

Also keep in mind that marketing slides are always the best case scenario with the most favorable conditions for the product the slide is about.
 
I think they're saying the 2080 can hit 2x the perf of a 1080.

So we should expect a 1080 to get 15-35fps in Shadow of the Tomb Raider?

Gee, how are non-RTX-enabled games going to play, I wonder.

ETA: This comment was just intended as snark.
 
So that looks like around a 45% average FPS increase, not too bad. The 2080 Ti over the 1080 Ti should be about the same. That would put the 2080 Ti around 10% faster than the Titan V if it scales linearly.

If this is truly the case, the price makes sense now.

But remember: 2080 Ti clocks seem to be lower than the 1080 Ti's, while the 2080 might be boosting much higher.
 
Well, it's an interesting approach. What they're basically doing is running the game at 4K with AA off, then enabling DLSS, which lets the tensor cores do the AA. In other words, the tensor cores are being used as dedicated anti-aliasing hardware. That raises the question, though: is this method off the table if the game uses ray tracing, since those cores would otherwise be busy?

Close, but I can't imagine the original 1080/2080 numbers are with no AA; no matter what the new cards do, adding AA of any kind isn't going to result in more frames, all else being equal. The numbers would be the same if the tensor cores are doing all the AA (so essentially free AA), not higher.

I'd imagine the 1080 numbers are with an unspecified AA method, the 2080 numbers are with the same unspecified AA method, and the DLSS numbers are when you swap that AA for DLSS.
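A toy frame-time model of that reading (all millisecond costs below are my assumptions, not anything NVIDIA has published):

```python
# Hypothesis from above: the baseline numbers include a conventional
# shader-based AA, and DLSS swaps that cost onto the tensor cores.
# Every cost below is a made-up placeholder.
render_ms = 20.0        # base 4K render work on the shader cores
shader_aa_ms = 4.0      # assumed cost of the unspecified AA method
dlss_overhead_ms = 0.5  # assumed residual cost once tensor cores take over

fps_with_shader_aa = 1000 / (render_ms + shader_aa_ms)   # ~41.7 fps
fps_with_dlss = 1000 / (render_ms + dlss_overhead_ms)    # ~48.8 fps
print(f"{fps_with_shader_aa:.1f} fps -> {fps_with_dlss:.1f} fps")
# Note the ceiling: offloading AA can at best recover the AA cost;
# it can't beat running with AA off entirely (1000 / render_ms = 50 fps).
```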
 
Reserving judgement until after the in-dept[H] analysis.
 
For funsies, I printed off the graph, took the three games from the second slide that we had framerates for, and measured those bars. I got the bar lengths in millimeters and worked out the missing values on slide one. For the Final Fantasy one, I assumed the DLSS bar from slide one matches the 60 fps noted on slide 2. Far from scientific, but fun nonetheless.

ME = 49% increase
Hitman = 46% increase
FF = 50% increase / 114% increase (MSAA compared to DLSS??)

Oh, my in-depth research tools... literally done on the back of an envelope! LOL!

[attached: photo of the back-of-envelope math]
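For anyone who wants to redo the ruler math without the envelope, here is a minimal sketch of the same estimation (all millimeter values are hypothetical placeholders, not my actual measurements):

```python
# Bar lengths measured off the printed slides; units cancel in the ratio,
# so mm vs. pixels doesn't matter. All values below are placeholders.
def pct_increase(old_mm: float, new_mm: float) -> float:
    """Percent increase implied by two bar lengths."""
    return (new_mm - old_mm) / old_mm * 100.0

# Calibrate fps-per-mm from a bar that has a printed framerate,
# then scale an unlabeled bar to estimate its missing value.
known_fps, known_mm = 60.0, 80.0   # e.g. a DLSS bar labeled on slide 2
unlabeled_mm = 53.0                # a bar on slide 1 with no printed fps
estimated_fps = unlabeled_mm * (known_fps / known_mm)

print(f"uplift: {pct_increase(53.0, 80.0):.0f}%")   # ~51%
print(f"estimated fps: {estimated_fps:.0f}")        # ~40
```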
 
They must have a huge pile of 1080s to get rid of. One 2080 Ti = the cost of a decent system.

I think it's reasonably priced if you want cutting edge. It's a Ti in name, but it's coming out at roughly the same time a Titan would. xx80 Tis have traditionally come out 8 months after the standard variant.
 
Ray tracing looks interesting: added eye candy, but nothing that is missed if you don't have it; the games still look great. Besides, in MP like BF, who has time to look around? I'm too busy trying to stay alive. As far as the performance at 4K and 60Hz, that's great, but I run at 2K and don't see myself upgrading anytime soon, and my 1070 has no problems at 2K. The prices, especially on the higher-range cards, don't seem to be worth the return to me, unless you game at 4K; then it's another story.
 
I think it's reasonably priced if you want cutting edge. It's a Ti in name, but it's coming out at roughly the same time a Titan would. xx80 Tis have traditionally come out 8 months after the standard variant.

For all intents and purposes I consider the 2080 Ti a Titan. Its die is 50% larger than a Titan Xp's and is less than 10% from the max die size.
 
lol, you know nothing about NVIDIA development costs or TSMC production costs for large chips. You're just a noob complaining about price, which seems to be pretty common these days.

Maybe the forum for you is at www.Softforum.com

True, but this is the worst time to buy a video card. If people want to fund the next gen for me, more power to them. I can bet that when 7nm hits, prices will go down, plus there will be competition next year at 7nm. A year is not too long a time for me; I will let early adopters fund the 7nm generation so I can buy it at a discount. This is the very reason NVIDIA dropped the 2080 Ti at the same time as the 2080: they know they can't wait 6 months, and they have to move to 7nm since competition will be there as well.
 
Maybe the 2080 makes for a really solid 4K GPU in today's games and sets you up to try out the RT stuff in games that support it. It should outperform the 1080 Ti for the asking price, we hope. I'm definitely not dropping the coin for one, but for any of you who are, I've got $450 ready to rock tomorrow for your Founders 1080 Ti. :D

PM me.
 
The information provided is interesting, but quite lean on the details. Here are the things I am wondering:
  • Is the 1080 running any AA? If so, which AA level and type? Is the 1080 running AA in both comparisons or just when DLSS is turned on for the 2080?
  • Is DLSS comparable in appearance to a traditional AA? If so, which type?
  • Will the same DLSS benefits scale similarly at lower resolutions?
  • Will using something like DLSS at the same time as ray tracing impact the performance of one or the other? (they both use the tensor cores)
Good questions. My understanding is that DLSS is the same as SSAA using the deep learning of the Tensor cores instead of the SM cores; DLSS stands for "Deep Learning Super Sampling." I think their ray tracing technique utilizes the Tensor cores as well to try and speed up the process, so DLSS + ray tracing may impact performance.
People would take the numbers more seriously if they posted more info about the testing configuration, such as resolution, CPU, game settings, etc. But... I doubt they will do that.
I do like how AMD provided their settings with their pre-release Vega benchmarks. Again, I think NVIDIA is trying to throw a bone to reviewers by not revealing too much before release, so they get the clicks.
So we should expect a 1080 to get 15-35fps in Shadow of the Tomb Raider?

Gee, how are non-RTX-enabled games going to play, I wonder.

ETA: This comment was just intended as snark.
Just in case someone did take this seriously...
A 1080 is more likely to get 5-10 FPS in Shadow of the Tomb Raider with ray tracing enabled. We have not yet seen how the RTX cards perform in SotTR without ray tracing.
 
Not super-impressed with the performance/price bracketing based on these slides. It may come down to how well these cards OC, as I can easily get 50%-plus over a generic 1080 (in some situations, YMMV, etc.) with my overclocked 1080 Ti. This also raises all sorts of questions about resolution scaling, what settings everything is running at (presumably with heavy AA, or they couldn't count the DLSS as a big performance bump), and so on. This is a marketing slide, which is supposed to make you want to buy stuff, but it honestly makes me even less eager to put in a preorder. I'm not upgrading monitors from my current 144Hz 1440p w/GSync until the 4K models come out of orbit, so those are the only numbers that matter to me. Others will have other perspectives, but quoting performance on what is almost certainly the smallest possible segment of the userbase (4K HDR monitors) doesn't seem like a great fit to move units.

Remember, these are probably done on FE cards. NVIDIA decided to overclock them out of the box this time; I think they were trying to squeeze out more performance. I wouldn't be surprised if the 2080 is boosting close to 1900MHz or so out of the box as well.
 
I have no problem affording multiple 2080Ti's and a WX threadripper. However, just because you can afford something doesn't make it worth it for the money. I think I must have hurt your little fanboi feelings.

You must have missed the point that no one cares what you think something is "worth".
 
I think it's reasonably priced if you want cutting edge. It's a Ti in name, but it's coming out at roughly the same time a Titan would. xx80 Tis have traditionally come out 8 months after the standard variant.

Exactly! In 8 months I think we will be going to 7nm and there should be competition, so NVIDIA wants cash now. I don't mind; those who want this on the bleeding edge can go for it. I am eyeing 7nm. I don't want to spend 1200 bucks for something that will be replaced in a year's time. The only reason NVIDIA released the 2080 Ti at the same time is that they don't have that time, and they can charge more due to no competition.
 
Well, it's an interesting approach. What they're basically doing is running the game at 4K with AA off, then enabling DLSS, which lets the tensor cores do the AA. In other words, the tensor cores are being used as dedicated anti-aliasing hardware. That raises the question, though: is this method off the table if the game uses ray tracing, since those cores would otherwise be busy?

From what I can tell so far, I think that when ray tracing is enabled the tensor cores can be used for DLSS as well as for guessing where the ray-traced shadows or reflections will be placed on screen. So it's safe to assume RTX effects will drop frames significantly when used with DLSS.
 
Exactly! In 8 months I think we will be going to 7nm and there should be competition, so NVIDIA wants cash now. I don't mind; those who want this on the bleeding edge can go for it. I am eyeing 7nm. I don't want to spend 1200 bucks for something that will be replaced in a year's time. The only reason NVIDIA released the 2080 Ti at the same time is that they don't have that time, and they can charge more due to no competition.

I really don't think NVIDIA will release a refresh in less than a year's time. Since when has NVIDIA released a node refresh that early in their projected upgrade cycles? Correct me if I'm wrong.
 
Maybe the 2080 makes for a really solid 4K GPU in today's games and sets you up to try out the RT stuff in games that support it. It should outperform the 1080 Ti for the asking price, we hope. I'm definitely not dropping the coin for one, but for any of you who are, I've got $450 ready to rock tomorrow for your Founders 1080 Ti. :D

PM me.
I already dropped the coin for an MSI 1080 Gaming X and sold my 1070 Gaming X. I'm not paying $650 or more for a mid-range card like the 2070 just to have ray tracing and the ability to run at a resolution I don't have; I just bought a 2K monitor after being at 1080p.
 
I really don't think NVIDIA will release a refresh in less than a year's time. Since when has NVIDIA released a node refresh that early in their projected upgrade cycles? Correct me if I'm wrong.

It's not about that. You really think that if AMD is rocking 7nm, NVIDIA will allow them to take the lead? They would have to come out with something that is on the bleeding edge. If that weren't the case, they would not have released the 2080 Ti 8 months early. They have never done that before; it's the first time in their product cycle that they've released the Ti alongside the x80.
 
Who gives a rat's bottom about 2080 vs 1080?

The 2080 is more expensive than a 1080 Ti. I bet they don't want to show that comparison for non-RTX games.
 
It's not about that. You really think that if AMD is rocking 7nm, NVIDIA will allow them to take the lead? They would have to come out with something that is on the bleeding edge. If that weren't the case, they would not have released the 2080 Ti 8 months early. They have never done that before; it's the first time in their product cycle that they've released the Ti alongside the x80.

If AMD's 7nm node can't compete, then yeah, NVIDIA will happily stay where it's at. Maybe they'll fill in the gaps with a 2070 Ti like they did this gen. If AMD comes out ahead, though, then yes, they'll be scrambling to get their 7nm SKUs out. When AMD announced and released Vega with HBM2, NVIDIA didn't release a GPU with HBM on it right away. At this point, though, we could all guess; God knows it might be another 2 years before we see a next-gen product from them.
 
It's not about that. You really think that if AMD is rocking 7nm, NVIDIA will allow them to take the lead? They would have to come out with something that is on the bleeding edge. If that weren't the case, they would not have released the 2080 Ti 8 months early. They have never done that before; it's the first time in their product cycle that they've released the Ti alongside the x80.

Meh, we will see. AMD is far enough behind atm that they would need to make decent IPC gains in addition to the node shrink to best the 2080ti in performance. Not saying it won't happen, just that the node shrink alone probably won't get the job done for AMD.
 
So that looks like around a 45% average FPS increase, not too bad. The 2080 Ti over the 1080 Ti should be about the same. That would put the 2080 Ti around 10% faster than the Titan V if it scales linearly.

If this is truly the case, the price makes sense now.
$100 more at launch price, and $300-350 more compared to current prices, for a 45% increase at 4K with MSAA (so probably less with post-AA and/or at lower resolutions) is barely OK at best, and quite shitty compared to past releases.

EDIT: 980 -> 1080 was a 60-70% real-life increase for the same launch price, which puts 1080 -> 2080 to shame.
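A rough perf-per-dollar check on that complaint (FE launch prices, the ~45% figure from above, and an assumed street price; all of it back-of-envelope):

```python
# Perf-per-dollar comparison. Performance is normalized to a 1080 = 1.0
# and the 1.45 uplift comes from the ~45% estimate above. The street
# price is an assumption based on the "$300-350 more" point.
price_1080_launch = 699   # GTX 1080 FE launch price (USD)
price_1080_street = 470   # assumed current 1080 street price (USD)
price_2080_launch = 799   # RTX 2080 FE launch price (USD)
perf_2080 = 1.45          # relative to 1080 = 1.0

vs_launch = (perf_2080 / price_2080_launch) / (1.0 / price_1080_launch) - 1
vs_street = (perf_2080 / price_2080_launch) / (1.0 / price_1080_street) - 1
print(f"perf/$ vs launch 1080: {vs_launch:+.0%}")   # ~ +27%
print(f"perf/$ vs street 1080: {vs_street:+.0%}")   # ~ -15%
# Against today's discounted 1080s, the 2080's value proposition shrinks fast.
```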
 
The information provided is interesting, but quite lean on the details. Here are the things I am wondering:
  • Is the 1080 running any AA? If so, which AA level and type? Is the 1080 running AA in both comparisons or just when DLSS is turned on for the 2080?
  • Is DLSS comparable in appearance to a traditional AA? If so, which type?
  • Will the same DLSS benefits scale similarly at lower resolutions?
  • Will using something like DLSS at the same time as ray tracing impact the performance of one or the other? (they both use the tensor cores)
+1 to this.

This is why you wait for [H] to actually give you all the info.
 
So, without their DLSS stuff we get anywhere from 25% to 50% improvement? Assuming there's no bullshit in the graph or shenanigans in the methodology?
 
I think I have a pretty good guess at what they are doing.
They will take the most expensive anti-aliasing technique, like SSAA, which is basically supersampling and cuts your performance by almost half, and compare it to DLSS, which IMO will be similar if not worse in quality than another good anti-aliasing method, say SMAA, which has a much smaller impact on performance. But that wouldn't make a good slide, would it?
Everything is in the deceit. I am talking out of my arse, but I am almost sure this will end up being the case; NVIDIA never changes...
Ray tracing is really nice, especially if production cost for developers is reduced; that would be a great incentive for them to adopt it. If not, it would just be another tech that makes you play games at 1080p@30fps instead of 144fps for better lighting and reflections. Is the compromise worth it?
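To put rough numbers on why an SSAA baseline would flatter DLSS (the cost factors below are my assumptions, purely illustrative):

```python
# Supersampling shades a multiple of the output pixels, so its cost grows
# roughly with the shaded pixel count. Factors below are rough assumptions.
native_pixels = 3840 * 2160  # 4K output

aa_cost_factor = {
    "no AA":   1.00,  # shade native pixels only
    "SMAA":    1.05,  # cheap post-process pass, assumed ~5% overhead
    "4x SSAA": 4.00,  # shade 4x the pixels, then downsample
}
for method, factor in aa_cost_factor.items():
    shaded = native_pixels * factor
    print(f"{method:>8}: ~{shaded / 1e6:.1f} M pixels shaded per frame")
# Compared against a 4x SSAA baseline, almost any cheaper AA (DLSS included)
# will look like a massive win on a marketing slide.
```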
 
The very top says: RTX 2080: 2X 1080

...are they implying that they were running two 1080s in SLI when benchmarking this? I too enjoy publishing numbers with zero context and no axis labels.

It almost seems like an intentionally confusing chart to make the numbers look better.

"Oh, so it's twice as powerful as a 1080 ... with no context to resolution, settings, or framerate."
 
I modified their slide with a modest 35% performance increase (which could be more for the Ti), and the FPS numbers at 4K are looking pretty good:

[attached: mock-up of NVIDIA's performance slide with estimated FPS filled in]
 
This helps me feel a little better about things but....

I want to run the FFXIV benchmark on it and compare to my Titan Xs....
(since that's the game I spend the most time in)

Then I want to see some CUDA scores....
(Can't sacrifice my video conversion speeds)

Then I'll decide if I want to drop the $$$ for one or two of the new cards. ;)

Of course, I'll be needing a new CPU/MB/RAM as well.... gee, funny how that works.
 