First RTX 2080 Benchmarks Hit the Web with DLSS

When was that disproven? I don't remember reading about it, but I could have easily just missed it.

I disproved it in my own testing and had others repeat the tests to back me up and post their results on the forums. There is zero performance hit with HDR on NVIDIA as long as you don't do chroma subsampling. I think it may be a bug in the drivers.


On another note,

Another reason for the speed increase:
(Image: NVIDIA Turing L1/L2 cache)
 

I like it :p:p
 
Don't take it personally, but man, you are confusing.

The Titan has always been the full die. So no, the Titan has not been replaced by the 2080 Ti. It's not just a name swap: the 2080 Ti is not a full die!

Why in the world would they release a cut-down version as a 2070 Ti in place of the previous 2080 Ti? It makes absolutely no business sense. An xx70 Ti has never sat above an xx80. I am not sure what you are thinking, but boy, you couldn't be more misinformed.

The 2080 Ti is not the new Titan. It's amazing how people are justifying that. It's not a full die.

They could release the full die in a few months and call it a Titan XT for $2-3k, like I expect, lol.


Oh, I know. That's why I reworded it like five times trying to figure out how to get it out there without it sounding so damn confusing.

I'm not trying to justify anything, to be honest, just pointing out that the Titan cards were cut-down Quadros, and the 2080 Ti is a cut-down Quadro. The Titan cards were sold for $1,000 to $1,200, and the 2080 Ti is going for $1,000 to $1,200. The Titans were released at about the same time as the non-Ti cards, and the 2080 Ti is being released right around when the non-Ti cards are.

It stands to reason that the 2080 Ti is a Titan with different branding slapped on it.

And let's be fair here: for the past couple of generations the Titan and the x80 Ti have been, for all intents and purposes, the exact same card anyway; you just pay a big premium for the Titan so you can have it 6 to 8 months earlier. NVIDIA just said to hell with branding the exact same card twice at two big price points, and kept the more popular 2080 Ti branding and the higher Titan price point.
 
What does it matter what you decide to call it, or whether and how much it's cut down from whatever? The only thing that matters is price/performance.
 
It matters in that, in previous generations, if you held out for six more months or so, you could get the $1,200 GPU for $700 with nothing but a name change separating them.

As of now, it isn't clear whether that will happen again.
 

Honestly, it's not a new Titan because it's not called a Titan. When NVIDIA names it a Titan, I will believe that. Hehe. But to be fair, there is room for a full-die Titan that's even more expensive. Then we will have a Titan, lol.
 

That new Titan will probably cost $3k, just like the rumors suggested. At that point I'm out... :vomit:
 


Though I don't doubt the option will be disabled in most games.
It's a game mod, I know, but I thought it was cool. lol

It'd be neat if someone modded that mod to add the RTX denoising to it.
 
How much did the 8800 Ultra launch for again?

Product / Price
8800 GTX 768MB: $599-649
8800 GTS 640MB: $399-499
8800 GTS 320MB: $299-329
8600 GTS 256MB: $199-229

The Ultra was over $800. In a lot of the articles, the talk was about the NEW price range Ngreedia set. Eleven years later, same shit: no competition to drive down prices. And I've got to tell you, that's when I switched to NVIDIA and never turned back, with an 8800 GTS 320MB. My last Radeon was the X1900 XTX, and my daughter's computer had a 4770.
 
If the parallel FP and INT stuff has to be done by developers, then that improvement is going to vary wildly game-to-game.
I doubt it would. This is basically enhancing the issue width of the architecture: what was a shared micro-op path is now split at some point to aid performance. This is all internal, extremely low-level GPU core stuff, sort of like an architectural design tweak on some flavor of superscalar design.

It's not like devs will be calling different instructions; rather, the GPU can issue INT and FP work to execute concurrently. For example, it can calculate a pixel (an FP operation) and the next needed memory location (an INT operation) at the same time, versus waiting for the FP to finish before issuing the INT.
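To picture the kind of work that can now execute side by side, here's a minimal sketch of my own (hypothetical kernel, not from any real engine): the integer index math and the floating-point shading math don't depend on each other, so Turing's separate INT32 and FP32 pipes can work on them concurrently, where the old shared issue path would serialize them.

```cuda
// Minimal illustration only: each thread does integer address arithmetic and
// independent floating-point "shading" math. On Turing the INT32 and FP32
// work can be issued to separate pipes; on earlier parts they competed for
// the same path.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(const float* src, float* dst, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // INT: index math
    int y = blockIdx.y * blockDim.y + threadIdx.y;   // INT: index math
    if (x >= width || y >= height) return;

    int idx = y * width + x;                         // INT: next memory location
    float c = src[idx];
    float lit = c * 0.8f + 0.2f;                     // FP: pixel math, independent
    dst[idx] = lit;                                  //     of the integer work
}

int main()
{
    const int w = 1920, h = 1080;
    float *src, *dst;
    cudaMallocManaged(&src, w * h * sizeof(float));
    cudaMallocManaged(&dst, w * h * sizeof(float));
    for (int i = 0; i < w * h; ++i) src[i] = 0.5f;

    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    shade<<<grid, block>>>(src, dst, w, h);
    cudaDeviceSynchronize();

    printf("dst[0] = %f\n", dst[0]);                 // expect 0.6
    cudaFree(src);
    cudaFree(dst);
    return 0;
}
```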
 
Did you guys see this?

View attachment 98276
This could be great news.

However, to be fair, the 30fps @ 1080p was with a beta RT implementation. I highly doubt there is any regression in rasterized graphics between a 2080 Ti and a 1080 Ti; that'd just be silly. More cores, better cores, slightly faster cores, plus more memory bandwidth and (likely) more ROPs should yield more performance across the board. And if the RT must be emulated on a 1080 Ti, there's no way it has enough grunt to do so. I fully expect the 2080 Ti to be about 25%, maybe more, faster on average across all rasterized games. I'm hoping it's more, but I don't see how it could realistically be less. Much faster than that and I suspect NVIDIA is tweaking the output using DLSS to simultaneously reduce the workload on the CUDA cores. That's not really a bad thing, it's just not an apples-to-apples comparison.

What I'm not certain about is whether the 2080 Ti, at 10 gigarays/sec, has enough grunt to perform RT at the levels seen in the demo at expected frame rates. The 1080 Ti is capable of 4K@60fps in many, but not all, games even with settings turned way up (just see the [H] reviews). That said, there are always games that can't be run at the highest resolutions, settings, and frame rates on the latest GPU. I'm not sure I'd be happy with a 2080 Ti that can only do raster/DLSS games at 4K@>60fps and RT at 1440p or lower. If a single 2080 Ti can hit RT 1440p@60fps, then I have hope that NVLink will enable two 2080 Tis to make RT 4K@60fps possible.

TL;DR: the 2080 Ti is going to be fast, very fast. The question is whether it's fast enough when using RT features not to regress versus a raster-only 1080 Ti (at least 4K@~60fps in many titles), or whether I need to drop from 4K to 1440p or buy two 2080 Tis to get the full experience. Don't know yet.
 
You would have to live in a magical fairy land to think the performance of the 2080 TI with Ray tracing will even be remotely close to what the 1080 TI does with regular rasterization.
 
Well, it really does depend on how much RT needs to be done to look good. You could just scale back the RT until it performs well, but then it may actually look worse. The demos looked generally good, outside of some questionable surface modeling, but did that realistically take 10 gigarays, and at what resolution/fps? I'm skeptical.

The only real data point we got during the announcement was that a single 2080 Ti could render the Star Wars demo at 45ms per frame... or ~22.2fps.
 
I'd love for this to be accurate, but that seems like an awful lot. That would make the 2080Ti ~80% faster than the 1080Ti (ignoring RT).

4352 / 3584 * 1.5 = 1.82x
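Spelling that line out: the 4352/3584 part is just the rumored CUDA core counts, and the 1.5 is (as I read it) an assumed ~50% per-core uplift, which is the speculative bit. Quick host-side check, no GPU needed:

```cuda
// Back-of-the-envelope only: the 1.5 factor is an assumption, not a
// published figure.
#include <cstdio>

int main()
{
    const double cores_2080ti = 4352.0;          // rumored core count
    const double cores_1080ti = 3584.0;
    const double assumed_per_core_gain = 1.5;    // the big "if"

    const double core_ratio = cores_2080ti / cores_1080ti;
    printf("core ratio: %.2fx\n", core_ratio);                    // ~1.21x
    printf("with assumed per-core gain: %.2fx\n",
           core_ratio * assumed_per_core_gain);                   // ~1.82x
    return 0;
}
```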

If that’s the case then the price to perf ratio ain’t gonna be that bad
 
Let's look at this a bit differently for RT performance.

NVIDIA would have a complete bust on their hands if the flagship $1200 card couldn't at least play most of the RT launch games at 1080p@60fps smoothly. Anything less than that and RT is a complete no-go; it's DOA for this generation. Even then it would cause a major PR headache, since the other two very expensive GPUs essentially could not use their primary marketed feature, RT. Now, if the 2070 could do 1080p@40-60fps, that'd be acceptable, as you could adjust settings to accommodate RT. It would make the lowest-end card capable of using the primary feature, and while still compromised, it's manageable.

There really is a performance floor here: NVIDIA would be foolish to launch and heavily market a feature that could not actually be used. It's possible, but unlikely.



Using that performance floor, we can do some quick scaling estimates.

1080p = ~2.1M / 1440p = ~3.7M / 2160p = ~8.3M

For a SWaG we can basically assume that 1440p is 1.76x 1080p and 2160p is 3.95x 1080p. For RT, the 2080 Ti is 1.67x (10/6 gigarays) the 2070, and the 2080 is 1.33x (8/6 gigarays) vs the 2070. For CUDA cores we get 1.89x (4352/2304) and 1.28x (2944/2304), respectively. The key is to keep the base constant, and I'm using the 2070 as that base per the discussion above.

Thus, for RT games a reasonable target would be for the 2070 to do 1080p@60fps, though perhaps not at max settings. Using the scaling above, that would make the 2080 an RT max-settings 1080p@60fps-or-faster solution, and the 2080 Ti roughly capable of 1440p@60fps depending on settings. In fact, the core/RT scaling lines up pretty cleanly with the resolution scaling.

2070 to 2080 Ti is 1.89x on cores and 1.67x on RT, which lines up nicely with going from 1080p@60fps to 1440p@60fps (1.76x). The 2080 splits the difference and allows 1440p at lower settings with a slight compromise in fps; alternatively, it's a maxed-out 1080p solution.

To get to RT 4K@60fps, we'd need near-perfect NVLink scaling across two 2080 Tis, and even then some settings would need to be turned down. The scaling works out reasonably well if you just make a minimum assumption about how much performance is needed for a successful launch. My assumption is that when using RT, the 2070 won't quite reach the minimum acceptable PC gaming standard of 1080p@60fps without some settings compromises. It scales from there.

Edit: I did not scale for clocks here, but it's a SWaG. Clocks don't change the gigaray ratings, and they would drop the core scaling on the 2080 Ti a bit, but that was already higher than required.
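If anyone wants to sanity-check those ratios, here's the same arithmetic written out (exact pixel counts give 1.78x and 4.00x rather than the ~1.76x/3.95x you get from the rounded megapixel figures; close enough for a SWaG, and the core counts are the rumored ones):

```cuda
// Host-side arithmetic only; the resolution targets drawn from it are my
// reading of the scaling, not benchmarks.
#include <cstdio>

int main()
{
    // Pixels per frame
    const double px_1080p = 1920.0 * 1080.0;     // ~2.1M
    const double px_1440p = 2560.0 * 1440.0;     // ~3.7M
    const double px_2160p = 3840.0 * 2160.0;     // ~8.3M

    // Quoted gigarays/sec and rumored CUDA core counts
    const double rays_2070 = 6.0, rays_2080 = 8.0, rays_2080ti = 10.0;
    const double cores_2070 = 2304.0, cores_2080 = 2944.0, cores_2080ti = 4352.0;

    printf("1440p vs 1080p pixels: %.2fx\n", px_1440p / px_1080p);    // 1.78x
    printf("2160p vs 1080p pixels: %.2fx\n", px_2160p / px_1080p);    // 4.00x
    printf("2080 vs 2070:    RT %.2fx, cores %.2fx\n",
           rays_2080 / rays_2070, cores_2080 / cores_2070);           // 1.33x, 1.28x
    printf("2080 Ti vs 2070: RT %.2fx, cores %.2fx\n",
           rays_2080ti / rays_2070, cores_2080ti / cores_2070);       // 1.67x, 1.89x
    return 0;
}
```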
 
DLSS looks promising. Is there any confirmation that this is tied to the 20xx-series GPUs, or is it a feature that will roll out to all NVIDIA GPUs as part of the next driver update?
 
Did you not look at any of the news or tech info? This feature is done on the tensor cores.
 

I did see that, but firstly I don't know what "tensor cores" means. If it is just a compute feature, it would seem it could be done on any GPU with CUDA. Might have more of a performance impact though.
 
I look forward to the competitive advantage of ray tracing in BFV. Camp the whole game doing nothing next to a shiny car. Someone walks up, I see their reflection, blammo with explosives.

I imagine the mod community will install side mirrors on character shoulders as walking advantages.
 
So that looks like around ~45% average FPS increase, not too bad. The 2080Ti over 1080Ti should be about the same. So that would put the 2080Ti around 10% faster than the Titan V if it scales linearly.

If this is truly the case, the price makes sense now.

Price doesn't make sense. In the past, you got a good increase in speed and the prices were the same as the previous lineup. I expect that. I don't expect to pay more for a performance increase with a new generation; if that were how it worked, I'd be paying thousands for a new Intel CPU.

If this were older hardware, part of the 10xx series, yes, I could see it. But this is the new lineup. They are dropping the old cards and replacing them with these.
 

In the past AMD had a more competitive product, forcing Nvidia to use market competitive pricing.

Even when they didn't, in the past NVIDIA wanted to sell upgrades to owners of the previous generation, and had to price the new cards attractively enough for that to happen.

This time around they are sitting on a mountain of 10xx inventory which they want/need to sell first.

If they offer the next gen at similar prices, no one in their right mind is going to buy the previous gen stuff, and they will have to discount or write off millions.

Businesses don't like writing off millions of dollars of inventory.

Furthermore, it seems like their yields on the 20xx chips are pretty low thus far, so they are probably content keeping demand low by running high prices, for now.

All of this could change in a hurry if AMD came out with anything new that's worthwhile. Rumor has it that will be a while, though.

Remember: no matter the business, fairness is never a part of price setting, nor is cost. It's all about what people are willing to pay, and the fewer units you want to sell, the more you can charge per unit.
 
Tensor cores specialize in 16-bit and lower-precision math, while the normal CUDA cores do 32-bit. A 1080 Ti is around 11 TFLOPS and a 2080 Ti around 13 TFLOPS of FP32 from the CUDA cores.

In addition, the tensor cores can do 110 TFLOPS of 16-bit math, which is all you need for DLSS. A 1080 Ti could do it, but only on its higher-precision 32-bit CUDA cores, so you'd need roughly ten additional 1080 Tis in parallel somehow to get the same result, with a lot of unused bits. Or one 1080 Ti running at around 1/10th the speed of a 2080 Ti.
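For the curious, that "around 1/10th" is just the ratio of the peak-throughput numbers (marketing figures, not measured DLSS performance):

```cuda
// Rough peak-throughput comparison behind the ~10x figure.
#include <cstdio>

int main()
{
    const double fp32_1080ti = 11.0;    // TFLOPS, Pascal CUDA cores, FP32
    const double fp32_2080ti = 13.0;    // TFLOPS, Turing CUDA cores, FP32
    const double fp16_tensor = 110.0;   // TFLOPS, Turing tensor cores, 16-bit

    printf("tensor 16-bit vs 1080 Ti FP32: %.1fx\n", fp16_tensor / fp32_1080ti); // ~10x
    printf("tensor 16-bit vs 2080 Ti FP32: %.1fx\n", fp16_tensor / fp32_2080ti); // ~8.5x
    return 0;
}
```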
 
The price only makes sense if every additional % of performance costs an additional % over the previous gen.

Donald Bell, look at it this way...

- Let's say you have a 5" wiener.
- AMD's 20% increase gives you around 1" extra, giving you a 6" wiener.
- Say that costs $1000.
- Say NVIDIA could get you a 7" wiener, or another 20%!
- I would say you would be expected to pay more than $2000.

- So what I am saying is, if you could go from your 5" to 7" for $5000, DO IT!!
 
If 3dfx were still around as a company, they would have invented ray tracing 3 years ago for $500.
I am in the wrong timeline; I still think "Berenstein Bears" is the correct spelling.
 