Nvidia GeForce RTX 2080 and RTX 2080 Ti Official Performance Charts Leaked

If the results weren't from cherry-picked titles, I'd say the performance increase is pretty much what was expected.
Kinda curious to see a [H] review, but I don't see myself replacing my 1080 Ti at those prices.
 
There seems to be a whole lot of anti-Nvidia butthurt that this forum's become a bit of a lightning rod for, and it's rooted in something other than the merits of these new cards, so it's hard to take seriously.

People that are hoping these "cherry picked results" are going to be drastically or even measurably different than benches in so-called "trusted reviews" are going to be disappointed.

Did the whole GPP thing pass you by?
 
So I can buy a new $1200 GPU that is exactly double what my $600 GPU cost more than two years ago, and it's not quite 2x the performance. Wow. Amazing.

Usually things move FORWARD in a new generation.


Uh, they are. You're not even looking at things like DLSS and ray tracing, and that doesn't even START on the other new tech in these things. Sure, in older games what you said is the case, but in new stuff that uses the new tech it'll be more like 70 to 100% faster. But you know, shit on it because it's not a faster horse.

This is VERY much like the GeForce 2 days. I bet on the faster horse model back then, and it turned out T&L was where everything went.
 
I don't get why people compare the 1080 Ti to the 2080 Ti...

The 1080 Ti should be compared to the 2080 that replaces it, and the 2080 Ti should be compared to whatever Titan it replaces... Then the jump in performance wouldn't be "45%"-ish.
 
I don't get why people compare the 1080 Ti to the 2080 Ti...

The 1080 Ti should be compared to the 2080 that replaces it, and the 2080 Ti should be compared to whatever Titan it replaces... Then the jump in performance wouldn't be "45%"-ish.

The 1080 Ti is within 5% of the Titan XP. They are the same card.
 
I'm sure glad I'm not cynical about this sort of thing. I wonder what anyone has to gain from lying about FPS numbers? I don't get it.

It's quite simple: I don't trust numbers from the manufacturer and prefer to wait for actual real reviews to filter through. If everyone were to trust numbers straight from Nvidia or AMD, what would be the point of review sites? They could simply publish their reviewer's guide as a review and be done with it.


It's very possible their numbers aren't exaggerated; I just prefer to wait for independent reviews, simple as that.
 
Will be passing on this gen and waiting for ray tracing to gain full market traction and work out the kinks... 1080p ray-traced scenes aren't worth it to me. I want full-res ray tracing capabilities and I'll wait until a card can push that. My two V56s score 12k in Time Spy and should last a few years, ideally until PCIe 5.0 and DDR5 drop.
 
Uh, they are. You're not even looking at things like DLSS and ray tracing, and that doesn't even START on the other new tech in these things. Sure, in older games what you said is the case, but in new stuff that uses the new tech it'll be more like 70 to 100% faster. But you know, shit on it because it's not a faster horse.

This is VERY much like the GeForce 2 days. I bet on the faster horse model back then, and it turned out T&L was where everything went.

Ray tracing is great tech and will be awesome, in the future. Don't bet too heavily on it with these cards. Even the 2080 ti isn't going to hold up well in the YEARS it will take for RT to be more than a toy for Nvidia to show off. I don't expect to see tons and tons of games add RT support. Even then, the actual performance of these cards with RT enabled remains to be seen. If the 2080 ti can really only manage 1080p/60 with all the RT bells and whistles going the 2080 and 2070 are utterly fucked when it comes to RT support.

DLSS will only ever work in games where Nvidia allows it to. You are not going to see a big improvement in every game. It will always be case-by-case. It sounds like it could be incredible, but you're putting a lot of faith into something that we've only seen a single still image of. It remains to be seen if the tech will actually end up being as amazing as it sounds.
 
An easy way is to check reviews using Vega cards, not old results but new benchmarks.
Vega 64 used to trade blows with the 1080; now it will trade blows with the 1080 Ti.
 
I’m hoping we see at least one review site provide 1080 Ti performance numbers on both current (399.xx) and upcoming (411.xx) driver versions to rule out any driver trickery with the release day results.
 
DLSS is still a question mark... will it be better than TAA and SMAA? With all the mix/match AA options plus ReShade etc., I don't see DLSS being that big of a deal. SSAA and MSAA will still be the best overall AA (but with a major hit on performance).

Did you even see the performance boost, while still giving AA? At something like 4K you would need a panel >35 inches to even see the pixels that create the jagged edges full-screen AA is meant to fix.

I can see specialized partial AA, but SMAA/MSAA is really only effective at lower resolutions (720p/1080p rendered from 4K or greater assets; otherwise you simply get a blurry mess, sound familiar?), where downsampling the image creates artifacts, and let's be honest, how many games use assets higher-res than 4K? As long as it can handle the artifacting created by sampling and rendering detail smaller than a pixel, plus those special programmers trying to render the blue teapot in one pixel while still making it look like a teapot, AA for gamers will reach a point where it is no longer needed as a full-screen effect.

I read a neuroscientist's journal article stating that at less than an inch away, 720 PPI is the point where even the best human vision can't discern an individual pixel, and from two feet away the figure is about 320 PPI. For reference: a 34" 2K ultrawide is 110 PPI, 27" 2K is 109 PPI, 27" 4K is 163 PPI, 27" 5K is 207 PPI, and 27" 8K is 326 PPI.
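(If you want to sanity-check those figures yourself, PPI is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch, assuming standard 16:9 resolutions for the panel sizes quoted above:)

```python
# Rough PPI check: diagonal pixel count / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # 27" 4K -> 163
print(round(ppi(2560, 1440, 27)))  # 27" 2K -> 109
print(round(ppi(7680, 4320, 27)))  # 27" 8K -> 326
```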

With all that being said, we can clearly see 8K 27-inch monitors being the end of full-screen edge AA, because at that point you beat what the eye is capable of. Whether the pixel pitch at that point is fine enough that aliasing and artifacts won't be an issue is up to creators, and whether they can render something as fine as hair without resorting to sampling tricks. The shift to 8K assets may solve most of the texture issues; then again, will it stop creators from trying to render a blue teapot in one pixel? Probably not. They'll keep rendering the full teapot and downsampling it to one blue dot rather than just displaying a blue dot without artifacting.

Sorry it's long, I was on a roll, but here is the point: DLSS will matter at higher resolutions, where AA isn't really needed other than to de-artifact an image or clean up sloppy texture work (or that damn blue teapot). You get the same effect as AA without the huge performance hit, because it runs on a dedicated cluster of processing instead of stressing the CUDA cores that should be focused on rendering images, not cleaning them up. DLSS can also be added to previous games with little effort, versus having to make sure your whole rendering system can handle RT operations, which is a much larger overhaul. Something like Hellblade would benefit greatly, where the game looks amazing but hits your system like a truck; I hope CD Projekt Red adds it across the Witcher series. It's a big deal, and to be honest I think the industry is geeking out over DLSS more than RT. AA has always been a rose in gaming: sure it looks pretty, but those thorns.

While I will 100% agree the cards are overpriced for the consumer market, there are several factors to take into account beyond the fanboyism and sticker shock. Nvidia's marketing department is seriously crap, as bad as Apple's, but they did put out an amazing product:

1- The die size is huge; it's not cheap to make a chip that large.

2- There are clusters on the die that are workstation/computational centric, so these cards will cannibalize Nvidia's lower-end workstation products.

3- They have excess previous-gen product to sell due to overshooting the cryptocurrency market. Remember, we asked for this when people criticized AMD and Nvidia for not having enough product to counter the price inflation from third-party sellers, so that cost will be recouped from customers. I wouldn't be surprised if 2060 cards end up being retuned, rebranded Pascal 1080s and 2050 cards retuned Pascal 1070 chips.

4- Performance-wise, people were spoiled by the Maxwell-to-Pascal gains we saw, so when we go back to the ho-hum "new card matches the old card one tier up" kind of increase, people are underwhelmed.

I won't excuse Nvidia for the pricing, but people are going off the deep end over the price alone. Just wait: there will be a price drop once #2 and #3 are addressed and once the process matures and yields get a lot better. Personally, if you have a 10 series card, I'd wait for the next die shrink, maybe a 7nm 3000 series.
 
Even the 2080 ti isn't going to hold up well in the YEARS it will take for RT to be more than a toy for Nvidia to show off. I don't expect to see tons and tons of games add RT support.

Every major engine has it and dozens of high-profile games are releasing with it.

Performance is still a bit of a question, but the fabled '1080p60' misquote being pushed around looks to be a performance floor; other titles are using it at 4k, and the quoted developer said that 1440p60 would be easy using known optimizations that they hadn't gotten around to.

Remember that ray tracing isn't new, it's something that the industry (not just Nvidia!) has been working toward for years.
 
I don't listen to anything that Nvidia says... but developers?

Yeah, their words carry weight.

Devs who are often paid by one company to implement their version of GPU tech?

I listen to nothing but real-world facts; you would be doing yourself a favor if you did the same.
 
2x 680s (700 series and 900 series skipped), 1080 FE (20xx series skipped)... see you in a couple more generations for my next upgrade. :cool:
 
Every major engine has it and dozens of high-profile games are releasing with it.

Performance is still a bit of a question, but the fabled '1080p60' misquote being pushed around looks to be a performance floor; other titles are using it at 4k, and the quoted developer said that 1440p60 would be easy using known optimizations that they hadn't gotten around to.

Remember that ray tracing isn't new, it's something that the industry (not just Nvidia!) has been working toward for years.

11 games have been announced with RT support. It seems like, for now, most games that are adding RTX features are focusing on DLSS. Which makes sense, DLSS is probably much much easier to implement on short notice. I'd say 2019 will be the point where we'll see if I'm right or not.

Yeah, right now it's a lot of ifs and maybes. I still expect the 2070 to be basically useless for RT, but the 2080 is the big question. If the 2080 Ti can do the RT bells and whistles at 60fps 4K, awesome. If it's 1440p, still cool, but that definitely makes the two lower-end cards less appealing for that.

Indeed. Pixar has been using some form of ray tracing since Cars, and fully implemented it for Monsters University. It just requires so much power that it has taken until now for it to be viable (at least in theory) in consumer hardware. Been saying this since the RTX was announced, but I think in 4-5 years we'll be seeing it all over. As soon as consoles support it, ray tracing will be huge.
 
Generally agree, just want to add to this:

Been saying this since the RTX was announced, but I think in 4-5 years we'll be seeing it all over. As soon as consoles support it, ray tracing will be huge.

What we're seeing is 'hybrid' rendering; what actually gets ray traced is highly variable, and I'm expecting enough of that to be exposed in game settings to make the 2070 viable. Another way to put it, I don't expect ray tracing to just be an 'on or off' sort of thing.
 
11 games have been announced with RT support. It seems like, for now, most games that are adding RTX features are focusing on DLSS. Which makes sense, DLSS is probably much much easier to implement on short notice.
I read that RT is rendered at lower res and needs strong AA to look OK.
But RT has framerate issues even at 1080p, and the extra performance loss from normal methods of strong AA would kill it.
This is why DLSS is required; ray tracing would be too slow without it.
Games using a lot of RT must be given to Nvidia so DLSS can be optimised, which must carry a hefty cost.
This will harm the uptake of RT with smaller dev studios.
 
Thank god they didn't use RTX; that would have been a knife in the back.

I mean, why wouldn't I want to use a feature that will require me to run a 4K panel at 1080p so it can run any game decently?

Crap leak. 411 driver? Come on.
 
An easy way is to check reviews using Vega cards, not old results but new benchmarks.
Vega 64 used to trade blows with the 1080; now it will trade blows with the 1080 Ti.

Not really, unless you are saying under Linux on certain titles, or DX12 games on day 0 release.
 
Interesting that lower quality YUV 422 is slower than RGB.

Interesting observation. I wonder where in the rendering pipeline they are doing the RGB -> YUV conversion. Logically, I would have guessed that this was just a fixed-function conversion block in the actual display controller logic, there to send a signal to a display expecting YUV, and thus shouldn't impact performance at all.

With a performance difference, it makes me wonder if there is some sort of shader doing the conversion at the end. Weirder still would be if the game itself is doing the YUV processing internally. Some memory could be saved vs. RGB or YUV 4:4:4, but that is mostly academic on the PC.
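(For context on how light that conversion is, here's a minimal sketch of a BT.709 RGB -> YCbCr conversion with 4:2:2 chroma subsampling. This is just my own illustration of the math, not anything from Nvidia's driver or display pipeline:)

```python
# Minimal sketch: BT.709 RGB -> YCbCr, then 4:2:2 chroma subsampling.
# A few multiply-adds per pixel plus a horizontal average of chroma pairs --
# the kind of work a fixed-function block or a trivial post-process pass does.
import numpy as np

def rgb_to_ycbcr_422(rgb):
    """rgb: float array of shape (H, W, 3), values in [0, 1], W even."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    cb = (b - y) / 1.8556                       # blue-difference chroma, in [-0.5, 0.5]
    cr = (r - y) / 1.5748                       # red-difference chroma, in [-0.5, 0.5]

    # 4:2:2 = keep full-resolution luma, halve horizontal chroma resolution.
    cb422 = cb.reshape(cb.shape[0], -1, 2).mean(axis=2)
    cr422 = cr.reshape(cr.shape[0], -1, 2).mean(axis=2)
    return y, cb422, cr422

frame = np.random.rand(4, 8, 3)      # stand-in for a rendered RGB frame
y, cb, cr = rgb_to_ycbcr_422(frame)
print(y.shape, cb.shape, cr.shape)   # (4, 8) (4, 4) (4, 4)
```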
 
I wish the 2070 had been included; from the looks of it, that would land between the 1080 and 1080 Ti and possibly trade blows with the 1080 Ti at 4K, which isn't terrible for a card that will cost around what a 1080 costs. That's of course provided these benchmarks weren't overly cherry-picked and doctored up to appear better than what you'll see in practical general usage across games. Will have to wait on reviews, but it doesn't look terrible overall. I'm not too shocked given the higher memory and core clock speeds, which make a big difference to minimum/average frame rates.

...ray-tracing is the future but the present is still rasterization
Definitely, and a great part about ray tracing is that it's more parallel than rasterization, so SLI/CF works better with less hassle. I think that's part of why AMD is pushing to adopt PCIe 4.0 quickly for both its next-gen GPUs and motherboards: PCIe 4.0 doubles the per-lane transfer rate, so a 4.0 x16 link moves data twice as fast as a 3.0 x16 link does today, and multi-card scaling was never a serious issue with ray tracing anyway. They could pump out a 3-slot quad-core Vega 56-derived PCIe 4.0 x16 GPU if they wanted. It would do traditional ray tracing like a monster, all without any proprietary gimmickry. Vega was already great at standard ray tracing.
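(A quick back-of-the-envelope on that bandwidth claim, assuming 128b/130b line encoding and 8 GT/s vs 16 GT/s per lane for PCIe 3.0 and 4.0 respectively:)

```python
# Rough PCIe link bandwidth: lanes * transfer rate * encoding efficiency.
def pcie_bandwidth_gbs(lanes, gt_per_s):
    return lanes * gt_per_s * (128 / 130) / 8   # GB/s; 128b/130b encoding

print(round(pcie_bandwidth_gbs(16, 8), 1))    # PCIe 3.0 x16 -> ~15.8 GB/s
print(round(pcie_bandwidth_gbs(16, 16), 1))   # PCIe 4.0 x16 -> ~31.5 GB/s
```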

They could scale the core count from 1 to 4 easily as well: just make one PCB and fit as many as needed. Hell, they might even be able to make it upgradeable with drop-in chips. Make the heatsink latch more like a CPU cooler's and drop in another 1-3 GPU cores for an instant upgrade!
 
You're kidding, right?
Just saying Nvidia oversells their new-gen cards by using different benchmark methods that put them way ahead of their old gen, but the results are misleading, and the sad part is that most reviewers don't challenge it and just use old data for the other cards added to the slides.
So if any reviewer actually benches the 10 series, Vega, and the 20 series under the same methodology Nvidia asks them to use for the new gen, I'm confident you will see Vega cards suddenly become much faster relative to the 10 series, at least compared to what we've seen these past couple of years.
 
Just saying Nvidia oversells their new-gen cards by using different benchmark methods that put them way ahead of their old gen, but the results are misleading, and the sad part is that most reviewers don't challenge it and just use old data for the other cards added to the slides.
So if any reviewer actually benches the 10 series, Vega, and the 20 series under the same methodology Nvidia asks them to use for the new gen, I'm confident you will see Vega cards suddenly become much faster relative to the 10 series, at least compared to what we've seen these past couple of years.

You're on drugs.
 
Just saying Nvidia oversells their new-gen cards by using different benchmark methods that put them way ahead of their old gen, but the results are misleading, and the sad part is that most reviewers don't challenge it and just use old data for the other cards added to the slides.
So if any reviewer actually benches the 10 series, Vega, and the 20 series under the same methodology Nvidia asks them to use for the new gen, I'm confident you will see Vega cards suddenly become much faster relative to the 10 series, at least compared to what we've seen these past couple of years.

[H] did a pretty extensive review showing that both “fine wine” and nVidia gimping old cards via drivers didn’t hold water. People hear this nonsense once and it keeps getting repeated. It seems the general populace just loves conspiracy theories and a villain.

https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review/13
 
[H] did a pretty extensive review showing that both “fine wine” and nVidia gimping old cards via drivers didn’t hold water. People hear this nonsense once and it keeps getting repeated. It seems the general populace just loves conspiracy theories and a villain.

https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review/13
Yet I'm pretty sure benching these cards the way Nvidia will recommend reviewers do it (4K + HDR) will inherently put Vega 56 above 1080 level and Vega 64 very close to the 1080 Ti.
Deny it all you want, but that's what's going to happen; the only thing is that initial reviews won't show it.
 
Yet I'm pretty sure benching these cards the way Nvidia will recommend reviewers do it (4K + HDR) will inherently put Vega 56 above 1080 level and Vega 64 very close to the 1080 Ti.
Deny it all you want, but that's what's going to happen; the only thing is that initial reviews won't show it.

Ok, well come read the [H] review then. They didn’t sign the NDA, at a cost to themselves, so they had no restraints.

And using HDR isn’t anything fishy... it’s a tech that is getting way more popular.
 
Ok, well come read the [H] review then. They didn’t sign the NDA, at a cost to themselves, so they had no restraints.

And using HDR isn’t anything fishy... it’s a tech that is getting way more popular.
I wasn't saying it's fishy; I just said the guidelines Nvidia sends to reviewers will put the 10 series in a bad light and the 20 series in a better one, increasing a performance gap that 99% of gamers won't experience. One way to verify that is to include up-to-date Vega cards in the reviews; therefore, if Vega gains ground over the 10 series compared to what we've seen these last couple of years, then I'm right... and you know I'm right!
 
By this you are supposed to say, wow, the 2080 is 10-20% faster than a 1080 Ti.

Until we see real tests by reputable people it's all just Nvidia bulls#$t.

I see no reason for Vega cards to be included.
Overpriced, underachieving, and power hungry just don't fit in this type of testing.
 
I think the new cards sound pretty good which I guess puts me in the minority. It's still a decent performance uplift. But I'm not in the market for a high end card right now so it won't be helping NV's sales.
 
True, but it's 14 games that are relevant to most gamers, so that is worth considering. It's not something obscure that nobody will ever use, like AotS.

I agree with your first point. I trust NVidia less than most tech companies, but this is a broad suite of current games compared apples to apples. It's probably a reliable display of differences between the cards.

As for your offhand dig at Ashes of the Singularity: this is an incredibly good game; the most modern RTS out there with great depth, astounding graphics and game mechanics, and infinite replayability. I have hundreds of hours into it and look forward to many more. In my book it's better than 99.999% of the games out there. My GTX 1060 runs it just fine.

In fact it's laughable how hung up people are on GPU performance when there is so very, very little that's even worth playing. And quite a few of the good games would run on an iGPU.
 
At something like 4K you would need a panel >35 inches to even see the pixels that create the jagged edges full-screen AA is meant to fix.

Note on this: only if your eyes are crap. Which I assume is true for most people, but I'm sitting pretty far back from a 31.5" panel at 4K and I can still see individual pixels.

That's not to say that aliasing would bother me; I don't game on the screen, and personally I'd prefer something in the 35" range for 4k gaming as well, but not until there's a panel actually worth upgrading to.
 
I agree with your first point. I trust NVidia less than most tech companies, but this is a broad suite of current games compared apples to apples. It's probably a reliable display of differences between the cards.

As for your offhand dig at Ashes of the Singularity: this is an incredibly good game; the most modern RTS out there with great depth, astounding graphics and game mechanics, and infinite replayability. I have hundreds of hours into it and look forward to many more. In my book it's better than 99.999% of the games out there. My GTX 1060 runs it just fine.

In fact it's laughable how hung up people are on GPU performance when there is so very, very little that's even worth playing. And quite a few of the good games would run on an iGPU.

I’m glad you and the other 97 people on this planet enjoy it but that doesn’t make my comment wrong:

[attached image]
 