Nvidia RTX 2080 Ti Versus GTX 1080 Ti Benchmarks Are Out

Most of the performance bump comes from:
1 - memory bandwidth
2 - crippling the old generation
At this point I wonder if the 1080p performance bump would even be bigger than 10%.
I wouldn't be surprised to see most 20-series reviews test exclusively at 4K, or even HDR, FXAA, etc...
 
Adored did the memory-bandwidth-scaling versus performance-scaling comparison at 4K, along with HDR... 20% more cores and 27% more bandwidth would quite reasonably make up the difference at 4K. Too bad they won't release a non-RTX card with 5k+ CUDA cores and HBM2 :ROFLMAO:
 
Have to say those scores look really close to what I get with my 1080 SLI rig at 4K. Paid around $1100 for those. Interesting how the price/performance is so close.
 
Well..

The 1080ti has 3584 CUDA cores, 224 texture units and 88 ROPs.
The 2080ti has 4352 CUDA cores, 272 texture units and 88 ROPs.

The 2080ti has approx 21% more cores than the 1080ti... a 20-25% bump in performance could easily be accounted for by just the existence of the additional 768 cores.
Exactly!

This is why I made just that estimate and hedged for the dual INT/FP path helping in edge cases (perhaps those minimum frames?) plus the slight bump in clocks.
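
For reference, a quick back-of-the-envelope check of those percentages (core counts from the quote above; the bandwidth figures are the published 484 GB/s vs 616 GB/s numbers, and linear scaling is of course only a rough assumption):

```python
# Back-of-the-envelope scaling check, using the spec numbers quoted above.
# Bandwidth figures are the published 484 GB/s (1080 Ti) vs 616 GB/s (2080 Ti);
# linear scaling with cores/bandwidth is only a rough first-order assumption.

cores_1080ti, cores_2080ti = 3584, 4352
bw_1080ti, bw_2080ti = 484, 616  # GB/s

core_gain = cores_2080ti / cores_1080ti - 1
bw_gain = bw_2080ti / bw_1080ti - 1

print(f"CUDA cores:       +{core_gain:.1%}")  # ~+21.4%
print(f"Memory bandwidth: +{bw_gain:.1%}")    # ~+27.3%
```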
 
I'm hopeful that NVLINK will make dual GPUs scale again, but we'll see.

Me too, but the few things I saw about dual link adapters put them at nearly the same cost as a card. Ultimately we'll all need to see the full post-release benchmarks to get more credible numbers to chew on. For now, these early releases have more hype than bite.

edit: NVlink not dual link.
 
Me too, but the few things I saw about dual link adapters put them at nearly the same cost as a card. Ultimately we'll all need to see the full post-release benchmarks to get more credible numbers to chew on. For now, these early releases have more hype than bite.

edit: NVlink not dual link.
The NVLINK for RTX was about $80 IIRC, it's the Quadro NVLINKs that are $600 or so. At this point, it's unclear why the pricing difference. Faster MegaTransfers? More lanes? I don't know.
 
If you completely ignore the third of the die devoted to ray tracing and DLSS, that may be true.

Hopefully those new drivers will allow some impressive usage of the new tech on old games. Otherwise it's like a slap in the face for the last 10 years of games, and potential buyers should know they're just buying tech that will underperform by the time it's fully supported on the software side.
 
The NVLINK for RTX was about $80 IIRC, it's the Quadro NVLINKs that are $600 or so. At this point, it's unclear why the pricing difference. Faster MegaTransfers? More lanes? I don't know.
Thanks! Didn't know that. For all my criticism, I am on the fence for these.
 
Well..

The 1080ti has 3584 CUDA cores, 224 texture units and 88 ROPs.
The 2080ti has 4352 CUDA cores, 272 texture units and 88 ROPs.

The 2080ti has approx 21% more cores than the 1080ti... a 20-25% bump in performance could easily be accounted for by just the existence of the additional 768 cores.
Exactly, there is no comparison because it's not the same product at the same price point.
 
Hopefully those new drivers will allow some impressive usage of the new tech on old games. Otherwise it's like a slap in the face for the last 10 years of games, and potential buyers should know they're just buying tech that will underperform by the time it's fully supported on the software side.

Any game from 2+ years ago... you don’t need DLSS. It’s up to the game devs to implement ray tracing.

There are already 20+ titles incorporating RTX features and it hasn't even launched yet.
 
so in other words after nearly two and a half years there has been Zero Performance increase per dollar...

Honestly that sounds somewhat normal. Look at performance per watt, since I'm assuming you pay for power and AC (two cards can get the room toasty).

Of course 1 card vs two is better... not all games scale well, less stutter, lower ambient temperature in the case. Mobos are less expensive if you only need 1 PCIe x16 slot.

Plus future games with ray tracing or whatever they are calling it might look good enough to warrant the upgrade. I'm excited to see what the new Metro game will look like.

Obviously you will have little incentive to upgrade regardless until next generation, considering you bought two top-end cards to SLI.
 
Honestly that sounds somewhat normal. Look at performance per watt, since I'm assuming you pay for power and AC (two cards can get the room toasty).

Of course 1 card vs two is better... not all games scale well, less stutter, lower ambient temperature in the case. Mobos are less expensive if you only need 1 PCIe x16 slot.

Plus future games with ray tracing or whatever they are calling it might look good enough to warrant the upgrade. I'm excited to see what the new Metro game will look like.

Obviously you will have little incentive to upgrade regardless until next generation, considering you bought two top-end cards to SLI.
No, it's not remotely normal. A single 1080 TI destroyed 980 SLI for not much more than half the price. Here the 2080 TI costs more than 1080 SLI and will be a lot slower. So if anything it's the complete opposite of normal...
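
A rough price-only sanity check on those SLI comparisons (launch MSRPs from memory, so treat them as approximate; no performance numbers assumed here):

```python
# Price-only sanity check on the SLI comparisons above. Launch MSRPs from
# memory -- treat as approximate (FE pricing used for the 2080 Ti).
prices = {
    "GTX 980":      549,
    "GTX 1080":     599,   # FE was $699
    "GTX 1080 Ti":  699,
    "RTX 2080 Ti": 1199,   # FE; reference MSRP was $999
}

ratio = prices["GTX 1080 Ti"] / (2 * prices["GTX 980"])
print(f"1080 Ti vs 980 SLI:  ${prices['GTX 1080 Ti']} vs ${2 * prices['GTX 980']} ({ratio:.0%} of the price)")
print(f"2080 Ti vs 1080 SLI: ${prices['RTX 2080 Ti']} vs ${2 * prices['GTX 1080']}")
```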
 
The 1080 TI was exceptionally good, but this still seems like average gains, especially considering how good the 1080 TI is to begin with. I think it largely depends on the visual enhancements new games will bring rather than pure performance in older games.

As far as pricing... yeah, it's horrendous what's happened lately.
 
The 1080 TI was exceptionally good, but this still seems like average gains, especially considering how good the 1080 TI is to begin with. I think it largely depends on the visual enhancements new games will bring rather than pure performance in older games.

As far as pricing... yeah, it's horrendous what's happened lately.
754mm^2 dies. These are monstrous chips. If they were Intel CPUs they'd be $30k.
 
The performance jump from 1080ti to 2080ti is maybe 40% for now, but it should increase further in the future because, guess what, now that Turing has async compute hardware and something similar to AMD's rapid packed math (which explains the massive jump in FC5), more developers will integrate that tech into future games. I think the transition from 1080ti to 2080ti is like Kepler to Maxwell, where all the perf gain came from architectural differences, unlike 980ti to 1080ti where almost all the perf gain came from a more advanced lithography node. To the people who buy Pascal now: you are walking the Kepler path :D.
 
Seems like a shackled beast to me, without drivers. Look at that 70% Min gain in Far Cry 5. How is that not impressive? Glad I got in a Pre-Order because worst case I sell it for $2k.

Just keep telling your wallet that...
 
I'm going to guess this test doesn't use DLSS... that will be the really interesting thing with Turing. How many games is it compatible with and what is the actual real world improvement?

This is the million dollar question. RT lighting and reflections, to my mind, aren't the real winner here, even if they're some serious eye candy with potential benefits later for competitive FPS gaming and of course horror titles. DLSS will be the greatest thing since sliced bread. Seeing that it can be side-loaded into a game that is 50% or more through development, I'm willing to bet almost all FPS games will offer it as an option in the future, because GameWorks isn't required, so developers may be more willing to put the work into it.

TechRadar leaked that at the event they tested many games on a 4K monitor and all were hitting 100+ fps. Specifics on traditional gaming are unknown, but if there was a 20-40% boost in FPS at 4K and DLSS was enabled, I think that gives credibility to their statement. I imagine the games tested were the ones on the Nvidia chart slide released later, which lines up with their statements as well.

Either way, I think developers are more excited about this than consumers are. Jensen kept saying "it just works" over and over again, which I believe was aimed more at developers than consumers. I wouldn't be surprised if we get mass adoption faster than with several previous technologies. The fact of the matter is 8K monitors and HDTVs are already hitting the market at beyond-prosumer prices, and I would expect 3-5 years before they become regular prosumer PC market components. Hopefully with DLSS and newer process nodes we will at least get entry-level performance by then.

As for AMD, I hope they are developing alternative technologies to rival Nvidia's, but I think they are at a disadvantage: Nvidia invested a lot into neural net/AI/mesh/machine learning technologies while AMD was focusing elsewhere, or not as intensively.
 
Great info.
That means I can keep $1,200 towards my Crossfired 7nm setup.

You mean for all those CrossFire-supported multi-adapter games out there, and all that mass-market Vulkan adoption with async compute being the end-all be-all? TBF you would be wasting your money as well, and you may be waiting a long time. AMD's money and resources are on the CPU side, so I really wouldn't bank on anything impressive. The console market that gives them consistent returns will be their primary focus for graphics technology; chasing the high end of GPUs would only drain AMD's money further and put them back into debt. Their bread and butter, where they get the best returns, is budget and mainstream, so I seriously wouldn't expect anything from them. Of course the wild card in all of this is Intel. Honestly I wouldn't buy anything till late 2019/2020 when the next console generation gets released. By then the newer graphics technologies will be more standard and Nvidia will have the 3000 series, which will be better. AMD may pull a hat trick, but I doubt it, and Intel will re-enter the market and may surprise us.
 
Well..

The 1080ti has 3584 CUDA cores, 224 texture units and 88 ROPs.
The 2080ti has 4352 CUDA cores, 272 texture units and 88 ROPs.

The 2080ti has approx 21% more cores than the 1080ti... a 20-25% bump in performance could easily be accounted for by just the existence of the additional 768 cores.
That is a great catch. Thanks man.
 
The NVLINK for RTX was about $80 IIRC, it's the Quadro NVLINKs that are $600 or so. At this point, it's unclear why the pricing difference. Faster MegaTransfers? More lanes? I don't know.

It's simply because they can. They milk the Pro side for extra revenue as a regular part of doing business. Been that way since they first started making Pro cards.
 
It's simply because they can. They milk the Pro side for extra revenue as a regular part of doing business. Been that way since they first started making Pro cards.
Please link proof that there are no technical differences between them. NVLINK comes in several flavors.

Professional markets always require more support and validation, which increases costs.
 
so in other words after nearly two and a half years there has been Zero Performance increase per dollar...

Dat raytracing tho....

Why hasn't anyone remembered how horrible the reviews of the Star Trek reboot were?

All those lens flares and bright flashing reflections drew a lot of criticism of the movie.

Can't help but think the same of any game that Nvidia gives the "J.J. Abrams" effect to.
 
Dat raytracing tho....

Why hasn't anyone remembered how horrible the reviews of the Star Trek reboot were?

All those lens flares and bright flashing reflections drew a lot of criticism of the movie.

Can't help but think the same of any game that Nvidia gives the "J.J. Abrams" effect to.
RT doesn't have to be that way.

RT mimics the path real light takes and thus is far more realistic than the lighting hacks currently used. It's up to the designers what to do with that ray when it hits a surface. The over-shiny problem is due to the designers' choice to model the surface that way, not a fault of RT. For example, the BF5 demo had some decent RT effects on the rifle's wood that weren't shiny.

Demos tend to overdo things so they're immediately obvious to people not familiar with the technology. That's why there's so much shiny going on.
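
To illustrate the "designer's choice" point, here's a toy sketch (not BF5's actual RTX pipeline, and the roughness parameter is a made-up stand-in for a real material model): the same incoming ray and the same reflection step, and only the material setting decides how mirror-like the result looks.

```python
# Toy illustration only (not BF5's actual RTX code): in a ray tracer, what a
# reflected ray does at a surface is a material decision. A hypothetical
# "roughness" parameter scatters the mirror direction, so surfaces only look
# chrome-shiny if the artist sets them up that way.
import math
import random

def reflect(d, n):
    """Perfect mirror reflection of direction d about surface normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

def scatter(d, n, roughness):
    """Mirror reflection jittered by roughness (0 = chrome, 1 = nearly matte)."""
    r = reflect(d, n)
    jittered = [ri + random.uniform(-1, 1) * roughness for ri in r]
    length = math.sqrt(sum(c * c for c in jittered)) or 1.0
    return tuple(c / length for c in jittered)

# Same incoming ray, same surface -- only the material parameter differs.
incoming, normal = (0.0, -1.0, 0.0), (0.0, 1.0, 0.0)
print("polished metal:", scatter(incoming, normal, roughness=0.05))
print("worn wood:     ", scatter(incoming, normal, roughness=0.60))
```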
 
These bench numbers look funky.

According to the leak, the 1080 Ti averages 44 fps in Witcher 3 at 4K max settings.

Um...what? That's barely above what a 980 Ti averages. A 1080 Ti averages closer to 65 fps.

[Attached graph: Witcher3_3840x2160_FRAPSFPS_0.png]
 
Looks like it averages 65 fps to me in that graph (unless your graph is different from the article's; did not read). =)
 
I think you mistook the Vega 64 line for the 1080 Ti line.

I think you didn't watch the Turkish 2080 Ti vs 1080 Ti video, which is the subject of this thread. I'm comparing the leak claiming 44fps for 1080 Ti and 56fps for 2080 Ti in Witcher 3 versus an actual legit benchmark, which I linked.
 
I think you didn't watch the Turkish 2080 Ti vs 1080 Ti video, which is the subject of this thread. I'm comparing the leak claiming 44fps for 1080 Ti and 56fps for 2080 Ti in Witcher 3 versus an actual legit benchmark, which I linked.

Pretty sure that a stock 1080 Ti at launch was seeing around 44 FPS average at 4K ultra (including hairworks). Without hairworks it looks like 60 FPS was the average at ultra.
 