NVIDIA GeForce RTX 2080 Ti unable to run Shadow of the Tomb Raider at 60 fps at 1080p with RTX on

It's the same as SM3 support in the 6800 series back in 2004: a marketing gimmick, since using it tanked performance, but it gave nVidia something ATI didn't have at the time. Anyway, time will tell how these cards perform. We'll know in a month or sooner.
 
I mean, it's not like ray tracing is GameWorks. It's an open standard built into DirectX (DXR). Developers will implement it for sure; it'll just take a few generations before it runs well, judging by what we're reading online.
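For what it's worth, DXR support really is just a DirectX 12 feature query. Here's a minimal sketch (mine, not from the thread) of how an app can ask whether the GPU and driver expose hardware ray tracing, assuming a Windows 10 SDK new enough to ship the DXR headers:

```cpp
// Minimal sketch: ask D3D12 whether the device supports DXR ray tracing.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter; fails if no DX12-capable GPU is present.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No DX12 device.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("Hardware DXR supported.");
    } else {
        std::puts("No hardware DXR on this device/driver.");
    }
    return 0;
}
```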
 
And this is why SLI is needed: to run the latest ultra-high settings.
4K and ray tracing don't have much value without it.
 
Why would this mean SLI is needed? SLI barely gets any support from developers, and you get 50% scaling if you're lucky.

Low benefit.
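To put rough numbers on that (illustrative fps, not benchmarks): even at an optimistic 50% gain from the second card, you pay double for half a card's worth of extra frames.

```cpp
// Back-of-envelope SLI scaling math: hypothetical numbers, not measured results.
#include <cstdio>

int main() {
    double single_fps = 40.0;  // assumed single-card fps with RTX on
    double scaling    = 0.50;  // optimistic gain from the second card
    std::printf("SLI fps: %.0f\n", single_fps * (1.0 + scaling));  // 60
    return 0;
}
```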
 
> I mean, it's not like ray tracing is GameWorks. It's an open standard built into DirectX (DXR). Developers will implement it for sure; it'll just take a few generations before it runs well, judging by what we're reading online.

I'm no expert, but since DX12, a lot of features that were left to developers instead of GPU drivers have gone by the wayside. Things like spreading work across multithreaded CPUs, and the biggie:

SLI. If developers can't even be bothered to implement SLI now, a couple of years after DX12 was introduced, what's the incentive to bother with something totally new like ray tracing?
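That's the crux of it: under DX12's explicit multi-adapter model, the application itself has to enumerate the GPUs and split work between them; no driver-side SLI profile does it anymore. A rough sketch of just the enumeration step (my illustration, using the standard DXGI API):

```cpp
// Sketch: enumerate hardware adapters, the first step an app must take
// before building its own multi-GPU path under DX12 explicit multi-adapter.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // Each GPU needs its own D3D12 device, queues, and cross-adapter
        // resources -- all created and scheduled by the application.
        std::wprintf(L"Adapter %u: %s\n", i, desc.Description);
    }
    return 0;
}
```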
 
During the Battlefield V demo, the guy from DICE said NVIDIA's middleware made it easy.
 
Cool. Means I can keep my secondary 144 Hz 1080p G-Sync monitor and still not even max it out with a 20-series card that costs a mere $800-1,000. Who needs higher res anyway?
 
> SLI. If developers can't even be bothered to implement SLI now, a couple of years after DX12 was introduced, what's the incentive to bother with something totally new like ray tracing?

I know. But this is why SLI shouldn't have been killed off. We're stuck buying $1,200 cards, and it's still not enough to get 60 fps at 4K.
 
It's not that. Instead of pushing pure horsepower, Nvidia decided they were going to push DLSS and ray tracing. It's nice having the new tech and all, but that's why it's best to wait for the next-gen 7nm cards.
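The rough arithmetic behind that trade (my numbers; the actual DLSS internal resolution per game wasn't public at the time): rendering at 1440p and upscaling to 4K shades well under half the pixels of native 4K.

```cpp
// Pixel-count ratio of a hypothetical 1440p internal render vs native 4K.
#include <cstdio>

int main() {
    double internal = 2560.0 * 1440;  // assumed DLSS internal resolution
    double native   = 3840.0 * 2160;  // native 4K output
    std::printf("shaded: %.0f%% of native\n", 100.0 * internal / native);  // 44%
    return 0;
}
```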
 
> I know. But this is why SLI shouldn't have been killed off. We're stuck buying $1,200 cards, and it's still not enough to get 60 fps at 4K.

I'm not sure where I read it (or maybe it was a video), but I hear Nvidia is going to have a renewed focus on SLI with this generation.

I have PT hella early tomorrow, so I'm going to bed, but I'll look for the article/video tomorrow unless someone else finds it first.
 
I am sure it will, to get everyone on the RTX bandwagon and make them buy two $1,200 cards. They'll sure push SLI when they know they can gouge wallets. "Look, RTX, we're moving things forward... but you'll need two cards to run it properly." LOL
 
They also know the new feature set is going to push performance down significantly, so more folks will be looking at SLI. As soon as someone enables ray tracing and sees their FPS plummet, SLI is the only option to keep up.
 
> I am sure it will, to get everyone on the RTX bandwagon and make them buy two $1,200 cards.

I'm not a huge fan of the pricing, but I was much, much more frustrated by the long wait and by pre-orders going up before there were any benchmarks whatsoever.

Ti and Titan are just names. I look at it like this: the original Titan, Titan Black, and Titan X were all released at $1,000, and the newer Titan X and Titan Xp were released at $1,200. When the first Titan launched, I believe the intention was for it to be a workstation card with gaming capabilities, but instead it became a gaming card with workstation capabilities. Later renditions of the Titan evolved into high-end gaming cards with a high-end price, giving us a sneak peek at future mainstream GPU performance (I concede this was probably done to cleverly raise the price of their top-dog GPU line). A small minority were upset, but the majority of consumers were still fine with the pricing tiers.

Then the Titan V was released with a $3,000 price tag, a renewed marketing focus on its workstation capabilities, and the specs to back it up. Given the Titan V's price and its deep-learning performance, I just don't look at it as a gaming card anymore. Instead, I look at the Ti as the new "Titan"... and at $1,200, it's right in line with previous generations.

Now it had better have the performance to match.

P.S.
Nvidia can still eat shit.
 
Funny thing was, we were speculating on this very forum that this was what they were going to do when the Titan started to focus more on gaming performance. Some even facetiously said they should start calling the Titan a Ti. And here we are.

Exactly. They should have just called it the Titan Xt and avoided this river of tears.

Agreed. All the internet nerd rage could have been avoided had they gone this direction: 2080 Ti = Titan, 2080 = 2080 Ti, 2070 = 2080.
 
How would they release a $1,500 Titan then???
 
I am bugged by the fact that ray tracing will not be supported by more mainstream cards like the 2060 and is not supported by consoles. In the short term, developers will introduce this feature in a few games to help NVIDIA promote its new cards, but it seems unlikely that RT will gain significant momentum in the next 2-3 years.

I was surprised by the release of the Ti version, given that these usually come later. I suspect the window for this new (12nm) generation has been reduced, given the maturing of the new 7nm process.
 
> I'm not sure where I read it (or maybe it was a video), but I hear Nvidia is going to have a renewed focus on SLI with this generation.

I hope so. 4K @ 120 will not be a reality without it. I know SLI historically isn't efficient or reliable, but the new interface (NVLink) looks promising.
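For a sense of scale (my arithmetic, not from the thread): 4K at 120 Hz is exactly eight times the raw pixel throughput of 1080p at 60 Hz, before ray tracing adds anything on top.

```cpp
// Raw pixel throughput: 4K @ 120 Hz vs 1080p @ 60 Hz.
#include <cstdio>

int main() {
    double uhd120 = 3840.0 * 2160 * 120;  // ~995 Mpix/s
    double fhd60  = 1920.0 * 1080 * 60;   // ~124 Mpix/s
    std::printf("ratio: %.1fx\n", uhd120 / fhd60);  // 8.0x
    return 0;
}
```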
 
Bahaha, this was funny. I stole this from the internet. It does sound like the kind of reviewer guide that company would hand out... ROFL.

How to test games between the 1080 and 2080:
>Include a demo made for Turing
>Show bigger bars if DLSS is implemented
>Test at 4K cuz the 1080 doesn't have the bandwidth for it
>Test with HDR on cuz it tanks performance on Pascal
>Don't mention G-Sync cuz that further tanks performance on Pascal with HDR on

Haha
 