CONFIRMED. nVidia RTX 2080 Ti 50% faster in games over the 1080 Ti!

The only thing I can think is they want a soft launch of RT so that game devs put effort into including the tech in future games.

There's that, and that people will actually get to see it for themselves. You cannot discount the effect that having ray tracing, albeit with limited performance, is going to have on game developers and the industry as a whole.

They had to get hardware into people's hands as quickly as they could to get the momentum rolling, and at the very least Nvidia is capable of doing that on the current process tech without dropping the ball on raster performance.

We went from 'ray tracing is cool, but it'll never be fast enough for real-time desktop gaming' to 'fuck it, we're doing it!' in a few years' time.

Bravo Nvidia.
 
https://wccftech.com/nvidia-geforce-rtx-2080-gaming-performance-benchmarks-2x-over-gtx-1080-4k/

Those numbers look incredible.

As the drivers mature and games become even more optimized, numbers will only increase. It should be noted they are using 'early' drivers.

This takes a lot of worry out of me.

I knew that $1,310+ (with tax) price point almost guaranteed insane performance.

I'll take 50% performance increase any day of the week.

Idiots say what.
 
Shower thought: Would it have made more sense for Nvidia to save Tensor/RT cores for 7nm (GTX 3080) when they could've had smaller chips or more die space?
And then used the GTX 20 series just as a bigger traditional chip with more performance and, presumably, lower prices. It seems like such a weird time to roll out all these features simultaneously. 16nm is so old at this point.
Absolutely. As it stands right now, RT has been the talk of the town; MS has been trying for some time to make it officially supported in DX, and devs have been begging for hardware support. Nvidia made a core on an old process with actual hardware support. Who the fk cares if it's not a billion RT cores, we now have a starting point, and Nvidia (like with HDR) is going to be in the lead developing it with the developers.
 
Idiots say what.

Since he admitted that drivers will mature, which takes time, why not just wait and get it at a likely price drop? I get looking for excuses to explain a crazy expense, but if your excuse involves future benefits on a resource that doesn't have a limited production run and historically has had frequent sales (bitcoin boom aside), it's not a good excuse.
 
I was going to post the JayzTwoCents vid, but someone already posted it in another thread. Or is he a moron too? :)
Well, me personally, I like JayzTwoCents; I already saw his video and thought it was well done.
 
That's exactly what I've been thinking. They should have just waited until 7nm and then gone balls out on the number of RT cores. As it is now, we have a feature that will be a massive compromise on visual fidelity; at 1080p it's going to look like blurry, pixelated crap compared to running native resolution on a 4K monitor.

Nvidia just wants to recoup some of the cost now rather than later with the release of this card, since it has been in development for 10 years. Even if they had released it at 7nm, I am not convinced it would be good enough for 1440p; I still believe we are two generations away from having it run at 1440p at an acceptable framerate.
 
Nvidia's data for the 1070, 1080 and 1080 Ti launch were dead-on correct (60% over previous gen, 35% over 1080). Nobody likes to go back and acknowledge that because, y'know, GREEN BAD!
AMD is the one that manipulates performance in their marketing slides.

Whatever numbers Nvidia is putting out for Turing, I would assume they're accurate. Just make sure you keep an eye on DLSS and raytracing.
 
Nvidia is using deceptive practices to make Turing look better than it is.
 
Ironic isn't it that people who are discovering Pascal struggling with something may in fact want to upgrade for that very reason. I'm not in that camp (my 1070 plays everything I need it to) but this forum is replete with examples of people who have a 1080 Ti or a Titan Xp that struggle with current content.
 
Ironic isn't it that people who are discovering Pascal struggling with something may in fact want to upgrade for that very reason. I'm not in that camp (my 1070 plays everything I need it to) but this forum is replete with examples of people who have a 1080 Ti or a Titan Xp that struggle with current content.

I think the issue for more people isn't whether or not Pascal struggles but the cost to get over the hump. A 1080 ti at around $500-550 might be good enough when the next card up is $800. If it would have come in at $600 with new features and slightly better performance more people would be excited.
 
Bet that DLSS is some mode that kills current cards, leaving the real difference as the lower part in the Nvidia slide for most titles. Hence the smoke and mirrors; still decent gains, but nothing out of the ordinary.
PS: shitty Nvidia spam thread is shitty
 
Bet that DLSS is some mode that kills current cards, leaving the real difference as the lower part in the Nvidia slide for most titles. Hence the smoke and mirrors; still decent gains, but nothing out of the ordinary.
PS: shitty Nvidia spam thread is shitty

Current cards can't use DLSS, as it requires the Tensor cores for AI.
 
I think it's funny that anyone would even dream of 50% over a 1080 Ti. If it were literally that much faster, something tells me Nvidia would have mentioned it in their keynote, because a jump that big in one generation is pretty revolutionary in itself. So basically - IT. AIN'T. NEARLY. THAT. MUCH.

Stop drinkin the kool aid, kiddos.
 
I think it's funny that anyone would even dream of 50% over a 1080 Ti. If it were literally that much faster, something tells me Nvidia would have mentioned it in their keynote, because a jump that big in one generation is pretty revolutionary in itself. So basically - IT. AIN'T. NEARLY. THAT. MUCH.

Stop drinkin the kool aid, kiddos.

Drinking? Wat? They are guzzling it!

My prediction is that the 2080 Ti will eke out somewhere between 8-12% more performance at most when a sane comparison is made with a 1080 Ti. The fact that Nvidia has been silent and is only showing vague comparisons between the 1080 and the 2080 is pretty damning. Yes, the ray tracing tech is cool and all, but not in this first gen at 4K. It will take a bit to mature. Me, I'm waiting for the next Titan card. Going backwards in VRAM capacity bugs me.
 
If most of the extra performance comes from DLSS, I hope for their sake it doesn't suck. Something gimmicky like Quick Sync, where yes, it is faster, but the image quality suffers dramatically.

You would definitely think they would have mentioned the performance in the keynote. I don't think it will be only 10% more performance, but I would be surprised if it gets more than 30% without DLSS. And at 30%, it's a tough buy when a 1080 Ti can be had for around $500.

I would think about upgrading to the 2080 Ti at $800, no questions asked, but at $1200 they have to do a better job of convincing me.
 
Drinking? Wat? They are guzzling it!

My prediction is that the 2080 Ti will eke out somewhere between 8-12% more performance at most when a sane comparison is made with a 1080 Ti. The fact that Nvidia has been silent and is only showing vague comparisons between the 1080 and the 2080 is pretty damning. Yes, the ray tracing tech is cool and all, but not in this first gen at 4K. It will take a bit to mature. Me, I'm waiting for the next Titan card. Going backwards in VRAM capacity bugs me.

The 2080 Ti has improved CUDA cores, 21% more of them, and 27% more bandwidth. 8-12% seems kinda precise, and a bit low lol.

Also rumored to OC 5-10% higher...
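For what it's worth, those spec deltas check out. A quick Python sanity check (the core counts and bandwidth figures below are the commonly cited specs, not numbers from this thread):

```python
# Rough sanity check of the quoted spec deltas between the two cards.
CORES_1080TI, CORES_2080TI = 3584, 4352   # CUDA core counts
BW_1080TI, BW_2080TI = 484, 616           # memory bandwidth, GB/s

core_gain = (CORES_2080TI / CORES_1080TI - 1) * 100
bw_gain = (BW_2080TI / BW_1080TI - 1) * 100

print(f"~{core_gain:.0f}% more CUDA cores, ~{bw_gain:.0f}% more bandwidth")
# → ~21% more CUDA cores, ~27% more bandwidth
```

Which is exactly where the "21% more of them, and 27% more bandwidth" figures come from, so even before clock or IPC changes, 8-12% does look low.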
 
The 2080 Ti has improved CUDA cores, 21% more of them, and 27% more bandwidth. 8-12% seems kinda precise, and a bit low lol.

Also rumored to OC 5-10% higher...

Even if the jump is as dramatic as the 980 Ti to the 1080 Ti, it's still $600-700 more than the 1080 Ti right now. A brand new 1080 Ti was $699. The more comparably priced 2080 is probably only a 15% improvement.
 
Reading OP's post the term "NVIDIOT" comes to mind. I can't believe how much free publicity NVIDIA is getting right now. Give it a rest. They don't deserve all this attention.
 
If anyone watched the Nvidia show, he said FP32 and INT32 shading will run together and it will be 1.5x more performance than the last generation of GPU, meaning the 1080 Ti. There is no way it's 50 or 60% higher than the 1080 Ti.
 
We will see... I'm an optimist; I'll even stretch it to 15%. But 50%? That's crazy talk.

Not an Nvidia fanboy or anything here, but one of the things that has me mildly interested in the 2000 series is a mention somewhere comparing it to the multi-generational jump seen with the 8800 series way back when. We haven't really seen a jump like that since, but I am expecting a 50% increase at the very least. I'd say anything less, with the price increases they're throwing up there, won't go over well. We'll see soon enough.
 
Always remember.
Never forget.
They will bork the previous generation to make these fantasies come true.
 
I'm all for graphical enhancements but uhhh ...spending all that money to play at 60fps isn't gonna do it for me.
 
Not an Nvidia fanboy or anything here, but one of the things that has me mildly interested in the 2000 series is a mention somewhere comparing it to the multi-generational jump seen with the 8800 series way back when. We haven't really seen a jump like that since, but I am expecting a 50% increase at the very least. I'd say anything less, with the price increases they're throwing up there, won't go over well. We'll see soon enough.

A price increase doesn't go over well regardless. Pascal demolished the previous gen, and it was priced similarly. This is Nvidia bending people over and raping their wallets with this pricing. If they are going to increase the price for 50% more performance when previous gens came in at the same price, it had better be 200% faster for them to almost double the price on the Ti! lol.
 
A price increase doesn't go over well regardless. Pascal demolished the previous gen, and it was priced similarly. This is Nvidia bending people over and raping their wallets with this pricing. If they are going to increase the price for 50% more performance when previous gens came in at the same price, it had better be 200% faster for them to almost double the price on the Ti! lol.

I don't need 4K performance in my box; what I need is improved 1440p performance. In shooters these days I'm on low or medium settings to get my 165 FPS, but when I do play singleplayer I would like 1440p to look nice. My brain/reaction time is so much better at 1440p than at 4K or 1080p, and I have no idea why.

In short the 2070 better not suck...
 
Current cards can't use DLSS, as it requires the Tensor cores for AI.
So it is using the ray tracing cores then? So some injection-type setup for now, if it's not a native ray tracing game?
 
Ok. How do we get these other web sites to stop posting this "fake news" then?

I didn't realize all these charts were fake. My apologies.

Stop posting from shite rumor sites.
You are part of the problem...like it or not!
 
https://wccftech.com/nvidia-geforce-rtx-2080-gaming-performance-benchmarks-2x-over-gtx-1080-4k/

Those numbers look incredible.

As the drivers mature and games become even more optimized, numbers will only increase. It should be noted they are using 'early' drivers.

This takes a lot of worry out of me.

I knew that $1,310+ (with tax) price point almost guaranteed insane performance.

I'll take 50% performance increase any day of the week.

What?
The 1080 Ti was about 80% faster than the 980 Ti at 4K resolution.
You're drooling over a 50% increase at nearly twice the price?
I'm not impressed and I will pass.
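Quick back-of-the-envelope on that point, assuming the $699 1080 Ti launch MSRP, the $1,199 2080 Ti price, and the rumored 50% gain (all assumptions, nothing confirmed):

```python
# Perf-per-dollar of the 2080 Ti relative to the 1080 Ti, using the
# assumed launch prices and the rumored performance gain.
price_1080ti, price_2080ti = 699.0, 1199.0
perf_gain = 1.50                          # rumored 2080 Ti perf vs 1080 Ti

price_ratio = price_2080ti / price_1080ti  # ~1.72x the cost
value_ratio = perf_gain / price_ratio      # perf/$ relative to the 1080 Ti

print(f"{price_ratio:.2f}x the price for {perf_gain:.2f}x the performance")
print(f"perf/$ vs 1080 Ti: {value_ratio:.2f}")  # below 1.0 means worse value
```

So even if the 50% figure holds, you're paying roughly 1.7x the money for 1.5x the frames, i.e. worse performance per dollar than last gen.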
 
What?
The 1080 Ti was about 80% faster than the 980 Ti at 4K resolution.
You're drooling over a 50% increase at nearly twice the price?
I'm not impressed and I will pass.
You're also getting Tensor and Raytracing cores, with the die being like 60%+ larger than the 1080 Ti.
I'm not saying the price is justified, but let's not over-simplify it.
 
You're also getting Tensor and Raytracing cores, with the die being like 60%+ larger than the 1080 Ti.
I'm not saying the price is justified, but let's not over-simplify it.

Fine, let's not oversimplify it and analyze instead.
Ray tracing is barely usable at 1080p/60 in current-gen games. You can forget it at 4K, so ray tracing is basically useless.
A 2080 Ti to game at 1080p? Who's crazy enough to do that when there are barely any games out there that support RT?
Those on a budget game at 1080p, and they are not going to spend $1200 on a video card. Most serious gamers are on at least 1440p, since that resolution supports 144Hz (and beyond?), and RT is too demanding at that resolution.
 