GeForce RTX 2080 and RTX 2080 Ti Reviews

To be clear, I am saying it is stupid to buy a 1080 Ti right now (though it was a great purchase 18 months ago). This is because of the potential performance improvement from DLSS if you run AA in any of the games you play.

The games slated for DLSS are here.

The question is when, not if. Buying a 1080Ti will get you a few months of fun and then regret when DLSS is supported in one of the games that you play.

Of course, this is assuming DLSS works as advertised, which is why I say "buy a 2080 and hope".
https://www.hardocp.com/news/2018/09/20/nvidia_dlss_analysis/
 
Rise of the Tomb Raider (and likely SotTR) will exceed 9GB VRAM usage when run at 4k with high AA settings, and that's *without* supersampling. It's a thirsty son of a bitch.

Edit to add linky link:
https://www.hardocp.com/article/2016/02/29/rise_tomb_raider_graphics_features_performance/13

Although the 3GB loss is a bummer, this is once again an example of confusing how much VRAM is used with how much is needed.

"Kudos to the AMD Radeon R9 Fury X for not breaking or seeming to run out of VRAM with no long pauses. Surprisingly there weren't long pauses like one would expect when running out of VRAM capacity. It must be that the dynamic VRAM is able to be leveraged and keep the video card from stalling out."

So no, 8GB will NOT be a problem, at least in this game, so long as you have at least 16GB of system RAM.
 
Honestly, I “read” some of the reviews, but I don’t give a shit; I really just care about the RT performance and real-game analysis, so I’ll wait till those start popping up to see if I want to buy it. I don’t “need” more performance out of my games, but I do want to utilize RT when it hits the market.
 
Oh and for the record, buying an 18-month-old 1080 Ti is stupid. Your options are

1) Wait for the [H] review, then either...
2) wait 2 years for the next generation, or
3) buy a 2080 or 2080 Ti and hope you get the value out of it over the next 2 years.

Normally I would agree that it is better to buy a current-gen midrange card than a previous-gen high-end card at the same performance. You benefit from newer technology, lower power use, lower temps, etc.

With the prices this gen, however, and the fact that power use is mostly the same, I wouldn't feel wrong recommending a 1080 Ti to someone gaming at 1440p or less.
 
Once ray tracing starts to take off you'll probably have to jump back to SLI as well, if any games use it well enough to actually make it worth the performance hit. Hopefully RT and SLI/NVLink work together so people who want to buy two 2080s or 2080 Tis can make full use of the setup.

It won't. As of right now, hybrid rendering (we aren't actually doing pure ray tracing here, nor do the 2080 cards have the power for it) is basically like GPU PhysX, and it will remain that way unless and until the consoles support it. It is the consoles that drive the market, not PCs, and no game developer is going to invest valuable programmer time in anything but the most cursory support when it will be limited to a minority of gamers.
 
The price delta between a 2080 and a new 1080 Ti isn’t hundreds of dollars though... it’s about $100:

https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

Now the question turns into “Do I want to spend an extra $100 for the potentially huge upsides RTX tech will bring?” For a lot of people, that answer will be yes.

I think the new AI-enhanced, tensor-core-based AA will be an interesting tech to look for out of these products.

Other than that, I think the ray tracing will be a fad. The performance impact is just too great. It will be like those sexy Asus 3D shutter glasses I got free with some video card in the early 2000s: a tech that was before its time and spends its life unused.

I found those things in a box in my basement about 7 years ago (when I was terribly overweight):

FB_IMG_1537465343800.jpg


Sexy.
 
So don’t buy a 1080 Ti now, but instead buy a more expensive product for features that are promised in the future.


So we are looking for that Nvidia Fine Wine now....?;)
 
So, I wanted to see a value proposition for the cards. I used the PC Per reviews and created a spreadsheet to calculate the price per frame: I took the MSRP and the street price and divided each by their average frame rate. My Excel-fu isn't that good, so here's just the raw data, sorted by game, then MSRP. So what is the value of the new cards? With the exception of Wolfenstein 2, not that good.

upload_2018-9-20_14-12-46.png
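For anyone who wants to sanity-check that kind of spreadsheet without Excel, the cost-per-frame math is just price divided by average FPS. A minimal sketch (all the card numbers below are invented for illustration, not PC Per's data):

```python
# Cost per frame = price / average FPS. Lower is better.
# All figures are made up for illustration; plug in real review numbers.
cards = {
    # name: (price in USD, average FPS in some benchmark)
    "GTX 1080 Ti": (699, 60.0),
    "RTX 2080": (799, 62.0),
    "RTX 2080 Ti": (1199, 78.0),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame")
```

Running it once with MSRP and once with street price shows how much the street pricing shifts the value picture.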
 
It isn't undeveloped tech, it is a chicken and egg problem. Games won't support anything until hardware does and hardware won't support anything until games do. Someone has to go first and buyers decide whether it sticks. Given that everything is out of stock, this will stick.

I’m going to agree to disagree. Time will tell who’s right.
 
I give Nvidia props for innovation; hopefully ray tracing is here to stay and will one day give us good performance. But by the looks of it, it's not yet attainable without taking a huge performance hit (60-70%? I'm not actually sure of this, just ballparking from the rumors). Not to mention what the power draw will be when the whole chip is active.

I'm not buying one, and I think folks are gonna regret the 20XX series, right at the beginning of the move to 7nm... wait for the refresh, and let the market adopt ray tracing before buying into it.

The price bump is BS, but what do you expect from them... Also not big on closed standards; they are king of that as well.

I'm glad they have pushed the performance levels for traditional rasterization, though; that's always good. But we're at the edge of a huge jump in efficiency. Wait.
 
To be clear, I am saying it is stupid to buy a 1080 Ti right now (though it was a great purchase 18 months ago). This is because of the potential performance improvement from DLSS if you run AA in any of the games you play.

The games slated for DLSS are here.

The question is when, not if. Buying a 1080Ti will get you a few months of fun and then regret when DLSS is supported in one of the games that you play.

Of course, this is assuming DLSS works as advertised, which is why I say "buy a 2080 and hope".

I don't buy based on "maybes"; I buy based on what I am getting at the moment of purchase. If I hadn't bought my 1080 Ti right before the RTX cards were announced, I would still have bought it after. DLSS seems like it could be incredible tech, but it is really going to depend on how heavily Nvidia pushes support and how many developers jump on board. Even if every game from 2019 on supports it (which they won't), I'm not going to regret my purchase. Even with DLSS, a 2080 is not worth over $200 more than I paid for my 1080 Ti. I tend to upgrade every 8-15 months anyway, so if 2080 Tis drop to $800 sometime next year I'll probably jump on one.

The price delta between a 2080 and a new 1080 Ti isn’t hundreds of dollars though... it’s about $100:

https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

Now the question turns into “Do I want to spend an extra $100 for the potentially huge upsides RTX tech will bring?” For a lot of people, that answer will be yes.

The 2080 will likely not be great at ray tracing, if the tech demos are any indication, so that really doesn't bring anything to the table. That leaves DLSS, which is something people should consider when looking at it. However, the 2080 has less (though much faster) VRAM, which could bottleneck it in VRAM-heavy games. Personally, I think telling people to buy hardware based on "maybe" and "potentially" is a fool's errand. There is no such thing as future-proofing when it comes to PC gaming, so the best option is to buy based on what hardware gives you now instead of what it might provide in the future. By the time ray tracing is more than a buzzword marketing gimmick, these cards will be too weak to take advantage of it.
 
I'm just thinking that if the price DOES go down to $999 or lower in 3-4 months, as the MSRP for the normal 2080 Ti, then we'll have a bunch of people saying it's a great deal. For years on [H] I've seen people moan and groan about stuff for what seems like forever; then, if the price and tech fall in line with what they expect or want, all of a sudden it's alright and great. I'm on a wait-and-see attitude myself. I can afford the card easily enough, but I'll wait a while for the [H] review before I seriously consider it.
 
Pretty uneventful. Essentially the 2080 replaces the 1080ti at the same price, and the 2080ti takes the crown but at $300 more than the last ti cost.

I guess the days of direct market replacements are gone, i.e. similar price as last gen but more performance/features.
 
Titan pricing at Ti performance levels. I'd hate to see the price of a Titan version of one of these cards; 2 grand minimum?

Had they not gone monkey-shit-fight on the pricing, it would have been better received; unfortunately, this seems to be an experiment by Nvidia to see how much green dick people are willing to swallow before the gag reflex kicks in. Unsurprisingly, Asus is leading the pack in the overpricing stakes, with their Strix card being a "snip" at £1500 ($1970) in the UK.

Haha Green dick, made me laugh :):)
 
So, I wanted to see a value proposition for the cards. I used the PC Per reviews and created a spreadsheet to calculate the price per frame: I took the MSRP and the street price and divided each by their average frame rate. My Excel-fu isn't that good, so here's just the raw data, sorted by game, then MSRP. So what is the value of the new cards? With the exception of Wolfenstein 2, not that good.

View attachment 105230

Yea, but a model such as "$ per frame" assumes a linear pricing structure. You know, double the framerate, double the price. That has literally NEVER been the case.

High end cards have always cost more than what would be projected linearly based on the price of mid range cards. Each additional frame costs more than the one before it.
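That superlinearity is easy to see if you compute the marginal cost of each extra frame between adjacent tiers; a toy sketch with invented prices and frame rates, just to show the shape of the curve:

```python
# Marginal cost of each additional frame between adjacent card tiers.
# Tier numbers are invented purely for illustration.
tiers = [(250, 60.0), (500, 90.0), (1200, 110.0)]  # (price USD, avg FPS)

for (lo_price, lo_fps), (hi_price, hi_fps) in zip(tiers, tiers[1:]):
    marginal = (hi_price - lo_price) / (hi_fps - lo_fps)
    print(f"{lo_fps:.0f} -> {hi_fps:.0f} fps: ${marginal:.2f} per extra frame")
```

Even with made-up numbers, the jump to the halo tier costs several times more per frame, which is the point: a flat $-per-frame metric will always make the top card look bad.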

I agree these prices are high, but we've had this conversation every single time Nvidia has launched a new generation of cards in the last 10 years.

Yes, prices are high. No, it's nothing new. It depends completely on the level of competition.

Why would Nvidia drop the price on something that performs like a 1080ti, if AMD still can't put forth anything that can perform like the 1080ti?

AMD had pretty high prices back when Nvidia screwed the pooch with the FX GPU's too.

Once the competition returns, either from AMD, or from Intel, pricing will become more sane.
 
I think most of us are better off spending this kind of money on a better monitor. I have my eye on something that finally meets my criteria: 1440p, 32 inch, IPS, at least a solid 120Hz, G-Sync. I think this would be a good match for anything from a 1070 to a 1080 Ti.
 
Some nice analysis here:



Seems the summary at 1080p was:
2080Ti vs 1080Ti = 10.8% gain.
2080 vs 1080Ti = 4.4% gain.
2080 vs 1080 = 24.2% gain.

Results are based on analyzing a bunch of review results.
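For clarity, those percentages are just ratios of the averaged frame rates across reviews; a quick sketch (the FPS inputs here are made up, not the reviewer's data):

```python
def pct_gain(new_fps: float, old_fps: float) -> float:
    """Relative performance gain of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# e.g. a card averaging 124.2 fps against a 100 fps baseline:
print(f"{pct_gain(124.2, 100.0):.1f}% gain")
```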

Something else mentioned was that a lot of reviewers were sent "Founders Edition" cards that run at 1635MHz instead of the stock 1545MHz in the case of the Ti, and at 1800MHz vs 1710MHz for the other model, so something to keep in mind when shopping.

There's some indication that there's a CPU bottleneck coming into play at 1080p. At 4k the improvement goes up to 31.6 percent, a little more respectable.

One interesting tidbit mentioned was that the 2080 Ti is doing much better on AMD-optimized titles now; something interesting might be happening in terms of architecture there.
 
I think most of us are better off spending this kind of money on a better monitor.

I'm waiting on an HDMI 2.1 TV as a new secondary monitor for that sweet 4K 120Hz. Well, more likely I'll wait for them to go on sale at the end of 2019, to be exact. I've still got the TN panel for when I need it for twitch games.
 
You clearly have more money than sense. And by that I mean way more money than sense...

The fuck are you talking about? The brand-new $650 1080 Ti I bought last month has effectively the same performance as a card that costs $150 more, and somehow that means I have more money than sense?
 
The fuck are you talking about? The brand-new $650 1080 Ti I bought last month has effectively the same performance as a card that costs $150 more, and somehow that means I have more money than sense?

I wonder if he thought you were talking about/purchasing a 2080-series card and not the older 1080 Ti? Or if he missed it in your post?
 
The fuck are you talking about? The brand-new $650 1080 Ti I bought last month has effectively the same performance as a card that costs $150 more, and somehow that means I have more money than sense?
Yeah, my bad; I thought you were referring to buying a 2080 Ti.
 
Maybe now I will stop getting lowballed on the 1080 Ti I have for sale.
 
Hopefully this is more about searching being frustrating on a mobile phone than laziness, but...

Has anyone come across any apples-to-apples benchmarks in these reviews that have 2080 Ti numbers using both Coffee Lake and Ryzen CPUs?

From lazy to narcissistic, quoting myself, but I am surprised that I still haven't been able to find this.

Almost exclusively reviews on Z370 boards with i7 8700Ks... one 8086K and the odd X299 platform. No AM4 love? Guessing it's going to be a case of waiting on gaming benchmarks for the i7 9700K and/or i9 9900K, where comparisons with the 2700X are most likely to pop up with a 2080 Ti in play?
 
Well, the reviews gave me one unfortunate conclusion... 1440p 144Hz is here for another gen.

Not impressed with some of the benchmarks even at 4K, e.g. MHW.

Also, currently at 1080p, I might actually pull the trigger on a 2080 Ti...
 
From lazy to narcissistic, quoting myself, but I am surprised that I still haven't been able to find this.

Almost exclusively reviews on Z370 boards with i7 8700Ks... one 8086K and the odd X299 platform. No AM4 love? Guessing it's going to be a case of waiting on gaming benchmarks for the i7 9700K and/or i9 9900K, where comparisons with the 2700X are most likely to pop up with a 2080 Ti in play?

If I remember right, the Linus Tech Tips review used both Intel and AMD. Jay from JayzTwoCents recently built an AMD rig for the purpose of testing these cards, but he didn't have time to bench both Intel and AMD before the embargo was up.
 
Pretty uneventful. Essentially the 2080 replaces the 1080ti at the same price, and the 2080ti takes the crown but at $300 more than the last ti cost.

I guess the days of direct market replacements are gone, i.e. similar price as last gen but more performance/features.

Yea, but a model such as "$ per frame" assumes a linear pricing structure. You know, double the framerate, double the price. That has literally NEVER been the case.

High end cards have always cost more than what would be projected linearly based on the price of mid range cards. Each additional frame costs more than the one before it.

I agree these prices are high, but we've had this conversation every single time Nvidia has launched a new generation of cards in the last 10 years.

Yes, prices are high. No, it's nothing new. It depends completely on the level of competition.

Why would Nvidia drop the price on something that performs like a 1080ti, if AMD still can't put forth anything that can perform like the 1080ti?

AMD had pretty high prices back when Nvidia screwed the pooch with the FX GPU's too.

Once the competition returns, either from AMD, or from Intel, pricing will become more sane.
So Kardonxt's post is really the crux of the issue for me. Certainly, there have been halo parts that cost more per unit of performance than the lower tiers, but we are used to improvement over the previous generation significant enough to make it worth considering. All I am doing is putting perspective on where we are: how much extra value over the previous generation. And that is what we are not seeing with this generation of Nvidia cards; they may perform "better" than the previous generation, but at the significantly increased price they work out to a worse value. Especially when you consider the nearly identical performance of the 2080 to the 1080 Ti, but at a higher price!
 
How many different ways can a thousand people say the same thing? <unsubscribed>
 
Although the 3GB loss is a bummer, this is once again an example of confusing how much VRAM is used with how much is needed.

"Kudos to the AMD Radeon R9 Fury X for not breaking or seeming to run out of VRAM with no long pauses. Surprisingly there weren't long pauses like one would expect when running out of VRAM capacity. It must be that the dynamic VRAM is able to be leveraged and keep the video card from stalling out."

So no, 8GB will NOT be a problem, at least in this game, so long as you have at least 16GB of system RAM.

While I agree that it was nice that the Fury X failed gracefully, system RAM is not a substitute for GPU VRAM. Games like Rise/Shadow of the Tomb Raider, Rainbow Six: Siege, Shadow of Mordor, FFXV and Wolfenstein II will push the 8GB frame buffer on the 2080 to the limit, and that is without considering modifications such as high-res texture packs. FFXV, as an example, will utilize 8.5GB upon startup with its optional textures.

For 1440p gaming, 8GB is sufficient for most gamers now. Even 4K with some reduced settings is OK. But it's not "NOT a problem." People need to be aware of the limitation so that they understand why their new RTX 2080 is suddenly performing half as fast as a 1080 Ti, and how to avoid the situation.

http://images.hardwarecanucks.com/image//skymtl/GPU/RTX2080-REVIEW/RTX2080-REVIEW-54.jpg
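The failure mode is simple to reason about: whatever part of the working set doesn't fit in the frame buffer spills over PCIe into system RAM. A back-of-envelope sketch (the 8.5GB figure is the FFXV number above; everything else is illustrative):

```python
# Does a game's VRAM working set fit in the card's frame buffer?
# Anything that doesn't fit spills into system RAM over PCIe, which is
# far slower than on-card memory and shows up as stutter or a large fps drop.
def vram_spill_gb(working_set_gb: float, vram_gb: float) -> float:
    """GB of the working set that overflows into system RAM (0 if it fits)."""
    return max(0.0, working_set_gb - vram_gb)

# RTX 2080 (8 GB) running FFXV with the optional high-res textures (~8.5 GB):
print(f"{vram_spill_gb(8.5, 8.0):.1f} GB spills to system RAM")
```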
 