ASUS ROG STRIX RTX 2080 Ti and 2080 4K Preview @ [H]

FrgMstr

ASUS ROG STRIX RTX 2080 Ti and 2080 4K Preview

We look at real-world gaming performance on an ASUS ROG STRIX RTX 2080 Ti and ASUS ROG STRIX RTX 2080 custom retail factory overclocked video cards at 4K resolution. In this preview we will compare performance in five games at 4K with a GeForce GTX 1080 Ti FE and GeForce GTX 1080 FE.

If you like our content, please support HardOCP on Patreon.
 
More or less confirms that picking up a 2080 will be a good upgrade for my 1070 once a card I want is readily available at a price I'm willing to pay. I think the projected AIB prices seem reasonable for the 2080. High, definitely, but reasonable for my personal upgrade scenario. Good preview. I know the RTX features may not be where they should be for general use (probably not until the next round of cards, even), but they'll be there for dabbling in the meantime, a little icing on the cake.
 
Thanks, guys, for the awesome previews. This Strix Ti is on my radar. Just trying to beat my common $en$e into submission for it, but I do like those FPS. I also like the custom PCB they use with two HDMI ports. I could use one for the TV and the other for the audio receiver. A lot of fun watching these strut their stuff in 4K.

Just curious, did you ever experience the VRAM leak with MEA in 4K with max settings? I've encountered it with my 1080s in SLI at the same settings. Once it hits that 8GB ceiling, it 'spills over'. Kind of funny watching my 32GB fill up and then the game crashing with an out-of-memory error. I always thought an 11GB card would hold it. I know it's one of those things that doesn't affect all cards, but it is a known thing, so I was wondering how the RTXs and drivers held up with that. Will your final reviews show 4K VRAM usage for these games?
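For anyone who wants to watch for the leak themselves, here's a rough sketch that logs per-GPU VRAM use via nvidia-smi while you play (assumes the NVIDIA driver's nvidia-smi tool is on PATH; the file name and polling interval are just placeholders):

```python
import csv
import subprocess
import time

def parse_smi_memory(output: str):
    """Parse `nvidia-smi --query-gpu=index,memory.used,memory.total
    --format=csv,noheader,nounits` output into (index, used_MiB, total_MiB)."""
    rows = []
    for line in output.strip().splitlines():
        idx, used, total = (field.strip() for field in line.split(","))
        rows.append((int(idx), int(used), int(total)))
    return rows

def log_vram(out_path="vram_log.csv", interval_s=1.0, samples=600):
    """Poll nvidia-smi periodically and append VRAM usage rows to a CSV."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "gpu_index", "used_mib", "total_mib"])
        for _ in range(samples):
            out = subprocess.check_output(
                ["nvidia-smi",
                 "--query-gpu=index,memory.used,memory.total",
                 "--format=csv,noheader,nounits"],
                text=True,
            )
            now = time.time()
            for idx, used, total in parse_smi_memory(out):
                writer.writerow([now, idx, used, total])
            time.sleep(interval_s)
```

If usage keeps climbing toward the 8GB (or 11GB) ceiling without ever dropping, that's the leak.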
 
Thanks for keeping us updated guys! Can you guys confirm something I saw elsewhere... the Fan connect headers on the card for attaching supplemental fans that are controlled by the GPU... do both cards have them on your samples?
 
Thanks for keeping us updated guys! Can you guys confirm something I saw elsewhere... the Fan connect headers on the card for attaching supplemental fans that are controlled by the GPU... do both cards have them on your samples?
Yes, both have two fan headers on them and also have an AURA RGB controller plug-in as well.
 
Thanks Kyle! Another review stated they were missing from the 2080 Ti, apparently a "technical issue"... Just wanted to see if it was remedied on your samples! Looking forward to the full review and a non-NDA-tainted opinion! Keep er up!
Well, hell, let's have Brent_Justice verify this then, he has the ASUS card in his hands. I had the 2080 here and I KNOW it had those, but I did not verify that myself on the 2080 Ti. Sorry, I was making an assumption that the more expensive card had those.
 
So when the graphics are mainly GPU dependent, the 2080 Ti is a ~50 percent performance gain for double the price of a 1080 Ti, and that's at 4K... damn, Nvidia is smart. They know people will buy the hell out of them now, but it's such a fine line when it comes to it being overpriced.

After seeing these framerates at these graphical settings and looking at this as objectively as possible, it's like, "damn... should I actually buy this shit even though the pricing is ridiculous... but it's AT LEAST 1.5x faster than a 1080 Ti and damn near twice as fast in some cases... SHIT".

*shakes fist @ Nvidia* DAMN YOU, DAMN YOU ALL TO HELL!
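For what it's worth, the napkin math on perf per dollar, using my own ballpark numbers above (the prices and the 1.5x speedup here are illustrative assumptions, not measurements):

```python
# Rough perf-per-dollar comparison: a 2080 Ti at roughly 1.5x the
# performance of a 1080 Ti for roughly double the price.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

old = perf_per_dollar(1.0, 700.0)    # hypothetical 1080 Ti: baseline perf, $700
new = perf_per_dollar(1.5, 1200.0)   # hypothetical 2080 Ti: 1.5x perf, $1200
print(f"2080 Ti delivers {new / old:.0%} of the 1080 Ti's perf per dollar")
```

So even granting the speedup, you're paying more per frame; you're buying the absolute performance ceiling, not value.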
 
Let's see the 2080 drop to $599 on Black Friday, and I may finally have a replacement for my 980 Ti.
 
Thanks, was really looking forward to your 4K preview. It reaffirms my thinking that I really should just find the damn money for the 2080 Ti since I just got a 4K monitor. It will be a massive upgrade over the 980 vanilla obviously, but my son will be more than happy with that old card. :rolleyes:
 
It appears that if you have 4K, then a 2080 Ti is the schizzle.
It's impressive that a single card can finally be beefy enough to get crazy frame rates at the highest settings.

The only 'but' in all of this is what happens when you can enable all the fancy shit that isn't available yet.

I'm still thinking that sitting this one out is the proper thing to do; unfortunately, a 2180 Ti (or whatever it will be called) will probably cost $2000 by the time it gets released.
 
Nice preview! Still waiting on reviews after Windows 10 supports RTX and DLSS; I want to know how much of a boost DLSS gives and how the image quality compares.

Found a small typo on the Far Cry 5 page: "The new ASUS ROG STRIX RTX 2080 is technically faster than the GeForce GTX 1080 FE by 7%." should be "The new ASUS ROG STRIX RTX 2080 is technically faster than the GeForce GTX 1080 Ti FE by 7%."
 
It appears that if you have 4K, then a 2080 Ti is the schizzle.
It's impressive that a single card can finally be beefy enough to get crazy frame rates at the highest settings.

The only 'but' in all of this is what happens when you can enable all the fancy shit that isn't available yet.

I'm still thinking that sitting this one out is the proper thing to do; unfortunately, a 2180 Ti (or whatever it will be called) will probably cost $2000 by the time it gets released.

As fast as the 2080 Ti is, there was still one game we could not max out here, and it's been out for 8 months now... so I do wonder about those games to be released and how demanding they will be at full settings at 4K.
 
Thanks, guys, for the awesome previews. This Strix Ti is on my radar. Just trying to beat my common $en$e into submission for it, but I do like those FPS. I also like the custom PCB they use with two HDMI ports. I could use one for the TV and the other for the audio receiver. A lot of fun watching these strut their stuff in 4K.

<SNIP>

If I am reading this correctly, your goal is to use the TV for video and the receiver for audio. The only problem is Windows will see this as a dual display, and the video card will output both audio and video to both sources. Last time I tried something like this, it did not work well. This may help you: LINK

Did some more digging; it looks like only the ROG Strix RTX 2080 OC has dual HDMI. The Ti versions do not.

Good luck,
Paul
 
Well, the insanity won. Just pre-ordered. In a matter of weeks I'll officially stop using SLI.

If I am reading this correctly, your goal is to use the TV for video and the receiver for audio. The only problem is Windows will see this as a dual display, and the video card will output both audio and video to both sources. Last time I tried something like this, it did not work well. This may help you: LINK

Did some more digging; it looks like only the ROG Strix RTX 2080 OC has dual HDMI. The Ti versions do not.

Good luck,
Paul

I'll let you know about the ports when it arrives. It seems there have been some inconsistencies regarding some physical specs of the Strix 2080 Tis. It's true about Windows seeing the receiver as a display. Been doing it with my 1080 SLI rig for a while but haven't noticed any performance issues.
 
As fast as the 2080 Ti is, there was still one game we could not max out here, and it's been out for 8 months now... so I do wonder about those games to be released and how demanding they will be at full settings at 4K.

Yep, once again CryEngine sets the bar high. Still looking forward to restarting KCD even if it's still not a full 60, and here's hoping they don't break more game saves with an update again ;)
 
The newest 4K screens do 120-144Hz, right? So you'll probably need two 2080 Tis to get around 100 FPS in 4K.

Might as well just go with two 1080 Tis. Cheaper.
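The two-card math only works if SLI actually scales, which it rarely does perfectly. A back-of-envelope sketch (the 70% scaling factor is an illustrative assumption; real scaling is game-dependent and sometimes zero):

```python
# Estimate two-card FPS from single-card FPS under multi-GPU scaling.
def sli_fps(single_card_fps: float, scaling: float = 0.7) -> float:
    """`scaling` is the fraction of the second card's throughput that
    actually materializes (1.0 = perfect scaling, 0.0 = no benefit)."""
    return single_card_fps * (1.0 + scaling)

# e.g. a single card averaging 60 FPS at 4K with ~70% SLI scaling:
print(sli_fps(60.0))        # roughly 100 FPS territory
print(sli_fps(60.0, 0.0))   # a game with no SLI profile: still 60 FPS
```

Which is why the cheaper dual-1080 Ti route is a gamble: in games that don't scale, you're back to single-card performance.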
 
Thanks, guys, for the awesome previews. This Strix Ti is on my radar. Just trying to beat my common $en$e into submission for it, but I do like those FPS. I also like the custom PCB they use with two HDMI ports. I could use one for the TV and the other for the audio receiver. A lot of fun watching these strut their stuff in 4K.

Does your receiver not have 4K passthrough? You can use the ARC input, which will let you pass the audio back to the receiver via TOSLINK if you need to go direct to the panel for 4K.
 
Last time around, when the 1080 and 1080 Ti launched, there was a lot of talk that Nvidia was cherry-picking the best dies for their own Founders Edition cards.

Has anyone heard anything similar this time around?

Like usual, it seems most launch board partners are using OEM board designs, at least for their initial products.

I'd be interested in getting one and sticking a water block on it, but I wonder if I am likely to hit better overclocks if I get a Founders Edition than if I get a board partner version.

There are several board partner boards that can currently be preordered. It's probably too soon to know, but I wonder if I am better off waiting until I can order a Founders Edition....
 
Way too soon to know... I've heard EVGA will come out with a blower version later on. That way you won't void your warranty removing the cooler to install the block.

Anyways... have ya seen the video on removing that blower cooler? You need a heat gun to loosen the glue under the center plate just to get to the center screws. Like, no thanks!

Usually the difference in overclocking these is only a small margin, 2-3 FPS.

I'd just get something other than the Founders Edition to avoid all that if you're going water cooling. Make sure the company won't void the warranty if you remove the cooler... and with EVGA you must keep the original cooler and put it back on if you need to RMA the card.
 
Way too soon to know... I've heard EVGA will come out with a blower version later on. That way you won't void your warranty removing the cooler to install the block.

Anyways... have ya seen the video on removing that blower cooler? You need a heat gun to loosen the glue under the center plate just to get to the center screws. Like, no thanks!

Usually the difference in overclocking these is only a small margin, 2-3 FPS.

I'd just get something other than the Founders Edition to avoid all that if you're going water cooling. Make sure the company won't void the warranty if you remove the cooler... and with EVGA you must keep the original cooler and put it back on if you need to RMA the card.

I hadn't seen reports of the FE blower being difficult to remove. I will have to check that out, thanks.

That said, I don't think they can legally void warranties for that. I mean, sure, if you break something, yes, but not just for removing the cooler. Not that they won't try, and when they do, what are you going to do? Sue them over it?

That, and it sounds like it might be easy to break something removing this cooler.
 
Way too soon to know... I've heard EVGA will come out with a blower version later on. That way you won't void your warranty removing the cooler to install the block.

Anyways... have ya seen the video on removing that blower cooler? You need a heat gun to loosen the glue under the center plate just to get to the center screws. Like, no thanks!

Usually the difference in overclocking these is only a small margin, 2-3 FPS.

I'd just get something other than the Founders Edition to avoid all that if you're going water cooling. Make sure the company won't void the warranty if you remove the cooler... and with EVGA you must keep the original cooler and put it back on if you need to RMA the card.


I just watched this (without sound, mind you) and it looks to me like he got the entire cooler off without having to loosen any glue. So, unless he says something about something he did off camera, it doesn't look like this 2080 Ti FE cooler is any more difficult to remove than in the past. Maybe a couple more screws, but there were always a lot of screws.

 
Need more testing under Vulkan; something is really giving these new cards a boost.
 
As fast as the 2080 Ti is, there was still one game we could not max out here, and it's been out for 8 months now... so I do wonder about those games to be released and how demanding they will be at full settings at 4K.

There's where DLSS comes in.
 
As fast as the 2080 Ti is, there was still one game we could not max out here, and it's been out for 8 months now... so I do wonder about those games to be released and how demanding they will be at full settings at 4K.

What else is anybody supposed to buy if they want maximum FPS with settings maxed out? Price is high, but performance is best available. Piss on the price all you want.
 
What else is anybody supposed to buy if they want maximum FPS with settings maxed out? Price is high, but performance is best available.

I think this is why it's more difficult to slam the 2080 Ti over pricing compared to the regular 2080. The 2080 Ti does provide a new level of performance over existing parts even if the price/performance ratio isn't good, which is why I don't think Nvidia will have a problem selling as many 2080 Tis as it can make for a while; it's a huge chip and can't be easy to make. While the 2080 might sell more units, I don't think the supply will be as tight.
 
I hadn't seen reports of the FE blower being difficult to remove. I will have to check that out, thanks.

That said, I don't think they can legally void warranties for that. I mean, sure, if you break something, yes, but not just for removing the cooler. Not that they won't try, and when they do, what are you going to do? Sue them over it?

That, and it sounds like it might be easy to break something removing this cooler.
Breaking mine down soon.
 
Great testing. Would appreciate it if you could add some more games in the full review, especially Assassin's Creed Odyssey or Origins.
 
What idle temperatures are normal on a Ti? My 1080 (Gigabyte) was using ~16W and idled with the fans off at 39C. The EVGA 2080 Ti XC Ultra I picked up idles at 55C and ~32W. Using two monitors: a 1440p 165 Hz and a 4K 60 Hz one.
 
There's where DLSS comes in.

Honestly, that can be tested now just by turning off AA and looking at performance, since DLSS just moves AA work from the CUDA cores onto the Tensor Cores; in terms of performance, it's practically the same thing as running a game with AA disabled.

Maybe I will try it, and maybe we will see.
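If anyone does try it, here's a rough sketch for comparing the two runs from frame-time captures (the sample values are made up; the assumption is you have one frame time in milliseconds per frame, e.g. from a frame-time capture tool):

```python
# Compare average FPS between two runs, e.g. AA on vs. AA off,
# from lists of per-frame times in milliseconds.
def average_fps(frame_times_ms):
    total_ms = sum(frame_times_ms)
    return len(frame_times_ms) * 1000.0 / total_ms

aa_on  = [16.7, 17.1, 16.9, 18.0]   # illustrative samples, ms per frame
aa_off = [13.9, 14.2, 14.0, 14.5]
print(f"AA on:  {average_fps(aa_on):.1f} FPS")
print(f"AA off: {average_fps(aa_off):.1f} FPS")
```

Averaging frame times and then inverting (rather than averaging per-frame FPS values) keeps slow frames weighted correctly.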
 
Make the CUDA cores out of these magic circuits that work faster than the speed of light, too, then?
 