ASUS ROG STRIX RTX 2080 Ti and 2080 4K Preview @ [H]

Honestly, that can be tested now just by turning off AA and looking at performance. Since DLSS just moves the AA work off the CUDA cores onto the Tensor Cores, it's practically the same thing as running a game with AA disabled, in terms of performance.
You should read Nvidia's paper, check page 40 of this document: https://www.nvidia.com/content/dam/...ure/NVIDIA-Turing-Architecture-Whitepaper.pdf

From my understanding, DLSS renders only half the samples in an image, and then uses DL to fill in the missing unrendered pixels. You can think of it like PS4 Pro checkerboarding just with a smarter algorithm.

This would not be the same as turning off AA, as with DLSS you are only rendering half the pixels, so performance should be improved (minus the cost of the DLSS pass, but that should be less than the brute-force approach).

You can also check this article, which gives a good overview of the technique with performance and quality comparisons: https://www.techspot.com/article/1712-nvidia-dlss/
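The "render half the samples, let something smarter fill in the rest" idea can be caricatured in a few lines. This is just my toy illustration of checkerboard-style reconstruction, not NVIDIA's actual algorithm: a dumb neighbour average stands in for the trained network.

```python
def checkerboard_reconstruct(image):
    """Toy sketch: shade only the checkerboard half of a 2D image,
    then fill each missing pixel from its shaded 4-neighbours."""
    h, w = len(image), len(image[0])
    rendered = lambda y, x: (y + x) % 2 == 0
    # Keep the rendered half; mark the other half as missing.
    filled = [[image[y][x] if rendered(y, x) else None for x in range(w)]
              for y in range(h)]
    for y in range(h):
        for x in range(w):
            if filled[y][x] is None:
                # Every in-bounds 4-neighbour of a missing pixel lies on
                # the rendered half, so these values were actually shaded.
                nbs = [image[ny][nx]
                       for ny, nx in ((y - 1, x), (y + 1, x),
                                      (y, x - 1), (y, x + 1))
                       if 0 <= ny < h and 0 <= nx < w]
                filled[y][x] = sum(nbs) / len(nbs)
    return filled
```

On a flat or smoothly varying image the dumb average is already exact; the hard part, and the part the neural network is there for, is edges and fine detail.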
 

Thanks, I will check it out. From what I understood, DLSS uses a model built from machine learning to fill in pre-computed forms of imaging or anti-aliasing. Basically, instead of the developer spending time figuring out the best AA algorithm, it moves that burden to trained machine-learning models, saving the developer time while also using the most efficient method for performance, because it's all done on the Tensor Cores. (I'm probably explaining it poorly.)

I do need to read into this more and study precisely how it works; there seems to be a lot of either misunderstanding or miscommunication about what exactly is going on. I will read into it much more, so that when it finally comes to a game we can test it properly.
 
Thank you for all your work, guys. I think if anything I'll go with a GTX 1080 Ti for now, or save up a bit more and go with the 2080. I just hope Nvidia fixes some driver issues I have read about.

Thank you Brent and Kyle for your work. I look forward to your driver performance tests and future reviews, as [H] is the only site I trust: fair and honest, using hardware we all use.
 
Here's what I've settled on for fan speed at the moment:
(fan curve screenshot)

It turns the fan over the GPU off and leaves the VRM fan running quietly (that fan sits directly over the motherboard chipset, which was running 5C hotter than when the 1080 used to sit over top of it).
 
Would I see enough real world benefit from the 2080ti over a 2ghz+ 1080ti? This seems like it might be a good generation to skip out on from what I'm seeing so far.
 
Way too soon to know. I've heard EVGA will come out with a blower version later on; that way you won't void your warranty removing the cooler to install the block.

Anyway, have you seen the video on removing that blower cooler? You need a heat gun to soften the glue under the center plate just to get to the center screws. No thanks!

Usually the difference in overclocking these cards is only a small margin, 2-3 fps.

I'd just get something other than the Founders Edition to avoid all that if you're going water cooling. Make sure the company won't void the warranty if you remove the cooler; with EVGA you must keep the original cooler and put it back on if you need to RMA the card.

So it looks to me like the glue problem arises if you want to take the vapor chamber cooler/fans apart.

If you just want to take the cooler off the PCB for a water block, it looks pretty much the same as it has been in the past.
 

Generally speaking, they have a neural network learn what an image looks like under the absolute highest quality (64x AA, same as movie quality), then have the Tensor Cores perform real-time inference to reconstruct the image to look like the ground truth render.

It can be used to render at a lower-than-native resolution and upscale to native res to save performance (DLSS), or as a superior form of AA at native res (DLSS 2X).
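The train-against-ground-truth idea can be sketched in miniature. Here the whole "network" is a single fitted line and the sample values are made up; real DLSS trains a deep network on 64x-supersampled frames, but the shape of the process, fit a mapping to the ground truth offline and then apply it cheaply at runtime, is the same.

```python
def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b, our stand-in 'network'."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# "Training": cheap render values vs. made-up supersampled targets
aliased      = [0.0, 0.2, 0.4, 0.6, 0.8]
ground_truth = [0.1, 0.26, 0.42, 0.58, 0.74]
a, b = fit_linear(aliased, ground_truth)

# "Inference": apply the learned mapping to new cheap render values
reconstructed = [a * v + b for v in [0.3, 0.5]]
```

The training happens once, offline, on NVIDIA's side; only the cheap inference step runs per frame on the Tensor Cores.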
 
Would I see enough real world benefit from the 2080ti over a 2ghz+ 1080ti? This seems like it might be a good generation to skip out on from what I'm seeing so far.

I think when the guys get the full review out it will help clarify.

My two cents worth on it:

1440p - Probably not. Even the most demanding current games like KCD or SOTR only bring a 2GHz+ 1080 Ti down to 50-60fps. Just about every other game still plays at 70-120fps maxed. Pair that Ti with a G-Sync display and it's a near perfect match.

4K - It depends. With less demanding games, or significant settings compromises, that card can do 4K/60fps, but at that point it's mainly just throwing pixels. A couple of quotes from Brent:

"Hands down, the ASUS ROG STRIX RTX 2080 Ti offers the best 4K gaming experience now. This video card proves its performance potential better at 4K than it does at 1440p."

"That said, it did not allow this in Kingdom Come: Deliverance, proving that we actually need more GPU performance possibly in some games at 4K even today. As fast as it is, it isn’t quite enough in a game that’s been out now for 8 months. Therefore, we can’t say it will allow highest settings in every game at 4K, most, but not all."

I'd only add, for perspective, that we are still getting games that can tax mid-range cards at 1080p, and it's been over a decade for that now. Similarly, as more bells and whistles are added to visuals, we'll see the same phenomenon with 4K and beyond, so waiting for the 'true 4K card' is relative to what features the games will be employing.
 
Got a question: anyone know why ASUS put a triple fan on the 2080, but just a dual (or single) fan on the 2080 Ti? Would have thought they'd put the triple on the big boy.

https://rog.asus.com/articles/gaming-graphics-cards/introducing-geforce-rtx-2080-ti-and-rtx-2080-graphics-cards-from-rog-and-asus/?utm_source=asus-edm&utm_medium=email&utm_campaign=4582-20181003-RTX Graphics Card-- EDM
Both the cards used here have triple-fan coolers. I think maybe you are cross-comparing different series of cards. There are three series out from ASUS now: ROG Strix, ASUS Dual, and ASUS Turbo - triple fan, dual fan, and blower.
 
I was using the page I posted as source:


The Dual GeForce RTX 2080 Ti has a dual fan.

The Turbo GeForce 2080 Ti has a single blower.

The Strix GeForce 2080 OC has three fans.

The Dual GeForce 2080 OC has a dual fan.

The 2080 Tis listed have only two fans; the only 3-fan card is a 2080. Unless there are more out from ASUS that aren't listed on this page.
 

There is a Strix (3 fan) 2080 Ti. It's covered in the article here. Dunno why Asus didn't list it on that page but it definitely exists.
 

Digital Foundry has a pretty good video up on their Youtube Channel.
 
Dunno why Asus didn't list it on that page but it definitely exists.
Reading is fundamental...

Select models available for pre-order today
Our first GeForce RTX 2080 Ti and RTX 2080 graphics cards will start shipping in mid-September, and you can reserve yours now in select regions. Four models are available for pre-order from the North American retailers listed in the table above. The ROG Strix GeForce RTX 2080 OC Edition is priced at $869.99 USD and $1,149 CAD, while the ASUS Dual GeForce RTX 2080 OC Edition rings in at $839.99 USD and $1,099 CAD. Both are factory overclocked, as is the Dual GeForce RTX 2080 Ti OC Edition at $1,239.99 USD and $1,649 CAD. The Turbo GeForce RTX 2080 Ti runs at stock speeds for $1,209.99 USD and $1,599 CAD.
 
And, rtwfq, none of the Tis have 3 fans, just the 2080. My question was about why only one 2080 has 3 fans, and none of the 2080 Tis.
 
The link I posted did NOT have Strix 2080 Tis; I based my question on that page. evilpaul posted a link that does have a Strix 2080 Ti, so my question was answered. There was nothing unclear: the link I posted (from an email announcement from ASUS) had no Strix 2080 Tis, and I based my question on that page. I thought that was pretty clear.
 
From what I understood, DLSS uses a model built from machine learning to fill in pre-computed forms of imaging or anti-aliasing. Basically, instead of the developer spending time figuring out the best AA algorithm, it moves that burden to trained machine-learning models, saving the developer time while also using the most efficient method for performance, because it's all done on the Tensor Cores. (I'm probably explaining it poorly.)
Wrong.

DLSS is simply an AI upscaling algorithm, meaning it runs the game at 1440p and upscales it through AI to 4K, at a quality comparable to native 4K + TAA. That's how DLSS saves performance: by running the game at 1440p instead of 4K.

It also upscales 1080p to 1440p.
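The pixel arithmetic behind that saving is easy to check (the numbers below are just the standard resolutions, not measured DLSS costs):

```python
def pixels(w, h):
    """Total shaded pixels at a given resolution."""
    return w * h

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
qhd       = pixels(2560, 1440)   # 3,686,400 pixels
fhd       = pixels(1920, 1080)   # 2,073,600 pixels

# 1440p shades only 4/9 (~44%) of the pixels of native 4K,
# and 1080p shades 9/16 (~56%) of the pixels of 1440p.
saving_4k  = qhd / native_4k
saving_qhd = fhd / qhd
print(f"{saving_4k:.1%} of 4K, {saving_qhd:.1%} of 1440p")
```

Whatever the Tensor Core upscale costs per frame comes out of that headroom, which is why the net speed-up is large but not a full 2x.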
 
Any ETA on the 2080 Ti Strix yet?
The ASUS ROG STRIX RTX2080TI-O11G GAMING cards were in stock on Newegg this morning and again just about half an hour ago. They sold out again, but keep monitoring the website, as Newegg currently seems to be getting several deliveries throughout the day. Also, you might want to bookmark NowInStock.net for tracking when RTX 2080 Ti cards are in stock from a multitude of websites - all tracked on one page that auto-refreshes every minute ;-)
 
 

Attachments

  • c5v05y.png
    c5v05y.png
    325.5 KB · Views: 60
Back
Top