Discussion in 'nVidia Flavor' started by Kyle_Bennett, Sep 27, 2018.
Make the CUDA cores out of these magic circuits that work faster than the speed of light too, then?
You should read Nvidia's paper, check page 40 of this document: https://www.nvidia.com/content/dam/...ure/NVIDIA-Turing-Architecture-Whitepaper.pdf
From my understanding, DLSS renders only half the samples in an image, and then uses DL to fill in the missing unrendered pixels. You can think of it like PS4 Pro checkerboarding just with a smarter algorithm.
This would not be the same as turning off AA as with DLSS you are only rendering half the pixels, so performance should be improved (minus the cost of the DLSS, but that should be less than the brute force approach).
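To make the checkerboard comparison concrete, here's a toy sketch in plain numpy. This is NOT how DLSS actually works internally (DLSS uses a trained neural network); it just illustrates the basic idea of rendering half the pixels in a checkerboard pattern and reconstructing the rest, with naive neighbor-averaging standing in for the smart reconstruction step:

```python
import numpy as np

def checkerboard_fill(frame):
    """Toy checkerboard reconstruction: pretend only half the pixels
    (the 'even' checkerboard squares) were rendered, and fill each
    missing pixel with the average of its left/right neighbors.
    DLSS replaces this naive averaging with a neural network."""
    h, w = frame.shape
    out = frame.astype(float)
    # Mask of pixels that were *not* rendered (the 'odd' squares)
    yy, xx = np.indices((h, w))
    missing = (yy + xx) % 2 == 1
    # Horizontal neighbors (wrap-around at edges for simplicity)
    left = np.roll(out, 1, axis=1)
    right = np.roll(out, -1, axis=1)
    out[missing] = ((left + right) / 2)[missing]
    return out
```

Only half the pixels get shaded by the renderer, which is where the performance win comes from; the quality then depends entirely on how clever the fill-in step is.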
You can also check this article, which gives a good overview of the technique with performance and quality comparisons: https://www.techspot.com/article/1712-nvidia-dlss/
Thanks, I will check it out. From what I understood, DLSS uses a machine-learned model to fill in pre-computed forms of imaging or anti-aliasing. Basically, instead of the developer spending the time to figure out the best algorithm for AA, it moves the burden to the trained network, which does the work. That saves the developer time while also using the most efficient method for performance, because it's all done on the Tensor Cores. (I'm probably explaining it poorly.)
I do need to read into this more and study precisely how it works; there seems to be a lot of either misunderstanding or miscommunication about what exactly is going on. I will read into it much more so that when it finally comes to a game we can test it properly.
Thank you for all your work, guys. I think if anything I'll go with a GTX 1080 Ti for now, or save up a bit more and go with the 2080. I just hope Nvidia fixes some driver issues I have read about.
Thank you Brent and Kyle for your work. I look forward to your driver performance and future reviews, as [H] is the only one I trust: fair and honest reviews using hardware we all use.
Here's what I've settled on for fan speed at the moment:
Turns the fan over the GPU off and leaves the VRM fan running quietly (it sits directly over the motherboard chipset, which was running 5°C hotter than when the 1080 was mounted above it).
Would I see enough real-world benefit from the 2080 Ti over a 2GHz+ 1080 Ti? This seems like it might be a good generation to skip from what I'm seeing so far.
So, it looks to me like the glue problem arises if you want to take the vapor chamber cooler/fans apart.
If you just want to take the cooler off of the PCB for a water block, it looks pretty much the same as it has been in the past.
Generally speaking, they have a neural network learn what an image looks like under the absolute highest quality (64x AA, same as movie quality), then have the Tensor Cores perform real-time inference to reconstruct the image to look like the ground truth render.
It can be done to render at a lower-than-native resolution image and upscale to native res to save performance (DLSS) or use as a superior AA form at native res (DLSS 2X).
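The train-on-ground-truth, infer-in-real-time split described above can be sketched in miniature. Everything here is a hypothetical toy: numpy least-squares stands in for the neural network, and random arrays stand in for real (aliased, 64x-AA ground truth) frame pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of "learn what the ground truth looks like": fit a model
# that maps a flattened 3x3 patch of an aliased frame to the matching
# pixel of a high-quality ground-truth frame. For the demo we pretend
# the ground truth is produced by a simple box filter, then check that
# training recovers it from example pairs alone.
patches = rng.normal(size=(5000, 9))       # stand-in aliased 3x3 patches
true_filter = np.full(9, 1 / 9)            # pretend ground truth = box blur
targets = patches @ true_filter            # matching ground-truth pixels

# "Training": solve for the filter that best reproduces the ground truth
learned, *_ = np.linalg.lstsq(patches, targets, rcond=None)

# "Inference": apply the learned model to a patch it has never seen
new_patch = rng.normal(size=9)
reconstructed_pixel = new_patch @ learned
```

The point of the sketch is the workflow, not the model: the expensive 64x-AA rendering happens once, offline, to produce training targets; at runtime only the cheap learned mapping runs (on Tensor Cores, in DLSS's case).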
I think when the guys get the full review out it will help clarify.
My two cents worth on it:
1440p - Probably not. Even the most demanding current games like KCD or SOTR only bring a 2GHz+ 1080 Ti down to 50-60fps. Just about every other game still plays at 70-120fps maxed. Pair that Ti with a G-Sync display and it's a near-perfect match.
4K - It depends. With less demanding games or significant settings compromises that card can do 4K/60fps, but at that point it's mainly just throwing pixels. A couple of quotes from Brent:
"Hands down, the ASUS ROG STRIX RTX 2080 Ti offers the best 4K gaming experience now. This video card proves its performance potential better at 4K than it does at 1440p."
"That said, it did not allow this in Kingdom Come: Deliverance, proving that we actually need more GPU performance possibly in some games at 4K even today. As fast as it is, it isn’t quite enough in a game that’s been out now for 8 months. Therefore, we can’t say it will allow highest settings in every game at 4K, most, but not all."
I'd only add for perspective that we are still getting games that can tax mid-range cards at 1080p. It's been over a decade for that now. Similarly I'd say that as more bells and whistles are added to visuals we'll see the same phenomenon with 4k and beyond so waiting for the 'true 4k card' is relative to what features the games will be employing.
Got a question: anyone know why ASUS put a triple fan on the 2080, but just a dual (or single) fan on the 2080 Ti? Would have thought they'd put the triple on the big boy.
https://rog.asus.com/articles/gamin...ampaign=4582-20181003-RTX Graphics Card-- EDM
Both the cards used here have triple fan coolers. I think maybe you are cross-comparing different series of cards. Three series out from ASUS now. ROG Strix, ASUS Dual, and ASUS Turbo. Triple, double, and blower.
I was using the page I posted as source:
The Dual GeForce RTX 2080 Ti has dual fans.
The Turbo GeForce RTX 2080 Ti has a single blower.
The Strix GeForce RTX 2080 OC has three fans.
The Dual GeForce RTX 2080 OC has dual fans.
The 2080 Tis listed have only two fans; the only three-fan card is a 2080. Unless there are more out from ASUS that aren't listed on this page.
There is a Strix (3 fan) 2080 Ti. It's covered in the article here. Dunno why Asus didn't list it on that page but it definitely exists.
Digital Foundry has a pretty good video up on their YouTube channel.
Reading is fundamental...
Select models available for pre-order today
Our first GeForce RTX 2080 Ti and RTX 2080 graphics cards will start shipping in mid-September, and you can reserve yours now in select regions. Four models are available for pre-order from the North American retailers listed in the table above. The ROG Strix GeForce RTX 2080 OC Edition is priced at $869.99 USD and $1,149 CAD, while the ASUS Dual GeForce RTX 2080 OC Edition rings in at $839.99 USD and $1,099 CAD. Both are factory overclocked, as is the Dual GeForce RTX 2080 Ti OC Edition at $1,239.99 USD and $1,649 CAD. The Turbo GeForce RTX 2080 Ti runs at stock speeds for $1,209.99 USD and $1,599 CAD.
And, rtwfq, none of the Tis have three fans, just the 2080. My question was about why only one 2080 has three fans, and none of the 2080 Tis.
The double fan card uses the reference PCB while the Strix uses a custom PCB. The Strix are their fancy ones and have three fans. The blower ones have a single fan that sucks the hot air out the back of the case. What's unclear?
The link I posted did NOT have Strix 2080 Tis; I based my question on that page. evilpaul posted a link that actually has a Strix 2080 Ti, so my question was answered. There was nothing unclear: the link I posted (from an email announcement from ASUS) had no Strix 2080 Tis, and I thought that was pretty clear.
DLSS is simply an AI upscaling algorithm: it runs the game at 1440p and upscales it through AI to 4K, at a quality comparable to native 4K + TAA. That's how DLSS saves performance, by running the game at 1440p instead of 4K.
It also upscales 1080p to 1440p.
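The arithmetic behind that performance saving is worth spelling out: rendering at 1440p instead of native 4K shades well under half the pixels.

```python
# Pixel counts for the 1440p -> 4K DLSS case
render = 2560 * 1440      # internal render resolution (1440p)
native = 3840 * 2160      # native output resolution (4K)
print(round(render / native, 3))   # → 0.444
```

So the shader workload is roughly 44% of the native-4K load, and the reconstruction step runs on the Tensor Cores rather than eating into that budget.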
Any ETA on the 2080 Ti Strix yet?
Where dat full review with 20 billion games and 50 resolutions?