ASUS ROG STRIX RTX 2080 Ti and 2080 4K Preview @ [H]

Discussion in 'nVidia Flavor' started by Kyle_Bennett, Sep 27, 2018.

  1. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,786
    Joined:
    Apr 17, 2000
    Proof?

    thx
     
  2. evilpaul

    evilpaul Limp Gawd

    Messages:
    146
    Joined:
    Dec 31, 2016
Make the CUDA cores out of these magic circuits that work faster than the speed of light while you're at it, then?
     
  3. cybereality

    cybereality 2[H]4U

    Messages:
    3,103
    Joined:
    Mar 22, 2008
You should read NVIDIA's whitepaper; check page 40 of this document: https://www.nvidia.com/content/dam/...ure/NVIDIA-Turing-Architecture-Whitepaper.pdf

From my understanding, DLSS renders only half the samples in an image, then uses deep learning to fill in the missing, unrendered pixels. You can think of it as PS4 Pro checkerboarding, just with a smarter algorithm.

This is not the same as turning off AA: with DLSS you are only rendering half the pixels, so performance should be improved (minus the cost of DLSS itself, which should be less than the brute-force approach).

    You can also check this article, which gives a good overview of the technique with performance and quality comparisons: https://www.techspot.com/article/1712-nvidia-dlss/
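To make the checkerboard idea concrete, here is a toy sketch (my own illustration, NOT NVIDIA's actual algorithm): render only the pixels on a checkerboard pattern, then fill in the rest. A simple neighbor average stands in for the trained network that DLSS would use.

```python
# Toy checkerboard-style reconstruction: render ~half the pixels,
# then "infer" the missing ones. A neighbor average stands in for
# the neural network a real DLSS-style approach would use.
import numpy as np

def render_checkerboard(full_image):
    """Pretend we only rendered the pixels on a checkerboard pattern."""
    h, w = full_image.shape
    mask = (np.indices((h, w)).sum(axis=0) % 2) == 0  # True = rendered
    sparse = np.where(mask, full_image, np.nan)
    return sparse, mask

def fill_missing(sparse, mask):
    """Fill each unrendered pixel from the average of rendered neighbors."""
    h, w = sparse.shape
    out = sparse.copy()
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue  # already rendered, keep as-is
            neighbors = [sparse[ny, nx]
                         for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                         if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
            out[y, x] = sum(neighbors) / len(neighbors)
    return out

truth = np.arange(16, dtype=float).reshape(4, 4)
sparse, mask = render_checkerboard(truth)
approx = fill_missing(sparse, mask)
print(mask.sum())  # 8 of 16 pixels actually shaded, roughly halving cost
```

The rendered half of the pixels is untouched; only the gaps are guessed, which is why the shading cost drops while quality stays close to the full render.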
     
  4. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,786
    Joined:
    Apr 17, 2000
Thanks, I will check it out. From what I understood, DLSS uses models produced by AI training to fill in pre-computed forms of imaging or anti-aliasing. Basically, instead of the developer spending the time to figure out the best algorithm for AA, it moves the burden to trained machine-learning models, which do the work, saving the developer time while also using the most efficient methods for performance, since it's all done on the Tensor Cores. (I'm probably explaining it poorly.)

I do need to read into this more and study precisely how it works; there seems to be a lot of either misunderstanding or miscommunication about what exactly is going on. I will read into it much more, so that when it finally comes to a game we can test it properly.
     
    cybereality likes this.
  5. -=SOF=-WID99

    -=SOF=-WID99 Limp Gawd

    Messages:
    198
    Joined:
    Nov 30, 2015
Thank you for all your work, guys. I think if anything I'll go with a GTX 1080 Ti for now, or save up a bit more and go with the 2080. I just hope Nvidia fixes some of the driver issues I have read about.

Thank you, Brent and Kyle, for your work. I look forward to your driver performance and future reviews, as [H] is the only one I trust: fair and honest reviews using hardware we all use.
     
    Kyle_Bennett likes this.
  6. evilpaul

    evilpaul Limp Gawd

    Messages:
    146
    Joined:
    Dec 31, 2016
    Here's what I've settled on for fan speed at the moment:
    4HeHJGia7C4w.png
It turns the fan over the GPU off and leaves the VRM fan (which sits directly over the motherboard chipset, which was running 5°C hotter than with the 1080 that used to sit over it) running quietly.
     
  7. Pusher of Buttons

    Pusher of Buttons [H]ardForum Junkie

    Messages:
    9,584
    Joined:
    Dec 6, 2016
Would I see enough real-world benefit from the 2080 Ti over a 2GHz+ 1080 Ti? This seems like it might be a good generation to skip, from what I'm seeing so far.
     
  8. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    25,517
    Joined:
    Oct 29, 2000
So, it looks to me like the glue problem only arises if you want to take the vapor chamber cooler/fans apart.

If you just want to take the cooler off of the PCB for water cooling, it looks pretty much the same as it has been in the past.
     
    Dayaks likes this.
  9. InquisitorDavid

    InquisitorDavid [H]Lite

    Messages:
    77
    Joined:
    Jun 27, 2016
    Generally speaking, they have a neural network learn what an image looks like under the absolute highest quality (64x AA, same as movie quality), then have the Tensor Cores perform real-time inference to reconstruct the image to look like the ground truth render.

It can be used to render at a lower-than-native resolution and upscale to native res to save performance (DLSS), or as a superior form of AA at native res (DLSS 2X).
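A tiny sketch of how that ground truth is produced (my own illustration under stated assumptions, not NVIDIA's pipeline; the `shade` scene function is hypothetical): render many samples per pixel and average them down, the way a 64x supersampled target frame is made for the network to learn from.

```python
# Toy "ground truth" generation: supersample a scene and box-filter down.
# shade() is a made-up scene: 1.0 inside a circle, 0.0 outside (hard edge).
import numpy as np

def shade(x, y):
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.16 else 0.0

def render(res, samples_per_axis):
    """Average samples_per_axis^2 shades per pixel (8x8 = 64 samples)."""
    img = np.zeros((res, res))
    s = samples_per_axis
    for py in range(res):
        for px in range(res):
            sub = [shade((px + (i + 0.5) / s) / res, (py + (j + 0.5) / s) / res)
                   for i in range(s) for j in range(s)]
            img[py, px] = sum(sub) / len(sub)
    return img

aliased = render(16, 1)  # 1 sample per pixel: hard, jagged edges
truth = render(16, 8)    # 64 samples per pixel: smooth "ground truth"
# Edge pixels in the supersampled target take fractional coverage values;
# that smooth look is what the network is trained to reproduce at runtime.
print(sorted(set(aliased.flatten().tolist())))  # hard 0/1 values only
```

The 1-sample render can only ever produce hard 0/1 edges, while the 64-sample target contains the in-between edge values, which is exactly the gap the trained inference step tries to close.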
     
    CoreStoffer and cybereality like this.
  10. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,405
    Joined:
    Oct 13, 2016
    I think when the guys get the full review out it will help clarify.

    My two cents worth on it:

1440p: Probably not. Even the most demanding current games like KCD or SOTR only bring a 2GHz+ 1080 Ti down to 50-60 fps. Just about every other game still plays at 70-120 fps maxed. Pair that Ti with a G-Sync display and it's a near-perfect match.

4K: It depends. With less demanding games or significant settings compromises that card can do 4K/60 fps, but at that point it's mainly just throwing pixels. A couple of quotes from Brent:

    "Hands down, the ASUS ROG STRIX RTX 2080 Ti offers the best 4K gaming experience now. This video card proves its performance potential better at 4K than it does at 1440p."

    "That said, it did not allow this in Kingdom Come: Deliverance, proving that we actually need more GPU performance possibly in some games at 4K even today. As fast as it is, it isn’t quite enough in a game that’s been out now for 8 months. Therefore, we can’t say it will allow highest settings in every game at 4K, most, but not all."

I'd only add for perspective that we are still getting games that can tax mid-range cards at 1080p, and have been for over a decade now. Similarly, as more bells and whistles are added to visuals, we'll see the same phenomenon with 4K and beyond, so waiting for the 'true 4K card' is relative to what features the games will be employing.
     
  11. pek

    pek prairie dog

    Messages:
    610
    Joined:
    Nov 7, 2005
  12. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    54,126
    Joined:
    May 18, 1997
Both the cards used here have triple-fan coolers. I think maybe you are cross-comparing different series of cards. There are three series out from ASUS now: ROG Strix, ASUS Dual, and ASUS Turbo, with triple-fan, dual-fan, and blower coolers, respectively.
     
  13. pek

    pek prairie dog

    Messages:
    610
    Joined:
    Nov 7, 2005
I was using the page I posted as the source:

The Dual GeForce RTX 2080 Ti has dual fans.

The Turbo GeForce RTX 2080 Ti has a single blower.

The Strix GeForce RTX 2080 OC has three fans.

The Dual GeForce RTX 2080 OC has dual fans.

The 2080 Tis listed have only two fans; the only three-fan card is a 2080. Unless there are more out from ASUS that aren't listed on this page.
     
  14. x3sphere

    x3sphere 2[H]4U

    Messages:
    2,575
    Joined:
    Dec 8, 2007
There is a Strix (3-fan) 2080 Ti. It's covered in the article here. Dunno why ASUS didn't list it on that page, but it definitely exists.
     
    Kyle_Bennett likes this.
  15. twonunpackmule

    twonunpackmule [H]ard|Gawd

    Messages:
    1,279
    Joined:
    Sep 27, 2005
Digital Foundry has a pretty good video up on their YouTube channel.
     
  16. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    54,126
    Joined:
    May 18, 1997
    Reading is fundamental...

    Select models available for pre-order today
    Our first GeForce RTX 2080 Ti and RTX 2080 graphics cards will start shipping in mid-September, and you can reserve yours now in select regions. Four models are available for pre-order from the North American retailers listed in the table above. The ROG Strix GeForce RTX 2080 OC Edition is priced at $869.99 USD and $1,149 CAD, while the ASUS Dual GeForce RTX 2080 OC Edition rings in at $839.99 USD and $1,099 CAD. Both are factory overclocked, as is the Dual GeForce RTX 2080 Ti OC Edition at $1,239.99 USD and $1,649 CAD. The Turbo GeForce RTX 2080 Ti runs at stock speeds for $1,209.99 USD and $1,599 CAD.
     
    lostin3d likes this.
  17. pek

    pek prairie dog

    Messages:
    610
    Joined:
    Nov 7, 2005
And, RTWFQ, none of the Tis have three fans, just the 2080. My question was about why only one 2080 has three fans and none of the 2080 Tis do.
     
  18. evilpaul

    evilpaul Limp Gawd

    Messages:
    146
    Joined:
    Dec 31, 2016
  19. pek

    pek prairie dog

    Messages:
    610
    Joined:
    Nov 7, 2005
The link I posted did NOT have Strix 2080 Tis, and I based my question on that page. evilpaul posted an actual link that had a Strix 2080 Ti, so my question was answered. There was nothing unclear: the link I posted (from an email announcement from ASUS) had no Strix 2080 Tis. I thought that was pretty clear.
     
  20. Hameeeedo

    Hameeeedo Limp Gawd

    Messages:
    172
    Joined:
    May 27, 2016
    Wrong.

DLSS is simply an AI upscaling algorithm, meaning it runs the game at 1440p and upscales it through AI to 4K, at a quality comparable to native 4K + TAA. That's how DLSS saves performance: by running the game at 1440p instead of 4K.

It also upscales 1080p to 1440p.
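The back-of-envelope arithmetic behind that performance saving is just pixel counts; a quick sketch:

```python
# Pixel-count comparison: rendering at 1440p instead of native 4K
# shades far fewer pixels per frame before the AI upscale.
def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)  # 8,294,400 pixels
qhd = pixels(2560, 1440)        # 3,686,400 pixels
print(native_4k / qhd)          # 2.25x fewer pixels shaded per frame
```

So even after paying the Tensor Core cost of the upscale, the shader workload is well under half of a native 4K render.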
     
    cybereality likes this.
  21. bill_d

    bill_d Limp Gawd

    Messages:
    186
    Joined:
    Jun 8, 2007
Any ETA on the 2080 Ti Strix yet?
     
  22. KickAssCop

    KickAssCop [H]ardness Supreme

    Messages:
    6,504
    Joined:
    Mar 19, 2003
    Where dat full review with 20 billion games and 50 resolutions?