Ray-Tracing and multi-GPU

Discussion in 'Video Cards' started by XoR_, Sep 16, 2018.

  1. XoR_

    XoR_ Limp Gawd

    Messages:
    365
    Joined:
    Jan 18, 2016
In theory ray tracing should be perfect for multi-GPU configurations, with the ability to scale well with SFR (Split Frame Rendering).

Like the original 3dfx cards, where each GPU drew alternating scan lines, here RTX cards would each calculate part of the necessary rays on a per-ray basis - which, at one primary ray per pixel, would translate to a per-pixel split.

This should give consistent scaling (as long as the CPU can keep up) and none of the added input lag we see with AFR (Alternate Frame Rendering), making SLI an actually usable solution.
     
  2. DF-1

    DF-1 2[H]4U

    Messages:
    2,388
    Joined:
    Jun 17, 2011
Unaffordable niche within a niche. Why even bother theorizing about it?
     
  3. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    8,442
    Joined:
    Oct 19, 2004
    If somehow the new SLI LINK works apart from dev implementation (NVIDIA manages it on their side) — that’s the way it succeeds.
     
  4. FlawleZ

    FlawleZ Gawd

    Messages:
    580
    Joined:
    Oct 20, 2010
    And the 2080Ti MSRP is affordable? When the 6800GT released and first brought DX9.0c and SLI to the table, the MSRP was $399. The 6600GT was a mere $200. When these cards were used in SLI it brought massive performance to the table for even less than what Nvidia wants for 1 of their latest high end GPUs.

Nvidia wants to bring Ray Tracing to consumers. Cool. Too bad their enthusiast level card is 100% incapable of playing games at enthusiast settings. Nvidia is coming out before release and lowering expectations to head off negative reception on performance.
    In fact, some could argue 4K is almost mainstream. There hasn't been a stronger need for SLI in YEARS. Yet because they refuse to push the advancement of a tech that would actually make them more money, we are left with a half-assed experience on SLI today.
     
  5. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    8,442
    Joined:
    Oct 19, 2004
    Snowdensjacket and cybereality like this.
  6. FlawleZ

    FlawleZ Gawd

    Messages:
    580
    Joined:
    Oct 20, 2010
I wouldn't personally claim it's mainstream either. However, those surveys don't tell the whole story. How many of those people currently game on a 1080 or 1080Ti? And then how many of THOSE gamers play at 4K? Enthusiast level gamers want enthusiast level hardware to play at enthusiast level detail settings and performance. This RTX generation will not be able to deliver that in a single GPU form.
     
  7. cybereality

    cybereality 2[H]4U

    Messages:
    2,975
    Joined:
    Mar 22, 2008
    Yeah, that's what I'm hoping. I bought 2 Ti's. If NVLink is as good as they say, and the driver implementation is there, we could see big gains for SLI.

    I agree that 4K is not mainstream right now for gaming PCs, *but* the kind of people that will drop $1,200+ on a GPU probably aren't at 1080P60, most likely 1440P high refresh or 4K.
     
    Dayaks likes this.
  8. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,294
    Joined:
    Feb 22, 2012
I don’t think it will but I could be wrong. I haven’t read anything besides NVLink being faster... and a lot of the time the problem was SLI flat out not working, or game-breaking bugs, rather than speed IMO. More annoying was a game working great and then getting negative performance the next day (BF4).

    I love pissing away money but I would definitely wait for reviews on SLI. In the past it’s made my overall experience worse.

I have heard ray tracing would be easy to split, but I consider that a rumor. Before I hop on mGPU ever again, it'd have to be automatic (seen as one device).

    Not sure if you’re on the right forum broski. We bang tens and make over 100k here.
     
  9. linuxdude9

    linuxdude9 Limp Gawd

    Messages:
    477
    Joined:
    Dec 25, 2004
    I do 1440P 165hz, and ideally like to use some supersampling. I'll be doing SLI for the first time with the 2080 Ti. I look at it as, for the games it works with, great, I'll have better performance and can crank the details. For the games it doesn't, I just accept I'll have single GPU performance. I'm aware that SLI is terrible from a cost/perf perspective, but for some of us, we're enthusiasts. This is what we enjoy doing.

Frametimes with NVLink over SLI should be better, as frames can be transferred in a fraction of the time. Also, NVidia has been shuffling data over both the SLI bridges and the PCI-E bus. A lot of games don't scale well with SLI unless you have a full-bandwidth 16x/16x setup on an HEDT platform, especially games that use TAA. With NVLink, the PCI-E bus is no longer used for exchanging data between the GPUs and the bandwidth is substantially higher.
     
  10. noko

    noko 2[H]4U

    Messages:
    3,828
    Joined:
    Apr 14, 2010
I would think ray tracing titles will be DX12-oriented; Microsoft made it easier for developers to get multi-GPU working in the API, and RT optimizations in software will probably bring performance up as well. I'd say ray-traced games that are optimized with DX12 multi-GPU could very much perform well at 1440p; add in DLSS, maybe 4K as well. Just too much is not known yet - some of this is guessing. I don't think we will know true RT performance for at least another six months, if not longer.
     
    Dayaks likes this.
  11. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    53,918
    Joined:
    Feb 9, 2002
    Indeed. NVIDIA needs to make multi-GPU work on a pure hardware level and take multiGPU implementation out of the hands of Microsoft or any software developers. Until they do, multi-GPU sales amongst gamers are at an end.

Considering some of us spent more than $2,000 on a pair of Titans or Titan X's in the past, I don't think the RTX 2080Ti's $1,000-$1,200 is necessarily out of bounds. It's not a price point that will make those cards mainstream, but it's not totally out there. It seems as though the RTX 2080Ti takes the Titan's spot, the RTX 2080 takes the top high-end GPU spot, and so on. Given that NVIDIA used to double dip on GPU sales from guys like me, it's really not that crazy. For the performance upgrade and not having to deal with SLI's bullshit, $1,000-$1,200 isn't unreasonable at all. Hell, I consider it almost a bargain. I paid nearly $1,600 for the cards in my machine now. I paid over $2,100 for my Maxwell based Titan X's. As for the costs you mentioned, this isn't 2005 anymore. Everything is more expensive in 2018 than it was in 2005.
     
    cybereality, Dayaks and Araxie like this.
  12. noko

    noko 2[H]4U

    Messages:
    3,828
    Joined:
    Apr 14, 2010
Nah, if I get a 2080 Ti it will be two on order :D. Just need some kind of monitor that can take the output.
     
  13. Stoly

    Stoly [H]ardness Supreme

    Messages:
    5,845
    Joined:
    Jul 26, 2005
The whole idea of DX12 (and Vulkan for that matter) regarding multi-GPU was to give developers lower-level access so they could get bigger performance gains.

    Nvidia certainly has the resources to take it upon themselves, but I don't think AMD can afford that.
     
  14. FlawleZ

    FlawleZ Gawd

    Messages:
    580
    Joined:
    Oct 20, 2010
I guess it depends on your expectations. If I'm spending over $1000 for a video card, it better be able to play current games at 4K with all the bells and whistles. Otherwise, if I'm buying a card to play at 1080P I'm not spending $1000. Nvidia already says the 2080Ti can't. So you're back to either SLI or compromising on settings for a playable experience on a single card.
     
  15. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    53,918
    Joined:
    Feb 9, 2002
    I wouldn't buy two without knowing precisely what kind of gains (if any) I could expect using two of them in my system. Any monitor can take the output, so I don't know what the hell you mean by that.

    I understand why things were done this way. Unfortunately, developers tend to put consoles first and they don't generally spend the time on leveraging multiGPU. So, multiGPU has been less than stellar over the last couple of years or so. I wouldn't be surprised if NVIDIA was already working on this in some capacity. Whether or not AMD can afford to do things that way is not NVIDIA's problem.

Technology is what it is. I've often gone for the fastest hardware available regardless of the cost with few exceptions. I only expect the best of what's available at the time. Who says the 2080Ti can't do current games at 4K with all the bells and whistles? Aside from Ray Tracing I bet it does, given that a single 1080Ti can do this in most games now. Ray tracing is a feature that has too much of a performance hit today to be usable for anyone running anything better than some shitty 1920x1080 TN panel. That's the only feature that we know probably can't be leveraged well enough to be worthwhile right now. It's a stepping stone towards a future product that will make it a viable feature. Anti-Aliasing and many other rendering techniques and visual technologies were the same way when they first came out.

Either some of you guys are really young or you have short memories. The RTX 2080Ti is no doubt the fastest gaming card on the planet right now. How much of an improvement over the 1080Ti it is, and whether or not the performance delivered will be worth the price of admission, is another matter entirely. If a 1080Ti can actually run a great deal of games well at 4K today, the 2080Ti will do it better. Ray tracing is a feature that's not going to see widespread support for several months at best. Like most new features, lazy ass console focused developers probably won't touch the shit at all for the next 3 years or so. It will be an afterthought in a few titles I'm sure, but it will hardly be mainstream anytime soon.

It makes no sense to worry about whether or not the RTX 2080Ti can handle anything more than 1080P with that feature today given that most gamers are still on 1080P displays or something close to that now. Fuck, HDR is hardly a blip on the radar with games today and it's been possible for some time. Very few gamers outside of the Samsung TV crowd even own HDR capable displays. New graphics technologies and features usually take years to become usable, much less mainstream.

    Unfortunately, with technology we have to learn to walk before we can run.
     
    cybereality likes this.
  16. Snowdog

    Snowdog Pasty Nerd with Poor Cardio

    Messages:
    7,948
    Joined:
    Apr 22, 2006
RT should be no easier to SLI than what we already have, because it's everything we already have plus more, so most games will stick with AFR like they already do.
     
  17. Hagrid

    Hagrid Kyle's Boo

    Messages:
    8,320
    Joined:
    Nov 23, 2006
    Even if it's 100% scaling, if the game does not support it, not much help there. Maybe they need to bring it back to all games.
     
  18. Elios

    Elios [H]ardness Supreme

    Messages:
    7,139
    Joined:
    Aug 12, 2004
Clearly you are on the WRONG web site. This is HARDOCP, not bitch-about-price OCP.
     
  19. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    53,918
    Joined:
    Feb 9, 2002
That's my point. If NVIDIA could make SLI, or something like it, a pure hardware solution that's software-agnostic, this would no longer matter, and guys like me would buy GPUs until we either couldn't fit any more in our machines or the scaling stops. Depending on the gaming industry to support it through Vulkan or DX12 is a recipe for disappointment.
     
    Maddness and Elios like this.
  20. Stoly

    Stoly [H]ardness Supreme

    Messages:
    5,845
    Joined:
    Jul 26, 2005
I don't think a "pure hardware" solution is even possible. I guess nvidia could go back to profiling, but I think the best bet would be to work with the developers to implement NVLink properly. Thankfully nvidia already has a tight relationship with major developers, so it could happen.

    AMD on the other hand, as I stated before, doesn't have the resources to do so.
     
  21. DF-1

    DF-1 2[H]4U

    Messages:
    2,388
    Joined:
    Jun 17, 2011
    It's not entirely about price. It's about the userbase.

    It's about the two people who will spend $2400 on GPUs alone to get RTX NVLink.

    Good luck getting ANYONE to program for that.
     
  22. Hagrid

    Hagrid Kyle's Boo

    Messages:
    8,320
    Joined:
    Nov 23, 2006
    Monitors are getting faster now. 4K 60, and now 4K 144 or even 240?
    Single cards are not cutting it, if we want to move ahead.
     
  23. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    53,918
    Joined:
    Feb 9, 2002
Which is why I'd only buy two cards if I thought I'd actually get jack shit out of the second card.
     
  24. noko

    noko 2[H]4U

    Messages:
    3,828
    Joined:
    Apr 14, 2010
NVidia's own benchmark of the 2080 Ti (an OC card) in Shadow of the Tomb Raider at 4K, not even at max settings, has it at less than 60 FPS.

Anyways, there are good games that play well with two cards; Far Cry 5 with CFX maxed out exceeds 60 FPS at 4K. It also works well with 1080 Ti SLI. I just don't see a single card hitting the mark at 4K without some compromises. Of course many games will do fine as well, but the newer intensive ones - no. It does not look like the 2080 Ti will change that.

    Now if I had two 2080 TIs I would want a faster than 60 hz 4K monitor, Gsync with all the trimmings. Looking at $4500 for that combination.
     
  25. cybereality

    cybereality 2[H]4U

    Messages:
    2,975
    Joined:
    Mar 22, 2008
    Multi-GPU in Vulkan (maybe DX12 as well) can expose multiple cards as one unit. There are some things that need specific optimization, but this should not be hard for professional engine programmers.

    I think the issue is more about economics. Meaning how many people actually use more than 1 card on the market, and if the development cost is worth it. I don't believe it's a technical issue.