Real-Time Ray Tracing Support Comes to GeForce GTX GPUs and Game Engines

Discussion in 'HardForum Tech News' started by cageymaru, Mar 18, 2019.

  1. cageymaru

    cageymaru [H]ard as it Gets

    Messages:
    19,701
    Joined:
    Apr 10, 2003
    NVIDIA has announced that real-time ray tracing support is coming to GeForce GTX GPUs. The driver is scheduled to launch in April. GeForce GTX GPUs will execute ray-traced effects on shader cores, and support extends to both the Microsoft DXR and Vulkan APIs. NVIDIA reminds consumers that its GeForce RTX lineup of cards has dedicated ray tracing cores built directly into the GPU, which deliver the ultimate ray tracing experience. GeForce RTX GPUs provide up to 2-3x faster ray tracing performance, with a more visually immersive gaming environment, than GPUs without dedicated ray tracing cores. NVIDIA GameWorks RTX is a comprehensive set of tools and rendering techniques that help game developers add ray tracing to games. Unreal Engine and Unity have announced that integrated real-time ray tracing support is being built into their engines.

    Real-time ray tracing support from other first-party AAA game engines includes DICE/EA's Frostbite Engine, Remedy Entertainment's Northlight Engine, and engines from Crystal Dynamics, Kingsoft, NetEase and others. Quake II RTX uses ray tracing for all of the lighting in the game via a unified lighting algorithm called path tracing. The classic Quake II game was modified by the open-source community to support ray tracing, and NVIDIA's engineering team further enhanced it with improved graphics and physics. Quake II RTX is the first ray-traced game using NVIDIA VKRay, a Vulkan extension that allows any developer using Vulkan to add ray-traced effects to their games.
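
    For those unfamiliar with the term: path tracing folds direct lighting, shadows, reflections, and indirect bounces into one recursive estimator instead of separate effects. Below is a minimal single-channel CPU sketch of the idea; the scene, sampling, and all names here are made up for illustration and are not Quake II RTX's actual code.

    Code:
    // Minimal diffuse path tracer sketch (illustrative only).
    // One estimator handles direct and indirect light alike: rays bounce
    // until they escape to the sky (the only emitter here) or hit the
    // depth limit.
    #include <cmath>
    #include <cstdio>
    #include <random>

    struct Vec { double x, y, z; };
    static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

    struct Sphere { Vec c; double r; double albedo; };
    static const Sphere scene[] = {
        {{0, -1000, 0}, 999.0, 0.7},   // ground
        {{0, 1, -3},    1.0,   0.5},   // ball
    };

    // Returns parametric hit distance along a unit ray, negative on miss.
    static double hit(const Sphere& s, Vec o, Vec d) {
        Vec oc = add(o, mul(s.c, -1.0));
        double b = dot(oc, d), c = dot(oc, oc) - s.r * s.r;
        double disc = b * b - c;
        if (disc < 0) return -1.0;
        double t = -b - std::sqrt(disc);
        return t > 1e-4 ? t : -1.0;
    }

    static std::mt19937 rng(42);
    static std::uniform_real_distribution<double> uni(0.0, 1.0);

    // Radiance along a ray: the entire lighting model in one function.
    static double radiance(Vec o, Vec d, int depth) {
        if (depth > 4) return 0.0;
        double best = 1e30; const Sphere* hitS = nullptr;
        for (const Sphere& s : scene) {
            double t = hit(s, o, d);
            if (t > 0 && t < best) { best = t; hitS = &s; }
        }
        if (!hitS) return 1.0;                        // escaped: sky light
        Vec p = add(o, mul(d, best));
        Vec n = norm(add(p, mul(hitS->c, -1.0)));
        // Crude bounce: random direction, with the normal as a fallback
        // if the sample lands outside the unit ball or the hemisphere.
        Vec w = {uni(rng) * 2 - 1, uni(rng) * 2 - 1, uni(rng) * 2 - 1};
        if (dot(w, w) > 1.0 || dot(w, n) < 0) w = n;
        return hitS->albedo * radiance(p, norm(w), depth + 1);
    }

    int main() {
        double sum = 0.0;
        const int samples = 10000;
        for (int i = 0; i < samples; ++i)
            sum += radiance({0, 1, 0}, norm({0, -0.2, -1}), 0);
        std::printf("estimated radiance: %f\n", sum / samples);
    }

    The real game does the same thing per pixel on the GPU, with proper materials and a denoiser on top of the noisy estimate.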
     
  2. Daarken

    Daarken [H]Lite

    Messages:
    104
    Joined:
    Jan 3, 2006
    I wonder if this is just a quick response to the recent Crytek demo showing ray tracing on an AMD video card.
     
  3. exlink

    exlink [H]ardness Supreme

    Messages:
    4,236
    Joined:
    Dec 16, 2006
    Interesting addition but I’ll wait till performance numbers come out before I pass judgement.
     
  4. polonyc2

    polonyc2 [H]ardForum Junkie

    Messages:
    16,367
    Joined:
    Oct 25, 2004
    the death blow for the RTX 2000 series...
     
  5. ChadD

    ChadD 2[H]4U

    Messages:
    3,927
    Joined:
    Feb 8, 2016
    Well, that may have sped up the announcement. But I suspect that was always the way it was heading. The real ray tracing push will come when AMD starts talking about Navi and their console parts. I doubt they have tensor flow hardware... but they are going to support ray tracing. People forget AMD was showing off real-time ray tracing with Radeon Rays long before RTX.

    IMO Nvidia used tracing to try and sell people on their latest GPUs early, as they know full well that when the real wave of traced games hits, they will use shaders, not tensor flow. I'm sure they will use their "The Way It's Meant to be Played" money to get some "high end RTX" tracing into a handful of those games. Perhaps it will be faster, perhaps not.
     
  6. warderkeeju

    warderkeeju n00b

    Messages:
    28
    Joined:
    Apr 8, 2018
    Are they going to cripple the 900 series again with lack of support like they did for the variable refresh rate on 'un-certified monitors'?
     
  7. GoldenTiger

    GoldenTiger [H]ard as it Gets

    Messages:
    18,544
    Joined:
    Dec 2, 2004
    Misinformation. DXR is DirectX Raytracing. Nvidia has dedicated hardware that takes those calls and executes them. Anything coded to DXR will work with Nvidia's implementation.
     
  8. SPARTAN VI

    SPARTAN VI [H]ardness Supreme

    Messages:
    7,092
    Joined:
    Jun 12, 2004
    This is like Nvidia admitting that ray tracing adoption among game developers was stagnant because customers weren't buying their RTX hardware, and customers weren't buying the hardware because developers weren't designing games with real-time ray tracing (and it was overpriced as all hell). I suppose this is the most logical way to instantly populate an install base of "RTX"-capable hardware so developers have more of an ecosystem of customers to work with.
     
    N4CR likes this.
  9. CrazyRob

    CrazyRob [H]ard|Gawd

    Messages:
    1,274
    Joined:
    Aug 26, 2004
    Also, Nvidia previously stated that their GTX series cards were capable of running the same ray-traced effects as shader processes, but that it was significantly slower. So this shouldn't be surprising for anyone that's been paying attention. If anything, this is a move to help spread adoption of raytracing techniques in software, as more hardware will technically be able to use those features, even if it's not practical. The more programs that implement any level of raytracing, the more opportunities for their RTX series to shine.
     
    Talyrius, Fleat, GoodBoy and 8 others like this.
  10. idiomatic

    idiomatic [H]Lite

    Messages:
    66
    Joined:
    Jan 12, 2018
  11. ChadD

    ChadD 2[H]4U

    Messages:
    3,927
    Joined:
    Feb 8, 2016
    Working efficiently with a radically different architecture is going to take a lot more than a generic driver implementation. Games coming down the pike are clearly going to be coded to use DXR shaders. I highly doubt their drivers will be able to decouple all that shader math and feed it to their tensor units without some coding help. The driver perhaps can do that, but my guess is the performance gain will not be all that great if it's not optimised on the software end.

    https://devblogs.microsoft.com/directx/announcing-microsoft-directx-raytracing/

    A new command list method, DispatchRays, which is the starting point for tracing rays into the scene. This is how the game actually submits DXR workloads to the GPU.
    A set of new HLSL shader types including ray-generation, closest-hit, any-hit, and miss shaders. These specify what the DXR workload actually does computationally. When DispatchRays is called, the ray-generation shader runs. Using the new TraceRay intrinsic function in HLSL, the ray generation shader causes rays to be traced into the scene. Depending on where the ray goes in the scene, one of several hit or miss shaders may be invoked at the point of intersection. This allows a game to assign each object its own set of shaders and textures, resulting in a unique material.

    It's not like MS has ever hidden that DXR uses shaders to achieve tracing. From everything I have read so far, all this RTX stuff is mostly hogwash. Pure marketing BS to get gamers to pay more for GPUs with Google's AI hardware sandwiched on the silicon. Sure, perhaps there are some uses for tensor flow hardware for games... but as I see it, not much jumps out that isn't better suited to more general GPU cores. As is almost always the case, kudos to Nvidia's marketing team for confusing the hell out of some tech to sell cards.
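
    To make the quoted DispatchRays step concrete, here's a minimal host-side C++ sketch of the submission. The raytracing pipeline state object and the shader tables are assumed to have been built earlier, and the function and parameter names (SubmitRays, etc.) are illustrative, not part of the API.

    Code:
    // Sketch of submitting a DXR workload with DispatchRays (D3D12).
    // Only the dispatch itself is shown; the state object and shader
    // tables (ray-gen / miss / hit-group records) come from earlier setup.
    #include <d3d12.h>

    void SubmitRays(ID3D12GraphicsCommandList4* cmdList,
                    ID3D12StateObject* rtPipeline,
                    D3D12_GPU_VIRTUAL_ADDRESS rayGenTable, UINT64 rayGenSize,
                    D3D12_GPU_VIRTUAL_ADDRESS missTable, UINT64 missSize, UINT64 missStride,
                    D3D12_GPU_VIRTUAL_ADDRESS hitTable, UINT64 hitSize, UINT64 hitStride,
                    UINT width, UINT height)
    {
        D3D12_DISPATCH_RAYS_DESC desc = {};
        // Which ray-generation shader record to run.
        desc.RayGenerationShaderRecord.StartAddress = rayGenTable;
        desc.RayGenerationShaderRecord.SizeInBytes  = rayGenSize;
        // Tables of miss and hit-group shaders; TraceRay() in HLSL indexes
        // into these, which is how each object gets its own material shaders.
        desc.MissShaderTable.StartAddress  = missTable;
        desc.MissShaderTable.SizeInBytes   = missSize;
        desc.MissShaderTable.StrideInBytes = missStride;
        desc.HitGroupTable.StartAddress  = hitTable;
        desc.HitGroupTable.SizeInBytes   = hitSize;
        desc.HitGroupTable.StrideInBytes = hitStride;
        // One ray-gen invocation per pixel, like a compute dispatch.
        desc.Width  = width;
        desc.Height = height;
        desc.Depth  = 1;

        cmdList->SetPipelineState1(rtPipeline);  // bind the raytracing pipeline
        cmdList->DispatchRays(&desc);            // launch the ray-gen grid
    }

    The point being: all of this is expressed against DXR itself, and it's the driver that decides whether those ray/hit/miss shaders land on plain shader cores or on dedicated hardware.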
     
  12. Cmdrmonkey

    Cmdrmonkey Gawd

    Messages:
    1,013
    Joined:
    Jul 19, 2004
    Yay, now I can cut performance by 95% for no discernible difference in image quality.
     
  13. FrozenSteel

    FrozenSteel Limp Gawd

    Messages:
    186
    Joined:
    Oct 24, 2011
    I'm seeing a lot of talk about Raytracing and Tensor Cores... Just to clarify, they're two separate things. Tensor cores have nothing to do with Raytracing; they have everything to do with Nvidia DLSS (Deep Learning Super Sampling). Raytracing on RTX cards is offloaded to RT cores rather than running in the shaders. So in reality, Raytracing should be faster on RTX cards so long as the RT cores are utilized properly. Can Raytracing be done on shaders? Sure, but at the loss of performance, as some shaders need to be consumed to perform the Raytracing operations. See the photo below, which shows the difference. Note that the Tensor cores are separated from the RT cores.

    [Attached image: Pascal_vs_Turing.jpg]
     
    GameLifter, Auer and captaindiptoad like this.
  14. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,235
    Joined:
    Sep 13, 2008
    When I saw the news post, that's exactly what I was thinking.
     
  15. ChadD

    ChadD 2[H]4U

    Messages:
    3,927
    Joined:
    Feb 8, 2016
    See? Look at how good a job Nvidia's marketing dept has done.

    No "RT cores" are complete and utter BS. Pascal isn't the chip to compare it to volta is. Volta also has tensor flow. Main difference is Volta has basically reference tensor flow hardware. Pascal adds the ability to run in higher precision float point modes.

    The issue for Volta and GPU computation in general has always been precision. The reason why GPUs are not used for final ray traced renders is because they simply can't do precise calculation. IEEE 754 is the standard for floating point math... GPUs haven't always conformed to that standard at all. And even now, Turing and Vega alike don't provide full floating point error check registers. This means their usefulness for something like a massive 50+ GB ray traced render in pro-level CGI work is zero. They can be used for previs because they are great at quick and dirty calculation... but they suck at highly ordered, double/triple checked math.

    What I'm getting at is that with Turing, Nvidia expanded on tensor flow. They added higher precision floating point modes. They still don't add the same error floats you would find in x86, but they are a lot better. This has AI uses of course. But the truth is Nvidia wants to get into pro-level render farms. They have been providing programming support to Pixar for a few years now to expand their RenderMan software with something called XPU, which is a CPU + GPU final render unit for RenderMan. I sort of doubt Pixar will use it for any of their own flicks for some time... but it might be good enough for companies using RenderMan with things like Flow and Houdini etc. for lower-end CGI work like TV commercials. Nvidia is looking to sell into those markets, and that is why they extended tensor flow. All those "RT" cores their marketing dept is talking about are nothing but the tensor control units allowing FP64 math to be done within the tensor matrix. (Just look at the counts: 544 tensor cores, 68 "RT" cores. Yes, exactly 8x as many; they're nothing but FP64 control units.)

    Here is a quote from Jensen himself from their Q2 earnings call:
    "In the case of Turing, it’s really designed for three major applications. The first application is to open up Pro Visualization, which is a really large market that has historically used render farms. And we’re really unable to use GPUs until we now have -- we now have the ability to do full path trace, global illumination with very, very large data sets. "
     
    Last edited: Mar 19, 2019
  16. misterbobby

    misterbobby 2[H]4U

    Messages:
    3,701
    Joined:
    Mar 18, 2014
    You go out of your way to highlight and underline that tensor cores have nothing to do with ray tracing. Maybe do a few seconds of research, as they are used for denoising the ray traced image.
     
    Last edited: Mar 19, 2019
    Armenius and Dayaks like this.
  17. Oldmodder

    Oldmodder Gawd

    Messages:
    679
    Joined:
    Aug 24, 2018
    How convenient to have your tech demos be just like the company in general... primadonnas.
     
  18. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,472
    Joined:
    Feb 3, 2014
    People are going on about ray tracing; I am more interested in how the RTX cores can do some nice AI work. I would one day love to see boss fights enhanced with AI: a boss that remembers how each player defeated it and works to counteract it... that would be interesting, especially in MMOs.

    Ray tracing is great, don't get me wrong there, but until mainstream cards under $300 have the power to do it and still deliver frame rates above 30 at 1080p, it's an interesting feature that most literally can't afford to run.
     
    Last edited: Mar 19, 2019
  19. Zulgrib

    Zulgrib n00b

    Messages:
    31
    Joined:
    Dec 11, 2018
    Thanks Crytek?
     
    DukenukemX likes this.
  20. odditory

    odditory [H]ardness Supreme

    Messages:
    5,327
    Joined:
    Dec 23, 2007
    NV's API has clearly been in the pipeline for a while and doesn't have anything to do with Crytek's canned demo; it's not an either-or. One company makes hardware, the other 3D engines; they're not actually competing, and their innovations will complement one another. NV GPUs will end up the prime beneficiaries of Crytek's work by default, just because of marketshare. If Crytek is able to leverage any AMD-specific hardware features (compute or whatever), that'd be wonderful.

    Regardless, more companies hopping on the raytracing bandwagon benefits everyone.
     
    Last edited: Mar 19, 2019
    GoldenTiger likes this.
  21. Bawjaws

    Bawjaws Limp Gawd

    Messages:
    434
    Joined:
    Feb 20, 2017
    Yep.
    Only if non-RTX cards perform similarly to the RTX series. Let's wait and see some real numbers before jumping the gun.
    Exactly. If this pushes RT forward and there are real performance/IQ improvements then it's all good.
     
  22. Prisoner849

    Prisoner849 Gawd

    Messages:
    683
    Joined:
    May 5, 2016
    Taste the cheap crack, kids... you'll be back for the good stuff.
     
    GoodBoy, Crackinjahcs and ItWasMe like this.
  23. M76

    M76 [H]ardForum Junkie

    Messages:
    9,002
    Joined:
    Jun 12, 2012
    It is not practical on RTX either, unless you consider 1080p 30 fps practical.
     
    DukenukemX likes this.
  24. TorxT3D

    TorxT3D Gawd

    Messages:
    651
    Joined:
    Apr 30, 2006
    I don't care about performance numbers; the drivers can be built to purposely cripple ray tracing for previous-gen cards.
    I've never trusted Nvidia when it comes to their drivers... they've been notorious through the years for pushing you into buying the next tech.

    I will admit, and I commented as much on the AMD ray tracing vid, that the ray tracing kinda looks cheap, or like fake ray tracing, compared to previous tech demos I've seen with RTX hardware.
     
    jeffj7 likes this.
  25. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,390
    Joined:
    Jan 28, 2014
    People were complaining about 60 FPS at 1080p with ray tracing. Well, congratulations, because now you can play games with ray tracing at 15 FPS at 1080p.
     
    BlueFireIce, Talyrius, Auer and 3 others like this.
  26. polonyc2

    polonyc2 [H]ardForum Junkie

    Messages:
    16,367
    Joined:
    Oct 25, 2004
    Magic drivers from Nvidia will improve things... of course the folks that jumped to buy a 2080 Ti on the first day don't want to hear this... do you really think Nvidia would enable RT on older cards if it only played at sub-20 fps?... like the recent Crytek announcement showed, hardware RT was never really needed.
     
  27. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,390
    Joined:
    Jan 28, 2014
    Last I checked, shader and compute cores are still hardware.
     
    GoldenTiger and Auer like this.
  28. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,995
    Joined:
    Oct 13, 2016
    The real test will be seeing how quickly games jump to adopt it now and how well they do it. After playing Metro Exodus with it, I can say that it can definitely add to the game, but it has to be used properly. BFV showed how not to use it. I also believe that NV should just incorporate it as another tab in the control panel and the devs just have an on/off switch in game. On the tab we could have a slider, or manually enter a number, to control how much it's computing.
     
  29. TangledThornz

    TangledThornz Gawd

    Messages:
    633
    Joined:
    Jun 12, 2018
    The next Grand Theft Auto will look amazing if it uses ray-tracing.
     
    Armenius likes this.
  30. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,995
    Joined:
    Oct 13, 2016
    I mostly disagree with this. Sure, the prices sucked. Sure, anything less than a 2080 Ti was near useless for RT in 4K, but for the greatest effect RT will still need a lot of performance-driven hardware, and drivers can only help so much. It's possible NV may want to distance themselves from the RTX moniker in future gens for the sake of publicity, but anyone believing the pricing debacle is going away is deluding themselves. Until AMD or Intel can release something the same year as NV's best for a lower price, they still have no reason to change their pricing path. It's still taking 2 years or more for anyone to catch up to an x80 Ti, and by then NV is usually ready to release whatever's next.
     
  31. Meeho

    Meeho [H]ardness Supreme

    Messages:
    4,303
    Joined:
    Aug 16, 2010
    Extra-extra slow vs. extra slow?
     
    kirbyrj likes this.
  32. DukenukemX

    DukenukemX [H]ardness Supreme

    Messages:
    4,390
    Joined:
    Jan 30, 2005
    I wonder if AMD's superior compute abilities plus working Async Compute may be the reason for this "delayed" release of Nvidia drivers? Technically, if your card supports DX12 then it should support DXR. So what's with the drivers not coming until April? Is Crytek giving Nvidia some time to optimize their drivers so Nvidia doesn't look stupid compared to AMD cards?
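
    Worth noting: DX12 support alone doesn't automatically give you DXR; it's a separate device/driver capability that an application has to query. A minimal check using the standard D3D12 feature-support call (device creation omitted; the SupportsDXR wrapper name is just illustrative):

    Code:
    // Query whether a D3D12 device (driver + GPU) exposes DXR at all.
    // Supporting DX12 alone is not enough; the driver must report a
    // raytracing tier.
    #include <d3d12.h>

    bool SupportsDXR(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false;  // options struct unknown to this runtime/driver
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

    That's presumably what the April driver changes for GTX cards: the driver starts reporting a raytracing tier and implements it on the shader cores.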
     
  33. Shagittarius

    Shagittarius n00b

    Messages:
    63
    Joined:
    May 3, 2016
    For an enthusiast site, it's amazing to see how little people understand about ray tracing, and how quickly they turn to conspiracy theories to convince themselves they are smart for avoiding RTX rather than just admitting the first gen is expensive. As soon as prices normalize, everyone will be screaming about how necessary the RT cores are for ray tracing performance. Be happy there are early adopters ready to pay the price so that one day the masses will be able to get the same great tech for reduced prices. The butthurt is strong in all these threads.
     
  34. MartinX

    MartinX One Hour Martinizing While You Wait

    Messages:
    7,188
    Joined:
    Jan 23, 2003
    I wonder how far we are from having real-time raytracing in games.

    I feel like unless there's a major breakthrough, it'll be about 10 years at least, if we even get there; engineers and developers love to stall at "good enough".
     
  35. Bawjaws

    Bawjaws Limp Gawd

    Messages:
    434
    Joined:
    Feb 20, 2017
    Yep.

    I personally don't care to pay the early-adoption premium for RT at this point, but I will certainly jump on the train a few stops down the line if it does gather pace. But without any early adopters, that won't happen, so fair play to people who are willing to pay for the latest and greatest features even though they haven't reached maturity yet.
     
    GoldenTiger likes this.
  36. Minutemaid

    Minutemaid n00b

    Messages:
    19
    Joined:
    Nov 5, 2011
    And learn how to spell. It’s they’re, not their.
     
  37. magoo

    magoo [H]ardForum Junkie

    Messages:
    14,348
    Joined:
    Oct 21, 2004
    Hahahahaha.......oh wait.......wait........you could ray trace on our 10xx models all the time.......I guess we forgot to tell you.....we lost that driver in the back room and just found it.....

    It just goes to prove.......Half-Life 3 confirmed.
     
  38. Biostud

    Biostud n00b

    Messages:
    62
    Joined:
    Nov 12, 2012
    So for the RTX 3xxx cards, would you prefer that more silicon is used on RT cores, or on shader/compute power?

    Would doubling the performance of the RT cores also result in doubling the fps?
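
    The second question is basically Amdahl's law. A back-of-the-envelope sketch with entirely made-up numbers: assume half of a frame's GPU time is RT-core work.

    Code:
    // Back-of-the-envelope Amdahl's law for the "double the RT cores" question.
    // The frame-time split is hypothetical; the point is that FPS only doubles
    // if the *entire* frame is RT-core-bound.
    #include <cstdio>

    int main() {
        const double frame_ms = 33.3;   // ~30 FPS baseline (made-up)
        const double rt_fraction = 0.5; // assume half the frame is RT-core work
        double rt_ms     = frame_ms * rt_fraction;
        double shader_ms = frame_ms - rt_ms;
        double new_frame = shader_ms + rt_ms / 2.0;  // RT cores twice as fast
        std::printf("old: %.1f FPS, new: %.1f FPS (%.2fx)\n",
                    1000.0 / frame_ms, 1000.0 / new_frame, frame_ms / new_frame);
        // -> old: 30.0 FPS, new: 40.0 FPS (1.33x), not 2x
    }

    So doubling RT core throughput only doubles FPS if the whole frame is RT-bound; otherwise the shader portion caps the gain.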
     
  39. DNMock

    DNMock Limp Gawd

    Messages:
    399
    Joined:
    Apr 16, 2015
    That's what I have been saying as well, but the more I think about it, that should be handled by the CPU now that the core wars have begun between Intel and AMD. Best to probably let those extra CPU cores sitting around doing nothing handle it and let the GPU focus on the graphical bits, IMO.
     
    DukenukemX likes this.
  40. DukenukemX

    DukenukemX [H]ardness Supreme

    Messages:
    4,390
    Joined:
    Jan 30, 2005
    I'd prefer shader/compute power, since that improves both traditional games and RTX games. Also, I'm not even sure if the RT cores do anything beneficial, since turning on ray tracing tanks performance regardless. But I guess we'll see when we can compare Radeons to GeForces whether more compute is a better alternative.
    Doesn't matter, just hand me the popcorn, 'cause this is going to get good.