Crytek Demos Real-Time Ray Traced Reflections Running on an AMD Vega 56

Discussion in 'HardForum Tech News' started by cageymaru, Mar 15, 2019.

  1. HockeyJon

    HockeyJon [H]ard|Gawd

    Messages:
    1,057
    Joined:
    Dec 14, 2014
    Yeah, but let’s see you try to run Space Invaders on that!
     
  2. speedy523

    speedy523 [H]ard|Gawd

    Messages:
    1,192
    Joined:
    Aug 13, 2008
    This is my impression also. I am unsure where the idea that ray-tracing can only be done on RTX cards has come from. RTX is just NVIDIA’s platform name, part of which includes ray-tracing. In their white papers, and even on their blog, NVIDIA has been clear that RT cores simply accelerate parts of the ray-tracing pipeline (which they also state can currently be done using traditional shaders), and to my knowledge they aren't even directly exposed. As long as an application incorporates ray-tracing through APIs like DirectX or Vulkan, NVIDIA's driver (i.e. RTX) will handle the rest.

    In the case of this demo, Crytek has already stated that they will be optimizing their ray-tracing implementation for DX12 (and, I would assume, DXR) and Vulkan, meaning that any rays Cryengine does trace in real time will be accelerated by the RT cores. Depending on what this number is (which I would guess is quite small in this demo video, since it's running in real time on a Vega 56), the speed-up from using an RTX vs. a non-RTX GPU may not be dramatic. However, unless Crytek is doing some magic and somehow actually tracing rays for every pixel, using an RTX card would allow them to either trace more rays at the same performance target, or potentially even path trace (one can dream!).
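    To make the API point concrete, here's a minimal C++ sketch (my own illustration, not anything from the demo) of how an app asks the D3D12 driver whether it implements DXR at all; the same vendor-neutral query runs on any adapter:

        // Query DXR support through plain D3D12 -- no vendor-specific API.
        // Build against the Windows 10 SDK (1809+); link d3d12.lib.
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<ID3D12Device> device;
            // nullptr = default adapter.
            if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                         IID_PPV_ARGS(&device)))) {
                std::printf("No D3D12 device available.\n");
                return 1;
            }
            // The driver reports its raytracing tier here.
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
            if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                      &opts5, sizeof(opts5))) &&
                opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
                std::printf("Driver exposes DXR (tier %d).\n", (int)opts5.RaytracingTier);
            } else {
                std::printf("No driver-level DXR on this adapter.\n");
            }
            return 0;
        }

    On hardware whose driver doesn't implement DXR, the query simply reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED, which is exactly why an engine that wants to stay vendor-agnostic needs its own shader-based path.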
     
  3. DukenukemX

    DukenukemX [H]ardness Supreme

    Messages:
    4,388
    Joined:
    Jan 30, 2005
    I've been waiting for someone to do what Crytek did. What Nvidia has done isn't new; dedicated ray-tracing hardware was tried way back in 2010, and it never took off. The simple reason is that whatever the ASIC could do, a Xeon did nearly as well while also being able to run other software. That's the problem with Nvidia's ray-tracing ASIC: it does nothing but ray tracing.

    Fast-forward a few years, and people have learned to do real-time ray tracing on a GPU through hybrid ray tracing. That is a huge achievement, since GPUs don't have good branch prediction like CPUs do. There is a method (whose name I forget) that can work around this problem, and that's what those Japanese programmers did.
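    For anyone wondering what "hybrid" means here: you rasterize the frame as usual and only trace rays for the effects that need them, like reflections. A toy, self-contained C++ sketch of the idea (purely my own illustration, nothing to do with Crytek's actual code):

        // Hybrid rendering in miniature: take one "rasterized" pixel from a
        // pretend G-buffer, then trace a single reflection ray into a trivial
        // scene (one mirror sphere). Real engines do this in compute shaders.
        #include <cmath>
        #include <cstdio>

        struct Vec3 { float x, y, z; };
        static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec3  scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

        // Reflect direction d about normal n: r = d - 2(d.n)n
        static Vec3 reflect(Vec3 d, Vec3 n) { return sub(d, scale(n, 2.0f * dot(d, n))); }

        // One hard-coded sphere stands in for the whole scene.
        static bool hitSphere(Vec3 o, Vec3 d, Vec3 c, float r) {
            Vec3 oc = sub(o, c);
            float b = dot(oc, d), k = dot(oc, oc) - r * r;
            return b * b - k >= 0.0f && (-b + std::sqrt(b * b - k)) > 0.0f;
        }

        int main() {
            // Pretend these came out of the raster pass's G-buffer.
            Vec3 position = {0, 0, 0};
            Vec3 normal   = {0, 1, 0};
            Vec3 viewDir  = {0.7071f, -0.7071f, 0};  // unit view direction
            float gloss   = 0.8f;                    // 0 = diffuse, 1 = mirror
            float base    = 0.5f;                    // conventional raster shading

            float color = base;
            if (gloss > 0.25f) {  // hybrid step: trace only glossy pixels
                Vec3 r = reflect(viewDir, normal);
                bool hit = hitSphere(position, r, {3, 3, 0}, 1.0f);
                float reflected = hit ? 1.0f : 0.1f;  // sphere bright, sky dim
                color = (1.0f - gloss) * base + gloss * reflected;
            }
            std::printf("final pixel value: %.3f\n", color);  // prints 0.900
            return 0;
        }

    The win is that the ray count scales with how much of the screen is actually reflective, instead of paying full path-tracing cost for every pixel.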

    My thinking is that someone could actually combine CPU+GPU to do what Nvidia does, or better. I don't think that's what Crytek is doing here, as I assume they're using only the GPU. But you've gotta remember that AMD GPUs have really good compute performance, which is why they were heavily favored over Nvidia for crypto mining. Maybe this method works really well on AMD but not so much on Nvidia cards, which may explain why Nvidia put in ASICs.

    You know what we need? We need that Quake 2 Ray Tracing mod to use this method to see how it compares to Nvidia's RTX.

     
    GameLifter likes this.
  4. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,867
    Joined:
    Jun 13, 2003
    Putting dedicated ray-tracing hardware into a GPU for real-time rendering isn't new?

    :ROFLMAO:
     
    DooKey and GoldenTiger like this.
  5. zehoo

    zehoo Limp Gawd

    Messages:
    251
    Joined:
    Aug 22, 2004
    Assuming there is no trickery involved, this is quite impressive. My guess is the video is a sales pitch trying to get devs to license their engine for the next gen of consoles, both of which are probably using Navi.
     
  6. ChadD

    ChadD 2[H]4U

    Messages:
    3,849
    Joined:
    Feb 8, 2016
    Still, no one else has done it either. What Nvidia has done is find uses for tensor-core hardware. That is hardly the same thing.

    Don't get me wrong: if they're putting tensor cores in their GPUs these days for their AI business, it's wise to find game-related uses for them so you're not shipping parts with 20% of the die doing nothing. Still, as many people have said... using tensor hardware for ray casting is smart, but it's probably not the best way to achieve a hybrid ray-traced render anyway. There is a reason Nvidia has to apply a ton of denoising to make their tensor-powered tech work.

    As others have said... bring on the Quake mod spin that uses this method, and let's do some IQ and performance comparisons. (My money is on the GPU-agnostic shader method on both scores... it should be easier to work with and faster.)
     
  7. ChadD

    ChadD 2[H]4U

    Messages:
    3,849
    Joined:
    Feb 8, 2016
    AMD has more than hinted that their next-gen console parts will be capable of hybrid tracing. It's very possible this is the very tech they have been hinting at; MS has always said DX ray tracing could use tensor cores or shaders to do the work.

    Should be an interesting year in GPUs as more Navi stuff leaks out.
     
    EQvet80 likes this.
  8. Hallucinator

    Hallucinator Gawd

    Messages:
    563
    Joined:
    Nov 1, 2006
    Looked mind-blowing in 4K - I liked how everything looks so real.

    RTX?

    Pass.
     
    mashie likes this.
  9. Youn

    Youn [H]ardness Supreme

    Messages:
    5,307
    Joined:
    Jan 22, 2007
    there are no concrete standards for how rendering engines handle physics models, so even the top non-realtime ray tracers all look different, and you've gotta work to make them look similar...

    in other words we're still in the early days of figuring out "technically" accurate rendering, so what "should" be is not a realistic thing to expect
     
  10. knowom

    knowom Limp Gawd

    Messages:
    424
    Joined:
    Aug 15, 2008
    It would be interesting if a game engine were ever explicitly built around MIDI-sequenced pixel rendering. I mean, think of black MIDI and all the note changes it's capable of sequencing, rather than compute-shading individual pixels.
     
  11. gigatexal

    gigatexal [H]ardness Supreme

    Messages:
    7,153
    Joined:
    Jun 22, 2004
    Dayyyuuuum. The Crytek guys are hella competent. This is crazy good. Looks fluid.
     
  12. Youn

    Youn [H]ardness Supreme

    Messages:
    5,307
    Joined:
    Jan 22, 2007
    dude, send your resume to crytek asap, they need your help :D
     
  13. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    Ermmmhh......DXR is hardware agnostic?
    https://en.wikipedia.org/wiki/DirectX_Raytracing

    This is not very useful without comparing the performance of different SKUs.

    Fun fact:
    Both the 1080 Ti (Pascal) and the Nvidia Titan V (Volta) are able to do DXR.

    The 1080 Ti has just about 10% of the DXR performance of the 2080 Ti...so again:

    Being able to do DXR is not the main issue...the performance (and performance gap) is.
     
    GoldenTiger and GoodBoy like this.
  14. N4CR

    N4CR 2[H]4U

    Messages:
    3,657
    Joined:
    Oct 17, 2011
    Nvidia should take notes; that's how you demonstrate ray tracing.
    Less bullshit, more delivery. They allowed FreeSync after years to try to take market share, and now their only other advantage is gone, and it was done on a two-year-old GPU.. good job Crytek.
     
  15. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,867
    Joined:
    Jun 13, 2003
    Ray tracing has been well demonstrated for decades...

    G-Sync was first and is still a superior implementation, and aside from that they still have faster and more efficient GPUs...

    Well yeah, good job on the tech demo video!

    Now let's see it in games, and let's see performance by independent reviewers.
     
    GoldenTiger likes this.
  16. RAutrey

    RAutrey [H]ard|Gawd

    Messages:
    1,605
    Joined:
    Jul 25, 2002
    Every instance of RTX ray tracing feels a bit off to me. It's like every demo or bit of gameplay I have seen is either over-implemented or not noticeable. This didn't feel that way. Very impressive.

    Also, any implementation is cheating unless it computes the photon path of every ray of light as the basis for rendering.
     
    Last edited: Mar 16, 2019
    noko, GoodBoy and N4CR like this.
  17. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    Links to performance differences using DXR between the Radeon VII and the 2080 Ti:

    [attached images: RadeonVII_DXR_2.png, 200_Ti_raytracing.png]
     
    Araxie likes this.
  18. Auer

    Auer Limp Gawd

    Messages:
    509
    Joined:
    Nov 2, 2018
    So it will work on any card. That is great news. I guess, as it stands, it will be a while (again) before enough titles have it to make it worthwhile.

    Nice to see that on a Vega too.

    I don't know enough about RT and how it works. But it seems it will not take anything away from RTX at the moment.
    Not that there is much to take away anyhow.

    This time next year should be interesting.
     
  19. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    More on the primary factor (performance) being ignored here:
     
  20. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,631
    Joined:
    Apr 22, 2006

    DXR is HW agnostic; that is the point of a system API.

    BUT it depends on the hardware vendor providing a DXR driver implementation.

    Which AMD doesn't do. So this demo is definitely not using DXR, but some kind of custom RT implementation using common DX12 features that are already supported.

    Longer term, you can expect most games with RT to use the DXR API.
     
  21. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    You got that one backwards.

    DXR is DirectX Raytracing...what RTX does is provide a dedicated hardware path (ASIC) so that DXR doesn't have to run via the shader cores.
    The shader cores can then be used for rasterization instead of competing with RT for the SAME resources.

    And be careful about saying this is RTX 20x0 level of performance, because you are lacking a vital component:
    This demo's performance on RTX 20x0 hardware.

    Until you have that...no comparisons can be made.
     
    GoodBoy likes this.
  22. Pitbull#2

    Pitbull#2 Limp Gawd

    Messages:
    380
    Joined:
    Mar 23, 2011
    So much for the $1,200-plus vid cards lol. I always thought that developers could do WAY more in-engine than the bullshit RTX crap.
     
    N4CR likes this.
  23. Riccochet

    Riccochet Off Topic Award

    Messages:
    21,501
    Joined:
    Apr 11, 2007
    Nvidia throwing out the bamboozle.
     
  24. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    The level of technical insight in this thread (or rather the lack of it) is sad.
    That DXR works on non-RTX hardware was never a secret?

    The thing that RTX (RT cores + tensor cores) did... was to give a HUGE performance benefit...making it PLAYABLE...unlike the single-digit performance of DXR when running on the shaders.

    Again, until we have performance numbers from different SKUs....NOTHING about this is news.
     
    Araxie and GoldenTiger like this.
  25. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    250
    Joined:
    Oct 23, 2010

    As for talking about ignoring things... non-RTX cards' shaders are performing double duty, whereas the RTX cards have tensor cores to offload the ray-tracing load onto, meaning their shaders are not performing double duty. If you added to a non-RTX card a number of shaders equivalent to the tensor cores on an RTX card, and dedicated those shaders to ray tracing only, I suspect the performance gap would be eliminated, if those shaders didn't outperform the tensor cores outright.
     
  26. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    Compare the performance gaps between a 1080 Ti, an RTX 2080, and a Titan V, and your “theory” falls flat on its face, sorry.
     
  27. euskalzabe

    euskalzabe Gawd

    Messages:
    969
    Joined:
    May 9, 2009
    Wait, ray tracing follows physically based light rendering. That's what I meant when I said physics, light, and math are just that. There's no faking it. You can ray trace with more or fewer rays per pixel, but the results should all look the same if they're calculated with the same number of RPP. Is that not accurate?

    That... is the exact same thing I (and others) have said:

    [screenshot attachment: upload_2019-3-16_9-16-45.png]

    If it's the "translate back to DXR" part that confused you: Nvidia is, after all, a middleman between DX and the gamer, and their driver does precisely that: it translates DX code into Nvidia proprietary code that runs faster and closer to the metal, calculates the results, and translates them back into DX-interpretable code. I mean, that's what all GPUs do.
     
  28. haste.

    haste. [H]ard|Gawd

    Messages:
    1,653
    Joined:
    Nov 11, 2011
    Crytek does some amazing things with software, but it was my understanding that few are using Cryengine. I thought there had been a migration to UE basically across the board, due to its ease of development and lower cost...

    That's a solid demo, and solid performance on a Vega 56. The Nvidia haters are still holding onto the ohhh-shiny bit when it comes to RT, but this is agnostic, so we can stop that. The reflections and lighting really help with immersion, and what they put together looked pretty dang good.
     
    MBTP likes this.
  29. Uvaman2

    Uvaman2 2[H]4U

    Messages:
    3,010
    Joined:
    Jan 4, 2016
    So, conclusion: Taytracing should be add-on cards. AMD should team up with the Taytracing company, whatchamacallit, and release one, as well as a chiplet to be included in future Ryzen releases.
    ....PowerVR is the one.. Wonder if they are any good at RT.
     
    Last edited: Mar 16, 2019
  30. Aireoth

    Aireoth 2[H]4U

    Messages:
    2,433
    Joined:
    Oct 12, 2005
    Is that some kind of Taylor Swift tracking app? :p

    On topic, happy to see more DXR, it can only mean good things.
     
  31. euskalzabe

    euskalzabe Gawd

    Messages:
    969
    Joined:
    May 9, 2009
    Many have made the jump, yes. That's probably why Crytek is trying to show off, to capture clients. We'll probably see a similar update for UE in the next few months. Epic wants you to use Epic tools, not Nvidia's. Any engine maker will support other companies' tech when they absolutely have to, otherwise, they'll develop their own integrated code.
     
  32. haste.

    haste. [H]ard|Gawd

    Messages:
    1,653
    Joined:
    Nov 11, 2011
    At worst it's a pretty good demonstration of their tech, all things considered, but I wonder if they have the funds, like Epic does, to push it. The devs and engineers at Crytek are doing amazing work on the budget they have, but Epic seems to be trying to bury them, and you're right, they will likely attempt to copy it.
     
  33. MBTP

    MBTP n00b

    Messages:
    18
    Joined:
    Aug 29, 2018
  34. odditory

    odditory [H]ardness Supreme

    Messages:
    5,283
    Joined:
    Dec 23, 2007
    A canned demo instead of actual games? No. That's not delivery of anything.

    If all Nvidia did was put out a demo, then the price-butthurt brigade would really be losing their minds.
     
    Last edited: Mar 16, 2019
  35. Factum

    Factum [H]ard|Gawd

    Messages:
    1,631
    Joined:
    Dec 24, 2014
    It's the most efficient way (performance-wise) of doing it...feel free to offer an alternative that isn't wishful thinking?
     
  36. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,631
    Joined:
    Apr 22, 2006
    How are they even testing in DXR when, AFAIK, you need vendor drivers that support DXR and AMD doesn't provide any?
     
  37. Patton187

    Patton187 Gawd

    Messages:
    670
    Joined:
    Feb 12, 2012
    Don't know about the RTX stuff, but that demo is sweet.
     
  38. socK

    socK 2[H]4U

    Messages:
    3,654
    Joined:
    Jan 25, 2004
    I assume there's a reference or fallback layer.

    When the DX12 SDK first launched, I'm pretty sure I remember using the WARP device temporarily. It was a pre-release version or something, the API wasn't even finalized yet, and I could still develop without supported hardware/drivers.
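    For reference, the WARP path still exists in plain D3D12; spinning up a device on the software adapter is only a few lines (a generic sketch, nothing DXR-specific). Microsoft also published a compute-based raytracing fallback layer alongside the early DXR samples, though as far as I know it was later deprecated.

        // Create a D3D12 device on WARP, the CPU software rasterizer, so
        // code runs without any vendor GPU driver at all (just slowly).
        // Link d3d12.lib and dxgi.lib.
        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

            // Ask DXGI for the software (WARP) adapter instead of a GPU.
            ComPtr<IDXGIAdapter> warp;
            if (FAILED(factory->EnumWarpAdapter(IID_PPV_ARGS(&warp)))) return 1;

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(warp.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device)))) {
                std::printf("WARP device created.\n");
            }
            return 0;
        }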
     
  39. Youn

    Youn [H]ardness Supreme

    Messages:
    5,307
    Joined:
    Jan 22, 2007
    I think no. This is just based on my use of ray tracers for arch viz over 15 years. They may use the same basic fundamental theories, but the light path models, materials, atmospherics, cameras, tonemapping... all have variations which result in the final pixels looking different. Most of the time the difference is due to the engineers trying to find ways to cut render times. I've even talked to some of the developers, and they say "close enough in realism, but saves 10x the time, so it's worth it".
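    As a trivial example of that divergence: two renderers can trace identical radiance values and still write different pixels purely because of the tonemapping stage (a toy C++ sketch with made-up numbers):

        // Same "ray traced" radiance in, different final pixel out,
        // depending only on which tonemapping curve the engine picked.
        #include <algorithm>
        #include <cstdio>

        static float reinhard(float x) { return x / (1.0f + x); }    // soft rolloff
        static float clamped(float x)  { return std::min(x, 1.0f); } // hard clip

        int main() {
            const float radiance[] = {0.25f, 1.0f, 4.0f};
            for (float L : radiance)
                std::printf("L=%.2f  reinhard=%.3f  clamp=%.3f\n",
                            L, reinhard(L), clamped(L));
            return 0;
        }

    And that's just one stage; material models, sampling shortcuts, and camera response curves all stack on top of it.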
     
    Last edited: Mar 16, 2019
    dangerouseddy and euskalzabe like this.
  40. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    250
    Joined:
    Oct 23, 2010
    ?? How so? I am unaware that the 1080 Ti/2080/Titan V have dedicated shaders just for ray tracing; they are all still performing double duty if the tensor cores are not used (2080/Titan V). You would also need drivers to support such a function, and none exist. I am also unaware that there are any 2080s or Titan Vs without tensor cores. Not to mention the FP16/FP32 differences between the 1080 Ti and the 2080/Titan V.
     
    Last edited: Mar 17, 2019