DICE Made Nvidia’s Ray Traced Gaming Dreams a Reality in Just Eight Months

Discussion in 'HardForum Tech News' started by DooKey, Aug 24, 2018.

  1. DooKey

    DooKey [H]ard DCOTM x4

    Messages:
    7,859
    Joined:
    Apr 25, 2001
    DICE technical director Christian Holmquist says they began working on ray tracing for Battlefield 5 late last year, with early drivers and no hardware. That's just eight months ago, which is pretty quick when you consider how new this is for PC gaming. He also says they went very wide with their approach to offload the work: they designed with 12 hardware threads in mind, which will be ideal, but higher-clocked eight-thread machines might work as well. Regardless, the fact that they were able to get DXR working within eight months is quite the accomplishment and really does bode well for adoption of DXR in future games. I can't wait to see what other developers bring to the table over the next couple of years.

    “We haven’t communicated any of the specs yet so they might change, but I think that a six-core machine – it doesn’t have to be aggressively clocked – but 12 hardware threads is what we kind of designed it for. But it might also work well on a higher clocked eight thread machine.”
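    Going "wide" presumably just means fanning the per-frame CPU-side ray-tracing work (acceleration-structure refits, command-list building, and so on) out across a pool of worker threads. A toy sketch of that idea in Python — all names here are hypothetical illustrations, not DICE's actual code:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical per-frame CPU-side work for a ray-traced frame: each
    # "chunk" might be a batch of scene objects whose acceleration
    # structures need refitting before the GPU can trace against them.
    def refit_chunk(chunk):
        # Stand-in for real work: pretend each object costs one unit.
        return sum(1 for _obj in chunk)

    def prepare_frame(objects, workers=12):
        # Split the scene into one chunk per hardware thread and fan out.
        chunks = [objects[i::workers] for i in range(workers)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(refit_chunk, chunks))

    print(prepare_frame(list(range(10_000))))  # → 10000
    ```

    Designed like this, the work scales down gracefully too: on an eight-thread machine the same chunks just queue up behind fewer workers, which matches Holmquist's "might also work well on a higher clocked eight thread machine".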
     
    Archaea likes this.
  2. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,258
    Joined:
    Sep 25, 2004
    "Just" eight months to implement a bolt-on graphics feature to the point where it's functional but low performance kinda seems like a heavy lift to me.

    “It’s been really fun,” DICE technical director Christian Holmquist told me, “but it’s also been a lot of work.”

    I think we're a long way from significant adoption.
     
    viper1152012 likes this.
  3. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    I would guess it would be easier to implement in a new game using an engine that already has full support, like Unreal Engine. I could be mistaken, but it seems like adding it to an existing game, in an engine that didn't yet support it, would in fact take longer. Also, 8 months in game-development terms really isn't all that long.

    I think this is more of a positive thing. I can't really see anything wrong with what they've done. I'm sure it will become more optimized, maybe use fewer threads, but as a first attempt that's not too bad. CPUs are getting more and more threaded. If you can afford an RTX 20xx card, you can afford a processor with 12 threads (if you don't already have one).
     
  4. katanaD

    katanaD [H]ard|Gawd

    Messages:
    1,987
    Joined:
    Nov 15, 2016
    interesting read.
     
  5. Vega

    Vega [H]ardness Supreme

    Messages:
    6,052
    Joined:
    Oct 12, 2004
    So basically he is saying a 12 core CPU would be ideal. We all know real CPU cores are far superior to hyper-threading.
     
  6. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    I would always prefer real cores as well. Who knows though, maybe HT does in fact speed up these particular processes a bit over just having 6-8 physical cores. It would be interesting to see several scenarios tested.

    Given the same amount of physical cores, I prefer the addition of HT in virtualization applications. There are some instances where it helps a bit.
     
    Sulphademus and Armenius like this.
  7. lollerwaffle

    lollerwaffle Gawd

    Messages:
    666
    Joined:
    Feb 3, 2008
    Ryzen 16 core it is, 2019 get here faster!
     
    Derfman, J3RK and alxlwson like this.
  8. alxlwson

    alxlwson You Know Where I Live

    Messages:
    5,776
    Joined:
    Aug 25, 2013
    AMD- The way it's mea.....wait

    AMD- Because Fuck Intel and BF(O'Doyle) rulez
     
  9. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    I'm still running i5s in all my systems right now, but my next build will definitely be 12-16 core, and Ryzen appears to be the choice I'll be making. Right now it wouldn't make much of a difference for my work/play-loads, but soon it will. I'm thinking a nice Ryzen+2080 build will be in order in the next few months.
     
  10. TAP

    TAP Limp Gawd

    Messages:
    236
    Joined:
    Mar 29, 2016
    Considering how far out of reach ray tracing has been for a long time - I am VERY optimistic. Gotta start somewhere.
     
  11. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,582
    Joined:
    Jun 13, 2003
    Well, six cores with SMT is what he's saying; the 'hardware threads' part seems to be a bit confusing. Hyper-Threading/SMT is still 'hardware' from their perspective, I guess.

    Really depends on what's going on, as the actual crunching should be on the GPU; if the additional threads are just shuffling stuff around then HT would be perfect.

    Always prefer HT regardless of the number of physical cores :).
     
    J3RK likes this.
  12. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    Yeah, I was thinking they'd just use all those threads to schedule others or something like that. I'm sure some CPU intervention is needed for some of this stuff on the engine side, even though the GPU is doing the major calculations. You still have to fit that into what the rest of the game is doing. It sounds kind of brute-forced, but since they didn't have hardware, maybe that's the only way they could do it?
     
    IdiotInCharge and DPI like this.
  13. DPI

    DPI Nitpick Police

    Messages:
    10,956
    Joined:
    Apr 20, 2013
    Has to start somewhere. That's the whole point.
     
  14. Virtual_Bomber

    Virtual_Bomber Limp Gawd

    Messages:
    141
    Joined:
    Jan 20, 2017
    Well, it does say they went very "wide" to offload the work.

    Probably made it that way as a fail-safe, I guess, to get it working since they didn't have the actual hardware. Kind of like "regardless of what the hardware can specifically do, this will work".

    That's what I took from it anyway.
     
    J3RK likes this.
  15. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    Couldn't we have one thread without that? o_O
     
    IdiotInCharge likes this.
  16. dgz

    dgz [H]ardness Supreme

    Messages:
    5,081
    Joined:
    Feb 15, 2010
     
    AlphaQup, lollerwaffle and Derfman like this.
  17. Derfman

    Derfman [H]ard|Gawd

    Messages:
    1,216
    Joined:
    Jan 12, 2007
    So another article confirming 60fps at 1080p. Once again I'm sure some here will take issue with this level of performance, but I'm still impressed.
     
    J3RK likes this.
  18. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,582
    Joined:
    Jun 13, 2003
    I'm impressed that they're able to do it at all. That's real-time ray-tracing in real game engines in real games.

    And 1080p60 today means that 4K120 isn't that far off, given how quickly graphics processing speed increases, especially with process shrinks.
     
    trandoanhung1991, Derfman and J3RK like this.
  19. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    1080/60 is just fine for me. My only stipulation is that I want the game to maintain that. So I hope when they say that, they're implying a bit of headroom. Still though, it's impressive.
     
  20. Vega

    Vega [H]ardness Supreme

    Messages:
    6,052
    Joined:
    Oct 12, 2004
    Uhh, 4K120 is eight times the demand of 1080/60.
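    That 8x figure is just pixel count times frame rate — 4K has four times the pixels of 1080p, and 120Hz is twice 60Hz:

    ```python
    # 3840x2160 is exactly 4x the pixels of 1920x1080,
    # and 120 Hz is 2x the frame rate of 60 Hz: 4 * 2 = 8.
    pixels_1080p = 1920 * 1080   # 2,073,600 pixels
    pixels_4k    = 3840 * 2160   # 8,294,400 pixels

    work_ratio = (pixels_4k * 120) / (pixels_1080p * 60)
    print(work_ratio)  # → 8.0
    ```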
     
  21. katanaD

    katanaD [H]ard|Gawd

    Messages:
    1,987
    Joined:
    Nov 15, 2016

    Well... there was also this little bit at the very end of the article, so I am wondering if the RTX 2080 will do the 1080p and the 2080 Ti will do 1440p.

    If so, that would make the Ti more appealing to some, even with the high price.
     
  22. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,582
    Joined:
    Jun 13, 2003
    Well yeah!

    Nvidia has a lot of work to do ;)
     
    J3RK likes this.
  23. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,582
    Joined:
    Jun 13, 2003
    1440p is enough- the 2000-series is going to make 4k60+ accessible at the higher detail levels, but I'm fine turning up the details and hitting ~60FPS at 1440p with G-Sync in some games.

    Maybe I'd do 'campaign' stuff in BFV like that, then turn off RTX for the multi-player, etc.
     
  24. Joust

    Joust 2[H]4U

    Messages:
    2,610
    Joined:
    Nov 30, 2017
    Turn *OFF* RTX?

    ...

    You, sir, must not weigh more than a duck.
     
    alxlwson and honegod like this.
  25. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    27,710
    Joined:
    Oct 29, 2000

    It shouldn't be surprising that new, higher graphics quality has a performance hit.

    This is literally the way it has been as long as there has been PC gaming.

    The question is, is the improvement enough to warrant the performance impact, or is everyone going to be playing with RTX off, even if their hardware supports it?
     
    Peter2k likes this.
  26. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    Yep, reminds me of the GeForce 256 SDR. It wasn't quite powerful enough for the games it was intended for. The DDR version was an improvement, but the GeForce 2 GTS was where the real performance began for the T&L generation. To be honest though, I think these RTX cards are a little better off than some of the classic examples of this sort of thing. They're still (most likely) very formidable for non-RTX rendering, so the ray tracing is a bonus, and a taste of things to come on top of good base performance.
     
  27. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    27,710
    Joined:
    Oct 29, 2000

    Ah, that brings me back.

    I never had a GeForce SDR or DDR though. I went from being a poor high-school student still using my Voodoo 1 and pre-MMX 150MHz Pentium (@200) until after my first summer job between my freshman and sophomore years in college, when I had some money and upgraded to my Duron 650 (@950) and GeForce 2 GTS. Those were the days.

    It was funny though. My freshman year we had some epic Quake 2 LAN battles in my dorm, and none of the other kids could wrap their heads around how my old ghetto Pentium 1 with a Voodoo 1 was running circles around the brand-new box-store computers their parents had just bought them for college.
     
    Peter2k and J3RK like this.
  28. DNMock

    DNMock Limp Gawd

    Messages:
    399
    Joined:
    Apr 16, 2015
    Isn't the ray tracing handled exclusively on the tensor cores though? Not like it's eating up cuda cores to run it so it would stand to reason that with some optimizations you would be able to enable ray tracing without any loss in performance vs when it's not enabled.

    I'm sure for a few years it will need to be limited to specific shadow casting lights and screen space reflections or whatever until things can get caught up and games can be built from the ground up with it.

    That is of course, unless Nvidia pulls a rabbit out of the hat with NVLink, in which case, fuck that, give me ray-traced everything for my quad-NVLink set-up and let the console peasants be blinded by its glory!
     
    honegod likes this.
  29. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    27,710
    Joined:
    Oct 29, 2000

    I could be wrong, as I am not well read on Tensor cores, but the way I interpreted what I have seen to date is not that Tensor cores are added in addition to the traditional CUDA cores, but rather that they replace the traditional CUDA cores. The way I read it, the Tensor cores are a new, more efficient version of CUDA cores able to share data in a 3D matrix. So I don't think having something "run on the Tensor cores" means you also have CUDA cores sitting idle. I think they are the same thing, just that the new Tensor cores are more efficient than the old CUDA cores and can thus do more.

    I'd wager it is even possible to run this raytracing on older CUDA cores, but probably with a MUCH larger performance hit than on the newer tensor cores, and because of this (and because they want you to buy new hardware) Nvidia likely won't enable it.

    Again, I could be completely wrong here. I just haven't seen a deep enough dive on the tech yet, but this is the impression I was left with.
     
  30. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    The beauty of building it yourself. :cool:

    I think my Pentium 166 MMX was the first chip I ever overclocked. I don't think I pushed it hard. Just to 200 or something like that. I knew people who OCed 486 chips, but I never wanted to mess with that back then.

    You were lucky with that Voodoo card. I think I was running a Mistake in my 166. I don't think I had anything good until my P2-233. Then I added an M3D, followed shortly after by a Voodoo1. I did get that 233 to run at 300 though, and shortly after picked up a Riva 128 AGP card for my host card (which was fun to experiment with in Quake 2). I ran a V2 on that same system until the PII-SL2W8. I think that's around when Unreal came out.
     
    Zarathustra[H] likes this.
  31. Ghoststalker

    Ghoststalker Limp Gawd

    Messages:
    444
    Joined:
    Jun 4, 2001
    There are CUDA, ray tracing, and tensor cores, all specialized in different ways. Tensor is for the deep learning AI.
     
  32. _l_

    _l_ I Am A Cock

    Messages:
    1,151
    Joined:
    Nov 27, 2016
    "You're so right." (out comes the light saber ... woooosh)
     
  33. J3RK

    J3RK [H]ardForum Junkie

    Messages:
    9,042
    Joined:
    Jun 25, 2004
    What did you accomplish in the last 8 months? :p
     
  34. polonyc2

    polonyc2 [H]ardForum Junkie

    Messages:
    16,316
    Joined:
    Oct 25, 2004
    rush job...ray-tracing will be better when games are built from the ground up with it...not added in as a $$ grab
     
  35. _l_

    _l_ I Am A Cock

    Messages:
    1,151
    Joined:
    Nov 27, 2016
    That's not the point. The point is that R&D is still ongoing and improving it, yet people are paying $$$ for it as though it's a done deal already - like what happened with PhysX.
     
  36. Zarathustra[H]

    Zarathustra[H] Official Forum Curmudgeon

    Messages:
    27,710
    Joined:
    Oct 29, 2000

    LOL@Mistake :p
     
    J3RK and _l_ like this.
  37. dvsman

    dvsman 2[H]4U

    Messages:
    2,582
    Joined:
    Dec 2, 2009
    Didn't the latest Intel CPU vulnerability patch say to disable SMT? Security or speed - take your pick!
     
  38. SCSI-Terminator

    SCSI-Terminator Gawd

    Messages:
    1,013
    Joined:
    Apr 15, 2003
    From my understanding, the tensor cores are used either for DLSS (Nvidia's new AI-based AA) or for AI de-noising of the ray-tracing samples (look at Nvidia's demo videos where they show 1spp noise vs. ground truth, showing what's fed into the AI algorithms to get the final output).

    So yes, the tensor cores are used for ray tracing, but they're only involved in one part of the algorithm.
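    The 1spp-vs-ground-truth demos boil down to: trace one noisy sample per pixel, then let a trained filter reconstruct a clean image. As a purely illustrative stand-in (Nvidia's actual denoiser is a neural network, not this), even a crude box filter shows the idea of recovering a smooth value from noisy per-pixel samples:

    ```python
    def box_denoise(img, radius=1):
        """Average each pixel with its neighbours - a crude stand-in for
        the learned denoiser applied to 1-sample-per-pixel output."""
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                vals = [img[ny][nx]
                        for ny in range(max(0, y - radius), min(h, y + radius + 1))
                        for nx in range(max(0, x - radius), min(w, x + radius + 1))]
                out[y][x] = sum(vals) / len(vals)
        return out

    # A "noisy" 3x3 image whose true value is 0.5 everywhere:
    noisy = [[0.0, 1.0, 0.5],
             [1.0, 0.0, 0.5],
             [0.5, 0.5, 0.5]]
    print(box_denoise(noisy)[1][1])  # centre pixel: mean of all nine = 0.5
    ```

    The learned version does far better than this by using extra inputs (normals, depth, motion vectors) to avoid blurring edges, which is why it needs dedicated tensor hardware to run in a frame budget.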
     
  39. chameleoneel

    chameleoneel 2[H]4U

    Messages:
    2,816
    Joined:
    Aug 15, 2005
    It doesn't seem like a rush job to me. The lighting is impressive.

    The problem is that, based on the comparison video, it seems like they obviously held back standard effects - most evident in those PS2-quality cube-map window reflections. But lots of stuff just seems to be missing that could be there.