What if Nvidia made a RT accelerator card?

Discussion in 'Video Cards' started by Stoly, Mar 13, 2019.

  1. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,183
    Joined:
    Jul 26, 2005
By now it's clearly apparent that ray tracing is a HUGE performance killer.

    It seems to me nvidia needs to at least double the RTX/Tensor performance to bring it to acceptable levels.

    But even with a die shrink, there may not be enough space available to cram enough RTX/Tensor cores.

So I was thinking: why not an RT accelerator card?

About a third of Turing's die space is used (wasted?) on RTX/Tensor cores, so a separate RT card would have several advantages:


1. Turing would either have more room for extra CUDA cores or could be made smaller and therefore cheaper.
    2. People wouldn't have to pay for features they don't want.
    3. People who do want RT could have the performance they need.
    4. Nvidia could cram in at least twice the RTX/Tensor cores for a much-needed RT performance increase, and still be smaller and cheaper.
    5. You could SLI RT cards for even more performance.

While this may not be very attractive to gamers, I'm sure 3D content creators would jump on it right away.
     
  2. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
  3. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,178
    Joined:
    Mar 22, 2008
    It wouldn't work, for the same reason that dedicated PhysX cards failed.

Developers won't spend the time/money supporting a niche market, just as they haven't really supported mGPU at all (with a few exceptions like Tomb Raider).
     
    Armenius, kasakka, reaper12 and 4 others like this.
  4. CAD4466HK

    CAD4466HK [H]ard|Gawd

    Messages:
    1,082
    Joined:
    Jul 24, 2008
Ask NovodeX about their PPU PhysX... no, ask Ageia and BFG about their PhysX accelerator... no, ask Nvidia what happened to GPU PhysX.

    But seriously, if they were to make one, it would be obsolete in a few years, because if the tech takes off it will be in GPUs, not standalone cards. Who wants to pay for 2 pieces of hardware when you can buy 1?

    Now, if it had started off as a standalone RT accelerator.....
    My 2¢.
     
    reaper12 and jmilcher like this.
  5. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,178
    Joined:
    Mar 22, 2008
Maybe a better idea: what if Nvidia dropped rasterization completely and made a ray-trace *only* card?
     
    {NG}Fidel, Armenius and XoR_ like this.
  6. XoR_

    XoR_ Gawd

    Messages:
    628
    Joined:
    Jan 18, 2016
It wouldn't work, because you need tight integration of the RT cores with the GPU itself.
    Besides, RT in games is slow not because these cards cannot calculate enough ray intersections, or because they cannot do denoising fast enough, but because each ray that hits a surface needs to run a shader program, and there simply aren't enough CUDA cores, so there isn't enough shader performance. The same thing so many people complain is lacking on these cards for rasterization...

    In the case of BF5, for example, you first rasterize the whole scene, which is already ridiculously shader heavy, and then on top of that you add tons of additional shader workload. No wonder performance drops like a rock.
    This is also why the Titan V performs so well in this game: there simply isn't that much ray-intersection calculation needed.

    So NV could have added far fewer RT cores with little performance impact. Why didn't they? Simple: they needed to make it this way (one RT core per CUDA block) because the RT core is large, and it would have driven complexity through the roof to make some blocks with these cores and some without, both at the hardware level and the driver level. Also, the idea is to eventually abandon rasterization completely and use path tracing to render everything, so they are actually very forward thinking.
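The shader-bound argument above can be put in rough numbers with a toy cost model. All figures here are made up purely for illustration (hypothetical intersection rate, shading rate, and per-hit cost, not Turing specs):

```python
# Toy cost model for hybrid rendering: a rasterization pass plus an
# RT pass whose ray hits spawn shading work on the CUDA cores.
# ALL numbers below are hypothetical, for illustration only.

def frame_time_ms(rays, flops_per_hit, intersect_rate, shade_rate, raster_ms):
    """Frame time when the RT pass is bounded by the slower of
    ray intersection (RT cores) and hit shading (CUDA cores)."""
    intersect_ms = rays / intersect_rate * 1000.0
    shade_ms = rays * flops_per_hit / shade_rate * 1000.0
    return raster_ms + max(intersect_ms, shade_ms)

# Hypothetical card: 10 Grays/s intersection, 13 TFLOPS shading,
# a 10 ms raster pass, 10M rays/frame, 5000 FLOPs to shade one hit.
base = frame_time_ms(10e6, 5000, 10e9, 13e12, 10.0)
doubled_rt = frame_time_ms(10e6, 5000, 20e9, 13e12, 10.0)  # 2x RT cores
# Shading dominates (~3.8 ms vs ~1 ms of intersection), so doubling
# intersection throughput does not change the frame time at all.
```

Under these made-up numbers, `base` and `doubled_rt` come out identical, which is the point: when per-hit shading is the bottleneck, extra RT cores buy nothing.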

    Does this answer and put to rest your ideas once and for all?
     
    Last edited: Mar 13, 2019
    Rizen likes this.
  7. XoR_

    XoR_ Gawd

    Messages:
    628
    Joined:
    Jan 18, 2016
Similar to my previous post: it wouldn't make much sense, because RT is mostly shader based.
    Removing the rasterization parts wouldn't make much sense either, because GPUs are mostly tons of shaders plus some other logic, here ROPs for rasterization. It may even be that this hardware is used to speed up RT. The process of loading geometry for RT requires a lot of calculation on the GPU side, and if they could somehow use existing resources to speed that up, they surely did.

    Also, at this time an RT-only card wouldn't sell to gamers at all, and removing everything a pure RT card doesn't need would probably not free up much space.

    It is amazing that some people cannot imagine and accept that NV actually knows what it is doing... they are not AMD :ROFLMAO:
     
    GoldenTiger and cybereality like this.
  8. XoR_

    XoR_ Gawd

    Messages:
    628
    Joined:
    Jan 18, 2016
Actually, a separate card for PhysX was a good, working idea, and if it had taken off it could have been developed into a separate core type inside GPUs, or a separate chip on the package, or even something mounted on motherboards...

    NV bought Ageia, killed off real PhysX, which had more effects than their GPU PhysX, and then ran the physics calculations on shaders, which completely tanked shader performance... and people complained that it reduced their framerates, and about the exclusivity of PhysX to NV cards...

    So comparing RTX to PhysX is stoo... silly :)
     
  9. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,178
    Joined:
    Mar 22, 2008
    Well ray-tracing at launch tanked performance (in BFV) and also only works on Nvidia cards. Maybe there are parallels.
     
  10. XoR_

    XoR_ Gawd

    Messages:
    628
    Joined:
    Jan 18, 2016
    There are four types of people:
    1. do not know but think they know and yell
    2. do not know and do not yell because they know they do not know
    3. do not know, then do research and when they know a lot start yelling
    4. do not know, then do research and do even more research to be sure and then explain what they know when they see that what they know is enough to be helpful despite not knowing everything yet

I try my hardest to be only the 4th type: an Arhat, one who has gained insight into the true nature of existence and has achieved (technological) nirvana :>

    Comparing RTX to PhysX on this scale is level 0 XD
     
    Armenius likes this.
  11. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
I wasn’t implying they are similar technologies. A standalone card for a not-yet-implemented or unsupported feature has happened once before. And it failed.
     
  12. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
And then you have people who think that the more they type, the better the point they are trying to make.
     
    Rifter0876, {NG}Fidel and RamboZombie like this.
  13. RAutrey

    RAutrey [H]ard|Gawd

    Messages:
    1,599
    Joined:
    Jul 25, 2002
    So there are really 5 types of people? :)
     
    {NG}Fidel, RamboZombie and jmilcher like this.
  14. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,178
    Joined:
    Mar 22, 2008
    XoR_ I love ray-tracing and I want it to succeed. I think eventually in some form it will, since it's basically the Holy Grail of computer graphics.

    The point I was making was currently it is an Nvidia only feature, and that doesn't bode well for wide market adoption.

    You can look at any of those Nvidia features over the years, which still work okay and get some base level of support but aren't a focus for developers: PhysX, SLI, Surround, 3D Vision, etc.

    I know Microsoft has implemented ray-tracing in DirectX12 and there is a Vulkan extension, but we really need it on all cards from all vendors if it's going to last.
     
    reaper12 likes this.
  15. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
    Nvidia has a history of introducing things that do not succeed.
     
  16. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    20,988
    Joined:
    Sep 13, 2008

should see it with Navi on the AMD side.. I still think ray tracing has another 2-3 years before it's even considered a viable graphics option, but at least we're finally getting there.
     
    cybereality likes this.
  17. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,183
    Joined:
    Jul 26, 2005
    For example? And please don't say physx
     
    Armenius likes this.
  18. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,183
    Joined:
    Jul 26, 2005
    At least we are closer than Larrabee ever was :D:D
     
    sirmonkey1985 likes this.
  19. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,261
    Joined:
    Mar 23, 2012
    3D Vision
    G84
    Fermi
    Gameworks
    nForce
    Very soon to be DLSS

    Just off the top of my head...
     
    RamboZombie and jmilcher like this.
  20. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
     
    {NG}Fidel and cybereality like this.
  21. jmilcher

    jmilcher 2[H]4U

    Messages:
    4,066
    Joined:
    Feb 3, 2008
    Especially if Intel really enters the discrete gpu market!
     
    {NG}Fidel likes this.
  22. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,261
    Joined:
    Mar 23, 2012
    Also for the OP:

    Horrible idea. Just no.
     
    Armenius likes this.
  23. XoR_

    XoR_ Gawd

    Messages:
    628
    Joined:
    Jan 18, 2016
No, only four.
    Then you have the masses of mindless zombies, trolls, and AI-generated comments.

    DLSS is actually a very good example of an NV feature that will soon replace GPU PhysX as the NV-locked feature people make fun of XD

    Ray tracing is going to succeed eventually, in this or some other form. It is kinda like tessellation, but this time with MS support right from the start, which makes me more optimistic about wider adoption.

    AMD could also work on DXR-compliant hardware, but the reality is that they are at least 2 years behind NV in GPU development, so they simply cannot afford it at this time. I hope they will support it eventually.

    And I really hope the PlayStation 5 will support some form of ray tracing.
     
    GoldenTiger likes this.
  24. Brackle

    Brackle Old Timer

    Messages:
    7,257
    Joined:
    Jun 19, 2003
I would add GeForce Experience. But that's not a feature, more like bloatware imo.
     
    {NG}Fidel and Brian_B like this.
  25. RAutrey

    RAutrey [H]ard|Gawd

    Messages:
    1,599
    Joined:
    Jul 25, 2002
I think you have to look at RTX in general as a BETA at the moment. Nvidia needed to get it into the hands of developers and consumers for data collection and optimization purposes prior to a node shrink. Getting the ball rolling on adoption was also important. I think RTX at 7nm is going to be pretty awesome.
     
    GoldenTiger likes this.
  26. RamboZombie

    RamboZombie n00b

    Messages:
    53
    Joined:
    Jul 11, 2018

    Let's just agree to disagree about that one...
     
    {NG}Fidel likes this.
  27. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,394
    Joined:
    Jun 13, 2003
[On ray tracing] More or less- it shows massive improvements with limited implementations, i.e. those that stop well short of 'full ray tracing'. The big thing is that the improvements are real, the hardware is real, every engine is tackling it, and AMD and Intel are both on board with respect to hardware support.

    [On 3D Vision] ...was the only way to roll. It's only 'dead' because it has been replaced by VR, which is superior but wasn't at all possible during the time that 3D Vision was in use.

    [On G84] You got me.

    [On Fermi] Ran a bit hot, and was the last time Nvidia pushed compute-focused GPUs down into mainstream tiers. It was also wildly successful, and AMD's efforts at the time were about average. Hard to complain, really.

    [On Gameworks] Works extremely well, and is done in the API, so it works on other vendors' hardware too.

    [On nForce] Was brilliant, but was essentially killed off by Intel 'closing' their chipset compatibility. Still probably the best audio implementation until HDMI came along and obviated the need for DD5.1 encoding.

    [On DLSS] I only see DLSS getting better, and it already works very well. I expect it will transition toward something we use all the time, particularly with super-high-resolution display outputs like VR, where consistency is more important than peripheral detail.


As for the OP: what an 'RT accelerator' card misses is that such a solution adds a level of latency to the rendering pipeline that is simply unacceptable for real-time graphics. Think SLI, but even worse than the poorest SLI modes. SLI can still be relevant, but an RT accelerator would require another level of interconnect, because you're not just sharing per-frame data but per-pixel data.
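The interconnect point above can be sketched as a back-of-envelope calculation. The bandwidth and per-pixel figures here are ballpark assumptions for illustration, not measurements:

```python
# Back-of-envelope: cost of shipping per-pixel data between the main GPU
# and a hypothetical external RT accelerator every frame.
# Assumed figures (not measurements): ~16 GB/s usable on a PCIe 3.0 x16
# link, a 4K frame, and 32 bytes per pixel each way (scene/G-buffer data
# out to the accelerator, ray results back).

def transfer_ms(width, height, bytes_per_pixel_each_way, bus_bytes_per_s=16e9):
    pixels = width * height
    total_bytes = pixels * bytes_per_pixel_each_way * 2  # out and back
    return total_bytes / bus_bytes_per_s * 1000.0

t = transfer_ms(3840, 2160, 32)
# Roughly 33 ms of pure transfer time per frame, which by itself blows
# the entire 16.7 ms budget of a 60 fps game, before counting any
# round-trip latency or synchronization overhead at all.
```

Under these assumptions the bus traffic alone is fatal, which is why sharing per-pixel data across an add-in-card boundary is so much worse than SLI's per-frame sharing.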
     
    GoldenTiger, Montu and Armenius like this.
  28. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,183
    Joined:
    Jul 26, 2005
You can thank 3D Vision for contributing to the development of VR.
    G84: one of Nvidia's best sellers.
    Fermi: well, maybe the GTX 480, but Nvidia quickly reacted with the GTX 580.
    Gameworks: the most popular developer middleware.
    nForce: I guess you mean the original nForce, but Nvidia struck back with a vengeance with nForce 2.
    DLSS: OK, I'll give you this one, but I really hope Nvidia delivers eventually.
     
    GoldenTiger and Armenius like this.
  29. reaper12

    reaper12 2[H]4U

    Messages:
    2,192
    Joined:
    Oct 21, 2006
    LOL Can't wait to see you post proof of that statement.
     
    Brackle likes this.
  30. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,183
    Joined:
    Jul 26, 2005
    Isn't it obvious?
     
  31. reaper12

    reaper12 2[H]4U

    Messages:
    2,192
    Joined:
    Oct 21, 2006
    Ah, you are making it up? Thought so.
     
    {NG}Fidel and Brackle like this.
  32. {NG}Fidel

    {NG}Fidel [H]ardness Supreme

    Messages:
    5,838
    Joined:
    Jan 17, 2005
    Fact
     
  33. TheMadHatterXxX

    TheMadHatterXxX 2[H]4U

    Messages:
    2,873
    Joined:
    Sep 7, 2004
Because if they offloaded it to a separate card, no one would buy it, and then they would have no way to force it down people's throats and make them pay the added premium on graphics cards like they have this cycle.
     
    Brackle and Dayaks like this.
  34. 96redformula

    96redformula 2[H]4U

    Messages:
    2,350
    Joined:
    Oct 29, 2005