R9 290X goes toe-to-toe with GTX 980 Ti on DirectX 12

Discussion in 'Video Cards' started by mi7chy, Aug 20, 2015.

  1. griff30

    griff30 I Lower the Boom!

    Messages:
    5,354
    Joined:
    Jul 15, 2000

    There seems to be a lot of bellyaching about benchmarks from either camp whenever one does better than the other.

    Can't we all just game?
     
  2. jwcalla

    jwcalla 2[H]4U

    Messages:
    3,629
    Joined:
    Jan 19, 2011
     
  3. atom

    atom Gawd

    Messages:
    851
    Joined:
    May 3, 2005
    A lot of you are talking out of your butt and clearly don't understand the first thing about what you're discussing. Seeing how nVidia's performance actually goes down in DX12 versus DX11, the drivers are clearly not ready. If you say DX12 means drivers don't matter, you're just running on theories proven only in your own mind.

    Also, a game being "just a bunch of draw calls" does not make it very CPU limited. That is the most absurd thing I have ever heard. I am not trying to insult you guys, I just want to let you know that it is absurd. Nothing gets on that screen, or to the GPU in general, without being put there by a draw call. That's what a draw call is. What makes a game CPU limited is having all those draw calls executed from a single core. Artificial intelligence calculated for every enemy on the map, even those miles away, can tax the CPU. Calculating how the wind affects a million blades of grass can tax the CPU.

    AMD is showing a huge increase in these early demos because, driver-wise, they were already a step ahead of the game thanks to their horribly failed Mantle project.
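
    The single-core point is where DX12's multi-threaded command buffer recording comes in. Here is a minimal, hypothetical C++ sketch of the idea, assuming only the stock D3D12 headers on Windows; kWorkerCount and the empty worker bodies are illustrative placeholders, not code from any shipping engine. Each worker thread records its own slice of the frame's draw calls into its own command list, and a single thread then submits them all at once:

    [code]
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        D3D12_COMMAND_QUEUE_DESC qd = {}; // zero-initialized = DIRECT (graphics) queue
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue));

        const int kWorkerCount = 4; // illustrative: one recorder per CPU core
        std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kWorkerCount);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkerCount);
        for (int i = 0; i < kWorkerCount; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
        }

        // Each worker records its own slice of the frame's draw calls in parallel;
        // under DX11 this recording was effectively serialized on one thread.
        std::vector<std::thread> workers;
        for (int i = 0; i < kWorkerCount; ++i) {
            workers.emplace_back([&, i] {
                // ... SetPipelineState / DrawInstanced etc. for this slice ...
                lists[i]->Close(); // list is now ready to submit
            });
        }
        for (auto& w : workers) w.join();

        // A single thread submits everything; the expensive part (recording)
        // already happened across all the cores.
        ID3D12CommandList* raw[kWorkerCount];
        for (int i = 0; i < kWorkerCount; ++i) raw[i] = lists[i].Get();
        queue->ExecuteCommandLists(kWorkerCount, raw);
    }
    [/code]

    A real engine would reuse the worker threads frame to frame rather than spawn them each time; std::thread is used here only to keep the sketch self-contained.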
     
  4. wantapple

    wantapple Gawd

    Messages:
    598
    Joined:
    Jan 8, 2012
    Wrong.
    Drivers have nothing to do with performance in DX12.

    Don't expect an nVIDIA fix through driver intervention either. DirectX 12 leaves little room for driver intervention because it is closer to the metal than DirectX 11. nVIDIA's penchant for replacing shaders at the driver level is therefore nullified under DirectX 12. DirectX 12 will be far more hardware-limited than DirectX 11.
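
    To make "closer to the metal" concrete: work the DX11 driver did invisibly is now the application's job. Here is a minimal, hypothetical C++ sketch, assuming only the stock D3D12 header; the function name and parameters are illustrative. The application itself must declare when a rendered texture changes roles, a hazard a DX11 driver would have tracked and resolved behind the scenes, which is also where a vendor's driver could quietly substitute its own optimizations:

    [code]
    #include <d3d12.h>

    // Hedged sketch: under DX12 the application records resource state
    // transitions explicitly; a DX11 driver inserted this hazard tracking
    // for you, which is where vendors had room to intervene.
    void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                    ID3D12Resource* tex) // e.g. a just-rendered target
    {
        D3D12_RESOURCE_BARRIER barrier = {};
        barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
        barrier.Transition.pResource   = tex;
        barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
        barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
        barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
        cmdList->ResourceBarrier(1, &barrier);
    }
    [/code]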

    Oxide confirmed it here:
     
  5. harmattan

    harmattan [H]ardness Supreme

    Messages:
    4,225
    Joined:
    Feb 11, 2008
    Magic API performance improvements are something the industry constantly claims, but never in the history of the world has it actually happened -- not once. We heard the same malarkey during the moves from DX7 to 8, 8 to 9, and 9 to 10, and the claimed x% performance increase did not materialize a single time. The planted and "directed" tests sure must help sell cards, though...
     
  6. griff30

    griff30 I Lower the Boom!

    Messages:
    5,354
    Joined:
    Jul 15, 2000
    But, but, but, physx....
     
  7. Flopper

    Flopper [H]ard|Gawd

    Messages:
    1,642
    Joined:
    Nov 15, 2007
    Nvidia sells you outdated tech like the 980 Ti. I've been saying so for a long time.
    Buy AMD: buy the future today.
     
  8. FlawleZ

    FlawleZ Gawd

    Messages:
    792
    Joined:
    Oct 20, 2010
    Exactly. I've been saying this all along. Although it would be really nice to see the claims about performance increases come to fruition, I'll remain very skeptical until we have a thorough list of real-world examples.
     
  9. wantapple

    wantapple Gawd

    Messages:
    598
    Joined:
    Jan 8, 2012
    I just wonder how they feel right now. LOL
     
  10. kalston

    kalston Gawd

    Messages:
    986
    Joined:
    Mar 10, 2011
    Pretty good, considering AMD does not have anything to compete with the Titan X/980 Ti in actual games :)
     
  11. DracoNB

    DracoNB Gawd

    Messages:
    599
    Joined:
    Sep 2, 2010
  12. DeadSkull

    DeadSkull [H]ardness Supreme

    Messages:
    4,482
    Joined:
    Jun 7, 2008
    Don't worry, I'm sure Nvidia will just spend a few billion on PR convincing us all once again that AMD cards can't play games and have horrible drivers.
     
    Last edited: Aug 26, 2015
  13. kalston

    kalston Gawd

    Messages:
    986
    Joined:
    Mar 10, 2011
  14. viper1152012

    viper1152012 [H]ard|Gawd

    Messages:
    1,025
    Joined:
    Jun 20, 2012
    If you google "Fury X DX12 benchmark" you get:

    http://www.extremetech.com/gaming/2...he-singularity-amd-and-nvidia-go-head-to-head

    The same company. Between the two of them, the vendors pretty much lay it on the table.

    Nvidia focused on aggressively optimizing a single pipeline and did a stellar job.
    AMD focused on asynchronous calls across multiple pipelines.

    Intel would be the first to tell you that out-of-order execution, with multiple streams of computation running side by side, always wins.

    Nvidia is architecturally built for DX11 and suffers a performance hit in DX12 because those optimizations don't carry over.

    I'm not saying they won't be able to make it better, but the benchmark was certified by MS, Nvidia and AMD, and it passed the DX12 and DX11 certification requirements. Both companies had access to it for over a year, and Nvidia was blaming the game right out of the gate.

    I think we're seeing a turn of favor, much like when Intel jumped into HT tech and AMD went for multiple pipelines. If the 290X can almost overtake a 980 Ti, and the 290X scales better, it casts some doubt on Nvidia's current generation, Maxwell included.

    The Fury is hot off the press, posted terrific numbers, and shows AMD has moved away from the single-pipeline ideal and come out swinging in the new world of DX12.
     
  15. viper1152012

    viper1152012 [H]ard|Gawd

    Messages:
    1,025
    Joined:
    Jun 20, 2012
    Nvidia has always been great at squeezing performance out of their drivers and delivering 100% of what the hardware can output.

    AMD has always had terrific hardware and questionable drivers, usually landing on a sweet spot with particular revisions.

    For once, Nvidia might have dropped the ball.

    AMD is killing it with their Async Shaders and Multi-Threaded Command Buffer Recording.

    http://www.dsogaming.com/news/amd-e...ders-multi-threaded-command-buffer-recording/

    Maybe it's time for them to shine again?
     
  16. Michaelius

    Michaelius [H]ardness Supreme

    Messages:
    4,684
    Joined:
    Sep 8, 2003
    Where do you see that shining?

    All I see is that, in the most favourable case, the Fury X slightly beats the 980 Ti.

    Meanwhile the 980 Ti still keeps 20% OC headroom and offers superior performance everywhere else.

    And I get that performance without being forced to install a spyware OS ;)
     
  17. cocdod

    cocdod [H]Lite

    Messages:
    95
    Joined:
    Aug 21, 2015
  18. wantapple

    wantapple Gawd

    Messages:
    598
    Joined:
    Jan 8, 2012
    DX12 is the future, whether you like it or not. The 980 Ti is currently trading blows with a two-year-old GPU, lol.

    You can keep Win7 and enjoy old games. Microsoft is pushing DX12 on Xbox games, and most of them will be ported to PC.
     
  19. wantapple

    wantapple Gawd

    Messages:
    598
    Joined:
    Jan 8, 2012
    The Star Swarm test makes use of 100,000 draw calls, which does not bottleneck either nVIDIA's Maxwell or AMD's GCN 1.1/1.2. That is just about the only DirectX 12 feature it uses (the ability to draw more units on screen). More units means more triangles. Star Swarm does not make use of Asynchronous Shading (parallel shading) or the subsequent post-processing effects seen in Ashes of the Singularity.

    Once you throw in Asynchronous Shading, nVIDIA's serial architectures take a noticeable dive in performance while AMD's parallel architectures take little to no performance hit.

    Since both the Xbox One and the PlayStation 4 include Asynchronous Compute Engines (2 for the Xbox One, like those found in a Radeon 7970, and 8 for the PS4, like those found in Hawaii/Fiji), it is almost a given that developers will make use of this feature. Microsoft added it to DirectX 12 for this reason.

    This is a case where AMD's close collaboration with the consoles will surely pay off. The feature should be welcomed by any R9 200 series owner, as it will breathe new life into their aging hardware.
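
    For the curious, this is roughly what Asynchronous Shading looks like at the API level. A minimal, hypothetical C++ sketch, assuming only the stock D3D12 headers; the queue and fence names are illustrative. A second queue of type COMPUTE lets compute work (post-processing, for example) be fed to hardware such as AMD's ACEs alongside the graphics queue, with a fence as the only ordering guarantee between the two:

    [code]
    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        // The normal graphics (DIRECT) queue that carries the frame's rendering.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> gfxQueue;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

        // A second, COMPUTE-type queue: on GCN this work can be picked up by
        // the Asynchronous Compute Engines and overlap the graphics queue.
        D3D12_COMMAND_QUEUE_DESC compDesc = {};
        compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        ComPtr<ID3D12CommandQueue> computeQueue;
        device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

        // ... record command lists and call ExecuteCommandLists() on both
        // queues; as far as the API is concerned the two streams now run
        // concurrently ...

        // A fence is the only ordering guarantee between queues: the graphics
        // queue waits (on the GPU, without stalling the CPU) until the compute
        // queue has signalled that, say, a post-processing input is ready.
        ComPtr<ID3D12Fence> fence;
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
        computeQueue->Signal(fence.Get(), 1);
        gfxQueue->Wait(fence.Get(), 1);
    }
    [/code]

    Note that the API only permits this concurrency, it doesn't guarantee it; on hardware without free compute engines the two queues still work correctly, the work just may not overlap.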
     
  20. VonGriffin

    VonGriffin Limp Gawd

    Messages:
    305
    Joined:
    Oct 15, 2013
    Oh, are we back at the old argument again: the gaming future is Linux, ehm, AMD?

    You can dream about the performance of yet-nonexistent tech while I'm playing Witcher 3 with HairWorks on my 980 Ti.

    By the time DX12 titles hit properly I will have a new card :D
     
  21. VonGriffin

    VonGriffin Limp Gawd

    Messages:
    305
    Joined:
    Oct 15, 2013
    Ah, it's nice to watch the fiend's fur moving in the wind while he's about to die :p
     
  22. atom

    atom Gawd

    Messages:
    851
    Joined:
    May 3, 2005
    Then please explain to me how DX12 going 'closer to the metal' actually reduces performance in nVidia's case. And please don't throw around catchphrases from developer presentations like that. A touchpad can be implemented with lower-level programming as well, but it still needs a functioning driver.
     
  23. Zuul

    Zuul Gawd

    Messages:
    839
    Joined:
    Jan 7, 2013
    Good point :p
     
  24. Mahigan

    Mahigan Limp Gawd

    Messages:
    142
    Joined:
    Aug 25, 2015
    Last edited: Aug 26, 2015
  25. DracoNB

    DracoNB Gawd

    Messages:
    599
    Joined:
    Sep 2, 2010
    You should really add that as a separate post, tons of great detail and work done and it will sadly get buried in this thread.
     
  26. Mahigan

    Mahigan Limp Gawd

    Messages:
    142
    Joined:
    Aug 25, 2015
    Will do :)
     
  27. atom

    atom Gawd

    Messages:
    851
    Joined:
    May 3, 2005
    Thank you, Mahigan. An informed post not based on voodoo.
     
  28. Mahigan

    Mahigan Limp Gawd

    Messages:
    142
    Joined:
    Aug 25, 2015
    You're welcome atom :)
     
  29. DeadSkull

    DeadSkull [H]ardness Supreme

    Messages:
    4,482
    Joined:
    Jun 7, 2008
    Because nVidia started to rely more and more on driver intervention for shader processing starting with Kepler. In hardware terms this was a step backwards: Kepler's shaders (which Maxwell inherits) are less complex than Fermi's and do less of the scheduling work at the hardware level, relegating the rest to software.

    http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/3
     
  30. DeadSkull

    DeadSkull [H]ardness Supreme

    Messages:
    4,482
    Joined:
    Jun 7, 2008
    More to the point: nVidia basically gimped the architecture after Fermi by making it simpler and more linear and moving the rest of the work into their drivers. Undoubtedly and obviously, since it's already happening, Maxwell will suffer even more under DX12, where the driver's ability to make up for hardware shortfalls is nullified.
     
    Last edited: Aug 26, 2015
  31. thedocta45

    thedocta45 [H]ard|Gawd

    Messages:
    1,325
    Joined:
    Oct 10, 2007
    It's going to be hilarious when the first DX12 game actually comes out and Nvidia blows the AMD cards out of the water.
     
  32. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,414
    Joined:
    Aug 5, 2013
    Considering Nvidia's iron grip on the market, devs have no reason to make separate considerations for AMD's benefit unless it happens by accident. Even if that were the case, Nvidia will do what they always do and intentionally prevent it from happening... Nvidia will do everything in their power to stop it from seeing the light of day.

    If nothing else, they will design Pascal to take advantage of the same feature, market it to hell and back, and pretend Maxwell/Kepler don't exist. It won't matter by then anyway; you won't be able to buy any of the current cards. Nvidia has no reason to care about their old GPUs' performance when they're trying to sell the new hotness.

    What I find most interesting here is that the Fury X, 980 Ti, and 290X are all within spitting distance of each other. Based on the analysis other people are providing, the 290X and Fury X should perform identically under the same draw-call bottleneck. If all of this is true, it looks like AMD intentionally gimped its own $650 halo product relative to the rest of its line-up.
     
  33. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,649
    Joined:
    Oct 4, 2007
    If that does not happen, will you please post a picture of your tears of anguish? :D So how exactly is Nvidia going to blow AMD away if there are no Nvidia-specific software optimizations like they have in DX11?

    So Nvidia will attempt to kill the DX12 market with market manipulation instead of making legitimately better hardware? Well, at least we know why Nvidia shows better in GameWorks games, and you have admitted to it. :D
     
  34. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,414
    Joined:
    Aug 5, 2013
    The idea is that this game might represent one specific case where AMD slips ahead of Nvidia. You can do the same thing with DX11 right now -- try Shadow of Mordor. AMD leads Nvidia in pretty much every SoM test I could find.

    http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/7
    http://www.tomshardware.com/reviews/shadow-of-mordor-performance,3996-4.html

    So if you pretend for a second that SoM is the first and only DX11 game on the market, suddenly AMD looks very appealing. We might even get some fun articles about why AMD is winning DX11 thanks to its architecture, and a bunch of other analysis that may or may not actually mean anything. And then you get threads like this one popping up: "AMD leads in first SoM tests, is Nvidia doomed in DX11?"

    And of course, in reality that's simply not the case. With AotS we're looking at some kind of bottleneck that might originally have been implemented to show off Mantle. It may not even be an issue in any other DX12 game... or maybe just in RTS titles in particular. Or maybe, as DX12 games become more advanced, the bottleneck on Nvidia hardware will become more obvious. It's similar to AMD's issues with tessellation in DX11: it can really hurt in certain games where Nvidia intentionally goes overboard to kill AMD's performance, but for most games it has no effect. As far as I'm concerned, AotS' DX12 benchmark is the equivalent of Nvidia releasing a DX11 tessellation benchmark. Neither is representative of real-world gameplay; they exist solely to emphasize one feature.
     
    Last edited: Aug 26, 2015
  35. Semantics

    Semantics 2[H]4U

    Messages:
    2,766
    Joined:
    May 18, 2010
    Simple: get developers to use more geometry instead of tricky shaders. Nvidia cards already crush AMD cards in that regard.
     
  36. thedocta45

    thedocta45 [H]ard|Gawd

    Messages:
    1,325
    Joined:
    Oct 10, 2007
    Sure, why not. I'll post myself buying an AMD card as well.

    I am more concerned that everyone is now saying DX12 is going to save AMD.

    I think it's foolish this early in the game to say anything for certain about DX12 and AMD vs Nvidia.

    I think AMD has an advantage due to the similarities between Mantle and DX12, but it would be foolish to think that a company with the resources of Nvidia isn't ready for DX12 with the next generation of their cards.

    AMD may have been first to the block with good DX12 performance, but their cards are on the table: you won't see much of a difference between Fiji and the next generation of AMD cards.

    With Nvidia, however, the difference between the Pascal and Maxwell architectures will be significant.

    This is just my opinion on what's going on.
     
  37. OldandCrusty

    OldandCrusty [H]Lite

    Messages:
    66
    Joined:
    Jun 23, 2015
    I'm an Ashes Founder, and I own a GTX 980 and an R9 290. The game looks great. If you like RTS with big battles and low amounts of micro, then this looks like it will be a fantastic game.
    That's what I'm looking at: how the game looks and plays on my hardware. If my experience is great, I don't care if someone else might be getting higher frame rates than me. I care about my experience, period.
     
  38. Creig

    Creig Gawd

    Messages:
    786
    Joined:
    Sep 24, 2004
    The only way that's going to happen is in Gameworks titles.
     
  39. YeuEmMaiMai

    YeuEmMaiMai [H]ardForum Junkie

    Messages:
    14,597
    Joined:
    Jun 11, 2004
    Yes, it did, and with better visuals.
     
  40. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,811
    Joined:
    Nov 5, 2010
    Yes, this is quite worrisome. In relation to this, however, one phrase from the Oxide blog caught my attention:

    At first glance it's hopeful that this was brought to Microsoft's attention and that certain safety measures were put in place as part of minimally optimized code. I need to look into this.