Gears of War 4 DX12 Performance Review @ [H]

Discussion in 'Video Cards' started by FrgMstr, Nov 23, 2016.

  1. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    47,980
    Joined:
    May 18, 1997
    Gears of War 4 DX12 Performance Review - We take Gears of War 4, a new Windows 10-only game that supports DX12 natively, and compare performance across seven video cards. We will find out which one provides the best experience at 4K, 1440p, and 1080p resolutions, and see how the cards compare to each other. We will also look specifically at the Async Compute feature.
     
  2. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
    So the best-running DX12 game runs very well on both gaming IHVs, with the clear win going to the one that was supposed to be bad at DX12.
     
  3. atp1916

    atp1916 [H]ard|DCoTM x1

    Messages:
    3,677
    Joined:
    Jun 18, 2004
    That RX470. :D

    Really do need to play the entire GoW series.
     
    KazeoHin and Armenius like this.
  4. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,765
    Joined:
    Jan 28, 2014
    In the benchmark, on the system in my sig, I saw an increase in performance with async compute similar to what you guys measured on the RX 480. I'm running the game at 2560x1440, and 8% translates to about 10 FPS at the framerate I'm getting. It's not really all that noticeable during actual gameplay, though, as it already runs very smooth.
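    (To put a number on that: a ~10 FPS gain being 8% works out to roughly 10 / 0.08 = 125 FPS without async compute, so around 135 FPS with it enabled.)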
     
  5. Sith'ari

    Sith'ari Gawd

    Messages:
    573
    Joined:
    Oct 13, 2013
    -So, can we say that Gears of War 4 is the first game that was thoroughly designed under the DX12 API? (Along with Ashes of the Singularity, perhaps?)
    -Also, can GoW be considered a "GeForce title," since Tim Sweeney was Jen-Hsun Huang's guest during the GTX 1080 presentation?
     
  6. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    47,980
    Joined:
    May 18, 1997
    Yes.
    Sure, if you need to. I would just suggest that Gears of War 4 is going to be bought and played by many people this year, and the branding is irrelevant unless you are a brand loyalist or apologist. It still games the same way with or without a Red or Green label.
     
  7. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,072
    Joined:
    Jan 31, 2008
    What does Tim Sweeney being at the presentation have to do with anything? Gears 4 was developed by The Coalition, not by Epic. Unless you want to label every UE4 title as an Nvidia game, which would be kind of dumb, I don't really see it applying here. The only thing that could point toward it is that the game was bundled with some Nvidia cards, but performance shows that it is relatively vendor neutral, performing well on almost every card tested.
     
    Sith'ari likes this.
  8. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,765
    Joined:
    Jan 28, 2014
    Tim Sweeney has always had a questionable opinion of AMD and the direction they're trying to take technology. In my opinion he is actually one of the few professionals who recognize that AMD's olive branching serves AMD's own best interests.

    That being said: UE4 may have been developed by Epic, but the game was developed by The Coalition, which is owned by Microsoft.
     
    trandoanhung1991, razor1 and Sith'ari like this.
  9. Aireoth

    Aireoth 2[H]4U

    Messages:
    2,322
    Joined:
    Oct 12, 2005
    Given that the Xbox uses an AMD GPU, I would think it would be in Microsoft's interest to ensure the best possible performance on AMD hardware.
     
  10. Creig

    Creig Gawd

    Messages:
    785
    Joined:
    Sep 24, 2004
    The GTX 1060 and RX 480 are neck and neck. Again. Looks like we have great competition between Nvidia and AMD in the mid-range cards.
     
    Armenius likes this.
  11. bl4C3y3

    bl4C3y3 n00b

    Messages:
    42
    Joined:
    Nov 18, 2016
    Quote from page 2:
    "Because Presentmon measures frametime converted to framerate, we will only show the average FPS since the spikes of performance in milliseconds on the graph would erroneously indicate the actual minimum and maximum FPS."


    To test DX:MD, I have been playing around with PresentMon a little and used Excel to analyze the data and create graphs.
    To create a smoother FPS line, I averaged each x number of frametimes and used that value for the FPS line in the graph.
    Here are two examples of the same graph, with FPS averaged over every 5 vs. every 50 frames (the graph is 8,110 frames and about 115 seconds long):
    01-5frames.jpg 02-50frames.jpg
    (blue line is FPS, with its axis on the right)
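    For anyone who wants to redo this outside of Excel, here is a minimal Python sketch of the same averaging. I'm assuming PresentMon's standard CSV output with its MsBetweenPresents frametime column; the filename is just a placeholder:

```python
import csv

WINDOW = 50  # frames per average; try 5 vs. 50 as in the graphs above

# Load per-frame times (in milliseconds) from the PresentMon log.
with open("presentmon_log.csv", newline="") as f:
    frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

# Average each WINDOW-sized chunk of frametimes, then convert to FPS.
# Averaging frametimes first (rather than averaging per-frame FPS values)
# weights every frame by the time it took, which is what smooths the line.
fps_line = []
for i in range(0, len(frametimes), WINDOW):
    chunk = frametimes[i:i + WINDOW]
    avg_ms = sum(chunk) / len(chunk)
    fps_line.append(1000.0 / avg_ms)  # 1000 ms per second / avg frametime

print(f"{len(frametimes)} frames -> {len(fps_line)} points on the FPS line")
```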
     
  12. Quartz-1

    Quartz-1 [H]ardness Supreme

    Messages:
    4,257
    Joined:
    May 20, 2011
    Thanks for including VRAM usage figures. Could you elaborate on the VRAM usage technology you mentioned? I would have liked to see some SLI / XF testing - perhaps in a follow-up article?
     
  13. Geforcepat

    Geforcepat Gawd

    Messages:
    886
    Joined:
    Jun 2, 2012
    Downloading now. Thanks for the performance review. I now know what settings I need to run at (High).
    (970, 2600K @ 4.2 GHz, 1440p G-Sync, 16 GB RAM)
     
  14. iNViSiGOD

    iNViSiGOD Gawd

    Messages:
    597
    Joined:
    Apr 16, 2002
    Kind of makes me want to buy a copy and see what it'll do on my little dual RX 470 setup.
     
  15. noko

    noko [H]ardness Supreme

    Messages:
    4,096
    Joined:
    Apr 14, 2010
    Does this game support mGPU?
     
  16. Boggy101

    Boggy101 [H]Lite

    Messages:
    115
    Joined:
    Sep 19, 2016
    The RX 460 looks like bad value on this title. (Not to say it doesn't usually look like bad value.)

    All of these cards crush the game at High settings, I'm sure, and I can't tell the difference between High and Ultra in this game. The visual difference between settings tiers is so small in games nowadays.
     
  17. Olle P

    Olle P Limp Gawd

    Messages:
    331
    Joined:
    Mar 29, 2010
    For this evaluation, and for the Async test in particular, I'm missing info about what CPU was used.
    Async is supposed to offload a throttled CPU, and I would expect it to make more of a difference than this when running a Core i5 at <3.5 GHz boost speed.
     
  18. cageymaru

    cageymaru [H]ard|News

    Messages:
    19,234
    Joined:
    Apr 10, 2003
    Game is on sale for $30 at the Microsoft Store. Make sure that you buy the digital download. It's the one that says Play Anywhere. The physical disc isn't going to help you with your Windows PC. :) It will give you a code to add to the Windows Store in the same manner as you would a Steam game code. (No, it is not a Steam game, before someone tries to say that! I know you all by now!) :) :)
    https://www.microsoftstore.com/stor...rs-of-War-4-for-Xbox-One/productID.5061285700

    ReCore and Forza are on sale and also digital Play Anywhere titles.
    https://www.microsoftstore.com/stor...=en_US_homepage_whatsnew_4_XboxGames50_161123
     
    DrezKill likes this.
  19. daglesj

    daglesj [H]ardness Supreme

    Messages:
    5,019
    Joined:
    May 7, 2005
    I got this for free from buying a 1070 for a customer.

    One of the perks of building custom gaming rigs for people is that I get a lot of codes.

    Only played about 5 mins of it. Looks nice.
     
  20. DukenukemX

    DukenukemX [H]ardness Supreme

    Messages:
    4,379
    Joined:
    Jan 30, 2005
    Is this a GameWorks game?
     
  21. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Yeah, and the other one who questioned AMD's motives was John Carmack. Both of these guys started the 3D gaming industry. It's ironic how people just skip over what these guys say when it's fairly obvious what AMD wants for their products, just like any other company. If they can get an advantage, it's prudent for them to take it.

    And to answer DukenukemX:

    Yes, it is a GameWorks title; The Coalition had a bundling deal with nV graphics cards (the 1070 and 1080). From the programming side it uses one GameWorks library, HBAO+, but that was already integrated into the engine beforehand; it wasn't a Coalition add-on.
     
  22. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,729
    Joined:
    Jun 13, 2003
    While it's probably covered in the review, you do need the Anniversary Update *just to install* this game after purchasing the Xbox/W10 code (got mine from Amazon). Without it, Windows will claim that it 'cannot be installed on this device' ;).
     
  23. noko

    noko [H]ardness Supreme

    Messages:
    4,096
    Joined:
    Apr 14, 2010
  24. Dullard

    Dullard [H]ard|Gawd

    Messages:
    1,946
    Joined:
    Jun 28, 2012
    Got me, too. Regular ol' W10 Pro 64-bit won't git er done. And the download is 80 GB; I think GTA V was just 69 GB.
     
  25. DrezKill

    DrezKill Limp Gawd

    Messages:
    455
    Joined:
    Mar 11, 2007
    Oh shit, thanks for the heads up! 30 bucks for this game, SOLD. I signed in and grabbed dat shiznit with tha quickness. Been playing through the game on PC with a friend over the past few weeks via local split-screen co-op; now we can try out the cross-system PC+XB1 LAN co-op. Sucks that Win10 PC won't allow two people to sign in at the same time, but I'm sure that'll come with time. The XPA program pays off with this game. I wasn't expecting Gears 4 to turn out as fun as it did, and certainly wasn't expecting such an excellent PC version, especially with the use of DX12. Forza 6 on Win10 PC also impressed me with its DX12 performance. First Doom 4 with Vulkan (though performance was also awesome on OpenGL), now this game with DX12; 2016 has been amazing, and we also get to see UE4 put to good use. Not to forget AotS; its DX12 performance is tight, but I await Vulkan support.
     
    Last edited: Nov 24, 2016
    cageymaru likes this.
  26. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,761
    Joined:
    Sep 7, 2011
    Until this game supports mGPU, I'm going to give it a pass. I've got more than enough horsepower to run this at 4K, but not if half of my video power is sitting idle.
     
  27. Sonicks

    Sonicks [H]ard|Gawd

    Messages:
    1,400
    Joined:
    Jul 24, 2005
    Let me know if I'm reading this wrong...

    You're going to pass simply because your video card(s) can effortlessly play this game? It's a fantastic game, and you're skipping out for such a trivial reason that you have to wonder why you're into PC gaming in the first place.

    Going back to the article:

    I didn't get the impression that you all knew the depth of field setting only applies to in-game cinematics. The setting should make no difference at all during gameplay. The performance hit is coming from the insane screen space reflections alone. 30 FPS or less in cinematics is not a deal breaker.
     
    Last edited: Nov 24, 2016
  28. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,761
    Joined:
    Sep 7, 2011
    No, I may have communicated that wrong.

    What I meant was that both my cards working together would be able to play this game perfectly, but without SLI support, only one of my cards will be running and would not be able to optimally run the game.
     
  29. Sonicks

    Sonicks [H]ard|Gawd

    Messages:
    1,400
    Joined:
    Jul 24, 2005
    That makes sense. It wouldn't be worth shooting for 4K with that hardware, knowing you'd have to bump the settings down a bit. The forum post linked above does make it seem like mGPU support is in the works, though.
     
  30. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,729
    Joined:
    Jun 13, 2003
    Brent mentioned SLI/CFX in the review; did they test it?!?
     
  31. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,761
    Joined:
    Sep 7, 2011
    In The Works™

    Which is corporate speak for "Please give us money now, we'll promise you anything!"

    The proof is in the pudding.
     
    Algrim likes this.
  32. renz496

    renz496 Limp Gawd

    Messages:
    232
    Joined:
    Jul 28, 2013
    Not sure, but this is an Nvidia-sponsored title. And to some people UE4 itself is one giant GameWorks, lol.
     
  33. renz496

    renz496 Limp Gawd

    Messages:
    232
    Joined:
    Jul 28, 2013
    Isn't that pretty much standard with DX12 games? RotTR only got multi-GPU support in DX12 after seven months or so. Even in DXMD, I heard the DX12 multi-GPU works based on the SLI/CF profiles that Nvidia/AMD provide for the game; hence DX12 multi-GPU on the 1060 did not work in DXMD, as tested by PCPer.
     
    Shintai likes this.
  34. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,729
    Joined:
    Jun 13, 2003
    Still waiting for DX12 SLI support in BF1 too, while this last driver/patch pair messed stuff up for DX11 and SLI.
     
  35. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

    Well, the second part of that is absolutely false; the UE4 engine is IHV agnostic. Yeah, there are branches of it that have GameWorks libs, but that's it. People who say that are really idiots, too lazy to download something that is free and testable. Even more than idiots, they are ignorant, because all the tools are available to test with, yet they don't.
     
    Armenius likes this.
  36. Factum

    Factum [H]ard|Gawd

    Messages:
    1,528
    Joined:
    Dec 24, 2014
    That is not great, because in the same breath...AMD has left the high end alone to NVIDIA...where the most profit per unit is.
    Hence why the financials look the way they do.

    If you had told people three years ago that AMD would NOT be competing in the high end...you would have been laughed off the forums...and now you try and spin it as a good thing...really?
     
  37. Factum

    Factum [H]ard|Gawd

    Messages:
    1,528
    Joined:
    Dec 24, 2014
    You cannot fix stupid...lots of false claims that have been debunked but keep getting used as "idiots' excuses" for AMD's lacking performance:

    - Too-high tessellation factors in games.
    - The tessellated ocean in Crysis 2
    - GameWorks = black box
    - Planned obsolescence of NVIDIA SKUs

    The list is longer, but you know what I mean ;)

    All lies/ignorance...perpetuated by AMD fans to explain why AMD is lagging behind.
     
  38. JustReason

    JustReason razor1 is my Lover

    Messages:
    2,485
    Joined:
    Oct 31, 2015
    And yet you just prove you are as ignorant as they are.

    GW is a black box. It isn't open source, nor are the libs/code available to the public, and no, past versions finally being released to the public doesn't change the current iteration's black-box nature.

    And planned obsolescence is part of nearly all companies; NVIDIA is just on a shorter timetable than most. Previous generations receive little to no attention in driver updates, hence most people saying "gimping." Then posters like you try as hard as you can to make it sound like they are claiming regression, not stalling. There is more than enough proof of this, even in the articles posted to debunk regression that actually proved stalling/gimping.
     
  39. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

    All the GameWorks libs that came out with DX11 are open source, and for all the GameWorks features that have been integrated into UE4 in separate branches, the source code is available and has been for quite some time now.

    Planned obsolescence is not the same thing as "gimped." People were saying drivers were holding back older-gen cards because nV was trying to make their older cards look worse on purpose. That was not the case. Neither AMD nor nV does driver optimizations for older cards once they go EOL. Now, nV has gone through three generations of cards in the same time AMD went through one, and with the memory amounts and raw shader capabilities changing more on nV cards, it can be expected that newer games will have different adverse effects on their cards, which was construed as nV "gimping" them. It had nothing to do with nV not optimizing their drivers, as you stated. If you want me to link you to the three threads here and the articles around the web that have cited and tested for this, I am more than happy to, but again, it seems your memory is being a bit shifty on the matter.

    Since you stated the articles seem to prove what you said, please link them and we can discuss, because as I said, your memory seems to be a bit shifty.
     
  40. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,765
    Joined:
    Jan 28, 2014
    The only reason GCN 1.1 is still supported is because it's still used in current products. GCN 1.1 is nearly as old as Kepler. The 390X was released two years after the 290X while using the same architecture.

    We've had this discussion before, with BabelTech posting a good article addressing these concerns.

    https://hardforum.com/threads/has-n...the-gtx-780-ti-vs-the-290x-revisited.1895788/
    http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/view-all/