FrgMstr

Gears of War 4 DX12 Performance Review - We take Gears of War 4, a new Windows 10-only game supporting DX12 natively, and compare performance across seven video cards to find out which provides the best experience at 4K, 1440p, and 1080p resolutions. We will also look specifically at the Async Compute feature.
 
In the benchmark I saw a similar increase in performance from async compute on the system in my sig to what you guys measured on the RX 480. I'm running the game at 2560x1440, and 8% translates to about 10 FPS at the framerate I'm getting. It's not really all that noticeable during actual gameplay, though, as it already runs very smooth.
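
(As a hedged sketch of the arithmetic above: the ~125 FPS baseline below is my own back-of-the-envelope inference from the "8% ≈ 10 FPS" figures in this post, not a number from the review.)

# Rough arithmetic implied above: if an ~8% async compute gain is worth ~10 FPS,
# the baseline framerate works out to roughly 10 / 0.08 = 125 FPS.
gain_fps = 10.0        # absolute gain quoted in the post
gain_fraction = 0.08   # relative gain quoted in the post
baseline_fps = gain_fps / gain_fraction
print(f"implied baseline ~{baseline_fps:.0f} FPS, with async ~{baseline_fps + gain_fps:.0f} FPS")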
 
-So, can we say that Gears of War is the first game that was thoroughly designed under the DX12 API? (along with Ashes of the Singularity, perhaps?)
-Also, can GoW be considered a "GeForce title", since Tim Sweeney was Jen-Hsun Huang's guest during the GTX 1080 presentation?
 
-So, can we say that Gears of War is the first game that was thoroughly designed under the DX12 API? (along with Ashes of the Singularity, perhaps?)
-Also, can GoW be considered a "GeForce title", since Tim Sweeney was Jen-Hsun Huang's guest during the GTX 1080 presentation?
Yes.
Sure, if you need to. I would just suggest that Gears of War 4 is going to be bought and played by many people this year, and that label is irrelevant unless you are a brand loyalist or apologist. It still plays the same way with or without a Red or Green label.
 
-So, can we say that Gears of War is the first game that was thoroughly designed under the DX12 API? (along with Ashes of the Singularity, perhaps?)
-Also, can GoW be considered a "GeForce title", since Tim Sweeney was Jen-Hsun Huang's guest during the GTX 1080 presentation?

What does Tim Sweeney being at the presentation have to do with anything? Gears 4 was developed by The Coalition, not by Epic. Unless you want to label every UE4 title as an Nvidia game, which would be kind of dumb, I don't really see it applying here. The only thing that could point toward it is that the game was bundled with some Nvidia cards, but the numbers show it is relatively vendor neutral, performing well on almost every card tested.
 
-So, can we say that Gears of War is the first game that was thoroughly designed under the DX12 API? (along with Ashes of the Singularity, perhaps?)
-Also, can GoW be considered a "GeForce title", since Tim Sweeney was Jen-Hsun Huang's guest during the GTX 1080 presentation?
Tim Sweeney has always had a questionable opinion of AMD and the direction they're trying to take technology. In my opinion he is actually one of the few professionals who recognize that AMD's olive branching is to serve AMD's best interests.

That being said: UE4 may have been developed by Epic, but the game was developed by The Coalition, which is owned by Microsoft.
 
What does Tim Sweeney being at the presentation have to do with anything? Gears 4 was developed by The Coalition, not by Epic. Unless you want to label every UE4 title as an Nvidia game, which would be kind of dumb, I don't really see it applying here. The only thing that could point toward it is that the game was bundled with some Nvidia cards, but the numbers show it is relatively vendor neutral, performing well on almost every card tested.

Given the Xbox uses an AMD GPU, I would think it would be in Microsoft's interest to ensure the best performance possible on AMD hardware.
 
The GTX1060 and RX480 are neck and neck. Again. Looks like we have great competition in the mid-range cards between Nvidia and AMD.
 
quote from page 2:
"Because Presentmon measures frametime converted to framerate, we will only show the average FPS since the spikes of performance in milliseconds on the graph would erroneously indicate the actual minimum and maximum FPS."


To test DX:MD, I have been playing around with PresentMon a little and used Excel to analyze the data and create graphs.
To create a smoother FPS line, I averaged every N frametimes and used that value in the graph for the FPS line.
Here are two examples of the same graph, with FPS averaged every 5 frames vs. every 50 frames (the graph covers 8,110 frames, roughly 115 seconds):
01-5frames.jpg 02-50frames.jpg
(blue line is FPS, with its axis on the right)
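
For anyone who wants to reproduce this without Excel, here is a minimal Python sketch under a couple of assumptions: the log is a standard PresentMon CSV with an MsBetweenPresents column, and the file name and window size are placeholders, not values from the post above.

# Minimal sketch: turn a PresentMon frametime log into a smoothed FPS line,
# mirroring the "average every N frames" approach described above.
import csv

def smoothed_fps(path, window=50):
    """One FPS value per non-overlapping window of frames."""
    frametimes_ms = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frametimes_ms.append(float(row["MsBetweenPresents"]))  # column name assumed from PresentMon output

    fps_line = []
    for i in range(0, len(frametimes_ms) - window + 1, window):
        avg_ms = sum(frametimes_ms[i:i + window]) / window
        fps_line.append(1000.0 / avg_ms)  # average frametime (ms) -> FPS
    return fps_line

# Example: compare the jagged 5-frame line with the smoother 50-frame line.
# print(smoothed_fps("presentmon_log.csv", 5))
# print(smoothed_fps("presentmon_log.csv", 50))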
 
Thanks for including VRAM usage figures. Could you elaborate on the VRAM usage technology you mentioned? I would have liked to see some SLI / XF testing - perhaps in a follow-up article?
 
Kind of makes me want to buy a copy and see what it'll do on my little dual RX 470 setup.
 
The RX 460 looks like bad value on this title. (Not to say it doesn't usually look like bad value)

All of these cards crush the game at High settings, I'm sure, and I can't tell the difference between High and Ultra in this game. The visual scaling between presets is so small in games nowadays.
 
For this evaluation, and for the Async test in particular, information on which CPU was used is missing.
Async is supposed to offload a throttled CPU, and I expect it to make more of a difference than this when running a Core i5 at <3.5 GHz boost speed.
 
Game is on sale for $30 at the Microsoft Store. Make sure that you buy the digital download; it's the one that says Play Anywhere. The physical disc isn't going to help you with your Windows PC. :) It will give you a code to add to the Windows Store in the same manner as you would with a Steam game code. (No, it is not a Steam game, before someone tries to say that! I know you all by now!) :) :)
https://www.microsoftstore.com/stor...rs-of-War-4-for-Xbox-One/productID.5061285700

ReCore and Forza are on sale and also digital Play Anywhere titles.
https://www.microsoftstore.com/stor...=en_US_homepage_whatsnew_4_XboxGames50_161123
 
I got this for free from buying a 1070 for a customer.

One of the perks of building custom gaming rigs for people is that I get a lot of codes.

Only played about 5 mins of it. Looks nice.
 
Tim Sweeney has always had a questionable opinion of AMD and the direction they're trying to take technology. In my opinion he is actually one of the few professionals who recognize that AMD's olive branching is to serve AMD's best interests.

That being said: UE4 may have been developed by Epic, but the game was developed by The Coalition, which is owned by Microsoft.

Yeah, and the other one who questioned AMD's motives was John Carmack. Both of these guys started the 3D gaming industry. It's ironic how people just skip over what these guys say when it's fairly obvious what AMD wants for their products, just like any other company. If they can get an advantage, it's prudent for them to take it.

And to answer DukenukemX:

Yes, it is a GameWorks title; The Coalition had a bundling deal with nV graphics cards (1070 and 1080). From a programming side it uses one GameWorks library, HBAO+, but that was already integrated into the engine beforehand; it wasn't a Coalition add-on.
 
While it's probably covered in the review, you do need the Anniversary Update *just to install* this game after purchasing the Xbox/W10 code (got mine from Amazon). Without it, Windows will claim that it 'cannot be installed on this device' ;).
 
While it's probably covered in the review, you do need the Anniversary Update *just to install* this game after purchasing the Xbox/W10 code (got mine from Amazon). Without it, Windows will claim that it 'cannot be installed on this device' ;).

Got me, too. Regular ol' W10 Pro 64 bit won't git er done. And the DL is 80GB, I think GTAV was just 69GB.
 
Game is on sale for $30 at the Microsoft Store.
Oh shit, thanks for the heads up! 30 bucks for this game, SOLD. I signed in and grabbed dat shiznit with tha quickness.

Been playing through the game on PC with a friend over the past few weeks via local split-screen co-op; now we can try out the cross-system PC+XB1 LAN co-op. Sucks that a Win10 PC won't allow two people to sign in at the same time, but I'm sure that'll come with time. The XPA program pays off with this game.

Wasn't expecting Gears 4 to turn out as fun as it did, and certainly wasn't expecting such an excellent PC version, especially with the use of DX12. Forza 6 on Win10 PC also impressed me with its DX12 performance. First Doom 4 with Vulkan (though performance was also awesome on OpenGL), now this game with DX12: 2016 has been amazing, and we also get to see UE4 put to good use. Not to forget AotS; DX12 performance there is tight, but I await Vulkan support.
 
Until this game supports MGPU, I'm going to give it a pass. I've got more than enough horsepower to run this at 4K, but not if half of my video power is sitting idle.
 
Until this game supports MGPU, I'm going to give it a pass. I've got more than enough horsepower to run this at 4K, but not if half of my video power is sitting idle.

Let me know if I'm reading this wrong...

You're going to pass simply because your video card(s) can effortlessly play this game? It's a fantastic game, and you're skipping out for such a trivial reason that you have to wonder why you're into PC gaming in the first place.

Going back to the article:

I didn't get the impression that you all knew the depth of field setting only applies to in-game cinematics. The setting should make no difference at all during game play. The performance hit is coming from the insane screen space reflections alone. 30fps or less in cinematics is not a deal breaker.
 
Let me know if I'm reading this wrong...

You're going to pass simply because your video card(s) can effortlessly play this game? It's a fantastic game, and you're skipping out for such a trivial reason that you have to wonder why you're into PC gaming in the first place.

No, I may have communicated that wrong.

What I meant was that both my cards working together would be able to play this game perfectly, but without SLI support, only one of my cards will be running and would not be able to optimally run the game.
 
No, I may have communicated that wrong.

What I meant was that both my cards working together would be able to play this game perfectly, but without SLI support, only one of my cards will be running and would not be able to optimally run the game.

That makes sense. It wouldn't be worth shooting for 4K with that hardware, knowing you'd have to bump the settings down a bit. That forum post linked above does seem to indicate mGPU support is in the works, though.
 
That makes sense. It wouldn't be worth shooting for 4K with that hardware, knowing you'd have to bump the settings down a bit. That forum post linked above does seem to indicate mGPU support is in the works, though.

In The Works™

Which is corporate speak for "Please give us money now, we'll promise you anything!"

The proof is in the pudding.
 
Still waiting for DX12 SLI support in BF1 too, while this last driver/patch pair messed stuff up for DX11 and SLI.
 
Not sure, but this is an Nvidia-sponsored title, and to some people UE4 itself is one giant GameWorks, lol.


Well, the second part of that is absolutely false; the UE4 engine is IHV-agnostic. Yeah, there are branches of it that have GameWorks libs, but that's it. People who say that are really idiots, and too lazy to download something that is free and testable; even more than idiots, they are ignorant, because all the tools are available to test, yet they don't.
 
The GTX1060 and RX480 are neck and neck. Again. Looks like we have great competition in the mid-range cards between Nvidia and AMD.

That is not great, because in the same breath...AMD has left the high end to NVIDIA...where the most profit per unit is.
That's why the financials look like they do.

If you told people 3 years ago that AMD would NOT be competing in the high end...you would have been laughed off the forums...and now you try and spin it as a good thing...really?
 
Well, the second part of that is absolutely false; the UE4 engine is IHV-agnostic. Yeah, there are branches of it that have GameWorks libs, but that's it. People who say that are really idiots, and too lazy to download something that is free and testable; even more than idiots, they are ignorant, because all the tools are available to test, yet they don't.

You cannot fix stupid...lots of false claims that have been debunked but keep getting used as "idiots' excuses" for AMD's lacking performance:

- Too-high tessellation factors in games.
- The tessellated ocean in Crysis 2
- GameWorks = black box
- Planned obsolescence of NVIDIA SKUs

The list is longer, but you know what I mean ;)

All lies/ignorance....perpetuated by AMD fans to explain why AMD is lagging behind.
 
You cannot fix stupid...lots of false claims that have been debunked but keep getting used as "idiots' excuses" for AMD's lacking performance:

- Too-high tessellation factors in games.
- The tessellated ocean in Crysis 2
- GameWorks = black box
- Planned obsolescence of NVIDIA SKUs

The list is longer, but you know what I mean ;)

All lies/ignorance....perpetuated by AMD fans to explain why AMD is lagging behind.
And yet you just prove you are as ignorant as they are.

GW is a black box. It isn't open source, nor are the libs/code available to the public, and no, past versions finally being released to the public doesn't change the current iteration's black-box nature.

And planned obsolescence is part of nearly all companies; Nvidia is just on a shorter timetable than most. Previous generations receive little to no attention in driver updates, hence most people saying "gimping". Then posters like you try as hard as you can to make it sound as if they are claiming regression rather than stagnation. There is more than enough proof of this even in the articles posted to debunk regression, which actually proved the stalling/gimping.
 
And yet you just prove you are as ignorant as they are.

GW is a black box. It isn't open source, nor are the libs/code available to the public, and no, past versions finally being released to the public doesn't change the current iteration's black-box nature.

And planned obsolescence is part of nearly all companies; Nvidia is just on a shorter timetable than most. Previous generations receive little to no attention in driver updates, hence most people saying "gimping". Then posters like you try as hard as you can to make it sound as if they are claiming regression rather than stagnation. There is more than enough proof of this even in the articles posted to debunk regression, which actually proved the stalling/gimping.


All of the GameWorks libs that came out with DX11 are open source, and for all of the GameWorks that has been integrated into UE4 (separate branches), the source code is available and has been available for quite some time now.

Planned obsolescence is not the same thing as "gimping". People were saying drivers were holding back older-gen cards because nV was trying to make their older cards look worse on purpose. That was NOT the case. Neither AMD nor nV do driver optimizations for older cards once they go EOL. Now, nV has gone through 3 generations of cards in the same time AMD went through 1, and with memory amounts and raw shader capabilities changing more on nV cards, it can be expected that newer games will have different adverse effects on their cards, which was construed as nV "gimping" their cards. It had nothing to do with nV not optimizing their drivers, as you stated. If you want me to link you to the 3 threads here and the articles around the web that have cited and tested this, I am more than happy to, but again, it seems your memory is being a bit shifty on the matter.

Since you stated the articles seem to prove your point, please link them and we can discuss, because as I said, your memory seems to be a bit shifty.
 
And yet you just prove you are as ignorant as they are.

GW is a black box. It isn't open source, nor are the libs/code available to the public, and no, past versions finally being released to the public doesn't change the current iteration's black-box nature.

And planned obsolescence is part of nearly all companies; Nvidia is just on a shorter timetable than most. Previous generations receive little to no attention in driver updates, hence most people saying "gimping". Then posters like you try as hard as you can to make it sound as if they are claiming regression rather than stagnation. There is more than enough proof of this even in the articles posted to debunk regression, which actually proved the stalling/gimping.
The only reason GCN 1.1 is still supported is because it's still used in current products. GCN 1.1 is nearly as old as Kepler. The 390X was released two years after the 290X while using the same architecture.

We've had this discussion before, with BabelTech posting a good article addressing these concerns.

https://hardforum.com/threads/has-n...the-gtx-780-ti-vs-the-290x-revisited.1895788/
http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/view-all/
 