Dying Light Video Card Performance Review @ [H]

Well, I guess you should be doing the [H] reviews then? Because they are stating something totally different.

I do not play this game, nor planned to. I was just commenting on the horrible CrossFire/SLI support issue.

It just seems that lately both are very horrible.

It could be something else.. it's PCs after all.. many factors. If you read all of this thread, many say they are not having SLI scaling issues.

I must have had a good run with tri-SLI scaling.
LOFT, SOM, Far Cry 4, Ryse, Dying Light.. all scaled in tri-SLI for me. CPU @ 5.0GHz 4820K, at 4K res.

Sure there will be some that I'm not playing that are broken, but so far the last 4-5 months have been good. I find you have to wait, though.. e.g. I never play games on day 0. Normally I wait 1-2 months after release so a few patches kick in, and then it's good. Normally, by the time Steam is discounting it, it's ready for prime time.
 
I agree, been burnt way too many times by new game releases. The big one I will never forget is Guild Wars 2. After 4-5 months of waiting for a good SLI driver for my 480s, I gave up.

Anyway, not sure how [H] could be getting a totally different experience than the users are reporting?
 
No idea.. would be nice to know what happened during the test, and to retest it and redo the review.
 
Just tested it for you.. First with SLI off, I was getting around 70-85 FPS in the main safe house.. walked outside and I was getting about the same in a big open area.

Turned SLI on, and while in the building I was getting 100-130s.. Outside in a big area it was the same. High of 134 FPS.

Seems fine to me.

Edit: All that at 2560x1440, 144Hz refresh rate.
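
For what it's worth, here's a quick back-of-the-envelope check of that scaling, taking the midpoints of those ranges as stand-ins for proper averages (an assumption, since these were eyeballed ranges, not logged numbers):

```python
# Quick SLI scaling check from the FPS ranges reported above.
# Midpoints stand in for proper averages -- an assumption, not a measurement.
single_gpu = (70 + 85) / 2    # ~77.5 FPS with SLI off
sli        = (100 + 130) / 2  # ~115 FPS with SLI on

uplift = sli / single_gpu - 1
print(f"SLI uplift: {uplift:.0%}")  # ~48% more frames from the second card
```

Roughly 50% more frames from the second card; not perfect scaling, but clearly working.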

Hey Eclypse, are you getting any VRAM hitching at 1440p ultra settings? If not, I might put my Asus G-Sync 1440p back on as the main display for a while.
 
Not really.. maybe once, if that was a hitch. Just ran around the town for about 10 mins going into many apartments.
 
You weren't complaining when we used Tomb Raider, sporting AMD's TressFX technology, for 2 years in our gaming suite, which ran much better on AMD hardware. Or Crysis 3, an AMD Evolved game, which we are still using now. Or other games in our past.

We don't choose games based on who got their hands on them to inject technologies. We are open-minded to any game that is worth it. We look at many factors, including popularity, GPU performance demand, forward-looking and graphically intense 3D effects, special features or effects, and many other things.

GameWorks is a set of technologies, just as AMD has its hand in a set of technologies in some games. GameWorks uses standard DX11 calls, and AMD has the ability to optimize games via drivers. Perhaps you should be more upset at the fact that AMD hasn't released drivers in 3 months for new games.

If the majority of good games out right now have one IHV's set of technologies over the other, maybe you should be complaining to the competition to involve itself more in games. We just take it as we see it, and evaluate games as they are.

There are many more games coming out this year we are looking forward to adding. Grand Theft Auto V is high on my list, as is The Witcher 3. Dying Light, for the record, is not a broken game.

I hope you realize the "sets of technologies" you are comparing aren't even comparable; for one, AMD's technology is open. They even release the source code at dev conferences, most recently their latest TressFX.

Did you notice how quickly NV updated their drivers to boost performance in Dirt 3 & Hitman (AMD's Global Illumination), or even in Tomb Raider itself with TressFX hair? Because it's open, they can actually go through the code themselves to optimize it.

GameWorks isn't open. NV and developers who partake in the program have said publicly that devs who optimize for GameWorks are NOT ALLOWED to share that with AMD.

I noted that of the recent GameWorks titles released, a lot of them released broken, requiring many patches. Even Watch Dogs now has terrible stutters on SLI setups and doesn't even scale at 4K, and it's an NV-sponsored game. What hope does AMD have to optimize such messed-up games when they have to deal with the GameWorks black box?

For the record, during your inclusion of TR in your test suite, you also had Metro LL and Far Cry 3 (NV TWIMTBP titles). Crysis 3 is built upon the Crytek engine, which runs equally well on both vendors, with Crysis 2 favoring NV vastly due to the invisible tessellated ocean.

Now moving forward, if you end up adding Dying Light & Witcher 3 (more GameWorks titles), is there a point to even reading your reviews comparing Radeon vs GeForce? The conclusion is set from the beginning. Tell me if I am wrong; address my concern here, since it's quite valid, because all I see ahead is endless AMD bashing in every hardware review because AMD hardware fails in NV-sponsored titles. Someone less aware would think AMD GPUs fail like that in every game, or that the R290X is 25-30% slower than a 980.

But wait.. when we expand the game selection, such as at TPU, Computerbase.de, Guru3d, etc., suddenly it's only ~10% slower at 1440p, and less at 4K CF vs SLI. See my point, sir?

I've always respected [H] as an unbiased site, that you guys call it what it is, but with your limited games for testing, even if you aren't biased, selecting mostly GameWorks games automatically makes AMD hardware look like crap. If the situation were reversed (for example, if you went with SoM, Dragon Age Inq, Evolve, etc.) I would say exactly the same thing. Selection bias is amplified when your sample size is small (see the sketch below). Do not take this as disrespect but as feedback from a long-time reader & someone who has recommended [H] to gamers to make hardware decisions.
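
To make the sample-size point concrete, here is a minimal sketch; every per-title delta in it is invented purely for illustration, not benchmark data:

```python
# Toy demonstration of selection bias in a small benchmark suite.
# Every per-title delta (card A vs card B; negative = A slower) is invented.
gameworks_titles = {"GW title 1": -0.25, "GW title 2": -0.30, "GW title 3": -0.20}
neutral_titles   = {"other 1": -0.05, "other 2": 0.00, "other 3": 0.05,
                    "other 4": -0.10, "other 5": 0.02}

def suite_average(suite: dict) -> float:
    return sum(suite.values()) / len(suite)

# A 5-game suite drawn mostly from one bucket, vs. a broader 8-game suite:
small_skewed = {**gameworks_titles, "other 1": -0.05, "other 4": -0.10}
broad        = {**gameworks_titles, **neutral_titles}

print(f"GameWorks-heavy 5-game suite: {suite_average(small_skewed):+.1%}")  # -18.0%
print(f"Broader 8-game suite:         {suite_average(broad):+.1%}")         # ~-12%
```

Same cards, same per-game numbers; the headline average moves by several points purely from which titles fill the five slots. That is all I am claiming happens with a small suite.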

Peace.
 
I ran the game on all high settings and still use an EVGA GTX 580 SC 1.5GB. The game looked and performed very well. No stutter or lag at all. I really enjoyed this game; even though the story was predictable, the parkour abilities made this game feel very different than others within its genre. It is a console port, so it should run on any PC that has the same or better specs than an Xbox or PS.
 
Now moving forward, if you end up adding Dying Light & Witcher 3 (more GameWorks titles), is there a point to even reading your reviews comparing Radeon vs GeForce? The conclusion is set from the beginning. Tell me if I am wrong,

You are wrong. Your basic premise, that games utilizing some GameWorks features means "the conclusion is set from the beginning," is incorrect.

Games have utilized GameWorks features and yet performed better on AMD GPUs. Your conclusion is false.
 
I don't think it is overtly false; rather, he has a point, whichever way the end results come out. If indeed AMD cannot optimize completely, or not without great cost, be it man-hours or monetary, then there is the issue that the tests do not prove strength or horsepower but rather only show their state in a particular game. Granted, the test is still valid, not as an overall measure but as an individual outcome on a per-game basis, which by this thread's title is the case. But his concern is valid that, based on your test suite, the average for each card is greatly skewed and not accurate or indicative of the whole.
 
I don't see how the average could be skewed when there simply are not many games coming out that don't use GameWorks. The last title to use AMD's Gaming Evolved was Dragon Age: Inquisition. The fact is there are very few notable titles based on the [H] criteria being released at all, and the few that are just happen to use GameWorks. In the specific case of AMD, they also have not released a new driver in 3 months now.

Looking at the testing suite from their latest GPU review, the ASUS ROG Poseidon GTX 980 Platinum, there are 2 AMD Gaming Evolved titles (Crysis 3, Battlefield 4), 2 TWIMTBP titles (Far Cry 4, Watch_Dogs), and one title using features from the Gameworks libraries (Dying Light).

Given the above list of games, I do not see how there is any bias or skew in the testing suite at [H].
 
Which GameWorks games ran better on AMD GPUs? Watch Dogs, due to no/poor SLI support?

Rather, ACU, FC4, Dying Light, etc. run much worse on AMD.

AMD GE games run very well on any vendor's GPU, case in point: BF4, Crysis 3. They do not skew the results in favor of AMD at the expense of NV GPUs.

Do you not even faintly get the feeling that when you do these performance reviews, somehow the Radeons just look so awful in comparison? I'm sure you do; you dedicate paragraphs to bagging how poorly they perform, or how there's no CF support in a GameWorks title. Except if we read the sites that cover many more games, the picture is totally different: CF works great in non-GameWorks titles; in fact, it's even smoother according to the few sites that still do FCAT testing. Latest examples being Total War: Rome 2, where the 970/980 is a stuttery mess (sweclockers, computerbase.de, pcgamehardware.de), or Ryse & Evolve (the latest Crytek engine games).

Do you seriously not see that, due to your small sample size, picking titles that skew one way or the other greatly affects the outcome, & thus the conclusion is set before you even start?

Don't believe me?

I guarantee that if you have mostly GameWorks titles in your small suite of test samples, the conclusion is set from the beginning. Try it and see: make a performance test with more GameWorks titles. In fact, remember this point in the future when Witcher 3 and other GameWorks games enter your small sample of games.
 
Let's play a thought game: if we were to do a performance review using all AMD GE games, and again using all NV GameWorks games, how do you think the result would look comparing 970/980 vs R290/X?

From seeing the recent reviews from many sites, I can assure you the result would skew very badly against AMD in GameWorks titles, but remain neutral, or show only a small skew against NV, in the AMD GE titles. Now, that can stem from several things, including "AMD drivers suck," since they fail to optimize for GameWorks titles.. or "NV drivers are just good," in that they can optimize easily for AMD GE titles. OR it could simply be what AMD has publicly stated: that GameWorks is very difficult for them to optimize for, since it's obfuscated & not open. Regardless, the results speak for themselves.

I don't like the progression of the gaming industry, where the pushing of closed features means that if I want to play a Ubisoft or GameWorks title I should get an NV GPU.. and if AMD ever pull that crap with GE (not releasing the source code of their features), I would have to get an AMD GPU to play GE games. It's not healthy.
 
The mistake is that you are equating branding and technology programs with the determining factor for performance, but trust me, the case is not so. We don't care who is promoting the games, or which IHV has had input into the game; both vendors get access to these games to make optimizations. There aren't "NVIDIA games" and there aren't "AMD games"; that viewpoint has to stop. There are just games, and we use the latest games, popular games, games that are forward-looking or push graphics and demand GPU horsepower.

The current problem with AMD performance has been lack of driver support over the past several months for new games. The problem, btw, with CF in FC4 was game related, and a new driver will be coming that finally supports CF. Also, CF is working in Dying Light if you choose the option to force it on in games that don't have profiles; again, it is AMD who has slacked off on CF profile support here.

If there are a lot of games with NVIDIA GameWorks features in place, perhaps you should complain to AMD to get more involved in pushing their technologies, like TressFX and other things, into more games. The market of games is what it is, and we can only evaluate it as it is. Trust me on this: the tide that is currently rolling will change. I've seen it shift so many times in the last 25 years, back and forth to NV, to AMD, and back again.

We are not going to start selecting games by which IHV is promoting or injecting features into a game; now that would be bias! We ignore that crap, and just play the latest darn games and show you how they perform.
 
I'm going to end this particular discussion with this: I think there is an exaggeration of the influence of GameWorks in said game. There are, after all, only two GameWorks features in Dying Light: Depth of Field and HBAO+.

We have found that neither performs differently on AMD or NVIDIA GPUs in this game. /thread
 
I think the concern is that GameWorks is closed source. Therefore most are having an issue with the unknown: whether particular performance comparisons actually indicate the real performance of one card brand against another, whereas in those games the only real performance comparison, as a measure of raw power, is between cards of the same manufacturer. Again, posting the performance of said cards in a single game DOES speak to their performance in that game, but when it is used across a number of games as a percentage and related to the strength of said architecture, it stops being indicative of the point of the test.

In short, I think most are just concerned that the closed nature of GameWorks leads to the possibility of inherent bias intentionally set forth, simply because no one other than Nvidia knows EXACTLY what the code is doing. Even in some games where the performance is close with GameWorks in use, the question is open as to whether it is close because the code is agnostic, or because AMD cards are in fact stronger but the code is holding them back to a level that makes the outcome look even.

It is the unknown that gets the imagination going, so you can understand how some come to that conclusion, even if they may be making a bigger deal of it than need be.
 
Ehh, there's a much simpler explanation. IIRC, every single case of a game where AMD struggled in recent times was some kind of open-world game: Dying Light, Unity, Watch Dogs, the MGS V demo, Total War: Attila. Those games have massive CPU demands, which leads us to AMD's DX11 driver CPU overhead.

Since some people aren't aware of it, or try to deny the existence of the whole well-documented issue, they blame it on shitty game code.
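
To illustrate the mechanism, here's a toy model of a CPU-limited frame: total driver cost per frame is just draw calls times per-call overhead. The per-call costs and draw-call counts below are assumptions for illustration only, not measurements of any real driver:

```python
# Toy model: CPU-side frame cost = draw calls x per-draw-call driver overhead.
# All numbers are illustrative assumptions, not measurements of real drivers.

def cpu_ceiling_fps(draw_calls: int, overhead_us_per_call: float) -> float:
    """FPS limit imposed by the CPU/driver alone, ignoring the GPU."""
    frame_time_ms = draw_calls * overhead_us_per_call / 1000.0
    return 1000.0 / frame_time_ms

scenes  = {"corridor shooter": 2_000, "open-world city": 10_000}  # draws per frame
drivers = {"lean driver": 2.5, "heavy driver": 10.0}              # microseconds per draw

for scene, calls in scenes.items():
    for driver, cost in drivers.items():
        print(f"{scene:16} | {driver:12} | CPU ceiling ~{cpu_ceiling_fps(calls, cost):4.0f} FPS")
```

At corridor-shooter draw-call counts, either driver's CPU ceiling sits above the GPU limit; at open-world counts, quadrupling the per-call overhead drags the ceiling down to slideshow territory, which is why the same driver can look fine in one genre and awful in another.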
 
Not entirely true. Also, the debate or concern is GameWorks and the unknown it adds to any conclusion. Granted, DX11 overhead can account for the open-world cases; it doesn't show across all games.
 
Well said, Brent.

Too many AMD advocates here telling us that GameWorks is the devil incarnate. If they spread the FUD enough, they get a chance for free hardware from them.

I have to very seriously question the logic behind condemning a company for taking its hard-earned dollars and putting them back into developing extra features for PC gaming that we would not have had without them. All because AMD can't or won't do the same, the whole of PC gaming should suffer for it? Sorry, but no.
 
All I'll say is [H] reviews have helped me; I think they nail the games I'm interested in. Never have I thought "this is an nVidia title" or "this is an AMD title." I couldn't care less. They try to offer suggestions and not force personal agendas.

Like this review - it reminded me not to get Titan X SLI; multi-GPU is a can of worms. :) It's just tempting because I want a 3440x1440 display.

nVidia has 300 engineers working on GameWorks. That's about $75M a year (300 engineers at a fully loaded cost of roughly $250k each). For better or worse, I don't think it's going anywhere.
 
If I could snag a Titan X SLI setup, you wouldn't have to tell me twice! Haha. Seems like one would be set for at least 2 years. Like the good old days of the 3dfx Voodoo 2 12MB.
 
You are skewing the argument a tad, and granted, the opposing guy is as well. He is justified in his concern over the closed nature of GameWorks. Without insight into the code, none of us, including Brent and other reviewers, can really speak to whether AMD is unfairly hindered or not. Of course this doesn't change the facts of how the cards perform, nor does it belittle the efforts of reviewers to let us know.

The debate should only be about the issue of being closed and the results, usually in trust, that being closed garners; hence the guy's complaint. However, there is nothing we can do, nor can the reviewers, so griping will do little.

Both sides need to tone it back a bit. Closed is bad; that should be agreed 100% across the community. But we can't badger reviewers completely, as it is not their fault, and all they can do is bench the cards and give the results. But maybe adding a note about possible optimization issues due to the closed code might help alleviate the animosity.
 
You weren't complaining when we used Tomb Raider, sporting AMD's TressFX technology, for 2 years in our gaming suite, which ran much better on AMD hardware.
The code for TressFX was made available so Nvidia could optimize for it. GW is a "black box" affair; AMD will never have access to it or be able to do any optimizations. The two are not comparable.
 
AMD developed a whole new API. Then they gave their work to msft and Khronos to be used for the PC gaming world. I'm not sure how you could miss that and come to the conclusions you have.
 
The way I see it is this:
AMD knew DX12 was coming.
AMD's DX11 drivers are lacking in multithreaded performance.
AMD decided to "jump the gun," toss out an IHV-specific API, and steal some of DX12's thunder... hoping to get "Mantle" adopted widely in the business.
(AMD was part of DX12's development, so they knew how far along DX12's progress was.)
And now we have forums full of people claiming AMD "invented" DX12... cart before horse if I ever saw one.
Microsoft rejected "Mantle" in 2013:
http://semiaccurate.com/2013/10/16/microsoft-rejects-mantle/

Listing the DirectX history might be useful, as it will show that DirectX has always been evolving:
[Image: Microsoft DirectX release history timeline, GDC 2014]


Notice the gap between DirectX 9 and DirectX 10: 4 years.
We got DirectX 11.2 in 2013.
A 2015 launch for DirectX 12 = only 2 years... half the timeframe from DirectX 9 -> DirectX 10.

AMD's PR has a habit of overhyping and underdelivering (or failing to deliver):

- GPU Physics (http://hexus.net/tech/news/graphics/5838-ati-demo-havok-fx-physics-acceleration-radeon-gpus/): MIA since 2006
- Tessellation (http://community.amd.com/community/...03/why-we-should-get-excited-about-directx-11): then came NVIDIA and outperformed AMD in that area... and AMD quickly moved on to "too much tessellation"
- Bulldozer (http://www.techpowerup.com/138328/bulldozer-50-faster-than-core-i7-and-phenom-ii.html): we all know how that went...
- AMD's APUs: still not good enough to replace GPUs.. and they never will be, chasing a moving goalpost (game hardware requirements are not locked in a vacuum; they keep going up)
- Mantle (http://www.anandtech.com/show/7371/understanding-amds-mantle-a-lowlevel-graphics-api-for-gcn): we all know the story... the promises: IHV-agnostic, open source, etc.... the reality: only for AMD. And Mantle is dead: http://www.pcworld.com/article/2891672/amds-mantle-10-is-dead-long-live-directx.html

Add the fact that it seems NVIDIA not only beats AMD's Mantle performance with DX12... they also beat AMD's DirectX 12 performance:
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3

I know a lot of people hope for a lot from AMD.
But hope based on conjecture, denial of AMD's shrinking R&D budget, and hyped PR will only set you up for disappointment later on.
 
AMD developed a whole new API. Then they gave their work to msft and Khronos to be used for the PC gaming world. I'm not sure how you could miss that and come to the conclusions you have.

A whole new API that was designed around GCN. Khronos is taking some elements from Mantle, and the API that comes from it will have input from all sources instead of just AMD.

Mantle was "open" because it was built for GCN and it would not have the benefit for other architectures from the competition.

Let's not forget the first and most important point: AMD created Mantle. It allowed their CPUs to catch up to Intel for gaming in a lot of cases, when that was not happening with DX.

I would fully support AMD creating their own version of GameWorks. Anything that brings extra features and goodies to PC gaming is good for PC gaming.
 
I hope you realize the "sets of technologies" you are comparing aren't even comparable; for one, AMD's technology is open. They even release the source code at dev conferences, most recently their latest TressFX.

Exactly, I'm 100% with you on this.
 
Intel CPUs also get a speed boost from "Mantle"/DX12-style APIs, so it's back to the status quo:
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4

So no, AMD didn't "catch up". Sorry.
 