Just Cause 3 to feature GameWorks!

You can't blame nV for that; it's their software, and they aren't going to go out of their way to make it better on another vendor's hardware. That would be doubly counterproductive on their end: they would have to spend the money to optimize for the other vendor's cards, and doing so would help sell the other vendor's cards. What company in its right mind would do this, unless adoption of its tech is too low to benefit otherwise?
I dunno. AMD seems to have gone out of their way to make TressFX work just as well on Nvidia's hardware as their own. Nvidia's HairWorks on AMD hardware? Not so much.

AMD provided the code to NVIDIA when they created TressFX, while NVIDIA only provides .dlls for HairWorks in The Witcher. AMD devs can't see what's going on to optimize for it, yet NVIDIA has full source code access to TressFX.
 
Probably not worth dredging the argument WRT TressFX back out again. One could look at that as improving the gaming world or foolishness on AMD's part, depending on one's view.

Yes, it'd be nice for all us AMD gfx wielding nerds to have better optimizations wrt Gameworks, but, eh, the functionality is what it is. I'm happy to use it when it's to my liking and doesn't ninja-chop frame rates.
 
I dunno. AMD seems to have gone out of their way to make TressFX work just as well on Nvidia's hardware as their own. Nvidia's HairWorks on AMD hardware? Not so much.

AMD provided the code to NVIDIA when they created TressFX, while NVIDIA only provides .dlls for HairWorks in The Witcher. AMD devs can't see what's going on to optimize for it, yet NVIDIA has full source code access to TressFX.


TressFX is nothing compared to GameWorks. It's been, what, two years since 2.0 was released, and outside of hair has there been anything else to it?

AMD really needs to step up their dev support, but that costs money they don't have, so they have to focus on better products for the market (including getting them out in a timely fashion) and then win back market share, which will push devs to their side automatically. There are many steps to this, and blaming nV for doing what's best for their own company is not one of them; it's wasted energy. The devs really don't care what consumers think about GW, because they know the most important thing to their customers is how immersive the game is, and GW is a part of that, but a very small part.
 
I dunno. AMD seems to have gone out of their way to make TressFX work just as well on Nvidia's hardware as their own. Nvidia's HairWorks on AMD hardware? Not so much.

AMD provided the code to NVIDIA when they created TressFX, while NVIDIA only provides .dlls for HairWorks in The Witcher. AMD devs can't see what's going on to optimize for it, yet NVIDIA has full source code access to TressFX.

And yet, which one has been showing up in more and more games over the last few years?

Sometimes the proof is in the numbers, there's a reason a lot of games have implemented GameWorks effects.

GameWorks is just a suite of effects, it is middleware, nothing more.

Ever since we learned about TressFX I've been wanting it in games, but for whatever reason, it just hasn't happened /shrug

Again, just look at the number of games with GameWorks effects, and the number of games implementing AMD technology, there is a huge discrepancy, and that has to mean something. The industry has definitely spoken on what it prefers.

I am looking forward to the TressFX in Deus Ex next year.
 
And yet, which one has been showing up in more and more games over the last few years?

Sometimes the proof is in the numbers, there's a reason a lot of games have implemented GameWorks effects.

GameWorks is just a suite of effects, it is middleware, nothing more.

Ever since we learned about TressFX I've been wanting it in games, but for whatever reason, it just hasn't happened /shrug

Again, just look at the number of games with GameWorks effects, and the number of games implementing AMD technology, there is a huge discrepancy, and that has to mean something. The industry has definitely spoken on what it prefers.

I am looking forward to the TressFX in Deus Ex next year.

Now maybe you don't mean to, but you seem to imply that GW is better than anything else. Just because it gets used a lot doesn't mean it is better. It could just be more easily accessible or financially easier, neither of which means better.

Honestly, I would like to see some data on when these games decided to add GW. Most of the time it seems to be announced only a month or two before release, which could go a long way toward explaining some of the out-of-the-box performance issues we've seen. I'm not saying those issues are on GW; they're on the devs, who seem to rush GW in to patch over their problems.
 
And yet, which one has been showing up in more and more games over the last few years?

Sometimes the proof is in the numbers, there's a reason a lot of games have implemented GameWorks effects.

GameWorks is just a suite of effects, it is middleware, nothing more.

Ever since we learned about TressFX I've been wanting it in games, but for whatever reason, it just hasn't happened /shrug

Again, just look at the number of games with GameWorks effects, and the number of games implementing AMD technology, there is a huge discrepancy, and that has to mean something. The industry has definitely spoken on what it prefers.

I am looking forward to the TressFX in Deus Ex next year.

As much as I'm on the "Who freaking cares?" side of the whole "GameWorks conspiracy", Nvidia has made no attempt at hiding the fact that they hand out fistfuls of cash to developers to use their tech. I think "The industry has spoken on what it prefers" and it prefers buckets of money.
 
Good for Brent for having the balls to comment in this thread. It was primed from the start to be a shit fest.

I am of the mindset of: don't like GameWorks, turn it off. It's not like there is an alternative for half of this without tons of $$$ spent by the devs.

Also, if you read Brent's Far Cry 4 article, GameWorks incurred basically the same percentage performance hit on either vendor's card, within negligible differences IMO. Pretty darn good for AMD investing NOTHING into it while nVidia has 300 engineers working on it.

Never mind that GameWorks also works great on consoles, which nVidia has no stake in.
 
I think "The industry has spoken on what it prefers" and it prefers buckets of money.

Unfortunately I think this is the most accurate take on things. Money talks and BS walks, and in AMD's case they just don't have the money to buy favor the way Nvidia and Intel do.
 
Now maybe you don't mean to, but you seem to imply that GW is better than anything else.

GameWorks is just a suite of effects. I don't have a preference. I do welcome new technology in games that push forward the graphics envelope, whatever form that may be.

All I'm saying is, there's a reason most games use GW; that fact means something.

A discrepancy exists between games that use NV technology, and games that use AMD technology.

For whatever reason, game developers keep choosing GameWorks features.

Maybe AMD needs to create its own GameWorks-style suite of features and push back? AMD needs to compete with NVIDIA in this area, big time. It is squarely on AMD's shoulders to be aggressive here. If not, NVIDIA is going to continue to dominate in game features, because if there is one thing NVIDIA is, it is aggressive.
 
I wonder why they still post "recommended specifications". No way I can rely on that.
This was maybe true back in the '90s and early '00s, but today, there's no way I can run JC3 @ 4k with a 780 Ti.
 
I wonder why they still post "recommended specifications". No way I can rely on that.
This was maybe true back in the '90s and early '00s, but today, there's no way I can run JC3 @ 4k with a 780 Ti.

You make a good point. Personally I think there should be 'minimum', 'recommended', and 'maximum' specifications.
 
For like, a week. You should really blame Nvidia for not having a driver out sooner.


You see what I did there?

No, it still runs like shit, even on AMD cards. You think TressFX doesn't carry any performance hit on AMD? Man, it drops about 30 FPS on average, ~40 on maximums and ~40 on minimums. That's an absolutely HUGE performance hit for TressFX on Lara alone, in a game that doesn't have any other hair effect, and on top of that it's still a buggy mess. Who do you blame, AMD or Crystal Dynamics?

I wonder why they still post "recommended specifications". No way I can rely on that.
This was maybe true back in the '90s and early '00s, but today, there's no way I can run JC3 @ 4k with a 780 Ti.

Minimum and recommended requirements are based on 1920x1080 gameplay.
 
No, it still runs like shit, even on AMD cards. You think TressFX doesn't carry any performance hit on AMD? Man, it drops about 30 FPS on average, ~40 on maximums and ~40 on minimums. That's an absolutely HUGE performance hit for TressFX on Lara alone, in a game that doesn't have any other hair effect, and on top of that it's still a buggy mess. Who do you blame, AMD or Crystal Dynamics?


What a load of bullshit. Direct from [H]'s own review, using 7970 and 680.

[benchmark graphs from the [H] review]
 
What a load of bullshit. Direct from [H]'s own review, using 7970 and 680.




Sure? Do you have the game? Go and test it yourself.

Direct from my own [G]ame Library ;) 280X at stock settings (same specs as the 7970 GHz Edition).

1080p, everything on ultra settings, no SSAA, no motion blur, no TressFX:

[screenshot]

1080p, everything on ultra settings, no SSAA, no motion blur, TressFX on:

[screenshot]

If you want videos, just tell me and I can do some runs for you :p I don't need you to tell me how it ran in the [H] review ages ago when I have the game and can test it myself any time I want.
 
Sure? Do you have the game? Go and test it yourself.

Direct from my own [G]ame Library ;) 280X at stock settings (same specs as the 7970 GHz Edition).

1080p, everything on ultra settings, no SSAA, no motion blur, no TressFX:

[screenshot]

1080p, everything on ultra settings, no SSAA, no motion blur, TressFX on:

[screenshot]

If you want videos, just tell me and I can do some runs for you :p I don't need you to tell me how it ran in the [H] review ages ago when I have the game and can test it myself any time I want.

My first instinct is to go with sarcasm, but that won't get anywhere.

Doesn't it seem odd to you that you have such a large discrepancy vs. the reviews from 2+ years ago? Especially at those settings? I had a 7970 when the game came out, and I played with TressFX on, and I can't say I recall a performance drop like that.

I don't have my 7970 anymore, but I'm going to disable one gpu and install the game/run the benchmark tonight when I get home.

Edit: A word
 
My first instinct is to go with sarcasm, but that won't get anywhere.

Doesn't it seem odd to you that you have such a large discrepancy vs. the reviews from 2+ years ago? Especially at those settings? I had a 7970 when the game came out, and I played with TressFX on, and I can't say I recall a performance drop like that.

I don't have my 7970 anymore, but I'm going to disable one gpu and install the game/run the benchmark tonight when I get home.

It has been that way for me since I got this 280X at launch, and it's the same with my 390X: a huge drop in performance with TressFX on vs. off, almost on par with SSAA, which is certainly the biggest performance hit in that game...
 
You are running just the benchmark, which stresses nothing but TressFX hair.

For a more balanced run-through, play an actual game level, like we did for our results.
 
It has been that way for me since I got this 280X at launch, and it's the same with my 390X: a huge drop in performance with TressFX on vs. off, almost on par with SSAA, which is certainly the biggest performance hit in that game...

Cool info, I'll install it tonight and post screens/logs when I get done. Since you're having the same issue with a 390X, I should be able to replicate it with one of my 290s.
 
You are running just the benchmark, which stresses nothing but TressFX hair.

For a more balanced run-through, play an actual game level, like we did for our results.

IIRC you were the author/editor of that review, Brent, am I right? I don't want to fully derail this Just Cause 3 topic, but I'm willing to do a full [H]-style benchmark if you have any record of the map/area used for the test.
 
And yet, which one has been showing up in more and more games over the last few years?
Just because it's being used more by developers doesn't automatically mean it's better for us, the consumers. Developers seem more and more concerned with rushing games out the door than with providing a quality product, and GameWorks libraries allow a developer to simply plug-and-play effects rather than developing them in-house. Look at Batman: Arkham Knight as a perfect example of this.

Sometimes the proof is in the numbers, there's a reason a lot of games have implemented GameWorks effects.

GameWorks is just a suite of effects, it is middleware, nothing more.
Exactly, GameWorks is middleware, very much so, because Nvidia supplies GameWorks effects to developers in the form of DLLs, not source code.

Game developers used to take the source code provided by AMD/Nvidia and incorporate it directly into their game. They had the ability to decide the best way to use the effects provided. Except Nvidia now provides this "middleware" to developers in the form of DLLs. Even if the developer wants to pay Nvidia more money to gain access to the source code, they can't share it or make changes to it. Nvidia has gone this route in order to have complete control. They have the final say in how the developer uses the effects. Which is not how it should be.


Ever since we learned about TressFX I've been wanting it in games, but for whatever reason, it just hasn't happened /shrug

Again, just look at the number of games with GameWorks effects, and the number of games implementing AMD technology, there is a huge discrepancy, and that has to mean something. The industry has definitely spoken on what it prefers.

I am looking forward to the TressFX in Deus Ex next year.
I remember back when you were of the opinion that:
http://www.hardocp.com/article/2014/11/06/alien_isolation_video_card_performance_review/8#.VlW9xfKFNv0
The performance that we experienced in this game (Alien: Isolation) is simply fast. We did not expect that this game would perform this good when we first heard about all the fancy DX11 bells and whistles being added to it. We thought for sure this would be the next "GPU Killer" but it was not. We would love to see more game developers utilize DX11 efficiencies in the way The Creative Assembly has.

We have to give The Creative Assembly and AMD big kudos for pushing DirectX 11 and DirectCompute features as far as both did. These kind of technologies using industry standards help move the gaming industry forward, and games certainly need to move in this direction. We saw some interesting technologies put into this game that by using DirectCompute work on both NVIDIA and AMD GPUs.

The great thing about all of this again is that it is done under DirectCompute, allowing NVIDIA and AMD users to benefit from next-generation gaming. No vendor lock outs here. We need more of this.
Which is the exact opposite of what Nvidia is doing today with GameWorks.
 
Which is the exact opposite of what Nvidia is doing today with GameWorks.
Making spaghetti sauce from scratch using your own fresh tomatoes and spices tastes so much better than canned sauces... But Ragu is still insanely popular, more so than homemade sauces.
Not everyone has the time, money, skills, or energy to make their own tomato sauce. Sometimes you just gotta use Ragu. It's a thick tomato paste and it tastes like shit, but when it's a choice between Ragu and eating plain noodles, pretty much everyone would take Ragu. So until another canned sauce comes along that tastes better, uses fresher ingredients, and costs less than Ragu, people will continue to use Ragu because it's cheap, easy, and available.

GameWorks is Ragu.
 
Just because it's being used more by developers doesn't automatically mean it's better for us, the consumers. Developers seem more and more concerned with rushing games out the door than with providing a quality product, and GameWorks libraries allow a developer to simply plug-and-play effects rather than developing them in-house. Look at Batman: Arkham Knight as a perfect example of this.

Arkham Knight is a very good game, well made for its intended platform, consoles. WB relegated the PC port to an external studio, which made it awfully crappy. It isn't Nvidia's fault, not even Warner Bros' fault at all; the main error, as you said, was trying to deliver the game as fast as possible.

Exactly, GameWorks is middleware, very much so, because Nvidia supplies GameWorks effects to developers in the form of DLLs, not source code.

Game developers used to take the source code provided by AMD/Nvidia and incorporate it directly into their game. They had the ability to decide the best way to use the effects provided. Except Nvidia now provides this "middleware" to developers in the form of DLLs. Even if the developer wants to pay Nvidia more money to gain access to the source code, they can't share it or make changes to it. Nvidia has gone this route in order to have complete control. They have the final say in how the developer uses the effects. Which is not how it should be.

Both can be provided. GameWorks comes in three flavors: the first is free, where Nvidia delivers DLL libraries to any dev willing to use GameWorks in their game; the second is a paid license from Nvidia that comes with full access to the GameWorks source code (yes, it's the developer who pays Nvidia for this); and the third is a full Nvidia partnership license, where Nvidia sends GameWorks engineers to work with the developer on the title. They only offer support related to GameWorks, not full game development help as many people think.


I remember back when you were of the opinion that:
http://www.hardocp.com/article/2014/11/06/alien_isolation_video_card_performance_review/8#.VlW9xfKFNv0

Which is the exact opposite of what Nvidia is doing today with GameWorks.

What's special about Alien: Isolation? The game runs great, that's right. Is it highly optimized? Not as much as people think, if by optimization we mean no proper AA support and sub-par compressed textures. Most of the lighting and illumination isn't real-time, and even with forced AA settings and injectors the game still suffers from bad aliasing everywhere. Man, Alien: Isolation is entirely a corridor game, everything is tight and closed; the game performs great, but there's nothing cutting edge about it technologically speaking. Yes, all that compute used for smoke and particles is great, and that's the only good thing the game has going for it visually. Completely worthless to include in a gaming benchmark suite.

And since you like quotes, why not show some more?

There is one major image quality concern that the current in-game AA methods do not fix and that is shader aliasing also called "specular lighting aliasing." In the screenshot above note the whitish dotted lines on the wall structure above this door. This is specular lighting that is very aliased, basically a material on geometry that creates this aliasing effect. It is common in games that utilize heavy pixel shader materials and are dark or have stark contrast.

Unfortunately none of the AA methods here, FXAA or SMAA do anything to clear it up and remove this aliasing seen throughout the game. There are AA methods that can clear this kind of aliasing up, supersample AA being one. This is another reason why we would like to see some more AA options in this game, and other PC games as well. When shader aliasing like this is encountered it would be nice to have some AA options to make it go away. It is very bad in this game particularly and needs an in-game option to remedy this.

While we mostly have positive things to say about Alien: Isolation, there are a few negatives. The one "downside" to games that perform this well is that it doesn't push the need for a faster GPU. That said it is great to see more mainstream GPU owners getting to sample high end graphics quality gameplay. While the image quality is quite good, there are a couple of things we'd like to see added. More dynamic shadows please; the shadows in the game seem somewhat pre-baked. More dynamic shadows in Alien: Isolation would go a long ways to making the game even more dramatic and compelling. It would be great for at least high end GPU owners to turn on better levels of detail to even further push the game's immersion factor.

We very much need more AA options in Alien: Isolation. The performance is there for a SMAA "4TX" option. We know SMAA 4X exists, Crysis 3 has it and that's an aging title. Finally, we would like to see a high resolution texture pack for PC gamers, or an option to run uncompressed textures. The performance is there, the technology is there, with BC7 compression we would love to see a high resolution DLC texture pack for this game. Add these few things, and visually the game will be much more engrossing.
 
Just because it's being used more by developers doesn't automatically mean it's better for us, the consumers. Developers seem more and more concerned with rushing games out the door than with providing a quality product, and GameWorks libraries allow a developer to simply plug-and-play effects rather than developing them in-house. Look at Batman: Arkham Knight as a perfect example of this.


Exactly, GameWorks is middleware, very much so, because Nvidia supplies GameWorks effects to developers in the form of DLLs, not source code.

Game developers used to take the source code provided by AMD/Nvidia and incorporate it directly into their game. They had the ability to decide the best way to use the effects provided. Except Nvidia now provides this "middleware" to developers in the form of DLLs. Even if the developer wants to pay Nvidia more money to gain access to the source code, they can't share it or make changes to it. Nvidia has gone this route in order to have complete control. They have the final say in how the developer uses the effects. Which is not how it should be.



I remember back when you were of the opinion that:
http://www.hardocp.com/article/2014/11/06/alien_isolation_video_card_performance_review/8#.VlW9xfKFNv0

Which is the exact opposite of what Nvidia is doing today with GameWorks.

Best post I've seen in a long while, great job putting it all together.
 
Best post I've seen in a long while, great job putting it all together.

He's complaining because it's NVIDIA, which is why you agree with him. All games have bugs. Should we blame AMD for every crappy "Gaming Evolved" title that came out? Battlefield 4 had tons of problems. This is AMD's fault and it's hurting consumers.
 
Making spaghetti sauce from scratch using your own fresh tomatoes and spices tastes so much better than canned sauces... But Ragu is still insanely popular, more so than homemade sauces.
Not everyone has the time, money, skills, or energy to make their own tomato sauce. Sometimes you just gotta use Ragu. It's a thick tomato paste and it tastes like shit, but when it's a choice between Ragu and eating plain noodles, pretty much everyone would take Ragu. So until another canned sauce comes along that tastes better, uses fresher ingredients, and costs less than Ragu, people will continue to use Ragu because it's cheap, easy, and available.

GameWorks is Ragu.

Hah, I like that analogy

I got nothing more to say on the topic, Ragu it is!
 
Ragu sux, therefore GameWorks sux...

That's what I got out of it.

lol
 
Ragu sux, therefore GameWorks sux...

That's what I got out of it.

lol

However, it's better than no sauce, so it's better to have shit sauce than plain spaghetti alone. That's what most people should understand. If we had other, better options we should ask for those options, but options are nonexistent. I like that at least we have one option; worse would be having nothing.
 
So keeping developers ignorant of a better way and in the dark with Gameworks is the solution!? Unbelievable...

Or perhaps a better solution would be teaching the developers to cook on their own, "develop" their own flavor and brand, and showing them the inherent worth and profitability in doing so.

Otherwise everything will just taste like chicken.

(Oh the similes and metaphors)
 
Making spaghetti sauce from scratch using your own fresh tomatoes and spices tastes so much better than canned sauces... But Ragu is still insanely popular, more so than homemade sauces.
Not everyone has the time, money, skills, or energy to make their own tomato sauce. Sometimes you just gotta use Ragu. It's a thick tomato paste and it tastes like shit, but when it's a choice between Ragu and eating plain noodles, pretty much everyone would take Ragu. So until another canned sauce comes along that tastes better, uses fresher ingredients, and costs less than Ragu, people will continue to use Ragu because it's cheap, easy, and available.

GameWorks is Ragu.

Yeah, but in your analogy developers are chefs working in a restaurant. GameWorks in an AAA game is like getting Ragu with a $60 plate of spaghetti.
 
So keeping developers stupid and in the dark with Gameworks is the solution!? Unbelievable...

Or perhaps a better solution would be teaching the developers to cook on their own, "develop" their own flavor and brand, and showing them the inherent worth and profitability in doing so.

Otherwise everything will just taste like chicken.

(Oh the similes and metaphors)
The pervasiveness of GameWorks should be enough proof that Nvidia is doing something right.
... Or they're just getting marketing deals with publishers, and developers actually hate it. That makes a lot more sense.

I don't know about the inner dealings of devs/publishers/Nvidia but the main dev @ War Thunder is very outspoken and had nothing but good things to say about Nvidia.

Yeah, but in your analogy developers are chefs working in a restaurant. GameWorks in an AAA game is like getting Ragu with a $60 plate of spaghetti.
They still have time and budget constraints. For them, using GameWorks is probably the most feasible option. In their eyes Nvidia is PC gaming, so handing things off to them seems like the best idea. They provide the libraries, the hardware, the optimizations, and heck they are responsible for 80% of PC gamers so it's in their best interest anyway.

The plate of spaghetti might cost $60 but only about $1 of that is reserved for the sauce. The PC version, or the enhanced graphical effects on the PC version, have a low budget.
 
How did we get to the point where we're all of the opinion that the only way to get sauce on our spaghetti (love this) is by using GameWorks? All of these devs that use it are major players, and they all used to have to do this work themselves.

Suddenly it's OK that they use shit ingredients because it's too hard to do it themselves on their multi-million-dollar-profit games??
 
Anyone who claims GameWorks is, or should be, the "only way" is wrong.
Also anyone who claims GameWorks is the "best way" is also wrong.

It's just the easiest way.
 
GW has its positives, granted only a few. But to the recent points: what about god rays? Was this being done in AAA games before? I have seen it with .enb mods, but I don't remember any game, apart from ones with GW, having it. Now, I am not saying tessellation is the best way to do it, but it is one way, and GW does bring it to FO4. Without debating its merit to the game too seriously, it does give us a little more, which on the whole is good for those of us who want more graphical WOW.

But that being said, it is the package as it stands that is the problem. Most of the time it seems rushed in, which is never going to be a good thing. And it is closed: AMD is not, in the foreseeable future, going to get access to the code/libraries to optimize for, or to make code changes that alleviate the stress points or strengthen the positives.

My biggest peeve here is the lack of adjustability within the GW options, something that would let everyone enjoy what GW adds to the game, whether in awe or just contentment. As it stands, it seems hell-bent on letting only the select few big spenders and adopters of Nvidia's own new series (spenders includes the Fury series) run these options on day one.
 
Anyone who claims GameWorks is, or should be, the "only way" is wrong.
Also anyone who claims GameWorks is the "best way" is also wrong.

It's just the easiest way.

Not really. In the past, games never had this kind of emphasis on visual effects: most lighting was fixed or pre-rendered, likewise most shadows; low-quality ambient occlusion, low-quality cloth behavior, fixed hair with little to no movement, no wind effects, little to no environment interaction, no global illumination, fixed light sources, little to no real-time volumetric lighting, and a long etc. Even for today's games those are very hard effects to achieve; they always require insane amounts of time, which translates directly into money, which has a direct effect on the development budget. So what's the point of GameWorks? To offer all of these kinds of effects cheaply, even for free. What does that mean? Cheaper games to develop, which leaves more profit for the developer studio, which in turn means they can keep making games. It's a win-win formula. Does the idea of keeping the PC gaming industry profitable while also pushing graphics forward sound bad? To whom? That's great: we need to keep developers making games for the PC and not only for consoles, and only money achieves that goal.
 
GW has its positives, granted only a few. But to the recent points: what about god rays? Was this being done in AAA games before? I have seen it with .enb mods, but I don't remember any game, apart from ones with GW, having it. Now, I am not saying tessellation is the best way to do it, but it is one way, and GW does bring it to FO4. Without debating its merit to the game too seriously, it does give us a little more, which on the whole is good for those of us who want more graphical WOW.

You are probably referring to volumetric lighting ;)
 
The last few GameWorks titles have run fine with the GameWorks specific features disabled, so why is this such a huge controversy still?

Witcher 3 ran fine on my 290X with HairWorks off, and HairWorks ran like shit on most NVIDIA products too, so it wasn't a case of bias.

Fallout 4 runs great on both vendors.

It really seems that the problem is with the developers implementing the features (cough Ubisoft) rather than GameWorks itself.
 