AMD Catalyst 15.2 Beta Driver - Update

Dying Light driver would have been nice, especially since AMD took so much bad press for that game.
 
... yes?? I mean if you've waited three months what does it matter?

I think "unsupported garbage" is a bit strongly worded. What problems are you having? Very little has been released in the last couple of months after the holidays anyway.

Personally, I've been consistently frustrated ever since I switched to AMD by the lack of very basic functions I've had since the mid-2000s (probably earlier, but that's about when I seriously switched to PC gaming), like setting a custom resolution and downsampling.


So, yay! AMD does the bare minimum for all but their flagships to work, and for their flagships, seemingly, they do slightly above the bare minimum required for them to work! I'm really, really frustrated with the general driver support of these things. They basically don't go out of their way to do anything. It FEELS like the budget option. They ARE the budget option, and they FEEL like it. That's just sad.
 
Downsampling wasn't introduced as a major driver feature until NVIDIA added DSR last year. Now both AMD and NVIDIA support it, although NVIDIA's version currently has broader support. So I'm not sure that's the best example of your point.

I've used NVIDIA and AMD GPUs pretty extensively and the only thing I really like about NVIDIA's drivers over AMD is the ability to force AA in a wider variety of games using NVInspector. Other than that, they are pretty much on par for features.
 

You can also use Radeon Pro for AMD drivers to do everything that NVinspector does for Nvidia cards.
 
Radeon Pro does work, but I had better success with a wider variety of games under NVInspector. That said, once DSR and VSR are more broadly supported, it will be kind of a moot point. :) I can run 3800x1800 on my 290Xs now, which is pretty good.
 

Actually, you've been able to downsample manually with Nvidia's drivers for ages, years upon years, and to create custom resolutions for just as long.

Doing the same two things on AMD requires CRU for custom resolutions (which also, for whatever reason, breaks HDCP), or freaking HEX EDITING to downsample. I don't even know if the hex editing still works. It was obtuse as hell that it had to be a thing at all.

Aside from that (which I honestly do consider fairly major), my experience with AMD has been fine enough. They're just slow as hell to do anything, and something ultra basic (CUSTOM RESOLUTIONS) should simply be there.
 
It still works fine.
Yeah but there won't be any new features added. And it still doesn't do everything ATT did.

I switched to AMD back in 2008 for its open platform, due to RBE and ATT. Now it looks like everything has gone under, Radeon Pro being the most recent. NVInspector is now the best 3rd-party driver tool, and it's for Nvidia cards.

At least RP still works, and will be updated to stay working. That's better than nothing.
 
I honestly never considered using custom resolutions and downsampling until maybe a couple of years ago. I can understand why people consider it a big deal now, but I don't think it's anything that was on either company's radar since you could do SSAA just fine. It also wasn't until the last 3-4 years - when consoles really started to hold back PC gaming - that we had enough extra power to downsample games effectively.
 

I don't believe there's any guarantee RP will be updated, is there? The original developer was hired away by AMD and now does the awful Raptr program.
 

Custom resolutions have been possible through CCC for... ever. What are you on about?
 
The dev who created RadeonPro has stated in the past that he will update it if there is a need to.
 
Custom resolutions might be possible, but custom resolutions for the sake of downsampling are not possible. That's how it worked for Nvidia control panel, though.
 
Unless there's some magic option I'm not seeing, where? Show me, because I sure haven't found it.

Well, there is an option to create custom resolutions within the specifications of your monitor. IIRC I used it once to lop a pixel off my screen to correct a screen rendering anomaly in some game (Crysis 2?).

Open CCC - Click on My Digital Flat-Panels - Click on HDTV Support. On the right, where it lists resolutions and refresh rates, there is a button labeled "Add". It opens a screen where you can experiment.

Edit: But, yeah, no downsampling.
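One practical limit on the custom resolutions you can add through that CCC screen is the pixel clock the display link can carry. A rough sketch of the check, purely illustrative: the `approx_pixel_clock_mhz` helper and its ~20% blanking overhead are my own ballpark approximation, not how the driver actually computes CVT timings.

```python
# Rough sanity check before adding a custom resolution: estimate the pixel
# clock and compare it against the display link's limit. The 20% blanking
# overhead is a ballpark figure; real drivers compute exact CVT/CVT-RB timings.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Estimate the pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# Single-link DVI tops out around a 165 MHz pixel clock:
for mode in [(1920, 1080, 60), (2560, 1440, 60)]:
    clock = approx_pixel_clock_mhz(*mode)
    fits = "fits" if clock <= 165 else "needs dual-link DVI / DisplayPort"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}: ~{clock:.0f} MHz ({fits})")
```

This is also roughly why the CCC dialog will reject some modes outright: they simply exceed what the connection can push.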
 

That's... actually really awesome. Kind of a mega-obtuse place to put it, but wow. I really didn't know that existed. Thanks! You have no idea how much that helps me. I want to give you a giant sack of thank-yous.
 
http://www.tomshardware.com/reviews/dying-light-benchmark-performance-analysis,4060.html

I guess it's over. Even Tom's reviews the game and says to just get an Nvidia card, as it's cheaper and works better, with no mention that I saw of Nvidia not playing fair. If Nvidia and its GameWorks are really doing what AMD says, then AMD needs to take some f**king action.
For me, there's no reason to continue buying AMD cards.
Saving $20-$50 isn't worth the headaches it causes. I don't want to reward Nvidia for its bullshit GameWorks stuff (if you believe AMD) but at the same time I can't deal with every game running better on Nvidia.

Literally every game I've played on my 280X, aside from Tomb Raider, has launched with the "Nvidia logo" and every benchmark I see has the GTX 770 around 10-15% above my 280X.
I've been waiting for an AMD driver for Dying Light, it's been like 2 months but at this point I'm done with the game and have moved on to AC: Rogue -- another Nvidia game. "AMD's hardware isn't well-optimized for this game yet" ... Yet? The game came out January 27th, Nvidia had their game-ready driver available the same day. How long are we supposed to wait for this game to be "optimized"?

If Fiji manages to absolutely annihilate the current Maxwell cards and it looks like Nvidia don't have anything for 6+ months afterwards (mid-range GM200 cards) then I'll be buying an AMD card. But at this point I'm done making sacrifices for them. The world has changed a lot in the last 2 years and it seems to only be getting worse for AMD. If I buy a new AMD card then it means I'm committing myself to another 3+ years of what appears to be a sinking ship. AMD might actually fold during my ownership of a 380X/390X. That worries me.

// I really wish Toms tested an i7 or something above the 4690k. Wanted to see how those extra cores/threads might help in Dying Light...
 
Extra cores beyond 4 don't make any meaningful difference, at least according to Techspot.

You may also want to look at Techspot's own benches, as they paint a very different picture from Tom's, at least when it comes to average FPS. 290X is second only to 980 at every tested resolution, and at 1600p it's basically tied with the 980.
 

I wonder if GameWorks will continue to be a black box now that PhysX is supposedly joining the open source party. I recently came into possession of a GTX 670 and all my issues with Far Cry 4, Dying Light, Asscreed Unity... all suddenly went away. Not only am I getting better frames compared to my 7970 in all those games, plus a stutter-free experience, but I'm also able to use slightly higher settings, and most of the exclusive Nvidia eye candy works flawlessly. Warframe and Planetside also feel much smoother for some reason, and it's definitely not placebo, considering I have over 5k hours of combined playtime in both games.

I'm actually so impressed I'm demoting the ole 7970 to HTPC living room duty. I really want to support the underdog and the company with open source ideals; it's an internal conflict for sure, especially in light of Nvidia's shady business practices. It almost feels like I'm being blackmailed into using their cards, but then I come to the realization that this is AMD's problem and not mine.
 
Techspot's game benchmarks are garbage... all of them.
Their numbers are always too high.

lol OK then look at Guru3D's benches. Same story here -- 290X second only to 980. And at 1440p the 980 only has a 6% lead over the 290X.

HardwarePal also shows the 970 being just a touch faster than the 290 (not the 290X), which is entirely consistent with what Techspot and Guru3D have found. If anything Tom's testing showing the 970 beating the 290X with a sizable margin at 1440p is the outlier here.
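Since a lot of this back-and-forth is about relative leads rather than raw FPS, a tiny helper makes the site-to-site comparisons concrete. The FPS figures in the example are placeholders I made up, not numbers taken from any of the reviews above.

```python
# Express one card's average FPS as a percentage lead over another, so
# results from different review sites can be compared on the same footing
# even when their absolute numbers disagree.

def percent_lead(fps_a, fps_b):
    """How much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: a 980 averaging 53 FPS vs a 290X averaging 50 FPS
# works out to a 6% lead:
print(f"{percent_lead(53, 50):.0f}%")  # 6%
```

Comparing leads like this instead of raw averages sidesteps the "TechSpot's numbers are always higher" problem, since a site's benchmark scene shifts everything up or down together.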
 
I didn't say their rankings were wrong, I said their numbers are too high. And they are.

Here's another

Watch Dogs Guru3D: http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html
Watch Dogs TechSpot: http://www.techspot.com/review/827-watch-dogs-benchmarks/page4.html

TechSpot is 20-40fps over Guru3D, and they're even using 1200p as opposed to 1080p.
Same goes for the Dying Light benchmarks. And literally every other benchmark TechSpot has ever posted. Always over, never under.

It's bizarre, sure. I stopped looking at TechSpot a long time ago. They have become the go-to "It's not as bad as everyone says" benchmark website, and they're bullshit.
 
I don't know if I'd say TechSpot is over on every game, because at least for Crysis 3, their numbers are under compared to Guru3D. Same idea with BF4 and Far Cry 3 although it's less of an apples to apples comparison there.

Anyways even if we threw out Techspot, Tom's results are still anomalous when compared against Guru3D's and HardwarePal's. That was my main point.
 
I have over 1,800 games on Steam. If Nvidia and their developer network want to F over AMD users then I can certainly boycott their products with ease. Someone mentioned Warframe earlier and I get a really high frame rate since they added in the new "PhysX like effects" for AMD cards. That game runs really smooth for me.

To tell you how bad GameWorks optimization is for AMD cards: Project Cars on a GTX 560 ran almost 2x as fast as on my R9 290 when you turn on the weather. It's getting better, but I'm not holding my breath for a miracle. And the game is capped at using 2GB of VRAM to make older Nvidia cards shine. If that shit isn't shady, then tell me what shady means.
 
Curious to see how Evolve benchmarks look once GameWorks features get patched in.

Right now, Evolve favors AMD hardware to the point where a 290X performs better than a 980 at 2560.
 

It's been over for AMD @ Tom's for a while now.

^^^THIS^^^
 

Techspot said:
This limited how we could test Dying Light and in the end we benchmarked walking around two levels of the building where the game starts. Interestingly, the frame rates are much lower in-doors anyway, so this worked out well.

There are no quality presets in Dying Light so you must set everything up. For testing we maxed out every setting with the exception of Nvidia HBAO+ to try and keep the results consistent across AMD and Nvidia cards. That said, we did leave 'Nvidia Depth of Field' enabled.

I don't have Dying Light so I don't know if the beginning level has much open world to it, but once you're in the Dying Light open world, the game starts greatly favoring Nvidia hardware.
 
I don't have Dying Light either so I'm just going by what the review sites are telling me. With that out of the way...

Guru3D's benchmark level involves going outside to the open world:

Guru3D said:
Our test will be a level recording in the scene where you first go outside. This way you can mimic the test at home and compare a little. The benchmark can be quick or slow depending on your graphics card, resolution and image quality settings. Typically for a benchmark run there will be a scene rendered where the output of the number of frames rendered over time equals an average framerate.

The benchmark clip seems pretty open world to me.

HardwarePal does a similar benchmark:

HardwarePal said:
One of the most demanding areas of the game is when you exit buildings into the open world. That’s where I decided to start out the 60 second run that can be seen in the video below.

The benchmark clip is not as open world as Guru3D's, but it's still outside.
 
You have to compare OC vs OC for the 970/980 and that is where they shine. You guys are likely right that the 290x beats the 970 at stock, but once you OC the 970 should tie/pull slightly ahead. The ref 970 is ~1200 MHz, people OC it to 1500-1600. That's somewhere around a 30% OC.

The reference 980 is 24.x% faster than a custom cooled 290x according to [H]'s OC vs OC review.

My understanding is you can just turn off GameWorks features. Although, the only game I actually tracked was FC4, where GameWorks worked well on both platforms. Enhanced god rays supposedly hit AMD a little harder, but they just f' up your gamma anyway. To note, I am not a huge fan of GameWorks... always felt like it should be something developed by the game engine creators.

I am not totally convinced it's not just AMD's incompetence though. Everyone likes to ignore that all of Ubi's games are shit for everyone when they launch. SLI didn't work for a few weeks, then Nvidia patched it. Apparently AMD doesn't even release drivers for months. If Nvidia didn't release drivers for months, SLI would still be broken too.
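The overclock arithmetic above roughly checks out. A quick sketch, using only the approximate clocks quoted in the post (not measured values):

```python
# Verify the claimed overclock headroom: a ~1200 MHz reference boost clock
# versus a 1500-1600 MHz overclock works out to roughly 25-33%, i.e. the
# "~30% OC" figure is in the right range.

def oc_percent(stock_mhz, oc_mhz):
    """Overclock headroom as a percentage over the stock clock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(f"{oc_percent(1200, 1500):.1f}%")  # 25.0%
print(f"{oc_percent(1200, 1600):.1f}%")  # 33.3%
```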
 

GameWorks is a mandatory library of features that translate the game into usable code for the GPU. It is encrypted. AMD cannot see what is inside it. Thus they cannot write drivers for it. Nvidia writes the hooks that access the AMD drivers. Those hooks are within the encrypted GameWorks library. AMD used to brute force it, but Nvidia is using a different method of encryption now.

So what AMD does now is write a driver and allows developers to enable what features that they wish to. It is up to the developer / Nvidia to turn on CrossfireX, or whatever else they want. Notice that AMD only has problems with Nvidia GameWorks games. There is a damn good reason for this other than "AMD must be incompetent!"

If games like Far Cry 4 are high on your must be played list then I advise you to buy an Nvidia card.
 

Well that does sound really shitty.

I'd like to think AMD is still great for single card solutions. The 390x seems perfect for it with 4GB anyways.

I personally try to avoid multiGPU from either camp.
 

Just avoid Ubisoft and avoid problems :D, nearly all of their games run like crap regardless of the card you're using. I will say the newer GameWorks stuff does favor the Nvidia side. Doesn't mean the games run well, though. I mean, Far Cry 4 runs like poo compared to Far Cry 3, and it's the same engine and basically the same graphics.


The 290X is faster than the 970, even when overclocked. Heck, my 290 @ 1200/1650 beats nearly every 970 benchmark I've seen. The 980 is certainly faster though, and it really is a decent step up from a 290X/290, even from the 970.

Honestly, the whole GameWorks thing is disappointing. They shouldn't be allowed to lock other people out of its code; other makers of graphics products need to be able to optimise for it. Perhaps Intel and AMD will take action against Nvidia over this. Then again, Intel never really cared much about tweaking their GPU drivers in the first place.
 