AMD Radeon Software Crimson Edition 16.2

Here is the confusing bit about what AMD is doing: AMD has a good relationship with Oxide (they did help with Mantle), so someone at AMD decided to push the DX12 agenda, which is of course something AMD would benefit from. Their cards are good, their drivers are good, nothing wrong with that, right?

But in reality there is only one DX12 game "out", and from a consumer's point of view, why would they "invest" in DX12 now? There are no finished games, just an Alpha/Beta. At this point in time it is hardly relevant to push a single game to prove DX12 superiority for AMD. They might win back some folks with these benchmarks, but will it work for AMD in the long run? If someone in AMD management decides they will keep doing this, it will not be beneficial in the long run, because of the point that Kyle makes: creating demos to prove superiority is not something that will make people happy if their retail games do not perform the same way. You do not buy a video card or a game just because your hardware does well with it in benchmarks...

Btw, from Oxide's point of view, any press is good press :)
 
Kyle! I don't see the problem here. Ashes of the Singularity is a game with an integrated benchmark, just like some other games. You can now download the upgrade on Steam and play the actual game with the upgraded renderer. So while AMD mentions the benchmark in the driver notes, these basic optimizations are useful for the game itself. Even the Ashes of the Singularity benchmark is a real-world scenario: the workload is constantly changing, simulating the actual game with two AI opponents.
 
But in reality there is only one DX12 game "out", and from a consumer's point of view, why would they "invest" in DX12 now? There are no finished games, just an Alpha/Beta. At this point in time it is hardly relevant to push a single game to prove DX12 superiority for AMD. They might win back some folks with these benchmarks, but will it work for AMD in the long run? If someone in AMD management decides they will keep doing this, it will not be beneficial in the long run, because of the point that Kyle makes: creating demos to prove superiority is not something that will make people happy if their retail games do not perform the same way. You do not buy a video card or a game just because your hardware does well with it in benchmarks...
DX12 doesn't need a lot of investment on the driver side. The layer is very thin, and a lot of the management is done in the application itself or in a universal OS layer/driver that comes with Windows 10 (VidMM/DXGK). So doing an "app ready" driver is not necessary. The IHV can swap the shaders and optimize video memory allocation, but not much else. Hazard tracking and management of resources and of video memory itself are now explicitly controlled by the engine, and that is where the performance from a driver update used to come from with the DX11 API.

For DX12 it is more important to document the hardware and open-source the tools, to help developers make optimizations as good as the ones the IHVs did in their DX11 drivers.
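To give a rough idea of what "explicitly controlled by the engine" means, here is a minimal, hypothetical D3D12 snippet in C++ (texture and cmdList are placeholder names for an existing resource and command list, not anything from this thread). Under DX11 the driver tracks this hazard for you; under DX12 the application records the barrier itself, which is why there is so little left for an "app ready" driver to do:

    #include <d3d12.h>

    // D3D12: the application, not the driver, declares the hazard.
    // Transition a texture from render-target to shader-resource state before sampling it.
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;   // hypothetical ID3D12Resource*
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);      // cmdList: ID3D12GraphicsCommandList*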
 
I don't know if you are so blinded by team red that you fail to read, if your tin foil hat is on so tight it is cutting off the circulation, if you hope to win some subreddit award, or if you suffer from some type of diminished capacity. It could be all of the above.

You know what they say about ad hominem, mate?

The last resort of a scoundrel.

You know why?

Because my argument is sound and backed by proof. Solid numbers, which this very site produced, contradict their conclusion about how bad the efficiency is on the Fury. Go ahead, you do the maths.

What kind of nonsense is this statement from the reviewers?

"You cannot deny the efficiency of the GeForce GTX 980 over the new Radeon R9 Fury. The GeForce GTX 980 is able to deliver more performance per watt. The overall system wattage usage is a lot less on GTX 980 versus R9 Fury."

I can't deny it? WTF, your own numbers show the Fury is able to deliver more performance and, overall, similar or better performance per watt.

So if you have something FACTUAL to say in a retort, go ahead, make my day. Otherwise you're the troll and shill you accuse me of being.

This one is also a gem:

"Somehow NVIDIA's Maxwell architecture is magical when it comes to getting the most performance out of each watt of power. This is something AMD's Fiji hasn't mastered."

Magical? Puh-lease. It's called a software scheduler and a single-engine design that's incapable of processing graphics & compute in parallel, maximizing DX11 at the expense of DX12 performance. Ain't nothing magical about it.
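For context, "async compute" in D3D12 terms just means the engine submits work to a separate compute queue alongside the graphics queue; whether the GPU actually overlaps the two is up to the hardware and driver. A minimal sketch, assuming an existing ID3D12Device* called device (a placeholder, not something from this thread):

    #include <d3d12.h>

    // Create a dedicated compute queue next to the usual direct (graphics) queue.
    // Hardware with independent compute engines can overlap this work with graphics;
    // otherwise the driver may end up serializing it behind the graphics queue.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    computeDesc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ID3D12CommandQueue* computeQueue = nullptr;
    HRESULT hr = device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));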
 
Just a couple points here.

I can't remember the last time I've seen Kyle so active in a thread with only 66 posts (at the time I write this).

On 02/21/2016, I ordered an MSI GTX 980Ti Gaming 6G "Golden Edition" v-card.

On 02/24/2016, the AoS benchmarks dropped, and my 980Ti is supposed to land on my doorstep on 02/25/2016. I would be a liar if I didn't admit that the AoS benches instantly made me wonder if I just made a costly mistake.

No offence to Kyle intended here, but since he recently invested in two Titan X's and a 4K TV, I have to imagine that the AoS benches were not something he wanted to see. I know they're not something I wanted to see. Hopefully Nvidia isn't lying about their cards being able to handle async compute, with the feature just turned off on their cards right now.
 
No offence to Kyle intended here, but since he recently invested in two Titan X's and a 4K TV, I have to imagine that the AoS benches were not something he wanted to see. I know they're not something I wanted to see. Hopefully Nvidia isn't lying about their cards being able to handle async compute, with the feature just turned off on their cards right now.
So you think I am wondering about having the best gaming experience of my life right now, and how it might change months down the road?
 
OK, look at your Tomb Raider review. Now look at all the others out there before yours and before the drivers that gave the performance you saw. AMD definitely put some effort into Tomb Raider, which is very apparent when comparing your review to the others. So it isn't like AMD is only working on benchmarks.
OK.
 
I think we can all agree that we're all in favor of whoever is currently winning. If AMD had a card worth a damn right now, then plenty of us would be on the red team; we just switch sides to fit our needs. There's a reason I haven't gone red since the 6950.
 
Not immediately, obviously. But come on, if Nvidia actually performs this badly on a feature that looks like it will be a must-have in games to come, I can admit I would probably regret my purchase. I realize Nvidia pretty much dominates "current" games and I am very happy ATM, but there has been a seed planted. Just saying, you're pretty active in this thread about AMD's drivers and the AoS benches. Just pointing out a potential reason.
LOL!:ROFLMAO:
 
DX12 doesn't need a lot of investment on the driver side. The layer is very thin, and a lot of the management is done in the application itself or in a universal OS layer/driver that comes with Windows 10 (VidMM/DXGK). So doing an "app ready" driver is not necessary. The IHV can swap the shaders and optimize video memory allocation, but not much else. Hazard tracking and management of resources and of video memory itself are now explicitly controlled by the engine, and that is where the performance from a driver update used to come from with the DX11 API.

For DX12 it is more important to document the hardware and open-source the tools, to help developers make optimizations as good as the ones the IHVs did in their DX11 drivers.

Don't forget the same thing happened with Mantle; it went through some revisions at the beginning as well. I was not talking about the driver though :) .
I was talking about how the benchmark does not include certain DX12 features which might not be supported, or not yet fully supported, in either AMD or Nvidia drivers. In the retail version of the game you would see different results because of driver maturity and Oxide figuring out what works best on which architecture.
The last month of development is usually for polishing and bug splatting ...
 
Update: (2/24/2016) Nvidia reached out to us this evening to confirm that while the GTX 9xx series does support asynchronous compute, it does not currently have the feature enabled in-driver

Nvidia's Upcoming DX 11 Driver Fares better than Mantle API - Benchmark Slides Show Surprising Results

If these benchmarks are true, and it looks like they are, it reveals why Nvidia didn't bother creating a rival API to Mantle but instead optimized their graphics cards for the DirectX 11 API in a way that lets performance be obtained from a simple driver update. Now the golden question that comes to my mind is this: if Nvidia could do all this with a software update, why didn't it do it before? The obvious answer to that question is slightly alarming.

Nvidia is still on a DX11 high ;)

Nvidia Actively Working To Implement DirectX 12 Async Compute With Oxide Games In Ashes Of The Singularity

Nvidia Is Actively Working With Oxide Games To Implement DirectX 12 Async Compute Support For GeForce 900 Series GPUs In Ashes Of The Singularity

Maybe they are and maybe they are not; what is sure is that if you google this you get very dated articles.
 
New drivers bench slightly lower than the last Betas in Firestrike and Dragon Age, but fixed some weird performance issues I had with MW3. I also get a "display driver has stopped responding" error every time I try to run the SteamVR test, but no actual issues in Source games.
 
Cool, we are arguing about a game no one will play but only bench.

You can play the game; it is still in development, with public updates every once in a while as Oxide tries to get the game finished. Oxide decided to go this route, and some people like this RTS game :) . Not too sure what is implemented already ...
 
AMD is right to talk up canned benchmarks because it paints them in a positive light, and it seems to be based on structural advantages of gcn over maxwell in dx12 versus dx11. (And btw, ashes is a playable game right now, so it's not exactly just a tech demo.)

There will be multiple dx12 titles out this year, so it makes perfect sense for amd to want to get the word out that their cards can better handle the newer games... at least if those games bother using things like async compute. I imagine nvidia will lean on gameworks game devs not to bother focusing their engines on large uses of asynch, since that props nvidia up.


Buying cards is not just about the games out today, it's about the future, ESPECIALLY near inflection points where new gpus are about to drop.

Within 6 months we should have more definitive answers about who is on top, and if nvidia shifted pascal in time to take advantage of the new paradigm in a way that matches or bests amd with gcn.


But I have to say this. If I was in the market for a new gpu, my primary concern would not be which card performs best in dx11 games (e.g. tomb raider). I want to know which card is the most forward-looking and better at the games that will be taxing the new api. Because THAT is where the heavyweight games with high-end graphics are going to be focused. And if they are not, they are needlessly leaving performance on the table, so who cares.


If you buy a brand new gpu every 6-12 months it does not matter; if your gpu cadence is more like 2-3 years, picking the right card is more important than getting the winner of the day or at launch.

780ti owners were sh*t on compared to 290x owners, and I suspect the exact same thing will happen to maxwell owners with all these games using asynch because they don't seem to be designed to handle graphics+compute workloads concurrently. Point and advantage to AMD. Of COURSE they should point that out, whether the games have dropped today or not.
 
It means the game is an "early access game". Which means no money from me.

Kyle, you're popular on the AMD subreddit.
Early Access means people, [H] members included, have to actually play it to report bugs and make valid suggestions for the future of the game. That is still "playing" by my definition of the word.
 
In case anyone is curious... you can update this driver with the Vulkan 16.150s from last week (AMD Display Driver 16.150.1009.0, 2/13/2016) and actually end up with newer versions. I tested Ashes before and after loading the Vulkan drivers over 16.2 and got better performance. Never thought I would be the one to mix drivers.
 
In case anyone is curious... you can update this driver with the Vulkan 16.150s from last week (AMD Display Driver 16.150.1009.0, 2/13/2016) and actually end up with newer versions. I tested Ashes before and after loading the Vulkan drivers over 16.2 and got better performance. Never thought I would be the one to mix drivers.

Heh, 16.2 has lower version numbers too... Any bugs with the Vulkan version?
 
If Kyle was actually legitimately concerned about any of this, he'd do his own benchmarking of AoS THE GAME, get the frame times and do some actual journalism. <crickets>

As for people saying nVidia isn't bothering because there are no DX12 games, you are crazy. DX12/Vulkan is the future. Putting engineering effort into doodling around with DX11 for a few frames here and there would be a giant waste of resources. That's how the world works; it's forward-looking (you know, like in capitalism?)
 
If Kyle was actually legitimately concerned about any of this, he'd do his own benchmarking of AoS THE GAME, get the frame times and do some actual journalism. <crickets>

I learned a long time ago that working with unreleased games is simply a waste of money. A lot of times working with newly released games is a waste of money.
 
In case anyone is curious... you can update this driver with the Vulkan 16.150s from last week (AMD Display Driver 16.150.1009.0, 2/13/2016) and actually end up with newer versions. I tested Ashes before and after loading the Vulkan drivers over 16.2 and got better performance. Never thought I would be the one to mix drivers.

So, you installed the Vulkan drivers over 16.2? What games have you tested?
 
So, you installed the Vulkan drivers over 16.2? What games have you tested?
I have tested BF4 (Mantle and DX11), an old Generals game (DX9?), Ashes DX11 and DX12, OpenGL (Wolfenstein: The New Order), and Rebel Galaxy... that's about it. There's a lot I have not tested; you know how AMD bugs are very specific sometimes... there could be plenty. Plus I confirmed Vulkan is working with the SDK apps.
 
So I ran the numbers provided by Bahanime.
GTX980: 320 Watts, average FPS: 51.9
R9 Fury: 367 Watts, average FPS: 60.4

GTX980 = 0.162 frames/watt
Fury = 0.165 frames/watt

Or

GTX980 = 6.2 watts/frame
Fury = 6.1 watts/frame

So mathematically, and therefore empirically, speaking... Kyle's conclusion was wrong.

Bahanime is correct. The R9 Fury delivers more performance per watt than the GTX 980. The R9 Fury is therefore more "efficient" in terms of power usage relative to delivered performance.
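For anyone who wants to rerun the division, here is a trivial C++ sketch using nothing but the full-system wattage and average FPS figures quoted above:

    #include <cstdio>

    int main() {
        // Full-system power draw (W) and average FPS, as quoted above.
        const double gtx980_watts = 320.0, gtx980_fps = 51.9;
        const double fury_watts   = 367.0, fury_fps   = 60.4;

        std::printf("GTX 980: %.3f frames/watt, %.2f watts/frame\n",
                    gtx980_fps / gtx980_watts, gtx980_watts / gtx980_fps);
        std::printf("R9 Fury: %.3f frames/watt, %.2f watts/frame\n",
                    fury_fps / fury_watts, fury_watts / fury_fps);
        return 0;
    }

It prints roughly 0.162 vs 0.165 frames/watt and 6.17 vs 6.08 watts/frame.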



Ps. 6 years ago I was an nVIDIA customer; now I'm not. Kyle is biased, not only in his reviews (as shown mathematically) but in his comments here as well. A biased journalist is not really a journalist; that's called being an editorial columnist. That's how he is behaving here, and it shows in the content of his reviews (when it comes time to opine on the data).

If anyone wishes to dispute any of this, I suggest you use some of that Maxwell "magic" Kyle was talking about. Because you'd be wrong from an empirical standpoint.

I have noticed this as well. I have yet to see the Fury X even come close to maxing out VRAM in any game. For a site that never tests VRAM, they are constantly saying the Fury X runs out of VRAM, or doesn't have enough and stutters because of it. They did that in the GTA V review.
 
One thing I just wanted to throw out: remember the Fermi days? Kyle was very hard on Nvidia. I remember the article he wrote after living with a pair of 480's for a while, and it was pretty negative (power and heat). If I remember right, Kyle and Brent actually recommended NOT buying a 470 or 480 in a review!

I do think [H] goes a little overboard on the AMD swipes now and then (front-page article about AMD cutting its R&D budget, but nothing when Intel did the exact same thing a month prior), but they've done the same to Nvidia when Nvidia was laying eggs on the market. I don't consider it bias necessarily. More like tough love when you bring subpar products to market, and AMD has been bringing up the rear lately. I'm an AMD fanboy and even I have to admit that.
 
I haven't tried this new AMD driver yet, but I did switch from about 15 years of nothing but Nvidia to AMD in the last year. I now own an AMD Fury X card. It's very, very nice!
It's very quiet, runs cool, works flawlessly, and visually looks sharp. The 120mm fan spins up to about 1050-1100 RPM, which is still quiet under full gaming load, and the temp stays in the mid-60s C in a case that doesn't have much ventilation (Cosmos 1010). Much cooler than a 980 Ti would run, because the heat is pumped out of the case since the radiator is mounted to the back of the case. My case is sitting on the floor, and between the Fury X rad and the Corsair H110i GTX my machine is nearly silent under load --- but still packs in a LOT of performance.

I read HardOCP's review of Rise of the Tomb Raider, which Kyle linked earlier in this thread. If you read the review, the paragraph text comes across as if the Nvidia 980 Ti is the clear winner, but if you look at the FPS charts, it looks to me like the AMD Fury X is the clear winner.

....Stranger still???

The game is marketed heavily by Nvidia - it's bundled with Nvidia cards, the splash screen says Nvidia, the main menu says built in cooperation with Nvidia -- it's promoted heavily as Nvidia optimized. Yet the game appears to run better with the AMD Fury X, both single card and dual card, based on the benchmarks in the review.

Furthermore, I'm running the game with an i7 4770K at 4.5GHz with every single setting maxed out and getting what I consider extremely smooth fps at 2560x1600 on my Fury X - with no stuttering or hitching. I don't know why the reviewer was having to turn settings down at 2560x1440, since I have no trouble at max settings at 1600p.

The fact that AMD just released a performance update is commendable, because the game already ran so well on the Fury X - even without the recent update.

I've not followed [H] reviews much recently, and I don't think they were historically obviously biased, but darned if something doesn't seem a bit off with that particular review of Rise of the Tomb Raider. In fairness to this discussion, it does read a bit biased compared to the objective data published.

Signed, a long-time, nearly exclusive Nvidia advocate --- who is now very much surprised to be appreciating a Fury X card. It's quiet, fast, cool, and I haven't encountered any weird/unique driver problems. AMD is the only vendor that supports a hardware PLP 20"/30"/20" monitor configuration, and that's why I initially jumped to the red team, since that is my desktop monitor setup. Frankly, I didn't expect as good an experience as I'm having.

I know some will say this is rubbish, but I also feel like the old adage of slightly better picture quality seems to hold true as well. The image from the AMD card on my home theater projector through HDMI appears to have better black levels than I could ever manage with the Nvidia cards, regardless of the settings I tried to manipulate based on recommendations from different threads I found. I used the full 0-255 HDMI range on both, but the AMD has inkier blacks IMO - without being overly crushed... (my opinions anyway). My previous cards had been Nvidia exclusively since 3dfx and my old Viper S2000. My last few cards were a 460, 560 Ti, and 670 - I then switched to AMD to get PLP support and started with an AMD 285 and now a Fury X.

Reference:
Nvidia GTX670, Onkyo PR5508, Panasonic AE8000U - HTPC blacks suck... - AVS Forum | Home Theater Discussions And Reviews

Glad you had a great experience. I had a similar experience when I switched to Fury X CrossFire from nothing but years of SLI. Switched to a setup that isn't even possible with Nvidia, just like you (49" IPS 4K non-PWM 10-bit FreeSync).

CrossFire has been amazing for me, and although I do need ClockBlocker, it's smoother all day than any SLI I've experienced. I couldn't even select 10-bit with Nvidia cards. Also got my color profile forced in all games.

[H] blasts 4GB HBM and says it's not enough in games, but doesn't even test VRAM numbers. Not my experience though; 4GB HBM seems more than enough for 4K. I almost thought I was crazy, but image quality is also better on these AMD cards for whatever reason. It is noticeable even on the desktop.
 
Glad you had a great experience. I had a similar experience when I switched to Fury X CrossFire from nothing but years of SLI. Switched to a setup that isn't even possible with Nvidia, just like you (49" IPS 4K non-PWM 10-bit FreeSync).

CrossFire has been amazing for me, and although I do need ClockBlocker, it's smoother all day than any SLI I've experienced. I couldn't even select 10-bit with Nvidia cards. Also got my color profile forced in all games.

[H] blasts 4GB HBM and says it's not enough in games, but doesn't even test VRAM numbers. Not my experience though; 4GB HBM seems more than enough for 4K. I almost thought I was crazy, but image quality is also better on these AMD cards for whatever reason. It is noticeable even on the desktop.

Which monitor are you using?
I haven't found any games yet from my collection that need clockblocker to keep the core speed high - which ones are you having slowdown with?
 
Which monitor are you using?
I haven't found any games yet from my collection that need clockblocker to keep the core speed high - which ones are you having slowdown with?
Seems that issue is hit and miss among users. A lot still have the issue, with an equal share not. Makes one curious as to why.
 
Which monitor are you using?
I haven't found any games yet from my collection that need clockblocker to keep the core speed high - which ones are you having slowdown with?

Wasabi mango uhd490. For me it seems to affect all games. So far batman arkham knight, metro 2033 redux, gta v, shadows of mordor, dying light, ryse son of rome, blade & soul, and black desert online
 
Wasabi mango uhd490. For me it seems to affect all games. So far batman arkham knight, metro 2033 redux, gta v, shadows of mordor, dying light, ryse son of rome, blade & soul, and black desert online


lol - I don't have any of those games to test.

I've played lately -
Age of Empires III release
Age of Mythology Enhanced edition rerelease
Warhammer - Vermintide
Rise of the Tomb Raider
Wolfenstein: The New Order
Battlefront
Need for Speed
Plants vs Zombies Garden Warfare
Evolve
Path of Exile
Assassin's Creed IV Black Flag
The Vanishing of Ethan Carter Redux.

I haven't noticed any issues except perhaps in Path of Exile - but I haven't played it enough to troubleshoot it yet. I did notice the FPS seemed a bit unsteady - but that game was that way with the AMD 285 card I had for a short time right before I upgraded to this Fury X... I asked in the chat and they just said Path of Exile works better on Nvidia cards. It's still completely playable - so I didn't think much of it. But FPS dipped from 60 to ~30 on and off --- so maybe that's what you are seeing? The other games have all worked without issue (well, except bad screen tearing in Wolfenstein: The New Order).
 
lol - I don't have any of those games to test.

I've played lately -
Age of Empires III release
Age of Mythology Enhanced edition rerelease
Warhammer - Vermintide
Rise of the Tomb Raider
Wolfenstein: The New Order
Battlefront
Need for Speed
Plants vs Zombies Garden Warfare
Evolve
Path of Exile
Assassin's Creed IV Black Flag
The Vanishing of Ethan Carter Redux.

I haven't noticed any issues except perhaps in Path of Exile - but I haven't played it enough to troubleshoot it yet. I did notice the FPS seemed a bit unsteady - but that game was that way with the AMD 285 card I had for a short time right before I upgraded to this Fury X... I asked in the chat and they just said Path of Exile works better on Nvidia cards. It's still completely playable - so I didn't think much of it. But FPS dipped from 60 to ~30 on and off --- so maybe that's what you are seeing? The other games have all worked without issue (well, except bad screen tearing in Wolfenstein: The New Order).

I do have Vanishing of Ethan Carter Redux and Wolfenstein also; it does it in those games too. Do you monitor your system statistics? I use MSI Afterburner, and the clock speeds will fluctuate anywhere from 300MHz to around 800-ish MHz.
 
I had to sign up to ask this.

Kyle Bennet, as editor-in-chief of [H]ard|OCP, what would be the right way to normalize and evaluate perf/watt efficiency between two given cards?

My own back-of-the-envelope calculations say that, at best, the R9 Fury can offer +37% performance for +14% full-system power (best case, minimum FPS).
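(Taking those two deltas at face value and as coming from the same test, the implied ratio is 1.37 / 1.14 ≈ 1.20, i.e. roughly 20% more performance per watt of full-system power in that best case.)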

The data from [H]'s own review comparing the R9 Fury and GTX 980 are below.

"Bennett" is my last name, but a lot of folks make your spelling error for some reason.

"how would be the right way to normalize and evaluate perf/watt efficiency, between two given cards" I would not know exactly what to tell you there since we very rarely focus on an actual perf/watt numbers in our reviews. That would equate to nothing besides a wattage/frame in a fully scientific view, and we all know that frame rates lie when it comes to actual gaming experiences.
 
I do have vanishibg of ethan carter redux and wolfenstein also, does it in that game too. Do you monitor your system statistics? I use msi afterburner and the clockspeeds will fluctuate anywhere from 300mhz to around 800ishmhz


With Wolfenstein: The New Order - turn off v-sync. The game is limited to 30 FPS if you have v-sync on. With v-sync off I got terrible tearing but no apparent frame loss.
I played some Vanishing of Ethan Carter Redux a couple weeks back but didn't notice framerate concerns.

I do have afterburner installed. I'll test Vanishing of Ethan Carter Redux tonight with Afterburner up and save a pic of the core speed graph.

I wonder if FreeSync has anything to do with your issue? I don't have a FreeSync monitor -- have you tried turning off FreeSync to see if the issue changes? My Fury X GPU only gets into the mid-60s C at full stress, like in FurMark, so I don't think it'd be a heat-related throttle. My card is an XFX Fury X.
 