AMD not providing R9 Nano samples to several sites.

I'm not surprised, seeing as AMD put out a release saying they're not sending samples to The Tech Report anymore because of that crazy rant they published on the Fury X.

I would agree. I'm not saying who is right or wrong, but there's a lot of ammo to go around. Generally, this site is far better at being consistent about what it tests and why. Some might not like what is tested, but you can't argue that it isn't consistent. The same can't be said for the other sites. The Dirt Rally thing in particular really rubbed me the wrong way, because it was the first time I had seen any site go there with that.
 
IMO, AMD should just ride out the storm with Fury and do better next time, instead of making PR-control decisions that could have a longer-term effect than one bad generation of GPUs. Don't give the company an even worse rep by doing dumb stuff like selectively handing out review samples. That damages people's trust in the company itself, not just in a single bad GPU generation.

Whoever thought it would be a good idea to selectively give out review samples is incompetent, IMO. He or she is just making things worse for the company.
 
That's how I felt about it too.
The huge paper launch disclaimer felt rushed as well.
[H]ard is fairly good, but in this situation it's not so clear that things were handled as well as they could have been.
Don't get me wrong on price, though: this card should be $500 or $550.
 
Everyone can speculate and have an opinion. That is why forums exist. AMD pussyfooting around by not sending a card only shows a lack of confidence in their own product, and it's really pathetic behaviour.

It seems their management is run by complete morons who have no clue about social marketing and its importance in this day and age for selling high-end products. Look at the number of 980 Ti users vs. Fury users on this board alone and you will understand what I am talking about.
 
I am a little more concerned about Kyle's post looking more damning than it actually is.

It sounded like he hung up out of frustration when he heard the price, when in fact that wasn't the case. However, looking at that post, and that post alone, you can see how anyone else (especially AMD) would have taken it.

I am not in any way bashing Kyle's post or disposition; it's just that the wording of the post, as written, is open to misinterpretation by a third party.
 
So I'm wondering if maybe AMD felt like HardOCP had written their review before even receiving the card? Just going off of these posts from the 27th:

There's nothing wrong with having those opinions, but it seems like you guys had already made your minds up before getting a review sample; all that was really missing were the performance graphs.

Our conclusions in reviews are always based on the results from that review.

For video card reviews I have devised the structure and purposely mandated that the conclusion be the last page written. We go in order, starting with the intro, then game testing graphs and commentary, and ending with final thoughts in the conclusion after all the information has been assimilated. That way, the conclusion is based entirely on the results experienced in that review. The same review structure would be used for the Nano.
 
The huge paper launch disclaimer felt rushed as well.

In regard to timing, having all the facts helps: the briefing for the paper launch of the Nano was held at 4 PM Eastern on August 26th. I live in the Eastern time zone. For Kyle it was 3 PM.

The embargo date and time for the information to be published was Thursday, August 27 at 8 a.m. ET.

That is the amount of time we were given to put an article together and get the information across to you: 16 hours, overnight. I was up until early morning putting the information together, which was something we did not have to do, but I did it for you guys. I knew the Nano was going to be a hot topic and people wanted to know what it is. I put in the time with little sleep that morning to make it happen. That's how much I love you guys and that's how passionately I feel about the hardware we evaluate :cool:
 
Right, and it's great that you have a system for evaluating a video card. But I can see how AMD might look at the situation like this: they have X number of review samples. There are sites they can send the cards to that haven't taken a stance on the Nano yet and just posted the early preview information AMD provided, and then there's a site where the Editor-in-Chief and the video card managing editor have publicly stated that 1) there's no market for such a card, 2) the price is ridiculous, and 3) there's no value in a small, low-power card. You guys knew the performance was not going to be as high as the Fury X's, and regardless of whether you got a test unit or not, these were some pretty strong conclusions you already came to before getting the card. I'm not sure a sample would have changed the outcome of those early conclusions.
 
these were some pretty strong conclusions you already came to before getting the card

But they were the correct conclusions based on the information provided.

They are going to release a product that competes against their own products, AND the competitor has a superior card at the same price point, AND a possibly similar-performing card in the same form factor at half the cost (970 ITX; waiting on reviews).
 
But they were the correct conclusions based on the information provided.

They are going to release a product that competes against their own products, AND the competitor has a superior card at the same price point, AND a possibly similar-performing card in the same form factor at half the cost (970 ITX; waiting on reviews).

Dude, the Fury Nano and the 970 are not close in performance... I sold my 970 Mini because it sucks at 1440p.
 
Makes you wonder what they have to hide. I'd guess in [H]'s case they didn't like the idea of longer benchmark runs.
 
If you have a problem with today's latest and popular games using GameWorks, talk to the developers. It is their choice which features to implement in their own game.

The GameWorks argument is a tired debate which we have disproved on multiple occasions. Perhaps you missed our Witcher 3 article? This argument has no bearing on the games we choose.

To choose a game based on which vendor does or does not have technology implemented in it would be the very bias people accuse us of. To keep it fair, we do not base our game decisions on whether the game developer has chosen to use NVIDIA 3D effects or AMD 3D effects.

Popular by whose standards? Out of the whole game list, I know I only own 2 of them, and neither is installed on my computer anymore. The game Dying Light I have never even heard of. The 2 newest games I have purchased recently are ARK: Survival Evolved and Ashes of the Singularity, but all people on this forum have to say is that they are alpha games. Guess what, so was Witcher 3. Steam shows I played it for 15 minutes; that must be when it CTD'd and I uninstalled it.
I guess we can all cherry-pick whatever game we like to show the performance of a product. Personally, I have never had a problem buying games with GameWorks code, but after reliable sites proved that Witcher 3 had the tessellation turned up really high to hamper AMD cards, and after how NVIDIA attacked Stardock over their product problem, I have no problem not supporting any GameWorks game from here on out. Truly, that is the only way all these publishers will stop creating games that intentionally hamper one manufacturer's product. We as gamers don't want to have to go back to the old days where you needed multiple cards and plenty of time to get games to play.
 
Popular by whose standards? Out of the whole game list, I know I only own 2 of them, and neither is installed on my computer anymore. The game Dying Light I have never even heard of. The 2 newest games I have purchased recently are ARK: Survival Evolved and Ashes of the Singularity, but all people on this forum have to say is that they are alpha games. Guess what, so was Witcher 3. Steam shows I played it for 15 minutes; that must be when it CTD'd and I uninstalled it.
I guess we can all cherry-pick whatever game we like to show the performance of a product. Personally, I have never had a problem buying games with GameWorks code, but after reliable sites proved that Witcher 3 had the tessellation turned up really high to hamper AMD cards, and after how NVIDIA attacked Stardock over their product problem, I have no problem not supporting any GameWorks game from here on out. Truly, that is the only way all these publishers will stop creating games that intentionally hamper one manufacturer's product. We as gamers don't want to have to go back to the old days where you needed multiple cards and plenty of time to get games to play.

Certainly not yours, anyway. I don't think there are many people who uninstall a game after a single CTD.

And besides, without getting into the HairWorks argument, you could easily turn it OFF. It's not like it's forced on, and it's not as if HairWorks wasn't hurting NVIDIA cards just as badly.

The number of people playing would be a benchmark for 'popular' games.
 
Popular by whose standards?

This, over and over again. HardOCP has the most ridiculous justifications for including a given game in their test suite. Dying Light is by no means more popular than other well-coded, excellent games such as Dragon Age: Inquisition and Middle-earth: Shadow of Mordor. Those games look great, play even better, and are game-of-the-year material. Dying Light, on the contrary, is mindless zombie-killing trash with nothing really great in terms of graphics or gameplay.

HardOCP's test suite should be more balanced, with an equal number of NVIDIA GameWorks titles and AMD Gaming Evolved titles. Currently it's lopsided, with three GameWorks titles (Far Cry 4, Witcher 3, Dying Light), one neutral title (GTA V), and one AMD Gaming Evolved title (BF4).

I guess we can all cherry-pick whatever game we like to show the performance of a product. Personally, I have never had a problem buying games with GameWorks code, but after reliable sites proved that Witcher 3 had the tessellation turned up really high to hamper AMD cards, and after how NVIDIA attacked Stardock over their product problem, I have no problem not supporting any GameWorks game from here on out. Truly, that is the only way all these publishers will stop creating games that intentionally hamper one manufacturer's product. We as gamers don't want to have to go back to the old days where you needed multiple cards and plenty of time to get games to play.

Kudos for calling out NVIDIA's shenanigans. GameWorks is pathetic because it performs poorly not only on AMD GCN cards but also on NVIDIA Kepler cards. NVIDIA's Kepler cards get beaten uncharacteristically, by very wide margins, by comparable Maxwell cards in GameWorks titles (a GTX 970 beating a GTX 780 Ti by 25-30%). You even see cases where a GTX 960 beats a GTX 780, which is quite disgusting.

http://www.computerbase.de/2015-05/project-cars-guide-grafikkarte-prozessor-vergleich/2/
http://www.techspot.com/review/1000-project-cars-benchmarks/page2.htm

Now show me a non-GameWorks game where that happens. :rolleyes:

GameWorks is the worst practice to be seen from a GPU vendor in the 30-year-old GPU industry. It's intended to serve only NVIDIA's balance sheet and profits. NVIDIA in fact uses GameWorks to play down even its own last generation and push people toward buying the latest one.
 
If AMD reads this site like some people are claiming, then they should realize that [H]'s method of review actually allows the hardware to speak for itself. [H] provides all the data: the FPS graphs, the settings details. You don't even need to read the conclusion if you don't want to; you can just look at the data provided to get a clear picture of what kind of experience you'll get.

I don't know of any other site that provides such reviews. Often you'll get min/max/avg numbers that don't really give a clear picture; you rely on the reviewer telling you if it's playable, and you're left wondering whether the reviewer's idea of playable is the same as yours.

So anyway, if the card in question is good, it will show in [H]'s review. The data don't lie. Therefore AMD's marketing dept (or whoever it was that decided this) is really dumb to deny its upcoming card the chance to show what it can do in a proper review.

Popular by whose standards? Out of the whole game list, I know I only own 2 of them, and neither is installed on my computer anymore. The game Dying Light I have never even heard of. The 2 newest games I have purchased recently are ARK: Survival Evolved and Ashes of the Singularity, but all people on this forum have to say is that they are alpha games. Guess what, so was Witcher 3. Steam shows I played it for 15 minutes; that must be when it CTD'd and I uninstalled it.
I guess we can all cherry-pick whatever game we like to show the performance of a product. Personally, I have never had a problem buying games with GameWorks code, but after reliable sites proved that Witcher 3 had the tessellation turned up really high to hamper AMD cards, and after how NVIDIA attacked Stardock over their product problem, I have no problem not supporting any GameWorks game from here on out. Truly, that is the only way all these publishers will stop creating games that intentionally hamper one manufacturer's product. We as gamers don't want to have to go back to the old days where you needed multiple cards and plenty of time to get games to play.

Just because you've never heard of Dying Light doesn't mean it's an unpopular game. You are going off your own personal experience only, when in actual fact many people out there have played Dying Light. And Witcher 3 is an even more popular game, even if you couldn't run it.

GameWorks uses tessellation because of what it can do visually. You say that like tessellation was created to make AMD look bad, when in actual fact it was around long before GameWorks. Besides, [H] has already proved conclusively that GameWorks doesn't hamper AMD's performance. At this point, if you wish to keep thumping on this issue, you'll have to provide evidence to back your claim up.
 
OK, at 6:21 in the morning, Witcher 3 has 3,533 people in game and ARK has 25,746 people in game. Guess we know which one is more popular.

Edit: ARK is supposed to be the alpha game, and I have yet to have one CTD in that game. Witcher 3 was supposed to be a finished product that cost $60, and it CTD'd the first time I played it; I paid $30 for ARK. I will give Witcher 3 a chance again later, after they patch all of their problems. But when I normally only get about 10 hours a week to play video games, I really don't want to waste my time on a buggy game.
 
OK, at 6:21 in the morning, Witcher 3 has 3,533 people in game and ARK has 25,746 people in game. Guess we know which one is more popular.

Edit: ARK is supposed to be the alpha game, and I have yet to have one CTD in that game. Witcher 3 was supposed to be a finished product that cost $60, and it CTD'd the first time I played it; I paid $30 for ARK. I will give Witcher 3 a chance again later, after they patch all of their problems. But when I normally only get about 10 hours a week to play video games, I really don't want to waste my time on a buggy game.

Well said. GameWorks titles have been a huge mess: buggy, unstable, and just lacking rigorous QA. Some of them have been outright abysmal (AC Unity) while others are just bad (Watch Dogs, Far Cry 4, Dying Light). It takes multiple patches to get these games even stable and performing well. CF support takes months (sometimes 4-5 months, as in the case of Far Cry 4) to be enabled.

http://www.hardocp.com/article/2014..._performance_video_card_review/7#.Vel5xPmqqko

http://www.hardocp.com/article/2014...formance_image_quality_review/15#.Vel5X_mqqko

http://www.hardocp.com/article/2014/11/21/far_cry_4_video_card_performance_iq_preview/7#.Vel5j_mqqko

Contrast that with AMD Gaming Evolved titles like Alien: Isolation, Dragon Age: Inquisition, and Civilization: Beyond Earth, which are stable at launch, perform well in both single- and multi-GPU setups (on both SLI and CF), and are a good example of how AAA titles should be launched.

http://www.hardocp.com/article/2014/11/06/alien_isolation_video_card_performance_review/8
 
Different demographics are always going to vary and exhibit different behaviors. So the question would be: what should the target be?

If we go by the gaming forums section on here, there seems to be quite a bit more interest (to put it mildly) in the games represented in HardOCP's historical test suites (in terms of both pre-release and post-release discussion) than in other examples such as ARK or Ashes. Would HardOCP be representing the broader PC gaming community if they were to instead test Dota 2, LoL, WoW, etc. (which dwarf games like ARK in popularity)? Sure, but is that the demographic HardOCP is catering to?

Interestingly, regarding Ashes: at least in the forum communities I lurk in, there was basically zero to negligible discussion of the game until the DX12 benchmarks surfaced. Even then, the only discussion is about performance, and nothing about the game's content otherwise.
 
AMD would never have priced the Nano at $450. It makes sense from a performance standpoint, but it doesn't make sense for the rest of their lineup.

During the collapse of their pricing structure post-GTX 970, AMD cards were essentially sold at a loss or at no profit. They made nothing, and it showed when they failed to update their gaming bundles to include more modern games. The collapse of their profit margins to sub-30 percent, and their $100-million-plus losses while being run so lean, showed this too.

The 300 series rebranding was their attempt to salvage Hawaii and the pricing of the rest of their lineup. It was dishonest, but this being AMD, people would forgive them; they are kind of a joke right now, and people understand their lack of resources.

What people need to know is that the 300 series is the volume market where AMD has the potential to make money. The Fury X series was a stupid vanity maneuver to improve the AMD brand by trying to capture the halo. What made it a stupid move on AMD's part is that they don't have the brand power to sell a halo card unless it is undoubtedly the fastest on the market, and they cut out double precision, which makes it a tough sell for the professional market, particularly given AMD's lack of software and driver support on the professional side. The tragic thing is, even if the Fury X had captured the crown, it still would not have made AMD much money: judging by the low volume AMD is producing, the chip is difficult to manufacture, and it simply can't generate the volume needed to recover its R&D, considering it was released so late. The Fury series is simply a Spruce Goose: a product released too little, too late to do AMD much good.

Ultimately, if AMD had priced the Fury X too low, and consequently the Nano too low, the branding deception of the 300 series would have gone to waste. That is, if the Nano were priced at $450 or even $500, the 390X would have to be priced at $350 to make sense, and they would be stuck back in the situation of selling cards at a loss.

Back to the Nano: it is AMD's attempt to claim some sort of crown for themselves. The problem is that this crown is over a market that barely exists, which is probably why AMD will be allowed to keep it. I have a feeling the GTX 980 Ti or Titan X could be similarly underclocked to beat the Nano's performance per watt. But considering the size of the market, NVIDIA is likely to let AMD have it, since 99% of people will pick the GTX 970 anyway, particularly because of AMD's lack of HDMI 2.0 support.
 
RE: GameWorks, image quality is a pet peeve of mine. I want it to grow and evolve in games. I've been playing games since text graphics on local BBSes in the '80s. I've seen it evolve over the decades. The game developer chooses what technology to implement, whether that be effects that AMD has shown to the developer, effects NVIDIA has shown to the developer, or others. GameWorks is middleware, middleware the developer can opt to use or not. If new games use GameWorks features, that is just how it is; if you dislike this, take it up with them. I do not control what they do. I can only take the game, as is, and evaluate it, as is.

If you really dislike the added graphics GameWorks can bring to a game, you have the freedom to turn them OFF. It should be noted they are DX11 DirectCompute features to begin with, and run on both AMD and NVIDIA hardware. This is not a PhysX situation. Both can run the effects. But if you really don't want to see them, or you are having a performance issue, turn it off. You have this option.

Those who argue against GameWorks are actually arguing for the removal or downgrading of image quality in games. GameWorks features like HBAO+ and Tessellated Godrays have made improvements to image quality in games. Do you really want to lose these features? Do you really want worse image quality in games?

AMD has its own ambient occlusion method, its own hair method, and other methods, but again, it is up to the developer to use them or not, not us. If game devs are mainly using GameWorks features, and not AMD's features, perhaps there is a reason for that, or it means something. Perhaps AMD needs to evangelize its features a little harder? NVIDIA is in it to win it. AMD needs the same approach to convince developers to use its features, and it needs to make them easy to implement.

Maybe you guys can even help with this: tweet your favorite developer and let them know what YOU want to see in future games.

Personally, I want to see visual quality in games evolve and keep getting better. I do not want it to stagnate. Oftentimes it is easier for the developer to use pre-made solutions offered by AMD and NVIDIA. That is OK. That is their prerogative. Our job is to evaluate that, as is.

We have proven multiple times recently that GameWorks features are not the crutch holding back AMD hardware versus NVIDIA hardware. Unfairness is shouted at us, but our results show time and time again that this bias between features does not exist in the real world. HairWorks? The exact same performance degradation on NVIDIA and AMD in Witcher 3. /shrug

Instead of basing your opinion on what he said or she said, look at the real-world results and the actual final gameplay experience and performance. You may find things are not as slanted as you think.
 
Just since this is a Nano thread: when are cards supposed to be reviewed and the NDA lifted?
 
We have proven multiple times recently that GameWorks features are not the crutch holding back AMD hardware versus NVIDIA hardware. Unfairness is shouted at us, but our results show time and time again that this bias between features does not exist in the real world. HairWorks? The exact same performance degradation on NVIDIA and AMD in Witcher 3. /shrug

Instead of basing your opinion on what he said or she said, look at the real-world results and the actual final gameplay experience and performance. You may find things are not as slanted as you think. Now, I really don't want to have to say anything more about GameWorks, but sadly, I know this will not be the end of it.

Come on, man. At least be honest and don't deny the facts. What is this review showing, then? :rolleyes:

http://www.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review/5#.VemQfPmqqko

"Now we have disabled GameWorks features altogether in this game below. We have turned off HBAO and are just using SSAO. We have turned off the NVIDIA Depth of Field option as well.

Performance jumped up on the ASUS STRIX R9 Fury. The Fury is now 31% faster than the GeForce GTX 980. The setting holding back performance seems to be the NVIDIA Depth of Field in this game. The GTX 980 can render it much better, the Fury not so much."
 
We are only trying to give you an open-minded point of view, Brent, and if you don't want to listen then so be it, because it seems your mind is set in stone on what you're going to test with and how you're going to do it. Also, the reason I pointed out World of Tanks is because it is popular: Wargaming runs four worldwide servers, and NA alone can easily have 30,000 users in one night. These are people that buy hardware, and most really don't play Far Cry 4.
 
We have proven multiple times recently that GameWorks features are not the crutch holding back AMD hardware versus NVIDIA hardware. Unfairness is shouted at us, but our results show time and time again that this bias between features does not exist in the real world. HairWorks? The exact same performance degradation on NVIDIA and AMD in Witcher 3. /shrug


You tested HairWorks in Witcher 3 AFTER THE DEVS PATCHED IT: the default tessellation factor was lowered from its original 64 down to 16. People have shown, and sites have even covered, that a factor of 16 has no visual quality loss compared to 64 but runs heaps faster. They even put in a slider for users to select, due to overwhelming feedback that it originally ran like a turd.
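
(Rough back-of-envelope, assuming the generated hair geometry scales roughly linearly with the tessellation factor: 64 / 16 = 4, so the patch cut the tessellated hair workload to about a quarter. 64 is also the DX11 hardware maximum for a tessellation factor, so the original setting was simply maxed out.)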

Why didn't you test HairWorks on vs. off at the start, before all the patches, when you ran Witcher 3 with that feature on?

Again, in your recent review: Dying Light and NV's HBAO, a 50% performance loss for AMD GPUs. The Fury went from being slower than the 980 to heaps faster. There's proof in your own data that GameWorks adversely affects AMD, yet you are here saying it doesn't. Why are you spinning so hard in favor of a black-box feature? If you love your image quality, rally to get NV to make it open source so all developers are able to access it and optimize it better. Then you get better image quality AND improved performance. AMD will also have an easier time optimizing NV-sponsored games. All gamers win.

Are you for gamers or for NV?
 
It seems their management is run by complete morons who have no clue about social marketing and its importance in this day and age for selling high-end products.

Ohh I disagree there. They seem to be pretty good at getting out all over teh internets and conjuring up fake controversies about their competition. If they put half the effort into engineering that they spend on manufacturing outrage, they'd actually be selling some cards.

At this point it'd probably be best for the gaming community to just recognize that AMD is dead and move on.
 
^ nope

My whole take on this individual-API thing is: how about coding to the DX standard instead? That way everyone will be forced to meet the same render standards.

/issue
 
You tested HairWorks in Witcher 3 AFTER THE DEVS PATCHED IT: the default tessellation factor was lowered from its original 64 down to 16. People have shown, and sites have even covered, that a factor of 16 has no visual quality loss compared to 64 but runs heaps faster. They even put in a slider for users to select, due to overwhelming feedback that it originally ran like a turd.

That has nothing to do with GameWorks.

That was a setting made by the game developers, which could be changed, and was, through patches. I see that as a game issue, not a GameWorks issue; it is totally controlled by the game developers.

Why didn't you test HairWorks on vs. off at the start, before all the patches, when you ran Witcher 3 with that feature on?

When Witcher 3 came out we had many things going on at once, including me moving to a different state. While we could not get to an immediate gameplay performance article, we did start using the game and showing you Witcher 3 performance in video card reviews as soon as we could. Usually, you will find we start using games inside regular card reviews before we do our full gameplay evaluation. So look to those reviews to see the results.

Again, in your recent review: Dying Light and NV's HBAO, a 50% performance loss for AMD GPUs. The Fury went from being slower than the 980 to heaps faster. There's proof in your own data that GameWorks adversely affects AMD, yet you are here saying it doesn't. Why are you spinning so hard in favor of a black-box feature?

You are bypassing the apples-to-apples graphs we have done sans GameWorks features, which still show a large disparity in performance between certain AMD cards and NV cards at similar prices.

If you love your image quality, rally to get NV to make it open source so all developers are able to access it and optimize it better. Then you get better image quality AND improved performance. AMD will also have an easier time optimizing NV-sponsored games. All gamers win.

Are you for gamers or for NV?

These features are up to the game developer to integrate and use. If you are unhappy with the path the game developer has taken with their game, let them know how you feel about that. Ultimately they choose the technologies they want to use.

Again, this is where AMD needs to step up and show no mercy, seriously. NVIDIA is aggressive, very aggressive. AMD is going to have to fight back. We already see GameWorks devouring most new games. This will only continue unless AMD fights back with something better for developers and convinces them to use its stuff. I'm all for competition. In our Witcher 3 article we have shown, using the latest patch, that this game is a great game to use for competition right now; it is all in the review, and I recommend checking it out.
 
Lmao, don't cry about not getting a review card if you talked smack about it being a paper launch.

AMD is adopting NVIDIA tactics and denying review cards to websites it perceives as detrimental to its marketing campaign.
 
Brent

What is your position on NVIDIA's 4GB frame buffer being in reality 3.5GB on a certain card?
 
Brent

What is your position on NVIDIA's 4GB frame buffer being in reality 3.5GB on a certain card?

Not taking the bait. Off topic to this thread. You can search our forum and reviews where we have covered that.

Let's get back on topic now.
 
Smack, or the truth?

Was hardware released at the launch in June, or August 27th?

Or was it a PDF?

I don't think it matters to AMD on this. AMD is going to cherry-pick its review sites like it cherry-picks benchmarks and games to show off its latest graphics cards.
 
Not taking the bait. Off topic to this thread. You can search our forum and reviews where we have covered that.

Let's get back on topic now.

That's because you KNOW that is some shady stuff, and if you ran NVIDIA through the wringer like you are AMD, you would not be getting a review sample :eek:
 
If you search our reviews, you will find where we covered it.

In any review with a GTX 970 as the card being reviewed, or used for comparison, the issue is discussed and tested; the same goes for some game reviews.

Now let's get back on topic.
 
^ nope

My whole take on this individual-API thing is: how about coding to the DX standard instead? That way everyone will be forced to meet the same render standards.

/issue
"Individual API?" What are you talking about? Every current game except those running idTech are using DirectX.

Again, in your recent review: Dying Light and NV's HBAO, a 50% performance loss for AMD GPUs. The Fury went from being slower than the 980 to heaps faster. There's proof in your own data that GameWorks adversely affects AMD, yet you are here saying it doesn't. Why are you spinning so hard in favor of a black-box feature? If you love your image quality, rally to get NV to make it open source so all developers are able to access it and optimize it better. Then you get better image quality AND improved performance. AMD will also have an easier time optimizing NV-sponsored games. All gamers win.

Are you for gamers or for NV?
In the original Dying Light performance review from [H], the 290X's performance dropped only 8% when HBAO+ was enabled, compared to 5% on the GTX 980. Looking at the total performance loss going from no AO at all to AO plus HBAO+ enabled, the GTX 980 lost more than the 290X: 19% versus 16%, respectively. In the Fury X 4K review, framerate dropped 26% and the Fury X was still slower than the 980 Ti even with all GameWorks features disabled. In the Fury review, framerate dropped only 9% and the Fury was faster than the GTX 980 in both cases.

Where are you getting the 50% figure from?
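
(For anyone checking the math: a percentage drop here is just (FPS_before - FPS_after) / FPS_before. As a made-up example, going from 50 FPS with AO off to 46 FPS with HBAO+ on is (50 - 46) / 50 = 8%. The figures above come from applying that to the FPS numbers in those reviews.)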
 
That's because you KNOW that is some shady stuff, and if you ran NVIDIA through the wringer like you are AMD, you would not be getting a review sample :eek:

The importance of the 970 "issue" has already been settled by the market. It's the #1 dGPU used on Steam. That shows how little weight people place on the "issue". Time to move on.
 