AMD Radeon R9 Fury X Video Card Review @ [H]

Did you click the link?


What wall? Congrats on the award btw.

Yes, we clicked the link and saw the generic GameWorks page. Would you be so inclined as to quote your proof that backs up your statement?

Have no idea what award you are talking about. I assume you think you are being cute? Please correct me if I am wrong.
 
*Sigh* My only concern, based on what I am seeing on this and various other sites, is whether AMD will sell enough of them to make up for the research and development they put into them. Also, if they do not make money, how are they going to make Zen into a good architecture? After all, even if it hits it out of the park, it would still take a couple of years to turn a significant profit.

I am an AMD fan but I worry for them with the obvious blunders they made with this release.
 
I never implied it did not "work" ON DX11... but it probably ain't DX11... :rolleyes:

If it was DX11, why would nVIDIA HIDE it?

Like I mentioned in a previous post:

What if non-DX11 calls are made in the GameWorks code? What if those calls only work on nVIDIA GPUs?
If those non-DX11 calls are made with an AMD GPU, they return an error, and upon an error a generic DX11 call is made... all this making the AMD GPU lose time!

:rolleyes:
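For what it's worth, the behavior hypothesized above — try a vendor-specific call first, then fall back to a generic one when it errors — can be sketched in a few lines. This is purely an illustration of the speculation in the post, not actual GameWorks code; every name here is invented.

```python
def render_effect(gpu_vendor):
    """Sketch of the hypothesized vendor-call-with-fallback pattern.

    All names are invented for illustration; this is NOT real
    GameWorks code, just the behavior the post above speculates about.
    """
    def vendor_specific_call():
        # Hypothetically succeeds only on one vendor's hardware.
        if gpu_vendor != "nvidia":
            raise RuntimeError("unsupported call")
        return "vendor fast path"

    def generic_dx11_call():
        return "generic DX11 path"

    try:
        return vendor_specific_call()
    except RuntimeError:
        # The failed attempt costs time before the generic path runs,
        # which is the slowdown being alleged above.
        return generic_dx11_call()
```

Both vendors end up rendering; the alleged difference is only the wasted round-trip before the fallback.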

Even if you had proof of that, most folks would not even care. I can be reasonably sure that it would not change the way NVIDIA does things, and why should it? They make money and people buy their stuff.
 
I never implied it did not "work" ON DX11... but it probably ain't DX11... :rolleyes:

If it was DX11, why would nVIDIA HIDE it?

Like I mentioned in a previous post:

What if non-DX11 calls are made in the GameWorks code? What if those calls only work on nVIDIA GPUs?
If those non-DX11 calls are made with an AMD GPU, they return an error, and upon an error a generic DX11 call is made... all this making the AMD GPU lose time!

:rolleyes:

So did you just make all that stuff up? GameWorks is just high-tessellation routines in DirectX 11. If it didn't work, then you wouldn't see the same results on AMD cards.

I want AMD to succeed as much as anyone. However, if they suck at business, then they need to remove themselves from the economic gene pool. They're not doing us any favors by bungling their releases over and over. 64 ROPs and no HDMI 2.0? Apparently their R&D budget was used to double the Tonga die and that was it. They need to sell their business to another company with executives who know what they're doing.
 
First, I want to point out that I think GW should die as long as it stays closed source. That being said, GW is not an API; it is a game engine additive for features and effects, therefore requiring an API to display, hence needing DX11, not replacing it. It cannot make calls the API will not understand. Again, the problem is the closed nature of GW.
 
I never implied it did not "work" ON DX11... but it probably ain't DX11... :rolleyes:

If it was DX11, why would nVIDIA HIDE it?

Like I mentioned in a previous post:

What if non-DX11 calls are made in the GameWorks code? What if those calls only work on nVIDIA GPUs?
If those non-DX11 calls are made with an AMD GPU, they return an error, and upon an error a generic DX11 call is made... all this making the AMD GPU lose time!

:rolleyes:

Do you even know what GameWorks is? It is just compiled .DLL libraries. What's a library? Just a collection of implemented behavior. Libraries are not application-specific; they are designed to be called by one or many programs to simplify development. Instead of implementing a GPU feature five times in five different games, you can just point the same five titles at one library. Game engines like Unreal Engine 3 are typically capable of integrating with third-party libraries to ensure maximum compatibility and flexibility. In fact, any Unreal Engine game runs the PhysX libraries, and most people don't even know they are running NVIDIA-specific features through the CPU; and not only PhysX effects but other physics effects as well.
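The point about one library serving many titles can be shown with a toy sketch: several "games" call into the same shared module instead of each re-implementing the feature. All names here are invented for illustration.

```python
# Toy stand-in for a shared effects library: one implementation of a
# feature, reusable by any number of games, instead of each game
# re-implementing it. Names are invented for illustration.

def apply_ambient_occlusion(scene):
    """Mark a scene description as having ambient occlusion applied."""
    return {**scene, "ambient_occlusion": True}

# Two different "games" calling the same library routine:
game_a = apply_ambient_occlusion({"title": "Game A"})
game_b = apply_ambient_occlusion({"title": "Game B"})
```

Each title gets the effect from the one shared routine, which is the whole economic point of shipping features as a library.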

Nvidia's GameWorks contains libraries that tell the GPU how to render shadows, implement ambient occlusion, or illuminate objects. In GameWorks those libraries are effectively "black boxes": Nvidia has clarified that developers can see the code under certain licensing restrictions, but they cannot share that code with AMD. Yes, they are protected files; so what? They are just instructions, and they are hardware-agnostic: they work even on smartphones. And yes, developers can see the whole code.

If those non-DX11 calls are made with an AMD GPU, they return an error, and upon an error a generic DX11 call is made... all this making the AMD GPU lose time!

This just deserves a perma-ban for blatant ignorance... LoL
 
That being said, GW is not an API; it is a game engine additive for features and effects, therefore requiring an API to display, hence needing DX11, not replacing it. It cannot make calls the API will not understand. Again, the problem is the closed nature of GW.

Finally, something reasonable from an AMD fanboy. Yes, the "problem" is mostly the closed nature...
 
Finally, something reasonable from an AMD fanboy. Yes, the "problem" is mostly the closed nature...

I am always reasonable. I have never skewed facts to make my point, nor do I consider myself a fanboy. I just like to debate facts and get others' reasonable opinions.
 
Look at the page and use common sense?
I know it is hard, but try...


I linked my proof. Look at the features on that page...
My bad, I thought you already knew.

Your ignorance is showing. Please take your GameWorks discussion to another thread.
 
much enjoyment this thread gives

some folks just can't handle the truth. If the Fury X was priced like the 980 it would be a blast; alas, it's not.
 
Even if you had proof of that, most folks would not even care. I can be reasonably assured that it would not change the way NVidia does things and why should it? They make money and people buy their stuff.

Tech sites reviewing games and hardware should care...

I'm not saying they should boycott those "suspicious" games... but they should mention the "problem" and investigate it...
 
What ignorance? The fact that you don't have a real argument?

Sample Code

Graphics and compute samples for both OpenGL and DirectX developers, showing cutting-edge techniques for games:

OpenGL samples for Windows, Android, and other operating systems
DirectX samples for Windows

They provide both DX and OGL examples, which leads me to believe the features are designed to run on either of those two, so I'm not sure what proof you're pointing to either.

:)
 
They provide both DX and OGL examples, which leads me to believe the features are designed to run on either of those two, so I'm not sure what proof you're pointing to either.

:)

We weren't discussing the DX or OGL support... but unfortunately I can't say any more.

Never understood why people only read the conclusion; seems like a cop-out for the TL;DR crowd.
 
This thread will get back on topic now. Continue posting about the business associated with GameWorks and such and you will be banned. Please go make your own thread about it and discuss it there.
 
With 96 ROPs and 8GB of RAM this would probably sell extremely well. However, as it stands now, it will be a tougher sell for most. It is sad, too, that because they did not do the Bulldozer architecture correctly out of the gate, we end up with the GPUs that AMD is selling us now. (I am thinking in terms of the money lost causing problems with today's tech.)
 
I have updated the Conclusion of the review with some preliminary overclocking testing.

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11

We've had some time now to do some preliminary overclocking with the AMD Radeon R9 Fury X. We have found that you can control the speed of the fan on the radiator with MSI Afterburner. Turning it up to 100% fan speed keeps the GPU much cooler when attempting to overclock, and isn't that loud.

For example, during our overclocking attempts the GPU was at 37C at full load, removing temperature as a factor holding back overclocking. We also found out that you will not be able to overclock the HBM; it is at 500MHz and will stay at 500MHz. You will only be able to overclock the GPU. Currently, there is no way to unlock voltage modification of the GPU.

In our testing we found that the GPU hard-locked in gaming at 1150MHz with 100% fan speed at 37C. Moving down to 1140MHz, we were able to play The Witcher 3 without crashing so far. This is with the fan at 100% and a 37C GPU. So far, 1140MHz seems to be stable, but we have not tested other games nor tested the overclock for a prolonged amount of time.

More testing needs to be done, but our preliminary testing seems to indicate 1130-1140MHz may be the ceiling. That is about an 80-90MHz overclock over the stock speed of 1050MHz, a rather small increase that doesn't really amount to any noteworthy gameplay experience or performance improvements.

We have at least learned that temperature is not the factor holding the overclock back; at 37C there was a lot of headroom with the liquid cooling system. There are other factors holding the overclock back, one of which may be voltage.
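For perspective, taking the 1140MHz figure quoted above against the 1050MHz stock clock, the headroom works out to under 10 percent, which matches the observation that the gains don't show up in gameplay:

```python
stock_mhz = 1050       # Fury X stock core clock from the review
stable_oc_mhz = 1140   # highest clock that survived The Witcher 3 so far

headroom_mhz = stable_oc_mhz - stock_mhz
headroom_pct = 100 * headroom_mhz / stock_mhz

print(f"{headroom_mhz} MHz over stock, about {headroom_pct:.1f}% headroom")
# prints: 90 MHz over stock, about 8.6% headroom
```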
 
Nice one, cheers for the continued efforts to get clarity on this card.
 
I have updated the Conclusion of the review with some preliminary overclocking testing.

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11

As always, great job keeping us informed. So, the performance gain in frame rate is nil with the extra 70-80MHz? Another question: do you have any way to measure the card's temperature (an IR thermometer, for example)? As the VRMs are cooled with a heatpipe attached to the water block, I guess it should also help with VRM temps. With those temps, the overclock headroom is just great. I'm curious to see how the card will be affected by adding voltage and how the TDP behaves; if it's different than Maxwell, then I can see buying one to test myself once the voltage lock disappears.
 
So if we get voltage modification, seeing as the GPU is sitting at 37 degrees, there is a lot of headroom with temperature, so you'll be able to add some voltage! It could still be a crazy OCing card.
Don't forget about the VRMs' temperature ;). They are already approaching their limit.
 
So if we get voltage modification, seeing as the GPU is sitting at 37 degrees, there is a lot of headroom with temperature, so you'll be able to add some voltage! It could still be a crazy OCing card.

The only things that would worry me are the VRMs etc... things I cannot control, or things I don't know about that are heating up way hotter than they should.
 
Well, if voltage turns out to be a bust, it flies in the face of Joe Macri saying it was an overclocker's dream. I just wonder if the proximity of the RAM to the GPU has made them cap the overclocking potential, as HBM has been locked down.

I think the technology is so new, and fragile, that changing the HBM clocks could have who knows what consequences.

Maybe HBM2 will be overclockable.

Honestly though, memory bandwidth isn't holding the Fury X back; I'm fine with just GPU overclocking. We are engine-limited, not bandwidth-limited, with this card.
 
Not null, just not worth much, but I have only tested one game.

Damn, you are faster than my ninja edit skills... ;) I think core clock will be much more beneficial for this card than a memory overclock, as the bandwidth is already high...
 
I have updated the Conclusion of the review with some preliminary overclocking testing.

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11

Good gravy...that proves that Fury X is an even bigger disappointment than just running it stock and getting the out-of-box experience. All that cooling capability and no way to exploit it. :(

Hope multi-GPU VRAM pooling will come off without a hitch once DX12 titles make an appearance. At this point, that and a price drop are going to be the Fury X's only saving graces.
 
Good gravy...that proves that Fury X is an even bigger disappointment than just running it stock and getting the out-of-box experience. All that cooling capability and no way to exploit it. :(

Hope multi-GPU VRAM pooling will come off without a hitch once DX12 titles make an appearance.

I'm not really disappointed; the cooling ability is there. We just have to see how the card is affected later with added voltage (when possible... :()
 
I don't even know what the voltage is running at right now; everything I've tried can't read it. Cooling, though, is not holding it back at all. AMD quoted 500W TDP out of this cooling device, so there is a lot of headroom there. The temp of the GPU won't be holding the GPU back, but the temp of other components on the PCB could be.....
 
I'm not really disappointed; the cooling ability is there. We just have to see how the card is affected later with added voltage (when possible... :()

If, and only if, a voltage unlock and HBM overclocking become available.
 
I think the technology is so new, and fragile, that changing the HBM clocks could have who knows what consequences.

Maybe HBM2 will be overclockable.

Honestly though, memory bandwidth isn't holding the Fury X back; I'm fine with just GPU overclocking. We are engine-limited, not bandwidth-limited, with this card.

I remember with my HD 4870 and HD 4890's GDDR5, AMD couldn't downclock the memory when the card was in low-power/2D mode because that caused the screen to flicker. They never resolved that problem until the HD 5000 series, which caused the HD 4870/90 to idle at ~50 degrees C. But their performance/price was wonderful enough for me to overlook that... Not anymore with the Fury X.
 
Good gravy...that proves that Fury X is an even bigger disappointment than just running it stock and getting the out-of-box experience. All that cooling capability and no way to exploit it. :(

Hope multi-GPU VRAM pooling will come off without a hitch once DX12 titles make an appearance. At this point, that and a price drop are going to be the Fury X's only saving graces.
According to the FLIR shots we have seen with the shroud cover removed, the VRMs are already being pushed near their thermal limits for longevity at stock clock speeds. Despite the water cooling for the core, the VRMs have no effective passive or active cooling on them. Increasing the voltage on this card would probably cause damage.
 