Nvidia disagrees with you...
Not really...
https://developer.nvidia.com/content/introducing-nvidia-gameworks
Awesome wall of text quoted.
Did you click the link?
What wall? Congrats on the award btw.
I never implied it did not "work" on DX11... but it probably isn't pure DX11...
If it were DX11, why would Nvidia hide it?
Like I mentioned in a previous post: what if the GameWorks code makes non-DX11 calls? What if those calls only work on Nvidia GPUs?
If those non-DX11 calls are made on an AMD GPU, they return an error, and upon that error a generic DX11 call is made instead... all of which makes the AMD GPU lose time!
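To be clear, this is pure speculation about what GameWorks might do; nothing on Nvidia's page confirms it. But the pattern being described (try a vendor-specific fast path, catch the failure, fall back to a generic path) is easy to sketch, and it shows why a non-Nvidia card would pay a per-call penalty if it worked this way. Every function name here is invented for illustration:

```python
import time

def vendor_fast_path(gpu):
    # Hypothetical vendor-specific call: only succeeds on "nvidia" GPUs.
    if gpu != "nvidia":
        raise RuntimeError("unsupported GPU")
    return "vendor result"

def generic_dx11_path(gpu):
    # Hypothetical generic fallback that works on any DX11 GPU.
    return "generic result"

def render_effect(gpu):
    # The speculated pattern: attempt the vendor path, fall back on error.
    # On non-Nvidia hardware, every call pays for the failed attempt
    # before the generic path even begins.
    start = time.perf_counter()
    try:
        result = vendor_fast_path(gpu)
    except RuntimeError:
        result = generic_dx11_path(gpu)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

On "nvidia" this returns the vendor result directly; on anything else it eats the exception first, which is the claimed source of lost time.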
That being said, GW is not an API; it is a set of game-engine add-ons for features and effects. It therefore requires an API to display anything, so it needs DX11 rather than replacing it, and it cannot make calls the API will not understand. Again, the problem is the closed nature of GW.
Finally, something reasonable from an AMD fanboy. Yes, the "problem" is mostly the closed nature.
Look at the page and use common sense?
I know it is hard but try...
I linked my proof. Look at the features on that page...
My bad, I thought you already knew.
With all due respect... how on earth can you make that statement?
Have you actually seen the GameWorks code?
Even if you had proof of that, most folks would not even care. I can be reasonably assured that it would not change the way NVidia does things and why should it? They make money and people buy their stuff.
What ignorance? The fact that you don't have a real argument?
Sample Code
Graphics and compute samples for both OpenGL and DirectX developers, showing cutting-edge techniques for games:
OpenGL samples for Windows, Android, and other operating systems
DirectX samples for Windows
They provide both DX and OGL examples, which leads me to believe the features are designed to run on either of those two, so I'm not sure what proof you're pointing to either.
We've had some time now to do some preliminary overclocking with the AMD Radeon R9 Fury X. We have found that you can control the speed of the fan on the radiator with MSI Afterburner. Turning it up to 100% fan speed keeps the GPU much cooler when attempting to overclock, and isn't that loud.
For example, during our overclocking attempts the GPU was at 37c at full load, removing temperature as a factor holding back overclocking. We also found that you will not be able to overclock the HBM; it is at 500MHz and will stay at 500MHz. You will only be able to overclock the GPU. Currently, there is no way to unlock voltage modification on the GPU.
In our testing we found that the GPU hard locked in gaming at 1150MHz, with the fan at 100% and the GPU at 37c. Moving down to 1140MHz, we were able to play The Witcher 3 without crashing so far, again at 100% fan speed and 37c. So far, 1140MHz seems stable, but we have not tested other games nor run the overclock for a prolonged amount of time.
More testing needs to be done, but our preliminary testing seems to indicate 1130-1140MHz may be the maximum overclock. That is about a 70-80MHz overclock over the stock speed of 1050MHz, a rather small increase that doesn't translate into any noteworthy gameplay or performance improvement.
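As a quick sanity check on the scale of that headroom: straight arithmetic against the quoted 1050MHz stock clock gives 90MHz to the 1140MHz stable point (the article's 70-80MHz figure may be measured against observed boost clocks instead), and either way the gain is under 10%:

```python
# Rough arithmetic on the clocks quoted above; figures are from the review,
# the comparison baseline (1050MHz stock) is an assumption.
stock_mhz = 1050
stable_mhz = 1140          # highest clock that survived The Witcher 3

gain_mhz = stable_mhz - stock_mhz
gain_pct = 100 * gain_mhz / stock_mhz

print(f"{gain_mhz} MHz gain, about {gain_pct:.1f}% over stock")
```

A sub-10% core overclock rarely moves frame rates by more than a few percent, which matches the "not worth much" result reported below.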
We have at least learned that temperature is not the factor holding the overclock back, at 37c there was a lot of headroom with the liquid cooling system. There are other factors holding the overclock back, one of which may be voltage.
I have updated the Conclusion of the review with some preliminary overclocking testing.
http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11
so if we get voltage modification, seeing as the GPU is sitting @ 37 degrees, there is a lot of headroom with temperature, you'll be able to add some voltage! it could still be a crazy OCing card.
Don't forget about the VRMs' temperature. It's already approaching its limit.
As always, great job keeping us informed. So the performance gain in frame rate is null with the extra 70-80MHz?
Well, if voltage turns out to be a bust, it flies in the face of Joe Macri saying it was an overclocker's dream. I just wonder if the proximity of the RAM to the GPU has made them cap the overclocking potential, as HBM has been locked down.
Not null, just not worth much, but I have only tested 1 game
Have you had a chance to look at the actual clock in-game? Maybe it's throttling based on power draw rather than temperature.
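That's a testable hunch: if the clock dips below the set speed while the core is still sitting at 37c, temperature can't be the trigger, which points at the power limit. A minimal sketch of that logic, assuming you've captured (clock, temperature) samples in-game with a logging tool such as GPU-Z (the threshold values here are illustrative, not the card's real limits):

```python
def classify_throttle(samples, set_clock_mhz=1140, temp_limit_c=75):
    """Classify throttling from in-game (clock_mhz, temp_c) samples.

    Illustrative heuristic: clock drops at low temperature suggest a
    power limit; drops near the temperature limit suggest thermals.
    """
    drops = [(c, t) for c, t in samples if c < set_clock_mhz]
    if not drops:
        return "no throttling observed"
    if all(t < temp_limit_c for _, t in drops):
        # Clock fell while the core stayed cool: thermals are not the cause.
        return "likely power-limit throttling"
    return "possibly thermal throttling"
```

For the Fury X numbers in this thread, any clock dip at 37c would land squarely in the power-limit bucket.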
Good gravy...that proves that Fury X is an even bigger disappointment than just running it stock and getting the out-of-box experience. All that cooling capability and no way to exploit it.
Hope multi-GPU VRAM pooling will come off without a hitch once DX12 titles make an appearance.
I'm not really disappointed; the cooling ability is there. We just have to see how the card is affected later with added voltage (when possible).
I think the technology is so new, and fragile, that changing the HBM clocks could have who knows what consequences.
Maybe HBM2 will be overclockable.
Honestly though, memory bandwidth isn't holding the Fury X back. I'm fine with just GPU overclocking; we are engine limited, not bandwidth limited, with this card.
According to the FLIR shots we have seen with the shroud cover removed, the VRMs are already being pushed near their thermal limits for longevity at stock clock speeds. Despite the water cooling for the core, the VRMs have no effective passive or active cooling on them. Increasing the voltage on this card would probably cause damage.
At this point, multi-GPU VRAM pooling under DX12 and a price drop are going to be the Fury X's only saving graces.
If, and only if, a voltage unlock and HBM overclocking become available.