No, we have a specific bug for titles that use the newer "DX9.0c" profile. We have different DLLs (hence effectively different drivers) for DX9 and DX10/11. Forcing the title into DX10 mode makes the game use the DX10/11 driver, so it would fix the issue (even if the title didn't do...
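To make the "different DLLs" point concrete, here's a rough C sketch of the idea - the DLL names are invented for illustration, the real driver binaries are named differently. The runtime loads a separate user-mode driver per API, so forcing DX10 swaps the whole code path:

[code]
#include <windows.h>
#include <stdio.h>

/* Illustrative only: the D3D runtime loads a different user-mode
 * driver DLL depending on which API the title creates its device
 * with. DLL names below are made up for this example.            */
int main(void)
{
    int use_dx10_path = 1;  /* e.g. the title forced into DX10 mode */

    const char *umd = use_dx10_path
        ? "vendor_umd_dx10.dll"   /* DX10/11 user-mode driver */
        : "vendor_umd_dx9.dll";   /* DX9 user-mode driver     */

    HMODULE h = LoadLibraryA(umd);
    if (!h) {
        fprintf(stderr, "failed to load %s\n", umd);
        return 1;
    }
    printf("loaded %s - an entirely different driver code path\n", umd);
    FreeLibrary(h);
    return 0;
}
[/code]

A bug that lives in one of those binaries simply doesn't exist in the other, which is why switching API mode can "fix" a title.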
The 3 free games plus 20% off MoH is 7900 series specific. See the AMD blog post - there is a big table highlighting the matrix of which deals are applicable to which cards. Newegg are only running the 7900 series deal at the moment, and it's not greyed out - as it is a hyperlink it will appear in...
The GHz Edition is not "an overclocked 7970"; it is a full SKU with its own product definition, product test flow, and full qualification. CPU speed grades are not considered "OC versions" of one another.
Shogun 2 already shipped with MLAA in-game. Other titles will as well.
Note: the FEAR 3 improvements come from improving the performance of the FXAA shader.
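For anyone curious what "the FXAA shader" actually spends its time on, here's a minimal C sketch of FXAA's early-exit contrast test, assuming the commonly published luma weights and thresholds (the real shader runs per-pixel in HLSL, this is just the shape of it):

[code]
#include <math.h>
#include <stdio.h>

typedef struct { float r, g, b; } rgb_t;

static float luma(rgb_t c)   /* perceptual luminance approximation */
{
    return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
}

/* Return 1 if the pixel needs the (expensive) edge blend pass.
 * Low local contrast means FXAA skips the filter entirely - this
 * early exit is exactly where shader-performance tuning pays off. */
int fxaa_needs_aa(rgb_t center, rgb_t n, rgb_t s, rgb_t e, rgb_t w)
{
    float lc   = luma(center);
    float lmin = fminf(lc, fminf(fminf(luma(n), luma(s)),
                                 fminf(luma(e), luma(w))));
    float lmax = fmaxf(lc, fmaxf(fmaxf(luma(n), luma(s)),
                                 fmaxf(luma(e), luma(w))));
    return (lmax - lmin) > fmaxf(0.0312f, lmax * 0.125f);
}

int main(void)
{
    rgb_t white = {1, 1, 1}, black = {0, 0, 0};
    printf("flat area : %d\n", fxaa_needs_aa(white, white, white, white, white));
    printf("hard edge : %d\n", fxaa_needs_aa(white, black, white, black, white));
    return 0;
}
[/code]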
There are multiple optimisations that take place - generic driver optimisations, API-specific optimisations, architecture-specific optimisations, GPU configuration optimisations, etc., etc. The boards listed here are just the improvements for the boards that we have measured for this specific set...
This driver is Catalyst 10.1 + Mass Effect + Grey screen Hotfixes.
This does not include Crossfire Eyefinity yet, because that has not made it into a mainline driver, and the Cat. 9.12 Hotfix was really a "preview" of it.
It's a power saving feature for Crossfire & dual-GPU cards. The ASICs that are not connected to a display are put into an ACPI sleep mode to save system power (hence they can't be read in CCC); they fully "wake up" when an application calls for Crossfire performance.
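Roughly how that policy behaves, as an illustrative C sketch - the gpu_t type and update_power_state() are hypothetical stand-ins for the driver's internal power-management hooks, not a real API:

[code]
#include <stdbool.h>
#include <stdio.h>

typedef struct { int id; bool has_display; bool asleep; } gpu_t;

/* Put any ASIC with no display attached into a low-power
 * (ACPI-style) sleep state; wake everything when an application
 * requests Crossfire rendering.                                  */
static void update_power_state(gpu_t *gpus, int n, bool crossfire_active)
{
    for (int i = 0; i < n; i++) {
        bool want_awake = gpus[i].has_display || crossfire_active;
        if (gpus[i].asleep && want_awake) {
            gpus[i].asleep = false;          /* full wake-up */
            printf("GPU %d: waking for Crossfire\n", gpus[i].id);
        } else if (!gpus[i].asleep && !want_awake) {
            gpus[i].asleep = true;           /* a sleeping GPU can't be
                                                polled, e.g. by CCC    */
            printf("GPU %d: entering sleep\n", gpus[i].id);
        }
    }
}

int main(void)
{
    gpu_t gpus[2] = { {0, true, false}, {1, false, false} };
    update_power_state(gpus, 2, false);  /* GPU 1 sleeps   */
    update_power_state(gpus, 2, true);   /* GPU 1 wakes up */
    return 0;
}
[/code]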
Amusingly, Wiki redirects "Static Superscalar" to VLIW.
Prior to the game's release we'd identified a number of areas where performance could be improved on ATI Radeons, and the developers have responded:
http://www.virtualr.net/need-for-speed-shift-discussing-shift-with-ian-bell/
We have individual control over each of the scalar processors within the VLIW - we can (and do) pack more than one instruction at a time in order to maximise the VLIW utilization. Your link points to this:
This is very much what is achieved on the architecture.
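A rough illustration in plain C (standing in for the compiler's view of a shader, not real ISA): the five operations below are mutually independent, so they can be co-issued as one VLIW5 bundle, whereas a dependent chain could not be packed this way.

[code]
#include <math.h>
#include <stdio.h>

/* Five independent scalar operations - the shader compiler can
 * pack all of them into a single VLIW bundle so that all five
 * stream processors execute in the same cycle. A dependent chain
 * like r = a*b; s = r+c; cannot be packed, because slot 1 would
 * need slot 0's result.                                           */
float bundle(float a, float b, float c, float d,
             float e, float f, float g, float h, float j)
{
    float x = a * b;        /* slot 0: MUL                        */
    float y = c + d;        /* slot 1: ADD                        */
    float z = e * f + g;    /* slot 2: MAD                        */
    float w = a - h;        /* slot 3: SUB                        */
    float t = sinf(j);      /* slot 4: transcendental ("T") unit  */
    return x + y + z + w + t;
}

int main(void)
{
    printf("%f\n", bundle(1, 2, 3, 4, 5, 6, 7, 8, 0.5f));
    return 0;
}
[/code]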
Irrespective of whether the sound is used or not, the system will detect the hardware and ask for a driver. Even though the board has two DVI outputs, HDMI (with audio) can be achieved via the DVI-to-HDMI adapter that you probably got with the board. However, if you have no intention of using HDMI...
You are quite right to point out that games are becoming more shader-intensive - even NVIDIA admit this with the changes to the texture:shader ratio in GT200 - which is good for our architecture.
However, an additional factor is that R600 was late to the party. NVIDIA had the G80 architecture...
What about it? Do you know what changes were made? Are you also suggesting that we shouldn't improve drivers? The performance numbers were what they were when people bought them; surely continued improvements are what good support of a product is all about.
Additionally, think of a DX10 title...
I can point to a "Vantage" Hotfix driver that made gains with 3DMark Vantage; however, it was in no way Vantage-specific - it made significant [i]generic[/i] changes to our base DX10 driver code, giving nice performance gains across many DX10 apps.
http://www.rage3d.com/articles/vantage-hotfix/
I've not actually verified this one myself, but Hardspell are reporting that PowerColor have a native HDMI / DisplayPort / DVI board:
http://en.hardspell.com/doc/showcont.asp?news_id=3836
Like the HD 2000 series, the ATI Radeon HD 3870 and 3850 both have an integrated audio controller specifically for HDMI sound output (without additional cabling to the GPU). Most of the current HD 3800s available are shipping with an HDMI-certified DVI-HDMI adapter that carries the audio.
Actually, yes. Fundamentally there is no reason why one precision should be faster than another if the internal datapaths are built for it. If you've ever read some of the comments from the ATI engineers about precision support, some of them are fundamentally against the...
Please note: there is no such thing as an "FP24 Shader"; there is only a "Shader" (implicitly full precision) or a "Shader which contains Partial Precision hints".
However, the problem with your suggestion above is that it only accounts for that shader alone - it does not account for when that...
Somebody hacking it and thinking there is no discernible difference is different from a developer validating that the output generated is the output that the artists expect to see. By altering the precision it operates at, you are potentially altering the results that are intended to be...
The default path for these boards is now DX8.1, not DX9, so the performance for them is already there, since the DX8.1 path is faster.
If you want to hack it, then the 5 minutes of analysis that's probably gone on here will probably tell you there are no discernible IQ differences. If you...
Just a couple of things to point out.
Definition-wise, you don't force FP24; you actually force FP16. The default for compilation of shaders under DirectX is full precision, and the hardware requirements to be classified as full precision state that you must support at least FP24...
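As a plain-C analogy of why forcing a lower precision isn't "free" - FP32 vs FP64 standing in here for FP16/FP24 vs full precision; the exact formats differ, but the effect is the same kind:

[code]
#include <stdio.h>

/* The same arithmetic diverges once intermediate precision drops,
 * which is why forcing a shader to a lower precision can change
 * the image the artists signed off on.                            */
int main(void)
{
    double d = 0.0;
    float  f = 0.0f;
    for (int i = 0; i < 1000000; i++) {
        d += 0.1;       /* higher-precision accumulation            */
        f += 0.1f;      /* lower-precision accumulation             */
    }
    printf("full precision    : %.6f\n", d);  /* stays close to 100000 */
    printf("reduced precision : %.6f\n", f);  /* drifts visibly        */
    return 0;
}
[/code]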
That's somewhat of a fallacy. ArtX had nothing on its roadmap like this, which is one of the reasons why it could be bought. The technology in there was all designed within the period post ATI purchasing them, and you can even see the lineage from previous ATI parts that were used or extended in...
Different design teams are doing the different parts. I think you'll be surprised when you start hearing the technical details. Don't assume they will have the same capabilities, because they are not addressing the exact same markets.