Some might remember a writeup I posted 12 days ago detailing my AoC-specific experience with a GTX 280, as one of the first consumers in the world (I live in Hong Kong) to enjoy the new Nvidia release.
My GTX 280 AoC Specific Review:
http://www.hardforum.com/showthread.php?t=1315672
So why have I wound up with a pair of ATi 4870s in Crossfire? No, I'm not a child with rich parents, nor am I a bum blowing taxpayers' dollars on my own indulgence. I'm a derivatives trader who knows I still have the choice to flog the GTX 280 before the ATi becomes widely available. I'm literally still in the money, and the fact that these cards come out so much earlier in Hong Kong than in the rest of the world gives me a huge pricing advantage.
This is a somewhat long-winded post; please bear with me, as I'd like to fill you in with the information that NO HARDWARE WEBSITES OUT THERE BOTHER TO POST ANYMORE. Not sure about you, but most of my friends spend hard-earned money on parts not just for numbers and benchmarks; we want a trouble-free, quality PC experience.
Here's the system spec that matters, so you can get your bearings on my opinions:
- Gigabyte X48 DQ6
- Intel E8400 @ 4GHz (500 FSB x 8)
- 4G DDR2 Ram@ 1GHz (5-5-5-18)
- Mtron SSD 16GBX2 Raid0 on Intel ICH9R (no bandwidth bottleneck as reported on the web with the latest Intel chipset driver; people just don't post follow-ups to tell you when things are fixed)
- Raptor 150GBX2 Raid0 on Intel ICH9R
- Corsair HX620 Power Supply
- Vista 32bit SP1
- I play all games at my Eizo S2401W's native 1920 x 1200
- 177.40 driver for GTX 280 tests; 8.7 beta Catalyst driver (i.e., the Hotfix) for 4870 XF tests
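For anyone checking the maths on the overclock mentioned later, here's a quick sketch of how the 33% figure falls out of the FSB x multiplier settings above (the E8400 stock figures of 333MHz x 9 are my own assumption, not from the shop or the box):

```python
# Back-of-envelope check of the E8400 overclock in the spec list above.
# Assumed stock figures: 333 MHz FSB x 9 multiplier (~3.0 GHz).
stock_fsb, stock_mult = 333, 9
oc_fsb, oc_mult = 500, 8            # settings from the spec list

stock_ghz = stock_fsb * stock_mult / 1000   # ~3.0 GHz
oc_ghz = oc_fsb * oc_mult / 1000            # 4.0 GHz
gain_pct = (oc_ghz / stock_ghz - 1) * 100   # ~33%
print(f"{stock_ghz:.2f} GHz -> {oc_ghz:.2f} GHz (+{gain_pct:.0f}%)")
```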
1). As you can see, I've invested in an Intel platform that's screaming for a Crossfire setup, as the X48 runs full PCIe 2.0 16X speed on both slots. Even assuming money is no object, I'm stuck with a single-card Nvidia graphics solution with no option to scale, except trashing the card and slot when the next Nvidia release comes down the road.
While it's trivial to just buy a 790i board to solve this problem, the time and effort required to get myself all set up with the SSD and Raptor raids and arrive at a stable computing platform is not. The Gigabyte is an amazing piece of engineering (prior to this I've gone through 4 Asus X38/48 mainboards, all with stability issues, overclocked or otherwise) that allows me to run the E8400 at a 33% overclock requiring no voltage bumps, with the entire system remaining as stable as a stock setup. By stable I'm not referring to being able to complete benchmarks/loops over a certain period; that's the standard used by hardware websites. To me it is the ability to boot cleanly; that the system doesn't reset itself randomly; that games do not crash to the desktop with odd error messages; that the system doesn't overheat or waste power; that the BIOS at least stays in a consistent state with no resets at the next boot; etc. I experienced all of these faults with Asus across at least 4 of their recent products, and to me that's by no means coincidental.
Perhaps the fact that Gigabyte chooses to use the highest quality capacitors, MOSFETs, voltage regulators, etc., and that their overall design goal is stability and energy efficiency, have all contributed to the success I'm having with the system. There's no logical reason for me to chance an Nvidia 790i platform; life is just too precious to be spent crawling under my desk troubleshooting a PC.
(-1 to all the hardware reviews that just show you numbers and neglect real usability concerns)
2). The GTX 200 series power saving features do NOT work on an Intel platform. That's right. I thought I was the odd one out when I saw countless websites posting about the down-clocks and down-voltages, whereby the GPU settles at around 300MHz core / 100MHz memory when no 3D app is running. What some of them failed to state was that they only achieved this on an Nvidia platform, while ALL the reviews out there failed to warn consumers that this does NOT work on an Intel platform at all. The write-ups out there further confuse you by stating the downclocks are achieved in drivers, while only the full HybridPower features (i.e., shutting down the GPU and offloading 2D to the onboard IGP) require an Nvidia platform. They just don't know what they are talking about.
This was confirmed after a 30 minute phone conversation with an engineer at XFX, the manufacturer of my GTX 280.
What this translates to is that all the idle power consumption graphs you saw in reviews of the GTX 200 series do NOT apply to an Intel platform. Again, out of the 15-odd reviews out there, not one, I repeat, NOT A SINGLE ONE, bothered to state this clearly. What's the real-world user footprint of an Nvidia platform? 0.1% of the PC population?
The GPU temp does drop back to the 50s while idling in Windows, but the clocks do not scale back, which leads me to believe there are no real power savings.
Power Saving Issue with GTX 280 on an Intel Chipset (x48):
http://www.hardforum.com/showthread.php?t=1318229
(-2 to all the hardware reviews that just show you numbers and fail to experiment and report useful facts)
(-3 to the biggest hardware sites out there allocating a disproportionate number of pages to architecture and technicalities. Do consumers benefit from that sort of waffle? Does it help contribute towards their buying decisions? As far as I'm concerned, those in-depth analyses are about as authoritative as random rants from a pizza delivery boy. Where's the credibility in all that cut and paste if the author is anything less than a double-E major?)
3). Heat and Noise. I used my trusty right-foot test for this. While playing AoC over a 4-hour grind session, the toes on my right foot got very warm dangling near the back vent of my PC case. I have a dual screen setup, so I could see for myself in realtime, without alt-tabbing and letting the GPU cool down, that the GPU was consistently over 95C throughout the 4 hours. The fan noise coming out of the GTX 280 cuts into my gameplay enjoyment by as much as the USD700 dent to my wallet.
4). Performance. I let my human senses and perception be the judge instead of Excel, helped by an fps display in games. If you insist on science, the GTX 280 scored 5284 in 3DMark Vantage Extreme (I honestly don't understand why websites insist on posting scores in performance mode; who in their right mind would buy the current GPU generation to run at 1280 x 1024 res?), with the latest buff from the 177.40 drivers unleashing PhysX. If you read my other posts, I've been using an 8800GT for the last 6 months, but unfortunately the increase I experienced in the same games upon switching over to the GTX did not make me go "wow, this is the shit! 700 bucks well spent, let's pat myself on the back!"
Back to AoC, the main game that I play now, the GTX did allow me to:
- Turn all view distances to max
- Up the AA from none to 8xQAA or 16xAA (performance over the 2 modes was similar)
Arguably the USD700 spent has already served its purpose. I can now play my favourite (of the month) game at max settings while maintaining a very smooth and playable experience.
But in view of the negatives above, and more importantly the alternative solution that the same USD700 can get me, I decided to dive into the unknown until I'm fully satisfied with the dollars spent.
I've spent days poring over web reviews, forum postings and every little tidbit I can find on the 4870, especially in Crossfire mode. As you can tell from what I've written thus far, I'm less than impressed with the so-called 20-page monster reviews out there, for reasons already stated. So there's only 1 way to find out for myself.
I proceeded to flog the GTX 280 on eBay, and managed a small profit. I then went down to the shops yesterday, the first day the Sapphire 4870 became available. The shop only carried one; I told them I wanted to buy 2 for Crossfire. I also wanted to hedge my purchase, so I asked for the option to swap the cards within 3 days for a GTX 280 should they not measure up, while they keep the change (if I do swap, they'll make an extra USD30 profit over the price difference of the two 4870s vs the GTX 280). So they managed to bump another card off 1 of the 3 shops that carried it in the entire computer shopping mall, and sold them to me at wholesale price (yes, occupational advantage there).
At this stage I've only had 1 full day of experience with the 4870 XF. Here's what I've found:
1). User experience: the stuff you read about the poor quality of ATi drivers is pure bollocks. As soon as I installed Catalyst, everything was in order. The multi-monitor desktop was easy to configure, and the UI was as logical to use as Nvidia's ForceWare. The AoC servers were undergoing maintenance, so I fired up Vantage to test if the cards were working: 7215 in Extreme (as if it means anything). Next up were COD4, WoW and Assassin's Creed to check out what all the fuss was about with Crossfire.
COD4: for the whole 15 minutes of a TDM I was playing at >90fps, with dips to the 70s in smoke-filled and intensely contested spots. Framerate-wise it's definitely a notch above the GTX 280, particularly the minimum framerate, which was MUCH higher than on the GTX. But it's the gameplay experience that counts, right?
Micro-stutter does exist. Sorry.
It manifests itself fairly randomly, a bit like watching TV on an old CRT where you spot the occasional scan line drawing a wave over parts of the image. Once I start concentrating on playing it doesn't really get in the way; I have to specifically look for it for it to be there. Having stared at the image for over an hour, it did not make me any more tired than a single-card solution like the GTX 280. So, bottom line, it seems more of a nitpick than a real issue.
I couldn't tell a difference in image quality if I were blind-tested. All I can say is the Crossfire setup was absolutely oozing with power; you can almost feel a palpable scream from the GPUs saying "unleash me". I had a quick try with Crossfire disabled, and it was immediately apparent that the experience fell back to the GTX level, which is to say XF is scaling beautifully on a 4GHz dual-core CPU.
GPU activity was 100%; max GPU temp was 83C during the entire hour.
Now, the most important part for me was for the secondary monitor to remain visible whilst playing fullscreen on my primary. +1 to the ATi driver team for making this possible (I read that this isn't the case on an equivalent SLI setup). There is a drawback: as soon as you mouse out of the fullscreen (or alt-tab), the secondary monitor goes blank and only the primary Windows desktop is visible. Not a big deal for me, as I only need to read the secondary display while playing, to look at stock tickers, MSN messages, game tips/cheats etc., although this behaviour does make it somewhat difficult to interact with the stuff on the secondary monitor, so there is a sacrifice for the extra fps power.
WoW: well, it just never dipped below 100fps, Crossfire or not. Quite similar to the GTX 280, although this time there's no microstutter visible. There was some blue post as recent as Feb 08 claiming that while dual GPUs work in WoW, there is no measurable advantage. I can confirm this to be true. Anyhow, it's a 5-year-old game (including beta); why bother, let's move on.
Assassin's Creed: the XF absolutely flies in this DX10 game. While the GTX wasn't exactly crap, the degree of smoothness achieved with the XF made Assassin's Creed, a top-of-the-line DX10 game, feel as old as WoW, and you sense there's so much more spare power in the XF aching to be harnessed; it's plain embarrassing.
I did not notice microstutter at all in this game, however hard I tried to look for it.
AoC: the servers finally came up. With the number of posts on the Conan forums about crappy ATi 3870 (particularly X2) experiences, poor framerates, unrendered or broken textures, no scaling with Crossfire, and claims that the game is only written for nVidia, I was ready to rip out the pair of 4870s, return to the shop and swap back for a GTX 280 in a heartbeat if the first 10 seconds of gameplay bore any resemblance to what others have reported.
This was the acid test for me.
Before I logged off on my GTX 280, I picked a spot in Khesh, on top of a small hill overlooking mountains, grass and buildings that stretched as far back as the game allowed, and noted the fps bouncing between 40 and 44. Now, the first thing I paid attention to was whether this number changed. It certainly did, for the better, reading between 60 and 62. Of course this isn't the most scientific benchmark in the world, but bear with me. I then proceeded to check across 3 different zones for the broken textures and defects that others have fondly reported, and found none. 30 minutes into the XF experience, AoC was rendered even prettier than on any Nvidia cards I've used in the past few weeks. I can't exactly put my finger on what it is; it could be the colour, the sharpness, the better-defined textures, the 8X MSAA (as opposed to Nvidia's 8XQAA), or a combination of all of them. I've been grinding in Khesh for the past week or so on the GTX 280 and have developed a good ballpark feel for the fps numbers, and I noticed a SIGNIFICANT improvement in fps under the exact same graphics settings. I now peak at 120fps outdoors with troughs at around 30fps, and the abundant water and shader effects in Thunder River did not even make a dent in the fps, whereas on the GTX it would quickly dip to the teens. The XF setup made the whole AoC experience feel as fluid as WoW; it's effing ridiculous.
No microstutter whatsoever. GPU activity at 100% throughout. All this with a second-cut beta driver, on a genuinely brand-new GPU engine. I'm not sure what else I can nitpick over.
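To put rough numbers on the Khesh comparison above, the two fps ranges I quoted work out to roughly a 40-50% gain (simple percentage maths on my eyeballed readings, nothing more scientific than that):

```python
# fps ranges at the same Khesh vantage point, same settings (eyeballed)
gtx_range = (40, 44)   # GTX 280
xf_range = (60, 62)    # 4870 Crossfire

low_gain = (xf_range[0] / gtx_range[0] - 1) * 100    # 50% at the low end
high_gain = (xf_range[1] / gtx_range[1] - 1) * 100   # ~41% at the high end
print(f"+{low_gain:.0f}% to +{high_gain:.0f}% in that scene")
```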
2). Heat: obviously it runs hotter than the single GTX. That's just the laws of physics, which no engineer can defy, but that's only because there are 2 cards running in my case now. The GPU never went hotter than 83C, though I can say for certain the heat coming off the back vent was much hotter than with the GTX. I'm tempted to chalk that up to the more efficient cooler on the ATi. I always had concerns my PC would catch fire while running the GTX; I'd never seen 95C on a GPU before (105C with a mild 30MHz overclock), but with the ATis I feel the machine can go on for days without a hitch.
3). Noise: this directly reflects the state of the ATi 4870 BIOS. As soon as I powered on, the pair of them took off like jet engines, then went dead quiet after the POST screen. In Windows the fans go full chat at times, then quickly power down. During my entire 6 hours in AoC, the cards were barely audible, with the fans remaining at moderate speed. Much, much, much quieter than the GTX 280.
Sorry for the wall of text; I hope you found the above a little more useful than a bunch of graphs from some teenage me-too review sites. Let's just say the pair of those red monsters will be keepers. In closing, there are a few points to make from the experience of owning the latest from both companies:
- If there were any lingering doubts over the quality and state of ATi engineering before I made the purchase, they are now completely gone. Both the GTX and the ATi are reference boards, and to be fair the GTX feels a little better made: better put together and a little more substantial. But video cards are not designed to hang in the living room, are they?
- The ATi in Crossfire is substantially faster than the GTX. The difference is tangible; even my girlfriend could tell in a blind test. Some might argue I'm comparing 2 cards vs 1. All I want to say is that it's the same money: USD700. Who effing cares if it's 10 cards vs 1? It's what you experience day to day in the games you play that matters, for the same dollar.
- The ATi feels like a genuinely new design. From the game experience it gives you the impression that there's much, much more to be had from these cards; far more than from the GTX. The whole GTX architecture has been around for 2 years; those saying the drivers are still early are fooling themselves to reduce buyer's remorse.
- Crossfire works, and works better still as your CPU scales. There was a test from iBuyPower on how XF scales from 3GHz to 4GHz, and I echo their findings.
- Finally, I'm enjoying a much better gameplay experience for the same USD700 while staying on the much better Intel platform, instead of being forced onto an nForce chipset just so my games can scale. Also, the GTX power saving features not working on Intel makes you question the logic of spending a fortune and not benefiting from one of the major advertised features.
p.s. I chose to post this on [H], as these guys are among the best out there with their reviews, covering actual gameplay and real-world issues instead of cut-and-paste for 9 out of 10 pages of an article.
Edit:
There's been more than 1 response saying the downclock works. I stand corrected; it must have been down solely to my copy of the XFX card. The engineer I spoke to may not have had enough experience with the GTX 280 to identify whether it's the card or the series at fault.