4870 XF vs GTX 280 - My own hands-on experience

Sheganks

Some might remember a writeup I posted 12 days ago detailing my AoC-specific experience with a GTX 280, as one of the first consumers in the world (I live in Hong Kong) to enjoy the new Nvidia release.

My GTX 280 AoC Specific Review:
http://www.hardforum.com/showthread.php?t=1315672

So why have I wound up with a pair of ATi 4870s in Crossfire? No, I’m not a child with rich parents, nor am I a bum blowing taxpayers’ dollars on my own indulgence. I’m a derivatives trader who knows I still have the choice to flog the GTX 280 before the ATi becomes widely available; I’m literally still “in the money”, and the fact that these cards come out so much earlier in Hong Kong than in the rest of the world gives me a huge pricing advantage.

This is a somewhat longwinded post, so please bear with me, as I’d like to fill you in on the information that NO HARDWARE WEBSITE OUT THERE BOTHERS TO POST ANYMORE. Not sure about you, but most of my friends spend hard-earned money on parts not just for numbers and benchmarks; we want a trouble-free and quality PC experience.

Here’s the system spec that matters, so you can get your bearings on my opinions:
  • Gigabyte X48 DQ6
  • Intel E8400 @ 4GHz (500 FSB x 8)
  • 4GB DDR2 RAM @ 1GHz (5-5-5-18)
  • Mtron SSD 16GB x 2 Raid0 on Intel ICH9R (no bandwidth bottleneck with the latest Intel chipset driver, contrary to what’s reported on the web; people just don’t post followups to tell you when things are fixed)
  • Raptor 150GB x 2 Raid0 on Intel ICH9R
  • Corsair HX620 power supply
  • Vista 32bit SP1
  • I play all games at my Eizo S2401W’s native 1920 x 1200
  • 177.40 driver for the GTX 280 tests; 8.7 beta Catalyst driver (i.e., the Hotfix) for the 4870 XF tests
I’ve been actively using the GTX 280 over the last 2 weeks before coming to a decision. Here’s a list of things that irk me about this USD700 extravaganza:

1). As you can see, I’ve invested in an Intel platform that’s screaming for a Crossfire setup, as the X48 runs both slots at full PCIe 2.0 16X speed. Even assuming money is no object, I’m stuck with a single-card Nvidia graphics solution with no option to scale, except trashing the card and slotting in the next Nvidia release down the road.

While it’s trivial to just buy a 790i board to solve this problem, the time and effort required to get myself all set up with the SSD and Raptor raids and to arrive at a stable computing platform is not. The Gigabyte is an amazing piece of engineering (prior to this I went through 4 Asus X38/X48 mainboards, all of which had stability issues, overclocked or otherwise) that lets me run the E8400 at a 33% overclock with no voltage bumps, with the entire system remaining as stable as a stock setup.

By stable I’m not referring to being able to complete benchmark loops over a certain period; that’s the standard used by hardware websites. To me it means the system boots cleanly, doesn’t reset itself randomly, that games don’t crash to the desktop with odd error messages, that the machine doesn’t overheat or waste power, and that the BIOS at least stays in a constant state with no resets at the next boot. I experienced all of these faults with Asus across at least 4 of their recent products, and to me that’s by no means coincidental.

Perhaps the fact that Gigabyte chooses the highest-quality capacitors, MOSFETs, voltage regulators, etc., and that their overall design goal is stability and energy efficiency, has contributed to the success I’m having with this system. There’s no logical reason for me to chance an Nvidia 790i platform; life is just too precious to be spent crawling under my desk troubleshooting a PC.

(-1 to all the hardware reviews that just show you numbers and neglect real usability concerns)


2). The GTX 200 series power-saving features do NOT work on an Intel platform. That’s right. I thought I was the odd one out when I saw countless websites posting about the down-clocks and down-voltages, whereby the GPU core settles at 300MHz and the memory at 100MHz when no 3D app is running. What some of them failed to state was that they only achieved this on an Nvidia platform, while ALL the reviews out there failed to warn consumers that this does NOT work on an Intel platform at all. The writeups further confuse you by stating that the downclocks are achieved in drivers, and that only the full HybridPower features, i.e., shutting down the GPU and offloading 2D to the onboard IGP, require an Nvidia platform. They just don’t know what they are talking about.

This was confirmed after a 30-minute phone conversation with an engineer at XFX, the manufacturer of my GTX 280.

What this translates to is that all the idle power-consumption graphs you saw in reviews of the GTX 200 series do NOT apply to an Intel platform. Again, out of the 15-odd reviews out there, not one, I repeat, NOT A SINGLE ONE, bothered to state this clearly. What’s the real-world footprint of users on an Nvidia platform? 0.1% of the PC population?

The GPU temp does drop back to the 50s while idling in Windows, but the clocks do not scale back, which leads me to believe there are no real power savings.

Power Saving Issue with GTX 280 on an Intel Chipset (x48):
http://www.hardforum.com/showthread.php?t=1318229
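
If you’d rather check this on your own rig than take my word (or an XFX engineer’s) for it, below is a minimal sketch of the idea. It assumes a modern driver stack with nvidia-smi on the PATH; back on these 2008-era drivers you’d watch the clocks in RivaTuner’s hardware monitor instead.

[code]
# idle_clocks.py -- poll the GPU once a second and print core/memory
# clocks, so you can see whether the card actually downclocks at the
# desktop. Assumes a modern NVIDIA driver with nvidia-smi available;
# on 2008-era drivers, RivaTuner's hardware monitor is the equivalent.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.gr,clocks.mem,temperature.gpu",
    "--format=csv,noheader",
]

for _ in range(60):  # watch for a minute; idle clocks should settle within ~30s
    line = subprocess.check_output(QUERY, text=True).strip()
    core, mem, temp = (field.strip() for field in line.split(","))
    print(f"core {core:>10}   mem {mem:>10}   temp {temp} C")
    time.sleep(1)
[/code]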

(-2 to all the hardware reviews that just show you numbers and fail to experiment and report useful facts)

(-3 to the biggest hardware sites out there allocating a disproportionate number of pages to architecture and technicalities. Do consumers benefit from that sort of waffle? Does it contribute towards their buying decisions? As far as I’m concerned, those “in-depth analyses” are about as authoritative as random rants from a pizza delivery boy. Where’s the credibility in all that cut-and-paste if the author is anything less than a double-E major?)

3). Heat and Noise. I used my trusty right-foot test for this. While playing AoC over a 4-hour grind session, the toes on my right foot got very warm dangling near the back vent of my PC case. I have a dual-screen setup, so I could see in realtime, without alt-tabbing and letting the GPU cool down, that the GPU was consistently over 95C throughout the 4 hours. The fan noise coming out of the GTX 280 cuts into my gameplay enjoyment by as much as the USD700 dent to my wallet.

4). Performance. I let my own senses and perception be the judge instead of Excel, helped by an fps display in games. If you insist on “science”, the GTX 280 scored 5284 in 3DMark Vantage Extreme (I honestly don’t understand why websites insist on posting scores in Performance mode; who in their right mind would buy the current GPU generation to run at 1280 x 1024?), with the latest buff from the 177.40 drivers unleashing PhysX. If you’ve read my other posts, you’ll know I’d been using an 8800GT for the last 6 months, but unfortunately the increase I experienced in the same games upon switching over to the GTX did not make me go “wow, this is the shit! 700 bucks well spent, let’s pat myself on the back!”

Back to AoC, the main game that I play now, the GTX did allow me to:
  • Turn all view distances to max
  • Up the AA from none to 8xQAA or 16xAA (performance across the 2 modes was similar)
All this while doubling the fps over what I had with the 8800GT. More importantly, it felt a lot smoother, with none of the choppiness I had on the GT, where flicking my view would get caught in frame jerks whenever there was a pile of NPCs/objects/players in my camera. I would say under the GTX I averaged 40fps, with highs of 80fps and lows of 15fps in certain maps (Thunder River comes to mind, with a lot of water and shading effects). This is all in the outdoor environment, where it matters.

Arguably the USD700 spent has already served its purpose. I can now play my favourite (of the month) game at max settings while maintaining a very smooth and playable experience.

But in view of the negatives above, and more importantly the alternative that the same USD700 could get me, I decided to dive into the unknown until I was fully satisfied with the dollars spent.

I’ve spent days poring over web reviews, forum postings and every little tidbit I could find on the 4870, especially in Crossfire mode. As you can tell from what I’ve written thus far, I’m less than impressed with the so-called “20-page” monster reviews out there, for reasons already stated. So there was only 1 way to find out for myself….

I proceeded to flog the GTX 280 on Ebay, and managed a small profit. I then went down to the shops yesterday, the first day the Sapphire 4870 became available. The shop only carried one; I told them I wanted to buy 2 for Crossfire. I also wanted to hedge my purchase, so I asked for the option to swap the cards within 3 days for a GTX 280 should they not measure up, with them keeping the change (if I do swap, they’ll make an extra USD30 profit on the price difference of 2 x 4870 vs a GTX 280). So they managed to source another card from 1 of the 3 shops in the entire computer shopping mall that carried it, and sold the pair to me at wholesale price (yes, occupational advantage there).

At this stage I’ve only had 1 full day of experience with the 4870 XF. This is what I’ve found:

1). User experience – the stuff you read about the poor quality of ATi drivers is pure bollocks. As soon as I installed the Catalyst, everything was in order. The multi-monitor desktop was easy to configure, and the UI was as logical to use as Nvidia’s Forceware. The AoC servers were undergoing maintenance, so I fired up Vantage to test whether the cards were working: 7215 in Extreme (as if it means anything). Next up were COD4, WoW and Assassin’s Creed to check out what all the fuss was about with Crossfire.

COD4 – for the whole 15 minutes of a TDM I was playing at >90fps, with dips into the 70s in smoke-filled and intensely contested spots. Framerate-wise it’s definitely a notch above the GTX 280, particularly the minimum framerate, which was MUCH higher than on the GTX. But it’s the gameplay experience that counts, right?

Micro-stutter does exist. Sorry.

It manifests itself fairly randomly, a bit like watching TV on an old CRT where you spot the occasional scan line drawing a wave over parts of the image. Once I start concentrating on playing, it doesn’t really get in the way; I have to specifically look for it to see it. Having stared at the image for over an hour, I was no more tired than with a single-card solution like the GTX 280. So, bottom line, it’s more of a nit-pick than a real issue.

I couldn’t tell a difference in image quality if I were blind-tested. All I can say is the Crossfire setup was absolutely oozing with power; you can almost feel a palpable scream from the GPUs saying “unleash me”. I had a quick try with Crossfire disabled, and it was immediately apparent that the experience fell back to the GTX level, which is to say XF is scaling beautifully on a 4GHz dual-core CPU.

GPU activity was 100%, max GPU temp was 83C during the entire hour.

Now, the most important part for me was for the secondary monitor to remain visible whilst playing fullscreen on my primary. +1 to the ATi driver team for making this possible (I read that this isn’t the case on an equivalent SLI setup). There is a drawback: as soon as you mouse out of the fullscreen (or alt-tab), the secondary monitor goes blank and only the primary Windows desktop is visible. Not a big deal for me, as I only need to read the secondary display while playing, for stock tickers, MSN messages, game tips/cheats etc, although this behaviour does make it somewhat difficult to interact with the stuff on your secondary monitor. So there is a small sacrifice in exchange for the extra fps power.

WoW – well, it just never dipped below 100fps, Crossfire or not. Quite similar to the GTX 280, although this time there was no microstutter visible. There was a blue post as recent as Feb 08 claiming that while dual GPUs work in WoW, there is no measurable advantage; I can confirm this to be true. Anyhow, it’s a 5-year-old game (counting the beta), why bother, let’s move on.

Assassin’s Creed – the XF absolutely flies in this DX10 game. While the GTX wasn’t exactly crap, the degree of smoothness achieved with the XF made Assassin’s Creed, a top-of-the-line DX10 game, feel as old as WoW, and you sense there’s so much more spare power in the XF aching to be harnessed; it’s plain embarrassing.

I did not notice microstutter at all in this game, however hard I tried to look for it.

AoC – the servers finally came up. With the number of posts on the Conan forums about crappy ATi 3870 (particularly X2) experiences, poor framerates, “unrendered” or broken textures, no scaling with Crossfire, and claims that the game is only written for nVidia, I was ready to rip out the pair of 4870s, return to the shop and swap back for a GTX 280 in a heartbeat if the first 10 seconds of gameplay bore any resemblance to what others had reported.

This was the acid test for me.

Before I logged off on my GTX 280, I picked a spot in Khesh, on top of a small hill overlooking mountains, grass and buildings that stretched as far back as the game allowed, and noted the fps bouncing between 40 and 44. Now, the first thing I paid attention to was whether this number changed. It certainly did, for the better, reading between 60 and 62. Of course this isn’t the most scientific benchmark in the world, but bear with me.

I then proceeded to check across 3 different zones to look for the “broken” textures and defects that others have fondly reported, and found none. 30 minutes into the XF experience, AoC was rendered even prettier than on any Nvidia card I’ve used in the past few weeks. I can’t exactly put my finger on what it is; it could be the colour, the sharpness, the better-defined textures, the 8x MSAA (as opposed to Nvidia’s 8xQAA), or a combination of all of them.

I’ve been grinding in Khesh for the past week or so on the GTX 280 and have developed a good ballpark feel for the fps numbers, and I noticed a SIGNIFICANT improvement under the exact same graphics settings. I now peak at 120fps outdoors with troughs at around 30fps, and the abundant water and shader effects in Thunder River did not even make a dent in the fps, whereas on the GTX it would quickly dip into the teens. The XF setup makes the whole AoC experience feel as fluid as WoW; it’s effing ridiculous.
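
For those who want at least a rough number out of that hilltop spot, here’s the back-of-envelope math on my eyeballed readings (nothing more scientific than that):

[code]
# Back-of-envelope on the Khesh hilltop spot. The fps ranges are my
# own eyeballed readings from the in-game counter, nothing more.
gtx280 = (40 + 44) / 2   # midpoint of the GTX 280 range
xf4870 = (60 + 62) / 2   # midpoint of the 4870 XF range
gain = (xf4870 - gtx280) / gtx280 * 100
print(f"{gtx280:.0f} fps -> {xf4870:.0f} fps, roughly {gain:.0f}% faster")
# 42 fps -> 61 fps, roughly 45% faster
[/code]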

No microstutter whatsoever. GPU activity at 100% throughout. All this with a 2nd-cut beta driver, on a genuinely brand-new GPU engine. I’m not sure what else I can nit-pick.

2). Heat – obviously the system ran hotter than with the single GTX; that’s the laws of physics that no engineer can defy, and it’s simply because there are 2 cards in my case now. Still, the GPUs never went hotter than 83C, and I can say for certain the air coming off the back vent was much hotter than with the GTX. I’m tempted to chalk that up to the more efficient cooler on the ATi. I always had concerns my PC would catch fire while running the GTX, as I’d never before seen 95C on a GPU (105C with a mild 30MHz overclock), but with the ATis I feel the machine could go on for days without a hitch.

3). Noise – this directly reflects the state of the ATi 4870 BIOS. As soon as I powered on, the pair of them took off like jet engines, then went dead quiet after the POST screen. In Windows the fans go full chat at times, then quickly spin back down. During my entire 6 hours in AoC, the cards were barely audible, with the fans remaining at a moderate speed. Much, much, much quieter than the GTX 280.

Sorry for the wall of text; I hope you found the above a little more useful than a bunch of graphs from some teenage “me-too” review sites. Let’s just say the pair of red monsters will be keepers. In closing, there are a few points to make from the experience of owning the latest from both companies:
  • If there were any lingering doubts over the quality and state of ATi engineering before I made the purchase, they are now completely gone. Both the GTX and the ATi are reference boards, and to be fair the GTX feels a little better made; it feels better put together and a little more substantial. But video cards are not designed to hang in the living room, are they?
  • The ATi in Crossfire is substantially faster than the GTX. The difference is tangible; even my girlfriend can tell in a blind-test. Some might argue I’m comparing 2 cards vs 1. All I want to say is that it’s the same money: USD700. Who effing cares if it’s 10 cards vs 1? It’s what you experience day to day in the games you play that matters, for the same dollar.
  • The ATi feels like a genuinely new design. From the in-game experience, you get the impression there’s much, much more to be had from these cards; far more than from the GTX. The whole GTX architecture has been around for 2 years, and those saying its drivers are still early are fooling themselves to stave off buyer’s remorse.
  • Crossfire works, and works better still as your CPU scales. There was a test from iBuyPower on how XF scales from 3GHz to 4GHz, and I echo their findings.
  • Finally, I’m enjoying a much better gameplay experience for the same USD700 while staying on the much better Intel platform, instead of being forced onto an nForce chipset just so my games can scale. The fact that the GTX power-saving features don’t work on Intel also makes you question the logic of spending a fortune and not benefiting from one of the major advertised features.

p.s. I chose to post this on [H], as these guys are amongst the best out there with their reviews, covering actual gameplay and real-world issues instead of cut-and-paste for 9 out of 10 pages of an article.

Edit:
There's been more than 1 response saying the downclock works. I stand corrected; it must have been down solely to my copy of the XFX card. The engineer I spoke to may not have had enough experience with the GTX 280 to identify whether the card or the series was at fault.
 
Thanks for the review and a nice writeup. You guys in Hong Kong definitely get a jump start on a lot of cards. Man, I'd love to see a pic of your rig. Two 4870's already... *drool*
 
Nice overview; it was actually a quick read thanks to the formatting. The only thing missing is some screens, but I will let that slide given the honest, down-to-earth review. Keep it up, I like reading personal reviews.
 
Amazing read, requesting sticky!

Man I can *not* wait to get ahold of a pair of those... :D
 
Thanks for your feedback. You made it easy to read, so don't worry about the length. Good to know the 620W Corsair could handle the 4870 CF, I wasn't sure if it would. Now I'll have to think about if I want to spend the money on a pair of 4870s instead of 4850s. :)
 
Interesting. Hardly scientific, but one of the better user-reviews I've seen. :D

I was already leaning towards a 4870 over the 260 myself, for a build I'm doing in July, but I think you just sealed the deal! Now if only they'd hurry up and drop the 1GB model, so I can compare the cost/benefit ratio of that card.

~S
 
Great review, love "real-world" findings and not canned benchies. I can't wait for the HD 4870X2.
 
That's the most detailed "user experience" post I've ever seen.:D Great read! And a point you made at the end reflects the shift we needed in the graphics card industry: a war over the price/performance ratio rather than just raw power. Whether 1 card is involved or 10, the overall cost and performance output matters most. AMD has really hit it home this time, and Nvidia MUST respond to this in order to compete even if they already have the "King of the Hill" title! Which means they'll have to offer even better price/performance-based cards.
 
Thanks for your feedback. You made it easy to read, so don't worry about the length. Good to know the 620W Corsair could handle the 4870 CF, I wasn't sure if it would. Now I'll have to think about if I want to spend the money on a pair of 4870s instead of 4850s. :)

Thanks. I spent some time on the Corsair support forums before I blew cash on the HX620, knowing what was to come. 2 thumbs up for the engineers over on that forum; their responses are honest, professional and no-nonsense.
 
good to get more feedback. I am waiting for the 4870X2 to come out before I take the plunge though.
 
Love to see user reviews. This bug with the 280s not downclocking on Intel platforms is news to me; in the quick 20 mins before my card decided it had had enough, it was downclocking properly according to RivaTuner. Glad you are enjoying your new cards, and I agree with all the things you mentioned about some of the bigger sites' reviews. Frankly, aside from [H] and Hardware Canucks, the reviews from the big sites are nowhere near as informative, or the information is skewed so as not to present the proper picture to the consumer, whom the review should be for, rather than to tout a chip maker's product; Anand lately, imo, is all about the latter.

It is really hard not to run out and snag some 4870s to replace the X2 in my other system, if only for my own comparison's sake, to put beside the 280. I'm waiting until its big brother is out and some more of these purported XF improvements show themselves.
 
Yeah, my favorite reviewers now are definitely HardwareCanucks, [H], TechReport, and Guru3d, because they stress a review of all components with a focus on the in-game experience for the user. That and well-written user reviews, which this one certainly is.

P.S. I wish more people would learn to format their posts like yours!
 
Sheganks said:
Mtron SSD 16GB x 2 Raid0 on Intel ICH9R (no bandwidth bottleneck with the latest Intel chipset driver, contrary to what’s reported on the web; people just don’t post followups to tell you when things are fixed)

:eek:

there's your problem -- the programs have data from the drives before they know they want it.

By Anand's own words, they're not trying to tell you how the card will perform; they're simply trying to tell you its performance relative to everything else, which you can still kinda do with canned benchmarks and graphs.
 
nice write-up!

wow does this mean I can finally go back to an intel chipset?? wooo... :) :)
 
Nice one Sheganks, you're awesome :D
I agree with you completely about what you said at the beginning... that hardware websites so often don't bother posting about the *overall* subjective experience. They tend to focus so much on numbers. Also, that they don't bother following up on reviews etc when things get fixed later. It's great that there are people like you who do everyone a favour by making long, detailed accounts for free :)
Thanks!

PS: I wish I lived in HK! :p
 
:eek:
there's your problem -- the programs have data from the drives before they know they want it.

Not sure which of my many problems this refers to :)

By Anand's own words, they're not trying to tell you how the card will perform; they're simply trying to tell you its performance relative to everything else, which you can still kinda do with canned benchmarks and graphs.

Think of how those "reviews" are done. Put a platform together, plop the card in, get the system in a position to run a game, extract a set of numbers, jot them down, unplug the power and bung the next card in, rinse and repeat. These guys have hardly experienced the product at all. They just ran numbers.

Did you notice there's not a single mention of image quality in all these articles (not just little boy Anand)? An article reviewing a USD700 product, and the only takeaway is a digitized number... there, it's 0.1243 better than the next card, now buy it. That's because they honestly don't know what the card really does and how it behaves over a course of reasonable usage, given the way these "reviews" are hammered out, while having absolutely no edge over the 20+ other "reviews" floating around the web covering the same stuff.

That's prolly why print magazines still exist.
 
By Anand's own words, they're not trying to tell you how the card will perform; they're simply trying to tell you its performance relative to everything else, which you can still kinda do with canned benchmarks and graphs.

I don't really agree. It is pretty well known now that the 4800s take a much smaller performance hit than the GTX 200 variants when AA is enabled, yet Anand benchmarks these games with AA at 2x in 1 game, off in another, and 4x with 16x AF in another, etc. It isn't telling the whole picture and is frankly misleading imo. Maybe I am missing something here, but if you're going to do the canned-benchmark thing rather than playable settings, then give me the performance numbers at various resolutions and AA/AF levels, just like computerbase.de does. It is very misleading imo, and a half-assed reviewing method that serves no purpose in my mind but to put something in a better light, and vice versa.
 
Time to get me a x48 motherboard and a couple of 4800's... :D

Awesome review! Loved it!
 
Sheganks, how are you connecting the two 4870s? Curious as to how you did it as I might go that route....
 
I don't get this Intel-platform bug with the card not downclocking. After I leave AoC on my GTX 280, with the clocks showing in RivaTuner, it takes about 20-30 seconds and they drop to 300/100MHz.

Also, at that point the core temp takes a nose-dive and the fan goes all the way down to minimum. This is on an Asus X38 Formula (DDR2).
 
Great review, and better than some of the ones I've read from "major" websites this week.
 
Great review, I enjoyed it even though I can't afford one HD4870 right now lol.
 
Sheganks, how are you connecting the two 4870s? Curious as to how you did it as I might go that route....

I assume you mean power connections. See below:

[pic 1: the two cards wired up, showing the bundled Molex-to-PCI-E adapter]


There are 2 PCI-E 6-pin connectors available from the Corsair, each sitting on a different 12V rail:

[pic 2: the HX620's two native PCI-E 6-pin leads]


So, logically you would use 1 from each rail to connect to each card.

The Sapphire package comes with a Molex (i.e. the power connector you use for your DVD-ROM) to PCI-E 6-pin converter cable, as can be seen in the first pic. Connect each adapter to a Molex feed on the same rail as that card's earlier connector.

This will go for I'd say 80% of the PSU designs out there.
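
For the curious, here's the rough rail math behind that wiring, using spec-sheet figures I'm assuming rather than anything I've measured (~160W board power per 4870, 18A per +12V rail on the HX620):

[code]
# Rough power budget for 2x HD 4870 on a Corsair HX620. Figures are
# spec-sheet assumptions, not measurements: ~160W board power per
# card (lumping in the ~75W the PCIe slot itself can feed), and 18A
# per +12V rail.
CARD_W = 160          # assumed worst-case board power per 4870, watts
RAIL_W = 18 * 12      # one +12V rail: 18A x 12V = 216W

# wiring both of a card's feeds off the same rail puts roughly one
# card's full draw on each of the two rails:
per_rail = CARD_W
headroom = RAIL_W - per_rail
print(f"~{per_rail}W per rail of {RAIL_W}W rated -> ~{headroom}W headroom each")
# ~160W per rail of 216W rated -> ~56W headroom each
[/code]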
 
I have to say this: Excellent review!

Granted it may not be scientific, but it's an excellent overall read. Hope you do more of these "user reviews" in the future.:)
 
OT: Sheganks, where do you live? :p

(HK Island or Kowloon side?)

Back on topic, that's some kickass personal review you've got there; it certainly sheds light on the possibilities of a 4870 XF setup.

People have reported that AoC does not use the ATI cards to their full potential; did you see those problems at any time? Or were they @ 100% most of the time (given what you've stated)?
 
Sheganks - question for you.... Have you tried Crysis? I currently am running a 4850 xfire setup and have a sealed GTX on my desk. I have considered selling the GTX to go to dual 4870's but I am worried about xfire drivers. No matter what I try I cannot get Crysis to scale in xfire. Crysis, the Witcher and Quake Wars are all I'm really playing right now. Xfire works great with the other two, just not Crysis.

Before the Crysis bashing begins.... Please, I enjoy the game and want it to work on my setup, period. I also would like to know my setup will run Crysis Warhead and Far Cry 2, as I am really looking forward to those titles as well.

Thanks for the write-up, it was very well done. I love the xfire setup so far, I just think the drivers have a ways to go. Oh, and the heat! My 9800GX2 generated a lot of heat, but it was thoroughly shoved out the back of the case. I have great airflow in my case, and running 3DMark demo loops, ambient temps get up to 110F; with the GX2 it only got to around 92F. I'm sure this is because the stock single-slot cooling solution on the 4850s is far inferior to the dual-slot one on the 4870s.
 
Sheganks - question for you.... Have you tried Crysis? I currently am running a 4850 xfire setup and have a sealed GTX on my desk. I have considered selling the GTX to go to dual 4870's but I am worried about xfire drivers. No matter what I try I cannot get Crysis to scale in xfire. Crysis, the Witcher and Quake Wars are all I'm really playing right now. Xfire works great with the other two, just not Crysis.

Sorry, I don't play Crysis and my copy has been sold. Nothing more to offer other than the obligatory "try the latest drivers", or settings (try following the settings that worked best for legitreview; they've done a good job on the 4870 X2).

OT: Sheganks, where do you live? :p

(HK Island or Kowloon side?)

Back on topic, that's some kickass personal review you've got there; it certainly sheds light on the possibilities of a 4870 XF setup.

People have reported that AoC does not use the ATI cards to their full potential; did you see those problems at any time? Or were they @ 100% most of the time (given what you've stated)?

HK side.

Hovering between 75% and 100%. I wouldn't be too concerned about what's reported in a tab of a beta driver. For all I care, the game could be using only 10% of the GTX 280 with no tools able to report that usage :)
 
Sorry, I don't play Crysis and my copy has been sold. Nothing more to offer other than the obligatory "try the latest drivers", or settings (try following the settings that worked best for legitreview; they've done a good job on the 4870 X2).



HK side.

Hovering between 75% and 100%. I wouldn't be too concerned about what's reported in a tab of a beta driver. For all I care, the game could be using only 10% of the GTX 280 with no tools able to report that usage :)

HK side here too :p

That's a very good utilization given it's a beta driver and whatnot. Thanks for the lengthy review; looking forward to seeing more of your reviews (and maybe more games with the 4870 XF if you can, haha).

Cheers
 