My 4870 has arrived: Picture, overclocking/benching soon!

How are your fps levels at 1680 with 4x AA lower than the ones from Tom's Hardware's review?

You can't really compare across systems, settings (the page doesn't even specify them), etc., and to a cruddy review site at that ;). I can't tell what AF level they used in that review, if any (Tom's isn't the most reputable site in any case), but in my tests for that comparison I was running 4x AA with transparency AA (which they don't specify), and 16x AF.
 
You can't really compare across systems, settings, etc. (snip)

For 1680 res they were running with 4x AA / 16x AF. Their rig is here. Is yours faster or slower than theirs?
 
Those results seem weird... What happens if you lower the CPU speed to stock? A guy in another thread gets a massive performance decrease when overclocking the card.
 
For 1680 res they were running with 4x AA / 16x AF. Their rig is here. Is yours faster or slower than theirs?

My specs are listed several times earlier in the thread... again, refer to my last post for why the comparison doesn't work. They also likely used the default fly-by bench, whereas I used assault_harbor, which is much more strenuous on the system.
 
Frostex.

Nice system you are assembling. Although you don't need me to mention this, a 40% fan speed makes overclocking with CCC almost a no-brainer.
 
Hey GoldenTiger, who'd you buy your cards from? I know it was eBay; I need the name for purchasing.
 
Hey GoldenTiger, who'd you buy your cards from? I know it was eBay; I need the name for purchasing.

I purchased from seller "x1387", which is a Hong Kong one. Unless there's some specific reason you want it from them (they do have a listing up at the moment for the 4870s), you're probably best off, for ease and possibly price, buying from someone on this side of the ocean :). Their customer service (via email) and shipping speed were great, though. If I were looking for another product that was about to be released or slightly pre-release, I'd order from them again. The package did include an invoice that could be used for warranty etc.
 
I saw them...wasn't sure that was the same dealer. Thanks a bunch.
 
Well, I had the chance to pick up a 4870 for around $290 locally, and of course, I jumped right on it! It should be a substantial upgrade from my 8800GTS 640, and it cost about HALF of what I paid for that nVidia card only two short years ago.

I noticed a severe lack of Oblivion testing with these new cards from both camps, so I decided to jump on that grenade and get some results out.

First Impressions:

The card itself is gorgeous and extremely heavy. The packaging was quite nice as well... just enough to do its job without seeming bloated. The HIS bundle was kinda cool, and I really dug that screwdriver combo; it let me install the new card in the dark!

Bootup

The very first thing I noticed was the noise. At startup, this card's fan produces one hell of a whir! Luckily, by the Windows XP loading screen it had quieted down to an almost inaudible level... I took some quick exhaust temperatures of the card at idle and recorded between 59C and 64C... so this card does push out quite a bit of heat.

Also, I tried unplugging the 6-pin power connectors just to see what the card would do. I was pleasantly surprised when the card booted up and displayed a warning right after POST to connect the power cables... no high-pitched squeal a la nVidia. Quite elegant, IMO.

Drivers... Now, if you know me, you know I hate cables in my case. You might also know that, to avoid extra cabling mess, I don't have an internal optical drive... Unfortunately, my external DVD burner decided to take a dump on my installation plans. No problem, I'll just boot up, get to ATI's website, and download the drivers! Well, there are no drivers on ATI's site yet... so I navigated to HIS's website and eventually found the drivers, which, at 177MB, downloaded at a painstaking 17kbps. Killer.
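For perspective on that download, here's a quick back-of-the-envelope sketch (assuming "17kbps" here means roughly 17 KB/s, which the post doesn't make explicit):

```python
size_mb = 177     # driver package size quoted above
rate_kb_s = 17    # download rate, assumed to be KB/s

seconds = size_mb * 1024 / rate_kb_s
print(f"~{seconds / 3600:.1f} hours")  # ~3.0 hours
```

Roughly three hours for a driver download. Killer indeed.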

I'm going to take the time to describe how difficult it was to browse the internet without the driver installed. Using Windows' default VGA driver, screen rendering was *unbearably* slow. I was literally counting fps on screen refreshes, even at 1024x768... It would take about 15 seconds for the Firefox main page to render after the window was already open... Compared to my old 8800GTS's default VGA-driver rendering (which is at least 300% faster than this 4870's), I was quite disappointed. I would not recommend this card for a computer without a CD-ROM drive, and don't you dare lose your driver disc!

Oblivion

Now, as I said earlier, this post was mainly supposed to be a review of Oblivion's performance with the 4870... It's my favorite game, and I'm always screwing around with its configs, add-ons and settings to provide the highest level of immersion... But with my 8800GTS, I wasn't able to get good frame rates with more than 2x AA at the settings I had chosen. This was my main push to upgrade!

The settings I play with are as follows:
Resolution: 1680x1050
Textures: Large
Tree, Actor, Item, Object, Grass "Fades" and View distance: 100%
Distant land, buildings, trees: ON
Interior and Exterior shadows: 100%
Self Shadows: OFF (caused strange glitches for me... so...)
Shadows on Grass, Tree Canopy shadows: ON
Shadow Filtering: High
Specular Distance: 100%
HDR: ON
Water, Reflections, Ripples: High or ON
(nVidia) AA: 2x (forced in Control Panel)
(nVidia) AF: 8x (Forced in Control Panel)
(nVidia) Texture Filtering: Highest Quality


(ATI) AA: 4x Edge Detect (Forced in Control Panel)
(ATI) AF: 16x (Forced in Control Panel)


Additionally, I had the following Mods installed:
Qarl's TP3
Better LOD textures
Natural Environments
HDR Weather Overhaul
Exnem's, Ren's and a bunch of race/item mods

I then tried to find the most graphically intensive spot I could (with relative ease) to take down some numbers while standing, running around, and fighting (using the console to spawn some enemies). I settled on the following area:

[INSERTPIC] (I'll post it in a little bit...)

With the 8800GTS I was scoring between 24 and 34 fps, and it would dip down to 21 fps during some battles with 1-3 enemies at once... Sorry, I have no screenshots to prove this.

With the 4870 I was scoring between 38 and 60 fps... with higher AA and AF settings... As you can see, this is quite a big freaking difference... We're talking roughly a 60-75% increase in raw fps, with higher settings to boot. I was very pleased.
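For reference, here's a quick sketch of what the raw percentage gains implied by those two fps ranges work out to:

```python
# fps ranges quoted above: 8800GTS vs. 4870
gts_min, gts_max = 24, 34
hd_min, hd_max = 38, 60

low_gain = (hd_min - gts_min) / gts_min * 100
high_gain = (hd_max - gts_max) / gts_max * 100

print(f"Low end:  +{low_gain:.0f}%")   # +58%
print(f"High end: +{high_gain:.0f}%")  # +76%
```

And that's before accounting for the bump from 2x AA / 8x AF to 4x edge-detect AA / 16x AF.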

[INSERTPIC] (again, gimme time).

There were a few issues, though, which I suspect will be resolved when I reinstall Windows later this summer (I'm overhauling my case for the *hopefully* last time): mainly a longer-than-normal hitch every time new textures had to be loaded, and some strange loading times on application startups...

It might be of interest to someone that the card does squeal, just like the 8800s... It emits a high, variable squeal that changes pitch based on GPU load... The squeal is much less noticeable than on the old 8800GTS I had, but it's still the loudest noise in my entire system (yes, over the stock 4870 fan).

Overclocking:

I'm an overclocker and a modder at heart, and therefore it was almost sinful to leave this card at its stock clocks... So, I fired up the Catalyst Control Center once again to see what I could do for myself...

I topped out at two hours gaming-stable at 790/1070... which is quite a nice OC over the stock 750/950 settings! Going from 750 to 790MHz netted me about a 5fps difference with the above settings in Oblivion, and 2fps in Crysis (although I don't care enough about Crysis to include its numbers in this review). Going from 950MHz QDR RAM to 1070 netted me another 2fps in certain situations. The maximum temperature attained without modifying any of the fan controls was 87C on the core (wow!), but the card was fairly inaudible during play. I measured the exhaust temperature with a handheld IR thermometer at 71C peak (wow again!).
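For perspective, the percentage bumps from that overclock are fairly modest; a quick sketch of the arithmetic, using the clocks quoted above:

```python
# Stock 750/950, overclocked 790/1070, per the post above.
stock_core, oc_core = 750, 790
stock_mem, oc_mem = 950, 1070

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100

print(f"Core overclock:   +{core_gain:.1f}%")  # +5.3%
print(f"Memory overclock: +{mem_gain:.1f}%")   # +12.6%
```

A ~5% core bump for ~5fps in Oblivion suggests the game was still GPU-bound at these settings.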
 
time to change your sig now ;)

Good information. I'm thinking about this for my secondary rig.
 
Well, I had the chance to pick up a 4870 for around $290 locally, and of course, I jumped right on it!

(snip)

The Crysis numbers would be interesting to me...
 
No card does well with Crysis, and we would all be better off if everyone stopped using it as a yardstick for any hardware. I know the first argument is that it was just ahead of its time, but this is the second Crytek engine in a row that runs like ass on current and upcoming hardware, so shouldn't we all see a pattern here? When every other game gets smoked by new hardware but one game runs like ass (or finally becomes barely playable), that game is a poor benchmark.
 
No comments on my wiring job by the way. :p My main goal was to keep the wiring away from the motherboard. No case window anyway. :)

case2cf9.jpg

I can't help myself: I've never seen someone with a modular PSU have such a messy case ;)

Awesome card though, ordered mine last night.
 
No card does well with Crysis and we would all be better off if everyone stopped using it as a yardstick for any hardware. (snip)


The truth is...it's the exact opposite. It's become the only benchmark that matters.

And I don't know where people get the misconception that the HD 4870 doesn't do well in Crysis.


It does very well, in fact. Better than any SLI I've ever had. I think a lot of people assume that because a GX2 will put out higher raw framerates than a GTX 280 or HD 4870, these new cards don't run Crysis well. Nothing could be further from the truth. I've run 9800 GTX SLI, and the HD 4870 blows it away when it comes to minimum framerates and smooth, even gameplay. SLI might run 120 fps in a corridor or looking up at the sky, bumping up average frames, but in open spaces with mountains and foliage, I've never had a better experience in Crysis.

My only dilemma is whether the GTX 260 performs as well, and when the next price drop is going to take effect on it.
 
The truth is...it's the exact opposite. It's become the only benchmark that matters.

Wishful thinking; it's just obsessiveness when even Crytek says the game is poorly optimized and that they made some mistakes. They have rewritten huge chunks of the engine for Warhead because they know it is broken. It is a measurement of nothing. As software history has shown, you can throw hardware at bad code and it usually gets noticeably better; then Crytek rolls in and shows they can deliver code so shoddy that the rule of thumb no longer applies.

I enjoyed parts of Crysis, but it is no measure of a system, any more than 3DMark has been for the last couple of versions.
 
No card does well with Crysis and we would all be better off if everyone stopped using it as a yardstick for any hardware. (snip)

You see, that was then. The 280s can do well at resolutions up to 1920x1200 on High to Very High. So for the average 22-incher, you're doing just fine; hence the need for benchmarks from readers who actually own this card.

I'm hearing a lot about "driver issues" with the game, or "bad stuttering" causing lockups or artifacts. So maybe the reason people aren't showing this game off is the simple fact that it ain't performing too well, period. It's not a bad thing; it just points to one area of optimization that isn't there for ATI yet. Will it be soon? Who knows.

We all know Crysis was ahead of the game, or very unoptimized for its time. We are, however, catching up. That is how we're benchmarking our hardware: as a point of reference, it DOES give us a good idea of where we are or will be. What are you all going to say when the 4870X2 eats Crysis up, or at least fares better? You'll be sitting just like some nVidia owners are.

So Crysis is a relevant benchmark, even if you think it sucks (like me, for gaming, but that's just IMO :)), and I don't think there's one hardware site that would dismiss any card that can play the damn game at decent FPS.
 
My specs are listed several times earlier in the thread... (snip)

OK, found them. How does your overclocked 4870 stack up against the 280? I seem to recall you had a 280 before you sold it off to get the 4870, right?
 
There are two annoying things about Crysis performance:

1. There's a huge amount of loading from disk and memory thrashing, especially once the settings are all at High or above. The problem is worse when using DX10. Perhaps having more than 512 MB of video RAM helps?

GoldenTiger said he got 25% more fps in Crysis with a higher overclock, but in reality there is a ton of disk loading in the GPU benchmark, especially with 4xAA, and this can really throw the benchmark results off. For reference, I've gotten 24.7 fps before, using the same settings with the 4870 @ 790/1100. I've also gotten as low as 22 fps due to random hitching, and this is not counting the first run.

2. All the different performance bottlenecks. It's not so simple in Crysis -- it leaves you scratching your head as to why performance isn't better. A typical overclock usually won't improve the playable settings in Crysis. Remember, [H] tested an 800MHz 8800GTS 512 and it didn't do better than a regular one.

Below are some of my results, using the built-in CPU benchmark, at both underclocked and overclocked speeds. The CPU benchmark has far less memory thrashing than the GPU benchmark, and while the explosions and physics are different every time, any real differences in frame rates will still show up.

1680x1050, DX10 High
-----------------------
650/1075: 31.8 fps
675/1075: 33.0 fps
700/1075: 34.5 fps
725/1075: 35.1 fps
750/1075: 36.3 fps
775/1075: 37.2 fps
800/1075: 37.2 fps
825/1075: 37.1 fps

So apparently we've hit a bottleneck at 775 MHz already. If we raise the CPU speed from 3.2 to 3.5 GHz, we gain very little, and there's still no difference between 775 and 825 MHz. It's shocking to see this lack of scaling below 40 fps in any game.

775/1075: 37.4 fps
825/1075: 37.4 fps

Now, if we raise the resolution to 1920x1200, we see some scaling. Also, performance goes down a bit if we set the memory clock back down to 900 MHz.

775/1075: 31.6 fps
825/1075: 33.5 fps
825/900: 32.8 fps

We see good scaling if we drop the resolution down to 1280x1024 and set everything to Very High.

750/1075: 30.2 fps
775/1075: 31.3 fps
800/1075: 32.2 fps
825/1075: 33.1 fps
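To make the clock-scaling plateau easier to see, here's a quick sketch that computes the fps gained per 25 MHz core step from the 1680x1050 DX10 High numbers above:

```python
# (core clock MHz, fps) pairs from the 1680x1050 DX10 High run above.
results = [(650, 31.8), (675, 33.0), (700, 34.5), (725, 35.1),
           (750, 36.3), (775, 37.2), (800, 37.2), (825, 37.1)]

for (c0, f0), (c1, f1) in zip(results, results[1:]):
    print(f"{c0}->{c1} MHz: {f1 - f0:+.1f} fps")

# The per-step gain drops to ~0 past 775 MHz: at this resolution and
# quality level, something other than the core clock is the bottleneck.
```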
 
HERE ARE MY RESULTS.

crysisdx10chartwv5.jpg


DX 9

1280X720 = 62.185 FPS
1680X1050 = 47.97 FPS
1920X1200 = 40.25 FPS

(SORRY DID NOT FEEL LIKE MAKING ANOTHER CHART, LOL)

And in XP under DX9 I get around 3-4 fps more. I have taken pictures; if people don't trust me I can post them, but I'm too tired to put all of them up right now.
 
I am doing Lost Planet with different settings right now: HDR on medium-high, motion blur on high or no motion blur, with 4x AA at 1920x1200. Done with DX10 so far, and testing DX9 right now; will post some results later.
 
Frostex.

Nice system you are assembling. Although you do not need me to mention this, 40% fan makes overclocking with CCC almost a no brainer.

Thanks :)

I've updated that link with piccies of the cards installed; they're cramping my Asus Xonar soundcard, which is taking a pummeling from the heat but so far seems to be stable.

http://www.flickr.com/photos/28160652@N04/

Fan speeds were at 40% for both cards. I tried 60% for both last night because I was getting bad errors in 3DMark Vantage, and jesus fucking christ it's loud. I'm not putting them bad boys at 100% because quite frankly I'm scared to; I have mental images of fan blades snapping and flying out of the case :D Anyway, it turns out 35% is a real sweet spot: a degree or so more of heat and what seems like a lot less noise.

I think some overclocking is in order tonight :)

BTW, has anyone else had problems with 3DMark Vantage? It looks like in the first test the height of the water isn't being correctly calculated; it's mostly shooting to some stupid height, which causes all sorts of funky water-flickering effects all over the screen.
 
There are two annoying things about Crysis performance:

(snip)


Thanks for the info... I do prefer the GPU bench, though, for comparing video cards.

I don't notice all this hitching you speak of... the first run is a few frames off, then it very slowly rises; less than a 1 fps gain after the 2nd run.
 
You see, that was then.

(snip)


Crysis just exposes system flaws better than other games do... any bottlenecks will be revealed.

Poor coding... dunno, I don't write game code... and neither do many of you!
 
Crysis just exposes system flaws better than other games do... (snip)

Bonus marks for me then! I have written GDI and DX9 stuff before! I didn't like it, though; I almost failed that course. :p
 
OK, found them. How does your overclocked 4870 stack up against the 280? I seem to recall you had a 280 before you sold it off to get the 4870, right?

Yes, I had a GTX 280 before this card, and it was overclocked pretty heavily as well. The 4870 matches or beats it in everything I've tested/played: GRID especially is faster, TF2 is a little faster as far as minimum framerate goes, and the other games are pretty much the same (Crysis, WiC, etc.). I'm very glad I made a hundred off the resale of the 280, and then saved another two hundred off the price I'd paid for it... because in reality, I'm finding no tangible negative gaming difference, if any exists.
 
There are two annoying things about Crysis performance:

(snip)

Excellent post, and you brought up a lot of good points. I'm noticing in Crysis benching on assault_harbor that sometimes it goes very quickly (high framerate) right off the bat, and sometimes it's about the middle of the timedemo before it reaches what it should, running about 10fps lower until it picks up... I can watch it, and suddenly in the same spot it magically picks up a lot of frames and keeps going from that point. I haven't noticed this in actual gameplay yet, though.
 
Thanks :)

(snip)

Taking a look at your system, can you put the Xonar in the black PCI-E x1 slot above the top HD 4870? If you can, it would certainly save it from the heat.
 
Wait, wait, where did you get an Antec 1200?
Are those things out already?
 