NVIDIA Fermi - GTX 470 - GTX 480 - GTX 480 SLI Review @ [H]

The blame doesn't go elsewhere. If your Thermalright HR-03 GT dropped max temps by 20°C+, then Nvidia should have entertained an option such as this, no matter how absurd that sounds. You had the good sense to do this. Hot and noisy is not an unfortunate side effect of poor ATX design; it's the result of Nvidia's choice to push the limits. The fan and the case the card resides in are design elements that HAVE to be factored into the equation.
I have a theory about this. My theory is that AMD and NVidia stay with the shroud/blower cooling setup because it vents hot air outside of the case. Of course, those of us with towers and good airflow don't mind whether the hot air goes out or stays in, but for SFF, workstation, and HPC environments where things are crowded because of small cases, cables, or other cards taking up other PCI-e slots, I don't think I would want an open cooler design.
 
I generally try to avoid being a dick, but in this case I can't help it. Where is the proof that on average the GTX480 will overclock as well? As far as I know, the only GTX480s in reviewers' hands are samples sent to them by nVidia. These are not retail samples that just anyone can get their hands on. This is one of the big reasons Kyle didn't give any overclocking results. Samples handed out to reviewers don't always represent the retail cards with regard to overclocking.

I'm sure Kyle did do some overclocking with the cards he had, but I don't remember him mentioning much, if anything, about overclocking. This isn't the first time he has waited to release actual results until he got his hands on retail cards, but he would normally say something about what he saw for overclocking. This could have been an oversight on his part, but it could also mean that the results he saw weren't very good and he's waiting for retail samples to arrive to see if they are any better.

I believe your thoughts regarding the overclockability of the GTX480 are nothing more than wishful thinking at this point. The fact that the cards are hot as hell and power hungry at stock speeds usually indicates that the cards are not going to be very good overclockers.
Settle down, Beavis. I was just going by a couple of reviews that overclocked the card by 17-18%. The point I was trying to make is that just because they run hot as hell obviously doesn't mean they still can't be overclocked. I don't give a crap if they were retail cards or not, but I was surprised myself to see any overclocking headroom.
 
I have three Dell 3007's that I will have to drive in Vision Surround.

Well, the thing that has sold me on the Nvidia line is the ability to run three monitors in SLI using the D-DVI ports on both cards. Maybe if I had newer 30s then this would not be an issue. ATI's choice not to support 3x D-DVI is the straw that broke the camel's back.

You are talking about monitors that were released in '05. Buying your video card for five-year-old tech does have its challenges, but I don't think a person should espouse DVI support versus DisplayPort technology.

I do see this as a potential mark against ATI. I would like to hear ATI address this issue. I'd like to hear why they may be unable to get Eyefinity to work with 3 DVI ports across two different cards.
 
No info about when the GTX 485 with 512 CUDA cores (or whatever its name will be) might come out?
I don't want to buy a GTX 480 and then have the 485 come out one month later...
 
Seriously, what is the point of asking about the 512-core version when the 480-core version is still not in stores and won't be for 2-3 weeks? If there is a "refresh", it will be in summer at the earliest; do not expect anything sooner. I will be surprised to see more than minimal availability of the 480-core cards at the "hard launch".
 
[H]'s review showed a single GTX 480 hitting 95°C, and that was in an open room with no case! From my experience with 100°C cards inside a closed case, all your other components will suffer from the extra heat as well. Watercooling is a must for 480 SLI setups, and even then, tons of extra heat is dumped into your loop.

The question is how the card(s) will do over 6, 12, 18, 24 months of lifetime. The French review I mentioned earlier in this thread saw (correcting myself) 110°C temperatures when running 480 SLI in a Sonata III case (made for silence, apparently), and 106°C running a single card.

I know the operational temperatures of CPUs and how much they can handle, but not GPUs. Kinda worried.
 
I have/had 8x 9800GX2s running at 95-100°C for more than a year now; only one has died (about 2 weeks ago).
The GT200s do not like power loss though; when they are under load they tend to die (as my GTX280 did, under Folding loads).

Three of my friends pre-ordered them while I was talking to them on Teamspeak; one of them was bonkers and got 2 of them.

I have a 9800GX2 in my system now (the other one died, so no Quad SLI). I'd love to have a single fast card again; the GTX280 I had was nice, as most of my games ran smoothly at high settings and AA. Instead of the GTX480 I could buy 2 GTX275s and get the same performance but no DX11, or I could go down the ATI route but suffer 2-5 months of driver problems when new games come out (1-minute load times for BC2, e.g.). If I can get it for £400 I'll most likely get one, but I will most likely NOT be folding on it due to heat + noise.

105°C is the thermal throttling range; the card cuts clocks back 50% (my 9800GX2s do).
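Roughly what that throttling logic looks like, as a toy Python simulation. The temperature model and the 95°C recovery point are my assumptions for illustration; real cards do this in firmware, and exact trip points vary by model:

```python
import random

THROTTLE_TEMP_C = 105   # trip point claimed above
RECOVER_TEMP_C = 95     # assumed hysteresis so clocks don't flap
THROTTLE_FACTOR = 0.5   # "cuts back 50%"

def simulated_gpu_temp(clock_fraction: float) -> float:
    """Toy model: full clocks run hot, throttled clocks cool off."""
    base = 100.0 if clock_fraction == 1.0 else 90.0
    return base + random.uniform(-2.0, 8.0)

clock_fraction = 1.0
throttled = False
for tick in range(20):
    temp = simulated_gpu_temp(clock_fraction)
    if not throttled and temp >= THROTTLE_TEMP_C:
        clock_fraction, throttled = THROTTLE_FACTOR, True   # cut clocks 50%
    elif throttled and temp <= RECOVER_TEMP_C:
        clock_fraction, throttled = 1.0, False              # restore clocks
    print(f"t={tick:2d} temp={temp:5.1f}C clocks={clock_fraction:.0%}")
```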
 
No info about when the GTX 485 with 512 CUDA cores (or whatever its name will be) might come out?
I don't want to buy a GTX 480 and then have the 485 come out one month later...

No, nor will there be for a while I suspect. nV doesn't want the rumor mill churning on this since it will drive down sales of the 480/470s.

You can, however, be assured they're working on one.
 
This was a good review. It's interesting to see that this generation of cards isn't challenged by the games being thrown at them.

I'm also not sure about GTX 480 SLI being the best you can do. Couldn't you run twin 5970s? Wouldn't that be a lot faster for the hardcore enthusiast who doesn't care about spending a shitload of money on a system?
 
It's interesting to see that this generation of cards isn't challenged by the games being thrown at them.

How did you determine that? I'm no expert, but looking at the benchmarks, the games weren't CPU limited, nor RAM limited. I don't think it's unreasonable to want a game to be played with maxed-out settings.

It would seem to me the cards are challenged. If games weren't "playable" on a brand new top of the line card, I would have reason to be concerned. There are new elements that will have a big impact on FPS. 3D and multimonitor gaming can both reduce framerates to a crawl. Both of these elements were introduced by video card manufacturers so I don't think it's unfair to expect a game to be playable under these conditions.

This leads back to nFinity. I refuse to even consider Nvidia cards until I see multimonitor gaming results. We already had to wait 6 months for the card; who's to say how long before nFinity actually materializes.
 
This leads back to nFinity. I refuse to even consider Nvidia cards until I see multimonitor gaming results. We already had to wait 6 months for the card; who's to say how long before nFinity actually materializes.


Yup, and even then... comes a whole new set of problems, I'm sure. I'm not even going to bother with this card for at least a year. Might be a whole different ball game by then, for the better.
 
SLI is not necessarily superior to a CrossFireX setup simply because this review states it is; the obvious fact is they did not use the proper dual-card configuration for comparison: two Radeon 5870 2GB "Eyefinity 6" cards (12 DisplayPort monitors between them)! The reviewed 5970 card has 2GB of combined framebuffer memory and only uses sixteen PCIe lanes on the motherboard.

GTX480 SLI: 3GB, 32x PCIe
Radeon 5970: 2GB, 16x PCIe (reviewed)
Radeon 5870 CFX: 4GB, 32x PCIe (more equal comparison)

With the extra 512MB of VRAM per GPU and (now) the same 32x bandwidth, it should match the MSAA settings at similar playability. The most important factor is 1080p playability at max settings at a certain price point, $100 cheaper. Everyone can deal with the inconvenience of heat, noise, and higher electric bills, if they must.

Most people play on HDTVs at 1920x1080 resolution. Can you max out antialiasing to 24X CFAA with a Radeon 5870 and still maintain playability in any game? If not, the FPS for the GTX480 at 32X CSAA (equivalent to 24X CFAA) must be significantly better to justify the price.

Metro 2033 has been optimized for nVidia, and AMD has only just now had a chance to see the source code, so obviously you can enable 4X MSAA at a higher resolution on nV at this early stage of AMD driver development for that particular game.

CSAA is a cheat antialiasing mode that looks worse than the same-numbered setting in MSAA mode; it is more like a 6X MSAA mode, which the Radeon doesn't support. So this review is not apples-to-apples for BFBC2: the nV card may be unfairly limited, but the 5870 is equal in FPS anyway.

If you don't use the latest patches and Catalyst 10.3b, it is also not a good comparison. In AvP they would be equal if you used the 10.3b drivers, possibly because 10.3a was only the first version to have that AvP-specific fix.

For "SSAA" transparency antialiasing, is nVidia's 8X TR mode the same as the AD for which there is only one setting from AMD? Well it is most certainly as good as 2X TR mode, and with that, the final showdown where each GPU board is on equal ground, and each vendor has had ample time to optimize their drivers for the oldest Dx11 game:

DiRT2 is where the Radeon 5870 kicks the GTX 480's sorry ass back to the drawing board, at over 15% HIGHER average FPS at TWICE the MSAA mode.

DiRT2 max settings (transparency antialiasing on)
GTX480 2560x1600 4X AA
RD5870 2560x1600 8X AA --read 'em and weep!

Guess the 1,120 extra shader cores in the Radeon make all the difference. The GTX480's so-called improved geometry engine just can't be verified because it can't be tested using current games.
 
If you don't use the latest patches and Catalyst 10.3b, it is also not a good comparison. In AvP they would be equal if you used the 10.3b drivers, possibly because 10.3a was only the first version to have that AvP-specific fix.

10.3b came out like a day before the NDA lifted on the GTX480.
 
Wow, the benchmarks showed that AMD really hit the mark.... months ago.

I know it's not a "benchmark", but some Folding@Home action would have been interesting to see as well. (This would be the one area where the Nvidia fanboys would be proud, I bet.)

These items aside, the power and sound videos are awesome. Who knew you could purchase a mini jet engine from Nvidia for so cheap? I expected the sound to start clipping with the SLI'd GTX480s at any moment - haha.

Though I don't like the Folding@home performance of the AMD cards, my upgrade from a 9800GTX+ will not be with the Nvidia "heat guns" you reviewed.
 
SLI is not necessarily superior to a CrossFireX setup simply because this review states it is; the obvious fact is they did not use the proper dual-card configuration for comparison: two Radeon 5870 2GB "Eyefinity 6" cards (12 DisplayPort monitors between them)! The reviewed 5970 card has 2GB of combined framebuffer memory and only uses sixteen PCIe lanes on the motherboard.

GTX480 SLI: 3GB, 32x PCIe
Radeon 5970: 2GB, 16x PCIe (reviewed)
Radeon 5870 CFX: 4GB, 32x PCIe (more equal comparison)

With the extra 512MB of VRAM per GPU and (now) the same 32x bandwidth, it should match the MSAA settings at similar playability. The most important factor is 1080p playability at max settings at a certain price point, $100 cheaper. Everyone can deal with the inconvenience of heat, noise, and higher electric bills, if they must.

Most people play on HDTVs at 1920x1080 resolution. Can you max out antialiasing to 24X CFAA with a Radeon 5870 and still maintain playability in any game? If not, the FPS for the GTX480 at 32X CSAA (equivalent to 24X CFAA) must be significantly better to justify the price.

Metro 2033 has been optimized for nVidia, and AMD has only just now had a chance to see the source code, so obviously you can enable 4X MSAA at a higher resolution on nV at this early stage of AMD driver development for that particular game.

CSAA is a cheat antialiasing mode that looks worse than the same-numbered setting in MSAA mode; it is more like a 6X MSAA mode, which the Radeon doesn't support. So this review is not apples-to-apples for BFBC2: the nV card may be unfairly limited, but the 5870 is equal in FPS anyway.

If you don't use the latest patches and Catalyst 10.3b, it is also not a good comparison. In AvP they would be equal if you used the 10.3b drivers, possibly because 10.3a was only the first version to have that AvP-specific fix.

For "SSAA" transparency antialiasing, is nVidia's 8X TR mode the same as the AD for which there is only one setting from AMD? Well it is most certainly as good as 2X TR mode, and with that, the final showdown where each GPU board is on equal ground, and each vendor has had ample time to optimize their drivers for the oldest Dx11 game:

DiRT2 is where the Radeon 5870 kicks the GTX 480's sorry ass back to the drawing board, at over 15% HIGHER average FPS at TWICE the MSAA mode.

DiRT2 max settings (transparency antialiasing on)
GTX480 2560x1600 4X AA
RD5870 2560x1600 8X AA --read 'em and weep!

Guess the 1,120 extra shader cores in the Radeon make all the difference. The GTX480's so-called improved geometry engine just can't be verified because it can't be tested using current games.

You're making a whole lot of assumptions and even more excuses here. Why is Dirt2 the most important game? Just because ATI did better? And I'm asking you this as an ATI fan much more so than an nVidia fan.
 
Remember when ATI were the ones with hot/power-hungry/loud video cards? Around the X1900XT era.

Look how things have changed...
 
As far as overclocking goes, isn't it possible Nvidia had to push the sample cards close to their max already just to gain the slight performance edge? The kind of heat and power usage they show could be an indication of that.
 
As far as overclocking goes, isn't it possible Nvidia had to push the sample cards close to their max already just to gain the slight performance edge? The kind of heat and power usage they show could be an indication of that.
Again, if you look at the reviews that overclocked the card, it overclocks just as much (percentage-wise) as the 5870. It doesn't seem logical based on how hot the stock card runs, but somehow it does OC pretty decently.
 
All I am saying is that DiRT2 is the only DX11 game that both vendors had equal time to optimize for, since it came out first. Therefore it is a more equal test than Metro 2033, which AMD has only just seen.

I don't necessarily understand the results in DiRT2 either, because reviews on other sites don't show that lead (due to different settings, I guess).

But those games are one-offs that may or may not be chosen by the vendor to optimize for. You need to base the comparison on more standard games; back in the Doom3 and Half-Life 2 days, you knew what the true performance difference was. These days the equivalents of those two games are Crysis and Stalker.

All I know is that every review consistently said it ties in Crysis and the GTX480 loses in Stalker CoP by a considerable amount, and that, coupled with the fact that it can lead to CPU throttling on an OC'd system in the height of summer, makes me question how anyone can justify that purchase:

Twice the load temps at BTech:

96°C GTX480
51°C RD5870

Enthusiasts probably don't care as much about the $100 extra cost plus the ~100W extra on the electric bill.
 
I've been sitting back reading the reviews and comparing. I have two 480s on preorder and am going back and forth on whether I should cancel.

When it comes down to it, there's really no card that fits what I'm looking for. I want a card that does the following:

  • Runs multi-monitor (Eyefinity, 3D Surround) games with high IQ without hitting the framebuffer limit (1GB+ of VRAM)
  • Does not require the use of DisplayPort; adapters are OK, but not preferred
  • Runs within reasonable sound and power thresholds
The 5870 E6 won't work for me due to ATI's insistence on requiring DisplayPort; the 480 is a power hog, hot and noisy. Essentially, I'm looking for a 2GB version of the 5870.

Here's my take on it.

You state the 5870 E6 won't do it for you because of ATI's insistence on DP. That's an understandable stance. It really is.

I would counter that statement, however, by pointing out that if you are seriously interested in HIGH image quality, you'll go for an IPS-panel monitor, which is generally a higher-end model that comes with a DP connector, instead of the lower-priced TN panels on models that do not have one. That pretty much moots the point of not getting a 5870 E6.
 
I have three Dell 3007's that I will have to drive in Vision Surround.

Well, the thing that has sold me on the Nvidia line is the ability to run three monitors in SLI using the D-DVI ports on both cards. Maybe if I had newer 30s then this would not be an issue. ATI's choice not to support 3x D-DVI is the straw that broke the camel's back.

I was looking at Eyefinity originally, but the additional cost of finding a DisplayPort converter/adapter, not to mention the problems some have been having, has basically sold me on Nvidia's offering (pending review of the drivers when available).

Well, I won't have to worry about heating the computer room next winter ;)

I mainly run auto racing sims, with the odd TPS game to fill the time in between practicing for online racing events.

GTX480 = $500+, plus more heat, a louder card, and a major increase in the power bill

compared to

HD5870 ($400) + DP adapter ($30) = $430

Hmm...tough choice.
 
GTX480 = $500+, plus more heat, a louder card, and a major increase in the power bill

compared to

HD5870 ($400) + DP adapter ($30) = $430

Hmm...tough choice.

If you were going to run 3 monitors with a 5870, you'd need more than a $30 passive adapter:


HD5870 ($400) + active DP adapter ($120) = $520
 
Wow, so I could make a mint on the box of 100 DP adapters I have sitting in a closet at work.

Go figure!
 
If you were going to run 3 monitors with a 5870, you'd need more than a $30 passive adapter:


HD5870 ($400) + active DP adapter ($120) = $520

If you want to run 3 monitors with a 480 you need to have ... 2 480s. They do not support more than 2 monitors per video card. So your price of ownership for 3D Surround is $1000 minimum, and that's if you trust Nvidia to stick to the $499 price point the video card is at. $520 or $1000. Hmmm.
 
No folding benchmark, no GPGPU benchmarks in general, no tessellation benchmark - the [H] may know hardware, but this is hardly a fair representation.

If you thought gamers were going to be blown away by Fermi, you haven't been paying attention to Nvidia's discussions about Fermi architecture.

I understand why Kyle doesn't run a F@H benchmark. Anand did in his review, though it is sort of buried. The GTX480 is about 3.5 times as fast at folding as the GTX285. If that's true, then you're talking well north of 20k PPD from a single card.
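Back-of-the-envelope math on that claim, as a quick Python sketch. The ~6,000 PPD GTX285 baseline is my assumption (actual PPD swings wildly by work unit); only the 3.5x figure comes from Anand's review:

```python
# Back-of-the-envelope folding estimate from the 3.5x claim above.
gtx285_ppd = 6_000      # assumed GTX285 points per day (varies by work unit)
fermi_speedup = 3.5     # folding speedup reported in Anand's review
gtx480_ppd = gtx285_ppd * fermi_speedup
print(f"Estimated GTX480 folding output: ~{gtx480_ppd:,.0f} PPD")
# -> ~21,000 PPD, i.e. "well north of 20k" if the baseline holds
```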
 
I understand why Kyle doesn't run a F@H benchmark.

Are you trying to say they're biased because they didn't run a F@H benchmark? Um, maybe if they did in all prior reviews, but they don't. They review based solely on gaming performance.
 
I understand why Kyle doesn't run a F@H benchmark. Anand did in his review, though it is sort of buried. The GTX480 is about 3.5 times as fast at folding as the GTX285. If that's true, then you're talking well north of 20k PPD from a single card.

Truthfully, I wouldn't put too much stock in that "sort of" benchmark. We have practically no details about what is really being run, and it's likely that the performance you see there will not come close to real folding performance. As of right now, there are many different work units out for GPU folding, and they can be all over the place for performance depending on the video card you are using to fold them. There are some work units on which an 8800GT will outperform something like the 285GTX, because the extra shaders on the 285GTX aren't used at all and its lower-clocked shaders don't process the work unit as fast.

Sure, there may end up being some work units specifically designed for the 470/480 architecture, but I doubt there will be very many of them since they probably wouldn't run worth a damn on previous cards. Also, keep in mind that the current F@H GPU client doesn't work with the Fermi architecture. Until the release of the GPU3 client, they will sit there idle with regard to folding.

The only thing I would take away from the benchmark Anand ran was the absolute best-case scenario for performance, which will not translate into real-world performance any time soon.
 
Here's my take on it.

You state the 5870 E6 won't do it for you because of ATI's insistence on DP. That's an understandable stance. It really is.

I would counter that statement, however, by stating that if you are seriously interested in HIGH Image Quality, you'll go for an IPS Panel monitor, which is generally a higher end model that come with a DP port connerctor, instead of the lower priced TN panels on models that do not have DP port connector. That pretty much moots the point of not getting a 5870 E6.

I own an IPS-based panel (e-IPS), the 2209WA from Dell, and it does not come with DP.
 
It looks like some games favor the GTX480 by 5-10 FPS on average, but the games that favor the Radeon average 15 FPS higher.

It depends on the bottleneck in the game; AA can be faster on the Radeon at lower resolutions due to:

1. The Radeon has 3.33x the shader units of the GTX480 (1600 vs. 480).

But there are two other hardware components of the GPU (besides vertex/compute units, which are not utilized in current games to an extent that any advantage nVidia has in this realm would be noticed; however, 20 AMD compute units is more than 16 GPCs(?), so nV has less parallelism to start with), and they are:

2. The Radeon has 33% more texture units than the GTX480 (80 vs. 60).

3. The Radeon has 33% fewer raster units (pixel pipelines) than the GTX480, which is why the GTX480 looks faster when you run it at 2560x1600 0xAA. But as soon as you crank up the FSAA (the 2-8X TrAA nV bug notwithstanding), that single-digit FPS lead due to the surplus rasterization (48 ROPs) is not worth the double-digit loss in every other game due to the deficit in shading/texturing (480 stream-processor cores); see the rough numbers sketched below.
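A quick sanity check of those ratios using the published unit counts for the two cards. Note the shader architectures differ (VLIW5 vs. scalar), so raw counts are not directly comparable; treat the ratios as rough:

```python
# Published unit counts: Radeon HD 5870 (Cypress) vs. GTX 480 (GF100 as shipped).
hd5870 = {"shaders": 1600, "tmus": 80, "rops": 32}
gtx480 = {"shaders": 480,  "tmus": 60, "rops": 48}

for unit in ("shaders", "tmus", "rops"):
    ratio = hd5870[unit] / gtx480[unit]
    print(f"{unit}: 5870 has {ratio:.2f}x the GTX480's count")

# shaders: 3.33x (1,120 more, the number cited earlier in the thread)
# tmus:    1.33x (33% more)
# rops:    0.67x (33% fewer)
```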

Are there any reviews that do 24X adaptive MSAA, temporal x3, at 1920x1080 and max settings? Because those are the settings I run all games at.
 
Again, if you look at the reviews that overclocked the card, it overclocks just as much (percentage-wise) as the 5870. It doesn't seem logical based on how hot the stock card runs, but somehow it does OC pretty decently.

Notice how [H] doesn't have OC results? Reviewers usually get cherry-picked cards. Kyle/Brent have said it already: they're waiting on retail cards before they do any OCing.
 
The findings are all over the place. Anand says it's great (but Anand is an Intel lapdog that would do anything to discredit or damage AMD), Guru3D says it's great (he's generally fair), HardOCP says it's not worth it (generally fair), and THG says it's not even close to worth it (they're generally fair, too).

All it really looks like is that the quality control at nVidia is garbage.
 
Here's my take on it.

You state the 5870 E6 won't do it for you because of ATI's insistence on DP. That's an understandable stance. It really is.

I would counter that statement, however, by pointing out that if you are seriously interested in HIGH image quality, you'll go for an IPS-panel monitor, which is generally a higher-end model that comes with a DP connector, instead of the lower-priced TN panels on models that do not have one. That pretty much moots the point of not getting a 5870 E6.


My S-IPS Dell 3007WFP-HC doesn't have DisplayPort, either.
 
Read the review but not 14 pages of comments. Man, that's hot and power-hungry. Waiting for the "GTX260 shrink" of this thing in 6-9 months or whenever.
 
HardOCP benchmarks made me go ATI after a long time owning Nvidia cards. The MSI Twin Frozr looks like a great deal, and I think it will be enough until the 6xxx series arrives or better revisions of Fermi (I doubt that's possible though; the power draw is through the roof).
 
I see all the talk about DisplayPort and monitors not supporting it. You know Dell does make a DisplayPort adapter. So lacking DP isn't the end of the world, nor does it mean that you can't take advantage of Eyefinity technology. Yes, you have to figure it into the cost, but that's not a massive price increase when you're already dropping serious cash on 30" LCDs.
 