NVIDIA Fermi - GTX 470 - GTX 480 - GTX 480 SLI Review @ [H]

I'm going to throw up the BS flag here with a little hesitation.

Your thoughts are noted. Sorry our content does not fit your specific needs. I hope you find what you want elsewhere.

I am the one here who has to PAY for all the content we produce. When and if I feel that non-gaming GPU applications are making a difference to our readers' purchasing decisions, we will likely add something like what you want. Until then I am not going to spend my money covering it.

Kyle, I've been reading your site for a very long time, but it is overly obvious at times where bias exists. It was almost difficult for me to read this review due to the obvious bias. At the end of the day, the GTX 480 is mostly faster, especially in SLI configuration. I understand and appreciate the drawbacks being brought forth (power, cost, etc.), but I much prefer reading an article that isn't LOOKING for problems.

Well, I guess if it is that obvious, you have no needs here as you will not be using our site as a resource.
 
Totally have to agree with Kyle; this is a hardware enthusiast site, not one for computer graphics artists. The point of some of these apps used in motherboard and CPU benchmarks is to show what the CPU can really do in terms of horsepower. Clearly, if you look at this review, it's one of the more unbiased reviews out there. Granted, the GTX 480 is the best single GPU out right now, but there are drawbacks (heat, power, price) compared to ATI's offering, which has been out for six months now.
 
I'm just surprised more people don't see the potential of the GTX 480. True, it has its drawbacks, but it seems clear that the architecture will be far superior in the future of gaming, assuming more DirectX 11 games come out. In all tessellation and DirectX 11 tests it seems far superior to the 5000 series. There just aren't many games built for that right now, just benchmarks like Heaven. Sadly, I'm not sure the potential of this card will be realized in the near term.

I do agree that with current games it isn't anything special, but the architecture does seem poised for the future.

Just a side note: not to play sides here, but being a software engineer as well as a gamer, the GPGPU possibilities with the Fermi architecture are staggering. While I agree that CUDA is proprietary, it is clearly superior in terms of usability and support.
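Just to make that usability point concrete, here's a minimal, illustrative sketch of the CUDA programming model — a toy vector add, nothing Fermi-specific, and entirely my own example rather than anything from the review:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // 1M elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers, plus copies in and out.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", hc[0]);           // expect 3.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The point isn't the arithmetic; it's that the thread scheduling and memory model are handled by a compiler and runtime, which is why GPGPU work on this architecture is so approachable.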
 


I am fairly sure this is exactly what our review pointed out. However, we cannot conclude reviews based on "What if?" scenarios.

In that regard, it seems like NVIDIA’s move to re-design its architecture might be ahead of its time. Of course ATI has had Tessellation for years and years, so it was way ahead of the curve as well. The GeForce GTX 480 and GTX 470 look to have long legs though, and will quite likely remain relevant for a longer period than the Radeon HD 5000 series, at least in terms of raw technology. Only time will tell on that one. At any rate, with the current performance experienced, the enormous power draw, and the cost, it just seems to us like this was a wasted move right now that has only hindered NVIDIA with delays and problems with the Fermi architecture. Right now, the payoff doesn’t seem to be present with this architecture.
 

I don't think the problem is not seeing what Fermi could do down the road. I think it is probably more accurate to say that none of what it can do means squat for gaming right now; more to the point, all that Fermi could do isn't being leveraged in games yet. I think Kyle is spot on in that the architecture itself will be one that remains relevant for quite some time. In other words, like G80, this is one that NVIDIA will be milking for a very long time. Yes, Fermi is the fastest single GPU, but the advantage it has over the 5870 hardly seems worth it given the trade-offs. Obviously your mileage may vary here, and there are some unknowns, such as multi-monitor gaming performance, which could factor in later. We may not know for some time how well Fermi stacks up in that regard.
 
Even if you don't agree with everything, you have to salute the entire staff here for continuing to be very proactive in these conversations. :)

One of many reasons why I look to this place first for information. :)


It's hard to speculate on any of this, but drivers can go some way toward helping these cards evolve. It's just a question of "how much" and "in which ways"?

That's where the GTX 480 could be in an interesting place in, say, 4-6 months. Just my speculative two cents.
 

I agree with you there. Nvidia has never disappointed me with their drivers, and even with their not-so-long-ago fiasco with the bad fan driver, I remember a time when ATI had that exact same issue, and they are still alive and well. I'm also sure some inventive companies will figure out a way to more effectively cool the beastly GTX 470/480 cards, as I also remember a time when ATI had extremely loud and hot cards of their own. The tables have turned, though, and ATI is now running the quieter cards with lower power consumption. Now if only ATI cards didn't make me think my Linux OS was broken.
 
I'd like to see this benchmark run on both NVIDIA and ATI cards before I make any solid conclusions.

Why? Do you play Heaven 2.0 all day?

Also, it seems a little fishy that this benchmark came out two days before Fermi released. I bet NVIDIA had a hand in this, and it most likely has NVIDIA optimizations. I am glad [H] didn't use synthetic benchmarks, because really, who plays benchmarks?
 
Apparently XFX wasn't very impressed with these cards; they've said they won't be producing them. That's got to be bad for NVIDIA, given XFX's growth in sales. I'm a BFG guy myself, but XFX does seem to be growing in popularity.
 
"Why? Do you play heaven 2.0 all day?"

No, because I believe that the extreme tesselation test it provides will be more indicative of things to come with parallel processing.

I believe (or want to believe) that NVidia leaned its lesson with benchmark cheating, but we know that the drivers for these cards will have evolve (and soon) in order to show what they can do, and, as well, other game manufacturers are going to have to play catch up in order to take advantage of these features.
 

XFX is indeed growing in popularity. They are going to be the major force which EVGA has to reckon with in the future. EVGA's choice to be an NVIDIA-only partner will come back to bite them in the ass, and XFX looks like it will be the one doing the chewing.
 

Wait a minute... I thought Nvidia was "trimming" its number of partners, and XFX was no longer going to be a partner??? :confused:

Or they were "punishing" XFX for selling ATI cards?

Dan, what's the real story?
 
"Why? Do you play heaven 2.0 all day?"

No, because I believe that the extreme tesselation test it provides will be more indicative of things to come with parallel processing.

I believe (or want to believe) that NVidia leaned its lesson with benchmark cheating, but we know that the drivers for these cards will have evolve (and soon) in order to show what they can do, and, as well, other game manufacturers are going to have to play catch up in order to take advantage of these features.

Remember that Fermi does its tessellation through shaders, so if the game is shader-heavy, its performance drops. Heaven looks great, but it's not real world.

http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/14

This is BFBC2 DX11 and the 5870 is beating the GTX 480.

Tessellation may be nice, but it isn't the whole picture. Devs will not implement the amount of tessellation shown in Heaven for another six months to a year, and by that time both companies will have new tech out and the GTX 480/ATI 5870 will be an afterthought.
 
Wait a minute... I thought Nvidia was "trimming" its number of partners, and XFX was no longer going to be a partner??? :confused:

Or they were "punishing" XFX for selling ATI cards?

Dan, what's the real story?

Yeah, I feel like we're grazing the tip of an iceberg here.
 

I have no idea. I just know that as a brand, XFX is growing in popularity. Brent or Kyle would be better able to answer that question I think.
 
What is also interesting is that Newegg lists the various soon-to-be-available Fermi cards, but there is no offering from BFG, who has been a staunch NV partner. I heard a rumor that BFG might soon come out with an ATI offering, so I wonder if that has anything to do with it.
 

That's an interesting observation. You're right. I didn't find any either.
 
That worries me as BFG has been my vendor of choice. That is mainly because of their service and lifetime warranty. Hmmmm
 

Well you can get the same thing from EVGA too. The physical cards are all reference designs that come from the same factories.
 
Right, the parts are similar but not the warranty, and as I buy and install some number of these in systems I build, that is important to me. Oh well, it will all become clear before I start buying the next-gen cards anyway.
 
I read NVIDIA bought a Heaven license. I am sure NVIDIA has paid some $$$ to them to make sure it runs better on their hardware, which is probably why there is a revision 2 of the benchmark; call it the NVIDIA revision, $$$. XFX probably doesn't want to be strapped with a lifetime warranty on a card that idles at 90C in some cases. They can see the bite in the arse coming down the road for these cards. If XFX says no thanks, that is a hint.
 

My 3870 liked to idle at 90C without the fan fix.
 
Unigine Heaven 2.0 has a known driver bug; a fix will be forthcoming: http://twitter.com/CatalystMaker/statuses/11049610266

The GTX 480 OCs to 810/1000, which gives it 48 GTexel/s, closer to the 71.6 GTexel/s my XFX HD 5870 does OC'd to 895/1300, but inside a case that GTX 480 will go 5C over the throttle temp.

Apparently there is also a bug with dual-monitor setups that uses more of the chip, IDLING at 90C.
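As a sanity check on those fill-rate figures: texel fill rate is just core clock times texture-unit count, and assuming the stock TMU counts (60 on the GTX 480, 80 on the HD 5870 — my numbers, not stated above), the math works out:

```latex
\begin{aligned}
\text{texel fill rate} &= f_{\text{core}} \times N_{\text{TMU}} \\
\text{GTX 480 OC'd:}\quad 810\ \text{MHz} \times 60 &\approx 48.6\ \text{GTexel/s} \\
\text{HD 5870 OC'd:}\quad 895\ \text{MHz} \times 80 &= 71.6\ \text{GTexel/s}
\end{aligned}
```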
 
Whether XFX decided on their own not to carry these cards, or it was a decision on Nvidia's part, I think in the end the one being "punished" will be Nvidia themselves. Not only are they losing a major outlet for their new cards, other companies like BFG may well decide to follow XFX to ATI's camp. I think the days of card manufacturers exclusively dealing in one brand are coming to an end.
 
I wonder whether NV may scrap the GF100 and use two GF104s in SLI on the same board, a la the GTX 295, to get 512 shaders? Given how well Fermi scales with SLI, I think that's also possible. There's no way in its current form I'm going to get a GTX 470 or 480.
 
"I think the days of card manufacturers exclusively dealing in one brand are coming to an end."

Tell that to Sapphire, HIS, and PowerColor. ATI vendors have no issue doing what EVGA and BFG have done for years (and no one says a word).
 
1) Sapphire doesn't need to make NVIDIA cards; owners of Sapphire have the Zotac brand for that.
2) According to Wikipedia - "PowerColor is a licensed producer of ATI Radeon video cards, but does also produce NVIDIA video cards under the Zogis brand name".

So from your list only HIS is not making NVIDIA cards in some form.
 
Kyle, sorry if this has been replied to already somewhere but were the fans running at 100% when you ran Furmark? If not, I wonder what the temperatures are if you force it to 100%. My Stacker 830 is a decent case but I don't know if the 4 fans in the side door are going to make much of a difference for 480GTX SLI!
 
Yes, that was scaling them up to full load from idle, so you got to hear from start to finish.

Working on this right now, but I have not been able to get them to load like that in a game. Still very loud though. Got a video coming.
 
Thanks Kyle. My single GTX 295 topped out at 82 degrees after I let Furmark run for about 20 minutes in extreme burning mode. From what I have seen around the web that is a few degrees lower than what review sites had for it. That is making me feel a little better that my case is up to the task.

I will be watching for your further testing results with great interest. Keep up the good work!
 
I haven't read the entire thread, but where can I find some tri-sli 480gtx vs quad 295 or tri 285?
 

I don't think you'll see that until after retail cards become available. It isn't as if NVIDIA offers review sites as many cards as they want.
 
We had some cards on pre-order with one of the large wholesalers, and the cards (EVGA etc.) were supposed to get to them on April 2nd; then today they found out there are more delays and it will be more like April 8th. So I'm wondering exactly how much scrambling is going on to actually get working retail cards shipped out?
 
I don't know but the article here pointed to retail availability beginning on April 12th.
 
"I don't think you'll see that until after retail cards become available. It isn't as if NVIDIA offers review sites as many cards as they want."

Thanks Dan, I wondered if this was the reason behind it.
 
Apparently there is a bug in Metro 2033 DX11 Very High with Catalyst 10.3b (and earlier) if VSync is off.

That may be the reason that, with no PhysX, 4xAA, and 16xAF, the HD 5870 gets 3fps and the GTX 480 gets 21fps, since they all seem to have VSync off for those benchmarks.

With AAA, performance at the same 1920x1200 resolution is more commensurate with other game benchmarks, where it usually almost ties the GTX 480; but HardOCP's review states that 4xAA was not playable, and I am sure they had VSync off for benchmarking!

At that same resolution with 4xAA but on the DX10 codepath (missing tessellation and DoF, I guess), it gets 26fps.

Is that really the defining benchmark, where you literally CANNOT play at 4xAA without a GeForce, or is it the VSync bug?

"I know, it sounds bonkers - but apparently the 'logic' runs thus: The game, at the moment, is running as if you have 3D glasses enabled, so it is actually drawing two versions of the main image all the time and then rejecting one when it sees you haven't actually got 3D enabled. Forcing Vsync on stops it doing this so - while technically you do get a performance hit from Vsync being on, you get a far greater bonus because of the non-drawing of the phantom second image."

If this is truly the case, then it sounds like a conspiracy, since this is the only game where every review has basically trashed the HD 5870; even in Heaven 2.0 it at least seems competitive, with the GTX 480 only 1.6X faster, but 7X is ridiculous!!

Scheme: add a "3D Vision enhancement" gone wrong. When the code was given to NVIDIA to "optimize" with PhysX, they slipped an intentional bug into the physics renderer that enables 3D Vision quad-buffering(?) when VSync is off (it may be specific to DX11 4xAA), because they knew reviewers would use that setting and it would cripple all but the GTX 480, maybe only because of the better 3D Vision frame-rejection optimizations in the 197.17 ForceWare driver.
 