NVIDIA Fermi - GTX 470 - GTX 480 - GTX 480 SLI Review @ [H]

ATI 3D would be in their next gen of cards. I'm sure 3D is not going to make or break gamers' decisions when it comes to multi-monitor gaming. It's only a 1-up feature IMO. Fermi idles @ 90C when more than one monitor is connected to one card, which means 2x90C @ idle from your 3D SLI setup. Enjoy.

It's also interesting that Fermi has to use so much energy to prevent multi-monitor screen flickering, hence the 90C idle temp.
Seems they should have gone with DisplayPort as well.
 
Well, a number of technologies that both have are still niche or early-adoption features, with few users and little support.
 
None of this surprises me.

Nvidia was hyping itself more than outside sources were hyping it.

When you toot your own horn and nobody is there to fill in the notes in your sketchy song, it kinda leads to suspicion.

Grats to Nvidia for bringing out a card that can compete, but not overwhelm ATI.

Grats to ATI for bringing out a nice leap over the 4xxx series cards, and one whose price-to-performance ratio is really excellent.

Both are winners compared to previous generations, but only 1 is the best, and for the time being that seems to be ATI.
 
I use LED bulbs that use only 1.5W each, so I don't need to cut my power. I do my bit every day. Where I live we use hydro power anyway, so there's no negative impact on the environment.

BUT WHAT IF 1 BILLION PEOPLE TURNED OFF THEIR 1.5 WATT BULBS?!?!
 
Or made it illegal to use your computer except for work purposes. Think how cool the planet would be... :D
 
Actually, I think fps-wise it's where it should be. But damn, the 480 SLI config sounds like having a blender permanently running next to you.

In addition, I'm pretty worried about the heat. I lost my 8800 GTX because my case wasn't ventilated enough, and that card drew significantly less power.
 
It strikes me that this has to be a critical point in time for NVIDIA.

All of the manufacturing and performance factors with the GF100 have surely caused more than a few in the company to take a 2nd look at whether or not their design philosophy is sustainable.

A changeable market, the need for short time-to-market windows, and the power/heat/noise factors combine in a way that NVIDIA's approach seems to be somewhat self-defeating.

Given the R & D costs for each generation of GPU, the fact that the GF100 is not a 5K killer has got to give the company leadership pause.

I cannot believe that NVIDIA was ignorant of how the 480/470 stacks up against the ATI 5K series prior to releasing the cards for testing by the various web sites.

These issues are nothing new, but when the current economic climate is factored in, it makes it difficult for me to see how NVIDIA can continue down this path and remain viable.

Since a company's bottom-line purpose is to make a decent profit on their product, it makes me wonder if they are going to completely shift their focus and concentrate on industrial or commercial applications.

This ^^... basically the point I was making.

I can't imagine anyone in their right mind investing in Fermi, given that its performance is only about equal to the 5870's, and it gets there at the expense of being a ridiculously power-hungry blast furnace.

Since there's no way Nvidia is going to start from scratch with a new GPU any time soon, they've basically just driven themselves out of the gaming GPU market, as far as I see it, because there's simply no advantage, and honestly, too many disadvantages, to Fermi.

There's simply no sane or viable reason to upgrade from any generation of GPU to Fermi rather than ATi's 5870 for a single-GPU solution.

So, it makes me wonder... with all the capital lost to R&D, the mistakes made, and the release of a sub-par GPU, where does Nvidia go from here? I just don't foresee anything coming from them in the near future, with all the losses they're going to suffer, that would really even keep them in the market. I just can't imagine anyone investing in Fermi.

Well, ATi has done an incredible job with this gen, which I feel will be around for quite a while, and they're really on top of releasing drivers/updates/fixes to get things to run as smoothly as possible, really caring about the gamer's experience, which is more than I can say for Nvidia, even though I've been "with" Nvidia for years.

Back to ATi for me, and if Nvidia does end up out of the consumer gaming GPU market, then perhaps competition will come from another manufacturer, leaving room to bring in some new blood. Though at this point, with ATi's latest, I don't even feel competition is necessary, unless they start gouging prices (which I don't feel ATi is going to do).
 

There will always be people who will buy nVidia no matter what, as well as consumers who don't care or don't know about the advantages/disadvantages of a Fermi card (I bought an FX 5600 back in the day because I did not know anything about the FX series). However, I do not think it is as bad as people think it is.

I'm sure nVidia knew well in advance that they would have issues and created a game plan for what they will do and where they will go next. I'm no engineer, but my guess is that they spent a lot of money on the Fermi architecture and the only logical step is to refine it to bump up its efficiency, which seems to be its major thorn.
 
Jeebus dude, it's only one generation of GPU tech. You sound exactly like all the nutters banging on about how the end is new for AMD the past 5 years.
 

As much as the FX series is crap, it is also one of the best-selling video card families. You can still find them on B&M shelves and people are still buying them. The majority of people buy video cards in the $50-$150 range and nVidia has many available in that range. Just because we here at [H] are underwhelmed by nVidia lately doesn't mean it's the same out there. Just go check out the video card department at Best Buy or Fry's and look at what people are buying. Yes, nVidia probably made a big mistake with Fermi, but in no way are they in trouble because of it.
 
Jeebus dude, it's only one generation of GPU tech. You sound exactly like all the nutters banging on about how the end is new for AMD the past 5 years.

The End is New!! The End is New!
Repent, for the End is New!!

Sorry... couldn't resist.....:D
 
What's up with this quote from the review?

This is the first reference video card to come with heatpipes as the stock cooling configuration. The fact that heatpipes are needed does not bode well for the type of thermals this video card produces.

That's just plain wrong, just like the quote about AMD being the first to offer CrossFire profile updates separate from the driver.

Almost all cards have heatpipes... they usually just don't stick out from the plastic cover. Have you guys ever disassembled a card? Look at the G80... even it has heatpipes.
 
I have three Dell 3007's that I will have to drive in Vision Surround.

Well, the thing that has sold me on the Nvidia line is the ability to run three monitors in SLI using the D-DVI ports on both cards. Maybe if I had newer 30s this would not be an issue. ATI's choice not to allow 3x D-DVI is the straw that broke the camel's back.

I was looking at Eyefinity originally, but the additional cost of finding a DisplayPort converter/adapter, not to mention the problems some have been having, has basically sold me on Nvidia's offering (pending review of the drivers when available).

Well I won't have to worry about heating the computer room next winter;)

I mainly run auto racing sims, with the odd TPS game to fill the time in between practicing for online racing events.
 

Didn't some site mention that dual monitors on a GTX 480 cause a 90C idle temp?

That sounds pretty nuts to me :confused:
 
Good review, covers what needed to be known, sticking to what is important.

I have to say I'm a bit curious, 3.2 BILLION transistors and not accomplishing much more than the 2.2 BILLION transistors of the 5870. I have a feeling there is another shoe to drop at some point. Either that or NVidia seems to have wasted a lot of silicon without producing a result.
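Rough back-of-the-envelope on that gap (a couple of lines of Python, using nothing but the two transistor counts above):

# GF100 vs. Cypress transistor budgets, from the counts quoted above
gf100 = 3.2e9      # GTX 480 (GF100)
cypress = 2.2e9    # Radeon 5870 (Cypress)
print(f"{(gf100 / cypress - 1) * 100:.0f}% more transistors")  # ~45% more, for roughly equal gaming results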

On the power usage... well like DUH. How does anyone expect the computing power to keep increasing and magically it doesn't take any more juice? Pretty retarded thinking.

Guess what, the next generation with even more compute power will.... have even higher power consumption.

On the temperatures.... it is not the fault of either the GTX480 or the 5870 that they get hot and make noise. It is the fault of the ATX case design, a flawed and pathetic result of morons and idiots with no engineering expertise designing cases.... which then become de facto standards because the industry as a whole pathetically lacks vision.

I won't use the stock coolers on either the 5870 or the GTX480. Hell, I tossed the cooler on my 8800GT from day one. Using the Thermalright HR-03GT... and dropping my max temps 20C+.

I think all the hand-wringing over the heat and noise is a waste of electrons.
 
On the power usage... well like DUH. How does anyone expect the computing power to keep increasing and magically it doesn't take any more juice? Pretty retarded thinking.

Guess what, the next generation with even more compute power will.... have even higher power consumption.

Radeon 5800 would like a word with you.
 
I find that hard to believe, that two monitors just sitting there doing nothing would cause an idle temp of 90c??:confused::confused:

Reviewers have already stated that to be fact, and Nvidia confirmed it. They have to keep the idle clocks high when doing multi-monitor to avoid flicker.
 
I find that hard to believe, that two monitors just sitting there doing nothing would cause an idle temp of 90c??:confused::confused:

He has to mean load, but I would recommend you get a window unit for the room you run it in... or two.

Reviewers have already stated that to be fact, and Nvidia confirmed it. They have to keep the idle clocks high when doing multi-monitor to avoid flicker.

Oh that's fucking awesome. THE GOOD NEWS KEEP AROLLIN IN.

Here it is:

http://www.legitreviews.com/article/1258/15/

"We are currently keeping memory clock high to avoid some screen flicker when changing power states, so for now we are running higher idle power in dual-screen setups. Not sure when/if this will be changed. Also note we're trading off temps for acoustic quality at idle. We could ratchet down the temp, but need to turn up the fan to do so. Our fan control is set to not start increasing fan until we're up near the 80's, so the higher temp is actually by design to keep the acoustics lower." - NVIDIA PR
 
LOL
http://www.nvnews.net/vbulletin/showthread.php?t=149400

I attached a 120mm fan to the top of the enclosure with electrical tape and pointed it directly at the exposed heat pipes. With the fan running around 1500RPM, the GPU temperature gradually dropped a total of 11° Celsius. Furthermore, the noise of the 120mm fan began to drown out the fan on the graphics card, which eventually throttled down to 55%.

Should a 12cm fan be a standard accessory for the GTX 4xx?
 
Well, when someone at [H] gets ahold of one, maybe they will take the fan out, block the PCB holes, glue on an 80-120mm fan adapter and mount a good 120mm fan with a separate fan control on it. Also, remove and replace the shitty thermal paste NV uses stock....

Bet it results in a 20-30C temperature drop at full load.
 
I have to say I'm a bit curious, 3.2 BILLION transistors and not accomplishing much more than the 2.2 BILLION transistors of the 5870.

It's a very weird state of affairs. These GPUs, right from the start, were designed for calculating things in server parks, not playing games.

There has to be an insane amount of money in the GPGPU business.
 
It's a very weird state of affairs. These GPUs, right from the start, were designed for calculating things in server parks, not playing games.

There has to be an insane amount of money in the GPGPU business.

Yes... if people are willing to pay to power and cool them.
 
It's a very weird state of affairs. These GPUs, right from the start, were designed for calculating things in server parks, not playing games.

There has to be an insane amount of money in the GPGPU business.

Nvidia has to expand the market for their chips. Once CPUs with integrated GPU dies become mainstream, they lose the biggest part of their business. High-end gaming chips and Quadro are not enough to carry them in the long run.
 
You fart knockers! I just got inside after installing new pipes on my Harley and saw you posted the review. :D
Love the videos showing wattage and the sound. Man, those 480s are loud.

Nice one Butthead! hehe. hehe.

The harley engine loudness is much better than Geforce loudness, huh? Nothing quite like the sound of an engine.
 
If they didn't consume so much power these would be great cards. But it's stupid they only have two, one for $350 and one for $500... whereas with ATI you can spend $165 on a 5770 and still get a decent DirectX 11 card. nVidia loses this round IMO
 
If they didn't consume so much power these would be great cards. But it's stupid they only have two, one for $350 and one for $500... whereas with ATI you can spend $165 on a 5770 and still get a decent DirectX 11 card. nVidia loses this round IMO

They've already said there will be middle-of-the-road products based on different chips that compete in that lower price range (and all others). Given that those aren't enthusiast-class parts, you can expect to see them designed with different considerations in mind (power, noise, etc.).
 
Didn't some site mention that dual monitors on a GTX 480 cause a 90C idle temp?

That sounds pretty nuts to me :confused:

[H]'s review showed a single GTX 480 hitting 95° C and that was in an open room with no case! From my experience with 100° C cards inside a closed case, all your other components will suffer from the extra heat as well. Watercooling is a must for 480 SLI setups and even then, tons of extra heat is dumped into your loop
 
I know nothing of this guy and am just posting this to let the reader come to their own conclusions, but the 480 doesn't look good for a two-monitor setup, assuming what is said is true and he doesn't just have a bad apple. In light of the heat talk, it could have some relevance.

http://www.legitreviews.com/article/1258/15/
 
Nice one Butthead! hehe. hehe.

The harley engine loudness is much better than Geforce loudness, huh? Nothing quite like the sound of an engine.

Maybe video cards should sound like V12s. Then I would welcome some noise!

Imagine playing on your video card and it starts making some awesome exhaust pipe notes? That would be tons of fun!
 
[H]'s review showed a single GTX 480 hitting 95° C and that was in an open room with no case! From my experience with 100° C cards inside a closed case, all your other components will suffer from the extra heat as well. Watercooling is a must for 480 SLI setups and even then, tons of extra heat is dumped into your loop

Hell, it sounds like watercooling is a must for even just one of these practically.
 
Well, when someone at [H] gets ahold of one, maybe they will take the fan out, block the PCB holes, glue on an 80-120mm fan adapter and mount a good 120mm fan with a separate fan control on it. Also, remove and replace the shitty thermal paste NV uses stock....

Bet it results in a 20-30C temperature drop at full load.

Why should people have to go through all that trouble to get their $500 video card down to acceptable temperatures? Most people expect their hardware to work from the get-go when they plunk down that much cash.

It appears that non-reference cooling is the only way to control the noise and temperatures of the power-hungry GTX 480. This is one of the hottest-running video cards EVER released; it runs even hotter than the dual-GPU 5970.

And doesn't the same thing apply to the 5000 series? Put a non-reference Accelero cooler on a 5870 and it will be able to run even quieter and faster. Also, I read Hardware Canucks' review, which revealed that the max OC they got out of the GTX 480 was 10% on the core.

Compare this to the 5870, which runs at a stock 850 MHz but can easily be pushed up to 1000 MHz core or close to it. That's a roughly 17% boost! The 5870's low thermal requirements make it an overclocking beast. The GTX 480, not so much. The 480's power requirements and heat envelope are the limiting factors to overclocking it.
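Just to show the arithmetic behind those percentages (the 5870 numbers are the ones above; the GTX 480 clocks are only an example of what a ~10% core bump looks like, not figures from the review):

# Percentage overclock = (overclocked - stock) / stock * 100
def oc_percent(stock_mhz, oc_mhz):
    return (oc_mhz - stock_mhz) / stock_mhz * 100.0

print(f"5870:    {oc_percent(850, 1000):.1f}%")  # 850 -> 1000 MHz, ~17.6%
print(f"GTX 480: {oc_percent(700, 770):.1f}%")   # e.g. 700 -> 770 MHz, ~10%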
 
On the temperatures.... it is not the fault of either the GTX480 or the 5870 that they get hot and make noise. It is the fault of the ATX case design, a flawed and pathetic result of morons and idiots with no engineering expertise designing cases.... which then become de facto standards because the industry as a whole pathetically lacks vision.

I won't use the stock coolers on either the 5870 or the GTX480. Hell, I tossed the cooler on my 8800GT from day one. Using the Thermalright HR-03GT... and dropping my max temps 20C+.

I think all the hand-wringing over the heat and noise is a waste of electrons.

The blame doesn't go elsewhere. If your Thermalright HR-03GT dropped max temps by 20C+, then Nvidia should have entertained an option such as this, no matter how absurd that sounds. You had the good sense to do this. Hot and noisy is not an unfortunate side effect of poor ATX design; it's the result of Nvidia's choice to push the limits. A fan, or the case the card resides in, is a design element that HAS to be factored into the equation.

AIBs are allowed to put whatever fan they choose on this board. Hopefully we will see some of them addressing this issue. Perhaps we will see water-cooling setups being openly supported by AIBs. This doesn't solve the problem for the average user, but at least someone will address the issue. As for the average user, I wonder if Nvidia hasn't painted itself into a corner.

This isn't an ATX design flaw; ATX has been here for a long time. This is an Nvidia design flaw. ATI downclocked their 5970 in order to fit into a certain power envelope, which was far lower than Nvidia's.
 
Why should people have to go through all that trouble to get their $500 video card down to acceptable temperatures? Most people expect their hardware to work from the get-go when they plunk down that much cash.

It appears that non-reference cooling is the only way to control the noise and temperatures of the power-hungry GTX 480. This is one of the hottest-running video cards EVER released; it runs even hotter than the dual-GPU 5970.

And doesn't the same thing apply to the 5000 series? Put a non-reference Accelero cooler on a 5870 and it will be able to run even quieter and faster. Also, I read Hardware Canucks' review, which revealed that the max OC they got out of the GTX 480 was 10% on the core.

Compare this to the 5870, which runs at a stock 850 MHz but can easily be pushed up to 1000 MHz core or close to it. That's a roughly 17% boost! The 5870's low thermal requirements make it an overclocking beast. The GTX 480, not so much. The 480's power requirements and heat envelope are the limiting factors to overclocking it.
Not that you would want to add even more power and heat, but the GTX 480 can overclock just as much (percentage-wise) as a 5870.
 
The blame doesn't go elsewhere. If your Thermalright HR-03GT dropped max temps by 20C+, then Nvidia should have entertained an option such as this, no matter how absurd that sounds. You had the good sense to do this. Hot and noisy is not an unfortunate side effect of poor ATX design; it's the result of Nvidia's choice to push the limits. A fan, or the case the card resides in, is a design element that HAS to be factored into the equation.

AIBs are allowed to put whatever fan they choose on this board. Hopefully we will see some of them addressing this issue. Perhaps we will see water-cooling setups being openly supported by AIBs. This doesn't solve the problem for the average user, but at least someone will address the issue. As for the average user, I wonder if Nvidia hasn't painted itself into a corner.

This isn't an ATX design flaw; ATX has been here for a long time. This is an Nvidia design flaw. ATI downclocked their 5970 in order to fit into a certain power envelope, which was far lower than Nvidia's.

That goes for both ATI and NV :p build it as cheap as possible, I guess.
 
Not that you would want to add even more power and heat, but the GTX 480 can overclock just as much (percentage-wise) as a 5870.

I generally try to avoid being a dick, but in this case I can't help it. Where is the proof that on average the GTX 480 will overclock as well? As far as I know, the only GTX 480s in reviewers' hands are samples sent to them by nVidia. These are not retail samples that just anyone can get their hands on. This is one of the big reasons Kyle didn't give any overclocking results. Samples handed out to reviewers don't always represent the retail cards with regard to overclocking.

I'm sure Kyle did do some overclocking with the cards he had, but I don't remember him mentioning much, if anything, about overclocking. This isn't the first time he has waited to release actual results until he got his hands on retail cards, but he would normally say something about what he saw for overclocking. This could have been an oversight on his part, but it could also mean that the results he saw weren't very good and he's waiting for retail samples to arrive to see if they are any better.

I believe your thoughts regarding the overclockability of the GTX 480 are nothing more than wishful thinking at this point. The fact that the cards are hot as hell and power-hungry at stock speeds usually indicates that they are not going to be very good overclockers.
 