NVIDIA Fermi - GTX 470 - GTX 480 - GTX 480 SLI Review @ [H]

I've been sitting back reading the reviews and comparing. I have two 480s on preorder and am going back and forth on whether I should cancel.

When it comes down to it, there's really no card that fits what I'm looking for. I want a card that does the following:

  • Runs multi-monitor (Eyefinity, 3D Surround) games with high IQ without running out of framebuffer (1GB+ VRAM)
  • Does not require the use of DisplayPort; adapters are OK, but not preferred
  • Runs within reasonable sound and power thresholds
The 5870 E6 won't work for me due to ATI's insistence on requiring DisplayPort; the 480 is a power hog, hot and noisy. Essentially, I'm looking for a 2GB version of the 5870.

The future iteration of the 480 (i.e., what the 5900 was to the FX 5800, the 36xx to the 26xx series, etc.) will probably do what you're looking for. Like others, I suspect the six-month delay in getting this card out means the other team, the one working on the refresh, will have it done relatively quickly, perhaps around the time of the "mainstream" launch of the Fermi variants or shortly after.

Price really isn't an object for me here. I'd pay good money for a card(s) that can run games multi-monitor full throttle.

I'm relatively sure nVidia will be taking their sweet time getting 3D Surround working. I know Kyle said it would hopefully be up and running in 30 days, but I'm not holding my breath. By forcing people to purchase two expensive cards to run multi-monitor, they've minimized the number of people who will be running it. The fewer people looking at their cards for 3D Surround, the less nVidia will care about getting it out there.

Actually, do the 480/470 retail boxes advertise 3D Surround as a feature? That would be fucked up if so, since it doesn't even work yet.

Actually, the 470/480 didn't release yesterday; the reviews did. He said the driver would come within 30 days, and AnandTech (I believe) said the driver was coming in April, so doing the math, it would appear nVidia postponed the release to get driver and quantity issues up to snuff.

Imo, the 470/480 are not as big a disappointment as I expected, since nVidia did claim the performance crown with the 480, especially in SLI. But obviously the power/heat/noise are going to be an issue, and not one easily cast aside. With a few choice sales, rebates, or price cuts, AMD will not be threatened by these cards, especially given how few of them apparently are to be released. First-generation cards that are power hogs, in small quantities, are even more evidence to me that nVidia knows the next refresh is just around the corner, but they needed to release something (anything!?) to keep the board from rebelling.

I'm disappointed we didn't get to see how nVidia Surround compares to Eyefinity, as that was really the more interesting battle to me. Not because I want such a setup (I dislike bezels), but because nVidia would be doing with drivers what ATI did with hardware, and I'm always eager to see whether nVidia's driver team will work magic (again?) or whether the feature will fall short. Perhaps another day!

One other thing, from the review:

"The only game that clearly favors the GeForce GTX 480 is Metro 2033. (And we know that AMD still has its driver team looking over the final code release of the game and has not yet tweaked for it.) Even in BC2, 8X CSAA isn’t a huge improvement over 4X AA which the HD 5870 allowed at 2560x1600. What is the value to the gamer of being able to use 8XAA instead of 4XAA in Bad Company 2?"

I was really encouraged to see such a balanced review until I read this. I don't know, but essentially saying, "We know AMD hasn't tweaked for this game yet, SO YOU JUST WAIT AND SEE WHAT HAPPENS" is... odd when AMD's competitor just released a brand-new architecture with obviously early drivers. Couldn't you just as easily lament that nVidia's drivers are early and that their team is "looking over the final code and has not yet tweaked" for all the games in question?

It seems odd to me that you're making a driver excuse, for a new game, on AMD hardware that's been out for over six months. One would think the nVidia 470/480 drivers would have a lot more room for improvement than AMD's, not because of the driver teams per se, but because the AMD driver team has had a whole hell of a lot longer to tweak for final hardware.

IF you want to bring driver teams improving game performance into the mix, then I'd say there's at least an equal chance that the nVidia driver team will, over the next six months, find its footing with the finalized/released hardware and offer as many performance improvements across the games in this review as ATI's team will.

Imo, I'd leave "they'll fix it with drivers" out of it entirely, as that argument can be made right back for nVidia in spades, given this is a soft launch with drivers that will almost certainly be different by the time cards ship.
 
What no one is talking about is how outdated the ATX standard is with respect to PCI slots. If you could fit a TRUE-120 tower heatsink on a GTX 480, the loudness would disappear. I also wonder how cool you could keep the chip, since these two-slot PCI coolers seem to suck ass.
 
It seems odd to me that you're making a driver excuse, for a new game, on AMD hardware that's been out for over six months. One would think the nVidia 470/480 drivers would have a lot more room for improvement than AMD's, not because of the driver teams per se, but because the AMD driver team has had a whole hell of a lot longer to tweak for final hardware.

I'm sure nVidia's drivers will get better, but let's also not forget that nVidia has been working on Fermi for a LONG time; it isn't exactly new to them. So while their drivers will mature, I don't believe they are starting off all that immature.
 
I am so glad I did not wait for this. 5870s in CrossFire is where it's at.
 
Who wants this jet engine in their computer while playing games? Not me. I would have liked the review to talk about PhysX, how it played out on these cards, and what direction nVidia is going with it. The AnandTech review talked about it a bit and said these new cards take as much of a hit using PhysX as the previous generation did. Doesn't sound too promising.
 
Good review...about what I expected. Yeah, I know...I listened to Charlie's "lies". :p
 
Throughout the review you pitted the 5970 against 2x 480s in SLI, and yet when you came to do the power measurements you measured 2x 5870s in CrossFire.
Strikes me that this stinks of bias.
The 5970 is like two downclocked 5870s; it would not perform as well as 2x 5870s in CrossFire and would show a larger deficit when compared to 2x 480s.
And yet in the power usage test, where we all know Fermi loses badly, you chose to use 2x 5870s so as to make the difference appear smaller.

If this is not just unadulterated bias, please explain your reasoning.
 
My concern with this card is that it has so many CUDA cores disabled, and it still uses this much power and runs this hot. What I want is three monitors in extended mode for games and normal use. In order to do that with nVidia I would have to run two of these in SLI, and that is just too much power and heat. I also expect that they will improve the chip and enable the rest of the cores, making this version obsolete quite fast.
I feel they will come out with an improved version, and one with two GPUs, sometime this year (a GTX 495?), which would do what I want. So, for now I will stay with my GTX 285, and if I can no longer resist the three-monitor bug I will go to ATi, although I think they have something new coming out soon as well, so I would not buy their present version because 1. there will soon be a new one and 2. the present series will then drop in price.
 
:rolleyes: I'm sure ATI is against you, better watch out!

I'm not a conspiracy theorist or anything... just saying it's going to take a lot more than one generation of their cards having "better" performance to win me over after the experiences I've had with them.
 
I would like to applaud Brent and Kyle for the review. You guys put forth a substantial amount of time and effort toward completing this review... for us. In addition, we greatly appreciate you making yourselves available to answer forum members' questions.

Just wanted to say thanks.

AMEN. I totally agree. A great review and right on launch day. Thanks. It is appreciated. :cool:
 
Throughout the review you pitted the 5970 against 2x 480s in SLI, and yet when you came to do the power measurements you measured 2x 5870s in CrossFire.
Strikes me that this stinks of bias.
The 5970 is like two downclocked 5870s; it would not perform as well as 2x 5870s in CrossFire and would show a larger deficit when compared to 2x 480s.
And yet in the power usage test, where we all know Fermi loses badly, you chose to use 2x 5870s so as to make the difference appear smaller.

If this is not just unadulterated bias, please explain your reasoning.

I think Kyle was spot on there; it is the 5870 vs. the 480 for single-GPU performance. The 5970 is in a different realm all by itself.
 
I don't know how anyone could buy one of these. I'd like to see someone try to rationalize it. :p
 
Was looking at a 5850 Cypress Pro and was going to wait to see what NVidia released. I just pulled the trigger on a 5850!
 
It is a bit early to start talking about refreshes of NVIDIA's Fermi architecture at this point. You can't even buy Fermi-based cards today; that is at least a couple of weeks away. Normally refreshes bring a modest increase in performance and usually lower power usage. Otherwise, refreshes rarely change the landscape of the video card wars much.
 
Throughout the review you pitted the 5970 against 2x 480s in SLI, and yet when you came to do the power measurements you measured 2x 5870s in CrossFire.
Strikes me that this stinks of bias.
The 5970 is like two downclocked 5870s; it would not perform as well as 2x 5870s in CrossFire and would show a larger deficit when compared to 2x 480s.
And yet in the power usage test, where we all know Fermi loses badly, you chose to use 2x 5870s so as to make the difference appear smaller.

If this is not just unadulterated bias, please explain your reasoning.

Did you happen to READ the power page? I addressed this issue directly. Reading is fundamental.
 
Nvidia's prices will probably drop a bit over time. AMD/ATI will probably raise theirs with the refresh, now that they don't have to worry about the current gen.

That was my point, and of course, it's the real problem that Fermi brings. Fanboys will be fanboys; the rest of us want good prices. My launch-day 5850 has been a sound investment, as far as video cards go.

I was hoping that Fermi would at least force AMD to drop the 58x0 back to original MSRP, or maybe a shade lower. Instead Fermi allows AMD to refresh at the inflated MSRPs, if not higher ones...because hey, who is gonna stop them?
 
Whoa, just went back and realized this... so the 480 actually didn't even receive Silver on its own, and it's no ****ing wonder.

@ Kyle: do you believe even a refresh of Nvidia's latest is going to bring anything new to the table? I can't imagine what they could pull off with a refresh that would magically blow the 5870 out of the water.


Too far out to speculate. We are not getting reliable information currently. Heck, we were still not getting Fermi information two weeks ago. I don't THINK this was supposed to be a soft launch; I think GPU supply was the issue. That is just my personal feeling.

The only people who are going to see big benefits from 5870 2GB cards are those running BIG resolutions. We are still testing, though, so there may or may not be more benefits to be seen.
 
I am hoping a refresh brings the power and heat levels down while running with the full 512 cores. I am sure it will probably happen, but with all the issues Fermi has had so far, I wonder if it might not be out until the end of the year. Anyone care to take an educated guess on when that might happen?
 
I would tend to think getting good quantities of the current card out is probably the bigger concern before any refresh.
 
After this release I have way more respect for the way ATI/AMD have run their little program. The 5XXX release is all the more impressive to me: new parts at every price point within months of each other. Even if features like Eyefinity and DX11 are of diminishing value on lower-end parts, the consumer was still given the option to use them and get more out of the purchase if they wish.

I'm curious about the HD 6XXX. If they're moving to a smaller process, do you think it's likely we'll see a die shrink of one of the Evergreen chips for them to try out the process?

Honestly, Fermi may be a flop for the most part, but I think in six months to a year it may offer better value if games start making better use of DX11 features. They may also be able to squeeze more performance out of it with driver updates, as ATI has done. Ultimately, the power draw and heat turn me off the most. My brother did some calculations, and he saves almost $10 a month with his 5850 over his previous card. Plus, what kind of OC headroom does the 480 have with that heat? And how much more power will it consume? Time to finish reading all these posts.
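
For anyone curious, here's a rough sketch of that kind of electricity math in Python. The wattage delta, daily hours, and rate are assumed values for illustration, not numbers from the post:

    # Back-of-the-envelope check on "saves almost $10 a month".
    # All inputs are assumptions; plug in your own numbers.
    watts_saved = 150      # assumed load-power delta vs. the old card
    hours_per_day = 8      # assumed daily hours of gaming/heavy use
    rate_per_kwh = 0.15    # assumed electricity rate, $/kWh

    kwh_per_month = watts_saved / 1000 * hours_per_day * 30
    dollars = kwh_per_month * rate_per_kwh
    print(f"~{kwh_per_month:.0f} kWh/month, ~${dollars:.2f}/month saved")
    # ~36 kWh/month, ~$5.40/month with these assumptions; heavier use
    # or pricier power gets you to the ~$10 figure.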
 
Kyle, I was wondering if you could post in the uber-high-resolution screenshot thread with everything turned all the way up, regardless of the FPS, and take a few screenshots. I'd like to see what the games look like at such resolutions.

Thanks.
 
I'm curious about the HD 6XXX. If they're moving to a smaller process, do you think it's likely we'll see a die shrink of one of the Evergreen chips for them to try out the process?

ATI is developing a new architecture for the 6XXX series at the end of this year. Most likely it will be on the 28nm process, and I'm sure another test chip like last year's 4770 will be done to avoid any Nvidia-like issues. We may see a 5780/90 on the 28nm process to test the new gen, but that's all speculation on my part.
 
ATI is developing a new architecture for the 6XXX series at the end of this year. Most likely it will be on the 28nm process, and I'm sure another test chip like last year's 4770 will be done to avoid any Nvidia-like issues. We may see a 5770 on the 28nm process to test the new gen, but that's all speculation on my part.



I honestly don't think it will be 28nm yet. What with TSMC having problems with 40nm, and 32nm being almost nonexistent, 28nm is a long, long shot.
If anything, I think they'll have to spend another year (at best) on 40nm, unless GlobalFoundries beats TSMC to the punch.
 
Well, Nvidia will not release another card until December/January anyway, so ATI won't be rushed to deliver another gen given Fermi's current results. If the 28nm process is online by December 2010 or January 2011, it's possible for both companies to plan for it. ATI already has 10 months on 40nm (4770).
 
Well, Nvidia will not release another card until December/January anyway, so ATI won't be rushed to deliver another gen given Fermi's current results. If the 28nm process is online by December 2010 or January 2011, it's possible for both companies to plan for it. ATI already has 10 months on 40nm (4770).

ATI has no need to rush from the looks of things.

I'm looking forward to 28nm products from both companies. :)
 
Originally Posted by styx0r
Here's what Fermi was designed to do. I love how the GPGPU capability wasn't even mentioned. Of course, most readers are only gamers, so they wouldn't care about the practical capabilities of a new-generation GPU.

If Nvidia had focused their R&D on a high-end gaming GPU, we'd have a different situation on our hands. These cards are not power-efficient when competing with ATI's GPUs because they were not designed solely to render modern video games.

What? Excuse me. I used to fold for tournament.com under SnoBeast. Look up my results and listen to my teachings. I was number 1 there for a while.

Now, I was only running a GTX 260 216, an 8800 GTS 640MB, and a Radeon 4670, with a total of 4 CPU cores as well. I ran Folding@home on it 24/7 for 30 days, then ran the computers for another 30 days for just gaming and general use. The savings from not running Folding@home were around $55-60 a month. Now, for the sake of Folding@home, PLEASE PLEASE run this fucker 24/7 full bore for a month and actually look at your bill!!! (And for the love of saving humans, has anyone seen any freakin' results from this program except nuke plants turning up the volume?) If you want to fold and can afford to, be my guest; it's your hobby.

Listen, I am not a tree hugger, but when things get tight around here you look at everything to save money. And a GD graphics card and Folding@home are not more important than food, rent, or life in general. So using Folding@home is a piss-poor way of propping up performance for this series.
If this Fermi (which is also the name of a nuclear power plant in Monroe, MI) was meant for graphics calculations and folding, then only release this Nuclear Turd to Stanford and the graphics/movie industry, and have something more mainstream in the works than this Nuclear Fart nVidia just released.

Oh god, I ranted again. Sorry.....
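
For what it's worth, the bill above roughly pencils out. A sketch, with every wattage and the electricity rate assumed rather than measured:

    # Rough cost of folding 24/7 for 30 days on the rig described above.
    # Every input here is an assumption; actual draw will differ.
    loads_watts = {
        "GTX 260 216": 180,            # assumed full-load draw
        "8800 GTS 640MB": 150,         # assumed
        "Radeon 4670": 60,             # assumed
        "4 CPU cores + platform": 110, # assumed
    }
    rate_per_kwh = 0.15  # assumed $/kWh

    kwh = sum(loads_watts.values()) / 1000 * 24 * 30
    print(f"{kwh:.0f} kWh over 30 days -> ~${kwh * rate_per_kwh:.0f}")
    # 360 kWh -> ~$54, in the same ballpark as the $55-60 quoted.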
 
It's OK SnowBeast, it's ok. Everything's going to be ok. Stay right where you are, don't move. I'm going for help. :(
I'm looking forward to 28nm products from both companies.
Your patience will need to be remarkable. :p
 
Kyle, I was wondering if you could post in the uber-high-resolution screenshot thread with everything turned all the way up, regardless of the FPS, and take a few screenshots. I'd like to see what the games look like at such resolutions.

Thanks.

You want PacMan or Frogger?
 
It strikes me that this has to be a critical point in time for NVIDIA.

All of the manufacturing and performance factors with the GF100 have surely caused more than a few in the company to take a second look at whether their design philosophy is sustainable.

A changeable market, the need for short time-to-market windows, and the power/heat/noise factors combine in a way that makes NVIDIA's approach seem somewhat self-defeating.

Given the R & D costs for each generation of GPU, the fact that the GF100 is not a 5K killer has got to give the company leadership pause.

I cannot believe that NVIDIA was ignorant of how the 480/470 stacks up against the ATI 5K series prior to releasing the cards for testing by the various web sites.

These issues are nothing new, but when the current economic climate is factored in, it makes it difficult for me to see how NVIDIA can continue down this path and remain viable.

Since a company's bottom-line purpose is to make a decent profit on their product, it makes me wonder if they are going to completely shift their focus and concentrate on industrial or commercial applications.
 
Jesus, those videos.
I guess you could argue that if your card's fan bothers you, you ought to turn your amps up to 11, but still.

If you need me
I’ll be downstairs
With the Fermi
You can call but I probably won’t hear you
Because it’s loud with the Fermi on
 
Awesome review, thank you for reaffirming my commitment to my HD 5870! My i7 920 is OC'd @ 3.8GHz with low temps in the 28-32C range and an average of 34-37C. I play with all the highest settings in BFBC2, but I've yet to purchase AVP. That will be next week.
 
Thanks for the honest review! The section and videos on sound were a great addition; thanks for doing that part as well.

I second that motion. I read a bunch of reviews, and none seemed to cosign the ATi cards' overall superiority. Yes, Fermi is faster in some benchmarks, but when you add price, heat, and power into the equation, the winner becomes very clear. [H]ard gives me more reasons to keep coming back.
 
Here's the most telling thing I read in all the reviews so far....

"In my personal system (Corsair 800D Chassis) with two monitors the GeForce GTX 480 graphics card would idle at 90C and if it was a sunny day and my office was warmer it would idle at 92. I fired up the new DX11 game title Aliens Versus Predator and with GPU-Z in the background I saw the temperature reach 99C while gaming for around 30 minutes. At this temperature the fan is spinning at 70dB and it honestly was not an enjoyable gaming experience. I asked NVIDIA if the card was built to run at temperatures this high and they claim that the GeForce GTX 400 series was built to operate at high temperatures and reminded me that 105C was the peak temperature for the GeForce GTX 480 video card. While benchmarking the GeForce GTX 480 graphics card on the open test bench I found the outside of the heatsink to reach 50C on the fan side and 59C on the exhaust side, so this card without a doubt will put out some heat."

If this was posted earlier, I'm sorry; I didn't read all 20+ pages. 90C at idle in a Corsair 800D case is really unbelievable.... The above is from this review:

http://www.legitreviews.com/article/1258/16/
 
Granted, but the fact is that the [H]ard|Folding team is currently, and has long been, the #1 Folding team. Coupled with the fact that Folding for the [H]orde is always being recruited for on the main [H] page, IMO it's disingenuous not to include a Folding benchmark.

There are several reasons why they do not do F@H benchmarks, and I agree with their reasoning.

1. For the most part, the vast majority of people are here for video game performance. The time and effort to do F@H benchmarks would not be cost-effective.
2. For the moment, there is no client they can get for Fermi. There is no publicly available client, and I don't know if there is even a closed beta client for it yet. There are also no work units released for that client even if you do have the client. The current GPU2 client will not work on Fermi.
3. Overall, the work-unit and core landscape for the GPU client is too fluid. There are many different work units out there, and they do not come close to scaling linearly across different cards. To even attempt a proper benchmark, you would need at least one of every work unit and would have to test each one at least once. That's not exactly the easiest thing to do and is very time-consuming.

While I would love to see the numbers the different video cards can put out for folding, I agree with Kyle's and Brent's decision to leave out any attempt at benchmarking folding performance for video cards or CPUs. There really is no way to properly do the benchmarks in a timely or efficient manner.
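
To put rough numbers on point 3, here's a sketch of how the test matrix balloons. Every count below is an assumption for illustration, not data from Stanford:

    # Why a "proper" F@H benchmark is so time-consuming: the matrix is
    # cards x work-unit types x repeats. All counts are assumptions.
    cards = 6             # assumed cards in a review lineup
    wu_types = 12         # assumed distinct GPU work-unit types in flight
    repeats = 3           # assumed runs per combination for stable numbers
    minutes_per_run = 45  # assumed average time to complete one work unit

    runs = cards * wu_types * repeats
    hours = runs * minutes_per_run / 60
    print(f"{runs} runs, ~{hours:.0f} hours of machine time")
    # 216 runs, ~162 hours -> weeks of testing for a single review.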
 
And today is the day of Earth Hour, when you cut your lights if you care even the slightest about power consumption. Great time to launch Fermi...

I use LED bulbs that draw only 1.5W each, so I don't need to cut my power; I do my bit every day. Where I live we use hydro power anyway, so there's no negative impact on the environment.
 
Will we be able to run a 3x1 setup at a 120Hz refresh rate with 3D OFF?

Because I don't think I could even manage that with Eyefinity at the moment (HDMI/DP adapters are all 60Hz).
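
For reference, the 60Hz ceiling is a pixel-clock limit. A rough sketch using the standard CEA 1080p raster totals; single-link DVI/HDMI-class adapters top out around a 165MHz pixel clock:

    # Pixel clock for 1080p at a given refresh rate, using the standard
    # CEA-861 total raster of 2200x1125 (1920x1080 active + blanking).
    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        return h_total * v_total * refresh_hz / 1e6

    print(pixel_clock_mhz(2200, 1125, 60))   # 148.5 -> fits under ~165 MHz
    print(pixel_clock_mhz(2200, 1125, 120))  # 297.0 -> far over the cap
    # That's why cheap 60Hz adapters can't pass 1080p at 120Hz; you need
    # dual-link DVI or a full DisplayPort link per monitor.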
 