NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed

And this is A3. Imagine how bad the first two revisions must have been.


I hear that we will see a B1 some time this year. That rumor tells me that there are some things "bad wrong" with the A version.
 
a Quad GTX 480 system would require your wall plug to be greater than 15A :p


Yep, and most of North America is not ready for that.

http://www.hardocp.com/article/2007/08/24/ultra_x3_1600w_power_supply

The X3 1600w runs circles around its competitors in almost every category tested bar none. The amazing DC output capacity that immediately draws a user to this power supply comes with one big caveat. While almost no desktop computer would actually push this unit to its limit, once one does only a select few individuals with the correct infrastructure will be able to use all 1600 watts of its DC capacity due to the AC power draw. The Ultra X3 1600w is a stellar 1200 watt power supply that in the right home can be an even more amazing 1600 watt unit.
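The 15A worry above checks out with rough numbers. A back-of-envelope sketch (the 85% PSU efficiency, 120V mains, and 250W rest-of-system figure are assumptions for illustration, not from the thread):

```python
# Rough AC draw estimate for a hypothetical quad GTX 480 rig.
# Assumptions: 85% PSU efficiency, 120V North American mains,
# ~250W for CPU/board/drives under load.
GPU_TDP_W = 295
NUM_GPUS = 4
REST_OF_SYSTEM_W = 250
PSU_EFFICIENCY = 0.85
MAINS_VOLTAGE = 120.0

dc_load_w = GPU_TDP_W * NUM_GPUS + REST_OF_SYSTEM_W  # DC power the PSU must deliver
ac_draw_w = dc_load_w / PSU_EFFICIENCY               # what the wall actually supplies
ac_amps = ac_draw_w / MAINS_VOLTAGE

print(f"DC load: {dc_load_w} W, AC draw: {ac_draw_w:.0f} W, {ac_amps:.1f} A")
```

That lands around 14A, right at the edge of a 15A circuit, and well past the 12A (80%) continuous-load rating electrical code allows, before you plug in a monitor.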
 
Considering the huge TDP of 295 watts, a 5-10% performance increase over the Radeon 5870 is a disappointment. And with such an enormous TDP, it will be very difficult to manufacture a dual-GPU card. The GTX 480 will most probably take the single-GPU performance crown, but AMD will still have the fastest card: the Radeon 5970.

That said, if you don't care about power consumption at all, and if the price is right, it's still a nice card. Besides the slight performance increase, it has PhysX support too, and in games with a lot of tessellation, the lead will likely be better than 5-10%.
 
TDP
5850 - 151w
5870 - 181w
470 - 225w
5970 - 294w
480 - 295w

...Two 5870s or two 480s would give the best performance. But you're looking at 600W TDP for two 480s and 362W TDP for the 5870s.

Like most people, I only play games about 2 hours per day. The rest of the time, the computer is in 2D mode. That's about 60 hours per month of full-load TDP on the GPU. Multiply that by the difference in TDP between 2x 480s and 2x 5870s: 60 hrs * (600 watts - 362 watts) = 14.3 kWh, which only costs $5.72 even in power-expensive states like California.

I am more interested in the TDP difference in 2D mode.
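The monthly-cost arithmetic above can be sketched like so (the $0.40/kWh rate is an assumption for an expensive market like California; the 600W/362W figures are the thread's round numbers):

```python
# Monthly electricity cost of the TDP gap between 2x GTX 480 and 2x HD 5870.
HOURS_PER_MONTH = 60        # ~2 hours of gaming per day
TDP_2X_480_W = 600          # the post's round figure for two GTX 480s
TDP_2X_5870_W = 362         # 2 x 181W
RATE_USD_PER_KWH = 0.40     # assumed high-end residential rate

delta_kwh = HOURS_PER_MONTH * (TDP_2X_480_W - TDP_2X_5870_W) / 1000
cost = delta_kwh * RATE_USD_PER_KWH  # ~$5.71; the post's $5.72 rounds to 14.3 kWh first
print(f"{delta_kwh:.2f} kWh/month, about ${cost:.2f}")
```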
 
[X]eltic;1035471554 said:
Considering the huge TDP of 295 watts, a 5-10% performance increase over the Radeon 5870 is a disappointment. And with such an enormous TDP, it will be very difficult to manufacture a dual-GPU card. The GTX 480 will most probably take the single-GPU performance crown, but AMD will still have the fastest card: the Radeon 5970.

That said, if you don't care about power consumption at all, and if the price is right, it's still a nice card. Besides the slight performance increase, it has PhysX support too, and in games with a lot of tessellation, the lead will likely be better than 5-10%.

That is highly unlikely. In games with heavy tessellation, there are still other things that need rendering, and the GTX 480 is still only 10% faster in that category. The only time it might be a factor is in benchmarks; no dev is going to waste time on a feature that 1% of their client base might use.
 
Can we expect an in-depth review from [H]ard on the 26th?

That is the plan at the moment, but given the small amount of time we have, it will be somewhat limited. We are already planning a ton of AIB follow-up though. :) Just checked my fire extinguishers and everything looks like a go.
 
Like most people, I only play games about 2 hours per day. The rest of the time, the computer is in 2D mode. That's about 60 hours per month of full-load TDP on the GPU. Multiply that by the difference in TDP between 2x 480s and 2x 5870s: 60 hrs * (600 watts - 362 watts) = 14.3 kWh, which only costs $5.72 even in power-expensive states like California.

I am more interested in the TDP difference in 2D mode.


A 5870 with ONE screen is about 30w idle/2d. With two or more it is 60w.
 
The GTX 470 will most likely end up being the better seller, as the price point is actually pretty good and the power usage is nothing to be bitching about. As most of you seem to be doing. Pussies ;) lol. I'm willing to slap a guarantee on it, however, that the GTX 470 will be on par with the 5870 once the drivers mature. It may even outperform it down the road. It has much more headroom to push, unlike the GTX 480. Future revisions will most likely bring the juice it requires down, but I don't see the GTX 480 doing well. The GTX 470 is where it's at.
 
looking forward to it, Kyle. I have been on the fence about buying a 5870 for a few months now in anticipation of this new nvidia offering. I'll finally get to make a move once your review(s) are out on which way to go. Not a fanboy of either, personally. I always go for best bang for the buck; my 8800 GTS 512 is starting to show its grey whiskers.
 
I have to say, if all this info is on the money, i.e. price, performance, and that absurd TDP, then sadly it's a fail, imo.

Of course we don't know for sure yet but I bet this information is close, very close.

/sigh

P.S. Kyle, looks like your sources were pretty spot on from last December. Credit where it's due for sure. :)

P.P.S. Thinking on this info a little more: while 2x GTX 480 at those prices seems like a fail to me, perhaps 2-3 GTX 470s at a much better price/performance point might be sexy.
Hmmm, will have to see how the [H] review pans out.
 
hmm, looks like I should rethink even considering SLI GTX 470/480 as a replacement for my 8800 GTS. $500 and not much of a performance gain over a 5870.

So when are we going to get some official word from ATI that there is a refresh coming?

It's looking like next weekend I'll have to decide on CrossFire 5870s, or wait for the possible ATI refresh or price drop. It's getting to the point where I might say screw all of this and pick up 2x 5870 for $800 locally. My 8800 GTS just isn't cutting it anymore.
 
yea, here's to hoping for the 5870 price drop. Even better if I can get a 2GB 5870 iteration at the current 5870 price; I'd gladly pay it!
 
Scary that you have to ask, since most of you act like you're experts in all things Nvidia....
Ever heard of CUDA, PhysX, and that little thing called 3D Vision (the world is finally going 3D, in case you haven't noticed)?
And yes, now you can insert the usual silly comments on how those are useless, gimmicks, etc....as anything your card does not have MUST be a gimmick, right? ;)

CUDA, you say? Big deal, ATI can do physics too with Havok or anything else that is open source, and btw, in case you have not noticed, physics usually slows down the very card that's running it..........

3D Vision? yeah ok.....you got me there lol.....3D Vision is like the Virtual Boy of the past, a cool idea but it never made it mainstream.
 
how about with the new 150/300 clocks introduced in 10.2/10.3

Dunno, but I would suggest there's not much of a difference, if I had to guess.

looking forward to it, Kyle. I have been on the fence about buying a 5870 for a few months now in anticipation of this new nvidia offering. I'll finally get to make a move once your review(s) are out on which way to go. Not a fanboy of either, personally. I always go for best bang for the buck; my 8800 GTS 512 is starting to show its grey whiskers.

Yep, I am just a fan of whatever kicks ass. I am hoping Fermi ends up better than I feel it will, though. We NEED competition in the marketplace. When Red or Green faceplants, it is bad for all of us.

P.S. Kyle, looks like your sources were pretty spot on from last December. Credit where it's due for sure. :)

I try my best to get you guys solid "rumors" on video cards. I am surprised the pricing is as low as it is. Even AIBs were expecting it to be higher.
 
Scary that you have to ask, since most of you act like you're experts in all things Nvidia....
Ever heard of CUDA, PhysX, and that little thing called 3D Vision (the world is finally going 3D, in case you haven't noticed)?
And yes, now you can insert the usual silly comments on how those are useless, gimmicks, etc....as anything your card does not have MUST be a gimmick, right? ;)

CUDA? ATI has Stream..

PhysX? psh.... it lags my game when I turn it on, even with a dedicated GPU for it. Also, the GTX 295 in most cases STUTTERS LIKE CRAP when PhysX is on; same goes for ALL SLI solutions...

3D Vision? ATI has released a driver that supports 3D, but it's still a gimmick no matter what you say... The world is going 3D, "BUT" not with those NERD GLASSES... Wanna make a bet on that?

Also, the GTX 295 in most cases did NOT beat the 5870; no idea what you're smoking..
I moved from a GTX 295 to a 5870 and noticed games run much better on the 5870 than on what I had, and some AA stutter issues are totally gone since I switched...
You can ask pretty much anyone who had a GTX 295 and jumped to a 5870; everyone would say the same thing as I did..

If you still think your GTX 295 is superior, you need to calm down and take a nap while thinking up a next claim that actually makes sense..
 
3D Vision? yeah ok.....you got me there lol.....3D Vision is like the Virtual Boy of the past, a cool idea but it never made it mainstream.
You must be living in an alternate world, amigo.
Look around you, 3D movies are being produced like never before...heard of a little movie called Avatar yet?
3D-Ready TV sets are coming from all major brands.
Blu-ray 3D format.
Even consoles are going 3D!
Yeah, it's all just like that crappy Nintendo system. Heck, even the Sega Master System was better than that. So next time you try to make a point, try not to make a fool of yourself..too many people watching ;)

So yeah, I am getting Fermi, and I hope they come out cheap and soon.
 
This generation of nVidia cards was brought to you by the letters

F and X

and the number 42

Sesame Street is a production of the Children's Television Workshop......can you tell me how to get........
 
You must be living in an alternate world, amigo.
Look around you, 3D movies are being produced like never before...heard of a little movie called Avatar yet?
3D-Ready TV sets are coming from all major brands.
Blu-ray 3D format.
Even consoles are going 3D!
Yeah, it's all just like that crappy Nintendo system. Heck, even the Sega Master System was better than that. So next time you try to make a point, try not to make a fool of yourself..too many people watching ;)

Well, as has been said before, ATI is going to have 3D support as well. Not sure what your point is.
 
I do not know about now, Shansoft, but the GTX 295 crushed both the 5850 & 5870 when those cards launched. The drivers are mature now for both, so I can see the cards being better today, but at first they were not on the level of the GTX 295.
 
The 5870 uses 118W @ load? I think that number is wrong. The 5870 is power efficient compared to its competitors, but I don't think it's only drawing 118W @ load.

The number is correct. There is, of course, some process variation from card to card (up to 20%), but the 5870 power consumption of 118w is right on the money.

Take this review from Xbit Labs, a site known for their excellent power consumption measurements.

http://www.xbitlabs.com/articles/video/display/radeon-hd5870_7.html#sect0

Regular load = 107w average
Furmark load = 161w

Then compare to this TechPowerUp review, where they use the exact same methodology to capture GPU-only power consumption:

http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/28.html

Regular load = 122w average, 144w instantaneous peak
Furmark load = 212w

There are obvious variations from one card to another, and although the 5870s tested do not exceed the stated TDP (outside of Furmark), some will consume significantly more/less than others. Given that, 118w at load sounds like a perfectly reasonable number for the 5870.

I expect that the GTX 480 will also have significantly lower real-world power consumption than the TDP implies, but how much lower we will not know until the real benchmarks are released.
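The pattern in those measurements is easy to see when you put the numbers next to the TDP. A quick sketch (using the thread's 181W TDP figure for the 5870; the dict keys just label the cited reviews):

```python
# Measured HD 5870 draw as a fraction of its 181W TDP (the thread's figure),
# using the Xbit Labs and TechPowerUp numbers quoted above.
TDP_W = 181
measurements = {
    "Xbit regular load": 107,
    "Xbit Furmark": 161,
    "TechPowerUp regular load": 122,
    "TechPowerUp Furmark": 212,
}
for name, watts in measurements.items():
    print(f"{name}: {watts} W ({watts / TDP_W:.0%} of TDP)")
```

Typical gaming loads sit well under the TDP, while a power virus like Furmark can push one card past it, which is exactly the card-to-card variation argument above.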
 
That is highly unlikely. In games with heavy tessellation, there are still other things that need rendering, and the GTX 480 is still only 10% faster in that category. The only time it might be a factor is in benchmarks; no dev is going to waste time on a feature that 1% of their client base might use.
Other things need to be rendered, yes, but since tessellation is supposedly one of the strengths of the new nvidia architecture, it should make a difference. If performance is already 5-10% better in applications with normal amounts of tessellation, the gap will only grow in tessellation-heavy applications. Take the Heaven benchmark, for example: in the tessellation-heavy parts of the benchmark, the performance difference between Fermi and the 5870 increases. That's something we will probably see in games with heavy tessellation as well. It's true that it remains to be seen whether developers will take advantage of it, but devs take advantage of PhysX too (= nvidia only), so it's not at all unlikely.

The GTX 470 will most likely end up being the better seller, as the price point is actually pretty good and the power usage is nothing to be bitching about. As most of you seem to be doing. Pussies ;) lol.
What really bothers me about increased power consumption is increased noise generation. To a certain extent, I couldn't care less about electricity bills, but I don't want to have a leaf blower in my computer case. I want my hardware cool and quiet.
 
The number is correct. There is, of course, some process variation from card to card (up to 20%), but the 5870 power consumption of 118w is right on the money.

Take this review from Xbit Labs, a site known for their excellent power consumption measurements.

http://www.xbitlabs.com/articles/video/display/radeon-hd5870_7.html#sect0

Regular load = 107w average
Furmark load = 161w

Then compare to this TechPowerUp review, where they use the exact same methodology to capture GPU-only power consumption:

http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/28.html

Regular load = 122w average, 144w instantaneous peak
Furmark load = 212w

There are obvious variations from one card to another, and although the 5870s tested do not exceed the stated TDP (outside of Furmark), some will consume significantly more/less than others. Given that, 118w at load sounds like a perfectly reasonable number for the 5870.

I expect that the GTX 480 will also have significantly lower real-world power consumption than the TDP implies, but how much lower we will not know until the real benchmarks are released.



Logic is not welcome around these parts, from what I see of most of the fans posting in other threads :p . You are correct, however :).
 
You must be living in an alternate world, amigo.
Look around you, 3D movies are being produced like never before...heard of a little movie called Avatar yet?
3D-Ready TV sets are coming from all major brands.
Blu-ray 3D format.
Even consoles are going 3D!
Yeah, it's all just like that crappy Nintendo system. Heck, even the Sega Master System was better than that. So next time you try to make a point, try not to make a fool of yourself..too many people watching ;)

So yeah, I am getting Fermi, and I hope they come out cheap and soon.

One movie was good in 3D in theaters, and it won't be released on 3D Blu-ray until 2011. I guarantee you that 3D will not penetrate the consumer market this year, and I seriously doubt it will do squat next year either.
 
[X]eltic;1035471788 said:
Other things need to be rendered, yes, but since tessellation is supposedly one of the strengths of the new nvidia architecture, it should make a difference. If performance is already 5-10% better in applications with normal amounts of tessellation, the gap will only grow in tessellation-heavy applications. Take the Heaven benchmark, for example: in the tessellation-heavy parts of the benchmark, the performance difference between Fermi and the 5870 increases. That's something we will probably see in games with heavy tessellation as well. It's true that it remains to be seen whether developers will take advantage of it, but devs take advantage of PhysX too (= nvidia only), so it's not at all unlikely.

What really bothers me about increased power consumption is increased noise generation. To a certain extent, I couldn't care less about electricity bills, but I don't want to have a leaf blower in my computer case. I want my hardware cool and quiet.

I don't think you understand how it works.

In software development, there are things called features; you don't spend money on features that no one can or will use.

So spending extra time to tessellate so heavily just so that people with a GTX 480 can play it is not going to happen. Instead, there will be a medium where all cards, from mainstream to high end, can at least play with it.

And from what I understand, the more tessellation the 480 is doing, the less rendering power it has, as it dedicates more resources to tessellation.

BTW, ATI had TruForm in the 8500.
 
All I want are some benches and to know if these new cards are loud. I waited this long to see what the 470 would do... waiting another week is nothing. If it's no better (or worse... -gulp-) than the 5850, then I'll get the 5850. But I do like the fact that the Nvidia cards can do hardware video decoding (CUDA), and PhysX... even if it is rare and gimmicky. I hope to see [H] review these cards ASAP!
 
Lol, that's coming from a guy whose logic is riddled with fallacies...

Can't resist more personal insults, eh? You don't have any argument to back you up, I suppose... guess that's why you and other people are following me around in these threads, pouncing every time I speak :rolleyes:.

Rut Ruh, now I will have to rethink getting a 480 at launch and wait for the "B" version. Kyle's "rumors" are usually spot on.

Unless it comes with extra clockspeed boosts or whatnot, what's it matter to you or me?
 
But well, both the 480 and 470 have the same number of ROPs and texture units as the GTX 280, just more shader power.

ATI doubled up EVERYthing, and still managed to keep wattage down.

Impressive from ATI's side, and their drivers are shaping up really well; their whole game is going strong. Although I had hoped for Fermi at an earlier point in time, so I could snatch up a 2nd 5850 of my own; I'm borrowing one now.

You guys know ATI's refresh isn't that far away, and then HD6xxx not too far after. By ATI's refresh, will nvidia maybe have completed their 400 series?
Hopefully Tegra will keep nvidia alive, so they can give us some good competition till Intel takes over :p

I don't believe nvidia will survive between AMD and Intel with integrated graphics chips in laptops, judging by how ATI is doing on the discrete market. Nope, Intel just needs to make a good graphics card to complete the squeeze.

3D? ATI is going for that too. CUDA? We have Stream, OpenCL, DirectCompute. PhysX? Well, I don't have space for it (m-ATX), and few games give me an experience worth what the card costs.
Just make PhysX for OpenCL or let it die out, nvidia......

Open standards will always win. Always have and always will....
 
You must be living in an alternate world, amigo.
Look around you, 3D movies are being produced like never before...heard of a little movie called Avatar yet?
3D-Ready TV sets are coming from all major brands.
Blu-ray 3D format.
Even consoles are going 3D!
Yeah, it's all just like that crappy Nintendo system. Heck, even the Sega Master System was better than that. So next time you try to make a point, try not to make a fool of yourself..too many people watching ;)

So yeah, I am getting Fermi, and I hope they come out cheap and soon.

once again, it's not gonna take off............the sheer clunkiness of the hardware involved and the narrow viewing angle will ensure that.....
 
But well, both the 480 and 470 have the same number of ROPs and texture units as the GTX 280, just more shader power.

ATI doubled up EVERYthing, and still managed to keep wattage down.

Supposedly they improved the efficiency of those units dramatically as well, hence not needing more of them. What does it matter if you're doubling up on inefficient units vs. keeping the same number of more efficient ones, producing the same end result? Hint: it doesn't.
 
ATI really did engineer a killer card with the 5000 series. Awesome performance, features, and low wattage. Kudos to them. :cool:
 
Well, as has been said before, ATI is going to have 3D support as well. Not sure what your point is.

Not the first time they've claimed that, but unlike others, I am no fanboy, so I hope they do it, so that I have another choice for video cards. In the meantime, I'll believe it when I see it.
Oh, and it would be fun to see how some red boys suddenly change their views on 3D Vision now that ATI is supposed to be getting it! :D

In any case, back to Fermi: if it beats or matches my 295 as a single card so that I can SLI them, I would be sooooooooo happy.
 
I'm an NVIDIA fanboy myself, but it does look like ATI did a good job with its 5000 series. I have even recommended them to several people in the past few months, since NVIDIA is so late with the 470/480. But it should be clear to everyone that price will match performance. So even if the performance isn't that great, the price will drop. I think initially the true benefit will be for GPGPU, but personally I think the performance of the 480 will be significantly better than the 5870's; otherwise NVIDIA did screw something up, since the TDP is so much higher, unless it is all headroom for overclocking.

Edit: The only other thing that keeps me from personally buying an ATI card is their drivers and support. People will complain about closed standards from NVIDIA, but NVIDIA is putting a lot of money into PC gaming, and I don't have a problem with it if it makes the games better. I think ATI should put more money into drivers and features, but maybe they are starting to turn it around: Eyefinity does look promising and there seem to be good reports about the new Catalyst drivers.
 
I don't think you understand how it works

In software development, there are things called features, you don't spend money on features, that no one can or will use.

So spending extra time to tessellate so heavily just so that people with a GTX 480 can play it, is not going to happen, instead, there will be a medium, where all cards from mainstream to high end can at least play with it.

and from what I understand, the more tessellation the 480 is doing, the less rendering power it has as it dedicates more resources to tessellation.

BTW, ATI had TruForm in the 8500
Some developers even took advantage of obscure things like X-RAM, so it's not unlikely at all. In fact, in this case it's more likely. Don't forget that tessellation is not a new feature; it's being coded for already, and nvidia is merely better at handling higher amounts of tessellation. What we will probably see are in-game sliders giving an option like 'tessellation: low - normal - high - extra high' or something similar, with AMD not being able to handle the higher amounts of tessellation without a huge performance hit.

Note, I'm not saying that the GTX 480 is a winner, but it has some undeniable strengths:

-Improved tessellation performance
-PhysX support

But then again, it has some undeniable weaknesses too:

-Very high TDP of 295 watts
-Lackluster power/performance ratio
 
I think my money's still going to go to SLI GTX 470/480 once prices drop. What i5 chipset do I need to support both CrossFire and SLI?
 
I'm most likely getting two 470s for SLI on day one.

I have 3 PCIe 16x slots, so I'm going to wait on the 3rd one, lol. I'm going to need a new PSU for the tri-SLI.
 