NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed

I wonder if the reason behind the contradictory rumors about power draw (and number of SPs) is connected to this on-the-box voltage tweak feature. Like maybe 250W stock, but tweak it up to 295W for a big boost.

And of course, many high-end PSUs have more than two PCIe power connectors, so having a third one on the card to power an "overdrive" mode would not be that weird.
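For what it's worth, the rumored numbers fit inside the standard PCIe power budget. A quick back-of-the-envelope sketch using the spec ceilings (75W from the slot, 75W per 6-pin, 150W per 8-pin; the TDP figures above are rumors, not confirmed):

[code]
# PCIe power budget arithmetic (spec ceilings, not measured draw)
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe connector
EIGHT_PIN_W = 150  # 8-pin PCIe connector

stock = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W: covers a rumored 250 W stock TDP
overdrive = stock + SIX_PIN_W             # 375 W: with a hypothetical third connector

print(stock, overdrive)  # 300 375
[/code]

So a 295W "overdrive" mode would just squeeze under the 300W ceiling of the usual 8-pin + 6-pin layout, and a third connector would add real headroom.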
 
I wonder if the reason behind the contradictory rumors about power draw (and number of SPs) is connected to this on-the-box voltage tweak feature. Like maybe 250W stock, but tweak it up to 295W for a big boost.

From the looks of it, it's the "ultra" card that will have all 512 SPs and a 295W TDP, while the regular GTX 480 will have 480 SPs and a 250W TDP. Since it seems no one was expecting there to be an ultra version of the 480, that's probably where the confusion came from.
 
I doubt Asus has the voltage tweak additions available for testing yet; I doubt they even have the boxes printed, as those are clearly renderings. I think they are more "high hopes" than actual product at the moment. Also, they are labeled as "EN"GTX480 and "EN"GTX470, which could possibly mean "engineering sample"?
 
I love how all these anti-Nvidia people are calling the card "too hot, too big, too slow, too whatever". Because we've all seen reviews of retail cards, right?

Wait until the real release; speculation can be proven false. Don't jump to conclusions based on leaked/rumoured specs... TDP ≠ power consumption.
 
I'll be watching NE for the GTX 480 on release day like watching water boil!! So I can flip it to some nVidiot.
 
I love how all these anti-Nvidia people are calling the card "too hot, too big, too slow, too whatever". Because we've all seen reviews of retail cards, right?

Wait until the real release; speculation can be proven false. Don't jump to conclusions based on leaked/rumoured specs... TDP ≠ power consumption.

How about the fact that it's 8 months late, running too hot (already one major buyer, Oak Ridge, has dumped them), and on its 3rd A revision already, with B1 being done (if Kyle's heard it and feels confident enough to say it, I believe him).

The 40nm process has leakage problems. ATI adapted; NV didn't. They tried desperately and did patching to get around what is really a redesign issue.

It's a fundamental engineering problem. It IS costing NV a lot, and I doubt any A-revision Fermi cards will make any profit. Using risk wafers for retail cards? Ballsy. It's going to be EXTREMELY interesting to see their actual yields, but I very much doubt we ever will, as they'll be swept under the carpet.

Interesting thought for people to keep in mind: these cards will run hot. Are we going to see a return of the same damage issues we saw with the laptop GPUs? Apple is still pissed at NV over that one.

With the banning of lead, they need to use alternative solders and die/substrate materials. I really hope they learned from that and we don't see another round of faulty cards. I really wouldn't want to be a laptop manufacturer at present; changing the TDP on a product this close to launch is suicide. Let's come back in 6 months and see how many laptops have NV's "killer" DX11 chip in them.
 
How about the fact that it's 8 months late, running too hot (already one major buyer, Oak Ridge, has dumped them), and on its 3rd A revision already, with B1 being done (if Kyle's heard it and feels confident enough to say it, I believe him).

The 40nm process has leakage problems. ATI adapted; NV didn't. They tried desperately and did patching to get around what is really a redesign issue.

It's a fundamental engineering problem. It IS costing NV a lot, and I doubt any A-revision Fermi cards will make any profit. Using risk wafers for retail cards? Ballsy. It's going to be EXTREMELY interesting to see their actual yields, but I very much doubt we ever will, as they'll be swept under the carpet.

You can't say it's "too hot", you can't say it "has problems", you can't say anything about it with 100% certainty, as you don't have a GTX 480 in your possession, you haven't read a full review of a GTX 480, and you don't know anybody with a GTX 480 in their possession.
 
You can't say it's "too hot", you can't say it "has problems", you can't say anything about it with 100% certainty, as you don't have a GTX 480 in your possession, you haven't read a full review of a GTX 480, and you don't know anybody with a GTX 480 in their possession.

and you know it all ;)
 
Shades of the 2900 XT launch: lots of detractors, then soft-shoe reviews from other sites, followed by fanboys calling foul on [H] testing that verified the card as a dog. I wonder if history will repeat itself here.
 
You know, if I were a person into conspiracies, I would think mercyground shares a pillow with Charlie. Or maybe that IS Charlie. Ha, I lol'd.


Seriously, dude, you're way too wound up about this.
In the end, who really gives a flying fuck whether this launch rides the fail boat or not? If it sucks, don't buy it; it's really that simple. You're trying to be right when you have no hard facts, so please stfu already.

Thanks.
 
Reminder: when Charlie published, some time back, that the GF100 has a high TDP, consumes a lot of power, and that they may not be able to get enough chips to launch enough cards with the full 512 CUDA cores the design calls for, many were upset.

Now it turns out he was spot on, and only now, just before launch, is Nvidia actually confirming it. That tells the whole story, IMO.

PS: I am not promoting Charlie; he does seem biased, and as a journalist that goes against the ethos of the job (at the very least he should adopt a less gloating writing style). But Nvidia is trying to hide/blur/call it what you want, and that is not right. On the other hand, they have little choice after investing so much in this architecture.
 
I love that Nvidia will finally be the top dog once again. I wouldn't mind getting one of these for my next build. However, it'll have to get down to $350-$400 before I would purchase it.
 
I love that Nvidia will finally be the top dog once again. I wouldn't mind getting one of these for my next build. However, it'll have to get down to $350-$400 before I would purchase it.

That will be some time next year. And we don't know how it runs yet, but for sure it will sell way over MSRP for at least a month. The only card that could drop that low would be the 470, and by that time the 5870 should be near that price >.>

I say if you have a good high-end card now, skip this mess and wait for the refreshes.
 
You can't say it's "too hot", you can't say it "has problems", you can't say anything about it with 100% certainty, as you don't have a GTX 480 in your possession, you haven't read a full review of a GTX 480, and you don't know anybody with a GTX 480 in their possession.

So what part of "8 months late" did you miss? You do NOT miss the Christmas rush without a DAMN good reason. Nor do you miss a MAJOR software release (Win7).

That's 2 serious snafus. I don't care if you think Fermi runs like shit off a stick and cool as a night breeze; missing those deadlines is a major indication of something badly wrong. It doesn't take a genius to spot that one.
 
You know, if I were a person into conspiracies, I would think mercyground shares a pillow with Charlie. Or maybe that IS Charlie. Ha, I lol'd.

Seriously, dude, you're way too wound up about this.
In the end, who really gives a flying fuck whether this launch rides the fail boat or not? If it sucks, don't buy it; it's really that simple. You're trying to be right when you have no hard facts, so please stfu already.

Thanks.

Say what you like about Charlie; personally, I rather enjoyed his articles repeatedly exposing NV PR as liars and cheats. NV certainly likes to pull the same stunts. Intel cartoons, anyone?

It helped me get my laptop fixed despite them claiming it was "not their fault".

I'd love to know what NV promised Apple, considering how hard Apple slammed them over the laptop affair and NV's screwups.

"Buy our chips. We fixed the issue" NV
"Holy Crap Apple laptops dying" Apple
"Not us" NV
"Lying $%^$&%^" Apple.

Cue PR overdrive and more sweeping under the carpet. Dunno about you lot, but I'd be pretty pissed if my one-year-old laptop died and I got "tough, it's out of warranty" when the company KNOWS it's a known fault and is just screwing you over.

So yeah. You keep forking over those greenbacks to a "reputable" company. Enjoy that.

Some of us consumers stand up instead of getting shafted. It's only your company's reputation at stake. Ask the Phantom dudes; Kyle has a nice story about those idiots.
 
It's nice to get here late and see only the final version of posted specs, and see everyone's comments about the earlier numbers...

I've taken everything pretty much with a grain of salt so far, so it's possible I missed something... I remember reading about the two-display limit per card, meaning SLI was required for nFinity. However, the pictured card does have two DVI ports and a (mini?) HDMI port. Still limited to two displays, or will this allow for three on one card? If it does allow for three, that's HDCP support and no adapter too.

Still limited to two displays.
 
How about the fact that it's 8 months late...
How do you count?

running too hot (already one major buyer, Oak Ridge, has dumped them),
This is not confirmed yet.

and on its 3rd A revision already, with B1 being done (if Kyle's heard it and feels confident enough to say it, I believe him).

The 40nm process has leakage problems. ATI adapted; NV didn't. They tried desperately and did patching to get around what is really a redesign issue.

3 metal spins is not unheard of, but sure. However, how can you say that the process problems of TSMC's 40nm are really a redesign issue? You're mixing apples and pears here.

Your conclusions may be correct, but your facts make them BS.
 
I love that Nvidia will finally be the top dog once again. I wouldn't mind getting one of these for my next build. However, it'll have to get down to $350-$400 before I would purchase it.
they won't be top dog at least for a while still fastest one is 5970 also its time for ati release there refreshes so there fastest single gpu position can change also
 
they won't be top dog at least for a while still fastest one is 5970 also its time for ati release there refreshes so there fastest single gpu position can change also
Indeed. And if the rumored performance improvement of only 5-10% is true, then they will have a hard time keeping the single-GPU title too. If ATI brings out Catalyst 10.3 and the Radeon Eyefinity Edition, that 5-10% edge nVidia has will shrink.
 
OK, reading this thread has convinced me not to buy the GTX 480, but does anyone know when the 5970 will be back out, or in stores? The only place I've seen them is on eBay for a very high price. I currently run 2 GTX 280s, but I want to go to a single card. My fps dropped considerably, but ONLY in Bad Company 2, because I connected another monitor; other games ran fine. It's a shame the GTX 480 will be hot. Btw, I will be selling my GTX 280s for $150 each once the 5970 comes out again.


BTW, I need to update my rig:

Gigabyte UD4P X58
i7 920 @ 3.7
12GB Dominator 8-8-8-34 1T
2x GTX 280 SLI
Corsair 1000HX
Vertex 120GB SSD x2
Hanns-G 28" monitor
 
The 5970s come in and out of stock fast. Go to Newegg, search for 5970, click on Notify Me for each 5970 listed, and Newegg will send you an email when they're in stock. They will only be in stock for 10-30 minutes or so, so you have to be quick. Also, you can post a Want To Buy thread in a few forums and I'm sure you'll find one within a week. Hope this helps.
 
It is possible to draw more power without more power connectors, but that would be very dangerous, since the current draw would be higher and it could cause a fire.

Sensationalize much?

Drawing a small amount more than 225W (75W comes from the socket) from an 8-pin + 6-pin setup is not going to cause a fire.

A lot of power supplies don't even have true 8-pin connectors. They use 6+2 connectors where the additional two pins just jumper off two of the 6-pin conductors. Therefore, those power supplies are perfectly capable of running the full 150W of an 8-pin connector through 6 conductors.
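As a rough sanity check on that (assuming the usual three +12V conductors per PCIe power connector; exact harness layouts vary by PSU):

[code]
# Per-conductor current for a PCIe power connector.
# Assumes three +12 V wires per connector (typical, but harnesses vary).
VOLTS = 12.0
TWELVE_V_WIRES = 3

def amps_per_wire(watts):
    return watts / VOLTS / TWELVE_V_WIRES

print(round(amps_per_wire(75), 1))   # ~2.1 A/wire at the 6-pin's 75 W rating
print(round(amps_per_wire(150), 1))  # ~4.2 A/wire at the 8-pin's 150 W rating
[/code]

Roughly 4A per wire is comfortably within what 18 AWG handles over a short run, which is why the 6+2 trick works in practice.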
 
Say what you like about Charlie; personally, I rather enjoyed his articles repeatedly exposing NV PR as liars and cheats. NV certainly likes to pull the same stunts. Intel cartoons, anyone?

It helped me get my laptop fixed despite them claiming it was "not their fault".

I'd love to know what NV promised Apple, considering how hard Apple slammed them over the laptop affair and NV's screwups.

"Buy our chips. We fixed the issue" NV
"Holy Crap Apple laptops dying" Apple
"Not us" NV
"Lying $%^$&%^" Apple.

Cue PR overdrive and more sweeping under the carpet. Dunno about you lot, but I'd be pretty pissed if my one-year-old laptop died and I got "tough, it's out of warranty" when the company KNOWS it's a known fault and is just screwing you over.

So yeah. You keep forking over those greenbacks to a "reputable" company. Enjoy that.

Some of us consumers stand up instead of getting shafted. It's only your company's reputation at stake. Ask the Phantom dudes; Kyle has a nice story about those idiots.

So, just to confirm, they MUST make shitty desktop graphics cards because your laptop broke... I mean, come on, laptop GPUs failing is TOTALLY relatable to desktop cards.

And in any case, you can blame RoHS for their lead-free solder requirements.

Now, back on the main topic, there are some new pics of the heatsink.

[attached: seven photos of the GTX 480 heatsink]
 
That has to be one of the more massive GPU coolers I have seen :)

Thanks for the pics!
 
Looks like a decent cooler. I wonder how the retailers will plaster stickers on it, with that exposed ridged heatsink. I guess the front will do.
 
Sensationalize much?

Drawing a small amount more than 225W (75W comes from the socket) from an 8-pin + 6-pin setup is not going to cause a fire.

A lot of power supplies don't even have true 8-pin connectors. They use 6+2 connectors where the additional two pins just jumper off two of the 6-pin conductors. Therefore, those power supplies are perfectly capable of running the full 150W of an 8-pin connector through 6 conductors.

:confused: When did I say "a small amount more"? I said that it is possible to draw more power from a connector than its specified rating, and that it could cause a fire if the limit is exceeded.

Do you know that the power dissipated in a wire is proportional to the square of the current?
Do you know that a chip draws more power at a higher clock speed?
Do you know that if you increase the voltage, the power goes up with the square of the voltage?

The number of connectors or conductors doesn't mean anything by itself; the most important thing is the thickness, i.e. the cross-section, of the wire. You could even use a single connector or conductor, as long as its cross-section is enough for the current. :rolleyes:
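To put the square-of-the-current point in concrete terms, a minimal sketch (the resistance value is illustrative, not a measured cable figure):

[code]
# Joule heating in a fixed conductor: P_loss = I^2 * R.
# Doubling the current quadruples the heat dissipated in the wire,
# which is why cross-section (and therefore R) is what actually matters.
R = 0.01  # ohms, illustrative value for a short cable run

for amps in (4, 8, 16):
    print(amps, "A ->", round(amps**2 * R, 2), "W lost in the wire")
# 4 A -> 0.16 W,  8 A -> 0.64 W,  16 A -> 2.56 W
[/code]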
 
they won't be top dog at least for a while still fastest one is 5970 also its time for ati release there refreshes so there fastest single gpu position can change also

Your inability to use "their", "they're", and "there" correctly indicates, to me at least, that your opinion should be taken with a grain of salt, since these mistakes are typically made by adolescents in the 14-17 age range.

Sure it may be laziness, but that is also indicative of your attitude and arrogance when it comes to sharing your opinion.

Btw Oldie, I can prove this using statistical learning algorithms if you like. It's not trolling if it has a high correlation coefficient.

The amount of propaganda done by alg7_munif and others to convince people to buy ATi cards is disheartening. I will let him keep doing it, since his posting history speaks for itself.
He has always been a pro-ATi guy and has, on numerous occasions, been a pioneer of the "ATi movement" on this forum.

I'm sick and tired of these GPU politics. Let the bloody card come out and we'll see who's right and who's wrong. Stop conjecturing until then.

And sure, the "5970 is top dog" even when the 480 comes out; but, as Robin Scherbatsky says, "but, um", it's a 2-GPU card going up against a single-GPU card.

I'm not going to let this degenerate into the same 3870 X2 vs 7900 GTX argument, or 4870 X2 vs GTX 280, because anyone who thinks one physical slot versus another is a fair comparison is entitled to their opinion. I, however, believe the number of GPUs per card is a big factor here.

My bet's on 3x GTX 480s beating 2x5970s in Tri SLI vs the quad crossfire, but we'll see (obviously there are no real benchmarks yet).
 
If the GTX 480 is only 10% faster than the HD 5870, then it should not even be compared to a dual-GPU card like the 5970, period. It is not even single GPU vs. two GPUs anymore, because the GTX 480 simply would not be fast enough to come close to the 5970 in games that scale well in CrossFire, which is the case in almost all popular games.
 
There is more to it than just single GPU vs dual GPU. It is not about the top dog either; it's a mixture of price, performance, features, and power consumption. Then there is reliability and customer support too: listening to what the community wants and keeping in touch with it through forums and so on to keep people happy.
 
And sure, the "5970 is top dog" even when the 480 comes out; but, as Robin Scherbatsky says, "but, um", it's a 2-GPU card going up against a single-GPU card.

I'm not going to let this degenerate into the same 3870 X2 vs 7900 GTX argument, or 4870 X2 vs GTX 280, because anyone who thinks one physical slot versus another is a fair comparison is entitled to their opinion. I, however, believe the number of GPUs per card is a big factor here.

My bet's on 3x GTX 480s beating 2x5970s in Tri SLI vs the quad crossfire, but we'll see (obviously there are no real benchmarks yet).
Card vs card, GPU vs GPU... who cares?
What the consumer wants is $ vs $. If either vendor puts two GPUs on a card and makes it cheaper and faster than the other, they will buy it.
At this point ATI could drop their card prices to whatever they want to undercut anything nVidia puts out for a while. ATI has a $50 DX11 card right now, for Christ's sake, and nVidia cannot match that even in a pipe dream.
If nVidia does come out with this and prices it below ATI's offering, I think people will buy it. However, they will look at power consumption now more than ever. If on top of that the nVidia card costs as much to run as the Family Truckster, people will fucking balk!
Jesus, this thing is automatically on the "never to be Energy Star applicable" list!
I can see where this shit is heading in short fucking order. As I said before: California is BANNING large-screen TVs even if they are Energy Star compliant. Those TVs pull FAR less wattage than a high-end PC running dual- or quad-SLI or CrossFire. California doesn't ask for statistics or ponder legislation; they ban, period. I can see nVidia's new card being the straw that broke the camel's back. They will call for an END to gaming on high-end kilowatt+ systems. Arnold Schwarzenegger will announce, "If it bleeds, we can kill it!"
 
Your inability to use "their", "they're", and "there" correctly indicates, to me at least, that your opinion should be taken with a grain of salt, since these mistakes are typically made by adolescents in the 14-17 age range.

Sure it may be laziness, but that is also indicative of your attitude and arrogance when it comes to sharing your opinion.

Btw Oldie, I can prove this using statistical learning algorithms if you like. It's not trolling if it has a high correlation coefficient.

The amount of propaganda done by alg7_munif and others to convince people to buy ATi cards is disheartening. I will let him keep doing it, since his posting history speaks for itself.
He has always been a pro-ATi guy and has, on numerous occasions, been a pioneer of the "ATi movement" on this forum.

I'm sick and tired of these GPU politics. Let the bloody card come out and we'll see who's right and who's wrong. Stop conjecturing until then.

And sure, the "5970 is top dog" even when the 480 comes out; but, as Robin Scherbatsky says, "but, um", it's a 2-GPU card going up against a single-GPU card.

I'm not going to let this degenerate into the same 3870 X2 vs 7900 GTX argument, or 4870 X2 vs GTX 280, because anyone who thinks one physical slot versus another is a fair comparison is entitled to their opinion. I, however, believe the number of GPUs per card is a big factor here.

My bet's on 3x GTX 480s beating 2x5970s in Tri SLI vs the quad crossfire, but we'll see (obviously there are no real benchmarks yet).

3870 X2 vs 7900 GTX? No comparison :p The 3870 X2 would murder the 7900 GTX; hell, a plain 3870 would too =p
 
Your inability to use "their", "they're", and "there" correctly indicates, to me at least, that your opinion should be taken with a grain of salt, since these mistakes are typically made by adolescents in the 14-17 age range.

Sure it may be laziness, but that is also indicative of your attitude and arrogance when it comes to sharing your opinion.

Btw Oldie, I can prove this using statistical learning algorithms if you like. It's not trolling if it has a high correlation coefficient.

The amount of propaganda done by alg7_munif and others to convince people to buy ATi cards is disheartening. I will let him keep doing it, since his posting history speaks for itself.
He has always been a pro-ATi guy and has, on numerous occasions, been a pioneer of the "ATi movement" on this forum.

I'm sick and tired of these GPU politics. Let the bloody card come out and we'll see who's right and who's wrong. Stop conjecturing until then.

And sure, the "5970 is top dog" even when the 480 comes out; but, as Robin Scherbatsky says, "but, um", it's a 2-GPU card going up against a single-GPU card.

I'm not going to let this degenerate into the same 3870 X2 vs 7900 GTX argument, or 4870 X2 vs GTX 280, because anyone who thinks one physical slot versus another is a fair comparison is entitled to their opinion. I, however, believe the number of GPUs per card is a big factor here.

My bet's on 3x GTX 480s beating 2x5970s in Tri SLI vs the quad crossfire, but we'll see (obviously there are no real benchmarks yet).


That's funny, because when the 5870 came out, all the Nvidia fans compared it to the GTX 295. Now ATi fans can't compare the 480 to the 5970? Kind of a double standard, don't you think?
 
Off-topic: It would be nice if peeps used correct wording when making exclamations.

http://articles.latimes.com/2009/nov/19/business/fi-big-screen-tvs19

The first-in-the-nation criteria, approved unanimously Wednesday by the five-member California Energy Commission, is aimed at cutting the amount of electricity used by new high-definition TVs of up to 58 inches by a third starting Jan. 1, 2011. More stringent rules that take effect Jan. 1, 2013, would create a cumulative 50% power savings.

The standards don't apply to any of the approximately 35 million TV sets currently in use in California or units sold in the coming year.

... California is BANNING large-screen TVs even if they are Energy Star compliant. ...

On-topic: Several people I know have been patiently awaiting the arrival of Fermi. All were disappointed to hear the GTX 480 will only ship with 480 cores. Maybe it won't make much of a difference, maybe it will. Only time will tell...
 
:confused: When did I say "a small amount more"? I said that it is possible to draw more power from a connector than its specified rating, and that it could cause a fire if the limit is exceeded.

Anything's possible... whether it's relevant to the topic at hand is another issue altogether. The GTX 480 will never draw enough power to cause a fire or a current overload. You have to pull a lot more than 200W from a 6-pin to cause a fire; that's roughly 8A per non-negative wire, and even 18 AWG conductors can handle 8A over a short run.

The rest of your facts, while valid, don't change anything.
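For anyone checking the arithmetic behind that ~8A figure (assuming the hypothetical 200W above is pulled at 12V through a 6-pin connector with two or three +12V wires):

[code]
# 200 W at 12 V, split across the +12 V conductors of a 6-pin connector
WATTS, VOLTS = 200.0, 12.0
total_amps = WATTS / VOLTS  # ~16.7 A total
for wires in (2, 3):        # 6-pin harnesses carry two or three +12 V wires
    print(wires, "wires ->", round(total_amps / wires, 1), "A each")
# 2 wires -> 8.3 A each; 3 wires -> 5.6 A each
[/code]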
 
Anyone who thinks this version of Fermi is a success needs to take a step back and look at things objectively. Maybe a future revision will be, but this first one is a definite failure, being EXTREMELY LATE and having minimal, if any, performance advantage over ATi's part. When was this thing supposed to be released again? Last year?
 
Atta boy... you Nvidia fanboys stay aboard their sinking discrete-GPU ship.

God, if that hasn't been said like 50 times in the last decade about each company. It is borderline cliché.

Whatever your position on each company or card, only an idiot would actually find it amusing for the competition to sink. Talk about asking to be screwed sideways on prices.

Competition breeds improvement, innovation, and better prices.
 