NVIDIA GeForce GTX 480 Final Specs & Pricing Revealed

If that info is true, then it's insane that the GTX 480 has nearly the same TDP as the 5970...
 
If I decided to buy one, what's the cost of a used reactor these days? Plus EPA cleanup.
 
One article noted that a 5850 idles at 15W with the new driver. I'm going to look into it, so don't quote me. :p

Sounds a little low, but honestly once you drop below 25 or 30w, who really gives a shit on a desktop system?

I guarantee you that 3D will not penetrate the consumer market this year, and I seriously doubt it will do squat next year either.

Nah, it will be 2012 or later before there's any big revenue from it. But on that note, keep in mind that movies are not games, and vice versa.

ATI really did engineer a killer card with the 5000 series. Awesome performance, features, and low wattage. Kudos to them. :cool:

Yes, they did. No matter what Fermi does, the 5000 series has delivered for many months now.

I think my money's still going to go to SLI GTX 470s or 480s once prices drop. Which i5 chipset do I need to support both CrossFire and SLI?

P55 is the way to go. Check out the MSI P55-GD65 board for around $160. Still my favorite in that category.
 
Not the first time they've claimed that, but unlike others, I'm no fanboy, so I hope they do it so I can have another choice for video cards. In the meantime, I'll believe it when I see it.
Oh, and it would be fun to see how some red-team boys suddenly change their views on 3D Vision now that ATI is supposedly getting it! :D

In any case, back to Fermi: if it beats or matches my 295 as a single card so that I can SLI them, I would be sooooooooo happy.

3D Vision is a proprietary gimmick that requires those dumb shutter glasses. Thanks, but no thanks.

AMD is providing hooks into their drivers to allow ANY third party to create glasses and enable 3D, meaning we could get any number of 3D solutions. This is the way it should be done, not the proprietary path NV keeps taking.
 
Take this review from Xbit Labs, a site known for their excellent power consumption measurements.

http://www.xbitlabs.com/articles/video/display/radeon-hd5870_7.html#sect0

Regular load = 107W average
Furmark load = 161W

Then compare to this TechPowerUp review, where they use the exact same methodology to capture GPU-only power consumption:

http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/28.html

Regular load = 122W average, 144W instantaneous peak
Furmark load = 212W

There are obvious variations from one card to another, and although the 5870s tested do not exceed the stated TDP (outside of Furmark), some will consume significantly more or less than others. Given that, 118W at load sounds like a perfectly reasonable number for the 5870.
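As a rough sanity check on that spread, here's the arithmetic as a quick Python sketch (the wattages are just the review figures quoted above; nothing else is assumed):

# Spread between the two reviews' GPU-only load measurements (HD 5870).
xbit_load = 107   # W, Xbit Labs regular-load average
tpu_load = 122    # W, TechPowerUp regular-load average

spread = (tpu_load - xbit_load) / xbit_load
print(f"Spread between samples: {spread:.0%}")   # ~14%

claimed = 118     # W, the at-load figure under discussion
print(f"{claimed} W within measured range:", xbit_load <= claimed <= tpu_load)  # True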

I'm not sure how much you should trust those numbers from TPU. Compare the GTX 295, for example: it also exceeds its TDP. Same goes for the 250/260/280, etc.
So either TPU is wrong, or Fermi will exceed those numbers as well.
 
I'm most likely getting two 470s for SLI on day one.

I have three PCIe x16 slots, so I'm going to wait on the third one. lol, I'm going to need a new PSU for the tri-SLI.

Better get a new job to pay that power bill too; the entire graphics card sub-forum knows you would buy a 470 even if a herpes infection came free in the box.
 
3D Vision is a proprietary gimmick that requires those dumb shutter glasses. Thanks, but no thanks.

AMD is providing hooks into their drivers to allow ANY third party to create glasses and enable 3D, meaning we could get any number of 3D solutions. This is the way it should be done, not the proprietary path NV keeps taking.
Even in an excellently rated game like World of Warcraft, reactions to 3D Vision are highly mixed. I've tried it, and it looks nice, but it's sometimes hard to focus on menus. Personally, I think Eyefinity gives a much more impressive - and especially more immersive - gameplay experience.
 
[X]eltic said:
Even in an excellently rated game like World of Warcraft, reactions to 3D Vision are highly mixed. I've tried it, and it looks nice, but it's sometimes hard to focus on menus. Personally, I think Eyefinity gives a much more impressive - and especially more immersive - gameplay experience.
Well, if you really feel that a 2D representation of the 3D worlds we play in is more immersive than actually experiencing the game in 3D, then I have nothing else to say... I'm off this one; this just goes beyond the limits of comprehension.

3D Vision is a proprietary gimmick that requires those dumb shutter glasses. Thanks, but no thanks.

AMD is providing hooks into their drivers to allow ANY third party to create glasses and enable 3D, meaning we could get any number of 3D solutions. This is the way it should be done, not the proprietary path NV keeps taking.
Please don't tell me you're one of those people who call the glasses dumb as a lame excuse to put down the technology, but then don't feel like a retard when playing a ridiculous plastic guitar in Guitar Hero!
Oh, and yes, AMD can talk all they want, like they have before, but I have yet to see ANY 3D from them, so talk is cheap.

Regards
 
So I'm pretty sure the ATI engineering department is off at a pub drinking their asses off right now, celebrating a huge win for the 5000 series after seeing those numbers.

In all seriousness, though, I guarantee that ATI has been sitting on the second revision of the 5000 series, just waiting for Nvidia to come out with official pricing and specs for "Fermi" so they could set it up on a tee and drive it out of the GPU world. New, reworked 5000-series cards are right around the corner with much lower TDPs and better performance numbers, all on a smaller die. I mean, who wouldn't think that? They've only had DX11 parts since September '09.
 
ATI is working on 3D also, so this is not exclusive to Nvidia. It's not worth talking about.

Let's talk about these TDP numbers and RMAs instead.
 
Well, if you really feel that a 2D representation of the 3D worlds we play in is more immersive than actually experiencing the game in 3D, then I have nothing else to say... I'm off this one; this just goes beyond the limits of comprehension.
Regards
You're making the assumption that 3D Vision has no flaws and that it's comparable to the three dimensions we humans experience in everyday life. But 3D Vision is nothing like that. Sure, at times it looks nice, but it has many issues: out-of-focus menus, which I've experienced myself, and you're still playing your game on a small screen.
 
[X]eltic said:
Some developers even took advantage of obscure things like X-RAM, so it's not unlikely at all. In fact, in this case it's more likely. Don't forget that tessellation is not a new feature; it's already being coded for. Nvidia is merely better at handling higher amounts of tessellation. What we will probably see are in-game sliders with options like 'tessellation: low - normal - high - extra high' or something similar, with AMD not being able to handle the higher amounts of tessellation without a huge performance hit.

Note, I'm not saying that the GTX 480 is a winner, but it has some undeniable strengths:

-Improved tessellation performance
-PhysX support

But then again, it has some undeniable weaknesses too:

-Very high TDP of 295W
-Lackluster power/performance ratio

I'm getting ahead of myself and getting mixed up.
Correct me if I'm wrong, but in Heaven that's not tessellation only; on the dragon it's displacement mapping + tessellation, right?
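For anyone wondering what that combination does: tessellation subdivides the mesh into lots of small triangles, and the displacement map then pushes each new vertex out along its normal by a sampled height. A minimal Python sketch of the per-vertex idea (illustrative only, not real shader code; sample_height and the scale factor are made up):

# Displacement mapping on tessellated geometry: each tessellated vertex
# is offset along its normal by a height read from the displacement map.
# (Illustrative sketch; sample_height and scale are hypothetical.)

def displace(position, normal, uv, sample_height, scale=0.05):
    height = sample_height(uv)   # 0.0-1.0 value sampled from the map
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# A flat heightmap leaves the surface alone; a detailed one sculpts
# features like the Heaven dragon's scales out of flat triangles.
def flat(uv):
    return 0.0

print(displace((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.5, 0.5), flat))  # unchanged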
 
Well, if you really feel that a 2D representation of the 3D worlds we play in is more immersive than actually experiencing the game in 3D, then I have nothing else to say... I'm off this one; this just goes beyond the limits of comprehension.

Please don't tell me you're one of those people who call the glasses dumb as a lame excuse to put down the technology, but then don't feel like a retard when playing a ridiculous plastic guitar in Guitar Hero!
Oh, and yes, AMD can talk all they want, like they have before, but I have yet to see ANY 3D from them, so talk is cheap.

Regards

Shutter glasses, yes, they suck hard. Polarized ones, for things like Avatar and Alice in Wonderland, are not so bad, but still a pain in the ass for those who wear glasses. If I hadn't had Lasik done nearly three years ago, I wouldn't have seen those in 3D, because I'm not putting glasses on top of glasses.

But I'm done with this topic, as this isn't the right thread.

Bottom line for the OP is that Fermi looks to be fairly worthless. My only hope is that it's enough to make 5870 prices fall and push a 5890 onto the market, so that 5870 prices fall further. Then I can buy two 5870s and two more monitors (or maybe replace my current monitor and get three new ones) and just enjoy Eyefinity without having to worry about baking cupcakes at the same time.
 
My GTX 275 is running virtually everything that I throw at it - still, I have plans to upgrade my GPU sometime this summer.

There's no way I'd spend $500 on a GTX 480 if it's only 10% faster than a 5870. Screw that. I'll hold out for a 5970.

Am I missing something here? Last year the GTX 275 was an amazing deal - this is far from amazing. Kind of a letdown, actually. I certainly don't see ATI having to drop its prices now. I guess I'll have to wait for the benchmarks, but this is just bad news. :(
 
Well, if you really feel that a 2D representation of the 3D worlds we play in is more immersive than actually experiencing the game in 3D, then I have nothing else to say... I'm off this one; this just goes beyond the limits of comprehension.
Why so sore?
3D is far from perfect; it's actually catastrophic in some games, while others look great.

Here's a proper review:
http://www.extremetech.com/article2/0,2845,2338998,00.asp
Funny thing: he mentioned WoW and didn't seem particularly impressed either.

So, how about you get your head out of your green ass?
 
That tessellated dragon in the Heaven benchmark is the type of thing we'll end up seeing in games three or four years down the road - but we're not going to see anything like it anytime soon.

No developer is going to waste valuable resources creating imagery like that when only 5% of gamers can actually run it.

It's a cool idea - but that's a tech demo.
 
Let the price wars begin!

What's your reasoning here? Are you being sarcastic? Price wars? With Nvidia selling the GTX 480 at that price, there's no need for ATI to even think about lowering its prices.

If anything they'll raise them.

Suddenly the 5970 becomes even MORE attractive than it already is.
 
Not incredibly impressive, certainly not considering that this thing is six months late. Still, Nvidia needs this, and the market needs it too; thanks to the recent quakes and the like, 5800-series prices are starting to inch upwards again.

It's time for some price cuts, which should play well with AMD's more economical approach.
 
Fanboys :rolleyes:

What a horrible waste of oxygen.

Instead of having an intelligent and rational conversation, we're stuck with a shitload of "my team is better than your team." No objectivity, no open-minded perspective, just a bunch of idiotic out-group bias.

Oh well. If dumb people did not exist, we would have no basis to measure the intelligent by.

I'm a bit surprised and alarmed by how low these prices are. This could indicate a truly poor showing from Fermi, which would be shitty. Guess we'll see soon.
 
What's your reasoning here? Are you being sarcastic? Price wars? With Nvidia selling the GTX 480 at that price, there's no need for ATI to even think about lowering its prices.

If anything they'll raise them.

Suddenly the 5970 becomes even MORE attractive than it already is.

The 5970 is still selling for $750; if AMD drops it back to its regular price when Fermi comes out, then it might look more attractive.
 
I'm a bit surprised and alarmed by how low these prices are. This could indicate a truly poor showing from Fermi, which would be shitty. Guess we'll see soon.

I wouldn't be surprised if NV is selling these at a loss, to be honest; anything to steer themselves away from half a year of middling-to-bad press.
 
With all this talk of 3D being the future, are there possible side effects, such as degrading your vision faster, like using old CRTs at a low refresh rate?

I saw Avatar in 3D, and that has been my only experience with it. It looked cool for the first 15 minutes, but after that it was hardly noticeable except in a few parts. My eyes felt strained after the movie. My eyes are probably the most important part of me; I sure as hell don't want to make them degrade faster for a few silly 3D effects.

Haven't tried it for gaming, but hopefully it truly adds to the experience and doesn't have any unwanted side effects.
 
$399 5870 vs. $350 GTX 470... Not a bad showing by Nvidia, to be honest.

The prices are fine, but I'm sure people expected a hell of a lot more from a card launching six months late.

Instead of having an intelligent and rational conversation, we're stuck with a shitload of "my team is better than your team." No objectivity, no open-minded perspective, just a bunch of idiotic out-group bias.

Agreed. How someone can be a fanboy of a computer hardware company is beyond me.
 
The prices are fine, but I'm sure people expected a hell of a lot more from a card launching six months late.



Agreed. How someone can be a fanboy of a computer hardware company is beyond me.

Hey, ever heard of Mac-heads? Those guys kill people's babies for dissing Macs.
 
5% slower? I thought it was 10-15%. o_O

And that's not mentioning that the GTX 480 costs $100 more for a 5-10% improvement. :p

If those numbers are true, then a 480 is 15-25% faster than a 470, so who knows. =p Unless the 470 is heavily underclocked and running a 256-bit bus.
 
5% slower? I thought it was 10-15%. o_O

And that's not mentioning that the GTX 480 costs $100 more for a 5-10% improvement. :p

Not sure; if it's 10% faster than the 5850, it should be about 5% slower than the 5870.
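A quick back-of-the-envelope on that chain of percentages, assuming (purely for illustration) that the stock 5870 is about 15% faster than the 5850:

# If GTX 470 ~ 1.10x HD 5850 (the rumor) and HD 5870 ~ 1.15x HD 5850 (assumed):
gtx470_vs_5850 = 1.10
hd5870_vs_5850 = 1.15

ratio = gtx470_vs_5850 / hd5870_vs_5850
print(f"GTX 470 vs HD 5870: {ratio - 1:+.1%}")   # about -4%, i.e. roughly 5% slower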

Also, don't forget ATI just released new drivers with huge increases in games. It will be interesting to see these new drivers against Nvidia's new cards.

I think even Kyle said they would be using the new ATI drivers for their Nvidia evaluation.
 
I could see it being outlawed in California for energy usage!
They are set to outlaw TVs over 32" in California over energy consumption; next it will be the new nVidia cards, especially in SLI...
 
Playing non-stop 16 hours a day at max TDP with SLI'd 480s for 30 straight days would cost about $29 in video card power.

Most folks are not running SLI or gaming 16 hours a day every single day. (I'm sure some do.)

Let's say you game 60 hours a week with one GTX 480 at what is suspected to be the TDP; that's about ~$7 a month (video card only), maybe ~$5 more a month than the 5870. The rough math is sketched below.

With a $1000 video card cost, the power to run it isn't exactly going to "require a new reactor" or "break the bank".
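Here's that rough math as a quick Python sketch (the ~250W per-card draw and the $0.12/kWh rate are assumptions, not quoted specs):

# Back-of-the-envelope power-cost estimate for the claims above.
WATTS_PER_CARD = 250   # assumed load draw per GTX 480
RATE = 0.12            # assumed electricity price, $/kWh

def monthly_cost(cards, hours_per_month):
    kwh = cards * WATTS_PER_CARD * hours_per_month / 1000
    return kwh * RATE

print(f"SLI, 16 h/day x 30 days: ${monthly_cost(2, 16 * 30):.2f}")      # ~$29
print(f"One card, 60 h/week:     ${monthly_cost(1, 60 * 52 / 12):.2f}") # ~$8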

My wife's and daughters' hair dryer, curling iron, and other crap use 1200-1500 watts each, and they run those 1-2 hours a day: dry the hair, straighten it, then curl it. That's way more power.

So what's really needed is a way to redirect the heat from quad-SLI'd 480s to the bathroom for the womenfolk; then, when they want to do their hair, fire up Battlefield BC2. :D It's win-win for everyone.
 
Wow, nVidia is really losing traction in terms of power consumption, performance, and price... basically everything. Looks like AMD is going to be dominating the market for a while.
 
Yes, such a great showing!!

A card that's 5% slower for $50 cheaper!

VR-Zone mentioned those benchmarks from that old rumor posted here three weeks ago, so I won't treat them as fact.

There's a good chance the 470 will beat the 5870 by at least 15% in all DX11 game benchmarks.
 
Wow, nVidia is really losing traction in terms of power consumption, performance, and price... basically everything. Looks like AMD is going to be dominating the market for a while.

Which would make AMD and its investors happy. Everyone else loses, with high prices, little choice, and sloppy drivers. I pray this doesn't happen.
 