And this is A3. Imagine how bad the first two revisions must have been.
I hear that we will see a B1 some time this year. That rumor tells me that there are some things "bad wrong" with the A version.
a quad GTX 480 system would require your wall circuit to be greater than 15A
The X3 1600w runs circles around its competitors in almost every category tested bar none. The amazing DC output capacity that immediately draws a user to this power supply comes with one big caveat. While almost no desktop computer would actually push this unit to its limit, once one does only a select few individuals with the correct infrastructure will be able to use all 1600 watts of its DC capacity due to the AC power draw. The Ultra X3 1600w is a stellar 1200 watt power supply that in the right home can be an even more amazing 1600 watt unit.
Come on, be fair; you could plug it into a 30A line instead of your oven/stove or fridge
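To sanity-check the 15A claim in the posts above, here's a rough sketch. The ~80% PSU efficiency and the 120 V US circuit are assumed values, not numbers from the thread:

```python
# Rough wall-current estimate for a PSU delivering a given DC load.
# Assumptions (not from the thread): ~80% PSU efficiency, 120 V US mains.
def wall_current_amps(dc_watts, efficiency=0.80, mains_volts=120.0):
    """AC current drawn at the wall for a given DC output load."""
    ac_watts = dc_watts / efficiency  # the PSU draws more AC than it delivers as DC
    return ac_watts / mains_volts

print(round(wall_current_amps(1600), 1))  # 16.7 A -> over a 15 A circuit
print(round(wall_current_amps(1200), 1))  # 12.5 A -> fine on a 15 A circuit
```

Under those assumptions, the full 1600 W DC rating does exceed a standard 15 A outlet, while treating it as a 1200 W unit stays within it, which is the reviewer's point.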
TDP
5850 - 151w
5870 - 181w
470 - 225w
5970 - 294w
480 - 295w
...Two 5870s or two 480s would give the best performance, but you're looking at 600w TDP for two 480s and 362w TDP for the two 5870s.
[X]eltic;1035471554 said: Considering the huge TDP of 295 watts, a 5-10% performance increase over the Radeon 5870 is a disappointment. And with such an enormous TDP, it will be very difficult to manufacture a dual-GPU card. The GTX 480 will most probably take the single-GPU performance crown, but AMD will still have the fastest card: the Radeon 5970.
That said, if you don't care about power consumption at all, and if the price is right, it's still a nice card. Besides the slight performance increase, it has PhysX support too and in games with a lot of tessellation performance will likely be better than 5-10%.
Can we expect an in-depth review from [H]ard on the 26th?
Like most people, I only play games about 2 hours per day; the rest of the time the computer is in 2D mode. That's about 60 hours per month of full-load TDP on the GPU. Multiply that by the difference in TDP between 2x 480s and 2x 5870s: 60 hrs * (600 W - 362 W) = 14.3 kWh, which only costs $5.72 even in power-expensive states like California.
I am more interested in the TDP difference in 2D mode.
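The cost arithmetic in the post above can be reproduced in a few lines. The ~$0.40/kWh rate is an assumption back-derived from the $5.72 figure, not a number stated in the thread:

```python
# Reproducing the poster's back-of-the-envelope monthly cost estimate.
hours_per_month = 2 * 30   # ~2 h of gaming per day
tdp_pair_480 = 600         # two GTX 480s, watts (poster's figure)
tdp_pair_5870 = 362        # two HD 5870s, watts (poster's figure)

extra_kwh = hours_per_month * (tdp_pair_480 - tdp_pair_5870) / 1000
rate_per_kwh = 0.40        # assumed rate implied by the post's $5.72

print(round(extra_kwh, 2))                  # 14.28 kWh/month
print(round(extra_kwh * rate_per_kwh, 2))   # 5.71 (the post's $5.72 uses the rounded 14.3 kWh)
```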
Scary that you have to ask, since most of you act like you were experts in all things Nvidia....
Ever heard of CUDA, PhysX, and that little thing called 3D Vision (the world is finally going 3D, in case you haven't noticed)?
And yes, now you can insert the usual silly comments on how those are useless, gimmicks, etc.... as anything your card does not have MUST be a gimmick, right?
How about with the new 150/300 clocks introduced in 10.2/10.3?
Looking forward to it, Kyle. I have been on the fence wanting to buy a 5870 for a few months now in anticipation of this new Nvidia offering. I'll finally get to make a move on which way to go once your review(s) are out. Not a fanboy of either, personally; I always go for the best bang for the buck, and my 8800 GTS 512 is starting to show its grey whiskers.
P.S. Kyle, looks like your sources were pretty spot on from last December. Credit where it's due, for sure.
3D Vision? Yeah, ok... you got me there, lol. 3D Vision is like the Virtual Boy of the past: cool idea, but it never made it mainstream.
You must be living in an alternate world amigo.
Look around you, 3D movies are being produced like never before... heard of a little movie called Avatar yet?
3D-ready TV sets are coming from all the major brands.
The Blu-ray 3D format.
Even consoles are going 3D!
Yeah, it's all like that crappy Nintendo system. Heck, even the Sega Master System was better than that. So next time you try to make a point, try not to make a fool of yourself... too many people are watching.
5870 uses 118W @ load? I think that number is wrong. The 5870 is power efficient compared to its competitors, but I don't think it's only drawing 118W @ load.
[X]eltic;1035471788 said: Other things need to be rendered, yes, but since tessellation is supposedly one of the strengths of the new nvidia architecture, it should make a difference. If performance is already 5-10% better in applications with normal amounts of tessellation, performance will only increase in tessellation-heavy applications. Take the Heaven benchmark, for example: in tessellation-heavy parts of the benchmark, the performance difference between Fermi and the 5870 is increased. That's something that we will probably see in games with heavy tessellation as well. It's true that it remains to be seen if developers will take advantage of it, but devs take advantage of PhysX too (= nvidia only), so it's not at all unlikely.

That is highly unlikely. In games with heavy tessellation there are still other things that need rendering, and the GTX 480 is still only 10% faster in that category. The only time it might be a factor is in benchmarks; no dev is going to waste time on a feature that 1% of their client base might use.
What really bothers me about increased power consumption is increased noise generation. To a certain extent, I couldn't care less about electricity bills, but I don't want to have a leaf blower in my computer case. I want my hardware cool and quiet.

The GTX 470 will most likely end up being the better seller, as the price point is actually pretty good and the power usage is nothing to be bitching about. As most of you seem to be doing. Pussies lol.
The number is correct. There is, of course, some process variation from card to card (up to 20%), but the 5870 power consumption figure of 118w is right on the money.
Take this review from Xbit Labs, a site known for their excellent power consumption measurements.
http://www.xbitlabs.com/articles/video/display/radeon-hd5870_7.html#sect0
Regular load = 107w average
Furmark load = 161w
Then compare to this TechPowerUp review, where they use the exact same methodology to capture GPU-only power consumption:
http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/28.html
Regular load = 122w average, 144w instantaneous peak
Furmark load = 212w
There are obvious variations from one card to another, and although the 5870s tested do not exceed the stated TDP (outside of Furmark), some cards will consume significantly more or less than others. Given that, 118w at load sounds like a perfectly reasonable number for the 5870.
I expect that the GTX 480 will also have significantly lower real-world power consumption than its TDP implies, but how much lower we won't know until the real benchmarks are released.
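Taking the "up to 20%" card-to-card variation in the post above at face value, the implied spread around the 118w figure works out like this (a quick sketch of the poster's claim, not measured data):

```python
# Spread implied by "up to 20%" card-to-card variation around a 118 W nominal draw.
nominal_w = 118
variation = 0.20  # the poster's upper bound on process variation

low = nominal_w * (1 - variation)
high = nominal_w * (1 + variation)
print(round(low, 1), round(high, 1))  # 94.4 141.6
```

Both review averages cited above (Xbit's 107w and TechPowerUp's 122w) fall inside that 94-142w band, which supports treating 118w as a reasonable midpoint.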
Logic is not welcome around these parts, from what I see of most of the fans posting in other threads. You are correct, however.
So yeah, I am getting Fermi, and I hope they come out cheap and soon.
[X]eltic;1035471788 said: Other things need to be rendered, yes, but since tessellation is supposedly one of the strengths of the new nvidia architecture, it should make a difference. If performance is already 5-10% better in applications with normal amounts of tessellation, performance will only increase in tessellation-heavy applications. Take the Heaven benchmark, for example: in tessellation-heavy parts of the benchmark, the performance difference between Fermi and the 5870 is increased. That's something that we will probably see in games with heavy tessellation as well. It's true that it remains to be seen if developers will take advantage of it, but devs take advantage of PhysX too (= nvidia only), so it's not at all unlikely.
What really bothers me about increased power consumption is increased noise generation. To a certain extent, I couldn't care less about electricity bills, but I don't want to have a leaf blower in my computer case. I want my hardware cool and quiet.
Lol, that's coming from a guy whose logic is riddled with fallacies...
Rut Ruh, now I will have to rethink getting a 480 at launch and wait for the "B" version. Kyle's "rumors" are usually spot on.
But well, both the 480 and 470 have the same number of ROPs and texture units as the GTX 280, just more shader power.
ATI doubled up EVERYthing and still managed to keep wattage down.
They'll probably sell you the B revision for 10% more, lol.
Well, as has been said before, ATI is going to have 3D support as well. Not sure what your point is.
Some developers even took advantage of obscure things like X-RAM, so it's not unlikely at all. In fact, in this case it's more likely. Don't forget that tessellation is not a new feature; it's being coded for already, nvidia is merely better at handling higher amounts of tessellation. What we will probably see is an in-game slider with options like 'tessellation: low - normal - high - extra high' or something similar, with AMD not being able to handle the higher amounts of tessellation without a huge performance hit.

I don't think you understand how it works.
In software development there are things called features; you don't spend money on features that no one can or will use.
So spending extra time to tessellate so heavily just so that people with a GTX 480 can play it is not going to happen. Instead, there will be a medium setting where all cards from mainstream to high end can at least play with it.
And from what I understand, the more tessellation the 480 is doing, the less rendering power it has, as it dedicates more resources to tessellation.
BTW, ATI had TruForm in the 8500
I think my money's still going to go to SLI GTX 470/480 once prices drop. What i5 chipset do I need to support both CrossFire and SLI?