NVIDIA GeForce GTX 570 Video Card Review @ [H]

@vengeance

Nope, not CoD4. I don't Fraps every game, but I recall Oblivion being able to double my frame rate, give or take. Going from 15 to 30 fps was a big deal at the time. Also, 95% or 100% scaling could be only a frame or so of difference, so it's a moot point (at a 15 fps base, for example, 95% vs. 100% scaling is less than one frame apart).

I am playing one right now which doubles my fps. So I know it is possible; whether it happens often, maybe not, but it does happen.
 
@vengeance

Nope, not CoD4. I don't Fraps every game, but I recall Oblivion being able to double my frame rate, give or take. Going from 15 to 30 fps was a big deal at the time. Also, 95% or 100% scaling could be only a frame or so of difference, so it's a moot point (at a 15 fps base, for example, 95% vs. 100% scaling is less than one frame apart).

I am playing one right now which doubles my fps. So I know it is possible; whether it happens often, maybe not, but it does happen.

Right... not a single review in existence on the internet agrees with your statement, but your eyeballing of Fraps says you're getting exactly double. :rolleyes:
 
Yes, I can look at Fraps, disable CrossFire, then look at Fraps again. Same scene, and I'm getting double the fps. I even decided to fire up Risen just now and test it. I haven't done any great amount of testing, but a quick check showing 40 fps without CrossFire and 90 with shows great scaling, don't you think?

I can't say that average fps will show it, as it would be foolish to judge it like that for obvious reasons. However, I can say that I have noticed games which show a 2x difference in terms of FPS.
 
Does the GTX 570 have PAP (a protected audio path)? This is the only reason I'm still hanging onto my 5870...

I don't think so (I don't think Nvidia offers that at all, now that I think about it); this is an area that AMD has been ahead in. Still, do you need it that much?
 
You must have missed the part where I said to undervolt the card. Only a 10% undervolt would be required to match the 122% difference to the 5870.

Not if you are to match both cards. I do assume you plan to achieve the best possible performance/wattage ratio on both cards before doing a comparison? As you can see from the charts, unmodified, the 570 isn't very power efficient vs. the 6870, which [H] used in their review. (The 570 is a good card and seems much better than the previous generation, but that was not my point.) :)
 
Why is this card's price compared to the GTX 470? It performs like the GTX 480, is cheaper, and uses less power and produces less heat, yet people still complain? And lol @ the HD 6870. If you actually looked at the review, it lost in every game. Every single one. "OC'ed it beats it" is again an awesome attempt at trolling. OC the GTX 570 and you're already out the door again, destroying the 6870, but still, I really liked the attempt.

And if you complain about power then don't ask for a high end card from Nvidia. More performance = more power usage.
 
I don't think so (I don't think Nvidia offers that at all, now that I think about it); this is an area that AMD has been ahead in. Still, do you need it that much?

Yup... my gaming rig is also used for HTPC duties. Gotta have my Blu-ray lossless codecs bitstreamed. :D
 
Why is this card's price compared to the GTX 470? It performs like the GTX 480, is cheaper, and uses less power and produces less heat, yet people still complain? And lol @ the HD 6870. If you actually looked at the review, it lost in every game. Every single one. "OC'ed it beats it" is again an awesome attempt at trolling. OC the GTX 570 and you're already out the door again, destroying the 6870, but still, I really liked the attempt.

And if you complain about power then don't ask for a high end card from Nvidia. More performance = more power usage.

The 480-core GTX 570 IS like a GTX 480, only cheaper, with less power draw and less heat (and most importantly, less noise!).

The GTX 570 is not going up against the 6870 in the end, but the 6950 I believe, while the GTX 580 goes up against the 6970.

Do you feel that [H] shouldn't measure the power usage, or do you feel that people shouldn't comment on the results?
 
Not if you are to match both cards. I do assume you plan to achieve the best possible performance/wattage ratio on both cards before doing a comparison? As you can see from the charts, unmodified, the 570 isn't very power efficient vs. the 6870, which [H] used in their review. (The 570 is a good card and seems much better than the previous generation, but that was not my point.) :)

My point was: what would the power consumption be if the 570 were only trying to equal the performance of the 6870? The 570 is ~20% faster than the 6870 in the review. Using that number, we reduce the power consumption of the 570 (264W) by 20%, to roughly 214W, by reducing the shader clocks by 20%. However, with the shader clocks reduced we can also reduce the voltage. To obtain the same power draw as the 6870, the voltage would only need to be lowered by about 13%. Reports from around the web are putting 580s with 965 MHz cores at a ~14% vcore increase, so this seems reasonable.
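For anyone who wants to poke at the arithmetic, here is a minimal sketch of that estimate, assuming dynamic power scales roughly with frequency times voltage squared and ignoring static/leakage draw. It's a ballpark of the reasoning above, not a prediction of what a real card would do:

[code]
# Rough back-of-the-envelope sketch of the undervolt estimate above.
# Assumes dynamic power ~ frequency * voltage^2 and ignores static/leakage
# draw, so treat the output as a ballpark only.

GTX570_POWER_W = 264.0   # GTX 570 load draw from the [H] review
HD6870_POWER_W = 168.0   # HD 6870 load draw from the [H] review
PERF_LEAD = 0.20         # GTX 570 ~20% faster than the 6870 in the review

# Step 1: drop clocks ~20% so performance roughly matches the 6870.
clock_scale = 1.0 - PERF_LEAD
power_after_downclock = GTX570_POWER_W * clock_scale   # ~211 W

# Step 2: voltage scale needed to land on the 6870's draw,
# using P ~ f * V^2  =>  V_scale = sqrt(target / power_after_downclock).
voltage_scale = (HD6870_POWER_W / power_after_downclock) ** 0.5

print(f"Power after 20% downclock: {power_after_downclock:.0f} W")
print(f"Voltage reduction needed:  {(1 - voltage_scale) * 100:.0f}%")
# Prints ~211 W and ~11%, in the same ballpark as the 214W / 13% figures
# above; the exact numbers depend on the scaling model used.
[/code]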
 
My point was: what would the power consumption be if the 570 were only trying to equal the performance of the 6870? The 570 is ~20% faster than the 6870 in the review. Using that number, we reduce the power consumption of the 570 (264W) by 20%, to roughly 214W, by reducing the shader clocks by 20%. However, with the shader clocks reduced we can also reduce the voltage. To obtain the same power draw as the 6870, the voltage would only need to be lowered by about 13%. Reports from around the web are putting 580s with 965 MHz cores at a ~14% vcore increase, so this seems reasonable.

Yeah, but then it becomes a bit like saying the 570 is equal to the 6870 if we just overclock the 6870 a bit... :) You need to do the reduction on both cards and find the performance/watt sweet spot before making a comparison if you want to go out of spec. Otherwise it's not apples to apples. I have no doubt that the 6870 has some overhead and can lower its voltage even further.
The 500 series are great cards (both [H] reviews have shown that) and I wouldn't mind getting one if the 6970 doesn't perform well. Noise was my big gripe with the 400 series, and Nvidia fixed that. But they are still not efficient when it comes to performance/watt.
 
Agreed, this looks like another fail in the GTX series from what I see. A $110 higher price for 10% higher fps.

uninstall nvidia.
 
Agreed, this looks like another fail in the GTX series from what I see. A $110 higher price for 10% higher fps.

uninstall nvidia.

:confused: The GTX 580 and 570 perform well and seem to be very solid cards. I see them as wins compared to the previous cards. [H] didn't give them an Editor's Choice Gold award for nothing.

Once you go above the mainstream, you start paying more and more per % of performance. Nothing new there. On value alone, there is no point in buying anything high-end, but it improves the gaming experience, which has a value in itself (and is kinda the point of buying GFX cards for many people to begin with). :)
 
"NVIDIA has equipped the GeForce GTX 570 with 480 CUDA Cores. The GeForce GTX 480 also has 480 CUDA Cores. The new GeForce GTX 580 has 512 CUDA Cores, while the GeForce GTX 470 has 448 CUDA Cores. We can see that the new GTX 570 matches the GTX 480, at least in terms of shader processing potential.":rolleyes:
 
The price is right, the performance is right, and the timing is right to get them right here before Christmas!

[H]ard fail. While the 570 is a good card, it's at the very least what the 480 should have been all along. All we have here is 480 performance, 9 months later, with lower power draw and thermal profile. But it has a 10-15% lead over a card that costs more than a third less! How is that a deal?

Not to mention, the timing is very, very wrong to buy one of these. In a week we will see AMD's 6900 series, which should deliver some great performance per dollar or, at the very least, drive some competition. It is very likely that NV will need to do a price drop very soon. Anyone buying these at full price is clearly just throwing their money out the window.
 
I don't get your point, funkydmunky. It's just saying that the 570 and 480 have the same CUDA core count, yeah?
 
Yeah, but then it becomes a bit like saying the 570 is equal to the 6870 if we just overclock the 6870 a bit... :) You need to do the reduction on both cards and find the performance/watt sweet spot before making a comparison if you want to go out of spec. Otherwise it's not apples to apples. I have no doubt that the 6870 has some overhead and can lower its voltage even further.
Well, since the comparison is about the architectures and not the cards, I don't see the specs being important. I suspect the voltages on the AMD cards are as low as they can safely go in a production environment; this of course means you can find golden cards that will do it at a lower voltage.

The 500 series are great cards (both [H] reviews have shown that) and I wouldn't mind getting one if the 6970 doesn't perform well. Noise was my big gripe with the 400 series, and Nvidia fixed that. But they are still not efficient when it comes to performance/watt.
We'll see how the 6970 comes out and if it can actually challenge the 580 in terms of head-to-head performance; when they do, I'll start caring more about performance/watt.
 
Agreed, this looks like another fail in the GTX series from what I see. A $110 higher price for 10% higher fps.

uninstall nvidia.

I think that is a bit harsh. If you remember, [H] didn't give much in the way of an award to the 480GTX. As value per dollar, yeah, they suck ass, but that is compared to the midrange right now, and that is a highly competitive market. With the 570 and the 580, Nvidia has finally gained enough distance from the 6870 / 5870 to make a real difference. The 480GTX was embarrassing compared to the 5870: only slightly better real-world gameplay, hot and loud, and expensive. This rectifies some of those issues, especially the noise. Was it worth a gold award? Maybe. I would not have gone that far (due only to the price above the midrange), but the 570GTX is clearly a product in a different class than a 470GTX, or for that matter a 480GTX.
 
And the trolls come out on page 5; damn, I expected them sooner! The GTX 500 series are a win over their last cards: better power draw, less heat, more performance per dollar, hard launches on both, and they beat the competition to market. My hands are getting tired; I'll add more bullet points later. :p
 
Well, since the comparison is about the architectures and not the cards, I don't see the specs being important. I suspect the voltages on the AMD cards are as low as they can safely go in a production environment; this of course means you can find golden cards that will do it at a lower voltage.

We'll see how the 6970 comes out and if it can actually challenge the 580 in terms of head-to-head performance; when they do, I'll start caring more about performance/watt.

Specs are important, and tuning as well, since you are trying to find performance per watt. All voltages have overhead. That's due to different things like leakage. This is why you need to adjust both. If Nvidia could have gotten away with using less voltage without any risk of high-leakage parts failing, I bet they would. :)

The 6970 is a new architecture, so it's hard to say if it will have the same performance/watt as Barts or be closer to the 5xx series from Nvidia.

When it comes to challenging the 580 in terms of performance and performance/watt, you don't need to look further than the 5970, which has better performance/watt than both the 570 and the 580. It will be interesting to see if they can do the same with Cayman.
 
Was it worth a gold award? Maybe. I would not have gone that far (due only to the price above the midrange), but the 570GTX is clearly a product in a different class than a 470GTX, or for that matter a 480GTX.

I think one of the considerations on price for the GTX 570 + GTX 580 atm is, as you mentioned, the distance nVidia has established between these cards and the mid-range. While annoying, nVidia is just doing what any smart business would do right now and is exploiting that limited time window before proper competition rolls in. So unless AMD comes in with identical price/performance, I'd expect to see GTX 580 prices drop, or at a bare minimum stabilize, and GTX 570 prices inch down a bit (they seem to be coming in pretty close to MSRP to start with, so that's a good sign) within a week or two of the HD 69xx cards.

As to how that translates into the GTX 570's award... my continuing interpretation is that those awards are supposed to be, hmh, not necessarily timeless, but they should extend beyond the span of a week and that week's pricing situation. At the same time, looking into the future on pricing is risky, so ultimately the MSRP at review time still needs to make sense.

Where [H] chose to go was simply that the GTX 570 is a successful refresh: it comes in at the same price point its predecessor launched at, it's cheaper than the card whose performance it mimics, and it improves power consumption, sound, and heat to levels acceptable to most people. When the market is as turbulent as it is right now, those are as good criteria as any to judge a card on.
 
Specs are important, and tuning as well, since you are trying to find performance per watt. All voltages have overhead. That's due to different things like leakage. This is why you need to adjust both. If Nvidia could have gotten away with using less voltage without any risk of high-leakage parts failing, I bet they would. :)

The 6970 is a new architecture, so it's hard to say if it will have the same performance/watt as Barts or be closer to the 5xx series from Nvidia.

When it comes to challenging the 580 in terms of performance and performance/watt, you don't need to look further than the 5970, which has better performance/watt than both the 570 and the 580. It will be interesting to see if they can do the same with Cayman.

It's a two-edged sword. Nvidia's tuning algorithm puts less weight on power consumption and more on raw performance. I'm simply saying that if you apply the same tuning to both architectures, you're likely to see the performance per watt come out significantly better for the 500 series.

As for the 5970, I'd suggest looking at a pair of 460s, which is comparable to what the 5970 really is.
 
And the trolls come out on page 5; damn, I expected them sooner! The GTX 500 series are a win over their last cards: better power draw, less heat, more performance per dollar, hard launches on both, and they beat the competition to market. My hands are getting tired; I'll add more bullet points later. :p

Pretty much. I find their weeaboo whining to be hilarious. They're like my very own personal jesters.
 
So HD 6870 is within 15-20% of the performance of GTX 570 and 25-30% of GTX 580.

Since HD 6970 will be 75% more powerful than HD 6870 - isn't anyone else scared of the lack of competition if AMD wins hands down?


It may be 75% more powerful spec-wise, but that never translates to 75% more performance.
 
It's a two-edged sword. Nvidia's tuning algorithm puts less weight on power consumption and more on raw performance. I'm simply saying that if you apply the same tuning to both architectures, you're likely to see the performance per watt come out significantly better for the 500 series.

As for the 5970, I'd suggest looking at a pair of 460s, which is comparable to what the 5970 really is.

That, and the performance per watt is lower on that one due to the card being downclocked for power reasons. If you bump the clocks and voltage up to stock 5870 levels, I think you would see that difference disappear despite the improved performance. Having to run two sets of everything would kill it. AMD has a huge lead over Nvidia here, but not double.
 
I noticed [H] is one of the few places to use 10.10e on the ATI cards in their tests. Hell, one site used 10.9 on the 6870/50. Stuff like that is why this is one of the few sites that I trust.

I have to admit that this seems like an impressive card for $350, and it draws less juice than a GTX 470.
 
It's a two-edged sword. Nvidia's tuning algorithm puts less weight on power consumption and more on raw performance. I'm simply saying that if you apply the same tuning to both architectures, you're likely to see the performance per watt come out significantly better for the 500 series.

As for the 5970, I'd suggest looking at a pair of 460s, which is comparable to what the 5970 really is.

The 6870 is 255mm2, while the GTX 570 is 550mm2 (but has one SM disabled). This already makes a large difference in power consumption. The 6870 is built for a smaller footprint to begin with. There is no way you can achieve similar performance/watt to the 6870 by downclocking and undervolting the 570.

According to the [H] review, the GTX 570 had a power consumption of 264W, while the 6870 had a power consumption of 168W. You would then have to shave 96W (about 36%) of the power usage off the 570 while maintaining performance at 6870 levels. This on a GPU that is already larger and more power hungry to begin with.

You are saying that Nvidia's tuning algorithm puts less weight on power and more on raw performance. I'm saying that Nvidia's GPU puts less weight on power and more on raw performance. The 6870 does more with less size and less power (not in speed, but in efficiency).
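To put some numbers on that, here is a quick perf-per-watt check using only the review's load-power figures and the ~20% performance lead quoted earlier; it says nothing about what either chip could do after tuning:

[code]
# Quick check of the figures in the post above, using the [H] review's
# load-power numbers and the ~20% performance lead as given.

GTX570 = {"relative_perf": 1.20, "power_w": 264.0}
HD6870 = {"relative_perf": 1.00, "power_w": 168.0}

def perf_per_watt(card):
    return card["relative_perf"] / card["power_w"]

shave_w = GTX570["power_w"] - HD6870["power_w"]
shave_pct = shave_w / GTX570["power_w"] * 100

print(f"Power to shave off the 570: {shave_w:.0f} W ({shave_pct:.0f}% of its draw)")
print(f"570 perf/W relative to 6870: "
      f"{perf_per_watt(GTX570) / perf_per_watt(HD6870):.2f}x")
# Prints ~96 W (36%) and ~0.76x, i.e. on these numbers the stock 6870
# delivers roughly 30% more performance per watt than the stock 570.
[/code]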
 
Man, reading this review has got my upgrade bug going again...

Good review, [H]. I am interested in what AMD has to offer soon with their 6900 series, but I dread AMD drivers. I'll most likely go with Nvidia unless AMD's new and not-yet-released offerings just completely crush the latest GTXs out of the ballpark.
 
Ok... usually this stuff comes out, but I didn't notice it this time.

The conclusion: 'best card for 350 bucks!' Gold Award? Hands down? What??

Isn't the 5870 2GB card $350? Isn't that card faster in most benchmarks?

Seems odd that [H] placed a $350 part against a $240 part and concluded the $350 one was better... that's unlike HardOCP, what gives? In truth, it seems like it's about time, Nvidia! Finally a power and heat spec that's reasonable...
 
Ok... usually this stuff comes out, but I didn't notice it this time.

The conclusion: 'best card for 350 bucks!' Gold Award? Hands down? What??

Isn't the 5870 2GB card $350? Isn't that card faster in most benchmarks?

Seems odd that [H] placed a $350 part against a $240 part and concluded the $350 one was better... that's unlike HardOCP, what gives? In truth, it seems like it's about time, Nvidia! Finally a power and heat spec that's reasonable...

It doesn't seem like you're trolling, but I'll give you a different perspective. Yes, you can find a 5870 around that price, but like a GTX 480, the GTX 570 outperforms it. The only time a 570 would lose is when it gets memory limited, which is rare until you get into a multi-card Surround scenario. We all know about the 5870 in multi-card... not that great. The 570 is not here to compete with a soon-to-be-EOL card; it's here to compete with the 6950 that comes in about 8 days. Let's not forget about the tessellation performance, enhanced FP16, and other features this card brings to the table, and it's even more future-proof than a 5870 2GB.
 
Not impressed. Cayman will blow the 580/570 out of the water. My overclocked 6870 can just about equal this card. Meh.


The GTX 570 with a nice OC will destroy your OC'd HD 6870.

I'd like to see some real Cayman reviews for comparison. Right now NVIDIA is delivering; Cayman is vaporware at this point in time.


One thing I noticed is that AMD released the 6800 series to compete against the GTX 460 and GTX 470. In that battle, based on pricing and performance, NVIDIA won IMO.

With the GTX 570 and a possible HD 6900 series being released, I think NVIDIA is prepared and ready to beat AMD on pricing. I think the GTX 580 and GTX 570 will be the hot-deal cards. I don't see AMD's HD 6900 series prices budging, based on what has recently happened with the release of the 5000 series and 6000 series.
 
I'd buy a GTX 570 and OC the heck out of it before I'd drop cash on a 580.

I wouldn't buy a GTX 570 before I saw the price vs. performance and feature set of an OC'ed HD 6950, though.

We will be able to judge soon.

"Our final take on the GTX 570 will have to remain a work in progress for the same reason. Stay tuned to this same channel for the next episode of GPU Wars, when the truth about the 2010 crop of graphics chips will finally be revealed."

Source:
http://techreport.com/articles.x/20088/14
 
What do you mean? It's been improved drastically on the 570/580 compared to the 470/480. In this very article Brent mentioned that the GTX 570 has nearly the same power draw as a 6870 at idle. When you game there is more power draw, but that is expected on a higher-end card, no? Let's see how the 570/580 compare in power draw to the 6950/6970 before we draw our conclusions.

I've never been a 'ride it out with a cheap power supply' kind of guy, so I guess this doesn't matter, and an extra $0.50-0.75 a month in electricity is nothing to me.

I make a point of not wasting energy relative to my needs. I could have 100-watt light bulbs in every socket in my house, but if a 50-watter is enough for me to see, I don't see the value in wasting energy.

Unless I was pushing a massive number of pixels, I don't see the need for a power-hungry video card. The difference between 4xAA and 16xAA really doesn't matter to me enough to justify the wasted power.

It also has been Nvidia's brute-force, efficiency-be-damned approach for years. I don't see the sense in it. If I could drive a car that had a V6 and it did everything I ever needed it to do, then I don't see the value in a V8.

I see why people would want the product, I just wish Nvidia took a more elegant approach than they do. Their CEO has said they are a software company, not a hardware company, and I just feel they prove that with power-hungry cards and chipsets, because apparently they don't know how to do it better.

I would pay more for a top-tier performer if it were more energy efficient. Appliances work that way; I don't know why that business model hasn't permeated graphics.

But, whatever, different strokes and all that.
 
I make a point of not wasting energy relative to my needs. I could have 100-watt light bulbs in every socket in my house, but if a 50-watter is enough for me to see, I don't see the value in wasting energy.

Unless I was pushing a massive number of pixels, I don't see the need for a power-hungry video card. The difference between 4xAA and 16xAA really doesn't matter to me enough to justify the wasted power.

It also has been Nvidia's brute-force, efficiency-be-damned approach for years. I don't see the sense in it. If I could drive a car that had a V6 and it did everything I ever needed it to do, then I don't see the value in a V8.

I see why people would want the product, I just wish Nvidia took a more elegant approach than they do. Their CEO has said they are a software company, not a hardware company, and I just feel they prove that with power-hungry cards and chipsets, because apparently they don't know how to do it better.

I would pay more for a top-tier performer if it were more energy efficient. Appliances work that way; I don't know why that business model hasn't permeated graphics.

But, whatever, different strokes and all that.

I see what you mean. My point is that we don't know the power draw of the 6950/6970; they may be the same as or more than the GTX 570/580, as it looks like AMD abandoned their sweet-spot strategy this time. The 6900 Cayman chips look to be big chips with good performance. What defines power hungry, exactly? When we're talking about a ~$0.75-a-month difference in electricity, it becomes a moot point. It only consumes more when you're gaming, which is a small % of the time it'll be in use. Then compare it to the unknown 6900 series, which will draw more than the 6800 series for sure. They don't seem as power hungry as one would think for this level of performance.
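For what it's worth, here is a rough sketch of where a number like that comes from. Only the ~96W load-power gap is from the [H] review; the gaming hours and electricity rate are assumptions I picked, so plug in your own:

[code]
# Rough monthly-cost estimate for the load-power gap. The 96 W figure is
# the GTX 570 vs. HD 6870 difference from the [H] review; the hours and
# the $/kWh rate below are assumptions, not numbers from the article.

POWER_GAP_W = 264 - 168          # ~96 W difference under gaming load
GAMING_HOURS_PER_MONTH = 60      # assumed ~2 hours of gaming per day
RATE_PER_KWH = 0.12              # assumed electricity rate in $/kWh

extra_kwh = POWER_GAP_W / 1000 * GAMING_HOURS_PER_MONTH
extra_cost = extra_kwh * RATE_PER_KWH
print(f"Extra energy: {extra_kwh:.2f} kWh/month -> about ${extra_cost:.2f}/month")
# ~5.76 kWh and ~$0.69/month with these assumptions, in line with the
# ~$0.75/month point above.
[/code]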
 
Guys, get real here. Half of you Americans worried about power consumption probably drive SUVs, and often we are talking numbers like 10 watts at idle and maybe a 30-watt load difference between ATI and Nvidia. That difference is TINY when thinking about power bills $$.

All that matters is that heat and noise are under control. It pays to keep power usage in perspective! The GTX 480 probably fails this for many of us, but the GTX 580/570 passes.

Where do you think heat comes from and goes? 480 fails, but 580 passes? They'll heat up your room about the same.
 
I have to admit, the GTX 570 is a winner. It offers nearly identical performance to a GTX 480 with 50 watts less power consumption and lower noise/heat. There's really nothing not to like about this card at $349. Better get yours while supplies last. All I'm waiting for now is for the 6900 series to come out so we can have fair performance reviews. Comparing a $350 MSRP card to a $240 card isn't exactly fair.
 
Not impressed. Cayman will blow the 580/570 out of the water. My overclocked 6870 can just about equal this card. Meh.

It is going to blow it away in a big, big way. My i7 970 and dual Intel SSD 160 drives for RAID are waiting. I just wish AMD's drivers were as good as Nvidia's.
 
It is going to blow it away in a big, big way. My i7 970 and dual Intel SSD 160 drives for RAID are waiting. I just wish AMD's drivers were as good as Nvidia's.

Time will tell, but I'd think that is probably one big too many.
 
The 6870 is 255mm2, while the GTX 570 is 550mm2 (but has one SM disabled). This already makes a large difference in power consumption. The 6870 is built for a smaller footprint to begin with. There is no way you can achieve similar performance/watt to the 6870 by downclocking and undervolting the 570.

According to the [H] review, the GTX 570 had a power consumption of 264W, while the 6870 had a power consumption of 168W. You would then have to shave 96W (about 36%) of the power usage off the 570 while maintaining performance at 6870 levels. This on a GPU that is already larger and more power hungry to begin with.

You are saying that Nvidia's tuning algorithm puts less weight on power and more on raw performance. I'm saying that Nvidia's GPU puts less weight on power and more on raw performance. The 6870 does more with less size and less power (not in speed, but in efficiency).

I've run the math; if you'd like to dispute the math, please do so, but vague numbers will get us nowhere.
 