GPU Performance Chart

Sotiri
Joined Feb 5, 2011
I think you'll all agree that one of the more difficult tasks in recommending a build/upgrade is GPU selection. A discussion on this topic can last hours and more often ends with the other person either totally confused or just staring at me with a glazed-over, deer-in-headlights look. To help myself, and hopefully you guys too, I've put together the following chart as a baseline for GPU selection. Of course this can't possibly address every application: the curves will move far to the right if you're comparing against Crysis, but very far to the left if the user just plays FarmVille.

This is just a first draft. I'm basing my results on an average of my personal impressions and would like to get your input. Please keep in mind this is meant to be just a general overview. If you believe I've misrepresented anything, please provide specifics and I will update the chart as necessary. Hopefully we end up with something most of us can agree on and make good use of.

Thanks!

Rev4
gpuperformancechartrev4.jpg


Previous:
http://img23.imageshack.us/img23/3074/gpuperformancechartrev3.jpg
http://img607.imageshack.us/img607/8443/gpuperformancechartrev2.jpg
http://img818.imageshack.us/img818/1065/gpuperformancechartrev1.jpg
http://img814.imageshack.us/img814/7929/gpuperformancechart.jpg
 
Honestly, I do not understand what you are measuring...
 
I don't see a chart.

And other sites already do this for you, if you bothered to look.
 
For example, if someone has a 1920x1200 monitor, I can present them with this chart to demonstrate they don't need GTX 580 SLI or HD 6950 Crossfire. Another example is a friend of mine who has two GTX 460s and thinks he can just buy two more 1920x1200 monitors. He might be able to pull it off, but I think he'll need to lower the settings way below what I consider feasible for such an investment.
 
I don't see a chart.

And other sites already do this for you, if you bothered to look.

If this information is already consolidated and presented in this way, please post the link; you'll be saving me a lot of trouble.
Tom's Hardware has a hierarchy chart, but it doesn't really help someone who doesn't understand the performance of each card.
 
The chart is beyond confusing. Why is 1920x1200 listed twice? What determines ultra/high/default? One person's playable is another person's unplayable.
 
There are too many factors being left out for this chart to be taken seriously.
 
There are too many factors being left out for this chart to be taken seriously.

Exactly... like what CPU the person is using. Sticking a pair of 580s behind an E6600 will yield different results than the same cards behind a 2600K.

There is no one-size-fits-all for GPU recommendations.
 
I assume the "3" means a three-monitor setup?

If I read your chart right I would say I pretty much agree with how you have pegged the cards. It's a good baseline.

One suggestion, though: put a little more info into the chart, like what the "3" in front of the resolutions means.

Other than that it's a decent chart that tells more than just the hierarchy charts.
 
The chart is beyond confusing.

That's where I need your help. Information is always legible to the person presenting it, but not often understandable by the end user. Suggestions for improving the chart are welcome.

...why is 1920x1200 listed twice?

Single 1920x1200 panel vs. 3 panels.

...what determines ultra/high/default? One person's playable is another person's unplayable.

You are absolutely correct, but this chart is intended to help people who don't understand that. However, when people ask my opinion on a new 2560x1600 monitor, I tell them at a minimum to get either an HD 6950 or GTX 570. Of course, we all know it's not that easy, but I think it's a good start. The last thing I want is for someone to come back to me because their games are choppy, so I think my choices are wise. If you have a different opinion, I would sure like to hear it.

I consider Default to be the in-game default settings and Max to be the in-game max settings. Any suggestions on what I should call them?

I use default as a minimum because if I bought new hardware and then had to manually lower game settings, I wouldn't be happy.
 
Took me a few to figure out how this was set up, but I see it now. Might be a decent generalization of GPU abilities.
 
There are too many factors being left out for this chart to be taken seriously.

I would be naive to assume there can be a complete chart to satisfy everyone’s needs. That's not what I'm trying to do. This is just meant as a baseline or a quick reference chart to help point people in the right direction.
 
Exactly... like what CPU the person is using. Sticking a pair of 580s behind an E6600 will yield different results than the same cards behind a 2600K.

I think we all agree on that, but that's a completely different discussion.
 
One suggestion, though: put a little more info into the chart, like what the "3" in front of the resolutions means.

Other than that it's a decent chart that tells more than just the hierarchy charts.

Thanks! I've taken your advice and updated the heading rows. Also, I think the column on the left might be confusing people. I thought it might be helpful to show how much more power is required when you increase resolutions, such as a 78% increase when going from a single 1920x1200 to a single 2560x1600 monitor.
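The math behind that 78% figure is just pixel counting. Here's a quick sketch (the `pixels` helper and resolution names are my own, for illustration only):

```python
# Rough sketch: compare GPU workload by total pixels pushed per frame.

def pixels(width, height, panels=1):
    """Total pixels per frame across all panels."""
    return width * height * panels

single_1920 = pixels(1920, 1200)            # 2,304,000 px
single_2560 = pixels(2560, 1600)            # 4,096,000 px
triple_1920 = pixels(1920, 1200, panels=3)  # 6,912,000 px

# Percentage increase going from a single 1920x1200 to a single 2560x1600:
increase = (single_2560 - single_1920) / single_1920 * 100
print(f"{increase:.0f}% more pixels")  # ~78%
```

The same helper shows why triple-monitor columns sit so much further right: three 1920x1200 panels push three times the pixels of one.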
 
OK, now it makes sense with the updated heading rows. Looks good; the only thing I would change is including 2x GTX 570s/2x 6950s in the 1920x1200 single-display option. The only reason I suggest this is that there are lots of times when one 570 can result in a low minimum fps, causing stutter; two 570s fix this.
 
Thanks, that's exactly the type of input I'm looking for.

Question though, when you say "1 570 can result in a low minimum fps", is that at max game settings? Will the GTX 570 perform well at just below max settings (kinda where I've got it pegged right now)?
 
Took me a few to figure out how this was set up, but I see it now. Might be a decent generalization of GPU abilities.

This is kind of like how BFGTech generalized their GPUs. They stated a 9600GT was good for 1680x1050, 9800GT for 1920x1080, etc. (at the time).

I agree with the others, though: this is almost too general of a chart, and it does in fact leave out many varying factors such as CPU, motherboard, clock speeds, apps being used, etc.

Something like the above chart can't really be good for much outside of marketing a product to an uninformed/inexperienced user who wants to do some gaming.
 
I agree with the others, though: this is almost too general of a chart, and it does in fact leave out many varying factors such as CPU, motherboard, clock speeds, apps being used, etc.

Something like the above chart can't really be good for much outside of marketing a product to an uninformed/inexperienced user who wants to do some gaming.

That's exactly who I'm targeting with this chart. Please read the first line of my OP.

I started this chart a while ago to respond to family & friends, who can't even tell you what the resolution of their monitor is, when they ask me stupid questions like "Will I be OK with a GTX 570, or should I get the 580?" Only to find out they have a 19" 1280x1024 screen. Of course I have the "CPU talk" with them first, but this chart has come in handy when trying to lay out a long-term upgrade path. For example, if they're short on cash now, a 560 Ti will be fine at 1920, and you can later add another one to get up to 1680 Surround, but don't expect too much if you want to add two more 1920s. If that's your goal, you should really be looking at a 570 right now.
 
Personally I think the chart looks good; some may just not be used to seeing it.

You can easily see how, moving down in "speed" of GPU, the monitor requirements go down.

Personally I would flip the chart, though: always start with the lowest at the top and then build up. The way you have it, you start at the top and go down (a little counterintuitive).

Then at a quick glance you can see which one has the most stars and which one has the fewest and match it up that way.
You may also want to do a fill-in style chart:
List your GPUs from low to high (low at the top, high at the bottom) and then low resolution at the left and high resolution at the right,

and then fill in an X for each resolution and setting that video card is good for.

You should end up with something that looks like this, e.g.:

GTX450 X
GTX460 X X
GTX470 X X X
GTX480 X X X X X X


etc., etc.
 
Thanks H-street, I appreciate the input.
I've tried to graph the info like you suggest but I can't get around the gaps between the resolutions. For example, if I show a bar going all the way across to 5760x1200 for a GTX590, it would give the impression you can run 5040x1050 at max settings, but what I'm trying to say is that you'll probably only get good results at high settings. It's these gaps that have left me just plotting single points.
 
I understand the reasons for the chart and fair play to you for attempting to get something like this done. Don't think I am knocking the idea or your work, but the only problem with something like this is that the people you are targeting are the very people who won't understand the chart and will be still asking the same questions. Doesn't matter how simple you make the chart, people's minds go blank when they look at anything technical. They don't want to figure it out, they want someone to tell them which card to buy.

Look at this forum, for instance. You would imagine that most people coming here would be pretty technical and good at finding things out for themselves, yet how many times has it been asked "which card for me?", even though the question has been answered a million times already (OK, slight exaggeration). Sometimes there is a thread with the same question on the same page!

Most people are lazy and don't want to think for themselves. That's why your chart won't work even if you do make the perfect chart.

OK, after saying all that, how hard would it be to put in a minimum CPU value for each level? You already have it split into cards and resolutions, so why not add CPU as well? You wouldn't have to mention a name or type, just, for example, quad core @ 2.8GHz, because those are the numbers on the front of the box. Other than that it's a nice chart, and I'm looking forward to seeing the finished article.
 
Thanks H-street, I appreciate the input.
I've tried to graph the info like you suggest but I can't get around the gaps between the resolutions. For example, if I show a bar going all the way across to 5760x1200 for a GTX590, it would give the impression you can run 5040x1050 at max settings, but what I'm trying to say is that you'll probably only get good results at high settings. It's these gaps that have left me just plotting single points.

No, keep them as X's, so when you get to a setting/resolution you can skip it. Think of all those comparison charts you've seen that show products and supported features; as you scan them you can see the gaps and how far each product goes from left to right.

Also, if you want to keep the "recommended" region, you can color that in green and anything that is overkill for that video card in grey.
 
Interesting chart, and I like where you're going with it. I also agree with H-street's suggestion to flip the chart.

It's simplistic and easy to understand if you are there to talk the person through it, since you will be able to get an understanding of the system the card will be going into before suggesting exactly what they need, so it works in my opinion.
 
Took me a few to figure out how this was set up, but I see it now. Might be a decent generalization of GPU abilities.

This +1. It might need some polishing, but it's a good way to present a bound hierarchy per resolution in easy-to-understand terms to someone who doesn't know squat about the offerings-of-the-day from Nvidia and AMD.
 
Thanks, that's exactly the type of input I'm looking for.

Question though, when you say "1 570 can result in a low minimum fps", is that at max game settings? Will the GTX 570 perform well at just below max settings (kinda where I've got it pegged right now)?

Yeah, where you have it now is fine.
 
...the people you are targeting are the very people who won't understand the chart...

I think you're right, but I also think some people respond to charts pretty well. You can talk numbers all day with some people but as soon as I put up a graph, BAM! Suddenly they get it! It drives me crazy sometimes.

Most people are lazy and don't want to think for themselves. That's why your chart won't work even if you do make the perfect chart.

LOL! Yeah, unfortunately they will still require some hand-holding, but hopefully it will cut my time way down.

...how hard would it be to put in a minimum CPU value for each level?

I've thought about it, but just GPU vs. resolution has turned out to be a big enough task. Since I intend this chart to be used only by us, I have to assume we've already tried to convince our friends/family to at least upgrade to a decent system.
 
Thanks everyone for your input!
I've taken some of your advice and flipped the chart top to bottom and left to right.
I tried it with X's but it looked like crap. I think color looks better.
I've also removed a couple ATI cards no longer available on Newegg.
Plus, I've moved the GTX 580 down a notch to put it behind 2x 560 Tis/6870s. For starters, you can't do Surround with only one 580, and from what I can tell the SLI/CF combos do have a slight advantage from the extra memory when trying to push 2560x1600.
 
You've left out the 1920x1080 resolution, which is probably the most common monitor size currently sold.

According to the chart a single 6970 is not a good solution for triple monitors.
I run all my games at 5040x1050 resolution on a single unlocked 2 Gig 6950.
I might have to turn down a few settings for a game like Metro 2033 but it still looks good.
Also I can run less demanding games like Left 4 Dead 2 at maximum quality.

Is this chart supposed to represent only the game that is the worst case scenario for performance?
 
I like the effort you put into this; though the chart is intuitive to the average computer builder, it's also a good quick summary for the newcomer. I have to say that you should bump the single 6970 at 1920x1200 on Max at least 3/4 of the way... not totally blank. It's playable, even solidly in most cases, since it's only about 20% slower at that resolution than the 580, which you have at full green for Max.

edit: move the 480 and 570 to the column with the 6950 and change the Max for the 6970 alone.
 
You've left out the 1920x1080 resolution, which is probably the most common monitor size currently sold.

True, but in an effort to keep it consolidated I think just showing 1920x1200 is enough. There's not a big difference in performance vs. 1080 and adding that resolution in will dilute the curve.

According to the chart a single 6970 is not a good solution for triple monitors.

The green area shows the level of performance you can expect on average. I have a single 6970 doing really well at 1920x1200 and pretty well at 2560x1600. There's no green in that row under the triple-monitor columns. I was hoping this would be obvious. Please take a look at the previous versions and let me know if you think those made more sense.

I run all my games at 5040x1050 resolution on a single unlocked 2 Gig 6950.

I'm trying to stay away from overclocked results to keep things simple. We all know we can squeeze more performance out of these cards. This chart is really just a starting point, or a general baseline.

Is this chart supposed to represent only the game that is the worst case scenario for performance?

It's more an average, really. It certainly doesn't apply to heavy games like Metro or Crysis.
 
I'd point my friends to this chart, but my friends (at least the ones that need this chart) are at 1080p resolution and below.
 
I have to say that you should bump the single 6970 at 1920x1200 on Max at least 3/4 of the way...
...move the 480 and 570 to the column with the 6950 and change the Max for the 6970 alone.

I'll split it with you! :)
I've bumped the 6970 two notches (instead of four) and dropped the 480 (not the 570). I want to show the 570 as being "better" than a 480, but if I start adding too many rows the curves get screwed up. I think the 6970 does beat out the 570 in a lot of cases, but not by much; I think they're pretty close, at least for charting out a baseline anyway.

Great input, thanks for the help!
 
True, but in an effort to keep it consolidated I think just showing 1920x1200 is enough. There's not a big difference in performance vs. 1080 and adding that resolution in will dilute the curve.

There is about an 11% difference in pixel count between 1920x1200 and 1920x1080.
That is about the difference between a 560 Ti and a 570, or a 6950 and a 6970.
If 1920x1080 is the most common resolution, that is the one that should be used.
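For what it's worth, the raw pixel-count gap between the two 1920-wide resolutions can be checked directly (a quick sketch):

```python
# Quick check of the pixel-count gap between 1920x1200 and 1920x1080.
px_1200 = 1920 * 1200  # 2,304,000 pixels
px_1080 = 1920 * 1080  # 2,073,600 pixels
gap = (px_1200 - px_1080) / px_1080 * 100
print(f"1920x1200 pushes {gap:.1f}% more pixels than 1920x1080")  # ~11.1%
```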


The green area shows the level of performance you can expect on average. I have a single 6970 doing really well at 1920x1200 and pretty well at 2560x1600. There's no green in that row under the triple-monitor columns. I was hoping this would be obvious. Please take a look at the previous versions and let me know if you think those made more sense.
So according to your chart, a 6970 isn't good at triple-monitor resolutions for the average game.
That is not true, and telling people they need two GPUs to run triple monitors for the average game is false.
Your chart shows 2x 560 Tis and 2x 6850s as a better solution than a single 6950 or 6970 for triple monitors at 5040x1050.
This is false in many games at high resolutions, because those cards will run out of memory unless they have 2 gigs like the 6900s.


I'm trying to stay away from overclocked results to keep things simple. We all know we can squeeze more performance out of these cards. This chart is really just a starting point, or a general baseline.
I should have been clearer. My 6950's performance is identical to a 6970's.
My results would apply to a 6970.

It's more an average really. It certainly doesn't apply to heavy games likes Metro or Crysis.
In that case, the single 2-gig 6950 and 6970 cards should be listed as good performers at the 5040x and 5760x resolutions.
They are perfectly capable of good triple monitor performance in games like: Aliens vs. Predator, Bad Company 2, Borderlands, Call of Duty Black Ops, Dirt 2, F1 2010, Killing Floor, Left 4 Dead 2, Medal of Honor...
Games like Crysis and Metro 2033 may require turning off AA and HDR but they still look good.

The chart is looking good, but you don't want to be advising people to spend $400 more than they have to on a system to run the average games people play.
 
2 major issues.

1 - No older-gen cards; e.g., the chart is useless to me if I wanted to see a comparison between my 8800GT and a GTX 580. To me that's a far more important comparison than comparing cards of the same generation.
2 - Only very high resolutions are listed.
 
2 major issues.

1 - No older-gen cards; e.g., the chart is useless to me if I wanted to see a comparison between my 8800GT and a GTX 580. To me that's a far more important comparison than comparing cards of the same generation.
2 - Only very high resolutions are listed.

Please read the OP, where I state "...one of the more difficult tasks in recommending a build/upgrade..." and you will understand that this chart is not intended to compare every GPU ever sold. I have a lot of friends and family who come to me for advice, and I intend to use this chart to help them make wise decisions in their new purchases. If this situation does not apply to you, then yes, this chart is useless for you. But I suspect a lot of folks on this forum are in the same boat as me.

These cards and resolutions are what I consider the minimum for a new build or upgrade. The people I'm dealing with will likely keep their hardware for several years, so I'm sticking with DX11 cards so they don't bug me again anytime soon. Also, with 1080 panels so cheap now, I would not suggest a friend or family member pick up a 1050 panel unless they plan on Surround or Eyefinity.
 