Worth upgrading from 6970 to a GTX 580?

So with Crysis it makes no difference, right? With a 6870 you can max out Crysis @ 1680 x 1050?

You guys have NO idea what you're talking about when it comes to "overkill". There are more games in the world than Modern Warfare.

You guys tend to disregard that there are games like Crysis. You guys also tend to disregard that there are people who like to use AA and still get high frames.

There is always some dope saying that this or that card is "overkill" for a certain resolution because they read it on a review site or in a thread instead of using their common sense. It has become a pet peeve.

Did you even read the OP? You quoted me; where did I mention the 6870? So I presume you are calling me a dope? This isn't a thread about the 6870 compared to the 580, this is a thread about a guy with a 6950 moving to a 580 and spending $180 to do so.

So I take it your advice to the OP would be to buy the 580 to play at 1680*1050. Buying a GTX 580 for that resolution, when you already have a card that plays every game perfectly, would be a pretty stupid thing to do.

You say that you don't buy a GTX 580 to NOT use AA; well, most of the people on this forum would say you don't buy a GTX 580 to play at 1680*1050. Why? Because it's not common sense: you aren't using the card to its full potential, so it would be a complete waste of money.

You are entitled to your opinion, but it's only your opinion. You really should tone down your condescending attitude, because your opinion this time goes against nearly everyone else who has posted in this thread.
 
Guys, many thanks for chiming in. I will go with the general consensus, which is a NO on the GTX 580.
As for Metro 2033, Crysis 2 and Witcher 2 performance, my numbers are generally as follows. Resolution is 1680*1050.

Metro 2033 - averages around 60 fps w/ vsync enabled. Only in some very rare areas partway through the game (e.g. the train yard section) did it drop to 23-25 fps, where there is a lot of smoke, fog, etc. It is a rare section of the game that performs poorly on all cards. There is only one advanced DX option that I turn off (I think it is Adv DOF). Also, I do not care about this game anymore since I am never going to replay it.

Crysis 2 - averages around 39-44 fps outdoors with a lot of stuff going on. All options maxed out, including in-game tessellation. Vsync enabled, though I am not sure it is working.

Witcher 2 - pegged at around 60 fps for almost everything. The only drop I noticed thus far was entering chapter 1 right after the boat section. This is the game I care about, and honestly the performance is top notch with everything maxed out. Sometimes frames drop to 45 if there is a ton of crap on the screen, but it does not hamper gameplay.

As for monitors and resolution: note that IF I go 3D, I will have put in around $1000 for the whole thing. The monitor and glasses cost around $600 here, and the graphics card will cost me $430. Selling off my current graphics card will get me some money back, but I also spent around $330 on it when it came out, so all in all I would have spent around $1000 this year just to go 3D with a single card.

The only reason I am NOT upgrading my monitor now is that I want to purchase a 3D monitor. With nVidia's 3D Vision, I can only get a 3D monitor bundled with the glasses; there is no option to purchase an nVidia 3D Vision-ready monitor without them. I don't live in the States, so I don't have Newegg etc. to purchase parts from.

Given all of this and your comments, it seems worthwhile to wait it out. My system is fast enough and there is no game that is unplayable right now. I also don't care about Bad Company 2, so it is not top of mind. BF3 might demand an upgrade, but we will only know that once it comes out, and I would rather spend money then.

Thanks a lot to everyone who responded.

Yeah, good call. Do you really think you would notice if you were getting 10 fps more in Metro?

And as you say, with new cards coming out in the next few months and no problems playing any game at the moment, it makes sense to wait and see.

Same applies to 3D; the monitors are getting better all the time. Acer's new HN274 seems to be pretty good, and it comes with 3D active shutter glasses and a built-in IR emitter. Hopefully the next generation of monitors and games will bring even more improvements to 3D gaming.
 
I would try S3D on a friend's machine, or somewhere, before you jump into it. Some people love it and some can't stand it; it is a very personal choice. It does change the way you game, which is something most are not expecting, though the enemy AI does not change how it plays. S3D is tougher to master, so in that respect older games you have mastered become new again.

Also look at the list of games that support S3D and at what level. Compare that to the kind of games you play and see if it is something you can live with. I built a machine for that specific purpose (I used the GTX 460 I use for PhysX, and other spare parts I had) and took it to a store and hooked it up to a 27" 120Hz monitor. Try before you jump, via a cooperative store or one that will allow you to return the parts if you do not like it. Even paying a small restock fee is better than getting stuck with something you are not satisfied with (consider it rent).
 


How could you not use a GTX 580 to its full potential? Game @ 800x600? It's well established that you can't max out Crysis with AA or Metro with AA at his resolution.


Also, I wouldn't recommend anyone buy a GTX 580 1.5GB, period. Maybe a 3GB if the price is right. Why have all that power with so little VRAM?

What I am saying is, if the dude wants a 580, it's not going to be wasted. If he wants to run Crysis with some AA, he can do it. If he wants to run Metro with some AA, he can do it. What business is it of yours or anyone else's to tell him he's wasting his money when it's clearly not the case? Could he do it with a 6950? Nope. 6970? Closer, but nope. 6870? Obviously not.

Are there diminishing returns on investment with the 580? Of course...but that card is overpriced as it is. You pay a premium for the best @ any resolution.

I game @ 1920 x 1080...it's not that much more than 1680x1050. And let me tell ya, my 6950 is good for my res., but does it max out Crysis comfortably? No. Metro? No.

Would a GTX 580 help me @ my res.? Absolutely. I just don't want to spend $500, but that's my prerogative if I want more awesome or less awesome. Heck, I'm contemplating crossfire in the future. If I want to run some of my games at better framerates, who are you to tell me what I'm doing is overkill when I clearly can't max out every game?

Who are you to judge what card he should buy based on your own pre-conceived notions of what is "overkill" without telling the whole story? It's like you dudes think that DX9 console ports based on six-year-old engines are the only games out there.

And I'm not posting just to agree with everyone in a thread. I speak for myself without assuming that what everyone else says must be right, which is exactly the point I wish to make.
 
I do not quite understand your stance, as I made this thread to ask for opinions. That alone gives people license to post whatever they like. If someone says that the GTX 580 will be "overkill", it is their opinion, which I value and, in this particular situation, also understand.

Please also focus on the facts that I posted. My current monitor is 1680*1050 w/ 60 Hz. I can max out the 3 most graphically intense games these days (Crysis 2, Witcher 2 and Metro 2033) and still maintain decent enough fps; Crysis 2 is the only one around 39-44 fps, as stated in my previous post. I also value vsync, which means that anything above 60 fps is useless to me at my resolution.

I will not upgrade my monitor unless it has 3D capabilities, and currently the only option I have is to purchase a 23-inch ASUS monitor w/ nVidia 3D glasses. Other brands are not available in the market in my country. That pretty much limits my options for a new monitor and/or use of 3D.

Side note on 3D: as for me preferring 3D, I generally prefer it in the cinema and watch all movies in 3D when that option is available. I am also OK with ghosting at the edges (it happens in the cinema all the time).
 
I have an Asus DirectCUII 580 1.5GB @ 900 and game at 1920x1080.

I played through Crysis 2 on Ultra with the High Res pack enabled and got ~40-50 FPS in outdoor areas, rarely hitting 30. It was more than playable. I play Metro 2033 maxed except for the Depth of Field (which is known to eat framerates) and get 35-45ish FPS during usual gameplay.

If you only want to buy one card, game at 1680x1050 and want to max everything including Crysis Ultra/Metro 2033 (and I assume RAGE as well), a 580 will do it.
 

If you want to use vsync and get 60 fps, you'll need frame rates OVER 60 to do it, because you'll have lots of drops in your fps average. If you're averaging 45 fps in Metro, vsync isn't gonna work, or it's gonna drop you to 30 fps. Being able to hit 76 fps in Metro at your resolution will mean that with vsync on, you'll get a steady 60 fps. Same with all the other graphically intensive games, and that's why I believe the GTX 580 is worth it.
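For what it's worth, here's a minimal sketch of that math (an illustration assuming a 60 Hz panel and plain double buffering with no triple buffering; the numbers are just examples):

[code]
# Rough illustration of double-buffered vsync on a 60 Hz monitor.
# Assumption: no triple buffering, so every frame waits for a refresh
# boundary and the effective rate snaps to 60/n fps.
import math

REFRESH_HZ = 60

def vsync_fps(raw_fps):
    """Effective frame rate once vsync rounds each frame time up
    to a whole number of refresh intervals."""
    frame_time = 1.0 / raw_fps                      # seconds per rendered frame
    intervals = math.ceil(frame_time * REFRESH_HZ)  # refreshes each frame occupies
    return REFRESH_HZ / intervals

for fps in (76, 60, 59, 45, 30):
    print(f"{fps:3d} fps raw -> {vsync_fps(fps):.0f} fps with vsync")
# 76 and 60 hold 60 fps; 59, 45 and 30 all land on 30 fps.
[/code]

So a card that averages 76 fps has the headroom to hold a locked 60, while one averaging 45 spends most of its time at 30.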


Never say never, dude. That 22" screen might start to look real small to you one day, or you might find a 27" monitor on a fire sale somewhere and want it. 1920x1080 is a lot tougher on video cards than 1680x1050, so that extra horsepower will come in real handy.

If you've already convinced yourself that you don't want to spend the extra money on the 580, that's fine, but there are a lot of good reasons to get one even at your midrange resolution.
 

You and other people really need to stop pushing that when it's been proven that 1.5GB is not a limiting factor even at 2560x1600...
 

I take issue with people who make statements of opinion without the facts. Like saying a GTX 580 is "pure overkill". It's just not. Not an ounce of proof that it's overkill. If one game lags, and it's your favorite game, then it's not overkill to get what you want to play that game.


If you're happy with what you've got, that's great. All I'm saying is, if you really want a GTX 580, it's not like it won't be of any benefit to you at all. It will...that is, if you want to spend the money. If you want to do 3D gaming or video, that's great too.

I'm thinking about spending $400 on an SSD. I don't need it. It's "overkill", but maybe I want it. Triple 30" monitors, Tri-SLI...Quad-Crossfire, $400 motherboards...Does anyone need a $400 motherboard for gaming?...it's all overkill. No one needs any of this shit we buy, we buy it for our entertainment. Whatever makes you happy.


I don't want to get into a full-scale argument about it...but I've gone over 1GB of vRAM with a GTX 460 @ 1600 x 900. No way I'm spending $450 for just 1.5GB of vRAM.
 
This is the [H]. Overkill is what we do. :D But I realize we run the spectrum; from conservative budget builds, to over-the-top e-peen machines. It's all good. Let the man do what he wants to do. Too many people acting like everyone should agree with their "expert" opinions.
 
First, I would suggest 2x GTX 460 1GB.

In some benchmarks it beats the 580, and in most it beats the 6970.

http://www.anandtech.com/bench/Product/305?vs=314

6970 = $350

SLI GTX 460 1GB = $240

Look at the benchmarks.

HOWEVER, for your question and situation I WOULD SAY NO! I don't understand why people sidegrade.
 
It would not be worth it for the sidegrade.

You might as well keep the 6970 and wait for the new 7k series to come out.

And IF you do decide to get a GTX 580, get the 3GB model.
 
I will have to agree with spaceguild. Saying a 6970 is overkill for 1680x1050 does not make sense when there are games even a GTX 580 struggles with at that resolution. If a 6970 is overkill at 1680x1050, then 6950 crossfire is overkill for 1080p, but I don't see anyone saying that. There's a popular belief here that if you play at a lower resolution, you are automatically going to have too much wasted performance and are unworthy of a higher-end card. In my test, going from 1920x1080 to 1680x1050 only gives me 14% more performance.

Crysis 2, Crysis, Metro, AvP, Witcher 2, Dragon Age 2, Lost Planet 2, Civ 5, Borderlands. A 6970 cannot maintain a minimum of 60 fps in any of them at max settings, 1680x1050, 4xAA. But I guess nobody plays those games.
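That 14% figure lines up roughly with the raw pixel counts, by the way; here's a quick back-of-the-envelope comparison (plain arithmetic, nothing vendor-specific):

[code]
# How many more pixels 1920x1080 pushes compared to 1680x1050.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1680  = 1680 * 1050   # 1,764,000 pixels
print(f"{res_1080p / res_1680:.3f}x")  # ~1.176x, i.e. ~17.6% more pixels
# Dropping to 1680x1050 sheds roughly 15% of the pixel load, which is in
# the same ballpark as the ~14% performance gain quoted above
# (performance rarely scales perfectly linearly with pixel count).
[/code]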
 
How many people paid $650 on launch day for an 8800 GTX just to play WoW? ;)

I paid $500 for a 8800 GTS 640MB just to play BF2142.:p
 

Not me. I waited for the 8800GT, which when overclocked was as fast as the 8800GTX.

Smart people have patience.
 
Personally, I would not change from a 6970 to a 580...not worth the cost increase.
 
This is turning into a pissing contest at this point, and it's fun to watch. I'm gonna reiterate what I said before and drill down deeper.

In terms of overkill, I agree that a 6970 and GTX 580 IS overkill for 1680x1050 most of the time. If there were a thread with a poll, I'd imagine that for someone posting that they want to game at 1680x1050 and play a bunch of games, neither of those 2 cards would be a popular option. However, in the OP's specific situation he plays very demanding games such as Metro 2033 and Crysis/Warhead/2.

More than likely he'll want to play BF3, Dead Island, Skyrim, Metro Last Light, etc. next year.

If the same thread had said specifically, "I want to play the latest graphically demanding games with IQ maxed at 1680x1050 and maintain 60fps using vsync", then in the same poll the 6970 and GTX 580 would come more recommended. Since the OP never stated that, saying it was overkill was correct based on the information presented to us. Now, 2 pages later, knowing what we know, one can say that if he were in the market for a card, either the 6970 or GTX 580 would be the card to get. But owning either of them, I see no scenario remotely possible where one would say that he should sell one to get the other. I couldn't imagine someone thinking that this would be good advice. There is some fanboyism going on here, and people need to look in the mirror and decide where the line is drawn between giving good advice and caving in to your fanboyism.

I have no idea why anyone would recommend such a ridiculous sidegrade, but whatever, to each his own. In the many real-world performance benchmarks I've seen, a 6970 is 15% away from a GTX 580 at lower resolutions and even 10% away at 2560x1600; go into crossfire and they're about even; and in eyefinity crossfire vs. SLI surround, 6970s are anywhere from 5% slower to 15% faster depending on the game and resolution, edge usually to AMD there. These things seem consistent except in some heavily nVidia- or AMD-biased games.

The only flag I got which made me think about recommending a switch was the OP mentioning he was interested in doing 3D, but that was not enough of a reason to sidegrade IMHO.
 

I own a 6950...not a 580 fanboy. I don't care if he wants a 6970 or a 580. I don't care if he triple-SLIs 580s. Just don't fucking say that having either a 6970 or a 580 is completely overkill for his resolution. Because it's not. Another person who fails to understand the point.
 
Like I said, if you want 3D, sell the unlocked 6950 and buy a $269 AR GTX 570. Same performance and much better 3D support for ~$40 out of pocket after you sell the 6950.

[Metro 2033 benchmark chart, 1680x1050]
 

Nobody is recommending him to go from a 6970 to a GTX 580 for performance alone. For 3D, yes, but it's not worth switching just for performance. What we are pointing out is the overgeneralization that card X is overkill for resolution Y without being specific about which game and at what kind of performance level. If I can somehow tolerate 15 fps in games, does it mean I can make a sweeping statement that a 5770 is overkill for everyone at 1080p?
 
Just a quick thing: Witcher 2 and Metro 2033 run pegged at 60 fps 90% of the time, so I do not average 45 fps in those games.

As for my choice of monitor, as I said, I am only looking to upgrade if the new monitor has 3D capabilities. I do not (over)spend on hardware unless I derive real benefit out of it, and I do not believe 1080p alone is a significant enough upgrade to change my monitor just yet.

Btw, which date are those Metro 2033 numbers from? They don't look right. :confused:
 

It IS overkill for his resolution! However, I'll concede that what the OP does and plays gives him justification to buy such a card to run at that resolution. 99% of games out on the fuckin' market are shitty console ports which run fine on a 5770. That's the point YOU'RE not getting. There are maybe 4 games that stress a 6870, let alone a 6950, at that resolution. If you spend double the money based on 4 games, that's fine. Hopefully games in the near future will continue to use those innovative game engines and you'll be good to go. What we are currently getting is 15 console ports and 3 Source-based games to every 1 great game with a killer engine that stresses cards.

If someone asks me what's a good card to play at 1680x1050, the first questions I'm going to ask are: what will you be playing, and what performance are you looking for? When you jumped in and started making your recommendation, you didn't know what he was playing or doing; you just said "YEAH!! sure, get a $550 3GB GTX 580 for 1680x1050, sell your 6950/6970". No offense, but that's why I question the advice that many of you are giving.

Nobody is recommending him to go from a 6970 to a GTX 580 for performance alone. For 3D, yes, but it's not worth switching just for performance. What we are pointing out is the overgeneralization that card X is overkill for resolution Y without being specific about which game and at what kind of performance level. If I can somehow tolerate 15 fps in games, does it mean I can make a sweeping statement that a 5770 is overkill for everyone at 1080p?

That's kinda my point exactly. However, the only thing in the OP which made me think he should consider a sidegrade was 3D, and once he said he wasn't trying to pay the price of admission for that at this time, there was no reason for a few of you guys to keep pushing the point. A 6970 or a GTX 580 is overkill for 1680x1050 most of the time; it really depends on the person, what they are doing, and what they are looking for.
 

Here is one from April with the 6970 Lightning @ 940 core, very high settings. Not really much difference, except the GTX 570 and GTX 580 got faster.

http://www.techpowerup.com/reviews/MSI/HD_6970_Lightning/16.html

[Metro 2033 benchmark chart, 1680x1050]


Remember, they are using 4xAA, not analytical AA like in the Anandtech review @ 1080p:
[AnandTech Metro 2033 benchmark chart, analytical AA @ 1080p]


4xAA @ 1080p:
[Metro 2033 benchmark chart, 1920x1200]
 

If you're getting over 60 fps in Metro all the time, you don't have the settings maxed. My 5870 wouldn't get out of the 30s on my last 1680x1050 monitor at max settings.

Turn everything up to the highest level, including AA, AF and DOF. I guarantee you won't see 60 fps 90% of the time.

And you wouldn't be upgrading monitors for the extra resolution, you'd be upgrading for the extra 5" of screen real estate. ;)
 

Thanks for posting that. LOL, 26.6 fps is such overkill at that resolution. We should not be using a 6970 for anything less than 5760x1200. And nobody plays that game anyway, except maybe the OP, Mcleod and me.

No single GPU is able to average 60 fps in AvP either:
[AvP benchmark chart, 1680x1050]
 

A 5770? 99 percent? I want what you're smoking. That's some good shit. ;)
 
I'm not smoking anything, actually. I should have said a 5770 at 1680x1050. And yes, 99% of games run fine on that card at that resolution.

I can think of 4 games it can't do at 60 FPS and maxed, and that's just off the top of my head. If >= 4 games is <= 1% of games... man, you must be including every game made on PC in your statistic.
 

I never said maxed and at 60 fps. I mean playable, as in acceptable frame rates; not all games require 60 fps to be smooth. If an Xbox 360 can keep these console ports playable at 1080p, what makes you think a much more powerful 5770 can't?

Edit: Here are the specs of the Xbox 360 GPU:

http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

and a quote from the source in case you were ill-informed:

The Xenos is a custom graphics processing unit (GPU) designed by ATI, used in the Xbox 360 video game console. Developed under the codename "C1,"[1] it is in many ways related to the R520 architecture and therefore very similar to an ATI Radeon X1900/X1950 series of PC graphics cards as far as features and performance are concerned. However, the Xenos introduced new design ideas that were later adopted in the R600 series, such as the unified shader architecture. The package contains two separate silicon dies, the GPU and an eDRAM, featuring a total of 337 million transistors.
 
Comparing the 360 GPU tech specs to a PC GPU couldn't be more apples and oranges.

Games like Metro, Crysis and soon (likely) Rage, Skyrim, etc., eat a 5770 for breakfast. So to say that 99% of games run fine on it is a bit silly.
 

There is only one problem with your theory: the Xbox 360 doesn't render games at 1080p. For instance, Black Ops is rendered at 1040x608 and then upconverted to 1080p.
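To put rough numbers on that (the 1040x608 figure is from the post above; the rest is just back-of-the-envelope arithmetic):

[code]
# How much more work native 1080p is compared to the Xbox 360's
# internal render resolution for Black Ops (1040x608, per the post above).
xbox_internal = 1040 * 608    # 632,320 pixels rendered, then upscaled
native_1080p  = 1920 * 1080   # 2,073,600 pixels
print(f"{native_1080p / xbox_internal:.1f}x")  # ~3.3x the pixels
# A PC running the same port at true 1080p pushes roughly 3.3x the pixels
# the console actually renders, before counting any PC-exclusive effects.
[/code]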
 
Yes, you guys are right; I had AA on in game and not in the CCC. Also, as I already noted, Adv DOF is off. Hence why I was getting 60 fps in Metro 2033. Either way, that is fine; AA within the game alone is also fine with me.

Guys, I have made my decision and did not opt for the deal. Maybe the next ATi card makes more sense, as it should offer about a 30% improvement over my current card (hopefully :p).
 

It's not about you anymore....this is war! :mad::eek::p
 
Hey guys, since we're already on this topic, I would love some advice as well...

My res: 1920x1200

I can get a 580 GTX 3GB for $450
I can get a 580 1.5 GB for $350-400
I can get a 6950 2GB unlocked for $250-260

Kind of a substantial difference between price ranges there. But I REALLY love maxing my games out and getting nice, smooth performance. I currently have no card because I just sold my old one, and I can't wait for the new video cards to come out, which probably won't drop till early 2012. Thanks guys!
 
If you want to play the super high end games like Crysis, Metro, (and I assume Rage) all maxed at that resolution you're going to need a 580. I play at 1920x1080 and can't get 60 FPS in Crysis or Metro with a 580 @ 900 core & 2500k @ 4.5 Ghz.
 
Sorry if this is nitpicking, but what you just said can be read as "he does not need a GTX 580". It means he should spend his money on a 6950, unlock it, and wait for the next round of cards, since either way he will not be able to max out Crysis 2 or Metro 2033.

Or alternatively, it could mean that if you can go as high as $450, you might as well get two 6950s and you will be able to max out games at that resolution.
 

True; I meant more "the best performance out of those options", because none of them will max those games, and a 580 won't be bottlenecked.
 

99% of games are console ports these days. There are 4, maybe 5, games with engines that challenge current PC graphics hardware. So yes, since 99%, or 24 of 25, games released each year worth noting are console ports, my statement was accurate. However, the console ports you listed are modified to have extra PC goodies; not sure if you know that or are playing ignorant. Metro 2033, for example, boasts DX11 features that are not present on consoles. To run it at console levels you need to run it with no eye candy and all effects off, more than likely. Rage and Skyrim are still unknowns, but I've read that the PC version of Skyrim will have more features for PCs that can handle them.

So even Captain Obvious would tell you that a 5770 will be eaten for breakfast with all the bells and whistles on, including PC-exclusive DX11 features, texture packs, etc... but a 5770 can definitely run these games at Xbox 360 and PS3 capacity and maintain playable FPS.

There is only one problem with your theory: the Xbox 360 doesn't render games at 1080p. For instance, Black Ops is rendered at 1040x608 and then upconverted to 1080p.

Very valid point; I forgot about that. However, I'd venture to say that the extra power a 5770 has over an X1900XT-class GPU can be used to render any console port (99% of the games releasing these days) at the same graphical detail used on the Xbox 360 @ 1080p, with the same or higher fps than an Xbox 360 or PS3 can manage.
 

Sorry to burst your bubble, but 24 out of 25 games is 96%. It's also worth noting that most games aren't actually console ports, but are made simultaneously for the lowest common denominator, which is the consoles. Not sure if you know that or are playing ignorant, but there's a big difference between the two.
 