Gaming @ 2560x1600

Joined
Aug 12, 2009
Messages
56
I like my games free of tearing (vsync) and above 60fps at all times if possible. Ordered the Dell U3011 yesterday, probably will arrive in a week or 2.

It's been over 2 years since I bought my HD4870 (512MB). It will probably choke at 2560x1600.

Not interested in SLI or CrossFire because my current motherboard only has one full-speed PCIe slot, and I'm not going to upgrade the mobo on my current budget.
A single card solution is what I'm looking for. Maybe dual gpus on a card like the upcoming 6990?

I've been reading that even though dual-GPU cards are one physical card, they still use CrossFire internally, and that screws up sometimes. It's confusing.

Any reply is appreciated! :)


Processor: i7 860
Motherboard: Asus P7P55D-E (2nd PCIe slot will only run at x4 instead of x16.)
Cooling: Noctua U12P SE2
Memory: 2x 2GB OCZ Obsidian DDR3 1600
Video Card: Palit HD4870 Sonic Dual 512MB
Hard Disk: 120GB SSD + 1.5TB + 1.5TB + 640GB

CRT/LCD Model: Samsung 32" TV + 2493HM
Case: CM HAF 932
Sound Card: Creative Xfi Extreme Gamer + Logitech Z5500
PSU: Thermaltake 750w toughpower
Software: Win7 64bit
 
I play StarCraft 2 at that resolution without issues. I have a 5870. I recommend you get a 580 or wait for AMD's 6970.
 
If money is no object you want the GTX 580 for that resolution, especially if you plan to max your games out and use AA/AF. Post system specs for more detailed analysis.

Don't get a dual-GPU card; you still run into problems similar to SLI/CF.
 
If you can wait 1-2 months, you'll have more choices (AMD 6900 series), but today you have only one (GTX 580).

Can you wait that long?
 
I'd go 580. If you like vsync on, though, don't forget to turn on triple buffering via D3DOverrider.
 
When people consider the cost of a 2560x1600 or 2560x1440 display, they often forget to factor in the ongoing cost of more frequent, more expensive video card upgrades. Expect to upgrade to the latest and greatest card every year if you want to play the latest games at good frame rates with settings jacked up to max.
 
Ya, I have to buy 2-3 of the top cards every 9 months or so but I don't mind lol.
 
For that res, you realistically have to go SLI/CFX if you want to run high AA/AF and full detail. There really isn't a single card on the market (including the 580) that can push that monitor to the max. You will need to invest in a CFX/SLI setup, or wait for the 6990 if you want a "single" card, which will still face the same scaling/driver issues as any other CFX setup at present.

Ninja edit: currently, CFX 6870s offer the best performance for that res and don't break the bank (the pair can be had for the same price as a 580, and will perform better).
 
For that res, you realistically have to go SLI/CFX if you want to run high AA/AF and full detail. There really isn't a single card on the market (including the 580) that can push that monitor to the max. You will need to invest in a CFX/SLI setup, or wait for the 6990 if you want a "single" card, which will still face the same scaling/driver issues as any other CFX setup at present.

Ninja edit: currently, CFX 6870s offer the best performance for that res and don't break the bank (the pair can be had for the same price as a 580, and will perform better).

I disagree with everything said in this post.
 
Depends what you want to play. I only play SC2 and Civ V and a 6870 runs both of those at 2560x1600 with max settings. I'm comfortably over 60 fps in both games, but I don't have AA/AF on.
 
I play @ 2560x1600: BC2, MW2, mostly FPS, plus SC2... any game I can run maxed and get 60fps easy... other than Crysis.

I run two 5850s.

So I would suggest the next upgrade be a 6xxx card, or a 480 or 580 if you don't want to run SLI/CrossFire.
 
Why?

You think he might have a bottleneck somewhere other than that 4870?
His CPU may not bottleneck a 4870 but it may bottleneck a GTX 580 (NVIDIA cards are more CPU-dependent than AMD cards).

LoneReaction, your best bet right now is a GTX 580, but it's a horrible time to buy. Wait for the 6970 to drop to see if it's a viable option and what it does to the market. Also wait for the holiday sales. It's true that dual GPU cards largely behave like CF/SLI, but not always - sometimes they are smoother and don't have the same issues that a dual card setup would have.

Personally, I run 2560x1600 and will pretty much be sticking to single card solutions until multi-GPU tech improves. This basically limits me to grabbing the fastest card out there and overclocking the hell out of it. The GTX 580 is a decent overclocker, but is expensive for what it is (30-35% faster at most over my current setup for ~$500). I'm waiting to see what the 6970 is like and then will purchase whatever the better deal is.
 
I do agree it is a horrible time to buy a 580. Launch time is always the worst for any graphics card.

K6, has anyone proven definitively whether the 6970 is going to be a single or multi-GPU card? I haven't seen anything yet...
 
I do agree it is a horrible time to buy a 580. Launch time is always the worst for any graphics card.
Agreed, with the notion that AMD's 5xxx series was a fluke. Also, it's even worse to buy right before all of the holiday sales.
K6, has anyone proven definitively whether the 6970 is going to be a single or multi-GPU card? I haven't seen anything yet...
Not that I've seen. As far as I know, we're expecting a single-GPU Cayman XT part, the 6970; its little brother Cayman Pro, the 6950; and a dual Cayman GPU part, Antilles, the 6990. All rumors, although I haven't seen anything to contradict them.
 
Oh yeah, that's right, I remember now. I was getting the 6970 and 6990 confused. I have heard those rumors, but didn't know if anyone had anything concrete yet.
 
For that res, you realistically have to go SLI/CFX if you want to run high AA/AF and full detail. There really isn't a single card on the market (including the 580) that can push that monitor to the max. You will need to invest in a CFX/SLI setup, or wait for the 6990 if you want a "single" card, which will still face the same scaling/driver issues as any other CFX setup at present.

Ninja edit: currently, CFX 6870s offer the best performance for that res and don't break the bank (the pair can be had for the same price as a 580, and will perform better).

Yeah....I have a 30" monitor and I disagree as well.
 
OP, you'd be better off going SLI or CrossFire over a single dual-chip card. Dual-chip cards have so many issues compared to a single card, and at times even issues that dual-card setups don't. They have their pluses too, but if you look at the 5970 as an example, more than half the time you'll be rolling drivers up and down from game to game, and at times you're required to disable the internal CrossFire to run a game at acceptable frame rates. Sometimes a dual 5850 solution will run a game fine where a 5970 will not.

If you want a single card, I'd wait for the 6970, as that card is sure to shake up the market. It's rumored to be cheaper than the GTX 580, and if it's as competitive as I believe it will be, the GTX 580's price will be coming down around December 13th. Who knows, it may be the better bet either way.

As far as Antilles (6990) goes, wait for multiple reviews to see how it scales across a lot of games, and be sure the driver issues that plagued Hemlock are absent from the Antilles party before you think about that one.
 
Everyone who has railed against the necessity of SLI at these resolutions is ignoring the definition of maxing out. Here is what I am seeing:

"...But I don't turn on AA/AF" "...But not Crysis" "...But only blah and blah game"

There is not a single card solution that can truly MAX OUT the most demanding games at 2560x1440+ and maintain a minimum FPS above 60.

SLI and Crossfire still have their problems, so I am personally looking forward to better single card alternatives, though it seems like the 580 is as good as it is going to get for a while.
 
There is no rig on earth of any composition that can "max out" every game that exists by your definition. Such a thing does not exist. When we talk about "maxing out," we're speaking of getting a near-maximum quality (95%+, AA in most but not all titles) gaming experience with 30+ FPS at all times. At least I am. I don't care about 60 fps; it doesn't affect me. I've played with vsync on @ 30 fps in so many games for so many years that I honestly don't even notice the difference anymore.
 
There is no rig on earth of any composition that can "max out" every game that exists by your definition. Such a thing does not exist. When we talk about "maxing out," we're speaking of getting a near-maximum quality (95%+, AA in most but not all titles) gaming experience with 30+ FPS at all times. At least I am. I don't care about 60 fps; it doesn't affect me. I've played with vsync on @ 30 fps in so many games for so many years that I honestly don't even notice the difference anymore.

Correct, there is not. And while these things might not matter to you, they matter to some. It's just ridiculous to say (and nobody in this thread is really saying it, but I've seen it on this forum more than once) that upgrading from a 5850 (for example) to a GTX 580 is pointless because the 5850 can "max out" all games etc etc. The performance improvement is significant enough at the highest detail levels and resolutions to make it worthwhile for a lot of people.
 
Vsync @ 30 FPS??????? What are you talking about?

Never heard of setting vsync like that.

Vsync... 60Hz monitor... 60FPS.
 
It really depends on whether or not you absolutely have to have image quality settings.

If I leave all those settings off, turn details to the lowest, set shadows to lowest or off, and keep AA to just 2x, I can play BF2 and BC2 competitively (read: good KDR) on an old G80 8800GTS 640MB @ 2560x1600, paired with a lower-end Q8300, on my dedicated DX9 XP gaming PC.

Now, recently, I've resurrected my Skulltrail behemoth and stuck a 5870 on there... I definitely prefer all the image quality settings maxed with Win7/DX11 to that of my XP/DX9 PC.
 
Correct, there is not. And while these things might not matter to you, they matter to some. It's just ridiculous to say (and nobody in this thread is really saying it, but I've seen it on this forum more than once) that upgrading from a 5850 (for example) to a GTX 580 is pointless because the 5850 can "max out" all games etc etc. The performance improvement is significant enough at the highest detail levels and resolutions to make it worthwhile for a lot of people.

Well, if you're going to upgrade from that 4870, go as far as your budget will allow and "future-proof" your system.

I have two 5850s and I think it would be a complete waste of time and money to upgrade to even 5870s. I run every game and I'm happy with my setup, but the next generation of games might not be as kind to my computer.
 
Correct, there is not. And while these things might not matter to you, they matter to some. It's just ridiculous to say (and nobody in this thread is really saying it, but I've seen it on this forum more than once) that upgrading from a 5850 (for example) to a GTX 580 is pointless because the 5850 can "max out" all games etc etc. The performance improvement is significant enough at the highest detail levels and resolutions to make it worthwhile for a lot of people.

Of course you are correct. Common sense.

Vsync @ 30 FPS??????? What are you talking about?

Never heard of setting vsync like that.

Vsync... 60Hz monitor... 60FPS.

Thus I quote from another post on [H] ages ago about vsync:

Essentially this means that with double-buffered VSync, the framerate can only be equal to a discrete set of values equal to Refresh / N where N is some positive integer. That means if you're talking about 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, etc etc. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out would get dropped to 30.

http://hardforum.com/showthread.php?t=928593

Naturally, because I'm a budget-conscious gamer, I rarely get 60 fps out of vsync, which I always run in every game that it doesn't outright break. Always. Without exception. But I usually get well over 30, so I stay locked @ 30 for the duration.
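The Refresh / N quantization quoted above is easy to sketch numerically. Here's a toy Python illustration of double-buffered vsync behavior (the `vsynced_fps` helper name and the sample numbers are mine, for illustration, not from the linked thread):

```python
import math

REFRESH_HZ = 60  # the 60 Hz monitor case discussed above

def vsynced_fps(raw_fps, refresh_hz=REFRESH_HZ):
    """Effective framerate under double-buffered vsync.

    The GPU must hold each finished frame until the next refresh,
    so the frame interval is rounded up to a whole number of refresh
    periods: fps = refresh / N for some positive integer N
    (60, 30, 20, 15, 12, 10, ... on a 60 Hz display).
    """
    if raw_fps >= refresh_hz:
        return refresh_hz
    n = math.ceil(refresh_hz / raw_fps)  # refresh periods per frame
    return refresh_hz / n

for raw in (120, 60, 59, 45, 31, 30, 25):
    print(f"{raw:>3} fps raw -> {vsynced_fps(raw):.1f} fps on screen")
```

This shows the cliff the post describes: anything the card renders between 31 and 59 fps lands at 30 fps on screen, which is why staying locked @ 30 (or using triple buffering) is the practical choice on a budget card.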
 
5870 owner here too, I play most games on my Dell 30" at 2560 resolution without any problems or slowdown. Played Black ops last night and had fun. 5870 works fine for me although I am waiting to see what the 6970 can do.
 
Yes, I understand all that, but I thought you were saying you were able to set it to 30fps... just confused.

I cannot stand vsync; I try every game with it, then turn it off after the first minute of play time.
 
5870 owner here too, I play most games on my Dell 30" at 2560 resolution without any problems or slowdown. Played Black ops last night and had fun. 5870 works fine for me although I am waiting to see what the 6970 can do.

Ya, I am sure Black Ops is taxing that card really hard :rolleyes:
 
Vsync is a personal preference, naturally, but like AA and AF, it is a feature I can't live without. Tearing drives me crazy. Game breaking for me in most games.
 
:rolleyes: Used that as an example as that was the last game I played. Dirt 2, BFBC2, Crysis warhead all run fine...does that work for ya?
 
Aww crap guys, sorry about forgetting to post the rest of the system specs. I'll edit the first post.
 
Vsync is a personal preference, naturally, but like AA and AF, it is a feature I can't live without. Tearing drives me crazy. Game breaking for me in most games.

Well, I love everything turned on and turned up. I love the eye candy, and even more I know it makes me a better player, being able to see shadows when people are standing around corners, etc.

Yeah, the Black Ops engine is only what, 80 million years old?

Thought it was like 90 million, but 80 million is close enough on the older-than-dirt-and-looks-like-junk scale.
 
Vsync is a personal preference, naturally, but like AA and AF, it is a feature I can't live without. Tearing drives me crazy. Game breaking for me in most games.
I don't understand how some people don't notice tearing. It can completely ruin a gaming experience for me too. Luckily, with Mass Effect I have learned to somewhat ignore it.

Yeah, Black Ops engine is only what, 80 million years old?
Are they still using the engine they used in COD 4?
 
:rolleyes: Used that as an example as that was the last game I played. Dirt 2, BFBC2, Crysis warhead all run fine...does that work for ya?

Well, you should use any of those games over Black Ops as an example, because any of them is more of a challenge for your 5870 than Black Ops is.
 
Are they still using the engine they used in COD 4?

COD 4 (MW), MW2, Black Ops... same old engine, if I was reading that correctly.

I know for a fact it is the same engine as MW2; I thought that was the same as the rest, but someone will clear that up.
 