Lower Eyefinity Resolutions?

Asacolips

I've got a 6870 that's done fairly well for me in Eyefinity, though in some games it can certainly get noticeably slow when pushing three 1080p displays in landscape. However, the number of lower resolutions CCC lists (2400x600, 4800x900, and the bezel-corrected versions) is limiting to say the least. What I would really hope for is an option to drop to 3x 720p (3840x720) for the more demanding games, as it's the correct aspect ratio and has a low enough pixel budget to actually get a performance boost.

However, I've had absolutely no luck getting such a resolution to work. Are there any registry edits or other procedures that could pull this off? I've been searching Google for a while now, to no avail.
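
For reference, the rough pixel math behind that request (just a back-of-the-envelope Python sketch using the resolutions mentioned above, nothing benchmarked):

Code:
# Rough pixel-budget comparison for the Eyefinity modes discussed in this thread.
# Purely illustrative arithmetic; actual performance scaling depends on the game.

modes = {
    "3x1080p (5760x1080)": (5760, 1080),
    "3x900p  (4800x900)":  (4800, 900),
    "3x720p  (3840x720)":  (3840, 720),
}

full = 5760 * 1080  # baseline: three 1080p panels

for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name}: {pixels/1e6:.2f} MP, {100 * pixels / full:.0f}% of 3x1080p")

3840x720 is only about 44% of the pixels of 5760x1080, which is why it would be worth having as an option.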
 
CrossFire is recommended for Eyefinity, IMO.

I would assume the screens must support different scaling and resolutions to work that way, but I'm too ignorant to say for sure.
 
CrossFire would be my preferred choice, but a single 6870 was the most viable option for me cost-wise. At any rate, these monitors normally support 720p and do work correctly in it in the standard desktop extend mode. It's only in Eyefinity groups that such resolutions aren't available.
 
It may be game-specific. Can you set your desktop to that resolution (3840x720)?
 
In Eyefinity mode, that's a negative. I looked all over CCC for a way to add custom resolutions, but couldn't find anything. I did find the list of resolutions (which consists of the few I mentioned in the OP), but none of them were editable.
 
OK, I found it: in CCC, go to Desktop Management and then Desktop Properties. Under there you can see the three resolutions for the grouped display:

2400x600
4800x900
5760x1080

So the resolutions are there; you just need to find a way for games to see them. Maybe even manually setting the resolution would work.
 
I just did a test, fired up ArmA, and it does indeed see all the different Eyefinity resolutions. So my guess is it's very game-dependent (some games may just use the 5760 mode as a quick fix for Eyefinity support).
 
I just did a test, fired up ArmA, and it does indeed see all the different Eyefinity resolutions. So my guess is it's very game-dependent (some games may just use the 5760 mode as a quick fix for Eyefinity support).

I have three 1680x1050 2ms monitors hooked up to my unlocked 6950, and yes, Eyefinity support is VERY game-dependent. However, you are also correct: in the games that do support non-standard FOVs, you do get multiple choices of resolutions that all fit your Eyefinity FOV shape.

For 3x 1680x1050 it's as follows:

5262x1050 (w/ BC)
5040x1050
4004x800 (w/ BC)
3840x800
2504x600 (w/ BC)
2400x600
 
Just an FYI, in case it informs your next GPU purchase (it did mine):

AMD said they were working on more than the three resolution options back in December 2009, but they never appeared by the time I went from my 5970 to 580 SLI. Nvidia Surround offers far more, at least 20-30 different resolution options.

What I would really hope for is an option to drop to 3x 720p (3840x720) for the more demanding games, as it's the correct aspect ratio and has a low enough pixel budget to actually get a performance boost.

I've always dropped resolution first when I need to increase performance, to keep framerates above 60. I go down from x1080 to x900 and x720 until framerates are back up again.

AMD really needs to get a handle on their software. I had to run a complete driver cleaner and reinstall every time I upgraded the drivers. The Eyefinity solution seemed very clunky and buggy compared to Nvidia Surround, which has installed over the top without any bugs or fixes needed. In general it seems to be much more reliable software, especially having working vsync at surround resolutions with two GPUs.

Not trying to start a fanboy war here, as I couldn't care less whose name is on the price sticker when I upgrade to the next gen; I'll make an informed decision at the time.
 
Hmmmm... I've been running my setup at 4080x768 without any issues, games or otherwise. I hated CrossFire, btw... could always feel an ever-so-slight latency in online multiplayer (that was with the 58xx series, though).
 
As flexible as Eyefinity can be, there's a lot it doesn't do. One limitation is that it will only give you three Eyefinity resolutions: 2400x600 is the "safe mode" resolution, the top one is your highest common resolution, and the middle one is a common resolution step down from there (see the sketch after the list below).

So it would be like this:
3 x 1600x900 screens = Eyefinity Resolutions: 4800x900, 4080x768, 2400x600
3 x 1680x1050 screens = Eyefinity Resolutions: 5040x1050, 3840x800, 2400x600
3 x 1920x1080 screens = Eyefinity Resolutions: 5760x1080, 4800x900, 2400x600
3 x 1920x1200 screens = Eyefinity Resolutions: 5760x1200, 5040x1050, 2400x600
3 x 2560x1600 screens = Eyefinity Resolutions: 7680x1600, 5040x1050, 2400x600 (This one I find odd, because you'd think they would use 5760x1200 for the middle resolution)
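
A toy Python sketch of that pattern, for illustration only: the mode ladder and the aspect-ratio matching are assumptions guessed from the list above, not anything pulled out of the driver, and the sketch does not reproduce the odd 2560x1600 row.

Code:
# Toy reconstruction of the grouped-resolution pattern listed above. The guess:
# keep 2400x600 as "safe mode", triple the native mode across, and triple the
# next common mode down that shares the native aspect ratio. COMMON_MODES is
# an assumed ladder for illustration; the 2560x1600 case doesn't fit it.

from fractions import Fraction

COMMON_MODES = [
    (2560, 1600), (1920, 1200), (1920, 1080), (1680, 1050),
    (1600, 900), (1360, 768), (1280, 800), (800, 600),
]

def eyefinity_modes(native, count=3):
    w, h = native
    aspect = Fraction(w, h)
    modes = [(count * w, h)]  # top: native tripled across
    # middle: largest common mode below native with (roughly) the same aspect
    for cw, ch in COMMON_MODES:
        if ch < h and abs(Fraction(cw, ch) - aspect) < Fraction(1, 20):
            modes.append((count * cw, ch))
            break
    modes.append((count * 800, 600))  # "safe mode" 2400x600
    return modes

for native in [(1600, 900), (1680, 1050), (1920, 1080), (1920, 1200)]:
    print(f"3 x {native[0]}x{native[1]} -> {eyefinity_modes(native)}")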

The only way to change your Eyefinity resolutions is to either swap one of the monitors for a lower-resolution one, or to do an EDID edit on your monitor (rewriting its highest resolution to something lower, which means you lose the bigger resolution).
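
For anyone curious what an EDID edit actually touches, here's a minimal Python sketch that reads the preferred resolution out of a 128-byte EDID dump; the first detailed timing descriptor at offset 54 is the field such an edit rewrites. The dump filename is just a placeholder.

Code:
# Minimal sketch: read the preferred resolution from the first detailed timing
# descriptor of a 128-byte EDID base block. This is the field an "EDID edit"
# would rewrite to advertise a lower native resolution.

def preferred_resolution(edid: bytes):
    assert len(edid) >= 128 and edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", \
        "not a valid EDID base block"
    dtd = edid[54:72]  # first (preferred) detailed timing descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

if __name__ == "__main__":
    with open("monitor.edid.bin", "rb") as f:  # placeholder path to an EDID dump
        w, h = preferred_resolution(f.read())
    print(f"preferred mode: {w}x{h}")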
 
Just a heads up, 4800x900 on 1080p screens looks really bad. Try running 1680x1050 on a single 1080p monitor; 4800x900 across three looks even worse than that.
 
Just a heads up, 4800x900 on 1080p screens looks really bad. Try running 1680x1050 on a single 1080p monitor; 4800x900 across three looks even worse than that.

I think that is the big thing; ever since LCDs got popular it's been native resolution or nothing for me.

I will turn down settings, etc., but the resolution is always the last thing to get turned down.
 
I dunno, it's really hard to run at 7680x1600 in a lot of newer games. I wish I could run at 5760x1200, because 5040x1050 looks like shit.
 
I dunno, it's really hard to run at 7680x1600 in a lot of newer games. I wish I could run at 5760x1200, because 5040x1050 looks like shit.

Based on your sig, overclock the crap out of that C2D and watch your minimum fps skyrocket - going from 3GHz to 3.6/3.8 would make it seem like you upgraded your GPU.
 
Just a heads up, 4800x900 on 1080p screens looks really bad. Try running 1680x1050 on a single 1080p monitor; 4800x900 across three looks even worse than that.

1680x1050 is a 16:10 aspect ratio so it will most definitely suck on a 16:9 monitor.

Outside of that, it's down to personal preference. Try DiRT 3 with everything on Ultra at 3x1080p with 8xAA and you'll have framerate issues. You could drop down to High settings, but dropping to 4800x900 with 8xAA and leaving settings on Ultra is infinitely preferable in terms of visual quality.

I will always drop resolution first instead of quality settings. A good analogy is film: Avatar at 720p looks pretty fucking hot, and I doubt many people would notice the shift from 720p to 1080p while watching that film. Visually, it is beyond anything we have in games today. Most games these days have copious post-processing effects that cover up aliasing. Until graphical quality hits a point where it is indistinguishable from reality aside from resolution, I'll keep my settings on max and drop resolution to keep my framerates up.

Again, 'quality' is subjective and so YMMV.
 
Based on your sig, overclock the crap out of that C2D and watch your minimum fps skyrocket - going from 3GHz to 3.6/3.8 would make it seem like you upgraded your GPU.

I've been running at 3.6 for a while on the stock heatsink/fan. Do you think I could crank out a little more with a Tuniq Tower or something similar?

You, my friend, need a 6990 or two GTX 580 3GBs on top of a 2500K.

Well, my next upgrade will be a 2500K / new mobo / 8GB of RAM (maybe 4 to start with). But with all my bills, that won't happen for a while.
 
Finkus, thanks for that suggestion on Nvidia Surround. I previously had a GTX 285 and considered the Surround route, but opted for the 6870 due to the much lower price of entry. I was hoping that someone might have known a way around the three-resolution limit, but from the sounds of things there isn't one. In that case, I'll probably head back to Nvidia eventually. I haven't had any of the typical ATI driver problems everyone rags about, but I would like to be able to adjust my resolution in a setup as demanding as surround.
 