3x 30" Portrait 6970 CF Eyefinity vs 580 SLI Surround Showdown

So the only way to eliminate the vertical tearing is to disable VSync, which would just cause horizontal tearing instead? Am I reading that correctly? lol

In my experience with 3x 30" monitors, whether portrait or landscape, both the 6970's and the 580's get horrible screen tearing all over the screens with VSync off. If I enable VSync, the screen tearing completely disappears with the 580's. With the 6970's, though, I get a permanent single screen tear on one or two of the screens that is very noticeable. Landscape or portrait doesn't matter.
 
Just an update with the 6970's, even at lower Eyefinity resolutions with VSync on I get that single screen tear on the monitor connected with DVI-D.

I brought my computer next to my screens and am now connected with 6 foot DP and DVI-D cables that came with the monitors for testing purposes. Going to see if I still get the DP link failure errors and screen blinking. Unfortunately that DVI-D screen tear is still there. I will try and hunt down a DVI-D to DP adapter and use it in reverse order. Put the adapter plug into the monitor and use the DVI-D connection to go from the 6970 to the monitor and see if that helps.
 
Hey vega, I get the display port error and my PC is right below my desk....but I have yet to experience any tearing....
 
Ya, I am getting the DisplayPort link failure even with thick, short 6-foot cables! Either there's a hardware issue with both my 6970's, there's a driver problem, or DP is just complete rubbish. I haven't had a single screen blink yet with the short cables, so there is some hope.
 
One of the dongles packaged with my 6950s was faulty. There's nothing inherently wrong with DisplayPort.
 
Even at lower resolutions you're getting problems? That's not good. I have two of my monitors hooked up with DVI and one with DP. I haven't noticed any tearing at all and everything is working smoothly.
 
Vega,

I may have missed you answering this, but have you tried one card at a time to see if it is the DP on just the top card? Sorry to hear about all the trouble you are having. I am having some issues of my own, but I think it is all videocard stability related. Having one of my 6950's refunded from Newegg. Ordered another and hope it fixes the problem.

System Specs:

i7 920 @ 4.2ghz HT enabled
6GB DDR3
Asus PX58D-E
850W Corsair PSU
Win7

With both cards @ 6970 speeds, Metro 2033 at 5760x1080 on normal is playable. Once you go to High settings, it REALLY drags down the system. (FYI) I'll try to grab framerates soon. Want to play 1920x1080 tonight to see how it looks with all the bells and whistles enabled.

Good luck
 
I have done some digging and apparently, this screen tear is an Eyefinity issue that AMD says it cannot fix! It deals with using mixed mode Displayport/DVI-D, which is the only friggin' way to do 3x Eyefinity with current 69xx series.

Quote from ATI beta tester: "The 3 displays are turned on at different times, have different EDIDs, run through different input connectors, with some being routed through the Ramdac or not so its not possible to sync them in all cases.

I asked the people in charge of Eyefinity so they should know or not if it is possible to fix the tearing and they said no except by using the S400 and FireGL card or by using an Eyefinity 6 card with 3 Displayport connectors. So if you wanna argue with the engineers & designers that made Eyefinity then go ahead."

Supposedly there is a slight timing difference between the DP and DVI-D connection that cannot be corrected for. That's why 580's work in Surround, they are all DVI-D. The only way around this I can find is to use all native DP connections. So to eliminate the screen tear, you need to wait for the 6990 or non-reference 6970's that are all Displayport and haven't been released.
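To make that concrete, here is a rough back-of-the-envelope sketch of why a constant scanout offset shows up as one stationary tear instead of the usual roaming tears. The 1 ms offset and the panel numbers below are made-up values purely for illustration, not anything measured from these cards:

```python
# Rough illustration only: two outputs refreshing at the same rate but with a
# constant phase offset disagree at the same point of every frame, so the
# mismatch sits at one fixed scanline instead of wandering like normal tearing.
REFRESH_HZ = 60.0
FRAME_MS = 1000.0 / REFRESH_HZ       # ~16.67 ms per frame at 60 Hz
PANEL_LINES = 1600                   # native scanlines of a 2560x1600 panel (assumed)
PHASE_OFFSET_MS = 1.0                # hypothetical DP vs. DVI-D scanout offset

tear_line = (PHASE_OFFSET_MS / FRAME_MS) * PANEL_LINES
print(f"A constant {PHASE_OFFSET_MS} ms offset puts the tear near scanline "
      f"{tear_line:.0f} of {PANEL_LINES}, frame after frame")
```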

Interestingly, I can move the screen tear around between the monitors just by changing the "preferred" monitor in the Eyefinity display settings. I can shift it between the DP and DVI-D monitors, so I can park the tear on one of the left or right monitors where it's not as distracting. At least it's good to know what the problem is, and in theory I can swap out a reference 6970 for a future "Eyefinity 6" 6970 or a 6990 to eliminate the problem. Or I could keep the current cards and go tri-xfire with the Eyefinity 6. As long as all monitors are connected to the primary crossfire card (the one whose display outputs are actually driving them) through the same type of display connection, the problem should disappear.

Now that I know what the problem is and the solution, I can get back to benchmarking. So from my experience, if you want no screen tear now, you will have to get 580's as they use 3x DVI-D. Or wait for the Eyefinity 6 6970's or 6990's (if the 6990's end up having 3x DP) and use 3x Displayport monitors.
 
Vega,

I may have missed you answering this, but have you tried one card at a time to see if it is the DP on just the top card? Sorry to hear about all the trouble you are having. I am having some issues of my own, but I think it is all videocard stability related. Having one of my 6950's refunded from Newegg. Ordered another and hope it fixes the problem.

System Specs:

i7 920 @ 4.2ghz HT enabled
6GB DDR3
Asus PX58D-E
850W Corsair PSU
Win7

With both cards @ 6970 speeds, Metro 2033 at 5760x1080 on normal is playable. Once you go to High settings, it REALLY drags down the system. (FYI) I'll try to grab framerates soon. Want to play 1920x1080 tonight to see how it looks with all the bells and whistles enabled.

Good luck

I've got my processor overclocked to only about 3.33GHz right now. I'm still tuning my system since replacing my EVGA X58 3X SLI Classified motherboard with a Rampage III Formula. I've played Metro 2033 at 5760x1200 and it's smooth as butter on high settings. Though I'm using NVIDIA GeForce GTX 580's. At "Very High" the look of the game improves dramatically but performance tanks. It's not playable like that.

Metro 2033 is a demanding game. The most demanding since the original Crysis game. I don't think anything can run it maxed out at 5760x1080 or greater resolutions at present.
 
Vega;

Based on your studies, is it possible to eliminate the screen tearing by using the display port(DP) of the 6970 and connect to the DP of a zr30w and then use the two miniDP of the same 6970 and connect 2 more zr30ws by way of miniDP to DP cable (provided they are short enough)? I do not have to have DVI, but if I am going to spend the money for 6970 crossfire and also purchase a 3rd zr30w, I'd rather avoid video artifacts/crashes/etc and have full resolution. Correct me if I am wrong, but wouldn't the miniDPx2 and the native DP allow for 7680x1600 resolution and bypass the DVI-D timing issue?

by the way, when you first posted up about the 3 30inch bezel-less monitor system, what GPUs were you using then?...things regarding FPS/ FSX seemed good at that time, yes?

thanks for your thoughts...and your monumental efforts

rob
 
All current 6970's have 2x mini-DP, an HDMI port, a DVI-D port and a DVI port. There is no way to use 3x DP natively. All monitors must be connected to the same "primary" crossfire card for Eyefinity, unlike SLI. My original setup was 2x 480's, which I upgraded to 2x 580's. But the serious VRAM limitations on those are what made me test out these 6970's and head back into AMD hell muahaha. :D
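For anyone sanity-checking why every 30" panel has to hang off either that lone dual-link DVI or a DP/mini-DP output (possibly through an active adapter), here is a quick pixel-clock estimate; the blanking figures are rough reduced-blanking assumptions of mine rather than exact timings:

```python
# Approximate pixel clock needed for a given mode (blanking values are rough
# reduced-blanking-style assumptions, not exact CVT-RB timings).
def approx_pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=46):
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

need = approx_pixel_clock_mhz(2560, 1600, 60)   # ~269 MHz for one 30" panel
SINGLE_LINK_DVI_MHZ = 165.0                     # single TMDS link ceiling
DUAL_LINK_DVI_MHZ = 330.0                       # two TMDS links

print(f"2560x1600@60 needs roughly {need:.0f} MHz of pixel clock")
print("fits a single-link DVI connection:", need <= SINGLE_LINK_DVI_MHZ)   # False
print("fits a dual-link DVI connection:  ", need <= DUAL_LINK_DVI_MHZ)     # True
# DisplayPort (even 1.1 over four lanes) also has comfortable headroom for this
# mode, which is why the remaining panels end up on DP or active DP adapters.
```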
 
The connectivity issues, requirements for display port and active displayport adapters are some of the reasons why I wouldn't really consider the AMD side this time. There is less VRAM on the 580's but that seems to impact us most when trying to use AA more so than other times. Generally speaking the GTX 580 is still the faster card so I chose to go with those instead. Not having to deal with the BS of two active display port to dual link adapters is nice.
 
The screen tearing shocked me too when I went from 5850->6970 single card Eyefinity (5760x1200). I have also found that setting the center monitor to be the "preferred" display makes the tearing occur on one of the side monitors.

After moving the tearing to a side monitor, I've played regularly for a week and never observed the screen tearing unless I was purposely looking for it. With this change, it's really a non-issue for most users.
 
On another note, I've noticed that the single "scrolling" screen tear you get with VSync on in Eyefinity goes away and is replaced by normal screen tearing with VSync off. I guess this whole simultaneous DVI-D/DP issue only affects Eyefinity with VSync on.

I've also noticed that the flight sim Rise of Flight has mysteriously gotten a large FPS boost all of a sudden. This game is notorious for not supporting Crossfire, yet now it's using both of my 6970's. I must be going crazy....
 
The connectivity issues, requirements for display port and active displayport adapters are some of the reasons why I wouldn't really consider the AMD side this time. There is less VRAM on the 580's but that seems to impact us most when trying to use AA more so than other times. Generally speaking the GTX 580 is still the faster card so I chose to go with those instead. Not having to deal with the BS of two active display port to dual link adapters is nice.

Some of the flight sims I use are seriously limited by the 1.5GB VRAM on the 580's. Not only that, in Rise of Flight I can easily max out the 2GB VRAM on the 6970's and have it crash. Imagine how much worse it is with half a GB less. Even in a simple Heaven 2.1 benchmark, MSI Afterburner shows GPU memory pegged at the 1536MB limit and my performance drops. I am glad you haven't run into memory issues, but in the games I play they sure are a problem. :mad: But you're right, nVidia has the right idea to keep everything nice and simple and use the tried and tested 3x DVI-D approach. This mish-mash of connectivity AMD has come up with is just horrid to deal with.
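To put some rough numbers behind how fast VRAM disappears at these resolutions, here is a small sketch; it assumes plain 32-bit buffers and straightforward MSAA multiplication and ignores driver overhead, compression and everything else, so treat it as illustration only:

```python
# Very rough buffer-size arithmetic, nothing more.
def buffer_mb(width, height, bytes_per_pixel=4, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 * 1024)

w, h = 4800, 2560                      # 3x 2560x1600 panels in portrait Eyefinity
color = buffer_mb(w, h)                # ~47 MB for one color buffer
msaa4 = buffer_mb(w, h, samples=4)     # ~188 MB for a 4x MSAA color target
print(f"single color buffer: {color:.0f} MB, 4x MSAA color target: {msaa4:.0f} MB")
# Stack up depth buffers, multiple render targets, back buffers and all the
# textures a game streams in, and 1.5 GB goes very quickly at this resolution.
```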

The screen tearing shocked me too when I went from 5850->6970 single card Eyefinity (5760x1200). I have also found that setting the center monitor to be the "preferred" display makes the tearing occur on one of the side monitors.

After moving the tearing to a side monitor, I've played regularly for a week and never observed the screen tearing unless I was purposely looking for it. With this change, it's really a non-issue for most users.

Ah so you notice the screen tear too eh? I can put it off to one of the side monitors, but I still can notice it greatly in flight sims where the sky is nice and bright. That makes screen tears very apparent versus dark images. I can live with it for now but I'd like to get rid of it permanently by going 3x DP on a "Eyefinity 6" 6970 card.
 
Thanks for your response, Vega...

just so I am clear, as the last ATI/AMD card I used was an X800XT: it is not possible to drive 3 ZR30W monitors from the primary crossfire 6970 card with one DP and two mini-DP cables (a total of 3 DP outputs)?....if so, at least one DVI or HDMI output must be used, thereby inducing the hell you described?....WTF is AMD thinking. Surely they must know that not everyone is using 1080...


thanks again,

rob
 
Exactly, what the hell was AMD thinking. :eek: The only solution I see is an Eyefinity 6 DP card, or a 6990 if it has 3x DP.
 
Some of the flight sims I use are seriously limited by the 1.5GB VRAM on the 580's. Not only that, in Rise of Flight I can easily max out the 2GB VRAM on the 6970's and have it crash. Imagine how much worse it is with half a GB less. Even in a simple Heaven 2.1 benchmark, MSI Afterburner shows GPU memory pegged at the 1536MB limit and my performance drops. I am glad you haven't run into memory issues, but in the games I play they sure are a problem. :mad: But you're right, nVidia has the right idea to keep everything nice and simple and use the tried and tested 3x DVI-D approach. This mish-mash of connectivity AMD has come up with is just horrid to deal with.



Ah so you notice the screen tear too eh? I can put it off to one of the side monitors, but I still can notice it greatly in flight sims where the sky is nice and bright. That makes screen tears very apparent versus dark images. I can live with it for now but I'd like to get rid of it permanently by going 3x DP on a "Eyefinity 6" 6970 card.

I don't play flight simulators. So that may be part of it. All the games I play are first person shooters and a few other games like Batman Arkham Asylum, or Dead Space 2. None of which use that much VRAM. I also play racing games but again none of them have used up all my VRAM.
 
Ah ya, those generally don't demand tons of VRAM. If nVidia ever came out with 3GB cards I'd be all over those.
 
thanks again, Vega...

it makes my head hurt to think that AMD would ship out a card with the known timing issue/connectivity limitation that has plagued your testing...as was mentioned previously in this thread, being [Hard] means some pain. Nonetheless, I think I'll sit tight with two monitors and watch what the OEMs do...

rob
 
So how do I set the center monitor to be preferred in order to shift the tear?
 
thanks again, Vega...

it makes my head hurt to think that AMD would ship out a card with the known timing issue/connectivity limitation that has plagued your testing...as was mentioned previously in this thread, being [Hard] means some pain. Nonetheless, I think I'll sit tight with two monitors and watch what the OEMs do...

rob

From my research, this problem has existed ever since Eyefinity was created, and there is no way around it besides buying an "Eyefinity" edition card so all displays use the same type of connection.

So how do I set the center monitor to be preferred in order to shift the tear?

On the bottom of the Desktop & Displays screen in CCC that shows your Eyefinity setup, right-click on one of your monitors. One of the three is set as your "primary"; if you click on the other two, you can set either of them as primary instead. Keep adjusting until you get the screen tear onto the "preferred" screen lol.

I just did one final confirmation that it is in fact a mixed display connection type problem. I set my Eyefinity group to use only the 2 DP monitors: problem resolved. I then set it to use one DP and one DVI-D connected monitor, and the problem was back.

Ohh ya, even using short 6 foot DVI-D cables with the Accell active mini-DP to DVI-D adapters, the monitors still go black. Biggest pieces of junk.
 
So this V-Sync Eyefinity tearing is not even specific to 1600p resolution? One would think that many more people would have noticed this and complained, unless most people aren't using V-Sync in Eyefinity in the first place, which is the likely answer.

So in that case, I take it that your use of the monitors in portrait mode makes the tearing without V-Sync unbearable? Or are you just sensitive to tearing in general?
 
I still don't understand why ATI doesn't give you the opportunity to use the DVI ports on the secondary CF card as well. Get rid of this whole DP issue altogether.
 
It's not a DP issue. It's a sync issue between DP and DVI. But yes, I'd rather use the ports on both cards, except I'd rather use all DP ports. DVI and VGA connectors are practically built to snag on every damn cable in a cable nest.
 
So this V-Sync Eyefinity tearing is not even specific to 1600p resolution? One would think that many more people would have noticed this and complained, unless most people aren't using V-Sync in Eyefinity in the first place, which is the likely answer.

So in that case, I take it that your use of the monitors in portrait mode makes the tearing without V-Sync unbearable? Or are you just sensitive to tearing in general?

Nope, not specific to 1600p Eyefinity. I changed the resolution to 5760x1200 and it was exactly the same. I think most people, like you said, don't notice it because they have VSync off. I don't know if it's how the eyes work, but to me the vertical screen tearing in portrait is much more apparent than horizontal tearing in landscape. Your mileage may vary.

I still don't understand why ATI doesn't give you the opportunity to use the DVI ports on the secondary CF card as well. Get rid of this whole DP issue altogether.

They might not have designed that into their architecture like nVidia did. Maybe the SLI bridge versus crossfire bridge has something to do with it.
 
The connectivity issues, requirements for display port and active displayport adapters are some of the reasons why I wouldn't really consider the AMD side this time. There is less VRAM on the 580's but that seems to impact us most when trying to use AA more so than other times. Generally speaking the GTX 580 is still the faster card so I chose to go with those instead. Not having to deal with the BS of two active display port to dual link adapters is nice.

Unless you have 3 3007s, all other 30" monitors that I'm aware of support display port natively. You'd only need one active adapter.
 
They might not have designed that into their architecture like nVidia did. Maybe the SLI bridge versus crossfire bridge has something to do with it.

It's a hardware implementation rather than software as it is on the Nvidia side. Take it up with CarrellK.....


They could allow this in software however, recall the early 24 screen Linux eyefinity demos...


Also, when the DP 1.2 hubs come out you will be able to use that, although probably not at 25 foot length.
 
Unless you have 3 3007s, all other 30" monitors that I'm aware of support display port natively. You'd only need one active adapter.

not for the new 6xxx series cards, as they only have one duel link dvi per card...:mad:
 
2 would be connected with NATIVE DISPLAY PORT
1 would be connected with their SINGLE DUAL LINK DVI -> active adapter

Where did I go wrong?
 
well w/ the 3007, you would need two active adapters to the native display ports on the card and 1x3007 to the duel link dvi
 
All my monitors are Dell 3007WFP-HC's. So I'd need two active DisplayPort to dual-link DVI adapters to use 6xxx series AMD cards with my machine. That's not appealing at all.
 
and I already listed the 3007 as an exception

sorry, totally misread your post...

yea DanD, $165 for 2 adapters...oh well, don't feel like upgrading to i7 or SB yet, so I may hold out on adding a 2nd 6970 and see what duel gpu card comes out soon
 
I would argue that one purchasing a set of 30" monitors should purchase 2010 level tech such as U3011/ZR30W rather than 2007 level tech.
 
I've had 2x3007's, and like DanD, not having matching LCD's would bug the crap out of me =P

I had an i7, but I game and web surf...that's the extent of my PC usage, so I sold it and downgraded to my current setup
 
sorry, totally misread your post...

yea DanD, $165 for 2 adapters...oh well, don't feel like upgrading to i7 or SB yet, so I may hold out on adding a 2nd 6970 and see what duel gpu card comes out soon

It's "dual" not "duel". Additionally a 2nd 6970 will probably serve you better than a single dual GPU card. The dual GPU cards these days tend to have gimped clocks in order to keep their TDP to 300 watts or less.

I would argue that one purchasing a set of 30" monitors should purchase 2010 level tech such as U3011/ZR30W rather than 2007 level tech.

Hell no. From what I understand those monitors have a built in scaler and considerable input lag. Not everyone is sensitive to that but I sure as hell am. I couldn't live with that. So for me the 3007WFP-HC is the only choice. Unfortunately they are in short supply these days.
 
ZR30W apparently doesn't have a scaler. :) If that is true, then the U3011 would definitely be a no-go for me because I can't abide even 15ms of input lag. My U2410s in game mode are barely acceptable.
 