Triple 30" Monitors

ShyGal

n00b
Joined
Dec 31, 2010
Messages
4
What video card setup is standard for triple 30" monitors (3007WFP)? Three 5870? If I do get the 5870's, does it make sense to crossfire them? Or just one video card per monitor or what? I'm kind of a newbie, so give it to me in layman's terms. Thanks.
 
If you are not gaming, then yeah, one 5870 should run them. If you are gaming, you might want to consider the 6970, and one of them won't cut it for gaming; you are going to need two regardless.

Get two 6970s first; if that doesn't cut it, grab a third one. I think you would run into framebuffer issues at that high a resolution with the 5870s.
 
If you want to run them in Eyefinity, then they will need to be on the same card, so three separate cards won't work in that case. You'll need a DisplayPort adapter, too.
 
OK, the 3007WFP has dual-link DVI-D ports with HDCP.

OK, I do some gaming every now and then, so I want a good, powerful graphics setup.

It doesn't look like Eyefinity will be powerful enough running on only a single 6970? I'm guessing three 6970s will be the best option. Yes?

Why do I need DisplayPort adapters? Isn't there a DVI-D port on the 6970?
 

I wouldn't use 5870s at all unless they were 2GB models. There isn't enough memory onboard to deal with that kind of resolution, especially if you are interested in more than 2xAA. There is little point in using three of them in CrossFireX, as the GPU scaling just isn't there with the current 5xxx-series drivers.

I used a single Radeon HD 5970 overclocked to 5870 clock speeds and it wasn't enough for three 30" monitors. Not even close. Some games worked well enough with no AA, but many games just punished the card too much. I had to turn the details and settings down on way too many games for my taste. I stepped up to dual GeForce GTX 580s and couldn't be happier. I'd really like a bit more power for Crysis, AvP, Metro 2033, and NFSHP. I'm contemplating adding a third GTX 580 to do the job. However, 3-way SLI has its own set of problems that go with it, namely power consumption and heat.

DanD seems to be doing pretty well with SLI'd 580s.

http://hardforum.com/showthread.php?t=1571119

Indeed I am. :cool:

Hey, thanks for the link; this looks exactly like what I am looking for. I just wonder: how will I connect three monitors to two SLI'd video cards?

Each card has two DVI ports. You connect the first two monitors to the primary card and one to the second. In the case of the Dell 3007WFP-HCs, I just ran plain old dual-link DVI cables and that did the trick. I'm not using any adapters or anything else. With my ATI/AMD Eyefinity setup I needed a Mini DisplayPort to DisplayPort adapter followed by an active DisplayPort to dual-link DVI adapter.
 
OK, thank you, Dan. Thanks, guys. The SLI'd 580s sound like a no-brainer. Plus, like you said, you can always add a third 580 if you feel it isn't enough.
 
Scaling with a third card still isn't where it is with dual-GPU configurations, but it still adds some performance.
 
For 3x 30", I personally would go with 3x 6970 because of the extra video memory. I wouldn't even consider 580 SLI at that kind of resolution if you are a serious gamer. 580 SLI is seriously overpriced for what you get compared to 3x 6970, or three unlocked 6950s. 69xx-series scaling is insane, even better than NVIDIA's scaling. For around the same price, three unlocked 6950s would demolish a 580 SLI setup at those resolutions with a multi-screen setup.
 
OP, you now have your answer here:

http://hardforum.com/showthread.php?t=1573598

The guy is keeping the 2x 6970 over 2x GTX 580 for a 3x 30" setup. I know 580 SLI owners are in denial, but the 580 is seriously lacking in video RAM. For the price, it SHOULD have 2GB like the 6970.

580 SLI is not worth it for high-end gaming at really high res. :)
 
Well, the poster in that thread used dual GTX 580s and dual Radeon HD 6970s. He did not use 3-way SLI or CrossFireX. I haven't seen anything about AMD improving 3x GPU scaling; NVIDIA has done some work here. Also, you need two active DisplayPort to dual-link DVI adapters, which puts the cost of 6970 CrossFire into the same neighborhood as GTX 580 SLI. AMD made a mistake and went with a useless single-link DVI port on each card. Furthermore, with quite a few games the performance of CrossFire leaves something to be desired. There are tons of benchmarks being thrown around, and many of them show GTX 580 SLI beating out CrossFired 6970s fairly easily. All things being equal, more VRAM is nice, but from what I've seen few, if any, games can actually make use of more than about 1.3GB of VRAM. The GTX 580 has that and slightly more.

I've got to say that, given the state of AMD's drivers being less than impressive right now, even if my setup doesn't perform quite as well as a "comparable" AMD setup, I'm happy that I don't have to suffer through using their drivers. As for the OP of the thread you linked, he's got an odd selection of games. His testing setup looks like a lot of work went into it, but I have to question some of the methodology. I'm not saying it's invalid, but I wouldn't base my decision on that data alone.
 
This is the only review of 6970 CrossFireX I've seen:

http://lab501.ro/placi-video/his-hd-6970-studiu-de-scalare-in-configuratii-multi-card/11
http://lab501.ro/placi-video/his-hd-6970-studiu-de-scalare-in-configuratii-multi-card/15
http://lab501.ro/placi-video/his-hd-6970-studiu-de-scalare-in-configuratii-multi-card/10

Scaling in Crysis is quite impressive. AvP seems to be an anomaly, showing no gain in performance over 2 cards:

http://lab501.ro/placi-video/his-hd-6970-studiu-de-scalare-in-configuratii-multi-card/14
 
I'd love to try some 6970s myself and compare them to my current setup, but I don't think it's going to happen. I'm not really in a hurry to buy two more graphics cards and two active DisplayPort to dual-link DVI adapters and go through the hassle of testing all that, especially not after I just had to replace my water pump and my motherboard. :mad:
 
I run 3x30" displays in Eyefinity. Three 30" is total of 12 Megapixel display area and requires very strong setup behind. First, you need good video cards and you also need good CPU and power supply. I tell you what kind of setup I have, you will need something with same caliber.

I tried with single 5970 and ran out juice. One overclocked air-cooled 5970 (850/1200 MHz) is not enough for that monstrosity display area. Now I have two 5970s in Crossfire. They are both water-cooled and overclocked to 965/1200 MHz. Now I have enough power to run Eyefinity on 7680x1600 resolution or 8000x1600 with bezel compensation. Still you may think my FPS is quite low, something between 40 to 60 in BFBC2 for example at low graphic settings. But there are one problem with 5970s, they have only 1GB of memory per GPU. I can't use anti-aliasing or post processing on any game with eyefinity resolutions. Even 2xAA will kill the game immediately and result is 1 FPS even in menus. Same result if I try using post processing. I have i7 980X at 4.6 GHz (also water-cooled), but it's not enough for two hungry 5970. I have 1250W power supply and I can bring it to it's knees if run prime95 and furmark simultaneously.

So, I can't recommend 5000-series video cards due the lack of memory, it really hurts in eyefinity resolutions. You really need video cards wih 2GB memory per GPU. I'm changing my 5970s for 3-4x 6970 setup or maybe 2x 6990 when they are released. I'm not sure yet. You need at least two GPU setup, but even then you have to lower graphic settings in games with that resolution. Three or four GPU setup would be much better, but for that you need powerful CPU. I recommend i7 4 GHz or more for 3-4x GPU setup and power supply should be 1000W or something like that.
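
For anyone wondering where the 12-megapixel figure comes from, it's just the three panels' native resolutions added up (a quick back-of-the-envelope check, not something from the posts above):

Code:
# Three 2560x1600 panels side by side (Dell 3007WFP native resolution)
width, height, panels = 2560, 1600, 3
total_pixels = width * height * panels
print(total_pixels)                    # 12,288,000 pixels, ~12.3 megapixels
print(f"{width * panels}x{height}")    # 7680x1600, the combined Eyefinity resolution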
 
I use a single 5870 Eyefinity 6 card with my 3x 3007WFP-HC monitors. I have to use Mini DisplayPort to dual-link DVI adapters.

The Eyefinity 6 edition 5870 has 2GB of VRAM, which allows for moderate amounts of AA (2x-4x).
 
Still not enough for gaming at high resolution unless you want to turn down the quality settings in some games.
 
Well, if you're at that resolution, you probably will never have enough power, until maybe the next iteration of cards. But that's just me.
 
Three 6970s or 580s will drive a lot of games very well at that resolution. 3x 1920x1080 in 3D is actually even more demanding than 3x 2560x1600, and there's a good number of games that run at moderate to high settings producing 60 FPS, which is the limit in 3D Surround.
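
Rough pixel math on that comparison, for what it's worth (the only assumption here is that stereo 3D renders every frame twice, once per eye):

Code:
# Pixels the GPUs have to render per displayed frame
surround_3d = 3 * 1920 * 1080 * 2   # three 1080p panels, each frame rendered twice for 3D
eyefinity   = 3 * 2560 * 1600       # three 30" panels, rendered once
print(surround_3d)                  # 12,441,600
print(eyefinity)                    # 12,288,000

So 3D Surround actually pushes slightly more pixels per frame, before you even factor in chasing 60 FPS per eye.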
 
Nothing to stop you from adding a third Radeon to a pair of 6970s, but they won't scale as well. If you're able to spend $1,600 on graphics cards [you probably are with 30" monitors] or you're interested in 3D, the 580s will be the better buy.

I do stress, though: if you're only going with two cards, the 6970 pair will be the better buy, as they will run that resolution better than the GTX 580s and cost a lot less.
 
As others have suggested, I would go for 2x 6970s for now, which will perform well, and add a third if needed. I would look at 2GB cards, no less.
 
I would love to recommend two HD 6950s or HD 6970s, but the lack of dual-link DVI ports kills the savings when you have to buy two $100 dual-link DP-to-DVI adapters for the 30" monitors. I can't believe ATI only included one dual-link DVI port on their high-end video card. So unless your 30" monitors have DisplayPort connections, I'd probably pick up two GTX 580s and not have to worry about adapters.

2 x GTX580 = ~$1000
2 x HD6970 + 2 dual link DP-to-DVI adapters = ~$1000
2 x HD6950 + 2 dual link DP-to-DVI adapters = ~$800

Although the extra RAM on the HD 69xx cards and the lower power requirement are tempting.

Are there any HD 69xx cards with two dual-link DVI ports? If that's the case, I would recommend those... but I don't think there are.

Too bad the GTX 570 doesn't have more RAM.
 
Are you not able to use the DisplayPort connection from card 1, the dual-link DVI from card 1, and the dual-link DVI from card 2?
 
I would love to recommend two HD 6950s or HD 6970s, but the lack of dual-link DVI ports kills the savings when you have to buy two $100 dual-link DP-to-DVI adapters for the 30" monitors. I can't believe ATI only included one dual-link DVI port on their high-end video card.

You're right that it sucks to have only ONE dual-link DVI-D port on the 69x0 that supports 2560x1600. But if you do CF with 3x 30"s, don't you only need to buy one adapter, assuming there is no DisplayPort on any of the 30"s? I thought you could utilize one dual-link DVI on each card plus one DP-to-DVI adapter.
 
I just bought my first one and I am concerned about just having one with my video card. The LCD screen I can write off as a business expense, but I am not sure I can get away with a video card.
 
Are you not able to use the DisplayPort connection from card 1, the dual-link DVI from card 1, and the dual-link DVI from card 2?

No, in CF the second card is the slave card, so all display connections must come from the primary card.
 
For the price, it SHOULD have 2GB like the 6970.

For the price up front and the increased power usage, it should also make you sandwiches for lunch, serve you premium beer with dinner, make snacks while you're gaming, and aggressively negotiate with your electric company. A friend of mine tripled her electricity bill by getting a pair of 580s. I'll see if I can get her to share the power consumption and meter readings. Me, I'm happy with my 5850 and a single 24" monitor for now.
 
Bullshit. I've had high-end hardware for years and it doesn't do that much to my power bill. I run my machines 24/7. I've actually gone a month without gaming before, due to being busy, being out of town on business, etc., and my power bill dropped very little. Running your microwave for 10 minutes a day will do far more to your power bill than your computer will. Washing machines, etc., also impact your electric bill far more than high-end video cards will.
 
Lol, no. Even if she played games 24/7, that's not going to happen.
 
A friend of mine tripled her electricity bill by getting a pair of 580s. I'll see if I can get her to share the power consumption and meter readings. Me, I'm happy with my 5850 and a single 24" monitor for now.

Bullshit.

As in complete. I've been running 3x SLI for 2.5 years: 280, 480, and now 580. The only thing that even begins to use the kind of power haysupimark is describing is central AC, and even that will only double my bill in the summer when it's on the hot side. Folding 24/7 might come close to the AC, but then the AC isn't running 24/7.
 
Lol, no. Even if she played games 24/7, that's not going to happen.

At a time when there are a lot of good new games out, I'll play for hours and hours on end, into the wee hours of the morning. I'll do this for weeks at a time if the games are engrossing enough. I've never seen my power bill shift. Even my Skulltrail rig with dual 4870 X2s didn't make a big difference in my power bill, and that machine pulls more power than any other I've ever used. That very machine now runs side by side with my gaming rig as a server these days, though I don't have but one GeForce GTX 280 in it now; I had three at one point. Also, running a test bench system doing stability testing for 20 hours straight at a time, and testing for days on end, doesn't impact my power bill that much, even at times like now when there are new boards and chipsets coming in all the time.
 
I don't think 'tripled the electricity bill' is very fair, but it is a substantial increase.
Say you were a hardcore gamer and on average spent 10 hours a day in game. If we assume idle load to be equal and compare, for the sake of example, a single HD 5770 versus three GTX 580s, that's about 750W more power being drawn DC, so depending on the efficiency of the PSU it's probably 850-900W at the wall. That makes about 9 kWh a day. At the typical cost of electricity here in the UK, that's about £1 a day, or £30 a month.
The current electricity usage [when the gaming PCs aren't being used much] between four people in our house is about £75/month. So suppose there were only two of us and it was more like £40.
Adding a triple-SLI setup to a hardcore gamer's PC can almost double the electricity bill. It won't triple it, though.

It's a question of usage; if you only play games on occasion, of course it's not going to make much difference.
I'll just put it out there that my dual 4870 X2 PC [that's a lot less power than three 580s, or even three 570s] uses a similar amount of power to my air conditioner: 805W AC in L4D2, versus 850-1020W for the A/C unit.

It does somewhat bug me when people completely write off the power consumption difference for high-end systems. Yes, you might not visibly see an increase in your bill, but that's because it fluctuates with the use of other stuff. Fact is, though, spend the same amount of time gaming on a 150W card that you would on a 250W card, and you're saving, in US terms, a little over a cent every hour you game. Six hours a day for a year is about $22 a year. Over the life of the card, if you upgrade reasonably often, maybe $50?
'So what, who cares about $50 over two years' - that $50, when added to the initial outlay of a card, makes it more expensive by comparison, throwing previous price/performance comparisons out the window.
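
Spelling out the arithmetic above, in case anyone wants to plug in their own numbers (the wattages are the estimates from the post and the electricity rates are rough typical figures, not measured data):

Code:
# UK example: ~900W extra at the wall, 10 hours/day of gaming
extra_draw_kw = 0.9
hours_per_day = 10
uk_price_per_kwh = 0.11                      # assumed typical UK rate at the time (GBP/kWh)
kwh_per_day = extra_draw_kw * hours_per_day  # ~9 kWh/day
print(kwh_per_day * uk_price_per_kwh)        # ~GBP 1 a day
print(kwh_per_day * uk_price_per_kwh * 30)   # ~GBP 30 a month

# US example: 150W vs 250W card, 6 hours/day, ~USD 0.10/kWh
delta_kwh_per_year = 0.1 * 6 * 365           # ~219 kWh/year
print(delta_kwh_per_year * 0.10)             # ~USD 22 a year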
 
570 SLI seems like a good option if you don't want to spend $1,000 on your video cards. The memory seems to be the biggest problem running a 12 MP display with those cards.
Also, running non-native (1920x1200) on a 30" doesn't look as bad as you might think, since the pixel pitch is tight and there is still a lot of resolution. I'm working on getting my third 30" right now, and I'm planning to run non-native when I can't get decent frame rates at full resolution.

I think the dual-GPU NVIDIA card coming out could also be a good option.
 
It looks ghastly. Maybe not much worse than a low-quality monitor, but really, 1920x1200 on a 30" is a sorry sight. You're much better off using 1:1 pixel mapping and running a 23" box within the 30" display space. [The 3008WFP and U3011 can do this, but not the 3007WFP or 30" Cinema Display.]

Two GTX 570s are OK for $700, but two HD 6950s are faster and $100 less (not to mention only 340W versus 450-550W).
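
On the "23-inch box" above: it falls straight out of the panel's pixel pitch (just the arithmetic, assuming the 3007WFP's 30" diagonal and 2560x1600 native resolution):

Code:
from math import hypot

ppi = hypot(2560, 1600) / 30          # ~100.6 pixels per inch on a 30" 2560x1600 panel
box_diagonal = hypot(1920, 1200) / ppi
print(round(box_diagonal, 1))         # ~22.5", so a 1:1-mapped 1920x1200 image fills roughly a 23" monitor's worth of screen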
 
Maybe the ATI scaler looks worse, but in a game, when I'm not trying to read text, it doesn't look awful. It doesn't look nearly as bad as trying to run 1680x1050 on a 1920x1200 panel or anything like that. Obviously it's not ideal, but being able to push 12 megapixels of display in every game isn't realistic.
 
That's true, but it does give you the smudged look of HDTV gaming instead of the crisp, clear picture you typically get from a 30" monitor. It's bearable, but it's an astronomical difference from what you would normally see. To use three monitors properly in games you have to stop paying attention to the graphics so much, so running a lower res may be less of an issue. Even at native res I find games stretched out to 3x width pretty blurry and mediocre; I'm very hot on crisp edges.
 
Yeah.. 1920x1080 on a 30" U3011 is ... such a waste. Once you go 2560x1600, you never go back.
 
Well, 1920x1200 at least, for the correct aspect ratio. The 1:1 mapping is useful, though, if you really can't manage 2560x1600.
 
On the power bill question, this may be a case of where people live. I live in Florida in the US, and I have central AC. A typical summertime power bill is $250; my winter power bill has been between $140 and $150 a month. Comparatively, my gaming system is negligible versus my AC unit.
 