Eyefinity and Triple DVI working.

But what an awesome conspiracy theory!
Either way, you can have a conspiracy or else you can blame those greedy capitalists at ATi for not spending the money for the 3rd RAMDAC. Regardless, it's evil! :rolleyes:
 

The 5870 is already the largest single GPU graphics card. If they added a 3rd RAMDAC... jesus christ.
 
So I've been trying to play nice for the last 6 weeks. I bought a BizLink which did not work and returned it to Dell. I then bought an Accell adapter, and it was even worse (it did not work at all before I updated the BIOS; after that it just flickered). Since returning it to the reseller was a major issue I tried RMA'ing it, but surprise-surprise, they say it's 100% fine and passes all tests. This only means their in-house testing is lacking, and if you need to return one, you need to return it to the reseller, as the company that produces them is not going to be of any help or admit that their product has compatibility issues at all.

ERIC
 
So far I've had no issues with the VGA adapter from Monoprice. No flickering, and it only stops being recognized if the computer restarts on its own (i.e. Windows Update, or I restart the machine; if I shut down and then turn it back on manually there are no issues), and even then I just have to unplug it and plug it back in.

I'm just gonna wait till ATI decides to release a cheap active DVI-to-DP adapter.
 

Well, to be fair, nobody knows if the problem lies with the adapters or with the video cards.
 
A lot of us are having no problems using the active adapters to get Eyefinity to work. It must be something on your end that is not working right. If you live in the Seattle area I'm willing to help you out, though most likely you're not. You can see my post a few posts back: I'm using the Apple Mini DP to DVI adapter with no powered USB hub, just plugging it into the USB slot on the back of my PC. Maybe you should take some pics of how you have it set up, or a short YouTube video, so we can help you out.
 
I doubt they will. I imagine they have worked out some deal with the inventors of DisplayPort in a ridiculous attempt at conforming the market to a few select displays... this kinda reeks of Dell, actually.

Yeah, I'm sure the VESA group and Dell were totally behind this. If DisplayPort becomes popular, absolutely no one can stop Dell, because it would be impossible for them to put a royalty-free connector on their own displays. There is absolutely no reason why, when they make a 6-screen version, they couldn't just put 6 RAMDACs on the board. Think of all the money the "DisplayPort inventors" stand to make off royalties alone. If this leads to a million more DisplayPort devices, that means they will get 1,000,000 x $0.00 (royalty fee), which equals $0.00. That is quite a bit of money.
 
I just started getting what I need for my Eyefinity setup today. I picked up a Radeon HD 5970 and I'm looking for deals on some more Dell 30" panels.
 
Which brand did you get?

Diamond. It's just a reference design, so I didn't worry about the brand too much. The warranty is only two years, but I doubt I'll have the card longer than that. Besides, I got it for $599.99 at Microcenter, which was cheaper than any I found online, and I got it right away.
 
I got my Eyefinity setup working last night. I have the 5850 with three ViewSonic 24" VX2433wm monitors. I tried using the passive adapter at first, but the third monitor would constantly shut itself on and off. I got this adapter yesterday:

http://www.accellcables.com/products/DisplayPort/DP/dp_dvid.htm

It works great, no flickering at all. Very happy so far; tried it with a few games and it worked sweet:

Dragon Age
Vanguard

here is a pic of my desktop at 5760x1080
[screenshot: 4368631358_3b76f0f46f_o.jpg]
 
Hey thexider,

Nice setup, pretty much what I want but what cpu are you running? Any slowdown when gaming?
 
Hey Peeps!!!

I am really interested in setting up ATI Eyefinity. I've been trying to research it a lot before I drop the Ca$h on three 24" monitors and the DisplayPort adaptor.

I'm not sure if you guys have read this info on the upcoming game Battlefield: Bad Company 2:

Bad Company 2 PC Graphics Details

* BY: repi
* POSTED: Feb 19, 2010, 03:39AM

Introduction

My name is Johan Andersson (Twitter: repi) and I'm one of the architects working on our proprietary Frostbite engine here at DICE.

We've had numerous requests to go through in more detail what kind of graphical features & options exist in Battlefield: Bad Company 2 for the PC. So this is an attempt at just that.

Battlefield: Bad Company 2 is based on Frostbite 1.5, but with multiple enhancements and a lot of specific effort spent on building it up for the PC, as this is the first time Frostbite is used on PC.

DX9/DX10/DX11

Frostbite 1.5 on PC is designed for DirectX 10 as a base; this enables us to easily support all the advanced graphics features that we use on the consoles, and much more! DX10 is a very modern graphics API and gives us a lot of flexibility as developers.

DX10 has one unfortunate drawback though: it is only supported on Windows Vista and Windows 7, not the now 9-year-old Windows XP. And as we have a big PC fan base, where not everyone may have transitioned over to Vista or Windows 7 yet, we have also added a rendering path that uses the old DirectX 9.

The DX9 path is quite efficient but lacks some of the features that we have on DX10 like anti-aliasing and HBAO.


For everyone with the new generation of graphics cards, like the AMD Radeon 5xxx series or the upcoming Nvidia GeForce 4xx, we've also added support for DirectX 11. The primary use of DX11 in Bad Company 2 is to soften all the dynamic shadows, as well as to improve performance in general with a few smaller DX11 optimizations that we are using.

The detection and usage of DX9/DX10/DX11 is done automatically; the game selects the highest version available for your graphics card and OS. If you want to force a given path, you can do that in the Settings.ini file by setting 'DxVersion' to any of these options: 9, 10, 11 or auto (default).

The Settings.ini file can be found in the My Documents\BFBC2 directory.
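For example, forcing the DX9 path just means editing that one key (shown in isolation here; the rest of the file and its exact layout aren't described above, so don't take this as the complete file):

    DxVersion=9

Setting it back to auto, the default, lets the game pick the highest of 9, 10 or 11 that your card and OS support.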


AMD Eyefinity

Bad Company 2 supports AMD's Eyefinity rendering mode, where you can have 3 (or 6!) monitors connected to a single graphics card. The game detects that mode and the very wide aspect ratio, renders using a wide horizontal field of view, and keeps the menus & HUD on the middle display. This can create a very immersive experience if you have the required hardware.

Nvidia 3D Vision

We've been working together with engineers from Nvidia on adding proper support for 3D Vision stereo rendering in Bad Company 2 on PC. This support is not enabled in the beta, but it will be included in a patch shortly after the release of the retail game.

3D Vision works by rendering a picture for each eye every frame, with a slight offset between them. The 3D glasses, together with compatible displays and drivers, then make sure each eye sees the correct frame, creating actual depth perception in the game.
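As a rough mental model (a generic sketch, not DICE's or Nvidia's actual code; the function name and the 6.5 cm eye separation are assumptions), each eye simply gets its own camera position, shifted half the eye separation along the camera's right vector:

    # Conceptual sketch of the per-eye camera offset used for stereo rendering.
    # Generic illustration only; the 6.5 cm separation is an assumed typical value.
    def eye_positions(cam, right, eye_separation=0.065):
        """Return (left_eye, right_eye) positions, each shifted half the
        eye separation along the camera's (unit-length) right vector."""
        h = 0.5 * eye_separation
        left = tuple(c - h * r for c, r in zip(cam, right))
        right_eye = tuple(c + h * r for c, r in zip(cam, right))
        return left, right_eye

    # Camera at eye height looking down -Z, so the right vector is +X:
    print(eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0)))

The glasses and driver then make sure each eye only sees the image rendered from its own viewpoint.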

We are also looking forward to trying out the upcoming multi-monitor stereo rendering: 3D Vision Surround.

Expect to see & hear more about both 3D Vision and 3D Vision Surround in Bad Company 2 after the beta!


HBAO

HBAO stands for Horizon-Based Ambient Occlusion and is a rendering technique originally developed by Nvidia that we have integrated into Frostbite for use on all DX10 and DX11 graphics cards.

It is a technique that creates soft & realistic contact shadows between objects and can really enhance the visuals, though in a quite subtle way. It can be a demanding effect for the graphics card and as such is primarily meant for higher-end cards. It is only a cost on the GPU, not the CPU.

Since the PC Beta we've done a bunch of optimizations on the effect together with both Nvidia and AMD, so if you had performance problems with it before: please give it a shot again in the retail game if you have a modern & fast GPU!

For the full technical details about how HBAO works, see this presentation by Nvidia from SIGGRAPH 2008: Image-Space Horizon-Based Ambient Occlusion.


Field of View

BC2 uses approximately the same vertical field of view on all platforms and in all modes. But as we support arbitrary monitor aspect ratios, you can get a different horizontal FOV on different monitor setups - the wider your monitor, the wider the horizontal FOV, i.e. you actually see more on the sides of the screen.
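To make that concrete: with the usual Hor+ behaviour, the horizontal FOV follows from the fixed vertical FOV and the aspect ratio. A quick sketch (the 55-degree vertical FOV here is an assumed placeholder, not BC2's actual value):

    # Hor+ scaling: fixed vertical FOV, horizontal FOV derived from aspect ratio.
    # The 55-degree vertical FOV is an assumed example value, not BC2's real setting.
    import math

    def horizontal_fov(vertical_fov_deg, width, height):
        v = math.radians(vertical_fov_deg)
        return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

    print(horizontal_fov(55, 1920, 1080))   # single 16:9 monitor   -> ~85.6 degrees
    print(horizontal_fov(55, 5760, 1080))   # 3x1 Eyefinity (48:9)  -> ~140.4 degrees

So a 3x1 Eyefinity group sees a much wider slice of the world to the sides, while the vertical view stays the same.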

As widescreen monitors are nowadays even more common (and actually the new standard), we felt it was very important to support them properly, which we do in BC2 thanks to handling arbitrary aspect ratios. A positive side effect of this is that Eyefinity also works as it should, i.e. just like a single very wide monitor.

We've seen overwhelmingly positive feedback for the out-of-the-box support of Eyefinity, but a few people have voiced concern that the wide FOV when playing with Eyefinity could be considered cheating. Now that people have been playing the Beta, we haven't heard any feedback that Eyefinity has actually been a big practical competitive advantage; it is meant to give you only extra peripheral vision.

The PC as a platform is inherently not a 100% fair playing field, as people have always had varying performance, network connections, hardware setups, input devices and new technology - which is also one of the strengths of the platform and something we, and many PC players, would like to embrace & utilize.

If the community later collectively agrees that Eyefinity is an actual big advantage in multiplayer, we can look into potentially having it as a server option, but we don't think that will be needed.


Another much-requested topic is a customizable FOV. This is not implemented in the PC Beta, but we are adding it for a future retail patch. The feedback from the Beta about servers, performance & stability has been of higher priority and needed to be addressed first.


Misc

The 'High' texture detail level is not included in the PC beta, but is in the retail game, where it increases texture resolution a bit.

Support for multiple GPUs (AMD Crossfire / Nvidia SLI) in the first PC Beta build was only partially working and could cause visual artifacts. This is improved in the latest Beta client, and since that version we have made additional performance scaling improvements that should be a good benefit.

We would like to thank both AMD and Nvidia for their technical support and assistance during the development of Bad Company 2, both with the PC-only advanced features and with helping to make sure the game runs and performs well on a wide range of graphics cards and configurations. Special thanks to Louis Bavoil (Nvidia) and Nicolas Thibieroz (AMD).


Future

If you are interested in more details about DirectX 11, as well as some sneak peeks and technical details about our upcoming Frostbite 2 engine, AMD recently did a big interview with me on their AMD Underground blog: "11 Days of DirectX 11: DICE". The longer full interview is also available as a PDF here.

Keep in mind that the interview is about the future, beyond Bad Company 2.






I'm from Canada and I've looked in many computer stores, etc., and no one even knows about Eyefinity and DisplayPort adaptors!!! (Pretty sad, eh?)

I was just curious if I can get some input back on some of the questions I have.

I've got two Sapphire HD 5850 cards in Crossfire. I'm getting conflicting reports on whether Crossfire is supported with Eyefinity! Is this true? If so, what driver is needed? I heard the 10.1 patch doesn't support Eyefinity.

What should I be looking for when I go out and look for three monitors?

Lastly, DisplayPort adaptors!! What is the best one to get, and can I order it from Canada?

Any help on this great technology is really appreciated.

Thanks Url.......
 
Is it still true that CF is incompatible with Eyefinity? Doesn't that seriously gimp the viability of the feature? What kind of performance do you see in modern games across three 24" monitors on a single videocard?
 
Crossfire with Eyefinity is supported now, but you have to have a game profile for it to be active. If you look at Kyle's reviews of 10.2 and 10.3, you will see AMD has gone one step further in separating the profiles from their drivers. This should allow them to update profiles faster.

As for adapters, this is the one you need.
http://www.amazon.com/Accell-UltraA...1_fkmr0_1?ie=UTF8&qid=1266753295&sr=8-1-fkmr0
Dell also sells the same adapter under their name. Between Amazon and Dell you should be able to order one. Now you only need that adapter if you have monitors that don't have a DP connector.

OK, for monitors - well, there's a very mixed bag of ideas and opinions out there. Pretty much the basics are these:
IPS and VA monitors will give you the ability to run either landscape or portrait mode while still maintaining picture quality.
TN monitors allow for faster response times, but have limited viewing angles and look terrible in portrait mode.
A monitor with a DP connector built in means you don't have to use an adapter like the one listed above. People have been having some flickering, and displays going into sleep mode, with the adapters.

Hope the information helps you Urlacher54
 
Subjective with no hard numbers, but I am running a 5870 Eyefinity setup at 5760 x 1200. I have not yet found a game that supported Eyefinity that was not playable. No stuttering to notice. Is it 80 fps or 70? I don't notice any sluggishness, so I don't really care.
 

Have you tried Battlefield Bad Company 2? With one overclocked HD 5870 and a Core 2 Duo @ 3.6GHz, running the game at 5760x1200 on my triple 27" LCDs at high settings, I see low teens at times and a max of 34 FPS. Not great at all. That was played on my server PC until I get my new i7 quad-core parts, but I think I will buy another HD 5870 to help with the frame rates at that resolution.
 

I used an i7 at 4.2GHz and a 5850 at 900/1100, low settings in the beta, and had 50-60fps at 5760x1080 with Catalyst 10.3 (closer to 60 than 50). The frame rate was good there.
I always use the lowest settings to avoid any clutter since it's an FPS game.

If you want higher settings, Crossfire is, imo, probably needed for good gameplay.
 
Thanks for all the advice and insight into my future Eyefinity setup, guys. Thanks for your time.

cheers
 
I have had my Dell active DVI adapter up and running for about 18 hours now and haven't had any problems yet.
 
This is good to know if I ever get an Eyefinity setup. Probably won't be for a while, if ever.
 
Tried the DP>VGA approach discussed on Widescreen Gaming Forum, but I got crazy flickering and link errors. Ordered the Dell adapter for 75 euros instead.
 
I got the Accell adapter from Amazon and it seems to be working great. Once in a while, if there is a resolution shift on the screen, I have to power-cycle the monitor to get it to come back on, but that seems to be happening less and less. No problems with flickering that I have seen.
 
Just got Battlefield Bad Company 2. Can't wait to get Eyefinity going. I'll more than likely get one of those Accell adapters.
 
Is it getting any better with monitors and the dreaded bezels?

I'd really think about jumping to at least dual monitor if this was the case.
 
Bezels really aren't a problem, especially with the 10.3 bezel management. However, I wouldn't recommend using just 2 monitors, since your focus/crosshair would have a bezel running straight through it.
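For what it's worth, bezel management just tells the driver to hide some pixels "behind" each bezel so things line up across the screens. A back-of-the-envelope sketch (the 120 px per bezel is purely an assumed example; the Catalyst bezel correction wizard works out the real number for your monitors):

    # Rough illustration of a bezel-compensated 3x1 desktop resolution.
    # The 120 px per inner bezel is an assumed example value, not a measured one.
    panel_w, panel_h, panels = 1920, 1080, 3
    bezel_gap_px = 120                      # hidden pixels per inner bezel (assumption)
    total_w = panels * panel_w + (panels - 1) * bezel_gap_px
    print(total_w, "x", panel_h)            # 6000 x 1080 instead of 5760 x 1080

Without compensation the image continues pixel-for-pixel across the panels, so straight lines appear to jump slightly at each bezel.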
 
I stopped noticing the bezels after 1 minute. They're totally a non-issue for me. And I wouldn't want to use Bezel management because that would give bad guys a place to hide.

If you wear glasses it's the same as not noticing your frames any more. If you drive a car it's the same as not noticing your A-pillars any more. And even if you did continue to notice the bezels, it's still better to have 3 screens than 1 screen.
 
This is out of personal interest rather than a plan to try this setup.

Has anyone ever gotten one of those AMD/ATI 5xxx (or 6xxx) cards with HDMI, DVI, and VGA ports to work with 3 monitors? According to AMD/ATI it shouldn't work. But according to http://www.hardocp.com/news/2009/11/30/eyefinity_using_active_adapter/ and some reports, including in this thread, using a passive DisplayPort-to-VGA adapter together with two DVI-D/HDMI outputs works.

From Wikipedia http://en.wikipedia.org/wiki/Evergreen_(GPU_family)#Multi-display_technologies (not properly referenced and I haven't checked the refs, but it sounds like it's probably right) it seems that Evergreen supports up to 6 digital signals. It also has two RAMDACs and 2 clock generators. A dual-link DVI-D or HDMI output will use two of the digital signals (in TMDS) and one of the clocks. So presuming the card doesn't add any extra clock generators, you can get up to two dual-link DVI-D or HDMI outputs. The third output in theory needs to be DisplayPort, since you don't have enough clock signals. But the fact that passive adapters work suggests this isn't really the case. I presume the RAMDACs don't need the clock signal to operate and generate their own clock for output over VGA. Therefore you can use one of the two RAMDACs (along with one of the digital signals) for VGA output, together with the two dual-link DVI-D/HDMI outputs (using the 2 clock signals and 4 digital signals), for 3 monitors. And this, I presume, is what happens in the passive adapter + DVI-D/HDMI + DVI-D/HDMI case.

In theory the same thing should be doable, even if not officially supported, with the cards that have VGA+HDMI+DVI. Whether any actually manage it is another question (they may not be designed/wired with this in mind, or perhaps the BIOS may stop it). I've found several posts of people saying it doesn't work, but this mostly seems to be theory rather than practice. Also, if I understand correctly, the passive adapter scenario can be a bit flaky and you sometimes lose a monitor on restart or driver reinstall, so I guess the same thing may apply to the VGA port scenario, meaning people may assume it doesn't work without actually trying it properly. (I also read something suggesting it didn't work for someone with a 5470, but it's difficult to know how hard they tried.) As I said, this isn't something I'm planning to do, just personal interest.
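Just to spell the accounting out, here's the constraint check implied by that reasoning, written as a small sketch (the rules come from the post above and the Wikipedia description, not from any official AMD spec, so treat it as a guess):

    # Sketch of the output-resource accounting described above for Evergreen cards.
    # Assumed rules (from this post, not an official AMD spec):
    #   - 2 TMDS clock generators; each DVI-D/HDMI output needs one
    #   - 2 RAMDACs for analog (VGA) outputs, which generate their own pixel clock
    #   - 6 digital signal paths; dual-link DVI-D/HDMI uses 2, VGA and DP use 1 here
    def fits(outputs):
        tmds = sum(1 for o in outputs if o in ("DVI-D", "HDMI"))
        vga = sum(1 for o in outputs if o == "VGA")
        dp = sum(1 for o in outputs if o == "DP")
        digital_paths = 2 * tmds + vga + dp
        return tmds <= 2 and vga <= 2 and digital_paths <= 6

    print(fits(["DVI-D", "HDMI", "DP"]))     # True  - the officially supported combo
    print(fits(["DVI-D", "HDMI", "VGA"]))    # True  - the passive adapter / VGA case argued above
    print(fits(["DVI-D", "DVI-D", "HDMI"]))  # False - only two TMDS clocks available

Whether the VGA+HDMI+DVI boards are actually wired to allow that second case is exactly the open question.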
 