Will a non-DP Eyefinity card come out?

Breath_of_the_Dying

[H]ard|Gawd
Joined
Jan 8, 2007
Messages
1,454
I was wondering whether the requirement for DP is a technical necessity or a design decision. I know that currently something about the clocking of the DVI/HDMI outputs restricts using all 3 monitors without DisplayPort, but can that change?

What's to keep ATI or any of the card makers from offering triple HDMI out? Is it wishful thinking that ATI's GPU refresh will bring a triple-HDMI-out card?

I want Eyefinity for my 1920x1200 monitors, but that extra $100 for an active adapter is really holding me back. I'm even considering the 5970 for more power, but then I'd need a second adapter (mini-DP -> DP -> DVI).
 
I doubt we will see one. It would require reworking the hardware, especially the I/O chip. The only thing that might come out of this is cheaper active DVI-DP adapters, since AMD is "reportedly" working closely with manufacturers to make that happen. Also, AMD was one of the earlier supporters of DP, so they'll do what they can to make it an industry standard.
Here's a link to the Wikipedia DP page, which details some of the benefits over DVI/HDMI.
http://en.wikipedia.org/wiki/Display_port
 
Maybe the board partners will add that ability later.
Kyle's posted AMD is trying to get a working cheaper adapter out.
 
The Eyefinity technology requires DisplayPort when you're trying to drive more than two panels; the first two can both be DVI.

DVI requires an individual clock for each port, whereas DP needs only one clock source for multiple outputs.

So it's more of a hardware limitation when you're using DVI displays. This is why you can't use 3x DVI panels: the third panel must go through a DP adapter that performs TRUE DP emulation. If the CCC panel does not report the display as DP, it will not work in Eyefinity alongside two other DVI panels.
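The per-port clocking point above can be put in rough numbers. This is a back-of-the-envelope sketch: the timing figures are approximate CVT reduced-blanking values for 1920x1200@60, assumed for illustration, not taken from this thread.

```python
# Rough check of why each DVI/TMDS output needs its own pixel clock.
# CVT-RB totals for 1920x1200@60 are assumed: 2080 total columns,
# 1235 total lines (approximate figures, for illustration only).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2080, 1235, 60)
print(f"1920x1200@60 (CVT-RB) needs ~{clk:.0f} MHz per DVI link")

SINGLE_LINK_TMDS_MAX = 165.0  # MHz, DVI single-link pixel clock limit
print(f"Fits single-link DVI: {clk <= SINGLE_LINK_TMDS_MAX}")

# Each active DVI/HDMI head needs its own independent ~154 MHz clock
# generator, so three DVI panels means three generators on the card.
# DisplayPort is packet-based and its outputs can share one reference
# clock, which is why the third Eyefinity head must be DP.
```

So the constraint isn't bandwidth per link; it's that the 5x00 boards only ship with two of these independent clock generators.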
 
I'm interested in this as well... DisplayPort was just a bad idea IMHO... it just makes things more difficult. Now if I want Eyefinity I have to upgrade my LCDs. Bad choice!
 
There are plenty of folks here with EF working on 3 DVI monitors; just check that thread to see what they are doing. We verified it working last November ourselves.

And no, I would not expect to see an all-DVI card.
 
It's possible that ATi could add another clock generator in a refresh, or that a partner could produce a card with an additional chip. It would certainly be wishful thinking to hope for one.
 
With DisplayPort being an open standard with support from all the major hardware vendors, it's a good bet that it will replace DVI, and maybe even HDMI, in the next couple of years. Releasing a card with only DVI/HDMI doesn't make sense for them, as it'll only delay the inevitable.
 
I doubt DP will ever be an industry standard, when HDMI is basically dominating everything.
 
With DisplayPort being an open standard with support from all the major hardware vendors, it's a good bet that it will replace DVI, and maybe even HDMI, in the next couple of years. Releasing a card with only DVI/HDMI doesn't make sense for them, as it'll only delay the inevitable.


Agreed.. I think we will see a much bigger change in monitors shipping with DP support once the six-DP-port cards start hitting the market.. or at least we can all hope so..
 
I doubt DP will ever be an industry standard, when HDMI is basically dominating everything.

I have to disagree with this. DP reduces production cost in terms of I/O for monitors and video devices, since it can share clock signals, allowing cheaper components, and it's royalty-free, unlike HDMI. It could go the way of Betamax vs. VHS, but I think it has a good chance since quite a few companies are backing it.
 
Incorrect - one monitor has to use DisplayPort, because the card only has enough timing hardware for two dual-link DVI/HDMI outputs.

This was apparently a calculated decision made at a time when Eyefinity's reception was not known. The DisplayPort signal (good for up to 7680x3200, at least) comes efficiently straight off the GPU's pins. The HDMI/DVI timing chip that converts it is expensive, and they judged that the number of customers who would buy the card because of a third timing chip was considerably smaller than the number who would refuse to buy it because of the slightly higher price.

It is not a limitation in the technology. It is a limitation in the reference HD5xx0 board designs that have been used so far.

This may change at some point in the future. Eyefinity turned out much, much more popular than the beancounters expected. Either AMD may add this to the next enthusiast cards released after the Eyefinity6, or other board manufacturers may release three-timing-chip cards once they're done designing their own non-trivial PCBs (rather than just stickers), which will be soon.

The upcoming DisplayPort 1.2 will have enough bandwidth, features, and cost advantages over HDMI that it should become, at the very least, a monitor standard.

If we ever see better-than-1080p televisions at the consumer level... 1.2 will have the bandwidth to carry a full 4K Digital Cinema signal.
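To put that bandwidth claim in numbers, here is a rough check using the published DP 1.2 link figures. The video timing is simplified (blanking overhead ignored), so real margins are somewhat tighter than shown.

```python
# Back-of-the-envelope check that DisplayPort 1.2 can carry a 4K
# Digital Cinema signal. Link figures are the published DP 1.2 specs;
# the video numbers ignore blanking, so they are slightly optimistic.

LANES = 4
HBR2_GBPS_PER_LANE = 5.4            # raw line rate per lane
RAW_GBPS = LANES * HBR2_GBPS_PER_LANE
EFFECTIVE_GBPS = RAW_GBPS * 8 / 10  # 8b/10b coding overhead

def video_gbps(width, height, hz, bits_per_pixel):
    """Active-pixel payload rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

dci_4k = video_gbps(4096, 2160, 24, 36)  # 12 bpc RGB at 24 fps
print(f"DP 1.2 effective: {EFFECTIVE_GBPS:.2f} Gbit/s")
print(f"4K DCI 24p/36bpp: {dci_4k:.2f} Gbit/s -> fits: {dci_4k < EFFECTIVE_GBPS}")
```

A 4K cinema stream at 24 fps needs well under half of the 17.28 Gbit/s effective link, so the claim holds with room to spare.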
 
I doubt DP will ever be an industry standard, when HDMI is basically dominating everything.

I think you are very wrong. HDMI carries a royalty, and quite frankly it offers nothing over current DVI-D except sound pass-through, and that is not a big deal in the computer realm. DP is technically superior to HDMI and costs less. Once DP monitors catch up with the video card technology, it would be possible to run a single DP cable from your video card to up to six monitors IF you could daisy-chain the monitors. Not sure if this is doable in 3D, but it certainly can be done for the desktop.
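A rough budget for that single-cable daisy-chain idea, using the DP 1.2 effective rate and ignoring blanking and packet overhead (assumed figures, not from this thread, so treat it as optimistic):

```python
# Rough MST daisy-chain budget over one DP 1.2 cable.
# Ignores blanking and transport overhead, so this is a best case.

EFFECTIVE_GBPS = 17.28  # DP 1.2: 4 lanes x 5.4 Gbit/s after 8b/10b

def stream_gbps(width, height, hz, bpp=24):
    """Active-pixel payload per monitor in Gbit/s (8 bpc RGB)."""
    return width * height * hz * bpp / 1e9

per_monitor = stream_gbps(1920, 1200, 60)
print(f"1920x1200@60: {per_monitor:.2f} Gbit/s per monitor")
print(f"Monitors per cable (payload only): {int(EFFECTIVE_GBPS // per_monitor)}")
```

At 60 Hz and 8 bpc, roughly five 1920x1200 streams fit on one cable; at a lower refresh rate like 50 Hz (about 2.76 Gbit/s per stream) six such monitors would fit, so the six-monitor figure depends on the mode.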
 
Interesting too: I "hear" that the new Intel processor with the GPU on the CPU package will support 2560x1600, but only over DP; DVI-D is limited to 1920. So instead of putting dual-link DVI onto the package, Intel saw fit to put DP clocks on it. I think that is telling....
 
I agree with you here.

Keep in mind it's PS3's job to try to make AMD/ATI look bad.


More like make everyone and everything look bad..


But that is quite interesting, Kyle, that Intel did that.. now if only they would start pumping out more monitors with DP support..
 
More like make everyone and everything look bad..


But that is quite interesting, Kyle, that Intel did that.. now if only they would start pumping out more monitors with DP support..

Or just make cheaper adaptors for those of us who don't want to buy another monitor!
 
True that, Jeremy. Been thinking about getting two other Dynex 32" 1360x768-resolution monitors identical to mine for maybe $200 a piece, for a little Eyefinity action. Been a big skeptic of Eyefinity thus far, but I think it will work great for racing games. Then for shooters I'll just stick to one monitor.
 
Or just make cheaper adaptors for those of us who don't want to buy another monitor!


True, but I want an excuse to get rid of this Samsung 245BW monitor.. paid $500 ($425 after MIR, which I never sent in) for it brand new almost 3 years ago, and yet the prices still haven't dropped all that much since then..


True that, Jeremy. Been thinking about getting two other Dynex 32" 1360x768-resolution monitors identical to mine for maybe $200 a piece, for a little Eyefinity action. Been a big skeptic of Eyefinity thus far, but I think it will work great for racing games. Then for shooters I'll just stick to one monitor.


Complete waste of money, but since you don't listen to anyone on this forum anyways, go right ahead and waste it..
 
Still stuck with the 2 DVI + DisplayPort, I believe, when it comes to Eyefinity.. but I'm not sure whether the second card is usable if you're just running standard multi-display settings..
 
Then for shooters I'll just stick to one monitor.

What I want is for all shooters to make it so that when going eyefinity the middle screen is exactly the same as if you were playing with one screen, and the other screens just increase your field of vision. If it worked like that for all shooters I'd go out and buy two more monitors in a heartbeat.
 
What I want is for all shooters to make it so that when going eyefinity the middle screen is exactly the same as if you were playing with one screen, and the other screens just increase your field of vision. If it worked like that for all shooters I'd go out and buy two more monitors in a heartbeat.
It does work like that for the overwhelming majority of shooters. :p
 
I've been trying to figure this out, and this thread seems like a good place. If you run CrossFireX with two 5870s, can you use 3 DVI ports for Eyefinity and the 4th for a non-Eyefinity monitor? Or are you still stuck using 2 DVI + adapter for Eyefinity and a cheapo card for a spare monitor?

When you're doing 3D CrossFireX you can only use the connectors on the primary card. You will need to use an active DP-to-DVI adapter.

It does work like that for the overwhelming majority of shooters. :p

I find it works best if you have 3 panels in portrait mode, as this will be closest to the 16:10 ratio, so minimal stretching is done. Not all games will give you more FOV, some will simply stretch at the sides.
 
What I want is for all shooters to make it so that when going eyefinity the middle screen is exactly the same as if you were playing with one screen, and the other screens just increase your field of vision. If it worked like that for all shooters I'd go out and buy two more monitors in a heartbeat.

That's exactly what Eyefinity does. There's very little point to Eyefinity (at least for gaming) if it didn't do this. Of the shooters I've played (L4D2, Bioshock, Modern Warfare 2, Half Life 2, Crysis & Crysis Warhead), they do exactly what you want Eyefinity to do - the middle screen shows what you'd see on a single screen setup, with the sides expanding the gaming world, not just stretching it.

This is why Eyefinity is getting so much attention. You actually see a ton more game world (my resolution is 6144x1152) than on a single monitor setup. I've gone back and forth between my three 23" Samsungs and a large 40" 1080p panel. Hands down the three 23" give me a better gaming experience.
 