Does anyone actually use DisplayPort?

Zarathustra[H]

I'm not trying to be silly here, I am genuinely curious..

With DVI and HDMI both being digital standards, I have never found a use for this new port standard.

Do you use it? If so, why?
 
1.) Is there something special about DisplayPort that allows for EyeFinity, or is it just that all the video cards that support EyeFinity have a bunch of displayports instead of DVI?

2.) Do you actually have DisplayPort supporting monitors, or do you rely on some sort of DVI-DisplayPort adapter?
 
Zarathustra[H] said:
1.) Is there something special about DisplayPort that allows for EyeFinity, or is it just that all the video cards that support EyeFinity have a bunch of displayports instead of DVI?

2.) Do you actually have DisplayPort supporting monitors, or do you rely on some sort of DVI-DisplayPort adapter?

To hook up three monitors to an Eyefinity-capable card, you must use the DisplayPort output.

I'm using a passive DP-to-VGA adapter, along with the two DVI ports.
 
My Dell P2210s have native DisplayPort connections, so yes, I'm using the actual port.

It's required because the HDMI port and one of the DVI ports share the same clock source (or whatever the proper term is, I can't remember it). Only one of the two will work at a time.

I guess there might be a fix for it with the new cards coming out.
 
Yes. I have four monitors with DP connectors. I use it on one of them for Eyefinity. I would use them more if more video cards supported them. DP has the potential to be pretty great since you can daisy-chain monitors through one connection, but I haven't seen a video card solution that supports that yet.
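Rough back-of-the-envelope numbers on the daisy-chaining headroom (my assumptions: DP 1.2 gives roughly 17.28 Gbit/s of usable video bandwidth after 8b/10b coding, and blanking adds around 20% overhead; this is a sketch, not spec-exact math):

# Approximate how many daisy-chained monitors a single DP 1.2 link could feed.
# All figures are rough assumptions, not taken from the DisplayPort spec tables.

DP12_PAYLOAD_GBPS = 17.28      # 4 lanes x 5.4 Gbit/s, minus 8b/10b coding overhead
BLANKING_OVERHEAD = 1.20       # assume ~20% extra pixels for blanking intervals

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate bandwidth one display stream needs, in Gbit/s."""
    pixels_per_second = width * height * refresh_hz * BLANKING_OVERHEAD
    return pixels_per_second * bits_per_pixel / 1e9

per_monitor = stream_gbps(1920, 1200, 60)
print("Per 1920x1200@60 stream: %.2f Gbit/s" % per_monitor)
print("Monitors per DP 1.2 link: %d" % int(DP12_PAYLOAD_GBPS // per_monitor))

By that estimate a single DP 1.2 connection has room for about four 1920x1200 displays, which is why the feature looks so promising once cards actually support it.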
 
Zarathustra[H] said:
1.) Is there something special about DisplayPort that allows for EyeFinity, or is it just that all the video cards that support EyeFinity have a bunch of displayports instead of DVI?

Sort of. Multi-monitor support is cheaper to do on the graphics card with DP because each VGA/DVI/HDMI connection needs its own clock generator on the GPU die, while DP outputs can share a single clock generator. ATI's 5xxx cards have two legacy clock generators and one for DP, so any 3+ monitor configuration has to use a DP output. Adding a third legacy clock wasn't an option when the GPU was being designed because they were trying very hard to keep the die size down; Eyefinity itself was almost cut for this reason. At the time, DP support in monitors was also expected to grow faster than it actually has, so this wasn't thought to be much of a problem.

Rumor is that ATI will do something to simplify Eyefinity setups for the 6xxx series, probably adding a third legacy clock generator, although it's possible they'll just include a single-link active adapter in the box. Those are available for under $30 retail, which probably means wholesale prices are in the $10-15 range. That's affordable for higher-end cards, especially since a very large ATI order would almost certainly boost economies of scale and drive the price down further.
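As a toy illustration of that clock-generator constraint (the two-legacy-plus-shared-DP split is as described above; the check itself is just a made-up model, not ATI's actual driver logic):

# Toy model of the 5xxx output limitation: each legacy output (VGA/DVI/HDMI)
# needs its own clock generator, while all DP outputs can share one.

LEGACY_CLOCK_GENS = 2   # assumed number of legacy clock generators on the die

def config_possible(outputs):
    """outputs: list of 'vga', 'dvi', 'hdmi', or 'dp' connections in use."""
    legacy_needed = sum(1 for o in outputs if o in ("vga", "dvi", "hdmi"))
    return legacy_needed <= LEGACY_CLOCK_GENS

print(config_possible(["dvi", "dvi", "dp"]))     # True: third head rides the shared DP clock
print(config_possible(["dvi", "dvi", "hdmi"]))   # False: would need a third legacy clock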
 
I use it just because the cable is smaller and easier to remove and attach.
 
Yes, I use it. My U2311H doesn't have HDMI. I use DisplayPort for my 5970 and DVI (with an HDMI adapter) for my PS3. Word of warning: avoid DP cables from Monoprice; there are compatibility issues with this setup. I'm currently using a mini-DP-to-regular-DP cable from circuitassembly.
 
I have a DP monitor and a DP cable, I just don't have the DP video card yet. I think I'll get one of the new ATI 6000 series when they're out. It's like a dream I have: DP or not DP. DP forever! DVI is old.
 
Yes, I use it.
Eyefinity user.
My ZR24W's have native DisplayPort.
My S2409W's use an active DP-to-DVI adapter.
 
Zarathustra[H] said:
I'm not trying to be silly here, I am genuinely curious..

With DVI and HDMI both being digital standards, I have never found a use for this new port standard.

Do you use it? If so, why?

Nope, never had to use it when I had my 5870s, and my GTX 480 has a mini-HDMI.
 
Here's a fairly good article that covers the advantages of DisplayPort over HDMI / DVI.

http://www.brighthub.com/computing/hardware/articles/24782.aspx
I'm not overly impressed by the level of research in that article.

"Other features are also based around pretty radical differences with HDMI. DisplayPort sends data in packets, and could therefore be used to transfer signals other than audio and video." DVI and base level HDMI are allow packet based and allow sending audio signals over the wire. HDMI 1.4's spec also allows for 100MB ethernet and 2way audio to be sent. Displayport 1.2 has a 720MB/sec auxiliary channel (IIRC HDMI's is just big enough for 100MB ethernet) so it could carry both USB2 and 100MB ethernet signals, but AFAIK carrying them hasn't actually been implemented by any mainstream vendors.
 
Hence, fairly. ;)

It was one of the first articles to show up in the search. I didn't feel it necessary to provide any other results since people have the ability to do the same search themselves.

 
I *will* buy one :)

Only because I will need an external monitor for my M11x in a few months.
 
I'm using DisplayPort. My video card has one mini-DP and two DVI ports.

Mini-DP to DP (the Dell U2410 has a DP port)
DVI to Dell G2410
DVI to Panasonic TC-L32S1
 
AFAIK, part of the difference is that there is a licensing body overseeing the HDMI standard that charges manufacturers to license it. DP does not have a fee associated with it, so it's cheaper for OEMs to include it. This is why we are seeing it gain popularity.

One thing that's interesting to note is the physical constraints of PCIe.

Many low end cards have a single-slot backplane.
Most of the midrange cards have a dual slot backplane with ports on both slots, and a dual slot cooler.
However, many high-end cards go back to a single-slot backplane with a dual slot cooler because they expect people to fit them with single-slot watercooling blocks and use them as single-slot cards for triple or quadruple SLI/CF.

Generally, the standard for those mid-high cards is to have two DVIs and a mini-HDMI, as a regular HDMI won't fit.
However, lots of dual-slot midrange cards are showing up with two DVIs, a regular HDMI, and a DP. That's certainly the widest range of outputs, but the second DVI sits on the second slot's backplane, which prevents the card from being converted into a single-slot watercooled card.
 
However, many high-end cards go back to a single-slot backplane with a dual slot cooler because they expect people to fit them with single-slot watercooling blocks and use them as single-slot cards for triple or quadruple SLI/CF.

What high-end cards have a single-slot backplane? I don't think I've ever seen any...
 
I don't use it right now because my monitor doesn't support it, but I can give you the reasons why it exists:

1) It is higher bandwidth than the other standards. HDMI was designed with HDTV in mind and just doesn't support the really high-bandwidth communication we might want on PCs. As of yet there isn't a lot of use for it, but that's changing: higher pixel density, higher refresh rates, 30-bit panels, etc. all need more bandwidth (see the rough numbers at the end of this post).

2) Cheaper to implement. DP doesn't have royalties like HDMI does, and doesn't require a TMDS transmitter. You can put multiple DP ports on a device for much less money than DVI/HDMI.

3) Works directly with LCD panels. DVI was never suitable for direct card-to-panel communication like in a laptop; DP is. So you can have them talk directly, and then use the same standard to communicate from computer to monitor in a desktop. Most laptops are DP internally these days.

That's the purpose. It isn't likely to ever displace HDMI since HDMI is entrenched in the home theater market, but it already has uses in the computer market, and those are likely to grow. In the future you may see cheap LCDs with only DP input since it costs the least to implement.
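To put rough numbers on point 1 (link figures are the approximate usable payload rates after 8b/10b coding as I remember them: ~7.92 Gbit/s for dual-link DVI, ~8.16 Gbit/s for HDMI 1.3/1.4, ~8.64 Gbit/s for DP 1.1, ~17.28 Gbit/s for DP 1.2; blanking ballparked at 20%):

# Rough comparison of what a few display modes need vs. what each link carries.
# Payload and overhead figures are approximations, not exact spec values.

LINKS = [
    ("dual-link DVI",   7.92),
    ("HDMI 1.3/1.4",    8.16),
    ("DisplayPort 1.1", 8.64),
    ("DisplayPort 1.2", 17.28),
]

def needed_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * 1.20 * bits_per_pixel / 1e9  # ~20% blanking

MODES = [
    ("2560x1600 @ 60 Hz, 24-bit",  2560, 1600, 60,  24),
    ("2560x1600 @ 60 Hz, 30-bit",  2560, 1600, 60,  30),
    ("2560x1600 @ 120 Hz, 24-bit", 2560, 1600, 120, 24),
]

for name, w, h, r, bpp in MODES:
    need = needed_gbps(w, h, r, bpp)
    fits = [link for link, cap in LINKS if cap >= need]
    print("%s needs ~%.1f Gbit/s -> fits on: %s" % (name, need, ", ".join(fits) or "none"))

Standard 24-bit 60 Hz fits everywhere, but once you go to 30-bit color or higher refresh rates the legacy links run out of headroom and only DP 1.2 has room to spare.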
 
What high-end cards have a single-slot backplane? I don't think I've ever seen any...

Well, generally it's still a two-slot backplane, but there are only ports on the lower part. The top part is just a vent.
http://commons.wikimedia.org/wiki/File:ATI_Radeon_HD_5970_Graphics_Card-oblique_view.jpg

Whereas many midrange cards have ports on both parts.
http://commons.wikimedia.org/wiki/File:ATI_Radeon_HD_5770_Graphics_Card-oblique_view.jpg

I always assumed it was so you could make it a single-slot watercooled card by just cutting off the top part of the backplane.
 
Ah, I see. I thought you meant it actually came with a single-slot backplane. :) Yeah, I think you're right. Either that, or the higher-powered cards just need more cooling, so they need a larger vent in back. Could go either way.
 
DP also runs a 3.3 V interconnect vs. 5 V for HDMI. That is easier to design for because of the wider use of 3.3 V logic.
 