Is it time to drop the DVI ports on video cards?

If you care about 3D monitors (which you should), you currently need DL-DVI to get the full 1080P in 3D at 60Hz. With the widely used HDMI chipsets you can only get 720P @ 60Hz in 3D. DisplayPort can handle the bandwidth, but it's not really standard at this point. Technically neither is 3D over DVI (it's Nvidia proprietary) but a lot of monitors support it. Ideally I would want HDMI components to upgrade their chipset so we could get 1080P 3D, but it doesn't seem like this is a high priority (though with 4K gaining traction, maybe it will happen soon).
 
If you care about 3D monitors (which you should), you currently need DL-DVI to get the full 1080P in 3D at 60Hz. With the widely used HDMI chipsets you can only get 720P @ 60Hz in 3D. DisplayPort can handle the bandwidth, but it's not really standard at this point. Technically neither is 3D over DVI (it's Nvidia proprietary) but a lot of monitors support it. Ideally I would want HDMI components to upgrade their chipset so we could get 1080P 3D, but it doesn't seem like this is a high priority (though with 4K gaining traction, maybe it will happen soon).
 
I have no problem with getting rid of DVI, but what about us CRT enthusiasts? Are active DisplayPort-to-VGA adapters good enough for an FW900?
 
I'm a user of high-end hardware and would be exceptionally pissed if DVI went by the wayside. At one time I used 5 x Dell 3007WFP-HCs with 5 x miniDP to DL-DVI adapters, and it was a headache just to get things working. Those adapters break their connection easily. I have since gone down to using only 3, and they still break simply from changing resolution. People have similar problems on 2560xXXXX displays with built-in DisplayPort. As cool as DP is, it's not at all perfect. It's not at all standardized (even if VESA and Apple want you to believe it is). DVI is solid; it's just a shame that it hasn't been officially updated, yet it's still being used in ways that aren't official.

DVI is only limited by the copper used in its cable and whatever limit is on the TMDS clocks. Hence why you can overclock those 1440p monitors to 120 Hz when the standard doesn't allow it.

It's also why you can run some newer 1440p / 1600p monitors over single-link DVI.
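For anyone curious about the actual numbers, here's a rough Python sketch of the pixel-clock math; the blanking figures are CVT reduced-blanking style guesses (not any particular monitor's EDID), so treat the results as ballpark only:

```python
# Rough pixel-clock math behind dual-link DVI "overclocking".
# Blanking values are CVT reduced-blanking style estimates (an assumption,
# not any specific monitor's EDID timings).

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=60):
    """Pixel clock in MHz for a mode with the assumed blanking intervals."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

DVI_SL_LIMIT_MHZ = 165       # single-link TMDS clock ceiling per the DVI spec
DVI_DL_LIMIT_MHZ = 2 * 165   # dual link doubles the usable pixel clock

for hz in (60, 96, 120):
    clk = pixel_clock_mhz(2560, 1440, hz)
    print(f"2560x1440 @ {hz} Hz needs ~{clk:.0f} MHz "
          f"(dual-link spec ceiling {DVI_DL_LIMIT_MHZ} MHz)")

# 60 Hz lands around 245 MHz -- comfortably inside dual link.
# 96-120 Hz climbs to roughly 390-490 MHz, past the 330 MHz ceiling,
# which is why those panels are technically running the TMDS clocks out of spec.
```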
 
The price you pay for buying cheap hardware.

They may be cheap, but the Korean monitors spank many more expensive "gaming" monitors in picture quality. And how many 120Hz PLS/IPS monitors are out there?
 
Don't you need DVI for 3D gaming? Or can you use DP/HDMI for that? AFAIK, when I was running 3D Vision on an Alienware screen, it required using the dual-link DVI cable.
 
Don't you need DVI for 3D gaming? Or can you use DP/HDMI for that? AFAIK, when I was running 3D Vision on an Alienware screen, it required using the dual-link DVI cable.

The HDMI 1.4 standard can do 1080p/120. I believe DP can as well. That Alienware screen was a gen-1 3D Vision display; none of them support HDMI 1.4.
 
What is it with the HDMI connectors not having a locking option? Crazy ...

Idk, maybe I'm being paranoid. But I remember one time when I was without a DVI cable and had to resort to HDMI, and the monitor would just boot up to a black screen. It took numerous tries and a couple of trips to the BIOS to set and adjust everything correctly.

With HDMI you have to be careful; things tend to go haywire if your monitor has a weird menu interface for setting up your desired input. DVI is basically just plug and play right from the get-go, which is why it's generally valued so much in PC monitor setups.
 
With HDMI you have to be careful; things tend to go haywire if your monitor has a weird menu interface for setting up your desired input. DVI is basically just plug and play right from the get-go, which is why it's generally valued so much in PC monitor setups.

That right there is the truth... especially for the many panels that scale the HDMI input in some odd way, like you see on Samsung TVs and Asus monitors.
I always reach for DVI first because it just works.
 
I own a Dell 2407 and a 2412M, both of which are DVI. (Maybe the 2412 has DP, but I don't know).
Your Dell U2412M supports VGA, DVI-D, and DisplayPort.


Anyway, it isn't a big deal if they switch to HDMI/DP only. Adapters are easy to throw in the box and are cheap, and it seems like every card you buy comes with at least one anyway. I probably have 5-10 sitting around the house somewhere.
The only thing that HDMI can be easily adapted to (with a simple passive adapter) is single-link DVI-D.

Adapting HDMI to VGA is not possible without an active adapter.
Adapting HDMI to dual-link DVI-D is not possible, period.

This makes HDMI unsuitable for driving high resolutions and/or high refresh rates, and less than ideal for driving legacy VGA displays... unless you don't mind dishing out for an expensive active adapter.

DisplayPort gets even more hairy. If the graphics card doesn't maintain an internal RAMDAC for legacy DVI/HDMI output, then no passive adapters will work at all.

Adapting DisplayPort to single-link DVI-D may or may not be possible without an active adapter (depends on the card).
Adapting DisplayPort to HDMI may or may not be possible without an active adapter (depends on the card).
Adapting DisplayPort to dual-link DVI-D is not possible without an active adapter.
Adapting DisplayPort to VGA is not possible without an active adapter.

Once again, very narrow range of applications where super-cheap passive adapters will work.
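To keep it all straight, here's a small Python sketch encoding the adapter rules above as a lookup table; the DisplayPort entries assume the passive cases hinge on the card supporting dual-mode (DP++) output, so it's illustrative rather than gospel:

```python
# Which adapter conversions are typically passive vs. active, per the list
# above. "depends" marks the DisplayPort cases that hinge on the card
# supporting dual-mode (DP++) output. Illustrative only.

ADAPTER_RULES = {
    ("HDMI", "SL-DVI-D"):        "passive",
    ("HDMI", "DL-DVI-D"):        "not possible",
    ("HDMI", "VGA"):             "active",
    ("DisplayPort", "SL-DVI-D"): "depends on card (passive if DP++)",
    ("DisplayPort", "HDMI"):     "depends on card (passive if DP++)",
    ("DisplayPort", "DL-DVI-D"): "active",
    ("DisplayPort", "VGA"):      "active",
}

def adapter_needed(source: str, sink: str) -> str:
    """Look up what kind of adapter (if any) a given conversion needs."""
    return ADAPTER_RULES.get((source, sink), "unknown combination")

if __name__ == "__main__":
    print(adapter_needed("HDMI", "DL-DVI-D"))         # not possible
    print(adapter_needed("DisplayPort", "DL-DVI-D"))  # active
```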
 
Adapting DisplayPort to VGA is not possible without an active adapter.

We use these passive adapters at work; they've been working great so far.
http://www.newegg.com/Product/Product.aspx?Item=N82E16815158150

The video card needs a legacy RAMDAC as you say, but I think that's a common feature right now. Perhaps not in the future, though?
 
Seeing as AMD went back to the DVI/DVI/HDMI/DP combo on their cards, I think the market is not ready to abandon DVI anytime soon.
 
That is not a passive adapter, that is an active adapter. It includes a transcoder and a DAC.

That's why the connector is so big: it includes a circuit board with all the necessary electronics. It runs off the 3.3V power supply pin provided by DisplayPort.

Good to know. I assumed you were referring to adapters requiring external power, like going from DP to DL-DVI.
 
I would say the vast majority of monitors are connected via DVI. If anything, HDMI 1.4 is the crap port that must go. Super low bandwidth.

Hopefully in the future, HDMI 2.0 GPUs and displays will make for a simpler connection standard and DVI can finally be phased out.
 
I would say the vast majority of monitors are connected via DVI. If anything, HDMI 1.4 is the crap port that must go. Super low bandwidth.

Hopefully in the future, HDMI 2.0 GPUs and displays will make for a simpler connection standard and DVI can finally be phased out.

HDMI isn't that great because of the royalties and licensing fees necessary to incorporate the port into your hardware. DisplayPort is better in pretty much every way; we just need to get TVs to start using it so it becomes more popular.
 
The price you pay for buying cheap hardware.

It's not about being cheap, it's to reduce lag. There are models with more input options, but they are avoided by gamers for a reason.

I have no problem with getting rid of DVI, but what about us CRT enthusiasts? Are active display ports with VGA adapters good enough for an FW900?

We are SOL, unfortunately. AMD even dropped DVI analog support, but they always were cheap when it came to ports (no two dual-link DVI ports for the past couple of generations). I hope Nvidia won't follow.
 
Coming from CrossFire 7970s (undoubtedly an already potent setup), I bought an R9 290 mainly for the fact that it has two dual-link DVI ports on each card.

I run two Korean 1440p monitors, and it's hell having to choose between CrossFire and being able to use both monitors at once (funky driver issues for a while).

HDMI is great for the living room, TV/Blu-ray/consoles... in the big-boy world of super high res, it's just too slow and clunky.

And whoever was bashing Korean monitors -- there is no way in hell you could get me to pay 3X per monitor for a 'name brand' 1440p. I love the fact that there is little to no input lag thanks to my displays being DL-DVI only; no scaler onboard means fewer things to go wrong. Clocked mine at 96Hz out of the box and loving them!
 
I'm limited to 60hz @ 1080P using HDMI, but isn't the latest version of HDMI supposed to have more bandwidth? Enough for 120hz @ 1080P at least?
 
I'm limited to 60hz @ 1080P using HDMI, but isn't the latest version of HDMI supposed to have more bandwidth? Enough for 120hz @ 1080P at least?

HDMI 2.0 has something like 18 Gbit/s worth of bandwidth, enough for 4K @ 60Hz.

Since there is licensing involved, I'm sure it won't really hit the market for a good long while. In the high-volume world of TVs/Blu-ray, anything more than 1080p @ 30Hz and people think it's the best it's ever going to get.
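As a rough sanity check on those figures, here's a quick Python sketch of the bandwidth math; the blanking overhead is a CVT reduced-blanking style estimate and the link budgets are the commonly quoted post-8b/10b numbers, so take the margins as ballpark only:

```python
# Back-of-the-envelope HDMI bandwidth check. Blanking uses rough CVT
# reduced-blanking estimates; link budgets are the commonly quoted
# video-data figures after 8b/10b coding. Ballpark only.

HDMI_1_4_GBPS = 8.16   # ~340 MHz TMDS clock, 3 lanes, after 8b/10b
HDMI_2_0_GBPS = 14.4   # 18 Gbit/s raw, after 8b/10b

def data_rate_gbps(h, v, hz, bpp=24, h_blank=160, v_blank=62):
    """Approximate video data rate in Gbit/s for a given mode."""
    pixel_clock = (h + h_blank) * (v + v_blank) * hz
    return pixel_clock * bpp / 1e9

modes = {"1080p @ 120 Hz": (1920, 1080, 120),
         "4K @ 60 Hz":     (3840, 2160, 60)}

for name, (h, v, hz) in modes.items():
    rate = data_rate_gbps(h, v, hz)
    print(f"{name}: ~{rate:.1f} Gbit/s of video data "
          f"(HDMI 1.4 budget ~{HDMI_1_4_GBPS}, HDMI 2.0 budget ~{HDMI_2_0_GBPS})")

# 1080p/120 comes out around 6.8 Gbit/s -- inside the HDMI 1.4 budget.
# 4K60 comes out around 12.8 Gbit/s -- beyond HDMI 1.4, inside HDMI 2.0.
```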
 
HDMI isn't that great because of the royalties and licensing fees necessary to incorporate the port into your hardware. DisplayPort is better in pretty much every way; we just need to get TVs to start using it so it becomes more popular.

The reason for all those royalties is HDCP (High-bandwidth Digital Content Protection) - the requirement for it goes back to the age of DVD. (It's why HDMI is a common input on any flat-panel display of any sort, be it TVs or PCs; it is also why HDMI is a common output - not just for PC GPUs, as most STBs, regardless of source, also use HDMI.)

Another feature that HDMI offers is a single plug/cord - remember, HDMI also supports audio, either by itself or along with video. The bigger bugbear of HDMI is on the PC side - specifically GPUs with non-full-sized HDMI out (such as mini-HDMI or HDMI-B) where the adapter dongle is either pilfered or just plain not included. My refurbished GTX550Ti is one such animal - while it supports HDMI, the port is of the mini sort, not full-size, and the adapter dongle was not included. (I get why - mini-HDMI dongles are not cheap; the alternative is DVI-to-HDMI dongles, which are far cheaper by comparison.) I will NOT knowingly purchase a GPU without full-sized HDMI again.
 
DVI will be around for a while considering the sheer number of existing monitors with DVI ports--yes, you can buy adapters, but people really would prefer not to buy something additional on top of what they already have. Look at how long VGA has been around, and it still continues to be manufactured...
 
DVI will be around for a while considering the sheer number of existing monitors with DVI ports--yes, you can buy adapters, but people really would prefer not to buy something additional on top of what they already have. Look at how long VGA has been around, and it still continues to be manufactured...

That is because businesses still use projectors with VGA.

DVI's and VGA's days are numbered. LG, Samsung, Dell, Intel, and AMD announced in 2010 that they were dumping VGA for royalty-free DisplayPort and DVI for HDMI. It is happening. Their target date for deprecation is 2013-2015.

http://newsroom.intel.com/community...digital-display-technology-phasing-out-analog
 
I'm gonna go out on a limb here and say that 80% of monitors that people buy are DVI-based. HDMI has many shortcomings in comparison. DisplayPort isn't common at all either.
 
I'm gonna go out on a limb here and say that 80% of monitors that people buy are DVI-based. HDMI has many shortcomings in comparison. DisplayPort isn't common at all either.

And have you seen DVI's shortcomings? Chief among them: no one has maintained or upgraded the standard in almost a decade, nor has anyone shown any desire to. There's also the 1600p cap, as well as the lack of audio.

HDMI sure as hell ain't perfect and I've ranted about it on occasion, but I'd take DP any chance I get over DVI at this point. In fact, that's what my sig rig is plugged in with.
 
And have you seen DVI's shortcomings? Chief among them: no one has maintained or upgraded the standard in almost a decade, nor has anyone shown any desire to. There's also the 1600p cap, as well as the lack of audio.

HDMI sure as hell ain't perfect and I've ranted about it on occasion, but I'd take DP any chance I get over DVI at this point. In fact, that's what my sig rig is plugged in with.

I'm quite happy with DVI. I have no intention of giving up my DVI-only monitor for a long time.
 
I'm quite happy with DVI. I have no intention of giving up my DVI-only monitor for a long time.

I'll try to remember to remind you "I told you so" in a year or two when you're buying an IPS 4K panel. ;)
 
And have you seen DVI's shortcomings? Chief among them: no one has maintained or upgraded the standard in almost a decade, nor has anyone shown any desire to. There's also the 1600p cap, as well as the lack of audio.

The fact that 1600p is so far beyond the displays most people buy doesn't exactly help the transition ;).
 
I'm already feeling the squeeze; it looks like none of the R9 290s have DVI-I, which means no way to directly connect to my CRT!
 
Bit of a necro going on here, but no. Absolutely not. In an ideal scenario we'd be rid of the nuisance oversized connectors that you're always forgetting to tighten properly, or over-tightening such that the nuts come loose when you take them off, etc., but in reality DVI (perhaps excluding dual-link) is a real 'plug and play' sort of affair, assuming you get the type right (DVI-I vs. DVI-D etc.).
With the new interfaces, as simple as they're supposed to be, there are always some irritations that make me long for the old days, from HDMI's overscan and audio quality issues to DisplayPort's sync dropout and signal wake-up issues. If there were some hack way of running 3840x2160 at 60Hz over a pair of DVI-D cables, and you could do it with standard equipment, I'd be all over it.
I had displayport on my 3008WFP, but never ended up using it, there was simply no reason to do so over DVI-I.
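For what it's worth, the numbers for that two-cable idea aren't crazy. Here's a rough Python sketch assuming the panel were split into two hypothetical 1920x2160 tiles, one per cable (similar in spirit to the tiled approach some early 4K monitors used over DisplayPort); the blanking figure is an estimate:

```python
# Hypothetical: drive a 3840x2160 @ 60 Hz panel as two 1920x2160 tiles,
# one tile per dual-link DVI cable. Blanking is a rough CVT-RB estimate.

DVI_DL_LIMIT_MHZ = 330  # nominal dual-link TMDS pixel-clock ceiling

def tile_clock_mhz(h, v, hz, h_blank=80, v_blank=62):
    """Pixel clock in MHz for one tile with the assumed blanking."""
    return (h + h_blank) * (v + v_blank) * hz / 1e6

clk = tile_clock_mhz(1920, 2160, 60)
print(f"Per-tile pixel clock: ~{clk:.0f} MHz "
      f"(dual-link DVI ceiling {DVI_DL_LIMIT_MHZ} MHz)")

# ~267 MHz per tile, so each half would fit inside dual-link DVI's budget --
# the catch is that nothing actually splits the image that way over DVI.
```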
 
My main screen has no analog inputs, just HDMI, DVI, and DisplayPort.
It was one of the cheaper screens with DisplayPort; I think I paid $160 for it 2-3 years ago from HP.

The HP 2310e ultra-thin WLED backlit monitor has the following features:
BrightView Technology enhances clarity and provides brilliant colors
1920 x 1080 factory-set resolution
Supports HDMI, DisplayPort and DVI-D video input connectors
Supports High-bandwidth Digital Content Protection (HDCP) to prevent transmission of non-encrypted high definition content
Plug and Play capability, if supported by your computer system
On-screen display (OSD) adjustments for ease of setup and screen optimization
 