DVI-D (etc) Is this dead tech now?

DWD1961

I am cleaning out my stash of cables and found about 7 DVI and DVI-D cables, one with a DisplayPort converter attached to it - lol. (I also tossed about 3 old HDMI cables, pre-2012.)

I tossed them all in the trash, but I was wondering if this tech is really as dead as it seems to me?

I read that HDMI is the standard for TVs, while computer monitors are going with DisplayPort.

Is that correct?

Thanks.
 
Yes, you can't really get any modern high-end card with DVI anymore. It is pretty well dead.
 
Yes, you can't really get any modern high-end card with DVI anymore. It is pretty well dead.
Thanks. I'll toss them all then. What about HDMI for computers? I only have one I just bought for my old monitor, so I don't have any older ones. Just wondering.
 
Thanks. I'll toss them all then. What about HDMI for computers? I only have one I just bought for my old monitor, so I don't have any older ones. Just wondering.
Just look around at the monitor/GPU market and you'll see what is getting used.
There are use cases for HDMI. Some like to attach their computers to TVs. Also some monitors have 1 or 2 HDMI ports.

The short answer on what is currently used on GPUs: DisplayPort and HDMI.
DisplayPort is up to version 2.0, but very few GPUs or displays currently support it. It's basically the version that will allow 8K, or 4K at high refresh rates with HDR and 10-bit color, etc. Most displays and GPUs are currently on version 1.4.
HDMI is up to version 2.1, but most devices only support 2.0. It's a similar situation to DisplayPort.
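Rough back-of-the-envelope numbers show why those newer versions matter. This is only a sketch: it counts active pixels and ignores blanking, FEC, and other protocol overhead, and the link figures are approximate maximums for each version.

```python
# Rough sketch: uncompressed video data rate vs. usable link bandwidth.
# Active pixels only -- real timings add blanking overhead, so treat these as lower bounds.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Approximate uncompressed data rate in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Usable bandwidth = raw link rate x coding efficiency (approximate).
links = {
    "HDMI 2.0 (18 Gbps TMDS, 8b/10b)":    18.0 * 8 / 10,
    "DP 1.4 HBR3 (32.4 Gbps, 8b/10b)":    32.4 * 8 / 10,
    "HDMI 2.1 FRL (48 Gbps, 16b/18b)":    48.0 * 16 / 18,
    "DP 2.0 UHBR20 (80 Gbps, 128b/132b)": 80.0 * 128 / 132,
}

modes = {
    "4K 144 Hz 10-bit RGB (30 bpp)": data_rate_gbps(3840, 2160, 144, 30),
    "8K 60 Hz 10-bit RGB (30 bpp)":  data_rate_gbps(7680, 4320, 60, 30),
}

for mode, need in modes.items():
    print(f"{mode}: ~{need:.1f} Gbps of pixel data")
    for link, have in links.items():
        verdict = "fits" if have >= need else "needs DSC or chroma subsampling"
        print(f"  {link}: ~{have:.1f} Gbps usable -> {verdict}")
```

Even 4K 144 Hz with 10-bit color already blows past DP 1.4's roughly 26 Gbps of usable bandwidth, which is why DSC, chroma subsampling, or the newer link rates come into play.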

Everything else is more or less legacy. It might be worth keeping the odd VGA cable (or converter to VGA) around for a while still, in case you're in a situation where you need to attach your computer to a random projector. People tend to replace projectors far less often, and there are still projectors floating around that use old connections. I've found this is especially true in education and small-business settings.
 
Technically DVI has been a dead end since 1999 (the spec never got a revision after 1.0), but until DisplayPort there really wasn't another way to achieve high refresh rates on PC monitors.
 
Thanks. I'll toss them all then. What about HDMI for computers? I only have one I just bought for my old monitor, so I don't have any older ones. Just wondering.
All of them?
My life experience tells me that trashing items you don't need anymore results in almost immediately needing them...
At least keep one cable just in case you need to use it :)
 
Technically DVI has been a dead end since 1999 (the spec never got a revision after 1.0), but until DisplayPort there really wasn't another way to achieve high refresh rates on PC monitors.
One thing I find odd is that they are still piping audio through the DVI port. I know back in the day cards would specifically say which DVI port carried audio, like my ATI 3650, which had an orange or yellow DVI port, and that was the one that carried the audio.
When I was working on a friend's 1950X setup with a 1080 Ti FTW3, I was using a DVI-to-HDMI cable to my bench monitor and I was getting audio.
I have another machine on my bench right now with a GTX 550 Ti, and that also does audio through the DVI port.
I just find it odd, seeing as both cards have HDMI (mini on the 550 Ti), that they would bother to wire up the DVI port for audio, which requires a DVI-to-HDMI cable.
 
HDMI is an evolution of DVI and uses the same signaling. It’s cheaper to use the same hardware stack to service the DVI port and HDMI ports than it is to have separate “DVI” only hardware just to block the HDMI enhancements.

The only thing the adapters do is change the physical connection; the devices on each end don’t know the difference
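For the curious, here's a minimal sketch (not any vendor's actual driver code) of how a source typically decides the thing on the other end "is HDMI": it looks for the HDMI Vendor-Specific Data Block (IEEE OUI 00-0C-03) in the display's EDID CEA-861 extension. A passive DVI-to-HDMI cable doesn't touch the EDID, so an HDMI monitor behind a DVI connector still advertises HDMI and gets sent audio.

```python
# Sketch only: check a raw EDID for the HDMI vendor-specific data block (OUI 00-0C-03).
# A pure DVI monitor won't have it, so the source falls back to DVI behavior (no audio).

HDMI_OUI = (0x03, 0x0C, 0x00)  # IEEE OUI 00-0C-03, stored little-endian in the EDID

def sink_is_hdmi(edid: bytes) -> bool:
    """Scan the EDID's extension blocks for an HDMI vendor-specific data block."""
    for start in range(128, len(edid), 128):         # extensions follow the 128-byte base block
        block = edid[start:start + 128]
        if len(block) < 4 or block[0] != 0x02:       # 0x02 = CEA-861 extension tag
            continue
        dtd_offset = block[2]                        # where detailed timing descriptors begin
        i = 4                                        # data block collection starts at byte 4
        while 4 <= i < min(dtd_offset, len(block)):
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 0x03 and length >= 3 and tuple(block[i + 1:i + 4]) == HDMI_OUI:
                return True                          # tag 3 = vendor-specific block, OUI matches HDMI
            i += 1 + length
    return False
```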
 
HDMI is an evolution of DVI and uses the same signaling. It’s cheaper to use the same hardware stack to service the DVI port and HDMI ports than it is to have separate “DVI” only hardware just to block the HDMI enhancements.

Yep, HDMI started out as a repackaging of single-link DVI, plus audio, plus content protection. It remains backward compatible with single-link DVI to this day.

But any high-res/high-refresh DVI required dual-link DVI, which HDMI is not compatible with.
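Roughly speaking, single-link TMDS tops out at a 165 MHz pixel clock, and dual-link pushes two pixels per clock over a second set of pairs, which is where the high-res/high-refresh modes lived. A quick sketch (assuming ~20% blanking overhead; real CVT/CVT-RB timings differ):

```python
# Sketch: which DVI link type a mode needs, assuming ~20% blanking overhead.
SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI / early HDMI TMDS pixel clock limit
DUAL_LINK_MAX_MHZ   = 330.0   # dual-link: two pixels per 165 MHz clock

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.2):
    """Very rough pixel clock estimate: active pixels plus ~20% blanking."""
    return width * height * refresh_hz * blanking / 1e6

for name, (w, h, hz) in {
    "1920x1080 @ 60":  (1920, 1080, 60),
    "1920x1080 @ 120": (1920, 1080, 120),
    "2560x1600 @ 60":  (2560, 1600, 60),
    "3840x2160 @ 60":  (3840, 2160, 60),
}.items():
    clk = approx_pixel_clock_mhz(w, h, hz)
    link = ("single-link" if clk <= SINGLE_LINK_MAX_MHZ
            else "dual-link" if clk <= DUAL_LINK_MAX_MHZ
            else "beyond DVI entirely")
    print(f"{name}: ~{clk:.0f} MHz pixel clock -> {link}")
```

Early HDMI kept the same 165 MHz single-link limit and later versions raised the TMDS clock instead of adding a second link, which is why a passive adapter can never give you dual-link DVI.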
 
HDMI is an evolution of DVI and uses the same signaling. It’s cheaper to use the same hardware stack to service the DVI port and HDMI ports than it is to have separate “DVI” only hardware just to block the HDMI enhancements.

The only thing the adapters do is change the physical connection; the devices on each end don’t know the difference

That was true up to v2.1. HDMI 2.1 finally dropped the 3x data line + 1x clock line setup it inherited from DVI for the 4x self-clocked data lanes that DP has used from the beginning. That change eliminates the biggest technical difference between the two standards. The biggest remaining one is that HDMI 2.1 uses 16b/18b coding for an ~11% encoding loss (down from 20% with 8b/10b in earlier versions), while DisplayPort 2.0 upgraded all the way to 128b/132b for a mere ~3% reduction in raw vs. usable bandwidth. Beyond that, they remain different in the types of additional data sent over the auxiliary channel, but the aux channels have seen little real-world use in either standard.
 
That was true up to v2.1. HDMI 2.1 finally dropped the 3x data line + 1x clock line setup it inherited from DVI for the 4x self-clocked data lanes that DP has used from the beginning. That change eliminates the biggest technical difference between the two standards. The biggest remaining one is that HDMI 2.1 uses 16b/18b coding for an ~11% encoding loss (down from 20% with 8b/10b in earlier versions), while DisplayPort 2.0 upgraded all the way to 128b/132b for a mere ~3% reduction in raw vs. usable bandwidth. Beyond that, they remain different in the types of additional data sent over the auxiliary channel, but the aux channels have seen little real-world use in either standard.

It's still backwards compatible with HDMI 1.x and, as such, DVI-D. It's similar to how PCIe 3.0 is 128b/130b encoded but still compatible with PCIe 2.0/1.x and its 8b/10b encoding (though not quite the same, since HDMI has added many features that aren't part of the DVI spec, whereas PCIe 3.0 just changed encoding schemes and bumped up the speed).
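Putting the coding overheads mentioned above side by side (approximate, ignoring FEC and other protocol overhead):

```python
# Coding efficiency of the line codes discussed above: usable fraction of the raw bit rate.
schemes = {
    "8b/10b    (DVI, HDMI <= 2.0, DP <= 1.4)": (8, 10),
    "16b/18b   (HDMI 2.1 FRL)":                (16, 18),
    "128b/132b (DP 2.0 UHBR)":                 (128, 132),
    "128b/130b (PCIe 3.0+)":                   (128, 130),
}

for name, (payload, total) in schemes.items():
    eff = payload / total
    print(f"{name}: {eff:.1%} efficient ({1 - eff:.1%} overhead)")

# Applied to the top raw link rates: HDMI 2.1 FRL = 4 lanes x 12 Gbps, DP 2.0 UHBR20 = 4 x 20 Gbps.
print(f"HDMI 2.1: ~{48.0 * 16 / 18:.1f} Gbps usable of 48 Gbps raw")
print(f"DP 2.0:   ~{80.0 * 128 / 132:.1f} Gbps usable of 80 Gbps raw")
```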
 
All of them?
My life experience tells me that trashing items you don't need anymore results in almost immediately needing them...
At least keep one cable just in case you need to use it :)
I think I did toss them all and yeah, that's why I asked--as soon as you toss it, you need it. However, I can't see any reason for ever using DVI again. I have an old 26" monitor (ASUS) and it uses HDMI, and my next monitor will be HDMI or DP. I don't have any older monitors. So--bye bye.
 
Yep, HDMI started out as a repackaging of single-link DVI, plus audio, plus content protection. It remains backward compatible with single-link DVI to this day.

But any high-res/high-refresh DVI required dual-link DVI, which HDMI is not compatible with.
I was specifically talking about DVI-D too. Still, it seems like DVI's days are over.
 
I culled a majority of my VGA/DVI cables a few years ago - IIRC including all of my DVI-I cables (never really used, because I had problems with devices deciding to go analog instead of digital) - and don't have any still in use, after GPU vendors' port changes pushed my legacy displays to HDMI-to-DVI cables and forced the replacement of a VGA KVM with an HDMI model. I still have a few just in case, but most of my hoard (from monitors that came with several cables, and a few extra-long ones I needed) is gone.

Whenever I clean my tech closet again I'll probably cull the inventory more, along with USB A-B cables, mini-B cables, possibly micro-B (depends how far in the future the purge is), internal USB 2.0 header/brackets, and SATA cables. It'll be the biggest purging of obsolete tech-trash since I got rid of my PATA and floppy ribbons years ago. I'll probably keep 1 of each video cable for at least a few more years just in case I let myself be talked into trying to help a family member with an old PC in the future. The others will be thinned out a lot, but aren't at the point of near total disposal yet.
 
I kept at least one USB 2.0 header bracket around, simply because motherboard manufacturers treat rear I/O USB ports as some sort of precious commodity and are extremely stingy with them. One board placed two HDMI 1.4 connectors side by side, as if they wanted to justify wasting even more space (for a total of 6 USB ports, including USB-C).
 
I'll probably keep 1 of each video cable for at least a few more years just in case I let myself be talked into trying to help a family member with an old PC in the future. The others will be thinned out a lot, but aren't at the point of near total disposal yet.
Good strategy.
You never know when you're gonna need an obsolete cable or bracket.
 
I am sad my Korean IPS panel monitor only has a DVI input, and now current GPUs have stopped having DVI ports =[
 
Yeah, the last card of mine that had one was my 980 Ti, which I used with my Sony CRT.
 
The 1080 Ti was the last generation with it. DP seems to be the PC's main go-to, then HDMI. USB-C is starting to appear, and of course Thunderbolt.
 