What's after DVI?

NExUS1g

I've been doing some reading on 4K displays, and apparently they require 2 DVI connections to support that high a resolution at 60Hz; a single DVI connection only gets you 30Hz.

So I imagine there must be new solutions on the horizon that we'll be looking at as 4K panels come down to mainstream prices over the next few years, since requiring multiple DVI connections and being limited to 60Hz really isn't practical for a mass-market audience. However, after some searching around, I haven't found any information on it yet.

So my question is this: what are the next connectors beyond HDMI/DVI that will carry the immense amount of data 4K displays require through one cable?
 
They're great solutions for getting 4K down to one cable, but they're still limited to 60Hz at 4K.
 

I would imagine it would be another version of HDMI, DVI, or DisplayPort.

We still have plenty of time; video cards won't be able to push 120Hz @ 4K for years to come.
 
It may be possible for HDMI 2.0 to exceed spec and support 100Hz+ at 4K.

HDMI 1.3+ is only spec'd for a bit more than 1080p at 60Hz, but many have found their TVs can run 120Hz over HDMI anyway.
The HDMI 2.0 cable will need to support the bandwidth, but that might be possible at short lengths on a decent-quality cable.
The chips will also need to support it, and that isn't likely in the first generation.
 

Actually, I find this unlikely, because 60Hz at 4K is already at the very edge of the HDMI 2.0 spec. With 2560x1440 monitors people managed 100Hz and 120Hz, but that wasn't nearly as much of an overclock: dual-link DVI at 60Hz on those monitors was only at something like 60% of its bandwidth capacity, not close to the 95% that HDMI 2.0 is running at for 4K @ 60Hz.

DP 1.2 has enough bandwidth to run 4k @ 100hz.
 
This page has some nice graphs with required bandwidths and connector types: http://www.noteloop.com/kit/display/interface/#4k

Unfortunately, display connectors are the main bottleneck for high resolutions and refresh rates right now, and that doesn't look likely to change much over the next few years. Some higher-end displays already rely on 2-4 connectors, but most people probably aren't going to bother with that complexity, and it isn't fully supported by Nvidia yet.

I'm hoping we might get some kind of PCIe-like connection to drive future displays directly, since a PCI Express 3.0 x16 connection already has enough bandwidth to handle 8K 24-bit at 120Hz. SSDs are already shipping that use x4 connections, which would theoretically allow roughly twice the speed of both DP 1.2 and HDMI 2.0.
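
Rough numbers, for anyone curious; a quick Python sketch counting raw pixel data only (no blanking or protocol overhead), with the usual approximate effective link rates, so treat it as back-of-the-envelope:

Code:
# Back-of-the-envelope: raw pixel data vs. approximate effective link rates.
# Blanking intervals and protocol overhead are ignored, so real requirements are a bit higher.

def raw_gbps(width, height, bpp, hz):
    """Raw pixel data rate in Gbit/s."""
    return width * height * bpp * hz / 1e9

print(raw_gbps(3840, 2160, 24, 60))    # ~11.9 Gbps  (4K @ 60Hz)
print(raw_gbps(3840, 2160, 24, 100))   # ~19.9 Gbps  (4K @ 100Hz)
print(raw_gbps(7680, 4320, 24, 120))   # ~95.6 Gbps  (8K @ 120Hz)

dp_1_2    = 17.28        # Gbps effective (4 lanes x 5.4 Gbps, minus 8b/10b)
hdmi_2_0  = 18.0 * 0.8   # ~14.4 Gbps effective (3 channels x 6 Gbps, minus 8b/10b)
pcie3_x4  = 4 * 7.877    # ~31.5 Gbps (8 GT/s per lane, minus 128b/130b)
pcie3_x16 = 16 * 7.877   # ~126 Gbps -- comfortably above the 8K @ 120Hz figure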
 

For gaming, it seems that throwing more cables at the problem wouldn't be a viable solution. The reason, I'm thinking, is the processing overhead and related input delay of splitting the image into 2 or 4 areas, sending the right bits down the right lines, and reassembling them into a coherent image on the screen, versus just sending a serial stream of bits through a single line.
 
It doesn't appear so.

http://www.amd.com/us/Documents/50279_AMD_FirePro_DisplayPort_1-2_WP.pdf

See the chart on pg. 3 and the paragraph on pg. 4 under the heading "4k x 2k Resolution".

I know quite a bit about this stuff. What it really comes down to is pixel clock for the bandwidth. HDMI 2.0 is a 600 MHz pixel clock; DP 1.2 is 960 MHz. That is a huge difference. Take a look at this post I made for details:

http://hardforum.com/showpost.php?p=1040272771&postcount=9
(BTW I typed all that out on the spot without having to look up any of that info online.)

FYI, using totally standard CVT reduced-blanking timings, 100Hz for 3840x2160 comes out to this (for those who don't know, the first number after the resolution in quotes is the pixel clock in MHz):

Modeline "3840x2160" 888.33 3840 3888 3920 4000 2160 2163 2168 2222 +hsync -vsync

Now, that is reduced blanking, which not everyone would necessarily use. Regular CVT would be:

Modeline "3840x2160" 1230.50 3840 4184 4608 5376 2160 2163 2168 2289 -hsync +vsync

Generally speaking, LCD displays usually run on reduced timings or something very near them. For example, the full CVT modeline for 3840x2160@60Hz is 712.75 MHz, yet the reduced-blanking one is only 533 MHz. The display industry is picking a modeline that is around 600 MHz, so it's a lot closer to CVT reduced than to non-reduced.
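
If anyone wants to sanity-check those modelines themselves, here's a quick Python sketch; the only trick is that the pixel clock equals the *total* width times the *total* height (both including blanking) times the refresh rate. The 60Hz line assumes the same CVT-R blanking totals as the 100Hz one:

Code:
# Refresh rate implied by a modeline: pclk = h_total * v_total * refresh.
# The totals are the last horizontal and vertical numbers, i.e. with blanking included.

def refresh_hz(pclk_mhz, h_total, v_total):
    return pclk_mhz * 1e6 / (h_total * v_total)

print(refresh_hz(888.33, 4000, 2222))    # ~99.9 Hz  (the CVT-R modeline above)
print(refresh_hz(1230.50, 5376, 2289))   # ~100.0 Hz (the full CVT modeline above)
print(refresh_hz(533.25, 4000, 2222))    # ~60.0 Hz  (the ~533 MHz reduced timing mentioned above)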

So even going by the logic that they currently stay under 600 MHz (HDMI is only spec'd for 600, and current 30Hz UHD TVs use 297 MHz modelines), just do the math: 3.333 × 297 is only 990 MHz, which is only 30 MHz above the DP 1.2 spec. Trust me, you can tighten the timings down a bit more and do 100Hz via DP 1.2; it would be *no* problem. Getting 120Hz over DP might be possible with some overclocking, but that would be hard. I don't think 100Hz would be hard at all.

Another thing to keep in mind is that the page you linked me to was listing 4K as 4096x2160 (not 3840x2160), which increases the bandwidth requirements. I am sure most 4K displays will be 3840x2160 rather than 4096x2160; I was assuming the UHDTV standard, not the cinema one.

Generally speaking, video cards/outputs don't have a hard maximum resolution or refresh rate; it's all a matter of what fits within the constraints of the modeline and how high a pixel clock you can use. A lot of companies list 2560x1600 as the maximum digital resolution, but almost any machine with DVI can run a 3840x2400 display if the refresh rate is low enough (like 17Hz). I even ran 3840x2400 @ 60Hz off a really old GeForce 6600 GT AGP via 2x dual-link DVI, and that card was spec'd for a 1920x1200 maximum digital resolution (it even predated the 30 inch Dell).

I am a resolution/desktop real-estate whore, so I have been learning about custom resolutions/refresh rates and hardware limitations for a lot of years.

In 1998 I was using a 17 inch CRT that ran 1600x1200@60Hz.
In 2001 I was running a 22 inch CRT at 2560x1920@54Hz (due to hardware constraints of the time; later upped to 63Hz, the monitor's limit).
In early 2006 I was running a 30 inch Dell (2560x1600). I finally upgraded to an LCD once it wouldn't be a huge downgrade in resolution.
In late 2006 I got my first (of several) 22 inch 3840x2400 displays (which is one of the most whacky/weird/quirky displays you will ever deal with in your life when it comes to running it).

And stagnant... LOL, I can't believe how long it took for better things to finally come.
2012: Upgraded to a Yamakasi Catleap 2B at 100Hz. I play Quake Live, so I really appreciated the 100Hz display even with the slow response of IPS.
2013: Finally replacing several of my old 4K displays with new 4K TVs that are so cheap...
 

I double checked, and the page I linked listed 4K as specifically 3840x2160.

By my math (which may not be accurate, so feel free to correct me): 3840 × 2160 = 8.2944 million pixels, × 24 bits (8 bpc) = 199.0656 Mbit per frame, i.e. 199.0656 Mbps to show one full 4K image per second. Multiply that by 100 for 100Hz and that comes out to 19.90656 Gbps required for 3840x2160 @ 24 bpp @ 100Hz.
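
(Quick check of that arithmetic; a tiny Python sketch counting raw pixel bits only, with no blanking intervals or link encoding overhead included:)

Code:
# Raw pixel data for 3840x2160 at 24 bpp, ignoring blanking and link encoding.
pixels = 3840 * 2160                 # 8,294,400 pixels per frame
mbit_per_frame = pixels * 24 / 1e6   # ~199.07 Mbit per frame
print(mbit_per_frame * 100 / 1e3)    # ~19.91 Gbps of pixel data at 100Hz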

Now, what this tells me is that even operating at the higher 1.2305 GHz you mention, it would require 20 separate lanes dedicated to pixel information alone to support that resolution, color depth, and refresh rate, and DP has 20 lanes altogether.

Let me know what I'm missing here.

22 inch 3840x2400 display (which is one of the most whacky/weird/quirky displays you will ever deal with in your life when it comes to running it).

Despite its weirdness, that must have been one sharp display. That pixel density would put the Retina display to shame, though I don't know the details of the pixel density of Retina displays.
 
I double checked, and the page I linked listed 4K as specifically 3840x2160.

Whoops, I guess I was looking at the maximum resolution field in the table above.


OK, so I may be off a bit here, as I trusted that Nvidia's pixel clock limit would be based on 24 bpp. The thing is, DP can be 6-16 bits per channel. The logs on *nix are nice because they tell you the pixel clock limits plain as day. Pixel clock has nothing to do with how many bits per pixel; it's just pixels/sec. So what happens with the modeline I mentioned @ 100Hz (888.33 MHz pixel clock)?


We get that pixel clock by:

4000 (width with blanking) × 2222 (height with blanking) × 100 (Hz) = 888.8 million, i.e. an 888.8 MHz pixel clock. My CVT calculator might be a bit off, since it got 888.33 somehow...

OK, so we need an 888.8 MHz pixel clock. Well, if I believe my Nvidia logs, then I have a 960 MHz pixel clock available:

Code:
(--) Jul 03 22:53:34 NVIDIA(0): DFP-4: 960.0 MHz maximum pixel clock
(--) Jul 03 22:53:34 NVIDIA(0): DFP-4: Internal DisplayPort

More than enough, right? Well, the Gbps figure is calculated as pixel clock × bpp. So assuming 24 bits per pixel, that would be:

960 MHz × 24 = 23.04 Gbps

Uh oh... what happened here? AFAIK DP 1.2's maximum effective rate is 17.28 Gbps... Well, what happens when we only use 6 bits per color channel instead of 8, i.e. 18 bpp?

960 MHz × 18 = 17.28 Gbps

Oh shit...

So while DP does have enough bandwidth and can potentially do 108Hz at CVT reduced timings, that would only be at 18-bit color (not 24-bit).

So what's the 24-bit limit?

17.28 Gbps / 24 = 720M, i.e. a 720 MHz pixel clock, and thus not enough for 100Hz. Using CVT-R timings, the refresh rate would be:

720,000,000 / (4000 (width with blanking) × 2222 (height with blanking)) = 81.0081

So basically 81Hz.

Honestly though, for gaming I would be happy with 81Hz, and I might even be happy with some reduced color for the extra ~20Hz, especially if it's just for gaming...

So is DP 1.2 enough bandwidth to run > 100 Hz @ 4k? Yes
Is DP 1.2 enough bandwidth to run > 100 Hz @ 4k @ 24 bpp? No
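
(To put the arithmetic in one place, here's a little Python sketch, assuming the 17.28 Gbps effective DP 1.2 rate and the CVT-R blanking totals from the modeline above:)

Code:
# Max refresh at 3840x2160 over DP 1.2 (17.28 Gbps effective), CVT-R blanking totals.
DP12_GBPS = 17.28
H_TOTAL, V_TOTAL = 4000, 2222   # width/height including blanking

for bpp in (24, 18):
    max_pclk = DP12_GBPS * 1e9 / bpp      # Hz: 720 MHz at 24 bpp, 960 MHz at 18 bpp
    max_hz = max_pclk / (H_TOTAL * V_TOTAL)
    print(bpp, "bpp ->", round(max_pclk / 1e6), "MHz ->", round(max_hz, 1), "Hz")
# 24 bpp -> 720 MHz -> ~81.0 Hz
# 18 bpp -> 960 MHz -> ~108.0 Hz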

I assumed a 24-bit minimum when Nvidia's pixel clock limit was listed, which was a bad assumption on my part, since that is what it is for all the other (HDMI/DVI/RAMDAC) outputs. My bad! At least we/I learned something here.


Despite its weirdness, that must have been one sharp display. That pixel density would put the Retina display to shame, though I don't know the details of the pixel density of Retina displays.

Well, 'retina' is relative to the distance you sit from the display; most would consider 2 feet away from that display to be retina, but I can still notice the jaggies in text and such even at the smallest renderable size. I actually run even smaller fonts than people on Windows, because I run Linux with X set to 75 DPI font settings (vs the 96 DPI that Windows uses). I have a real hard time accepting that people think 2560x1440 on a 27 inch makes the pixels too small, because that is *nothing*. I have no problem working all day ~2 feet away from that 22 inch, 204 PPI display, whose pixels are almost 4x smaller by area. Now, I get that is an extreme and not everyone would be OK with it, but I really can't understand how people have problems with stuff that is 2-3x that size, heh...
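
(For what it's worth, the pixel-size comparison in numbers; a quick Python check using just the sizes mentioned here. It lands around 206 PPI for the 22 inch panel, close to the 204 figure quoted:)

Code:
import math

def ppi(w_px, h_px, diag_inches):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_inches

small = ppi(3840, 2400, 22)   # ~206 PPI (22" 3840x2400)
big   = ppi(2560, 1440, 27)   # ~109 PPI (27" 2560x1440)
print(round(small), round(big), round((small / big) ** 2, 1))   # pixels ~3.6x smaller by area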
 
I think Thunderbolt 2 is worth mentioning here. Most uptake of it so far has been for Apple products, but it can do 20Gbps in principle.
 

No monitor supports it aside from the ACD, and AFAIK no mainline-PC GPU supports it... nor does either of them intend to.

Thunderbolt has as bright a future as FireWire did: technically a good standard, ruined by licensing requirements.
 

TB 2 is the same bandwidth as DP 1.2 anyway, right?

I think the whole TB thing actually held back DP on my MacBook Pro Retina. I was surprised to find out my MacBook Pro with a GTX 650M did not have a DP 1.2 port. That, I have to say, was disappointing...
 
DMS-59 is basically just combining two DVI cables into one connector. I don't really see what's special about that; it's still DVI. That being said, it is what all my older 4K monitors used, LOL.
 
No monitor supports it aside from the ACD, and AFAIK no mainline-PC GPU supports it... nor does either of them intend to.

Thunderbolt has as bright a future as FireWire did: technically a good standard, ruined by licensing requirements.

I would agree with that. However, we are talking about future solutions 'on the horizon.' I wouldn't put it past Apple to jump on 4K for their notoriously expensive (but decently selling) line of monitors, nor would I put it past them to use the occasion as an opportunity to try again with their TB connector tech.
 

Apple can "try again" however much they want. Nothing will change until Thunderbolt drops in price for consumers and OEMs. When a 2 fucking meter consumer cable costs $30 USD, it is dead in the water for wider adoption... and who knows what the licensing cost to OEMs is to put the connectors on devices.

And all that looks even more moronic for Apple et al. when DisplayPort is royalty-free, has very few downsides in comparison, and is already appearing on monitors.
 
Again, I don't disagree, but I understand there is a good reason why the cables are so expensive, i.e., they HAVE to be active cables: http://arstechnica.com/apple/2012/07/why-thunderbolt-cables-will-be-expensive-until-2013/

We're talking about 4K here. Until monitors at this resolution come way down in price, they'll continue to be bought primarily by folks for whom it makes little difference having to pay 50 beans for a cable.

It doesn't matter whether there's a "good (technical) reason" that a 2m cable is $30 or not. A $30 2m cable is still outrageous to just about anyone apart from people who buy their cables from Monster, on principle.

A 50" 4K can already be had for $700 USD. How much more does it need to come down in price? The caveat for us [H] people being that, of course, at $700 you're getting 30Hz. Even then, no monitor maker is bothering with TB either, due to licensing I'd wager. Consumers ain't willing, and OEMs seem not to be willing. Guess what? FireWire all over again.
 
Well, when I referred to '4K monitors' I literally was thinking of (60-120Hz) monitors, not 30Hz HDTVs. I don't expect those monitors to be as cheap as the 50" Seiki you refer to anytime soon, though I'll be delighted to be proven wrong.

Do any 'true' 60Hz 4K displays actually exist yet? The only ones I remember reading about were actually 2 sub-4K panels combined into the same housing, running off two separate HDMI 1.x connectors at once.
 

All 4K 60Hz+ displays run off two inputs (I am counting MST over DP as two inputs still). There have been ones (expensive ones) that could do 60Hz since 2007. I've never heard of any of them actually being made up of multiple panels, though. Are you just assuming multiple panels because they take 2 or 4 inputs to drive them at the full refresh rate?
 
I seem to remember reading the 2-panels-in-one claim somewhere, but I have no idea if it was correct, no.

I have seen that claim made mistakenly about panels where I know it was not the case, so I am guessing whoever said that was probably mistaken (at least in the case of 4K, i.e. 3840x2160, 3840x2400, or 4096x2160 panels). I would guess you probably got bad info.

Although if one exists and you could link me, I could be proven wrong =)
 