DisplayPort 2.0

Interesting development. Read a few articles and I think Anandtech did the best job of breaking down the new standard:

https://www.anandtech.com/show/1459...-20-standard-bandwidth-for-8k-monitors-beyond

Basically, VESA is taking TB3 and its four 20Gbps lanes and making them all unidirectional for a full 80Gbps bandwidth one way, and slapping on DP / USB-C connectors. The one side effect of pushing that much bandwidth is that it will require active cabling. So while a DP 2.0 cable will be backwards compatible with 1.x ports, you'll need to pair a 2.0 cable with a 2.0 port if you want to achieve something like 4K 144Hz HDR 4:4:4 chroma, which I believe is the holy grail right now.

It will also make DP 2.0 the undisputed preferred connector for PCs over HDMI 2.1 and its 48Gbps bandwidth.
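Rough back-of-the-envelope math for why 80Gbps matters. This is a sketch, not spec tables: the ~5% blanking overhead is an assumption (CVT-R2-style reduced blanking), and the 128b/132b figure is DP 2.0's published line coding.

```python
# Sketch: does 4K 144Hz 10-bit RGB (HDR, 4:4:4) fit in DP 2.0's max link?
# Blanking overhead (~5%) is assumed, not taken from the spec tables.

def stream_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.05):
    """Uncompressed video stream rate in Gbps, including blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# UHBR20: 4 lanes x 20 Gbps raw, 128b/132b line coding
uhbr20_payload = 4 * 20 * 128 / 132          # ~77.6 Gbps usable

need = stream_gbps(3840, 2160, 144, 30)      # 10-bit RGB = 30 bpp
print(f"4K 144Hz HDR needs ~{need:.1f} Gbps; link carries ~{uhbr20_payload:.1f} Gbps")
```

So the full 80Gbps link clears the "holy grail" format with plenty of headroom, no DSC needed.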
 
I didn’t read the article, and I haven’t fully dived in yet, but does that mean that TB3 will be able to do DP 2.0 out of the box or with a firmware update or similar? Or will those with TB3 have to buy new hardware to support the standard?
 
I like this, but TB3 cables are already kinda wonky. I've had some interesting problems with eGPUs and getting active cables to work properly already.

Not cheap either: retail is $70 for a 10ft active cable, which is getting borderline similar to the price of putting optical transceivers on both ends.
 
> I didn’t read the article, and I haven’t fully dived in yet, but does that mean that TB3 will be able to do DP 2.0 out of the box or with a firmware update or similar? Or will those with TB3 have to buy new hardware to support the standard?

Extremely doubtful, given that they're reversing the 'receiving' side of the duplex connection so that the bandwidth is all going one way. Hard to see existing Thunderbolt controllers doing that, as they likely don't have the bandwidth available from the motherboard even if they could be programmed to do so.
 
> Extremely doubtful, given that they're reversing the 'receiving' side of the duplex connection so that the bandwidth is all going one way. Hard to see existing Thunderbolt controllers doing that, as they likely don't have the bandwidth available from the motherboard even if they could be programmed to do so.

It all depends on whether the TB3 controller is capable of being “hacked” to a certain degree. But I also realize that there is a good chance that it’s not feasible.

The workaround seems to be eGPUs with future video cards, if you want to keep using your current hardware.
 
> Interesting development. Read a few articles and I think Anandtech did the best job of breaking down the new standard:
>
> https://www.anandtech.com/show/1459...-20-standard-bandwidth-for-8k-monitors-beyond
>
> Basically, VESA is taking TB3 and its four 20Gbps lanes and making them all unidirectional for a full 80Gbps bandwidth one way, and slapping on DP / USB-C connectors. The one side effect of pushing that much bandwidth is that it will require active cabling. So while a DP 2.0 cable will be backwards compatible with 1.x ports, you'll need to pair a 2.0 cable with a 2.0 port if you want to achieve something like 4K 144Hz HDR 4:4:4 chroma, which I believe is the holy grail right now.

No, you don't.

The new cabling improves efficiency over the link, and also increases the passive link speed (read: ACTUAL bandwidth) from 25 to 38 Gbps (a roughly 50% improvement). That's enough bandwidth to run 4K 180Hz SDR, or 4K 144Hz HDR, all without DSC.

The current DP 1.4 already does 4K 120Hz SDR (or 98Hz HDR), so it's not too far a jump up to your "holy grail." The new cable speeds can handle 4K 144Hz HDR, or 5K 85Hz HDR (no DSC required!)

They're just reusing USB 3.2 passive cable tech for the 40Gbps link, and Thunderbolt 3 active cables (or factory-validated hardwired passive cables on cheaper devices) for anything faster.
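The claims above can be sanity-checked with simple link arithmetic. The line-coding ratios (8b/10b for DP 1.4's HBR3, 128b/132b for DP 2.0's passive UHBR10 tier) are the published ones; the ~5% blanking overhead is my assumption:

```python
# Payload rate of DP 1.4 (HBR3) vs. the DP 2.0 passive tier (UHBR10),
# checked against the formats mentioned above.
# Blanking overhead (~5%) is an assumption, not a spec value.

def stream_gbps(w, h, hz, bpp, blanking=1.05):
    """Uncompressed stream rate in Gbps, including blanking overhead."""
    return w * h * hz * bpp * blanking / 1e9

hbr3_payload   = 4 * 8.1 * 8 / 10        # 25.92 Gbps after 8b/10b
uhbr10_payload = 4 * 10  * 128 / 132     # ~38.8 Gbps after 128b/132b

formats = {
    "4K 120Hz SDR (24 bpp)": stream_gbps(3840, 2160, 120, 24),
    "4K 144Hz HDR (30 bpp)": stream_gbps(3840, 2160, 144, 30),
}
for name, need in formats.items():
    link = "HBR3" if need < hbr3_payload else (
           "UHBR10" if need < uhbr10_payload else "needs DSC")
    print(f"{name}: {need:5.1f} Gbps -> fits {link}")
```

4K 120Hz SDR squeaks under DP 1.4's ~25.9 Gbps, and 4K 144Hz HDR lands just under the ~38.8 Gbps passive DP 2.0 tier, which matches the post's numbers.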
 
I'm hoping DisplayPort 2.0 gets adopted faster than HDMI 2.1. That has been the pattern in the past, and the reduced cost and closed systems (e.g. internal laptop use) make it a lot easier to implement in some ways. For desktop monitor use, a lot will depend on video cards being released with support. If either AMD or Nvidia releases a card with it, monitor makers will be tempted to grab the early-adopter market with a new monitor, assuming the DP 2.0 ASICs are available.

The signalling rate question is interesting. If you want to use existing passive cables (6-9 feet max), you'd be limited to 40Gbit. Reaching the max bit rates would require new cables, and active cables are not cheap. Still, 40Gbit with DSC and the higher coding efficiency would be a step up from what's available today, so I can see that alone covering a 4K 144Hz 4:4:4 HDR stream, and you could get 4K 120Hz RGB over a regular cable. I also wonder if the USB-C controller output on RTX cards could be made to output this format via a firmware update at the existing/lower signalling rate (i.e. 40Gbit).
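A quick sketch of why the 40Gbit passive tier plus DSC stretches so far. The 3:1 figure is DSC's nominal maximum visually lossless ratio, and the ~5% blanking overhead is an assumption:

```python
# How far a 40 Gbps passive DP 2.0 link stretches with DSC.
# Assumes 128b/132b coding, ~5% blanking, and DSC's nominal
# up-to-3:1 visually lossless compression (actual ratios vary).

uhbr10_payload = 4 * 10 * 128 / 132                    # ~38.8 Gbps usable
need_4k144_hdr = 3840 * 2160 * 144 * 30 * 1.05 / 1e9   # ~37.6 Gbps raw

print(f"4K 144Hz HDR uncompressed: {need_4k144_hdr:.1f} Gbps "
      f"(fits passive link: {need_4k144_hdr < uhbr10_payload})")
print(f"with 3:1 DSC it drops to ~{need_4k144_hdr / 3:.1f} Gbps, "
      f"leaving headroom for much bigger formats on the same cable")
```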
 
> I'm hoping DisplayPort 2.0 gets adopted faster than HDMI 2.1. That has been the pattern in the past, and the reduced cost and closed systems (e.g. internal laptop use) make it a lot easier to implement in some ways. For desktop monitor use, a lot will depend on video cards being released with support. If either AMD or Nvidia releases a card with it, monitor makers will be tempted to grab the early-adopter market with a new monitor, assuming the DP 2.0 ASICs are available.
>
> The signalling rate question is interesting. If you want to use existing passive cables (6-9 feet max), you'd be limited to 40Gbit. Reaching the max bit rates would require new cables, and active cables are not cheap. Still, 40Gbit with DSC and the higher coding efficiency would be a step up from what's available today, so I can see that alone covering a 4K 144Hz 4:4:4 HDR stream, and you could get 4K 120Hz RGB over a regular cable. I also wonder if the USB-C controller output on RTX cards could be made to output this format via a firmware update at the existing/lower signalling rate (i.e. 40Gbit).

Yeah, I could see that happening. Or they may wait to make it a selling point of the next generation of cards: they never included the GTX 960 in Netflix 4K support, even though it has the same HEVC 10-bit decode and HDCP 2.2. Much like the 3GB VRAM cutoff, which was horse shit intended to sell cards more expensive than the GT 1030.

Nvidia has never been one to backport support for new displays when they can sell you new hardware instead. And they will always charge a premium for it.
 
> I'm hoping DisplayPort 2.0 gets adopted faster than HDMI 2.1. That has been the pattern in the past, and the reduced cost and closed systems (e.g. internal laptop use) make it a lot easier to implement in some ways. For desktop monitor use, a lot will depend on video cards being released with support. If either AMD or Nvidia releases a card with it, monitor makers will be tempted to grab the early-adopter market with a new monitor, assuming the DP 2.0 ASICs are available.
>
> The signalling rate question is interesting. If you want to use existing passive cables (6-9 feet max), you'd be limited to 40Gbit. Reaching the max bit rates would require new cables, and active cables are not cheap. Still, 40Gbit with DSC and the higher coding efficiency would be a step up from what's available today, so I can see that alone covering a 4K 144Hz 4:4:4 HDR stream, and you could get 4K 120Hz RGB over a regular cable. I also wonder if the USB-C controller output on RTX cards could be made to output this format via a firmware update at the existing/lower signalling rate (i.e. 40Gbit).

I'm far more interested in HDMI 2.1 just because we'll finally be able to use a home theater receiver and get VRR without some phantom monitor bullshit.

Audio is just a complete fucking mess on the PC right now. The hybrid DisplayPort-for-video, HDMI-for-audio thing is NOT working. Either dumbass Microsoft or Nvidia won't support simply turning the video part of the signal off, and it makes the whole thing an irritating nightmare for people who want uncompressed 7.1 audio.
 
> I'm far more interested in HDMI 2.1 just because we'll finally be able to use a home theater receiver and get VRR without some phantom monitor bullshit.

Co-signed. HDMI 2.1 makes VRR mandatory; DisplayPort 2.0 does not. That makes HDMI far more attractive to me. (Never mind that I use TVs as my PC monitors, so DisplayPort is simply not an option anyway.)

Besides, do we *really* need more than the 12k/120Hz HDMI 2.1 supports?
 

> Besides, do we *really* need more than the 12k/120Hz HDMI 2.1 supports?

HDMI 2.1 doesn't support that. It supports 8K 120Hz 24-bit with DSC, or 8K 30Hz without DSC. It drops even further if you want HDR with 4:4:4 chroma.

It's only 48 Gbps (20% faster than the new DP 2.0 passive cable), and it plays the same trick DP does with the same DSC support. After HDMI's overhead, there's very little difference between the two, except that DP 2.0 has room to grow to nearly double HDMI 2.1's throughput.
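The overhead comparison in numbers. The raw rates are the headline spec figures, and the line-coding schemes (16b/18b for HDMI 2.1's FRL, 128b/132b for DP 2.0) are the published ones:

```python
# Usable payload after line-coding overhead for each link.
# Raw rates are the headline spec numbers; coding schemes as published.

links = {
    "HDMI 2.1 FRL (48G, 16b/18b)":      48 * 16 / 18,
    "DP 2.0 passive (40G, 128b/132b)":  40 * 128 / 132,
    "DP 2.0 max (80G, 128b/132b)":      80 * 128 / 132,
}
for name, gbps in links.items():
    print(f"{name}: {gbps:.2f} Gbps payload")
```

That puts HDMI 2.1 at ~42.7 Gbps of payload against DP 2.0's ~38.8 Gbps passive and ~77.6 Gbps active, which is the "very little difference, except for headroom" point above.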
 