They're lying to us about DP 2.1 (DisplayPort)

Roen

There are three different bandwidth sub-specifications of DP 2.1: UHBR10, UHBR13.5, and UHBR20. Beware of marketing material that just says "DisplayPort 2.1". You should know when, or if, you actually need one of the higher-bandwidth tiers though; see the video for details.


View: https://www.youtube.com/watch?v=nIgHNP-9SvY
I'm in no way affiliated with TFT Central nor am I compensated for sharing this video.

DP 1.4 en 2.1 Bandwidths.png


I don't see a good reason for UHBR10 to exist, so it feels like a future problem being created today just so that manufacturers can abuse the term "DP 2.1" to sell products, even though supporting only UHBR10 offers no real benefit to the user. One would assume monitor manufacturers will make sure their monitors support at least the bandwidth they require at max refresh rate at native resolution, but if they're cheap about it, e.g. 10-bit RGB may not be available at the highest refresh rates.

Dear future you and me: start by calculating your bandwidth requirements and read the fine print in the specs, looking for UHBR20 or UHBR13.5 as needed. I found a display bandwidth calculator but it doesn't tell you what the number is with DSC applied. So if anyone knows by what percentage(s) DSC compresses signals, please share the info with the rest of us.
Edit: calculator with DSC on LTT forums here.
Edit #2: Quote from displayport.org's FAQ:
How does VESA’s DSC Standard compare to other image compression standards?
Compared to other image compression standards such as JPEG or AVC, etc., DSC achieves visually lossless compression quality at a low compression ratio by using a much simpler codec (coder/decoder) circuit. The typical compression ratio of DSC range from 1:1 to about 3:1 which offers significant benefit in interface data rate reduction. DSC is designed specifically to compress any content type at low compression with excellent results. The simple decoder (typically less than 100k gates) takes very little chip area, which minimizes implementation cost and device power use, and adds no more than one raster scan line (less than 8 usec in a 4K @ 60Hz system) to the display’s throughput latency, an unnoticeable delay for interactive applications.
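If it helps anyone with the "start by calculating your bandwidth requirements" step, here's a rough sketch of the arithmetic in Python. The ~7% blanking overhead and the 12 bpp DSC target are my own ballpark assumptions (DSC is usually configured somewhere between 8 and 12 bits per pixel), so treat the output as an estimate, not a spec number:

```python
# Rough data-rate estimate for a display mode, with and without DSC.
# Assumptions (mine, not from the spec):
#  - blanking overhead is approximated by a flat ~7% factor; real timings
#    (CVT-RB etc.) depend on the exact mode
#  - DSC output is modelled as a fixed target of bits per pixel, typically
#    somewhere between 8 and 12 bpp

def data_rate_gbps(width, height, refresh_hz, bits_per_channel=10,
                   blanking_overhead=1.07, dsc_target_bpp=None):
    pixels_per_second = width * height * refresh_hz * blanking_overhead
    bpp = 3 * bits_per_channel if dsc_target_bpp is None else dsc_target_bpp
    return pixels_per_second * bpp / 1e9

raw = data_rate_gbps(3840, 2160, 240)                      # ~64 Gbps uncompressed
dsc = data_rate_gbps(3840, 2160, 240, dsc_target_bpp=12)   # ~26 Gbps at 12 bpp (2.5:1)
print(f"4K 240Hz 10-bit: {raw:.1f} Gbps raw, {dsc:.1f} Gbps with DSC")
```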
 
Coax, Cat, HDMI: all of the modern cables have always been the wild west with marketing. There are standards, but no one watching.

Speaking of that, can I interest you in a box of CAT6e cable?

17275233322043667995644330447064.jpg
 
man, I'm glad you clarified which "DP" you're talking about! ;)
Just like HDMI, not all versions are the same. They should just use clear, new numbering.
 
LTT bandwidth calculator might be what you want to check out.

DP 2.1 support will be the same deal as HDMI 2.1 - details determine what it actually means.

GPUs should support the full DP 2.1 UHBR20, but so far only Radeon Pro cards do; the 7000 series is only UHBR13.5. We will see what Nvidia's 50 series will do.

Full UHBR20 is barely enough for 4K @ 10-bit color @ 240 Hz. With HDR being a standard feature on high end displays, 10-bit color bandwidth demand will matter.

Most displays will use DSC, which will allow 4K @ 10-bit @ 240 Hz at fairly low compression ratios even with UHBR13.5.

Any issues with DSC are on the GPU end, especially on Nvidia cards where entire features may be unavailable when DSC is in use. Nvidia needs to fix that crap.
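To put numbers on "barely enough", here's a small sketch comparing an uncompressed 4K 240Hz 10-bit stream against the three DP 2.1 tiers. The per-lane rates are the published ones; the 128b/132b efficiency factor and the flat blanking allowance are my own simplifications, so the margins are approximate:

```python
# Sketch: does uncompressed 4K 240Hz 10-bit RGB fit on each DP 2.1 tier?
# Assumes 4 lanes, 128b/132b line coding, ~7% blanking, and ignores any
# further link overhead (FEC, metadata), so the numbers are approximate.

LANES = 4
EFFICIENCY = 128 / 132                       # 128b/132b coding used by UHBR rates
TIERS_GBPS_PER_LANE = {"UHBR10": 10.0, "UHBR13.5": 13.5, "UHBR20": 20.0}

required = 3840 * 2160 * 240 * 30 * 1.07 / 1e9   # ~64 Gbps uncompressed

for name, per_lane in TIERS_GBPS_PER_LANE.items():
    payload = per_lane * LANES * EFFICIENCY
    verdict = "fits uncompressed" if payload >= required else "needs DSC"
    print(f"{name}: ~{payload:.1f} Gbps payload -> {verdict} (~{required:.1f} Gbps needed)")
```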
 
When I got a DSC monitor I tried hard to spot where this compression is visible, but could not find any obvious artifacts.
A later firmware update made 144Hz work over HDMI 2.1, so I used that but overclocked it to 155Hz. On DP it runs 160Hz.
Then I switched back to DP, because I cannot see any difference.

I tried to find reference images for DSC to do a diff comparison and see how DSC butchers the image, but to no avail. No images and no tools to generate such images...

...and maybe that's for the better. If I knew where the difference was, my brain would be motivated to fake seeing differences...
ignorance-is-bliss.jpg


As for the DP 2.1 debacle: just like with any other standard, by the N-th revision they had to complicate things so no one knows what is what.
From what I understand there are cable length issues, and the higher bandwidth tiers need active cabling, which will be hella expensive. Hopefully by the time we actually need such cables they will be available and relatively affordable. I don't think the reason will be running a monitor without DSC; if no one can see the difference, there is hardly any point in not using it. Maybe for uber-professional displays or something, but not for a gaming monitor.

Is there a single reason why anyone should care about the existence of DSC in our monitors?
In the area of monitors I've exhausted my worrying capacity on the lack of proper factory calibration and the lack of any option to calibrate out the black crush that monitor manufacturers push, because reviewers keep calling black crush "deep blacks".
 
Is there a single reason why anyone should care about the existence of DSC in our monitors?

I think you can go either way; it'd be nice to just have all the data, but the kinds of images that will be noticeably different with DSC are going to look pretty much like random pixels, and if your random pixels aren't quite right, it didn't really hurt anything, did it?

There's certainly a limit though; a lot of people would be unhappy if all gaming went through H.265 before it hit the screen - artifacts and latency would be noticeable. DSC doesn't add significant latency, and by all accounts is nearly unnoticeable. (And the test they used to determine acceptability would highlight differences: show people a) the uncompressed image and b) fast cycling between uncompressed and compressed, and ask which looks better. If it's barely noticeable with cycling, it's even less noticeable when it's just always compressed.)
 
Avoiding DSC is not just about image quality. If Simon (TFT Central) and his peers say they can't see a difference, I believe them: DSC is visually lossless. There are other reasons mentioned in the video why people may want or need to avoid DSC. It's another variable with the potential to cause issues, so if you can avoid it...

I don't know if the following example is still relevant, but here's a quote from the creator of CRU, the program that puts the user in control of their monitor's refresh rate modes and timings:
ToastyX said:
NVIDIA's driver currently ignores EDID overrides when Display Stream Compression (DSC) is active. Please report this issue to NVIDIA.
I don't know if this is still a thing today, as I can add my own modes in CRU and they work, but I still can't open or edit the extension block that seems to contain my monitor's 144Hz and 120Hz modes. It gets listed as a "Default extension block", which is what CRU calls a block it can't read. I have no control over the modes in there, I can only guess what (else) is in there, and it leaves me with little room for my own extension block(s) before the EDID is full.

I'm not sure if it's DSC on Nvidia causing CRU to be unable to read Gigabyte's extension block, but not knowing is already reason enough to say: hey, if I have a choice between two very similar monitors and one lets me eliminate this variable, why wouldn't I pick that one? HDCP is another one of these variables that people will say is fine; in my case HDCP randomly turns off my monitor once every few hours when I have the Spotify website open - but not the second monitor. The cause was not easy to find.

When possible, it's a good idea to keep things as simple as they can be. It makes for less time-consuming troubleshooting whenever you do have a problem. Adding another layer of complexity to something is rarely completely "free".

Even with DSC we'll still eventually run into the bandwidth limitations of the slower DP 2.1 tiers. It's good to know the different "tiers" within 2.1 even if you're not interested in avoiding DSC. 8K 120Hz 10-bit with DSC requires 42.58 Gbit/s and exceeds DP 2.1 UHBR10. The most UHBR10 can do at 4K 10-bit with DSC is 360Hz.
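The 360Hz figure falls out of the same kind of arithmetic, solved for refresh rate instead. The 12 bpp DSC target, lane count, coding efficiency, and blanking factor below are all my assumptions, so the results land in the right neighbourhood rather than being exact:

```python
# Sketch: roughly the highest refresh rate a given DP 2.1 tier can carry.
# Simplifications: 4 lanes, 128b/132b coding, ~7% blanking, and DSC
# modelled as a flat 12 bits per pixel.

def max_refresh_hz(gbps_per_lane, width, height, bpp,
                   lanes=4, efficiency=128 / 132, blanking=1.07):
    payload_bps = gbps_per_lane * lanes * efficiency * 1e9
    return payload_bps / (width * height * bpp * blanking)

print(f"UHBR10, 4K, DSC at 12 bpp: ~{max_refresh_hz(10.0, 3840, 2160, 12):.0f} Hz")
print(f"UHBR10, 8K, DSC at 12 bpp: ~{max_refresh_hz(10.0, 7680, 4320, 12):.0f} Hz")
```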
 
Had many issues with DSC and NVIDIA on my AW2725DF:
-Black screen after alt+tab fullscreen
-Glitched resolutions and freezes when using multiple displays
-DSR and Integer unavailable

After switching to a 7900XTX all of those issues are gone. I can also use DSR/Integer scaling (and other AMD technologies like AFMF2) and the control panel is way better. The fact that the monitor uses DP 1.4 or DSC just doesn't seem to matter on AMD cards; it really is a smoother experience.


View: https://youtu.be/-TKGfEADMu4?t=497
 
Had many issues with DSC and NVIDIA on my AW2725DF:
-Black screen after alt+tab fullscreen
-Glitched resolutions and freezes when using multiple displays
-DSR and Integer unavailable

After switching to a 7900XTX all of those issues are gone. I can also use DSR/Integer scaling (and other AMD technologies like AFMF2) and the control panel is way better. The fact that the monitor uses DP 1.4 or DSC just doesn't seem to matter on AMD cards; it really is a smoother experience.
People basically blame DSC for Nvidia-specific issues. Nvidia doesn't seem to care to fix those features.
 
I don't know if this is still a thing today, as I can add my own modes in CRU and they work, but I still can't open or edit the extension block that seems to contain my monitor's 144Hz and 120Hz modes. It gets listed as a "Default extension block", which is what CRU calls a block it can't read. I have no control over the modes in there, I can only guess what (else) is in there, and it leaves me with little room for my own extension block(s) before the EDID is full.

I'm not sure if it's DSC on Nvidia causing CRU to be unable to read Gigabyte's extension block, but not knowing is already reason enough to say: hey, if I have a choice between two very similar monitors and one lets me eliminate this variable, why wouldn't I pick that one?
You can also try AW EDID Editor for editing the EDID data. CRU is a lot more cumbersome and IMO better left for just loading an EDID.
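If you just want to see which extension blocks your EDID actually contains (rather than edit them), a saved binary dump is easy to inspect. A minimal sketch, where the filename is hypothetical and only a couple of well-known tag values are decoded:

```python
# Sketch: list the extension blocks in a saved EDID dump (e.g. a .bin
# exported from CRU). Only a handful of well-known tag values are decoded.

EXTENSION_TAGS = {
    0x02: "CTA-861 extension (detailed timings, HDMI data blocks)",
    0x70: "DisplayID extension",
}

def list_extension_blocks(path):
    with open(path, "rb") as f:
        edid = f.read()
    count = edid[126]   # byte 126 of the 128-byte base block = number of extensions
    print(f"{count} extension block(s) reported")
    for i in range(count):
        block = edid[128 * (i + 1): 128 * (i + 2)]
        if len(block) < 128:
            print(f"block {i + 1}: missing from this dump")
            continue
        tag = block[0]
        desc = EXTENSION_TAGS.get(tag, "unknown / vendor-specific")
        print(f"block {i + 1}: tag 0x{tag:02X} -> {desc}")

list_extension_blocks("monitor_edid.bin")   # hypothetical filename
```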
 
Had many issues with DSC and NVIDIA on my AW2725DF:
-Black screen after alt+tab fullscreen
-Glitched resolutions and freezes when using multiple displays
-DSR and Integer unavailable

After switching to a 7900XTX all of those issues are gone. I can also use DSR/Integer scaling (and other AMD technologies like AFMF2) and the control panel is way better. The fact that the monitor uses DP 1.4 or DSC just doesn't seem to matter on AMD cards; it really is a smoother experience.


View: https://youtu.be/-TKGfEADMu4?t=497

Not sure if it's DSC related or something else, because on my LG 27GP950 I definitely had DSR and integer scaling when running 4K @ 144Hz on an RTX 2070. When enabling the 160Hz OC mode, integer scaling was definitely gone; not sure about DSR. This monitor uses DSC for anything above 95Hz - which is exactly what the DP 1.4 spec dictates.

People basically blame DSC for Nvidia-specific issues. Nvidia doesn't seem to care to fix those features.
It is definitely an Nvidia issue, not a DSC issue.
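For what it's worth, the ~95Hz cut-off mentioned above lines up with what HBR3 can carry. A rough check, with the 8b/10b efficiency and flat blanking factor as my own simplifications (the exact cut-off depends on the actual timings the monitor uses):

```python
# Sketch: why DP 1.4 (HBR3) runs out at roughly 95-98Hz for 4K 10-bit RGB
# without DSC.

HBR3_PAYLOAD_GBPS = 8.1 * 4 * 8 / 10        # 4 lanes, 8.1 Gbps each, 8b/10b coding

bits_per_frame = 3840 * 2160 * 30 * 1.07    # 10-bit RGB plus ~7% blanking
print(f"max refresh without DSC: ~{HBR3_PAYLOAD_GBPS * 1e9 / bits_per_frame:.0f} Hz")
```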
 
LTT bandwidth calculator might be what you want to check out.

DP 2.1 support will be the same deal as HDMI 2.1 - details determine what it actually means.

GPUs should support the full DP 2.1 UHBR20, but so far only Radeon Pro cards do; the 7000 series is only UHBR13.5. We will see what Nvidia's 50 series will do.

Full UHBR20 is barely enough for 4K @ 10-bit color @ 240 Hz. With HDR being a standard feature on high end displays, 10-bit color bandwidth demand will matter.

Most displays will use DSC, which will allow 4K @ 10-bit @ 240 Hz at fairly low compression ratios even with UHBR13.5.

Any issues with DSC are on the GPU end, especially on Nvidia cards where entire features may be unavailable when DSC is in use. Nvidia needs to fix that crap.
UHBR20 is plenty for 4K HDR 240 Hz. With a +15% overhead it's "only" 63.96 Gbps uncompressed.
 
The data rate calculator on LTT forums says DSC reduces data rate by a factor of 3. How is that happening while still being visually lossless, without adding significant delay and processing power requirements?

Edit: it doesn't answer my question, but here's a quote from displayport.org's FAQ:
How does VESA’s DSC Standard compare to other image compression standards?
Compared to other image compression standards such as JPEG or AVC, etc., DSC achieves visually lossless compression quality at a low compression ratio by using a much simpler codec (coder/decoder) circuit. The typical compression ratio of DSC range from 1:1 to about 3:1 which offers significant benefit in interface data rate reduction. DSC is designed specifically to compress any content type at low compression with excellent results. The simple decoder (typically less than 100k gates) takes very little chip area, which minimizes implementation cost and device power use, and adds no more than one raster scan line (less than 8 usec in a 4K @ 60Hz system) to the display’s throughput latency, an unnoticeable delay for interactive applications.
Sounds like the LTT calculator's 3:1 is not at all guaranteed.
 
The data rate calculator on LTT forums says DSC reduces data rate by a factor of 3. How is that happening while still being visually lossless, without adding significant delay and processing power requirements?

Edit: it doesn't answer my question, but here's a quote from displayport.org's FAQ:

Sounds like the LTT calculator's 3:1 is not at all guaranteed.
My understanding is that the devices will just negotiate whatever DSC compression level is sufficient.

You could have, for example, two 4K @ 240 Hz displays where a cheaper one uses 40 Gbps ports and a more expensive model has full 48 Gbps ports. To the end user the only difference would be the DSC compression ratio needed: the 40 Gbps model would use e.g. 2.5:1 and the 48 Gbps one would only need 2:1 compression.

The LTT calculator nowadays represents DSC as bits-per-pixel values because this matches the DSC spec better, and you can see the different compression ratios there.
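A toy sketch of that negotiation idea, under loose assumptions: the quoted port rates are treated as usable payload, blanking is a flat factor, and the candidate list is coarse (real DSC can target much finer-grained bits-per-pixel values):

```python
# Sketch: pick the least aggressive DSC setting that still fits the link.
# Candidate bpp values and link rates are illustrative only; the actual
# negotiation is handled between source and sink.

def pick_dsc_bpp(width, height, refresh_hz, source_bpp, link_gbps,
                 candidates=(30, 24, 20, 16, 12, 10, 8), blanking=1.07):
    for bpp in candidates:                      # from "no compression" downwards
        if bpp > source_bpp:
            continue
        rate = width * height * refresh_hz * bpp * blanking / 1e9
        if rate <= link_gbps:
            return bpp, source_bpp / bpp        # chosen bpp and compression ratio
    return None, None

# the 40 vs 48 Gbps example above, for 4K 240Hz 10-bit RGB (30 bpp source)
for link in (40, 48):
    bpp, ratio = pick_dsc_bpp(3840, 2160, 240, 30, link)
    print(f"{link} Gbps port: {bpp} bpp on the wire, ~{ratio:.2f}:1 compression")
```

The exact ratios come out different from the 2.5:1 / 2:1 example above only because of the coarse candidate list, but the point is the same: the cheaper port just ends up negotiating a bit more compression.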
 
The data rate calculator on LTT forums says DSC reduces data rate by a factor of 3. How is that happening while still being visually lossless, without adding significant delay and processing power requirements?
Two basic non-technical answers:

1) Data is more repetitive than you might think. Quite often a pixel is very close to, or even the same as, the pixels next to it. So if, instead of transmitting each one as new data, you just transmit the differences, it can cut down on how much data you send by a whole lot. This goes all the way back to shit like the Amiga with its "hold and modify" mode, where you kept two of the RGB values the same and just modified one of them. In that case it was used to increase color depth, but the same trick works for decreasing the data required (see the toy sketch after point 2).

2) Your vision is crap. All our vision is crap. We aren't nearly as sensitive to things as we may think we are. Just because there is loss in the mathematical sense doesn't mean you can perceive it. One simple example: you are much less sensitive to blue than to the other primaries, so if blue gradients are done with less precision, you aren't going to notice as easily as if it were green or gray.
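Point 1 is easy to see with a toy example. This isn't how DSC actually encodes anything (it has its own prediction, colour history, and entropy coding), just an illustration of why "send the differences" wins on typical image data:

```python
# Toy delta coding: neighbouring pixels are usually similar, so the
# differences are mostly small numbers that need far fewer bits than
# the raw 8-bit values.

row = [200, 201, 201, 203, 202, 202, 204, 205, 205, 206]    # a smooth gradient

residuals = [row[0]] + [cur - prev for prev, cur in zip(row, row[1:])]
print(residuals)    # [200, 1, 0, 2, -1, 0, 2, 1, 0, 1]

# crude cost estimate: sign bit + magnitude bits per residual vs 8 bits raw
bits = lambda v: max(1, abs(v).bit_length() + 1)
print(8 + sum(bits(v) for v in residuals[1:]), "bits vs", 8 * len(row), "bits raw")
```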
 
DSC doesn't compress noticeably like the heavy compression of YouTube and other streaming services, it's not dynamically downscaling 4K to 1536p or whatever like console games that use dynamic-resolution "4K", and it's not lossy in the way that swapping to 4:2:0 or dropping from 10-bit to 8-bit is (though I wonder if some of the compression is lowering color precision in some sense, like sycraft said of blue).

I'm curious what effect supersampling/downscaling from a higher resolution (DLDSR), or DLSS upscaling, might have - what result you'd get anywhere you can push what is in effect a higher pixel count signal to a screen's lower resolution, compared to "normal" resolution gaming, while using DSC in both scenarios.

I agree with kasakaa that Nvidia has a lot of issues with current GPU limitations.

Another factor that I don't believe was brought up specifically in this discussion is that there are Nvidia limitations for using multiple monitors as well (plus if you ever go to an 8K screen). For example, I've seen reports from a few people who tried using a 240Hz 4K screen plus a few 120Hz screens, or a 5760x2160 plus two other 120Hz screens, and it wouldn't work.


Some of these quotes below are about the G95NC 5760x2160 super-ultrawide gaming display that is supposed to be capable of 240Hz at that resolution, and some other comments are about 8K display limitations, but they are all referencing current Nvidia GPU output limitations overall.

from nvidia's 4090 page's spec sheet:

1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC or HDMI 2.1a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1a + DSC.

2 - As specified in HDMI 2.1a: up to 4K 240Hz or 8K 60Hz with DSC, Gaming VRR, HDR


"NVIDIA's specs for the GeForce RTX 4090 list the maximum capabilities as "4 independent displays at 4K 120Hz using DP or HDMI, 2 independent displays at 4K 240Hz or 8K 60Hz with DSC using DP or HDMI." Could support be added as part of a driver update? That remains to be seen."


Reddit user reply from a g95nc thread:
"I want to clarify how DSC works since I have yet to see anyone actually understand what is going on.
DSC uses display pipelines within the GPU silicon itself to compress the the image down. Ever notice how one or more display output ports will be disabled when using DSC at X resolution and Y frequency? That is because the GPU stealing those display lanes to process and compress the image.
So what does this mean? It means if the configuration, in silicon, does not allow for enough display output pipelines to to be used by a single output port, THAT is where the bottleneck occurs."

"Nividia's own spec notes that only 8k 60hz is feasible using DSC over HDMI 2.1 on their cards by disabling at least one port (it will just disable the one that isn't plugged in), so it's clear all the display pipelines are interconnected for use together. I suppose it may be possible to forcibly disable 2 ports to achieve a high enough internal bandwidth to deal with 240hz at 1/2 8k resolution, but again, that is also determined by the slicing and compression capabilities."

. . .


The 8K on the 900D can only do 60Hz right now, but I suspect that might be a limitation of the current gen of GPUs, since the panel is 120Hz (240Hz at 4K). So maybe if GPUs had enough bandwidth assigned to a single HDMI port it could do 8K 120Hz using DSC. The 57" G95NC can do 120Hz 7680x2160 with DSC, or 240Hz off a DP 2.1 AMD GPU - but again, that's probably because of the way Nvidia allotted the ports on the GPU. HDMI 2.1 should be able to go higher with DSC if a card were designed for it.

From the LTT calculator: it's 42.58 vs 41.92 Gbit/s at 10-bit RGB (4:4:4), so it could probably do 8K 120Hz at 8-bit color, or at 10-bit if they did something like 3.25:1 DSC compression. Alternatively it could just run 115Hz and fit HDMI 2.1 with DSC 3:1, RGB/4:4:4, 10-bit, or run 99Hz at that but with DSC 2.5:1. Not that native 8K gaming will get high fps anyway. It would be cool if it could do 7680x2160 super-ultrawide at a relatively high refresh rate for gaming too.

firefox_hChB6GTDAc.png
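Running the same rough arithmetic against HDMI 2.1's link budget shows why the discussion lands on 8-bit colour or a slightly lower refresh rate: the 10-bit 120Hz case sits right on the limit, so whether it fits depends on the exact timing and overhead figures used. The FRL payload below assumes 4 lanes at 12 Gbps with 16b/18b coding; the blanking factor and the per-pixel DSC targets are my own assumptions:

```python
# Sketch: a few 8K modes against HDMI 2.1 FRL capacity. "Wire bpp" is the
# DSC output target (10-bit RGB source is 30 bpp, so 10 bpp ~ 3:1, 12 bpp ~ 2.5:1).

HDMI21_PAYLOAD_GBPS = 12 * 4 * 16 / 18          # ~42.7 Gbps after 16b/18b coding

def rate_gbps(width, height, hz, wire_bpp, blanking=1.07):
    return width * height * hz * wire_bpp * blanking / 1e9

modes = [
    ("8K 120Hz 10-bit, DSC 3:1",   rate_gbps(7680, 4320, 120, 10)),
    ("8K 120Hz  8-bit, DSC 3:1",   rate_gbps(7680, 4320, 120, 8)),
    ("8K 115Hz 10-bit, DSC 3:1",   rate_gbps(7680, 4320, 115, 10)),
    ("8K  99Hz 10-bit, DSC 2.5:1", rate_gbps(7680, 4320, 99, 12)),
]
for name, gbps in modes:
    print(f"{name}: ~{gbps:.1f} Gbps (margin {HDMI21_PAYLOAD_GBPS - gbps:+.1f} Gbps)")
```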

From the responses in that Tech with KG thread, KG is saying that it is true 4k 240hz, but that the 8k can only do 60Hz. I haven't seen the true 240hz 4k verified elsewhere yet though. If it's true, I'm assuming that the 8k 60hz rather than 120hz capability is a limitation with the lanes on the gpu ports. Nvidia 4000 gpus similarly can't do multiple 4k 240hz screens. So perhaps that might change in the 5000 series.

. . .
 