AMD's RX 7900 XT cards allegedly support unannounced DisplayPort 2.1 connectors

My biggest concern with DisplayPort has always been power feedback through the cable's power pin, which would sometimes burn out a video card.
As far as I know this is only an issue with non-partner cables that don't follow the specification. I have not had any issues with DisplayPort burning out a video card in more than a decade of using it.
It's like "shaders" back in the GeForce 2/3 days... "They used this tech in Jurassic Park?? No way.."

It took until DX 8.1 for shaders to become standard (with HL2/Doom 3).

Honestly, I'm much more intrigued with DLSS/FSR than ray-tracing.

At least until there's a universal standard for ray tracing, which I'll assume AMD will decide with the consoles, etc.

We're still at the GLquake/Vquake stage IMO
Ray tracing is a "universal standard" baked into both DirectX and Vulkan.
 
Here we thought nvidia was on the leading edge, but it was in fact AMD with the real long term plan... my sources are telling me the 8000XT series will also prevent equipment failure in McDonald's ice cream machines.
I also have qualified reports that the simple ownership of one functions as a form of birth control. What a world we live in!
 

VESA RELEASES DISPLAYPORT 2.1 SPECIFICATION

Latest DisplayPort specification provides greater alignment with USB Type-C and USB4; adds new features for more efficient DisplayPort tunneling over USB4

BEAVERTON, Ore. – October 17, 2022 – The Video Electronics Standards Association (VESA®) announced today that it has released DisplayPort 2.1, the latest version of the DisplayPort specification, which is backward compatible with and supersedes the previous version of DisplayPort (DisplayPort 2.0). VESA has been working closely with member companies to ensure that products supporting DisplayPort 2.0 would actually meet the newer, more demanding DisplayPort 2.1 spec. Due to this effort, all previously certified DisplayPort 2.0 products including UHBR (Ultra-high Bit Rate) capable products – whether GPUs, docking station chips, monitor scaler chips, PHY repeater chips such as re-timers, or DP40/DP80 cables (including both passive and active, and using full-size DisplayPort, Mini DisplayPort or USB Type-C connectors) – have already been certified to the stricter DisplayPort 2.1 spec.

Achieving a robust, end-to-end user visual experience remains the utmost priority for VESA’s DisplayPort specification, whether across a native DisplayPort cable, via DisplayPort Alt Mode (DisplayPort over the USB Type-C connector), or tunneled through the USB4 link. As such, DisplayPort 2.1 has tightened its alignment with the USB Type-C specification as well as the USB4 PHY specification to facilitate a common PHY servicing both DisplayPort and USB4. In addition, DisplayPort 2.1 has added a new DisplayPort bandwidth management feature to enable DisplayPort tunneling to coexist with other I/O data traffic more efficiently over the USB4 link. This increased efficiency is on top of mandated support for VESA’s visually lossless Display Stream Compression (DSC) codec and VESA’s Panel Replay capability. DSC bitstream support can reduce DisplayPort transport bandwidth in excess of 67 percent without visual artifacts, while VESA’s Panel Replay capability can reduce DisplayPort tunneling packet transport bandwidth in excess of 99 percent when Panel Replay operation is taking place.
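As a back-of-envelope illustration of those DSC numbers (my own sketch, not from the press release): a "67 percent" reduction corresponds to roughly 3:1 compression, which is DSC's typical visually lossless operating point. Blanking intervals and link-layer overhead are ignored here.

```python
# Rough sketch: uncompressed pixel bandwidth vs. DSC at ~3:1,
# the ratio implied by a "67 percent" reduction.
# Blanking intervals and link encoding overhead are ignored.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbps (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uncompressed = video_bandwidth_gbps(3840, 2160, 120, 30)  # 4K120, 10-bit RGB
dsc_3to1 = uncompressed / 3                               # 3:1 visually lossless DSC

print(f"4K120 10-bit uncompressed: {uncompressed:.1f} Gbps")
print(f"With 3:1 DSC:              {dsc_3to1:.1f} Gbps")
reduction = 1 - dsc_3to1 / uncompressed
print(f"Reduction: {reduction:.0%}")
```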

“Achieving greater alignment between DisplayPort and USB on a common PHY has been a particularly important effort within VESA given the significant overlap in use case models between the DisplayPort and USB4 ecosystems,” stated Alan Kobayashi, VESA Board Chair and VESA DisplayPort Task Group Chair. “DisplayPort 2.1 brings DisplayPort into convergence with USB4 PHY specifications to ensure the highest video performance across a broad range of consumer products. Display transport through DisplayPort, with its higher bit rates and proven visual quality of DSC compression even for HDR content, offers ample bandwidth for the needs of virtually every practical application. Features such as driving multiple displays over a single cable, or enabling multiple functions on a single port like video, power and data transfer, no longer require any compromise in video format choice. The advanced capabilities of the DisplayPort video interface are enabled by the invaluable contributions by our more than 300 member companies from across the electronics ecosystem.”

VESA-DP40-80-Cables-1-e1666018608252.jpg

VESA certified DP40 and DP80 UHBR cables guarantee display connectivity and operation at the highest performance levels for products supporting DisplayPort 2.1. Source: VESA.

DisplayPort 2.1 has also updated the DisplayPort cable specification to provide greater robustness and enhancements to full-size and Mini DisplayPort cable configurations that enable improved connectivity and longer cable lengths (beyond two meters for DP40 cables and beyond one meter for DP80 cables) without diminishing UHBR performance. VESA certified DP40 cables support up to the UHBR10 link rate (10 Gbps), with four lanes, providing a maximum throughput of 40 Gbps, while VESA certified DP80 cables support up to the UHBR20 link rate (20 Gbps), with four lanes, providing a maximum throughput of 80 Gbps.
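The DP40/DP80 figures quoted above are simply the per-lane link rate multiplied by the four lanes in a DisplayPort main link; a trivial sketch of that arithmetic:

```python
# DisplayPort cable throughput = per-lane link rate x 4 main-link lanes.
LANES = 4
UHBR_RATES_GBPS = {"UHBR10": 10, "UHBR13.5": 13.5, "UHBR20": 20}

def cable_throughput_gbps(uhbr: str) -> float:
    return UHBR_RATES_GBPS[uhbr] * LANES

print("DP40:", cable_throughput_gbps("UHBR10"), "Gbps")  # DP40 cable: 40 Gbps max
print("DP80:", cable_throughput_gbps("UHBR20"), "Gbps")  # DP80 cable: 80 Gbps max
```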

“For all of our standards including DisplayPort, VESA has invested significant resources in testing and auditing procedures, including interoperability testing of products incorporating VESA specifications. This is to ensure that products that are introduced to market that claim support of VESA’s standards meet the high-quality benchmarks that we have established,” stated James Choate, compliance program manager for VESA. “VESA continues to investigate and develop new procedures to improve our auditing process in order to ensure robust implementation of quality products supporting DisplayPort and other VESA specs in the market. Thanks to the contributions by test equipment vendors, VESA has a solid test infrastructure in place to support wider testing and deployment of DisplayPort 2.1 certified devices in the marketplace.”
 
VESA has barely managed to finish certification on the DP 2.0 hardware, and AMD is going to release a GPU with a pre-release of a yet-unreleased and unannounced standard that has no certified devices to test it with.
That seems a bit far-fetched…
I mean, AMD has barrelled ahead plenty of times with things ahead of their time, so they could. But I just wonder about longevity and feature support when you jump the gun on an incomplete, unconfirmed standard.
And this one just for you, the all-knowing about everything.... ;)

 
Well that’s good, VESA bubble fucked their way through the 2.0 certification process and took 3 years to get the hardware certified so that companies could actually start releasing qualified devices. If they had scrapped that 2.0 work, vendors would be pissed.

But 2.1 on initial inspection just seems to tighten signalling and cable requirements to better support USB4. So not a huge change over 2.0, but I am glad that during the huge delays in 2.0 development they made it robust enough to support the USB4 specifications.

So: fatter cables and a firmware update. AMD’s previous 2.0 work is what’s going in there, not something completely new, which makes a lot more sense in that context.
 
That is the most awesome, "I was wrong, sorry," statement I have ever seen.
 
Yeah, I do try.
I am not unhappy to be wrong about these things.
Now the 2.1 spec actually looks decent enough for Nvidia to switch to. When the 2.0/2.1 displays start appearing in 2023, I am sure Nvidia will make sure they have some stupidly pretty GSync (certified) displays to go along with them, which will confuse everybody because they don't have any 2.0/2.1 GPUs. They'll get that rectified in 2024 with whatever they announce as their next GPU series.
 
Now I need to know: when they say they brought signaling more in line with USB4, are they talking about the 1.0 spec from 2019 or the 2.0 spec that was announced in September of this year but isn't being unveiled until the Developer Days event in November?
And is this maybe why Zen 4 doesn't have USB4 support but Zen 3+ will? Because they are just skipping the USB4 v1 spec and going right to v2...
Bah, I love new things!
 
The 4090ti will probably incorporate it.

The DisplayPort 2.1 standard is moot, or more of an academic exercise, unless you have an extreme-refresh-rate monitor with OLED's lack of latency. Sure, you can overdrive a lot of TN, VA, and even now IPS panels to an exceptional degree, but I don't want to deal with the kind of absurd inverse ghosting that results from doing so. No, that isn't me further self-justifying buying a 42-inch C2 OLED when the price dropped for the seemingly seasonal Prime Day.
 
DP 1.4 can't drive a 4K monitor at 120 Hz with 4:4:4 chroma. The max it can do is 98 Hz. Good IPS screens can go up to 240 Hz before the pixels can't keep up anymore.
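For anyone wanting to check that claim, here is a rough estimate (my own, ignoring blanking intervals, which shave several percent off the achievable refresh): DP 1.4's HBR3 link runs 8.1 Gbps per lane across four lanes, and its 8b/10b encoding leaves 80% of that for data. The quoted ~98 Hz lines up with 10-bit color once blanking overhead is included.

```python
# Back-of-envelope DP 1.4 (HBR3) refresh-rate ceiling at 4K.
# 8.1 Gbps/lane x 4 lanes, with 8b/10b encoding (80% efficiency).
# Blanking intervals are ignored, so real-world maximums are lower.

payload_gbps = 8.1 * 4 * (8 / 10)   # 25.92 Gbps usable for pixel data
pixels_per_frame = 3840 * 2160

def max_refresh_hz(bits_per_pixel: int) -> float:
    return payload_gbps * 1e9 / (pixels_per_frame * bits_per_pixel)

print(f"4K 8-bit 4:4:4:  ~{max_refresh_hz(24):.0f} Hz")  # ~130 Hz before blanking
print(f"4K 10-bit 4:4:4: ~{max_refresh_hz(30):.0f} Hz")  # ~104 Hz before blanking
```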
 
Yeah, but unless you want to play with DLSS 3 and its frame-generation shenanigans, you are going to be hard-pressed to push 240 fps in anything at 4K unless you are happily playing in potato mode, which sort of renders the whole 4K display pointless.
4K gaming is just so weird to me: to make it enjoyable you need to render at 1440p and upscale, and then to get great framerates you need to further interleave real and "fake" frames, at the expense of sluggish-feeling controls because your latency and framerate are so disproportionate.
Does this mean, though, that AMD is hoping their cards will be able to do 120+ fps? And will they announce it alongside some FreeSync Premium Pro certified displays?
 
My point was that DP 2.0/2.1 isn't simply an academic exercise. There is practical use for the increased bandwidth.
 
DP 1.4 can't drive a 4K monitor at 120 Hz with 4:4:4 chroma. The max it can do is 98 Hz. Good IPS screens can go up to 240 Hz before the pixels can't keep up anymore.
HDMI 2.1 at 48Gb/s can though for 4K120, and recent Radeon as well as Nvidia products have HDMI 2.1 ports as well as your better TVs and monitors. DP 2.0, and thus 2.1, has about 60% more bandwidth than HDMI 2.1... so that would translate to extra frames at 4K or obscene framerates at 1440p or 1080p. Of course, if those framerates are offset by high lag from the panel its benefit is significantly lessened just as the benefit is lessened if there is any meaningful amount of inverse ghosting.
As long as ray tracing isn't on, and especially if using FSR or DLSS 2, high-end cards (3080 or better / 6800 XT or better) are generally at or above 120 fps average in most games. New hardware is eclipsing 120 for the 1% lows.
 
My point was that DP 2.0/2.1 isn't simply an academic exercise. There is practical use for the increased bandwidth.
Especially in VR, but yeah, I get that it's needed; I don't contend that at all. I just feel that both AMD and Nvidia have done a piss-poor job of lining up their new cards with displays that actually use the inputs they put on there.
AMD has DP 2.0 and 2.1, but VESA and their launch partners (of which AMD is one) have utterly failed at getting displays with those inputs to market, while Nvidia built a GPU capable of blowing past the upper limits of DP 1.4a and launched with it anyway.
I suppose Nvidia did at least put an HDMI 2.1 port on there, and there are a number of G-Sync-compatible 4K TVs on the market that work with HDMI 2.1, so that is something. But it still seems so very unplanned; you would think that with all these companies spending billions on new product launches, they would communicate a little better and keep their shit together.
 
I got the Samsung Ark 55. Doing 4K 165 Hz over HDMI. It's really dope, never loved scrolling so much lmao.
 
And then one day you catch hell because you forgot to check what tape was in the device and it turns out someone had swapped a movie that you end up recording over.
Hopefully not the parent's 'special' home movie...
 
Finally: it's the 54 Gbps version they have (UHBR13.5, not the 20 Gbps-per-lane UHBR20), which is 12.5% more than the older HDMI 2.1's 48 Gbps. I imagine that means it will be using DSC for those high framerates.

https://www.hardwarezone.com.sg/feature-amd-radeon-rx-7900-xtx-rdna-3-gpu-engneering
The data rate for UHBR13.5 is 52.22 Gbps. Doesn't even support 4K at 240 Hz. Fail.

I'm only half joking. Remember when people criticized LG for not using the full bandwidth of HDMI 2.1, even though it was still within specification (FRL5)?
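One way the 52.22 Gbps figure falls out (my own sketch): UHBR13.5 is 13.5 Gbps per lane across four lanes, or 54 Gbps raw, and VESA's published maximum payload for the 80 Gbps UHBR20 link is 77.37 Gbps, implying roughly 96.7% payload efficiency after 128b/132b encoding and other overhead. That 77.37 figure is from VESA's DP 2.0 materials, not from this thread.

```python
# Estimated usable payload for DP 2.x UHBR link rates, assuming the
# ~96.7% efficiency implied by VESA's published 77.37 Gbps max payload
# for the 80 Gbps UHBR20 link (128b/132b encoding plus other overhead).

def effective_payload_gbps(per_lane_gbps: float, lanes: int = 4,
                           efficiency: float = 77.37 / 80) -> float:
    return per_lane_gbps * lanes * efficiency

print(f"UHBR13.5: {effective_payload_gbps(13.5):.2f} Gbps")  # ~52.22 Gbps
print(f"UHBR20:   {effective_payload_gbps(20):.2f} Gbps")    # 77.37 Gbps
```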
 
Well, worst case, they have a bunch of stuff implemented in hardware that doesn't do anything on other standards-compliant hardware. Next worst would be that some of it is in firmware/drivers and they can make it work. And if things go well, then it all works according to the standard specifications without much tweaking at all. (Edit: which, apparently, shouldn't be difficult since it's not far removed from 2.0)
 
I just don't understand why they are doing this.

To get a paper win?

None of these GPUs are pushing the kind of performance that means they really benefit from anything above what 1.4a supports anyway, at least not in modern titles.

I mean, sure, you have to roll in new standards at some point, but in the grand scheme of things this seems completely inconsequential.
 
The gap with HDMI 2.1 is really small (at least bandwidth-wise), but coming from DP 1.4a they go from 32.4 to 54 Gbps; that's about 67% more data.

Going from 1.4 to 2.0/2.1, DisplayPort gains everything HDMI 2.1 has over it (e.g., dynamic HDR).

There is also a bit of "why not": if you put DisplayPorts on your high-priced cards, why not some version of DP 2 instead of 1.4, like Intel and AMD did? How much costlier can it be?

To get a paper win?
Maybe a bit, in some cases. Intel GPUs looked good on paper pre-launch with their DisplayPort 2.1, but ended up with less bandwidth than HDMI 2.1 and no HDMI 2.1 support.
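The raw link-rate gaps being discussed are easy to check (raw aggregate rates, ignoring encoding overhead):

```python
# Raw aggregate link rates (Gbps) for the interfaces in this thread.
dp14 = 32.4      # DP 1.4a, HBR3 x 4 lanes
hdmi21 = 48.0    # HDMI 2.1, FRL
uhbr135 = 54.0   # DP 2.1, UHBR13.5 x 4 lanes

print(f"UHBR13.5 vs DP 1.4a:  +{uhbr135 / dp14 - 1:.0%}")    # about +67%
print(f"UHBR13.5 vs HDMI 2.1: +{uhbr135 / hdmi21 - 1:.1%}")  # +12.5%
```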
 
To get a paper win and to drum up hype for what turned out to be a mediocre release.
 
It's a forward-looking bonus more than a selling feature, but it can help add longevity to a card's usefulness beyond its gaming lifespan. I have an old system that's limited to 1080p due to the HDMI version on its old card. I don't use it much, but whenever I do, I'm reminded how awful 1080p on a 27" screen is.
 