why is AMD always behind on video decoding support, if decoding is a separate module?

Kdawg
vega 8 came out around the same time as nvidia's 10 series.

my gtx1060 supports vp9 decode at 8k60,
whereas the vega 8 maxes out at 8k30 for vp9 decode.

Seems to be the same deal with RDNA2.
nvidia and Intel Rocket Lake Xe support 8k60 AV1 decoding, whereas RDNA2 tops out at 8k30.

is it that difficult to stay up to date?


you're probably thinking 8k, who cares.
but still. 8k TVs might become mainstream by 2026.
Gotta watch that 8K60 hi-fi porn
 
The first thing that comes to mind is that hardware decoding is a perk but not a requirement. For anything that can't be decoded in hardware, the CPU itself can decode it in software instead.

I'll admit that I don't know exactly how much CPU horsepower would be required to software-decode VP9 at 8k 60fps. This picture, however, seems to indicate that even a 10-year-old quad core (i5-3470) can still easily decode VP9 at 4k 60fps in software. So I would think there's a pretty good chance that a brand-new CPU could do 8k 60fps. AMD obviously knows the capabilities of the CPUs that include Vega 8 graphics. Perhaps they simply concluded that the CPU itself was powerful enough that it didn't need hardware decoding to that extent, and thus it wasn't worth the transistor real estate.

[attached image: axeSsTK.png — VP9 software-decode results across CPUs]
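If you'd rather test this on your own box than trust a chart, ffmpeg can decode to a null sink so you're timing pure decode. A minimal sketch, assuming ffmpeg is on PATH (the filename is a hypothetical placeholder; substitute any local VP9 clip):

Code:
import subprocess, time

# Hypothetical sample filename; substitute any local VP9 clip.
SAMPLE = "vp9_4k60_sample.webm"

# Decode to the null muxer so nothing but decoding is timed.
cmd = ["ffmpeg", "-nostdin", "-v", "error", "-i", SAMPLE, "-f", "null", "-"]

start = time.perf_counter()
subprocess.run(cmd, check=True)
elapsed = time.perf_counter() - start
print(f"software decode took {elapsed:.1f}s")
# If that's well under the clip's duration, the CPU can
# software-decode this resolution in real time.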
 
neither of the chips you are comparing, in three different threads now, supports 8k.
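FWIW, on Linux you can check exactly which profiles the decode block exposes instead of guessing from spec sheets. A minimal sketch, assuming libva-utils (vainfo) is installed; note it lists supported profiles, not maximum resolutions, so it won't settle 8k30 vs 8k60 by itself:

Code:
import subprocess

# Assumes Linux with libva-utils installed; vainfo prints one
# line per codec profile/entrypoint the driver exposes.
out = subprocess.run(["vainfo"], capture_output=True, text=True)
for line in (out.stdout + out.stderr).splitlines():
    if "VP9" in line or "AV1" in line:
        print(line.strip())
# A line like "VAProfileVP9Profile0 : VAEntrypointVLD" means
# fixed-function VP9 decode; no AV1 lines means no AV1 decode.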
 
is it that difficult to stay up to date?
AMD is indeed lagging behind in video decode, and it is a choice made by AMD: Cezanne APUs don't support AV1 decode at all, which is a bit shocking for hardware released in 2021. AMD used to provide "hybrid VP9 decode" up until the Crimson 17.x drivers, which I understand was a 3rd-party licensed decoder implemented in GPU shaders, but it was fraught with problems and they dropped it.

The real issue here is AV1, though. Decoding AV1 in software is very taxing even for current CPUs. For 8K60 you will probably need something like a Core i9-10980XE or Ryzen 9 5900X.
[embedded graph: dav1d 0.9 AVX2 decode benchmarks]

https://www.phoronix.com/scan.php?page=news_item&px=AVX2-dav1d-0.9-Benchmarks
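Some quick back-of-the-envelope math on why 8K60 is such a jump: the raw pixel rate alone is 4x that of 4K60 and 16x that of 1080p60, before you even account for AV1 being heavier per pixel than VP9 or H.264:

Code:
# Raw pixel throughput implied by each decode target.
def mpix_per_s(w, h, fps):
    return w * h * fps / 1e6

targets = {
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
    "8K30":    (7680, 4320, 30),
    "8K60":    (7680, 4320, 60),
}
for name, (w, h, fps) in targets.items():
    print(f"{name:>7}: {mpix_per_s(w, h, fps):6.0f} Mpixel/s")
# 8K60 is ~1991 Mpixel/s: 4x the pixel rate of 4K60 and 2x 8K30,
# which is roughly where the software-decode wall sits.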
 
The first thing that comes to mind is that hardware decoding is a perk but not a requirement. […] This picture, however, seems to indicate that even a 10-year-old quad core (i5-3470) can still easily decode VP9 at 4k 60fps in software. […]


Speaking from personal experience, my father's i5-2400 couldn't play 4k YouTube streams even at 30 FPS using software decode (integrated Intel GPU). I threw a GeForce GTX 1650 in there to handle GPU decode, and it runs like a champ.
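If anyone wants to confirm which decode paths their ffmpeg build can actually use before throwing a card in, it can list its hardware decode backends. A minimal sketch, assuming ffmpeg is on PATH:

Code:
import subprocess

# Lists the hardware decode backends this ffmpeg build supports
# (e.g. cuda, d3d11va, dxva2, vaapi, qsv).
result = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                        capture_output=True, text=True)
print(result.stdout)

From there, something like ffmpeg -hwaccel cuda -i input.webm -f null - should push the decode onto the 1650's NVDEC block instead of the CPU.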
 
I think it's a bit short-sighted to ask why AMD is behind on anything. Companies prioritize different things based on their own abilities. On the video card front AMD is definitely behind, but not by leaps and bounds. Look at FSR: it's already better than what DLSS 1.0 was. Or think of AMD64; Intel is still paying AMD royalties for that. They'll get there.
 
Intel just started to support HDMI 2.0. The PC industry in general is a tech laggard. Great if you like blinking lights though.
 
The first thing that comes to mind is that hardware decoding is a perk but not a requirement. […]

View attachment 372225
That P4 Willamette isn't that old. Right guys? Right?

I would have killed for one of those back in 2000. We had some in the library at my middle school (no clue how they got the money for that). They ran circles around my 166MHz Pentium at home.
 
That P4 Willamette isn't that old. Right guys? Right? […] They ran circles around my 166MHz Pentium at home.
you had a 166MHz Pentium at home in 2000? that's not [H]ard! ;)
 
The first thing that comes to mind is that hardware decoding is a perk but not a requirement. […]

View attachment 372225
I have a 2GHz Core 2 Duo and it usually can't do 720p60 smoothly; there will be some tearing. 30fps is smooth. That's on XP with the iGPU, though.
 
We’ve come so far. I remember having to reduce the size of the video window just to get it playing smoothly.
 