I do know the hardware supports HEVC, and I'm starting to wonder why 10-bit would be excluded all of a sudden; that's one of the main reasons you'd use HEVC...
I know that on Nvidia cards, 10-bit HEVC encoding didn't arrive until the 1000 (Pascal) series, and B-frame support not until the 2000 (Turing) series.
I know AMD's Vega cards don't do 10-bit encoding, and I can't find any info on whether the Radeon VII can do it.
AMD's Vega Frontier Edition card might be able to with their pro drivers.
It’s just so hard to find this info.
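Since the spec sheets rarely spell this out, one practical workaround is to just ask the encoder itself. This is a sketch using ffmpeg, assuming a build with hardware encoders compiled in; `hevc_nvenc` is Nvidia's encoder, and you'd swap in `hevc_amf` (AMD on Windows) or `hevc_vaapi` (AMD/Intel on Linux) for other vendors. If a 10-bit pixel format like `p010le` shows up in the list, the encoder accepts 10-bit input on that system.

```shell
# Sketch: list the pixel formats a hardware HEVC encoder accepts.
# hevc_nvenc is assumed here; substitute hevc_amf or hevc_vaapi as needed.
# A 10-bit format (e.g. p010le) in the output indicates 10-bit encode support.
ffmpeg -hide_banner -h encoder=hevc_nvenc 2>/dev/null | grep -i 'pixel formats' || true
```

This only tells you what the driver/encoder build exposes on the machine you run it on, so it's a per-system answer rather than a definitive statement about the silicon, but it's often faster than digging through vendor documentation.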