ComputerBase 4K HDR Testing Shows Performance Loss for AMD and NVIDIA

DooKey

Over at ComputerBase they've put up a page showing the results of 4K HDR FPS testing on the RX Vega 64 and the GTX 1080. They tested a total of 12 games and the results were pretty surprising for the NVIDIA part. The GTX 1080 showed greater FPS losses in most of these games compared to the RX Vega 64. If you look at the results you'll see a huge hit for the 1080 in Destiny 2. ComputerBase has contacted NVIDIA about the results but hasn't received a reply yet. I also would have liked to see 1080p or 1440p testing to go along with the 4K results, but you can't always get what you want. Regardless, NVIDIA needs to look into this and provide some answers. Thanks cageymaru.

P.S. Chrome can translate the page for you.

The different behavior has a direct impact on the head-to-head comparison. With classic SDR, the Radeon RX Vega 64 is on average one percent faster, which makes the two effectively equal. With HDR, however, the AMD graphics card is well ahead: the lead grows to ten percent.
 
Well, I guess it's a good thing that HDR on the PC, the monitors, and apparently now the video cards, is all a joke and no one is really using it.
 
This is unacceptable for NVIDIA, 25% loss in Destiny 2 due to HDR? This is BS! They need to get that fixed right away. Consoles are doing it with 0% hit.
 
It's not that big of a difference for either of them... so meh.

Still kind of wacky to see any difference at all, though a few games actually show no difference, so you can't really point to it being a hardware/driver thing or a game thing.

I suspect that the answer as to why it's not zero falls into the 'it's complicated' territory, but since it's now in the wild, hopefully it will get some attention!
 
Can someone please explain to me why there is any difference at all?

Really, there shouldn't be. It could come down to architecture or machine code. If the code assumes all colors are 8 bits per channel, it'll take two operations (per channel) to copy/move/transform 10 bits. It's been a few years since I did anything graphical/shader related, so I'm not willing to put any money on this :p
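To make that bit-width point concrete, here's a minimal C sketch (mine, not anything from the article or a driver; pack_rgba8 and pack_rgb10a2 are just illustrative names) showing why code written around 8-bit channels can't handle 10-bit channels the same way: with RGBA8 every channel sits on a byte boundary, while with RGB10A2 the channels straddle bytes and need shifts and masks.

```c
/* Minimal sketch: 8 bits per channel vs. 10 bits per channel packing. */
#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit channels: each channel lands on its own byte. */
static uint32_t pack_rgba8(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return (uint32_t)r | ((uint32_t)g << 8) | ((uint32_t)b << 16) | ((uint32_t)a << 24);
}

/* Pack 10-bit channels (0..1023) plus 2-bit alpha (0..3):
 * channels cross byte boundaries, so access needs shifts and masks. */
static uint32_t pack_rgb10a2(uint16_t r, uint16_t g, uint16_t b, uint8_t a)
{
    return ((uint32_t)r & 0x3FF)
         | (((uint32_t)g & 0x3FF) << 10)
         | (((uint32_t)b & 0x3FF) << 20)
         | (((uint32_t)a & 0x3)  << 30);
}

int main(void)
{
    uint32_t sdr = pack_rgba8(255, 128, 0, 255);   /* orange, 8 bits per channel  */
    uint32_t hdr = pack_rgb10a2(1023, 512, 0, 3);  /* same hue, 10 bits per channel */

    /* Extracting green: a simple byte read for RGBA8, shift+mask for RGB10A2. */
    uint8_t  g8  = (sdr >> 8) & 0xFF;
    uint16_t g10 = (hdr >> 10) & 0x3FF;

    printf("RGBA8 green: %u, RGB10A2 green: %u\n", (unsigned)g8, (unsigned)g10);
    return 0;
}
```

None of this says where the real bottleneck is; it just shows that "assume a channel is a byte" code paths stop being free once the data is 10-bit.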
 
It's simple, nVidia hasn't had time to either bribe the tester or create a custom routine to detect and cheat at the results.
 
The issue is because of a Windows bug that makes tonemapping repeat itself more than it needs to with Nvidia cards. If that were fixed, there wouldn't be any 10% drops with Nvidia cards (or even 2% drops with Vega), because the GPU doesn't need to do anything different.
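For a sense of what a redundant tonemapping/re-encode pass costs, here's a rough C sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10 output uses, applied per pixel. This is only an illustration of the kind of per-pixel math such a pass involves, not the actual Windows compositor or driver code; pq_encode is just a name I picked for it.

```c
/* Rough sketch: SMPTE ST 2084 (PQ) encode, the per-pixel math behind HDR10
 * signaling. An extra tonemapping/encode pass means running something like
 * this over every pixel of every frame one more time than necessary. */
#include <math.h>
#include <stdio.h>

/* Encode an absolute luminance in nits (cd/m^2) to a PQ signal in [0,1]. */
static double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;        /* standard ST 2084 constants */
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double y  = nits / 10000.0;                /* normalize to PQ's 10,000-nit peak */
    double yp = pow(y, m1);
    return pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main(void)
{
    /* Typical reference points: SDR white (~100 nits) and an HDR highlight. */
    printf("100 nits  -> PQ %.3f\n", pq_encode(100.0));
    printf("1000 nits -> PQ %.3f\n", pq_encode(1000.0));
    return 0;
}
```

Whether the duplicated work happens in the compositor, the driver, or the game is exactly the kind of thing ComputerBase's question to NVIDIA should shake out.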
 
Performance loss compared to what? Did nvidia come out with a new driver that causes performance loss?! I don't get it.
 
I assume this is using DirectX, but it is not clear if this is 11 or 12. I wonder how this would look using Vulkan.
 
Performance loss compared to what? Did nvidia come out with a new driver that causes performance loss?! I don't get it.
vs no HDR.
The graphs give it away when the headline becomes invisible.
 
This is unacceptable for NVIDIA, 25% loss in Destiny 2 due to HDR? This is BS! They need to get that fixed right away. Consoles are doing it with 0% hit.

A number of years back, NVidia had the same problem when you turned on AA.

They bragged about how much faster they were than the ATI cards, but when reviewers turned on AA, the ATI card kicked their butts.
 
Urm, yes, the GTX 1080 is a £500 card, as is the Vega 64, and both perform fairly equally, so they can be used as a baseline. Much like how for the mid-range it would be the GTX 1060 6GB vs the RX 580. No point in using the 1080 Ti because: one, less than 2% of the market has that card; two, there is no competitor equivalent; and three, it has more power still, so the impact may also be less.
 