How does a DirectX 11 game have lower overhead in a CPU-limited scenario than DX12?

PontiacGTX

Gawd
Joined
Aug 9, 2013
Messages
808
In this video

Kingdom Come: Deliverance gets better performance on AMD with a Ryzen 9 than on a 2080, so is it the game that favors multicore? Why can't most games be optimized like this, even though it uses DX11 (CryEngine)?
 
To answer what I think you're asking-

Most DX12 implementations are ass, mostly because they are tacked onto DX10 / DX11 engines.

Good implementations have lower overhead, as do Vulkan implementations.
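To make the overhead point concrete: under the DX11 model every thread's draw submissions funnel through one driver-owned immediate context, while DX12 lets each thread record its own command list and hand them all to the queue in one cheap submit. A toy Python sketch of the two submission patterns (not real D3D calls, just the threading shape; all names are illustrative):

```python
import threading

# DX11-style: every thread contends for one lock-protected immediate context.
def submit_dx11(draw_calls, num_threads=4):
    context = []
    lock = threading.Lock()
    def worker(calls):
        for c in calls:
            with lock:  # serialization point: the driver owns a single context
                context.append(c)
    chunks = [draw_calls[i::num_threads] for i in range(num_threads)]
    threads = [threading.Thread(target=worker, args=(ch,)) for ch in chunks]
    for t in threads: t.start()
    for t in threads: t.join()
    return context

# DX12-style: each thread records into its own command list, lock-free,
# then everything is submitted together (ExecuteCommandLists-style).
def submit_dx12(draw_calls, num_threads=4):
    command_lists = [[] for _ in range(num_threads)]
    def worker(i, calls):
        for c in calls:
            command_lists[i].append(c)  # no shared lock while recording
    chunks = [draw_calls[i::num_threads] for i in range(num_threads)]
    threads = [threading.Thread(target=worker, args=(i, ch))
               for i, ch in enumerate(chunks)]
    for t in threads: t.start()
    for t in threads: t.join()
    return [c for cl in command_lists for c in cl]  # single submission
```

Both end up submitting the same work; the difference is where the contention lives, which is exactly what a good (or bad) DX12 port gets right (or wrong).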
 
Well, it seems most developers don't bother optimizing their games for AMD in DX11? Because the results show there is better multicore optimization in this DX11 game than in some of those DX12 titles. Maybe there is an advantage when there is a bottleneck in DX12, but it won't inherently improve multithreading in games like some people expected.
 
It's more that DX11 is DX10 with minor extensions / upgrades in minimum hardware support, and well, everyone has DX10 / DX11 down now. DX12 is still (somehow) new.

Developers aren't optimizing for DX11, that's taken care of. They're failing to optimize for DX12, quite spectacularly.
 
I never bought into the DX12 hype from the time I heard it incorporated parts of Mantle. Everyone else was excited... I never understood why, because Mantle was a shitshow.

Anyways, I did expect them to do better than they have.
 
That's where I was at. DX12 shows so much potential extra performance that it makes little sense to leave it on the table- until you read just how much more difficult it is to implement. Developers were given exactly what they asked for, and now they've shirked off their part.

At least the advent of DXR will start pushing them to overhaul things, and if AMD actually did put the hardware into the next consoles, then we should see a 180 soon.

Hopefully.
 
To expand on what others have said, there is more to a game than just the graphics thread. There's the AI, sound, physics, and more. It can be that this particular game has more AI and physics going on (work that is more easily parallelized), which is why it scales better with more cores compared to other games.
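A toy sketch of that idea, assuming per-entity AI/physics updates are independent of each other (all names hypothetical): independent per-entity work can be fanned out across worker threads, unlike the single render-submission path.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-entity update: a trivial physics integration step.
# Each entity is (position, velocity) and can be updated independently.
def update_entity(entity):
    x, vx = entity
    return (x + vx, vx)

# Fan the independent updates out across a worker pool; this is the kind
# of work that scales with core count regardless of the graphics API.
def update_world(entities, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))
```

A game with lots of this kind of simulation work benefits from more cores even if its rendering is plain DX11.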
 
That was more like Tim Sweeney (Epic) and AMD (Mantle) doing a crusade... which kinda turned out the usual way ;)
 
And DX12-capable GPUs are actually dominant:
[attached chart: upload_2019-8-5_7-17-23.png]
 
Well, any DX11 GPU can be DX12 compatible as long as the drivers are written for it. Fermi (GeForce 4xx series, launched 2010) has DX12 drivers, Intel iGPUs starting from Haswell (2013) have DX12 drivers, and AMD has drivers starting with the HD 7000 series (2012).
 
Sure about Intel?
[attached chart: upload_2019-8-5_8-7-25.png]
 

Yep.

https://techreport.com/news/32191/nvidia-finally-lets-fermi-gpu-owners-enjoy-directx-12/
https://www.pcworld.com/article/295...ort-windows-10s-directx-12-graphics-tech.html

What you are looking at are the feature levels. The older GPUs supported DX12 up to feature level 11_0 (note the use of _ instead of .), while newer GPUs support DX12 up to 12_1. I'm fairly sure there are a few in there that supported only up to 12_0.
 
Then we see differently what DirectX 12 hardware is...
The ones I marked I do not consider DX12 hardware, but more a "software band-aid" in the driver:

[attached chart: upload_2019-8-5_8-58-4.png]
 
Again, feature levels, not DX compatibility. Those same feature levels apply to DX11 as well. For example, DX11.3 requires feature level 12_0 or 12_1. So it's more accurate to say a feature level 12_1 GPU supports DX11.3 and DX12 at 12_1 (DX12 does not have increments, unlike DX11), while a feature level 11_0 GPU supports DX11.0 and DX12 at 11_0. DX12 requires feature level 11_0 at minimum; higher feature levels are additional instruction sets added to DX rather than a DX12-exclusive thing.
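The API-version vs feature-level distinction above can be sketched as a small lookup, assuming the requirements quoted in this post (the table and function names are illustrative, not the actual D3D API):

```python
# Feature levels as (major, minor) tuples, using the _ notation from the post.
FEATURE_LEVEL = {
    "11_0": (11, 0),
    "11_1": (11, 1),
    "12_0": (12, 0),
    "12_1": (12, 1),
}

def supports_dx12_api(level):
    # Per the post: the DX12 API requires feature level 11_0 at minimum,
    # so even older hardware can run DX12 given a driver.
    return FEATURE_LEVEL[level] >= (11, 0)

def supports_dx11_3_features(level):
    # Per the post: DX11.3 features require feature level 12_0 or higher.
    return FEATURE_LEVEL[level] >= (12, 0)
```

So a Fermi-class feature level 11_0 card can be "DX12 compatible" (the API runs) without exposing any of the 12_x hardware features.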
 
So if it incorporated parts of Mantle, you immediately don't like it, but if Mantle had been promoted by Nvidia, would you accept that some of its code went into DX12? (Unlikely this happened, since Mantle and DirectX 12 showed different core usage.) Also, we didn't see SFR or multi-adapter in DX12 until later, if at all.

I don't recall Epic Games pushing for DX12 or Mantle or Vulkan; are you sure about this? I recall Oxide, Firaxis, DICE, AMD, and Microsoft pushing for a better low-level API.
 
All I said was Mantle had issues and I was skeptical of DX12 when I heard it was being at least partially incorporated.

I am not responding to the rest of your post since it has nothing to do with what I said.
 
Without meaning to correct, we should point out that by and large, DX12 incorporates all of DX11, but is largely a software / driver thing. Where we see GPUs that do not support DX12, in general it's first because they did not support the necessary DX11 hardware features, and second because they're too old to bother writing DX12 drivers for, which from what I understand would be pretty rare.

The topic is also talking about the developer perspective. While the hardware being accessed may be the same, DX12 has a radically different application | driver relationship, and the lack of development effort to overcome the challenges of addressing that new relationship is what we should be discussing.
 
I bought this new in March 2015 for $269 with a free game, and it has a factory BIOS from that month and year. The box even says DX12 support back then, but I had to wait all this time (3 to 4 years) to see it play a DX12 game, and it even had FreeSync support. It gets real near the end of the video with effects.

 
Yep... I think a lot of it has to do with the fact that you can't throw it into an engine and just expect it to work. Even with a DX12 or Vulkan specific engine, it will require a lot of tweaking to make it run optimally, at least that's my understanding of it. If you don't spend the time to tweak, you may as well throw it into DX11.
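One concrete example of that "tweaking" burden: under DX11 the driver tracks resource hazards for you, while under DX12 the application has to insert state-transition barriers itself. A toy Python tracker showing the bookkeeping that moves from the driver into the engine (all class and state names are hypothetical, loosely modeled on the D3D12 barrier idea):

```python
class Resource:
    """A GPU resource with an explicit state the app must track."""
    def __init__(self, name):
        self.name = name
        self.state = "COMMON"

class BarrierError(RuntimeError):
    pass

def transition(resource, before, after):
    # DX12-style explicit barrier: the app must know the current state;
    # a DX11 driver would have done this validation silently.
    if resource.state != before:
        raise BarrierError(
            f"{resource.name}: expected state {before}, was {resource.state}")
    resource.state = after

def sample_texture(resource):
    # Reading a resource that is still in a write state is exactly the
    # kind of hazard the engine now has to prevent by hand.
    if resource.state != "SHADER_RESOURCE":
        raise BarrierError(
            f"{resource.name} read while in state {resource.state}")
    return f"sampled {resource.name}"
```

Forget one of those transitions anywhere in a frame and you get corruption or a crash, which is why a half-hearted DX12 port often ends up slower and buggier than the DX11 path it replaced.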
 
The thread's purpose was comparing this DX11 game's multithreading with some DX12 games using different CPUs/GPUs, though.
Well, I see that the only benefit of DX12 is in CPU-limited scenarios (red = CPU bound / CPU bottleneck). This test lacks average FPS figures to determine the performance difference, because Nvidia could well be faster here.
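The CPU-limited point can be sketched numerically: frame rate is roughly capped by whichever side (CPU or GPU) takes longer per frame, so cutting API overhead only shows up when the CPU is the cap. The numbers below are purely illustrative:

```python
def frame_rate(cpu_ms, gpu_ms):
    # The slower side per frame sets the frame time; fps = 1000 / frame_time.
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case: lowering driver/API overhead (cpu_ms) raises fps.
assert frame_rate(20.0, 10.0) < frame_rate(12.0, 10.0)

# GPU-bound case: the same CPU-side saving changes nothing.
assert frame_rate(8.0, 16.0) == frame_rate(5.0, 16.0)
```

Which is why a DX12 win in a CPU-bottlenecked benchmark can evaporate entirely once the GPU becomes the limit.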
 