Which is why [H] doesn't use built-in benchmarks. That makes a lot of sense, thanks for the clarification.
Interesting to see the 980Ti take a much bigger hit in actual gameplay vs benchmark compared to Fury/FuryX
Hope springs eternal.
I am leaning toward this being more of an MMX situation though: if developers end up focusing on the low level, it's going to force the hardware vendors to stick to extending rather than rebuilding. The GPU world has had the advantage of not having to stick to a register set like x86 or MMX; they have been free, for the most part, to do what they want in hardware and then just write a driver to link it to the high-level stuff. A few years of games written to take advantage of the low-level details of one generation of chips might make a change-up product a real issue. Anyway, yeah, we agree.
EDIT... ok, I couldn't help this one. There is a solution to all the terrible hardware stagnation I am talking about. Anyone remember Transmeta? Bad joke... but imagine a future GPU with a hardware/software translation layer so that fancy new ATI XYZ5000 card can emulate a FuryX. Only half joking; if they're talking about hardware translation layers in a few years, I'll remember this day. The day the software guys got their hardware access... and that future day as the day the hardware guys put a software layer in between. lol
Because DX11 games had no bugs? And now we see the complete mess that is attempting to make a to-the-metal API on a flexible platform with an ever-increasing number of chips and architectures to deal with. Developers don't, and never will, have the time to get it all working well. At best I suspect we'll see tightly optimized DX12 paths for the current hotness cards, and a DX11 "fallback path" which will probably work better for everything else.
If you think it is bad now, wait 5 years and go back to play an older DX12 title that has no idea how to optimize for your GPU.
This is much bigger than just bugs... Because DX11 games had no bugs?
If it means more jobs for hardware engineers in videogame dev studios I'm all for it. Yes, I'm not referring to bugs at all. Someone has to know a lot about the hardware to optimize it well. The burden is split with DX11, but the way the abstraction works, it gives the GPU vendor a lot of leeway in handling things, and even working around suboptimal code from the application. This is partially why there are "game-ready" drivers.
The major engine vendors (Unreal, CryEngine, Unity, etc.) will hopefully make this less of an issue for most game devs. I am skeptical that we'll end up with a net win in most cases though.
I remember seeing slides last year with things like "Developers have more responsibility, drivers matter less" and my initial thought was... Boy, PC gamers are screwed. lmao. DX12 opens up more possibilities with better CPU utilization, combining of different processors, etc. The masters of it will come over time, which will make DX11 stuff look like Nintendo stuff of yesteryear. There used to be many more hardware vendors, so an API that is more abstracted from the hardware made sense back then. Basically today we have three: AMD, Intel and Nvidia, where Intel uses Nvidia patents (maybe AMD patents in the future). Having that extra limiting layer would really hinder progress for only three hardware platforms.
I wonder why some of the developers are releasing these DX12-patched games. Is it really just to see how it flows with the hardware out there currently? For most DX12-enabled games it is not bringing much to the table at all, more like taking some trimmings away. AotS does show the advantage of it for CPU usage, which should mean in the future more taxing code can be written for the higher-end CPUs and GPUs out there.
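For what it's worth, the CPU-side win people usually point to is that D3D12 lets you record command lists on several worker threads and hand them to the queue in one submission, instead of funneling everything through a single immediate context like DX11. A rough sketch of that pattern, assuming the device and queue already exist, and with RecordDrawsForChunk as a purely hypothetical stand-in for per-thread draw recording:

// Sketch: record D3D12 command lists on worker threads, then submit them
// together with one ExecuteCommandLists call. PSO/root signature setup and
// error handling are omitted.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical placeholder for whatever per-thread scene recording you do.
void RecordDrawsForChunk(ID3D12GraphicsCommandList* /*cl*/, int /*chunk*/) {}

void SubmitSceneMultithreaded(ID3D12Device* device, ID3D12CommandQueue* queue, int threadCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < threadCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each thread records its own list independently -- the part DX11
    // effectively serializes through the driver's immediate context.
    for (int i = 0; i < threadCount; ++i)
        workers.emplace_back([&, i] {
            RecordDrawsForChunk(lists[i].Get(), i);
            lists[i]->Close();
        });
    for (auto& w : workers)
        w.join();

    // One submission of everything the workers built.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists)
        raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}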
I remember seeing slides last year with things like "Developers have more responsibility, drivers matter less" and my initial thought was... Boy, PC gamers are screwed. lmao.
AMD & Nvidia have entire teams dedicated to improving driver performance in DX11 and previous. There was an interview recently where one of the driver team guys talks about how broken AAA game code is on day one and they have to fix basic coding errors just to make the games run properly at launch. Putting all of that into the devs' hands is going to create problems, at least for now.
DX12... Great for consoles, great for Microsoft's attempt to 'unify' PC and Xbox. PC optimization goes out the window in a one-size-fits-all approach. Especially for Nvidia GPUs, since they don't have any correlation with the PS4/XBO. You can't optimize for GCN-based consoles and neglect 80% of PC gamers and expect to have success in the PC market.
Pay attention, Microsoft.
I wonder if it's possible to code a DX12 path that uses very minimal basic calls that tend to be more or less standard.
Yes, it's called DX12. This isn't DOS, you're not directly interfacing with hardware.
If you're after absolute peak performance then of course you can find yourself writing different implementations. This is nothing new...
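On the "minimal basic calls" idea: even the smallest D3D12 path has to stand up the basics explicitly, and that part really is vendor-neutral; the divergence only starts with resource layouts, barriers, and so on. A bare-bones sketch (default adapter, no swap chain, no error handling beyond the HRESULT checks), just to show what the common boilerplate looks like:

// Minimal sketch: create the device on the default adapter and one direct
// command queue. Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool CreateMinimalD3D12(ComPtr<ID3D12Device>& device, ComPtr<ID3D12CommandQueue>& queue)
{
    // nullptr = default adapter; 11_0 is the lowest feature level D3D12 accepts.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    return SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue)));
}

The per-GPU tuning that the thread is arguing about lives after this point, not in this boilerplate.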
To be fair, this is a DX11 game updated to DX12.
No, my point is about calling it updated to DX12 when DX12 for all intents and purposes is broken for all PC gamers. You are going to have to try much harder than that to put words into my mouth, silly n00bie. So the broken DX11 games are non-API games then?
Cool particle system. Did the DX12 code you wrote really offer you anything that DX11 wouldn't have? I do understand it is possible to optimize, but then you're aiming at driver hooks, not actual hardware... or am I way off on that as well? I admit I don't do any 3D work.
I definitely liked your point of back porting to DX11 from DX12. It is what I surmised was giving AMD this previously unseen DX11 boost we have seen in these new DX12/11 games. (although I don't necessarily mean backporting as much as I mean using what was learned from DX12 in programming for DX11 - protection from the LITERAL Nazis). For this it was likely slower. The extent of my effort was to just get it working and I skipped even "easy" optimizations. It works, but I was stalling the GPU. In DX11, the driver could do some of this for me.
There are clear wins in the design though, and it seems much easier to map an efficient DX12 design backwards to an efficient DX11 one than the other way around. This is a big one.
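To make the "stalling the GPU" point concrete: in D3D12, frame synchronization is done with explicit fences, and the lazy version below blocks the CPU after every submission, leaving the GPU idle while the next frame is recorded. This is not the particle demo's actual code, just a minimal sketch of the pattern a DX11 driver would normally hide from you:

// Sketch of the naive wait-after-every-submit pattern. A real renderer keeps
// per-frame fence values and only waits when it needs to reuse a frame's
// resources, so CPU recording and GPU execution can overlap.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void SubmitAndWaitNaively(ID3D12Device* device, ID3D12CommandQueue* queue,
                          ID3D12CommandList* const* lists, UINT count)
{
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);

    queue->ExecuteCommandLists(count, lists);

    // Ask the GPU to signal value 1 when it finishes, then block the CPU on it.
    // Doing this every submission serializes the two processors.
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
}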
Hitman was developed in DX11 then half-assedly ported to DX12. I definitely liked your point of back porting to DX11 from DX12. It is what I surmised was giving AMD this previously unseen DX11 boost we have seen in these new DX12/11 games. (although I don't necessarily mean backporting as much as I mean using what was learned from DX12 in programming for DX11 - protection from the LITERAL Nazis).
Well, here is an interesting thing: pcgameshardware just did a benchmark test of Episode 2 and the performance has improved overall for both manufacturers.
However, what is interesting is that the NVIDIA 980 Ti showed notable improvements with DX12 at 1080p, which then vanished at 1440p and above.
Hitman (2016) Episode 2: DX12 with up to 60 percent performance gains - and problems
Scroll down to the first chart and use the tabs to change resolution and also switch between DX11 and DX12, no need to translate unless you wish to read the article.
Maybe an area for Kyle/Brent to investigate if they revisit Hitman.
Also of interest is how they noticed a graphics quality change for both manufacturers dependent upon a couple of variables; this is shown before the performance chart.
Cheers
Thanks for confirming, and no idea how I missed that, doh. PCGH stated they are doing a custom run-through of an area with very heavy draw calls and CPU load. Most likely using Intel's PresentMon to record the fps, since that's what they used in their earlier Quantum Break DX12 review.
Leldra which driver are you using?
pcgameshardware used 364.96 hotfix.
That said, bear in mind they actually have the NVIDIA card beating AMD, but they are using PresentMon rather than the internal preset test.
Also, there is something strange about visual quality for both manufacturers; it might be worth checking the memory protection option as well.
Hitman (2016) Episode 2: DX12 with up to 60 percent performance gains - and problems
Also, if interested, try comparing performance between 1080p and 1440p with DX11 vs DX12.
Cheers
No, the issue now is there is heavy blur with AMD and the memory limitation didn't change that. No one truly knows what the issue is, just a lot of guessing. It's ieldra, by the way.
I just tested 4K. This is at 1490/7000.
I'm running 365.10 driver
Yeah, I didn't really understand what they were saying in German, something to do with texture filtering, but afaik it was solved when they added the 'override memory limitation' setting in the options for DX12.
No, the issue now is there is heavy blur with AMD and the memory limitation didn't change that. No one truly knows what the issue is, just a lot of guessing.
I just use Chrome with the translation option enabled within settings. Well, they were forcing settings from the driver control panel, so I understood it to be related to texture filtering, and all the rest was German so...
Any Germans here?
Or just people who speak German?
Or, even more simply, people with AMD cards who can test and post screenshots.