NVIDIA VOLTA's TITAN V DX12 Performance Efficiency @ [H]

Great review, and it seems like the only place nowadays that does this type of analysis.

Seeing how AMD's Vega architecture performed between DX11 and DX12 compared to NVIDIA might be interesting, except that AMD's DX11 drivers never really compared to NVIDIA's DX11 drivers. In many cases AMD may look like it gets a superior DX12 performance increase over NVIDIA, but the comparison can't really be made because AMD's DX11 drivers were not as well optimized as NVIDIA's. So using just the 1080 Ti and Titan V was brilliant and does not muddle up the comparison of the Pascal and Volta architectures. Also, NVIDIA's DX11 drivers are the gold standard, highly polished and optimized, and game developers were really well supported by NVIDIA in the past. Having DX12 beat all of that speaks highly of DX12 and the future promise of its capability.

The conclusion, I think, was right on, and game developers should move to DX12.

Another awesome read.
 
Thanks for putting in the effort to bring us this review, [H]!

It's amazing that a lot of developers haven't optimized their game engines to take more advantage of DX12 and/or Vulkan yet. Unless the drivers are intentionally sandbagging older architectures to persuade consumers to buy newer graphics cards...
 
Might be worth keeping in mind that, drivers aside, the actual games might not be optimized for DX12 on NVIDIA hardware. I would think any game with an Async On/Off toggle would have similar paths for all DX12 hardware, but that may not be the case.
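For anyone unfamiliar with that toggle: "Async" here means asynchronous compute, where the engine submits compute work on a separate D3D12 queue so it can overlap with graphics work. A minimal sketch of that setup, assuming an already-created ID3D12Device pointer named device (a hypothetical name) and with error handling omitted:

// Minimal sketch: creating a dedicated compute queue alongside the
// graphics queue so compute work can overlap with rendering.
// Assumes `device` is a valid ID3D12Device*; error handling omitted.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Direct queue: accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Compute queue: accepts compute and copy work only. Work submitted
    // here can run concurrently with the direct queue; how much actually
    // overlaps is up to the hardware and driver.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));
}

The Async On/Off option in these games generally decides whether compute passes go to a queue like this or stay serialized on the graphics queue, so how much you gain depends heavily on how well each architecture overlaps the two.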

Any chance of adding CPU utilization to the graphs in the future? It would show whether DX11/DX12 is bottlenecked at certain points, or whether one card may be better with mid- to lower-range CPUs. It would be useful for future shared power budget benchmarks as well. Even in CPU comparisons it might help to graph just the maximum single-core utilization, to get an idea of whether core count or clock speed/IPC mattered more.
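One way to capture that, as a rough sketch: sample the per-core "% Processor Time" counters during a benchmark pass and log the busiest core each interval. This uses the Windows PDH API; the sampling loop, duration, and output format here are just placeholder choices:

// Sketch of a per-core CPU utilization sampler that could run alongside
// a benchmark pass, logging the busiest logical core each second.
// Windows PDH API; link with pdh.lib. Error handling kept minimal.
#include <windows.h>
#include <pdh.h>
#include <cstdio>
#include <cstdlib>
#include <cwchar>
#pragma comment(lib, "pdh.lib")

int main()
{
    PDH_HQUERY query = nullptr;
    PDH_HCOUNTER counter = nullptr;
    PdhOpenQuery(nullptr, 0, &query);
    // Wildcard path expands to one instance per logical core (plus _Total).
    PdhAddEnglishCounterW(query, L"\\Processor(*)\\% Processor Time", 0, &counter);
    PdhCollectQueryData(query);  // first sample primes the counters

    for (int i = 0; i < 60; ++i)  // ~60 seconds of sampling
    {
        Sleep(1000);
        PdhCollectQueryData(query);

        // Query required buffer size, then fetch the formatted values.
        DWORD bufSize = 0, itemCount = 0;
        PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE,
                                     &bufSize, &itemCount, nullptr);
        auto* items = (PDH_FMT_COUNTERVALUE_ITEM_W*)malloc(bufSize);
        PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE,
                                     &bufSize, &itemCount, items);

        double maxCore = 0.0;
        for (DWORD n = 0; n < itemCount; ++n)
        {
            if (wcscmp(items[n].szName, L"_Total") == 0) continue;  // skip aggregate
            if (items[n].FmtValue.doubleValue > maxCore)
                maxCore = items[n].FmtValue.doubleValue;
        }
        printf("max single-core utilization: %.1f%%\n", maxCore);
        free(items);
    }
    PdhCloseQuery(query);
    return 0;
}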
 
Has there EVER been a game EVER that universally performed better under the new DX API compared to the last one?! :confused:

DX10 versus 9 was outright hilarious, never gonna forget that one! Though you could arguably blame that mostly on Vista.
 
I get that the drivers are not game-optimized for the Titan V and that there could be improvements in further releases, but the DX12 results don't show much improvement over the DX11 results where there is an improvement, and in some cases the 1080 Ti shows a bigger improvement between the two APIs. It might just be my way of interpreting the results, but its DX12 performance doesn't seem to have improved by much for a new architecture.
 
This tells me Volta isn't much different from Pascal; they both behave similarly. I wasn't expecting much difference, tbh, but this pretty much outlines the (lack of) difference. I think recent assessments that we're nearing the end of architectural improvements are true, and we'll just be seeing a die shrink/more CUDA cores/higher frequency every couple of years for a while from both camps.
 
Shouldn't you also add DOOM and AotS, and maybe an update later with FC5?



This tells me Volta isn't much different from Pascal; they both behave similarly. I wasn't expecting much difference, tbh, but this pretty much outlines the (lack of) difference. I think recent assessments that we're nearing the end of architectural improvements are true, and we'll just be seeing a die shrink/more CUDA cores/higher frequency every couple of years for a while from both camps.
Maxwell isn't too different from Pascal, if it were not for the use of proper asynchronous compute.
 
A really good test that I feel would have been good here would be Hitman with DX11, DX12 with Render Target Reuse DISABLED, and DX12 with Render Target Reuse ENABLED.

That game easily has one of the best DX12 implementations I've seen with regard to performance enhancements; historically speaking, Render Target Reuse Enabled has had the inverse effect on Pascal versus Disabled.
 
I just realized the review charts remind me of...

[Image: Thundercracker_specs-397.jpg]
 
Has there EVER been a game EVER that universally performed better under the new DX API compared to the last one?! :confused:

DX10 versus 9 was outright hilarious, never gonna forget that one! Though you could arguably blame that mostly on Vista.

Exactly. Every new DX is heralded by many as bringing massive performance improvements. Never has this happened (often the opposite). Not once, and it's not going to happen with DX12.

IQ and effects improvements, maybe (which is really where the focus should be).
 
Great review, and it seems like the only place nowadays that does this type of analysis.



The conclusion, I think, was right on, and game developers should move to DX12.

Another awesome read.

Should game developers all move to DX12 or should they just move to Vulkan and be done with it?
 
I know cryptomining has become a bit of a dirty trigger word 'round these parts, but I'd like to see how well it hashes compared to older archs.
The article is a DX11/DX12 comparison of Pascal and the first available Volta architecture. The review was very professional and straight to that point, using game data, which is what DX11 and DX12 are for. Eventually cryptocurrencies may end up in games, being digital, but I have not seen any significant movement in that direction yet. Plus, what you ask for is already available on other sites and even here in the forums.
 
Not a “new” architecture IMHO, more an existing architecture with a few bolt-on extras and a node shrink. This is just my opinion.

The days of truly new graphics architectures have been gone for many years. AMD started these games in the GPU world with the “GCN” design, and of course AMD was inspired by Intel's book of minor incremental improvements, touted by marketing as major breakthroughs.
 
Oh is it now? I don't think you could qualify that statement to save your life.

Kindly grow up and take your sarcasm and aggression elsewhere, kid.
lol, I don't think you will get too far here. The article is about a Pascal and Volta DX11 and DX12 comparison, whether the new architecture has improvements, and also Vulkan. If you can't handle what the discussion is about (hint: it is definitely not about crypto), bye bye. There are other threads you can go to for that information, or other sites. Thank you.
 
Great review, and it seems like the only place nowadays that does this type of analysis.

The conclusion, I think, was right on, and game developers should move to DX12.

Another awesome read.

Developers are going to be extremely resistant to DX12, and so far nothing addresses the two big problems: the implementation is Windows 10 only, and the workload is pushed onto the developer. Between those two, DX12 might just be DOA except for a few titles (maybe ray tracing will change that?).
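To illustrate the "workload pushed onto the developer" point: under DX11 the driver tracks resource hazards for you, while under DX12 the application has to issue explicit state transitions itself. A rough sketch, assuming a valid command list cmdList and a texture renderTarget (both hypothetical names):

// Sketch: in D3D12 the application, not the driver, must transition
// resources between states (e.g. render target -> shader resource)
// before they are used differently. Getting this wrong is on the developer.
#include <d3d12.h>

void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                ID3D12Resource* renderTarget)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = renderTarget;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}

In DX11 that hazard tracking lives inside the driver, which is part of why well-tuned DX11 drivers are so hard to beat and why DX12 gains depend so heavily on each engine's own work.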
 
APIs and price aside, the Titan V undoubtedly shows significant performance advantages. I suppose this may just be its CUDA core count advantage, but it is still impressive to me. These are 4K and 1440p benchmarks we are talking about here, after all.
 
Found it interesting in Sniper Elite 4 how HUGE the maximum numbers spiked, while not having nearly as significant an effect on the average (although still a good bump in that game).

Wonder what it is about those scenes and Async that causes it to just kick ass with a nearly 50% boost over the maximum with Async off.
 
Silly question: is the Xbox DX12? Wouldn't that alone keep DX12 relevant? I suspect this is the reason for AMD GCN's outperformance in DX12 relative to Maxwell, Pascal, and Volta.
 
Good test. I was thinking Hitman, Ashes, and the Futuremark API overhead test would have been fun to look at.


Also, wow, I am a little surprised by how little gain DX12 has on average... I was hoping hard they would have struck gold by now... Sigh.
 
Silly question: is the Xbox DX12? Wouldn't that alone keep DX12 relevant? I suspect this is the reason for AMD GCN's outperformance in DX12 relative to Maxwell, Pascal, and Volta.

It's a DX11.x variant that's a subset of DX12. DX12 is rumored to be built into the hardware in Scorpio.
 
I just read the article. Impressive, nice work. I see the future here. Question: since the Titan V has more memory, what will be the benefit? What will happen if NVIDIA makes this the next gaming card? We already know that prices will not be the same; that I get, and it is expected. I do own The Division. Normally I use that game more because of how much is on screen: the changes between day, night, fog, and snow, and the mix of fog and snow. The more that is on screen, the more the fps drops; during the day the numbers go up. The changes against the sun and snow really hit the fps, and hard. That is unique to the game. Even a similar in-house game like Wildlands, even with better graphics, lacks those impacts. It would be nice to include the new one from Ubisoft, Far Cry 5; I will start testing it today and see how it impacts my old 1080. I am not so savvy in these areas; I speak more as a consumer, of what I see on my screens and how that changed from one 60 Hz monitor, to a 60 Hz and a 144 Hz, and now two 144 Hz. Even with a 1080 Ti or the Titan V, 4K gaming is next; that is what my eyes see. I will trade fps for 4K any time. Imagine 4K and a high refresh rate. The future is there.
 