Kingdom Come: Deliverance GPU Performance Review

FrgMstr

We take the new game Kingdom Come: Deliverance and test ten current video cards in it to find out how each one performs, how they stack up, and what the highest playable settings are. We test at 4K, 1440p, and 1080p with multiple graphics settings and maximum distance sliders, and find out what you need to play this game and have a good experience.
 
Kyle, there are reports that typing a command in the console, "r_BatchType 1", improves Vega FPS by a lot.

 
Wow, those Vega numbers are fucked bad. AMD needs to step up on getting a driver to fix this. Otherwise it's sad, but get an RX 580 over a Vega 56? LOL, too funny. Makes ya wonder if it's an HBM thing or something. Would love to see some GeForce Volta benchmarks, but not at $3000 lol.

Good review guys. Keep calling 'em like you bench them. Get off your ass and fix the issue, AMD.
 
no sli/crossfire support in this game?

There are reports of the Ryse: Son of Rome SLI profile working for this title.

However SLI/Crossfire is generally considered dead in terms of the videogame industry's direction.
 
Kyle, there are reports that typing a command in the console, "r_BatchType 1", improves Vega FPS by a lot.



I'll look into that. In the meantime, just note we tested all GPUs at DEFAULT game settings, to be fair. So this command is obviously not the default setting from the developer. If it works, it makes you wonder why it wasn't the default; perhaps it causes other issues we aren't aware of?
 
Quoting a reddit post about custom cfg's, because [H]ardForum creates one of those live-link-pop-up-Deadpool-died-for-this-cancer-causing windows to reddit if I post the url.

I've been digging around commands and settings to see how much performance I can squeeze out for myself.


Create a custom .cfg file in the KingdomComeDeliverance folder, same folder where system.cfg is placed.


I'm personally using the following commands, as they give me the best performance-to-quality ratio. Change them to your liking.


-- vsync on/off.
r_vsync = 0

-- Ambient occlusion on/off.
r_ssdo = 0

-- Antialiasing mode.
r_antialiasingmode = 3

-- Image sharpening, higher value means more.
r_sharpening = 0

-- Shadow map resolution (256, 512, 1024, 2048, 4096).
e_shadowsmaxtexres = 1024

-- Resolution upscaling.
r_supersampling = 0

-- Framerate cap.
sys_MaxFPS = 300


Now there are two ways of loading in the config file: one is to open the in-game console with the (¬/~) key and type exec user to load the settings. The second is to add a launch parameter through Steam or a desktop game shortcut; the command is +exec user


You can name your config file anything you want as long as it can be executed; mine is called 'user' to stick with convention.

http://docs.cryengine.com/plugins/servlet/mobile#content/view/1605684
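For reference, assuming you saved the file as user.cfg next to system.cfg as described above, the two load methods look like this:

-- method 1: in-game console (¬/~ key)
exec user

-- method 2: Steam launch options or desktop shortcut parameter
+exec user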

Some discussion from guru3d:
The draw distance is controlled by multiple settings; the CryEngine default is around 100 or so units for the highest presets, which this game extends to almost 500 units, though it only covers three specific view distance modifiers.
The game also has its own LOD system called "Uberlod", in addition to further separating character view distances into various groups for things like accessories or texture detail.

e_UberlodDistanceRatio

That's the Uberlod cvar; it scales almost everything in the game, from 0.1 (or perhaps lower) up to 3.0, which is the default.

wh_r_UberlodRatio = 3
wh_r_UberlodMode = 0

And these two are the game's own settings for this modifier. From a quick test in-game it scaled over 4000 objects according to the console output, so it's quite a demanding setting. Even if you scale it down, though, framerate in cities and such (which hit the CPU even harder) still tends to fluctuate a lot, so there's possibly some bottleneck there.
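As a rough, untested sketch, something like this in the same user.cfg should dial the scaling back for more framerate; the exact values here are my own guess, so treat them as a starting point:

-- default is 3.0; lower means shorter detail distances and less load
e_UberlodDistanceRatio = 1.5
wh_r_UberlodRatio = 1.5
wh_r_UberlodMode = 0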


EDIT: Also, from a quick check of what the beta version used, it seems they toned down the ambient occlusion strength in the final game. I like the more subtle look of the AO here compared to the beta, but the differences are these cvars:

Beta:
r_ssdoAmountAmbient = 2
r_ssdoAmountDirect = 4
r_ssdoAmountReflection = 5
r_ssdoRadius = 1
r_ssdoRadiusMax = 2
r_ssdoRadiusMin = 0.05

Full:
r_ssdoAmountAmbient = 1
r_ssdoAmountDirect = 1.5
r_ssdoAmountReflection = 1.5
r_ssdoRadius = 0.3
r_ssdoRadiusMax = 2
r_ssdoRadiusMin = 0.1

The first two in particular have a really noticeable impact, as can be seen even from the main menu. The max radius is the same, but the other settings were all altered between the last public beta build of late 2016 and the full game of early 2018.
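If anyone prefers the beta's heavier AO, putting the old values back into the same user.cfg should get close; this is just a sketch from the comparison above, I haven't verified it in the final game:

-- beta-era AO values (stronger effect than the release defaults)
r_ssdoAmountAmbient = 2
r_ssdoAmountDirect = 4
r_ssdoAmountReflection = 5
r_ssdoRadius = 1
r_ssdoRadiusMax = 2
r_ssdoRadiusMin = 0.05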
 
^ Thanks for the info. One thing I like about this game, like Fallout 4 and the Elder Scrolls series, is the customization of the game engine, the mods, and the configuration changes you can make to tweak it just right. I want to explore that more in the game.
 
There are reports of the Ryse: Son of Rome SLI profile working for this title.

However SLI/Crossfire is generally considered dead in terms of the videogame industry's direction.

If SLI/CF is working in the CryEngine, it should be able to work in KC:D.

This is an old engine, after all, and developed outside of the normal AAA-title process, so it may take some time. However, as the benchmarks show, multi-GPU would certainly help!
 
If SLI/CF is working in the CryEngine, it should be able to work in KC:D.

This is an old engine, after all, and developed outside of the normal AAA-title process, so it may take some time. However, as the benchmarks show, multi-GPU would certainly help!

I totally agree. It seems many devs who license CryEngine often leave out whatever it is that enables SLI/CF. I saw the same thing happen with that walking sim 'Everybody's Gone to the Rapture'. A couple of updates later it was added, with significant performance gains for my 1080 SLI rig. Other times, just finding the right SLI bits/profile is all it takes.
 
Kyle, as always thanks for the in depth review. I ended up buying this after all on Friday. What can I say except that I'm a sucker for good eye candy. For better or worse though, most of my weekend was spent finishing a campaign on Civ5 I started last weekend so I really only spent about 20 minutes setting up and walking around with KC.

On my 2600K/1080 Ti at 1440p I saw nearly identical framerates to your tests using Ultra and max distance. Thanks big time for the console command to turn off v-sync; I found it odd too that it wasn't in the menus. Noticed it right away since I use a 1440p/144Hz/G-Sync display on that rig and that's one of the things I tweak first. Can't wait to plow through it this coming weekend now. Interesting to see how similar CryEngine can look to the RED Engine in The Witcher 3.
 
I'm curious if anyone with a Fury or Fury X has tested performance. According to gamegpu it's faster than RX 580.
 
Given the prices of graphics cards, and given that the majority of people are going to be running 1080p with older cards, would it be possible to do a 1080p-only test involving cards like the 280X, 970, 390/390X, maybe even the 960, etc.?

I really like the look of this game, but unless I can run it on medium with some settings such as draw distance turned down, my 280X is not going to cope, which means they don't get my money!
 
The other end of the spectrum would be cool too:
the hardware needed to maintain 60fps, 90fps, etc. minimum all the time at 1080p,
or which settings are best to turn down if the hardware can't hit the min framerate (this could be done for resolutions higher than 1080p as well).

i.e. no perceived hitching, a smooth experience.
The goal is maximum immersion.
 
Another vote for 1080P here. I'm still gaming at 1080P and plan to for some time, until the GPU industry advances to where more than just one card can maintain 60 FPS at 4K. Depending on screen size, 1080P still looks good.
 
I think you guys might've found some interesting things if you played with individual settings.

For me it's only shadows, shader detail and resolution that really seem to have any significant effect. I'm currently using the Ultra High preset with the widest menu-selectable FOV (75?) at 1440p, with shadows on low and shader detail on medium. I seem to be able to play around with all the other settings without more than a couple fps of difference. If I drop to 1080p my framerate goes way up. It's unusual that those other settings don't seem to affect fps much, but that's how it works for me. This is with a stock Ryzen 1700, an OC'd RX 580 8GB, and 16GB of DDR4 running at 3200MHz.
 
;) So one of the games pushing the barriers...is DX11.
I guess the "DX12 revolt" is cancelled?
If it is, then it's establishing a new every-other-version pattern. It normally takes 2-3 years for significant support, but even then DX12 is already hitting that age. Some recent adoption trends that followed that pattern: DX9.0c - big, 10 - not so much, 11 - big, 12 - not so much. Guess we may be waiting till 13 then.
 
If it is, then it's establishing a new every-other-version pattern. It normally takes 2-3 years for significant support, but even then DX12 is already hitting that age. Some recent adoption trends that followed that pattern: DX9.0c - big, 10 - not so much, 11 - big, 12 - not so much. Guess we may be waiting till 13 then.
Also, poor adoption of Win10 compared to Win7 will hardly help DX12 adoption.
 
I'll look into that. In the meantime, just note we tested all GPUs at DEFAULT game settings, to be fair. So this command is obviously not the default setting from the developer. If it works, it makes you wonder why it wasn't the default; perhaps it causes other issues we aren't aware of?
Some people are saying this param doesn't do anything on NVIDIA cards, and some are also reporting that it doesn't do anything on AMD Polaris and earlier architectures.
 
I couldn't get SLI working properly in this game. Tried about everything, ended up with better performance using one 1080 Ti vs. two.
 
If it is, then it's establishing a new every-other-version pattern. It normally takes 2-3 years for significant support, but even then DX12 is already hitting that age. Some recent adoption trends that followed that pattern: DX9.0c - big, 10 - not so much, 11 - big, 12 - not so much. Guess we may be waiting till 13 then.

DX8.x was also popular...that kinda breaks that pattern ;)
 
Wow, those Vega numbers are fucked bad. AMD needs to step up on getting a driver to fix this. Otherwise it's sad, but get an RX 580 over a Vega 56? LOL, too funny. Makes ya wonder if it's an HBM thing or something. Would love to see some GeForce Volta benchmarks, but not at $3000 lol.

Good review guys. Keep calling 'em like you bench them. Get off your ass and fix the issue, AMD.

Can confirm, Vega sucks big balls in this game. I swapped my Vega 56 for a 3GB 1060 and it actually runs better.
 
So are gamegpu's results bogus?
[attached gamegpu benchmark chart]
 
But that looks terrible.

Beggars can't be choosers, Henry.

Seriously though, anything below 50-55fps is pretty damn nasty to my eyes, and that's with my FreeSync display that syncs down to 48fps. Turning shadows up and dropping to 1080p isn't better to my liking. I love this game at 1440p because the vegetation just looks so much better, and that's the main visual attraction of this game for me. I've got no choice but to lower shadows and shader detail. I prefer the trade-off :)
 
So are gamegpu's results bogus?
[attached gamegpu benchmark chart]

I will say this: different parts of the game perform differently (heavy vegetation, grass, trees, landscapes, inside villages, inside castle walls, indoors, up close on characters, at night with lots of lighting, just moving around buildings). I tried to capture as much as I could in our run-through, and utilized other saved-game spots around the world where performance dragged. In regards to other people's benchmarks, it is best to ask about their testing scenario in the game; it can account for the differences.
 
Reading a bit more on the r_BatchType command, people are saying it's there to help remove CPU bottlenecks. Depending on your system configuration it can be helpful to set it manually: monitor GPU/CPU utilization, then set it to free up whichever is over-utilized (quick sketch after the list).
  • 0 - CPU friendly
  • 1 - GPU friendly
  • 2 - Automatic
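As a hedged example, either typed into the console or dropped into the user.cfg discussed earlier; this is just my reading of the reports, not something verified on every architecture:

-- 1 = GPU friendly; reportedly helps Vega when the CPU is the limiting factor
r_BatchType = 1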
 
If it is, then it's establishing a new every-other-version pattern. It normally takes 2-3 years for significant support, but even then DX12 is already hitting that age. Some recent adoption trends that followed that pattern: DX9.0c - big, 10 - not so much, 11 - big, 12 - not so much. Guess we may be waiting till 13 then.

Off topic, but maybe? I think the lack of DX12 implementation is due to DX12 pushing additional work onto the developer; I can't, and never could, see many devs going for it unless something about DX12 was really a must-have. So far the only must-have is Win 10...

Edit:

To be on topic, my Titan has a hell of a time pushing this game at 3440x1440, can't wait to see it on Turing.
 
DX8.x was also popular...that kinda breaks that pattern ;)
Too true. I remember back in the day it was usually an exciting time when a new DX was announced and then seeing new games with it. Then I remember blinking and we went from 9.0c > 10 > 11 before hardly any games really utilized 10.
 
Off topic, but maybe? I think the lack of DX12 implementation is due to DX12 pushing additional work onto the developer; I can't, and never could, see many devs going for it unless something about DX12 was really a must-have. So far the only must-have is Win 10...

Too true also. I've read the same on a number of sites. I don't know much about modern game design/creation, but it sure seems like the tools could use some improvement to help would-be creators implement it.

 
Also, poor adoption of Win10 compared to Win7 will hardly help with the DX12 adoption.

Same thing happened with DX10 and Vista. At the time the only way to fully get it was to upgrade from XP to Vista. I still have my P4 build, and one of the last GPU upgrades I got for it was an ATI HD 2600 (or something similar). I was really proud that the card did allow some DX10 enhancements even though I was still on XP. The sad part was that for the 2-3 games it really pushed over the top, there were around a dozen or so that became crash-happy.

Sad to say, I've been one of MS's guinea pigs. Every time they offer a new API I'll do what it takes to get it in hopes of better game performance. The last 3-4 years have mostly been letdowns.
 
There are reports of the Ryse: Son of Rome SLI profile working for this title.

However SLI/Crossfire is generally considered dead in terms of the videogame industry's direction.

Exactly one reason why I own a single Vega 56 over 2x Furies. When Crossfire was supported, especially with DX12 in ROTR or in Crysis 3 with DX11, the game ran quite well at 4K 60fps. When it was not supported, which was quite often, a single Fury ran OK but nowhere near as well as my single Vega 56.

Edit: Cool thing with this game here is that a Fury X and Fury combo is properly supported.
 