Kingdom Come: Deliverance GPU Performance Review

Discussion in 'Video Cards' started by Kyle_Bennett, Feb 19, 2018.

  1. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    51,172
    Joined:
    May 18, 1997
    Kingdom Come: Deliverance GPU Performance Review

    We take the new game Kingdom Come: Deliverance and test ten current video cards in it to find out how each one performs, how they stack up, and what the highest playable settings are. We test 4K, 1440p, and 1080p with multiple graphics settings and maximum distance sliders, and find out what you need to play this game and have a good experience.
     
  2. Remon

    Remon Limp Gawd

    Messages:
    355
    Joined:
    Jan 8, 2014
    Kyle, there are reports that typing a command in the console, "r_BatchType 1", improves Vega FPS by a lot.
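
    If anyone wants to try it, a quick sketch (assuming KC:D keeps the standard CryEngine console, opened with the tilde ~ key), just type it there:

        r_BatchType 1

    Entered at the console it should only last for that session; a cfg entry is what would make it stick.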

     
    lostin3d likes this.
  3. Brackle

    Brackle Old Timer

    Messages:
    7,143
    Joined:
    Jun 19, 2003
    Wow, those Vega numbers are fucked bad. AMD needs to step up on getting a driver out to fix this. Otherwise it's sad, but get an RX 580 over a Vega 56? LOL, too funny. Makes ya wonder if it's an HBM thing or something. Would love to see some GeForce Volta benchmarks, but not at $3,000, lol.

    Good review guys. Keep calling em like you bench them. Get off your ass and fix the issue, AMD.
     
  4. Yaka

    Yaka Gawd

    Messages:
    548
    Joined:
    Jan 26, 2004
    No SLI/CrossFire support in this game?
     
  5. DPI

    DPI [H]ardForum Junkie

    Messages:
    10,284
    Joined:
    Apr 20, 2013
    There are reports of the Ryse: Son of Rome SLI profile working for this title.

    However, SLI/CrossFire is generally considered dead, given the direction the videogame industry is heading.
     
    Yaka likes this.
  6. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,724
    Joined:
    Apr 17, 2000

    I'll look into that. In the meantime, just note that we tested all GPUs at DEFAULT game settings, to be fair, so this command is obviously not the developer's default. If it works, it makes you wonder why it isn't the default; perhaps it causes other issues we aren't aware of?
     
  7. lilbabycat

    lilbabycat 2[H]4U

    Messages:
    3,826
    Joined:
    Jun 21, 2011
    Quoting a reddit post about custom cfg's, because [H]ardForum will turn the URL into one of those cancer-causing live-link pop-up windows to reddit if I post it directly.



    http://docs.cryengine.com/plugins/servlet/mobile#content/view/1605684

    There's also some discussion on guru3d.
     
  8. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,724
    Joined:
    Apr 17, 2000
    ^ Thanks for the info. One thing I like about this game, like Fallout 4 and the Elder Scrolls series, is the customization: the game engine, the mods, and the configuration changes you can make to tweak it just right. I want to explore that more in the game.
     
    AceGoober, Armenius and Parja like this.
  9. rgMekanic

    rgMekanic [H]ard|News Staff Member

    Messages:
    4,823
    Joined:
    May 13, 2013
    Thanks for being awesome, Brent_Justice. I haven't gotten to try the game yet, but it looks like my Fury is in for a bit of a beating at 1440p.
     
  10. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    5,786
    Joined:
    Jun 13, 2003
    If SLI/CF works in CryEngine, it should be able to work in KC:D.

    This is an old engine, after all, and the game was developed outside of the normal AAA-title process, so it may take some time. As the benchmarks show, though, multi-GPU would certainly help!
     
  11. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
    I totally agree. It seems many devs who license CryEngine leave out whatever it is that enables SLI/CF. I saw the same thing happen with that walking sim 'Everybody's Gone to the Rapture'. A couple of updates later it was added, with significant performance gains for my 1080 SLI rig. Other times, just finding the right SLI bits/profile is all it takes.
     
    Last edited: Feb 20, 2018
    IdiotInCharge likes this.
  12. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
    Kyle, as always, thanks for the in-depth review. I ended up buying this after all on Friday. What can I say except that I'm a sucker for good eye candy? For better or worse, though, most of my weekend was spent finishing a Civ5 campaign I started last weekend, so I really only spent about 20 minutes setting up and walking around in KC.

    On my 2600K/1080 Ti at 1440p I saw nearly identical frame rates to your tests using Ultra and max distance. Thanks big time for the console command to turn off v-sync; I found it odd too that it wasn't in the menus. I noticed it right away since I use a 1440p/144Hz/G-Sync display on that rig, and that's one of the first things I tweak. Can't wait to plow through it this coming weekend now. Interesting to see how similar CryEngine can look to the REDengine in Witcher 3.
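
    For anyone else hunting for that command (assuming KC:D uses the stock CryEngine CVar name), typing this in the console, or putting it in a cfg, should be what does it:

        r_VSync = 0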
     
    IdiotInCharge likes this.
  13. pandora's box

    pandora's box [H]ardness Supreme

    Messages:
    4,337
    Joined:
    Sep 7, 2004
    The developers themselves said the game will support SLI. One came out and said it's odd that NVIDIA disabled it in their latest driver. I think we will see working SLI in the future.
     
    AceGoober, Armenius and lostin3d like this.
  14. FlawleZ

    FlawleZ Limp Gawd

    Messages:
    468
    Joined:
    Oct 20, 2010
    I'm curious if anyone with a Fury or Fury X has tested performance. According to gamegpu, it's faster than the RX 580.
     
  15. Colonel_Blimp

    Colonel_Blimp n00bie

    Messages:
    7
    Joined:
    Dec 30, 2017
    Given the prices of graphics cards, and given that the majority of people are going to be running 1080p on older cards, would it be possible to do a 1080p-only test involving cards like the 280X, 970, 390/390X, maybe even the 960, etc.?

    I really like the look of this game, but unless I can run it on medium with some settings turned down, such as draw distance, my 280X is not going to cope, which means they don't get my money!
     
    Araxie likes this.
  16. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    17,849
    Joined:
    Apr 28, 2007
    The other end of the spectrum would be cool too:
    the hardware needed to maintain 60 fps, 90 fps, etc. minimum all the time at 1080p,
    or which settings are best to turn down if the hardware can't hit the minimum framerate (this could be done for higher resolutions than 1080p as well).

    I.e., no perceived hitching, a smooth experience.
    The goal is maximum immersion.
     
    IdiotInCharge and Araxie like this.
  17. FlawleZ

    FlawleZ Limp Gawd

    Messages:
    468
    Joined:
    Oct 20, 2010
    Another vote for 1080p here. I'm still gaming at 1080p and plan to for some time, until the GPU industry advances to the point where more than just one card can maintain 60 FPS at 4K. Depending on screen size, 1080p still looks good.
     
    Araxie likes this.
  18. 0_0

    0_0 n00bie

    Messages:
    22
    Joined:
    Dec 24, 2017
    I think you guys might've found some interesting things if you had played with individual settings.

    For me it's only shadows, shader detail, and resolution that seem to have any significant effect. I'm currently using the Ultra High preset with the widest menu-selectable FOV (75?) at 1440p, with shadows on low and shader detail on medium. I can play around with all the other settings without more than a couple fps of difference, but if I drop to 1080p my framerate goes way up. It's unusual that the other settings don't seem to affect fps much, but that's how it works for me. This is with a stock Ryzen 1700, an OC'd RX 580 8GB, and 16GB of DDR4 running at 3200MHz.
     
  19. Factum

    Factum [H]ard|Gawd

    Messages:
    1,074
    Joined:
    Dec 24, 2014
    So one of the games pushing the barriers... is DX11. ;)
    I guess the "DX12 revolt" is cancelled?
     
  20. stashix

    stashix Limp Gawd

    Messages:
    228
    Joined:
    May 25, 2016
    But that looks terrible.
     
  21. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
    If it is, then it's establishing a new every-other-version pattern. It does normally take 2-3 years for significant support, but even then DX12 is already hitting that age. Some recent adoption trends followed that pattern: DX9.0c was big, 10 not so much, 11 was big, 12 not so much. Guess we may be waiting for 13, then.
     
    Hameeeedo likes this.
  22. Meeho

    Meeho 2[H]4U

    Messages:
    3,336
    Joined:
    Aug 16, 2010
    Also, the poor adoption of Win10 compared to Win7 will hardly help with DX12 adoption.
     
    Maddness, odditory, lostin3d and 2 others like this.
  23. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    13,245
    Joined:
    Jan 28, 2014
    Some people are saying that this param doesn't do anything on NVIDIA cards, and some are also reporting that it doesn't do anything on AMD Polaris and earlier architectures.
     
  24. tacos4me

    tacos4me Gawd

    Messages:
    738
    Joined:
    Apr 5, 2006
    I couldn't get SLI working properly in this game. I tried about everything, and ended up with better performance using one 1080 Ti vs. two.
     
  25. Factum

    Factum [H]ard|Gawd

    Messages:
    1,074
    Joined:
    Dec 24, 2014
    DX8.x was also popular...that kinda breaks that pattern ;)
     
    Armenius and lostin3d like this.
  26. LigTasm

    LigTasm [H]ardness Supreme

    Messages:
    5,156
    Joined:
    Jul 29, 2011
    Can confirm, Vega sucks big balls in this game. I swapped my Vega 56 for a 3GB 1060 and it actually runs better.
     
  27. FlawleZ

    FlawleZ Limp Gawd

    Messages:
    468
    Joined:
    Oct 20, 2010
    So are gamegpu's results bogus?
     
  28. 0_0

    0_0 n00bie

    Messages:
    22
    Joined:
    Dec 24, 2017
    Beggars can't be choosers, Henry.

    Seriously though, anything below 50-55 fps is pretty damn nasty to my eyes, and that's with my FreeSync display, which syncs down to 48 fps. Turning shadows up and dropping to 1080p isn't better to my liking: I love this game at 1440p because the vegetation just looks so much better, and that's the main visual attraction of this game for me. So I've got no choice but to lower shadows and shader detail. I prefer the trade-off :)
     
    Armenius and AceGoober like this.
  29. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,564
    Joined:
    Aug 1, 2005
    Probably.
     
  30. Ididar

    Ididar Gawd

    Messages:
    576
    Joined:
    Aug 2, 2004
    Do they say what settings they did it on? I saw they had a bunch of images comparing min/medium/max but I couldn't read what settings their chart is based on.
     
    Armenius likes this.
  31. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,724
    Joined:
    Apr 17, 2000
    I will say this: different parts of the game perform differently. Heavy vegetation, grass, trees, landscapes, inside villages, inside castle walls, indoors, up close on characters, at night with lots of lighting, just moving around buildings: they all behave differently. I tried to capture as much as I could in our run-through, and utilized other saved games at places around the world where performance dragged. In regards to other people's benchmarks, it is best to ask what their testing scenario in the game was; that can account for the differences.
     
  32. Nuby1Canuby

    Nuby1Canuby Gawd

    Messages:
    741
    Joined:
    Mar 4, 2008
  33. bigdogchris

    bigdogchris Wii was a Novelty

    Messages:
    17,301
    Joined:
    Feb 19, 2008
    Reading a bit more on the r_BatchType command: people are saying it's there to help remove CPU bottlenecks, so depending on your system configuration it can be helpful to set it manually. Monitor GPU/CPU utilization, then set it to free up whichever is over-utilized (see the sketch after this list):
    • 0 - CPU friendly
    • 1 - GPU friendly
    • 2 - Automatic
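
    Something like this in a cfg would make it stick between sessions. A minimal sketch: the file name, placement, and "--" comment syntax are standard CryEngine conventions I'm assuming carry over, and the value mapping is just what's reported above, not something I've verified:

        -- user.cfg, placed in the game's install folder
        -- per the list above: 0 = CPU friendly, 1 = GPU friendly, 2 = automatic
        r_BatchType = 1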
     
    Last edited: Feb 20, 2018
  34. Aireoth

    Aireoth [H]ard|Gawd

    Messages:
    1,075
    Joined:
    Oct 12, 2005
    Off topic, but maybe? I think the lack of DX12 implementation is due to DX12 pushing additional work onto the developer; I can't, and never could, see many devs going for it unless something about DX12 was really a must-have. So far the only must-have is Win 10...

    Edit:

    To be on topic: my Titan has a hell of a time pushing this game at 3440x1440. Can't wait to see it on Turing.
     
  35. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
    Too true. I remember back in the day it was usually an exciting time when a new DX version was announced and new games showed up using it. Then I remember blinking and we went from 9.0c > 10 > 11 before more than a handful of games really utilized 10.
     
    Maddness, Armenius and Factum like this.
  36. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
     
    Aireoth likes this.
  37. lostin3d

    lostin3d Gawd

    Messages:
    1,003
    Joined:
    Oct 13, 2016
    Same thing happened with DX10 and Vista. At the time, the only way to fully get it was to upgrade from XP to Vista. I still have my P4 build, and one of the last GPU upgrades I got for it was an ATI HD 2600 (or something similar). I was really proud that the card did allow some DX10 enhancements even though I was still on XP. The sad part was that for the 2-3 games it really pushed over the top, there were around a dozen that became crash-happy.

    Sad to say, I've been one of MS's guinea pigs. Every time they offer a new API I'll do what it takes to get it, in hopes of better game performance. The last 3-4 years have mostly been letdowns.
     
  38. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    8,828
    Joined:
    Oct 4, 2007
    Exactly one reason why I own a single Vega 56 instead of 2x Furies. When CrossFire was supported, especially with DX12 in ROTR or in Crysis 3 with DX11, the games ran quite well at 4K 60fps. When it was not supported, which was quite often, a single Fury ran OK, but nowhere near as well as my single Vega 56.

    Edit: The cool thing with this game is that the Fury X and Fury are properly supported.
     
    Maddness, DPI and lostin3d like this.
  39. FlawleZ

    FlawleZ Limp Gawd

    Messages:
    468
    Joined:
    Oct 20, 2010
    rgMekanic and ManofGod like this.
  40. Factum

    Factum [H]ard|Gawd

    Messages:
    1,074
    Joined:
    Dec 24, 2014
    I am not sure what you think a win at ~20 FPS says.

    Besides UNPLAYABLE? (In a game that never goes above 4GB of VRAM usage.)

    Troll better next time...
     
    jologskyblues, Armenius and Hameeeedo like this.