Kingdom Come: Deliverance GPU Performance Review

This is exactly why I own a single Vega 56 over 2 x Furies. When CrossFire was supported, especially with DX12 in ROTR or with DX11 in Crysis 3, the games ran quite well at 4K 60fps. When it was not supported, which was quite often, a single Fury ran OK but nowhere near as well as my single Vega 56.

Edit: A cool thing with this game is that a Fury X and a Fury together are properly supported.
That's the same reason I bit the bullet for my Ti on my 2600K rig. One of the best decisions I ever made for a build (even more so since the GPU price hikes). I admit that I have some buyer's remorse over my 1080 SLI setup; it's definitely the last SLI I'm doing. It's nice that with some work and research for SLI bits I can get most games working, but it's a total PITA to reinstall all those profiles every time I install a new driver.
 
So are gamegpu's results bogus?
The picture shows the low settings; switch to Ultra+ from the tabs above the chart and the results are just like other sites'.
980Ti eating Fury X dust at 4K. Only 4GB of VRAM, how could this be true? Sure, they're both unplayable at those settings, but where are the Fury haters now? Hameeeedo?
For one game?

How about 5 titles from 2018, no less:

Sea Of Thieves
980Ti is 20% faster than FuryX @1080p
http://gamegpu.com/mmorpg-/-онлайн-игры/sea-of-thieves-beta-test-gpu-cpu

World Of Tanks 2018
980Ti is 25% faster than FuryX @1080p, and 15% faster @1440p
http://gamegpu.com/mmorpg-/-онлайн-игры/world-of-tanks-encore-test-gpu-cpu

Subnautica
980Ti is 20% faster than FuryX @1080p and 1440p
http://www.pcgameshardware.de/Subnautica-Spiel-55121/Specials/Benchmark-Test-Review-1248983/

Dragon Ball FighterZ
1070 is 28% faster than FuryX @8K (the only resolution that isn't CPU limited), even a GTX 1060 is equal to FuryX
https://overclock3d.net/reviews/software/dragon_ball_fighterz_pc_performance_review/11

Final Fantasy 15
980Ti is 100% faster than FuryX @1080p and 2160p
http://gamegpu.com/action-/-fps-/-tps/final-fantasy-xv-benchmark-test-gpu-cpu
https://www.gamersnexus.net/game-bench/3223-ffxv-gpu-benchmark-technical-graphics-analysis
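For what it's worth, the "X% faster" figures quoted above are just ratios of average frame rates. A quick sketch of the arithmetic (the fps values below are hypothetical placeholders, not numbers from those links):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent,
    given each card's average fps in the same benchmark."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: card A averages 60 fps, card B averages 50 fps.
print(f"{percent_faster(60, 50):.0f}% faster")  # prints "20% faster"
```

So "100% faster" means literally double the average frame rate, not a 100-point gap on some chart.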


Also, this isn't a hating contest; it's a fact verified by dozens of sources:

Dozens of 2017 games where the FuryX is barely faster than an RX580, and way behind the 980Ti
https://hardforum.com/threads/furyx-aging-tremendously-bad-in-2017.1948025/

[GameGPU] 980Ti is 30% faster than FuryX in 2017 games

http://gamegpu.com/test-video-cards/podvedenie-itogov-po-graficheskim-resheniyam-2017-goda

[HardwareUnboxed] GTX 980Ti is Aging Considerably Better Than FuryX!



[ComputerBase] GTX 1070 (~980Ti) is considerably ahead of the Fury X

https://www.computerbase.de/2017-12...marks_in_1920__1080_2560__1440_und_3840__2160
 
I'm getting pretty good results on my 980Ti at 1440p with a mixture of max and medium settings; turning down shadows, shaders, and particles helps a lot, along with lower LOD settings.

Still, it's the first game in a while that makes me want to upgrade. Bring on the 2080 Ti!
 
I would like to see more Ryzen testing myself, as we have choices now that we didn't have before, plus it may show how much Zen+ offers.
On a side note, I did notice a huge jump in graphics scores in Fire Strike going from an overclocked X5660 to a 1400. I would like to think it was PCIe 2.0 vs 3.0, but I think it's that fabric AMD uses now; it speaks to GPUs faster for Radeons than the Intel path does.

But look at the 290X, as it is faster than the GTX 980 now. How many hacks has Nvidia used to try to hold back this card's performance over its lifespan, and it still makes the show.
 
Cool article. How often per year do you get a release where you write up an article with GFX benchmarks spread across this many resolutions?

"But can it run Crysis?"

That game doesn't even do widescreen..... :p
 
Spent some time this weekend testing on both my rigs and tweaking. I noticed the VRAM usage is freakishly low for the textures it's rendering; it seemed to average around 2-3GB at 1440p and 4K with ultra settings and max distances. Very strange and not very optimal.

Did some tweaking on the SLI rig using some bits and AFR2, and saw each card (1080) only peak at ~20% GPU usage while barely holding 40fps, with common dips to ~20 at 4096x2160. I think I need to update Afterburner on that rig, since it showed 0 for VRAM. I did notice, though, that every step I took triggered substantial drive usage, which makes me wonder how much of anything is getting cached in RAM/VRAM. I read in a thread about a RAM tweak, but I still need to research it. I also put all the distance sliders to minimum and noticed barely a 5fps difference, with only minor changes in perceived rendering.

On the Ti rig (heavily OC'd on air, averaging ~2025MHz core / 5503MHz memory) I was getting around 50-70fps with all ultra and max distance settings. Very playable, but the fps were quite erratic at times when walking around and exploring. Since this rig is paired with a 1440p/144Hz/G-Sync monitor, I also tried the v-sync off command via the console in both regular and dev mode and noticed zero change in fps. This was also odd, since I usually gain 5-10fps doing that in nearly every other game.
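For reference, the toggle in question is the standard CryEngine v-sync cvar; assuming KC:D honors the usual CryEngine config mechanism, it can also be set persistently in a user.cfg in the game's install folder rather than typed into the console each session (sketch only, untested on every patch level):

```
r_VSync = 0
```

The same cvar can be entered at the console as `r_vsync 0`; cvar names are case-insensitive there.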

CPU usage on both rigs held at the usual 30-60% I see with most games.

Overall I think I'm going to really enjoy this game, but I do believe that Warhorse/Crytek and NV should be able to do a lot more with optimizations. Hopefully this happens sooner rather than later. I was there for all the Crysis releases, and yes, this seems very similar, but even more so it feels like poor coding. I looked at the log files, and they show that the game sees 8-11GB on the cards, but only using 2-3GB for that level of graphical detail at 1440p or 4K is odd at the least.

edit: I was a little bummed yesterday when I saw the new NV driver for FF with a mention of PUBG and nothing for this game.
 
I found the game very playable on *mostly* medium settings @ 4K with a lightly overclocked 1080. I set the textures to very high and gave an extra two clicks to LOD, and I rarely dropped below 40fps, if the built-in FPS meter is accurate.
 
Maybe I have a setting set wrong(?), but other than the landscape, the game looks blah to me. I don't think you need a better video card; I think they need to fix their game. The buildings appear blah, the textures appear blah. Maybe it needs more tessellation? It should run a lot better than this, at least for what I'm seeing from a graphical perspective. You shouldn't need a $1.5k video card to run this game with better settings. For just the few hours I played, the gameplay is good, but my performance is not in line with the graphical detail I'm seeing. I'm still using a 980Ti, but I play other games that look amazing and perform much better. I don't think it's fair to say that you need a more powerful video card. This appears to be very much a game problem.
 
Maybe I have a setting set wrong(?) but other than the landscape, the game looks blah to me. [...] This appears to be very much a game problem.
What settings / resolution are you at? Even at Medium this game looks pretty stunning at 4K; even the NPCs look pretty damn good to my eyes.
 
I found the game very playable on *mostly* medium settings @ 4k with a lightly overclocked 1080. [...]
I thought about using similar settings for higher FPS; I just wanted to see what I could perceive in terms of visual quality vs. fps. I could absolutely see a difference at 4K, but not enough to justify the performance hits. It really makes me wonder how much rendering data is being read, decompressed, and calculated vs. actually being seen. It'll be funny if someone plays this in 8K and some things actually get better on the same hardware.

I also realize it's common to compress many things for space efficiency and some I/O speedup. I LOL when I see multiple bins/paks in the 2-8GB range for many games. Looking around the install files, I didn't see anything quite that large, but it seemed like pretty much everything was in some kind of compressed file. I think they may have gone off the deep end with compression on this one. It weighed in at ~39GB, and for many high-res, 4K-happy RPG/FPS games it's not unusual to see 45-60GB these days. I know many are complaining about install sizes these days, but sometimes it's necessary. I appreciate their efforts, but I'm not about to build or mod a rig for this approach, at least not until that style becomes mainstream. A SATA III SSD, fast CPU/GPU, and fast DDR3 RAM should still suffice for most things.

I'm sure at some point I'll play it at 4K/ultra @ 60fps, but for now I'm in the boat with many others, learning how to make it happen. I don't truly believe hardware upgrades or settings compromises are needed, yet.
 
Maybe I have a setting set wrong(?) but other than the landscape, the game looks blah to me. [...] This appears to be very much a game problem.

I mostly feel the same about this. Some of the vegetation looks nice, but otherwise I've yet to see anything that surpasses what I've seen in other similar RPGs from 2015 to now. It does look considerably better at 4K, but to me still not enough to justify the hits. I'm just hoping they stay committed to improving it; the game deserves it, and obviously there's a fan base for it.
 
I am running 4K on ULTRA with a 1080 Ti, and the game is usually just a little above 30fps. This is with a 4930K CPU. When it dips below that, it's usually during a transition or an animation during speaking, but for the most part I find gameplay acceptable. Can't wait for the mods to come out for this game. The game has plenty of bugs, but that is the new normal these days.
 
Just saw this.
https://overclock3d.net/news/softwa...ce_patch_1_4_-_hd_textures_audio_and_beards/1

I won't have time to check it out until sometime this weekend, but it looks promising.

Overall, I'm pretty impressed with the support Warhorse is giving this game so far; it seems like something is put out once every week or two. Also, once I was able to determine optimal settings for my rigs, I was able to enjoy it a great deal more. Sure, there are plenty of glitches, but there's a definite beauty to it as well. I agree with a quote I heard from one of their top people: the game sits somewhere between indie and AAA. I paid a AAA price, but no real regrets given the support they're giving.
 
Kyle, any chance of doing even a minimal follow-up review since the 1.4.1 update?

I did some testing and noticed some impressive changes involving the ultra settings. At 4096x2160 with v-sync on, 1080 SLI, I was averaging 40-60FPS on Ultra with max distance settings. For kicks, I cranked up my old 1080p/120Hz monitor and saw 90-114 FPS at the same settings. I have to say the background vegetation textures looked the best I've ever seen at 1080p. My 1080 Ti at 1440p/G-Sync averaged 50-75 FPS at the same settings.

Not sure what changed, but the new HD texture pack seems to have optimized the higher-end settings or something. I was confused, though: when I lowered settings I did gain some frames, but visually there wasn't as much of a difference as before. Honestly, most of my time was spent enjoying ultra.
 
This game's visuals, while good, do not scale very linearly with its performance.

edit: I have not seen the HD version, so I can't comment on that
 
Trying out the game on my rig @ 1440p with the Ultra High preset, WITHOUT adjusting the extra sliders any further. I'm seeing times when I'm stuck in the high 40s just in the first town; this game is brutal! I'm absolutely GPU limited; I haven't noticed any single CPU core higher than 60%. Sure does look purdy though...
 
this games visuals. while good, do not scale very linearly with its performance [...]
To me, the HD pack really evened up the metrics. I don't know why, but performance actually seemed to improve.
 