Ryzen 1/1+/2 @ 4K+ gaming

funkydmunky
2[H]4U - Joined Aug 28, 2008 - Messages: 3,854
Gamers Nexus did a nice comparison of Zen through the ages, looking at gaming performance at 1080p to see where the real increases are.
They did a great job and I recommend the watch. I was hoping they would throw in a few 4K results to round it out, but they did not, as the whole point was isolating the CPU.
I am searching for benchmarks that make such a comparison, as I want to know how the different generations compare at VR-type resolutions.
When I game in VR, my RT-VR stats show I am always bound by my GPU, never the CPU. I know it is somewhat of a two-way street in that the CPU helps the GPU, but by how much? I would like to know where that cutoff is.
If anyone has a link to anything in this regard, I would be appreciative.
 
When I game in VR, my RT-VR stats show I am always bound by my GPU, never the CPU. I know it is somewhat of a two-way street in that the CPU helps the GPU, but by how much? I would like to know where that cutoff is.
I can give you some personal experience from my very recent upgrade from an R7 1700 (OC'd to 3.9GHz) to a 3900X. I mostly play at 4K.

I don't have actual numbers or graphs for the most part, but I had that 1700 for years, and I've been running games to see where the noticeable differences are.
 
I don't play many FPS titles, so most of my games aren't going for massive framerates. However, these saw very noticeable increases:
Note: these are mostly at 4K with my 1080 Ti.

Pretty much any Ubisoft game, especially Assassin's Creed Origins and Odyssey, plus Hitman 2 - there was a weird stutter on the 1700 that's absolutely eliminated now. Much, much smoother, and average framerates went up along with massive improvements in minimum framerates.

Monster Hunter World: I definitely didn't expect to be CPU limited here, but my previously slightly inconsistent 60fps is now almost perfectly consistent. It's hard to put numbers to it, but overall just FAR better.

In one of the few games where I do care about FPS, Doom Eternal, I can finally go over 110fps - it's now around 150fps at 1440p. HUGE increase there.

The biggest improvement for me overall is in Frontier's games: Planet Zoo, Planet Coaster, Jurassic World Evolution, etc. My framerate went from 25ish in big parks to 40-50, close to double. I play these games a LOT and this alone is more than worth it for me.

Cities: Skylines, frustratingly, is still around 25fps in my biggest cities. It's a more responsive 25fps, but still just a sad 25fps.

If you have a title that you'd like me to test in particular, let me know. I own the obvious free-to-play stuff like Fortnite, Apex Legends, PUBG, etc., but I don't play them that often. I am curious if PUBG runs any better; maybe I'll reinstall that one...
 
I don't play many FPS titles, so most of my games aren't going for massive framerates. However, these saw very noticeable increases:
Note: these are mostly at 4K with my 1080 Ti.

Pretty much any Ubisoft game, especially Assassin's Creed Origins and Odyssey, plus Hitman 2 - there was a weird stutter on the 1700 that's absolutely eliminated now. Much, much smoother, and average framerates went up along with massive improvements in minimum framerates.

Monster Hunter World: I definitely didn't expect to be CPU limited here, but my previously slightly inconsistent 60fps is now almost perfectly consistent. It's hard to put numbers to it, but overall just FAR better.

In one of the few games where I do care about FPS, Doom Eternal, I can finally go over 110fps - it's now around 150fps at 1440p. HUGE increase there.

The biggest improvement for me overall is in Frontier's games: Planet Zoo, Planet Coaster, Jurassic World Evolution, etc. My framerate went from 25ish in big parks to 40-50, close to double. I play these games a LOT and this alone is more than worth it for me.

Cities: Skylines, frustratingly, is still around 25fps in my biggest cities. It's a more responsive 25fps, but still just a sad 25fps.

If you have a title that you'd like me to test in particular, let me know. I own the obvious free-to-play stuff like Fortnite, Apex Legends, PUBG, etc., but I don't play them that often. I am curious if PUBG runs any better; maybe I'll reinstall that one...
Well, I am concerned with VR titles, which are 4K+ by default. I am trying to figure out how much the CPU is the bottleneck at 4K so I don't just pour all my money into the GPU.
"In one of the few games where I do care about FPS, Doom Eternal, I can finally go over 110fps - it's now around 150fps at 1440p. HUGE increase there."
Yes, a big increase. But 110fps isn't limiting at all, or is it? And that's at a meager 1440p. I thought you gamed at 4K?
 
I thought you gamed at 4K?
I have two 4K 60Hz monitors and a 1440p 165Hz monitor. In titles where I want over 60fps, I use the monitor that can actually display over 60fps, which is the 1440p one. In general, I've found 1440p to be about twice the performance of 4K as long as I'm not CPU bottlenecked.
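For what it's worth, the raw pixel-count math below roughly backs up that 2x figure when the GPU is the limiter (just arithmetic, not a benchmark):

```python
# Quick sanity check: raw pixel counts for 4K (2160p) vs. 1440p.
# Illustrative arithmetic only - actual scaling depends on the game and settings.
uhd_pixels = 3840 * 2160   # 8,294,400
qhd_pixels = 2560 * 1440   # 3,686,400

print(f"4K pushes {uhd_pixels / qhd_pixels:.2f}x the pixels of 1440p")  # ~2.25x
```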

Doom Eternal was 60fps at 4K with both processors. I don't bother testing over 60fps at 4K because my setup can't display it; academic or not, it's meaningless to me.

As for 110 fps being limiting - it's not, but the difference between 110 and 150 is pretty damn noticeable on a display that can support it.

I need to reinstall some sort of racing game that's not limited to 60fps to see what those feel like...

I also have a whole lot of games that played at 4K60 that didn't change much. Witcher 3, Star Wars Battlefront, Diablo III, Grim Dawn, and all my 2D games perform basically identically (most are limited to 60fps as it is). I haven't touched any of my Far Cry or Tomb Raider titles in a while; I have every reason to suspect Tomb Raider is gonna run way better now, but I honestly don't remember what settings I had it on previously. I'll give Vermintide 2 a go when I find someone to co-op with. I stopped paying for Xbox Game Pass a few months back because the way it installs games was creating issues, so I can't test stuff like Forza Horizon or Gears 5 anymore - but they both ran flawlessly on my 1700 as it was.

I don't have a VR headset, so there are a whole lot of VR titles that I have no way to even attempt to test.
 
Well, I am concerned with VR titles, which are 4K+ by default. I am trying to figure out how much the CPU is the bottleneck at 4K so I don't just pour all my money into the GPU.
"In one of the few games where I do care about FPS, Doom Eternal, I can finally go over 110fps - it's now around 150fps at 1440p. HUGE increase there."
Yes, a big increase. But 110fps isn't limiting at all, or is it? And that's at a meager 1440p. I thought you gamed at 4K?

VR, especially in DX11 titles, can hit CPU limits due to draw call overhead. Since there are two views to draw, high geometry complexity can hit the CPU extremely hard, especially if single pass stereo (Nvidia) or LiquidVR (AMD) isn't used. It's very game dependent, but there are a couple of popular simulators that have that issue and really need high single-core speed. Two I know of are X-Plane and iRacing. Going from Zen 1 to Zen 2 can be worth 20-25% in those, as fill rate is not the limiting factor. iRacing does support single pass stereo, but it's still CPU limited on anything above a GTX 1070.

I'd suggest finding the forums of the games you care about, as there are usually extensive discussions there about VR performance and which hardware and settings give the best results.
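To make the draw-call point a bit more concrete, here is a throwaway back-of-the-envelope model. All of the numbers (draw call count, per-call CPU cost, GPU time) are invented purely for illustration; real engines and drivers are far messier. The idea is just that CPU submission cost scales with draw calls, so rendering the scene once per eye roughly doubles it, while single pass stereo submits it once:

```python
# Toy model of a CPU-bound VR frame. Frame time is roughly limited by whichever
# of CPU submission or GPU rendering takes longer. All numbers are hypothetical.

def frame_ms(draw_calls, cpu_us_per_call, gpu_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    return max(cpu_ms, gpu_ms), cpu_ms

scene_calls = 5000   # hypothetical draw calls to render the scene once
cpu_cost    = 2.0    # hypothetical microseconds of CPU work per draw call
gpu_ms      = 8.0    # hypothetical GPU time to render both eyes

budget_ms = 1000 / 90   # ~11.1 ms per frame at 90 Hz

# Naive stereo: submit the whole scene twice, once per eye.
naive, naive_cpu = frame_ms(scene_calls * 2, cpu_cost, gpu_ms)
# Single pass stereo / LiquidVR-style: submit once, let the GPU handle both views.
single, single_cpu = frame_ms(scene_calls, cpu_cost, gpu_ms)

print(f"90 Hz budget : {budget_ms:.1f} ms")
print(f"naive stereo : {naive:.1f} ms per frame (CPU side {naive_cpu:.1f} ms) -> misses 90 Hz")
print(f"single pass  : {single:.1f} ms per frame (CPU side {single_cpu:.1f} ms) -> makes 90 Hz")
```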
 
Would love to see some actual benchies, though. It seems every article focuses only on 1080p or lower to deliberately isolate the CPU. There have been a number of very well done Zen 1/1+/2 articles, some even comparing everything at 4GHz, but they just won't show higher resolutions because they are trying to isolate the CPU.
Thanks for the replies. Good info in there.
 
At 4K, you are almost entirely GPU limited. CPU-to-CPU variance would be less than 5%.
 
Would love to see some actual benchies, though.
Yeah, sorry that I'm not equipped to offer those.
I also personally wish that benchmarkers would bench games that are truly CPU limited (such as the Frontier games), because doubling my performance (and actually becoming GPU limited for the first time ever) is a pretty noticeable change.

At 4K, you are almost entirely GPU limited. CPU-to-CPU variance would be less than 5%.
It really, really depends on the title.
 
At 4K, you are almost entirely GPU limited. CPU-to-CPU variance would be less than 5%.

It really depends on the game. I put this theory to the test when I reviewed the Intel Core i9-10900K. What you said is mostly true, except in specific scenarios. Destiny 2, as an example, shows gaps greater than 5% across the spectrum of CPUs we tested. Shadow of the Tomb Raider shows slightly more than 5% gains going from the 9900K to the 10900K, and again when the 10900K is overclocked to 5.1GHz on all cores. Hitman 2 showed vastly different minimum frame rates at 4K across our test systems despite having almost identical averages. The maximum FPS numbers were also spread by more than 5% comparing the 3900X to the Intel CPUs.

However, Ghost Recon backs up what you said 100%. It is most certainly entirely GPU limited, as the RTX 2080 Super I used showed a 1-3 FPS range from best to worst across all the CPUs we tested. That's pretty much margin-of-error territory. Given the time we had to complete the review, the CPU tests weren't as broad as we would have liked, but we still saw more than a 5% difference at 4K in some of our tests. I know we'd get more varied results with a few other CPUs thrown into the mix with different clocks and core counts, as I've seen that before. These differences are certainly much larger at more CPU-limited resolutions, but they still exist across the spectrum.

Your GPU matters more, but saying that you are almost entirely GPU limited at 4K simply isn't an accurate statement. There are enough cases where we can demonstrate a larger difference to show this. The GPU being the bigger factor doesn't mean the CPU is largely unimportant. However, if you only look at averages, they are broadly the same for most CPUs. Back when the Ryzen 3000 series launched, I tested the 3900X against a Threadripper 2920X, which has the same core/thread count. The averages often said they were the same, but the minimum and maximum FPS were very different across the board. Unfortunately, we didn't do 4K testing back then, as I didn't have a 4K monitor on the test bench. We were also limited on time, as CPU launches tend to give us almost no time to work.

The point being: I've done enough testing to know that the GPU is still the best place to put the bulk of your budget in a gaming PC build, but a higher-end CPU is certainly worth it for 4K gaming if you can afford it.
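As a quick illustration of why averages can hide exactly this, here are two invented frame-time traces (not data from any of the reviews mentioned) with nearly identical average FPS but very different lows:

```python
# Two made-up frame-time traces: similar average FPS, very different experience.
def stats(frametimes_ms):
    fps = sorted(1000.0 / t for t in frametimes_ms)
    avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)
    one_pct_low = fps[max(0, len(fps) // 100 - 1)]   # crude "1% low": slowest 1% boundary
    return avg_fps, one_pct_low, fps[0]

smooth  = [16.7] * 990 + [18.0] * 10   # consistent ~60 fps
stutter = [16.2] * 990 + [60.0] * 10   # similar average, but ten 60 ms hitches

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg, low1, worst = stats(trace)
    print(f"{name:7s}  avg {avg:5.1f} fps | 1% low {low1:5.1f} fps | min {worst:5.1f} fps")
```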
 
Cities: Skylines, frustratingly, is still around 25fps in my biggest cities. It's a more responsive 25fps, but still just a sad 25fps.

This right here is exactly the reason I'm building my new PC with 32GB, since Skylines is also such a memory hog.

I all but gave up playing it on the i5 computer in my signature. I just got waaaay too much into modding, and it slowed the game down so damn much, haha.

So once I get that new PC in my signature built, I'll be very curious how well the game runs.
 
This right here is exactly the reason I'm building my new PC with 32GB, since Skylines is also such a memory hog.

I all but gave up playing it on the i5 computer in my signature. I just got waaaay too much into modding, and it slowed the game down so damn much, haha.

So once I get that new PC in my signature built, I'll be very curious how well the game runs.
It plays WAY better on my 3900X compared to my 1700. The framerate is still identical, even with the GPU barely topping 30% use, so it's definitely an engine limitation.

Zooming in and out is faster, moving around no longer stutters, and there's no delay when turning on one of the overlays (traffic, police, etc.).

My main city, for reference, is 150,000 people using every single square (I removed the limit), and I have maybe a hundred mods active - mostly QOL. I got rid of a bunch of the unique-building mods because they simply slowed stuff down too much.

I totally get the need for RAM; that's part of why I also have as much as I do (48GB). Cities: Skylines plus a 4K video simultaneously is a big ol' system hog, and I usually watch something else while playing my city builders.
 
Gamers Nexus did a nice comparison of Zen through the ages, looking at gaming performance at 1080p to see where the real increases are.
They did a great job and I recommend the watch. I was hoping they would throw in a few 4K results to round it out, but they did not, as the whole point was isolating the CPU.
I am searching for benchmarks that make such a comparison, as I want to know how the different generations compare at VR-type resolutions.
When I game in VR, my RT-VR stats show I am always bound by my GPU, never the CPU. I know it is somewhat of a two-way street in that the CPU helps the GPU, but by how much? I would like to know where that cutoff is.
If anyone has a link to anything in this regard, I would be appreciative.
It depends on the game.

Most VR games are relatively simple games (in terms of game logic) and thus can hit the required 90 FPS on even a modest CPU. There are games where this is not the case, however. DCS World is one such game; it's very CPU heavy, and even with a 9900K it sometimes struggles to manage 90 FPS because it's CPU bound.
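The frame budget arithmetic makes it clear why: at 90 Hz the CPU and GPU each have to fit their work into roughly 11 ms per frame, so a CPU-heavy sim can miss the target on its own (the 13 ms figure below is made up for illustration):

```python
# VR frame budget at 90 Hz. The CPU cost here is an invented example number.
refresh_hz = 90
budget_ms  = 1000 / refresh_hz        # ~11.1 ms to simulate, submit, and render a frame
cpu_ms     = 13.0                     # hypothetical per-frame CPU cost in a heavy sim

print(f"frame budget at {refresh_hz} Hz: {budget_ms:.1f} ms")
if cpu_ms > budget_ms:
    # When a frame is late, VR runtimes typically fall back to reprojection or
    # half rate (45 fps), no matter how fast the GPU is.
    print(f"CPU alone takes {cpu_ms:.1f} ms -> dropped/reprojected frames even with an idle GPU")
```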
 
It really depends on the game. I put this theory to the test when I reviewed the Intel Core i9-10900K. What you said is mostly true, except in specific scenarios. Destiny 2, as an example, shows gaps greater than 5% across the spectrum of CPUs we tested. Shadow of the Tomb Raider shows slightly more than 5% gains going from the 9900K to the 10900K, and again when the 10900K is overclocked to 5.1GHz on all cores. Hitman 2 showed vastly different minimum frame rates at 4K across our test systems despite having almost identical averages. The maximum FPS numbers were also spread by more than 5% comparing the 3900X to the Intel CPUs.

However, Ghost Recon backs up what you said 100%. It is most certainly entirely GPU limited, as the RTX 2080 Super I used showed a 1-3 FPS range from best to worst across all the CPUs we tested. That's pretty much margin-of-error territory. Given the time we had to complete the review, the CPU tests weren't as broad as we would have liked, but we still saw more than a 5% difference at 4K in some of our tests. I know we'd get more varied results with a few other CPUs thrown into the mix with different clocks and core counts, as I've seen that before. These differences are certainly much larger at more CPU-limited resolutions, but they still exist across the spectrum.

Your GPU matters more, but saying that you are almost entirely GPU limited at 4K simply isn't an accurate statement. There are enough cases where we can demonstrate a larger difference to show this. The GPU being the bigger factor doesn't mean the CPU is largely unimportant. However, if you only look at averages, they are broadly the same for most CPUs. Back when the Ryzen 3000 series launched, I tested the 3900X against a Threadripper 2920X, which has the same core/thread count. The averages often said they were the same, but the minimum and maximum FPS were very different across the board. Unfortunately, we didn't do 4K testing back then, as I didn't have a 4K monitor on the test bench. We were also limited on time, as CPU launches tend to give us almost no time to work.

The point being: I've done enough testing to know that the GPU is still the best place to put the bulk of your budget in a gaming PC build, but a higher-end CPU is certainly worth it for 4K gaming if you can afford it.

But you are testing the 9900K, 10900K, and 3900X - those are all top-of-the-line procs! You can't say procs don't make a difference without a 4930K or 2600K or an FX-6300 in the mix. Plus you are most likely testing games that were coded to run on an Xbox or PS4, where the game engine is designed to run on Jaguar cores anyway. Then you're saying, well, the biggest difference was minimum frames. Yeah, because the games probably felt like they were stuttering and frametimes were all over the place, especially when lots of stuff was going on at the same time. Wait till some of the next-gen games drop and then start telling me your little puny Intel quad is gonna keep up with the big boys. It really just upsets me to see people recommending that others get bottom-end, minimum-spec processors for gaming rigs, which in turn will turn people off of PC gaming in the future because their rigs run like shit when new games come out. I mean, yeah, if all you want to do is play Counter-Strike: Source, then go ahead and build you an i7 920 rig and be happy. But look where you're posting... this is [H]ardOCP, son! GO [H]ard or GO {h}ome. We ain't about a bunch of panzy @$$ Pentiums round here! And think about it: if you build a system with a solid enough backbone, you can always keep it going a few years and just upgrade the GPU instead of having to build a whole new rig. So keep that in mind too.

p.s. Funny the guy with the 10900K is telling everyone else to get a weaker proc. That's some sewer tactics, bro.

And kids, really, if you want to make a system last and you are on a budget... you don't need 4K on a little dinky monitor that's sitting 3ft from your face... but that's a whole nother topic. But if you got a few grand to blow on a new rig every year, more power to ya. Just trying to look out for the working man.
 
But you are testing the 9900K, 10900K, and 3900X - those are all top-of-the-line procs! You can't say procs don't make a difference without a 4930K or 2600K or an FX-6300 in the mix. Plus you are most likely testing games that were coded to run on an Xbox or PS4, where the game engine is designed to run on Jaguar cores anyway. Then you're saying, well, the biggest difference was minimum frames. Yeah, because the games probably felt like they were stuttering and frametimes were all over the place, especially when lots of stuff was going on at the same time. Wait till some of the next-gen games drop and then start telling me your little puny Intel quad is gonna keep up with the big boys. It really just upsets me to see people recommending that others get bottom-end, minimum-spec processors for gaming rigs, which in turn will turn people off of PC gaming in the future because their rigs run like shit when new games come out. I mean, yeah, if all you want to do is play Counter-Strike: Source, then go ahead and build you an i7 920 rig and be happy. But look where you're posting... this is [H]ardOCP, son! GO [H]ard or GO {h}ome. We ain't about a bunch of panzy @$$ Pentiums round here! And think about it: if you build a system with a solid enough backbone, you can always keep it going a few years and just upgrade the GPU instead of having to build a whole new rig. So keep that in mind too.

p.s. Funny the guy with the 10900K is telling everyone else to get a weaker proc. That's some sewer tactics, bro.

And kids, really, if you want to make a system last and you are on a budget... you don't need 4K on a little dinky monitor that's sitting 3ft from your face... but that's a whole nother topic. But if you got a few grand to blow on a new rig every year, more power to ya. Just trying to look out for the working man.

Are you kidding me? Did you even read what I wrote? I am the one who said processors matter at 4K when someone else said they didn't. I gave examples as well, and my review data backs up that claim. I did agree that in certain cases CPU differences weren't massive at 4K, but again, I pointed out the difference between a title that's not entirely GPU limited at 4K and one that was. I also said that if you only have a limited budget, you are better off putting that money into the GPU over the CPU. Even so, I stated very clearly that the processor is still important and that a top-end processor is worth having if you can afford it.

And if you aren't at 4K, or at least close to it (3440x1440, for example), you will be at more CPU-dependent resolutions. If you're aiming for higher refresh rates, then a higher-end CPU is what you need.

Let me be clear. I reviewed a Core i9-10900K. I have one on the test bench for reviewing motherboards and CPUs; I never said I was running one in my own system. I also never told anyone to buy a weaker processor, and I don't know where you got that from. Clearly, you either didn't read my post or didn't understand it. Please don't talk to me about where we are posting or about being [H]ard. I'm well aware, as I have over 50,000 posts here and wrote reviews for HardOCP for 15 years. My personal machines are also built with little regard for price/performance ratio.
 
But you are testing the 9900K, 10900K, and 3900X - those are all top-of-the-line procs! You can't say procs don't make a difference without a 4930K or 2600K or an FX-6300 in the mix. Plus you are most likely testing games that were coded to run on an Xbox or PS4, where the game engine is designed to run on Jaguar cores anyway. Then you're saying, well, the biggest difference was minimum frames. Yeah, because the games probably felt like they were stuttering and frametimes were all over the place, especially when lots of stuff was going on at the same time. Wait till some of the next-gen games drop and then start telling me your little puny Intel quad is gonna keep up with the big boys. It really just upsets me to see people recommending that others get bottom-end, minimum-spec processors for gaming rigs, which in turn will turn people off of PC gaming in the future because their rigs run like shit when new games come out. I mean, yeah, if all you want to do is play Counter-Strike: Source, then go ahead and build you an i7 920 rig and be happy. But look where you're posting... this is [H]ardOCP, son! GO [H]ard or GO {h}ome. We ain't about a bunch of panzy @$$ Pentiums round here! And think about it: if you build a system with a solid enough backbone, you can always keep it going a few years and just upgrade the GPU instead of having to build a whole new rig. So keep that in mind too.

p.s. Funny the guy with the 10900K is telling everyone else to get a weaker proc. That's some sewer tactics, bro.

And kids, really, if you want to make a system last and you are on a budget... you don't need 4K on a little dinky monitor that's sitting 3ft from your face... but that's a whole nother topic. But if you got a few grand to blow on a new rig every year, more power to ya. Just trying to look out for the working man.

*Sips from can of Bud Light* Yep, you tell 'em, bro - either you're [H]ardcore or you're just one of those loser fakers.
 

Liked the convo way back. Here is a recent YT video from Hardware Unboxed that sheds some light on the subject.
At 4K, it seems even first-gen Zen holds its own, even with 1% lows. This is with a top-of-the-line GPU.
Thoughts?
 

Liked the convo way back. Here is a recent YT video from Hardware Unboxed that sheds some light on the subject.
At 4K, it seems even first-gen Zen holds its own, even with 1% lows. This is with a top-of-the-line GPU.
Thoughts?

I'll watch that sometime tomorrow, but from my own personal experience moving from the 1700 to the 3900X, and with my particular game set (which is usually more focused on visual quality than PURE FRAMES), the difference was fucking huge.

I also game almost entirely at 1440p in the games where framerate matters at all, due to GPU bottlenecks with my 1080 Ti. I don't have a way to test 4K without GPU bottlenecks except in a VERY few heavy simulation workloads (such as Planet Coaster/Zoo, Civ VI, etc.).
 
I was hoping they would throw in a few 4K results to round it out, but they did not, as the whole point was isolating the CPU.

I've done some testing at 4K in various games. There are some differences at 4K, but they are small. Typically, you see the differences in minimum frame rates and 1% lows, and usually only in the most egregious cases. For example, the average and maximum FPS would favor a CPU like the Threadripper 2920X, but in reality it provides a pretty shit experience at 4K in some games. Destiny 2 is a good example of that. The minimum FPS fell to 26 FPS (all-core overclock) and 36 FPS (PBO) on the Threadripper 2920X. However, the 9900K never dropped below 54 FPS (stock) and 56 FPS (5.0GHz overclock). The averages of the 2920X were in the same range as the other CPUs I tested it against, with maximums that favored the 2920X even over the 9900K.

However, the minimums were appalling by comparison. Despite what the numbers seem to show, the 2920X couldn't provide an acceptable gaming experience at 4K in Destiny 2 with an RTX 2080 Ti; the 9900K and the 3900X can. However, in Ghost Recon Breakpoint the difference between the best and worst CPUs I tested is only about 3 or 4 FPS. If you compare disparate architectures that are several generations apart, you are likely to see differences. HEDT vs. mainstream CPUs can also make an impact given differences in memory and, sometimes, latencies inherent to those platforms. The early Threadrippers were notorious for this, as they tended to perform badly in games compared to standard Ryzen parts due to their CCX/CCD layout and NUMA memory architecture.
 
I'm able to run Doom Eternal at over 120 FPS at 4K on my Threadripper 2950X.

Or at least I was when I had my 2080 Ti.

I have a 4K 120Hz monitor, so I had adaptive resolution set to target 120, and though the resolution dropped, the frames never did.
 
I'm able to run Doom Eternal at over 120 FPS at 4K on my Threadripper 2950X.

Or at least I was when I had my 2080 Ti.

I have a 4K 120Hz monitor, so I had adaptive resolution set to target 120, and though the resolution dropped, the frames never did.

As I said, it depends on the game. There were games where my 2920X was perfectly fine; Destiny 2 was NOT one of them. Doom Eternal runs surprisingly well on a potato. Saying it ran at 120 FPS on a Threadripper 2950X at 4K doesn't mean that performance translates to other games that are far more demanding, potentially less optimized, or don't even use Vulkan.
 

Liked the convo way back. Here is a recent YT video from Hardware Unboxed that sheds some light on the subject.
At 4K, it seems even first-gen Zen holds its own, even with 1% lows. This is with a top-of-the-line GPU.
Thoughts?

AMD fine wine.
 
Gamers Nexus did a nice comparison of Zen through the ages, looking at gaming performance at 1080p to see where the real increases are.
They did a great job and I recommend the watch. I was hoping they would throw in a few 4K results to round it out, but they did not, as the whole point was isolating the CPU.
I am searching for benchmarks that make such a comparison, as I want to know how the different generations compare at VR-type resolutions.
When I game in VR, my RT-VR stats show I am always bound by my GPU, never the CPU. I know it is somewhat of a two-way street in that the CPU helps the GPU, but by how much? I would like to know where that cutoff is.
If anyone has a link to anything in this regard, I would be appreciative.
VR is engine, settings, and headset/resolution dependent. iRacing, for example, is extremely dependent on the main render thread in VR for most headsets, but if you turn up the render resolution and AA high enough, it will switch to being GPU limited even with a 3090. MSFS is also extremely limited by CPU single-thread speed. Most Unreal and Unity engine games are GPU limited, so there's no good/easy answer here.
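A crude way to picture that flip from CPU to GPU limited: hold CPU time flat, scale GPU time with pixel count, and sweep the render scale (all numbers are invented; render_scale here is treated as a per-axis multiplier, so pixel count grows with its square):

```python
# Toy model: raising VR render resolution shifts the bottleneck from CPU to GPU.
cpu_ms       = 9.0   # hypothetical main render thread cost per frame (roughly flat)
gpu_ms_at_1x = 6.0   # hypothetical GPU time at the headset's default render target

for render_scale in (1.0, 1.25, 1.5, 1.75, 2.0):
    gpu_ms = gpu_ms_at_1x * render_scale ** 2   # pixels grow with the square of the scale
    bound = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"scale {render_scale:.2f}: CPU {cpu_ms:.1f} ms, GPU {gpu_ms:.1f} ms -> {bound} bound")
```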
 
VR is engine, settings, and headset/resolution dependent. iRacing, for example, is extremely dependent on the main render thread in VR for most headsets, but if you turn up the render resolution and AA high enough, it will switch to being GPU limited even with a 3090. MSFS is also extremely limited by CPU single-thread speed. Most Unreal and Unity engine games are GPU limited, so there's no good/easy answer here.
I updated the link to a new article they did. Check it out and reply if you like.
Not sure why my update to the thread has most everyone commenting on ancient news, other than that you all are starting to read from the beginning. Please watch the last YT link I gave and then add your 2 cents, ya :)
 