AMD Ryzen 7 Real World Gaming

I'd like to see some benchmarks while streaming with OBS. AMD pushed this angle, and nearly every review I read that touched on Ryzen's gaming performance mentioned that streaming should be one of its strengths, but didn't actually test it.

I know it's hard to bench, but I wouldn't mind seeing something like World of Warcraft as well. I realize that it's incredibly poorly threaded, but it's still a super popular game that (in certain circumstances) can still be pretty demanding. Wouldn't mind seeing just how much ground is being given up there.
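On the OBS streaming point above, for anyone wanting to quantify the headroom themselves, here is a minimal sketch of the idea, assuming Python with the psutil package (the helper name is mine, not from any review): sample per-core load while the game and an OBS x264 encode are both running, and see how much each core actually has left.

```python
# Hypothetical helper: sample per-core CPU load during a gaming + streaming
# session to gauge how much headroom remains for an OBS x264 encode.
# Assumes the psutil package is installed (pip install psutil).
import time
import psutil

def sample_core_load(duration_s=60, interval_s=1.0):
    """Collect per-core utilization samples over a play session."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        # percpu=True returns one utilization percentage per logical core
        samples.append(psutil.cpu_percent(interval=interval_s, percpu=True))
    return samples

if __name__ == "__main__":
    data = sample_core_load(duration_s=30)
    # Transpose samples so each row is one core's history, then average
    per_core_avg = [sum(core) / len(data) for core in zip(*data)]
    for i, avg in enumerate(per_core_avg):
        print(f"core {i}: {avg:.1f}% average load")
```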
 
This is what we settled on.

2600K @ 4.5GHz / 7700K @ 5.0GHz / Ryzen 7 @ 4.0GHz

Cards/Resolution
RX 480 at 1080p
RX 480 CrossFire at 1080p and 1440p
GTX 1060 at 1080p
GTX 1080 at 1440p
GTX 1080 Ti at 4K

Games
Watch Dogs 2 DX11
DOOM Vulkan
Fallout 4 DX11
Mass Effect DX11
Sniper Elite 4 DX12
Witcher 3 DX11
BF1 Multiplayer DX11
The Division DX12
GTAV DX11
Gears of War 4 DX12

All graphs will be shown per resolution, with the three systems compared apples to apples. For the "least" strong cards, gameplay will focus on video IQ settings that shoot for an average of 60 FPS.
 
Good selection, though I'd have liked to have seen ARMA 3. It's an older game, but it's constantly updated and one of the last true PC-only games that allows full modding.
 
Good selection, though I'd have liked to have seen ARMA 3. It's an older game, but it's constantly updated and one of the last true PC-only games that allows full modding.

Yeah, but it's basically single-threaded, and we both know who would win that fight when it comes to IPC.
 
Good selection, though I'd have liked to have seen ARMA 3. It's an older game, but it's constantly updated and one of the last true PC-only games that allows full modding.
Remember, these are real-world gaming runthroughs, not canned benchmarks that run in seconds. Our Mass Effect runthrough is 14 minutes per run.

Without ANY double-checking, and if EVERY runthrough goes PERFECTLY the first time, that represents 180 real-world gaming runthroughs (3 CPUs × 6 card/resolution configurations × 10 games). And of course we are going to find places where we need to double- or triple-check data, and runthroughs that get FUBAR for one reason or another. I figure it will take two solid weeks of work by Brent, likely longer.
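As a back-of-the-envelope sketch of what that workload looks like, treating the 14-minute Mass Effect run quoted above as a rough average for every game (that average is an assumption):

```python
# Back-of-the-envelope estimate of total bench time for the test matrix.
# The 14 min/run figure comes from the Mass Effect runthrough quoted above;
# treating it as the average for every game is an assumption.
cpus = 3             # 2600K, 7700K, Ryzen 7
configs = 6          # card/resolution combinations listed above
games = 10           # game list above
minutes_per_run = 14 # assumed average runthrough length

runs = cpus * configs * games
hours = runs * minutes_per_run / 60
print(f"{runs} runthroughs, ~{hours:.0f} hours of pure gameplay capture")
# -> 180 runthroughs, ~42 hours -- before any re-runs or setup time
```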
 
Am I the only one who still plays DOOM (2016)? Not the MP garbage, the amazing SP arcade mode.
 
How about Zork? I think it's an overlooked game when it comes to GPU benchmarks.
 
Ah, so sad. I was expecting to see a high-performance GPU at 1080p and not just a GTX 1060/RX 480... I was expecting at least a GTX 1070 or GTX 1080 at 1080p, to see a true CPU bottleneck in games at real settings.

Basically everything tested will be GPU limited with those cards at those selected, segmented resolutions if it's really planned to run at "Real-World" settings. I can almost see the conclusion already: "all three processors reviewed performed almost exactly the same, with basically no gaming difference, buy whatever you like." Not really good, honestly, as everything, with maybe the exception of Fallout 4, will be GPU limited at those picked resolutions. I don't see the point of testing 4K in CPU reviews if someone is looking for CPU bottlenecks.

There is more to a game than just draw call throughput. Turning up effects (draw distance, shadows, AI density, physics) can stress the CPU in different ways.
 
There is more to a game than just draw call throughput. Turning up effects (draw distance, shadows, AI density, physics) can stress the CPU in different ways.

Yeah, I know that very well. In fact, it's something I always jump in to say in basically every [H] CPU review: that they need to test games in the real world, with real settings, not just 640x480 with low settings, as there are settings that can affect CPU performance far more than resolution alone (just to mention a few examples I gave here, here, here, or here). But in this case, choosing weak GPUs at 1080p to test CPU performance is not a good idea IMHO, when they are going to be easily GPU limited.

Well, anyway, that review lost my interest, as I'm one of those apparently rare guys who plays at 1080p at high FPS. I was just saying it was pointless to do that segmentation of resolutions for testing CPU performance, especially without adding a comparable Intel 8C/16T to the mixture.
 
Battlefield 1 (MP), Witcher 3 (expansion), Mass Effect: Andromeda, The Division (DX12), Sniper Elite 4, GTA V, Civilization 6 (very CPU heavy), Tom Clancy's Ghost Recon

Please, no more Fallout 4, or Watch Dogs 2, or AOTS...

P.S. Any way you can do memory speeds when it comes to gaming? I'm really interested in whether it scales like some of those questionable YouTube videos say.

I like this list.

As for memory speeds, it's true that there is some scaling with certain CPU architectures, but not all of them. We saw Skylake scale well with RAM speeds, but Haswell-E, Broadwell, and Broadwell-E did not. I don't know if Ryzen does or not.
 
Great selection and all the games I asked for are on it + some.
And wasn't that 2600K able to do 5GHz?

We had some 2500K's that could do 5.0GHz back in the day but I don't think we have any of those anymore. I know I don't. I don't know if we ever had a 2600K that could do more than 4.8GHz or 4.9GHz.
 
Yeah, I know that very well. In fact, it's something I always jump in to say in basically every [H] CPU review: that they need to test games in the real world, with real settings, not just 640x480 with low settings, as there are settings that can affect CPU performance far more than resolution alone (just to mention a few examples I gave here, here, here, or here). But in this case, choosing weak GPUs at 1080p to test CPU performance is not a good idea IMHO, when they are going to be easily GPU limited.

Well, anyway, that review lost my interest, as I'm one of those apparently rare guys who plays at 1080p at high FPS. I was just saying it was pointless to do that segmentation of resolutions for testing CPU performance, especially without adding a comparable Intel 8C/16T to the mixture.

Remember, this is a real-world gaming evaluation on each platform, not a specific CPU test. Kyle has done plenty of CPU-specific tests. This evaluation will determine what the real-world gaming advantage is at the real resolutions and settings gamers actually play at. That said, by targeting 60 FPS as the average framerate for playability, we are still keeping things somewhat CPU oriented instead of strictly down-and-gritty GPU grinding. I also think 60 FPS is a perfect target, since more and more gamers are opting for faster framerates on high refresh rate displays. At 60 FPS you know the game is very smooth and playable.
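As a quick aside on what those FPS targets mean in frame-time terms (simple arithmetic, not from the post above):

```python
# Per-frame time budget at common FPS targets: budget_ms = 1000 / fps
for fps in (60, 100, 144):
    print(f"{fps} FPS average -> {1000 / fps:.1f} ms per frame budget")
# 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms, 144 FPS -> 6.9 ms
```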
 
Delighted that real-world CPUs and GPUs are being used. I'm sick of reviews that mainly use Titan-level cards and very high-end Intel chips. It's like reading reviews of Porsche and Ferrari supercars that 99% of us will never own or drive. Sure, it's nice to know where the bleeding edge is, but my old Escort of a rig will have to do me for a while yet. It's rumoured that an RX 480 8GB is on the way for my birthday; hope my 6600 can keep up, but I doubt it. Still, a good excuse to go looking for a second-hand 4790K if I can prise one away from someone.
 
Remember, this is a real-world gaming evaluation on each platform, not a specific CPU test. Kyle has done plenty of CPU-specific tests. This evaluation will determine what the real-world gaming advantage is at the real resolutions and settings gamers actually play at. That said, by targeting 60 FPS as the average framerate for playability, we are still keeping things somewhat CPU oriented instead of strictly down-and-gritty GPU grinding. I also think 60 FPS is a perfect target, since more and more gamers are opting for faster framerates on high refresh rate displays. At 60 FPS you know the game is very smooth and playable.

Thank you, Brent, I always appreciate your replies. I understand what you're saying, but it's pretty interesting that you mention targeting 60 FPS for gamers who use high refresh rate displays. Why target 60 FPS with reduced and/or playable settings when you can aim for the highest FPS possible with the maximum settings possible? That's what we high refresh rate gamers aim for: not just over 100 FPS, but that number with the maximum possible settings, which in some games can't be achieved even at 1080p with a Titan X Pascal/1080 Ti. It's not only about targeting 60 FPS as an average; I know more conservative high refresh rate users look to keep MINIMUMS over 80 FPS with averages over 100 as much as possible.

Again, using weak cards at 1080p with the highest possible settings is far from a CPU-oriented test. The true test for a CPU at 1080p is whether it can keep minimums as high as possible to deliver a smooth experience while driving maxed-out settings. That's the comparison people really want to see in CPU gaming tests: frametime stability over pure average FPS.
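To make that frametime-stability point concrete, here is a minimal sketch, assuming a per-frame frametime log in milliseconds such as FRAPS/OCAT-style tools export (the helper name and the sample data are made up): two runs can share the same average FPS while their 1% lows differ wildly.

```python
# Minimal sketch: average FPS vs. 1% lows from a frametime log.
# Assumes a list of per-frame render times in milliseconds, e.g. from
# a FRAPS/OCAT-style capture; the sample data below is invented.
def summarize(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # 1% low: average FPS computed over only the slowest 1% of frames
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Two hypothetical runs with the SAME average FPS but different stutter:
steady = [10.0] * 1000                 # a flat 10 ms every frame
stuttery = [9.0] * 990 + [109.0] * 10  # mostly fast, periodic big spikes

for name, run in (("steady", steady), ("stuttery", stuttery)):
    avg, low = summarize(run)
    print(f"{name}: {avg:.0f} FPS average, {low:.0f} FPS 1% low")
# steady:   100 FPS average, 100 FPS 1% low
# stuttery: 100 FPS average,   9 FPS 1% low -- same average, far worse feel
```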
 
Nevermind. I answered my own question.

The issue I think stems back to this idea of "real world."

We all like protein shakes, but the TRUE benchmark is whether it's powerful enough to blend a Samsung Galaxy Note or an iPhone.

In other words, is it fair to discount a great game that is super popular right now and that everyone is playing, just because it happens to be coded properly or, dare I say, runs smoothly on next- and last-gen systems? Or do we need to focus on games that tax your system, whether that's from chip/gfx optimizations or just plain bad coding, but that no one really plays anymore? *cough*GTAV*cough*

Truth is, even OW under super heavy dogfights can perform well or badly depending on the system config you're using, so I think it's valid to put it on the list.

EDIT: lol i typed it all up before you edited so I'ma just post anyway lol
 
I know you have settled on the hardware, but personally I would have liked to see an Intel 8-core in the test, like the 5960X or 6900K.
 
We had some 2500K's that could do 5.0GHz back in the day but I don't think we have any of those anymore. I know I don't. I don't know if we ever had a 2600K that could do more than 4.8GHz or 4.9GHz.

Hmmm... yeah, I was more than likely thinking of those 2500Ks, thanks for clearing that up.
 
What motherboard and RAM combination are we looking at? I assume some 2933+ MHz sticks? I don't want to make this into a "please use this platform" thing; honestly just curious.
 
Sigh, no Overwatch. I wonder if watching a replay represents real-world performance or not.

I'm sure it'll be informative, though


It's okay, I wanted Overwatch too, just to see how Ryzen does, but I am averaging 200+ FPS on my old Xeon @ 4GHz anyway.
 
It's okay, I wanted Overwatch too, just to see how Ryzen does, but I am averaging 200+ FPS on my old Xeon @ 4GHz anyway.
I was interested in 1440p 120/144Hz performance for OW, but I guess I'll just have to save some pennies, buy a 1080 Ti and a new 1440p 144Hz monitor, and see how things go :)
 
Remember, this is a real-world gaming evaluation on each platform, not a specific CPU test. Kyle has done plenty of CPU-specific tests. This evaluation will determine what the real-world gaming advantage is at the real resolutions and settings gamers actually play at. That said, by targeting 60 FPS as the average framerate for playability, we are still keeping things somewhat CPU oriented instead of strictly down-and-gritty GPU grinding. I also think 60 FPS is a perfect target, since more and more gamers are opting for faster framerates on high refresh rate displays. At 60 FPS you know the game is very smooth and playable.

With such modern titles, is it possible to replace the 2600K with a 6C/12T or 8C/16T CPU, whether Haswell or newer?
It is becoming much more relevant to see higher-core-count Intel CPUs in such gaming tests, IMO.

Thanks
 
With such modern titles, is it possible to replace the 2600K with a 6C/12T or 8C/16T CPU, whether Haswell or newer?
It is becoming much more relevant to see higher-core-count Intel CPUs in such gaming tests, IMO.

Thanks

I think both ought to be on there. A lot of folks are sitting on old 2600K rigs (I was until *very* recently), trying to determine whether or not the new offerings from both Intel and AMD are worth jumping for. Keeping the 2600K on the list is a good basis for comparison for those folks. But your point is also taken -- I'd like to see a 6- or 8-core Intel in the mix too. Although, damn... the [H] folks have already got a boatload of work lined up!
 
I think both ought to be on there. A lot of folks are sitting on old 2600K rigs (I was until *very* recently), trying to determine whether or not the new offerings from both Intel and AMD are worth jumping for. Keeping the 2600K on the list is a good basis for comparison for those folks. But your point is also taken -- I'd like to see a 6- or 8-core Intel in the mix too. Although, damn... the [H] folks have already got a boatload of work lined up!

It is probably way too much work to run three CPUs, so if one has to give, it would make more sense to look to the current high end and the future and use a 6C or 8C Intel CPU rather than the 2600K. IMO, anyway, as some of these modern games are probably being limited by 4C/8T CPUs.
I think Ryzen needs to be put to the side for now, as it has plenty of updates still coming. Maybe do a follow-up with Ryzen later on, as it would need to be revisited after all of those updates/changes anyway (I doubt Kyle/Brent want to repeat this multiple times for the same CPU platform, as it is a serious amount of work and time).

As a reference, PCGH showed Sniper Elite 4 working better with six real cores than with 4C/8T.
Cheers
 
With such modern titles, is it possible to replace the 2600K with a 6C/12T or 8C/16T CPU, whether Haswell or newer?
It is becoming much more relevant to see higher-core-count Intel CPUs in such gaming tests, IMO.

Thanks
Thanks for the input. I am not changing the hardware.

I think both ought to be on there. A lot of folks are sitting on old 2600K rigs (I was until *very* recently), trying to determine whether or not the new offerings from both Intel and AMD are worth jumping for. Keeping the 2600K on the list is a good basis for comparison for those folks. But your point is also taken -- I'd like to see a 6- or 8-core Intel in the mix too. Although, damn... the [H] folks have already got a boatload of work lined up!
I am good with the hardware. I don't see anyone ditching 6C+ Intel CPUs that clock well for Ryzen at this time, so I do not think that is truly needed in the comparison.
 
Maybe next time you guys can use Civ 6. It really is CPU heavy. Would love to see if it scales with 16 threads!
 
Hey Kyle, could you also try benchmarking the 2600K with hyperthreading disabled to simulate a 2500K? There are still a lot of us 2500K users around who deep down want to upgrade but are unsure how well Ryzen compares.
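The proper way to do that is to disable HT in the BIOS, but for anyone who wants a rough software-side approximation at home, here is a sketch assuming Python with psutil, and assuming logical siblings are enumerated pairwise (0/1, 2/3, ...), which is common on Windows but not guaranteed. Pinning a game to one logical core per physical core does not remove every HT effect, so treat it as an approximation only.

```python
# Rough approximation of "HT off": pin a process to one logical core
# per physical core. Assumes psutil, and that logical siblings are
# enumerated pairwise (0/1, 2/3, ...), which is common but not guaranteed.
import os
import psutil

def pin_to_physical_cores(pid):
    physical = psutil.cpu_count(logical=False)  # e.g. 4 on a 2600K
    logical = psutil.cpu_count(logical=True)    # e.g. 8 with HT on
    step = logical // physical                  # 2 when HT is enabled
    cores = list(range(0, logical, step))       # [0, 2, 4, 6]
    psutil.Process(pid).cpu_affinity(cores)     # restrict OS scheduling
    return cores

# Example: pin the current process (substitute the game's PID instead)
print("pinned to cores:", pin_to_physical_cores(os.getpid()))
```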
 
600GB+ of fresh game download installs later.....

:p
Lucky you! lol, man, you must have the most fun job in this universe :rolleyes:

With the Ryzen system, please state what power plan is being used, as in AMD Balanced or High Performance. Not sure if that really matters with a straight multiplier clock and manual voltage, but it should help keep you from being asked about it over and over in the discussions ;).

Memory at 3200+ would be nice, even though 2933 is really not that much slower in the scheme of things. I am able to get up to 3500 speeds using a BCLK of 109.4, RealBench stable, and it gives some rather nice memory benchmark numbers; the 1070s have no problem with the increased clock speed either at PCIe Gen 3. At a BCLK of 113 for 3600 memory, Windows does not boot (cheap Intel PCIe drive, even with PCIe Gen 2 settings); 960 EVO owners are going past 120MHz BCLK with PCIe Gen 3. Probably a separate article dealing with memory bandwidth and gaming would be better in the long run.

Good luck and have some great fun even though it is work.
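For context on those BCLK numbers above, the effective memory speed is just the memory strap scaled by the BCLK against its 100MHz reference. A quick sketch of that relationship (the 3200 strap is an assumption inferred from the speeds quoted):

```python
# Effective memory speed = memory strap x (BCLK / 100 MHz reference).
# The 3200 strap is an assumption inferred from the speeds quoted above.
def effective_mem_speed(strap_mt_s, bclk_mhz, ref_mhz=100.0):
    return strap_mt_s * (bclk_mhz / ref_mhz)

print(effective_mem_speed(3200, 109.4))  # ~3500 MT/s, as quoted
print(effective_mem_speed(3200, 113.0))  # ~3616 MT/s, the failed 3600 attempt
```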
 
With the Ryzen system, please state what power plan is being used, as in AMD Balanced or High Performance. Not sure if that really matters with a straight multiplier clock and manual voltage, but it should help keep you from being asked about it over and over in the discussions ;).
We use High Perf profile for ALL content on HardOCP. Can't exactly do benchmarks if you are letting the software control clocks etc.
 
Paul's Hardware just did a video on 2933 vs 3200, and the best improvement was in AotS at ~7%; everything else was lower or nothing at all.

600GB+ of fresh game download installs later.....

:p
You got the fat pipe though, right?! So it shouldn't be too bad, but you should save 'em for next time.
 