AMD Ryzen 7 Real World Gaming

Paul's Hardware just did a video on 2933 vs 3200, and the best improvement was in AotS at ~7%. Everything else was lower or showed nothing at all.

You got the fat pipe though, right?! So it shouldn't be too bad, but you should save 'em for next time.

1Gbps package, baby, love fiber. The kicker: our ISP supports even higher, 10Gbps, but it's much more expensive. 1Gbps is only $12 more than the 100Mbps package, and it's all symmetrical, same down/up speeds.
 

It did scale with games that were not a nightmare for Nvidia under DX12 (Hitman and Deus Ex: Mankind Divided - so something about those games even in DX11 for Nvidia on Ryzen), or, as he shows, a couple are GPU bound even with the 1080 Ti at 1080p (Battlefield 1/Mass Effect: Andromeda).
Mafia 3 scaled by 11.5%
Watch Dogs 2 by 5.8% (could be GPU bound at that point)
AotS by 5.1%.

The games used really do matter, as does overcoming the GPU bottleneck while still using 1080p; 720p brings in a different set of GPU bottlenecks, in the same way 4K does at the other end of the spectrum, although 4K does have relevance for many gamers these days.
Cheers
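For anyone wanting to reproduce those scaling numbers from raw fps figures, the percentages are just relative deltas between the two memory speeds; a minimal sketch (the fps pair below is hypothetical, not the video's actual data):

```python
def scaling_pct(fps_slow: float, fps_fast: float) -> float:
    """Percent fps improvement going from the slower to the faster RAM speed."""
    return (fps_fast - fps_slow) / fps_slow * 100.0

# Hypothetical example: 87 fps at 2933 vs 97 fps at 3200
print(round(scaling_pct(87, 97), 1))  # prints 11.5
```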
 
I had a 1Gbps connection at my old house. I could download a 50GB game in about 17 minutes on average. It was awesome. Unfortunately, where I'm at now they only offer up to 500Mbps, but they charge nearly twice as much for half the speed. There are no other options for that kind of speed in my area. You get Spectrum Cable or Frontier FIOS. That's it. So right now I'm stuck at 300Mbps for about what I used to pay for 1Gbps from AT&T U-Verse. It sucks ass.
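As a sanity check on those numbers: 50GB at a full 1Gbps line rate would take under 7 minutes, so a ~17-minute average implies the bottleneck was server/protocol throughput rather than the link itself (the 40% efficiency figure below is an assumption picked to match, not a measurement):

```python
def download_minutes(size_gb: float, link_gbps: float, efficiency: float = 1.0) -> float:
    """Minutes to move size_gb gigabytes over a link_gbps link at the given efficiency."""
    gigabits = size_gb * 8.0
    seconds = gigabits / (link_gbps * efficiency)
    return seconds / 60.0

print(round(download_minutes(50, 1.0), 1))       # prints 6.7 (theoretical best case)
print(round(download_minutes(50, 1.0, 0.4), 1))  # prints 16.7 (~40% effective throughput)
```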
 
The issue I have with these comparisons is that the settings are lower than what I, or most of us here, would play at. For example, in GTA V Paul's Hardware used 2x MSAA. At that resolution, 1080p, I would use at least 4x MSAA if not 8x MSAA, plus most of the Advanced Graphics settings as well. In that case I doubt there would really be any difference between the two memory speeds. AotS I just do not play. Up the resolution to what most of us actually play at and the RAM speed probably makes even less of an impact. Yes, memory speeds can have an impact, but if you max out your gaming experience there is probably very little difference. Of course we will probably see endless arguments if Ryzen is not as good as an Intel chip because of RAM speed :facepalm: So maxing out Ryzen's memory to begin with may just help alleviate an avalanche of pointless arguments.

One just needs to be aware, and maybe smart enough to realize, that monitor refresh rates can also dictate whether you need faster RAM, faster processors, etc. If one is gaming on a 60Hz 1080p monitor, then an i5, Ryzen 1400, etc. will probably give just as good an experience with that GTX 1060 or GTX 1070 as an Intel 6950X would. Between getting 200 fps or a constant 60 fps on a 60Hz monitor, the 60 fps would actually be higher quality, with none of the tearing from the higher frame rates. At higher resolutions, GPU horsepower becomes more important than CPU power or RAM speed. Yes, you can show a difference on a graph, but for real playability, maxing out the experience, HardOCP is going to show whether there is any real difference.
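The refresh-rate point boils down to frame-time budgets; a quick sketch of the math:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds the whole system has to deliver each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# A 60Hz panel only ever shows a new frame every ~16.7ms, so fps beyond 60
# is discarded (or tears); a 144Hz panel tightens the budget to ~6.9ms.
for hz in (60, 144):
    print(f"{hz}Hz -> {frame_budget_ms(hz):.1f}ms per frame")
```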
 
I know someone mentioned this earlier in the thread, but one of the real tests really should be a game live stream. There have been one or two YouTube videos about this already, and it was one of the selling points of Ryzen put forward by AMD.

Clearly it doesn't have to be every game... I'd nominate BF1 for this... but then I'm a Battlefield fan.

Looking forward to seeing which gives the consistently smoothest gaming experience, given that at least 2 of the 3 should return more than acceptable fps.
 

Yeah, it is a bit of a conundrum.
Higher settings and he would have seriously bottlenecked the GPUs (several of those games were already, even with the 1080 Ti), while a lower resolution like 720p brings in a different set of GPU bottlenecks that you would not experience at 1080p and 1440p.
Not sure what the answer is to this dilemma, tbh.
Cheers
 

Anyone who takes streaming seriously has a rig dedicated just to streaming.

And that has been the norm for years. The right tools for the right job.
 
If testing Fallout 4, may I request that the INI be edited to change the threading to 16? The default is 7 for the primary and Havok physics threads (since 7 cores is what console devs have access to), and unfortunately I haven't gotten my system fleshed out enough to have FO4 set up to run my own tests. However, I feel making those changes will at the very least give Ryzen the best possible chance at being fully utilized.
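For reference, the tweak being requested lives in Fallout4.ini; the section and key names below are the ones commonly passed around in community modding guides, not something verified here, so check them against your own INI before assuming they still apply:

```ini
; Fallout4.ini -- threading overrides (key names from community guides, unverified)
[General]
iNumHWThreads=16

[HAVOK]
iNumThreads=16
```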

Games: anything by Codemasters, as they've LONG had multi-core profiles baked into their games to better utilize high-core-count systems!
I found it rather impressive that even GRiD 1 had an 8C profile way back in the day (likely for 4C/8T Intel chips, but maybe SMP machines). Sadly, making your own profiles for these is very labor intensive, and can even cause Steam to throw a fit due to the XML files being changed, but it'd be interesting to see how well they scale with a 16C profile! :D I know it helped out a LITTLE bit when I modified the 4C profile for my Carrizo laptop to shove the main threads onto separate CMT modules. It's hard to say how much it really helped when the chip is thermally limited to 15W, shared between CPU and GPU.
Anyway, I suspect something from the F1 series will probably be the latest available, as well as having the most updated engine.

Aside from that, everyone seems to have already listed everything I'd be interested in, such as Sniper Elite 4.
I also second the inclusion of Crysis 1, if for nothing else than "for the lawls"!
 
I'm aware of that.

AMD, though, are touting Ryzen as capable of doing that job on one chip. Worth putting to the [H] test, don't you think?

Definitely. No one wants to buy another PC solely for streaming; that's just wasted cash, imo.
 
Not many reviewers have done 4K on SLI yet. Pick games that scale with SLI and compare.
 
I recently built a 7700K system with a GTX 1080; it sliced through Wildlands and Siege (my two most played games) like a knife through butter. Then over the weekend I built a Ryzen 1600 system for a friend, using the Gigabyte K7 board paired with an RX 480 8GB Strix card. My settings in both games are maxed, so I maxed out the settings on his new system too. I can't tell a difference. I'm sure the 7700K system gets higher fps, but you wouldn't be able to tell during gameplay, or at least I can't. So my real-world gameplay take is that unless there is an fps counter running, you would be hard pressed to see a difference when running on a Ryzen setup.
 
1080p 60 Hz?
 
Yeah, I'm still on 1080p/60. At 1080p I'm aware there is going to be lower fps on the Ryzen setup, but I couldn't see any difference, and obviously that's because both systems are pushing way past 60 fps. If I were running a 144Hz monitor, then yes, I'm sure the 7700K would make a difference in keeping the fps higher. And again, since I'm running a GTX 1080 and he an RX 480, I know that moving up in resolution there will also be a significant difference in fps; however, is it one that's going to severely impact gameplay at maxed settings? If I'm getting 120 fps and he's only getting 60, we are both still having a great gaming experience.
Trying to keep the fps up for 144Hz and 1440p, I understand things might change, and that's where one needs to pick the right hardware that'll keep the fps high, but overall I think Ryzen gives a great gaming experience that can match Intel's platform.
 
World of Warcraft :whistle:
Remember when it was a game that HardOCP had in their reviews? I remember. :D
 

You might want to check JayzTwoCents' YouTube channel; he did a 30-day test review of the Ryzen 1800X where he was also streaming Wildlands on the chip, encoding at 1080p/60fps @ 10Mbit.
 
I really don't get how people can say they are into this hobby and continue to not understand this stuff.

Short version: significantly GPU-limited review.

Even if you upgrade in pieces, the review does a great disservice to what to expect when you upgrade from your 2.5-year-old video card.

Most people are running a 970 or lesser card, and at 1080p. So for the majority, what they posted was very true and realistic. Very few are running a 1080 or better; once you go beyond 400 bucks for a video card, sales drop off dramatically. They were doing a gaming-use scenario with realistic settings someone might actually use, not a let's-see-who-gets-the-maximum-frames run at settings no one will use. That was done to death at launch. I like real-world use far more than synthetic benching where I use my machine in a way I never would. The only thing you need to take away is that both work well with games right now, but the 1700X has more cores for other things as well.
 
I personally believe AMD has overpriced the 1700X and 1800X; prices should come down as soon as Intel launches its counter. I will be waiting to hopefully get an 1800X for less than $400 and buy an RX Vega Nano :)
 
Buy a 1700 and overclock. Most are practically guaranteed 3.8GHz chips.
 
Wow. Can't wait to read this. Any ideas when this will be up?
Still working on it. Think about this... we are talking about 180 real-world gaming runthroughs, and that is WITHOUT doing any verification of results and without any problems during those runthroughs. And I can tell you that there will be verification and there will be problems. I would suggest that you are looking at a minimum of 30 full hours of actual gameplay if everything is PERFECT. That does not count making sure the data from it is collected properly, analyzed, put onto graphs, changing hardware, etc. From the start we figured 2 to 3 weeks for production of this single article... and the 580 launch was dropped in the middle of that. :)
 
Which makes his post even more relevant: the 1700X and 1800X are irrelevant when you can overclock a 1700 to the same clocks.
 
The 1700X and 1800X have been getting some aggressive sales on the eBay front in recent weeks. Not as many 1700 deals, but I think that's because everyone's trying to score the 1700 instead of the X's... which may be the better deal anyway, with the included cooler, if you're not thinking aftermarket cooling.
 
This is going to be one epic article ;)

Hopefully the Vega launch (+who knows what else) doesn't derail it too. :p
 
By the time you guys get done with this, maybe the Intel i9 and AMD 9000-series processors will be out and you can test them too.
 
 