Ryzen Memory Tested: Is It Worth Buying New, Faster RAM?

gerardfraser

[H]ard|Gawd
Joined
Feb 23, 2009
Messages
1,366
Sorry, no testing at resolutions below 2560 x 1440 or at low/medium settings. There are plenty of others who have shared their findings at lower settings and lower resolutions.

These tests are to see if there is a difference when gaming at ultra settings at a high resolution, 2560 x 1440.


EDIT: Added Feb 01, 2019

Ryzen vs RAM: 2133MHz vs 2400MHz vs 3200MHz vs 3600MHz with tweaked timings, 2560 x 1440 ultra settings, Shadow of the Tomb Raider.
♦ 2133MHz (16GB) CL10-10-10-10-21
♦ 2400MHz (16GB) CL10-11-11-11-21
♦ 3200MHz (16GB) CL14-14-14-14-28
♦ 3600MHz (16GB) CL14-15-15-15-42

Chart of the test
Desktop-Screenshot-2019-01-31-22-19-37-64-2.png
Video of the test




DDR4 speeds tested
♦ 2667MHz (16GB) CL16-15-15-15-39, Command Rate 1T, all sub-timings manually set
♦ 3400MHz (16GB) CL16-18-18-18-36, Command Rate 1T, all sub-timings manually set
♦ 3466MHz (16GB) CL14-14-14-14-28, Command Rate 1T, all sub-timings manually set

Ryzen Memory Tested: Is It Worth Buying New, Faster RAM?

I have tested various combinations of DDR4 RAM on Ryzen+ over the past six months, always with about the same results.

Today I tested with a GTX 1080 Ti, gaming at 2560 x 1440 on ultra/max settings.

If you game at higher resolutions and want to know whether the RAM makes a big difference, I would say there is no need to waste cash on new RAM.
Mem-chart.png




COMPUTER USED
♦ CPU - AMD 2600X With MasterLiquid Lite ML240L RGB AIO
♦ GPU - Nvidia GTX 1080 Ti
♦ RAM - G.Skill Trident Z 16GB DDR4(F4-4000C18D-16GTZ) (2x8)
♦ Mobo - MSI X470 - Gaming Plus
♦ SSD - M.2 2280 WD Blue 3D NAND 500GB
♦ Display - LG 27" 4K UHD 5ms GTG IPS LED FreeSync Gaming Monitor (27UD59P-B.AUS) - Black
♦ PSU - Antec High Current Pro 1200W
► FPS Monitoring : MSI Afterburner/RTSS
► Gameplay Recorder : Nvidia Shadowplay
► VSDC Free Video Editor

EDIT: 10 tests from a few months ago

%Difference
xtum8rm-png.png

desktop-screenshot-2018-12-08-11-11-27-03-png.png
 
Nice tests. So is it just a myth that AMD processors need higher-frequency memory? There doesn't seem to be a night-and-day difference from the fastest to the slowest memory!
 
I currently have 3200MHz RAM on my 2700X. Can you please re-test with 3200? Thanks.
 
This is at high enough resolution and settings that it's mostly GPU-bound. There could never be any night-and-day difference here, and most of the interesting stuff could be expected to be in the 1% and 0.1% numbers. OP, did you average/median multiple tests to get the numbers? If not, how consistent did you observe them to be? BFV 0.1% in particular looks pretty clearly unstable.
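For context on why that number bounces around: the 0.1% low usually comes from averaging just the slowest 0.1% of frames in the log, which is only a handful of frames per run. A minimal sketch with made-up frame times:

```python
# One common way 1% / 0.1% "lows" are computed from a frame-time log
# (e.g. what RTSS/Afterburner can export). All numbers below are made up.

def percentile_low_fps(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 = 1% low)."""
    worst = sorted(frametimes_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(frametimes_ms) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                             # ms per frame -> FPS

# Hypothetical run: mostly 16.7 ms frames plus a few hitches.
frametimes = [16.7] * 5000 + [33.3] * 40 + [50.0] * 5

print("avg FPS :", round(len(frametimes) / (sum(frametimes) / 1000.0), 1))
print("1% low  :", round(percentile_low_fps(frametimes, 0.01), 1))
print("0.1% low:", round(percentile_low_fps(frametimes, 0.001), 1))
```

With ~5,000 frames in a pass, the 0.1% figure is decided by roughly five frames, so a single hitch from the game or the recorder can swing it a long way between runs.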
 
Nice tests. So is it just a myth that AMD processors need higher-frequency memory? There doesn't seem to be a night-and-day difference from the fastest to the slowest memory!

No, it isn't. This test has been done by many sites. You can find one example here. Essentially, it boils down to AMD CPUs benefiting more from lower-latency modules than Intel, and as far as frequencies go, there is a point of diminishing returns, somewhere around 3400MHz to 3600MHz if I recall correctly. In any case, the improvement in the 3466MHz numbers over 3400MHz is almost certainly due to the former being CL14 instead of CL16 memory and not due to the 66MHz frequency bump.
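As a rough illustration using the kits listed in the OP, first-word CAS latency in nanoseconds is just CL divided by the memory clock (half the DDR4 transfer rate); a quick sketch:

```python
# Rough first-word CAS latency for the kits listed in the OP.
# latency_ns = CL * 2000 / transfer_rate_MTs  (DDR4 clock is half the MT/s rate)

kits = [("2667 CL16", 2667, 16),
        ("3400 CL16", 3400, 16),
        ("3466 CL14", 3466, 14)]

for name, mts, cl in kits:
    print(f"{name}: {cl * 2000 / mts:.2f} ns")

# 2667 CL16: 12.00 ns
# 3400 CL16:  9.41 ns
# 3466 CL14:  8.08 ns  <- most of the step from 3400 comes from CL14, not the 66MT/s
```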
 
Added some more games from a couple of months ago.

There is also some single-channel memory in there, which you can just ignore.
10 more game tests: CL16 3200 and CL14 3466.

Added video.
 
EDIT: Added Feb 01, 2019
Ryzen vs RAM: 2133MHz vs 2400MHz vs 3200MHz vs 3600MHz with tweaked timings, 2560 x 1440 ultra settings.

Shadow of the Tomb Raider
♦ 2133MHz (16GB) CL10-10-10-10-21
♦ 2400MHz (16GB) CL10-11-11-11-21
♦ 3200MHz (16GB) CL14-14-14-14-28
♦ 3600MHz (16GB) CL14-15-15-15-42

Chart for those who do not watch the video
Desktop-Screenshot-2019-01-31-22-19-37-64-2.png
 
I guess it hasn't processed yet; it looks like crap at the current 360p on YouTube :p
 
Yeah, I had to re-upload the video; the ending got messed up. It's at 4K as of this post.
 
Thanks for doing this. I was personally surprised there was that much frame-rate difference from the RAM speed.
 
Good videos, but of course there's not going to be any difference running at ultra settings and 1440p. I bet if you overclock/downclock your CPU by 500MHz there will barely be any difference either at the same settings.
 
Am I reading this last chart right?

Kingdom Come - 17% difference
ROTTR - 2%
Watch Dogs - 12%
Hitman - 6%
BF - 2%, but 25% higher 1% lows?

Either way, if the application is sensitive, then better-clocked RAM matters.

People are buying the 9900K over the 2700X for similar differentials.
 
So you agree you do not need to buy faster B-die RAM for Ryzen. Then I agree.

I don't have an opinion on whether Ryzen owners should buy B-die or not. You're essentially testing the 1080 Ti, since it's clearly the bottleneck at those settings. (You can bump the resolution down to 1080p, buy a 2080 Ti, or just conclude RAM speed doesn't matter when the GPU is the bottleneck.)

And just to be clear, I am only talking about the Tomb Raider video.
 
Is there something wrong with the tests here? How is single channel memory getting higher 0.1% lows than dual channel in some tests?
 
Why would there be anything wrong, except that third-party monitoring tools have an effect? You can always test yourself with MSI Afterburner/HWiNFO64/ShadowPlay running.
The way to make it more accurate is to buy a capture card, which I will not be doing.
I am just showing that, gaming at 2560 x 1440, there is no real need to go buy faster, more expensive DDR4 RAM for a Ryzen system.
 
Is there something wrong with the tests here? How is single channel memory getting higher 0.1% lows than dual channel in some tests?
Margin of error, possibly due to GPU clock speed variation with a change in ambient temperature, since the GPU is clearly the largest limit, not the CPU.
RAM speed only affects the CPU, and increasing resolution only increases load on the GPU, not the CPU/RAM.
Or it may just be a random stutter, as minimum FPS are not perfectly consistent across runs; in some games they are far more random than in others.

Nothing wrong with the tests; they are excellent, provided you keep the above in mind.
 
Margin of error, possibly due to GPU clock speed variation with a change in ambient temperature, since the GPU is clearly the largest limit, not the CPU.
RAM speed only affects the CPU, and increasing resolution only increases load on the GPU, not the CPU/RAM.
Or it may just be a random stutter, as minimum FPS are not perfectly consistent across runs; in some games they are far more random than in others.

Nothing wrong with the tests; they are excellent, provided you keep the above in mind.


This information is not accurate.
 
This information is not accurate.
In what way?
I may be wrong about the variation in GPU speed; looking at the video, the clocks seem fairly solid each run, so it's probably just margin of error in the min FPS in this case.
 
It would have been nice to see the 3466 memory in single-channel mode as well, to isolate memory bandwidth from Infinity Fabric.

But really nice tests here.

However, as many people have said, it's not really testing the effect of memory on the Ryzen CPU so much as the effect under gaming.
If the CPU is not the determining factor, then an increase in CPU performance does not determine the FPS and is therefore not tested.

It's not that there is anything wrong with this test; it's just that there are two types of tests:
the deep technical ones, and the more experience/effect-oriented ones.

The deep/technical ones will tell us the truth about how it works;
the experience/effect ones will tell us if it's worth it.

Yours is the latter.

There is a reason we test CPU performance at low resolution.
 
In what way?
I may be wrong about the variation in GPU speed; looking at the video, the clocks seem fairly solid each run, so it's probably just margin of error in the min FPS in this case.


Stating that resolution increases don't tax the CPU more.
 
Sorry, no testing at resolutions below 2560 x 1440 or at low/medium settings. There are plenty of others who have shared their findings at lower settings and lower resolutions.

These tests are to see if there is a difference when gaming at ultra settings at a high resolution, 2560 x 1440.
 
Stating that resolution increases don't tax the CPU more.
They usually don't, and when they do the effect is usually still tiny compared to the extra weight on the GPU, meaning that CPU time taken per frame may increase slightly but CPU use as a percentage will still decrease.

If increasing resolution ever makes you more CPU-bound / less GPU-bound, something very weird has happened.
 
Stating that resolution increases don't tax the CPU more.
Provided the FOV is not tied to the resolution (which is true for most games), if the game is CPU-bottlenecked and you take the resolution from 4K down to 1280x720, the FPS remains exactly the same.

If you are GPU-limited, then when you drop the resolution the GPU load will go down, and as the FPS increases the CPU load will go up.
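A toy model of that, with invented per-frame costs: each frame is roughly gated by whichever of the CPU or GPU takes longer, and resolution only moves the GPU side.

```python
# Toy frame-pacing model: a frame is gated by whichever of CPU or GPU takes longer.
# All per-frame costs below are invented for illustration only.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0                                           # roughly resolution-independent
gpu_ms = {"1280x720": 4.0, "2560x1440": 12.0, "3840x2160": 25.0}

for res, g in gpu_ms.items():
    bound = "CPU" if cpu_ms >= g else "GPU"
    print(f"{res}: {fps(cpu_ms, g):6.1f} FPS ({bound}-bound)")

# 1280x720 :  125.0 FPS (CPU-bound) -> RAM/CPU tweaks show up here
# 2560x1440:   83.3 FPS (GPU-bound) -> cpu_ms could shrink a lot with no FPS change
# 3840x2160:   40.0 FPS (GPU-bound)
```

It's only a first-order picture, but it's why RAM speed mostly shows up when the CPU side is the long pole.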
 
That's not strictly true either, most commonly because in some cases the CPU thread submitting work to the GPU will wait on the GPU for something.

Occasionally the CPU will have work to do that is tied to resolution; it's just rarely a big enough effect to be noticeable. The most noticeable one I've run into is Unigine Superposition, ironically enough.
 
That's not strictly true either, most commonly because in some cases the CPU thread submitting work to the GPU will wait on the GPU for something.

Occasionally the CPU will have work to do that is tied to resolution; it's just rarely a big enough effect to be noticeable. The most noticeable one I've run into is Unigine Superposition, ironically enough.
Thanks for the constructive reply. You piqued my interest, so I gave the benchmark a few runs, and at first it seemed to support what you said when I limited its affinity to one thread, which made it a stuttering, inconsistent mess.

So instead I opted to disable HT and drop the 6700K from 4.6GHz to 800MHz to make the benchmark stay under 85% GPU load at 4K.
I also set Windows to max performance, enabled K-Boost in EVGA Precision X to prevent the GPU/PCIe downclocking at low loads, and closed the GPU monitoring/overlay software once I knew it was CPU-limited.
Finally, the results were fairly consistent.

Run one prefetched the files, which resulted in a score about 150 points lower than the subsequent runs.

Run 2 3840x2160 Score: 6485
Run 3 2560x1440 Score: 6479
Run 4 1920x1080 Score: 6479
Run 5 1280x720 Score: 6444

Retest
Run 2 1280x720 Score: 6442
Run 3 1920x1080 Score: 6438
Run 4 2560x1440 Score: 6456
Run 5 3840x2160 Score: 6431

CPU load was mostly on two cores running at ~90%, with a little on a third and about 65% total load across 4 cores.
So if CPU load is increased by the resolution, it may show up on a system with insufficient threads but not on one with insufficient performance, in a game that is bound by the performance of a few cores? If so, is this related to the GPU driver overhead that makes AMD GPUs perform worse than Nvidia in CPU-heavy DX11 games?

4K Superposition_Benchmark_v1.0_6485_1549242697.png
720PSuperposition_Benchmark_v1.0_6444_1549243576.png

Edit: Just averaged the CPU usage at 4K and 720p: 52.8% and 53.0%. Not sure which is which, but it doesn't really matter, does it?
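A quick spread check on those eight post-warm-up scores (treating the two sets as repeat runs) suggests the difference between resolutions is about the same size as the run-to-run noise:

```python
# Spread check on the Superposition scores posted above (run 1 of each set
# excluded as the warm-up run that prefetched files).
from statistics import mean, pstdev

scores = {
    "3840x2160": [6485, 6431],
    "2560x1440": [6479, 6456],
    "1920x1080": [6479, 6438],
    "1280x720":  [6444, 6442],
}

all_runs = [s for runs in scores.values() for s in runs]
print(f"all runs: mean {mean(all_runs):.0f}, stdev {pstdev(all_runs):.0f}")  # ~6457 +/- ~20
for res, runs in scores.items():
    print(f"{res}: avg {mean(runs):.0f}")                                    # ~6443 to ~6468
```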
 
20190201_161935.jpg
My 3200 EVGA kit works with it just fine. I don't know, I RMA'd a chip, the 2600X. I believe it got damaged by a block leak on it and got wet.

They approved it and I sent it to AMD Moduslink in FL, so I guess they'll send me a new chip. I have no experience with that.

Bought a Scythe Mugen 5 because of it... just running it on air... can probably still hit 4.2-4.3GHz.

I hope I get one back, man.

In the meantime I'm running the 7700K and EVGA. The Scythe is on it, and I put an Arctic cooler on the 1060.

Been pretty cool though, man... check it out lol
 
So if CPU load is increased by the resolution, it may show up on a system with insufficient threads but not on one with insufficient performance, in a game that is bound by the performance of a few cores? If so, is this related to the GPU driver overhead that makes AMD GPUs perform worse than Nvidia in CPU-heavy DX11 games?
Sorry, I wasn't very clear - the effect is only particularly noticeable (without special investigation) in Superposition because it's a stutter rather than a constant. Most frames in Superposition have no such effect AFAICT. IIRC it was dependent on settings other than resolution, but it's been a while and I don't remember details.

Were I to download it and set up a test case right now, I'd probably run 1080p/high and 4K/high and look for anomalies on a graph of CPU use.
 