Effects of RAM speed on games - research

cyclone3d

I have been running some tests to see how RAM speed can affect games on my 4930K setup.

It will probably take a couple more days for me to complete the tests and post my findings, but I wanted to get some feedback on what others think the results will be.

I am testing stock and overclocked CPU speeds as well as tightening up the RAM timings from what it defaults to at each speed.

"Stock" CPU speed will be locked at 3.6Ghz since it gives more repeatable scores compared to leaving it on auto turbo.

Overclocked CPU speed is 4.7GHz.

RAM speeds are being tested at 1333, 1600, 2133, and 2400. If I get requests for it I can add 1866 as well.

Video card setup is 2x AMD 7970s running at 1150/1575 - changed to 1120/1575 due to texture flashing in Unigine Heaven Benchmark 4.0.

8/18/2015 - I now have an ASUS R9 390 that I will be testing as well. The tests with the 7970s will be posted before I run the tests for the 390.

8/31/2015 - Driver version is 15.8 beta. (Had to restart from scratch because Thief crashed almost every time on 15.7.1 with Mantle Crossfire enabled while MSI Afterburner was running or Overdrive was enabled.)

9/4/2015 - The test suite so far is as follows:
Unigine Heaven Benchmark 4.0
Unigine Valley Benchmark 1.0
CatZilla 1.4 - nix this unless somebody knows how to get it to run in Windows 10 without hard-locking the whole system during install or while running.
3DMark Advanced
S.T.A.L.K.E.R. Call of Pripyat Benchmark
Lost Planet: Extreme Condition Demo (DX10 version) - not sure which one [H] used - http://www.gamershell.com/news_41664.html
Lost Planet 2 Benchmark - http://www.geforce.com/games-applications/pc-games/lost-planet-2
Bioshock Infinite
Batman Arkham City GOTY
Dirt 3
Dirt Showdown
Thief
Tomb Raider

I am only running canned benchmarks for the sake of more consistent results and because I really don't have unlimited time to dedicate to this.

This came about because of the review of the i7-6700K showing that RAM speed can affect game engines on that setup.
http://hardocp.com/article/2015/08/05/intel_skylake_core_i76700k_ipc_overclocking_review/#.Vdp2TPT-FaQ

Tests will be done at 1280x720 and 1920x1080, as well as at the lowest resolution available.

This guide was helpful for figuring out what all the different RAM timing settings are and how they relate to each other.
https://rog.asus.com/forum/showthread.php?5835-ASUS-Rampage-IV-Extreme-UEFI-Guide-for-Overclocking
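
For reference, the absolute (first-word) latency can be worked out from the data rate and the CAS latency, which is why tightened timings at a low speed can rival a higher speed at loose timings. A rough sketch in Python (the CL values for 2133 and 2400 below are placeholders, not necessarily what this kit runs):
Code:
# Rough first-word latency: CL cycles at the command clock (half the MT/s rate).
def cas_latency_ns(speed_mts, cl):
    clock_mhz = speed_mts / 2          # e.g. DDR3-1333 -> 666.5 MHz
    return cl / clock_mhz * 1000       # cycles / MHz -> nanoseconds

for speed, cl in [(1333, 9), (1333, 6), (1600, 9), (2133, 11), (2400, 10)]:
    print(f"DDR3-{speed} CL{cl}: {cas_latency_ns(speed, cl):.1f} ns")
# DDR3-1333 CL9 comes out to ~13.5 ns while CL6 is ~9.0 ns; tighter timings
# cut absolute latency just like a faster clock does.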

The following is the information on speed/timings for the RAM I will be using for these tests.

I will add and reorganize the different speeds/timings as I get them figured out.

RAM info:
[screenshot: RAM SPD info and rated timings]


1333-tightened timings:
[screenshot: 1333MHz tightened timings]
 
3.6GHz - 1333MHz RAM with tightened timings and CF 7970s:
[screenshot: memory timings for this configuration]

Code:
[B]Batman Arkham City - Max Settings - Physx Off[/B]
Note: min FPS seems to be captured on scene change, so it probably doesn't mean much
1280x720      1920x1080
Min 55        35
Max 310       259
Avg 177       138
Code:
[B]Dirt 3 - Max Settings[/B]
1280x720      1920x1080
Min 135       135
Avg 192       188
Code:
[B]Dirt Showdown - Max Settings[/B]
1280x720      1920x1080
Min 96        93
Avg 125       119
Code:
[B]Hitman: Absolution - Max Settings[/B] - This is CPU-limited at this speed
1280x720      1920x1080
Min 66        66
Max 98        98
Avg 78        78
Code:
[B]Thief - Very High[/B]
1280x720
Mantle Off      Mantle On
Min 51          75
Max 161         164
Avg 81          101

1920x1080
Mantle Off      Mantle On
Min 33          72
Max 104         115
Avg 72          91
Code:
[B]Tomb Raider - Ultimate + Ultra Shadows[/B]
1280x720      1920x1080
Min 142       96
Max 228       156
Avg 180       123

Code:
[B]Lost Planet[/B]
Max Settings
This bench only shows average frame rate
640x480      1280x720      1920x1080
Snow
408          370           225
Cave
229          223           223

Default Settings (is this what [H] used or did they set everything to the absolute lowest?)
640x480      1280x720      1920x1080
Snow
480          463           338
Cave
235          231           231

Code:
[B]Lost Planet 2 - DX11 - Max Settings[/B]
720x480      1280x720      1920x1080
Test A
Avg.
213          209           176
Scene 1
220          213           163
Scene 2
205          197           154
Scene 3
213          216           209

Test B
115          111           104
[B][COLOR="Red"]Default Settings[/COLOR][/B]
720x480      1280x720      1920x1080
Test A
Avg.
249          233           204
Scene 1
251          252           194
Scene 2
224          212           190
Scene 3
269          237           228

Test B
121          117           113

S.T.A.L.K.E.R. Call of Pripyat Benchmark - max settings - 800x600, 1280x720, and 1920x1080
[screenshots: results at 800x600, 1280x720, and 1920x1080]



3DMark
[screenshots: 3DMark results]


Unigine - min FPS is captured during scene changes, so it probably doesn't mean much - RAM speed does affect min FPS.
[screenshots: Unigine Heaven and Valley results]


---------------------------------------------------------------------------------------------------------------------------------
END 3.6-1333-tight timings
---------------------------------------------------------------------------------------------------------------------------------
 
4.7GHz - 1333MHz RAM with tightened timings and CF 7970s:
[screenshot: memory timings for this configuration]

Code:
[B]Batman Arkham City - Max Settings - Physx Off[/B]
Note: min FPS seems to be captured on scene change, so it probably doesn't mean much
1280x720      1920x1080
Min 55        44
Max 424       245
Avg 220       144
Code:
[B]Dirt 3 - Max Settings[/B]
1280x720      1920x1080
Min 166       167
Avg 243       223
Code:
[B]Dirt Showdown - Max Settings[/B]
1280x720      1920x1080
Min 123       101
Avg 158       135
Code:
[B]Hitman: Absolution - Max Settings[/B]
1280x720      1920x1080
Min 88        88
Max 138       138
Avg 113       110
Code:
[B]Thief - Very High[/B]
1280x720
Mantle Off      Mantle On
Min 60          91
Max 193         194
Avg 96          122

1920x1080
Mantle Off      Mantle On
Min 63          79
Max 124         133
Avg 90          107
Code:
[B]Tomb Raider - Ultimate + Ultra Shadows[/B]
1280x720      1920x1080
Min 152       98
Max 252       170
Avg 197       135

Code:
[B]Lost Planet[/B]
Max Settings
This bench only shows average frame rate
640x480      1280x720      1920x1080
Snow
566          490           270
Cave
325          317           315

Default Settings (is this what [H] used or did they set everything to the absolute lowest?)
640x480      1280x720      1920x1080
Snow
706          668           614
Cave
344          337           334

Code:
[B]Lost Planet 2 - DX11 - Max Settings[/B]
720x480      1280x720      1920x1080
Test A
Avg.
247          218           176
Scene 1
237          205           151
Scene 2
230          199           166
Scene 3
273          249           208

Test B
136          128           119
[B][COLOR="Red"]Default Settings[/COLOR][/B]
720x480      1280x720      1920x1080
Test A
Avg.
288          255           210
Scene 1
282          258           190
Scene 2
275          241           188
Scene 3
308          272           252

Test B
145          140           130

S.T.A.L.K.E.R. Call of Pripyat Benchmark - max settings - 800x600, 1280x720, and 1920x1080
[screenshots: results at 800x600, 1280x720, and 1920x1080]



3DMark
[screenshots: 3DMark results]


Unigine - min FPS is captured during scene changes, so it probably doesn't mean much - RAM speed does affect min FPS.
[screenshots: Unigine Heaven and Valley results]

---------------------------------------------------------------------------------------------------------------------------------
END 4.7-1333-tight timings
---------------------------------------------------------------------------------------------------------------------------------
 
The norm seems to be no effect whatsoever at high res or other very GPU-limited scenarios... Are you seeing anything else? Tease. :p
 
As long as you're GPU-limited at minimum FPS, it makes no difference. It can make a difference in CPU-limited games, but it varies from game to game, and primarily helps with minimum FPS.
 
Not directly related but way back when, perhaps a decade ago, Crucial published a paper on the benefits to gamers of adding extra memory. It was either going from 2 GB to 4 GB or from 4 GB to 8 GB. The latter would date it to around the launch of Windows Vista.

I see you have 32 GB RAM, so lack of RAM won't affect your results, but the report might prove interesting nonetheless if you can dig it up.
 
Honestly, at speeds higher than 1600 the difference in FPS should be within the statistical margin of error, i.e. no difference.
 
I have always thought it was more about finding the sweet spot that works best with a given CPU and the speed you run it at. I have seen benchmarks where everything went a bit odd at a certain speed setting and then recovered when they stepped it up.


My vote is for 2133 to be the best option!
 
We should start putting bets on this using old hardware.
 
With quad-channel RAM I can't see the difference being huge, but the lowest latency will still help in some games, so you may see 5% from 1600c9 to 2400c10.

What games are you testing?
To have any hope of seeing a difference you need to add a CPU-bottlenecked game with bad threading to the test, like Arma.
The Thief benchmark should also see a difference if you use DX instead of Mantle, but real gameplay probably won't show a difference, and Mantle will take too much load off the CPU.

Don't suppose you have any single-sided RAM for comparison? The 8GB sticks you're using will be double-sided.
 
Multiplayer/online games benefit far more from faster RAM than single-player.

There is a huge performance boost in BF4 between 1333MHz RAM and 2400MHz RAM, but in single player there wasn't much.

So if you're going to test games, you're better off testing in MP.
 
I'll be testing in dual channel as well.

I don't have any single sided RAM to test with.

I'll add Thief to my list since I actually own a copy of it.
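
For context on dual vs. quad channel: theoretical peak bandwidth is just channels x 8 bytes x transfer rate, so quad-channel 1333 already tops dual-channel 2400 on paper. A back-of-the-envelope sketch (theoretical numbers only; real games rarely saturate this):
Code:
# Theoretical peak DDR3 bandwidth: channels * 8 bytes per transfer * MT/s.
def peak_bw_gbs(channels, speed_mts):
    return channels * 8 * speed_mts / 1000   # GB/s, decimal units

for ch in (2, 4):
    for speed in (1333, 1600, 2133, 2400):
        print(f"{ch}-channel DDR3-{speed}: {peak_bw_gbs(ch, speed):.1f} GB/s")
# Quad-channel 1333 (~42.7 GB/s) beats dual-channel 2400 (~38.4 GB/s), which
# is one reason speed scaling can look flat on a quad-channel X79 board.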
 
Hmmm. If your results are like the ones I've found elsewhere, your min FPS goes up and your max stays the same.

AnandTech has some articles on it showing 2133c9 or 2400c10 could be beneficial.

My rig feels snappier at 2400 than at 1600. Too bad my 1600 RAM is unstable at 2400 :(
 
1) What games will you be testing with?
2) How many runs per game per RAM setup?

From past articles and reports, the difference is usually like 1 to 2 FPS. There were occasions where the difference was greater but only at certain levels and settings that for the most part don't apply to most people.
 
I feel like the RAM in my sig is the sweet spot for DDR3, so this should be nice to read.
 
Just a bit of an update. I've been working on RAM timings and finding out what clocks my 7970s will run at without artifacting.

I normally run everything with them at 1150/1575, but upon running Unigine Heaven Benchmark 4.0 at 1080p, I have to lower them to 1120/1575 or else I get some flashing textures.

One thing of note so far is that Unigine has stutters/pauses at specific places with lower RAM speeds. 1333 is pretty bad, 1600 is better but still noticeable, by 2133 it is almost gone, and 2400 shows no visible improvement over 2133.

This is at stock timings for each speed, though, and tighter timings even at 1333 seem to help a bit over stock.
 
What are stock timings at each speed? Your RAM can only have two SPD profiles... so everything else is simply whatever you deem stable for that speed setting.

Slower speed should inherently mean tighter timings; it'd be rather pointless to test different speeds at the same timings IMO, since that just isn't how RAM is binned and sold.
 
At 1333 and 1600 it defaults to 9-9-9-24-1T. I'll have to check with CPU-Z once I get home to get the full list.

At 1333 I was able to get down to 6-7-6-17-1T, as well as tightening up some secondary timings. I know 1600 will run fine at 7-8-8-24-1T, but that was just a quick test I did a while back, so I'm not sure if I can get it any tighter.
 
Started running actual tests and am going to try to find some more I can run. I will post results as I get them done for each iteration of testing.
 
What settings will you be using?
If you don't mind, I would like to run the exact same tests on my 2600K / 2133c9 / 290 for comparison.
 
I'll post settings that I am using for the games as I post the test results.

Just a few more tests left for 4.7GHz with RAM at 1333 and tightened timings on the 7970s.
 
Fixed the Call of Pripyat tests. I had forgotten to run anything besides 1920x1080 and also missed turning one setting all the way up.

Edit: Added Lost Planet and Lost Planet 2

Edit: Added low and extreme 640x480 Unigine tests.
 
Added tests for 3.6GHz with 1333MHz RAM.

Not sure why a few of the tests gave higher scores than at 4.7GHz... maybe a timing issue, or the fact that it looks like the base clock may have been running a tiny bit faster? Unigine shows 3410 instead of 3400.

There is definitely some CPU bottlenecking going on in some of the tests at 3.6Ghz.
 
3410 vs 3400 is well within the margin of error. That is to say, no difference at all.
 
I'm talking about the CPU clock speed being reported. Even though it isn't correct in either instance, something had to be a bit different when running at 3.6 instead of 4.7 if you look at all the benchmark scores that I posted.

Will hopefully get 4.7 1600 scores posted this evening.
 
So I am going to have to drop Thief with Mantle from the tests, as it seems there is a bad bug where it will either CTD or give a "display driver has recovered" error upon exiting.

I was hobbling along with the RAM at 1333 and was usually getting through the bench before it would crash. Now that I have moved to 1600, it crashes all the time. This is the only game that does it, and it is definitely a driver bug or a bug in the game itself.

If I have MSI Afterburner running or have Overdrive enabled, even with default settings, it just crashes if I have Mantle Crossfire enabled in Thief.

Been trying to figure it out for the past couple of days and have now reported it to AMD.

The 15.8 beta driver fixed the problem. Guess I will start over from the beginning since it is a new driver set... bleh. At least I have my bench method down. Going to add a couple more things, though.
 
I ran a few tests, and it seems my system doesn't take as large a performance hit from slower memory as it used to. Maybe Windows 10 has optimised its cache usage a bit over Windows 7.

But it's hard to know, as I'm not able to get consistent runs; when analysing the FPS data in a line graph, it's not as consistent as it was last time I ran some tests.

It's been interesting running a few games windowed and watching the CPU/GPU usage in Afterburner.

cyclone3d, if you're still running your 7970 CF and intend to redo tests with a 390, maybe continue with the current driver and leave out Mantle till you swap to the 390?
 
I believe FRAPS free can log both FPS and frametimes?

I feel frametime percentiles, min FPS, and 1% low FPS numbers would be more interesting metrics.
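
If anyone wants to crunch those, here is a minimal sketch that pulls a "1% low" figure out of a FRAPS frametimes log. It assumes the usual FRAPS layout (a header row, then frame number and a cumulative timestamp in ms per line), the filename is just an example, and "1% low" has a few competing definitions; this one averages the slowest 1% of frames:
Code:
import csv

def one_percent_low(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]                    # skip header row
    stamps = [float(r[1]) for r in rows]                  # cumulative ms
    frames = [b - a for a, b in zip(stamps, stamps[1:])]  # per-frame ms
    frames.sort(reverse=True)                             # slowest first
    worst = frames[: max(1, len(frames) // 100)]          # worst 1% of frames
    return 1000 / (sum(worst) / len(worst))               # avg worst ms -> FPS

print(f"1% low: {one_percent_low('frametimes.csv'):.1f} FPS")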
 
Just wanted to add a quick update. I have all the tests done for quad-channel 1333 and 1600, including FRAPS, with the 7970s.

Working on 1866 at the moment. I will compile everything when I am done and post it up. I may end up making a new thread and linking to it from here as there is so much data.
 
Good luck with your testing. Be aware, there can be quite a variance between running ~3 tests of the same config.

Sometimes there is also a difference between running one benchmark, quitting, and relaunching/rerunning it, versus running several more after the first without quitting.

In some games/benchmarks the min FPS can vary so much that you might not be able to measure the difference.

I recommend using Excel to help arrange the results. Google Drive/Docs has a linkable and editable browser app that works pretty well.
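
To expand on the spreadsheet idea, a minimal sketch of dumping runs into a CSV that Excel or Google Sheets can sort and pivot; the column layout is just a suggestion, and the sample rows are the Thief 1080p Mantle numbers from earlier in the thread:
Code:
import csv

rows = [
    # cpu_ghz, ram_mts, timings, game, res, min_fps, avg_fps, max_fps
    (3.6, 1333, "6-7-6-17-1T", "Thief (Mantle)", "1920x1080", 72, 91, 115),
    (4.7, 1333, "6-7-6-17-1T", "Thief (Mantle)", "1920x1080", 79, 107, 133),
]

with open("results.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["cpu_ghz", "ram_mts", "timings", "game", "res",
                "min_fps", "avg_fps", "max_fps"])
    w.writerows(rows)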
 
I think I've got the setup pretty much down as far as repeatability of test scores goes. Sure, there is a very slight difference between runs, but I have run these so many times that I can immediately spot a test run that is way out of line.
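
For anyone doing the same, one way to make "way out of line" objective is to flag any run whose average FPS sits more than a couple of percent off the median. A quick sketch (the FPS values are made-up examples in the same ballpark as my Thief runs):
Code:
from statistics import median

def flag_outliers(runs, tol_pct=2.0):
    med = median(runs)
    return [r for r in runs if abs(r - med) / med * 100 > tol_pct]

runs = [106.1, 106.3, 106.2, 98.4, 106.1]   # avg FPS per run (example values)
print(flag_outliers(runs))                   # -> [98.4], about 7% off the median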

Using FRAPS is a pain since I have to sit there and hit a key at the start of the test and then again right at the end.
 
Time how long the test takes and note it down; then you just have to enter that time into FRAPS ("stop benchmark after") each time you change the test, rather than being there when it finishes.

A graph like this can be handy for seeing when there is a small hiccup in one part of a test. In this one you can see 4.4GHz edging in front of 4.8GHz during a section in the middle of the test, although in this instance it may just be a 1 FPS fluctuation in a GPU-limited section of the test.
[graph: Arma 3 Altis, CPU speed vs RAM comparison]


Yep, the difference between the first run and the second run, with files prefetched into RAM, can be large in some games:
[graphs: Thief average and minimum FPS, first vs second run]
 
OK, so I ran the Thief benchmark 5 times.

The first run gave me the highest minimum, at 83; across all runs it ranged from 77-83. The minimum FPS was always hit at the same point.
The max FPS ranged from 128-133.

The average FPS ranged from 106.1-106.3.
 
Those were some old tests; it would seem they have enabled prefetch for the benchmark in one of the game's updates. I now get close results like you.

The results used to only become consistent after the first test; all following tests were much the same as the second.
Glad it's going well for you.
 
Your tests are showing more of a difference than I think any of us were expecting. Keep it up.
 
Sweet! I would love to see a more CPU-bound game in the benches.
 