[TBG] Does RAM Speed Matter? DDR3-1600 vs. 1866, 2133, and 2400 in Games

wand3r3r

The Tech Buyers Guru has a new RAM article, focusing on the speed this time.

The impact of system memory on gaming performance is a topic lots of gamers are interested in, but finding in-depth analysis can be hard. Back in mid-2013, we published our analysis of how RAM quantity affects gaming performance in our article Does Memory Matter? 4GB, 8GB, and 16GB in Gaming. That article has gone on to be one of the most popular articles on The Tech Buyer's Guru, to this day! So it’s with great pleasure that we bring you the next chapter in our RAM benchmarking analysis, a deep dive into the impact of RAM speed on gaming performance.
http://techbuyersguru.com/ramspeedgaming.php

Take a look, it's an interesting read. It's always good to see the real-world effect that memory has on (primarily gaming) performance. The conclusion offers solid advice and alternatives for potential buyers, and it's great to see the value proposition for memory laid out. It can make the purchase easier based on price/performance, or simply all-out performance.
 
I've known this for years and have recommended people not get caught up in the RAM speed thing (and have linked other articles just like this one proving it).

With Sandy Bridge, I recommended 1333, since the higher cost of 1600 (or more) made zero real-world difference. Then Ivy Bridge and Haswell came out, RAM prices were rock-bottom, and I recommended 1600, since the more expensive 1866 (or higher) made no real-world difference.

But do people listen? Noooooooooo. They would rather do something stupid like get the more expensive faster RAM and sacrifice on something that's actually game-changing, like the GPU (pun intended). :p
 
My issue, though, with a lot of these tests is the workload chosen for gaming: the range of game types is rather limited. In this case, 5 of the 6 games are essentially FPS titles (BF4 is tested in SP) that mostly take place in fairly limited, static environments with few actors. The other is an arcade-style racer. Relying on static benchmarks (e.g. in-game benchmarks, scripted level sequences) is another issue.

In general, I find there is an over-emphasis on FPS (and similar) games in test suites, which may bias results toward certain types of workloads. For example, commentary on SSD impact in gaming often relies on a typical FPS-focused test suite that loads all data into memory at set transitions, rather than on games that rely heavily on streaming assets in dynamically.

What I would be more interested in is a test suite with some of the following examples -
Skyrim (particularly modded)
GTA IV (particularly modded)
Company of Heroes 2
Total War Rome II
Civilization V or Beyond Earth (with regards to turn times)
MMORPGs in "heavy" scenarios (e.g. town/hub clusters, raids), albeit this would be hard to control
ARMA 3
More heavyweight simulation games (e.g. flight simulators)
FPS games, e.g. Battlefield 4, in heavy MP situations. Again, a hard-to-control issue, however.

Also, some newer and/or upcoming titles would be -
Assassin's Creed Unity
Dragon Age Inquisition
GTA V

It would also be interesting to see frame time and related data, not just minimum and average FPS. It would also be interesting to explore the impact specifically with regards to pushing high frame rates for 120/144+ Hz gaming. The last issue would be testing with multitasking factored in. I understand the control issues, but the tendency for most users these days is not to kill every single background task when gaming.

I'd also be interested going forward in whether the more direct-access APIs change anything.
 
I still look for the lowest latency I can get... but the choices get slim at higher speeds.
 
I've found the 2133MHz stuff to be the sweet spot.

Usually only a couple bucks more per stick than the 1600/1866 sticks. Might as well, eh?
 
I've found the 2133MHz stuff to be the sweet spot.

Usually only a couple bucks more per stick than the 1600/1866 sticks. Might as well, eh?

There are others that have reference graphs showing 2133 CAS 9 is the "sweet" spot for $/performance, but the performance gain over 1600 doesn't make plopping down 2x the $$ worth it.

I'd like to see 1333 vs 2133 (DDR3 vs DDR4 too) for server performance like MySQL when your DB is in RAM/cached... anyone have test data on that?
 
There are others that have reference graphs showing 2133 CAS 9 is the "sweet" spot for $/performance, but the performance gain over 1600 doesn't make plopping down 2x the $$ worth it.

I'd like to see 1333 vs 2133 (DDR3 vs DDR4 too) for server performance like MySQL when your DB is in RAM/cached... anyone have test data on that?

My comment was not directed at DDR3 vs DDR4 prices. DDR3 2133 is usually only 5-10 bucks more per stick than the DDR3 1333/1600 stuff. DDR3 vs DDR4 pricing is a whole other can of worms.
 
My comment was not directed at DDR3 vs DDR4 prices. DDR3 2133 is usually only 5-10 bucks more per stick than the DDR3 1333/1600 stuff. DDR3 vs DDR4 pricing is a whole other can of worms.

:confused: I never said you did.


And FWIW, 2133 DDR3 with the same CAS as 1600 isn't $5-10 more, it's about 2x the cost, and not worth it. BUT, it's the sweet spot for performance if you want to squeeze out those last couple of FPS.

There are charts that show this out there, and I linked to them in other threads... too late for me to look them up at the moment.
 
There are others that have reference graphs showing 2133 CAS 9 is the "sweet" spot for $/performance, but the performance gain over 1600 doesn't make plopping down 2x the $$ worth it.

I'd like to see 1333 vs 2133 (DDR3 vs DDR4 too) for server performance like MySQL when your DB is in RAM/cached... anyone have test data on that?

I doubt the server market changes much. Servers with 18 cores and a bagillion transactions per second are running on DDR4-2133 or, if Ivy Bridge, DDR3-1600/1866.

One test I'd like to see (could probably DIY it) is whether there's any difference in RAID SSD benchmarks using DDR4-2600 or downclocking it to DDR4-1333.
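
For anyone wanting to DIY a quick check like that before setting up a full MySQL or RAID run, timing a big memory copy at each BIOS RAM-speed setting is about as simple as it gets. This is only a minimal sketch, not anything from the article; the function name and buffer size are arbitrary, and it assumes Python with NumPy installed:

```python
# Rough sustained-copy bandwidth check (illustrative only). Run it once per
# BIOS RAM-speed setting and compare the GB/s numbers; it is no substitute
# for a real MySQL or RAID benchmark.
import time
import numpy as np

def copy_bandwidth_gbs(size_mb: int = 1024, repeats: int = 10) -> float:
    """Best-case copy bandwidth in GB/s for a buffer much larger than cache."""
    src = np.ones(size_mb * 1024 * 1024, dtype=np.uint8)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.copyto(dst, src)
        best = min(best, time.perf_counter() - t0)
    # A copy reads and writes the buffer once each, hence the factor of 2.
    return 2 * (size_mb / 1024) / best

if __name__ == "__main__":
    print(f"~{copy_bandwidth_gbs():.1f} GB/s sustained copy bandwidth")
```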
 
So you guys are telling me I should only run my DDR2-1066 RAM at 1066 if the CAS is the same as it would be at DDR2-800?

The only reason I run higher-speed RAM (while also trying to get good CAS) is because my clocks are over 800 effective, running at about 960 I think; my FSB is set to 460. Without 1066 RAM, if I were instead using, say, 800-speed RAM, I'd have to overclock it to my current ~960 to run 1:1 with the processor bandwidth, or lose processor bandwidth and waste it. Or try overclocking my RAM, which might not work and might error. So that was my only reason to go up to 1066 RAM. I could get DDR2-800 at CAS 4, but I'm using 1066 at CAS 5, and it's running 1:1 in my BIOS at 960 with CAS 5.
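
For anyone trying to follow that 1:1 reasoning, the arithmetic works out roughly as below. This is only an illustrative sketch using the poster's approximate numbers (a 460 MHz FSB clock and the CAS figures quoted); the function names are made up for the example:

```python
# Illustrative DDR2 FSB:DRAM math (not measured data). At a 1:1 divider the
# memory clock equals the FSB clock, and DDR transfers twice per clock.

def effective_ddr_rate(fsb_mhz: float, fsb_to_dram_ratio: float = 1.0) -> float:
    """Effective DDR data rate in MT/s for a given FSB clock and divider."""
    return fsb_mhz * fsb_to_dram_ratio * 2

def cas_latency_ns(data_rate_mts: float, cas_cycles: int) -> float:
    """Absolute CAS latency in nanoseconds (cycles divided by memory clock)."""
    return cas_cycles / (data_rate_mts / 2) * 1000

fsb = 460                                    # the poster's BIOS setting
rate = effective_ddr_rate(fsb)               # 920 MT/s (the poster quotes ~960)
print(f"FSB {fsb} MHz at 1:1 -> ~DDR2-{rate:.0f}")
print(f"DDR2-800 CL4: {cas_latency_ns(800, 4):.2f} ns")        # 10.00 ns
print(f"DDR2-{rate:.0f} CL5: {cas_latency_ns(rate, 5):.2f} ns")  # 10.87 ns
```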
 
@rancer420 Your question is about DDR2. I would start a new topic instead of derailing the thread.
 
Unless you're on an FM2+ board, where the APU actually leans on system memory for gaming, then no, it doesn't really matter unless you run synthetic benchmarks for your e-pen.
 
Yeah, for games, you could spend an extra $100 on 2133 CL9 RAM over 1600 CL10 and get an extra ~3% performance in games... Or you could spend that on a bigger video card and get an extra ~25% performance in games.

For rendering, servers, VMs, sure: bigger, faster RAM may help, but for games it's not worth it. That's why I can't understand why some builders sell 'gaming' PCs with a top-end i7 and 16GB of 2133 RAM and pair it with a GTX 760... Honestly, spend the same cash on an i5 and 8GB of 1600, and grab a 980.
 
Unless you're on an FM2+ board, where the APU actually leans on system memory for gaming, then no, it doesn't really matter unless you run synthetic benchmarks for your e-pen.

Ya!...what you said!...after you get off of mom's Rent-A-Center Compaq you might grow an e-penis...of course that would mean leaving the basement...I'm just sayin'...hehehe
 
I can only speak from my own experience, but in every game where I'm not getting optimal framerates, my GPU is hitting 100% usage. I highly doubt that switching from 1600 to 2133 RAM would give me a big boost. I think APU owners can see more of a boost than me, but then again the money spent on faster RAM could have been spent on a better GPU for them. I plan on getting faster RAM, but it's not at the top of my list.
 
It probably has more of an effect with onboard graphics, but when you say "for gaming," you're mostly going to be talking about people who use a discrete GPU, anyway.

I've usually bought relatively cheaper to medium-priced RAM. I've never bought into the high-end highest-speed stuff.
 
But for now, anyway, the higher-speed RAM is getting quite cheap. DDR3-2400 is very competitively priced right now.

AND for those that care about timings and speed, use this formula to decide (it gives the absolute first-word latency, in microseconds):

1 / [(RAM speed) / 2] x (CAS timing)

so 1600 CL9 is 1/(1600/2) x 9 = 0.01125 µs (11.25 ns)

and 2400 CL12 is 1/(2400/2) x 12 = 0.01000 µs (10.00 ns)

So technically the 2400 has the lower absolute latency despite the higher CL, and there is a great deal of 2400 RAM at CL11.
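
A quick sketch of that formula in code, assuming the usual convention that the rated speed is the data rate in MT/s and CL is counted in memory-clock cycles (the speed/CL pairs below are just examples, not prices or test results):

```python
# First-word CAS latency from DDR data rate (MT/s) and CAS latency (cycles).
# Lower is better; this is only the access-latency half of the story and says
# nothing about bandwidth or real-world FPS.

def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    memory_clock_mhz = data_rate_mts / 2        # DDR transfers twice per clock
    cycle_time_ns = 1000 / memory_clock_mhz     # length of one memory clock
    return cl * cycle_time_ns

for speed, cl in [(1600, 9), (1866, 9), (2133, 9), (2400, 11), (2400, 12)]:
    print(f"DDR3-{speed} CL{cl}: {cas_latency_ns(speed, cl):.2f} ns")

# DDR3-1600 CL9 : 11.25 ns
# DDR3-2400 CL12: 10.00 ns  (lower absolute latency despite the higher CL)
```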
 
Does your formula account for the price delta between faster frequency/tighter timings and distinguishable differences in real-world use (if any)?
 
I'll read the article at some point, but does it take into account APU / integrated + shared graphics accessing the memory?

Anecdotally, my own personal experience with more than a handful of AMD APU systems: DDR3-2133 outperforms DDR3-1600 in 3D work by enough that the premium is cost-effective.
 
I'll read the article at some point, but does it take into account APU / integrated + shared graphics accessing the memory?

Anecdotally, my own personal experience with more than a handful of AMD APU systems: DDR3-2133 outperforms DDR3-1600 in 3D work by enough that the premium is cost-effective.

Using an IGP in a gaming or productivity capacity would be the sole justification for spending the extra money, imo.
 
Does your formula account for the price delta between faster frequency/tighter timings and distinguishable differences in real-world use (if any)?

Sure, that makes sense. I was just posting the formula for determining speed because I saw someone claim they just look for the lowest CL, and that is somewhat misleading.

As far as real-world experience, not a lot, I'm sure. But if you are using a RAM disk, it definitely doesn't hurt. Besides, I have 1600 RAM that does not like to OC, so I'm now going to look for 2400 RAM even though I have no need to run that speed, just to ensure that I can run at any speed up to it. I would like to run at 1800-2100, and 2400 RAM would guarantee that, whereas 1600 does not.
 
But will you discern a real-world performance difference with a Ramdisk between frequencies? I mean, it's already faster than just about any connected storage type, even the fastest of SSDs.
 
It makes a pretty discernible difference when loading the RAM disk. As for during gaming, I haven't really experimented.

But I guess my real point is: if you can have faster, why wouldn't you want it? I get it when the cost is astronomical, like it was when I bought my 1600, but now it's very attainable. The gains may be marginal and not worth the cost when the price difference is large, but now that seems less of an issue.
 
Over Thanksgiving week there were some decent sales on 2133 RAM, but if rebates do not scare you there were MUCH better deals on 1600 CAS 9 RAM. I scored A-Data 2x4GB 1600 for free after $100 in rebates. There was also a deal on PNY 8GB 1600 CAS 9 sticks where, if you bought two, they came out to either $70 after rebate or $50 using TigerDirect's one-per-household $20 off $100. I'm thinking of using the PNY in a new build.
 
I got 16GB of CAS 10 DDR3-2400 for $130 last week; from the articles I've read, it makes a fairly decent difference on Haswell.
 
I picked up some G.SKILL Ripjaws X Series 16GB (2 x 8GB) DDR3 2400. The price difference between the DDR3 2400 and DDR3 1600 was... $10 per pair.

It was only after I had placed the order for two pairs of these sticks (32GB total) that I realized that Windows 7 Home Premium only supports 16GB (oopsie). However, Windows 10 is coming out at the end of the year, so I'm ready. The goal of 32GB was not gaming, but video editing and virtualization.
 
All this talk finally convinced me I needed some faster RAM. My very low profile Crucial sticks didn't want to go past 1866, so I went looking for a new set. I couldn't allow myself to downgrade from 32GB, so I ordered a 32GB set of Trident X 2400MHz 10-12-12-31 off of Newegg.

Can't say I've noticed too much of a difference, if any, in games like Battlefield 4. However, I'm going to chalk that up to being stuck on a 60Hz monitor (wish I still had the Swift!). I'm hoping to get some more time gaming in, but I've been stuck catching up on work for the last few weeks.

Anyways, at least I can rest easy knowing that I have the proper speed of memory for my 4790K. It also let me upgrade (if you can call it that) my HTPC to my old 32GB set. Feels a bit ridiculous to have 32GB in my desktop, 32GB in my HTPC, and a 16GB set in my closet (not counting laptops and my wife's computer . . . all of it soon-to-be-obsolete DDR3).
 
It only makes a strong difference when using an FM2+ socket APU... I don't know why this isn't common knowledge.
 
I haven't bothered with a memory OC since the Pentium 4 days. It made a difference then, but ever since Core 2 and later it doesn't do much at all. Just run the speed the CPU is rated for and save some cash for unlocked CPUs.
 
I thought 1600 was the sweet spot.. That's what I get, all I get.

If you get get 2400 DDR you have to fuck around with your board, voltage and timings. Unless you're LUCKY XMP works..
 
I haven't bothered with a memory OC since the Pentium 4 days. It made a difference then, but ever since Core 2 and later it doesn't do much at all. Just run the speed the CPU is rated for and save some cash for unlocked CPUs.

This is incorrect.

The C2Q CPUs screamed, relative to stock RAM speed and stock CPU speed, when the RAM and bus speeds were nice and high. Night and day difference.

The X58 platform also liked nice high RAM speeds. Everything was so much smoother and games played smoother once you went to 1600 or higher for the RAM. Even 1333 was a good jump over the stock 1066.

And for very RAM-intensive applications, I tested a couple of systems: one with the CPU at 4GHz and the RAM at 1600, the other at 3.84GHz with the RAM at 2000. The slower-clocked machine beat the higher-clocked machine by 25%.

It just depends on what you run. RAM speed can make a huge difference.
 
For me and the feel of my games, the speed doesn't make a real difference.
I bought some 1033 DDR3, and since then I can play without problems. I also bought refurbished memory and could do anything with it.
 
What makes me weep is when I see folks who have spent a whole weekend testing and tweaking their RAM timings down as low as they can go to get an extra 4578 points in a synthetic benchmark, when it actually only gets them an extra 1 FPS (over 112 FPS at stock settings) in real-world situations.

They would have been better off just playing the game that weekend.

Different strokes though...
 