Any reason not to buy this Corsair 4266 C19 for a Z370 build?

eddie500

Gawd | Joined: Jan 23, 2003 | Messages: 984
I have a 1080 Ti and want max FPS for Overwatch. I have been reading that memory speed matters for FPS.

I have my eye on this Corsair Vengeance LPX 4266 C19 memory, $259 at Amazon.
http://www.corsair.com/en-us/vengea...266mhz-c19-memory-kit-red-cmk16gx4m2b4266c19r

Is there any reason I should not use this for an 8700K / 1080 Ti build? Would you suggest an alternative? I don't mind spending the extra $50 on memory for this 4266, unless there is better performance at a lower speed.

I want to make sure that the timings are not too slow, or that there isn't something else I am overlooking.

Thanks for any advice.
 
Reviews suggest Overwatch runs 200fps on a 3770K, so something tells me the 8700K will be sufficient for it.
Faster RAM will help the CPU in games, though C19-26-26-26 are loose timings, which will reduce the gains from the higher frequency a bit.
If you want to save some cash, you could get a 3000 C14 kit ($206) that uses Samsung B-die and take some time to OC it to ~4000 C17 at 1.45V yourself.
But if you just want to load XMP and play, then go for it; that's not a bad price for high-speed RAM these days.
 
Reviews suggest Overwatch runs 200fps on a 3770K, so something tells me the 8700K will be sufficient for it.
Faster RAM will help the CPU in games, though C19-26-26-26 are loose timings, which will reduce the gains from the higher frequency a bit.
If you want to save some cash, you could get a 3000 C14 kit ($206) that uses Samsung B-die and take some time to OC it to ~4000 C17 at 1.45V yourself.
But if you just want to load XMP and play, then go for it; that's not a bad price for high-speed RAM these days.

Thanks. So if I wanted to make it easy and not mess around with timings, would the 4266 C19 RAM I linked above be a good bet compared to other-speed Corsair Vengeance RAM under $260?

Just want to make sure it doesn't make sense to get, say, a Corsair Vengeance LPX 3466 with C16 timings instead.
 
4266 C19 is faster than 3466 C16.
At a rough estimate, 3600 16-16-16 may be close to 4266 C19 in final latency, which is what matters most, but the latter will definitely have better bandwidth, which also helps.
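
To put rough numbers on that: CAS latency is counted in memory clock cycles, and DDR does two transfers per clock, so one cycle lasts 2000 / data-rate nanoseconds. A quick back-of-the-envelope sketch (the function name is just for illustration):

```python
# Convert a CAS rating into absolute first-word latency.
# DDR4 transfers twice per clock, so one memory clock cycle lasts
# 2000 / data_rate nanoseconds when the data rate is given in MT/s.
def first_word_latency_ns(data_rate_mts, cas):
    return cas * 2000 / data_rate_mts

for data_rate, cas in [(3200, 14), (3466, 16), (3600, 16), (4266, 19)]:
    print(f"DDR4-{data_rate} C{cas}: {first_word_latency_ns(data_rate, cas):.2f} ns")

# DDR4-3200 C14: 8.75 ns
# DDR4-3466 C16: 9.23 ns
# DDR4-3600 C16: 8.89 ns
# DDR4-4266 C19: 8.91 ns
```

So 3600 C16 and 4266 C19 land within a few hundredths of a nanosecond of each other, while the 4266 kit still has roughly 18% more raw bandwidth.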
 
Make sure your motherboard supports 4266, not all motherboards will support memory speeds that fast. As an example, my mobo's official support for OC'd memory stops at 3466.

With that said, 100% of everything you're doing is way overkill for Overwatch. It's not a system intensive game. You need neither an 8700k nor a 1080 Ti, and definitely not the world's fastest RAM, to get completely playable framerates; a GTX 1070 with a 6700k and 16GB of DDR4@3000 put in a minimum framerate of 203 for Techspot (https://www.techspot.com/review/1182-nvidia-geforce-gtx-1070/page2.html) and your listed mobo and CPU are both significantly faster than the specs used in that review.
 
Also, as an aside, Intel CPUs typically aren't particularly responsive to faster memory, especially for gaming. Take a look at this: https://www.techpowerup.com/reviews..._Memory_Performance_Benchmark_Analysis/9.html

The general consensus there is that once you are tall enough to ride the ride (DDR4 2400/2666 or higher) then things really start blurring together. Personally, I just recently swapped from DDR4 2133 to DDR4 3200 (got it for free, never would have paid for it), and I gained perhaps 3-5% in PUBG, my game of choice.

This is in contrast to what I've heard about AMD's Ryzen CPUs, which supposedly scale up in performance with memory speed pretty well, though I haven't found a comprehensive 'roundup'-type comparison article like the one I just linked for Intel.
 
Intel CPUs can see large gains from faster RAM in games. After seeing that TechPowerUp review, I gave one of the games they tested a go on my system.
But Intel does care more about final latency than raw frequency, compared to AMD.

Installed the Dishonored 2 demo, and at 1280x720 ultra detail the first two levels were capped at 120fps, or GPU-limited when it dipped below that, with a GTX 1070 at 2088MHz core / 4700MHz memory and [email protected].
In the third level I found a CPU-limited section during the boat ride to shore.
I deleted the start and end of the ride, where it hit the 120fps cap, as including them artificially halved the % increase between the RAM speeds.

AVG FPS
3733 16-16-16 (OC):  92.8 FPS (+5.69% over 3200 C14, +22.7% over 2133 C15)
3200 C14 (XMP):      87.8 FPS (+16.1% over 2133 C15)
2133 C15 (SPD):      75.6 FPS
[attached chart: dishonored.png]
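
Just to sanity-check the arithmetic (a throwaway snippet; the FPS values are the same ones listed above, nothing here is newly measured):

```python
# Recompute the percentage gains from the quoted average FPS figures.
results = {
    "2133 C15 (SPD)":     75.6,
    "3200 C14 (XMP)":     87.8,
    "3733 16-16-16 (OC)": 92.8,
}
base = results["2133 C15 (SPD)"]
for name, fps in results.items():
    gain = (fps / base - 1) * 100
    print(f"{name}: {fps} FPS (+{gain:.1f}% over 2133 C15)")

# 3733 over 3200 is 92.8 / 87.8 - 1, i.e. about 5.7%.
```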
 
Make sure your motherboard supports 4266, not all motherboards will support memory speeds that fast. As an example, my mobo's official support for OC'd memory stops at 3466.

With that said, 100% of everything you're doing is way overkill for Overwatch. It's not a system intensive game. You need neither an 8700k nor a 1080 Ti, and definitely not the world's fastest RAM, to get completely playable framerates; a GTX 1070 with a 6700k and 16GB of DDR4@3000 put in a minimum framerate of 203 for Techspot (https://www.techspot.com/review/1182-nvidia-geforce-gtx-1070/page2.html) and your listed mobo and CPU are both significantly faster than the specs used in that review.

The reason I'm wanting to max out my FPS is that I have a 240Hz monitor and need to get my frame rate as high as possible. I have a 1080 Ti, but my CPU is only an i5-2500, and my frame rate drops to the low 100s with any action in-game.
 
Then you should appreciate the CPU; the memory won't make a huge difference. I wouldn't pay $50 for the difference between 3200 and 4266 for sure.

dasa managed to find a niche where the memory speed makes a difference; according to his benchmarks, 3733 got nearly 6% better framerate than 3200 for this specific section of Dishonored 2. That's in contrast to the general conclusions reached by others I've seen, but he's targeting a very specific niche in one game. It's probably not the only place there is a difference either, but even by his own admission, if you include the entire level and not just his cherry-picked middle section, the overall framerate difference is cut in half on average. His results definitely reinforce the need not to use 2133-speed memory, but I'm not exactly sold on the difference between 3200 and 3733 (or higher), especially if the percentage cost difference between the two speeds likely exceeds the percentage performance difference between them.
 
Also, if you have a 240Hz monitor, I'm assuming it has variable refresh rate support (G-Sync)? If so, well, just enjoy your monitor at whatever framerate you get. You're definitely in the land of diminishing returns as far as performance helping you game better.
 
Also, if you have a 240Hz monitor, I'm assuming it has variable refresh rate support (G-Sync)? If so, well, just enjoy your monitor at whatever framerate you get. You're definitely in the land of diminishing returns as far as performance helping you game better.

Supposedly, even with G-Sync, it is better if you can keep Overwatch always above 240fps and not use G-Sync at all.

I agree that at these high refresh rates it isn't that big a deal, but the reason I am upgrading is that I only play Overwatch and am building this machine around it for the ultimate fast experience. The 240Hz monitor is awesome, but I need 240fps+ to get optimal smoothness. I'm still trying to understand how all this works, but I'm learning from the Blur Busters forum.

Higher FPS is key here.

From what I have read, the memory clock rate matters a lot, especially if you have a 1080 Ti and a good CPU. So with an 8700K and a 1080 Ti, the memory speed matters a lot more than if you were not pushing such fast components.

So for $50 I'm not going to take chances and am just going to get fast RAM, especially when I have a 1080 Ti and an 8700K.
 
4266 C19 is faster than 3466 C16.
At a rough estimate, 3600 16-16-16 may be close to 4266 C19 in final latency, which is what matters most, but the latter will definitely have better bandwidth, which also helps.


dasa: Would you happen to know which would be the better deal for fast memory? This G.Skill 4266, which has better timings (19-19-19-39), for $309:
https://www.newegg.com/Product/Prod...2472&cm_re=gskill_4266-_-20-232-472-_-Product


Or this Corsair Vengeance 4266, which has 19-26-26-46 and costs $259:
https://www.newegg.com/Product/Prod...air_vengeance_lpx_4266-_-20-236-160-_-Product


Both are 4266 with the same CAS, but the G.Skill's other timings are tighter: 19-19-19-39 vs. 19-26-26-46.

Is it worth the extra $50 for those other timings to be faster? I preferred the Corsair because its low profile fits under my huge CPU heatsink.

Thanks.
 
I tried finding benchmarks comparing DRAM sub-timings and not just CAS latency, but, what do you know, that's a fairly niche request. I did find this:

https://forums.anandtech.com/threads/ram-timing-benchmark-cl-trcd-trp-tras-cr-trc-trfc-tref.2174077/

This is a user on the AnandTech forums who, back in 2011, ran some seriously in-depth benchmarks adjusting exactly one timing at a time. His post is gigantic, and I'll admit to not reading it all, but the general gist is that base CAS latency is the most important, while the sub-timings have an effect to a lesser degree.
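
For what it's worth, here's a very rough sketch of why the sub-timings matter less than CAS but aren't nothing: CL is paid on basically every read, while tRCD and tRP only come into play when a new row has to be opened. The model below is deliberately simplified (it ignores tRFC, command rate, the memory controller, and so on), and the two kits plugged into it are just the G.Skill and Corsair 4266 kits from a few posts up:

```python
# Toy DRAM read-latency model: a row-buffer hit costs roughly CL cycles,
# while a row miss costs roughly tRP + tRCD + CL cycles (close the old row,
# open the new one, then issue the read). Real controllers are far messier.
def read_latency_ns(data_rate_mts, cl, trcd, trp, row_hit):
    cycle_ns = 2000 / data_rate_mts          # DDR: two transfers per clock
    cycles = cl if row_hit else (trp + trcd + cl)
    return cycles * cycle_ns

kits = {
    "G.Skill 4266 19-19-19-39": (4266, 19, 19, 19),
    "Corsair 4266 19-26-26-46": (4266, 19, 26, 26),
}
for name, (rate, cl, trcd, trp) in kits.items():
    hit = read_latency_ns(rate, cl, trcd, trp, row_hit=True)
    miss = read_latency_ns(rate, cl, trcd, trp, row_hit=False)
    print(f"{name}: row hit ~{hit:.1f} ns, row miss ~{miss:.1f} ns")

# G.Skill 4266 19-19-19-39: row hit ~8.9 ns, row miss ~26.7 ns
# Corsair 4266 19-26-26-46: row hit ~8.9 ns, row miss ~33.3 ns
```

So the tighter kit only pulls ahead on the slower row-miss accesses, which is consistent with CAS mattering most overall.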
 
Probably not worth another $50.
For all we know, the cheaper kit may have a lower tRFC, which is one of the most important sub-timings.
Either way, it will be an unnoticeable difference.
 
Probably not worth another $50.
For all we know, the cheaper kit may have a lower tRFC, which is one of the most important sub-timings.
Either way, it will be an unnoticeable difference.

Thanks, do I have this right? The Corsair 4266 and, say, the Corsair 3200 are the same memory, but the 4266 is just binned to run at a higher speed?

Does this mean that I can probably run the 4266 at 3200 with much lower timings?
 
No. CAS timings are relative - they're a count of memory clock cycles. Memory running at 3200 versus 4266 (the DDR data rate in MT/s) will have different CAS latency numbers, but they may have the same absolute latency. In other words, if you've got one stick of RAM that runs at CAS 14 at speed X, CAS 15 at speed X+1, and CAS 16 at speed X+2, it's likely the absolute latency is staying more-or-less constant at all three speeds, but the number of cycles that constant amount of latency time represents goes up correspondingly with the frequency.
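
A toy illustration of that point (the 8.75 ns figure below is just DDR4-3200 C14 converted to nanoseconds, not a number from any datasheet):

```python
import math

# If the silicon needs roughly the same absolute time to respond regardless
# of clock, the lowest CAS it can advertise simply grows with the data rate.
absolute_latency_ns = 8.75   # what DDR4-3200 C14 works out to

for data_rate in (3200, 3466, 3600, 4266):
    min_cas = math.ceil(absolute_latency_ns * data_rate / 2000)
    print(f"DDR4-{data_rate}: needs at least CAS {min_cas}")

# DDR4-3200: needs at least CAS 14
# DDR4-3466: needs at least CAS 16
# DDR4-3600: needs at least CAS 16
# DDR4-4266: needs at least CAS 19
```

Which is why running a 4266-rated kit at 3200 usually does let you set a lower CAS number, but the absolute latency doesn't improve nearly as much as the smaller number suggests.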
 
IMO, in terms of actual performance benefits, I'd just grab 3200MHz C14 and call it a day. OC your new chip to 4.8GHz+ and it should be fine. Your 2500 is absolutely a bottleneck at 1080p/240Hz.
 
IMO, in terms of actual performance benefits, I'd just grab 3200MHz C14 and call it a day. OC your new chip to 4.8GHz+ and it should be fine. Your 2500 is absolutely a bottleneck at 1080p/240Hz.

Could you recommend a memory brand that's 3200 C14? Thanks.
 
No real recommendations, but this is your list: https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007611 601190328 600006072 600561668 600546709&IsNodeId=1&bop=And&Order=PRICE&PageSize=36

I'm running the Ripjaws V 3200 C16 set and it's working well, so that's as close as I get.

*Edit* I'm running 2x16, so C14 is slightly harder to come by than at 2x8.

Any reason you recommend the 3200 C14 over, say, the 4266 C19? The price is very close, only about a $10 difference. Wouldn't it make sense to just get the higher-speed memory?
 
Any reason you recommend the 3200 C14 over, say, the 4266 C19? The price is very close, only about a $10 difference. Wouldn't it make sense to just get the higher-speed memory?

I've been saying for about six posts now that it doesn't make much of a difference. Get whichever you like. That said, the price difference is not $10 - the 3200 C14 is $200 and the cheapest 4266 is $260. I suspect you will be happy with either, since this entire thread is about optimizing your computer for what is likely a less-than-5% performance difference in a game where your framerate is already going to be well over 200fps. Everyone in this thread has already put more thought and consideration into this topic than it really deserves, given the tiny difference it'll make regardless of which way you go, so it's time to pick one and be happy with it!

Just double-check your mobo if you go for the 4266; like I said, support for frequencies that high is a bit rarer than support at 3200.
 
Any reason you recommend the 3200 C14 over, say, the 4266 C19? The price is very close, only about a $10 difference. Wouldn't it make sense to just get the higher-speed memory?

Because achieving 3200 C14 is almost guaranteed. Getting 4266 C19 is not. Not all boards will 'set it and forget it'; some may give you grief. 3200 C14 is pretty solid and 99% of good boards will do it. XMP may not work on the 4266, so some manual tweaking may be required. As Sinister said, pick one and go for it, but I really suggest going for 3200 C14. This is Overwatch we're talking about. Worst case, turn down AA >.>
 
RAM speed is more likely to show a difference with AA off than on, since AA is purely GPU.

5% more from RAM and 5% more from delidding the CPU (and maybe spending a little more on cooling) and you have 10%.
That is assuming it's not totally GPU-bottlenecked; then the difference would be 0%.
 
If your motherboard has the 4266 kit you're looking at on its QVL, totally go for it. Don't even bother otherwise; you're likely to end up stuck at 3600 or worse.

When looking for super-fast DDR4, always, always check the QVL of your specific motherboard. In fact, you should start there when looking for very high-speed kits, then just type the name of the kit into Google and see if you can find it.

And for your specific use case I do think ultra-fast DDR4 is worth it. Whenever you're CPU-bound, RAM speed helps some - the two always work hand in hand. Sometimes increasing RAM frequency gives as much as OCing the CPU itself.

Also, you should mostly be looking at frequency. Timings seem to matter little with DDR4, or maybe their small impact is just offset by the very high frequencies.
 
I unfortunately understand where eddie is coming from. I'm the kind of guy who decides I need to build a new PC just so my Civilization 6 turn times are 1-5 seconds shorter.

In the end, the darn games just need better multithreaded coding. I don't even give a lick about FPS, lol.

Get what you can; there'll be something better to spend your money on next year.
 
You're wasting your money.

The real question... do you have the skill to actually kill people at 200fps? I know I do, but that's just my boastful pride.

Do you have a 240Hz LCD?

Lastly, time and time again it has been shown that anything over 3200MHz is diminishing returns, and it only gets worse the faster you go. The reason is that even at 4000+ effective speed the CPU can't actually saturate the true bandwidth capacity of the RAM, leaving you tons of wasted overhead. While this is not necessarily true in synthetics, in real-world software usage scenarios it is very real indeed.

Where you benefit from RAM more than anything is going from dual channel to quad channel, in that the CPU can access all four (or more) DIMMs at the same time, in parallel across multiple channels, each with its own clock speed and bandwidth.
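
For some rough context on the bandwidth side (simple theoretical-peak arithmetic, not measured numbers):

```python
# Theoretical peak bandwidth: each channel is a 64-bit (8-byte) bus doing one
# transfer per MT/s, so peak GB/s = data rate x 8 bytes x channels / 1000.
# Real-world throughput is always a good chunk lower than this ceiling.
def peak_bandwidth_gbs(data_rate_mts, channels):
    return data_rate_mts * 8 * channels / 1000

for rate, channels in [(3200, 2), (4266, 2), (3200, 4)]:
    print(f"DDR4-{rate}, {channels} channels: {peak_bandwidth_gbs(rate, channels):.1f} GB/s")

# DDR4-3200, 2 channels: 51.2 GB/s
# DDR4-4266, 2 channels: 68.3 GB/s
# DDR4-3200, 4 channels: 102.4 GB/s
```

Adding channels raises the theoretical ceiling more than any frequency bump in this range, though keep in mind the Z370/8700K platform in question is dual-channel only.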

This just boils down to a marketing game for many victims who buy the hyper-fast RAM for the absolute most minimal gains.

I mean, if you are happy with 1-3 more FPS going from 3200 to 4xxx speeds at a several-hundred-dollar premium, well then you literally have cash to burn.
 
Historically, Intel has never benefited from tighter timings. It does, however, like increased speeds. Anything past 3200MHz and you start seeing diminishing returns fast, and you run into that sooner on the older architectures that predate Skylake. However, Ryzen and Threadripper do benefit from increased memory speed and tighter timings, as others have stated.

What's your CPU's overclock like, though? I can tell you now that you won't find many processors that can handle a bleeding-edge overclock and run that RAM at full speed.
 
This thread has lots of good info, but OP, I used to run OW on a 3770K @ 4.8GHz with a variety of GPUs, and at medium/high settings I was easily able to stay well north of 150fps. You could buy 2400MHz C16 memory, and with a 1080 Ti and an 8700K I doubt you would have trouble staying north of 250.
 