Ryzen with 3600MHz RAM Benchmark

Looking at the results from the video, I can only determine that Ryzen decisively beats the 7700K in one game. The others showed Ryzen getting a great performance boost from faster memory, but it seems to be bottlenecked a bit by the 1070, because the 7700K vs 1700X results looked very close.

So we don't actually know which CPU is faster when the CPU is the bottleneck; he'd need to drop the settings to all LOW, which I don't expect he'll do. JMTC.
 
OK, I went back through the vid to see if I could compare any aspect to another review, to highlight the point that the GTX1070 is bottlenecking the 7700K in the context of absolute CPU performance.
One comparison stands out, as it lines up with the Eurogamer/Digital Foundry review comparing the 7700K to the 1800X in RoTR; the 1800X had 3200MHz memory while the 7700K had 3000MHz memory.

YouTube vid with GTX1070:
1700X at 3.97GHz/3600MHz: 87.1 fps
1700X at 3.97GHz/3200MHz: 86.9 fps
7700K 5GHz/3200MHz: 94.3 fps

Eurogamer/Digital Foundry with Titan Pascal:
1800X no OC/3200MHz: 85.8 fps
7700K no OC/3000MHz: 126.5 fps

Both at 1080p.
The 7700K is being bottlenecked by the GTX1070, while the 1700X/1800X are, for now, the bottlenecks in their own setups (this will improve, but unfortunately none of us know by how much or how long it will take).
Obviously not every game will behave this way, but quite a lot will be limited by the GTX1070 when testing the limits of a 7700K at 1080p.
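To put numbers on the bottleneck point, here is a quick sketch (Python, purely illustrative arithmetic on the FPS figures quoted above) of the 7700K's lead in each setup; the gap collapsing from roughly 47% to 8% is what a GPU limit looks like:

```python
# Quantify the 7700K's lead over Ryzen in each test setup, using the
# FPS figures quoted above (3200MHz Ryzen results in both cases).
def lead_pct(intel_fps: float, ryzen_fps: float) -> float:
    """Percentage by which the 7700K result exceeds the Ryzen result."""
    return (intel_fps / ryzen_fps - 1.0) * 100.0

print(f"GTX1070 setup:      +{lead_pct(94.3, 86.9):.1f}%")   # ~8.5% (GPU-limited)
print(f"Titan Pascal setup: +{lead_pct(126.5, 85.8):.1f}%")  # ~47.4% (CPU-limited)
```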
Cheers
 
Yeah, it's irritating, but be aware you probably won't be able to maintain your current RAM speeds with all four slots filled, so there's a (minor) speed trade-off to be made if you want 32GB right now.

I made the RAM trade-off. Needed that 32GB for graphics work, but what I did was get it as two DIMMs instead of four. This did mean, however, getting dual-rank RAM instead of single-rank, and the best I could get out of it was the "3000" profile (which is really 2933) at 15-15-15-35 timings. I combed the QVL pretty hard to make this happen.
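For anyone weighing a similar trade-off, here is a small sketch (Python; the standard first-word-latency formula, with the 3200 CL14 and 3600 CL16 kits as example comparisons) showing that 2933 CL15 is not far off faster kits in absolute latency:

```python
# First-word latency in nanoseconds: CL cycles divided by the memory
# clock (half the DDR data rate). Lower is better.
def first_word_latency_ns(data_rate_mts: int, cas_latency: int) -> float:
    return cas_latency * 2000.0 / data_rate_mts

for rate, cl in [(2933, 15), (3200, 14), (3600, 16)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
# DDR4-2933 CL15: 10.23 ns
# DDR4-3200 CL14: 8.75 ns
# DDR4-3600 CL16: 8.89 ns
```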
 
When you repeat the word "biased" enough times, the review suddenly becomes "fake news"

Slightly OT, but this idea that calling someone biased refutes their argument is baloney. Everybody is biased; it's part of being human. Accusations of bias are ad hominems: attacking the person, not the argument. If someone believes the YouTuber in question is wrong, they are certainly free to conduct their own tests and experiments and refute his argument that way. Hell, I'd be interested in watching that. Buy a Ryzen, some fast RAM, and a 1080 Ti... and buy a similar 7700K setup... and conduct your own memory speed tests without the complaint that the 1070 is skewing the results too much by being GPU bound. Come back to us with the results.
 
I made the RAM trade-off. Needed that 32GB for graphics work, but what I did was get it as two DIMMs instead of four. This did mean, however, getting dual-rank RAM instead of single-rank, and the best I could get out of it was the "3000" profile (which is really 2933) at 15-15-15-35 timings. I combed the QVL pretty hard to make this happen.

That is pretty good for dual-rank on Ryzen.
Cheers
 
I made the RAM trade-off. Needed that 32GB for graphics work, but what I did was get it as two DIMMs instead of four. This did mean, however, getting dual-rank RAM instead of single-rank, and the best I could get out of it was the "3000" profile (which is really 2933) at 15-15-15-35 timings. I combed the QVL pretty hard to make this happen.

Which RAM type?
 
OK, I went back through the vid to see if I could compare any aspect to another review, to highlight the point that the GTX1070 is bottlenecking the 7700K in the context of absolute CPU performance.
One comparison stands out, as it lines up with the Eurogamer/Digital Foundry review comparing the 7700K to the 1800X in RoTR; the 1800X had 3200MHz memory while the 7700K had 3000MHz memory.

YouTube vid with GTX1070:
1700X at 3.97GHz/3600MHz: 87.1 fps
1700X at 3.97GHz/3200MHz: 86.9 fps
7700K 5GHz/3200MHz: 94.3 fps

Eurogamer/Digital Foundry with Titan Pascal:
1800X no OC/3200MHz: 85.8 fps
7700K no OC/3000MHz: 126.5 fps

Both at 1080p.
The 7700K is being bottlenecked by the GTX1070, while the 1700X/1800X are, for now, the bottlenecks in their own setups (this will improve, but unfortunately none of us know by how much or how long it will take).
Obviously not every game will behave this way, but quite a lot will be limited by the GTX1070 when testing the limits of a 7700K at 1080p.
Cheers


The benchmarks were different too; different software will get different results. RoTR is absolutely an outlier, and I'm curious whether there will be optimization specifically for that game, or if AMD will just choose to ignore it.
 
Thanks dgacioch, that's at least a site people seem to trust.

Forgetting about 7700K comparisons, going from 2133 to 3200 on the 1800X seems to net around 10%, give or take. That's not a bad return. Though it loses pretty badly to Intel in two games, the rest are still respectable.
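As a quick sanity check on that scaling (Python, illustrative only): the jump from 2133 to 3200 is about 50% more theoretical bandwidth, so an observed ~10% FPS gain shows these games are far from purely bandwidth-bound:

```python
# Theoretical bandwidth scales linearly with data rate, but game FPS
# clearly does not: ~50% more bandwidth here nets roughly 10% more FPS.
bw_gain_pct = (3200 / 2133 - 1) * 100
print(f"bandwidth: +{bw_gain_pct:.0f}%, observed FPS: ~+10%")  # +50%
```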
 
OK, I went back through the vid to see if I could compare any aspect to another review, to highlight the point that the GTX1070 is bottlenecking the 7700K in the context of absolute CPU performance.
One comparison stands out, as it lines up with the Eurogamer/Digital Foundry review comparing the 7700K to the 1800X in RoTR; the 1800X had 3200MHz memory while the 7700K had 3000MHz memory.

YouTube vid with GTX1070:
1700X at 3.97GHz/3600MHz: 87.1 fps
1700X at 3.97GHz/3200MHz: 86.9 fps
7700K 5GHz/3200MHz: 94.3 fps

Eurogamer/Digital Foundry with Titan Pascal:
1800X no OC/3200MHz: 85.8 fps
7700K no OC/3000MHz: 126.5 fps

Both at 1080p.
The 7700K is being bottlenecked by the GTX1070, while the 1700X/1800X are, for now, the bottlenecks in their own setups (this will improve, but unfortunately none of us know by how much or how long it will take).
Obviously not every game will behave this way, but quite a lot will be limited by the GTX1070 when testing the limits of a 7700K at 1080p.
Cheers

Testing the 7700K and Ryzen 7 with any $600+ video card at 1080p is a moot point. The GTX1070 is absolutely the top end for 95% of 1080p gamers.
 
The benchmarks were different too; different software will get different results. RoTR is absolutely an outlier, and I'm curious whether there will be optimization specifically for that game, or if AMD will just choose to ignore it.
You are dismissing it too quickly.
Check the 3200MHz Ryzen result against both the vid and the Eurogamer result I provided, and then look at the OC.
It is clear there is a bottleneck being caused by the GTX1070, and it is less noticeable on Ryzen because the CPU bottleneck sits right around the limit of the GTX1070.

What I showed is not an outlier: nearly every review that tests with at least a GTX1080 has the Intel 7700K performing much better relative to Ryzen than the video shows (the context here is absolute CPU performance rather than gaming reality with lower-tier GPUs).
The outlier is the video using a GTX1070 to demonstrate absolute CPU performance, because it skews the results.

There is a reason why Eurogamer/Digital Foundry went with a Titan Pascal to test at 1080p when looking at absolute CPU performance, and you should put more weight on their testing methodology and results than on the YouTube vid, for reasons that are pretty clear when comparing the trends.

Cheers
 
They have HEDT Ryzen 12/16-core coming out. Maybe quad-channel memory will help Ryzen pull it over the top!??!
 
They have HEDT Ryzen 12/16-core coming out. Maybe quad-channel memory will help Ryzen pull it over the top!??!

Not likely. Even if these results are due to increased bandwidth (and not just increased interconnect speed or decreased latency), core speeds will be lower on the higher core-count products, meaning that poorly threaded applications will still be limited and fall behind the product with the fastest single-core performance, currently the 7700K.
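That trade-off is easy to see with a minimal Amdahl's-law sketch (Python; the clock speeds and core counts below are hypothetical placeholders, not announced specs):

```python
# Relative throughput of a workload under Amdahl's law, scaled by clock:
# the serial portion runs on one core, the parallel portion on all cores.
def rel_throughput(parallel_fraction: float, cores: int, clock_ghz: float) -> float:
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

for p in (0.50, 0.90, 0.99):
    quad = rel_throughput(p, 4, 5.0)    # 7700K-style: 4 fast cores at 5.0GHz
    many = rel_throughput(p, 16, 3.4)   # hypothetical 16 slower cores at 3.4GHz
    print(f"parallel={p:.2f}: 4C/5.0GHz={quad:.1f}  16C/3.4GHz={many:.1f}")
# Poorly threaded workloads (p=0.50) favor the fast quad; only highly
# parallel workloads let the 16-core pull ahead.
```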
 
They have HEDT Ryzen 12/16-core coming out. Maybe quad-channel memory will help Ryzen pull it over the top!??!

I think that platform is meant to compete with Skylake-X/Kaby Lake-X. My guess is the performance of the current R7 and upcoming R5 chips will continue to improve, but I wouldn't expect them to match the 7700K as a pure gaming chip.
 
So let me get this straight:
You're saying that any benchmark we see with a 1070 or below at 1080p comparing Ryzen 7 to a 7700K is a total fabrication?
So according to you, no one in the real world would pair a 7700K with anything lower than a 1080 at 1440p or higher?
I'm sure if he had the gear he would have tested it. Like someone already said, it's a data point to add to all the other so-called "farces".
Not the end-all-be-all comparison you seem to think everyone is capable of.

I have to agree with you here. You see all these people running at 1080p or less to prove that Intel's CPUs are the bomb; who plays at 1080p anymore? Multiscreen has been out since the HD 5000 series, and 4K has been out for a couple of years now. If you go by some people's bias on this forum, then playing at anything under 1440p should count as biased too, because it is not representative of real-world play.
 
I have to agree with you here. You see all these people running at 1080p or less to prove that Intel's CPUs are the bomb; who plays at 1080p anymore? Multiscreen has been out since the HD 5000 series, and 4K has been out for a couple of years now. If you go by some people's bias on this forum, then playing at anything under 1440p should count as biased too, because it is not representative of real-world play.

Well, lots of gamers would rather go with 1080p at 144Hz or higher instead of a higher resolution. A lot of people defending the Ryzen lineup act like 60Hz is the hard limit of ALL monitors. A lot of people value high refresh rates.
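To put the high-refresh point in perspective, a trivial frame-time calculation (Python) shows why 1080p at high refresh leans so hard on the CPU:

```python
# Frame-time budget per refresh rate: at 144Hz the CPU has ~6.9ms per
# frame versus ~16.7ms at 60Hz, so CPU speed matters more, not less.
for hz in (60, 120, 144):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms per frame")
```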
 
So you then agree that people need to get off their high horse and accept that everyone plays at different resolutions and with different graphics cards. I am sure Nvidia has sold a whole bunch more 1070s than 1080s, so the 1070 is more representative of the real-world experience than a 1080.
 
So you then agree that people need to get off their high horse and accept that everyone plays at different resolutions and with different graphics cards. I am sure Nvidia has sold a whole bunch more 1070s than 1080s, so the 1070 is more representative of the real-world experience than a 1080.

You can find anyone claiming anything to fit their needs. 1080p is not the holy grail of gaming, and as many of us know, it is being (ab)used to show that Ryzen is not a good CPU for gaming. The truth lies somewhere in the middle: it is not great for 1080p gaming, but certainly not bad in the way the forum trolls on here go out of their way to claim.
 
This language of Ryzen 'sucks' or is 'not great' for gaming really needs to stop. Ryzen is great for gaming; it's just not the very best.

For most of us, 'great' is more than enough, especially given the performance that Ryzen brings to the table for intensive non-gaming tasks.
 
This language of Ryzen 'sucks' or is 'not great' for gaming really needs to stop. Ryzen is great for gaming; it's just not the very best.

For most of us, 'great' is more than enough, especially given the performance that Ryzen brings to the table for intensive non-gaming tasks.

Absolutely; it's the previous AMD generation that deserves to be bashed, but Ryzen wins back the price/performance crown for now.
Complaining that it did not beat the 7700K is just silly: if it had, the prices would be much closer to Intel's, and we would not be able to afford 6C/12T and 8C/16T CPUs for another year or two.
 
Testing the 7700K and Ryzen 7 with any $600+ video card at 1080p is a moot point. The GTX1070 is absolutely the top end for 95% of 1080p gamers.

Totally agree.
But the YouTube vid in the OP is about showing absolute CPU performance, and it was discussed in that context because extreme OC is pointless without it; his results need to be weighed against the fact that, in the absolute CPU performance context, he has bottlenecked the 7700K.

There are two discussions, as I mentioned: absolute CPU performance at 1080p, and then gaming reality using GPUs up to a certain tier, with the GTX1070 as the upper level.

Absolute CPU performance, though, like I mentioned, is not just about now but about 2-3 years from now and what GPU one will be using then. As I discussed and agreed with DuronBurgerMan, the situation differs for everyone in how often they upgrade the GPU and CPU (the CPU usually being kept quite a bit longer than GPUs).
A 970 was faster than a 780, and the GTX1070 is as fast as a 980 Ti up to 1440p, so it is highly probable the Volta '2070' will be faster than the GTX1080; the consideration now is whether someone will upgrade to the Volta '2070' in the next 16-28 months while looking to keep the CPU for 5 years.
Of course, one benefit with AMD is that it is less painful from a cost perspective to upgrade the CPU more often than with Intel (from both a CPU and a platform perspective).

Cheers
 
In the previous video Mindblank explains why he used a 1070: it's all he could afford due to his location. The guy is limited by his budget, because where he lives GPU prices are far more exorbitant than in the US. I give him major props for the video, though. If he had more hardware to work with he would certainly give us more content. He also has a 7700K delidding video that is pretty good.


Also, the guy has very few subscribers and no advertisements (from what I saw), so I believe he is trying his best and not trying to skew results or hurt any competitor.
 
Totally agree.
But the YouTube vid in the OP is about showing absolute CPU performance, and it was discussed in that context because extreme OC is pointless without it; his results need to be weighed against the fact that, in the absolute CPU performance context, he has bottlenecked the 7700K.

There are two discussions, as I mentioned: absolute CPU performance at 1080p, and then gaming reality using GPUs up to a certain tier, with the GTX1070 as the upper level.

Absolute CPU performance, though, like I mentioned, is not just about now but about 2-3 years from now and what GPU one will be using then. As I discussed and agreed with DuronBurgerMan, the situation differs for everyone in how often they upgrade the GPU and CPU (the CPU usually being kept quite a bit longer than GPUs).
A 970 was faster than a 780, and the GTX1070 is as fast as a 980 Ti up to 1440p, so it is highly probable the Volta '2070' will be faster than the GTX1080; the consideration now is whether someone will upgrade to the Volta '2070' in the next 16-28 months while looking to keep the CPU for 5 years.
Of course, one benefit with AMD is that it is less painful from a cost perspective to upgrade the CPU more often than with Intel (from both a CPU and a platform perspective).

Cheers
I agree that absolute performance is indicative of the long-term prospects of the platform. Actually, that's the reason I would rather wait for the next Intel HEDT platform with "entry-level" 4C/8T options than jump on the more affordable Ryzen platform now, especially considering the cost of the 3600MHz RAM needed to get the most out of Ryzen, and all the trouble to make it work.
 
If anyone knows of DDR4-3200 or faster ECC UDIMMs, please share. Would love to pair them with a Ryzen for my new rig build. (;
 
I think part of the issue is that reviewers test with unrealistic setups in order to eliminate bottlenecks in other parts of the system. I fully understand why this is done, but I'd prefer to read or see tests with realistic hardware at realistic settings.

For example, CPU reviews run at low settings and low resolution but with high-end graphics cards. This tells you the theoretical performance of a chip, but maybe not what you could expect if you paired that chip with a similarly priced graphics card and ran at normal settings.

Same with GPU benchmarks. They run at 1440p or 4K (which I like, fine) but with ridiculous settings, like full max ultra with MSAA, getting something like 20 fps, which no one would ever play at.

I like the way [H] does it. Settings don't need to be maxed; they should be reasonable, hitting playable frame rates, and the hardware should be roughly matched. This shows what *you* can expect if you buy the hardware, not theoretical numbers.
 
I have to agree with you here. You see all these people running at 1080p or less to prove that Intel's CPUs are the bomb; who plays at 1080p anymore?

I do.
Vizio E32-D1 32" 1080p @ 120Hz, a cheap Black Friday impulse buy.
When I upgrade my GPU, I'll go 48" 4K minimum.
 
I play at 1080p, when I game. But... probably not for much longer. I've had my eye on some sweet 4K monitors lately...
 
I play at 1080p, when I game. But... probably not for much longer. I've had my eye on some sweet 4K monitors lately...

Just make sure you go up in screen size if you can afford it. Last summer I upgraded from an old 27" 1080p monitor to a 34" ultrawide, and it was probably the best upgrade I've made to my rig in years.
 
Just make sure you go up in screen size if you can afford it. Last summer I upgraded from an old 27" 1080p monitor to a 34" ultrawide, and it was probably the best upgrade I've made to my rig in years.

Screen size is not a problem: 30" on one side and 42" on the other, but both are only 1080p.
 
Well, I have gamed for years now on either three HP LP1965s in Eyefinity or on a ZR30w. I did have an HTPC hooked up to a 1080p TV, but that was replaced last year on Black Friday, and even then I didn't game on it much.
 
And here last year I finally "upgraded" to two 24" ViewSonic 1080p 144Hz monitors from my then/now antiquated Westinghouse 19" 1440 x 900 monitors... lol.
 
No use in explaining it, as the goalposts have been moved. Mindblank's previous review showed Ryzen in a not-so-good situation a couple of weeks ago; he retested, and now it is better. No review is perfect, but the data is still there; he even gives frametime graphs, which most reviewers tend to omit. Once things get patched up on my end I will do my own testing and draw my own conclusions with my 1080 Ti. So far the experience has been good, and I game at 4K now, so I couldn't give a rat's ass about 1080p anymore.

I have a beautiful 3440x1440 G-Sync monitor on the way. Once I get it I'll post some benches on my Ryzen rig. I'll run them at my 3.85GHz crunching clock. All of this will be with my Titan XP running at 2GHz.
 
Which monitor did you get? Interested to see the results! What's your setup if you don't mind me asking?
 
If anyone knows of DDR4-3200 or faster ECC UDIMMs, please share. Would love to pair them with a Ryzen for my new rig build. (;
Currently running @ 2933 on a single stick, but I may be limited by the Gigabyte board. I have a thread under Memory & Mobos where I am posting progress updates. It's a Micron stick, as I could not locate Samsung B-die ECC UDIMMs. Still trying though!
 
Which monitor did you get? Interested to see the results! What's your setup if you don't mind me asking?

I got the Asus ROG PG348Q 34" 3440 x 1440 100 Hz Curved IPS G-Sync Gaming Monitor. The rest of my rig is in my signature. The Swift is moving to my Intel rig.
 
Sorry, but I saw this in another forum and had to post it here:
You must not have received the memo. Intel only makes the 7700K and nobody on Earth has any use for anything other than that.
LoL.

Funny, because it seems that has been the only argument, and every reviewer and YouTuber uses it in their tests.
 
Just curious... has anyone else been able to benchmark a 1700/1800X with 3600MHz RAM? I know the video shows massive gains; I am curious whether anyone else can verify this.

No benchmarks for 3200MHz either... I am curious whether anyone else was able to verify his claims.
 
Just curious... has anyone else been able to benchmark a 1700/1800X with 3600MHz RAM? I know the video shows massive gains; I am curious whether anyone else can verify this.

No benchmarks for 3200MHz either... I am curious whether anyone else was able to verify his claims.

Currently very few boards have the BIOS capability to mess with the BCLK, so only a handful of users can even test it. And with all the bricks Asus caused with their friendly Windows auto BIOS updater, there are even fewer boards for folks to test with.

My poor man's Gaming 3 doesn't, but I have some F4-3600C16D-16GTZ on the way so I can be ready if and when BIOSes gain support for higher speeds.
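For context on why BCLK control matters here: as I understand it, launch AM4 BIOSes topped out at the DDR4-3200 memory strap, so anything above that meant raising the base clock. A rough sketch of the arithmetic (Python; the 16x top memory ratio is my assumption about the platform):

```python
# DDR4 effective data rate from base clock and memory ratio.
# Memory clock = BCLK x ratio; DDR transfers data twice per clock.
def ddr4_rate_mts(bclk_mhz: float, mem_ratio: float) -> float:
    return bclk_mhz * mem_ratio * 2.0

print(ddr4_rate_mts(100.0, 16.0))   # 3200.0 MT/s: top strap at stock BCLK
print(ddr4_rate_mts(112.5, 16.0))   # 3600.0 MT/s: same strap, raised BCLK
```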
 