Ryzen with 3600MHz RAM Benchmark

What I gained from the video: taking the 7700K out of the picture and comparing Ryzen
at 2933MHz (because most people still can't hit 3200MHz) vs 3600MHz made a world of difference in overall FPS, IMHO.
Just imagine what Ryzen could do at 4000MHz. ;)

The same goes for Kaby though.
 
Thx for bursting that bubble and pointing out that pesky IMC. :p
Speculation is that the Ryzen DDR4 IMC is the same one that's in Carrizo. I'm thinking I might test that theory out to the best of my ability, seeing as my laptop has a Carrizo and 8GB. I can test this because of a stupid design by HP: it lacks dual channel, so if I remove one stick from my Ryzen and match the memory speed and timings, it'll be a close comparison (I can downcore as well to make it even more comparable). We'll see after I get Win10 stable, heh.


I'll send you a USB stick or DVD with Windows 10 on it (legit version from MS, no hacks, but you'll need your own key). Shoot me a PM.
Thanks a ton for the offer!! :happy: I'll definitely let you know. With my desktop being down since the beginning of May (due to anticipating Ryzen lol), this month's usage was way down, which, paired with the rollover that Verizon finally implemented last year, means I have 25GB left to burn and ~14 days to do it in. As such, before bed last night I grabbed the Win10 KB4016635 cumulative update (1.08GB). That'll bring the build up from... I think RTM is build 10284 (ver 1507 or 1502)... to something like 14983. I don't know what version that'll make it, though, but we'll see. I don't think I have Asperger's, but when it comes to my OS I am a creature of habit, and I wasn't looking forward to Win10 on this laptop I bought last year. However, I've gotten used to it and gotten it to where I'm mostly happy with how it looks and feels. That being said, it's Ver 1511 (build 10586) and I like a few aspects it has, which were immediately obvious compared to that original one I installed on the desktop.

Rambling aside... lol Point is, I may yet take you up on the offer, since what I grabbed, from my understanding, is more of a "cumulative patch" and not an update like the Anniversary or the upcoming Creators. So quite likely I'll still be on my same version; this file will just change the build number. I don't know if they come in straight 64-bit ISOs though, and the 32+64-bit ISO requires a DL DVD. It also needs to be burned in UDF if it isn't set to that, as I learned that mistake the hard way lol. ISO mode has a file size limit of 2GB and the install.wim is 2.4GB+, so it was "corrupted". Whoops! :facepalm:
Will let'cha know though!

That's really solid of you, cybereality. I was going to offer to do the same with a DVD. I'm glad the [H] community can be so outstanding.
I agree. With how Topweasel helped me out so I could finally get a motherboard and Cyber's offer... It restores some faith in humanity! :shame:
Or, maybe... :cautious: just maybe... :cautious: it's cuz AMD fans are just nicer people?
*hides* lol :smuggrin:

Srsly though lol A sincere tip of my hat to you (both) for being willing to help me out! Thanks guys (and/or gals heh) :pompous:
 
I am really shocked to see it beating a 5GHz i7-7700K in any gaming benchmarks.
 
I am really shocked to see it beating a 5GHz i7-7700K in any gaming benchmarks.

Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?

I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...
 
Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?

I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...

To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.

I'd have liked to see the 3600 for the 7700K as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.
 
I know it was his reasoning and not yours or anyone here. I wasn't trying to imply that you or anyone here has anything to do with his benches. I'm just simply stating the fact that his method is flawed and biased.

While I know Kaby doesn't falter, it also would have seen higher performance if he had benched using a 3600 memory clock, same as Ryzen. So for all intents and purposes the gap stays the same, with Kaby leading Ryzen in gaming.

The only information to be gleaned from his flawed and biased benches is the Ryzen performance increase between DDR4 speeds. His comparison to Kaby is utterly useless because of the biased method employed by the reviewer.

In the article I previously linked, the only game they benched that didn't enjoy fairly good gains was The Division, and even that game saw a minuscule bump (within margin of error, however).

So if memory parity is what you want, just ignore the Ryzen 3600 part and compare the i7 with 3200 and the Ryzen graph bars with the 3200? Is that impossible for you?
 
To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.

I'd have liked to see the 3600 for the 7700K as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.

Why would you trust any numbers from an obviously biased review? Sneaky or not, biased is biased. Just because he gives some bullshit excuse? Previous tests have shown his excuse is bullshit, so why trust anything shown in that video?

So if memory parity is what you want, just ignore the Ryzen 3600 part and compare the i7 with 3200 and the Ryzen graph bars with the 3200? Is that impossible for you?

See above.

I'm surprised that so many members here would blindly trust biased reviews from some unknown YouTuber... Did anyone even check this kid's channel? Three quarters of his videos are playthroughs. And then he turns into a hardware reviewer?... o_O
 
Why would you trust any numbers from an obviously biased review? Sneaky or not, biased is biased. Just because he gives some bullshit excuse? Previous tests have shown his excuse is bullshit, so why trust anything shown in that video?



See above.

I'm surprised that so many members here would blindly trust biased reviews from some unknown YouTuber... Did anyone even check this kid's channel? Three quarters of his videos are playthroughs. And then he turns into a hardware reviewer?... o_O

Honestly, it's a good challenge to the big names to get their shit together and get a motherboard that can run 3200+ RAM.

In scientific testing, one guy says "I tried this and these are my results! Someone else try it too!" by publishing.

So no, I'm not taking this as gospel but until someone else confirms or denies with similar RAM speeds I'm not going to completely ignore him.
 
I know how hard it was getting the 7700K stable at 3600 XMP at 5GHz. I can only imagine how tricky that would be on the AMD.

Trial and error for days or weeks, maybe no success at all. There are some big hurdles to clear there.
 
If I can be 100% honest (not that I haven't been this entire time....), Ryzen being what it is kinda takes me back to the good old days of overclocking. Even with the 7700K sucking a bit at it, it's a breath of fresh air to me. If you want an epic overclock, or more speed than the next fool, you... *gasp* ... actually have to work at it!! :eek: (That's not a jab at you or anyone here, just my opinion in general.)

We've just gotten too complacent with these cookie-cutter overclocks. It's so bad that people have been factoring in the damn Boosts into whether a chip does 'good' or not. Flogging Ryzen for being a "bad overclocker" because an 1800X has a Boost of 4GHz... Yet we seem to have ignored the fact that it's 4GHz on one, maybe two cores if you're lucky. And the all-core speed is quite a bit lower. That goes for Intel, too.

But I digress, this all just means we're not going to mash DEL right after the system is built and dial in the same settings as the rest. We're coming back to the good ole days of fiddle... test.... fiddle some more... test some more...

Sadly I can't toss my hat into the OC ring quite yet, as I've come to find out the RTM of Win 10 (the only version I have access to currently, due to my limited monthly bandwidth) has a real bug that ends up hard-locking or BSoDing (CLOCK_WATCHDOG_TIMEOUT). Which means I'd not be able to tell if a crash was OC-related or general-derpiness-related :(

I do agree, overclocking has become a bit too easy over the years. While it's nice that it has, it's also nice to go back to the old style of overclocking to eke out every ounce of performance you can out of a build.
 
Why would you trust any numbers from an obviously biased review? Sneaky or not, biased is biased. Just because he gives some bullshit excuse? Previous tests have shown his excuse is bullshit, so why trust anything shown in that video?



See above.

I'm surprised that so many members here would blindly trust biased reviews from some unknown YouTuber... Did anyone even check this kid's channel? Three quarters of his videos are playthroughs. And then he turns into a hardware reviewer?... o_O
Have to agree.
Also, he really needs a GTX 1080, or more ideally a Pascal Titan or 1080ti, to really see the CPU performance limit even at 1080p.
I mentioned and linked it in the past: Hardware Unboxed clearly shows that a GTX1070 is a bottleneck with the 7700K (and with SMT) at 1080p, and unfortunately that skews this test's results. It also really does not help that he did not match the memory when comparing the systems, but then that is academic due to the GTX1070.
Which is why, as some have noted even in this thread, with a GTX1070 the comparison is much closer or even edging towards Ryzen, while a GTX1080 favours the 7700K.

On the plus side for Ryzen, another consideration is that if one is not going for a GTX1080 anytime soon, then Ryzen can make more sense for gaming.
But the offset is to consider how long you keep the system and when you are likely to get a card with such performance (look at how powerful the GTX1070 is compared to the previous gen, and how this trend will continue every 14-18 months).
Cheers
 
Why would you trust any numbers from an obviously biased review? Sneaky or not, biased is biased. Just because he gives some bullshit excuse? Previous tests have shown his excuse is bullshit, so why trust anything shown in that video?



See above.

I'm surprised that so many members here would blindly trust biased reviews from some unknown YouTuber... Did anyone even check this kid's channel? Three quarters of his videos are playthroughs. And then he turns into a hardware reviewer?... o_O

I mean, he's clearly a hobbyist. I don't see any reason to believe he just made up bar graphs or anything though. I'd have liked to have seen 3600 on the Intel, and a GTX1080 instead, but it is what it is. That's the hardware he had access to and what he felt like testing. It's just a data point that you can take as you will. His initial Ryzen review matched up with everything else I've seen (excellent in productivity, good in some games, meh in other games), so I don't think he's just wildly inventing numbers. There's nothing particularly alarming in this video either. It's mostly GPU limited because of the GTX1070, so it's not surprising the results are a bit more Ryzen favorable than usual.
 
Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?

I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...

Obviously biased towards Intel, since it had a full 1000MHz advantage... My point is I didn't expect Ryzen to beat a Haswell clock for clock in any games. Now we are seeing it trading blows with Kaby despite a 1000MHz deficit...
 
Obviously biased towards Intel, since it had a full 1000MHz advantage... My point is I didn't expect Ryzen to beat a Haswell clock for clock in any games. Now we are seeing it trading blows with Kaby despite a 1000MHz deficit...

The Ryzen chip wasn't running stock, so there was no bias toward the Intel chip; WTF are you even going on about? You can't possibly be dense enough to think that just because the Intel chip can hit 5GHz and the Ryzen can't, the test is biased toward Intel. If you do think that, then you're just ignorant of reality.
 
He tested with a 1070 at 1080p.
Not only does it artificially limit the 7700K's performance ceiling (since it is the faster CPU in games), it also restricts the amount of improvement we can see from faster RAM on the Ryzen chips. Any amount of time his 1070 spent at full capacity ruined the CPU results.
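To put the bottleneck argument in concrete terms, here's a tiny sketch with purely made-up numbers (none of these figures come from the video): once the GPU caps the frame rate, two CPUs with very different headroom produce the same measured FPS.

```python
# Purely hypothetical numbers to illustrate how a GPU cap hides CPU differences.
gpu_cap_fps = 110  # frame rate the GPU can deliver at these settings (made up)
cpu_ceilings_fps = {"7700K": 150, "Ryzen 1700": 130}  # hypothetical CPU-bound ceilings

for cpu, ceiling in cpu_ceilings_fps.items():
    measured = min(ceiling, gpu_cap_fps)  # the benchmark only ever records this value
    print(f"{cpu}: CPU ceiling {ceiling} fps, measured {measured} fps")

# Both lines print a measured 110 fps, so the chart shows a "tie"
# even though the hypothetical CPU ceilings differ by 20 fps.
```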

Why do you guys continually expose yourselves to benchmarks that you know are fake, or at best misleading? Neither chip is being accurately represented, and even the metric he set out to test (RAM performance) isn't being shown properly. The whole test was a farce.

This is AMD fanboy clickbait, like all the other nonsense YouTube benchmarks floating around in recent weeks. The amount of people defending this garbage over their confirmation biases is really disappointing, you guys know better than this. Don't make clowns of yourselves because you need a cheap and easy way to white knight for AMD.

I put away my tablet and climbed out of bed to edit this post. Why you gotta make me do that?
 
I don't think the benchmark is fake. I know it's not scientific, but it's just another data point. There are lots of videos on YouTube showing real users playing real games. While you can say their methods are flawed, they are showing videos of performance on the screen to see for yourself. Believe what you will.
 
"Biased" aka "I don't like the results"
If someone benchmarked games on an FX-4000 against a 7700K using a GeForce 4, they'd both get about 10fps and you'd say it was a flawed review. You wouldn't say the CPUs perform the same. This is a flawed review for the exact same reason. There's no useful information to be gleaned here because there's too much error involved.

Accusing people of disliking this benchmark because they hate AMD is just as bad as praising this benchmark because you like AMD. Some of us are in the midst of picking out parts for new builds and want to be able to make informed decisions, not have to sift through BS benchmarks looking for mistakes.

I'm trying to decide between Kaby/Skylake-X vs 7700K vs 1600X and videos like this are a waste of my time. I feel bad for everyone else making purchase decisions based on bad info like this. I do not care how Ryzen or Intel perform under GPU-bottlenecked scenarios... It's not helpful.

Please help me spend $500+ wisely and don't promote clickbait videos like this one. I will be very upset if I blow my savings on a new PC to find out I was duped by fanboys with an agenda. Thanks in advance.
 
He tested with a 1070 at 1080p.
Not only does it artificially limit the 7700K's performance ceiling (since it is the faster CPU in games), it also restricts the amount of improvement we can see from faster RAM on the Ryzen chips. Any amount of time his 1070 spent at full capacity ruined the CPU results.

Why do you guys continually expose yourselves to benchmarks that you know are fake, or at best misleading? Neither chip is being accurately represented, and even the metric he set out to test (RAM performance) isn't being shown properly. The whole test was a farce.

This is AMD fanboy clickbait, like all the other nonsense YouTube benchmarks floating around in recent weeks. The amount of people defending this garbage over their confirmation biases is really disappointing, you guys know better than this. Don't make clowns of yourselves because you need a cheap and easy way to white knight for AMD.

I put away my tablet and climbed out of bed to edit this post. Why you gotta make me do that?

So let me get this straight,
You're saying that any benchmark we see with a 1070 or below at 1080p with Ryzen 7 is a total fabrication when compared to a 7700K?
So according to you, no one in the real world would pair a 7700K with anything lower than a 1080 at 1440p or higher?
I'm sure if he had the gear he would have tested it. Like someone already said, it's a data point to add with all the other so-called "farces".
Not the end-all-be-all comparison you seem to think everyone is capable of.
 
I'm trying to decide between Kaby/Skylake-X vs 7700K vs 1600X and videos like this are a waste of my time. I feel bad for everyone else making purchase decisions based on bad info like this. I do not care how Ryzen or Intel perform under GPU-bottlenecked scenarios... It's not helpful.

I'm in the same boat. Deciding between 1700, 7700k, or waiting for Skylake-X. It's not useless information though. Could be very useful to someone wondering if Ryzen will bottleneck their 1070 or not. Just have to decide for yourself if the data is useful to your situation or not. It appears it is not to you. That's fine. Doesn't make it entirely useless though.
 
Everyone ignores the minimum framerates. LOL, max is what matters, right? That split second when Kaby Lake hits 300fps and Ryzen only 250fps, while Ryzen sustains better minimums = doesn't matter. It's like a fucking game of Yahtzee with some here. Those are respectable results and I hope to see more benchmarks from the enthusiast communities, not just the media. By the way, a 1070 should chew the shit out of anything at 1080p because it's equal to a 980ti. Since when did we move the goal post to the 1070 being as fast as an RX 480? Only when Ryzen is tested.... Wow....


It's a good CPU and some people are just so upset by this. Baffles the shit out of me.
 
In the end it makes ZERO gaming difference and has no bearing on gaming, yet let's make a big deal out of it. So using a 1070 at 1080p is stupid :ROFLMAO:. So one should buy a 7700K because a 1080Ti (which most would not own) will drive games faster at 1080p, even though no monitor in existence would be able to show it, no human would be able to tell the difference, and the vast majority who game at 1080p use 60Hz monitors. In addition, you will suffer a 50% to 100% penalty in all the other heavy stuff you may do, such as video editing, streaming, serious work, etc. So Intel is the way to go and don't listen to the AMD fanboys. Right :facepalm:
 
Minimum frames, frametimes and averages to be honest. A maximum should be excluded.

Minimum framerates are an attempt to convey complex information simply, and are a failed attempt.

Frametime analysis explains this information, as well as average, median low, maximum, and so on.

If you want to distill performance statistics to be representative of a gameplay experience, you need to use frametimes.
 
To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.

I'd have liked to see the 3600 for the 7700K as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.

From the video, his point was that with the extra latency settings needed to run 3600MHz, there was no real improvement for the 7700K.

Ryzen isn't benefiting so much from the faster memory itself as from the faster interconnect between the CCXs, so the latency impact isn't as big.

But as suggested before, you can just ignore the Ryzen 3600 numbers and compare the 3200 to 3200 numbers.

Things are improving. I have no concern about getting a Ryzen. Probably 6C/12T for the price of a 4C/4T Intel.
 
Even if you ignore the 7700K entirely in his test, he's showing frame rate improvements with faster memory, especially in the minimums, sometimes VERY significant ones. That's something for people looking to build future Ryzen rigs to take to heart when picking memory and motherboards.
 
Currently, though, to get to 3600 with Ryzen you need to OC the base clock. You may lose PCIe 3.0 in the process, as in on SSDs or graphics cards. Now, I have not heard anyone complaining about that, so maybe it's not as big an issue. Anyway, using the 3200 setting, BCLK would need to go from 100MHz to about 113MHz.
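For reference, a quick back-of-the-envelope calculation (assuming the memory multiplier is simply left at the 3200 setting) lines up with that figure:

$$\mathrm{BCLK} \approx 100\ \mathrm{MHz} \times \frac{3600}{3200} = 112.5\ \mathrm{MHz} \approx 113\ \mathrm{MHz}$$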
 
Several douchebags who shall remain anonymous are totally missing the point: Ryzen is held back by the CCX and memory speed, and boosting memory seems to result in very substantial gains at lower gaming resolutions, close to where we'd expect Ryzen to be judging from its performance in non-gaming apps.

Really, why do you give a shit what someone else believes re: Ryzen vs 7700? If the comparison to the 7700 bugs you so much, ignore it. Surely your time can be better spent on something semi-productive.
 
To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.

I'd have liked to see the 3600 for the 7700K as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.
Why would you want to go and mention the obvious like that? He was on a roll. Had me chuckling. ;)
 
Minimum framerates are an attempt to convey complex information simply, and are a failed attempt.

Frametime analysis explains this information, as well as average, median low, maximum, and so on.

If you want to distill performance statistics to be representative of a gameplay experience, you need to use frametimes.

The problem with minimum framerate is that one single hiccup in a 30-minute benchmark run could change the results completely. The average does a good job of smoothing out the results, so provided the benchmark run was long enough, the result should be fairly consistent and comparable. A histogram of the frametimes and/or a graph showing the framerate throughout the entire benchmark run paints the complete picture.
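As a rough sketch of that point (using a made-up frametime log, not data from the video), here is how a single hiccup completely determines the minimum while the average and a percentile-style "1% low" barely move:

```python
# Hypothetical frametime log in milliseconds: ~30 s at a steady 60 fps, with one 50 ms hiccup.
frametimes_ms = [16.7] * 1799 + [50.0]

fps = [1000.0 / ft for ft in frametimes_ms]

avg_fps = sum(fps) / len(fps)            # barely affected by the single spike
min_fps = min(fps)                       # entirely determined by the one hiccup
low_1pct = sorted(fps)[len(fps) // 100]  # "1% low"-style metric, far more robust

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps, 1% low: {low_1pct:.1f} fps")
```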
 
So let me get this straight,
You're saying that any benchmark we see with a 1070 or below at 1080p with Ryzen 7 is a total fabrication when compared to a 7700K?
So according to you, no one in the real world would pair a 7700K with anything lower than a 1080 at 1440p or higher?
I'm sure if he had the gear he would have tested it. Like someone already said, it's a data point to add with all the other so-called "farces".
Not the end-all-be-all comparison you seem to think everyone is capable of.

But as I mentioned or implied, one keeps a CPU much longer than a GPU, and we know each GPU generation is quite a bit more powerful than the last.
The 970 was faster than the 680.
The 1070 is faster than the 980ti up to 1440p without extreme OCing.

And so where does one think the Volta 2070 will fit against Pascal?
It will be at least as powerful as the 1080.

And so how long before anyone here buys the Volta '2070' to go with their CPU?
At that point, whether 14 months or two years from now, we are back to the differentiation we currently see in reviews that used the 1080/1080ti.
And then how long before replacing the CPU bought recently?

This is part of the consideration the editor at Eurogamer/Digital Foundry was mulling over: the longer-term effect, and the fact that one upgrades the GPU more frequently than the CPU and so will hit the CPU's limits in the future; how quickly depends upon the tier of card one upgrades to and when.
I appreciate it will not be applicable to everyone, but the 970/1070 range of cards is always popular and will be again under Volta.
Cheers
 
If someone benchmarked games on an FX-4000 against a 7700K using a GeForce 4, they'd both get about 10fps and you'd say it was a flawed review. You wouldn't say the CPUs perform the same. This is a flawed review for the exact same reason. There's no useful information to be gleaned here because there's too much error involved.


Holy false equivalency, Batman! That's a 1070 he's using at high settings, or a step down from max. The comparison you're making is so absurd it doesn't even enter the realm of reality; we're talking about an overclocked GTX 1070, not some old piece of garbage! He's not using max settings, which he clearly states. I really doubt they're GPU limited a step down from ultra at 1080p.
 
Holy false equivalency, Batman! That's a 1070 he's using at high settings, or a step down from max. The comparison you're making is so absurd it doesn't even enter the realm of reality; we're talking about an overclocked GTX 1070, not some old piece of garbage! He's not using max settings, which he clearly states. I really doubt they're GPU limited a step down from ultra at 1080p.



No use in explaining it, as the goalposts have been moved. This Mindblank guy's previous review showed Ryzen in a not-so-good situation a couple weeks ago; he retested and now it is better. No review is perfect, but the data is still there; he even gives frametime graphs, which most reviewers tend to omit. Once things get patched up on my end I will do my own testing and draw my own conclusions with my 1080ti. So far the experience has been good, and I game at 4K now, so I couldn't give a rat's ass about 1080p anymore.
 
But as I mentioned or implied, one keeps a CPU much longer than a GPU, and we know each GPU generation is quite a bit more powerful than the last.
The 970 was faster than the 680.
The 1070 is faster than the 980ti up to 1440p without extreme OCing.

And so where does one think the Volta 2070 will fit against Pascal?
It will be at least as powerful as the 1080.

And so how long before anyone here buys the Volta '2070' to go with their CPU?
At that point, whether 14 months or two years from now, we are back to the differentiation we currently see in reviews that used the 1080/1080ti.
And then how long before replacing the CPU bought recently?

This is part of the consideration the editor at Eurogamer/Digital Foundry was mulling over: the longer-term effect, and the fact that one upgrades the GPU more frequently than the CPU and so will hit the CPU's limits in the future; how quickly depends upon the tier of card one upgrades to and when.
I appreciate it will not be applicable to everyone, but the 970/1070 range of cards is always popular and will be again under Volta.
Cheers

Good point, but my point is not everyone can afford to match a $700+ GPU with a $300 CPU, or go out and buy UW and 4K monitors.

Some of the elitists on this forum would have you believe otherwise; not everyone does a complete rig overhaul every time they upgrade, myself included.
Many have a never-ending upgrade cycle, one component at a time. The last time I upgraded my CPU was in 2011; yeah, I bought a flagship 2600K to go with my 570GTX, which was a GPU I had paired with a C2Q 9550 before that. I got a 670GTX in 2012.

The point is people like me go for second-tier GPUs knowing they will be worthless in two years' time, or because it's the best they can afford.
Just because you can afford to buy a 7700K doesn't mean you can or should have to buy a Titan/1080/Ti to go with it.
Some of us just don't have the cash to throw at a monster GPU.

If that means revoking my [H]ard card, then so be it.
 
Good point, but my point is not everyone can afford to match a $700+ GPU with a $300 CPU, or go out and buy UW and 4K monitors.

Some of the elitists on this forum would have you believe otherwise; not everyone does a complete rig overhaul every time they upgrade, myself included.
Many have a never-ending upgrade cycle, one component at a time. The last time I upgraded my CPU was in 2011; yeah, I bought a flagship 2600K to go with my 570GTX, which was a GPU I had paired with a C2Q 9550 before that. I got a 670GTX in 2012.

The point is people like me go for second-tier GPUs knowing they will be worthless in two years' time, or because it's the best they can afford.
Just because you can afford to buy a 7700K doesn't mean you can or should have to buy a Titan/1080/Ti to go with it.
Some of us just don't have the cash to throw at a monster GPU.

If that means revoking my [H]ard card, then so be it.
Yeah, the situation depends upon the person: some upgrade the CPU and GPU rarely, while others go to the other extreme and upgrade the GPU every generation, even when it is the very expensive enthusiast tier of cards.

The ones I think my post would apply to most are those that upgrade along the 970/1070/Volta '2070' line; these cards sit at a price/performance threshold that makes them one of the most popular models and tiers.
Even if they do not buy at launch, it could mean those currently on a 1070 may buy a Volta '2070' in two years' time but intend to keep the CPU for another three years beyond that.
And the ones likely to keep the CPU longer are those with more than 4 cores; although plenty of Intel gamers are still out there with CPUs older than Haswell, there is generally less inclination to upgrade if you are on an 8C/16T CPU, although AMD is at a price point where it is bearable to do so.

There is not one answer/guideline that fits everyone on this subject, which I would say many of us agree on.
Cheers
 
Holy false equivalency, Batman! That's a 1070 he's using at high settings, or a step down from max. The comparison you're making is so absurd it doesn't even enter the realm of reality; we're talking about an overclocked GTX 1070, not some old piece of garbage! He's not using max settings, which he clearly states. I really doubt they're GPU limited a step down from ultra at 1080p.

But as some in this thread have already said, the GTX1070 shows "better performance" with Ryzen relative to the 7700K, and it's a different situation with a GTX1080.

To use a GTX1070 you would really need to lower the settings much more than just one notch, but the problem with doing so is that it would also relieve some of the stress on the CPU, since some options carry a heavy CPU workload.
While Hardware Unboxed generally went with the highest options, they showed that even then the difference on a 7700K between a GTX1070 and a GTX1080 could range from 10% in Civ 6 to over 55% in Battlefield 1, and around 50% even in Gears of War 4 - showing that the GTX1070 really is bottlenecking games even for the 7700K at 1080p.
The settings with the hardware he is using are skewing the results (especially as he is no longer gaining any marginal fps with higher-clocked memory), which ties into the perspective of some here who have seen similar results in other GTX1070 Ryzen YouTube tests compared to the GTX1080.

Either reduce the settings much further or test at 720p for this context. Yeah, I appreciate that from a gaming perspective this is silly, but if the aim is to see the limits of the CPUs, as he was trying to (because he went on to OC and also use extreme memory speeds), then he really needs a GTX1080/1080ti or a lower resolution.

There are quite a few nuances/debates and contexts regarding this YouTuber's use of the GTX1070; the two extremes are that the GTX1070 is closer to current realistic ownership (a more diverse context would also consider the long term over 5 years, but that is also nuanced), while a GTX1080, and ideally a GTX1080ti, removes all GPU bottlenecks to show the absolute performance of the CPUs and the difference between them.
To me both need to be considered equally, but their contexts are very different; however, I do think the YouTuber was aiming not at current realistic ownership but at absolute performance.
Cheers
 