You need a chilled loop to get the Ryzen IMC to do 4000MHz.
Thx for bursting that bubble and pointing out that pesky IMC.
You need a chilled loop to get the Ryzen IMC to do 4000MHz.
What I gained from the video: taking the 7700k out of the picture and comparing Ryzen at 2933MHz (because most people still can't hit 3200MHz) vs 3600MHz, the faster memory made a world of difference in overall FPS, IMHO.
Just imagine what Ryzen could do at 4000MHz.
Speculation is that the Ryzen DDR4 IMC is the same one that's in Carrizo. I'm thinking I might test that theory out to the best of my ability, seeing as my laptop has a Carrizo and 8GB. I can test this because it's a stupid design by HP that lacks dual channel, so if I remove one stick from my Ryzen and match the memory speed and timings, it'll be a close comparison (I can downcore as well to make it even more comparable). We'll see after I get Win10 stable heh
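If anyone wants to sanity-check a comparison like that, a crude way is to measure raw memory copy bandwidth on both machines once the stick count, speed, and timings are matched. This is just a rough sketch using numpy; the buffer size and iteration count are arbitrary choices, not anything from the post:

```python
# Rough single-threaded memory copy bandwidth probe (numpy).
# Run the same script on both boxes (Ryzen with one stick pulled vs. the
# Carrizo laptop) after matching memory speed and timings in the BIOS.
import time
import numpy as np

def copy_bandwidth_gbs(size_mb: int = 512, iterations: int = 20) -> float:
    """Time large array copies and return the best observed GB/s."""
    src = np.random.rand(size_mb * 1024 * 1024 // 8)  # float64 buffer, ~size_mb MB
    dst = np.empty_like(src)
    best = 0.0
    for _ in range(iterations):
        start = time.perf_counter()
        np.copyto(dst, src)
        elapsed = time.perf_counter() - start
        gbs = 2 * src.nbytes / elapsed / 1e9  # a copy reads and writes the buffer once each
        best = max(best, gbs)
    return best

if __name__ == "__main__":
    print(f"Best copy bandwidth: {copy_bandwidth_gbs():.1f} GB/s")
```

It won't isolate the IMC from cache and core differences, but if the two chips really do share an IMC, the matched-config numbers should land in the same ballpark.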
Thanks a ton for the offer!! I'll definitely let you know. With my desktop being down since the beginning of May (due to anticipating Ryzen lol), this month's usage was way down, which, paired with the rollover that Verizon finally implemented last year, means I have 25GB left to burn and ~14 days to do it in. As such, before bed last night I grabbed the Win10 KB4016635 cumulative update (1.08GB). That'll bring the build from... I think RTM is build 10240 (ver 1507), up to something like 14983. I don't know what version it'll make it, though, but we'll see. I don't think I have Asperger's, but when it comes to my OS I am a creature of habit, and I wasn't looking forward to Win10 on this laptop I bought last year. However, I've gotten used to it and gotten it to where I'm mostly happy with how it looks and feels. That being said, it's ver 1511 (build 10586) and I like a few aspects it has which were immediately obvious with that original one I installed on the desktop.

I'll send you a USB stick or DVD with Windows 10 on it (legit version from MS, no hacks, but you'll need your own key). Shoot me a PM.
I agree. With how Topweasel helped me out so I could finally get a motherboard, and Cyber's offer... It restores some faith in humanity!

That's really solid of you, cybereality. I was going to offer to do the same with a DVD. I'm glad the [H] community can be so outstanding.
I am really shocked to see it beating a 5GHz i7-7700k in any gaming benchmarks.
Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?
I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...
I know it was his reasoning and not yours or anyone here. I wasn't trying to imply that you or anyone here has anything to do with his benches. I'm just simply stating the fact that his method is flawed and biased.
While I know Kaby doesn't falter, it also would have seen higher performance if he had benched using a 3600 memory clock, same as Ryzen. So for all intents and purposes the gap stays the same, with Kaby leading Ryzen in gaming.
The only information to be gleaned from his flawed and biased benches is Ryzen's performance increase between DDR4 speeds. His comparison to Kaby is utterly useless because of the biased method employed by the reviewer.
In the article I previously linked, the only game they benched that didn't enjoy fairly good gains was The Division, and even that game saw a minuscule bump (within margin of error, however).
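For what it's worth, that "performance increase between DDR4 speeds" is easy to quantify yourself if you read the average FPS per RAM speed off the charts. A quick sketch of the math with made-up numbers (not figures from the video or the article):

```python
# Percent scaling between RAM speeds from average FPS readings.
# The FPS values below are placeholders, not real benchmark results.
avg_fps = {2133: 88.0, 2933: 99.0, 3200: 106.0, 3600: 108.0}

speeds = sorted(avg_fps)
for prev, curr in zip(speeds, speeds[1:]):
    gain = (avg_fps[curr] / avg_fps[prev] - 1) * 100
    print(f"{prev} -> {curr}: {gain:+.1f}% average FPS")
```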
To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.
I'd have liked to see the 3600 for the 7700k as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.
So if memory parity is what you want, just ignore the Ryzen 3600 results and compare the i7's 3200 numbers with the Ryzen 3200 bars. Is that impossible for you?
Why would you trust any numbers from an obviously biased review? Sneaky or not, biased is biased. Just because he gives some bullshit excuse? Previous tests have shown his excuse is bullshit, so why trust anything shown in that video?
See above.
I'm surprised that so many members here would blindly trust biased reviews from some unknown youtuber... Did anyone even check this kid's channel? Three quarters of his videos are playthroughs. And then he turns into a hardware reviewer?...
If I can be 100% honest (not that I haven't been this entire time....), Ryzen being what it is kinda takes me back to the good old days of overclocking. Even the 7700K sucking a bit at it is a breath of fresh air to me. If you want an epic overclock, or more speed than the next fool, you .... *gasp* ... actually have to work at it!! (That's not a jab at you or anyone here, just my opinion in general.)
We've just gotten too complacent with these cookie-cutter overclocks. It's so bad that people have been factoring in the damn Boosts into whether a chip does 'good' or not. Flogging Ryzen for being a "bad overclocker" because an 1800X has a Boost of 4GHz... Yet we seem to have ignored the fact that it's 4GHz on one, maybe two cores if you're lucky. And the all-core speed is quite a bit lower. That goes for Intel, too.
But I digress, this all just means we're not going to mash DEL right after the system is built and dial in the same settings as the rest. We're coming back to the good ole days of fiddle... test.... fiddle some more... test some more...
Sadly I can't toss my hat into the OC ring quite yet, as I've come to find out the RTM of Win 10 (the only ver I have access to currently, due to my limited monthly bandwidth) has a real bug that ends up hard-locking or BSoDing (CLOCK_WATCHDOG_TIMEOUT), which means I'd not be able to tell if a crash was OC related or general derpiness related.
Have to agree.
Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?
I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...
Obviously biased towards Intel, since it had a full 1000MHz advantage... My point is I didn't expect Ryzen to beat a Haswell clock for clock in any games. Now we are seeing it trading blows with Kaby while at a 1000MHz deficit...
Wait, are you seriously surprised to see Ryzen beat an i7 7700K in a biased and flawed review?
I mean, we can probably get Piledriver to beat an i7 7700K with enough bias. So why be surprised to see Ryzen beat it in a biased review...
If someone benchmarked games on a FX-4000 against a 7700K using a GeForce 4, they'd both get about 10fps and you'd say it was a flawed review. You wouldn't say the CPUs perform the same. This is a flawed review for the exact same reason. There's no useful information to be gleaned here because there's too much error involved.

"Biased" aka "I don't like the results"
He tested with a 1070 at 1080p.
Not only does it artificially limit the 7700K's performance ceiling (since it is the faster CPU in games), it also restricts the amount of improvement we can see from faster RAM on the Ryzen chips. Any amount of time his 1070 spent at full capacity ruined the CPU results.
Why do you guys continually expose yourselves to benchmarks that you know are fake, or at best misleading? Neither chip is being accurately represented, and even the metric he set out to test (RAM performance) isn't being shown properly. The whole test was a farce.
This is AMD fanboy clickbait, like all the other nonsense YouTube benchmarks floating around in recent weeks. The number of people defending this garbage over their confirmation biases is really disappointing; you guys know better than this. Don't make clowns of yourselves because you need a cheap and easy way to white knight for AMD.
I put away my tablet and climbed out of bed to edit this post. Why you gotta make me do that?
I'm trying to decide between Kaby/Skylake-X, the 7700K, and the 1600X, and videos like this are a waste of my time. I feel bad for everyone else making purchase decisions based on bad info like this. I do not care how Ryzen or Intel perform under GPU-bottlenecked scenarios... It's not helpful.
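To put the GPU-bound complaint above in concrete terms: a frame can't finish any faster than the slower of the CPU and the GPU, so whenever the 1070 is the limiter, the two CPUs' numbers converge. A toy model with invented figures, purely to show the effect:

```python
# Toy model: delivered FPS is capped by whichever of CPU or GPU is slower.
# The CPU-limited and GPU-limited FPS values are invented for illustration.
cpus = {"faster CPU": 160.0, "slower CPU": 130.0}   # hypothetical CPU-limited FPS

for gpu_fps in (240.0, 120.0):                      # plenty of GPU headroom vs. a 120 FPS GPU cap
    print(f"GPU limit {gpu_fps:.0f} FPS:")
    for name, cpu_fps in cpus.items():
        print(f"  {name}: {min(cpu_fps, gpu_fps):.0f} FPS delivered")
```

With GPU headroom the ~23% CPU gap shows up in full; once the GPU caps out at 120 FPS, both CPUs post identical numbers, which is exactly how a 1070 at 1080p can hide the difference between them.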
Everyone ignores the minimum framerates.
You really should. Frametimes are what matter.
Minimum frames, frametimes and averages to be honest. A maximum should be excluded.
To be fair, you can compare the 3200 numbers for both CPUs and Ryzen still comes out with some wins here. The 3200 for Ryzen seems to be leagues better than slower RAM, but the gains from 3200 to 3600 were pretty minimal.
I'd have liked to see the 3600 for the 7700k as well, but he at least pointed it out and wasn't trying to be sneaky about it or anything.
Why would you want to go and mention the obvious like that. He was on a roll. Had me chuckling.
Minimum framerates are an attempt to convey complex information simply, and are a failed attempt.
Frametime analysis explains this information, as well as average, median low, maximum, and so on.
If you want to distill performance statistics to be representative of a gameplay experience, you need to use frametimes.
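If anyone wants to run that kind of analysis themselves, the usual route is to log per-frame times (FRAPS/OCAT style) and report average FPS plus percentile lows instead of a single minimum. A minimal sketch, assuming you already have a plain list of frametimes in milliseconds:

```python
# Summarize a frametime log (milliseconds per frame): average FPS plus
# 1% / 0.1% lows instead of a single worst-case minimum.
import numpy as np

def summarize(frametimes_ms):
    ft = np.asarray(frametimes_ms, dtype=float)
    fps_per_frame = 1000.0 / ft
    return {
        "avg_fps": 1000.0 * len(ft) / ft.sum(),
        "1%_low_fps": float(np.percentile(fps_per_frame, 1)),
        "0.1%_low_fps": float(np.percentile(fps_per_frame, 0.1)),
        "worst_frame_ms": float(ft.max()),
    }

if __name__ == "__main__":
    # Fake log: mostly ~8 ms frames with a few long stutters mixed in.
    log = [8.3] * 990 + [25.0] * 8 + [40.0] * 2
    for name, value in summarize(log).items():
        print(f"{name}: {value:.1f}")
```

Some outlets average the slowest 1% of frames instead of taking a straight percentile; either way it captures the stutter that a bare minimum or maximum throws away.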
So let me get this straight,
You're saying that any benchmark we see with a 1070 or below at 1080p with Ryzen 7 is a total fabrication when compared to a 7700k?
So according to you, no one in the real world would pair a 7700k with anything lower than a 1080 at 1440p or higher?
I'm sure if he had the gear he would have tested it. Like someone already said, it's a data point to add with all the other so-called "farces".
Not the end-all, be-all comparison you seem to think everyone is capable of.
If someone benchmarked games on a FX-4000 against a 7700K using a GeForce 4, they'd both get about 10fps and you'd say it was a flawed review. You wouldn't say the CPUs perform the same. This is a flawed review for the exact same reason. There's no useful information to be gleaned here because there's too much error involved.
Holy false equivalency, Batman! That's a 1070 he's using at high settings, or a step down from max. The comparison you're making is so absurd it doesn't even enter the realm of reality; we're talking about an overclocked GTX 1070, not some old piece of garbage! He's not using max settings, which he clearly states. I really doubt they're GPU limited a step down from ultra at 1080p.
But as I mentioned or implied, one keeps a CPU much longer than a GPU, and we know each GPU generation is quite a bit more powerful than the last.
970 was faster than the 680
1070 faster than 980ti up to 1440p resolution without extreme OCing.
And so where does one think the Volta 2070 will fit against Pascal?
It will be at least as powerful as the 1080.
And so how long before anyone here buys the Volta '2070' to go with their CPU?
At that point, whether 14 months or 2 years from now, we are back to seeing the current differentiation where reviews used the 1080/1080 Ti.
And then how long before one replaces the CPU bought recently?
This is part of the consideration the editor at Eurogamer/Digital Foundry was mulling over: the longer-term effect, and the fact that one upgrades the GPU more frequently than the CPU and so will hit the limits in the future; how quickly depends upon the tier of card one upgrades to and when.
I appreciate it will not be applicable to everyone, but the 970/1070 range of cards are always popular and will be again under Volta.
Cheers
Yeah, the situation depends upon the person; some upgrade CPU and GPU rarely, while others go the other extreme and upgrade the GPU every generation, even when it is the very expensive enthusiast tier of cards.

Good point, but my point is not everyone can afford to match a $700+ GPU with a $300 CPU, or go out and buy UW and 4K monitors.
Some of the elitists on this forum would have you believe otherwise; not everyone does a complete rig overhaul every time they upgrade, myself included.
Many have a never-ending upgrade cycle, one component at a time. The last time I upgraded my CPU was in 2011, when I bought a flagship 2600K to go with my 570GTX, which was a GPU I had paired with a C2Q9550 before that. I got a 670GTX in 2012.
The point is, people like me go for second-tier GPUs knowing they will be worthless in two years' time, or it's the best they can afford.
Just because you can afford to buy a 7700k doesn't mean you can, or should have to, buy a Titan/1080/Ti to go with it.
Some of us just don't have the cash to throw at a monster GPU.
If that means revoking my [H]ard card, then so be it.