German Site Jumps the Gun on Ryzen 3000 Benchmarks

If I were writing an article, I'd probably present it something like this, maybe with different GPU and resolution examples shown when you hover the mouse over it: one mainstream GPU at 1080p, a high-end GPU at 1440p as in my example, and an ultra high-end GPU at 4K.

I think it tells the story quite nicely, without losing the distinction between CPUs above the point where they would be GPU limited.

View attachment 172361

Damn, it also illustrates that Far Cry 5 is an unusually CPU-intensive game. Probably not the right example for this method, but it was the first chart on the linked page.
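To make the idea concrete, here's a rough Python sketch of how the overlay would work. Everything below is a made-up placeholder, not data from the linked charts; the GPU ceilings are just illustrative numbers.

```python
# Cap each CPU's uncapped result at the point where a given GPU/resolution
# combo becomes the bottleneck. All numbers are placeholders.

cpu_fps = {              # CPU-limited average FPS, e.g. measured at 720p/low
    "CPU A": 165,
    "CPU B": 150,
    "CPU C": 120,
    "CPU D": 95,
}

gpu_ceilings = {         # hypothetical framerate ceilings for the three tiers
    "mainstream GPU @ 1080p": 120,
    "high-end GPU @ 1440p": 140,
    "ultra high-end GPU @ 4K": 100,
}

for tier, ceiling in gpu_ceilings.items():
    print(f"\n{tier} (GPU ceiling ~{ceiling} FPS)")
    for cpu, fps in cpu_fps.items():
        effective = min(fps, ceiling)   # what you'd actually see in-game
        note = " (GPU limited)" if fps > ceiling else ""
        print(f"  {cpu}: {effective} FPS{note}")
```

The chart version would just draw the uncapped bars and overlay a line per GPU tier; any CPU whose bar crosses a line is GPU limited for that tier.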

Why would you ever introduce GPU limitations into a CPU test? The whole point is to eliminate bottlenecks in order to show off what the CPU can do unrestrained. You don't do that by introducing a GPU bottleneck. Also, you want as few variables changing as possible when running benchmarks.

All of Ubi's modern open-world games hit the CPU pretty hard. Games like FC5, AC: Odyssey, and The Division 2 might be ones to keep an eye on in reviews. I can see an argument to be made for running tests at multiple resolutions and targeting different framerates, but above all else, creating non-CPU bottlenecks should be avoided as much as possible.
 
Well, the VII is definitely going to be much better than the 5700 XT, so that'd be a definite downgrade.

Not really. The 5700 XT is estimated to be 14% faster than Vega 64, which puts it roughly halfway between Vega 64 and the VII, likely a little more. It won't be that far behind the VII, and it's excellent value for the price if you are looking to game only.
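Rough back-of-the-envelope math on that claim; the only number from the post is the +14% estimate, and the Radeon VII figure below is an assumed placeholder, not a measured result:

```python
# Relative performance with Vega 64 = 1.00. Only the +14% 5700 XT estimate
# comes from the post; the Radeon VII figure is an assumption for illustration.
vega_64    = 1.00
radeon_vii = 1.28                # assumed: roughly 28% ahead of Vega 64
rx_5700xt  = vega_64 * 1.14      # estimated 14% faster than Vega 64

midpoint = (vega_64 + radeon_vii) / 2
print(f"5700 XT ~{rx_5700xt:.2f}x Vega 64, Vega 64/VII midpoint ~{midpoint:.2f}x")
# With those numbers the 5700 XT lands right at the midpoint, i.e. roughly
# halfway between Vega 64 and the Radeon VII.
```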
 
Why would you ever introduce GPU limitations into a CPU test? The whole point is to eliminate bottlenecks in order to show off what the CPU can do unrestrained. You don't do that by introducing a GPU bottleneck. Also, you want as few variables changing as possible when running benchmarks.

All of Ubi's modern open-world games hit the CPU pretty hard. Games like FC5, AC: Odyssey, and The Division 2 might be ones to keep an eye on in reviews. I can see an argument to be made for running tests at multiple resolutions and targeting different framerates, but above all else, creating non-CPU bottlenecks should be avoided as much as possible.

I agree with eliminating the GPU bottlenecks in testing.

I'm just thinking that it may also be of value to show the point above which it no longer matters for different typical applications.

It's a middle ground that can make everyone happy.
 
I agree with eliminating the GPU bottlenecks in testing.

I'm just thinking that it may also be of value to show the point above which it no longer matters for different typical applications.

It's a middle ground that can make everyone happy.

What you suggest is more in line with full-system testing. Those tests have value, but they're not all that useful in pure CPU testing. I do love seeing people do full-system tests like that, though, since they're a great way of showing how to create a properly balanced build and not spend too much in one area where it won't make any difference.
 
What you suggest is more in line with full-system testing. Those tests have value, but they're not all that useful in pure CPU testing. I do love seeing people do full-system tests like that, though, since they're a great way of showing how to create a properly balanced build and not spend too much in one area where it won't make any difference.

Well, roughly 100% of all people looking to buy a CPU are looking to put it in a system.

Can't hurt to provide them that extra info as guidance, as long as it doesn't hurt the availability of proper subsystem testing. That absolutely has to be there.
 
Well, roughly 100% of all people looking to buy a CPU are looking to put it in a system.

Can't hurt to provide them that extra info as guidance, as long as it doesn't hurt the availability of proper subsystem testing. That absolutely has to be there.

Something like that is better left to its own article, video, whatever. Especially around launch day, and very especially around launch days as busy as this one.
 
Something like that is better left to its own article, video, whatever. Especially around launch day, and very especially around launch days as busy as this one.
Yeah, so just look at a synthetic benchmark and call it a day. I mean, it's great for you that all you care about is some random number that has little relevance to reality, but the rest of us would like some context.
 
Yeah, so just look at a synthetic benchmark and call it a day. I mean, it's great for you that all you care about is some random number that has little relevance to reality, but the rest of us would like some context.

So you want reviewers to rush to toss in even more in-depth content on an already super limited timeline? Exactly how do you expect that will work out?
 
Does anyone have any idea what time these puppies go on sale? Midnight? 6 am? 9 am (EST)? I am on PST, and AMD launched the VII at 9 am EST... A lot of folks will be in church tomorrow at that time, though.
 
Does anyone have any idea what time these puppies go on sale? Midnight? 6 am? 9 am (EST)? I am on PST, and AMD launched the VII at 9 am EST... A lot of folks will be in church tomorrow at that time, though.

Just keep refreshing the page, but I think you're on to something.
 
Just keep refreshing the page, but I think you're on to something.


I'm trying to find out so I can force myself to stay up until midnight (that's 3 am if they launch on EST again) or whether I need to go to bed early and get up at 6 am on Sunday.


Care to offer an opinion?
 
Just realized 7/7 at 7 am might be the plan... which is 4 am here. Doh!

I know how scarce stock can be, and I have decided to be cheap and get the 3700, which I know is at the top of a lot of lists.
 
Wow, that's insane performance if it's true. Odd that they used 2666 MHz memory. But one thing: the 8700K overclocks very easily. I wonder what the 3600 can do.
 
I'm trying to find out so I can force myself to stay up until midnight (that's 3 am if they launch on EST again) or whether I need to go to bed early and get up at 6 am on Sunday.


Care to offer an opinion?

I'd suggest going to bed and waking up early on Sunday.
 
I'm trying to find out so I can force myself to stay up until midnight (that's 3 am if they launch on EST again) or whether I need to go to bed early and get up at 6 am on Sunday.

No need to stay up. The reviews will still be there tomorrow.

Don't be like those idiots who line up in front of Apple stores on launch day, just to get the latest dumb version of some phone.

There is no value in being first.
 
No need to stay up. The reviews will still be there tomorrow.

Don't be like those idiots who line up in front of Apple stores on launch day, just to get the latest dumb version of some phone.

There is no value in being first.

The only thing you get for being first is having to deal with all the bugs. Not just AMD; it seems like any new hardware product is like that nowadays.
 
No need to stay up. The reviews will still be there tomorrow.

Don't be like those idiots who line up in front of Apple stores on launch day, just to get the latest dumb version of some phone.

There is no value in being first.


Actually, there is value to me. I'd like to get one of the 50th Anniversary Edition cards just to have, but more importantly, I need a CPU so I can get my loop reassembled.

I don't have much time except for the beginning of the week. When you have chronic health issues, you need to work when you can. Getting it as soon as I can helps me with that.
 
Wow, that's insane performance if it's true. Odd that they used 2666 MHz memory. But one thing: the 8700K overclocks very easily. I wonder what the 3600 can do.

Yeah, I'm right there with you on that observation... I'd have at least used the recommended 3200 MHz RAM on the AMD build, since that's the spec stated on the AMD page.

I'm also scratching my head regarding the use of a B450 chipset-based mobo compared to the Z390 on the Intel offering.

I admit I'm not as up to date on the Intel side of the house anymore, but isn't that their top-tier enthusiast chipset for Coffee Lake?

If I were doing a comparison, I'd pit:

X570 against the Z390
X470 against the Z370
B450 against the H370

(And, as I said earlier, I may be wrong about the performance tiers on the Intel side)

All in all, it was an impressive performance from the 3600. It spotted Intel a 100 MHz advantage and slapped the 8700K around on most of the benches.
 
Yeah, I'm right there with you on that observation... I'd have at least used the recommended 3200 MHz RAM on the AMD build, since that's the spec stated on the AMD page.

I'm also scratching my head regarding the use of a B450 chipset-based mobo compared to the Z390 on the Intel offering.

I admit I'm not as up to date on the Intel side of the house anymore, but isn't that their top-tier enthusiast chipset for Coffee Lake?

If I were doing a comparison, I'd pit:

X570 against the Z390
X470 against the Z370
B450 against the H370

(And, as I said earlier, I may be wrong about the performance tiers on the Intel side)

All in all, it was an impressive performance from the 3600. It spotted Intel a 100 MHz advantage and slapped the 8700K around on most of the benches.

B450 boards are better than most H370s. Not only can B450 still overclock, but some B450 boards have solid VRMs and VRM cooling. As far as performance goes, there isn't a huge difference between B450 and X470 when just doing standard air cooling.

Edit: The Gigabyte board he is using doesn't have a great VRM setup, but for a 65 W CPU it's totally fine.
 
Moved this to the news thread if you don't mind.



It is on purpose. This is always how CPU benchmarks are done.

In every title I've ever tested, at a fixed framerate, changing resolution/graphics settings has no impact on CPU load. Because of this it makes sense to run it at the absolute minimum resolution/graphics settings so that your benchmark figures are not GPU limited.

It is a CPU review, not a full system review, so this way you can see what the CPU is capable of independent of the GPU.

You, of course, have to read these charts with a little intelligence, and realize that depending on your GPU, chances are you'll never see those framerates in real-world gameplay due to being GPU limited, and that this means many of the top-performing CPUs will perform identically in normal use.

It involves a risk that people misinterpret the results, but I don't see how else they could do it and still get to the actual performance of the CPU.

Indeed, I do not benchmark GPUs. That's Brent's job. :) The goal is to isolate the CPU, and to do that you want to use a resolution of 1920x1080 or lower and put your graphics settings for a given game in potato mode. You don't want your GPU working too hard. That said, you don't want to use some utterly pathetic and sad GPU that would struggle to do even that much.
 
AMD RYZEN 7 3700X 8-Core 3.6 GHz (4.4 GHz Max Boost) Socket AM4 65W 100-100000071BOX Desktop Processor is $329.
https://www.newegg.com/amd-ryzen-7-...tion=3700X&cm_re=3700X-_-19-113-567-_-Product

AMD RYZEN 5 3600X 6-Core 3.8 GHz (4.4 GHz Max Boost) Socket AM4 95W 100-100000022BOX Desktop Processor is $249.
https://www.newegg.com/amd-ryzen-5-3600x/p/N82E16819113568

This is going to be a fun, sleepless night!

And only 463 more sleepless nights until I can pick up a 3800X for $200 in the FS/T section. So tempted to violate my cheapskate ethos and buy Zen 2 new. My X58 Forever vow is getting tested. [H]ard.
 
This is going to be a fun, sleepless night!

And only 463 more sleepless nights until I can pick up a 3800X for $200 in the FS/T section. So tempted to violate my cheapskate ethos and buy Zen 2 new. My X58 Forever vow is getting tested. [H]ard.

I have a three-and-a-half-year-old X99 setup and I'm chomping at the bit for an upgrade. I'd go crazy holding onto something as old as X58 in my gaming system long term.
 
I have a three-and-a-half-year-old X99 setup and I'm chomping at the bit for an upgrade. I'd go crazy holding onto something as old as X58 in my gaming system long term.

It's easy. Just game like it's 2009! :LOL:

Actually, my three X58 (Xeon 5675) setups are used for work. And little play.
 
What I know for certain is that the internet is going to be more toxic than Chernobyl today.
 
LOL, it's about being able to compare the relative performance of CPUs without other factors affecting the results. Not sure why you're not getting the point here. GPU-limited tests will make different CPUs look like they have the same performance.

It's nothing to do with gaming.

But they DO have the same performance! And it has everything to do with gaming because GAMES are being tested!
Omg, are people that thick here?
If a gamer buys a CPU to game at 4K and the 9900K shows as fast as a 6700K, then that's how fast it is in his games.

"Look! This CPU is 50% faster than yours! You absolutely need to upgrade!"
"Stop looking at the resolution! Don't look there! Just look at the numbers! The numbers never lie! The resolution you game at is irrelevant! Everyone tests CPUs at 720p because that's how it shows that you get a better CPU for MOEARRRR MONEAH"
"NOW PAY US YOU DAMN 4K GAMER! THE RESOLUTION YOU GAME AT IS IRRELEVANT!"

Why the hell do you think people kept their 2500K for a decade? Because they didn't need to upgrade. 720p CPU tests in gaming are irrelevant and have only marketing behind them, because the gamer community is probably the largest market for new mainstream CPUs.
Testing CPUs at 720p in gaming is misleading people.
 
Newegg has a bunch of X570 stuff; prices are high, as has been feared :(

"Feared" All I see are boards that are worth the cost and there is quite a large range of boards. I think you are confusing AMD with Intel, because you can buy the X470 boards to use with the new CPU's. Unlike Intel, where the new boards must be used. They are all worth the cost of admission but I will admit, $299 is probably the most I would spend.
 
What I know for certain is that the internet is going to be more toxic than Chernobyl today.

I already see, especially in gaming communities, that people have jumped fully aboard the AMD hype train, but the real nerds in here know the Ryzen architecture is less suited to gaming than to the productivity work it excels at. I expect Intel's 9000-series competitors will still be some 3~5% faster on average at 1080p in 70~80% of titles. The gamers think it will wipe the floor, like all the CPUs will easily overclock to 4.7 GHz on air and stuff...

I will especially look forward to seeing the responses from that community.

Not to say it won't be a great CPU; especially in multitasking there won't be any comparison. But yeah, I think the casual gamers who aren't computer hardware nerds will be slightly disappointed.
 
I already see, especially in gaming communities, that people have jumped fully aboard the AMD hype train, but the real nerds in here know the Ryzen architecture is less suited to gaming than to the productivity work it excels at. I expect Intel's 9000-series competitors will still be some 3~5% faster on average at 1080p in 70~80% of titles. The gamers think it will wipe the floor, like all the CPUs will easily overclock to 4.7 GHz on air and stuff...

I will especially look forward to seeing the responses from that community.

Not to say it won't be a great CPU; especially in multitasking there won't be any comparison. But yeah, I think the casual gamers who aren't computer hardware nerds will be slightly disappointed.


You can never make everyone happy, and the PC hardware market is about as brand-loyal as it gets, so it will be the same old to and fro.
 