German Site Jumps the Gun on Ryzen 3000 Benchmarks

Based on your recommendation, I'll be sure my next purchasing decision is based purely on PassMark.

If you're doing a lot of single-threaded applications, then yeah, buy the AMD CPU.

If you need multi-threaded horsepower, I believe AMD is coming out with 12-core and 16-core CPUs on its mainstream platform and a 64-core CPU on its HEDT platform.

AMD has got you covered, friend.
 
You forgot to add "now that AMD is on top."

I've never put any faith in PassMark, SYSmark, PCMark or any of those. I also pay zero attention to SiSoft Sandra, Fritz Chess, Geekbench, Blender, any of the SPEC tests (SPECint, SPECfp), or anything 3DMark (despite their historical demo modes being very entertaining at the time).

There are a lot of benchmarks out there. Most of them are irrelevant trash.
 
Moved this to the news thread if you don't mind.



It is on purpose. This is always how CPU benchmarks are done.

In every title I've ever tested, at a fixed framerate, changing resolution/graphics settings has no impact on CPU load. Because of this it makes sense to run it at the absolute minimum resolution/graphics settings so that your benchmark figures are not GPU limited.

It is a CPU review, not a full system review, so this way you can see what the CPU is capable of independent of the GPU.

You of course have to read these charts with a little intelligence: depending on your GPU, chances are you'll never see those framerates in real-world gameplay, and because of GPU limitations many of the top-performing CPUs will perform identically in normal use.

It involves a risk that people misinterpret the results, but I don't see how else they could do it and still get to the actual performance of the CPU.
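To put a toy model on it: per frame, the CPU and GPU work largely in parallel, so frame time is roughly whichever of the two takes longer. A rough sketch in Python (all timings are invented for illustration, not measurements):

[code]
# Toy frame-time model: frame time ~= max(CPU time, GPU time).
# CPU time per frame (simulation, draw calls) is roughly
# resolution-independent; GPU time grows with resolution.
# All numbers are made up for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0  # hypothetical CPU cost per frame

for res, gpu_ms in [("720p", 3.0), ("1080p", 7.0), ("1440p", 12.0), ("4K", 25.0)]:
    limit = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{res}: {fps(cpu_ms, gpu_ms):.0f} fps ({limit} limited)")

# Only the 720p number (200 fps here) reflects the CPU itself; every
# other result is really just measuring the GPU.
[/code]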

Yeah, but what's the point? Even if this IS the best way to test how a CPU performs in games, no one games at 720p, so it's pointless. Test 1080p, 1440p and 4K.
Even if you argue that there is a GPU bottleneck at higher resolutions, we don't care; no one games at 720p. Show us the real results in games so we can decide if it's worth buying for games over older 2-core CPUs. Testing games at 720p is completely pointless; no one games at 720p.
Did I mention that no one games at 720p?
 
Yeah, but what's the point? Even if this IS the best way to test how a CPU performs in games, no one games at 720p, so it's pointless. [...]

Yeah, what if newer games come out that are CPU-heavy at any resolution? You'll probably see the same performance gap that you see now in the 720p tests. Can't really see why it's so hard for some people to understand lol.

I mean, even [H]ardOCP runs CPU tests at super low resolutions, lower than 720p :ROFLMAO:
 
Yeah, but what's the point? Even if this IS the best way to test how a CPU performs in games, no one games at 720p, so it's pointless. [...]

Because it shows you the theoretical max with a future unreleased GPU, when the system is no longer GPU limited? Most people upgrade their GPUs more often than they do their CPUs these days.

The whole point of the review is to show the capability of the CPU. The review is completely pointless if you are just going to benchmark on a system that is bottlenecked by the GPU. You learn very little about the CPU that way.

It's the same reason reviewers tend to test GPUs on the absolute fastest CPU they can find: to make sure they are not CPU limited when testing the GPU.

This allows the user to take the number for any CPU and the number for any GPU and predict how the combination of those two components may perform: the lower of the two sets the ceiling.

As soon as you benchmark a component in a system where it is held back by other components, you are benching that system, not the individual component you claim to be testing, and that limits the usefulness of the data.

Nothing could be less relevant than pointing out that no one games at 720p. The resolution is not the point. Getting CPU data untainted by any other aspect of the system is the point. I'd test at 640x480 if I could.


Do you also suggest benchmarking with a 60 Hz monitor and vsync turned on? Because that's essentially what you are doing when you run a CPU benchmark at higher resolutions.
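To make the prediction rule above concrete, here's a rough sketch, assuming you have a CPU-isolated number (low-res test) and a GPU-isolated number (fastest-CPU test) for the same game; all figures are invented for illustration:

[code]
# Predict a CPU+GPU pairing from isolated benchmarks:
# expected fps ~= min(CPU-limited fps, GPU-limited fps).
# All figures are invented for illustration.

cpu_fps = {"CPU A": 180, "CPU B": 140}   # hypothetical 720p low-settings results
gpu_fps = {"GPU X": 90, "GPU Y": 160}    # hypothetical fastest-CPU results

for cpu, c in cpu_fps.items():
    for gpu, g in gpu_fps.items():
        bottleneck = "GPU" if g < c else "CPU"
        print(f"{cpu} + {gpu}: ~{min(c, g)} fps ({bottleneck} limited)")

# Note how CPU A and CPU B are indistinguishable on GPU X (both ~90 fps)
# but clearly separated on GPU Y, which is exactly the point above.
[/code]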
 
Yeah, but what's the point? Even if this IS the best way to test how a CPU performs in games, no one games at 720p, so it's pointless. [...]
This has got to be sarcasm, right? Removing the GPU as a bottleneck factor is the best way of showing which CPU performs best. That performance extrapolates to resolutions people do play at. And in games that do fully bottleneck, where all the CPUs show 85 fps (or whatever number), that makes for a pretty boring and uninformative graph that paints them all as equals.
 
Yeah, but what's the point? Even if this IS the best way to test how a CPU performs in games, no one games at 720p, so it's pointless. [...]

You do realize that doing a CPU test in a GPU limited scenario is literally worthless, right?
 
This has got to be sarcasm, right? Removing the GPU as a bottleneck factor is the best way of showing which CPU performs best. [...]

You do realize that doing a CPU test in a GPU limited scenario is literally worthless, right?


Let me play devil's advocate for a moment, as I (sort of) see where he is coming from. If you bench the system as a whole with a typical GPU at a typical resolution, more beginners might realize that they probably won't see any real-world difference between a midrange and a high-end CPU in real-world gaming, because they will be GPU limited anyway.

That said, it is up to the user/reader to know what the GPU they plan on using is capable of, and to realize that above a certain speed the CPUs are indistinguishable from each other due to GPU limitations. Sure, nothing beats having the exact configuration you are interested in tested; you can't get more accurate results than that. The problem is that there are literally millions of permutations and combinations of CPUs, GPUs and playable resolutions. If we spoonfeed users by benching with a typical GPU and gaming resolution, the review will only be relevant to people who have or want that exact combination, and will be useless to everyone else.

A good reviewer will explain this in the text, and maybe even include a few benchmarks at higher resolutions as well, to communicate this point, but these must come in addition to the low resolution un-bottlenecked CPU tests, not instead of them.

Yes, this will mean that the mouthbreathers who don't read and just glance at the pictures will draw the wrong conclusions and go on and on about how their Intel CPU that is 2% faster is the best ever, despite being GPU limited at exactly the same point as everyone else, but that can't be avoided. There will always be idiots.
 
I'll just wait for more accurate benchmarks when the chips finally come out. ;)
 
FYI folks, if I remember correctly from a few weeks ago, Windows 10 1903 doesn't seem to have all the Ryzen optimizations fully enabled; AMD is supposed to be putting out a new chipset driver any day now that kicks in a bunch of performance optimizations. Something along those lines. So I would guess that any leaked benchies are possibly going to be lower than they would be with those new chipset drivers. Of course I wouldn't expect miracle chipset drivers to magically increase performance a lot. [one can wish, right?]
 
Those results don't make much sense. If it is a single-thread benchmark, why would a 9600KF at 3.7 GHz score the same as a 9700F at 3.0 GHz? They are essentially the same cores, and the 9600KF has a 23% clock speed advantage.

I expect very good things from Ryzen 3000, but I'll wait for some quality reviews with explanations for any anomalies like this.

Because all three Intel 9900 CPUs boost to 5 GHz in single-threaded loads; that's the clock it's using.

Because that's the single-threaded benchmark.
 
FYI folks, if I remember correctly from a few weeks ago, Windows 10 1903 doesn't seem to have all the Ryzen optimizations fully enabled [...]
 
Those results don't make much sense. If it is a single-thread benchmark, why would a 9600KF at 3.7 GHz score the same as a 9700F at 3.0 GHz? [...]

9600KF single-core boost clock is 4.6 GHz, 9700F single-core boost is 4.8 GHz if I'm not mistaken. The 200 MHz difference in speed between the two should be negligible, to the point that other factors (such as the frequency of the memory used) could be just as important.
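Quick sanity check on the arithmetic, using the base clocks from the post above and these boost clocks:

[code]
# Base clocks suggest a big gap, boost clocks almost none.
# 9600KF: 3.7 GHz base / 4.6 GHz single-core boost
# 9700F:  3.0 GHz base / 4.8 GHz single-core boost

base_ratio = 3.7 / 3.0   # ~1.23 -> the "23% advantage" quoted above
boost_ratio = 4.8 / 4.6  # ~1.04 -> ~4% edge, for the 9700F this time

print(f"base-clock gap:  {(base_ratio - 1) * 100:.0f}%")   # 23%
print(f"boost-clock gap: {(boost_ratio - 1) * 100:.0f}%")  # 4%

# A single-threaded test runs both chips at their boost clocks,
# so near-identical scores are exactly what you'd expect.
[/code]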
 
9600KF single-core boost clock is 4.6 GHz, 9700F single-core boost is 4.8 GHz if I'm not mistaken. [...]

I see that now, but it would have been more useful to quote the actual clock speed in the single-thread test rather than the base clock of the processor.
 
You do realize that doing a CPU test in a GPU limited scenario is literally worthless, right?
That is true, but is a game workload the pinnacle of CPU testing to begin with? Now that APIs have evolved, what result do people find so reassuring about a 720p game benchmark?

I was under the impression that a CPU benchmark would tell you about the CPU's performance...
 
If you ask me, this is pretty much what I expected.

Pretty much what I expected. Trading blows depending on the title, trailing slightly in the games which are more highly dependent on a single fast thread, but in every case fast enough that you are probably going to be GPU limited before you are CPU limited at typical settings.

I haven't played Far Cry 5, but from my experience with other Far Cry games, the Dunia engine they use (a modified Crytek engine for open worlds) always pins the third core, so it is not surprising at all to me that it may be a worst-case title for Ryzen.

I for one am looking forward to getting a 3950x and overclocking the snot out of it.

Credit for finding the link goes to LstOfTheBrunnenG
100% agree, although my ambitions are a little more modest... probably a 3700X with a minor overclock. My expectations for all-core overclocking headroom are in the couple-of-hundred-MHz-at-best range, and with Precision Boost in play, I'm not sure it will make an appreciable difference. Looking forward to finding out ☺
 
Almost the same performance and lower power consumption - pretty impressive for a fraction of the R&D and a less 'mature' process.
But of course power consumption only matters when it's AMD, because 5,000MW for 2% faster in some games is so [H].

Either way, it's interesting to see that the minimums are very good even when it isn't at the top of a game benchmark. Wonder if more optimization can help there.
I guess the Jim Keller factor has something to do with it....
 
Because it shows you the theoretical max with a future unreleased GPU, when the system is no longer GPU limited? [...]

What modern game that looks half good is not GPU limited at 1440p and above?
Maybe a handful, but I doubt they are really not GPU limited at those resolutions; they're just able to use more cores.
Games from 2025 will not have the same requirements as today's. Games will always be GPU limited at high resolution.

If they are going to show the capability of the CPU, then test software benchmarks? They already do that. That's how they show the capabilities of the CPU.
Testing games at 720p is completely pointless. It doesn't show the capability of the CPU, because no one games at that resolution. It's irrelevant if the CPU is 200% faster at 720p, because no one will ever see the difference; no serious gamer plays at 720p.

There are basically no games that are not bottlenecked by the GPU at high resolution.
And if they are not dependent on the GPU at high resolution, the games are not demanding, so you could run them on a potato. So that still makes the test pointless.

My mind has not been changed.
Stop testing CPUs in gaming at 720p. It's stupid.
 
What modern game that looks half good is not GPU limited at 1440p and above? [...]

Doesn’t it show minimum frames per second more accurately or maybe even frame times?
 
What modern game that looks half good is not GPU limited at 1440p and above? [...]

LOL, it's about being able to compare the relative performance of CPUs without other factors affecting the results. Not sure why you're not getting the point here. GPU-limited tests will make different CPUs look like they have the same performance.

It's nothing to do with gaming.
 
The point is that it is better to show CPU-limited games like Crysis 3 running at 1080p than GPU-limited games like Wolf 2 at 720p when comparing CPUs. The former will do a better job of representing future 1080p games.
 
What modern game that looks half good is not GPU limited at 1440p and above? [...]


The goal is to test the CPU and the CPU only without any limitations from any other aspect of the system.

I'd even argue that 720p is too high a resolution to test at for this purpose. I'd use the absolute minimum setting possible in the game menu. What's that, 1024x768 these days?

It is absolutely crucial to test the CPU in isolation of other components so we learn about the CPU. There is little value in a total system benchmark like you propose as it is only relevant to people who have the exact system configuration under test.

If you have the isolated CPU data you can use that to project to your own configuration.

Otherwise all we are doing is the equivalent of benchmarking with vsync turned on.
 
With that, it's also extremely important to emphasize that most bottlenecks are in the GPU. While the CPU is important, most people gaming really need to see that the $100 between the 9900K and the 9700K could put you up a GPU bracket, and that would have far more effect on their performance than going from a 9700K to a 9900K.
 
With that, it's also extremely important to emphasize that most bottlenecks are in the GPU. [...]

Agreed. These things should be mentioned in a review (at least one targeted at beginners, as the rest of us should know this by now), and maybe even illustrated with a few benchmarks with common hardware/resolutions.

This should not and cannot replace subsystem isolation testing, which is what you try to get when testing at a very low resolution.
 
The point is that it is better to show CPU-limited games like Crysis 3 running at 1080p than GPU-limited games like Wolf 2 at 720p when comparing CPUs. [...]

Wolf 2 is GPU limited at low-res 720p settings? Reviewers shouldn't use well-optimized games because they don't represent the future? What kind of logic is that?
 
The part many people don't seem to get is that CPU benchmarks are not real world tests, and they are not intended to be real world tests. They are isolating and testing a single component.

Because of this we want to completely eliminate any other contributing factors from the results. This is of course impossible, but to get as close as we can, we need to set ALL graphics options to their minimum and the resolution to its lowest setting.

It doesn't matter if no one plays at 720p, or if 1080p shouldn't make you GPU limited anyway. In order to do a good job you need to completely eliminate the GPU as a factor at all, and in order to do so, minimum everything is the closest you'll get.

Again, this is not a real world system test, and it should not be read as a real world system test, but it is absolutely crucial, as the only other alternative is to test every single goddamned permutation and combination of CPU, GPU, resolution and settings, and no review ever is going to be able to do that even if they wanted to.
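Just to put a number on "every permutation", here's a back-of-envelope count with rough assumed figures (nobody's actual test plan):

[code]
# Back-of-envelope: full-matrix "real world" testing blows up fast.
# All counts are rough assumptions for illustration.
cpus, gpus, resolutions, presets, games = 15, 20, 4, 3, 10
runs = cpus * gpus * resolutions * presets * games
print(f"{runs:,} benchmark runs for one review")  # 36,000
[/code]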
 
With that, it's also extremely important to emphasize that most bottlenecks are in the GPU. [...]

If I were writing an article, I'd probably present it something like this, maybe with different GPU and resolution examples when you hover the mouse over it: one mainstream GPU at 1080p, a high-end GPU at 1440p as in my example, and an ultra-high-end GPU at 4K.

I think it tells the story quite nicely, without losing the detail of the distinction between CPUs above the point where they would be GPU limited.

[attached chart: Far Cry 5 CPU benchmark]


Damn, it also illustrates that Far Cry 5 is an unusually CPU-intensive game. Probably not the right example for this method, but it was the first chart on the linked page.
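For what it's worth, the overlay itself is trivial to compute once you have the isolated numbers: just cap each CPU's low-res result at the fps the chosen GPU manages at the chosen resolution. A rough sketch (all figures invented for illustration):

[code]
# Sketch: overlay a GPU limit on CPU-isolated results.
# Plotted value = min(CPU-limited fps, GPU cap at that resolution).
# All figures are invented for illustration.

cpu_results = {"CPU A": 190, "CPU B": 165, "CPU C": 150, "CPU D": 110}

gpu_caps = {
    ("mainstream GPU", "1080p"): 120,
    ("high-end GPU", "1440p"): 140,
    ("ultra-high-end GPU", "4K"): 95,
}

for (gpu, res), cap in gpu_caps.items():
    print(f"\n{gpu} @ {res} (GPU cap {cap} fps):")
    for cpu, fps in cpu_results.items():
        shown = min(fps, cap)
        note = " (GPU limited)" if fps > cap else ""
        print(f"  {cpu}: {shown} fps{note}")
[/code]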
 
I really wanna see some Kerbal Space Program CPU benchmarks for Ryzen 3000. It's CPU limited even at 4K, with one core used for your ships.
 