Why does the Ryzen 7 1800X perform so poorly in games?

http://www.phoronix.com/scan.php?page=article&item=nvidia-1080ti-ryzen&num=1

Hey folks: do you like 4k gaming benches? Welcome to Linux gaming, where even 4k on ultra settings is CPU limited!

Interesting results, but something is incredibly fishy with those Tomb Raider results. The 7700K hitting FPS in the 60s at 4K seems reasonable, but Ryzen hitting FPS in the 160s is just not possible at 4K on Ultra settings. As a matter of fact, it's hitting the same FPS numbers as it did with Normal settings; something definitely fishy going on there.
 
Interesting results, but something is incredibly fishy with those Tomb Raider results. The 7700K hitting FPS in the 60s at 4K seems reasonable, but Ryzen hitting FPS in the 160s is just not possible at 4K on Ultra settings. As a matter of fact, it's hitting the same FPS numbers as it did with Normal settings; something definitely fishy going on there.
Yeah, plus the ridiculous variation in results compared to the 7700K. Someone with Tomb Raider, Linux, and Ryzen could consider trying these benches themselves, though; it uses a run of the built-in benchmark.

P.S. It must be said, though, that this uses a kernel that may or may not properly support Ryzen.
 
Eh, there was more merit to the BD lawsuit. An 8-core chip does function like an 8-core chip in games, i.e. faster than a 6- or 4-core chip.



What is funny is that Ryzen has some serious issues in 3DMark's combined test, in spite of having decent graphics and physics results. At this point it looks like the fabric simply can't handle too much stuff that well, leading to all the issues being exacerbated by the Win10 scheduler not being CCX-aware (the Linux scheduler is).
Did any other sites test IOPS with an NVMe storage device and an ever-increasing queue depth?
The only one I have seen so far was Allyn at PCPer, using 1 thread, and the results, while not impacting normal consumer-retail real-world use (want to emphasise that point), showed that there may be unusual behaviour related to the fabric even here; in his test using 1 thread but with a high queue depth, the 6950X managed around 100k more IOPS.
Allyn also emphasises it is not a problem per se but, from a technical perspective, unexpected behaviour; if this happens also for Naples it could be a concern there, but there is not enough info to date to say whether it is a bug somewhere or a fabric limitation.
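
For anyone who wants to try reproducing that kind of test themselves, here is a rough sketch of a single-thread queue-depth sweep (assuming fio is installed and that /dev/nvme0n1 is a drive you can safely read from; the device path, runtime, and depth list are placeholders, not what Allyn actually ran):

```python
import json
import subprocess

# Sketch only: sweep the queue depth on one thread with fio and print the
# random-read IOPS at each depth. Adjust DEVICE for your own system.
DEVICE = "/dev/nvme0n1"

for depth in (1, 2, 4, 8, 16, 32, 64, 128, 256):
    result = subprocess.run(
        ["fio", "--name=qd-sweep", f"--filename={DEVICE}",
         "--rw=randread", "--bs=4k", "--ioengine=libaio", "--direct=1",
         "--numjobs=1", f"--iodepth={depth}",
         "--runtime=10", "--time_based", "--output-format=json"],
        capture_output=True, text=True, check=True)
    iops = json.loads(result.stdout)["jobs"][0]["read"]["iops"]
    print(f"QD {depth:>3}: {iops:,.0f} IOPS")
```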

Cheers
 
if this happens also for Naples it could be a concern there, but there is not enough info to date to say whether it is a bug somewhere or a fabric limitation.

Cheers

I think the only assumptions we can make about Naples at this very moment should be limited to the individual cores, as demonstrated in Ryzen. While some of the fundamental technologies will be the same, I'm not expecting it to behave the way Ryzen does, really.

Of course I could be wrong; I'm just not expecting it.
 
I am sort of surprised [H] hasn't taken a shot at this yet. I would like to see all games tested at 1080p and with memory at 3200 to see how much of a difference it makes with Ryzen. I wouldn't be surprised if they are just waiting a few months until it all settles and motherboard manufacturers have had some time to work on a few microcode updates and tweak the memory side of things. It seems like no one has any X370 motherboards in stock either, lol.
 
I don't even know what to make of this thread anymore.

Now there are claims we can't see the diff between 60 and 120 on a variable refresh monitor...

That's just a new breed of insanity. It's fine if you say you're okay with 60. Cool. Or that most people play at 60. Fine. But saying there's no difference is just nuts.

I truly have no idea what is going on.
 
I don't even know what to make of this thread anymore.

Now there are claims we can't see the diff between 60 and 120 on a variable refresh monitor...

That's just a new breed of insanity. It's fine if you say you're okay with 60. Cool. Or that most people play at 60. Fine. But saying there's no difference is just nuts.

I truly have no idea what is going on.


People can't accept the very fact that Ryzen is flawed. It's as simple as that. Imagine buying a car with that many issues. I see absolutely no reason to upgrade at this point. Did anyone see Intel drop prices? :)
 
People can't accept the very fact that Ryzen is flawed. It's as simple as that. Imagine buying a car with that many issues. I see absolutely no reason to upgrade at this point. Did anyone see Intel drop prices? :)
I have not seen Intel drop prices as of yet, nor do I predict they will.
 
People can't accept the very fact that Ryzen is flawed. It's as simple as that. Imagine buying a car with that many issues. I see absolutely no reason to upgrade at this point. Did anyone see Intel drop prices? :)

Just because it isn't as fast in games for now doesn't mean it's flawed, bro. It is what it is; if one doesn't like it, they have choices. That's why the market exists.
 
Imperfection is a flaw by definition, that's my point.

You know what he meant. And this is where I stop dragging this on, lol. By that definition everything is flawed, even Intel, since nothing is perfect.
 
Just because it isn't as fast in games for now doesn't mean it's flawed, bro. It is what it is; if one doesn't like it, they have choices. That's why the market exists.

Customers expect the product to work well, not to have flaws. Do you buy a car with flaws? If there's a recall, car companies have to suck it up and pay exorbitant costs to fix it. I'm sorry, but Ryzen isn't ready for prime time. Zen 2 should address all those issues, but Zen 1 is what it is.

I assume some of you remember the recall with Sandy Bridge and SATA. The boards were replaced at no cost. Reselling Zen would be like selling a used gun.
 
The brute-force option is to treat each CCX as a distinct processor to eliminate cache thrash.
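
As a rough sketch of what that would look like on Linux (assuming logical CPUs 0-3 and their SMT siblings 8-11 belong to CCX0 on an 1800X; the real numbering varies, so check lstopo or /sys/devices/system/cpu/cpu*/topology first, and the benchmark path is just a placeholder):

```python
import os
import subprocess

# Hypothetical CCX0 CPU set for an 8-core Ryzen: physical cores 0-3 plus
# their SMT siblings 8-11. Confirm the real mapping on your own machine.
CCX0_CPUS = {0, 1, 2, 3, 8, 9, 10, 11}

def run_on_one_ccx(cmd):
    """Launch a command with its affinity restricted to a single CCX.

    The mask is inherited by every thread the process spawns, so none of
    its threads ever bounce shared data across the fabric between CCXs
    (memory and I/O still cross the fabric, of course).
    """
    def pin():
        os.sched_setaffinity(0, CCX0_CPUS)  # 0 = the calling process
    return subprocess.run(cmd, preexec_fn=pin)

run_on_one_ccx(["./some_game_benchmark"])  # placeholder command
```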

On consoles there is a 190-cycle penalty between the two core clusters. And there is no magic fix for that in software either, despite having full control over a fixed platform with the ability to assign threads and lock them to certain cores. Most developers simply avoid the second core cluster or only assign something of very little importance to it.

That's why I am convinced it is not scheduling. Sure, it may account for like 5-10% of it, but that clearly does not cover the distance.

Within one CCX it works just fine. It all falls apart when it accesses either a different CCX (involves the fabric), memory (involves the fabric), or I/O (involves the fabric).

So you may understand my Naples comments now ;)
 
Gosh... I'm going to say an unpopular thing. But it needs to be said...

No one in this thread, no reviewer, and not even AMD knows what the real performance potential of Zen is. This is all preliminary BS.

In order to really know what the performance is, the following things need to happen:

1. Compiler optimizations for the Zen architecture. That hasn't happened yet (see the rough sketch after this list).

2. Applications need to be compiled with those optimizations.

3. New processor drivers.

4. Changes to thread scheduling.

5. Optimization of the cache usage.

6. Video card driver optimizations that take advantage of the new Zen architecture.
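
As a rough illustration of point 1 (an assumption-laden sketch, not anyone's actual build system: it just probes whether the local GCC already understands a Zen target and falls back if it doesn't):

```python
import subprocess

def zen_march_flag():
    """Pick a -march flag for building Zen-tuned binaries.

    Only newer GCC releases understand -march=znver1, so probe for it by
    compiling a trivial program and fall back to -march=native, then to
    a generic tuning flag. Sketch only; adjust for your own toolchain.
    """
    for flag in ("-march=znver1", "-march=native"):
        probe = subprocess.run(
            ["gcc", flag, "-x", "c", "-c", "-o", "/dev/null", "-"],
            input="int main(void) { return 0; }\n",
            capture_output=True, text=True)
        if probe.returncode == 0:
            return flag
    return "-mtune=generic"

print("Building with", zen_march_flag())
```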

If all of that is done, and done properly, maybe in a best-case scenario you could get a 15-20% performance increase if the developer was really invested in getting the most out of the processor. But that's unlikely, since software bloat is driven by higher-level languages that produce bloated, slow code.

So maybe the best-case scenario is a 10% increase in the next 12 months for software.

But what really bothers me about the discussion are a couple of ideas which are pretty incorrect:

1. The idea that frame rates above 60 fps are meaningful: they're not. It's a placebo effect. In fact, gaming is smoother when the frame rate does not vary. V-Sync is an absolute godsend for these purposes; I'd not enter a competitive situation without it. The variable-sync technologies are simply hyped technologies which get customers excited about a new (but meaningless) feature. In a sense, the entire gaming "industry" is driven by ideas pushed out by marketing departments rather than real improvements in image quality or lower latency.

2. Running at a full-out frame rate, where either the processor or the card is pushing 100%, is hard on your hardware. It also might increase latency in a way that lags the game, but you don't notice because the game appears "smooth" under variable sync.

That being said, my perspective is that of a gamer, but also a workstation user and systems engineer. If you are a person who only uses your rig for gaming, I would consider that a waste of money, honestly.

I have to call BS on this.

1. AMD uses Intel's compiler in their own drivers. And making a CPU that needs special compiler optimizations to be viable is quite foolish at best. Every single other CPU worked out of the box for old and new software. This is also the excuse that was used for FX back then, and where did that go after 6 years? Nowhere.

2. Again, see 1.

3. Processor drivers have nothing to do with performance.

4. Debunked already, and just because you make an anemic, crude, bolted-together set of core clusters doesn't mean it's someone else's job to fix it.

5. That's hardware design, not software.

6. Again, AMD's and Nvidia's drivers are already compiled with Intel's compiler; see 1.
 
It's not unusual to hide a massive problem like the CCX problem, which is UNFIXABLE?

Are you kidding me?


That is BS, dude. You know what it is: another excuse.

Why don't you try not to rationalize what AMD did, and instead tell us exactly what they did.

If they get sued for this, they will probably lose, because the 8-core chip does not and cannot function like an 8-core chip in games.

There is nothing illegal in having an anemic interconnect. Pretty much all "many-core" ARM chips are the same. Consoles couldn't be called 8-core either, in your view. But they are.

If we are honest, I think they knew the problem could/would come up, but it really did not matter [for them] with what they tried to do with Zen. I mean, hell, the only situations

Anyway, I think his point is that nobody would be able to sue AMD for anything in regards to the CCX interconnect, because it does not actually violate any law to have a shitty interconnect. That would be like suing Intel for the sort of RAM they supported on Skulltrail.

Exactly, there are so many examples today. Even 8-10 core devices that can't use all the cores at the same time.
 
I mean, that's literally the definition of being flawed, though.

A Bugatti is faster than a Ferrari or a Lamborghini; does that make them flawed? "Flawed" by definition implies a defect, latent or not; without that, no, it is not the definition of anything.
 
If we are honest, I think they knew the problem could/would come up, but it really did not matter [for them] with what they tried to do with Zen. I mean, hell, the only situations

Anyway, I think his point is that nobody would be able to sue AMD for anything in regards to the CCX interconnect, because it does not actually violate any law to have a shitty interconnect. That would be like suing Intel for the sort of RAM they supported on Skulltrail.

There is nothing illegal in having an anemic interconnect. Pretty much all "many-core" ARM chips are the same. Consoles couldn't be called 8-core either, in your view. But they are.


Exactly, there are so many examples today. Even 8-10 core devices that can't use all the cores at the same time.


In civil cases there is no burden of proof beyond a reasonable doubt. All the plaintiff has to do is prove they bought something that isn't doing what it's supposed to do, or isn't performing as expected.

You guys are thinking of a criminal case; no, it's not like that. The defendant, after the plaintiff puts out his reasons, has to rebut the statements by giving reasons or proof that that isn't what they did or how they did it. If there is an abundance of proof, either from many people having the same issues or from evidence that AMD knew something and didn't tell anyone, the plaintiff has the advantage. That advantage is weighed by the judge and jury to arrive at a percentage amount, or however they figure it out.
 
Don't worry about it. I'm a systems engineer and they won't take my word on anything either.

You throw out your education or line of work as a 'look at how smart and trustworthy what I say is' tagline, and then have so little self-awareness that you claim others are the ones making e-peen statements? "No, I won't provide my resume." I didn't realize anyone had asked you to pull down your e-pants. But claiming that yours is yuuuuuuuge and following it immediately with "but I won't show you!"... okay, boo. I'm sure it's a real giant.

Your brain can only perceive changes at 13 to 16 Hz. Very few people can see the flicker of an incandescent bulb at 60 Hz.

It's a placebo. Above about 20 fps you do not get any better as a gamer.

At least make your self-arguments internally consistent. Nothing you've posted is based in fact. It doesn't make you a floating arbiter of intellect to make claims and then say "No, I won't provide proof." That's just a low-level and narcissistic method of attempting to deflect logical dissent. To then create in your mind the alternate reality that there are droves of people already googling and proving you correct is also delusional.

Why don't you try doing the Google search you suggested. You might be surprised by the first thing that pops up in bold at the top of the page, and by the following 10 links all suggesting that this myth that people can't see more than 45 fps is exactly that: a myth.

You can post whatever you'd like, but no, it's not anyone else's job to fact-check you. If you're really a professional then you should grasp how argument and debate work: if you make a claim, you provide proof for said claim or the claim is worthless. The attorney you responded to understands that and I'm assuming you do too; perhaps it's inconvenient after making a completely fallacious claim?

tl;dr: I've done your Google search and I find nothing that supports what you've said. Most results suggest that the eye's visual threshold has nothing to do with fps but rather with the time between stimulus changes. Most suggest that the eye can distinguish near 1000 fps, but that in terms of meaningful smooth-motion perception the useful fps threshold is probably between 60 and 100 fps. All of the results are based on theory (myelin regeneration rates, the visual threshold of fighter pilots, which was >200 fps), suggesting there is no universal truth with which to claim it is all a "marketing gimmick."

If you'd like to provide the Google search terms that lead to your 13-16 Hz, actually 60 Hz, actually 20 fps claim, I'm sure we'd be happy to review it.
 
You keep deflecting from the real subject as to what any of these reviews mean in real-world applications. Again, I am not saying that 720p has no merit, but rather that you need CONTEXT (for the love of all that is holy, LEARN that word; it will go a long way toward making your posts more appealing to the masses) for what they do in fact mean.

The tests were performed at 1080p because reviewers know what they are doing. I have explained again and again what those reviews mean, both for people that play games at 1080p and for people that play at higher resolutions. I also gave a link to an article that explains why one has to eliminate or reduce GPU bottlenecks when testing CPU gaming performance. I will give this link a third time:

http://www.techspot.com/news/68407-...ottlenecking-cpu-gaming-benchmarks-using.html

How about speaking to the owners who so far have yet to complain about the performance?

Let us then replace third-party quantitative reviews with subjective feelings from owners...
 
If the assumptions are correct, the design is server-based and scaled down to desktop, hence the issue, and in all likelihood an unavoidable one at that.

Excuses, excuses. IBM POWER8, Broadcom Vulcan, and many other chips are exclusive to servers and don't have a weird design with a split LLC connected through a slow interconnect. Intel desktop CPUs use the same microarchitecture as the Xeons for servers and also lack this weirdness.
 
Time to bring this back on topic; can you humanoids stop bickering about things that don't belong in this thread.

Hardware.fr has a 2+2 vs 4+0 test with games: a clear penalty when accessing data from a different CCX, at least with games. Here is another: PCGH.de

So when are we going to get a proper English deep dive into the issue at hand, given that we now have a German review site and a French site showing some hard data?
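
In the meantime, here is a rough sketch of how one could emulate those configurations on Linux to poke at it (needs root, and the core-to-CCX mapping below is an assumption; confirm it with lstopo or the sysfs topology files before trusting any numbers):

```python
# Sketch: emulate 4+0 vs 2+2 core configurations by offlining cores through
# sysfs (Linux, run as root). Assumes SMT is off and that cores 0-3 sit on
# CCX0 and cores 4-7 on CCX1, which you must verify on your own board.
SYSFS = "/sys/devices/system/cpu/cpu{n}/online"

CONFIGS = {
    "4+0": {0, 1, 2, 3},   # four active cores, all on one CCX
    "2+2": {0, 1, 4, 5},   # two active cores on each CCX
}

def apply_config(name):
    active = CONFIGS[name]
    for cpu in range(1, 8):             # cpu0 usually cannot be offlined
        with open(SYSFS.format(n=cpu), "w") as f:
            f.write("1" if cpu in active else "0")

apply_config("2+2")   # run the game benchmark, then repeat with "4+0"
```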
 
It's not a placebo on high-refresh monitors. I'm 100% confident I could pass a blind test between 60 fps and 144 fps on a 144 Hz monitor. The difference is that stark. I feel like every person saying it's a placebo has never actually tried a high-refresh monitor and compared the difference between high and low fps.
https://us.hardware.info/reviews/4592/vast-majority-of-gamers-prefers-120-hz-monitors

There is your test, and 80+ percent could tell the difference. So unless he comes up with something else that I couldn't find (and I put effort in), I'd conclude he's wrong.

That said, I've been on 60 Hz for years and it's never been something I cared to upgrade. 4K is worth it to me by far over a much smaller, expensive 120+ Hz display.
After owning two 144 Hz displays, I sold them and moved to a 40-inch 4K monitor (not a TV) and will never look back.

Could I tell the difference? Yeah, on some games. Did 144 Hz make a difference? Not in my opinion.

I've seen gamers whoop ass on an old Dell box with a CRT and a free mouse at low resolution in CS:GO, kicking the snot out of guys with 144 Hz panels. They don't make anyone a better gamer at all.

I love the 4K real estate. And games at 4K 60 fps are much more immersive, imo.
 
After owning two 144 Hz displays, I sold them and moved to a 40-inch 4K monitor (not a TV) and will never look back.

Could I tell the difference? Yeah, on some games. Did 144 Hz make a difference? Not in my opinion.

I've seen gamers whoop ass on an old Dell box with a CRT and a free mouse at low resolution in CS:GO, kicking the snot out of guys with 144 Hz panels. They don't make anyone a better gamer at all.

I love the 4K real estate. And games at 4K 60 fps are much more immersive, imo.
A lot of CRTs refresh much faster than an LCD and have zero latency compared to an LCD, so that could very well be why. Also, on a side note, a lot of pro or hardcore CS:GO players play at low resolutions because the larger the pixels are on your screen, the larger the hitbox.
 
Time to bring this back on topic; can you humanoids stop bickering about things that don't belong in this thread.

Hardware.fr has a 2+2 vs 4+0 test with games: a clear penalty when accessing data from a different CCX, at least with games. Here is another: PCGH.de

So when are we going to get a proper English deep dive into the issue at hand, given that we now have a German review site and a French site showing some hard data?

Except for the huge 20% variation on BF, which looks like an extreme case, all the benches performed by Hardware.fr show that the variations in performance are tiny, with an average of about 2% between the 4+0 and the 2+2 configurations. Funnily enough, pcgameshardware.de finds a 10% variation on BF.

All those waiting for a quad-core chip (in a 4+0 configuration) to become the new king of gaming will be disappointed.
 
It is difficult to speculate at this point on how a phantom chip will and will not perform. The R5 strikes the best value for money of the entire SKU stack and lets you pocket money towards a better graphics card, which makes most of the difference.
 
So are you guys gonna sue Ford because the Mustangs don't corner as well as a BMW? I agree with Gideon: close this stupid thread.


Is it due to a fault in the car's design, or is it simply due to the design?

AMD's latency issue in its L3 cache is due to a fault, not because it was designed that way. And this is why the BD lawsuit didn't make any sense: BD was designed in a certain way, and that is why its performance was what it was.

Some crappy law firm can pick it up and go full steam ahead.
 
riiiight....


Just like the 970 memory lawsuit.

There was no merit to that lawsuit on the original grounds it was filed for (it was filed over the partitioning of the RAM into a slow 512 MB segment). It was a 4 GB card and functioned like one too, albeit a portion of the RAM was slower due to, guess what, latency increases, lol. But during discovery, the ROP counts were found to be inaccurate, and that is why nV settled, because they knew they couldn't win that part of it.
 
Is it due to a fault in the car's design, or is it simply due to the design?

AMD's latency issue in its L3 cache is due to a fault, not because it was designed that way. And this is why the BD lawsuit didn't make any sense: BD was designed in a certain way, and that is why its performance was what it was.

Some crappy law firm can pick it up and go full steam ahead.

Having practiced law for 7 years in commercial litigation and corporate contracts: that will never happen. Unless you can prove facts were misrepresented to you at the outset, i.e. you must have an understanding of CPU architecture, would have had to know from day 1 how Ryzen was developed, were relying completely on that fact, and it turned out you acted on that representation to your detriment, then sure. But consumers at large don't know, and there is no way of claiming you expected this and relied on that self-made expectation.
 
My example was because people started bringing up cars. nV LIED about shit; that's why the suit happened.
 
Just like the 970 memory lawsuit.

There was no merit to that lawsuit on the original grounds it was filed for (it was filed over the partitioning of the RAM into a slow 512 MB segment). It was a 4 GB card and functioned like one too, albeit a portion of the RAM was slower due to, guess what, latency increases, lol. But during discovery, the ROP counts were found to be inaccurate, and that is why nV settled, because they knew they couldn't win that part of it.

NV got sued because they misrepresented facts; they touted specs they didn't have. Ryzen in the odd game doesn't do as well, but that is not because AMD misrepresented it as having specs it doesn't have.
 
NV got sued because they misrepresented facts; they touted specs they didn't have. Ryzen in the odd game doesn't do as well, but that is not because AMD misrepresented it as having specs it doesn't have.


But that wasn't found out until afterwards; the original lawsuit had nothing to do with the specs, as I stated above.

At that point when it was found out, nV settled.
 