Why does the Ryzen 7 1800X perform so poorly in games?

Funny you mention the FX-8350, a chip that was previously accused of having a real TDP higher than its marketing label.

We knew before launch that "95W" was a marketing label, because the real TDP was 105W with turbo/XFR disabled. AMD can be very imaginative in redefining the TDP of its chips, using tricks such as 'typical use' to make people believe that Ryzen is more efficient than Broadwell (95W vs 140W). But the TDP cannot be lower than the power consumption of the chip at base clocks. At least reviewers and others are pointing out that "95W" really means 130W.


Still less than what the Intel 6900K uses in power, so a non-issue. Everyone in the enthusiast community knows AMD rates their chips differently than Intel does; not news, and no one really cares.
 
OK, both of you need to pay attention to this next point.

The 8350 has a TDP of what? ... It is 125W.

When running a set test like IBT or Prime95, how many watts does it draw? ... 160-200W, depending on chip VID.

In recent years AMD has, for all its processors and GPUs, posted TDP as "typical" use; it pertains only to the heatsink, not to the wattage rating of the chip.
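For reference, here's the heatsink-oriented formula AMD has publicly described for Ryzen's TDP, as a rough Python sketch; the 1800X inputs below are the widely reported figures, so treat them as illustrative rather than official:

```python
# AMD's heatsink-oriented TDP definition (as widely reported):
# TDP (W) = (max case temp - ambient temp) / cooler thermal resistance.
def amd_tdp(t_case_max_c, t_ambient_c, theta_ca_c_per_w):
    return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

# Reported 1800X inputs: 61.8C case limit, 42C ambient, 0.207 C/W cooler.
print(round(amd_tdp(61.8, 42.0, 0.207), 1))  # ~95.7 -> the "95W" label
```

Note how nothing in that formula is the chip's actual power draw; it's purely a statement about the cooler.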

You two are playing fast and loose with the facts to make it look like some kind of conspiracy or deception when the only deception is your take on the facts.
I would argue AMD moved the goalpost by redefining what TDP means just for them.
 
I would argue AMD moved the goalpost by redefining what TDP means just for them.

Pretty much. With an Intel CPU at stock speeds, the CPU won't ever go beyond the TDP number. More often than not they come in under, if not well under, the TDP figure, as they aren't running at 100% utilization. AMD CPUs might "average" 95 watts, but they can run much higher than that under full load. That's the sense I'm getting from all that I've read on the subject. (Still don't have Ryzen in hand.) For enthusiasts the TDP numbers are damn near meaningless anyway; those figures go right out the window when you start overclocking.
 
TDP is kinda hard to define actually.

Look at it this way. The X series processors are sold without heatsinks and are advertised to clock to the capacity of the cooling you add. So if you put on a 200W water cooler, don't be surprised that it's clocking hard enough to pull 150W+.

If that's a problem you can put on a 95W stock cooler from your parts drawer and turn off all the cool features in the UEFI. Then go drink a glass of warm milk and go to bed early. :sleep:
 
TDP is kinda hard to define actually.

Look at it this way. The X series processors are sold without heatsinks and are advertised to clock to the capacity of the cooling you add. So if you put on a 200W water cooler, don't be surprised that it's clocking hard enough to pull 150W+.

If that's a problem you can put on a 95W stock cooler from your parts drawer and turn off all the cool features in the UEFI. Then drink a glass of warm milk and go to bed early. :sleep:

Not on Intel CPUs. Intel sets the TDP as a hard limit for stock operation, which should include turbo frequency adjustments. If the system runs too hot in a given configuration, then you won't get any turbo clocking at all. It's not until the user starts adjusting the voltages and cranking up the multiplier that the TDP goes out the window. AMD does it differently.
 
I would argue AMD moved the goalpost by redefining what TDP means just for them.

Redefined twice, in fact, because the new 95W 1800X consumes more power than the 125W FX-8350, which already had a redefined TDP label back then.
 
Games have always been sensitive to cache performance. We've seen this many times over the years.
 
Maybe sooner. We got good stability at 3GHz, and the comparative gaming at 3GHz bore great results. Pushing it up eventually showed weakness in LP FinFET; we even lost 3200MHz completely.

I wouldn't be surprised if AMD moves to a higher-power transistor; even the LP is not true LP, it is more of a hybrid, but it doesn't scale well.

 
Games have always been sensitive to cache performance. We've seen this many times over the years.
Except the FCAT tests we have are not showing poor frame variance. Doom is incredibly smooth, and Resident Evil 7 runs well too.

Ryzen is young at the moment, and the whiz kid is coming up with the same ol' pessimistic outlook.
 
Yes, they accomplished very little with this iteration of Ryzen. They didn't do what they stated: they are not back, these aren't enthusiast systems, they aren't gaming systems, they aren't overclocking systems. What is left is something that is already out there...

They have accomplished two things. First, they showed us why they are second and that at times they can take the fight to Intel. Second, the pricing of Ryzen is where it should be.

Dunno what anyone else thinks... but if gaming is your secondary use of your rig, this is a great workstation chip. It's certainly not a bad gaming chip.

Hard to understand the ire about these.
 
Dunno what anyone else thinks... but if gaming is your secondary use of your rig, this is a great workstation chip. It's certainly not a bad gaming chip.

Hard to understand the ire about these.


Up to a certain point for workstations; for professionals, not really. The platform wasn't made for professionals (beyond the kinks, it lacks the features), which the Naples platform will address. For prosumers, yeah, it's good.
 
This came today, a chance to test out a budget-friendly X370 board.
 

Attachments

  • IMG-20170306-WA0000.jpg (130.1 KB)
Streaming to Twitch at a 3500 (kbps) setting using a GPU never looks as good as x264. Quick Sync never looks as good as x264. NVENC doesn't look as good as x264. AMD GPUs didn't look that awesome doing it either. I didn't try the new ReLive stuff though.

So that's why, for Twitch streaming, x264 is more important for gamers than GPU rendering. People will leave your stream to watch a higher-quality stream even if you're better at the game.


Just wanted to toss that out there so that people understand the difference. If you have unlimited encoding bandwidth then GPU looks really good, but I've seen some tests where people argue there is a difference even then. GPU encoding with unlimited bandwidth was good enough for me, though.
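If anyone wants to reproduce this at home, here's a minimal sketch (assuming ffmpeg built with libx264 and h264_nvenc; the filenames are placeholders) that encodes the same clip both ways at a Twitch-like 3500kbps cap:

```python
import subprocess

SRC = "gameplay.mkv"  # placeholder input clip

# Encode the same source at ~3500 kbps with CPU x264 and with NVENC,
# then compare the outputs by eye (or with a metric like SSIM).
for name, codec in [("x264", "libx264"), ("nvenc", "h264_nvenc")]:
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-c:v", codec,
        "-b:v", "3500k", "-maxrate", "3500k", "-bufsize", "7000k",
        "-an",  # drop audio; only video quality matters here
        f"out_{name}.mp4",
    ], check=True)
```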


Here is an explanation. Start it at the 25m 54s mark.



I must be getting old.... These two know enough about what they are talking about to prove they don't know anything.
 
Dunno what anyone else thinks... but if gaming is your secondary use of your rig, this is a great workstation chip. It's certainly not a bad gaming chip.

Hard to understand the ire about these.

Ryzen is a great workstation CPU. It's an unbeatable value in that category with just one problem. The X370 platform is vastly inferior to X99 in terms of PCIe lanes. There are also some applications that do benefit from memory bandwidth and again, X99 destroys X370 in that arena. Even with those caveats, if you can live with a couple of the negatives you can save an ass load of cash going with Ryzen over Broadwell-E.
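The bandwidth gap is easy to put numbers on; here's a back-of-envelope sketch (peak theoretical figures for example configurations, not benchmarks):

```python
# Peak DRAM bandwidth = channels x transfer rate (MT/s) x 8 bytes
# per 64-bit transfer. Example configurations only.
def peak_bw_gb_s(channels, mt_per_s):
    return channels * mt_per_s * 8 / 1000

print(peak_bw_gb_s(2, 2666))  # X370 dual-channel DDR4-2666 -> ~42.7 GB/s
print(peak_bw_gb_s(4, 2400))  # X99 quad-channel DDR4-2400  -> ~76.8 GB/s
```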
 
What the actual fuck? What makes you think that any game developer would intentionally hamstring their game engine on hardware that constitutes the vast majority of systems their customers are using? Just to appease the minority that is using Ryzen 1700, 1700X and 1800X vs those using a quad core Intel chip? How well do you think that'll work when AMD releases failed hexacore chips as R5's and R3's?


Where did I say they should hamstring their games for 4-core parts? I said AMD should work with engine devs and game devs to boost performance scaling on higher-core parts, so that the relative performance advantage makes the 4-core parts look worse. Nowhere in that is there a suggestion that AMD should try to get them to degrade performance in any way on 4-core parts. AMD is going with a higher-core strategy coupled with a competitive-IPC, lower-initial-clock-speed chip (completely reasonable given the constraints of a first chip iteration). It is in their interest to have more games scale much better on higher core counts because, at every price level so far, they will have many more cores to play with.
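A quick Amdahl's-law sketch of why that strategy makes sense (the parallel fractions are made up purely for illustration):

```python
# Amdahl's law: speedup on n cores for a workload with parallel fraction p.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.8, 0.95):  # assumed parallel fractions
    print(f"p={p}: 4 cores = {speedup(p, 4):.2f}x, 8 cores = {speedup(p, 8):.2f}x")
```

The better an engine scales (higher p), the bigger the 8-core advantage, which is exactly the kind of scaling AMD would want engine devs to chase.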
 
Up to a certain point for workstations; for professionals, not really. The platform wasn't made for professionals (beyond the kinks, it lacks the features), which the Naples platform will address. For prosumers, yeah, it's good.

Ha!

Most guys in IT departments would kill for a workstation this capable. Hell, my main client is still running dual-core docked laptops for employees. The only Xeons are in the data closet, and they are old. We've got VMware servers running on AMD parts, for God's sake.

Really? Professionals? Outside of heavy content creators everyone in corporate America is limping along with lowest common denominator computers.
 
Ryzen is a great workstation CPU. It's an unbeatable value in that category with just one problem. The X370 platform is vastly inferior to X99 in terms of PCIe lanes. There are also some applications that do benefit from memory bandwidth and again, X99 destroys X370 in that arena. Even with those caveats, if you can live with a couple of the negatives you can save an ass load of cash going with Ryzen over Broadwell-E.

Ryzen is the middle ground between 1151 and X99; obviously some sacrifice was needed, and that was the case.
 
I am old. I don't get why the fuck anyone would watch someone else play computer games like they were watching TV. It's boring as fuck to me.

No kidding. Unless the players were friends I'd rather get a colonoscopy.
 
Ryzen is the middle ground between 1151 and X99; obviously some sacrifice was needed, and that was the case.

AMD has always lagged behind in some area on the desktop side; that's just how it's always been, for the most part. Ryzen itself is a good competitor to Intel's HEDT CPUs in many areas, but the platform falls short of that. The thing is, X370 is really no better than Z270. On that front it's not really in between LGA 1151 and LGA 2011-v3 at all.
 
Really? Professionals? Outside of heavy content creators everyone in corporate America is limping along with lowest common denominator computers.

Without a doubt. My work laptop has a Core i7-4600U running at 2.1GHz-2.7GHz with 8GB of RAM. Hardly a powerhouse.
 
Ha!

Most guys in IT departments would kill for a workstation this capable. Hell, my main client is still running dual-core docked laptops for employees. The only Xeons are in the data closet, and they are old. We've got VMware servers running on AMD parts, for God's sake.

Really? Professionals? Outside of heavy content creators everyone in corporate America is limping along with lowest common denominator computers.


Err, no they are not, man. Young & Rubicam: their workstations were $7k+ when I was with them, and if we needed one, we just asked and got one within the next couple of days. And that was an advertising firm. Think about companies that specialize in heavy workstation loads. The workstations we use at my current job are 4-CPU machines, $20k a piece, and same thing: just a request to the IT department, and there it is by the next day.

At Y&R, our clients' accounts/projects were minimum $500k and up, and $500k was only once the client had other things in their portfolio worth millions and up.


PS: even smaller agencies that don't have account requirements like that (the first company I worked for in NYC) never had an issue getting a dual-Penryn Mac as a workstation when we needed one. Their accounts were much smaller, $25k and up, so......

So workstation costs are meaningless for most companies; it's not like they are going to get 10 or 15 of them. It depends on the projects they have, but the projects tend to take care of the costs, since the cost is a one-time thing that spans multiple projects.
 
If they matched X99 features they'd get too close to X99 prices, at which point people would just buy Intel. Time will tell if they found a large enough market segment for these parts.
 
Hmmm. I can almost see a new stepping of Ryzen over the horizon...
 
Ryzen is the middle ground between 1151 and X99; obviously some sacrifice was needed, and that was the case.

When I look at the X370 boards I basically see Z170/Z270 with a (near) X99 price tag. Fewer lanes, half the number of RAM slots (routing them is not uncomplicated)... I don't like that at all.
 
It agrees with what some of us had been saying before launch.


L1 and L2 on Ryzen are fast. L3 has a problem because of the way it is attached to the memory controller.

What I heard is that this higher-latency problem is expected to be solved for Zen+.
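For those following along: on Zen 1 the data fabric linking the two CCXes is commonly reported to run at the memory clock, i.e. half the DDR4 transfer rate, so faster RAM directly raises the fabric clock and trims cross-CCX latency. A small illustration of that relationship:

```python
# Zen 1 data fabric clock ~= memory clock = DDR4 transfer rate / 2
# (as commonly reported). Faster RAM -> faster fabric -> lower
# cross-CCX, and thus effective L3, latency.
for ddr_rate in (2133, 2400, 2666, 3200):
    print(f"DDR4-{ddr_rate}: fabric ~{ddr_rate // 2} MHz")
```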


That's good news for AM4 motherboard users, since there will be multiple Zen chip releases on the same platform. Perfectly fine performance for most people today, and if it does become an issue in some larger number of titles in a few years, drop in a Zen++ chip with all the subsequent clock/cache/IPC boosts.
 
http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/



But:



Disabling SMT didn't change anything, because the problem with Ryzen is something else.


Was that 3% number based on the average of all 16 titles? I ask because, looking at specific cases, there were actually performance boosts with SMT AND performance hits with SMT, so a pure average loses a lot of the potentially relevant information. And in several titles, SMT on actually increased the minimum frames by a decent margin while the averages stayed about the same.
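To make that concrete, here's a toy example (the per-title deltas are invented) of how a small overall average can hide big per-title swings in both directions:

```python
# Hypothetical per-title FPS deltas (%) from enabling SMT.
smt_delta_pct = [-12, -8, -5, -1, 0, 2, 4, 6]

mean = sum(smt_delta_pct) / len(smt_delta_pct)
print(f"mean delta: {mean:.1f}%")                                # looks small
print(f"range: {min(smt_delta_pct)}% to {max(smt_delta_pct)}%")  # wide spread
```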
 
I am old. I don't get why the fuck anyone would watch someone else play computer games like they were watching TV. It's boring as fuck to me.


I used to do that with my cousin as a kid. I would play Mega Man X on the Super NES when he was over, and I would let him watch me play the game. Great deal.
 
Err, no they are not, man. Young & Rubicam: their workstations were $7k+ when I was with them, and if we needed one, we just asked and got one within the next couple of days. And that was an advertising firm. Think about companies that specialize in heavy workstation loads. The workstations we use at my current job are 4-CPU machines, $20k a piece, and same thing: just a request to the IT department, and there it is by the next day.

At Y&R, our clients' accounts/projects were minimum $500k and up, and $500k was only once the client had other things in their portfolio worth millions and up.

So, no, workstation costs are meaningless.

Ya but you don't live in the real world. You live in the Fortune 1000 world. You don't work in the real economy... you are "urban glitz business". You need to get down in the dirt where people actually produce something.

As a former city guy, I love it when city guys think "urban glitz business" is indicative of the overall economy.

I bet avocado facials are part of the benefit package.
 