AMD Ryzen 1700X CPU Review @ [H]

I think when they start programming games that take advantage of more than four cores, this CPU will shine in terms of price/performance. For now, it's in no man's land.

Some games and game engines can already leverage more than four cores. What most people fail to understand is that doing so doesn't guarantee linear performance scaling. Some workloads, by their nature, don't benefit from multithreading; it isn't that the developers don't want to code for it.
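
Amdahl's law is the textbook way to see why. Here's a quick sketch; the 60% parallel fraction is just an assumed, game-like number, not a measurement:

```python
# Amdahl's law: overall speedup from N cores when only a fraction P
# of the per-frame work can actually run in parallel.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Assumed workload: 60% parallelizable (render submission, physics,
# audio), 40% stuck on a serial main thread.
for n in (2, 4, 8, 16):
    print(f"{n:2d} cores -> {speedup(0.6, n):.2f}x")
# 2 -> 1.43x, 4 -> 1.82x, 8 -> 2.11x, 16 -> 2.29x
```

Doubling from 8 to 16 cores buys you less than 10% once the serial part dominates.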

Yeah, the day traders and short-term thinkers bailed without a thought as to how this decent basic core will stack up in the real profit areas of servers and APUs.

Sorry to say it, but gaming is tiny. No one cares if we don't buy to play RB6S or BF1.

Sorry, but all the market research I've been shown by the manufacturers, including MSI, ASUS, GIGABYTE and Intel, paints a very different picture. Gaming is growing and it's one of the driving factors behind continued desktop sales. Think about it: at current prices there isn't any reason to buy a desktop for other tasks. If it hadn't been for research showing that this was where the desktop is going, do you really think MSI would have re-branded EVERYTHING for gaming? That's exactly what they did. Very few of their motherboards lack "Gaming" in the branding. There is a reason for that. The desktop has no reason to exist outside of gaming and professional workstation tasks. Most professional workstations are OEM builds from Dell, HP, etc. They aren't using Z270X Gaming 9's or anything like that.
 
To take an Intel-like stance, look at that voltage vs. frequency curve.
Ryzen should be pretty darned efficient at stock and/or lower speeds, which should help tremendously in mobile and HPC/servers.
 
To take an Intel-like stance, look at that voltage vs. frequency curve.
Ryzen should be pretty darned efficient at stock, which should help tremendously in mobile and HPC/servers.


It should, but we have to see how it scales; again, with architecture vs. performance vs. power usage, it might not scale down past a certain limit either.

The 1700X does show a good sign that it will.
 
Sorry, but I have to do this, no choice...
But AMD draws fewer watts at idle than Intel, so you must buy it if you have to pay a lot for your electricity bill.

This is a bullshit statement. You won't notice any significant difference in your electric bill going from a modern Intel-based system to a Ryzen system. If you already have a modern Intel system, the cost of switching will take some time to make up as well. People overplay the cost of running PCs. The fact is you'd save more by using your washer and dryer or microwave less than you do now than by running your PC less.
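
Run the numbers yourself. Even with generous assumptions (a 10W idle difference, 8 hours a day, $0.13/kWh, all made-up but plausible figures), it's pocket change:

```python
# Annual cost of an idle-power delta between two systems.
idle_delta_w = 10        # assumed watts saved at idle
hours_per_day = 8        # assumed daily use
rate_per_kwh = 0.13      # assumed electricity rate, $/kWh

kwh_per_year = idle_delta_w * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/yr -> ${kwh_per_year * rate_per_kwh:.2f}/yr")
# 29.2 kWh/yr -> $3.80/yr
```

A single weekly dryer load is on the order of 150 kWh a year by itself.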
 
Actually, yeah, the 1700X is the best chip out of Ryzen's lineup and probably will stay that way.
From a price perspective I'm not sure why anyone would buy the 1800X (I want to call it the X1800 XT lol). I'm sure there are people who want to get the max out of their CPU without going into the BIOS, but that sort of negates high-end boards.
 
I can't believe all these people are spreading so much FUD about this. It consumes similar or less power than Intel at stock, and somewhere around there when overclocked. AMD chips have historically lived on more voltage. Why does a power spike matter if it's still as good as Intel? I don't understand all the negativity being spread around by fanboys.

There is another thread here that shows something like 10 different benchmarks, and in most games the processor does pretty decent. In the rest it performs better if SMT is disabled. Clearly some games are not behaving the way they should on this processor. Fixes might never materialize, but it does pretty damn decent in a lot of games. If the processor sucked that badly, all games would show the same behavior. At 1080p it performed up to par with Intel in a lot of the benchmarks.

With that said, buy the Ryzen 1700 (non-X), OC it to 3.9 or 4.0GHz, and save yourself money. Hitting 4GHz on all cores is not bad, and if the 1800X has no advantage there, there is no need to waste more money.

AMD already said they know where they want to go next with Zen+; I am pretty sure improving clocks is one of those goals.

Remember six months ago we were saying we wouldn't see anything better than 3.2GHz, and that would be the max, weren't we?
 
Sorry, but all the market research I've been shown by the manufacturers, including MSI, ASUS, GIGABYTE and Intel, paints a very different picture. Gaming is growing and it's one of the driving factors behind continued desktop sales. Think about it: at current prices there isn't any reason to buy a desktop for other tasks. If it hadn't been for research showing that this was where the desktop is going, do you really think MSI would have re-branded EVERYTHING for gaming? That's exactly what they did. Very few of their motherboards lack "Gaming" in the branding. There is a reason for that. The desktop has no reason to exist outside of gaming and professional workstation tasks. Most professional workstations are OEM builds from Dell, HP, etc. They aren't using Z270X Gaming 9's or anything like that.

Growing a 1 percent market to 1.15 percent still isn't going to turn many heads. And growth in PCs isn't necessarily growth in CPUs. AMD is going to use this foundational core to power their bread and butter, APUs, and they hope to expand their market share in the more profitable server lines.

Gaming isn't the engine driving those far more profitable markets.

Pointing out that device manufacturers like MSI are literally tied to this niche market doesn't do anything but state the obvious. Of course they're tied to gaming with their products. I agree. I buy their stuff. Why wouldn't I? But I don't base a company's potential CPU market uptake on how it does in games alone. It's just one segment, and a small one at that once you figure Intel is dominating it by far, leaving you nothing but crumbs which aren't worth fighting for. Yet.
 
It should, but we have to see how it scales; again, with architecture vs. performance vs. power usage, it might not scale down past a certain limit either.
The 1700X does show a good sign that it will.
Yup. The R5 and R3 should be interesting.

I mean, we (myself included) keep bitching at Intel for not going balls-to-the-wall frequency-wise, power efficiency be damned.
Can't blame AMD for doing the same.
 
Kyle ain't bullshittin'. Look at my post from a few pages back (or this thread on AT).

AMD pushed Ryzen outside the optimal bounds of GloFo's 14nm process:
https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

It's quite possible that AMD had no choice. Going back to Bulldozer, I suspected that it was probably efficient enough at lower clock speeds, but the IPC put it too far behind. When you have low IPC you can compensate for it by increasing the clock speeds. AMD ran those CPUs at speeds that were almost the maximum those chips could possibly run at. That's why they had jack shit for overclocking headroom. Unfortunately, when you scale clock speeds too high you end up well outside the sweet spot for efficiency, and thus you increase power consumption and the CPUs generate more heat. I suspect the same thing happened with Ryzen. The difference is that the problem isn't as pronounced, but they still needed clock speed to keep pace with Intel just as before. The clock speeds were probably increased to match Intel's offerings more closely on a stock-for-stock basis. They still managed to hit a reasonable TDP, all things considered. Earlier rumors about engineering samples indicated speeds between the 2.5GHz and 3.0GHz range. I'd imagine that those CPUs are considerably more efficient at those clocks but obviously fall well short of AMD's goals in terms of performance.
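
You can see why from the physics: dynamic power scales roughly as C·V²·f, and voltage has to climb with frequency, so power grows much faster than clocks do. A back-of-the-envelope sketch; the V/f points below are invented for illustration, only loosely shaped like the reported Ryzen curve:

```python
# Dynamic power ~ C * V^2 * f. Relative power at each (frequency, voltage)
# point versus the 3.0GHz baseline; the pairs are assumed, not measured.
points = [(3.0, 0.90), (3.5, 1.05), (3.9, 1.30), (4.1, 1.45)]  # (GHz, V)

base_f, base_v = points[0]
for f, v in points:
    rel = (v / base_v) ** 2 * (f / base_f)
    print(f"{f:.1f} GHz @ {v:.2f} V -> {rel:.2f}x baseline power")
# 3.0 -> 1.00x, 3.5 -> 1.59x, 3.9 -> 2.71x, 4.1 -> 3.55x
```

A ~37% clock increase costing ~3.5x the power is exactly the kind of curve that leaves no overclocking headroom.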

I know I've made several comparisons to Bulldozer in this thread, but while I'm drawing those parallels I don't think Ryzen is close to the fiasco that Bulldozer was. Bulldozer wasn't very good at competing with Intel in virtually any way; it consumed a lot more power and had almost no overclocking headroom to work with. It was a terrible chip that only performed competitively in specific multi-threaded workloads. It was truly awful as a desktop processor no matter what you did with it. Ryzen isn't that by any stretch of the imagination. AMD learned some valuable lessons, but there is still some similarity in the two situations. If anything, the history is reminiscent of older eras where Cyrix and AMD did OK in office and productivity type applications and sucked ass at gaming. You Quake I players who couldn't afford Intel CPUs know what I'm talking about. Even with that comparison, I think we are in a better place with Ryzen than we ever were with earlier AMD CPUs. Athlon / Athlon 64 / Athlon X2 notwithstanding.
 
Growing a 1 percent market to 1.15 percent still isn't going to turn many heads. And growth in PCs isn't necessarily growth in CPUs. AMD is going to use this foundational core to power their bread and butter, APUs, and they hope to expand their market share in the more profitable server lines.

Gaming isn't the engine driving those far more profitable markets.

Pointing out that device manufacturers like MSI are literally tied to this niche market doesn't do anything but state the obvious. Of course they're tied to gaming with their products. I agree. I buy their stuff. Why wouldn't I? But I don't base a company's potential CPU market uptake on how it does in games alone. It's just one segment, and a small one at that once you figure Intel is dominating it by far, leaving you nothing but crumbs which aren't worth fighting for. Yet.

Trust me, the computer hardware manufacturers are convinced. They make a pretty big deal about it. Still, it isn't what drives processor manufacturing. The mobile and server markets are larger these days, and it's easy to re-purpose CPUs designed around the concept of performance per watt for desktop use. If the market ever shifted significantly enough, that would change, but we are a long way off from that and it frankly won't happen. Still, do not sell PC gaming short as a factor in corporate decision making. It's a more important segment to these companies than you give them credit for.
 
I think topping out at 4GHz on a brand new architecture is not bad. Remember when Intel's first- and second-gen i7 parts were in the high 2GHz range, correct? I think AMD will get the clocks up. They were clocking the shit out of their last architecture and it still ate shit, lol. It got them nowhere; this is at least a big step in the right direction.
 
Ugh, Cyrix. Hope to never see those POSes again.

The Cyrix 6x86 was the best alternative to the Pentium CPUs in its day. They did suffer due to games being optimized for Intel's FPU architecture but outside of that they were damn good. Although they did run a bit hotter. AMD had no real competition to offer at the time. The K5 was late to market and it was a piece of shit. I have a decent sized processor collection and I've probably got at least one Cyrix 6x86 in it.
 
Why does the Ryzen 7 1800X perform so poorly in games?

Good post. It's not bad across all games, and in the games where it falls behind, disabling SMT helps. It looks like more of an optimization issue. If it were a horrible gaming CPU, it would suffer across the board, which is not the case.

Cliff notes from that thread:
Disable SMT for better game performance. Then go find a genie and wish for AMD to magically fix the issue, even though they had years to get this taken care of before launch.
 
I think topping out at 4GHz on a brand new architecture is not bad. Remember when Intel's first- and second-gen i7 parts were in the high 2GHz range, correct? I think AMD will get the clocks up. They were clocking the shit out of their last architecture and it still ate shit, lol. It got them nowhere; this is at least a big step in the right direction.

I think Intel was pretty conservative with the clocks on the first and second gen (Nehalem and Sandy Bridge). It wasn't rare to see an i7 920 overclock to 4GHz, just as it wasn't rare to see an i7 2600K at 5GHz. But then, those were planar architectures, which behave absolutely differently from 3D transistors, which is why the 3770K had that low 77W TDP. They could easily have made it a 4GHz chip and still come in at a lower TDP than its predecessor.
 
The Cyrix 6x86 was the best alternative to the Pentium CPUs in its day. They did suffer due to games being optimized for Intel's FPU architecture but outside of that they were damn good. Although they did run a bit hotter. AMD had no real competition to offer at the time. The K5 was late to market and it was a piece of shit. I have a decent sized processor collection and I've probably got at least one Cyrix 6x86 in it.
Might be misremembering in my "old" age, but I can't seem to recall any good experiences with Cyrix chips, though my experience may have been limited to the 486S and 5x86.
I remember AMD's older chips (e.g. the 386DX40, 486DX4-120, and K6-2/3) in a more favorable light.
 
Might be misremembering in my "old" age, but I can't seem to recall any good experiences with Cyrix chips, though my experience may have been limited to the 486S and 5x86.
I remember AMD's older chips (e.g. the 386DX40, 486DX4-120, and K6-2/3) in a more favorable light.

The AMD K5 PR133 wasn't so good :D But still, not bad for a Socket 3 drop-in! :)
 
Actually, at my workplace, on top of the standard fare HP, Dell, Lenovo stuff we get, we also have a side budget for emerging technologies that benefit from the enthusiast market sector. A research box for AI learning was built on the fly off an i7 and an ASUS Z170 TUF board, plus some X99-based systems for graphics-intense 4K render/photo editing. There is a market for these machines that can do more than a "Dell" for cheaper. With AMD back in the game bringing some competition, I feel those X99-based systems we bought would probably be AMD Ryzen setups if built today. I consider this CPU not an Intel killer, but a legitimate option now for power workstations.
 
Might be misremembering in my "old" age, but I can't seem to recall any good experiences with Cyrix chips, though my experience may have been limited to the 486S and 5x86.
I remember AMD's older chips (e.g. the 386DX40, 486DX4-120, and K6-2/3) in a more favorable light.

The AMD 386 and 486 chips were much slower than their Intel counterparts. They weren't quite as bad as the Cyrix chips, but they weren't equal to Intel CPUs either. There were fewer compatibility issues with software on the AMD side in those days, though. AMD is often fondly remembered for its 486 series CPUs, but most people are thinking of the DX4 100MHz and 120MHz CPUs. Their 486 series was carried on after Intel had already brought out the Pentium, leaving 486 owners to naturally upgrade with AMD CPUs rather than replace their motherboard, CPU, etc. Similarly, Cyrix and AMD both had a 5x86 CPU, though Cyrix's was better. It was a bit of a Pentium Overdrive on a 486 motherboard type of thing. It worked, but it wasn't that great. Faster than a 486 but much slower than a Pentium at similar speeds. The K6, K6-2 and K6-3 were overrated. They were slower than their Intel counterparts most of the time once again, but even worse was the fact that they were tied to Socket 7 motherboards. Upgraded motherboards using non-Intel chipsets were generally known as Super 7 motherboards, and by and large they were shit.

Anyway, I'm straying off topic. I just brought those old examples up to showcase how history sometimes repeats itself. I think many people were hoping for another Athlon 64, not realizing what a fluke that actually was for AMD.
 
Also... here's a thought: what if AMD were to "halo" one of their new Naples/Snowy Owl parts as an R7 HEDT or "R9" variant in the family, like the 6800K/6900K options from Intel, and re-work an improved X370 with quad-channel memory support?

Or could we see the return of the Black Edition R7 as the 1900X? Although, seeing that Zeppelin frequency/voltage curve, who knows what's left in the GloFo process currently. Improvements over time in the manufacturing process could open up headroom.

Suppose only AMD PMs know for now. :)
 
Also... here's a thought: what if AMD were to "halo" one of their new Naples/Snowy Owl parts as an R7 HEDT or "R9" variant in the family, like the 6800K/6900K options from Intel, and re-work an improved X370 with quad-channel memory support?

Or could we see the return of the Black Edition R7 as the 1900X? Although, seeing that Zeppelin frequency/voltage curve, who knows what's left in the GloFo process currently. Improvements over time in the manufacturing process could open up headroom.

Suppose only AMD PMs know for now. :)

I will wager we won't see a Ryzen variant with greater clock speeds or overclocking potential anytime soon. The CPUs are unlocked, so the "Black Edition" moniker wouldn't make sense at this point. Given where we see Ryzen's clocks, it just isn't going to be capable of more without some hardware revisions. As for an HEDT platform, I don't doubt that we will probably see some sort of server CPU variant or HEDT version of Ryzen on a modified server/workstation motherboard to provide a lower-cost alternative to X99-based systems. It's actually a great market to go after because of Intel's current platform and CPU pricing. AMD can somewhat do that now, but X370 just isn't competitive in most respects with X99.
 
Was thinking "BE" in the sense of some type of top-end / halo SKU of the R7 family, as I would guess that the "FX" moniker is to be buried along with the previous-gen stuff.
And agreed on that halo/HEDT piece... the chipsets aren't there yet, but I'd say that's simply a function of time and maturity of the platform (fed by the server launches for memory controller "stuff").
 
Is it not possible that enabling SMT causes a performance regression due to the increased potential for the Windows scheduler to ignore CCX locality and assign data to threads on another CCX?

I'm guessing there's a major penalty for cache reads from the other complex; wasn't it limited to 18GB/s or something?

This would be easy to test if you could be certain that disabling four cores disabled one CCX entirely; if the performance regression from using SMT disappears, then...
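
Short of a BIOS toggle, one crude way to run the experiment is to pin the game to the first eight logical CPUs with psutil, which should be a single CCX if Windows enumerates them CCX-first (that mapping is an assumption, so verify the topology first):

```python
import psutil

game_pid = 1234                      # hypothetical: substitute the game's real PID
proc = psutil.Process(game_pid)
proc.cpu_affinity(list(range(8)))    # restrict to logical CPUs 0-7 only
print(proc.cpu_affinity())           # confirm the new mask
```

Then re-run the benchmark with and without SMT and see whether the regression follows the cross-CCX traffic.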

That is easy to fix. AMD is probably working with Microsoft to optimize the Windows scheduler for Ryzen. That is no big deal; it is not a hardware flaw, it is simply an optimization issue. There is absolutely no reason Microsoft won't take care of this expeditiously, given the close relationship that exists on the console front.
 
Right, the HIGHLY VARIABLE gaming performance is the way it is because Intel has been the only target for compiler optimizations.

I'd wait until a year has passed and new games come out with Zen-optimized code paths. THEN we will know how powerful it is.

But for now, it's a mixed-compute powerhouse, with acceptable gaming performance.
 
That is easy to fix. AMD is probably working with Microsoft to optimize the Windows scheduler for Ryzen. That is no big deal; it is not a hardware flaw, it is simply an optimization issue. There is absolutely no reason Microsoft won't take care of this expeditiously, given the close relationship that exists on the console front.
Except why was this not done already? You can't tell me that they haven't already had enough lead time. Also, it is complete speculation that this has to do with the Windows scheduler.
 
Am I the only one excited about Ryzen 3? Right now you can't really get anything good in the $100-$150 price range; a locked 3.7GHz Core i3 isn't really acceptable in this day and age. R3 will be unlocked, and even with the poor voltage scaling we're seeing, an R3 at 4GHz should be sub-100W for day-to-day use, and sub-50W at 3GHz. Pair that with a 1060 or an RX 480, 8GB of memory, and a 450W PSU and you've got a solid HTPC/console replacement for $500, with excellent encoding performance to boot.
R3's niche is also something Intel can't immediately encroach on; AMD can sell cheap R3s harvested from otherwise-worthless 8c dies, but Intel doesn't have such a route - their 4c die is actually quite large because of the iGPU, and it is unclear whether they can harvest i5s with defective GPUs. They could drop 7350K prices or add HT to the i5 to put pressure on AMD, but in any case us consumers win in the end.

I wouldn't say the i3 is not acceptable, but it is not the best in that range or tier. I am very interested, as I want to get my HTPC off this turd of a tri-core APU, and low power is a must coupled with decent performance. I don't really game on it, but I have fired up some side-scrollers and other basic games from Steam. I don't mind tossing in a GPU, but eh.
 
Except why was this not done already? You can't tell me that they haven't already had enough lead time. Also, it is complete speculation that this has to do with the Windows scheduler.

If I'm not mistaken, AMD was supposed to release Ryzen last year, and from the way they kept patching the BIOS up to the last second, it's pretty easy to see how AMD didn't have enough time to manage all the solutions to the problems popping up.
 
Except why was this not done already? You can't tell me that they haven't already had enough lead time. Also, it is complete speculation that this has to do with the Windows scheduler.
The real test is to have people turn off SMT when running their tests. But every single soft-core major release (P4 HT, AMD CMT, even Nehalem HT) has come to market with scheduling issues that needed to be patched in Windows. We also don't know what the Windows update pipeline looks like; it could be months between a posted issue, a code change, and validation before release to the public. The idea is simple: try to assign threads to odd cores, or to cores 1-8, before using the rest. But who knows what it takes on Microsoft's side to actually make that happen. (See the sketch below.)
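
For what it's worth, that "physical cores first" ordering is easy to express. With SMT on, Windows usually exposes sibling pairs as (0,1), (2,3), and so on; a sketch of the idea, with that pairing being an assumption you'd want to check against your machine's topology:

```python
import os

# On an 8c/16t Ryzen this is 0..15; evens are assumed to be one thread
# per physical core, odds their SMT siblings.
logical = list(range(os.cpu_count()))
physical_first = logical[0::2] + logical[1::2]
print(physical_first)   # [0, 2, 4, ..., 14, 1, 3, ..., 15]
```

Hand work out in that order and the SMT siblings only get used once every physical core is already busy. Whether Microsoft can just flip that in the scheduler is another matter.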
 
Wonder how much the gaming bug is going to hurt them right off the bat? I'm sure that will be a microcode/BIOS/driver fix, but every site is reporting the same for day one. I was going to join the hype train and upgrade my OC'd X5690, but other than a new mobo there's not a huge difference right now, which is actually saddening. Even the PCI Express 3.0 lanes on Ryzen, once broken down across a load, leave you with just two 2.0 x8 links for those cards if you're into SLI and a PCIe SSD.

So I think I will still upgrade, as I would like to support AMD, but I play everything now at or near max at 1080p/144Hz, and with the scores a little wonky, I may even wait up to a year for revision 2.0 of the chips and mobos. Thoughts?
 
Wonder how much the gaming bug is going to hurt them right off the bat? I'm sure that will be a microcode/BIOS/driver fix, but every site is reporting the same for day one. I was going to join the hype train and upgrade my OC'd X5690, but other than a new mobo there's not a huge difference right now, which is actually saddening. Even the PCI Express 3.0 lanes on Ryzen, once broken down across a load, leave you with just two 2.0 x8 links for those cards if you're into SLI and a PCIe SSD.

So I think I will still upgrade, as I would like to support AMD, but I play everything now at or near max at 1080p/144Hz, and with the scores a little wonky, I may even wait up to a year for revision 2.0 of the chips and mobos. Thoughts?
Well, if gaming is important to you, then it would be smart to wait. Not like you have much of a choice right now, as you can't buy a board to put one in, lol.
 
Actually, are we allowed to post links here if we know of a place or two selling the MSI board?
 
Wonder how much the gaming bug is going to hurt them right off the bat? I'm sure that will be a microcode/BIOS/driver fix, but every site is reporting the same for day one. I was going to join the hype train and upgrade my OC'd X5690, but other than a new mobo there's not a huge difference right now, which is actually saddening. Even the PCI Express 3.0 lanes on Ryzen, once broken down across a load, leave you with just two 2.0 x8 links for those cards if you're into SLI and a PCIe SSD.

So I think I will still upgrade, as I would like to support AMD, but I play everything now at or near max at 1080p/144Hz, and with the scores a little wonky, I may even wait up to a year for revision 2.0 of the chips and mobos. Thoughts?


To be honest, there is hardly any CPU that will get you 144fps to match your refresh rate in every game. Check out the reviews first; the issue is not in every game, just some games. In a lot of games it does fine. It seems to come down to how some games handle load in multithreaded code; they may be using virtual cores instead of actual cores. If the problem were across the board in every game, it would be more worrying, but that is not the case.
 
Yeah, I do a lot of MMOs and rarely ever have a problem. Also, Newegg has the MSI boards in stock at midnight EST on the 2nd if you want one, as well as... NM, Sabre PC sold out of the combos already, so they just have the boxed processors and heatsinks now.
 
There is some argument to be made that going up to 90fps improves the feel even more, but it is subtle. I haven't done it since my last CRT died in 2005, so I don't have a very good memory of how it feels, but I do recall that vsyncing the original Counter-Strike at 100Hz/100fps at 1600x1200 on my 22" Iiyama Visionmaster Pro 510 felt a little better than 60fps. It wasn't enough of a difference that I missed it a ton when I moved to my first flat panel, a 1920x1200 Dell 2405FPW, in 2005.

Above 90fps, I feel like the improvements are mostly placebo. Things for people to argue about in benchmarks and charts, but not really practically relevant in game.

It reminds me of the old Q3A benchmarks, when people were like "ha ha, your CPU sucks because it only gets 315fps in Q3A, but mine gets 350fps, clearly superior". Utter nonsense.
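
The frame-time math says the same thing:

```python
# Milliseconds per frame at each frame rate; the deltas shrink fast.
for fps in (60, 90, 144, 315, 350):
    print(f"{fps:3d} fps = {1000 / fps:.2f} ms/frame")
# 60 -> 16.67ms, 90 -> 11.11ms, 144 -> 6.94ms, 315 -> 3.17ms, 350 -> 2.86ms
```

Going from 60 to 90fps saves 5.6ms a frame; 315 vs. 350 is an argument over a third of a millisecond.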

Oh I hear you.....

Q3A was one of those titles. It just defined frame-rate arrogance. The genesis of that came from just how advanced the Q3 engine was for its time.

I was heavily invested in the Q3A world and ran one of the larger adult clans. We had a couple of guys running dual Coppermines with the SMP switch on, some with 3dfx SLI enabled, and some pretty exotic hardware for the time... water cooling, heat pipes, window cutouts and cold cathodes.

Of course, back then you had to build your own kit. The first AIO water coolers weren't around until after 2003. (Though that Kingwin piece of garbage might have been.)

But as the tech advanced and 60fps became possible, there were the same arguments: more would never be needed, and 60fps on its own was overkill.

What's old... is new again :p
 