Zen 2 Review Summary

What I would LOVE to see is a review of memory timings. I want to know if, say, 3600 CL15 speeds would be beneficial in any way. I mean, most of the reviews I see are 3200 CL14 or 3600 CL16/17 speeds. I know it sounds stupid to ask, but Infinity Fabric is very touchy when it comes to memory..... Really want to see how it behaves with the Zen 2 CPUs.

All of the reviews are 3200C14 or 3600C16, which are for the most part the same. I would love to see 3600C15 as well, or even 3466C14, which I could hit even with my POS B350 using a 2400G.

Also, tRFC was often set way too high on the last-gen boards, so it would be neat to see if latency could be improved by lowering that as well. I managed 62ns latency on the 2400G; much of that was credited to lowering the tRFC (quick cycles-to-nanoseconds math below).
https://hardforum.com/threads/post-your-ryzen-memory-speeds.1927366/page-9
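For anyone curious about the tRFC tuning, here's a minimal sketch of the cycles-to-nanoseconds conversion; the cycle counts below are made-up example values, not recommended settings:

Code:
# DDR4 "speed" ratings (e.g. 3200) are transfers per second in MT/s; the actual
# memory clock is half that, so one clock cycle lasts 2000 / speed nanoseconds.
def trfc_ns(trfc_cycles, ddr_speed):
    return trfc_cycles * (2000.0 / ddr_speed)

# Made-up example: a loose board default vs. a tuned value at DDR4-3200.
for cycles in (560, 312):
    print(f"tRFC {cycles} cycles @ DDR4-3200 = {trfc_ns(cycles, 3200):.0f} ns")

With numbers like these, a loose auto setting sits well above 300ns while a tuned one is closer to 200ns, which is the kind of gap that tends to show up in AIDA-style latency tests.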
 
But, AMD is advertising boost clocks that nobody seems to be really hitting.

My 3600X seems to be doing around 4.2-4.3 all-core at idle or light gaming; with Prime95 blend it settled into a rock-steady all-core 4.1 almost immediately, although (for now) I only ran that for a couple of minutes. But I may need some tweaking still--the BIOS (Asus X470-I, latest revision) is set to auto and runs it at 1.44-1.45V! The thing tends to idle around 48C. It's also possible I need to check the thermal paste on the HSF, because my 1600X idled around 35C @ 1.375V with the same fan. At any rate, with lower voltages it might go a little bit faster--I did see 4.35 show up here and there on individual cores.
 
This pretty much sums it up.
 

[Attachment: weighted.png]


That power consumption for the 3700X while trading blows with the 9700K and 9900K; I'll definitely use it in my next SFF build once I get my hands on the Sentry 2.0 case.
 
Interesting Reddit thread going on right now. It looks like BIOS updates may fix the CPUs not boosting to their listed values. If there's any truth here, reviews may need to be redone.

 

That power consumption for the 3700X while trading blows with the 9700K and 9900K; I'll definitely use it in my next SFF build once I get my hands on the Sentry 2.0 case.

So much this. In most SFF cases, you will not be able to hit those 4.7GHz MCE speeds that are being used as the basis of 9900K performance in all of these reviews.

Now you are looking at 4.3-4.4GHz speeds on the 9900K, while the 3700X will still be able to hit its all-core frequencies of 4.2GHz. At those speeds, the 3700X may very well match the 9900K in gaming while widening the gap in productivity.
 
Dear AMD,

Welcome back.


::looks over at his Athlon 700 and X2-3800 on his museum shelf::
You forgot the legendary Opteron 165 chip. That was my favorite o/cing era, next to my old Athlon T-bird and copper SK-6 with a 60mm Delta fan, later replaced by an AX-7 and 80mm fan. My parents were happy when that swap happened...
 
You forgot the legendary Opteron 165 chip. That was my favorite o/cing era, next to my old Athlon T-bird and copper SK-6 with a 60mm Delta fan, later replaced by an AX-7 and 80mm fan. My parents were happy when that swap happened...

Never owned one. I went from a PII-300 to an Athlon 700 with a secret white-box Asus mainboard, to a 1900XP, to a 2600 T-bred / 2500 Barton, to an X2-3800. I then switched my main rig to a Q6600, but had a Phenom 940 in my HTPC and an Athlon X4 640 in my server. I haven't run my main rig on AMD in a long, long time.
 
Never owned one. I went from a PII-300 to an Athlon 700 with a secret white-box Asus mainboard, to a 1900XP, to a 2600 T-bred / 2500 Barton, to an X2-3800. I then switched my main rig to a Q6600, but had a Phenom 940 in my HTPC and an Athlon X4 640 in my server. I haven't run my main rig on AMD in a long, long time.

I bet a lot of [H] users had a similar upgrade path. I had the 2500 Barton (beast of an OC chip), and my last AMD build was also an X2, followed by a Q6600 and Intel chips since then.
 
Just let it go. Let's all agree on the following:

-Intel has an edge in light to general gaming workloads in non-GPU-bound settings.

-Commend AMD on their rather large IPC and overall performance jump.

-Intel customers should be happy. Zen 2 forced Intel's hand on price drops across the board.

Conclusion:
Both are great chips, but as others have stated, the new Zen 2 chips seem to strike the right balance between solid game performance and professional workloads. Overall, IMO a 3600 + B450 provides the best bang for your buck. One other thing I find interesting: if you watch Gamers Nexus's review, they talk about frame latency and no longer recommend the 9600K, due to its lack of SMT, versus the 3600 now that it's out. Hope it's all food for thought.
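For reference on the frame latency point, here's a rough sketch of how "1% low" numbers are commonly derived from a frametime capture; the file name and column name are placeholders, and outlets differ on the exact convention:

Code:
import csv

# Hypothetical frametime export (e.g. a PresentMon-style CSV); the file name
# and column name are placeholders for whatever your capture tool writes out.
frametimes_ms = []
with open("frametimes.csv", newline="") as f:
    for row in csv.DictReader(f):
        frametimes_ms.append(float(row["msBetweenPresents"]))

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# One common convention: average the worst 1% of frametimes, then invert.
worst = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
low_1pct_fps = 1000.0 / (sum(worst) / len(worst))

print(f"avg FPS: {avg_fps:.1f}   1% low FPS: {low_1pct_fps:.1f}")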
 
Interesting Reddit thread going on right now. It looks like BIOS updates may fix the CPUs not boosting to their listed values. If there's any truth here, reviews may need to be redone.


Zen 2 CPUs hit boost clocks just fine. This entire thing is based on the annoyingly common misunderstanding that boost clocks are the same as an all-core boost. They're not. They never have been, nor does either company advertise them as such. Boost speeds are for single or lightly threaded loads only. Sometimes you can find CPUs that will reach an all-core manual overclock at their rated boost speeds (Intel's 9th-gen K-series chips, for example), but that is never a guarantee.
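If anyone wants to see the single-thread vs. all-core distinction on their own chip, here's a rough sketch using Python and psutil; it assumes Linux with per-core frequency reporting and is an illustration rather than a proper benchmark:

Code:
import multiprocessing, time
import psutil  # assumption: psutil is installed

def burn(seconds):
    end = time.time() + seconds
    while time.time() < end:
        pass  # busy loop to keep one logical core fully loaded

def sample(label):
    time.sleep(2)  # give the boost algorithm a moment to settle
    freqs = [f.current for f in psutil.cpu_freq(percpu=True)]
    print(f"{label}: fastest core at {max(freqs):.0f} MHz")

if __name__ == "__main__":
    # Single-threaded load: should get close to the advertised boost clock.
    p = multiprocessing.Process(target=burn, args=(8,))
    p.start(); sample("1 thread"); p.join()

    # All-core load: expect a lower, sustained all-core clock instead.
    procs = [multiprocessing.Process(target=burn, args=(8,))
             for _ in range(multiprocessing.cpu_count())]
    for p in procs: p.start()
    sample("all cores")
    for p in procs: p.join()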
 
PSA for Sandy/Ivy Bridge holdouts (probably less than 10 on here lol): Your stone age chip is OBSOLETE. My 4930K @ 4.5 is worth about as much as a Ryzen 5 1600 (non-X). :eek::eek: I've suddenly lost the last bit of interest in 9900K.

https://hardforum.com/threads/amd-ryzen-r9-3900x-review-round-up.1983744/page-3#post-1044255071
What does all of this have to do with 9900k?

The 9900K is still faster in most of the stuff most people really care about, and as it is, it isn't that much more expensive. Certainly, going full Ryzen with a 3900X and an X570 motherboard is much more expensive, all things considered.

A 3700X + X470 mobo will be cheaper, but not really by all that much. It's not like it should make that much of a difference.

And both platforms are pretty much dead at this point anyway...
 
Not a single review of the Asus X570 WS Pro ACE so far.

I'm torn between getting 3700x, 3800x, and 3900x :|
None of the reviews have shown me anything that would make my choice easier. I guess I'll have to wait for users to start testing their kits before I can make up my mind.

A few troubling things I've seen: there are supposed to be some FPS jitters affecting some systems running the 3700X? I hope this was just a bad press sample.
The higher cache and the two Zen 2 chiplets on the 3900X didn't seem to help a lot, and latency doesn't seem to be suffering either way? Someone would need to run some thread affinity tests, preferably with some more scientific documentation (a rough affinity-pinning sketch follows at the end of this post).

None of the tests were run on the older X370 or X470 boards, as far as I know? (only early X570 mobos)

Another thing is the heat. They seem to run hot.
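As a starting point for that kind of thread affinity testing, here's a minimal sketch that pins a workload to one chiplet's cores with psutil; the core ID ranges and the benchmark binary are hypothetical and would need to be checked against the real topology:

Code:
import subprocess
import psutil

# Hypothetical logical-core groupings for a 12c/24t part; which IDs actually
# map to which CCD depends on the OS and topology, so verify with lstopo/etc.
CCD0 = list(range(0, 12))
CCD1 = list(range(12, 24))

proc = subprocess.Popen(["./my_benchmark"])       # placeholder workload
psutil.Process(proc.pid).cpu_affinity(CCD0)       # pin it to one chiplet
proc.wait()
# Re-run pinned to CCD1, or split across both, and compare latency/FPS.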
 
What does all of this have to do with 9900k?

The 9900K is still faster in most of the stuff most people really care about, and as it is, it isn't that much more expensive. Certainly, going full Ryzen with a 3900X and an X570 motherboard is much more expensive, all things considered.

A 3700X + X470 mobo will be cheaper, but not really by all that much. It's not like it should make that much of a difference.

And both platforms are pretty much dead at this point anyway...

What does it have to do with the 9900K? Well, given I'm still decently satisfied with the performance of my machine (4930K @ 4.5, 980 Ti @ 1500/8000), I didn't realize how much performance I was leaving on the table with my aging hardware. So if a 3700X will already provide a 20% boost, then the last 10-15% difference is just diminishing returns at that point.

The 3700X is already $150 cheaper than the 9900K. Assuming the same mobo price, that's $150 saved. For that $150, I could upgrade to the next tier in GPU performance (except the 2080 to 2080 Ti jump), which will have a far more meaningful impact than worrying about CPU bottlenecks. Z390 is a dead end, yes, but X470 may still have some life left, so the possibility of a drop-in upgrade is a plus, though it's not guaranteed and it's not a sway factor for me anyway.
 
What does it have to do with the 9900K? Well, given I'm still decently satisfied with the performance of my machine (4930K @ 4.5, 980 Ti @ 1500/8000), I didn't realize how much performance I was leaving on the table with my aging hardware. So if a 3700X will already provide a 20% boost, then the last 10-15% difference is just diminishing returns at that point.

The 3700X is already $150 cheaper than the 9900K. Assuming the same mobo price, that's $150 saved. For that $150, I could upgrade to the next tier in GPU performance (except the 2080 to 2080 Ti jump), which will have a far more meaningful impact than worrying about CPU bottlenecks. Z390 is a dead end, yes, but X470 may still have some life left, so the possibility of a drop-in upgrade is a plus, though it's not guaranteed and it's not a sway factor for me anyway.

Or you could stick with the 4930 and get a 2080ti, that would probably get the best gaming performance hahaha.
 
Or you could stick with the 4930 and get a 2080ti, that would probably get the best gaming performance hahaha.

lmao there's no way in hell I'm shelling out $1200 for a facking GPU that's not even a full chip. That's just fucking retarded, and I'm certainly not going to contribute to Leatherman's jacket fund anymore. If I were in the market for a GPU, I'd look for deals on used 1080 Ti's, or maybe see how 5700XT pans out once drivers improve and AIB versions are released.

I once promised myself I wouldn't upgrade the GPU until I can get Titan Xp performance for $400, and we're still not quite there just yet. Plus 980 Ti @ 1500/8000 is somewhere between 1070 and 1070 Ti performance, so still plenty good for 1440p/60 gaming.
 
My 3600X seems to be doing around 4.2-4.3 all-core at idle or light gaming; with Prime95 blend it settled into a rock-steady all-core 4.1 almost immediately, although (for now) I only ran that for a couple of minutes. But I may need some tweaking still--the BIOS (Asus X470-I, latest revision) is set to auto and runs it at 1.44-1.45V! The thing tends to idle around 48C. It's also possible I need to check the thermal paste on the HSF, because my 1600X idled around 35C @ 1.375V with the same fan. At any rate, with lower voltages it might go a little bit faster--I did see 4.35 show up here and there on individual cores.

Yeah, been watching Bearded Hardware's overclocking stream VOD; he got his 3700X down to 1.25V @ 4.3GHz all-core, so it's possible the board manufacturers overcompensated on default voltages. Wouldn't be surprised if we see some BIOS updates fixing the default voltage.
 
Seems like there is no reason to OC all cores; just let boost do its thing if you're mostly just gaming. Going to be testing my 3600 with the B450 chipset on an Asus Strix ITX board.

This was one of the points I tried to make. Unfortunately, as I noted, I rarely saw the CPU hit its advertised boost clocks. I had to use the +200MHz offset to do it, and even then it still fell 100MHz short. There may indeed be BIOS issues with these boards. Every reviewer had access to and ended up using the same exact BIOS revisions.
 
I did, and Intel is still better at single-core perf and games, while AMD excels at productivity, because for the same money you can get more cores, including 12 cores now and 16 cores two months from now, and higher-performing SMT.
I do not care about CPU rendering and CPU encoding, but I do care about single-core performance and games, so I got Intel, and I intend to OC it to at least 5GHz, if not 5.2GHz, which I should be able to do. I even got a 700W PSU for that :)

I was considering the Ryzen 3900X, but while it would suffice for my needs as well as the 9900K will, it is still worse than the 9900K where it matters for me, and it's not even available now, unlike the Intel setup which I will be running next week. I waited long enough for Ryzen already. Waiting no more.

That said I think most people will get Ryzen now. It is after all pretty awesome product line for what it is.

I was debating between the two and went with the 3900X; well, I went with everything but that for now since it's OOS. After looking at the reviews, I saw that at 1440p on a 144Hz G-Sync display they are virtually the same. Lately I've been doing a lot of editing and encoding of 4K drone and GoPro videos, which was a clear win for AMD. I actually bought a 2-CPU 12c/12t 1U server from eBay and used a PCIe riser card with a GTX 1060 6GB for some CUDA acceleration as a dedicated editing station, because the process was just taking too much time on my main system, making it useless for any other tasks that were even moderately CPU-intensive.
 
lmao there's no way in hell I'm shelling out $1200 for a facking GPU that's not even a full chip. That's just fucking retarded, and I'm certainly not going to contribute to Leatherman's jacket fund anymore. If I were in the market for a GPU, I'd look for deals on used 1080 Ti's, or maybe see how 5700XT pans out once drivers improve and AIB versions are released.

I once promised myself I wouldn't upgrade the GPU until I can get Titan Xp performance for $400, and we're still not quite there just yet. Plus 980 Ti @ 1500/8000 is somewhere between 1070 and 1070 Ti performance, so still plenty good for 1440p/60 gaming.

The 980 Ti was an amazing card for its time, and it still gives the performance of a $300 card. I love mine.

That said...

Once you go 1440P 144hz, the 2080/2080ti make a lot more sense, and pairing them with say a 3900X would help increase the FPS in those “in the moment” drops to quote Digital Foundry.
 
I was debating between the two and went with the 3900X; well, I went with everything but that for now since it's OOS. After looking at the reviews, I saw that at 1440p on a 144Hz G-Sync display they are virtually the same. Lately I've been doing a lot of editing and encoding of 4K drone and GoPro videos, which was a clear win for AMD. I actually bought a 2-CPU 12c/12t 1U server from eBay and used a PCIe riser card with a GTX 1060 6GB for some CUDA acceleration as a dedicated editing station, because the process was just taking too much time on my main system, making it useless for any other tasks that were even moderately CPU-intensive.
With that use case, the 3900X will definitely be the better choice of the two.

Maybe even wait for the 3950X?
 
Zen 2 CPUs hit boost clocks just fine. This entire thing is based on the annoyingly common misunderstanding that boost clocks are the same as an all-core boost. They're not. They never have been, nor does either company advertise them as such. Boost speeds are for single or lightly threaded loads only. Sometimes you can find CPUs that will reach an all-core manual overclock at their rated boost speeds (Intel's 9th-gen K-series chips, for example), but that is never a guarantee.

I mean, Anand is redoing their entire test suite for both CPUs because they found an updated BIOS significantly improved boost clocks. So yeah, it looks like the boost clocks definitely were off.
 
One thing I really really hate is when people downplay Intel. And I have seen that all morning.

I've read the reviews all over the internet this morning along with videos on Youtube.

One thing is abundantly clear ... the Intel 9900K @ stock is still faster in gaming. If you have it OC'd to 5Ghz which many of us do, then forget it.

Are you a gamer, or do you encode videos?

All of a sudden everyone sounds like productivity is more important, yet if you look at the numbers, 99% of the people with PCs are gamers.
And 99.99% of gamers are GPU-limited, so the small advantage Intel still has is moot.
 
So in summary, if you game at higher resolutions and want to get it for gaming as they are marketing it, you're better off keeping your current PC (if it was built in the last 6 years) and getting a 2080 Ti.

Yep, that's basically been the conclusion for every cpu review in the past 6 years.
 
Overall, the Ryzen 3000 series is rather impressive, though the gaming and single-threaded performance could still use more work. Other than that, the 3900X pretty much crushed the i9 9900K, but it's a rather unfair comparison because the 3900X is a 12-core/24-thread CPU.

Bit disappointed with the limited overclocking though; AMD CPUs were never that great at overclocking from what I've seen. Hopefully future BIOS revisions will address this issue.
 
Bit disappointed with the limited overclocking though; AMD CPUs were never that great at overclocking from what I've seen. Hopefully future BIOS revisions will address this issue.

I think it is a limitation of the current TSMC 7nm node.
 
Unless you play at 1080p high refresh there is no point in going Intel atm. AMD is winning atm.
 
Yeah, I should have not waited for AMD... :(

Don't mean to be rude, but that is dumb. At the very least, Intel may be dropping prices soon, so you can benefit that way. If a major CPU/GPU release is coming very soon at the point you're considering buying, it's usually prudent to wait, no matter which brand or product you're most interested in.
 
What does all of this have to do with 9900k?

The 9900K is still faster in most of the stuff most people really care about, and as it is, it isn't that much more expensive. Certainly, going full Ryzen with a 3900X and an X570 motherboard is much more expensive, all things considered.

A 3700X + X470 mobo will be cheaper, but not really by all that much. It's not like it should make that much of a difference.

And both platforms are pretty much dead at this point anyway...
Dude, you have made your point. Stop beating a dead horse. This is why competition is good... Just leave this thread be...
 