Who's Happy So Far with Ryzen 3000 Series?

3600 and 3900x.

Both game like champions. The 3900X is just absurdly powerful for the price.
 
I posted in the Zen 2 Memory Speeds thread this morning. Not really happy yet. I'm trying to get B-die RAM rated for 4000 19-19-19-39 stable at 3733 with decent timings and low latency, without much luck so far. I want to get everything worked out, then reinstall a fresh copy of Windows. One crash I had yesterday broke Windows networking pretty hard, and I had to use some Google-fu to fix it. 3900X on an Asus Crosshair VIII Hero. I'm sure once I get stable, I'll be happy. Maybe another new BIOS is needed.
 
MemTest86 can be booted directly from a live disc/flash drive.

If you're going to really push your RAM, that's a far safer and more logical way to do it than putting your HDD at risk of data corruption.

https://www.memtest86.com/download.htm

Many Linux installs include memtest86 as a boot target already, or will after you apt install it. But it's always good to have it handy on a flash drive somewhere.
 
MemTest86 can be booted directly from a live disc/flash drive.

If you're going to really push your RAM, that's a far safer and more logical way to do it than putting your HDD at risk of data corruption.

Yeah, tonight I'm going to go that route with MemTest86. I've never had RAM throw errors like that. I'm kind of pissed that this expensive RAM is requiring such high voltage to get to where most people seem to get without issue. I guess I've been lucky and always just taken RAM for granted. I'm going to reinstall Windows either way, and that's always been the plan, so I'm not really worried about disk corruption at this point. I still have another working computer to fall back on. I've noticed you've been very active posting about the 3900X. Can you check out my post in the Zen memory speed thread and see if you can give me any pointers?
 
I haven't needed to adjust mine from the XMP profile since it worked correctly out of the box. Testing has shown that my 3200MHz RAM is doing better than some other 3900X users' 3200MHz RAM, so I haven't felt the need to tweak it. So I wouldn't be able to tell you much first-hand.

I can only suggest that if the RAM is not behaving correctly at the XMP profile specs (via selecting the XMP profile rather than trying to manually recreate it), then you should test with one stick at a time and see if the issue can be narrowed down to an actual hardware fault, be that a stick or a slot. Both could contribute to memory errors, in addition to the normal settings/AGESA support issues.
 
Wish the games I play used AMD more effectively. I only play CSGO and DCS combat sim. Priority is frequency in these games, and my Intel 8086K @ 5GHz is faster :(. Once AMD can match Intel in my games I'll jump ship, but I'm not going to spend $ on an "upgrade" and lose speed. I am very impressed with the benches, and for people who play more typical games and have workloads that can scale across cores, AMD has an advantage IMO, especially considering bang for the $. If I was starting from scratch and didn't have this PC, I would buy a new AMD though. Glad they are bringing some competition back to the game.
 
Went from X99 6800K to X570 and 3700X, couldn't be happier with the switch.

Originally I wanted a 3900X, but that never happened due to the low availability of that CPU in my region. So I got the 3700X as a second choice.

I expected it to be more of a side-grade than an upgrade until I could get my hands on a 3900X or 3950X, but I was never so wrong in my entire life.

The 3700X in multi threading tasks (which is my main priority) is over 70% faster than the 6800K was. So I'm not looking back, and I'm going to hold off on getting a 3900X or 3950X, because the 3700X is good enough until I get the upgrade itch again.
 
The 3700X in multi threading tasks (which is my main priority) is over 70% faster than the 6800K was.
Where do you notice this the most? I wouldn't have expected there would be much difference.
 
Where do you notice this the most? I wouldn't have expected there would be much difference.
Everywhere, the whole system feels more responsive. Apps load faster, and I can work much faster in 3D editing software and Resolve.

The most glaring difference is when playing XCOM 2; it feels like a different game. I used to get all kinds of stuttering, lagging, and slowdowns in it; now it's almost completely smooth.

The 70% is based on the score difference between the 6800k and the 3700x in Cinebench 20.
 
Decided to play around with all-core overclocking: 3900X at 4.3GHz, 1.325V.

(screenshot attached)
 
I went from a 2500k to an r5 1600x, and now have a 3600. The jump to the 3600 is way bigger. I feel it was well worth the $200 cost to replace the 1600x, and over time the electricity savings will help offset some of that.

I have a custom loop with an old swiftech block that used to cool the 2500k. Well, the temps I get with the 3600 are insane. It uses so little power that it idles just above ambient, about 30C, and peak load temps are in the low 50C range.

So, considering the low power consumption it is hard for me to wrap my mind around how fast it is. FPS is up everywhere I wasn't badly GPU bound, and even then I see improvements. There is less stuttering.

Finally, my memory actually works at the rated speed. I set the XMP profile and haven't had to touch it. With the 1600x, I could never get my memory faster than 2933 MHz. I'm sure this only accounts for a small part of the speed difference, but it is nice to get the performance out of my ram that I paid for. That stuff was expensive when I bought it!

I'll wrap up by adding that the old 1600x lives on as a server (domain controller, hyper-v host and Plex server). When I bought the 3600 I added an open box B350 motherboard for $20 more (love microcenter), and grabbed 16gb of 2400MHz ddr4 for another $40 from newegg. So $60 out of pocket and it blows away the G3248 / z97 combo I used to have in the basement.
 
I went from a 2500k to an r5 1600x, and now have a 3600. The jump to the 3600 is way bigger. I feel it was well worth the $200 cost to replace the 1600x, and over time the electricity savings will help offset some of that.

I have a custom loop with an old swiftech block that used to cool the 2500k. Well, the temps I get with the 3600 are insane. It uses so little power that it idles just above ambient, about 30C, and peak load temps are in the low 50C range.

So, considering the low power consumption it is hard for me to wrap my mind around how fast it is. FPS is up everywhere I wasn't badly GPU bound, and even then I see improvements. There is less stuttering.

Finally, my memory actually works at the rated speed. I set the XMP profile and haven't had to touch it. With the 1600x, I could never get my memory faster than 2933 MHz. I'm sure this only accounts for a small part of the speed difference, but it is nice to get the performance out of my ram that I paid for. That stuff was expensive when I bought it!

I'll wrap up by adding that the old 1600x lives on as a server (domain controller, hyper-v host and Plex server). When I bought the 3600 I added an open box B350 motherboard for $20 more (love microcenter), and grabbed 16gb of 2400MHz ddr4 for another $40 from newegg. So $60 out of pocket and it blows away the G3248 / z97 combo I used to have in the basement.

Glad you like your 3600, but just to clarify: you'll never realistically see cost savings in electricity over the useful life of the chip unless you run it at 100% load, 24/7/365, for years. The difference is 30W at load (at idle it's pretty much a wash). 95W at 24/7 load is 2.28 kWh a day = $0.23, and 65W at 24/7 load is 1.56 kWh a day = $0.16, so a $0.07-a-day difference. At $0.07 a day it would take 2,857 days, or 7.83 years, to make back $200. The reality is you probably aren't going to be running it at 100% load 24/7/365, so you will likely NEVER see the cost offset. This is calculated at $0.10 a kWh, so the math changes a little here and there depending on your rates.

Long story short - electricity is such a small factor for home users it's almost negligible.
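For anyone who wants to plug in their own rate, the arithmetic above boils down to a few lines. This is just a sketch; the 95W/65W figures and $0.10/kWh default are the same assumptions used above, and the function name is mine:

```python
# Payback time for a CPU upgrade based purely on electricity savings,
# assuming both chips are hypothetically pinned at 100% load 24/7.
def days_to_recoup(price_usd, old_watts=95, new_watts=65, rate=0.10):
    """rate is the electricity price in $ per kWh."""
    saved_kwh_per_day = (old_watts - new_watts) * 24 / 1000  # 0.72 kWh/day
    saved_usd_per_day = saved_kwh_per_day * rate
    return price_usd / saved_usd_per_day

# At $0.10/kWh the $200 upgrade takes ~2778 days (~7.6 years) to pay back;
# the 2,857-day figure above comes from rounding the savings down to $0.07/day.
print(round(days_to_recoup(200)))             # 2778
print(round(days_to_recoup(200, rate=0.24)))  # 1157 (Northeast-style rates)
```

Scaling the rate up or down moves the payback time linearly, which is why the later posts at $0.15 and ~$0.24 per kWh land on roughly 5 and 3 years respectively.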
 
Glad you like your 3600, but just to clarify: you'll never realistically see cost savings in electricity over the useful life of the chip unless you run it at 100% load, 24/7/365, for years. The difference is 30W at load (at idle it's pretty much a wash). 95W at 24/7 load is 2.28 kWh a day = $0.23, and 65W at 24/7 load is 1.56 kWh a day = $0.16, so a $0.07-a-day difference. At $0.07 a day it would take 2,857 days, or 7.83 years, to make back $200. The reality is you probably aren't going to be running it at 100% load 24/7/365, so you will likely NEVER see the cost offset. This is calculated at $0.10 a kWh, so the math changes a little here and there depending on your rates.

Long story short - electricity is such a small factor for home users it's almost negligible.

You're quoting lower pricing than I get, for sure. Here in the Northeast we can expect almost 22¢ to 25¢ per kWh including delivery charges, etc. That would bring your estimate down to about three years. Still quite some time... but within reach.
 
You're quoting lower pricing than I get, for sure. Here in the Northeast we can expect almost 22¢ to 25¢ per kWh including delivery charges, etc. That would bring your estimate down to about three years. Still quite some time... but within reach.

Only if you are running it 24/7/365 at 100% load ;).

I'm in Pennsylvania and I'm around $0.11.
 
Glad you like your 3600, but just to clarify: you'll never realistically see cost savings in electricity over the useful life of the chip unless you run it at 100% load, 24/7/365, for years. The difference is 30W at load (at idle it's pretty much a wash). 95W at 24/7 load is 2.28 kWh a day = $0.23, and 65W at 24/7 load is 1.56 kWh a day = $0.16, so a $0.07-a-day difference. At $0.07 a day it would take 2,857 days, or 7.83 years, to make back $200. The reality is you probably aren't going to be running it at 100% load 24/7/365, so you will likely NEVER see the cost offset. This is calculated at $0.10 a kWh, so the math changes a little here and there depending on your rates.

Long story short - electricity is such a small factor for home users it's almost negligible.

This is a lesson in making assumptions. I pay about $0.15 per kWh, and I do in fact run the CPU at 100% load around the clock. When I'm not playing games, my PC is used for mining crypto. This was a significant consideration for upgrading, because the hashrate and hash/watt are both much higher for the 3000-series Ryzen chips.

Correcting your numbers for my electricity cost, I'm saving $.108 per day. I can tell you I will absolutely notice the $3 a month my bill drops. That's also the difference between mining at a virtual break-even point vs. mining for a profit at the current prices.

Another key point is I said "over time the electricity savings will help offset some of that" referring to the $200 cost of the 3600. Notice I did not say ALL, but SOME of the cost. If I mine for a year on it I'll have saved $36, or 18% of the purchase price. I can't say for sure how long I'll be mining on the CPU but it should be at least a year, maybe more. I'll probably be playing games on this processor for 6 or 7 years barring a miracle breakthrough in tech.

And finally, even if I wasn't using the CPU at 100% load around the clock what I said initially is absolutely correct. Using less electricity saves money. It may not be a lot, or significant to you, but even someone who only plays games on their PC and pays less for power than I do is going to save some money on their electric bill by upgrading from a 1600x to a 3600.
 
This is a lesson in making assumptions. I pay about $0.15 per kWh, and I do in fact run the CPU at 100% load around the clock. When I'm not playing games, my PC is used for mining crypto. This was a significant consideration for upgrading, because the hashrate and hash/watt are both much higher for the 3000-series Ryzen chips.

Correcting your numbers for my electricity cost, I'm saving $.108 per day. I can tell you I will absolutely notice the $3 a month my bill drops. That's also the difference between mining at a virtual break-even point vs. mining for a profit at the current prices.

Another key point is I said "over time the electricity savings will help offset some of that" referring to the $200 cost of the 3600. Notice I did not say ALL, but SOME of the cost. If I mine for a year on it I'll have saved $36, or 18% of the purchase price. I can't say for sure how long I'll be mining on the CPU but it should be at least a year, maybe more. I'll probably be playing games on this processor for 6 or 7 years barring a miracle breakthrough in tech.

And finally, even if I wasn't using the CPU at 100% load around the clock what I said initially is absolutely correct. Using less electricity saves money. It may not be a lot, or significant to you, but even someone who only plays games on their PC and pays less for power than I do is going to save some money on their electric bill by upgrading from a 1600x to a 3600.

I think my point still stands that it doesn't make much sense to spend $200 on an upgrade just to save electricity. Upgrade because the processor is faster. If you are choosing between the 3600 and the 3600x then sure calculate in the potential electricity cost savings.

Personally, I haven't figured out if mining on the 3600 is worth it or not yet. I get about $0.30 a day with approximately $0.20 in power cost just for the CPU. Factoring in other components, it's probably a wash. Probably better just to buy crypto in my case. Mining on the 1600X probably wouldn't pay for itself at this point.
 
This is a lesson in making assumptions. I pay about $0.15 per kWh, and I do in fact run the CPU at 100% load around the clock. When I'm not playing games, my PC is used for mining crypto. This was a significant consideration for upgrading, because the hashrate and hash/watt are both much higher for the 3000-series Ryzen chips.

Correcting your numbers for my electricity cost, I'm saving $.108 per day. I can tell you I will absolutely notice the $3 a month my bill drops. That's also the difference between mining at a virtual break-even point vs. mining for a profit at the current prices.

Another key point is I said "over time the electricity savings will help offset some of that" referring to the $200 cost of the 3600. Notice I did not say ALL, but SOME of the cost. If I mine for a year on it I'll have saved $36, or 18% of the purchase price. I can't say for sure how long I'll be mining on the CPU but it should be at least a year, maybe more. I'll probably be playing games on this processor for 6 or 7 years barring a miracle breakthrough in tech.

Mining crypto is incredibly niche. The vast majority of users are not pegging their gear at 100% all the time. It's a safe assumption to make.

And finally, even if I wasn't using the CPU at 100% load around the clock what I said initially is absolutely correct. Using less electricity saves money. It may not be a lot, or significant to you, but even someone who only plays games on their PC and pays less for power than I do is going to save some money on their electric bill by upgrading from a 1600x to a 3600.

You'd save money on power but not in total. If you are that cash-strapped, then sticking with the 1600X is the far smarter choice.
 
Wish the games I play used AMD more effectively. I only play CSGO and DCS combat sim. Priority is frequency in these games, and my Intel 8086K @ 5GHz is faster :(. Once AMD can match Intel in my games I'll jump ship, but I'm not going to spend $ on an "upgrade" and lose speed. I am very impressed with the benches, and for people who play more typical games and have workloads that can scale across cores, AMD has an advantage IMO, especially considering bang for the $. If I was starting from scratch and didn't have this PC, I would buy a new AMD though. Glad they are bringing some competition back to the game.
What are you talking about? CSGO?

https://www.techspot.com/amp/review/1877-core-i9-9900k-vs-ryzen-9-3900x/
The 3900X outruns the 9900K, overclocked or not...

Wasn't able to find a comparison for DCS (admittedly didn't look long), but just wanted to make sure anyone playing CSGO didn't get the wrong impression.
 
Man, I really wish I could be happy and enjoy my system. This has been the most expensive and most difficult-to-stabilize computer I've ever built. Out of probably 10 attempts at getting different RAM speeds and timings to pass a full MemTest86 run (pre-Windows), I've only been successful twice. Most attempts make it 2 or 3 hours before the machine reboots. One successful run was with the whole system at defaults, RAM at 2133. The other was with RAM at 3600 16-16-16-32 1.42V last night, but after passing, the computer will not boot now.

When MemTest86 fails, I get no errors first; the computer just reboots. The logs are unremarkable: there is no error, it just stops in the middle. Memory tests in Windows like HCI or the one built into the Ryzen DRAM Calculator (same one) usually do the same thing, no errors, just reboots. I did get errors one time, but that was 3733 at super-tight timings.

I have G.Skill B-die rated for 19-19-19-39 at 4000 1.35V (setting that XMP profile stops the computer from booting), and I've tried all kinds of settings from the Ryzen DRAM Calculator for 3733 and now 3600. FCLK will not boot at 1900 but runs just fine at 1866. I can run apps fine and pass Cinebench and some AIDA stress tests for 30 minutes or so with pretty much whatever I try, tight timings included, but the memory system is kicking my butt when tested. Or maybe I'm missing something else.

I'm running the CPU at a -0.10 offset and SOC at 1.1V. I can change RAM voltage in BIOS from, say, 1.42 to 1.415 and then it won't boot and I have to clear CMOS. I've noticed that Ryzen Master doesn't report any RAM voltage, it just shows 0, but I'm not sure if that's because Windows has crashed on memtests so many times or what. HWiNFO run as an executable does report RAM voltage. Either way, I'm booting into MemTest86, where Windows isn't even a factor. Sorry for the wall of text; I'm venting.
 
You haven't tested it one stick at a time to see if it's a real hardware fault vs. just some bad settings being auto-chosen by the system?

Four sticks of new RAM are always four potential points of failure in a new build, regardless of what you're putting them in.

I'd test each stick one at a time: once at stock, then at 3600, then at 4000, all at the rated voltage it's supposed to run at (not manually pushed higher), and all in the same RAM slot. If they all fail, or your first stick begins to fail, try another slot and repeat.

You can narrow down the issue and remove the most variables that way. If you have one stick passing in a slot and another failing, you know the slot isn't the problem; the stick is. If they all fail in a given slot but not in another, you may have a bad motherboard. If they all fail in all the slots, then it could be many things...

I'd go in that direction first before giving up or getting buyer's remorse. Also, when doing these RAM tests, I'd use default core voltages to remove any other external variables.

edit: Also, pay attention after booting with a single stick to the detected timings and DRAM settings. There's always the chance a stick from a different set of RAM got mixed in with the rest at the factory/reseller, and the mixed RAM is the source of the problem. They'd all pass individually in that case, but not play well together.
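That elimination logic can be sketched as a little decision helper. This is purely illustrative, not part of any tool; the stick/slot names and pass/fail results are made up:

```python
# Given MemTest86 pass/fail results for (stick, slot) combinations,
# point the finger at a bad stick, a bad slot, or "could be many things".
def diagnose(results):
    """results: dict mapping (stick, slot) -> True if that run passed."""
    sticks = {stick for stick, _ in results}
    slots = {slot for _, slot in results}
    # A stick/slot is suspect if it never passes anywhere it was tried.
    bad_sticks = {s for s in sticks
                  if not any(ok for (stick, _), ok in results.items() if stick == s)}
    bad_slots = {s for s in slots
                 if not any(ok for (_, slot), ok in results.items() if slot == s)}
    if not any(results.values()):
        return "everything fails - could be board, memory controller, or settings"
    if bad_sticks:
        return f"suspect stick(s): {sorted(bad_sticks)}"
    if bad_slots:
        return f"suspect slot(s): {sorted(bad_slots)}"
    return "no single stick or slot is consistently bad - mixed-set or settings issue"

# Example: stick B fails in both slots, stick A passes everywhere.
runs = {("A", "A2"): True, ("A", "B2"): True,
        ("B", "A2"): False, ("B", "B2"): False}
print(diagnose(runs))  # suspect stick(s): ['B']
```

The point of the ordering is the same as in the post: only blame a specific stick or slot once you have at least one passing run to compare against; if everything fails everywhere, you haven't isolated anything yet.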
 
You haven't tested it one stick at a time to see if it's a real hardware fault vs. just some bad settings being auto-chosen by the system?

Four sticks of new RAM are always four potential points of failure in a new build, regardless of what you're putting them in.

I'd test each stick one at a time: once at stock, then at 3600, then at 4000, all at the rated voltage it's supposed to run at (not manually pushed higher), and all in the same RAM slot. If they all fail, or your first stick begins to fail, try another slot and repeat.

You can narrow down the issue and remove the most variables that way. If you have one stick passing in a slot and another failing, you know the slot isn't the problem; the stick is. If they all fail in a given slot but not in another, you may have a bad motherboard. If they all fail in all the slots, then it could be many things...

I'd go in that direction first before giving up or getting buyer's remorse. Also, when doing these RAM tests, I'd use default core voltages to remove any other external variables.

edit: Also, pay attention after booting with a single stick to the detected timings and DRAM settings. There's always the chance a stick from a different set of RAM got mixed in with the rest at the factory/reseller, and the mixed RAM is the source of the problem. They'd all pass individually in that case, but not play well together.

Thanks for the reply. It's a two-stick kit (2 × 16GB). I did run a test of each stick alone in slot A2. Both failed, but I wasn't running stock speed. Like I said, the system will pass with all settings at stock, but that sets the RAM to 2133 CAS 15 1.20V. Maybe I'll try each stick alone at defaults and see if one faults faster. The fact that I made it through with all 32GB at 3600 16-16-16-32 is what befuddles me. That would seem to indicate the RAM is fine; it just doesn't want to work with my board, CPU, and BIOS any faster. At this point, though, I'm open to trying anything. This was fairly expensive RAM, and I'm pissed that I'm not able to run it faster than RAM at half the price. I know that dual-rank sticks are tough on the system, though.
 
Thanks for the reply. It's a two-stick kit (2 × 16GB). I did run a test of each stick alone in slot A2. Both failed, but I wasn't running stock speed. Like I said, the system will pass with all settings at stock, but that sets the RAM to 2133 CAS 15 1.20V. Maybe I'll try each stick alone at defaults and see if one faults faster. The fact that I made it through with all 32GB at 3600 16-16-16-32 is what befuddles me. That would seem to indicate the RAM is fine; it just doesn't want to work with my board, CPU, and BIOS any faster. At this point, though, I'm open to trying anything. This was fairly expensive RAM, and I'm pissed that I'm not able to run it faster than RAM at half the price. I know that dual-rank sticks are tough on the system, though.


You are reminding me of the issues I had with two different Micron D/E-based 32GB kits... they would boot and even OC to 3600C16, and I could run things fine, but any kind of RAM stress test would fail with a reboot 30min-3hrs in... no Windows hardware errors, just a generic BSOD.


You mentioned you are running SoC at 1.1V...Have you monitored the actual SoC voltage from something like HWinfo64? The reason I ask is that mine was set at 1.1V but it took increasing the SoC LLC to level 2 to keep it there under load. It would dip to 1.087V otherwise.

It's just a thought in case you had not thought of it... I really feel for you. I know you spent some decent coin on 4000-rated B-die, and to not have it work properly really sucks.
 
I'm running 64GB of 3200MHz 14-14-14 G.Skill RAM at the XMP settings (no manual DRAM settings) and it's working _great_.

You mention it passed at 3600 but then failed to boot after rebooting. Does your motherboard have a DRAM boot voltage setting? Asus tends to differentiate the DRAM voltage setting from the DRAM boot voltage setting. If you require 1.42 volts to run at 3600 and never adjusted the boot voltage, it may still be at the stock DRAM boot voltage of around 1.25 volts, instead of even the 1.35 volts common to most >2133 speeds. You'd think the XMP profile would handle that stuff, but that'll vary motherboard to motherboard, I'm sure. It might also be good to give yourself DRAM current headroom and load-line calibration boosts over stock when running at these really high frequencies that need more than 1.35 volts to be stable.

DRAM modules also have temp sensors, which are probably not visible in MemTest86, but you should be able to see them in monitoring software in your regular OS during stress tests, just to make sure your RAM isn't overheating.

edit: I always go through the digital power controls in the motherboard and boost load-line calibration and current limits for basically everything... that's why there are those big heatsinks on these premium boards :)
 
Glad you like your 3600, but just to clarify: you'll never realistically see cost savings in electricity over the useful life of the chip unless you run it at 100% load, 24/7/365, for years. The difference is 30W at load (at idle it's pretty much a wash). 95W at 24/7 load is 2.28 kWh a day = $0.23, and 65W at 24/7 load is 1.56 kWh a day = $0.16, so a $0.07-a-day difference. At $0.07 a day it would take 2,857 days, or 7.83 years, to make back $200. The reality is you probably aren't going to be running it at 100% load 24/7/365, so you will likely NEVER see the cost offset. This is calculated at $0.10 a kWh, so the math changes a little here and there depending on your rates.

Long story short - electricity is such a small factor for home users it's almost negligible.

You also have to cool the heat from your PC/ Run the AC more. (In Alabama here with a 8320/290x and 1700/rx580 system, I would know..................)
 
You are reminding me of the issues I had with two different Micron D/E-based 32GB kits... they would boot and even OC to 3600C16, and I could run things fine, but any kind of RAM stress test would fail with a reboot 30min-3hrs in... no Windows hardware errors, just a generic BSOD.


You mentioned you are running SoC at 1.1V...Have you monitored the actual SoC voltage from something like HWinfo64? The reason I ask is that mine was set at 1.1V but it took increasing the SoC LLC to level 2 to keep it there under load. It would dip to 1.087V otherwise.

It's just a thought in case you had not thought of it... I really feel for you. I know you spent some decent coin on 4000-rated B-die, and to not have it work properly really sucks.

I've set LLC to levels 2 and 3 and duty cycle to 120 per the DRAM calculator. I actually have SOC voltage at 1.12 or something like that, as BIOS reports it at 1.087 when set to 1.1. Maybe I need to go a little higher. I also have DRAM voltage at 1.42 to try to keep it over 1.41 in BIOS. I'll put a load on it tonight and check HWiNFO. My board even has voltage monitoring points for most of this stuff; I may break out the multimeter. Thanks.
 
I'm running 64GB of 3200MHz 14-14-14 G.Skill RAM at the XMP settings (no manual DRAM settings) and it's working _great_.

You mention it passed at 3600 but then failed to boot after rebooting. Does your motherboard have a DRAM boot voltage setting? Asus tends to differentiate the DRAM voltage setting from the DRAM boot voltage setting. If you require 1.42 volts to run at 3600 and never adjusted the boot voltage, it may still be at the stock DRAM boot voltage of around 1.25 volts, instead of even the 1.35 volts common to most >2133 speeds. You'd think the XMP profile would handle that stuff, but that'll vary motherboard to motherboard, I'm sure. It might also be good to give yourself DRAM current headroom and load-line calibration boosts over stock when running at these really high frequencies that need more than 1.35 volts to be stable.

DRAM modules also have temp sensors, which are probably not visible in MemTest86, but you should be able to see them in monitoring software in your regular OS during stress tests, just to make sure your RAM isn't overheating.

edit: I always go through the digital power controls in the motherboard and boost load-line calibration and current limits for basically everything... that's why there are those big heatsinks on these premium boards :)

I looked in BIOS for DRAM boot voltage, but couldn't find it. I've set all the other LLC and duty cycle stuff per the calculator. Hopefully, I can find that setting tonight.
 
I looked in BIOS for DRAM boot voltage, but couldn't find it. I've set all the other LLC and duty cycle stuff per the calculator. Hopefully, I can find that setting tonight.

Some Asus boards do not have it... That is why on early BIOSes on 1.0.0.7 we had issues with XMP not working, since the board was setting 1.2V as the boot voltage with no way to change it. You had to manually set the XMP speed and timings after setting the voltage to 1.4V and then rebooting. I still do not have it on my board, using the latest BIOS (AFAIK) released on August 1st.
 
Some Asus boards do not have it... That is why on early BIOSes on 1.0.0.7 we had issues with XMP not working, since the board was setting 1.2V as the boot voltage with no way to change it. You had to manually set the XMP speed and timings after setting the voltage to 1.4V and then rebooting. I still do not have it on my board, using the latest BIOS (AFAIK) released on August 1st.

Odd, I wasn't aware that so many of the boards in their lineup lacked that setting. Very odd choice on their part to have it appear only on certain ones. It's not like it requires anything physically special.
 
Odd, I wasn't aware that so many of the boards in their lineup lacked that setting. Very odd choice on their part to have it appear only on certain ones. It's not like it requires anything physically special.

I am far from happy with the BIOS issues Asus has had with their AM4 boards. I had an X370 Prime Pro at first that ended up bricking its BIOS for no reason on a reboot. It was a known issue that they fixed in a hardware revision, but having to tear down a custom loop with 3 Vega 56s in it was not fun. I reluctantly bought an X470 Prime Pro since eBay had a 15% off coupon and it was the only board with the same spacing and 3 16x slots that NE was selling on eBay.

Now the BIOS releases have been a shit show. I spent 2 hours installing and then removing a powered 4-pin PWM splitter after the fans attached to it would ramp to 100% no matter the settings in the BIOS. I reinstalled the 2-to-1 splitters I was using, with 2 fans on each channel, and the issue was still there. Finally, after giving up and doing some Google-fu, I found a post from spring 2018 that had the same issue, and the only fix was to turn off fan monitoring in the BIOS. This fixed the issue, but it makes me so mad I wasted so much time on a bug that should have been fixed last June and is still present over a year later.

I think I am going back to buying ASRock or MSI boards going forward. I do not plan to go back into GPU mining any time soon, and unless Zen 3 requires a new socket, I will not replace this board anytime soon. If the next major AGESA release gives the other mobo makers better performance, then I may just bite the bullet and replace this POS.

I still cannot get the boost I was getting on 1.0.0.7 with a dinky Wraith Prism and no running case fans, despite now running a custom loop with 780mm of rad space and RAM with much tighter timings. I was getting over 500 single-core in CB20 since my 3700X boosted so nicely. This BIOS has allowed me to regain a big part of it (currently getting 4877/487), but the fact that I was getting higher performance on air really rubs me the wrong way. And the issue is with Asus, plain and simple.
 
You also have to cool the heat from your PC/ Run the AC more. (In Alabama here with a 8320/290x and 1700/rx580 system, I would know..................)
What? You mean for the Intel chip? Yes, the AMD chip will save on your AC bill too in this case; as he said, it's using fewer watts = less heat generated overall.
 
Odd, I wasn't aware that so many of the boards in their lineup lacked that setting. Very odd choice on their part to have it appear only on certain ones. It's not like it requires anything physically special.

I got another successful MemTest86 run last night at 3600 with 14-16-15-30 timings (Safe settings from the Ryzen DRAM Calculator). Putting DRAM at 1.47, SOC at 1.125, and elevating some other settings like CLDO VDDG and VDDP, VTTDDR, VREF, VPP, etc. to the max the calculator advises seems to be the ticket, I guess. However, when I run AIDA, latency is 72ns and Cinebench R20 is 7200. AIDA was below 70ns with 3600 16-16-16-32 timings. Based on others' benchmarks I've seen around, my timings at 3600 should be getting me better scores, no?
 