5800X all core overclock - 4800MHz @ 72.5℃ with 1.29V

gerardfraser

[H]ard|Gawd
Joined
Feb 23, 2009
Messages
1,366
5800X all core overclock Cinebench 20 - 4800MHz
Multi score - 6330 @ 72.5℃ with 1.29V
Too bad some people are having cooling problems; looks like I got lucky.
Added ST score with AMD optimizer
Desktop-Screenshot-2020-11-16-19-43-03-74-2.png

Video of run if you like boring stuff

♦ CPU - AMD 5800X With MasterLiquid Lite ML240L RGB AIO
♦ GPU - Nvidia RTX 3080
♦ RAM - G.Skill 32 GB DDR4
♦ Mobo - MSI MAG X570 TOMAHAWK (E7C84AMS v140)
♦ SSD - NVMe SSD 1TB
♦ DSP - LG B9 65" 4K UHD HDR OLED G-Sync Over HDMI
♦ PSU - Antec High Current Pro 1200W
 
Congrats, and thanks for sharing it. Like you said, you probably have a golden 5800X, with a lot of peeps having problems with high temps.

How does this compare to AMD's Curve Optimizer? Also, how is your B9 OLED with the 3080? Any HDMI 2.1 issues? Can you game at 4K 120Hz G-Sync HDR?

I'm still waiting/hunting for a 5900X and a 3080... hopefully I can finish my build in 2020 :)
 
Pretty impressive! (y)

ETA: Repost your pic, it's thumbnail size.

Done thanks.
Can you run an AVX stress test?
Yes I can, but I am testing AMD curve optimizer at 5150MHz the next few days.
^ Run Prime 95 with AVX on for a few hours please.
I will, just not today. Testing AMD curve optimizer, I got 5150MHz; 5200MHz is left for stability testing. I might do Prime95 AVX, but I never run Prime95 because I do not need it. I play games, and I need game stable.
No, I do not have a golden sample; I think it is a normal one, not a bad one. With AMD curve optimizer set to 5100MHz single core, multi core drops down to 4750MHz and scores around 6200, while a straight-up 4800MHz all core scores around 6300.
AMD curve optimizer is the only way to run an AMD 5000 series CPU; all core overclocks are dead. AMD curve optimizer gives the best of single core and multi core without losing anything and can be really stable. Also, the magic 5000MHz AMD barrier is broken, which is fine when gaming.

LG B9 works great on newest firmware 5002: G-Sync/VRR 4K 120Hz HDR 12 bit
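For anyone unfamiliar with what Curve Optimizer actually does, here is a rough illustrative model, not AMD's documented internals: it applies a per-core negative offset, in "counts", to the factory voltage/frequency curve, and each count is commonly reported as roughly 3-5 mV, so the boost algorithm sees extra thermal/power headroom and sustains higher clocks. The step size below is an assumption for illustration only.

```python
# Illustrative model only -- AMD does not publish the exact step size;
# ~3-5 mV per Curve Optimizer count is the commonly reported range.
MV_PER_COUNT = 4  # assumed midpoint, purely for illustration

def undervolt_mv(counts: int) -> int:
    """Approximate undervolt applied at a given point on the V/F curve."""
    return counts * MV_PER_COUNT

# e.g. a -15 count offset on one core is roughly a 60 mV undervolt
print(undervolt_mv(15))  # 60
```

The point is just that less voltage at the same frequency means less heat, which is why the boost algorithm then holds higher clocks on its own.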
 
Thanks for the video. That's awesome... I'm sorry, but how do you get HDMI stats displayed on your TV (I own a B9)? Also, which flag shows VRR on or off? (I'm sure it is there LOL.)
I'm curious how well AMD curve optimizer will work on the 5900X. Also, I'm happy you are using a very similar system setup: X570 Tomahawk (114) and a 240 AIO (same as yours, but more $$$ bling Corsair 240mm, my kids wanted it). What are your RAM settings? Would you please post your settings from AIDA64 (including the latency score)? Thanks in advance and happy testing :)
 
Thanks for sharing. I had settled on the 5800X even with pricing being close to the 5900X. I don't need 12 cores, and 100 bucks is 100 bucks. 👍
I play games at 4K and have owned a 2600X/3600X/3600XT/3800X/3800XT/5800X. When waiting in line for the 5600X I wanted, I was number 24, and by the time my turn came they were gone.
OK: Settings/Channel/Channel Tuning, press 1 five times to bring up the menus, and just navigate to the HDMI section.
AMD is in release mode, my guess, and AMD curve optimizer will get better; it works great now, and the 5900X will be as good as any CPU.
The VRR flag I was turning on/off: sorry, copy and paste, someone else asked me.

I have a video game running in the background and Nvidia control panel open to demonstrate the G-Sync/VRR (Variable Refresh Rate) on and off positions. So in Nvidia control panel, under Set up G-SYNC:
#1. Check mark - Enable G-SYNC, G-SYNC Compatible
#2. Select the display you would like to change - I picked the LG B9 OLED
#3. Display specific settings - you can tell VRR is turning on/off by looking at the Total resolution 4400x2250 number.
When I put a check mark in the box, it turns on VRR and the number 2250 will start to fluctuate.
When I uncheck the box, it turns off VRR and the number 2250 will go back to static.

RAM settings I have none anymore; I play games at 4K and RAM does not matter. But if you want, here is one on the 3800X with one of my B-die RAM kits, either the 4000 or the 4133 kit, at CL16 3866MHz. I am running CL19 3600MHz rank 2 32GB RAM on the 5800X, and latency is probably around 80ns.
49608546617_506df1a677_k by gerard fraser, on Flickr
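As a side note on comparing those kits: the actual CAS latency in nanoseconds depends on both the CL number and the clock, since DDR4's memory clock is half the transfer rate. A quick sketch of the arithmetic (this computes first-word CAS latency only, not the full round-trip figure AIDA64 reports):

```python
def first_word_latency_ns(cl: int, transfer_rate: float) -> float:
    """CAS latency in cycles divided by the memory clock (half the DDR
    transfer rate in MT/s), converted to nanoseconds."""
    memory_clock_mhz = transfer_rate / 2
    return cl / memory_clock_mhz * 1000

print(round(first_word_latency_ns(16, 3866), 2))  # CL16 @ 3866 -> 8.28 ns
print(round(first_word_latency_ns(19, 3600), 2))  # CL19 @ 3600 -> 10.56 ns
```

So the CL19 3600 kit gives up only a couple of nanoseconds of CAS latency versus the tuned CL16 3866 kit, which is consistent with the "no difference at 4K" observation.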
Very nice dude.
AMD Ryzen 5000: some are good, and I probably just got lucky to get a working one.
 
Thanks for the info and for sharing with the rest of us. Much appreciated!!!! :) As for RAM latency, it shows you have 61ns, which is pretty good. I'm sure you could get it even better. Maybe try using the famous Zen RAM Calculator.
 
Well, it was 3 runs back to back to back: 60.7/61.1/61 at CL16 3866MHz on a 3800X. It does not get much better than that, especially at CL16 on a 3800X; check and see.
Good that you looked lol. Just fucking with ya.

Now with the 5800X I am sure I could do 50ns, but I am happy with 80ns because there is no performance difference when I play PC games at 4K.
You're welcome about the sharing thing; everyone needs to know what an AMD CPU can do.
 
I dunno if binning has anything to do with it yet. We don't have a way to measure it, nor is there much data out yet. However, as I surmised with jumper18's leaks last month, the huge reduction in voltage needed to hit boost clocks showed the big improvement in voltage scaling. It made possible the idea that we could maybe run the boost clock on all cores at reasonable voltage, which you have achieved. Thus yeah, it's looking good, especially for those with ample cooling.
 
Impressive clocks and temps! I can get to ~6050 points in Cinebench 20 on stock settings, and I think I'll stay there :)

As far as RAM is concerned, I tuned my set to 3800 CL16 and got these scores; also, Geekbench increased from 9500 points on default timings to 10700 on the settings I'm attaching. For 4K gaming it's pointless, but I like to get the performance I paid for ;)
1605517585628.png
1605517608051.png
 
I have not been able to get 1900 IF to post as of yet; 1866 seems just fine.

tweaked161616bestcachemem.png
r20_allcore Capture.JPG
 
Putz
If you use the AMD Curve Optimizer to set up your 5950X, you should get higher single/multi scores and a sustained boost of around 5150MHz in light loads.
Here is my single in Cinebench R20
Desktop-Screenshot-2020-11-16-19-43-03-74-2.png
 
Gerard, excellent. Shortened the learning curve. Prefer low latency though. Going to break 51ns latency soon. Fingers crossed.
 

Attachments

  • DBA016B5-84CA-4FBA-8198-37E4ADC12213.jpeg (350.3 KB)
Doh, I just went thru your other thread. In your video you disabled effective clocks in hwinfo.

PSA for all users: the only thing you should be looking at is effective clocks. The core clock (aka discrete clock) display is meaningless, as it just shows what could be called, not what the CPU is actually operating at. These threads are all based on discrete clocks, which is, duh, misleading.

Effective clock vs instant (discrete) clock | HWiNFO Forum
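To make the distinction concrete, here is a minimal sketch of the idea behind an effective clock reading (my own simplification, not HWiNFO's actual implementation): it averages the cycles actually executed over the sampling interval, so any halted/idle time pulls the number below the discrete clock that was merely requested.

```python
def effective_clock_mhz(cycles_executed: float, interval_s: float) -> float:
    """Cycles actually executed over the interval, in MHz; halted time
    simply contributes no cycles, dragging the average down."""
    return cycles_executed / interval_s / 1e6

# A core showing a 4850 MHz discrete clock but halted 3% of a 1 s window:
print(round(effective_clock_mhz(4850e6 * 0.97, 1.0), 1))  # 4704.5
```

That gap between 4850 "requested" and ~4705 effective is exactly the kind of discrepancy the posts below are describing.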
 
I believe effective was around 4700 before the link no longer worked. First time I was gaming with CPU clocks running on screen, I was jumping up and down at 4850 "all core". Then I looked at the effective clocks..........
 
@OP Do you have a hwinfo screen that shows your effective clock?



^^ You need to update to v6.30, as the older versions are not accurate; reads are bigger than actual.
Yeah, I realized that after I posted. I'll post an updated one.
 
In the first post, effective clocks are shown in the real-time video. Not sure why you are saying I have effective clocks disabled. Effective clock for the all core overclock is 4800MHz.


Here is another one without the all core overclock, with AMD curve optimizer at 5100MHz. If you need screenshots instead of video, I have screenshots.
 
Thanks for the compliment. I assume you're talking about AMD Curve Optimizer; if you're talking about the screenshot I posted on memory, it was a 3800X and it was 60.7ns, you do not get lower, just saying. Now on the 5800X I am sure I could get below 50ns for my epeen, but I can not be bothered. I play games at 4K; RAM does not matter at all.
 
Ah sorry mate, I was looking in the default place below discrete clocks but I see now your effective clocks are bottom right.

I've seen one setup that was mid 48ns. It was using the Royal 4400C16 8GB DIMMs tightened down quite a bit. I'm not sure it's worth all that work for a few ns that no one can notice.
 
No problem. Yeah, tuning your RAM timings is the way to go, but buying expensive RAM is not for everyone and brings no performance gains.
 
Yes sir, the curve optimizer tutorial. I enjoy the videos on your YouTube channel. I just couldn't be bothered with that ;)
 
LOL, I am an old man who semi-retired at 39 and retired for good at around 44. I put up the YouTube stuff for my own fun and for my nieces; they live far away. Also to help some people once in a while.
 
Pulling your leg, buddy. You do nice work across the OC community. Cough, 4K.
Sitting here scratching my head why a tweaker would not like to push the entire system. Different strokes for........
Can find these sub-$110 at times, and they will run with most anything out there.
https://www.newegg.com/patriot-16gb-288-pin-ddr4-sdram/p/N82E16820225144

Those are decent, though they are not that highly binned, but for that cost it's an excellent value. With a lot of volts you can run those at 3800 at CAS 14. I have used a set for a lil bit. They clocked slightly easier than my G.Skill 3600 CAS 16 RAM.
 
Well, I owned these and other sets; I switched to CL19 3600 32GB rank 2 RAM. Believe me when I say there is no difference in anything I do on my computer. Thanks for the tip though.

E.g., FCLK is a myth on AMD Ryzen; tune your RAM and save cash. Anyway, check out:

BF5 GTAV 1920x1080 Tuned Ram AMD Ryzen Fabric Clock 1467Mhz (DDR4 2933Mhz) vs Fabric Clock 1933Mhz (DDR4 3866Mhz)


50069369516_3b8f4bf43f_k.jpg - B-Die Ram
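For reference on the numbers in that video title: with the usual 1:1 FCLK:UCLK:MCLK ratio, the fabric clock equals the memory clock, which is half the DDR4 transfer rate. A quick sketch of that relationship:

```python
def fclk_for_1to1(ddr_rate: float) -> float:
    """Fabric clock in MHz at a 1:1 ratio: half the DDR transfer rate,
    since DDR transfers data on both clock edges."""
    return ddr_rate / 2

print(fclk_for_1to1(3866))  # 1933.0
print(fclk_for_1to1(2933))  # 1466.5, the ~1467 MHz in the video title
```

That is why DDR4-3866 pairs with a 1933MHz fabric clock in the comparison above.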
 
So I've dug through my entire BIOS and cannot find any mention of AMD Curve Optimizer. Specs in sig. I guess it's not in every X570 BIOS?
 
Is that something in the BIOS, or in Ryzen Master? I don't see anything like that on my X570 ROG Strix, running the latest beta BIOS.
 
I was looking for something like that as well on the Asus ROG Strix X570... on the 2812 beta BIOS, and nothing.
 