5900x VID at > 1.45?

texuspete00

Supreme [H]ardness
Joined
Sep 9, 2002
Messages
5,623
I had been out of the game for a bit and did a crash course here and on Google when finally upgrading. I was surprised to see the VID fluctuating around, but mostly over 1.45v, on an Asus TUF Gaming X570 Pro. Idle is around 41C. I thought maybe I had mounted the cooler poorly, but I think it's more the VID; it only goes to the low 60s if I run a quick benchmark.

Does this seem like a high VID? I tried shutting off the "PBO" setting in the BIOS, which didn't do much, but other than that these are mostly all default settings. I have since set the memory to DDR4-3600 as well, but I doubt that's it.
 

Temps look about right. These chips run warmer than what most of us were used to.
 
I agree. My 5800X every now and then hits 1.456v at idle. From what I gather, that's just how they run. They're not actually "using" that much voltage, though.
 
It's expected behavior. The 5800X I built for a family member goes up to 1.475. AMD considers it safe enough to give you a 3 year warranty.
 
And hopefully these chips last an average length of time. It's not very often that a CPU dies before anything else in the system, in my experience.
 
Enigma posted this in another similar thread, so it's normal. I still run a negative voltage offset on my 5600X though, to drop idle/desktop temps.

(attached chart of typical Ryzen voltage behavior by workload, including a "Desktop Idle" range)
 
I feel a little better. I would hope to see more of what that chart calls "Desktop Idle", though. I mean, if I stare at CoreTemp long enough, I might see a couple of blips of ~1.0v, but it's probably spending 80-90% of the time at 1.45-1.48v while not doing anything. I thought Asus might be up to something weird to try to win mobo benchmarks by 0.01% with fractionally better boosts or something :LOL:

I have a Liquid Freezer II 420 keeping it plenty cool even under load, but I can hear the fans spin up ever so slightly if I so much as click on something, ha. Might have to mess with fan curves or something if there are no gremlins to chase here.
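If you want to put numbers on that instead of staring at CoreTemp, you can export a sensor log to CSV (HWiNFO can do this) and tally how often the VID actually sits high. A minimal Python sketch; the file name and the "CPU VID [V]" column header are placeholders, so match them to whatever your log actually uses:

```python
# Rough sketch: summarize how often a logged VID sits above a threshold.
# Assumes a sensor log exported to CSV (e.g. from HWiNFO).
# "sensor_log.csv" and "CPU VID [V]" are placeholders; check your log.
import csv

LOG_FILE = "sensor_log.csv"
VID_COLUMN = "CPU VID [V]"
THRESHOLD = 1.45

samples = []
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f):
        try:
            samples.append(float(row[VID_COLUMN]))
        except (KeyError, ValueError):
            continue  # skip malformed rows / summary lines at the end

if samples:
    high = sum(1 for v in samples if v >= THRESHOLD)
    print(f"{len(samples)} samples, min {min(samples):.3f}v, max {max(samples):.3f}v")
    print(f"{100.0 * high / len(samples):.1f}% of samples at or above {THRESHOLD}v")
```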
 
My TUF board has a setting called something like Performance Bias, with different benchmarks under it; I set it to None. Maybe see if yours has that. Also, a lot of boards do just over-juice out of the box. You could try a negative voltage offset like I mentioned earlier; just do a Cinebench run or something CPU-related before and after to see whether performance degrades.
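If you don't have Cinebench handy, even a dumb, fixed CPU-bound workload timed before and after the offset change will expose a real regression. A rough Python stand-in, not Cinebench, just a repeatable all-core loop you run once at stock and once with the offset applied:

```python
# Crude before/after harness: time a fixed all-core integer workload.
# Run at stock, apply the offset, run again, compare elapsed times.
import time
from multiprocessing import Pool, cpu_count

def burn(n: int) -> int:
    # Fixed integer workload so every run does identical work.
    total = 0
    for i in range(n):
        total += i * i % 97
    return total

if __name__ == "__main__":
    start = time.perf_counter()
    with Pool(cpu_count()) as pool:
        pool.map(burn, [5_000_000] * cpu_count() * 4)
    print(f"elapsed: {time.perf_counter() - start:.2f}s (lower is better)")
```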
 
I wouldn't worry... my 3600XT, 5600X, and 5900X all got about the same voltage. Although I do recall seeing 1.55v+ a few times on the XT.

 
Up to 1.55v is the default with PBO settings, depending on the motherboard and AGESA version. Your voltage is normal, as others have said.

My 5800X hits a max of 5175MHz effective clock and shows 1.5v even when idle or doing nothing, which is just how these AMD CPUs work.
Idle: 25ºC
Cinebench: 70ºC
(screenshot: 5800x test)

Shadow of the Tomb Raider boosts up to 5150MHz while gaming, with the highest temperature around 61ºC and the same kind of VID voltages, which mean nothing on this CPU.
 
Voltage is normal, but that temp is on the high side for idle. Based on your idle temps, I would assume you have an AIO with a very noise-focused fan curve, which lets your water temp run high at idle, and that you are running quite a few background processes as well. The CPU spins up quite fast, so it will use more power if you are not truly idle.
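A quick way to check whether the machine is actually idle is to sample per-process CPU use for a couple of seconds. A small Python sketch using the third-party psutil package (pip install psutil):

```python
# Check that the system is actually idle: sample per-process CPU use
# over a 2-second window and print anything that isn't sleeping.
import time
import psutil

# Prime the per-process counters, then measure over a 2-second window.
for p in psutil.process_iter():
    try:
        p.cpu_percent()
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass
time.sleep(2)

busy = []
for p in psutil.process_iter(["name"]):
    try:
        pct = p.cpu_percent()
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue
    if pct > 1.0:
        busy.append((pct, p.info["name"]))

for pct, name in sorted(busy, key=lambda t: t[0], reverse=True):
    print(f"{pct:5.1f}%  {name}")
```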
 
This is my 5900X with a heatsink, in the stock clock range. For 3DMark and the like it will boost to the top. For stuff like Linpack Xtreme it will drop down to 4500MHz; for F@H/WCG, 4600MHz. With PBO and a custom tune it will run SuperPi 32M at 5150MHz and set a good time.

 
I have mine set to 1.37v, basically a negative offset, but under load it'll go up to 1.47v. I would not worry about it as long as temps are within the expected range.
 
My 3900x is the same way. Much higher voltage even during idle than I would expect.
 
The CPU will run the VID as high as 1.51v at times to achieve its maximum boost clocks. Basically, this is done to enhance performance, but the problem is that these CPUs don't clock all that high. In reality, a 5900X can really only clock at about 4.2GHz or so on all cores. To improve single-threaded or lightly threaded performance, one or two cores on the die that can achieve significantly higher clock speeds are used. However, because the architecture doesn't clock well, AMD has to use a shit ton of voltage to get those cores to hit those speeds. Unfortunately, that isn't sustainable, so it will run the VID at 1.51v temporarily, and when it gets to 68C it will bring the VID and clocks back down to lower levels, then ramp them back up as conditions allow.
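As a mental model, that feedback loop looks something like the sketch below. To be clear, this is a caricature, not AMD's actual boost algorithm, and all the numbers are illustrative:

```python
# Toy caricature of the boost behavior described above -- NOT AMD's actual
# algorithm: push VID/clocks up while cool, back off past a temperature
# threshold, ramp up again once it cools down.
BOOST_VID, BASE_VID = 1.51, 1.20   # volts (illustrative numbers)
TEMP_LIMIT = 68.0                  # degrees C, per the post above

def next_state(temp_c: float, boosting: bool) -> tuple[float, bool]:
    """Return (vid, boosting) for the next control interval."""
    if boosting and temp_c >= TEMP_LIMIT:
        return BASE_VID, False     # too hot: drop VID and clocks
    if not boosting and temp_c < TEMP_LIMIT - 5:
        return BOOST_VID, True     # cooled off: ramp back up
    return (BOOST_VID if boosting else BASE_VID), boosting

# Fake temperature trace just to show the oscillation.
temp, boosting = 40.0, True
for step in range(10):
    vid, boosting = next_state(temp, boosting)
    print(f"t={step}: temp={temp:.0f}C vid={vid:.2f}v boosting={boosting}")
    temp += 6.0 if boosting else -4.0  # heats while boosting, cools otherwise
```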
 
I am running a negative 0.15 offset on both of my AMD systems, 5800X & 5900X.
It dropped idle and load temps for me.
 
are you using a negative offset at all?
I am, but a very mild one; I think I'm at like -0.050v. I was in the process of testing negative offsets and seeing their effect on performance, clocks, and temps. -0.050v was either as good as or marginally better than the default settings on those metrics. I got too lazy to test any further and just left it there.
 
That seems kind of low for a 5900X. I run mine all-core at 4650MHz at 1.3v; if you just let it run at default, then yeah, 4200 is about right. When it hits 4950 on one or two cores, then yeah, it needs voltage, 1.45v in my case. I think quality cooling makes a big difference in the clock speed you get for all-core loads, and reducing the voltage you let it have helps as well. I do use CTR, though, to keep tight control of how my chip runs, as I found I could get more out of the chip than AMD does by default. I think motherboard manufacturers compound the issue by pumping more voltage than needed.
 
It was just an example. I'm not entirely sure what they can clock to using all 12 cores. I have one for the test bench, but I don't have much experience with it yet. The last time I tested it, I couldn't get mine to do more than 4.1GHz or 4.2GHz all-core with any stability. But again, the 1.51v is just when it's boosting one or two cores. For all-core overclocks I've typically needed 1.45v to 1.485v.
 

Wow, 700 single core on the CPU-Z bench is pretty nice! Did you have to do anything special to get it there? I've got a 3800X on an ASUS TUF X570 and I get like 535, so that's like a 30% jump in performance.

I ask because I actually get the best performance leaving everything stock, no PBO, no nothing, except for a tiny negative voltage offset. The only thing I have OC'd is my memory/IF to 3666/1833 (Ballistix 3200).
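For anyone following along, 3666/1833 is just the usual 1:1 coupling on these chips: the Infinity Fabric clock (FCLK) at half the DDR transfer rate. A quick sanity check:

```python
# 1:1 ratio on Ryzen: memory clock (MCLK) and fabric clock (FCLK) both
# run at half the DDR transfer rate, since DDR moves data twice per clock.
for ddr_rate in (3200, 3600, 3666, 3866):
    half = ddr_rate // 2
    print(f"DDR4-{ddr_rate} -> MCLK {half}MHz, FCLK {half}MHz for 1:1")
```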

Thinking about upgrading to a 5800X3D when or if they come out. I just noticed that for BF2042 the minimum spec is a 3600X and the recommended is a 2700X, so there's no way I'm going with less than 8 cores. You know by the time this console generation is over they'll be squeezing every last drop of performance out of those 8-core Zen 2 chips they've got in there.
 
Well, I just logged in to this site tonight. It is fine to upgrade to the new 5800X3D coming up, but do not really expect any PC gaming performance gain, unless you run at low settings at 1920x1080 with a high-end GPU like a 6900 XT/RTX 3090; then maybe some difference. If you are gaming higher than 1080p ultra, there is not much difference at all at 1440p/4K, or none worth mentioning.
I have owned the AMD 2600X, 3600X, 3600XT, 3800X, 3800XT, 5600X, and 5800X, and there was no real PC gaming difference for me, but I play at 4K ultra settings on good GPUs. I would not judge PC gaming by any CPU-Z/Cinebench score.

Now, with the crap I just wrote: I am going to buy a 5800X3D and motherboard and ditch this Intel I am running at the moment, for fun. I used AMD Curve Optimizer, that's all, to get the high clocks on the 5800X, but there was no real difference between running at 5200MHz and 4500MHz in PC gaming for me. There was also no real difference in PC gaming with tuned RAM timings from 2933MHz to 3866MHz. So I am not trying to discourage you, just saying that with the gear you have you should be fine even with BF2042, unless you need every FPS and run at low settings, 720p or something; then a faster CPU will make a difference.

E.g.: RAM at 2933MHz vs 3866MHz on a 3800X at 1920x1080 in Battlefield V and Grand Theft Auto V. Sorry for the rambling.
 