VID/Vcore, Core Temp, and CPU-Z Confusion

Bomo

So I'm a bit new to overclocking, and I really like how simple it has been with my 2500k. I've been running 4.2 with everything else on auto for a while, but I recently got curious how far I could push it (most likely I'll go back to 4.2-4.4ish).

I'm using CPU-Z and Core Temp to monitor it, and I'm at a bit of a loss on Core Temp's VID reporting. From what I know about VID, it's essentially the amount of power the chip needs to get going, while Vcore is the voltage the CPU is actually running at.

In my most recent round of testing I was trying 46x with Vcore set to auto (1.352 V according to CPU-Z), but now Core Temp keeps reporting a VID range from 1.0007 V to 1.4100 V, primarily staying at 1.3810 V.

I'm fine with running auto at ~1.35 V and then testing how far I can drop it from there. But the 1.41 scares me, because 1.4 V is generally a lot of people's long-term cutoff.

Am I freaking out about nothing? Is Core Temp essentially only useful for reporting temperature?

Really any information would help.


I'm running an H60 with its stock thermal grease. (I'm upgrading to Arctic Silver and a push-pull setup on it when I upgrade my video card, and possibly going to an H100 and modifying my case.)
 
CPU-Z and Core Temp show slightly different voltages on mine when idle, but at load they are both the same. Yours is different under load?

VID is the voltage the CPU is requesting, and Vcore is the voltage being provided. With Auto set in the BIOS, they should be the same (or close; Vdroop will have some effect), but with manual voltage settings they would obviously be different. I've always used CPU-Z for voltages, but I can't really comment on which is more accurate.
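If you're curious where these readings actually come from: on Sandy Bridge the current operating voltage is exposed through the IA32_PERF_STATUS MSR, and that register is essentially what tools like CPU-Z and Core Temp poll. Here's a minimal Linux sketch of the idea, assuming root and the msr kernel module loaded (modprobe msr); the register address and the divide-by-8192 scaling are Intel-documented, but the rest is just illustration:

```python
# Minimal sketch: read the running core voltage the way a monitoring
# tool would. Linux-only, needs root and the 'msr' kernel module.
import os
import struct

MSR_PERF_STATUS = 0x198  # IA32_PERF_STATUS

def read_core_voltage(cpu=0):
    # /dev/cpu/N/msr exposes core N's MSRs; a pread at the register
    # address returns its 64-bit value.
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        raw = os.pread(f.fileno(), 8, MSR_PERF_STATUS)
    msr = struct.unpack("<Q", raw)[0]
    # On Sandy Bridge, bits 47:32 hold the current core voltage in
    # units of 1/8192 V (the same field turbostat reports).
    return ((msr >> 32) & 0xFFFF) / 8192.0

if __name__ == "__main__":
    print(f"core 0 voltage: {read_core_voltage(0):.4f} V")
```

Sampling that in a loop at idle and under load is also why you see such a wide VID range: SpeedStep drops the requested voltage way down when the chip is idle.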
 

At auto, Vcore was 1.362 according to CPU-Z (which doesn't show VID as far as I know), while VID ranged from 1.371 to ~1.41.

I've since dropped back to 44x at 1.28 (not auto, so Vcore and VID should differ in this case, right?), and it passed 10 rounds of IBT completely fine. I haven't run into any problems; I just want to make sure I'm not stressing my chip too much.
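If anyone wants a rough Linux stand-in for that kind of run, here's a sketch that hammers the FP units with dense numpy matrix multiplies (Linpack-ish, but not actual IBT) and prints the hottest coretemp reading each round. The hwmon paths are the usual coretemp layout; treat the whole thing as an assumption, not a replacement for a real stress test:

```python
# Rough stress-and-watch sketch: numpy matmuls for load, hwmon for temps.
import glob
import numpy as np

def max_core_temp_c():
    # coretemp exposes each sensor in millidegrees C under hwmon.
    temps = []
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
        try:
            with open(path) as f:
                temps.append(int(f.read()) / 1000.0)
        except OSError:
            pass  # some sensors refuse reads; skip them
    return max(temps) if temps else float("nan")

a = np.random.rand(2000, 2000)
for i in range(10):  # mirror the "10 rounds" of IBT above
    for _ in range(20):  # keep the FP units busy for a while per round
        a @ a
    print(f"round {i + 1}: hottest sensor {max_core_temp_c():.1f} C")
```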
 

Your reported VIDs of ~1.371/~1.41 V are fine, and you are not stressing your chip too much, provided that your temps are under control.

I'm currently running my 2600K at 49x with ~1.4 V (load), and Core Temp's VID is reporting 1.3911 V.
 
The only thing I use Real Temp for is the temperatures. If you hit the VID button you can change what it reads, and I use it to measure watts on the CPU; I'm not sure how accurate it is, but it seems accurate enough for me. My Real Temp has never shown the correct Vcore at idle (it appears to be close), and under load it is way off. Every program I have except for EasyTune 6 shows a different Vcore: OCCT, CPU-Z (accurate now with the latest version), SpeedFan. Now CPU-Z is the only program I use to measure Vcore.
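For what it's worth, that wattage number most likely comes from the RAPL energy counters Sandy Bridge introduced: the chip keeps a running count of package energy, so power is just two samples divided by the time between them. A sketch of that on Linux, with the same root-plus-msr-module caveats as the earlier snippet; the two register numbers are Intel's documented RAPL MSRs, the rest is illustrative:

```python
# Sketch: estimate package watts from the RAPL energy counter.
import os
import struct
import time

MSR_RAPL_POWER_UNIT   = 0x606  # scaling factors
MSR_PKG_ENERGY_STATUS = 0x611  # cumulative package energy, 32-bit

def read_msr(reg, cpu=0):
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        return struct.unpack("<Q", os.pread(f.fileno(), 8, reg))[0]

# Energy unit = 1 / 2^(bits 12:8 of the power-unit register) joules.
esu = (read_msr(MSR_RAPL_POWER_UNIT) >> 8) & 0x1F
joules_per_tick = 1.0 / (1 << esu)

e0 = read_msr(MSR_PKG_ENERGY_STATUS) & 0xFFFFFFFF
time.sleep(1.0)
e1 = read_msr(MSR_PKG_ENERGY_STATUS) & 0xFFFFFFFF
ticks = (e1 - e0) & 0xFFFFFFFF  # the counter wraps at 32 bits
print(f"package power: {ticks * joules_per_tick:.1f} W")
```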
 