Intel XTU

Discussion in 'Intel Processors' started by FrgMstr, Apr 11, 2019.

  1. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,213
    Joined:
    May 18, 1997
    {NG}Fidel, Armenius and juanrga like this.
  2. spine

    spine 2[H]4U

    Messages:
    2,539
    Joined:
    Feb 4, 2003
    I came across it reasonably recently, and was rather impressed actually. It's a world apart from their shitty graphics control panel, which was what I was fearing when I first ran it.

    Main gripe is it could definitely stand to be a lot leaner. And it does the usual Intel game of installing a raft of Intel services, most of which aren't needed to run the application. I couldn't tell you offhand as I've since deleted the services, but it's something like 5 or 6 all told, which is ridiculous.

    Should be one executable, just like any other app of this ilk.
     
    Last edited: Apr 11, 2019
  3. Dullard

    Dullard 2[H]4U

    Messages:
    2,207
    Joined:
    Jun 28, 2012
    Yeah, the stress test isn't really very stressful.
     
  4. CAD4466HK

    CAD4466HK [H]ard|Gawd

    Messages:
    1,222
    Joined:
    Jul 24, 2008
    This. IBT/Linpack is much better IMO.
     
    Armenius and Spartacus09 like this.
  5. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,213
    Joined:
    May 18, 1997
    So you do see value in built-in stress tests? To put it plainly, would you trust those for building a bulletproof configuration, or are you going to do that with a 3rd-party tool anyway?
     
  6. Dullard

    Dullard 2[H]4U

    Messages:
    2,207
    Joined:
    Jun 28, 2012
    Here's 5 minutes of XTU vs a few seconds left on Blender Classroom (takes about 5 1/2 minutes).

    Core 2 is usually a bit hotter than the others, and XTU shows that. But even with the -3 AVX offset kicking in in Blender, notice how much higher the temps are. Then again, a 7980XE @ 4.6 is a pretty beastly rig. XTU1.jpg Blend-1.jpg
     
    Armenius and FrgMstr like this.
  7. jrobdog

    jrobdog Limp Gawd

    Messages:
    286
    Joined:
    Dec 4, 2006
    I would love Intel to consider an automated binning process within the XTU framework. The user sets the voltages and the program tests progressively higher speeds until it finds the edge of stability.
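    Something along the lines of this rough sketch is what I'm picturing. To be clear, the set_multiplier and is_stable helpers below are hypothetical placeholders for whatever XTU would actually hook into (its tuning driver and stress test), not anything Intel ships today:

    ```python
    # Hypothetical binning loop: the user fixes a voltage, the tool steps the
    # multiplier up until the stress test fails, then settles on the last
    # known-good step. Both helpers are placeholders, not real XTU calls.
    def set_multiplier(mult: int) -> None:
        """Placeholder: would apply the all-core multiplier via the tuning driver."""
        print(f"applying x{mult}")

    def is_stable() -> bool:
        """Placeholder: would run a stress/validation pass and return pass/fail."""
        return True  # stand-in result

    def find_max_multiplier(start: int = 40, ceiling: int = 55) -> int:
        best = start
        for mult in range(start, ceiling + 1):
            set_multiplier(mult)
            if not is_stable():
                break          # first unstable step: stop searching
            best = mult
        set_multiplier(best)   # fall back to the last known-good multiplier
        return best

    print("highest stable multiplier:", find_max_multiplier())
    ```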
     
    craigdt, alxlwson, Armenius and 2 others like this.
  8. mda

    mda [H]ard|Gawd

    Messages:
    1,559
    Joined:
    Mar 23, 2011
    To give a short background, I use XTU mainly with my laptop, a Thinkpad P50 with the 6700HQ CPU. I don't use it for desktops, where I do the tweaking in the BIOS.
    My main goal is to undervolt to keep the laptop cooler, save more battery, and maybe allow the CPU to turbo higher and longer, all to hopefully get a little more performance out of the system.

    My main issues and comments are the following:
    1. The in-app upgrade is wonky. I had an old version and the app prompted me to update. I clicked download, installed, and ended up with the old version after upgrading from inside the app. I can't remember the version number at the time, but it was a 2017 or 2018 build. I had to uninstall the app and reinstall a new version. This may or may not have been fixed since then.
    --Side note: I realized the app doesn't quickly/intuitively show its version, build number, or build date either.

    2. XTU doesn't go to the system tray during bootup by default. I'm quite sure I didn't uncheck anything when I installed it, as I'd want it there to know it's running. Because of that, I'm not sure whether the settings I applied are actually in place every time I reboot the laptop. At this point I'm thinking: "Even without the icon in the system tray, are the settings I applied set, or do I need to open XTU to get it running?"

    3. I can't set some of the individual settings back to their 'default'. Once you save the settings, you no longer know what the original value was. E.g., "Processor Core ICCMax" has a default value of 68.000A. There's no default button, though, and I can't tell what this setting does.

    4. I guess I'm not a total noob, but some of the names and explanations are sort of vague to me. Take one parameter, "Turbo Boost Power Time Window": I have a rough idea of what it does, but I have never run into this term outside XTU and don't know exactly what this setting does. The default explanation isn't clear either:
    "The Time window over which the average CPU core power must be below the Turbo Boost Power Max (PL1 time)."
    It seems these were written for engineers who are familiar with Intel's technical language. Same with other settings, like the aforementioned "Processor Core ICCMax". Since I can't tell what they do, I just end up leaving the values as is.

    5. Like some of the motherboard makers do, could there be some indication of when values are physically impossible/stupid to set? Like the numbers turning red when you crank the voltage up too far for the CPU. I'm thinking that Intel, of all people, will know when feeding voltage above ___ to your CPU is bad. **Right now the values turn yellow when I adjust them, but that only tells you that you've changed them from the default.

    6. Is there any way to determine which settings are 'locked' by either Intel or the OEM (in this case, Lenovo) for my specific CPU, and to tell me which values I can play with, so I don't waste time trying a value that will probably never be applied?

    7. Similar to how we read CPU-Z or HWiNFO after we tweak voltages in the BIOS, is there anywhere to verify the changes we've made in XTU? It feels like I hit save and have no idea whether what I did was actually applied, especially since some of the values aren't normally reported by other monitoring programs anyway.
    Sample: "Turbo Boost Short Power Max, Turbo Boost Power Max, Turbo Boost Power Time Window", etc. (One way to at least check those power limits is sketched after this list.)

    8. Long shot - any chance in hell you can convince the XTU team to implement all-core turbo/MCE like all the other board vendors do? (Or any way to tweak the turbo logic.) :D

    9. Partly formed thought - if I'm a desktop user, let's say with an unlocked CPU like the "K" range, is there anything CPU-side we can tweak in XTU that can't be done in the BIOS, to make the program worthwhile/a must-have for overclockers and tweakers? I guess the same applies to a laptop like mine, even with a CPU outside the K range.
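    For what it's worth on point 7: on Linux you can at least sanity-check the package power limits (PL1/PL2) that a tuning utility claims to have applied by reading the documented MSR_PKG_POWER_LIMIT register through the msr driver. Rough sketch below, assuming root and a loaded msr module (modprobe msr); on Windows, monitoring tools such as HWiNFO report the same limits. This is just an illustration of the idea, not an XTU feature:

    ```python
    # Read the package power limits (PL1/PL2) straight from the CPU to verify
    # what a tuning utility actually applied. Linux only; needs root and the
    # 'msr' kernel module. Bit fields follow Intel's documented layout for
    # MSR_RAPL_POWER_UNIT (0x606) and MSR_PKG_POWER_LIMIT (0x610).
    import struct

    MSR_RAPL_POWER_UNIT = 0x606
    MSR_PKG_POWER_LIMIT = 0x610

    def read_msr(reg: int, cpu: int = 0) -> int:
        with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
            f.seek(reg)
            return struct.unpack("<Q", f.read(8))[0]

    def pkg_power_limits():
        units = read_msr(MSR_RAPL_POWER_UNIT)
        watt = 1.0 / (1 << (units & 0xF))            # power unit in watts
        raw = read_msr(MSR_PKG_POWER_LIMIT)
        pl1 = (raw & 0x7FFF) * watt                  # bits 14:0  = PL1 limit
        pl1_on = bool(raw & (1 << 15))               # bit 15     = PL1 enable
        pl2 = ((raw >> 32) & 0x7FFF) * watt          # bits 46:32 = PL2 limit
        pl2_on = bool(raw & (1 << 47))               # bit 47     = PL2 enable
        # The PL1 time window lives in this MSR too (bits 23:17), but its
        # encoding is more involved, so it is left out of this sketch.
        return pl1, pl1_on, pl2, pl2_on

    pl1, pl1_on, pl2, pl2_on = pkg_power_limits()
    print(f"PL1 {pl1:.1f} W (enabled={pl1_on}), PL2 {pl2:.1f} W (enabled={pl2_on})")
    ```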

    Hope this helps!
     
    Last edited: Apr 11, 2019
    Armenius, IdiotInCharge and FrgMstr like this.
  9. Furious_Styles

    Furious_Styles [H]ard|Gawd

    Messages:
    1,293
    Joined:
    Jan 16, 2013
    This is my main issue with it as well. I don't trust it for stability.
     
  10. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,745
    Joined:
    Mar 23, 2012
    After seeing this, I fired up XTU for the first time in a few months. It asked me to upgrade, I clicked OK... it downloaded some stuff, said the upgrade failed a validation check, and didn't update. So I clicked the Check for Update button manually, and it told me I had the latest version. No harm no foul in my case, I suppose, but it certainly didn't do what it was intended to do.

    So yeah, still somewhat of an issue I'd say.
     
  11. Wiffle

    Wiffle Limp Gawd

    Messages:
    293
    Joined:
    Oct 2, 2011
    I was looking at the screenies Kyle was posting in the Leaving Threadripper Behind - Going 8600K thread, and was wondering if XTU allows you to set the multipliers for the cores independently of each other? I know the recent AMD and Intel CPUs support independent multipliers, but does this program make it easy to do? I'm not averse to doing it manually, it just sucks when you want to try different configurations quickly. I can't tell because I'm still using a 2600K, which doesn't support individual core multipliers.

    I only ask because in terms of gaming, I think XTU would come in handy. As it stands now, I don't have much of a reason to upgrade the 2600K (well, I do have a reason, but it's not enough... yet), and I'm more likely to spring for video card upgrades instead of doing a whole rebuild around a newer chip. Having a bazillion cores has little impact on gaming, and I don't think it will for at least the next 5-6 years (I said the same thing 8 years ago when I opted for the 2600K instead of the 2500K... should have got the 2500K, didn't need the hyper-threaded cores...). But being able to limit a game to certain cores with increased multipliers, while letting the OS and other background processes run on lower-multiplier cores, seems like an interesting way to use the multiple cores on the newer chips.
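    For that last part, pinning a game to your "fast" cores while everything else stays on the rest is already doable from the OS side, independent of XTU. A rough sketch with psutil; the core split and the game executable name here are made-up examples, not a recommendation:

    ```python
    # Pin a game process to a chosen set of cores and push everything else off
    # them, using psutil. The core indices are an example split for an 8-core
    # CPU; use whichever cores you run at the higher multiplier.
    import psutil

    FAST_CORES = [0, 1, 2, 3]   # cores running the higher multiplier
    SLOW_CORES = [4, 5, 6, 7]   # cores left at stock for the OS/background

    def pin(game_exe: str) -> None:
        for p in psutil.process_iter(["name"]):
            try:
                if p.info["name"] and p.info["name"].lower() == game_exe.lower():
                    p.cpu_affinity(FAST_CORES)   # game gets the fast cores
                else:
                    p.cpu_affinity(SLOW_CORES)   # everything else stays off them
            except (psutil.AccessDenied, psutil.NoSuchProcess):
                pass                             # skip protected/exited processes

    pin("stellaris.exe")   # hypothetical grand strategy executable
    ```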

    I like my grand strategy games, and I like them to take forever with 20+ AI opponents on the largest maps possible... which means increased turn times and decreased performance that gets worse the longer the game goes on. Overclocking does wonders to help this, but it kind of destroys the longevity of the VRMs and caps (which I've done several times with previous personal builds), inevitably forcing me to remove the overclock, and in one case downclock, to maintain stability until I upgrade. My 2600K has been at stock speeds for the last 2 years, because 6 years at 4.5GHz hasn't been kind to my motherboard...

    I figure if I can juice the system on only a couple of cores, instead of all of them, I can at least spare the poor VRMs and maintain stability over a long period while keeping a decent overclock for the games I play. I have kids now, and would prefer not to roast my hardware anymore, so I can pass it on. But I'd be damned if I didn't push the hardware to its limits! Besides, with my next build I want to start getting into the VR realm.

    Also, I feel XTU could be used in a similar fashion to how video card drivers have "optimized" gaming profiles. It could go a long way towards adding extra value for not only gaming, but other programs as well. Giving developers an option to fine-tune how their programs interact with Intel CPUs would be an amazing boon.
     
    Armenius and Keljian like this.
  12. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,478
    Joined:
    Jan 28, 2014
    This would be great. I love this feature with the new NVIDIA RTX cards. Their implementation has been able to pretty much hit the most stable overclock with each cooling setup I had on the card (air to AIO) and each time it only took 15 minutes to scan. If nothing else it can be used as a good baseline. This would be a marked improvement over the manual overclocking process now when it can take literally all day to dial everything in. To go along with this and the stress testing conversation, it would be cool to have an option on what your typical use case is going to be and the program adjusts system load during the test based on that preference.
     
    CAD4466HK likes this.
  13. CAD4466HK

    CAD4466HK [H]ard|Gawd

    Messages:
    1,222
    Joined:
    Jul 24, 2008
    This would be a great idea for n00bs and people who don't have the time to spend hours on a stable OC. I for one wouldn't use that feature, since I would miss out on the satisfaction gained from achieving the "on the edge" extreme OC. But that's just me. Good idea though.
     
  14. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    4,062
    Joined:
    Feb 23, 2007

    Either you need to quit buying motherboards that have shitty VRM setups with little to no cooling (via heatsinks and/or airflow), or you need to learn how to properly OC things. I have never once destroyed a mobo or had a CPU "degrade" in the 20+ years I have been doing this, and I push my chips to the ragged edge (I use full custom water setups, and have since the days we were using heater cores from a car for a rad and external pumps, along with several Peltier setups as well). You are doing something VERY VERY wrong.

    As for the "moar cores are useless for gaming", you are incorrect there as well. I went from a 5~5.2GHz all-core OC'd 3770K with dual-channel DDR3-2400 @ 2600 to a Ryzen 1600 @ 4.1GHz with dual-channel 2933 and then 3200 RAM (and later a 2700 @ 4.3GHz).

    With fresh installs on both platforms, the Ryzen system was consistently faster and "smoother" in every application. The Intel software hacks for Ivy/Sandy are terrible performance-wise, and you just do not realize how bad it is until you stop using it.

    I am not stating this as an AMD fan, you can buy a 6+ core Intel CPU for all I care, but saying that a modern 6+ core CPU is not going to give you any benefit over an 8-year-old HT'd quad core is just wrong on so many levels. Your 1% and 0.01% frametimes, which determine your "smoothness" in game, will be insanely better on a modern 6+ core CPU platform.
     
    {NG}Fidel likes this.
  15. Keljian

    Keljian Gawd

    Messages:
    582
    Joined:
    Nov 7, 2006

    So.. I've been playing with computers for 34 years.

    I've had 4 motherboards degrade over that time. Most recently the Asus Prime X370-Pro on the Ryzen platform. I wasn't even overclocking hard, I just had all cores pegged at 3.7GHz maximum on a 1700. After 2 years, the VRMs went. Ironically it has a very nice VRM setup on it.
    I have enormous case airflow (3x120s down the front) and a Noctua D15, which was blowing on them.

    Before that it was an Asus cheapy board I got as a stopgap, which I never liked. It did the job, but didn't have enough I/O for me. That was on the Ivy Bridge platform.

    The other boards to go were in the dark ages of the 90s, so they don't count..

    I currently have an Asus, but as soon as I can afford it, I'll move to Gigabyte... (running an Asus Prime Z370-P II / 9700K)


    The longest-serving board I've had is still in service. It's an ASRock B85M Pro4 hosting an i7-4790. It has been in service since about Q3 2012.


    In terms of CPUs degrading, I had a K6 degrade, but that is it.
     
    Last edited: Apr 22, 2019
    {NG}Fidel likes this.
  16. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    4,062
    Joined:
    Feb 23, 2007
    Odd, I had the X370 Pro running a 1600 @ 4.1GHz with a 130% current limit and 1.45V (I don't remember the LLC level), and it ran 24/7 for almost 8 months mining Cryptonight-based coins with no active cooling on the VRMs, since my top 3x120mm rad fans are set to exhaust. I did brick the BIOS (it was a known issue with that revision of the board), and while waiting for a replacement I purchased the X470 Prime Pro running in my sig rig in the same setup, with even more current being pulled...

    Perhaps it's just bad luck? I do wish AMD/Intel would enforce some type of minimum standard for VRM setups when they know the ODMs are going to market them as "OCing boards" etc., but that will never happen.
     
  17. Shadowed

    Shadowed Limp Gawd

    Messages:
    496
    Joined:
    Mar 21, 2018
    I have to use a really old XTU for it to show the L4 cache in my CPU. The current one doesn't list it.

    That's the only complaint I have for XTU.
     
  18. craigdt

    craigdt [H]ard|Gawd

    Messages:
    1,031
    Joined:
    Oct 27, 2016
    Please, sir, I want some more
     
    Spartacus09 likes this.
  19. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    7,311
    Joined:
    Dec 18, 2010
    How much do you want to bet the new Intel cards are no faster than an Athlon 200GE IGP? lol...

    We'll see, but I'm excited to finally see another direct competitor in the GPU marketplace.
     
  20. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    186
    Joined:
    Oct 23, 2018
    It would be pretty excellent to have an automated utility that could do a full per-core overclock. Maybe the current hardware doesn't allow something like this, but for the next gen it would be great. I'm a pro user who builds the high-compute machines I run my business from. I don't want to deal with the headaches of a traditional manual OC process. The downtime from crashing while OCing, plus the hours invested in learning the best OC approach for each platform/config, is massively more expensive than the 5-10% performance boost I would get. At the same time, it's also expensive to potentially leave 5-10% of performance on the table.

    As a pro user, I also wouldn't be looking to get the extreme OCs that today's bloggers achieve. Even just a 3% boost would effectively make my "compute year" 11 days longer.
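    (That back-of-the-envelope figure holds up: a flat 3% throughput gain over a 365-day year works out to roughly 11 extra compute-days.)

    ```python
    # Quick check on the "compute year" arithmetic: 3% more throughput over a
    # year is equivalent to about 11 extra days of compute.
    extra_days = 0.03 * 365
    print(f"{extra_days:.1f} extra compute-days per year")   # ~11.0
    ```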

    I think Nvidia has done pretty well with the OC tool that was initially rolled out to the RTX line. I didn't break OC records on any of my GPUs with that tool, but I did gain 10+% in my real world usage and that has proven to be quite valuable.
     
  21. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,418
    Joined:
    Jun 13, 2003
    To jump off of this and the rest of your post, the generation of per-core and all-core stability curves relative to core speed and thermals would be invaluable.
     
    Thunderdolt likes this.
  22. Keljian

    Keljian Gawd

    Messages:
    582
    Joined:
    Nov 7, 2006
  23. Spartacus09

    Spartacus09 Gawd

    Messages:
    977
    Joined:
    Apr 21, 2018
  24. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    186
    Joined:
    Oct 23, 2018
    That sounds awesome. Looks like my 9980XE isn't supported yet though.
     
  25. Dullard

    Dullard 2[H]4U

    Messages:
    2,207
    Joined:
    Jun 28, 2012
    Looks like this is targeted towards the consumer-grade K chips - Intel probably assumes people with XE chips can OC on their own.
     
  26. Thatguybil

    Thatguybil [H]Lite

    Messages:
    98
    Joined:
    Jan 21, 2017
    Could the core with the best performance curve be assigned as CPU 0?

    Could the second-best core be assigned as CPU 1?

    Etc.
     
  27. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,418
    Joined:
    Jun 13, 2003
    CPU core 'numbers' should already be assigned and be consistent across applications.
     
  28. Thatguybil

    Thatguybil [H]Lite

    Messages:
    98
    Joined:
    Jan 21, 2017
    I am asking whether, once XTU identifies the cores with the best OC, it can renumber the cores so the highest-priority threads are assigned to the best cores.
     
  29. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,213
    Joined:
    May 18, 1997
    I did a good bit of work with the extreme guys while at Intel, and I do not think what you are asking for is possible.
     
  30. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,213
    Joined:
    May 18, 1997
    Yes, just now for 9-series K SKUs. There are a huge number of things that go on inside IPM, and it is quite impressive. It is not just running a series of profile tests like Asus' utility does. That said, each series of chip has to go through a fairly rigorous validation process before it is added to IPM, and there are no current plans to add older SKUs to it. There are set points for the maximums it can apply in terms of voltage etc., and they are what I consider safe. To be honest, I was surprised at how hard Intel will push the CPUs.

    While I did heavily lobby for there to be a free version of IPM, it looks as though there will be a small cost to it, and that does come with insurance for chip replacement if you were to kill your CPU while using it.

    I have a ton of notes on IPM, but most of the information is classified. I would say however that IPM is the real deal.
     
    Araxie, Chimpee and IdiotInCharge like this.
  31. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,418
    Joined:
    Jun 13, 2003
    I'm actually okay with this.
     
    Araxie likes this.
  32. Thatguybil

    Thatguybil [H]Lite

    Messages:
    98
    Joined:
    Jan 21, 2017
    Thanks for checking!

    Hmmm insurance... I wonder if people will use this to get another spin at the silicon lottery or a later revision...
     
    Last edited: Jun 2, 2019
  33. Dullard

    Dullard 2[H]4U

    Messages:
    2,207
    Joined:
    Jun 28, 2012
    Intel has had the Tuning Plan protection for some time; it looks like the website is undergoing upgrades, possibly to integrate the tuning utility/tuning plan. You just sign up for the chip you have, and the cost depends on what the chip is (cheap chips, not much - the last one I bought was $35 for a 5960X, IIRC. Seems like it was $175 for the 7980XE; I didn't buy that one). It's a one-time deal, and there's a 30-day waiting period before you can use it, as in you can't fry your processor, sign up, and get a freebie right away. You have to wait 30 days.

    This info might be dated; like I said, it's been a few years since I bought it, and it looks like it's changing.
     
  34. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,213
    Joined:
    May 18, 1997
    It will be different.
     
    {NG}Fidel and IdiotInCharge like this.