
New build. Does all this look OK?

Both your current system and your planned system are grossly imbalanced between the CPU and the GPU. That can, and occasionally does, result in corrupted video renders and exports compared to a system with a lesser CPU but a more powerful GPU.
Just stop it okay?
 
Sorry. I got a bit carried away with that. I had assumed that the productivity apps utilize the GPU almost as heavily as they do the CPU. Hence my original response.

My intended response was to keep someone from paying a lot more than it's worth for a GPU that's a potato in performance by current standards. Obviously I drifted into the sort of thing mentioned above.

At any rate, to the OP:

Good luck with the build; hopefully the lack of a discrete GPU will only be temporary.
 
I've ordered everything, but there's one more thing I'm not sure about.
I decided to save up for a better GPU and use the onboard video until then. The trouble is, my two monitors both have DisplayPort inputs, but the mobo only has one DisplayPort, an HDMI, and a USB-C port. The latter is labeled "USB4 DP." I'm guessing this is just fine, but I'm so unfamiliar with USB-C that I want to make sure. If it's OK, will just about any USB-C to DisplayPort cable do, or is there something specific to look for/avoid?

Trying the iGPU before buying a 3050 is what I'd do in your shoes. The GPU market is totally nuts right now, but it should get better once the lower end models come out over the next few months.

The iGPU in Arrow Lake is quite a bit better than past Intel integrated graphics. I'm pretty sure it'll smoke a GT 710, so you should still see a pretty noticeable improvement as long as your apps work with Intel Xe graphics. It'll also be some useful testing to see whether you actually need an NV card or not. Usually apps either want Nvidia or they don't care which brand of GPU you have. If your software works with GPU acceleration on integrated Intel Xe graphics, I'd be quite surprised if AMD Radeon and Intel Arc (discrete Xe - similar core designs and the same drivers) didn't work too. If you don't need NV, a 12GB Arc B580 or 16GB RX 9060 XT are likely to be the cheapest options for a new 12GB or 16GB card, respectively, until 2027. There's a decent chance of a 12GB RTX 5060, but it'll probably be a 5060 Super and won't come until NV's usual mid-cycle refresh, so 2026.

Just a regular USB-C to DisplayPort adapter or cable should be fine. I used one with my Z890 machine while I was setting it up before moving my vid card over.
 

Well this idea is new to me - that the apps could actually use the onboard video for accelerated performance in addition to the CPU. I know nothing about it but everything I've read is imploring people to get a card. I just checked my older version of Gigapixel and its options are "Enable discrete GPU" and "Enable Intel OpenVINO." When I choose discrete GPU on my current system, it slows to a crawl. I mean like 5 times as long for processing, so it seems like it's switching it totally away from the CPU unfortunately. When I get the new OS I can upgrade the software and see if options are better. Can't test anything yet on the new build as there have been delivery issues with a couple of parts.

Thanks for the GPU insight. I'll have to keep an eye on availability and pricing.
 
When people implore you to get a discrete card, they don’t mean a 3050 6GB 😉 I’m with Zandor, a decent and modern iGPU might actually be better than a 3050, or at least close enough to last until you can get something better.
 
I just want to be sure we're talking about the same thing here. I don't need a discrete GPU for driving monitors, the iGPU is very adequate for what I do. It's purely for boosting performance in things like DxO deep prime and Gigapixel AI. Are you saying, like zandor, that it would use the onboard graphics for that in addition to the CPU?
 
Yes, you select it like any other GPU. I tried searching for Topaz results with the latest Intel iGPUs, and it's hard to find people that don't have a dGPU, so I can't compare. You can always start with the integrated one, and if you don't like the performance you're getting, compare with people's benchmarks using the GPUs you're interested in and see how much better their results are.
 
Yeah, I've had a hard time getting definitive test results. The best info I've seen is the somewhat dated list here for DeepPrime. But there are weird results, like someone with a 3050 8GB + i5 12500K getting faster times than another with a 3080 + Ryzen 9 3900X. I'm just gonna have to test it myself, but it's good to know about being able to select the onboard just like a GPU.
 
When people implore you to get a discrete card, they don’t mean a 3050 6GB 😉 I’m with Zandor, a decent and modern iGPU might actually be better than a 3050, or at least close enough to last until you can get something better.
I was thinking more that it would be a nice upgrade from a GT 710. Those things use DDR3 on a 64-bit bus. I really doubt an Arrow Lake desktop iGPU can match the computing power of a 3050. It can probably beat it in certain scenarios, though, like jobs that need a lot more VRAM. AI stuff is a good candidate. An iGPU can use a big chunk of system RAM, and while that's slower than the VRAM on a decent card, it's much faster than shuffling data back and forth over the PCIe bus. So if you throw a job that needs 16GB at an 8GB dGPU on a machine with plenty of system RAM, there's a good chance the iGPU will win... quite possibly by forfeit when the app crashes on the dGPU.
 
Build is mostly complete. It posts and memtest found no RAM errors, so there's some excitement and relief about that. But it brings me to the point I know the least about, the BIOS. The more I read, the more uncertainty. Seems every model has a slightly different BIOS and these B860s are new enough that there's not as much info as I'd like, or at least I can't find it.

The priority here is max stability. I don't want excessive heat, CPU degradation or issues with RAM. I've read several warnings that recent motherboard defaults are often set too high and people end up having issues because of it, so I'd like to know specifically how to set things to prevent that.

Apparently the 265K CPU has a default of 3900 MHz, but the BIOS is saying it's at 5200. Is that normal? What setting determines this? Gigabyte's "PerfDrive" has options of 1) Intel default settings baseline, 2) Intel default settings performance, 3) GBT default.

What about CPU voltages? It's showing p-core/e-core at 0.960. Do I need to change anything?

The RAM is DDR5-6000 CL36-36-36-96 1.35V, but in BIOS it shows 4800. Enabling XMP should boost it, but will that set it to the rated 6000, or will it vary by task and load? I guess that's technically overclocking. I'm not sure whether the benefit is more significant than the greater possibility of instability; if it's not a big deal, I'll just leave it as is. In memtest, for lowest/highest memory speed it said 40-40-40-77. What to make of that?

During memtest, the temp range was 27-47°C. I'm guessing that's pretty good.
After reading stories of long RAM training times I expected to wait a while for 64gb, but every time it boots into the BIOS quickly, including the first time. Hmmm...

There's also the question of whether to update the BIOS. At the moment I'm a little reluctant to. Not just for the usual risk reasons, but I'm going to be using Win 10 LTSC 21H2. This is an absolute priority that's far more important to me than any potential performance benefits of Win 11. That said, I'd consider updating BIOS if I knew for certain that it would not cause any issues with that version of the OS. Gigabyte has 2 updates available, F5 from Jan and F6 from Mar. The F6 version is "based on MR1 ingredients" and says "Please use Windows 11 24H2 Ent 26100.2454 or after to get optimal performance." That doesn't mean F6 will be worse for Win 10 than the current F2, does it?

I don't suppose there's any way to revert a BIOS update if I don't like it, is there? The BIOS .pdf doesn't even mention it.

I read another concerning thing about BIOS updates: "Actually with Intel, you need to update the ME Firmware before any BIOS update or you're flirting with disaster." Is this true? It looks like Gigabyte would warn you on the download page if it were an issue. On that page it just says, "please update Intel Management Engine DRIVER 2507.7.10.0". Also, in the update items, one of them is "Intel CSME version 19..." Could this actually be the needed firmware, included in the BIOS update?

Sorry for the question overload; it's just that being sure about this seems pretty important, and a wrong move could have serious consequences. I have a few BIOS screenshots if it helps.
 
You are running your RAM at 4800 MT/s and CL40 timings, which is the default JEDEC profile for that RAM. Enable XMP and you will see 6000 MT/s with that RAM.

And yes, I agree with the poster who stated that even an RTX 3050 6 GB card is a significant improvement over your old and now obsolete GT 710. But why stop there when a few more dollars can buy you so much more? Hence my original reply in this thread: don't waste your money on something that costs only slightly less but performs significantly worse than the next step up.
 
DDR5-6000 isn't pushing it with Arrow Lake, or anything else that uses DDR5 really. Enabling XMP should set it to 6000.

The BIOS is probably just picking one core and reporting its clock speed. The 3900MHz base clock for P-cores is pretty much meaningless. Base clocks for anything sorta modern rarely happen unless you disable turbo. Pretty much every CPU has an all core turbo above base clock and if it's not busy it'll downclock to save power, so you're never really running at base clock speed. Mine seem to go as low as 399MHz.
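
If you want to watch that behavior yourself, here's a minimal Python sketch using psutil (assuming you have Python and have run `pip install psutil`; how precisely the OS reports the live clock varies by platform, so treat the numbers as approximate):

```python
# Minimal sketch: watch the reported CPU clock rise and fall with load.
# Assumes psutil is installed (pip install psutil). OS clock reporting
# granularity varies, so treat the readings as approximate.
import psutil

for _ in range(10):
    load = psutil.cpu_percent(interval=1)   # 1-second average CPU load
    freq = psutil.cpu_freq()                # reported current/min/max in MHz
    print(f"load: {load:5.1f}%   clock: {freq.current:7.0f} MHz")
```

Run it on an idle desktop and you should see clocks well below base; start anything heavy and they'll jump toward turbo speeds.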

I get the long training time on this cheap B650 AM5 board that came in a Microcenter combo. No video for several minutes. I haven't seen it on my Z890 board with 64GB. The B650 board does it if I flash it or if I turn on EXPO.

In my experience with an ASRock Z890 board the ME firmware just comes with a BIOS update. One time when I flashed my board it told me it was going to update the ME firmware. Intel CSME version 19 looks like an ME firmware update to me. One of the BIOS update descriptions for my board has this line: "2. Update Intel ME version. (19.0.0.1895)". Some of the other ones just say "Update ME". So the major version matches and "CSME" sounds like an ME update.

Generally speaking you can just flash the BIOS again with an older version if you need to downgrade. BIOS flash normally resets everything to defaults. I doubt F6 will be worse with Win10. I'm guessing that bit with performance improvements involves all the changes made to try to improve Arrow Lake's gaming performance a bit. I'd go ahead and flash it to F6.
 
Thanks, I'll go ahead and enable XMP. If the RAM passed the memtest at 4800 I assume there's no need to run it again at 6000.

It sounds like you're saying the 5200 MHz is normal because turbo is on. I didn't even see that in the settings. As long as it's not one of those problematic things I've read about that are set too high by default, I'll just leave it. Or maybe it's governed by the PerfDrive settings. Any thoughts about that? The complaints have been that defaults are set for marketing reasons, and people find their temps are seriously high and panic about their coolers and such, only to finally have to adjust back to a true non-overclocked setting. I don't know what adjustment that would take, though. PerfDrive gives options of "Intel default baseline", "Intel default performance", and "GBT default". Right now, going by the orange square, it looks like it's set to Intel default performance. I'm guessing these adjust a lot more than the speed in MHz, so I'm not sure which is best. Maybe there's no way to know without experimenting, but I was hoping someone had dealt with Gigabyte's settings before and knew specifically what each would produce.

The big thing is that I'm now feeling better about updating the BIOS. I think you're right about the CSME v19 being the firmware. I didn't find anything definite about Gigabyte, but I did see that MSI includes the ME firmware with their update, and I found an Intel reference to that version number as "firmware", so I'm pretty sure it's what's needed. Surely Gigabyte would warn on the download page if it had to be separate.

About the video... Right now, this has exhausted the budget, so I'm just going to live with the onboard video for a while. It's true, if I had got a 3050 I'd probably be nagged by doubts about what could have been. Hopefully decent GPUs will be a little more affordable soon, but I'm skeptical.
 
You have to go out of your way to turn turbo off. That's a setting almost no one uses, and I've never heard of it not being on by default. You probably can turn it off, but it's likely buried in some menu under advanced settings.

The setting that causes most of the overheating trouble is power limits. Board vendors got in the habit of disabling power limits by default on Intel setups. Intel clamped down on it a bit with Arrow Lake, but they still allow them to default to 250W. That's ok if you have a decent AIO. Not ok if you have a $40 air cooler. With your setup I'd just set it for the official Intel defaults - 125W long term limit and 250W short term limit IIRC. I'm not sure what name Gigabyte gave it. I bet the default is the 250W all the time mode. I'm thinking baseline sounds like the 125W + 250W short term boost setting. No idea what "GBT default" is. Of course you can always just test it. Load up some monitoring software, fire up Prime95, and see what happens.
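
If you'd rather not set up Prime95 right away, a bare-bones load generator along these lines does the same basic job (a sketch only: plain Python math won't pull AVX-level power the way Prime95 can, so the temps you see are a floor, not a ceiling):

```python
# Bare-bones all-core load generator: start it, watch temps and package
# power in your monitoring software, and see where they settle.
# Note: plain Python math draws less power than Prime95's AVX workloads.
import multiprocessing as mp

def burn() -> None:
    x = 0.0001
    while True:                    # spin on floating-point math forever
        x = (x * x + 1.0) % 1e9

if __name__ == "__main__":
    workers = [mp.Process(target=burn, daemon=True) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    input("All cores loaded. Press Enter to stop.\n")  # daemons exit with us
```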
 
The only time I turn AMD's equivalent of Turbo off is when I'm running Topaz video upscaling, and I do it by setting a power plan that limits CPU to 99% (which automatically disables any boost). I do it because there's a tiny difference in performance, but a large difference in power and temps. 45 -> 7 Watts reported in AMD's dashboard and 90 -> 60 deg C CPU temps. A 265k looks like it's better for productivity, so it might never get as hot as my Ryzen 7700 and this might not be needed.
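
For what it's worth, that power-plan trick can be scripted on Windows too. Here's a rough sketch using the documented powercfg aliases (run it from an elevated prompt; passing 100 restores normal boost):

```python
# Rough sketch: cap Windows' "maximum processor state" at 99%, which
# disables boost clocks on both Intel and AMD. Uses the documented
# powercfg aliases; run from an elevated prompt.
import subprocess

def set_max_cpu_state(percent: int) -> None:
    for verb in ("/setacvalueindex", "/setdcvalueindex"):  # AC power / battery
        subprocess.run(
            ["powercfg", verb, "SCHEME_CURRENT",
             "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(percent)],
            check=True,
        )
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

set_max_cpu_state(99)   # set_max_cpu_state(100) restores normal boost
```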
 
With more research, it seems like even a year ago "Intel default..." had become a meaningless description because mobo manufacturers are putting different Intel defaults under the same label. Hard to believe Intel is OK with that. I guess it's time to just plunge in and experiment. I've never used anything like Prime95, but I'll look into it. As for monitoring, I have HWiNFO. Just looking at my current system, under CPU I see clock and bus speeds and a TDP, but I'm not sure where to see real-time wattage. Even 250W short term sounds like asking for trouble, no? If you're using all cores near 100% on a 2-hour video encode, isn't that a little worrying? I guess I'll have to play with the turbo too if the difference can be that great. Posts like this one and the one that follows are what make me want conservative settings.
 
Prime95 has been popular as a stress testing app for a long time. I mostly use it as a load generator to tune CPU cooling, mostly to figure out how high of a power limit a given cooling setup can handle. It just loads your CPU up to 100%, and it can pile on the AVX instructions to make sure you hit the power limits. People also use it as a stress test on overclocked CPUs to see if they're stable.

The boost settings are mostly about cooling. 125W -> 250W you need twice as much cooling, but you can exceed what the cooling can handle for a short time as long as the CPU doesn't get too hot. I'm not sure how long the time is on a 265k but on past CPUs 18 seconds was typical. It started out as a laptop thing IIRC. A heavy all-core load won't run a lot of power through a single core. P-cores will use more than e-cores of course, but you're still splitting 250W across 20 cores. That averages out to 12.5W per core, again with the caveat that P-cores will use more and E-cores will use less. A single thread load on a p-core can use quite a bit more than what one core will get in an all-core load, even if you have a "65W" chip. If you use the 125W long-term/250W short term setting a long running job like a 2 hour encode is going to be sitting there at 125W basically the entire time. I haven't actually dug around in my BIOS looking for manual PL1 and PL2 settings, but usually they're in there somewhere. That's what you'll need to do if you want to turn off the 250W boost. I'd just run it at the 125W long-term/250W short-term preset since you're looking for stability and using air cooling, but do test your cooling to make sure it doesn't overheat at 125W. Actually that's not what I'd do... but I'm using a 360mm AIO and my setup doesn't overheat at 250W during a sustained stress test.
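
To make the boost-window behavior concrete, here's a toy model in Python. The 18-second window is the "typical on past CPUs" figure mentioned above, not a confirmed 265K spec, and the per-core average deliberately ignores that P-cores draw more than E-cores:

```python
# Toy model of Intel PL1/PL2 behavior under a sustained all-core load.
# TAU is assumed from "18 seconds was typical" above, not a 265K spec.
PL1 = 125   # long-term power limit, watts
PL2 = 250   # short-term boost limit, watts
TAU = 18    # assumed boost window, seconds
CORES = 20  # 265K: 8 P-cores + 12 E-cores

def package_watts(t: float) -> float:
    """Approximate package power t seconds into a sustained load."""
    return PL2 if t < TAU else PL1

# A 2-hour encode spends virtually all of its time at PL1:
for t in (5, 30, 7200):
    w = package_watts(t)
    print(f"t={t:>5}s  package={w}W  naive avg/core={w / CORES:.1f}W")
```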
 
If you use the 125W long-term/250W short term setting a long running job like a 2 hour encode is going to be sitting there at 125W basically the entire time.
This part is good to know. I'm definitely not knowledgeable about what different loads demand of P/E-cores and wattage. I'll experiment and monitor temps at first.
 
I'll experiment and monitor temps at first.
I highly recommend doing that. Power limits should be tuned to what your cooling can handle. Max sustained wattage is generally whatever you set it to on a recent-ish Intel build, though given your preference for stability I'd say start by figuring out which preset is the Intel stock 125W/250W boost preset, check to make sure your cooling can handle it, and just go with that if it's ok.
 
echter,

For your mobo the official Intel “stock” settings are in your board’s “Intel default settings performance” profile. The “Intel default settings baseline” profile restricts the PL2 to only 177 watts.
 
I updated the BIOS and everything seemed to go well until the last reboot and it got stuck on a "UEFI Interactive Shell" screen that said "Press ESC in 1 seconds to skip startup.nsh or any other key to continue." ESC did not work. A search suggested removing the flash drive and restarting. I did and it booted into the BIOS. Everything looked fine, BIOS shows the updated version. But now when I choose Save and Exit, it just reboots back into the BIOS. Is this normal when there's no OS? Seems like before the update, it actually shut the PC off. I don't want to try installing the OS until I know everything's OK.

Edit: I just compared a screenshot of the earlier BIOS to this one. Previously under Boot Sequence it said "Windows Boot Manager" but now the updated BIOS says "No Bootable Device Found."

And thanks for the info E4g1e, I'll leave it at Intel default performance.
 
Is this normal when there's no OS? Seems like before the update, it actually shut the PC off. I don't want to try installing the OS until I know everything's OK.
Yes, that's normal. "Save and Exit" is really "Save and Reboot". Shutting off almost sounds like a bug. B860 hasn't been out that long so it wouldn't surprise me at all if your board shipped with a BIOS that was a little janky. The AM5 machine I built a couple weeks ago rebooted into the BIOS before I installed Windows, and I'm pretty sure my Z890 board did too.
 