
New build. Does all this look OK?

Both your current system and your planned system are grossly imbalanced between CPU and GPU performance. That can, and occasionally does, result in corrupted video renders and exports compared to a system with a lesser CPU but a more powerful GPU.
Just stop it, okay?
 
Sorry. I got a bit carried away with that. I had assumed that the productivity apps utilize the GPU almost as much as they do the CPU. Hence my original response.

My intended response was to keep someone from paying a lot more than it's worth for a GPU that's a potato by current standards. Obviously I drifted into the sort of thing mentioned above.

At any rate, to the OP:

Good luck with the build; hopefully the lack of a discrete GPU will only be temporary.
 
I've ordered everything, but there's one more thing I'm not sure about.
I decided to save up for a better GPU and use the onboard video until then. The trouble is, my two monitors both have DisplayPort inputs, but the mobo only has one DisplayPort, an HDMI, and a USB-C port. The latter is labeled "USB4 DP." I'm guessing this is just fine, but I'm so unfamiliar with USB-C that I want to make sure. If it's OK, will just about any USB-C to DisplayPort cable do, or is there something specific to look for or avoid?

Trying the iGPU before buying a 3050 is what I'd do in your shoes. The GPU market is totally nuts right now, but it should get better once the lower end models come out over the next few months.

The iGPU in Arrow Lake is quite a bit better than past Intel integrated graphics. I'm pretty sure it'll smoke a GT 710, so you should still see a pretty noticeable improvement as long as your apps work with Intel Xe graphics. It's also useful testing to see whether you actually need an NV card: usually apps either want Nvidia specifically or don't care which brand of GPU you have. If your software works with GPU acceleration on integrated Intel Xe graphics, I'd be quite surprised if AMD Radeon and Intel Arc (discrete Xe: similar core designs and the same drivers) didn't work too.

If you don't need NV, a 12GB Arc B580 or a 16GB RX 9060 XT are likely to be the cheapest new 12GB and 16GB cards, respectively, until 2027. There's a decent chance of a 12GB RTX 5060, but it'll probably be a 5060 Super and won't arrive until Nvidia's usual mid-cycle refresh, so 2026.
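If you want a quick sanity check that the iGPU shows up as a compute device before you commit to anything, OpenVINO's Python package can list what it sees. A minimal sketch, assuming a recent openvino pip package (2023.1 or later) and Intel graphics drivers are installed:

```python
# Minimal sketch: list the compute devices OpenVINO can see.
# Assumes the "openvino" pip package (2023.1+) and Intel graphics
# drivers are installed; an Arrow Lake iGPU should appear as "GPU"
# alongside "CPU".
from openvino import Core

core = Core()
for device in core.available_devices:
    print(device, "->", core.get_property(device, "FULL_DEVICE_NAME"))
```

If "GPU" is in that list, OpenVINO-based apps can at least see the iGPU; whether a particular app actually exposes it is up to the app.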

Just a regular USB-C to DisplayPort adapter or cable should be fine. I used one with my Z890 machine while I was setting it up before moving my vid card over.
 

Well, this idea is new to me: that the apps could actually use the onboard video for acceleration in addition to the CPU. I know nothing about it, but everything I've read implores people to get a card. I just checked my older version of Gigapixel, and its options are "Enable discrete GPU" and "Enable Intel OpenVINO." When I choose discrete GPU on my current system, it slows to a crawl, like five times as long per job, so it seems to shift the work entirely away from the CPU, unfortunately. Once I'm on the new OS I can upgrade the software and see if the options are better. I can't test anything on the new build yet, as there have been delivery issues with a couple of parts.

Thanks for the GPU insight. I'll have to keep an eye on availability and pricing.
 
When people implore you to get a discrete card, they don't mean a 3050 6GB 😉 I'm with Zandor: a decent, modern iGPU might actually be better than a 3050, or at least close enough to last until you can get something better.
 
I just want to be sure we're talking about the same thing here. I don't need a discrete GPU for driving monitors; the iGPU is perfectly adequate for what I do. It's purely for boosting performance in things like DxO DeepPRIME and Gigapixel AI. Are you saying, like Zandor, that it would use the onboard graphics for that in addition to the CPU?
 
Yes, you select it like any other GPU. I tried searching for Topaz results with the latest Intel iGPUs, and it's hard to find people who don't have a dGPU, so I can't compare. You can always start with the integrated one, and if you don't like the performance you're getting, compare against people's benchmarks with the GPUs you're interested in and see how much better their results are.
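For apps built on OpenVINO, "select it like any other GPU" literally means naming a device when the model is compiled. A rough illustration; "model.xml" here is a hypothetical placeholder, not anything a real app ships under that name:

```python
# Sketch of explicit device selection in OpenVINO. "model.xml" is a
# hypothetical placeholder for whatever network the app bundles.
from openvino import Core

core = Core()
model = core.read_model("model.xml")

on_cpu = core.compile_model(model, "CPU")
on_gpu = core.compile_model(model, "GPU")  # the iGPU when no dGPU is installed
# With both an iGPU and a dGPU present, they enumerate as "GPU.0", "GPU.1".
```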
 
Yeah, I've had a hard time getting definitive test results. The best info I've seen is the somewhat dated DeepPRIME list here. But there are weird results, like someone with a 3050 8GB + i5 12500k getting faster times than someone else with a 3080 + Ryzen 9 3900X. I'm just gonna have to test it myself, but it's good to know the onboard graphics can be selected just like a discrete GPU.
 
On the iGPU maybe beating a 3050: I was thinking more that it would be a nice upgrade from a GT 710. Those things use DDR3 on a 64-bit bus. I really doubt an Arrow Lake desktop iGPU can match the raw compute of a 3050, but it can probably beat it in certain scenarios, like jobs that need a lot more VRAM. AI stuff is a good candidate. The iGPU can use a big chunk of system RAM, and while that's slower than the VRAM on a decent card, it's much faster than shuffling data back and forth over the PCIe bus. So if you throw a job that needs 16GB at an 8GB dGPU on a machine with plenty of system RAM, there's a good chance the iGPU will win... quite possibly by forfeit when the app crashes on the dGPU.
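Some rough numbers to put that in perspective; these are approximate spec-sheet bandwidths, not measurements:

```python
# Back-of-envelope comparison (approximate spec-sheet numbers, not
# measurements): why an iGPU reading system RAM directly can beat a
# dGPU that has to page its working set over PCIe.
job_gb = 16.0        # working set that doesn't fit in 8GB of VRAM

pcie4_x16 = 32.0     # GB/s, PCIe 4.0 x16, one direction, theoretical
ddr5_dual = 89.6     # GB/s, dual-channel DDR5-5600, theoretical
gddr6_3050 = 224.0   # GB/s, RTX 3050 8GB memory bandwidth
# Note: a 3050 is actually a PCIe 4.0 x8 card, so its paging path is
# closer to 16 GB/s, which makes the gap even wider.

print(f"dGPU paging over PCIe:      {job_gb / pcie4_x16:.2f} s per full pass")
print(f"iGPU from system RAM:       {job_gb / ddr5_dual:.2f} s per full pass")
print(f"dGPU from VRAM (if it fit): {job_gb / gddr6_3050:.2f} s per full pass")
```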
 