AI-based system: what to look for?

philb2

Just occurred to me that my next upgrade should have a decent amount of local processing power. I am a heavy Adobe Lightroom and Photoshop user, and Adobe has been steadily adding AI-based features for processing images. I shoot RAW, for the quality and flexibility of the resulting "developed" images. I know I will need to upgrade my GPU card, since Adobe has been using that for AI. However, with the arrival of CPUs with built-in AI hardware, I expect that Adobe will also depend on the CPU for some AI-based processing.

So what should I look for (wait for) in my next major upgrade? Assume that I will upgrade within 2 years. If it matters, I have a 4-fan AIO cooler for my CPU, which is an AMD Ryzen 9 7900x.
 
Assume that I will upgrade within 2 years.
Things are moving so fast that it's probably better to look at it at the last minute. (Right now a lot of it is done via the cloud and doesn't use the local machine, using Adobe Firefly's cloud-based AI, I think?)

Chances are that on a desktop that can feed a 450-watt discrete GPU, that will be the way to go, but things could change.
 
Things are moving so fast that it's probably better to look at it at the last minute. (Right now a lot of it is done via the cloud, using Adobe Firefly's cloud-based AI?)
Moving so fast, I agree. But at some point, the normal product life cycle will resume. (I am assuming.) If I am assuming right, then are we 12 months from that point? 18? 24? More than 24?

Chances are that on a desktop that can feed a 450-watt discrete GPU, that will be the way to go, but things could change.

Yeah, I may need a bigger PSU. Right now, I'm rocking 850 watts of Seasonic.
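As a rough sanity check on that, here's a back-of-the-envelope power-budget sketch. All the component wattages are illustrative assumptions (the GPU figure comes from the 450 W mentioned above; the CPU figure is the 7900X's nominal TDP; the rest is a rough allowance), not measured draws:

```python
# Rough PSU headroom estimate -- all figures are illustrative assumptions.
gpu_w = 450    # hypothetical high-end discrete GPU board power (from the post above)
cpu_w = 170    # Ryzen 9 7900X nominal TDP
rest_w = 100   # rough allowance: motherboard, RAM, drives, fans, AIO pump

peak_w = gpu_w + cpu_w + rest_w          # estimated sustained peak draw
recommended_psu_w = peak_w * 1.2         # ~20% headroom for transient spikes

print(peak_w)             # 720
print(recommended_psu_w)  # 864.0
```

Under those assumptions, an 850 W unit lands just under the ~20% headroom line, so a bigger PSU is plausible but not obviously mandatory; actual transient behavior varies a lot by GPU model.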
 
the normal product life cycle will resume. (I am assuming.) If I am assuming right,
Yes, it must either disappear and become cloud-based, or become more stable (like 3D, sound, etc. did before; it could happen faster than ever, considering the amount of money being spent).

My guess is that in 12 months we'll have a full third generation of NPUs in the wild and an idea of the software side. One possible (almost certain) reason they made ChatGPT-4o free to use could be that it costs way less to run than ChatGPT 3 and the previous 4, making those too costly to run and obsolete overnight.

What a 15-watt NPU can do will change dramatically and fast. Still, on a desktop it will be hard to beat what those big discrete GPUs can do in that domain. It is hard to imagine (for Adobe workloads, where the cloud is often not an issue) that the latency added by using the GPU instead of the CPU will be a problem, or that dGPUs will not stay orders of magnitude stronger, considering the power budget and Nvidia's position in the field.
 
Focus on the GPU for now. There isn't any Photoshop-usable "AI" in current CPUs.
 