Intel Core Ultra 9 285K, Ultra 7 265K and Ultra 5 245K Review Roundup

erek



Source: https://videocardz.com/188838/intel-core-ultra-9-285k-ultra-7-265k-and-ultra-5-245k-review-roundup


"Oh boy this is a massacre, 7800x3d just killed Intel 200s crap" https://www.techpowerup.com/forums/threads/intel-core-ultra-9-285k.327799/
 
Yikes, not ready for primetime. Some of the outlier benchmarks are pretty solid though (like Cinebench), which makes me think these might look a lot better with a few months of BIOS/microcode fixes in place.
 
Intel stock is up slightly on the day, hot off the news of today's release.

 
Yikes, not ready for primetime. Some of the outlier benchmarks are pretty solid though (like Cinebench), which makes me think these might look a lot better with a few months of BIOS/microcode fixes in place.
Is it worse than Bulldozer yet?
 
Excuse me while I go buy this $700 Z890 Asus board...oh, wait, maybe not.
Exactly, that sort of pricing is an immediate deal-breaker. It's fine to be the underdog in the desktop segment so long as you can be competitive in price/performance.

We could argue the board partners are responsible for making the buy-in on Z890 + 285K such a bad value, but Intel knew what the performance was before today. Intel should have been much more aggressive with the CPU pricing out of the gate.
 
Is it just me, or is it annoying that gaming benchmarks are the end-all be-all performance marker for CPUs? Fine if you're rich and all you do is game, but gaming is about 20% of what I do on a computer.
I do agree, but gaming is what usually sells a lot of DIY computers. For what you do, most sites recommend a 7950X or 9950X anyway.

The Ultra 9 285K is just overpriced; it's hard to recommend it for any desktop use anyway.
 
Is it just me, or is it annoying that gaming benchmarks are the end-all be-all performance marker for CPUs? Fine if you're rich and all you do is game, but gaming is about 20% of what I do on a computer.
It is not like Geekbench, Cinebench, PassMark and other benchmarks are not extremely popular performance markers for CPUs; I would say more so than 3DMark or other single numbers regarding games.

Some reviews have almost no games at all (they did regress on Linux as well), aimed at work computers:
https://www.phoronix.com/review/intel-core-ultra-9-285k-linux
When taking the geometric mean of nearly 400 benchmarks tested across these processors, the Core Ultra 9 285K was 12% faster than the Core i9 14900K overall. Or about 14% faster when switching to DDR5-8000 memory.
The generational Raptor Lake to Arrow Lake performance is more impressive when factoring in power use. The Core Ultra 9 285K on average was at around 136 Watts during the entire span of workloads tested, right in line with the average of 137 Watts on the Ryzen 9 9950X and much lower than the 156 Watt average with the Core i9 14900K. The Core Ultra 9 285K did have a peak recorded power use at 248 Watts, above the 201 Watts found with the Ryzen 9 9950X but at least much lower than the 347 Watt peak with the i9-14900K.
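For anyone curious how that single number gets built: a geometric mean multiplies the per-benchmark speedups and takes the nth root, so one lopsided result can't dominate the average. A quick sketch with made-up ratios (illustrative only, not Phoronix's actual data):

```python
import math

# Hypothetical per-benchmark speedups of the 285K over the 14900K
# (made-up numbers for illustration, not Phoronix's results).
ratios = [1.25, 0.95, 1.40, 1.05, 1.02]

# Geometric mean: nth root of the product of n ratios.
geomean = math.prod(ratios) ** (1 / len(ratios))
print(f"geomean speedup: {geomean:.2f}x")  # ~1.12x, i.e. "12% faster"
```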


Some channels are gaming channels, like HUB, so a lot of focus will go there.

The 285K keeping up with the 9950X this much is impressive (performance and efficiency).
 
Yikes, not ready for primetime. Some of the outlier benchmarks are pretty solid though (like Cinebench), which makes me think these might look a lot better with a few months of BIOS/microcode fixes in place.
Could be structural, where workloads hurt by higher latency will not improve the way a typical non-real-time render or compilation task can.

Some latencies are up:
https://www.tomshardware.com/pc-components/cpus/intel-core-ultra-9-285k-cpu-review

Maybe it can get better, or maybe it is the cost of having the compute tile on a different die than the I/O tile. The extreme cases, slower than a 12600K, could be fixed, but the general trend staying forever would not surprise me.
 
This could be a good sign, depending on how much more silicon was spent:

[chart: iGPU relative performance, 1920x1080]


Another jump like that and we will have a 1060-level GPU coming with an Intel CPU...

This is not bad:
[charts: multithread efficiency and gaming efficiency]
Zen 5 efficiency in Cinebench and games.

Application relative ranking

9950x: 103.4%
285k: 100%

265k: 93.7%
9900x: 92.6%

245k: 79%
9700x: 77.8%

Price them well, and maybe they are a better choice for many buyers outside gaming (and with DDR5-8000/8800 they may look good in a lot of applications), but with the gaming decline it will be rough.
 
Is it just me, or is it annoying that gaming benchmarks are the end-all be-all performance marker for CPUs? Fine if you're rich and all you do is game, but gaming is about 20% of what I do on a computer.
Ya, this is why I avoid most review sites these days. Sure, it is a massive market, and likely what most people buying and building use their rigs for, but there are others who utilise their systems for other tasks. I would love to see reviews that included running virtualisation software like KVM/VMware Workstation, Linux-related benchmarks and such too... but I know it is such a small niche market for the time and effort it would take...

Mmm, time for us to start up Hard reviews again and cater to things other than Windows gaming?
 
Give Intel some time, they'll get their mojo back.
Sure, once they ditch this entire architecture...

This is kind of reminding me of the days when AMD took off with the 64 chips and left Intel in the dust because they sat pretty on their throne for too long and didn't actually innovate... and then the Core 2 Duo line came along and just owned everything.
 


Seems like Minimum Viable Product is the process these companies follow. When review sites can find so many flaws and issues right off the bat, it makes it seem like Intel/AMD do the bare minimum testing of their products for release and just figure "meh, ship it, we will try to fix it later with BIOS updates / blame the mobo makers / insert excuse here".
 
My last AMD system was the XP1800, over 20 years ago. I worked at AMD for 2 years and still ran Intel/NV.

If the 9800X3D is at least as good as the 7800X3D, it's an easy sell.
 
Sure, once they ditch this entire architecture...

This is kind of reminding me of the days when AMD took off with the 64 chips and left Intel in the dust because they sat pretty on their throne for too long and didn't actually innovate... and then the Core 2 Duo line came along and just owned everything.
They should have left HT on for these CPUs, but that would have increased power usage by quite a bit. At least then they would have won a lot of the production workload tests. Just my $0.02.
 
It doesn't work with Easy Anti-Cheat...
Welcome to the Linux experience, Windows users. Hahaha

In all seriousness, more proof that EAC and anti-cheats like it are evil; hooking into the kernel is terrible.
 
I would love to see reviews that included running virtualisation software like KVM/VMware Workstation, Linux-related benchmarks and such too... but I know it is such a small niche market for the time and effort it would take...
Linux benchmarks, you should be able to find a bunch:
https://www.phoronix.com/review/intel-core-ultra-9-285k-linux

They should have left HT on for these CPUs, but that would have increased power usage by quite a bit. At least then they would have won a lot of the production workload tests. Just my $0.02.

One big issue seems to be latency; wouldn't HT make that even worse? And considering by how much the 285K beats the 14900K/9950X in Cinebench and other tasks where MT benefits a lot, I am not even sure it is really "off" or whether the new design does something quite similar. Regardless, tasks where HT shines do not seem to be a top issue for these CPUs.
 
Looking at Gamers Nexus' power draw charts now, as he properly hooked up the 12 volt lines; apparently Intel is now drawing CPU power from there. Jesus, this is really bad for Intel. Looking at MIPS per watt in zip compression, the 285 is only 25% better than the 14900... which would be great if the 14900 wasn't so terrible in this metric. In that metric the 285 is pushing 1184 MIPS at 161 watts... the 7950X is pushing 1936 MIPS at 133 watts. The one thing Intel said they fixed... eh, sort of. If you care about power efficiency, AMD is not just dominant, it's not even a contest.
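Rough math on those numbers, just dividing the chart figures (eyeballed from GN's chart, so treat them as approximate):

```python
# (MIPS, watts) in the zip workload, as read off GN's chart above.
results = {
    "285K":  (1184, 161),
    "7950X": (1936, 133),
}
for cpu, (mips, watts) in results.items():
    print(f"{cpu}: {mips / watts:.1f} MIPS/W")
# 285K:  ~7.4 MIPS/W
# 7950X: ~14.6 MIPS/W, roughly double the perf-per-watt
```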
 
Looking at Gamers Nexus' power draw charts now, as he properly hooked up the 12 volt lines; apparently Intel is now drawing CPU power from there. Jesus, this is really bad for Intel. Looking at MIPS per watt in zip compression, the 285 is only 25% better than the 14900... which would be great if the 14900 wasn't so terrible in this metric. In that metric the 285 is pushing 1184 MIPS at 161 watts... the 7950X is pushing 1936 MIPS at 133 watts. The one thing Intel said they fixed... eh, sort of. If you care about power efficiency, AMD is not just dominant, it's not even a contest.
Typo, meant for decompression? They look good for compression; the 285K has more MIPS per watt than the 7950X or 9900X in compression and does 56% better than the 14900K.

On Phoronix they are quite similar efficiency-wise to Zen 5; on TPU, the 47-application average for the 285K is 132 watts, the 9950X is 135 watts, and the 285K is not that much slower. They are also more efficient in games, but that could be because they are starved latency-wise.

The 265K, for example, is a bit faster than the 9900X in the application average (margin of error; the 9900X scores 98.9% of the 265K score) while consuming on average 5 fewer watts.
 
It doesn't work with Easy Anti-Cheat...
Welcome to the Linux experience, Windows users. Hahaha

In all seriousness, more proof that EAC and anti-cheats like it are evil; hooking into the kernel is terrible.
You do wonder, with MS working to change kernel-level access due to the CrowdStrike issue, how this might affect DRM rootkits/spyware...
 
Typo, meant for decompression? They look good for compression; the 285K has more MIPS per watt than the 7950X or 9900X in compression and does 56% better.

On Phoronix they are quite similar efficiency-wise to Zen 5; on TPU, the 47-application average for the 285K is 132 watts, the 9950X is 135 watts, and the 285K is not that much slower. They are also more efficient in games, but that could be because they are starved latency-wise.

The 265K, for example, is a bit faster than the 9900X in the application average (margin of error; the 9900X scores 98.9% of the 265K score) while consuming on average 5 fewer watts.
Did you watch the Tech Jesus review? Intel is now drawing power from the 12 volt lines. I doubt Phoronix was properly measuring power use; I doubt many people have been measuring it properly.

This is the setup Tech Jesus used... a smart way to get proper power results:

https://www.youtube.com/watch?v=nmK1rCyKbgQ

[screenshots: Gamers Nexus power/efficiency charts]
 
Did you watch the Tech Jesus review?
Yes, that is where the compression-looking-good comment comes from (as you see in the screenshot, the 285K is more efficient than the 9900X/9950X and so on down the line).

Phoronix saw a peak power usage of 248 watts. If Phoronix uses less precise software values, I imagine where the power comes from does not matter to them. From their docs they seem to rely on the system being able to monitor it; I imagine most do it that way?

It also looks good from people who track total system power.
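For what it's worth, a minimal sketch of the software-side reading those tools rely on, via Linux's RAPL powercap interface (the intel-rapl:0 package zone path is an assumption; check the zone's name file on your box). Since the CPU reports its own energy, anything drawn outside the monitored domains is invisible, which is exactly the concern with measuring the 12 V rails externally:

```python
import time

# Package-domain energy counter exposed by the powercap framework.
# This is the usual path on Intel Linux boxes, but verify locally.
ZONE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_uj() -> int:
    with open(ZONE) as f:
        return int(f.read())

# Average package power over a one-second window.
e0, t0 = read_uj(), time.perf_counter()
time.sleep(1.0)
e1, t1 = read_uj(), time.perf_counter()

# energy_uj is cumulative microjoules and wraps; a real tool handles
# rollover using max_energy_range_uj from the same directory.
watts = (e1 - e0) / 1e6 / (t1 - t0)
print(f"~{watts:.1f} W package power (self-reported by the CPU)")
```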
 
The FPS Review guys are usually good at finding the diamond in the rough.

Not this time around, it seems.
 
It's almost as if they were so hyperfocused on AI they forgot about traditional performance.
It only has the old, long-existing laptop Meteor Lake NPU in there (only 13 TOPS) and does not do particularly well in AI workloads:
https://www.phoronix.com/review/intel-core-ultra-9-285k-linux/16

I doubt that is a big part of what is going on.

How on earth does it perform worse than previous gen? That's a new one.
The big money-saving step was to go with tiles, to do something a bit like AMD GPUs: they have the compute tile on TSMC N3 and the I/O-memory controller on a physically different tile on TSMC N6. This introduces a bunch of latency that hurts latency-sensitive workloads, like games.
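If you want to feel that latency directly, here is a crude pointer-chase sketch: each load depends on the previous one, so the loop time tracks round-trip memory latency. Python interpreter overhead inflates the absolute number (a C version would be closer to the true figure), so treat it as illustrative only:

```python
import random
import time

N = 1 << 22  # ~4M entries, well past L3, so the chase hits DRAM

# Sattolo's algorithm builds a single-cycle permutation, so the
# chase never falls into a short loop that fits in cache.
perm = list(range(N))
for k in range(N - 1, 0, -1):
    j = random.randrange(k)
    perm[k], perm[j] = perm[j], perm[k]

i, steps = 0, 1_000_000
t0 = time.perf_counter()
for _ in range(steps):
    i = perm[i]  # dependent load chain: no overlap, no prefetch help
t1 = time.perf_counter()
print(f"~{(t1 - t0) / steps * 1e9:.0f} ns per dependent access")
```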
 