
Intel Core Ultra 9 285K, Ultra 7 265K and Ultra 5 245K Review Roundup

With that said, AMD is worse. Why? Intel released a duck with a forced socket change. AMD released a duck without a socket change.

Now, what I mean is that AMD's 9xxx doesn't raise the bar above 7xxx, and truth be told, unless a handful of fps matters to you, not much above 5xxx even, if you're just considering gaming workloads.

So, why is a socket change a good thing? I know this sounds hideous, but the "big market" for "Ultra" or whatever it's called (thank you Intel for being stupid) will always be "corporate"... and so we'll see this new round of desktops, etc. surrounding the new "lackluster" whatever it's called (again, thanks Intel).

Perhaps, because Intel has been, let's say, problematic lately, all of this mess is simply a "distraction". Would have loved to see AMD take advantage; they did not... so here we are. Two forgettable gens from different companies. Distraction or not, Intel will generate a ton of sales around the new socket. I'd buy an AMD SKU instead, but since that's where AMD falls flat (lack of offerings), Intel rules the roost.
 
There are certain advantages to changing sockets more frequently. You inherit a lot of legacy baggage building CPUs generation after generation to be compatible with motherboards upwards of half a decade old.
 
There are certain advantages to changing sockets more frequently. You inherit a lot of legacy baggage building CPUs generation after generation to be compatible with motherboards upwards of half a decade old.
It's hard to see that... when you look at benchmarks where the 5800X3D is often besting a brand-new flagship Intel part. At least in gaming, anyway.
No doubt getting the latest and greatest extra stuff on a mobo is nice... rather than being stuck with the same NIC for 8 years. I mean, it's nice to have the OPTION. AMD has always released new mobos with each gen; you simply have the option of continuing to use the older one... sometimes at a small cost in performance. I haven't seen any real reason the socket needed to change for most of Intel's launches.
 
It's hard to see that... when you look at benchmarks where the 5800X3D is often besting a brand-new flagship Intel part. At least in gaming, anyway.
No doubt getting the latest and greatest extra stuff on a mobo is nice... rather than being stuck with the same NIC for 8 years. I mean, it's nice to have the OPTION. AMD has always released new mobos with each gen; you simply have the option of continuing to use the older one... sometimes at a small cost in performance. I haven't seen any real reason the socket needed to change for most of Intel's launches.
The issues are not necessarily performance-related, or even necessarily an explicit engineering challenge, but sometimes they do impact users. In the case of AM4+, there wasn't enough room on most BIOS ROMs to support all the CPUs that were electrically compatible with the boards. Not to mention all the cases where you'd need to keep your old CPU around to do the BIOS update before installing the new one.

These problems weren't an issue on the high end, but they were with mid-range and more budget boards.
 
Did you watch the Tech Jesus review? Intel is now drawing power from the 12-volt lines. I doubt Phoronix was properly measuring power use. I doubt many people have been measuring it properly.

This is the setup Tech Jesus used... a smart way to get proper power results:

https://www.youtube.com/watch?v=nmK1rCyKbgQ


Almost like Intel tried to hide some of that power draw to make Arrow Lake look more efficient than it actually is...
 
The issues are not necessarily performance-related, or even necessarily an explicit engineering challenge, but sometimes they do impact users. In the case of AM4+, there wasn't enough room on most BIOS ROMs to support all the CPUs that were electrically compatible with the boards. Not to mention all the cases where you'd need to keep your old CPU around to do the BIOS update before installing the new one.
With the modern price of flash memory, that issue is probably just an artifact of the past; nobody expected the socket to carry that many generations. BIOS ROM chips are twice as big now.

As for needing your old CPU to do the BIOS update, that can be fixed with BIOS flashback as well. I'm sure there are many benefits, but my brain doesn't see them as clearly, while the benefits of keeping the same socket are more obvious.

Here they went from DDR4/DDR5 support to DDR5 only: quite a new CPU, quite a new type of RAM support possible, so this feels like a fully understandable socket change. But it would be nice if it's a long-lived one, and if not, if the reasons (what it fixes or what new features it enables) are obvious enough.
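To put a rough number on the ROM-space problem, here is a back-of-envelope sketch (in Python, with purely illustrative sizes, not vendor figures) of how a 16 MB SPI flash fills up once the UEFI image has to carry AGESA and microcode for five CPU families:

# Back-of-envelope sketch of why a 16 MB AM4 BIOS ROM runs out of room.
# All sizes below are illustrative assumptions, not vendor-published figures.

MB = 1024 * 1024
rom_size = 16 * MB  # common SPI flash size on budget AM4 boards

fixed_overhead = {
    "UEFI core + drivers": 6 * MB,
    "GUI assets (fonts, images)": 3 * MB,
    "PSP / management firmware blobs": 2 * MB,
}

# Per-CPU-family support that has to live in the same image (AGESA modules,
# microcode patches) -- again, rough illustrative numbers.
per_family_support = 1.2 * MB
families = ["Zen", "Zen+", "Zen 2", "Zen 3", "Zen 3 X3D"]

used = sum(fixed_overhead.values()) + per_family_support * len(families)
print(f"Estimated usage: {used / MB:.1f} MB of {rom_size / MB:.0f} MB")
print("Fits!" if used <= rom_size else "Doesn't fit -- something has to be dropped.")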
 
It really looks to me like they are trying to replicate what they did when they moved from Netburst to the Core architecture. But this time they don't have the foundry / process node advantage. Pretty much every interview I've watched has made it clear that the emphasis here was on producing an architecture that they could grow into / expand upon, meaning there is real hope that they will refine this architecture into a truly competitive CPU in the future. One thing is for sure: the 14900K was a dead end, arguably already one step too far in terms of power consumption and heat output. There was just nothing else they could squeeze out of that architecture.

So while I'm not going to be buying one of these CPUs, I wasn't buying their 13 or 14 series CPUs either. I genuinely think that this was a good long-term move on their part.
 
Almost like Intel tried to hide some of that power draw to make Arrow Lake look more efficient than it actually is...
I am sure there is a good technical reason to draw power the way they do... maybe it allows for lower low-power states or something? I don't know. The potentially shady part might be in their own benchmarks and not letting reviewers know. It sounds like Steve got a tip from a friendly mobo maker, or he would have had no reason to check the power on the other lines. I guess people measuring power from the wall would see it anyway. People relying on software-reported CPU draw would probably miss the extra power use on those lines.
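For anyone curious what "measuring it properly" means in practice, here is a minimal sketch of the idea: add up volts times amps on every supply path into the CPU and compare that to what software reports. All the readings below are made-up example numbers, not figures from any review:

# Minimal sketch of the "measure every rail" idea: total CPU input power is
# the sum of volts * amps on each supply path, not just what the CPU's own
# sensors report. The readings below are hypothetical example values.

rail_readings = {
    # rail: (measured volts, measured amps) -- hypothetical clamp/shunt values
    "EPS12V #1":         (12.05, 10.2),
    "EPS12V #2":         (12.04,  4.8),
    "ATX 12V via board": (12.03,  3.1),  # extra draw routed through the 24-pin
}

software_reported_package_w = 165.0  # what a monitoring tool might show

measured_w = sum(v * a for v, a in rail_readings.values())
print(f"Measured at the connectors: {measured_w:.0f} W")
print(f"Software-reported package:  {software_reported_package_w:.0f} W")
print(f"Unaccounted-for draw:       {measured_w - software_reported_package_w:.0f} W")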
 
The issues are not necessarily performance-related, or even necessarily an explicit engineering challenge, but sometimes they do impact users. In the case of AM4+, there wasn't enough room on most BIOS ROMs to support all the CPUs that were electrically compatible with the boards. Not to mention all the cases where you'd need to keep your old CPU around to do the BIOS update before installing the new one.

These problems weren't an issue on the high end, but they were with mid-range and more budget boards.
Agreed, it's not like AM4 was some magical corporate workstation upgrade path. There are issues. Having said that, changing your socket almost every single generation is also mostly unnecessary. The AM4 ROM issues mostly came down to early AM4 mobo makers just not putting a big enough BIOS ROM on the board. They didn't all buy in early on to AMD becoming a high-end CPU company.
 
Oddly enough, even when HardOCP was around, the review roundups were always in the front-page news. Did erek crush your old lady in high school or something?
Just trying to point him in the right direction so people stop complaining about the other subforums dying...
What's he doing for you to step in on his defense?
 
It simply came down to cost savings on the BIOS chips, but there were plenty of Socket A boards that could only support certain chips, or S478 boards that didn't support Prescott Pentium 4s, and LGA 775 was another one that was messy when it came to compatibility.
 
Just trying to point him in the right direction so people stop complaining about the other subforums dying...
What's he doing for you to step in on his defense?
He isn't doing anything for me to step in on his defense. But seriously who is complaining about the other forums dying? Let it go, nobody cares.
 
Agreed, it's not like AM4 was some magical corporate workstation upgrade path. There are issues. Having said that, changing your socket almost every single generation is also mostly unnecessary. The AM4 ROM issues mostly came down to early AM4 mobo makers just not putting a big enough BIOS ROM on the board. They didn't all buy in early on to AMD becoming a high-end CPU company.
Yeah, the one I recall most was going from Kaby Lake to Coffee Lake. Same socket (LGA 1151), but Intel changed the pinout around, claiming they needed the extra power.
 
He isn't doing anything for me to step in on his defense. But seriously who is complaining about the other forums dying? Let it go, nobody cares.
I keep getting told they're dead when directing posts to them, by defenders like you. The last one was complaining about the Apple section being dead, yet there are tonnes of Apple posts in the news section for no reason. Some do.
 
Yeah, the one I recall most was going from Kaby Lake to Coffee Lake. Same socket (LGA 1151), but Intel changed the pinout around, claiming they needed the extra power.
Intel's argument for the pinout change was refuted when some hobbyist got an 8000-series chip running on a Z170/Z270 motherboard. I think Intel even admitted it would have worked for "most" motherboards, but because of the tolerance they allowed with those first two generations, there might have been some cheaply made boards that would have had issues with 8th gen, especially with the chips drawing the most power.

Whether that's true or not, we'll never know. I think they could have made 8th gen work with Z270 and then compiled a QVL of boards tested to work (much like what board makers do with RAM).

Either way, there's always the conspiracy theory that the board makers themselves are influencing Intel's decision to break compatibility every so many generations. Maybe there's truth there too?

To be fair, Intel has been doing this since the Sandy Bridge days. It's not like this is something new. AMD just came along and planted the idea that a motherboard's life can in fact be extended beyond 2 or 3 generations of products.
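As a purely illustrative back-of-envelope for the "extra power pins" argument, the headroom gained by repurposing reserved pins to VCC works out like this. The pin count and per-pin current rating below are assumptions, not Intel's published design-guide numbers:

# All numbers are illustrative assumptions, not published Intel figures.
amps_per_pin = 1.0      # assumed safe continuous current per LGA power pin
vcore = 1.2             # volts at load

extra_vcc_pins = 18     # assumed number of reserved pins repurposed to VCC

extra_amps = extra_vcc_pins * amps_per_pin
extra_watts = extra_amps * vcore
print(f"Extra headroom: ~{extra_amps:.0f} A, roughly {extra_watts:.0f} W at {vcore} V")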
 
I keep getting told they're dead when directing posts to them, by defenders like you. The last one was complaining about the Apple section being dead, yet there are tonnes of Apple posts in the news section for no reason. Some do.
Then report it to a mod and have it moved.
 
Being budget-focused, I actually see the AMD 9000 and Intel Core Ultra series as a positive. Not much performance gain for a high price that people will still upgrade to, and the rest of the market will see price drops on the second-to-latest generation, as well as better deals on the used market.
 
With that said, AMD is worse. Why? Intel released a duck with a forced socket change. AMD released a duck without a socket change.

Now, what I mean is that AMD's 9xxx doesn't raise the bar above 7xxx, and truth be told, unless a handful of fps matters to you, not much above 5xxx even, if you're just considering gaming workloads.

So, why is a socket change a good thing? I know this sounds hideous, but the "big market" for "Ultra" or whatever it's called (thank you Intel for being stupid) will always be "corporate"... and so we'll see this new round of desktops, etc. surrounding the new "lackluster" whatever it's called (again, thanks Intel).

Perhaps, because Intel has been, let's say, problematic lately, all of this mess is simply a "distraction". Would have loved to see AMD take advantage; they did not... so here we are. Two forgettable gens from different companies. Distraction or not, Intel will generate a ton of sales around the new socket. I'd buy an AMD SKU instead, but since that's where AMD falls flat (lack of offerings), Intel rules the roost.
I'm not so sure of that yet; both seem like they are failing pretty hard at this to me.
Maybe after a few BIOS updates on the Intel side, with a Windows update or 12, we can get a better picture.
 
I'm not so sure of that yet; both seem like they are failing pretty hard at this to me.
Maybe after a few BIOS updates on the Intel side, with a Windows update or 12, we can get a better picture.
AMD's move focuses on the tiny DIY market; Intel's move creates a slew of new SKUs to be sold to corporations.
 
OOF.

It's almost as if they were so hyperfocused on AI they forgot about traditional performance.

How on earth does it perform worse than the previous gen? That's a new one.
Stopgap. 13th and 14th gen are fucked from the burnout issues - even replacing them with working cores, it's 1- or 2-year-old hardware, with a known issue blowing up the used market and a massive RMA liability; plus you have consumers worrying whether the fix really got it all. You need SOMETHING to sell.

Much like 11th gen, this is a stopgap release. Your goal with those is to match the prior gen's performance and offer some new features on the motherboard and chipset; unfortunately they kinda fucked it up and it's a BAD stopgap release.

It will be interesting to see how the market reacts and what's out there. I'm skimming the Z790 boards now to see about deals on the 14900K, but buying into a one-year-old platform as an upgrade is ick. At the same time, I'd rather not go dual AMD, as I still like to have both, so... fuck.
 
In the case of AM4+, there wasn't enough room on most BIOS ROMs to support all the CPUs that were electrically compatible with the boards. Not to mention all the cases where you'd need to keep your old CPU around to do the BIOS update before installing the new one.

These problems weren't an issue on the high end, but they were with mid-range and more budget boards.
IIRC, it's a problem because board makers went against AMD's spec and used lower-capacity chips for the BIOS to cut costs.
 
I'm not so sure of that yet; both seem like they are failing pretty hard at this to me.
Maybe after a few BIOS updates on the Intel side, with a Windows update or 12, we can get a better picture.
This is one of the reasons the clickbait videos are problematic. Zen 5 didn't have stability issues, and even if some don't admit it, Zen 5 is an improvement. That came out during recent testing that most reviewers (intentionally) just glanced over. Remember the performance regressions in gaming?
(attached chart: Zen 4 vs Zen 5 gaming performance)

Gone. Zen 5 is similar to or higher than Zen 4. It's minuscule, to be sure, but it's not a regression. The Ultra 9, though, most definitely is, and that's with the higher-spec memory. Furthermore, reviewers ended up admitting that 24H2 is a regression for Intel generally, meaning Windows was artificially holding back the AMD processors, something most of us realized once we looked at the Linux benches being significantly faster than the usual 0-5% advantage.
 

This could be a good sign, depending on how much more silicon was spent:

(attached chart: iGPU performance)

A next jump like that and we will have 1060-level GPU performance coming with an Intel CPU...
The 8700G is not shown, if iGPU is your thing.
Cinebench efficiency will naturally be good here, as rendering seems to be its sole strong point.

Can't help but notice the gaming efficiency of the 7700X, 7900X, and 7950X on that chart, and Zen 5's efficiency in Cinebench and games.

Application relative ranking

9950x: 103.4%
285k: 100%

265k: 93.7%
9900x: 92.6%

245k: 79%
9700x: 77.8%

Price them well, and maybe they are a better choice for many outside of gaming (and with DDR5-8000 to 8800, maybe it looks good for a lot of applications), but with the gaming decline it will be rough.
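As a quick illustration of the "price them well" point, here is a small sketch that turns the relative-ranking percentages above into performance per dollar. The percentages are from the post; the prices are placeholder launch MSRPs I'm assuming, so swap in current street prices before drawing conclusions:

# Turn the application relative-ranking figures into rough perf-per-dollar.
# Ranking percentages are from the post; prices are assumed placeholder MSRPs.

relative_perf = {   # application score, 285K = 100%
    "9950X": 103.4, "285K": 100.0,
    "265K": 93.7,   "9900X": 92.6,
    "245K": 79.0,   "9700X": 77.8,
}

assumed_price = {   # USD, illustrative launch MSRPs
    "9950X": 649, "285K": 589,
    "265K": 394,  "9900X": 499,
    "245K": 309,  "9700X": 359,
}

for cpu in sorted(relative_perf, key=lambda c: relative_perf[c] / assumed_price[c], reverse=True):
    ppd = relative_perf[cpu] / assumed_price[cpu] * 100
    print(f"{cpu:>6}: {relative_perf[cpu]:6.1f}% perf, ${assumed_price[cpu]}, {ppd:.1f} perf/$ (x100)")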
The coping has begun.
 
I would argue this makes the Zen 5 design by AMD a smart move.
Well, if they were going to rethink their branch prediction and cache systems... and hopefully we find out they have rethought their X3D interconnects a bit... they picked the right time to do it. Overhaul generations often come with a bit of regression, and Zen 5 isn't a regression. Sure, in most cases it's not a big jump over the previous generation, but it never loses to the old chips (and in specific non-gaming workloads it really is 20-30% faster).

Anyway, with Intel tripping on their laces, it was perfect timing. The 9000X chips are the ultimate productivity and power-efficiency winners. We also all know there is no way the new X3D chips aren't the kings of gaming for a long time now. It's not even in question. The 3D cache also results in big efficiency gains for specific workloads as well. The big X3D chips next year are going to have some interesting productivity use cases, not only gaming.
 
Well, if they were going to rethink their branch prediction and cache systems... and hopefully we find out they have rethought their X3D interconnects a bit... they picked the right time to do it. Overhaul generations often come with a bit of regression, and Zen 5 isn't a regression. Sure, in most cases it's not a big jump over the previous generation, but it never loses to the old chips (and in specific non-gaming workloads it really is 20-30% faster).

Anyway, with Intel tripping on their laces, it was perfect timing. The 9000X chips are the ultimate productivity and power-efficiency winners. We also all know there is no way the new X3D chips aren't the kings of gaming for a long time now. It's not even in question. The 3D cache also results in big efficiency gains for specific workloads as well. The big X3D chips next year are going to have some interesting productivity use cases, not only gaming.
My thoughts exactly. Specifically, I was thinking the changes will benefit Epyc and Threadripper CPUs, and AMD can attack data-center market share.
 
My thoughts exactly. Specifically, I was thinking the changes will benefit Epyc and Threadripper CPUs, and AMD can attack data-center market share.
In the data center, Zen 5 is tearing up Intel's offerings. What is maybe frustrating about Zen 5 is that the consumer parts have the Zen 4 IO die. The Zen 5 server parts have a brand-new IO die... which is probably why in servers they are destroying Intel... in basically all workloads, sometimes by huge margins.
I don't know if AMD has said what the new “Shimada Peak” Threadrippers are going to have for IO. I hope they are using the newer IO die. If they are running the server Zen 5 IO, they are going to be monster workstation parts.
 
She also notes after the charts that some games were very inconsistent run-to-run, with as much as 10-15% differences.
That seems to be the takeaway. I am not sure this platform was fully baked. You would hope some BIOS patches or something will even it out. I wouldn't be making that bet with my own money.
This is a strange generation. The whole drawing-CPU-power-from-other-sources thing is also strange. Watching GN Steve's 12-volt line jump back and forth with load was odd. Makes me wonder if there are some honest issues with this platform down the road.
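A small sketch of why that run-to-run inconsistency matters: if the spread across repeated runs is on the order of 10-15%, a single-run gap smaller than that spread isn't telling you much. The FPS numbers here are made up for illustration:

# Compare the mean and spread of repeated benchmark runs; a difference between
# CPUs smaller than the run-to-run spread isn't a meaningful result.
from statistics import mean, pstdev

runs_cpu_a = [142, 128, 155, 131, 149]   # hypothetical FPS across 5 runs
runs_cpu_b = [138, 144, 140, 141, 143]

for name, runs in (("CPU A", runs_cpu_a), ("CPU B", runs_cpu_b)):
    m, s = mean(runs), pstdev(runs)
    print(f"{name}: mean {m:.1f} fps, spread +/- {s:.1f} ({s / m:.1%} of mean)")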
 
Is it just me, or is it annoying that gaming benchmarks are treated as the end-all, be-all performance marker for CPUs? Fine if you're rich and all you do is game, but gaming is about 20% of what I do on a computer.
Well, in addition to what others have mentioned (I'm only about 10 posts past yours so far), there's Phoronix, who does non-gaming benchmarks.
 
That seems to be the takeaway. I am not sure this platform was fully baked. You would hope some BIOS patches or something will even it out. I wouldn't be making that bet with my own money.
This is a strange generation. The whole drawing-CPU-power-from-other-sources thing is also strange. Watching GN Steve's 12-volt line jump back and forth with load was odd. Makes me wonder if there are some honest issues with this platform down the road.
They've already said it's a transitional release - they have to get it out to start building on it for the next one. The sad part is that it's yet another transitional release, after 12th gen (first DDR5, etc.) being somewhat transitional, and 13th/14th being... broken.
 
This release reminds me of the Socket 423 Pentium 4s. They were always transitional and were often outperformed by higher-end Pentium IIIs. Even the Prescott Pentium 4 "E"s were transitional, as they had deactivated EM64T extensions that weren't enabled until the LGA sockets came out.
 
This release reminds me of the Socket 423 Pentium 4s. They were always transitional and were often outperformed by higher-end Pentium IIIs. Even the Prescott Pentium 4 "E"s were transitional, as they had deactivated EM64T extensions that weren't enabled until the LGA sockets came out.

Heh, I remember that craziness. P3 at 1 GHz vs. P4 at 1.4 GHz. And the 1.7 GHz or so chips needed a new mobo months later. That led me to the Duron and then the Athlon XP.
 