
Arrow Lake 2024 (and beyond)

Can anyone confirm whether the 14900K/285K has better 0.1% lows than the 9800X3D?

I've seen the discussion about the FPS dipping issue on the 7800X3D. I wonder if AMD has completely fixed it by now.
 
AMD X3D wins every metric for gamers; there isn't actually a debate.

Don't look at GPU-bound benchmarks, and don't use random YouTubers with a pro-Intel agenda or who have no idea what they are doing.

The whole "AM-Dip" mantra is from a dishonest clown who sells bogus overclocking services to clueless victims.
 
[attachment: 9800X3D and 14900KS on Windows 10.JPG]

kalston, I caught Jufes testing the 7800X3D, 9800X3D, and 14900K/KS on Win10.
These chips are built for Win11, especially the X3Ds, so by testing them on Win10 the "Dip" inevitably happens on both sides, but it gets worse on AMD because of the chiplet design.
 
> kalston, I caught Jufes testing the 7800X3D, 9800X3D, and 14900K/KS on Win10. [...]
So what?
He probably uses memory tweaks that unlock the performance.

It seems you don't know what it takes to make everything work almost the same on all Windows versions, and you just follow the trend created by Hardware Misinformation (HU) :)
 
> It seems you don't know what it takes to make everything work almost the same on all Windows versions, and you just follow the trend created by Hardware Misinformation (HU) :)
I was talking about Jufes from Frame Chasers, not HUB, you AMclown :cry::D You respond like you don't know how to read.
 
I don't know what the "GameTurbo" power plan is; never seen it. It sounds like it may be trying some extra sauce with the scheduling, which may actually be bad for Arrow Lake. I would stick to "Balanced" or "High Performance" for now.

I think the forthcoming improvements will probably be tweaks to Windows scheduling, the CPU's internal Thread Director, and other microcode tweaks. It seems to me like the P-cores aren't being utilized correctly. I've seen some speculate it's due to mistakes in how the cache is being handled.
I've seen it happen with both modes, but I did switch back to High Performance today, because why not.
Had the same problem this morning.

> Turned on the computer, it sat for a few hours before I logged in and used it
> Tried to play CS2, FPS was ~150-250
> Restart computer
> FPS is 360+

I just don't know what's causing this. I have to restart this thing before I play any game, just to be sure.
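
If anyone wants to rule out the power plan being switched behind your back after long idle, powercfg reports the active scheme; a minimal sketch (Windows-only; the two GUIDs below are the stock Microsoft schemes, anything else is custom/OEM):

```python
# Minimal sketch: log the active Windows power plan before a gaming session,
# to rule out something (an OEM "GameTurbo" tool, sleep/resume, etc.)
# silently switching it. Windows-only; GUIDs are the stock Microsoft schemes.
import subprocess

KNOWN = {
    "381b4222-f694-41f0-9685-ff5bb260df2e": "Balanced",
    "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c": "High performance",
}

out = subprocess.run(["powercfg", "/getactivescheme"],
                     capture_output=True, text=True, check=True).stdout
print(out.strip())  # e.g. "Power Scheme GUID: 381b4222-...  (Balanced)"

guid = out.split(":")[1].split()[0]
print("recognized as:", KNOWN.get(guid.lower(), "custom/OEM scheme"))
```

Run it before and after the idle period; if the GUID changes, you've found your culprit.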
 
> The whole "AM-Dip" mantra is from a dishonest clown who sells bogus overclocking services to clueless victims. [...]

He admits to cherry picking games where AMD will "dip" because he wants to expose flaws:

https://www.youtube.com/shorts/D8S5xL6UvaE

While I think trying to find flaws in something can be a good thing, my issue with his cherry picking is that apart from COD Warzone, he's picking games that practically nobody plays in order to make his case that "INTEL IS BETTER". Like in one of his videos he was using Borderlands 2 to showcase the AMDip... seriously? Borderlands 2? Who the F still plays that, lol. Imagine taking a victory lap because Intel doesn't dip as hard in Borderlands 2. Congrats!
 
> He admits to cherry picking games where AMD will "dip" because he wants to expose flaws: [...]

Frame Chaser is definitely a character catering to a crowd. I think overall he's probably presenting decent data.

It would be interesting if a couple of larger channels did a sort of "peer review" and tried to see if they could recreate the phenomenon he shows, where AMD CPUs drop minimum frames massively in COD and Apex. It's not so much that I don't believe him; I just don't hear about it from normal gamers. And larger channels otherwise don't seem to have problems with dual-CCD CPUs in most of their games.

Yet occasionally I see a very small channel, or even a solo user on a forum somewhere, posting data supposedly showing a dual-CCD chip way behind. And I have to question that. I feel like it may be due to the issue stated by AMD themselves: if you swap different Ryzen CPUs into the same system without doing a fresh Windows install, you will see performance issues, because the chipset drivers don't properly re-apply for the new CPU, and you get core scheduling issues and problems utilizing the cache.

All that said, there definitely are a couple of games out there which consistently have scheduling issues for AMD's dual-CCD chips. And for those you can just turn off the second CCD.
 
> All that said, there definitely are a couple of games out there which consistently have scheduling issues for AMD's dual-CCD chips. And for those you can just turn off the second CCD.
https://videocardz.com/newz/amd-ryz...ing-end-of-january-3d-v-cache-only-on-one-ccd

This better not be true though. If a game utilizes just one CCD, the whole latency/stuttering/frametime-spiking AM-Dip might happen again.
 
> This better not be true though. If a game utilizes just one CCD, the whole latency/stuttering/frametime-spiking AM-Dip might happen again.
Eh, you can disable the non-V-Cache CCD if you have problems with a certain game.

And this is likely the true intent of the recent X3D Turbo (Gigabyte) and Game Mode (Asus) features, which disable the second CCD and also turn off SMT.

https://youtu.be/frb2UsrHl6s?si=sAKOPXHiI8yd-_tl
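
If you'd rather not reboot into BIOS, a software-side alternative is pinning the game to the V-Cache CCD with an affinity mask. A rough sketch, assuming CCD0 is logical processors 0-15 on an 8-core-per-CCD part with SMT, and with a hypothetical game path; the core mapping isn't guaranteed, so verify your topology first:

```python
# Rough sketch: launch a game pinned to the first CCD instead of disabling
# the second CCD in BIOS. ASSUMPTION: CCD0 = logical processors 0-15
# (8 cores + SMT); verify the mapping on your own system before relying
# on this. The game path is hypothetical.
import subprocess

CCD0_MASK = (1 << 16) - 1          # logical CPUs 0-15 -> 0xFFFF
game_exe = r"C:\Games\game.exe"    # hypothetical path

# cmd's built-in `start /affinity <hex mask>` sets affinity at launch.
subprocess.run(
    ["cmd", "/c", "start", "", "/affinity", format(CCD0_MASK, "X"), game_exe],
    check=True,
)
```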
 
> Eh, you can disable the non-V-Cache CCD if you have problems with a certain game.
Then we wouldn't want to buy it at that price; I mean, if you're going to cap its true potential just to game on it.
> And this is likely the true intent of the recent X3D Turbo (Gigabyte) and Game Mode (Asus) features, which disable the second CCD and also turn off SMT.
I think those two features are exclusive to Asus and Gigabyte; other vendors don't have them, and I'm not sure if you need to hit F12 every time you boot your PC to do that.

The 9950X3D's main selling point, and why people would prefer to get it over the 9800X3D and every top-end Intel SKU, is to get that juicy V-Cache on dual CCDs.
 
> Frame Chaser is definitely a character catering to a crowd. I think overall he's probably presenting decent data. [...]
The lowest 0.1% FPS mostly happens when the CPU is waiting for data, and Intel is faster at delivering that data in most cases (but not always, of course). So there's nothing to wonder about here.
I use RAM caching and RAM tuning to bypass this, and it works in most situations.
So Frame Chaser does a bit of marketing, but is after all correct.
_____
If you look at the latency tables with dual-CCD data, and if you note that Microsoft added a switch to turn off the second CCD for gaming, you won't need anything else to prove that for gaming the dual CCD is slower.
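
If you'd rather check the dips yourself than argue about YouTubers: tools differ slightly in their definitions, but here's a minimal sketch of how 1%/0.1% lows are commonly derived from a frametime capture, assuming a PresentMon-style CSV with a MsBetweenPresents column (the capture file name is hypothetical):

```python
# Rough sketch: derive average FPS and 1% / 0.1% lows from a frametime
# capture. Assumes a PresentMon-style CSV with a MsBetweenPresents column;
# exact definitions vary between capture tools.
import csv

def report_lows(path):
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    print(f"average: {1000.0 * len(frametimes) / sum(frametimes):.1f} FPS")

    # "x% low" here = average FPS over the slowest x% of frames.
    worst_first = sorted(frametimes, reverse=True)
    for pct in (1.0, 0.1):
        n = max(1, int(len(frametimes) * pct / 100))
        print(f"{pct}% low: {1000.0 * n / sum(worst_first[:n]):.1f} FPS")

report_lows("capture.csv")  # hypothetical capture file
```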
 
> Eh, you can disable the non-V-Cache CCD if you have problems with a certain game. [...]
The second CCD runs "slower" all the time, so doing comparison tests for it alone is... a waste of time.
 
I wonder how long Microcenter's new 265K pricing is going to stick around: $299 for a 265K, plus $70 off any Z890 board. Might just be a "Black Friday" deal. Seems like everyone is starting early this year.
 
Looks like we have the first part of the fixes:

https://www.asrock.com/news/index.asp?iD=5548


I will be testing this tonight.
I can't install it. It gives me a generic 0x800700b error.

I suspect it may require 24H2, but no sites say anything about that requirement.

I have my Windows 11 Home set up to only get 23H2 updates, via local group policy through gpedit.msc.

Also, after being on Ryzen for a year, I forgot how many different drivers Intel requires to make their platform work. And many of them you can't even update easily or directly. AMD is way better about this.
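
For anyone who wants to check the same pin on their machine: the gpedit policy ("Select the target Feature Update version") just writes a few values under HKLM, and a quick sketch can read them back (Windows-only):

```python
# Sketch: read back the feature-update pin that the gpedit policy
# ("Select the target Feature Update version") writes to the registry.
import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
    for name in ("ProductVersion", "TargetReleaseVersion",
                 "TargetReleaseVersionInfo"):
        try:
            value, _ = winreg.QueryValueEx(k, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} not set")
```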
 
I'm quite thorough when it comes to drivers and updates, and I was very thorough when I installed this system.

I have to assume that this PPM driver is already installed and the package is simply failing because it's already there. It's not a normal driver install; it's some kind of Windows package. Gigabyte's download for it actually visibly launches PowerShell.

 
On another note, ASRock released a new BIOS for the Z890 Nova ITX. It notes improved memory compatibility, but there are also several new options for memory training.
 
NVIDIA's Arm CPUs are coming in Q3 next year, btw. For those who've had enough of Intel and AMD and are looking to go full green.
 
> Also, after being on Ryzen for a year, I forgot how many different drivers Intel requires to make their platform work. [...]
I hope you are having fun with Arrow Lake, because your posts make it sound as bad as the reviewers said, and then some :D

I never had driver issues with Intel though. I always let Windows (10/11) just auto-install them and never had a hiccup. I don't remember having to fiddle much on Win 7 either; perhaps a couple of drivers for networking and audio.

For me AMD took a lot more work. There were things to patch up in the first days/months, while all my Intel builds were perfectly fine out of the box (my last Intel build is 11th gen, though).
 
Personally I wouldn't want to go through all the four or five long steps to get the most out of the 12th-14th gen K chips. The only two steps I'm comfortable with are setting the XMP RAM timings in the motherboard and double-clicking Intel APO. And if I had to do it daily, I'd just take the APO configuration alone.
 
Apologies if this was already mentioned, I did search for 1700 in this thread... has anyone seen any *recent* rumors/updates on the '12 performance core' socket 1700 version of Arrow Lake?
 
> has anyone seen any *recent* rumors/updates on the '12 performance core' socket 1700 version of Arrow Lake?
I think someone posted that it was server-only.
 
> has anyone seen any *recent* rumors/updates on the '12 performance core' socket 1700 version of Arrow Lake?
There's never been an LGA 1700 version of Arrow Lake.

I think you are referring to the rumored Bartlett Lake, which has all-P-core variants as well as hybrid variants. The hybrid variants are said to use leftover Raptor Lake and Alder Lake dies. There isn't a lot of info on the P-core-only variants; there is some minor info that they will have some sort of improvements for AI, but not as much as Arrow Lake. And ultimately it's unclear whether this is some all-new architecture fork or a sort of forward-port of Raptor Lake. But it seems clear it doesn't have anything to do with Arrow Lake, such as being a direct back-port of Arrow Lake, for example.

I think the last rumors were that the hybrid versions would release in Q1 2025 and the P-core-only versions would come much later.
 
I haven't seen anything about the all-P-core Bartlett Lake-S / Raptor Lake refresh #2 since last summer, but it seems very likely that the rebadge of the big.LITTLE models is coming early next year. It's basically just a name change to Core 2xx.

The 285K I had on order from Amazon for a few weeks finally arrived today. I was hoping to hear something about patches/fixes by now. Now I have to decide whether to open the box and build this weekend.
 
It would be interesting if the all-P-core version had super quick latency and was a fast gaming chip, lol, but I'm not holding my breath 🤭
 
I just built my system in Oct 2023 due to crazy low pricing on DDR4 and snagging (what to me is) a loaded Z790 board for $120 refurbished. Hoping I can extend its life in a few years with a Bartlett-S or whatever they call it. I really don't want the power draw or the heat of the 14-series.
 
Budget DDR4 systems have Ryzen 3 written all over them.
That's nothing compared to the level of performance you can get with the 14600-14900s. The bang-for-buck value of DDR4-3600/4000 and 660/760/690 boards is simply too tempting to turn down, really.
 
> It would be interesting if the all-P-core version had super quick latency and was a fast gaming chip, lol, but I'm not holding my breath 🤭
It's thought that the latency issues are due to the memory controller being on a separate die/tile from the CPU cores. So even if they cut out the E-cores and only have P-cores, latency would likely still be similar.

That said, I don't think the MC being separate is the real issue. But we'll see after the "fixes" are implemented.

And BTW, if we don't hear something by next week, maybe this isn't as easy as Mr. Hallock said.
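
If anyone wants to eyeball the cache-vs-DRAM gap on their own chip, here's a very rough sketch: chase a random cycle at two working-set sizes. The per-step interpreter overhead is roughly constant, so the difference between the cache-resident and DRAM-resident runs approximates the added memory latency. Crude, but directionally useful for comparing platforms:

```python
# Very rough sketch: estimate added DRAM latency by chasing a random cycle
# at two working-set sizes. Python's per-step overhead is roughly constant,
# so the *difference* between the runs is the interesting number.
import random
import time

def ns_per_step(n_elems, steps=2_000_000):
    # Build one random cycle so every access depends on the previous one.
    order = list(range(n_elems))
    random.shuffle(order)
    nxt = [0] * n_elems
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b

    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = nxt[i]
    return (time.perf_counter() - t0) / steps * 1e9

small = ns_per_step(1 << 12)  # tens of KB: cache-resident
large = ns_per_step(1 << 23)  # hundreds of MB incl. int objects: mostly DRAM
print(f"small: {small:.0f} ns/step, large: {large:.0f} ns/step")
print(f"delta (~added memory latency): {large - small:.0f} ns")
```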
 
> It's thought that the latency issues are due to the memory controller being on a separate die/tile from the CPU cores. [...]
I was considering making a new thread asking the same question about Raptor Lake, but maybe you or someone here knows the answer, so I'll ask.

Is Raptor Lake affected by latency issues depending on whether the E-cores are on or off? Or does having the E-cores on not affect Raptor Lake CPUs, because the latency is unavoidable whether the E-cores are on or off? Does it matter how many E-cores are on? Do any of the different configurations make a meaningful difference?
 
> Is Raptor Lake affected by latency issues depending on whether the E-cores are on or off? [...]
Non-issue with Raptor Lake.
 
> Hoping I can extend its life in a few years with a Bartlett-S or whatever they call it.
I bought my 12700K and Z690 combo used off Marketplace for $225 a year ago, and reused 64GB of DDR4-3200... I really have not needed more CPU yet (not a gamer)... but the incremental Windows and AI updates will ensure an upgrade is coming sooner or later, given my heavy multitasking.

I am sitting this upgrade cycle out, but waiting to see what happens next year... especially with the NVIDIA release.
 