Will Alder Lake be good enough to compete with Zen 3 XT and its 3D cache?

In general yes, but not when talking about becoming obsolete fast or not. I took obsolete as meaning:
: no longer in use or no longer useful
https://www.merriam-webster.com/dictionary/obsolete

Which sounds really strange to say about something like a 2500K: that it became obsolete almost at launch (I am still using my 4c/4t chip of that era in a server right now).

Looking at benchmarks between a 2500K and a 2600K in games of the time, I am still not convinced.

P.S. Not using CAD, but making CAD software: multithreaded performance changed things by a giant, ridiculous factor (I would imagine as much as 6 to 11 times faster for some things) when he went to a 3700X. The point was just that it was a CPU that lived for about 10 years, even for people who make a living on their PC, which is a historical all-time high, not short-lived.

No one was programming in 2000 on a 1990 CPU, even a high-end one; it was a particularly long-lived generation of processors.
I can still dig a house foundation with shovels; does that mean they aren't obsolete and we should stop using hydraulic diggers?
 
I can still dig a house foundation with shovels; does that mean they aren't obsolete and we should stop using hydraulic diggers?
Yes, literally, according to the Webster definition above. Many tools more than a thousand years old never became obsolete, like manual hammers, shovels, and wheelbarrows, even if there is something better in some scenarios. A good shovel will still be useful and used for a long time.

I am not sure why you say that something not being obsolete means we should stop using the stuff that came after. If vinyl records are used enough, they are not obsolete (and vice versa); it is almost a value-judgment-free word, a do-people-use-it-or-not word.

Velcro didn't make laced shoes obsolete, and digital watches did not make analogue watches obsolete. By that I do not mean the former should not be used, and I am making zero judgment about relative quality; it is only about whether people continue to use something or not. I would argue that the 2500K-3500K type of 4c/4t CPU had a very long life (I am still using one here on my Plex machine, and on a laptop for game streaming and other browsing and TV-streaming affairs as well) under any metric of CPU life expectancy we can think of in the history of computing. It is extremely strange to me to say they were obsolete almost on launch; they are still being used by a giant number of people 10 years in, something that is, I think, unprecedented, but could have become the new norm.
 
Yes, literally, according to the Webster definition above. Many tools more than a thousand years old never became obsolete, like manual hammers, shovels, and wheelbarrows, even if there is something better in some scenarios. A good shovel will still be useful and used for a long time.

I am not sure why you say that something not being obsolete means we should stop using the stuff that came after. If vinyl records are used enough, they are not obsolete (and vice versa); it is almost a value-judgment-free word, a do-people-use-it-or-not word.

Velcro didn't make laced shoes obsolete, and digital watches did not make analogue watches obsolete. By that I do not mean the former should not be used, and I am making zero judgment about relative quality; it is only about whether people continue to use something or not. I would argue that the 2500K-3500K type of 4c/4t CPU had a very long life (I am still using one here on my Plex machine, and on a laptop for game streaming and other browsing and TV-streaming affairs as well) under any metric of CPU life expectancy we can think of in the history of computing. It is extremely strange to me to say they were obsolete almost on launch; they are still being used by a giant number of people 10 years in, something that is, I think, unprecedented, but could have become the new norm.
From Oxford: no longer produced or used; out of date.
Shovels are obsolete when talking about digging large holes, just as Haswell i5s were for gaming almost at launch.
These items are both outdated, which meets the Oxford definition. Can we now proceed to argue about how my dictionary is better than your dictionary?

I'll add, digital watches did make analogue watches obsolete. Even with an analogue dial, what do you think is inside that thing that requires a battery? Just because someone may still have and use, or even make, true analogue watches doesn't make them any less obsolete.
It is perfectly acceptable to use obsolete equipment. You are misreading the definition.
 
I could be misreading everything, but weren't some of those among the longest-used, longest-relevant CPUs in history? I have a colleague who replaced his Core i5 4c/4t 2xxx at home this year, having done CAD development on it. Not sure about the split on logical CPUs, but 4 CPUs is still the most common count on the Steam hardware survey.
You also need to interpret the numbers on the Steam survey; a lot of the machines on it are laptops, or machines that are rarely or never used for modern games.

My dad uses Steam and has an oldish i5 4c/4t CPU on a Z97 mobo that he plays King's Bounty: The Legend on, and that's it.
 
I think a lot of folks have pointed to the past as we look to the future. My hope is that Intel has learned from it, much like AMD has. Competition is good for all of us, so we want Intel to have some success here; it's in our collective best interest.

I agree with having reservations, though. There is a lot of change on the horizon, especially a new operating system in Windows 11, and we all know the history there as well with Vista, Me, etc.

I'm willing to bet that, much like me, all of you here have been burned by early adoption. These companies love to "beta" test their products in production, they really do. There's a lot of value in it for them; unfortunately, pain for us.

Based on that, I'm waiting it out this round.
 
With 4c8t it was a different story. The 4c4t processors were almost immediately obsolete, yet synthetic benchmarks looked great.
Where the heck did this come from? 4C4T was the undisputed gaming CPU for three generations (Sandy Bridge to Haswell) and completely acceptable for another two (Skylake and Kaby Lake). It wasn't until the latest crop of heavily threaded game engines that you needed a 6T CPU, and honestly a lot of that is probably because of threading overhead (the engine 'expects' 8 console hardware threads).
Now, I completely agree in retrospect that 4C4T was a mistake: if you had put in the extra $99 (out of maybe $1500+ for the complete rig) you'd have a machine that can still game in 2021. It's the same reason why I'm not a fan of "just buy a 5600X because it's the best CPU for gaming": for an extra $149 you can get an 8C/16T that will probably last you well into the next decade.
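The oversubscription point about engines "expecting" 8 console hardware threads can be sketched in a few lines of Python. This is a toy illustration, not any real engine's code; the helper name and the 8-thread figure are just the console assumption from the post above:

```python
import os

def pick_worker_count(engine_expected: int = 8) -> int:
    """Cap a game engine's worker-pool size at the hardware thread count.

    A pool sized for 8 console hardware threads oversubscribes a 4c/4t
    desktop CPU: the OS must time-slice 8 busy workers across 4 hardware
    threads, adding context-switch overhead on top of the useful work.
    """
    hw_threads = os.cpu_count() or 1
    return min(engine_expected, hw_threads)

# On a 4c/4t chip this yields 4 workers; on an 8C/16T chip, the full 8.
```

An engine that skips this cap and always spawns 8 busy workers is exactly the "threading overhead" scenario described above.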
 
I'm interested to see how PCIe lane allocation works out. It'd be nice to have PCIe slots that serve as more than decoration. DMI receives an update to version 4.0, which provides double the transfer rate of Rocket Lake at the reported width of "up to" 8 lanes.

The complexity of this new architecture screams "wait for version 2+".

For the first time in my life, I'll probably be moving to AMD on principle alone next upgrade.

https://www.anandtech.com/show/16881/a-deep-dive-into-intels-alder-lake-microarchitectures
https://www.techpowerup.com/285746/...-only-few-pcie-gen-5-and-dynamic-memory-clock

Edit: updated DMI information.

Edit #2: Has anyone noticed that recent releases of Windows 10 share the same kernel with Windows 11? (That was true a few weeks ago, at least.)
 
I'm interested to see how PCIe lane allocation works out. It'd be nice to have PCIe slots that serve as more than decoration. It seems the DMI link to chipset is still 4.0 -- same as Rocket Lake.

The complexity of this new architecture screams "wait for version 2+".

For the first time in my life, I'll probably be moving to AMD on principle alone next upgrade.

https://www.anandtech.com/show/16881/a-deep-dive-into-intels-alder-lake-microarchitectures
https://www.techpowerup.com/285746/...-only-few-pcie-gen-5-and-dynamic-memory-clock

Same here on lane allocation. I suspect the 5.0 x16 will probably not be bifurcatable on most boards, because PCIe 5 needs repeaters like crazy to travel across a PCB, and avoiding a set of 8 (needed to relay the extra lanes farther down to a second slot) would be a decent savings. Most of the 4.0 lanes probably go to M.2 slots, since 8 of the 3.0 lanes would work equally well for a 2nd GPU. I thought Intel was widening DMI from 4 to 8 lanes though, which would help a fair bit with the chipset bottlenecking on multiple SSDs.

My desktops (going back to ones bought by my parents) have been Motorola (6502), Intel (486), Cyrix (?? - 486 upgrade chip), AMD (K6), AMD (Athlon XP), AMD (Athlon X2), Intel (i7-920/930), and Intel (i7-4770/4790); my loyalty is to whatever works best for the buck.
 
I thought Intel was widening DMI from 4 to 8 lanes though, which would help a fair bit with the chipset bottlenecking on multiple SSDs.
My bad. Rocket Lake was DMI 3.0 x8. Alder Lake is supposedly DMI 4.0 x8*. The lane count remains the same, but the transition to DMI 4.0 (or "Gen4") doubles the speed. I don't know why I thought Rocket Lake was DMI 4.0. :confused: Thanks for pointing that out. I'll correct my post.

*Rather, "up to x8", quite possibly dependent on chipset. These numbers are from secondary sources and shouldn't be regarded as fact until release or official confirmation from Intel.
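For what it's worth, the "doubles the speed" correction is easy to sanity-check from the published PCIe per-lane rates. A rough back-of-the-envelope in Python, treating DMI as PCIe-equivalent lanes per the secondary sources above:

```python
def link_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe-style link."""
    rate_gt = {3: 8.0, 4: 16.0, 5: 32.0}[gen]   # raw signaling rate, GT/s per lane
    eff = 128 / 130                              # 128b/130b encoding overhead (Gen3+)
    return rate_gt * eff * lanes / 8             # bits -> bytes

rocket_lake = link_bandwidth_gbs(3, 8)   # DMI 3.0 x8 -> roughly 7.9 GB/s
alder_lake  = link_bandwidth_gbs(4, 8)   # DMI 4.0 x8 -> roughly 15.8 GB/s
```

Same lane count, double the per-lane rate, so exactly double the link bandwidth.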
 
The complexity of this new architecture screams "wait for version 2+".

I tend to agree with that. I can't really foresee a situation where my 5950X won't be a viable platform while I wait for lower-priced DDR5 and for Microsoft to hash out the scheduler to work well with the little cores.
 
Is it worth waiting for Alder Lake? I've been using an Alienware laptop and a tablet for the past 7 years, I haven't even turned the laptop on in over a year. However, I'm itching to build a new desktop for 1440p gaming. If you had to build a new rig now, would you go Rocket Lake or Ryzen or would you just wait on Alder Lake? It seems like a real shitty time to build something new.
 
Is it worth waiting for Alder Lake? I've been using an Alienware laptop and a tablet for the past 7 years, I haven't even turned the laptop on in over a year. However, I'm itching to build a new desktop for 1440p gaming. If you had to build a new rig now, would you go Rocket Lake or Ryzen or would you just wait on Alder Lake? It seems like a real shitty time to build something new.
Highly speculative, but I suspect the GPU will still be the main bottleneck (i.e. game performance at 1440p, even on a 3080-6080 level of card if you can get one, will not be significantly different between a 5800X or 5900X and the newer CPUs). Extra bandwidth will probably be irrelevant for a gaming machine (we tend not to use that much of it right now). PCI Express 4.0 SSDs are just starting to distinguish themselves from PCI Express 3.0, and the same goes for video cards at x16; it will take a while before PCI Express 4.0 becomes an issue for a gaming PC, would be my guess.

DDR5 vs DDR4 could become a big deal over time, but historically the very first year of a RAM generation change was never worth the extra cash. This could be a different time, obviously, both in magnitude of change and in being worth it. But even if DDR5 makes 128 or 256 GB of RAM practical down the road, it will be a long time before that becomes common enough and before some games take good advantage of it (the Unreal 5 demo, it seems, could easily take advantage of 256-512 GB of RAM and a giant, fast drive if they were out there).

Anything strong enough not to bottleneck the RTX 4000 series and Radeon 7xxx should age well enough.
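The bottleneck reasoning above amounts to taking the minimum of the two pipeline stages. A deliberately simple toy model, with numbers invented purely for illustration:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Toy model: the displayed frame rate is capped by the slower stage."""
    return min(cpu_fps, gpu_fps)

# If the GPU caps out at 140 fps at 1440p, a faster CPU changes nothing:
print(effective_fps(220, 140))  # 140 (GPU-bound)
print(effective_fps(260, 140))  # still 140 despite the faster CPU
```

Only when the CPU side drops below the GPU's ceiling (heavy simulation, low resolution) does the CPU upgrade show up on screen.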
 
TechPowerUp just showed a leak of Alder Lake getting a single-core score of 785. God damn!
 
Is it worth waiting for Alder Lake? I've been using an Alienware laptop and a tablet for the past 7 years, I haven't even turned the laptop on in over a year. However, I'm itching to build a new desktop for 1440p gaming. If you had to build a new rig now, would you go Rocket Lake or Ryzen or would you just wait on Alder Lake? It seems like a real shitty time to build something new.

There will always be something new and better coming out later. If you want something now, then buy something now. By the time Alder Lake is out, people will already be talking about Zen 4, Meteor Lake, and the better DDR5 that will be coming out eventually.
 
Confirmation today that DDR4 is also supported by the memory controller, and the Z690 chipset allows both.

However, it remains to be seen if the lower chipsets will also allow both.
 
Confirmation today that DDR4 is also supported by the memory controller, and the Z690 chipset allows both.

However, it remains to be seen if the lower chipsets will also allow both.
Does this mean that a Z690 motherboard will accept both, or will there be two versions (DDR4 & DDR5)?

Edit: per a quick Google search, there should be specific DDR4 versions which, for Asus, will have a "D4" in the model number.
 
Does this mean that a Z690 motherboard will accept both, or will there be two versions (DDR4 & DDR5)?

Edit: per a quick Google search, there should be specific DDR4 versions which, for Asus, will have a "D4" in the model number.
In previous transitions there was always a vendor that made a few boards with two DDR(N) slots and two DDR(N+1) slots that could be used exclusively. But because the slots are physically different you can't make a combined slot, and AFAIK there's never been a controller that supported both at the same time.
 
I'm looking at moving to Alder Lake largely because the 5600X I have now can't get enough frames to drive my 3080 at 1080p. I really hope Intel hits a home run here. It's funny: I went from a 2500K (had it for 9 years on my primary gaming rig) to a 3600X to a 5600X (same motherboard, which is awesome!) and now I'm considering going back to Intel. I really want 280+ FPS.

The big question I have is: will any DDR5 be able to compete with high-speed, low-latency B-die? I doubt it.
 
I'm looking at moving to Alder Lake largely because the 5600X I have now can't get enough frames to drive my 3080 at 1080p. I really hope Intel hits a home run here. It's funny: I went from a 2500K (had it for 9 years on my primary gaming rig) to a 3600X to a 5600X (same motherboard, which is awesome!) and now I'm considering going back to Intel. I really want 280+ FPS.

The big question I have is: will any DDR5 be able to compete with high-speed, low-latency B-die? I doubt it.

I believe you will still be disappointed. Nvidia cards don't do so well at that low of a resolution, and in fact a 6000-series card would have been a better choice. It appears to be a bottleneck of the architecture and not CPU speed.

[attached 1080p benchmark charts]
 
I believe you will still be disappointed. Nvidia cards don't do so well at that low of a resolution, and in fact a 6000-series card would have been a better choice. It appears to be a bottleneck of the architecture and not CPU speed.

The 6900 XT gets the same frames in Warzone as a 3090. In fact, there isn't much difference between a 3070 and a 3090 at 1080p. That being said, upgrading from a 3600X to a 5600X is a massive performance gain. Since BR games are all I play, benches like the one you shared are pointless. It's clear as day that Warzone is the most CPU-demanding triple-A BR game. Getting CPU frametimes below 4 ms is extremely difficult in Verdansk.
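As a quick conversion for that 4 ms figure (simple arithmetic, nothing Warzone-specific):

```python
def frametime_ms_to_fps(ms: float) -> float:
    """A frame every `ms` milliseconds corresponds to 1000/ms frames per second."""
    return 1000.0 / ms

def fps_to_frametime_ms(fps: float) -> float:
    """Frame-time budget in milliseconds for a target frame rate."""
    return 1000.0 / fps

print(frametime_ms_to_fps(4.0))            # 250.0 -> sub-4 ms CPU frametimes mean 250+ fps
print(round(fps_to_frametime_ms(280), 2))  # 3.57  -> the budget for a 280 fps target
```

So the 280+ FPS goal mentioned earlier in the thread requires the CPU to finish its part of every frame in about 3.6 ms or less.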
 
I believe you will still be disappointed. Nvidia cards don't do so well at that low of a resolution, and in fact a 6000-series card would have been a better choice. It appears to be a bottleneck of the architecture and not CPU speed.

It may be due to Nvidia's driver overhead, which several YouTubers scooped a couple of months ago. With Alder Lake's huge single-core IPC boost, it could very well improve 1080p performance for Nvidia.
 
And AMD claimed that Bulldozer would outperform Intel. Never believe claims until proof is delivered.


I so remember that time. There was some poster called, IIRC, "AMDGuy" or something who shouted down anyone who sneered at or was suspicious of the Bulldozer chips pre-release. Then they appeared... and he disappeared. :oops: Quite funny really.

Personally I'll wait for the Intel equivalent of the 5820K in 2022...hopefully.
 
It may be due to Nvidia's driver overhead, which several YouTubers scooped a couple of months ago. With Alder Lake's huge single-core IPC boost, it could very well improve 1080p performance for Nvidia.
I love how, over time, the blame for the differences in performance oscillates back and forth. A few years back, instead of "Nvidia sucks at 1080p" it was "AMD sucks at 4K": same relative performance difference, but 180° different blaming by the commentariat. 🙄🙄🙄
 
I have a Haswell-E; it's a good CPU, but I need a new one now.
I'm disappointed in AMD for being so late with DDR5 and Zen 4.
 
The overclocked results were blah: 490 W for not even a 10% increase in performance? :X

Let's face it: both camps have altered their boost algorithms to squeeze every last drop of performance out of the CPU without exotic cooling. The fact that the CPU boost adapts to your cooling pretty much indicates that there's no meaningful headroom for a manual OC unless your cooling costs more than your PC.
 
Let's face it: both camps have altered their boost algorithms to squeeze every last drop of performance out of the CPU without exotic cooling. The fact that the CPU boost adapts to your cooling pretty much indicates that there's no meaningful headroom for a manual OC unless your cooling costs more than your PC.
While mostly true, AMD has plenty of headroom for multi-core OC, not so much for single-core.
 
While mostly true, AMD has plenty of headroom for multi-core OC, not so much for single-core.
With the latest per-core optimized boost system, would it be true to say that multi-core OC headroom does not mean much performance was left on the table for many workloads? I could be wrong or misremembering, but I think I saw a good multi-core OC lose performance versus letting the CPU do its thing in a much more optimized way.
 
With the latest per-core optimized boost system, would it be true to say that multi-core OC headroom does not mean much performance was left on the table for many workloads? I could be wrong or misremembering, but I think I saw a good multi-core OC lose performance versus letting the CPU do its thing in a much more optimized way.
No, for Zen 3 you can get a good MT boost from a multi-core OC; you can lock your cores at 4.7-4.8. You lose lightly threaded / single-threaded perf, as the normal AMD boost no longer works once you manually select the core multipliers :)
 
No, for Zen 3 you can get a good MT boost from a multi-core OC; you can lock your cores at 4.7-4.8. You lose lightly threaded / single-threaded perf, as the normal AMD boost no longer works once you manually select the core multipliers :)
If you use Clock Tuner for Ryzen, you can have a high boost on both sides. My 5900X goes to 5.0 single-core and 4.75 all-core using that program.
 
If you use Clock Tuner for Ryzen, you can have a high boost on both sides. My 5900X goes to 5.0 single-core and 4.75 all-core using that program.
That was my understanding: that you could mix auto-OC and PBO at the same time, and you could maybe get more, but very little is left on the table.
 
If you use Clock Tuner for Ryzen, you can have a high boost on both sides. My 5900X goes to 5.0 single-core and 4.75 all-core using that program.
Neat, I have not tried it. I was using Dynamic OC Switcher before on the CHVIII DH.
 
Guess that didn't pan out, did it?
Well, it does destroy them on performance at the 5600X and 5800X price points.

The 12600K is $300 at Best Buy and makes the 5600X irrelevant on performance, plus it has an iGPU. And then for $269 there is the 12600KF: no iGPU (like a Ryzen), same performance, at an incredible price.

The 12700K at $420 ($409 for the KF) solidly beats a 5900X, at the 5800X price point. The 5800X is completely irrelevant at its current price and really feels like the odd man out, because the 12600KF is as good or better for $130 less. The 5900X still costs at least $100 more to get performance similar to a 12700K.

AMD's whole price stack needs to shift.

Indeed, the launch-period premium on Z690 motherboards is no joke. However, there are some reasonable options around $230, and that situation only stands to improve as prices normalize and the B660 series launches.
 