sblantipodi
As title: what is your feeling about Alder Lake? Will it be good enough to compete with AMD?
Alder Lake will destroy Zen 3 in both performance and efficiency, according to rumors and leaks.

> Alder Lake will destroy Zen 3 in both performance and efficiency ...

Not looking real impressive here.
The efficiency will only come when the system is lightly loaded. The new big cores are just playing the same games as Tiger Lake: sucking down even more power to get 20% faster than the previous gen, so any increase in power consumption on Zen 3+ will be more than matched on the Intel side!
https://www.hardwaretimes.com/intel...on-to-468w-15-more-than-11th-gen-rocket-lake/
And that performance/watt under low loads is entirely dependent on how much better Alder Lake's hardware controller is than Lakefield's was.
> This is obviously not true: Intel clearly claimed that the efficiency cores are more powerful than Skylake. It will not work that way; efficiency cores will be used even in high-load situations. They are meant to improve performance by increasing efficiency, not just to save some power.

And AMD claimed that Bulldozer would outperform Intel. Never believe claims until proof is delivered.
> As title: what is your feeling about Alder Lake? ...

Assuming Intel's and AMD's claims are both correct:
> This for sure, but I think it is really reasonable that the efficiency cores are not there for power saving only; that would not make much sense on desktops.

Efficiency is alive and real, even in the desktop world. I think as enthusiasts we tend to think about our own uses. We are literally the 1% of the CPU world. For us it may not matter; for the other 99% it will. Many more CPUs are sold for offices and prebuilts than ever get placed into an enthusiast DIY PC. Intel is not catering to us. They are looking at the bigger picture.
> And AMD claimed that Bulldozer would outperform Intel. Never believe claims until proof is delivered.

Absolutely this. The leaked Alder Lake stuff looks promising, but I don't plan on buying a new build until both Intel and AMD release their 2022 products for testing.
> I've also been resisting any upgrades until 10 nm, and I'm stuck on a 4670K. I sincerely hope Intel's implementation of efficiency and performance cores turns out well.

My gut says dodge first gen. The big+little setup works well for phones, but diving in and buying first-gen desktop big+little hardware sounds like unnecessary pain. It's a good idea, but I'm expecting a good amount of "ouch" with the first-gen desktop implementation.
> My gut says dodge first gen. ...

I'm betting we'll end up with game modes that disable the small cores for a bit.
> I'm betting we'll end up with game modes that disable the small cores for a bit.

Very likely, but I bet we'll need a third-party utility for some older games at first. New stuff will get a patch; older games... nah, no $ in it for most of them. Some will, of course. After a year or two Microsoft will sort it out and you won't have to think about it anymore most of the time. Of course, exceptions will still apply.
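For what it's worth, such a utility needs nothing exotic on Windows. A minimal sketch, assuming a hypothetical game.exe and a made-up 0xFF mask covering the first eight logical CPUs (a real tool would query the topology first):

```c
#include <windows.h>

int main(void)
{
    STARTUPINFOA si = {0};
    PROCESS_INFORMATION pi = {0};
    char cmd[] = "game.exe";   /* hypothetical target */
    si.cb = sizeof(si);

    /* Launch the game suspended so it can't schedule anything yet. */
    if (!CreateProcessA(NULL, cmd, NULL, NULL, FALSE,
                        CREATE_SUSPENDED, NULL, NULL, &si, &pi))
        return 1;

    /* Restrict it to logical CPUs 0-7 (assumed to be the big cores
     * here), then let it run. */
    SetProcessAffinityMask(pi.hProcess, 0xFF);
    ResumeThread(pi.hThread);

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```

The built-in `start /affinity FF game.exe` from cmd.exe does essentially the same thing, so "third-party utility" may just mean a launcher script.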
> My gut says dodge first gen. ...

Smart idea. I've been an early adopter before: 16 GB of DDR3 Corsair Dominator Platinums for $800 CAD, lol. Never again.
> I'm betting we'll end up with game modes that disable the small cores for a bit.

I really hope not. If MS does, they'll almost certainly have to remove it in a future update, because the developers of all the random craplets you have running, which have no business on anything other than a little core, will abuse the flag to put themselves on a main core, where their wasting of big CPU cycles will slow down anything compute-intensive that you care about, and on laptops drain the battery faster. The latter in particular makes me suspect that if MS does offer any way for apps to request a core type, it'll be an opt-in one to request running on little cores, intended for the developers of well-behaved background utilities.
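Aside: Windows does document an opt-in along exactly these lines, the power-throttling ("EcoQoS") request. A minimal sketch of a well-behaved background utility volunteering for it; whether the scheduler maps that to the little cores on any given hybrid chip is up to Windows and the hardware:

```c
#include <windows.h>

int main(void)
{
    /* Volunteer this process for power-efficient scheduling (EcoQoS).
     * ControlMask says which knobs we're setting; StateMask turns the
     * throttling on. Call once at startup. */
    PROCESS_POWER_THROTTLING_STATE state = {0};
    state.Version     = PROCESS_POWER_THROTTLING_CURRENT_VERSION;
    state.ControlMask = PROCESS_POWER_THROTTLING_EXECUTION_SPEED;
    state.StateMask   = PROCESS_POWER_THROTTLING_EXECUTION_SPEED;

    if (!SetProcessInformation(GetCurrentProcess(), ProcessPowerThrottling,
                               &state, sizeof(state)))
        return 1;

    /* ... do low-priority background work ... */
    return 0;
}
```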
> I saw a recent post about the 12900K beating out the 5950X in Geekbench on both ST and MT, but at 250W. I'm guessing that top-end performance will be pretty close, but performance per watt will still be to AMD's advantage.

That's almost what my 3960X pulls at stock.
> I really hope not. If MS does they'll almost certainly have to remove it in a future update ...

They'll be able to do that anyway unless MS figures out a way to force them onto a little core. Set affinity has been in Windows for as long as I've had a machine with more than one CPU core; that started with a dual Pentium III box running Windows 2000. It is not, and has never been, a privileged operation: any user can set affinity, and it can be done with system calls in an application. Same with Linux. Despite being a bit of a Linux geek and knowing more about fun with CPU scheduling on Linux, I never actually messed around with setting affinity on Linux until 2012, when I started a job that involved high-frequency trading. Before that I just used it to make games meant for a one-core machine run on a dual socket. Most were fine out of the box, but a few flipped out if you had more than one CPU. Unreal Tournament (original 1999 edition), for example, was a slideshow on a dual Opteron but ran stupid fast if you forced it onto one CPU.
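For anyone who hasn't done it, the Linux call really is that small. A minimal sketch that pins the calling process to CPU 0, roughly what forcing UT99 onto one core looked like (SetProcessAffinityMask() is the unprivileged Windows equivalent):

```c
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;

    CPU_ZERO(&set);
    CPU_SET(0, &set);   /* allow CPU 0 only */

    /* pid 0 means "the calling process"; no privileges are needed
     * to restrict your own affinity. */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }

    printf("pinned to CPU 0\n");
    return 0;
}
```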
> And AMD claimed that Bulldozer would outperform Intel. ...

Technically it did, eventually. Eventually the extra "cores" mattered enough per unit $. No different than CrossFire 480s besting 980s, only to have CrossFire retired a year later (massive egg on face, imo).
> Technically it did, eventually. ...

Super briefly. That was pretty much eliminated by the early Haswell days, if not sooner. They weren't bad, mind you, but they were only competitive on price, not on performance. And they weren't a clear win over Intel like AMD64 was or like Zen 3 has been.
You speak the truth. However: prove it, or else it's BS.
> Super briefly. That was pretty much eliminated by the early Haswell days, if not sooner. ...

Still running a Dozer box; it keeps up with first-gen Ryzen in gaming loads. Can't say that about Haswell i5s. It's true they had no hope competing with Intel at the high end. With 4c/8t it was a different story. The 4c/4t processors were almost immediately obsolete, yet synthetic benchmarks looked great.
> Still running a Dozer box; it keeps up with first-gen Ryzen in gaming loads. ...

Oh, gaming tends to be mostly fine. I started hitting issues with VR stuff on it; my wife was using mine, so I built a new one and gave her my 6700K. But I used it as a workstation, and for that it was trounced by Intel, never mind Zen 1 (which I also had). And true, I was only thinking high end; I don't really do anything smaller than the top-end consumer parts, since mine tend to run forever.
Annoyingly, the mainstream desktop platform is going to feel the drawbacks of such a design the most. On the mobile side you can have some nice 2P/6E or 2P/8E designs which will outperform last-gen designs everywhere (laptops don't do all-core turbo anywhere close to 5 GHz). On the HEDT side you can have something crazy like 8P/64E, which puts all of your single-threaded tasks (linking, CAD, etc.) on the P-cores and the multithreaded stuff (compiling, rendering, encoding) on the E-cores, which is pretty good. But an 8P/8E is just going to add confusion on mainstream desktops; it will end up performing like a 5900X but with a ton of scheduler-induced drama.

I'm really hoping to see a 12P/0E conventional design with competitive power consumption. It won't thrash AMD, but at this point everyone is so close to the limit of what x86 and silicon can do that it's going to be nigh impossible to get a huge leap in real-world workloads.
> I don't think Intel is going to release an ADL without E-cores, for the simple fact that 4 E-cores only take up as much space as 1 P-core. Intel can call a 6P/4E chip "10-core" and it would fit on a smaller die than an 8P/0E "8-core." Even though I would rather have the latter, the former sounds better on paper, would probably score higher on a lot of benchmarks, and probably costs less to make.

Maybe Sapphire Rapids will be our savior. It's not showing up for a while, but 18+ unlocked Golden Cove cores should be quite competitive on desktops. For the crazy HFT guys who need nanosecond latencies it's no good (you still get some extra latency going through the EMIB), but for the rest of us, AMD has shown the chiplet/tile-based design doesn't hurt real-world performance.
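Side note on the scheduler drama: software can at least tell which kind of core it landed on. Intel documents a hybrid flag (CPUID leaf 0x07, EDX bit 15) and a core-type leaf (0x1A) for Lakefield/Alder Lake. A minimal sketch, assuming gcc or clang on x86 and that the thread is already pinned to the core being asked about:

```c
#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* CPUID.(EAX=07H,ECX=0):EDX[15] is set on hybrid parts. */
    if (!__get_cpuid_count(0x07, 0, &eax, &ebx, &ecx, &edx) ||
        !(edx & (1u << 15))) {
        puts("not a hybrid part");
        return 0;
    }

    /* CPUID.(EAX=1AH,ECX=0):EAX[31:24] reports the core type of the
     * CPU executing the instruction. */
    if (!__get_cpuid_count(0x1A, 0, &eax, &ebx, &ecx, &edx))
        return 1;

    switch (eax >> 24) {
    case 0x20: puts("on an E-core (Atom)"); break;
    case 0x40: puts("on a P-core (Core)");  break;
    default:   puts("unknown core type");   break;
    }
    return 0;
}
```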
> The 4c/4t processors were almost immediately obsolete, yet synthetic benchmarks looked great.

I could be misreading everything, but weren't some of those among the longest-used CPUs, with the longest relevant life, in history? I have a colleague who replaced his 4c/4t Core i5 2xxx at home just this year, having done CAD development on it. Not sure about the split on logical CPUs, but 4 CPUs is still the most common on the Steam hardware survey.
> I could be misreading everything, but weren't some of those among the longest-used CPUs ...

Sure, if you're using single-threaded apps like CAD software (I'm a CAD designer) and are a cheap-assed colleague, you could use those forever, along with 90% of the CPUs made at that time. 4c/4t CPUs choke on AAA games, and did almost at launch. Multitasking and multithreaded apps also suffer (something professional CAD designers deal with when they get out of school).
In general, yes, but not when we're talking about whether they became obsolete fast. I took "obsolete" as meaning something is more or less irrelevant.