Intel Pressuring Board Partners to Remove Features from Alder Lake via BIOS Update for "Product Segmentation"

kirbyrj

https://www.igorslab.de/en/intel-de...-interpretation-of-efficiency-news-editorial/

https://www.hardwaretimes.com/intel-disables-avx-512-on-12th-gen-alder-lake-cpus-via-bios-update/

Intel is now set to disable “AVX-512” completely on all Alder Lake CPUs with an upcoming microcode update in new BIOS releases. Mainboard manufacturers were able to make the supposedly disabled instruction set available at launch, which resulted in a significant performance increase for the P-cores of the new CPUs. Now Intel is tightening the noose completely after all and, according to our sources, has instructed motherboard manufacturers to completely disable the “unsupported” feature.
 
I'm curious what consumer-level software uses AVX-512 instructions.
Not much that I've seen so far. Probably mostly because it has been really rare on consumer CPUs. They rolled it out on the high-end stuff some time ago, but only recently did consumer shit see it. Also, it is really fragmented: unlike a lot of previous stuff like AVX2, where you either have it or you don't, there are all kinds of feature levels of it. It seems like it was not a great idea for an add-on overall.

Only thing I've ever seen kick it on is Prime95, and I don't know if it actually helps its speed or if it just does it for stress-testing purposes.
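
For anyone curious what that fragmentation looks like from code, here's a minimal sketch (GCC/Clang builtins, x86 only) probing a handful of the separate AVX-512 feature bits, versus the single yes/no check AVX2 needs:

/* Minimal sketch of AVX-512 fragmentation: unlike AVX2's single
 * feature bit, each AVX-512 subset has its own CPUID bit that code
 * must probe individually. GCC/Clang builtins, x86 only. */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();
    printf("avx2       : %d\n", __builtin_cpu_supports("avx2"));
    printf("avx512f    : %d\n", __builtin_cpu_supports("avx512f"));
    printf("avx512vl   : %d\n", __builtin_cpu_supports("avx512vl"));
    printf("avx512bw   : %d\n", __builtin_cpu_supports("avx512bw"));
    printf("avx512dq   : %d\n", __builtin_cpu_supports("avx512dq"));
    printf("avx512vnni : %d\n", __builtin_cpu_supports("avx512vnni"));
    return 0;
}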
 
Not much that I've seen so far. Probably mostly because it has been really rare on consumer CPUs. They rolled it out on the high-end stuff some time ago, but only recently did consumer shit see it. Also, it is really fragmented: unlike a lot of previous stuff like AVX2, where you either have it or you don't, there are all kinds of feature levels of it. It seems like it was not a great idea for an add-on overall.

Only thing I've ever seen kick it on is Prime95, and I don't know if it actually helps its speed or if it just does it for stress-testing purposes.
It is monumentally faster at Prime95 and related applications, almost a 2x speedup (the clock-speed hit reduces the effectiveness somewhat). I use AVX-512 as part of my PrimeGrid crunching with great success.
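
To illustrate where that near-2x ceiling comes from, here's a sketch (not Prime95's actual FFT code, just the general idea): each AVX-512 instruction handles twice as many lanes as its AVX2 counterpart, so the same fused multiply-add loop needs half the iterations, and the AVX-512 downclocking is what eats into that factor:

/* Sketch only, not Prime95's real inner loop: the same FMA over n
 * floats takes half the iterations with 512-bit vectors, which is
 * where the near-2x comes from before downclocking takes its cut.
 * Compile with -mavx2 -mfma -mavx512f; n assumed a multiple of 16. */
#include <immintrin.h>

/* AVX2 path: 8 floats per fused multiply-add */
void fma_avx2(float *a, const float *b, const float *c, int n)
{
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vc = _mm256_loadu_ps(c + i);
        _mm256_storeu_ps(a + i, _mm256_fmadd_ps(vb, vc, va));
    }
}

/* AVX-512 path: 16 floats per iteration, half the loop trips */
void fma_avx512(float *a, const float *b, const float *c, int n)
{
    for (int i = 0; i < n; i += 16) {
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        __m512 vc = _mm512_loadu_ps(c + i);
        _mm512_storeu_ps(a + i, _mm512_fmadd_ps(vb, vc, va));
    }
}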

It is my hope that if Intel is segmenting the implementation, it means there is still an intention of further consumer HEDT Core X releases like an Ice Lake-X or (better yet) a Sapphire Rapids-X with AMX. The reliable sources indicate that non-Pro Threadripper is done for, but I hope we're not totally left out in the cold. Building crunchers or home servers from Ice Lake-SP+ Xeon eBay dumps won't be nearly as cost-effective as it has been with Broadwell-EP and earlier.
 
Intel in 2018: you should buy our CPUs because we're expanding AVX-512 support soon, and look how big these bar graphs are compared to AMD!

Intel in 2022: we're disabling AVX-512 on our long-awaited 10nm "Intel 7" performance desktop CPUs.

Got it.
 
Well, my 5900X doesn't have AVX-512 either, and I couldn't care less.

But removing features via microcode update does seem rather underhanded. In years past it would have been easy to simply stay on an old BIOS, but these days many OEM systems get their BIOS updates automatically via Windows Update. Updating your BIOS is also a lot more important than it used to be, given that's how many new security vulnerabilities are fixed or mitigated.
 
Despite this, if you were using AVX-512 as a consumer, it sounds like you can just stick with an older BIOS to retain the functionality.

Still, applications that leveraged it remain exceedingly rare, at least among those relevant to most consumers. This Anand thread was one of the only resources I could find that gave examples, and it was basically just x264.

There is a bigger question IMO about what Intel is doing in the HEDT segment. Most of the innovation they've brought to market has primarily benefited consumer CPUs like the 12th-gen Core series, and I can't remember the last time mainstream manufacturers released compelling mobos for the HEDT platform anyway. A couple of years ago I switched from X99 to Z270 and, IMHO, there's no looking back.
 
The two biggest applications for general consumers, off the top of my head, are H.265 encoding and RPCS3 (the PS3 emulator, which sees non-trivial performance gains with AVX-512).

And even though this sucks, it should be clarified that Intel isn't trying to "remove" features, exactly. They marketed Alder Lake as not having AVX-512, even said it was "lasered off" (even though it isn't), and did not otherwise provide documentation for how to implement it.

Board makers discovered the functionality. Asus and ASRock decided to unofficially release BIOS options that expose it.

All that said, it would be great if Intel would just ease up on this and let it exist.
 
I don't really care about AVX-512, and neither do 99% of other users. Still, it's up to the general tech community to speak up about this "product segmentation" crap.
I am reminded of the Martin Niemöller poem... I guess in this case it's:
First they came for AVX-512, and I said nothing... because I don't encode.
If Intel gets to a point where they don't feel the mid-range needs X or Y, or the low end doesn't need as many PCIe lanes, etc., and no one speaks out against the idea of segmentation, that is what they will do... "We didn't advertise it, so we can just disable it."

Let Intel get away with this crap and it becomes super easy for them to do it again when they are in a position to dictate the market. How much more would they disable with the "product segmentation" excuse if AMD wasn't AMD right now? Let this slide... and we end up with crap mid-range stuff with BIOS gating.
 
I don't really care about AVX-512, and neither do 99% of other users. Still, it's up to the general tech community to speak up about this "product segmentation" crap.
I am reminded of the Martin Niemöller poem... I guess in this case it's:
First they came for AVX-512, and I said nothing... because I don't encode.
If Intel gets to a point where they don't feel the mid-range needs X or Y, or the low end doesn't need as many PCIe lanes, etc., and no one speaks out against the idea of segmentation, that is what they will do... "We didn't advertise it, so we can just disable it."

Let Intel get away with this crap and it becomes super easy for them to do it again when they are in a position to dictate the market. How much more would they disable with the "product segmentation" excuse if AMD wasn't AMD right now? Let this slide... and we end up with crap mid-range stuff with BIOS gating.
Now do LHR for mining on Nvidia cards... and please, don't compare an unadvertised, stated-to-be-disabled feature to the Holocaust.
 
And people don't think Intel can fix prices for graphics cards.
 
Now do LHR for mining on Nvidia cards... and please, don't compare an unadvertised, stated-to-be-disabled feature to the Holocaust.
Yes, people should also be even more annoyed about what Nvidia is doing.
Also, it's not a Holocaust comparison... Niemöller was warning about letting people get away with shit you can tolerate because it doesn't affect you. Point taken, though... perhaps it's not a fair comparison. What Intel and, yes, Nvidia are doing isn't pure evil, just greed. The point stands... let them segment performance and they will. Perhaps I don't care about AVX-512... perhaps I don't care about hash rates... but the companies making the hardware shouldn't be able to soft-disable hardware capabilities because I didn't buy the more expensive X or Y edition.
 
They want them sold to consumers at close to cost or at a loss.

Intel of all companies is not gonna leave money on the table. I have never seen them take that stance with any product in their entire history and I seriously doubt they are going to start now - they consider themselves a "premium brand" after all.

More likely scenario: They seed reviewers with cards and announce a reasonable MSRP. Reviewers review the cards, creating demand. Suddenly MSRP can't be met due to "supply problems." We are still in the same boat getting GPUs that we are right now.
 
Intel of all companies is not gonna leave money on the table. I have never seen them take that stance with any product in their entire history and I seriously doubt they are going to start now - they consider themselves a "premium brand" after all.

What? Intel absolutely and repeatedly operates at losses in order to undercut competition. It's a hallmark of their methodology.
 
What? Intel absolutely and repeatedly operates at losses in order to undercut competition. It's a hallmark of their methodology.
But current market conditions aren't suitable for that.
Far more likely they will have an attractive MSRP but have "supply issues," then sell at jacked-up prices.

In an alternate universe, maybe they'll be kick-ass mining cards.
 
Yeah, Intel's response to AMD over the years has always been to hit them with a loss leader... and cash in on the big SKUs. Having said that, until recently Intel was almost always in possession of those top-performing profit parts.
Over the years, though, they have many times priced a mid-range or low-end part at razor-thin margins... and perhaps even a bit under, in order to destroy all hope of AMD gaining market share. Just think back to the days of $100 Celerons and $1500 P4s.

It will be interesting to see what Intel does with GPU pricing, though... I mean, I doubt they have a SKU that will even be close to competing with the 3090/6900 (4090/7900?). If by some fluke they do manage that, I am sure Intel would price that top part right there with NV and AMD. On the mid-range, though... I wouldn't put it past Intel to push pricing down, assuming their fabs are working at least reasonably well. Part of what allowed Intel to play that game was their historical savings on the fab side. They could price chips off the same wafer anywhere from $100 to $1500 (or more)... now that they are outsourcing, or pushing their own fabs to the limits, that might be harder to swallow. It's harder to hide selling tons of product at razor-thin margins (or losses) if it involves paying a supplier. (Years ago Intel would hide that stuff in internal paperwork... shareholders didn't have to know Celerons were going out the door at a loss; they just saw the total profit per wafer or whatever they were allowed to see.)
 
... shareholders didn't have to know Celerons were going out the door at a loss; they just saw the total profit per wafer or whatever they were allowed to see.
As much as we all enjoy hating Intel's corporate policies, this is pretty typical. As long as the ink is black, few questions are asked, no stones turned, etc. First drop of red ink and holeee shite, 10 auditors show up and pitch a tent in accounting.
 
I'm curious what consumer-level software uses AVX-512 instructions.
Encryption and decryption, especially AES-GCM, which IIRC is still used by all browsers for HTTPS (I worked on some SIMD-accelerated cryptography assembly code for a non-x86 platform and spent a LOT of time in browser cryptography backends). I.e., every time you send or receive data from a website, it's a full encryption/decryption cycle. Gimping AVX *can* have a noticeable impact here.

Intel funds a lot of research into cryptography optimization via new vector instructions and wider registers. I wonder if they’re doing this to hide the terrible power draw and subsequent throttling across the chip when executing AVX-512 instructions.
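
For context, here's a hypothetical sketch of the runtime-dispatch pattern crypto libraries typically use (the gcm_* names are invented for illustration, not any real library's API): probe CPUID at startup and take the widest path the CPU reports, so when a microcode update hides AVX-512 the probe simply fails and everything quietly drops to the narrower path:

/* Hypothetical sketch of runtime dispatch; the gcm_* names are
 * invented for illustration. If the BIOS/microcode hides AVX-512,
 * the first probe fails and the code silently falls back to the
 * AES-NI/AVX2 path. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

typedef void (*gcm_fn)(uint8_t *dst, const uint8_t *src, size_t len);

/* Stub bodies; a real library would have full AES-GCM kernels here. */
static void gcm_scalar(uint8_t *d, const uint8_t *s, size_t n)  { (void)d; (void)s; (void)n; }
static void gcm_aesni(uint8_t *d, const uint8_t *s, size_t n)   { (void)d; (void)s; (void)n; }
static void gcm_vaes512(uint8_t *d, const uint8_t *s, size_t n) { (void)d; (void)s; (void)n; }

static gcm_fn select_gcm(void)
{
    __builtin_cpu_init();
    /* Real code would also check the VAES/VPCLMULQDQ bits; this
     * sticks to widely supported __builtin_cpu_supports strings. */
    if (__builtin_cpu_supports("avx512f") && __builtin_cpu_supports("avx512vl"))
        return gcm_vaes512;
    if (__builtin_cpu_supports("aes") && __builtin_cpu_supports("avx2"))
        return gcm_aesni;
    return gcm_scalar;
}

int main(void)
{
    gcm_fn fn = select_gcm();
    printf("selected: %s\n", fn == gcm_vaes512 ? "AVX-512 path"
                           : fn == gcm_aesni   ? "AES-NI/AVX2 path"
                                               : "scalar path");
    return 0;
}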
 
Over the years, though, they have many times priced a mid-range or low-end part at razor-thin margins... and perhaps even a bit under, in order to destroy all hope of AMD gaining market share. Just think back to the days of $100 Celerons and $1500 P4s.

And they did it by what were blatantly illegal methods then. Since then, they've had time to work out the kinks and plan ahead a little better.

I expect them to have an I+I solution that partners can't refuse on the laptop front, where the major sales are right now, and then have their way with other product lines based on that. And I think this is even a sign it's already in the works.

Intel: cut features, make them pay for more.

Partners: yes, my liege.

Intel: excellent, you will not be forgotten when DG2 enters mass production.
 
And they did it by what were blatantly illegal methods then. Since then, they've had time to work out the kinks and plan ahead a little better.

I expect them to have an I+I solution that partners can't refuse on the laptop front, where the major sales are right now, and then have their way with other product lines based on that. And I think this is even a sign it's already in the works.

Intel: cut features, make them pay for more.

Partners: yes, my liege.

Intel: excellent, you will not be forgotten when DG2 enters mass production.
I expect it is going to take Intel a couple of generations to get where they would like to be with GPUs. They are going to get to parity, though. With Intel's past aborted attempts, I believe there was a core of higher-ups at Intel who honestly didn't see the GPU future. I think they honestly believed GPUs were never going to be more than a video-game-market thing, which wasn't the big $$$, and that at some point CPUs would start doing real-time ray tracing or something anyway. This time they clearly understand they need a GPU to provide 100% Intel solutions for supercomputer/server/AI/auto, etc.... so they will ride out the embarrassment of a couple of also-ran product cycles. (Old Intel was too quick to abandon markets they didn't crush, too thin-skinned to admit they were #2... frankly, they still sort of are; their current CEO doesn't seem any different to me.)

For this first round of GPUs I don't think Intel will have the wiggle room to really pull the old one-wafer trick of five SKUs ranging from -50% margin to 250%. The GPUs are outsourced... I mean, they will obviously not price the bottom of the wafer at the top; I just don't think they can get away with the shenanigans of the past as easily when they have to account more closely for wafer-supplier payments. At this point shareholders are going to want to know yields... and aren't going to be OK with "lasered off" (or disabled-in-firmware) chunks of silicon being sold at cost or, worse, under cost. When Intel moves the GPU silicon to their own fabs, though... that might be very different. If Intel can ever sort their fabs out, then I expect a lot of OEMs to be stuck with the old "you buy our CPU, you buy our GPU for $1" or "perhaps we don't have enough CPUs to fill your order" BS. At that point Intel covers those OEM deals up in NDAs, reports the sales of CPU and GPU as "computing hardware unit" sales on the quarterly reports, and everyone is happy. (The same way companies like Microsoft have hidden Xbox losses for over a decade.)
 
I foresee a lot of modded BIOSes that people will download to unlock certain features in their CPUs. It's happened before plenty of times. If Intel wants to do that, then be my guest; people will flock to the cheaper parts and install modded BIOSes to unlock hidden features.
 
If they wanted segmentation, they would have been better off just not including the AVX-512 stuff in gen 13, letting gen 12 be an oddity, and riding it out silently. That would have required less work on everybody's part and been a complete non-issue.
 
If they wanted segmentation, they would have been better off just not including the AVX-512 stuff in gen 13, letting gen 12 be an oddity, and riding it out silently. That would have required less work on everybody's part and been a complete non-issue.

But this is Intel; it wouldn't be Intel without backpedaling on features mid-product cycle because they wanted more market segmentation.

"Hey guys, the 440BX is too good; we won't sell our new Xeons well if consumers can just buy cheap 440BX boards. We have a great idea called the i810 and i815 chipsets that are hard-limited to 512 MB and no ECC support. Quick, kill off the 440 as fast as possible!"
 
They must have had some mass enterprise purchasing for AVX loads. The funniest part is that it just further cements their sales staying as lackluster as they are, because now not only can you not get DDR5, but Intel 'nerfed' your CPU. Even though quite a few desktop users won't make use of AVX-512, they might want to in the future, or future applications will, and now it's not an option. Remember when AVX-512 was a selling point vs. AMD CPUs and the only way they were relevant in benchmarks? Confused messaging much?

They always do this stuff, anything to turn an 'okay' CPU launch (DDR5 aside) into a PR shitfest. They literally shoot themselves in the foot these days for the consumer market.
 
So I looked a little more into this: AVX-512 is disabled on these chips, but disabling the E cores allows some users on some boards to trick the system into re-enabling it. Some vendors leaned into it and made it a feature.
Intel says enabling AVX-512 on these chips leads to errors, and that's why it was disabled to begin with. I'm sure those are errors in their books, but who am I to know.

https://www.tomshardware.com/amp/news/intel-reportedly-kills-avx-512-alder-lake-cpus
 
Encryption and decryption, especially AES-GCM, which IIRC is still used by all browsers for HTTPS (I worked on some SIMD-accelerated cryptography assembly code for a non-x86 platform and spent a LOT of time in browser cryptography backends). I.e., every time you send or receive data from a website, it's a full encryption/decryption cycle. Gimping AVX *can* have a noticeable impact here.

Intel funds a lot of research into cryptography optimization via new vector instructions and wider registers. I wonder if they’re doing this to hide the terrible power draw and subsequent throttling across the chip when executing AVX-512 instructions.
But even the lowest-SKU Alder Lake CPU isn't going to bottleneck web browsing to any significant degree.
 
I foresee a lot of modded BIOSes that people will download to unlock certain features in their CPUs. It's happened before plenty of times. If Intel wants to do that, then be my guest; people will flock to the cheaper parts and install modded BIOSes to unlock hidden features.
That's the first thing I thought of as well; I've done it twice so far, both times with the BIOS tool that Intel provides to OEMs. It was a bit nerve-wracking flashing the BIOS after modding it, but it turned out just fine.
 
If they wanted segmentation, they would have been better off just not including the AVX-512 stuff in gen 13, letting gen 12 be an oddity, and riding it out silently. That would have required less work on everybody's part and been a complete non-issue.
The issue appears to be with the different kinds of cores. The E cores don't have AVX-512; the P cores do. However, the schedulers in OSes right now treat them as having feature parity: they don't schedule based on what a core can do, only on how much power the thread needs. That means if you had an AVX-512 workload and the scheduler moved it to an E core, there'd be a crash.

Sounds like you can disable the E cores and then the BIOS can enable it. You do lose 8 cores doing that, though, and even if they aren't as performant as the P cores, they look like they add a good amount to multithreaded workloads.
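
A minimal sketch of that crash, for the curious (Linux/glibc assumed, compiled with gcc -mavx512f): the first 512-bit instruction executed on a core without AVX-512 raises #UD, which the OS delivers to the process as SIGILL:

/* Minimal sketch of the failure mode described above: a 512-bit op
 * on a core without AVX-512 raises #UD, delivered as SIGILL.
 * Linux/glibc assumed; compile with gcc -mavx512f. */
#include <immintrin.h>
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

static void on_sigill(int sig)
{
    (void)sig;
    static const char msg[] = "SIGILL: this core does not implement AVX-512\n";
    write(STDERR_FILENO, msg, sizeof msg - 1);  /* async-signal-safe */
    _exit(1);
}

int main(void)
{
    signal(SIGILL, on_sigill);
    __m512i a = _mm512_set1_epi32(1);     /* first 512-bit op; faults here without AVX-512 */
    __m512i b = _mm512_add_epi32(a, a);   /* 512-bit add */
    printf("lane 0 = %d\n", _mm512_cvtsi512_si32(b));
    return 0;
}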
 
I toyed around with my BIOS. I can disable the E cores, go under the AVX settings, and enable AVX-512 just like described in the article. It showed up in CPU-Z as an available instruction set.

More interesting would be the non-E-core CPUs (e.g. the 12400, which sounds like a budget champ). Sounds like you could just leave AVX-512 enabled if you so desired, if you had the option.
 
Ironically, AVX-512 was the only reason I wanted one (programming). I guess I'll just wait till it goes mainstream.
 
This might be dangerously close to off-topic, but ever since the Celeron 300A, Intel has been cracking down on overclocking, tweaking, or otherwise stealing performance from them. The only reason we have overclocking and unlocked CPUs is that the competition offered it. Intel is a shady multinational corp that would sell your children into slavery in order to sell you a can of beans.
 
This might be dangerously close to off-topic, but ever since the Celeron 300A, Intel has been cracking down on overclocking, tweaking, or otherwise stealing performance from them. The only reason we have overclocking and unlocked CPUs is that the competition offered it. Intel is a shady multinational corp that would sell your children into slavery in order to sell you a can of beans.
Nobody's cracking down on overclocking; they are running the silicon so close to the edge that the turbo frequencies basically do the job for you, unless you start getting into the more exotic cooling solutions (and yes, an AIO or a massive Noctua is an exotic cooler). What there is now is competition, and neither party can afford to leave anything behind, because that extra 100 MHz could be the difference between a part reviewing well or bombing.
 