[Rumour] Intel to cut back on Arc discrete GPU

Comment from Raja Koduri



https://twitter.com/RajaXg/status/1569150521038229505?s=20&t=xZSCCnSca9r47-N1mSeuOA

I hope ARC discrete makes it, but it will be a long struggle and shareholders have to show lots of patience
One more comment from Raja:


Attached is what we said in Feb'22 and are continuing to execute this strategy.

https://twitter.com/RajaXg/status/1569368896712617987?s=20&t=VsezS9UuXu1YgBaCzcZ6QA
 
Intel should sell these cards at cost, get them out there with low expectations. People will buy them for generic 2D and encoding tasks.
Once there are enough cards in the wild, game makers will have to work with Intel, and Intel can use their experience to write a better driver.

I think that if they had used that strategy with their Itanium processor, there would have been enough units sold to convince the industry to take it seriously. We might all be running Itaniums today.

And if it's like Microsoft, they'll actually be good after 30 years.
 
This just means that we will likely see some of their investment show up in their future CPUs. It would be stupid for them not to use their investment in graphics. I believe I already said something to this effect earlier, though.

Yeah, that's what Raja is saying through the smoke and mirrors too, IMO, just like all their other dGPUs went on to live on in and enhance their iGPUs.
 
If discrete cards were used exclusively for entertainment, I could see Intel quitting.

But now the real money is in compute. And common substrate, chiplet designs look like they will be the top performing solutions.

If Intel quits, that leaves AMD as the only company capable of making those designs (not really but the only vertically integrated one)

I think there is too much cash for Intel to ignore. But everyone will have to continue playing Call of Doodie on amd/nvidia.
 
If discrete cards were used exclusively for entertainment, I could see Intel quitting.

But now the real money is in compute. And common substrate, chiplet designs look like they will be the top performing solutions.

If Intel quits, that leaves AMD as the only company capable of making those designs (not really but the only vertically integrated one)

I think there is too much cash for Intel to ignore. But everyone will have to continue playing Call of Doodie on amd/nvidia.
At some point, the entire industry will need to transition to chiplet designs. That's going to be the only way to keep heat and cost under control. The age of monolithic die design is coming to an end.
 
Update from Usman Pirzada of wccftech:

Update: The Future of Intel Arc (Discrete Desktop Graphics)

  • Intel insiders have told me in no uncertain words that *no* decision has been taken yet about the future of Arc discrete desktop graphics and they remain committed to the roadmap through 2024 at the very least.
https://wccftech.com/update-the-future-of-intel-arc-discrete-desktop-graphics/
 
*no* decision has been taken yet

That's not a definitive "it isn't going anywhere" either, lol. It's "we're thinking about it" at best, if taken at face value and not just PR.

MLID really hit some raw nerves outside of Intel here it seems 🍵

they remain committed to the roadmap through 2024 at the very least.

.... at least until they aren't committed via orders from up above anymore 🍵
 
If discrete cards were used exclusively for entertainment, I could see Intel quitting.

But now the real money is in compute. And common substrate, chiplet designs look like they will be the top performing solutions.

If Intel quits, that leaves AMD as the only company capable of making those designs (not really but the only vertically integrated one)

I think there is too much cash for Intel to ignore. But everyone will have to continue playing Call of Doodie on amd/nvidia.
Intel can't afford not to be in that market, and they and their shareholders know it; they would lose out on so many government contracts that it would make a noticeable dent in their bottom line, not to mention all the university grants.
AMD has finally got their MI250 available, and it's starting to trickle in. The designs I am seeing from the big guys using them are VERY nice and currently trump the Intel solutions by a long mile; the Nvidia A100 solutions even look bulky in comparison. AMD and their partners are throwing the usual design methodologies out the window, though: these things need some serious power delivery and heat dissipation, because the density is just insane. But AMD is going hard this round, and it seems they have actually managed to get their software game to a decent place, including some toolsets for converting and optimizing CUDA code for the open platforms to really target the entrenched Nvidia customers.
Nvidia is going to need to go hard on their ARM offerings, ownership or not, and Intel, while they supposedly did really well with Ponte Vecchio, has to make sure they stay in that game.
 
As MLID puts it, Intel just can’t afford to subsidize a division that it feels won’t make the company money for too many years, given the overall difficult fiscal waters the firm is currently navigating.

Now, this may not turn out to be true, so let’s hope that’s the case – but if Arc is to continue, it sounds like it will most probably be in a much more cut-down form than the grander plans of an entire range of laptop and desktop GPUs as put forward with Alchemist. MLID does theorize that maybe we will see a Battlemage discrete product, like some kind of low-end laptop graphics solution, but what we seemingly won’t get is a flourishing range of GPUs to meaningfully challenge AMD and Nvidia.

https://www.techradar.com/news/intel-arc-gpus-could-be-canceled-already


Intel isn't going to stop iterating ARC, because the US government is paying Intel 11 figures of taxpayer money to build domestic chip manufacturing capable of producing high-end AI (GPU) chips.

*** Note -- this is a rare new bipartisan agreement on the US CHIPS Act -- even CNN and Fox News agree, because of scary geopolitics ***

They may throttle up/down. Adjust schedules of consumer product, aka postpone ARC 1st-gen (ARC1) a la Cyberpunk 2077. But was Cyberpunk 2077 cancelled? Tell me. Was Cyberpunk 2077 cancelled? And remember, CD PROJEKT RED could release something better, like a new Witcher (ARC 2nd/3rd gen, which I'll call ARC2 or ARC3); you know.

Intel is essentially (in a roundabout way) unable to cancel it outright. If they scale it back, that's only a slight lifting of the gas pedal. Definitely not a cancel spin. Probably a few demoralized middle management idiots at Intel (Larrabee greybeard middle management probably, mistaken for being "the top" when there are still more cake layers above them). This confused some youtubers who wanted revenue from misinterpreted sensationalism, but you have to follow the money even higher up than demoralized middle management. Once you go to the top above the top, you start to realize ARC isn't cancelled. And there's another top above Intel's CEO/CFO: the US government. So here I go with my mic drop:

The reason is that Intel is now ball-and-chained to the US CHIPS Act. Intel is being paid 11 figures to build domestic chip manufacturing capacity in North America capable of fabbing top-of-the-line GPU and AI chips.

Because of this, the people claiming ARC is cancelled = 100% hyperbole.

ARC1 might be mid and become a lame launch in the media's point of view. But a future generation ARC2 or ARC3 or ARC4 (or whatever rebrand for relaunch) will be competitive because of the new local chip factory they're currently building in Ohio, and they want to expand the one in Arizona too.

If the publicity gets bad enough, I wouldn't be surprised if they just rename and relaunch, but it'd still be "never-cancelled ARC technology". It'd only be a consumer launch that might be scaled back until 2nd gen (with more enterprise sales), but they'll still pour massively into the IP, since GPU and AI research go hand in hand, given their massively overlapping processing needs.

Apparently, the US doesn't want Chinese AI chips in domestic US technology, and they are desperate to see Intel build a TSMC clone (in fact, even TSMC is concurrently being paid to build something on the other side of the pond too). It will be several years before the plant sprays out high-end chips for relatively cheap, but the game is on. Idling a brand-new, bleeding-edge, 100-billion-dollar EUV chip factory is not an option. So if they don't have a mature ARC variant, they risk losing lots of money and upsetting the US government over a lack of domestic AI-chip manufacturing capacity.

They apparently have no choice but to keep ARC-like technology on the R&D burners at a fairly high setting, even if they cost-cut a bit initially through intentionally strategic procrastination, given the long lead time of building a chip plant. Sing along: "Shareholders Todaaaay, Shareholders Tomorrowwwwww, Thread The Neeeeedle, Between Todaaaaaay and Tomorrowwwwwwww! Oooohhh Oooohhhh". Do that at the karaoke mic in your favourite voice impersonation and whatever pentameter you want.

Different shareholders will fight over how important today is versus tomorrow -- the neverending holy war -- so Intel will need to thread this equilibrium. The government simply made tomorrow's shareholders more important than today's shareholders, sealing ARC's fate as a continuous technology iteration, regardless of what happens to the ARC 1st-gen consumer launch.

But know this: the AI boom isn't going to go away, and the US is panicking over its inability to build good AI chips domestically. It's a bigger problem than not being able to launch anything after the Space Shuttle was retired in 2011. Once Intel needs to milk a historically, momentously, ginormously, fiendishly expensive government-funding-assisted EUV chip plant built in the USA, they have to spray out all those ARC3s and ARC4s or lose lots of profit.

If they don't iterate ARC1 now, they won't be ready to milk the 2025+ cash cow. Shareholders fire-breathing angry in 2025+. Oops. So they gotta thread the needle between today's capital intensity (shareholders mad) and having an idling money-printing machine (EUV chip factory). Shareholders madder. Rock. Hard place. So no choice but to keep working nonstop on ARC technology (at various strategic gas pedal pressures), even if they have to keep relaunching it.

Maybe a reluctant slog and grind (consumer-facing ARC1) before a successful ARC2/3 (or a renamed relaunch if the media keeps crapping on the ARC name so much). Being designed as a multipurpose architecture that also helps AI, this is critically important to Intel.

Bottom line: you heard it from me that this isn't going to be Larrabee. The 11-figure leg iron is already attached to both of Intel's legs.

Follow the money beyond Intel. Intel is stuck with the long grind of improving ARC-related technologies this decade.

P.S. Blur Busters is a Canadian company. Although I have many friends at NVIDIA, the media has made some obvious bullshit calls, based on my investigation. As Blur Busters, I have better contacts than many people in the media. Nobody employed at Blur Busters has ever had stock or a stake in Intel since Blur Busters' inception, as of this time of writing (September 2022). True outsider call here.

</micdrop>
 
Let's not forget the HD 4850/70 and 4890 XT

A real money move is taking the kneecaps out from under the nForce 4 chipset that had been sucking your blood like a vampire = the bad blood that flows between AMD and Nvidia = the Blood Wars.
 
Intel isn't going to stop iterating ARC, because the US government is paying Intel 11 figures of taxpayer money to build domestic chip manufacturing capable of producing high-end AI (GPU) chips.

...

Bottom line: you heard it from me that this isn't going to be Larrabee. The 11-figure leg iron is already attached to both of Intel's legs.
AMD and Nvidia AI/GPUs could be made using Intel Foundry, so Intel ARC and successors are not required for chip making, AI or GPUs. I do believe Nvidia is working on a possible partnership with Intel Fabs for their future chips.

I don't see how Intel will pull the plug on ARC at this stage, we will see.
 
They may throttle up/down. Adjust schedules of consumer product, aka postpone ARC 1st-gen (ARC1) a la Cyberpunk 2077. But was Cyberpunk 2077 cancelled? Tell me. Was Cyberpunk 2077 cancelled?
No, but it should have been.

ARC1 might be mid and become a lame launch in the media's point of view. But a future generation ARC2 or ARC3 or ARC4 (or whatever rebrand for relaunch) will be competitive because of the new local chip factory they're currently building in Ohio, and they want to expand the one in Arizona too.
A GPU will be competitive because a new plant is being built?!
 
When they realize they must embrace GPU fully or die.
Which AMD realized 10+ years ahead of Intel. Debt from the ATI acquisition choked AMD and contributed to a years-long slump. Even the pre-Su leadership deserves credit for sticking it out and realizing what the actual liabilities were (foundries that couldn't keep up with the cutting edge) and what investments would pay off.

Intel, on the other hand, has the resources to both rejoin the foundry leadership and develop GPU. That combo will pay off massively in the future.
 
The reason AMD made this look easy was the fact that they bought ATi and kind of let it do its thing.

Intel has been acquiring companies recently as well. Part of this was prior to current macro trends, but with the money they have in the bank, and incoming funds from the CHIPS Act offsetting costs, they could go out and look for more companies accretive to their bottom line and/or in line with future growth plans.
 
Digital Foundry ARC discussion with Tom Petersen of Intel. The 42-minute mark begins the discussion of Intel ARC's current status and their future product stack plans. Tom alludes to Intel's intent to remain in the discrete GPU business for many reasons and iterations through Druid, although there is no mention of intent afterward.
 
Oh, I have no doubt Intel will be staying in the fight for future AI chips. "Arc" isn't going anywhere....

Consumer GPUs... are not what the government is giving them money for, and not Intel's future either.

Intel consumer GPUs are probably doomed. Intel can't handle not being #1, and #1 in GPUs is so far out of their reach that, yeah, they are not going to be shipping consumer cards, at least not the cards any of us care about.
 
Given reports like this https://hardforum.com/threads/my-arc-380-adventure.2021738/ it seems plausible that Intel will stop releasing AIB cards next generation and focus on laptop and server chips, where they can validate the supported configurations.
I'm not an expert on this subject, but I feel like the i740, Larrabee, and Intel's current GPU are all first tries.
The i740 was technology they bought from another company that, IIRC, made graphics chips for flight simulators, so it was only an iteration of already-established GPUs.
Intel Arc is the successor to DG1, I think, so it's at the very least a 2nd generation.

But yeah, like with Cannon Lake, their first 10nm CPU, Intel would prefer you forget about DG1 as their first discrete graphics product in modern times.
 
Which AMD realized 10+ years ahead of Intel. Debt from the ATI acquisition choked AMD and contributed to a years-long slump. Even the pre-Su leadership deserves credit for sticking it out and realizing what the actual liabilities were (foundries that couldn't keep up with the cutting edge) and what investments would pay off.

Intel, on the other hand, has the resources to both rejoin the foundry leadership and develop GPU. That combo will pay off massively in the future.
Not telling you they need to go out and get in debt, but AMD saw the long game. Get into GPU as well or perish. Not sure why you chose to take what I said at such a low level of understanding. But you did. Can't you see that in the modern game it requires both to dominate in the near future?
 
Not telling you they need to go out and get in debt, but AMD saw the long game. Get into GPU as well or perish. Not sure why you chose to take what I said at such a low level of understanding. But you did. Can't you see that in the modern game it requires both to dominate in the near future?
I think it made lots of sense for AMD to diversify. They were never in a market where they were #1 in sales. Companies take bigger risks if they are not comfortable in their core business.

Intel's handicap has been being so far out in the lead in CPUs for so many years (in terms of sales). They have grown a deep culture of overconfidence. I get the feeling that, for Intel management, the idea that they are not THE leader is unfathomable. Also deeply embarrassing.

Intel is going to hang with Arc... the question is whether they will bother making consumer-level Arc products. Intel really needs AI processors, and they will get enough in contracts from the US government to build AI processors. If I had to guess, Intel is going to make Arc chips to fill their supercomputer contracts and datacenter needs. Intel has said they expect to lose data center business to AMD for a few years... but also expect to regain it in, what was it, 2025 if I remember right. (And who knows if that is because they have exciting early work on next-gen stuff... or if it's all just ego-based "Intel will overcome" BS that comes from Pat.)

Consumer parts, though... that is a toss-up. I can't imagine Intel releasing many higher-end cards that get laughed at; their egos can't handle that. I do believe, however, that there is a good chance they half-heartedly release low-end cards for the next few generations in lesser markets. I think upper management has been told that around that 2025 mark they will have a product that will make them #1. If Pat believes that... Arc stays. If Pat doesn't believe they will be #1, then Arc as a consumer product will probably get axed. (Pat is one of the worst "Intel is the bestest best company in the world and our engineers are the smartest" types the company has employed... his first go-round at Intel, and his arrogance and hubris were among the drivers that forced them onto one node for a DECADE.)
 
Not telling you they need to go out and get in debt, but AMD saw the long game. Get into GPU as well or perish. Not sure why you chose to take what I said at such a low level of understanding. But you did. Can't you see that in the modern game it requires both to dominate in the near future?
I was simply agreeing with your point and looking at it from a particular angle. AMD paid a high price (more than $$$) for its GPU division and went through some dark times. Their perseverance now has given them some advantages over both Nvidia and Intel (Nvidia doesn't have X86*, Intel is behind on GPU). AMD isn't outselling either competitor, but they are making gains.

Intel has the capability to match that X86+GPU combo. Though the cost might be high and it will take some time, it will be quicker and the cost proportionately lower than what AMD went through.

*and Jensen really hates that
Consumer parts, though... that is a toss-up. I can't imagine Intel releasing many higher-end cards that get laughed at; their egos can't handle that. I do believe, however, that there is a good chance they half-heartedly release low-end cards for the next few generations in lesser markets. I think upper management has been told that around that 2025 mark they will have a product that will make them #1. If Pat believes that... Arc stays. If Pat doesn't believe they will be #1, then Arc as a consumer product will probably get axed. (Pat is one of the worst "Intel is the bestest best company in the world and our engineers are the smartest" types the company has employed... his first go-round at Intel, and his arrogance and hubris were among the drivers that forced them onto one node for a DECADE.)
"Higher end" is relative, of course. Intel's current challenges are
1) Don't be broken
2) Price appropriately for the market

If the scheduler rumors are true, point 1 might not be fixed until the 2nd Arc generation. With their one released card, Intel is doing OK on point 2. Obviously if Intel has a card with X070/X700 XT performance, they can't charge X080/X800 XT prices. They're going to have to forgo the Intel tax to get a foothold in consumer GPU.
 
"Higher end" is relative, of course. Intel's current challenges are
1) Don't be broken
2) Price appropriately for the market

If the scheduler rumors are true, point 1 might not be fixed until the 2nd Arc generation. With their one released card, Intel is doing OK on point 2. Obviously if Intel has a card with X070/X700 XT performance, they can't charge X080/X800 XT prices. They're going to have to forgo the Intel tax to get a foothold in consumer GPU.
I would agree with you if we were talking about AMD getting into a new market... or even Nvidia (if they had a license) trying to get into x86. However, this is Intel. If the best card Intel can manage is only going to be x070 range... I can't imagine Pat and other management at Intel letting that really happen.

I think they may let it limp along for a couple of generations... where all we ever see is the super low end, and mostly those will be released in places like China. I think the only way Arc can stick around is if the GPU division has really convinced Pat that at some point they can release an x090-class card that will instantly catapult Intel to first place.

The idea of Intel being a value brand in any market chaps their asses.
 
Consumer parts, though... that is a toss-up. I can't imagine Intel releasing many higher-end cards that get laughed at; their egos can't handle that. I do believe, however, that there is a good chance they half-heartedly release low-end cards for the next few generations in lesser markets. I think upper management has been told that around that 2025 mark they will have a product that will make them #1. If Pat believes that... Arc stays. If Pat doesn't believe they will be #1, then Arc as a consumer product will probably get axed. (Pat is one of the worst "Intel is the bestest best company in the world and our engineers are the smartest" types the company has employed... his first go-round at Intel, and his arrogance and hubris were among the drivers that forced them onto one node for a DECADE.)
This makes the most sense, since we all know Intel can't touch AMD and Nvidia no matter what. On the other hand, low-end or even mid-range cards don't need to prove anything with performance, and they can price them low and hopefully penetrate the market. Also, AMD and Nvidia have left the low and mid-range market mostly alone. I don't believe they have sub-$200 cards in the market.
 
This makes the most sense, since we all know Intel can't touch AMD and Nvidia no matter what. On the other hand, low-end or even mid-range cards don't need to prove anything with performance, and they can price them low and hopefully penetrate the market. Also, AMD and Nvidia have left the low and mid-range market mostly alone. I don't believe they have sub-$200 cards in the market.
They have quite a few of them at the moment, according to PCPartPicker:
https://pcpartpicker.com/products/video-card/#sort=price&page=1

One RX 6400 is $150
The GTX 1630 is $160
Some RX 6500 XTs are $160

It is not necessarily easy, even for an Intel, to beat the offer of a $160 6500 XT, a $170 GTX 1650 or 1050 Ti, or a $199 1660/1650 Super.

Right now there seems to be a 6650 XT available at $278. That buys roughly 115% of a 2019 5700 XT's performance for about 60% of the 5700 XT's price in real money, so you would need to deliver roughly twice the performance per dollar of a 5700 XT to compete in the upper mid-range.
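A quick back-of-the-envelope check of that last claim (a minimal sketch; the 115% and 60% figures are the rough approximations from the post above, not benchmark results):

```python
# Rough perf-per-dollar check using the figures quoted above
# (approximations from the post, not benchmark data).

perf_vs_5700xt = 1.15    # ~115% of a 2019 RX 5700 XT's performance
price_vs_5700xt = 0.60   # ~60% of the 5700 XT's price, in real terms

perf_per_dollar = perf_vs_5700xt / price_vs_5700xt
print(f"Perf per dollar vs. the 5700 XT: {perf_per_dollar:.2f}x")  # ~1.92x, i.e. roughly double
```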
 