[Rumour] Intel to cut back on Arc discrete GPU

Marees

[H]ard|Gawd
Joined: Sep 28, 2018 · Messages: 2,039
As MLID puts it, Intel just can’t afford to subsidize a division that it feels won’t make the company money for too many years, given the overall difficult fiscal waters the firm is currently navigating.

Now, this may not turn out to be true, so let’s hope that’s the case – but if Arc is to continue, it sounds like it will most probably be in a much more cut-down form than the grander plans of an entire range of laptop and desktop GPUs as put forward with Alchemist. MLID does theorize that maybe we will see a Battlemage discrete product, like some kind of low-end laptop graphics solution, but what we seemingly won’t get is a flourishing range of GPUs to meaningfully challenge AMD and Nvidia.

https://www.techradar.com/news/intel-arc-gpus-could-be-canceled-already

 
When I first heard Intel was making cards it turned out to be that FPGA thing someone found in a dumpster. Then Arc came along and I got excited again. And, yet again, my hopes for 3 serious GPU makers ended up in the dumpster.
 
I never got the impression that hardware was the problem. There are too many games with particular hardware-favoring characteristics for a newcomer to come along and not have to optimize the ever-living crap out of each and every title.
 
It's a perfect storm of shit, which is why they're cancelling it: they couldn't get it right and wouldn't for some time (that alone wouldn't mean they axe it), the crypto boom ended so the second-hand market is flooded with established GPUs (as others have pointed out), there's the recession, and their stock in particular has been hit very hard in this economy, partly due to their own company-wide shortcomings as of late (stock is $53.21 > $31.46 YTD, -40.87%).
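Quick sanity check on that YTD figure (a minimal sketch in Python; the two prices are just the ones quoted above):

```python
# Back-of-envelope check of the quoted INTC year-to-date drop.
start_price = 53.21    # price at the start of the year, as quoted above
current_price = 31.46  # current price, as quoted above

ytd_change = (current_price - start_price) / start_price * 100
print(f"YTD change: {ytd_change:.2f}%")  # -40.88% (the -40.87% above is the same figure, truncated)
```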

It's obvious something was gonna be on the chopping block. Name a more valid candidate than the (underwhelming, to say the least) consumer GPUs that just launched and aren't well established in any sense, whether in time on the market or in performance.
 
This industry desperately needs competition. Did Intel seriously think they would get it right on their first attempt? I wish they would learn from their mistakes and move forward.

Intel has made a lot of mistakes and not learned from them. If they are one and done in Arc, they weren’t really committed to it in the first place.
 
It sounds like the hardware is pretty solid; it's the software they can't get right. I heard (SemiAccurate?) that the shader compiler team was based in Russia and everything went into the toilet when the Ukraine invasion happened.
 
Intel should sell these cards at cost, get them out there with low expectations. People will buy them for generic 2D and encoding tasks.
Once there are enough cards in the wild, game makers will have to work with Intel, and Intel can use their experience to write a better driver.

I think that if they had used that strategy with their Itanium processor, there would have been enough units sold to convince the industry to take it seriously. We might all be running Itaniums today.
 
Once there are enough cards in the wild, game makers will have to work with Intel, and Intel can use their experience to write a better driver.

Wouldn't happen with what they have available to put out, and it ignores that game makers could still... just ignore them, with an install base that paltry.
 
Wouldn't happen with what they have available to put out, and it ignores that game makers could still... just ignore them, with an install base that paltry.

Maybe, or maybe not. It's a better idea than just shutting down the whole division. Even Nvidia and AMD have had to survive turd product generations every now and then.
 
Maybe, or maybe not. It's a better idea than just shutting down the whole division. Even Nvidia and AMD have had to survive turd product generations every now and then.

That's easy to say when you're not bankrolling that possibly futile gamble. AMD and Nvidia are already established; they just keep on trucking when times are bad because this is already what they do, have done, and will continue to do.

 
Maybe Intel should have killed their Xeon line because it has stunk compared to AMD? And all those Spectre flaws? Shut down the entire server division!
 
I don't have any skin in this game.

It's just that I think I can run Intel better than their management team.
 
It really feels like the only thing that actually pushed them into this market was the insane prices people were paying for graphics cards. Who wouldn't want a piece of that pie? Now with prices crashing, the dynamic has changed.

Still, it would be nice if they focused more on the long term rather than constantly shifting direction based on the short-term price roller-coaster. Unfortunately, when shareholders focus on the latter, it pretty much means the company has to as well...
 
It's just that I think I can run Intel better than their management team.

Yeah, I think it should be clear to anyone that this wasn't Intel failing with technology, this was Intel failing with management. This was all avoidable, and a discrete graphics division was easily within their reach.
 
Intel cannot afford to NOT be in the GPU business. They may or may not sell consumer cards, but CPUs are not the stars of the server world anymore. Intel needs GPU to stay relevant in the long term, and -$500 million is peanuts compared to the consequences of failing. With $27 billion cash in the bank, Intel's not going broke over GPU.

There's a story that, in the early 60's somebody in the NFL head office told commissioner Pete Rozelle, "Did you hear Lamar Hunt lost a million dollars on the AFL last season?" To which Rozelle replied, "At that rate, he can only keep it going for another 100 years."
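Rozelle's quip is just runway math, and the same napkin arithmetic applies to the Intel figures above (a rough sketch; it obviously ignores that the cash also has to fund fabs, dividends, and everything else):

```python
# Rough runway estimate from the figures quoted above.
cash_on_hand = 27e9        # ~$27 billion cash in the bank
gpu_loss_per_year = 500e6  # ~$500 million/year lost on the GPU division

years_of_runway = cash_on_hand / gpu_loss_per_year
print(f"Runway at that burn rate: {years_of_runway:.0f} years")  # 54 years
```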
 
Yeah, but they also can't afford to leave the consumer GPU market, either. They need integrated graphics, and they need to be good enough to game with, or AMD will take that whole market, which includes most laptops these days.
 
Myself, I see this as a symptom of Intel going through rough times. We all know that their stock has been falling for a while now (overall). They'll figure it out I'm sure but consider the recent history here. How long did it take Intel to get off 14nm? I wonder how much money they sunk into that venture. Add in new fab. GPU business is another venture that didn't work out so well (so it seems). All this adds up and they just don't have the money. I'm not so sure Intel is out of the GPU business for good but I really don't find this news surprising.
 
Yeah, but they also can't afford to leave the consumer GPU market, either. They need integrated graphics, and they need to be good enough to game with, or AMD will take that whole market, which includes most laptops these days.
I don't know. Most laptops don't need much graphics horsepower, and Intel far outsells AMD in laptops. Intel's Xe is good enough for the job, even if AMD's integrated (especially now that RDNA is finally replacing Vega) is better. Intel would love to have all-Intel laptops with discrete GPU, but that's probably priority #3 for their GPU division.
 
The reason AMD made this look easy was the fact that they bought ATi and kind of let it do its thing.
Except it wasn't easy for AMD. They bought ATi and got a golden egg in the HD 4870 and 5870 almost immediately. After that, they struggled for a really long time before FINALLY getting RDNA 2 out and starting to turn up the heat on Nvidia. For the longest time, Nvidia was untouchable on the high end, and as everyone here knows, if you don't own the high end, you're second place in the minds of consumers.

Intel, who is literally creating a GPU architecture from almost nothing and has ZERO experience in the discrete GPU market, thought they could force their way into the market with a team of the best engineers in the world... but they had to know that it was going to be a monumental task to create a competitive product. They had a close-to-ZERO chance of doing it with the first generation, and the fact that it's being killed off now means that someone (or a group of someones) screwed up big time in R&D and it wasn't discovered before production began. We're talking GeForce FX or Fermi levels of screwup here, maybe even worse. Unlike Nvidia, though, Intel doesn't have mindshare in the GPU market, so they can't just force it through with good marketing (looking at you, Tom Petersen); they need it to perform, and it sounds like it isn't going to.

This just sucks. The duopoly in this market is devastating for consumers.
 
Intel was never going to nail it the 1st time, and it honestly seems like people have zero patience for anything but "does everything I buy brand X for, but cheaper". I really don't see how Arc discrete ever had a real chance.
That^^^ and Larrabee were their 1st and 2nd tries. Hence, "third time isn't the charm".
 
Intel cannot afford to NOT be in the GPU business. They may or may not sell consumer cards, but CPUs are not the stars of the server world anymore. Intel needs GPU to stay relevant in the long term, and -$500 million is peanuts compared to the consequences of failing. With $27 billion cash in the bank, Intel's not going broke over GPU.
$500 million in losses per year is not chump change when it will take another five years minimum for Xe to catch up with Ampere's superior memory compression!

AMD: five years from Tonga (their first memory compression) to RDNA 1.

And oh yeah, did you forget about this little trend from before the Ethereum proof-of-work mining rush? How likely is that rush to continue post-stake?

[Chart: JPR historical annual GPU sales, Q2 2016]



If Intel wants to make money post-stake, the only way is by having a superior GPU within a couple of years; that's not something you can rush, unlike when they threw money at the NAND catch-up program:

https://www.intel.com/pressroom/archive/releases/2005/20051121corp.htm
 
That^^^ and Larrabee were their 1st and 2nd tries. Hence, "third time isn't the charm".

Arc would actually be their 5th or 6th try, that we publicly know of anyway.

After the i740 failed, Intel tried again a year later in 1999 with the i752 and i754. The latter was cancelled before release, but the i752 did see a tiny number of units sold. The i752 too was cancelled just a few months after release because it was barely better than the i740 it was intended to replace.

In 2000, they were going to try again with "Tinma", which would have been Intel's first foray into integrated graphics on the CPU die, but that too was cancelled, a casualty of the Rambus fiasco.
 
Comment from Raja Koduri

we are (helpless emoji) about these rumors as well. They don’t help the team working hard to bring these to market, they don’t help the pc graphics community..one must wonder, who do they help?..we are still in first gen and yes we had more obstacles than planned to overcome, but we persisted…

https://twitter.com/RajaXg/status/1569150521038229505?s=20&t=xZSCCnSca9r47-N1mSeuOA

I hope Arc discrete makes it, but it will be a long struggle, and shareholders will have to show lots of patience.
 
You will probably see their graphics division recycled into better iGPUs for their processors in the upcoming generations. Nothing mind-blowing, to be certain, but likely a bit of an uptick. They aren't exiting the compute market for data center, so there will be a foundation for consumer cards at some point when they inevitably relaunch the division and try again.

No point in worrying about competition. China is going to eventually start releasing outright copies of graphics hardware and/or their own discrete stuff that will hit the low end of the market, where there is almost nothing. Good chance they come out swinging at some point with something competitive. After they steal everyone else's tech and integrate it.
 
You will probably see their graphics division recycled into better iGPUs for their processors in the upcoming generations. Nothing mind-blowing, to be certain, but likely a bit of an uptick. They aren't exiting the compute market for data center, so there will be a foundation for consumer cards at some point when they inevitably relaunch the division and try again.

No point in worrying about competition. China is going to eventually start releasing outright copies of graphics hardware and/or their own discrete stuff that will hit the low end of the market, where there is almost nothing. Good chance they come out swinging at some point with something competitive. After they steal everyone else's tech and integrate it.
This is not the competition anyone is hoping for.
 
I think they missed their window to release first-gen Arc. With how they were marketing it, they were competing against the Ampere midrange, and we're at the end of that generation at this point. They needed to release it a year ago. Now they need to focus forward on the post-Lovelace generation and put out a competing product for that instead, or else they will always be a generation behind.
 
Except it wasn't easy for AMD. They bought ATi and got a golden egg in the HD 4870 and 5870 almost immediately. After that, they struggled for a really long time before FINALLY getting RDNA 2 out and starting to turn up the heat on Nvidia. For the longest time, Nvidia was untouchable on the high end, and as everyone here knows, if you don't own the high end, you're second place in the minds of consumers.

Intel, who is literally creating a GPU architecture from almost nothing and has ZERO experience in the discrete GPU market, thought they could force their way into the market with a team of the best engineers in the world... but they had to know that it was going to be a monumental task to create a competitive product. They had a close-to-ZERO chance of doing it with the first generation, and the fact that it's being killed off now means that someone (or a group of someones) screwed up big time in R&D and it wasn't discovered before production began. We're talking GeForce FX or Fermi levels of screwup here, maybe even worse. Unlike Nvidia, though, Intel doesn't have mindshare in the GPU market, so they can't just force it through with good marketing (looking at you, Tom Petersen); they need it to perform, and it sounds like it isn't going to.

This just sucks. The duopoly in this market is devastating for consumers.
Don't exclude the competitive products AMD offered after the 5870.

The 7970 was fantastic and nearly unmatched. The 290 was very good. It was only from Fury until recently that the high end was truly undisputed. And AMD did just fine there, running the highly successful midrange Polaris cards (through multiple mining rushes).


The GPU game has been far more competitive than the CPU space was prior to Zen.
 
That^^^ and Larrabee were their 1st and 2nd tries. Hence, "third time isn't the charm".
I'm not an expert on this subject, but I feel like the i740, Larrabee, and Intel's current GPU are all 1st tries.

Here's my reasoning: to have a proper 2nd try, you need mostly the same team iterating on an initial design immediately after the 1st design. That way everything is still fresh and the team can actually build off its mistakes.


After the i740, its successor was cancelled. The next GPU, Larrabee, didn't seem like a proper graphics-focused card; see Matt Pharr's and Tom Forsyth's posts about Larrabee. Larrabee died out (although, from what I understand, a lot of the learnings from it lived on in AVX). Now the new Intel GPU is again a fresh project.


It just doesn't seem like Intel ever stuck to one project and directly iterated to the next one with a graphics-optimized product.


It would seem silly to me for Intel to once again sideline a general-purpose parallel processor. It seems like a very large investment to just shut down after the 1st product. However, just because I think it's silly isn't a good argument that Intel won't do it again.
 
Don't exclude the competitive products AMD offered after the 5870.

The 7970 was fantastic and nearly unmatched. The 290 was very good. It was only from Fury until recently that the high end was truly undisputed. And AMD did just fine there, running the highly successful midrange Polaris cards (through multiple mining rushes).


The GPU game has been far more competitive than the CPU space was prior to Zen.
Forgot about the 7970 and 290. You're right, those were both excellent products as well.
 