Intel's Next Generation GPUs to be Made by TSMC, Celestial Set for 3 nm Process

erek

[H]F Junkie
Looking pretty interesting for Intel

“The request for a supposedly very large quantity production of next-gen Intel GPUs has surprised industry analysts, especially when remembering the late arrival of the Arc 6 nm-based family of cards, also manufactured under contract by TSMC. Final silicon was ready by the middle of 2022, but further delays resulted in cards not reaching customers until later in the year. Intel looks determined to secure its third pillar in a market segment long dominated by NVIDIA and AMD, and reports of the bulk orders at TSMC seem to confirm that ambition, alongside continued improvement of drivers for the current crop of Arc cards. GPU R&D projects are ongoing and want to meet the demand from entertainment consumers (video games) and enterprise (artificial intelligence assisted tasks) alike. In the more immediate future, Intel is expected to launch a refreshed "Alchemist+" range of graphics cards. Insiders are pointing to a late 2023 launch for the Arc refresh.”

Source: https://www.techpowerup.com/306780/...e-made-by-tsmc-celestial-set-for-3-nm-process
 
How has this surprised "industry analysts"? The current Intel GPU lineup is made by TSMC; they signed the contracts years ago and expressly stated that they would be using them. Hell, Intel's roadmap has it plain as day that they would be using TSMC 3 nm for their mobile releases in 2023.
Back in March of 2021, Intel's roadmap expressly stated that they would be using an external foundry and its 3 nm process for their GPU launch in 2023 (https://www.pcmag.com/news/intels-7nm-pc-chip-will-arrive-in-2023-using-tsmcs-tech).
I mean, the only thing that could be considered surprising here is that Intel is actually on schedule for the first time in a decade.
 
Are they? When was Battlemage supposed to show up?
Late 2023 or early 2024, per their initial reports back in the day, but I recall it being said that because of market conditions they were going to push that back to mid-to-late 2024, fearing they'd release a product with no market to buy it. Honestly, I can't fault them there: the Arc cards are primarily moving in segments where cheap modern GPUs are in demand, while Battlemage steps the game up and should compete with AMD and Nvidia in the mid-range, where there is currently an excess of cards and not much in the way of sales for what is there. So letting the market settle before cramming it with more product is the smart move.
 
At least they're consistent. Probably 4070-tier performance in time to compete with Nvidia's 5000 series.
 
But at some point that's fine; unless you're trying to play at 4K or making extensive use of ray tracing, a 4070 will leave you happy everywhere except in your bank account.
AMD and Nvidia are content to let their previous generations of hardware fill that lower tier. Intel doesn't have previous generations available, so they need to release products priced to compete in that market, because no serious gamer at this stage is going to risk dropping $1,200 on a GPU with the driver and support issues Intel currently has.
Intel is not yet in a position where they can do day-one drivers for the latest AAA releases, but they certainly are getting to a place where the popular MOBA, RTS, and MMO titles are supported. Next they have to work on the games that are actively streamed and try to get them on board for a following; only then can they work on the latest AAA titles, and that is a third-gen goal at the earliest.
Intel is the underdog, not a place they are familiar with, and they have to play the slow game here. Fortunately, they can copy a number of their competitors' playbooks, so they aren't having to reinvent any strategies; they can literally adapt the ones that were used to steal their own market positions away from them.
 
Plot twist: Intel GPUs succeed and AMD gets out of the discrete market, since they clearly don't take it seriously anyway.

5 years from now we'll still be in the same boat of only 2 vendors.
 
That would cost AMD their console contracts, and that would cost them too much. They actually make a lot of money there with very little expense on their side; it is the definition of easy money for them, and they won't give that up without a fight.
 
I said get out of the discrete market. They don't use discrete cards in consoles.
 
No, but it is kind of their thing; the two go hand in hand in a fun little symbiotic cycle.
Their console R&D generates new features that trickle into the consumer market, and the usage and evolution of those features there trickle back into the consoles. In this way the game developers, through their work pushing the designs, create the wanted features of the next generation in a wonderful cycle.

AMD doesn't have the budget to work like Nvidia. Nvidia creates a cool new bit of technology, then goes around to developers and says "Look at this cool thing you can do now!" and developers pick up on it. AMD goes the opposite direction: they deliver solid hardware, let developers do something really cool with it, then find a way to take it a step further. Something something "Together we Advance".
If AMD were to lose much more market share or step out of the discrete market altogether, it risks breaking that little cycle they have going, which risks stagnation or cost increases, which in turn make other options look viable.

Picture, if you will, two years from now: Microsoft announces a new Xbox with an Intel-based APU that gives impressive performance figures (compared to the current Xbox), or Sony announces an Nvidia Grace/RTX-powered console for the PS6. What would that do to AMD?
 
What about Qualcomm?

(Bookmarking this post for posterity.)
I was under the impression that Qualcomm and MediaTek were signed up for the N3E process but were holding off on the N3/N3B process because the costs didn't justify the gains.
 
AMD doesn't have the budget to work like Nvidia. Nvidia creates a cool new bit of technology, then goes around to developers and says "Look at this cool thing you can do now!" and developers pick up on it. AMD goes the opposite direction: they deliver solid hardware, let developers do something really cool with it, then find a way to take it a step further. Something something "Together we Advance".
Nvidia doesn't just do show & tell with developers, after which the developers voluntarily implement those new features; Nvidia pays the developers and provides its own engineers to implement these features in their games.
 
Yeah, and the point is AMD has neither the resources nor the budget to take that approach; they just make hardware and tell developers to have fun. Then they create an open-source project that does its best to replicate the cool Nvidia thing, so they don't get left behind but also don't have to spend the huge amount to develop it themselves, all while getting to play the "we're open" card. It works for them while keeping costs down.
 
And you might add into that whole equation that Ritche Corpus and Steve Bell left and went to Intel a couple of years ago and took most of the ISV team with them. They did an incredible job with a shoestring budget at AMD for well over a decade.
 
Yeah, Intel came to play, and they're throwing resources at their GPU division like it's a hill to die on. I mean, it is for them, but it's good that they don't seem to be half-assing it.
 
A somewhat mature driver and software stack by then could make for a much more interesting launch, review-wise. Who knows, maybe a lot of the hardest and costliest work is behind them, and they feel confident there is enough margin to offer a product at a price that competes.

Does that mean that all of this:
https://www.pcgamer.com/intel-arc-rumour-raja-koduri-twitter-response/

was wrong, or did the product getting so much better on the software and driver side over a relatively short amount of time turn Intel around?
 
Dropping the plan to hand-tune the driver for still-popular DX9 games and just using a DX9-to-DX12 wrapper was a huge step. I'm sure Intel saved time and silicon complexity by not baking older standards into the hardware; they should have used the wrapper from the start.
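Microsoft actually ships a public version of that idea as D3D9On12, and an application can opt into it explicitly at device-creation time. Here's a minimal sketch of what that looks like from the app side (my own illustration, assuming a Windows 10 2004+ SDK; as I understand it, Intel's driver enables the equivalent path transparently, so this isn't their code, just the same mechanism):

```cpp
// Minimal sketch: opting a legacy Direct3D 9 application into Microsoft's
// D3D9On12 translation layer, which maps D3D9 calls onto D3D12 underneath.
// Assumes a Windows 10 2004+ SDK; link against d3d9.lib.
#include <d3d9.h>
#include <d3d9on12.h>
#include <cstdio>

int main()
{
    // Ask the runtime to back this IDirect3D9 object with D3D12 instead
    // of a native D3D9 user-mode driver. Leaving pD3D12Device null lets
    // the layer create a D3D12 device on the default adapter itself.
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;

    IDirect3D9* d3d9 = Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
    if (d3d9 == nullptr)
    {
        std::printf("D3D9On12 is not available on this system.\n");
        return 1;
    }

    // From here on the application uses the ordinary D3D9 API; the layer
    // translates draws, state, and resources to D3D12 behind the scenes.
    D3DADAPTER_IDENTIFIER9 ident = {};
    if (SUCCEEDED(d3d9->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        std::printf("D3D9 running on: %s\n", ident.Description);

    d3d9->Release();
    return 0;
}
```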
 
Intel and TSMC also have deals in place for 4 nm and 3 nm stretching out to 2026, so it doesn't look like Intel has any immediate plans to kill the lineup.
I mean, they can't afford to at this stage. GPUs have replaced CPUs as the big must-have in the datacenter, with something like 4 GPUs sold for every 1 CPU there. Intel can't let that market slip by any longer, because by not playing there they are just handing AMD and Nvidia a fat paycheck.
 
I think the wrapper was always the plan; it was more that getting it working in the wild was a lot harder than getting it working in the lab.
 