Intel reaffirms Xe2 Battlemage "is coming"

Regardless of how it performs, I'm just glad to see them sticking with it and remaining in the discrete GPU business. I was legitimately afraid that they would give up and bow out. Even if their card can't compete on the top end, I'll still be happy if it expands choices and brings down prices in the mid-to-low end. It's still disgusting how much it costs to get a new card for something like a budget PC build or an HTPC. Nvidia is still trying to sell cards like the GT 710 to people who might not even have been born when it first came out.
 
Regardless of how it performs, I'm just glad to see them sticking with it and remaining in the discrete GPU business. I was legitimately afraid that they would give up and bow out. Even if their card can't compete on the top end, I'll still be happy if it expands choices and brings down prices in the mid-to-low end. It's still disgusting how much it costs to get a new card for something like a budget PC build or an HTPC. Nvidia is still trying to sell cards like the GT 710 to people who might not even have been born when it first came out.

They don’t really have a choice, unless they want to bow out of any prospect of participating in the data centre and AI markets going forward. They need to develop good GPU solutions to remain relevant.
 
Yes? The A380 is the perfect card for something like an HTPC now that the drivers are better and you can get them for $120. An RX 580 would arguably be better for gaming, but the A380 will be better at everything else.
You do realize that these Arc cards are now being sold at a huge loss after gaining nearly zero traction in the market. If you think Intel's focus is to provide cheap GPU solutions going forward, you would be mistaken.
 
You do realize that these Arc cards are now being sold at a huge loss after gaining nearly zero traction in the market. If you think Intel's focus is to provide cheap GPU solutions going forward, you would be mistaken.
The only question left unanswered about Intel's plans is how long their board will let them sell at a loss. Intel has to play the underdog for a good while; if they get to a 4th generation I will be pleasantly surprised. But I really hope the 2nd-gen parts are competitive and stable.
 
Don't you have a thread for this?
https://hardforum.com/threads/new-battlemage-rumors.2030305/



I'm still surprised they've held on this long; I was not expecting it.
This may be Intel's hill to die on.
AI and ML are eating their lunch.
We're already at a place where compute acceleration outsells traditional x86 CPUs in the datacenter.
And big contracts love bundles: Nvidia has a CPU designed to complement its GPUs, and more and more software platforms run on it natively every year.
AMD has a mature ML platform and also its own line of highly competitive x86 CPUs.
That leaves Intel with a huge platform that is significantly lacking compared to its competitors.
All this while Amazon, Meta, Alphabet, Tencent, and many others are starting to build their own custom architectures for their own environments.
Intel gets GPUs working or they die; that's that.
It's why they are looking to spin out the fabs now; it's not unreasonable to expect the fabs to be worth more than the rest of the company within the next 10 years. So Intel has to get that built and spun up taking orders from third parties, because if they wait until they feel the rest of Intel is done, it will be too late to restructure and spin that up.
 
You do realize that these ARC cards are now being sold at a huge loss due to picking up nearly zero traction in the market. If you think Intel's focus is to provide cheap GPU solutions going forward, you would be mistaken.
Intel's loss (leader) is our gain, for however long it lasts.
 
This may be Intel's hill to die on.
AI and ML are eating their lunch.
We're already at a place where compute acceleration outsells traditional x86 CPUs in the datacenter.
And big contracts love bundles: Nvidia has a CPU designed to complement its GPUs, and more and more software platforms run on it natively every year.
AMD has a mature ML platform and also its own line of highly competitive x86 CPUs.
That leaves Intel with a huge platform that is significantly lacking compared to its competitors.
All this while Amazon, Meta, Alphabet, Tencent, and many others are starting to build their own custom architectures for their own environments.
Intel gets GPUs working or they die; that's that.
It's why they are looking to spin out the fabs now; it's not unreasonable to expect the fabs to be worth more than the rest of the company within the next 10 years. So Intel has to get that built and spun up taking orders from third parties, because if they wait until they feel the rest of Intel is done, it will be too late to restructure and spin that up.
Intel has been resting on their laurels for far too long. It's good that they've finally started pulling their heads out of their nether regions, but it took things like this to wake them up. Apple dropping Intel was definitely another big wake-up call, as were AMD ramping up their chips and becoming the primary supplier of console APUs, and NVIDIA absolutely dominating the AI and GPU space. Intel needs to get it together. Competition is good for everyone, and Intel didn't have any for quite a while there. Now everyone is competing on so many levels for so many reasons. I don't want anyone to lose this race because it's bad for all of us. I want everyone to stay in the race so we can all enjoy bleeding-edge technology in both the consumer and enterprise spaces.
 
I'd look at the FS/T forum rather than this, regardless of a "2-year warranty" out of China...

"Friendly Note:

Dear buyer,
The graphics card is a reassembled graphics card after the mining chip is removed.
All graphics cards will be tested several times before shipment,
Make sure the graphics cards is in good condition,
Graphics cards 2 year warranty !

If you do mind that it is not brand new, please buy with caution!"
 
I'd look at the FS/T forum rather than this, regardless of a "2-year warranty" out of China...

"Friendly Note:

Dear buyer,
The graphics card is a reassembled graphics card after the mining chip is removed.
All graphics cards will be tested several times before shipment,
Make sure the graphics cards is in good condition,
Graphics cards 2 year warranty !

If you do mind that it is not brand new, please buy with caution!"
That doesn't sound shady at all.
 
I'd look at the FS/T forum rather than this, regardless of a "2-year warranty" out of China...

"Friendly Note:

Dear buyer,
The graphics card is a reassembled graphics card after the mining chip is removed.
All graphics cards will be tested several times before shipment,
Make sure the graphics cards is in good condition,
Graphics cards 2 year warranty !

If you do mind that it is not brand new, please buy with caution!"
I think at this point every RX 580 was once a used mining card.
 
Do remember that Intel's desktop GPUs and data center GPUs are not in the same business unit (BU) any more. Take that for what you will.
They aren’t the same, but the fundamentals are similar; the consumer cards are just significantly scaled down and much simpler. Honestly it’s smart, and I’m overall impressed with their current offering. From what I understand it’s not doing badly in its intended role, falling in nicely behind the Nvidia offering and far ahead of the AMD ones, though its applicable scope is too small. The packaging tech is the real standout for me: cramming 63 tiles onto one interposer is no minor engineering feat.

I want Intel to get their GPUs to a good place, I want it to be a three-way race, but more than that I want their fabs competitive at the top end. Intel, AMD, and Nvidia are all using TSMC for their GPU compute tiles, and that is one of the reasons GPU pricing stinks.
TSMC needs competition and until that happens pricing is only getting worse.
 
Yes? The A380 is the perfect card for something like a HTPC now that the drivers are better and you can get them for $120. A RX 580 would arguably be better for gaming, but the A380 will be better at everything else.
Once they put AV1 encode on their CPUs' graphics cores, you won't need a dedicated GPU for anything HTPC related. And I think that is scheduled for 15th-gen desktop.

*and if you don't care about AV1 encode, their current CPU graphics are pretty damn great for HTPC.
 
I think at this point every RX 580 was once a used mining card.
They were the most popular card for doing that for a long time.
I’d personally take a 7600 over a 580, but I’m fortunate enough that I’m not burdened with that decision.
 
Discrete is only a small share of the market, right?

The majority is integrated.

From what I read, Battlemage is integrated into the Lunar Lake chips coming this year.

I was hopeful that discrete Battlemage would come before that, but Intel themselves are publicly targeting CES 2025 🤔 That is strange. (Maybe to get the drivers ready?)
 
Discrete is only a small share of the market, right?

The majority is integrated.

From what I read, Battlemage is integrated into the Lunar Lake chips coming this year.

I was hopeful that discrete Battlemage would come before that, but Intel themselves are publicly targeting CES 2025 🤔 That is strange.

Intel Battlemage GPU Architecture More than 2x Faster than Alchemist, Per Leaked Benchmark​

Lunar Lake features a Battlemage iGPU with 64 EUs or 512 cores clocked at a boost clock of 2GHz. It nets 1,927.6 Mpix/s in the SiSoft benchmark.​


https://www.hardwaretimes.com/intel...x-faster-than-alchemist-per-leaked-benchmark/
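
For what it's worth, the quoted "64 EUs or 512 cores" figure is self-consistent if each Xe2 EU is 8 FP32 lanes wide (the usual Xe layout). A quick back-of-envelope sketch; the lane width and the FMA-per-clock rate are my assumptions, not anything from the leak:

```python
# Sanity-check the leaked Lunar Lake iGPU numbers quoted above.
# Assumption: each Xe2 EU (vector engine) has 8 FP32 lanes, so "cores" = EUs * 8.
eus = 64
lanes_per_eu = 8
shaders = eus * lanes_per_eu
print(shaders)  # 512, matching the "512 cores" in the quote

# Rough peak FP32 throughput at the quoted 2 GHz boost,
# assuming one FMA (2 ops) per lane per clock.
boost_ghz = 2.0
peak_tflops = shaders * 2 * boost_ghz / 1000
print(round(peak_tflops, 2))  # ~2.05 TFLOPS
```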


Here's another thing: Intel's next mobile chip will be "Lunar Lake", and it's already shipping that chip to its PC partners.

https://www.pcworld.com/article/2195582/intels-2024-isnt-over-arrow-lake-lunar-lake-are-next-up.html
 

Intel Battlemage GPU Architecture More than 2x Faster than Alchemist, Per Leaked Benchmark​

Lunar Lake features a Battlemage iGPU with 64 EUs or 512 cores clocked at a boost clock of 2GHz. It nets 1,927.6 Mpix/s in the SiSoft benchmark.​


https://www.hardwaretimes.com/intel...x-faster-than-alchemist-per-leaked-benchmark/


Here's another thing: Intel's next mobile chip will be "Lunar Lake", and it's already shipping that chip to its PC partners.

https://www.pcworld.com/article/2195582/intels-2024-isnt-over-arrow-lake-lunar-lake-are-next-up.html
Intel gets OEMs sample chips very early, so they can do their own validations, designs, and other tasks they feel necessary.
It’s one of the nicer aspects of working with Intel, they don’t often surprise their partners.
 
They aren’t the same, but the fundamentals are similar; the consumer cards are just significantly scaled down and much simpler. Honestly it’s smart, and I’m overall impressed with their current offering. From what I understand it’s not doing badly in its intended role, falling in nicely behind the Nvidia offering and far ahead of the AMD ones, though its applicable scope is too small. The packaging tech is the real standout for me: cramming 63 tiles onto one interposer is no minor engineering feat.

I want Intel to get their GPUs to a good place, I want it to be a three-way race, but more than that I want their fabs competitive at the top end. Intel, AMD, and Nvidia are all using TSMC for their GPU compute tiles, and that is one of the reasons GPU pricing stinks.
TSMC needs competition and until that happens pricing is only getting worse.
Once Intel realized that it cannot compete in desktop GPUs, it spun the R&D costs off to the data center side. I do not think there is any chance of Intel ever competing in the desktop video card market from an actual BU cost perspective.

Now that Intel has realized that desktop is a no-go, the move is to put all that R&D into the "IGP/APU." That is where the money is: AMD and Intel push NVIDIA out of the "APU" market and absorb the HUGE mobile market across the board.

That line of thinking was an "NV killer" mobile ecosystem after crypto all went POS, but all of a sudden AI is the thing, which muddies the waters tremendously. The plan for AMD and Intel remains the same for consumer devices.

AMD and Intel have been edging NVIDIA out of mobile SKUs for years, and AI is the only thing saving NV at the moment.

The bottom line is: how long does NVIDIA keep taking care of gamers at the cost of its fiduciary duty to its stockholders?
 
Once Intel realized that it cannot compete in desktop GPUs, it spun the R&D costs off to the data center side. I do not think there is any chance of Intel ever competing in the desktop video card market from an actual BU cost perspective.

Now that Intel has realized that desktop is a no-go, the move is to put all that R&D into the "IGP/APU." That is where the money is: AMD and Intel push NVIDIA out of the "APU" market and absorb the HUGE mobile market across the board.

That line of thinking was an "NV killer" mobile ecosystem after crypto all went POS, but all of a sudden AI is the thing, which muddies the waters tremendously. The plan for AMD and Intel remains the same for consumer devices.

AMD and Intel have been edging NVIDIA out of mobile SKUs for years, and AI is the only thing saving NV at the moment.

The bottom line is: how long does NVIDIA keep taking care of gamers at the cost of its fiduciary duty to its stockholders?
I think that duty to stockholders will force AMD and Nvidia to shift how they build consumer GPUs. I honestly expect performance to grow only some 10% year over year, with die sizes shrinking dramatically instead, so they can maintain margins large enough that investors don't complain.
I mean, Blackwell is rumored to be significantly faster at ray tracing and slightly faster at rasterization, but I certainly don't expect the 5080 to be significantly faster than the 4080, let alone the 4080 Super. Should the Omnissiah bless us, we might get a 10% raster and a 20% ray tracing bump, but instead of that GPU clocking in at ~400 mm², we'll probably see it closer to 250, with an MSRP ~$50 higher to account for inflation.

We can all pray that AMD then undercuts them significantly, but we know they won't. AMD is in the same boat: their investors want them to focus on consoles and AI/data center products just as much as Nvidia's do, so they will likely play the same game.

Sure, there will be a 5090 spooge edition made from the cast-off silicon not good enough for the data center, but that will be $$$$$ and uncontested across the whole range, AMD and Intel included.
 
We can all pray that AMD then undercuts them significantly
AMD can sell cheaper if they come late to the same node, i.e. if they are a year behind Nvidia on the same node, they don't have to pay TSMC's premium for first use the way Apple/Nvidia do.
 
Once they put AV1 encode on their CPUs' graphics cores, you won't need a dedicated GPU for anything HTPC related. And I think that is scheduled for 15th-gen desktop.

*and if you don't care about AV1 encode, their current CPU graphics are pretty damn great for HTPC.
Until AV2 or whatever codec is released and you need to buy an Intel bARK dedicated GPU for it. The cycle repeats.
 
Discrete is only a small share of the market, right?

The majority is integrated.
Depends on whether we count the market by units or by dollars ;)

According to this:
https://www.techpowerup.com/313019/...last-quarter-and-decreased-by-27-year-to-year

The GPU's overall attach rate (which includes integrated and discrete GPUs, desktops, notebooks, and workstations) to PCs for the quarter was 115%,

Outside AM4, CPUs with no iGPU are quite rare I think (some F versions from Intel), so maybe 20% or so of PCs are sold with a discrete GPU.

But you also have standalone GPU sales, and like mentioned, units vs dollars: a single discrete GPU could be worth a lot of integrated solutions.
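
To make the arithmetic behind that attach-rate figure explicit: under the simplifying assumption that essentially every PC ships with an iGPU, a 115% attach rate implies roughly 15% of PCs also carry a discrete card (the ~20% guess above additionally accounts for iGPU-less CPUs). A toy sketch of that reading:

```python
# Back-of-envelope reading of the 115% GPU attach rate quoted above.
# Assumption: virtually every PC has an integrated GPU, so any attach
# beyond 100% comes from discrete cards riding along in the same systems.
attach_rate = 1.15
igpu_share = 1.00  # assumed: ~all PCs ship with an iGPU
discrete_share = attach_rate - igpu_share
print(f"~{discrete_share:.0%} of PCs also have a discrete GPU")
```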
 
I think at this point every RX 580 was once a used mining card.
Best Buy is selling a new one, but it's more expensive than the A380 at this point.

https://www.bestbuy.com/site/xfx-am...0-graphics-card-black/6136515.p?skuId=6136515
Discrete is only a small share of the market, right?

The majority is integrated.

From what I read, Battlemage is integrated into the Lunar Lake chips coming this year.

I was hopeful that discrete Battlemage would come before that, but Intel themselves are publicly targeting CES 2025 🤔 That is strange. (Maybe to get the drivers ready?)
In the hobby market, people are not generally buying CPUs for the GPU included on them, so I think it's disingenuous to say that integrated GPUs are the majority of the market.
 
The Arc cards no longer completely suck, as the drivers have matured a bit, but the next gen of Intel GPUs will be appreciated. Assuming Intel keeps up with the driver improvements, they should be able to carve out a small chunk of the market. Remember, they are playing the long game now, and driver improvements have been steady.

This video is probably the best explanation of how far the driver maturity has come:
https://www.youtube.com/watch?v=aXU9wee0tec
 
The Arc cards no longer completely suck, as the drivers have matured a bit, but the next gen of Intel GPUs will be appreciated. Assuming Intel keeps up with the driver improvements, they should be able to carve out a small chunk of the market. Remember, they are playing the long game now, and driver improvements have been steady.

This video is probably the best explanation of how far the driver maturity has come:
https://www.youtube.com/watch?v=aXU9wee0tec

It's no longer about drivers but pricing for me. Intel's A380, A580, and A750 are just not worth getting. The A380 is what you buy when you just need a display output and AV1 encoding. The A580 and A750 are priced close to an RX 6600 and are not as fast. The A770 would be the GPU to own, except that at $300 it isn't going to be a better buy than an RTX 3060 12GB or RX 6700 XT 12GB. Intel needs to start lowering the price of the A770 and release newer, faster GPUs.
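
The comparison being made here is basically performance per dollar. A minimal sketch of that calculation follows; the prices and relative-performance indices are illustrative placeholders (not measured results), so substitute figures from a benchmark roundup you trust:

```python
# Toy value comparison of the kind the post above makes.
# All prices and relative-performance numbers are hypothetical placeholders.
cards = {
    "Arc A770 16GB":   {"price": 300, "rel_perf": 100},  # baseline (hypothetical)
    "RTX 3060 12GB":   {"price": 290, "rel_perf": 105},  # hypothetical
    "RX 6700 XT 12GB": {"price": 320, "rel_perf": 125},  # hypothetical
}

def perf_per_dollar(card):
    return card["rel_perf"] / card["price"]

# Rank the cards from best to worst value.
for name, card in sorted(cards.items(), key=lambda kv: perf_per_dollar(kv[1]), reverse=True):
    print(f"{name}: {perf_per_dollar(card):.3f} relative perf per dollar")
```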
 