Intel Confirms Debut of Next-Gen Arc “Battlemage” GPUs, Official Showcase Set For December 3

B570 specs here: https://videocardz.com/newz/intel-arc-b570-specs-leaked-18-xe2-cores-10gb-memory-and-pcie-4-0x8

"While the announcement hasn't revealed the specifics of the event, it is highly likely that Intel plans to unveil its next-gen Arc "Battlemage" B580 and B570 SKUs, successor to the respective Arc series GPUs. In terms of what we know up until now, Intel's Arc Battlemage B580 is set to feature 160 CUs, or around 20 Xe Cores, along with 12 GB of GDDR6 VRAM on a 192-bit memory bus, targeting it towards the "mid-tier" segment of the markets. The GPU is said to feature a 2.8 GHz max boost clock, with 2x 8-pin power connectors and an estimated price of around $250-$260.

Intel's Arc Battlemage B580

  • 12GB 192-bit GDDR6
  • Battlemage BMG-G21 GPU (20 Xe Cores)
  • Intel Xe2-HPG Architecture
  • 2850 MHz Max Boost Clocks
  • Intel Xe Super Sampling (Intel XeSS)
  • Intel Xe Matrix Extensions (Intel XMX)"
Source: https://wccftech.com/intel-confirms...ge-gpus-official-showcase-set-for-december-3/
 
Yeah, but Intel is selling them at a loss, at best breaking even.

Not bad for an attempt to grab some of the market. I wouldn't touch Intel due to driver issues, though they seem to be making improvements. I would take AMD before Intel. But for those on a tight budget, this may be a compelling option. It may be the first sub-$300 GPU that can actually handle ray tracing somewhat well.
 
ARC was being sold at a significant loss on hardware alone, ignoring R&D and driver development costs. There's no way that Battlemage is going to be much cheaper to make than ARC was, and certainly not by enough to cover the spread.

The Intel GPU division was operating at a loss of nearly $4B.
 
12GB 192-bit GDDR6 priced around $250-$260? See that, Nvidia? Intel gets it.
Hopefully this will put pricing pressure on AMD and in turn on Nvidia.

Is Intel really selling a (non-gimped) 4060 Ti for $270?

AMD has one in the works (Navi 44, 8GB) for approx $300, I am guessing.

Maybe Nvidia will just rebadge:
4060 to 5050
4060 Ti to 5060
4070 to 5060 Ti
🤔
 
I've read these cards will be decent at general-purpose HPC applications. I'd be down to spend some time on one of them if that's true. Hell, a Titan X GPU is old now and still costs 400 bucks.
 
Hopefully this will put pricing pressure on AMD and in turn on Nvidia.

Is Intel really selling a (non-gimped) 4060 Ti for $270?

AMD has one in the works (Navi 44, 8GB) for approx $300, I am guessing.

Maybe Nvidia will just rebadge:
4060 to 5050
4060 Ti to 5060
4070 to 5060 Ti
🤔
It will completely depend on drivers and performance. It could pressure AMD, but does nothing to Nvidia.
It would be like saying “hopefully the new Corolla puts pressure on Porsche”.

And I would seriously hope that neither AMD nor Nvidia has any plans for an 8GB anything with this upcoming gen. Just discount the existing hardware and don't bother rebranding.
 
It will completely depend on drivers and performance. It could pressure AMD, but does nothing to Nvidia.
I work for a relatively large company. We own our space with some dominance. We don't turn profit on the lower end, tighter margin space. We compete there ONLY to keep the competition from gaining market share that could challenge our dominant position on the higher margin sales.

Nvidia has a couple of strategies here, but it would be extremely foolish for them not to take this space seriously. I suspect Jensen would rather give up some profits to ensure long-term sustainability.

Time will tell.
 
I work for a relatively large company. We own our space with some dominance. We don't turn profit on the lower end, tighter margin space. We compete there ONLY to keep the competition from gaining market share that could challenge our dominant position on the higher margin sales.

Nvidia has a couple of strategies here, but it would be extremely foolish for them not to take this space seriously. I suspect Jensen would rather give up some profits to ensure long-term sustainability.

Time will tell.
Right, they will have something, but they could serve it just as well with a price drop on the 4000-series stock.
Intel's best offering should perform around a 4070 Ti; Nvidia just needs to price match.
 
ARC was being sold at a significant loss on hardware alone, ignoring R&D and driver development costs. There's no way that Battlemage is going to be much cheaper to make than ARC was, and certainly not by enough to cover the spread.

The Intel GPU division was operating at a loss of nearly $4B.
Intel can't afford to be throwing away money that they don't have.
Selling a product at a loss is just passing on the value to your customer for no real gain.
Gaining market share on a product they lose money on with every sale does not help them.
They could have invested that money into R&D to make an actually competitive product.
Stupid.
 
R&D is so high on those that you need really good volume to ever amortize it and not make a lost.

Those core are pretty much the one in the Lunar Lake-arrow lake igpu ? If so it could work, depending how reusability their drivers have between the 2 products line. Same goes for game dev interest and support, intel igpu are a really big deal for a certain type of game (all of them but the really hard to run one) as their tend to become the most common gpu.

12GB 192-bit GDDR6 priced around $250-$260? See that, Nvidia? Intel gets it.
If it is just a bit faster than a 3060, with 20 cores... (the A770 was 32 cores on a 256-bit bus, so that is not that big of an upgrade price- or VRAM-wise); the A770 often does not beat the 3060/7600, especially when a game first launches.

[Attached: performance charts at 1920x1080]

Can the 20-core, 192-bit part manage to be significantly faster than the 32-core, 256-bit A770? If they achieved much better than a 60% per-core uplift despite the slower bandwidth... sure. And maybe Intel will be like AMD/Nvidia in the '90s in terms of gen-on-gen upgrade speed in the first generations.
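
To put that per-core requirement in numbers, here is a minimal sketch (it assumes performance scales linearly with core count and ignores the bus-width difference):

```python
# Back-of-the-envelope check of the per-core uplift Battlemage needs
# (core counts from the post: A770 = 32 Xe cores, B580 = 20 Xe cores).

A770_CORES = 32
B580_CORES = 20

def required_per_core_uplift(target_vs_a770: float) -> float:
    """Per-core throughput multiplier the B580 needs to reach
    `target_vs_a770` times the A770's overall performance, assuming
    performance scales linearly with core count (a simplification
    that ignores the narrower 192-bit memory bus)."""
    return target_vs_a770 * A770_CORES / B580_CORES

print(required_per_core_uplift(1.00))  # 1.6 -> +60% per core just to match the A770
print(required_per_core_uplift(1.25))  # 2.0 -> double per-core throughput to beat it by 25%
```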
 
Intel can't afford to be throwing away money that they don't have.
Selling a product at a loss is just passing on the value to your customer for no real gain.
Gaining market share on a product they lose money on with every sale does not help them.
They could have invested that money into R&D to make an actually competitive product.
Stupid.
The plan is for the whole GPU lineup to get rolled into the iGPU space, so they basically have a whole APU lineup for mobile.

But I suspect that this may be the last dGPU generation unless it gets popular. The architecture scales down very well for laptops, so it would carry over for a long while.
 
Yeah, but Intel is selling them at a loss, at best breaking even.
See that, AMD? That's how you get much-needed market share. If this GPU performs well enough then it could justify $300, but Intel is worried about market share at this point. If Intel wants XeSS to be used in games, then it needs more hardware out in the wild.
It will completely depend on drivers and performance. It could pressure AMD, but does nothing to Nvidia.
Nvidia does react to AMD once in a while. Remember when AMD was about to release the 5700 XT and Nvidia reacted with the Super cards and price drops? If Intel creates a better GPU at a better price, then you'd better believe Nvidia will react.
It would be like saying “hopefully the new Corolla puts pressure on Porsche”.
More like Toyota's Supra, which is oddly based on BMW's engine.
And I would seriously hope that neither AMD nor Nvidia has any plans for an 8GB anything with this upcoming gen. Just discount the existing hardware and don't bother rebranding.
Discounting the existing hardware is expected, but if AMD and Nvidia don't make new replacements for the 7600 and 4060 cards, then Intel has a market all to themselves.
 
If this GPU performs well enough then it could justify $300,
It would need to be significantly better than the $300 4060 and its upcoming $300-330 5060 replacement, plus the $320 7600 XT and its $300-330 successor; it is a really big if that this is the case.

If it plays in the 7600 XT/3060/4060 tier of performance, it needs to undercut them on price by a lot for OEMs to care. If you're building and selling pre-built entry-level gaming PCs, how big a rebate do you want in exchange for not having an Nvidia sticker on the box, changing your current in-place deals, and risking a lot of support calls about games not working? And we are a couple of months away from those SKUs being replaced by the 5060/8600 XT.

Discounting the existing hardware is expected, but if AMD and Nvidia don't make new replacements for the 7600 and 4060 cards, then Intel has a market all to themselves.
Only if the RDNA 3, Lovelace, and Ampere stock finally runs dry, and only before the 5060/RDNA 4 launch, which should all happen by March 2025. When can we expect the 3060 to stop being available? That one can drop to $260.

There is no world where Nvidia does not make a replacement for the 4060; there will be a 5060, and the same goes for AMD.
 
High Probability.

Unless we get a miracle worker into the CEO slot, I think the company is done. The thing that scares me is that Intel has thrown away all their "Steve Jobs/Steve Wozniak" type people. They have a hard time innovating, producing, and selling product. There is also a reported culture problem and many fiefdoms. And they can't quality control much, based on recent-gen processors.

Keller? (Bueller?) The story is told by the departure of Jim Keller. It really is. The "mountain of fiefdoms" would not bend to Keller's will. Keller left. And the leftovers have no idea what, or how, to do what they need to do.

Next thing you know they will sell degrading processors with the "degradation" as a feature:

"The new Intel chips have a new convenience feature: They will tell you when to upgrade by failing!"
 
It would need to be significantly better than the $300 4060 and its upcoming $300-330 5060 replacement, plus the $320 7600 XT and its $300-330 successor; it is a really big if that this is the case.
According to Intel's own metrics, it will be slower than a 5060 but about 10% faster than a 4060; depending on the 5060/8600 launch prices, they could have to reduce the price a little.
https://download.intel.com/newsroom/2024/client-computing/Intel-Arc-B580-B570-Media-Deck.pdf

Looks good performance-wise in the current market; it is all in the hands of the drivers not having major issues in any really relevant title at launch.

+24% over an A750, which according to the TPU game suite would put it around a 3060 Ti/6700 XT, using driver data from back when the 4060 Ti 16GB launched.
 
According to Intel's own metrics, it will be slower than a 5060 but about 10% faster than a 4060; depending on the 5060/8600 launch prices, they could have to reduce the price a little.
https://download.intel.com/newsroom/2024/client-computing/Intel-Arc-B580-B570-Media-Deck.pdf

+24% over an A750, which according to the TPU game suite would put it around a 3060 Ti/6700 XT, using driver data from back when the 4060 Ti 16GB launched.
Assuming the 6750 XT / 6800 sell out by this year...

For $250 or less — buy Intel

In 1 or 2 quarters, AMD/Nvidia will have 4060 Ti 8GB performance cards for $300 to $350.

In 2 or 3 quarters, AMD/Nvidia will have 4070 12GB performance cards for $350 to $400.
 
For $250 or less — buy Intel
It opens up potentially PS5-level performance with 12GB of VRAM (which should be the realistic max a game can use as "VRAM" out of the PS5's 16GB budget), better upscaling, and a new card with warranty at a $250 price tag; and we can imagine that by spring 2025, at $220-230 in store, it could be a nice step in the right direction.

And the media-engine/connectivity side of things (AV1, DisplayPort 2.1, eDP 1.5, the new H.266/VVC support) will all be top of the line, backed by Intel QuickSync's long history; it seems to match Lunar Lake, with 8K60 10-bit HDR, VVC, etc. These will be really nice Plex/home-media cards in their second life for a long time.
 
A comparison with the 1060/1660 was not on the bingo card, but it is a very common (maybe the most common?) upgrade path.

There is a bit of a nice trick by Intel here, mixing people up with performance-per-dollar figures. I wonder if some people think the raster numbers are absolute performance (that YouTube video makes little sense with people acting like it is a big deal, for example; a $300 5060 could easily be a bit ahead in fps/$ here, and the 8600 should be as well).
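
A tiny illustration of that fps-per-dollar point, with made-up numbers (neither card's real price or performance is known here; both entries are assumptions purely for the arithmetic):

```python
# Hypothetical prices and frame rates, purely to illustrate how
# fps-per-dollar and absolute fps can point in different directions.

cards = {
    "B580 (assumed $250, 80 fps)":  {"price": 250, "fps": 80},
    "5060 (assumed $300, 100 fps)": {"price": 300, "fps": 100},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price']:.3f} fps per dollar")

# B580: 0.320 fps/$  -> cheaper card, but loses the fps/$ comparison here
# 5060: 0.333 fps/$  -> wins fps/$ despite costing $50 more
```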
 
A comparison with the 1060/1660 was not on the bingo card, but it is a very common (maybe the most common?) upgrade path.
I didn't see anything about those cards in Intel's slide deck. The only competitive product they compared it to was the RTX 4060. Is that from Intel somewhere or just something some Tuber put together?
 
So you're looking at about double the performance of a 1660, which is one of the most common GPUs I run across in budget systems.

I don't want to be too eager, but Intel could have the next RX580, a card that is REALLY good for the 'I want to play any game out at medium settings 1080p and not upgrade for 5 years' crowd.

RX580 cards had some legs.
 
Intel can't afford to be throwing away money that they don't have.
Selling a product at a loss is just passing on the value to your customer for no real gain.
Gaining market share on a product they lose money on with every sale does not help them.
They could have invested that money into R&D to make an actually competitive product.
Stupid.

Selling at a loss to break into a new market is normal, though. If they're offering the same performance per dollar as the established players, there'd be no real reason for anyone to buy their cards. Nvidia is the de facto standard at this point, with AMD holding on to what relevance it has mainly because their GPUs are in the consoles, meaning most of the key optimization work needed to get the best results is effectively free to the PC market.

Unless/until Intel can unfuck its manufacturing and get a better-than-TSMC process to make their cards on, they really don't have any way to offer significantly better performance per dollar of manufacturing cost, which means the only way Intel can make themselves desirable in the short term is to be the cheapest option on the market. That hits them twice, though: once with the lower gross margin on each card sold, and again by needing to put more work into their drivers than either established player while having fewer devices to spread the cost over.

In retrospect they may have done better to have delayed launching discrete cards a few more years while only cooking on drivers at the IGP level, and perhaps trying to push performance to challenge AMD's APUs. But making that call a half-dozen years ago would have required expecting the failure of their own manufacturing division, which would likely have been internal political suicide for any manager who tried to make the argument.
 
I don't want to be too eager, but Intel could have the next RX580, a card that is REALLY good for the 'I want to play any game out at medium settings 1080p and not upgrade for 5 years' crowd.

RX580 cards had some legs.

Coincidence or not that Intel used "580" and "570" in these models' naming? :ROFLMAO:
Is it lazy marketing? (Same as multiple heatsink companies using "assassin" in their product naming... or "620", etc., etc.)
 
In retrospect they may have done better to have delayed launching discrete cards a few more years while only cooking on drivers at the IGP level,
Maybe, but in some ways they were doing that for a decade before the discrete launch. There is something about actual deadlines and products in the field that can force things to happen (and force the focus that is needed) that could be really hard to reproduce without jumping in the water. They had been the largest PC GPU seller in the world by a vast amount since 2010, cooking drivers the whole time.

It would not surprise me if the progression of the first 3 months post-launch was something they did not think they were capable of.

Selling a product at a loss is just passing on the value to your customer for no real gain.
Gaining market share on a product they lose money on with every sale does not help them.
It could depend on what people mean by selling at a loss; there are two different versions of the "selling at a loss" argument:

A - Total revenue going to Intel from the dGPUs, minus the total costs that went into making them (from R&D to customer service, warranty, and driver support after the sale)
B - Revenue going to Intel from the sale of one dGPU, minus the marginal cost of making one more card

If it were B, then sure, but that would be hard to imagine. Even if this is a terrible GPU needing 260mm² for that performance on, say, TSMC 4N: can they get at least 160 working dies from a 12-inch wafer? Can they get a 12-inch wafer for $16k? Those seem like quite conservative assumptions.

Each new marginal die would then be around $100. Can you make everything work at $220-250 with some gross margin (if not, hence all the talk about their price being too high...)? Say $25 for the cooler, $35 for the board and power delivery, $30 of VRAM, $10 average for after-sales issues, and $40 for the reseller and shipping, and you could be barely above water (I am really speculating here; the numbers are run in the sketch below). And it could be just a bad GPU needing 200mm² of die, where they get 235 dies out of a 12-inch wafer that they buy for $15k, for $65 per GPU instead; in that case it is easier to make it work.
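
Running those guesses through (a minimal sketch; every figure below is the speculation from the paragraph above, not a known cost):

```python
# The post's speculative bill of materials, run through; every number
# is a guess from the paragraph above, not a known cost.

WAFER_COST = 16_000  # assumed cost of one 12-inch TSMC wafer, USD
GOOD_DIES = 160      # assumed working dies per wafer

bom = {
    "die":             WAFER_COST / GOOD_DIES,  # ~$100 per marginal die
    "cooler":          25,
    "board + power":   35,
    "vram":            30,
    "after-sales":     10,  # average warranty/support cost per unit
    "reseller + ship": 40,
}

unit_cost = sum(bom.values())
for price in (220, 250):
    margin = (price - unit_cost) / price
    print(f"at ${price}: unit cost ${unit_cost:.0f}, gross margin {margin:+.0%}")

# at $220: unit cost $240, gross margin -9%
# at $250: unit cost $240, gross margin +4%  -> "barely above water"
```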


The big issue: say only 30 million dGPUs a year are sold now and Intel is fighting to get to 6% share, trying to sell, say, 4 million units in total over the 3 years of that Arc GPU generation (I think that was the previous gen's target). If the R&D to make them is "only" $200 million, you add $50 per GPU and it does not work anymore. And you compete with Jensen, who can spend $100 million on a TSMC phone call; Nvidia is getting close to spending $10,000 million a year on R&D. If you do not have a super-high-revenue brand above you subsidizing the desktop dGPUs, and you are not spreading that cost over 25 million units a year (with the laptop line using exactly the same die), it becomes hard to make it work.
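
And the amortization math the same way (again, the $200M R&D figure and both unit volumes are the post's hypotheticals):

```python
# R&D amortized per unit under the post's hypothetical volumes.
RND = 200_000_000  # assumed R&D spend for the generation, USD

for units in (4_000_000, 25_000_000):
    print(f"{units:>10,} units -> ${RND / units:,.0f} of R&D per GPU")

#  4,000,000 units -> $50 of R&D per GPU  (the assumed 3-year Arc target)
# 25,000,000 units -> $8 of R&D per GPU   (Nvidia-scale yearly volume)
```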

Yes, they need to build the Arc brand at a price that does not lose money on marginal units, but until you have some market share, total revenue minus total cost could be really hard to make work.
 
I've read these cards will be decent at general-purpose HPC applications. I'd be down to spend some time on one of them if that's true. Hell, a Titan X GPU is old now and still costs 400 bucks.

That would even be useful if I didn't sit on so much software that is written directly in CUDA...
 
If it were B, then sure, but that would be hard to imagine. Even if this is a terrible GPU needing 260mm² for that performance on, say, TSMC 4N: can they get at least 160 working dies from a 12-inch wafer? Can they get a 12-inch wafer for $16k? Those seem like quite conservative assumptions.
Apparently it is worse than that:
https://www.hardwareluxx.de/index.p...ikkarte-für-das-1440p-gaming-vorgestellt.html

Die size: 272 mm², 192-bit
On TSMC 5nm

If those numbers are true... that is almost a 4070 Super die (291 mm²) while being only +10% faster than a 4060 and selling for only $250; maybe the math does not work, and they are losing money per marginal unit.
 