Intel Arc B580: Can Battlemage Deliver What Gamers Need?

They say Dec 3rd is the reveal, with retail sales on Dec 12th. I may grab one, as it seems they have done a lot of work on PCIe x8 and power management with the single 8-pin model, and the thing still has 12GB of memory. We have never really seen that from Intel before: not having to brute-force the hardware specs to make up for the 2022 drivers.

Edit: I guess there is also a B570 with 18 Xe cores and 10GB of memory and a single 8-pin power connection.

Also, forgive me if I write 10Gb when I mean 10GB; I started on a GeForce4 Ti 4200 with 64MB of video memory.
 
I just want someone to eat some of nVidia's lunch, because fuck 'em.

I mean, it's probably not going to be Intel (or AMD at this point), but a man can dream.
 
I'm starting to be impressed with the ray tracing on my A770. Why can't they make games that run this well anymore, like Resident Evil 2? The game was free with my new Radeon RX 580 8GB back in 2019, and being an AMD fanboy, I didn't have the hardware to run ray tracing back then.

No upscaling needed with RT on, which should give you guys some hope for the drivers on these new cards coming out next week.


View: https://youtu.be/7BCi2RfnzfE?si=AAlitkjWAVPEpnGi
 
I hope they can just compete with AMD and Nvidia and force some competition in the $200-300 segment. Sad that the B770 is delayed or canceled outright.
 
I hope they can just compete with AMD and Nvidia and force some competition in the $200-300 segment. Sad that the B770 is delayed or canceled outright.
Everyone keeps saying the B570 / B580 are all that's coming for this generation.
 
Hopefully they're just launching the budget card first. That's what I'd do. The budget card is their best opening since NV and AMD have been neglecting the segment.

What I find really entertaining is that Intel has beaten AMD to getting AI upscaling & frame gen out the door, and arguably had better ray tracing last time around. Also, budget builds are backwards now, with an AMD CPU and an Intel GPU.
 
Seems like it'll be a solid budget option and nothing more. Hopefully people buy it over the holidays. I'd like to see them continue.
 
Sounds like a B770 would have been something to look into, but we might or might not get it.
 
Sounds like a B770 would have been something to look into, but we might or might not get it.
It's a real shame; it likely would've been a 4070 Super-ish card. My guess is they think they can't compete in that space.
 
I heard they may have a low-profile version of the B580, too. Enterprise only, but that could be a neat card. If they squeeze better-than-4060 performance with 12GB VRAM into that small of a device, that's a win in my opinion. I don't see it with the wattage specs I've seen, but maybe there's more to come.
 
It's a real shame; it likely would've been a 4070 Super-ish card. My guess is they think they can't compete in that space.
Would have been interesting.

If the 272mm² die size of the B580 is true:
https://www.techpowerup.com/gpu-specs/arc-b580.c4244

And if "only 10% faster than a 4060" is accurate (a ~70% bigger die, which, fully working, we can imagine costs roughly double overall, with 50% more memory bandwidth for just 10% more performance), getting to 4070 Super level on that architecture could have taken a 450-500mm² die with a 384-bit memory bus...

And after all that, if it works, you get matched by a 5060 Ti... which costs significantly less than half as much to make and will have "DLSS 4"... they are so far behind that it is rough to make it work, especially if you are not ready to lose hundreds of millions on it right now.
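
Quick back-of-the-envelope in Python of that scaling guess (the 4060 die size of ~159mm² and "4070 Super ≈ 1.5x a 4060" are my own assumptions, not hard numbers):

Code:
# Back-of-envelope for the die-size scaling argument above.
# Assumptions (mine, not hard numbers): AD107/4060 die ~159 mm2,
# 4070 Super tier ~1.5x the performance of a 4060.
b580_die_mm2 = 272        # TechPowerUp figure linked above
rtx4060_die_mm2 = 159     # assumption
b580_vs_4060 = 1.10       # "only 10% faster than a 4060"
target_vs_4060 = 1.50     # assumed 4070 Super tier

print(f"Die ratio vs 4060: {b580_die_mm2 / rtx4060_die_mm2:.2f}x")  # ~1.71x
naive = b580_die_mm2 * target_vs_4060 / b580_vs_4060
print(f"Naive linear die estimate: {naive:.0f} mm2")  # ~371 mm2
# Performance scales sub-linearly with die area, so the real number
# would land well above the naive ~371 mm2 -- hence the 450-500 mm2 guess.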
 
Would have been interesting.

If the 272mm² die size of the B580 is true:
https://www.techpowerup.com/gpu-specs/arc-b580.c4244

And if "only 10% faster than a 4060" is accurate (a ~70% bigger die, which, fully working, we can imagine costs roughly double overall, with 50% more memory bandwidth for just 10% more performance), getting to 4070 Super level on that architecture could have taken a 450-500mm² die with a 384-bit memory bus...

And after all that, if it works, you get matched by a 5060 Ti... which costs significantly less than half as much to make and will have "DLSS 4"... they are so far behind that it is rough to make it work, especially if you are not ready to lose hundreds of millions on it right now.
They're using different processes though, 5nm vs. 4 for NVidia. It's very similar to a 4060 in terms of transistor count -- 19.6 billion for the B580 and 18.9 for the 4060.
 
They're using different processes though, 5nm vs. 4 for NVidia. It's very similar to a 4060 in terms of transistor count -- 19.6 billion for the B580 and 18.9 for the 4060.
Especially since their density is so much lower than the 7800 XT, which uses quite a bit of TSMC 6... maybe they use a lot of cache or something that does not scale well...
 
They're using different processes though, 5nm vs. 4 for NVidia. It's very similar to a 4060 in terms of transistor count -- 19.6 billion for the B580 and 18.9 for the 4060.

Especially since their density is so much lower than the 7800 XT, which uses quite a bit of TSMC 6... maybe they use a lot of cache or something that does not scale well...

Addressed here:

View: https://youtu.be/XYZyai-xjNM?t=1419

A lot of cache and dummy transistors, which they do not count, mixed with probably a different way of counting transistors; they do not think they are really that much worse than AMD-Nvidia at putting transistors on a die here. This shows in part how much of this turns into alchemy, transistor count not being something people really know that precisely.
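
For what it's worth, here are the densities being compared, in Python. The B580 and 4060 numbers are from the posts above; the 7800 XT figures (~28.1B transistors on ~346mm² across the GCD and MCDs) are my addition from public spec listings, so treat them as an assumption:

Code:
# Transistor density in millions of transistors per mm2.
chips = {
    "Arc B580 (TSMC N5)":  (19.6e9, 272),  # from the posts above
    "RTX 4060 (TSMC 4N)":  (18.9e9, 159),  # from the posts above
    "RX 7800 XT (N5+N6)":  (28.1e9, 346),  # my assumption from spec listings
}
for name, (transistors, area_mm2) in chips.items():
    print(f"{name}: {transistors / area_mm2 / 1e6:.0f} M/mm2")
# Roughly 72 vs 119 vs 81 M/mm2 -- that is the density gap the cache,
# dummy transistors, and counting-method differences are meant to explain.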
 
All either very popular or new titles, obviously, but still, that seems really nice:

https://www.techpowerup.com/review/intel-arc-b580/35.html

Not a single title with a strangely low 1% low versus the average.

I am curious about what the 7600 XT does better here? Still more games with cleaner frametimes?
A bit cheaper, more VRAM, better power, better performance, much better RT, better XeSS support.
 
DF's review:
View: https://www.youtube.com/watch?v=yKMigkGU8vI

Card looks legit for $250. I like how they used the 580 nomenclature, when the last great ~$250 card was the RX 580 back in 2018-2020. If my son's RTX 2080 dies anytime soon, I would prolly be replacing it with this, unless it forces AMD and Nvidia to drop their prices to compete at this level. Which hopefully it does.
 
BANANAS... That Intel is now on par with its competitors at much better pricing. Yeah, it trades blows with the 7600 and 4060, but at its pricing and RT performance it's a knockout. I would not have thought they'd come out swinging this hard. Can't wait for the 770/780 review! Assuming they have a part to compete at that segment.
 
BANANAS... That Intel is now on par with its competitors at much better pricing. Yeah, it trades blows with the 7600 and 4060, but at its pricing and RT performance it's a knockout. I would not have thought they'd come out swinging this hard. Can't wait for the 770/780 review! Assuming they have a part to compete at that segment.
Yeah, I have to say I'm impressed. I'm going to recommend this to a buddy who is stuck on a 1650 Super. But I thought it was just going to be the 580/570 this go around. I don't think there'll be a 770 successor (of this gen), but I could be wrong on that. I'm sure someone will chime in and let us know for sure.
 
That Intel is now on par with its competitors
Depends on what we mean by "on par"; it seems to use 50% more power and 70% more memory bandwidth on a 70% bigger die to beat the 2023 (on a 2022 platform) card by 7-10%. And they need that GPU to be 100% enabled to do it, which at that size can start to mean lower yields.

If they could keep up that 50%+ gen-on-gen improvement pace (drivers and hardware both got massively better in 2 years), within the next two launches they could join AMD-Nvidia, but like for them, it will start to get harder and harder to improve that fast.
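
Here are those ratios recomputed in Python. The 4060 side (~159mm², 272GB/s) and the ~170W vs ~115W gaming power draw are my assumptions from public reviews, not hard numbers:

Code:
# Recomputing the "50% more power, 70% more bandwidth, 70% bigger die" claim.
b580    = {"die_mm2": 272, "bandwidth_gbs": 456, "gaming_power_w": 170}
rtx4060 = {"die_mm2": 159, "bandwidth_gbs": 272, "gaming_power_w": 115}  # assumptions
for key in b580:
    ratio = b580[key] / rtx4060[key]
    print(f"{key}: {ratio:.2f}x ({(ratio - 1) * 100:.0f}% more)")
# die_mm2: ~1.71x, bandwidth_gbs: ~1.68x, gaming_power_w: ~1.48x,
# which is where the rough 70/70/50% figures come from.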
 
Depends on what we mean by "on par"; it seems to use 50% more power and 70% more memory bandwidth on a 70% bigger die to beat the 2023 (on a 2022 platform) card by 7-10%. And they need that GPU to be 100% enabled to do it, which at that size can start to mean lower yields.

If they could keep up that 50%+ gen-on-gen improvement pace (drivers and hardware both got massively better in 2 years), within the next two launches they could join AMD-Nvidia, but like for them, it will start to get harder and harder to improve that fast.
Fair enough.
 
I think this may have been a more interesting value for gaming if it had matched or slightly beaten the 2080 Ti at ray tracing.
 
Depends on what we mean by "on par"; it seems to use 50% more power and 70% more memory bandwidth on a 70% bigger die to beat the 2023 (on a 2022 platform) card by 7-10%. And they need that GPU to be 100% enabled to do it, which at that size can start to mean lower yields.

If they could keep up that 50%+ gen-on-gen improvement pace (drivers and hardware both got massively better in 2 years), within the next two launches they could join AMD-Nvidia, but like for them, it will start to get harder and harder to improve that fast.
I have to agree. I can't imagine Intel is making money on these at $250, sorry to say. A 70% bigger die than a 4060 on the same process? Yikes.

I still want one to play with on the bench.
 
I have to agree. I can't imagine Intel is making money on these at $250, sorry to say. A 70% bigger die than a 4060 on the same process? Yikes.
Both are called TSMC 5 family, but I think Nvidia is on TSMC 4N, a custom node just for them that is a good amount better.

But the AMD competition on 2022 cores and an older node (TSMC 6), like the 7600 XT, is still on a significantly smaller die (75% of the size) with much lower bandwidth (62.5% of the B580's) for the same tier of performance. Intel does have more RT/XeSS capability that could explain it in part, but they are probably still quite behind 2022 AMD, drivers- and hardware-wise.

During their press tour, Intel's spokesman was really clear that they are realists here: they still have a long way to go to match AMD-Nvidia, will have to eat very low margins to develop the product, and probably need to stay on that lower end.

They did improve really fast, so if they keep at it, who knows; but if Nvidia transitions back to yearly new generations like before, with the giant R&D budget to do it, keeping up with them could be just impossible. With AMD busy with AI too, maybe.
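
Checking my 7600 XT ratios in Python. The 7600 XT side (Navi 33 at ~204mm² on TSMC N6, 128-bit at 18Gbps = 288GB/s) is my assumption from public spec listings:

Code:
# Ratio check for the 7600 XT comparison above.
b580_die, b580_bw = 272, 456            # mm2, GB/s (from earlier posts)
rx7600xt_die, rx7600xt_bw = 204, 288    # my assumption from spec listings
print(f"Die size:  {rx7600xt_die / b580_die:.0%} of the B580")  # 75%
print(f"Bandwidth: {rx7600xt_bw / b580_bw:.0%} of the B580")    # ~63%, close to the 62.5% quoted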
 
I have to agree. I can't imagine Intel is making money on these at $250, sorry to say. A 70% bigger die than a 4060 on the same process? Yikes.

I still want one to play with on the bench.
Intel's GPU projects are indeed not yet profitable.
 
I think from a design and engineering perspective a lot of you are right that these cannot be making Intel a lot of money. I agree wholeheartedly. However, from a consumer's perspective they perform, and they put pressure on Nvidia and AMD. To me, as a consumer, that means the most.
 
But from a consumer perspective, it could mean that the second they ever matter (i.e., sell well at volume), the price will go up immediately, and if that's the case, I am not sure how real the pressure will be. We will see with the 8600/5060 price tags, maybe.
 
But from a consumer perspective, it could mean that the second they ever matter (i.e., sell well at volume), the price will go up immediately, and if that's the case, I am not sure how real the pressure will be. We will see with the 8600/5060 price tags, maybe.
It can never be price parity as long as they trail in market share. Most of the uninformed will automatically buy Nvidia, so Intel will need something to sway them.
 
They have no choice but to eat margin (maybe even losses) until they get better than them, I imagine, not just match them, as they will have a reputation deficit, a bit like AMD vs. Nvidia right now.
 
Let's see how it stacks up against the 5060 and 8600 parts. They had better be more than 8GB or they are toast. More and more games are pushing that 8GB barrier at 1080p and 1440p. But bravo Intel, please keep it up.
 
I'm also impressed by how relatively affordable these products are, and I'm glad they have a hefty amount of VRAM for the performance class. LukeTbk, good that you presented some numerical figures painting the picture of how far or close Intel actually is with their design to the other two companies. I too grasped that they invested notable die space in ray tracing and XeSS. Design-wise, including their proprietary upscaling solution (exclusive in the sense that for the best result you need Arc), they seem to have taken an approach more similar to Nvidia's than AMD's. The latter has for many years, if not always, including the ATI period, designed chips to be cost-effective, which is an understandable strategy risk-wise. You may think that a more dedicated design for accelerating ray tracing is a must, but I'm curious to see how AMD will improve their acceleration in the RX 8000 line-up; it's all math after all. I won't speculate on the amount of improvement, but certainly something is coming: this rumor says 45% better than the RX 7900 XTX in a lightly, unremarkably ray-traced game.
They have no choice but to eat margin (maybe even losses) until they get better than them, I imagine, not just match them, as they will have a reputation deficit, a bit like AMD vs. Nvidia right now.
Business reputation is the end result of concrete achievements. We may not value each and every achievement the same on a personal level, but reputation paints the overall picture fairly well. For those who do not tinker with computers, Intel is not a choice, and I personally would not recommend even Radeon cards to such users; this group is the majority of buyers, which creates the big cash flow. The amount an individual plays games or otherwise uses his GPU does not increase the cash flow, only purchases do, and for the majority of purposes, Nvidia is the safe bet.
I am interested in how Intel's drivers develop, and I am a potential buyer in case they offer something notably better than the card I replace when I'm upgrading my GPU. Even if Intel has good, even strong, brand popularity, the majority of buyers recognize that they are new to the dGPU space, but they certainly are building fame fast, given the media coverage and all. As for the software part: at the end of the day it can save a half-performing product as far as customer satisfaction goes, while destroying an otherwise performant one. :O I personally trust they know how to approach the software development, but it's a race to achieve goals in time versus how the investors see the prevailing financial situation; luck is also a factor here. Battlemage is already a success in the sense that products seem to be moving to customers, which funds further development and keeps bug reports coming in.

EDIT: small clarification.
 
Both are called TSMC 5 family, but I think Nvidia is on TSMC 4N, a custom node just for them that is a good amount better.

But the AMD competition on 2022 cores and an older node (TSMC 6), like the 7600 XT, is still on a significantly smaller die (75% of the size) with much lower bandwidth (62.5% of the B580's) for the same tier of performance. Intel does have more RT/XeSS capability that could explain it in part, but they are probably still quite behind 2022 AMD, drivers- and hardware-wise.

During their press tour, Intel's spokesman was really clear that they are realists here: they still have a long way to go to match AMD-Nvidia, will have to eat very low margins to develop the product, and probably need to stay on that lower end.

They did improve really fast, so if they keep at it, who knows; but if Nvidia transitions back to yearly new generations like before, with the giant R&D budget to do it, keeping up with them could be just impossible. With AMD busy with AI too, maybe.
I for one am happy to have Intel in the space and they are improving.

I just hope that Intel's current situation doesn't kill off Arc altogether.
 
If you can't earn market share, then the next best way is to buy it with candy for all the big grown-up kids, if, as you say, the GPU doesn't earn a profit!
 