Intel Arc B580, Can Battlemage Deliver What Gamers Need?

So what do gamers need out of this card? I bought one at Microcenter yesterday; soon it will be time to play with it. I just have to finish setting up the Z890 rig I built last week first.

Reviewers mostly covered how it does with a fast CPU, so I'm thinking I'll play with it in my X299 machine. That platform is PCIe 3.0 only, so I can see how an x8 card fares on PCIe 3.0. From what I've read it should be fine.
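For reference, here's a rough back-of-the-envelope sketch of what the x8 link is working with (my own arithmetic, not measured numbers from any review): an x8 card on PCIe 3.0 gets roughly half the peak link bandwidth it would get on 4.0.

```python
# Back-of-the-envelope PCIe link bandwidth (my own assumed figures, one direction).

GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0}  # transfer rate in GT/s per lane
ENCODING = 128 / 130                      # 128b/130b line-code overhead on PCIe 3.0+

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Peak one-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_LANE[gen] * ENCODING / 8 * lanes  # GT/s -> GB/s per lane, times lanes

for gen in ("3.0", "4.0"):
    print(f"PCIe {gen} x8: {link_bandwidth_gb_s(gen, 8):.2f} GB/s")
# PCIe 3.0 x8: ~7.88 GB/s vs PCIe 4.0 x8: ~15.75 GB/s
```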
 
The big consideration is whether your mobo supports Resizable BAR. Intel's GPUs take a huge performance hit without it.
 
Yes. It didn't ship with support, but I flashed the board to add it back in early 2021, and I flashed my 3090 to add support at the same time.
 
Don't most these days?
Basically all of them, if the board isn't too old and has an updated BIOS. Resizable BAR showed up alongside the RX 6000 series and Zen 3 launch in late 2020, but all it really does is allow all of a card's VRAM to be mapped into an app or game's address space at once, and that capability has been in the PCI-e spec for ages. It just never got implemented back in the day because it took forever for games to stop being 32-bit, even though we had mostly moved to 64-bit operating systems years before. A 32-bit app or game on Windows can only address 2GB of RAM (the other 2GB of address space is reserved for the system), so you can't map all of the VRAM without blocking the app or game from accessing enough system RAM. That makes ReBAR unusable there. With a 64-bit app or game, mapping all the VRAM from multiple 4090s is no problem.

So the practical limitation is whether your mainboard or computer vendor bothered to go back and add it. Lots of 400-series AMD (Zen 2) and 300-series Intel (Coffee Lake) boards got it, along with my X299 board. There are also hacks to enable it on boards that don't have a BIOS update available. I've never played with one of those hacks, but I do have an old X79 box I could try it out on.
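As a rough illustration of that address-space point, here's a quick sketch (the VRAM and address-space sizes are my own assumed figures: the usual 2 GB of user address space for a 32-bit Windows app versus 128 TB for a 64-bit one):

```python
# Rough sketch of why ReBAR only makes sense for 64-bit apps (assumed sizes,
# not measured on any particular system).

GIB = 1024 ** 3
TIB = 1024 ** 4

user_space_32bit = 2 * GIB     # default user address space of a 32-bit Windows process
user_space_64bit = 128 * TIB   # user address space of a 64-bit Windows process
b580_vram = 12 * GIB           # Arc B580
rtx_4090_vram = 24 * GIB

# Mapping all of a B580's VRAM into a 32-bit process leaves nothing for the app itself.
print(b580_vram <= user_space_32bit)          # False

# A 64-bit process can map the VRAM of several 4090s and barely notice.
print(3 * rtx_4090_vram <= user_space_64bit)  # True
```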
 
I finally got around to shuffling cards around and installing the B580 in my X299 rig last night. I haven't played with it too much, but so far so good. It kinda feels like it's one resolution down from my 3090: it does about as well at 1440p as the 3090 does at 4K, except in games that support DLSS but not XeSS or FSR, where I don't get to pick between upscaling and lowering settings. I haven't had to use "low" yet, though.

The only messy game I've run into is Control. It seems to think I don't have an RT-capable GPU, or maybe it's locked on "high" RT? I can't adjust the RT settings; it says "high" and the boxes are greyed out but checked. I haven't stared at it enough to decide whether it's actually doing RT, but I think it's not working. It runs fine on the medium preset at 1440p. That's substantially worse than Cyberpunk 2077, which runs ok on Ultra with RT and scaling off at 1440p, or on the RT medium preset with XeSS balanced. "Ok" meaning around 60fps. Bear in mind that I've only messed with this card a bit and only with a few games, but so far this thing seems legit.
 
Ok, I played with Control a little more and now the RT settings are simply disabled. It looks like Control thinks the B580 isn't RT capable; the menu showing RT as enabled was probably left over from having played with RT on with my 3090 previously. I also realized that the highest preset in Control enables 4x MSAA. If you just turn that down to 2x, it runs ok at 1440p.
 
Some of us are trying to wrap our heads around the fact that Intel removed the capture/record feature from their newest driver (6325).
 
Been playing with my B580 for a couple of days now, and I must say, it's one solid card. There are a few bugs that I'm sure will get ironed out with future drivers, but I'm seeing great performance in all the games I'm testing. I can do Crysis Remastered at 4K on the "Can it run Crysis" settings (no RT) and get 30-40 FPS; set the resolution to 1440p and 60 is easy. Cyberpunk 2077 performance is great as long as you keep the resolution reasonable. It's extremely quiet and performance is rock solid: my Sparkle Titan B580 climbs to 62C and stays there, and the core goes to 2850 MHz and stays there... zero play in clock speeds.

I'm honestly shocked. Intel made a solid GPU, and for $250, it is a bangin value too!
 
It was supposed to be similar with Alchemist, but the drivers, especially at launch, were quite terrible, and the architecture lacked certain functionality, which limited its performance.

Personally I already have a faster GPU (Radeon RX 6900 XT), so Intel has nothing of interest for me yet, but once they start competing at the higher end while keeping up the good value, they might be a viable option.

BTW, not that many people use Alchemist GPUs anyway, but it would be interesting to know whether driver quality is the same between the A- and B-series GPUs.
Intel still needs to prove they provide long-lasting driver support for their GPUs.
What I am saying is: they better not flock it up now. Keep engineers developing and testing drivers for the Alchemist GPUs, because sooner or later someone will go back to the A-series cards and test how well they fare and whether the past issues that don't happen on Battlemage were also resolved. My understanding, however, is that Intel is still working on Alchemist drivers.
 

I got the Sparkle Titan too, mainly because the reviews I read/watched said it ran cooler than the Limited Edition. I've been playing various games with it and noticed the same thing: it hovers at 62C for me as well. I'm quite happy with it, because for me it's a jump in graphical quality over the 2070 Super while at the same time running much cooler.
 
I would say that, price-to-performance wise, the B580 has been a success, and it has sold out everywhere.

I'm looking forward to the next generation of Intel cards; maybe they can make a card to beat Nvidia's intentionally weakened 5080 next time.

I'm willing to try Intel if they offer more great performance-to-value video cards in the future, particularly if they can handle 4K gaming.
 
This post is mostly about the current low-end GPUs:

I don't quite understand the talk about generating mindshare simply by getting the product, an Arc GPU, into customers' PCs. To me, what creates mindshare is a great end-user experience. I don't think a regular gamer who buys a GPU and then runs into various bugs and issues is going to lean toward that brand in the future; they'll rather avoid it, if an alternative is available that is appealing on price and otherwise. I think Arc is selling to people who have been into PCs for a long time and can handle issues, because I don't see why any other kind of buyer would take the risk instead of just buying an RTX 4060, other than being misled by the media or posts on the net. I'm all for the success of Arc, because I'm interested in the tech and am a potential future customer, but I want to avoid giving the wrong impression.
On US Amazon right now the RX 7600 XT is cheaper by a significant margin (ASRock Challenger at $309.99 versus Gunnir at $369.00), while not being that much slower and having even more VRAM. It loses badly in ray tracing, but I don't think there will be many games that require running considerably taxing RT effects, even through the whole new GPU generation. If there are some, then bad luck if you want to play those, but is the B580's ray tracing performance going to save the day in those cases? I'm willing to bet not, unless Intel collaborated in developing the game.

We'll see how well they can supply the B580, since at $250 it is a good buy for the right kind of customer, but depending on how many RX 7600 XTs are still in stock, the latter could drop below the B580's price any day, which would be smart of AMD if they want to grab some more budget customers before Arc restocks. The likely scenario, though, is that new RDNA 3 cards will show up to compete, built on the existing node, while the new midrange value option (RX 9070 XT), along with some lesser models, occupies the newer one; see the chart from this rumour. The new node might be too occupied for a long period while also being more expensive, even if only slightly, whereas the existing node likely remains available to fill the lower price segments, as the rumour practically points to, keeping volume possible. I just wonder whether they are going to include the ray tracing improvements in the new models, since they did that with the PS5 Pro, which uses RDNA 3 with the ray tracing improvements of RDNA 4. RDNA 3.5, as I understand it, is designed solely for APUs, introducing improvements to cache and memory (RAM) accesses.

Even if boosted ray tracing performance in the RX 76xx and 77xx classes would not change much, it would at the very least make those products comparable to the B580 in ray tracing, or close to it. It would be interesting to see how much better such cards would fare in games like Star Wars: Outlaws or Indiana Jones: The Great Circle, where ray tracing is mandatory, but on playable settings naturally, because I find all the ray tracing comparisons with unenjoyable FPS numbers unavailing.
The most important thing is to bring the RX 76xx part on par with or better than the B580 and RTX 5060 in rasterization, and I hope they would name it as part of the RX 7000 series, not fooling customers into thinking it uses RDNA 4.

Cheap GPUs are needed to bring everyone along, especially on the laptop front, where RDNA 4 will most likely be used, which would be a big relief: on the laptop side Nvidia's doings are the most egregious, with high prices combined with low VRAM (8 GB even on the mobile RTX 4070), while RDNA 3 offerings practically do not exist, though they are far behind the laudable Lovelace in efficiency anyway. I would love to see how well a discrete mobile Battlemage performs, but perhaps that is too much to expect.
Yet regarding the rumour, the question arises whether there is any sense in introducing RDNA 3.5 based laptop models, even if the parts draw relatively little power. I understood that RDNA 3.5 doesn't reap much benefit beyond a certain number of CUs and power level, its main point being operation with no VRAM at all, as in APUs. Such models are welcome too nonetheless, if they mean more performant budget laptops, but I doubt the end result would be too desirable. :/
If true, the new RDNA 3 desktop models are good news for budget gamers in any case; they just need to have noticeably more oomph at the price points they likely replace, or simply be cheaper, to be appealing - and please, no 8 GB models. :)

Thanks for reading, happy new year!
 

Attachments

  • Hardware Unboxed - Arc B580, 1440p, avg.png
  • Hardware Unboxed - Arc B580, 1440p RT, avg.png
Latest "maybe" - Intel Says "Stay Tuned" to people asking for a B770.

If they do launch one, I have a theory about why it took so long after the B580 came out. I'm thinking that after Alchemist, Intel wasn't sure they could get any traction in the gaming card market, so maybe they only ordered a limited number of chips from TSMC to go into the B580 and B570, basically as a test. It takes months to get a chip through a fab unless you rush it, and rushing makes the fab inefficient and costs extra; that's not going to work for a modestly priced consumer product. Then once the B580 went over well, perhaps they ordered more and also placed orders for the bigger GPUs that would go into a B770 or something like that. So the B580 comes out, goes over well, Intel places orders, wait several months... could happen.
 