
ARC Battlemage Owners' Experiences and Performance

zandor

Supreme [H]ardness
Joined: Dec 14, 2002 · Messages: 4,779
I figured we ought to have a thread for discussing experiences with the new B580 and, shortly, B570 cards (unless you snagged a B570 at Microcenter already when they oopsed and started selling them early).

I picked up an AsRock Steel Legend ARC B580 at Microcenter on launch day. They let me order it for pickup, which surprised me since they don't usually do that with new releases. I just drove down after work and picked it up. Since then I've been playing with the card more than actually playing games on it. I built a new rig which left me short a card until I get my hands on Blackwell since I plan on keeping my previous machine. I moved my 3090 over to the new build and bought the B580 to make my old machine work and try out the new thing from Intel.

I've been playing around with the B580 seeing what it can do. So far I haven't had any problems other than performance issues, and that's mostly about Starfield. I haven't tried overclocking it yet, but I've messed around with a bunch of games trying to see what sort of settings I can run them at. This is more of a "highest playable settings" sort of thing than the usual comparison charts. I don't really have anything useful to compare the B580 to - just a 3090 and a 1660 Ti - and I'd have to move cards around to make any useful comparisons. As far as settings go, I was aiming to get the 1% lows over 60 if I could do it without clobbering settings too much, and if not, at least up into the 50s with an average comfortably over 60fps. Well, except for 4k. Yes, I tormented the poor budget card by making it do 4k. 4k was more like "how high can I go" most of the time. Scaling is all XeSS because I don't seem to have any games that support FSR but not XeSS that I felt like testing.

On to the charts. Please note that these are just taken at whatever random point in the game I was at, and they're not a carefully scripted test run.

Test System:
i9-10980XE (18c/18t, HT disabled), 4x16GB DDR4-3600 CL16 (quad channel), AsRock Steel Legend ARC B580 12GB, PCI-e 3.0

* = Highest preset. RT and other settings are adjusted separately, except for Doom Eternal.

1080p

Game | Preset | Scaling | Average | 1% Lows
CyberPunk 2077 | Ultra* | Off | 78 | 59
CyberPunk 2077 RT | RT Low | Off | 78 | 54
Doom Eternal RT | Ultra Nightmare with RT* | Not Supported, Built-in Scaling Off | 175 | 115
Control | High* | Not Supported | 108 | 67
Control RT | High + Med RT | Not Supported | 87 | 66
Metro Exodus Enhanced Edition | Ultra | Not Supported | 84 | 65
The Outer Worlds | Ultra* | Not Supported | 100 | 76
Shadow of the Tomb Raider | Highest* | Off | 150 | 76
Shadow of the Tomb Raider RT | Highest* + Ultra RT* | Off | 87 | 62
Starfield | Ultra* | Off | 50 | 36

1440p
Game | Preset | Scaling | Average | 1% Lows
CyberPunk 2077 | Ultra* | XeSS Balanced | 78 | 60
CyberPunk 2077 RT | RT Low | XeSS Balanced | 76 | 57
Doom Eternal RT | Ultra Nightmare with RT* | Not Supported, Built-in Scaling Off | 138 | 100
Control | Medium | Not Supported | 89 | 62
Control RT | Turn off Ray Tracing | - | - | -
Metro Exodus Enhanced Edition | High | Not Supported | 88 | 64
The Outer Worlds | Very High | Not Supported | 89 | 67
Shadow of the Tomb Raider | High | XeSS Ultra Quality | 136 | 68
Shadow of the Tomb Raider RT | Highest* + High RT | XeSS Ultra Quality | 101 | 69
Starfield | Didn't bother. See 4k results. | - | - | -

Somehow enabling RT helps with 1% lows in Shadow of the Tomb Raider. We'll see more of this at 4k.

Torturing the budget card at 4k
Game | Preset | XeSS | Average | 1% Lows
CyberPunk 2077 | High | XeSS Performance | 74 | 51
CyberPunk 2077 RT | RT Low | XeSS Performance | 61 | 46
Doom Eternal RT | Ultra Nightmare* | Not Supported, Built-in Scaling Off | 84 | 67
Control | Low | Not Supported | 54 | 40
Control RT | Don't even think about it. | - | - | -
Metro Exodus Enhanced Edition | Low | Not Supported | 56 | 44
The Outer Worlds | Medium | Not Supported | - | -
Shadow of the Tomb Raider | High | XeSS Performance | 109 | 62
Shadow of the Tomb Raider RT | Highest* + Medium RT | XeSS Performance | 90 | 66
Starfield | Ultra* | XeSS Performance | 47 | 31

Once again Shadow of the Tomb Raider has better 1% lows with RT enabled. Starfield is pretty much a train wreck on the B580. It doesn't run well at any resolution or settings. 1080p low averages 54fps with 1% lows at 41. Might as well run it at ultra settings. It's barely slower, so I figure it's badly CPU bound.
 
Can you run Cyberpunk 2077 with Path Tracing and everything maxed out at different XeSS presets using the built-in benchmark?
 
Path Tracing is the only thing my A770 doesn't like or do well at in Cyberpunk 2077. I ordered the B570 from Newegg for $219 and it should be here next week, and Intel still hasn't shown us the new XeSS 2 upscaler yet.
 
I reinstalled the game to play with the benchmark tool some. Personally I would turn off PT and maybe tweak some other stuff for 60FPS+ lows like the OP has shown already.

Screenshot 2025-01-16 144013.png
 

Attachments

  • Screenshot 2025-01-16 144216.png
  • Screenshot 2025-01-16 144231.png
  • Screenshot 2025-01-16 144152.png
  • Screenshot 2025-01-16 144134.png
  • Screenshot 2025-01-16 144122.png
Can you run Cyberpunk 2077 with Path Tracing and everything maxed out at different XeSS presets using the built-in benchmark?
Don't plan on actually playing with it on. I ran it on the Ultra RT preset.

XeSS | 1080p | 1440p | 4k
Performance | 50.2 / 43.5 | 34.5 / 30.0 | 18.3 / 15.9
Balanced | 43.3 / 37.7 | 29.0 / 25.2 | 14.9 / 12.9
Quality | 34.9 / 30.2 | 22.9 / 19.7 | 11.5 / 9.9
Ultra Quality | 29.9 / 25.8 | 18.7 / 16.0 | 9.4 / 8.0
Off (native) | 16.0 / 13.6 | 9.7 / 8.0 | 0 fps

4k native ran out of vram. Afterburner was showing 14.5GB allocated, so it was spilling over into system memory.
 
Has Intel released their XeSS 2 Frame Gen yet? That's kind of why I picked up the B570: if it's locked to those XeSS 2 cores, you're really buying the software, like with Nvidia products.
 
Any of you owners play Fortnite? That would be interesting. Don't go install it for me or anything. I want to pick one up to play with but they've been scarce even with a Micro Center very close to me.
 
Hmm. These are all lower than my mobile 4070 140W card. What is the expectation for this generation? Am I missing where it sits in the product stack?
 
My thinking is that for their second generation Intel is moving in the right direction with Battlemage. It still has some caveats, and anything from AMD or Nvidia has fewer of them. For me it's not the only GPU I have at home; I've got AMD, Nvidia and now Intel cards to mess with. Intel still has some work to do on driver CPU overhead and on performance issues in games that should otherwise work fine, Starfield for example. I've seen expected performance from both of my B580s, and one is paired with a 12600K, which is hardly an expensive CPU these days. So some of the "you have to have a 9800X3D or it sucks" arguments are off the mark. IMO you really need a 12th gen Intel CPU or newer, or an AM5 Ryzen 7000 on the AMD side. That doesn't make these GPUs friendly for upgrading older systems, but what about building a new machine for 5-600 bucks? Totally viable for someone who won't buy a used GPU.
 
Has Intel released their XeSS 2 Frame Gen yet? That's kind of why I picked up the B570: if it's locked to those XeSS 2 cores, you're really buying the software, like with Nvidia products.
I think it's only in F1 24 so far with a few more announced. So it's out but game support is extremely limited. It's supposed to work on Alchemist, no idea how well.

Hmm. These are all lower than my mobile 4070 140W card. What is the expectation for this generation? Am I missing where it sits in the product stack?
It's a $250 MSRP card. In the current generation it competes with the RTX 4060, RX 7600 and RX 7600XT and is usually a little faster. Its next-gen competitor is probably an RTX 5050, if NV makes one this time, plus whatever AMD has cooking for the sub-$300 segment, if anything. AMD might just keep the 7600 & 7600XT around and do some price cuts for budget cards. The 7600 is already down to ~$250.

I haven't managed to find laptop 4070 vs. desktop 4060Ti tests online, but just going by specs I'd think a 140W mobile 4070 would be pretty close to a desktop 4060Ti. The B580 is generally slower than a 4060Ti, so your laptop getting higher fps isn't surprising.
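For a rough sense of why, here's a back-of-the-envelope FP32 throughput comparison. The shader counts and boost clocks below are approximate figures I'm assuming, not measurements, so treat this as a sketch rather than a spec sheet.

Code:
# Back-of-the-envelope FP32 throughput: 2 FLOPs per shader per clock.
# Shader counts and boost clocks are rough assumptions, not measured values.
gpus = {
    "RTX 4070 Laptop (140W)": (4608, 2.17),  # shaders, approx. boost GHz
    "RTX 4060 Ti (desktop)":  (4352, 2.54),
    "Arc B580":               (2560, 2.67),
}

for name, (shaders, ghz) in gpus.items():
    tflops = 2 * shaders * ghz / 1000  # shaders * GHz * 2 ops -> TFLOPS
    print(f"{name:>24}: ~{tflops:.1f} TFLOPS FP32")

By that crude measure the 140W mobile 4070 lands in the same ballpark as the desktop 4060 Ti, with the B580 well below both, which matches what you're seeing.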

I'm also using a borderline CPU for a B580. Battlemage has higher driver CPU overhead than AMD and NV, so it's held back more by a slower CPU. I have it in a machine with an i9-10980XE, and 10th gen Intel or Ryzen 5000 is Intel's CPU "requirement". The card still works in older machines, but a bunch of tech tubers have been playing with it and it loses a lot more performance with an older CPU than competing AMD & NV cards do. In other words, my results aren't as good as you'd get with a 12th gen+ or AM5 CPU.
 
I plan to pair the B570 with my 7700X system on B650. One of the best hacks I have going with my A770 is Dead Space and DLSS: when I bought and installed the game I had my RTX 3070, and the game gave me DLSS options. Replace the 3070 with the A770, fire up Dead Space, and you can use DLSS on your Intel Arc card. Works great; I use DLSS Auto at 1440p and the game has no clue Nvidia isn't with us this trip.
 
Anyone checked if Cyberpunk works better with async compute off?
It can be disabled with Cyber Engine Tweaks and/or by creating an ini file in engine\config\platform\pc containing:
Code:
[Rendering/AsyncCompute]
BuildDepthChain = false
DynamicTexture = false
Enable = false
FlattenNormals = false
HairClears = false
LutGeneration = false
RaytraceASBuild = false
SSAO = false

[WaterSimulation]
UseAsyncComputeFFT = false
It is supposed to help with performance on older GPUs. On my 4090 it gives about a 2 fps improvement.
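If you'd rather script it than hand-edit, here's a minimal Python sketch that writes that ini for you. The Steam install path and the ini file name below are assumptions (the original post doesn't name the file), so point it at your own copy.

Code:
# Minimal sketch: write the async-compute-off ini into Cyberpunk 2077's
# engine\config\platform\pc folder. CP2077_DIR and the file name are
# assumptions; adjust the path for your own install.
from pathlib import Path

CP2077_DIR = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Cyberpunk 2077")

INI_TEXT = """\
[Rendering/AsyncCompute]
BuildDepthChain = false
DynamicTexture = false
Enable = false
FlattenNormals = false
HairClears = false
LutGeneration = false
RaytraceASBuild = false
SSAO = false

[WaterSimulation]
UseAsyncComputeFFT = false
"""

target = CP2077_DIR / "engine" / "config" / "platform" / "pc"
target.mkdir(parents=True, exist_ok=True)
(target / "async_compute_off.ini").write_text(INI_TEXT)
print("Wrote", target / "async_compute_off.ini")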

Might be worth trying out to check how well Battlemage supports async compute.
The driver is definitely advertising itself to the game as capable of async compute, but even Ada Lovelace only emulates it, so I would not expect Intel to have proper full hardware support either. In that case the game's internal non-asynchronous scheduling might be faster than emulated async compute.


I reinstalled the game to play with the benchmark tool some. Personally I would turn off PT and maybe tweak some other stuff for 60FPS+ lows like the OP has shown already.
Definitely not at 36fps, and especially not PT with a 43% render scale at 1080p.
PT looks amazing, but the way the filtering works you get a lower level of detail that depends on render resolution and frame rate.
On a 4090 with DLSS Quality at 1440p I get almost 90fps, and in places it is still somewhat obvious that details slowly accumulate on objects after an area is disoccluded. It is not an issue in this case, but it shows that having far fewer samples per second only makes that process take much, much longer. At some point details take so long to appear that I would need to move slowly in-game to avoid everything looking like it lacked detail.

In this sense normal RT should be better: the frame rate is much higher and most effects don't rely on accumulating rays, so the image is immediately fully detailed... or at least as far as the actual render resolution is concerned; upscaling to full resolution might still take some time.
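To put some numbers on the frame-rate dependence, here's a toy model of my own (a deliberate simplification, not how the game's denoiser actually works) that treats the accumulation as an exponential moving average with an assumed per-frame blend factor:

Code:
# Toy model of temporal accumulation: each frame blends a fixed fraction of
# new samples into the history buffer. This is a simplification, not the
# game's actual denoiser; it only illustrates the fps dependence.
import math

BLEND = 0.1    # fraction of new information blended in per frame (assumed)
TARGET = 0.95  # how "converged" a freshly disoccluded pixel should look

def seconds_to_converge(fps: float) -> float:
    # After n frames the accumulated fraction is 1 - (1 - BLEND)**n,
    # so n = log(1 - TARGET) / log(1 - BLEND).
    frames = math.log(1 - TARGET) / math.log(1 - BLEND)
    return frames / fps

for fps in (90, 60, 36, 16):
    print(f"{fps:3d} fps -> ~{seconds_to_converge(fps):.2f} s to reach {TARGET:.0%}")

The number of frames needed is the same either way; it's the wall-clock time that stretches as fps drops, which is why the detail fill-in gets so noticeable at 36 fps and below.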
 
I think I might have found my first real problem with the B580. I'm not sure if it's the drivers or the game, but I tried Indiana Jones and the Great Circle on it and the display gets super bright and washed out in some indoor areas close to the beginning of the game. Seriously messing with the in-game brightness and monitor settings fixes it enough that I can see where I'm going. The game starts out with re-enacting the scene from one of the movies where Indiana Jones takes a gold idol from a temple in Peru. It's gorgeous until you get into the temple, then boom everything goes white. On an NV card it's dark in there like it should be. It's like global illumination is backwards and making everything bright instead of dark.

I've tried just about everything I can think of. Drop all the settings to minimum, mess with them a bunch, HDR on and off, reinstall the game, reinstall ARC drivers, disable AMD integrated in the BIOS and remove AMD's drivers, 2 different monitors, etc. At this point I'm at a loss for what to do about it other than just play the game on an NVidia card.

First two are on ARC, second two on NVidia. You're not going to need to compare them side by side...

TheGreatCircle_entrance_B580.jpg
TheGreatCircle_cave_B580.jpg
TheGreatCircle_entrance_3090.jpg
TheGreatCircle_cave_3090.jpg
 
I don't have this game but I have a B580 and a 7800X3D that I could test on. Trying to see if I can get it at a decent price.

EDIT: downloading now!
 
My experience so far is that it runs okay, but damn, does the driver overhead piss me off sometimes. I'm temporarily using a Ryzen 5 1600, and in certain esports games an FX CPU with a GTX 680 can almost match the frame rates of my 1600 + B580 (at 1080p vs 1440p, but still). This and the ReBAR requirement basically stop it from being used in old systems. But for $270 (for my Sparkle card) I can satisfy my hunger to upgrade; GPUs can be expensive AF, but I will still have one.
 
Same results. :-/

Specs: https://valid.x86.fr/5bekdh

Version:
32.0.101.6790
I kinda hate to say it, but you having the problem too makes me happy. It means it's probably just a game or driver bug rather than me having some sort of uniquely fucked up system that'll never get fixed. I should look around and see if I have any older Intel drivers lying around. I'm using the same version you are, and as far as I know it's the latest.

Intel and Bethesda both got back to me this morning. Bethesda just wanted me to upload dxdiag and msinfo output. Their support form asks for both, but won't accept more than one file, so I just concatenated them together. They need to fix their support form. Intel wanted me to run their system support utility (SSU), upload a video of the problem and a comparison video at the same spot in the game on an NVidia card (since I told them it rendered properly on NVidia), and asked if I'd made any changes to the system recently. That last one is, well, yeah, extreme changes. I built it on Saturday, May 3, 2025.

edit: fix typo in build date. 2025 not 20205.
 
I just installed 32.0.101.6790 on my A770. I'm using it for now to see how these older cards react with the new drivers, and it seems to work with OBS, whereas the B570 seems broken in OBS.
 
I just installed 32.0.101.6790 on my A770. I'm using it for now to see how these older cards react with the new drivers, and it seems to work with OBS, whereas the B570 seems broken in OBS.
I had a look at the OBS forums and I guess my 5090 won't work either? Or at least there's a thread about it not working with recent posts. I didn't have any trouble capturing video off my B580 using Afterburner or Steam. I had trouble with Steam recording at first, so I tried Afterburner, then realized I forgot to press Ctrl; Steam's default to start recording once you enable it is Ctrl+F11. Somehow I spaced out on the Ctrl part. Oops. So Intel got Indiana Jones recordings from Afterburner. At any rate, it seems like OBS does some more complicated stuff and requires actual work to make it work on a new card.
 

Maxsun registers several Intel Arc B580 24GB models with the EEC

I don't think these will be Pro cards, because they would never offer an overclocked model as a Pro card.

https://www.msn.com/en-us/lifestyle...1&cvid=29a4545cbaea4772898c8b68aa12d001&ei=49
It's an "iCraft" OC model. Sounds crafty. Something like that could make sense for the creative crowd. I don't really do photo or video editing, digital art, 3D design, etc. but from what I understand there are a bunch of creative applications were you really want some more vram than gaming cards tend to come with but raw speed isn't so important and that made the 12GB RTX 3060 quite popular.
 
I know Intel offers a Pro driver ("Users looking for the dedicated Intel® Arc™ Pro Graphics Driver should click here").

So, I just got done installing driver 6793 on the B570. I noticed the driver also did a firmware update on the card; is that the same as an on-the-fly BIOS flash?
 
Got my AsRock B570 Challenger.
It gets 12fps in furmark 1080p, vs 30-40fps on my A380. Otoh, it does better on the furmark knot, which utterly chokes the A380.


Tests run on Arch Linux with the latest updates, AM5 8400F CPU.

Dunno why it does so poorly on the basic FurMark test, maybe a driver issue. I guess it might not be changing the power mode under increased demand.
 
12 fps? Ouch. Sounds like a Linux driver issue. My B580 + 9600X machine and my laptop are the only ones I don't have Linux on, but I'm getting far better FurMark scores in Windows 11 24H2 running windowed at the default 1080p. You'd expect a B580 + 9600X to beat a B570 + 8400F, but not by anything close to 12x.

FurMark GL 145
FurMark VK 147
FurMark Knot GL 86
FurMark Knot VK 77
 
12 fps? Ouch. Sounds like a Linux driver issue. My B580 + 9600X machine and my laptop are the only ones I don't have Linux on, but I'm getting far better FurMark scores in Windows 11 24H2 running windowed at the default 1080p. You'd expect a B580 + 9600X to beat a B570 + 8400F, but not by anything close to 12x.

FurMark GL 145
FurMark VK 147
FurMark Knot GL 86
FurMark Knot VK 77
Yeah, will have to investigate later. It still runs Dolphin at a reasonable clip (30fps locked) with a bunch of graphics enhancements enabled, so I know it is working.

FWIW, the VK test runs at a much higher framerate, about matching the A380 IIRC, but it has visual artifacts (the donut flickers)... well, it did that with the A380 too, so it might be a bug in FurMark or the Plasma Wayland compositor.
 
FWIW, the VK test runs at a much higher framerate, about matching the A380 IIRC, but it has visual artifacts (the donut flickers)... well, it did that with the A380 too, so it might be a bug in FurMark or the Plasma Wayland compositor.
That sounds like a bug too. I don't see any artifacts in Windows.
 
Looks like it was a firmware bug, maybe. Checked dmesg, which said it failed to resize the BAR. Checked the EFI setup and Resizable BAR was enabled... well, I flipped the ReBAR switch off and on, enabled IOMMU explicitly while I was in there, then rebooted. The dmesg error is gone now.

Get 122 avg in the 1080p gl test now, about 115 in the vk test.
Knot gets 50 fps avg in gl, much better.
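If anyone else hits this, a quick sanity check is to grep the kernel log for BAR-resize messages and look at the memory regions lspci reports for the card. A minimal sketch; filtering on Intel's vendor ID (8086) is the only assumption, and you may need root for full dmesg/lspci output:

Code:
# Minimal sketch: surface BAR-resize messages from the kernel log and print
# the memory regions lspci reports for Intel GPUs. A prefetchable region the
# size of the card's VRAM means Resizable BAR is actually in effect.
import subprocess

dmesg = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
for line in dmesg.splitlines():
    if "BAR" in line and ("resiz" in line.lower() or "fail" in line.lower()):
        print("dmesg:", line.strip())

lspci = subprocess.run(["lspci", "-d", "8086:", "-vv"],
                       capture_output=True, text=True).stdout
is_gpu = False
for line in lspci.splitlines():
    if line and not line[0].isspace():  # new PCI device header
        is_gpu = "VGA compatible controller" in line or "Display controller" in line
        if is_gpu:
            print(line)
    elif is_gpu and line.strip().startswith("Region") and "Memory at" in line:
        print("   ", line.strip())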
 
Installed Saint's Row: GooH to play around with.
Runs really well in Linux via Wine/gamescope. Get about 200fps max, but it dips down to 28-30 when on the ground looking in the distance (shoots up to 200 if you look up or down). You almost can't tell, though, it's really smooth.

Changing the settings has almost no noticeable effect -- it runs well on all high settings, 1080p, but stumbles a bit when on the ground if you don't drop some down. GPU and CPU utilization both don't go much above 50%, usually staying in that area, regardless of settings. I imagine if I was fighting a boss or increased the resolution to 1440p or higher, it might go up a bit, but then the framerate may be worse on the ground, depending on what is causing that bottleneck.
 