https://gamerant.com/intel-battlemage-gpus-performance-specs-leaked/
12GB doesn't seem like much, hope these are low-end models.
It's not, they just want you to think that it is, so that you will go & buy a 16GB model.
Cool
“Intel has reportedly chosen the TSMC 4 nm EUV foundry node for its next generation Arc Xe2 discrete GPUs based on the "Battlemage" graphics architecture. This would mark a generational upgrade from the Arc "Alchemist" family, which Intel built on the TSMC 6 nm DUV process. The TSMC N4 node offers significant increases in transistor densities, performance, and power efficiency over the N6, which is allowing Intel to nearly double the Xe cores on its largest "Battlemage" variant in numerical terms. This, coupled with increased IPC, clock speeds, and other features, should make the "Battlemage" contemporary against today's AMD RDNA 3 and NVIDIA Ada gaming GPUs. Interestingly, TSMC N4 isn't the most advanced foundry node that the Xe2 "Battlemage" is being built on. The iGPU powering Intel's Core Ultra 200V "Lunar Lake" processor is part of its Compute tile, which Intel is building on the more advanced TSMC N3 (3 nm) node.”
Source: https://www.techpowerup.com/324197/intel-arc-xe2-battlemage-discrete-gpus-made-on-tsmc-4-nm-process
I bet it's the B5xx models, so a 4GB increase from the A580. 20 or 24 Xe2 cores vs. 24 Xe cores for the A580. Likely a hit to memory bandwidth, but maybe not much. The A580 has a 256-bit bus with 16 Gbps GDDR6, but 20 Gbps seems to be readily available. If they use 20 Gbps RAM on the 192-bit bus that 12GB implies, it would have about 94% of the A580's bandwidth. A 256-bit bus on a sub-$200 card is a little nuts these days; I bet the A580 can't really use all of it.
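For anyone checking my math, here's a quick sketch of that bandwidth comparison; the 192-bit bus and 20 Gbps GDDR6 for the B5xx card are my assumptions, not confirmed specs:

```python
# Peak memory bandwidth in GB/s = bus width (bits) / 8 * data rate (Gbps).
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

a580 = bandwidth_gbs(256, 16)   # A580: 256-bit @ 16 Gbps -> 512 GB/s
b5xx = bandwidth_gbs(192, 20)   # assumed B5xx: 192-bit @ 20 Gbps -> 480 GB/s

print(f"A580: {a580:.0f} GB/s, assumed B5xx: {b5xx:.0f} GB/s "
      f"({b5xx / a580:.0%} of the A580)")
```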
Hoping they go 32 again on their high-end model, as I need another GPU. Will be waiting to see how the arch changes pan out. I think they learned a lot from Arc and will address the biggest throughput issues for the next gen. Actually a bit excited for the new cards to release, and hoping they're going to be competitive at a reasonable price point.
Well, it would be rather funny if Intel can beat AMD's highest-end RDNA 4 card. I just haven't heard much about dealing with Intel's discrete cards.
That would be hilarious. It would also be fun to see Intel fabbing GPUs for NV or AMD the gen after next if they can get 18A working well. Giggles aside, I just want someone to properly compete with NV, and I wouldn't mind at all if one of my options was fabbed by Intel in the USA. That's not going to happen this round, but maybe in late 2026/early 2027. I just don't like one company having a lock on anything.
I saw that. I doubt that's the top model. There are a lot of leaks about a 32 Xe2 core, 256-bit bus, 16GB VRAM model, so specs similar to the A770 aside from Xe2 instead of Xe cores. But maybe the B580 launches first? What I'm really wondering is when, and especially whether Intel will launch this year and get out ahead of the next-gen NV and AMD stuff. Sounds like they might, but to really pull off a coup, if they have the goods and they're ready, there would have been an announcement by now. Get the cards to retail by Black Friday.
So the B580/B570 should just be the mid-tier cards if they kept the naming like before (but maybe they will skip the "high end").
The chip that was reportedly cancelled was the little one for the B3xx cards. I bet they're just starting with $200-250 budget cards since that's where they can make the biggest dent. NV and AMD have been neglecting the budget segment, which leaves an opening for Intel to grab market share. More market share will make Arc work better. New games not working on day 1 isn't all Intel's fault; it's also the fault of the game devs for not bothering to test on Arc. Of course, a lot of them probably don't see much reason to bother since hardly anyone uses Arc, but more market share will change that.
With everything going on at Intel right now, I hope they continue to push in the dGPU space. Anyone think they will bail after Battlemage, or will they keep at it? Personally I'm planning on picking one up to try with a 12th-gen system I have that has no GPU in it right now.
AMD and Nvidia are their competitors.
https://www.rockpapershotgun.com/ba...announce-the-arc-b580-and-b570-graphics-cards
10% fewer cores, 6% lower clock, 17% less bandwidth with 17% less memory, for 12% cheaper.
Why would anyone buy the B570? There's only a $30 price difference between the B570 and B580.
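Roughly where those percentages come from, going by the launch specs as I read them (treat the exact numbers as my reading of the announcement, not gospel): B580 with 20 Xe2 cores at 2670 MHz, 12GB on a 192-bit bus, $249; B570 with 18 Xe2 cores at 2500 MHz, 10GB on a 160-bit bus, $219; both on 19 Gbps GDDR6.

```python
# B570 vs. B580, using the announced specs as I read them.
# Bus width stands in for bandwidth since both use the same 19 Gbps GDDR6.
b580 = {"Xe2 cores": 20, "clock (MHz)": 2670, "bus (bits)": 192, "VRAM (GB)": 12, "price ($)": 249}
b570 = {"Xe2 cores": 18, "clock (MHz)": 2500, "bus (bits)": 160, "VRAM (GB)": 10, "price ($)": 219}

for key in b580:
    cut = 1 - b570[key] / b580[key]
    print(f"{key:12s}: B570 is {cut:.0%} lower")
# Xe2 cores ~10%, clock ~6%, bus/bandwidth ~17%, VRAM ~17%, price ~12%
```

On paper the cuts track the price cut pretty closely, which is why the B580 looks like the better buy at only $30 more.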
Pay attention to this if you're on an older system...
Which means PCIe 3.0 x16 will be fine in terms of bandwidth.
Or it's neutered to only x8, like AMD did with the 6500/6600s...
Would it not mean it has only 8 lanes, so it will be PCIe 3.0 x8? (Which is maybe not much of an issue for this level of card that has a nice amount of VRAM.)
That's how I'm interpreting it...
Same here.
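To put rough numbers on the x8 question, here's a quick sketch of the link bandwidth involved, assuming the card really is a PCIe 4.0 x8 design as discussed above:

```python
# Approximate usable PCIe bandwidth per lane in GB/s (after link encoding overhead).
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBS[gen] * lanes

print(f"PCIe 4.0 x8 : {link_bandwidth_gbs('4.0', 8):.1f} GB/s")   # on a current board
print(f"PCIe 3.0 x8 : {link_bandwidth_gbs('3.0', 8):.1f} GB/s")   # same card in an older 3.0 slot
print(f"PCIe 3.0 x16: {link_bandwidth_gbs('3.0', 16):.1f} GB/s")  # what a full x16 card would get
```

So an x8 card in a PCIe 3.0 slot gets about half of what a 3.0 x16 card would, which usually only matters when the GPU has to spill over the bus; with 10-12GB of VRAM that should be rare.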
Will Intel later have a B770/750 card? Some YouTubers hinted that might be possible. Hmmmm.
YouTubers are blowing smoke for clicks. AFAIK, there will only be the two.