Massive Intel Arc GPU Driver Update Claims 10% to 77% Gains

erek

Hoping for even more as time goes on.

"Intel Graphics today released the latest version of the Arc GPU Graphics Drivers. Version 101.4091 adds support for the iGPUs of 13th Gen Core "Raptor Lake-P" mobile processors. The Intel Arc Control software now supports a standalone desktop mode. Among the handful issues fixed with this release, an application crash noticed in Warhammer 40,000: Darktide during the character-selection screen on Arc discrete GPUs, has been fixed. For the iGPUs of Intel Core processors (11th Gen and later), the drivers fix color/display corruption issues with Need for Speed Unbound and Battlefield 2042; and an intermittent application crash issue with Total War: Warhammer III. Grab the driver from the link below.

DOWNLOAD: Intel Arc GPU Graphics Drivers 101.4091"

https://www.techpowerup.com/304298/intel-releases-arc-gpu-graphics-drivers-101-4091
 
So 30 fps instead of 17? *rimshot*
It was bound to happen. At least there is one company that gives a shit about their driver support and improving performance. They're a newcomer to the GPU space; it's gonna take time for them to get competitive and work out all the bugs. They have been using translation software from the Linux world to route DX9 calls through Vulkan, which is pretty effing cool if you think about it. Vulkan is delivering respectable, if not downright impressive, DX9 performance. Neat trick.

The cards generally carry a 4-star or higher rating in reviews, which means the people buying them are likely managing their expectations. While not cheap, they're at a price point that puts them in the affordable category that both AMD and Nvidia (specifically) have utterly ignored. That's why Intel has managed to steal some GPU market share.

Do the cards suck? Yeah. I won't run that shit in my rig.

Do I know people that would? Yeah. Not everyone has a grand+ in disposable income laying around to buy something "good enough" to play most games.

With enough time invested, Intel will likely deliver stable performance with these things and, in the process, get their software stack in a much better place for their Battlemage lineup (IIRC that's the next generation's name).
 
So 6600xt/6700xt stopped existing?
No, but AMD hasn't released a driver update in over 2 months for the 6000 series cards. They are only updating the 7000 series cards atm.

I put the "(specifically)" after Nvidia for a reason.
 
No, but AMD hasn't released a driver update in over 2 months for the 6000 series cards. They are only updating the 7000 series cards atm.

I put the "(specifically)" after Nvidia for a reason.
I didn't know 46 days was over 2 months? lol.

Either way not a good look that AMD hasn't released a 6000 driver in 1 1/2 months now.
 
I can't speak to the rest of the world, but I do know that I can buy an A770 for less than I can buy a 6600 XT. If I needed productivity first, with gaming secondary, then the A770 would be a strong contender.
I know the Intel cards have strengths. I was just commenting on the tad bit of hyperbole that you need to drop a grand on a card that can "play most games good".
 
I know the Intel cards have strengths. I was just commenting on the tad bit of hyperbole that you need to drop a grand on a card that can "play most games good".
I was speaking to the high end. If you want to play Cyberpunk at 60 FPS at 4K, all settings bitched out to the max, you need to drop a Grand+.

There are no budget GPUs that can do that.
 
I was speaking to the high end. If you want to play Cyberpunk at 60 FPS at 4K, all settings bitched out to the max, you need to drop a Grand+.

There are no budget GPUs that can do that.

"If you want to run the most demanding game at the highest possible resolution using the highest possible settings while getting maximum frame rate, you need to buy top end hardware."

Uhhh.... yeah. If you want Ferrari performance, you have to buy a Ferrari. That's kind of how things work. It's literally the definition of "top end." If an entry level card performed the same as a halo card, why would the halo card even exist?
 
The Intel Arc cards are about what I expected. Rocky launch, driver problems, etc. Intel hasn't made a serious attempt at a gaming GPU in many years, and those past ones were questionable at best.

I'm hoping they get their act together for whenever their next gen comes along. I wasn't expecting this one to be great, but it did better than I expected.
 
"If you want to run the most demanding game at the highest possible resolution using the highest possible settings while getting maximum frame rate, you need to buy top end hardware."

Uhhh.... yeah. If you want Ferrari performance, you have to buy a Ferrari. That's kind of how things work. It's literally the definition of "top end." If an entry level card performed the same as a halo card, why would the halo card even exist?
That's an interesting question. I recall a time when budget cards would run everything at damn near max resolution... but those days are long gone. I miss those days.
 
I suspect it will take years for Intel to get to the same level of driver optimization across all games that NV/AMD have already.

Good to see the work progressing.
 
That's an interesting question. I recall a time when budget cards would run everything at damn near max resolution... but those days are long gone. I miss those days.

When was this? I'm still just in my mid 20s and this fascinates me.
 
When was this? I'm still just in my mid 20s and this fascinates me.
The last time that was true, the GTX 1060 6GB was a new and welcome surprise. It was really the last time a card like that existed, and I doubt we will see the likes of it again.
 
The last time that was true, the GTX 1060 6GB was a new and welcome surprise. It was really the last time a card like that existed, and I doubt we will see the likes of it again.
4K requires a lot of GPU power.

But we are in a great spot for 1440p. In most games you can get 60 fps at 1440p from a 6600 XT, and for the few you can't, you can do some mixed settings and squeak over the edge to 60.
 
When was this? I'm still just in my mid 20s and this fascinates me.
As late as the 900 series cards, when you could run a pair of GTX 970s in SLI for relatively cheap. We had done it with 7600 GT parts in SLI for like 100 bucks a card. Older boards like the GTS 250 would deliver 60 FPS at 1080p in most DX9 titles (for a massive period of time). I had 3 of 'em and still have 1, though driver support has gone to shit and been dropped. Played the shit out of Mass Effect 3 platinum-level MP on one with zero problems.

It really wasn't until the advent of affordable 4K displays that things became an issue. 1440p has been playable on lesser cards for a long time now, but 4K has been a dog unless you have a top-tier card. The 1080 Ti did 40 FPS in top titles, the 2080 Ti got close to 60 FPS, and then you couldn't get cards at all due to the supply chain bullshit and Covid.

We once gamed at lower resolutions; that's likely why lower-end 192- to 256-bit cards had no problem pushing the FPS.
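
For a rough sense of scale, here's the simple pixel-count arithmetic (raw resolution only; it ignores shading cost and memory bandwidth):

1920 x 1080 = 2,073,600 pixels per frame
2560 x 1440 = 3,686,400 pixels (~1.8x 1080p)
3840 x 2160 = 8,294,400 pixels (exactly 4x 1080p)

So at the same frame rate, 4K asks the GPU to shade roughly four times as many pixels as 1080p before any other setting changes.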
 
I suspect it will take years for Intel to get to the same level of driver optimization across all games that NV/AMD have already.

Good to see the work progressing.
Even though their cards are still buggy as all hell, it's pretty amazing how much performance they have unlocked by optimizing their driver code and using a couple of tricks from Linux driver translation to get DX9 on Vulkan. It might take 'em forever to iron out all the bugs, but I suspect we may see relatively stable drivers for most titles in a relatively short period of time. A couple of months, as they lead up to their next GPU launch. Gives me hope for better competition.
 
This is one of their slides with performance claims:

*note that it has no indication of what the bars represent ;)

[Attached image: big_intel-arc-value-vs-launch.jpg]
 
The last time that was true, the GTX 1060 6GB was a new and welcome surprise. It was really the last time a card like that existed, and I doubt we will see the likes of it again.
One of my all-time favorites was the 8800 GT. I picked mine up for $199, and it outperformed the $400 8800 GTS and was fairly close to the $599 8800 GTX. That was a pretty cool time to be a budget gamer.
 
One of my all-time favorites was the 8800 GT. I picked mine up for $199, and it outperformed the $400 8800 GTS and was fairly close to the $599 8800 GTX. That was a pretty cool time to be a budget gamer.
The GTS 250 was a 9800 GTX+ rebrand. The sweet thing about those cards was you could occasionally find one that was 256-bit, 1GB of RAM (actually I think the no-connector version was limited to 512MB), and required NO PCIe power header; it got all its power from the PCIe slot (which tops out around 75 W). They ran at like 400 MHz, but I gave one to a friend with an OEM PC with a middling power supply, and he played the shit out of Star Trek Online at max settings for about 5 years until the card died.
 
I didn't know 46 days was over 2 months? lol.

Either way not a good look that AMD hasn't released a 6000 driver in 1 1/2 months now.

I personally don't know if it's really a big deal. Last I checked, there weren't any major show-stopping bugs going on with the 6000/5000/500 cards.
I'll worry AMD has abandoned those products if we get to 4 months without a driver... or if some new game drops that crashes and they don't have a fixed driver in a reasonable timeframe.

Monthly drivers are great and all. However, if a driver isn't going to offer any real improvements or fix anything, thanks for saving me a pointless download. I know it's popular with non-AMD users to go on about how AMD needs to "catch up" driver-wise... I think the majority of AMD users know that's a bunch of FUD. I don't really care if AMD releases monthly drivers, as long as they are on top of things when major bugs pop up or new titles are released that may have issues. (On that front, AMD has very much improved over the last few years... it's pretty rare that a new title has any real performance issues on release; owning the console market helps, I'm sure.)

I hope Intel figures out their drivers... they claim to be trashing their overlay/control panel setup to move to an AMD-style all-in-one. Hopefully they do... and I'm glad some people have chanced beta testing an Intel card to get Intel the real-world data they need to square away the software. Hopefully Intel's next generation will be real competition, and their software gets nice and polished up by Arc.
 
I take this as a good sign. Anyone who got this first-gen Arc release is basically helping beta test Intel GPU drivers.
By the time the 2nd-generation Arc comes out, the drivers will be mature enough to actually run a variety of games, and if Intel keeps addressing all these problems, the 2nd-generation launch should be smooth and the cards might actually be contenders that'll help bring down the price of AMD and Nvidia GPUs, so the whole damn market becomes a bit more financially viable.

Hopefully.
 
I take this as a good sign. Anyone who got this first-gen Arc release is basically helping beta test Intel GPU drivers.
By the time the 2nd-generation Arc comes out, the drivers will be mature enough to actually run a variety of games, and if Intel keeps addressing all these problems, the 2nd-generation launch should be smooth and the cards might actually be contenders that'll help bring down the price of AMD and Nvidia GPUs, so the whole damn market becomes a bit more financially viable.

Hopefully.
Technically, this is second-gen Arc. There was a straight-to-OEM card (the DG1) released for their first outing that only worked on Intel boards. I do tend to agree with your thoughts on the maturity of the drivers for Battlemage. We can hope... for competition.
 
A couple of weeks ago, it was rumored that Nvidia is attempting to use AI to optimize its GPU drivers and that it could lead to a large improvement.

They certainly could stand to get some relief on how much their drivers require from the CPU.
You’re right: https://hothardware.com/news/nvidia-ai-optimized-gpu-drivers-massive-performance-boost

And not only that, but their products are optimizing the CEO's net worth by billions to boot: https://markets.businessinsider.com...th-artificial-intelligence-chip-stocks-2023-1
 
I love to see them improving.. but bragging about DX9 games? zoinks.
DX9 titles were the ones performing the absolute worst; the GPU itself doesn't natively support DX9. DX9 titles are actually run through a translation layer called D3D9On12.
https://github.com/microsoft/D3D9On12
https://www.intel.ca/content/www/ca/en/support/articles/000091237/graphics.html

D3D9On12 is a translation layer that maps graphics commands from D3D9 to D3D12. When an application creates a D3D9 device, they may choose for it to be a D3D9On12 device, rather than a native D3D9 device. When this happens, d3d9on12.dll is loaded by the D3D9 runtime and initialized. When the application calls rendering commands, D3D9 will validate those commands, and then will convert those commands to the D3D9 Device Driver Interface (DDI) and send it to D3D9On12, just like any D3D9 driver. D3D9On12 will take these commands and convert them into D3D12 API calls, which are further validated by the D3D12 runtime, optionally including the D3D12 debug layer, which are then converted to the D3D12 DDI and sent to the D3D12 driver.
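
For the curious, an application can also opt into that mapping layer explicitly instead of waiting for the runtime to choose it. Here's a minimal sketch assuming the d3d9on12.h header from a recent Windows 10+ SDK; the helper function name is made up, and on Arc the driver makes this choice for you, so this is illustration only:

// Explicitly request a D3D9On12 device rather than a native D3D9 one.
#include <d3d9.h>
#include <d3d9on12.h>

IDirect3D9* CreateD3D9OverD3D12()  // hypothetical helper name
{
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;  // route this device through the mapping layer
    // args.pD3D12Device may point at an existing ID3D12Device;
    // leaving it null lets the runtime create one internally.

    // Rendering commands issued through the returned IDirect3D9 get
    // validated by D3D9 and converted to D3D12 calls, as described above.
    return Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
}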
 
I think it is very plausible. Nvidia has mentioned in a few of their presentations that they already use AI to help design their silicon and their PCBs, as it is far more capable of routing traces than humans are, and it can run massive simulations on what happens if they increase cache by X amount or move parts of the silicon logic from one place to another, so it can assist in fine-tuning the hardware to some degree. Having an AI comb through Nvidia driver code to find function calls to nowhere, duplicate algorithms, or inefficient loops is just the next step.

I mean, if you put a team of humans on that sort of job and had them working around the clock, they would never get it finished, and the drivers would be updated too frequently for them to keep up. I am sure they had a small team train an AI, give it some parameters, then feed it the source code so it could flag parts of the code that violate those parameters and spit out a detailed report. I know they wouldn't trust it with making actual code changes, but if it generates something tangible the driver teams can work with, that alone is enough. Then, as time goes on and the changes get made and the AI gets more fine-tuned, they can experiment with having it optimize algorithms for performance, stability, or both.
 
This is very welcome news...they need to stabilize the control panel as well.
 
https://www.techpowerup.com/304346/dead-space-performance-jumps-46-with-resizable-bar-enabled

“The PCI Resizable BAR feature can have a major impact on the performance of "Dead Space" (the 2023 reboot).”

From a comment under that link:

TL;DR while you may have ReBAR enabled in your BIOS, and GPU-Z may show it as Enabled, that doesn't mean your NVIDIA GPU will actually use it! It seems like ReBAR support is something that is baked into the drivers on a per-game basis, and NVIDIA chooses to blacklist all games by default from using ReBAR, while whitelisting only titles they have tested and found it works well with. This is in direct contrast to AMD who whitelist by default and blacklist non-working games; presumably Intel does the same as AMD.
 