[Rumor] Intel Arc A770 Early October Launch

Both the A750 and A770 are around $300, but the 6600 XT, which beats both in benchmarks, can be found for $300 new and $250 used. The RTX 3060 can be found for slightly less than $300 used on eBay. If the A750 were $200 and the A770 $250, then we'd have something here. Not a bad choice for a Linux user, but the 6600 XT is still the better choice.
If they have the encoders working in Linux maybe. Otherwise? Meh.

Intel Arc Graphics A750 + A770 Linux Gaming Performance - Phoronix

 
Even on Windows, if you want to render in Blender but do not want to use Nvidia for some reason, it can be significantly faster than a 6800 XT in some workloads:



OpenCL:
https://techgage.com/wp-content/uploads/2022/10/Intel-Arc-A770-and-A750-Performance-LuxMark.jpg

But while it completely beats the Radeon, it is not a good option against the RTX cards, would be my feeling, looking at this:

https://techgage.com/article/intel-arc-a750-a770-workstation-review/

Adobe, Blender, Blackmagic, not just Handbrake; it does incredibly well per dollar:

https://techgage.com/article/intel-arc-a750-a770-workstation-review/
https://techgage.com/article/intel-arc-a750-a770-workstation-review/2/

But you need to prove yourself quite a bit to get the pro market interested, I would imagine, and shaving a few dollars off the price will not win over many buyers, except maybe in some parts of the world.
 
The bottom line is that Intel needs ARC to perform well enough, with enough stability, to be able to push Nvidia out of its laptop designs.

A+A and I+I across mobile would be a crushing blow to Nvidia.

I am not even sure Intel is concerned about the actual dGPU market. It just has to be proven and branded there.
Hopefully not too crushing, since we need Nvidia too. I don't want AMD and Intel locking in their own platforms in the long run. But, in the short run, I do hope Intel does this. Intel supports open standards, and even their AI upscaling method can be run on Nvidia and AMD hardware. Intel has the muscle to gain market share through OEM prebuilds and laptops. They also have the muscle to increase open-standards support in productivity software for GPU-accelerated tasks.

Nvidia has its "innovations", but locking them into both its hardware and its own software, instead of building a more open standard around them, obsoletes them too quickly:
G-Sync -> Dead in its original form, now basically software support for AMD's and VESA's open VRR standard.
GPU-accelerated PhysX -> Dead for years. Was locked to Nvidia hardware after they bought Ageia. The CPU-accelerated version is now open source.
DLSS -> Locked to hardware. DLSS is not supported on non-RTX cards. DLSS 3 is only supported on the RTX 4000 series and above. Further development of DLSS 2 remains to be seen, but it might soon be dead. The best chance for long-term upscaling support on Nvidia cards is AMD's FSR and Intel's XeSS.

If Intel squeezes Nvidia out of the high-volume mobile and OEM market, it will eat up Nvidia's market share and give developers even less incentive to lock features to Nvidia in their games or software. With light gaming now possible on iGPUs, Nvidia is already pretty much out of the low-end market. Intel's dGPUs gain ground in the mainstream segment, which leaves Nvidia with high-end and halo dGPUs.

If Intel gets its game support to a higher level and also pushes open-standard support in productivity software, the playing field in the GPU market will be totally changed, I believe. Intel becomes less dependent on Nvidia for GPUs, and AMD is already not dependent on Nvidia for GPUs. Market share gets split more, and we consumers might benefit a lot from that.

Nvidia probably saw this coming a long time ago, and it might be another reason why they wanted to buy ARM.
 
The only thing buying ARM would really have given Nvidia is the ability to act like Nvidia, pushing GPUs on everything ARM.

Nvidia was getting a little too big for its britches... exemplified by the 4000-series pricing.

The industry is moving in the right direction. FB/Meta is moving to remove, or at least dent, Nvidia's stranglehold on AI software (and it isn't all about AMD; everyone wants to make their own chips).
Intel is in the GPU business now... and they might not be a threat to the high end yet. Kyle is correct, though: a solid mid-tier mobile Intel option could all but end Nvidia mobile.

For 4 generations now Nvidia has been building AI chips that they could also put in consumer gaming cards. I doubt they stop doing that... but it's possible they might pay a bit for it over the next year or so. Intel is doing the same thing... but Nvidia hasn't had to deal with the OEM-wrangling end of Intel before. I know I sound like an AMD shill sometimes, but I believe they really are in the best position. Chiplets, which it seems will also be a thing in GPUs shortly, give AMD a very unique ability to mix and match a little for different markets. By fabbing one massive die intended for datacenter parts, whose core design (if not the actual same die) also runs in their consumer cards, Nvidia (and now Intel, it seems) are locked into a lot of unchangeable choices. We'll have to see what AMD's 7000s are looking like... but going into even 2nd or 3rd gen chiplet GPUs, AMD may be able to double the tensor chiplets (or even just make a 100% tensor chip), etc. Perhaps not this gen, but down the road they could more easily tailor chips for specific markets. Chiplet GPUs are actually more exciting for datacenter than for gamers imo. For gamers, though, it may lead to much more inexpensive GPUs. (Assuming AMD doesn't make hay and rake in profit instead.)
 
These GPUs do have potential, but I'm not a huge fan of betting on future driver updates to bring about better performance; I've had that bridge burned too many times. They are good at DX12 and Vulkan games, but yeah, DX11 and lower are not very good. It certainly doesn't help that Intel is using a wrapper to convert DX11 to DX12. Linux does this with DXVK and VKD3D, but Valve has spent years optimizing those. Intel's DX11 and DX9 wrapper is from Microsoft, which didn't optimize them well.
 
Chiplets are the future. :) There are node constraints on die size (I believe Volta was close to that limit) and diminishing returns with node shrinks. Nvidia and Intel are also working on chiplet designs; AMD is just way ahead there for the moment. In a few years, I reckon all high-end GPUs will be based on chiplet designs. As with Ryzen CPUs, if you can't make the silicon bigger, you can always combine more of it instead.

Jensen's reported death of Moore's law might be premature given chiplets; Moore's law might just slumber a bit for Jensen and Nvidia until they have a breakthrough in chiplets too.
 
I wouldn't say AMD is "way ahead". The Intel Ponte Vecchio accelerator uses 47 tiles/chiplets in its design and, for its intended job, outpaces the Nvidia A100 and the new AMD MI250. Its successor is already in development, should outpace the H100, and will be a full generation ahead of the MI250, which is AMD's first MCM GPU. On the CPU side of things, the new Sapphire Rapids CPUs look beastly and are a full MCM design as well.
If anything, I would say the new Intel tile/chiplet design methodology is ahead of TSMC's because of its scalability. For example, the Sapphire Rapids Xeons are made up of 4 tiles, but each tile is made up of 7 single-purpose chiplets. This is made possible by some sort of voodoo magic with interposers and the hygroscopic properties of water to move the chiplets into place at a super small scale.
 
I would say AMD is way ahead.
Ponte Vecchio is delayed; once it arrives, it might compete against the MI250X or A100, or it might compete against the MI300 (supposedly a multi-GPU with CPU) or H100. Most likely it will be above the A100 and H100, but below the MI250X.
Intel's Sapphire Rapids Xeon is delayed; once it arrives, it might compete against AMD's Genoa. The link there doesn't show it, I think, but if I remember correctly, Intel is supposed to have an 80-core part too, with 4x 20-core chiplets vs AMD's 12x 8-core chiplets.
AMD is already past Infinity Fabric 3 and is developing these HPC parts on Infinity Fabric 4. It's going to be interesting to see what their purchase of Xilinx brings to the table there too.

But, for all I care, I can agree to disagree with you on this, because my main point was that chiplets are the future and we might see performance for us consumers beyond what node shrinks can deliver in the coming years! SLI and Crossfire are pretty much dead now and have been for a while. The chiplet approach can give some of that back without the headaches that plagued multi-GPU systems. :D

What's beautiful in this is that Intel, AMD and Nvidia all support UCIe, which means we might get a lot of fun surprises from all three in terms of what they package with their MCM designs. I love common standards, and I love that Intel, Nvidia and AMD all have a path forward away from reliance on monolithic dies and node shrinks.
 
Let's go, boys

$349
https://www.newegg.com/intel-21p01j00ba/p/N82E16814883001?Item=N82E16814883001



$289
https://www.newegg.com/intel-arc-a750-21p02j00ba/p/N82E16814883002?Item=N82E16814883002

 
Out of stock :(.
I think it's just a place-holder page and has not yet been in stock.

BTW I found this benchmark while looking for what amazing render times I'll see with a 4090 in Topaz Video Enhance AI. Mousefart. I'm not sure if the weeping fully began with the 4090's lack of meaningful uplift, or the fact a $329 GPU was right up there with it. WTF!? There may still be some optimization in the tank for 4090 since the developer hasn't yet updated, but then again the program is CUDA-enabled meaning Nvidia should've already had a big advantage, and I'm not sure how much ground an update could actually make up.

Prediction: Gamers will continue to laugh off Intel Arc for DirectX time-wasting, but word slowly gets out that A750/A770 may just be amazing at certain workloads, astounding at the pricepoint (AI, Visual FX rendering, AV1 encoding for streamers, and again - yes, Taylor-Swift-on-Dua-Lipa deepfakes). Then there's a run on these and they're OOS everywhere, scalping commences, and Joseph R. Biden sneezes then hiccups while shuffling down a hall looking for toilet paper. It is morning in America again.

Again, a $329 GPU. The $289 A750 may even perform close to it in these workloads. And this particular program does scale with multiple GPUs.

 
The only thing that would make me think about an Arc is if Intel would lower their prices and enhance their video playback quality. I know, the card is meant for gaming, but as far as I know, Intel iGPUs have always had horrible, fake, yellowish video quality when it comes to streaming playback.
That's the main reason I'm keeping my old Radeons: I game, but I also like to watch videos with the best quality possible, and in that regard neither Intel nor Nvidia can touch AMD cards.
 
I want one just to fiddle with. I know, it defies all logic! 😁
 
Well, that's unfortunate; I was planning to buy one for OpenCL/compute on Linux (i.e., giving Intel some of my (unpaid) time to bring these up in multiple distros and flush out driver/runtime bugs).
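For anyone planning the same thing, a quick sanity check that the compute runtime is actually exposed to applications is just to enumerate OpenCL platforms and devices. A minimal sketch, assuming the pyopencl package and at least one installed OpenCL runtime/ICD (e.g. Intel's compute runtime), could look like this:

```python
# Minimal OpenCL sanity check: list every platform and device the ICD loader can see.
# Assumes pyopencl is installed and at least one OpenCL runtime (e.g. Intel's
# compute runtime) is registered on the system.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        mem_mib = device.global_mem_size // (1024 ** 2)
        print(f"  Device: {device.name} | {device.max_compute_units} CUs | {mem_mib} MiB")
```

If the Arc card doesn't show up in that list, the problem is in the runtime/ICD packaging on the distro rather than in whatever application you were going to test.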
 
I checked Newegg, Amazon, BestBuy and pcpartpicker. Nothing even listed except for Newegg. Starting to look like a paper launch?
Probably, or Intel is having a hard time convincing Best Buy and Amazon to carry it. I still think the majority of Arc cards will go into OEM builds, since Intel probably figures consumer interest isn't high enough unless you're an enthusiast picking one up to play with.
 
I think they're either having trouble getting retailers to play ball, or they're targeting specific ones for the limited stock available while they gear up for a bigger push with Battlemage. I'm posting from my A750, and a few driver wurbles aside, this thing is entirely meeting and sometimes exceeding my expectations.
 
Prediction: Gamers continue to laugh off Intel Arc for DirectX time-wasting, but word slowly gets out that A750/A770 is amazing at certain workloads, astounding at the pricepoint (AI, Visual FX rendering, AV1 encoding for streamers, and Dua Lipa deepfakes). Then there's a run on these and they're OOS everywhere, scalping commences

That didn't take long.

 
Will be interesting to see the 2024 look back videos with the latest drivers (if they keep at it).
 
Intel using DXVK (part of Steam Proton) for their Windows Arc GPU DX 9 drivers
https://www.gamingonlinux.com/2022/...roton-for-their-windows-arc-gpu-dx-9-drivers/

In their announcement video below, they talk about how they did a "native implementation of DX9" and "there will be times when games are running on our native DX implementation, but there'll be other times when we take advantage of translation layers to go from DX 9 to a more modern API". Their blog post also mentions this hybrid approach, but neither the blog post nor video actually say it's DXVK. You only find that out when you look through their readme, to find the DXVK note and license.

They say the way it works will be transparent to the end user, so they likely just have a list of games where the driver switches between DXVK and their own implementation.
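Purely as an illustration of that guess (this is not Intel's actual code, just a hypothetical sketch of a per-title lookup), the switching logic would amount to something like:

```python
# Hypothetical sketch of per-game DX9 backend routing. The game names and the
# selection rule are made up for illustration; they are not from Intel's driver.
DXVK_TITLES = {"legacy_dx9_game.exe", "another_old_title.exe"}

def pick_dx9_backend(executable_name: str) -> str:
    """Return which DX9 path to use for a given game executable."""
    if executable_name.lower() in DXVK_TITLES:
        return "dxvk"        # translate D3D9 calls to Vulkan via DXVK
    return "native-d3d9"     # use the driver's own D3D9 implementation

print(pick_dx9_backend("Legacy_DX9_Game.exe"))  # -> "dxvk"
```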


Nice to see open source getting some love from the big corporations.
 
Updated drivers. About 6 weeks since the last release, and this is all they come up with?

https://www.intel.com/content/www/u...intel-arc-iris-xe-graphics-beta-windows.html?

FIXED ISSUES:

Intel® Arc™ Graphics Products:
• Fortnite* (DX12) may experience an application crash with High and Epic graphics settings.

Intel® Core™ Processor Products:
• Sonic Frontiers* (DX11) may exhibit texture and rendering corruption during gameplay.

KNOWN ISSUES:

Intel® Arc™ Graphics Products:
• Diablo II: Resurrected* (DX12) may cause system instability or application crash.
• Warhammer 40,000: Darktide* (DX12) may experience application crash during character selection.
• A Plague Tale: Requiem* (DX12) may experience application freeze and crash during gameplay.
• Conqueror's Blade* (DX11) may experience corruption in benchmark mode.
• Call of Duty: Vanguard* (DX12) may experience missing or corrupted shadows during the Submarine mission.
• System may hang while waking up from sleep. May need to power cycle the system for recovery.
• GPU hardware acceleration may not be available for media playback and encode with some versions of Adobe Premiere Pro.
• Blender may exhibit corruption while using Nishita Sky texture node.

Intel® Iris™ Xe MAX Graphics Products:
• Driver installation may not complete successfully on certain notebook systems with both Intel® Iris™ Xe + Iris™ Xe MAX devices. A system reboot and re-installation of the graphics driver may be required for successful installation.

Intel® Core™ Processor Products:
• Dysterra* (DX12) may exhibit visual corruptions during gameplay.
• Total War: Warhammer III* (DX11) may experience application crash while loading battle scenarios.
• Call of Duty Warzone 2.0* (DX12) may exhibit corruptions on certain buildings and floors during gameplay.
• Conqueror's Blade* (DX12) may experience application crash during game launch.
• A Plague Tale: Requiem* (DX12) may experience application freeze and crash during gameplay.

INTEL® ARC™ CONTROL KNOWN ISSUES:

• Windows UAC Admin is required to install and launch Arc Control.
• Some applications may exhibit a transparent or blank window when CMAA is set to "Force ON" globally.
• A 1440p resolution selection in Arc Control Studio Capture may be unavailable when the display native resolution is 4K.
• Arc Control Studio Camera overlay position may not retain desired position and size after a system restart.
• The Arc Control Studio Camera tab may take longer than expected to respond upon first navigation.
• Arc Control may report an incorrect memory bandwidth value.

Intel® Arc™ Control Performance Tuning (BETA):
• Intel® Arc™ Control Performance Tuning is currently in Beta. As such, performance and features may behave unexpectedly. Intel® will continue to refine the Performance Tuning software in future releases.
 
WTF is going on here?
They have Nvidia GPUs on them as co-processors? News to me...
 
But it says it's from the Intel Store. You would think someone at Intel looks over the product page for inaccuracies...
 