Intel introduces new high-performance graphics brand: Arc

Still think I'm sticking with AMD for now. Not sure I want to deal with first gen drivers. But if I needed a new GPU and they were available I would consider it.

Also want to see what the Linux driver situation is. AMD is very stable and though Intel is well supported for the most part, their GPUs have never been fast enough to properly test.
 
I remember that the initial DG1 Xe GPUs only worked on certain Intel systems. Will these work on older systems or on Ryzen? I'm still on LGA 2011-v3, but I'd like to get one to replace my Nvidia GTX 1080.
 
If the price vs. performance is right I'll buy one for sure. My newly picked up 1440p / 165 Hz monitor really struggles with a 1070 Ti.
 
I've been holding off building a main gaming rig due to current graphics card availability issues. If performance, price and availability are good then I'm game (pun intended).
 
It may all hang on the maturity/stability of the new drivers...promising but we'll see. 😁
 
There won't be a huge supply, but doesn't Intel have their own factories? Which means we wouldn't be splitting the chips like we are with AMD.
 
It may all hang on the maturity/stability of the new drivers...promising but we'll see. 😁
I'm not really worried about the drivers. Intel's drivers have traditionally been plain jane but rock-solid; if anything they will be missing "the optimizations" that AMD and Nvidia have added for specific titles over the years, but I am not expecting any stability concerns. Intel's Linux drivers have generally been pretty solid as well, on both the open and closed source sides of things.

I want to see how well Intel's XeSS holds up against DLSS and FidelityFX, and how easy it is to implement. Supposedly it is powered by the DP4a instruction set, which makes it compatible with Nvidia and AMD hardware as well as current Intel onboard graphics. And I know they have talked about making the XeSS SDK open for anybody to implement, but given the other existing options, whether developers will want to is the real question.
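For anyone wondering what DP4a actually does: it's a single instruction that treats two 32-bit registers as four packed 8-bit integers each, multiplies them lane by lane, sums the products, and adds an accumulator. Here's a rough sketch of just the math it performs (my own illustration, nothing from the XeSS SDK):

```python
def dp4a(a: int, b: int, acc: int) -> int:
    """Emulate a DP4a-style op: treat two 32-bit words as four signed
    8-bit lanes, do a 4-wide dot product, and add an accumulator."""
    total = acc
    for i in range(4):
        # pull out byte i of each word and sign-extend it
        xa = (a >> (8 * i)) & 0xFF
        xb = (b >> (8 * i)) & 0xFF
        if xa >= 128: xa -= 256
        if xb >= 128: xb -= 256
        total += xa * xb
    return total

# lanes (1, 2, 3, 4) . (5, 6, 7, 8) + 10 = 70 + 10 = 80
a = (4 << 24) | (3 << 16) | (2 << 8) | 1
b = (8 << 24) | (7 << 16) | (6 << 8) | 5
print(dp4a(a, b, 10))  # 80
```

The idea, as I understand it, is that Arc would run the same kind of math natively on its XMX units while other vendors' cards fall back to the DP4a path, so the output should match even if the speed doesn't.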
 
There won't be a huge supply, but doesn't Intel have their own factories? Which means we wouldn't be splitting the chips like we are with AMD.

They have their own fabs, but it is unclear whether or not they are using them for graphics products.

Remember, Intel still has some process difficulties to overcome.

Chances are they - like everyone else - are going to TSMC or Samsung this round.
 
I'm not really worried about the drivers. Intel's drivers have traditionally been plain jane but rock-solid; if anything they will be missing "the optimizations" that AMD and Nvidia have added for specific titles over the years, but I am not expecting any stability concerns.
Right, people forget that Intel has the largest GPU market share of anyone, at around 68%. Granted, most of those are non-gaming or business machines, but they have to be solid.

So Intel is not starting from scratch. However, like you say, Nvidia/AMD have per-game optimization and also work with developers while they are making the games, so they have a huge advantage right now.

Intel is not dumb, and I think they can pull it off. However, I still wouldn't buy the first gen. There are just thousands of games, even from the last few years, that would need to work perfectly. I can't imagine there won't be hiccups.
 
I'm not really worried about the drivers. Intel's drivers have traditionally been plain jane but rock-solid; if anything they will be missing "the optimizations" that AMD and Nvidia have added for specific titles over the years, but I am not expecting any stability concerns. Intel's Linux drivers have generally been pretty solid as well, on both the open and closed source sides of things.

I want to see how well Intel's XeSS holds up against DLSS and FidelityFX, and how easy it is to implement. Supposedly it is powered by the DP4a instruction set, which makes it compatible with Nvidia and AMD hardware as well as current Intel onboard graphics. And I know they have talked about making the XeSS SDK open for anybody to implement, but given the other existing options, whether developers will want to is the real question.

You must be the first person I've met who isn't worried about Intel drivers. I haven't had stability issues for years with AMD, NV, or Intel drivers for the most part. These aren't integrated graphics that only need to run Farmland or 10-year-old games. Optimizations and game fixes for new games are the ticket. Intel has never focused on games, and when they did, it was more about just getting the game to run. Sure, if you want to display the desktop or run a spreadsheet they are fine, but that is not what we are focusing on with these new cards.
 
Right, people forget that Intel has the largest GPU market share of anyone, at around 68%. Granted, most of those are non-gaming or business machines, but they have to be solid.

So Intel is not starting from scratch. However, like you say, Nvidia/AMD have per-game optimization and also work with developers while they are making the games, so they have a huge advantage right now.

Intel is not dumb, and I think they can pull it off. However, I still wouldn't buy the first gen. There are just thousands of games, even from the last few years, that would need to work perfectly. I can't imagine there won't be hiccups.
I expect current popular titles that are in the top Steam charts, popular streaming charts, and major tech test suites to have their optimizations in place, but I expect most titles outside that short list to be artifact city for the first few months.
 
They have their own fabs, but it is unclear whether or not they are using them for graphics products.

Remember, Intel still has some process difficulties to overcome.

Chances are they - like everyone else - are going to TSMC or Samsung this round.
I thought it was announced some time ago they would be using TSMC 7nm?
 
Right, I'm sure Call of Duty and Fortnite will work 100%, but I play all sorts of weird indie games that I bet Intel has never even heard of.
Indie titles very rarely create their own engines from scratch. They are probably going to work fine, because I really doubt they are paying AMD and Nvidia to optimize their drivers for their games. They are probably sticking to a standard engine and not straying too far from the beaten path.
 
Well the real dilemma is whether or not to wait for reviews. The architecture looks to me to be a mining beast in the making and that doesn't bode well for those still seeking a new GPU.
 
Well the real dilemma is whether or not to wait for reviews. The architecture looks to me to be a mining beast in the making and that doesn't bode well for those still seeking a new GPU.
Don’t wait; all it needs to do is the job. It would have to be an unbelievably shitty card to be worse than nothing.
 
If it is good and in my preferred price range, I will get it. Because why not get an Intel GPU to go along with my AMD CPU?
 
There won't be a huge supply, but doesn't Intel have their own factories? Which means we wouldn't be splitting the chips like we are with AMD.
Yes, TSMC fabbed.
They have their own fabs, but it is unclear whether or not they are using them for graphics products.

Remember, Intel still has some process difficulties to overcome.

Chances are they - like everyone else - are going to TSMC or Samsung this round.
Yep.
You must be the first person I've met who isn't worried about Intel drivers. I haven't had stability issues for years with AMD, NV, or Intel drivers for the most part. These aren't integrated graphics that only need to run Farmland or 10-year-old games. Optimizations and game fixes for new games are the ticket. Intel has never focused on games, and when they did, it was more about just getting the game to run. Sure, if you want to display the desktop or run a spreadsheet they are fine, but that is not what we are focusing on with these new cards.
Drivers are Intel's primary focus in scaling up this launch now.
I thought it was announced some time ago they would be using TSMC 7nm?
Intel was initially TSMC's 6nm launch partner.
Turmoil.
 
If there's any sort of availability in the $400-600 segment, it could be a winner by default.

I would think that Intel would want to focus far more on maximizing the *number* of video cards it gets into people's hands, as opposed to immediate profits. That would be the smaller-die budget variants in the < $300 (MSRP) market that both Nvidia and AMD have been ignoring for so long.

Imo that gap, together with the current human-malware/mining-caused market conditions, is one of Intel's best opportunities yet to seriously and permanently enter the video card market.
 
I hear ya. In other circumstances these cards might have been sold for $399, but let's say the top card performs at RTX 3070 levels; those cards go for like $1100+ (insanity, yeah), and Intel could price theirs at $799 and still laugh all the way to the bank. I'm betting on a $399 "MSRP" but street pricing initially around $699-$799, rising to something like $899 within a couple of weeks.

This is like the best possible time to enter the graphics card market for sure; even if you double the price, the cards will still sell like hotcakes, haha.
 
Heck, 1660s are selling for almost $500 - Intel would be crazy not to price at $700 given they are virtually guaranteed to beat 1660 performance.
 
Heck, 1660s are selling for almost $500 - Intel would be crazy not to price at $700 given they are virtually guaranteed to beat 1660 performance.
If Intel doesn’t price in line with their competitors, given that the cards will basically instantly sell out, then the shareholders will riot. The only way we are seeing these cards priced dramatically lower than AMD's and Nvidia's direct performance competitors is if they aren’t selling.
 
If Intel doesn’t price in line with their competitors, given that the cards will basically instantly sell out, then the shareholders will riot. The only way we are seeing these cards priced dramatically lower than AMD's and Nvidia's direct performance competitors is if they aren’t selling.

Well then they're gonna riot, because Raja Koduri has already said they don't expect this generation to earn any profits.
 
I would assume that Intel sees it as a loss leader that will give them valuable feedback and possibly a market advantage in the future, so they are willing to take a loss while they gather telemetry and other user feedback and establish a dGPU market presence.
 
I just hope they have unlimited decoding/encoding with their gamer/retail cards. It would set them apart with the media crowds. Nvidia sets a 2-stream limit, iirc.

AMD is not all that great in comparison, but Intel should have a very capable decoder/encoder.
 
I just hope they have unlimited decoding/encoding with their gamer/retail cards. It would set them apart with the media crowds. Nvidia sets a 2-stream limit, iirc.

AMD is not all that great in comparison, but Intel should have a very capable decoder/encoder.
That would be my intention for a low-end card.
Plex, streaming, etc.
Nvidia professional cards that have unlimited streams are priced out of my reach for what I need.
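If Arc's encoders end up exposed through the same Quick Sync path ffmpeg already uses for Intel iGPUs (an assumption on my part), checking for a session cap would be as simple as kicking off a handful of hardware transcodes at once and counting how many survive. Rough sketch with placeholder file names:

```python
import subprocess

# Hypothetical smoke test for a hardware session cap: launch N QSV transcodes
# in parallel and count how many finish cleanly. Assumes an ffmpeg build with
# Quick Sync support; "sample.mp4" is a placeholder clip.
N = 6
procs = [
    subprocess.Popen(
        ["ffmpeg", "-y",
         "-hwaccel", "qsv",
         "-c:v", "h264_qsv",      # hardware decode
         "-i", "sample.mp4",
         "-c:v", "hevc_qsv",      # hardware encode
         f"out_{i}.mp4"],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    for i in range(N)
]
ok = sum(p.wait() == 0 for p in procs)
print(f"{ok} of {N} simultaneous streams completed")
```

Plex's transcoder is ffmpeg-based, so however many of those finish cleanly is roughly your simultaneous-stream budget.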
 