Intel Exec Discloses Discrete GPU Details and Strategy

Megalith

HotHardware attempted to grill Intel’s Ari Rauch (Vice President of the Core and Visual Computing Group and General Manager of the Visual Technologies Team) for new information regarding the company’s upcoming discrete GPU, which is “on track” for 2020. While it’s too early for Intel to show its cards, Rauch did respond to a variety of reader questions that provide insight into potential features such as multi-GPU support and which markets will be targeted initially.

Intel’s focus will be on industry standard APIs, but differentiating features are in the works as well that will likely require some proprietary framework. What those features are, we are not sure yet. Based on the teaser video and Intel’s interest in the community’s take on axial vs. blower-style fans, for example, it’s also safe to assume that desktop add-in cards are planned. The talk of enabling new and interesting form factors, however, also hints at hybrid solutions similar to Kaby Lake G. Eventually, both solutions will most likely appear.
 
Intel’s focus will be on industry standard APIs, but differentiating features are in the works as well that will likely require some proprietary framework. What those features are, we are not sure yet.

We found a copy of nvidia's playbook.
 
Will there be an actual product this time? Intel has been talking about graphics for a long time.
 
Intel moving into the discrete graphics card space is nothing but white noise. I'm sure AMD is licking their chops at Intel diversifying in a high-risk space where AMD and Nvidia reign supreme. Intel as a company feels "lost" right now. I can't help but feel like they are just throwing shit against the wall to see what sticks.
 
Probably a midrange gpu in 2020 that won't be as good as what nVidia and AMD offer but significantly better than what Intel has now. If they price it right, they will make inroads fast.
People buy computers with Intel graphics now because it is FREE. There is never an upcharge for Intel graphics. As soon as Intel goes into the AIB market with GPUs, unless they are FREE for OEMs to stick in their machines, there will be no inroads. Remember, Intel must now tread carefully when providing incentives to OEMs that give them something for free or more than free.
 
I'd expect any differentiating features to be mainly oriented towards productivity or computing, like perhaps taking their existing "Quick Sync" to another level. I doubt that they will launch a truly separate line of workstation cards right away, so they'll need something in their "gaming" cards to appeal broadly.
 
What are Intel graphics drivers like these days? That's a really good indication of whether they'll be ready for rolling with the big boys.
 
I’ll bet a lot of this is about renewing/updating/defending patents so they don’t lose them. They don’t really need this product to be successful.
 
People buy computers with Intel graphics now because it is FREE. There is never an upcharge for Intel graphics. As soon as Intel goes into the AIB market with GPUs, unless they are FREE for OEMs to stick in their machines, there will be no inroads. Remember, Intel must now tread carefully when providing incentives to OEMs that give them something for free or more than free.

They'll probably bundle CPU and GPU to undercut competitor GPU price.
 
I'd expect any differentiating features to be mainly oriented towards productivity or computing, like perhaps taking their existing "Quick Sync" to another level. I doubt that they will launch a truly separate line of workstation cards right away, so they'll need something in their "gaming" cards to appeal broadly.
They'll probably have some compute, vm, and other proprietary tech as incentives. It will almost certainly not be just graphics.
 
I read the entire article. I HATE market-speak non-answers; a Turing-response program could have generated those replies.
 
Intel will pull it off. Deep pockets... and even if it is not hugely successful, it will hopefully at least keep AMD from going full retard on prices, the way Ngreedia did with their arsonist edition of cards... :D
 
What are Intel graphics drivers like these days? That's a really good indication of whether they'll be ready for rolling with the big boys.

They're effective. Generally stable, etc.

And given the proliferation of their current architecture, they're supported pretty well universally.
 
Can I get in first for predicting a boilerplate copy of the NVidia NDA?
 
Really don't see how they're gonna compete with AMD on the cost/performance metric in the mainstream graphics segment. I doubt Nvidia is going to allow Intel to carve out any more of the pie than AMD already has.
 
They'll probably bundle CPU and GPU to undercut competitor GPU price.
I don't see how their shareholders would allow that to happen. "Let's introduce a new product that we intend to sell at a loss." They would be better off making a deal with the devil (Nvidia) and bundling Intel CPUs with Nvidia GPUs on the same SoC. At least that would be a product people would buy. I just don't see Intel having as firm of a market position as they did when Athlon 64 came out. Lisa Su is running AMD better than any other CEO ever has.

Raj was the Polaris/Vega guy. AMD dropped him and let Intel scoop him up because they knew that Intel doesn't have a realistic chance at the dGPU market. Picking 2020 as the dGPU deployment year for Intel is also horrible timing as the PS5 will be coming out, Xbox whatever will soon follow and dGPU performance is going to make another leap like it always does when new consoles come out.
 
Really don't see how they're gonna compete with AMD on the cost/performance metric in the mainstream graphics segment. I doubt Nvidia is going to allow Intel to carve out any more of the pie than AMD already has.

Intel already has plenty of the "pie". They have a huge foundation to build upon when they expand their offerings. It's Nvidia and AMD that should be worried IMO. Even many laptops that use Nvidia GPUs still use the Intel GPU via Optimus. They are hardly in a position to prevent Intel from doing anything.

View attachment 124630
 
dGPU performance is going to make another leap like it always does

This is interesting- inherently, it's kind of true, but it's hard to imagine that the correlation is more than circumstantial. Part of it seems to be based on production capabilities at TSMC (and now Samsung?), and part seems to relate to increases in production values with the coming generation of consoles.

Still, after the PS3 and 360, we saw the introduction of DX10 and specifically the Nvidia 8000-series, which brought massive improvements over the DX9 generation. With the Xbox and PS2, I don't think we saw that; the Xbox had a GF3-based GPU, and the GF4 wasn't a significant departure technologically. And now we're over a decade, right?

As to how that may or may not affect Intel's offering: the biggest issue I see is TSMC and Samsung at 7nm while Intel is still getting their 10nm process ready for mass production. If this Intel dGPU arrives on 10nm, it could perhaps be competitive at say the level that Vega is today, and that'd be a pretty good place to be.

It could also be a flop in one of many ways...
 
Intel already has plenty of the "pie". They have a huge foundation to build upon when they expand their offerings. It's Nvidia and AMD that should be worried IMO. Even many laptops that use Nvidia GPUs still use the Intel GPU via Optimus. They are hardly in a position to prevent Intel from doing anything.

View attachment 124630

I don't believe that is the discrete market. That is the notebook market; what you have to understand is that Intel gets a boost because all of their chips automatically include a built-in GPU and most laptops don't have a dedicated card. I do think AMD has something significant to gain there as they increase their APU share. I have seen more and more laptops coming out with Ryzen APUs with Vega graphics. Once they sort out their lack of OEM driver support, which they plan to address early next year with regular updates, they have much to gain there. I do expect AMD to take away some overall GPU market share in the coming years; heck, even Intel is using Vega M packaged on the die.

This is primarily the reason you are seeing Intel coming out with discrete GPUs in 2020. I think they are expecting AMD to gain more and more market share in the notebook market, so Intel really has to make it up any way they can.

Intel is worried and it shows, because they know AMD on 7nm is going to have a better and better integrated solution and will only gain market share. So what else can they do? They try to get into the discrete graphics market to gain whatever they can there.
 
Intel already has plenty of the "pie". They have a huge foundation to build upon when they expand their offerings. It's Nvidia and AMD that should be worried IMO. Even many laptops that use Nvidia GPUs still use the Intel GPU via Optimus. They are hardly in a position to prevent Intel from doing anything.

View attachment 124630

(discrete market)
 
When he was asked what kind of product they are going to release, whether it's an add-in card or an on-board chip, he said it will be a surprise.

So I'm thinking they will release something proprietary to their chipset, which has to be added on at extra cost with other chipsets like AMD's...

So with Intel's track record in discrete GPUs and the current climate of the industry, I am pretty skeptical that Intel is going to release anything worth a damn.

But I would be more than happy to be totally wrong and to welcome a serious new player to the GPU market with open wallet and arms.
 
They're effective. Generally stable, etc.

And given the proliferation of their current architecture, they're supported pretty well universally.

That's a big positive for them then. Intel IGP drivers used to be a byword for dreadful!
 
This is interesting- inherently, it's kind of true, but it's hard to imagine that the correlation is more than circumstantial. Part of it seems to be based on production capabilities at TSMC (and now Samsung?), and part seems to relate to increases in production values with the coming generation of consoles.

Still, after the PS3 and 360, we saw the introduction of DX10 and specifically the Nvidia 8000-series, which brought massive improvements over the DX9 generation. With the Xbox and PS2, I don't think we saw that; the Xbox had a GF3-based GPU, and the GF4 wasn't a significant departure technologically. And now we're over a decade, right?

As to how that may or may not affect Intel's offering: the biggest issue I see is TSMC and Samsung at 7nm while Intel is still getting their 10nm process ready for mass production. If this Intel dGPU arrives on 10nm, it could perhaps be competitive at say the level that Vega is today, and that'd be a pretty good place to be.

It could also be a flop in one of many ways...
I just don't understand why they are making a discrete push when integrated SoC is where we are heading with literally everything.
 
I just don't understand why they are making a discrete push when integrated SoC is where we are heading with literally everything.

I'm just hoping we get to see some prototypes leaked; it might even be better, in terms of collectible value, if they cancel the project entirely with those samples out in the wild.
 
I am expecting the first gen of Intel discrete to be mediocre/weak at best. But the second release will reveal whether Intel has the corporate ability to build and support a proper GPU.

They would never say so, but I am curious how much of their existing GPU tech will be involved.
 
I just don't understand why they are making a discrete push when integrated SoC is where we are heading with literally everything.

...we are?

I don't see this. As seen with the RTX release, as fast as GPUs can be made, they're still not fast enough. Chiplets might be a way of shoving more processing power into a single logical GPU, but that just means putting multiples of what we already have together. Probably.

And that's still way too massive for an SoC.
 
...we are?

I don't see this. As seen with the RTX release, as fast as GPUs can be made, they're still not fast enough. Chiplets might be a way of shoving more processing power into a single logical GPU, but that just means putting multiples of what we already have together. Probably.

And that's still way too massive for an SoC.
I'm not talking about high end gaming or high end professional. I'm talking about mid-high and lower. Intel is going to struggle to match today's Vega in 2020, let alone beat AMD or Nvidia's high end in 2020. Mobile phones and tablets are getting so powerful that ARM is going to be part of the equation in the 2020s for low/mid-low end computers. So yeah, for the mass market, tech is converging to SoC.

If Intel was able to push the envelope in SoC tech, instead of wasting resources on dGPU, Intel could literally take over the mass market world of tech.
 
How many times have we heard this from Intel now? Didn't they say the same thing recently?

i740
Larrabee

I'm not really sure whether I hope for it or not.
Pro: one more GPU player.
Con: more revenue for Intel, which can give them more resources to fight AMD in the CPU market.

Really wish we could get a real third player in the GPU race.
 