Shrout Says Intel Said 2020 on Discrete GPU

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,532
Ryan Shrout is reporting that Intel CEO Brian Krzanich stated that Intel will have its first discrete graphics chips available in 2020. No direct quote is given by Shrout, only paraphrasing, and it is a loose statement at best. Personally, I am betting on more like 2021 or 2022 for anything "competitive" in the gaming arena. Keep in mind that Intel is looking at GPUs for things far beyond mere gaming, so at this point, "discrete graphics chips" could mean just about anything.

Intel CEO Brian Krzanich disclosed during an analyst event last week that the company will have its first discrete graphics chips available in 2020. This will mark the beginning of the chip giant's journey toward a portfolio of high-performance graphics products for various markets including gaming, data center and artificial intelligence (AI).
 
Interested in seeing if they actually can deliver this time. Will be great if they do. 3 major players in the market again will be amazing.
 
With the money nVidia is making with their Tesla lineup I can see why Intel wants a piece of that pie.
 
Not the first time Intel will have tried, and failed, to enter the discrete GPU market. It's a classic case of what happens every time they try to stray too far from their core expertise.
 
This is a disaster waiting to happen for Intel. The problem isn't technical capability or fab capacity, it's the savage competition for executive mindshare among middle-level managers -- too many people angling to become one of the legion of Intel vice-presidents. Historically, no one could compete with the CPU business for that, it's where the money came from.

Flash/X-Point used entirely different business stacks, from the fabs and JVs up, so it could thrive in spite of the CPU-dominated culture. But discrete-GPU has to go toe-to-toe with the CPU guys for wafer starts on the leading-edge fabs. Worse, the CPU guys have screwed the pooch recently, and are going to be that much more vicious as a result, such as by fabbing low-yield monster chips that eat up all the fab capacity. Even if it isn't good for Intel to do that, it allows the CPU business to shut out their rivals in GPUs, so it's an office politics win, as long as your bullshit reasons for doing it can hoodwink the execs.

Back in the day, Intel's fabs were by far the best in the world, and could save Intel's bacon. But now even the fabs are having problems. This is going to make it even easier to use wafer start allocations as a weapon in the struggle for promotion.

Worst case, the office-politics war ends with a Pyrrhic victory for one or the other of the CPU and GPU camps, and both businesses flounder/sink as a result. Best case, that both CPU and GPU wind up industry leaders, is a long-shot bet of extraordinary magnitude. I wouldn't even bet that they'd both wind up competitive.
 
AMD CPU and an Intel GPU ... welcome to Bizarro world.
Maybe there will be an nVidia motherboard again by then, or an SSD, to complete the picture.

I always liked nVidia chipset motherboards... building the northbridge into the CPU ended chipsets as we knew them back then.

Normally I agree with Kyle, though not this time on the time-frames - Intel seems OK with rushing things out the door these days, and graphics cards may not be an exception. I'll add that I believe more eyes at nVidia and Intel are on AMD than most in the know would admit, watching to see if AMD pops any sort of surprise (totally off the radar) with graphics cards (gaming or consumer focused).
 
Normally I agree with Kyle, though not this time on the time-frames - Intel seems OK with rushing things out the door these days, and graphics cards may not be an exception. I'll add that I believe more eyes at nVidia and Intel are on AMD than most in the know would admit, watching to see if AMD pops any sort of surprise (totally off the radar) with graphics cards (gaming or consumer focused).
So you are suggesting that Intel will have a competitive gaming card in 2020? You know it is 2018 right now and AMD is already pushing hard to 7nm successfully and Intel is having issues at 10nm, right? So please tell me what you think Intel is going to "rush out the door?" Also, things I am hearing about Intel internally are far from rosy.
 
2020 seems optimistic, but it's hard to know where they are starting from.
They've been making GPUs the whole time, just integrated, so they aren't starting from 0 in terms of either hardware or software knowledge and processes internally, and they've been getting better (slowly).
If they are starting a new GPU track from scratch, dating from when they began bringing the AMD guys on board, then yeah, 2020 seems soon.
But if those guys were added to something already up and running, to get it over the finish line, then maybe we could see something by then. I'd be skeptical it'd be any kind of "flagship" competitive high-end product though; more likely a glorified proof of concept.
 
They're so going to have a 1080 killer in 2020.
i-see-what-you-did-there-meme.jpg
 
Shrout says 2020 for the discrete GPU... he didn't say a competitive GPU.
I don't see Intel coming out of the gate with gaming cards. Virtual desktop graphics and GPUs for server-scale stuff are where the profit margins are: Microsoft and Amazon are both paying sizable fortunes to nVidia for licensing access to Tesla and a number of other platforms. That's where Intel is aiming, and 2020 would put them right in time for their refresh cycle.

Just look at the price tag on the nVidia V100s and remember that Google, Microsoft, and Amazon recently purchased thousands of them each. They are all also paying yearly licensing fees for support and drivers.
 
I don't see Intel coming out of the gate with gaming cards. Virtual desktop graphics and GPUs for server-scale stuff are where the profit margins are: Microsoft and Amazon are both paying sizable fortunes to nVidia for licensing access to Tesla and a number of other platforms. That's where Intel is aiming, and 2020 would put them right in time for their refresh cycle.

Just look at the price tag on the nVidia V100s and remember that Google, Microsoft, and Amazon recently purchased thousands of them each. They are all also paying yearly licensing fees for support and drivers.

If Intel creates a VDI card, or even a card similar to the V100, they already have a capable gaming card. At that point it's just politics/price that dictates whether they market said card as a gaming card, and considering Intel currently has 0% of that market, I can see them coming out immediately with a gaming card as long as it doesn't cost them a fortune to manufacture. The only reasons nVidia doesn't release a V100 as a gaming card are 1. to keep consumers buying GPUs with small performance jumps per generation, and 2. to keep companies from buying cheaper consumer cards for this crucial market space.
 
Why do I get the feeling this Intel GPU will be the Matrox of the modern day? Meaning it will exist, but won't be truly competitive and will only benefit a niche crowd.
 
If Intel creates a VDI card, or even a card similar to the V100, they already have a capable gaming card. At that point it's just politics/price that dictates whether they market said card as a gaming card, and considering Intel currently has 0% of that market, I can see them coming out immediately with a gaming card as long as it doesn't cost them a fortune to manufacture. The only reasons nVidia doesn't release a V100 as a gaming card are 1. to keep consumers buying GPUs with small performance jumps per generation, and 2. to keep companies from buying cheaper consumer cards for this crucial market space.
When you put out a consumer product, there is a lot more that has to go into drivers and support. It costs more, and Intel isn't really set up for direct consumer sales. AMD and nVidia currently scramble every few weeks to get their drivers ready for the next AAA launch title and spend a few months after that tweaking for maximum performance. Can you really see Intel shifting their business model and current driver release schedule to fall in line with that in such a short time? They barely keep up as is with their video drivers.
 
Another 'Raja Promise'.

Won't happen, for us. He may have blamed AMD's lack of resources for not delivering his GPU innovations, but I'd argue he simply had unrealistic release schedules given the known environment/constraints.

He'll inevitably just find more walls to run up against at Intel, perpetually 'looking forward', rarely in the actual now where it counts.

And I don't think he's changed as far as we're concerned. I don't see Intel fighting hard to deliver a competitive gamer GPU. Why would they bother? And at what cost?
 
When you put out a consumer product, there is a lot more that has to go into drivers and support. It costs more, and Intel isn't really set up for direct consumer sales. AMD and nVidia currently scramble every few weeks to get their drivers ready for the next AAA launch title and spend a few months after that tweaking for maximum performance. Can you really see Intel shifting their business model and current driver release schedule to fall in line with that in such a short time? They barely keep up as is with their video drivers.

You overestimate how much work is required. I can take a Tesla C2075, a Tesla GRID 2.0, or an AMD Sky 900 and run a game on each of them. I doubt any of these cards had any significant amount of optimization for games, and yet all of them work fine. And Intel wouldn't need to either. If they made a very competitive card, people would buy it; even if they never touched the drivers again, they could sell cards.
 
Another 'Raja Promise'.

Won't happen, for us. He may have blamed AMD's lack of resources for not delivering his GPU innovations, but I'd argue he simply had unrealistic release schedules given the known environment/constraints.

He'll inevitably just find more walls to run up against at Intel, perpetually 'looking forward', rarely in the actual now where it counts.

And I don't think he's changed as far as we're concerned. I don't see Intel fighting hard to deliver a competitive gamer GPU. Why would they bother? And at what cost?

If their compute card deviates further from the Phi and closer to a typical GPU, there is very little work required to make a gaming GPU. If nVidia ONLY made a Tesla V100 and felt like making a gaming card, they would have no issue cutting the VRAM and throwing the same core on a gaming card, and it would work well. Cost MAY be an issue if Intel fucks up in manufacturing the cards, but even if they created a workstation card with a hefty price tag, that card could easily be gamed on.
 
Nvidia doing AMD chipsets again..... That would be crazy

God help us if they do this. When they "first" came out (they absolutely helped AMD motherboards be as good as possible during that time, along with the Intel variations), they were actually pretty decent (compared to what was on the market at the time).

They also chose not to bother updating to stay relevant. Loads of problems, from solder issues to overheating, poor power management, bad IRQ selections, and poor overall performance across their various chipsets (nForce 2 all the way to MCP78), made them "not so good" in a moderately short amount of time, so they up and left the motherboard market.

People bitch, moan, and complain that AMD has hot-running, power-hungry GPUs (and CPUs in some cases); just imagine how much Nv would shit in AMD's bed if they did a Ryzen chipset. I'm pretty sure Nv would not even suggest doing so (they take every opportunity to castrate the potential performance of all Nv-branded products that have to "co-operate" with anything AMD based, especially Radeons). Nv wants 100% control or they are not interested, it seems to me.

------------------------------------------------------------------

As for the article or this idea in question

100000000000% doubt Intel will be releasing any gamer-oriented GPU in the foreseeable future that is "competitive" in price/perf, perf/watt, or absolute performance, or as compatible with DX or Vulkan as what AMD or Nv are currently bringing to the table (hell, even Nv is not as fully compliant with DX10/11/12 as AMD is; just imagine how piss-poor an Intel variant would end up being, unless they have a backseat deal with AMD, which I also highly doubt).

Intel might have deep pockets, and they are a behemoth when it comes to CPU design, but the team they have on hand is comparatively novice (next to the 2 giants in GPU land), not to mention the wide array of patents and such that AMD and Nv hold in this regard.

Raja Koduri and Jim Keller might be well-established veterans in the industry for what they can do and have done, but they are likely just as deeply constrained in what they want to do vs. what they will be able to do for Intel (legal obligations or whatever).

I wonder if they (Intel) would decide to do what they do with their CPUs and motherboards, i.e. if you do not buy above X socket/chipset tier it operates at 1/4 speed (not using full PCI-e bandwidth), or will cheap out on the thermal interface (using the lowest-quality TIM on the GPU core instead of solder) ^.^
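
For a rough sense of what that "1/4 speed" scenario would mean in practice (back-of-the-envelope figures only): PCI-e 3.0 moves roughly 985 MB/s per lane per direction, so a full x16 link is about 16 x 985 MB/s ≈ 15.8 GB/s, while an x4 link is about 4 x 985 MB/s ≈ 3.9 GB/s. Segmenting GPUs by link width the way chipsets are segmented today would be a very visible cut.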
 
Another 'Raja Promise'.

Won't happen, for us. He may have blamed AMD's lack of resources for not delivering his GPU innovations, but I'd argue he simply had unrealistic release schedules given the known environment/constraints.

He'll inevitably just find more walls to run up against at Intel, perpetually 'looking forward', rarely in the actual now where it counts.

And I don't think he's changed as far as we're concerned. I don't see Intel fighting hard to deliver a competitive gamer GPU. Why would they bother? And at what cost?

Also, he kept moving the goalposts and forcing the engineers to start over again and again. I don't think he's exactly an asset for Intel...
 
The article from the WCCFTech link that rgMekanic front-paged also said AMD's 7nm is like Intel's 10nm. Considering how much lead time Intel has squandered, it's still really sad.
 
Well, the world's #1 supercomputer is now powered by POWER chips... and 27,648 Nvidia Tesla chips.

IBM has an open license on POWER, which can more than handle server, cloud, and data center workloads... and of course supercomputer duties.

ARM has an open license... and although not widespread yet, ARM parts are starting to make noise in the server market. (With Qualcomm's Centriq retreat... it's possible a new ARM server company steps in, or that the big data centers simply move to POWER.)

And somewhere in that mess, RISC-V is 100% free to use... Nvidia is moving to RISC-V to handle the internals of their GPUs. Western Digital has said that in the 2020s they expect to be shipping billions of RISC-V chips a year integrated with their storage solutions. (If storage AI is handled directly by the storage hardware... that greatly reduces the load on data center CPUs... making POWER and ARM even more attractive.)

My point... Intel needs to get into the discrete GPU business as fast as possible. The more compute work being done by Nvidia GPUs, purpose-built ASICs, and direct storage controllers, the less the big-boy customers really need Intel CPUs. It's very possible that cloud / data center machines in the mid-2020s will feature thousands of GPU-like chips crunching AI... and every connected storage drive will have its own on-board RISC-V chips powering AI storage algorithms. All connected by a Linux OS that can just as easily be running ARM or POWER chips instead of x86.
 
Well, the world's #1 supercomputer is now powered by POWER chips... and 27,648 Nvidia Tesla chips.

IBM has an open license on POWER, which can more than handle server, cloud, and data center workloads... and of course supercomputer duties.

ARM has an open license... and although not widespread yet, ARM parts are starting to make noise in the server market. (With Qualcomm's Centriq retreat... it's possible a new ARM server company steps in, or that the big data centers simply move to POWER.)

And somewhere in that mess, RISC-V is 100% free to use... Nvidia is moving to RISC-V to handle the internals of their GPUs. Western Digital has said that in the 2020s they expect to be shipping billions of RISC-V chips a year integrated with their storage solutions. (If storage AI is handled directly by the storage hardware... that greatly reduces the load on data center CPUs... making POWER and ARM even more attractive.)

My point... Intel needs to get into the discrete GPU business as fast as possible. The more compute work being done by Nvidia GPUs, purpose-built ASICs, and direct storage controllers, the less the big-boy customers really need Intel CPUs. It's very possible that cloud / data center machines in the mid-2020s will feature thousands of GPU-like chips crunching AI... and every connected storage drive will have its own on-board RISC-V chips powering AI storage algorithms. All connected by a Linux OS that can just as easily be running ARM or POWER chips instead of x86.

Keep in mind 4 of the top 10 supercomputers use Intel Phi coprocessors and 4/10 use Xeon CPUs. Intel doesn't need to get into the discrete GPU business for supercomputing's sake, as that market is already well populated by Intel Phi coprocessors, which honestly work VERY well. There isn't really much issue with x86 chips, so I don't see a huge move away from them; however, IBM's new POWER CPUs are EXTREMELY impressive and, last I looked, way cheaper than a comparable Xeon. Another thing to keep in mind: a well-optimized program will also be significantly better for price/performance on a Phi than on a Tesla, so why would anyone make the switch to a GPU when the Phis are easier to program for and cheaper?
 
Keep in mind 4 of the top 10 supercomputers use Intel Phi coprocessors and 4/10 use Xeon CPUs. Intel doesn't need to get into the discrete GPU business for supercomputing's sake, as that market is already well populated by Intel Phi coprocessors, which honestly work VERY well. There isn't really much issue with x86 chips, so I don't see a huge move away from them; however, IBM's new POWER CPUs are EXTREMELY impressive and, last I looked, way cheaper than a comparable Xeon. Another thing to keep in mind: a well-optimized program will also be significantly better for price/performance on a Phi than on a Tesla, so why would anyone make the switch to a GPU when the Phis are easier to program for and cheaper?

I agree with you for the most part.... except that Intel has canceled the Phi chips and announced they will be replaced by "a new architecture".

Aurora was supposed to be Intel's answer to the IBM/NV Summit machine, using Xeon and Phi. Those plans have been scrapped, with the Department of Energy pushing Aurora to 2021 or so.

If I were betting, I would say Intel is looking at what NV is doing and realizing that spending billions of R&D money on an architecture with one use is not smart. I would expect their new "GPU" will be capable of being scaled for use in everything from APUs and gaming GPUs... to supercomputers, machine learning, automotive processing packs, etc.

We'll see what Intel comes up with... I don't think it's hyperbole to say their future is riding on the project. The x86 CPUs become more likely to be replaced every year; more and more work that used to be x86 territory is being done by other chips. Not to mention that Intel is unwilling to license x86... and there are a lot of companies with the capital and the will to create their own chips these days. ARM and IBM are both willing to provide the foundation for a reasonable cost, and RISC-V, although newer and with less of an ecosystem, is completely free.
 
We have seen Intel claim to be entering the enthusiast GPU market before. I will believe it when I see it for sale. I remember some of the previous attempts that had hype and faded to nothingness.
 
I agree with you for the most part.... except that Intel has canceled the Phi chips and announced they will be replaced by "a new architecture".

Aurora was supposed to be Intel's answer to the IBM/NV Summit machine, using Xeon and Phi. Those plans have been scrapped, with the Department of Energy pushing Aurora to 2021 or so.

If I were betting, I would say Intel is looking at what NV is doing and realizing that spending billions of R&D money on an architecture with one use is not smart. I would expect their new "GPU" will be capable of being scaled for use in everything from APUs and gaming GPUs... to supercomputers, machine learning, automotive processing packs, etc.

We'll see what Intel comes up with... I don't think it's hyperbole to say their future is riding on the project. The x86 CPUs become more likely to be replaced every year; more and more work that used to be x86 territory is being done by other chips. Not to mention that Intel is unwilling to license x86... and there are a lot of companies with the capital and the will to create their own chips these days. ARM and IBM are both willing to provide the foundation for a reasonable cost, and RISC-V, although newer and with less of an ecosystem, is completely free.

I can guarantee we will see another revision of the Phis (possibly under a different name), and that Intel will carry a large majority of what is in the current Phis forward, as it did a pretty good job. And the Phis are quite a bit more versatile than a typical GPU, as they can natively run just about any code, or even a standard version of Windows or Linux. The only way the Phis lost was in marketing, or possibly the software developed for them (although it's easier than developing for a GPU); that may continue into the first generations of the Intel GPU.

I am completely unsure of the gaming capabilities an Intel card will have; however, I am certain Intel could create a competitive compute card, as they have done that already.

x86 CPUs will only be replaced if they are the worse option at the time. If the RISC manufacturers sleep, x86 will gain market share again; if x86 doesn't improve, RISC will do better.
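
To put the "easier to program for" point in concrete terms, here is a minimal sketch (the saxpy example and the compiler flags are illustrative only; -mmic is from the old Knights Corner toolchain): the same plain C/OpenMP source runs on a Xeon host or natively on a Phi with nothing but a recompile, whereas a discrete GPU needs the loop rewritten as a CUDA/OpenCL kernel plus explicit host/device memory copies.

Code:
/* saxpy.c -- the same source builds for a Xeon host (e.g. icc -qopenmp saxpy.c)
 * or natively for a Knights Corner Phi (e.g. icc -qopenmp -mmic saxpy.c).
 * No kernel language, no explicit device memory management. */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

static void saxpy(long n, float a, const float *x, float *y)
{
    #pragma omp parallel for   /* spreads the loop across all available cores/threads */
    for (long i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const long n = 1L << 24;
    float *x = malloc(n * sizeof *x);
    float *y = malloc(n * sizeof *y);
    if (!x || !y)
        return 1;
    for (long i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    double t0 = omp_get_wtime();
    saxpy(n, 3.0f, x, y);
    double t1 = omp_get_wtime();

    printf("saxpy across %d threads took %.3f ms\n",
           omp_get_max_threads(), (t1 - t0) * 1e3);
    free(x);
    free(y);
    return 0;
}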
 
Let's sit back and watch Raja promise the Earth at some cringe-worthy "event" and deliver a 3rd-place GPU in a 3-horse GPU race.

Competition in this market is what we need, but it actually has to be competition, not just another Matrox Millennium wannabe.

Plus take anything Shrout has to say with a large pinch of salt, especially if it concerns anything regarding one of his only "customers".
 
I had been led to believe the common wisdom was that the processes from Samsung, AMD, and others, compared to Intel's, tend to have transistors a bit larger than the stated number, i.e. AMD's 7nm is more like Intel's 10nm in terms of raw size.
Xnm has mostly been marketing for a while now.
However, AMD already has a lead on Intel, as Intel can't even get 10nm out the door, outside of some shitty dual-core i3 in a Lenovo, which can't even use its iGPU because the 10nm defect rate is too high. 10nm is a big, smouldering, shitty dumpster fire. It's funny that Nvidia also skipped 10nm after trying for a while; 10nm is cursed!
And then there's Intel needing a 2kW water chiller to barely compete with an air-cooled 32-core Threadripper 2 - almost laughable how far Intel has fallen. It's just like the mid-2000s all over again with the Athlon 64, except this time it's not an IMC advantage, it's a process advantage, so the playing field is much more level since most of the free lunch is gone these days. AMD is also making more money now and has more mindshare, as we are not in the 'wild west' days of the internet anymore, so it will be a long road for Intel to catch up.
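
For reference, the published density numbers roughly bear that out (approximate figures, and transistor density is not the whole story): Intel quotes its 10nm process at about 100 million transistors per mm², while TSMC's 7nm, which AMD is moving to, is generally put in the low-to-mid 90s of MTr/mm². Different names, same ballpark.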

The only saviour I see for Intel: a new process medium and technology, e.g. not silicon. Maybe someone will have figured out how to make optical CPUs for them by 2020.
 

[Attachment: 7nm Comparisons.jpg]
God help us if they do this. When they "first" came out (they absolutely helped AMD motherboards be as good as possible during that time, along with the Intel variations), they were actually pretty decent (compared to what was on the market at the time).

They also chose not to bother updating to stay relevant. Loads of problems, from solder issues to overheating, poor power management, bad IRQ selections, and poor overall performance across their various chipsets (nForce 2 all the way to MCP78), made them "not so good" in a moderately short amount of time, so they up and left the motherboard market.

People bitch, moan, and complain that AMD has hot-running, power-hungry GPUs (and CPUs in some cases); just imagine how much Nv would shit in AMD's bed if they did a Ryzen chipset. I'm pretty sure Nv would not even suggest doing so (they take every opportunity to castrate the potential performance of all Nv-branded products that have to "co-operate" with anything AMD based, especially Radeons). Nv wants 100% control or they are not interested, it seems to me.

------------------------------------------------------------------

As for the article or this idea in question

100000000000% doubt Intel will be releasing any gamer-oriented GPU in the foreseeable future that is "competitive" in price/perf, perf/watt, or absolute performance, or as compatible with DX or Vulkan as what AMD or Nv are currently bringing to the table (hell, even Nv is not as fully compliant with DX10/11/12 as AMD is; just imagine how piss-poor an Intel variant would end up being, unless they have a backseat deal with AMD, which I also highly doubt).

Intel might have deep pockets, and they are a behemoth when it comes to CPU design, but the team they have on hand is comparatively novice (next to the 2 giants in GPU land), not to mention the wide array of patents and such that AMD and Nv hold in this regard.

Raja Koduri and Jim Keller might be well-established veterans in the industry for what they can do and have done, but they are likely just as deeply constrained in what they want to do vs. what they will be able to do for Intel (legal obligations or whatever).

I wonder if they (Intel) would decide to do what they do with their CPUs and motherboards, i.e. if you do not buy above X socket/chipset tier it operates at 1/4 speed (not using full PCI-e bandwidth), or will cheap out on the thermal interface (using the lowest-quality TIM on the GPU core instead of solder) ^.^

Side note: if these are built on Intel's previous attempts (Phi), then it would be DirectX compatible, as in every version.
 
Nvidia doing AMD chipsets again..... That would be crazy
SIGH I miss nForce. nForce 2 and 4 were especially awesome.

Not the first time Intel will have tried, and failed, to enter the discrete GPU market. It's a classic case of what happens every time they try to stray too far from their core expertise.
Haha yupz. Can't wait to see what kind of trash they come up with this time. Intel, making a real GPU? Yeah good luck with that.

Shrout says 2020 for the discrete GPU... he didn't say a competitive GPU.
Indeeeeed. I'm all for more competition, the GPU market has sorely needed it for many years, but I don't think Intel is gonna be the one to make waves in this pool. Man, where the fuck is Matrox? You know what I want? For RTG to be free of AMD and become ATI again.
 