Shrout Says Intel Said 2020 on Discrete GPU

Discussion in '[H]ard|OCP Front Page News' started by Kyle_Bennett, Jun 12, 2018.

  1. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    53,974
    Joined:
    May 18, 1997
    Ryan Shrout is reporting that Intel CEO Brian Krzanich stated that Intel will have its first discrete graphics chips available in 2020. No direct quote is given by Shrout, only paraphrasing, and it is a loose statement at best. Personally I am betting on more like 2021 or 2022 for anything "competitive" in the gaming arena. Keep in mind that Intel is looking at GPUs for things far beyond mere gaming, so at this point, "discrete graphics chips" could mean just about anything.

    Intel CEO Brian Krzanich disclosed during an analyst event last week that it will have its first discrete graphics chips available in 2020. This will mark the beginning of the chip giant’s journey toward a portfolio of high-performance graphics products for various markets including gaming, data center and artificial intelligence (AI).
     
    nEo717 likes this.
  2. exlink

    exlink 2[H]4U

    Messages:
    3,665
    Joined:
    Dec 16, 2006
    Fuck yes. More competition.
     
  3. Taldren

    Taldren Limp Gawd

    Messages:
    485
    Joined:
    Nov 28, 2006
    AMD CPU and an Intel GPU ... welcome to Bizarro world.
    Maybe there will be an nVidia motherboard again by then, or an SSD, to complete the picture.
     
    KarsusTG, jnemesh, mynamehere and 3 others like this.
  4. Lakados

    Lakados Gawd

    Messages:
    696
    Joined:
    Feb 3, 2014
    Nvidia doing AMD chipsets again..... That would be crazy
     
    DrezKill, jnemesh, mynamehere and 2 others like this.
  5. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    15,407
    Joined:
    Jan 28, 2014
    Interested in seeing if they can actually deliver this time. It will be great if they do. Three major players in the market again would be amazing.
     
    Templar_X and mynamehere like this.
  6. Ehren8879

    Ehren8879 Little Bitch 3

    Messages:
    4,292
    Joined:
    Sep 24, 2004
    What else did Intel get out of that AMD graphics deal?

    Besides Raja :)
     
  7. Lakados

    Lakados Gawd

    Messages:
    696
    Joined:
    Feb 3, 2014
    With the money nVidia is making with their Tesla lineup I can see why Intel wants a piece of that pie.
     
  8. Advil

    Advil [H]ard|Gawd

    Messages:
    1,729
    Joined:
    Jul 16, 2004
    Not the first time Intel will have tried, and failed, to enter the discrete GPU market. It's a classic case of what happens every time they try to stray too far from their core expertise.
     
    jnemesh and dragonstongue like this.
  9. Gottfried Leibnizzle

    Gottfried Leibnizzle Limp Gawd

    Messages:
    200
    Joined:
    Apr 29, 2015
    Hm. Methinks they want a piece of the very lucrative crypto hardware market more than anything else GPU related.
     
  10. pcgeekesq

    pcgeekesq [H]ard|Gawd

    Messages:
    1,306
    Joined:
    Apr 23, 2012
    This is a disaster waiting to happen for Intel. The problem isn't technical capability or fab capacity, it's the savage competition for executive mindshare among middle-level managers -- too many people angling to become one of the legion of Intel vice-presidents. Historically, no one could compete with the CPU business for that, it's where the money came from.

    Flash/X-Point used entirely different business stacks, from the fabs and JVs up, so it could thrive in spite of the CPU-dominated culture. But discrete-GPU has to go toe-to-toe with the CPU guys for wafer starts on the leading-edge fabs. Worse, the CPU guys have screwed the pooch recently, and are going to be that much more vicious as a result, such as by fabbing low-yield monster chips that eat up all the fab capacity. Even if it isn't good for Intel to do that, it allows the CPU business to shut out their rivals in GPUs, so it's an office politics win, as long as your bullshit reasons for doing it can hoodwink the execs.

    Back in the day, Intel's fabs were by far the best in the world, and could save Intel's bacon. But now even the fabs are having problems. This is going to make it even easier to use wafer start allocations as a weapon in the struggle for promotion.

    Worst case, the office-politics war ends with a Pyrrhic victory for one or the other of the CPU and GPU camps, and both businesses flounder/sink as a result. Best case, that both CPU and GPU wind up industry leaders, is a long-shot bet of extraordinary magnitude. I wouldn't even bet that they'd both wind up competitive.
     
    clockdogg likes this.
  11. nEo717

    nEo717 Limp Gawd

    Messages:
    173
    Joined:
    Jun 2, 2017
    I always liked nVidia chipset motherboards... building the north bridge into the CPU ended chipsets as we knew them back then.

    Normally I agree with Kyle, but not this time on the time frames - Intel seems OK with rushing things out the door these days, and graphics cards may not be an exception to that. I'll add that I believe more eyes at nVidia and Intel are on AMD than most in the know would admit, waiting to see if they pop any sort of surprise (totally off the radar) with graphics cards (gaming or consumer focused).
     
  12. Master_shake_

    Master_shake_ Little Bitch

    Messages:
    7,509
    Joined:
    Apr 9, 2012
    What F word is Intel going to use for reference cards?
     
  13. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    53,974
    Joined:
    May 18, 1997
    So you are suggesting that Intel will have a competitive gaming card in 2020? You know it is 2018 right now, and AMD is already pushing hard to 7nm successfully while Intel is having issues at 10nm, right? So please tell me what you think Intel is going to "rush out the door." Also, things I am hearing about Intel internally are far from rosy.
     
  14. Fifth Horseman

    Fifth Horseman Limp Gawd

    Messages:
    464
    Joined:
    May 5, 2000
  15. MartinX

    MartinX One Hour Martinizing While You Wait

    Messages:
    8,625
    Joined:
    Jan 23, 2003
    2020 seems optimistic, but it's hard to know where they are starting from.
    They've been making GPUs the whole time, just integrated, so they aren't starting from 0 in terms of either hardware or software knowledge and processes internally, and they've been getting better (slowly).
    If they are starting a new GPU track from scratch, dating from when they began bringing the AMD guys onboard, then yeah, 2020 seems soon.
    But if those guys were added to something already up and running, to get it over the finish line, then maybe we could see something by then. I'd be skeptical it'd be any kind of "flagship" competitive high-end product though; more likely a glorified proof of concept.
     
    Armenius likes this.
  16. shpankey

    shpankey Limp Gawd

    Messages:
    138
    Joined:
    May 27, 2005
    Shrout says 2020 for the discrete GPU... he didn't say a competitive GPU.
     
    katanaD, Jim Kim, N4CR and 2 others like this.
  17. vegeta535

    vegeta535 2[H]4U

    Messages:
    2,302
    Joined:
    Jul 19, 2013
    They are so going to have a 1080 killer in 2020.
     
    {NG}Fidel and Armenius like this.
  18. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    They already have hardware that does pretty dang well at mining. I run a bunch of the Intel Phi CPUs, and on CryptoNote (Monero) they kill every GPU out there.

    As stated, it depends on what a discrete graphics chip would be. If Intel wanted, they could just throw Iris graphics on a modified Phi card and have a decent compute card similar to the VCA 2. They could probably ship that in 2019 if they felt like it. As far as a GPU capable of gaming goes, the only bottleneck I see is Intel coming up with a decent software solution, as Intel already knows how to make both a GPU and powerful compute cards.
     
  19. motomonkey

    motomonkey [H]ard|Gawd

    Messages:
    1,237
    Joined:
    Jan 17, 2009
    [image: i-see-what-you-did-there meme]
     
  20. Lakados

    Lakados Gawd

    Messages:
    696
    Joined:
    Feb 3, 2014
    I don't see Intel coming out of the gate with gaming cards. Virtual desktop graphics and GPUs for server-scale stuff are where the profit margins are; Microsoft and Amazon are both paying sizable fortunes to nVidia for licensing access to Tesla and a number of other platforms. That's where Intel is aiming, and 2020 would put them right in time for their refresh cycle.

    Just look at the price tag on the nVidia V100s and remember that Google, Microsoft, and Amazon recently purchased thousands of them each. They are all also paying yearly licensing fees for support and drivers.
     
    the901 likes this.
  21. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    If Intel creates a VDI card, or even a card similar to the V100, they already have a capable gaming card. At that point it's just politics/price that dictates whether they market said card as a gaming card, and considering Intel currently has 0% of that market, I can see them coming out immediately with a gaming card as long as it doesn't cost them a fortune to manufacture. The only reasons nVidia doesn't release a V100 as a gaming card are 1. they can keep consumers buying GPUs with small performance jumps per generation, and 2. to keep companies from buying cheaper consumer cards for this crucial market space.
     
    Armenius and nEo717 like this.
  22. nEo717

    nEo717 Limp Gawd

    Messages:
    173
    Joined:
    Jun 2, 2017
    Change is never easy for the fat cats (at Intel) who sat back and coasted for a moment, so I have little doubt about the uneasiness at Intel right now, and not everyone is on the love wagon with the new hire in the graphics department. That said, Raja may not be one to commit well to just one path, and I'm sure most of the teams at Intel have end dates that are farther out (than 2020). However, Raja (or his team, making up for his undecided nature in picking a platform) pushed Vega out the door at AMD, ready or not, much like I'd expect him (and his new team) to do with an Intel solution, perhaps fully ready or not.

    Competitive in 2020? My money is on yes (mid-range). Taking the crown in 2020? Nope... Intel has to make something in that new AZ 7nm plant, after all.
     
  23. Teenyman45

    Teenyman45 2[H]4U

    Messages:
    2,090
    Joined:
    Nov 29, 2010
    Just a possible clarification from somebody looking on from the outside: the perception I had been led to believe was common wisdom is that, compared to Intel, the non-Intel processes from Samsung, AMD, and others tend to have transistors a bit larger than the stated number, i.e. AMD 7nm is more like Intel's 10nm in terms of raw size.
     
    nEo717 likes this.
  24. Bomber

    Bomber [H]ard|Gawd

    Messages:
    1,124
    Joined:
    Jan 14, 2002
    Fifth Horseman and nEo717 like this.
  25. HorseproofBacon

    HorseproofBacon Limp Gawd

    Messages:
    261
    Joined:
    Nov 22, 2016
    Why do I get the feeling this Intel GPU will be the Matrox of the modern day? Meaning it will exist, but won't be truly competitive and will only benefit a niche crowd.
     
    nEo717 likes this.
  26. Lakados

    Lakados Gawd

    Messages:
    696
    Joined:
    Feb 3, 2014
    When you put out a consumer product there is a lot more that has to go into drivers and support. It costs more, and Intel isn't really set up for direct consumer sales. AMD and nVidia currently scramble every few weeks to get their drivers ready for the next AAA launch title, then spend a few months after that tweaking for maximum performance. Can you really see Intel shifting their business model and current driver release schedule to fall in line with that in such a short time? They barely keep up as is with their video drivers.
     
  27. spine

    spine 2[H]4U

    Messages:
    2,365
    Joined:
    Feb 4, 2003
    Another 'Raja Promise'.

    Won't happen, for us. He may have blamed AMD's lack of resources for not delivering his GPU innovations, but I'd argue he had unrealistic release schedules given the known environment/constraints.

    He'll inevitably just find more walls to run up against at Intel, perpetually 'looking forward', rarely in the actual now where it counts.

    And I don't think he's changed as far as we're concerned. I don't see Intel fighting hard to deliver a competitive gamer GPU. Why would they bother? And at what cost?
     
    jnemesh likes this.
  28. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    You overestimate how much work is required. I can take a Tesla C2075, a Tesla GRID 2.0, and an AMD Sky 900 and run a game on each of them. I doubt any of these cards had any significant amount of optimization for games, and yet all of them work fine. And Intel wouldn't need to either: if they made a very competitive card, people would buy it. Even if they never touched the drivers again, they could sell cards.
     
  29. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    If their compute card deviates farther from the Phi and closer to a typical GPU, there is very little work required to make a gaming GPU. If nVidia ONLY made a Tesla V100 and felt like making a gaming card, they would have no issue cutting the VRAM and throwing the same core on a gaming card, and it would work well. Cost MAY be an issue if Intel fucks up in manufacturing the cards, but even if they created a workstation card with a hefty price tag, that card could easily be gamed on.
     
    Last edited: Jun 12, 2018
  30. dragonstongue

    dragonstongue 2[H]4U

    Messages:
    2,911
    Joined:
    Nov 18, 2008
    God help us if they do this. When they "first" came out (they absolutely helped AMD make their motherboards as good as possible at the time, along with Intel variations), they were actually pretty decent (compared to what was on the market at the time).

    They also chose not to bother updating to keep things relevant. Loads of problems, from solder issues to overheating, poor power management, bad IRQ selections, and poor performance from their various chipsets (nForce 2 all the way to MCP78), made them "not so good" in a moderately short amount of time, so they up and left the motherboard market.

    People bitch, moan, and complain that AMD has hot-running, power-hungry GPUs (and CPUs in some cases); I can just imagine how much Nv would shit in AMD's bed if they did a Ryzen chipset. Pretty sure Nv would not even suggest doing so (as they take every opportunity to castrate the potential performance of all Nv-branded products that have to "co-operate" with anything AMD based, especially Radeons). Nv wants 100% control or they are not interested, it seems to me.

    ------------------------------------------------------------------

    As for the article or this idea in question

    100000000000% doubt Intel will be releasing any gamer-oriented GPU in the foreseeable future that is "competitive" in price/perf, perf/watt, or absolute performance, or as compatible with DX or Vulkan as what AMD or Nv are currently bringing to the table (hell, even Nv is not as fully compliant with DX10/11/12 as AMD is; I can just imagine how piss-poor an Intel variant would end up being, unless they have a backseat deal with AMD, which I also highly doubt).

    Intel might have deep pockets, and they are a behemoth when it comes to CPU design, but whatever team they have on hand is likely as novice as possible (compared to the two giants in GPU land), not to mention the wide array of patents and such that AMD and Nv hold in this regard.

    Raja Koduri and Jim Keller might be very well-established veterans in the industry for what they can do and have done, but they are likely just as deeply constrained in what they want to do vs. what they will be able to do for Intel (legal obligations or whatever).

    I wonder if they (Intel) would decide to do what they do with their CPUs and motherboards, that is, if you do not get above X socket variation it operates at 1/4 speed (does not use full PCI-e bandwidth), or cheap out on the thermal interface (using the lowest-quality TIM inside the GPU core instead of solder) ^.^
     
  31. jnemesh

    jnemesh Gawd

    Messages:
    973
    Joined:
    Jan 21, 2013
    Also he kept moving the goalposts, and kept forcing the engineers to start over again and again. I don't think he's exactly an asset for Intel...
     
    N4CR likes this.
  32. Teenyman45

    Teenyman45 2[H]4U

    Messages:
    2,090
    Joined:
    Nov 29, 2010
    The article from the link to WCCFTech that rgMekanic front-paged also said AMD 7nm is like Intel's 10nm. Considering how much lead time Intel has squandered, it's still really sad.
     
  33. ChadD

    ChadD I Love TEXAS

    Messages:
    3,158
    Joined:
    Feb 8, 2016
    Well, the world's #1 supercomputer is now powered by Power chips... and 27,648 Nvidia Tesla chips.

    IBM has an open license on Power, which can more than handle server, cloud, and data center workloads... and of course supercomputer duties.

    ARM has an open license... and although not widespread yet, ARM chips are starting to make noise in the server market. (With Qcom's Centriq retreat... it's possible a new ARM server company steps in, or that the big data centers simply move to Power.)

    And somewhere in that mess RISC-V is 100% free to use... Nvidia is moving to RISC-V to handle the internals on their GPUs. Western Digital has said that in the 2020s they expect to be shipping billions of RISC-V chips a year integrated with their storage solutions. (If storage AI is handled directly by the storage hardware... that greatly reduces the load on data center CPUs... making Power and ARM even more attractive.)

    My point... Intel needs to get into the discrete GPU business as fast as possible. The more compute work being done by Nvidia GPUs, purpose-built ASICs, and direct storage controllers, the less the big-boy customers really need Intel CPUs. It's very possible that cloud / data center machines in the mid-2020s will feature thousands of GPU-like chips crunching AI... and every connected storage drive will have its own on-board RISC-V chip powering AI storage algorithms. All connected by a Linux OS that can just as easily be running ARM or Power chips instead of x86.
     
  34. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    54,195
    Joined:
    Feb 9, 2002
    That might be alright. NVIDIA doing Intel chipsets was a clusterfuck last time.
     
  35. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    Keep in mind 4 of the top 10 supercomputers use Intel Phi coprocessors and 4/10 use Xeon CPUs. Intel doesn't need to get into the discrete GPU business for supercomputing's sake, as that market is already well populated by Intel Phi coprocessors, which honestly work VERY well. There isn't really much issue with x86 chips, so I don't see a huge move away from them; however, IBM's new Power CPUs are EXTREMELY impressive and, last I looked, way cheaper than a comparable Xeon. Another thing to keep in mind: a well-optimized program will also do significantly better in price/performance on a Phi than a Tesla, so why would anyone make the switch to a GPU when the Phis are easier to program for and cheaper?
     
    ChadD likes this.
  36. ChadD

    ChadD I Love TEXAS

    Messages:
    3,158
    Joined:
    Feb 8, 2016
    I agree with you for the most part... except that Intel has canceled the Phi chips and announced they will be replaced by "a new architecture".

    Aurora was supposed to be Intel's answer to the IBM/NV Summit machine, using Xeon and Phi. Those plans have been scrapped, with the Dept. of Energy pushing Aurora to 2021 or so.

    If I were betting, I would say Intel is looking at what NV is doing and realizes that spending billions of R&D money on an architecture with one use is not smart. I would expect their new "GPU" to be capable of being scaled for use in everything from APUs and gaming GPUs... to supercomputers, machine learning, and automotive processing packs, etc.

    We'll see what Intel comes up with... I don't think it's hyperbole to say their future is riding on the project. The x86 CPUs are becoming more likely to be replaced every year. More and more work that used to be x86 territory is being done by other chips. Not to mention that Intel is unwilling to license x86... and there are a lot of companies with the capital and the will to create their own chips these days. ARM and IBM are both willing to provide the foundation for a reasonable cost, and RISC-V, although newer and with less of an ecosystem, is completely free.
     
  37. Vader1975

    Vader1975 Gawd

    Messages:
    601
    Joined:
    May 11, 2016
    We have seen Intel claim to be entering the enthusiast GPU market before. I will believe it when I see it for sale. I remember some of the previous attempts that had hype and faded to nothingness.
     
  38. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,084
    Joined:
    Jun 21, 2016
    I can guarantee we will see another revision of the Phis (possibly under a different name), and that Intel will be using a large majority of what is in the current Phis going forward, as it did a pretty good job. And Phis are quite a bit more versatile than a typical GPU, as they can natively run just about any code, or even a standard version of Windows or Linux. The only way the Phis lost was in marketing, or possibly the software developed for them (although it's easier than developing for a GPU); this may continue into the first generations of the Intel GPU.

    I am completely unsure of the gaming capabilities an Intel card will have; however, I am certain Intel could create a competitive compute card, as they have done that already.

    x86 CPUs will only be replaced if they are the worse option at the time. If RISC manufacturers sleep, x86 will gain market again; if x86 doesn't improve, RISC will do better.
     
  39. mord

    mord Limp Gawd

    Messages:
    257
    Joined:
    Mar 8, 2005
    FTFY. Note, not adjusted for inflation.
     
    Jim Kim and nEo717 like this.
  40. SvenBent

    SvenBent 2[H]4U

    Messages:
    2,252
    Joined:
    Sep 13, 2008