Intel Wants You to Use Vulkan

Discussion in '[H]ard|OCP Front Page News' started by rgMekanic, Jul 9, 2018.

  1. rgMekanic

    rgMekanic [H]ard|News Staff Member

    Messages:
    5,120
    Joined:
    May 13, 2013
    In an interesting post on the Intel Developer Zone page, Intel gives the Vulkan API high praise, with many glowing statements. In addition, they give you step-by-step instructions on how to render objects in parallel using the Vulkan API, with sample code.

    I have to wonder if this is a bit of foreshadowing for the upcoming Intel discrete GPUs. Trying to help give the industry a gentle nudge towards more Vulkan in the future. Either way, you can now go render yourselves a giant chicken, so that's good.

    Vulkan APIs are positioned to become one of the next dominant graphics rendering platforms. Characteristics of the platform help apps gain longevity and run in more places. You might say that Vulkan lets apps live long and prosper, and this code sample will help get you started.
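    For anyone curious what "rendering objects in parallel" means in practice: Vulkan lets each worker thread record into its own command buffer (one command pool per thread, since pools are externally synchronized), with a single thread submitting the results in order. The sketch below models only that threading pattern, using plain C++ vectors as stand-ins for VkCommandBuffers, since real Vulkan code needs a device, swapchain, and pipeline that won't fit in a forum post.

    ```cpp
    #include <iostream>
    #include <string>
    #include <thread>
    #include <vector>

    // Stand-in for a VkCommandBuffer: a list of recorded "commands".
    using CommandBuffer = std::vector<std::string>;

    // Each worker records draws for its slice of objects into its OWN buffer,
    // mirroring Vulkan's rule that command pools are used by one thread at a time.
    void recordCommands(CommandBuffer& cb, int firstObject, int count) {
        for (int i = 0; i < count; ++i) {
            cb.push_back("draw object " + std::to_string(firstObject + i));
        }
    }

    int main() {
        const int numThreads = 4;
        const int objectsPerThread = 8;

        // One command buffer per thread: no locking needed while recording.
        std::vector<CommandBuffer> buffers(numThreads);
        std::vector<std::thread> workers;

        for (int t = 0; t < numThreads; ++t) {
            workers.emplace_back(recordCommands, std::ref(buffers[t]),
                                 t * objectsPerThread, objectsPerThread);
        }
        for (auto& w : workers) w.join();

        // "Submit": the main thread gathers the per-thread buffers in a fixed
        // order, much as vkQueueSubmit would with secondary command buffers.
        int total = 0;
        for (const auto& cb : buffers) total += static_cast<int>(cb.size());
        std::cout << "recorded " << total << " draw commands\n";
    }
    ```

    The key design point the Intel article leans on: because recording is thread-local, the scene can be split across however many cores you have, something that was effectively impossible with OpenGL's single-context model.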
     
    DrezKill, dvsman and heatlesssun like this.
  2. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    14,429
    Joined:
    Apr 29, 2005
    I want to use Vulkan. Give us more Vulkan, Intel, please.
     
    Wierdo, DPI, scojer and 8 others like this.
  3. HockeyJon

    HockeyJon Gawd

    Messages:
    940
    Joined:
    Dec 14, 2014
    Doesn't surprise me to be honest, given that Raja is leading the division over there.
     
    Wierdo, F.E.A.R., jnemesh and 7 others like this.
  4. DukenukemX

    DukenukemX 2[H]4U

    Messages:
    3,989
    Joined:
    Jan 30, 2005
    Interesting point: Haswell, Ivy Bridge, and Sandy Bridge Intel CPUs don't have Vulkan support in Windows, but they do have Vulkan support in Linux. Something to think about.
     
    DLGenesis and Luke Wells like this.
  5. misterbobby

    misterbobby 2[H]4U

    Messages:
    3,522
    Joined:
    Mar 18, 2014
    What does that actually mean? Those CPUs run Vulkan just fine in the few games that use it.
     
  6. CombatChrisNC

    CombatChrisNC [H]ard|Gawd

    Messages:
    1,049
    Joined:
    Apr 3, 2013
    Wouldn't it be possible to offload some tasks to the otherwise unused iGPU in a discrete-GPU-equipped system?

    Intel iGPU + intel GPU > competition?

    You could do the same thing with an AMD APU and GPU, right? I mean in theory you could even do AMD APU + Intel or nVidia GPU... Right?

    Power and glory to Vulcan.
     
    DLGenesis likes this.
  7. digitalwanderer

    digitalwanderer Long Time No See

    Messages:
    159
    Joined:
    Aug 21, 2004
    Wouldn't it be interesting if they adopted all the open standards that AMD supports/provides? It makes a whole lot of sense to me, and I have a feeling the new GPU division is thinking the same way.

    What better way to bork over nVidia while taking advantage of the support base AMD has already built up? It'd finally put an end to their monopoly on getting games customized mainly for their cards, and it would lead the industry to support more open standards, which I think would be a very good thing.
     
    Revdarian, GSDragoon, c3k and 13 others like this.
  8. ChadD

    ChadD I Love TEXAS

    Messages:
    3,267
    Joined:
    Feb 8, 2016
    The simple answer is yes: APUs can offload workloads to on-board CPU or GPU cores, depending on which is better suited to the task, swapping data via a shared last-level (L3) cache. (Software also really has to come into play to expose that.)

    Offloading work to a discrete card, however, would mean all data has to move back and forth over the PCIe bus. The added communication bottleneck would make such a setup unlikely to outperform simply sending the data to the GPU and letting it crunch. That shared last-level cache is the main advantage of APUs like those found in the consoles: the CPU and GPU can share workloads through it. Most of the advantage is in power efficiency and hardware cost savings.

    https://pdfs.semanticscholar.org/presentation/3009/8fae9dd812777d100662a8283e13c574f6a6.pdf

    The main thing holding APUs back from simply replacing GPU cards is the issue of texture memory speed. GPU cards are still directly connected to much faster RAM. Consoles are really the only systems designed to let developers use faster video memory like GDDR5 with both the CPU and GPU.
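    The bottleneck argument above is easy to put rough numbers on. Using round, illustrative figures (roughly 16 GB/s usable over PCIe 3.0 x16 versus a few hundred GB/s for a card's local GDDR5; exact values vary by platform and these are assumptions, not benchmarks), a quick estimate shows why shuttling per-frame data across the bus kills the idea:

    ```cpp
    #include <cstdio>

    int main() {
        // Illustrative round numbers, not measured figures:
        const double pcie3_x16_gbps = 16.0;   // ~16 GB/s usable, PCIe 3.0 x16
        const double gddr5_gbps     = 256.0;  // ~256 GB/s, mid-range GDDR5 card
        const double frameDataGB    = 0.25;   // assume 256 MB touched per frame

        // time = bytes / bandwidth, converted to milliseconds
        double pcieMs  = frameDataGB / pcie3_x16_gbps * 1000.0;
        double gddr5Ms = frameDataGB / gddr5_gbps * 1000.0;

        std::printf("over PCIe: %.3f ms, from local VRAM: %.3f ms\n",
                    pcieMs, gddr5Ms);
        // At 60 fps the whole frame budget is ~16.7 ms, so spending ~15.6 ms
        // just moving data across the bus leaves almost nothing for real work.
    }
    ```

    That gap is exactly why consoles put the CPU and GPU on one die with shared GDDR5: the transfer simply never happens.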
     
  9. Spidey329

    Spidey329 [H]ardForum Junkie

    Messages:
    8,954
    Joined:
    Dec 15, 2003
    Makes sense. Their options were to push the open standards that exist or try to make a new proprietary standard, and the latter would hurt their positioning since they're so late to the dance.

    You'd better believe that if they were in a better market position, they'd be pushing a proprietary system.
     
    Wierdo and digitalwanderer like this.
  10. martinmsj

    martinmsj [H]ard|Gawd

    Messages:
    1,516
    Joined:
    Mar 3, 2005
    I'm not surprised, considering they're (Intel) supplying graphics drivers for the Vega GPUs that come with their CPUs. I thought AMD tech was the basis for their GPUs (to start off).
     
    Last edited: Jul 10, 2018
  11. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,127
    Joined:
    Sep 23, 2005
    Hell yea, get a powerhouse behind this!
     
    Darth Kyrie likes this.
  12. ChadD

    ChadD I Love TEXAS

    Messages:
    3,267
    Joined:
    Feb 8, 2016
    Intel has always been a very good open-source citizen. I would hope and expect that will continue with their new GPU stuff. Intel isn't going to try to reinvent Freesync/Gsync or build an Intel Hairworks. lol. I'm sure they will support every open standard they can... I am also very sure they are not fans of DX in any way. Intel and MS have a long history of stepping on each other's toes. All you have to do is look through Intel Capital's holdings to see how Intel funds a ton of companies directly competing with MS. (Not related, but damn, it's always interesting seeing what new companies Intel is investing in... Intel seems to fund all sorts of "perhaps this will be something one day" stuff, like SiFive.)

    Wintel is no happy marriage.
    - In the early '90s, MS, as one of the founders of the ACE consortium (Advanced Computing Environment), pushed hard to replace Intel: MS and 20-some other companies attempted to replace x86 with a RISC arch. Clearly that didn't work. (Although it's why the first versions of Windows NT supported x86 / MIPS / PowerPC and Alpha.)
    - In '95 there was the NSP (Native Signal Processing) fight. Basically, in order to sell faster CPUs, Intel built software that bypassed Windows' audio/video systems (and the need for specialized hardware, read: sound cards etc.). MS was pissed, to say the least, and threatened to end OEMs that shipped Intel's NSP software.
    - MMX... Intel spent a lot of R&D money on MMX, and even spent something like $250 million to get software vendors to support it. MS, however, told them to pound sand: they wouldn't add MMX support to Windows unless Intel gave AMD an MMX license. This was good for consumers, yes... still, you know that had to have pissed Intel off to no end.

    Looking forward to seeing what Intel will cook up... if their hardware is good, I fully expect they will push even harder to lessen MS. So promoting Vulkan, perhaps even choosing not to support newer versions of DX, wouldn't shock me at all. I also wouldn't be shocked to see Intel start spending $ on developers, getting them to prioritize Vulkan. (It's possible they are already doing just that... supporting an open standard means their hardware doesn't even have to be done for them to start getting developers to support GPUs that may be 2-3 years off.)

    If they are serious about taking on Nvidia, being able to screw MS over at the same time would, I am sure, be the icing on the cake for some of the long-time Intel folks.
     
    Darth Kyrie, DLGenesis and shpankey like this.
  13. GMcDougal

    GMcDougal Gawd

    Messages:
    904
    Joined:
    Aug 22, 2004
    Nvidia is seeing this and is now realizing that a shit storm is coming.
     
  14. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    16,240
    Joined:
    Jan 28, 2014
    Why? Wide Vulkan adoption would be good for everyone.
     
    DrezKill, Wierdo and scojer like this.
  15. Lakados

    Lakados Gawd

    Messages:
    879
    Joined:
    Feb 3, 2014
    With actual developers putting time and resources into building solid dev tools for Vulkan, it will work. That was OpenGL's biggest problem for most studios, and why it wasn't used nearly as much as DirectX.
     
    DrezKill and Armenius like this.
  16. ChadD

    ChadD I Love TEXAS

    Messages:
    3,267
    Joined:
    Feb 8, 2016
    Even the company that has spent hundreds of millions pushing closed-source crap like Gsync... and promoting DX add-ons for things like ray tracing.

    The shit storm is coming for them, I would say. Even if Intel doesn't take the performance throne, the Intel war chest makes AMD's look like lunch money. If Intel manages to push a large number of AAA game developers to support Vulkan over DX, pushes hardware vendors to promote FreeSync, and starts offering OEMs attractive Intel APUs and discounts on Intel CPU/GPU packages, Nvidia is indeed going to be up against it... if Intel goes all in and spends some serious cash pushing the industry its way.
     
  17. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    8,097
    Joined:
    Jun 13, 2003
    If Intel pushes anyone out, it will be AMD. A competent Intel GPU will compete with AMD GPUs for second place for a time, and we might see them acquire RTG.

    When you see that 'FreeSync' has yet to meet the technical standard that G-Sync presented on release, 'crap' doesn't apply. Nvidia solved the whole problem before AMD started half-assing an alternative.

    Which is the best chance to get developers to actually use it in games! And since all of Nvidia's DX stuff works on AMD (and ostensibly Intel), there's no problem here. Well, there is if AMD/Intel build inferior hardware. That's up to them!
     
    Armenius likes this.
  18. Pyromaneyakk

    Pyromaneyakk 2[H]4U

    Messages:
    2,145
    Joined:
    Sep 2, 2000
    You guys are lucky they didn't replace the image with gay pr0n ;)
     

    Attached Files:

  19. DedEmbryonicCe11

    DedEmbryonicCe11 [H]ard|Gawd

    Messages:
    1,530
    Joined:
    Jun 6, 2006
    I'm going to guess they meant the IGP portion of the processor.
     
  20. misterbobby

    misterbobby 2[H]4U

    Messages:
    3,522
    Joined:
    Mar 18, 2014
    Okay, that makes some sense, but it's sort of irrelevant, since even the HD 4600 is WAY too slow to run any halfway modern games anyway.
     
  21. Luke Wells

    Luke Wells n00bie

    Messages:
    36
    Joined:
    Jan 24, 2018
    Serious question: Have you ever used Freesync? It's FAR from half-assed. It's awesome.

    Secondly, adding a $200+ fee for G-Sync monitors does not solve a problem.
     
  22. Shadowed

    Shadowed Limp Gawd

    Messages:
    281
    Joined:
    Mar 21, 2018
    I can't tell the difference with Freesync and Gsync. They feel the same in games.
     
  23. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    14,429
    Joined:
    Apr 29, 2005

    Oh please, stop drinking the green-colored Koolaid! I use Freesync every day and it's a wonderful experience. Most people don't use G-Sync because of the Nvidia tax. I would love to use it, if it were affordable.
     
  24. ole-m

    ole-m Limp Gawd

    Messages:
    361
    Joined:
    Oct 5, 2015
    I'd love to embrace it, but it has no improvements over Freesync and costs more. :p
    So the end result is: why do we have it? :)
     
  25. ChadD

    ChadD I Love TEXAS

    Messages:
    3,267
    Joined:
    Feb 8, 2016
    To attempt to lock people into Nvidia when it comes time to upgrade their GPUs.

    The only way you leave Nvidia's green gang is by getting jumped out by Jensen... joking, but you do have to buy a new monitor at the same time. lol
     
    Revdarian and Solhokuten like this.
  26. dragonstongue

    dragonstongue 2[H]4U

    Messages:
    2,961
    Joined:
    Nov 18, 2008
    Nv does not want anyone to use Vulkan because, like DX11 or DX12, they do not have (nor can they have) "full support" of it, whereas Vulkan is very much aligned with what AMD brought forth, so it only makes sense that Intel supports it. Bad days for Nv; someone needs to take Jensen's head down like 20 sizes anyway. Let them make GPUs for cars for all I care.

    PC was meant to be more "open", IMO. If they want proprietary BS, they (Nv) should do a direct partnership with Apple and see if they can still "call all the shots" like they think they have the "right" to. They have been trying to control the whole software/hardware side of PC land for the last decade or so, crying like little bitches whenever AMD sunk major $$$$$$$$$$$$ and development time into coming up with novel features that Nv could not, or did not want to, bother with.

    Then, when they find out that MSFT will be putting support for such features in DX (because AMD gave them very good reason to do so), they (Nv) bitch, moan, and cry unless they get to race the race the way they want instead of playing by the same rules: tessellation, Explicit Multi-GPU, and so forth.

    If you enter a race already going on, play by the same rules; otherwise, do not bother pretending you are the "best race car" and placing labels on your products making false claims of "full support of X, Y, Z" when you DO NOT have full support of it.

    Needless to say, here's hoping Intel gives major backing to all the AMD-based things; they have been direct partners with AMD for much longer than with Nvidia, after all. About time to give them a man hug ^.^
     
  27. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,537
    Joined:
    Oct 13, 2016
    feature_20110502140345.jpg
    Virtu Switchable Graphics
    Based on LucidLogix Virtu technology, MSI Z68 motherboard series firstly provides the most expectable feature for desktop platform - switchable graphics, which allows users to enjoy both graphics power of integrated GPU and discrete GPU. It will switch to integrated GPU for HD movie playback, video transcoding and general applications to save system energy, or release full power for hardcore 3D gaming by switching to discrete GPU automatically.


    A mobo I have has this: the one with my 2600K in it. It ended up being more of a PIA than it was worth, so I keep it disabled. Cool idea, but it never really seemed to work smoothly.
     
    TheHobbyist and halo000008 like this.
  28. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,537
    Joined:
    Oct 13, 2016
    I always say that desperate choices are costly choices, but at this point anything that helps tip NV off its perch is probably a good thing. Plus, I've heard nothing but good things about Vulkan.
     
    Darth Kyrie likes this.
  29. SvenBent

    SvenBent 2[H]4U

    Messages:
    2,387
    Joined:
    Sep 13, 2008
    Leonard_Nimoy_Spock_1967.jpg


    Jokes aside, a push for more Vulkan development would be nice. I support programs on any open standard.
     
  30. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,537
    Joined:
    Oct 13, 2016
    I'd prefer to support AMD after all they've invested over the years, but the flat truth is that the first one to make a 1080 Ti equivalent will get my money and help me on towards a new path of VRR 4K TVs. I know Intel's not planning anything like that for a while, but just saying, that's where my plans are.
     
  31. jnemesh

    jnemesh [H]ard|Gawd

    Messages:
    1,053
    Joined:
    Jan 21, 2013
    The only difference you will see is a lot less green in your wallet with G-sync.
     
    Revdarian and DrezKill like this.
  32. jnemesh

    jnemesh [H]ard|Gawd

    Messages:
    1,053
    Joined:
    Jan 21, 2013
    The Vega64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080ti because 1) F*** Nvidia, 2) Monitors are SIGNIFICANTLY less expensive, 3) my gaming will be at 3440 x 1440, and 4) Samsung's support of Freesync on 2018 TVs.

    It should be noted that when testing the latest Battlefield V closed alpha, the Vega OUTPERFORMED the 1080ti...and this is going to be a trend as more games support DX12 and Vulkan. There is only so much Nvidia can do with per-game patching.
     
  33. Dan_D

    Dan_D [H]ardOCP Motherboard Editor

    Messages:
    54,540
    Joined:
    Feb 9, 2002
    Doom is the only game I play that uses Vulkan, and while it's fast, it doesn't work with V-Sync or adaptive V-Sync on my system. So I don't use it.
     
  34. katanaD

    katanaD [H]ard|Gawd

    Messages:
    1,800
    Joined:
    Nov 15, 2016

    Ok... that was bad... and you should feel bad... but... I LOL'd.
     
  35. Solhokuten

    Solhokuten [H]ard|Gawd

    Messages:
    1,109
    Joined:
    Dec 9, 2009
  36. sparks

    sparks 2[H]4U

    Messages:
    3,186
    Joined:
    Jun 19, 2004
    Anyone else remember when G-Sync was first shown?
    They stated it would only add between $25 and $50 to the price and would solve all your problems.
    Now they say they have to verify each monitor and it costs at least $200.
    Freesync is just a modified VESA standard and adds nothing to the cost.
    Something is wrong.
    Sure, it's a chip, but the manufacturer is doing all the work, not Nvidia.
    Funny that the bigger the monitor, the more it costs.

    The Vega64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080ti because 1) F*** Nvidia

    Yes, but when it came out it was hotter, used more power, and cost more than the Nvidia models. And when the first Asus-designed cooling boards came out, they wouldn't clock over 1560 while the stock boards would do 1630...
    Talk about an overpriced cluster f.
     
  37. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    16,240
    Joined:
    Jan 28, 2014
    I want whatever you are smoking.
    Freesync 2 takes the same approach as G-Sync now. Blame the wild west of manufacturers only supporting Freesync in a narrow range like 50-60 Hz. G-Sync supported 30-144 Hz from the very start.
     
    IdiotInCharge likes this.
  38. twzTechman

    twzTechman Limp Gawd

    Messages:
    208
    Joined:
    Apr 14, 2011

    I am looking to move to a 3440x1440 monitor very soon myself. I currently have a 27" G-Sync monitor and a GTX 1080 at 1440p. I'm seriously considering moving over to a Vega 64 and Freesync setup for the same reasons noted above. I think my GTX 1080 would struggle at 3440x1440 without a G-Sync monitor, and those are pricey.
     
    jnemesh likes this.
  39. DukenukemX

    DukenukemX 2[H]4U

    Messages:
    3,989
    Joined:
    Jan 30, 2005
    For their GPUs. Have you ever run Vulkan games on an Intel HD 4400?
     
  40. scojer

    scojer 2[H]4U

    Messages:
    3,907
    Joined:
    Jun 13, 2009
    I'd like to use Vulkan. I want more devs to support it.