Intel Wants You to Use Vulkan

rgMekanic

[H]ard|News
In an interesting post on the Intel Developer Zone page, Intel gives the Vulkan API high praise in a string of glowing statements. They also provide step-by-step instructions on how to render objects in parallel using the Vulkan API, complete with sample code.

I have to wonder if this is a bit of foreshadowing for the upcoming Intel discrete GPUs, an attempt to give the industry a gentle nudge toward more Vulkan in the future. Either way, you can now go render yourselves a giant chicken, so that's good.

Vulkan APIs are positioned to become one of the next dominant graphics rendering platforms. Characteristics of the platform help apps gain longevity and run in more places. You might say that Vulkan lets apps live long and prosper, and this code sample will help get you started.
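For the curious, the technique Intel's article walks through boils down to multithreaded command-buffer recording: each worker thread records a secondary command buffer from its own command pool, and the main thread stitches them into one primary buffer with vkCmdExecuteCommands. Below is a minimal C sketch of that pattern, not Intel's actual sample; the device/render pass/framebuffer setup and thread spawning are assumed, and names like ThreadCtx, record_slice, and submit_frame are illustrative.

```c
#include <vulkan/vulkan.h>

typedef struct {
    VkCommandBuffer secondary;   /* allocated from a pool owned by THIS thread:
                                    command pools are not thread-safe */
    VkRenderPass    renderPass;
    VkFramebuffer   framebuffer;
} ThreadCtx;

/* Runs on a worker thread: record the draw calls for one slice of the scene. */
static void record_slice(ThreadCtx *ctx)
{
    VkCommandBufferInheritanceInfo inherit = {
        .sType       = VK_STRUCTURE_TYPE_COMMAND_BUFFER_INHERITANCE_INFO,
        .renderPass  = ctx->renderPass,
        .subpass     = 0,
        .framebuffer = ctx->framebuffer,
    };
    VkCommandBufferBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
        .flags = VK_COMMAND_BUFFER_USAGE_ONE_TIME_SUBMIT_BIT |
                 VK_COMMAND_BUFFER_USAGE_RENDER_PASS_CONTINUE_BIT,
        .pInheritanceInfo = &inherit,
    };
    vkBeginCommandBuffer(ctx->secondary, &begin);
    /* vkCmdBindPipeline / vkCmdBindVertexBuffers / vkCmdDraw ... go here */
    vkEndCommandBuffer(ctx->secondary);
}

/* Main thread, after joining the workers: stitch the pieces together. */
static void submit_frame(VkCommandBuffer primary,
                         const VkRenderPassBeginInfo *rpBegin,
                         const VkCommandBuffer *secondaries, uint32_t count)
{
    VkCommandBufferBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
    };
    vkBeginCommandBuffer(primary, &begin);
    /* SECONDARY_COMMAND_BUFFERS: the render pass gets its contents from
       vkCmdExecuteCommands rather than from inline commands. */
    vkCmdBeginRenderPass(primary, rpBegin,
                         VK_SUBPASS_CONTENTS_SECONDARY_COMMAND_BUFFERS);
    vkCmdExecuteCommands(primary, count, secondaries);
    vkCmdEndRenderPass(primary);
    vkEndCommandBuffer(primary);
    /* ...then vkQueueSubmit as usual. */
}
```

The one-pool-per-thread rule is the crux: Vulkan makes synchronization the app's job rather than the driver's, which is exactly what lets recording scale across cores.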
 
Interesting point: if you have a Haswell, Ivy Bridge, or Sandy Bridge Intel CPU, you don't have Vulkan support in Windows, but you do have Vulkan support in Linux. Something to think about.
 
Wouldn't it be possible to offload some tasks to the otherwise unused iGPU in a discrete GPU equipped system?

Intel iGPU + Intel GPU > competition?

You could do the same thing with an AMD APU and GPU, right? I mean in theory you could even do AMD APU + Intel or nVidia GPU... Right?

Power and glory to Vulcan.
 
Wouldn't it be interesting if they adopted all the open standards that AMD supports/provides? It makes a whole lot of sense to me, and I have a feeling the new GPU division is thinking the same way.

What better way to bork over nVidia while taking advantage of the support AMD has already built up? It'd finally put an end to their monopoly on getting games customized mainly for their cards, and it would lead the industry to support more open standards, which I think would be a very good thing.
 
Wouldn't it be possible to offload some tasks to the otherwise unused iGPU in a discrete GPU equipped system?

Intel iGPU + Intel GPU > competition?

You could do the same thing with an AMD APU and GPU, right? I mean in theory you could even do AMD APU + Intel or nVidia GPU... Right?

Power and glory to Vulcan.

The simple answer is yes: APUs can offload workloads to the on-board CPU or GPU cores, depending on which is better suited to the task, swapping data via the L3 (last-level) cache. (Software also really has to come into play to expose that.)

Offloading work to a discrete card, however, would mean all data has to move back and forth over the PCIe bus; a PCIe 3.0 x16 link tops out around 16 GB/s, while a high-end card's GDDR5 can move several hundred GB/s. The added communication bottleneck makes such a setup unlikely to outperform simply sending the data to the GPU and letting it crunch. That shared last-level cache is the main advantage of APUs like those found in the consoles: the CPU and GPU can share workloads through it. Most of the advantage is in power efficiency and hardware cost savings.

https://pdfs.semanticscholar.org/presentation/3009/8fae9dd812777d100662a8283e13c574f6a6.pdf

The main thing holding APUs back from simply replacing GPU cards is texture memory speed. Discrete cards are still directly connected to much faster RAM. Consoles are really the only systems designed to let developers use fast video memory like GDDR5 with both the CPU and the GPU.
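To make the shared-memory point above concrete, here is a hypothetical OpenCL sketch (OpenCL simply because it runs on both AMD APUs and Intel iGPUs; none of this is from the article or the posts above, and error checks are omitted). The same code is legal on both kinds of device, but only on an APU does the map genuinely avoid a copy:

```c
/* CL_MEM_ALLOC_HOST_PTR asks for memory both the CPU and GPU can address:
 * on an APU the map below is (near) zero-copy, while on a discrete card
 * the runtime still shuttles the data across PCIe behind the scenes. */
#include <CL/cl.h>

#define BUF_BYTES (1 << 20)

int main(void)
{
    cl_platform_id plat;
    cl_device_id   dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context       ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q   = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_ALLOC_HOST_PTR | CL_MEM_READ_WRITE,
                                BUF_BYTES, NULL, NULL);

    /* Map the buffer so the CPU can fill it in place. */
    float *p = clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE,
                                  0, BUF_BYTES, 0, NULL, NULL, NULL);
    for (size_t i = 0; i < BUF_BYTES / sizeof(float); i++)
        p[i] = (float)i;
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);

    /* ...enqueue a kernel that reads buf directly, no explicit copy... */

    clFinish(q);
    clReleaseMemObject(buf);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}
```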
 
Makes sense; their options were to push the open standards that already exist or try to make a new proprietary standard. The latter would hurt their positioning since they're so late to the dance.

You'd better believe that if they were in a better market position, they'd be pushing a proprietary system.
 
I'm not surprised, considering they (Intel) are supplying graphics drivers for the Vega GPUs that come with their CPUs. I thought AMD tech was the basis for their GPUs (to start off).
 
Wouldn't it be interesting if they adopted all the open standards that AMD supports/provides? It makes a whole lot of sense to me, and I have a feeling the new GPU division is thinking the same way.

What better way to bork over nVidia while taking advantage of the support AMD has already built up? It'd finally put an end to their monopoly on getting games customized mainly for their cards, and it would lead the industry to support more open standards, which I think would be a very good thing.

Intel has always been a very good open-source citizen, and I would hope and expect that to continue with their new GPU stuff. Intel isn't going to try to reinvent FreeSync/G-Sync or build an Intel Hairworks. lol I'm sure they will support every open standard they can. I am also very sure they are not fans of DX in any way; Intel and MS have a long history of stepping on each other's toes. All you have to do is look through Intel Capital's holdings to see how Intel funds a ton of companies directly competing with MS. (Not related, but damn, it's always interesting seeing what new companies Intel is investing in. Intel seems to fund all sorts of "perhaps this will be something one day" stuff, like SiFive.)

Wintel is no happy marriage.
- In the early 90s, MS pushed hard to replace Intel as one of the founders of the ACE consortium (Advanced Computing Environment): MS and twenty-some other companies attempted to replace x86 with a RISC arch. Clearly that didn't work (although it's why the first versions of Windows NT supported x86 / MIPS / PowerPC / Alpha).
- In '95 there was the NSP (Native Signal Processing) fight. Basically, Intel, in order to sell faster CPUs, built software that bypassed Windows' audio/video systems (and the need for specialized hardware, read: sound cards, etc.). MS was pissed, to say the least, and threatened OEMs that shipped Intel's NSP software.
- MMX: Intel spent a lot of R&D money on MMX, and even spent something like $250 million to get software vendors to support it. MS, however, told them to pound sand; they wouldn't add MMX support to Windows unless Intel gave AMD an MMX license. This was good for consumers, yes, but you know that had to have pissed Intel off to no end.

Looking forward to seeing what Intel will cook up. If their hardware is good, I fully expect they will push even harder to lessen MS, so promoting Vulkan, perhaps even choosing not to support newer versions of DX, wouldn't shock me at all. I also wouldn't be shocked to see Intel start spending $ on developers, getting them to prioritize Vulkan. (It's possible they are already doing just that; supporting an open standard means their hardware doesn't even have to exist yet to start getting developers on board with GPUs that may be 2-3 years off.)

If they are serious about taking on Nvidia, being able to screw MS over at the same time would, I'm sure, be the icing on the cake for some of the long-time Intel folks.
 
With actual developers putting time and resources into solid dev tools for Vulkan, it will work. Tooling was OpenGL's biggest problem for most studios and why it wasn't used nearly as much as DirectX.
 
Why? Wide Vulkan adoption would be good for everyone.

Even the company that has spent 100s of millions pushing closed-source crap like G-Sync... and promoting DX add-ons for things like ray tracing.

The shit storm is coming for them, I would say. Even if Intel doesn't take the performance throne, the Intel war chest makes AMD's look like lunch money. If Intel manages to push a large number of AAA game developers to support Vulkan over DX, pushes hardware vendors to promote FreeSync, and starts offering OEMs attractive Intel APUs and discounts on Intel CPU/GPU packages, Nvidia is indeed going to be up against it... if Intel goes all in and spends some serious cash pushing the industry its way.
 
If Intel pushes anyone out, it will be AMD. A competent Intel GPU will compete with AMD GPUs for second place for a time, and we might see them acquire RTG.

Even the company that has spent 100s of millions pushing closed-source crap like G-Sync.

When you see that 'FreeSync' has yet to meet the technical standard that G-Sync presented on release, 'crap' doesn't apply. Nvidia solved the whole problem before AMD started half-assing an alternative.

promoting DX add-ons for things like ray tracing

Which is the best chance to get developers to actually use it in games! And since all of Nvidia's DX stuff works on AMD (and ostensibly Intel), there's no problem here. Well, there is if AMD/Intel build inferior hardware. That's up to them!
 

When you see that 'FreeSync' has yet to meet the technical standard that G-Sync presented on release, 'crap' doesn't apply. Nvidia solved the whole problem before AMD started half-assing an alternative.

Serious question: Have you ever used FreeSync? It's FAR from half-assed. It's awesome.

Secondly, adding a $200+ fee for G-Sync monitors does not solve a problem.
 
If Intel pushes anyone out, it will be AMD. A competent Intel GPU will compete with AMD GPUs for second place for a time, and we might see them acquire RTG.
When you see that 'FreeSync' has yet to meet the technical standard that G-Sync presented on release, 'crap' doesn't apply. Nvidia solved the whole problem before AMD started half-assing an alternative.


Oh please, stop drinking the green-colored Kool-Aid! I use FreeSync every day and it's a wonderful experience. Most people don't use G-Sync because of the Nvidia tax. I would love to use it, if it were affordable.
 
I'd love to embrace it, but it has no improvements over FreeSync and costs more. :p
So the end result is: why do we have it? :)

To attempt to lock people into Nvidia when it comes time to upgrade their GPUs.

The only way you leave Nvidia's green gang is by getting jumped out by Jensen... joking, but you do have to buy a new monitor at the same time. lol
 
Nv does not want anyone to use Vulkan because, like DX11 or DX12, they do not have (nor can they have) "full support" of it, whereas Vulkan is very much aligned with what AMD brought forth, so it only makes sense that Intel supports it. Bad days for Nv; someone needs to take Jensen's head down like 20 sizes anyway. Let them make GPUs for cars for all I care.

PC was meant to be more "open" IMO. If they want proprietary BS, they (Nv) should do a direct partnership with Apple and see if they can still "call all the shots" like they think they have the "right" to, the way they have been trying to control the software/hardware side of things in PC land for the last decade or so, crying like little bitches whenever AMD sunk major $$$$$$$$$$$$ and development time into novel features that Nv could not or did not want to bother with.

And when they find out that MSFT will be putting support for such a feature into DX (because AMD gave them very good reason to do so), they (Nv) bitch, moan, and cry unless they get to race the race the way they want instead of playing by the same rules: tessellation, Explicit Multi-GPU, and so forth.

If you enter a race already going on, play by the same rules; otherwise don't bother pretending you are the "best race car" and putting labels on your products making false claims of "full support of X, Y, Z" when you DO NOT have full support of it.

Needless to say, here's hoping Intel gives major backing to all the AMD-based things. They have been direct partners with AMD for much longer than with Nvidia, after all; about time to give them a man hug ^.^
 
Wouldn't it be possible to offload some tasks to the otherwise unused iGPU in a discrete GPU equipped system?

Intel iGPU + Intel GPU > competition?

You could do the same thing with an AMD APU and GPU, right? I mean in theory you could even do AMD APU + Intel or nVidia GPU... Right?

Power and glory to Vulcan.


Virtu Switchable Graphics
Based on LucidLogix Virtu technology, MSI Z68 motherboard series firstly provides the most expectable feature for desktop platform - switchable graphics, which allows users to enjoy both graphics power of integrated GPU and discrete GPU. It will switch to integrated GPU for HD movie playback, video transcoding and general applications to save system energy, or release full power for hardcore 3D gaming by switching to discrete GPU automatically.


A mobo I have (the one with my 2600K in it) has this. It ended up being more of a PITA than it was worth, so I keep it disabled. Cool idea, but it never really seemed to work smoothly.
 
I always say that desperate choices are costly choices, but at this point anything that helps tip NV off its perch is probably a good thing. Plus, I've heard nothing but good things about Vulkan.
 
Oh please, stop drinking the green-colored Kool-Aid! I use FreeSync every day and it's a wonderful experience. Most people don't use G-Sync because of the Nvidia tax. I would love to use it, if it were affordable.

I'd prefer to support AMD, after all they've invested over the years, but the flat truth is that the first one to make a comparable 1080 Ti equivalent will get my money and help me on toward a new path of VRR 4K TVs. I know Intel's not planning anything like that for a while, but just saying, that's where my plans are.
 
I'd prefer to support AMD, after all they've invested over the years, but the flat truth is that the first one to make a comparable 1080 Ti equivalent will get my money and help me on toward a new path of VRR 4K TVs. I know Intel's not planning anything like that for a while, but just saying, that's where my plans are.

The Vega 64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080 Ti, because 1) F*** Nvidia, 2) monitors are SIGNIFICANTLY less expensive, 3) my gaming will be at 3440x1440, and 4) Samsung supports FreeSync on its 2018 TVs.

It should be noted that in testing the latest Battlefield V closed alpha, the Vega OUTPERFORMED the 1080 Ti... and this is going to be a trend as more games support DX12 and Vulkan. There is only so much Nvidia can do with per-game patching.
 
Doom is the only game I play that uses Vulkan, and while it's fast, it doesn't work with V-Sync or adaptive V-Sync on my system. So I don't use it.
 
Anyone else remember when G-Sync was first shown?
They stated it would only add between $25 and $50 to the price and would solve all your problems.
Now they say they have to verify each monitor, and it costs at least $200.
FreeSync is just a modified VESA standard and adds nothing to the cost.
Something is wrong.
Sure, it's a chip, but the manufacturer is doing all the work, not Nvidia.
Funny that the bigger the monitor, the more it costs.

The Vega 64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080 Ti, because 1) F*** Nvidia

Yes, but when it came out it was hotter, used more power, and cost more than the Nvidia models. And when the first Asus-designed cooling boards came out, they wouldn't clock over 1560 MHz while the stock boards would do 1630 MHz...
talk about an overpriced cluster f.
 
Nv does not want anyone to use Vulkan because, like DX11 or DX12, they do not have (nor can they have) "full support" of it, whereas Vulkan is very much aligned with what AMD brought forth, so it only makes sense that Intel supports it. Bad days for Nv; someone needs to take Jensen's head down like 20 sizes anyway. Let them make GPUs for cars for all I care.

PC was meant to be more "open" IMO. If they want proprietary BS, they (Nv) should do a direct partnership with Apple and see if they can still "call all the shots" like they think they have the "right" to, the way they have been trying to control the software/hardware side of things in PC land for the last decade or so, crying like little bitches whenever AMD sunk major $$$$$$$$$$$$ and development time into novel features that Nv could not or did not want to bother with.

And when they find out that MSFT will be putting support for such a feature into DX (because AMD gave them very good reason to do so), they (Nv) bitch, moan, and cry unless they get to race the race the way they want instead of playing by the same rules: tessellation, Explicit Multi-GPU, and so forth.

If you enter a race already going on, play by the same rules; otherwise don't bother pretending you are the "best race car" and putting labels on your products making false claims of "full support of X, Y, Z" when you DO NOT have full support of it.

Needless to say, here's hoping Intel gives major backing to all the AMD-based things. They have been direct partners with AMD for much longer than with Nvidia, after all; about time to give them a man hug ^.^
I want whatever you are smoking.
Anyone else remember when G-Sync was first shown?
They stated it would only add between $25 and $50 to the price and would solve all your problems.
Now they say they have to verify each monitor, and it costs at least $200.
FreeSync is just a modified VESA standard and adds nothing to the cost.
Something is wrong.
Sure, it's a chip, but the manufacturer is doing all the work, not Nvidia.
Funny that the bigger the monitor, the more it costs.

The Vega 64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080 Ti, because 1) F*** Nvidia

Yes, but when it came out it was hotter, used more power, and cost more than the Nvidia models. And when the first Asus-designed cooling boards came out, they wouldn't clock over 1560 MHz while the stock boards would do 1630 MHz...
talk about an overpriced cluster f.
FreeSync 2 takes the same approach as G-Sync now. Blame the Wild West of manufacturers only supporting FreeSync in narrow ranges like 50-60 Hz; G-Sync supported 30-144 Hz from the very start.
 
The Vega 64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080 Ti, because 1) F*** Nvidia, 2) monitors are SIGNIFICANTLY less expensive, 3) my gaming will be at 3440x1440, and 4) Samsung supports FreeSync on its 2018 TVs.

It should be noted that in testing the latest Battlefield V closed alpha, the Vega OUTPERFORMED the 1080 Ti... and this is going to be a trend as more games support DX12 and Vulkan. There is only so much Nvidia can do with per-game patching.


I am looking to move to a 3440x1440 monitor very soon myself. I currently have a 27" G-Sync monitor and a GTX 1080 at 1440p. I'm seriously considering moving over to a Vega 64 and FreeSync setup for the same reasons noted above. I think my GTX 1080 would struggle at 3440x1440 without a G-Sync monitor, and those are pricey.
 