Vulkan Has Just Become the World's First Graphics API with a Formal Memory Model

cageymaru

The Khronos Group has announced that Vulkan has become the world's first graphics API with a formal memory model. "A memory model, or memory consistency model, for a programming language describes how threads in a parallel processing system can access shared data and synchronize with each other, while allowing compilers and hardware the freedom to reorder and optimize memory accesses." The need for such models grew out of the rise of multi-core processors, and Khronos Group says research is ongoing into making them scale to massively parallel systems using new notions like scoped synchronization.

This week, Vulkan has become the world's first graphics API to include a formal memory model for its associated GLSL and SPIR-V programming languages. This announcement has a number of components that come together to significantly boost the robustness of the Vulkan standard, both for programming correctness and for sophisticated compiler optimizations.
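For a feel of what a memory model actually pins down: the Vulkan model formalizes acquire/release ordering much like C++11's, with GPU memory scopes layered on top. Here is a minimal C++ sketch of the acquire/release pattern, purely as an analogy; this is not Vulkan API or shader code, and the names are made up for illustration.

// Minimal C++11 acquire/release sketch, analogous to the ordering rules the
// Vulkan memory model now formalizes for GLSL/SPIR-V atomics. Illustrative only.
#include <atomic>
#include <iostream>
#include <thread>

std::atomic<bool> flag{false}; // synchronization variable
int payload = 0;               // ordinary, non-atomic shared data

void producer() {
    payload = 42;                                // write the data first...
    flag.store(true, std::memory_order_release); // ...then publish: the release
                                                 // makes the prior write visible
}

void consumer() {
    while (!flag.load(std::memory_order_acquire)) // acquire pairs with the release,
        ;                                         // ruling out harmful reordering
    std::cout << payload << '\n';                 // guaranteed to print 42
}

int main() {
    std::thread t1(producer), t2(consumer);
    t1.join();
    t2.join();
}

The scoped synchronization mentioned above extends this idea: a shader can state not just that a write is released, but how far it must be visible (subgroup, workgroup, queue family, or whole device).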
 
Realistically though, we'll probably have to wait for new versions of game engines before any of this ends up in games, possibly with major overhauls to the engine.
 
Realistically though, we'll probably have to wait for new versions of game engines before any of this ends up in games, possibly with major overhauls to the engine.
Mobile will push this for us, since smartphones have multi-core processors too. I bet Unreal Engine will adopt it.

:)
 
Any progress is good progress, right?

I have no idea how much it will improve performance, or whether it will affect the whole GHz-vs-cores discussion, but it's interesting nonetheless.
 
You mean the ecosystem that wouldn't exist without Intel and Nvidia?

:ROFLMAO:
Pretty sure this is a logical fallacy. If either company were to not exist at all, we'd simply be buying AMD products; Cyrix might have done better, or maybe we'd all be using PowerPC-derived units, etc. You can't say they created the ecosystem when in reality the factors behind it are immeasurable and largely come down to end users. I know loving Intel/Nvidia is popular since they're "the fastest" (currently), but that has nothing to do with innovating in the space most of us care about, and certainly very little to do with building an ecosystem that in all probability would exist even if they had never been founded.

TL;DR: I do not think the absence of Intel would have stopped the PC revolution or the creation of this ecosystem; its popularity has more to do with convenience and empowering everyday life, all of which happened through natural funding and progression.
 
If either company were to not exist at all, we'd simply be buying AMD products; Cyrix might have done better, or maybe we'd all be using PowerPC-derived units, etc.

...you realize that their first designs were literally straight up copies of Intel's, right?
 
Which supports 'AMD brought more advancement to the ecosystem' how?

You were the one who brought it up.

You can thank AMD for things like x86-64, forcing Intel to sell more than four cores on the desktop platform, the on-die memory controller, on-die graphics that don't suck, and not being grossly vulnerable to hardware attacks.
 
You were the one who brought it up.

You can thank AMD for things like x86-64, forcing Intel to sell more than four cores on the desktop platform, the on-die memory controller, on-die graphics that don't suck, and not being grossly vulnerable to hardware attacks.

:ROFLMAO:

I still like the idea of VLIW; I think ML could boost it quite a bit, and with the level of abstraction going on at the hardware level, it might make a comeback. As for x86-64, there was no invention here: AMD just did what Intel decided not to do. Same with the on-die memory controller, and Intel's integrated graphics have worked pretty swell for years (do note that my desktop has all three vendors' graphics in it right now!).

AMD is also vulnerable to hardware attacks, because those attacks target the nature of out-of-order processors, and the solution isn't immediately apparent, which is why I mentioned VLIW. And I don't see AMD 'forcing' Intel to do anything. Intel didn't want to do big consumer dies on 14nm, but they borked their 10nm process, so they put stuff in the pipe. And yeah, that took a whole lot longer than the few months Ryzen was previewed.

I give AMD credit for occasionally almost catching up. They're terrible at outright innovation, though; almost nothing they do that's actually innovative sticks. Even Mantle and Vulkan followed the DX12 initiative, and their similarity shows just how similar graphics APIs can be once all of the abstraction (the 'easy mode' stuff) is stripped away, and how much the entire industry wanted that solution.

Next we'll have AMD CPUs with AVX512 and AMD GPUs with RT...
 
Yeah okay hoss. I'm sure Intel's IA64 would have been fantastic.

I also wish Intel had sandbagged longer with four cores on the desktop.

And if you think AMD is having the same problems with security mitigations as Intel, you have your fingers in your ears.
 
You mean the ecosystem that wouldn't exist without Intel and Nvidia?

:ROFLMAO:

I think you mean the ecosystem that wouldn't exist without 3dfx.
No wait, S3... it wouldn't exist without them.
... Oh crap, no, ATI... I mean, AMD really just bought them anyway, right?
Certainly it wouldn't have been the RCA Pixie chip... I mean, they had nothing to do with the advancement of graphics at all...
I mean... we couldn't include SIGGRAPH in here... they didn't actually MAKE a card... right?


Yeah... THAT ecosystem?
 
I think you mean the ecosystem that wouldn't exist without 3dfx.
No wait, S3... it wouldn't exist without them.
... Oh crap, no, ATI... I mean, AMD really just bought them anyway, right?
Certainly it wouldn't have been the RCA Pixie chip... I mean, they had nothing to do with the advancement of graphics at all...
I mean... we couldn't include SIGGRAPH in here... they didn't actually MAKE a card... right?


Yeah... THAT ecosystem?
I mean, we all know none of this would have been possible without VIA... clearly... Arby's is pretty cool?
 
Yeah okay hoss. I'm sure Intel's IA64 would have been fantastic.

IA64 is fantastic, and always was; running x86 on it wasn't, for obvious reasons. VLIW needs compiler support that never existed during Itanium's tenure, but it could be possible with ML-boosted compilers now. The best part is that VLIW is stupid efficient when the code is well structured, which makes it a good candidate to replace the higher-performance stuff we have today (x86, ARM, Power).
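Since the whole VLIW bet is that the compiler, not the hardware, finds the parallelism ahead of time, here's a rough, hypothetical C++ sketch of what that compiler is up against; the function names are mine, purely for illustration.

#include <cstddef>

// Independent element-wise work: no cross-iteration dependencies, so a VLIW
// scheduler can statically pack several of these operations into each wide
// instruction word.
void vliw_friendly(const float* a, const float* b, float* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] * b[i] + 1.0f;
}

// A serial dependency chain: every step needs the previous result, so most
// slots in each wide instruction word sit empty, and a static schedule can't help.
float vliw_hostile(const float* a, std::size_t n) {
    float acc = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        acc = acc * 1.0001f + a[i];  // loop-carried dependency on acc
    return acc;
}

An out-of-order core hides some of the second case at runtime; a VLIW core can't, which is why compiler quality (and maybe ML-guided scheduling) is the whole ballgame.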

I also wish Intel had sandbagged longer with four cores on the desktop.

Skylake was supposed to be the last Intel consumer arch that topped out at four cores, so if you're the AMD fanboi you're pretending to be, you're damn right you would've wished Intel had sandbagged longer! Had Intel pushed eight-core 10nm CPUs out instead of Kaby Lake, Ryzen would have been another day-late, dollar-short Dozer.

And if you think AMD is having the same problems with security mitigations as Intel, you have your fingers in your ears.

There are the problems that we know about, and those that we don't. What anyone reasonable infers from the problems we're seeing is that they exploit basic out-of-order CPU architecture; that is, everything from the Pentium Pro onward for Intel and from the Athlon (K7) onward for AMD. I expect more to come.
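For reference, the canonical Spectre v1 "bounds check bypass" gadget (Kocher et al.) is exactly this kind of ordinary-looking code. A C++ sketch with placeholder names, not a working exploit:

#include <cstddef>
#include <cstdint>

constexpr std::size_t kStride = 512;  // spread probes across distinct cache lines

std::size_t array1_size = 16;
uint8_t array1[16];
uint8_t array2[256 * kStride];
volatile uint8_t sink;                // keeps the compiler from dropping the access

void victim(std::size_t x) {
    if (x < array1_size) {            // architecturally safe...
        // ...but an out-of-order CPU may run this speculatively before the
        // bounds check resolves, leaving a cache footprint indexed by the
        // secret byte array1[x] even for out-of-bounds x.
        sink = array2[array1[x] * kStride];
    }
}

That pattern doesn't care whose logo is on the die; it only cares whether the core speculates past the branch.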
 
There are the problems that we know about, and those that we don't. What anyone reasonable infers from the problems we're seeing is that they exploit basic out-of-order CPU architecture; that is, everything from the Pentium Pro onward for Intel and from the Athlon (K7) onward for AMD. I expect more to come.

Let's just look at this one.

Which arch is getting Hyper-Threading disabled on BSD?
 
Let's just look at this one.

Which arch is getting Hyper-Threading disabled on BSD?

Why limit it to one, and just this point? Why refuse to look at the bigger picture?

Could it be that you have a predetermined outcome that you're trying to prove?

How can you prove that AMD's or ARM's implementation of this feature or that feature won't be exploited next using a similar process, since they're all very similar under the hood?

Lol.
 
Why limit it to one, and just this point? Why refuse to look at the bigger picture?

Could it be that you have a predetermined outcome that you're trying to prove?

How can you prove that AMD's or ARM's implementation of this feature or that feature won't be exploited next using a similar process, since they're all very similar under the hood?

Lol.

Sure, I'll worry more about non-existent vulns than THE ONES THAT EXIST RIGHT NOW. :cautious:

Don't deep throat Intel too much now.
 
Improvements to Vulkan help all gamers. Who can hate that except shills?

I'm not even sure why shills would hate it. Outside of maybe diehard MS fanboys, but even they don't seem super hot on DX12.
 
...you realize that their first designs were literally straight up copies of Intel's, right?
I think what you really meant to say is that, without AMD, Intel is straight-up a monopoly. Sorry, I forgot VIA made a CPU 20 years ago.
 
I think what you really meant to say is that, without AMD, Intel is straight-up a monopoly. Sorry, I forgot VIA made a CPU 20 years ago.

I don't believe VIA made anything. Pretty sure they bought Cyrix and used their CPUs, just renamed.
 
Happy to see Vulkan picking up steam and setting standards. Disappointed to see people acting like children in here about something completely off topic.
 