How AMD's Mantle Makes Games Better

They have, but failed: https://developer.nvidia.com/nvapi

Why have AMD made it open source? Well, the more people that take it up, the better the adoption, and the easier cross-vendor GPU support becomes. Better than Nvidia, who have G-Sync yet will only let it work with their own GPUs, and have said outright that they are not going to share it. As far as I am concerned, G-Sync is DOA on that comment alone, since they are not even interested in providing licences to make the technology widespread.

AMD have made the right move in having it open source. Nvidia arrogance all over again.
 
AMD said that Mantle is tied to the GCN architecture. It is not open.

In fact it may be the most closed API out there. It won't even be used on the consoles.

Really?

It is open to other vendors that want to support it. They would have to do some serious work to make it work on their hardware, though.

http://www.tomshardware.com/news/amd-apu-developer-conference-mantle,25079.html
Ultimately one thing was made clear during the presentations: Mantle is not simply just for AMD APUs and GPUs. The technology is reportedly open, so whether Nvidia accepts the tech or not is a different story altogether, but it's there for the GeForce company to embrace.
 
They have, but failed: https://developer.nvidia.com/nvapi

Why have AMD made it open source? Well, the more people that take it up, the better the adoption, and the easier cross-vendor GPU support becomes. Better than Nvidia, who have G-Sync yet will only let it work with their own GPUs, and have said outright that they are not going to share it. As far as I am concerned, G-Sync is DOA on that comment alone, since they are not even interested in providing licences to make the technology widespread.

AMD have made the right move in having it open source. Nvidia arrogance all over again.

I wouldn't call it "arrogance" to reject something that is unproven and would require a large number of man-hours to get working with your product.

If Mantle proves itself and is worth it, I can see them implementing it, but going hog wild with it as-is? Not a good idea.
 
Why have AMD made it open source?

AMD have made the right move in having it open source.

If AMD does indeed allow other GPU devs access to Mantle, I can't imagine they would make it open source. An open API, maybe, but I doubt open source. AMD will still want to control Mantle; after all, their money made it.

But as I said earlier, I have not seen a single announcement from AMD saying Mantle is an open API or open source. Just a bunch of coulds and maybes.
 
You're an idiot, or just blind.

Watch the video again, and look closely at the avg FPS display when zoomed in. You can clearly see massive FPS drops. You can also clearly see the FPS drops and stuttering if you watch the display behind the guys while they're talking (and moving quite fluidly while the demo is stuttering).

Using a demo that can't even maintain 30fps to try and show off an API is just terrible.

So maybe we need a side-by-side comparison. I can pretty much guarantee you that the D3D version would be running a whole lot slower.

When zoomed in, I was specifically watching the fps display. I never saw it drop below the mid-30s.

And that is a crapload of units to be simulating while still pushing out the fps it is.

One other thing: you cannot entirely trust what you see when looking at a video recording of a computer monitor. How in the world would you get the two in perfect sync? You are always going to see some "stuttering" when recording like that.
 
AMD said that Mantle is tied to the GCN architecture. It is not open.

In fact it may be the most closed API out there. It won't even be used on the consoles.
Mantle is AMD's way of making PCs operate more like consoles.
Consoles have no need for Mantle because they already operate that way without it.
 
So you are saying that my 6(?)-year-old 8800 GTS can support DirectX 11, since the API is purely software?
That's not how the Mantle API would work. The idea is that Mantle strips away the extra abstraction layer that OpenGL and Direct3D require in order to be universal across graphics cards. That layer usually needs a heavy-duty CPU to translate high-level code into something the GPU can understand. Mantle strips it out and gives control directly to developers, removing the software layer that would otherwise severely slow down the CPU and GPU, at the cost of compatibility with other GPU vendors like Nvidia, or even AMD's own HD 5000/6000 cards.
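To make the "extra layer" argument concrete, here is a minimal C sketch of the difference in approach. Every name in it is invented for illustration; nothing here is real Mantle or Direct3D code.

```c
#include <stddef.h>
#include <stdio.h>

/* Hypothetical sketch of the overhead being described -- none of these
 * names come from the real Mantle or Direct3D APIs. */

typedef struct { int state; } DrawCall;

/* High-level model: every single draw call goes through the driver,
 * which validates and translates it on the CPU before the GPU sees it. */
static void hl_draw(DrawCall dc) {
    /* validate(dc); translate(dc); submit(dc); -- per-call CPU cost */
    (void)dc;
}

/* Low-level model: the application records commands into a buffer it
 * owns, then hands the whole buffer over in one cheap submission. */
typedef struct { DrawCall cmds[4096]; size_t count; } CommandBuffer;

static void ll_record(CommandBuffer *cb, DrawCall dc) {
    if (cb->count < 4096) cb->cmds[cb->count++] = dc;  /* no driver work */
}
static void ll_submit(CommandBuffer *cb) {
    printf("submitted %zu draws in one call\n", cb->count);
    cb->count = 0;
}

int main(void) {
    CommandBuffer cb = { .count = 0 };
    for (int i = 0; i < 1000; i++) {
        hl_draw((DrawCall){ i });        /* 1000 trips through the driver... */
        ll_record(&cb, (DrawCall){ i }); /* ...vs. 1000 cheap array writes   */
    }
    ll_submit(&cb);                      /* plus one submission */
}
```

The claimed win is exactly that difference: the per-call CPU cost moves out of the driver and into work the application controls.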

Can you back this up with any proof?
It's an API. You could write a driver to support Mantle if you had the time, knowledge, and energy to do it. As long as the hardware is relatively equivalent in specs, it can be done. Something tells me you will find people making their own Mantle drivers if AMD releases documentation for it, especially on the Linux end of things. I expect someone to create a new Gallium3D state tracker to support Mantle, and it would likely not be limited to GCN hardware.

Whether you'll get Mantle-on-GCN levels of performance out of it is an entirely different question.
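A rough sketch of that "it's just an API" argument: an API is a set of entry points, and any vendor's driver (or a Gallium3D state tracker) could supply the implementation behind them. All names below are made up, assuming nothing about the real Mantle interface.

```c
#include <stdio.h>
#include <stddef.h>

/* An API spec is just a set of entry points; any vendor's driver can
 * supply an implementation behind them. All names here are invented. */
typedef struct {
    void (*create_device)(void);
    void (*submit)(size_t cmd_count);
} ApiDispatch;

/* One hypothetical backend for GCN-like hardware... */
static void gcn_create(void)     { puts("GCN device created"); }
static void gcn_submit(size_t n) { printf("GCN executed %zu commands\n", n); }
static const ApiDispatch gcn_driver = { gcn_create, gcn_submit };

/* ...and nothing stops another vendor (or a state tracker) from
 * filling in the same table for completely different hardware. */

int main(void) {
    const ApiDispatch *api = &gcn_driver;  /* picked at load time */
    api->create_device();
    api->submit(42);
}
```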

Why point the finger at AMD when it's them that is advancing performance?
Does Mantle somehow slow down Nvidia performance?
The point is that a low-level API would be better if it were done by API makers like the Khronos Group, which maintains OpenGL. Otherwise, APIs made by Nvidia or AMD will favor their own hardware, like GCN. Ultimately, a low-level API has to favor some hardware, but there could be a nicer way of doing it: a more developer-friendly and industry-standard-friendly way.

The way I see it, Mantle will slowly die the way of Glide unless the likes of Nvidia and Intel also adopt it.
So why don't Nvidia make their own API that is closer to the bone and utilises parts of Mantle?
At least some of the work is already done for them.
They can then design hardware to better utilise a Mantle-type API as well.
You mean NVAPI? That's been around for a while. Frostbite 2 and 3 use it, and so do BF3 and BF4. It just doesn't replace Direct3D/OpenGL outright, only bits and pieces. It still gives a performance boost, but not as much as Mantle will.
 
I think companies should stop making presentations for tech that isn't ready/finished yet. AMD got a bunch of buzz (and sales) off all this Mantle talk, but we haven't seen real-world results.

Derp... car companies have been doing this for over 60 years. Your logic is severely flawed. Sometimes tech that's being demonstrated isn't ready for the mainstream but gives you a look into the future. Secondly, they want to gauge interest in such a product, to see if there is demand or a customer need for the tech.
 
I think companies should stop making presentations for tech that isn't ready/finished yet. AMD got a bunch of buzz (and sales) off all this Mantle talk, but we haven't seen real-world results.

Classic AMD. They are doing the exact same thing with "FreeSync".

AMD is living on borrowed time. It will run out pretty soon.
 
Which, as I said, is a terrible way to demo it. If they want to show how great it is, they need to show a system with Mantle and a system without it, not some supposedly underclocked hardware running a demo whose framerate drops some people on this forum are apparently incapable of seeing. You can clearly see the framerate tank at 1:57 while the dev is talking, with no framerate issues visible from the video playback itself. If it were a matter of my computer or internet connection (as if that would have mattered with a fully buffered video), then why doesn't the dev who is speaking look like crap too? It's hilarious that some people want to pretend that crappy stuttering video on YouTube is somehow selective to one area of the screen through an entire video.

I don't think you understand the point of the demo/presentation if you think it's strictly tied to fps or jerkiness/smoothness.
 
It's an API. You could write a driver to support Mantle if you had the time, knowledge, and energy to do it. As long as the hardware is relatively equivalent in specs, it can be done. Something tells me you will find people making their own Mantle drivers if AMD releases documentation for it, especially on the Linux end of things. I expect someone to create a new Gallium3D state tracker to support Mantle, and it would likely not be limited to GCN hardware.

Whether you'll get Mantle-on-GCN levels of performance out of it is an entirely different question.
That isn't proof.
Anything can be coded in software; that isn't the point. It's whether it's worthwhile, and also whether the hardware vendors will do it or provide a competitive alternative.
It has been reported that Mantle will support other hardware later, but you can't expect miracles before it is even in the public domain.
If you want Nvidia to support it, go ball-ache them; it's not AMD's fault.
As you point out, there may not be a lot of point enabling it on older hardware anyway, because the hardware isn't best suited to it.

The point is that a low-level API would be better if it were done by API makers like the Khronos Group, which maintains OpenGL. Otherwise, APIs made by Nvidia or AMD will favor their own hardware, like GCN. Ultimately, a low-level API has to favor some hardware, but there could be a nicer way of doing it: a more developer-friendly and industry-standard-friendly way.
How would it be better?
If it were better, games would be using OpenGL and there would be no need to develop Mantle.
The reason Mantle has come into being is that there was a need due to hardware not functioning optimally, not just because of the driver.
A hardware manufacturer designed optimised hardware and a driver/software stack that dispenses with many of the abstraction layers presented by Windows.
An open organisation couldn't have achieved this much; it had to come from a hardware/driver manufacturer.

The way I see it, Mantle will slowly die the way of Glide unless the likes of Nvidia and Intel also adopt it.
Maybe, but only after the industry catches up and a common performance API is used, or AMD produce something better.
It's a step in the right direction for the whole industry, because they need to catch up to remain competitive.

You mean NVAPI? That's been around for a while. Frostbite 2 and 3 use it, and so do BF3 and BF4. It just doesn't replace Direct3D/OpenGL outright, only bits and pieces. It still gives a performance boost, but not as much as Mantle will.
If you felt NVAPI was all that, you wouldn't be ball-aching because Nvidia won't use Mantle.
So clearly that isn't good enough, by your own words.
 
That stuttering, slow mess is a problem with your internet connection or video rendering.
It's very smooth here, watching from the UK.
I went to YouTube directly using the link provided.

That is confirmed. I have two wicked gaming PCs and they can run anything but videos. Videos are always choppy in browsers and I have never been able to fix this.
 
I don't think you understand the point of the demo/presentation if you think it's strictly tied to fps or jerkiness/smoothness.

Really? So what should I be concerned about, then, when the presentation is about an API for rendering, using a game as an example? It's either going to be about image quality or performance. All of the Mantle marketing has been about performance, so they need to demo a performance comparison, not some random "here's our software running on Mantle on an underclocked CPU". Other than not actually comparing Mantle to anything, it tells me nothing from a consumer standpoint.

Someone made a comment in this thread about Mantle supposedly running that same demo at 300 fps without underclocking the CPU. Great, that still doesn't tell me anything. Then someone else commented that the 300 fps was an estimate in the event of unlimited GPU power. What? Why not just claim 3000 fps with "unlimited GPU power"?
 
Then someone else commented that the 300 fps was an estimate in the event of unlimited GPU power. What? Why not just claim 3000 fps with "unlimited GPU power"?

Because it doesn't scale infinitely? The point is to show they're GPU-bound and that the API is effective in doing what it says on the tin.
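A toy frame-time model shows the logic, with illustrative numbers only (chosen so the CPU-only ceiling lands near the 300 fps figure discussed above):

```c
#include <stdio.h>

/* Toy model of the underclocking argument, with made-up numbers:
 * the slower of CPU work and GPU work sets the frame rate. */
static double frame_ms(double cpu_ms, double gpu_ms) {
    return cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
}

int main(void) {
    double cpu_ms = 3.3;   /* per-frame CPU cost at stock clocks (assumed) */
    double gpu_ms = 30.0;  /* per-frame GPU cost (assumed)                 */

    /* Halving the CPU clock roughly doubles the CPU cost per frame. */
    printf("stock CPU:    %.1f fps\n", 1000.0 / frame_ms(cpu_ms, gpu_ms));
    printf("halved CPU:   %.1f fps\n", 1000.0 / frame_ms(cpu_ms * 2.0, gpu_ms));

    /* If fps barely moves when the CPU is halved, the GPU is the
     * bottleneck -- the claim the demo was making. "300 fps with
     * unlimited GPU power" is just 1000 / cpu_ms as gpu_ms goes to 0. */
    printf("CPU-only cap: %.1f fps\n", 1000.0 / cpu_ms);
}
```

That is why the estimate is ~300 fps and not an arbitrary 3000: the CPU-side cost sets a finite ceiling even with an infinitely fast GPU.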
 
Really? So what should I be concerned about, then, when the presentation is about an API for rendering, using a game as an example? It's either going to be about image quality or performance. All of the Mantle marketing has been about performance, so they need to demo a performance comparison, not some random "here's our software running on Mantle on an underclocked CPU". Other than not actually comparing Mantle to anything, it tells me nothing from a consumer standpoint.

Someone made a comment in this thread about Mantle supposedly running that same demo at 300 fps without underclocking the CPU. Great, that still doesn't tell me anything. Then someone else commented that the 300 fps was an estimate in the event of unlimited GPU power. What? Why not just claim 3000 fps with "unlimited GPU power"?

Maybe you should watch it again and listen to what he actually says? I'll give you a hint: the number of unique objects (there are other goodies in there too).
 
AMD said that Mantle is tied to the GCN architecture. It is not open.

In fact it may be the most closed API out there. It won't even be used on the consoles.

I thought you were banned.

Why are you in here thread-crapping again?
 
For all those suffering from Nvidia butthurt, the point is that the demo was maxing out an R9 while not denting an AMD 8350 downclocked by half. Wipe your tears and imagine how much GPU power it would take to become CPU-limited by a stock 8350.
Mantle is about removing the harness that GPUs have had hanging around their neck for far too long. Console programming for the PC. FINALLY. There is a REASON the new consoles have 8 weak cores and can game like a son'ofa'b*&tch!
I hope I have illustrated some prime reasons why the Nvidiots should stop it with all the baby-boy butthurt.
 
Mantle is to GPUs what their 3DNow tech was back in the Athlon days. It's a custom instruction set meant to take advantage of certain integrated features of their hardware. You can't make this "open" because it wouldn't work on an Nvidia GPU even if you wanted it to. If Mantle were to become the next big thing and replaced Direct3D, then I'm sure Nvidia would be able to support it with their GPUs. But there hasn't really been any competition in the graphics API world for a long time; Microsoft rules the world with DirectX, plain and simple. It sounds like adding support for Mantle is pretty easy for devs, so hopefully a bunch of them will do it. It's not a stab at Nvidia, it's just AMD designing something new. This is far more significant than G-Sync.
 
Seems Mantle's focus is unloading the CPU? Why, and from what? My aging i7 is in no way holding anything back; the issue is threading.
And I would rather the devs spend more time optimizing threaded CPU code than on this waste of time that will only matter on low-end CPUs.
Why not a side-by-side with an overclocked CPU? Or an Intel CPU? They had to UNDERCLOCK the CPU to show it really doing anything, which just proves what we have known for a while: the bottleneck is the GPU now, not the CPU.
 
As you point out, there may not be a lot of point enabling it on older hardware anyway, because the hardware isn't best suited to it.
For developers, that's the whole point of Mantle. You don't want to alienate your customers because you can't get your game to run acceptably on their hardware. Mantle caters specifically to relatively new hardware, like the 7000 and 200 series cards. Even though the architecture of the 5000/6000 series is very similar to GCN, it isn't supported: not because it isn't suited, but because of marketing.

The reason Mantle has come into being is that there was a need due to hardware not functioning optimally, not just because of the driver.
Uh, that's the reason why Mantle exists. The driver handles everything going on, which is why Mantle gives all that control over to developers.

[Attached image: decoupledmemory.jpg]


An open organisation couldn't have achieved this much; it had to come from a hardware/driver manufacturer.
OpenGL was actually made by SGI for their graphics hardware, and they later opened up the API. Glide, which is very much like Mantle, was created by 3dfx and eventually became open source.

If you felt NVAPI was all that, you wouldn't be ball-aching because Nvidia won't use Mantle.
So clearly that isn't good enough, by your own words.
I don't like NVAPI or Mantle. Though to be fair, NVAPI is five years old and does support older hardware. You could use OpenGL and throw in some custom extensions; both Nvidia and ATI were doing that for years, before DirectX became the thing of the future.

For example, Nvidia has created a number of custom OpenGL extensions that ATI had to support just to be compatible, and vice versa. But Nvidia created so many more, and got developers to use them so much more often, that it's no wonder Nvidia was usually ahead of ATI in performance.

It's not like Mantle, where you create an entire API, but it does give a decent performance boost without requiring developers to write whole new code.
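For reference, this is roughly how applications pick up vendor extensions at runtime: the driver advertises a space-separated extension string (what glGetString(GL_EXTENSIONS) returns in pre-3.0 OpenGL), and the app checks for the ones it knows how to use. A minimal, self-contained sketch with a stand-in string:

```c
#include <stdio.h>
#include <string.h>

/* Classic pre-GL3 pattern: the driver exposes its vendor extensions as
 * one big space-separated string, and the app checks for the ones it
 * can take advantage of. (Naive substring check -- fine for a sketch.) */
static int has_extension(const char *all, const char *name) {
    return all != NULL && strstr(all, name) != NULL;
}

int main(void) {
    /* Stand-in for a real driver's extension string. */
    const char *exts = "GL_ARB_multitexture GL_NV_fence GL_ATI_fragment_shader";

    printf("GL_NV_fence: %s\n",
           has_extension(exts, "GL_NV_fence") ? "yes" : "no");
    printf("GL_ATI_fragment_shader: %s\n",
           has_extension(exts, "GL_ATI_fragment_shader") ? "yes" : "no");
}
```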
 
AMD's Mantle is not proprietary and is open to Nvidia. If you want to talk proprietary, then consider Nvidia's "GameWorks" initiative: black-box libraries that developers are prevented from sharing with AMD via license agreements. Black boxes that have been shown not only to optimize for Nvidia but also to specifically and unnecessarily hamstring AMD hardware and to prevent AMD from accessing information required to optimize for their own hardware, doing things like their excessive tessellation tricks of old.

Hopefully having GCN widespread in consoles as well as PCs will prevent Nvidia's dirty tricks from gaining any ground. I'm getting sick of their underhanded approach that damages the gaming ecosystem just to support their profit margin.

I've voted with my wallet: I'm upgrading from my 2x GTX 460s to an R9 290X Gigabyte Windforce edition, which is being delivered today, and I'm really excited for what Mantle can bring.

Not only will it bring greater performance to high-end cards and allow lower-end cards to play at acceptable levels, but it will also remove excessive CPU load, perhaps allowing for greater platform longevity.

Nvidia has been using its position of influence in a negative manner in many ways in terms of the game ecosystem, and it's time that stopped.
 
I don't like how he skated around the comparison question ("turn things off and lower graphics"). How about giving us hard numbers in an apples-to-apples comparison of this special demo, with Mantle on and off. FPS vs. FPS.

If it's 2 fps with it off and 18 fps with it on, say that.

I'm highly interested in Mantle and really want to see the DirectX API go away... but yeah, he definitely threw up some smokescreen vague terminology when the very important question was asked (i.e., with vs. without Mantle on the same hardware).

When people do that, it generally means disappointment will follow when the real numbers show up.

Very surprised hardly anyone on here is commenting on that. Not very observant, or...?
 
AMD's Mantle is not proprietary and is open to Nvidia. If you want to talk proprietary, then consider Nvidia's "GameWorks" initiative: black-box libraries that developers are prevented from sharing with AMD via license agreements. Black boxes that have been shown not only to optimize for Nvidia but also to specifically and unnecessarily hamstring AMD hardware and to prevent AMD from accessing information required to optimize for their own hardware, doing things like their excessive tessellation tricks of old.

Hopefully having GCN widespread in consoles as well as PCs will prevent Nvidia's dirty tricks from gaining any ground. I'm getting sick of their underhanded approach that damages the gaming ecosystem just to support their profit margin.

I've voted with my wallet: I'm upgrading from my 2x GTX 460s to an R9 290X Gigabyte Windforce edition, which is being delivered today, and I'm really excited for what Mantle can bring.

Not only will it bring greater performance to high-end cards and allow lower-end cards to play at acceptable levels, but it will also remove excessive CPU load, perhaps allowing for greater platform longevity.

Nvidia has been using its position of influence in a negative manner in many ways in terms of the game ecosystem, and it's time that stopped.
At the moment, Mantle is as proprietary an API as it gets. There isn't even a white paper publicly available for it. No documentation, nothing. Only AMD can make drivers for it at the moment, and I am 100% sure they would decline to give away the documentation and related material if NVIDIA asked.

AMD does questionable things too. Tomb Raider and the latest Dirt come to mind.
 
I think that before people keep complaining about 30 FPS demos, they should also consider the amount of data on the screen. I'd agree, though, that a "Mantle OFF" vs. "Mantle ON" comparison could have better illustrated the difference.
 
I'm holding judgement until I see it in person or here on the [H], but that was one of the worst demos I've ever seen. You could see the CPU spikes and stuttering in the bottom left with the perfoverlay. Not to mention they showed no comparisons, released no info on what hardware they were using, and didn't even really show what they were playing (multiplayer/singleplayer).
 
I'm highly interested in Mantle and really want to see the DirectX API go away... but yeah, he definitely threw up some smokescreen vague terminology when the very important question was asked (i.e., with vs. without Mantle on the same hardware).

When people do that, it generally means disappointment will follow when the real numbers show up.

Very surprised hardly anyone on here is commenting on that. Not very observant, or...?

I also wonder if there will be any visual differences when using Mantle: things like AA and texture quality. For something that is supposed to be production-ready, there is very little information.
 
I'm holding judgement until I see it in person or here on the [H], but that was one of the worst demos I've ever seen. You could see the CPU spikes and stuttering in the bottom left with the perfoverlay. Not to mention they showed no comparisons, released no info on what hardware they were using, and didn't even really show what they were playing (multiplayer/singleplayer).

http://hardforum.com/showpost.php?p=1040530359&postcount=23

I don't recall hearing what GPU they were using beyond a 2xx. The screenshot shows the numbers that were bouncing around and what they actually are, not what people are interpreting them as. They tell you what CPU is being used and that it's a single-player game that is AI vs. AI for the purposes of the demo (basically so there are maximum laaaaazors! at once).

The first time I watched the video I didn't notice what was what, so I was basing my judgement on assumptions. After I went back and paused it, I saw what was what, and while I'm not exactly "wowing" over it, I do look forward to actually getting some hands-on time with it and seeing how it is.
 
http://hardforum.com/showpost.php?p=1040530359&postcount=23

I don't recall hearing what GPU they were using beyond a 2xx. The screenshot shows the numbers that were bouncing around and what they actually are, not what people are interpreting them as. They tell you what CPU is being used and that it's a single-player game that is AI vs. AI for the purposes of the demo (basically so there are maximum laaaaazors! at once).

The first time I watched the video I didn't notice what was what, so I was basing my judgement on assumptions. After I went back and paused it, I saw what was what, and while I'm not exactly "wowing" over it, I do look forward to actually getting some hands-on time with it and seeing how it is.

I'll give you the benefit of the doubt on the stuttering, but something I feel has been understated is that the demo still looked terrible regardless of the reason. If you're going to try to sell me on something new, it had better damn well look/be better, or hold off until you are ready to show it (remind you of the game in general? :rolleyes: )

To be honest, though, it's all moot until I see the hardware sites test it, and even then, I'm personally happy with my performance, so anything more would just be a bonus.
 
That was the lowest FPS I saw it hit during the video, and to be fair, he stated that it was running at about 30 fps. Watching it, I saw it range from 60 at its highest to 34 at its lowest, which I don't feel is that unusual, as the high-fps spots were zoomed-in views only having to render one ship and a couple of beams, whereas the 34 fps areas were rendering the entire battle.

Frankly, it's a new thing and it will take time to be optimized and implemented correctly. I'm not holding my breath and I don't expect massive gains from it. Right now I get the FPS that I want in the games I play; if I don't get a boost, no biggie, but if I do, it will be a nice bonus.

EDIT:

The "Average Frames" figure is measured in ms (frame time), not frames per second.
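For anyone double-checking the overlay: converting a frame time in ms to fps is just 1000 / ms, and a quick sanity check lines up with the figures people quoted from the video:

```c
#include <stdio.h>

/* Quick sanity check: a frame-time overlay reading in ms converts to
 * fps as 1000 / ms, so the numbers people saw line up either way. */
int main(void) {
    double readings_ms[] = { 16.7, 29.4, 33.3 };
    for (int i = 0; i < 3; i++)
        printf("%.1f ms/frame  ->  %.0f fps\n",
               readings_ms[i], 1000.0 / readings_ms[i]);
    /* Prints roughly 60, 34, and 30 fps -- the range discussed above. */
}
```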

What are you talking about? In what game do you have over 7,000 units? Crysis? StarCraft 2? Rome 2?
 
GCN versions 1 and 2 are required to run Mantle.
http://en.wikipedia.org/wiki/Mantle_(API)

This is why Nvidia will not support it.

They would have to create an arch that plays nice with it. They're in a better position to work on their own version and roll it out faster.
Remember, single-card multi-monitor support wasn't available until Kepler; Fermi couldn't support it. Also, they called it Surround instead of Eyefinity.
I doubt Nvidia will call their version Mantle; they'll make up a different name.
Mantle isn't a driver; the arch is what makes Mantle possible. Try to run Mantle on a 5870 and see where that gets you.
 
Nvidia would just need a miniport DLL, just like 3dfx had to handle OpenGL, which wasn't supported directly on their hardware. It would convert the Mantle calls into Nvidia-acceptable calls.
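A hypothetical sketch of what such a translation layer would look like, in the spirit of the old 3dfx MiniGL drivers. Every name here is invented; this is not the real Mantle API or any real Nvidia interface.

```c
#include <stdio.h>

/* Hypothetical translation shim: export the new API's entry points,
 * translate each call, and forward it to the native driver. */

typedef struct { int index_count; } NativeCmd;

/* The vendor's existing native entry point (stubbed for illustration). */
static void native_submit(NativeCmd c) {
    printf("native driver drew %d indices\n", c.index_count);
}

/* Exported "Mantle-style" entry point: translate, then forward. The
 * per-call translation is extra CPU work, which is why a shim like this
 * would likely never match a native implementation for speed. */
void shimCmdDrawIndexed(int index_count) {
    NativeCmd c = { index_count };  /* translate the foreign request */
    native_submit(c);               /* hand off to the real driver   */
}

int main(void) {
    shimCmdDrawIndexed(36);  /* an app calling the shim as if it were native */
}
```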
 
This is why Nvidia will not support it.

They would have to create an arch that plays nice with it. They're in a better position to work on their own version and roll it out faster.
Remember, single-card multi-monitor support wasn't available until Kepler; Fermi couldn't support it. Also, they called it Surround instead of Eyefinity.
I doubt Nvidia will call their version Mantle; they'll make up a different name.
Mantle isn't a driver; the arch is what makes Mantle possible. Try to run Mantle on a 5870 and see where that gets you.

Mantle DOES have a driver. They're using GCN as the hardware baseline. Just like certain DX versions require a hardware feature set, Mantle defines and expects one too. The extension system was not an accident.
 
How is using Direct Compute questionable?
That is not the issue: they did not work with NVIDIA, so at launch it was completely unoptimised. Dirt, on the other hand, was made so that it runs like crap on NVIDIA hardware. By the way, what tech do you think NVIDIA uses in their GameWorks stuff? Exactly: standard stuff that works on other GPUs too, so how is that questionable? :p
 
That is not the issue: they did not work with NVIDIA, so at launch it was completely unoptimised.
Ummm... actually, yes, both devs did work with Nvidia.

Dirt, on the other hand, was made so that it runs like crap on NVIDIA hardware.
Not really. Yes, AMD developed a relationship with Codemasters early on, so much of the engine and game is built/optimized around AMD's architectures, but there is nothing in the game that is inherently built for AMD.


By the way, what tech do you think NVIDIA uses in their GameWorks stuff? Exactly: standard stuff that works on other GPUs too, so how is that questionable? :p
CUDA, PhysX and GameWorks. None of those can run on AMD hardware, nor are they open to AMD optimization.

Not to mention all of the other lockouts Nvidia has pulled on AMD. My favorite is Batman: AA, where Nvidia actually used AA code that was originally implemented by AMD and included a vendor lock so it couldn't be run on AMD hardware.
 
I'm holding judgement until I see it in person or here on the [H], but that was one of the worst demos I've ever seen. You could see the CPU spikes and stuttering in the bottom left with the perfoverlay. Not to mention they showed no comparisons, released no info on what hardware they were using, and didn't even really show what they were playing (multiplayer/singleplayer).

They did say they were running on a downclocked 8350, with the game being played by two AI players.
 
This is why Nvidia will not support it.

They would have to create an arch that plays nice with it. They're in a better position to work on their own version and roll it out faster.
Remember, single-card multi-monitor support wasn't available until Kepler; Fermi couldn't support it. Also, they called it Surround instead of Eyefinity.
I doubt Nvidia will call their version Mantle; they'll make up a different name.
Mantle isn't a driver; the arch is what makes Mantle possible. Try to run Mantle on a 5870 and see where that gets you.

That's not a reason why.
It would take them time to get up to speed.
If you expect lack of immediate results to be a reason not to, then nothing would ever get developed.

I don't think Nvidia will play ball; I think they will develop their own methods.
But thinking those will be rolled out ahead of Mantle is very strange.

As already pointed out, Mantle is part hardware, part driver and part API.
AMD have indicated that they can make it work on older hardware (with a driver change), but for now it is GCN.
 