Nvidia's latest DX drivers doing better than Mantle in Star Swarm

madgun

Now this is an interesting piece:

http://wccftech.com/nvidias-directx-11-driver-better-mantle-api-benchmark/

I thought Star Swarm was the attraction on Mantle's side. Looks like in the last month or so, Nvidia has polished its drivers to upstage Mantle on pure performance.

Maybe it shows that if you have nicely optimized drivers, you can still make the best use of the underlying hardware. It could also mean that Nvidia worked with Star Swarm's developers to make low-level calls to its own architecture.
 
Maybe it shows that if you have nicely optimized drivers, you can still make the best use of the underlying hardware. It could also mean that Nvidia worked with Star Swarm's developers to make low-level calls to its own architecture.
Except we already know that current drivers aren't well optimized. Neither AMD nor Nvidia has been taking full advantage of the threading optimizations already present in DirectX 11.

DICE complained about this fact, heavily, in their press slides for Battlefield 3. They wanted to boost performance and reduce CPU overhead by using DX11's new threading capabilities, but quickly found out that both AMD and Nvidia had dropped the ball and failed to implement the required driver-side functionality.

It could be Nvidia finally got around to implementing said optimizations, allowing DX11 to operate as intended :p
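For reference, this is the DX11 threading feature in question: a worker thread records draws into a deferred context, which the render thread later replays on the immediate context. A minimal sketch using the real D3D11 interfaces (device setup, resource binding, and error handling are assumed); when the driver lacks native command-list support, the runtime emulates this path in software and most of the benefit evaporates, which was exactly DICE's complaint.

```cpp
// Minimal sketch of DX11 deferred-context recording. Assumes `device`
// and `immediateCtx` already exist; error handling omitted for brevity.
#include <d3d11.h>

void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
{
    // Worker thread: create a deferred context and record commands into it.
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    deferredCtx->Draw(36, 0); // record state changes and draws as usual

    // Close the recording into a reusable command list.
    ID3D11CommandList* cmdList = nullptr;
    deferredCtx->FinishCommandList(FALSE, &cmdList);

    // Render thread: replay all recorded commands in one cheap call.
    immediateCtx->ExecuteCommandList(cmdList, TRUE);

    cmdList->Release();
    deferredCtx->Release();
}
```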
 
Nothing strange there. When Star Swarm came out DX-only, waiting for Mantle, my i7 2600 machine with a 660 Ti was performing way better than people with 7970 GHz/280X cards + an FX 8350, and equal to 290s. When Mantle came out they received a nice bump in performance, but only enough to make them equal? With proper optimization I know much more can be achieved. :)
 
From what I read, if you enable Deferred Contexts in SS you'll see a large bump in FPS on Nvidia cards. This is before any of the driver optimizations that Nvidia is talking about.

What other games are not implementing Deferred Contexts?
 
Now this is an interesting piece:

http://wccftech.com/nvidias-directx-11-driver-better-mantle-api-benchmark/

I thought Star Swarm was the attraction on Mantle's side. Looks like in the last month or so, Nvidia has polished its drivers to upstage Mantle on pure performance.

Maybe it shows that if you have nicely optimized drivers, you can still make the best use of the underlying hardware. It could also mean that Nvidia worked with Star Swarm's developers to make low-level calls to its own architecture.

Considering Nvidia is able to improve their performance so much in games that use Mantle, I'm wondering if they are taking advantage of the Mantle API themselves? AMD did say that Mantle only requires a certain feature set.

Now, how did they gain the knowledge? Is it something that they could just figure out? It doesn't seem like there was enough time for that, though.
 
Except we already know that current drivers aren't well optimized. Neither AMD nor Nvidia has been taking full advantage of the threading optimizations already present in DirectX 11.

DICE complained about this fact, heavily, in their press slides for Battlefield 3. They wanted to boost performance and reduce CPU overhead by using DX11's new threading capabilities, but quickly found out that both AMD and Nvidia had dropped the ball and failed to implement the required driver-side functionality.

It could be Nvidia finally got around to implementing said optimizations, allowing DX11 to operate as intended :p

It's actually an optional feature; they're not required to support it.

I think Civ V might be the only game out that even has support for it.
 
omg it's getting delirious here

Just sit back and laugh. I've truly seen some hilarious stuff come from rabid fans of you-know-what corporation. I was just reading on another forum someone seriously trying to tell everyone that the 7850K is a superior CPU to the 4770K. Alrighty then. You really can't argue with crazy.
 
It's actually an optional feature; they're not required to support it.

I think Civ V might be the only game out that even has support for it.

This is my impression as well. Deferred contexts were MS's original answer to this entire problem, but they were apparently extremely difficult to use, and the results were not quite as good as hoped (undoubtedly due to the difficulty). I think you are correct that Civ V was the only game to use them. Whether they are being used in Star Swarm? Maybe. I don't really know, but Nvidia does have multithreaded drivers supporting deferred contexts while AMD didn't; I'm not sure if that has changed for AMD. If I had to guess, I'd say no.
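Worth noting, there is an official way to ask the driver which side of that divide it falls on. A small sketch using the real D3D11 capability query (only the device pointer is assumed):

```cpp
// Ask the driver whether it natively supports DX11 command lists, i.e.
// whether it is a "multithreaded driver" in the sense discussed above.
// If DriverCommandLists is FALSE, the runtime emulates deferred contexts
// in software and most of the threading benefit is lost.
#include <d3d11.h>

bool DriverSupportsCommandLists(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_THREADING caps = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D11_FEATURE_THREADING, &caps, sizeof(caps))))
    {
        return caps.DriverCommandLists == TRUE;
    }
    return false;
}
```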
 
I moved from a GTX 780 Ti to 2 x R9 290 Tri-X and now back to a GTX 780 Ti GHz edition. Pretty much seen it all.

Both companies have great hardware, but I must admit that Nvidia has a larger pool of games for which it has optimized its drivers.

I can see one reason why AMD would push Mantle. Mantle would greatly reduce the fuss of updating drivers for every new game release and optimizing to squeeze out every last bit of performance. AMD can save all that capital by leaving things up to developers like DICE.

My reason for switching back to the GTX 780 Ti was no fault of AMD's. It was more DICE's messy implementation of Mantle: the game was just unplayable with CrossFire enabled under Mantle. There were crashes every so often. DICE needs to get its act together and stop releasing beta versions. By the time BF3 matured and got playable, it was replaced by BF4. I hope the same does not happen again, and that BF4 has a longer cycle than BF3.
 
I was watching the presentation on OpenGL scheduling in the "reducing driver overhead to zero" talk (I watched the portion presented by Nvidia's John McDonald), and the gist I got was that with a high-level API, scheduling is entirely at the whim of the driver. This is true for both D3D and OpenGL as things stand: the driver handles scheduling entirely. The problem with this? Small inefficiencies compound over time, and CPU efficiency gets dragged into the mud unless the developer is really careful. This can be overcome, but not easily. With a low-level API, by contrast, it's a joint effort between the game developer *and* the driver. It seems like that would make things more difficult, although I'm sure it would ease over time. So it seems like every game will be a cooperative effort going forward. Previously that wasn't necessarily the case, although both AMD and NV would have input with devs on how to improve performance and whatnot.

It's just strange, though. I'm not a developer, and I don't know if anyone can comment on the above, but it seems like it would make things much tougher on both the developer AND the driver teams, for both Mantle and DX12. They basically have to coordinate their efforts to make one game work: developer and driver team. Which I'm all for, because we ALL want more performance. But it's going to require more money thrown at software engineering and development by both Nvidia and AMD. Should be interesting to see how it all pans out in the end.
 
It's actually an optional feature; they're not required to support it.
Not sure what you're responding to; I never said it was a mandatory feature.

Pretty obvious it's optional, since both AMD and Nvidia skipped out on implementing it driver-side for so long...
 
I moved from a GTX 780 Ti to 2 x R9 290 Tri-X and now back to a GTX 780 Ti GHz edition. Pretty much seen it all.

Both companies have great hardware, but I must admit that Nvidia has a larger pool of games for which it has optimized its drivers.

I can see one reason why AMD would push Mantle. Mantle would greatly reduce the fuss of updating drivers for every new game release and optimizing to squeeze out every last bit of performance. AMD can save all that capital by leaving things up to developers like DICE.

My reason for switching back to the GTX 780 Ti was no fault of AMD's. It was more DICE's messy implementation of Mantle: the game was just unplayable with CrossFire enabled under Mantle. There were crashes every so often. DICE needs to get its act together and stop releasing beta versions. By the time BF3 matured and got playable, it was replaced by BF4. I hope the same does not happen again, and that BF4 has a longer cycle than BF3.

I think we won't be seeing a BF5 for a long, long, long time. Even if they announce it, I imagine that after the huge backlash over how much EA/DICE fucked it up, we won't see BF5 until at least late 2015 or 2016. So who knows what the landscape will be like.
 
Yeah, for me personally BF3's launch was good. BF4? Ugh. The first month was pure hell. I'd get an hour or so of enjoyment here or there, followed by the strangest bugs and crashes. Then patch after patch would be released, with each patch fixing one thing and breaking four others.

Seriously, fuck EA for rushing BF4 out the door. Definitely not going through that nonsense again with BF5.
 
Ah yes, the one from Steam Dev Days. The approach is interesting: instead of trying to maximize the number of draw calls, they try to reduce the need for draw calls and make the calls that are required a lot lighter.
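To make "fewer, lighter calls" concrete, here is a hedged sketch of the kind of technique those talks covered (not code from the talk itself): pack per-draw parameters into a GPU buffer and replace N individual draw calls with one indirect multi-draw. Assumes an OpenGL 4.3+ context with a VAO and index buffer already bound, and a loader such as GLEW or glad.

```cpp
// AZDO-style batching: N draws collapsed into a single API call.
#include <GL/glew.h>
#include <vector>

// Layout mandated by OpenGL for indirect indexed draws.
struct DrawElementsIndirectCommand {
    GLuint count;         // indices in this draw
    GLuint instanceCount; // usually 1
    GLuint firstIndex;    // offset into the index buffer
    GLuint baseVertex;    // offset into the vertex buffer
    GLuint baseInstance;  // lets the shader fetch per-draw data
};

void SubmitBatch(const std::vector<DrawElementsIndirectCommand>& cmds)
{
    // Upload all per-draw parameters to the GPU once...
    GLuint indirectBuf = 0;
    glGenBuffers(1, &indirectBuf);
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuf);
    glBufferData(GL_DRAW_INDIRECT_BUFFER,
                 cmds.size() * sizeof(DrawElementsIndirectCommand),
                 cmds.data(), GL_STATIC_DRAW);

    // ...then issue one call instead of cmds.size() separate draw calls.
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                nullptr, (GLsizei)cmds.size(), 0);
}
```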
 
Considering Nvidia is able to improve their performance so much in games that use Mantle, I'm wondering if they are taking advantage of the Mantle API themselves? AMD did say that Mantle only requires a certain feature set.

Now, how did they gain the knowledge? Is it something that they could just figure out? It doesn't seem like there was enough time for that, though.

Nvidia can't just go steal intellectual property and call it their own through a driver update, so I'd say that's pretty much impossible. DX11 has been out a good while (since Windows 7's release), and no one knows how long this driver was in development. You can't assume they started on it only after finishing the last one.
 
What IP would they have to steal?

So you think they've been working on a driver that dramatically boosts performance in Mantle-capable games, but it has nothing to do with Mantle?
 
What IP would they have to steal?

So you think they've been working on a driver that dramatically boosts performance in Mantle-capable games, but it has nothing to do with Mantle?

Mantle is AMD IP.
You're suggesting Nvidia is secretly using Mantle.

Do the math.

Or is this another one of those attempts at arguing for the sake of arguing without actually disagreeing, like you did in the other thread?

And who said this driver is only for Mantle-capable games? They're comparing their performance to Mantle. You realize the games they're testing also have a DX11 path, right?

How do you suggest they get a direct comparison to Mantle if they test games that only have a DX path?
 
I'm arguing? You responded to me. If you don't want to discuss something, then don't start it.

If the features are there, then why can't Nvidia, or anyone else, use them? It doesn't mean they are infringing on anyone's IP, unless they gained the information needed by illegal means. I'm not suggesting that, though. As far as seeing similar improvements in non-Mantle games, I guess we'll see. If they do, then my hypothesis will be disproved.
 
I'm not sure what you mean by "if the features are there." Mantle is not some open-source API that anyone can just use on a whim. Whether or not AMD decides to make it so in the future is another matter. So yes, if Nvidia reverse engineered their drivers to take advantage of Mantle and called it DirectX, it most certainly would be infringing on AMD's IP, and I'm quite certain it would violate their terms with Microsoft as well if they're using Mantle and calling it DirectX.

I can tell from your recent post history that you like Mantle and want it to be the holy grail of APIs. That's fine, you're entitled to that, but it seems this Mantle love has caused you to get a bit carried away with your imagination. It's quite easy to see that your theory is nowhere near plausible if you take the time to think about what you're saying.
 
You should stick to responding to what I say rather than what you think I'm thinking.

As I said, we'll see if they show similar gains in non-Mantle-optimized games.
 
So you think they've been working on a driver that dramatically boosts performance in Mantle-capable games, but it has nothing to do with Mantle?
Uh, of course it has nothing to do with Mantle? These improvements only apply to DX 11 rendering engines.

Unless, that is, you're suggesting DX11 somehow has some of the same potential hardware hooks as Mantle, and everyone has somehow neglected to use them until now :p
 
Nvidia has a fair bit of low-level optimization of its own. Programs like GameWorks are part of this initiative, aimed precisely at working with game studios so they can add hooks for special features available in Nvidia's particular architecture, as well as leveraging PhysX.

Nvidia does not need to use Mantle. And besides, AMD has not open-sourced Mantle so far. It's geared toward their GCN architecture, and there's still some work left to make it support more platforms.
 
Uh, of course it has nothing to do with Mantle? These improvements only apply to DX 11 rendering engines.

Unless, that is, you're suggesting DX11 somehow has some of the same potential hardware hooks as Mantle, and everyone has somehow neglected to use them until now :p

Well, there's something that just happened recently. Look at the improvements with Nvidia and Win 8.1 in BF4. It's possible that some of the optimizing they've done in the base code is also showing improvements in the DX render path with thread scheduling? I don't know, but we don't see these types of gains from a mere performance driver update. Especially on an arch that's as old as Maxwell.
 
Uh, of course it has nothing to do with Mantle? These improvements only apply to DX 11 rendering engines.

Unless, that is, you're suggesting DX11 somehow has some of the same potential hardware hooks as Mantle, and everyone has somehow neglected to use them until now :p

I wouldn't be surprised, as 99.9% of the software released in the PC sector was single-threaded until AMD took over the consoles. Now, with the push to multithreaded software, and with multithreaded hardware as the lowest common denominator, everyone is suddenly jumping on the bandwagon because they don't want their products to look old and antiquated. They also realize they can sell the new tech to consumers and finally get us to retire our old PCs.

So yes, I bet there are lots of things software engineers could have been taking advantage of that weren't utilized. I would also bet that AMD's tech has been baked into DX12, aka DX11 + Mantle. The best thing that could happen is Linux getting drivers that make porting even easier than before by using this technology. That would be a boost for SteamOS.
 
You should stick to responding to what I say rather than what you think I'm thinking.

As I said, we'll see if they show similar gains in non-Mantle-optimized games.

I can respond to both, while you continue to come up with your conspiracy theories then resort to a "we'll see what happens" post when you're challenged.
 
Well, there's something that just happened recently. Look at the improvements with Nvidia and Win 8.1 in BF4. It's possible that some of the optimizing they've done in the base code is also showing improvements in the DX render path with thread scheduling? I don't know, but we don't see these types of gains from a mere performance driver update. Especially on an arch that's as old as Maxwell.

The BF4 + Win 8 performance improvement was a combination of DX 11.1 (note the .1), which Windows 7 doesn't fully support, and better thread scheduling in Windows 8. Some of the Windows 8 gains could be recovered in Windows 7 by disabling Core Parking. I never tried it myself, as I upgraded to 8 shortly after BF4 was released, but many reported a performance gain in Windows 7 by doing it.
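For anyone curious, the commonly cited way to do that was through powercfg's power-plan aliases (a sketch of the usual commands; CPMINCORES is the documented alias for "Processor performance core parking min cores," but double-check on your own machine before relying on it):

```
:: Run from an elevated command prompt: force 100% of cores to stay
:: unparked on the active power plan, then re-apply the plan.
powercfg -setacvalueindex scheme_current sub_processor CPMINCORES 100
powercfg -setactive scheme_current
```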
 
The BF4 + Win 8 performance improvement was a combination of DX 11.1 (note the .1), which Windows 7 doesn't fully support, and better thread scheduling in Windows 8. Some of the Windows 8 gains could be recovered in Windows 7 by disabling Core Parking. I never tried it myself, as I upgraded to 8 shortly after BF4 was released, but many reported a performance gain in Windows 7 by doing it.

To echo what you're saying, upgrading from Win 7 to Win 8 is like night and day for multithreaded processors like my FX-9370, in any task. My buddies with Intel chips got a nice boost from doing so too. The ones who stuck with Win 7 got pissed at DICE because they had to do the core parking and other tricks to get close to Win 8 performance. If you die in a PvP game, you have to blame the game's performance first. :) So many headaches could have been avoided if they had just accepted the new Start menu.
 
From Tom's:
To its credit, Nvidia did clarify that its new driver has a higher average frame rate, but still has more slow frames than Mantle – something that Nvidia will continue to chip away at.

Plus, the benchmarks were run on a 6C/12T i7-3930K with a yet-to-come, highly optimized driver vs. a first Mantle effort. Yeah, hardly a Mantle killer.
 
continue to come up with your conspiracy theories then resort to a "we'll see what happens"

The conspiracy theories and circumstantial, pieced-together evidence are hilarious. I say let them keep coming. Some of the stuff I see coming from rabid A-corporation fans is just mind-blowing. Fortunately, most fans are level-headed people, but the truly "out there" guys? Wow. It's just funny, and who doesn't need a laugh now and then? Might as well sit back and laugh instead of arguing with something past the crazy threshold.
 
I'd wait to see the batch count and unit count, since that's what matters in Star Swarm.
 
Considering Nvidia is able to improve their performance so much in games that use Mantle, I'm wondering if they are taking advantage of the Mantle API themselves? AMD did say that Mantle only requires a certain feature set.

Now, how did they gain the knowledge? Is it something that they could just figure out? It doesn't seem like there was enough time for that, though.

So you are saying that NVIDIA (in the span of a month or so) was able to get Mantle running on their cards better than AMD cards?

Even though Mantle was written for a very specific architecture and the source code has not been released.

Wow! That is impressive.
 
I'd wait to see the batch count and unit count, since that's what matters in Star Swarm.

Motion blur is the million-dollar question. The batch count jumps up something like eightfold when it's enabled.

For all I know, NV is flexing about their performance gains at 20,000 batches when motion blur will easily push it over 100k in heavy scenes. Can they still be completely GPU-limited on a weaker CPU?
 
Question: why is Star Swarm's motion blur implemented in such an inefficient way in the first place?

Most games just use a post-processing shader based on depth-buffer information, which looks fine and doesn't slag your graphics card...
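For reference, a minimal sketch of that cheap depth-buffer approach (along the lines of the classic GPU Gems 3 camera motion blur; the uniform names are illustrative, not from any particular engine): reconstruct each pixel's position from depth, reproject it with last frame's view-projection matrix, and blur along the resulting screen-space velocity.

```glsl
#version 330 core
// Post-process camera motion blur from the depth buffer only.
uniform sampler2D uScene;      // the rendered frame
uniform sampler2D uDepth;      // its depth buffer
uniform mat4 uInvViewProj;     // current frame: clip space -> world space
uniform mat4 uPrevViewProj;    // previous frame: world space -> clip space
in  vec2 vUV;
out vec4 fragColor;

void main() {
    // Reconstruct this pixel's world position from depth.
    float depth = texture(uDepth, vUV).r;
    vec4 clip   = vec4(vUV * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);
    vec4 world  = uInvViewProj * clip;
    world /= world.w;

    // Where was that point on screen last frame?
    vec4 prevClip = uPrevViewProj * world;
    vec2 prevUV   = (prevClip.xy / prevClip.w) * 0.5 + 0.5;

    // Average a few taps along the screen-space velocity.
    vec2 vel   = (vUV - prevUV) / 8.0;
    vec3 color = vec3(0.0);
    vec2 uv    = vUV;
    for (int i = 0; i < 8; ++i) {
        color += texture(uScene, uv).rgb;
        uv += vel;
    }
    fragColor = vec4(color / 8.0, 1.0);
}
```

Note this only captures camera motion; per-object blur for thousands of moving ships needs a velocity buffer or a geometry-based approach, which is presumably why Star Swarm's version balloons the batch count instead.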
 
I look at the Star Swarm demo and can't help but wonder what in the fuck. The graphics are so-so at best. Actually, I'm being generous with that statement; I find the graphics woefully unimpressive. Yet here we are with most systems getting 10 fps in DirectX. Sounds like a great reason to skip the fuck out of this game, unless they make a more serious effort with D3D.
 
Question: why is Star Swarm's motion blur implemented in such an inefficient way in the first place?

Most games just use a post-processing shader based on depth-buffer information, which looks fine and doesn't slag your graphics card...

I'm guessing it was half a "because we can" move, but it sure looks good.

I look at the Star Swarm demo and can't help but wonder what in the fuck. The graphics are so-so at best. Actually, I'm being generous with that statement; I find the graphics woefully unimpressive. Yet here we are with most systems getting 10 fps in DirectX. Sounds like a great reason to skip the fuck out of this game, unless they make a more serious effort with D3D.

Their DX path is fast, but they're literally pushing the worst-case scenario for the API.
 