How can Nvidia compete against Mantle-enabled AMD cards?

Nvidia won't be competing in games that support Mantle, that's pretty much a given. The advantages gained from having a low-level API are just too great for graphics-intensive games. So the only real question that remains is how many game engines will actually use it. I personally suspect all the big ones will, and depending on how hard it is to work with (not very, I suspect), even some smaller titles might use it. It's really not all that much work to support another API in a game.
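To make the "low-level" point a bit more concrete: the pitch is basically about where the per-draw-call cost lives. Below is a rough, hypothetical C++ sketch; to be clear, none of these types or function names are real Mantle or Direct3D calls, it's just an illustration of "call into a thick driver every draw" versus "record a command buffer up front and submit it once".

```cpp
// Hypothetical illustration only: none of these types or calls are the real
// Mantle or Direct3D APIs. The point is where the per-draw-call cost lives.
#include <cstdio>
#include <vector>

struct DrawCall { int meshId; int materialId; };

// "High-level" style: the driver does validation/translation work on every call.
void driverDrawIndexed(const DrawCall& dc) {
    // imagine state validation, shader patching, allocation tracking here
    volatile int overhead = dc.meshId * 31 + dc.materialId;  // stand-in for driver work
    (void)overhead;
}

// "Low-level" style: the engine records commands into a buffer up front,
// then submits the whole buffer once; per-draw overhead is mostly just a copy.
struct CommandBuffer { std::vector<DrawCall> commands; };

void recordDraw(CommandBuffer& cb, const DrawCall& dc) { cb.commands.push_back(dc); }
void submit(const CommandBuffer& cb) { std::printf("submitted %zu draws in one go\n", cb.commands.size()); }

int main() {
    const int kDraws = 10000;

    // DX11-era pattern: 10,000 separate trips through the driver.
    for (int i = 0; i < kDraws; ++i) driverDrawIndexed({i, i % 8});

    // Mantle-style pattern (as described publicly): record once, submit once.
    CommandBuffer cb;
    for (int i = 0; i < kDraws; ++i) recordDraw(cb, {i, i % 8});
    submit(cb);
}
```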

Are there any benchmarks out already?

For all the talk about low-level APIs, the Xbox One still has a ridiculous number of games not running at 1080p, with a 7770-class GPU on board.
 
As someone who has been in both camps, the AMD fanboys in this thread are outrageous... Mantle SOUNDS good... in theory... We do not have anything solid on it, and the AMD fanboys are praising it as something revolutionary; really, c'mon. It's good to see AMD in a position to be a lot more directly competitive with nVidia than they have been, but what actually happens we can only wait and see, not just scream "ZOMG Mantle is the BEST thing EVAR, nVidia can't do ANYTHING at all!!!!!!!!"
 
If AMD has to spend $8,000,000 to get one game to run better on their hardware, I don't think NVIDIA has anything to worry about. Especially when that one game won't be launching with Mantle anyway.

AMD investors should be plenty worried.
 
Most major PC games are on all consoles as well, so in theory Mantle will be used by default. The same can't be said for Nvidia's future Mantle-clone.
 
I wonder why Mantle isn't being used at BF4 launch, when all the review websites write their benchmarks (and people buy video cards based on those reviews).
Of course you can say "it's not ready," but hey, doesn't AMD have a beta Mantle driver those reviewers could use for the benchmarks?
If AMD doesn't have a beta in November, they won't have a final version in December.

I have a 7970 and hope that Mantle will rock, but I'm skeptical. Not having a working beta at BF4 launch is a big, big failure. As if they are hiding something.
 
Most major PC games are on all consoles as well, so in theory Mantle will be used by default. The same can't be said for Nvidia's future Mantle-clone.

That is the battle AMD will be trying to win here. If games are developed on Mantle, they will be optimized for consoles and every PC with a recent AMD graphics card. Then they will port that work over to DirectX for users of old AMD hardware and Nvidia users.
 
Most major PC games are on all consoles as well, so in theory Mantle will be used by default. The same can't be said for Nvidia's future Mantle-clone.

But wouldn't NVIDIA just work to get their clone supported in all the top engines? I'm not sure how it's going to be much different in practice.
 
Good grief, people....
nVidia doesn't have to do a damn thing right now or start a hurry-up campaign to develop their own competing low-level API. Mantle isn't even out yet, so no one knows what the performance difference will be, or how much dev support and implementation it will actually see.

Maybe it will be the magic bean that gives 20+% increases in games utilizing it. Then again, maybe it will end up like LucidLogix's Virtu and show performance decreases while providing a metric fuck-ton of instability issues with certain titles.

Maybe nVidia will just build a more versatile CUDA L/API that can somehow interface with OpenCL, OpenGL, and DirectX to accelerate those instruction calls. I doubt it can be done. Fuck, who knows? But the same things apply to an nVidia L/API if they made one... could be great, could be dog shit.

There are a lot of maybe statements about Mantle and what so-and-so has to do to combat it, but success all depends on the real world performance benefit (hopefully not a deficit) and a huge dev adoption rate.

In the meantime, please take a deep breath, unpucker, relieve this self-induced unwarranted stress, don't get your panties in a wad over something that's not even being used yet, and go on just as you did before Mantle was announced: by enjoying your current PC games.
 
Mantle is most likely already working on the consoles, but it will take some work to port to the PC since there are many GCN cards it needs to support. That's why it's delayed on BF4 for PC.
 
But wouldn't NVIDIA just work to get their clone supported in all the top engines? I'm not sure how it's going to be much different in practice.


Mantle will theoretically be supported nearly by default, since AMD hardware is already in the next-gen consoles. That's the leverage AMD will have to get this into games. Nv will have to throw money at it nearly every single time to get their clone in, same as they currently do for GPU PhysX.

Pretty sure Mantle is going to be used by many games ported from consoles. How well Mantle ends up working, both in performance and as a marketing tool, is the real question. I don't think adoption by the multi-platform devs is really in doubt, especially as the upcoming consoles start to mature and need any performance boost they can get.
 
The best way to tell will be after December, when it's patched into BF4 and there's enough information on the 290. I was surprised that during the press conference AMD didn't say crap about the 290 other than that it exists, so I guess they expected to run it on hopes and dreams.

I'm skeptical because it's AMD tech, and if it's anything like their driver support there's going to be a performance hit; an AMD software performance hit would probably keep it subpar to Nvidia, so overall there's not going to be much of a change.

I'm not at all concerned about consoles or porting over from consoles, because it's been happening for years, so it's nothing new. Unless Nvidia takes a massive nosedive somehow (like BlackBerry), I'd be surprised if anything changes from the current status quo.
 
Mantle will kill Nvidia in games where it's used. Crossfire will scale to 90-100% across 4 GPUs at all times, as multi-threading will be possible with the additional CPU power at its disposal; 200+ extra fps possible. Mantle will unleash unused, untapped CPU power. The more cores you have, the merrier. It's going to destroy SLI. Scaling will be deadly. Betting on at least 30-80 more fps in most games on a single GPU. Minimum framerates will be doubled, meaning higher lows. It's going to take over once everyone sees the benchmarks. Mantle is the holy grail. Microsoft might send in the dogs somehow though, if it becomes too successful ;)
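For what it's worth, the one technically plausible part of that is the threading: a thinner API could let several CPU cores record rendering commands in parallel instead of funnelling everything through one driver thread. Here's a made-up C++ sketch of that pattern; the types and calls are invented stand-ins, not actual Mantle (or any real) API.

```cpp
// Hypothetical sketch of multi-threaded command recording; the type and
// function names are made up, not actual Mantle (or any real) API calls.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct DrawCall { int objectId; };
struct CommandBuffer { std::vector<DrawCall> commands; };

// Each worker thread records commands for its own slice of the scene into
// its own command buffer, so no locking is needed during recording.
void recordSlice(CommandBuffer& cb, int first, int count) {
    for (int i = 0; i < count; ++i)
        cb.commands.push_back({first + i});
}

int main() {
    const int kObjects = 40000;
    const unsigned kThreads = std::max(1u, std::thread::hardware_concurrency());

    std::vector<CommandBuffer> buffers(kThreads);
    std::vector<std::thread> workers;

    const int perThread = kObjects / static_cast<int>(kThreads);
    for (unsigned t = 0; t < kThreads; ++t)
        workers.emplace_back(recordSlice, std::ref(buffers[t]),
                             static_cast<int>(t) * perThread, perThread);
    for (auto& w : workers) w.join();

    // A single submission thread would then hand all buffers to the GPU queue.
    std::size_t total = 0;
    for (const auto& cb : buffers) total += cb.commands.size();
    std::printf("recorded %zu draws across %u threads\n", total, kThreads);
}
```

Whether that actually turns into doubled minimums or 200 extra fps in a real game is exactly the part nobody has numbers for yet.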

You should work for AMD marketing. I hope you are right!!
 
I'd be surprised if you saw Mantle give more than sub-10% gains in general. I'd expect a few percent at most, outside of "best case" showcases. If so, they'll probably make it hard to compare directly. In practice it's going to be impossible to compare, say, DX vs Mantle in a game, since it's hard to know if they're doing the same work.
 
Unless I missed something, Nvidia will have to rely on the slower DirectX 11/OpenGL interfaces for games, while Mantle GPUs get to leverage console ports from popular game engines like Frostbite.

Even though I am sure all games will still be designed to run on DirectX, engines like Frostbite 3 are going to be EXTREMELY popular and support a large number of triple-A titles. And that won't be the only large engine that adopts Mantle into its builds, as an Activision guy tweeted (even if he does not look forward to the increased workload).

http://www.neogaf.com/forum/showthread.php?t=688033


And which card would enthusiasts prefer to buy? The one that only gets DirectX 11 optimizations, or the one that can tap into increased performance with the same or vastly cheaper hardware by using Mantle in popular games?


What about you nvidia guys on these forums, I know plenty of you prefer nvidia for whatever reason, what will YOU be upgrading to for your next cards?

Nvidia should go ahead and try to sell off their assets to AMD now before they go out of business.
 
I'd be surprised if you saw Mantle give more than sub-10% gains in general. I'd expect a few percent at most, outside of "best case" showcases. If so, they'll probably make it hard to compare directly. In practice it's going to be impossible to compare, say, DX vs Mantle in a game, since it's hard to know if they're doing the same work.

Is that your honest opinion as a programmer, or is it a poor attempt at downplaying Mantle? (I am asking if it's your honest opinion and what you base this on. If it's a pathetic attempt at downplaying, don't bother answering.)

Do you actually think that an API Dice and AMD are going to pitch to developers will not contain improvements that will make it worthwhile to use?

In the Frostbite 3 game engine itself, there is a "custom build & heavily optimized rendering pipeline". According to AMD's slides, there are new rendering techniques. They've spent two years co-developing this API with Dice (in collaboration with top game developers not yet revealed). Do you really think we are talking about sub-10% in general and a few percent at most outside of "best case" showcases? That kind of improvement is something we already get from driver improvements alone.

If that were the case, what would be the point of creating a new API?

Common sense dictates the improvements will be substantial. Dice have already said we can switch between DX 11 and Mantle, so it can be measured on the same hardware easily. I personally look forward to the December Mantle patch benchmarks. As a gaming enthusiast I find it very interesting.
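Since the toggle supposedly lives in the same build, the comparison worth doing is per-frame times under each renderer, not just an average FPS number. A minimal sketch of that kind of logging is below; renderFrame() is just a placeholder workload, and nothing here is BF4- or Mantle-specific.

```cpp
// Minimal frame-time logger sketch: run the same scene once per renderer
// ("D3D11" vs "Mantle" here are just labels) and compare the results.
// renderFrame() is a stand-in for whatever the game actually does per frame.
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

void renderFrame() {
    // placeholder workload so the sketch runs standalone
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
}

void benchmark(const std::string& label, int frames) {
    std::vector<double> ms;
    ms.reserve(frames);
    for (int i = 0; i < frames; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        renderFrame();
        auto t1 = std::chrono::steady_clock::now();
        ms.push_back(std::chrono::duration<double, std::milli>(t1 - t0).count());
    }
    double sum = 0, worst = 0;
    for (double m : ms) { sum += m; if (m > worst) worst = m; }
    std::printf("%s: avg %.2f ms, worst %.2f ms over %d frames\n",
                label.c_str(), sum / frames, worst, frames);
}

int main() {
    benchmark("D3D11", 200);
    benchmark("Mantle", 200);  // in reality you'd relaunch with the other renderer
}
```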

[Attached: AMD Mantle presentation slides]
 
Yeah, people do not read so well and don't use their thinking caps much. AMD did not spend $8 mil on ONE game; they spent it to collaborate with EA/DICE to get it into all their upcoming games (as well as Activision and some unnamed ones). So yeah, AMD did a lot of the grunt work and spent a good chunk of coin to make it happen, BUT EA is one of, if not the, largest game publisher on the market, with a massive chunk of IP as well as subsidiaries and "friends", so this is not something to dismiss, honestly. I am pretty sure if they only saw a 10% boost they would not even bother to talk about it, they would just do it behind closed doors; the ramifications of this will be huge.

Linux, games, SteamOS, and, as mentioned, more than likely a very pronounced and beneficial performance boost for their CPUs as well (offloading to the GPU in one aspect). So yeah, even if it were, say, a 10% raw performance gain on the GPU side, it could also reduce CPU loading by 10% or so and make Crossfire/Eyefinity that much easier to scale, since devs could split workloads however they see fit. Anyways, the point is, 10% as a bare number means nothing, but 10% taken in the context it may apply to could very well be like the performance difference between a Pentium D and an i7-980X, or like a lawnmower versus a sports car. A big bump in efficiency means that whatever power you do have simply equals more available grunt to be used more effectively, no matter what is being used.
 
While I've had good experiences with ATI in the past, I still hate their drivers and little things like not being able to force vsync in DirectX games. I don't mind going back hardware-wise, but I don't feel like having to run 5 different 3rd party graphics utilities all of the time either.
If Mantle really does turn out well (and I still question if it will), hopefully their driver team will finally modernize their feature sets.
 
While I've had good experiences with ATI in the past, I still hate their drivers and little things like not being able to force vsync in DirectX games. I don't mind going back hardware-wise, but I don't feel like having to run 5 different 3rd party graphics utilities all of the time either.
If Mantle really does turn out well (and I still question if it will), hopefully their driver team will finally modernize their feature sets.

Yeah, I have no idea what the fuck is going on with their control panel forced vsync. It has been like that since the freaking dawn of time, and is infuriating in games like Dead Space 1-3 (whereas nv's force vsync works).
 
Frankly? Nvidia was the direct competitor of 3dfx back in "the day" when proprietary graphics APIs were the ONLY way you could game in 3D.

Honestly? Nvidia could have attempted to lock down the graphics market any number of times with a proprietary API, but it was exactly their decision to support D3D natively and work with MS that buried 3dfx in the first place.

PC gamers on the whole will not be happy campers when it becomes "if you don't buy AMD you can't play x or y games at all."

And there is nothing on earth stopping Nvidia from responding with the same thing and having their own API.

Hell, for all I know it might be better for gaming if we actually had two low level APIs and that was it. It might make it easier for everyone.

But whatever. AMD has now broken the unspoken vow to remain open and all bets are off. Let the backstabbing wars begin.
 
Odd, I use vsync forced on in Catalyst Control Center and it works fine (granted, I only use a single card).

That is one thing they need, though: a specific place on their site for suggestions/issues might help them sort out what needs to be done better.
 
I seriously doubt this will be anything game-changing, considering... if EA even looked at the Steam info posted monthly, they'd be idiots to give the minority company a performance boost over the nVidia camp. According to the Steam Hardware Survey, nVidia owns 51% of the market and AMD has 32%, just as Intel owns 73% and AMD just 26% of the CPU market.

That would be like announcing a major leap in AMD architecture to increase performance of Windows 8... oh wait, AMD did do that for Windows 8, didn't they? Really did them some good, didn't it?

Why would you want to give just 32% of the gaming market a boost in performance and not the side that owns 51%? From a business standpoint, unless the increase is moot, it doesn't make much sense to co-develop with the underdog on just one game.

http://store.steampowered.com/hwsurvey/

P.S.

FYI, of the top 10 DX11 GPUs in the survey, eight are GeForce, one is Intel, and the other is an HD 5770.
 
lol advil, you are saying Nvidia is not doing the same thing? 3dfx tried to lock some things down to make sure stuff had a certain compatibility for better performance (hmm, sounds like Nvidia to me, through and through), and they screwed up by spending far more than they were making, bringing too many things under their own control when they should not have. Nvidia, with say CUDA or even better PhysX, is locked down like crazy (most of the time for no reason at all but to "ensure compatibility"). They didn't "work with MS" to bury 3dfx; 3dfx buried themselves, and Nvidia bought the scraps of the inferno that took place.

Nvidia has not once been about being open and fair in any way, shape, or form (they control and set the price and cop out big time if something happens, even when they are truly at fault): not to business, not to the consumer, not even to those who make the products they sell to everyone else. You seriously need to get your facts straight, because you are way off the mark.

Jen-Hsun Huang (CEO of Nvidia) used to be a chip designer at AMD before he co-founded Nvidia in 1993.
3dfx started in 1994; upon their demise, most of their assets were bought by Nvidia.
ATI (now AMD's graphics division) started in 1985; their Rage Pro line was one of the first viable alternatives to 3dfx's 3D-only Voodoo chipset.

So yeah, anyways, Nvidia is purely a business: they sit on their laurels waiting for others to make the move to certain features or hardware designs before they swoop in and act like they are the best at it. Anyways, I'm not fighting about this; it's just wicked interesting history if you look deep enough into it and see all the good and bad that all these companies have done. Honestly, Nvidia, just like Intel, is one of the bullies of the tech world, but in the GPU case most of the fancy designs they have done came from assets acquired from 3dfx and were just advanced from there; most of the hardware or features used (say, GDDR memory and such) were not made by them at all, only adopted once others had worked out the kinks.

Well, the fact is ATI as well as Nvidia supported MS and D3D (DirectX and all the various parts of it that are part of Windows). Technically, from my reading, Nvidia fully supported Direct3D 7.0 on the PC side, but ATI was around before that point, so technically ATI was the first to support MS and D3D in general.

DirectX, however, is a different story, as it (along with a version of Win95) was used on the Xbox and Xbox 360, namely a tuned version of DirectX 8.1 as the base for the original Xbox.

Anyways, long story short: yeah, this may affect some PC gamers, but Nvidia has been doing this for at least a decade or so, so big whoop. How many games that have PhysX in them play well if you DON'T use Nvidia? Not many, I can assure you. So AMD has not broken any vow that Nvidia has not broken time and time again; worse, Nvidia has gone out of their way to castrate performance on purpose even when it did not boost their own performance. So yeah, AMD certainly is not the bad guy here; better yet, Nvidia is far from a saint, so it's best not to act as if they are :)

Just to top it off: standard APIs are great, which is why DX will continue to be supported on Windows machines and from a console standpoint, so that doesn't matter. But from the standpoint of getting the most out of the hardware, custom APIs are not going to hurt anyone using the competition in any meaningful way, not like OpenGL or 3dfx Glide did back when you very much needed wrappers and workarounds to "try" to get non-compliant hardware working properly.

If it's Windows, it has to use DX or be able to use it, period, or there would be major lawsuits AMD would be paying out. Console or Steambox (Linux), well, that's its own ballgame. It's nice if the tables turn on the company who shat on everyone, for a change, IMHO.
 
Yeah, and those polls mean nothing; many do not take them, many are way out of date, and so on and so forth. Nvidia has a higher % of the market for dedicated GPUs, but barely now, and Intel is still the most prevalent for graphics overall, period. Nvidia I believe was 48% and AMD 42% of the total not that long ago, but this is changing due to APUs and more overall sales on AMD's side. But the point stands: devs will not drop complete support for Nvidia and gamers in general; the business is far too large, some $20+ bln/year, for them to do some BS like that. Yeah, they will support as much as possible, and this means DX, OpenGL, and soon Mantle added to the mix for "supported" cards, meaning GCN gets Mantle/DX/GL and everything else gets DX/GL.
 
Yeah, and those polls mean nothing; many do not take them, many are way out of date, and so on and so forth. Nvidia has a higher % of the market for dedicated GPUs, but barely now, and Intel is still the most prevalent for graphics overall, period. Nvidia I believe was 48% and AMD 42% of the total not that long ago, but this is changing due to APUs and more overall sales on AMD's side. But the point stands: devs will not drop complete support for Nvidia and gamers in general; the business is far too large, some $20+ bln/year, for them to do some BS like that. Yeah, they will support as much as possible, and this means DX, OpenGL, and soon Mantle added to the mix for "supported" cards, meaning GCN gets Mantle/DX/GL and everything else gets DX/GL.

Steam Hardware & Software Survey: September 2013

Out of date? By what.... 5 days?
 
I'm not so sure the next-gen consoles will be using Mantle.

If they are... and BF4 is only 720p/60fps on PS4/Xbone... which is roughly equivalent to 1080p/30fps... and apparently at medium graphics settings...

But then again, they are essentially a $400 PC.
 
I totally agree with the OP. If Nvidia stops making newer GPUs, there is no way it can compete against Mantle. :rolleyes:

Seriously, this thread?
 
Frankly? Nvidia was the direct competitor of 3dfx back in "the day" when proprietary graphics APIs were the ONLY way you could game in 3D.

Honestly? Nvidia could have attempted to lock down the graphics market any number of times with a proprietary API, but it was exactly their decision to support D3D natively and work with MS that buried 3dfx in the first place.

PC gamers on the whole will not be happy campers when it becomes "if you don't buy AMD you can't play x or y games at all."

And there is nothing on earth stopping Nvidia from responding with the same thing and having their own API.

Hell, for all I know it might be better for gaming if we actually had two low level APIs and that was it. It might make it easier for everyone.


But whatever. AMD has now broken the unspoken vow to remain open and all bets are off. Let the backstabbing wars begin.

Frankly, Nv embraced D3D and OGL because they had to. Without them, Nv would have had little chance in the PC market if they came forward with their own API. 3DFX all but ruled during that time frame.

Honestly, Nv has tried to put forward its own API and lock out the competition. It's called CUDA. Successful in the scientific community, but for gaming it failed, outside of being a marketing tool to tie their PhysX middleware to. Plenty use the CPU version of PhysX; almost no one uses the GPU version without being paid to, or without Nv coding the physics into the game for them.

The only thing keeping Nv from putting out its own API is that they would have to pay the devs, or code it in themselves, to get anyone to use it. Same as PhysX. They will put one out anyway, I believe, but it will just be another marketing tool.

No one would be happy with only one player in the GPU market. We had close to that during the Nv 8xxx series of cards when ATI was not competitive for a couple years. We all saw what happened to prices.

There is obviously no unspoken vow, rule, or agreement between them. Nv tried to go that way itself with CUDA and PhysX.
All AMD is doing is making a lower-level interface to get more out of the console hardware Sony and MS are buying. The optimizations made using it apparently transfer pretty directly to AMD's newest and upcoming PC hardware. They are leveraging that for a performance and marketing boost. How is this any different from what Nv did with CUDA and PhysX?
 
Frankly, Nv embraced D3D and OGL because they had to. Without them, Nv would have had little chance in the PC market if they came forward with their own API. 3DFX all but ruled during that time frame.

Honestly, Nv has tried to put forward its own API and lock out the competition. It's called CUDA. Successful in the scientific community, but for gaming it failed, outside of being a marketing tool to tie their PhysX middleware to. Plenty use the CPU version of PhysX; almost no one uses the GPU version without being paid to, or without Nv coding the physics into the game for them.

The only thing keeping Nv from putting out its own API is that they would have to pay the devs, or code it in themselves, to get anyone to use it. Same as PhysX. They will put one out anyway, I believe, but it will just be another marketing tool.

No one would be happy with only one player in the GPU market. We had close to that during the Nv 8xxx series of cards when ATI was not competitive for a couple years. We all saw what happened to prices.

There is obviously no unspoken vow, rule, or agreement between them. Nv tried to go that way itself with CUDA and PhysX.
All AMD is doing is making a lower-level interface to get more out of the console hardware Sony and MS are buying. The optimizations made using it apparently transfer pretty directly to AMD's newest and upcoming PC hardware. They are leveraging that for a performance and marketing boost. How is this any different from what Nv did with CUDA and PhysX?

Nvidia has a hardware solution for Mantle... it works with any game. Testing has begun. :eek:
 
The problem for Nvidia is not Mantle, it's the fact that no next-gen console is using an Nvidia product.

Console ports should run perfectly on AMD systems on day one.

This is common sense, folks. Nvidia should have a lot on their hands once X1 and PS4 ports start pouring in for PC.

Because this time, the PS4 is not using an Nvidia GPU like the PS3 did.
 
Nvidia has a hardware solution for Mantle... it works with any game. Testing has begun. :eek:

I would imagine so. R&D for new GPUs is ongoing at both companies pretty much all of the time, though I'm not sure they really need a "solution". Mantle is maybe a performance boost, and maybe a bit of marketing capital, but I doubt it will be much more.
 
The problem for Nvidia is not Mantle, it's the fact that no next-gen console is using an Nvidia product.

Console ports should run perfectly on AMD systems on day one.
The APIs aren't the same (including the Xbone's DX), but some of the work, like shader optimization, may carry over to AMD GPUs on PC since they use the same architecture.
 
Yeah, I have no idea what the fuck is going on with their control panel forced vsync. It has been like that since the freaking dawn of time, and is infuriating in games like Dead Space 1-3 (whereas nv's force vsync works).

AMD's control panel forced vsync is for OpenGL only, and it's surprising that so many long-time PC gamers still don't know that.
 
Early benchmarks aside (links, please, on those by the way), I would bet that as the consoles age and devs start looking for any way they can to eke a bit more performance out of them, they will use Mantle. When that occurs, we will see at least some benefit on AMD PC hardware from the quick-and-dirty ports that tend to be the norm these days. No, it will not be earth-moving or Nv-ending, but we will see some tangible benefit.
 
For what it's worth, I just got news from my suppliers TODAY that Nvidia is going to drastically reduce the price of their 700 series GPUs 'soon'. I guess they see the new AMD cards as a legitimate threat.
 