Mantle is here

Like I said all along, just give me a steady and smooth FPS and I'll take that any day over the extra FPS we get from normal driver updates that barely gets noticed.
 
upgrade already, 6xxx is so 2009.

lol
Read my post; I just said I already upgraded to a GTX 780 :p

I've been waiting since 2009 for AMD to fix some of their more annoying driver-related bugs and issues. Didn't happen, didn't want to buy into a product with more of the same issues, so back to Nvidia I went.

And yes, some of the problems that drove me nuts on the HD 6970 are still present on the R9 290x, as well.
 
Well, there goes that, then. Guess I'm out of here, Mantle officially does nothing for me :p

Already upgraded my main rig to a GTX 780, anyway. The GPU bottleneck with an HD 6970 was too severe.

Point still stands, though. Anyone with a decent CPU probably isn't wooed right now.

No Mantle for me, sadly... :p

But I'm very curious to see how it works for everyone and what [H] has to say. I'm probably going to replace my 6870s in a year or two, and Mantle could be the deciding factor in whether I go Nvidia or AMD.
 
Assuming you're playing Mantle-optimized games, it's cool that you can build a rig and save on CPU costs, maybe dumping the savings into a better GPU.
 
That's what I was saying earlier... I currently have a 6-core CPU + GTX 580 (1920x1200). The 580 is not the highest-end GPU today, but Mantle will be basically useless for me, and when I upgrade to Maxwell (or AMD's next refresh) it'll be even more useless.

Yup, Mantle will be basically useless for you because it won't run on your nV 580 at all. It's AMD-only, GCN (7000-series and newer) only, right now. While it may be ported to future nVidia hardware (don't hold your breath), there's basically no chance it'll ever be ported to the 580.
 
It might, if it supports dynamic shader grouping in hardware.

I don't know enough about NV hardware to answer that.
 
Assuming you're playing Mantle-optimized games, it's cool that you can build a rig and save on CPU costs, maybe dumping the savings into a better GPU.

Unfortunately, I'm willing to bet that Mantle adoption will not be as fast and as widespread as anyone would like, so you'd be shooting yourself in the foot as far as other games go (especially CPU-intensive games like Civ, Total War, etc.).
 
They've already listed a dozen games that I had planned on purchasing before Mantle was announced, so adoption seems widespread enough to me.

Between DA3, ME4, Star Citizen, and Star Control, who isn't going to buy all of those, with or without Mantle?
 
I'd laugh if AMD has to make a major architectural hardware change in the future, and then developers will have to support Mantle, Mantle 2.0, OpenGL, and DirectX APIs.

Welcome back to the '90s, boys.
 
I haven't looked into the preliminary results that are out, but from everyone's comments so far, it seems to me that one group that's going to benefit is the CrossFire 79xx and 290 users with 2500Ks and 3570Ks; well, the ones with 120 Hz monitors, anyway. I know when I was trying to get 120 FPS with CF 7950s in BF3 and BF4, it was my 3570K that was dropping me into the 80s at times.
Anyway, it will be interesting to see the results, though I don't think it will impact my single R9 290/3770K too much. One day I'll drop my CPU back to stock and turn off HT to test the difference Mantle makes.

I was getting a bit worried that I was going to have to start looking at an i7-5930K or 5960X in the future if CPUs kept progressing as slowly as they are. If Mantle takes off and gets adopted by the majority of developers, I'll probably be able to keep my Ivy system for longer than I would have dreamed. It still remains to be seen how many will adopt it, though. I'm sure Nvidia will throw plenty of money at upcoming big-title releases to ensure Mantle has a hard time taking off; it would be silly business not to.
 
Mantle fits right in with AMD's console development with their APUs.

What I find most interesting is that the console ports we're used to are mostly CPU-bound because of the multi-platform engines used. Now, with games ported to Mantle, CPU performance may not be so important, but GPU performance will be.

However, with a DX port you may still be CPU-bound and need much higher system specs overall to get the same or lower performance than Mantle-equipped games.
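To put rough numbers on that CPU-bound argument, here's a toy draw-call model in Python. Every figure in it (the draw-call count and the per-call submission costs) is an invented assumption for illustration, not a measured value:

```python
# Toy model of per-frame CPU cost from draw-call submission alone.
# All numbers are invented assumptions, not measurements.

DRAW_CALLS = 5000            # assumed draws in a busy multi-platform engine scene
DX11_US_PER_DRAW = 10.0      # assumed DX11 driver/runtime cost per draw (microseconds)
MANTLE_US_PER_DRAW = 2.0     # assumed cost with thinner, lower-level submission

for api, us_per_draw in [("DX11", DX11_US_PER_DRAW), ("Mantle", MANTLE_US_PER_DRAW)]:
    cpu_ms = DRAW_CALLS * us_per_draw / 1000.0
    ceiling_fps = 1000.0 / cpu_ms
    print(f"{api}: {cpu_ms:.0f} ms of CPU time per frame just submitting draws "
          f"(a {ceiling_fps:.0f} FPS ceiling even with an infinitely fast GPU)")
```

Under those made-up costs, the DX11 build is capped at 20 FPS by submission overhead alone, while the Mantle build's ceiling moves to 100 FPS and the GPU becomes the limit again.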
 
I'm still waiting for more price/performance comparisons, more single-GPU solutions, and more variation in CPUs in that range.
 
Unfortunately, I'm willing to bet that Mantle adoption will not be as fast and as widespread as anyone would like, so you'd be shooting yourself in the foot as far as other games go (especially CPU-intensive games like Civ, Total War, etc.).

You never know, you may be eating your own words :rolleyes:
 
BF4 benchmarks are out; still waiting for more price/performance comparisons, more single-GPU solutions, and more variation in CPUs in that range.

1: Low-end single-player
CPU: AMD A10-7850K 'Kaveri' APU, 4 cores @ 3.7 GHz
GPU: none (APU integrated graphics)
Settings: 720p Medium, Windows 7 64-bit, Singapore
DirectX 11: 26.6 ms/f (37.6 fps)
Mantle: 23.3 ms/f (43 fps)
14% faster

2: 64-player multiplayer
CPU: AMD FX-8350, 8 cores @ 4 GHz
GPU: AMD Radeon 7970 3 GB
Settings: Ultra 1080p, Windows 8 64-bit, Siege of Shanghai
DirectX 11: 18.87 ms/f (52.9 fps)
Mantle: 15.08 ms/f (66.3 fps)
25.1% faster

3: Multi-GPU single-player
CPU: Intel Core i7-3970X Extreme, 12 logical cores @ 3.5 GHz
GPU: 2x AMD Radeon R9 290X 4 GB
Settings: Ultra 1080p 4x MSAA, Windows 8 64-bit, South China Sea
DirectX 11: 13.24 ms/f (78.4 fps)
Mantle: 8.38 ms/f (121.5 fps)
58% faster

Good to see that the gains per GPU with Mantle scale properly with CrossFire.
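For anyone sanity-checking those figures, the FPS column is just the reciprocal of the ms/f column, and the "faster" percentage is the ratio of the two frame times. A quick sketch (the small FPS mismatches in scenario 3 are presumably down to how the averages were taken, since averaging per-frame FPS is not the same as dividing 1000 by the average frame time):

```python
# FPS = 1000 / (ms per frame); speedup = ratio of the two frame times.
scenarios = [
    ("Low-end single-player", 26.6, 23.3),
    ("64-player multiplayer", 18.87, 15.08),
    ("Multi-GPU single-player", 13.24, 8.38),
]

for name, dx_ms, mantle_ms in scenarios:
    fps_dx = 1000.0 / dx_ms
    fps_mantle = 1000.0 / mantle_ms
    gain = (dx_ms / mantle_ms - 1.0) * 100.0
    print(f"{name}: {fps_dx:.1f} -> {fps_mantle:.1f} FPS, {gain:.1f}% faster")
```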
 
Well, that's unfortunate. This use case seemed to me to be the main draw for Mantle.

I'm pretty sure it's been known for a while (or at least speculated) that Mantle would see its largest gains when a system was bottlenecked by the CPU. The Kaveri APUs are nice, but they aren't anything spectacular compared to discrete graphics cards. I have a feeling, though, that you'd see much larger gains by simply adding a low/mid-range AMD GPU to that system to run a CrossFire setup with the APU.
 
I don't ever plan on running dual-GPU. Too many issues (not to mention the added noise).

Doesn't look like Mantle will really benefit me. Oh well.


Not really; CrossFire overhead isn't THAT bad.

I think what you meant to say is that adding additional GPUs gives you enough headroom that you might end up CPU-limited again.


58% increase to what, though? No specifics are given.

If that's a 58% increase in MINIMUM framerate, that's significant.
If minimums are exactly the same but the maximum is higher (thus leading to a 58% increase in AVERAGE framerate), then it's still worthless to me.

Minimum FPS is the important metric; average can be artificially inflated and doesn't tell me anything.
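To illustrate the point, here's a sketch with two invented frame-time traces (made up for the example, not taken from any benchmark) that produce the same average FPS but wildly different minimums:

```python
# Two made-up frame-time traces (in ms) with identical AVERAGE FPS.
smooth = [12.5] * 8             # a steady 80 FPS, every frame
spiky = [5.0] * 7 + [65.0]      # mostly 200 FPS, with one big hitch

for name, trace in [("smooth", smooth), ("spiky", spiky)]:
    avg_fps = 1000.0 * len(trace) / sum(trace)   # frames per second of wall time
    min_fps = 1000.0 / max(trace)                # worst single frame
    print(f"{name}: avg {avg_fps:.0f} FPS, min {min_fps:.0f} FPS")
```

Both traces average 80 FPS, but the spiky one bottoms out around 15 FPS; that's exactly the kind of inflation the average hides.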

Edit: Also, the article you linked agrees with the article I linked. Here's an excerpt from the article you just linked:

What I'm seeing, over and over, is that if you already have a fast CPU (or you're gaming at a decently high resolution), Mantle does almost nothing.


Yes, but this is only BF4. Is it draw-call limited in DX11? Star Swarm, allegedly, has much bigger benefits from Mantle, since it's built to take advantage of other features beyond just the removal of the CPU overhead.


Good to see that the gains per GPU with Mantle scale properly with CrossFire.

That should be normal, since, as others have said, SLI/CrossFire systems are more CPU-bottlenecked than single-GPU systems. It would be nice to see benchmarks from a quadfire setup in BF4.
 
Are these people so disorganized they can't release something at the same time, even after it was delayed an entire month?
 
Well, that's unfortunate. This use case seemed to me to be the main draw for Mantle.

Look at the graph they posted; it looks enormously smoother even if the raw gain isn't massive. That's pretty nice.

Someone gif'd it
 
You never know, you may be eating your own words :rolleyes:

I agree with him. We likely won't see Mantle fully taken advantage of (hopefully it grows) for a year or two. I wouldn't rush out to buy an AMD setup just yet.
 
Really curious to know what type of performance gains are to be had on my hardware.

FX-8350
R9 290 (stock)
1440p monitor


I really want to see the performance differences at higher-than-1080p resolutions. Since higher resolutions tend to be more GPU-limited, doesn't that mean that Mantle will be less of a boost there?

What is the effect for given GPU setups while varying the CPUs?

290 with:
i5
8350
i7-4xxx
i7-extreme

CF 290s with:
i5
8350
i7-4xxx
i7-extreme

Test at 1080p and also at 1440p.


So many results I still want to see.
 
I really want to see the performance differences at higher-than-1080p resolutions. Since higher resolutions tend to be more GPU-limited, doesn't that mean that Mantle will be less of a boost there?

Pretty much. Mantle moves the bottleneck back to the GPU, where it should be. If you were already tapping out the card under DX, then you won't see more than a 10-15% gain, which is the result of the lower-level access to the GPU itself. Look at the 7850K APU results and you'll get a pretty good idea of how that scenario will look.
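A minimal sketch of that "bottleneck moves back to the GPU" point, treating frame time as max(CPU time, GPU time). The halved CPU cost, the ~10% GPU-side saving, and the per-resolution GPU times are all invented assumptions for illustration:

```python
# frame_time ~= max(cpu_time, gpu_time); all numbers below are assumptions.
CPU_DX_MS, CPU_MANTLE_MS = 16.0, 8.0   # assume Mantle halves the CPU-side frame cost
GPU_MANTLE_FACTOR = 0.9                # assume ~10% GPU saving from lower-level access

for res, gpu_ms in [("1080p", 10.0), ("1440p", 18.0), ("4K", 35.0)]:
    dx_ms = max(CPU_DX_MS, gpu_ms)
    mantle_ms = max(CPU_MANTLE_MS, gpu_ms * GPU_MANTLE_FACTOR)
    gain = (dx_ms / mantle_ms - 1.0) * 100.0
    print(f"{res}: DX {1000.0 / dx_ms:.0f} FPS -> "
          f"Mantle {1000.0 / mantle_ms:.0f} FPS (+{gain:.0f}%)")
```

With those assumptions, the CPU-bound 1080p case gains close to 80%, while the GPU-bound 1440p and 4K cases only see the ~11% that the lower-level access itself provides, which lines up with the 10-15% figure above.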
 
I wonder to what this could be attributed. Do you know what the scale is of that graph?

I know with the new drivers, they're supposed to move their frame pacing to "phase 2" (whatever that means). So it's possible that's what we're seeing (or maybe Mantle is just more stable?)
 
Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell the difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.

It's kind of interesting that you said this. I agree it's meaningless, but lately, reading all the bashing going on between NVIDIA fanboys and AMD fanboys on this site, it does actually mean something.

When the Ti came out, you had all those folks willing to pay a couple hundred dollars more for a card because it gets 'a few FPS' more than the 290X. All the AMD fanboys came out in droves saying those few FPS are meaningless and that those buyers are paying way too much money. Now AMD has Mantle and gains some FPS from it, and one of the first things they hear on the thread is how it's meaningless because you can't tell the difference between 80 FPS and 120 FPS.

With the battle of words that goes on in this forum, I would say the difference between 80 FPS and 120 FPS is 'not' meaningless. Haha!
 
I'm too old to buy goods and services based on marketing materials.
I still own an Xbox 360 with no plans on new hardware this year.
 
The difference between 80 and 120 FPS is huge. It's the difference between being able to run full LightBoost without tearing and not being able to do it. What the heck is David talking about?
 
I know with the new drivers, they're supposed to move their frame pacing to "phase 2" (whatever that means).
If that's the case, it would be useful to see how similarly-performing Intel and NVIDIA parts handle this scene.
 
The difference between 80 and 120 FPS is huge. It's the difference between being able to run full LightBoost without tearing and not being able to do it. What the heck is David talking about?

I was thinking the same thing. I'm looking forward to trying out Mantle tomorrow. I'm hoping to lock in 120 FPS at 3840x800.
 