AMD Mantle Performance Preview in Battlefield 4 @ [H]

FrgMstr

AMD Mantle Performance Preview in Battlefield 4 - AMD's Mantle is here for Battlefield 4. We take an XFX Radeon R9 290X video card for a spin under the AMD Mantle API and compare it to Direct3D 11.1. We look at performance advantages, compare resolutions, and even take a look at frame time. Is AMD Mantle everything it was cracked up to be? Let's find out in this performance preview.
 
So is 4X AA really that important at 2560x1600?

What's the difference? Do you notice it when you're in the middle of the action?
 
So is 4X AA really that important at 2560x1600?

What's the difference? Do you notice it when you're in the middle of the action?


Yes, with BF4 specifically. We have noted in previous reviews how important AA is to the experience in this particular game.
 
Thanks for the review. Looks promising. Are you going to have another one comparing performance with different CPUs? Maybe something as dated as a Kentsfield?
 
I'm at work so I didn't have time to read the whole thing, but I didn't see what the test system was. 3770K @ 4.8? 16GB RAM?
 
Do you plan on doing some image quality tests on Mantle? On another forum it seemed that some screenshots showed a slight decrease in quality when using Mantle.
 
Do you plan on doing some image quality tests on Mantle? On another forum it seemed that some screenshots showed a slight decrease in quality when using Mantle.


I'm interested in this also
IQ differences that give a 25%+ performance increase should be pretty evident, though.
 
Mind sharing with us your test rig (specifically CPU info and memory speed)? Since Mantle's potential benefit seems to be mainly concerning CPU-bound tasks, that would help put the numbers into context.

Thanks!
 
Do you plan on doing some image quality tests on Mantle? On another forum it seemed that some screenshots showed a slight decrease in quality when using Mantle.

I personally experienced no IQ differences between the two APIs.

We will of course look at this closely to make sure.
 
Same system I always use.

http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/2

I added this information now at the bottom of the first page.

Thanks!

The fact that you were getting improvement on an overclocked IB i7 is pretty amazing. That would be the scenario where I would least expect to see CPU-limited performance.

I'm sure you have this planned for the full review, but I'm curious to see what kind of improvement is possible with an i5/FX or i3/Kaveri.

Keep up the good work. You all are the best for in-depth performance analysis.
 
Great review, Brent, and great qualitative feedback.

Question: Not being familiar with Mantle, does it support some of the more interesting DX features? Tessellation, HBAO/SSAO, FXAA, volumetric fog/rain, etc.? In other words, is Mantle a tech equivalent to DX11, or are you giving up on some of the more optional eye candy options? (And I assume it lacks the Nvidia-specific ones like PhysX and TXAA.)
 
Great review, Brent, and great qualitative feedback.

Question: Not being familiar with Mantle, does it support some of the more interesting DX features? Tessellation, HBAO/SSAO, FXAA, volumetric fog/rain, etc.? In other words, is Mantle a tech equivalent to DX11, or are you giving up on some of the more optional eye candy options? (And I assume it lacks the Nvidia-specific ones like PhysX and TXAA.)

I'd give up PhysX and TXAA before you could finish saying it for more performance. There's also MLAA, SMAA, FXAA, etc.... take your pick.
 
Great review. I'm dying to see Crossfire numbers using the same methods, I predict the performance improvements will be mighty impressive.
 
Great review, Brent, and great qualitative feedback.

Question: Not being familiar with Mantle, does it support some of the more interesting DX features? Tessellation, HBAO/SSAO, FXAA, volumetric fog/rain, etc.? In other words, is Mantle a tech equivalent to DX11, or are you giving up on some of the more optional eye candy options? (And I assume it lacks the Nvidia-specific ones like PhysX and TXAA.)

Since it allows low-level access to GCN GPUs, it supports, well, anything a programmer wants to implement. The shaders on the GPU are agnostic; they do whatever the programmer tells them to. Mantle allows access to them in a more bare-metal way than DX does; there is no HAL (Hardware Abstraction Layer) to go through. So it isn't a matter of feature support, it's a matter of what the programmer decides to do, and the door is open to any possibility. You can't compare "feature sets" between DX and Mantle, because Mantle operates in a very different way than DX. It allows the programmer to use those shaders to do any 3D effect they want. You won't see AMD-specific features removed, and you won't see 3D effects removed with Mantle; it can potentially improve these things in performance and IQ. Something like Tessellation really has the potential to be done better with Mantle, since the engine will know exactly how the GPU it's running on works instead of having to go through DX.
 
Great review. I'm dying to see Crossfire numbers using the same methods, I predict the performance improvements will be mighty impressive.

I'm more looking forward to the improvements of frame times in CrossFire.
 
I personally experienced no IQ differences between the two APIs.

We will of course look at this closely to make sure.

Image quality is what I'm concerned about too.

Browsing another site, another person put up a comparison shot:

http://forums.guru3d.com/showthread.php?t=386248&page=6

With both in the picture, it's easy to tell there's a difference. Maybe the quality of the render is the same, but there definitely seems to be something off.
 
Have you had a chance to try 14.1 with CrossFire at all? DICE posted great results so I'm wondering if they have access to a different driver than what was released to us.
 
Image quality is what I'm concerned about too.

Browsing another site, another person put up a comparison shot:

http://forums.guru3d.com/showthread.php?t=386248&page=6

With both in the picture, it's easy to tell there's a difference. Maybe the quality of the render is the same, but there definitely seems to be something off.

It could be something as simple as contrast differences. However, BF4 is hard to compare via screenshots because the lighting and weather are dynamically changing. I've seen people post screenshots, but they are not taking into account the fact that the light from the sky/clouds is dynamically changing on objects, and this can create darker and lighter images and make far-away objects look more washed out at times. I experienced this just sitting idle in Test Range; the lighting was changing all around me in both APIs. So, I would take a lot of screenshot comparisons with BF4 with a grain of salt. Take the safe road and be skeptical unless hard evidence is provided. Like I said, they looked the same to me in-game.
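
If anyone wants to sanity-check a pair of screenshots themselves, here is a minimal sketch in Python (Pillow + NumPy); the filenames are just placeholders, and it assumes both shots were captured from the same camera position with the same settings. A roughly uniform shift in the per-channel means points at a gamma/contrast difference, while large localized per-pixel differences would suggest an actual rendering change.

# Minimal screenshot-diff sketch; assumes two aligned RGB screenshots of the same scene.
from PIL import Image
import numpy as np

d3d = np.asarray(Image.open("bf4_d3d.png").convert("RGB"), dtype=np.float32)
mantle = np.asarray(Image.open("bf4_mantle.png").convert("RGB"), dtype=np.float32)

# Per-channel mean shift: a roughly constant offset across R/G/B hints at a
# gamma/contrast difference rather than a change in what is being rendered.
print("mean per-channel shift (R,G,B):", (mantle - d3d).mean(axis=(0, 1)))

# Overall and worst-case per-pixel differences: large localized values would
# point at an actual rendering difference, not just tone mapping.
diff = np.abs(d3d - mantle)
print("mean abs diff:", diff.mean(), "max abs diff:", diff.max())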
 
Since it allows low-level access to GCN GPUs, it supports, well, anything a programmer wants to implement. The shaders on the GPU are agnostic; they do whatever the programmer tells them to. Mantle allows access to them in a more bare-metal way than DX does; there is no HAL (Hardware Abstraction Layer) to go through. So it isn't a matter of feature support, it's a matter of what the programmer decides to do, and the door is open to any possibility. You can't compare "feature sets" between DX and Mantle, because Mantle operates in a very different way than DX. It allows the programmer to use those shaders to do any 3D effect they want. You won't see AMD-specific features removed, and you won't see 3D effects removed with Mantle; it can potentially improve these things in performance and IQ. Something like Tessellation really has the potential to be done better with Mantle, since the engine will know exactly how the GPU it's running on works instead of having to go through DX.
Huh... interesting. It seems like this is a real opportunity for engine-makers, then. With that kind of access to the GPU, a really solid Mantle-based game engine could enable some pretty impressive things, not just graphically but in terms of what a dev could do in making a game from it.

Makes me want to see what the internal reaction is from Microsoft. DirectX has been so dominant for so long (largely because it's been better than the alternatives) that having a free, functional, and better alternative available could be a concern. Hopefully Mantle pushes DX12 to be something actually worth talking about.
Assuming there is a DX12: http://www.i-programmer.info/news/144-graphics-and-games/5744-amd-no-directx-12.html
 
Good read..thank you.

I also found the odd hitching noticeable in the test range.

I think time will allow developers to further optimize for the GPU, so that GPU-limited scenarios will get a bigger boost.

Also, in addition to helping lower/mid-power systems, I think it will help the more extreme high-end multi-GPU CrossFire systems, where the added CPU load of multiple cards, together with the higher frame rates they produce, will ultimately lead to CPU limitation.
 
Expecting an announcement from Microsoft about a legacy-stripped, tweaked-up DirectX API in 5... 4... 3...
 
I'm more looking forward to the improvements of frame times in CrossFire.

Oh, that goes without saying. Frametimes have improved significantly.
[image: frame time comparison chart]


Found this also pretty interesting, performance DOES increase even with an overclocked CPU. Tested at 2560x1600 4xMSAA as well.
[images: benchmark charts, 2560x1600 4xMSAA, Mantle vs. DirectX]


Source:
http://www.hardwareluxx.de/index.ph...te-ergebnisse-im-kampf-mantle-vs-directx.html
http://www.computerbase.de/artikel/grafikkarten/2014/erste-eindruecke-zu-amds-mantle/2/
 
Found this also pretty interesting, performance DOES increase even with an overclocked CPU. Tested at 2560x1600 4xMSAA as well.

Just to correct you, that CPU is not overclocked at all; 3.9GHz is stock. I do think I understand the correlation you're trying to make with the underclocked CPUs, though.

Anyway, I'm not aware of any websites that have tested overclocked CPUs; I have seen numerous websites test "underclocked" CPUs, which I find rather worthless. I mean, I get the purpose, but I think the real users for the 290X will generally have high-end CPUs, which are probably OC'ed if they're true tech nerds like most of us are. Anyone found OC'ed CPU results?
 
Just to correct you, that CPU is not overclocked at all; 3.9GHz is stock. I do think I understand the correlation you're trying to make with the underclocked CPUs, though.

Anyway, I'm not aware of any websites that have tested overclocked CPUs; I have seen numerous websites test "underclocked" CPUs, which I find rather worthless. I mean, I get the purpose, but I think the real users for the 290X will generally have high-end CPUs, which are probably OC'ed if they're true tech nerds like most of us are. Anyone found OC'ed CPU results?

I may be wrong, but 3.9GHz is not the stock speed of the 3960X; it is 3.3GHz. 3.9GHz is only the boost speed when only one core is active, AFAIK, so this has been OC'ed to 3.9GHz on all cores.

I also found that link very interesting, firstly in how the DX version was virtually unresponsive to increased CPU clock speed :-/ ... just goes to show how much DX has been limiting things.

Secondly, and the big takeaway for me, is how much multi-GPU on high-end systems benefits from Mantle: a 25% improvement on the CrossFired 290Xs compared to DX11.
Goes to show how much CPU load multiple cards are placing on the system... we may even see some wild results for TriFire and 4x CrossFire.
 
Just to correct you, that CPU is not overclocked at all; 3.9GHz is stock. I do think I understand the correlation you're trying to make with the underclocked CPUs, though.

Anyway, I'm not aware of any websites that have tested overclocked CPUs; I have seen numerous websites test "underclocked" CPUs, which I find rather worthless. I mean, I get the purpose, but I think the real users for the 290X will generally have high-end CPUs, which are probably OC'ed if they're true tech nerds like most of us are. Anyone found OC'ed CPU results?

Like it was said above, the 3960X is 3.3GHz stock, and for some reason the other image didn't go through. Fixed now.
 
Image quality is what I'm concerned about too.

Browsing another site, another person put up a comparison shot:

http://forums.guru3d.com/showthread.php?t=386248&page=6

With both in the picture, it's easy to tell there's a difference. Maybe the quality of the render is the same, but there definitely seems to be something off.

DICE tweeted over the weekend that they are aware of this; it's a gamma algorithm bug. There is no actual difference in quality.
 
Any chance we could see some testing over a range of CPU overclocks?

While 4.8 GHz potentially eliminates the CPU bottleneck, it seems like the average OC range is between 4.2 and 4.5 GHz (and frankly, with Haswell-based systems most people can't realistically get any higher than that), so it might be interesting to see how Mantle scales with CPU performance in BF4 multiplayer.
 
Some guy on Overclockers UK made a program to turn the .csv file from the BF4 frame time capture into min/max/avg:

http://forums.overclockers.co.uk/showthread.php?t=18577693

You don't get graphs, but it might save some time.
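
For anyone who'd rather roll their own, below is a rough sketch of the same idea in Python. It assumes the capture file has one frame time per row, in milliseconds, in the first column; the column index and exact log format are assumptions, so adjust to match your file.

# Rough sketch: min/avg/max FPS from a BF4 frame time capture (.csv).
# Assumes one frame time per row, in milliseconds, in the first column.
import csv
import sys

frame_times = []
with open(sys.argv[1], newline="") as f:
    for row in csv.reader(f):
        if row and row[0].strip():
            frame_times.append(float(row[0]))

frame_times = [ft for ft in frame_times if ft > 0]
fps = [1000.0 / ft for ft in frame_times]

print("min FPS: %.1f" % min(fps))
print("avg FPS: %.1f" % (len(frame_times) * 1000.0 / sum(frame_times)))
print("max FPS: %.1f" % max(fps))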


Wow, some guy over at that thread is reporting a 95.7% increase going from DX to Mantle with a TriFire 290 setup... incredible.

Also, looking back at the German site linked earlier shows CrossFire scaling at 99% when going from one to two 290Xs under Mantle, while the scaling with DX at the same settings is only 78.5%... so going from one card using DX to two cards using Mantle gives an increase of 128%!
 
Converting to FPS from frame time data is simple.

Assume the frame time data is in milliseconds:

FPS = 1000 / (time_now - time_last)

In Excel this might look like: =1000/(A2-A1)
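
The same formula applied over a whole column of timestamps, as a quick Python sketch (assuming the values are cumulative times in milliseconds; the list here is just dummy data):

# FPS from consecutive timestamps in milliseconds, same formula as above.
timestamps_ms = [0.0, 16.7, 33.9, 50.2, 68.0]  # dummy data
fps = [1000.0 / (now - last) for last, now in zip(timestamps_ms, timestamps_ms[1:])]
print(fps)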

I don't mind frame time graphs, but given the community is used to seeing FPS graphs it doesn't seem like a horrible burden to convert the data.

Interesting results nonetheless. It will be very interesting to see how this matures over time, and how it affects NVIDIA's development priorities.

Phil
 
Brent, what about using 2x CrossFire in the test? It would be good to see what a custom 290X (one that doesn't drop clocks due to overheating in CrossFire setups) can do with Mantle.
 
CrossFire is broken in this release with 290 cards. It isn't possible to test. You may want to actually read the available documentation, user experiences, AMD's known issues list for this beta, or the article itself, which touches on this fact.
 
Good overview. However, I would like to have seen more AA at the lower resolutions, because that is where it would seem to do the most good and is what most people would turn up first.

Also, Brent/Kyle, could this allow for more "PhysX"-type implementations? The type that can be turned on or off to allow a person to customize a game to their personal liking. This might bring NVIDIA into the fold (I can only hope), because PhysX and 3D under a common umbrella would be a good thing, at least for the consumer.
 
CrossFire is broken in this release with 290 cards. It isn't possible to test. You may want to actually read the article where he mentions this.

Or you could read the driver release notes. Or... (tons of other documentation). The key is reading.
Which is interesting, since DICE has posted CFX results with Mantle. They must have a different build of the driver?
 
CrossFire is broken in this release with 290 cards. It isn't possible to test. You may want to actually read the available documentation, user experiences, AMD's known issues list for this beta, or the article itself, which touches on this fact.

WUT?
 