Mantle is here

Can't wait for more benchmarks. Curious if my 4770k is bottlenecking my GTX770 (~7970), just like everyone else. Wonder if DX can come out with an update that does the same thing.

I would say yes.
 
This is good to see; I've been waiting for my 7950 to get some Mantle love, but I probably won't pick up BF4 until it's on sale.
 
Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell the difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.

Did you really just say that?
50% increase in FPS, with smoother frame times, has no benefit to gamers...
 
Pure destruction. Read 'em and weep, Nvidia.

Test case 3: High-end single-player with multiple GPUs
CPU: Intel Core i7-3970x Extreme, 12 logical cores @ 3.5 GHz
GPU: 2x AMD Radeon R9 290x 4 GB
Settings: 1080p ULTRA 4x MSAA
OS: Windows 8 64-bit
Level: South China Sea “Broken Flight Deck”
This single-player scene is heavy on both the CPU and GPU, with lots of action going on. The test was done on the highest-end Intel CPU on Windows 8, which is the fastest option before Mantle thanks to DirectX 11.1. Still, this CPU is not fast enough to keep the two 290X GPUs fed at 1080p on Ultra settings, so we get a significant CPU performance bottleneck, which results in a major performance improvement when enabling Mantle.
Result: 13.24 ms/frame -> 8.38 ms/frame = 58% faster
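
For anyone wondering where the 58% comes from: frame time and frame rate are reciprocals, so 13.24 ms/frame is ~75 fps and 8.38 ms/frame is ~119 fps. A quick sanity check in Python:

Code:
# Convert the quoted frame times (ms) to fps and compute the speedup
for ms in (13.24, 8.38):
    print(f"{ms} ms/frame = {1000 / ms:.1f} fps")
print(f"speedup: {13.24 / 8.38 - 1:.0%}")  # ~58%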

*snip*
*snip*

Unrealistic results...

78 FPS at 1080p with 4x MSAA on 290X CrossFire? That's a big pot of smoke blown in people's faces. Even a single 280X averages 60+ FPS at those settings, and according to the [H] review it's 65+.

And if a single 290X at 2560x1600 with 2x MSAA averages 67 FPS with 100+ FPS maximums, how can they say that with 2x 290X at 1080p they only got 78 FPS? I have to say it again: a big flashbang grenade in the face.

Don't take me wrong, I really want to see how Mantle can push past DX, but realistically I know that on one of my work machines with an FX-8350 + 7970 GHz CrossFire I'll get way better results than those posted by trydling@ BattleLog.
 
Maybe he thinks Intel/AMD/Nvidia should just stop developing new technology because what we have is fast enough?
 
Did you really just say that?
50% increase in FPS, with smoother frame times, has no benefit to gamers...

I did. An increase from 80 FPS to 120 FPS is typically not one that gamers will be able to identify unless there's some other factor in play, like inconsistent frame delivery or particular scenes where the frame rate bottoms out. My point is that the test they selected for the dual-card configuration was CPU bound, and Mantle "fixed" the CPU-bound part of it. If you have a pair of 290X's, you're probably not playing at 1080p, and if you are, you've probably got the resolution scale turned up a bit. I would guess (though I can't confirm) that if you run that same test at a higher resolution and/or with the resolution scale jacked up, the performance increase will be significantly lower.
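
That reasoning is easy to see with a toy model (illustrative numbers only, assuming a frame can't finish before both the CPU submission work and the GPU render work are done):

Code:
# Toy CPU/GPU bottleneck model -- all numbers are made up
def fps(cpu_ms, gpu_ms):
    # frame time is roughly gated by the slower of the two
    return 1000 / max(cpu_ms, gpu_ms)

cpu_dx, cpu_mantle = 12.0, 7.0  # hypothetical per-frame CPU cost
for label, gpu_ms in [("1080p", 8.0), ("higher res", 16.0)]:
    gain = fps(cpu_mantle, gpu_ms) / fps(cpu_dx, gpu_ms) - 1
    print(f"{label}: Mantle gain ~{gain:.0%}")

Once the GPU becomes the long pole, cutting CPU time stops mattering, which is why the gain should shrink at higher resolutions.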
 
Can't wait to see someone hack these drivers for HD 5000 and 6000 series cards.
 
Unrealistic results...

78 FPS at 1080p with 4x MSAA on 290X CrossFire? That's a big pot of smoke blown in people's faces. Even a single 280X averages 60+ FPS at those settings, and according to the [H] review it's 65+.

And if a single 290X at 2560x1600 with 2x MSAA averages 67 FPS with 100+ FPS maximums, how can they say that with 2x 290X at 1080p they only got 78 FPS? I have to say it again: a big flashbang grenade in the face.

Don't take me wrong, I really want to see how Mantle can push past DX, but realistically I know that on one of my work machines with an FX-8350 + 7970 GHz CrossFire I'll get way better results than those posted by trydling@ BattleLog.

If you read the description of the benchmark, they mention that that particular scene is very intensive. I can get 200 FPS in some scenarios on my 290 CrossFire, but that doesn't mean every in-game scenario will produce that frame rate.
 
I did. An increase from 80 FPS to 120 FPS is typically not one that gamers will be able to identify unless there's some other factor in play, like inconsistent frame delivery or particular scenes where the frame rate bottoms out. My point is that the test they selected for the dual-card configuration was CPU bound, and Mantle "fixed" the CPU-bound part of it. If you have a pair of 290X's, you're probably not playing at 1080p, and if you are, you've probably got the resolution scale turned up a bit. I would guess (though I can't confirm) that if you run that same test at a higher resolution and/or with the resolution scale jacked up, the performance increase will be significantly lower.

Right... I will remember that when I read your articles.
 
I did. An increase from 80 FPS to 120 FPS is typically not one that gamers will be able to identify unless there's some other factor in play, like inconsistent frame delivery or particular scenes where the frame rate bottoms out. My point is that the test they selected for the dual-card configuration was CPU bound, and Mantle "fixed" the CPU-bound part of it. If you have a pair of 290X's, you're probably not playing at 1080p, and if you are, you've probably got the resolution scale turned up a bit. I would guess (though I can't confirm) that if you run that same test at a higher resolution and/or with the resolution scale jacked up, the performance increase will be significantly lower.

Fully agreed. While they say that FPS increases above 100 are unnoticeable to the human eye, it's actually difficult to notice much of a difference above 60 FPS, assuming your frames never drop below 60. It can be noticed in precision FPS games, but Skyrim at 60 FPS and 80 FPS is very similar.
 
Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell the difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.

Not everyone is the same as you. I easily notice the difference between 80 and 120 FPS. Some people are more accustomed to it than others. I have spent what must be about 10 years playing Quake 3 competitively at 120 FPS/120 Hz. Once you're used to 100+ FPS, you start to notice these variations pretty easily.
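
For some perspective on what that gap means per frame (just the reciprocal math again):

Code:
# Per-frame latency at common frame rates
for fps in (60, 80, 120):
    print(f"{fps} fps = {1000 / fps:.2f} ms/frame")
# 80 -> 120 fps shaves ~4.2 ms off every frame

Whether ~4 ms per frame is perceptible clearly depends on the player, which is the whole disagreement here.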
 
I did. An increase from 80 FPS to 120 FPS is typically not one that gamers will be able to identify unless there's some other factor in play, like inconsistent frame delivery or particular scenes where the frame rate bottoms out. My point is that the test they selected for the dual-card configuration was CPU bound, and Mantle "fixed" the CPU-bound part of it. If you have a pair of 290X's, you're probably not playing at 1080p, and if you are, you've probably got the resolution scale turned up a bit. I would guess (though I can't confirm) that if you run that same test at a higher resolution and/or with the resolution scale jacked up, the performance increase will be significantly lower.

That difference will be evident on 120 Hz panels. What I believe to be more important is whether the big increase in CrossFire scenarios at the given settings will translate to a similar increase if one uses a higher resolution/resolution scaling and tons of AA and AF.

Currently, at 150% resolution scaling, on Ultra with 4x MSAA, 16x SSAA, and 16x AF in quality mode, I get 79 to 120 FPS. I would be very happy if I can get 80 to 100 FPS at 200% resolution scaling using the same settings. It will be interesting to see how the performance increase scales when using very demanding settings.
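
Worth keeping in mind how quickly resolution scale piles on pixels (assuming the slider multiplies both axes, as BF4's does, so pixel count grows with the square of the scale):

Code:
# Pixels rendered at 1080p for a given resolution scale
base_w, base_h = 1920, 1080
for scale in (1.0, 1.5, 2.0):
    w, h = int(base_w * scale), int(base_h * scale)
    print(f"{scale:.0%} scale: {w}x{h} = {w * h / 1e6:.1f} MPix")

200% scale is 4x the pixels of native 1080p, so the GPU side of the equation grows fast.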
 
I wanna see benches on an older Core 2 Quad for laughs and giggles.

I have a Q9650 OC'd to 3.75 GHz and a 7850. My FPS frequently drops to the low 30s on Hainan Resort in 64-player matches, so I'm really excited about trying this out.

My card isn't fully supported yet, though, so the results may be underwhelming for now.

Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell the difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.

If you're running without vsync, then you'll have much less noticeable tearing on a 120 Hz monitor. And you can always increase settings to bring that 120 down, by turning up stuff like resolution scale.
 
Can't wait to see someone hack these drivers for HD 5000 and 600 series cards.

Won't happen.

Shaders need to be able to be dynamically grouped into SMUs. That's not supported by the hardware on the 6000 and older series.
 
Turn them into some cash. I convert 6950's into cash for some people. ;)

I actually have three 6950s I'm not using. I've thought a couple of times about selling them and putting the money toward a new 290(X) to play with, but packaging them all up and shipping them out seems like a lot of work :D It would be cool to check out Mantle first hand, though. Maybe I should make a FS thread...
 
If you have a pair of 290X's, you're probably not playing at 1080p, and if you are, you've probably got the resolution scale turned up a bit. I would guess (though I can't confirm) that if you run that same test at a higher resolution and/or with the resolution scale jacked up, the performance increase will be significantly lower.
Bingo. The test was designed to create a big CPU bottleneck. Twin 290X's at 1080p are begging for more work, thus the Mantle gains are large. Regardless, I'm impressed.
 
Bingo. The test was designed to create a big CPU bottleneck. Twin 290X's at 1080p are begging for more work, thus the Mantle gains are large. Regardless, I'm impressed.
I don't see why this is a bad result, though. There are a lot of gamers who play FPS games on 120 Hz monitors, and all of those are currently 1080p. With Mantle, in this scenario they can now lock 120 FPS, which is a pretty big deal.
 
I did. An increase from 80 FPS to 120 FPS is typically not one that gamers will be able to identify unless there's some other factor in play, like inconsistent frame delivery or particular scenes where the frame rate bottoms out.

Speaking of frame delivery, it does seem that Mantle has more consistent frame delivery than DX. Take a look at the frame time GPU graph. It's a straight line under Mantle with the 8350+7970:

DX:
[image: testcase2_dx6401.jpg]

Mantle:
[image: testcase2_mantle6401.jpg]


Would have been nice if you could investigate this further in your review. :)
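
If anyone wants to put numbers on that smoothness, here's a rough way to do it (a sketch, assuming a plain-text log with one frame time in ms per line; the file name is made up):

Code:
# Rough frame-time consistency stats from a per-frame log (ms per line)
import statistics

with open("frametimes.txt") as f:  # hypothetical capture file
    times = sorted(float(line) for line in f if line.strip())

print(f"avg: {statistics.mean(times):.2f} ms")
print(f"stdev: {statistics.pstdev(times):.2f} ms (lower = smoother)")
print(f"99th percentile: {times[int(len(times) * 0.99) - 1]:.2f} ms")

A flat line in those graphs corresponds to a low standard deviation and a 99th percentile close to the average.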
 
Can't wait to see someone hack these drivers for HD 5000 and 6000 series cards.
Mantle only supports the GCN architecture. The HD 5000 and 6000 series use a different architecture; it's not supported. You can't "hack" that to work.
 
It's because of the render units; they need to be dynamically allocated, and that's only supported by GCN.
 
Color me intrigued! However, I'm going to wait for some third-party reviews to come out, just to make sure DICE/EA/AMD aren't inflating the numbers. If these are the actual gains, then I am quite impressed!
 
A Core i7-3970X Extreme w/ 12 logical cores @ 3.5 GHz can be bottlenecked by BF4.

...

:confused:
 
So ready for the Mantle FPS bump!

I'm getting a 45 fps average in BF4 at 1080p Ultra with an Opteron 6376 and an R9 270X. A 25% Mantle boost would bring it up to ~56 fps, which is very close to an optimal 60 fps.

A Kaveri A10-7850K gets about a 35 fps average at 1080p Low. A 15% boost would bring it up to ~40 fps, which is perfectly playable for a ~$330 system when the other Opteron systems are tied up with ESXi.
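
The arithmetic checks out (straight percentage math on the quoted figures):

Code:
# Projected fps after the quoted Mantle gains
for base, gain in [(45, 0.25), (35, 0.15)]:
    print(f"{base} fps + {gain:.0%} = ~{base * (1 + gain):.0f} fps")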
 
Do you have the Mantle documentation? Did you work on Mantle?

Do you not understand that there are quite large differences in the architectures of GCN and previous generations?
 
Speaking of frame delivery, it does seem that Mantle has more consistent frame delivery than DX. Take a look at the frame time GPU graph. It's a straight line under Mantle with the 8350+7970:

DX:
[image: testcase2_dx6401.jpg]

Mantle:
[image: testcase2_mantle6401.jpg]

Would have been nice if you could investigate this further in your review. :)

... and to think that will only get better with driver updates from AMD. Now if only AMD could make BF4 not fucking terrible.....
 
If you have something to add to the conversation, please do. If you are here to act like a child, you will be infracted and possibly banned.
 
So I"m basically just hitting F5 on the AMD beta driver page. RELEASE THE KRAKEN AMD
 
From The Tech Report
  • CPU-limited scenario: 40.9% (1080p) and 40.1% (1600p) performance improvement under Ultra settings and 4xAA on the AMD A10-7700K with an AMD Radeon™ R9 290X.
  • GPU-limited scenario: 2.7% (1080p) and 1.4% (1600p) performance improvement under Ultra settings and FXAA on the Core i7-4960X with an AMD Radeon™ R7 260X.

So anyone who already has a fast CPU and/or is playing at high resolution won't see much more than a 2% increase in performance... OK then.
 