Mantle pushes FX-8350 to beyond i7-4960X performance.

if all you do is play BF4 that's great
but Mantle won't help anything else, like video encoding or live streaming
only a better CPU can help there
I'd like to see Mantle vs. Intel while running OBS
or better yet
AMD CPU + NV card using ShadowPlay to stream vs. Mantle and OBS on an AMD card streaming

OBS supports AMD VCE, so you can stream using your video card to do all of the work. Here is a live stream archive with a Q6600 + HD7850 streaming with OBS using the AMD VCE capabilities of the video card. Note that the video card is doing all of the transcoding and encoding in this video. Quality is only limited by the upload capability of your ISP. Does ShadowPlay allow you to choose any quality setting?

The AMD Gaming Evolved App has AMD VCE, which is similar to "ShadowPlay", with streaming and local capture built in. You can stream directly to Twitch like you would in OBS. You can also capture directly to your local hard drive if you like. It comes with your Mantle drivers as an optional install.

Check out the AMD encoding tools thread here on [H]ardocp.

Check out the AMD section sometime. It's fun keeping up with both what Nvidia and AMD are doing at the same time without arguing about who makes the best pudding pops. :)
 
IIRC it can't stream.

You need to stop. You've been completely dominated time and time again in this thread. Mantle was released to help weak CPUs compete, period. socK has proved it's awesome.
 
Actually, Mantle allows games to use more of the CPU rather than just helping a weak CPU.
As soon as the development cycle is built around Mantle's ability to push more data through, you will get a lot more/better features in games, or better AI.

I would definitely say that the FX-8350 with 8 cores/threads shows that it is not as bad as "people" say it is. Something the 1001 useless benchmarks never really made clear.
 
Was playing with Mantle earlier and 3 290's in BF4; would rarely see much above 50% per CPU core in 64-player MP rounds, while still maintaining over 120-200fps, all settings maxed, 4x AA.

As soon as I tried the same server with DX11 I would see dips down to 90fps and the CPU would see 85%+ per core.

CPU is a stock 4790K.

Really want that Raptr Game DVR beta to support CrossFire, because I would hate to disable two 290's to record. And I can not fit my capture card in my system with the 3 290's, lol.
 
Try OBS, Wildace. You can record to your hard drive with that just like the Raptr app.
 
Actually, Mantle allows games to use more of the CPU rather than just helping a weak CPU.
As soon as the development cycle is built around Mantle's ability to push more data through, you will get a lot more/better features in games, or better AI.

I would definitely say that the FX-8350 with 8 cores/threads shows that it is not as bad as "people" say it is. Something the 1001 useless benchmarks never really made clear.
Using Mantle does not make the IPC deficiencies of the AMD FX CPUs go away. Mantle has less CPU overhead than DX, which frees up the CPU to do other things such as networking, AI, physics, player interaction, environment interaction, etc. It only allows you to use "more of the cpu" in the sense that the CPU can spend more time on other processing chores.

Comparatively speaking, the FX CPUs are as bad as benchmarks show them to be. They do lag behind many Intel CPUs on many, but not all, tasks. That much is clear. When factoring in cost and the average workload of the end user, the outlook changes slightly. The value of AMD CPUs lies in the fact that most current software isn't that demanding, and thus the AMD CPUs still have plenty of power for most chores.
 
You can do proper threaded rendering in Mantle, no doubt very nice for AMD processors.
 
What is there to argue about? Performance between the FX-8350 and i7-4960X is virtually even without the DirectX artificial gimp layer, and for the price of the i7 you can get the FX with two R9 290Xs. No brainer.
 
What are you even trying to show? Same thing happened on Mantle.

I'm pretty sure it was because I started the frame log during loading and it hitched for a second as it actually entered the game.
 
The settings were the same, I guarantee it. It's an old screenshot, there was a bug with the fog in Mantle then so one looks slightly brighter, and I was not in the exact same spot. I retook it along with a bunch of spots.

I moved slightly to get a bigger view of the scene, since you spawn somewhat in cover. That's why they are not 100% identical location wise.

DX11
https://i.imgur.com/z5jx557.jpg

Mantle
https://i.imgur.com/9GrkW2D.jpg


For another map, and an actual benchmark, here is the easiest map in the game to run, Locker. I'll play a heavy map later to compare.

DX11
Code:
Battlefield 4 Frame Time Analyzer Results:
File imported: snip
Branch #1 executed.
Imported 93660 points of data.
Frames: 0 points of bad data tossed.
CPU Frames: 0 points of bad data tossed.
GPU Frames: 0 points of bad data tossed.

Frame Time Avg		CPU Frame Avg		GPU Frame Avg
68.175 FPS		68.018 FPS		78.762 FPS

Max FPS			Max FPS (CPU)		Max FPS (GPU)
418.41 FPS		204.499 FPS		943.396 FPS

Min FPS			Min FPS (CPU)		Min FPS (GPU)
3.673 FPS		2.531 FPS		3.734 FPS

Time Spent:		FPS %:		FPS %(CPU):	FPS %(GPU):
Above 200 FPS:		0.02 %		0.07 %		0.11 %
Above 144 FPS:		0.29 %		1.32 %		2.27 %
Above 120 FPS:		1.43 %		4.3 %		7.24 %
Above 100 FPS:		7.15 %		10.15 %		22.23 %
Above 90 FPS:		15.29 %		15.74 %		35.31 %
Above 60 FPS:		74.67 %		75.04 %		87.85 %
Above 45 FPS:		96.88 %		97.23 %		98.98 %
Above 30 FPS:		99.84 %		99.87 %		99.95 %

Mantle
Code:
Battlefield 4 Frame Time Analyzer Results:
Second File imported: snip
Branch #1 executed.
Imported 76897 points of data.
Frames: 0 points of bad data tossed.
CPU Frames: 0 points of bad data tossed.
GPU Frames: 0 points of bad data tossed.

Frame Time Avg		CPU Frame Avg		GPU Frame Avg
88.642 FPS		87.841 FPS		164.755 FPS

Max FPS			Max FPS (CPU)		Max FPS (GPU)
588.235 FPS		201.207 FPS		980.392 FPS

Min FPS			Min FPS (CPU)		Min FPS (GPU)
3.642 FPS		3.721 FPS		116.959 FPS

Time Spent:		FPS %:		FPS %(CPU):	FPS %(GPU):
Above 200 FPS:		0.7 %		1.15 %		8.13 %
Above 144 FPS:		4.05 %		7.62 %		88.12 %
Above 120 FPS:		11.96 %		14.66 %		99.98 %
Above 100 FPS:		35.6 %		29.93 %		100 %
Above 90 FPS:		53.55 %		46.97 %		100 %
Above 60 FPS:		96.01 %		97.91 %		100 %
Above 45 FPS:		99.49 %		99.59 %		100 %
Above 30 FPS:		99.84 %		99.93 %		100 %

[attached image: 9yKazYm.png]




Lucky for us, AMD now has their own tool equivalent to ShadowPlay.

This guy benched it. I saw similar results. It's "free" performance wise.
http://hardforum.com/showpost.php?p=1040913868&postcount=25

Can you, when you have the time....explain what software was used in making these comparisons...I wouldn't mind testing mine to see if there are any tangible benefits....course I guess I could just use Fraps to do the same thing
 
Why do I feel like DICE is sprinkling 50ms sleep calls all over their DX code just to make Mantle look good?

Or are they really that bad when it comes to programming?

Or are you just grasping at straws in a desperate attempt to declare that Mantle is crap?
 
No attempt here. Mantle is pointless if this is the best AMD can come up with.
 
Try OBS, Wildace. You can record to your hard drive with that just like the Raptr app.

Couldn't get it to work with BF4 in full screen. I was using the monitor capture, and that works fine in windowed mode but not full screen; it just goes black.
 
Enable the frame log from the console in BF4, it'll crap out a spreadsheet.

PerfOverlay.FrameFileLogEnable 1

Use this to parse the data (or have excel shit out a graph)
http://www.overclock.net/t/1469627/...e-analyzer-version-4-2-released-major-release

Thanks...I'll give it a try....last question...did they ever iron out all the bugs with Mantle? Seems like one of the settings (I can't remember which) didn't work in Mantle at one time, so the end result was a slight decrease in visual quality vs. DX11.

OK, so on the left is Mantle......................................and on the right DX11....just ran around blowing stuff up on the test range
[attached image: Capture_zps86e8f67a.jpg]
And to be honest I may have pushed the Mantle run a little harder....blew more stuff up anyway.......looks like it has better min frame rates anyway...hmmm, maybe I should keep using Mantle...but it kinda sucks not having the RivaTuner OSD :)
 
This is why I do not want to use DX11 in BF4 with my 3x 290's and stock 4790K.

Max in-game settings with no motion blur. Same settings for both.

The left is DX11, the right is Mantle.
[attached image: rawr-2.jpg]
 
I will say this about Mantle: if you have enough GPU horsepower, sustaining 120fps on a 120Hz monitor is a lot easier than in DX11.

That is the thing I like about it.

I'm shocked how well Mantle works in BF4 using tri-fire.
 
Couldn't get it to work with BF4 in full screen. I was using the monitor capture, and that works fine in windowed mode but not full screen; it just goes black.

Oh yes, forgot about that. To do fullscreen captures many streamers used Dxtory, but that would defeat the purpose of AMD VCE. I guess you'll have to wait for AMD to fix it. What I would do if I were you is go to the latest Mantle driver thread and mention the issue you're having. Then take the time to fill out the error report that was linked by our AMD rep Warsam71 on [H]ardocp. That way they know that someone needs the feature worked on ASAP.
 
I will say this about Mantle: if you have enough GPU horsepower, sustaining 120fps on a 120Hz monitor is a lot easier than in DX11.

That is the thing I like about it.

I'm shocked how well Mantle works in BF4 using tri-fire.

Yeah, it is so smooth with 3x 290's. I ran the test again with only 2x 290's.

DX11 on the left, Mantle on the right. DX11 has a lot better consistency with only two GPUs, but Mantle still held above 120fps a bit better.
[attached image: RAWR2.jpg]


There is more room for the 3x 290's to breathe too, because they were only around 70% utilized in the first test, while the 2x 290 test was near max 99% load most of the time in both the DX11 and Mantle tests.
 
I'll be the first to admit...I don't exactly understand how to read these charts entirely....maybe I'll wrap my head around it sooner or later...I thought FPS was FPS....what's all this CPU/GPU % stuff? I understand max, min, and average frame rates...and I guess frame times make sense as well, but it confuses me with all this extra stuff......maybe someone can break it down for me
 
I'll be the first to admit...I don't exactly understand how to read these charts entirely....maybe I'll wrap my head around it sooner or later...I thought FPS was FPS....what's all this CPU/GPU % stuff? I understand max, min, and average frame rates...and I guess frame times make sense as well, but it confuses me with all this extra stuff......maybe someone can break it down for me

TL;DR: You want the highest minimums possible for smooth play. Having 200+ fps only to drop to 2 fps in intense combat is useless.
 
TL;DR: You want the highest minimums possible for smooth play. Having 200+ fps only to drop to 2 fps in intense combat is useless.

That I agree with...I just can't grasp why there's a CPU chart and a GPU chart...I thought the goal was just to record frame rates and frame times.....why are there two charts?
 
I'll wait for comparisons in an open-source engine like UE4 before drawing any sensible conclusions.

Yea man EA wanted their multi million dollar engine gimped for like 60 plus percent of users because fuck having a tech advantage over competitors. Also the various presentations about the engine, with large portions dedicated to PC performance and optimizations given by their lead graphics guy were actually complete fabrications and part of a many year long ruse.

Thief and Star Swarm have also sabotaged their renderer and any Mantle gains are purely an act of subterfuge. Dark forces in motion man for sure.

Uh huh
 
http://www.bytemedev.com/bf4-fta/faq/ - here is a link to what I assume is the BF4 frame-time analyzer being used. I couldn't decode anything further after a quick look.

I'm also having a bit of trouble understanding the charts.

My prior assumptions:

- Frame-time = milliseconds or equivalent frame-rate (from which we can calculate time).
- Frame-Rate: FPS min/avg/max.

I've seen this data organized in a few quickly comprehensible ways. Ex: Time above 30FPS, or Number or Percentage of frames at X>20ms or similar...

- CPU/GPU utilization: In % Percentage
--------------------
I have some hunches as to what the CPU FPS and the like could mean (CPU FPS: rate at which GPU is given frames? GPU FPS: rate of frame-rendering?)... Though there are some numerical inconsistencies using these hunches...

I guess I should search for a guide.

I don't play BF4 or own a current GPU, but Geez...I feel like a noob. Feels nice :)
 
That I agree with...I just can't grasp why there's a CPU chart and a GPU chart...I thought the goal was just to record frame rates and frame times.....why are there two charts?

The game tracks both, even with the in-game graph. Definitely makes it easier to see what the limiting factor in your system is. My Q6600 drags far behind my 7870 - it is far and away the limiting factor on my performance.
 
Yea man EA wanted their multi million dollar engine gimped for like 60 plus percent of users because fuck having a tech advantage over competitors. Also the various presentations about the engine, with large portions dedicated to PC performance and optimizations given by their lead graphics guy were actually complete fabrications and part of a many year long ruse.

Thief and Star Swarm have also sabotaged their renderer and any Mantle gains are purely an act of subterfuge. Dark forces in motion man for sure.

Uh huh

Isn't this the same EA that had to apologize over the extraordinarily poor programming effort demonstrated by DICE in BF4?

But I'm sure we're to take these benchmarks seriously.
 
The game tracks both, even with the in-game graph. Definitely makes it easier to see what the limiting factor in your system is. My Q6600 drags far behind my 7870 - it is far and away the limiting factor on my performance.

I guess I'm being retarded...there can only be one FPS.....how can there be a CPU FPS % and a GPU FPS %.....what does the % mean?
It says time spent above.......120fps (98.47%)...then it says CPU (97.81%), then GPU (0%) on one of the charts, for example.
What is the % for CPU and GPU indicating.....I can't wrap my head around it...please explain it...how can there be three sections when only one makes any sense? What is the % supposed to indicate? It can't be usage, so then what is it?
 
Isn't this the same EA that had to apologize over the extraordinarily poor programming effort demonstrated by DICE in BF4?

But I'm sure we're to take these benchmarks seriously.

Frostbite isn't a DICE thing any more, they're their own separate studio. It's EA wide middleware.

Have you even played the game? I have like 170 hours in it, and I've had it since launch. It's been fine for quite a while.

I guess I'm being retarded...there can only be one FPS.....how can there be a CPU FPS % and a GPU FPS %.....what does the % mean?
It says time spent above.......120fps (98.47%)...then it says CPU (97.81%), then GPU (0%) on one of the charts, for example.
What is the % for CPU and GPU indicating.....I can't wrap my head around it...please explain it...how can there be three sections when only one makes any sense? What is the % supposed to indicate? It can't be usage, so then what is it?

The CPU / GPU don't run in lockstep, they aren't guaranteed to be spending the exact same time in rendering a frame, but you ultimately need to wait on the weakest link to complete whatever it's doing to actually get the frame out the door. If you go into options and disable / enable something like HBAO and watch the numbers, you'll likely see only the GPU time change as it's an effect that can run entirely on the GPU.

The percentage is the time out of the total run that was spent above a given mark. Your example would mean the GPUs are what's holding performance back at that point.
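To make that concrete, here is an illustrative sketch of the "weakest link" model described above (my own approximation, not the analyzer's actual code): the delivered frame time is roughly the slower of the CPU and GPU times for that frame, and the "time spent above X FPS" percentage is the share of total run time made up of frames faster than the 1000/X ms budget.

```python
def effective_frame_times(cpu_ms, gpu_ms):
    """The frame can't go out the door until both the CPU and GPU work
    for it is done, so the delivered frame time is roughly the slower
    (larger) of the two per-frame times."""
    return [max(c, g) for c, g in zip(cpu_ms, gpu_ms)]

def percent_time_above(times_ms, fps_threshold):
    """Percent of total run time spent on frames faster than the mark.
    A 120 FPS mark means a frame budget of 1000/120, about 8.33 ms."""
    budget = 1000.0 / fps_threshold
    total = sum(times_ms)
    fast = sum(t for t in times_ms if t < budget)
    return 100.0 * fast / total
```

So a chart row like "Above 120 FPS: CPU 97.81%, GPU 0%" would come from feeding the CPU and GPU time columns through `percent_time_above` separately; whichever side has the lower percentage is your bottleneck at that mark.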
 
Using Mantle does not make the IPC deficiencies of the AMD FX CPUs go away. Mantle has less CPU overhead than DX, which frees up the CPU to do other things such as networking, AI, physics, player interaction, environment interaction, etc. It only allows you to use "more of the cpu" in the sense that the CPU can spend more time on other processing chores.

Comparatively speaking, the FX CPUs are as bad as benchmarks show them to be. They do lag behind many Intel CPUs on many, but not all, tasks. That much is clear. When factoring in cost and the average workload of the end user, the outlook changes slightly. The value of AMD CPUs lies in the fact that most current software isn't that demanding, and thus the AMD CPUs still have plenty of power for most chores.

Post is according to the chosen title (Bulls[H]it Master). All Windows benchmark(et)ing is optimized for Int-el (yes, I tease them); once you enter onto a leveled playing field, the situation changes radically. Enjoy...
 
I really need to get my hands on what you're smoking.

No one is debating which CPU is faster. We all know the Intel CPU is faster.

Mantle just allows slower CPUs to not bottleneck the game. It's far superior to the DX11 code path.

The uninformed are looking at this as AMD vs. Intel. The comparison is Mantle vs. DX, showing that Mantle increases CPU efficiency to the point that even an FX-8350 can run with an i7-4960X when using Mantle. Sure, AMD picked the most expensive CPU they could and claimed victory over a margin-of-error difference. That's just them dramatizing the comparison. It doesn't diminish what Mantle is accomplishing here. If they were running the DX path, the 4960X would destroy the 8350.
 
Post is according to the chosen title (Bulls[H]it Master). All Windows benchmark(et)ing is optimized for Int-el (yes, I tease them); once you enter onto a leveled playing field, the situation changes radically. Enjoy...

Derp....OK. That proves nothing we don't already know: that under highly multithreaded testing the AMD octo-core chips do nearly as well as Intel quad-core chips and sometimes surpass them. But they require 8 cores to Intel's 4 to do so. I'm going to quote the article you linked just so you realize even the article you are pushing as evidence isn't as rosy as you think, just in case you missed it.
Phoronix said:
In other words, the AMD FX-8350 is offered at a rather competitive value for fairly high-end desktops and workstations against Intel's latest Ivy Bridge offerings -- if you're commonly engaging in a workload where AMD CPUs do well.
Phoronix said:
In not all of the Linux CPU benchmarks did the Piledriver-based FX-8350 do well. For some Linux programs, AMD CPUs simply don't perform well and the 2012 FX CPU was even beaten out by older Core i5 and i7 CPUs.
So how exactly does your post refute my comment? Specifically this one:
CaptNumbNutz said:
They do lag behind many Intel CPU's on many, but not all tasks. That much is clear. When factoring in cost and average workload of the end user the outlook changes slightly.
Oh right...it doesn't. WTF was the point of your post again?

Your post is also wildly off-topic. We are discussing Mantle, and not only that, but we are discussing Mantle under Microsoft Windows. We aren't discussing obscure multithreaded Linux benchmarks that have nothing to do with Mantle. I swear you fanboys will throw as much off-topic shit as possible out there to prove an irrelevant point and make your product of choice look better.

As far as my chosen title, that comes from General Mayhem forum here at [H]. It has nothing to do with anything else. Since you have no clue why I have that title, you should kindly crawl back under your bridge to wallow. Your post reeks.
 
Using Mantle does not make the IPC deficiencies of the AMD FX CPUs go away. Mantle has less CPU overhead than DX, which frees up the CPU to do other things such as networking, AI, physics, player interaction, environment interaction, etc. It only allows you to use "more of the cpu" in the sense that the CPU can spend more time on other processing chores.

Comparatively speaking, the FX CPUs are as bad as benchmarks show them to be. They do lag behind many Intel CPUs on many, but not all, tasks. That much is clear. When factoring in cost and the average workload of the end user, the outlook changes slightly. The value of AMD CPUs lies in the fact that most current software isn't that demanding, and thus the AMD CPUs still have plenty of power for most chores.

Where did I state that IPC was fixed? But you might want to know that Mantle does not fix anything on the FX-8350. Mantle allows the FX-8350 to work as it was intended: the number of cores everyone laughed at is now fully used and able to do very well.
Keep pointing toward useless benchmarks which really don't matter in the first place.
 
I think people are too stuck on the CPU names, "FX-8350" & "i7-4960X".

It's about taking the CPU out of the coding paths, where it never belonged in the first place.

If Mantle can make the CPU irrelevant enough that I could use my old CPUs with a R9 295x2 and run games at full speed I'll be quite happy.
 
I think people are too stuck on the CPU names, "FX-8350" & "i7-4960X".

It's about taking the CPU out of the coding paths, where it never belonged in the first place.

If Mantle can make the CPU irrelevant enough that I could use my old CPUs with a R9 295x2 and run games at full speed I'll be quite happy.

For some people the elite factor is more important than overall progress.
They want to feel that their more expensive CPU is more beneficial all the time, and will put down anything that takes that away.

You see the same with GPUs, and comments that they are glad the next process is delayed, as it means their card is still number 1 for longer.
 
If Mantle can make the CPU irrelevant enough that I could use my old CPUs with a R9 295x2 and run games at full speed I'll be quite happy.

A $1500 video card for Plants vs Zombies? :D

Maybe they will at least bundle "Pin the tail on the donkey" with the 295x2. Bring some value to the card.
 
[attached image: VgI3BnH.jpg]


Question... if this was the same card, using Mantle in both instances, on two CPU's... why is the slower CPU faster?

Everything else is equal, so why does Mantle do a worse job when a faster CPU is present for it to work with?

@Elios. The 290x Mainstreet map 4K chart shows that Mantle gives at least a generational boost to performance.
This 4k chart?

[attached image: sDm99Bl.jpg]


This chart also shows that an R9 290 is faster than an R9 290X when both cards are tested running DirectX 11... which is pretty obviously ass-backwards.
 