BF4 -- Maybe we don't need Mantle after all

I am a huge AMD fan, but I do think Nvidia will gain some ground with their next driver update. I want a 290 bad, but my friend is giving me a GTX 580, and well... my needs are met by a 560 right now, so no need for overkill anyway.
 
You sure sound bitter in all these threads.

Some of you Nvidia guys get so worked up anytime something possibly positive is posted about AMD, as though it threatens your way of life and you need to justify why it's not better than Nvidia. Talk about making people laugh.

People internalize and get personal with video card brands... it becomes part of their identity. So you get the rival brand threatening that identity... It sounds sad, hence a lot of denial. Agree though, it is entertaining from a cognitive perspective :p:D:cool:
 
I had to overclock my FX to 4.5GHz to get it smooth with a 290X. Even in campaign mode, it felt a little bogged down in the second mission with the tank and the heavy rain, and then the ocean level felt a little off too until I overclocked my CPU.

It's a trade-off; my room is noticeably hotter. I will probably lower game settings in the summer and run my CPU at stock, maybe even underclock my 290X too.
 
People internalize and get personal with video card brands... it becomes part of their identity. So you get the rival brand threatening that identity... It sounds sad, hence a lot of denial. Agree though, it is entertaining from a cognitive perspective :p:D:cool:

Well, in fairness, there was a lot of "brand bragging" in the first part of this thread. But we all know the reality is that BF4 is just one game (and a poor one at that), and one whose developers AMD worked closely with. The results aren't surprising, and we know they're not going to stay that way; they certainly aren't reflective of the industry at large.

It's kinda like bragging about a home run in the first inning.
 
Well, in fairness, there was a lot of "brand bragging" in the first part of this thread. But we all know the reality is that BF4 is just one game (and a poor one at that), and one whose developers AMD worked closely with. The results aren't surprising, and we know they're not going to stay that way; they certainly aren't reflective of the industry at large.

It's kinda like bragging about a home run in the first inning.

Disagree. The OP had some objectivity, with some proof in the links. The follow-up posts were all fair weather, just agreeing with the links, nothing over the top... then there's one guy with multiple sour grapes... entertainingly so.

I wouldn't downplay a major title like BF4 (not my favorite genre); frankly, that just sounds foolish considering these types of heavy hitters in the industry will be major marketing promos. You'd better bet your bottom dollar these performance-advantage benchmarks are gonna be huge in the video card gaming industry. As enthusiasts we know this one AMD-optimized game =/= all-encompassing top to bottom across other PC titles; however, we are the minority here... your average Joe Six-Pack at Fry's isn't gonna see it the same way. Just look at past generations with TWIMTBP games that were heavily promoted to sell cards.

Most people would disagree with you as well on being surprised with the results, as evidenced. I was expecting AMD to be slightly above Nvidia (who always delivers, btw); I wasn't expecting it to be this good on release drivers. I was expecting a home run... this was a grand slam in the first inning! Yes, AMD has bragging rights.
 
http://www.guru3d.com/articles_pages/battlefield_4_vga_graphics_performance_benchmark,1.html

http://www.techspot.com/review/734-battlefield-4-benchmarks/

AMD kicks Nvidia in BF4 already, and by a good margin.
The 7990 is KING and the R9 290X easily beats Titan.
How much better can it get?

Of course we need Mantle. I don't see Mantle as a way for AMD to beat Nvidia in reviews, but as something that brings more performance to us consumers without making it worse for others.

Perhaps if AMD blocked Mantle on their high-end cards, people would see Mantle as something more than a benchmark pissing contest for the performance crown. Mantle might give gaming capability to lower-end and mainstream gamers, who without it might have a worse time playing BF4.

IF Mantle removes some of the CPU overhead, mainstream gamers might not need to buy expensive high-end CPUs to get decent, smooth gameplay. IF Mantle improves GPU rendering, mainstream gamers might get a better gaming experience with higher framerates.
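The CPU-overhead argument boils down to a simple frame-time model: a frame ships no faster than the slower of the CPU's submission work and the GPU's rendering work. A toy sketch, with all millisecond figures invented purely for illustration:

```python
# Toy model: with CPU and GPU work pipelined per frame, the frame
# rate is bounded by whichever side takes longer. Numbers made up.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower processor sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A mainstream CPU spending 25 ms/frame on API overhead bottlenecks
# a GPU that only needs 16 ms/frame:
print(fps(25, 16))  # 40.0 fps, CPU-bound

# Cut the CPU's submission cost to 10 ms/frame and the same GPU
# becomes the limit:
print(fps(10, 16))  # 62.5 fps, GPU-bound
```

In this sketch, the GPU never got faster; shrinking the CPU side alone lifted the frame rate, which is why a thinner API could matter most on mainstream CPUs.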
 
If an R9 290X GPU gets to 95C using DirectX, how hot is it going to get using the more GPU-intensive Mantle? Is it possible that if Mantle is any good, it could actually kill AMD's own GPUs? :p
 
I'm serious though, is it possible? Could the more intensive GPU utilisation caused by Mantle cause it to overheat?

I seriously doubt it. Mantle is there to do things more efficiently. Something can be inefficient and cause 100% GPU load, while something else can be more efficient and also cause 100% GPU load. In both cases, the load is the same.

The overheating due to drivers that we heard of with Nvidia cards happened because the software regulating this didn't function, not because the drivers made the cards work harder.
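That regulation point can be sketched as a toy throttle loop. The temperature target and step values below are invented for illustration, not AMD's actual PowerTune parameters:

```python
# Toy thermal-throttle loop: the card protects itself by shedding
# clocks at its temperature target no matter how hard the software
# drives it. All numbers are invented.

TEMP_TARGET_C = 95
CLOCK_MAX_MHZ = 1000
CLOCK_MIN_MHZ = 300
CLOCK_STEP_MHZ = 13

def throttle_tick(temp_c: float, clock_mhz: int) -> int:
    """One control tick: drop clocks above target, recover below it."""
    if temp_c > TEMP_TARGET_C:
        return max(CLOCK_MIN_MHZ, clock_mhz - CLOCK_STEP_MHZ)
    return min(CLOCK_MAX_MHZ, clock_mhz + CLOCK_STEP_MHZ)

# Above target: clocks come down, so heat output falls.
print(throttle_tick(97, 1000))  # 987
# At or below target: clocks recover toward maximum.
print(throttle_tick(90, 987))   # 1000
```

As long as a loop like this is running, software can only push the card to its temperature target, not past it; the failures discussed above are cases where the regulation itself stopped working.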
 
I suspect Mantle will be a big boon to the mid-range segment of the market; if it enables a 280X to pull 290X frame rates, then that suddenly becomes a pretty big deal.

lol.
 
I'm not a "green camp" guy or a "red camp" guy really, I just love hardware and fast GPUs.

I will say that with the 290X performance being so close to a Titan, I would still choose a Titan over a 290X for the lower power consumption, lower heat, and better build quality. I also run multi-GPU (I had three 7970s, went to dual GTX 680s, and now dual Titans), and AMD's CrossFire drivers and frame-pacing issues still scare me.

I'd rather have Nvidia's better drivers, better build quality, and lower heat & power consumption at this point. I do hope AMD comes out with some kind of 290X "Ultra" with a better cooler and lower temps. I'd be all over that and buy 2 or 3 of them.
 
I ended up not going with a 290X because I thought it would be too hot, and just bought a second 7950.

As the cards are reasonably close together, the primary card gets up to 89 degrees anyway. I shouldn't have been worried; it's not really that big a deal, particularly if you wear headphones.

I think I'll be picking up dual 780s or 290s when the dust settles, though.
 
If an R9 290X GPU gets to 95C using DirectX, how hot is it going to get using the more GPU-intensive Mantle? Is it possible that if Mantle is any good, it could actually kill AMD's own GPUs? :p
New Troll. Cool.

I'm serious though, is it possible? Could the more intensive GPU utilisation caused by Mantle cause it to overheat?
You seem... familiar.
Have.... you met IdiotInCharge?

I'm not a "green camp" guy or a "red camp" guy really, I just love hardware and fast GPUs.
False. You are green camp.

I will say that with the 290X performance being so close to a Titan, I would still choose a Titan over a 290X for the lower power consumption, lower heat, and better build quality.
Better build quality? You mean worse?
The 290X simply outclasses the Titan in the build quality of the card/PCB.
So you admit you would pay double for a card that is actually worse than the 290X? No wonder you chose that opening sentence.

I'd rather have Nvidia's better drivers, better build quality, and lower heat & power consumption at this point. I do hope AMD comes out with some kind of 290X "Ultra" with a better cooler and lower temps. I'd be all over that and buy 2 or 3 of them.
Better drivers, false.
Better build quality, false.
Lower heat, yes.
Better power consumption, who cares.
 
If an R9 290X GPU gets to 95C using DirectX, how hot is it going to get using the more GPU-intensive Mantle? Is it possible that if Mantle is any good, it could actually kill AMD's own GPUs? :p

Either you are:

1) really really REALLY stupid...

or

2) Trollin.

Read and understand how GPU thermal design limits work before you speak next time.
 
Either you are:

1) really really REALLY stupid...
or
2) Trollin.

Read and understand how GPU thermal design limits work before you speak next time.

Let's pretend it's me and I'm stupid. Explain how thermals could not possibly be a limiter if API improvements removed a CPU bottleneck and gave the GPU more work to do.

I don't mean that the 290X specifically would "overheat", obviously, but there are implications to fan and clocks.
 
That's the thing to understand: it's not giving the card MORE work to do, it's allowing the card to do a greater amount of work in the same time through optimizations. A pseudo-example:

Pretend the card has to do this now:

1. Load texture A into buffer
2. Load a filter for use
3. Apply filter to Texture A
4. Save result to buffer
5. Remove texture A
6. Remove filter A

Total time: 100 ms

Now say they have an optimized API (Mantle). Maybe they can do something like:

1. Load AND apply a filter to a texture without having to preload it.
2. Save the result to buffer

Total time: 60 ms

It's the same amount of work done in a smaller amount of time, freeing the GPU to start on the next job 40 ms sooner. So: more work done in a shorter amount of time.
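The same comparison in toy code, with per-step costs invented so the totals match the 100 ms and 60 ms figures above:

```python
# Hypothetical per-step costs (ms) for the six-step path above.
legacy_steps = [
    ("load texture A into buffer", 25),
    ("load filter", 15),
    ("apply filter to texture A", 30),
    ("save result to buffer", 10),
    ("remove texture A", 10),
    ("remove filter", 10),
]

# Fused path: load-and-apply in one call, then save the result.
fused_steps = [
    ("load and apply filter to texture A", 50),
    ("save result to buffer", 10),
]

legacy_total = sum(cost for _, cost in legacy_steps)  # 100 ms
fused_total = sum(cost for _, cost in fused_steps)    # 60 ms

# Same output either way; the fused path just frees the GPU to
# start the next job sooner.
print(legacy_total, fused_total, legacy_total - fused_total)  # 100 60 40
```

The key property is that both paths produce the same result; only the time to produce it shrinks, which is why "more efficient" does not mean "more heat per unit of work".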
 
If you spend $8,000,000 on a game it had better work well on your hardware.

Mantle is mostly marketing. Once it came out that consoles won't be using it, the hype died off almost instantly.
 
If you spend $8,000,000 on a game it had better work well on your hardware.

Mantle is mostly marketing. Once it came out that consoles won't be using it, the hype died off almost instantly.

Unless you've seen benchmarks, you've clearly not understood the concept and are just spreading FUD. It could be marketing; however, it might have potential.
 
That's the thing to understand: it's not giving the card MORE work to do, it's allowing the card to do a greater amount of work in the same time through optimizations. A pseudo-example:

I'm not disputing that a new API could expose new ways to do things which are simply more efficient in themselves, but the only improvement we've seen quoted publicly is with respect to draw calls vs. DirectX, so I had that example in mind.

That specifically is the sort of bottleneck removal that could have thermal implications.
 
I'm not disputing that a new API could expose new ways to do things which are simply more efficient in themselves, but the only improvement we've seen quoted publicly is with respect to draw calls vs. DirectX, so I had that example in mind.

That specifically is the sort of bottleneck removal that could have thermal implications.


o_O

If a GPU is already running at 100%, having more efficient code won't make it run at 150%; it will just do what it does more efficiently.

Emmm.....whoa
 
Either you are:

1) really really REALLY stupid...

or

2) Trollin.

Read and understand how GPU thermal design limits work before you speak next time.

I was asking a question; is there a problem with that, "collegeboy69us" (great name, btw)? Does the fact I don't know how "GPU thermal design limits work" make me "really really REALLY stupid"? :rolleyes:

Well, there you go, people: according to collegeboy69us, if you don't know how GPU thermal design limits work and ask a question related to it, you're an idiot.

Sorry, I won't ask questions about things I don't understand properly in future; that way I'll be sure not to learn anything.
 
Let's pretend it's me and I'm stupid. Explain how thermals could not possibly be a limiter if API improvements removed a CPU bottleneck and gave the GPU more work to do.

I don't mean that the 290X specifically would "overheat", obviously, but there are implications to fan and clocks.

An API improvement is not about "more work" being shifted onto the GPU. It's a layer for software to interface with the GPU itself. Being closer "to the metal" means you can get away with cooler/slicker things that would otherwise take MORE processing power (CPU & GPU alike). Think of it as an all-around efficiency booster. A developer wants to do X and Y for a game... if the Mantle route to produce the same output takes 20% fewer cycles, then how/why would we be worried about worse thermals?

Generally, the closer you are to the metal, the less hardware you are compatible with. DirectX is pretty much a universal graphics API -- and it's not that efficient at all.

I won't disagree that 95C is toasty -- I sure wouldn't want my GPU able to almost boil water 24/7 (I like to LTC mine as a hobby). AMD has one badass chip on their hands, and will soon be releasing a huge bucket of lube to make it even slicker (Mantle in December). The API itself will be more efficient in almost every respect; if you want to reach extra far as that applies to thermals, I see no reason a more efficient API would cause an increase in heat via the GPU -- the work being performed is the same, it's just now being done more efficiently.

Aftermarket will solve that problem.
 
Two of the funniest posts ever in this thread: first, Mantle may explode GPUs, and then the Titan has $500 worth of heatsink.
 
If you spend $8,000,000 on a game it had better work well on your hardware.

Mantle is mostly marketing. Once it came out that consoles won't be using it, the hype died off almost instantly.

Despite all the communication from AMD, you still don't understand what Mantle is for?

Mantle is AMD's interpretation, for the PC platform, of the hardware features of the chips used in consoles. It's there so that to-the-metal optimized code from consoles can be transferred to PC games easily.

It's not used on consoles because it is not needed. They just program straight to the metal when desired.
 
It's not used on consoles because it is not needed. They just program straight to the metal when desired.

It's not needed because each console has its own API. Neither company wants a separate proprietary API screwing things up, and neither do PC gamers.

So unless AMD is willing to spend a LOT of money to sucker a company into using it, no company has any motivation to waste their time on it.

Now do you understand?
 
Despite all the communication from AMD, you still don't understand what Mantle is for?

Mantle is AMD's interpretation, for the PC platform, of the hardware features of the chips used in consoles. It's there so that to-the-metal optimized code from consoles can be transferred to PC games easily.

It's not used on consoles because it is not needed. They just program straight to the metal when desired.

Most of the "communication" has been obfuscated double-speak. All we really have to go on is Johan's presentation. And even that didn't say what most people deduced.

We won't really know jack until the November developers' conference. And who knows, maybe we'll get the usual hand-waving there too.
 
In the TechSpot review using Windows 8, the 290X's performance is not close to a Titan's... it's (hauling ass) past the Titan.

http://www.techspot.com/review/734-battlefield-4-benchmarks/page3.html

The performance gap in favour of the 290X over the Titan is so huge in that review that I was impressed, then questioned it. But then Guru3D, and now [H], all show similarly massive gains by the 290X, all in different scenarios. Just WOW.

It's like 15%. You'll see at least that much improvement in just a single typical driver update.
 
It's like 15%. You'll see at least that much improvement in just a single typical driver update.

The 7970 GHz Edition is either tied with or beating the 780 on the charts... so I have to say it has to be a driver issue... or the Frostbite 3 engine is that much more optimized for AMD, which is a bit strange (IMHO), because the previous versions of the Frostbite engine seemed to be more optimized for NV cards.
 
I don't know where the hell these benchmarks came from. I just did a bench run with Fraps on Flood Gate, a full round, like 20 minutes of gameplay, and this is what I got:
Min: 47
Avg: 85
Max: 179
I never once saw less than 65 fps in game; don't know where Fraps picked up the 47, probably dying or something. Seriously, it plays and looks amazing, every setting maxed out. Likely all the charts are single-player, and Nvidia focused on multiplayer first, obviously. Also, this was at stock clocks (steady boost of 1071/6204).
 
Updating to the new AMD driver has caused me to have even more crashes in BF4. This is really making me lose my faith in AMD. I badly want to wait for the 290 non-X reviews and the Mantle reviews, but the constant driver issues are really, really getting on my nerves.
 
Updating to the new AMD driver has caused me to have even more crashes in BF4. This is really making me lose my faith in AMD. I badly want to wait for the 290 non-X reviews and the Mantle reviews, but the constant driver issues are really, really getting on my nerves.


It's the game, is my bet.
Server-side patches went out today, and it seems it's getting better.
I played on a 48-player server today for several hours without a crash.
It still happens now and then, but it's not the driver.
 