BF4 -- Maybe we don't need mantle after all

These are benchmarks for the single-player campaign; I wonder how much it changes (if at all) for multiplayer. I know it's difficult to get consistent results in multiplayer, but there's at least one other benchmark out there that shows things flipping dramatically in Nvidia's favor during multiplayer.
 
Very nice. The 7990 looks great; it kicks the shit out of Titan and the 690 lol. Can't wait to pick up BF4 and give her a go :)
 
http://www.guru3d.com/articles_pages/battlefield_4_vga_graphics_performance_benchmark,1.html

http://www.techspot.com/review/734-battlefield-4-benchmarks/

AMD kicks nvidia in BF4 already, and by a good margin.
The 7990 is KING and the R9 290X easily beats Titan.
how much better can it get?

You left out the GTX 690 sitting between the 7990 and 290x...but regardless, if this holds up pre-Mantle, then I can only imagine what sort of killer performance advantage Mantle will bring...a 30-40 fps lead?
 
http://www.guru3d.com/articles_pages/battlefield_4_vga_graphics_performance_benchmark,1.html

http://www.techspot.com/review/734-battlefield-4-benchmarks/

AMD kicks nvidia in BF4 already, and by a good margin.
The 7990 is KING and the R9 290X easily beats Titan.
how much better can it get?

:eek:

Techspot's review looked good, and Guru3D adds weight to the 290x kicking serious ass, with the 7990 in a league of its own!

My guess would be Mantle will send the 290x & 7990 fps into the stratosphere! Perhaps more meaningful will be the boost the 7790, 7950, and 280x cards should receive! :eek:

Wow!
 
If the 690 had more memory I think it would do better at 4K. 3GB 680's in SLI would do much better than the 690.

Nvm - just realized that they're 4GB.
 
If the 690 had more memory I think it would do better at 4K. 3GB 680's in SLI would do much better than the 690.

Nvm - just realized that they're 4GB.

You realize technically it's only 2GB per GPU.
 
7970 (standard, ghz, and 280x) is beating 770, too.
I guess the new Frostbite3 engine runs better on AMD cards (either that, or 2gb on a standard 770 is hurting its performance).
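For a sense of scale on the VRAM question, here's a rough back-of-envelope of what a deferred renderer's full-screen buffers alone cost at 4K. The per-target counts and formats below are illustrative assumptions, not Frostbite 3's actual layout:

```python
# Rough VRAM estimate for full-screen render targets at 4K (3840x2160).
# Deferred renderers like Frostbite 3 keep several such buffers resident;
# textures, meshes, and shadow maps come on top of this.
def render_target_bytes(width, height, bytes_per_pixel, count=1):
    return width * height * bytes_per_pixel * count

W, H = 3840, 2160
gbuffer   = render_target_bytes(W, H, 4, count=4)  # assumed 4 targets, 32bpp
hdr_color = render_target_bytes(W, H, 8)           # assumed 64-bit HDR target
depth     = render_target_bytes(W, H, 4)           # 32-bit depth/stencil
total_mib = (gbuffer + hdr_color + depth) / 2**20

print(f"~{total_mib:.0f} MiB just for full-screen buffers")  # ~221 MiB
```

Add MSAA, or duplicate everything per GPU as AFR does, and this only grows, which is why the 690's effective 2GB per GPU gets tight at 4K.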
 
I wish people would bench 4GB 770s as well as 2GB ones. It's not a fair comparison to use a card with less VRAM than the requirements state, which might hurt its performance, when there's a version with extra memory that still won't give it any advantage over its competition (the 7970 and 280x).

I don't know if that made sense; basically, as somebody thinking about getting a 280x or a 4GB 770, I'd like to see whether the 4GB model alleviates any problems caused by a potential lack of VRAM on the 2GB 770.
 
7970 (standard, ghz, and 280x) is beating 770, too.
I guess the new Frostbite3 engine runs better on AMD cards (either that, or 2gb on a standard 770 is hurting its performance).

probably both
 
where are the CPU benchmarks? That's what matters to me :(

JK, found it, but I find it very hard to believe.
CPU_01.png
 
I wish people would bench 4GB 770s as well as 2GB ones. It's not a fair comparison to use a card with less VRAM than the requirements state, which might hurt its performance, when there's a version with extra memory that still won't give it any advantage over its competition (the 7970 and 280x).

I don't know if that made sense; basically, as somebody thinking about getting a 280x or a 4GB 770, I'd like to see whether the 4GB model alleviates any problems caused by a potential lack of VRAM on the 2GB 770.

I think people make too big a deal out of the 2GB vs 4GB argument. I posted something a while back comparing a GTX 770 2GB vs 4GB at 2560x1600, and there was no more than a 2 fps difference in any game; sometimes the 2GB even beat out the 4GB.
 
I think people make too big a deal out of the 2GB vs 4GB argument. I posted something a while back comparing a GTX 770 2GB vs 4GB at 2560x1600, and there was no more than a 2 fps difference in any game; sometimes the 2GB even beat out the 4GB.

It depends on the game. Before BF4, there wasn't really any game whose requirements suggested more than 2GB of VRAM. Now, that extra RAM matters.
 
I think people make too big a deal out of the 2GB vs 4GB argument. I posted something a while back comparing a GTX 770 2GB vs 4GB at 2560x1600, and there was no more than a 2 fps difference in any game; sometimes the 2GB even beat out the 4GB.


Yeah, but that's in cases where more than 2GB isn't required. If you hit 2GB of actual usage (not just allocation) when you only have 2GB available, you're gonna have a bad time. Looking at Guru3D's benchmark, though, 2GB isn't a problem anyway; but that's in single player, and maybe multiplayer can use more due to larger maps?

Remember, during the beta this place did a performance review and thought gameplay on the 770 wasn't smooth because of the 2GB of VRAM, even at 1080p. So it will be interesting to see whether they find any change in the release version and what they say about it. That's why I'm more interested in the [H] benchmark review than the others.
 
where are the CPU benchmarks? That's what matters to me :(

JK, found it, but I find it very hard to believe.
CPU_01.png

The Windows 8 optimizations reduce CPU overhead a lot in BF4 apparently.

I imagine multiplayer will still be a lot more taxing on the CPU, though.
 
The Windows 8 optimizations reduce CPU overhead a lot in BF4 apparently.

I imagine multiplayer will still be a lot more taxing on the CPU, though.

I'm waiting for more conclusive benches before I pull the trigger on a 4930k. Which is probably a bad idea considering Haswell-E is going to be a big shift.
 
I think most games for the next few years will be catered towards AMD simply because AMD won all of the consoles. A game studio (most of them design around consoles first) will already have a game using the AMD APIs and feature set, vs the Nvidia solution, where they would have to accommodate that after the fact.
 
Hahah, so you'll get 120fps in one game with the 7990, and have shit for scaling everywhere else unless the new drivers fixed all the frame-pacing stuff.
 
I think most games for the next few years will be catered towards AMD simply because AMD won all of the consoles. A game studio (most of them design around consoles first) will already have a game using the AMD APIs and feature set, vs the Nvidia solution, where they would have to accommodate that after the fact.

But more people on PCs are running Nvidia hardware, so game companies are surely not going to ignore that when porting games to the PC.
 
I suspect Mantle will be a big boon to the mid range segment of the market, if it enables a 280x to pull 290x frame rates then that suddenly becomes a pretty big deal.
 
Mantle will likely be a large CPU win more than anything, with AMD's chips winning biggest given the weaker per core performance, but with more of them to throw around.
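The CPU-overhead point can be sketched with a toy model: if every draw call costs the submission thread some fixed time, cutting that cost raises the CPU-bound frame-rate ceiling. All numbers below are made up for illustration, not measurements of D3D11 or Mantle:

```python
# Toy model of a CPU-bound frame-rate ceiling from draw-call overhead.
def cpu_fps_ceiling(draw_calls, us_per_call, other_cpu_ms):
    """Max FPS when the CPU spends us_per_call microseconds per draw
    call plus other_cpu_ms milliseconds of game logic per frame."""
    frame_ms = draw_calls * us_per_call / 1000.0 + other_cpu_ms
    return 1000.0 / frame_ms

draws = 5000
high_overhead = cpu_fps_ceiling(draws, us_per_call=3.0, other_cpu_ms=5.0)
low_overhead  = cpu_fps_ceiling(draws, us_per_call=0.5, other_cpu_ms=5.0)
print(f"ceiling: {high_overhead:.0f} fps -> {low_overhead:.0f} fps")  # 50 -> 133
```

A slower core (larger effective us_per_call) gains proportionally more from the same cut, which is why a thin API would flatter AMD's many-but-weaker cores.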
 
I thought the best gauge was the TechSpot review showing the HD 7950 Boost faster than the GTX 680. Remember when it was GTX 680 vs HD 7970? Now the 680 has to battle the lesser HD 7950 and its Boost model, which shows how much better the game runs on AMD hardware.
 
You guys make me laugh. AMD had all the time in the world to optimize this game. You honestly believe it will stay this way? Nvidia will optimize drivers, the 690 will be back on par with the 7990, and the 780/Titan will be right on pace with the 290x. It cracks me up that people act like AMD has godly performance now because one game they backed runs better on their cards two days after release :rolleyes:
 
It cracks me up that people act like AMD has godly performance now because one game they backed runs better on their cards two days after release :rolleyes:

Most games going forward will be backed by AMD because of the next-gen consoles...I'm sure Nvidia will bring performance up with drivers, but they might not be released as fast as they used to be.
 
Most games going forward will be backed by AMD because of the next-gen consoles...I'm sure Nvidia will bring performance up with drivers, but they might not be released as fast as they used to be.

Point is what? It used to be the other way around for a long, long time. 95°C just so you don't have to wait a few days for drivers, for the same performance? Not worth it. I was all for the 290x till I saw the temps. The 780 is the better buy now that the price dropped, no question.
 
Point is what? It used to be the other way around for a long, long time.

point is exactly that...it used to be that way and now things might swing the opposite way...hopefully Nvidia owners will be as patient...if Nvidia plays their cards right they might be able to get in one of the next-next gen consoles in 2019 :D
 
point is exactly that...it used to be that way and now things might swing the opposite way...hopefully Nvidia owners will be as patient...if Nvidia plays their cards right they might be able to get in one of the next-next gen consoles in 2019 :D

It's not going to make a difference, really. With AMD-backed games we may have to wait a week or so, but for most devs' games Nvidia will have plenty of access to optimize their drivers before release.
 
You guys make me laugh. AMD had all the time in the world to optimize this game. You honestly believe it will stay this way? Nvidia will optimize drivers, the 690 will be back on par with the 7990, and the 780/Titan will be right on pace with the 290x. It cracks me up that people act like AMD has godly performance now because one game they backed runs better on their cards two days after release :rolleyes:

If you are this bitter now..wait til Mantle kicks in :p
 
A few guys that have 780's have been telling me the frame rate on their cards is not all that great with any type of eye candy turned on
 
A few guys that have 780's have been telling me the frame rate on their cards is not all that great with any type of eye candy turned on

I see a 50% gain over my previous 7970 in most games. The 7970 @ 1200/1650 scores just under 50fps in Valley, while my 780 at the clocks in my sig scores about 75fps. Just over 10,000 in 3DMark11 on the 7970 and about 15,000 on the 780. I know these are just benchmarks. It depends on whether you have the CPU to push it, and what game they are playing as well. That's what I felt with my first 680: it didn't clock for shit, and my good-clocking 670 walked all over it and my 7970. My 670 did do almost 1400 core tho :) should have kept her.

I'm not bitter, I'm a realist. Funny that AMD, instead of making a new CPU architecture, decided to make Mantle to cover up how slow their CPUs are. Seriously, a 4670K at 4.5GHz beats an 8350 at 5GHz in BF4. Brutal: double the cores and 500MHz faster, and it still can't keep up. Pathetic.
 
I'm not bitter

You sure sound bitter in all these threads.

Some of you nvidia guys get so worked up anytime something possibly positive is posted about AMD. As though it threatens your way of life and you need to justify why it's not better than nvidia. Talk about making people laugh.
 
I'm not bitter, I'm a realist. Funny that AMD, instead of making a new CPU architecture, decided to make Mantle to cover up how slow their CPUs are. Seriously, a 4670K at 4.5GHz beats an 8350 at 5GHz in BF4. Brutal: double the cores and 500MHz faster, and it still can't keep up. Pathetic.

LOL, AMD don't have the best CPUs, I won't argue that, but developing Mantle as a cover-up? You've got to be kidding me. AMD have been working on APUs for years now, and because of this they got their hardware into both the PS4 and Xbox One. That's what led to Mantle.

Besides, they haven't got the resources or the money that Intel has. Intel can throw billions into R&D and not even miss it; AMD can't do that. I think their focus is HSA and they are probably gambling on it, to be honest. They are still pretty strong in the server market, but with so little money something has to suffer, and I guess consumer desktop CPUs are losing out.
 
Anyone want to buy a GTX 780? lol I am sure COD is going to run great. :rolleyes:
 
You sure sound bitter in all these threads.

Some of you nvidia guys get so worked up anytime something possibly positive is posted about AMD. As though it threatens your way of life and you need to justify why it's not better than nvidia. Talk about making people laugh.

lol. I've owned 5850 crossfire, 6950 crossfire unlocked, 7950 crossfire, and a 7970. I'm not an Nvidia guy. I'm definitely an Intel guy tho. It's just funny people think that because of these consoles AMD has something better to offer. You're all in dreamland. It's still AMD. I hope Mantle does go somewhere, as it'll create competition, which is good for everyone. I would have bought a 290x if it didn't run at 95C, no question. Sold my reference 780 and sat on a 7950 for a month. I gave in to the hype and I was disappointed. I don't see anything other than triple-slot coolers keeping the heat off those 290x cards either. They draw a TON of power.
 