Battlefield Hardline Video Card Performance Preview @ [H]

FrgMstr
Battlefield Hardline Video Card Performance Preview - We hopped on the open public beta of Battlefield Hardline this past week and tested performance on all three maps with six video cards to find out how this game performs. We will talk about each map in the beta and our experiences with performance and gameplay so that you will know what to expect in the full game.
 
Thanks for the VRAM guide, Brent. On a side note, is this a GameWorks game?
 
Hardline has not been on my radar much. After BF4's biffed release and ongoing bugs, I never even bought it. At least they seem to have the bugs under control (at least on these few maps), but inconsistent performance is worrying, particularly with only a month to go before release.
Do we know if it has gone Gold yet?

BF Hardline needs an upgrade in terms of visuals. While it uses the same game engine as BF4 it doesn't seem that any modification or enhancements have been made. This game certainly does not feel like a "next generation" 2015 game. BF4 was released in 2013. If this game were released right alongside BF4 in 2013 it would feel right at home. BF Hardline is basically BF4 with cops and robbers and terrible AMD GPU performance in its beta release.
I am not, therefore, impressed.
 
Are people over-exaggerating Nvidia's texture compression? The difference in VRAM usage between AMD and Nvidia cards in this preview could just be chalked up to variance.
 
A very well written article. You've clearly written this to address what has been learned from the recent 970 fiasco. I find it interesting that your recommended cards make no mention of the 970 in your closing, given that it kept pace and in one test bested the 290X.

Being that I am in the market for a new card, I've really been weighing the 290X against the 970. Both have pros and cons. Seeing how the VRAM usage is nearing 3.5GB with these dated visuals is really giving me pause about the 970. But those AMD drivers....ugh...

Thanks so much for the article.
 
For 5 frames for an extra $280, I will stick with my old 290X setup. I'm sure a driver update from AMD will fix the issue you are talking about with 2x CrossFire. I didn't have that issue playing the beta, but I am also running 3x 290X in CrossFire. It was smooth as hell at 3978x2238 with 4x AA and 125% resolution scaling. The game still looked horrible though.. lol.
 
I've been playing this game all weekend and performance is good.

At 1200p on Ultra with a 7970 GHz I get a 63 fps average in the in-game bench, and I played both the 32-player map and the 64-player map with no complaints.

I will have to turn on Fraps next time in 64-player Dust Bowl to see if there is an fps hit when the storm hits.
 
Fps averaging in the 40s at 4K is unplayable? Was there some kind of hitching, or is it about the game being a competitive FPS where sub-60 frame rates just aren't acceptable for the genre? I didn't see any graphs for 4K frame times.
 
I'm getting 64 fps (using the in-game benchmark) with 970s in SLI at 4K, all maxed out, no AA. The graphics really aren't bad, but they do seem very dated.
 
The statement is correct.

After seeing it elsewhere, is [H] doing any kind of article on the GTX 970 3.5GB+0.5GB to-do? All I saw were canned benchmarks, and I am very curious to see if the architecture actually has a real-world impact. I'm thinking of grabbing one, but I'm not sure I want to pull the trigger until I know whether this is a real issue or not.
 
Hardline has not been on my radar much. After BF4's biffed release and ongoing bugs, I never even bought it. At least they seem to have the bugs under control (at least on these few maps), but inconsistent performance is worrying, particularly with only a month to go before release.
Do we know if it has gone Gold yet?


I am not, therefore, impressed.
It's not gold yet, but I am sure it will go gold soon... BTW, it's not worth buying.
The game felt like a rebooted Bad Company 2, where they decided to add a few BF add-ons but use the same engine. So you kind of have BC2 with fewer maps, new game types, and a rewritten skill/item setup, i.e. a different way to earn new equipment based on cash + XP + kill count. The beta felt half-baked to me... oh, let's not forget the only BF feel was that the game had some old BF4 exploits injected into it, along with known BF4 cheat software.
I found it nice and refreshing to see you can drive your vehicle or land a helicopter in an out-of-zone area and snipe away... heck, you can even drive the armored vehicle, or fly up onto a mountain / hillside / building in a small dead-zone area, and just fire away at your prey with its Gatling-style machine guns...
Oh well ... :D
 
The general consensus on BF Hardline is a tone of gamers being unimpressed. We have followed gamer feedback on our forums, and even noted in-game chat while playing the game. The consensus is that this game feels like a simple mod for BF4, or a piece of BF4 DLC. It is everything BF4 is, just with a "cops vs. robbers" theme running throughout. Battlefield Hardline has been compared to other games in this genre and doesn't seem to stand out.

I'm in the minority then; I had a crap ton of fun with the beta. Thanks for the review, Brent and Kyle, interesting read.
 
This thread will stay on topic. Should you wish to discuss 3.5GB 970 cards, you need to do it in another thread. You will be banned otherwise.
 
Interesting, the performance preview mentions 1440p users needing 4GB of VRAM while 1080p users only need 2GB... is there that big of a jump between 1080p and 1440p? Where does 1200p (16:10) fall in this? Basically the same as 1080p, or closer to 1440p?
 
Interesting, the performance preview mentions 1440p users needing 4GB of VRAM while 1080p users only need 2GB... is there that big of a jump between 1080p and 1440p? Where does 1200p (16:10) fall in this? Basically the same as 1080p, or closer to 1440p?

Re-read: I recommend 3GB for 1080p; it was pushing the capacity of the GTX 960 at 2GB.

For 1440p it was pushing 3GB, so only a 1GB difference between the resolutions. But it was just over 3GB, meaning 3GB would be limiting overall as well, thus the recommendation for 4GB.
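As for where 1200p falls, raw pixel counts give a rough feel. Render targets scale with pixel count while textures do not, so treat this as a loose guide rather than a VRAM formula; a quick back-of-the-envelope comparison:

Code:
# Rough pixel-count comparison between the resolutions discussed above.
# Render-target memory roughly scales with pixel count; texture memory does
# not, so this is only a loose guide to VRAM pressure, not a formula.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1200p (1920x1200)": 1920 * 1200,
    "1440p (2560x1440)": 2560 * 1440,
}
base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x 1080p)")
# Prints roughly: 1080p = 2.07 MP (1.00x), 1200p = 2.30 MP (1.11x),
# 1440p = 3.69 MP (1.78x), so 1200p sits much closer to 1080p than to 1440p.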
 
I'm sorry, but while I appreciate your hard work, I continually get the impression that any negative for AMD gets brought to light and Nvidia gets a pass on almost every issue but pricing. I take all reviews with a grain of salt, especially here.
 
I'm sorry, but while I appreciate your hard work, I continually get the impression that any negative for AMD gets brought to light and Nvidia gets a pass on almost every issue but pricing. I take all reviews with a grain of salt, especially here.

You must only visit HardOCP when AMD is not making a superior product.

You can see just a few of the Gold and Silver AMD awards from last year at the link below.

http://hardocp.com/reviews/gpu_video_cards/1/amd
 
It will be interesting to see if Mantle smooths out the frame-time spikes and gameplay. If the 290X can outperform the 970, albeit with some choppy gameplay, then bringing Mantle to the table should help quite a bit.

I can only assume AMD is not as concerned with DX performance on the newer Mantle-supported GPUs, provided the game supports Mantle, which BF: Hardline will.
 
The last bit about the frame times was really eye-opening.

Thanks for the testing and writeup!
 
I think it would be pretty interesting to do an apples-to-apples VRAM measurement for all cards. Does bumping up the resolution to 1440p actually add a full GB of VRAM usage? Or is the game simply using more of the available VRAM at its disposal?
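On the Nvidia side, one rough way to get that number is to log dedicated memory in use while replaying the same section at each resolution. A minimal sketch (assumptions: Nvidia only, nvidia-smi is on the PATH, and it reports total memory in use on the GPU rather than the game's own allocation):

Code:
# Minimal VRAM logger: polls nvidia-smi once per second and writes total
# dedicated memory in use to a CSV, so the same play session can be compared
# at 1080p vs 1440p. Nvidia only; assumes nvidia-smi is on the PATH.
# Stop it with Ctrl+C when the run is done.
import csv
import subprocess
import time

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "vram_used_mib"])
    start = time.time()
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        used_mib = int(out.splitlines()[0].strip())  # first GPU only
        writer.writerow([round(time.time() - start, 1), used_mib])
        f.flush()
        time.sleep(1.0)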
 
Great article, as always.

The only thing I thought was strange was that they are comparing factory-overclocked cards to reference-clocked cards. Yes, the overclocked 970 was faster than a stock 290X and almost as fast as a stock 980, but...

Can the MSI 970 GAMING et al be downclocked to reference speeds using Afterburner or similar?

What I'd like to see, if there's another article after the full game is released, is how Mantle improves frame pacing (or otherwise) for the Red Team. Also, SLI and CrossFire scaling/frame pacing for each of the cards used here. SLI performance at 1440p or 4K would be most interesting to 970 owners, as their GPU VRAM consumption exceeds 3.5GB.
 
A very well written article. You've clearly written this to address what has been learned from the recent 970 fiasco. I find it interesting that your recommended cards make no mention of the 970 in your closing, given that it kept pace and in one test bested the 290X.

Being that I am in the market for a new card, I've really been weighing the 290X against the 970. Both have pros and cons. Seeing how the VRAM usage is nearing 3.5GB with these dated visuals is really giving me pause about the 970. But those AMD drivers....ugh...

Thanks so much for the article.

As someone who tends to alternate between AMD and Nvidia cards based predominantly on price/performance at the time, let me tell you straight up that the driver issues people constantly bitch about regarding AMD cards are, in my opinion, grossly exaggerated. In my personal experience, they haven't been much worse than anything Nvidia is making. Prior to Catalyst, ATI had terrible drivers; after Catalyst, they're really not that different from Nvidia's. Again, this is just my experience.

As for upgrading now, I'd wait for stacked memory, which is coming out in a few months. It'll more or less eliminate memory bandwidth bottlenecks altogether.
 
I'm sorry, but while I appreciate your hard work, I continually get the impression that any negative for AMD gets brought to light and Nvidia gets a pass on almost every issue but pricing. I take all reviews with a grain of salt, especially here.

You must have completely missed the reviews when the R9 290X came out.
 
Single GPUs having issues with frame times? That's kinda scary.

Great review. Ever think of percentile charts? It looks like you already have the data. I use Minitab, but I think Excel can do it quite easily. I like to see how cards compare around the 95th to 99th percentile, since minimum frame rate is what I really care about, and that shows it nicely. Generally I get everything I care about between [H] and PCPer :).
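For anyone who wants to try the percentile idea outside Minitab or Excel, here is a minimal sketch, assuming a plain text log with one frame time in milliseconds per line (the file name is hypothetical; Fraps-style cumulative frametime dumps would need to be converted to per-frame deltas first):

Code:
# Percentile frame-time summary: reads one frame time in milliseconds per line
# and prints the 50th/95th/99th percentile frame times plus the frame rates
# they correspond to. The input format and file name are assumptions.
import numpy as np

frame_times_ms = np.loadtxt("frametimes_ms.txt")

for p in (50, 95, 99):
    ft = np.percentile(frame_times_ms, p)
    print(f"{p}th percentile frame time: {ft:.1f} ms (~{1000.0 / ft:.0f} fps)")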
 
Great article, as always.

The only thing I thought was strange was that they are comparing factory-overclocked cards to reference-clocked cards. Yes, the overclocked 970 was faster than a stock 290X and almost as fast as a stock 980, but...

Can the MSI 970 GAMING et al be downclocked to reference speeds using Afterburner or similar?

What I'd like to see, if there's another article after the full game is released, is how Mantle improves frame pacing (or otherwise) for the Red Team. Also, SLI and CrossFire scaling/frame pacing for each of the cards used here. SLI performance at 1440p or 4K would be most interesting to 970 owners, as their GPU VRAM consumption exceeds 3.5GB.

If you read the article, VRAM usage never exceeded 3.1GB, so that should answer your question.
 
A good friend is a heavy BF4 player. He is currently running a pair of GTX 780 Tis in SLI but has mentioned the upgrade itch. Even though this is only a preview of the open beta, I'm going to recommend he upgrade to two GTX 980s in SLI just to have a bit more headroom to run at max on his 2560x1440 widescreen display.

Thanks for the Preview, [H]. Greatly appreciated. :)
 
I expect Hardline to fail in sales but I hope it doesn't affect SW: Battlefront as that is the game I'm truly excited about!
 
Thanks for the VRAM guide, Brent. On a side note, is this a GameWorks game?

Quite the opposite; AMD signed a big co-development/marketing contract with EA in 2013. EA is to AMD what Ubisoft is to Nvidia from a technical perspective.
 
Quite the opposite; AMD signed a big co-development/marketing contract with EA in 2013. EA is to AMD what Ubisoft is to Nvidia from a technical perspective.

And THAT is pretty worrying, given how AMD cards were performing in this game.
 
I was most disappointed to see that you didn't test all the maps at 4K. As I understand it, the more players, the more CPU is required, so by increasing both the player count and the resolution, it wasn't really a fair comparison.

Also, I would have liked to have seen what settings were required to make the game playable at 4K on a single card.

Hopefully you'll cover both of these in a later article.
 
Very good to see VRAM usage figures. Hopefully this will be a consistent thing in future reviews. 3GB of VRAM definitely ain't cutting it anymore if you want high-res playability.

On a tired, dead, beaten side-note: I thought AMD said a new focus of theirs was on drivers... :(
 
So no one thought about using the GstRender.MantleEnabled setting in the PROFSAVE_profile to toggle Mantle on and off?
 
How readily available was information on this setting?

It's right there in the Documents BFH Beta 2 folder, which is also there for the BFH beta and BF4. I used it when I forgot to change the render settings in game, which saves the trouble of firing up the game, changing the API, and then having to close and restart the game.

But I have just found out that this beta was 32-bit only; Mantle needs 64-bit to work, and on top of that it's a known fact that there can be performance differences between a 32-bit and a 64-bit version.
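For anyone who would rather script the GstRender.MantleEnabled switch than edit the file by hand, here is a minimal sketch; the profile path and the "GstRender.MantleEnabled 0|1" line layout are assumptions based on how the BF4 profile looks, so check your own PROFSAVE_profile first.

Code:
# Flip GstRender.MantleEnabled between 0 and 1 in the Frostbite profile file.
# The path below is hypothetical -- point it at the PROFSAVE_profile in your
# own Documents BFH Beta 2 folder -- and the "GstRender.MantleEnabled <0|1>"
# line layout is an assumption, so verify it in your file first.
from pathlib import Path

PROFILE = Path(r"C:\Users\you\Documents\BFH Beta 2\PROFSAVE_profile")

lines = PROFILE.read_text().splitlines()
for i, line in enumerate(lines):
    if line.startswith("GstRender.MantleEnabled"):
        key, value = line.split(None, 1)
        lines[i] = f"{key} {0 if value.strip() == '1' else 1}"  # flip 1 <-> 0
        PROFILE.write_text("\n".join(lines) + "\n")
        print("Now:", lines[i])
        break
else:
    print("GstRender.MantleEnabled not found; edit the file by hand.")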
 
Have to agree; with Mantle working, DX11 is really secondary.

Still, whatever EA is testing in this beta should have DX working properly for both vendors at this point in development, even if only for the PR for AMD.
 
It's right there in the Documents BFH Beta 2 folder, which is also there for the BFH beta and BF4. I used it when I forgot to change the render settings in game, which saves the trouble of firing up the game, changing the API, and then having to close and restart the game.

But I have just found out that this beta was 32-bit only; Mantle needs 64-bit to work, and on top of that it's a known fact that there can be performance differences between a 32-bit and a 64-bit version.

Quote from the article:

We confirmed that the executable was running in 64-bit mode, so this game, even in open public beta form, is a 64-bit game.
 
For the excessive stuttering on AMD cards: if you use the console to limit the FPS (for instance, for my monitor, [gametime.maxvariablefps 59.95]), the game will run butter smooth. It's a band-aid until the drivers get sorted out.
 