AMD Dominates the Battlefield V Closed Alpha

Played the alpha; they didn't optimize much, and I had to run it at the lowest settings to get 60+ fps (I have a 4690K and a 580). I don't like that PCGamesN didn't specify the graphics settings it tested for the GTX 1060 vs RX 580; it would have been nice to compare with my experience. They also state that the GTX 1080 Ti ran at 114 fps at Ultra on 1080p, which is a little strange to test, considering that's a 4K card if anything.
What makes the 1080 Ti a 4K card? The power of denial?

Which indicates the game is probably sensitive to GPU memory bandwidth. That's the one area where ATI/AMD has always had an edge on NVIDIA.
Sir, are you telling me my GTX 1060 3GB is slower than the RX 480/580 4GB? I'm appalled, I say. Simply appalled. Next you'll tell me the GTX 1060 3GB is slower than the 6GB because of missing CUDA cores and pipes.

Now if you'll excuse me, I must go back to playing games on my GT 1030 DDR4, because I did a quick Google of the GT 1030, saw benchmarks for a GT 1030 GDDR5, and assumed the DDR4 is the same as GDDR5. Therefore I'm correct in thinking they're the same in performance, because the Nvidia gods would never steer me wrong, just like Jesus in the Bible.
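Sarcasm aside, the bandwidth gap is easy to put numbers on. A quick back-of-the-envelope calc (specs quoted from memory, so treat them as approximate):

```python
# Theoretical peak memory bandwidth = effective rate (GT/s) * bus width (bits) / 8
cards = {
    "GTX 1060 6GB (GDDR5, 8 GT/s, 192-bit)": (8.0, 192),
    "RX 580 8GB (GDDR5, 8 GT/s, 256-bit)": (8.0, 256),
    "GT 1030 (GDDR5, 6 GT/s, 64-bit)": (6.0, 64),
    "GT 1030 (DDR4, ~2.1 GT/s, 64-bit)": (2.1, 64),
}
for name, (rate_gts, bus_bits) in cards.items():
    print(f"{name}: {rate_gts * bus_bits / 8:.1f} GB/s")
# -> 192.0, 256.0, 48.0, 16.8 GB/s respectively: the RX 580 has ~33% more
#    bandwidth than the 1060, and the DDR4 GT 1030 has roughly a third of
#    the GDDR5 version's. Very much not "the same in performance".
```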
 
1-8-7-2-6-5-4-3 master race.
This.
I am a disciple of the temple of LS and am spreading its delicious ways to the heathens in the South Pacific.
P.S. Got any storage space stateside and want some side $?
 
Game runs better on AMD because it's been designed for the AMD platform inside the current-gen consoles? Is this one of the rare occasions when team Red can claim a relevant win (AshesOfTheBullshit)? Answer: who cares..... so long as it does more than look like a pretty cut-scene engine. I hate those "you are in the airplane, press X over and over for 5 minutes" moments.
Every Developer hits both buttons and releases it prematurely.

Every publisher.... Developers are not always on board with the schedules they are demanded to adhere to, but I also agree that at some point someone probably told Michelangelo "OK man, hurry the F up" too, so.... it's one of those balances between hitting your milestones (to start making revenue, paying back loans you may have taken out, etc.) and shipping exactly what you want to ship, finished and polished.
 
BF3 and BF4 were very much like this in alpha and beta: what seemed great, amazing, and pretty turned into something that ran and looked like crap for months, after performing great on Radeon (taking on Ngreedia cards 2-3 tiers higher), once Nv got their sticky paws on DICE/EA "support".

If Nv does not perform as well as they would like, they do not bother fixing the situation with their own hard work (via drivers or new hardware); they would seemingly rather pay people off to throw a monkey wrench in the gears and jack up the code, so that they get the benefits and everyone else (including end consumers) suffers as a result.

Nv "It is in the way we play you"

If Nv spent half as much time making and supporting a great product as they do finding cheap ways to skew every race they enter, they would have earned the greatness they constantly parade around as having. Instead they are just children throwing a tantrum when they do not get their lollipop and the other boys actually play ball better than they do ^.^
 
Nvboy spotted.

You wish. I like pointing this stuff out from time to time, especially since AMD is the 'chosen one' on this forum. I buy the best-performing card I can afford when I need to upgrade, which has been an Nvidia almost every single time.
 
  1. Is this DX12 performance?
  2. Are there actually benefits in using DX12 in BFV?
  3. How does AMD/Nvidia compare in DX11?
 
You wish. I like pointing this stuff out from time to time, especially since AMD is the 'chosen one' on this forum. I buy the best-performing card I can afford when I need to upgrade, which has been an Nvidia almost every single time.

No, that's not the reason you pointed that out. Also, you must not have been here long if you think AMD is the "chosen one" around here. LOL :D
 
No, that's not the reason you pointed that out. Also, you must not have been here long if you think AMD is the "chosen one" around here. LOL :D

No, I'm pretty sure that was my reason, and I've been here just as long as you. Tee hee!
 
You wish. I like pointing this stuff out from time to time, especially since AMD is the 'chosen one' on this forum. I buy the best-performing card I can afford when I need to upgrade, which has been an Nvidia almost every single time.
I have to admit I have a little AMD fanboy in me, especially with their GPUs. Probably because back at the 9700 Pro release, NV had been giving us such marginal releases after the first GeForce GPU, and bam, we had a winner in both price and performance. That has stuck with me; I usually find myself going AMD -> NV -> AMD throughout the years.
 
Maybe NVIDIA got some GimpWorks into the early alpha that only runs well on their cards? :meh:

Or, alternatively, it is starting to show that we are actually on the 3rd iteration of Maxwell without much architectural change...
 
TressFX, maybe? Could really help with the realism of battle, LOL.
 
Stepping out of the shadow of DX11 games, the GPUs that favored them are going to suck. No getting around that. Going forward, DirectX 11 (or lower) would be a wrapper for compatibility with older systems. This goes back to "AMD cards age well, and NVidia releases new cards."
 
Was able to answer some of the questions.

  1. Is this DX12 performance?
  2. Are there actually benefits in using DX12 in BFV?
  3. How does AMD/Nvidia compare in DX11?

  1. During the 1080p DX11 run, the GTX 1060 scored an average of 45 FPS while the RX 580 outperformed it by a whopping 51% with an average of 68 FPS. At 1440p the performance gap was slightly reduced, but the RX 580 still came away victorious with a 44% average FPS advantage.
  2. ????
  3. Curiously the DX12 test demonstrated that, at least with this alpha code, Nvidia and DICE have some work ahead of them. Nvidia's GTX 1060 suffered an 8% drop in average framerate at 1080p compared to DX11, while the RX 580 maintained the same framerate.
From the Forbes analysis.
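For what it's worth, the 51% figure checks out against the quoted averages (the 1440p raw numbers weren't given, so only the 1080p case can be verified):

```python
# Sanity check on the Forbes 1080p DX11 numbers quoted above
gtx1060_dx11 = 45  # avg FPS, GTX 1060, 1080p DX11
rx580_dx11 = 68    # avg FPS, RX 580, 1080p DX11

advantage = (rx580_dx11 / gtx1060_dx11 - 1) * 100
print(f"RX 580 advantage: {advantage:.0f}%")  # ~51%, as reported

# The reported 8% DX12 drop for the GTX 1060 would land it around:
print(f"GTX 1060 DX12 estimate: {gtx1060_dx11 * 0.92:.1f} FPS (RX 580 holds at 68)")
```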
 
Was able to answer some of the questions.

  1. During the 1080p DX11 run, the GTX 1060 scored an average of 45 FPS while the RX 580 outperformed it by a whopping 51% with an average of 68 FPS. At 1440p the performance gap was slightly reduced, but the RX 580 still came away victorious with a 44% average FPS advantage.
  2. ????
  3. Curiously the DX12 test demonstrated that, at least with this alpha code, Nvidia and DICE have some work ahead of them. Nvidia's GTX 1060 suffered an 8% drop in average framerate at 1080p compared to DX11, while the RX 580 maintained the same framerate.
From the Forbes analysis.

If you ever look at DX12 game comparisons of an RX 580 versus a 1060 6GB, the RX 580 usually comes out on top (stock).

That said, a popular technique in drivers is to preload shaders used by video games into unused memory and reuse the precompiled versions instead of shipping them back over the PCIe bus and reprocessing them all again. It is possible NVIDIA hasn't cached the shader programs yet, since the game hasn't been released. That will fall on Nvidia's shoulders to implement before release, though. It won't save them 44%. But I wouldn't put it past Nvidia to force the developers to gimp AMD's work, as they have done before.

Pulling funding because they don't like how a 3rd-party company develops their products? NVIDIA would never do that, right? /snark
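If you've never seen the shader-caching idea in code form, here's a toy sketch of the concept (the names and the stand-in compiler are hypothetical; a real driver caches compiled GPU binaries on disk and in VRAM, not Python objects):

```python
import hashlib

class ShaderCache:
    """Toy model of driver-side shader caching: compile a shader once,
    key the result by a hash of its source, and hand back the cached
    binary on repeat requests instead of recompiling."""

    def __init__(self):
        self._cache = {}  # source hash -> compiled binary

    def get_or_compile(self, source: str, compile_fn) -> bytes:
        key = hashlib.sha256(source.encode()).hexdigest()
        if key not in self._cache:              # slow path: first sighting
            self._cache[key] = compile_fn(source)
        return self._cache[key]                 # fast path: reuse the binary

# Hypothetical usage with a stand-in "compiler":
cache = ShaderCache()
binary = cache.get_or_compile("void main() {}", lambda src: src.encode())
```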
 
I'm just surprised by that gap given it was measured in DX11. Less surprised given that AMD has worked with DICE on Frostbite optimizations in the immediate past, and even more skeptical given this is alpha code, after all. Overall, glad competition is alive and well. Now if only Vega would come back down in price.
 
My friend's LG OLED is 4096. I get about 5-10 fps lower on that than my 3840 monitor.
The LG OLED is UHD 4K native, but it can accept and display DCI 4K. I don't know of any panels out on the market that are DCI 4K native.
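For the record, that fps gap is roughly what the raw pixel counts predict, assuming the game is GPU-bound:

```python
# DCI 4K vs UHD "4K": how much extra work is the GPU doing?
dci = 4096 * 2160  # cinema 4K
uhd = 3840 * 2160  # what consumer "4K" panels actually are

extra = (dci / uhd - 1) * 100
print(f"DCI 4K pushes {extra:.1f}% more pixels than UHD")  # ~6.7%
# If fully GPU-bound, framerate falls roughly in proportion, so a
# 5-10 fps drop is plausible for baselines in the ~75-150 fps range.
```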
 
The LG OLED is UHD 4K native, but it can accept and display DCI 4K. I don't know of any panels out on the market that are DCI 4K native.
That's what I thought. So basically by playing at 4096x2160 he's just rewarding himself with all the "benefits" of scaling.
 
That's what I thought. So basically by playing at 4096x2160 he's just rewarding himself with all the "benefits" of scaling.

Do Blu-rays even encode at true 4K (DCI)? I don't think they do. Only cinemas use true 4K (DCI).

Edit: I stand corrected. They do support 4K. But only a very, very small % of home projectors support it (Sony being one).
 
I do love a good fanboy epithet.

Some people are self-confessed; that's fine. Some people live in a state of post-purchase rationalisation. Understandable. What is worth bearing in mind, though, is just how many people just don't give a fuck.

Speaking for myself, I have no loyalty to any vendor and get no warm fuzzy feeling about a faceless multi-billion-dollar company. The only feeling I really get is enmity, and more often than not that is due to poor business practices or support issues. While I can't speak for a certain bristly Texan, I've always felt we're aligned on that, hence why I've read the [H] for almost as long as some members have been alive.

Put your product performance where your mouth is and don’t act like a dick.

Also, low-end shit, who gives a fuck :LOL:
 
I do love a good fanboy epithet.

Also, low-end shit, who gives a fuck :LOL:

Front page wouldn’t be as good without them though :D

Also, this brought back some fun debates from back when NVidia released the 1060 and AMD brought in the RX 480 (580?). Anyway, most couldn't get the AMD card due to low initial stock, but the 1060 wasn't exactly a thriller either. Still, a lot were forced to get the 1060 as the 480 was OOS everywhere. Another example of how AMD does its competition a favor by not meeting demand.
 