Battlefield V NVIDIA Ray Tracing RTX 2080 Performance @ [H]

Discussion in 'Video Cards' started by Kyle_Bennett, Dec 25, 2018.

  1. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    52,075
    Joined:
    May 18, 1997
    Battlefield V NVIDIA Ray Tracing RTX 2080 Performance

    Can the NVIDIA GeForce RTX 2080 muscle up the performance to play Battlefield V with ray tracing enabled at 1080p and 1440p now that the new RTX performance patch has been released? We will find out using an ASUS ROG STRIX RTX 2080 OC and testing Battlefield V using 64-player MULTIPLAYER. Everything just works?

    If you like our content, please support HardOCP on Patreon.
     
    DejaWiz, Jza, IKV1476 and 1 other person like this.
  2. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,159
    Joined:
    Oct 24, 2014
    I really would like to know why there is such a big performance drop going from DX11 to DX12. It's not like DX12 gives you any enhanced graphics over what DX11 offers.
     
  3. Teenyman45

    Teenyman45 2[H]4U

    Messages:
    2,280
    Joined:
    Nov 29, 2010
    Nvidia's had performance issues on DX12 with the 10XX series and the 9XX series. I wonder if it's the architecture. Sure, DX12 is M$'s take on AMD's Mantle, but it's not like running Vulkan (Mantle's descendant) in Doom caused a penalty for Nvidia cards at the time.
     
  4. Chris_B

    Chris_B [H]ardness Supreme

    Messages:
    5,053
    Joined:
    May 29, 2001
    DX12 is just pretty shit for performance in general.
     
  5. jbltecnicspro

    jbltecnicspro [H]ardness Supreme

    Messages:
    5,217
    Joined:
    Aug 18, 2006
    Man, this just isn't going so well for Nvidia, is it? I get that real-time ray tracing is a monster. I get that it's new tech. But come on - at the end of the day, no one pays upwards of $900 for a video card to get barely playable performance at 1080p.
     
    Bcc335, Revdarian, KazeoHin and 4 others like this.
  6. polonyc2

    polonyc2 [H]ardForum Junkie

    Messages:
    16,006
    Joined:
    Oct 25, 2004
    huge fail by Nvidia...releasing ray-tracing cards that aren't able to play at high-end resolutions (1440p/4K)...they should have released one Titan card with RTX capabilities as a beta test and saved the rest for next gen
     
  7. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,159
    Joined:
    Oct 24, 2014
    I get that, and it's true. But I think back to the days of Crysis. No card could run that game at acceptable frame rates. It took quite a while until we finally got them. Ray tracing has always been one of the Holy Grails of computer graphics. You only have to listen to game devs talk about it with such enthusiasm. I know performance with these cards is nowhere near what we would like for the crazy prices Nvidia is asking. But it is a step in the right direction for the tech itself. Personally, I would love AMD's next generation to come out with it, even if the performance is comparable.
     
  8. R_Type

    R_Type Limp Gawd

    Messages:
    218
    Joined:
    Mar 11, 2018
    While that's very true, nobody was led to believe that it would work well at the time. Can't do ray tracing playably at 4 megapixels? For that price? C'mon.
     
    jbltecnicspro and Kyle_Bennett like this.
  9. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,159
    Joined:
    Oct 24, 2014
    I agree, the price is by far the biggest fail with these new cards from Nvidia. If they were priced the same as or close to the last Pascal series, I doubt anyone would be moaning about the performance. Nvidia really made a meal of this launch.
     
    Bcc335, polonyc2, Vercinaigh and 2 others like this.
  10. Nolan7689

    Nolan7689 [H]ard|Gawd

    Messages:
    1,189
    Joined:
    Jun 5, 2015
    I’m almost okay with this type of performance. It harkens back in a way to when maximum game settings didn’t result in perfect performance.

    On the other hand, it’s a single technology holding back performance, and I don’t know whether it’s really worth it, as opposed to the games where it was multiple settings, and pushing all the polygons resulted in markedly increased fidelity.

    Probably not articulating myself properly but oh well.
     
  11. ecuador

    ecuador Limp Gawd

    Messages:
    205
    Joined:
    Dec 29, 2008
    So, ray tracing on an Nvidia RTX 2080 is about as good as VR was with an AMD RX 480? Perhaps we can call it a draw for the respective marketing depts? :D
     
  12. IKV1476

    IKV1476 Lurker

    Messages:
    281
    Joined:
    Dec 26, 2005
    I really like this article. Very informative. Can't wait for the 2080 ti article to tie this all together.

    I agree with the above comment that Nvidia maybe should have just left this to a Titan-type card to get everyone excited about it, then on the next series made it more available to mainstream-type cards.
    I have witnessed going back a step in performance to get a new visual effect, like shadows, but this seems like two steps back instead of one.
     
    R_Type and Maddness like this.
  13. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,609
    Joined:
    Feb 22, 2012
    [H] might want to try a higher thread count processor. I couldn’t figure out why your performance is so far below mine, but I run a 2700X with 16 threads vs. the test setup’s 7700K at 8 threads.

    I know during development they targeted 12-thread CPUs. The recommended spec is an 8700K or above.

    It could very well be the difference. Might help with the DX12 decrease too?

    I know you’re busy but I think it makes sense.

    Merry Christmas!!
     
  14. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    52,075
    Joined:
    May 18, 1997
    Lol. Yeah. Four cores eight threads at 5GHz is killing performance... Just in ray tracing.

    Straws?
     
    lostin3d likes this.
  15. Furious_Styles

    Furious_Styles [H]ard|Gawd

    Messages:
    1,068
    Joined:
    Jan 16, 2013
    lol, time to get the threadripper out. DX12 UNLEASHED
     
  16. blandead

    blandead Limp Gawd

    Messages:
    202
    Joined:
    Nov 6, 2010
    This will never be playable until someone can fix DX12 performance. I remember a review showing Nvidia cards taking a bigger performance hit with DX12 than AMD cards, but either way all cards took a performance hit.

    I can't remember any other DX successor that ran slower with the same features.

    Any game that supported Mantle ran much smoother, so there's something seriously broken with DX12.
     
  17. piscian18

    piscian18 [H]ardForum Junkie

    Messages:
    11,057
    Joined:
    Jul 26, 2005
    and here Tom was making me feel like a loser for not preordering.
     
    Revdarian and jmilcher like this.
  18. twzTechman

    twzTechman Limp Gawd

    Messages:
    221
    Joined:
    Apr 14, 2011
    Merry Christmas Kyle and thanks for posting something for us to read today.

    I am just not seeing a good reason to upgrade these days. While I applaud Nvidia for bringing out a new technology, at this point it does not seem to provide any benefit. There is no game out where this makes for a better experience. Am I correct in thinking that playing Battlefield at 4K or even 1440p looks better than 1080p with RTX?
     
    Maddness likes this.
  19. Lakku

    Lakku n00b

    Messages:
    21
    Joined:
    Jul 1, 2009
    Why are you testing DXR with multiplayer? Nobody was going to be using it for MP to begin with, as most gamers want steady performance above all else. Plus, the maps are significantly larger in multiplayer, with dozens more people than in single player. This is a rather poor series of articles that doesn’t reflect where most gamers would use DXR to begin with. And agreeing with the person above, I also have much better overall stability and performance with DXR than what you’re showing, using an overclocked 2080 at 1440p Ultra. That is in single player, however; MP is closer to your numbers.

    As for the rest of the people calling it a failure of some sort: are you serious? Or entitled? Soft shadows: completely unplayable at release. Antialiasing: completely unusable at release. It took years and new techniques for them to become usable.

    As for pricing? HardOCP already covered this. Top-of-the-line cards have always been really expensive, and have been for twenty years. The 2080 and 2070 are better values than the 1080 Ti and 1080 were at launch once you adjust for inflation. Hell, a GeForce 2 Ultra would be 700-750 dollars in today’s money. I feel salty gamers haven’t been around long enough, or forgot the past along the way.
     
    jologskyblues, ScuNioN and Ranger101 like this.
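    [Editor's note: the inflation claim above is easy to sanity-check. A minimal sketch, assuming a $499 GeForce 2 Ultra launch MSRP and approximate annual-average US CPI figures for 2000 and 2018 (all three numbers are assumptions, not from the thread):]

    ```python
    # Rough CPI adjustment: price_then * (CPI_now / CPI_then).
    # Assumed figures: $499 launch MSRP (2000), CPI 2000 ~ 172.2, CPI 2018 ~ 251.1.
    LAUNCH_PRICE = 499.0
    CPI_2000 = 172.2
    CPI_2018 = 251.1

    adjusted = LAUNCH_PRICE * CPI_2018 / CPI_2000
    print(f"${adjusted:.0f} in 2018 dollars")  # lands inside the $700-750 range cited
    ```

    [With these assumed figures the result is roughly $728, consistent with the poster's $700-750 estimate.]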
  20. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    20,988
    Joined:
    Sep 13, 2008
    It's mostly just the Frostbite engine. If you look at other DX12 titles like Rise of the Tomb Raider, the differences in performance are pretty minimal. That being said, though, I feel developers really aren't bothering to take advantage of the features within DX12 and are putting it in there just to say it has it.
     
  21. 460cidpower

    460cidpower [H]Lite

    Messages:
    119
    Joined:
    Jul 8, 2004
    I find it hilarious that supposedly there's going to be an RTX 2060... What's the point of ray tracing if only the $1400 Ti can do it with any kind of performance?
     
    Revdarian and Maddness like this.
  22. Ranger101

    Ranger101 [H]Lite

    Messages:
    75
    Joined:
    Sep 11, 2015
    Thanks for a nice Christmas present, Mr. Bennett! No surprises in your analysis. Coming from a 1070, I'm happy with my RTX 2080 perf in Battlefield with DXR on, bearing in mind the early adoption tax, and even happier in all the other games I play without DXR. The RTX series is not only about "allowing entry into playable performance with DXR enabled," so I feel it's perhaps a little disingenuous to withhold a recommendation solely on that basis. If you keep in mind that this is 1st-gen tech, the RTX series is the best of both worlds, DXR on and off ☺
     
    Revdarian and Kyle_Bennett like this.
  23. Ranger101

    Ranger101 [H]Lite

    Messages:
    75
    Joined:
    Sep 11, 2015
    Nice one. Certainly are a lot of salty RTX haters lurking in the environs ☺
     
    Kyle_Bennett likes this.
  24. Ranger101

    Ranger101 [H]Lite

    Messages:
    75
    Joined:
    Sep 11, 2015
    Sigh....will it ever end....
     
  25. wyqtor

    wyqtor Limp Gawd

    Messages:
    372
    Joined:
    Dec 30, 2011
    Glad I got the 2070; with the 2080 I would have paid 50% more for a 10% improvement in RT. Better to save the money for future cards that can max out RT at 1440p at a reasonable price.
     
    Geforcepat likes this.
  26. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,838
    Joined:
    Oct 13, 2016
    Thanks for this 3-part series. It's really shed clear light on the numbers and comparisons for RT on these cards. Pretty obvious that NV's nickel-and-dime approach (a difference of a mere 10 RT cores) makes little to no difference for these cards. I expect there will be a bit more of a gap for the Ti. I do wonder how much of the added cost in the RTX line really came from RT/Tensor cores and how much was just markup for profit.
     
  27. jardows

    jardows [H]ard|Gawd

    Messages:
    1,425
    Joined:
    Jun 10, 2015
    That sentiment really only comes from the fact that DX12's main advantage, when game engines aren't optimized for it, is with lower-end hardware. I see performance gains on my system in DX12 mode in the two games I have that can use it.

    It's not just a single technology; the whole reason ray tracing was built into BFV was Nvidia's RTX cards. RTX, DXR, and BFV are all part of the same package, and it is not doing well. You would think that a new technology being launched, currently exclusive to one series of hardware, on one version of Windows, and used in only one game, would have been released in a "best case" scenario. This is not the case.
     
    lostin3d and Kyle_Bennett like this.
  28. InquisitorDavid

    InquisitorDavid [H]Lite

    Messages:
    92
    Joined:
    Jun 27, 2016
    All things considered, the performance drop from just the change of API is a real deal-breaker at 4K already. I imagine some additional optimization can still be had just by improving the DX12 renderer, which could in turn make RT more palatable. That 17 FPS at 1440p gets real damn close to a 60 FPS average, which isn't bad at all for first-gen RT.

    Not in this case. The RX 480 was marketed as a Premium VR Experience, but the competitor offered a much better one. Here, the competitor does not offer anything better (or anything at all, for that matter) in the real-time ray tracing department.
     
    Last edited: Dec 26, 2018
  29. ScuNioN

    ScuNioN [H]Lite

    Messages:
    91
    Joined:
    Dec 30, 2016
    "Similar to the previous review, we have an expensive $869.99 video card just being able to play 1080p resolution with NVIDIA Ray Tracing enabled. Doesn’t quite sound right to spend that much money just to play a game at 1080p with just playable performance does it? The price required to even get playable performance at a minimum seems too high for us to recommend it. It is better than the RTX 2070, but it just allows entry into playable performance with DXR enabled in Battlefield V. It’s quite a lot of money to just get entry performance into a new feature. Everything just works? Not really...at least not in NVIDIA's ray tracing launch title."

    As someone who had an Orchid Righteous 3D card right when they became available, it was a great card for single-player games. Did I play competitive Quake (vs. other opponents) with it? No (at 640x480 it ran at around 30 FPS). Did I play single-player Quake with it? Yes, along with Tomb Raider (such an awesome game) and many other games once driver wrapper support became available. There should be a distinction between tech that supports 60 frames and 120 frames, or single-player and competitive multiplayer games. I've always felt this way about 3D Vision: it is absolutely fantastic for single-player titles but terrible for competitive games (well, perhaps not competitive driving games) due to frame rate and the cross-hairs being wonky. Bottom line: new tech that is not ready for multiplayer mainstream yet. I could also mention when NVIDIA released hardware T&L and it was slow.

    There is a reason why 1080p 144Hz monitors are still what I suggest everyone buy: a consistent 100+ frame rate is king for many types of gaming, and it is hard to get that with a 1440p monitor. What does a 1440p monitor net you, better image quality? I will take the experience (high frame rate coupled with better responsiveness through the controls) over better image quality (more pixels) almost every time.
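    [Editor's note: the high-refresh-versus-resolution trade-off above comes down to pixel throughput. A minimal sketch (nominal resolutions and refresh rates only, not measured game data):]

    ```python
    # Pixels a GPU must fill per second = width * height * refresh rate.
    def pixel_throughput(width, height, hz):
        return width * height * hz

    p1080_144 = pixel_throughput(1920, 1080, 144)  # ~298.6 million pixels/s
    p1440_60 = pixel_throughput(2560, 1440, 60)    # ~221.2 million pixels/s

    # Sustaining 1080p at 144Hz actually demands MORE raw fill than 1440p at 60Hz,
    # which is why "more pixels" and "more frames" are competing budgets.
    print(f"ratio: {p1080_144 / p1440_60:.2f}")
    ```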
     
  30. noko

    noko [H]ardness Supreme

    Messages:
    4,134
    Joined:
    Apr 14, 2010
    For Part 3, I request that playable settings using DXR be explored for each card, versus just using all maxed-out settings. As in: for the 2070/2080/Ti at 1440p using DXR, what are the maximum other graphics settings one can use with ray-traced reflections?

    I do not see the win here with DXR in BFV: improved reflection IQ while degrading everything else at a much lower resolution. Really, you go from 4K playability with maxed-out settings in DX11 to a choppy 1080p in DX12 with maxed-out settings and low or medium DXR!

    CPU usage might be interesting to explore: does DXR use additional threads over non-DXR gameplay? Does it hit the CPU harder, or less?

    As for DX12 vs. DX11, that in itself is destroying performance, and thus DXR's potential - WTH the issue is there is what I am wondering, and probably only DICE/Nvidia would know. In short, BFV is a no-go for DXR technology so far.
     
    Nightfire and CAD4466HK like this.
  31. socK

    socK 2[H]4U

    Messages:
    3,650
    Joined:
    Jan 25, 2004
    Pretty unfortunate that their DX12 implementation somehow isn't up to par. With a Q6600 and a 7870, when the Mantle patch hit in BF4, it literally doubled or tripled my framerate on some maps. The performance uplift was absolutely absurd - like 35 FPS to ~90 in extreme cases.

    Was surprised they went for DX12 instead of Vulkan too...

    I actually saw their rendering guy answer this question on Twitter some time back that "Frostbite's DX11 renderer is just _really_ good."
     
  32. R_Type

    R_Type Limp Gawd

    Messages:
    218
    Joined:
    Mar 11, 2018
    Loved the trip down memory lane there! I remember loading up F.E.A.R. for the first time and having to chop shadow quality back a whole bunch, but still being impressed the whole time. Ah, memories!
     
    IKV1476 likes this.
  33. Ruddys

    Ruddys n00b

    Messages:
    37
    Joined:
    Nov 27, 2018
    Without bothering to read all the posts, and to answer the question: after the update it's quite playable on an RTX 2080 at 1440p, getting around 75-85 FPS with RTX set on Low.
     
    Dayaks likes this.
  34. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,609
    Joined:
    Feb 22, 2012
    During my playthroughs, single player was more demanding than multiplayer for ray tracing. I would also defend the multiplayer choice, since more people spend way more hours in that mode, and this review is more beneficial to the masses as is. I'd also believe it takes a ton of work to produce meaningful data this way.

    I was just pointing out what DICE recommends. If someone asked, I'd tell them the 7700K at 5GHz would be fine; it's just that my experience is more in line with Ruddys'. It's not that far off, though. It's still a big hit. The general consensus on the forums seems to be that if you have a high-Hz monitor, keeping high Hz vastly outweighs the RTX benefits.

    It's hard to find decent BFV multiplayer data, never mind BFV processor scaling with RTX on. It seems everyone else just loads up single player and stares at a puddle - because that's useful and realistic gameplay data.
     
    jbltecnicspro likes this.
  35. jbltecnicspro

    jbltecnicspro [H]ardness Supreme

    Messages:
    5,217
    Joined:
    Aug 18, 2006
    But puduws aw so pwetty.
     
  36. noko

    noko [H]ardness Supreme

    Messages:
    4,134
    Joined:
    Apr 14, 2010
    Looking at this video playing at 4K Ultra, single and multiplayer, the VRAM requirements are unreal with DXR! Hitting close to 11GB! I wonder if the 2080/2070 are running out of VRAM in this title? Also, the CPU is an 8700K; this game may indeed need more than 8 threads or more than 4 cores with DXR, but I could not find any applicable data other than EA recommending an AMD 2700 or an 8700.

    Edit: In the multiplayer section of the video, VRAM goes over 10GB; in single player it is over 8GB. Still, even at 4K that is a lot of VRAM being used, and maybe the 2070/2080 are running out of VRAM in multiplayer at 1440p. I do recommend tests be done with different CPU core counts. This game, or DXR, may have made 4-core CPUs obsolete.
     
    Last edited: Dec 27, 2018
  37. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,838
    Joined:
    Oct 13, 2016
    I thought the very same. I've noticed most sites have ceased posting VRAM usage metrics, and I apologize if I missed them in these two reviews. At 4K I'd say it's almost a given they're hitting the VRAM ceiling if using fully/manually maxed settings; it's even possible at 1440p. Most AAA games I've gotten since ROTTR/RE7/MEA have all hit over 8GB at 4K. When I had my 1080 SLI setup, I often saw how it would cost me FPS; when I lowered settings to get under those 8GB, I saw major improvements. I know the old belief regarding VRAM that games will use it if you have it, but honestly these days I feel it's more like if you don't have it, you won't get it. Regardless, can't wait until Kyle & team get pt. 3 out for the final conclusions.
     
  38. noko

    noko [H]ardness Supreme

    Messages:
    4,134
    Joined:
    Apr 14, 2010
    Also, system RAM was over 11GB as well :LOL:; 16GB for a gaming machine is the bare minimum for an enthusiast. Anyway, the erratic framerates Brent saw make me wonder if the cause is actually VRAM limitations of the 2070 and the 2080 in multiplayer. I cannot find one review looking at CPU workload with and without DXR - since it's part of the DX12 API, it looks like it will take additional CPU threads/cores when used. It's new technology, and how it affects your gaming rig has not been explained at all.
     
    lostin3d likes this.
  39. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,838
    Joined:
    Oct 13, 2016
    I have to totally agree. Waiting for pt. 3, but I've run this game with the rig in my sig at 1080p/120Hz and 4K/60Hz with fully, manually maxed settings in DX12 - except for DXR on medium, blur-related effects off, and lens effects/aberration off - and it was impressive to me. I only played it in single player, but I did test all the single-player campaigns for about an hour each. France and Norway each had some impressive visuals to me. I'm using nearly identical settings to what Kyle used for his Strix 2080 Ti review, but I toned them down a little just so I could keep the fans under 75% and the card under 65C.

    At the time I was testing this game, I'd been 'benching my brains out' between the new GPU, OC'ing my CPU a bit further to 4.3, and tweaking fans etc. for optimal noise/temp compromises due to now only having one GPU, so I'm a bit foggy on my exact FPS. But I do remember being surprised (100-118 FPS at 1080p and 50-60+ at 4K) considering all the bad hype back then. I did this all just after the 'optimized' RT drivers came out, but had not even tried it at the game's release.
     
    ScuNioN likes this.
  40. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,838
    Joined:
    Oct 13, 2016
    Just did some quick re-testing at 4K (4096x2160). FPS are the same as I remember them. The CPU was at 40-54% usage. VRAM was averaging 9.1-9.4GB, and it didn't seem to change at all going from Med to Ultra DXR. I'm sure if I lowered my resolution to 3840x2160 it'd still be ~9GB, so I think this establishes that the game does use more than 8GB at 4K. I only retested the Norway/France campaigns, since there's not much to RT in a desert ;)
     
    Maddness and noko like this.