Battlefield V NVIDIA Ray Tracing RTX 2080 Ti Performance @ [H]

FrgMstr - Just Plain Mean - Staff member
Joined: May 18, 1997 - Messages: 55,532

How many frames will NVIDIA Ray Tracing cost you in Battlefield V with the RTX 2080 Ti? We take an ASUS ROG STRIX RTX 2080 Ti OC and turn on NVIDIA Ray Tracing in BFV Multiplayer at 1080p, 1440p, and 4K to find out. We test VRAM capacity, which reveals something very important that can impact NVIDIA Ray Tracing performance.

If you like our content, please support HardOCP on Patreon.
 
I think the bottom line is we need more games to test. Saying "The RTX is a lie" based on one game seems disingenuous, especially when that game is using an engine that historically has issues running in DX12. I do appreciate all the work Brent has put into this, though, and I do not discount the rest of the conclusions.

Looking forward to your analysis of Metro: Exodus if you guys are planning on doing one. Setting aside the fact that we may not get RTX in Shadow of the Tomb Raider, it will be the first game I actually want to buy and play that will put my 2080 Ti to good use.
 
The minimums with RTX enabled were pretty atrocious. So even SLI would still not be great for minimums, even if it scaled 100%.
The DX11-to-DX12 difference is odd. Is the quality that much better that it almost doubles the VRAM usage?
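
For anyone curious how "minimums" like the ones in the review are derived, here is a minimal sketch, using made-up frame-time values, of how average FPS and a "1% low" figure fall out of per-frame render times (real data would come from a capture tool such as PresentMon or FrameView):

```python
# Sketch: derive average FPS and "1% low" from per-frame render times.
# The frame-time values here are invented for illustration; real data
# would come from a frame-time capture tool.

def fps_metrics(frametimes_ms):
    """Return (average fps, 1% low fps) for a list of frame times in ms."""
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    # "1% low": average of the slowest 1% of frames, expressed as FPS.
    worst = sorted(frametimes_ms, reverse=True)
    slice_len = max(1, len(worst) // 100)
    one_pct_ms = sum(worst[:slice_len]) / slice_len
    return 1000.0 / avg_ms, 1000.0 / one_pct_ms

# Ten mostly smooth frames with one big DXR-style spike:
sample = [16.7] * 9 + [50.0]
avg_fps, low_fps = fps_metrics(sample)
print(round(avg_fps, 1), round(low_fps, 1))  # 49.9 20.0
```

Note how a single slow 50 ms frame drags the 1% low down to 20 fps while the average stays near 50, which is why DXR stutter hurts far more than average FPS suggests.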
 
I think the bottom line is we need more games to test. Saying "The RTX is a lie" based on one game seems disingenuous, especially when that game is using an engine that historically has issues running in DX12. I do appreciate all the work Brent has put into this, though, and I do not discount the rest of the conclusions.

Looking forward to your analysis of Metro: Exodus if you guys are planning on doing one. Setting aside the fact that we may not get RTX in Shadow of the Tomb Raider, it will be the first game I actually want to buy and play that will put my 2080 Ti to good use.

This article directly relates to only Battlefield V, not other ray traced games. All conclusions are pointed directly toward this game.

We will of course test future games supporting NV Ray Tracing as they are released.
 
I think the bottom line is we need more games to test. Saying "The RTX is a lie" based on one game seems disingenuous,
Yes, we need more games to test, but those do not exist. NVIDIA chose this as its launch title and has bet the farm on it. That is the state it is in right now. RTX is a lie. That may or may not change, but we do not base our opinions on what might happen tomorrow. For those that paid their hard-earned cash for RTX today, that is what RTX gets you today.
 
Wow, so a 2060 in this game would not even be able to run DX12 with RTX on given its VRAM limitations, but NVIDIA touts it as being able to play with RTX "ON". If it takes a 2080 Ti to struggle through RTX, why on earth is the feature on the lower-end cards at all? Thank you for taking your time to write this up, Brent, and letting us know where to put our money.
 
It'll be interesting to see how this all plays out in the next 18-24 months, especially with RTX gen-2 and gen-3 cards. But I guess this needed to start someplace, and while it doesn't seem to work well yet, it does seem to work-ish. If they can get performance up on both software and hardware... well, real-time ray tracing has been the end-game goal for the last 20 years.

I'd also be curious to see how a game would perform that was built around RT from the start, and not just crammed in at the last minute.
 
Do you plan to re-visit these benchmarks after DLSS support is patched into the game? Jensen made some pretty bold claims at the keynote last night indicating that DLSS could negate the performance penalty of other RTX effects. If the numbers are accurate, it's a shame it hasn't made its way into the public release of BF5 yet.
 
Do you plan to re-visit these benchmarks after DLSS support is patched into the game? Jensen made some pretty bold claims at the keynote last night indicating that DLSS could negate the performance penalty of other RTX effects. If the numbers are accurate, it's a shame it hasn't made its way into the public release of BF5 yet.
Surely. DLSS is actually what we are excited about, but it will be on a game-by-game basis as well.
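
For context on why DLSS could plausibly offset the DXR hit: it renders at a reduced internal resolution and upscales to the output resolution. A rough sketch of the pixel-count arithmetic, assuming a ~67%-per-axis internal scale (an assumption for illustration; the real factor varies by game and mode):

```python
# Back-of-envelope: why upscaling can offset the DXR cost.
# Assumes the internal render target is ~67% of the output
# resolution per axis (an illustrative assumption, not a spec).

def pixel_savings(out_w, out_h, scale=0.67):
    """Return (native pixels, internal pixels, internal/native ratio)."""
    native = out_w * out_h
    internal = int(out_w * scale) * int(out_h * scale)
    return native, internal, internal / native

native, internal, ratio = pixel_savings(3840, 2160)
print(f"{native} -> {internal} pixels ({ratio:.0%} of native)")
```

Under that assumption, a 4K output shades less than half the native pixel count, which is roughly the headroom DLSS would need to claw back for ray tracing.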
 
I've been saying it in various threads since the RTX was first tested: there is zero justification for the artificially inflated price tags set by nVidia, and the gameplay experience offers no return on investment.

$1200+ for a 2080Ti to get the minimum experience, and $2500+ for a Titan to *maybe* get acceptable performance? GTFOH with that shit, nVidia!

I speculate that those who have ponied up for any RTX since launch are going to be even more pissed when they find that their GPU resale values are in the toilet after 9-12 months, when more games instituting ray tracing come along (that they can't get a decent gameplay experience with)...

It is evident to me that nVidia has really fucked themselves over with choosing to price these the way they have, and that means that they fuck their customers over.

My final thoughts are that the RTX series should have been launched like so:

Titan* - 24-32GB - $1000-1200
2080Ti - 12-16GB - $700-800
2080 - 8-12GB - $500-600
2070 - 8GB - $350-400
2060 - 6-8GB - $250-300


* and market it as the gaming GPU that it is, instead of trying to fool the masses into believing it's a professional card to try and justify its current outrageous price.
 
I've been saying it in various threads since the RTX was first tested: there is zero justification for the artificially inflated price tags set by nVidia, and the gameplay experience offers no return on investment.

$1200+ for a 2080Ti to get the minimum experience, and $2500+ for a Titan to *maybe* get acceptable performance? GTFOH with that shit, nVidia!

I speculate that those who have ponied up for any RTX since launch are going to be even more pissed when they find that their GPU resale values are in the toilet after 9-12 months, when more games instituting ray tracing come along (that they can't get a decent gameplay experience with)...

It is evident to me that nVidia has really fucked themselves over with choosing to price these the way they have, and that means that they fuck their customers over.

My final thoughts are that the RTX series should have been launched like so:

Titan* - 24-32GB - $1000-1200
2080Ti - 12-16GB - $700-800
2080 - 8-12GB - $500-600
2070 - 8GB - $350-400
2060 - 6-8GB - $250-300


* and market it as the gaming GPU that it is, instead of trying to fool the masses into believing it's a professional card to try and justify its current outrageous price.
At those prices they would have sold very fast. OTOH, the problem with the cards would have been even crazier than it is now.
 
That conclusion was harsh but deserved, imo.

I'm usually an early adopter. I went from 680 SLI -> 290X CFX -> 980 Ti -> Titan XP. I usually spend $1000-1200 per upgrade cycle. There is just nothing motivating me to buy an RTX 2080 Ti, especially given the failure rate. The performance improvement isn't worth the $1300 on its face, and I'm not gambling on a card that may fail when my Titan XP has been rock-solid stable since 2016.

I really hope AMD has something decent with Navi because it seems like NVIDIA has been asleep at the wheel the last two years.
 
Fuck new tech. let's hide like pussies in the here and now. I am not happy about the performance of Ray Tracing either, but before this card was released it was said it could never be done in real time. EVER. The cards are expensive, but let's not bitch and whine about a new GPU tech that's first gen.
 
I didn't buy my card for ray tracing. I came from a GTX 980 Ti, and yeah... I'm not happy about the price, but this card really has delivered on real-time 4K gaming. But GPU makers have to start somewhere. Like when I bought my 9700 Pro: hardware DirectX 9. Frames dropped, but it was fantastic.

Also, if you're talking about failed cards....I hope you get replacements or your money back. I'm hoping my GPU doesn't take a shit on me.
I have been in this business for 20 years. The RTX launch ranks right up there with the worst of them. Nvidia has totally screwed this up.
 
Very interesting review, and it nailed it on identifying the cause of the poor frame-time consistency with DXR: VRAM usage. Why DX12 consumes more VRAM is just weird. Will have to check if this is consistent with AMD.

Love the [H]ardOCP conclusion, right on. Hopefully NVIDIA gets their DX12 performance up, which would help a lot. More optimizations in BF5 would also help. I don't think we are at the full performance potential yet with BF5 with DXR.

I just fail to see the value that DXR gives in this title even if performance were faster. Even without DXR, DICE did a good job with the reflections, and they are not overdone like in many cases I've seen so far with DXR turned on.
 
I got the same findings after testing BFV on my RTX 2080 Ti rig: 3440x1440 at Ultra graphics @ DXR Medium gave me 70-80 fps, which fluctuates downwards. I thought the saving grace from Jensen was his "boop" remark about DLSS pushing up the frame rates taken away by DXR, so I'll wait and see if DICE releases a DLSS patch and if it makes a difference.
 
I didn't buy my card for ray tracing. I came from a GTX 980 Ti, and yeah... I'm not happy about the price, but this card really has delivered on real-time 4K gaming. But GPU makers have to start somewhere. Like when I bought my 9700 Pro: hardware DirectX 9. Frames dropped, but it was fantastic.

Also, if you're talking about failed cards....I hope you get replacements or your money back. I'm hoping my GPU doesn't take a shit on me.
You could have bought a 1080 Ti for like $650 a few months ago and received what, 80% of the performance for 50% of the cost? With RTX and DLSS being essentially non-existent outside of 2 games, the only reason to buy the RTX cards is the performance, and they are a terrible value.

Sure, the Titan XP was $1200 at release, but I've been using it for going on 3 years. There is absolutely no way they don't replace or steeply discount the current RTX offerings before 2021.
 
Excellent write-up. Any clue why DX12 is being such a RAM hog of an API for worse performance and identical graphics? I'm now curious whether other DX11/DX12 games have similar discrepancies in memory usage.
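
For anyone who wants to chase this DX11-vs-DX12 VRAM question on their own rig, one simple approach is to poll `nvidia-smi --query-gpu=memory.used --format=csv,noheader` on a timer during each run and log the readings. A minimal sketch of the parsing side, with a hard-coded sample string standing in for live output so it is easy to verify:

```python
# Sketch: parse the CSV output of
#   nvidia-smi --query-gpu=memory.used --format=csv,noheader
# The sample string below stands in for live output; in practice
# you would capture it with subprocess while the game is running.

def parse_mem_used(csv_text):
    """Return used-VRAM values in MiB, one per GPU line."""
    values = []
    for line in csv_text.strip().splitlines():
        number, unit = line.split()   # e.g. "6144 MiB"
        assert unit == "MiB"
        values.append(int(number))
    return values

sample = "6144 MiB\n"                 # stand-in for one polled reading
print(parse_mem_used(sample))         # [6144]
```

Polling this once a second during a DX11 pass and again during a DX12 pass would show whether the near-doubled usage the review found reproduces elsewhere.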

I said it before on one of the previous RT evaluations, but I feel like reiterating it: this almost seems acceptable in a way. We're spoiled on systems being able to crush games at 1080p, with many pushing higher resolutions. It used to be the other way around, where games were seen as system crushers, always asking "Can it play Crysis?"... On the other hand, ray tracing is a single effect that is doing the crushing, rather than a whole barrage of graphical effects. It's just so unfortunate that we're limited to a sample size of one game to be able to see if it's just poor implementation or actually that brutal on cards. My guess is, it's that brutal.

Edit: (Off-topic) the front page and articles all seem to be stuck on the mobile versions for me currently :( Damn you Opera.
 
I got the same findings after testing BFV on my RTX 2080 Ti rig: 3440x1440 at Ultra graphics @ DXR Medium gave me 70-80 fps, which fluctuates downwards. I thought the saving grace from Jensen was his "boop" remark about DLSS pushing up the frame rates taken away by DXR, so I'll wait and see if DICE releases a DLSS patch and if it makes a difference.
If that actually happens, and I hope it does, we will surely look at this again. I think once NVIDIA got hit with the crypto bust, it went into panic mode and pushed this product out way too quickly, and certainly overpriced. The extremely high failure rate points to this as well. The whole thing reminds me of NVIDIA when T&L was the big thing, but the climate was much different then.
 
Do any of the AMD cards take the same sort of hit in performance between DX11 and DX12?
 
Unfortunately this isn't the type of game I'm usually into playing, plus between the time it takes for EA servers to do their thing and the load times, it's kind of a pain for me to test. A 400Mbps connection and a SATA III SSD, and this game is still slow to load. All of that combined, it takes a bit of motivation for me to do extensive testing, so I'm really grateful for the extensive testing Brent and Kyle have done. Three cards, three resolutions, four settings of DXR, and two versions of DirectX, plus CPU and VRAM metrics. That's a lot of data to gather and compile.

I've done some testing in the single-player campaigns only, since I didn't want to embarrass myself in multi. Really sucks that I can't do a like-for-like comparison otherwise. I used DX12, 4K, medium DXR, everything else except blur/distortion effects maxed, and my averages in the Norway and France campaigns were much higher, but that's likely due to not having much to render for reflections. I'm also using tricks that Kyle mentioned in his Strix 2080 Ti review, and my OC numbers are averaging 2025-2070MHz with rare peaks to 2100MHz and VRAM at 14,998MHz. At those speeds I tend to average nearly 10fps more than stock in many games.

The behavior of DX11 vs DX12 is downright bizarre. So far I've only got three games that allow switching between the two: ROTTR, SOTTR, and BFV, each of them with very different performance. You guys have already done an insane amount of testing, but it does lead to mysteries regarding exactly what choices DX12 makes per various hardware configurations per game. So far SOTTR has been the best DX12 experience for me.

I have to agree with you both on numerous things here though.

1. Price - I'm happy with my Strix, but I'll be walking funny for a couple more months still. My goal was to get away from SLI, simplify my rig more, and get closer to a 4K/60fps experience. Pretty much got it all except for the obvious DXR you covered extensively here.
2. Wrong tool for the job - Brent previously mentioned that this game was a poor choice to demo DXR. This is a fast-paced action game, not a walk-around-and-take-in-the-scenery one. Hard to do any testing of my own w/o looking like a total idiot meandering the landscape.
3. NV totally bombed this every which way. Even ignoring the above issues, it's still unacceptable that NV didn't go that extra mile with EA to ensure DXR and DLSS were truly running at optimal levels out of the gate. There's a rumor DLSS will be implemented with the 2060 release.
4. 2060 - oh god, make it stop. This is the card for the poor soul who was given a 1030/1050 last year and for a moment dreamed about the 4K hype likely on the box, only to watch their dream come to a stuttering halt as someone else in the room choked up laughing about how their console did so much better.

edit: One other thing I hate about this game and a few others. I miss the days when you hit pause/start and only had to hop through one or two menus to change things, and a max preset actually maxed the game. Lately it's become like configuring a new smart TV, where you have to dig through a half dozen menus and then more sub-menus (I exaggerate, but still), and heaven help you if you change something that needs the game to restart.
 
Fuck new tech. let's hide like pussies in the here and now. I am not happy about the performance of Ray Tracing either, but before this card was released it was said it could never be done in real time. EVER. The cards are expensive, but let's not bitch and whine about a new GPU tech that's first gen.

Creating data and making observations on said data set isn't bitching and whining. Considering the entire RTX line was sold on these features working, I don't know anyone that wasn't disappointed when they found out how atrocious performance is with them. If we can identify exactly what the strengths and weaknesses are, it can only help to solve the issue long term. Companies like NVIDIA listen to gamers and developers to guide their decision-making process, whether or not you want to believe that. If every single developer tells NVIDIA that they can't use ray tracing because it uses up more VRAM than the cards have, and that performance makes their games unplayable, NVIDIA will have to modify their hardware in the future. Otherwise we'll continue to get products that underperform our expectations. Considering the price premium, this is indeed a serious issue with their new tech.

Brent didn't even discuss the implementation of the ray tracing, and the amount of noise present with the current ray tracing in Battlefield. If Nvidia wanted to get more accurate reflections, we'd probably see frame crushing numbers beyond what anyone would consider playable, so there's a list of issues with how they launched RTX, essentially selling something that was in fact not ready for market.

Clearly defining the issues and making recommendations on how to potentially alleviate disappointment for prospective RTX buyers is a well-performed service by the staff of HardOCP. I know I would be absolutely mental if I had bought a 2070 only to realize I couldn't even use RTX without a horrible experience at 1080p.
 
It occurred to me somebody should make a GIF of a soldier walking around and looking close up at reflections while two other soldiers speak to each other. One says "What's that guy doing?" and the other replies "Benching his RTX". Got to admit that'd look funny during a battle.
 
Damn, you can't even get 'Ultra' RTX settings at 1440p with the 2080 Ti!!??... What a disaster of a card... How is NVIDIA hyping up the RTX 2060 as being able to do Battlefield 5 with ray tracing enabled if even the 2070 wasn't able to give you that?... And ray tracing cuts your frame rates in half??... Crazy.
 
I have an RTX 2080 Ti with an 8700 and 16GB RAM. My average fps at 1440p @ ultra preset with RTX ultra is about 60-70ish, but there's a very odd feeling when enabling RTX. It's almost like a layer of lag has been added, even when my fps are in the 70s. Anyone experience this or feel this?
 
This is really disappointing....

Thanks for providing the minimum fps in the tests. That was VERY telling.

The situation reminds me of the Geforce 2 GTS. They promised "shaders" and all we got was a couple of effects in Evolve, Giants: CK, and Sacrifice.

So if that tech didn't go mainstream until about the Geforce 4, we can expect ray-tracing to be viable in 2 generations?
 
I’m starting to think that the lie is not only encompassing the RTX line but DX12 as a whole. We were promised so much years and years ago and hardly any of it has come to light.
 
The fact that this is the only title released with ray tracing reinforces the lie that we would have several titles with it. DLSS is another whopper. Where is my DLSS patch for Darksiders 3? Never mind, I already beat it, so too late. Same with Shadow of the Tomb Raider. So glad I stuck with my 1080 Ti and did not fall for it.
 
This is really disappointing....

Thanks for providing the minimum fps in the tests. That was VERY telling.

The situation reminds me of the Geforce 2 GTS. They promised "shaders" and all we got was a couple of effects in Evolve, Giants: CK, and Sacrifice.

So if that tech didn't go mainstream until about the Geforce 4, we can expect ray-tracing to be viable in 2 generations?
Actual ray-tracing? No.

What they are using in BF:V today? Yes, most likely.
 
Do any of the AMD cards take the same sort of hit in performance between DX11 and DX12?
I am seeing better performance with DX12 over DX11 with Vega FE. That is with single player and with Ryzen, so it's not apples to apples on the game.
 
I think this is great news. It means we can skip this generation and see what comes next. :D
 