Battlefield V NVIDIA Ray Tracing RTX 2080 Ti Performance @ [H]

Did anyone notice how smug Jensen was when he pointed at the frame rate chart and said "We are aware of that" as the screen showed the frame rate dropping after enabling DXR? He then went on to show the DLSS feature pushing ("boop") the frame rate back up again. The optimistic side of me says he has an ace up his sleeve, but I know we should all manage our expectations after seeing in the above article how much VRAM is needed when DXR is enabled. I don't know if DLSS could work around the lack of VRAM.
 
There simply isn't any incentive after weighing the cost versus the performance of these cards, so I am glad I stuck with my GTX 1070. I really can't wait to see what AMD has coming with their Navi architecture. I hope it's like their Ryzen CPU line and gives Nvidia some true competition.

I loved my ATI 9700 Pro when I got Half Life 2 with it.
 
I don't know what your problem is here, but ComputerBase is getting much higher numbers than this, on the same map, even for the 2060:

[attached ComputerBase benchmark charts]


https://www.computerbase.de/2019-01...est/5/#abschnitt_battlefield_v_mit_raytracing

My money is that you are still CPU limited on that 7700K. And your CPU testing failed to illustrate that limitation somehow.
 
It may be time to put on the tin foil hat, but the DX12 results almost seem...artificially degraded.

Not only does the FPS take a huge hit, which handicaps ray tracing before it is even turned on, but the VRAM usage is just madness.

Does DX12 in these games extend the GameWorks draw distances, akin to the miles-away HairWorks debacle in FFXV, so as to make ray tracing work when it is turned on?
We really need to see how DX12 is affecting Vega.

BTW, great article, HardOCP. The Bottom Line was hilarious!
 
I don't know what your problem is here, but ComputerBase is getting much higher numbers than this, on the same map, even for the 2060:

[attached ComputerBase benchmark charts]

https://www.computerbase.de/2019-01...est/5/#abschnitt_battlefield_v_mit_raytracing

My money is that you are still CPU limited on that 7700K. And your CPU testing failed to illustrate that limitation somehow.

Our CPU is not the bottleneck - https://www.hardocp.com/article/2019/01/02/battlefield_v_nvidia_ray_tracing_cpu_testing/

I do not know how ComputerBase evaluates performance on that multiplayer map, so I can't comment on it. For all I know they are standing in one place, moving back and forth in just one area, or testing only one minute of gameplay.

What I can tell you about how we test is that I play a real-world, live 64-player match just like a normal gamer. I time each run for 5 minutes and do several runs. I make sure to explore not just one area, but all areas of the map that I can, especially spots I know are lower in performance, since that is the worst-case scenario, but also the real-world scenario.

In my experience in Rotterdam with DXR, performance can vary GREATLY depending on what part of the map you are in and where you are standing. I find the lowest performance seems to be right under the train bridge; for the entire length of that area framerates are the lowest. Framerates are also low inside the mansion with the shiny floor. Framerates are typically much higher in indoor environments and in the area next to the water. The framerates can easily deviate 15 FPS between all of those spots, so your average framerate can be higher or lower just depending on where in the map you benchmark.

This is also a major point we have to make: the fact that the framerate varies so much with DXR is a problem. Someone could easily bias the average framerate in Rotterdam, higher or lower, by staying in one area over another. SO BE CAREFUL with other reviews. Again, I test in all areas to get an average, so now you know how we do it.
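Just to make that point concrete, here is a throwaway Python sketch. The per-area numbers are invented for illustration; they are not [H] or ComputerBase measurements, and this is not our actual tooling.

```python
from statistics import mean

# Hypothetical average FPS in different parts of Rotterdam with DXR on.
# Made-up values, chosen only to illustrate how area choice biases an average.
area_fps = {
    "train_bridge": 48,        # the worst-case stretch described above
    "mansion": 52,             # shiny reflective floor
    "waterfront": 55,
    "indoor_side_streets": 63,
}

full_map_avg = mean(area_fps.values())        # averaging across all areas
one_spot_avg = area_fps["indoor_side_streets"]  # camping one favorable spot

print(f"Average across all areas:      {full_map_avg:.1f} FPS")
print(f"Staying in one favorable area: {one_spot_avg:.1f} FPS")
print(f"Spread between best and worst: "
      f"{max(area_fps.values()) - min(area_fps.values())} FPS")
```

With a 15 FPS spread between spots, two reviewers can both report an honest "average" for the same map and still land far apart, which is exactly why the test route matters.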
 
I'm starting to think that the lie encompasses not only the RTX line but DX12 as a whole. We were promised so much years and years ago, and hardly any of it has materialized.

Seems like it's been every other DX release that offered something good. DX9 good, DX10 meh, DX11 good, DX12 meh. So maybe DX13 will be better? :p
 
A seriously scathing review. Clearly the standards for "playability" are elitist at the [H], which I get, but for some, including myself, the RTX experience in Battlefield on an RTX 2080, even above 1080p, is acceptable at minimum. I certainly did not purchase my RTX card solely for that purpose, as I'm sure is the case with many RTX owners, so I'm not quite sure why you continue to belabour the point of "You don't spend that amount of money just to play BFV at 1080p." Lest we forget, RTX cards are very strong performers in non-RTX games, which is the probable use case for most prospective buyers. Nevertheless, I can see why for many this would be a very fair review. It's just that I'm not one of them :)
 
Thanks Brent and [H] for the work in these reviews.

Now that you have had some time playing the game with DXR on vs off and at various resolutions, what would your subjective image quality impressions be of the game in 1080p ultra DXR, vs 4k no DXR?

i.e., is the visual benefit from DXR worth the loss in resolution, everything else being equal?
 
I understand the CPU is at 5GHz but I feel like the 4 cores are holding the results back a bit.

 
Seems like it's been every other DX release that offered something good. DX9 good, DX10 meh, DX11 good, DX12 meh. So maybe DX13 will be better? :p

Like Star Trek movies of old, let's hope so. 'Cept the even-numbered Treks were awesome.
 
Thanks Brent and [H] for the work in these reviews.

Now that you have had some time playing the game with DXR on vs off and at various resolutions, what would your subjective image quality impressions be of the game in 1080p ultra DXR, vs 4k no DXR?

i.e., is the visual benefit from DXR worth the loss in resolution, everything else being equal?
I think they answered that question in this review, a fairly resounding "NO," but obviously it's a subjective conclusion... YMMV.
 
Yes it is. Your results don't agree with other sites that use higher-end CPUs, and they don't agree with many YouTubers who captured their runs on higher-end CPUs either.

Your CPU comparison compared an 8-thread CPU to another 8-thread CPU. DXR in Battlefield requires a minimum of a 12-thread CPU; even a Ryzen 1800X can bottleneck the game with DXR, as Digital Foundry found out.

At 1440p with low DXR, you have the 2080Ti just 13% faster than a 2080, and the 2080 just 5% faster than the 2070, which is impossible unless you are severely CPU limited.

Your numbers at 1080p are also skewed, showing only 10% between cards, whether at low or ultra DXR, which is another sign that you have a massive CPU bottleneck.

I don't know how you can claim you are not CPU limited while using a CPU that is pegged at 90% usage most of the time.
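To illustrate the arithmetic behind that scaling argument, here is a minimal sketch with made-up numbers (not anyone's measured results) showing how a CPU frame-rate ceiling compresses the percentage gaps between cards:

```python
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    """How much faster (in %) the first result is than the second."""
    return (fast_fps / slow_fps - 1.0) * 100.0

# Hypothetical GPU-limited averages (what raw GPU horsepower might deliver).
gpu_limited = {"2070": 60.0, "2080": 75.0, "2080 Ti": 100.0}

# The same cards behind a hypothetical ~70 FPS CPU ceiling.
cpu_cap = 70.0
cpu_limited = {card: min(fps, cpu_cap) for card, fps in gpu_limited.items()}

for label, results in (("GPU limited", gpu_limited), ("CPU capped", cpu_limited)):
    print(f"{label}: 2080 Ti vs 2080 = "
          f"{pct_faster(results['2080 Ti'], results['2080']):.0f}%, "
          f"2080 vs 2070 = {pct_faster(results['2080'], results['2070']):.0f}%")
```

With no cap the gaps come out around 33% and 25%; behind the 70 FPS ceiling they collapse to 0% and 17%, which is the kind of compression being argued about here.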
 
You are free to feel however you want, but when it comes to the facts, I prefer to stick with them - https://www.hardocp.com/article/2019/01/02/battlefield_v_nvidia_ray_tracing_cpu_testing/
For sure, but that is also DX12, which we all know is slower than DX11 even without DXR enabled. I guess Guru3D never stated which API they were using for their core scaling tests either.
If scaling proves itself in DX11 and not DX12, then there is something seriously wrong with the DX12 implementation.
I wonder what core scaling looks like for either API with smt on/off and with different core counts from the same processor families.
 
One really cannot compare different sites' results when they were tested so differently. A canned benchmark is about the only way, but its value in many cases is meaningless.

Brent from my observations is about the best tester in the world for finding where a card struggles in a game. Looking up into the sky and getting mega FPS is meaningless. In a battle it is nice to know how it really performs.

With DXR it looks to be GPU limited. Does CPU usage go up, stay the same, or go down? If it stays the same or goes down while the FPS goes down with DXR, then it is highly unlikely to be CPU bound; it is GPU bound.
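As a rough sketch of that diagnosis logic (the threshold and the sample readings below are purely illustrative, not from a real profiling run):

```python
def likely_bottleneck(cpu_usage_off: float, cpu_usage_on: float,
                      fps_off: float, fps_on: float) -> str:
    """Guess the limiting component when enabling DXR changes the numbers."""
    fps_dropped = fps_on < fps_off
    cpu_usage_rose = cpu_usage_on > cpu_usage_off
    if fps_dropped and not cpu_usage_rose:
        # FPS fell while the CPU did no extra work: the GPU is the likely limiter.
        return "GPU bound"
    if fps_dropped and cpu_usage_rose and cpu_usage_on > 90.0:
        # FPS fell and the CPU is near saturation: the CPU may be the limiter.
        return "possibly CPU bound"
    return "inconclusive"

# Hypothetical readings from one test pass (not real measurements).
print(likely_bottleneck(cpu_usage_off=70.0, cpu_usage_on=68.0,
                        fps_off=110.0, fps_on=62.0))   # -> GPU bound
```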
 
I read the whole thing; it still doesn't explain the massive inconsistencies in your numbers. The differences between the 2070, 2080 and 2080Ti are completely compressed in your testing as a result of a massive CPU bottleneck, and your 2080Ti is basically delivering 2060-level performance.

Also ComputerBase tests a live match in the mansion area of Rotterdam.

With DXR it looks to be GPU limited. Does CPU usage go up?
It goes considerably up because the ray setup is CPU intensive.
 
You are free to feel however you want, but when it comes to the facts, I prefer to stick with them - https://www.hardocp.com/article/2019/01/02/battlefield_v_nvidia_ray_tracing_cpu_testing/

Didn’t Kyle buy a 9900k to put this to bed IIRC?

I was personally impressed they got RT running as well as they did although in the wrong type of game genre. I wish a game like World of Warships would implement it where proper lighting would be amazing.

Overall what I am really really disappointed with is the adoption rate of DLSS2x/DLSS/Ray Tracing.
 
GPU reviews at the [H] are of a very high standard, imo. There are other extremely capable reviewers, and some of their experiences do appear to be at odds with what is being reported here.

Make of it what you will.
 
I think the bottom line is we need more games to test. Saying "The RTX is a lie" based on one game seems disingenuous, especially when that game is using an engine that historically has issues running in DX12. I do appreciate all the work Brent has put into this, though, and I do not discount the rest of the conclusions.

Looking forward to your analysis of Metro: Exodus if you guys are planning on doing one. Setting aside the fact that we may not get RTX in Shadow of the Tomb Raider, it will be the first game I actually want to buy and play that will put my 2080 Ti to good use.

There is that but then again this one game is the lead title for DXR both in terms of development resources and marketing. BF 5 is the drum that's getting banged here by the people trying to sell you a Turing card. Sadly that drum has a big hole in the middle of it.
 
Looking forward to your analysis of Metro: Exodus if you guys are planning on doing one. Setting aside the fact that we may not get RTX in Shadow of the Tomb Raider, it will be the first game I actually want to buy and play that will put my 2080 Ti to good use.

I'm looking forward to that too. My only real concern is that the previous Metro games can still bring even a 2080Ti at 4K to a crawl with AA maxed; the most I can do is 2x for 50-60 FPS with them. One of the places I usually test is Venice. Right after you get off the raft and walk through the market it can get pretty rough, and once you get above ground in the marsh it's the same again. They were part of the first slew of games I tested when I got the card, and I tested both versions of both games. I do have a sneaking suspicion that modern drivers might be hampering them. Some tin foil hat stuff, but it really seems like performance in these has decreased over the last couple of years as I went from 970 SLI > 1080 SLI > 2080Ti. Love the Metro series, but Exodus could be a whole new world of hurt for modern cards again. This is one of the few games I'll pre-order, and I'll suffer along with my card for it.
 
It'll be interesting to see how this all plays out in the next 18-24 months, especially with RTX gen 2 and gen 3 cards. But I guess this needed to start someplace, and while it doesn't seem to work well yet, it does seem to work-ish. If they can get performance up on both the software and hardware sides... well, real-time ray tracing has been the end-game goal for the last 20 years.

I'd also be curious to see how a game built around RT from the start, and not just crammed in at the last minute, would perform.
I am definitely interested in seeing how a game that is completely ray traced would perform on the hardware we have available compared to the hybrid method currently being employed.
I've been saying it in various threads since the RTX was first tested: there is zero justification for the artificially inflated price tags set by nVidia, and the gameplay experience offers no return on investment.

$1200+ for a 2080Ti to get the minimum experience, and $2500+ for a Titan to *maybe* get acceptable performance? GTFOH with that shit, nVidia!

I speculate that those who have ponied up for any RTX card since launch are going to be even more pissed when they find their GPU resale values in the toilet after 9-12 months, once more games implementing ray tracing come along (that they can't get a decent gameplay experience with)...

It is evident to me that nVidia has really fucked themselves over with choosing to price these the way they have, and that means that they fuck their customers over.

My final thoughts are that the RTX series should have been launched like so:

Titan* - 24-32GB - $1000-1200
2080Ti - 12-16GB - $700-800
2080 - 8-12GB - $500-600
2070 - 8GB - $350-400
2060 - 6-8GB - $250-300


* and market it as the gaming GPU that it is, instead of trying to fool the masses into believing it's a professional card to try and justify its current outrageous price.
The only way for price to come down is to decrease the rasterization or ray tracing hardware in the chip. Which means developers have to make a decision: do we continue to languish in a rasterized world or do we move forward into a ray traced one?
I read the whole thing; it still doesn't explain the massive inconsistencies in your numbers. The differences between the 2070, 2080 and 2080Ti are completely compressed in your testing as a result of a massive CPU bottleneck, and your 2080Ti is basically delivering 2060-level performance.

Also ComputerBase tests a live match in the mansion area of Rotterdam.


It goes considerably up because the ray setup is CPU intensive.
ComputerBase just says they use the Ultra preset with Future Frame Rendering turned on. [H] goes beyond the Ultra preset, as detailed here:
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/2
 
What Armenius stated about this being a game known for DX12 issues gave me an idea.

Facts:
1. Brent has done exhaustive testing showing the CPU load and performance.
2. Early on, when DX12 was revealed, it was explained that it would lower hardware load but require extra coding at the engine software level to achieve that, while previous APIs mostly just dumped everything to the hardware.

Theory:

What if BFV's engine is flagging the core/thread count and then limiting itself? It would completely ignore a screaming 5GHz 4C/8T chip and favor high counts regardless of speed. The lower-count CPU would never see its full potential but would appear bottlenecked to anyone not checking like Brent did. With three very different rigs in my house, whose only commonality is 32GB of 2133MHz (misc. CAS) RAM and the same W10 builds, I've seen some strange things with DX12. It's a wacky beast that depends on the dev a lot more than previous APIs did. A hammer approach will still mostly work with previous APIs (I've seen a couple of exceptions), but not necessarily with DX12.
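Purely as a hypothetical illustration of that theory, and definitely not actual Frostbite or BFV code, an engine that keyed its job-system size off the logical core count rather than measured throughput might do something like this:

```python
import os

def pick_worker_threads(reserved_threads: int = 2) -> int:
    """Hypothetical thread-count gate; not real engine code."""
    logical_cores = os.cpu_count() or 4
    # If the engine simply flags "fewer than 12 logical cores" and scales back,
    # a screaming 5GHz 4C/8T chip gets throttled the same as a slow one.
    if logical_cores < 12:
        return max(2, logical_cores // 2)   # conservative job pool
    return logical_cores - reserved_threads

print(f"Job-system workers on this machine: {pick_worker_threads()}")
```

If anything like that gate exists, clock speed never enters the decision, which would match the symptom of a fast quad-core looking bottlenecked regardless of how hard it is actually working.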
 
Yes it is. Your results don't agree with other sites that use higher-end CPUs, and they don't agree with many YouTubers who captured their runs on higher-end CPUs either.

Your CPU comparison compared an 8-thread CPU to another 8-thread CPU. DXR in Battlefield requires a minimum of a 12-thread CPU; even a Ryzen 1800X can bottleneck the game with DXR, as Digital Foundry found out.

At 1440p with low DXR, you have the 2080Ti just 13% faster than a 2080, and the 2080 just 5% faster than the 2070, which is impossible unless you are severely CPU limited.

Your numbers at 1080p are also skewed, showing only 10% between cards, whether at low or ultra DXR, which is another sign that you have a massive CPU bottleneck.

I don't know how you can claim you are not CPU limited while using a CPU that is pegged at 90% usage most of the time.
If we are CPU limited, then why did we see framerate scaling across the different cards? That said, Brent now has a 9900K in hand to use, and we will be publishing our findings.
 
"The fact is that Jensen Huang stood on that stage last night at CES and lied to you."

That's a strong statement and I think it is good to say. Doesn't look like the rest of the "tech sites" are doing any real analysis because everyone is lapping up this piece of crap 6GB $350 card.
If the 2060 were $250, OK. But it's up there with the 8GB cards only in price and general performance. When you actually want to use the shiny, it fails you. Waiting for more games is some crap.

Jensen Huang is very intentionally deceiving. A 6GB card at $350 is bullshit and he knows it, so he makes it more palatable with RTX.

I really hope AMD mule kicks NVIDIA's heads out of their collective asses with their upcoming releases. I'd be happy to buy a GTX 2080 for ~$500.
 
The difference between the 2080 and 2080Ti @ 4K (where the CPU plays very little part) is huge: 75%. Part of that is VRAM, but part of it is the liberation from that CPU bottleneck too. You should see at least a 40% difference between the 2080Ti and 2080 at any point.

I am looking forward to your findings with the 9900K.
Fair deal. We will see how it all works out. If anything, we will fully determine whether you need an 8C CPU to play BFV online.
 
I really doubt that BFV requires more than ~6 cores/threads. Game engines traditionally do not scale as well as, say, rendering applications, because there are many parts of a game engine that are difficult to multi-thread. I could certainly be wrong, but I can't think of any other games that scale to 8 cores, and none beyond that.
 
I really doubt that BFV requires more than ~6 cores/threads. Game engines traditionally do not scale as well as, say, rendering applications, because there are many parts of a game engine that are difficult to multi-thread. I could certainly be wrong, but I can't think of any other games that scale to 8 cores, and none beyond that.

BF4, BF1, Forza, GTA V, Assassin's Creed...
 
I really hope AMD mule kicks NVIDIA's heads out of their collective asses with their upcoming releases. I'd be happy to buy a GTX 2080 for ~$500.

I really hope you aren't rooting for AMD purely in hopes of pushing down Nvidia prices (assuming competitive performance). It's part of the problem to only hope the competitor drives prices into your range. It's unsustainable for the competitor. Well, if they can't compete then they can't compete, but if they are actually competing...
 
BF4, BF1, Forza, GTA V, Assassin's Creed...
BF4 definitely didn't scale beyond 6 cores:
According to DICE’s Johan Andersson, Battlefield 4’s Frostbite engine is parallelized to run on up to eight CPU cores. However, in a presentation he gave back in 2014, he conceded that it’s not possible to utilize all available CPU cores under the then-current implementations of DirectX and OpenGL.

BF1 doesn't either: https://www.tomshardware.co.uk/battlefield-1-directx-12-benchmark,review-33864-8.html

GTA V doesn't really either, 4 core CPU is a few frames off a 10 core CPU at 1080p and no difference at 1440p: https://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768-3.html

Looks like Assassin's Creed Odyssey does, though, which is interesting. https://wccftech.com/assassins-creed-odyssey-pc-performance-explored/
 
BF4 definitely didn't scale beyond 6 cores:

BF1 doesn't either: https://www.tomshardware.co.uk/battlefield-1-directx-12-benchmark,review-33864-8.html

GTA V doesn't really either, 4 core CPU is a few frames off a 10 core CPU at 1080p and no difference at 1440p: https://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768-3.html

Looks like Assassin's Creed Odyssey does, though, which is interesting. https://wccftech.com/assassins-creed-odyssey-pc-performance-explored/

Your links are comparing mesh CPUs to ring bus CPUs. Please try again.
 
Seems a little hard on nVidia. They are trying to push forward and sell something besides resolution and framerate. Ray tracing is the first real image quality improvement technology we've seen since shaders debuted. I know running Destiny 2 at 165fps locked on my 1440p monitor wouldn't do anything for me over running it at the 90-120fps I get off a 1080ti.

I don't even know how out of line their pricing is given the cost of memory and the size of the chips they are putting on these boards.

Of course the follow-on to that is, even if Nvidia is selling them at a fair cost given the BoM for the card, does the performance constitute enough value for people to upgrade? I think for most people here the answer is a resounding no. I'm sure as shit not going to buy it, but it exists as a stepping-stone product between pre-ray-tracing cards and the die-shrunk version of this tech that will actually provide enough performance, and hopefully enough software support, to offer real value. Rollouts for hardware T&L and shaders were similar in the immediate lack of support and the extra cost for hardware you couldn't yet use.

Is RTX a thing that has value today? No, not yet.
I do firmly believe ray tracing will be the future; it was just birthed a little early.
 
My 1080 Ti that I bought 2 years ago has probably proven to be the best investment in computer hardware that I've made pretty much ever. With this current crop I have no reason to upgrade for another 2 years....
 
Your links are comparing mesh cpus to ring bus. Please try again.
Oh boy, and now we start moving the goal posts.

Intel stated that the ring bus was designed for up to 8 physical core CPUs, and mesh interconnect was introduced on 10+ core CPUs. Please provide some actual proof that the ring bus is introducing significant performance penalties in 4/6/8C desktop CPUs like what are being tested here. I'll wait.
 
Oh boy, and now we start moving the goal posts.

Intel stated that the ring bus was designed for up to 8 physical core CPUs, and mesh interconnect was introduced on 10+ core CPUs. Please provide some actual proof that the ring bus is introducing significant performance penalties in 4/6/8C desktop CPUs like what are being tested here. I'll wait.
https://www.techspot.com/review/1457-ryzen-7-vs-core-i7-octa-core/

Well that didn't take long did it? Here, the R7 1700 is matching the 7820x clock for clock in gaming.

Are you going to tell me that Ryzen will match Skylake clock for clock when it is running ring bus as well?

If so, there will be a lot of pissed-off 9900K owners out there.
 
Damn, you can't even get 'Ultra' RTX settings at 1440p with the 2080Ti!!??... What a disaster of a card... How is Nvidia hyping up the RTX 2060 as being able to do Battlefield V with ray tracing enabled if even the 2070 wasn't able to give you that?... And ray tracing cuts your frame rates in half??... Crazy.
They just forgot to mention it was a slideshow gaming experience. :)
 