Battlefield V NVIDIA Ray Tracing: i9-9900K CPU Testing @ [H]

Fine, we can agree to disagree. I didn't buy the RTX for the RT capability. I bought it because I wanted better performance with my 21:9 display. DXR is still up in the air as far as I'm concerned. When DICE gets their DX12 programming straight I'll try it out again.

I too game on 21:9. Acer XR382CQK.002, 3840x1600 on a 1080 Ti and 6600K under custom water. Upgraded to the 1080 Ti from a 980 Ti less than two months ago.
 
Ok, the only difference in settings between my setup and Brent's is that I keep GPU Memory Restrictions "ON". I tried turning it off like Brent did and I'd get massive stutters every ~10 seconds. Even had my sound skip.


Maybe I can attribute my higher FPS to using a 380 W BIOS and an overclock. Below is an Afterburner chart with VSYNC on at 60 Hz, 3440x1440 ultra/DXR low, Rotterdam, 64 players, just running around the map. You can see it's capable from my usage.

[Attachment: Afterburner chart]

Overall this is still a terrible game for RT, there are so many better games for it that would look amazing.
 
Ok, the only difference in settings between my setup and Brent's is that I keep GPU Memory Restrictions "ON". I tried turning it off like Brent did and I'd get massive stutters every ~10 seconds. Even had my sound skip.


Maybe I can attribute my higher FPS to using a 380 W BIOS and an overclock. Below is an Afterburner chart with VSYNC on at 60 Hz, 3440x1440 ultra/DXR low, Rotterdam, 64 players, just running around the map. You can see it's capable from my usage.

[Attachment: Afterburner chart]

Overall this is still a terrible game for RT, there are so many better games for it that would look amazing.

You have VSYNC On. I have VSYNC Off in testing.

In regards to Memory Restriction, I tried both On and Off and had the same experience either way.
 
Kyle, it's true that we don't have anything else to judge the RTX line with besides BFV, but I think we're going to see better implementation in the future. That doesn't mean I forgive Jensen for overpromising and underdelivering, but I am willing to withhold final judgement until I see other engines try DXR.

What if that future never comes? I don't know what will happen, only what is now, as Kyle said.

I do know this about the future, though: it is constantly changing. NVIDIA Hybrid Ray Tracing may be short-lived, and something entirely different like Path Tracing may take over. The point is, we don't know. Your RTX card may only ever have 1 or 2 games that support NVIDIA Ray Tracing. Or maybe it will have 100; I surely don't know.
 
You have VSYNC On. I have VSYNC Off in testing.

In regards to Memory Restriction, I tried both On and Off and had the same experience either way.

I ran vsync off the other day and was in the 70s.

I hit 10.5 GB with DXR low and it ran fine. I did have stuttering with DXR ultra/memory restrictions on, but more like micro stutters. Confirms your VRAM adventures IMO.

At the end of the day it shouldn’t be this hard with the top GPU...
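A quick way to check whether those stutters actually line up with VRAM pressure is to watch `nvidia-smi` while playing and flag when usage gets close to the card's limit. This is just an illustrative sketch, not anyone's actual tooling here; the sample CSV line and the 95% threshold are made up, and live use would poll the real command shown in the comment:

```python
# Sketch: flag VRAM pressure from nvidia-smi CSV output, to see whether
# stutters line up with memory usage. The sample line below is made up;
# live polling would repeatedly run the command noted near the bottom.

def parse_mem_csv(line):
    """Parse one 'memory.used, memory.total' CSV line like '10752 MiB, 11264 MiB'."""
    used_s, total_s = line.split(",")
    used = int(used_s.strip().split()[0])
    total = int(total_s.strip().split()[0])
    return used, total

def headroom_warning(line, threshold=0.95):
    """True when used VRAM is at or above `threshold` of the card's total."""
    used, total = parse_mem_csv(line)
    return used / total >= threshold

# Live use would poll something like:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader
sample = "10752 MiB, 11264 MiB"   # ~10.5 GB used on an 11 GB 2080 Ti
print(parse_mem_csv(sample), headroom_warning(sample))
```

If the warning fires right when the game hitches, that points at VRAM; if usage stays flat through the stutters, the cause is elsewhere (driver, CPU, or streaming from disk).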
 
Lol, "chuck it to the trash, RTX is a lie". I love it.

I would be interested to see how the numbers look at stock clocks, seeing as that is what the specs on the box reflect.

This is exactly the kinda stuff that Lisa Su was talking about during the CES reviews: having the hardware there without a proper software package is a disaster.

I wonder about the ray tracing tech... is it an issue with the software, the hardware just not being powerful enough, or both?


Thanks for the write-up!

Among the many hardware makers in the world, NVIDIA most likely understands this the most.
 
Among the many hardware makers in the world, NVIDIA most likely understands this the most.

Apparently they don't. Hence GPP and the NDA. They knew they had a dud when it came to ray tracing long before we even knew about the tech. Wake up.
 
Not DX12 itself, but DICE's ability to use it properly, or better yet, code for it properly. They have had issues with DX12 since day one, yet there are many devs/games out there that have no issues with DX12. It really seems as though DX12 is an afterthought with both BF1 and BFV. Then you throw in ray tracing and it gets worse.
The issue is most likely with the game engine itself. Instead of saying it was an afterthought, I think DICE wants to get more experience with it, and to get a real sense of how it works they need to see how it runs in actual gameplay, not just in their test lab. The caveat is that users might not get the best experience from it.
 
The issue is most likely with the game engine itself. Instead of saying it was an afterthought, I think DICE wants to get more experience with it, and to get a real sense of how it works they need to see how it runs in actual gameplay, not just in their test lab. The caveat is that users might not get the best experience from it.


The user doesn't get anything but a negative experience. Having to drop a single resolution tier for a proper implementation would be acceptable; having to go from 4K to 1080p for a half-baked implementation is unacceptable, considering the price increases. Hell, the price increases alone don't match up with the non-RT numbers.
 
Apparently they don't. Hence GPP and the NDA. They knew they had a dud when it came to ray tracing long before we even knew about the tech. Wake up.

NVIDIA got where they are today because of the amount of resources they have put into improving their software; that's why they dominate the professional space. Real-time ray tracing has to start somewhere; there will be no perfect solution from day one. When tessellation first appeared in DiRT 2, what did people think about it? That the tech wasn't ready to be implemented in games for 5 more years? That it was the most useless feature ever introduced in 3D games? Even [H] recommended the game be played in DX9 because there was no visual advantage at all. Developers and other hardware makers will benefit from this early experience. Even if RTRT were only pushed 5 years from now, I'm sure we would see the same issues we are seeing right now, because hardware makers and game developers still would not have real experience with how it runs in real games.
 
NVIDIA got where they are today because of the amount of resources they have put into improving their software; that's why they dominate the professional space. Real-time ray tracing has to start somewhere; there will be no perfect solution from day one. When tessellation first appeared in DiRT 2, what did people think about it? That the tech wasn't ready to be implemented in games for 5 more years? That it was the most useless feature ever introduced in 3D games? Even [H] recommended the game be played in DX9 because there was no visual advantage at all. Developers and other hardware makers will benefit from this early experience. Even if RTRT were only pushed 5 years from now, I'm sure we would see the same issues we are seeing right now, because hardware makers and game developers still would not have real experience with how it runs in real games.

NVIDIA got where they are today by acquisitions.

Right there is the meat, exactly what I was talking about. It's amazing how people end up agreeing with something they claim to disagree with, without even realizing it.
 
The user doesn't get anything but a negative experience. Having to drop a single resolution tier for a proper implementation would be acceptable. Having to go from 4k to 1080 for a half baked implementation is unacceptable considering the pricing increases. Hell, the pricing increases alone don't match up for non-RT numbers.

No one is happy with the price increase, but NVIDIA is a company, and so is AMD. With the price increase of RTX, AMD also quickly took advantage of the situation to get better profits. Acceptable or not, it is up to the user to decide, and ultimately no one is actually forcing users to buy the new RTX cards; it's not as if you can't play the game at all otherwise. Older cards like the 1080 Ti can play it just fine sans RT. The funny thing is, people complain when developers don't push things higher, blaming console parity and such, and when they do try, with a heavy performance penalty, they get complaints about a half-baked implementation. That's why some devs simply give their middle finger to PC gamers.
 
NVIDIA got where they are today by acquisitions.

Right there is the meat, exactly what I was talking about. It's amazing how people end up agreeing with something they claim to disagree with, without even realizing it.

Part of it. But without good leadership, no matter how promising the tech you have under your belt, you will still crumble in the end.
 
The issue is most likely with the game engine itself. Instead of saying it was an afterthought, I think DICE wants to get more experience with it, and to get a real sense of how it works they need to see how it runs in actual gameplay, not just in their test lab. The caveat is that users might not get the best experience from it.

Don't they first have to get DX12 working properly without ray tracing? They can't even do that yet. Battlefield 1 had issues with DX12 from the start and still has them; Battlefield V was released with serious DX12 issues, and there are still issues with DX12 without ray tracing. So how does releasing ray tracing to see how it runs in actual gameplay help anything, when they can't even get the foundation it runs on to be reliable and solid? (It is safe to say they had plenty of real gameplay from Battlefield 1 to learn all they could.)

Since DICE created the Frostbite engine, and the engine has trouble with DX12, that pretty much supports the idea that DX12 was added as an afterthought: they never built a new Frostbite engine from the ground up with DX12 in mind, and/or they are not capable of modifying the current engine to support it properly. So, again, they do not have a solid foundation that can properly run DX12, so why bother adding ray tracing?
 
I too game on 21:9. Acer XR382CQK.002, 3840x1600 on a 1080 Ti and 6600K under custom water. Upgraded to the 1080 Ti from a 980 Ti less than two months ago.

I still have my old (2015) GTX Titan X (Maxwell, roughly a 980 Ti with 12 GB VRAM) and I'm not sure what to use in my next PC build: a cheap used GTX 1080 Ti or a new (expensive) RTX 2080 Ti. Initially I wanted to double my FPS (1440p, later maybe 4K), but GP102 is only +50-60%; that's why I skipped it. TU102 is a bit more than 2x the FPS (compared to GM200), but the price is still higher than my GTX Titan X was ($1K in 2015). Should I wait until it goes down? An AMD alternative is still not available in 2019; if something appears at ~RTX 2080 performance but reasonably priced, that's OK for me. I don't care much about DXR/RT.
 
Could you point us in the direction of these tweaks? I am interested in looking into them, and in seeing if there are any other tweaks I need to do that I haven't already done. I am sure there are others who would be interested in them as well.

Thanks!
I didn't have time to find the video for my initial post. I found the video here.
Not all of the tweaks are necessary for getting better performance, but try them out and see if they help.
 
Ok, so this is a stupid question that has probably been asked before, but what is up with DX12? I thought it was auto-enabled in some games? How are you disabling it, and is there any advantage to running DX12 and taking the FPS hit?
 
Thing is, it has everything to do with your test of the game as well. Obviously you are GPU-limited with ray tracing on, but you are testing CPUs.
You do realize the point of these CPU tests was to address all of the readers suggesting we were CPU-limited... right?

If you would like to discuss YouTubers tweaking BFV, please do that in your own thread. This thread is for discussing information of our review. Thanks.
 
Man, this must be a sore spot; I've never had a post deleted before... I didn't realize what I said was so offensive. I just said I see 100% of real-world performance over your numbers, and asked what hard drive you used and whether it might be related to disk churn, since you mentioned stuttering that no one else has. Clearly there's some tension here I wasn't aware of. My intent wasn't to say you did anything wrong; we're just trying to help understand why your numbers don't match others', and in this case, yours are substantially lower than what others see.
 
Man, this must be a sore spot; I've never had a post deleted before... I didn't realize what I said was so offensive. I just said I see 100% of real-world performance over your numbers, and asked what hard drive you used and whether it might be related to disk churn, since you mentioned stuttering that no one else has. Clearly there's some tension here I wasn't aware of. My intent wasn't to say you did anything wrong; we're just trying to help understand why your numbers don't match others', and in this case, yours are substantially lower than what others see.

I said it in another post, but I think it is related to how we evaluate. If you stick your head in one part of the map and stay there, you can certainly inflate or deflate your numbers; it is very easy to bias the results in BFV and still call them "real-world".

You must check every place on the map you can go, there are areas that are more demanding than others.

You must also have lots of people on the screen at the same time with combat happening; if not, numbers will be a lot higher, because nothing is happening onscreen. It's real easy to once again be in a place on the map where there aren't people, and the numbers will go way up. The only way to properly test is to be very active in the map and put yourself in dense situations.

It is easy to take the easy way out with this game and test for just a minute's worth of gameplay; we test longer.

Multiplayer is dynamic in nature, so naturally performance is going to be different, that's why we do multiple runs.

Jensen at CES looking at a spot of water with nothing happening onscreen does not equate to what you will really get as you play the game against real people: dying, re-spawning, and going at it again, rinse and repeat, with fast gameplay.
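For what it's worth, the multi-run approach described above can be sketched as a short script. This is a hypothetical illustration, not [H]'s actual tooling: given per-run frametime logs in milliseconds (the kind capture tools can export), it reports average FPS and 1% lows per run, which is where dense-combat stutter shows up even when the average looks fine.

```python
# Sketch: summarize multiple benchmark runs from frametime logs (ms per frame).
# The data and run names are synthetic; adapt to whatever your capture tool exports.

def summarize(frametimes_ms):
    """Return (average FPS, 1% low FPS) for one run."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # "1% lows": the FPS implied by the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

def report(runs):
    """Aggregate several runs; multiplayer is dynamic, so never trust one pass."""
    for name, frametimes in runs.items():
        avg, low = summarize(frametimes)
        print(f"{name}: avg {avg:.1f} FPS, 1% low {low:.1f} FPS")

# Synthetic example: a quiet run vs. one with dense-combat spikes.
runs = {
    "run1_quiet":  [12.5] * 400,               # steady 80 FPS
    "run2_combat": [14.0] * 396 + [50.0] * 4,  # spikes drag the 1% lows down
}
report(runs)
```

The point of the example: the combat run's average barely moves, but its 1% lows collapse, which is exactly why a one-minute pass through an empty part of the map can look much better than longer, busier testing.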
 
Thanks for confirming your config. I do understand the tension you face with configuration and system setup when doing a review, as it obviously has a big impact on performance. I suppose there is some room for ambiguity as to what settings would be considered "automatic" or "out of the box" vs. "advanced" or "professional/competition" related, but I do applaud you for taking a stance on an issue and sticking with it. Having said this, I am not sure most people would install only the Driver, NGX, and PhysX, even though doing so is clearly the way to go (in my case) when playing BFV; but maybe I am an outlier, and drivers are the one thing I usually let install whatever they want.
 
Thanks for confirming your config. I do understand the tension you face with configuration and system setup when doing a review, as it obviously has a big impact on performance. I suppose there is some room for ambiguity as to what settings would be considered "automatic" or "out of the box" vs. "advanced" or "professional/competition" related, but I do applaud you for taking a stance on an issue and sticking with it. Having said this, I am not sure most people would install only the Driver, NGX, and PhysX, even though doing so is clearly the way to go (in my case) when playing BFV; but maybe I am an outlier, and drivers are the one thing I usually let install whatever they want.
Introducing a ton of unknown variables, with unknown effects, is simply not in the cards for us if we want to be taken seriously.
 
People are trying real hard to find some secret bullet to justify their $1,000+ cards, as if NVIDIA is holding it back from everyone despite needing all the sales they can get. DXR looks like undercooked software that is a hardware generation early.
 
Remember when you criticized Hardware T&L as being incomplete and worthless? History is repeating itself, it seems. We certainly need more time and games before we prejudge an important tech like DXR as worthless, or "trash".
I thought about writing a whole story about the parallels. Good memory on you!

And agreed, I do not think that DXR is trash, but currently it is worthless IMO.

RTX 2060 BFV review being edited now.
 
We are expecting the RTX 2060 to be a disaster at ray tracing with the same settings as the other cards.

Ultra textures are just too much for everything short of a 2080 Ti.
 
We are expecting the RTX 2060 to be a disaster at ray tracing with the same settings as the other cards.

Ultra textures are just too much for everything short of a 2080 Ti.

With DLSS, ray tracing might have a fighting chance. It's supposed to come to BFV.

DLSS is a pretty sweet feature, just taking way too long to implement. But honestly, at 2060 performance, I would have preferred straight CUDA cores and a smaller/cheaper die. They are setting themselves up to get curb-stomped by the competition, if the competition would hurry up.
 
I wonder who it was who criticized [H] for using an overclocked 7700K instead of a 9900K? That criticism could have been countered with simple logic: ray tracing is an entirely GPU-bound effect, and enabling it is like pushing resolution to the max. It takes load AWAY from the CPU and evens out the differences between them; the GPU becomes the bottleneck. Heck, if you were using a Sandy Bridge you'd probably see similar performance numbers with ray tracing on.
 
I wonder who it was who criticized [H] for using an overclocked 7700K instead of a 9900K? That criticism could have been countered with simple logic: ray tracing is an entirely GPU-bound effect, and enabling it is like pushing resolution to the max. It takes load AWAY from the CPU and evens out the differences between them; the GPU becomes the bottleneck. Heck, if you were using a Sandy Bridge you'd probably see similar performance numbers with ray tracing on.

BFV's system requirements list a 6-core processor for having RTX enabled. That's why.
 
BFV's system requirements list a 6-core processor for having RTX enabled. That's why.

I know that. But a 4-core, 8-thread CPU is roughly equivalent to 6 real cores / 6 threads, clock for clock. And anyone even remotely interested in PC gaming should know that the minimum and recommended boxes are not based on anything real, but on specs developers pull out of their asses. So again, who was criticizing Kyle over the original test? Certainly not anyone from this forum, where everyone should know at least something about hardware?
 
I know that. But a 4-core, 8-thread CPU is roughly equivalent to 6 real cores / 6 threads, clock for clock. And anyone even remotely interested in PC gaming should know that the minimum and recommended boxes are not based on anything real, but on specs developers pull out of their asses. So again, who was criticizing Kyle over the original test? Certainly not anyone from this forum, where everyone should know at least something about hardware?
Data is always good.
 
True. At least it gives you some hard numbers that tell whiners to STFU. :)

I was probably the first, and it was more of an observation than criticism. I get much higher FPS, ~75 FPS at 3440x1440 ultra/DXR low, where Brent got 66 at 2560x1440. The first observation was that I have 8C/16T and he was at 4C/8T. Others mentioned similar.

I remembered DICE mentioning in an interview that they were implementing ray tracing for 12 threads. It's also on the suggested spec list. So I just noted that as a possible reason. I agreed with Kyle that a high-clocked 7700K was most likely fine, if I had to guess.

I think the discrepancy probably comes from a combination of my card being clocked higher / hard modded and Brent finding intense areas to test RT.

Also, my CPU usage appears higher in DX12 than DX11. We also didn't know how RT presents the data; it could have been completely different from a typical game.

Anyway, it's great [H] did this testing, since I couldn't find any CPU comparisons for RT. Again, the main reason I give on Patreon is that they dig into things like this.
 
True. At least it gives you some hard numbers that tell whiners to STFU. :)
The fact is that while we were 99% sure we were on solid ground, we did not KNOW. Now we know. Good for everyone involved.
 