Battlefield V NVIDIA Ray Tracing: i9-9900K CPU Testing @ [H]

FrgMstr, Staff member. Joined May 18, 1997. Messages: 55,532.
We delve a bit deeper into Battlefield V testing and what all impacts your ability to play multiplayer with your brand new NVIDIA RTX video card with all the shiny Ray Tracing bits turned on. In this fifth installment, we head back to take a quick look at just how CPU width, in terms of cores and threads, impacts your gameplay. Clock is still king at the high end.

If you like our content, please support HardOCP on Patreon.
 
Thanks, now it seems clear that more threads don't help with DX12/DXR. But what happened to CPU/GPU load on your 9900K? Was it lower than on the 7700K, with some threads simply going unused? Was GPU load with DXR on always around 100%? Do the minimum FPS occur with the GPU fully loaded and the CPU not? Does DXR load the CPU more than plain DX12? The minimum FPS numbers don't seem consistent here; 1% low FPS might be the better metric.
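For reference, the "1% low" metric suggested here is straightforward to compute from a frametime log. A minimal sketch, averaging the slowest 1% of frames (note that some capture tools instead report the 99th-percentile frametime, so check your tool's definition before comparing numbers):

```python
def one_percent_low(frametimes_ms):
    """1% low FPS: average FPS over the slowest 1% of frames.

    Some capture tools define 1% low as the 99th-percentile frametime
    instead; the two definitions give slightly different numbers.
    """
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Example: 99 smooth ~60 FPS frames plus one 33.3 ms hitch.
sample = [16.7] * 99 + [33.3]
print(round(one_percent_low(sample), 1))   # the single worst frame dominates: 30.0
```

Unlike a raw minimum, this averages over a window of bad frames, so a single outlier spike does not define the whole run.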
 
All the stutters and game pauses that we encountered with the 7700K were experienced on the 9700K and 9900K. The actual gameplay experience was identical.

CPU loads were slightly lower on the 9700K and about 20% lower on the 9900K.
 
It's strange that your results show some DX11 min/avg FPS scaling with more threads, but no DX12/DXR scaling, even though DX12 is supposed to be the API that better utilizes cores/threads in general. Maybe the BFV engine's DX12 code is so poor that it is the root cause of all the problems? Maybe more patches are on the way.
 
So DX12 needs some overhauling?
So still gpu limited. Maybe sli/cfx would be cpu limited?
In short don't doubt the [H] team! :)
 
Lol, "chuck it to the trash, RTX is a lie". I love it.

I would be interested to see how the numbers look at stock clocks, seeing as that is what the specs on the box promise.

This is exactly the kind of stuff that Lisa Su was talking about at CES: having the hardware there without a proper software package is a disaster.

I wonder about the ray tracing tech: is it an issue with the software, the hardware just not being powerful enough, or both?


Thanks for the write-up!
 
Thanks Kyle and Brent for the insane amount of testing you've done with this title.

It was nice to see DX12's problems documented further. I've witnessed them on my rigs in this game and others, and even after this I don't know what technical cause to suggest.

Bottom line: anyone claiming success with DX12 should remember that it working for them doesn't mean it will work for everyone, and I believe many devs already display warnings to that effect when you switch to DX12 in games.
 
Interesting results; would be neat to see some Ryzen RTX benches.
 
Now that the CPU cores/threads debate can be buried, we can focus solely on the new RTX. A $1,250+ GPU provides 30 FPS minimum and about 60 FPS average with DXR ultra settings. That is one hell of a steep price of admission to get what is widely regarded as the bare minimum of acceptable performance...and at only 1080p. How pathetic.

Maybe the future RTX 3xxx series will give a 60-80% gain, so 1440p can be barely playable? :p
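As a back-of-envelope sanity check on that 60-80% figure, assuming performance scales roughly linearly with pixel count (only approximately true in practice):

```python
pixels_1080p = 1920 * 1080        # 2,073,600 pixels
pixels_1440p = 2560 * 1440        # 3,686,400 pixels

scale = pixels_1440p / pixels_1080p   # ~1.78x the pixels
# Under a crude linear-scaling assumption, holding the same ~60 FPS
# average at 1440p would need roughly 78% more GPU throughput, which
# is why a 60-80% generational gain is about the right ballpark.
print(round((scale - 1) * 100))   # 78 (% more pixels to push)
```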
 
Another great review thank you! I do have to say that after testing a lot of different graphics options I ran into a lot of stuttering, but after clearing the BFV's shader cache in the documents folder it seemed to have fixed it and also stopped the constant pop-in I was getting. Also it's worth comparing DX11 FFR on to DX12 FFR off as FFR on doesn't increase DX12 fps but does add a very slight input delay. If there's ever a Ryzen version done it would be great to see the scaling from 4 threads to 16.
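For anyone wanting to script the shader-cache fix described above, a minimal sketch. The `Documents\Battlefield V\cache` location is an assumption based on this post, so verify the path on your own system before deleting anything:

```python
import shutil
from pathlib import Path

def clear_shader_cache(cache_dir):
    """Delete the shader cache directory if present.

    BFV rebuilds the cache on next launch, which this poster reports
    cured the stuttering and texture pop-in. Returns True if anything
    was removed.
    """
    cache = Path(cache_dir)
    if cache.is_dir():
        shutil.rmtree(cache)
        return True
    return False

# Assumed default location (verify before running):
# clear_shader_cache(Path.home() / "Documents" / "Battlefield V" / "cache")
```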
 
Hahahaha, 30 FPS at 1080p with RTX and ultra settings, lol. Looks like the dream of 4K is SHOT IN THE ASS. How does that $1,200 price tag feel now, guys? The cards are a JOKE! No wonder NVIDIA stock is down 45%. I own a 1080 Ti, but I will be ditching it for the new AMD card. Who gives a shit about ray tracing right now?
 
I won't be happy until you run this test on a 10900K....!

This is why I work in a field where I don't face a customer. People go out of their way with the utmost effort to complain and whine rather than be part of the solution.
 
So, in a nutshell, there are some fanboys with 9900K rigs who wanted to see how much worse, cough, better, cough, their CPU performs compared to an older CPU such as the prehistoric 7700K, and discovered through [H]'s testing that they upgraded too soon and spent money needlessly... I enjoy the comparison, but some guys need to man up and increase or start funding via Patreon if they want more of these comparisons in the future!
 

Negative.

Some of us with higher core count processors get higher fps and were wondering if that was the reason why. For instance, I get 75fps at 3440x1440 ultra/low where Brent got ~66fps at 2560x1440 ultra/low. So I had 16% higher fps at 35% more pixels. I also don’t notice any stuttering.

The results aren’t so different they are bogus, but it is a decent amount, it’s awesome [H] did this testing and we got to see how minimums improved with the 9900k. It’s great they investigated at all. A big part of why I am a Patreon.

The overall gist I noticed on the forum is that if you have a high-Hz monitor, high Hz >> ray tracing. DLSS is supposed to come to BFV and cancel out the FPS drop from ray tracing *according to NVIDIA*, so take that as you want.
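The pixel/FPS arithmetic in this post can be checked directly (a sketch; the quoted figures are rounded, so the ratios come out slightly under the poster's 16%/35%):

```python
ultrawide = 3440 * 1440            # 4,953,600 pixels
qhd       = 2560 * 1440            # 3,686,400 pixels

more_pixels = ultrawide / qhd - 1  # ~0.344 -> ~34% more pixels
more_fps    = 75 / 66 - 1          # ~0.136 -> ~14% higher FPS
print(round(more_pixels * 100), round(more_fps * 100))   # 34 14
```

So the poster is rendering roughly a third more pixels while still posting higher FPS than the review rig, which is the apparent discrepancy being discussed.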
 
Haven't played this game, and almost everyone complains it stutters a lot despite having 8-core CPUs. Seems to happen mostly in multiplayer from what I read, not single player.
I'm playing this game on a 4.3GHz 4770K (I lost the silicon lottery on this chip badly) and I don't have any issues with stuttering in multiplayer, fwiw. 4C/8T, playing with a Titan Xp (2016 model) at 2076MHz. I play at 1440p Ultra and it's usually around 90-100 fps on most maps.

DirectX 12 has never performed well in Frostbite. It's kind of unclear why; DICE clearly has talented software engineers, so you would think the DX12 implementation would be solid, especially given it's required for some of the flagship visual effects.
 
To be clear on this also. We have done MULTIPLE clean installs of the OS, BFV, chipset drivers, GPU drivers, etc. We find the issues over and over again.

I did not mind the testing we did; if anything it was good for everyone.

The bottom line is that NVIDIA has royally screwed this launch up going with this game/dev to show off ray tracing.

Keep in mind I have now purchased about $5,000 worth of RTX cards for testing. I run an RTX card in my own system. These are great cards for gaming, albeit overpriced. However, I think NVIDIA got the cart way out in front of the horse with ray tracing, while hanging their hat on it as the feature we are paying for.
 
This is why I patreon.

Are there any Vulkan games that are doing ray tracing yet? Would be interesting to see the results in a different API.

I'm still happy with my upgrade to a 2080 Ti coming from a Radeon 7950, but I'm interested to see [H]'s review of the Radeon VII. I max out my 2080 Ti's VRAM running in VR and 4K, so it'll be interesting to see if the Radeon VII's increased memory gives it any advantages and helps narrow the gap at higher resolutions and in VR.
 
Excellent review, and like many others I can't believe the amount of time you have spent benchmarking this title.
 
Meh, sure the launch was a bit rushed and it's not quite fully ready for widespread implementation, especially in multiplayer, but Ray Tracing had to start somewhere, and frankly I don't think many people ever expected it to arrive as soon as it did. Ray Tracing isn't going to go away and will likely be fully functional on most systems possibly a lot sooner than it would have if Nvidia hadn't rushed things. The good thing is it looks like Nvidia is offering a really solid product in the 2060 at a very fair price. AMD is doing the same with their new releases. Thankfully we still have some solid choices and the competition between the two is healthy which is good for the consumer.
 
Not DX12, but DICE's ability to use it properly, or better yet, code it properly. They have had issues with DX12 since day one, yet there are many devs/games out there that have no issues with DX12. It really seems as though DX12 is an afterthought in both BF1 and BFV. Then you throw in Ray Tracing and it gets worse.
 
I see people saying this a lot. Please bear in mind this is *not* full scene ray tracing. The engine is using rasterization for almost everything and ray tracing only some elements and lighting effects.

We are still many years away from full-scene ray tracing replacing rasterizing as the primary technology driving visuals in games.
 
2060 is not a fair price
 
So the 9900K was pretty much within the margin of error. I wonder if people OC'd their cards, which is why they got a higher number?
 
For your best BFV multiplayer experience, chunk NVIDIA Ray Tracing in the trash bin, and run DX11. The RTX is a lie.
Oof, That's gonna hurt some feelz from those who grabbed an i9-9900k and an rtx card. lolz
 
With the Windows settings tweaked for gaming you won't get any stutter at all. With the right tweaks to Windows it's possible to gain almost twice your fps.
I went from 130-140 fps to 230-240 fps with a few Windows/regedit tweaks and a config. Same graphics settings but gained 100 fps and stutter was completely gone.
 
Could you point us in the direction of these tweaks? I am interested in looking into them, or in seeing if there are any other tweaks I haven't already done. I am sure there are others who would be interested in them as well.

Thanks!
 
Please share with the rest of us.
 

Yeah, I am fully aware of that. This is exactly what I expected years ago when I had a discussion with some gaming friends about ray tracing: that it would basically start out partially implemented and go from there, kind of similar to how Wolfenstein 3D and Doom started out not being true/full 3D. The fact that we have something already is awesome, sort of a silver lining from the death of Moore's Law. They can't get much more raw FPS performance anymore, so now they are upping the quality. This is a great thing for gaming.
 
IMO, people who judge DXR from the DICE implementation are very short-sighted.
I do not think short-sighted, it is just BFV is the only thing we have to actually judge RTX Ray Tracing by, four months after launch. Jensen told us over and over, "it just works." Does it? It does not seem that way to us. When you hang an entire card launch on one feature, and that feature cannot be used in the context it was shown, that makes it a failure in those regards. Jensen getting up on that stage and telling the audience that the RTX 2060 would allow you to play BFV at 1440p was a lie. That sort of messaging and marketing we have always stood up against, whether it be ATI, AMD, Intel, etc. etc. etc.
 
Kyle, it's true that we don't have anything else to judge the RTX line with besides BFV, but I think we're going to see better implementation in the future. That doesn't mean I forgive Jensen for overpromising and underdelivering, but I am willing to withhold final judgement until I see other engines try DXR.
 
Yep, DICE doesn't do DX12 very well. I suspect NV went with them because they were the first dev that would at least try DXR within a game. I like how BFV looks, but it's pretty crappy for what it is at this moment. I suspect we'll see much better implementation in the future from better DX12 game developers.


IMO, people who judge DXR from the DICE implementation are very short-sighted.

P.S. How bout that great bang for the buck VII card?


I disagree. Their lead programmer is a fucking whiz, a total genius. If anyone can pull ray tracing off, it's him. Fact of the matter is that nVidia didn't have strong enough hardware to pull this off. Wouldn't have been that big of a deal, but they tacked on a huge price premium to help pay for the R&D across the entire 20x0 line. They could have advertised the feature, sucked it up for a year with volume sales, and put said volume sales profits into further R&D.
Instead, they decided to pass along the R&D costs to their gaming customers (which generally has instead been passed along to their workstation customers). This is plainly obvious starting with GPP and the eventual current NDA, culminating in opening pre-orders for a Star Wars tech demo that started months before "leaks" and reviews ever made it to potential gaming customers. nVidia knew they had a dud on their hands from the very beginning.
Fact is, nVidia offered a knee cushion, pulled out the GoPro, and everyone started pulling down the zipper. Congratulations! If you bought an RTX for ray tracing, you're part of the largest Casting Couch audition of all time.
 

Fine, we can agree to disagree. I didn't buy the RTX for the RT capability. I bought it because I wanted better performance with my 21:9 display. DXR is still up in the air as far as I'm concerned. When DICE gets their DX12 programming straight I'll try it out again.
 
Kyle, it's true that we don't have anything else to judge the RTX line with besides BFV, but I think we're going to see better implementation in the future. That doesn't mean I forgive Jensen for overpromising and underdelivering, but I am willing to withhold final judgement until I see other engines try DXR.
Well that is where we differ. I evaluate hardware on what it delivers today, not tomorrow. Hopefully our evaluation will change in the very near future.
 