Battlefield V NVIDIA Ray Tracing RTX 2060 Performance @ [H]

Good review. As long as Vega 64s are available in the wild at the $400 price point, this is a hard sell at $389 as far as I'm concerned. A properly tuned Vega 64 smokes this card.
 
Well, I don't want to sound like a dick, but you obviously haven't played many games in 4K at maximum settings. As for DX12, I'm not sure if it's a bad implementation or not, but it's beneficial for a game to use more VRAM (again, I point to RAGE as an example of that; I personally loved the game, but its streaming just didn't work, and it would have been better to use more VRAM instead). The game then has to load fewer assets from your SSD/HDD on the fly, improving performance and providing smoother FPS. In an ideal world you'd want everything preloaded into RAM/VRAM for the best possible performance. Granted, devs can also work on better and more efficient texture streaming and whatnot so their product fares well on lower-end systems, but for AAA games I'd personally rather have them spend more time on visuals and game content than on trying to get it to run on a wide variety of systems. That sort of thing costs time and money, so if they are making a top-tier game, I want them to focus on what's important to me, which is why I build an above-average system.
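To put that streaming tradeoff in concrete terms, here is a minimal Python sketch: assets that fit in the memory budget are preloaded and cost nothing mid-frame, anything outside it pays a disk-read stall. The class, asset sizes, and timings are all made up purely for illustration.

```python
# Minimal sketch: a larger memory budget keeps more assets resident,
# so fewer mid-frame disk reads (hitches). All names, sizes, and timings
# are hypothetical and only illustrate the tradeoff.
import time

class AssetCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = {}                       # asset name -> size (MB)

    def preload(self, assets):
        """Fill the budget up front, largest assets first."""
        used = 0
        for name, size_mb in sorted(assets.items(), key=lambda kv: -kv[1]):
            if used + size_mb <= self.budget_mb:
                self.resident[name] = size_mb
                used += size_mb

    def fetch(self, name):
        """Resident assets are free; anything else simulates a disk stall."""
        if name in self.resident:
            return 0.0                           # ms lost this frame
        time.sleep(0.010)                        # stand-in for an SSD/HDD read
        return 10.0                              # ms of hitch

assets = {"terrain": 900, "buildings": 700, "characters": 500, "effects": 300}
small, large = AssetCache(1500), AssetCache(2500)
small.preload(assets)
large.preload(assets)
for cache, label in [(small, "1.5GB budget"), (large, "2.5GB budget")]:
    stall_ms = sum(cache.fetch(name) for name in assets)
    print(f"{label}: ~{stall_ms:.0f} ms of streaming stalls for this scene")
```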

That is nice and all, but most will not be using this card for 4K max settings. Even at 1440p, it does just fine:
https://m.hardocp.com/article/2019/01/30/msi_geforce_rtx_2060_gaming_z_video_card_review

If the GPU runs out of VRAM, it will use system RAM first, at least in these titles.
 
Even the good old AMD R9 390 has 8GB, and Nvidia is still conning their customers with 6GB. Sad times.
 

That's right, probably not on a 2060, but I feel the fact that they kept VRAM at the same size as a previous generation that is several years old is rather insulting to gamers. It will also severely limit the useful life of this GPU line, as game devs have gotten used to vast amounts of VRAM being available on most mid-range and higher GPUs. That was the sad case with the 700 series, where lack of VRAM was essentially the thing that killed it prematurely (using system RAM is unacceptable and really doesn't help; sure, it will let the game run, but it will be unplayable; that's how it was in Rise of the Tomb Raider, which wasn't so much GPU limited as VRAM limited back when I was on 780 GHz SLI, and throttling textures to reduce VRAM use let the game run very high settings at 1600p). The 900 series suffered from that as well, especially the mobile versions. The RTX 2060 can easily exceed its 6GB of VRAM if RTX features are used; without them you're probably fine at 1080p, though at 1080p without RTX you'd be just fine with a previous-generation card that costs quite a bit less. I just don't feel you're right to defend NV on this one; to me it feels like deliberate gimpage due to the absence of proper competition from AMD.
 

Even 1440p without the RTX stuff will likely be fine for a few years. Beyond that, more VRAM would also mean more cost. An additional 2GB of GDDR6 would cost Nvidia around $23, which means the price of the card would go up another $30-$50. It's already too expensive; another $30+ would be insane.
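Rough math on where a $30-$50 retail bump could come from, sketched in Python; the GDDR6 spot price and the markup multipliers are assumptions, not figures from Nvidia or board partners.

```python
# Back-of-envelope: extra BOM cost for 2GB of GDDR6, passed through an
# assumed margin stack (memory vendor -> board partner -> retail).
# All numbers are illustrative assumptions.
gddr6_price_per_gb = 11.50                     # assumed $/GB spot price
extra_gb = 2
bom_increase = gddr6_price_per_gb * extra_gb   # ~$23 of extra memory

for markup in (1.3, 1.7, 2.2):                 # assumed total markup on added cost
    print(f"{markup:.1f}x markup -> retail price up ~${bom_increase * markup:.0f}")
```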
 
Really, it makes no sense for the RTX 2060 to be 256-bit with GDDR6. Perhaps they will come out with a 12GB version to shut everyone up.


As a user in another thread put it, maybe it is similar to how system RAM has become stagnant. The average user has had 8GB of system RAM forever, whereas it was rapidly increasing every generation before DDR3 became popular. At some point, VRAM usage had to slow down. Christ, how much information can a single frame use?
 
VRAM usage is a combination of the number of objects (which keeps increasing over time), textures/geometry/shaders, and compute per frame. HDR adds more VRAM needs, as does higher-end AA, and RTX culls fewer objects when ray-traced reflections are used, adding even more. With more VRAM you can preload more of what the game has and avoid hitches or momentary stalls as different assets are needed from location to location. Yes, with reduced settings the 2060 will probably be OK for a while, but it may give you fewer options as time goes on. At this price point there is no way I would recommend a 2060 unless it was only a short-term card.
 

I guess it always confuses me why these enhanced textures/HDR/RTX assets typically increase VRAM by a set amount and don't scale up with resolution. I.e., if the higher-quality textures add 1GB of VRAM consumption at 1080p, why do they add only 1GB (or slightly more) at 4K and not 4GB (4x the resolution)? Doesn't a game have 4x the information to process at 4K than it does at 1080p, all things being equal?
 
Textures/shaders/geometry usually stay pretty much the same in memory requirements. What goes up is the buffers used to render and display the current and next frame (or more); plus, at higher resolutions you are going to render more pixels per triangle, which is the main reason for the frame-rate drop. Add in any kind of multisampling or supersampling and the buffers can increase dramatically in size again. In Shadow of the Tomb Raider, using HDR versus SDR at 1440p adds a little over 0.5GB more VRAM used, since you are now dealing with increased color data, a 10-bit versus 8-bit display path, back buffers, etc.; at 4K that increases to about 1GB more. As a side note, I have not seen tests done with the 2080 using RTX and HDR at the same time; it may get ugly pushing that 8GB of VRAM.
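A quick back-of-envelope in Python makes that scaling concrete. The pixel formats and the ~15-target count are assumptions about a typical deferred renderer, not measurements from any particular game.

```python
# Why buffer memory grows with resolution while texture memory stays flat:
# per-target bytes = width * height * bytes_per_pixel, and a deferred
# renderer keeps many full-screen targets alive at once (G-buffer layers,
# depth, HDR lighting, post-processing chains). Counts/formats are assumed.

def target_mb(w, h, bytes_per_pixel):
    return w * h * bytes_per_pixel / (1024 ** 2)

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
assumed_targets = 15                       # rough count of full-screen targets

for name, (w, h) in resolutions.items():
    sdr = target_mb(w, h, 4)               # e.g. RGBA8/RGB10A2, 4 bytes per pixel
    hdr = target_mb(w, h, 8)               # e.g. RGBA16F, 8 bytes per pixel
    print(f"{name}: one 4-byte target ~{sdr:.0f} MB, one 8-byte target ~{hdr:.0f} MB, "
          f"{assumed_targets} x 8-byte targets ~{assumed_targets * hdr / 1024:.2f} GB")
```

Texture mip chains, on the other hand, are the same size no matter what resolution you output at, which is why a high-res texture pack adds a roughly fixed amount of VRAM instead of scaling 4x from 1080p to 4K.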
 
It will be interesting to see if the new patch with DLSS that comes out tomorrow brings the performance up to the levels that were hinted at. Are you going to try it out, Kyle, to see if there are performance gains and what the quality impact of running it is?
 
Nope, never going to turn it on.
 
Seeing as my 2080 Ti is borderline playable with DXR (1080p UW), it's no surprise the 2060 is a no-go.

I'd like to see what other developers do; maybe BFV is not the best game for this technology.

Maybe Nvidia shouldn't have made BFV their flagship title featuring the technology then, or, you know... waited till they had hardware that could pull it off.

You bring up a good point though; I probably would have wanted to feature the tech on a game where raw FPS is not so important instead.
 

Hardware pulls it off just fine in Metro. BFV’s DX12 implementation is terrible so it starts off on a bad foot before you even turn RTX on. And yeah, I agree this game is not the best application for it.

I do kind of agree with the hardware statement... but more in that the low-end (~$200) segment should be able to do RT. As long as devs have to implement both the old way and the new way, we won't see really good implementation/adoption.
 
Metro is a much better example. Not sure about the 2060, but it runs great on a 2080 Ti with everything maxed out, including RT and DLSS.

And the graphics in Metro are clearly better with RT; it's a much better implementation.
 
I don't agree with you at all about it running fine, but then again I never clarified how I define that.

For the 2080 Ti, RTX should be able to hit 4K@60fps or 1440p@120fps.

For the 2060, the performance target should be 1080p@60fps.

Less than that is a step backward from previous-gen card performance without the features; if you can't even maintain the performance of the previous gen with the features on, you colossally fucked up, imo.
 

It can do that in Metro...
 
So as expected, Metro plays fantastically with RTX, even more so if you enable DLSS. Don't you think your initial assessment of calling RTX trash is proving to be wrong now?
I would suggest our assessment of RTX perf in BFV was dead on. And I would suggest that NV is still batting about .333 so far.

Edit: Spelling. Darn phone...
 
Sorry, I didn't get that metaphor; English is my second language.

It's a baseball batting average ('batting .333'), which applied directly means that Nvidia has been about 33% successful with its RTX implementations so far.

The problem wasn't that you assessed BFV alone; your conclusion was so broad it covered the whole concept of RTX itself, and you called it trash and a hoax.

While I might disagree with the tone of the conclusion, Kyle was factually right, and there was no hard evidence available at the time to back up Nvidia's bold promises with RTX. Metro really is the first we've seen of an effective implementation (and we don't know if BF:V will ever get there).

Odds are that we'll see more good implementations in the future, and probably more not so good implementations too.
 
So as expected, Metro plays fantastically with RTX, even more so if you enable DLSS. Don't you think your initial assessment of calling RTX trash is proving to be wrong now?

No, their conclusions were spot on. Global illumination is the most basic form of ray tracing there is, and that's what's used in Metro, whereas reflections are much harder to implement and need a lot more power. So each game is only using one aspect of ray tracing.

Metro Exodus - ray-traced global illumination
Battlefield V - ray-traced reflections

To get reflections working at all, they had to reduce the number of rays and limit the number of objects the rays reflected off (rough numbers on that below).

One game using the most basic implementation of ray tracing doesn't change HardOCP's conclusions.
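Rough ray-budget arithmetic in Python showing why those cuts matter; the rays-per-pixel and screen-coverage figures are assumptions for illustration, not DICE's actual settings.

```python
# Rays per frame = pixels * rays_per_pixel * fraction of screen that is
# reflective. Cutting the last two factors is how a reflection pass gets
# cheap enough to run; all values below are assumptions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    for rays_per_pixel, coverage in [(1.0, 1.0), (0.5, 0.4)]:
        rays = pixels * rays_per_pixel * coverage
        print(f"{name}: {rays_per_pixel} rays/px over {coverage:.0%} of the screen "
              f"-> {rays / 1e6:.2f}M rays/frame, {rays * 60 / 1e9:.2f}B rays/s at 60fps")
```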
 
The problem wasn't that you assessed BFV alone; your conclusion was so broad it covered the whole concept of RTX itself, and you called it trash and a hoax.
Well, given that there was only one game with RTX support for about 6 months....

And our conclusions were about that one game. That said, yes, Jensen lied to us, showing some of these cars being "used" on the BF5 map. There is no way to use those cards on that multiplayer map....when there are other players on the map.

So if you think we are going to start barking about how great RTX ray tracing is, you are going to be disappointed.
 
No, GI is much harder when you factor in the number of bounces that need to be done.

I play that map just fine and achieve 60fps in those heavy-reflection areas at 1440p max settings, on a 2080 Ti.

No need to bark, just state it as it is, and don't rush to premature conclusions. Remember hardware T&L?
I am glad you have a much different experience than we do on multiplayer maps.
 