Battlefield V NVIDIA Ray Tracing RTX 2060 Performance @ [H]

FrgMstr

Battlefield V NVIDIA Ray Tracing RTX 2060 Performance

The new NVIDIA GeForce RTX 2060 GPU is the least expensive GPU to support NVIDIA Ray Tracing at $349 MSRP. What kind of real-world ray traced performance can it deliver in Battlefield V multiplayer with DXR turned on? We take a very fast MSI GeForce RTX 2060 GAMING Z 6GB video card and find out. Will it just work?

If you like our content, please support HardOCP on Patreon.
 
Nice review! Totally agree. I've been saying it since the day they announced it: $350 for 6GB of RAM is a tragedy. Then I had people troll me about how it won't matter for this card. I guess only Nvidia can sell a card at $350 with 6GB of RAM. If it was AMD, the internet would be on fire lol.
 
The one thing that struck me during Jensen's CES presentation was the phrase "if DLSS works", like he didn't even believe what he was peddling. Seems the 2060 is absolutely no surprise.
 
I've been telling people who are considering this card to just buy a used 1070 with 8G...
 
What a conundrum.
I'm a 1080p gamer currently with a GTX 970. My HTPC has a 660ti though, and needs an upgrade. The 970 will go to the HTPC.

I'm stuck. I'm against buying previous-gen tech, so I really don't want to buy a 1070. But this review makes me think a 2060 might be a bit of a waste, paying for ray tracing tech I'll never be able to use.

What's a guy to do? Is waiting for AMD's offering maybe, FINALLY, a good plan?
 
Let's stay on topic please.
 
The Bottom Line really does sum up the RTX 2060 and even the RTX 2070. What is the point of buying a card for a feature that it can't even make usable?

Nvidia really should have found a better game to show off RTX, like a story-driven adventure game where things could be slower paced and where you would stop to take in the scenes.
A fast-paced shooter seems pointless for this feature.

Looking forward to the full review of this MSI card.
 
I bought the RTX 2060 knowing I won't be able to use ray tracing; I just bought it for performance really. Ideally I'd prefer to buy a 1070 Ti, but they're more expensive new and I couldn't find a good used one either.

That being said, GPU choices suck and I'll probably be upgrading from the card as soon as AMD gets back in the game.
 
Just curious, between the CPU testing and all four GPUs, roughly how many hours went into all these reviews combined?

[H]ard seriously went the extra mile and then some with this testing!
 
This was about as in depth of a review as you can get. Holy crap.

Really, this card should have been called a GTX 2060. The RTX 2060 needs to ship with 12 GB of VRAM :p

When not running DX12 in BFV, this card actually does fine with 6 GB of VRAM. But yeah, selling it as an RTX card - complete B.S.
 
God dammit, I was drinking my pop when I was reading the 3rd page and almost killed myself choking when I got to the graph. I didn't think it would end up being that bad... oops.

Ah well, spot-on conclusion. I was saying, what is the point of RTX, especially on the 2060. I get annoyed reading all the 2060 reviews; in almost all of them, one of the positives listed for the 2060 is that it supports RTX... too bad that doesn't mean it's playable.
 
Seeing as my 2080 Ti is borderline playable with DXR (1080p UW), it's no surprise the 2060 is a no-go.

I'd like to see what other developers do, maybe BFV is not the best game for this technology.
 
That DX12 hit, just to begin the discussion, is brutal, and really sets the table.

I bet Diablo 3 would look great with raytracing...
 
Nvidia also promised DLSS with these RTX cards. Being able to get "free" AA would be a nice benefit for the 2060, but DLSS appears to be all smoke. Nvidia is still promising DLSS on their web page as well. Yet where is it? Final Fantasy 15, and only at 4K? Nobody is going to be buying a 2060 to play at 4K.

Hell, pre-RTX launch Nvidia promised DLSS for like ten games. And they keep adding more to their list of promises. So what's the deal? How long does it take to add DLSS? It's been 4 months since they promised ten games and so far all we've got is FF15, and only at 4K. What a joke.
 
Exposing the truth makes the lie stick out for all to see. All the demonstrations Jensen has done with RTX have been very misleading; demoing an MP game on an MP map with no players? lol

I am just utterly confounded by the DX12 performance handicap, WTH? I can see why BFV ended up being the first use of RTX: DICE are fantastic programmers, the game looks awesome and runs extremely well, unless you turn on RTX or DX12 on Nvidia hardware. AMD hardware seems OK with DX12, so I'm not sure if it's a game issue with Nvidia under DX12 or an Nvidia issue.

The 2060's 6GB of memory on a $350+ card is worse for its time than the Fury's 4GB was in June of 2015. I would not recommend any card over $250 having less than 6GB of RAM. Even 8GB will limit developers from pushing high-IQ textures at 4K. With 5K monitors and beyond coming, and some out now, the whole RTX lineup falls short on longevity.

As for DLSS, rendering at a lower resolution and then upscaling for a performance increase is nothing new; the real question is whether it gives better IQ or results than the methods already available now (rough numbers on the upscaling math in the sketch below). I can game at 4K while rendering at a lower resolution with any game on AMD or Nvidia drivers currently. The lack of DLSS actually being used speaks loudly on how unready the launch was to begin with, and suggests it will not deliver, just like RTX. More smoke indeed.

Let's see how the RTX line of low-memory cards stacks up against the 16GB Radeon at 4K, DX12, Vulkan and so on, including BFV non-DXR with HDR. A well-done HDR game will add to a game's memory footprint, plus it will push color compression, making it less effective and more memory intensive, which is where bandwidth will really come into play.
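To put some rough numbers on the render-lower-then-upscale idea, here's a minimal Python sketch; the render scales are illustrative assumptions only, not anything DLSS actually uses internally:

```python
# Rough pixel-count math for render-at-lower-res-then-upscale (illustrative only).
# Shading fewer pixels per frame is where the performance win comes from;
# whether the upscaled image quality holds up is the separate question.

NATIVE = (3840, 2160)  # 4K output target

def pixels(res):
    w, h = res
    return w * h

# Hypothetical render scales for comparison.
for scale in (1.00, 0.75, 0.50):
    w = int(NATIVE[0] * scale)
    h = int(NATIVE[1] * scale)
    saved = 1 - pixels((w, h)) / pixels(NATIVE)
    print(f"render {w}x{h} ({scale:.0%} scale): "
          f"{saved:.0%} fewer pixels shaded than native 4K")
```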
 
Wow, even at the lowest DXR setting it uses more memory than the card has. WTH, IMO that's false advertising.


"It just doesn’t work, and Jensen should be ashamed for telling us all that it does".
DAMN, [H] went Hard. Rightfully so. This card is a waste of money.
 

Granted, we've only seen NV Ray Tracing performance on it in one game; to be clear, I think it is a waste of money specifically for NV Ray Tracing in BFV. We've yet to see how DLSS could impact it, or other games.

We've also got to see how it performs in games in general before concluding on the value.
 
At this point I think we have to take a step back and really ask ourselves, "Did Nvidia just attempt to pull a 'mining' craze move on us gamers?"

Look at some of the comparisons that can be made.
a) The top card is $1200. The 1080 Ti during the peak mining craze was selling for, what, $1500 on average.
b) Pushing the key phrase RTX (ray tracing) as a platform-defining technology. 1080 Tis were unapologetically sold as mining-platform cards because they were selling faster than the industry could create proprietary alternatives.
c) Jensen stood on stage and sold it as a functional technology for gaming. How many times has Nvidia sold the public on technology that never got adopted or embraced?

I am glad that one of the closing points in this article is a hard stare at this generation's "value" as a card. With RTX and DLSS as this generation's defining technologies, I just cannot disagree that the value falls on its face. We should all just laugh, shake our heads, and walk away. The only reason this card is a 6GB model is that Nvidia did not care about this line enough to elevate it above what they could get away with selling it for.

I imagine the thought process when the 2060 discussion was brought up went like, "Hey, when the customer is standing there, oblivious to how much memory they need, they will just look at the model number and say, well, this one has a higher number and cool new features. I think I will buy that for an additional $100." /smfh
 
Presumably you have made nVidia aware of your results; very interested to hear what they say and the spin they may put on it.

Great article btw. Every other site I've read manages much better results, close to the nVidia party line. Wonder why that is?

Yours is the most honest and trustworthy site for my buying decisions though.
 
How is DX12 taking up more memory with RTX disabled?

I don't have BFV, but I'm replaying Deus Ex: Mankind Divided and found the game more stable with consistent FPS playing DX11 over DX12, most likely because of memory issues with DX12 too.

ANTHEM comes out next month and I hope I can enable some of the RTX features with my 2060, but I'll take high FPS over RTX any day.
 
DLSS looks like shit compared to "real AA".
DLSS is slightly faster than traditional rendering, but it sometimes renders things wrong and has weird artifacts that can be really annoying.

To me, DLSS looks like a tech the RTX lineup shouldn't be using at all, and maybe a tech that the GTX 1060, 1050 Ti and such could make use of, as they're not capable of running real AA.
So I am very confused about DLSS; if it's an RTX feature and the mainstream cards won't be getting it, I don't see a future for it at all, unless they magically fix all of these artifacts.
 
Nvidia's memory compression doesn't work in DX12?
I don't see any major difference on amdgpu in many DX11/DX12 titles, and my Maxwell on DX12 never ever worked.
I don't have the BF games so I can't test them.
 
Hmm, makes sense as DX:MD was optimized for AMD.
 
What exactly happens to this game when DX12 is turned on? Does it enable some graphics feature which is computationally intensive? Or is this a game-based issue, or a result of just how badly Microsoft has coded DX12 in general? Another thing would be: do AMD cards suffer the same hit in this game with DX12 enabled?

It seems to me that if they could fix the 32% performance loss, as well as the VRAM usage of DX12 (could memory compression be bugged in some way, explaining the increased VRAM usage?), then RT could possibly be playable...
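A quick back-of-the-envelope on that 32% figure from the review, in Python; the 60 FPS DX11 baseline here is just a hypothetical example, not a measured number:

```python
# What a ~32% API-level hit does to frame rate, and how big a speedup is
# needed just to get back to the DX11 baseline -- before DXR takes its own cut.

dx11_fps = 60.0          # hypothetical DX11 baseline
dx12_hit = 0.32          # ~32% loss just enabling DX12 (per the review)

dx12_fps = dx11_fps * (1 - dx12_hit)
needed_speedup = 1 / (1 - dx12_hit)

print(f"DX11 baseline:   {dx11_fps:.0f} FPS")
print(f"After DX12 hit:  {dx12_fps:.1f} FPS")
print(f"Speedup needed just to recover: {needed_speedup:.2f}x")
```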
 
The mysteries of DX12 we've seen with [H]ard's testing in this series of reviews made me curious about the anomalies I've seen in other games. I've only got 2 others (SOTTR & ROTTR). I did some quick re-testing last night at 1440p with my 1080 Ti and noticed both used more VRAM, just like BFV! On average 1-2GB, and again at 1440p, so I imagine around double for 4K. So here the consistency is that DX12 just uses more VRAM in general, BFV or otherwise. It doesn't seem to matter whether it's RTX or other gens either. Performance, however, still seems to come down to the game's implementation. Kyle & Brent showed extensively how in BFV it was detrimental. For me in testing the TRs, SOTTR works best in DX12: I had min 50s, max 100s in the canned benchmark with DX12, and then min 30s, max 90s in DX11, while ROTTR was just a train wreck in DX12. (If you want to log VRAM the same way yourself, there's a quick sampling sketch below.)

I totally agree with all the fails of the RTX 2060. The RTX series at best is a mixed bag and should've stopped at the 2070, and even that is questionable for RT. This should've been a GTX, and if the rumors of the GTX version are true then that looks even worse. If DLSS had happened it might've given this card a shot, but it hasn't yet.
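For anyone wanting to eyeball VRAM during their own DX11 vs DX12 runs, here's a minimal sampling sketch using the NVML Python bindings; it assumes an NVIDIA card with pynvml installed, and is only a rough logger, not how [H] measured it:

```python
# Minimal VRAM sampler via NVML (pip install pynvml). Run it alongside the game,
# then compare the reported peaks between DX11 and DX12 sessions.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
total_mb = pynvml.nvmlDeviceGetMemoryInfo(handle).total / (1024 ** 2)

peak_mb = 0.0
try:
    while True:
        used_mb = pynvml.nvmlDeviceGetMemoryInfo(handle).used / (1024 ** 2)
        peak_mb = max(peak_mb, used_mb)
        print(f"VRAM used: {used_mb:,.0f} MB (peak {peak_mb:,.0f} MB)", end="\r")
        time.sleep(1.0)
except KeyboardInterrupt:
    print(f"\nPeak VRAM during run: {peak_mb:,.0f} of {total_mb:,.0f} MB total")
finally:
    pynvml.nvmlShutdown()
```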
 
Just wanted to point out that the GPU Memory Restriction setting seems to be doing something with my 2080 Ti. In single player it is keeping VRAM usage to right around 10GB, but when I turn it off it is using all 11GB constantly. I cannot tell what, if any, settings the game is altering to keep VRAM usage in check, but I'll be damned if I can tell any difference at 4K with DXR Ultra. I just got the game to test if anything funky is going on with the PG278Q, so I have yet to spend a whole lot of time with it (including diving into multiplayer).
 
Man, the graph at the bottom of page 6...

Even the 2070 can't hit 60 FPS with low DXR @ 1080p... Forget the 2060.

Makes you wonder why anyone would pay $350 for a hobbled card that is almost always memory bottlenecked.

As always, fantastic testing and insight from [H].
 
I think the conclusion is that DX12 is flat broken in BFV. Even without RTX, performance drops 30% and VRAM usage doubles for no apparent reason. I don't think RTX is necessarily the issue here; either DICE or NVIDIA have something completely wrong in DX12.
 
Agree with the statements in the article. Any "gaming" card in 2019 should have at least 8GB of memory regardless of price; having 6GB is a step backwards.
 
This was to be expected, looking at 2070 performance. This gen of RTX cards is not worth it in my opinion. Maybe the next gen/version will be better; the main draw (the RTX feature) is just too weak currently. I know a lot of people get excited because it's new technology, but the prices are higher, the price/performance ratio isn't as good as previous-gen releases, the RTX stuff isn't quite there yet, there are hardly any games with it, and the DLSS stuff has the same issue. Paying for a new card with not-quite-there-yet features isn't really that great of a prospect. Right now I just don't see much appeal in these first-gen RTX cards if you already have a 1070/1080, etc.
 
Wow that performance is truly atrocious. 32% hit just enabling DX12 without even turning on ray tracing? This card shouldn't exist let alone at $349.99.
 
I guess the question now is why would anyone buy this card at all?

Seems expensive for a 1080p card.

A
 
I guess the question now is why would anyone buy this card at all?

Seems expensive for a 1080p card.

A


I upgraded from a GTX 1060 to the RTX 2060 just for a performance increase, and yes, I did overpay for it. I never paid over $300 for a new card before, and I paid $380 for mine :(

Another thing that stings me about this upgrade is that all my previous upgrades since 1999 at least doubled the VRAM: from 16MB (TNT2) to 32MB (Radeon), 64MB to 256MB, 256MB to 512MB, 1GB to 3GB, 3GB to 6GB with my GTX 1060. This is my first upgrade where the VRAM size stayed the same.
 
Bummer. But if it does everything you need it to do then it's not all that bad.

Play on and enjoy what you got.

A
 