Battlefield V NVIDIA Ray Tracing RTX 2060 Performance @ [H]

Discussion in 'Video Cards' started by Kyle_Bennett, Jan 20, 2019.

  1. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    51,873
    Joined:
    May 18, 1997
    Battlefield V NVIDIA Ray Tracing RTX 2060 Performance

    The new NVIDIA GeForce RTX 2060 GPU is the least expensive GPU to support NVIDIA Ray Tracing at $349 MSRP. What kind of real-world ray traced performance can it deliver in Battlefield V multiplayer with DXR turned on? We take a very fast MSI GeForce RTX 2060 GAMING Z 6GB video card and find out. Will it just work?

    If you like our content, please support HardOCP on Patreon.
     
  2. NKD

    NKD [H]ardness Supreme

    Messages:
    7,313
    Joined:
    Aug 26, 2007
Nice review! Totally agree. I've been saying it since the day they announced it: at $350 with 6GB of RAM, it's a tragedy. Then I had people troll me about how it wouldn't matter for this card. I guess only Nvidia can sell a card for $350 with 6GB of RAM. If it was AMD the internet would be on fire lol.
     
  3. russnuck

    russnuck Gawd

    Messages:
    640
    Joined:
    Mar 25, 2005
The one thing that struck me during Jensen's CES presentation was the phrase "if DLSS works", like he didn't even believe what he was peddling. Seems that the 2060 is absolutely no surprise.
     
    Kyle_Bennett likes this.
  4. /dev/null

    /dev/null [H]ardForum Junkie

    Messages:
    13,677
    Joined:
    Mar 31, 2001
    I've been telling people who are considering this card to just buy a used 1070 with 8G...
     
  5. Modred189

    Modred189 I'm Smarter Than You

    Messages:
    14,297
    Joined:
    May 24, 2006
    What a conundrum.
    I'm a 1080p gamer currently with a GTX 970. My HTPC has a 660ti though, and needs an upgrade. The 970 will go to the HTPC.

    I'm stuck. I'm really against buying previous tech, so I really don't want to buy a 1070. But this review makes me think a 2060 might be a bit of a waste, paying for Ray tracing tech I'll never be able to use.

What's a guy to do? Is waiting for AMD's offering maybe, FINALLY, a good plan?
     
    Last edited: Jan 20, 2019
    Marees likes this.
  6. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    51,873
    Joined:
    May 18, 1997
    Let's stay on topic please.
     
    Nightfire likes this.
  7. IKV1476

    IKV1476 Lurker

    Messages:
    257
    Joined:
    Dec 26, 2005
The Bottom Line really does sum up the RTX 2060 and even the RTX 2070. What is the point of buying a card for a feature that it can't even make usable?

    Nvidia really should have found a better game to show off RTX, like a story driven adventure game where things could be slower paced and where you would stop to take in the scenes.
    A fast paced shooter seems pointless for this feature.

    Looking forward to the full review of this MSI card.
     
    Snowdensjacket likes this.
  8. TangledThornz

    TangledThornz Limp Gawd

    Messages:
    473
    Joined:
    Jun 12, 2018
I bought the RTX 2060 knowing I won't be able to use ray tracing; I really just bought it for the performance. Ideally I'd prefer to buy a 1070 Ti, but they're more expensive new and I couldn't find a good used one either.

    That being said, GPU choices suck and I'll probably be upgrading from this card as soon as AMD gets back in the game.
     
    lostin3d and Kyle_Bennett like this.
  9. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    18,408
    Joined:
    Apr 15, 2005
    Excellent review, and the conclusion is spot on: RTX is a lie and it just doesn't work...especially at these pricing, VRAM, and performance degradation points.
     
  10. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,759
    Joined:
    Oct 13, 2016
Just curious, between the CPU testing and all four GPUs, roughly how many hours went into all these reviews combined?

    [H]ard seriously went the extra mile and then some with this testing!
     
    Nightfire likes this.
  11. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    51,873
    Joined:
    May 18, 1997
A lot? Lol. A quick guess would be that we have spent about $8,000 in resources.
     
  12. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,759
    Joined:
    Oct 13, 2016
    WOW!
     
    Geforcepat likes this.
  13. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,267
    Joined:
    Sep 7, 2017
    This was about as in depth of a review as you can get. Holy crap.

Really, this card should have been called a GTX 2060. The RTX 2060 needs to ship with 12GB of VRAM :p

    When not running DX12 in BFV, this card actually does fine with 6GB of VRAM. But yeah, selling it as an RTX card - complete B.S.
     
  14. hesho

    hesho Limp Gawd

    Messages:
    500
    Joined:
    Sep 27, 2010
God dammit, I was drinking my pop while reading the third page and almost choked when I got to the graph. I didn't think it would end up being that bad.. oops.

    Ah well, spot-on conclusion. I kept asking what the point of RTX is, especially on the 2060. I get annoyed reading all the 2060 reviews because in almost all of them, one of the listed positives is that it supports RTX... too bad that doesn't mean playable.
     
    Marees likes this.
  15. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,267
    Joined:
    Sep 7, 2017
    Forget RTX, the card ran better at 1440p on dx11 than it did with dx12 at 1080p while consuming far less vram.

    Honestly, what in the hell is going on with dx12 in this title?
     
    Marees, Ranulfo, DrezKill and 6 others like this.
  16. cybereality

    cybereality 2[H]4U

    Messages:
    3,747
    Joined:
    Mar 22, 2008
    Seeing as my 2080 Ti is borderline playable with DXR (1080p UW), it's no surprise the 2060 is a no-go.

    I'd like to see what other developers do, maybe BFV is not the best game for this technology.
     
    Geforcepat and lostin3d like this.
  17. Kyle_Bennett

    Kyle_Bennett Editor-in-Chief HardOCP Staff Member

    Messages:
    51,873
    Joined:
    May 18, 1997
    That is the most hopeful outlook we could have.
     
    DrezKill, cybereality and Shotglass01 like this.
  18. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    1,192
    Joined:
    May 11, 2005
That DX12 hit, just to begin the discussion, is brutal, and really sets the table.

    I bet Diablo 3 would look great with raytracing...
     
    Marees likes this.
  19. Snowdensjacket

    Snowdensjacket Limp Gawd

    Messages:
    304
    Joined:
    Apr 10, 2017
Nvidia also promised DLSS with these RTX cards. Being able to get "free" AA would be a nice benefit for the 2060, but DLSS appears to be all smoke. Nvidia is still promising DLSS on their web page as well. Yet where is it? Final Fantasy 15, and only at 4K? Nobody is going to be buying a 2060 to play at 4K.

    Hell, Nvidia promised DLSS for something like ten games before the RTX launch. And they keep adding more to their list of promises. So what's the deal? How long does it take to add DLSS? It's been four months since they promised ten games, and so far all we've got is FF15, only at 4K. What a joke.
     
    Marees, noko, lostin3d and 3 others like this.
  20. noko

    noko 2[H]4U

    Messages:
    4,083
    Joined:
    Apr 14, 2010
Exposing the truth makes the lie stick out for all to see. All the demonstrations Jensen has done with RTX have been very misleading. Demoing an MP game on an MP map with no players? lol

    I am just utterly confounded by the DX12 performance handicap, WTH? I can see why BFV ended up being the first use of RTX: DICE are fantastic programmers, and the game looks awesome and runs extremely well unless you turn on RTX or DX12 with Nvidia hardware. AMD hardware seems OK with DX12, so I'm not sure if it's a game issue specific to Nvidia under DX12 or an Nvidia issue.

    The 2060's 6GB of memory on a $350+ card is worse for the times than the Fury's 4GB was in June of 2015. I would not recommend any card over $250 having less than 6GB of RAM. Even 8GB will limit developers from pushing high-IQ textures at 4K. With 5K monitors and beyond coming, and some out now, the whole RTX lineup falls short on longevity.

As for DLSS, the technique of rendering at a lower resolution then upscaling is nothing new as a way to increase performance - the real question is whether it gives better IQ or results than the methods already available now. I can game at 4K while rendering at a lower resolution in any game with current AMD or Nvidia drivers. The lack of DLSS being used speaks loudly about how unready the launch was to begin with, and suggests it will not deliver, just like RTX. More smoke indeed.
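The render-at-lower-resolution-then-upscale idea in the post above can be sketched with simple pixel arithmetic: per-pixel shading cost scales roughly with rendered pixel count, which is where the performance headroom comes from. The render scales below are illustrative assumptions, not measured DLSS figures.

```python
# Back-of-the-envelope look at why rendering below native resolution and
# upscaling (the general family of techniques DLSS belongs to) saves GPU
# work: shading cost scales roughly with the number of pixels rendered.
# The scale factors here are illustrative, not measured DLSS numbers.

def pixels(width, height):
    """Total pixels in a frame at the given resolution."""
    return width * height

def render_scale_savings(native, scale):
    """Fraction of per-pixel shading work saved when rendering at
    `scale` of native resolution per axis, then upscaling."""
    w, h = native
    full = pixels(w, h)
    scaled = pixels(int(w * scale), int(h * scale))
    return 1 - scaled / full

native_4k = (3840, 2160)
print(f"4K native pixels:     {pixels(*native_4k):,}")
print(f"Savings at 67% scale: {render_scale_savings(native_4k, 0.67):.0%}")
print(f"Savings at 50% scale: {render_scale_savings(native_4k, 0.5):.0%}")
```

Even a modest per-axis scale cuts the shaded pixel count steeply (it falls with the square of the scale), which is why the real debate in the thread is about image quality after upscaling, not whether the technique is faster.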

Let's see how the RTX line of low-memory cards stacks up to the 16GB Radeon at 4K with DX12, Vulkan, and so on, including BFV non-DXR using HDR. A game with well-done HDR will add to its memory footprint, plus it will push color compression harder, making it less effective and more memory intensive, which is where bandwidth will really come into play.
     
    Nightfire and PhoenixGenesys like this.
  21. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    14,920
    Joined:
    Apr 29, 2005
Wow, even at the lowest DXR setting it uses more memory than the card has. WTH, IMO that's false advertising.


    "It just doesn’t work, and Jensen should be ashamed for telling us all that it does".
    DAMN, [H] went Hard. Rightfully so. This card is a waste of money.
     
    Last edited: Jan 21, 2019
  22. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,820
    Joined:
    Apr 17, 2000
Granted, we've only seen NV Ray Tracing performance on it in one game, so to be clear, I think it is a waste of money specifically for NV Ray Tracing in BFV. We've yet to see how DLSS could impact it, or other games.

    We've also got to see how it performs in games in general before drawing conclusions about its value.
     
    Last edited: Jan 21, 2019
  23. iQuasarLV

    iQuasarLV AMDFanboy EchoChamber Member

    Messages:
    72
    Joined:
    May 19, 2016
    At this point I think we have to take a step back and really ask ourselves, "Did Nvidia just attempt to pull a 'mining' craze move on us gamers?"

    Look at some of the comparisons that can be made.
a) The top card is $1200. The 1080 Ti during the peak mining craze was selling for, what, $1500 on average?
    b) Pushing the key phrase RTX (ray tracing) as a platform-defining technology. 1080 Tis were unapologetically sold as mining platform cards because they were selling faster than the industry could create proprietary alternatives.
    c) Jensen stood on stage and sold it as a functional technology for gaming. How many times has Nvidia sold the public on technology that never got adopted or embraced?

    I am glad that one of the closing points in this article is a hard stare at this generation's "value" as a card. With RTX and DLSS as this generation's defining technologies, I just cannot disagree that the value falls on its face. We should all just laugh, shake our heads, and walk away. The only reason this card is a 6GB model is that Nvidia did not care about this line enough to elevate it above what they could get away with selling it for.

    I imagine the thought process when the 2060 discussion came up went like, "Hey, when the customer is standing there all oblivious to how much memory they need, they will just look at the model number and say, well, this one has a higher number and cool new features. I think I will buy that for an additional $100." /smfh
     
    Marees likes this.
  24. Colonel_Blimp

    Colonel_Blimp n00b

    Messages:
    36
    Joined:
    Dec 30, 2017
Presumably you have made nVidia aware of your results; very interested to hear what they say and the spin they may put on it.

    Great article btw. Every other site I've read manages much better results, close to the nVidia party line. Wonder why that is?

    Yours is the most honest and trustworthy site for my buying decisions though.
     
  25. TangledThornz

    TangledThornz Limp Gawd

    Messages:
    473
    Joined:
    Jun 12, 2018
    How is DX12 taking up more memory with RTX disabled?

I don't have BFV, but I'm replaying Deus Ex: Mankind Divided and found the game more stable with consistent FPS in DX11 over DX12, most likely because of memory issues with DX12 too.

    ANTHEM comes out next month and I hope I can enable some of the RTX features with my 2060, but I'll take high FPS over RTX any day.
     
  26. ole-m

    ole-m Limp Gawd

    Messages:
    407
    Joined:
    Oct 5, 2015
DLSS looks like shit compared to "real AA".
    DLSS is slightly faster than traditional rendering, sometimes renders things wrong, and has weird artifacts that can be really annoying.

    To me DLSS looks like a tech the RTX lineup shouldn't be using at all, and maybe a tech that the GTX 1060, 1050 Ti and such could make use of, as they're not capable of running real AA.
    So I am very confused about DLSS. If it's an RTX feature and the mainstream cards won't be getting it, I don't see a future for it at all, unless they magically fix all of these artifacts.
     
  27. ole-m

    ole-m Limp Gawd

    Messages:
    407
    Joined:
    Oct 5, 2015
Nvidia's memory compression doesn't work in DX12?
    I don't see any major difference on amdgpu in many DX11/DX12 titles, and DX12 on my Maxwell card never ever worked.
    I don't have the BF games so I can't test them.
     
    Marees and TangledThornz like this.
  28. TangledThornz

    TangledThornz Limp Gawd

    Messages:
    473
    Joined:
    Jun 12, 2018
    Hmm, makes sense as DX:MD was optimized for AMD.
     
  29. Stimpy88

    Stimpy88 [H]ard|Gawd

    Messages:
    1,246
    Joined:
    Feb 18, 2004
What exactly happens to this game when DX12 is turned on? Does it enable some graphics feature which is computationally intensive? Or is this a game-based issue, or a result of just how badly Microsoft have coded DX12 in general? Another thing: do AMD cards suffer the same hit in this game with DX12 enabled?

    It seems to me that if they could fix the 32% performance loss, as well as the VRAM usage of DX12 (could memory compression be bugged in some way, explaining the increased VRAM usage?), then RT could possibly be playable...
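The arithmetic behind that hope is straightforward: the ~32% DX11-to-DX12 loss is from the review, while the sample frame rates below are hypothetical, purely to show how much headroom fixing the overhead would return.

```python
# Rough arithmetic on the DX12 overhead discussed in the thread: if
# switching BFV from DX11 to DX12 costs ~32% of frame rate before DXR
# is even enabled, removing that overhead would raise the DXR ceiling
# considerably. The 32% figure is from the review; the example frame
# rates are hypothetical, for illustration only.

DX12_OVERHEAD = 0.32  # fractional FPS loss just from enabling DX12

def dx12_fps(dx11_fps, overhead=DX12_OVERHEAD):
    """FPS after the DX12 switch, given a fractional overhead."""
    return dx11_fps * (1 - overhead)

def fps_if_fixed(dx12_measured, overhead=DX12_OVERHEAD):
    """What the same DX12 scene would run at if the overhead vanished."""
    return dx12_measured / (1 - overhead)

# Hypothetical example: a card doing 90 FPS in DX11 drops to ~61 FPS
# in DX12 before ray tracing is even turned on.
print(f"90 FPS in DX11 -> {dx12_fps(90):.0f} FPS in DX12")
# A hypothetical 45 FPS DXR result would sit near 66 FPS with the
# DX12 overhead gone.
print(f"45 FPS with DXR -> {fps_if_fixed(45):.0f} FPS if the DX12 cost were fixed")
```

In other words, the DX12 tax alone eats roughly a third of the budget that DXR then has to share, which is why several posters see it as the first thing to fix.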
     
    Last edited: Jan 21, 2019
    Nightfire and Armenius like this.
  30. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,759
    Joined:
    Oct 13, 2016
The mysteries of DX12 we've seen with [H]ard's testing in this series of reviews made me curious about the anomalies I've seen in other games. I've only got two others (SOTTR & ROTTR). I did some quick re-testing last night at 1440p with my 1080 Ti and noticed both used more VRAM, just like BFV! On average 1-2GB more, and again at 1440p, so I imagine around double for 4K. So the consistency here is that DX12 just uses more VRAM in general, BFV or otherwise. It doesn't seem to matter if it's RTX or older gens either. Performance, however, still seems to come down to the game's implementation. Kyle & Brent showed extensively how in BFV it was detrimental. For me, testing the Tomb Raiders, SOTTR works best in DX12: I had min 50s/max 100s in the canned benchmark with DX12 and then min 30s/max 90s in DX11, while ROTTR was just a train wreck in DX12.

    I totally agree with all the fails of the RTX 2060. The RTX series at best is a mixed bag and should've stopped at the 2070, and even that is questionable for RT. This should've been a GTX, and if the rumors of the GTX version are true then that looks even worse. If DLSS had happened it might've given this card a shot, but it hasn't yet.
     
  31. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    15,903
    Joined:
    Jan 28, 2014
    Just wanted to point out that the GPU Memory Restriction setting seems to be doing something with my 2080 Ti. In single player it is keeping VRAM usage to right around 10GB, but when I turn it off it is using all 11GB constantly. I cannot tell what, if any, settings the game is altering to keep VRAM usage in check, but I'll be damned if I can tell any difference at 4K with DXR Ultra. I just got the game to test if anything funky is going on with the PG278Q, so I have yet to spend a whole lot of time with it (including diving into multiplayer).
     
    lostin3d likes this.
  32. lazz

    lazz Limp Gawd

    Messages:
    324
    Joined:
    Apr 15, 2007
    Man, the graph at the bottom of page 6...

Even the 2070 can't hit 60 FPS with low DXR @ 1080p.. Forget the 2060.

    Makes you wonder why anyone would pay $350 for a hobbled card that is memory bottlenecked almost always.

    As always, fantastic testing and insight from [H].
     
  33. bobzdar

    bobzdar [H]ard|Gawd

    Messages:
    1,559
    Joined:
    Jun 6, 2003
I think the conclusion is that DX12 is flat broken in BFV. Even without RTX, performance drops 30% and VRAM usage doubles for no apparent reason. I don't think RTX is necessarily the issue here; either DICE or Nvidia have something completely wrong in DX12.
     
    Armenius likes this.
  34. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    14,920
    Joined:
    Apr 29, 2005
Agree with the statements in the article. Any "gaming" card in 2019 should have at least 8GB of memory regardless of price; having 6GB is a step backwards.
     
    Armenius likes this.
  35. illli

    illli [H]ard|Gawd

    Messages:
    1,181
    Joined:
    Oct 26, 2005
This was to be expected, looking at 2070 performance. This gen of RTX cards is not worth it in my opinion. Maybe the next gen/version will be better; the main draw (the RTX feature) is just too weak currently. I know a lot of people get excited because it's new technology, but the prices are higher, the price/performance ratio isn't as good as previous gen releases, the RTX stuff isn't quite there yet, there's hardly any games with it, and the DLSS stuff has the same issue. Paying for a new card with not-quite-there-yet features isn't really that great of a prospect. Right now I just don't see much appeal in these first-gen RTX cards if you already have a 1070/1080 etc.
     
  36. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    9,218
    Joined:
    Jul 16, 2000
    Wow that performance is truly atrocious. 32% hit just enabling DX12 without even turning on ray tracing? This card shouldn't exist let alone at $349.99.
     
  37. Auer

    Auer Limp Gawd

    Messages:
    182
    Joined:
    Nov 2, 2018
    I guess the question now is why would anyone buy this card at all?

    Seems expensive for a 1080p card.

     
    TangledThornz likes this.
  38. Hostile

    Hostile Gawd

    Messages:
    683
    Joined:
    Feb 24, 2004
And still there is no real reason for me to upgrade my 1070 @ 1440p.
     
  39. TangledThornz

    TangledThornz Limp Gawd

    Messages:
    473
    Joined:
    Jun 12, 2018

I upgraded from a GTX 1060 to the RTX 2060 just for a performance increase, and yes, I did overpay for it. I never paid over $300 for a new card before and I paid $380 for mine :(

    Another thing that stings me about this upgrade is that all my previous upgrades since 1999 at least doubled the VRAM: from 16MB (TNT2) to 32MB (Radeon), 64MB to 256MB, 256MB to 512MB, 1GB to 3GB, and 3GB to 6GB with my GTX 1060. This is my first upgrade where the VRAM size stayed the same.
     
    Marees and DrezKill like this.
  40. Everlast

    Everlast n00b

    Messages:
    51
    Joined:
    Jun 3, 2014
    6GB? So only 2.5GB more than the 970 had in 2014.

    Meh.