Battlefield V NVIDIA Ray Tracing RTX 2060 Performance @ [H]

I would not judge the 2060 on BFV, simply because its DX12 path is not optimized one bit; no matter the hardware, it runs worse.
Yeah, 6GB is on the low side, but I can play with a GTX 970 at medium-high settings and it's smooth for the most part (a flat 60fps line).
 
I turned all this stuff on the other day out of curiosity and got bombarded with constant foliage flickering.
 
Just wanted to point out that the GPU Memory Restriction setting seems to be doing something with my 2080 Ti. In single player it keeps VRAM usage right around 10GB, but when I turn it off the game uses all 11GB constantly. I can't tell what settings, if any, the game is altering to keep VRAM usage in check, but I'll be damned if I can tell any difference at 4K with DXR Ultra. I only got the game to test whether anything funky is going on with the PG278Q, so I have yet to spend a whole lot of time with it (including diving into multiplayer).
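(A side note for anyone wanting to sanity-check numbers like these outside of in-game counters: below is a minimal sketch using NVIDIA's NVML Python bindings, assuming the pynvml package is installed. Keep in mind NVML reports memory *allocated*, not the working set a game actually touches each frame, which may explain part of the 10GB vs 11GB difference.)

```python
# Minimal sketch: poll total VRAM allocation on GPU 0 via NVML while a game runs.
# Assumes the pynvml package is installed; NVML reports memory *allocated*,
# which is not necessarily memory the game actively needs each frame.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    nvmlShutdown()
```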

Now you've got me more curious. I haven't yet come across anyone testing a new Titan with this and DXR. What if some of the FPS issues aren't just down to the 2060's VRAM? Between DX12 and DXR, it's easy to imagine that VRAM is simply being brutalized. How much is enough? I also agree that Ultra is just ridiculous; when I tested, low or medium was the sweet spot, just like [H]ard said. I haven't gone back to do much with it and wouldn't have bothered at all if it hadn't come free with the card. All the while, I'm thankful for the in-depth analysis here.

Funny, it only seems a couple of years ago that some people were saying 8GB of system RAM and 4GB of VRAM were overkill. Now 6GB of VRAM clearly won't do it, and even 11GB can have issues.
 
Man, the graph at the bottom of page 6...

Even the 2070 can't hit 60FPS with low DXR @ 1080p. Forget the 2060.

Makes you wonder why anyone would pay $350 for a hobbled card that is memory bottlenecked almost all the time.

As always, fantastic testing and insight from [H].

I don't play games like BF5, but I messed around with it for a bit with my 2070 @ 2055MHz (memory OCed +700) on my 5GHz 8600K. Single player only, at 2560x1080. Ultra everything plus ultra ray tracing is playable in single player.
 
Did NVIDIA really test the RTX 2060 with different games at different resolutions before packaging them to be sold in stores?

Why do I get the feeling that they just ran demos with ray tracing enabled to see if it works, then stamped it OK?
 
I know everyone here loves these multiplayer games with huge maps; sadly, I am not one of them. Never have been, never will be. The closest I come to multiplayer is DDO and EVE Online; that is it, I do not bother with anything else. Quite frankly, I don't even know what all the joy about them is. PUBG, BFV, Fortnite... give me a good single-player game and awesome eye candy. It would be nice if someone did a write-up on that instead of all these MMOFPS versions of a game.
 
Thanks for sharing. Can you tell us more?
 
I can only imagine how the RTX 2060 struggles with ray tracing when my newly acquired RTX 2080 8GB Founders Edition renders a mere 25FPS @ 1440P in the 3DMark Port Royal benchmark. The system I'm running it on is a Threadripper 1950X overclocked to 4.0GHz on all cores, 64GB DDR4 @ 3000MHz, a Gigabyte X399 Aorus Gaming 7, and a Corsair H100i v2 liquid cooler with Noctua 2000 RPM industrial fans. This system is no slouch, though I'm sure others have better machines. So far I'm unimpressed by the RTX 2080 8GB, and I'm coming from an EVGA GeForce GTX 1080 Ti FTW3 11GB; if anything, this feels like a downgrade. This is by no means my only system, and I don't game much. Unless someone can tell me of an application that can take advantage of those Tensor cores, or a way to access the RT cores for rendering without forking over a lot of money for a Quadro RTX card, this thing will be returned and I'm popping my GTX 1080 Ti back into this machine.

For those of you who didn't get to experience the full awesomeness of NVIDIA RTX technology and the wonders it can do for gaming today, in its current implementation, I uploaded a short video of the 3DMark Port Royal benchmark run. Enjoy!


 

The logical upgrade path from a 1080 Ti is a 2080 Ti. There were plenty of benchmarks showing that there is really no difference between the 1080 Ti and the 2080, besides the 2080's RT cores and smaller VRAM. Plus, Threadripper is not a great gaming or benchmarking CPU unless you like playing Cinebench R15.

Next time, do your research.
 
I saw a review on Newegg or Amazon praising an RTX card for such great ray tracing in Shadow of the Tomb Raider. I'd have to say NVIDIA did a great job with the placebo effect, because a lot of owners probably think they are ray tracing on all their titles.
 
The logical upgrade path from a 1080 Ti is a 2080 Ti. There were plenty of benchmarks showing that there is really no difference between the 1080 Ti and the 2080, besides the 2080's RT cores and smaller VRAM. Plus, Threadripper is not a great gaming or benchmarking CPU unless you like playing Cinebench R15.

Next time, do your research.

Yeah, well... you see... I'm not made of money. I paid $800 for the EVGA GeForce GTX 1080 Ti FTW3 11GB over a year ago, at a time when a GeForce GTX 1080 Ti 11GB Founders Edition was $699 on GeForce.com. To follow your upgrade-path logic I would have to pay $400 more this time around just for the Founders Edition RTX 2080 Ti, while an EVGA RTX 2080 Ti FTW3 now costs $1500 or more, depending on where you buy it. You might as well recommend the TITAN RTX, which by the way is the only RTX GPU that actually has a decent amount of VRAM, but it's way overpriced.

I did my research, and if you'd paid attention, you'd have noticed I mentioned that this isn't my only machine. No worries, this POS RTX 2080 8GB FE doesn't benchmark any better when installed in a system with a 9900K overclocked to 5.0GHz either. The score is slightly higher, but the FPS is still garbage with RTX ON.

Honestly, I would have been happy with the same or maybe 5% better performance than the GTX 1080 Ti. What I didn't expect was for RTX to be such a huge gimmick.

I'm sure it's a great feature on Quadro RTX, however, for gamers, it's a performance penalty that will put a hole in your wallet.
 
The logical upgrade path from a 1080 Ti is a 2080 Ti. There were plenty of benchmarks showing that there is really no difference between the 1080 Ti and the 2080, besides the 2080's RT cores and smaller VRAM. Plus, Threadripper is not a great gaming or benchmarking CPU unless you like playing Cinebench R15.

Next time, do your research.
If only we had a good source for real gameplay...

RTX 2070 vs RTX 2080 vs GTX 1080 Ti vs GTX 1070

ASUS ROG STRIX RTX 2080 Ti Video Card Review

MSI GeForce RTX 2080 GAMING X TRIO Review


 
Yeah, well... you see... I'm not made of money. I paid $800 for the EVGA GeForce GTX 1080 Ti FTW3 11GB over a year ago, at a time when a GeForce GTX 1080 Ti 11GB Founders Edition was $699 on GeForce.com. To follow your upgrade-path logic I would have to pay $400 more this time around just for the Founders Edition RTX 2080 Ti, while an EVGA RTX 2080 Ti FTW3 now costs $1500 or more, depending on where you buy it. You might as well recommend the TITAN RTX, which by the way is the only RTX GPU that actually has a decent amount of VRAM, but it's way overpriced.

I did my research, and if you'd paid attention, you'd have noticed I mentioned that this isn't my only machine. No worries, this POS RTX 2080 8GB FE doesn't benchmark any better when installed in a system with a 9900K overclocked to 5.0GHz either. The score is slightly higher, but the FPS is still garbage with RTX ON.

I did pay attention. You came into a 2060 thread complaining that your TR-based "POS" 2080 can only do 25fps in Port Royal and that it feels like a downgrade from your 1080 Ti.

Not to derail this thread any more than we have, but you say you're not made of money; it sounds to me like you are, since you paid $800 for RT instead of more performance. Especially if you wanted more performance than your 1080 Ti, the 2080 Ti is a no-brainer. If I were paying $800 for an upgrade, I would have done my research a lot better than you have.

You have no one to blame but yourself.
 

My entire point was that I can only imagine how bad the RTX 2060 is at ray tracing if the RTX 2080 is struggling with it. The benchmark I posted is ray-tracing related, which is why I got the card in the first place. I wasn't necessarily expecting better performance than the 1080 Ti; however, it actually feels slightly worse in every single game that I play. To add insult to injury, it is really bad at the very thing it is named after: RTX, ray tracing. An average of 25FPS at 1440P is really bad, and no amount of raw CPU power will improve that. That was my entire point; sorry you missed it.
 
"Above are the exact in-game settings used to test here today."

Was the wrong screen shown for these? I did not see the settings, which I assume are in the advanced tab.

I was curious to see what settings were needed for 1080p low DXR to beat 720p ultra DXR and almost match 1080p DX12 max settings.

The RTX 2060 is clearly a lost cause, but I imagine the 8GB cards would do well with something like high DXR on medium settings at 1080p, and perhaps low DXR on medium settings at 1440p.
 
"Above are the exact in-game settings used to test here today."

Was the wrong screen shown on these? I did not see the settings which I assume is in the advanced tab.

I was curious to see what settings were needed for 1080p low dxr to beat 720p ultra dxr and almost match 1080p dx12 max settings.

The GTX 1060 is clearly a lost cause, but I imagine the 8 GB cards would do well with something like high dxr on medium settings at 1080p and perhaps low dxr on medium settings with 1440p.
Tag Brent_Justice if you want more explanation, which he will surely be more than happy to give. Full transparency is our game.
 
Seeing as my 2080 Ti is borderline playable with DXR (1080p UW), it's no surprise the 2060 is a no-go.

I'd like to see what other developers do, maybe BFV is not the best game for this technology.

I would like to see some games using Vulkan-based ray tracing; it seems to perform a lot better than the DX12 version. Enlisted was the only game NVIDIA demonstrated that used ray tracing at 4K, and it used Vulkan.

As for the review of the 2060 here, great review as usual, but I wonder, was anyone out there expecting anything different with regard to ray tracing? As you say, if the 2080 Ti is just barely good enough, what hope does the 2060 have?

They should have stuck with an 8GB x60 card without any ray tracing.
 

I don't bring up Enlisted since I could never find anything besides a video of a screen at Gamescom(?). But if a 2080 Ti can do 90fps @ 4K, a 2060 should be able to do about 75fps @ 1440p, which would be pretty sweet, assuming everything scales linearly.

What makes me skeptical is that I'd be promoting the hell out of that if I were NVIDIA.
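(For what it's worth, here is that back-of-the-envelope estimate spelled out; a sketch only, since the 90fps figure is the unverified Enlisted claim above and the 2060-to-2080 Ti throughput ratio is a rough assumption.)

```python
# Linear-scaling extrapolation from the post above. Both the 90 fps figure
# and the throughput ratio are assumptions, not measurements.
pixels_4k    = 3840 * 2160          # ~8.29 MP
pixels_1440p = 2560 * 1440          # ~3.69 MP (4K is 2.25x this)

fps_2080ti_4k = 90                  # claimed Enlisted number (unverified)
ratio_2060    = 0.37                # assumed RTX 2060 : 2080 Ti throughput

fps_2080ti_1440p = fps_2080ti_4k * pixels_4k / pixels_1440p  # ~202 fps
fps_2060_1440p   = fps_2080ti_1440p * ratio_2060             # ~75 fps
print(f"~{fps_2060_1440p:.0f} fps at 1440p")
```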
 
I think we should all wait and see how RTX cards perform on other games when they are released with ray tracing. Basing opinions of a whole product range on one game could be unrealistic. As reaper12 said, it may perform a lot better running RTX features under Vulkan. It may also be worth considering that Battlefield V needs more optimization and NVIDIA's drivers are immature; who knows?
This is definitely a case of early adopters being disappointed and overcharged. It will be interesting to see how this situation develops.
 
Absolutely, yes, more games to test. There are none, four months later.

Sadly this is all we have to base our opinion on currently when it comes to RTX.

That said, this is a review of a specific game with a specific product, which is a very narrow view.

If you go back and read our full reviews of the RTX cards we have had luck getting our hands on, you will find awards.

So our opinions are pointed, but Jensen has very much hung his hat on RTX ray tracing performance continually, and it seems he has enough rope out there to hang himself as well. Maybe he will reel some of that in soon, but until then, we base our opinions on real hardware and real games we can play on that hardware.
 
Nice review; it very much strengthens my resolve to give NVIDIA the finger this time around. I'm no hater -- I own a 1070 and a G-Sync monitor -- but they need to be called out on this abortion of a video card.

I can easily ignore the 2080 Ti; it screams early adopter. Those folks know what they're getting and they don't mind spending the coin. I think that card is a correctly done part: super expensive, super high end, and it's clear what it is.

But I feel the small amount of VRAM on the 2060 is going to force a lot of gamers into a quick upgrade in 2020, when the next line of cards (hopefully) makes RTX beneficial and plentiful. Jensen should be ashamed of himself. The 2060 is deceptive when, as it stands, a 1070 Ti is a superior solution with more life left in it.
 
I was updating my 1080 Ti drivers just now and saw the ad for the 2060 FE; the first sentence was: "Delivering max-setting 60+ FPS gaming at 1920x1080 in BFV / And runs Battlefield V with ray tracing at 60 FPS." BF1 was shit with DX12 too. :p It's possible I'll get a 2080 Ti later for the general performance increase and to have a real-life look at it, but it still costs way too much, I think; probably better to wait for the next generation, and who knows how soon we'll see the next Ti. I also think BFV is a poor choice for this, seeing how hard it hits performance in general. I'd probably play around with it a bit but go back to running high FPS instead; I didn't buy a 165Hz panel to run 30-60 FPS. In games like Dragon Age with G-Sync on, though, it could be a nice touch.
 
As someone still on Maxwell and 1080p, I will be avoiding the RTX 2060 like the plague. A 970 is still more than adequate for what I play, and I see nothing compelling in the marquee feature that Jensen uses to justify jacking up the price $100 on a 60-series card. If anything, I'd get either a 1070 or a second 970 before this card. At least the 960 let you do 2-way SLI, a feature NVIDIA has seen fit to strip out of the 60 series since Pascal. Thanks, but no thanks, NVIDIA. Many thanks to Brent and Kyle for their hard work in making this review.
 
There is a cheaper 1660 Ti coming; I think it is faster, but one thing I do know is that it doesn't have ray tracing.
 
The Bottom Line really does sum up the RTX 2060, and even the RTX 2070. What is the point of buying a card for a feature it can't even make usable?

NVIDIA really should have found a better game to show off RTX, like a story-driven adventure game where things could be slower paced and where you would stop to take in the scenes. A fast-paced shooter seems pointless for this feature.

Looking forward to the full review of this MSI card.

Looks like using ray tracing for reflections is a waste of time, but on the other hand, using path tracing for lighting, as in Quake 2, seems to be a winner.
 
As for DLSS, rendering at a lower resolution and then upscaling is nothing new as a performance technique; the real question is whether it gives better image quality than the methods already available.

In theory, DLSS should be roughly equivalent to dialling down settings at 4K, or to upscaling from 1800p.
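(To put rough numbers on that, a quick sketch; the ~1440p internal resolution for DLSS at a 4K target is what early coverage reported, and 1800p is the figure from the post above.)

```python
# Pixel-count comparison behind the upscaling argument above.
def megapixels(w, h):
    return w * h / 1e6

out_4k  = megapixels(3840, 2160)  # 8.29 MP output target
in_1440 = megapixels(2560, 1440)  # ~3.69 MP, reported DLSS internal res at 4K
in_1800 = megapixels(3200, 1800)  # ~5.76 MP, the 1800p upscale suggested above

print(f"1440p is {in_1440 / out_4k:.0%} of native 4K pixels")  # ~44%
print(f"1800p is {in_1800 / out_4k:.0%} of native 4K pixels")  # ~69%
```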
 
Nice review! Totally agree. I've been saying since the day they announced it that at $350 with 6GB of RAM it's a tragedy. Then I had people troll me about how it wouldn't matter for this card. I guess only NVIDIA can sell a card for $350 with 6GB of RAM; if it were AMD, the internet would be on fire, lol.

Same issue with the 2070 and 2080, which only have 8GB. To be honest, even the 2080 Ti should have had more than 11GB, as that isn't enough for 4K with high DXR. It is highly disappointing that there was no VRAM increase over the previous generation, especially given how much these new technologies need. I bet the cards would have been better off with more VRAM on GDDR5X than less on GDDR6; the bandwidth increase isn't that great and definitely won't compensate for a lack of physical memory.
 

The bandwidth increase over the last gen is rather significant, especially going from the GTX 1070 to the 2070, where it is massive.

So now 8GB is not enough? I suppose you could have given the 2070 and 2080 cards 12GB of GDDR5X, but they would likely perform worse in most games. And the only option for the 2080 Ti to have both good bandwidth and capacity would have been HBM2.
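(For concrete numbers, peak bandwidth is just effective data rate times bus width; a quick sketch using the published specs of the two cards:)

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

gtx_1070 = peak_bandwidth_gbs(8, 256)    # 8 Gbps GDDR5  -> 256 GB/s
rtx_2070 = peak_bandwidth_gbs(14, 256)   # 14 Gbps GDDR6 -> 448 GB/s
print(f"GTX 1070: {gtx_1070:.0f} GB/s  RTX 2070: {rtx_2070:.0f} GB/s "
      f"(+{rtx_2070 / gtx_1070 - 1:.0%})")  # a +75% generational jump
```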

Just take a look at the newest review of RE2, which is said to use up to 13.7GB, according to the game's own counter:



On the top end, the RTX 2080 matched the 1080 Ti at 4K with max settings.

[Attached YouTube screenshot: RE2 4K benchmark chart]


The RTX 2060 is doing just fine at 1440p against the GTX 1080. In fact, the bandwidth deficiency of the GTX 1070 Ti had a greater impact against the GTX 1080, even if you factor in the fewer CUDA cores.

[Attached YouTube screenshot: RE2 1440p benchmark chart]


A similar story can be seen at 1080p comparing 4GB, 6GB, and 8GB cards. 4GB is on the edge here, as the Fury X suffers, and all 4GB cards suffer at 1440p, though that is not their target. Remember too that all of these tests were done with max settings.
 
At this point in time it feels more like lazy game devs depending on the horsepower of new cards than on how NVIDIA or AMD builds them. Take the PS4, for instance; I'll point to Horizon: Zero Dawn. It looks amazing and plays amazingly, something that took more than a few players off guard. That game was built and optimized to run smooth, and it does. Now take a look at Fallout 76, a more recent game which had loads of time to look great and have its engine optimized; instead it was released on PC, PS4, and Xbox alike, and it looks like dithered horse rectums.

The Witcher 3 is another fantastic example of a beautiful game, built for multiple platforms and optimized for each.

Lazy programmers depending on brand-name fan bases and hype, too incompetent to actually do anything that takes intelligence; instead, the poor saps trying to make a fabulous game get handcuffed by inept coding. It's more prevalent as time goes on.

Real-time ray tracing from NVIDIA is an attempt to mainstream it, nothing else. They've given programmers the tools to make use of it, but honestly, it'll be a generation or two before we see hardware pump out the power required to run the games using it now. It's the Crytek of GPU offerings. To be honest, it should have been kept off the current generation and implemented through an add-on card, as Ageia PhysX originally was, prior to being integrated onboard. The option would have been nice over being force-fed the steaming shitheap of performance which is RTX.

Kudos to the team who built BF5 and attempted to implement ray tracing, but from what I've seen and read from the different devs building games with ray tracing, it takes an awful lot of trial and error to get it running.
 

Just a thought:

Maybe NVIDIA should have released prototypes of the cards (along with drivers) in advance.

Look at the Vulkan path tracing for Quake 2, implemented by a postdoc researcher on NVIDIA hardware.

If people like him had had these cards and drivers in advance, then maybe the games would have been ready by the time the cards officially launched. Maybe!?
 

I'm betting they did, but the programming for it is complex, and NVIDIA wanted their new cards out the door while GPU mining was hot. The premium prices reflect this; unfortunately, the RTX performance and support do not.
 
Kudos to the team who built BF5 and attempted to implement ray tracing, but from what I've seen and read from the different devs building games with ray tracing, it takes an awful lot of trial and error to get it running.

Ray tracing is easy; that's not the problem. The problem is that it takes massive amounts of computational power. So to make games work with ray tracing on today's hardware, they have to implement a hybrid system, part ray tracing and part rasterization, with most of the frame being rasterized rather than ray traced.
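(To illustrate the point: the math behind a single ray is trivial, and the cost is purely in the volume. A toy sketch, nothing like a production hybrid renderer, just the core intersection test:)

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t > 0.
    # 'direction' is assumed normalized, so the quadratic's a == 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    return disc >= 0 and (-b - math.sqrt(disc)) / 2.0 > 0

# Cheap per ray, but at one primary ray per pixel, 1080p60 already needs
# ~124 million of these per second - before bounces, shadows, or denoising.
print(f"{1920 * 1080 * 60 / 1e6:.0f} M primary rays/s")
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # True
```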
 
*Snip RE2 VRAM benchmark quote*


You are using a single game for comparison, and one which doesn't appear to be particularly visually intensive either; personally, I think it looks very dated, but that isn't unusual for Capcom games (my opinion). Anyway, it does depend on how a game engine is designed to stream its textures, so depending on that and other settings, games will use VRAM differently (remember RAGE?). DX12 games also use more VRAM from what I've seen. Clearly this game isn't VRAM-size limited, but look at various other reviews, especially with ray tracing enabled, and see how much RAM they demand. It is a lot! I also highly doubt that the typical end user in the market for a 2070 or higher is going to be running 1080p, likely not even 1440p. I'm not sure what point there is in arguing against higher-end cards in their price segment having more VRAM; 8GB is pretty old stuff, has been available for years, and many games easily push past it.
 
*Snip hardware unboxed RE2 stuff

I've said it in some other threads, but I'm still thinking that the RE Engine is not properly reporting VRAM usage. For starters, it outright thinks that my Vega 64 doesn't even have VRAM. Secondly, Afterburner reports far less VRAM usage and doesn't even bump against 8GB from what I've seen so far.

I'm no expert, though, and I'd love to see some in-depth looks at RE2 and its settings.

Now off to read that hardware unboxed review.
 

I guess the point is that I have yet to see a game (other than BFV running DXR) that is crippled by 6GB of VRAM, much less 8GB.

The biggest issue with games lately has been awful DX12 implementations; both BFV and RE2 take huge performance and VRAM penalties under DX12. It seems that Hitman 2 didn't even bother with it, as stated in another thread.
 

Well, I don't want to sound like a dick, but you obviously haven't played many games in 4K at maximum settings. As for DX12, I'm not sure whether it's bad implementation or not, but it's beneficial for a game to use more VRAM (again, RAGE is my example; I personally loved the game, but its streaming just didn't work, and it would have been better off using more VRAM instead). The game then has to load fewer assets from your SSD/HDD on the fly, improving performance and providing smoother FPS. In an ideal world you'd want everything preloaded into RAM/VRAM for the best possible performance. Granted, devs can also work on better, more efficient texture streaming and whatnot so their product fares well on lower-end systems, but that sort of thing costs time and money, so if they are making a top-tier game, I want them to focus on what's important to me, which is why I build an above-average system.
 