Watch Dogs Legion Benchmark Test & Performance Analysis - 30 Graphics Cards Tested

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,785
"DLSS works great to increase your FPS without a major loss in image quality. Personally, I find "DLSS Performance" the best choice, though the difference to "DLSS Quality" is minimal. "DLSS Ultra Performance," which is marketed as for use with 8K, looks terrible, though. Everything is super blurry, and you can feel how low-resolution the original image is. There's also very strong artifacting around moving objects with that setting—not recommended. The other settings work well, though. What's distracting at all DLSS levels is that the the blue-striped GPS indicator in the game world has some serious moire artifacts with DLSS enabled. Overall, I can still recommend DLSS since it's the easiest way to gain more performance while keeping high details.

Overall, I would expect much better performance optimizations from a title that has been in the making for so long. Ubisoft has nearly infinite resources—this is not a small developer/publisher. I'm also a bit surprised NVIDIA didn't push for a better raytracing implementation, because RTX has to impress to convince people. Once again, thanks for all those graphics settings we can play with, support for widescreen displays, and the ability to run uncapped FPS."


https://www.techpowerup.com/review/watch-dogs-legion-benchmark-test-performance-analysis/
 
"DLSS works great to increase your FPS without a major loss in image quality. Personally, I find "DLSS Performance" the best choice, though the difference to "DLSS Quality" is minimal. "DLSS Ultra Performance," which is marketed as for use with 8K, looks terrible, though. Everything is super blurry, and you can feel how low-resolution the original image is. There's also very strong artifacting around moving objects with that setting—not recommended. The other settings work well, though. What's distracting at all DLSS levels is that the the blue-striped GPS indicator in the game world has some serious moire artifacts with DLSS enabled. Overall, I can still recommend DLSS since it's the easiest way to gain more performance while keeping high details.

Overall, I would expect much better performance optimizations from a title that has been in the making for so long. Ubisoft has nearly infinite resources—this is not a small developer/publisher. I'm also a bit surprised NVIDIA didn't push for a better raytracing implementation because RTX has to impress to convince people—once again, thanks to all those graphics settings we can play with, support for widescreen displays, and the ability to run uncapped FPS."


https://www.techpowerup.com/review/watch-dogs-legion-benchmark-test-performance-analysis/

NVIDIA didn't push for a better RTX implementation because it's probably plain DXR and coming to consoles. I think this is how future games will come out: built directly on DXR rather than focused entirely on NVIDIA hardware, so they work on everything.
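
As a rough illustration of what a vendor-agnostic path means in practice (a minimal sketch, not code from the game): a DXR title simply asks Direct3D 12 whether the adapter exposes a raytracing tier, instead of probing for anything NVIDIA-specific.

// Minimal sketch: vendor-agnostic DXR capability check through D3D12.
// Assumes a valid ID3D12Device* has already been created; error handling trimmed.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Any GPU reporting tier 1.0 or higher can run DXR effects,
    // whether it comes from NVIDIA, AMD, or Intel.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

A check like this is all a DXR renderer needs in order to enable its ray-traced effects on any capable GPU.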
 
For Watch Dogs Legion it seems the RX 5500 is much slower than the RX 470 when both have 4GB, and 8GB of VRAM is very important for this game at 1080p. The RX 5600 XT with its 6GB of VRAM is now slower than a Vega 56 with 8GB, and the 8GB RX 580 beats the RX 5600 XT at 1440p. Makes one wonder how the RTX 3000 cards with 8GB of VRAM will survive into the near future of gaming?
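
For context on why the memory size shows up so directly in these results, here's a minimal sketch (standard DXGI, not tied to this game) of how an application reads the dedicated VRAM on the primary adapter, the number it would use to size its texture pools:

// Minimal sketch: query dedicated VRAM of the primary adapter via DXGI.
// Build with dxgi.lib; error handling kept minimal.
#include <dxgi1_4.h>
#include <cstdio>

int main()
{
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK)  // adapter 0 = primary GPU
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        std::printf("Dedicated VRAM: %llu MiB\n",
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}

A 4GB or 6GB card simply reports a smaller budget here, so once the game's highest texture setting wants more than that, performance falls off the cliff the benchmarks show.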
 
In a sense, the fact that this game can run on the original 2013 Xbox One/PS4 consoles indicates that a lot of performance optimization was done, just not for PC hardware?

because it's probably plain DXR and coming to consoles.
The menu options don't say RTX, just a generic term for ray tracing, so you are almost certainly right (while they do say DLSS for DLSS and other NVIDIA-specific techs).

Makes one wonder how the RTX 3000 cards with 8GB of VRAM will survive into the near future of gaming?
At the same time, this is an extremely demanding new game, and unlike some others it has the 8GB 3070 beating the 11GB 2080 Ti at 4K in much the same way it does at 1440p (and its lead over the 11GB 1080 Ti actually grew at 4K versus 1440p), while the 8GB 1070 Ti did take the lead over the 6GB 2060 at 4K. That makes it hard to use this game to predict in any particular direction, but 8GB does already seem to be flirting with the comfort line. One thing going for those cards is that gaming resolution will almost certainly not move during their lifetime: you buy them to play at the resolution of your 2020 screen (or your planned next screen purchase).
 
Over the next 12-18 months we're going to see a lot of crappy software releases; it's just the nature of the beast.

Devs will scramble to produce bells and whistles with code they are not intimately familiar with for the new consoles on both sides of the fence, and the PC will get those games as trickle-down ports. I suggest you all take a deep breath and realize that a great time for new hardware means a shit-tier time for software. Give it 2-3 years and the games will really start impressing, but between now and then, honestly, expect a shit-show and be pleasantly surprised if it's not one.
 