NVIDIA card w/ FreeSync - Pixelated graphics (especially hair)

Also this from the Steam community. Not trying to be a jerk or anything, but I really think it's just the game.

View attachment 383332
I know you're answering jobert, and yes, the problem in this image is very similar to mine. Still, this very problem happens in other games, made by other developers (such as RDR2 and COD MW). So the problem is not with this specific game; maybe it's some sort of technology that all of them use, but it is not related exclusively to Marvel's Avengers.
 
I've seen this with 1080 Tis using a FreeSync 4K display. I was able to correct it by going into the drivers and, under "Change Resolution", selecting "Use NVIDIA color settings":
  • Desktop color depth: Highest (32-bit)
  • Output color depth: 8-bit, or 10-bit if you have a 10-bit monitor (which I did)
  • Output color format: RGB
  • Output dynamic range: Full
Nvidia drivers at the time were pushing 6-bit + FRC with my 10-bit monitor. WTF, I thought, but yes, games would have this pixelated look. Not sure if that is still the same problem after a few years of Nvidia drivers! Also, do you have GeForce Experience optimizing your games (more like destroying them)? If so, turn that shit off and go to defaults in 3D Settings and Program Settings in the drivers, besides the color settings. Best I can think of.
Your best is not even close to reality 🤣

There is no such thing as a 6-bit setting, and even if by some miracle the GPU sent 6-bit instead of 8/10-bit it would cause an entirely different issue. With 6-bit + A-FRC (or just 'dithering') the image would look identical to 8- or 10-bit, albeit with imperceptible noise.
Not to mention screenshots would not capture any of the differences in how the GPU sends data to the monitor, doh
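To illustrate the dithering point (a minimal numeric sketch, not how the driver's A-FRC actually works): quantizing a smooth gradient by plain rounding produces large flat bands, while adding sub-step noise before rounding trades those bands for fine noise whose average matches the original signal.

```python
# Hypothetical illustration: 6-bit quantization with and without dithering.
# This is not the driver's actual FRC algorithm, just the general idea.
import numpy as np

# A smooth horizontal gradient, stored at high precision (0.0 .. 1.0).
gradient = np.tile(np.linspace(0.0, 1.0, 1920), (64, 1))

levels = 2**6 - 1  # 6-bit output has 64 levels (0..63)

# Plain rounding: large flat steps -> visible banding.
banded = np.round(gradient * levels) / levels

# Dithered: add sub-step random noise before rounding.
# Individual pixels jitter between adjacent levels, so the average over
# space/time matches the original value -> fine noise instead of bands.
noise = (np.random.rand(*gradient.shape) - 0.5) / levels
dithered = np.round((gradient + noise) * levels) / levels

print("unique levels (banded):  ", len(np.unique(banded)))    # 64 coarse steps
print("mean abs error, banded:  ", np.abs(banded - gradient).mean())
print("mean abs error, dithered:", np.abs(dithered - gradient).mean())
```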

@im_shadows
You got 31.5" monitor that is 1920x1080 so you will see some rendering quirks. Games use trickery to fake certain computationally expensive effects and when you start pixel peeping (which is very easy in this case...) you will notice these things.

What would definitely solve your issue is to use DSR and render the game at 3840x2160, but of course that is not possible with these games on your GPU with sensible performance. Alternatively, you can try a slightly bigger resolution, i.e. a smaller DSR factor, and tweak sharpness until most of the pixelated patterns disappear. The bigger the rendering resolution, the better the effect will be - this would be the brute-force solution.

I am not sure how these games should look or if there is any option to mitigate this rendering quirk, so it might just be how the game renders. A lot of games have this, but some of them just hide it, e.g. Doom 2016 without AA has these dot patterns all over the place.
Definitely do not use any form of artificial sharpening in your GPU drivers or in game. Rather, try a different AA method in game. TAA should help if it is available, though it will blur the image somewhat - and if it does, using sharpening is not the solution: it never is.

Otherwise, because you have a large screen and a relatively low resolution for a computer monitor, you could sit further away from it. This is not really a proper solution but it might help.
 
A few things you can try:
Complete reinstall of drivers and reset of settings using DDU.
Try a different monitor.
Take pictures with a phone instead of screenshots. Screenshots won't show the monitor's deficiencies.
 
Oh wow, yeah, 1080p on a 32-inch screen is pretty horrific. I tried it first-hand one time and could not believe how bad it looked, even on the desktop, never mind in games. Even 1440p at 32 inches looks a bit blurry and grainy coming from a 27-inch 1440p.
 
woah, nvidia cards can use freesync monitors now?
They have always been able to use the monitor itself, just not FreeSync, of course. But many if not most current FreeSync monitors are "G-Sync Compatible".
 
See the picture where someone compared SMAA and TAA; the TAA looked a lot better. Can you enable that in the in-game settings? Also make sure the Nvidia Control Panel shows
Antialiasing - FXAA - Off
Antialiasing - Mode - Application-controlled
Antialiasing - Transparency - will need to try both options, but it probably should be off
and in-game set to TAA.

That should hopefully help. It looks pretty jarring with all of the grid lines; I can see why you want to get rid of it.
 
Oh wow, yeah, 1080p on a 32-inch screen is pretty horrific. [...]
Wouldn't 1920x1080 on a 32" screen look exactly the same as 1920x1080 on, say, a 23" screen? :)
 
What would definitely solve your issue is to use DSR and render the game at 3840x2160 [...]
I'll try putting GTA at 4K. Still, I used to have a 32" 720p TV as my monitor for a while, and this kind of problem didn't occur. The only problem I had back then was blurry textures, but no grid-like shadows and hair.
 
See the picture where someone compared SMAA and TAA; the TAA looked a lot better. [...]
I don't think GTA V has TAA, but I'll look into it anyway. I know Marvel's Avengers had it but, as I said in a post before, I played it on a Steam free weekend. My control panel already has these settings, so I guess my only bet is hoping that the games that have this problem also have TAA in their options menu haha. Anyway, thanks for trying to help.
 
A few things you can try: [...]
I've been avoiding reinstalling all drivers, but I guess that's the only option I have left. I can take pictures of the screen with my phone, but all the problems I see also show up in the screenshots, so I guess the problem is not exclusively with the GPU or monitor, but with their interaction.
 
not really... the DPI of the 32" is lower so pixelation will be more noticeable
Not if you correct your sitting distance so that both screens take up the same amount of your field of view.
Also not so much if you already see everything there is to see on the screen super clearly, though I suspect some people might be more bothered when stuff in their field of view is actually bigger vs. smaller... kinda like a very small tiger is just a cat 🤗
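For what it's worth, the pixel-density arithmetic behind this exchange (a quick sketch using the screen sizes mentioned in the thread): the same 1920x1080 grid spread over a larger diagonal means fewer pixels per inch, so each pixel, and each rendering artifact, is physically larger unless you sit proportionally further away.

```python
# Pixels-per-inch for the same 1920x1080 resolution on different diagonals.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal in pixels / diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

for size in (23.0, 27.0, 31.5):
    print(f'{size:>4}" @ 1920x1080 -> {ppi(1920, 1080, size):.1f} PPI')

# ~95.8 PPI at 23", ~81.6 PPI at 27", ~69.9 PPI at 31.5":
# the 31.5" panel's pixels are roughly 37% larger in each dimension than the 23" panel's.
```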

I guess games should look blurry, not full of grids :dead::dead:
Actually, not so much blurry as aliased.
Rendering works by sampling points, and if you sample as many points as there are pixels on the screen, you get aliasing. If you sample more points on the edges to smooth them, they won't appear blurry but more defined - this is called anti-aliasing. At some point in time games used this technique and it was called Multisample Anti-Aliasing, or MSAA for short. Another method is Supersample Anti-Aliasing, or SSAA, which is pretty much rendering everything at a higher resolution and scaling it down, and it is what you can use with DSR (Nvidia's Dynamic Super Resolution).
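A minimal sketch of the supersampling idea under simple assumptions (a plain 2x2 box filter here; the driver's own downscaling filter is what the DSR "smoothness" slider controls): render at twice the width and height, then average each 2x2 block down to one output pixel, which smooths edges without softening the native pixel grid.

```python
# Box-filter downscale of a 2x-supersampled frame (the brute-force SSAA idea).
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of a (H*2, W*2, 3) frame into one (H, W, 3) pixel."""
    h2, w2, c = frame.shape
    return frame.reshape(h2 // 2, 2, w2 // 2, 2, c).mean(axis=(1, 3))

# Pretend 'render' produced a 3840x2160 frame for a 1920x1080 display.
supersampled = np.random.rand(2160, 3840, 3)   # stand-in for a rendered image
native = downsample_2x(supersampled)
print(native.shape)   # (1080, 1920, 3) - 4 samples averaged per displayed pixel
```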

More modern AA techniques use post-processing filters to fake anti-aliasing, so MLAA, FXAA, SMAA, etc. all work on the fully rendered image and try to detect patterns which look like aliasing and smooth them out in clever ways. They do work miracles, but they make the image blurry. TAA is Temporal Anti-Aliasing; it also uses information from previous frames and various other tricks, and it works, but it blurs the image even more.
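The temporal part can be sketched as an exponential moving average over frames (a toy version only; production TAA also jitters the camera, reprojects the history buffer with motion vectors and clamps it to avoid ghosting, none of which is shown here):

```python
# Toy temporal accumulation: blend the current frame into a running history.
import numpy as np

def taa_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Blend `alpha` of the new frame with (1 - alpha) of the accumulated history.

    Over many frames the samples average out (less aliasing and shimmer),
    at the cost of some temporal blur - the trade-off described above.
    """
    return alpha * current + (1.0 - alpha) * history

history = np.zeros((1080, 1920, 3))
for _ in range(16):                           # 16 simulated frames
    current = np.random.rand(1080, 1920, 3)   # stand-in for a freshly rendered frame
    history = taa_accumulate(history, current)
```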

Most AA methods work quite well on the original aliasing problem, but not so much when it comes to aliasing and other artifacts caused by shader programs. So if your game uses a shader to fake the texture of a beard by creating a checkerboard pattern, most AA methods will struggle to smooth it out.

Now, what we always want is how the game looks with tons of SSAA. For example, if you have a 1920x1080 screen, you would be satisfied with how the game looks rendered at, say, 7680x4320 (literally "8K") and downscaled to 1080p. This is what all anti-aliasing methods try to fake. If you did that, all your jaggies, aliasing, checkerboards, etc. would be nicely smoothed out, but in a way that would not produce any blur whatsoever, just a higher-definition image. Obviously this is not gonna happen on a GTX 1070 🙃

...so you need to fiddle with settings. Maybe rendering the game at a slightly higher resolution plus another AA method will give good results.
The thing with these modern AA methods which blur the image is that once you add supersampling (which you can do using Nvidia DSR to enable higher-than-native resolutions), all the blur they add gets hidden. So even though e.g. FXAA or TAA add a considerable amount of blur, these AA methods should be enabled when using DSR, because you won't see the additional blur they cause and they will help eliminate some aliasing.

Maybe rendering at 2560x1440 would work with a good enough framerate, so it could be usable?
Imho worth trying out. Also, if you use DSR you have a slider for "smoothness", so you can fiddle with that, though I do not recommend setting it to zero because that only works well for the so-called "4.00x" factor, in which case on a 1920x1080 screen you would render the game at 3840x2160, 4x the number of pixels... which is not gonna happen on a 1070, I am afraid... or you can actually check how it will run and look :)
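For reference, a quick calculation relating the resolutions mentioned here to DSR factors on a 1920x1080 panel (a sketch: the DSR factor is the ratio of total pixel counts, so the 2560x1440 suggestion corresponds to roughly the 1.78x factor and "4.00x" is 3840x2160):

```python
# DSR factors are ratios of total pixel counts relative to the native resolution.
native_w, native_h = 1920, 1080

for w, h in ((2560, 1440), (2880, 1620), (3840, 2160)):
    factor = (w * h) / (native_w * native_h)
    print(f"{w}x{h} ~ {factor:.2f}x DSR factor on {native_w}x{native_h}")

# Prints: 2560x1440 ~ 1.78x, 2880x1620 ~ 2.25x, 3840x2160 ~ 4.00x
```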
 
There is no such thing as a 6-bit setting, and even if by some miracle the GPU sent 6-bit instead of 8/10-bit it would cause an entirely different issue. [...]
Well, this was with two EVGA 1080 Tis in SLI. Pascal initially did not support 10-bit with the gaming cards, only the professional cards. Later they supported it, but you had to manually select it.
https://forums.evga.com/gtx-1080-support-for-10-bit-display-m2510043.aspx

I may have remembered wrong; it could have been 8-bit + dithering. Anyway, for a period of time 10-bit was not available, which for this monitor caused visual problems making it unusable until Nvidia allowed 10-bit on their gaming cards.

Also tried this with a 1440p HDR monitor; Nvidia drivers made a mess of HDR and it looked terrible at times. It defaulted to 8-bit + dithering, and I was able to select 10-bit from the drivers, which cleaned up the banding and what looked similar to the OP's dirty look, but other issues with HDR were also present.
 
Maybe rendering at 2560x1440 would work with a good enough framerate, so it could be usable? [...]
I used DSR to upscale GTA V to 1440p and 4K. The corners were much sharper, and the game looked a lot more beautiful. Still, the grids were present. In GTA it's almost imperceptible (regardless of the resolution); I can only notice it in Michael's house, but in some games it becomes almost unplayable.
 
@im_shadows
I posted this over on Hardware Canucks, hope it helps some. Also, XoR_ has good suggestions. Since you do not have Marvel's Avengers anymore, I feel like a noob taking screens in that game. :)

Your PC gear is fine. PC games just need to have the correct settings for resolution and anti-aliasing. Not all PC games are perfect, especially at lower resolutions.

All screenshots were taken on Very High settings with different methods of AA. I did not use Nvidia Inspector to get better AA, but for the PC games you listed it would have similar results. Doh, forget the FSR screen - I just noticed it's the same stuff.

Screenshot attachments, all taken at in-game Very High settings:
  • 1080p, no AA
  • 1080p, SMAA
  • 1080p, TAA
  • 1080p, DLSS Ultra Performance
  • 1080p, DLSS Quality
  • 4K, TAA, no Ultra Textures
  • 4K, TAA, Ultra Textures

Also, GTA V has a known issue with the grid, as you say, and with shadows; you can google that.
Also, if you're wondering about 8-bit vs 12-bit monitors: they will all look the same in these games. I only know because I tested it when taking the screenshots on 8-bit/10-bit/12-bit monitors.
 
@noko
The topic of displays and bits is very complex, partly for historical reasons, and is affected by how Windows was developed, which makes it very hard to have anything more than 8 bits per color. Nvidia certainly made it all more complicated than it needs to be, with lots of issues that affected and still affect a lot of users, so let's say I believe there was some difference between 8-bit and 10-bit when you say there was, because it might have been some typical Nvidia bullsheet.

Know, however, that today the only additional banding Nvidia can introduce on an 8-bit connection comes from modifying LUTs, i.e. correcting gamma. Things like what monitor calibration suites do when you calibrate a screen (or even when you do Windows calibration) all change LUTs. The only calibration which does not is hardware calibration on those few monitors which have this feature. Banding happens because you try to map 256 levels onto 256 levels; some information loss is inevitable.
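A small sketch of why this happens (a toy gamma adjustment, not any particular calibration profile): pushing 256 input levels through a correction curve and rounding back to 256 output levels makes some outputs collide and skips others, which is what shows up as banding in smooth gradients.

```python
# Applying a gamma-style correction through an 8-bit LUT loses levels.
import numpy as np

levels_in = np.arange(256)                       # 8-bit input levels 0..255
gamma = 1.2                                      # hypothetical correction amount
lut = np.round(255 * (levels_in / 255) ** gamma).astype(int)

used = len(np.unique(lut))
print(f"distinct output levels: {used} of 256")  # fewer than 256 -> visible banding
```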

Otherwise all SDR applications seem to always output 8-bit.*
HDR also seems to be dithered to 8-bit on an 8-bit connection, so that is not an issue. Of course I do not recommend using 8-bit for HDR, or running 8-bit at all if more bit depth is possible, but if for any reason 8-bit was preferable, e.g. it allowed a higher refresh rate, that sort of thing, then it is fine to use it for HDR, with the note that it might lead to some noise which with HDR might actually become visible. Noise, not banding - this is a very important distinction. It would be dynamic, quickly changing noise, not static.

8-bit on Nvidia only has this LUT banding issue with "Full" dynamic range. With "Limited" they do use dithering to avoid banding, and this also means any changes to LUTs are dithered as well. Monitors typically do not handle RGB with Limited range, but they might handle YCbCr444 with Limited range correctly (not leading to a piss-poor contrast ratio or enabling 'TV processing' like sharpening filters and such), and this might then be the easiest solution, though it is not perfect: Limited range by design has fewer gradation steps between full black and full white, so it will show more of this dithering than dithering on Full range would - which, BTW, can be enabled on Nvidia cards and is enabled by default on AMD cards.
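To put a rough number on the "fewer gradation steps" point (assuming the conventional 8-bit video levels, where Limited range puts black at code 16 and white at 235):

```python
# Steps between black and white: Full range (0-255) vs Limited/video range (16-235).
full_steps = 255 - 0
limited_steps = 235 - 16
print(full_steps, limited_steps)   # 255 vs 219
print(f"Limited range has about {100 * (1 - limited_steps / full_steps):.0f}% fewer steps")
```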

As long as gamma doesn't need to be changed, it is best to block applications from changing it with "Override to reference mode"; then for SDR, 8-bit should be as good as 10-bit, and for HDR it should also look at least correct. This 8-bit + dithering you would not be able to detect with your bare eyes, let alone see any difference as an obvious visible issue.

But maybe there are exceptions, and if there are, please tell me how I can test them - for example, if any specific game or mode shows a visible difference.

*) This dithering of applications to 8-bit is by design, and it is why screenshots actually represent how games look. HDR screenshots are garbage, at least those made with the Print Screen button and pasted into something like IrfanView. And since all applications are dithered to 8-bit, professional applications like Adobe Photoshop are also affected, and this is where Quadro is still relevant. I am not sure whether Nvidia has unlocked true 10-bit for normal cards (or game drivers) to actually let such applications use 10-bit, but I also know that it would be impossible to tell just by looking at it with your eyes, so many people might actually believe they run at 10-bit when in fact they are not!

As long as you do not see dithering, you do not care.
As long as you do not care, any dithering won't bother you as much.
And when you know you run a 10-bit monitor at 8-bit and then see dithering, you will automatically assume what the culprit is. Let's be clear: most users, when they got 10-bit, didn't meticulously test which applications were affected in what way. They just enabled it and assumed all their issues were gone, so there was no need to even look for them - just as at 8-bit they assumed they had tons of banding, looked for it, and it bothered them because they knew it was a stupid bullsheet limitation.
 