Why don't people care about calibration?

Because it's a finicky, very time-consuming, and annoying process. If it were as simple as selecting settings, calibrating, putting the file somewhere, and having perfect calibration, I expect a lot more people would be interested.

And also if games wouldn't ignore it altogether.
 
And also if games wouldn't ignore it altogether.

Yeah, that's another thing too. I once calibrated an old Nokia monitor that had no drive or cutoff controls (not even under the case!). While the software calibration worked well, as soon as I started a game, said game would trash the calibration profile.

Now hardware calibrations - that's a whole other story. :)
 
Why is there still no official solution from Nvidia and AMD to keep the LUT from resetting in fullscreen mode?

It's been 10 years since the problem was introduced with DX10. There's a huge thread about this on the Nvidia forums and still no one gives a fuck. If DirectX and game developers insist on using a flat gamma curve in fullscreen mode, then there surely should be a way to force a custom LUT by means of the drivers.
 
Why is there still no official solution from Nvidia and AMD to keep the LUT from resetting in fullscreen mode?

It's been 10 years since the problem was introduced with DX10. There's a huge thread about this on the Nvidia forums and still no one gives a fuck. If DirectX and game developers insist on using a flat gamma curve in fullscreen mode, then there surely should be a way to force a custom LUT by means of the drivers.
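For anyone unfamiliar with what actually gets reset: the "gamma" a fullscreen game clobbers is just three 256-entry 1D lookup tables (one per channel) that the OS loads from the ICC profile's vcgt tag (on Windows via the GDI call SetDeviceGammaRamp). A rough, purely illustrative Python sketch of the two states - real calibration tables come from a probe, not a formula:

```python
# Illustrative only: the driver's 1D "gamma ramp" is three 256-entry
# tables of 16-bit values. A game "resetting gamma" effectively
# overwrites a calibrated ramp with the identity ramp below.

def identity_ramp():
    """Flat/linear ramp (output = input) - what a reset leaves you with."""
    return [min(i * 257, 65535) for i in range(256)]

def calibrated_ramp(native_gamma=2.2, target_gamma=2.4):
    """Toy grayscale correction: remap input so a panel with the given
    native gamma displays approximately the target gamma. A real vcgt
    table is measured per-channel with a colorimeter, not computed."""
    ramp = []
    for i in range(256):
        x = i / 255.0
        y = x ** (target_gamma / native_gamma)  # (x^(t/g))^g == x^t on screen
        ramp.append(min(int(round(y * 65535)), 65535))
    return ramp
```

When a DX10+ title goes exclusive fullscreen, the effect is equivalent to swapping `calibrated_ramp()` back out for `identity_ramp()`, which is exactly the behavior people want the drivers to block.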

Doesn't AMD do this? Something like "write DAC to palette"? Or some option like that. I have an AMD card somewhere - I need to pull it up and see what it does.
 
A lot of this reply I made in another thread applies here to the question.

You do not need to sit in the dark with a glossy monitor. The problem with lighting is that people typically set up their "computer studio" with the desk against the wall like a bookshelf, which acts as a catcher's mitt for direct light pollution no matter what type of coating the screen has. Computers have often been seen as something to stuff away somewhere, as opposed to how some people set up a nice TV "theatre" specifically for lighting, seating, and surround audio, or how people set up a photo studio.
I set my corner desk away from the corner, taking over the corner almost like a cubicle or command-deck type of thing. The room itself has plenty of lighting from floor lamps and a window, but none of it is above or in front of the monitor faces and desk. I even have a small lamp at each end of my long desk, in line with/adjacent to my monitors, but they aren't in front of them where they would have an angle of reflection.
Direct light pollutes any monitor and its color space, no matter what the coating.
[attached image: lcd-glare_ag-vs-glossy.jpg]


Variable lighting condition environments completely alter the way our eyes and brains perceive brightness, contrast and saturation - so if you don't maintain the same lighting conditions your settings are completely off when the room lighting changes. I keep lamps so that the daylight window lighting levels are maintained at night so that there is no major lighting level fluctuation. In my living room, I keep 3 sets of settings on my TV for different lighting conditions/times of day for the same reason.
Even hardware monitor calibration is usually done with the probe right up against the screen in a dark room. That is a good baseline, but once you change the lighting and use the display in your actual viewing environment, the way your eyes and brain see that calibrated state differs from what you calibrated, so you really should tweak it further to suit how your eyes see it in that lighting environment. Any direct light hitting the panel also pollutes the color space, as I said. And if you vary the lighting conditions in the room, your perceived settings fluctuate greatly: most notably, a brighter room yields a paler screen and poor contrast, while a darker room yields a brighter, more saturated image - i.e., it can become too much brightness and saturation.
 
[...] there surely should be a way to force a custom lut somehow by the means of drivers.
Not via the driver, but e.g. ReShade (if it supports the specific game) can be used to apply custom LUTs for full gamut correction, so you're not even limited to whitepoint/grayscale like via the graphics card 1D curves.
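For the curious, the core of any 3D-LUT correction (ReShade's included) is just a trilinear lookup into an N×N×N cube of corrected colors, done per pixel on the GPU. A hedged, illustrative Python version of that sampling step - function names are my own, not ReShade's:

```python
def sample_3dlut(lut, n, r, g, b):
    """Trilinear lookup of a color (r, g, b), each in [0, 1], into an
    n*n*n LUT where lut[i][j][k] is the corrected (r, g, b) tuple."""
    def axis(v):
        f = v * (n - 1)
        i0 = int(f)
        i1 = min(i0 + 1, n - 1)
        return i0, i1, f - i0          # lower index, upper index, fraction

    ri0, ri1, rt = axis(r)
    gi0, gi1, gt = axis(g)
    bi0, bi1, bt = axis(b)

    def lerp(a, b, t):
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    # Interpolate along the blue axis, then green, then red.
    c00 = lerp(lut[ri0][gi0][bi0], lut[ri0][gi0][bi1], bt)
    c01 = lerp(lut[ri0][gi1][bi0], lut[ri0][gi1][bi1], bt)
    c10 = lerp(lut[ri1][gi0][bi0], lut[ri1][gi0][bi1], bt)
    c11 = lerp(lut[ri1][gi1][bi0], lut[ri1][gi1][bi1], bt)
    c0 = lerp(c00, c01, gt)
    c1 = lerp(c10, c11, gt)
    return lerp(c0, c1, rt)

def identity_lut(n):
    """Identity cube: sampling it returns the input color unchanged."""
    return [[[(i / (n - 1), j / (n - 1), k / (n - 1)) for k in range(n)]
             for j in range(n)] for i in range(n)]
```

Because the lookup indexes all three channels at once, a 3D LUT can correct gamut and saturation, not just the per-channel grayscale that the graphics card's 1D curves are limited to.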
 
Not via the driver, but e.g. ReShade (if it supports the specific game) can be used to apply custom LUTs for full gamut correction, so you're not even limited to whitepoint/grayscale like via the graphics card 1D curves.

I mean, it shouldn't take much effort for AMD and Nvidia to implement a LUT-forcing option in their drivers, or for the DirectX developers to offer an option to ignore the reset-gamma-ramp DX call. It hasn't been done after 10 years because so few people actually care about this stuff.

You can apply a 3D LUT to your games through shader injectors or run in borderless windowed mode - and get banned from online play for it.
 
Yeah, I have to jump on the whole "Purdy =/= correct" camp.

I 'calibrate' insofar as making sure I get a full range from my monitor; I don't really try to get photo-accurate, as all of my artwork is consumer-level and digital. If I were working at pro level or for print, I would care much more about 101% accurate calibration.
 
The general population of gamers is clueless about calibration, and they do not want to spend money on a good probe or time calibrating their displays. They also do not realize the importance of calibration and how it affects image quality. It's kind of funny, actually: they chase the best eye candy they can get from their hardware, or upgrade hardware to get better eye candy, yet they don't care about seeing an accurate image that shows games the way they were developed.

Most gamers are ignorant and buy "Gaming Hardware" like gaming headphones or gaming monitors or whatever else has "Gaming" in it, even though when it comes to audio and displays, gaming devices usually suck compared to the ones without "Gaming" on their label. For example, Sennheiser gaming headphones SUCK in comparison to Sennheiser HD 600 (non-gaming) headphones, which sound incredible in films and games. The same goes for monitors: the VG248QE, "The Best Gaming Monitor of 2014," has HORRIFIC image quality! PC hardware knowledge is just one competency; audio hardware knowledge and calibration is a separate competency, and so is display hardware knowledge and calibration. A single professional display calibration costs about $400-500 because it is worth it, so gamers would want to learn how to do it for free - though they would still have to buy a good probe.

Just a few months ago, it was impossible to use software calibration to achieve both accurate grayscale AND an accurate colorspace in games. You had to create LUT / ICC profiles, which applied only the grayscale calibration to games, and most of the time you had to use LUT / ICC profile enforcers such as CPKeeper and/or Monitor Calibration Wizard - and not every game worked with those tools. You often had to use borderless window mode, which could create stutters. Only a handful of games allowed the use of LUT / ICC profiles in normal fullscreen mode. TODAY, however, you can create a 3DLUT (with the help of ArgyllCMS + dispcalGUI and the ReShade Framework) that fully and very accurately applies both grayscale AND colorspace correction to 99% of games, in ANY mode (window, borderless window, fullscreen). So far only 2 games I own do not work with ReShade: MGS V - Ground Zeroes and MGS V - The Phantom Pain. MGS V does not work with ReShade simply because of its Denuvo protection, which very few games have.

Just a year ago, HTPC enthusiasts had to pay $500+ for hardware 3DLUT devices to get that excellent accuracy in films and games, but today it is free with madVR 3DLUTs for film playback and ReShade 3DLUTs for games. Yet gamers are still resistant to getting the best out of their displays.
 
Do you calibrate your speakers? I mean real room correction that can get the frequency response razor-flat, not just twiddling a GEQ. Do you have an accurate calibration mass for your kitchen scales?

People who use their monitor as a tool need it calibrated. For everyone else, it doesn't really matter as long as it looks ok.

Pretty much this. On top of that, most people don't have the hardware to properly calibrate their displays.
 
The most hilarious thing is seeing all those cutting-edge, 16 GB of RAM, multi-GPU gaming rigs finished off with crappy 1080p 144 Hz gaming monitors.

It's like those people don't realize that the monitor is the most crucial part of the system, and shitty picture quality will render a powerful GPU setup a waste of money.
 
The most hilarious thing is seeing all those cutting-edge, 16 GB of RAM, multi-GPU gaming rigs finished off with crappy 1080p 144 Hz gaming monitors.

It's like those people don't realize that the monitor is the most crucial part of the system, and shitty picture quality will render a powerful GPU setup a waste of money.

Maybe to you. Everything has trade-offs. Some people would rather have lower latency and higher refresh rates (which are arguably much more important in a gaming setup) than color accuracy.

Unfortunately, there is no one-size-fits-all solution.
 
1080p also has its own perks, something higher resolution monitors very rarely enjoy.
 
Responsiveness and refresh rate are more important if the only games you play are CS:GO, Quake Live, and MOBAs.
 
PQ....:confused:
I'm posting this again after seeing people downplay how much graphics and aesthetics are compromised by medium-to-low fps/Hz in highly dynamic FoV-motion 1st/3rd-person gaming, and after comments implying that high-Hz, low-response-time monitors only have twitch-gaming advantages.
During FoV movement at low fps/Hz you aren't even getting what could be considered a solid-grid resolution to your eyes, so I find the aesthetic comments hard to take seriously. You don't play a screenshot.
This is about aesthetics too; I hate when people say that high fps + high Hz is only for competition/twitch.

100fps-hz/120fps-hz/144fps-hz:
~40/50/60% blur reduction,
5:3/2:1/2.4:1 increase in motion definition and path articulation,
g-sync rides the fps graph +/- without screen aberrations
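The figures above follow from simple sample-and-hold arithmetic against the 60 Hz baseline. An illustrative sketch (note that 144 Hz works out to ~58%, rounded to ~60% in the list):

```python
# Arithmetic behind the figures above, assuming ideal sample-and-hold:
# each frame persists for 1/hz seconds, so blur relative to 60 Hz scales
# as 60/hz, and "motion definition" (unique frames of action per second)
# scales as hz/60.

BASELINE_HZ = 60

def blur_reduction(hz):
    """Fraction of 60 Hz sample-and-hold blur removed at a given refresh."""
    return 1 - BASELINE_HZ / hz

def motion_definition_ratio(hz):
    """Unique action slices per second relative to the 60 Hz baseline."""
    return hz / BASELINE_HZ

for hz in (100, 120, 144):
    print(hz, round(blur_reduction(hz) * 100),
          round(motion_definition_ratio(hz), 2))
# 100 -> 40% blur reduction, 1.67x (5:3) motion definition
# 120 -> 50%, 2.0x (2:1)
# 144 -> 58%, 2.4x (2.4:1)
```

This is an idealized model: real panels add pixel response time and overdrive artifacts on top, so measured blur reduction varies by display.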

That said, I'm fine with running very high settings (rather than the arbitrarily set ultra ceiling) to achieve a 100 fps+ playing framerate, utilizing G-Sync to ride the fps graph +/-, which would go into the 120s and 130s on the high end.

<snip>

The tradeoffs are:

An increase in static picture quality (color, uniformity, and on some models ppi and resolution), but suffering a viewport considerably more "out of focus" every time you move your FoV in game, move quickly, or both - moving your "head" in game while moving quickly - which happens all the time in 1st-person and 3rd-person games, flight games, and car games. Keep in mind, though, that most (if not all) games don't use all the colors available on wider-gamut monitors, so the potential "color advantage" becomes more about uniformity where games are concerned.

On 120 Hz+, low-response-time, modern RTC gaming monitors, motion at speed has a full "soften blur" effect more within the "shadow mask" of onscreen objects and architecture, but it still takes out all detail and textures (50% blur reduction vs. the baseline 60 Hz blur).

On 60 Hz models: worse blurring "outside of the lines" during motion (smearing). Additionally, the loss of the much greater motion tracking and smoothness of motion (separate from blur-reduction considerations) that high Hz at high fps provides, along with its corresponding accuracy and more recent action shown. Far fewer dots per dotted-line length (unique slices of action), fewer windows into what is currently happening in the game world per second.

On higher-resolution screens: potentially very expensive GPU requirements for modern games if you want to play them at very high, very high+, or max settings - especially if you have a 120 Hz+ panel and want to get anywhere near filling the Hz with more recent action slices than 60 Hz provides.

I think the way superior motion tracking looks and feels at very high fps on a 120 Hz+ screen, as well as the blur reduction, are both very visual and aesthetic besides providing increases in accuracy - so it bothers me when people don't consider them part of picture quality. A 60 Hz IPS can look great in still photos of games and is great for desktop work, but during movement, especially at 60 Hz, it is slush and rips the focus and detail away from your eyes.
I try to use descriptive language to convey how, relatively speaking, you are continually moving the game world around your viewport when you are movement-keying and mouse-looking in 1st/3rd-person games. The differences in the tech go way beyond what the simple mouse-pointer motion-articulation example photos can show, and way beyond what a single cel-shaded UFO object's blur in a chase-cam photo can show. It is degrees of motion of the entire game world moving in your viewport relative to you, and the blur (smearing blur at worst) of the entire scene - every object in the game world, every high-detail texture, depth via bump mapping, etc. - at any moderate or higher speed during movement/looking.

People are infatuated with graphics detail in still shots (e.g. 4K), but you don't play screenshots.
If you are running a low-to-medium fps/Hz setup - for example, using variable Hz at 1440p to run low (sub-75 to 90 fps/Hz most of the time in game; it really should be at least 100 imo) - you are essentially running a low-Hz, low-motion-definition, low-articulation, smearing-blur monitor and missing out on most of the advancements modern gaming monitors provide, outside of the judder/tearing/stops avoidance of G-Sync models. For most gamers, a dynamic-Hz 1080p gaming monitor would be a lot better for their GPU budget.
 
That's why I got a 4K monitor with FreeSync and switched to AMD. Thanks to those who mentioned PowerStrip; gonna give that a try soon.
 
..<snip>..
. TODAY, however, you can create a 3DLUT (with help of ArgyllCMS + dispcalGUI and ReShade Framework) that can fully and very accurately apply both - grayscale AND colorspace to 99% of games, using ANY mode (Window, Borderless Window, Fullscreen). So far only 2 games I own do not work with ReShade: MGS V - Ground Zeroes, and MGS V - Phantom Pain. MGS V does not work with ReShade simply because of its Denuvo protection, which very few games have.

Just a year ago, HTPC enthusiasts had to pay $500+ for hardware 3DLUT devices to get that excellent accuracy in films and games, but today it is free with madVR 3DLUTs for film playback and ReShade 3DLUTs for games. Yet gamers are still resistant to getting the best out of their displays.

I'll have to look into that. It would have to work with SLI, G-Sync, and borderless fullscreen, as well as not choke on multi-monitor setups (only the primary monitor for gaming + G-Sync).
 
And also if games wouldn't ignore it altogether.

Yes, that's what I meant: if you could just apply a calibration and have it work perfectly with everything, that would be good, but unfortunately it's nothing like that.
 
"Motion articulation and definition" is overhyped to me. I used to have a 120hz gaming monitor and now I'm back to a "slow" 60hz display with trails and ghosting (which you can only see on moving text).

And I don't notice anything in games that matter to me, at all, it's smooth enough. Then again, I don't play competitive shooters.
 
"Motion articulation and definition" is overhyped to me. I used to have a 120hz gaming monitor and now I'm back to a "slow" 60hz display with trails and ghosting (which you can only see on moving text).

And I don't notice anything in games that matter to me, at all, it's smooth enough. Then again, I don't play competitive shooters.

That sounds like you completely ignored what I posted about exacerbated motion blur and low motion definition having a very bad effect on PQ in 1st/3rd-person gaming. I even directly stated that it is not just about competitive snap and went into detail about how it makes a big difference aesthetically.

the way superior motion tracking looks and feels at very high fps on a 120hz+ screen, as well as blur reduction are both very visual and aesthetic besides providing increases in accuracy - so it bothers me when people don't consider them as part of picture quality. A 60hz ips can look great in still photos of games and is great for desktop stuff, but during movement, especially at 60hz, it is slush and rips the focus and detail away from your eyes.

the blur (smearing blur at worst) of the entire scene - every object in the game world, every high-detail texture, depth via bump mapping, etc. - at any moderate or higher speed during movement/looking.

Again, you aren't seeing what a high-Hz monitor can do if you aren't supplying a fairly high, consistent frame rate with your GPU power and graphics-settings choices.
100fps-hz / 120fps-hz / 144fps-hz:
~40% / 50% / 60% blur reduction,
5:3 / 2:1 / 2.4:1 increase in motion definition and path articulation,
g-sync rides the fps graph +/- without screen aberrations
These are huge increases in motion clarity (much less blur, versus low fps/Hz looking smeared and "underwater" out of focus during motion), motion definition, smoothness, and path articulation (potentially even animation-cycle definition compared to lower fps/Hz).
It is a very aesthetic thing. 60 Hz is a mess by comparison. In 1st/3rd-person games you are continually moving the entire game world around in your viewport relative to you. During FoV motion at speed, to your eyes it's not even definable as a solid-grid resolution, due to smearing of the entire viewport.

It's something like wearing goggles filled with a viscous fluid: every time you move your "head," the screen smears and goes out of focus. At 100 to 120 fps/Hz this is minimized to more of a soften blur within the shadow masks of onscreen objects - toward 144 fps/Hz even tighter - and the image snaps back to full clarity/focus a lot sooner during lower-speed parts of FoV movement.

In another analogy, regarding motion definition: it is as if you are watching the action under a strobe light wearing special glasses which, rather than showing blackout periods, make you see freeze-frames of the last frame of action frozen in place during the strobe's blackout period. At lower fps/Hz this is like molasses by comparison.
[attached image: 60hz-120hz-30-15_onesecondframe.jpg]


Not only are individual onscreen entities moving with greater motion clarity, definition, and articulation, but the entire game world is continually moving around relative to you during FoV movement in 1st/3rd-person gaming. So the entire viewport/game world/virtual camera movement gets the benefits of greater motion clarity/focus and motion articulation/smoothness (or suffers greatly from the lack of them by comparison).

If you are running a low-to-medium fps/Hz setup - for example, using variable Hz at 1440p to run low (sub-75 to 90 fps/Hz most of the time in game; it really should be at least 100 imo) - you are essentially running a low-Hz, low-motion-definition, low-articulation, smearing-blur monitor and missing out on most of the advancements modern gaming monitors provide, outside of the judder/tearing/stops avoidance of G-Sync models. For most gamers, a dynamic-Hz 1080p gaming monitor would be a lot better for their GPU budget.
 
"Motion articulation and definition" is overhyped to me. I used to have a 120hz gaming monitor and now I'm back to a "slow" 60hz display with trails and ghosting (which you can only see on moving text).

And I don't notice anything in games that matter to me, at all, it's smooth enough. Then again, I don't play competitive shooters.

60 Hz feels "smooth" enough, but it does leave much to be desired in terms of motion clarity. I always aim for at least 90 fps in games and that's good enough for me; anything higher is just icing on the cake, and I'm not willing to pay a chunk of cash for another GPU to get a constant 100+ fps. It would be nice to have, but it's not worth it for me.
 


Then I simply don't notice poor motion performance as much as you do. I've just looked closely at my displays again, including my VA TV: the colors in motion look the same to me, and the out-of-focus effect doesn't bother me at all while playing - I just don't notice it. I can't even feel the 16-20 ms input lag on the TV. 60 Hz feels smooth enough even after playing games at 120 fps on a 120 Hz monitor; it just takes a bit of readjustment. Getting recent games to 120 fps at 1080p would require a multi-GPU setup, which is not feasible for a lot of people - either that or lowering settings and ending up with worse-looking visuals. Not to mention that many console ports tie their engine mechanics to fps, and running them at 120 fps will just break things like the physics engine or speed up in-game time; Skyrim and Dark Souls, for example.

I will never buy a gaming monitor again. I don't want to run games in windowed mode anymore or end up banned for using LUT injectors. I also can't tolerate W-LED backlights - they're killing my eyes - and what is used in 1080p gaming monitors is probably bottom-of-the-barrel quality in terms of spectral distribution.
 