Calibration Question

JNolan93

n00b
Joined
Aug 20, 2015
Messages
12
Hello everyone,

Alright, so after much thought, NCX convinced me in the Dell S2415H thread to buy an HP 25xw monitor. The picture is crisp and clear and I like it. Here's the problem: it has an annoying yellow tint, and it makes me not want to use the display.

My question is, will calibration fix that tint and improve the other colors? NCX calibrated his monitor with the X-Rite i1Display Pro. I was wondering if something cheaper would get the job done, like the X-Rite ColorMunki Smile. I'm not a photographer, so I don't need professional-grade accuracy; I just want the yellow tint gone and the colors to look better and more presentable.

Thanks everyone!
 
Hi.

It doesn't really matter whether you do photography or not, and this isn't only about accuracy. Monitor manufacturers and users both have bad habits: the former sometimes to sell more units, sometimes because they don't care; the latter because they don't know better. And those bad habits aren't healthy.
So you'll gain not only a better representation of colors across a color space, you'll also be taking care of yourself.

About colorimeters:
Keep in mind that some cheap colorimeters wear out over time, and the ColorMunki Smile is one of them. Glass-filtered models hold up better, and throwing money away is never a good choice.

From there we could get into the debate of Spyder versus X-Rite and so on, plus the various software packages, but that's a wasp's nest, so better to leave it where it is; for your purposes one is about as good as another.

http://www.color-management-guide.com

This site is pretty good and well written. I'd suggest reading its guide to calibration, and it also has a brief list of colorimeters (a small buying guide); I think you'll find something for your budget.
If you have questions afterwards, I'm often here.
Bye.
 
Sorry, but that's bad, bad advice you gave him, for a lot of reasons. Just to mention a few without getting technical:

1) The display could be set closer to 5000K instead of 6500K. That's checkable in the OSD. Correcting the color temperature may fix the "tint" he sees, but not any channel shift that's already present.

But he asked to end up with a presentable result, and to me that simply means a basic correction done right. A colorimeter is the only tool for doing that.

2) I can't find one reasonable reason to mess with the GPU controls when the monitor has proper controls of its own. Only on the ultra-cheap Korean monitors might one have to touch contrast from the GPU (not to mention that stacking one error on top of another rarely produces a correct result).

3) Why do things by eye and end up with something approximate, to be fair, or a mess, to be honest?

Every monitor is off when you buy it. That's a matter of fact, and not only in color. Approximate eyeball correction ends up killing color reproduction: banding, dithering, excessive brightness, an off black point, crushed dynamic range...

Better the colorimeter...
 
In the OSD controls, set RGB to the maximum possible values. You might need to enter the service menu to do that, most likely by holding the menu button while powering the monitor on with the power button; then, instead of the "language" option, there will be service settings.

With RGB at maximum you get the maximum contrast ratio your panel can possibly have. It will most likely be on the bluish side, so you reduce blue. Generally, tinker with it until it is white to *you* and the image is likable without any tints. Just keep at least one RGB value at maximum to keep the highest possible contrast ratio for the chosen white point.

A calibration probe is needed for professional work; for normal home use, especially with a single monitor, it isn't really necessary. With two monitors, trying to match things by eye can be daunting and often impossible. But at home, with a single monitor that isn't completely screwed up, using your eyes is a good cheap solution.

In any case, the ability and knowledge to set the white point with the monitor's RGB controls is useful even when using a calibration probe, because that way you maximize the contrast ratio. Also, the LUT changes that calibration software uses to correct colors are usually not preserved in games, so if you don't use the monitor's controls you could calibrate the display to a perfect 6500K with a probe and still have a yellow tint in games.
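To make the "keep one channel at max" idea concrete, here's a rough sketch of the arithmetic (the channel numbers are made up for illustration, not measurements from any real panel):

Code:
# Hypothetical illustration: given relative luminances of the R, G, B
# channels at full gain, compute OSD gain settings that move the white
# point toward a target balance while keeping the strongest channel
# at 100% (preserving maximum contrast ratio).

def rgb_gains(measured, target):
    """measured/target: (r, g, b) relative channel luminances at white."""
    raw = [t / m for m, t in zip(measured, target)]
    peak = max(raw)
    # Normalize so the largest gain is exactly 1.0: leave that channel's
    # OSD control at maximum and only lower the other two.
    return [round(r / peak, 3) for r in raw]

# Example: a bluish panel being pulled toward an equal balance.
print(rgb_gains(measured=(0.95, 1.00, 1.10), target=(1.0, 1.0, 1.0)))
# -> [1.0, 0.95, 0.864]: red stays at max, blue is reduced the most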
 
If it were properly calibrated to D65, the issue would not be that your new monitor is "too yellow" but that your previous uncalibrated displays were too blue.
 
zone74, no, it seems he never calibrated it; it's new.
It could be that his previous monitor was closer to 7000K, so this one, which may be preset at 6500K or a bit below, appears yellow to him.
Calibrating would solve the issue for him, or at least open his eyes to color.


XoR, sorry, but your way is as wrong as staticlag's.
With your way he will end up with:
1) an unknown setup, since the white point, black point, and brightness will all be arbitrary;
2) with all RGB channels at max the result would be awful, not just incorrect. He would end up with a badly offset monitor: a totally arbitrary color space, at best reduced to the panel's native specs, at worst shifted along one axis, so he'd lose some colors and have others oversaturated.

I can't understand why this bad habit of doing things by eye persists when the result is obviously going to be wrong; maybe somewhat pleasant, but utterly wrong.

Calibration is not only for photographers, and not only for two or more monitors. It's good for everyone, and the reasons are simple and easy to explain: calibration lets you set up the monitor correctly and enjoy its proper range. Even without loading the LUT, the result would be decent.

Calibration by eye is always wrong, no matter how highly one rates one's ability to tinker with the setup; you will lose colors at best.

Calibration allows:
[image: SpaceGame+UncalibratedvsCalPC.jpg, game color space on an uncalibrated vs. a calibrated PC]


And there are plenty of ways to keep an ICC profile loaded while gaming, e.g. PowerStrip, to mention one.

Now, going a bit into the wonderful world of those who work with color: matching two monitors that can't emulate another display (Quato has this capability, Eizo too) is truly tough work; even two units of the same model never match 100%.
Usually you have one emulate the output of the other (if we don't want to talk about LUT boxes).
 
zone74, no, it seems he never calibrated it; it's new.
Oh you're right, I completely misread that as having bought a pre-calibrated monitor. Not quite sure how I managed that.

Proper calibration should certainly remove any kind of color tint from the screen, though the D65 standard is actually slightly blue-tinted. (Illuminant E, which is closer to D55, would be "neutral".)

I don't know if I'd recommend the ColorMunki Smile. It might "do the job" of getting rid of your color tint, but it won't necessarily be accurate across multiple displays if you have more than one to calibrate.
The best-value meter on the market right now is the current revision of the i1Display Pro, which starts at about $200 on Amazon. There also appears to be a $170 ColorMunki package using the same meter, but I don't know much about it, and I'd be more inclined to spend the extra $30 for the Pro package.

Then again, your HP 25xw is a $160 monitor, and if you only care about getting rid of a color tint, rather than trying to use it for any kind of production work, I doubt most people would think it's worthwhile to spend $200 on calibration hardware.

It's not going to be accurate, but I'd be more inclined to suggest that you either select another color temperature preset on the monitor, increase the blue or decrease red/green if it has custom color temperature options, or try to find a user with a calibrated display or a review that has recommended settings you can copy.

That's not something I would normally be recommending, but while I personally have spent a lot more on calibration hardware than any individual display that I own, I do realize that most people would not.
 
zone74, a warning about the ColorMunki Smile: it's gel-filtered, and those get ruined in 5-6 years and have to be serviced (remember to check on that).

I hadn't looked at the price of the monitor. Oh well, it's about $190 on Amazon, but as you say, the calibrators outlast the monitors.

I noticed that pcmonitors.info warns: "The S2415H provided an image that was fairly inviting out of the box. Bright, without being retina-scorching and with a noticeable cool tint that upset overall balance."

Now our OP is complaining about a yellow tendency, so I suppose his monitor is in the wrong mode...
 
@nifft
To have proper results one needs to:
- correct the gamut
- correct the white point and black point
- correct the gamma response

Brightness doesn't matter at all. Black point correction can be safely skipped with minimal impact on image quality. Gamma response is usually good enough. The gamut is almost always better profiled in the monitor manufacturer's ICC files and EDID than you can manage with most probes. As for the white point, if you really read about calibration (instead of looking at lame advertising images), it doesn't need to sit at any specific coordinate for viewing purposes; 6500K and the other standard values are only for content creation, the same as sRGB gamma, which isn't really recommended for viewing either.

One thing to note is that most people use an Nvidia or Intel GPU, which can't apply calibration without adding banding and has no way to correct gamut in all applications. Changes to the LUT actually degrade image quality by introducing banding; you need a really screwed-up gamma response before the correction is worth the extra banding.

Calibration is neither as perfect nor as necessary as you say. I'm not even mentioning that actual instrument performance is 'meh...' at most. For it to work, the whole operating system would need to support it properly, with gamut remapping, banding-free gamma correction, and coverage of all programs and games. As things stand, one is much better off not descending into this calibration hell.

Conclusion: software calibration is a joke and sucks donkey's ass.
For proper colors one buys an NEC, EIZO, Quato, HP DreamColor, etc. monitor and uses hardware calibration instead. A calibration probe is usually included with these :D

BTW, I recommended setting not 255/255/255 but one of the colors to 255 (red in the case of a W-LED backlight) and reducing the other two until the proper white point is achieved. It is very unlikely that the monitor clips; most likely it already has the red channel at 255.
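To see why LUT changes add banding, here's a small sketch (assuming an 8-bit LUT with no dithering; the 0.86 gain is just an example value):

Code:
# Hypothetical illustration: correcting a bluish white point in the
# GPU's 8-bit LUT (no dithering) instead of the monitor's OSD merges
# input levels together, which shows up as banding in blue gradients.

BLUE_GAIN = 0.86  # scale blue down to warm up the white point

lut = [round(v * BLUE_GAIN) for v in range(256)]

print(f"unique output levels: {len(set(lut))} of 256")
# -> 220 of 256: the 36 lost levels become flat steps (bands)
#    in what should be smooth blue gradients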
 
XoR, this has turned into a really interesting "conversation"; I hope my horrible English won't fail me too much (it would be easier in my own language, so if I'm not clear, let me know).

You made me think for a second with "for viewing purposes". I admit I'm a bit biased on that matter, as I've done specific color tasks (I didn't need a Quato for nothing).

OK, let's state, just for a second and without going off topic, that we know our vision is not linear at all, and also that the luma or brightness of a color isn't determined until we view it in context, our iris sets its aperture, and the light hits our retina.

Starting from the end: the i1 Pro is not too biased. I wouldn't compare it with a Minolta, or a Barbieri if we move on to print, but you're a bit too severe; calibration for the masses remains better than nothing.

I don't agree with some of the things you wrote.
Brightness matters. To be totally honest, it would matter more in relation to the environment around the viewer, but allow me not to get into that; in our case it matters mostly for eyestrain. There are standards, and they work.
You can surely acknowledge that luma is relative, not determined until we view it and our iris sets its aperture and it hits our retina; again, a brightness standard lets us fix a reference that is *good enough* to be treated as universal. Brightness changes color perception, in relation to how we sense colors against light.

For gamma we could debate L* versus the 2.2 baked into monitors, and we could argue that the 2.2 "standard" is a lie (CRTs have a natural gamma of about 2.5).

But again, we're talking about an office-grade monitor, and I doubt that what gets stocked into its LUT is all that accurate. In our case, for the OP, the best I can suggest is 2.2 and 6500K, which are the de facto universally accepted standards and the ones most commonly found among monitor presets (and, to me, always off). A standard is something everyone can follow, with predictable results; otherwise we enter the perceptual domain.

As for sRGB, I suppose we could again debate a color space by its observer, but in our case it is the standard for viewing the majority of content on the web, fixed into the panel. I know as well as you do that there's not enough green. I'm thinking about getting the most out of his HP monitor, and that won't happen by eyeballing.

I didn't know that about Nvidia; interesting. Curiously, just yesterday I started reading some specs on how the two companies manage color. Could you point me to something about this Nvidia flaw?

If we're talking about professional work, I can follow you on the need for a properly set-up environment (it's not that much of a hell; once done, it's done almost forever). But in this case, for our OP, rather than tweaking without knowing exactly what he is doing and hoping for the best, I still prefer at least a standard calibration (and even soft proofing would suffice with the 6-bit panel he owns).


P.S. I can agree that soft proofing for work has its limits, but there are always compromises to make somewhere.
A couple of friends of mine who worked with Flame, for TV shows, could tell stories that would make your hair stand on end...
P.P.S. You know poor JNolan will get a headache reading all this?
It was a pleasure discussing this with you, XoR.
 
Thank you everyone for your responses. I took a while to get back to this thread; after reading the link that nifft posted, I realized it wouldn't be smart to buy a cheap option like the ColorMunki Smile.

I installed the disc that came with my monitor, and there are calibration tools, but it's an eyeball test. It says my monitor's color temperature is already at 6500K and calls it "standard". However, it still has the greenish-yellow tint.

I know it's a $160 monitor (it was $220 when I bought it a few months ago), so I shouldn't expect perfection from a budget monitor, but I read NCX's review and he told me these HPs are superior to the other monitors I was looking at in my price range. I also read his review here, where he went into great detail about everything:

http://wecravegamestoo.com/forums/monitor-reviews-discussion/15935-hp-27cw-hp-25xw-review-almost-glossy-overclock-able-1080p-6-bit-frc-lg-ah-ips.html#post1402101

According to his settings, once calibrated it is suitable for professional use, so I figured if it's good enough for that, then it's good enough for accurate color in gaming lol.

I've done some color adjusting on my graphics card (AMD) in the past on my old monitor, but it didn't fix anything.

I appreciate everyone's help and I hope for more conversation about this. I want to buy the i1Display Pro; will that pretty much solve all my problems?

Sorry for not replying to you guys individually, but I do appreciate the help!
 
I appreciate everyone's help and I hope for more conversation about this. I want to buy the i1Display Pro; will that pretty much solve all my problems?
Yes, if you don't mind spending as much as the monitor on that hardware.
Of course the i1Display Pro can be used on every display that you buy from now onwards, and it means that you know things will definitely look correct.

-----

Brightness doesn't matter at all.
The modern target for SDR is 100 nits.
Or 80-120 depending on which old standards you were looking at. (sRGB: 80 nits, SMPTE: 120 nits, EBU: 100 nits)
Black point correction can be safely skipped with minimal impact on image quality.
Only if you don't care about shadow detail being clipped.
You should care about shadow detail being clipped.
Gamma response is usually good enough.
Only in mid-range and up. Inexpensive displays usually have a noticeably inaccurate gamma response.
The gamut is almost always better profiled in the monitor manufacturer's ICC files and EDID than you can manage with most probes.
That will give you the primaries. It won't measure the full gamut.
As for the white point, if you really read about calibration (instead of looking at lame advertising images), it doesn't need to sit at any specific coordinate for viewing purposes; 6500K and the other standard values are only for content creation, the same as sRGB gamma, which isn't really recommended for viewing either.
Print work is usually done at D50 or D55. Everything else is created with and intended to be viewed at D65.

One thing to note is that most people use an Nvidia or Intel GPU, which can't apply calibration without adding banding and has no way to correct gamut in all applications. Changes to the LUT actually degrade image quality by introducing banding; you need a really screwed-up gamma response before the correction is worth the extra banding.
I agree that calibration via the GPU LUT is a bad idea. However you still need a probe to calibrate any display, and most displays at least offer things like white balance controls, which would remove a "yellow tint" for example.
As I said above though, I'm not sure that it's worth spending that kind of money in this situation.

For proper colors one buys an NEC, EIZO, Quato, HP DreamColor, etc. monitor and uses hardware calibration instead. A calibration probe is usually included with these :D
The probe is usually an optional extra, and these days everyone is using profiled i1Display Pros.
And again: the display in question here was a $160 monitor. What you're talking about is typically >$1000.

For gamma we could debate L* versus the 2.2 baked into monitors, and we could argue that the 2.2 "standard" is a lie (CRTs have a natural gamma of about 2.5).
L* is a weird thing that I've only ever seen photographers talk about.
I understand the concept behind it, but I'd never recommend that anyone use it.
Stick to a flat 2.2 gamma for any computer/graphics work. Or use BT.1886 if you want the modern equivalent.
Avoid sRGB. No-one ever actually used the literal sRGB transfer function, since most calibration packages didn't even offer it until a few years ago; they just used 2.22 for their "sRGB" preset.

For video/film work/viewing, and that's the only place it matters, CRTs are closer to 2.35 when properly calibrated. The EBU have tech papers on this if you want to read up on it.
BT.1886 is the modern standard to use; it applies JND-based black-level compensation when your display's native contrast ratio is below 10,000:1.
Above 10,000:1 native contrast, the target is a flat 2.40 gamma.
There is also a "strict" CRT simulation curve defined, but 2.40 is generally sufficient for most applications.
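For reference, BT.1886 is straightforward to compute yourself. A quick sketch using the formula from the standard, with example luminances of my own choosing (100 cd/m² white, 0.1 cd/m² black, i.e. 1000:1 contrast):

Code:
# ITU-R BT.1886 EOTF: black-level compensation for displays with
# finite contrast. The luminances below are illustrative choices.

GAMMA = 2.4

def bt1886(v, lw=100.0, lb=0.1):
    """Map a normalized signal v (0..1) to luminance in cd/m2."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

for v in (0.0, 0.1, 0.5, 1.0):
    print(f"V={v:.1f} -> {bt1886(v):7.3f} cd/m2")
# As native contrast rises past 10,000:1, b tends toward zero and the
# curve approaches the flat 2.4 power law mentioned above.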
 
Print work is usually done at D50 or D55. Everything else is created with and intended to be viewed at D65.
D50 is the communication standard for the ICC workflow. The characterization data is recorded under that illuminant for reflective measurements (by offsetting the reflection values against its constructed SPD) or, in the emissive case, chromatically adapted from the actual display white point (which yields the corresponding colorimetric values under the new illuminant).

But it is important to note that neither will a D50 display calibration give a sufficient visual match to a corresponding standard-light (normlight) setup (observer metamerism; the influence of OBAs, especially regarding differences in UV content between measurement and reproduction; deviations in the SPD of real-world light sources compared to the constructed D50 spectrum used for offsetting the reflection values...), nor is D65 needed for a correct reproduction of working color spaces originally defined relative to that white point.

Besides the adaptation mechanisms of the eye, I would like to emphasize the effects of observer metamerism. I have a two-screen setup, adjusted to a standard-light color match. To get a visual match in neutral tones between the two screens, the white points differ by about dE = 10 for the CIE standard observer.
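For anyone wondering what that chromatic adaptation step looks like computationally, here is a minimal sketch using the standard Bradford transform (the white points are textbook values; the display white here is just a stand-in, not a measurement):

Code:
# Standard Bradford chromatic adaptation: re-express XYZ values measured
# relative to the display's white point as values relative to D50, the
# ICC profile connection space illuminant.
import numpy as np

M_BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                       [-0.7502,  1.7135,  0.0367],
                       [ 0.0389, -0.0685,  1.0296]])

def adapt(xyz, src_white, dst_white):
    """Adapt an XYZ color from src_white to dst_white (Bradford)."""
    src_cone = M_BRADFORD @ src_white  # cone-like responses of the whites
    dst_cone = M_BRADFORD @ dst_white
    scale = np.diag(dst_cone / src_cone)  # von Kries scaling per channel
    m = np.linalg.inv(M_BRADFORD) @ scale @ M_BRADFORD
    return m @ xyz

D50 = np.array([0.96422, 1.00000, 0.82521])
D65 = np.array([0.95047, 1.00000, 1.08883])

# A display with a native white at D65, re-expressed relative to D50:
print(adapt(D65, src_white=D65, dst_white=D50))  # -> the D50 white itself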


For gamma we could debate L* versus the 2.2 baked into monitors, and we could argue that the 2.2 "standard" is a lie (CRTs have a natural gamma of about 2.5).
L* is a weird thing that I've only ever seen photographers talk about.
I understand the concept behind it, but I'd never recommend that anyone use it.
Stick to a flat 2.2 gamma for any computer/graphics work.
L* characterizes the distribution of lightness in the device-independent CIELAB color space and takes the non-linear brightness perception of the human eye into account. When implemented in a working color space (=> ECI-RGB v2) it therefore maximizes coding efficiency: tonal value density is present where it is needed (a gamma 2.2 tonal response curve, for example, is too dense in the absolute shadows compared with human perception). So if your primary working color space is e.g. ECI-RGB v2, calibrating your display to the L* tonal curve helps avoid losses of tonal values and raises the level of precision. But if the necessary corrections are carried out by creating a vcgt and loading its contents into the video card LUT, the advantages are often nonexistent (keep in mind that as long as the actual display characteristic is captured by the display profile, a CMM will transform correctly into display RGB), while working with material in an L*-based working color space still is meaningful. Nvidia cards, for example, just do a hard 8-bit cut per output channel, while ATI/AMD provides dithering for the down-conversion.

Just to point out one thing: it makes no sense to choose L*, even if your (hardware-calibrated) display can render it losslessly, only because it is based on perception, and then reproduce material in non-color-aware software that was gamma-corrected in another way. That will just lead to unwanted shifts in the mid-tones.
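To make the shadow-density argument concrete, a small sketch comparing the standard CIE L* curve with a plain 2.2 power law (the sample points are arbitrary):

Code:
# CIE L* used as a display tonal curve: a signal v in 0..1 targets
# lightness L* = 100*v, which maps back to relative luminance Y.

def lstar_to_Y(v):
    """Relative luminance for a display tracking the L* curve."""
    lstar = 100.0 * v
    if lstar > 8.0:
        return ((lstar + 16.0) / 116.0) ** 3
    return lstar / 903.3  # linear segment near black

def gamma22_to_Y(v):
    return v ** 2.2

for v in (0.05, 0.10, 0.25, 0.50):
    print(f"v={v:.2f}  L*: {lstar_to_Y(v):.4f}   2.2: {gamma22_to_Y(v):.4f}")
# Near black, the L* curve allots noticeably more luminance per code
# value than gamma 2.2, i.e. gamma 2.2 is "too dense" in the shadows.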

The modern target for SDR is 100 nits.
Or 80-120 depending on which old standards you were looking at. (sRGB: 80 nits, SMPTE: 120 nits, EBU: 100 nits)
These numbers must be handled with caution. The ideal brightness (and also the white point; see above) simply depends on the actual ambient/color-matching conditions. ISO 3664, for example, defines two viewing conditions for proofing situations. P2, for "(practical) print evaluation", requires an illumination level of 500 lux, which can be achieved with a dimmable viewing booth; a display used alongside it will then have to be calibrated to a luminance of about 160 cd/m² to give a match. In the printroom, condition P1 comes into play, which requires 2000 lux, corresponding to a luminance of around 640 cd/m² (that's why soft proofs in the printroom itself are still not very common, as wide-gamut LC panels achieving this brightness are rare). Of course, in a dark home environment even the 160 cd/m² will be too bright. That's why it is situation-bound.
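The conversion behind those figures is the Lambertian relation L = E/π for an ideal diffuse white; a quick check:

Code:
# Illuminance (lux) to luminance (cd/m^2) of an ideal diffuse white:
# L = E / pi. The ISO 3664 figures above follow directly.
import math

for lux in (500, 2000):
    print(f"{lux} lux -> {lux / math.pi:.0f} cd/m^2")
# 500 lux -> 159 cd/m^2 (~160 above); 2000 lux -> 637 cd/m^2 (~640)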

That will give you the primaries. It won't measure the full gamut.
A display that can't be characterized by a simple shaper/matrix profile after linearization (the first part of what is called calibration in this context) is not suitable for color-critical tasks, even if large LUT-based profiles were used. Shaper/matrix profiles consist of the colorimetric data of the primary colors and the tonal response curve for each channel. The resulting LUT <=> matrix workflow from and to the PCS is very robust.
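As a rough sketch of what such a shaper/matrix transform does (with illustrative numbers: the textbook sRGB/D65 matrix and a plain power-law shaper, not data from a profiled display):

Code:
# Sketch of a shaper/matrix display profile: per-channel tonal curves
# (the "shaper") followed by a 3x3 matrix built from the primaries.
import numpy as np

RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],   # textbook sRGB/D65
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def display_rgb_to_xyz(rgb, gamma=2.2):
    linear = np.power(np.clip(rgb, 0.0, 1.0), gamma)  # shaper (TRC)
    return RGB_TO_XYZ @ linear                        # matrix (primaries)

print(display_rgb_to_xyz(np.array([1.0, 1.0, 1.0])))  # ~D65 white XYZ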

Only if you don't care about shadow detail being clipped.
You should care about shadow detail being clipped.
Some kind of black point consideration is of course in most cases no bad idea during the linearization (calibration) process (BT.1886 just inverts the direction of approach compared to a direct offset of the measured data). However, as long as the actual black point is reflected in the profile, a (good) CMM can optionally map differing black points in relative colorimetric transformations based on the source and target black points (Adobe has developed a quite popular and good implementation). As a result, dark out-of-gamut colors are not clipped to the gamut boundary but moved so as to preserve differentiation.

2) with all RGB channels at max the result would be awful, not just incorrect. He would end up with a badly offset monitor: a totally arbitrary color space, at best reduced to the panel's native specs, at worst shifted along one axis, so he'd lose some colors and have others oversaturated.
XoR was referring to a starting point for the manual white point adjustment (which may be assisted during a software calibration). A sensible RGB gain implementation will reach the panel's native white point with the controls maxed out, without clipping in the highlights. From there you can achieve any desired white point by lowering at most two of the three controls (any setting where all three are lowered is redundant and only reduces the white level). Of course you have to ensure there's no clipping, because some implementations allow a degree of "oversteering".

We must also keep in mind that we're talking only about the linearization part of a calibration (white point and tonal response curve adjustment), which doesn't by itself ensure color-correct reproduction. Games, for example, don't respect the display profile or execute transformations from and to the PCS; at best they don't unload the vcgt information from the video card LUT. If you know the underlying characteristic (it is often safe to assume sRGB), a color space emulation is necessary in these cases for high demands.
 
I'll reserve this space... to reply about how useful working in L* can be :D
But Sailor has already covered what we both said: we are not linear...
Right now I'm on my cellphone, so writing anything long is impossible.

This thread will become a treatise if we keep going this way, heh.
 
This is NOT my image; someone posted it in a customer review on Amazon:

[image from the Amazon review: two of the same monitors side by side, one showing a yellow tint]

The same monitors, same color settings, and one has the yellow tint. I'm not sure whether mine is as bad as that, but can it be fixed with calibration, or is my monitor just defective?
 
That's exactly what calibration is for: making all your displays match as closely as they are capable of.
 