24-30" Monitor for Photographers

dmccombs - n00b - Joined: Oct 5, 2007 - Messages: 8
I am a photographer looking for a 24-30" monitor. Can you help me find a monitor that will fit my needs?

I don't play games on the PC, nor watch movies, so based on my reading, it looks like I should aim for an IPS, or at least an S-PVA, correct?

I have read conflicting info as to whether the high gamut monitors are really a help. Can those with a high gamut monitor comment on this? I do have a 21" PVA monitor (Sammy 213T) that I can use for web and e-mail if necessary.

If I get a monitor in the 24-28" range, I would like to keep the price under $750. If I go to 30", I could go to $1250. Is 30" overkill for photo editing?

Thanks,
Darrell
 
When it comes to image editing there's no such thing as too much space. I would know since I'm a graphic designer myself.
The S-IPS by NEC is a tad on the expensive side, but it does provide great image quality.
Samsung's 27" seems to provide good image quality also, from the very few reviews I've read.

The High gamut does help a bit, but it doesn't justify the price premium imo.
 
If you're in the United States you can go to Provantage.com right now and get either the NEC 2490 or 2690 with the higher color gamut for under $1200, in either case.
 
When it comes to image editing there's no such thing as too much space. I would know since I'm a graphic designer myself.

That is not true. In theory it should be true, but there are two factors that make your claim dubious. One, until DisplayPort it's still an 8-bit world, so wide gamuts are not genuinely richer so much as the same number of data points spread thinner across a wider range. This means the gaps between any given shades of a color are greater. This can be problematic for color correction, particularly for any photographer that does portraits. Two, the current crop of wide gamut monitors with internal LUTs do not support sRGB modes with custom profiles, which negates the whole point of having gamma correction for anyone that works primarily in sRGB.

It simply is not the case that more is better. It should be a very qualified recommendation, particularly if the monitor is used at home, as wide gamut monitors do not display 99% of PC content correctly (web, desktop, games, etc).
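To put a rough number on the "same 256 codes spread over a wider gamut" point, here is a quick back-of-the-envelope Python sketch. It compares the CIE xy chromaticity area of the sRGB and Adobe RGB triangles using their published primaries; area is only a crude proxy for the per-step spacing, but it illustrates the scale of the stretch.

[code]
# Rough sketch: same 256 steps per channel, ~35% more chromaticity area.
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B primaries
argb = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]  # Adobe RGB primaries

a_srgb = triangle_area(*srgb)
a_argb = triangle_area(*argb)
print(f"sRGB area:      {a_srgb:.4f}")
print(f"Adobe RGB area: {a_argb:.4f}")
print(f"ratio:          {a_argb / a_srgb:.2f}x")
# Roughly 1.35x the gamut covered by the same 8-bit code values, i.e.
# coarser spacing between adjacent representable shades.
[/code]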

The High gamut does help a bit, but it doesn't justify the price premium imo.

This is spot on. The fact is this is a poorly implemented feature set being rolled out ahead of the high-bandwidth interface it needs, DisplayPort. Wide gamut in an 8-bit world is like putting a turbo on a Yugo. Additionally, it is being sold as "more vibrant colors" etc., when in fact this feature set only has two real-world applications, which currently fit a very small niche of users: photographers and graphic designers who need to work in wide gamuts such as Adobe RGB or ProPhoto RGB, and HD content. In the case of HD content, it is likely to have minimal impact, as most HD content is still produced for sRGB. As for the former, there are few photographers who actually need to shoot or work in ProPhoto RGB. The vast majority of amateurs and professionals are better suited shooting in sRGB, given that it is the default for photo labs, etc.

There is a small group of folks that could benefit from these monitors, but they are simply oversold.

If I had the cash it would be the NEC 2490. It has all of the other features of its larger brother the 2690 (12-bit gamma correction, internal LUT) but none of its weaknesses: the higher price tag and the wide gamut. If you need to save some pennies, consider the 2190.
 
So far, there seems to be a consensus on the NEC LM240WU2, but if I am going to spend that much it is only $100-$200 more to get an HP 3065, or the Dell 30" HC. Would either of those be as good or better than the NEC?

Also, are there any decent 24" models for photo editing in the sub $750 range?
 
I was just talking strictly about the screen real estate: being able to edit multiple photos, compare different shots, etc. >_<

But it was a very informative post Luth. ^^

From what I've been reading, it seems when it comes to pure image quality - the NECs are above the HP and DELL 30"s.


It depends what you mean by "decent". If color accuracy isn't mission critical, then there are plenty that will fall in your price range.
On the other hand, if it is very important - get a CRT. >_< Or the NEC or the Eizos.
Although Eizo's screens are in a class of their own.
 
The 30in Dell HC has great screen real estate, plus it has great color reproduction and response times. For its current price I'd say it's the best monitor available, whatever the task.
 
So far, there seems to be a consensus on the NEC LM240WU2, but if I am going to spend that much it is only $100-$200 more to get an HP 3065, or the Dell 30" HC. Would either of those be as good or better than the NEC?

Also, are there any decent 24" models for photo editing in the sub $750 range?

I was interested in the Dell at one time, but the high gamut issues and the matte finish killed it for me. I don't mind antiglare for a lot of things, but I despise it on light-colored photos. I have seen people who liked the BenQ 24" for photos, but that's M-PVA and not S-IPS. I don't like shifty panels and would get S-IPS while they are still around.
 
My current contestants are listed in the following thread:

http://www.hardforum.com/showthread.php?p=1031493986#post1031493986

I can't get a decent price in Europe for the NEC 2490 or 2690, so I've excluded them (also, the EURO version of the 2690 had audible noise issues, which weren't covered under warranty by all retailers).

If I were you I'd ask myself:

1) Do I need soft-proofing for print or other media?
2) Is grey scale separation at the black end important to me (this is imho crucial if you work with print or with a lot of b&w photos, but can be important in other situations as well)?
3) Do I work with extended colour space images that scream for extended colour space beyond sRGB?
4) Is my final display of photos (final medium) high colour space capable?
5) Do I have excellent calibrating tools (hw) and skill/time to use them properly?
6) Do I only view the monitor myself, or do I work with clients who are highly critical and view the same monitor from a slightly off angle?

Based on that, I'd check which products have which failings (good calibration support, good grey scale separation with a near linear gamma curve, wide colour space, consistent viewing angles without any shifts in colour or grey scale).

It really depends on what you need.
 
Remember there is a huge difference between "Graphic Designer" and "Photographer"

For photography you do not need huge resolution; I am perfectly happy on my 19" 1280x1024 monitor. For graphic design you tend to want more space, as you are working with lots of different elements.

Also, you do not ALWAYS need an IPS panel or other high-quality panel. I've found that the Samsung 940b 19" I have, which is not an IPS panel, still has amazing color rendition and is very true once calibrated...not saying all non-IPS panels will be like this, but there are some out there that don't carry the huge premium and still have a great picture.
 
I am a photographer also. The whole thing of monitor "accuracy" is a bit hard to pin down. There are a couple of ways to go about profiling for "true" color. Both are fairly expensive. If you use an LCD (most everyone), you have to get a high-end unit which is used with a profiling program (profile by the numbers). Until fairly recently, many photographers that had to have spot-on color for commercial use would use (and still do) the old CRTs, which they degauss and profile with a puck. This method also requires a profiling program and more $, but is very accurate when done properly. Information on these procedures can be found at Shootsmarter.com. However, most photographers seem to be interested in getting close, not spending the $ to get spot on.
As to 24-30" in size, it is a question with most folks of getting something (screen space) and leaving something behind (desk space). I have used numerous configurations thru some fellow photogs, and my personal choice is a dual screen. I spent somewhat more on my main screen and less on the second screen, where I might park Bridge or Lightroom. Makes it easier to not have to close/open new/close/open old. You can do this same thing on one screen, but I like to edit at 100% and two screens seem to keep things cleaner, at least to me.
Good luck
 
I am a photographer also. The whole thing of monitor "accuracy" is a bit hard to pin down. There are a couple of ways to go about profiling for "true" color. Both are fairly expensive. If you use an LCD (most everyone), you have to get a high-end unit which is used with a profiling program (profile by the numbers). Until fairly recently, many photographers that had to have spot-on color for commercial use would use (and still do) the old CRTs, which they degauss and profile with a puck. This method also requires a profiling program and more $, but is very accurate when done properly. Information on these procedures can be found at Shootsmarter.com. However, most photographers seem to be interested in getting close, not spending the $ to get spot on.
As to 24-30" in size, it is a question with most folks of getting something (screen space) and leaving something behind (desk space). I have used numerous configurations thru some fellow photogs, and my personal choice is a dual screen. I spent somewhat more on my main screen and less on the second screen, where I might park Bridge or Lightroom. Makes it easier to not have to close/open new/close/open old. You can do this same thing on one screen, but I like to edit at 100% and two screens seem to keep things cleaner, at least to me.
Good luck

For pixel-level editing of photos (cloning, healing, etc.), it's nice to have a lot of desktop space so you can still see the surrounding area and see how your edits look in the context of the surrounding image. I use 2 LCDs at 1600x1200, and the extra room is nice. For general white balance, exposure level, vignetting, etc. types of corrections, high resolution isn't necessary, although it's still nice to be able to see your images at a smoother resolution with fewer jaggies on a higher-res monitor. After years of working with at least one 1600x1200 monitor, I don't think I could go back to 1280x1024 for everyday use.

If only Lightroom were multi-monitor aware. Adobe went form over function with this, which is really disappointing since Photoshop (and other Adobe apps) has been multi-monitor aware for years and years now, while Lightroom is a new product. Here's hoping v2.0 will let me put my tabs, thumbnails, etc. on my 2nd monitor.
 
Well, if everything being recommended is going to be over a grand.... :rolleyes:

Am I better off getting a HP 3065, or the Dell 30" HC?
 
Well, if everything being recommended is going to be over a grand.... :rolleyes:

Am I better off getting a HP 3065, or the Dell 30" HC?

There is a lot of good advice here. Halcy really hit the issues on the head. The core features that divide the IPS pack on price are…

Color accuracy / internal LUT gamma correction: If you do color critical print work then you probably want these features. If you need to save cash, look at the NEC 2190 or 2090. I think both can be found for less than $1,000. Both have the same feature set as the 2490. Both are standard gamut.

Screen real estate / size: If you are not concerned about the extra color accuracy of an internal LUT (most users will not notice the difference), then consider going big. If I had to choose between the Dell and the HP, I would go Dell, as it tends to go on sale often for hundreds less than the HP and they are pretty much equal feature-wise.

Whatever you choose, add in the cost of a good hardware color calibration unit, as no monitor is color accurate out of the box and all improve significantly after calibration. Something like the Eye-One Display 2, etc.
 
Luthorcrow,

Thanks for the concise summation.

I do have an Eye Calibration unit, so calibrating the new monitor is not a problem.

I see the 30" HP for ~250 less than the Dell normal price. This works out to be even to the Dell sale price. If you could get either monitor at the same price, which would you get?

I hate to get into the $1250 price range, but if the decent 21" monitors run ~$950, then I will just get the big one.

I was really hoping to find a nice 24" monitor in the $650 range. I guess that would end up being a reasonably well implemented S-PVA. If folks have suggestions as to what would compete with the 24" Dell, let me know.

Thanks,
Darrell
 
People at the photo forum I go to have had some issues with oversaturation on wide gamut displays. You could ask around there:

http://forums.dpreview.com/forums/forum.asp?forum=1004

I read a bit there too. :)

A good thread about wide gamut vs. sRGB and the 2690 can be found here:
http://forums.dpreview.com/forums/readflat.asp?forum=1004&thread=25042197&page=1

With a reply about this from:
Will Hollingworth
Manager of OEM Product Design & Development Engineering
NEC Display Solutions of America, Inc.

and the experience from someone who used the Dell 2407WFP
 
I'm a VERY amateur photographer. I love doing it as a hobby, but it's not a job.

I recently bought a Dell 2407-HC and a Pantone Huey Pro. It made a huge difference over my old Dell 2005 with no calibration. I know this setup isn't great for pro photogs, but for someone who doesn't want to spend a ton of money, I'm VERY happy with the setup I have right now.
 
People at the photo forum I go to have had some issues with oversaturation on wide gamut displays. You could ask around there:

http://forums.dpreview.com/forums/forum.asp?forum=1004

Folks call it oversaturation, but technically it's not oversaturation. The visual effect is similar, but what is really happening is that the same RGB values do not point to the same shades of color in sRGB vs. the default gamut of the monitor. Because 99% of PC content is sRGB, a wide gamut monitor incorrectly maps all of the colors outside of a color aware program.

In my view, given that sRGB is the standard, the monitor manufacturers chose not to address the issue, banking instead on selling more monitors on the oversaturated look of the mismapped colors.

So when folks tell you the colors are "better", "richer" in games and desktop operations they are wrong. They are just seeing the wrong colors which appear oversaturated as a result.:D
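A minimal Python sketch of the mismapping described above, assuming the published linear-RGB-to-XYZ matrices for sRGB and Adobe RGB and approximating both transfer curves as a flat 2.2 gamma: the same 8-bit triplet decodes to a visibly greener chromaticity when a wide gamut panel interprets it natively instead of having it converted first.

[code]
# Same numbers, different colors: untagged sRGB content on a wide gamut
# panel. Matrices are the published linear RGB -> CIE XYZ (D65) ones.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]

def to_xyz(rgb8, matrix, gamma=2.2):
    """Decode an 8-bit RGB triplet to XYZ under the given primaries."""
    lin = [(c / 255.0) ** gamma for c in rgb8]
    return [sum(m * c for m, c in zip(row, lin)) for row in matrix]

def xy(xyz):
    """Chromaticity coordinates (hue/saturation, brightness factored out)."""
    s = sum(xyz)
    return (xyz[0] / s, xyz[1] / s)

pixel = (60, 180, 75)  # a mid green from some untagged sRGB web image
intended = to_xyz(pixel, SRGB_TO_XYZ)    # what the author meant
displayed = to_xyz(pixel, ADOBE_TO_XYZ)  # what a wide gamut panel shows raw
print("intended  xy:", [round(v, 3) for v in xy(intended)])
print("displayed xy:", [round(v, 3) for v in xy(displayed)])
# The displayed chromaticity sits closer to the (greener) Adobe RGB
# primary, i.e. the same numbers come out more saturated.
[/code]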
 
Remember there is a huge difference between "Graphic Designer" and "Photographer"

For photography you do not need huge resolution; I am perfectly happy on my 19" 1280x1024 monitor. For graphic design you tend to want more space, as you are working with lots of different elements.

Also, you do not ALWAYS need an IPS panel or other high-quality panel. I've found that the Samsung 940b 19" I have, which is not an IPS panel, still has amazing color rendition and is very true once calibrated...not saying all non-IPS panels will be like this, but there are some out there that don't carry the huge premium and still have a great picture.


My clanmate compared his family's 940b to his own Dell S-IPS 2007WFP, and told me the 940b looked like a piece of crap color-wise. :confused:
 
In my view, given that sRGB is the standard, the monitor manufacturers chose not to address the issue, banking instead on selling more monitors on the oversaturated look of the mismapped colors.

The screens don't have mismapped colors; they are mapped for another color space. Using the word "mismapped" makes it sound like an error, which it isn't.
The fact is that the world is moving away from sRGB. aRGB and larger spaces are more available now, and so are screens that can display them. You get cheap home printers with color space capabilities beyond sRGB and also cameras which can capture aRGB and RAW. Professional online print services (the ones where you can download their printer profiles and that give a damn about color accuracy) accept aRGB and CMYK (both with gamuts that go beyond small sRGB).

When it comes to color awareness (this goes for PC, since Macs have been color aware for ages and gamut issues don't apply as much there), the new Firefox 3.0 is color aware, and in addition Safari is being released in full (non-beta) for Windows users.

Vista has improved color management (but you need to turn off the permission popup, since the dimming destroys your color management).

The newest Office is color aware.

Most screens aren't even sRGB themselves. If you read my link above, Will Hollingworth explains it pretty nicely. Most screens are "close enough", while still incorrect. The standard gamma for sRGB is 2.2, but most CRTs have a native gamma of 2.5. The standard white point is 6500K, while CRTs often have 9300K. Most people don't calibrate their screens for accuracy. So, almost no screens have accurate sRGB, just "close enough". The gamut of the 2490, btw, is 78% of Adobe RGB I've read.
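To make the gamma part of that concrete, a couple of lines of Python (the 2.2 vs 2.5 figures are the ones quoted above; the white point difference isn't shown here):

[code]
code = 128  # 8-bit mid grey
for gamma in (2.2, 2.5):
    print(f"gamma {gamma}: relative luminance {(code / 255) ** gamma:.3f}")
# gamma 2.2 -> ~0.22, gamma 2.5 -> ~0.18: roughly 20% darker mid-tones,
# which is the kind of error a "close enough" uncalibrated screen carries.
[/code]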


So when folks tell you the colors are "better", "richer" in games and desktop operations they are wrong. They are just seeing the wrong colors which appear oversaturated as a result.:D

"Better" is a subjective term. It depends on who is seeing. What is importent however, is that most people who don't work with color cares about what looks "better" and don't give a **** about color accuracy and "the right colors". If their games look "richer" and "better" in wide gamut, then its correct of them to say so. The wide gamut have then improved on the "correct colors". "Correct colors" only matters if its nessesary for your work.

The 2690WUXi performs well by using wide gamut for 16-bit aRGB and RAW picture files, and also performs well when using the sRGB preset when needed.

Saying that people should buy "close enough" to sRGB for the sake of the material that exists today in sRGB is wrong. It will limit people very soon. Surfing the internet is not a problem at all with the "correct colors" or "close enough" if you have a color aware browser. It will show sRGB as sRGB and aRGB as aRGB on a wide gamut screen, while a "close enough" to sRGB screen will show everything as sRGB (since it cannot display those rich aRGB shades).
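For anyone wondering what a color aware program actually does with those profiles, here is a minimal present-day sketch using Pillow's ImageCms wrapper around LittleCMS. The monitor profile filename and the input image are hypothetical; substitute whatever your calibration software produced.

[code]
# Sketch of a colour-managed conversion, not any particular browser's code.
import io
from PIL import Image, ImageCms

MONITOR_PROFILE = "my-wide-gamut-monitor.icc"   # hypothetical path

im = Image.open("photo.jpg")

# Use the image's embedded profile if it has one (aRGB, ProPhoto, ...),
# otherwise assume untagged content is sRGB -- which is the whole point
# of the sRGB default being argued about in this thread.
icc_bytes = im.info.get("icc_profile")
if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
else:
    src_profile = ImageCms.createProfile("sRGB")

dst_profile = ImageCms.getOpenProfile(MONITOR_PROFILE)

# Convert the pixel values into the monitor's own colour space, so the
# image looks the same whether the panel is standard or wide gamut.
managed = ImageCms.profileToProfile(
    im, src_profile, dst_profile,
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
    outputMode="RGB",
)
managed.save("photo-for-this-monitor.png")
[/code]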

Remember there is a huge difference between "Graphic Designer" and "Photographer"

For photography you do not need huge resolution; I am perfectly happy on my 19" 1280x1024 monitor. For graphic design you tend to want more space, as you are working with lots of different elements.

Also, you do not ALWAYS need an IPS panel or other high-quality panel. I've found that the Samsung 940b 19" I have, which is not an IPS panel, still has amazing color rendition and is very true once calibrated...not saying all non-IPS panels will be like this, but there are some out there that don't carry the huge premium and still have a great picture.

The Samsung 940b is a 6-bit TN panel with dithering and overdrive artifacts. All the IPS panels are true 8-bit. I mean no offence, but I hope you are not making any color critical decisions on that screen.

Sure, there are many screens that would be great for gaming and movies, and some also for non-critical color work. But personally, I would never trust anyone who sits in front of a VA or TN panel making color critical decisions. That includes the Eizo CE240W, which is an S-PVA panel. Eizo makes the best out of what panels they have (TNs, VAs and IPS), but with the left side being brighter than the right side and color/gamma/brightness shifts even at center view, I wouldn't trust it for color critical material. Eizo themselves state this on the Norwegian Eizo pages.

CE-modellene (21" wide og 24" wide) er tenkt for bedrifter som arbeider med å skape publikasjoner, design, arkitekter etc der arbeidsområde og farge er viktig men der man ikke tar fargekritiske beslutninger direkte på skjermen.
http://www.eizo.no/eizo/smpage.fwx?smlanguage=NOR&page=72

translated:

"The CE-models (21" wide and 24" wide) are ment for companies that work with creating publications, design, arkitects etc. where workarea and color is important, but where you don't take colorcritical descitions directly on screen.".

Even people who don't work with colors professionally notice the color shifts. I'm amazed at the so-called "professionals" who do color critical work on such panels. I don't trust their eyes for detail for a second and would never place important work in their hands.

Even small angle shifts changes the picture:
http://www.hardforum.com/showthread.php?t=1228073

I don't mean to be harsh, but seriously, if they (people who work with such panels) cannot see the color shifts, they shouldn't really work with color critical material either in my opinion.

Take a look at ToastyX's pictures of the new Eizo HD2441W at center view:

[attached thumbnail: eizo color shifting]

http://www.hardforum.com/showpost.php?p=1031371458&postcount=58 for a larger version

It doesn't matter if you calibrate the screen with even a perfect DeltaE in the middle of the screen if the rest of the screen sucks.

I'm being brutal now and don't mean to offend anyone, but I would strongly advise against using VA and TN panels for color critical work. They have their strengths in other places and are not bad panels.

I don't disagree with you, Grentz, in what you say that "you do not ALWAYS need an IPS panel or more high quality panel.". But for color critical work, I would disagree. You either need IPS or more high quality panel/monitor (I'd love to get my hands on a Sony Artisan...). Other panel types are too unstable in my opinion.
 
The screens don't have mismapped colors; they are mapped for another color space. Using the word "mismapped" makes it sound like an error, which it isn't...
No, mismapped is the correct term. I realize you are a widescreen, wide gamut evangelist. I am only giving the facts and letting the buyer beware. I think you tend to let your own passion on this issue cloud your judgment.

At a hardware level, yes, the monitor is mapping to its default color space, so on that level it is not an error. But no piece of hardware is used in isolation. The fact is sRGB is the standard both for PCs and largely for current HDTV content. A device intended to be used for this purpose would need to map these colors into the correct color space by default. The current crop of wide gamut monitors do not do this, and even manual settings for sRGB are problematic.

You can split hairs on whether you view that as an error or not but what I described above and before is accurate. By any reasonable definition it is an error.

The fact is that the world is moving away from sRGB...
That's a bold statement, but I think you are confusing hardware capabilities with a changing standard. The fact is you're wrong, or really overeager. There have been no signs of serious change for PC content. Will we eventually change to a wider gamut standard? Probably, but it's at least a decade or more away. The market drives standards, and the fact is Firefox is a small percentage of the browser market. Even the monitors we are discussing here are a small percentage. I run a website, and one of the breakdowns in Google Analytics is screen size. Widescreen monitors (the primary market for wide gamut) are in the single digits. Of those that are likely wide gamut, the percentage would probably drop down to 1-2% or less.

Will it eventually change? Yes, but it's a long, long way off to be advising folks today on their purchases, given that the monitor they buy today will be gathering dust when the change does come.

...Professional online print services (the ones where you can download their printer profiles and that give a damn about color accuracy) accept aRGB and CMYK (both with gamuts that go beyond small sRGB).
I corrected you once on this, but let me do it again. CMYK is not a wider gamut than sRGB. It is in effect the narrowest of the gamuts used for design. There are only about 55,000 color values in CMYK versus the 16.7 million in sRGB. Yes, there are colors in CMYK that are out of gamut for sRGB, but that does not make it a wider gamut, as there are about 16 million colors in sRGB that would be out of gamut for CMYK.

Vista has improved color management (but you need to turn off the permission popup, since the dimming destroys your color management).

Again, Vista is a tiny fraction of the OS market right now. XP and Win2K still dominate the world and US markets. It took years before XP unseated Win2K; it will be years before Vista does the same. But this is probably the most solid sign of a change coming. So maybe MS and the W3C might be ready for a new standard in the next OS after Vista.

Most screens aren't even sRGB themselves. If you read my link above, Will Hollingworth explains it pretty nicely. Most screens are "close enough", while still incorrect...
That's really stretching the truth to make a point. The fact is most monitors were designed to display sRGB. I never claimed they were perfect, but talking about differences that would require special tools to measure vs. a wide gamut monitor that clearly displays colors incorrectly to the naked eye is just comparing baby apples to big fat Florida oranges. :p

Saying that people should buy "close enough" to sRGB for the sake of material that exist today in sRGB is wrong. It will limit people very soon. Surfing internet is not a problem at all with the "correct colors" or "close enough" if you have a color aware browser.
No it won't, because there is no non-sRGB content to see. Again, just because there is a Mac browser that is color aware and an unreleased beta of Firefox that is, does not magically produce content.

...I'm being brutal now and don't mean to offend anyone, but I would strongly advise against using VA and TN panels for color critical work. They have their strengths in other places and are not bad panels.
Agreed. For non-color-critical work a VA or MVA panel will make many happy. TN and design work should not be mentioned in the same sentence. :)
 
No, mismapped is the correct term. I realize you are a widescreen, wide gamut evangelist. I am only giving the facts and letting the buyer beware. I think you tend to let your own passion on this issue cloud your judgment.

Widescreen evangelist I can agree with. ;) The problem is that you are not giving all the facts, or you don't know them yet. You call everything outside of sRGB a color mapping error, even though it is mapped for another standard. You also state that HDTV is mostly in sRGB, which is wrong. sRGB and other color spaces are just containers, and sRGB is only one of many standards. It IS the current standard on the internet for the moment, but with the introduction of color aware browsers like Firefox 3 and Safari, this might change.
Have you ever heard about YIQ, YUV, and YCrCb? NTSC? CMYK (which has been a standard for print for years)? Movies, TV shows and such are mostly NOT sRGB.


That's a bold statement, but I think you are confusing hardware capabilities with a changing standard. The fact is you're wrong, or really overeager. There have been no signs of serious change for PC content. Will we eventually change to a wider gamut standard? Probably, but it's at least a decade or more away.

It's not a bold statement at all. The fact is that the signs are everywhere:

*Panel manufacturers are moving their panels over to wide gamut and away from sRGB. sRGB panels are being phased out like CRTs were in their time. Even cheap TN panels are being replaced by wide gamut models.

*Major companies like Dell are phasing out their sRGB models. The 3007WFP (sRGB) got replaced with the 3007WFP-HC (wide gamut). The 2707WFP came directly in wide gamut. The 2407WFP got replaced with the 2407WFP-HC (wide gamut).

*All upcoming panel techs are wide gamut. SED, LED, OLED, laser displays etc. all have gamuts wider than even aRGB.

*Browsers, Office packs, Vista etc. are becoming color aware, which makes them more compatible with all the different color spaces there are.

*They are aiming for larger and larger color spaces:
http://displaydaily.com/2007/01/23/ntsc-color-gamut-be-gone-enter-xvycc-and-deep-color/

*Cameras, printers and also public print services etc. already offer color spaces larger than sRGB. Note that the print standard CMYK already extends beyond sRGB.

*Macs are already ready for wider gamut displays, due to superior color management.

*The Linux world is already working on making its OSes color aware.

The list goes on.


I corrected you once on this, but let me do it again. CMYK is not a wider gamut than sRGB. It is in effect the narrowest of the gamuts used for design. There are only about 55,000 color values in CMYK versus the 16.7 million in sRGB. Yes, there are colors in CMYK that are out of gamut for sRGB, but that does not make it a wider gamut, as there are about 16 million colors in sRGB that would be out of gamut for CMYK.

Let me correct you here. CMYK stretches beyond sRGB; therefore, sRGB is too small to contain CMYK. A wide gamut screen covers both. The number of colors doesn't define the color space. CMYK (Cyan, Magenta, Yellow and blacK/Key color) has a total of 4 million colors, not 55,000. For more colors, they offer printers with even more color primaries, like CMYKOV etc. All larger than small sRGB.



Again, Vista is a tiny fraction of the OS market right now. XP and Win2K still dominate the world and US markets. It took years before XP unseated Win2K; it will be years before Vista does the same. But this is probably the most solid sign of a change coming. So maybe MS and the W3C might be ready for a new standard in the next OS after Vista.

During the first 100 days after Vista's release, over 40 million copies were sold. Now Vista has over 60 million copies sold.
http://windev.wordpress.com/2007/08/13/60-millions-of-windows-vista-copies-sold/

Vista and OSX for Mac make up enough users that it would be insulting to call them tiny. More might come after SP1 in January, and since many computers are sold with Vista preinstalled, numbers will grow in any case.


That's really stretching the truth to make a point. The fact is most monitors were designed to display sRGB. I never claimed they were perfect, but talking about differences that would require special tools to measure vs. a wide gamut monitor that clearly displays colors incorrectly to the naked eye is just comparing baby apples to big fat Florida oranges. :p

No, most displays were just designed; sRGB was then created because it's the lowest common color space that most screens were already close to. sRGB was made due to limitations in computer hardware, and they needed a common platform for even the crappiest of crap hardware, not because it's such a great color space.

I think you are missing the point entirely. A standard monitor is not calibrated and made for a color space. Color spaces are created to give certain boundaries for color managed work. sRGB is selected in many cases (when it comes to web) because it's "close enough". These boundaries don't traditionally exist on a standard screen. Color space containers are for content creators, not users, though users can benefit from them too. The choice of color space is purely dependent on the target group.

No it won't, because there is no non-sRGB content to see. Again, just because there is a Mac browser that is color aware and an unreleased beta of Firefox that is, does not magically produce content.

As explained above, there are many color spaces in use that are not sRGB, from movies to vacation pictures taken with an aRGB or RAW capable camera. The standard home ink printer isn't even sRGB. Even on the internet, there are many aRGB pictures made by people who didn't convert them to sRGB.

The point is, with a color aware browser, sRGB material shown on a wide gamut screen is a non-issue. Having a screen that can only show sRGB is a limitation, not an advantage. The only time sRGB is an advantage is in those cases where you work purely with sRGB material. In all other color spaces, you'll be limited.



Agreed. For non-color-critical work a VA or MVA panel will make many happy. TN and design work should not be mentioned in the same sentence. :)

Finally, we agree upon something... :D
 
My clanmate compared his family's 940b to his own Dell S-IPS 2007WFP, and told me the 940b looked like a piece of crap color-wise. :confused:

It's actually funny, I have two 940b monitors side by side...one is spot on and very very close to true color, the other one (also calibrated) is not, and no matter how much calibration I do I still cannot get it very close to true.

I think the panels can vary; I got lucky and got one very very good one and one that is not the best and probably more normal for a TN display.

Also, as far as color critical...no, I do not have to do anything color critical...but I can say that the colors on my screen match up almost exactly with the colors from my Canon dedicated 4x6 photo printer and prints that I have sent off to be professionally printed, from small to full poster size. I have also used IPS LCDs and Sony professional-grade color-corrected CRT monitors, so I do have good comparisons and am not just blowing smoke. My one very very good 940b is not quite as good as the IPS or the Sony...but it is also half the price if not less, and still very good and better than many monitors I see :cool:

Don't get me wrong, I still think full-time designers should have proper high-end panels or CRTs, but for the hobbyists out there some more cost-effective alternatives do exist...I would rather put my money into more glass and shooting gear than pay double for a monitor that looks just fractionally better when I have no use for it to be 100% perfect.

I also think the experience I have had with one 940b being better than the other shows how panels can vary, and you would never know unless you did a side-by-side to notice that the "oh so great screen model" you got may not be the best rendition of color.
 
Widescreen evangelist I can agree with. ;) The problem is that you are not giving all the facts, or you don't know them yet. You call everything outside of sRGB a color mapping error, even though it is mapped for another standard. You also state that HDTV is mostly in sRGB, which is wrong. sRGB and other color spaces are just containers, and sRGB is only one of many standards. It IS the current standard on the internet for the moment, but with the introduction of color aware browsers like Firefox 3 and Safari, this might change.
First, that is a complete strawman of any of my statements. Second, let me state up front that we both agree that at some point the standard will change. Let me tackle your strawman argument first.

My only point has been that currently virtually all of the content for your PC (web, games, etc.) is sRGB. The minimum a PC monitor needs to be able to do is display that color space by default. Currently, wide gamut monitors instead display the default color space of the monitor. Now consider that the color spaces of these monitors currently range from 92%-102%, and this is likely to rise to 120% in 2008. The default will keep changing; as a result, a standard needs to be adhered to, particularly with a new feature influx.

Regarding sRGB and HDTV, you need to go to the W3C and read about the HDTV color space, because you are dead wrong. sRGB was designed as a standard to work with the HDTV color standard, which was defined in 1996. The primary color values between the standards match. The result is that HDTV colors map very closely in sRGB, and sRGB in HDTV. If all of the wider color gamuts were mapped in this fashion then we would have no debate.

Safari and Firefox 3 adding color awareness is great. Again, we both agree that in the future the standard may change. But until MS and the W3C start releasing white papers, it's not even on the horizon. Consider this: the HDTV specs were designed in 1996, with white papers released. We only recently caught up with hardware, content and services that support that standard. Do you really think they are going to change the standard in its infancy? Even if they do, it would likely be mapped to allow for accurate display of the current sRGB/HDTV color format. The standards for the web were designed to be compatible with broadcast TV. Any change in the standard is not just going to need to be a PC standard but would also include the web, HDTV, etc.

So I agree it may change, but advising folks making purchases now based on something that is not going to happen until the monitor they buy today is gathering dust at Goodwill, particularly with all of the obvious downsides (the inability to display current content correctly), is just not reasonable.

Let me correct you here. CMYK stretches beyond sRGB. Therefore, sRGB is too small to contain CMYK. A wide gamut screen covers both. The number of colors doesn't define the color space. CMYK (Cyan, Magneta, Yellow and blacK/Key color) have a total of 4 million colors, not 55 000. For more colors, they offer printers with even more color primaries like CMYKOV etc. All larger then small sRGB.
On this one you are simply wrong. The point in your first sentence can be reversed: the sRGB color space stretches beyond CMYK, therefore CMYK is too small to contain sRGB. So what?

The biggest difference between RGB and CMYK colors is that RGB colors have a larger Gamut. Gamut refers to the number of colors that can be viewed.
- Source

Assuming you are right on 4 million vs. my 55k quote, that's still 12.7 million fewer than sRGB's 16.7 million. If you look at a diagram of the two color spaces it's clear sRGB is the wider gamut.

Vista and OSX for Mac make up enough users that it would be insulting to call them tiny.
OSX users would not be offended as they are keenly aware of being a minority. Most of the OSX users I know pride themselves on being special. 60 million copies is a tiny fraction of the world OS market. Again, are you seriously suggesting that Vista is going to become the dominant OS faster than, say, 3-5 years?

Color spaces are created to give certain boundaries for color managed work. sRGB is selected in many cases (when it comes to web) because it's "close enough".

Your analogy is very misleading. A more accurate one would be a map. So yes, each color space defines a boundary, but they have a second function, which is to plot the associated color to a given RGB value within that boundary. So it's not just that you lose the colors outside of the boundary, but that each location (color) is mapped to a different value. As a result, all of the colors are mapped incorrectly, not just those outside of the boundary. That is the crux. It's not that it could not have been done by default, either. We have the example of sRGB and the wider gamut space of HDTV, where the primary values match between the two color spaces.
 
The CMYK/RGB matter seems to need some additional explanation.


First, don't mistake the often-used "color depth" for color space:


Color space defines how blue the blue is, not how many blues there are.
Some years ago monitors had 16-bit color (~65,000 colors), but these were used to display the colors of a color space equal to sRGB.
And we also have 24-bit RGB color (~16.7M colors), still displaying colors of the same sRGB color space, only with finer gradations.
CMYK mostly uses four 8-bit channels, making it 32-bit or ~4 billion colors;
you can also use 4-, 16- or 32-bit channels (most common) with RGB and CMYK data.
But it's possible to have a quadrillion colors in any color space (sRGB, CMYK, ROFL); it only changes how exactly they can be addressed inside that space.
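A few lines of Python to make that bit-depth arithmetic concrete (the counts match the figures mentioned above; which color space they address is a separate question entirely):

[code]
# Bit depth says how finely a colour space is addressed, not how wide it is.
encodings = {
    "16-bit 'high colour'":     2 ** 16,         # ~65 thousand values
    "24-bit RGB (3 x 8 bit)":   (2 ** 8) ** 3,   # ~16.7 million
    "32-bit CMYK (4 x 8 bit)":  (2 ** 8) ** 4,   # ~4.3 billion
    "48-bit RGB (3 x 16 bit)":  (2 ** 16) ** 3,  # ~281 trillion
}
for label, count in encodings.items():
    print(f"{label}: {count:,} addressable values")
# Any of these can address sRGB, Adobe RGB or something else entirely:
# the space fixes which colours sit at the edges, the bit depth only
# fixes how many steps there are in between.
[/code]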

Second, CMYK and RGB use different primary colors, so the color spaces have different shapes:

http://upload.wikimedia.org/wikipedia/de/f/fd/CIE_RGB-CMYK-Beleucht.png
So the sRGB space might be bigger than CMYK, but it does not contain all of the CMYK colors; aRGB, however, does contain almost all of them. (Ignore the wide gamut RGB in the pic; it's not identical to the somewhat-wide gamut of the displays.)

But there are even bigger color spaces than that used in print.
http://www.brainnew.com.tw/Article/raymond2003/IMAGE/101003/hexa_gamut.gif

Choosing a display is a trade-off.
Current displays only have 24-bit color, so they will only be able to address 16.7M colors.
The lower sRGB gamut will display sRGB colors more exactly, but it won't be able to display all CMYK colors. The wide gamut monitor will display a larger variety of colors, but slightly less exactly.

Getting back to the original topic....

It comes down to how the photos will be displayed:
on the net (sRGB preview on an sRGB monitor)
or on a printout (CMYK or another preview on an aRGB monitor, possibly shot with an aRGB camera).


Having one of each connected to the PC would be the most convenient option.
 
First, that is a complete strawman of any of my statements. Second, let me state up front that we both agree that at some point the standard will change. Let me tackle your strawman argument first.

My only point has been that currently virtually all of the content for your PC (web, games, etc.) is sRGB. The minimum a PC monitor needs to be able to do is display that color space by default. Currently, wide gamut monitors instead display the default color space of the monitor. Now consider that the color spaces of these monitors currently range from 92%-102%, and this is likely to rise to 120% in 2008. The default will keep changing; as a result, a standard needs to be adhered to, particularly with a new feature influx.

Regarding sRGB and HDTV, you need to go to the W3C and read about the HDTV color space, because you are dead wrong. sRGB was designed as a standard to work with the HDTV color standard, which was defined in 1996. The primary color values between the standards match. The result is that HDTV colors map very closely in sRGB, and sRGB in HDTV. If all of the wider color gamuts were mapped in this fashion then we would have no debate.

First of all, what seems to be a strawman argument to you looks like a lack of knowledge to me. HDTV broadcasts don't even operate with RGB, but with YUV and YPbPr. A conversion to RGB is needed to display them on a computer screen. Secondly, there is a vast number of different color spaces or containers. NTSC you might have heard of. My wide gamut screen can only display 92% of the NTSC color space. Here is a picture showing only some of the color spaces in use:

[attached image: comparison of several colour spaces]


Now, if you are stating that HDTV broadcasts have a new standard, where they go away from YUV, NTSC, PAL etc. and have moved over to sRGB, then I suggest you come up with some evidence of that.


Safari and Firefox 3 adding color awareness is great. Again, we both agree that in the future the standard may change. But until MS and the W3C start releasing white papers, it's not even on the horizon. Consider this: the HDTV specs were designed in 1996, with white papers released. We only recently caught up with hardware, content and services that support that standard. Do you really think they are going to change the standard in its infancy? Even if they do, it would likely be mapped to allow for accurate display of the current sRGB/HDTV color format. The standards for the web were designed to be compatible with broadcast TV. Any change in the standard is not just going to need to be a PC standard but would also include the web, HDTV, etc.

They don't need to change any standard if color aware browsers become the norm. It will become like this:
http://www.color.org/version4html.xalter
And again: HDTV is not sRGB.

So I agree it may change, but advising folks making purchases now based on something that is not going to happen until the monitor they buy today is gathering dust at Goodwill, particularly with all of the obvious downsides (the inability to display current content correctly), is just not reasonable.




On this one you are simply wrong. The point in your first sentence can be reversed: the sRGB color space stretches beyond CMYK, therefore CMYK is too small to contain sRGB. So what?

Assuming you are right on 4 million vs. my 55k quote, that's still 12.7 million fewer than sRGB's 16.7 million. If you look at a diagram of the two color spaces it's clear sRGB is the wider gamut.

There are no native CMYK displays, so talking about CMYK being too small to contain sRGB is pointless. My point is that CMYK has colors mapped beyond sRGB. Most printers can print the CMYK range, but an sRGB screen can't display it properly.

If you take a picture of beautiful Tahiti, you might want to have those wonderful cyan shades in the water. An el-cheapo CMYK printer can print them, but if you have sRGB as your color space, the information is lost. If you send it to a pro printing service, you can even get it printed in aRGB. There will be a visible difference between taking the picture in 16-bit aRGB and then converting it to CMYK vs. taking it in sRGB and converting it to CMYK. Especially in this case, since the number of cyan shades is large when it comes to water pictures.

There are other advantages to using a larger color space when capturing, even larger than aRGB:
http://www.earthboundlight.com/phototips/prophoto-rgb.html


OSX users would not be offended as they are keenly aware of being a minority. Most of the OSX users I know pride themselves on being special. 60 million copies is a tiny fraction of the world OS market. Again, are you seriously suggesting that Vista is going to become the dominant OS faster than, say, 3-5 years?

Vista doesn't need to be the dominant OS, and neither does OSX, for people to benefit from escaping the sRGB limitations. A color aware browser will work nice and dandy on Windows XP and Linux as well. All developers need to do then is tag their images properly.



Your analogy is very misleading. A more accurate one would be a map. So yes, each color space defines a boundary, but they have a second function, which is to plot the associated color to a given RGB value within that boundary. So it's not just that you lose the colors outside of the boundary, but that each location (color) is mapped to a different value. As a result, all of the colors are mapped incorrectly, not just those outside of the boundary. That is the crux. It's not that it could not have been done by default, either. We have the example of sRGB and the wider gamut space of HDTV, where the primary values match between the two color spaces.

I disagree. My analogy is the very essence of a color managed workflow, and it's clear as daylight. ICC profiles are created for exactly this purpose. Color spaces are created to give certain boundaries for color managed work. Without them, you have no control over the workflow. If you work with colors, you will encounter many kinds of color spaces.

Your statement "As a result all of the colors are mapped incorrectly not just those outside of the boundary." is the misleading one. In a color managed workflow, this is incorrect. The world doesn't evolve around your "everything is mapped wrong, since they are not sRGB". If you rent a DVD, its mapped just as it should and if you ever find a commercial DVD thats sRGB or a TV/HDTV broadcast thats sRGB, please let me know.

If you work with aRGB, CMYK, NTSC etc. within a color aware environment, it's not those color spaces that are "mapped wrong" just because your display is sRGB. The only reason for sRGB to exist is that it's the lowest common color space which the crappiest of crap screens are "close enough" to display.

Outside of a color aware environment, the colors are still not "mapped wrong". They are mapped for another color space. Your CMYK printer is mapped for CMYK colors. Your TV is mapped for YPbPr/YUV with PAL/NTSC/SECAM. Your DVD collection isn't even sRGB. The limitation is in your screen and your OS, which are limited to sRGB. With aRGB and larger gamut screens in a color aware environment, you have fewer limitations. So, most material on the web is sRGB? BAM, color aware browser (like Firefox 3). So, your work in office programs like PowerPoint shows the sRGB slides as aRGB? BAM, color aware Office (the newest Office pack). So, your OS shows sRGB as aRGB? BAM, get yourself a color aware OS if it matters.

sRGB is the limitation, not the answer.
 
Also, as far as color critical...no, I do not have to do anything color critical...but I can say that the colors on my screen match up almost exactly with the colors from my Canon dedicated 4x6 photo printer and prints that I have sent off to be professionally printed, from small to full poster size. I have also used IPS LCDs and Sony professional-grade color-corrected CRT monitors, so I do have good comparisons and am not just blowing smoke. My one very very good 940b is not quite as good as the IPS or the Sony...but it is also half the price if not less, and still very good and better than many monitors I see :cool:

Don't get me wrong, I still think full-time designers should have proper high-end panels or CRTs, but for the hobbyists out there some more cost-effective alternatives do exist...I would rather put my money into more glass and shooting gear than pay double for a monitor that looks just fractionally better when I have no use for it to be 100% perfect.

I can see your point, and though I don't agree with your recommendations, the main thing is that you have found a screen that suits your needs. :)

Like photography, where using the correct lens, making sure the lighting hits at the right angle etc. matters, having the right screen for the right job is important. Sure, you can take good pictures with a semi-decent fully automatic digital camera with image stabilizer, but you have less control over the output.

I am not a good photographer. I'm a lousy photographer, to be honest, and if I were to take pictures myself that I use, no editing could save them. A semi-decent fully automatic digital camera with image stabilizer would probably be best in my hands if I were to take a picture. But I would never recommend it to someone who asks what camera photographers should use, the same way I would never recommend a TN or VA panel for photographers.
Post-processing of photos is a mighty sword to wield. You need to trust your tool, as with cameras, in order for it to swing where you wish.

Let's take your screen, the Samsung 940b, into the world of post-processing:

Backlight bleed (up to a certain extent), pixel defects (up to a certain extent) and factory calibration (up to a certain extent) are issues that can happen with any screen, but they don't affect post-processing to a large enough degree to make it unusable. So, in this area, there is not much risk.

However, it's a 6-bit TN panel. This means it has 262,144 true colors (64 shades per RGB channel). This is not good if you work with 8-bit to 16-bit images.

It has dithering, which means it can simulate up to 16.2M colors (newer TNs can dither up to 16.7M colors due to a 9-bit LUT/HI-FRC). That improves things a bit, but you work with 8-bit to 16-bit colors and have already introduced dithering to them. That's not good either.
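For the curious, a toy Python sketch of the 6-bit-plus-FRC arithmetic being described (the 4-frame cycle and the divide-by-4 mapping are simplifications, not the panel's actual algorithm):

[code]
# Toy sketch of 6-bit + FRC, matching the numbers in the post above.
print("true colours on a 6-bit panel:", 64 ** 3)    # 262,144
print("true colours on an 8-bit panel:", 256 ** 3)  # 16,777,216

def frc_frames(value_8bit, frames=4):
    """Fake an 8-bit level on a 6-bit panel by flickering between the
    two nearest 6-bit levels over successive frames (temporal dithering)."""
    target = value_8bit / 4.0                  # crude 8-bit -> 6-bit mapping
    lo = int(target)
    hi = min(lo + 1, 63)                       # clamp to the 6-bit range
    hi_count = round((target - lo) * frames)
    return [hi] * hi_count + [lo] * (frames - hi_count)

print(frc_frames(130))  # [33, 33, 32, 32] -> averages out to "32.5"
[/code]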

The viewing angles of this panel are restricted. Granted, it doesn't have the color/gamma/brightness shifts at normal angle +/- 5 degrees off center like VAs, plus the left side being brighter than the right, but it does have darkening/brightening at the top/bottom of the screen due to the vertical viewing angle limitation.

In addition, there are limited calibration possibilities, and the risk of problems with brightness uniformity is larger. Brightness uniformity problems are something most people are not aware of. Backlight bleed only affects the lowest shades of black, while brightness uniformity problems affect ALL shades. You can actually have different shades of a color in different places on screen, even though it should be one uniform color. Not always easy to spot, but it greatly affects post-processing when it comes to any color adjustments.

It has overdrive you can turn off, and it can cause RTC (response time compensation) artifacts even in still pictures.

If you work with a 16-bit image and have a calibrated GFX LUT with a 16-bit profile, then your path is as follows:
16-bit Photoshop -> 16-bit GFX LUT -> 8-bit monitor LUT -> 6-bit panel -> FRC kicks in and dithers up to 16.2M colors.

You are limited to display only sRGB as color space.

There is no control over drift in colors and brightness etc.

By getting a TN panel, you have little control over the output.

Let's take my screen, the NEC 2690WUXi, into the world of post-processing:

It's a true 8-bit H-IPS panel. This means it can natively show 16,777,216 true colors. This gives you enough colors to play with when working with 8-bit to 16-bit images.

Viewing angles are stable, which means you have high image consistency. Your eyes can trust the image not to be changed and warped by the screen.

You can hardware calibrate the screen itself for optimal accuracy and are therefore not at the mercy of factory calibration.

It has both wide gamut and an sRGB preset, which dumbs the gamut down to sRGB when needed.

Brightness uniformity is not an issue thanks to ColorComp, which evens out brightness and color over the whole display. Each panel has been individually measured at the factory at over 100 different points of the screen, and the data is stored in the display. Internally, there is a sensor that continuously monitors brightness and color, checking for drift and compensating for optimal stability.

12-bit gamma correction (programmable with different gamma curves) and a 12-bit LUT (programmable) decrease the chance of rounding errors and help keep the colors accurate and the gradients smooth.

If I work with a 16-bit image and have a calibrated GFX LUT with a 16-bit profile, then my path is as follows:

16-bit Photoshop -> 16-bit GFX LUT -> 8-bit to 12-bit LUT with 12-bit gamma correction -> 8-bit output on screen.
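A hedged sketch of why that higher-precision intermediate step matters: the snippet below is not NEC's implementation, just a generic demonstration that chaining a gamma correction and its inverse through an 8-bit intermediate loses grey levels (banding), while a 12-bit intermediate preserves them.

[code]
# Count how many of the 256 grey levels survive a correction + inverse
# when the intermediate result is quantised to a given bit depth.
def surviving_levels(intermediate_bits, gamma=1.2):
    top = 2 ** intermediate_bits - 1
    survivors = set()
    for level in range(256):
        # correction stage, quantised to the intermediate precision
        corrected = round(((level / 255.0) ** gamma) * top)
        # inverse stage (think: panel response), back to 8-bit output
        survivors.add(round(((corrected / top) ** (1.0 / gamma)) * 255))
    return len(survivors)

print(" 8-bit intermediate:", surviving_levels(8), "of 256 grey levels")
print("12-bit intermediate:", surviving_levels(12), "of 256 grey levels")
# The 8-bit path merges dark levels (visible as banding in gradients);
# the 12-bit path keeps all 256 distinct.
[/code]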

Switchable overdrive and many more features are packed into this screen. It would take too long to list every feature, but they are there to ensure you can trust what your eyes see.

Cost effective?

Let's say I get a picture in 16-bit aRGB. The gamut is large enough, so I can display it "close enough" in Photoshop. I can edit the picture and trust that what I see is the correct colors, brightness etc.
If I wish to remove an object, I can do so. Dithering, compression and other artifacts will be made by the image itself and not the screen, so I can safely edit them there. If I wish to add dithering, the screen won't add extra dithering on top. I can change the color balance, brightness, contrast, remove objects etc. and trust that I am removing what the image contains, not what the screen warps it to show.
After editing, I can store an edited 16-bit aRGB file comfortably, knowing I don't have to redo the work, since my screen showed me the image accurately. If needed, I can also store it as a CMYK and an sRGB image and soft-proof them, since they are subspaces within my monitor's gamut range. No need for out-of-gamut warnings and clipping. When needed, I can pull out the CMYK for print on CMYK printers, the aRGB for aRGB printers and the full range of colors, the sRGB for web usage, etc. The work has been done already, so I don't waste time.

Wasted time must be added to the "cost effective" equation. With a cheap screen only capable of sRGB, I'd be limited, and it would take more time to spot errors and double-check the outcome, since the cheap screen couldn't be trusted.

I can't state enough how visible the difference is between pictures taken and printed in 16-bit aRGB and pictures taken and printed in sRGB. Green looks more green and less lime, etc.

My time is valuable. Not only have I already recouped the extra I spent getting a proper tool for my job, but I also gained more free time, which is worth a lot by itself. A cheap TN screen would have been MORE expensive in my case.

"Cost effective" is relative in my opinion and in my arguments above.

If your time is cheap and your pictures don't deserve the extra care and control in post-processing, then a TN panel is most cost effective. If you are serious about color and want to keep your image as intact as possible through every conversion and edit, then getting a cheap TN panel wouldn't be cost effective at all.


These are my two cents about this at least, though I respect that you have a different opinion and found a screen suitable for your needs. But I strongly disagree with recommending a TN (or VA) as a general screen for photographers. All the care and consideration you photographers put into your images when taking them deserves better treatment than that! ;)
 
...Your statement "As a result, all of the colors are mapped incorrectly, not just those outside of the boundary." is the misleading one. In a color managed workflow, this is incorrect. The world doesn't revolve around your "everything is mapped wrong, since it is not sRGB". If you rent a DVD, it's mapped just as it should be, and if you ever find a commercial DVD that's sRGB or a TV/HDTV broadcast that's sRGB, please let me know...

Outside of a color aware enviroment, the colors are still not "mapped wrong". They are mapped for another color space. Your CMYK printer is mapped for CMYK colors. Your TV is mapped for YPbPr/YUV with PAL/NTSC/SECAM. Your DVD collection don't even have sRGB. The limitation is in your screen and your OS, which is limited to sRGB....
Well I am going to boil this down to the central topic. Again, you present another strawman. I never stated that colors would be mapped incorrectly in color aware programs. In fact I said the opposite. So let me try to make it clear.

1) Color-aware programs are not affected by the default color space of a wider gamut monitor. Whether you have a 72% or 92% gamut monitor, sRGB should look about the same in a color-aware program as long as your color management policies are set up correctly.

2) My references to colors being mapped wrong are solely about non-color-aware programs (OS, games, etc.). The standard is sRGB, and all PC desktop operations, games and the internet are designed for the sRGB color space.


...Outside of a color-aware environment, the colors are still not "mapped wrong". They are mapped for another color space. Your CMYK printer is mapped for CMYK colors. Your TV is mapped for YPbPr/YUV with PAL/NTSC/SECAM. Your DVD collection doesn't even use sRGB. The limitation is in your screen and your OS, which are limited to sRGB....

Wow, please re-read your first statement and tell me how that is not mapped wrong. If content was created for the sRGB color space and it is displayed in another color space without being converted, then by definition it is mapped wrong. Are you serious in backing up that statement?

Non-HD TV is mapped to Y'CbCr, while HDTV is mapped to the newer ITU-R BT.709 standard; both are tied to RGB color spaces. sRGB was designed to work together with the older broadcast TV standard and with the newer HDTV standard, which wasn't even a reality when sRGB was created. I quote:

Figure three illustrates both the sRGB color space and the extraction of the monitor only specifications implicit within the ITU-R BT.709 standard. By producing such a monitor space, one can then transfer the ITU-R BT.709 encoded signals to other devices. By building on this system, the sRGB color space provides a monitor definition that can be used independently from the ITU-R BT.709 standard while maintain compatibility. This allows for the well-defined transfer of color information across the World Wide Web as described in the other section of this paper.
– Source: W3.org white paper, "A Standard Default Color Space for the Internet – sRGB"

Most broadcast HDTV is still optimized for 72% gamut, as wide gamut HDTVs are still a small share of the worldwide and US market. Again, that will likely change, but the standards are designed for compatibility.

The default color space of today's wide gamut monitors is not. You seem to want to shift the focus to the OS or other software applications, but the fact is the monitor manufacturers are aware of these standards and could have adjusted their default drivers and firmware to display current PC content correctly while still supporting the newer HDTV standards.

The silver lining is that as they continue to raise the gamuts into 2008-09, they will be forced by the market to address this issue: displayed content that looks a little oversaturated now will start to look just silly as the gamuts rise if the issue is not addressed.
 
Well I am going to boil this down to the central topic. Again, you present another strawman. I never stated that colors would be mapped incorrectly in color aware programs. In fact I said the opposite. So let me try to make it clear.

Calling my posts a "strawman" is nothing but a pathetic "poisoning the well" approach. If you cannot discuss this civilly, then don't discuss at all.

In your "sRGB evangalism", you claim that devices like wide gamut screens are "mismapped (your exact wording)". You go to the extent of saying:

By any reasonable definition it is an error.

They are not mismapped at all, and it's definitely not an error. They are mapped for another color space, as they should be, since sRGB is limited.


Download Safari 3 beta here:
http://www.apple.com/safari/

You can easily uninstall it later if needed.

Open these pages in Safari:

http://www.color.org/version4html.xalter
http://www.gballard.net/psd/go_live_page_profile/embeddedJPEGprofiles.html

You don't need an sRGB screen to browse the internet. It's even a disadvantage if you are in the market for some good aRGB pictures and the photographer has posted them tagged as aRGB, while you view them in limited sRGB.

As for broadcasts, HDTV and DVDs, all of this needs to be converted into an RGB color space. They are not broadcast and encoded in RGB, for both bandwidth and compatibility reasons.

Here is some background on conversion issues between color spaces and multimedia:

You are looking at different color spaces...

Y'CbCr - this is what's sent over SDI and recorded onto digital formats like dBeta. Legal black level is always at 16(Y'), and white level at 235(Y'). Note that the unit there is Y'.
Sometimes mistakenly called YUV. If it's digital, they are probably referring to Y'CbCr.

R'G'B' - lots of computer-oriented programs use this. Legal range is usually 0-255(RGB), but is sometimes 16-235(RGB). This behaviour depends on what codec you are using to convert from RGB<-->Y'CbCr. Some codecs want to see 16-235 levels, most want to see 0-255.
http://forums.creativecow.net/readpost/21/856602
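
For reference, here is a simplified sketch of the video-range scaling the quote above describes, using the published BT.709 luma coefficients (real decoders also handle chroma subsampling, clamping and so on):

Code:
# Hedged sketch: 8-bit "video legal" BT.709 Y'CbCr (16-235 / 16-240) to
# full-range 8-bit R'G'B'.
import numpy as np

def ycbcr709_to_rgb(y, cb, cr):
    """Convert 8-bit video-range BT.709 Y'CbCr to 8-bit full-range R'G'B'."""
    kr, kb = 0.2126, 0.0722          # BT.709 luma coefficients
    kg = 1.0 - kr - kb
    yn  = (y - 16) / 219.0           # scale video-range luma to 0..1
    cbn = (cb - 128) / 224.0         # scale chroma to -0.5..0.5
    crn = (cr - 128) / 224.0
    r = yn + 2 * (1 - kr) * crn
    b = yn + 2 * (1 - kb) * cbn
    g = (yn - kr * r - kb * b) / kg
    return np.clip(np.round(np.array([r, g, b]) * 255), 0, 255).astype(int)

# Video "white" (235) and "black" (16) land on full-range 255 and 0:
print(ycbcr709_to_rgb(235, 128, 128))   # -> [255 255 255]
print(ycbcr709_to_rgb(16, 128, 128))    # -> [0 0 0]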

Partial list of color spaces:
http://www.experiencefestival.com/a/Color_space_-_Partial_list_of_color_spaces/id/1243566

Note that TV/Broadcast/DVD are NOT sRGB.

This is a thread about screens for photographers, so I want to ask you following:

Why should a photographer limit himself/herself to sRGB?

I'll tell you why he/she shouldn't:

He/she is not limited to see his/her pictures only in sRGB.
He/she is not limited to edit his/her pictures only in sRGB.
He/she is not limited to see others pictures only in sRGB.
He/she can use color-aware browsers that respect embedded .icc profiles and see the pictures correctly, instead of everything being dumbed down to sRGB.
He/she can edit/view pictures in 16-bit aRGB and convert+softproof them in both sRGB and CMYK without out of gamut errors and clipping.
He/she can capture images in larger color spaces to gain the extra color depth found in nature and everywhere else, and be able to view them afterwards as they were captured. With sRGB, he/she needs to dumb them down to the small sRGB container so the limited gamut can display them.
For those of us who want good photos: aRGB (and above) is a good package deal. You get aRGB, sRGB and CMYK all in one. :D

sRGB is the limitation, not the answer.
 
The only 2 reasons I could think of would be the wish to preview sRGB colors as exactly as possible,
as there will be a slight loss of displayable colors when only using the sRGB portion of an aRGB monitor (panels are limited to 8 bit, so it won't be possible to use all 256 steps).

It's also possible that you are forced to do color-critical work in non-color-aware software and can't find a way of calibrating the aRGB monitor to display sRGB colors.
(there isn't an option to do so in the Eye-One software, but I think some of the Spyders can calibrate to a target color space)

Neither seems to apply to a photographer unless he/she works on web graphics only. ^_^

It's a pity that 10-bit/channel wide color panels aren't out yet, as they should be able to display sRGB better than the current sRGB monitors...
 
The only 2 reasons I could think of would be the wish to preview sRGB colors as exactly as possible,
as there will be a slight loss of displayable colors when only using the sRGB portion of an aRGB monitor (panels are limited to 8 bit, so it won't be possible to use all 256 steps).

It's also possible that you are forced to do color-critical work in non-color-aware software and can't find a way of calibrating the aRGB monitor to display sRGB colors.
(there isn't an option to do so in the Eye-One software, but I think some of the Spyders can calibrate to a target color space)

Neither seems to apply to a photographer unless he/she works on web graphics only. ^_^

It's a pity that 10-bit/channel wide color panels aren't out yet, as they should be able to display sRGB better than the current sRGB monitors...

Two very good reasons, if your work requires such accuracy. :)

Unfortunately, there are few screens that give such accuracy. :( The 2490WUXi gives you the option to hardware calibrate the monitor LUT and gamma, and you can implement the correct sRGB tonal response (sRGB IEC 61966-2.1) instead of using a plain 2.2. It also has a 12-bit LUT (programmable) and 12-bit gamma correction (programmable). But most screens on the market are limited in this sense, so for this accuracy, only the pro screens with hardware calibration can give you that.
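
To see what "sRGB tonal response instead of 2.2" actually means, here is a small sketch comparing the two curves (plain NumPy, no vendor software involved). They agree closely in the highlights but diverge strongly near black, where sRGB switches to its linear toe segment:

Code:
import numpy as np

def srgb_eotf(v):
    """sRGB tonal response (IEC 61966-2.1): signal 0..1 -> linear light 0..1."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def gamma22_eotf(v):
    """Plain 2.2 power law, what most 'gamma 2.2' presets assume."""
    return np.asarray(v, dtype=float) ** 2.2

signal = np.arange(1, 256) / 255.0
ratio = srgb_eotf(signal) / gamma22_eotf(signal)
diff = srgb_eotf(signal) - gamma22_eotf(signal)

# The curves differ by a large factor in the deep shadows (the linear toe),
# which is where a 2.2-only calibration misrepresents sRGB content.
print(f"shadow (level 5) srgb/2.2 ratio: {ratio[4]:.2f}x")
print(f"max absolute difference in linear light: {np.abs(diff).max():.4f}")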

As for me, with the wide gamut 2690WUXi, I have some options that can increase my accuracy. I can either edit pictures in 16-bit aRGB before converting them to sRGB and do soft-proofing then, or I can use the sRGB preset, which limits the gamut to sRGB (the preset cannot be changed or recalibrated, though). SpectraView Profiler gives an option to software calibrate (the graphics card LUT) as well, with 16-bit precision and selectable gamma curves. Not as good as using the 2490WUXi for this, though.

There are 10-10-10 options as well, but they will cost you. For such sensitive tasks it might be worth it in many cases, though (if it's for commercial purposes).

You need at minimum a newer ATI card (I think all X1xxx have this) or an Nvidia 8800 (if I'm not mistaken). Those have support for 10-bit DVI. You need an operating system that supports 10-bit (like Vista). Then you need a 10-bit screen like the NEC 2180WG-LED. This screen gives you a 10-bit DVI path over dual-link DVI and a 10-bit display. It supports both aRGB and sRGB. :D Expensive, though...
 
Benny,

F.Y.I. I had my EyeOne out this morning and noticed you can calibrate to an .icc, so you can calibrate to a working space.

I have the current version. I don't know if this is a recent feature in the software.

It's also possible that you are forced to do color-critical work in non-color-aware software and can't find a way of calibrating the aRGB monitor to display sRGB colors.
(there isn't an option to do so in the Eye-One software, but I think some of the Spyders can calibrate to a target color space)
 
dmccombs,
Hmm, maybe they implemented something, but tech support told me a few weeks ago that such a feature was not supported.
I'm not an expert at calibration, so how can the .icc profile be used to make the display emulate the smaller color space?

Tamlin_WSGF,
Thanks, I didn't know the 2180WG-LED supported 10-bit color. It kinda hurts that the upcoming Samsung and LG 30'' LED monitors don't support 10-bit (specs listed on the German site http://monitor.samsung.de/article.asp?artid=2AA18C62-B937-49ED-8741-E3C3369D35F3&show=specs and http://www.lgphilips-lcd.com/homeContain/jsp/eng/prd/prd201_j_e.jsp), as Samsung has already shown a panel capable of it and the NEC demonstrates that it's possible with DVI... The LED gamut might even put the gradations used in an emulated smaller gamut into a range perceivable by normal humans ^^;

On the other hand, it makes us happier with our current displays (3007WFP-HC in my case), as we don't have to worry about much better upcoming products as long as they stay 8-bit... ^^;
 
Calling my posts a "strawman" is nothing but a pathetic "poisoning the well" approach. If you cannot discuss this civilly, then don't discuss at all…
Dial it down a notch, Jesus. "Pathetic" is not exactly civilized. I was merely drawing attention to the fact that you were misrepresenting my claims, which you did several times. Nothing uncivilized about that.

They are not mismapped at all, and it's definitely not an error. They are mapped for another color space, as they should be, since sRGB is limited.
Let's take a moment and analyze this, because in my view it is not a reasonable statement. If content is designed for a specific color space, whether that is aRGB, Pro RGB, etc., and it is displayed in another color space, then you are not seeing the correct color mapping. In this case, sRGB is the standard for the Windows OS, games and the web; in other words, all non-color-aware PC applications and content. So viewing any of that content in a color space other than sRGB is not the color mapping intended by the content creator.

From a purely technical hardware point of view you are right, as I stated before, but given that this is a known, well-established standard, it is mapped incorrectly if mapped to anything other than sRGB. I am confounded as to why you would argue that point.

Additionally, you state "They are mapped for another color space as they should be, since sRGB is limited," but I do not follow your logic. What is gained by displaying sRGB content in a wider gamut without conversion? There is no added detail, just color values mapped to shades not intended by the content creator. All color spaces are limited. What is important is that content is displayed in the correct color space, which in this case it is not.
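
To put rough numbers on that, here is a back-of-the-envelope sketch (plain NumPy, the commonly published rounded D65 matrices, and Adobe RGB's gamma simplified to 2.2) of what happens when 8-bit sRGB values are sent to a panel that interprets them with Adobe RGB primaries:

Code:
# Hedged illustration of the "mapped wrong" point; back-of-the-envelope math,
# not a monitor simulation.
import numpy as np

# linear RGB -> CIE XYZ matrices (D65 white point, rounded published values)
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

def srgb_decode(v):   # sRGB signal -> linear light
    v = np.asarray(v, float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):   # linear light -> sRGB signal
    v = np.clip(np.asarray(v, float), 0, 1)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

pixel = np.array([200, 150, 120]) / 255.0          # an sRGB skin-tone-ish pixel

# What the content creator intended (sRGB), expressed as XYZ:
intended_xyz = SRGB_TO_XYZ @ srgb_decode(pixel)
# What a wide gamut panel shows if it treats the same numbers as Adobe RGB
# (Adobe RGB uses roughly a 2.2 gamma, simplified here):
shown_xyz = ADOBE_TO_XYZ @ (pixel ** 2.2)

# Express the shown color back in sRGB terms to see the shift in 8-bit units
shown_as_srgb = srgb_encode(np.linalg.inv(SRGB_TO_XYZ) @ shown_xyz) * 255
print("intended 8-bit sRGB:", (pixel * 255).round().astype(int))
print("what the panel shows, re-expressed in sRGB:", shown_as_srgb.round().astype(int))

In this toy example, a skin-tone-ish sRGB value of [200, 150, 120] is displayed as roughly what an sRGB monitor would show for [217, 151, 119]: redder and more saturated than intended, which is exactly the "everything looks more vibrant" effect people describe.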

He/she is not limited to see his/her pictures only in sRGB.
I never said they should be. What I do believe is that the buyer should know what they are getting. Wide gamut may be the best fit, but for most users it creates more problems than it solves in its current form.

He/she is not limited to edit his/her pictures only in sRGB.

He/she is not limited to see others pictures only in sRGB.

He/she can use color-aware browsers that respect embedded .icc profiles and see the pictures correctly, instead of everything being dumbed down to sRGB.
You seem to have a rather strange view of sRGB. You do realize the difference, when mapped correctly, is very subtle. You can take any two raw photos, color correct them for either aRGB or sRGB, then display each in its correct color space, and in most cases the difference would be very slight and probably not noticeable to the average user.

He/she can edit/view pictures in 16-bit aRGB and convert+softproof them in both sRGB and CMYK without out of gamut errors and clipping.
Yes and no. Yes, if you have a need to work in color-critical wider gamuts (a very small user base), then it might be worth the trouble. But no, it is not 16-bit aRGB. We are still limited to 8 bit until DisplayPort.

He/she can capture images in larger color spaces to gain the extra color depth found in nature and everywhere else, and be able to view them afterwards as they were captured. With sRGB, he/she needs to dumb them down to the small sRGB container so the limited gamut can display them.
Again, the difference for images that have been converted correctly is very small in most cases. But yes, if someone has the need then it might be the best solution.

My point is not that the current wide gamut monitors are junk. But there are some serious problems (displaying non-color-aware content, 8 bit rather than 16 bit for wide gamuts, etc.), so buyer beware. Personally, I don't think wide gamut monitors are worth the money until DisplayPort is on the market and they resolve the default color space issue of standard PC content.

If it were my cash, I would stick to the NEC 2090, 2190 or 2490.
 
Luthorcrow: web user/consumer perspective. sRGB is less hassle and just fine; it's the standard.

Tamlin_WSGF: general content creator perspective. WCG-CCFL; prefers a more involved production workflow with more options and better results.

Couldn't we just settle on WCG-CCFL being better for producing content for print/photo, and sRGB being better if you produce content for PC/web only?

Both have pros and cons
WCG-CCFL will need caution and the workflow will be a bit complicated until you get the hang of it.
sRGB will be fine as long as you know what space you are working in (and correctly tag your images so they will be displayed properly on wide-color screens; see the tagging sketch below), but it will be impossible to accurately display colors outside the sRGB range.

Normal, non-content-creating users won't care about falsely displayed colors and might even prefer the oversaturated wide-color screen.
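
As a side note on the tagging point above, here is a tiny sketch of embedding an sRGB profile with Pillow when saving for the web; the file names are placeholders, and it assumes the image has already been converted to sRGB:

Code:
# Embed an sRGB ICC profile when saving, so a color-aware browser or a wide
# gamut screen knows what the numbers mean. Untagged files cause guessing games.
from PIL import Image, ImageCms

img = Image.open("final_for_web.jpg")               # assumed: already in sRGB

srgb_profile = ImageCms.createProfile("sRGB")        # built-in sRGB profile
icc_bytes = ImageCms.ImageCmsProfile(srgb_profile).tobytes()

img.save("final_for_web_tagged.jpg", icc_profile=icc_bytes, quality=95)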

But no, it is not 16-bit aRGB. We are still limited to 8 bit until DisplayPort.

I believe the statement is only referring to previewing 16-bit data on an 8-bit screen. You two seem to be talking past each other.

Luthorcrow,
The only problem with your arguments is that you started off being pretty uninformed about the color spaces relevant to print/photography, which led to some rather embarrassing statements that made it hard to take you seriously.
But you are indeed right that the current HDTV broadcast standard is equal to sRGB or PAL/SECAM, which is why some TVs support turning off the wide gamut, as is possible with the 2690.
Sony is, however, trying to push a new, wider standard on the consumer side with wide-color-capable cameras and TVs.
Cinema production itself is, however, a completely different beast.
 
Tamlin_WSGF: general content creator perspective. WCG-CCFL; prefers a more involved production workflow with more options and better results.

Couldn't we just settle on WCG-CCFL being better for producing content for print/photo, and sRGB being better if you produce content for PC/web only?

This we can, though I would like to stress the word "only". :)

A final link in these matters:
http://www.eizo.com/products/graphics/cg221/features.asp#adobeRGB



I believe the statement is only referring to previewing 16-bit data on an 8-bit screen. You two seem to be talking past each other.

This is true. :) Previewing and also working in 16-bit mode on an 8-bit screen has its advantages for preserving information during edits.
http://www.earthboundlight.com/phototips/8bit-versus-16bit-difference.html

Depends a bit on what kind of edits you wish to make.
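
For anyone wondering what "preserving information during edits" looks like in numbers, here is a small sketch (plain NumPy; a gamma push/pull stands in for any heavy-handed adjustment) that counts how many distinct tonal levels survive a destructive round trip at 8 bits versus 16 bits:

Code:
# Apply a strong curve and then undo it, quantizing after each step, and count
# how many distinct tonal levels survive.
import numpy as np

ramp16 = (np.linspace(0, 1, 256) * 65535).round().astype(np.uint16)
ramp8  = (np.linspace(0, 1, 256) * 255).round().astype(np.uint8)

def push_pull(img, maxval):
    """Darken with gamma 3, then 'correct' it back -- a destructive round trip."""
    x = img / maxval
    darkened = np.round((x ** 3.0) * maxval) / maxval       # quantize after edit 1
    restored = np.round((darkened ** (1 / 3.0)) * maxval)   # quantize after edit 2
    return restored.astype(img.dtype)

print("levels left after 8-bit round trip :", len(np.unique(push_pull(ramp8, 255))))
print("levels left after 16-bit round trip:", len(np.unique(push_pull(ramp16, 65535))))

The 8-bit ramp comes back with far fewer distinct levels (the classic combed histogram), while the 16-bit ramp keeps nearly all of them.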

On a side note, since you have the 3007WFP-HC:

An updated version of firefox 3 beta is available:
http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/latest-trunk/?C=M;O=D

Looking good for wide gamut users. :D It has some issues with v4 YCC-RGB, but it works better than Safari IMO.
 
Does Firefox 3 (I tried a8 and a9) require some specific setting to be turned on for the CMS to work? I can't find one, and color management seems to be off (all tests fail). :confused:
 