Turns out I hate wide color gamut, what new monitor do I get?

TheManko

Weaksauce
Joined
Jun 13, 2004
Messages
122
I'm using an HP w2408h monitor and its colors have always bothered me. No, it's not because it's a crap monitor; I have a Spyder 3, which helps a lot there. As it turns out, my problem is simply that it's a wide color gamut monitor. Using the color-managed option in Firefox 3 and experimenting with Safari, I have come to the conclusion that wide color gamut is a horrible abomination of a feature that has no place in any consumer screen. How am I supposed to enjoy the picture if colors look oversaturated in games and movies no matter what I do? What is the point of wide color gamut anyway? I thought sRGB was the standard.

So what are my options for a new monitor? I have looked at many, many monitors and pretty much all monitors released this year are wide color gamut. Do any screens have an option to turn it off? Because they're all useless to me otherwise.

I assume that LCD TVs aren't wide color gamut, since it wouldn't make any sense to screw up the colors for all the content that will be displayed on them. So is that my only option, unless I have missed some magical screen that isn't TN and isn't wide color gamut?
 
Interested in this also, for I am considering an HP, and don't want to suffer through movies with crazy oversaturated colors.

I also don't know what is going to give me a more accurate representation of digital images I take.
 
I am glad I did not buy a wide color gamut monitor - it always seemed like this would greatly bother me; this just confirms it.

Thank you for sharing.
 
More accurate color management for me. Thanks wide gamut...

Movies? Almost all mainstream DVD programs offer saturation controls.

Web browsing? Many upcoming browsers will come with a color management option.

LCD TVs match the sRGB color profile because most broadcast content is sRGB/untagged.
 
It's just too bad that where I live I have plenty of opportunity to see the same games and movies on non-wide-gamut monitors, so as it turns out, no matter what I do with the saturation controls I can't get rid of that oversaturated look. I did, however, try that pixel shader trick in Media Player Classic from the AVS Forum that correctly converts the colors, and it looks just as it should. Too bad I have tons of Blu-ray movies and PowerDVD is very limited in those options.

With games, the worst case scenarios are probably Grid and Dirt, because they both have very strong colors, and with a wide color gamut many visual anomalies pop up, such as roads looking very different from how they're supposed to look. It's so bad that I'm close to going back to my old Viewsonic vx2025 even though it has a stuck pixel.

I mean, if you never play games and never watch Blu-ray I could see how you could live with a monitor like this, but as it turns out 80% of what I do with the computer is stuff that currently can't be converted to a proper color space.

I know very little about LCD-tv colors, so is there something I need to know before I run out and buy one? I live in Europe (land of PAL), so what could I expect from the colors of a tv?
 
PAL is pretty close to sRGB
(the difference is nothing a normal person would notice: http://upload.wikimedia.org/wikipedia/de/f/fd/CIE_RGB-CMYK-Beleucht.png ;
some monitors seem to have a backlight closer to the PAL gamut than to sRGB without anyone complaining:
http://www.lesnumeriques.com/duels.php?ty=6&ma1=35&mo1=189&p1=1815&ma2=88&mo2=116&p2=1217&ph=7)

But almost all LCD TVs that come with a wide-color backlight let you reduce the gamut.
(www.hdtvtest.co.uk has some of the nicest reviews, including input lag measurements in the newer ones.)
 
Thanks, I did some reading on that hdtvtest site and it seems like there are plenty of good alternatives! Now let's see how much money I can save until Christmas.
 
Totally agree with the original poster. Wide gamut is an abomination: you get zero benefit from it, because there is NO WIDE GAMUT MATERIAL, and nothing but grief if you like normal colors. Everything is normal gamut, and when you display it on a wide gamut monitor the color will be strange, usually oversaturated, but sometimes the hues are outright wrong.

Basically the marketing department found a new number they could put on the box and crank up. Because in marketing 101, bigger number is better. (argh simpletons).

After ditching my wide gamut display I considered the following sRGB displays:

BenQ G2400WD, V2400W - These are both sRGB with very good default color out of the box.
NEC 2490WUXi - Higher end (much more expensive) sRGB, and what I bought.
 
I totally agree with lifanus

I can also see how some people can be turned off by it. Wide gamut is for people who work in photography and who use OS X.

Avoid it if you are not going to be using it heavily with Photoshop.
 
I'm so sad--I thought color management in FF3 would take care of any problems when I bought my Planar PX2611W. I'm a web designer, so most of my time is spent in Photoshop, Dreamweaver, and Firefox.

Not so. Colors look horrid when compared with my Dell 2007FP w/ IPS at work. I've managed to get it so that it's OK, but it still doesn't look that great. And I bought it from New Egg, so I don't think I can exchange for credit towards the NEC 2490. :(
 
Wide gamut is an abomination: you get zero benefit from it, because there is NO WIDE GAMUT MATERIAL, and nothing but grief if you like normal colors. Everything is normal gamut
There's nothing normal about the sRGB gamut; if anything, sRGB is the real "holy cow" here.
First of all, sRGB covers only a minority of what the human eye can detect; it can't even represent all the colours available from basic printing and cheap printers, and it was literally designed around the lowest common denominator. Unsurprisingly, Microsoft was the biggest name behind it, pushed it through with an "sRGB is enough for everyone" mentality, and tried to make everything comply with it alone instead of going for actual colour management.
So why don't you complain to them for going where the fence was lowest?
If MS had done their work correctly there wouldn't be any problems now; you would just install the display's colour profile to keep colours correct. Graphics card makers should also be able to implement gamut conversion in their drivers, considering Media Player Classic can already use the GPU for that.
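For what it's worth, the driver-side conversion being asked for here is mostly a small per-pixel calculation. Below is a minimal pure-Python sketch of it, with Adobe RGB 1998 standing in for a generic wide-gamut panel (the matrices are the published sRGB-to-XYZ and XYZ-to-Adobe-RGB values, and a plain 2.2 power curve approximates both transfer functions):

```python
# Published RGB-to-XYZ matrices (D65 white point). Adobe RGB 1998 stands
# in for a generic wide-gamut panel here.
SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]
XYZ_TO_ARGB = [[ 2.0416, -0.5650, -0.3447],
               [-0.9692,  1.8760,  0.0416],
               [ 0.0134, -0.1184,  1.0152]]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb_on_wide_panel(rgb):
    """Re-encode an sRGB pixel (0..1 floats) so a wide-gamut panel shows
    the intended color: decode gamma, map sRGB -> XYZ -> panel RGB,
    re-encode. A plain 2.2 power curve approximates both transfer curves."""
    linear = [c ** 2.2 for c in rgb]
    xyz = mat_vec(SRGB_TO_XYZ, linear)
    wide = [min(max(c, 0.0), 1.0) for c in mat_vec(XYZ_TO_ARGB, xyz)]
    return [c ** (1 / 2.2) for c in wide]

# Neutrals pass through unchanged; fully saturated sRGB red has to be
# *reduced* on the wide panel, otherwise it displays oversaturated.
print(srgb_on_wide_panel([1.0, 1.0, 1.0]))  # ~[1.0, 1.0, 1.0]
print(srgb_on_wide_panel([1.0, 0.0, 0.0]))  # red drops to ~0.86
```

This is presumably the same math the Media Player Classic pixel shader trick mentioned earlier runs on the GPU for every pixel.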

And sRGB isn't the native gamut of anything, either.
As said, CMYK printing and printers produce many colours outside sRGB. Television had its own colour spaces defined long before Microsoft was even born. Photographic film has nothing to do with it, and the film stock used in movies probably has equally little to do with it.
Nor does it have anything to do with digital cameras; the only reason they commonly output sRGB is that MS pushed them to it. In fact, one reason for the introduction of the Adobe RGB colour space was to address the biggest failures of sRGB, and that's why aRGB is used for serious photo editing. With one setting change I could get Adobe RGB JPEGs directly from my camera. From RAW I could probably get a much larger colour space, very near that of human vision.

http://findarticles.com/p/articles/mi_m0MWK/is_29_12/ai_n27544118
http://photojoes.org/art275/lesson01/chapter01.html
http://www.luminous-landscape.com/tutorials/prophoto-rgb.shtml

So bark all you want, but bark up the right tree!

I'm so sad--I thought color management in FF3 would take care of any problems
Did you turn colour management on?
http://ejohn.org/blog/color-profiles/
 
E.T.:

Yes--I did turn it on. At first it still looked pretty damn bad. Then I set the path to my calibrated color profile, which made it much better. But it still doesn't look as good as my normal gamut Dell 2007FP.

Oh well, I guess if i ever branch into print design this wide gamut monitor will be better-suited for it, right?
 
There's nothing normal about the sRGB gamut; if anything, sRGB is the real "holy cow" here.
First of all, sRGB covers only a minority of what the human eye can detect; it can't even represent all the colours available from basic printing and cheap printers, and it was literally designed around the lowest common denominator. Unsurprisingly, Microsoft was the biggest name behind it, pushed it through with an "sRGB is enough for everyone" mentality, and tried to make everything comply with it alone instead of going for actual colour management.

It is not just a big microsoft conspiracy. Every bit of content out there is in sRGB. Movies/TV/Games/BluRay. Everything. Saying sRGB is nothing special is like saying it is nothing special to build trains to fit on standard rail tracks.

All content is sRGB, so your best bet is by far an sRGB monitor. The best you can hope for with wide gamut is jumping through hoops to get your sRGB content back into looking normal in some applications. If the best hope is getting back to sRGB, it makes a lot more sense to get a monitor that displays it to begin with.
 
E.T.:

Yes--I did turn it on. At first it still looked pretty damn bad. Then I set the path to my calibrated color profile, which made it much better. But it still doesn't look as good as my normal gamut Dell 2007FP.

Oh well, I guess if i ever branch into print design this wide gamut monitor will be better-suited for it, right?

Is the 2007 properly calibrated?
 
Philjohn:

I don't know; it's my work computer and was already set up when I got the job. It could be, but there's a chance it might not be. My guess is that if they were playing the Dell panel lottery, they weren't too worried about color and wouldn't bother to calibrate it.
 
<nice set of links E.T. here are some more for your collection>

Firstly, a nice Firefox test. If things are set up correctly, the image at the top should not change and the image near the bottom should change:
http://www.gballard.net/photoshop/srgb_wide_gamut.html

sRGB definition. This is from 1996 (!) I'd be delighted if someone could point me at a more recent version:
http://www.w3.org/Graphics/Color/sRGB

sRGB vs other color space definitions:
http://www.brucelindbloom.com/index.html?WorkingSpaceInfo.html

Gamut and bit depth. Read the comments too.
http://forums.dpreview.com/forums/read.asp?forum=1019&message=4778608

In short sRGB was specified over 10 years ago for efficiency, not fidelity. Eight bits per color is not a good choice when editing graphics and wide gamut is not a good choice when you only have 8 bits per color to play with.

The short in short. It's a mess.

DVI already supports 10bit color (iirc), DisplayPort even more (iirc). Now we have the monitors; all we need is OS support and everyone will be happy again. Everyone apart from gamers, that is. I wonder what 10/12/16bit color will do to frame rates :eek:
 
Not to mention games offer color correction now too, the Half-Life 2 series for example, which displays a normal picture on a wide gamut screen with color correction enabled. Kudos to wide gamut, and kudos to the supporting companies.
 
There is some serious misinformation in these posts. Let's make it clear: sRGB is not the best color space, but it is the standard for the Internet, almost all PC content, DVDs, games and movies. As a result it does matter.

There's nothing normal about the sRGB gamut; if anything, sRGB is the real "holy cow" here.
First of all, sRGB covers only a minority of what the human eye can detect; it can't even represent all the colours available from basic printing and cheap printers, and it was literally designed around the lowest common denominator...
What is your point? The same is true of Adobe RGB, Wide Gamut RGB and even RAW images. None of them cover the entire color spectrum that the human eye can detect. The only standard gamut that I am aware of that actually meets and exceeds the visible spectrum is scRGB. They are all compromises based on hardware limitations.

If I am going to have a standard, it only makes sense that it be the one that works for the widest audience.

Unsurprisingly, Microsoft was the biggest name behind it, pushed it through with an "sRGB is enough for everyone" mentality, and tried to make everything comply with it alone instead of going for actual colour management.
Throwing stones at MS for no reason. The purpose behind sRGB was to match PC content to the HDTV standard; it had nothing to do with MS vs. Apple. MS was anticipating the merging of PC, PDA and TV media via the Internet.

And sRGB isn't the native gamut of anything, either.
Not true. sRGB is designed to be compatible with, and is virtually identical to, the HDTV color space. It is also a subset of the future standard scRGB. The result is that when we eventually do have a standard that exceeds the human eye (scRGB), the old images (sRGB, HDTV) will still look correct and be mapped to the correct RGB primaries.

In fact, one reason for the introduction of the Adobe RGB colour space was to address the biggest failures of sRGB, and that's why aRGB is used for serious photo editing...
Again, only half true. aRGB is not necessarily the standard for serious photo editing. Portrait photographers generally work in sRGB because it works better for skin tones in an 8 bit world. It all depends on the professional application. The fact is neither is necessarily better than the other; they both have drawbacks.

In short sRGB was specified over 10 years ago for efficiency, not fidelity. Eight bits per color is not a good choice when editing graphics and wide gamut is not a good choice when you only have 8 bits per color to play with...
Unless something has changed, our video data paths are still 8 bits and are likely to stay that way for the near future. Until DisplayPort replaces DVI and video card manufacturers start supporting 10 bit or higher, we are in an 8 bit world. But I do agree that wider gamuts do not make a lot of sense without an increase from our 8 bit stream. Eventually we will move up to a 10-16 bit video path and use scRGB as the standard for media. I just think we are at least a decade away from that being a practical reality.
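A back-of-the-envelope illustration of why 8 bits and wide gamut fit together badly, using the published CIE xy primaries for sRGB and Adobe RGB (standing in for a typical wide gamut) and taking gamut triangle area as a crude proxy for how far apart adjacent code steps land:

```python
# 8 bits give 256 levels per channel no matter how wide the gamut is.
# Stretching those same 256 codes over a larger gamut means each code
# step covers more color distance, i.e. more visible banding.
# CIE xy chromaticities of the primaries (published values):
SRGB  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
ADOBE = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]  # stand-in wide gamut

def gamut_area(primaries):
    """Area of the gamut triangle in the CIE xy chromaticity diagram."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = gamut_area(ADOBE) / gamut_area(SRGB)
print(f"wide gamut is {ratio:.2f}x the area of sRGB")      # ~1.35x
print(f"levels per channel stay at {2 ** 8} either way")   # 256
```

Roughly a third more gamut with the same 256 codes per channel, so the same smooth gradient gets coarser steps, which is the argument for pairing wide gamut with 10 bit or better paths.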
 
Can someone please take a picture or a screenshot (and saturate it manually via Photoshop or something) of what this looks like? It would really benefit a lot of us on the fence who want to buy these monitors.

The web designer above uses the same stuff (or some of it) that I do, and that really scares me, because it's almost 100% certain that I will not like it, and it is in a FedEx truck as we speak.
 
Can someone please take a picture or a screenshot (and saturate it manually via Photoshop or something) of what this looks like? It would really benefit a lot of us on the fence who want to buy these monitors.

The web designer above uses the same stuff (or some of it) that I do, and that really scares me, because it's almost 100% certain that I will not like it, and it is in a FedEx truck as we speak.

Calibrate it properly and Firefox and Photoshop will have accurate colours - that's all you really need to worry about as a web designer.
 
Can someone please take a picture or a screenshot ...of what this looks like?

82azy3.jpg
 
Photoshop and Firefox look fine, which is what's really important. But other programs might not be color-aware, and they look really bad. I pretty much only use Photoshop, Dreamweaver, and Firefox, so I think I'll be able to live with it.
 
Unless something has changed, our video data paths are still 8 bits and are likely to stay that way for the near future. Until DisplayPort replaces DVI and video card manufacturers start supporting 10 bit or higher, we are in an 8 bit world. But I do agree that wider gamuts do not make a lot of sense without an increase from our 8 bit stream. Eventually we will move up to a 10-16 bit video path and use scRGB as the standard for media. I just think we are at least a decade away from that being a practical reality.

A decade you think? I suspect we are somewhat closer. That said, better than 8bit was available over 10 years ago, only to be stifled by the sRGB quest for efficiency: file size, transfer speed, processing speed, memory usage, little to do with fidelity.

Common interfaces that support better than 8bit color today:

VGA - analog, so in theory unlimited. In practice, look out for graphics cards with 10bit or better DACs. 10bit or better processing is a further requirement.

DVI - Dual link supports up to 16bit color depth

HDMI 1.3 - Up to 16bit color depth. Also compliant with IEC 61966-2-4

Common software that supports better than 8bit color today (a very much abridged list):

Acrobat - PDF v1.5 provided support for 16bit color depth

Microsoft XPS - 32bit color depth - is this current?

Various graphics software (do professionals really edit in 8bit color?)

10 years? I guess we'll just have to wait and see :)
 
Do any screens have an option to turn it off?

No, because the gamut is defined by the backlighting, so it is not a user-adjustable part of a monitor.
 
No, because the gamut is defined by the backlighting, so it is not a user-adjustable part of a monitor.

Aren't there quite a few wide-gamut displays with sRGB modes?
Shouldn't the monitor be able to remap for different modes? This would also (in theory at least) not be as bad as graphics card calibration, since the monitor could remap _all_ colors accordingly (working with the pixels rather than the digital signal).
It could of course also just translate the digital values to different ones matching sRGB (as closely as possible), just like graphics card calibration.
Since sRGB is a complete subset of the wide gamuts, those displays can certainly display sRGB as well (perhaps not as well as a 'native' sRGB display, but I doubt that would matter in anything but high-end displays).
But I don't know what is implemented in today's wide-gamut displays. What does an sRGB mode in today's wide-gamut monitors mean exactly? (I bet it's different for different manufacturers, though.)

What are possible short-term solutions, by the way? I have a hard time accepting that they would force wide gamut on everyone without a 'backup plan' for the transition period.

One solution that I hope for (realistic?) is hardware support in graphics cards that would translate everything, allowing you to switch to sRGB mode manually when running applications that can't be made color-aware.
Another could be a box that you run your DVI cable through that translates wide-gamut values to sRGB (it would need to be programmable, though, or come in versions for different wide gamuts), preferably with a switch so that you can get wide gamut for those few occasions you might need it. I don't know how much the hardware for such a solution would cost, but in terms of lag it shouldn't be a problem at all.
Considering what scalers cost, and that this should be a much easier operation in theory, I think it could be quite affordable (though I doubt the demand would be enough for prices to get decent).

Better OS support is one thing, but you'd need the hardware to support it in games/movies anyway, right?

The MPC 'hack' might work great, but it is probably not suitable for games.
 
One solution that I hope for (realistic?) is hardware support in graphics cards that would translate everything, allowing you to switch to sRGB mode manually when running applications that can't be made color-aware.

I think this would be the best option until we have 10 (or 12, or 16) bit data paths; there must be some way to harness the power of modern graphics cards to convert an ICC profile into a shader program and run that over the framebuffer.
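As a sketch of why this is plausible: the matrix/TRC class of ICC display profile boils down to a per-channel 1D lookup (the tone curve) plus a 3x3 matrix, which is exactly the kind of per-pixel work a fragment shader does. The toy below stands in for such a shader in pure Python; the gamma value and the identity correction matrix are placeholders (a real matrix would come from combining the content and monitor profiles):

```python
# Toy version of "ICC profile as a shader": decode each channel through
# a precomputed 256-entry lookup table (the TRC), apply a 3x3 matrix in
# linear light, then re-encode back to 8-bit values.
GAMMA = 2.2
DECODE = [(i / 255.0) ** GAMMA for i in range(256)]  # TRC as a LUT

# Placeholder correction matrix (identity). A real one would be derived
# from the source and destination profiles.
MATRIX = [[1.0, 0.0, 0.0],
          [0.0, 1.0, 0.0],
          [0.0, 0.0, 1.0]]

def correct_pixel(r, g, b):
    lin = (DECODE[r], DECODE[g], DECODE[b])               # "texture" lookups
    out = [sum(MATRIX[i][j] * lin[j] for j in range(3)) for i in range(3)]
    return tuple(round(255 * max(0.0, min(1.0, c)) ** (1 / GAMMA))
                 for c in out)                            # re-encode to 8 bit

framebuffer = [(255, 128, 0), (10, 20, 30)]               # tiny "screen"
print([correct_pixel(*px) for px in framebuffer])         # identity: unchanged
```

A GPU would run the same three steps per pixel with essentially zero cost, which is why the MPC shader approach works with no visible lag.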
 
A decade you think? I suspect we are somewhat closer. That said, better than 8bit was available over 10 years ago, only to be stifled by the sRGB quest for efficiency: file size, transfer speed, processing speed, memory usage, little to do with fidelity.
Please, do you honestly think that hardware 10 years ago had the capacity to pump a 10bit or better digital video path?!? Again, sRGB had more to do with matching HDTV than anything else.

Common interfaces that support better than 8bit color today:

VGA - analog, so in theory unlimited. In practice, look out for graphics cards with 10bit or better DACs. 10bit or better processing is a further requirement.

DVI - Dual link supports up to 16bit color depth

HDMI 1.3 - Up to 16bit color depth. Also compliant with IEC 61966-2-4

Common software that supports better than 8bit color today (a very much abridged list):

Acrobat - PDF v1.5 provided support for 16bit color depth

Microsoft XPS - 32bit color depth - is this current?

...10 years? I guess we'll just have to wait and see :)
Yes, we are still about 10 years out. First off, there has been support in software and hardware for better than 8bit images for a long time, but that has nothing to do with the video path. Right now, DisplayPort appears to be the connection of choice. VGA is dead. Dual link DVI? Where have you seen that except on a 30" monitor?

The pieces are starting to come together: displays, gamuts increasing, etc. But first we need to actually have video cards with DisplayPort, monitors with DisplayPort connections, and support for better than 8 bit video paths. It is going to take time for that to happen. So yes, I am pretty confident that it will be about 10 years before all that is distributed widely enough to be used for a standard.


Various graphics software (do professionals really edit in 8bit color?)
Yes, they do. Keep in mind that when you are editing in Photoshop in 16bit color, your video path is still limited to 8bits. You are just spreading the same number of data points (16.7 million) over a wider canvas until the video path is increased. The primary reasons to edit in 16bit are print work and providing padding against data loss as various layers and effects are applied.


Aren't there quite a few wide-gamut displays with sRGB modes?
Shouldn't the monitor be able to remap for different modes?
It wouldn't be an elegant solution, as it would be global. Also, from what I have seen, none of the monitors with an sRGB mode work all that well.

...What are possible short-term solutions, by the way? I have a hard time accepting that they would force wide gamut on everyone without a 'backup plan' for the transition period....
Right now most wide gamut monitors are only 92%-120%. So they are still close enough to normal that for many users the oversaturated effect looks good. In fact, you can see that monitor manufacturers are using deceptive sales tactics based on this fact, pushing wide gamut like snake oil to sell more monitors.

What is the backup plan? It has to be an OS change. This is a problem for both MS and Apple, by the way. I think a simple solution would be a patch to the color management system so that all untagged images are treated as sRGB. This wouldn't affect color-managed programs, but would ensure that the default for non-color-managed programs is sRGB, which is what the average user needs.

That would fix the problem until the new standard rolls out. I doubt it will happen until there is market pressure to make it happen. But if Dell and company keep pushing the gamuts up maybe we will see a fix.
 
Can someone please take a picture or a screenshot (and saturate it manually via Photoshop or something) of how this looks like? It would really benefit a lot of us on the fence who want to buy these monitors.

No need for any tricks. I just opened it in a color-managed program and a non-color-managed program side by side. In Photoshop I used the monitor profile for the color space. This is apples to apples on the same monitor. I embedded the color profile for my monitor, a Planar PX2611W, so you can open it in PS if you want to double check.

The image in question is an sRGB image. On the left it is displayed correctly in sRGB. The same image on the right is displayed in a non-color-managed program. You can see the model turn from African-American to Martian as her skin gets mapped to the wrong shades of red and yellow.

 
In fact, one reason for the introduction of the Adobe RGB colour space was to address the biggest failures of sRGB, and that's why aRGB is used for serious photo editing.

The main advantage of aRGB is that it contains all CMYK print colors; if you edit photos to be displayed on the web it won't be of any use.
And as Luthorcrow pointed out, when used in conjunction with 8bit color depth it will limit the precision of skin tones.
As long as you're limited to 8bit it's a double-edged sword.
The skin of a model photographed on a sunny beach can be edited more precisely on an sRGB screen, but the sky and sea colors will only show up properly in aRGB if it gets edited for print.

Please, do you honestly think that hardware 10 years ago had the capacity to pump a 10bit or better digital video path?!? Again, sRGB had more to do with matching HDTV than anything else.

We didn't have the capacity digitally, but I think he meant we had DACs with 10 or more bits when working with analog video,
and that the 8bit limitation was introduced by the technological limits of yesterday without taking into consideration that it might one day have to be lifted.
But ecat's main misunderstanding seems to be that he's interpreting sRGB as a limit on color depth (http://en.wikipedia.org/wiki/Color_depth) rather than on gamut (http://en.wikipedia.org/wiki/Gamut).
sRGB has nothing to do with efficiency, as you could have defined any gamut to be addressed with 8 bits.

The funny thing is that HDTV is starting to introduce standards for wider color while the PC world is fighting with a standard that was implemented to match HDTV color. :)

That would fix the problem until the new standard rolls out. I doubt it will happen until there is market pressure to make it happen. But if Dell and company keep pushing the gamuts up maybe we will see a fix.

It's worrying that there seems to be hardly any awareness of the issue although there are tons of wide gamut displays out there. (and untagged content being created on them)

The introduction of OLEDs and the widespread adoption of LED backlights will slowly turn color into a mess even greater than the current one in any program that isn't aware of the gamut... http://www.lesnumeriques.com/duels.php?ty=6&ma1=16&mo1=344&p1=3247&ma2=36&mo2=167&p2=1698&ph=7

I'm wondering if the developers even know that there are people longing for a fix to happen ASAP (I doubt there was anything on the Windows 7 wish list.., and Apple seems to be hiding behind sRGB backlights, although some of its most typical users are the ones that could sometimes use a little more cyan..)
 
All I can hope for is that Apple will take the initiative to step into wide gamut monitors for all of the print designers using apple products and then patch OS X to display sRGB correctly.

Until then, I guess I'll cry myself to sleep every night and curse the moment I clicked "purchase" for the Planar PX2611W on newegg...
 
No need for any tricks. I just opened it in a color-managed program and a non-color-managed program side by side. In Photoshop I used the monitor profile for the color space. This is apples to apples on the same monitor. I embedded the color profile for my monitor, a Planar PX2611W, so you can open it in PS if you want to double check.

The image in question is an sRGB image. On the left it is displayed correctly in sRGB. The same image on the right is displayed in a non-color-managed program. You can see the model turn from African-American to Martian as her skin gets mapped to the wrong shades of red and yellow.


Call me kooky, but I find the color of the right picture to be better. The one on the left looks bland; not as much glow in her skin.

Now this might be due to the fact I'm on a laptop.
 
Yes, you're either kooky or your laptop's screen is off. The image on the right certainly looks worse.

--
Re: video

When I got my planar, I fired up some 1080p trailers from apple.com

Colors did not bother me a bit. In fact, the trailers looked fantastic.

Are the trailers + quicktime color-aware? Or did I just not notice any problems?
 
Please, do you honestly think that hardware 10 years ago had the capacity to pump a 10bit or better digital video path?!?

We didn't have the capacity digitally, but I think he meant we had DACs with 10 or more bits when working with analog video,
and that the 8bit limitation was introduced by the technological limits of yesterday without taking into consideration that it might one day have to be lifted.

Indeed, mainstream consumer cards with 10bit processing and 10bit DACs have been around since 2002; the ATI Radeon 7000, for example, could certainly be pressed into true 10bit service on the Mac. Talking of the Mac, I believe the Radius ThunderPower was a true 10bit card that predated even this.

As for our current 24bit color, 8 bit depth, it appeared in the 'mainstream' in the early 90's. I've quoted mainstream here because I doubt many people would have been falling over themselves to buy this $13,849.00 PC from 1991 ( http://www.atarimax.com/freenet/fre...ortArea/8.OnlineMagazines/showarticle.php?123 )

I wonder what the likes of SGI were producing between those two paragraphs?

But ecat's main misunderstanding seems to be that he's interpreting sRGB as a limit on color depth (http://en.wikipedia.org/wiki/Color_depth) rather than on gamut (http://en.wikipedia.org/wiki/Gamut).
sRGB has nothing to do with efficiency, as you could have defined any gamut to be addressed with 8 bits.

True for sRGB as a color space. As a standard for the internet, from http://www.w3.org/Graphics/Color/sRGB we have:

This current specification proposes using a black digital count of 0 and a white digital count of 255 for 24-bit (8-bits/channel) encoding.
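Those digital counts pair with sRGB's piecewise transfer function, which is defined in the same spec. A small sketch using the spec's constants:

```python
def srgb_encode(linear):
    """sRGB transfer function per the W3C/IEC spec: linear light in 0..1
    to the 0..255 digital count (12.92 linear segment near black, then a
    1/2.4 power segment)."""
    if linear <= 0.0031308:
        v = 12.92 * linear
    else:
        v = 1.055 * linear ** (1 / 2.4) - 0.055
    return round(255 * v)

def srgb_decode(count):
    """Inverse: a 0..255 digital count back to linear light."""
    v = count / 255
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

print(srgb_encode(0.0), srgb_encode(1.0))  # 0 255 (black and white counts)
print(srgb_encode(0.18))                   # 18% linear grey -> count 118
```

Note how nonlinear the encoding is: 18% of the light maps to nearly half the code range, which is exactly why only 256 codes per channel look as smooth as they do within the sRGB gamut.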



Yes, we are still about 10 years out. First off, there has been support in software and hardware for better than 8bit images for a long time, but that has nothing to do with the video path. Right now, DisplayPort appears to be the connection of choice. VGA is dead. Dual link DVI? Where have you seen that except on a 30" monitor?

The pieces are starting to come together: displays, gamuts increasing, etc. But first we need to actually have video cards with DisplayPort, monitors with DisplayPort connections, and support for better than 8 bit video paths. It is going to take time for that to happen. So yes, I am pretty confident that it will be about 10 years before all that is distributed widely enough to be used for a standard.

10bit monitors, video cards and funky tricks with DVI (packed-pixel signals?) already exist (e.g. Eizo: http://radiforce.com/en/products/pdf/gb_en.pdf , http://radiforce.com/en/downloads/brochures/bro_en_RadiForce.pdf ; I couldn't be bothered searching out any other links)

I agree entirely that the pieces are starting to come together; DisplayPort is the future, or HDMI 1.3 for those viewing in 16:9 ;). I'm sure the graphics card manufacturers would love to introduce true 10bit to the mainstream; don't they already calculate to better than this and scale down? Which leaves OS support and the economics/limitations of 10bit LCDs as the only things holding the market back.

With Deep Color and wide gamut in the TV market, Blu-ray desperately looking for another feature, OLED on the horizon (or should that be 'still on the horizon'?) and the positioning of the PC as a media device, I think we are seeing a push from technologies that are up to the job; we could see things move in the next 4 to 5 years. Or 10, chuckle. It's been an interesting and educational discussion :)
 
I wonder if the color management will be improved in Windows 7.

Luckily I have my NEC 2490 to keep me warm in its properly colored glow until this mess gets a little further along. :D
 
After ditching my wide gamut display I considered the following sRGB displays:

BenQ G2400WD, V2400W - These are both sRGB with very good default color out of the box.
NEC 2490WUXi - Higher end (much more expensive) sRGB, and what I bought.

So the BenQ V2400W and the NEC 2490WUXi are the only two non-wide-gamut monitors available right now? (I would include the G2400WD, but as far as I can tell it's not available anywhere right now, and won't be until the end of October?) It frustrates me that the V2400W isn't VESA compliant... :( (And it's my understanding that the G2400WD will be basically the same as the V2400W, but with VESA mounting?)
 
So the BenQ V2400W and the NEC 2490WUXi are the only two non-wide-gamut monitors available right now?

No. Just that the BenQ G2400/V2400 represent the best of the low end, and the NEC the best of the higher end.
 