Dell U2412M

The U2410 has oversaturated colors; I can only see this being beneficial for some color work and a few colorful games, but the mediocre contrast + AG + tinting counters the vibrancy. Side by side, calibrated (to be fair) in sRGB mode, the U2410 will look like absolute shit. Nothing rich about paying $200 for far worse image quality.
 
10e, you mentioned the AG coating and how it wasn't horrible. Could you possibly say a bit more about the AG coating, and how it compares to other monitors you've experienced?
 
The U2410 has oversaturated colors; I can only see this being beneficial for some color work and a few colorful games, but the mediocre contrast + AG + tinting counters the vibrancy. Side by side, calibrated (to be fair) in sRGB mode, the U2410 will look like absolute shit. Nothing rich about paying $200 for far worse image quality.

Not even any games ;-)

Wide colour gamut could only be beneficial in DTP, pre-press, and professional photography with good, expensive pro DSLR cameras that can shoot in AdobeRGB. For anyone else, wide gamut is absolutely useless.
Today's standard for movies, games, web pages, operating systems, and HDTV broadcasting (all regardless of platform) is Rec.709/sRGB (yes, they are created in Rec.709/sRGB).

Here is a nice article about gamut:
http://www.maximumtech.com/display-...itor-hdtv-companies-cook-their-specs?page=0,3
 
Now that's wrong!

http://en.wikipedia.org/wiki/H-IPS#In-plane_switching_.28IPS.29

10e, you mentioned the AG coating and how it wasn't horrible. Could you possibly say a bit more about the AG coating, and how it compares to other monitors you've experienced?

Much better than the Dell U2711 and Apple Cinema 23" (two of the worst I've used/seen), and a fair bit better than the Dell 2209WA and 2005FPW.

There is some graininess and a crystalline effect there, but for some reason it looks finer/less coarse than many I've seen.

It seems to be almost identical to the NEC LCD2490WUXi2-BK I have next to it. It is almost a perfect match.
 
Not even any games ;-)

Wide colour gamut could only be beneficial in DTP, pre-press, and professional photography with good, expensive pro DSLR cameras that can shoot in AdobeRGB. For anyone else, wide gamut is absolutely useless.
Today's standard for movies, games, web pages, operating systems, and HDTV broadcasting (all regardless of platform) is Rec.709/sRGB (yes, they are created in Rec.709/sRGB).

Here is a nice article about gamut:
http://www.maximumtech.com/display-...itor-hdtv-companies-cook-their-specs?page=0,3

Yep, wide gamut is only useful if your workflow uses the wide gamut from start to finish. However, if you are working in the sRGB colour space, it actually makes things worse, as it's not accurate for that colour space.
 
10e: are you able to calibrate the monitor and post your settings and ICC profile?
 
10e: are you able to calibrate the monitor and post your settings and ICC profile?

Yes, when I am completely happy with the calibration I will post the settings. I am calibrating to a target of 150 cd/m² to match my NEC monitor, but it should still work properly at lower or higher brightness levels.

One thing to keep in mind is that it does not take panel-to-panel differences into account, so results may vary.

My calibration will likely be based on the custom color mode. So far I've only gone down to about 975:1 contrast ratio after calibration which is still good IMHO.
 
Want to say thanks 10e! You've helped me out so much and many other people with your great feedback and professional guru knowledge. I can't wait till my monitor is here!
 
The U2410 has oversaturated colors; I can only see this being beneficial for some color work and a few colorful games
Ok then, switch the U2410 to Game Mode. Now switch to sRGB mode without confirming it on the menu. Congratulations - you now have the reduced input lag of Game Mode with sRGB colours.
but the mediocre contrast
You seem determined to hammer the U2410 at every mention. Image quality is about way more than contrast. As I already said, the U2410 will still burn my eyeballs past 50% brightness, so it's really about black level. Whilst a high contrast is good in that regard, in that it helps an image "pop", there's also a whole range of values between 0 and 255 you know, and the U2410 performs excellently there. The U2412 can only reproduce 64 shades of light in that range of 256 without resorting to "tricks". Is that a move in a positive direction? Whilst, according to 10e, the U2412 does a decent job in this regard, I still think it's a lot to ask that a panel give up the ability to render 16.5 of its 16.7 million unique colours natively without having some sort of impact on visual quality.
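
(Rough back-of-the-envelope arithmetic only, nothing measured on the actual panel - just what 6-bit vs 8-bit per channel works out to before any FRC "tricks":)

# Shades per channel and total native colours, before any FRC/dithering "tricks".
native_6bit = 2 ** 6                 # 64 shades per channel
native_8bit = 2 ** 8                 # 256 shades per channel
colours_6bit = native_6bit ** 3      # 262,144 native colours
colours_8bit = native_8bit ** 3      # 16,777,216 native colours
print(colours_8bit - colours_6bit)   # ~16.5 million colours that need dithering to approximate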

I haven't seen anything which suggests the U2412 differs significantly in that regard. This is also an element which can vary between batches of products too.

Colour gradation, as it's properly referred to, is an issue across ALL of LG's panels. Buying panels with tighter colour gradation tolerances increases panel prices so much that even screens intended for the higher end of the market make use of corrective circuitry on the panel instead (like "DUE" or "Colorcomp"). Unless LG have performed a miracle I suspect the U2412 will, unfortunately, also be subject to the same random quality control as LG's other panels.

Side by side, calibrated (to be fair) in sRGB mode, the U2410 will look like absolute shit. Nothing rich about paying $200 for far worse image quality.
Except it won't, unless you're clueless and did a bad job of calibrating one of them. Of course, you are (naturally) well aware of how few devices handle very wide gamut CCFL lighting properly too, so you've already educated yourself properly about that too. Oh wait, I forgot, you just like trashing the U2410 and I'm feeding the troll in this regard. My bad.
 
Not even any games ;-)
I beg to differ. The pop of the colours makes some games look great in my opinion.

Wide colour gamut could be only benefical in DTP, pre press and only for professional photography with good and expensive pro DSLR cameras that can shoot in AdobeRGB. For anyone else is wide gamut absolutely useless.
For you, in your opinion. The fact is human eyes can see a much wider range of colours than the limited range sRGB represents. I think it makes for a nice improvement, in visual quality, when your eyes are allowed to see this extra range - especially when you have a standard sRGB device beside it to compare it to. What "was" red on the sRGB screen no longer looks so red - It takes on a hint of orange or pale red. What was green looks more like a yellow-green. What was blue might take on a more purple hue.

Trends like "high dynamic range" exist because some people want to capture elements which are visible to our eyes, but which are often lost due to the limitations of reproduction methods in current technology. To me it's quite sad that the role of wider gamuts is often pushed to the back in all of that.

But most content is in sRGB. How do you ever change that situation? You won't unless there are a sufficient number of wide gamut devices. Wide gamut devices only add to a screen's capabilities, as long as they are given a way to emulate an sRGB gamut properly. It's also much easier for computer content, such as games or photographic web content, to take advantage of such things than it is for something like the movie industry to change its entire format. But sRGB is "good enough for most people most of the time", just like TN panels are too.

BTW I think you'll find that the usage of Adobe RGB isn't limited to pro SLR cameras..
 
6 vs 8 bit does not matter unless you look at full-screen grey gradients constantly. You aren't seeing more colors. Except for those wanting to feel good about past purchases, I doubt anyone is buying the whole alleged superiority of 8-bit anymore, so please stop with the nonsense.

Who cares if a monitor is super bright (except for 3D, and VA panels with deep blacks)? A high brightness only further washes out the image due to the rising black level, or in the U2410's case, the grey level. Getting used to having colors look the way they aren't intended to look only increases user confusion and decreases the push for industry standards.

Color accuracy means nothing if the contrast is low/black level is high; it will look terrible no matter what. The fact that pretty much every other IPS panel has higher contrast and good color accuracy for a lower price negates any positive spin one could give the 2410, except for those feeling the need to justify their purchase to the internet because they bought one and like it.

I think at this point the U2410 Defence Force is a one-man army

It seems to be almost identical to the NEC LCD2490WUXi2-BK I have next to it. It is almost a perfect match.

+1, hope to hear the same from a few more people.
 
6 vs 8 bit does not matter unless you look at full screen grey gradients constantly.
Oh. You mean the sort of gradients that rarely appear on things like TV or movie content, but commonly appear on things like computer content, where a GUI or web page background makes a transition between two colour shades?

You aren't seeing more colors.
You are. What you're talking about is the eye's ability to discern them. That's why a reduction in colours typically matters less for moving video images too. Now take a picture of a bright daytime or dark night-time sky, where lots of things may be going on within a much more limited range of colour shades. All of a sudden the monitor's ability to clearly reproduce small differences between these values matters a lot more. From a close range (you know, the sort of range which isn't common for a TV but IS common for a desktop computer screen) people are more liable to notice these details too.
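
(To make the banding point concrete, here's a toy Python example - illustrative only, real panels and drivers do more than this - of how a subtle 8-bit sky gradient collapses to a handful of native steps on a 6-bit panel:)

gradient_8bit = list(range(180, 201))               # a gentle 8-bit sky gradient, values 180..200
quantised = [(v // 4) * 4 for v in gradient_8bit]   # snap each value to the panel's native 6-bit step
print(len(set(gradient_8bit)))   # 21 distinct shades in the source
print(len(set(quantised)))       # 6 distinct shades natively -> visible banding without dithering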

Color accuracy means nothing if the contrast is low/black level is high, it will look terrible no matter what.
Of course. However your words about the U2410's black level honestly come across as more "crackpot" than "reasonable opinion" here, especially when the reality sitting in front of my eyes bears absolutely no resemblance to the words you regularly use to describe the U2410. Also, even ignoring the fact that contrast varies noticeably between panels of the same type, in case you didn't realize it - back lights also dim with usage too, meaning black levels are one of those things which actually improves over time.

I think at this point the U2410 Defence Force is a one-man army
I don't know why you have to portray things with silly language like "defending" the U2410. It's a piece of electronics, the same as a toaster or a microwave, and I don't attach much emotion to it. If someone wants to present me with a reasonably priced 27 or 30 inch (of equivalent quality) then I'd be happy to move on. What I do take issue with is the ridiculous language you regularly use, and personal opinions presented as undisputed fact, because I feel your words go beyond merely personal opinion and into the realm of being pretty misleading.
 
A small interruption.

6 vs 8 bits would matter on just about everything. If it were actually 6 bits. But it isn't.

It is 6 bit + 2 bit FRC. This is still 8 bits and the difference is just about undetectable.

Most 10-bit pro monitors are 8-bit + 2-bit FRC. If FRC was so terrible it wouldn't be used in so many top-end monitors.
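
(For anyone unsure what FRC actually does: it flickers between the two nearest native levels so the time-average lands on the in-between shade. A toy Python sketch, not how any particular controller is actually implemented:)

def frc_frames(level_8bit, n_frames=4):
    lo = level_8bit // 4                # nearest native 6-bit level below
    hi = min(lo + 1, 63)                # next native level up
    frac = (level_8bit % 4) / 4         # how far between the two native levels we are
    return [hi if i / n_frames < frac else lo for i in range(n_frames)]

frames = frc_frames(130)                # 8-bit 130 sits halfway between 6-bit levels 32 and 33
print(frames)                           # [33, 33, 32, 32]
print(sum(frames) / len(frames) * 4)    # time-average = 130.0, i.e. the intended 8-bit shade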

You may now return to your bun fight.
 
I beg to differ. The pop of the colours makes some games look great in my opinion.

OK, it could look great for you, but colors will be inaccurate simply because the creator/developer's intent is to show you colours in Rec.709/sRGB.

For you, in your opinion. The fact is human eyes can see a much wider range of colours than the limited range sRGB represents. I think it makes for a nice improvement, in visual quality, when your eyes are allowed to see this extra range - especially when you have a standard sRGB device beside it to compare it to. What "was" red on the sRGB screen no longer looks so red - It takes on a hint of orange or pale red. What was green looks more like a yellow-green. What was blue might take on a more purple hue.

Trends like "high dynamic range" exist because some people want to capture elements which are visible to our eyes, but which are often lost due to the limitations of reproduction methods in current technology. To me it's quite sad that the role of wider gamuts is often pushed to the back in all of that.

I am not against wide gamut displays, I just want to see more understanding about this "issue":
bigger isn't better

I will happily use a wide gamut display when we have wide gamut content (movies, games, web pages); until then it's of no use to me.

And there is one thing about colors outside the Rec.709/sRGB triangle in the CIE diagram: they aren't common in nature.

You can see more of this, and comments, under the article:
http://www.maximumtech.com/display-...itor-hdtv-companies-cook-their-specs?page=0,3
 
sitting in front of my eyes bears absolutely no resemblance to the words you regularly use to describe the U2410. Also, even ignoring the fact that contrast varies noticeably between panels of the same type, in case you didn't realize it - back lights also dim with usage too, meaning black levels are one of those things which actually improves over time.

Black = Grey, Image Quality = Terrible, thanks PRAD. Just like the numbers tell us, the image looks very washed out. Great, you have one and you think it looks good, but science and photographic evidence tells us otherwise. Everyone is entitled to their opinion, but sometimes they are wrong.

Also, even ignoring the fact that contrast varies noticeably between panels of the same type, in case you didn't realize it - back lights also dim with usage too, meaning black levels are one of those things which actually improves over time.

Yes, monitors dim over time, but the image quality does not suddenly improve. It may be 0.18 cd/m² black at 120 cd/m² one year and 0.17 cd/m² at 110 cd/m² the next, but the black level does not suddenly drop, resulting in better image quality over time like you suggest. Multiple reviews all show that the contrast is stable, and always mediocre on the 2410.
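
(For what it's worth, the contrast behind those numbers is just white luminance divided by black luminance - using the figures quoted above, not new measurements:)

def contrast_ratio(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

print(contrast_ratio(120, 0.18))   # ~667:1 in year one
print(contrast_ratio(110, 0.17))   # ~647:1 a year later - same ballpark, no magic improvement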

I don't know why you have to portray things with silly language like "defending" the U2410. It's a piece of electronics, the same as a toaster or a microwave, and I don't attach much emotion to it.

Read your own posts

If someone wants to present me with a reasonably priced 27 or 30 inch (of equivalent quality) then I'd be happy to move on.

Pick anything

6 vs 8 bits would matter on just about everything. If it were actually 6 bits. But it isn't.

I figured this goes without saying.
 
NCX said:
Where does it say that? The only thing I see is it won't convert 16-235 (limited range) to 0-255 (full range), but most monitors won't. The gray bars are from the video output itself. In fact, if you look above and below the gray bars, you can see it displays black perfectly fine. The pictures show an issue with black level conversion, not the contrast of the monitor.
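
(For reference, whoever does the conversion - player, GPU, or monitor - the 16-235 to 0-255 expansion is just a linear stretch. A minimal Python sketch assuming 8-bit values, showing why skipping it leaves video black sitting at a grey level:)

def limited_to_full(value):
    # Clip to the video range, then stretch 16-235 onto 0-255.
    value = min(max(value, 16), 235)
    return round((value - 16) * 255 / 219)

print(limited_to_full(16), limited_to_full(235))   # 0 255
print(limited_to_full(20))                         # near-black video level -> 5; shown unconverted it stays a washed-out 20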
 
OK, it could look great for you, but colors will be inaccurate simply because the creator/developer's intent is to show you colours in Rec.709/sRGB.
Sure, and that was the point! Just because people intend one thing, and it doesn't match a spec sheet, doesn't mean it won't actually look great. A very wide gamut really pops those colours. As long as there aren't scenarios that require things, like an accurate representation of human skin tone, then it can actually work as an advantage a surprising amount of the time. I've seen several say the same, so I don't think it's an uncommon opinion. Suffice to say that calling it no use "even in games" definitely isn't the case - I just wish more games would take deliberate (rather than accidental) advantage!

I am not against wide gamut displays, I just want to see more understanding about this "issue": bigger isn't better

I will happily use a wide gamut display when we have wide gamut content (movies, games, web pages); until then it's of no use to me.
I checked the article and it used "bigger isn't better" in the context of producing more accurate colours. That's true. But it does produce a wider palette. I would say (in this case) that bigger is better because, as long as the wide gamut screen contains a way to reproduce sRGB well, you only gain a bigger palette.

I understand the "no content / no point" argument though, as well as the additional confusion caused to people who don't understand any of it. But you're simply talking about adding an additional feature. As long as that feature doesn't wreck a screen's ability to render an 8-bit sRGB signal well, I see no real problems.

And there is one thing about colors outside the Rec.709/sRGB triangle in the CIE diagram: they aren't common in nature.
Hmm, well I know that I don't see poison dart frogs every day. But I'm not sure I'd agree in the case of deep greens? My impression is they're represented with more yellow than should be the case in an sRGB signal.
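
(Out of curiosity I put the commonly published primaries through the shoelace formula - a rough comparison of the sRGB/Rec.709 and Adobe RGB triangles in CIE 1931 xy, which is where most of the extra area sits, in the greens:)

def triangle_area(p):
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B chromaticities (Rec.709/sRGB)
argb = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]   # Adobe RGB: same red and blue, green pushed further out
print(triangle_area(argb) / triangle_area(srgb))          # ~1.35: roughly a third more xy area, nearly all of it in the greens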
 
science and photographic evidence tells us otherwise.
Oh really? Feel free to produce your photographic evidence which "proves" this. I look forward to it :)
Everyone is entitled to their opinion, but sometimes they are wrong
Ya don't say :D
Multiple reviews all show that the contrast is stable, and always mediocre on the 2410.
They don't, actually. The best? Hardly. But nobody ever said that. I'm saying it's not bad enough to change the fact that the image quality on the U2410 is rather nice overall. Also, what your eyes regard as black changes in the presence of brighter objects on the screen. I have a full-screen black IRC window open with white text on it right now. The black on the U2410 looks pitch black to me. If I place an OLED device next to it, will I notice the black could be darker? Yep. But it's enough to provide very good image quality. I'm extremely confident most people would agree. The fact that you don't is fine, except you seem to want to go beyond that and trash the device. For all its flaws, I think most people (who've seen one) would agree that calling the U2410 mediocre is pretty silly.
 
Where does it say that? The only thing I see is it won't convert 16-235 (limited range) to 0-255 (full range), but most monitors won't. The gray bars are from the video output itself. In fact, if you look above and below the gray bars, you can see it displays black perfectly fine. The pictures show an issue with black level conversion, not the contrast of the monitor.

They look pretty light to me when clicking on the picture and some of the other ones they have posted, and the image looks totally washed out. Plus they also highlight another issue when watching movies. :D Unless the pictures are taken in the dark, it is hard to capture on camera.

2209WA (same black level as the 2410)
http://img155.imageshack.us/img155/6501/3544cff30441cea8.jpg

Google finds me a U2410 next to an HP 2408 which has a much lower CR than the 2412
 
It is 6 bit + 2 bit FRC. This is still 8 bits and the difference is just about undetectable.
To some extent I agree, but it greatly depends on the contrast and tonal range of the image you are working on. Low contrast images could be, e.g., sunsets with near-black landscape regions in the foreground, or an image of a black labrador... whatever.

Most 10-bit pro monitors are 8-bit + 2-bit FRC. If FRC was so terrible it wouldn't be used in so many top-end monitors.
FRC/A-FRC isn't terrible by itself... but those two extra bits are far less significant than the least significant bits (7, 8) being dithered in 6-bit+A-FRC.
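
(Put another way - rough numbers only - the level step being dithered is about four times coarser on a 6-bit+FRC panel than on an 8-bit+FRC one:)

step_6bit = 1 / (2 ** 6 - 1)    # ~1.6% of full scale between native levels on a 6-bit panel
step_8bit = 1 / (2 ** 8 - 1)    # ~0.4% of full scale on an 8-bit panel
print(step_6bit / step_8bit)    # ~4.05: the dither step FRC has to hide is ~4x larger at 6-bit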

Again, I'm not saying the U2410 is better or anything, but for those who notice dithering it's annoying, and it's bad to give LG the impression that 6-bit+A-FRC (or whatever) is a fine norm for future monitors - eventually spreading to all segments.
I think the improvements they have otherwise made with the panel in the U2412 appear great (from what I gathered here), but it would be nice to have just improvements and not any step-backs or compromises - technology matures and prices drop. They already saved a lot by only installing a basic controller board and moving to edge lighting.
 
They look pretty light to me when clicking on the picture and some of the other ones they have posted, and the image looks totally washed out. Plus they also highlight another issue when watching movies. :D
Black appears to be in the ballpark of R,G,B=7,7,7 (in the right image).
Unless they used fixed camera settings in all of their images, they can't be compared without compensation. The camera always tries to grab an image with a good histogram with a wide distribution, so even what appears black to the naked eye can look non-uniform and grey when just letting the camera decide. :)

EDIT: Well, sorry - didn't see the edit.
As you suggested, these are only differences that is perceived in a camera shot, where the actual perceived black level is raised to compensate for the lower/deeper black level of the other monitor (i.e. differences in monitor contrasts). If you exchanged one of the monitors with an OLED monitor, the camera would again compensate, so that the LCD panel looked much worse (since the CR of OLED is much, much higher than either PVA or IPS), but again, this doesn't tell much on how the LCD monitor is perceived. (But this doesn't have much to do with the Prad.de images you linked to.)
 
I would say (in this case) that bigger is better because, as long as the wide gamut screen contains a way to reproduce sRGB well, you only gain a bigger palette.

It is my observation that wide gamut screens are (1) more expensive, that sRGB emulation modes result in (2) lost contrast, that you usually get (3) less flexible color control (often locked) in emulation mode than the control you get from a native sRGB screen, and that emulation modes have often been (4) disappointing.

So IMO there are significant downsides to making a screen wide gamut, and it delivers negligible to non-existent utility for the vast majority of people.

I am really glad they moved to a native sRGB screen for the U2412.
 
It is my observation that wide gamut screens are (1) more expensive, that sRGB emulation modes result in (2) lost contrast, that you usually get (3) less flexible color control (often locked) in emulation mode than the control you get from a native sRGB screen, and that emulation modes have often been (4) disappointing.

So IMO there are significant downsides to making a screen wide gamut, and it delivers negligible to non-existent utility for the vast majority of people.


I am really glad they moved to a native sRGB screen for the U2412.

For the average consumer who doesn't even understand color spaces, or even have source material that takes advantage of an extended color space, this "wide gamut" thing is just an overused marketing gimmick to add more "content" to a fat price tag. It's akin to saying your monitor has a 1 million to 1 dynamic contrast ratio!
 
Ordered this monitor (U2412M), cost me 330usd :D


So all your hate is because of this "proof"? :eek:

notice the REAL black bars, not the ones from the video source:

6WS67.jpg

From prad: "Apart from RGB, the monitor also accepts YCbCr as a colour model and is automatically adjusted to the source. Unfortunately, RGB is limited to the video level (16-235) and correct adjustment of the player did not bring about an improvement – the high and low hues are ignored."
 
Ordered this monitor (U2412M), cost me 330usd :D



So all your hate is because of this "proof"? :eek:

notice the REAL black bars, not the ones from the video source:

6WS67.jpg

From prad: "Apart from RGB, the monitor also accepts YCbCr as a colour model and is automatically adjusted to the source. Unfortunately, RGB is limited to the video level (16-235) and correct adjustment of the player did not bring about an improvement – the high and low hues are ignored."


How did you get it for only $330?
 
It is my observation that wide gamut screens are (1) more expensive
True. Emulating different colour spaces properly requires less-basic circuitry. But it needn't add anything significant to costs, especially if something is mass manufactured rather than treated as a niche market part - as is generally the case with wide gamut now. Having said that, the Hong Kong price of the U2410 often hasn't been a million miles from where the US price of the U2412 is now. Examples like the U2410 vs the LP2475w also come to mind as evidence that it doesn't make a big price difference. The thing that really adds to the cost (vs the U2412) is probably the panel construction and backlight. So I suspect we won't see a real move towards wider gamuts again until either cheaper edge-lit LED lighting allows for it, or there's a move away from LCD to different technologies.

result in (2) lost contrast
How things are mapped out, so that a narrower colour space can be accommodated inside what's actually a wider one, definitely creates room for issues, and I won't argue that things like that aren't common side effects. It's clear that some companies were / are in a learning process themselves. For example, Dell's first two ICM files (designed for colour-managed apps when using the native wide gamut modes of the U2410) gave major issues in exactly this regard, and they weren't solved until they calibrated it to 10-bit internal precision. But, again, no argument from me that care has to be taken to emulate sRGB properly, and that clearly hasn't always been the case.

(3) less flexible color control (often locked) in emulation mode than the control you get from a native sRGB screen, and that emulation modes have often been (4) disappointing.
If you want to point to a real failure of the U2410 (in terms of people wanting to use the screen to the best of its abilities) then this is probably one of the main ones. They built a monitor with some nice features, then gave nobody a way to access them! Again, can't argue here.

So IMO there are significant downsides to making a screen wide gamut, and it delivers negligible to non-existent utility for the vast majority of people.
Show people an image, or even a game, that uses wide gamut well though, and they'll often agree it can make a nice difference. A vitally important one to them? Probably not. One they'd pay a lot more money for? Nope! But a direction generally worth aiming in? I'd say "definitely yes".

I agree it shouldn't come at the expense of a good solid sRGB mode. That should be priority number one. And, to be clear, if Dell introduced a 30-inch 8 bit IPS, that's edge lit with only sRGB gamut, for under $700 street, I'd probably buy one to replace the U2410 tomorrow :)

But I very much agree with tk-don's sentiments in some of the posts above where he said "it's nice to have just improvements and not any step-backs or compromises" (although he was talking about the move from 8 to 6 bit, not gamut there). Whilst you can, rightfully in certain respects, argue that the native sRGB is a step forward, it would also have been nice to keep the wide gamut option and provide an sRGB solution which pleases even the most picky. There's no question that, in the U2412's case, it was probably the right move for Dell to make in order to bring down the cost. But a solution which gives the best of both worlds (wide gamut and sRGB) would also have been nice :)
 
The only way Dell could have avoided using 6-bit+A-FRC in the U2412 is by using the older LCD panel module from LG, the LM240WU6-SDA1, which is true 8-bit and has the additional advantage of direct LED backlighting instead of the edge LED backlighting in the LG LM240WU8.
By the way, this panel module (LM240WU6) is in the 24" Apple LED Cinema Display and 24" iMac.
 
After a bunch of research, googling and reading through all the good and bad info on here, and not being able to make up my mind as a result, I've ordered both a U2410 and a 2412M and will compare them side by side. I don't have the tools of an avid monitor tester, but if you would like me to do anything specific I will try to accommodate when I have both monitors set up. I'll take pictures of course as well.

I was really tempted by a 30" but I just dunno if I can justify spending that much on just a PC monitor vs a new large TV for nearly the same price.
 
After some more reading, it seems Dell will release a U2412HM version later this year? A better version?
 
I thought it was a model naming mistake on some sites - the U2412HM was meant to be U2412M. The 'H' models are the 16:9 ratio panels with 1920x1080 and will be released as the U2212HM and U2312HM, replacements for the previous models.
 
The only way Dell could have avoided using 6-bit+A-FRC in the U2412 is by using the older LCD panel module from LG, the LM240WU6-SDA1, which is true 8-bit and has the additional advantage of direct LED backlighting instead of the edge LED backlighting in the LG LM240WU8.
By the way, this panel module (LM240WU6) is in the 24" Apple LED Cinema Display and 24" iMac.
An alternative is just to decide on this with LG beforehand. The panels are often engineered to a customer's specification, if the order is large enough.
The LM240WU6 is also edge-lit, not LEDs mounted behind the LCD. The LM240WU5 utilizes direct LED backlighting and is used in the HP LP2480zx ("DreamColor") display.
 
Show people an image, or even a game, that uses wide gamut well though, and they'll often agree it can make a nice difference.

And likewise most manufacturers use "vivid" picture settings in stores because bright, exaggerated colors impress people in side-by-side comparisons.

This thread has been helpful for me because I did not know that most sources were in sRGB. I don't see a point in wide gamut if that's the case. It would be like watching all SD content on an HD TV.
 
And likewise most manufacturers use "vivid" picture settings in stores because bright, exaggerated colors impress people in side-by-side comparisons.
True :) Although WG goes a bit beyond that, because it's not just boosting saturation - it's genuinely producing colours which are wider on the spectrum.
This thread has been helpful for me because I did not know that most sources were in sRGB. I don't see a point in wide gamut if that's the case. It would be like watching all SD content on an HD TV.
But that's exactly what many people do - watch lots of SD TV content on an HD TV. They're still free to view HD though. The point is they have the option. If only a handful of TVs were ever made with HD then widespread support for HD would probably never have arrived. The devices have to be there for any form of support to emerge. Though wide gamut support has arrived on the many cameras that support formats like Adobe RGB, so that's probably the most common "proper" usage scenario right now. :)
 
True :) Although WG goes a bit beyond that, because it's not just boosting saturation - it's genuinely producing colours which are wider on the spectrum.

There is nothing genuine about it. It is just artificial and wrong. You are simply using the wrong color space for the material. The primaries are in a different place, so the color mix will make certain tones not just oversaturated but off in tint as well.

This thread has been helpful for me because I did not know that most sources were in sRGB. I don't see a point in wide gamut if that's the case. It would be like watching all SD content on an HD TV.

There is nothing wrong with watching SD on an HDTV, it won't look as good as HD, but at least it doesn't distort the picture.

OTOH, running normal content through a wide gamut monitor is more like watching TV after your little brother cranked up the saturation/color control and gave the tint control a twist, then broke off both knobs. It is a distortion.

Things are improving: while a couple of years ago it looked like wide gamut was being pushed everywhere by LCD marketers who had just discovered another number to fatten the spec sheet with, wide gamut now seems to be in retreat from consumer LCDs. Sanity is returning to consumer monitors.
 
Are there any important reviews of this monitor?
How good is it compared to the old U2410?

It depends. Do you need wide gamut for your work or hobby workflows? (Usually photographers who have access to printers that can do wider gamut printing, but most DSLRs should be able to capture aRGB in raw format.)
 
There is nothing genuine about it. It is just artificial and wrong.
Even if you over-saturate red on an sRGB device, it's still not going to give you the red a very wide gamut device produces. And there is no "wrong" when it comes to subjective personal visual preferences. There's what looks good and what doesn't.

Things are improving
I wouldn't call permanently sticking with a colour gamut that's much more limited than what your eyes can see "an improvement" just because most content uses it. In my opinion the goal of screens should always be to represent what your eyes would actually see in front of you. A move away from wide gamuts represents a move away from this goal.
 
I wouldn't call permanently sticking with a colour gamut that's much more limited than what your eyes can see "an improvement" just because most content uses it. In my opinion the goal of screens should always be to represent what your eyes would actually see in front of you. A move away from wide gamuts represents a move away from this goal.

I agree with you on this. However, when dealing strictly with accuracy on computers, as basically 99.9% of stuff is presented in sRGB, an sRGB monitor is the better choice.

What *should* happen to promote adoption of wider gamut monitors outside of a professional setting is that the video card should automatically detect whether a monitor is wide gamut or not and, if it is, and you're viewing something that is sRGB, automatically translate the values from sRGB to aRGB so the colours will look accurate no matter what.
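
Colour-managed software already does something along these lines; a rough Python sketch of the maths (sRGB value -> linear light -> XYZ -> Adobe RGB), assuming D65 white, 8-bit values, the commonly published matrices, and an approximate 2.2 gamma for Adobe RGB - purely illustrative, not how any driver actually implements it:

def srgb_to_linear(c):                           # c in 0..1
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]

XYZ_TO_ADOBE = [( 2.0414, -0.5649, -0.3447),
                (-0.9693,  1.8760,  0.0416),
                ( 0.0134, -0.1184,  1.0154)]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb8_to_adobe8(r, g, b):
    linear = [srgb_to_linear(x / 255) for x in (r, g, b)]
    xyz = mat_vec(SRGB_TO_XYZ, linear)
    adobe_linear = mat_vec(XYZ_TO_ADOBE, xyz)
    # clamp, then apply Adobe RGB's ~2.2 gamma encoding
    return [round(255 * max(0.0, min(1.0, c)) ** (1 / 2.2)) for c in adobe_linear]

print(srgb8_to_adobe8(255, 0, 0))   # full sRGB red -> roughly [219, 0, 0] in Adobe RGB coordinates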

That way there is still proper backwards compatibility, just like watching SD content on an HD TV - there isn't really a 'quality loss' compared to watching it on an SD TV that is calibrated the same, as the HDTV or computer scales the image to size. Compare that to now: if you are using an aRGB monitor with sRGB content in aRGB mode, the colours don't get translated right, so there is some inaccuracy in the colours and therefore a 'quality loss', and the only way to avoid that is running the monitor in sRGB, then switching back to aRGB when you have content available for it.

This switching of modes is most likely too much work for the average consumer, as they might not even know if the content they are viewing is sRGB or aRGB.
 
I wouldn't call permanently sticking with a colour gamut that's much more limited than what your eyes can see "an improvement" just because most content uses it. In my opinion the goal of screens should always be to represent what your eyes would actually see in front of you. A move away from wide gamuts represents a move away from this goal.

If the goal is to represent what your eyes would actually see in front of you, wide gamut is a complete setback for displaying any actual content - ALL of which is created for sRGB primaries, and none of which will look remotely like what your eyes would actually see on a wide gamut screen.

It is a great improvement to get back to less expensive screens that most accurately portray ALL the content you will use them for, instead of paying more for less accuracy.
 