Dell U2410

I'm not entirely familiar with all the display jargon, but here goes.

When calibration changes settings at the LUT level, does it only change them for that preset?

For example, say I calibrate the 'Standard' mode at the LUT level with calibration software. If I then switch the preset to sRGB, would it retain the LUT settings from the Standard preset, or does the monitor keep LUT settings on a preset-by-preset basis?

Thanks.
 
What were the XYZ tristimulus values of the white point?

Hardware settings: RGB values at 255-255-255 (in the service menu), contrast 50, brightness 10

Standard mode measures:
Black = XYZ 0.14 0.14 0.24
Red = XYZ 72.67 32.74 1.07
Green = XYZ 26.33 90.19 12.84
Blue = XYZ 24.49 9.02 128.79
White = XYZ 122.84 131.22 141.99

sRGB mode measures:
Black = XYZ 0.13 0.13 0.22
Red = XYZ 53.22 27.42 1.86
Green = XYZ 46.55 89.56 15.18
Blue = XYZ 23.82 10.73 122.22
White = XYZ 117.45 123.83 133.83

We achieved the best display from a colorimetric perspective during playback in RGB and using "xv Mode".

That sounds very interesting. I thought xvYCC worked with the YCbCr signal format only, because this option is currently unavailable for me. But I have figured out that I need an HDMI connection instead of DisplayPort (it is supported over HDMI only).
But I am not sure about the source side either. Does it work correctly with a Radeon HD5-series VGA card? In other words, can I use xvYCC with my PC? It is theoretically supported, but would it produce a correct sRGB gamut the way the sRGB preset does?
I am not sure whether you are talking about a PC or a standalone Blu-ray player here. (Some VGA cards can output YCbCr, but that is useless after RGB frame buffering. You mentioned problems, though, so...)
Is it as simple as selecting xv Mode in the OSD, with my VGA then detecting and using it automatically (with gamut correction)? I can't see any xvYCC settings in ATI CCC.
Did you check the output bit depth in this mode?

How does it work? Does the VGA convert the colors (with 16-bit precision?), or does the VGA tell the display that it is sending sRGB, so the display switches to its sRGB emulation mode automatically?

Right now, with a DP connection, Video mode is useless for me. But if this is all true (correct sRGB gamut mapping and a usable tonal response) then Video/xv Mode could be better than Graphic/sRGB.
Why didn't I hear about this before, and why did they include a DP cable instead of an HDMI cable when the display needs the HDMI input for all of its functions? :rolleyes:
 
When calibration changes settings at the LUT level, does it only change them for that preset?

The internal LUT used by the display hardware is chosen by the selected preset mode; the VGA LUT won't follow OSD changes.
Your calibration software creates a VGA LUT for the particular preset you used during calibration, and it won't be usable for any other preset.
You should clear your VGA LUT, or manually load the correct calibration LUT, after you change the preset in the OSD (see the sketch below).
You can't do anything with the internal LUTs right now (the display handles them automatically).
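For anyone wondering what "clear your VGA LUT" means in practice, here is a minimal Windows-only sketch (my own illustration, not part of any calibration package) that loads a linear identity ramp through the standard GDI SetDeviceGammaRamp call. Normally your calibration loader, or a tool such as ArgyllCMS's dispwin, does this for you; error handling and multi-monitor support are omitted.

import ctypes

def load_linear_vga_lut():
    # Build a linear 3 x 256 ramp of 16-bit values (the standard gamma-ramp layout)
    ramp = (ctypes.c_ushort * 256 * 3)()
    for ch in range(3):                      # R, G, B channels
        for i in range(256):
            ramp[ch][i] = i * 257            # map 0..255 to 0..65535 linearly
    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    hdc = user32.GetDC(None)                 # device context of the primary display
    ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(None, hdc)
    return bool(ok)

if __name__ == "__main__":
    print("Linear VGA LUT loaded:", load_linear_vga_lut())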
 
This is interesting:
- The HDMI 1.3 standard says that DeepColor requires at least 12 bit/color support (16 bit is optional).
- U2410's specification says that it supports DeepColor.
- The STDP60xx documentation mentions 12 bit/color (but in the context of YCC, so it is unclear for RGB).
So I think the U2410 should accept 12 bit/color through HDMI. Has anyone tested it yet?
I tested the DP connection and it looks like the effective bit depth is 10 bit/color. (Moninfo reports it, and ArgyllCMS guesses the same in the pre-calibration report.)
 
I tested the DP connection and it looks like the effective bit depth is 10 bit/color. (Moninfo reports it, and ArgyllCMS guesses the same in the pre-calibration report.)
10-bit support is very likely, but we should keep in mind that the panel is still an 8-bit version with internal FRC for 10-bit-to-8-bit conversion, which is used regardless of the input signal and its bit depth.

sRGB mode measures:
White = XYZ 117.45 123.83 133.83
A 255-255-255 setting without calibration? Quite a good result, because it is not far from the black body curve (< dE 0.4).
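Just to show where that comes from, a quick sketch that converts the measured sRGB-mode white from XYZ to xy chromaticity and compares it with the D65 reference (0.3127, 0.3290):

X, Y, Z = 117.45, 123.83, 133.83             # measured white point, sRGB preset
x = X / (X + Y + Z)
y = Y / (X + Y + Z)
print(f"white point: x={x:.4f}, y={y:.4f}")  # roughly x=0.313, y=0.330
print(f"offset from D65: dx={x - 0.3127:+.4f}, dy={y - 0.3290:+.4f}")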

Is it as simple as selecting xv Mode in the OSD, with my VGA then detecting and using it automatically (with gamut correction)?
The signal was an RGB signal (PC levels) from a DVD player and two external video processors. Apart from the YCbCr => RGB transcoding and expansion to PC levels, no additional processing was applied.

Best regards

Denis
 
A 255-255-255 setting without calibration? Quite a good result, because it is not far from the black body curve (< dE 0.4).

Yes. I think they made the sRGB emulation 3DLUT with 255-255-255 values. (And they lowered them later, for whatever reason...)
The calibration curves are not too bad either. There is some contrast degradation after it is calibrated to a pure power gamma 2.2 tonal response, but it is still good. (And better than the "uncalibrated" factory settings with lower RGB values.)

the panel is still an 8-bit version with internal FRC for 10-bit/8-bit conversion
The ST chip works with 12 bits (so 12-bit input would be better than 10-bit: it would eliminate an interpolation step and might produce a slightly better result...) and dithers it back to 8 or 6 bits to feed the panel. Would the panel then dither it again? That is weird and could cause some of the issues. Or not... I am not familiar enough with displays to judge, but it sounds weird... What do you gain with that dithering when there is no additional correction? Is there any?

The signal was an RGB signal (PC levels) from a DVD player and two external video processors. Apart from the YCbCr => RGB transcoding and expansion to PC levels, no additional processing was applied.

Yes, but the YCbCr->RGB conversion is a logical place for a gamut conversion. I am trying to figure out what happens when xvYCC mode is enabled. (A rough sketch of that transcoding step is at the end of this post.)
- Does the display select the correct 3DLUT automatically (according to the reported source gamut information)?
- Does the input side perform the gamut conversion (according to the reported display gamut information)?

The first case is much more likely, but the second would be more interesting. :D
In the second case the display should behave like the sRGB preset, but the review says it is the best mode on the colorimetric side, not that it is the same as the sRGB mode.
We will see. An HDMI cable is not so expensive.
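For reference, here is a rough, self-contained sketch of the "YCbCr => RGB transcoding and expansion to PC levels" step mentioned above, using the standard Rec. 709 coefficients. The exact coefficients and rounding a given player or video processor uses may differ; this is only an illustration.

import numpy as np

def ycbcr709_video_to_rgb_full(y, cb, cr):
    # 8-bit video-level Y'CbCr (Rec. 709) -> 8-bit full-range ("PC level") R'G'B'
    y_  = (y - 16.0) / 219.0                 # 16..235 -> 0..1
    cb_ = (cb - 128.0) / 224.0               # 16..240 -> -0.5..0.5
    cr_ = (cr - 128.0) / 224.0
    r = y_ + 1.5748 * cr_                    # Rec. 709 inverse matrix coefficients
    g = y_ - 0.1873 * cb_ - 0.4681 * cr_
    b = y_ + 1.8556 * cb_
    rgb = np.clip(np.array([r, g, b]), 0.0, 1.0)
    return np.round(rgb * 255).astype(int)

# Video-level white (235, 128, 128) should map to full-range white (255, 255, 255)
print(ycbcr709_video_to_rgb_full(235, 128, 128))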
 
Yes, but the YCbCr->RGB conversion is a logical place for a gamut conversion. I am trying to figure out what happens when xvYCC mode is enabled.
The behavior in xvYCC mode is quite odd, but I rechecked it twice with the U2711. Normally only a YCbCr signal should "automatically" lead to the Rec. 709 color space in this case, but the whole "Video" set of presets is, as indicated and as you have also discovered, a bit screwed up. Interestingly, though, the Custom Color mode is linear on the U2711.

I was just wondering, because the native white point (though rarely accessible; how "native" the 255-255-255 in the factory menu really is, is another question) normally lies "colorimetrically worse", so RGB gain changes would be used in a simple factory calibration.


Would the panel then dither it again? That is weird and could cause some of the issues.
Though the U2410 is not an Eizo CG, the process is at least basically similar:
http://www.eizo.de/pool/files/de/EIZO_LCD_Kalibration_Teil3.pdf

The FRC dithering can take place in the scaler or in the panel, to preserve tonal values that weren't lost thanks to the high-bit processing. In this specific LG panel, the FRC stage is implemented in the panel.

Best regards

Denis
 
Itching to get this monitor now... but if it's not equivalent in picture/color quality and text crispness to the 2209WA I have right now... I would be terribly disappointed.
 
Itching to get this monitor now... but if it's not equivalent in picture/color quality and text crispness to the 2209WA I have right now... I would be terribly disappointed.

Have they fixed the tinting issues and all that? Because the last time I checked, 95% of the posts coming from here are people saying their 2nd or 3rd monitors still have issues.
 
Good advice: buy this monitor in a local shop and test it on the spot. Bearing in mind the high percentage of issues with this model, it's pretty much a lottery if you order it over the internet. I bought mine several days ago at an official Dell dealer shop. They let me test it with my notebook. It was the only U2410 there, sitting on a stand; I think that's why it was flawless, with no tinting, dithering, glow, broken pixels, or graininess. The sliding panel said it was made in May 2010 in CZ, firmware A01.

So far I have encountered no problems with it, apart from the calibration issue (but I use sRGB mode and it works fine for me) and the brightness (it was too bright originally, but I set brightness to 10% and it looks good).

Now I wonder if I can improve this monitor further. First off, I want to do some calibration, but I'm pretty much a newbie at this. As far as I have investigated, I need to use a hardware calibrator plus software in Standard mode. If anybody has already calibrated their U2410 successfully, step-by-step instructions would be useful.
 
Well, got my u2410 today. Zero tinting issues, zero dead pixels, and beats the hell out of my TN.

I ended up going through ebay and opting for the $20 pixel check where they go through monitors until they get a "perfect" one.

I suppose it was worth it.

Also, geeky, I'm looking to calibrate it as well and I don't know much myself. But I do know that the software that comes with the calibration tool creates a "profile" for adjusting colors and such, so it's done at a software level, not in the monitor settings themselves (well, mostly; brightness/contrast still come into play). It can only get so accurate using RGB, and the Custom Color preset on this apparently isn't that great.

This review explains all the details of calibration quite well: http://www.tftcentral.co.uk/reviews/dell_u2410.htm
 
If anybody has already calibrated their U2410 successfully, step-by-step instructions would be useful.

Take a look at this post: #2562
I am sorry if my limited English makes it hard to understand.

But that was my conclusion before Sailor_Moon let me know that xv Mode can be used with an RGB signal as well.
So I will get an HDMI cable on Monday, test xv Mode, and report my results.

One of my acquaintances (a "VGA guru" who works for AMD) told me that the gamut conversion is done by the VGA card in xvYCC mode!
- If the VGA reads the native chromaticity coordinates from the active ICC profile, then we are living in monitor heaven. :D
- If it gets the coordinates from the EDID, well, it will require some tricks, but we can edit the EDID and write the modified version back to the display, so the gamut emulation could be perfect (at least for 100% RGB saturation).
- If it gets the coordinates from somewhere else, well, we will see what we can do. (But I think we will have more choices than we have with the display hardware itself.)

One remaining question: how does the VGA perform the gamut conversion and chromatic adaptation? (A rough sketch of the underlying math is at the end of this post.)
- What kind of mathematical methods are used? (Cheap and fast, or high quality?)
- Is it done after the 8 bit/color framebuffering, with 8-bit final output into the VGA LUT? Or is it done by a 32-(/64-)bit shader on the 16-bit data before the 8-bit framebuffering, or by shaders on the framebuffered data but with 16-bit final output into the LUT?

I will try to get some detailed information and test the real-life results with this particular display model. ;)
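For reference, a minimal sketch of the math any such gamut conversion has to do, wherever it is implemented: build RGB->XYZ matrices from the chromaticity coordinates of the two color spaces and chain them. The Adobe RGB and sRGB primaries below are the textbook values; this is only my own illustration, not what the driver actually runs.

import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    # Columns are the XYZ of the primaries, scaled so that RGB (1,1,1) hits the white point
    prim = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in (xy_r, xy_g, xy_b)]).T
    xw, yw = xy_w
    white = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    scale = np.linalg.solve(prim, white)
    return prim * scale

D65 = (0.3127, 0.3290)
M_srgb  = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06), D65)
M_adobe = rgb_to_xyz_matrix((0.64, 0.33), (0.21, 0.71), (0.15, 0.06), D65)

# Full-saturation Adobe RGB green expressed in linear sRGB: some components fall outside
# 0..1, which is why a wide-gamut panel without emulation shows oversaturated colors.
adobe_green = np.array([0.0, 1.0, 0.0])
print(np.linalg.solve(M_srgb, M_adobe @ adobe_green))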
 
Does anyone have a coupon for this monitor right now? I'm waiting until I can snag it for 449 or less before I grab another.
 
One of my acquaintances (a "VGA guru" who works for AMD) told me that the gamut conversion is done by the VGA card in xvYCC mode!
There is no conversion in the video adapter (by the way, a standalone DVD player and video processor were used for the test). The transformation should take place because of the smart xvYCC definition, which uses the same primaries as Rec. 709 (the sRGB primaries). A "normal" YCbCr signal with video levels will "automatically" lead to such a result if the implementation was done carefully. The odd thing in this case is that a YCbCr signal doesn't lead to this situation on the Dell U2711, only an RGB signal with PC levels, which makes no sense.

If the VGA reads the native chromaticity coordinates from the active ICC profile, then we are living in monitor heaven.
As I said, that is not the case. But when talking about ICC profiles and the math, keep in mind that the XYZ tristimulus values are relative to D50 there.
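A small sketch of what "relative to D50" implies: profile white points and colorants are stored chromatically adapted to D50, so comparing them with D65-referenced measurements needs an adaptation step, for example the linear Bradford transform shown here (my own illustration, not code taken from any CMM).

import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

D65 = np.array([0.95047, 1.00000, 1.08883])   # CIE XYZ of the D65 white (Y = 1)
D50 = np.array([0.96422, 1.00000, 0.82521])   # CIE XYZ of the D50 white (Y = 1)

def adaptation_matrix(src_white, dst_white):
    # Scale the "cone" responses of the source white onto those of the destination white
    src_cone = BRADFORD @ src_white
    dst_cone = BRADFORD @ dst_white
    return np.linalg.inv(BRADFORD) @ np.diag(dst_cone / src_cone) @ BRADFORD

M_d65_to_d50 = adaptation_matrix(D65, D50)
print(M_d65_to_d50 @ D65)                     # lands (almost exactly) on the D50 white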

Also, geeky, I'm looking to calibrate it as well and I don't know much myself. But I do know that the software that comes with the calibration tool creates a "profile" for adjusting colors and such, so it's done at a software level, not in the monitor settings themselves (well, mostly; brightness/contrast still come into play). It can only get so accurate using RGB, and the Custom Color preset on this apparently isn't that great.

This review explains all the details of calibration quite well: http://www.tftcentral.co.uk/reviews/dell_u2410.htm
Calibration is only the first part; it brings the display to your desired parameters and "neutralizes" it. In the case of a software calibration, the modifications are carried out in the video card LUT, which leads to a loss of tonal values (by the way: without intervention the LUT is reset when starting a game). The necessary transformations are then carried out by a CMM via a device-independent color space and the involved ICC profiles, each of which describes a characteristic (the monitor profile created after calibration describes the display colorimetrically). So you need color-aware software, for example Photoshop (a small illustration follows below). For non-color-aware software (games etc.) you will need a display with flexible color space emulation, or at least one with a decent sRGB mode if that is the content you will display. The sRGB mode of the Dell U2410 and U2711 is rather good, although the sRGB gradation is not achieved.
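As a small illustration of what color-aware software asks the CMM to do, here is a sketch using Pillow's ImageCms module. The file names are placeholders; in reality you would point it at the monitor profile your calibration run produced.

from PIL import Image, ImageCms

img = Image.open("photo.jpg")                  # assume the image is encoded/tagged as sRGB
src_profile = ImageCms.createProfile("sRGB")   # built-in sRGB source profile
mon_profile = "U2410_standard_calibrated.icm"  # hypothetical profile from your calibration

converted = ImageCms.profileToProfile(img, src_profile, mon_profile)
converted.show()                               # what a color-managed viewer would display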

What matters is the right colorimeter, or rather the right colorimeter-software combination. When you compare the measurements in the linked TFT Central report (which used a colorimeter without correction) with the measurements at PRAD (which used a spectrophotometer), you will see a great difference in the white point. This is very problematic: when using a colorimeter without appropriate correction you will at the very least end up with a white point that is far away from your desired one (and also far away from the black body curve, which will lead to a color cast). More information here.

Best regards

Denis
 
I've been looking for a desk clamp stand that can hold my 3 U2410 screens; I can't find anything.
 
The sh#tstorm over my U2410 is swirling again...

I bought an HDMI cable. (It is an "OEM" part, but the pins and the "chassis" are gold plated, so in theory it is a high-quality cable.)
The display cannot show any image with it! I tried it with my desktop PC first (HD5850 with a native HDMI port) and later tried connecting it to my laptop (GeForce 8600M GT, with the HDMI port integrated into the Realtek audio chip), but the display just says "No signal. Move the mouse, etc..."
I also tried setting different display modes from the laptop (native, 1080p24, 800x600@60Hz...) but none of them worked. (I watched the VGA control panel on the laptop screen...)

The funny thing is that the Moninfo software could read the EDID, and the VGA driver was able to recognize the display as well (name, resolutions, etc.). According to Moninfo, it should accept 12 bit/color through the HDMI input. (It recognized the xvYCC capabilities as well, but those were never in question, unlike the bit depth.)

The OSD always flashed when I tried to set a different display mode, but I never saw any image, just a black background with an OSD box.

What do you think? Is the cable broken, or should I call Dell again?



Other thing:
The EDID lists the audio capabilities as well, but it now reports a maximum of 2 audio channels! I am sure it was 6 channels with older revisions.
So I changed back to the DP connection and checked again: it also reads 2 channels there! I checked my HD Audio driver and it is OK (the latest ATI driver from Catalyst). The Windows control panel only lets me set it to stereo mode.

I had never tried to use the audio capabilities until now. I don't really care about them, but I was curious, so I connected my Sennheiser CX-300 to test. The audio is very distorted; it is clearly broken.
I tried connecting it to my laptop again (with HDMI) but there wasn't any sound. (I expected that, since there is no picture at all, but I tried anyway.)



And a last thought: I think I figured out the source of my banding issues!

It is a refurbished part. It was manufactured and factory calibrated with the A00 firmware, but it has the A02 firmware now.
Do you remember the dithering issues? They disabled the dithering in the A01 firmware, but my unit was calibrated with dithering enabled. I think that can make a lot of difference in the dark tones, and I only have issues with the dark shades!



What do you think? (I think I should never have bought this display. But it is here now, so I have to use it somehow...)

And a question: I don't know much about Dell's policies. Can I request a different Dell product? (I don't have anything in mind right now, but maybe they will release a new 24" display with fewer issues soon. Are there any rumors?)
I have an order number, but I never had a real Dell invoice because I bought it from a reseller, not from Dell. Is that a problem in this case? (They didn't ask for any papers when they came with the replacements, but if I were to ask for a different product...)
 
Massive thread... can anyone tell me how this panel is for gaming? Any noticeable ghosting or lag? Thanks.
 
Massive thread... can anyone tell me how this panel is for gaming? Any noticeable ghosting or lag? Thanks.

I got mine last week, but only yesterday had the time to set it up with my new system. It looks amazing on Metro 2033. I got the rev. A02.
 
I've been through 3 of them already, and all of them had tint issues. Really poor quality control.
 
Ordered mine on July 24, 2010 from Dell (on sale $539). Received the monitor on July 27th (free shipping on Fedex). Monitor is excellent, with no issues. Colors are fabulous in Lightroom 3 and CS 5; text is sharp and easy to read on the web. I think I lucked out and got one that works.
 
Well, I bought another HDMI cable (an expensive HAMA High Speed certified cable; the other was an OEM gold-plated one). There is a picture on the screen now.

But xv Mode is a joke! There is no banding (it is the only video mode without out-of-box banding problems), but it produces heavy color distortion! It produces the same image I got when I tried to set up a custom sRGB emulation in Custom mode with the Hue and Saturation controls. It is absolutely useless.

The EDID says that 12 bit/color mode is supported, but there is no software that can tell me the current bit depth over HDMI. (Moninfo showed 10 bit with DP and shows "undefined" with HDMI.)
Argyll guesses 10 bit/color now, just as it guessed 10 bit with DP.
But I think the 256-step gray ramp looks a bit smoother now. As I remember there was a very small jump around ~50; it is smooth now.

So I think I will stick with HDMI, but this cable wasn't worth the money. :p
 
But xv Mode is a joke! There is no banding (it is the only video mode without out-of-box banding problems), but it produces heavy color distortion! It produces the same image I got when I tried to set up a custom sRGB emulation in Custom mode with the Hue and Saturation controls. It is absolutely useless.
Sorry to hear that. Unfortunately I was never able to test that mode on the U2410.

Best regards

Denis
 
My mom just got one... yeah, my mom, and all she does is type and do document-related work :(

I am very envious, as that screen was absolutely stunning.
 
The EDID says that 12 bit/color mode is supported, but there is no software that can tell me the current bit depth over HDMI. (Moninfo showed 10 bit with DP and shows "undefined" with HDMI.)
Argyll guesses 10 bit/color now, just as it guessed 10 bit with DP.

I have been following this with interest, having just set up my U2410 with an i1 Pro and displayCal, with service-menu tuning, relatively successfully.

I must say, though, that I think that regardless of the cable (DP, HDMI, DVI), on Windows you won't be able to get more than 8-bit output. This is covered somewhere on the Nvidia forum: some of their Quadro cards support 30-bit output over DP, but only when you use OpenGL rendering, which bypasses the Windows framebuffer. It is only usable in certain apps, and Photoshop can only use it in full-screen mode. The Windows framebuffer as such always outputs 8 bit, so the card and cable capabilities do not matter.
 
I hope they add support for 12-bit color in Windows 8, for professionals. If monitors can push non-standard gamuts, then they can also support higher bit depths.
 
Well, the higher bit depth output from the VGA LUT helps with calibration. In most cases you won't need 10-bit framebuffering to reproduce 8-bit source material. But those frames go through the VGA LUT, where the higher bit-depth output applies (I am sure about 10+ bit; it is measurable), and that can reduce the banding calibration can cause (i.e. relatively large changes through the VGA LUT combined with a low bit-depth connection).

But I want to use 10-bit framebuffering for one particular thing: Blu-ray playback.
madVR can work in D3D fullscreen exclusive mode now, so DeepColor support is only a matter of time and VGA support. And 10-bit output can be useful when we do some gamma correction as well (after the YCbCr 4:2:0 -> RGB 4:4:4 conversion, which is tricky with 8-bit framebuffering too...).
And AMD should enable 10+ bit framebuffering for video renderers, because they said my card is DeepColor compatible; the advertisement mentions up to 16 bit/color support!
I don't know how they imagined that, but it is good grounds for asking them to enable 10-bit framebuffering for video renderers!

Otherwise, Windows 7 theoretically supports DeepColor with up to 16 bit/color. I think it works with a D3D surface (it should).

By the way, I started a Dell IdeaStorm thread requesting a LUT utility here. I know it will never happen, but you may want to promote it:
http://www.ideastorm.com/ideaView?id=087700000000io9AAA
 
My mom just got one... yeah, my mom, and all she does is type and do document-related work :(

I am very envious, as that screen was absolutely stunning.

Dude, that's pretty sad that you're envious of a technology product that your mother owns. How can you show your face on this forum? :p
 
Well, the higher bit depth output from the VGA LUT helps with calibration. In most cases you won't need 10-bit framebuffering to reproduce 8-bit source material. But those frames go through the VGA LUT, where the higher bit-depth output applies (I am sure about 10+ bit; it is measurable), and that can reduce the banding calibration can cause (i.e. relatively large changes through the VGA LUT combined with a low bit-depth connection).

Please explain how that is so. Every image you try to output will be 8-bit because of the framebuffer. The graphics card may scale it up to 10/12 bits, but that's just scaling and will still leave 256 distinct values per channel. It won't reduce banding, because to do that you would need more distinct values to fill the transitions between gradations, and those won't come out of nowhere. From my experiments with an HDMI cable (dispcalGUI tests show 10 bits), it does not make any difference to the grayscale ramp. Calibrated to a gamma different from the native one (say, Standard mode to 2.2), it shows exactly the same banding as a DVI-D cable.

And AMD should enable 10+ bit framebuffering for video renderers, because they said my card is DeepColor compatible; the advertisement mentions up to 16 bit/color support!

My Nvidia (GT240) also supports DeepColor according to the spec, but there are no settings to set it up or explicitly enable it; plus, with the framebuffer limitation, I doubt there are any real benefits to it.

Otherwise, Windows 7 theoretically supports DeepColor with up to 16 bit/color. I think it works with a D3D surface (it should).

Possibly, but how is that useful in real apps that would benefit from it (photo, video editing)?

By the way, I started a Dell IdeaStorm thread requesting a LUT utility here. I know it will never happen, but you may want to promote it:
http://www.ideastorm.com/ideaView?id=087700000000io9AAA

I will. BTW, I had a look at the binary of the A01 firmware file for the U2410 and it seems that the firmware has some capability to do LUT programming; there are quite a few diagnostic messages that indicate that. I also believe this mode can be enabled through some service menu, though not the one we know about. It would help if I could understand what format that bin file is in, but without knowing the details of the U2410's internal hardware, that would be practically impossible.
 
Please explain how that is so. Every image you try to output will be 8-bit because of the framebuffer.
From my experiments with an HDMI cable (dispcalGUI tests show 10 bits), it does not make any difference to the grayscale ramp. Calibrated to a gamma different from the native one (say, Standard mode to 2.2), it shows exactly the same banding as a DVI-D cable.

Well, there are some explanations:
- DVI is also capable of carrying 10 bit/color. It is non-standard but existing behavior, and it works with some 10-bit devices.
- You can load the usual 8-bit RGB material into the 8-bit RGB framebuffer, but you cannot make any changes without harm while you have an 8-bit connection.
But the VGA LUT works with 16 bit/color. The 8-bit framebuffer data is scaled up to 16 bit, processed, and rounded back down to your display's supported bit depth, which is 10 or 12 bit/color in this case. So you can work with 10+ bit calibration precision, and this helps you avoid banding (see the sketch below).
But yes, this is not magically high precision; it is only 4 times better, and it won't let you make very big changes. And in your example, the Standard mode has some out-of-box banding problems that won't be fixed by any VGA LUT calibration. That is a problem with the factory settings.
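To make the rounding argument concrete, here is a small sketch that pushes the 8-bit ramp through an example gamma correction and then quantizes the LUT output to 8 and to 10 bits, counting how many distinct levels survive. The 1.8-to-2.2 remapping is only an example and does not model any particular driver's arithmetic.

import numpy as np

levels_in = np.arange(256) / 255.0               # the 8-bit framebuffer ramp
corrected = levels_in ** (2.2 / 1.8)             # example: remap a native ~1.8 gamma to 2.2

for out_bits in (8, 10):
    max_code = 2 ** out_bits - 1
    out = np.round(corrected * max_code).astype(int)
    # Collapsed codes in the dark end are exactly the steps that show up as banding
    print(f"{out_bits}-bit LUT output: {len(np.unique(out))} distinct levels "
          f"out of 256 input steps")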

My Nvidia (GT240) also supports DeepColor according to the spec, but there are no settings to set it up or explicitly enable it; plus, with the framebuffer limitation, I doubt there are any real benefits to it.

I hope they will enable 10-bit D3D framebuffering for consumer-level cards. It would be useful for Blu-ray playback; I think home users deserve proper quality as well.

Why the hell does the world think that a home user gets no benefit from "studio quality" stuff? We need the same quality to fully reproduce those beautiful creations. And what is the point of HQ editing and work at the "level of true art" when nobody can reproduce it? :rolleyes:

Possibly, but how is that useful in real apps that would benefit from it (photo, video editing)?

Not only editing but simple playback as well. The Rec. 709 source -> PC monitor chain is quite complicated; I think it requires at least 10-bit RGB.
The YCbCr -> RGB conversion produces floating-point data, and you need gamma correction as well.
Of course, the EVR won't take it too seriously, but there is a very cool and free D3D renderer which does. You should take a look at it: madVR


I had a look at the binary of the A01 firmware file for the U2410 and it seems that the firmware has some capability to do LUT programming; there are quite a few diagnostic messages that indicate that.

Interesting. So the LUT should be accessible through the USB port. (It won't require any special internal connectors like JTAG or...)

I also believe this mode can be enabled through some service menu, though not the one we know about.

I don't think there is any other hidden OSD, and I doubt it would enable some standard access method for existing calibration programs; they wouldn't recognize the display anyway (ColorNavigator, SpectraView, etc.). I think they use specific service software to set up the other factory settings. It would be nice if that software leaked as well (like the firmware updater did). ;)
 
Well, there are some explanations:
But the VGA LUT works with 16 bit/color. The 8-bit framebuffer data is scaled up to 16 bit, processed, and rounded back down to your display's supported bit depth, which is 10 or 12 bit/color in this case. So you can work with 10+ bit calibration precision, and this helps you avoid banding.

I see now: so a 10-bit connection only helps to reduce VGA-LUT-induced artifacts. If you populate the VGA LUT with a straight gamma 1.0 (identity) ramp, it won't make any difference.

And in your example, the Standard mode has some out-of-box banding problems that won't be fixed by any VGA LUT calibration. That is a problem with the factory settings.

Actually my Standard mode has a pretty good ramp (very smooth transitions) when uncalibrated (i.e. with a linear VGA LUT/gamma 1.0). However, I do not like the default monitor gamma in Standard mode (around 1.8 on mine), so I calibrate it to gamma 2.2 in the VGA LUT. That is what causes some banding in the shadows. Having spent ages tweaking dispcalGUI settings (with the i1 Pro) and experimenting with measuring/calibrating different parts of the screen, I have managed to minimise this so it is only very mild now. I would still love to get rid of it completely, but I guess that is not possible without the monitor LUT being accessible.

Interesting. So the LUT should be accessible through the USB port. (It won't require any special internal connectors like JTAG or...)

That I am not sure about. I cannot analyze the binary firmware image beyond the messages it contains; I would need to know more about what it is for (which controller, etc.). From the menu structure, my guess is that it still works via DDC/CI, albeit in some non-standard way.

I don't think there is any other hidden OSD, and I doubt it would enable some standard access method for existing calibration programs; they wouldn't recognize the display anyway (ColorNavigator, SpectraView, etc.). I think they use specific service software to set up the other factory settings. It would be nice if that software leaked as well (like the firmware updater did). ;)

That's not what I meant; sorry for not being more descriptive. There is some list of choices (textual) in the firmware, some of which have something to do with the LUT. Because of the way it is presented in the text, it looks like some kind of menu. I think it is simply there to, perhaps, enable LUT programming via DDC/CI. Now, how to get to it, I have no idea.

Edit: the extracted text from firmware (formatted) looks like this:

Factory color adjust =========================
0 : Parameters List .......
1 : Set RGB Gain ................. 1 R_Gain G_Gain B_Gain [Range 0 - 0x7FF]
2 : Set R Re-mapping ............. 2 R_11 R_12 R_13 [Range 0 - 2047]
3 : Set G Re-mapping ............. 3 G_21 G_22 G_23 [Range 0 - 2047]
4 : Set B Re-mapping ............. 4 B_31 B_32 B_33 [Range 0 - 2047]
5 : Print Color Parameters current settings
6 : Excute current color settings
7 : Print final color matrix
R Gain ............................ = 0x%x
G Gain ............................ = 0x%x
B Gain ............................ = 0x%x
R Coef11 ............................ = 0x%x
R Coef12 ............................ = 0x%x
R Coef13 ............................ = 0x%x
G Coef21 ............................ = 0x%x
G Coef22 ............................ = 0x%x
G Coef23 ............................ = 0x%x
B Coef31 ............................ = 0x%x
B Coef32 ............................ = 0x%x
B Coef33 ............................ = 0x%x
R Gain ............................ = 0x%x
G Gain ............................ = 0x%x
B Gain ............................ = 0x%x
R Coef11 ............................ = 0x%x
R Coef12 ............................ = 0x%x
R Coef13 ............................ = 0x%x
G Coef21 ............................ = 0x%x
G Coef22 ............................ = 0x%x
G Coef23 ............................ = 0x%x
B Coef31 ............................ = 0x%x
B Coef32 ............................ = 0x%x
B Coef33 ............................ = 0x%x
7 : Print final color matrix
COEF 11 ............................ = 0x%x
COEF 12 ............................ = 0x%x
COEF 13 ............................ = 0x%x
COEF 21 ............................ = 0x%x
COEF 22 ............................ = 0x%x
COEF 23 ............................ = 0x%x
COEF 31 ............................ = 0x%x
COEF 32 ............................ = 0x%x
COEF 33 ............................ = 0x%x
Disable PreLUT
Disable RGB2YUV_CTRL
Bypass ACC
Bypass ACM
Bypass 3x3M 1
Bypass 3x3M 2
Disable DISP_LUT_CONTROL DDC

Switch to HDMI Factory color adjust =========================
0 : Parameters List .......
1 : Set RGB Gain ................. 1 R_Gain G_Gain B_Gain [Range 0 - 0x7FF]
2 : Set R Re-mapping ............. 2 R_11 R_12 R_13 [Range 0 - 2047]
3 : Set G Re-mapping ............. 3 G_21 G_22 G_23 [Range 0 - 2047]
4 : Set B Re-mapping ............. 4 B_31 B_32 B_33 [Range 0 - 2047]
5 : Print Color Parameters current settings
6 : Excute current color settings
7 : Print final color matrix
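Purely as speculation about how these entries might combine, here is a sketch that treats the RGB gains as a diagonal matrix multiplied with the 3x3 re-mapping rows, taking 1024 as "1.0" on the 0-2047 scale. None of this is confirmed by the firmware; it is just a guess at the arithmetic behind "Print final color matrix".

import numpy as np

def final_color_matrix(rgb_gain, remap_rows, one=1024.0):
    gain = np.diag(np.asarray(rgb_gain) / one)   # R_Gain, G_Gain, B_Gain as a diagonal
    remap = np.asarray(remap_rows) / one         # rows: R_1x, G_2x, B_3x coefficients
    return gain @ remap

# Identity-like example: unity gains and an identity re-mapping matrix
coef = final_color_matrix(
    rgb_gain=[1024, 1024, 1024],
    remap_rows=[[1024, 0, 0],
                [0, 1024, 0],
                [0, 0, 1024]],
)
print(coef)                                      # ~ 3x3 identity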
 
Thanks, that is very interesting. (And a little weird as well; it may mess with some of my theories, but maybe not...)

Disable DISP_LUT_CONTROL DDC

Maybe they used the VESA standard (or another common DDC method, like the one used by NEC) to access the LUT.
If it works over DDC, I think it may communicate directly with the STDP60xx chipset.

Well, I think it would be a pleasure to play with the factory software. :D
 
Ordered mine on July 24, 2010 from Dell (on sale $539). Received the monitor on July 27th (free shipping on Fedex). Monitor is excellent, with no issues. Colors are fabulous in Lightroom 3 and CS 5; text is sharp and easy to read on the web. I think I lucked out and got one that works.

I also just picked up a Rev. A02, July 2010, made in Mexico. I'm upgrading from a 2005FPW. The U2410 is looking pretty good so far. The screen is uniform; there's no green or pink tint anywhere on it, and no dead pixels that I could find. The only flaw I could find was some backlight bleeding in the lower left-hand corner, but it doesn't bother me. The colors are gorgeous. :)
 
I also got an A02, manufactured in Jan. '10; no tinting or bleeding. At standard brightness the display is very bright and evenly lit.
But: only the Standard, Multimedia, Game and Custom presets give a picture; the others give no picture (the monitor does not enter sleep mode and the backlight stays on).
Does anyone know what's up with that? I tried different DVI cables on my Gainward GTS 250 to no avail.
 
So, you have a refurbished unit.
The A02 firmware is much newer than your hardware, so it is refurbished hardware.
And every preset works on my U2410, so this is not a common problem.
Don't worry, I have been through three of them and I am still not perfectly satisfied.

By the way, I just read a review of the U2311H, and now I am really sad. It looks like a slightly better display for a much lower price; the only disadvantage is the 16:9 aspect ratio. And I bought my U2410 right before they arrived. :(
It only has an sRGB gamut, but I can't use any wide-gamut mode on this U2410 anyway.
Advantages: 72/75Hz support, better tonal response, lower input lag.
What the hell am I doing with this expensive crap? Maybe I will trade it 1:1 with somebody (after a proper test, of course). :)
 
Just got my U2410 in today; I was really worried after reading about all the tinting issues. The display I got seems to be perfect though: no dead pixels and no tinting issues! Still messing with the calibration, but so far I'm happy. :D The monitor says Rev A02 but the driver disk says Rev 03?
 
Just got my U2410 in today; I was really worried after reading about all the tinting issues. The display I got seems to be perfect though: no dead pixels and no tinting issues! Still messing with the calibration, but so far I'm happy. :D The monitor says Rev A02 but the driver disk says Rev 03?

If you came from a TN LCD monitor, do you notice any difference with the anti-glare coating on the new Dell?
 
My center monitor was a Dell 2405FPW, but I do have two Asus VH242H TNs on the sides. The room's not really bright enough to get glare, but I don't see an issue with it. It looks more or less the same as my previous screen.
 
Do these monitors have ambient light sensing? I have noticed that sometimes, if I navigate from, say, [H]ard to Google, you can see the brightness step down or up, like a laptop screen does when adjusting brightness.
 
Do these monitors have ambient light sensing? I have noticed that sometimes, if I navigate from, say, [H]ard to Google, you can see the brightness step down or up, like a laptop screen does when adjusting brightness.

No, there is no sensor.

Some preset modes activate the dynamic contrast ratio (content-based backlight control); it can be enabled for any mode from the factory OSD, and you can (always?) disable it from the user OSD.

Which display mode do you use? Of course it is your choice, but I think the usable modes (Standard, sRGB, Adobe RGB) don't offer DCR at all.
 
No, there is no sensor.

Some preset modes activate the dynamic contrast ratio (content-based backlight control); it can be enabled for any mode from the factory OSD, and you can (always?) disable it from the user OSD.

Which display mode do you use? Of course it is your choice, but I think the usable modes (Standard, sRGB, Adobe RGB) don't offer DCR at all.


Oh okay, that makes sense... I usually use Multimedia if I am typing/surfing, or my own setting that I set up, for watching videos or playing games.

Thanks.
 