Dell U3011 coming soon.

The 3rd and final Dell replacement arrived in the mail today. I'll report back on whether it displays any uneven backlighting... with any luck it'll at least be better than the last 2 in that department.

Otherwise, on the 2 monitors I've tested so far, I have not had any issues with using the Custom RGB mode and waking from sleep. Other than quality control issues with uneven CCFL backlighting, I'm still fairly impressed by the monitor.
 
20 min, seriously? That's how long it takes my 10-year-old CRT to warm up...

Actually, I think my 10-year-old CRT used to warm up faster, but really, this is how CCFLs behave. This is why they say to wait 30 minutes before calibrating an LCD.

Sorry Twiik, THOUSANDS AND THOUSANDS of gamers can't be wrong on this one. Input lag is an issue, and the very best professional gamers play on TN panels because of it.

If you are going to game competitively, that is fine, but just because Lance Armstrong rode a carbon fiber bicycle in the Tour doesn't mean you need one to ride around the neighborhood.


As far as wide gamut vs. native sRGB goes, I don't think that is really the question. The real question is how good the sRGB emulation is on the particular wide gamut display you want to buy, unless, of course, you primarily use color managed apps (I do not).


Yes, good sRGB emulation will be the critical element of my next LCD (if it isn't standard gamut). These modes have been improving, but I have some concern, as it sounds like this one isn't quite as good as the one in the Dell U2711.
 
Yes, good sRGB emulation will be the critical element of my next LCD (if it isn't standard gamut). These modes have been improving, but I have some concern, as it sounds like this one isn't quite as good as the one in the Dell U2711.

What do you look for to judge the quality of the sRGB mode?
 
Zarathustra[H];1036509876 said:
What do you look for to judge the quality of the sRGB mode?

The presence of a 3D-LUT inside the monitor to correctly match smaller colour spaces.
 
The presence of a 3D-LUT inside the monitor to correctly match smaller colour spaces.
A 3D-LUT is a nice feature, but you can achieve comparable results with a 1D-LUT for each channel. A good example is the CG303W, which I could recently test. Regarding color space emulation: no difference beyond measurement inaccuracy/display variability compared to the CG245W or NEC PAs, even with critical mixed colors (partly near the achromatic axis).
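
To make that concrete, here is a toy numpy sketch of the matrix-based alternative: per-channel 1D LUTs (degamma/regamma) around a 3x3 primary-conversion matrix, with a 3D-LUT baked from the same transform for comparison. The matrices are illustrative placeholders, not measured data from any of the monitors mentioned.

Code:
# Toy sketch: sRGB emulation via per-channel 1D LUTs + 3x3 matrix,
# compared against a 3D-LUT baked from the same transform.
# Matrices are illustrative placeholders, not measured panel data.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
NATIVE_TO_XYZ = np.array([[0.49, 0.31, 0.20],      # hypothetical wide-gamut
                          [0.23, 0.69, 0.08],      # panel primaries
                          [0.00, 0.06, 0.99]])
SRGB_TO_NATIVE = np.linalg.inv(NATIVE_TO_XYZ) @ SRGB_TO_XYZ

def emulate_srgb(rgb, g=2.2):
    """1D LUT (degamma) -> 3x3 matrix -> 1D LUT (regamma)."""
    linear = np.power(rgb, g)                       # per-channel curve
    remapped = SRGB_TO_NATIVE @ linear              # matrix moves the primaries
    return np.power(np.clip(remapped, 0, 1), 1/g)   # per-channel curve

ax = np.linspace(0, 1, 17)                          # bake a 17^3 3D-LUT
grid = np.stack(np.meshgrid(ax, ax, ax, indexing="ij"), axis=-1)
lut3d = np.apply_along_axis(emulate_srgb, -1, grid)
lookup = RegularGridInterpolator((ax, ax, ax), lut3d)

rgb = np.array([0.8, 0.2, 0.5])
print(emulate_srgb(rgb))                            # matrix + 1D-LUT pipeline
print(lookup(rgb)[0])                               # baked 3D-LUT, near-identical

Note that it is the 3x3 matrix stage, not the 1D LUTs alone, that moves the primaries; a full 3D-LUT only becomes necessary for corrections that cannot be expressed as per-channel curves plus a matrix.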

Best regards

Denis
 
A 3D-LUT is a nice feature, but you can achieve comparable results with a 1D-LUT for each channel. A good example is the CG303W, which I could recently test. No difference beyond measurement inaccuracy compared to the CG245W or NEC PAs, even with critical mixed colors.

I generally go more by reviews than by features, but an sRGB mode that allows adjustment/calibration in the monitor is a large bonus.

Do your tests/reviews show up online? Where? Do you have a Dell U3011?
 
Do your tests/reviews show up online? Where? Do you have a Dell U3011?
Our Dell U3011 had a quite good fixed sRGB/AdobeRGB mode (comparable to the U2711). But there were some problems, especially regarding the white point, which I wouldn't have expected given the results of the U2711. So my "extrapolation" before having seen and measured the display was not so accurate.

I generally go more by reviews than by features, but an sRGB mode that allows adjustment/calibration in the monitor is a large bonus.
Of course. A flexible color space emulation is very nice to have - I only wanted to point out that it doesn't need a 3D-LUT.

Best regards

Denis
 
Our Dell U3011 had a quite good fixed sRGB/AdobeRGB mode (comparable to the U2711). But there were some problems, especially regarding the white point, which I wouldn't have expected given the results of the U2711. So my "extrapolation" before having seen and measured the display was not so accurate.

It is disappointing that the early reports of poor white balance have been verified by your very rigorous test methods.

I was hoping it would deliver something like the U2711 results you found (which were just about as good as it gets for any factory setting). I would want something of that level before I would consider a wide gamut panel. It would be enough to tip me in favor of the U2711 over the U3011.

Thanks for your input.
 
A 3D-LUT is a nice feature, but you can achieve comparable results with a 1D-LUT for each channel.

But for the purpose of colour space emulation, a target primary colour may not lie on the line between the native primary colour and the white point. In that case, how does a 1D-LUT achieve a comparable result to a 3D-LUT?

Are there any particular resources you read on this subject that can help my understanding?
 
But for the purpose of colour space emulation, a target primary colour may not lie on the line between the native primary colour and the white point. In that case, how does a 1D-LUT achieve a comparable result to a 3D-LUT?

Are there any particular resources you read on this subject that can help my understanding?

I don't quite understand how any of this works, but he was talking about using 3 1D LUTs (one per channel) vs. 1 3D LUT, I believe...
 
Zarathustra[H];1036510709 said:
I'm wondering more what qualitative differences in the image there are between a wide gamut monitor with a good sRGB mode and one without?

A good sRGB mode will give you natural-looking colors everywhere: natural skin tones, natural sky tones, no color casts. It is very desirable IMO.

A poor sRGB mode could mean anything. Early attempts on older Dells like the 2408 were pretty horrid, and almost no one found them usable. They usually just cut down saturation (often too much) without really correcting color.

Even a reasonable mode with incorrect white balance can make colors look slightly off, as if there is a color cast.

Finally, if the sRGB mode is unusable, you resort to native mode and usually end up with oversaturated "Disney-on-Acid" neon colors in animations/games, and weird sunburned skin tones and alien blue skies in movies.
 
This is again incorrect. You make a lot of mistakes for someone so smug. Higher bit depth has nothing to do with wider gamut. There are 10-bit standard gamut displays
...
When I was mentioning the loss of bit depth, I was considering 8-bit options.
I know there are 10-bit standard gamut monitors, but those are rather the exception; with wide gamut monitors it is the other way around, because 10 bit does have something to do with wide gamut (check again what that guy says). There are 8-bit wide gamut monitors, but those are either older models or only cover the 92% NTSC range. Most recent wide gamut monitors have either 10 bit or internal processing to improve the display. Anyway, for the record, I was not claiming that there are no standard gamut 10-bit monitors; my reply was a direct answer to your statement that I'm sacrificing bit depth and have banding inaccuracies because I'm using wide gamut, which suggests that standard gamut users will not have those issues. In fact it is quite the opposite, and everyone can verify it with the test file. Here's what you said:
Consumer wide gamut calibration has significant inadequacies; there is no guarantee it is any more accurate than a native sRGB monitor, and you are also sacrificing a significant chunk of your limited 8-bit color depth to deal with calibration. You probably have some nice banding along with your other inaccuracies.
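
For reference, the loss being argued about here is easy to quantify. A toy numpy sketch (the 0.85 correction factor is made up) of what an 8-bit calibration step costs:

Code:
# Count how many distinct levels survive an 8-bit calibration step.
import numpy as np

levels = np.arange(256)                        # every 8-bit input code
scale = 0.85                                   # hypothetical channel correction
out = np.round(levels * scale).astype(int)     # re-quantized to 8 bits
print(len(np.unique(out)), "of 256 levels remain")   # -> 218
# The 38 merged codes turn smooth gradients into visible steps (banding).
# The same correction applied in a >=10-bit pipeline keeps all 256 steps.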

... last I checked you need a professional-grade graphics card, and then it only works in a small handful of applications.

Has this changed?
AFAIK all recent gaming cards are 10-bit, and some have been for several years now. The 10-bit feature mainly targets gamers and home entertainment. I don't play games much and don't know if there are 10-bit games, but like most things computer related, it is transition time again :)
 
I was not claiming that there are no standard gamut 10-bit monitors...

That isn't what I called you on. This is:
All standard gamut monitor users will see a horrendous banding when they open this file in Photoshop without any possibility to ever avoid it, while the users like me with wide gamut 10 bit channels monitors.... blah, blah, blah(snipped)...

You made a completely baseless but absolute statement (bold green), topped off with smug, self-serving references to yourself. It just happened to be wrong, which is what makes smug self-reference funny. :)

As to the question of an actual full 10-bit path functioning with consumer cards, the info I find still indicates professional cards are required. I will wait for confirmation from someone reliable.
 
Regarding the 10-bit panels: most versions are still based on an 8-bit panel. There are very few real 10-bit panels, found for example in the LG W2420R and HP LP2480zx.

To preserve tonal values after high-precision processing (>= 10-bit LUT) of the input signal, an FRC stage is used. That is nothing new; it has been state of the art for many years in all "better" screens with extensive electronics (I'm not referring to TN panels, which also use FRC dithering to overcome limitations). The electronics would be quite useless otherwise, as we would suffer a loss of tonal values at the very end.

A 10-bit capable input often leads to a "10-bit panel" claim in advertising, especially when the FRC stage is implemented in the panel itself. That is a fairly new and somewhat misleading trend, but it makes no difference for the end user. A good implementation (regardless of where the FRC sits) shows only minor artefacts (slight noise in dark tonal values). And there is of course still a benefit to feeding a 10-bit signal even in this 8-bit + FRC constellation.
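
As a toy illustration of the FRC idea (my own sketch, not any vendor's actual algorithm; real implementations also vary the pattern spatially to avoid flicker): a 10-bit code is approximated on an 8-bit panel by alternating two neighbouring 8-bit codes so that the temporal average matches.

Code:
# Spread one 10-bit code (0..1023) across four 8-bit frames.
import numpy as np

def frc_frames(code10, n_frames=4):
    base, frac = divmod(code10, 4)      # four 10-bit steps per 8-bit step
    frames = np.full(n_frames, base)
    frames[:frac] += 1                  # bump 'frac' of the frames by one code
    return np.clip(frames, 0, 255)      # the very top codes saturate

f = frc_frames(513)                     # a 10-bit value between 8-bit codes
print(f, "temporal average:", f.mean() * 4)   # [129 128 128 128] -> 513.0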

Best regards

Denis
 
Sorry Twiik, THOUSANDS AND THOUSANDS of gamers can't be wrong on this one. Input lag is an issue, and the very best professional gamers play on TN panels because of it.

I'm coming from a CRT (not TN though) and I'm telling you that the difference is noticeable. If you're coming from other LCD panels, then you haven't noticed the difference because you're already coming from the same environment. I'm telling you, the difference between TN and IPS is like playing on satellite vs. cable. It's DEFINITELY noticeable!

I'm no expert on monitors and won't ever claim to be, but when you're talking about a TN panel, doesn't that only apply to LCDs? If that's true, what do you mean when you reference "coming from a CRT (not TN though)"? I think you should have your facts straight before you make a post trying to say someone else is wrong.
 
A good sRGB mode will give you natural-looking colors everywhere: natural skin tones, natural sky tones, no color casts. It is very desirable IMO.

A poor sRGB mode could mean anything. Early attempts on older Dells like the 2408 were pretty horrid, and almost no one found them usable. They usually just cut down saturation (often too much) without really correcting color.

Even a reasonable mode with incorrect white balance can make colors look slightly off, as if there is a color cast.

Finally, if the sRGB mode is unusable, you resort to native mode and usually end up with oversaturated "Disney-on-Acid" neon colors in animations/games, and weird sunburned skin tones and alien blue skies in movies.

That was exactly my experience with the 2408WFP. I had two of them because I honestly thought the first one was defective; it looked that horrible. This was coming from happily using a 2005FPW and a 2407WFP previously, which, while decent screens, also made some concessions to meet their price points. They weren't perfect, but they were definitely usable and enjoyable.

The 2408WFP was simply unusable to me in any mode or setting. Had I not gotten lucky and found a deal on a discontinued Eizo SX2461W directly from Eizo, I would probably have saved up and picked up the same 24" NEC you are currently using.

IMO wide gamut is not usable in non-color-managed apps, which (as you have already pointed out) are almost all the apps most people use. People can argue that it doesn't bother them, which is fine. However, incorrect or oversaturated color is still wrong whether it bothers the end user or not. The same could be said for input lag; it's still there whether you notice it or not.

Nothing's perfect though, and we have to pick what we are willing to compromise on. Since this is a thread about the U3011, I would assume that people here are more interested in color than in the lowest cost/lowest input lag. If the latter is more important, then the HP ZR30w is probably the better choice based on everything I've read.

Personally, I would rather have the image look good and the colors be accurate in exchange for some input lag, and I'm willing to pay a little extra for it. Does anyone know how much better the sRGB emulation on the U3011 has gotten compared to the 2408WFP?
 
Nothing's perfect though, and we have to pick what we are willing to compromise on. Since this is a thread about the U3011, I would assume that people here are more interested in color than in the lowest cost/lowest input lag. If the latter is more important, then the HP ZR30w is probably the better choice based on everything I've read.

Many people purchased this monitor at 25% or 35% off, which brought the price very close to the ZR30w's. At that point, the trade-off becomes color vs. input lag. If someone (such as myself) cannot visually discern the difference between 11 ms and 25 ms of input lag, then their purchasing decision may be even easier to make.
 
Many people purchased this monitor at 25% or 35% off, which brought the price very close to the ZR30w's. At that point, the trade-off becomes color vs. input lag. If someone (such as myself) cannot visually discern the difference between 11 ms and 25 ms of input lag, then their purchasing decision may be even easier to make.

Agreed, but only if the sRGB emulation and overall image quality on the U3011 are good and the colors are accurate. If not, it's really irrelevant what you paid for it.
 
I'm no expert on monitors and won't ever claim to be, but when you're talking about a TN panel, doesn't that only apply to LCDs? If that's true, what do you mean when you reference "coming from a CRT (not TN though)"? I think you should have your facts straight before you make a post trying to say someone else is wrong.

I can no longer locate the website that describes the sort of CRT technology my monitor used. But it stated something like the graphics series monitor was above TN and MVS or something like that.

The point being that I come from a display with very little input lag; CRTs have essentially none. You're just trying to discredit me because I cannot locate the exact information on a monitor that's a decade old.

It's okay though, because it really doesn't matter what you think. On that CRT, I was an exceptional player who was at the top of his team more often than not, regardless of whether my side won or lost. And now I'm somewhere in the middle of the scoreboard because my new Dell U3011 has mediocre input lag.

If you respond to me again, I'll repeat the exact same thing in a different way. But thanks for the reply and the chance to express this once again. :)
 
Zarathustra[H];1036512555 said:
So what do I look for when opening this test file?

If it is banding, then you lack the ability to handle 30-bit color; you are seeing tonal banding because you cannot display enough distinct tones to show a smooth gradient. The only way to overcome this is to use 30-bit color, and that's only done by having a card that uses DisplayPort and outputs 30-bit color. Typically this means having a video card that costs over $1000.
 
I just got my U3011 yesterday... quick impressions:
- black levels leave a little to be desired, but then again, I'm coming from a CRT
- input lag isn't bugging me, again coming from a CRT. Tried out UT2004, Mass Effect, Mass Effect 2, and Sins of a Solar Empire to cover a variety of game "speeds". Just didn't notice it.
- passes the Nokia monitor test right out of the box on the Multimedia preset. So hooray, brightness/contrast are set properly. Going to tackle colors later this weekend.
- doesn't weigh 110 lbs like the FW900
- only saw the AG coating sparkling a bit on a pure white full-screen background
Bad:
- the built-in USB hub causes my boot process to hang; maybe it's because I already have 2 hubs (from other monitors)?

Other:
I use a 2209WA in portrait mode for web browsing/PDFs/comic viewing (so, 1050x1680). The 30" is *literally* high enough resolution to just stick a web browser on, hit Win+Right/Win+Left, and let it take up half the monitor with plenty of other space left over. It can pretty much fit two 2209WAs side by side (it's just 80 pixels short vertically). I can't imagine how sweet it'd be to have two U3011s side by side...
 
If it is banding, then you lack the ability to handle 30-bit color; you are seeing tonal banding because you cannot display enough distinct tones to show a smooth gradient. The only way to overcome this is to use 30-bit color, and that's only done by having a card that uses DisplayPort and outputs 30-bit color. Typically this means having a video card that costs over $1000.

Hmm... I do see banding...

The part I don't get is that if I use the eyedropper tool to pick the colors at the far edges of the sample image and then create my own gradient using the gradient tool, the gradient is perfectly smooth...

So why would this particular image show banding, if I can create a smooth gradient between the same colors?
 
The 30" is *literally* high-enough resolution to just stick a web browser on and hit Win+Right/Win+Left and let it take up half the monitor, with plenty of other space left over.

I used to do this with my 24" 1920x1200 2405FPW. The 2560x1600 almost seems like overkill for this purpose.


Where I really take advantage of the resolution is in Photoshop and Capture NX when working on my pictures, and in games.
 
I got mine yesterday. I did notice a slight greenish tint to the left half.

I also really notice the input lag with this compared to my old 20" NEC WMGX2. Text is super sharp though, which I love. I also only notice the AG coating on white backgrounds.

But if really low input lag is vital for your needs, then this is not the monitor for you. The difference is night and day compared to my old NEC.
 
If it is banding, then you lack the ability to handle 30-bit color; you are seeing tonal banding because you cannot display enough distinct tones to show a smooth gradient. The only way to overcome this is to use 30-bit color, and that's only done by having a card that uses DisplayPort and outputs 30-bit color. Typically this means having a video card that costs over $1000.

Yes, everything I read says you need a professional card from NVIDIA (Quadro) or ATI (FirePro). Some of these are not horribly expensive, but you will get a much weaker GPU for the price on a pro card.
Something like this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133272

Or you can get a reasonably good GPU and pay $3000+
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133347


Here is the NVIDIA PDF on requirements:
http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf
Here is the AMD PDF on requirements:
http://www.amd.com/us/Documents/48108-B_ATI_FirePro_Adobe_10-Bit_FAQ_R5_Final.pdf

Both say they require professional workstation cards (Quadro or FirePro).

Once you have the appropriate card and monitor, you then need appropriate software, which has to be specifically coded for 10-bit output using OpenGL calls.

In reality, practically no one goes for this except high-end pro shops, because you are paying through the nose on hardware to get 10-bit support, usually in just one application.

I would think that if 10-bit support had been released on consumer cards, we would be able to find something about it on the AMD/NVIDIA pages.
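
For completeness, the application side looks something like this. A sketch (my own, using the GLFW bindings as an example windowing layer) of requesting a 10-bits-per-channel framebuffer through OpenGL; the driver is free to refuse, and per the PDFs above, consumer GeForce/Radeon drivers simply don't expose such a format:

Code:
# Ask the driver for a 30-bit (10 bpc) OpenGL framebuffer via GLFW.
import glfw

glfw.init()
glfw.window_hint(glfw.RED_BITS, 10)     # request 10 bits per channel...
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)    # ...in a 10-10-10-2 layout
window = glfw.create_window(640, 480, "30-bit test", None, None)
glfw.make_context_current(window)
# From here the app renders normally; if the format was granted, each
# channel has 1024 steps instead of 256. These hints are requests, not
# guarantees -- on unsupported hardware you silently get 8 bpc.
glfw.terminate()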
 
Zarathustra[H];1036514863 said:
Seeing that the Quadro 6000 is a lower-clocked GF100 with the same number of shaders as a GTX 470, I wonder if it's possible to flash a GTX 470/480 with the firmware from one of these and get free hardware-antialiased lines and 30-bit color...

You'd have less video RAM though...

Looks like it is...

http://www.techarp.com/showarticle.aspx?artno=539

Though it's a soft mod, not a firmware flash... This is pretty old though... I wonder if it still works...

I'd try it, but unfortunately I no longer have a GF100-based card to try it on. I have a GTX 460 kicking around that I am not using, but unfortunately there do not appear to be any GF104-based Quadro cards...
 
I got mine yesterday. I did notice a slight greenish tint to the left half.

I also really notice the input lag with this compared to my old 20" NEC WMGX2. Text is super sharp though, which I love. I also only notice the AG coating on white backgrounds.

But if really low input lag is vital for your needs, then this is not the monitor for you. The difference is night and day compared to my old NEC.

What do you have the wide mode set to? Setting mine to 1:1 helped out.
 
I got mine yesterday. I did notice a slight greenish tint to the left half.

I also really notice the input lag with this compared to my old 20" NEC WMGX2. Text is super sharp though, which I love. I also only notice the AG coating on white backgrounds.

But if really low input lag is vital for your needs, then this is not the monitor for you. The difference is night and day compared to my old NEC.

My tint is very slightly greyish. It appears as though the left portion of the screen is just a tad darker than the right side. This was also the case with TFTCentral's test panel: http://www.tftcentral.co.uk/reviews/content/dell_u3011.htm#uniformity
 
Come again?

1:1 helped your imperceptible input lag?

No, leaving it set to "Fill" or "16:9" and feeding it 1920x1200, UT2k4 felt a bit "off" (i.e., I was missing much, much more than usual). Forcing it to 1:1 and having my GPU do the scaling seemed to mitigate it somewhat.
 
Zarathustra[H];1036514763 said:
Hmm... I do see banding...

The part I don't get is that if I use the eyedropper tool to pick the colors at the far edges of the sample image and then create my own gradient using the gradient tool, the gradient is perfectly smooth...

So why would this particular image show banding, if I can create a smooth gradient between the same colors?

My guess is that if you had Photoshop set up to take advantage of 30-bit color and did a gradient, then it would have banding unless you had a video card that could handle the tonality of the gradient.

At any rate, besides some possible banding issues, not very many people need 30-bit color anyway... even in Photoshop. It might be nice, but I've noticed that it's pretty hard even to get banding when I'm post-processing. I suppose it would allow me to push my effects further, but it also introduces the problem of having to check your photo both in and out of Photoshop if you intend to share it on the web with the 99.99995% of the world that doesn't have 30-bit color capability.
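
As a footnote on the banding question above, the arithmetic is simple to sketch (the ramp endpoints here are made up): a low-contrast gradient just doesn't contain many distinct 8-bit codes, so each code occupies a wide, visible band.

Code:
# How many distinct 8-bit codes does a narrow gradient contain?
import numpy as np

width = 1024                                   # gradient width in pixels
start, end = 60, 90                            # a dark, low-contrast ramp
ramp = np.round(np.linspace(start, end, width)).astype(np.uint8)

steps = len(np.unique(ramp))
print(steps, "distinct codes; bands ~", width // steps, "px wide")  # 31, ~33
# At 10 bpc the same ramp would have four times the codes; dithering
# (which some gradient generators apply) can hide the steps at 8 bpc too.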
 
I can no longer locate the website that describes the sort of CRT technology my monitor used. But it stated something like the graphics series monitor was above TN and MVS or something like that.

The point being that I come from a display with very little input lag; CRTs have essentially none. You're just trying to discredit me because I cannot locate the exact information on a monitor that's a decade old.

Maybe I should have left off the last sentence, as my post wasn't really meant to discredit you but to understand what you were referring to when you mentioned TN panels and CRTs in the same sentence. Feel free to educate me; like I said, I'm no expert on the subject, and the more you know, the better off you are.
 
Thanks, Raven. I appreciate your statement and I agree with your ending.


All I know is what I perceive going from the professional CRT monitor I was using to the Dell LCD that I own now. And I'd like to stress that the difference isn't a deal breaker for me. I just want to point out that there is a perceptible difference.
 
My tint is very slightly greyish. It appears as though the left portion of the screen is just a tad darker than the right side. This was also the case with TFTCentral's test panel: http://www.tftcentral.co.uk/reviews/content/dell_u3011.htm#uniformity

I can confirm that all 3 U3011s I've tested exhibit the extra brightness on the right side. This seems to be a manufacturing flaw present in all U3011s... debating whether or not to return mine for a refund.
 
I can confirm that all 3 U3011s I've tested exhibit the extra brightness on the right side. This seems to be a manufacturing flaw present in all U3011s... debating whether or not to return mine for a refund.

In that case, I think I may have gotten lucky, because mine is not very noticeable, and I'm quite picky. If I stare at it long enough, it seems to go away. I think I can get used to it, as this is still a spectacular monitor.

Some will say I "should've purchased the ZR30w," but there's also no guarantee that the HP would have been without issues.
 
Some will say I "should've purchased the ZR30w," but there's also no guarantee that the HP would have been without issues.

They all use LG IPS panels, and uniformity issues plague them. If you have one that is reasonably good, that is likely as good as you will get.
 
In that case, I think I may have gotten lucky, because mine is not very noticeable, and I'm quite picky. If I stare at it long enough, it seems to go away. I think I can get used to it, as this is still a spectacular monitor.

Some will say I "should've purchased the ZR30w," but there's also no guarantee that the HP would have been without issues.

Same here.


I tried putting up an all-grey background as described in this thread, and I can barely tell. It's at the level of me not being sure whether I am seeing a difference between the left and right sides or imagining it.

During regular use it is completely indistinguishable.
 