Dell U3011 coming soon.

For the record, my recent monitor history is as follows:

2001-2005: Iiyama 22" CRT based on a Mitsubishi Diamondtron flat-screen tube. It was the one with the two thin lines barely visible across the screen, and it was a great monitor. I still miss it. Cost me ~$1000 at the time.

2005-2010: Dell 2405FPW 24" LCD, 1920x1200. Bought because it used the same panel as Apple used in their 24" widescreen display, and the Apple display was very well regarded at the time. It has served me well, and I still use it occasionally. I am going to be using it for a gaming rig I am building with/for my stepson. This one cost me just over $1000 at the time (if memory serves).

Now: Dell U3011. ~$1200 with all taxes and everything included.

I guess ~$1000 every 5 years constitutes my monitor-buying habit. At least that's been the case for the last 10 years or so. Prior to the Iiyama above, I believe I used some low-end 17" CRT that would change colors when the VGA cord was wiggled :p The VGA cable was hardwired so I couldn't replace it...

It's been too long since my CRT died, so I can't remember how the U3011 compares to it, but coming from the 2405FPW, it's all positive all around, except for the slight sheen in the outermost corners of the screen during dark scenes. I have been told this is unavoidable with IPS panels.

I wonder if there are any third-party films that can be applied to limit its appearance?
 
They are all using LG IPS panels, and uniformity issues plague them. If you have one that is reasonably good, that is likely as good as you will get.

Exactly. That, and I didn't want to go through weeks of having replacements shipped and shipping them back; it can turn into an endless cycle. The non-uniformity of this panel is almost imperceptible, and I can live with that.
 
That isn't what I called you on. This is: ...
LOL, OK nitpicker. If you can only read what I write and not what I mean, my follow-up message explained clearly what I meant. If you are not interested in that, go play some shoot-em-up games :p
Thank God no one is correcting your statements, otherwise we'd have to put up with annoying bold green all over the place.

Regarding cards supporting 10-bit: the consumer and workstation cards have identical hardware, with the exception of some additional RAM chips on some of the higher-end workstation cards. What the permanent hardware switch disables are certain workstation software rendering functions, but 10-bit-per-channel display output is not part of this.

The cards listed in the left sidebar on this page all have 10-bit-per-channel display capability. It is not listed directly in the specifications, but if you check this link and search for "10-bit-per-color" you will find it. They hype this feature much more with the pro cards to justify the rip-off.
For Nvidia gaming cards I can't find where they tucked this information away, and I'm not even familiar with their gaming cards, as I've used only Quadros from them. A Google search finds this page; there is some info about it.
 
Here is the Nvidia PDF on the requirements:
http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf
Here is the AMD PDF on requirements:
http://www.amd.com/us/Documents/48108-B_ATI_FirePro_Adobe_10-Bit_FAQ_R5_Final.pdf

Both say they require professional workstation cards (Quadro or FirePro).

If I see something to the contrary from a reliable source (Wikipedia and misinterpretations of AMD's video processing engine don't count) or person, I will accept that this requirement has changed, but I see no evidence so far.
 
I have the ATI HD 5870 and 10-bit is working perfectly with Photoshop, just like on the high-end Quadro that I also have. I'm sure that all ATI cards from this link work the same with 10-bit output. I cannot confirm whether the latest gaming cards from Nvidia output 10-bit because I don't have such cards, but if they still don't, it is a matter of finalizing the driver to allow software to access it; the hardware has been ready for quite some time, and for those who already have the cards it's a matter of a driver upgrade.

Current video card processors calculate color at 32 bits per channel; on the bus and all the way through the DisplayPort cable, the data arrives at 16 bits per color channel. The rest of the chain is the video card driver, the software, and monitor support.
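
As a rough illustration of what each extra bit buys along that chain, here is a minimal sketch (plain Python, nothing vendor-specific) comparing levels per channel at the depths mentioned above:

```python
# Levels per channel and smallest gray step at common pipeline depths.
# Illustrates why a 10-bit path shows less banding than an 8-bit one:
# four times as many levels, so each quantization step is 4x smaller.
for bits in (8, 10, 16, 32):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)  # step size on a normalized 0..1 ramp
    print(f"{bits:2d}-bit: {levels:>13,} levels/channel, step = {step:.2e}")
```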
 
Many of the posts I've read indicated that DVI wasn't capable of 30+ bit color, and that DisplayPort would be necessary.

I don't know this to be a fact, just that I've read it.

I've also read that *ALL* recent AMD/ATI and NVIDIA cards are capable of 30+ bit deep color, but only the workstation cards actually have drivers that allow output at that depth (which is therefore a big reason to own such a card).
 
The ATI drivers already support 10-bit for the consumer cards. I know that Nvidia has been working on a driver for their consumer cards, but I'm not sure if it has been released. The last time I read about that was in one of the Adobe forums, where an Adobe employee gave that information. I don't have the link, but anyone interested can call or contact Nvidia and ask.
 
The ATI drivers already support 10-bit for the consumer cards.
Well, perhaps the consumer drivers have code in there, but it's not functioning with nearly all consumer cards. It seems ATI only supports 10-bit per channel on the 5870, and not on any of the lower-end cards, even other Evergreen-class cards.
In fact, you are the first to report finding a consumer card on which you've been able to actually verify 30-bit output. Congrats! Others have only quoted specs. I assume you used the banding test pattern, ramp.psd? Does it work via DisplayPort or DVI? Does the driver give you the bit-depth config option ("enable 10-bit pixel format support")?
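
For anyone who doesn't have ramp.psd handy, a roughly equivalent banding test pattern can be generated as a 16-bit grayscale PNG. This is a minimal sketch assuming numpy and Pillow; the file name and the luminance range are my choices, not anything standard:

```python
# Generate a shallow grayscale ramp as a 16-bit PNG banding test.
# Over this narrow luminance range an 8-bit pipeline has only ~26
# distinct levels across 2560 px (bands ~100 px wide, easy to spot),
# while a 10-bit pipeline has ~102 levels (much harder to see).
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 2560, 400
lo, hi = 0.25, 0.35  # narrow range makes individual steps visible

ramp = np.linspace(lo, hi, WIDTH)
row = np.round(ramp * 65535).astype(np.uint16)
img = np.tile(row, (HEIGHT, 1))

Image.fromarray(img, mode="I;16").save("ramp16.png")
print("Wrote ramp16.png - view it in a 30-bit aware application")
```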
 
I can confirm that all 3 U3011s I've tested exhibit the extra brightness on the right side. This seems to be a manufacturing flaw present in all U3011s... I'm debating whether or not to return mine for a refund.

Ah, well it's good to know that it's normal for this monitor. It is rather annoying, though. Again, it's only noticeable on all-white backgrounds, but it is noticeable. If I spent most of my time doing spreadsheets it would drive me nuts. But I won't notice it for internet, gaming, movies and such, so I guess I can live with it.
 
Some data on the input lag:

Using the Online Monitor Test at http://tft.vanity.dk/, I've found evidence that having 1:1 pixel mapping may substantially reduce input lag. I'm still trying to work out exactly what is going on, but here's what I've found so far:

My laptop display is about 15 ms slower than my CRT. (There may be scaling going on here, so I'm not 100% sure about this result.)
My BenQ 241w flat panel is about 3 ms slower than my laptop display with 1:1 pixel mapping.
My BenQ 241w flat panel is about 32 ms slower than my laptop display when scaling.
My Dell u3011 is about 30 ms slower than my laptop display when scaling.

So at least with my BenQ I found a drop from 47 ms lag to 18 ms lag just by not scaling, so I suspect the same may be true of the u3011. That's 3 frames of lag down to 1 frame, which is pretty significant. I don't know if it is the GPU scaler or the monitor scaler that is introducing the additional lag, though.
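
To sanity-check that ms-to-frames math, here's a quick sketch (assuming a 60 Hz refresh rate, and folding in the 15 ms laptop-vs-CRT offset from the list above):

```python
# Convert measured input lag to frames at a given refresh rate.
def lag_in_frames(lag_ms, refresh_hz=60.0):
    frame_ms = 1000.0 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_ms

# Figures vs the CRT: laptop offset (15 ms) + panel offset vs laptop.
for label, ms in [("BenQ 241w, 1:1", 15 + 3),
                  ("BenQ 241w, scaled", 15 + 32),
                  ("Dell U3011, scaled", 15 + 30)]:
    print(f"{label}: {ms} ms = {lag_in_frames(ms):.1f} frames @ 60 Hz")
```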

I haven't found a good way to test the u3011 with no scaling because I either need another display (with a known lag) capable of doing 2560x1600 to compare with or I need to find a way to get Windows 7 to duplicate only a region of the larger display with no scaling on the laptop comparison screen. Does anyone know how to do this?
 
I don't know what the requirements for doing so are, but I managed to clone two displays using different resolutions. With this I tested the 3008WFP and concluded that there was no difference whether the monitor was scaling or not.
 
Some data on the input lag:

Using the Online Monitor Test at http://tft.vanity.dk/, I've found evidence that having 1:1 pixel mapping may substantially reduce input lag. I'm still trying to work out exactly what is going on, but here's what I've found so far:

My laptop display is about 15 ms slower than my CRT. (There may be scaling going on here, so I'm not 100% sure about this result.)
My BenQ 241w flat panel is about 3 ms slower than my laptop display with 1:1 pixel mapping.
My BenQ 241w flat panel is about 32 ms slower than my laptop display when scaling.
My Dell u3011 is about 30 ms slower than my laptop display when scaling.

So at least with my BenQ I found a drop from 47 ms lag to 18 ms lag just by not scaling, so I suspect the same may be true of the u3011. That's 3 frames of lag down to 1 frame, which is pretty significant. I don't know if it is the GPU scaler or the monitor scaler that is introducing the additional lag, though.

I haven't found a good way to test the u3011 with no scaling because I either need another display (with a known lag) capable of doing 2560x1600 to compare with or I need to find a way to get Windows 7 to duplicate only a region of the larger display with no scaling on the laptop comparison screen. Does anyone know how to do this?

I don't have a CRT anymore, so I can't do any side-by-side comparisons, but what I do like about this monitor, if the same holds up for it, is that it remembers settings on a per-input basis.

So I can set it to always scale to 16:9 from the HDMI1 input and always map 1:1 from the DVI1 input...
 
I'm looking to go all out on my next monitor. I want the picture/color to be really good. I'm primarily going to be gaming/watching movies on it. Is this going to be good for me?
 
Ah, well it's good to know that it's normal for this monitor. It is rather annoying though. Again, it's only noticeable on all white backgrounds, but it is noticeable. If I spent most of my time doing spreadsheets it would drive me nuts. But I won't notice it for internet, gaming, movies and such, so I guess I can live with it.

Just adding, mine looks great in this regard. Excel and the like look very uniform from edge to edge, and I'm super picky. Coming from my two 2707s I see the sparkle on white from the anti-glare coating (?), for instance. Is it a burn-in thing, as I've only had the U3011 and U2711 for a day?
 
I'm looking to go all out on my next monitor. I want the picture/color to be really good. I'm primarily going to be gaming/watching movies on it. Is this going to be good for me?

For movies and gaming I'd go with a 16:9 27" at 1920x1080. Gaming with my ATI 5970 at 2560x1600 has been disappointing. My old 2707s had a nice sweet spot with a large screen and a good native res for games. Now I'm stuck deciding whether I want to turn down effects or run at a lower res. With the 2707s at 1920x1200 I was able to run everything at native res with effects maxed, and I'd been happy with it thus far.
 
Just adding, mine looks great in this regard. Excel and the like look very uniform from edge to edge, and I'm super picky. Coming from my two 2707s I see the sparkle on white from the anti-glare coating (?), for instance. Is it a burn-in thing, as I've only had the U3011 and U2711 for a day?

Did you get revision A00? Maybe you are very lucky in this regard.
 
For movies and gaming I'd go with a 16:9 27" at 1920x1080. Gaming with my ATI 5970 at 2560x1600 has been disappointing. My old 2707s had a nice sweet spot with a large screen and a good native res for games. Now I'm stuck deciding whether I want to turn down effects or run at a lower res. With the 2707s at 1920x1200 I was able to run everything at native res with effects maxed, and I'd been happy with it thus far.

The other option, of course, is to beef up your video card. You really need the best of the best to play seamlessly at 2560x1600 with max settings in today's games. My single GTX 480 definitely feels the burn. I may consider SLI GTX 580 or 595, or Crossfire 6970 or 6990, in the near future.

Also worth considering: many games today support windowed mode. Lower your game resolution without lowering your monitor's native resolution.
 
FWIW, I am also very fussy about picture quality. I have an A00 panel and it has great uniformity. The blacks do gray out at the corners, but this is due to the contrast degrading in the diagonal viewing angle direction much faster than vertically or horizontally. The panel is just too huge to avoid viewing angle problems at the corners when viewed from close up (2-3 feet). However, the blacks in the corners of the U3011 are about the same as the blacks in the center of the BenQ 241w and Dell 2001fp that I have, and the blacks in the center of the U3011 are excellent... far better than any panel I have ever owned. I have not noticed the contrast drop in the corners in normal use, only when putting up a black test image.

Overall I'm very pleased with the purchase except for the broken Custom RGB mode. I have yet to try and contact Dell about it.
 
Overall I'm very pleased with the purchase except for the broken Custom RGB mode. I have yet to try and contact Dell about it.

I'm not sure if contacting Dell will solve the problem. Best bet is to use sRGB mode.
 
Update on the quality of the 3 U3011s I've been through:

- custom color mode is broken: 2 out of 3 (yes, I just got one that works right, surprise)
- whites on the right side look visibly warmer than on the left side: 2 out of 3
- color is in some way not uniform from one side to the other: 3 out of 3, but to varying degrees (I hear this happens on Apple 30-inchers too)
- whites overall look a little on the yellow side before calibration: 3 out of 3
- dead pixel: 1 out of 3
- menu button sometimes flickers by itself even when nothing is near it: 1 out of 3
- noticeable lag or ghosting: 0 out of 3
 
The other option, of course, is to beef up your video card. You really need the best of the best to play seamlessly at 2560x1600 with max settings in today's games. My single GTX 480 definitely feels the burn. I may consider SLI GTX 580 or 595, or Crossfire 6970 or 6990, in the near future.

Also worth considering: many games today support windowed mode. Lower your game resolution without lowering your monitor's native resolution.

Yes, that was the point. It's all up to each person how they want to go. For movies and games a 1920x1080 screen is the way to go. Movies (HD movies) will look better at the native res, and video cards will drive games at that resolution better and for much longer. If money is no object and you can upgrade your two-card SLI/Crossfire rig every year, go for a 2560x1440 or 2560x1600 screen.

I think windowed mode takes a performance hit with most games, but that is an option, as is running at a lower, blurry (scaled by the monitor) resolution, or at a sharp one with big black borders.

If you don't need the extra resolution for work (as I do) or something, why torture yourself? For gaming and movies, get a 1920x1080 screen with a kick-ass refresh rate and low input lag.
 
If you don't need the extra resolution for work (as I do) or something, why torture yourself? For gaming and movies, get a 1920x1080 screen with a kick-ass refresh rate and low input lag.

I definitely disagree. Gaming on a 30" 2560x1600 is far superior to gaming on my last 24" Dell. As is watching movies... I'm not sure how you can make the argument that a smaller screen is better for movie watching... interesting. In any case, I feel this U3011 is a worthy upgrade from my prior 2407WFP Dell... I just wish the right side of the monitor wasn't noticeably brighter and washed out on blacks.
 
I'm not sure how you can make the argument that a smaller screen is better for movie watching... interesting.

It's all in what you prefer. Certainly a larger screen results in a better viewing experience, but movies viewed on the U3011 do suffer from a slight softening due to the upscaling of the source material. When the number of pixels on your display device exactly matches your source material, there is a razor-sharpness to the image that just can't be achieved if there is any scaling involved. So a 30-inch or larger 1920x1080 television would produce an even sharper image than the U3011.
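
To put numbers on that softening: fitting a 1920x1080 frame onto the U3011's 2560x1600 panel while preserving the aspect ratio forces a non-integer scale factor, so every source pixel gets resampled. A quick sketch of the arithmetic:

```python
# Aspect-preserving fit of a 1920x1080 source onto a 2560x1600 panel.
src_w, src_h = 1920, 1080
dst_w, dst_h = 2560, 1600

scale = min(dst_w / src_w, dst_h / src_h)   # limited by width here
out_w, out_h = round(src_w * scale), round(src_h * scale)
bar = (dst_h - out_h) // 2                  # letterbox bar height

print(f"scale: {scale:.4f}x (non-integer, so the image is resampled)")
print(f"scaled frame: {out_w}x{out_h} with {bar} px bars top/bottom")
```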
 
It's all in what you prefer. Certainly a larger screen results in a better viewing experience, but movies viewed on the u3011 do suffer from a slight softening due to the upscaling of the source material. When the number of pixels on your display device matches your source material exactly there is a razor-sharpness to the image that just can't be achieved if there is any scaling involved. So, a 30 inch or larger 1920x1080 television would produce an even sharper image than the u3011.

The difference is negligible.
The image you see is never razor-sharp even when watched on a 1920x1080 display; there is no content available that is good enough for that statement to really make sense.
 
I definitely disagree. Gaming on a 30" 2560x1600 is far superior to gaming on my last 24" Dell. As is watching movies... I'm not sure how you can make the argument that a smaller screen is better for movie watching... interesting. In any case, I feel this U3011 is a worthy upgrade from my prior 2407WFP Dell... I just wish the right side of the monitor wasn't noticeably brighter and washed out on blacks.

I was talking about my 27" displays, for one, not 24" displays. Sitting at my desk 2-3' from the screen, movies or whatever look better at their native res. If you are going to sit back a ways, then I'd agree that the 30" would be better. My 60" Kuro plasma looks way better from the couch.

Seemore +1
Anything scaled will look softer and can add other side effects as the monitor or PC is processing the image again.

I stand 100% by my previous and complete statement pertaining to games and movies on a 30" LCD at 2560x1600 vs. a 27" LCD at 1920x1080.

I guess I'll just agree to disagree.
 
The difference is negligible.
The image you see is never razor-sharp even when watched on a 1920x1080 display; there is no content available that is good enough for that statement to really make sense.

What??? All my 1920x1080 HD movies on my hard drive look way better without being scaled up, but maybe that's just me. I'm sure some people like having ClearType turned on; I can't stand it. All the text looks blurry.
 
What??? All my 1920x1080 HD movies on my hard drive look way better without being scaled up, but maybe that's just me. I'm sure some people like having ClearType turned on; I can't stand it. All the text looks blurry.

And you know this how? (Most simple tests aren't that useful.)
(I can't stand ClearType either; it drives me mad.)
 
I prefer having the pixel density of a 27'' 1080p monitor.
If there were a 37'' 2560x1600 monitor, I would buy that.
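
That preference is easy to quantify. A quick sketch of the pixel-density math (the 37'' size is the hypothetical from the post above):

```python
# Pixel density (PPI) from resolution and diagonal size.
import math

def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

for name, w, h, d in [('27" 1920x1080', 1920, 1080, 27.0),
                      ('30" 2560x1600 (U3011)', 2560, 1600, 30.0),
                      ('37" 2560x1600 (hypothetical)', 2560, 1600, 37.0)]:
    print(f"{name}: {ppi(w, h, d):.1f} PPI")

# A 37" 2560x1600 panel would land at ~81.6 PPI, almost exactly the
# same density as a 27" 1080p screen, which explains the wish.
```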
 
The other option of course is to beef out your video. You really need best of the best to seamlessly play at 2560x1600 with max settings in today's games. My single GTX 480 definitely feels the burn. I may consider SLI GTX 580 or 595, or Crossfire 6970 or 6990 in the near future.

Also worth considering: many games today support windowed mode. Lower your game resolution without lowering your monitor's native resolution.

I have a single 580. It's good, but I would like more power in newer games. Surprisingly, my 580 performs about the same at stock speeds as my overclocked GTX 470 did. Unfortunately the GTX 580 does not overclock as well. Core clocks can go up to 900 stably, but it's held back by memory bandwidth, and the memory won't overclock at all on mine without becoming unstable.

I've been playing Stalker a lot lately, and the 580 at stock speed is fine for 2560x1600 gaming in all three of them (Shadow of Chernobyl, Clear Sky and Call of Pripyat). It is also enough to run Metro 2033 at the "Normal" quality level with AAA turned on at 2560x1600 at about 60 fps. I haven't had time to experiment with upping the settings any more yet. Since it's single-player, dropping the frame rate down to about 40 fps would probably be OK.

When I build my Sandy Bridge rig I will consider dual 580s, but right now I can't. My small-form-factor case (with the motherboard built in) does not have space for two double-slot SLI cards... I can SLI two single-slot cards, but that's not going to happen unless I water-cool and convert the 580s to single slot, so I'll have to wait for a new motherboard and case, and I'm not spending money on that until Sandy Bridge comes out...
 
Update on the quality of the 3 U3011s I've been through:

- custom color mode is broken: 2 out of 3 (yes, I just got one that works right, surprise)
- whites on the right side look visibly warmer than on the left side: 2 out of 3
- color is in some way not uniform from one side to the other: 3 out of 3, but to varying degrees (I hear this happens on Apple 30-inchers too)
- whites overall look a little on the yellow side before calibration: 3 out of 3
- dead pixel: 1 out of 3
- menu button sometimes flickers by itself even when nothing is near it: 1 out of 3
- noticeable lag or ghosting: 0 out of 3

Geez. I haven't opened my U3011 yet, and that post makes me afraid to, as my 3008WFP is plagued by none of those issues.

Oh well, I bought the 5-year warranty, so I guess if I keep the U3011 and it has any issues that I can't live with, I can just keep exchanging it while using my other 30-inchers. I can definitely see why people with just one monitor would want to forgo the hassle and live with an imperfect display. I'm eager to see how many of these issues will be resolved in future revisions of the U3011.
 
It's all in what you prefer. Certainly a larger screen results in a better viewing experience, but movies viewed on the U3011 do suffer from a slight softening due to the upscaling of the source material. When the number of pixels on your display device exactly matches your source material, there is a razor-sharpness to the image that just can't be achieved if there is any scaling involved. So a 30-inch or larger 1920x1080 television would produce an even sharper image than the U3011.

Well I don't know about that. Thems would be some mighty large pixels.

A 50" 720p television might produce a sharper image with 720p content than a 50" 1080p television running 720p content, but the pixels on the 720p display are larger and therefore blockier, which might negate the effects of the higher-res display scaling the image.

And you could always achieve the same sharpness on the U3011 by not having it scale the image to 2560x1600, and running the original resolution with black bars. Personally, I'd rather be able to use a high resolution in Windows and games, and have the choice of either running a movie at its original resolution with big black bars or putting up with a slight loss of sharpness when it's scaled, than use a large, low-resolution display.
 
That sucks! I got lucky on the second U3011; the first had light bleeding at the top center, not really noticeable except on a bright white background, but it was just annoying. The second U3011 exhibited no light bleeding (at least none noticeable to me) and its color uniformity is excellent. This is revision A00; I don't think there's a new revision yet. I once had a NEC MultiSync LCD3090WQXi-BK and went through three copies, but the light leakage/bleeding was far worse than on the U3011. So I gotta give it to Dell for doing a better job at QCing the U3011 for revision A00.
 
Update on the quality of the 3 U3011s I've been through:

- custom color mode is broken: 2 out of 3 (yes, I just got one that works right, surprise)
- whites on the right side look visibly warmer than on the left side: 2 out of 3
- color is in some way not uniform from one side to the other: 3 out of 3, but to varying degrees (I hear this happens on Apple 30-inchers too)
- whites overall look a little on the yellow side before calibration: 3 out of 3
- dead pixel: 1 out of 3
- menu button sometimes flickers by itself even when nothing is near it: 1 out of 3
- noticeable lag or ghosting: 0 out of 3

Holy crap, someone else is having the same issues as me. I posted about this on the Dell forums and the guy said it was because I was using a Mac as the input :rolleyes:

Anyways, so far I'm:

- whites on the right side look visibly warmer than on the left side: 2 out of 2
- menu button sometimes flickers by itself even when nothing is near it: 1 out of 2
- dead pixel: 0 out of 2
- noticeable lag or ghosting: 0 out of 2
- noticeable light bleeding: 2 out of 2

My first panel was from China, second from Mexico. Time for me to try panel number 3...
 
Calling all you NEC experts... Do you think the PA301 30" will have similar display challenges as the U3011, or does NEC do a number of things to make sure the display is "more" perfect?
 
Calling all you NEC experts... Do you think the PA301 30" will have similar display challenges as the U3011, or does NEC do a number of things to make sure the display is "more" perfect?

IMO, the same. Every single LG IPS that gets discussed here always comes up with the tinting and uniformity issues with its screen. The NEC 3090 had all these problems; the PA301 won't be any different.
 
- custom color mode is broken: 2 out of 3 (yes, I just got one that works right, surprise)

Custom color will break on that last one as well. :) I thought it worked on the last one I received too, but it broke after a while, just like the others. It doesn't really matter, though, as this monitor is only usable in Adobe RGB or sRGB as far as I'm concerned.

I've gotten used to the monitor now and I like it. Something must have happened to it, though, because as I was about to return it the annoying buzz went away, so I never returned this third one. The brightness must also have normalized itself or something, because now I don't have to constantly change it, and I like it at 70% whether I'm in front of the screen for 30 minutes or 10 hours.

The only thing that bothers me now is how long it takes to change between inputs.
 