Deep Color (30 bit) on U2711

ciobi

Has anybody tested how 30-bit color works on the U2711?

HDMI seems to max out at 1920x1080 or 1920x1200, which isn't acceptable for desktop work. (At least that's the impression I get from some posts.)

DVI doesn't do Deep Color.

That leaves DisplayPort as the only possible option for 30 bit color at 2560x1440. Well, does it work? If it does, with what graphics cards?
 
That leaves DisplayPort as the only possible option for 30 bit color at 2560x1440.
It should work via DisplayPort. But keep in mind that the panel is still an 8-bit panel with internal FRC, so there will be a 10-bit => 8-bit + FRC conversion. You also need support from the OS, the driver, and the application.
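
Just to illustrate what that conversion means - this is a simplified sketch, not the monitor's actual algorithm - FRC (frame rate control) shows one of the two nearest 8-bit levels each frame, so that the average over a few frames approximates the 10-bit value:

Code:
#include <algorithm>
#include <cstdint>

// Simplified illustration of FRC (temporal dithering), NOT the panel's
// real algorithm: across every 4 frames, show the next-higher 8-bit
// level on 'frac' of them, so the temporal average approximates the
// 10-bit input value divided by 4.
uint8_t frc_sample(uint16_t v10, unsigned frame)   // v10 in 0..1023
{
    int base = v10 >> 2;     // nearest lower 8-bit level (0..255)
    int frac = v10 & 0x3;    // leftover 2 bits: quarters of a level
    int out  = base + (((int)(frame & 0x3) < frac) ? 1 : 0);
    return (uint8_t)std::min(out, 255);   // clamp the v10 == 1023 edge case
}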

Best regards

Denis
 
It should work via DisplayPort. But keep in mind that the panel is still an 8-bit panel with internal FRC, so there will be a 10-bit => 8-bit + FRC conversion. You also need support from the OS, the driver, and the application.

True, but I'd like to see what 30-bit looks like, even with these restrictions.

However, I'm looking at the GT 240 specs (under "Features"), and the only context in which they mention Deep Color is "HDMI 1.3a". The same goes for the Radeon HD 5750, and for the U2711 itself. As far as I can tell, nVidia mentions 30-bit color and DisplayPort in the same sentence only when describing Quadro cards.

To me this suggests that DisplayPort quite likely only works with 24-bit color on the U2711 and on consumer (non-workstation) cards. So I'm now inclined to believe that consumer monitors (including the U2711) and consumer cards can only do 2560x1440 (or 2560x1600) with 24-bit color, over DVI and maybe DisplayPort, and that 30-bit color is only possible over HDMI, at a maximum resolution of 1920x1080. Please, somebody prove me wrong!
 
As I mentioned in your other thread about this subject, HDMI 1.3 optionally supports 30-bit color at 2560x1600.

The 5750 spec you linked to seems to say that it only supports Deep Color up to 1920x1200, if I read it correctly.

You need to be aware that you must also be using an application that supports and displays Deep Color. Which leaves me asking what you intend to use it for.
 
As I mentioned in your other thread about this subject, HDMI 1.3 optionally supports 30-bit color at 2560x1600.

The 5750 spec you linked to seems to say that it only supports Deep Color up to 1920x1200, if I read it correctly.

You need to be aware that you must also be using an application that supports and displays Deep Color. Which leaves me asking what you intend to use it for.

The other thread is about cards, while this one is about the U2711.

What I want to use Deep Color for is photo processing (starting from 12-bit RAW) and vector graphics.

My question in this thread is to what extent the U2711 supports Deep Color, assuming I have the right graphics card, OS, app, cable, and whatever else might be needed. The answer seems to be "only over HDMI, and only up to 1920x1080", but I was hoping that somebody who has the monitor could comment based on their experience.
 
Not purely, no. I believe it would up-convert internally, so for photo processing you would need a different kind of monitor... one that's natively 12-bit through and through.
 
Hi,
I've done a lot of research on 30-bit output recently for a thesis on medical imaging and got it to work on a Dell U2711 (using DisplayPort) at my workplace. When viewing artificial grayscale ramps or color gradients, it looks very smooth compared to 8-bit-per-channel output - you won't be able to discern any "borders" between adjacent colors any more. However, I couldn't find any benefit for real-world images. Even the slightest hint of image noise - which there likely is in any RAW capture from even the best DSLRs - will make the advantage of 30-bit color negligible.
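
For anyone who wants to try the same test: the kind of pattern I mean is just a full-screen black-to-white ramp drawn with smooth shading. A minimal sketch (legacy fixed-function OpenGL for brevity; it assumes a context with a 10-bit-per-channel pixel format is already current):

Code:
// Full-screen horizontal black-to-white ramp. At 24-bit output a
// 2560-pixel-wide ramp can only contain 256 distinct levels, so visible
// banding appears; on a true 30-bit path the steps become indiscernible.
glDisable(GL_DITHER);        // don't let the driver hide the banding
glShadeModel(GL_SMOOTH);     // interpolate colour across the quad
glBegin(GL_QUADS);
    glColor3f(0.0f, 0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);  // bottom left: black
    glColor3f(1.0f, 1.0f, 1.0f); glVertex2f( 1.0f, -1.0f);  // bottom right: white
    glColor3f(1.0f, 1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);  // top right: white
    glColor3f(0.0f, 0.0f, 0.0f); glVertex2f(-1.0f,  1.0f);  // top left: black
glEnd();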

Aside from that, you will need a workstation-class graphics adapter (like an AMD FirePro or Nvidia QuadroFX) for 30-bit output to work. There's a lot of marketing FUD and incorrect information floating around, but none of the current consumer graphics adapters is 30-bit capable over DisplayPort yet. I had no success with HDMI Deep Color on a Radeon 4870 board either.

Further information here:
- http://ati.amd.com/products/pdf/10-Bit.pdf (AMD)
- http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf (nVidia)
 
I've done a lot of research on 30-bit output recently for a thesis on medical imaging and got it to work on a Dell U2711 (using DisplayPort) at my workplace. When viewing artificial grayscale ramps or color gradients, it looks very smooth compared to 8-bit-per-channel output - you won't be able to discern any "borders" between adjacent colors any more. However, I couldn't find any benefit for real-world images. Even the slightest hint of image noise - which there likely is in any RAW capture from even the best DSLRs - will make the advantage of 30-bit color negligible.

Yes, there's probably not much reason for non-pro photographers to go with 30-bit, but I'd like to check it out anyway.

Aside from that, you will need a workstation-class graphics adapter (like an AMD FirePro or Nvidia QuadroFX) for 30-bit output to work. There's a lot of marketing FUD and incorrect information floating around, but none of the current consumer graphics adapters is 30-bit capable over DisplayPort yet.

Thanks! Although it's not what I wanted to hear, it's what I expected. I'm about to order the monitor. I'll try to get a clear answer from AMD first - maybe there's something you need to enable to get 30-bit. If there isn't, I'll probably go with nVidia.

I don't see what the big deal with 30-bit on DisplayPort is, since 30-bit HDMI seems to be widely supported.
 
Hi Nashbirne,

Do you know if 30-bit color is now supported in the GDI of Windows 7?
That would be big progress for implementing 10-bit grayscale (or 30-bit color) in digital imaging, as it wouldn't require PACS software companies to write vendor-specific code.
Did you use the NVAPI to implement 30-bit color?
Good luck with your thesis. There has been a lot of interest (and debate) in the industry around this subject for years.

For what it's worth, a reasonably priced option is the Nvidia Quadro FX 580, at less than $150, which supports 30-bit color on dual DisplayPort outputs.
 
Has anybody tested how 30-bit color works on the U2711? ...
I have a U2711 here on a trial period, and I don't mind running the test for you if you give me step-by-step instructions on how to do it and what hardware and software are required.
 
Are there 30-bit examples? (Screenshots?)

Unless your current monitor is capable of displaying 30-bit color, I'm not sure how you expect to be able to see 30-bit color in a screenshot. It would be like showing a screenshot of a color TV on an old black-and-white TV.
 
Hi Nashbirne,

Do you know if 30-bit color is now supported in the GDI of Windows 7?
That would be big progress for implementing 10-bit grayscale (or 30-bit color) in digital imaging, as it wouldn't require PACS software companies to write vendor-specific code.
Did you use the NVAPI to implement 30-bit color?
Good luck with your thesis. There has been a lot of interest (and debate) in the industry around this subject for years.

For what it's worth, a reasonably priced option is the Nvidia Quadro FX 580, at less than $150, which supports 30-bit color on dual DisplayPort outputs.
@uhermes
No, I don't believe 30-bit colour is natively supported in Windows 7. For now, you have to activate a special option in the ATI or NVIDIA driver control panel for 30-bit support, which will, as a side effect, disable desktop compositing (Aero), and it only works with OpenGL or Direct3D hardware-accelerated contexts (D3D requires exclusive fullscreen modes). Another hint that 30-bit colour is sort of a driver-level "hack" becomes obvious as soon as you try drawing on top of, say, an OpenGL context using GDI - it falls back to 24-bit display immediately. However, you won't need any vendor-specific code to make it work, just some specific OpenGL or D3D code to set a suitable pixel format. What you need to do is outlined in the two PDF files in my earlier post.
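
To give a rough idea of the OpenGL side, the pixel format part boils down to something like the following - a minimal sketch based on the two whitepapers linked above, with error handling omitted; it assumes a valid HDC and that wglChoosePixelFormatARB has already been obtained via wglGetProcAddress from a temporary dummy context:

Code:
// Request a 10-bit-per-channel ("30-bit") pixel format through
// WGL_ARB_pixel_format. Needs <windows.h>, <GL/gl.h>, <GL/wglext.h>.
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_RED_BITS_ARB,       10,
    WGL_GREEN_BITS_ARB,     10,
    WGL_BLUE_BITS_ARB,      10,
    WGL_ALPHA_BITS_ARB,     2,
    0                       // list terminator
};

int format = 0;
UINT numFormats = 0;
if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &numFormats)
        && numFormats >= 1) {
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    SetPixelFormat(hdc, format, &pfd);  // then create the GL context as usual
} else {
    // No 10 bpc format exposed (consumer card, or the 30-bit option is
    // disabled in the driver control panel) - fall back to 8 bpc.
}

On the Direct3D 9 side, the equivalent is requesting a D3DFMT_A2R10G10B10 back buffer, which (if I remember correctly) is only allowed in exclusive fullscreen mode - matching the restriction I mentioned above.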

Also, AMD offers an entry-level workstation board, the FirePro V3750, at a price comparable to Nvidia's Quadro FX 580. It supports 30-bit colour on a dual-DisplayPort setup as well.
 
Unless your current monitor is capable of displaying 30-bit color, I'm not sure how you expect to be able to see 30-bit color in a screenshot. It would be like showing a screenshot of a color TV on an old black-and-white TV.

An excellent analogy.
 
This thread on the Adobe forums may be of interest to you:
http://forums.adobe.com/thread/506853

To sum it up:

New ATI FirePro cards currently seem to be working with 30-bit in Photoshop CS5.

The most recent (Fermi-based) Nvidia Quadro cards only began exposing 30-bit in drivers last month. Supposedly, NVIDIA's OEMs are claiming that Fermi-based GeForce cards with DisplayPort should also support 30-bit, but NVIDIA has yet to expose this in drivers. People appear to be having trouble getting any NVIDIA card working with 30-bit in Photoshop CS5.

Adobe is still working on it with manufacturers, but it basically comes down to lack of driver support.
 