How does 30 bit color or deep color work?

I've been around here for a while, but I just registered to ask this question. I've searched around for the answer and found some, but not the ones I was looking for. Anyway, here's the question:

How does deep color, or 30 bit color work in Windows 7? I know it works, I just don't know how it's done in Windows 7.

Assuming I have all the things required for deep color, how is it turned on in Win7? Am I going to see another option under the Right Click>Screen Resolution>Advanced Settings>Monitor>Colors menu?

Like this: High Color (16 bit), True Color (24 bit), Deep Color (30 bit)?

I already know no game has it and no conventional application will benefit from it; I also don't really care whether my eyes will even see the difference.

I'm interested in deep color because I have a lot of pictures on my PC, most of them captured in RAW format, which I'm sure is above 24-bit color. Every once in a while I like to travel down "memory lane" and view these pictures in ACDSee, and I want to see them at their best.

If deep color works the way I hope it does on Windows 7, then I'm planning to buy a 30-bit monitor and everything else I need for it.
 
You need a workstation-class graphics adapter like an NVIDIA Quadro or ATI FireGL, plus compatible software, to get it to work.
 
I saw that thread earlier; it did answer some of my questions.

I'm quite willing to go through the effort to have better color on my PC.

I just want to view RAW images at their best. Perhaps watch a few movies too.
 
Don't know what you're talking about; I can see True Color (32-bit) on Windows 7 by default, without setting anything and with a crappy Dell TN panel.
[screenshot: truelg.jpg]
 
I saw that thread earlier; it did answer some of my questions.

I'm quite willing to go through the effort to have better color on my PC.

I just want to view RAW images at their best. Perhaps watch a few movies too.

Well, as no photo editing software yet supports 30-bit colour output, you wouldn't see any difference. Also, films are not encoded with any more than 24-bit colour.
 
Don't know what you're talking about; I can see True Color (32-bit) on Windows 7 by default, without setting anything and with a crappy Dell TN panel.
[screenshot: truelg.jpg]

This is the question I have. For those who have all the requirements for deep color, does an additional option show up on this menu, say "Deep Color (38 bit, or whatever)"?
 
This is the question I have. For those who have all the requirements for deep color, does an additional option show up on this menu, say "Deep Color (38 bit, or whatever)"?

From the thread I posted earlier:
No I don't believe 30 bit colour is natively supported on Windows 7. For now, you will have to activate a special option in the ATI or NVIDIA driver control panel for 30 bit support which will, as a side effect, disable desktop compositing (aero) and only works with either OpenGL or Direct3D hardware accelerated contexts (D3D requires exclusive fullscreen modes). Another hint at 30 bit colour being sort of a driver level "hack" will get obvious as soon as you try drawing on top of say, an OpenGL context, using GDI - it will fall back to 24 bit display immediately. However, you won't need any vendor specific code to make it work, just some specific OpenGL or D3D code to set a suitable pixel format. What you need to do is outlined in the two PDF files in my earlier post.
http://hardforum.com/showpost.php?p=1035671876&postcount=15
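
For reference, what those vendor PDFs boil down to is asking the driver for a 10-bits-per-channel pixel format yourself. Here's a minimal C++/Win32 sketch of that idea (mine, not copied from the PDFs), assuming you already have a window DC and a dummy GL context so wglGetProcAddress works, with error handling trimmed:
Code:
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   // WGL_ARB_pixel_format tokens

// Ask WGL for a 30-bit (10/10/10/2) pixel format and apply it to the DC.
bool setDeepColorPixelFormat(HDC hdc)
{
    // wglChoosePixelFormatARB has to be fetched at runtime.
    auto wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return false;

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,       10,   // 10 bits per colour channel...
        WGL_GREEN_BITS_ARB,     10,
        WGL_BLUE_BITS_ARB,      10,
        WGL_ALPHA_BITS_ARB,     2,    // ...plus 2 alpha bits = 32 bpp
        0
    };

    int  format = 0;
    UINT count  = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &format, &count) || count == 0)
        return false;   // this driver/display combo exposes no 30-bit format

    PIXELFORMATDESCRIPTOR pfd = {};          // still required by SetPixelFormat
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    return SetPixelFormat(hdc, format, &pfd) != FALSE;
}

If that succeeds you create your GL context on that DC as usual; if it fails you fall back to the ordinary 24-bit format.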
 
Well I'm a bit disappointed; now I feel I'm stuck with a color palette that's 15 years old. I guess for the next couple of years Deep Color will be an 'exotic' feature only for a few.
 
Yeah, I still don't understand why there isn't deep color support. Is it hard to implement, or is it just extremely low on the priority list? I mean, my TV supports deep color, why can't I use it?

Also, does this mean that Blu-rays look better on a standalone Blu-ray player, since those players have support for deep color?
 
Yes, and to think that with all the exponential advancement in CPU, GPU, and even storage technology getting all the attention, our display color palette has remained unchanged for over 15 years!

TV technology seems to be overtaking PC technology! For SHAME! PC technology is always supposed to be on top of everything, even game console technology.
 
Well I'm a bit disappointed; now I feel I'm stuck with a color palette that's 15 years old. I guess for the next couple of years Deep Color will be an 'exotic' feature only for a few.

Deep Color will probably never be a common feature. Most people don't actually have True Color, either. After all, most people are hooking their computers up to TN panels, which are only 6-bit per channel, not 8-bit. So if you hook your computer up to a TN you can kiss 16.7 million colors goodbye and say hello to 262,144 colors. And people don't seem to care.
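
Quick sanity check on those numbers - a panel with b bits per channel can show 2^(3b) colours, so:
Code:
// Colours displayable at a given bit depth per channel: 2^(3*bits).
#include <cstdio>
#include <initializer_list>

int main() {
    for (int bits : {6, 8, 10})
        std::printf("%2d-bit panel: %11llu colours\n", bits, 1ULL << (3 * bits));
    //  6-bit:        262,144  (typical TN)
    //  8-bit:     16,777,216  ("True Color")
    // 10-bit:  1,073,741,824  ("Deep Color")
}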
 
Yes, and to think that with all the exponential advancement in CPU, GPU, and even storage technology getting all the attention, our display color palette has remained unchanged for over 15 years!

Arguably it's gotten worse. A cheap CRT beats a cheap LCD by miles. People have gotten significantly worse displays and thought they were improvements.

TV technology seems to be overtaking PC technology! For SHAME! PC technology is always supposed to be on top of everything, even game console technology.

Because people are willing to actually buy a high quality TV, but aren't willing to buy a high quality monitor. Companies are just going where the money is.
 
Yeah, I still don't understand why there isn't deep color support. Is it hard to implement, or is it just extremely low on the priority list? I mean, my TV supports deep color, why can't I use it?

Also, does this mean that Blu-rays look better on a standalone Blu-ray player, since those players have support for deep color?

I think the effect of deep color is largely overestimated, especially on HDTVs, since their color gamut is supposed to match the BT.709 color space, which is standard gamut. Thus, adjacent color values are almost indistinguishable from each other even on a 24-bit palette.
TV manufacturers' claims of "deep color support" usually relate to input formats and internal processing precision, not to the native output bit depth of the panel. Many midrange to high-end computer displays use internal LUT sizes of 10 or 12 bits per channel as well, so there is no disadvantage compared to TVs. As a bottom line, I don't think Blu-rays will look better on a standalone Blu-ray player just because it has deep color support. Still, I'd like to see built-in support for 30-bit color in future operating systems.
 
Yeah, I still don't understand why there isn't deep color support.
Because it won't make a lot of difference?

Also, does this mean that Blu-rays look better on a standalone Blu-ray player, since those players have support for deep color?
No. Blu-ray video has 16,777,216 possible colours - 24-bit - so why would being able to show more make any difference, when you would be limited by the source?

Well I'm a bit disappointed; now I feel I'm stuck with a color palette that's 15 years old. I guess for the next couple of years Deep Color will be an 'exotic' feature only for a few.
Well, as 24-bit colour is generally good enough for most purposes, why do we need to be able to display any more colours?
 
Deep Color will probably never be a common feature. Most people don't actually have True Color, either. After all, most people are hooking their computers up to TN panels, which are only 6-bit per channel, not 8-bit. So if you hook your computer up to a TN you can kiss 16.7 million colors goodbye and say hello to 262,144 colors. And people don't seem to care.

Thanks for the heads up. I'll start looking for a great-quality 2560x1600 IPS monitor instead; I'll check out some other threads to see what's good. Originally I was planning to get a 30-bit monitor, but now I think I'm having a change of mind.
 
The commonly accepted estimate of how many colors the human eye can see is 10 million. This is probably why there is a low priority on monitors that display more than 16.7 million colors. As Nashbirne comments in the previously linked thread, the difference between 16.7 million colors and 1 billion colors only shows up in "artificial grayscale ramps or color gradients", not in real world images.
 
Looking at your signature, you have an 8800 GTX. Those can be softmodded into a Quadro, can't they? I don't remember which Quadro it becomes, though, or whether that model has 30-bit color. It would save lots of money, and then all you'd need would be an HP DreamColor screen for $1999 and to use the dual DVI.

It makes me wonder what the Quadro has that the GeForce cards don't that enables the 30-bit output.
 
After reading all the posts I'm just giving up on the idea of taking the deep color plunge. It's probably a fall I'll never be able to get up from.

I'll just use the money to buy a kick ass 30 inch monitor rather than a 30 bit monitor.

UPDATE: I ended up buying the HP ZR30w monitor. It turns out it supports 30 bit color!
 
The commonly accepted estimate of how many colors the human eye can see is 10 million.
One of - if not THE - main points of "30 bit colour" isn't merely to push the number of possible values up, but to expand on the sRGB colour gamut. Your eye might struggle to distinguish beyond a certain number of colours, but it sure as hell can EASILY see the difference between a crappy range of colours, like sRGB, and a much wider colour gamut given the right material.

Now, you don't need 30 bit for that, you just need a light source which enables a wider range of colours to be shown. But you DO need 30 bit if you want to do that AND maintain a full range 24 bit, 16.7 million colour, sRGB emulation from within the native wide gamut mode in colour managed applications.

As Nashbirne comments in the previously linked thread, the difference between 16.7 million colors and 1 billion colors only shows up in "artificial grayscale ramps or color gradients", not in real world images.
But the point of 30 bit usually isn't just to provide more values. The point is to help widen the colour gamut whilst enabling a full 16.7 million colours to be seen natively in an emulated sRGB mode. This way you can show both wider gamut and sRGB content on screen correctly at the same time.

Without 10 bits per channel support for this, that 16.7 million colour sRGB content must have millions of values dropped to display on the wide gamut screen correctly. However many values it does drop, I *do* think you can see a difference in some content when compared to a native sRGB 8 bit panel. Also, although gradient images might be a "theoretical" scenario which doesn't come up much, that doesn't mean it doesn't arise at all. For example, high resolution wide gamut images of the sky, especially at dawn or dusk, aren't -so- far off from those gradient images...
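
To put rough numbers on the "millions of values dropped" point, here's a toy C++ sketch that squeezes the 256 sRGB levels of one channel into part of a wide-gamut panel's range and counts how many stay distinct at 8 versus 10 bits. The 0.8 scale factor is made up purely for illustration; a real gamut mapping is per-primary and nothing like a simple scale:
Code:
// Illustrative only: emulating sRGB inside a wider-gamut panel means each
// sRGB channel only uses part of the panel's range (the 0.8 below is a
// hypothetical ratio, not a real gamut mapping).  Count how many distinct
// panel codes the 256 sRGB input levels survive as.
#include <cstdio>
#include <cmath>
#include <set>

static int survivingLevels(int panelBits, double gamutScale) {
    const int maxCode = (1 << panelBits) - 1;
    std::set<int> codes;
    for (int in = 0; in < 256; ++in)
        codes.insert((int)std::lround(in / 255.0 * gamutScale * maxCode));
    return (int)codes.size();
}

int main() {
    std::printf(" 8-bit panel: %d of 256 sRGB levels stay distinct\n", survivingLevels(8, 0.8));
    std::printf("10-bit panel: %d of 256 sRGB levels stay distinct\n", survivingLevels(10, 0.8));
    // With 8 bits several sRGB levels collapse onto the same panel code;
    // with 10 bits all 256 remain distinct.
}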

However, there are two problems for 30 bit colour right now. Firstly, the hardware side (graphics cards etc.) and especially the software side simply haven't kept up.

Secondly, there's the "good enough" factor. The main reason why it'll be a long while before 30 bit colour takes off (if ever) is the same reason Blu-ray has been slow to take off. DVD is simply "good enough" for most people, and they only notice the resolution difference when HDTV screens get large enough to reveal it (50 inches+ etc.).

Likewise, 8 bit panels are perfectly adequate for showing sRGB content. Why would you need anything more? Well, you don't - because there's hardly any content out there right now which goes beyond that! However this doesn't mean your eye can't see way beyond the sRGB colour spectrum, and 30 bit colour on a native 10 bit panel simply allows you to go in this direction whilst also retaining the ability to stay in that mode AND emulate the full 16.7 million sRGB values whilst doing so. Without those extra bits, if you want a wide gamut screen, you must say goodbye to several million values in order to show sRGB content. Although those extra bits can be emulated with dithering you might as well skip that step if the technology allows for it, which 10 bit native panels do :)

So 30 bit colour (on a 10 bit panel) is NOT simply about providing 1024 values for red, green and blue, instead of 256. It's about expanding the potential gamut of a screen whilst retaining its ability to show all 16.7 million sRGB colours within that wide gamut mode.
 
and 30 bit colour on a native 10 bit panel
This is the important limitation. You won't see a difference between an 8-bit and a 10-bit panel while driving the display at 8 bits per channel. In both cases an FRC dithering stage will "try" to preserve the tonal values that weren't lost earlier thanks to high-bit processing (I'm speaking of displays in the upper segment with a LUT >8 bit). I think there was some astonishment in a thread here about why the HP LP2480zx (one of the few displays with a 10-bit panel) showed spatial dithering artifacts even in an 8-bit workflow. But generally this works quite well. The only disadvantage of a good implementation is slight temporal noise in dark tonal values.

A real 10-bit panel is a nice thing. But as you have said: one has to input a 10-bit signal to gain the advantages and bypass the FRC. By the way, most displays that offer 10-bit input via DisplayPort are still using an 8-bit panel.
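
For anyone wondering what that FRC stage actually does, here's a rough illustrative sketch (not how any particular display implements it): a 10-bit target level is approximated on an 8-bit panel by alternating the two nearest 8-bit codes so that the temporal average comes out right.
Code:
// Approximate a 10-bit level on an 8-bit panel by flipping between the two
// nearest 8-bit codes over successive frames (temporal dithering / FRC).
#include <cstdio>

int main() {
    const int    target10 = 513;                        // desired 10-bit level (0..1023)
    const double ideal    = target10 / 1023.0 * 255.0;  // ideal 8-bit value, fractional
    const int    lo = (int)ideal, hi = lo + 1;          // the two nearest 8-bit codes
    const double hiShare  = ideal - lo;                 // fraction of frames showing 'hi'

    const int frames = 240;
    double err = 0.0, sum = 0.0;
    for (int f = 0; f < frames; ++f) {
        err += hiShare;
        int code = (err >= 1.0) ? (err -= 1.0, hi) : lo;   // pick lo or hi this frame
        sum += code;
    }
    std::printf("ideal level %.4f, temporal average over %d frames: %.4f\n",
                ideal, frames, sum / frames);
}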

Best regards

Denis
 
Which has a wider gamut, 30-bit RGB or Lab? I know that Lab is 16 bits per channel and shows imaginary colors.
And also, would all but the highest-end cameras have 30-bit color even in RAW, or just 24-bit or lower?
 
Which has a wider gamut, 30-bit RGB or Lab? I know that Lab is 16 bits per channel and shows imaginary colors.
And also, would all but the highest-end cameras have 30-bit color even in RAW, or just 24-bit or lower?

Colourspace and the number of colours are completely separate things. The colourspace defines the range of possible colours. Bit depth defines the number of colours used.

You can use Lab, sRGB, AdobeRGB, ProPhoto RGB, CMYK, etc. in 8-bit/16-bit/30-bit modes if you want to.

Current high-end medium format digital cameras' RAW files work out to 16 bits per channel - 48-bit for RGB. Current DSLR cameras have 12- or 14-bit RAW capability (36 or 42 bits for RGB), both higher than 30-bit RGB.
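
A tiny sketch of the colourspace-versus-bit-depth point: the same channel value (0.25 is just an arbitrary pick), in whatever colourspace you like, encoded at three different bit depths - only the number of quantisation steps changes:
Code:
// One channel value, three bit depths: the colourspace is unaffected,
// only the precision of the stored code changes.
#include <cstdio>
#include <cmath>
#include <initializer_list>

int main() {
    const double channel = 0.25;               // a channel value in [0,1]
    for (int bits : {8, 10, 16}) {
        const long maxCode = (1L << bits) - 1;
        const long code    = std::lround(channel * maxCode);
        std::printf("%2d-bit: code %5ld of %5ld  (decodes back to %.6f)\n",
                    bits, code, maxCode, (double)code / maxCode);
    }
}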
 
So my question was really just about the gamut of sRGB vs. Lab. I should have realised that, given how bit depth works in sound etc.
Which got me thinking about UHD.
I wonder, since UHD is reported to use 10 bits per channel and is meant to replace HD, whether by then we'll see cheap 30-bit screens (in 2020..2025!?), as there will be a popular drive towards it, like with HD features.
 
So my question was really just about the gamut of sRGB vs. Lab
Lab is a device-independent color space (like XYZ) that covers all visible colors. It is often used as a Profile Connection Space (for example, Photoshop uses Lab (D50) internally for colorimetric calculations). Together with the corresponding white point, a color sample can be unequivocally defined in Lab. Normally it should not be used as a working color space (like sRGB, AdobeRGB, ECI-RGB, ...).
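
As a concrete illustration of "device independent", here's a textbook sRGB -> XYZ -> Lab conversion of pure sRGB red in C++. It uses the D65 white point and skips the chromatic adaptation to D50 that a real PCS (such as Photoshop's internal Lab) would apply, so treat it as a sketch only:
Code:
// Convert pure sRGB red to CIE Lab: sRGB -> linear RGB -> XYZ (D65) -> Lab.
#include <cstdio>
#include <cmath>

static double srgbToLinear(double c) {           // sRGB decoding (gamma removal)
    return c <= 0.04045 ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
}

static double labF(double t) {                   // CIE Lab helper function
    const double d = 6.0 / 29.0;
    return t > d * d * d ? std::cbrt(t) : t / (3 * d * d) + 4.0 / 29.0;
}

int main() {
    // Pure red in 8-bit sRGB: (255, 0, 0).
    const double r = srgbToLinear(255 / 255.0);
    const double g = srgbToLinear(0 / 255.0);
    const double b = srgbToLinear(0 / 255.0);

    // Linear sRGB -> CIE XYZ (D65).
    const double X = 0.4124 * r + 0.3576 * g + 0.1805 * b;
    const double Y = 0.2126 * r + 0.7152 * g + 0.0722 * b;
    const double Z = 0.0193 * r + 0.1192 * g + 0.9505 * b;

    // XYZ -> Lab, normalised to the D65 white point.
    const double Xn = 0.95047, Yn = 1.0, Zn = 1.08883;
    const double fx = labF(X / Xn), fy = labF(Y / Yn), fz = labF(Z / Zn);
    const double L = 116 * fy - 16;
    const double A = 500 * (fx - fy);
    const double B = 200 * (fy - fz);

    std::printf("sRGB(255,0,0) -> Lab(%.1f, %.1f, %.1f)\n", L, A, B);
    // Roughly L* 53, a* 80, b* 67 -- the same Lab triple no matter which
    // device or working space the value came from.
}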

Best regards

Denis
 
Well, I'm using a Dell U2711, which is a wide gamut monitor, and I also have an sRGB CRT monitor connected in dual-display mode so I can make comparisons. When I paint an image in Photoshop with, let's say, pure red, green, etc. and preview it in sRGB mode, the color saturation of the image is pretty much the same on both monitors. But when I choose monitor preview, Photoshop displays much more saturated colors on the U2711 that are impossible to display on the sRGB monitor, and the difference is huge. The eye can clearly see the difference, and the wider gamut definitely allows much nicer imagery than the limited sRGB gamut. Also, as an artist, I can see colors in the real world that the limited sRGB display cannot show. Just because of this, I believe the future will be wide gamut, with the content probably coming sooner than expected.
 
I've written an article about how deep color / 10-bit color / 30-bit color works on a PC here.

NEC also offers a sample application that can show the effects in real-time if you have a 30-bit color display setup (e.g. MultiSync PA Series, DisplayPort connection and appropriate video card).

You can see this effect in anything that uses OpenGL, and the article also includes instructions on how to enable it in Photoshop.
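
If you want a quick self-test without the NEC demo: assuming a context created with a 30-bit pixel format (as in the WGL sketch earlier in the thread), drawing a full-screen grey ramp in plain legacy OpenGL is the classic way to check whether the banding disappears. A minimal sketch:
Code:
// Draw a full-screen horizontal grey ramp; on a working 30-bit setup the
// banding visible at 24 bits should smooth out.  Assumes the default
// (identity) projection, so coordinates are given directly in NDC.
#include <GL/gl.h>

void drawRamp()
{
    glBegin(GL_QUAD_STRIP);
    for (int i = 0; i <= 1024; ++i) {            // ~one step per 10-bit level
        const float v = i / 1024.0f;
        glColor3f(v, v, v);                      // grey level at full float precision
        glVertex2f(v * 2.0f - 1.0f, -1.0f);      // map [0,1] across the screen
        glVertex2f(v * 2.0f - 1.0f,  1.0f);
    }
    glEnd();
}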

Hope this helps.
 