LUT vs Maximum Display Colours vs Colour Depth

murkris118

Limp Gawd
Joined
Jun 5, 2012
Messages
467
Dear All,

I am new to this forum so I am starting my new thread to post my queries. Please bear with me if this topic has already been covered.

I am currently on the lookout for a monitor that suits my needs, and in my research I came across all these terms. Could anyone please enlighten me, with facts and figures, on how each of these affects display colour and quality? To be honest, I am more interested in natural yet vivid colours than in the actual sharpness the monitor can give me. So my questions are:

1.) What is the relation between wide gamut and maximum display colours, i.e. 16.7 million vs 1.07 billion? And does it have to be that only a wide gamut monitor can support 1.07 billion colours?

2.) What is the relation between a LUT (lookup table) and maximum display colours? Does a monitor with a 10-bit LUT display 1.07 billion colours even though it has a native 8-bit panel, assuming the entire pipeline is 10-bit, including the GPU and the application? If not, can I still take advantage of a professional-grade GPU with a 10-bit LUT monitor? And what conversions does the signal go through from source to destination when the monitor uses a 10-bit LUT?

3.) What is deep colour, and will it have any effect on a native 8-bit + 10-bit LUT monitor, or does it require a true 10-bit display? PS: I am using a PS3 to watch Blu-ray.

4.) In a nutshell, what is the maximum number of colours an 8-bit + 10-bit LUT panel can produce, taking the LUT into account?

Sorry guys, I am pretty sure this one needs a long explanation, but please bear with me and help me find the right answers to all these questions. Any help would be appreciated. :)

Thanks,
 
2.) What is the relation between a LUT (lookup table) and maximum display colours?
It's an indirect relation. A LUT that is more precise than the input signal (typically 12–16 bits per channel, or a 3D LUT coupled with an 8-bit panel; an example of such a workflow) allows – independently of the native color depth of the panel – for lossless transformations => no additional banding is introduced, and the desired characteristic (set via the OSD or via hardware calibration) can be achieved accurately. An FRC stage "rescues" all tonal values (though of course it doesn't interpolate "new" colors).
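To make the "no additional banding" point concrete, here is a toy Python sketch (the 10% gain reduction is an arbitrary, made-up calibration adjustment, not a figure from any real monitor): the same correction run through an 8-bit processing path merges input codes together, while a higher-precision internal LUT keeps all 256 of them distinct.

```python
# Toy model: apply a 10% RGB gain reduction (like an OSD/calibration
# tweak) to an 8-bit input signal, once through an 8-bit processing
# path and once through a 16-bit internal LUT.

GAIN = 0.9  # hypothetical calibration adjustment

# 8-bit path: the adjusted value is quantized straight back to 8 bits,
# so several input codes collapse onto the same output code (banding).
out_8bit = {round(i * GAIN) for i in range(256)}

# 16-bit internal LUT: the same adjustment lands on a much finer grid,
# so every one of the 256 input codes keeps a distinct value.
out_16bit = {round(i * GAIN * 257) for i in range(256)}  # 257 = 65535/255

print(len(out_8bit), "distinct codes survive the 8-bit path")    # 231
print(len(out_16bit), "distinct codes survive the 16-bit LUT")   # 256
```

In a real monitor the FRC/dithering stage would then render those finer values on the 8-bit panel, which is what Denis means by the FRC stage "rescuing" the tonal values.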

1.) What is the relation between Wide Gamut and maximum display colours
No relation.

a monitor having a 10-bit LUT can display 1.07 billion colours even though it has a native 8-bit panel, considering that the entire pipeline is 10-bit including GPU and application?
Yes, you can establish a 10-bit workflow even with a native 8-bit panel. Real 10-bit panels are still very rare; in times of panel-internal FRC stages and DisplayPort connections, manufacturers reach for the term quite quickly. However, the difference (10-bit input => appropriate electronics => 8-bit panel, versus a true 10-bit panel) is actually negligible for you as a user. Unfortunately, a 10-bit workflow is still a little "wobbly" and requires dedicated support from the application, video card and OS.

PS: I am using a PS3 to watch Blu-ray.
They don't even make full use of an 8-bit quantization.

What is Deep colour and
"Video deep colour" refers to a higher quantization of the video signal. No consumer medium is encoded that way to this day; they all use 8-bit YCbCr (=> video levels) 4:2:0.

Best regards

Denis
 
Color Gamut
The color gamut identifies the range of colors that a display can produce. The wider the gamut, the more intense or saturated the colors can be: the reddest red on a wide-gamut monitor will appear more red than on a standard-gamut monitor.

Displayable Colors / Deep Color
The maximum number of displayable colors comes from how many bits of color data are used. Displays with 10-bit inputs can support deep color, which is 10 or more bits of color data per channel. I wrote a brief article on this here: http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color.html

Note that for full deep color support you need a 10-bit capable display (see Denis' comment), a 10-bit connection, and a 10-bit video card / signal. This is not possible on MacOS today, and only with certain cards on PCs.

LUTs and calibration
The best way to think about a LUT is that it's a way for the display to adjust the viewable colors to match what you (or a colorimeter) expect them to be.

Say you have an 8-bit connection (e.g. DVI), which means you have 16.7 million colors available. Without a LUT, color adjustments have to be done in the video card. This means the video card will shift some colors to make the display look better (e.g. mapping a dark gray to black), and that can reduce the number of distinct colors sent to your display.

Say your calibration, without a monitor LUT, remaps just 8 color values on each of the red, green and blue (RGB) channels onto values that are already in use. Where you originally had 8 bits of color (256 values per channel), you now have 248. That reduces your displayable colors to 248 x 248 x 248, about 15.3 million.
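The arithmetic can be checked in a few lines of Python (the 8 remapped codes per channel are just the hypothetical example above, not a measured figure):

```python
# Redoing the numbers from the post above.
full_8bit = 256 ** 3          # all colours over an 8-bit connection
print(f"{full_8bit:,}")       # 16,777,216 -> the familiar "16.7 million"

reduced = (256 - 8) ** 3      # 248 usable values per channel after the
print(f"{reduced:,}")         # video-card remap: 15,252,992, i.e. ~15.3 million

full_10bit = 1024 ** 3        # for comparison, a true 10-bit pipeline
print(f"{full_10bit:,}")      # 1,073,741,824 -> the advertised "1.07 billion"
```

This is also where the two headline figures from the original question come from: 2^24 for an 8-bit pipeline, 2^30 for a 10-bit one.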

With a LUT, the monitor can make the same color adjustments without touching the input signal. If the bit depth of the LUT is larger than that of the input signal, the 256 input values can be mapped onto a much finer scale, allowing more accurate adjustment and generally no loss of detail. For example, our MultiSync PA Series has 14-bit LUTs: 16,384 entries onto which those 256 input values can be mapped when calibrating.

So, to answer your last question: your displayable colors are limited by the bit depth of your connection. Unless you have a true 10-bit source, your actual displayable colors with an 8-bit connection will only be 16.7 million.

Anyway, I hope that this helps.

-- Art
 
With a LUT, the monitor can make the color adjustments in the same way, without affecting the input signals.
-- Art

Hi, thanks for the answer. I never knew that there are monitors without a LUT.
I know that there are monitors with different LUTs, from 8-bit to 14-bit, but I never knew about monitors without a LUT.
Are you sure such monitors exist?
 
Hi, thanks for the answer. I never knew that there are monitors without a LUT.
I know that there are monitors with different LUTs, from 8-bit to 14-bit, but I never knew about monitors without a LUT.
Are you sure such monitors exist?

It's more typical that a monitor does not have a LUT. Typically only professional-grade monitors will have a LUT. Most displays rely on video card LUTs for calibration.

-- Art
 
It's more typical that a monitor does not have a LUT. Typically only professional-grade monitors will have a LUT. Most displays rely on video card LUTs for calibration.

-- Art

understood, but how can a monitor let you change RGB values if it doesn't have a LUT?
thanks.
 
Thank you... I have actually seen banding even on my friend's 10-bit LUT IPS panel when watching some videos and photos. Why is this posterization present, when they advertise that a monitor with a 10-bit LUT will show smooth gradients? Does it have something to do with bad encoding and heavy compression in the case of photos, or with the colour depth of the monitor?

By contrast, will a true 10-bit panel reduce banding? I really hate it when the transitions from dark to bright scenes in movies are not smooth. Will a 10-bit graphics card eliminate this thanks to the extra bit depth?
 
I never knew that there are monitors without a LUT.
Are you sure such monitors exist?
Yes and no. Even displays without a LUT (=> no OSD) have something we could call a "panel-internal LUT", implemented by the panel manufacturer to ensure the desired characteristic (gradation, grey balance, white point). The native characteristic of an LC panel is quite unusable.

understood, but how can a monitor let you change RGB values if it doesn't have a LUT?
These displays implement a "full featured" LUT.

Best regards

Denis
 
Yes and no. Even displays without a LUT (=> no OSD) have something we could call a "panel-internal LUT", implemented by the panel manufacturer to ensure the desired characteristic (gradation, grey balance, white point). The native characteristic of an LC panel is quite unusable.


These displays implement a "full featured" LUT.

Thanks Denis! You've corrected my mischaracterization. I should have said that most displays do not have a "programmable" LUT, one that can be adjusted after the display has left the factory.

-- Art
 
Thank you... I have actually seen banding even on my friend's 10-bit LUT IPS panel when watching some videos and photos. Why is this posterization present, when they advertise that a monitor with a 10-bit LUT will show smooth gradients? Does it have something to do with bad encoding and heavy compression in the case of photos, or with the colour depth of the monitor?

By contrast, will a true 10-bit panel reduce banding? I really hate it when the transitions from dark to bright scenes in movies are not smooth. Will a 10-bit graphics card eliminate this thanks to the extra bit depth?

There are many factors, including (and often) the source material, the software used to view videos / photos, the OS, the video card, etc.

Everything that touches video can introduce posterization.

-- Art
 
Yes and no. Even displays without a LUT (=> no OSD) have something we could call a "panel-internal LUT", implemented by the panel manufacturer to ensure the desired characteristic (gradation, grey balance, white point). The native characteristic of an LC panel is quite unusable.


These displays implement a "full featured" LUT.

Best regards

Denis

It's a pleasure to read you again, Denis :)
I have never had a monitor that can't adjust RGB values; even the cheap ones have RGB settings in the OSD, so I thought no monitor comes without them.
 
Units like the U2410 do feature a LUT provided by the controller (STDP6028 in this case) that can be adjusted using the OSD. The level of benefit it offers depends on what controls are offered, whether it is RGB gain and 'Contrast', or preset modes calculated by measurement at the time of manufacture.

What we commonly think of on this forum as a LUT though is a user programmable one via a colorimeter such as in the LCD2490WUXi.
 
Yes and no. Even displays without a LUT (=> no OSD) have something we could call a "panel-internal LUT", implemented by the panel manufacturer to ensure the desired characteristic (gradation, grey balance, white point).

Where is this physically implemented?
 
Units like the U2410 do feature a LUT provided by the controller (STDP6028 in this case) that can be adjusted using the OSD. The level of benefit it offers depends on what controls are offered, whether it is RGB gain and 'Contrast', or preset modes calculated by measurement at the time of manufacture.

What we commonly think of on this forum as a LUT though is a user programmable one via a colorimeter such as in the LCD2490WUXi.

So what is the difference between the hardware LUT in the LCD2490WUXi and the firmware LUT in the U2410?
 
So what is the difference between the hardware LUT in the LCD2490WUXi and the firmware LUT in the U2410?
The LUT of the Dell can only be accessed through its OSD, while the programmable LUT of the NEC can – in addition – be altered by compatible calibration software (SpectraView II, SpectraView Profiler).

Best regards

Denis
 
I have a few more questions regarding display colours when using Photoshop CS6. I recently played with some pictures I took on a trip to Seville, and I noticed that when I view them in CS6 the reds, blues and almost all the colours look a bit more saturated and vibrant. When I open the same image in Windows Photo Viewer side by side, I lose nearly 50% of the colour and detail. The working colour space in CS6 is set to Adobe RGB (1998), but the photo was taken in sRGB mode.

Does it have something to do with the 10-bit LUT, or is Photoshop upscaling the picture in some way? For your information, the panel is 8-bit and does not support Adobe RGB, and I do not use a 10-bit GPU either; it's an 8-bit + 10-bit LUT display running on Intel HD 3000. :) Please enlighten me.
 