Ever going to see higher than "32 Bit" Color?

USMC2Hard4U

Supreme [H]ardness
Joined
Apr 4, 2003
Messages
6,157
I don't know if it even makes a difference. I know I can tell the difference between 256 -> 16-bit -> 32-bit.

So the next logical step would be 64-bit color... any word? Any benefit to it? Will it ever happen?
 
64 bit color?

That would BE an evolution and probably would mean new monitors/graphics cards/overall systems, so... my guess is not for a while.

would be interesting though :)
 
That's actually a good question... while I'm home for the summer and I'm stuck running my parents' computer, I've been forced to remember what 16-bit color looks like in games (to play WC3 on a 5200... is a sad thing).

I wonder if there are any 'comparison' shots that could show the difference. I know that 16-bit to 32-bit just seems so much richer in terms of color detail, but I don't know if 64-bit would yield the same kind of wow factor.
 
64-bit color would give you 16-16-16-16.
Most noticeable would be the increased alpha depth, imo.
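For the curious, here's a rough back-of-the-envelope sketch (in Python) of how the counts work out, assuming "32-bit" means 8-8-8-8 RGBA and "64-bit" means 16-16-16-16 RGBA:

```python
# Rough sketch: how many distinct values each layout gives, assuming
# "32-bit" = 8-8-8-8 RGBA and "64-bit" = 16-16-16-16 RGBA.
def color_counts(bits_per_channel):
    per_channel = 2 ** bits_per_channel   # levels per R/G/B/A channel
    rgb_total = per_channel ** 3          # distinct colors ignoring alpha
    rgba_total = per_channel ** 4         # including the alpha channel
    return per_channel, rgb_total, rgba_total

for bpc in (8, 16):
    levels, rgb, rgba = color_counts(bpc)
    print(f"{bpc} bits/channel: {levels:,} levels, "
          f"{rgb:,} RGB colors, {rgba:,} with alpha")

# 8 bits/channel:  256 levels, 16,777,216 RGB colors, ~4.3 billion with alpha
# 16 bits/channel: 65,536 levels, 281,474,976,710,656 RGB colors, ~1.8e19 with alpha
```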
 
For desktop use? Probably not.

However, it comes in handy when you want to resample, like in games and image editing, I believe.

Games already use more than 32-bit, in a way. That's what HDR is.

Take this with a bucket of salt, but I think that's pretty much correct.
 
Some games are indeed already using more than 32-bit colour, and that is what HDR essentially is (e.g. true HDR works with 4x16-bit floating-point channels, which is 64-bit colour, basically). It just happens that HDR typically gets misconstrued as tone mapping, bloom, glare, and so on, which are just effects that benefit enormously from using the higher-precision data that HDR delivers.

For desktop usage and most apps, though, keep in mind that it's effectively only 24-bit colour (8 bits each for the red, green, and blue channels, plus an 8-bit alpha channel which is basically never used). I don't know about NV's cards, but on ATi, one benefit they have for 2D processing is 30-bit colour, where each channel (R, G, and B) gets 10 bits apiece and only 2 bits are used for the alpha. This gives significantly higher precision*, and in most cases there's almost no real loss since the alpha channel generally goes unused. Some 3D games can also take advantage of this 10/10/10/2 format for their graphics if they support it and don't have much use for the alpha channel, which allows for a nice image-quality boost (not sure what the performance tradeoff is, though).


*24-bits can do 16 million colours, 30-bits can do about a billion.
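To make the packing concrete, here's a minimal sketch of how both an 8/8/8/8 and a 10/10/10/2 pixel fit into the same 32-bit word. The channel ordering (R in the low bits) is just an illustrative assumption; real formats define their own layouts:

```python
# Sketch: packing RGBA into a 32-bit word two ways.
# Channel ordering (R in the low bits) is purely for illustration.

def pack_8888(r, g, b, a):
    # 8 bits per channel: each value 0..255
    return (a << 24) | (b << 16) | (g << 8) | r

def pack_1010102(r, g, b, a):
    # 10 bits per color channel (0..1023), only 2 bits of alpha (0..3)
    return (a << 30) | (b << 20) | (g << 10) | r

print(hex(pack_8888(255, 128, 0, 255)))    # 0xff0080ff
print(hex(pack_1010102(1023, 512, 0, 3)))  # 0xc00803ff
```

Either way it's still 32 bits per pixel; the 10/10/10/2 layout just spends the alpha bits on extra colour precision instead.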
 
There are actually a lot of processes already used in film, video, or internally in equipment (scanners, digital cameras, graphics cards, etc.) or games that produce higher than 32-bit color. 32-bit is simply 24-bit (8 bits per color - RGB) plus an extra 8 bits for an alpha channel, and even that is most commonly seen only in 3D. However, there are devices that work at 10 bits or even 12 bits per color - so that's ~40-bit and 48-bit color. There's even a newer format, ILM's "half", that uses 16-bit floating-point numbers per channel (64-bit) - current technology isn't really powerful enough to utilize it on a day-to-day basis yet, unfortunately.

The problem is, though... the human eye can't really see a difference beyond 32-bit-ish. But there are certain parts of the range that can be improved, and yes, in high dynamic range imaging (HDRI), floating-point numbers are used to describe values in excess of 'full' white and black. This allows an image to describe accurately the intensity of the sun and deep shadows in the same colour space. Various models are used to describe these ranges, many employing 32-bit accuracy per channel.
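To illustrate the floating-point part, here's a tiny sketch using numpy's float16 as a stand-in for ILM's 16-bit "half" type; the specific intensity values are made up for the example:

```python
import numpy as np

# Sketch: integer channels clamp at "full white", float channels don't.
# numpy's float16 stands in here for the 16-bit "half" type popularized
# by ILM's OpenEXR format. Intensities are made-up example values.

sun = np.float16(60.0)     # something ~60x brighter than paper white
paper = np.float16(1.0)
shade = np.float16(0.02)

# An 8-bit channel only holds 0..255 (0.0..1.0 after scaling), so the
# sun and the paper both clamp to 255 and become indistinguishable.
as_8bit = [min(int(v * 255), 255) for v in (sun, paper, shade)]
print(as_8bit)                     # [255, 255, 5] -- sun == paper, detail lost

# The half-float values keep their relative intensities intact.
print(sun / paper, paper / shade)  # ~60x and ~50x ratios survive
```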
 
I think the real question is not whether we can do 64-bit colour, but whether or not the human eye can actually see that many colours.

I read somewhere that the human eye is actually capable of only a little over 16-bit colour. Please take this with a grain of salt, as it was many years ago that I read this (back when we were all moving from 16 to 32-bit colour), so I cannot provide any references. The question is, could the human eye tell the difference between 32 and 64-bit colour? I read that the only reason we can see the difference between 16 and 32 is because everybody has a slightly different range of colours which we can see, so using 32-bit actually shows us more than we can see, making sure our entire visual spectrum is covered.

Again, I am not certain on this, as I read something to this effect long ago. But you still gotta wonder how much of a difference 64-bit would make ;)
 
18,446,744,073,709,551,616 colors, actually

or

281,474,976,710,656 if you don't include alpha
 
With 32 bit there are still more colors available than there are pixels on my screen, so I'm not holding my breath or looking forward :p

 
USMC2Hard4U said:
I don't know if it even makes a difference. I know I can tell the difference between 256 -> 16-bit -> 32-bit.

So the next logical step would be 64-bit color... any word? Any benefit to it? Will it ever happen?

What do you think HDR is?
 
for lighting yes

have you all been in a cave?

the requirement, at the least, for HDR is to use a 64-bit color surface format; the current highest quality uses a 128-bit color surface format
 
Brent_Justice said:
What do you think HDR is?
I figured it was just some new way of doing more light sources and shadows and mapping and stuff... not more colors...

But that's interesting :) That's why I asked
 
USMC2Hard4U said:
I figured it was just some new way of doing more light sources and shadows and mapping and stuff... not more colors...

But that's interesting :) That's why I asked

HDR is about representing a greater color range for lighting; it stores color values much greater than the 0.0f to 1.0f range traditionally used in games.

This is an awesome slide that shows what HDR is all about

http://enthusiast.hardocp.com/image.html?image=MTEyODI4MDE0MEFCVGlYSnBoRUNfNl8yX2wuanBn

Read the text I have quoted under it - http://enthusiast.hardocp.com/article.html?art=ODIyLDYsLGhuZXdz



Here are a couple of wiki pages that could help you as well:

http://en.wikipedia.org/wiki/High_dynamic_range_imaging

http://en.wikipedia.org/wiki/High_dynamic_range_rendering
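As a rough illustration of what happens to those out-of-range values before they hit the screen, here's a minimal tone-mapping sketch using the classic Reinhard curve; real games use fancier operators and exposure control, so treat this as a toy example:

```python
# Minimal sketch of why games keep HDR values around: lighting is computed
# in a range that can exceed 1.0, then tone mapped down to 0.0..1.0 for the
# display. The Reinhard curve (x / (1 + x)) is one classic choice; real
# engines use fancier operators and exposure control.

def reinhard(hdr_value):
    return hdr_value / (1.0 + hdr_value)

for radiance in (0.05, 0.5, 1.0, 4.0, 16.0, 100.0):
    print(f"{radiance:7.2f} -> {reinhard(radiance):.3f}")

# Bright values (4, 16, 100) are compressed but stay distinguishable,
# instead of all clipping to 1.0 the way they would in an 8-bit buffer.
```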
 
Brent_Justice said:
for lighting yes

have you all been in a cave?

the requirement, at the least, for HDR is to use a 64-bit color surface format; the current highest quality uses a 128-bit color surface format

Don't forget Vista and Dx10

-Proxy
 
tornadotsunamilife said:
So those 'HDR monitors' I saw a few months ago used 64bit colour or whatever?

No, I don't believe they were outputting more than 8 bits/component. The fuss about the "HDR" monitors is that they could output a much higher dynamic range of light. By using an LED backlight, they could also give you much better contrast. Basically, a lower-res duplicate image [of what was being displayed in high res on the screen] is run on the LED backlight, so areas that are meant to be dark are really dark, and areas that are meant to be blinding are blinding. The luminance on these monitors is a step up as well. I think the contrast ratio (even without the added brightness this would be enough, imo) was something on the order of 200,000:1, which is several orders of magnitude better than the best CRT.
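For anyone curious, here's a very simplified sketch of that "low-res duplicate image on the backlight" idea (local dimming); the zone count and the max() heuristic are illustrative assumptions, not how any real monitor is tuned:

```python
import numpy as np

# Very simplified sketch of local dimming: drive each coarse backlight zone
# from the brightest pixel in its region of the frame. Zone counts and the
# max() heuristic are just illustrative assumptions.

frame = np.random.rand(1080, 1920)    # per-pixel target luminance, 0..1
zones_y, zones_x = 12, 20             # coarse LED backlight grid

backlight = np.zeros((zones_y, zones_x))
for zy in range(zones_y):
    for zx in range(zones_x):
        region = frame[zy * 90:(zy + 1) * 90, zx * 96:(zx + 1) * 96]
        backlight[zy, zx] = region.max()   # zone driven by its brightest pixel

# Dark zones get an almost-off backlight (truly black blacks) while bright
# zones can be driven hard -- that's where the huge contrast numbers come from.
print(backlight.min(), backlight.max())
```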
 
pakotlar said:
No, I don't believe they were outputting more than 8 bits/component. The fuss about the "HDR" monitors is that they could output a much higher dynamic range of light. By using an LED backlight, they could also give you much better contrast. Basically, a lower-res duplicate image [of what was being displayed in high res on the screen] is run on the LED backlight, so areas that are meant to be dark are really dark, and areas that are meant to be blinding are blinding. The luminance on these monitors is a step up as well. I think the contrast ratio (even without the added brightness this would be enough, imo) was something on the order of 200,000:1, which is several orders of magnitude better than the best CRT.
The display he's talking about is the DR37-P. It's one of the only, if not the only (I don't know of any others), monitors that can display true HDR. It also costs CDN$50,000 for that pretty 37" screen. So... Yeah.

HDR is nice and all, but it's still a relatively new technique (new in wide use, at least). So, there will be many improvements to come in the next few years. It's still far off human dynamic range, though.
 
Most (maybe all?) LCD panels cannot do the full 24-bit color range... let alone 48-bit or 64-bit.

There is still a lot of work to be done to get LCDs to display the full (98% or more?) color gamut at a good price.
 
Yashu said:
Most (maybe all?) LCD panels cannot do the full 24-bit color range... let alone 48-bit or 64-bit.

There is still a lot of work to be done to get LCDs to display the full (98% or more?) color gamut at a good price.

Yes, most LCDs use 6 bits per color channel, meaning about 262 thousand real colors; to achieve a 24-bit look, the rest are dithered. The backlights also enter into the equation, providing a shorter-range color gamut.

LCDs still aren't up to CRT quality.
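As a rough sketch of how that dithering trick works, here's a toy example of a 6-bit panel faking an 8-bit shade by alternating between its two nearest real levels; real panels use much smarter spatial/temporal patterns:

```python
# Sketch of how a 6-bit panel fakes an 8-bit shade by dithering: it
# alternates between the two nearest 6-bit levels so the average comes out
# right. Real panels use smarter spatial/temporal patterns than this.

target = 100          # desired 8-bit level (0..255)
step = 255 / 63       # a 6-bit panel only has 64 real levels

low = int(target // step) * step
high = low + step
frac = (target - low) / step   # how often to show the higher level

pattern = [high if (i % 100) < frac * 100 else low for i in range(100)]
print(round(sum(pattern) / len(pattern)))   # ~100, even though neither
                                            # displayable level equals 100
```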
 
Of course, in due time, it will move up higher.

That's like saying...

"Wow, these new Quake 3D graphics are amazing on my new Voodoo video card. I don't know if it can get any better than this!"
 
Of course it will continue to evolve. Hopefully the limitation isn't in the nature of the backlight and it can improve; otherwise we will have to go to LEDs, and those are more expensive right now. Hopefully the prices of those come down.
 
Proxy said:
Don't forget Vista and Dx10

-Proxy

What about Vista and D3D10? The highest-quality format that D3D10 provides is DXGI_FORMAT_R32G32B32A32_FLOAT, which is analogous to D3D9's D3DFMT_A32B32G32R32F format, i.e. 128-bit HDR. Heck, I'm not even sure high-end rendering studios like Pixar/ILM use more than 4x32-bit imaging formats in their systems.
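Just to put those formats in perspective, here's a rough sketch of what they cost in framebuffer memory per surface; the 1600x1200 resolution is only for illustration:

```python
# Rough sketch of what those surface formats cost in memory per frame.
# 1600x1200 is just an illustrative resolution.

width, height = 1600, 1200
formats = {
    "8:8:8:8 (32-bit LDR)": 4,             # bytes per pixel
    "16:16:16:16 float (64-bit HDR)": 8,
    "32:32:32:32 float (128-bit HDR)": 16,
}

for name, bpp in formats.items():
    mb = width * height * bpp / (1024 ** 2)
    print(f"{name}: {mb:.1f} MB per surface")

# 32-bit: ~7.3 MB, 64-bit: ~14.6 MB, 128-bit: ~29.3 MB -- and that's before
# multiple render targets, AA, or the extra bandwidth cost.
```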
 
Brent_Justice said:
Of course it will continue to evolve. Hopefully the limitation isn't in the nature of the backlight and it can improve; otherwise we will have to go to LEDs, and those are more expensive right now. Hopefully the prices of those come down.
I think that's a given. We will have to move to LEDs, unless they can improve the current situation. Once we start mass-producing LEDs in an efficient manner, then the prices will go down. Until then, we'll have to stick with what we can get our hands on, at the price our pockets (or wives if you got 'em) will allow us.
 
Brent_Justice said:
Of course it will continue to evolve. Hopefully the limitation isn't in the nature of the backlight and it can improve; otherwise we will have to go to LEDs, and those are more expensive right now. Hopefully the prices of those come down.

Aside from the expense, their power usage is insane. That DR37-P, for example, has a peak power draw of 1680W (!) :eek:
 
As mentioned earlier, with 24-bit colour there are more colours than pixels on screen at any current usable resolution.
So in theory (assuming LCDs will have true 24-bit colour), this means monitors won't need any modifications, as they display more colours than we can tell the difference between.

Take a high res of 2048x1536 = 3,145,728 pixels.
24-bit colour gives over 16 million colours, which is already over 5x overkill.
To allow use of all the colours in the palette, a minimum resolution of around 4729 x 3547 (4:3) is needed.
Increasing the number of displayed colours won't give any extra benefit and won't really be needed even when we have huge-resolution screens.
64-bit colour is of benefit in the internal architecture for colour processing.

I can see marketing as a good reason to tout 64-bit colour displays (i.e. they will sell because they are 64-bit, lol).
When it's cheap enough, the manufacturers may as well make 64-bit graphics output and displays, so I expect them to appear some time.
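For anyone who wants to check the arithmetic, here's a quick sketch reproducing the numbers above:

```python
import math

# Reproducing the arithmetic above: pixels on screen vs. colours in the palette.
palette_24bit = 2 ** 24              # 16,777,216 colours
res_pixels = 2048 * 1536             # 3,145,728 pixels
print(palette_24bit / res_pixels)    # ~5.3x more colours than pixels

# 4:3 resolution whose pixel count equals the 24-bit palette size:
height = math.sqrt(palette_24bit * 3 / 4)
width = height * 4 / 3
print(int(width), "x", int(height))  # 4729 x 3547
```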
 
Chernobyl1 said:
As mentioned earlier, with 24-bit colour there are more colours than pixels on screen at any current usable resolution.
So in theory (assuming LCDs will have true 24-bit colour), this means monitors won't need any modifications, as they display more colours than we can tell the difference between.

Take a high res of 2048x1536 = 3,145,728 pixels.
24-bit colour gives over 16 million colours, which is already over 5x overkill.
To allow use of all the colours in the palette, a minimum resolution of around 4729 x 3547 (4:3) is needed.
Increasing the number of displayed colours won't give any extra benefit and won't really be needed even when we have huge-resolution screens.
64-bit colour is of benefit in the internal architecture for colour processing.

I can see marketing as a good reason to tout 64-bit colour displays (i.e. they will sell because they are 64-bit, lol).
When it's cheap enough, the manufacturers may as well make 64-bit graphics output and displays, so I expect them to appear some time.

It isn't about the number of colors displayed on a screen; it's about the number of possible colors to draw from in the palette. However, that said... no, it is very unlikely that 64-bit will come to the consumer market, as the human eye can't even discern all the different colors drawn from a 24-bit palette (it can, however, from a 16-bit palette).
 
Zinn said:
what about fuchsia?

Indigo pwns fuchsia in the face :p

I just read a review about that DR37-P, and they talk about it being very loud and hot because of the fans. If they were truly thinking, they'd find someone [H]ard enough to watercool it :p
 
As others have pointed out, we already have "better than truecolor" in our PCs today - HDR, 16 bit image editing, scanners, etc - but those are all downsampled before the image is displayed on a monitor.

Is the question about the *display* of higher than 24-bit color? Displaying that on a CRT isn't a problem because it's an analog signal. The limit there is the size of the framebuffer, the bit depth of the RAMDAC in the video card, and the noise in the signal between the RAMDAC and the CRT electron gun.

Displaying higher than 24-bit color on a digital device (like an LCD) has more limitations. Single link DVI is limited to 24-bit color, so you'd have to use dual link or twin link. All the electronics in the monitor would need to support the higher bit depth.

24-bit is already very close to the limits of the human eye for distinguishing color, so it may not be worth the effort. In grayscale the eye is more sensitive. Maybe there's a way to improve the *perception* of the image by increasing the amount of information in the grayscale and leaving the color alone. (DV, JPEG, and MPEG already do tricks like this to reduce the amount of bandwidth/storage needed for images. They cut color information and leave the luminance.)

I don't think that would be too big a leap in display technology. Medical diagnostic quality LCDs already have the capability of displaying more than 8 bit grayscale (256 shades of gray). This one does between 11 and 12 bits grayscale.
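To illustrate that luma/chroma trick, here's a rough sketch of splitting RGB into full-resolution luminance plus half-resolution colour; the Rec. 601 luma weights are standard, but the 2x2 averaging below is just a crude stand-in for real 4:2:0 subsampling:

```python
import numpy as np

# Sketch of the luma/chroma trick mentioned above: keep full-resolution
# luminance (Y), throw away resolution only in the colour channels.

rgb = np.random.rand(4, 4, 3)    # tiny example image

# Rec. 601 luma weights; chroma here is a simple colour-difference signal.
y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
cb = rgb[..., 2] - y             # blue-difference chroma
cr = rgb[..., 0] - y             # red-difference chroma

# Halve the chroma resolution by averaging 2x2 blocks; luma stays full-res.
cb_sub = cb.reshape(2, 2, 2, 2).mean(axis=(1, 3))
cr_sub = cr.reshape(2, 2, 2, 2).mean(axis=(1, 3))

print(y.shape, cb_sub.shape, cr_sub.shape)   # (4, 4) (2, 2) (2, 2)

# The eye barely notices the lost chroma detail, which is why DV/JPEG/MPEG
# get away with it -- and the same logic suggests spending extra bits on
# luminance before spending them on colour.
```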
 
Cool. I wonder when the displays will catch up...

(From extremetech)

The ITU 601 standard, which governs today's displays, allows only 60 to 80 percent of the available colors, even if the display can support more, Chard said. "The color bit depth [of today's displays] is typically 24-bits RGB – that gets you 16 million colors, and the human eye can distinguish that," Chard said. "That leads to scaling and onscreen effects which you can pick up. Either 36-bit or 48-bit RGB is beyond the ability of the human eye to distinguish."

I guess I better check my info on the color sensitivity of the human eye.
 
bassman said:
In grayscale the eye is more sensitive.

Getting OT here--beautiful discussion BTW--but is that why digital cable looks like shit on my TV? I can see a ton of blotchiness on dark shots.
 