Color difference on DisplayPort compared to DVI

dolevo

Hi guys,

I am currently using a Radeon 6320 graphics card, which has DisplayPort (DP) and DVI-I outputs. I have two identical monitors, and both have only HDMI input, so I am using two cables to hook up my monitors: DP->HDMI and DVI-I->HDMI. However, the colors are visibly different on the two monitors, and the difference is most noticeable in bluish tones. I tried different monitors as well, but the result is the same. Could it be caused by some color conversion from DP to HDMI? Is there any color conversion going on from DP to HDMI? Could it be the monitors? They are just Dell monitors with 1600x900 resolution and only an HDMI input.

I'd appreciate your comments on this issue.
 
I have seen the same phenomenon with my rig. Mine is an Asus ProArt 23-inch IPS panel with native 8-bit color depth and a 10-bit LUT. When I hook my monitor up to my graphics card over DP, the colors are much more vivid and the blacks are inkier compared to HDMI. I don't use any adapters, by the way, as my monitor has both DP and HDMI ports, and so does my 680. At first I thought it was the 10-bit LUT doing the vivid stuff, but then I realized Windows does not have a 10-bit color depth option at the OS level. I have yet to find an answer to this behavior, but man, am I loving it because of the increased color depth :D... Any thoughts from others would be appreciated.
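For what it's worth, the practical difference between 8-bit and 10-bit is just the step size between adjacent levels; a 10-bit LUT mostly lets the monitor apply its calibration curves without adding banding, even when the incoming signal is still 8-bit. A quick back-of-the-envelope comparison in Python (plain arithmetic, nothing vendor-specific):

Code:
# Step size between adjacent grey levels at 8 vs. 10 bits of depth.
for bits in (8, 10):
    levels = 2 ** bits
    step = 100 / (levels - 1)  # spacing between levels, in % of full scale
    print(f"{bits}-bit: {levels} levels, ~{step:.3f}% of full scale per step")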
 
It could be that one of the monitors is using the 16-235 level range (meant for TVs) instead of 0-255 (for PCs), or that it's using YCbCr mode instead of RGB. I don't know enough about ATI drivers to know how to check for or fix either one.
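If anyone wants to see what a range mismatch actually does to pixel values, here is a rough Python sketch of the standard 0-255 <-> 16-235 mapping (illustration only, not tied to any particular driver). If the source sends limited range but the display expects full range, black becomes dark grey and white is dimmed; the reverse crushes shadows and clips highlights:

Code:
def full_to_limited(v):
    """Map a full-range 0-255 value into the 16-235 'video' range."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Expand a 16-235 value back to 0-255, clipping out-of-range values."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

for sample in (0, 32, 128, 235, 255):
    squeezed = full_to_limited(sample)
    print(f"full {sample:3d} -> limited {squeezed:3d} "
          f"(a full-range display would show {squeezed} instead of {sample})")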
 
The best bet is to stick with the exact same port types.

I believe you can buy one of those fancy color calibrators that (in theory) measures the colors on a monitor and then builds a custom color profile to try to match it to a baseline.

I have two identical monitors, each connected to a DVI port on its own video card (the cards are identical too).

I originally could notice the difference between the two, but now, after three years, I don't seem to be able to anymore.

Actually, I sort of still can, on some darker colors.
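On the calibrator suggestion: the profile a calibrator produces is an ordinary ICC file, and color-managed software can apply it when rendering. A minimal Python sketch using Pillow's ImageCms, where the file names and the .icc path are just placeholders for whatever your calibrator writes out:

Code:
# Convert an sRGB image into a monitor's calibrated ICC profile with Pillow.
from PIL import Image, ImageCms

src = Image.open("screenshot.png").convert("RGB")        # placeholder input file
srgb = ImageCms.createProfile("sRGB")                    # assumed source space
monitor = ImageCms.getOpenProfile("my_monitor.icc")      # placeholder profile path
matched = ImageCms.profileToProfile(src, srgb, monitor)  # perceptual intent by default
matched.save("screenshot_matched.png")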
 
For AMD cards, in CCC go to "My Digital Flat-Panel > Pixel Format" and make your selection.
 
DisplayPort can't do 120 Hz, right? I have a 7950 and use dual DVI.
 
Hi guys,

I am currently using a Radeon 6320 graphics card, which has DisplayPort (DP) and DVI-I outputs. I have two identical monitors, and both have only HDMI input, so I am using two cables to hook up my monitors: DP->HDMI and DVI-I->HDMI. However, the colors are visibly different on the two monitors, and the difference is most noticeable in bluish tones. I tried different monitors as well, but the result is the same. Could it be caused by some color conversion from DP to HDMI? Is there any color conversion going on from DP to HDMI? Could it be the monitors? They are just Dell monitors with 1600x900 resolution and only an HDMI input.

I'd appreciate your comments on this issue.

Hello dolevo,

Do you have "Use Extended Display Identification Data (EDID)" enabled in Catalyst Control Center under Display Color (My Digital Flat-Panel)? Try disabling it and see if that corrects the issue. The color temperature should be at 6500K.
 
I have seen the same phenomenon with my rig. Mine is an Asus ProArt 23-inch IPS panel with native 8-bit color depth and a 10-bit LUT. When I hook my monitor up to my graphics card over DP, the colors are much more vivid and the blacks are inkier compared to HDMI. I don't use any adapters, by the way, as my monitor has both DP and HDMI ports, and so does my 680. At first I thought it was the 10-bit LUT doing the vivid stuff, but then I realized Windows does not have a 10-bit color depth option at the OS level. I have yet to find an answer to this behavior, but man, am I loving it because of the increased color depth :D... Any thoughts from others would be appreciated.

Good to know that I am not alone :cool:

It could be that one of the monitors is using the 16-235 level range (meant for TVs) instead of 0-255 (for PCs), or that it's using YCbCr mode instead of RGB. I don't know enough about ATI drivers to know how to check for or fix either one.

That's what I thought at first, but to eliminate that possibility I tested with a single monitor: I unplugged the HDMI cable coming from the DVI output and plugged in the other HDMI cable coming from the DP output. The result was the same. Since, from the monitor's point of view, the incoming signal is HDMI in both cases, I don't think the monitor can be the issue. Perhaps I am still wrong?

Hello dolevo,

Do you have "Use Extended Display Identification Data (EDID)" enabled in Catalyst Control Center under Display Color (My Digital Flat-Panel)? Try disabling it and see if that corrects the issue. The color temperature should be at 6500K.

Hi Mr Mean,
I have no idea what is enabled there. I will check it tomorrow and let the forum know.

I have been reading a lot about these technologies since yesterday. Apparently, DP sends data in packets, similar to how data is sent over a network or from a PC to a USB pen drive. The other digital interfaces, DVI and HDMI, send the R, G and B data on separate channels with a separate clock signal. Therefore, with a cable like the one in the image below, there must be some sort of conversion from one format to the other, and I am guessing that this conversion is the source of the problem. I am still investigating where the conversion happens; perhaps somewhere in the cable in a passive fashion, but I don't know yet.

[Image: DisplayPort-to-HDMI cable]
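Just to illustrate the framing difference described above (a loose conceptual sketch in Python, not real protocol code): DP moves pixels as small self-describing packets over a fixed-rate link, while DVI/HDMI-style TMDS serializes R, G and B on their own channels, paced by a separate pixel clock.

Code:
pixels = [(10, 20, 30), (40, 50, 60), (70, 80, 90)]

# DisplayPort-style: pixels bundled into packets; the link clock is
# independent of the pixel clock.
packets = [{"seq": i, "len": 1, "payload": px} for i, px in enumerate(pixels)]

# DVI/HDMI-style (TMDS): each colour component on its own channel, with a
# shared pixel clock on a fourth channel.
red   = [px[0] for px in pixels]
green = [px[1] for px in pixels]
blue  = [px[2] for px in pixels]
clock = list(range(len(pixels)))

print(packets)
print(red, green, blue, clock)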
 
Hello dolevo,

Do you have "Use Extended Display Identification Data (EDID)" enabled in Catalyst Control Center under Display Color (My Digital Flat-Panel)? Try disabling it and see if that corrects the issue. The color temperature should be at 6500K.

Didn't make any difference at all.
 
This is probably an obvious question, but I've only ever run a single monitor, so please bear with me. I always use "adjust desktop color settings" for things like vibrancy, digital sharpness, and gamma. Can those sorts of settings be used to "tune" the monitor with your software/GPU? I know it's not ideal, but I use this method in a lot of games just to get the colors the way I think they should be from game to game.
 
This is probably an obvious question, but I've only ever run a single monitor, so please bear with me. I always use "adjust desktop color settings" for things like vibrancy, digital sharpness, and gamma. Can those sorts of settings be used to "tune" the monitor with your software/GPU? I know it's not ideal, but I use this method in a lot of games just to get the colors the way I think they should be from game to game.

Yes, you are right, that could help, but I also tested with a single monitor to rule out the possibility that the problem comes from different monitor settings or physical properties. I use a single monitor with two cables coming from my graphics card, one attached to the DP output and the other to the DVI. Both cables have an HDMI end, and I just switch them at the monitor end.
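On the software-tuning question above: on Windows, gamma-type adjustments like the one in "adjust desktop color settings" typically end up in the GPU's gamma ramp, which can also be set directly through GDI. A rough Windows-only sketch with ctypes (the gamma value is just an example; Windows rejects ramps it considers too extreme):

Code:
import ctypes

GAMMA = 1.1  # mild example adjustment

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p
user32.ReleaseDC.argtypes = (ctypes.c_void_p, ctypes.c_void_p)
gdi32.SetDeviceGammaRamp.argtypes = (ctypes.c_void_p, ctypes.c_void_p)

# 3 channels x 256 entries of 16-bit values: R ramp, then G, then B.
ramp = ((ctypes.c_ushort * 256) * 3)()
for i in range(256):
    value = min(65535, int(((i / 255.0) ** (1.0 / GAMMA)) * 65535 + 0.5))
    ramp[0][i] = ramp[1][i] = ramp[2][i] = value

hdc = user32.GetDC(None)  # device context for the whole screen
ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
user32.ReleaseDC(None, hdc)
print("gamma ramp applied" if ok else "driver refused the ramp")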
 