Nvidia GPU? HDMI connection? Please read.

I've been saying this for months, and a few others have recently jumped in. For those of us using HDTVs, this has been an issue for years; with the recent explosion of HDMI in PC monitors, it's becoming a larger one. Nvidia's GPUs do not properly output full-range HDMI. But there is a fix. Run the utility below if you want it fixed, and read on if you want more of an explanation.

http://blog.metaclassofnil.com/wp-content/uploads/2012/08/NV_RGBFullRangeToggle.zip

Basically, if you've ever used the menu on a modern HDTV, you've probably noticed a setting called "HDMI black level" or just "black level." With 24-bit color, full-range RGB uses all 256 values per channel (0-255), with 0 as black. Most digital video content instead uses a limited range of 16-235, where 16 is black and 235 is white. It is important for your connected device and your TV/monitor to be set to the same range, or image quality will suffer. Unfortunately, the names for these settings aren't standardized, so I'll explain the common ones below.

-Option for "full-range," or "high" is 0-255
-Option for "limited-range," or "low" is 16-235

Setting the input device (PS3, Xbox 360, PC) to full range and the TV/monitor to limited range will crush blacks. The opposite will give a washed-out image. Having both set to full range while playing a DVD or Blu-ray (e.g., on a PS3) is fine, as the PS3, and even your PC, knows to properly map 16-235 content into a 0-255 range. Your Nvidia GPU even has this setting under its video color settings. NOTE: This only affects video output when the desktop is at 0-255; the setting is useless if your desktop is at 16-235.
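
And a rough sketch of why a mismatch looks wrong, assuming one end expands the range and the other doesn't:

```python
# A display expecting limited range, fed full-range input: everything
# below 16 crushes to black and everything above 235 clips to white,
# destroying shadow and highlight detail ("crushed blacks").
def limited_display_shows(v: int) -> int:
    return round((min(max(v, 16), 235) - 16) * 255 / 219)

# A display expecting full range, fed limited-range input: values pass
# through as-is, so "black" arrives as 16 (a dark grey) and "white" as
# 235 -- the classic washed-out look.
def full_display_shows(v: int) -> int:
    return v

print(limited_display_shows(8))   # -> 0, shadow detail gone
print(full_display_shows(16))     # -> 16, black shown as grey
```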

So, when you plug your GPU into a monitor/TV via HDMI, the driver reads the display's EDID and tries to determine whether it supports 0-255. If not, or if it's uncertain, it defaults to 16-235. Since monitors are coming out faster than AMD or Nvidia can keep pace with, a lot of displays cause the driver to default to 16-235 even though the display actually uses 0-255. The result is a washed-out image. AMD has fixed this by giving us a toggle in their drivers. Nvidia has not, and is ignoring the issue despite numerous threads on their own forums and numerous bug submissions by myself and others.
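
For the curious, the bit the driver is looking for lives in the CEA-861 extension of the display's EDID: the Video Capability Data Block has a "QS" flag saying whether the RGB quantization range is selectable. Below is a simplified sketch of that kind of check; a real parser would also validate checksums, handle multiple extension blocks, and bounds-check everything:

```python
def rgb_range_selectable(edid: bytes) -> bool | None:
    """True if the CEA extension advertises the QS bit (RGB quantization
    range selectable), False if not, None if there's no CEA extension."""
    if len(edid) < 256 or edid[126] == 0:
        return None                      # no extension blocks at all
    ext = edid[128:256]                  # first extension block
    if ext[0] != 0x02:                   # 0x02 = CEA-861 extension tag
        return None
    dtd_start = ext[2]                   # data blocks run from byte 4 to here
    i = 4
    while i < dtd_start:
        tag = ext[i] >> 5                # bits 7-5: data block tag
        length = ext[i] & 0x1F           # bits 4-0: payload length
        if tag == 7 and length >= 2 and ext[i + 1] == 0x00:
            # extended tag 0x00 = Video Capability Data Block;
            # bit 6 of its first payload byte is the QS flag
            return bool(ext[i + 2] & 0x40)
        i += 1 + length
    return False
```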

So what can you do? After installing a driver, run the utility I linked at the top. You'll have to do this after every driver installation. Also, this forces 0-255 output, so your image will have crushed blacks if output to a limited-range HDTV (in that case, run the toggle again to revert).
 
I have an odd problem with my GTX 660 Ti hooked up to a monitor...

Using HDMI, I can change the resolution in the Nvidia control panel under the display settings.
Choosing the HDTV option for my native resolution makes the desktop look washed out, but the colors in games look good.
Conversely, when I choose the PC option for my native resolution, the desktop looks great, but the blacks are crushed in games.

I am going to try using some other cables (DVI and VGA) tomorrow, but I was wondering if you have seen this before?
 
Damn, I just did this (a few days ago) and movies look a lot better on my HDMI HDTV. I had no idea this was a fixable issue; I figured HDTVs just looked more washed out compared to my monitor. It sucks that such information is not more high profile. The tool also crashed on my Win 8 install, so I did it manually; it should be doable in PowerShell, so I might mess around and try to implement a script for it.
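
If anyone does script it, the shape would be something like the Python below. Big caveat: the key path and value name here are placeholder guesses, not the real ones; find the actual location by exporting the Nvidia registry branch before and after running the toggle tool and diffing the two exports.

```python
# Sketch only: KEY_PATH and VALUE_NAME are HYPOTHETICAL placeholders,
# not the real locations the toggle utility writes to.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Services\ExampleNvDriverKey"  # placeholder
VALUE_NAME = "RGBFullRange"                                         # placeholder

def set_full_range(enabled: bool) -> None:
    """Write 1 for full range (0-255), 0 for limited (16-235)."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD,
                          1 if enabled else 0)
```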
 
This is interesting. I'll have to look into this, see what I can learn, and find out if something more can be done about it. Thanks for sharing.
 
Quote:
I tried it too. What am I looking for? I don't see a black level setting in my TV options. JVC http://tv.jvc.com/product.jsp?modelId=MODL028923&pathId=201&page=10

Limited range compresses the color values, so full range should look more colorful. I don't think the TV's black level settings have anything to do with it. If you can't tell the difference, I don't know what to tell you; it seemed very pronounced to me in the LotR EE Blu-rays.
 
I'm not seeing a difference, at least right away. Here's a question - does this affect the YCbCr444 color format at all, or just RGB?
 
I left this on all weekend (I toggled to RGB mode)...and it seems to essentially make RGB mode look like YCbCr444 mode. Is there a reason I wouldn't just set it to YCbCr444 instead?
 
When I tried it, everything seemed darker, almost too dark, and not necessarily something brightness would fix. What is meant by "crushing blacks"?
 
While I don't really have a means of creating a 1-to-1 comparison, I toggled back and forth a bit, and while this DOES make RGB mode look better, I still think YCbCr444 looks better than either version of RGB on my set.
 

YCbCr444 looks washed out on my screen. RGB Full Range definitely looks best of all these options here. Maybe your TV options need adjusting (color, detail, brightness, etc.)

Edit: Hmm, there is an RGB 16-235 / RGB Full Range toggle in the Nvidia driver panel, I didn't know that was there, but now that I do I wonder if there is a reason to use a program for this. I toggled back and forth (does not even require a video reset, let alone a reboot), and full range looks plainly superior, the richer colors are readily apparent.
 
Rock what looks best, or invest in one of those AV calibration playback discs.
 
For me, if anything, I'd say RGB tends to make blacks more greyish; it almost reminds me of the Doom 3 look, where the dark areas take over instead of blending seamlessly with the rest of the image.
I'm not sure of a good way to describe it, but full RGB appears to affect the colors near dark patches and makes them fade out. Colors aren't as bright.
 
Here are some photos of the screen; poor quality, but they do show there is a difference:
RGB Full: https://iyt2pw.blu.livefilestore.co...8tCJhyvRlq0HhTbZmqma_NutH/IMG_0054.JPG?psid=1
YCbCr444: https://iyt2pw.blu.livefilestore.co...0lYeOqgKS2ggX67n3oYMK49mf/IMG_0055.JPG?psid=1
RGB Limited: https://iyt2pw.blu.livefilestore.co...1NRH-cJcjYGTqF_iBiq1UOG6O/IMG_0057.JPG?psid=1

The skin doesn't look right because the camera sucks, but you can see the hair looks better in RGB Full than in YCbCr444, imo. The RGB Limited shot is not apples to apples because I had to play the video before the change kicked in, but the hair also looks more washed out there; it's visually similar to YCbCr444.
 
Huh... weird. The difference on your TV looks almost backwards compared to mine. I guess your mileage may vary!
 
Just picked up a new Nvidia card, so I decided to use the HDMI connection on my Asus monitor, and it looked washed out. Used this download; it looks much, much better.

Thank you OP.
 
This patch doesn't work correctly on my 560 Ti. Applying the patch and setting HDMI Black Level to Low completely crushes blacks.
 
How do I tell if my HDTV is limited or full? I don't see "HDMI black" or "black levels" anywhere. I even looked in the service settings menu.

Nevermind, I found a way:
In full screen, if the words RED and MAGENTA are blurry, then you have RGB compression. If they look sharp and clear, you are using full RGB correctly.
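
If you don't have a test image handy, a pattern like that is easy to generate; saturated red and magenta on black differ from the background almost entirely in chroma, so they smear visibly when the signal path drops chroma resolution. A quick sketch using Pillow:

```python
# Generate a simple chroma/sharpness test card: pure red and magenta
# text on black. View it full screen at native resolution.
from PIL import Image, ImageDraw

img = Image.new("RGB", (640, 240), "black")
draw = ImageDraw.Draw(img)
draw.text((40, 60), "RED", fill=(255, 0, 0))
draw.text((40, 140), "MAGENTA", fill=(255, 0, 255))
img.save("chroma_test.png")
```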

 
So Nvidia has an HDMI issue; is this issue still there if you are using DVI or DisplayPort?
 

The "issue" can happen with either brand. The problem with nv (as I understand it) is that they don't provide an easy global switch between 0-255 and 16-240. The problem is that the display sends the incorrect info, so it's not even really the graphics card's fault. Apparently TV-type displays are horrible about containing the correct info.
 
I've only seen the issue over HDMI; it seems the DP/DVI display information is sent correctly, or contains more precise instructions for the display. Another reason to loathe HDMI, I guess.
 
I used AMD for the past 7 years, through several video card upgrades, with zero issues, and just recently wanted to give Nvidia another go with a 660, only to get problems (one listed here). After hours of research and no fixes, I found the app here that fixed it. My monitor only has HDMI and VGA; otherwise I would have used DVI. If I hadn't fixed the problem I would have gone VGA, but I didn't want to.

Thanks
 
I want to know why somebody in the video industry thought 16-235 was better than 0-255 for TVs. Probably some initial crappy digital-to-analog converter with a lame brightness/contrast circuit dependent on that "little range hack".
 
There is a reason why AMD, NVIDIA and Intel have all decided to drop HDMI.

The latest cards from AMD have HDMI; I'm using it on my R9 card.
It makes sense that the current standard for home entertainment has direct support, to avoid confusion.
Do you have proof that future cards won't have it?
 
Yup, and so do the 780 and 780 Ti...
 
The only port that will disappear at some point in the future is DVI, except where it's kept for legacy reasons. That is still several years off, but DVI is a frozen standard; its standards body disbanded over a decade ago. DisplayPort, by contrast, is an evolving technology, as is HDMI. HDMI will not disappear; if anything, DVI will, perhaps five years from now. DVI-D does not go past 2560x1600; DisplayPort and HDMI 2.0 can.
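
Rough pixel-clock arithmetic backs that up (a sketch; the ~12% blanking overhead is an assumed stand-in for real CVT reduced-blanking timings):

```python
# Approximate the pixel clock a mode needs and compare it to
# dual-link DVI's ceiling of 2 x 165 MHz TMDS links.
def approx_pixel_clock_mhz(w: int, h: int, hz: int,
                           blanking: float = 1.12) -> float:
    return w * h * hz * blanking / 1e6

DUAL_LINK_DVI_MHZ = 330
for w, h in [(1920, 1080), (2560, 1600), (3840, 2160)]:
    clk = approx_pixel_clock_mhz(w, h, 60)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MHZ else "exceeds"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz, {verdict} dual-link DVI")
```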

That said, HDMI 1.4 has a ton of issues and generally does not even work past 1080p in 99% of implementations. HDMI 2.0, whenever it begins to get adopted, should fix these issues, since UHDTVs in 2015 will generally use HDMI 2.0. Right now, though, nothing has HDMI 2.0 implemented yet; it was only ratified as a standard a few months ago.

So 3-4 years from now I can see GPUs using DVI only for legacy (big maybe), with the main connections being HDMI 2.0 and DisplayPort. Of course, legacy is a huge issue in the PC world, since a lot of folks use old monitors with their PCs. I do think DVI will disappear at some point; the only question is when. Five years? Six? Who knows.
 
[H], let's get Nvidia to fix this, as well as fix/limit the banding issues and offer EDID support (in AMD's Catalyst Control Center, see the display color section and check "Use Extended Display Identification Data (EDID)" under Color Temperature Control), which can allegedly keep a wide-gamut monitor's gamut in check in non-color-managed programs such as games.

This issue applies to DisplayPort as well for some monitors, like the U2414H that PC Monitors reviewed and the BenQ BL2710PT I tested (contrast dropped from 1000:1 to 200:1 when switching from the native 1440p resolution to 1080p over DP).
 
No matter what you think of HDMI, it isn't going anywhere.
They're just going to move to HDMI 2.0 at some point in the near future. You'll see DVI phased out a la VGA, though. Expect HDMI and DisplayPort to end up as the only options, side by side.
 
Quote:
[H], let's get Nvidia to fix this, as well as fix/limit the banding issues and offer EDID support, which can allegedly keep a wide-gamut monitor's gamut in check in non-color-managed programs such as games. This issue applies to DisplayPort as well for some monitors, like the U2414H that PC Monitors reviewed and the BenQ BL2710PT I tested (contrast dropped from 1000:1 to 200:1 when switching from the native 1440p resolution to 1080p over DP).

You must mean EDID over DisplayPort, not HDMI or DVI. I've never had issues with my GTX 550 Ti over DVI; the reason I haven't used HDMI is that the connector on the GPU is mini-HDMI, not full-sized HDMI.
 
I came from AMD to Nvidia because of the supposedly better software, etc., but there are all these weird bugs. Seriously, wtf?
 
Neither brand is perfect. However, this issue is caused by what the display device reports (or doesn't report). At present they have to maintain a blacklist of devices (with hardwired settings) to work around it. They seem to lack a global switch, which Radeon apparently has in its driver control panel.
 
The color dominance of AMD has gone on for more than a few years; it was prevalent back in the 5870/480 days in '09, if not before! (I used to have cheap monitors, so I never noticed it before then.) At this point NV doesn't have any excuses; they've continually chosen to drop the ball on this issue even after countless emails about their terrible settings and profiles. Maybe you could blame the displays during the first year or so, when HDMI was relatively new, but at this point it's ALL on NV.

With that said, this is such a small issue, affecting such a small subset of people, that I understand why it goes unsolved; they excel in other areas, which makes up for it for most folks, who don't even notice the issue. I cared enough to stay on AMD until the toggle was out, and was only too happy to sell my 7950s for a profit, get a GTX 780, and pocket some money. :p
 
Quote:
You must mean EDID over DisplayPort, not HDMI or DVI. I've never had issues with my GTX 550 Ti over DVI; the reason I haven't used HDMI is that the connector on the GPU is mini-HDMI, not full-sized HDMI.

AMD's EDID support applies to all connections and apparently can make a wide-gamut monitor's colors normal without having to manually enable color management in programs which support it (games don't). Nvidia's color issues lie with HDMI and DP; DVI is fine.
 