Nvidia's 340.43 beta drivers support 4k @ 60 Hz over HDMI 1.4

:\

From the samples I've seen, the color cost of reducing bandwidth enough to make it work is pretty high. That makes it not really usable for much except watching video. Still, it's interesting on a technical level, if not a practical one.
 
Does watching video translate into gaming at all? I've seen images showing how terrible text looks, but wonder about the other possible applications.
 
Does watching video translate into gaming at all? I've seen images showing how terrible text looks, but wonder about the other possible applications.


Not really. The differences between the two are pretty drastic. Video has a lot more variables that are mostly FPS/Hz independent.

It's pretty sad that people are having to hack together solutions in the first place. HDMI 2.0 is a joke compared to even DP 1.2, and it's what, 2-3 years late? Thankfully it's mostly a TV standard, which has very little effect on gaming.
 
I only use my 4K HDTV for watching videos, so gaming would never matter for me unless I were to buy a 4K monitor, which would be using DisplayPort anyway.
 
Just reading the AnandTech article, this sounds like the worst idea ever. Nvidia can keep this innovation to themselves. ;)
 
I tried this driver, and it was extremely buggy; I ran into too many issues and had to go back to the previous one. It didn't even feel like 60 Hz to me when I tried it on my 4K TV anyway.
 
How can a driver beat the physical restrictions of the cable and port? I thought it was a hardware issue, that HDMI 1.4 isn't physically able to transfer the amount of data required for 4K. That's why TV makers switched from 1.4 to 2.0 for true 60 Hz 4K screens. So how is it possible to enable this in software?
 
How can a driver beat the physical restrictions of the cable and port? I thought it was a hardware issue, that HDMI 1.4 isn't physically able to transfer the amount of data required for 4K. That's why TV makers switched from 1.4 to 2.0 for true 60 Hz 4K screens. So how is it possible to enable this in software?
They cut the color quality so that the video signal fits into the HDMI 1.4 bandwidth at 60 Hz.
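Some rough back-of-the-envelope arithmetic (a Python sketch; active pixel data only, ignoring blanking and audio, so the numbers are approximate):

```python
# Rough bandwidth check: does uncompressed 4K60 fit in HDMI 1.4?
# (Active pixel data only; blanking intervals and audio are ignored, so treat
# these numbers as ballpark figures.)
width, height, refresh = 3840, 2160, 60
pixels_per_second = width * height * refresh     # ~497.7 million pixels/s

bpp_444 = 24   # 8-bit 4:4:4: three full-resolution samples per pixel
bpp_420 = 12   # 8-bit 4:2:0: full-res luma, chroma shared across 2x2 pixels

hdmi_14_video_gbps = 8.16   # ~10.2 Gbit/s TMDS minus 8b/10b encoding overhead

for name, bpp in (("4:4:4", bpp_444), ("4:2:0", bpp_420)):
    gbps = pixels_per_second * bpp / 1e9
    verdict = "fits" if gbps <= hdmi_14_video_gbps else "does NOT fit"
    print(f"4K60 {name}: {gbps:.2f} Gbit/s -> {verdict} in ~{hdmi_14_video_gbps} Gbit/s")
```

4:4:4 comes out around 11.9 Gbit/s, well over the limit; 4:2:0 comes out around 6 Gbit/s, which squeezes under it.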

Not sure why anyone would knock NVIDIA for this. At least they are providing you with an option so you can try it if you'd like.
 
Wow, this is a nice surprise from Nvidia. The color might not be full RGB, but at least it gives people the option if they have a TV with HDMI and no DisplayPort.

Shouldn't be that bad of a color loss.
 
Just reading the AnandTech article, this sounds like the worst idea ever. Nvidia can keep this innovation to themselves. ;)
This isn't actually an Nvidia innovation. Reducing bandwidth requirements by using high-resolution luma with low-resolution chroma is an extremely old technique. The basic theory is that we're FAR more sensitive to changes in brightness than we are to changes in color, so as long as luma stays full-resolution, most people won't notice degraded color information.

Now, with that said... I'm not sure why they bothered implementing this. 4K @ 24 Hz is all you need for most 4K movies (and HDMI 1.4 can push that without a problem, without compressing colors). So it doesn't really help you there unless you have a rare 60 FPS 4K video you want to view.

It might look OK in games, but any effects that draw with single-pixel accuracy will look a bit mushy. And it certainly wouldn't be a mode you'd want to use on the desktop.
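For anyone curious, here's a rough, purely illustrative Python/numpy sketch of what 4:2:0-style subsampling does: keep luma at full resolution, average each chroma plane down to a quarter of the samples, then stretch it back up for display. The BT.601 conversion coefficients below are just one common choice, not necessarily what Nvidia's driver uses.

```python
# Illustrative 4:2:0-style chroma subsampling (not Nvidia's actual code).
# Luma (Y) stays at full resolution; the color-difference planes (Cb, Cr)
# are carried at half resolution in each direction.
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range BT.601 RGB -> Y, Cb, Cr (float arrays, 0..255)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    cb, cr = cb - 128.0, cr - 128.0
    r = y + 1.402    * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772    * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255)

def subsample_420(plane):
    """Average each 2x2 block: 1/4 the chroma samples."""
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample_2x(plane):
    """Nearest-neighbour stretch back to full resolution for display."""
    return plane.repeat(2, axis=0).repeat(2, axis=1)

def simulate_420(rgb):
    y, cb, cr = rgb_to_ycbcr(rgb.astype(np.float64))
    cb = upsample_2x(subsample_420(cb))   # chroma detail reduced...
    cr = upsample_2x(subsample_420(cr))
    return ycbcr_to_rgb(y, cb, cr).astype(np.uint8)  # ...luma untouched

# Stand-in for a 4K frame (even dimensions assumed for the 2x2 averaging).
frame = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
print(simulate_420(frame).shape)   # (2160, 3840, 3): resolution unchanged
```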
 
This isn't actually an Nvidia innovation. Reducing bandwidth requirements by using high-resolution luma with low-resolution chroma is an extremely old technique. The basic theory is that we're FAR more sensitive to changes in brightness than we are to changes in color, so as long as luma stays full-resolution, most people won't notice degraded color information.

Now, with that said... I'm not sure why they bothered implementing this. 4K @ 24 Hz is all you need for most 4K movies (and HDMI 1.4 can push that without a problem, without compressing colors). So it doesn't really help you there unless you have a rare 60 FPS 4K video you want to view.

It might look OK in games, but any effects that draw with single-pixel accuracy will look a bit mushy. And it certainly wouldn't be a mode you'd want to use on the desktop.

While I do agree with you on most points, I don't see it hurting anything to give people this option. Sure, it might not look all that great, but it's better than being stuck at 30 Hz.

Anything for free is great in my book, especially if it's just a driver update.
 
I don't see it hurting anything to give people this option. Sure, it might not look all that great, but it's better than being stuck at 30 Hz.

Anything for free is great in my book, especially if it's just a driver update.
Oh, of course, never said it hurt anyone... just not entirely sure why they bothered, is all.

Basically, I'm curious exactly what use case they had in mind for this mode. My best guess is gaming, since that actually needs 60 Hz, and you might not notice the color degradation.
 
Oh, of course, never said it hurt anyone... just not entirely sure why they bothered, is all.

Basically, I'm curious exactly what use case they had in mind for this mode. My best guess is gaming, since that actually needs 60 Hz, and you might not notice the color degradation.

I find it interesting that they didn't even put it in the release notes. You would think they would at least mention it.

Yeah, I guess it depends on how it looks. I know not enabling full RGB mode on a TV sucks royal ass.
 
Yeah, I guess it depends on how it looks. I know not enabling full RGB mode on a TV sucks royal ass.
Well, the upshot of using this method is that it only reduces color resolution, not bit-depth. You still have 16,777,216 colors to work with, but you're effectively sending color information at 1080p and brightness information at 4k.

Here's an example. Original image on the left, compressed version on the right:

[images: CHyJKGx.jpg (original) | HFln0yQ.jpg (compressed)]


Even though the image on the right has 1/4th the color information, they're REALLY hard to tell apart. For movies and games, this is probably acceptable. You'll also notice that there's no banding added to the image (since bit-depth wasn't reduced).

Edit: Just to drive the point home about how little your brain cares about color when determining the overall "quality" of an image, here's the same image compressed even further. Left is 1/8th the color information of the original. Right is 1/16th the color information of the original.
Again, these represent extreme compression, far in excess of the display mode Nvidia has just implemented.

[images: 2PxXum1.jpg (1/8th) | jAwrHNw.jpg (1/16th)]


That very last image (bottom-right) is only using 1064 pixels of information to represent color, whereas the original image used 76500 (that amounts to roughly 72 times less color information)... and yet it still looks decent. It's FINALLY no longer true to the original, but it still looks nice and crisp and sharp. If viewed on its own, most people wouldn't see anything wrong with the bottom-right image.
 
I tried loading up Borderlands 2 and playing it in 4K on my HDTV, and all I can say is that it is so much smoother at 60 Hz than at 30 Hz. It feels much more playable. I couldn't tell a difference in color between 1080p @ 60 Hz and 4K @ 60 Hz on my TV. To be fair, though, I only played for about 10 minutes.
 
I couldn't tell a difference in color between 1080p @ 60 Hz and 4K @ 60 Hz on my TV.

That's because there is no difference in how color is represented when comparing 1080p 4:4:4 @ 60 Hz and 4K 4:2:0 @ 60 Hz :p
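For anyone wondering why that is: each color-difference plane carries exactly the same number of samples in both modes. A quick sanity check in Python:

```python
# 4K 4:2:0 halves chroma resolution in each axis; 1080p 4:4:4 keeps chroma at
# full resolution. Either way, each chroma plane is 1920x1080 samples.
chroma_4k_420    = (3840 // 2) * (2160 // 2)
chroma_1080p_444 = 1920 * 1080
print(chroma_4k_420, chroma_1080p_444, chroma_4k_420 == chroma_1080p_444)
# 2073600 2073600 True
```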
 
That's because there is no difference in how color is represented when comparing 1080p 4:4:4 @ 60 Hz and 4K 4:2:0 @ 60 Hz :p

So if I'm understanding this correctly, I will see no difference in colors unless I'm using either a DisplayPort connection or HDMI 2.0 to the 4K display?
 
So if I'm understanding this correctly, I will see no difference in colors unless I'm using either a DisplayPort connection or HDMI 2.0 to the 4K display?
Pretty much. Here's a quick breakdown of 1080p and 4k modes available to various HDMI versions. The mode added by Nvidia in this driver update is right in the middle of the table (the only one with mismatched luma/chroma):

[table image: qHKEc5x.jpg]


But, as the example images I posted above demonstrate, most people won't notice the loss in color detail (even in cases of extreme reductions in color resolution). It only becomes readily apparent when single-pixel effects are involved (like text on the desktop).
 
Can someone test 4K at 60 Hz over HDMI 1.4 on a Seiki 39" 4K, if anyone has a Kepler 600+ series card, these drivers, and that monitor?
 
No, it doesn't work on that TV; there's a huge thread about it in the display forum you could check.

Also, that chart saying 4K @ 60 Hz with 4:2:0 color encoding is HDMI 1.4 is a bit misleading; 4:2:0 encoding was not part of HDMI 1.4. It was added as part of the HDMI 2.0 spec.
 