New Samsung 4k for everyone.

My opinion: go with a 6000 or 7000 series, so if there is a defect it won't eat at you as much, since the cost was lower. Accept that the display you want doesn't exist at this point, or at least not with manufacturing standards strict enough to ensure 100% quality. It's not like Samsung is going to stop making TVs after this year. There will be QD LCDs next year, and maybe the yields will be more uniform.
 
I wouldn't go back to a 7xxx series because of one dark blue pixel. You might end up with a more noticeable bright pixel AND (in my opinion) less pleasing image quality...although it almost sounds like Z would prefer the image on the 6 and 7 series as it's less vibrant and saturated.

I don't know the details behind the manufacturing process, but I think that 4K displays have so many pixels that it's somewhat common to end up with a pixel defect. He stated it well in his last post:

Zarathustra[H] said:
Six Sigma is still world class, and it results in 3.4 parts per million defective. There are ~8.3 million pixels on a 4k screen. Having just one that is off is a defect rate of 0.12 ppm, which should make any manufacturer of anything pretty proud...

...Consumer displays are typically class II, and they allow for a surprisingly high amount of pixel defects.

0.12 ppm is MANY times lower than 3.4 ppm, the world-class standard for defects!
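For what it's worth, the arithmetic checks out; a quick back-of-the-envelope sketch in Python (assuming a standard 3840x2160 UHD panel):

```python
# Back-of-the-envelope check of the defect-rate claim above.
# A UHD panel is 3840 x 2160 pixels (~8.3 million).
pixels = 3840 * 2160          # 8,294,400 pixels
defective = 1                 # one stuck/dead pixel

ppm = defective / pixels * 1_000_000
print(f"{pixels:,} pixels, {ppm:.2f} ppm defective")   # ~0.12 ppm

six_sigma_ppm = 3.4           # classic Six Sigma long-term defect rate
print(f"{six_sigma_ppm / ppm:.0f}x better than Six Sigma")  # ~28x
```

So a single bad pixel is roughly 28 times better than the Six Sigma benchmark, per panel.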

I could be wrong, but I don't think that even the JS series displays are high end enough that Samsung is using Class I panels in them. That's just a guess, though. Maybe they should be using them, and Samsung is using Class II panels to increase profit margins. Like I said before, the vast majority of people are going to be using these as televisions, not monitors. And with 4K being conducive to small pixel sizes, most people would never notice a defect or two. Hence Samsung is OK with letting panels leave the factory with more defects than Six Sigma levels would allow. It makes sense. We are just more discriminating because we're using them up close as monitors.
 
@Zarathustra[H]

Do you have anywhere you can get the display you want locally? If so, see if they will work with you on bringing your computer in and giving one a try (probably right before closing or when they open).

If not that then personally I would keep the display. It sounds like it isn't too bad and it really will come down to if you can live with it or not. I know how an issue can eat at you, so I think you will need to answer that question first and foremost - can you get to a place where you are okay with it?

It is easy to focus on the issue to the exclusion of all else. Maybe write up a pros and cons list to help with making the decision.

Good luck and let us know.
 
Zarathustra[H];1041715196 said:
5.) What would you guys do? Maybe I've just been lucky in the past, but over the countless LCD screens I've had over the years (granted, none of them 4k) I've never had a new one with a bad pixel

I mentioned recently that I also got a stuck bright pixel on my new 7100 (purchased from a BestBuy store). I could have decided to go to the store for a return, but the few hours round-trip and the uncertainty of what the new screen might end up like, steered me away.
I do see it on a pure black screen when I look for it, but I find the last few days I notice it less, likely because I am getting tired of looking for it :)

I haven't tried JScreenFix (maybe I should?), but I did try Undead Pixel as well as another web-based LCD repair page, and had no luck.

For me I think I am just going to live with it, and hopefully notice it less often as I get used to it.

FWIW, I went through 3 flat 6700/7100 sets and none had a bad pixel. I think the curved and/or quantum dot models have a higher chance of bad pixels, based on the number of issues I've read about on the forum.

Well, if we are counting, to add my 2 cents: mine is a 7100 (flat screen) and has one stuck pixel. Not that it changes the fact that curved displays might or might not have more issues, and I love the screen uniformity and the lack of any backlight bleeding that I can see, but I thought I'd add one more to the statistics :)

My opinion.. Go with a 6000 or 7000 series, so if there is a defect it won't eat at you as much since the cost was lower. Accept that the display you want doesn't exist at this point

Funny how I had that same thought process when I was deciding which model to get. Reading on various forums about the JS8500 / 9000, and the issues some people were having with them, made me worried about playing the lottery here and ending up the "winner" of an expensive toy that is not perfect, which would likely have bugged me more than my 7100's stuck pixel currently does.

That said, I would probably not recommend going back to a lower series if you have already tried the 8500 / 9000; I am thinking the drop in picture quality you were used to might then also be a problem, much like going from a bigger screen to a smaller one, or a higher resolution to a lower one. For my part, I have not tried the 8500 / 9000 at home, so I will just remain ignorant and happy :)
 
Zarathustra[H];1041714875 said:
I would still argue that at normal viewing distances, 4k makes at best marginal sense for TV/film content. My 60" Panny 1080p plasma won't be leaving my living room any time soon, but as a computer screen it is a damned thing of beauty


This is all that really needs to be said, especially the nice plasma colors. Games are far more akin to video than computer text.

Resolution these days is basically what sells more expensive GPUs, so it's no surprise what's being pushed by marketing departments, no surprise what folks believe, and no surprise said folks get angry when this is pointed out, because accepting it also implies accepting that they're easily influenced.

Game houses spend increasing millions on very complicated code and art to marginally improve realism when they could just sit around and wait for a res bump. God, they're so dumb, right?


=============
I was very close to buying this TV but decided to see what happens before going on vacation a couple of months back, and probably should've unsubscribed from the thread. Evidently the market is changing quickly, and it'll be interesting to see who drops the first non-premium 4k usable as a monitor.
 
You're mixing two concepts: detail and resolution. Detail increases realism, be it more polygons, improved processing, increased tessellation, etc. Resolution also increases quality, but in a different way: higher resolution with higher-resolution textures looks better than lower resolution with lower-resolution textures. For example, a movie at 720p has more detail and more realism than a Pixar movie. However, a Pixar movie, even though it has less realistic detail, looks smoother, clearer, and in some ways better than movies filmed in 35mm format. Any movie in 480i is clearly inferior to the same movie in HD, yet they're both the same movie with the same types of realistic details.

Your problem is that you don't have said unit and have nothing to compare it to with your own eyes, yet you keep spouting off nonsense and your preconceived notion of what is real or not. You're also accusing people of being idiots and sheep because what they actually experience clashes with your notion of realism. You accuse people of falling for marketing because they're too dumb to decide for themselves. You should probably unsubscribe from this thread. Instead of admitting that you can't judge something without seeing it yourself, you're clearly trolling at this point with your abrasive attitude and weird accusations. Everyone in this thread can't all be dumb while you're the only person who can see the light.
 
It's very unlikely I'm conflating anything, given I used to do this for a living rather than read about it in PR releases. The "enthusiast" nomenclature is pretty apt when folks get excited over the little things. Review sites like HardOCP get free product to push articles for ad views, and everyone gets excited about inconsequential 10% performance differences of whatever latest overclocked card with a wing on it. AMD & Nvidia even have fanbois arguing about processors they can't even begin to understand. The ecosystem works and everyone's happy until some curmudgeon points out the external reality of it all.

It's also notable that pro cinematography itself has long since evolved past specs. Cine lenses on modern 35mm might as well be infinite resolution compared to 720/1080p, but art staff do a lot to diminish the technical merits for aesthetic appeal; the same is true for digital 4k cameras, whose output is considered so clean it gets the "soap opera" label. Part of the appeal of blurry 24fps film is its dreamy visual nature. This type of artistic merit is still in its infancy in computer games, and I expect quite a bit of bitching from g4m3rz when the hobby is snatched away from them by appeals to the "mainstream", who care more about the whole product than specific numbers.
 
Sooo... how 'bout that JS9000, boys? Ain't she fancy? :D

Isn't that what we were talking about before this shit went way off the rails?
 
What's with all the offtopic drivel? Some guys need to take their need to sound smart to PMs.
 
Sooo... how 'bout that JS9000, boys? Ain't she fancy? :D

Isn't that what we were talking about before this shit went way off the rails?

We were. I fail to see how diatribes such as the one above add anything to the discussion.

Whether or not those observations are true, they don't add anything of value here. I don't think this is the right place for this. There are tons of 4K threads here, and on the internet as a whole. Why shit up this one with crap that no one cares about? We want to talk about the Samsung 4K displays, not pontificate about the state of the GPU industry.
 
Hi guys,

Was after some feedback on settings for PC Mode / Game Mode for JS9000 connected to Nvidia based gaming PC. I normally use a 40" Philips 4K monitor over Displayport at my desk and got the JS9000 for controller based gaming in the loungeroom (wheel the PC in and play!)

I have been really impressed with the handling of all content via the JS9000 including media via One Connect Box, upscaling of SD material on Digital TV and playing games like Batman on the PS4.

However, on the PC side I have mixed feelings, and this will come down to tuning settings, I'm sure. I played two games for some time that I normally play on the Philips: GTA V and Witcher 3. GTA V I preferred on the JS9000; colors had more 'pop' and the game was very fluid with good motion in GAME mode. Witcher 3 was a bit "meh" compared to the Philips. I didn't like the colors and the contrast, though this is most likely down to the settings I tried. The Philips has excellent contrast and my screen is also calibrated (via Spyder Elite, using MCW to ensure a persistent color profile during gaming). I know the Samsung has excellent color reproduction capability as well, and contrast levels can be tuned.

I had the JS9000 set up in Game mode (which I believe does 4:2:2) to reduce lag; however, I'd be happy to use PC mode (4:4:4) as its ~56 ms of lag would be fine for a controller-based game. I did try various output methods and settings (YCbCr / RGB with the correct HDMI output level, etc.)

Was wondering if anyone could detail their settings to get good contrast and output from an Nvidia-powered gaming PC (I have 2 x 980 Tis in SLI) to this screen. Obviously a professional calibration would be the way to go, but I was looking to improve things, especially for Witcher 3, to get it closer to the Philips in contrast if possible, even at the expense of lag.

Any ideas or help / nvidia control panel / samsung settings much appreciated!

Ta!

Reklaw
 
Hi guys,

Was after some feedback on settings for PC Mode / Game Mode for JS9000 connected to Nvidia based gaming PC. I normally use a 40" Philips 4K monitor over Displayport at my desk and got the JS9000 for controller based gaming in the loungeroom (wheel the PC in and play!)

I have been really impressed with the handling of all content via the JS9000 including media via One Connect Box, upscaling of SD material on Digital TV and playing games like Batman on the PS4.

However, on the PC side I have mixed feelings, and this will come down to tuning settings, I'm sure. I played two games for some time that I normally play on the Philips: GTA V and Witcher 3. GTA V I preferred on the JS9000; colors had more 'pop' and the game was very fluid with good motion in GAME mode. Witcher 3 was a bit "meh" compared to the Philips. I didn't like the colors and the contrast, though this is most likely down to the settings I tried. The Philips has excellent contrast and my screen is also calibrated (via Spyder Elite, using MCW to ensure a persistent color profile during gaming). I know the Samsung has excellent color reproduction capability as well, and contrast levels can be tuned.

I had the JS9000 set up in Game mode (which I believe does 4:2:2) to reduce lag; however, I'd be happy to use PC mode (4:4:4) as its ~56 ms of lag would be fine for a controller-based game. I did try various output methods and settings (YCbCr / RGB with the correct HDMI output level, etc.)

Was wondering if anyone could detail their settings to get good contrast and output from an Nvidia-powered gaming PC (I have 2 x 980 Tis in SLI) to this screen. Obviously a professional calibration would be the way to go, but I was looking to improve things, especially for Witcher 3, to get it closer to the Philips in contrast if possible, even at the expense of lag.

Any ideas or help / nvidia control panel / samsung settings much appreciated!

Ta!

Reklaw

I personally run in PC mode, as I do not perceive the additional lag but I do see the 4:2:2 artifacts in GAME mode. If you use GAME mode the settings are the same, but set Sharpness to 0. I own both the 48JU7500 and the 55JS9000 I upgraded to. The settings between them are almost the same.

In PC mode for the 55JS9000:

Back-light: 20
Contrast: 100
Brightness: 45
Sharpness: 50
HDMI UHD color: on
Color tone: Warm1

You can either use RGB output set to full 0-255 (in NVIDIA control panel) and set the HDMI black level: Normal in TV or you can set the NVIDIA control Panel to output YCbCr444. The screen result is identical between the two.

YCbCr is the preferred method because in RGB mode the TV will reset the HDMI black level to auto (which is the same as low) every time it is powered off. It's rather annoying.

In HDMI black level low/auto, the TV actually displays colors as 16-235. Any value 16 or below is displayed as pure black (a complete absence of light). Any value 235 or higher is displayed as pure color at max brightness. That is the TV standard for HDMI, but computer monitors like to have the full 0-255.

In HDMI black level normal (and YCbCr) the TV displays the full gamut 0-255 of each Red, Green and Blue.
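To make the two modes concrete, here's a rough sketch of the expansion a TV applies in low/auto mode (my own illustration of the math, not Samsung's actual processing):

```python
def limited_to_full(v):
    """Expand a video-range (16-235) value to PC full range (0-255).
    Values at or below 16 clip to black, at or above 235 to white."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

# In low/auto mode the TV applies something like this expansion:
print(limited_to_full(16))    # 0   -> video black becomes pure black
print(limited_to_full(235))   # 255 -> video white becomes full white
print(limited_to_full(10))    # 0   -> "blacker than black" is crushed
```

In normal mode (or with YCbCr output from the GPU carrying a matching range), no such stretch is needed and every step from 0 to 255 stays distinct.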

When you switch from low/auto to normal, it will look like the screen is "hazy", because in low/auto it's hugely oversaturated at the settings given above. Just give it a minute and your eyes will adjust.

Feel free to test it yourself. Black crush:

http://www.lagom.nl/lcd-test/black.php

On a Samsung TV you should be able to see boxes 3 or 4 as slightly grey. If you switch to HDMI black level low/auto, you will see that boxes 0-15 become black.

White crush (you should be able to see the difference between bars 31 and 32):

http://www.lagom.nl/lcd-test/contrast.php
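If you want an offline version of those patterns, a strip like the lagom black test can be generated with nothing but the Python standard library (the plain PPM format and the filename are my choices here):

```python
# Write a simple black-crush test strip as a PPM image:
# 16 boxes at grey levels 0..15, each 64x64 px, side by side.
# Any box you cannot distinguish from level 0 is being crushed.
box, levels = 64, list(range(16))
width, height = box * len(levels), box

rows = []
for y in range(height):
    row = bytearray()
    for x in range(width):
        g = levels[x // box]          # grey level for this column of boxes
        row += bytes((g, g, g))       # R, G, B all equal -> neutral grey
    rows.append(bytes(row))

with open("black_crush.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (width, height))  # binary PPM header
    f.writelines(rows)
```

Open the result full-screen in any image viewer that supports PPM; on a correctly configured setup the level-1 box should be just barely distinguishable from level 0.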

The JS9000 has much better green and red (the difference in red is especially dramatic) compared to the JU7500. That is why I run color tone Warm1 on the JS9000. On the JU7500 you have to run the Warm2 setting to compensate for too much blue (and lackluster green/red).

Otherwise the settings are the same:

JS9000: Warm1
JU7500: Warm2
 
@Ziran

Nice detailed post! Gonna capture that into Evernote so I don't have to search for it.

Thanks!
 
Hi Ziran,

Thanks for the very detailed response :)

I'll try your settings tonight however had another couple of questions.

1. I assume you're gaming in a bright room, given the high backlight setting? Or do you like it bright and game in the dark? I normally game in a dark room for immersion myself; however, I do have the Philips at 100% brightness as well and play in a dark room with that.

2. What about the other settings, like Dynamic Contrast, and what color space do you use (Native / Custom / Auto)? I believe that affects the color reproduction. Any other settings I should be aware of (e.g. around screen fit etc.) would be much appreciated :)

Anyway, ta muchly for the settings. I'll set it to YCbCr444 and use PC mode with your settings, try the test patterns you linked, and see how I go with Warm1 :)

Ta man,

Reklaw
 
1. Yes, I use these settings in a normally lit room. Backlight 20 minimizes the PWM (and also gives you the most vibrant colors possible for this panel). You can lower the backlight if it is too bright. Lowering contrast from 100 will also effectively make the screen darker. I would not lower the brightness setting, because then blacks get crushed. Increasing brightness above 45 will start turning black into grey, which does not look good either.

2. You will not have Dynamic Contrast in PC mode (I leave it disabled in GAME mode). The other settings are for color calibration, for which I do not have the hardware, so I just leave them at defaults.

Feel free to experiment. Ultimately what matters is how it looks to you. Everyone tweaks the settings to their preference.
 
Does anyone know if these monitors will need a voltage converter in order to safely plug into a 220 V socket? Or are they like most laptops, which accept a fairly wide range of voltages?

I'm in Thailand and know that at minimum I will likely need a plug adapter. I'm hoping to skip the converter, though.
 
We were. I fail to see how diatribes such as the one above add anything to the discussion.

Whether or not those observations are true, they don't add anything of value here. I don't think this is the right place for this. There are tons of 4K threads here, and on the internet as a whole. Why shit up this one with crap that no one cares about? We want to talk about the Samsung 4K displays, not pontificate about the state of the GPU industry.

God forbid a 200 page thread mentions practical application of the subject within.
 
Does anyone know if these monitors will need a voltage converter in order to safely plug into a 220 V socket? Or are they like most laptops, which accept a fairly wide range of voltages?

I'm in Thailand and know that at minimum I will likely need a plug adapter. I'm hoping to skip the converter, though.

These sell in Europe, and I believe mains there is 220 V. Most computer devices will take 110 V or 220 V, and I believe this is the same. However, why not contact Samsung? I'm sure this is covered in their support database.
 
1. Yes I use these settings in a normally lit room. The back-light 20 minimizes the PWM (and also gives you the most vibrant colors possible for this panel). You can lower back light if it is too bright.

Wow. I go back and forth between 7 and 8 on the backlight, but usually wind up sticking with 7. I find anything brighter feels like it is going to scorch my eyeballs out. When I first turn on the screen with default settings (which I have done three times now :p ) I find the need to protect my eyes because it is so damned bright, until I turn everything down.

As far as PWM goes, I must not be sensitive to it. I was concerned before I bought the JS9000 because it has been mentioned here so much, but even after reading up on it and listening to people's descriptions of PWM flicker, I just can't notice it no matter what I do. Someone said to wave my hand back and forth in front of the screen and look for flicker. I tried that and didn't see any, but even if I had, I don't understand why it would matter, as no one uses a screen that way. You want nothing obstructing your view :p

Lowering contrast from 100 will also effectively make the screen darker. I would not lower brightness setting because then blacks get crushed. Increasing brightness above 45 will start turning black into grey which does not look good either.

I haven't tried changing the contrast settings, but fully agree with the brightness setting. It ships at 45, and that's a good setting for it. Turn it down and the shadows are just indiscernible and black, turn it up, and the black quality is diminished.
 
Got a question. On my old GTX 670, in Witcher 3, with the gamma slider low, I could barely make out the picture in the gamma setting. Now with my 980 Ti, I have to push gamma all the way to the right to make out the picture. What changed between the 670 and the 980 Ti? It appears I've lost a ton of contrast upgrading to the 980 Ti.
 
Dunno, but I have to push the slider all the way to the left, and even then it's too bright on my 980.
I also turn the brightness down a tad.
Strangeness.
 
Got a question. On my old GTX 670, in Witcher 3, with the gamma slider low, I could barely make out the picture in the gamma setting. Now with my 980 Ti, I have to push gamma all the way to the right to make out the picture. What changed between the 670 and the 980 Ti? It appears I've lost a ton of contrast upgrading to the 980 Ti.

Did you remove all traces of drivers before upgrading and then reinstall, or did you just swap the cards?

No idea if that'll make a difference, but just a thought.
 
Did you remove all traces of drivers before upgrading and then reinstall, or did you just swap the cards?

No idea if that'll make a difference, but just a thought.

I did. It's really weird. I may need to double check my HDMI level when I get home. It appears I'm getting black crush now that I didn't before. I've lost my VA-ness, for lack of a better word.
 
I did. It's really weird. I may need to double check my HDMI level when I get home. It appears I'm getting black crush now that I didn't before. I've lost my VA-ness, for lack of a better word.

That is odd. Same input port? UHD color is set per port; I only ever enabled it on HDMI 1 on mine.

Did you set it to YCbCr444 mode in the Nvidia resolution settings panel?

I also wonder if the TV somehow knows that you have connected a new device and has given you fresh picture settings because of it. Are your picture settings the same as before?
 
Got a question. On my old GTX 670, in Witcher 3, with the gamma slider low, I could barely make out the picture in the gamma setting. Now with my 980 Ti, I have to push gamma all the way to the right to make out the picture. What changed between the 670 and the 980 Ti? It appears I've lost a ton of contrast upgrading to the 980 Ti.

Do you have an ICC profile you're using from the old card, or do you need to recalibrate your monitor?
 
Ok guys, I'm dumb. The TV must have detected that the GPU is different and it reset the HDMI level to auto. It's working as before now.
 
> As far as PWM goes, I must not be sensitive to it. was concerned before I bought the JS9000 because it has been mentioned here so much, but even after reading up about it,

Only the 6xxx have the seriously low PWM frequency problem. To clarify, they're all low frequency, but the higher models approximate a sinusoid instead of a square wave, which should be significantly better.
 
Only the 6xxx have the seriously low PWM frequency problem. To clarify, they're all low frequency, but the higher models approximate a sinusoid instead of a square wave, which should be significantly better.

Yeah indeed, I wonder why Samsung did it.

6 series
http://www.rtings.com/images/reviews/ju6500/ju6500-backlight-large.jpg
http://www.rtings.com/images/reviews/ju6700/ju6700-backlight-large.jpg

7, 8, 9 series
http://www.rtings.com/images/reviews/ju7100/ju7100-backlight-large.jpg
http://www.rtings.com/images/reviews/js8500/js8500-backlight-large.jpg
http://www.rtings.com/images/reviews/js9500/js9500-backlight-large.jpg

Different duty cycle on the 6 series, very abrupt.

The Sony X850C is PWM-free but has higher lag and can't do 4:4:4 @ 60 Hz.

http://www.rtings.com/images/reviews/x850c/x850c-backlight-large.jpg
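A toy model makes the difference plain; this is my own simplification of those scope traces, comparing the largest sample-to-sample brightness jump a square-wave PWM makes against a sinusoidal ripple at the same average level:

```python
import math

N = 1000  # samples over one modulation period

# Square-wave PWM at 50% duty: brightness slams between 0 and 1.
square = [1.0 if i < N // 2 else 0.0 for i in range(N)]

# Sinusoidal modulation around the same 0.5 average brightness.
sine = [0.5 + 0.5 * math.sin(2 * math.pi * i / N) for i in range(N)]

def max_step(wave):
    """Largest sample-to-sample jump: a crude proxy for flicker harshness."""
    return max(abs(a - b) for a, b in zip(wave, wave[1:]))

print(f"square max step: {max_step(square):.4f}")  # 1.0000
print(f"sine   max step: {max_step(sine):.4f}")    # ~0.0031
```

Same average brightness, same frequency, but the sinusoid never makes an abrupt transition, which matches why the 7/8/9 series traces look so much gentler than the 6 series.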
 
That's very interesting, and totally new information. It's probably the reason why Brahmzy, who is PWM-sensitive, mentioned he felt much better after he upgraded to the 7500.
 
Yeah indeed, I wonder why Samsung did it.

The other effect, and perhaps the main reason, is reducing tracking artifacts (which you can see a bit in the same rtings test). Initially it looked like the 6xxx vs 7xxx difference might be due to a different panel, which seemed odd given how similar the sets are, but it made sense after they scoped the backlight. Ostensibly it's a bit more complicated to control a DC level on the backlight instead of a straight-up pulse.

Low PWM isn't as big of a deal with video/games, since the screen is dark by default instead of white, and you're not staring at the same spot on the screen as much.
 
Zarathustra[H];1041717617 said:
Wow. I go back and forth between 7 and 8 on the backlight, but usually wind up sticking with 7. I find anything brighter feels like it is going to scorch my eyeballs out. When I first turn on the screen with default settings (which I have done three times now :p ) I find the need to protect my eyes because it is so damned bright, until I turn everything down.

As far as PWM goes, I must not be sensitive to it. I was concerned before I bought the JS9000 because it has been mentioned here so much, but even after reading up on it and listening to people's descriptions of PWM flicker, I just can't notice it no matter what I do. Someone said to wave my hand back and forth in front of the screen and look for flicker. I tried that and didn't see any, but even if I had, I don't understand why it would matter, as no one uses a screen that way. You want nothing obstructing your view :p

I haven't tried changing the contrast settings, but fully agree with the brightness setting. It ships at 45, and that's a good setting for it. Turn it down and the shadows are just indiscernible and black, turn it up, and the black quality is diminished.

That is strange. I am sensitive to bright lights (I have to wear dark glasses on a sunny day out) but I am not bothered by backlight 20. But I guess everyone is different.

I also don't really run any full-screen white-background applications. A full-screen pure white/red/green/blue is a little too bright, but in normal applications/games it works just fine and leaves room for bright highlights.
 
Hi again Ziran,

Tried the settings you gave me last night and found they worked really well. I found Witcher 3 much more to my preference in PC Mode 4:4:4 with Warm1 etc., and couldn't notice any additional lag over GAME mode when playing with the controller. So all in all, really happy now; this will be a great TV for big-screen gaming :)

Ta muchly,

Reklaw
 
So with all this mention of YCbCr or RGB and HDMI Black Level, I became curious and found some interesting articles:

RGB Full vs Limited
Correcting HDMI Colour on Nvidia and AMD GPUs
Samsung Black Level and Nvidia Settings
HDMI Enhanced Black Levels, xvYCC and RGB

Interesting bits pulled from the links:
TVs use a video range of 16-235. A TV considers levels below 16 to be black, and information above 235 to be white. A calibrated TV will never display anything below 16 as anything other than black. Most will also treat everything over 235 as white, since it should not exist in video content.

PCs are different and use a range of 0-255. There is no data below 0 or above 255 with an 8-bit video signal, as there are only 256 possible values. In short, this is much simpler to understand, as the TV concepts of blacker-than-black and whiter-than-white do not exist.

Nvidia/AMD may need tweaking to enforce full RGB over HDMI. Note: I think Nvidia fixed this in their drivers recently.

Samsung's HDMI Level Normal/Low may have been, and may still be, swapped (i.e. Normal is Limited Range (video spec) and Low is Full Range (PC spec)). Evidently there are quite a few AVSForum posts about this.

The HDMI 1.3 Specification states that: "Black and white levels for video components shall be either “Full Range” or “Limited Range.” YCbCr components shall always be Limited Range while RGB components may be either Full Range or Limited Range. While using RGB, Limited Range shall be used for all video formats defined in CEA-861-D, with the exception of VGA (640x480) format, which requires Full Range."
This implies that when you force YCbCr from your video card over HDMI, you are forcing limited range, even though technically YCbCr can output full range.
FYI, it does not appear this portion of the specification has changed from 1.3 to 2.0. Still attempting to track this down, but it appears they added the capability for full-range "YCbCr" in 1.3 as xvYCC.

What does all this mean? Well, my takeaway is: configure your display/video card the way that looks best to you and be done with it. In my case, that is a lower contrast (not 100, more like 80), HDMI Level on Low, and Nvidia set to RGB/Full. Your mileage may vary; void where prohibited.
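To see why a range mismatch produces exactly the black crush and "hazy" symptoms described in this thread, here's a toy model of the two conversions (my own sketch, not any driver's actual code):

```python
def expand(v):
    """TV set to low/auto: expects 16-235, stretches it to 0-255."""
    return round((min(max(v, 16), 235) - 16) * 255 / 219)

def compress(v):
    """GPU outputting limited range: squeezes 0-255 into 16-235."""
    return round(16 + v * 219 / 255)

# Matched: limited-range signal into a TV expecting limited range.
print(expand(compress(0)), expand(compress(255)))  # 0 255 -> correct

# Mismatch 1: full-range PC signal, TV expects limited -> black crush.
print(expand(0), expand(12))                       # 0 0 -> shadows merge

# Mismatch 2: limited signal, TV expects full -> grey blacks, dim whites.
print(compress(0), compress(255))                  # 16 235 shown as-is
```

Either matched pairing looks right; it's only the mismatched combinations that crush shadows or wash the picture out, which is why "set both ends the same way" is the real rule.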
 
Ok guys, I'm dumb. The TV must have detected that the GPU is different and it reset the HDMI level to auto. It's working as before now.

Samsung TVs always set HDMI black level to auto every time you turn TV off. It is very annoying. That is why we use the YCbCr mode.
 
Samsung TVs always set HDMI black level to auto every time you turn TV off. It is very annoying. That is why we use the YCbCr mode.

I am not seeing that behavior with the current firmware on an Nvidia 9xx series with current drivers.
 
So with all this mention of YCbCr or RGB and HDMI Black Level, I became curious and found some interesting articles:

RGB Full vs Limited
Correcting HDMI Colour on Nvidia and AMD GPUs
Samsung Black Level and Nvidia Settings
HDMI Enhanced Black Levels, xvYCC and RGB

Interesting bits pulled from the links:
TVs use a video range of 16-235. A TV considers levels below 16 to be black, and information above 235 to be white. A calibrated TV will never display anything below 16 as anything other than black. Most will also treat everything over 235 as white, since it should not exist in video content.

PCs are different and use a range from 0-255. There is no data below 0 or above 255 with an 8-bit video signal as there are only 256 possible values. In short, this is much simpler to understand as the TV concepts of Blacker-than-Black and Whiter-than-White do not exist.

Nvidia/AMD may need tweaking to enforce full RGB over HDMI. Note: I think Nvidia fixed this in their drivers recently.

Samsung's HDMI Level Normal/Low may have been, and may still be, swapped (i.e. Normal is Limited Range (video spec) and Low is Full Range (PC spec)). Evidently there are quite a few AVSForum posts about this.

The HDMI 1.3 Specification states that: "Black and white levels for video components shall be either “Full Range” or “Limited Range.” YCbCr components shall always be Limited Range while RGB components may be either Full Range or Limited Range. While using RGB, Limited Range shall be used for all video formats defined in CEA-861-D, with the exception of VGA (640x480) format, which requires Full Range."
This implies that when you force YCbCr from your video card over HDMI, you are forcing limited range, even though technically YCbCr can output full range.
FYI, it does not appear this portion of the specification has changed from 1.3 to 2.0. Still attempting to track this down, but it appears they added the capability for full-range "YCbCr" in 1.3 as xvYCC.

What does all this mean? Well, my takeaway is: configure your display/video card the way that looks best to you and be done with it. In my case, that is a lower contrast (not 100, more like 80), HDMI Level on Low, and Nvidia set to RGB/Full. Your mileage may vary; void where prohibited.

Run an image test... a color palette... something. There is no confusion: HDMI Low = limited range. Dark colors are crushed to straight black, and at the other end color values are boosted, increasing vibrance but losing detail. HDMI should be set to Normal.

Samsung TVs always set HDMI black level to auto every time you turn TV off. It is very annoying. That is why we use the YCbCr mode.

Mine does not reset when running in RGB mode?
 
I'm pretty disappointed in LG's pricing for OLED so far. They need to get it down or they won't be making many sales. Even at $2k, that's a lot for a 50-inch, and the input lag numbers on all of the OLED models so far are terrible. I LOVE OLED though, and I hope they sort this stuff out ASAP.

The input lag isn't terrible. It's a hair worse than the Samsung TVs', but it's definitely not terrible.

Gaming at higher res is largely a pointless marketing exercise. No game looks anywhere near as good as something which "looks good" (e.g. 1080p film), and increasing res for perceived sharpness does absolutely nothing to bridge that gap.

If perceived sharpness is the goal then the sharpness control provides 80% results for <1% of the cost.

OTOH oftentimes ownership happiness has a price, which in this case is the cost of an $$ graphics card. YMMV.

The sharpness control isn't doing crap. You can't force pixels to be there when they aren't. 4K makes even old games look fantastic.
 