Samsung UN40KU6300 40-Inch 4K

I've noticed it on mine but it's not too bad really. My last panel had a glossy finish, so this is a big difference, but it's not really bothering me much. In all honesty, I notice it more when the screen is off during the day than anything else.
 
I notice no color shift, but maybe I'm used to it since I'm sure it was worse on my 8-year-old Samsung. I'm glad my old 22" monitor still works, as I can grade pretty mechanical translations while catching up on Westworld. I'm one more 22" monitor on the left side away from having a proper holodeck :D

As1YQdK.jpg
 
ran into an issue with "overscan" earlier today. since buying this TV 4 months ago, 90% of my usage has been PC. playing FF15 on the PS4 today, i noticed some menu item names were slightly cut off. i checked the screen settings and it was set to 16x9 "default". i changed it to "custom" and the screen instantly changed to pixel match with no overscan. this is not an issue in PC mode, but if any of you guys are using other devices, make sure you double-check this setting.
 
Yeah, I noticed the overscan issue on my PS4 Pro, but thankfully it asks you to adjust that as part of the initial setup. It seems silly that manufacturers still do this when everything is digital now.
 
Yeah, I guess I'm just not used to seeing this type of AR coating since my previous monitors all had matte finishes. It's probably my only real gripe with this TV, but I think it's something I can get used to. I'm also having some confusion about which setting to use in the Nvidia control panel, as well as the HDMI black level setting on the TV. When I use RGB with HDMI black level on Automatic (which looks to be the same as Low), the blacks get crushed on the Lagom black level test and everything above the bottom row is the same shade of black. When I change the HDMI black level to Normal, I can distinguish the different shades, but the colors look a bit too washed out. YCbCr444 seems to give me the same result as RGB with HDMI black level on Normal. I also plan on using my PC for watching Blu-rays, which I believe use a limited range, so I'm a bit confused as to which setting to use here.
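For anyone else hitting this: the crushed/washed-out behavior is the classic full-vs-limited range mismatch. Full-range PC output uses 0-255, while video levels use 16-235, and the black level setting controls which one the TV expects. Here's a rough Python sketch of the arithmetic (the near-black test values are made up for illustration, roughly like the Lagom test pattern):

```python
# Sketch of why a full/limited range mismatch crushes or washes out blacks.
# Full range maps black..white to 0..255; limited (video) range uses 16..235.

def full_to_limited(v: int) -> int:
    """Compress a full-range 0-255 value into the limited 16-235 range."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    """Expand a limited-range 16-235 value back to full 0-255 (clamped)."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

# Matched pairing: limited-range signal, TV expands it -> shades survive.
shades = [0, 4, 8, 12]                        # near-black test values
sent = [full_to_limited(v) for v in shades]   # [16, 19, 23, 26] on the wire
shown = [limited_to_full(v) for v in sent]    # [0, 3, 8, 12] -- still distinct

# Mismatch: full-range signal, but the TV assumes limited and expands anyway.
crushed = [limited_to_full(v) for v in shades]  # [0, 0, 0, 0] -- all black
print(shown, crushed)
```

The takeaway is that neither setting is "correct" on its own; the fix is making the GPU's output range and the TV's expectation agree on both ends.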
 
SamsungDesk_otc8g1as2d.jpg


Well, this thing is enormous.

Got the 6290 a few days ago, and I have the same question as Savoy had a few pages back. This is on Windows 10 with a GTX 1070.

When using the HDMI UHD Color mode for my PC, I can't turn it off without the TV losing the connection and just reporting HDMI No Signal. The only way I can get out of UHD Color and see the desktop is to physically unplug the HDMI connection and plug it back in. Does anyone with an AMD card have this problem, or is it a TV issue?

The reason I'm trying to get out of that mode for games is to get access to the LED Clear Motion setting, which is not available in UHD Color mode. This TV actually strobes at 60Hz! The increase in motion clarity is big compared to typical sample-and-hold displays. Samsung is so close to having the 6290/6300 be usable in this mode, but the input lag is crazy high and there seems to be no real pixel overdrive, so pixel transitions (http://i.rtings.com/images/reviews/ku6300/ku6300-response-time-large.jpg) overlap onto the next frame. I don't know if it's possible, but if Samsung could publish a firmware update to tweak the pixel overdrive and allow LED Clear Motion in Game mode, we'd have a feature that many gaming monitor manufacturers are entirely unwilling to offer: strobing at 60Hz.
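To put rough numbers on why strobing helps: on a sample-and-hold display the image persists for the whole frame, so your eye smears it across however far it tracks in that time. A quick back-of-envelope sketch (the 2 ms flash length is just my guess at a typical strobe duration, and the tracking speed is arbitrary):

```python
# Back-of-envelope motion blur from display persistence. Rough model:
# perceived smear ~ persistence * eye-tracking speed (pixels per second).

def blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Approximate smear width in pixels for a tracked moving object."""
    return persistence_ms / 1000.0 * speed_px_per_s

speed = 960.0                                # object panning at 960 px/s
full_hold = blur_px(1000.0 / 60.0, speed)    # 60Hz sample-and-hold: ~16 px
strobed = blur_px(2.0, speed)                # 2 ms strobe flash: ~2 px
print(full_hold, strobed)
```

That order-of-magnitude reduction in smear is why even a 60Hz strobe looks so much sharper in motion, assuming the pixel transitions are fast enough to finish inside the dark period (which, per the rtings response-time chart, this panel's aren't).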

The brightness does drop significantly when using LED Clear Motion, but with the backlight set to 20, I measured a maximum light output of 139.61 cd/m^2. With a black measurement of around 0.09 cd/m^2, that gives roughly a 1550:1 contrast ratio. It's a downgrade from 5000:1 for sure, but I'd be willing to make that trade-off for motion clarity. And maybe it's because I'm used to a CRT, but the flickering in this mode is only really aggravating when you're looking at a full white screen; for most video/game content, you get used to it pretty fast.
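For reference, the contrast figure is just the ratio of those two meter readings:

```python
# Contrast ratio from the white and black luminance readings above.
white = 139.61  # cd/m^2, max output with LED Clear Motion, backlight 20
black = 0.09    # cd/m^2, measured black level

ratio = white / black
print(f"{ratio:.0f}:1")  # roughly 1550:1
```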

But back to UHD Color: I have a colorimeter and have spent some time trying to calibrate the screen to Rec.709. Unfortunately, in this mode we only have access to the 2-point white balance adjustment, so I can't get it perfect, but I think I've got it pretty close. The standard caveat applies: each screen may be different, though I wouldn't think different enough that these settings wouldn't at least be a good starting point.

Picture Mode: Standard

Expert Settings:

Backlight: 10
Brightness: 61
Contrast: 83
Sharpness: 50

HDMI UHD Color: ON
HDMI Black Level: Auto

Color Tone: Warm2

White Balance (2 Point):
R-Offset: 2
G-Offset: 3
B-Offset: -15
R-Gain: -2
G-Gain: 6
B-Gain: 2

Gamma: -1

With those settings, here were the results:

SamsungGamma_n5gih0bdji.png


The white line is the BT.1886 reference, yellow is the measured gamma, and cyan is the average. Gamma tracks BT.1886 pretty well; it's a bit low throughout, but the Samsung's gamma adjustment is not particularly fine-grained.

SamsungRGB_boqszqti3k.png


This tracks color levels throughout, and the magenta line is Delta-E. The Samsung has problems with the blue channel near black. Delta-E stays low throughout, but I wish we could use the 10-point correction here.

SamsungTemp_ngnwl23u28.png


And as you could anticipate from the previous graph, color temperature stays pretty consistent at 6500K except at the low end.

I kept the backlight level at 10 for this calibration; that gives a brightness of 189.830 cd/m^2. If you want to adjust downward, backlight settings of 5 and 6 give 100 and 120 cd/m^2 respectively.

One last note: I want to echo what remy199 said about the screen coating. If you're close to the screen, any white viewed from an off-center angle has a slightly distracting rainbow-speckled look, and it varies around the screen.
 
When using the HDMI UHD Color mode for my PC, I can't turn it off without the TV losing the connection and just reporting HDMI No Signal. The only way I can get out of UHD Color and see the desktop is to physically unplug the HDMI connection and plug it back in. Does anyone with an AMD card have this problem, or is it a TV issue?

Sorry to hear that, bud. I just tried turning UHD Color on and off on my 40KU6290 with an RX 470 and had zero problems. Turning it off took a good 3-4 seconds; turning it back on took 1 second. No problem either way.

The reason I'm trying to get out of that mode for games is to get access to the LED Clear Motion setting

If the main reason you turn off UHD Color is for the clear motion interpolation, I've found an easier way: whatever game you're playing, set it to either 4096 resolution (which is fake, as the TV is 3840) or, if you want a lower res, 1080p at 50fps. In both cases, the TV briefly "abandoned" PC mode and gave me all the TV options like clear motion, probably because it interprets those as more typical TV resolutions. Whether it's the TV or my RX 470 telling the TV "hey, I'm a TV signal now," I don't know, but it's worth a try if you don't want to deal with other hassles. Obviously, the compromise in the 1080p case is 50fps instead of 60fps, but then again, I've only tried those two cases. You may want to try other resolutions and see how the TV reacts.

Not to criticize, but your display color management looks way too blue to me. It may just be the picture you took, but my Spyder5 colorimeter definitely gave me a warmer result. Here's a link to my ICM profile if you want to give it a try. Measured at full backlight (20), Standard mode, with no dynamic contrast messing with the color output.
 
Yes, I also have the issue of the TV going black when disabling UHD color. I'm using a Titan XP. It seems you need to unplug the HDMI cable from the set to get out of it.

Honestly, I've had some issues with adjusting settings; Game Mode sometimes turns off and is grayed out, and I still don't understand why. So now that I have the settings I like, I just leave it alone.
 
I just picked up the UN40KU7000F, which is basically the same set as the 6300, but this one supports HDR. I'm having a hell of a time getting it to connect to my gaming PC. I'm connecting the TV to a Titan X (Pascal), and the TV keeps blacking out and coming back. I'm using a cable that I know works on my 4K 65" TV in the living room. I also have an Acer X34 and an Acer XB270HU connected to the same PC.
 
Load the latest firmware. Try another port on the display and on the computer. Upgrade your video drivers.

My first KS8000 would black out and keep coming back. I tried three cables. It ended up being the set and not the cable. Replacement sets didn't have that issue with the same cable, gfx driver, and port on the 1080.
 
Load the latest firmware. Try another port on the display and on the computer. Upgrade your video drivers.

My first KS8000 would black out and keep coming back. I tried three cables. It ended up being the set and not the cable. Replacement sets didn't have that issue with the same cable, gfx driver, and port on the 1080.
Thanks for the info. I went ahead and connected the KU7000 to an Alienware laptop I own, and it worked perfectly. I guess the Titan X can't drive two 1440p monitors and this TV at 4K UHD 60Hz at the same time.
 
Taking the conversation back to the KU6300/6290's ability to reproduce an HDR signal: I've said before that HDR+ mode is hot garbage. I've since changed my mind. I'm currently preparing a boatload of images comparing HDR+ mode on/off with a direct-feed source recorded from an HDR10 signal (this FFXV video, specifically). I'm not sure whether the metadata makes it an HDR video that plays as an HDR signal only when the display detects it, or whether it's a regular video that recorded the direct feed from a PS4 Pro outputting an HDR10 signal. Either way, when played through the KU6290's YouTube app, the colors were immediately apparent in their excessive contrast and saturation. It seems that what HDR+ does is flatten the contrast and saturation of the HDR10 signal so it is, in a way, compressed to "fit" the tighter limits of the SDR signal that the KU6290 can reproduce. The end result seems a bit more nuanced than a regular SDR image. I've taken 15 pics in SDR and another 15 in HDR+ mode to see the difference. With the TV fed this HDR10 signal, it looked better in HDR+ mode every single time, mainly because without it the image looked cartoonishly oversaturated.

Pictures coming soon, I'm processing them now.
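If it helps to picture the "flattening": it resembles classic tone mapping, i.e. compressing a wide luminance range into SDR's 0..1. Samsung's actual HDR+ processing is proprietary, so this is not their algorithm; the textbook Reinhard operator below just illustrates the general idea:

```python
# HDR+ seems to compress contrast/saturation to fit SDR. Samsung's actual
# processing is proprietary; the classic Reinhard operator below merely
# illustrates the general idea of squeezing HDR luminance into 0..1.

def reinhard(l: float, white: float = 10.0) -> float:
    """Map HDR luminance l (1.0 = SDR white, `white` = brightest) to 0..1."""
    return l * (1.0 + l / white**2) / (1.0 + l)

hdr_highlights = [0.5, 1.0, 4.0, 8.0]
sdr = [round(reinhard(l), 3) for l in hdr_highlights]
print(sdr)  # bright values are rolled off below 1.0 instead of clipping
```

Without some curve like this, highlights several times brighter than SDR white would simply clip, which is consistent with the blown-out, oversaturated look when the mode is off.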
 
Yeah, I should try HDR+ with the PS4 Pro. I tried it before with some UHD Blu-Ray discs, but the result was more like an Instagram filter than anything else.

Also, has anyone experimented with gaming on the TV in 30Hz mode with Auto Motion Plus enabled? My PC is a beast, but many PS4 games lock at 30fps (or if you have a slower GPU on your PC), and I wonder if that would make the experience more smooth.
 
Also, has anyone experimented with gaming on the TV in 30Hz mode with Auto Motion Plus enabled? My PC is a beast, but many PS4 games lock at 30fps (or if you have a slower GPU on your PC), and I wonder if that would make the experience more smooth.

I haven't tried it, but I can't imagine it being very responsive at 30fps. When I tried the 1080p50 resolution to get clear motion going, it was already annoying how much worse the input lag became (and I'm not usually bothered by this kind of stuff, so it must have been really noticeable).

As promised, here are the comparisons. All pictures were taken at ISO 800, 1/60 to ensure the exact same exposure. Every picture pair was processed with the exact same values in Photoshop, taking the HDR+ shot as reference since it's the one that looked good. The 1st image is always HDR+, the 2nd always SDR:

https://i.imgur.com/4yJCSyo.jpg
https://i.imgur.com/ltTlnMg.jpg

https://i.imgur.com/SwDKHHd.jpg
https://i.imgur.com/dY3gmw0.jpg

https://i.imgur.com/ODqZlxO.jpg
https://i.imgur.com/HMfsHvq.jpg

https://i.imgur.com/U5ETQ6H.jpg
https://i.imgur.com/vipwXe3.jpg

https://i.imgur.com/solqCkG.jpg
https://i.imgur.com/jsGBy5J.jpg

https://i.imgur.com/n8L2pxN.jpg
https://i.imgur.com/ckK2ix6.jpg

https://i.imgur.com/iWYCfRI.jpg
https://i.imgur.com/HxHr44E.jpg

https://i.imgur.com/HHhfDsw.jpg
https://i.imgur.com/kDvdFAi.jpg

https://i.imgur.com/JKyK2qL.jpg
https://i.imgur.com/fMEuvlI.jpg

https://i.imgur.com/oZrEd1z.jpg
https://i.imgur.com/XhK2Qa0.jpg

https://i.imgur.com/mX5xSk3.jpg
https://i.imgur.com/DJoaarR.jpg

https://i.imgur.com/ptXWxH1.jpg
https://i.imgur.com/IJApMoW.jpg

https://i.imgur.com/uGzUJ4d.jpg
https://i.imgur.com/ezM6PY3.jpg

https://i.imgur.com/XE8FGW4.jpg
https://i.imgur.com/17QAEVz.jpg
 
Ocellaris, it's for both. I wouldn't call it BS, as it's an optional feature that's off by default. I'll test more to see if it's any better than my first experience.
 
OK, I tested it again. It's still crap. Strangely, I tried to take photos and the difference wasn't apparent. But I'm 100% sure it looks better without HDR+.

Tested on PS4 Pro w/ Ratchet & Clank (HDR supported). HDR+ mode just adds a fake looking sepia filter or something. Might be OK for some content, but seems like a cheap trick to me.

Also, I don't believe the PS4 records in HDR. I just tried recording a clip, and it looked pretty crappy compared to actually playing the game. So the test may not be indicative of real-world performance.
 
HDR+ mode just adds a fake looking sepia filter or something. Might be OK for some content, but seems like a cheap trick to me.

That's because when you switch to HDR+ mode, it loads specific image settings by default. That sepia tone you mention is because, for some (stupid) reason, Samsung decided to load it with the Warm2 color tone by default. I had to bring it back to Standard, turn dynamic contrast off, and bring the backlight back to 4/20 so it would have the same settings I normally use. Only then can you make a comparison. If used on SDR content, I agree with cybereality, it's utter crap. But when I used it with this recorded HDR10 signal, it most definitely improved the image quality, AKA it didn't look contrasty and saturated to infinity and beyond. It just looked fine.

If that's not what happens when outputting an HDR10 signal directly, maybe the only thing HDR+ mode is good for is an HDR10 signal that's been recorded, since the recording carries no metadata telling the TV to display it as HDR10. That would explain why it looks horrible on its own, and why HDR+ mode helps by forcing those changes. When I've played other HDR sources from the USB port and the display recognizes them as HDR10 encoded (the channel info clearly says it's playing HDR), the video looks great without HDR+ mode.
 
Yeah, I should try HDR+ with the PS4 Pro. I tried it before with some UHD Blu-Ray discs, but the result was more like an Instagram filter than anything else.

Also, has anyone experimented with gaming on the TV in 30Hz mode with Auto Motion Plus enabled? My PC is a beast, but many PS4 games lock at 30fps (or if you have a slower GPU on your PC), and I wonder if that would make the experience more smooth.

input lag jumps to 110ms with interpolation (soap opera effect)
 
Ars Technica had a great post on HDR today. What I found out from it is that none of my TV apps (AKA YouTube, Netflix, Amazon Prime, or Hulu) output HDR yet:

"if you want to watch YouTube's HDR-10 content—which debuted last month and is currently limited to a whopping four compatible videos—your options are limited to select UHD Blu-ray players and the $70 Chromecast Ultra. Google TV versions of the app don't support this yet, and while the Xbox One S's YouTube app is advertised as compatible, its HDR-10 playback wasn't working as of press time. (The Xbox One S also doesn't yet play back Amazon's HDR content.) Once more HDR videos appear on YouTube, we hope HDR compatibility proliferates to devices such as Roku, WebOS TVs, and Google's other devices."
That's probably why, when I see people using a PS4/Pro or XB1S, the TV actually tells them an HDR source is playing, whereas in my case I haven't had that notification because none of what I've watched has been HDR (except the videos I've loaded on the USB drive, which the TV recognizes). This isn't new, I guess; we did know this, but it's further confirmation that the HDR ecosystem is an absolute mess for now. It also makes me happy I only invested $300 in the 6290; it's safer to wait a year or two until this is all sorted out and then spend money on a good set, hopefully QLED or OLED.
 
I'm already doing that in the living room! I may consider doing it for the office too, but I do like a dim office; it keeps me relaxed. If PWM ever gives me a headache, I'll consider turning up the backlight.

14525008_10102812097981879_756656560030693471_o.jpg

i ordered the light strip from Amazon. should be here tomorrow. i like to play in a mostly dark room. will let you guys know how it goes with the lights.
 
i ordered the light strip from Amazon. should be here tomorrow. i like to play in a mostly dark room. will let you guys know how it goes with the lights.

What did you order? My old LCD was much brighter (older panel, more light bleed), so now my room is much darker. I don't like the light I have in here and thought a nice strip behind the new screen would be great.
 
This is the one I got, and it works pretty great.

OK, I tested it again. It's still crap. Strangely, I tried to take photos and the difference wasn't apparent. But I'm 100% sure it looks better without HDR+.

Might be OK for some content, but seems like a cheap trick to me.

Upon more testing myself, I think I've reached a more nuanced conclusion. Yes, many times HDR+ makes things look like contrast- and color-less crap. I've replayed several videos that the TV recognizes as HDR format, and they look great on their own.

However, when some videos are NOT recognized as HDR even though they are (maybe because the signal lacks the metadata to tell the TV it's HDR), the video displays oversaturated. That's when turning on HDR+ seems to help, as it flattens color and contrast.

So sometimes HDR+ works, but most of the time it doesn't. I'd only use HDR+ as a life-saver when the TV doesn't recognize an HDR source.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
have any of you messed with Dynamic Contrast? I turned it on today for the first time playing FF15 in game mode. On high it really makes the colors pop.
 
Personally I prefer it set to off. It makes things too contrasty and exaggerated for my taste.
 
Personally I prefer it set to off. It makes things too contrasty and exaggerated for my taste.

it worked great IMO with the HDR in FF15. i had played 10+ hours with it off and wasn't blown away by the HDR alone. after turning it on high last night, it really made a big difference. i used the TV's Amazon app and watched the first episode of "GT" with Dynamic Contrast on high; it was over-exaggerated but still gave that "holy shit" Best Buy demo-TV-cranked-to-the-max look.

after reading the article above from Ars Technica i'm just scratching my head. now i know why the TV never popped the "you are watching HDR video" message until i started playing games on the PS4 and Xbox One S. i can't believe that actual apps on this TV can't trigger HDR. i also can't believe Amazon sold me a "4k" Fire TV box that can't trigger HDR either... hopefully this stuff can be fixed via software/firmware updates... but damn it was incredibly short-sighted on their part.
 
after reading the article above from Ars Technica i'm just scratching my head. now i know why the TV never popped the "you are watching HDR video" message until i started playing games on the PS4 and Xbox One S. i can't believe that actual apps on this TV can't trigger HDR. i also can't believe Amazon sold me a "4k" Fire TV box that can't trigger HDR either... hopefully this stuff can be fixed via software/firmware updates... but damn it was incredibly short-sighted on their part.

I would expect a firmware update in the next few months to add this support. There's zero reason not to do it, at least on the 2016 TVs: it's good customer support, and it's not like any 2016 buyer would consider buying a new TV in 2017, so it's a win-win for everybody. I don't think it's short-sightedness; more likely the HDR standards weren't fully implemented, so they couldn't get the software ready by the time these TVs came out, but I'd expect them to in the near future. Amazon, Netflix, Hulu... they all need to update their apps for HDR support.
 
I have dynamic contrast on medium. Looks good enough to me (high was a bit oversaturated).

The HDR situation is a bit of a cluster right now, but that's typical with new tech. I'm sure in a year or two it will be solid; that's the price you pay to be the first one on the block with something new.
 
Aaaaand we're already seeing progress on HDR:

Samsung releases update to enable YouTube HDR on 2016 TVs

I tried downloading the KS9500 firmware linked there, as it's technically the same OS for all 2016 TVs. I put it on a USB drive, but it didn't work. Checking online also revealed nothing new. Keep your eyes out for the firmware update, guys! It should be the same system version for the KU6x, KS7x, KS8x, and KS9x.
 
Aaaaand we're already seeing progress on HDR:

Samsung releases update to enable YouTube HDR on 2016 TVs

I tried downloading the KS9500 firmware linked there, as it's technically the same OS for all 2016 TVs. I put it on a USB drive, but it didn't work. Checking online also revealed nothing new. Keep your eyes out for the firmware update, guys! It should be the same system version for the KU6x, KS7x, KS8x, and KS9x.

Will do. I'm really enjoying this screen on my gaming PC for games and just about everything else. I guess I got used to the slight input lag, and now it's pretty much normal to me.
 
Update: upon a closer look at the firmware, it applies to all the TVs in Samsung's 2016 UHD range except the KU6x series. It seems all the KU6x sets have a different firmware. This is strange, as the KU6x and KU7x are mostly similar TVs apart from DCI-P3 coverage (the KU7x aren't real HDR either, like the KU6x). It would've made more sense to me if the KS9x and KS8x ranges shared one firmware and the KU7x and KU6x shared another, but no, Samsung hasn't divided it like that. It seems a bit arbitrary. Here's hoping the KU6x gets the new firmware soon.
 
anybody using an HDMI switch (successfully) with this TV? i'm swapping between PS4 and PC on HDMI 1 a lot. there are a ton of switches claiming "4k", but some don't work at 60hz, and i'm not sure about passing HDR.
 
Not sure but I'd almost assume if a switch doesn't specifically call out HDR support then it doesn't work.

But who knows? If it's all within the HDMI 2.0 spec then maybe any HDMI 2.0 compliant switch would be OK. I see some switches in the $20-30 range so it's not a huge gamble.

This one looks promising for $26: https://www.amazon.com/Expert-Connect-HDMI-Switch-Port/dp/B01MQM4GEV/
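For what it's worth, the 4K60 4:4:4 requirement is exactly why older HDMI 1.4 switches fail. A quick back-of-envelope check (the ~20% blanking overhead is an approximation; exact timings vary):

```python
# Rough check that 4K60 4:4:4 8-bit needs HDMI 2.0-class bandwidth.
# The ~20% blanking overhead is an approximation; exact timings vary.
h, v, hz, bpp = 3840, 2160, 60, 24           # 8 bits/channel, RGB or 4:4:4
active = h * v * hz * bpp                    # ~11.9 Gbit/s of pixel data
with_blanking = active * 1.2                 # ~14.3 Gbit/s on the wire
hdmi14 = 8.16e9   # HDMI 1.4 max data rate, bit/s (after 8b/10b coding)
hdmi20 = 14.4e9   # HDMI 2.0 max data rate, bit/s (after 8b/10b coding)
print(with_blanking <= hdmi20, with_blanking <= hdmi14)  # True False
```

So a switch that only claims "4K" may just be 4K30 or 4:2:0 over HDMI 1.4 silicon; genuine HDMI 2.0 passthrough is the thing to confirm.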
 
Not sure but I'd almost assume if a switch doesn't specifically call out HDR support then it doesn't work.

But who knows? If it's all within the HDMI 2.0 spec then maybe any HDMI 2.0 compliant switch would be OK. I see some switches in the $20-30 range so it's not a huge gamble.

This one looks promising for $26: https://www.amazon.com/Expert-Connect-HDMI-Switch-Port/dp/B01MQM4GEV/

that does look good. especially confirming 4k 60hz and 4:4:4. the only concern is no mention on the page of HDR. i may guinea pig this. Amazon has excellent return policy so no worries if it doesn't work.
 
that does look good. especially confirming 4k 60hz and 4:4:4. the only concern is no mention on the page of HDR. i may guinea pig this. Amazon has excellent return policy so no worries if it doesn't work.

Please let us know :)
 
Has anyone here played The Last of Us with this TV? Oh boy, you will love it. Trailing all over the place.

This TV has serious issues with pixel response, at least for gaming. As a TV it works really well, but the motion handling is too poor for gaming.
 
I played The Last of Us on this TV and didn't notice any trailing. I've played a bunch of games, and they all look fine.

I have noticed trailing on websites, and it can be annoying at times (but I've gotten used to it). I never saw any problem with games or movies.
 