Samsung UN40KU6300 40-Inch 4K

Does turning on Auto Motion Plus have any effect on input lag? I have it off, since I figured any additional processing could potentially increase input lag.

I can't say that I know, but logic tells you it must. Motion smoothers basically generate interpolated (or repeated/black) frames to create an illusion of increased fluidity, and that extra processing takes time, so it stands to reason that it does add some input lag.
 
I think Motion Plus is for interpolating 24p content up to 60Hz. If you already have a 60Hz signal, like from a PC or console, it shouldn't do anything.
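Rough back-of-the-envelope on why any interpolation has to add delay (illustrative numbers, not Samsung's actual figures): the TV has to hold at least one future frame before it can compute the in-between one, so the floor is a full frame time on top of whatever the motion estimation itself costs.

```python
# Back-of-the-envelope latency floor for motion interpolation.
# Illustrative only -- not measured Auto Motion Plus figures.

def interpolation_delay_ms(refresh_hz, buffered_frames=1, processing_ms=5.0):
    """Interpolation needs future frame(s) in a buffer before it can blend,
    so the minimum added latency is buffered_frames * frame_time plus the
    (assumed) cost of the motion-estimation pass itself."""
    frame_time_ms = 1000.0 / refresh_hz
    return buffered_frames * frame_time_ms + processing_ms

print(interpolation_delay_ms(60))  # ~21.7 ms extra on a 60 Hz input
print(interpolation_delay_ms(24))  # ~46.7 ms extra on a 24 Hz input
```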
 
So does HDR really look crap on this TV in your personal experience? i.e., is there no point enabling HDR at all on this TV? Does it look the same as SDR?
 
I still haven't tried the PS4 Pro in HDR. I tried a UHD Blu-ray player and it wasn't a huge difference. I mean, it looked good, don't get me wrong, and it seemed better than SDR, but not the massive leap I hear about with the more expensive TVs.
 
So does HDR really look crap on this TV in your personal experience? i.e., is there no point enabling HDR at all on this TV? Does it look the same as SDR?

You seem to be looking for extremes. Good. Bad. Great. Horrible. HDR is much more nuanced than that. It will not "look crap" by any means. But think about what is happening to the signal:

1. Encoded for HDR10 with higher peak/lower dark values
2. Fed to your TV, expecting to be able to get brighter/darker
3. Your TV (anything below $900 right now) can't display the expected 1000 nits, but the HDR signal doesn't know that
4. TV displays HDR signal but can't reproduce the brighter brights and darker darks
5. HDR signal is still good, more nuanced, with more information than SDR, but you're not getting all of its benefits

So, no, it will not "look crap". But you won't be watching real HDR either, and by "real" I mean what-the-industry-understands-as-HDR, which really means an HDR10 signal displayed through a panel capable of 1000 nits and the DCI-P3 color space.

You'll see a glimpse of the quality of HDR. Probably you'll just notice a good signal, just dimmer than you expect it to be. That's not bad, but it's not what you expect from HDR either.
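To put rough numbers on that dimming, here's a little sketch (my own illustration; the knee and roll-off curve are made up, not what Samsung's processor actually does): anything the HDR10 grade puts above the panel's peak has to be compressed or clipped down to what the panel can show.

```python
# Illustrative tone mapping: HDR10 scene luminance (nits) squeezed onto a
# lower-peak panel. The knee/roll-off values are assumptions for illustration.

def tone_map(scene_nits, panel_peak=350.0, knee=0.75):
    """Pass values below the knee through untouched; roll everything above
    it off asymptotically toward panel_peak instead of hard-clipping."""
    knee_nits = knee * panel_peak
    if scene_nits <= knee_nits:
        return scene_nits
    excess = scene_nits - knee_nits
    headroom = panel_peak - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for nits in (100, 300, 600, 1000, 4000):
    print(f"{nits:5d} nits mastered -> {tone_map(nits):6.1f} nits displayed")
```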

Check this video: a PS4 Pro playing a game with HDR on. You can see it looks quite good. It's a KS7000, which supports a wider color gamut than the 6300, but since it's still limited in how bright it gets, you still don't get the full benefit of HDR. A 6300 will probably look similar, just with less color variety, but still great.
 

All this does is highlight that the whole industry is at fault for not enforcing HDR requirements. Even people with compliant gear can still easily fuck up a setting or a piece of gear somewhere in the chain and reduce or eliminate the HDR effect. HDR face-planted right out of the gate by allowing different specifications for "HDR".

New A/V standards seem to actively work at confusing people, or at making sure you need unicorn tears and a four-dimensional remote to get everything working.
 
Well, technically, the market is not doing anything wrong; they're just being sneaky as usual. There is a perfectly specified standard for "good" HDR, which is what we call Ultra HD Premium. That requires 1000 nits and 90% coverage of DCI-P3. Other TVs that aren't UHD Premium can be sold as HDR TVs because they understand and display an HDR image correctly... just a pale shadow of what HDR can do. So technically they're not lying... but by calling these TVs bullpockey terms like "HDR Premium", which doesn't mean anything at all, vendors are trying to confuse buyers, who remember the word Premium but don't realize that the 1000-nit/DCI-P3 standard is UHD Premium, not HDR Premium.
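To make the checklist concrete, here's roughly what the LCD-tier UHD Premium criteria boil down to (these are the commonly quoted numbers, so double-check against the UHD Alliance's own document; the example specs below are ballpark, not measurements):

```python
# Commonly quoted UHD Premium criteria for LCDs: >=1000 nits peak,
# <=0.05 nits black, >=90% DCI-P3, 10-bit, 3840x2160. OLED has its own tier.
# Verify against the UHD Alliance spec before relying on these numbers.

def is_uhd_premium_lcd(peak_nits, black_nits, dci_p3, bit_depth, width_px):
    return (peak_nits >= 1000 and black_nits <= 0.05 and
            dci_p3 >= 0.90 and bit_depth >= 10 and width_px >= 3840)

# Ballpark figures for a KS8000-class set vs. a KU6300-class set:
print(is_uhd_premium_lcd(1400, 0.04, 0.94, 10, 3840))  # True  -> the "real" HDR tier
print(is_uhd_premium_lcd(350, 0.05, 0.72, 8, 3840))    # False -> "HDR compatible" only
```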

Technically, they're not lying, just muddying the facts and creating a worse experience for everyone involved. Typical TV-manufacturer behavior, as always. Now more than ever, consumers need to be informed so they don't fall for these misdirecting shenanigans.

As for the difference between signals, once again I'll link to Rtings, which posted a nice article today comparing HDR to SDR. Check the difference between:

HDR signal at 1000 nits, like the KS8000:
X930D-HDR-dynamic-range-large.jpg

SDR signal on the same TV:
X930D-SDR-dynamic-range-large.jpg

HDR signal at 500 nits, like the KU6300/7000:
X700D-HDR-dynamic-range-large.jpg
 
Also, do all the HDMI inputs on the Samsung KU6300 support HDMI 2.0, HDCP 2.2, and 4K@60Hz 4:4:4?
 
OK, thanks. So if I wanted to plug in my Xbox One S and also my computer, this won't work for me?

Since I need 4K@60 4:4:4 to get sharp text from the computer, do I also need 4K@60Hz 4:4:4 for the Xbox One S?
 

Buddy, there are a lot of well-meaning people here helping everybody else. However, you also have some responsibility for what you ask and the help you get. You're asking questions that have already been answered in this thread. Read the thread. If you can't find what you're looking for, then ask. Besides, your question partly answers itself. If only HDMI 1 takes the full HDMI 2.0 signal, what do you think? Choose what to connect to HDMI 1, or use a splitter that works. Which one? I don't know. Figure it out. Look on the internet.

As for your second question, you can answer it yourself: you need 4:4:4 for sharp, defined text. OK. When are you reading sharp, fine text on your X1S??? Never. Big letters while gaming. So... what do you think? Will you need 4:4:4 for the Xbox?
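If you want to see why full 4K60 4:4:4 is the demanding case in the first place, the bandwidth math is simple enough (simplified: active pixels only, no blanking or encoding overhead, so real HDMI link rates are higher, but the ratios are what matter):

```python
# Simplified bandwidth for 4K60 at different chroma subsampling levels.
# Active pixels only (no blanking/encoding overhead), so real link rates are higher.

def video_gbps(width, height, fps, bits_per_component, subsampling):
    # Luma is always full resolution; chroma is reduced by the subsampling scheme.
    chroma_per_pixel = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    bits_per_pixel = bits_per_component * (1.0 + chroma_per_pixel)
    return width * height * fps * bits_per_pixel / 1e9

for sub in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"4K60 8-bit {sub}: ~{video_gbps(3840, 2160, 60, 8, sub):.1f} Gbps of pixel data")

# HDMI 1.4 moves roughly 8.2 Gbps of video data and HDMI 2.0 roughly 14.4 Gbps,
# which is why full 4K60 4:4:4 needs an HDMI 2.0 input with UHD Color enabled.
```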
 
Thanks. I'm going to use it as a computer monitor, and the Xbox One S on another input....
 
In the end I went with the cheap option, because I found a very good deal on the X800D. Now I only have to figure out how to connect my sound-related stuff to it, get a NAS and set it up, and pray that fiber gets installed ASAP.
 
I just found an RX 470 for $170, which ended up costing me $100 after gift cards. Not the fastest thing ever, but it has HDMI 2.0 and it runs most of the games I'm playing (Doom, Mirror's Edge Catalyst, Batman: Arkham Knight, Battlefield 1) at nearly 60fps at 1080p and ~30fps at frigging 4K (the latter at medium settings, though). So I'm pretty excited. The GPU gets here on Saturday, the TV on Monday. I'll post pics!
 
I have an RX 470 in one of my machines. It's a solid card for the price, but it will probably struggle at 4K. For 1080p it should be good.
 
BTW, can your TVs play 4K HDR content? I'm asking because if the TV can do the work by itself (of course I need to get the media to the TV, but that is easy), then any NAS will do the job for storage.

Jeez, my desk system will have more wires than a nuclear plant :/
 
So I've been using this screen for the last few days and I have to say games and content at 4K look gorgeous. I set up PC Mode for desktop use and Game Mode for games.

I changed the picture settings in Game Mode to make it look almost like PC Mode does, because out of the box Game Mode looks like shit.

That said, the mouse lag isn't the greatest. I really feel it in some games; in others it's not too bad. I'm coming from a Korean Catleap 27" panel that I've been running overclocked at 100Hz for the last few years, and that feels very snappy. Between the input lag and the motion blur, sometimes I get very annoyed at this one; other times I find myself saying it isn't too bad. Am I missing a setting to lower the input lag further? Can someone post the optimal settings for Game Mode please? I just want to make sure I didn't miss anything.

I'm pretty much only playing FPS games right now: CoD Modern Warfare Remastered, BF1, Star Wars Battlefront, and Titanfall 2. I noticed the input lag the most in Battlefront. CoD and Titanfall weren't bad. BF1 was OK too, I guess.

Oh, and I had to lower texture settings in all those games at 4K, since it was maxing out my 4 gigs of VRAM. No surprise there, since BF1 was crashing as soon as I got into a game with ultra textures.
 
Which model do you have?

 
Is there a response time difference between the various models, I wonder: the 6300, 6290, and 6270? I can't find any info on the 6270, as it looks pretty new. I asked Samsung online chat and was told it's identical to the 6290 but with one additional image-control feature whose name I can't recall.
 
I haven't really gamed much on this TV so far other than with my PS4 Pro, and it works great for that with Sharpness at 0. However, I was wondering what the Sharpness setting should be for both Movies/Netflix and PC Mode, and whether they're on the same scale as Game Mode (anything other than a sharpness of 0 looks horrible in PS4 games). Specifically for PC, I find that with sharpness at 0 my text looks sub-par even though 4:4:4 chroma is supposedly active (the quick brown fox test pattern looks correct, but the ChromaRes test pattern shows both the 4:2:2 and 4:4:4 text?). To get PC text to look good, I have to raise sharpness into the 20-30 range.

I haven't really played any PC games yet to test sharpness levels. I only have a single GTX 1070, and the frame rates are too low in new games like Dishonored 2, so either I need to upgrade to SLI or a GTX 1080, and I don't think a single 1080 would be enough. I just wish I hadn't gotten the 1070 FTW edition, because its height is greater than other cards', so to use a high-bandwidth SLI bridge I'd need a second 1070 FTW....
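For what it's worth, here's a quick way to generate a chroma torture pattern yourself if you want to double-check the 4:4:4 path (a sketch using Pillow; the filename is arbitrary). Single-pixel colored columns stay crisp over true 4:4:4 and smear into a blend if the chain drops to 4:2:2/4:2:0.

```python
# Generate a 4:4:4 sanity-check pattern: alternating 1-pixel red/blue columns.
# Viewed at 100% zoom, crisp columns suggest full chroma; a magenta smear means
# chroma subsampling is happening somewhere in the chain.
# Requires Pillow (pip install Pillow); the output filename is arbitrary.
from PIL import Image

W, H = 512, 128
colors = [(255, 0, 0), (0, 0, 255)]  # red and blue lean heavily on the chroma channels

img = Image.new("RGB", (W, H))
px = img.load()
for x in range(W):
    for y in range(H):
        px[x, y] = colors[x % 2]

img.save("chroma_444_test.png")
print("Open chroma_444_test.png at 100% zoom on the input you want to test.")
```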
 

Short answer is the current gen GPUs aren't powerful enough for AAA 4K gaming at high quality settings. If that's what you're after, you will have to SLI 1080s together and spend $1000+. Or you can wait for Volta in 2017.

For my purposes, I am content to wait and use my 1070 FTW in the interim.
 
My three GTX 980 SCs are doing OK in the games I play as far as FPS goes, although I had to lower texture settings due to the insane res with only 4 gigs on board each card. I'll be upgrading to two 1080 Tis when Nvidia decides to launch those suckers. Overall I love this screen, but I just wish I could nail the input lag down a little lower. I mean, it's not horrid, but I can definitely feel it, and I prefer the way my old 1440p Korean 100Hz panel felt in FPS games.

I know Kyle has one of these panels from last year, a 2015 model I believe. Was it the JU series? I thought I remembered reading they had slightly higher input lag than this one. I'm guessing Kyle is still using it to game on? I'm curious to hear his experience since he's had it for a while now. Does the input lag bother you at all?
 
I was wondering what the Sharpness setting should be for both Movies/Netflix and PC Mode, and whether they're on the same scale as Game Mode... To get PC text to look good, I have to raise sharpness into the 20-30 range.

In PC Mode you need to have the sharpness set at 50; in Game Mode you need to use sharpness 0. It's weird how they did it, but 0 in Game Mode and 50 in PC Mode are the equivalent neutral settings.
 
The Titan XP is basically there for 4K gaming (and CPU/RAM also make a difference). Most games from the past few years can be maxed out at 4K and get well above 60FPS. A couple of newer games still struggle (like Deus Ex), but you can make them work by tweaking settings. Even newer, well-optimized games like Battlefield 1 can be maxed; I'm getting around 90FPS.

If you are willing to lower and tweak settings, I think a GTX 1080 could also just make it. For cheaper cards you'll probably have to stick to older or indie games. I was even able to play at 4K with a GTX 970 previously with games like DmC, Ridge Racer Unbounded, and some older titles.

If you're on a budget, I think the PS4 Pro is money much better spent, in terms of what you get for $400. For me, building PCs is more of a hobby, so I didn't mind splurging on the Titan for fun. I think that PC cost me around $2,400 total versus the PS4 Pro, which isn't quite the full 4K60 experience but comes close enough for the price.

In terms of input lag, I have a 144Hz TN on another machine and will likely still use that for competitive online gaming. If you are a serious online player, then the faster refresh and lower lag are definitely an advantage. But I play mostly single-player games with a gamepad, so the lag is within reasonable limits and not very noticeable.
 
Short answer is the current gen GPUs aren't powerful enough for AAA 4K gaming at high quality settings. If that's what you're after, you will have to SLI 1080s together and spend $1000+. Or you can wait for Volta in 2017.

For my purposes, I am content to wait and use my 1070 FTW in the interim.
Yeah, I'm not sure what to do. I kind of want to wait on Volta, but that could be the second half of 2017. The 1080 Ti, I'd imagine, will be in the $900-1000 price range and still just Pascal. I have the 1080 step-up going, but with EVGA being so slow it could be another 6 months, haha. That is why I was thinking of just going with another 1070 FTW, which is $409 on Newegg today, so I have to decide within a few hours (I just wish the 5% mobile coupon worked, but stupid free game, lol). However, to keep it cool I might have to upgrade it to a hybrid cooler, which is another $100. Too bad the hybrid EVGA 1070 on sale elsewhere for Black Friday is the regular hybrid and not the FTW hybrid (the card height difference means I can't use it).


In PC Mode you need to have the sharpness set at 50; in Game Mode you need to use sharpness 0. It's weird how they did it, but 0 in Game Mode and 50 in PC Mode are the equivalent neutral settings.
Thanks, I'll set this up ASAP. It's a very weird thing; I knew something was wrong when, at 0 sharpness in Windows 10 PC Mode, the text was the worst I've ever seen.
 
Turn AA off. You will gain 15-20 fps. The pixel density of 3840x2160 on a 40" screen practically resolves any aliasing on its own.

I have a single OC'd 980 Ti and hit 60fps in just about everything I play. If turning AA off doesn't get you close to 60 (Witcher 3, GTA V Redux), lower shadow quality from Ultra to High.
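The pixel density claim is easy to sanity-check with a bit of math (diagonal sizes are nominal, so treat the results as approximate):

```python
# Approximate pixel density: a 40" 4K panel vs. the same size at 1080p.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'40" 3840x2160: ~{ppi(3840, 2160, 40):.0f} PPI')  # ~110 PPI
print(f'40" 1920x1080: ~{ppi(1920, 1080, 40):.0f} PPI')  # ~55 PPI -- where AA earns its keep
```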
 

Your card has 4 gigs of RAM? I can't remember if the Ti had 4 or 6. Do you have to lower texture settings? I do in most games.
 
The GTX 1070 FTW sold out, so I guess that makes my decision easy, hah. I'll just wait for the step-up to the 1080 if it happens within the next month or so. And maybe get a second 1080 for cheap when everyone is selling theirs to get a 1080 Ti in January/February....
 
I tried HDR with the PS4 Pro and Samsung KU6300. It's a subtle difference, but I think it does look better. I realize it's not the experience you get on higher-end TVs but it still looks pretty nice.

So far I've only tried The Last of Us. I'll need to test more to come to a better conclusion.
 
So yeah, I really love this screen overall. It's really weird: sometimes I'm really bothered by the input lag, and other times I'm totally OK with it. I'm kind of OCD'ing about it at times, lol. I find myself in a game just stopping to move the mouse left and right to see how much delay I can feel, heh. Someone stop me. Lol.

On a side note, the first time I fired up some 4K content on Netflix and YouTube I had to pick my jaw up off the floor. It looks unbelievably amazing. It's the first time I'm actually seeing 4K up close at my leisure and not quickly in a store. It's simply stunning. I still can't believe how impressive it looks. It's as if you're looking through a portal and you can just jump right in.
 
Your card has 4 gigs of RAM? I can't remember if the Ti had 4 or 6. Do you have to lower texture settings? I do in most games.

6GB, and I've never lowered textures on anything. Kill AA first, then adjust shadow quality. I can run Witcher 3 on "Ultra" settings adjusted with no AA, medium shadows, and HairWorks on Geralt only. I get a steady 40 to 50 fps. It's not ideal, but it's still better than 30fps on a console and looks 100 times better than 60fps maxed-out Ultra on a 1080p screen. I would prefer 60fps all the time (who wouldn't?), but is bumping shadow quality up a notch and turning on AA worth the cost of SLI GPUs?

Edit: at 4K resolution you will have issues with less than 6GB of VRAM. I see mine maxed or near max a lot.
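Rough numbers on why 4GB gets tight at 4K (ballpark sketch; the buffer count and texture pool size are assumptions, and real usage depends heavily on the engine):

```python
# Ballpark VRAM math at 3840x2160. The render targets alone aren't the killer;
# it's the texture pool plus several full-resolution intermediate buffers.
# Buffer count and pool size below are illustrative assumptions, not measurements.

W, H = 3840, 2160
BYTES_PER_PIXEL = 4                     # 8-bit RGBA

target_mb = W * H * BYTES_PER_PIXEL / 2**20
render_targets_mb = 7 * target_mb       # assume ~7 full-res G-buffer/post targets
texture_pool_gb = 3.0                   # assumed "ultra" texture streaming pool

total_gb = render_targets_mb / 1024 + texture_pool_gb
print(f"one 4K RGBA target:     ~{target_mb:.0f} MB")
print(f"assumed render targets: ~{render_targets_mb:.0f} MB")
print(f"plus ultra texture pool: ~{total_gb:.1f} GB before shadows, geometry and"
      f" driver overhead -- tight on a 4 GB card")
```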
 
That's a pretty good deal. I'm not really interested at this point, but at that price I might have picked it up over the Philips standalone player I paid $250 for (now $230).
 
6GB, and I've never lowered textures on anything. Kill AA first, then adjust shadow quality. I can run Witcher 3 on "Ultra" settings adjusted with no AA, medium shadows, and HairWorks on Geralt only. I get a steady 40 to 50 fps. It's not ideal, but it's still better than 30fps on a console and looks 100 times better than 60fps maxed-out Ultra on a 1080p screen. I would prefer 60fps all the time (who wouldn't?), but is bumping shadow quality up a notch and turning on AA worth the cost of SLI GPUs?

Edit: at 4K resolution you will have issues with less than 6GB of VRAM. I see mine maxed or near max a lot.

Yeah, I lower textures since my cards have 4 gigs. Only a little while longer, since I'm gonna pick up two 1080 Tis when they come out. I've already turned off AA in my games, since at this res I agree it's totally unnecessary.

I'm curious what panel you came from before this, and do you feel the input lag at all while playing FPS games?
 

Texture resolution is the next VRAM hog after overall game resolution.

I had a Korean (Shimian) 2560x1440 IPS for around 3 years. I don't notice the lag at all. I'm not sure what the Shimian was rated at, but I'm certain it wasn't much lower (if any) than this Samsung. I tried Game Mode with the ~20ms lag and PC Mode with ~34ms. I just leave it in PC Mode, even for gaming.
 