Samsung UN40KU6300 40-Inch 4K

Hi all, the new Windows 10 Creators Update enables HDR, and in the Nvidia settings there is now an option to select 12bpc. Would 12bpc be better than the 8bpc recommended in this guide?

No.
 
In a dark room, I try to keep backlight at zero or one, but I might go as high as ten if I have the windows open. Brightness and contrast are at 45/95. For solid grays, color tone seems to make a difference, with cooler settings more tolerable than warmer ones.
Anyway, I'd be very happy to see someone without a defective unit post a picture of this thread on the default Hard Forum Dark theme for direct comparison.
 
andy4theherd That background is RGB 128, 128, 128, which is too light to show these panel defects. Use something like RGB 25, 25, 25 and you'll be able to easily see the gamma shift as you move your head around; it'll also expose any panel non-uniformity.
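If you want a quick way to produce such a background, here is a minimal sketch that writes a solid RGB 25, 25, 25 image as a binary PPM using only the Python standard library (the filename is my own choice; the 3840x2160 size matches this TV's native resolution). Open it in any image viewer and go full-screen.

```python
# Generate a solid dark-gray test image (RGB 25, 25, 25) for spotting
# gamma shift and panel uniformity problems. Written as a binary PPM so
# it needs no third-party libraries; most image viewers can open it.
WIDTH, HEIGHT = 3840, 2160        # native resolution of this TV
GRAY = bytes([25, 25, 25])        # one dark-gray RGB pixel

with open("uniformity_test.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))  # PPM header
    f.write(GRAY * (WIDTH * HEIGHT))                # raw pixel data
```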
 
No. You would need a separate device to send the signal (I can't recall what it's called but it was mentioned earlier in the thread) and it isn't cheap. Just keep the remote next to your keyboard like the rest of us.
Can you (or anyone) point me to a post mentioning this? I searched to no avail.
 
So with this being BGR instead of RGB, are any Mac users out there using this TV flipped upside down? I've got a wall mount on order and I might try it to see if the text looks better. I've got the UN40MU6300, which I guess is the 2017 revision of the UN40KU6300.


I'm also waiting on a mini-DP to HDMI 2.0 adapter so I can run this at 60 Hz on my 2014 rMBP.
 
I have a question.

I've followed the guide here by Digital Foundry, setting a custom resolution of 3200x1800. One of the steps was going to Nvidia Control Panel > Adjust desktop size and position.
The defaults there are Aspect Ratio scaling, Perform scaling on: Display, and Override unchecked. The guide says to change it to Perform scaling on: GPU with Override checked.

I was wondering whether this affects the TV's own upscaling?




 

It should leave the TV out of scaling entirely, since it will still get a native resolution signal and the GPU will handle scaling. That's kind of the whole point.
 
" ... so I have to keep moving the taskbar around so the icons don't stay imprinted. "

Why not just auto-hide the taskbar?

I've got this monitor and with the latest firmware update last week, I think it's stellar.

Maybe I got lucky in the panel lottery, but this thing is amazing and has better uniformity than the other images I see in this thread.
 
It should leave the TV out of scaling entirely, since it will still get a native resolution signal and the GPU will handle scaling. That's kind of the whole point.

Kinda confused, but I think I got the point of that option. In short, the TV's upscaling has nothing to do with it, right? From what I've understood, the scaling option is about resolution.

ex. If my monitor is only 1080p, I cannot get native 4K resolution without changing that scaling option to GPU. I once tried it on the TV and it said signal error or something; that's why I came to that conclusion.
 

I don't think either GPU's settings has an option, for a standard desktop resolution, to set a resolution above the monitor's resolution and downscale.

The option to scale on either the GPU or the monitor generally refers to upscaling, i.e. if you have a 1080p monitor and only a 720p input/video, does your GPU take the 720p video and upscale it to 1080p, or does it send the 720p signal unadulterated and have the monitor upscale it to 1080p?

Also, of note: I tested a resolution, I believe 3200x1800, on this exact TV with both GPU scaling and monitor scaling, and very surprisingly I thought the Samsung TV did a much better job upscaling than the GPU. I always assumed a high-end Nvidia GPU would upscale much better, but the quick test I did on a Windows desktop showed the TV doing a much better job.
 

Really? Right now I've brought it back to the default display scaling settings and just set the custom reso in the Nvidia control panel. I wonder why the guide uses a separate custom-resolution app when you can just set it up in the Nvidia control panel.

I'm confused. Why would you set a resolution other than the native 3840x2160?

My 1060 6GB can't handle the full 3840x2160 resolution even with the FPS capped at 30; FPS drops here and there. So I thought of dropping it down a notch to faux 4K (3200x1800). I can now play at a consistent 30fps. I only did this on RPG games like Witcher 3 though, and I don't mind 30fps as long as it stays there and the reso is 1800p. :D
 
There should be no problem with a 1060 on that TV. I have that GPU with that TV, and UHD at 60 Hz works without any problem. Maybe you have a problem with the cable? Is your cable really HDMI 2.0 compatible?
 
That's a good point. It sounds like his cable isn't quite meeting the HDMI 2.0 spec which is why dropping the resolution down and reducing the bandwidth demands on the cable works. Deicidium I would definitely try another cable, as the 1060 should handle 4K @ 60 Hz no problem.
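Some back-of-the-envelope math shows why dropping the resolution relieves a marginal cable. This sketch assumes the standard CTA-861 4K timing of 4400x2250 total pixels (active area plus blanking) and HDMI's 8b/10b TMDS encoding, which inflates the payload rate by 10/8:

```python
# Rough HDMI bandwidth math for 3840x2160 @ 60 Hz, 8 bits per component.
# 4400x2250 is the CTA-861 total frame size (active + blanking) for 4K;
# TMDS uses 8b/10b coding, so the wire rate is 10/8 of the payload rate.
TOTAL_W, TOTAL_H, FPS = 4400, 2250, 60
pixel_clock = TOTAL_W * TOTAL_H * FPS            # 594 MHz

def tmds_gbps(bits_per_pixel, clock=pixel_clock):
    """Wire bit rate in Gbps after 8b/10b TMDS overhead."""
    return clock * bits_per_pixel * 10 / 8 / 1e9

rgb_444 = tmds_gbps(24)    # full RGB / 4:4:4: 24 bits per pixel
ycbcr_420 = tmds_gbps(12)  # 4:2:0 halves the data: 12 bits per pixel

print(f"4:4:4 -> {rgb_444:.2f} Gbps (needs HDMI 2.0's 18 Gbps)")
print(f"4:2:0 -> {ycbcr_420:.2f} Gbps (fits HDMI 1.4's 10.2 Gbps)")
```

So 4K60 at full chroma runs right up against the HDMI 2.0 limit, which is why a cable that is not quite in spec can pass lower-bandwidth modes but glitch at 4K60 4:4:4.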
 
well crap...

I can't get this TV to work at 4K 60 Hz on my MacBook Pro (Retina 15", mid-2014). It works at 30 Hz, but even using an Amazon HDMI 2.0 cable and the 'Plugable' branded mini-DP to HDMI 2.0 adapter, it still only runs at 30 Hz.

I installed SwitchResX, which lets me force it to 60 Hz... the TV recognizes it at 60 Hz, but there is a roughly 4-inch vertical strip about a quarter of the way in from the left side that is all glitched out :( The image in it is 'buzzing', or vibrating, and the whole strip is just a copy of the 4 inches to its left. In other words, unusable.


Anyone happen to have a 2014 rMBP with this TV working at 60 Hz? Am I doing something wrong, or just out of luck?
 
I'm not sure those old MBPs can support 4K60. I recall someone getting it to work over Thunderbolt, but I forget which model that was on.
 
Did some quick searching; I think this adapter might work.
https://www.amazon.com/Mini-DisplayPort-HDMI-Active-Adapter/dp/B01B702YTG

Or search around for something similar.

The one I'm using supposedly worked for some folks: https://www.amazon.com/gp/product/B00S0BWR2K/ref=oh_aui_detailpage_o07_s00?ie=UTF8&psc=1

but it could be the adapter that is tripping me up. I guess I will try the one you linked, or I've also heard success with the Club3D branded one. It's weird because I have seen reports of people getting 4K 60 Hz on even 2013 MacBooks, so I feel like this 2014 should work. Seeing it come up successfully at 60 Hz using SwitchResX (except for the 4" strip of corruption) also makes me think it should work if I can just find the right custom resolution in SwitchResX.

I will feel dumb if I keep throwing money at adapters and purchasing this software and still can't get it to work =P I know the cable I'm using is capable of 4K 60 Hz since it works on the GTX 970 Windows box I use with my HTC Vive.
 
Did anyone get an update for this TV that made text render badly? Mine updated twice in the last couple of weeks, and now text has red or blue edges instead of a defined edge.
 
It probably changed the pixel format. You want it to be 4:4:4 for the best looking text. You can adjust in the AMD/Nvidia settings on your computer.
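A toy sketch of why subsampled pixel formats smear colored detail on text (the colored anti-aliasing that subpixel rendering like ClearType puts on glyph edges, for instance). This is just an illustration using the BT.601 full-range conversion formulas: in 4:2:2, two adjacent pixels share one chroma sample, so a red pixel bleeds into its white neighbor:

```python
# Toy 4:2:2 chroma subsampling demo: two horizontally adjacent pixels
# share one (Cb, Cr) pair, so sharp color transitions get smeared.
# Uses the BT.601 full-range RGB <-> YCbCr conversion formulas.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

# A red pixel next to a white pixel, like a colored AA fringe on text.
left, right = (255, 0, 0), (255, 255, 255)
(y1, cb1, cr1), (y2, cb2, cr2) = rgb_to_ycbcr(*left), rgb_to_ycbcr(*right)

# 4:2:2 keeps luma per pixel but averages chroma across the pair.
cb, cr = (cb1 + cb2) / 2, (cr1 + cr2) / 2

print(ycbcr_to_rgb(y1, cb, cr))  # red pixel comes back desaturated
print(ycbcr_to_rgb(y2, cb, cr))  # white pixel comes back tinted pink
```

With 4:4:4, each pixel keeps its own chroma and both values round-trip exactly, which is why it's the format you want for desktop text.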
 
Looks like I got a blue line of death. How is the RMA process, for people who have done it before?

edit:
Samsung support said to move the TV from the surge protector to a wall outlet, and that seems to have fixed the line. Now I'm paranoid...
 

Update: the line reappeared, so I got a tech to come over and replace the panel. It's probably a different panel than the original; contrast is about 700 lower, max brightness is about 30 nits less, and it's much "whiter" (higher CCT). Not that I can complain...

The tech's quote: "They give you a new panel; most of the time, you get a refurb."

I have no idea what the panel part # is, since they don't even need to take the panel out of the bezel; the replacement part comes with the bezel and panel backing.

I also noticed the right side seems greenish where the left is reddish. Meh, whatever.
 
I updated Windows 10 to the Creators Update and now the image on the TV looks washed out. AFAIK the settings on the TV have not changed.

I checked in Windows; resolution and refresh are correct, and AMD settings are on the 4:2:0 pixel format as before. I don't know what happened. Any advice?
 

Change the HDMI black level on the TV or in the AMD control panel.
 

The same happened to me; it turned out that the Win 10 Creators Update auto-enabled HDR in the display settings. Right-click on the desktop, select Display Settings, and turn HDR and advanced color off. This returned my screen to normal, calibrated as before.
 

Check the TV, my settings got reset recently, maybe yours did too.
 

Are you in HDR? That's the only time the image looks washed out for me in CU. Alternatively, make sure you've selected Full instead of Limited RGB. Other than that, I can't think of any other causes...
 
So, I don't seem to have any HDR settings in Windows, and in the AMD panel there is only one option, 4:2:0.

I did just install the latest AMD driver (I was only on a slightly older driver due to mining testing), and the strange part was that during the install, right after the screen turned black, it would come back looking correct for about 3 seconds and then become faded again.

After finishing and rebooting, I think it looks better, but maybe not as nice as it looked before the Win 10 Creators Update. I mean, I can probably be OK with it now, but it's the kind of thing that will bug the hell out of me as a PC tinkerer.
 
OK, I think I got it working. There is a Calibrate Colors setting in Windows; I adjusted the gamma a bit and now it looks like before.
 
Are you sure there is no HDR setting in Display Settings?
 
Check your resolution - sometimes the AMD control panel thinks the resolution is a TV resolution. I can't recall exactly how I fixed it, but I went to Radeon Additional Settings and, if possible, checked RGB Full there. If that's not an option, try Windows Settings > Display, scroll all the way down, and click Display Adapter Properties. Under Monitor, make sure you're at 59 or 60 Hz; I think changing that did the trick for me.
 
Maybe you missed my post earlier. I ran the Windows Calibrate Colors option and adjusted the Gamma. Now it looks like it did before updating.
 
Anyone know what screws (metric size and length) are to be used for the four mounting holes on the back? I found info that says M4, but the holes are much bigger than that.
 
I am a little confused about what mode everyone is using for a "calibrated" picture. My input was originally named PC, which allowed me to use Game Mode. All the websites recommended Movie mode, so I had to switch the input name to Blu-ray, which then allowed all the advanced picture calibration settings.

Is everyone else using PC as the input name, or have they renamed it to allow Movie mode? Does naming the input PC reduce input lag?

Thanks
 
I renamed it to "4K" so I could use Game Mode. Game Mode has slightly lower latency, but I also find the picture/color quality looks better.
 