2015 Samsung 4K TV as a Monitor Setup Guide

Does anyone know if the newer models (e.g. the Q7) no longer automatically default to 4:2:2 in game mode? I'm in game mode, and as far as I can tell from looking at the test image I'm still in 4:4:4, but I'm wondering if I'm getting more input lag because of it.
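For anyone unsure how those 4:4:4 test images work: chroma subsampling throws away color detail at the single-pixel scale, so a pattern of alternating one-pixel red and blue stripes stays crisp over a 4:4:4 link but smears into purple mush at 4:2:2 or 4:2:0. Here is a minimal sketch for generating such a pattern, assuming Python with the Pillow library installed (the colors are chosen so most of the fine detail lives in the chroma channels):

```python
# Minimal 4:4:4 test-pattern sketch (assumes Pillow: pip install pillow).
# One-pixel red/blue stripes carry their detail mostly in chroma, so a
# 4:2:2 or 4:2:0 link smears them while a 4:4:4 link keeps them distinct.
from PIL import Image

W, H = 512, 128
img = Image.new("RGB", (W, H))
px = img.load()

for y in range(H):
    for x in range(W):
        # Alternate pure red and pure blue column by column.
        px[x, y] = (255, 0, 0) if x % 2 == 0 else (0, 0, 255)

img.save("chroma_444_test.png")
print("View chroma_444_test.png at 100% zoom; crisp stripes suggest 4:4:4.")
```

View it at 100% zoom with no desktop scaling; any resize blurs the stripes regardless of what the link is doing.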
 
It depends where you've got your scaling set, but it's one way to get a little more control over things without changing everything at once.
 

Dan_D

Extremely [H]
At some point one of the firmwares for the KS8000 made some (good) HDR tweaks. It no longer gives me the "HDR content detected" flag, but it seems to remember HDR-specific color settings now. No more having to change all of my settings any time a game uses HDR.

Not sure if it was Nvidia, Samsung, or Microsoft, but HDR auto-detect also seems to work in Windows again.

This is good to hear.
 

kniff

Weaksauce
Sorry if this has been asked before, but sometimes when I turn on my TV (Samsung JU7500), the text in Windows has a reddish hue/outline around it, like a shadow of sorts. If I turn the TV off and on, it's back to normal and the text isn't weird anymore.

100% scaling in Windows and no other changes I'm aware of!

Thoughts?
 

thelead

2[H]4U
Is there any way of getting the TV (KS8000) to consistently keep the PC input in 'game mode'? Every time I restart the computer, it gets recognized as a PC and game mode gets disabled.
 

dpoverlord

[H]ard|Gawd
I haven't decided which mode I like better for gaming. I tend to keep it in PC mode. Do you guys have a favorite mode? I am using the JS9000.
 

Kdawg

Gawd
Does anyone else think Samsung 4K TVs are buggy as hell when connecting to a PC?

I have HDMI 2.0 and a GTX 1060.

In my case, I have 2 brand new Samsung 4Ks which don't work on HDMI port #1. The whole screen glitches and goes blank nonstop, auto disconnecting/reconnecting repeatedly at 4K 60 Hz 4:4:4.

If I switch to HDMI #2 or #3, then it works for the most part. But as soon as the PC wakes up from standby mode and I turn the TV back on, even these HDMI ports occasionally blank out for a second. And it's not the cable; I even tried ripoff Monster cables.
 

Nenu

[H]ardened
Kdawg said:
Does anyone else think Samsung 4K TVs are buggy as hell when connecting to a PC?

I have HDMI 2.0 and a GTX 1060.

In my case, I have 2 brand new Samsung 4Ks which don't work on HDMI port #1. The whole screen glitches and goes blank nonstop, auto disconnecting/reconnecting repeatedly at 4K 60 Hz 4:4:4.

If I switch to HDMI #2 or #3, then it works for the most part. But as soon as the PC wakes up from standby mode and I turn the TV back on, even these HDMI ports occasionally blank out for a second. And it's not the cable; I even tried ripoff Monster cables.

Get an HDMI cable with higher bandwidth.
Even high-quality cables have bad examples.
Your problem looks like a classic case of insufficient bandwidth.
 

Kdawg

Gawd
Nenu said:
Get an HDMI cable with higher bandwidth.
Even high-quality cables have bad examples.
Your problem looks like a classic case of insufficient bandwidth.

The Monster cables were 18 Gbps, and 4K 60 Hz 4:4:4 worked on HDMI #2 and #3.
Not sure why HDMI #1 glitched out so severely on both units.
 

Nenu

[H]ardened
Kdawg said:
The Monster cables were 18 Gbps, and 4K 60 Hz 4:4:4 worked on HDMI #2 and #3.
Not sure why HDMI #1 glitched out so severely on both units.

Not all connections and circuits are as good as the best; that's just the way it is.
When you're at the limits of the bandwidth, a slightly worse connection or circuit can produce problems.
And not all cables run at their rated spec, even though a specific number is printed on them.

If you are having problems that look like a lack of bandwidth, it most likely is.
Since the only thing you can do about it is change the cable (unless you don't mind dropping the quality), try another good-quality cable.
The longer the cable, the more likely it is to have problems, so keep it as short as you can.
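To put rough numbers on the "limits of bandwidth" point: with the standard CTA-861 timing for 3840x2160 at 60 Hz, an 8-bit 4:4:4 signal uses almost all of HDMI 2.0's effective throughput, so a marginal port or cable has essentially no headroom. A back-of-the-envelope sketch (illustrative figures, not measurements):

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check (illustrative only).
total_h, total_v = 4400, 2250   # 3840x2160 @ 60 Hz incl. blanking (CTA-861)
refresh = 60                    # Hz
bpp = 8 * 3                     # 8 bits per channel, RGB / YCbCr 4:4:4

pixel_clock = total_h * total_v * refresh   # 594 MHz
data_rate = pixel_clock * bpp               # video payload, bits/s
hdmi20_effective = 18e9 * 8 / 10            # 18 Gbps raw minus 8b/10b overhead

print(f"pixel clock : {pixel_clock / 1e6:.0f} MHz")
print(f"data rate   : {data_rate / 1e9:.2f} Gbps (4:4:4, 8-bit)")
print(f"HDMI 2.0 cap: {hdmi20_effective / 1e9:.1f} Gbps effective")
print("verdict     :", "fits, barely" if data_rate <= hdmi20_effective else "over budget")
```

That works out to roughly 14.26 of 14.4 Gbps in use, which is why a slightly out-of-spec cable or a weaker port circuit can tip the link into exactly the dropout behavior described above.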
 

SilverSliver

[H]F Junkie
I have a JU6700. When will upgraded 40+ inch curved displays hit the market? Itching to get a new display, addicted to the curve now. Seems like curved sets in that size, <49", are disappearing.
 

alaricljs

Limp Gawd
The problem is the curve only really benefits near-field viewing which is great for sitting at a desk but is far from the primary market these TVs are aimed at. The curve was basically a marketing gimmick that they threw at the wall. If it sold well enough I'm sure they'd be making far more than they are now.
 

Zarathustra[H]

Fully [H]
alaricljs said:
The problem is the curve only really benefits near-field viewing which is great for sitting at a desk but is far from the primary market these TVs are aimed at. The curve was basically a marketing gimmick that they threw at the wall. If it sold well enough I'm sure they'd be making far more than they are now.

Yeah, I agree.

The curve is great when using these as monitors and sitting 2 ft away from a 40-48" screen.

99.99% of the people who buy these TV's will never do this. They look at these things from across the room, where the curve is more of a problem than it is a benefit.

Thus, the lowest common denominator users mess everything up for the enthusiast, yet again.
 

SilverSliver

[H]F Junkie
I just don't get why there is not a higher-end, but not stupid-expensive, segment of actual computer monitors that are 4K, large-format curved screens with decent response times, given the TV counterparts. There are medium-format curved screens in the UWHD format, and smaller 120 Hz curved screens, but nothing really in the 40"+ range.
 

Lateralus

More [H]uman than Human
SilverSliver said:
I just don't get why there is not a higher-end, but not stupid-expensive, segment of actual computer monitors that are 4K, large-format curved screens with decent response times, given the TV counterparts. There are medium-format curved screens in the UWHD format, and smaller 120 Hz curved screens, but nothing really in the 40"+ range.

And that's why I've stuck with my JS9000. Everything is a compromise. I want 4K 120 Hz with low input lag and good response time. Don't even care if it's VA or IPS. Unless something like that comes out, it's OLED for me next.
 

Zarathustra[H]

Fully [H]
Lateralus said:
And that's why I've stuck with my JS9000. Everything is a compromise. I want 4K 120 Hz with low input lag and good response time. Don't even care if it's VA or IPS. Unless something like that comes out, it's OLED for me next.


OLED, to me, is always on the horizon: something I want, but I have to wait for the kinks to be ironed out. First it was input lag, but that is much better now than it once was.

Now it is image retention...
 

Dan_D

Extremely [H]
I think I may try a 35-inch ultrawide G-Sync monitor next, like the Asus PG35VQ.

I looked at them and they weren't big enough. I also can't do the 3440x1440 resolution. It's not tall enough vertically. It's OK for gaming, but terrible for productivity, as I came from an array of Dell 30" 2560x1600 monitors before getting into these 48"+ Samsung TVs. The 38" models look better, but I'd miss HDR and the physical size more than I feel like I'd benefit from going beyond 60 Hz with G-Sync. Don't get me wrong, I like the latter quite a bit, but I like the physical screen size, image quality, HDR, etc. more.
 

Domingo

Fully [H]
It seems like the KS8000 continues to get HDR tweaks for the better. For a while it felt like every single device (Xbox, PC, PS4, internal apps) needed vastly different settings to get anything out of HDR.
I've recently noticed that the same settings (Backlight 20, Brightness 45, Contrast 100, most everything else = default) work universally.
This combined with the TV now remembering HDR-specific settings has finally made HDR something beyond a novelty for this set.
 

Zarathustra[H]

Fully [H]
I've honestly never even bothered with HDR on mine.

My understanding is (and correct me if I am wrong) that HDMI 2.0 has insufficient bandwidth for HDR at 4K 60 Hz, and 4K 60 Hz is all I use the panel for.
 

Dan_D

Extremely [H]
Zarathustra[H] said:
I've honestly never even bothered with HDR on mine.

My understanding is (and correct me if I am wrong) that HDMI 2.0 has insufficient bandwidth for HDR at 4K 60 Hz, and 4K 60 Hz is all I use the panel for.

Incorrect. That's all I do with my KS8500 and I have seen HDR with some games.
 

Domingo

Fully [H]
Zarathustra[H] said:
I've honestly never even bothered with HDR on mine.

My understanding is (and correct me if I am wrong) that HDMI 2.0 has insufficient bandwidth for HDR at 4K 60 Hz, and 4K 60 Hz is all I use the panel for.

4K/HDR definitely works on the PC. RE7, Destiny, and Shadow Warrior 2 can do it. Mass Effect Andromeda used to before it broke. It's showing up as an option on my PS4 output menus as well. I think the caveat is that it isn't 4:4:4. People who are far more knowledgeable than I am can clarify.
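The 4:4:4 caveat follows from simple arithmetic: the 10-bit depth HDR wants pushes full-chroma 4K60 past what HDMI 2.0 can carry, while 4:2:2 still fits. A rough sketch of the comparison (simplified average bits per pixel; real HDMI signaling packs 4:2:2 into a fixed-width container, but the conclusion is the same):

```python
# Rough 10-bit HDR bandwidth comparison over HDMI 2.0 (illustrative only).
pixel_clock = 4400 * 2250 * 60      # 594 MHz, CTA-861 timing for 4K60
hdmi20_effective = 18e9 * 8 / 10    # ~14.4 Gbps after 8b/10b encoding

formats = {
    "4:4:4 10-bit": 10 * 3,    # full chroma resolution
    "4:2:2 10-bit": 10 * 2,    # chroma halved horizontally
    "4:2:0 10-bit": 10 * 1.5,  # chroma quartered
}

for name, bpp in formats.items():
    rate = pixel_clock * bpp
    verdict = "fits" if rate <= hdmi20_effective else "exceeds HDMI 2.0"
    print(f"{name}: {rate / 1e9:.2f} Gbps -> {verdict}")
```

So 4K60 HDR over HDMI 2.0 generally means accepting 4:2:2 (or 4:2:0) rather than full 4:4:4.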
 

Zarathustra[H]

Fully [H]
Domingo said:
4K/HDR definitely works on the PC. RE7, Destiny, and Shadow Warrior 2 can do it. Mass Effect Andromeda used to before it broke. It's showing up as an option on my PS4 output menus as well. I think the caveat is that it isn't 4:4:4. People who are far more knowledgeable than I am can clarify.

Ah, fair enough. I'd been treating 4:4:4 as my base assumption, but I guess I shouldn't, because the "game mode" I use isn't 4:4:4.

Is HDR really worth it if it means you have to drop the chroma subsampling down a notch to achieve it?
 

Domingo

Fully [H]
Zarathustra[H] said:
Ah, fair enough. I'd been treating 4:4:4 as my base assumption, but I guess I shouldn't, because the "game mode" I use isn't 4:4:4.

Is HDR really worth it if it means you have to drop the chroma subsampling down a notch to achieve it?

Probably depends on who you ask and what the content is. For a lot of content, HDR is a gimmick. At its worst, it just washes out normal colors. I get the impression that the content creators have to keep that in mind, as some early HDR-supported games looked like trash on the PS4. But when it's used well, it offers lighting effects that can't be duplicated via normal means. At least not quite in the same way. The ability to have isolated ultra-bright lighting effects is really cool when appropriate. It's very life-like. Resident Evil 7 is a great example of doing it well. A few movies (and Netflix shows) do, too. If the game or movie doesn't put a heavy focus on pinpoint lighting effects, it's probably pointless.
 

Zarathustra[H]

Fully [H]
Domingo said:
Probably depends on who you ask and what the content is. For a lot of content, HDR is a gimmick. At its worst, it just washes out normal colors. I get the impression that the content creators have to keep that in mind, as some early HDR-supported games looked like trash on the PS4. But when it's used well, it offers lighting effects that can't be duplicated via normal means. At least not quite in the same way. The ability to have isolated ultra-bright lighting effects is really cool when appropriate. It's very life-like. Resident Evil 7 is a great example of doing it well. A few movies (and Netflix shows) do, too. If the game or movie doesn't put a heavy focus on pinpoint lighting effects, it's probably pointless.


Hmm. Maybe I'll give it a try one of these days. How and in what mode are you enabling it?
 

Domingo

Fully [H]
Zarathustra[H] said:
Hmm. Maybe I'll give it a try one of these days. How and in what mode are you enabling it?

As long as you have "UHD Color" enabled for the input you're using, games will auto-detect it. With the PS4, the whole Netflix app triggers it regardless of what content you're playing.
 

Zarathustra[H]

Fully [H]
Domingo said:
As long as you have "UHD Color" enabled for the input you're using, games will auto-detect it. With the PS4, the whole Netflix app triggers it regardless of what content you're playing.

Huh.

Maybe I've been using it all along and didn't realize.

Come to think of it, some of the lighting effects in Deus Ex Mankind Divided did seem kind of intense.
 

ChodeNOde

n00b
Hi guys,

Not sure where else to turn. I just built a new rig: i7-7700K, 16 GB RAM, GeForce GTX 1080 (not Ti), on Windows 10. I'm outputting this to my Samsung UN55JS9000 TV. I was trying out the new Destiny 2 beta, and it runs and looks great at 4K 60 Hz. When I turn on HDR, however, it doesn't seem to display colors right - it's like I'm looking at an image with the sun behind it, and the whites bleach out lighter areas of the screen.

I know screenshots would be dumb to post, but the difference can be appreciated with pictures from my phone camera. The area around the dude's head makes it pretty apparent.

Without HDR:
http://imgur.com/xVoX7dW

With HDR:
http://imgur.com/EWUn36S

The Samsung tech said that if my TV was outputting HDR, it should show something like this: http://imgur.com/KU8hvlO - but I see no such HDR tag when I flip HDR on in Destiny 2 (or Resident Evil 7, either). A guy I know with this TV says that tag is no longer a thing on this model.

I set the Output dynamic range to Full in the Nvidia Control Panel prior to this. I've followed the instructions in this guide. My cable is rated at 18 Gbps and is 15 feet long (reviewers on Amazon said they got that 15' one to work for 4K HDR). The colors do seem to pop more with HDR enabled, but the whites just bleach everything out. That test image looks right on my TV, but my eyesight has to strain a little at the last row of text, so I'm guessing it's at 4:4:4. Sadly, I don't have another HDR-capable device to try something else. I have a 10' cable on the way, but I'm not hopeful it will work. Can anybody recommend a way to troubleshoot this further? I'm hitting a brick wall.

I'd truly appreciate any help.
Update: I'd just like to report that my problem seems to have corrected itself with the new Creators Update (CU), new CU drivers, and the live Destiny 2 build. I'm getting HDR. My TV won't display that HDR tag, but it's pretty clearly displaying HDR content.
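For anyone else chasing the bleached-whites symptom: it is often a full-vs-limited range mismatch between the GPU output and the TV's HDMI black level rather than anything HDR-specific. A grayscale step ramp makes the mismatch easy to spot; a minimal sketch, assuming Pillow is installed:

```python
# Grayscale ramp for spotting full-vs-limited range mismatches (sketch).
# On a correctly matched chain every bar is distinguishable; if all bars
# below 16 crush to black or all bars above 235 clip to white, the GPU
# range setting and the TV's HDMI Black Level likely disagree.
from PIL import Image, ImageDraw

levels = list(range(0, 256, 8)) + [255]
bar_w, height = 40, 200
img = Image.new("RGB", (bar_w * len(levels), height))
draw = ImageDraw.Draw(img)

for i, v in enumerate(levels):
    draw.rectangle([i * bar_w, 0, (i + 1) * bar_w - 1, height - 1], fill=(v, v, v))

img.save("range_check.png")
print("Display range_check.png fullscreen and look for crushed or clipped steps.")
```

If the steps crush or clip, revisit the Nvidia "Output dynamic range" setting and the TV's HDMI Black Level together rather than either one in isolation.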
 

Domingo

Fully [H]
A few firmware revisions ago my TV used to have a little drop down that would alert me every time HDR content was playing. That went away, but hitting the "Info" button still reveals whether it's working with most content. It doesn't work with the TV's internal Netflix app, but it does work with the YouTube one.
 

Lateralus

More [H]uman than Human
Lateralus said:
And that's why I've stuck with my JS9000. Everything is a compromise. I want 4K 120 Hz with low input lag and good response time. Don't even care if it's VA or IPS. Unless something like that comes out, it's OLED for me next.

Well, after 2.5 wonderful years of using the JS9000 as my monitor, it is time for us to part ways.

Until now, the LG OLED TVs have been priced high enough that I could keep on telling myself that my JS9000 was "good enough." I know that OLED transcends LCD technology, but the $2000-$3000 price point was enough for me to keep dealing with what I have and waiting for that mythical 40"+ 4K 120 Hz G-Sync unicorn monitor that may or may not exist one day.

Today, however, a sales flyer from Newegg changed the game. I caught a sale for the 55" LG B7 (2017 model)...$1399 shipped. 55" is larger than I really wanted but for the price I will deal with the size. Tired of conventional LCDs in the 40-50" range having compromises and I really do not see a 40"+ 4K 120 Hz happening any time soon. The B7 will provide a stunning picture with less input lag, plus it does 120 Hz at 1080p which was an attractive feature of the Sony 4K TVs.

Thread in HotDeals if anyone is interested. I'm looking forward to playing some of my favorite games all over again, just like I did when I got the JS9000. :)
 

Zarathustra[H]

Fully [H]
Lateralus said:
Well, after 2.5 wonderful years of using the JS9000 as my monitor, it is time for us to part ways.

Until now, the LG OLED TVs have been priced high enough that I could keep on telling myself that my JS9000 was "good enough." I know that OLED transcends LCD technology, but the $2000-$3000 price point was enough for me to keep dealing with what I have and waiting for that mythical 40"+ 4K 120 Hz G-Sync unicorn monitor that may or may not exist one day.

Today, however, a sales flyer from Newegg changed the game. I caught a sale for the 55" LG B7 (2017 model)...$1399 shipped. 55" is larger than I really wanted but for the price I will deal with the size. Tired of conventional LCDs in the 40-50" range having compromises and I really do not see a 40"+ 4K 120 Hz happening any time soon. The B7 will provide a stunning picture with less input lag, plus it does 120 Hz at 1080p which was an attractive feature of the Sony 4K TVs.

Thread in HotDeals if anyone is interested. I'm looking forward to playing some of my favorite games all over again, just like I did when I got the JS9000. :)

You aren't worried about image retention?

That - more than anything - is what is keeping me from going OLED.
 

Lateralus

More [H]uman than Human
Zarathustra[H] said:
You aren't worried about image retention?

That - more than anything - is what is keeping me from going OLED.

Nope. From everything I've read, any IR that you might get is temporary. One review that I read today said that they had zero instances where it didn't go away on its own. The people on the forum here who have purchased these sets for monitors haven't indicated that it's a problem, either. And as far as I know, not one of them has any intention of going back to traditional LCD displays, so that in itself says a lot to me.

It's important to remember that it's image retention, not burn-in. And the 2017 sets are better at mitigating it than last year's. If you want to keep waiting, it's possible that future sets will be even better but for me, now was the right time to jump in.
 

Zarathustra[H]

Fully [H]
Lateralus said:
Nope. From everything I've read, any IR that you might get is temporary. One review that I read today said that they had zero instances where it didn't go away on its own. The people on the forum here who have purchased these sets for monitors haven't indicated that it's a problem, either. And as far as I know, not one of them has any intention of going back to traditional LCD displays, so that in itself says a lot to me.

It's important to remember that it's image retention, not burn-in. And the 2017 sets are better at mitigating it than last year's. If you want to keep waiting, it's possible that future sets will be even better but for me, now was the right time to jump in.

Hmm. The burn in on my Droid Turbo OLED screen from using Waze a bunch was permanent.

Has the tech really changed that much?
 

Lateralus

More [H]uman than Human
Zarathustra[H] said:
Hmm. The burn in on my Droid Turbo OLED screen from using Waze a bunch was permanent.

Has the tech really changed that much?

I apologize for not being able to go into more detail, but it's almost midnight here and I have to get up pretty early for work. But here is a quote from one professional review of the B7:

If you look closely you can in the third picture above see a faint outline of a box in the very center of the panel. This is a 10% window that we use for measurements and calibration, which has stuck on the panel as retention. We did not actually see it and a 2s exposure shot was needed (plus 1-2 hours of measurements with the 10% window) to reveal that it was there but as always we should emphasize that retention is possible on OLED panels.

The situation is nothing like the plasma days and based on our experience to date there is nothing to fear. Just use an OLED TV like you normally would. We have used them for 6-8 hours straight for gaming without issues. If you are unlucky enough to provoke retention, the panel will run a cleaning process when turned off (it needs stand-by power), meaning that it will be gone the next morning.

That's just one example of people saying pretty much the same thing - "Don't worry about it."

But since this thread is about optimizing the Samsung TVs for PC use and not about LG OLED TVs, I'm a bit reluctant to drag it further off topic (my fault). Maybe we could/should discuss this elsewhere if you want to explore the topic further?
 

Nenu

[H]ardened
Using an OLED for PC use is a far cry from video use.
For sure there is nothing to fear if it gets used mainly for video, because that evens out the wear a great deal.
Don't take those reviews to mean it is OK for PC use.

The cleaning process accelerates wear of the display, which will reduce its brightness.
 

Lateralus

More [H]uman than Human
We will see. I still stand by the fact that no actual owners of them have reported any permanent burn-in despite being used as a PC monitor (at least that I can find). Rtings recommends it for PC use with no warnings that I saw. If it's an issue then I just got an outstanding new OLED TV for the best price I've ever seen it, and I continue the search for the perfect LCD.
 

Nitemare3219

Limp Gawd
Lateralus said:
I apologize for not being able to go into more detail, but it's almost midnight here and I have to get up pretty early for work. But here is a quote from one professional review of the B7:



That's just one example of people saying pretty much the same thing - "Don't worry about it."

But since this thread is about optimizing the Samsung TVs for PC use and not about LG OLED TVs, I'm a bit reluctant to drag it further off topic (my fault). Maybe we could/should discuss this elsewhere if you want to explore the topic further?

Nenu said:
Using an OLED for PC use is a far cry from video use.
For sure there is nothing to fear if it gets used mainly for video, because that evens out the wear a great deal.
Don't take those reviews to mean it is OK for PC use.

The cleaning process accelerates wear of the display, which will reduce its brightness.

Lateralus said:
We will see. I still stand by the fact that no actual owners of them have reported any permanent burn-in despite being used as a PC monitor (at least that I can find). Rtings recommends it for PC use with no warnings that I saw. If it's an issue then I just got an outstanding new OLED TV for the best price I've ever seen it, and I continue the search for the perfect LCD.

Hey fellas. I had a 2015 LG EG9600 OLED, which I traded up for a 2016 C6, which I traded up for a 2017 C7 (solely due to lower input lag vs. the C6, which is a dramatic difference). I have had zero burn-in on any of these TVs. Image retention can happen, but it goes away. With that said, I do take a few steps to help prevent the problem; they don't really affect my usage and aren't too outlandish...

1. When not gaming, I use a different picture mode for lower brightness (OLED light around 20-30 - my room is dark so it's actually almost necessary considering all the white desktop windows).
2. All my desktop icons are in the middle of the display, so that they are covered by my browser window, etc. the vast majority of the time. I also web browse with about a 50% horizontally sized window since full screen is a bit insane to browse with at 55" close up.
3. Task bar auto-hides.
4. After 5 minutes, my screensaver goes to a 100% black image to turn off the pixels if I am briefly stepping away from the TV. I always power off the TV when away for extended periods.
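On step 4: anything that paints the whole screen true black does the job, since an OLED switches pixels off entirely at level 0. A trivial stand-in for such a screensaver, sketched with Python's built-in tkinter (a hypothetical helper, not any particular screensaver product):

```python
# Minimal "pure black" screen blanker sketch (tkinter ships with Python).
# A fully black frame lets an OLED turn its pixels off entirely.
# Press Esc or click anywhere to exit.
import tkinter as tk

root = tk.Tk()
root.attributes("-fullscreen", True)
root.configure(bg="black", cursor="none")

root.bind("<Escape>", lambda e: root.destroy())
root.bind("<Button>", lambda e: root.destroy())

root.mainloop()
```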
 