6 best 4K TVs for HDR/SDR Console and PC gaming Oct 2016

Wiz33
Anyone thinking about getting a 4K TV for PC/Gaming should give this a read.

http://4k.com/these-are-the-6-best-4k-tvs-for-hdrsdr-console-and-pc-gaming-17960-3/

The only thing I disagree on is the HDR verdict for the LG OLED. The LG maxes out at about 660 nits of brightness (the HDR spec requires a minimum of 1,000 nits) and tends to lower the overall brightness on HDR content to preserve the range between dark and bright, while the Samsungs and the Sony can all go above 1,000 nits and will give you highlights so bright that you almost have to avert your eyes, while still maintaining dark detail.

Also, don't dismiss HDR just because you only plan on using it as a PC monitor. PC content supporting HDR and 10-bit color will be showing up very soon. Basically, unless you are under a real budget constraint, consider spending just a bit more to get a 4K set with the full HDR spec.

Those looking for further info on this HDR thing may want to read this:

http://www.pcgamer.com/what-hdr-means-for-pc-gaming/
 
Waiting for OLEDs to drop below $1K, till then this is all window-shopping...
 
Waiting for OLEDs to drop below $1K, till then this is all window-shopping...

You should also be waiting for OLED to fix its problems with:
  • motion blur
  • input lag
  • burn-in
  • brightness fade over time
Honestly, the burn-in alone is a huge con for OLED, especially for gaming, where you often have static HUD elements and whatnot on-screen for long periods of time. After a year, that OLED will be trash. Totally not worth it. The tech is in its infancy and not really ready for consumer adoption.
 
Gonna be at least 2-3 years. If you are at 1080p and looking to switch up to 4K, I would just go ahead and do it, then upgrade once OLED comes down in price and has been further refined.
 
I was ready to get the OLED until the 2016 Samsungs with HDR10 came out. I do have a UHD Blu-ray player, and going by all the reviews, the OLED is just not bright enough, especially during the daytime, unless your room has home-theater curtains. 660 nits is not bright enough and does not meet the HDR10 spec's 1,000-nit minimum. Input lag on the Samsung is also better than on current OLEDs. There are also concerns about color degradation over time, and burn-in is a definite concern for PC use, where you tend to have windows open in the same locations. I'll give it another few years and then take a second look (maybe for pure TV use).
 
I am disappointed you can't play 4K Blu-rays on an HTPC. When I heard about the Pacific Rim and Prometheus 4K discs, I wanted them. But without the proper drives and the proper software, there isn't any way to play them natively off the disc. MKV files, sure, but those probably trade off quality versus the original source.

Gonna be a while yet before we can do it.
 
You should also be waiting for OLED to fix its problems with:
  • motion blur
  • input lag
  • burn-in
  • brightness fade over time
Honestly, the burn-in alone is a huge con for OLED, especially for gaming, where you often have static HUD elements and whatnot on-screen for long periods of time. After a year, that OLED will be trash. Totally not worth it. The tech is in its infancy and not really ready for consumer adoption.


All irrelevant on the 2016 models. I don't care what you read on the 'net. I bought the 65" E6 for my mom. Set TruMotion off and, even for movies, there is no "blur". The latest update gets latency down under 33 ms: a non-issue with my daughter's PS4 hooked up to it, and I also tried gaming with the PC specced below. Fun, no issues, and the colors really pop.

Brightness fade? Are you kidding me? It would take 10 years of 6-8 hours of daily watching to lose (I believe it was posted at the AVS forum) 20%. Once again, this is from someone who doesn't own, or can't afford, one of these TVs. Also, all LG OLED TVs support both Dolby Vision and HDR10. I went over and watched a couple of movies with her, including Warcraft in Dolby Vision, and it's incredible. Don't knock it till you actually have one.

Oh, and the high nit rating on the Sony 75" 940D couldn't hold its own against this TV. Why? I had that one before we traded it in for the OLED, which does need calibration but can't be beat once it has it. I'll post pics later when I stop over! Give me a few hours.
 
> a TV could have a peak brightness of 400 nits and a black level of 0.4 nits.

> How can OLED, with its brightness issues, qualify for HDR compatibility, which demands much higher brightness levels than standard TVs? Well, the UHD Alliance has got around the problem by introducing two standards:

> STANDARD 1: More than 1,000 nits peak brightness and less than 0.05 nits black level.

> STANDARD 2: More than 540 nits peak brightness and less than 0.0005 nits black level. (OLED)

> While standard one demands higher brightness and tolerates a higher black level, standard two tolerates a lower brightness and demands a lower black level. This means manufacturers looking to make LED HDR TVs, which most are, will abide by standard one, while OLED TVs will be able to gain the Ultra HD Premium label by conforming to standard two.

> And it doesn't stop there. If you're still with us, there's more colour stuff to go over. An HDR TV must be able to produce a certain amount of what's known as 'P3' colour. P3 colour refers to the range of the colour spectrum which is included. The best way to think about this is to imagine an overall colour spectrum, and within that a set of defined spaces. The P3 colour space is larger than what standard TVs use, Rec. 709, which means it covers more colours. HDR means a TV can cover a wider space within the colour spectrum, and within that space, the various gradations of shades will be much smoother.
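
As a quick aside on the two tiers quoted above: the black-level requirement does most of the work. A back-of-the-envelope calculation (figures straight from the quote) of the minimum contrast ratio each standard implies:

```python
# Implied minimum contrast ratio of each Ultra HD Premium tier,
# using the peak/black figures from the quote above.
tiers = {
    "Standard 1 (LCD)":  (1000, 0.05),
    "Standard 2 (OLED)": (540, 0.0005),
}
for name, (peak_nits, black_nits) in tiers.items():
    print(f"{name}: {peak_nits / black_nits:,.0f}:1")
# Standard 1 (LCD): 20,000:1
# Standard 2 (OLED): 1,080,000:1
```

Which is why OLED's lower-peak tier still comes out far ahead on contrast.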

--------------------------------
P3 color is a few percent larger than Adobe RGB, mostly on the green end.
This might not be as big a deal compared to some computer monitors, but it's a great move as a TV standard.


---------------------------------------------------

I'm guessing the only LCD monitors that will get even close to that standard will be VA ones, due to the black-depth requirement. For now I bought a 70" 4K VA TV with FALD/zone-lit direct LED backlighting, with killer black depth and contrast (and no HDR). Calibrated in its Dark mode, it yielded 31 fL peak white and 0.0038 fL black, an 8,157:1 contrast ratio.
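
For anyone more used to nits than foot-lamberts, here's a quick conversion of those readings (1 fL ≈ 3.426 cd/m², i.e. nits):

```python
# Converting the calibration readings above from foot-lamberts to nits
# and recomputing the contrast ratio. 1 fL ≈ 3.426 cd/m² (nits).
FL_TO_NITS = 3.426

peak_fl, black_fl = 31.0, 0.0038               # measured peak white / black
print(f"peak white: {peak_fl * FL_TO_NITS:.1f} nits")    # ~106.2 nits
print(f"black:      {black_fl * FL_TO_NITS:.4f} nits")   # ~0.0130 nits
print(f"contrast:   {peak_fl / black_fl:,.0f}:1")        # ~8,158:1, matching the quoted ratio
```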

Hopefully the 2560x1440 and 3440x1440 Samsung 144 Hz - 200 Hz variable-sync monitors will be good and will hold me over until full-featured OLED gaming monitors (high Hz, variable Hz, hopefully screen blanking for blur reduction/elimination) and large OLED TVs are ubiquitous with the wrinkles worked out. By then more HDR content (and 4K content) will be available too. I'm guessing around 3-5 years until I make that leap on both the TV and the gaming monitor, and I hope to never look back at LCD again.

VA black depths blow away IPS and TN.

OLED does have sample-and-hold blur, and I don't consider any display a modern PC gaming monitor if it's capped at 60 fps-Hz (low motion clarity/blur reduction, low motion definition and path articulation) and lacks modern gaming overdrive, variable Hz, and/or screen-blanking or strobing blur-elimination tech. Personally, I won't even bother with a 100 Hz cap knowing DP 1.3 displays of much higher Hz are around the corner (including, eventually, 4K 120 Hz+ displays with DP 1.3 inputs).

-----------------------------------------------------------------------------------------------------------
100 fps-Hz average or so would be the lowest I'd go, and even then, with variable Hz, the frame-rate (and therefore blur) graph would be vibrating all over the place.
For a few examples of high-demand games at 2560x1440 from the last year, not even considering games yet to be released: The Witcher 3 gets about 82 fps average on ultra on a single GTX 1080 with HairWorks disabled, and Far Cry Primal on very high gets about 80 fps average.

Blur reduction / motion-clarity increase vs. a 60 fps-Hz baseline:
~80 fps-Hz: ~25% -- <<100 fps-Hz: 40%>> -- 120 fps-Hz: 50% -- 144 fps-Hz: ~58%
And motion definition / path articulation / smoothness, as a frame ratio vs. 60 fps-Hz:
~80 fps-Hz: ~1.3:1 -- <<100 fps-Hz: 5:3>> -- 120 fps-Hz: 2:1 -- 144 fps-Hz: 2.4:1
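
Those figures aren't arbitrary; on an idealized sample-and-hold display they fall straight out of persistence math. A minimal sketch of where the two scales above come from:

```python
# The arithmetic behind the two scales above, assuming an idealized
# sample-and-hold display: eye-tracking blur scales with frame
# persistence (1/fps), motion definition with frames drawn per second.
BASELINE = 60  # fps-Hz reference point

for fps in (80, 100, 120, 144):
    blur_reduction = 1 - BASELINE / fps    # shorter persistence vs. 60
    definition_ratio = fps / BASELINE      # more unique frames vs. 60
    print(f"{fps:>3} fps-Hz: {blur_reduction:6.1%} less blur, "
          f"{definition_ratio:.2f}:1 motion definition")
```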
 
I am disappointed you can't play 4K Blu-rays on an HTPC. When I heard about the Pacific Rim and Prometheus 4K discs, I wanted them. But without the proper drives and the proper software, there isn't any way to play them natively off the disc. MKV files, sure, but those probably trade off quality versus the original source.

Gonna be a while yet before we can do it.

Well, I would not hold my breath for a PC UHD drive, with all the HDCP 2.2 copy-protection stuff; Netflix won't even let you stream 4K from a PC even if your video card has HDCP 2.2. Not that any of that is preventing people from ripping UHD content already.

UHD players are dropping in price. I got the Samsung early at close to full price, but it has already dropped 35% (~$250), and you can find refurbs or open-box units for under $200.

Also, you really need a UHD player to appreciate what the newer 4K TVs can do. I've had Netflix and Amazon 4K streaming since last year, and while it was nice, it never really wowed me until I got the UHD player and some UHD movies.
 
Point is that a TV which can at least get up to the 1,000-nit mark is nominally capable of handling the extremes of HDR10 without the aid of dynamic metadata; i.e., there's no need for dynamic mapping because the TV can accommodate the full signal being thrown at it, whereas something less well endowed nit-wise benefits far more from dynamic DV, because the signal is being properly mapped rather than badly clipped, as HDR10 is on some manufacturers' sets.

This is something that HDTVTest proved in a recent article: HDR10 on an OLED had noticeably clipped highlights compared to the equivalent DV source material, but an LCD TV showing the same HDR10 source had far more nuance and was essentially equivalent to the DV version on the OLED for what could be seen in the specular highlights.

Many have reported that while watching UHD discs on an OLED set, the overall picture was dimmed to allow for the highlights.
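
To illustrate the clipping-vs.-mapping difference in the abstract, here is a toy curve (my own illustration, not any manufacturer's actual tone-mapping algorithm): hard-clipping a 1,000-nit signal on a ~650-nit panel collapses all the highlights, while even a crude roll-off keeps them distinct.

```python
# Toy illustration (not any vendor's real algorithm) of the difference
# between hard-clipping a 1000-nit HDR10 signal on a ~650-nit panel and
# rolling it off with a tone curve that keeps highlight gradation.
PANEL_PEAK, MASTER_PEAK = 650.0, 1000.0

def hard_clip(nits):
    return min(nits, PANEL_PEAK)

def roll_off(nits, knee_frac=0.75):
    """Linear below the knee, asymptotic compression above it."""
    knee = knee_frac * PANEL_PEAK
    if nits <= knee:
        return nits
    t = (nits - knee) / (MASTER_PEAK - knee)
    return knee + (PANEL_PEAK - knee) * t / (1 + t)

for nits in (400, 700, 850, 1000):
    print(f"{nits:>4} -> clip {hard_clip(nits):6.1f} | "
          f"roll-off {roll_off(nits):6.1f}")
# Clipping collapses 700/850/1000 nits to one value; the roll-off keeps
# them distinct, which is what dynamic mapping buys a dimmer panel.
```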
 
I would think they'd just move the entire range down (shift it by 460-500 nits, i.e. the 1,000-nit LCD standard vs. OLED's 540-nit standard), since OLED's black depth is so much lower, but perhaps they use some dynamic formula, as you seem to suggest. The LCD black-depth standard is 0.05 nits and OLED's is 0.0005, which is amazing.

You keep mentioning the highs of 1,000 nits, but...
What is the lowest black level studios actually master HDR content down to?
If it's lower than 0.05 nits, then LCDs would have to be clipped on the low end (or perhaps have the bottom of their range, black depth vs. shadow detail, dynamically adjusted).

I'd definitely be interested to know.

----------------------------------------------------------------------
> a TV could have a peak brightness of 400 nits and a black level of 0.4 nits.

> How can OLED, with its brightness issues, qualify for HDR compatibility, which demands much higher brightness levels than standard TVs? Well, the UHD Alliance has got around the problem by introducing two standards:

> STANDARD 1: More than 1,000 nits peak brightness and less than 0.05 nits black level.

> STANDARD 2: More than 540 nits peak brightness and less than 0.0005 nits black level. (OLED)

> While standard one demands higher brightness and tolerates a higher black level, standard two tolerates a lower brightness and demands a lower black level. This means manufacturers looking to make LED HDR TVs, which most are, will abide by standard one, while OLED TVs will be able to gain the Ultra HD Premium label by conforming to standard two.
 
OLED does have sample-and-hold blur, and I don't consider any display a modern PC gaming monitor if it's capped at 60 fps-Hz (low motion clarity/blur reduction, low motion definition and path articulation) and lacks modern gaming overdrive, variable Hz, and/or screen-blanking or strobing blur-elimination tech. Personally, I won't even bother with a 100 Hz cap knowing DP 1.3 displays of much higher Hz are around the corner (including, eventually, 4K 120 Hz+ displays with DP 1.3 inputs).

Only if you play nothing but fast-action games. It makes no difference for console gamers, since the One S and PS4 Pro would be hard pressed to hit 4K@60. Also, PC gaming is not all fast action.
 
http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975

(quote)
TVs obviously need to be able to output a higher brightness level but also a very low black level. Plasma TVs were not able to do that, but LCDs and OLEDs are. On an LCD you will need a "local dimming" system to be able to control brightness locally in zones. The more zones the better. We have already seen edge-LED-based LCD TVs claim HDR support, but in our opinion this is stretching it. Ideally you would want to be able to control light output from 0 nits to a maximum brightness level of 800-1000 nits (or much higher for Dolby Vision, typically 4,000-10,000) in every single pixel. Does that sound familiar? Yes, that is how OLED displays work.

------------------------------------------------------------------------
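
A toy sketch of the zone idea the quote describes (the frame and zone counts are invented for illustration; real sets use far more zones and smarter algorithms):

```python
# Toy model of FALD local dimming: split the backlight into zones and
# drive each zone's LED only as high as its brightest pixel demands,
# so dark regions stay dark while a highlight elsewhere hits full output.
# Frame size and zone counts here are invented for illustration.
W, H = 16, 8                 # toy frame resolution
ZX, ZY = 4, 2                # 4x2 = 8 dimming zones (real sets: more)

frame = [[0.02] * W for _ in range(H)]   # near-black scene...
frame[1][13] = 1.0                       # ...with one specular highlight

zw, zh = W // ZX, H // ZY
for zy in range(ZY):
    for zx in range(ZX):
        peak = max(frame[zy * zh + y][zx * zw + x]
                   for y in range(zh) for x in range(zw))
        print(f"zone ({zx},{zy}): LED level {peak:.2f}")
# Only the zone containing the highlight lights up; the fewer/larger
# the zones, the more the highlight "blooms" into its surroundings.
```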

In regard to the fast-action-games comment: motion clarity and motion definition are aesthetic gains, not just twitch-scoring advantages. I know consoles are low frame rate and low Hz, but you mentioned PC gaming too.
Anyone who has played a gorgeous game world in 1st/3rd person at high Hz should be able to see large aesthetic gains. The "soft blur" within the shadow masks of objects at high Hz is still an annoyance, but smearing motion blur at 60 fps-Hz and lower is downright ugly. The motion-definition improvement and pathing feel and look much better (and are superior aesthetically) too. Keep in mind you are constantly moving the entire game-world viewport in relation to your viewpoint in 1st/3rd-person games while mouse-looking and movement-keying; it's not just a single UFO object crossing the screen, it's the whole screen.

VR is 90 Hz for a reason, and it would be higher if they could make it so.

Luckily there will be 3440x1440 21:9 and 4K at 120 Hz or better on DP 1.3 displays, but it might be a while yet. Hopefully mid-2017 to end of 2017, since they've been pushed back a bit.
 
I would think they'd just move the entire range down (shift it by 460-500 nits, i.e. the 1,000-nit LCD standard vs. OLED's 540-nit standard), since OLED's black depth is so much lower, but perhaps they use some dynamic formula, as you seem to suggest. The LCD black-depth standard is 0.05 nits and OLED's is 0.0005, which is amazing.

You keep mentioning the highs of 1,000 nits, but...
What is the lowest black level studios actually master HDR content down to?
If it's lower than 0.05 nits, then LCDs would have to be clipped on the low end (or perhaps have the bottom of their range, black depth vs. shadow detail, dynamically adjusted).

I'd definitely be interested to know.

The Samsung KS9500 does 0.015 black without local dimming and with no clipping. And while there seems to be a large difference between 0.015 and 0.0005, you do realize both are already so much better than the best plasma screens could do a few years ago. I would say that in practical use, the only time you might see a slight difference is in overall dark scenes.
 
Well, I would not hold my breath for a PC UHD drive, with all the HDCP 2.2 copy-protection stuff; Netflix won't even let you stream 4K from a PC even if your video card has HDCP 2.2. Not that any of that is preventing people from ripping UHD content already.
Supposedly there are already some drives out that can read the contents of the discs; however, Cyberlink is still working on the player software. But yes - no oxygen being withheld here. :)

It will be nice if I am someday able to buy a PC drive with bundled software that will play the discs on my system, for $200 or less.

Sure, I could get a standalone player, plug it into one of the other inputs, and just switch over to it. But we're nerds here. We HTPC here. Buying a standalone player isn't an attractive option.

Only if you play nothing but fast-action games. It makes no difference for console gamers, since the One S and PS4 Pro would be hard pressed to hit 4K@60. Also, PC gaming is not all fast action.
I've been playing a lot of BF4 on this display and it looks great to me. I do see some blur if I stand there and pan around at the right pace; at least I do with the KS8000, and I'm sure the KU6300 would do the same if not worse. Blur hasn't been eliminated from these displays entirely yet. I guess I'm used to it, having been gaming on large panels for a long time now. Some people are more sensitive to it than others.
 
http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975

(quote)
TVs obviously need to be able to output a higher brightness level but also a very low black level. Plasma TVs were not able to do that, but LCDs and OLEDs are. On an LCD you will need a "local dimming" system to be able to control brightness locally in zones. The more zones the better. We have already seen edge-LED-based LCD TVs claim HDR support, but in our opinion this is stretching it. Ideally you would want to be able to control light output from 0 nits to a maximum brightness level of 800-1000 nits (or much higher for Dolby Vision, typically 4,000-10,000) in every single pixel. Does that sound familiar? Yes, that is how OLED displays work.

------------------------------------------------------------------------

In regard to the fast-action-games comment: motion clarity and motion definition are aesthetic gains, not just twitch-scoring advantages. I know consoles are low frame rate and low Hz, but you mentioned PC gaming too.
Anyone who has played a gorgeous game world in 1st/3rd person at high Hz should be able to see large aesthetic gains. The "soft blur" within the shadow masks of objects at high Hz is still an annoyance, but smearing motion blur at 60 fps-Hz and lower is downright ugly. The motion-definition improvement and pathing feel and look much better (and are superior aesthetically) too. Keep in mind you are constantly moving the entire game-world viewport in relation to your viewpoint in 1st/3rd-person games while mouse-looking and movement-keying; it's not just a single UFO object crossing the screen, it's the whole screen.

VR is 90 Hz for a reason, and it would be higher if they could make it so.

Luckily there will be 3440x1440 21:9 and 4K at 120 Hz or better on DP 1.3 displays, but it might be a while yet. Hopefully mid-2017 to end of 2017, since they've been pushed back a bit.

I play a good variety of games, except I'm getting a bit too old for fast FPS. The closest thing to what you mentioned about 1st/3rd person is probably Armored Warfare, and I have no problem with the JU7500 40" or the KS9500 65", although running 4K at max settings at 60 fps was at times a challenge even for my GTX 1080 OC.
 
Supposedly there are already some drives out that can read the contents of the discs; however, Cyberlink is still working on the player software. But yes - no oxygen being withheld here. :)

It will be nice if I am someday able to buy a PC drive with bundled software that will play the discs on my system, for $200 or less.

Sure, I could get a standalone player, plug it into one of the other inputs, and just switch over to it. But we're nerds here. We HTPC here. Buying a standalone player isn't an attractive option.


I've been playing a lot of BF4 on this display and it looks great to me. I do see some blur if I stand there and pan around at the right pace; at least I do with the KS8000, and I'm sure the KU6300 would do the same if not worse. Blur hasn't been eliminated from these displays entirely yet. I guess I'm used to it, having been gaming on large panels for a long time now. Some people are more sensitive to it than others.

Talking about HTPCs, have you seen some of the newer Android-based media-player boxes? I have a Minix Neo U1 and I can play 4K@30 movies from my camera with no problem. The only thing it's lacking now is Dolby Atmos pass-through, although it already does DTS-HD MA and Dolby TrueHD pass-through. I figure the next generation will do pretty much everything you can do with an HTPC, but in a much smaller package and totally silent.

While I do like to have everything on a media server, the dramatic difference between what you get on a BD-66/100 and anything further compressed (Netflix, Amazon, MKVs) is well worth the price of a UHD player (in fact, I just ordered a 2nd one so I have one on each TV). I've also found that even a full BD rip looks just a bit softer, even using the latest Cyberlink software, compared to disc-based playback on the same TV.
 
Just a reminder that some 4K TVs can do 120 Hz native PC input at 1080p. They'd still lack variable-Hz capability, which provides more smoothness vs. judder, stops, and tearing (some games tear worse than others, even just from bad code), and they'd lack modern gaming overdrive to cut blur as much, but they'd potentially cut blur appreciably more than 60 fps-Hz and still give an increased motion-definition/smoothness benefit at high frame rates (rates more easily achieved at 1080p, of course).
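
Bandwidth-wise it's easy to see why that works on sets whose inputs already handle 4K60; whether a given TV actually accepts 1080p120 still depends on its scaler and firmware. Blanking intervals are ignored here for simplicity:

```python
# Why a TV whose input already handles 4K60 has the raw bandwidth for
# 1080p120: the pixel rate is only half as high (blanking ignored).
def pixel_rate(w, h, hz):
    return w * h * hz

uhd_60  = pixel_rate(3840, 2160, 60)     # ~497.7 Mpx/s
fhd_120 = pixel_rate(1920, 1080, 120)    # ~248.8 Mpx/s
print(fhd_120 / uhd_60)                  # 0.5
```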

I'm holding out for DP 1.3 VA displays personally.
Hopefully a 120 Hz gaming VA 4K of around 40" comes out by the end of 2017, and maybe a similar 21:9 option.
That and my current TV would hold me over the 3-5 years until 4K HDR OLED becomes ubiquitous in TVs (and in TV content) and some real OLED gaming monitors are in play as well.

------------------------------------

Regarding HTPC - I can remember when Netflix wouldn't let you stream 1080p/"Super HD" off of a PC and limited it to specific devices, just like 4K is now. Hopefully it will allow it in browsers in the long run, just like it eventually did with 1080p, but I'm not holding my breath.
 
Talking about HTPCs, have you seen some of the newer Android-based media-player boxes? I have a Minix Neo U1 and I can play 4K@30 movies from my camera with no problem. The only thing it's lacking now is Dolby Atmos pass-through, although it already does DTS-HD MA and Dolby TrueHD pass-through. I figure the next generation will do pretty much everything you can do with an HTPC, but in a much smaller package and totally silent.

While I do like to have everything on a media server, the dramatic difference between what you get on a BD-66/100 and anything further compressed (Netflix, Amazon, MKVs) is well worth the price of a UHD player (in fact, I just ordered a 2nd one so I have one on each TV). I've also found that even a full BD rip looks just a bit softer, even using the latest Cyberlink software, compared to disc-based playback on the same TV.
I hadn't seen the Minix Neo or similar devices. I guess I'm a bit of a PC-master-race snob. I tend to try to get everything working how I want through this one device, for simplicity's sake, and because I just like doing a lot with the Windows desktop. I don't have anything against Android or any other solution. It's great that people have so many choices and ways to accomplish different tasks.

A standalone player will probably happen for me sooner or later if Cyberlink never produces the software needed to play the UHD discs. It's all up in the air at this point.

I can remember when Netflix wouldn't let you stream 1080p/"Super HD" off of a PC and limited it to specific devices, just like 4K is now.
Why do they limit it like this in the first place? What is the gain?
 
Just a reminder that some 4K TVs can do 120 Hz native PC input at 1080p. They'd still lack variable-Hz capability, which provides more smoothness vs. judder, stops, and tearing (some games tear worse than others, even just from bad code), and they'd lack modern gaming overdrive to cut blur as much, but they'd potentially cut blur appreciably more than 60 fps-Hz and still give an increased motion-definition/smoothness benefit at high frame rates (rates more easily achieved at 1080p, of course).

I'm holding out for DP 1.3 VA displays personally.
Hopefully a 120 Hz gaming VA 4K of around 40" comes out by the end of 2017, and maybe a similar 21:9 option.
That and my current TV would hold me over the 3-5 years until 4K HDR OLED becomes ubiquitous in TVs (and in TV content) and some real OLED gaming monitors are in play as well.

------------------------------------

Regarding HTPC - I can remember when Netflix wouldn't let you stream 1080p/"Super HD" off of a PC and limited it to specific devices, just like 4K is now. Hopefully it will allow it in browsers in the long run, just like it eventually did with 1080p, but I'm not holding my breath.

What do you plan on driving 4K@120 with? 2x GTX 1080 Ti? :-P
 
I was leaning toward a 35" 21:9 3440x1440 Samsung VA when they were first reported to be releasing a 165-200 Hz one, but it turns out they are releasing a 100 Hz model on DP 1.2 for now. At a 100 fps-Hz average, your frame-rate graph in actuality is usually a band from around 70 fps-Hz to 160+ fps-Hz, so it stays pretty tight, with 100-160 fps making up about 2/3 of the "noisy" frame-rate band.

To answer your question more directly: if I had a 4K 120 Hz+ display with variable Hz, I would run 2560x1440 on it at high to very-high+ settings to get around a 100 fps-Hz average in more demanding games. Although it might seem counter-intuitive from a clean-scaling perspective, according to reports 2560x1440 scales much better than 1080p (4:1 pixels) on a full 4K screen. That said, I have played some isometric adventure games on my 4K TV at 1080p and they seem to scale fine, without looking muddy like LCDs I've had in the past when run below native res. Alternatively, I would run games in a 21:9 or 16:9 window at 1:1 pixels on a 40" 4K and use the full 4K resolution for desktop/apps when not gaming. I would not run demanding 1st/3rd-person 3D-game-world games at full 4K resolution, only those where I could get around 100 fps-Hz at preferably very-high to very-high+ (ultra-minus, if you prefer) settings. So, in short (scale factors sketched after the list below):

...2560x1440 scaled full screen at very high to very high+ settings
...some other 21:9 or 16:9 sub-4K resolution, 1:1 pixel-mapped in a window
...the full 4K resolution for desktop/app real estate
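
For reference, the scale factors behind that counter-intuitive scaling point:

```python
# Scale factors when driving a 3840x2160 panel below native res:
# 1080p is a clean 2x2 integer scale, 1440p a fractional 1.5x that
# leans on the quality of the TV's scaler.
PANEL_W = 3840

for w, h in ((1920, 1080), (2560, 1440)):
    factor = PANEL_W / w
    kind = "integer" if factor.is_integer() else "fractional"
    print(f"{w}x{h}: {factor}x ({kind})")
# 1920x1080: 2.0x (integer)
# 2560x1440: 1.5x (fractional)
```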

I'm trying to hold out for a 1080 Ti, and I'm still debating SLI, since it seems to be supported less even as refresh rates and resolutions increase.
 
I was leaning toward a 35" 21:9 3440x1440 Samsung VA when they were first reported to be releasing a 165-200 Hz one, but it turns out they are releasing a 100 Hz model on DP 1.2 for now. At a 100 fps-Hz average, your frame-rate graph in actuality is usually a band from around 70 fps-Hz to 160+ fps-Hz, so it stays pretty tight, with 100-160 fps making up about 2/3 of the "noisy" frame-rate band.

To answer your question more directly: if I had a 4K 120 Hz+ display with variable Hz, I would run 2560x1440 on it at high to very-high+ settings to get around a 100 fps-Hz average in more demanding games. Although it might seem counter-intuitive from a clean-scaling perspective, according to reports 2560x1440 scales much better than 1080p (4:1 pixels) on a full 4K screen. That said, I have played some isometric adventure games on my 4K TV at 1080p and they seem to scale fine, without looking muddy like LCDs I've had in the past when run below native res. Alternatively, I would run games in a 21:9 or 16:9 window at 1:1 pixels on a 40" 4K and use the full 4K resolution for desktop/apps when not gaming. I would not run demanding 1st/3rd-person 3D-game-world games at full 4K resolution, only those where I could get around 100 fps-Hz at preferably very-high to very-high+ (ultra-minus, if you prefer) settings.

...2560x1440 scaled full screen at very high to very high+ settings
...some other 21:9 or 16:9 sub-4K resolution, 1:1 pixel-mapped in a window
...the full 4K resolution for desktop/app real estate

I'm trying to hold out for a 1080 Ti, and I'm still debating SLI, since it seems to be supported less even as refresh rates and resolutions increase.

Yeah, I never went SLI due to support and various problems. You should be OK with a 1080 OC for most games anyway, especially if you're willing to drop the res a bit on the demanding ones. 4K gaming above 60 Hz is going to be tough, as you can only do it through DP 1.4, and not many TVs have DP ports, but we can always hope for a 30"-40" 4K monitor that supports 120 Hz or above.
 
Tomorrow I will get some pics of Warcraft in Dolby Vision. 1,000 nits: BS marketing versus in-person viewing. "Why you throw nits?" Here are a quick couple of shots I got tonight when stopping over.

[photos attached]

These TVs don't have a brightness issue, period. LG 65" E6. Taken with an iPhone 6s+, so the real clarity is a little distorted; straight image, no adjustments. Pictures from someone who actually owns one, not someone reading off the internet.
 
I will take my daughter's PS4 over and try to get some pics of it on the set as well tomorrow. (The PC is a little heavy to lug around.)
 
1,000+ nits for HDR? Do they want their viewers to have retina burn every time the camera pans toward a sun? I don't mind having a lower brightness, especially when playing in a dark room.

As for 10-bit color channels, that's just another feature where digital video interfaces are catching up to VGA outputs with 10-bit RAMDACs, years later than they should have. I still remember when Matrox made that a big selling point. HDTV standards really are embarrassingly behind the times compared to what PC monitors have always been capable of, and that's only just now starting to turn around a bit with the push for higher resolutions and wider color spaces.

Anyway, I prioritize minimal input lag more than anything, with refresh rate, and video inputs that can keep up, a close second for PC gaming in particular. Going from 60 Hz to 90-95 Hz is a big leap already, and 120-144 Hz makes for a bigger leap still. HDTVs tend to fail pretty hard on both counts.
 
1,000+ nits for HDR? Do they want their viewers to have retina burn every time the camera pans toward a sun? I don't mind having a lower brightness, especially when playing in a dark room.

The whole image isn't always at 1,000 nits. That's just the momentary peak brightness, so it can get bright enough in some spots when needed.
 
A sunny day outside is ~15,000-35,000 nits. The actual sun is over a billion nits. An arc welder is >120 million.
So even the Dolby Vision target of 4,000 nits cannot burn the retina :p
Current OLEDs have a peak brightness of 700-800 nits. The LCD with the highest peak brightness currently is the Sony ZD9, and it can sustain an L20 test window at ~1,400-1,800 nits indefinitely.
 
Tomorrow I will get some pics of Warcraft in Dolby Vision. 1,000 nits: BS marketing versus in-person viewing. "Why you throw nits?" Here are a quick couple of shots I got tonight when stopping over.

[photos attached]

These TVs don't have a brightness issue, period. LG 65" E6. Taken with an iPhone 6s+, so the real clarity is a little distorted; straight image, no adjustments. Pictures from someone who actually owns one, not someone reading off the internet.


The E6 is a beautiful set and I was seriously considering it. A friend has one, and I took my UHD player over and sat there playing with it from early afternoon to evening. But it does have problems in a room with a fair bit of ambient light in the afternoon. In the evening, it was perfect. My living room has about the same ambient light during the daytime, and I'm not willing to install another set of theater curtains like my friend did, so I need something that works well during the day, since most of my TV viewing is going to be on weekends.

Also, we are not talking about normal picture brightness. We are talking about peak HDR brightness, when there is a dramatic difference between the dark and the bright, like when a search/spot light points directly at the screen for a second, or a lighthouse beam sweeps across the screen. In those cases, there is a dramatic difference between the E6's peaks of 652 nits on a 2% window and 651 nits on a 10% window vs. the Samsung KS9500's 1,490 nits at 2% and 1,468 nits at 10%. Even at 50% sustained, the Samsung will do 536 nits, while the E6 can only do 223 nits.
 
HDR isn't just about peak brightness. It's also about contrast and lack of blooming. OLED doesn't have as high a max brightness, but it can do perfect transitions/breaks with zero bloom, which makes it regarded by many professionals as having the best HDR.
 
HDR isn't just about peak brightness. It's also about contrast and lack of blooming. OLED doesn't have as high a max brightness, but it can do perfect transitions/breaks with zero bloom, which makes it regarded by many professionals as having the best HDR.

True. Having high peak brightness is all fine and dandy for intense specular highlights, but in practice it's bad if it makes large portions outside the specular highlight glow or, even worse, makes whole portions of the screen glow where there should be black around them, like edge-lit HDR TV solutions do. That kind of defeats the point of HDR, which is mainly meant for dark-room viewing. OLED may lack the peak brightness, but it more than makes up for it by being able to go all the way down to black, and it can produce the peak brights in each pixel individually without bleeding outside the intended area.

Also, the TS forgets to mention that while UHD Premium certification (currently considered the baseline for a fully HDR-capable TV) requires 1,000-nit peaks with 0.05 blacks for LCDs, it also has a second standard for self-emitting displays like OLED: 540-nit peaks and 0.0005-nit blacks, basically pure black and an effectively infinite contrast ratio.
 
HDR isn't just about peak brightness. It's also about contrast and lack of blooming. OLED doesn't have as high a max brightness, but it can do perfect transitions/breaks with zero bloom, which makes it regarded by many professionals as having the best HDR.

While it looks good on paper and with test patterns, in practical use does 0.015 vs. 0 black really make any difference? After playing with both sets under similar lighting conditions, I would say that the 0 black is really only barely noticeable in a totally dark room, and I don't watch TV in a totally dark room. I have put ambient LEDs on the back of all my sets to reduce eye strain, and with those the true black no longer makes a visible difference, and definitely not during daytime hours. So at the end of the day, I'll take 1,000+ nits and 0.015 black over 660 nits and 0 black any day.

Also, don't try to compare any streamed 4K from various sites. Get a UHD player and pop in Pacific Rim and you'll see true HDR in effect. Yes, the colors are oversaturated by design, but it really shows off the difference between a BD-66/100 and the various streaming services' so-called 4K HDR. When the spotlights on the Jaegers point toward the screen, they are at times so bright that you almost have to avert your eyes, and since most of the action happens at night, you'll see those extreme brights while still having excellent dark detail in the same frame.
 
True. Having high peak brightness is all fine and dandy for intense specular highlights, but in practice it's bad if it makes large portions outside the specular highlight glow or, even worse, makes whole portions of the screen glow where there should be black around them, like edge-lit HDR TV solutions do. That kind of defeats the point of HDR, which is mainly meant for dark-room viewing. OLED may lack the peak brightness, but it more than makes up for it by being able to go all the way down to black, and it can produce the peak brights in each pixel individually without bleeding outside the intended area.

Also, the TS forgets to mention that while UHD Premium certification (currently considered the baseline for a fully HDR-capable TV) requires 1,000-nit peaks with 0.05 blacks for LCDs, it also has a second standard for self-emitting displays like OLED: 540-nit peaks and 0.0005-nit blacks, basically pure black and an effectively infinite contrast ratio.


Also, I should clarify that while I do believe OLED's way of showing HDR is inherently superior, I am not knocking LCDs, FALD or good edge-LED or otherwise, because I do own a KS7500 (aka the KS8500 in the USA), very questionable edge-LED local-dimming system and all. I downloaded the Life of Pi HDR demo clip, copied it to a USB stick, watched it, and holy fuck does it look jaw-droppingly amazing. The old SDR system can't die fast enough in my book.

I really need a 4K Blu-ray player and a pile of 4K HDR movies to rewatch...
 
Also, I should clarify that while I do believe OLED's way of showing HDR is inherently superior, I am not knocking LCDs, FALD or good edge-LED or otherwise, because I do own a KS7500 (aka the KS8500 in the USA), very questionable edge-LED local-dimming system and all. I downloaded the Life of Pi HDR demo clip, copied it to a USB stick, watched it, and holy fuck does it look jaw-droppingly amazing. The old SDR system can't die fast enough in my book.

I really need a 4K Blu-ray player and a pile of 4K HDR movies to rewatch...

Yeah, download and watch the demos here and tell me that the black on your screen is not black enough or that the highlights are bleeding over into the surrounding area. The OLED spec looks good on paper and in test patterns but gives no practical advantage outside of a controlled test environment.

http://demo-uhd3d.com/categorie.php?tag=hdr
 
Yeah, download and watch the demos here and tell me that the black on your screen is not black enough or that the highlights are bleeding over into the surrounding area. The OLED spec looks good on paper and in test patterns but gives no practical advantage outside of a controlled test environment.

http://demo-uhd3d.com/categorie.php?tag=hdr

What? No! I am comparing SDR to HDR here. I am watching in a very dark room with a dim and very yellow backlight (a mere light bulb, not a proper 6500K daylight lamp), and I can very much tell that the black level is being elevated in high-contrast scenes, but even then, jesus, the HDR looks so good! It makes me want to see it on an even better display. :)
 
What? No! I am comparing SDR to HDR here. I am watching in a very dark room with a dim and very yellow backlight (a mere light bulb, not a proper 6500K daylight lamp), and I can very much tell that the black level is being elevated in high-contrast scenes, but even then, jesus, the HDR looks so good! It makes me want to see it on an even better display. :)


UHD players are getting cheap (relative to launch). I just picked up a 2nd Samsung K8500 for under $200 (it's a refurb, but we'll see). It should be here in a couple of hours.

I use these for the backlight. They power off the TV's USB ports, so they turn on and off along with the TV:

https://www.amazon.com/gp/product/B01LR3RAVU/ref=oh_aui_detailpage_o02_s00?ie=UTF8&psc=1
 
Interesting. I downloaded Life of Pi from here:
http://hdrsamples.com/life-of-pi-4k-uhd-hdr-sample-footage/

And it looks washed out and noisy on my system. Most everything else from the demo-uhd3d site looks good to me. Not sure what the issue is.

Maybe because YCbCr 4:4:4 only supports 8-bit on PC? The other 10-bit demos I've downloaded seemed to look a lot better than this one.
 
Interesting. I downloaded Life of Pi from here:
http://hdrsamples.com/life-of-pi-4k-uhd-hdr-sample-footage/

And it looks washed out and noisy on my system. Most everything else from the demo-uhd3d site looks good to me. Not sure what the issue is.

Maybe because YCbCr 4:4:4 only supports 8-bit on PC? The other 10-bit demos I've downloaded seemed to look a lot better than this one.

Right now the PC does not support HDR. Nvidia, AMD, and Microsoft with Windows 10 have all announced HDR support, but it has yet to materialize, and there is no media player that can play these files correctly. The result is exactly what you describe: a very bright, washed-out video. You have to copy the files to a USB stick, stick it in your TV, and play them from there.
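
For the curious, the washed-out look has a concrete cause: these files are encoded with the SMPTE ST 2084 "PQ" curve, and an SDR pipeline that decodes them as ordinary gamma video lifts all the shadows. A small sketch (the 300-nit SDR panel peak is an assumed typical value):

```python
# Why PQ-encoded (SMPTE ST 2084) HDR video looks washed out through an
# SDR pipeline: PQ stores dark scene levels at much higher code values
# than a 2.2 gamma curve does, so decoding it as gamma lifts shadows.
# Constants are the published ST 2084 ones; 300-nit panel is assumed.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """ST 2084 EOTF: normalized code value -> intended luminance."""
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_shows(code, panel_peak=300):
    """What a gamma-2.2 SDR pipeline displays for the same value."""
    return panel_peak * code ** 2.2

for code in (0.15, 0.30, 0.50):
    print(f"code {code:.2f}: intended {pq_to_nits(code):6.1f} nits, "
          f"SDR shows {sdr_shows(code):6.1f} nits")
# code 0.15: intended ~1 nit,  SDR shows ~4.6 -> shadows lifted
# code 0.30: intended ~10 nits, SDR shows ~21 -> washed out, flat look
```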
 
Ah, thanks. So we're waiting on a driver update and a media-player update before support happens on the HTPC side.
 
Ah, thanks. So we're waiting on a driver update and a media-player update before support happens on the HTPC side.

Pretty much, yeah. I think this is the first time consoles are actually ahead of us in something.
 