Why are companies like Dell, Asus, iiyama and NEC not making 40" 4K monitors

I don't mean that all the 40" panels are BGR; however, they are all "TV" panels, so it's not surprising that *many* are BGR.

It's the smallest problem with TV panels anyway.
A PPI of 110 is not low.... Same as a 34" 3440x1440p or a 27" 1440p monitor.
 
Terrible burn in. There's an entire thread dedicated to the topic right here on [H]f. I checked it out last spring while I was still a lurker. I was quite surprised that Dell managed to make a modern display with burn in. Didn't realize it was really possible.
A PPI of 110 is not low.... Same as a 34" 3440x1440p or a 27" 1440p monitor.
I have to back this up. This is pretty much the limit if you intend to use a screen without scaling of some sort. Not exactly what I'd call "low". lol
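For anyone who wants to sanity-check the PPI claims, here's a quick back-of-the-envelope calculation (the diagonals are nominal, so treat the decimals loosely):

```python
# Rough PPI check for the screen sizes mentioned above.
# ppi = sqrt(horizontal_px^2 + vertical_px^2) / diagonal_inches
from math import hypot

def ppi(h_px, v_px, diag_in):
    return hypot(h_px, v_px) / diag_in

print(f'40" 3840x2160: {ppi(3840, 2160, 40.0):.1f} PPI')  # ~110.1
print(f'34" 3440x1440: {ppi(3440, 1440, 34.0):.1f} PPI')  # ~109.7
print(f'27" 2560x1440: {ppi(2560, 1440, 27.0):.1f} PPI')  # ~108.8
```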
 
Terrible burn in. There's an entire thread dedicated to the topic right here on [H]f. I checked it out last spring while I was still a lurker. I was quite surprised that Dell managed to make a modern display with burn in. Didn't realize it was really possible.

Too bad. That being said, if I went by what the 1% was experiencing here on this forum, my PG348Q would have horrible lines when running at 100Hz, flickering, and abysmal backlight bleed. I realized that I need to stop listening to the 1% that are hyper-critical and just buy the damn thing, and lo and behold, it's beautiful with none of those issues.

Just a thought...

EDIT

If this is the thread you're referring to, I just read it and I can't help but think there are just a few really picky people out there. In fact, I think there are more people really happy with their purchase than the opposite. https://hardforum.com/threads/dell-43-ultra-hd-4k-multi-client-monitor-p4317q.1897889/
 
Too bad. That being said, if I went by what the 1% was experiencing here on this forum, my PG348Q would have horrible lines when running at 100Hz, flickering, and abysmal backlight bleed. I realized that I need to stop listening to the 1% that are hyper-critical and just buy the damn thing, and lo and behold, it's beautiful with none of those issues.

Just a thought...
LOL! I hear you. I was reading a thread a couple days ago in which multiple people were being super critical of the Samsung KS8000's HDR capabilities because it doesn't have full array local dimming. Despite that, it offers one of the best HDR experiences on the market. Those guys were blowing things a bit out of proportion, IMO.

But that's not the case with the Dell. Literally every person who used this display ended up with image retention. There were a few people who thought they'd gotten good units, but even they were struck by the burn in after a few weeks of use. It's a sad story, but sadly one that's true. :(
 
LOL! I hear you. I was reading a thread a couple days ago in which multiple people were being super critical of the Samsung KS8000's HDR capabilities because it doesn't have full array local dimming. Despite that, it offers one of the best HDR experiences on the market. Those guys were blowing things a bit out of proportion, IMO.

But that's not the case with the Dell. Literally every person who used this display ended up with image retention. There were a few people who thought they'd gotten good units, but even they were struck by the burn in after a few weeks of use. It's a sad story, but sadly one that's true. :(

Yes, a lot of idiots on here will put words into your mouth, take crap out of context, misrepresent facts, so on and so forth. It's maddening. I don't even read half the stuff on here unless it's unbiased and enlightening which doesn't happen nearly enough.

I have a KS8000 and I can tell you it's an incredible display. And trust me, I've owned more 21" CRT's back in the day than most. I still have a brand new mint Sony 24" FW900 in the box that I used for maybe 4 months. It literally has 4 months of use on it. Anyways, I've owned tons and tons of large displays.

The KS8000 is a beast of a display. I know, I've seen a few.
 
EDIT

If this is the thread you're referring to, I just read it and I can't help but think there are just a few really picky people out there. In fact, I think there are more people really happy with their purchase than the opposite. https://hardforum.com/threads/dell-43-ultra-hd-4k-multi-client-monitor-p4317q.1897889/
After going over that thread, I have to agree with you. However, that's not the thread I remember. It's possible I'm thinking of some other display... maybe one of the older large Philips? I'm not sure, but people were a lot more vocal about the issue there than they are in that thread.

EDIT:
I found which thread and monitor I was thinking of. This one: https://hardforum.com/threads/philips-bdm4350uc-43-inch-4k-ips-pwm-free-monitor.1894807/
 
Per TFTCentral news, two 16:9 144Hz gaming monitors at 37.5" and 31.5" are planned: 4K IPS and 2560x1440 VA.
They won't have 1000nit, HDR, or FALD (384 zone), so feature-wise they will be inferior to the groundbreaking 27" 384-zone FALD 144Hz G-Sync gaming monitors from Asus and Acer in Q4. The contrast and black depth of each will be interesting to compare when there are reviews out.
http://www.tftcentral.co.uk/articles/high_refresh_rate.htm



Added new detail of planned 37.5" IPS panels with 3840 x 2160 @ 144Hz

Update panel part for AUO 31.5" VA panel with 2560 x 1440 @ 144Hz. Mass production expectation of January 2017.

Any idea who's making that 38" 16:9 IPS at 4K at 144Hz? It looks pretty enticing, but I was wondering if we know which company will produce it and when we'll see it in the market.
 
The critiques on the other thread were defining the difference between what HDR standards aim for and what many HDR monitors available now are capable of, and incapable of. The original poster was complaining that HDR mode in games looked worse, washing out contrast and making the screen paler without as much pop and saturation.
"I also noticed more blooming on the KS8000, where bright areas would spill over and illuminate the letterbox bars and other dark areas, hurting the perception of contrast. The bottom edge of the TV, right next to the bottom frame, also often appeared brighter than the rest of the image, especially below a highlight. I chalk this issue up to limitations with the KS8000's edge-lit system, and it's something the edge-lit Sony and the full-array Vizio handled much more cleanly." - CNET review
AVS forum thread on the ks8000
"
1) HDR movies (netflix/amazon) in general seems to have a more brighter/washed out picture. For example, Marco Polo, while it looks great, I feel it looks more washed out than a picture that "pops".

i tried rtings settings, but may go back to the default ones and do my own settings, i want accurate representation but at same time don't want subdued colors.

with my lights off, i noticed blooming as well + flash lighting, but will look more into it tonight."
-----------------------------------------------------

"I too have seen the blooming you talk about but it honestly doesn't bother me much. I've basically just come to accept it as being a part of these LED sets. I suppose maybe FALD sets might be better"
Rtings.com review
"Like for other TVs, we use the local dimming setting ('Smart LED') that brighten the 2% window the most, in that case 'High'. The peak brightness of small highlights is very high at 1472 cd/m². On a static image, the maximum brightness drops to around 500 cd/m². For the best HDR experience, leave 'Smart LED' to 'High'."
"When local dimming is activated, it produces blooming around bright highlights"
----------------------------------------------------------------------------------------------------------------------

Edge lit has downsides from lighting only from the sides, which can be annoying but passable in standard ~100nit (usually 350nit peak) mode. The HDR10 and HDR Premium standards are 1000nit peak minimum, and HDR movies are mastered at 4000nit peak. Edge lighting blasting 1000nit +/- from edge flashlights in HDR mode causes bright areas on dark backgrounds to bloom/glow into other parts of the screen, and the overall cast of the lighting to compromise the contrast. The idea of HDR is having the ability to do very bright localized objects and edges, reflections, etc. on the screen, and to have all different brightness and black depths sharing the same scenes/frames dynamically. HDR will always look inferior on an edge-lit panel for this reason. When you blast very bright light (up to 1000nit +/-, likely up to 4000 in future displays) from the edges onto a mixed brightness/black-depth scene, it's just going to blow up that problem. Even a poorly executed backlight array in FALD will bloom and glow. That doesn't mean the KS8000 is a bad TV, just that according to review site testing and owner feedback regarding the aforementioned issue, it probably looks a lot better in standard mode. I'll leave it at that since there is a lot of info on the other thread.
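To put rough numbers on the bloom problem: an LCD's black level is basically the local backlight output divided by the panel's native contrast, so when an edge zone fires at HDR levels, everything sharing that zone gets a brighter black floor. A simplified sketch (the 5000:1 native contrast is an assumed illustrative figure, not a measurement of any specific set, and the model treats the whole lit strip as sharing one backlight level):

```python
# Simplified model: LCD black level = local backlight output / native panel contrast.
# When an edge LED zone fires to paint a small 1000-nit highlight, every pixel
# sharing that zone sees the same raised backlight.
native_contrast = 5000        # assumed VA native contrast ratio (illustrative)
sdr_backlight_nits = 350      # typical SDR peak mentioned above
hdr_highlight_nits = 1000     # HDR10 / UHD Premium small-highlight target

black_sdr = sdr_backlight_nits / native_contrast   # 0.07 nits
black_hdr = hdr_highlight_nits / native_contrast   # 0.20 nits

print(f"Black floor near a lit zone, SDR: {black_sdr:.2f} nits")
print(f"Black floor near a lit zone, HDR: {black_hdr:.2f} nits")
# Roughly 3x the black floor in the affected strip -> visible bloom/glow on dark scenes.
```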
 
@ Panel:

With PC monitors the subpixel layout is almost always RGB. However, TVs are different. Just pick any popular 40 inch model on RTINGS.com, like the Sony X830C, and check what order the pixels are in. With the exception of the Samsung JU6400, all 40 inch screens I have come across are BGR.

View attachment 15909
Turn the monitor upside down and rotate desktop 180 degrees: BGR -> RGB
 
40" is enormous compared to most monitors. I'm sitting about three feet away, and at that distance, I find the height (19.6 in) just a bit too tall, and because of the 16:9 aspect ratio, it's way too wide.

I've been trying some alternate aspect ratios and resolutions to see if I could find a more ideal size. Here's Just Cause 3 at an unscaled 2850x1900 (3:2 aspect ratio):

The actual image there is about 25.5" wide and 17" tall. That res is much easier to drive with a 1070 and I much prefer the 3:2 aspect ratio over 16:9. We need more 3:2 monitors!
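Since nothing is being scaled, the window's physical size is just pixel count divided by the 40" panel's PPI. A quick check (assuming an exactly 40.0" diagonal, which is why it lands slightly above the ~25.5" x 17" estimate):

```python
from math import hypot

# Pixel density of a 40" 3840x2160 panel (assumed exactly 40.0" diagonal).
ppi = hypot(3840, 2160) / 40.0             # ~110 pixels per inch

win_w_px, win_h_px = 2850, 1900            # the unscaled 3:2 window
print(f"Width:  {win_w_px / ppi:.1f} in")  # ~25.9 in
print(f"Height: {win_h_px / ppi:.1f} in")  # ~17.3 in
```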

These screens, the Samsung KU6300/6290 with VA panels, also use a BGR pixel layout. Not sure why.

I had the same problem with my 42" plasma.
I was reading up on how your eye line should be centered with the display.
However, I never noticed that on my gaming displays; my eye level has always been at the top 80% or so.

So I had to unbolt my stinking TV from the wall and move the mount. Lesson learned: get a flexible TV mount.
 
Any idea who's making that 38" 16:9 IPS at 4K at 144Hz? It looks pretty enticing, but I was wondering if we know which company will produce it and when we'll see it in the market.
I rarely quote myself, but in this case I'm making an exception since I really want to know if there's any info on this panel. We know it's being made based on the reports linked above. I'd just like to know who and when.
I had the same problem with my 42" plasma.
Out of curiosity, which plasma are you using?
 
I rarely quote myself, but in this case I'm making an exception since I really want to know if there's any info on this panel. We know it's being made based on the reports linked above. I'd just like to know who and when.

Out of curiosity, which plasma are you using?

Panasonic 42ST30
 
I'd like to know anything about that also.

A 37.5 IPS with 4K and 144 would be an instant buy for me.
 
A 37.5 IPS with 4K and 144 would be an instant buy for me.

Well it's coming, but panel production won't be until Q4 2017 so don't expect any displays with it before Jan 2018. (37.5'', 3840x1600, 2300R, 144Hz, AH-IPS)

2018 is going to be even more interesting than 2017; expect even more high-refresh, high-res and HDR monitors to compete with each other (and they will probably be more expensive than ever).
 
I'd like to know anything about that also.

A 37.5 IPS with 4K and 144 would be an instant buy for me.

Ditto, I just want those things plus passable color for Lightroom. Anything else is a bonus. Just bought a used Predator as an in-between while I wait for the next-gen panels and GPUs.
 
I'd like to know anything about that also.

A 37.5 IPS with 4K and 144 would be an instant buy for me.
I'd rather it be VA, as I can't imagine the IPS glow problems at that size, but yeah.
 
For some reason I just don't like VA panels much compared to IPS. I had a 40" VA one next to my 32" BenQ and it just seemed much slower or something to me; something about the mouse movement seemed much smoother on the IPS, and it just looked better. No idea why, just my personal preference.
 
They tend to be slower in pixel transition time so that is something to look out for when searching for a good model.
 
They tend to be slower in pixel transition time so that is something to look out for when searching for a good model.
Ah. I didn't know. So in terms of pixel response speed (NOT input lag) it tends to go something like this?


Faster ---- OLED > CRT > Plasma > TN > IPS > VA ---- Slower

Is that correct?
 
Yes, but it doesn't tell the whole story:
http://www.blurbusters.com/faq/oled-motion-blur/
"The only way to reduce motion blur caused by sample-and-hold, is to shorten the amount of time a frame is displayed for. This is accomplished by using extra refreshes (higher Hz) or via black periods between refreshes (flicker).

Some experimental OLED televisions now shorten frame sample lengths by strobing the pixels, and/or motion interpolation (extra refreshes). Motion interpolation can interfere with video games due to input lag. Fortunately, impulse-driving (flicker) is video game friendly.

This is why many HDTV displays now use motion interpolation and high refresh rates, as well as using scanning backlights. In addition, some new 120 Hz gaming computer monitors now have a strobe backlight feature such as LightBoost, which allows these LCD’s to have less motion blur than sample-and-hold displays (including the PS Vita OLED)"

There are 4 things mentioned here: High Refresh, Flicker, Interpolation, and Strobing. Refresh is obvious. Flicker effectively doubles the refresh by adding black frames between true frames. Interpolation "guesses" frames that should go between true frames (makes it awful for gaming). But my question is, what in the world is strobing? I've heard of it before (it's the stuff used in ULMB, right?), but I don't know what it is.
 
The critiques on the other thread were defining the difference between what HDR standards aim for and what many HDR monitors available now are capable of, and incapable of. The original poster was complaining that HDR mode in games looked worse, washing out contrast and making the screen paler without as much pop and saturation.



----------------------------------------------------------------------------------------------------------------------

Edge lit has downsides from lighting only from the sides, which can be annoying but passable in standard ~100nit (usually 350nit peak) mode. The HDR10 and HDR Premium standards are 1000nit peak minimum, and HDR movies are mastered at 4000nit peak. Edge lighting blasting 1000nit +/- from edge flashlights in HDR mode causes bright areas on dark backgrounds to bloom/glow into other parts of the screen, and the overall cast of the lighting to compromise the contrast. The idea of HDR is having the ability to do very bright localized objects and edges, reflections, etc. on the screen, and to have all different brightness and black depths sharing the same scenes/frames dynamically. HDR will always look inferior on an edge-lit panel for this reason. When you blast very bright light (up to 1000nit +/-, likely up to 4000 in future displays) from the edges onto a mixed brightness/black-depth scene, it's just going to blow up that problem. Even a poorly executed backlight array in FALD will bloom and glow. That doesn't mean the KS8000 is a bad TV, just that according to review site testing and owner feedback regarding the aforementioned issue, it probably looks a lot better in standard mode. I'll leave it at that since there is a lot of info on the other thread.


It's better to watch SDR than HDR on an edge-lit TV like the KS8500? Sorry, but that is just nonsense. Yes, edge-lit has limitations in HDR, and yes, Samsung obviously cheated the UHD Premium badge, because while the TV can hit 1000 nits in test images (with a ton of flashlighting, but it does), in real-life content it rarely does so, since that would obviously cause a lot of brightness uniformity issues. But the native contrast ratio of this TV is still really high, and when the scene is somewhat bright the blacks still look black; it can still show a convincing range from shimmering bright specular highlights to dark shadow details. Only in "space scenes" and such are the limitations easily visible. It also has a real 10-bit panel, and DCI-P3 coverage is really good. The HDR picture looks so much better than SDR, phenomenal even, despite the limitations this TV has. While FALD is always preferable, do not write the KS8xxx series off just like that. It is still the best midrange TV you can buy for HDR content.

Oh, and this TV is top/bottom edge-lit, not side-lit. This system actually works better at dimming dark parts in the movie itself, and not just the black bars like the normal side-lit local dimmer does. When there is a large, dark, shadowy part in the movie, the local dimming system can dim it slightly to make it look deeper.
 
There are 4 things mentioned here: High Refresh, Flicker, Interpolation, and Strobing. Refresh is obvious. Flicker effectively doubles the refresh by adding black frames between true frames. Interpolation "guesses" frames that should go between true frames (makes it awful for gaming). But my question is, what in the world is strobing? I've heard of it before (it's the stuff used in ULMB, right?), but I don't know what it is.
Flicker = strobing = ULMB = lightboost in this case. It more than doubles the effective refresh of the monitor. As in, normal panels would have to have 500-2000 Hz refresh to match the motion clarity of the strobing ones. Flicker, brightness and colors can become an issue, though.
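The 500-2000 Hz figure comes from persistence: perceived blur on a sample-and-hold display is roughly how long each frame stays lit multiplied by how fast the image moves across your retina. A rough worked example (the 1 ms strobe length is an assumption; actual ULMB pulse widths vary):

```python
# Approximate eye-tracked motion blur: blur_px ~= persistence_s * motion_speed_px_per_s
motion_speed = 1000                        # px/s, a fast pan

def blur_px(persistence_ms):
    return persistence_ms / 1000 * motion_speed

print(f"60 Hz sample-and-hold  (16.7 ms persistence): {blur_px(16.7):.1f} px of blur")
print(f"144 Hz sample-and-hold ( 6.9 ms persistence): {blur_px(6.9):.1f} px of blur")
print(f"Strobed backlight      ( 1.0 ms persistence): {blur_px(1.0):.1f} px of blur")
# Matching a 1 ms strobe with sample-and-hold alone would take roughly 1000 Hz.
```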
 
Plasma also did a sort of per-pixel blink, which gave around a 60% blur reduction. Interpolation "tween" frames and other TV tricks for smoothing and blur reduction typically add a lot of input lag, which is why gamers write anything like that off, since they play in game mode for the lowest input lag. In contrast, I think the Eizo FG2421 VA gaming monitor used frame duplication and then blacked the backlight out on every frame in what they called "240Hz" mode.

In LightBoost/ULMB mode the brightness drops considerably, giving the screen a muted look. Even more critically, a lot of people get eyestrain at 100Hz or less after playing more than a short game session. Considering the frame rate minimums required to do 100fps-Hz or 120fps-Hz and over stably, most people are better off using variable Hz and enjoying higher graphics settings (~100fps-Hz avg) on more demanding games the way things currently stand imo, particularly if they are using higher than a 1080p resolution monitor. Another thing to consider about backlight strobing is that it might not be compatible with HDR gaming mode and FALD HDR gaming mode in upcoming HDR gaming monitors, even those with G-Sync modules.

The huge benefit of VA screens is up to triple the contrast ratio and black depth in gaming monitors. IPS and TN gaming monitors are very pale by comparison, having around 860:1 to 970:1 (the PG279Q has 989:1). Modern VA gaming monitors like Samsung's have around 2800:1 out of 3000:1. The 1080p static-Hz era Eizo 2421 monitor lottery stood out with over 4800:1. In TVs, a FALD VA can get over an 8100:1 contrast ratio, and over 5000:1 outside of FALD. That is inkier than an F8500 plasma.
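Another way to read those ratios is as black levels at a matched white point: black = white / contrast. At a common ~120 cd/m² desktop white (an assumed setting, just for comparison), the quoted ratios work out to roughly:

```python
# Black level = white luminance / contrast ratio, at a matched 120 cd/m2 white.
white_nits = 120
panels = {
    "TN/IPS gaming (~900:1)":  900,
    "PG279Q (989:1)":          989,
    "Samsung VA (~2800:1)":   2800,
    "Eizo FG2421 (~4800:1)":  4800,
    "FALD VA TV (~8100:1)":   8100,
}
for name, ratio in panels.items():
    print(f"{name:25s} black ~ {white_nits / ratio:.3f} nits")
```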

OLED has blur issues and input lag, some have reported blocking, and though I've read people claiming otherwise, I'm not certain there isn't potential for burn in. From what I understand, LG is using all-white OLED sort of like a per-pixel backlight in order to avoid other colors fading out over time. This seems like an ultimate sort of "FALD" available right now, at a .0005 black depth and per-pixel brightness/blackness for HDR. Unfortunately OLED currently doesn't have 120Hz+ (blur reduction and motion definition increase at high fps) nor variable Hz like a modern gaming monitor should (or strobing/screen blanking and low input lag, for that matter). They are also huge.

Edge lighting can do all right in LCD TVs. I had a Samsung B7000 series VA TV that was OK, but it definitely had some flashlighting and clouding in dark scenes, though you could get numb to it and mostly ignore it. I don't have a KS8500, but I know HDR has to blow out the weaknesses of edge lighting worse just by nature of turning the flashlighting brightness limits up much higher. According to many reviewers and users it is an issue (in standard mode even), though most can live with it.

"I also noticed more blooming on the KS8000, where bright areas would spill over and illuminate the letterbox bars and other dark areas, hurting the perception of contrast. The bottom edge of the TV, right next to the bottom frame, also often appeared brighter than the rest of the image, especially below a highlight. I chalk this issue up to limitations with the KS8000's edge-lit system, and it's something the edge-lit Sony and the full-array Vizio handled much more cleanly." - CNET review
AVS forum thread on the ks8000
"
1) HDR movies (netflix/amazon) in general seems to have a more brighter/washed out picture. For example, Marco Polo, while it looks great, I feel it looks more washed out than a picture that "pops".

i tried rtings settings, but may go back to the default ones and do my own settings, i want accurate representation but at same time don't want subdued colors.

with my lights off, i noticed blooming as well + flash lighting, but will look more into it tonight."
-----------------------------------------------------
"I too have seen the blooming you talk about but it honestly doesn't bother me much. I've basically just come to accept it as being a part of these LED sets. I suppose maybe FALD sets might be better"
Rtings.com review
"Like for other TVs, we use the local dimming setting ('Smart LED') that brighten the 2% window the most, in that case 'High'. The peak brightness of small highlights is very high at 1472 cd/m². On a static image, the maximum brightness drops to around 500 cd/m². For the best HDR experience, leave 'Smart LED' to 'High'."
"When local dimming is activated, it produces blooming around bright highlights"
----------------------------------------------------------------------------------------------------------------------

Edge lit has downsides from lighting only from the sides, which can be annoying but passable in standard ~ 100nit (usually 350nit peak) mode. The HDR 10 and HDR premium standards are 1000nit peak minimum, and HDR movies are mastered at 4000nit peak. Edge lighting blasting 1000nit+/- from edge flashlights in HDR mode causes bright areas on dark backgrounds to bloom/glow into other parts of the screen, and the overall cast of the lighting to compromise the contrast. The idea of HDR is having the ability to do very bright localized objects and edges, reflections, etc on the screen.. and to have all different brightness and black depth sharing the same scenes/frames dynamically. HDR will always look inferior (to what it could be in a good FALD implementation) on an edge lit panel for this reason. When you blast very bright light (up to 1000nit +/-, likely up to 4000 in future displays) from the edges on mixed brightness/black depth scene it's just going to blow up that problem. Even a poorly executed backlight array in FALD will bloom and glow. That doesn't mean the ks8000 is a bad tv
 
The huge benefit of VA screens is up to triple the contrast ratio and black depth in gaming monitors. IPS and TN are very pale by comparison, having around 860:1 to 970:1 (the PG279Q has 989:1). Modern VA gaming monitors like Samsung's have around 2800:1 out of 3000:1. The 1080p static-Hz era Eizo 2421 monitor lottery stood out with over 4800:1. In TVs, a FALD VA can get over an 8100:1 contrast ratio, and over 5000:1 outside of FALD. That is inkier than an F8500 plasma.
F8500 has 10,000:1 ANSI and 20,000:1 non-ANSI contrast.
 
Flicker = strobing = ULMB = lightboost in this case. It more than doubles the effective refresh of the monitor. As in, normal panels would have to have 500-2000 Hz refresh to match the motion clarity of the strobing ones. Flicker, brightness and colors can become an issue, though.
Well then, what's the catch? Eye strain? Does this have anything to do with that PWM stuff (which I admittedly don't understand either)?
 
F8500 has 10,000:1 ANSI and 20,000:1 non-ANSI contrast.


Those are marketing numbers... a FALD VA TV markets 20 million to one. Plasma is still very good and its pixel-blinking type tech reduces blur by around 60%. They just have their tradeoffs, including 1080p 60fps-Hz max, among other things.

http://www.avsforum.com/forum/166-l...-led-lcd-uhdtv-official-avs-forum-review.html
FALD mode VA
I performed ANSI-checkerboard contrast measurements on the calibrated panels. The M65, in (calibrated) Calibrated Dark mode, yielded 31 fL peak white and 0.0038 fL for black, an 8157:1 contrast ratio. The plasma struggled a bit with the ANSI pattern; peak whites were 30 fL and black measured 0.0058 fL, resulting in a contrast ratio of 5172:1.

Shutting off FALD produced peak whites measuring a retina-scorching 99 fL, and blacks were 0.017 fL, which is a 5882:1 contrast ratio.
 
Plasma also did a sort of per-pixel blink, which gave around a 60% blur reduction.
I really wish we had 4K Plasmas. Too bad they had to kill them off…
In LightBoost/ULMB mode the brightness drops considerably, giving the screen a muted look. Even more critically, a lot of people get eyestrain at 100Hz or less after playing more than a short game session. Considering the frame rate minimums required to do 100fps-Hz or 120fps-Hz and over stably, most people are better off using variable Hz and enjoying higher graphics settings (~100fps-Hz avg) on more demanding games the way things currently stand imo, particularly if they are using higher than a 1080p resolution monitor. Another thing to consider about backlight strobing is that it might not be compatible with HDR gaming mode and FALD HDR gaming mode in upcoming HDR gaming monitors, even those with G-Sync modules.
So basically low backlight and eye strain. Sounds like a big no no to me.
 
According to the official AVS Forum review thread I linked, on the checkerboard full-calibration test they did comparing both TVs:

For the calibrated comparison, I aimed for a peak brightness of 35 fL, BT.709 color, and a BT.1886 gamma curve on both TVs. I calibrated them with Spectracal CalMan 5 software and a Colorimetry Research CR-100 colorimeter, which I profiled with a CR-250 spectrophotometer.

I performed ANSI-checkerboard contrast measurements on the calibrated panels. The M65, in (calibrated) Calibrated Dark mode, yielded 31 fL peak white and 0.0038 fL for black, an 8157:1 contrast ratio. The plasma struggled a bit with the ANSI pattern; peak whites were 30 fL and black measured 0.0058 fL, resulting in a contrast ratio of 5172:1.

The M65 is much more than a viable replacement for a plasma. If you use it to watch UHD/4K footage, or as a 4K display for a computer, you enter a realm that 1080p TVs cannot compete in.

Anyway the point being, 8160:1 FALD and 5172:1 non-FALD contrast ratio is very high outside of OLED. ;-b
TN and IPS full featured gaming monitors are 860:1 and 980:1.
Samsung's modern VA gaming monitor is around 2800:1.
There are higher-resolution and physically larger full-featured gaming VAs due out in 2017 too.

We'll have to see what asus and acer's 1000nit FALD HDR IPS gaming monitors can do. Their contrast and black depth numbers are suspiciously absent so far.
 
I really wish we had 4K Plasmas. Too bad they had to kill them off…

So basically low backlight and eye strain. Sounds like a big no no to me.
It's putting lipstick on a pig. We need to move away from sample and hold displays.
 
It's putting lipstick on a pig. We need to move away from sample and hold displays.
Sure, but that eye strain still seems like it would be a gargantuan issue, enough to overshadow all benefit.
 
Flicker = strobing = ULMB = lightboost
Come to think of it, this doesn't make much sense either. Isn't ULMB a niche feature? Meaning only certain displays have it (similar to how only some have G-sync)? How can it be the same as flicker, which I know is found even in Samsung TVs?
 
According to the official AVS Forum review thread I linked, on the checkerboard full-calibration test they did comparing both TVs:

Anyway the point being, 8160:1 FALD and 5172:1 non-FALD contrast ratio is very high outside of OLED. ;-b
TN and IPS full featured gaming monitors are 860:1 and 980:1.
Samsung's modern VA gaming monitor is around 2800:1.
There are higher-resolution and physically larger full-featured gaming VAs due out in 2017 too.

We'll have to see what asus and acer's 1000nit FALD HDR IPS gaming monitors can do. Their contrast and black depth numbers are suspiciously absent so far.

Meeho is right; the F8500 scores (almost) 10,000:1 ANSI contrast. Here is another review of the 64'' where it's scoring 8000-9000:1.

Outside of ANSI patterns, LCDs with full-array local dimming can get much higher contrast ratios, as demonstrated by the 65'' ZD9 with ~648 dimming zones.

 
Sure, but that eye strain still seems like it would be a gargantuan issue, enough to overshadow all benefit.
We need to move away from LCDs so we don't have to rely on these workarounds.

Come to think of it, this doesn't make much sense either. Isn't ULMB a niche feature? Meaning only certain displays have it (similar to how only some have G-sync)? How can it be the same as flicker, which I know is found even in Samsung TVs?
Why wouldn't TVs have it? Also, the Samsungs could be using BFI.
 
I think monitor manufacturers are out of touch with the enthusiast. This is why they aren't producing 40" or larger panels. To some degree it makes sense. The largest panel for several years was the 30". Now we have some 32" and 34" panels. I don't think monitor makers are aware of just how many people are starting to use 4K TVs as monitors. Even though that number is growing, I still think it's a niche market. Those are the people that the monitor makers should be targeting with such products. They either don't know the market for it exists, or don't think it's large enough to build products for.
 
You can buy a 4K 40" Samsung 6290/6300 series for around $300. No point in waiting for a monitor from Dell, Asus or NEC. Monitor and HDTV both display images so not much of a difference if you catch my drift.
Anyone who has tried both a monitor and a TV as a monitor and does any gaming could enlighten you about lag and ghosting, both of which are massively worse on TV sets.
 