4K 60 Hz Scanning Backlight TV/Monitor - Panasonic TC-L65WT600

Got a response back from Panasonic:

"Thank you for your inquiry.

The TV was designed with the option of setting the Picture Mode to "Game". Regrettably, the Owner's Manual is yet to be made available online. "

That is kinda vague. Game mode could mean it simply adjusts contrast etc., not that it reduces latency / bypasses electronics. And if game mode DOES bypass all the input-lag-inducing electronics, does that mean all of the "neat" features like local LED dimming, 60->120 frame creation, and the scanning backlight turn off, reverting it back to a simple basic 60 Hz 4K display? Could you keep a scanning backlight without the 120 Hz upconvert?
 
Oh, and this page: http://www.panasonic.com.au/Products/VIERA+televisions/LCD+TVs/TH-L65WT600A/Overview
Shows it has a 2,000 Hz scanning backlight (Native 100 Hz), not sure what that means...
Me neither.
(1) Does 2,000 Hz mean a motion equivalence ratio, e.g. 0.5ms strobe flashes? (possibly)
(2) Does 2,000 Hz mean the speed at which it flashes the next segment after the previous segment? (probably)
(3) Does 2,000 Hz mean the number of times the screen is strobed per second? (unlikely)

We need someone to point a high-speed camera at it (preferably 2,000fps, one frame per Hz) to understand what the scanning backlight is doing. Even just pointing a 1,000fps camera at it would answer MANY questions about this television's scanning backlight behavior.

The best-case (proper) scenario is 0.5ms strobe flashes, once per native refresh, but there is a question about the inefficiency caused by backlight diffusion between adjacent scanning-backlight segments, which interferes greatly with motion blur reduction.

The best motion blur reduction a "2,000 Hz" scanning backlight could deliver would be if 100 Hz meant 100 flashes of 0.5ms each (grand total: 50ms of backlight-on time and 950ms of backlight-off time per second). It is darn unlikely that a backlight would be made 20 times brighter in order to pull off the theoretical motion resolution of a 2,000Hz scanning backlight "done in the scientifically proper way".
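The brightness penalty mentioned above is simple duty-cycle arithmetic. A quick sketch using only the numbers from this post (100 Hz native refresh, 0.5ms flashes are assumptions, not measurements of the actual set):

```python
refresh_hz = 100        # native panel refresh, per the Panasonic AU spec page
flash_ms = 0.5          # hypothetical "scientifically proper" strobe length

on_time_ms = refresh_hz * flash_ms     # 100 flashes x 0.5 ms = 50 ms lit per second
duty_cycle = on_time_ms / 1000.0       # 0.05 -> backlight is dark 95% of the time

# To keep the same average brightness as an always-on backlight,
# the LEDs must be 1/duty_cycle times brighter while lit:
brightness_boost = 1 / duty_cycle
print(brightness_boost)  # 20.0 -> the "20 times brighter" figure above
```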

My prediction is that this Panasonic scanning backlight won't be as 'efficient' as the LightBoost strobe backlight. Rather than flashing one row of LED's at a time, it is probably using a rolling scan to increase brightness (and decrease motion resolution), while keeping the silly "2,000 Hz" claim.

The bottom line is that the motion resolution is probably not going to meet the "2,000Hz" claim -- but neither does Panasonic's "2500Hz FFD" claim for their plasma displays. It helps motion resolution a lot, but doesn't give 33x the motion resolution (2000 / 60).

LightBoost, by contrast, is ultra-efficient, in that it manages to give 12x the motion resolution. During LightBoost, the oscilloscope photodiode measured 1.4ms per strobe, which is 1/700sec. Based on this data, we could theoretically claim "700Hz", since LightBoost is creating (700/120)x less motion blur = ~6x less motion blur than regular 120Hz, or 12x less motion blur than 60Hz (e.g. motion blurring of 12 pixels becomes motion blurring of just 1 pixel).
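The persistence-to-blur arithmetic above is easy to reproduce: perceived blur width is tracking speed multiplied by how long each frame stays visible. A minimal sketch (the 960 pixels/second motion speed is an illustrative assumption, not a number from this post):

```python
def blur_px(persistence_ms, speed_px_per_sec=960):
    """Blur width in pixels: while the eye tracks a moving object,
    the image smears across the retina for as long as the frame is lit."""
    return speed_px_per_sec * persistence_ms / 1000.0

full_60hz = blur_px(1000 / 60)    # sample-and-hold 60 Hz: ~16 px of blur
full_120hz = blur_px(1000 / 120)  # sample-and-hold 120 Hz: ~8 px
lightboost = blur_px(1.4)         # 1.4 ms strobe: ~1.3 px

print(full_60hz / lightboost)   # ~11.9 -> the "12x less motion blur" figure
print(full_120hz / lightboost)  # ~6.0  -> the "6x" figure
```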

If we wanted to exaggerate the way Panasonic does and use "Hz" numbers to describe LightBoost, we could market it as a "700Hz strobe backlight" simply by pointing to its 120 visible strobes and the long dark periods in between (1.4ms on time, 6.9ms off time), even though there are only 120 native refreshes per second. And it actually meets the claim -- the strobe backlight is very efficient. So much so that the "700Hz" LightBoost strobe backlight is more efficient than the Panasonic "2000Hz" scanning backlight (I would bet on it, since Panasonic is almost certainly not using 0.5ms strobe flashes only once per refresh cycle, unless they're using really bright backlight LED's). Strobe length (once per unique frame) is conclusively linked to the amount of motion blur.

Moral of the story: Motion blur benchmarking time!
 
Mark, I remember when you first started this all saying that a scanning backlight was superior to a strobing backlight. Now you have come to a different conclusion. Is that based off of this "type" of scanning backlight, or are all strobing backlights superior to scanning backlights?

If so, why do manufacturers even bother with scanning backlights? Is it cheaper, easier to implement, easier on the eyes, or a combination of the above?

I do agree, the way they use "Hz" may be a bit more marketing than anything. Here is the marketing on the Japanese site for this TV, clearly showing a scanning backlight:

picture_img03.jpg


picture_img05.jpg



How that converts into Hz is anyone's guess.
 
Mark, I remember when you first started this all saying that a scanning backlight was superior to a strobing backlight.
I started out by working with scanning backlights, until I discovered strobed backlights can be superior.

But I never said scanning backlights were superior to strobe backlights. I just didn't know enough about the efficiency differences at the time to conclude that scanning backlights cannot easily and cheaply reach the efficiencies needed. The efficiency differences are very clearly explained in TFTCentral's Motion Blur Reduction Backlights article, which I helped Simon write.

There were times I did say "scanning backlights would be better for slower-response displays such as IPS panels", but I never, never, ever said the motion blur reduction capabilities of scanning backlights were superior to strobed backlights. That statement simply reflects the non-viability of strobe backlights on certain panels, where a scanning backlight is the only option (and historically it was; no panel was fast enough to finish refreshing before the next refresh began -- until 3D panels came along a few years ago).

Is that based off of this "type" of scanning backlight, or are all strobing backlights superior to scanning backlights?
If so, why do manufacturers even bother with scanning backlights? Is it cheaper, easier to implement, easier on the eyes, or a combination of the above?
MOTION BLUR: Strobe backlights are easier.
ELECTRONICS DIFFICULTY: Strobe backlights are much easier

However, there's a catch: strobe backlights do not work on slow LCD panels.
Strobe backlights do _not_ work well on panels like the one below:

2007-lcd-3-refreshes-blended-300x166.jpg
(old Dell 2007WFP from 2007)
Frame grab from high speed video, of an older LCD unsuitable for strobe backlight.
Three frames (06, 07, 08) are shown.
The laws of physics tell you that there's never a clean (fully refreshed) frame to strobe; the LCD is in perpetual (continuous) ghosting.
That was always the case for LCD for many decades, until recently.

See Understanding Strobe Backlights Via High Speed Video.
A part of the "Electronics Hacking: Creating a Strobe Backlight".

So, you see, the answer is not simple.
Scientifically, it's complicated.
Strobe backlights are much, much, more efficient in eliminating motion blur, thanks to lack of backlight diffusion throughout the panel. BUT... it cannot be used with all panels.
Scanning backlights help reduce motion blur when the LCD panel is too slow to be used with a strobe backlight.

Finally, the LCD motion blur miracle happened -- LCD panels could finish refreshing the current frame long before the next refresh began. This finally made 3D possible; and finally made strobe backlights practical. It now became possible for an LCD to have less motion blur than a CRT, through the strobing technique, since you can flash the backlight only on fully refreshed frames, and get the full CRT quality effect. On a modern fast panel, short strobe flashes simulate short-persistence CRT phosphor; motion blur is now simply dictated by the length of the strobe flash.

Strobe backlights: Can be used with an edge light. Only one simple flash needed per refresh. But the LCD must finish refreshing completely before the next refresh. You're stuck with fast LCD panels (e.g. TN panels). The LCD must refresh very, very fast (as seen in high speed video of LightBoost), to squeeze the pixel transitions into the time period between refreshes (otherwise you start getting nasty full-intensity doubleghosts that aren't faint, much like seeing 30fps@60Hz strobed, or 60fps@120Hz strobed). Efficiency of motion blur reduction is ultra-high if most of the pixel transitions fit between strobes. Strobe backlights can reach near-theoretical efficiencies in motion blur reduction abilities.

Scanning backlights: Requires a full-array backlight; cannot be used with edgelights. You can flash part of the screen while a different part of the screen is refreshing in total darkness. More complicated electronics. Costs more. Makes motion blur reduction possible on much slower LCD's, including IPS LCD's and VA LCD's. Efficiency of motion blur reduction is much lower, due to backlight diffusion between adjacent off-segments and on-segments. Attempting to control this well becomes quite expensive (4 figures, and 5 figures in the case of the Viewpixx).

Practically, we are likely not going to see cheap scanning backlights in a desktop monitor. Our hope is to wait till IPS/VA becomes fast enough to easily work with a strobe backlight. Fortunately, Eizo has proven it's possible with the FDF2405W, which answers the question of "Is a strobe backlight possible with a VA monitor?". (Yes!).
 
Good info. Really a shame that TN panels are completely unsuitable for anything but small desktop monitors. I think it may just come down to me purchasing the display to find out the real deal on its gaming/motion qualities. Unfortunately, hardly anyone that reviews high-end TV's does such tests. Although not preferred, I could deal with up to 2 frames of input lag if the other display attributes are fantastic like on this display.

If it is more than that, the TV would have to be returned. Panasonic advertised gaming on this screen, so hopefully that is one heck of a fast processing chip or the game mode does truly bypass the electronics.
 
Got a response back from Panasonic:

"Thank you for your inquiry.

The TV was designed with the option of setting the Picture Mode to "Game". Regrettably, the Owner's Manual is yet to be made available online. "

That is kinda vague. Game mode could mean it simply adjusts contrast etc., not that it reduces latency / bypasses electronics. And if game mode DOES bypass all the input-lag-inducing electronics, does that mean all of the "neat" features like local LED dimming, 60->120 frame creation, and the scanning backlight turn off, reverting it back to a simple basic 60 Hz 4K display? Could you keep a scanning backlight without the 120 Hz upconvert?

Vega,

I got a similar response to an email I sent to them. I asked about input lag and got this response:

Thank you for your inquiry.

Regrettably, the requested information was not made available for disclosure.


Thank you for contacting Panasonic.

So we are up in the air. I know that CNET does include Input Lag in their reviews, but who knows when they will review it.

I currently have the Sony 4k set and it is a great set and the image is fantastic, but I am thinking of moving to the Panny because of the DisplayPort; this way I don't have to replace my GPU's, since they don't support HDMI 2.0 and no GPU that will has been announced as of yet. But on the flip side, we have no clue as to the Panasonic's image quality, input lag, and how it does its upscaling to 4k. I would hate to take back my Sony to find that the Panny is not up to par on these essential things and then have to get another Sony and hope I have no screen issues, as my current one is pretty free of any major issues.
 
So we are up in the air. I know that CNET does include Input Lag in their reviews, but who knows when they will review it.

Right now everyone is using the Leo Bodnar tool to do input lag testing, however, that device is designed only for 1080p.

Won't running that device on a 4K TV cause the scaler to kick in causing even more input lag?

Hopefully the Blur Busters Input Lag Detector will solve this problem but I don't know when that will be released.
 
Hey everyone.

Some Best Buy Magnolia locations have started to receive display models of the new Panasonic. I was lucky and found a store in Canoga Park/Woodland Hills in California. I had called them a few days ago and they said they received a display model, but they were not sure when they would have it up. They called me yesterday to let me know it was going up, so my fiancée and I took a drive out to check it out.

The Manager informed me when we arrived that they were one of a few Magnolia stores that received it early and that only the top 50 Magnolia Stores were getting early display models.

So the impressions so far are very positive. The set looks gorgeous and color reproduction is very good. This is a THX Certified set and it showed. They had it side by side with a Sony 4k set and a Samsung 4k set. The Sony and Panasonic were running the same feed from the Sony set, which was 4k content. The first thing we noticed was that the black level was very good on the Panasonic. They had it in Vivid Mode, but for comparison's sake they also had the Sony in Vivid mode. The Panasonic flat out had better color reproduction overall and better black levels.

My fiancée, who is not as picky as I am, easily said the Panasonic just looked better. Whites were more pure, black was more pronounced, and the color just seemed to pop more on the Panasonic. I took my laptop because I wanted to see what text and a desktop looked like, and it looked sharper and a little cleaner than the Sony. I had some 1080p content to check out the upscaling and it looked really good.

I did verify that this set does indeed have DisplayPort, so those with monster rigs (like mine..hehehe) can run 4k resolution at 60fps on a 65" screen! Finally!!! As promised by Panasonic, the set does have HDMI 2.0, but there is a fine line that they need to be careful with. While Panasonic flies the "We got HDMI 2.0 on our set" flag, HDMI 2.0 is only available on the HDMI 4 input. The remaining HDMI inputs are standard 1.4a HDMI. The clerk told me that Panasonic said they will have a firmware update for the remaining 3 ports, but he had nothing specific on that. Browsing the user manual, it does state that it is HDMI 4 only that has 2.0, with no mention of upgrading the other ports.

Here is a video I found that Panasonic put out. They partnered with Nvidia, Slightly Mad Studios, and Square on working the TV for 4k gaming. Towards the middle, they have a segment with the guy from Slightly Mad Studios. I then went to their website to see if I could ask him about input lag, but there was no way to email him; however, under his bio he listed his gamertag on Xbox Live, so I sent him a message and he responded with "Don't know the exact number, but it was very low as I felt no lag playing games on it".

http://www.youtube.com/watch?feature=player_embedded&v=8Bq4x2TmFjI

I currently own the Sony XBR 4k 65" set....I also just got back from Best Buy arranging delivery of the Panasonic and a pick up of the Sony set. Make of that what you will! lol
 
That video was awful; essentially all they said was that 4k=better. If the TV's gamma is super high, the extra res won't help with detail since the colors will be severely crushed. The Samsung & Sony 4K TV's use VA panels = on-angle black crush. HDTV Test Samsung Ux65F900 Review.

They should have talked about the Panasonic's unrivaled color accuracy and how it perfectly matches industry color standards (REC 709 & REC 2020).
 
I currently own the Sony XBR 4k 65" set....I also just got back from Best Buy arranging delivery of the Panasonic and a pick up of the Sony set. Make of that what you will! lol

Awesome. Hope you can test out the Displayport and verify that this is the first display to do 4K over SST.

They should have talked about the Panasonic's unrivaled color accuracy and how it perfectly matches industry color standards (REC 709 & REC 20 20).

There is no way the Panasonic will cover 100% of Rec 2020. No commercially available wide gamut display has ever exceeded 92% coverage.

Mitsubishi could probably do it first if they resurrected their Mitsubishi LaserVue DLP TVs and tweaked the blue laser, but that is not going to happen.
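For context on why 100% Rec 2020 is such a stretch: Rec 709 covers only about half of the Rec 2020 triangle. A rough sketch comparing triangle areas in CIE 1931 xy space (xy area is a crude proxy for gamut size, but it makes the point; the primaries are the published standard values):

```python
# Crude gamut-size comparison in CIE 1931 xy space (shoelace triangle areas).
# Note: xy-area ratio is a rough proxy; u'v' area or gamut volume is more rigorous.
def tri_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = tri_area(*rec709) / tri_area(*rec2020)
print(f"Rec 709 is only ~{ratio:.0%} the area of Rec 2020")  # ~53%
```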
 
Awesome. Hope you can test out the Displayport and verify that this is the first display to do 4K over SST.



There is no way the Panasonic will cover 100% of Rec 2020. No commercially available wide gamut display has ever exceeded 92% coverage.

Mitsubishi could probably do it first if they resurrected their Mitsubishi LaserVue DLP TVs and tweaked the blue laser, but that is not going to happen.

Not sure what you mean by it being the first set to do 4k over SST? Would you mind explaining what that means to me so I know what to look for?

Thanks!
 
Not sure what you mean by it being the first set to do 4k over SST?

Currently, all 4K monitors that have a DisplayPort only work at 4K using MST (Multi-Stream Transport). This means that the monitor appears to the GPU as two 1920x2160 streams and requires the use of multi-monitor functionality (Eyefinity, an Nvidia Surround-esque hack, etc).

This could be the very first 4K display to work without tiling and instead with SST (Single Stream Transport).

Would you mind explaining what that means to me so I know what to look for?

Connect the display via DisplayPort. Then check the screen resolution in the Control Panel.

If you have an AMD card and it looks like this:
http://www.pcper.com/image/view/28587?return=node/57940
If you see 1|2 then it is MST. If you just see a 1 then it is SST.

As for Nvidia, I am not quite sure, they put in this weird hack/whitelist into their driver to support 4K tiled monitors while still not allowing 2x1 surround.

http://www.pcper.com/image/view/28591?return=node/57940

Though if they didn't add the EDID to their whitelist you may find out pretty fast.

With either card, keep an eye out for artifacts in the middle of the screen (tearing, etc) that may indicate a multi-monitor setup.
 
Currently, all 4K monitors that have a DisplayPort only work at 4K using MST (Multi-Stream Transport). This means that the monitor appears to the GPU as two 1920x2160 streams and requires the use of multi-monitor functionality (Eyefinity, an Nvidia Surround-esque hack, etc).

This could be the very first 4K display to work without tiling and instead with SST (Single Stream Transport).



Connect the display via DisplayPort. Then check the screen resolution in the Control Panel.

If you have an AMD card and it looks like this:
http://www.pcper.com/image/view/28587?return=node/57940
If you see 1|2 then it is MST. If you just see a 1 then it is SST.

As for Nvidia, I am not quite sure, they put in this weird hack/whitelist into their driver to support 4K tiled monitors while still not allowing 2x1 surround.

http://www.pcper.com/image/view/28591?return=node/57940

Though if they didn't add the EDID to their whitelist you may find out pretty fast.

With either card, keep an eye out for artifacts in the middle of the screen (tearing, etc) that may indicate a multi-monitor setup.

Thanks for posting that...

Is this just a Display Port thing? Because I did run the Sony at 4k resolution albeit at 30fps using HDMI and it did not have a split screen in the control panel.
 
Is this just a Display Port thing? Because I did run the Sony at 4k resolution albeit at 30fps using HDMI and it did not have a split screen in the control panel.

It is a 4K 60p thing. Up until now there have been no 4K 60p input controllers, Panasonic will be the first to bring one to the market in this 4K TV. DisplayPort at 4K 30Hz will not have a split screen like HDMI (1.4) as there is enough bandwidth for that.

Another thing worth testing out on this TV is 1080p at 120Hz. The panel is supposed to be able to refresh at 120 Hz, and the new input controller can handle the bandwidth, so hopefully it will be able to support it.

It is too bad the display panel isn't 240 Hz as the input controller is able to handle 1080p at 240Hz. I guess we will have to wait until next year for 1080p 240Hz input displays.
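That follows directly from the pixel arithmetic: 1080p has exactly a quarter of the pixels of 4K, so 1080p at 240 Hz is the same raw pixel rate as 4K at 60 Hz (this sketch counts active pixels only, ignoring blanking):

```python
def pixel_rate(w, h, hz):
    return w * h * hz  # raw active-pixel rate per second, blanking ignored

uhd_60 = pixel_rate(3840, 2160, 60)    # 497,664,000 px/s
fhd_240 = pixel_rate(1920, 1080, 240)  # 497,664,000 px/s -- identical
fhd_120 = pixel_rate(1920, 1080, 120)  # exactly half

assert uhd_60 == fhd_240
assert fhd_120 * 2 == uhd_60
```

So a controller that accepts 4K 60p has, in raw bandwidth terms, exactly the headroom needed for 1080p 240Hz.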
 
HDMI 1.4 on the Sony 4k does not have a split screen. I have used it a few times and there is no split screen when using 4k @30fps.
 
HDMI 1.4 on the Sony 4k does not have a split screen. I have used it a few times and there is no split screen when using 4k @30fps.

That is not what he is inquiring about. No one cares about 4K at 30 Hz via HDMI 1.4. The huge boon of this TV is that it could possibly run 4K at 60 Hz via a single DP cable with a single stream. This is akin to the drivers and windows seeing it as a "true" single monitor, and not two separate displays like all current 4K displays are configured.

So you have the set inbound for delivery? That is sweet. We can show you what to test! Besides the single stream connectivity, input lag and motion clarity are two other main attributes us computer folks will be dying to know about..

I could literally plug in a laptop in the store and figure most of this stuff out in two minutes or so. I have to call some Magnolia's around me, but I don't have a laptop. :mad:
 
That is not what he is inquiring about. No one cares about 4K at 30 Hz via HDMI 1.4. The huge boon of this TV is that it could possibly run 4K at 60 Hz via a single DP cable with a single stream. This is akin to the drivers and windows seeing it as a "true" single monitor, and not two separate displays like all current 4K displays are configured.

So you have the set inbound for delivery? That is sweet. We can show you what to test! Besides the single stream connectivity, input lag and motion clarity are two other main attributes us computer folks will be dying to know about..

I could literally plug in a laptop in the store and figure most of this stuff out in two minutes or so. I have to call some Magnolia's around me, but I don't have a laptop. :mad:

I got what he was saying, but he said this as well: "DisplayPort at 4K 30Hz will not have a split screen like HDMI (1.4)". I was just stating that part was incorrect -- HDMI 1.4 on the Sony does not have a split screen at 4k/30fps either.

Yes, I have it inbound. It will not be delivered until October 20th. I could have it sooner, but my fiancée and I are going to Disneyland for a few days, so I did not want to cancel that just to get the set earlier.

I emailed someone at Nvidia about the single stream, so hopefully they will answer. I do know that at the Panasonic IFA presentation, the setup was a single DisplayPort cable directly into the TV, but not much more than that.

I did take my laptop and ran a few things; the input lag seemed pretty low to me and I did not notice much motion blur at all on the set. But I also did not have time to really calibrate the set; I put it into game mode to test a few things and it seemed pretty good.
 
I don't have my set yet, but I took another look at that video I posted above and got a hold of Lars Weinand, who works for Nvidia and has been partnering with Panasonic on the new 4k display. Here is what he said about the set and it not being SST:

"You're right about SST. Problem is, there are no panel display controllers today who can handle DP60 Hz at 4K resolution. This is why all panel makers are using MST today to get to 4K@60Hz. It's expected that such controllers will be ready sometime next year for prototyping. So we think it will take until end 2014 or 2015 until we see SST solutions with 60Hz at 4K res."

So it seems like SST at 4k/60fps won't be something that happens anytime soon. He did tell me in a separate message to ensure I have the latest drivers, as they have made specific modifications for this set, and that it works really well and 4k/60fps is very nice. He said that some games have issues at 4k, but with the emphasis on 4k as of late, any issues should be resolved quickly.

For example, Skyrim at 4k always displays "Level Up" in the bottom right hand corner of the screen.

I get my set on the 20th, so I should have some impressions on the 21st or shortly after.
 
Disappointing.

So let me see if I have this straight, DP 1.2 was released at the end of 2009 with enough bandwidth for 4K 60p over a single stream. Yet the first controllers to support this are only coming out in 2014 or even 2015? Ouch.

Meanwhile the first HDMI controllers supporting 4K 60p (over a single stream) are coming out in displays now.

It is too bad no existing hardware supports HDMI 2.0.

AMD-Tile-Display-Presentation-50_575px.jpg


Radeon R9 290X listed between $549 and $729 - http://www.fudzilla.com/home/item/32788-radeon-r9-290x-listed-between-$549-and-$729
 
Disappointing.

So let me see if I have this straight, DP 1.2 was released at the end of 2009 with enough bandwidth for 4K 60p over a single stream. Yet the first controllers to support this are only coming out in 2014 or even 2015? Ouch.

Meanwhile the first HDMI controllers supporting 4K 60p (over a single stream) are coming out in displays now.

It is too bad no existing hardware supports HDMI 2.0.

AMD-Tile-Display-Presentation-50_575px.jpg


Radeon R9 290X listed between $549 and $729 - http://www.fudzilla.com/home/item/32788-radeon-r9-290x-listed-between-$549-and-$729

I am still learning this stuff about MST and SST. So if future GPU's use HDMI 2.0, any current HDMI 2.0 displays will run in SST mode via HDMI 2.0?

What are the pitfalls of MST mode at 4k/60fps? Speaking with the guy at Nvidia, a lot of their testing has been going very good and a lot of games are running great on the Panasonic and Nvidia GPU's.
 
A 600 MHz pixel clock is what you need for 60 Hz @ 4k, so it sounds like that ATI card will support it using SST.

I know Nvidia's drivers on Linux and Windows are borked and can't actually support this right now, as they have a hard-coded 400 MHz limit, which I assume is a holdover from the analog days (400 MHz RAMDAC).
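For reference, that 600 MHz figure is the right ballpark: pixel clock = total pixels per line (active + blanking) x total lines x refresh. A rough sketch (the blanking overheads here are illustrative reduced-blanking assumptions, not exact CVT timings):

```python
h_active, v_active = 3840, 2160
h_blank, v_blank = 160, 62   # assumed reduced-blanking overheads
refresh = 60

pixel_clock = (h_active + h_blank) * (v_active + v_blank) * refresh
print(pixel_clock / 1e6, "MHz")  # ~533 MHz -- well past a hard-coded 400 MHz cap
```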
 
If anybody has some internal pictures of this TV's PCB set, I would like that.
If I can get my hands on those PCBs I would be so happy.
 
Well that stinks. All 4K displays will need to be treated as NVIDIA Surround or AMD Eyefinity until 2014/2015? Man I hate the display sector! They are always dragging their feet on everything. Especially when it comes to connectivity standards and controller boards.

While not a deal breaker for me to try the set, it is a little less appealing now. Still the largest things for me are; how is the input lag and motion blur at native 4K/60Hz.

Another issue: is this SST/MST issue DisplayPort-only? As far as I am aware, DP is the only packet-based display connection, and HDMI 2.0 should not have this issue.

So right now, we have GPU's that can output 4K@60Hz DP in SST but no TV's to accept the signal, and we have HDMI 2.0 TV input but no GPU outputs to send out the signal. Talk about a cluster...
 
Well that stinks. All 4K displays will need to be treated as NVIDIA Surround or AMD Eyefinity until 2014/2015? Man I hate the display sector! They are always dragging their feet on everything. Especially when it comes to connectivity standards and controller boards.

While not a deal breaker for me to try the set, it is a little less appealing now. Still the largest things for me are; how is the input lag and motion blur at native 4K/60Hz.

Another issue: is this SST/MST issue DisplayPort-only? As far as I am aware, DP is the only packet-based display connection, and HDMI 2.0 should not have this issue.

So right now, we have GPU's that can output 4K@60Hz DP in SST but no TV's to accept the signal, and we have HDMI 2.0 TV input but no GPU outputs to send out the signal. Talk about a cluster...
I figure HDMI 2.0 is still 3 lanes, so with the clock patcher you should be able to do it with current cards.
It indeed sucks. What sucks even more is that unless they double the speed per lane of the DP standard, we will be running 2 DP ports for 120Hz (at least).

10.8Gbps on a single lane is a huge amount, but as shown by HDwire and Thunderbolt, it can be done over copper, so VESA should get to it.

Random rant:
(which dumbass gave the bit standard the same name as the VESA company? I mean, they could've called it something different; it's not like JEIDA is a company)
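The two-cable worry checks out on paper. A sketch of the DP 1.2 link budget (HBR2: 4 lanes x 5.4 Gbit/s with 8b/10b coding; the 4K pixel clocks below are rough estimates that include blanking):

```python
lanes, lane_gbps, coding_efficiency = 4, 5.4, 0.8   # 8b/10b -> 80% payload
link_payload_gbps = lanes * lane_gbps * coding_efficiency  # 17.28 Gbit/s

def mode_gbps(pixel_clock_mhz, bits_per_pixel=24):
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

uhd_60 = mode_gbps(533)    # ~12.8 Gbit/s -> fits on one DP 1.2 link
uhd_120 = mode_gbps(1066)  # ~25.6 Gbit/s -> needs two links (or a faster standard)

print(uhd_60 <= link_payload_gbps, uhd_120 <= link_payload_gbps)  # True False
```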
 
Well, in theory this is the only 4K TV with DP 1.2 and you will be limited to MST mode there. But, once GPU's are able to output HDMI 2.0, in the future the TV can be treated as one display.
 
Panasonic just published the owner's manual online. It's quite funny, though, as the manual states to read the on-screen manual built into the TV set. Jeez....

Seeing how NVIDIA worked with Panasonic on this TV, it would be awesome if they were to implement G-Sync into it in the future. Or possibly make a G-Sync version. I would definitely be interested in it then.
 
Stateless269 got the TV.
He has confirmed that 4K 60p only works over DisplayPort MST.

It is MST. When you first boot the computer you get two images, side by side, of the exact same thing as the BIOS boots. Then, when the Windows logo shows up, you also get a side-by-side of the same image. Once this disappears, the TV gives the "no signal" image, then immediately the Windows image appears full screen. Going into the "set resolution" screen, it appears as one screen.
 
So now that 4k is slowly trickling out, can I expect to see every film ever made remastered into 4k, or only the stuff going forward? What about TV broadcasts; when are we going to see those in 4k?
 
people can't tell the difference between 720p and 1080p sets at typical viewing distances usually, on sub-70" tvs. I see a 70" tv at work regularly and it could benefit, assuming it had the material to view on it. Perceived pixel density is relative to viewing distance. Unless you have a gigantic tv or sit very close, I don't think it will be as huge of an improvement as people think. Some 1080p broadcasts are better, and worse, than others. A high quality bluray movie would be a more valid example of pristine 1080p.
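That argument can be put in numbers: 20/20 vision resolves roughly 1 arcminute, i.e. about 60 pixels per degree of visual angle; past that, extra pixels are invisible. A sketch for a 16:9 panel (the sizes and distances are chosen to match examples in this thread; the 1-arcminute threshold is the standard acuity assumption):

```python
import math

def pixels_per_degree(diag_in, width_px, height_px, distance_ft):
    """Pixels subtended per degree of visual angle, for a flat 16:9-ish panel."""
    aspect = width_px / height_px
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    px_per_inch = width_px / width_in
    inches_per_degree = 2 * distance_ft * 12 * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# 55" 1080p at 15 ft: ~126 ppd, already well past the ~60 ppd acuity limit,
# so 4K adds nothing there. 70" 1080p at 8 ft: ~53 ppd, below the limit,
# so 4K genuinely helps.
print(pixels_per_degree(55, 1920, 1080, 15))
print(pixels_per_degree(70, 1920, 1080, 8))
```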

Where I think it would come into play more is desktop real-estate, and also game interfaces/HUDs. The higher the rez, the smaller the interfaces (bars, buttons, etc) and text can be. In 1st/3rd person HOR+ games, the pixel density would be a lot higher, which would look gorgeous, but the scene would still be the same 16:9 scene with scene elements all the same size (other than perhaps the HUD) compared to a lower-rez 16:9 monitor of the same size. The downsides: 4k displays are only 60hz, which makes a blurry mess during your continual FoV movement -- losing out on 120hz's blur reduction and the other important benefits of 120hz+120fps gaming. 4k is also an extremely demanding, fps-crushing resolution.
I'd still be interested in a 4k display someday as a desktop monitor, jumping up from my 2560x1440 desktop/app monitor, but I'm in no rush to do so. 4k-vs-2560x display's real-estate. I can wait quite a while until the wrinkles in the tech are ironed out and prices adjust a little. The gaming monitor will be upgraded much sooner. I'm planning on getting one of next years g-sync+lightboost monitors.
 
people can't tell the difference between 720p and 1080p sets at typical viewing distances usually, on sub-70" tvs.

Some of us actually have good vision though. I know I can easily tell the difference between 720p and 1080p on a 55 inch tv from around 15-20 feet away.
 
Depends on the quality of the signal. And are you comparing 720p video to 1080p video on the same set? Quality varies display to display. Are you training your eyes to see the difference, or are you just watching the material? If someone switched the signal to a high-quality, not heavily compressed 720p source of the same material without you knowing it -- and without you aware or expecting that a random blind test was going to happen -- would you notice immediately, on high-quality video material with no interface or OSD size tipoffs? Many would not notice immediately at distance.

I haven't seen a 4k set in person yet, so I can't say for sure, but I do know that a 70" screen at 8-12 feet away (possibly further, but that is the viewing setup) could definitely benefit from 4k over 1080p, because the pixelization is not subtle.

I have acute eyesight btw. People are amazed at the small text I read on my tablet from a distance on a table.

For me personally, it would be a much more appreciable difference for desktop/app use for the real-estate, at least as far as jumping in as an early adopter and not waiting it out.

There are supposedly going to be some g-sync enabled 4k displays sometime next year, but I have no idea what sizes. A separate aspect of g-sync is apparently the ability to enable a backlight strobe mode which would potentially eliminate lcd movement blur which I am definitely interested in. Blur reduction and motion definition are to me almost as important as pixel density definition, more so for games but is a major annoyance on video material too - fast action movies, sports broadcasts, camera panning, general viewing.
 
It wasn't just me. Back in the day, it was very noticeable when we went from 720p 2 GB bluray rips of Sons of Anarchy to 1080p 4 GB bluray rips. These are high-bitrate h264, and yes, it's the same TV. It's very obvious to me. Of course, I have used 4k displays since 2005 on 22 inch monitors, which 30% of the people who see them say they can't even read, and another 40% say are too small to be comfortable.
 