Seiki SE50UY04 3840x2160 50" TV ($1300)

It has nothing to do with monitor manufacturers not being able to do something. Plenty of displays, from the Dell 5K to the Dell 8K, have multiple combined DisplayPort inputs. 4K at 120 Hz has technically been possible for many years. For whatever reason, be it market studies or something else, display manufacturers have chosen not to do it. cirthix has tapped into a real market that display manufacturers either thought didn't exist, or thought was too small to pursue. Display manufacturers get a lot of things wrong and rarely take risks.

Yeah, but I think we'll be seeing a lot of 4K/120 TVs (with FreeSync) now that HDMI 2.1 is a thing. It won't be for a couple more years probably, but they will be here at some point. But what I want to know is: will they have input lag measured in microseconds? That's where cirthix is really trailblazing.
 
But all your charts are comparing is Nvidia's implementation of driver overhead to bypassing that section of the driver. Nothing more, nothing less.

Not to get off-topic, but not quite; driver overhead has little to do with my results (and G-SYNC was kept within its range with an in-game framerate limit to ensure no extra frames of delay were added from the over-queuing of buffers).

My charts actually show the limitations of a single frame completing within a single scanout. The lower the refresh rate, the slower the scanout, which is why G-SYNC + V-SYNC (which is forced to follow the scanout, and thus shows a single, complete frame per scanout) at 60Hz (16.6ms scanout) has notably more input lag than G-SYNC OFF + V-SYNC OFF (which does not have to follow the scanout, and can show two or more partial frame updates [dependent on the refresh rate/sustained framerate ratio] in a single scanout) at 60Hz.

It is also the same reason why the gap between the two scenarios begins to close the higher the refresh rate is, as single frame delivery synchronous with the scanout becomes faster due to lower scanout time.
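To put rough numbers on that, here is a back-of-the-envelope sketch (illustrative only, not my actual test methodology):

```python
# Scanout time shrinks as the refresh rate rises, which is why the gap between
# synced (one complete frame per scanout) and unsynced (partial updates spliced
# in mid-scanout) delivery closes at higher Hz. Illustrative numbers only.
for refresh_hz in (60, 120, 240, 480):
    scanout_ms = 1000.0 / refresh_hz   # time to scan one frame top to bottom
    print(f"{refresh_hz:>3} Hz: scanout ~{scanout_ms:.2f} ms "
          f"(a torn, mid-scanout update can land anywhere within this window)")
```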
 
Curious, do any of the kits come pre-flashed? Plug and play?

Yes, the kits will be flashed and tested.

Once you install the kit to the panel (if not getting the full 28" model, it's a few screws and a few connections to plug in), it will be plug and play.
 
Cirthix, does this come with any other inputs besides the two display ports?
I am wondering because I plug my Xbox in at my computer desk too, and if necessary I will use an HDMI to DisplayPort adapter, but then I'd have to unplug the PC and plug in the Xbox every time I switch. Not that big a deal, but just wondering.
Which brings me to another thought: will an Xbox, even with an adapter, work on this 28" display? The Xbox One can output 720p or 1080p, and most games are still 720p. If I plug in the console, will the monitor be able to run a 240Hz refresh with the console?
 
Not to get off-topic, but not quite; driver overhead has little to do with my results (and G-SYNC was kept within its range with an in-game framerate limit to ensure no extra frames of delay were added from the over-queuing of buffers).

My charts actually show the limitations of a single frame completing within a single scanout. The lower the refresh rate, the slower the scanout, which is why G-SYNC + V-SYNC (which is forced to follow the scanout, and thus shows a single, complete frame per scanout) at 60Hz (16.6ms scanout) has notably more input lag than G-SYNC OFF + V-SYNC OFF (which does not have to follow the scanout, and can show two or more partial frame updates [dependent on the refresh rate/sustained framerate ratio] in a single scanout) at 60Hz.

It is also the same reason why the gap between the two scenarios begins to close the higher the refresh rate is, as single frame delivery synchronous with the scanout becomes faster due to lower scanout time.

Compared to Gsync the only reduced input lag you get with vsync off is below tear lines. Everything above the first tear line is the same input lag as if you used Gsync.

So you only get partial screen reduced lag.
 
Cirthix, does this come with any other inputs besides the two display ports?
I am wondering because I plug my Xbox in at my computer desk too, and if necessary I will use an HDMI to DisplayPort adapter, but then I'd have to unplug the PC and plug in the Xbox every time I switch. Not that big a deal, but just wondering.
Which brings me to another thought: will an Xbox, even with an adapter, work on this 28" display? The Xbox One can output 720p or 1080p, and most games are still 720p. If I plug in the console, will the monitor be able to run a 240Hz refresh with the console?

I haven't tested the Xbox or HDMI->DP adapters. Does the Xbox even look at the EDID?

The monitor will display a good image under the following conditions:
1) Any horizontal resolution is OK
2) Vertical resolution must be 2160, 1080, 720, or 540
3) Line clock below 276 kHz
4) Refresh rate greater than 30 Hz.
Number two is the only real restriction, since 3 and 4 are definitely going to be met by the Xbox output in any situation (a small sketch of this check follows below).
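For what it's worth, those rules boil down to a very simple check. Here's an illustrative sketch of it (the function name and structure are just for illustration, not part of any zisworks tooling):

```python
def mode_is_displayable(v_active: int, line_clock_khz: float, refresh_hz: float) -> bool:
    """Rough check of the constraints listed above; any horizontal resolution is fine."""
    return (v_active in (2160, 1080, 720, 540)
            and line_clock_khz < 276.0
            and refresh_hz > 30.0)

# An Xbox-style 720p60 signal (~45 kHz line clock) passes:
print(mode_is_displayable(720, 45.0, 60.0))    # True
# 1440p60 (~89 kHz line clock) fails on the vertical-resolution rule:
print(mode_is_displayable(1440, 89.0, 60.0))   # False
```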

Interestingly, you might be able to plug the PC into the master port of the x28, then the Xbox into the secondary. This setup would work if the three following conditions were met:
1) The PC was not sending a video signal
2) The x28 is set to dual-input mode (4k120 EDID set)
3) The HDMI->DP adapter or Xbox ignores the EDID and sends a 1080p or 720p image instead.
To switch to the Xbox, select the 4k120 EDID and put the computer to sleep or otherwise disable its output.
To switch back to PC use, select one of the single-input EDIDs (4k60/1080p240/etc).
 
Is the T-Con and backlight controller a drop-in replacement for the old one on an existing 28" monitor (with the M280DGJ-L30 panel), or would I have to replace the input with the DP2LVDS unit as well?
 
Is the T-Con and backlight controller a drop-in replacement for the old one on an existing 28" monitor (with the M280DGJ-L30 panel), or would I have to replace the input with the DP2LVDS unit as well?

No, the original M280DGJ-L30 tcon uses a Vx1 interface, while the zws tcon uses an LVDS interface. These are not compatible, so the input board must be swapped too.
 
Compared to Gsync the only reduced input lag you get with vsync off is below tear lines. Everything above the first tear line is the same input lag as if you used Gsync.

So you only get partial screen reduced lag.

At the same framerates within the VRR range, correct.
 
If we're talking about both methods being at the same sustained (limited) framerate below the refresh rate, then yes, correct, the difference between G-SYNC and V-SYNC OFF is minimal in that scenario for that very reason. Depending on the offset of the tearline, which is ever rolling upward with V-SYNC OFF at framerates inside the refresh rate, we're looking at a 0 to 1/2 frame reduction of input latency at any point over G-SYNC.

V-SYNC OFF input lag reduction only really becomes significant over G-SYNC when the framerate is over 3x (ideally 5x) that of the current max refresh rate.

If a theoretical 1000Hz/sustained 1000fps was ever reached, adaptive sync and V-SYNC OFF would be virtually indistinguishable, though adaptive sync would still be useful for lower framerates at such refresh rates.

Currently, adaptive sync is the only method that allows 1:1 framerate to scanout delivery (within its effective range) without the delay traditional syncing methods add.

Another thing many are overlooking in the 480Hz "usefulness" conversation is that a 1:1 framerate/refresh rate ratio isn't everything. With a theoretical 480Hz display capable of adaptive sync, you have an extremely high VRR ceiling without worry of exceeding the VRR range (no need for a framerate limit) in the latest AAA games. And 480Hz still has a much faster scanout time than lower refresh rates, which means no matter the sustained framerate, frames are still being delivered faster.
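For a rough sense of why the gap narrows as framerates climb, here's a back-of-the-envelope sketch (illustrative numbers only, not my measurement data): with synced delivery the displayed frame can be up to one refresh period old, while with V-SYNC OFF the newest tearslice is at most one frametime old.

```python
# Illustrative staleness comparison; real latency chains add render/display overhead.
def staleness_ms(refresh_hz: float, fps: float):
    refresh_period = 1000.0 / refresh_hz  # worst-case age of a fully synced frame
    frametime = 1000.0 / fps              # worst-case age of the newest V-SYNC OFF tearslice
    return refresh_period, frametime

for hz, fps in ((144, 144), (144, 432), (144, 720), (1000, 1000)):
    rp, ft = staleness_ms(hz, fps)
    print(f"{hz} Hz @ {fps} fps: synced up to {rp:.2f} ms old vs torn up to {ft:.2f} ms old")
```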


I think the new thread button is somewhere in the corner...
 
I think the new thread button is somewhere in the corner...

Ha, sorry about that cirthix, my first post was just a passing FYI; honestly didn't think anyone would respond.

I have edited my second post, and will refrain from posting off topic from this point.
 
Hi,
I'm pretty late to the party, but I do have one of my ZWS DP1.2->LVDS boards hooked up to a V390DK1 panel now, and I want to test it to see what it can do with a proper interface board.

Does anyone have an i2c dump for poweron, 4k->2k mode switch, and 2k->4k mode switch?

Took him 14 months to double the initial bet

I had a nerdgasm and lost consciousness somewhere around the 4k120Hz mark :D

Cool, keep us updated! If you get 4K 120 Hz to work I'll be one of the first to buy.
I was driving kids to school when the pre-order started. But I think I bested Vega to the line :cool:
Soon enough the first 4k panel to reach mass market prices will also become the first 4k 120Hz panel.

Cirthix's blog:

" sales for the first day have been strong and totaled around $7400. Most interest has been in the 39" upgrade kits."

Told you so!;)
 
I checked the FAQ but couldn't find the answer there. I am assuming the monitor will be like the early Korean and Overlord 1440p IPS monitors and will only have a power button. No OSD available to adjust monitor settings, right?
 
I checked the FAQ but couldn't find the answer there. I am assuming the monitor will be like the early Korean and Overlord 1440p IPS monitors and will only have a power button. No OSD available to adjust monitor settings, right?



On/off, brightness, backlight mode rotate (scan/strobe/stable), and dedicated edid profile buttons.
 
Chief Blur Buster here.

I want to give mucho kudos to Zis (cirthix) for being the first in the world to achieve 480 Hz on an ordinary non-laboratory LCD, way ahead of mainstream manufacturers.

He did it for a 240Hz LCD back in 2013 [Zis' YouTube].

I should mention that I helped Zis (for free) with his strobe backlight, as Blur Busters originally launched because of an Arduino scanning backlight project. We published some research on homebrew strobe backlights and scanning backlights, which Zis found very useful towards this project.

While I did shelve the Arduino scanning backlight project once the Lightboost hack was discovered as a much easier alternative, it's fantastic to see an "Arduino open-source scanning backlight" finally become a reality with Zis' display.

While strobing ideally needs overdrive (which helps reduce strobe crosstalk), it's very usable when used with large vertical totals. Thanks to the high bandwidth, I was able to achieve >VT2000 with ToastyX CRU at 1080p120. The accelerated LCD scanout velocity creates a longer VBI to let the LCD's GtG pixel transitions settle in total darkness (4.16ms VBI + 4.16ms scanout for 1080p120Hz running at 240Hz scanout timings). This partially compensates for the lack of overdrive, and makes strobing actually quite usable. There's a switchable scanning-versus-strobing mode on Zis' display; the scanning mode helps if your VBI margins are a bit tighter, in exchange for different kinds of artifacts.
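To put numbers on the large-vertical-total trick, here's an illustrative sketch (assuming a VT of 2200; the exact timings I used may differ slightly):

```python
# 1080 active lines refreshed at 120 Hz, but scanned out with 240Hz-class timings:
# roughly double the vertical total, so about half of each refresh is blanking (VBI).
active_lines = 1080
vertical_total = 2200      # assumed; anything ">VT2000" behaves similarly
refresh_hz = 120

line_rate = vertical_total * refresh_hz                        # ~264,000 lines/sec
scanout_ms = active_lines / line_rate * 1000                   # ~4.1 ms drawing the picture
vbi_ms = (vertical_total - active_lines) / line_rate * 1000    # ~4.2 ms of darkness for GtG to settle
print(f"scanout {scanout_ms:.2f} ms + VBI {vbi_ms:.2f} ms per {1000 / refresh_hz:.2f} ms refresh")
```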

The strobe backlight firmware is open source. It can be uploaded within the Arduino 1.8.2 IDE (using a tiny dongle connected over USB). I've test-compiled and uploaded it, and tweaks work. Strobing at any refresh rate, any phase, and any strobe length is supported, and overvoltage boosts are available for brighter strobing at shorter strobe lengths. Yes, single-strobe 60Hz (flickermania!) is available (something manufacturers tend not to do), if you want to run emulators without motion blur. Persistence (strobe length) is quite adjustable. The quality of strobing has already improved thanks to tinkering & tweaking (and will continue to)! Also, support will be added for strobe calibration via a PC utility.
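Those parameters (refresh rate, strobe phase, strobe length) map onto a very simple timing model. Here's a hypothetical sketch of that model, purely for illustration; it is not the actual Arduino firmware:

```python
def strobe_window_ms(refresh_hz: float, phase_pct: float, persistence_ms: float):
    """Where, within one refresh cycle, the backlight flash starts and ends.

    phase_pct      -- delay after VSYNC as a percentage of the refresh period
    persistence_ms -- how long the backlight stays lit (shorter = less blur, dimmer image)
    """
    period_ms = 1000.0 / refresh_hz
    on_at = period_ms * phase_pct / 100.0
    off_at = min(on_at + persistence_ms, period_ms)
    return on_at, off_at

# e.g. single-strobe 120 Hz with a 1 ms flash placed late in the refresh,
# after most of the panel has finished its GtG transitions:
print(strobe_window_ms(120, 75.0, 1.0))   # ~(6.25, 7.25) ms into each 8.33 ms refresh
```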

Blur Busters does not derive revenues from cirthix's project (which is sold by him alone, not Amazon/NewEgg). However, in this case I helped with the strobing part, at no charge, because I believe in pushing the limits of hobbyist display tinkering -- and also because it's true to the original geek roots of Blur Busters.

Hertz insanity is a prerequisite for simulating stepless analog motion: "strobeless LightBoost", or "blur-free sample-and-hold", achieving CRT motion clarity via continuous light with no BFI/impulsing/strobing/phosphor/etc. A future consumer display that successfully achieves this is an Apollo Mission worthy of an X-Prize. It is already being achieved in laboratories (e.g. ViewPixx's 1440Hz DLP laboratory projector, which I've seen), and even the response-time-throttled 480Hz mode (clearer on the 28" TN) is an early-canary taste of that tantalizing long-term future. Blur-free sample-and-hold is far closer to real life, because real life doesn't strobe.

Even to this day, many LCD designers, researchers, and factory workers don't always exactly realize how persistence behaves, or the difference between pixel response (GtG) and persistence (MPRT). Many are getting smart already. Many parties are already wizards (e.g. NVIDIA engineers behind LightBoost/ULMB). But many outfits, even to this day, still think pixel response (alone) means improved motion blur, when persistence (frame visibility time) must also be simultaneously addressed.

I believe in sharing the knowledge so that display panel manufacturers understand "Why Low Persistence?" and "Why 480Hz?" and the science/mathematics of trying to match CRT motion clarity on OLED/LCD/FED/whatever future display tech, whether via strobing or via Hertz insanity.

Keep pushing display limits, cirthix!
 
Come on Mark! Forget about having a life or enjoying a family over the weekend!
just show us pursuit camera images of scanning and strobing modes!
 
Come on Mark! Forget about having a life or enjoying a family over the weekend!
just show us pursuit camera images of scanning and strobing modes!
Patience, my padawan! ;)

Pursuit cam images of strobing are coming in Part #3, mainly because I want the backlight source code to improve first (that takes time). It's rapidly improving (a moving target), so I want to wait for the quality to settle first.

Current tentative plan (unless the LTT video suddenly shakes things up and the priority order/topics change):

--> Part #2 is input lag tests, of 60Hz vs 120Hz vs 240Hz vs 480Hz. (further 480 Hz mythbusting)
--> Part #3 will be many more pursuit cam tests including of strobed modes. (once the updated strobe firmware mods are done)

Priority is 480Hz mythbusting. News about LightBoost and 120Hz is mainstream-site stuff, but mainstream sites are jaw-dropped at why 480 Hz exists. Many news sites have said 480 Hz is not worth it, which is total garbage. The 480 Hz quality will only get better with future monitors (even if the pixel response of 39" VA panels will also reduce the difference between 240Hz and 480Hz -- leading to more misclaims & myths, by people who do not understand). And yes, 540p is a little low for 480Hz benefits -- more FOV+resolution benefits far more from Hz (more pixels to motion-blur over).

However, 480Hz myths MUST be busted, by Blur Busters. :D

I have to nip that in the bud, quickly, pronto, right at the beginning, before the misconceptions spread in the media. (Just google "480 Hz" in Google News -- that came before I published Part #1). I have to bust the "480Hz is stupid" and "480Hz is useless" posts.

Although there are definitely points of diminishing returns, the holy grail of strobeless LightBoost (real life doesn't strobe) is a Blur Busters prime directive, and that is numero uno. We bust motion blur. (Except when properly artistically used, not when unwanted!) Strobing (MBR) such as ULMB is amazing but not the final frontier -- that is blurless sample-and-hold (requiring insane Hz) -- and it will be made possible by 1000fps reprojection tech (I call it "frame rate amplification tech"). Oculus does that now, but the tech can go all the way to 1000fps directly in silicon transistors on a corner of a circa-year-2025 mid-range GPU. Yesterday, 3D became hardware (3dfx), then T&L went into silicon (GeForce 256), then came shaders/stream processors. Tomorrow, it's various frame rate amplification tech (thank you, VR, for all those lagless reprojectors/timewarp/interpolators that don't have soap-opera-effect artifacts) with fewer and fewer artifacts until they almost disappear. We'll be ready for the lagless+strobeless MBR monitors (aka 1000Hz displays + GPUs with built-in frame rate amplifiers) ten years from now.

Lagless & strobeless Motion Blur Reduction is a tech that we're strong advocates of. Unfortunately, doing this requires stratospheric Hertz. I didn't think it would happen in my lifetime, but it is possible in a mere decade on an ordinary gaming monitor, and we are pushing HARD for that.

Don't worry, I've got a BBQ to go to, so Chief does have downtime :)
 
Couldn't resist recovering my account just to comment how excited I am about this monitor. I'll be interested to see what Linus's video shows too.
 
The LTT video still hasn't been posted. I pointed out some technical problems with it and LTT was going to reshoot, then decided to just add annotations for the fixes. The original date was supposed to be the 17th, when the shop went live. I'm going to have to extend the order window, which will delay the schedule, but I don't know how long to extend it. The original plan was 2 weeks (Aug 17~31), but that seems like a pretty large delay at this point. Thoughts?

In the meantime, I've been working on a rear-illumination system with individually addressable rgb led strips. Should be neat.

Some stats: So far, there have been ~$14K of orders. Not bad for ~1100 views of the zisworks site. Expecting a lot more traffic after the LTT video gets posted though.
 
The initial blurbusters review (part 1 of 3~4) is up. http://www.blurbusters.com/4k-120hz-with-bonus-240hz-and-480hz-modes/

39" kits, 28" kits, and 28" full monitors are preorderable
I have a couple of questions. First off, where do we go to pre-order? Second, when will Linus post his video?

And what exactly is the difference between the 28" and 39"? Is there anything other than the size? I have it in my mind that one is TN and the other VA, but that may just be foggy memory (I haven't checked on this in months).

Also, my setup is a little (read: extremely) weird compared to most others on here. I don't actually have a desktop anymore. I got rid of my FX 8350 based system about half a year ago, and now use a 13 inch MacBook Pro with an external GPU enclosure over Thunderbolt 3. I switch over to boot camp for the occasional game, and stay in macOS for everything else. The display should behave fine on Windows, but I'm not sure it'll cooperate on macOS (with or without the eGPU).
 
Two questions: where to pre-order.

And what exactly is the difference between the 28" and 39"? Is there anything other than the size? I have it in my mind that one is TN and the other VA, but that may just be foggy memory (I haven't checked on this in months).

Check the blurbusters review for a link to the shop. I don't want to post it directly in this thread.

Differences:
28"
TN type
matte finish
"1ms" rating
650:1 contrast

39"
MVA type
glossy finish
"6.5ms" rating
2600:1 contrast

The speed ratings are inaccurate for both, but this is what the panel vendor claims.
 
Check the blurbusters review for a link to the shop. I don't want to post it directly in this thread.

Differences:
28"
TN type
matte finish
"1ms" rating
650:1 contrast

39"
MVA type
glossy finish
"6.5ms" rating
2600:1 contrast

The speed ratings are inaccurate for both, but this is what the panel vendor claims.
Thanks for the fast reply. Can you go back and reread it? I edited it before I got the alert for your response; you must have been writing this as I was editing mine.
 
Thanks for the fast reply. Can you go back and reread it? I edited it before I got the alert for your response; you must have been writing this as I was editing mine.


Also, my setup is a little (read: extremely) weird compared to most others on here. I don't actually have a desktop anymore. I got rid of my FX 8350 based system about half a year ago, and now use a 13 inch MacBook Pro with an external GPU enclosure over Thunderbolt 3. I switch over to boot camp for the occasional game, and stay in macOS for everything else. The display should behave fine on Windows, but I'm not sure it'll cooperate on macOS (with or without the eGPU).

I don't have a MacBook to test with, but StarTech did give me one of these to try: https://www.startech.com/AV/Converters/Video/thunderbolt-3-to-dual-displayport~TB32DP2

I sent it over to blurbusters; Mark has a MacBook. He gave it a 30-second test and it didn't work on the first try, so he went back to the regular review stuff. It should be tested more, but he's busy. Don't know if it's an OSX issue, a hardware issue, or something with his setup/config.
 
I sent it over to blurbusters; Mark has a MacBook. He gave it a 30-second test and it didn't work on the first try, so he went back to the regular review stuff. It should be tested more, but he's busy. Don't know if it's an OSX issue, a hardware issue, or something with his setup/config.
Traced to a defective Thunderbolt port on my MacBook. It even stopped working with a Thunderbolt Ethernet adaptor, too. Not cirthix's fault -- the adaptor may very well work.

Also, ontopic:
Just heard Linus Tech Tips (LTT) is redoing parts of their video. Still doing lag tests, keep tuned, maybe this (comparatively) little guy will beat LTT again. ;)
 
Traced to a defective Thunderbolt port on my MacBook. It even stopped working with a Thunderbolt Ethernet adaptor, too. Not cirthix's fault -- the adaptor may very well work.

Also, ontopic:
Just heard Linus Tech Tips (LTT) is redoing parts of their video. Still doing lag tests, keep tuned, maybe this (comparatively) little guy will beat LTT again. ;)

You mentioned the TB port problem; I forgot about it.

I fully expect the adapter to work. Main question is OSX support.
 
The LTT video still hasn't been posted. I pointed out some technical problems with it and LTT was going to reshoot, then decided to just add annotations for the fixes. The original date was supposed to be the 17th, when the shop went live. I'm going to have to extend the order window, which will delay the schedule, but I don't know how long to extend it. The original plan was 2 weeks (Aug 17~31), but that seems like a pretty large delay at this point. Thoughts?

In the meantime, I've been working on a rear-illumination system with individually addressable rgb led strips. Should be neat.

Some stats: So far, there have been ~$14K of orders. Not bad for ~1100 views of the zisworks site. Expecting a lot more traffic after the LTT video gets posted though.

Maybe do 2 batches? The first batch for those that are in this order window with the BlurBusters post, and open a second order window from middle to end of September for the Linus people to get in on it. That would also split the load you have to deal with which might make it easier on you. I bet there will be a big rush on orders from the Linus video.
 
Compared to Gsync the only reduced input lag you get with vsync off is below tear lines. Everything above the first tear line is the same input lag as if you used Gsync.
See attached image.

Lag of what's above the tearline is very frametime-dependent.

The lag above the tearline is one frametime (1/framerate of a second) higher than the lag below the tearline.

The lag of the topmost tearslice is dependent on where the previous tearline is (it doesn't line up with the top edge of the screen). The top tearline of the first tearslice is usually somewhere in the previous refresh cycle, but may even occur inside the VBI (signal VSYNC), which is simply a block of empty black rows of pixels padding between refresh cycles (the sync/porch seen in a CRU app). This is an old carryover from the analog days still used in digital signals today. But when viewed this way, it becomes mathematically simpler to calculate lag gradients of the first tearslice, once you look at the image and realize the VBI is part of the tearslice math.

At 1000fps, frametime of 1/1000sec (1ms), the input lag of the pixel row immediately above the tearline is 1ms higher than the input lag of the pixel row immediately below the tearline.

The tearslices have an input lag gradient [0ms ... (frametime)ms] from the top edge to the bottom of the tearslice. So at 1000fps, the middle of the tearslice is 0.5ms higher than the top edge of the tearslice, and 0.5ms lower than the bottom edge. This is true even when the tearslice overlaps refresh cycles (top edge / bottom edge of two adjacent refresh cycles, for example).

In the Blur Busters "filmreel metaphor" images [example attached as PNG] -- the VBI is the black gap between refresh cycles, but the picture is vertically linear time-wise. You can use the time ruler at the right-hand edge to measure the lag gradient of a tearslice. We've confirmed and re-confirmed this, and laboratory lag-gradient measurements line up exactly with the attached PNG (on a CRT and on synchronous-scanout LCDs).

Photodiode oscilloscope equipment reveals the input lag gradient very clearly in tests; it's caused by a monitor's top-to-bottom scanout and the sequential delivery of pixels over the cable -- both usually synchronous on 'proper' gaming monitors. Eventually we'll write an article about this very advanced math topic of lag gradients of tearslices, but these nuances are beyond the scope of 99% of Blur Busters readers...

Mathematically, it all lines up: Let's say, for a horizontal scanrate of 135 kHz (typical for 1080p120Hz), each scanline takes 1/135,000th of a second to refresh. So if a tearslice is 450 scan lines (for a situation of 300fps VSYNC OFF during the 135 kHz scanrate of 1080p120Hz), the input lag difference between the top edge and bottom edge of the tearslice is 450/135,000th of a second. 450/135,000 sec equals 1/300 sec -- that's the frametime for a 300fps situation! We're, of course, assuming synchronous scanout, which is what modern gaming monitors do.
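A quick sketch of that arithmetic (illustrative only, assuming the synchronous scanout described above):

```python
# Lag difference between the top and bottom rows of one tearslice equals the
# slice height divided by the horizontal scanrate -- which works out to one frametime.
scanrate = 135_000                      # lines per second (~135 kHz, typical 1080p120)
fps = 300                               # VSYNC OFF framerate
lines_per_slice = scanrate / fps        # 450 scanlines per tearslice
lag_span_ms = lines_per_slice / scanrate * 1000
print(f"{lines_per_slice:.0f} lines -> {lag_span_ms:.2f} ms span (= 1/{fps} sec frametime)")
```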

Most good gaming monitors worth their salt scan their LCDs synchronously with the input ("Instant Mode", "Lagless Mode", etc). There's obviously a GtG lag, but the vertical lag gradient is symmetrical (on good gaming monitors) with the pixels arriving at the video input.

Scanout physics difference from cable-versus-panel varies from display to display (very different for plasma/DLP) -- and scanout velocity can be different between cable versus panel -- but cable scanout versus panel scanout is symmetric on CRT and proper LCD -- they're top-to-bottom scan.

Pixel transmission -- the pixels are transmitted one at a time, left-to-right, top-to-bottom, including the blanking interval (that includes the sync & porch pixels seen in a CRU app like ToastyX or NVIDIA Custom Resolution -- porches/sync pixels are a carryover from the analog days, but are still used digitally to indicate the start of a new refresh cycle, or the start of a new row of pixels, etc -- they're essentially blank/black pixels that only become visible when a picture rolls, like an analog TV with a misadjusted VHOLD knob, and were also occasionally visible on older digital monitors with a misadjusted Phase/Tracking setting).

It all mathematically lines up perfectly for VGA / RGBHV / HDMI / DVI. DisplayPort micropacket jitter does add a number of microseconds of jittering error margin, but for instant-mode gaming monitors, that jitter is typically simply line-buffered out (an ultra-tiny buffer of a few pixel rows' worth -- mere microseconds of buffering). For all practical intents and purposes, cable scanout and panel scanout are latency-symmetrical down to the ten-microsecond scale on all the best eSports monitors. Meaning the GtG completion of a specific pixel row is separated from that row's delivery on the cable by a fixed offset, accurate down to tens of microseconds. There's a lag, but the lag of the photons emitted from a display's pixel hitting the human eye jitters by only tens of microseconds relative to the pixels arriving on the video cable; it's really that synchronous during "Instant Mode" operation.

Corollary: It also mathematically explains another benefit of continuing to push frame rates higher and higher than the refresh rate. The lag gradients of randomly-placed tearslices manifest themselves as random lag jitter. The higher the frame rate, the smaller the lag gradient of a tear slice, and the lower your min/max/avg latency. Reduced lag jitter from the random placement of tearslices (and their lag gradients) means better aiming. 500fps means a tearslice lag jitter of only 2ms, while 100fps means a tearslice lag jitter of 10ms, regardless of your refresh rate (which adds its own lag, too -- outside the scope of this topic). So you get better CS:GO aiming at 300fps@60Hz instead of 100fps@60Hz due to tighter lag gradients resulting in a tighter min/max/avg.
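Put as numbers (again, my own illustration of the corollary, not measured data):

```python
# Tearslice lag jitter spans one frametime, regardless of the refresh rate.
for fps in (100, 300, 500, 1000):
    print(f"{fps} fps -> tearslice lag jitter of {1000 / fps:.1f} ms")
# 100 fps -> 10.0 ms and 500 fps -> 2.0 ms, matching the CS:GO example above.
```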

TL;DR: Tear slices during VSYNC OFF have their own lag gradients, even the topmost/bottommost tearslice, and tearslices never line up with the top edge or bottom edge of the screen. Mathematically, tearlines can occur unseen offscreen, in any scanline inside the VBI, since a VBI (signal VSYNC) is still mathematically equal to multiple dummy offscreen rows of pixels.
 

[Attachment: blur-busters-gsync-101-filmstrip-vsync-off-144hz-288fps.png]
You mentioned the TB port problem; I forgot about it.

I fully expect the adapter to work. Main question is OSX support.

I've been testing 43" replacements to my Seiki 39" display on a 2012 MacPro RX480 / Geforce 970 and a 2017 MacBook/AMD 460.
SwitchResX should provide access to all video modes defined in the EDID. Any custom resolutions not showing up can be easily added.

MacPro <= 2012 can leverage direct display ports.
MacBooks >= 2016 probably need a usb-c to Active Display Port adapter.

If you need a macOS tester who's experienced at tearing down LCDs, I have a spare 39" Seiki sitting in the corner gathering dust.
 
The LTT video wasn't that informative. Here is a better one, though it is older, so it doesn't have the new backlight driver shown.



If you want technical stuff, go to blurbusters.
 
Ah, for those who haven't seen it yet, here is the LTT video.

http://bit.ly/2vZx0fT

The link puts my comment with corrections to the video at the top, I'd appreciate upvotes so it is more visible. Thanks :)
 
No, the tcon does not fit.
I haven't tested it yet (and don't plan to, unless someone buys me a panel), but the kit should fit on the V500DK2 panels, which use the same connections as the smaller models. http://www.shopjimmy.com/vizio-4s-lx477-pr3-t-con-board.htm

I'd love to have this with a 50 inch panel. Let me know if I'm following along. The link goes to a V500DK2-CKS2. I'm trying to follow linked board sets/panels and the retail TVs they're listed to repair to find TVs that'd have a chance to be a donor. Searching for panels, I see the V500DK2-LS1 is LVDS (4 ch, 10-bit), 102 pins, so I think I've got the right spec sheet.

The repair websites list TVs the boards are for. Can I just use those to back into the right ones? Looks like the Vizio M50-C1 from your link, and also the Vizio P502UI-BE1 - there also seems to be some Hisense and Toshiba TVs. The Vizios show backlight arrays, one with 32 and the other with 64 zones. Is that a factor for compatibility or is this mod agnostic to backlight zones?

Am I following this right?

I've got a use for a 50" TV in any case, and I see a cheap Vizio on craigslist, so I'm tempted to grab it and try. (I also don't see any 39" Seikis in my area.) While I'm not a tech, I've re-capped over a dozen LCD screens, so I should be able to pop it apart just fine, and I can make a bracket/mount from some aluminum plate if it works - I'll just wall-mount it behind the desk. Worst case, I'll use the TV as a TV and hunt down a 39" Seiki (or another 50" if we figure out the right one).
 
I'd love to have this with a 50 inch panel. Let me know if I'm following along. The link goes to a V500DK2-CKS2. I'm trying to follow linked board sets/panels and the retail TVs they're listed to repair to find TVs that'd have a chance to be a donor. Searching for panels, I see the V500DK2-LS1 is LVDS (4 ch, 10-bit), 102 pins, so I think I've got the right spec sheet.

The repair websites list TVs the boards are for. Can I just use those to back into the right ones? Looks like the Vizio M50-C1 from your link, and also the Vizio P502UI-BE1 - there also seems to be some Hisense and Toshiba TVs. The Vizios show backlight arrays, one with 32 and the other with 64 zones. Is that a factor for compatibility or is this mod agnostic to backlight zones?

Am I following this right?

I've got a use for a 50" TV in any case, and I see a cheap Vizio on craigslist, so I'm tempted to grab it and try. (I also don't see any 39" Seikis in my area.) While I'm not a tech, I've re-capped over a dozen LCD screens, so I should be able to pop it apart just fine, and I can make a bracket/mount from some aluminum plate if it works - I'll just wall-mount it behind the desk. Worst case, I'll use the TV as a TV and hunt down a 39" Seiki (or another 50" if we figure out the right one).

If I don't get a panel to test (not planning on it), there won't be a voltage+timing profile for the panel. Without this, image quality will be really bad. Backlight drivers are a secondary concern.
 