Do any good gaming monitors exist? Everything is crap!

Did you seriously just say that?
Plasma TVs have some of the best motion quality out of any display, damn near rivaling CRTs. The whole reason I bought a Plasma TV was to get away from the motion blur of LCDs. Gaming on this thing is glorious.

Personally I just ignored his whole post and had a good chuckle, since he is very clearly biased towards plasmas and most of what he posted is either inaccurate or full of hyperbole. He attempts to present PDP technology in a way that makes it seem inferior when in fact it's superior.
 
Yeah, I'm going to have to disagree with him as well. People who usually dismiss plasmas have probably never seen a Kuro live.
I've owned, calibrated, and tested three generations of Kuro. (Pioneer's 8G, 9G, and "9.5G" plasmas) I'm very aware of how they perform.
The best thing that they do is black level - and while it may have been the best on a flat panel prior to local dimming LCDs and OLEDs, there's still a noticeable glow from the panel in a dark room. In a bright room, their front filter does a poor job rejecting ambient light and the whole panel looks gray.

Motion resolution is average at best, they're very bad for "phosphor lag", and they're the worst plasmas around for dithering - though that does at least eliminate dynamic false contouring. They're relatively high latency, they only display 4:2:0 chroma resolution outside of the PC mode, and they're very susceptible to burn-in/image retention - I got permanent burn-in on my 8G Kuro after the first time I played a game on it!

I get why people like them - especially if they managed to pick them up for a good price when stores were offloading them after Pioneer left the TV business, but I don't think they deserve the "mythical" status that people on AV websites have given them.

It says enough that people on numerous home theater forums like avsforum/highdefjunkies/flatpanelhd/etc. still haven't upgraded their plasmas to OLED because the differences are marginal at best.
The differences are pretty big. The problem is that OLED is better at black level and contrast - by quite some margin - but worse in almost every other area right now. I expect that will change.
A lot of that is due to the fact that it's LG producing these displays, and they don't know how to do quality image processing like Panasonic, Sony, and Samsung do.
They're also quite expensive right now, relative to what the last generation of Panasonic Plasmas cost for example.

That's one thing Plasma certainly had in its favor: even lower-end models still had good image quality relative to the high-end ones.
With LCD only the high-end models have good image quality. Most of the cheap ones are trash.

Did you seriously just say that?
Plasma TVs have some of the best motion quality out of any display, damn near rivaling CRTs. The whole reason I bought a Plasma TV was to get away from the motion blur of LCDs. Gaming on this thing is glorious.
Yes, I seriously just did. Just look at these PWM-like artifacts on a VT60 displaying a static image:
Here's a video of a GT60 (same range) showing what this is like with moving video and a stationary camera.
It's also visible in this ZT60 review. The video is just something that showed up in the YouTube sidebar for me, but it shows that it occurs across the whole '60 range, and since this is a review they were obviously not intentionally trying to capture that problem.

This affects all plasmas, not just Panasonic's '60 range.
Here's a KRP600M vs a TX-P60ZT60
And a slow motion video comparing Pioneer's CLEAR driving vs Panasonic's older panel driving.

It reminds me of how bad things used to be when PWM dimming was being used with CCFL-backlit LCDs.
High-contrast images are completely unwatchable on plasma for me due to this.

Personally I just ignored his whole post and had a good chuckle, since he is very clearly biased towards plasmas and most of what he posted is either inaccurate or full of hyperbole. He attempts to present PDP technology in a way that makes it seem inferior when in fact it's superior.
I won't deny that plasma does some things well, but to say that it doesn't have a ton of image quality or motion problems is wrong.
I can watch a CRT all day - even at low refresh rates - and I was able to tolerate high-speed single-chip DLPs, though I did see the "rainbows". But I've never been able to watch a plasma for any length of time without getting a migraine - and it's not for lack of trying. Trying to watch a high-contrast black-and-white film like Sin City on a plasma is torture for me.

I don't have any problem with people saying that they "don't notice" these issues (frankly, I don't understand how that is, because it stands out so much to me), or that they notice them but are not bothered by them, but to say that plasmas have the best motion quality of any display is flat out wrong.
 
Show me an LCD TV that produces better motion clarity without strobing - which wrecks the picture quality and adds headache-inducing flicker - or frame interpolation that matches what a quality Plasma TV was able to achieve at native 60Hz, because in all seriousness I'm about to purchase a $2000 4K TV/monitor. I watched 24p movies and hockey and tennis on the JU7100+ series of Samsung TVs and Sony's X810-850C line of TVs. While the 120Hz motion flow (whatever name each manufacturer wants to assign to it) definitely enhances the sample-and-hold motion clarity of both TVs, it's still not as good as my 2013 Panasonic Plasma - among other things like contrast, shadow detail, uniformity, etc. Plasma is not a perfect tech and it certainly has some faults, hence why I am buying a UHD static-image display for my PC.

Also, I never had any IR on my 8G Kuro and I gamed on it. I had the TV for 5 years with 15,000 hours on it. With that being said, I see a lot of people reporting IR problems with Panasonic's 2012-2013 range of TVs. It seems to be the luck of the draw. Good thing I bought my plasma used, with only 3,000 hours on it and in perfect condition. Plasma TVs were only as good as the manufacturer behind them and the electronics used to drive the display.

I can't argue if you personally see and are bothered by the dithering effects and the rainbow colors. Most people never see them. I've got to take off for a little while, but I will address some of the other statements you made which I think are not representative of plasma displays when they were modern.
 
Show me an LCD TV that produces better motion clarity without strobing - which wrecks the picture quality and adds headache-inducing flicker - or frame interpolation that matches what a quality Plasma TV was able to achieve at native 60Hz.
Well you're asking for something that isn't possible.
The reason that plasma has 4-8ms persistence (depending on the model) is that the image flickers.
Any flicker-free display will have a full 16.67ms of persistence-based motion blur at 60Hz.
A flicker-free display would have to be 120Hz to reduce persistence to 8ms or 240Hz to reduce it to 4ms.
That's not at all specific to LCD - it's true for any type of display.
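
To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own arithmetic, not anything from a spec) of how persistence on a flicker-free sample-and-hold display is just the frame duration:

```python
# Rough sketch of the arithmetic above (my own illustration, not a spec):
# on a flicker-free sample-and-hold display, persistence equals the frame duration.
def sample_and_hold_persistence_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz} Hz -> {sample_and_hold_persistence_ms(hz):.2f} ms persistence")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms
```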

Strobing should not affect image quality other than reducing the brightness and improving perceived panel response times (since it cuts off the beginning of the transition between frames).

Sony's televisions with the "Clearness" option set to 5 (60Hz) should have good motion clarity. I do wish that they had more flexible options though, since there is no option to select the duty cycle. (I think they target 4ms)
Most people also seem to think that NVIDIA's ULMB and BenQ's Blur Reduction work well too - though I don't have any experience with them.

While the 120Hz motion flow (whatever name each manufacturer wants to assign to it) definitely enhances the sample-and-hold motion clarity of both TVs, it's still not as good as my 2013 Panasonic Plasma - among other things like contrast, shadow detail, uniformity, etc. Plasma is not a perfect tech and it certainly has some faults, hence why I am buying a UHD static-image display for my PC.
Well, interpolation is a flicker-free technology, so 120Hz interpolation will be 8ms persistence at best - which is still dependent on the source framerate - and that's without mentioning all the other problems that interpolation has.
Strobing the backlight is the only good solution for motion clarity, though you will want to use a mode that combines interpolation plus strobing if you're using low framerate sources to avoid judder. (24/30 FPS video)
 
Show me an LCD TV that produces better motion clarity without strobing - which wrecks the picture quality and adds headache-inducing flicker - or frame interpolation that matches what a quality [...]

You really should be careful what you ask for... because you never know when thou shalt receive...

http://vpixx.com/products/tools-for-vision-sciences/visual-stimulus-displays/viewpixx/

Most people also seem to think that NVIDIA's ULMB, and BenQ's Blur Reduction works well too - though I don't have any experience with them.

BenQ blur reduction on the XL2720Z works VERY VERY well (although this is due more to firmware bugs BenQ admitted are bugs than to design), and it's why I'm REFUSING to upgrade to any 1440p IPS/G-Sync or FreeSync panel.
Can someone buy me a viewpixx for an upgrade, please :(
 
BenQ blur reduction on the XL2720Z works VERY VERY well (although this is due more to firmware bugs BenQ admitted are bugs than to design), and it's why I'm REFUSING to upgrade to any 1440p IPS/G-Sync or FreeSync panel.
Can someone buy me a viewpixx for an upgrade, please :(
What I'm waiting for is either BenQ to release a monitor using an IPS panel with G-Sync and Blur Reduction, or NVIDIA to remove the restriction on ULMB and allow it to operate on any refresh rate instead of being limited to 85/100/120Hz.

Whichever happens first is the monitor that I'm buying next.
 
What I'm waiting for is either BenQ to release a monitor using an IPS panel with G-Sync and Blur Reduction, or NVIDIA to remove the restriction on ULMB and allow it to operate on any refresh rate instead of being limited to 85/100/120Hz.

Whichever happens first is the monitor that I'm buying next.

Be careful what you ask for...

http://www.eizoglobal.com/products/foris/fs2735/index.html

But you'll need an AMD card for it...
And the manual makes no mention of blur reduction not being available in freesync mode...(unlike the Benq XL2730Z which specifically states that MBR gets disabled in freesync).
 
http://www.eizoglobal.com/products/foris/fs2735/index.html
But you'll need an AMD card for it...
And the manual makes no mention of blur reduction not being available in freesync mode...(unlike the Benq XL2730Z which specifically states that MBR gets disabled in freesync).
It's not really feasible to combine strobing with variable refresh rates right now.
You would need very tight control over not only the strobe duration, but also the strobe intensity. You would also need significantly brighter backlights because your maximum brightness is limited by the lowest (dimmest) refresh rate.
With the current way that backlight strobing is implemented, where you have a fixed strobe intensity, the display would get dimmer/brighter as the framerate got lower/higher.
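
A quick illustrative sketch of why that happens (this assumes the simple fixed-strobe model described above, not any particular monitor's firmware):

```python
# Illustrative sketch (assumed simple model, not a measurement): with a fixed
# strobe length and fixed strobe intensity, perceived brightness is just the
# strobe's duty cycle, which changes as the VRR refresh rate changes.
def perceived_brightness_pct(refresh_hz, strobe_ms):
    frame_ms = 1000.0 / refresh_hz
    return 100.0 * min(strobe_ms / frame_ms, 1.0)

for hz in (40, 60, 100, 144):
    print(f"{hz:3d} Hz: {perceived_brightness_pct(hz, 2.0):5.1f}% of full-persistence brightness")
# With a 2 ms strobe the image is ~8% brightness at 40 Hz but ~29% at 144 Hz,
# so the picture visibly dims and brightens as the framerate swings.
```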

It's one or the other - but I still want the display to have VRR support for games where I can't lock the framerate. Some newer games won't even run at a fixed 60 FPS on a high-end machine.
And doesn't Eizo strobe twice per frame? That results in unacceptable judder.

Switching to AMD is not an option for me, for a number of reasons.
Two quick things off the top of my head: they don't support hardware-accelerated UHD video decoding like my GTX 960 does. (the 960 is currently the fastest GPU with full hardware UHD decoding)
And they will have to add DX11 multi-threading support to their drivers before I'd consider switching. Since it's been six years already, and DX12 is here now, I suspect that they might never support it, and DX11 games are always going to have performance issues on AMD cards.
 
You really should be careful what you ask for... because you never know when thou shalt receive...

http://vpixx.com/products/tools-for-vision-sciences/visual-stimulus-displays/viewpixx/

That is not what I wished for. You are taking this way out of context.
That is a specialized monitor used for color research (not for consumer use), which I already knew about, and it's a far cry from replacing a 40" UHD display as it's only 22.5" in size with a low resolution of 1920x1200 while costing a whopping $10,000 USD.

Aside from all that, it's a pretty nice display with flicker-free, low-persistence sample and hold.
 
...flicker free, low persistence sample and hold.
You don't seem to get it.
You can't have low persistence on a flicker-free display unless you have extremely high refresh rates, and source framerates to match.

A CRT has about 1-2ms persistence at 60Hz due to the scanning nature of how those displays work, and the flicker that produces.
Plasmas have 6-8ms persistence due to the amount of flicker they have. (plasmas are not flicker-free displays)
A completely flicker-free display of any kind, not just LCD, will have 16.67ms persistence.

The only way to reduce persistence at 60Hz is to introduce flicker, or interpolate to a higher framerate.
Interpolation reduces motion blur without flicker, but introduces a lot of other problems.

For a flicker-free display to match the 1-2ms persistence of a CRT, it needs to be running at 500-1000Hz, with 500-1000 FPS source content.
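
Here's the same arithmetic the other way around (a quick sketch of my own, just inverting the persistence relationship used above):

```python
# Quick arithmetic check (my own sketch): the refresh rate (and matching source
# framerate) a completely flicker-free display needs for a given persistence.
def required_refresh_hz(target_persistence_ms):
    return 1000.0 / target_persistence_ms

for ms in (16.67, 8.0, 4.0, 2.0, 1.0):
    print(f"{ms:5.2f} ms persistence needs ~{required_refresh_hz(ms):.0f} Hz flicker-free")
# 1-2 ms (CRT-like) works out to roughly 500-1000 Hz, as stated above.
```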
 
You don't seem to get it.
You can't have low persistence on a flicker-free display unless you have extremely high refresh rates, and source framerates to match.

I already know this, but the VPixx lists in its specifications that it is able to sample and hold at 7ms; here is an excerpt.

7 ms typical in normal backlight mode

This is without the scanning backlight on the VPixx; I was not referring to the 1ms mode.

As for plasma, the so-called flicker isn't seen, since you aren't using the display two feet in front of your eyes.
 
Strobing should not do anything to affect image quality other than reducing the brightness, and improving perceived panel response times. (since it cuts off the beginning of the transition between frames)

What, are you kidding me? "It should not do anything," you say - other than alter the original picture.

It alters the original image by severely reducing brightness and also moderately reducing contrast. That, in essence, is degrading the image at the cost of motion clarity, my friend. I mean, this is not debatable; I have watched movies on both the Sony with Clearness 4 & 5 and the Samsung with Auto Motion Plus. Both end up giving you a dull picture unless you only use these features slightly. Sony at Clearness 2 is OK, but then you lose that sharp motion clarity. In the end it's not ideal for optimal cinematic picture quality.



Normal: [image: x850c-motion-blur-small.jpg]
Strobed: [image: x850c-clearness-5-small.jpg]


So you're kosher with this and stand firm that the above in no way represents a reduction in overall picture quality?
 
I already know this, but the VPixx lists in its specifications that it is able to sample and hold at 7ms; here is an excerpt.
That is the pixel response time, not the perceived motion blur due to image persistence.
Since it's a 120Hz monitor, that would be 8.33ms

As for plasma, the so-called flicker isn't seen, since you aren't using the display two feet in front of your eyes.
So apparently you don't notice the flicker on a Plasma TV. They most certainly do flicker, and that's how the persistence is <16.67ms at 60Hz.

What, are you kidding me? "It should not do anything," you say - other than alter the original picture.
It alters the original image by severely reducing brightness and also moderately reducing contrast. That in essence, is degrading the image at the cost of motion clarity my friend. I mean this is not debatable[...]
Well obviously it reduces the brightness since you are only switching the backlight on for a fraction of the frame duration. Let's not forget that displays like Plasmas and CRTs are a lot dimmer than a sample & hold LCD to begin with.
There should be no loss of contrast - all you're doing is switching the backlight on or off. If there is a measured loss of contrast, it's an erroneous measurement - likely due to meter sync issues.

I have watched movies on both the Sony with clearness 4&5 and the Samsung with Auto Motion Plus. Both end up giving you a dull picture, unless you only use these features slightly. Sony @ clearness 2 is OK but then you lose that sharp motion clarity. In the end it's not ideal for optimal cinematic picture quality.
Well it depends on the viewing environment and how bright the display was to begin with. Fortunately we're moving from LCDs with 300-500cd/m² backlights to HDR LCDs with 700-1200cd/m² backlights, which can better afford to lose some brightness.
If we're talking about "cinematic picture quality" then we should be targeting the reference brightness of 100cd/m². That means you should be able to reduce persistence to around 3-4ms on an LCD with a 500cd/m² backlight.
Even with a 300cd/m² backlight it should be possible to reduce persistence to 5-6ms when targeting 100cd/m², and that puts it on par with the better Plasma TVs.
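
To make that math explicit, here's a rough back-of-the-envelope sketch (my own, assuming brightness scales linearly with the fraction of the frame the backlight is lit, as in the figures above):

```python
# Back-of-the-envelope sketch (assumes brightness scales linearly with the
# fraction of the frame the backlight is lit, as in the paragraph above):
# the shortest persistence that still reaches a 100 cd/m2 target at 60 Hz.
def min_persistence_ms_for_target(full_brightness_cdm2, target_cdm2=100.0, refresh_hz=60.0):
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (target_cdm2 / full_brightness_cdm2)

for backlight in (300, 500, 800, 1200):
    print(f"{backlight:4d} cd/m2 backlight -> ~{min_persistence_ms_for_target(backlight):.1f} ms at 100 cd/m2")
# 500 cd/m2 -> ~3.3 ms, 300 cd/m2 -> ~5.6 ms, matching the 3-4 ms / 5-6 ms figures above.
```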

Sony's Clearness options are, sadly, very limited. Only the maximum setting strobes once per frame.
They should be strobing once per frame at all settings, with higher/lower levels adjusting the persistence (balance between brightness/clarity) instead. Settings 1-4 are useless since they are strobing multiple times per frame.

Normal http://i.rtings.com/images/reviews/x850c/x850c-motion-blur-small.jpg
Strobed http://i.rtings.com/images/reviews/x850c/x850c-clearness-5-small.jpg
So you're kosher with this and stand firm that the above in no way represents a reduction in overall picture quality?
You're comparing an over-exposed photograph to an under-exposed photograph.
The only thing those images are representative of is motion blur, nothing else.
 
That is the pixel response time, not the perceived motion blur due to image persistence.
Since it's a 120Hz monitor, that would be 8.33ms

Regardless, I was not referring to the scanning backlight mode. But continue to nit pick.

So apparently you don't notice the flicker on a Plasma TV. They most certainly do flicker, and that's how the persistence is <16.67ms at 60Hz.

Sorry, I don't - only you do, and then moan and complain about it - seeing as millions of video enthusiasts have no issues with it, nor do they notice the flicker from a proper viewing distance. I'm sorry you aren't satisfied with how plasma displays an image; the loss is yours.

Well obviously it reduces the brightness since you are only switching the backlight on for a fraction of the frame duration. Let's not forget that displays like Plasmas and CRTs are a lot dimmer than a sample & hold LCD to begin with.
There should be no loss of contrast - all you're doing is switching the backlight on or off. If there is a measured loss of contrast, it's an erroneous measurement - likely due to meter sync issues.

Are you honestly trying to pull that card? I mean, seriously, you're grasping at straws here. Plasmas are far brighter and look significantly better than LCDs do when they strobe. I would know - I have owned two plasmas and still currently have one that is considered to be one of the best. Your argument is that since a plasma's max brightness is lower than an LCD's, it's close to par with them after they strobe; sorry, that is baloney. Anyone can go out and test any of the 4K TVs in the stores and see for themselves how unflattering the max strobing is, rather than listen to hypotheticals.

You're comparing an over-exposed photograph to an under-exposed photograph.
The only thing those images are representative of is motion blur, nothing else.

I have to ask: did you read over the part where I explained that I actually observed, with my own eyes, both the Samsung and the Sony in their strobe modes? The reason I posted those two pics was to try and illustrate the severe reduction in brightness that strobing causes. Online reviews that have actually taken measurements confirm a minor loss of contrast as well, but nothing nearly as severe as the brightness loss. I never said the contrast is severely reduced; go read over carefully what I actually stated.

Now before I lose track, what was your main argument again?
"Plasma TVs are not high quality displays"

Ah yes, quite the controversial statement. While plasmas were in production, LCDs paled in comparison (they still do), and this was an opinion shared by a consensus of hardcore AV enthusiasts, reviewers, display experts, HTPC enthusiasts, color scientists, pro calibrators, and organizations related to those fields. Even the people here on this forum who have owned, used, and seen a properly calibrated plasma are not in the slightest swayed by your arguments, as they fall on deaf ears. So clearly something is not right on your end. You certainly have a right to your opinion based on how your brain perceives visuals, but bear in mind you are a lone warrior in your cause.

One more thing I would like to add: I find it odd that you never once bothered to refute any of the claims made by MonarchX. Why is that, if I may ask?
 
At a glance that was a pretty good representation of how muted strobe mode / ulmb mode is imo. ULMB mode dulls/mutes the screen way too much, even in low light conditions and at pulse width of 50 or 70. Even at higher pulse widths I found that the screen (pg278Q in my case) was dulled far too much for my taste. Running 100fps-hz or more as the common playing frame rate and utilizing g-sync to ride the roller coaster graph of frame rate swings results in much smoother, more defined motion gaming with 40%/50%/60% blur reduction, no tearing/judder/stops or tradeoffs of v-sync, and a screen with way more pop vs dull/muted non-gsync ULMB mode. Until someone makes (if ever) a much brighter monitor (+50% brightness?) to compensate for this screen muting I personally am not interested in backlight strobing compared to running 100/120/144 fps-hz as my common rate with the great benefits of g-sync.

Plasma runs at 60fps-hz max input in the back, so it loses out on motion definition and path articulation (of moving the entire game world around in relation to your viewpoint in 1st/3rd person games, in addition to individual objects moving in the scenes). However plasma is said to reduce motion blur by about 60% compared to a 60hz LCD using its blinking tech. Its black depth and detail in blacks at depth is greatly superior to LCDs, especially non-scanning-backlight, non-zone-lit LCDs. A lot of plasmas have bad input lag, and their size makes them unsuitable for desktop setups at a desk, in addition to being limited to 1920x1080 resolution. Obviously, in addition to their inferior motion definition in gaming, their size results in the scene extents and HUDs/interfaces/notifications/pointers/chat/maps being way out of straight-ahead primary focal range and very poor ppi at desk distances. They also lack the huge benefit of variable hz g-sync.

:rolleyes:

100fps-hz/120fps-hz/144fps-hz:
~40/50/60% blur reduction
5:3/2:1/2.4:1 increase in motion definition and path articulation
g-sync rides the fps graph +/- without screen aberrations

[image: 60hz-120hz-30-15_onesecondframe.jpg]


Regardless of the monitor's hz, lower frame rates will be blurrier.

If you are using variable hz at 1440p to run low (sub-75fps-hz to 90fps-hz most of the time in game - it really should be at least 100 imo), you are essentially running a low-hz, low-motion-definition, low-articulation, smearing-blur monitor and missing out on most of the gaming advancements these monitors provide, outside of the judder/tearing/stops avoidance.
 
A lot of plasmas have bad input lag, and their size makes them unsuitable for desktop setups at a desk, in addition to being limited to 1920x1080 resolution. Obviously, in addition to their inferior motion definition in gaming, their size results in the scene extents and HUDs/interfaces/notifications/pointers/chat/maps being way out of straight-ahead primary focal range and very poor ppi at desk distances. They also lack the huge benefit of variable hz g-sync.

See, you are a far more sensible individual. Just to be clear, I think most of us have always been on the same page that plasmas are not in any way suitable for desktop use. Sadly, as you mentioned and as has been said here many times, a lot of plasma models do indeed have high input lag, but not all.

As for ULMB, or whatever name a company assigns to light strobing, I almost have to pinch myself and make sure I'm not dreaming when I am told that its end result only has a minor impact on overall picture and brightness. Maybe the manufacturers will further improve upon this tech for the new models - at least I hope they do - because as of right now I find LCDs blur far too much for my liking, but it's a flaw I will have to continue to live with and make peace with for the time being.
 
Regardless, I was not referring to the scanning backlight mode. But continue to nit pick.
Pixel response time is very different from image persistence.
An OLED has <1ms response times but will still have the same 16.67ms persistence as an LCD at 60Hz unless it's using black frame insertion.
That is hardly "nit picking". It's the entire reason that motion clarity sucks on OLEDs right now, despite their fast response times.
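
A tiny sketch of the distinction (my own illustration; the 960 px/s pan speed is just an arbitrary example): the blur trail you perceive is set by persistence and motion speed, not by pixel response time.

```python
# Illustrative sketch (my own): the blur trail caused by persistence alone,
# which is independent of the panel's pixel response time.
def persistence_blur_px(motion_speed_px_per_s, persistence_ms):
    return motion_speed_px_per_s * (persistence_ms / 1000.0)

# A sub-1ms pixel response doesn't change this: a 60 Hz sample-and-hold OLED
# still smears a 960 px/s pan across ~16 px, while 1 ms persistence keeps it to ~1 px.
print(persistence_blur_px(960, 16.67))  # ~16 px
print(persistence_blur_px(960, 1.0))    # ~1 px
```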

Are you honestly trying to pull that card? I mean, seriously, you're grasping at straws here. Plasmas are far brighter and look significantly better than LCDs do when they strobe
Put up a full white image on most Plasma TVs and you'll be well below 100cd/m².
Typical CRT monitors with quick phosphors would max out around 100cd/m² as well. (as opposed to office displays with slow phosphors and 150-200cd/m² brightness)

My Sony LCD is 400cd/m² sample & hold, or 200-100cd/m² depending on which strobe mode is enabled. So it ranges from much brighter (but worse motion) to about equal brightness (if we're generous) and better motion clarity. (4ms)

The reason I posted those two pics was to try and illustrate the severe reduction in brightness that strobing causes.
There's no need for pictures, it's basic math. As I said before, those images are not representative of what happens, since one of them is over-exposed.
If you're running at 60Hz (16ms) and you reduce persistence to 8ms then your brightness will drop 50%. So if your display is 400cd/m² then it will drop to 200cd/m². If it's 350cd/m² it will drop to 175cd/m² etc.
If you drop it to 4ms persistence then brightness will drop to 25% rather than 50%.

If you instead have a panel which operates at 120Hz (8ms) and reduce persistence to 4ms then brightness only drops to 50%. So a 400cd/m² display would have 4ms persistence and 200cd/m² brightness.
And if you have a 120Hz HDR display with, say, an 800cd/m² backlight, you could drop that to 2ms persistence (CRT-like) while still having a very bright 200cd/m² image.
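
Here's the same math as a quick sketch (my own, using the simple "brightness scales with backlight duty cycle" assumption behind these numbers):

```python
# Simple sketch of the brightness math in this post (assumption: brightness
# scales with the fraction of the frame the backlight is lit).
def strobed_brightness_cdm2(full_cdm2, refresh_hz, persistence_ms):
    frame_ms = 1000.0 / refresh_hz
    return full_cdm2 * (persistence_ms / frame_ms)

print(strobed_brightness_cdm2(400, 60, 8))   # ~192 cd/m2 (about half) at 60 Hz, 8 ms
print(strobed_brightness_cdm2(400, 60, 4))   # ~96 cd/m2 (about a quarter) at 60 Hz, 4 ms
print(strobed_brightness_cdm2(400, 120, 4))  # ~192 cd/m2 at 120 Hz, 4 ms
print(strobed_brightness_cdm2(800, 120, 2))  # ~192 cd/m2 at 120 Hz HDR, 2 ms
```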

The main thing hurting backlight strobing on LCDs right now is that the options are so limited on many models.
Using numbers from the PG279Q, ULMB seems to start at 1.875ms persistence, and only goes lower from that. That's why it starts at 100cd/m² and drops down to 10cd/m² (0.250ms).
There really needs to be options which let you select values like 4ms which would give you >2x the brightness at a cost of some motion clarity. But 4ms is still on-par with, if not better than, the best plasmas.

Online reviews that have actually taken measurements confirm a minor loss of contrast as well, but nothing nearly as severe as the brightness loss. I never said the contrast is severely reduced; go read over carefully what I actually stated.
And as someone with a lot of money invested in calibration equipment, I can tell you that any loss of contrast is either due to meter sync or the black level dropping below the meter's lower limits of accuracy - or some combination of the two.
Switching the backlight on/off is not going to affect contrast at all, it just reduces perceived brightness - and that affects black level and white level equally.
Contrast is affected by the LCD panel, not the backlight, and strobing doesn't change that.

Now before I lose track, what was your main argument again?
"Plasma TVs are not high quality displays"

Ah yes, quite the controversial statement. While plasmas were in production, LCDs paled in comparison (they still do), and this was an opinion shared by a consensus of hardcore AV enthusiasts, reviewers, display experts, HTPC enthusiasts, color scientists, pro calibrators, and organizations related to those fields. Even the people here on this forum who have owned, used, and seen a properly calibrated plasma are not in the slightest swayed by your arguments, as they fall on deaf ears. So clearly something is not right on your end. You certainly have a right to your opinion based on how your brain perceives visuals, but bear in mind you are a lone warrior in your cause.
And I maintain that Plasmas, being displays that are 1-bit+dither, do not have good image quality.
That's not to say that they cannot put out a decent picture when you're sitting 15ft away, but if you're actually looking at image quality, like you would be when using them as a PC display, they're no good.
And I think that the way they strobe many times per frame, in addition to the color separation problems they have, means that despite having high clarity, motion on Plasma TVs is poor compared to any display which does a single strobe per refresh, like a CRT or an LCD.
I think having multiple strobes per refresh is unacceptable on other displays too, like some (bad) LCD strobe modes, or LCDs which use PWM at frequencies like 250Hz or 500Hz.

One more thing I would like to add: I find it odd that you never once bothered to refute any of the claims made by MonarchX. Why is that, if I may ask?
I thought I made some pretty clear posts on my position without the need for picking apart a post like that bit-by-bit - which is what this seems to have devolved into now, and I don't know that I care to continue the conversation much further.
 
I already know this, but the VPixx lists in its specifications that it is able to sample and hold at 7ms; here is an excerpt.

It sounds like you're confusing concepts here. That's the pixel response time (which, btw, is measured here in gray-to-gray, a pretty shitty and ambiguous metric). It has nothing to do with persistence. You could have a 1 ms pixel response time, measured as the rise time, and still have 15.7 ms worth of possible persistence if running at 60 Hz. Pointing this out is hardly nitpicking.
 
I thought I made some pretty clear posts on my position without the need for picking apart a post like that bit-by-bit - which is what this seems to have devolved into now, and I don't know that I care to continue the conversation much further.

Zone, you are a riot, seeing as you were the first one to start picking apart posts, so that is rather hypocritical of you. Of course you evade my question with double talk.
Aside from that, we finally agree on one thing: this conversation serves no further purpose.
 
At a glance that was a pretty good representation of how muted strobe mode / ulmb mode is imo. ULMB mode dulls/mutes the screen way too much, even in low light conditions and at pulse width of 50 or 70. Even at higher pulse widths I found that the screen (pg278Q in my case) was dulled far too much for my taste. Running 100fps-hz or more as the common playing frame rate and utilizing g-sync to ride the roller coaster graph of frame rate swings results in much smoother, more defined motion gaming with 40%/50%/60% blur reduction, no tearing/judder/stops or tradeoffs of v-sync, and a screen with way more pop vs dull/muted non-gsync ULMB mode. Until someone makes (if ever) a much brighter monitor (+50% brightness?) to compensate for this screen muting I personally am not interested in backlight strobing compared to running 100/120/144 fps-hz as my common rate with the great benefits of g-sync..

My god!!
PEOPLE already did!!
But every time I say this I get ignored and this same topic comes up!
The BENQ XL2720Z already does this!

So does the XL2420Z, XL2411Z and XL2430T!
(even the XL2730Z does this but the XL2730Z is not capable of strobing under 120hz due to firmware design bugs and issues that can not be fixed with an update)

BenQ originally adapted Lightboost (part of 3D Vision 2) circuitry for their own purposes.
Lightboost, a custom hardware specification by Nvidia, required that the backlight current be increased by 1.8x to compensate for the strobe signal.

This is *EXACTLY* why it was named LIGHTBOOST in the first place!

Setting a lightboost monitor into lightboost mode but into the "lightboost OFF" setting enabled strobing at the Lightboost=100% Persistence values (strobe widths) but did NOT increase current to the backlight at all. So Lightboost OFF with 100% strobe widths was DIMMER than lightboost 10% (the lowest persistence setting with 1.4ms pulse widths), but as blurry as the very bright lightboost 100% setting.

BenQ blur reduction used the same circuitry on their monitors that Lightboost used - this is why BenQ blur reduction also increases the backlight current by 1.8x in strobe mode, making it quite bright at 1.4-2.1ms persistence and making 1.0ms persistence quite usable (equal to about 20 monitor brightness with strobing off on many monitors - a typical nighttime viewing level of about 80 cd/m2).
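
Rough sketch of what that boost does to the numbers (the 1.8x factor is the one quoted in this post; the 300 cd/m2 base panel is just a hypothetical example, not a measurement):

```python
# Very rough sketch based on the figures in this post (the 1.8x backlight boost
# is the value quoted above; the 300 cd/m2 base panel is a hypothetical example).
def strobed_brightness_cdm2(full_cdm2, refresh_hz, persistence_ms, boost=1.0):
    frame_ms = 1000.0 / refresh_hz
    return full_cdm2 * boost * (persistence_ms / frame_ms)

print(strobed_brightness_cdm2(300, 120, 1.0, boost=1.8))  # ~65 cd/m2 with the 1.8x boost
print(strobed_brightness_cdm2(300, 120, 1.0, boost=1.0))  # ~36 cd/m2 without it (ULMB-style)
```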

ULMB does NOT increase current to the backlight. It acts like Lightboost=off but with persistence sliders. This was one of the first complaints about the VG248QE gsync upgrade module, besides ULMB having more ghosting and more crosstalk than the old Asus module, but better colors in ULMB mode.

NONE of the ULMB monitors increase current to the backlight like Benq blur reduction did.

So yes, XL2720Z is the strobed monitor you want.

Also Eizo solved the backlight brightness issue by adopting a partial double strobe per refresh....a short and long strobe, which makes the image clearer than a pure double strobe, but definitely adds latency....

The Eizo FS2735 should also have a nice brightness in blur reduction mode.
BenQ blur reduction uses the same strobe signal wires as Lightboost.
 
,..snip..>
ULMB mode dulls/mutes the screen way too much, even in low light conditions and at pulse width of 50 or 70. Even at higher pulse widths I found that the screen (pg278Q in my case) was dulled far too much for my taste. Running 100fps-hz or more as the common playing frame rate and utilizing g-sync to ride the roller coaster graph of frame rate swings results in much smoother, more defined motion gaming with 40%/50%/60% blur reduction, no tearing/judder/stops or tradeoffs of v-sync, and a screen with way more pop vs dull/muted non-gsync ULMB mode. Until someone makes (if ever) a much brighter monitor (+50% brightness?) to compensate for this screen muting I personally am not interested in backlight strobing compared to running 100/120/144 fps-hz as my common rate with the great benefits of g-sync.
<..snip>

My god!!
PEOPLE already did!!

The BENQ XL2720Z already does this!
<...snip...>

ULMB does NOT increase current to the backlight. It acts like Lightboost=off but with persistence sliders.
<..snip...>
NONE of the ULMB monitors increase current to the backlight like Benq blur reduction did.

<..snip..>
Benq blur reduction also increases the backlight voltage current by 1.8x in strobe mode, making it quite bright at 1.4-2.1ms persistence and making 1.0ms persistence quite usable (equal to about 20 monitor brightness with strobing =off on many monitors--typical nighttime viewing cd/m2 of about 80).
<..snip>>

So yes, XL2720Z is the strobed monitor you want.

<..snip>>

Thanks, I appreciate the info. I'd still miss 1440p and g-sync but I guess you can't have it all. I guess I don't pay much attention to 1080p in my searches and review reading anymore to be honest.

Until someone makes (if ever) a much brighter <120hz+ 1440p+ w/ g-sync as avail option> monitor (+50% brightness?) to compensate for this screen muting I personally am not interested in backlight strobing compared to running 100/120/144 fps-hz as my common rate with the great benefits of g-sync.

I don't think I could trade off both 1440p to 1080p and the lack of a g-sync option on some games even for brighter strobing. This especially considering that I run dual 780ti sc's and am happy to run my games at high or high+(custom) settings in order to get 100 - 120 fps-hz and range higher and lower throughout the game due to scene complexity swings utilizing g-sync.
If I had a more modest gpu setup that couldn't handle 1440 as well, 1080 strobed might have more weight to consider in the balance. Also, 1080p might allow powerful gpu setups to maintain a higher minimum frame rate (compared to 1440p) and potentially avoid some of the problems you'd use g-sync to eliminate from games' variable frame rate roller coasters.
Were a 1440p 144hz "extra bright" strobed monitor to come out, it would be much more tempting to me.

I'm going to try to hold out until displayport 1.3 - 1.4 gpus and monitors come out before I upgrade my monitor again. A 21:9 3440x1440 g-sync + backlight strobing 144hz dp 1.3 - 1.4 monitor off of dp 1.3 - 1.4 gpus is something I would upgrade for.

For now , at least for me, 40/50/60% blur reduction at high motion definition and articulation with the great benefits of g-sync at 2560x1440 is the way to go.
 
From the "BenQ Gaming XL2720Z Black 27" LED 1ms" reviews on egg:

No G-sync - not a dealbreaker for me right now, but likely would've been under different circumstances.*

As someone else mentioned in another review, the blur reduction feature just makes everything really dark and it's just not good. I don't really understand what that's about.

Blur Reduction made everything dark and unclear.

Cons: Coming from an IPS panel 1080p just isn't quite enough pixels for a 27" monitor if you like sharp images. I returned this one and got the Asus ROG Swift PG278Q

As said, I also saw some waviness to the display. However I do note that this was only when I was very tired, so I'm not entirely sure if it's a physiological thing or if the display has something going on.

Cons: Don't like the blur reduction since it darkens the screen. It dulls everything as well. It will take some tweaking in the profiles.

Other Thoughts: The blur reduction would be great if it would allow the screen to maintain the brightness. I will work with a profile to see if I can use the blur reduction and enhance the appearance when enabled.

Pros: To Chet:
You can maintain the brightness quite well with the blur reduction. First, try making a custom utility in ToastyX's CRU (custom resolution utility), with the parameters :1920x1080 100hz horizontal total 2080, Vertical total 1500, front porch 48,3, sync width 32,5, add it (pixel clock should be around 317mhz, if it's 330, then you are using 2200 for horizontal total instead of 1500, and will need a toastyX pixel clock patcher to make this value not be ignored by the driver), test and restart (or use the restart zip file on the monitortests forum where CRU also is). Then combine that with the windows blur busters utility to adjust the persistence. Raising the vertical total also increases the brightness of the strobe, and lowers the crosstalk band. With strobe phase set to 000, the band should be almost identical to lightboost mode. You should now find 1.0ms persistence to be perfect in a non lit room and 2.0ms persistence to be bright or even too bright, in a lit room. If you like the results, you can do this for 120hz, but that will give you 374 MHz pixel clock and will require the pixel clock patcher (any pixel clock>329mhz requires a driver patch). Do NOT use horizontal/vertical total of 2200/1500 at 120hz, that gives 396 MHz and will cause corruption, use 2080/1500.
(note: the 2080 timings come from the defaults for the 24" screens. They work fine on the 27" also).
Check the blur busters' benq forum for more information about VT tweaks (masterotaku has a large thread on interesting tweaks).

Cons: Requires a vertical total tweak of 1500 to accelerate the scanout in blur reduction mode to make the "crosstalk zone" smaller; the same size as lightboost monitors that are run in lightboost mode. Lightboost uses accelerated scanout by default in hardware. Benq Blur reduction requires vertical total tweaks to match this (a higher VT accelerates the scanout by giving the panel more time to refresh).. Let's hope Benq changed this in the 2430T's blur reduction 2.0.

Other Thoughts: Great screen but needs tweaking. 27"'s tend to have more overdrive artifacts than the 24" versions.
 
I had an XL2411Z; it was very nice... until one of the ports started dying a couple of weeks later.
 
I was the one who wrote that last review feedback btw -.- And it was an attempt to try to help the original reviewer.

Most people who buy monitors don't read hardware sites like blurbusters. They probably don't even know that site exists.

Tom's hardware reviewed the XL2720Z and said that (with blur reduction off), Brightness=20 was best for normal dim/regular lit room viewing, as you don't want to go above 120 cd/m2 and definitely not above 80 cd/m2 with lights off. Using a strobe persistence of 1.5ms at 100 brightness (blur reduction on) puts you at that level. But most people don't know how persistence works. Or that the higher the refresh rate, the less persistence (Dimmer screen) per strobe duty point (strobe width) you have (since the refreshes are faster), and the lower the refresh rate, the higher persistence per point of strobe duty (width) you have.

When the monitor gets a custom resolution with a vertical total that is run out of specification for the refresh rate or it can't identify the VT as matching what belongs to that refresh rate, it switches to the 60hz persistence values (0.167ms per point of strobe duty) which is exactly the refresh rate persistence of 60hz (16.7 ms) divided by 100. So any refresh rate using a custom Vertical total (VT 1360, VT 1354, VT 1497-1502) will use the 60hz values. So strobe duty 009 in that case is 1.5ms persistence (0.167 x 9), and that's the brightness level most people should be happy with.

(This doesn't apply to 144hz, as VT tweaks do not work past 129 hz. 144hz is 6.9ms persistence; 6.9 divided by 100 is 0.069 ms per point of strobe duty, so you need a strobe duty of 14 at 144hz to get a decently bright screen.)
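
Here's the strobe-duty arithmetic as a quick sketch (the per-point figures are the ones quoted in this post; I haven't verified them against BenQ documentation):

```python
# Sketch of the strobe-duty arithmetic described above (figures are the ones
# quoted in this post; I haven't verified them against BenQ documentation).
def benq_persistence_ms(strobe_duty, refresh_hz, uses_60hz_table=False):
    # Each point of strobe duty is 1/100th of the refresh period; with an
    # unrecognized custom vertical total the firmware reportedly falls back
    # to the 60 Hz table (0.167 ms per point).
    base_hz = 60.0 if uses_60hz_table else refresh_hz
    return (1000.0 / base_hz) / 100.0 * strobe_duty

print(benq_persistence_ms(9, 100, uses_60hz_table=True))  # ~1.5 ms at 100 Hz with a VT tweak
print(benq_persistence_ms(14, 144))                        # ~0.97 ms at 144 Hz (no VT tweak)
```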

Most of the work I've done and "hacks" (cough bug exploits) I've found for the XL2720Z I put over here.

http://forums.blurbusters.com/viewtopic.php?f=2&t=2467
 
My swift is definitely "muted" when the default ulmb mode is on, even at longer pulse widths. I've messed around with it and it is definitely neat on easier to render games with really high frame rates (like darksiders). It's just not worth the difference/dullness to me. Especially on harder to render games, g-sync is well worth the trade-off even at a common frame rate of 100 to 120 +/- using g-sync to roller coaster higher and lower (75 to 138+ generally) throughout to get 40/50/60% blur reduction and 5:3/2:1/2.4:1 motion def and object path+mouse-look articulation.

Ulmb is drab even in dim room settings. I really did try to like it long term but switching back to g-sync mode brought all the color brilliance and overall brightness (aka pq "pop") back to life even at modest (non strobing) brightness settings. Perhaps the brightness "hack" Falkentyne posted on the Benq would make it worth it on those monitors though. If I could try the same type of increased brightness during strobe mode on my swift safely I'd definitely try it. I'd much prefer that they shipped monitors with good brightness capability during strobe mode by default out of the box rather than service menu/hacks . Thanks for posting that reply and link regardless.
 
The XL2720Z numbers are wrong.

I think TFT's 2720Z charts were taken before the firmware came out which allowed strobe timing adjustments, even though he did get the firmware later. Looks like the numbers are from V1 firmware. If I recall correctly, V1 firmware defaulted to strobe duty 020, which you could not change at all on V1. That's equal to 1.38ms persistence (0.069 x 20) at 144hz (remember 144hz persistence is 6.9ms). If that were the case, then the TFTCentral values make sense. I know about the "default" strobe duty because whenever you flash the firmware or do a "factory reset" from the factory menu, the strobe duty resets to 020 and strobe phase resets to 100 (identical to V1's unchangeable settings).

First, the XL2720Z's strobe persistence ranges from 0.069ms to 2.1ms at 144hz.
To get the same strobe persistence as the ULMB settings, you need a strobe duty of 027. 027 x 0.069ms is 1.863ms, close enough to ULMB pulse width 100 of 1.875ms.
Already you can tell there's a problem as soon as you see these numbers, because the backlight current is increased by 1.8x, and on my screen I just tested, it's -much- brighter than 119 cd/m2 at 1.875ms. 1.38ms however looks accurate for 119 cd/m2. That's close to my 80 cd/m2 test for 1.0ms persistence, which is already known.

However, the XL2720Z at 100 brightness, at this setting, drops from 290 cd/m2 to no lower than 150 cd/m2 (I don't have measuring equipment to measure this stuff but I can tell it's still more than half the brightness). And it can go up 3 more points of duty up to duty 030, which is 2.1ms. But that only goes up about 5 cd/m2. And that's at 144hz.

At 120hz, you can go PAST 300 cd/m2, which you do NOT want to do, as you can damage the backlight. The backlight persistence values seem to be capped at 25% below max cd/m2, I'm taking a wild guess. But using a VT tweak changes the pulse widths at 120hz from 0.083ms-2.5ms (notice this is the refresh rate persistence divided by 100, right?) to 0.167ms to 5 ms (!).

Once you go past 2.5ms, you are exceeding what the backlight is rated for. Strobe duty 030 at 120hz with a VT tweak is equal to (guesstimation) a BLINDING 400 cd/m2. It's brighter than brightness 100 with blur reduction off. (Note: 2.5ms is strobe duty 015 when using a VT tweak, and strobe duty 030 (the maximum, which is what it's properly rated for) when not using a VT tweak.)
 
My swift is definitely "muted" when the default ulmb mode is on, even at longer pulse widths. I've messed around with it and it is definitely neat on easier to render games with really high frame rates (like darksiders). It's just not worth the difference/dullness to me.
[...]
Ulmb is drab even in dim room settings. I really did try to like it long term but switching back to g-sync mode brought all the color brilliance and overall brightness (aka pq "pop") back to life even at modest (non strobing) brightness settings.
Have you compared the Swift (I assume PG278Q) around Brightness 25 in normal/G-Sync mode, vs ULMB at maximum brightness?
Does ULMB still look "dull" or are they roughly the same in appearance, and the issue is that you prefer to have a really bright screen? What is your typical brightness setting?

Because 100cd/m² (Brightness 20) is the recommended level for monitors to be used at - and those settings should be ~120cd/m² according to TFT Central.
Now it's fine if you prefer a really bright backlight, but 120cd/m² is not dull, dim, or drab.
 
Non-strobing brightness 20 on a PG278Q is more or less the same as ULMB on with a pulse width of ~90. The only difference I noticed in picture quality between ULMB on and off is a slight yellow tint added with it turned on.
 
My 2 cents with ULMB after comparing it with G-Sync on Starcraft and The Incredible Adventures of Van Helsing - Final Cut.

My original screen setting in G-Sync is 20 brightness, 50 contrast.

With ULMB setting, I found if I were to keep a similar level of brightness, I would need to increase Brightness to 100 and pulse width to 90. I initially tested with Brightness 100 and pulse width of 10, since the smaller the pulse width, the greater the clarity, supposedly, but at this setting, even in a completely dark room, I found things were hard to see, so I had to increase the width to 90 to achieve a similar brightness as my original setting.

First, I tested clarity by moving a highly contrasted window around rapidly. I could definitely make out the text easier in ULMB mode than I could under G-Sync, so the motion clarity is there. But I found this made little difference in Starcraft 2, where things were not really moving rapidly. I found ULMB and G-Sync to be indistinguishable.

In Van Helsing, again, I found the motion to be indistinguishable, but I noticed some stuttering under ULMB. Switched it to G-Sync and it was smooth again (I don't know whether this is placebo or not, but the stuttering was certainly much less pronounced in G-Sync mode).

I will also admit I have not done any real extensive testing on this, as I was more interested in playing the game than to test it, but that has been my first impression.
 
Thanks for the ULMB feedback.
Having to use a pulse width of 1.88ms (which is 90-100 for you ULMB guys) to have a screen that is still not bright enough is a pretty bad tradeoff compared to just using G-Sync.

1.88ms pulse width is equal to *lightboost 10%* on the old VG248QE at 100hz (it was 1.4ms at 120hz), which is the minimum setting. That's what you guys get at pulse width 100 on ULMB, right? But the VG248QE increased voltage to the backlight by 1.8x (you can read how both Lightboost and benq blur reduction used the same hardware wires here: http://display-corner.epfl.ch/index.php/BenQ_XL2411Z , under backlight pulse width)
I used to play LB 10% on my VG248 before getting my Benq, so we were using 1.8x the brightness you guys have, and some people still found that too bright :/

The blur reduction effect of 1.88ms is "okay" but frankly I don't like it if I can go lower. When using VT tweaks on benq blur reduction, 1.88ms is equal to Strobe Duty 011, and I find that both too -bright- and too blurry. I use 1.0ms persistence day to day (strobe duty 006; 006 x 0.167= 1.0 ms) which may be pushing it in a lit room, but is a perfect 80 cd/m2 in a dark unlit room at night.
 
BenQ makes one. BenQ XL2420Z 24-Inch Screen LED-Lit Professional Gaming Monitor. Just bought one for my son and he loves it.
 
Yes, all of the Benq Z series monitors (besides the XL2730Z which uses different hardware but still has a boost voltage) increase voltage to the backlight by 1.8x. It shares hardware with the lightboost circuitry. Blur reduction works great. But the overdrive artifacts.....ahem...

The monitor will hard power cycle if the cd/m2 exceeds 450, however.
 
But I found this made little difference in Starcraft 2, where things were not really moving rapidly. I found ULMB and G-Sync to be indistinguishable.
Part of it depends on your perception. Some people just don't seem to notice motion blur as much as others.
But a large part of it is definitely the speed and type of motion being displayed. ULMB is going to make a much bigger difference if you play fast-paced FPS games or racing games for example.

In Van Helsing, again, I found the motion to be indistinguishable, but I noticed some stuttering under ULMB. Switched it to G-Sync and it was smooth again (I don't know whether this is placebo or not, but the stuttering was certainly much less pronounced in G-Sync mode).
A strobed display requires that your framerate is locked to the refresh rate. Any performance drops will be much more visible due to the increased motion clarity.
G-Sync is the opposite of ULMB: it trades off motion clarity for perfectly fluid motion.

That's why I like that most G-Sync monitors include both, even if ULMB may be less effective than things such as BenQ's Blur Reduction feature.

Having to use a pulse width of 1.88ms (which is 90-100 for you ULMB guys) to have a screen that is still not bright enough is a pretty bad tradeoff compared to just using G-Sync.
[...]
The blur reduction effect of 1.88ms is "okay" but frankly I don't like it if I can go lower.
To be fair, 2ms is still far better than most non-CRT displays out there, including Plasma TVs.
I agree that 2ms is certainly not enough to eliminate motion blur - but it's a good start, and a hell of a lot better than an LCD giving you 16ms of blur at 60Hz.
I also agree that strobed displays should give us more options. Most of them seem to be very limited - especially the TVs.
 
For me personally it comes down to smooth, high motion definition,
40 - 50 - 60 % blur reduction using g-sync at high fps-hz ranges (100+ fps-hz) at 1440p resolution
vs
not getting the benefits of g-sync and muting my screen brilliance at the short enough pulse widths required to gain much more than 40 - 60% blur reduction.

(or.. dropping to 1080p rez on a benq and using brightness/timing 'hacks' w/o any g-sync option.)

On easier to render games like isometric rpg's (van helsings, torchlight2, diablo), and things like darksiders, etc. - it is pretty much always 60% blur reduction in g-sync mode (and additionally a 2.4:1 increase in motion definition & articulation of individual objects and of the entire game world moving relative to you during movement keying and mouse-looking).

So considering that I have to increase the pulse widths in order to get a more modest dimming (yet still not enjoying as brilliant and colorful of a display) on the pg278q in ulmb mode, it is not really worth it vs the 40 - 60% blur reduction combined with the huge benefits of g-sync to me.
This is coming from someone who used a fw900 for years. If I could 'hack' my pg278q to make it brighter like you have shown with the benq's, I'd definitely try it again though.
 
40 - 50 - 60 % blur reduction using g-sync at high fps-hz ranges (100+ fps-hz) at 1440p resolution
What is this "40-50-60% blur reduction" that you are talking about?
G-Sync only works with full-persistence displays, so the amount of motion blur seen is determined by your framerate. It does not reduce motion blur at all if you're running a game at 60 FPS, compared to a standard 60Hz monitor.
Are you just trying to say that 100 FPS has less motion blur than 60 FPS?

not getting the benefits of g-sync and muting my screen brilliance at the short enough pulse widths required to gain much more than 40 - 60% blur reduction.
At 120 FPS you will have 8.33ms of motion blur with G-Sync
At 120 FPS you will have 1.875ms of motion blur with ULMB. That is a 4.4x improvement in motion clarity.

At the lowest pulse width, that drops to 0.250ms - a 33.3x improvement in motion clarity.
Of course anything less than 1.875ms is bordering on unusable since ULMB does not boost the backlight brightness to compensate for lower pulse widths, but my point is that the difference is significant, not slight.
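
Quick check of those factors (a simple persistence ratio; my own sketch, nothing more):

```python
# Quick check of the comparison above (simple persistence ratio; my own sketch).
def clarity_improvement(sample_and_hold_hz, ulmb_pulse_ms):
    return (1000.0 / sample_and_hold_hz) / ulmb_pulse_ms

print(round(clarity_improvement(120, 1.875), 1))  # ~4.4x at the maximum pulse width
print(round(clarity_improvement(120, 0.250), 1))  # ~33.3x at the minimum pulse width
```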

This is coming from someone who used a fw900 for years. If I could 'hack' my pg278q to make it brighter like you have shown with the benq's, I'd definitely try it again though.
The maximum brightness from an FW900 is about 105cd/m²
The maximum brightness from a PG278Q with ULMB enabled is about 120cd/m²
 
Yeah, I realize CRTs are dim. That comment was in reference to "zero" blur vs what I get at high hz-fps with g-sync mode.
I guess it's all about the tradeoffs. I got used to a more brilliant LCD screen even in dim viewing settings with a 120hz 1080p samsung and now my 1440p swift, even though I don't use it anywhere near max brightness.

As per blurbusters.com, on a high-hz, very low response time monitor
vs. baseline 60hz (60fps-hz):

100fps-hz yields ~40% sample and hold blur reduction
120fps-hz yields ~50% sample and hold blur reduction
144fps-hz yields ~60% sample and hold blur reduction

These also result in much higher motion definition and motion articulation of course.
Locking my hz lower would lose out on the higher motion def of running 100 - 120 - 144 range with g-sync.
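
Those percentages fall out of simple arithmetic against a 60 Hz sample-and-hold baseline; here's a quick sketch (my own, assuming the framerate matches the refresh rate):

```python
# Persistence ratio against a 60 Hz sample-and-hold baseline (simple arithmetic,
# assuming the framerate matches the refresh rate).
def blur_reduction_vs_60hz_pct(fps_hz):
    return (1.0 - 60.0 / fps_hz) * 100.0

for fps in (100, 120, 144):
    print(f"{fps} fps-hz -> ~{blur_reduction_vs_60hz_pct(fps):.0f}% less sample-and-hold blur")
# 100 -> 40%, 120 -> 50%, 144 -> ~58% (roughly the 60% quoted)
```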

I know strobe mode reduces this a lot more but it's still a huge tradeoff vs:
- appreciable (40%) 50% (60%) blur reduction
- the benefits of g-sync mode on over a common playing rate of 100fps-hz on a roller coaster graph fluctuating up/down on demanding games
- much more brilliant and colorful screen/game world

I did test out L4D2 and torchlight2, path of exile, darksiders, etc in ULMB mode when I got my swift but switching back and forth was a huge difference and I went with the g-sync and appreciable blur reduction (vs near blur elimination of strobing mode).

I can see where people could choose the strobing mode though, especially on the 1080p benq where you could boost the backlight level much higher, and you could get much higher frame rates at 1080p(especially with sli). If the brightness were capable of going that high on my 1440p I'd revisit the choices. Maybe with dp 1.3 gpus, dp 1.3 monitors and HDR range capabilities but in non-hdr mode someday. The framerates would still be a huge issue for me though at 1440p or 3440x1440 144hz on 21:9 dp 1.3 monitors so I'd likely stick with g-sync still.
 