ASUS/BENQ LightBoost owners!! Zero motion blur setting!

I wish Blur Busters would consider doing some of its camera motion-tracking tests (60Hz, 120Hz, 144Hz, LightBoost modes) on some popular games, to show the effect of whole-viewport smear on very high-detail scenes/textures/shaders, and post the pictures in addition to the UFO photos. I think it would provide further enlightenment.
I can do camera tracking tests on games in the future. It's just harder to do accurately without a synchronization track (HOWTO: Blur Busters Motion Tests With Pursuit Camera). It will be considered within the next few months; there are a lot of Blur Busters projects I want to push along.

For now, strobed 60Hz would be the Next Best Thing, if you can tolerate the 60Hz flicker. A good compromise is playing PS4 or XB1 on a display utilizing one of the 60Hz strobe backlights, such as Sony's Motionflow Impulse.
 
Don't sweat it, just something on my wish list.
.
My other comments were touting the superiority of 120Hz and how inferior 60Hz is. I was addressing many of the opinions I saw in the thread you linked on Tom's: that displays don't have visible blur, that 1080p is inadequate for gaming, that 60Hz is just fine, etc.
 
XL2420TE.

Although I hear XL2411T's are heading into Revision 2.0 as well and becoming PWM-free, I haven't seen confirmed reports yet (unlike for the XL2420T Rev 2.0 and XL2420TE).

So other than the flicker-free backlight, what is the difference between these two? It seems like the 2420TE is much better, since it's cheaper and newer.

Also, just for fun, which would you choose: the BenQ or the Asus (24")? I've been following this thread for a while and I think I'm getting close to actually buying one :p
 
Are there any other improvements besides flicker-free on the 2420TE, compared to a 2420TX?
 
Just pulled the trigger on an Asus VG278H, largely based on Mark's experience with it. As an owner of an HP ZR30W for many years, and exclusively S-IPS monitors before that, I'm a little spoiled when it comes to color and image quality. I tried out the Asus VG248QE a while back and threw up in my mouth a little upon firing it up. Hopefully the VG278H is a nice compromise on color/size while being 120Hz LightBoost capable.

This question is directed at Mark if he happens to see this post: are there any VG278H-specific tweaks or settings you can share to get this panel looking its best? I'm planning to keep 120Hz LightBoost on all the time, but I'm awful at calibrating stuff. Thanks!
 

I'm going to be watching very closely for your feedback. ;)
 
I have a question that doesn't deserve a new thread:
I now own a two-year-old Samsung S27A950D and have an offer to sell it. If I add 150 dollars to what I get for my monitor, I can buy either an LED BenQ XL2720T or an Asus VG278HE, because I want LightBoost so bad :D (I did try LightBoost on my Sammy, but the colors were awful while doing that.)
Which monitor should I go for? (I'll use it just for 2D 120Hz gaming.)
 

You beat me to it. If I decide to roll the dice here, I'm wondering what the best 27-inch option (or options) is. I noted the two most-recommended 24's from Mark, and I just wanted to see the 27's as well.
 
I can't say from experience, but from what I've been reading in the threads, the 27" models have better color and twice the contrast of the 24" ones in LightBoost mode, but they also show a single afterimage shadow. They don't blur, but moving objects have a single slim, un-blurred ghost image/outline trailing close behind them. Others can correct me if I am wrong, or elaborate better.
.
I would personally get the XL2720T, and hopefully Vega would remove the AG coating for me (paying him, of course). I have some extra cash coming soon, so I might make the leap, especially since with the ToastyX utility I can (I think) use LightBoost with my 6990 and not have to upgrade my GPU immediately, though next time I upgrade it will be to NVIDIA.

I have a Samsung A750 now. The strobe mode does cut down blur on the Samsung, but not to CRT quality, and it adds input lag, which I can't stand. It was still nice to get a glimpse of what I am missing by not having a LightBoost monitor, though.
 

Good notes.

This may be a nitpicky question, but if you're going from 24" to 27" on one of these... since they're all 1920x1080 as I gather here... how much of a hit, if any, are you taking in the pixel pitch/pixels-per-inch department?



EDIT: Just for grins I checked Asus VG248QE against VG278HE and the pixel pitch went from 0.2768mm to 0.311mm. What are we talking about there?
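Just to put rough numbers on that, pixel pitch converts to pixels per inch as 25.4 mm divided by the pitch. Here's a quick back-of-envelope sketch in Python (the pitch values are only the ones quoted above, so treat the output as approximate):

Code:
# Rough PPI from pixel pitch: 25.4 mm per inch / pitch in mm
for name, pitch_mm in [('VG248QE 24"', 0.2768), ('VG278HE 27"', 0.311)]:
    print(f"{name}: {25.4 / pitch_mm:.1f} PPI")
# -> VG248QE 24": 91.8 PPI
# -> VG278HE 27": 81.7 PPI

So roughly 92 PPI down to 82 PPI, about an 11% drop in linear pixel density.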
 
You can check out a 1080p 27" at a store like BB if you can find one. It is obviously less resolution and pixel density, but aside from that, the actual game scene remains the same FoV/scene-element-wise in typical 1st/3rd-person-perspective games. To me it looks a lot more jumbo-sized on the desktop than in a game. The tradeoff allows you to get large gains in fps at higher eye-candy settings (other than resolution) in a game's video options, and if you tweak to get 100-120+ fps on your 120Hz monitor, you will experience much more defined motion transitions and aesthetic flow in addition to large reductions in blur (and essentially zero blur on a LightBoost-mode monitor).
.
Perceived PPI and screen size are relative to your viewing distance, which is why large 1080p TVs look good at typical living-room viewing distances. If you have a deep desk, or a desk not against the wall (not "bookshelf" desk placement), you can use a monitor arm to set the display back even farther if you want to. I keep my 27" 1080p at a pretty normal distance, around 2.5'. I have a 2560x1440 screen at 108.8 PPI, with better color/uniformity, to the side of the 27" 1080p for everything else desktop/app-wise. I enjoy using each panel for its respective tasks.
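For anyone who wants numbers on the viewing-distance effect, here's a rough sketch of angular pixel density (pixels per degree of vision). The ~82 and 108.8 PPI figures and the ~30" distance are just the ones mentioned above, not measurements:

Code:
import math

def pixels_per_degree(ppi, distance_in):
    # pixels subtended by one degree of vision at a given distance (inches)
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

for ppi in (81.7, 108.8):   # 27" 1080p vs 27" 1440p, approximately
    print(f"{ppi} PPI at 30 in: {pixels_per_degree(ppi, 30):.0f} px/deg")
# -> ~43 px/deg vs ~57 px/deg; sit farther back and both numbers climb.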

It's all about trade-offs and your budget.
You might want to check out this thread I have been posting in over the last few days. I think it gives a lot of perspective on what you are inquiring about:
Leaving multi, going back to single monitor.
.
 
Ugh, these latest NVIDIA drivers for BF4 (331.40) have completely broken my ability to get Surround LightBoost working, and also portrait Surround bezel correction.

I really hate GPU company drivers; they usually screw up everything. So now I have to run in regular crap-tastic 120 Hz mode, lol.
 

You're one of the biggest reasons I'm even considering taking this concept for a test spin. ;)
 
You're one of the biggest reasons I'm even considering taking this concept for a test spin. ;)

You building a similar setup to mine?

With these new drivers, it crashes my computer if I try to enable portrait LightBoost Surround after using the latest ToastyX program, or doing it manually through CRU.exe.

The old way of manually inputting custom resolution timings into each monitor via the NVIDIA control panel, and THEN putting them into Surround, used to work. It doesn't anymore. So my 3x portrait LightBoost setup is a big paperweight at the moment. Maybe I will be forced to replace my Titans with 290Xs if this is a permanent change with NVIDIA.

Aint nobody got time for that!
 
You building a similar setup to mine?
I've forwarded your information to nVidia. Can you please follow up and post over there, in nVidia's forum? I'm trying to get heads to roll over there. I've even PM'd a few nVidia reps to try to expedite this, because I truly believe nVidia customers who spent $4000 on graphics cards should not be forgotten during driver upgrades.

GeForce Forums Thread -- Problems with 120Hz Triple Surround
https://forums.geforce.com/default/...rs-may-reluctantly-go-ati-hel/?offset=2#reply

Attention -- all nVidia triple 120Hz surround users should post in the above thread. If you recall the forum nicknames of anyone who mentioned triple 120Hz monitors, please bring them up!
 
Thanks Mark. Have you had much luck on the NVIDIA forums in the past as far as official responses go?

Have you heard of anyone else not getting LB to work with the latest drivers in Surround? I just tried landscape, and it won't work there either.
 
Have you heard of anyone else not getting LB to work with the latest drivers in Surround? I just tried landscape, and it won't work there either.
A number of people on OCN and BlurBusters have reported that LightBoost stopped working in surround.
So I've posted there as well. Can you also post a follow-up on the GeForce forums? More people posting will make a fix more likely.

I've also PM'd AndyBNV of nVidia at NeoGaf, who is aware of LightBoost users.

(Does 3D Vision Surround still work, out of curiosity? You still have the emitter, don't you?)
 
I've got a similar question.

Can I use a LightBoost BenQ together with a classic 60 Hz VA panel connected to the same GTX 770?
 
Not in surround mode, but you can run them separately in multimonitor mode, and run games on one of the monitors at a time.

A risk is that games may run at the framerate of the lowest-refresh-rate monitor (60Hz), even if it is the secondary monitor. Using Windows 8 instead of Windows 7 will reduce the risk of frame-rate glitches, as it handles mixed-refresh-rate multimonitor systems better. However, there may be times you have to disable your 60Hz monitor from your multimonitor setup to get your games to run at a full 120fps. Utilities are pretty helpful here, too. A paid shareware utility, UltraMon, gives you hotkeys to turn a secondary monitor on/off, so a single keypress can drop you into single-monitor mode -- which can be useful for pure 120Hz gaming with the troublesome games.
 
I'm going to be watching very closely for your feedback. ;)

I ultimately decided to return the VG278H. Once again, the absence of motion blur (slightly more blur than the VG248QE) was very nice, but the colors and bigger screen of my ZR30W are still vastly more pleasant to look at and more immersive. A better K:D ratio is just not worth giving up all the benefits of an S-IPS panel, and don't even get me started on how crippling it is going from 2560x1600 to 1920x1080 when it comes to desktop and webpage real estate. I felt like I was using my cell phone to browse the net, except my cell phone is way more vibrant. I'm fairly certain I won't be giving up my ZR30W until they come out with a 30" S-IPS 120Hz LightBoost-capable monitor.
 

That's why I use different monitors for desktop/apps and gaming. I have a 27" 2560x1440 60Hz IPS next to a 27" 1920x1080 120Hz TN. I've been using two different monitors for years now, formerly various LCDs next to an FW900 graphics-professional CRT. The tradeoffs are too great between motion clarity, motion definition (different from blur reduction), and color + PPI/desktop real estate to use only one. Now it is cheaper than ever to do, since you can get a 2560x1440 Korean panel for around $350.

In a 1st/3rd-person-perspective game world, the scene itself and all its elements remain the same size in most games; only the pixel density changes. Also, since I feel 60Hz gaming is inferior on many levels, 1080p is the sweet spot for enthusiast-level GPU budgets to get 100 to (optimally) 120+ fps with a 120Hz monitor on more demanding games at high, high+(custom), or ultra settings. 120Hz is a huge difference aesthetically in motion definition and transitions, and the added accuracy makes control more pleasant too. That is in addition to greatly diminishing, or eliminating, the way the entire viewport of gorgeous architecture, textures, bump-mapped depth, etc. constantly smears out "outside of the lines" at 60Hz. At least 120Hz non-LightBoost keeps the blur within the "shadow mask" of the objects in the scene.

I've repeated this quote several times in different threads, but I keep posting it because people still keep thinking that 120Hz is only about blur reduction.
While I am very interested in blur reduction and, optimally, blur elimination, there are additional benefits to running high fps and high Hz.
.
When I say "smoothness" I mean something separate from blur reduction. If I were using a general term for blur reduction I would use something like "clarity" or "clearness".
.
Smoothness to me means more unique action slices, more recent game-world action shown: more dots per dotted-line length, more slices between two points of travel, more unique and newer pages flipping in an animation booklet -- pick your analogy. It means fewer "stops" in the action per second and more defined ("higher definition") animation/action flow, which provides more pleasing motion and can improve accuracy, timing, and reaction time.
.
Disregarding backlight strobing for a moment... As I understand it, where a strobe light in a room someone runs across would show blackouts, a typical LCD, rather than blacking out, just continues displaying the last "frozen" frame of action until it is updated. At 60Hz that is every 16.6ms, of course; at 120Hz and high fps it would have shown a new state of (a new peek into) the room and the run cycle 8.3ms sooner, instead of freeze-frame skipping over what would have been a new state at +8.3ms and jumping to the next state of the room a full 16.6ms later. Everything displayed of the animated game world is updated twice as often (and twice as soon), which can increase accuracy, and by providing more "dots per dotted line" it makes movement transitions cleaner and aesthetically smoother, providing higher-definition movement and animation divided into 8.3ms updates. This goes hand in hand with blur reduction/elimination to make the entire experience a drastic improvement over 60Hz/60fps.
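A tiny sketch of the arithmetic behind that (nothing monitor-specific, just 1000 ms divided by the refresh rate, assuming fps matches Hz):

Code:
# How long each "frozen" frame is held on a sample-and-hold LCD,
# and how many unique positions per second a moving object gets drawn at.
for hz in (60, 120, 144):
    print(f"{hz} Hz: new frame every {1000 / hz:.1f} ms -> {hz} unique positions/sec")
# -> 60 Hz: 16.7 ms, 120 Hz: 8.3 ms, 144 Hz: 6.9 ms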
 
I wonder if some newer LightBoost models will show up now that they are rolling out this G-Sync monitor technology:

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

Later this year, our first G-SYNC modules will be winging their way to professional modders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available. These modded VG248QE monitors will be sold by the modding firms at a small premium to cover their costs, and a 1-year warranty will be included, covering both the monitor and the G-SYNC module, giving buyers peace of mind.

Alternatively, if you’re a dab hand with a Philips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

They are talking about tearing and stuttering, and aren't focused on ultimate blur reduction, unfortunately... but it still sounds promising for the issues it addresses.

From another forum:
this is basically a variable refresh rate monitor. This is why it comes with an external power supply (and they are comfortable with offering a DIY solution) instead of being plugged directly into an outlet. The AC adapter will probably feed low voltage DC into the monitor, and there is a frequency generator on the module to adjust the refresh rate depending on the frame rate. This is how you get around both tearing and vsync. I expect the monitor to use even pull-up cadences however when the frame rate falls below 60, otherwise you would get serious eye strain.

Overall a very good solution, and something that should have been done A LONG time ago. For the first time ever, humans should be able to feel refresh rates between common denominators with 100% accuracy. Huge milestone.

There is some question as to whether this variable refresh rate will work with 3D Vision, and thus whether it will work with LightBoost in 2D at all. If it does, running higher fps than your monitor's refresh rate (100Hz or 120Hz, for example) would probably still be best where possible if you want LightBoost in 2D gaming for zero blur. Hopefully some of the G-Sync monitors they release next year will be LightBoost capable, even if only with G-Sync inactive, so at least we might end up with some new LightBoost-capable monitor releases.

http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate/Potential-Benefits-New-Interface

http://forums.anandtech.com/showthread.php?t=2348515
 
I fully expect g-sync to require a single frame buffer to work effectively. Input lag would be greater than running with vsync disabled.
 
"We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date."

Now that is an interesting quote. Did he just say that with this G-Sync we can expect even better LightBoost performance than what we currently have?

We need to see all this stuff implemented at greater than 1080P resolution. That is what I am waiting for.
 
Higher resolutions will work with G-Sync.

AndyBNV said:
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • No tearing
  • No VSync input lag
  • No VSync stutter
  • Improved input latency compared to VSync Off mode on 'standard' monitors
  • Clearer, smoother movement (tech demo shows readable text on a moving object)
  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
  • Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Phillips and Viewsonic
  • Requires GTX 650 Ti Boost or better

Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.

But we don't know if those will be LightBoost-capable too.
 
The guesswork in other forums is that the monitor's refresh rate will vary to match the fps of the GPU, so the Hz will drop to prevent tearing and judder when your fps drops. They mention input lag/responsiveness being tons better. I think they are talking about Hz-vs-fps sync problems being cleaned up, not blur the way LightBoost in 2D does. People are guessing that it is unlikely to work with 3D mode or LightBoost-2D mode. I'd gladly be wrong about that, though, and have them implement some sort of superior 2D backlight strobing along with addressing the sync issues.
.
Personally, I have mixed feelings about higher-resolution monitors. I'm all for the option for the extreme-GPU-budget people, but I still want the 1080p option so that $650 - $1k in GPUs can run 100 - 120+ fps at high, high+(custom), or ultra settings in more demanding games. Dual 780s might be OK for 2560x1440 in many games to hit 100+ or even 120+ fps at max settings (Far Cry 3 and Hitman only in the 60s, though, I think), but that's still ~$1300 in GPUs alone.
 
Vega...

Or is G-Sync separate from LightBoost? I'm not understanding...

I was about to get a new nVidia card (770 or 780) and a BenQ XL2420TE... I'm holding off now on the monitor...
 
Will G-Sync on a low-latency LCD eliminate motion blur? Sorry, still confused about G-Sync's relation to LightBoost.
 
"As others have asked already, will this keep the LightBoost feature intact for the VG248QE? I bought this monitor purely for the LightBoost functionality and it too is very revolutionary for competitive gamers. I'd rather have both than have to choose for one of the two..."

NVIDIA: "We will be addressing low lag modes in the future; stay tuned."

I hate waiting. :)

Although my screen de-matte business may get a little busier with the VG248QE being the vanguard display for this new tech!
 
"As others have asked already, will this keep the LightBoost feature intact for the VG248QE? I bought this monitor purely for the LightBoost functionality and it too is very revolutionary for competitive gamers. I'd rather have both than have to choose for one of the two..."

NVIDIA: "We will be addressing low lag modes in the future; stay tuned."


Yes, that is the question for the day.
 
I fully expect g-sync to require a single frame buffer to work effectively. Input lag would be greater than running with vsync disabled.
Same for LightBoost.
G-Sync has less input lag than LightBoost, as LightBoost has a slight additional pixel-transition-complete waiting time added at the end.
Plus, in the future, frame transmission and scan rates will go up (e.g. tomorrow's 240Hz and 480Hz LCDs later this decade; look at what's happening in the high-end HDTV market. There's no reason why interpolated refreshes can't be replaced by native refreshes. The technology is doable. We can eventually achieve strobe-free LightBoost motion quality by the end of the decade!).

Look at this another way: VSYNC ON during 1000fps@1000Hz has only 1ms input lag. At that point, people begin preferring VSYNC ON. G-Sync lowers the bar tremendously, since you're not always forced to round off to the next scheduled display refresh (as on discrete-refresh-rate monitors).
 
Same for LightBoost.

Mark,

What is best right now for FPS gaming? Lightboost or G-sync?

If you don't know yet, when do you expect to be able to formulate an answer?

I'm in the position of wanting to purchase both a GPU and a new monitor. Thanks.
 
Eureka: I've invented a method of variable-rate strobing without flicker at lower framerates!!!
(This now makes it feasible to combine LightBoost and G-Sync without side effects)

Mark,
What is best right now for FPS gaming? Lightboost or G-sync?
Depends.
Variable framerates: use G-Sync.
Constant 120fps@120Hz: use LightBoost.

G-Sync is still limited by 144Hz motion blur; it would take G-Sync at 400fps@400Hz to achieve flicker-free LightBoost/CRT motion clarity (2.5ms sample-and-hold length), based on existing motion blur math directly correlating strobe length with motion blur. So frame durations would have to match today's LightBoost strobe lengths. A 2.5ms strobe means frames can be visible for only 1/400th of a second, and that would require 400fps@400Hz using nVidia G-Sync. If you want the clarity of LightBoost=10%, you need about 700fps@700Hz (1.4ms frame duration) and upwards. That's not feasible. So we still need strobing, at least until we have a 1000Hz LCD (to get flicker-free CRT motion clarity). It's surprising how human eyes still see indirect display-induced motion blur (the sample-and-hold effect) even at 240Hz, 480Hz, and beyond; witness how human eyes can tell apart LightBoost=10% (~1 to 1.4ms frame duration) from LightBoost=100% (~2.5ms frame duration) during motion tests such as www.testufo.com/photo#photo=toronto-map.png&pps=1440. Display motion blur is directly proportional to visible frame duration.
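To make the math above concrete, here's a small sketch based on the same formula (display motion blur proportional to visible frame/strobe duration, so the sample-and-hold refresh rate needed to match a given strobe length is just its reciprocal). The 1440 px/sec speed is simply the one from the testufo link above:

Code:
def hz_for_persistence(persistence_ms):
    # sample-and-hold refresh rate whose frame duration equals a given strobe length
    return 1000.0 / persistence_ms

def blur_px(speed_px_per_sec, persistence_ms):
    # blur trail length while eye-tracking: speed x visible frame duration
    return speed_px_per_sec * persistence_ms / 1000.0

print(hz_for_persistence(2.5))    # 400 Hz  ~ LightBoost=100% (~2.5 ms strobe)
print(hz_for_persistence(1.4))    # ~714 Hz ~ LightBoost=10% (~1.4 ms strobe)
print(blur_px(1440, 16.7))        # ~24 px of blur per frame at 60 Hz, 1440 px/sec motion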

However, we already know LightBoost becomes terrible at lower framerates, and can become very stuttery at less than triple-digit framerates. G-Sync stays smooth during varying framerates; LightBoost does not. The solution is to combine LightBoost AND G-Sync. This solves the problem. However, new problems occur with variable-rate strobing. Fortunately, I've come up with a solution.

I've found a way to combine the two. John Carmack did say it was possible in his twitch.tv video, but it didn't appear that anyone came up with a novel idea of blending PWM-free with LightBoost strobing, an enhancement that I have just come up with:

I have quickly invented a new idea of combining PWM-free with LightBoost, while having G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned this in his twitch.tv video when he mentioned combining LightBoost with G-Sync. So I'm declaring it as my idea of a further improvement to nVidia G-Sync, as a workable method of fixing strobe flicker on variable-refresh-rate displays:

From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a new, creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm that allows variable-rate backlight strobing without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep strobing duty cycle percentages constant when varying the strobe rate. This requires careful and precise strobe-length control during variable refresh rate, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: Strobing will cause uncomfortable flicker at lower refresh rates.

Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low framerates all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually becomes soft Gaussian/sine-wave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as you head higher in framerate, towards 120fps@120Hz. At the monitor's maximum framerate, the strobing more resembles a square wave with large totally-black gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between, too, much like a CVT instead of discrete gears in an automobile transmission. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave, and using only pulsewidth / pulseheight manipulation to achieve the blending effect, but without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates, to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.
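For illustration only, here is a toy sketch of the simpler pulse-height variation just described: blend linearly from a steady PWM-free level to a square strobe as the refresh rate rises, keeping the average brightness constant. The breakpoint refresh rates and the 25% duty cycle are arbitrary values picked for the example, not anything specified above:

Code:
def backlight_brightness(refresh_hz, t_frac, duty=0.25,
                         pwm_free_below=50.0, full_strobe_at=120.0):
    # Toy model: backlight level (0..1) at phase t_frac (0..1) within one refresh.
    # blend = 0 -> steady PWM-free backlight; blend = 1 -> hard square-wave strobe.
    blend = (refresh_hz - pwm_free_below) / (full_strobe_at - pwm_free_below)
    blend = min(max(blend, 0.0), 1.0)
    steady = duty                            # PWM-free: constant level, average = duty
    pulse = 1.0 if t_frac < duty else 0.0    # square strobe: full-on for `duty` of the frame
    # Linear blend keeps the average brightness equal to `duty` at every refresh rate,
    # so the picture neither dims nor brightens as the strobe sharpens.
    return (1.0 - blend) * steady + blend * pulse

# e.g. flat (flicker-free) at 45 Hz, LightBoost-style strobe at 120 Hz:
print(backlight_brightness(45, 0.5))    # 0.25
print(backlight_brightness(120, 0.1))   # 1.0 (inside the strobe pulse)
print(backlight_brightness(120, 0.5))   # 0.0 (between strobes)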

If nVidia or any monitor manufacturer uses this idea (and no patent application dated before October 19, 2013 covers my invention), please give Mark Rejhon / Blur Busters appropriate due credit. It is understood that nVidia has several patents, but none appears to cover this additional improvement suggested for combining strobing with variable refresh rates. As of this writing, research is being done into prior art, to determine whether anyone has considered dynamically blending from PWM-free to square-wave strobing. If anyone else already came up with this idea, documented in a patent application prior to October 19, 2013, please let me know and due credit will be given here.
If monitor manufacturers are reading this, please make sure to mention my name somewhere in the monitor manual, on the monitor's page of your website, or elsewhere.
 
What you seem to be suggesting is that they make a G-Sync-like adaptive tech with strobe length synchronized to the variable refresh rate. How to "fix" it, concept-wise, seemed like an obvious question to me after I read the G-Sync material (would it be possible to match the strobing to the dynamically lower refresh rates?), but I don't have Mark's background and didn't conceive of any formulae.
.
So for now, the fears seem likely correct that you cannot combine G-Sync and 3D Vision on existing monitor tech, and therefore cannot combine G-Sync and LightBoost 2D gaming. The newer monitor models next year may end up being higher-priced, both 3D Vision/LightBoost and G-Sync capable, but not at the same time. Perhaps they will release a really good LightBoost-capable 27" model with a glossy coating though, which I would be interested in, and as they said, they will be releasing higher-resolution G-Sync monitors... perhaps some of them will be 3D Vision/LightBoost capable with G-Sync disabled. So there may be some benefits from the newer monitors at least, even if you can't use G-Sync with LightBoost for some time (if ever). (They could just slap G-Sync boards on existing monitor designs without adding LightBoost to the higher-rez ones, I suppose :/ )
.
If you keep your fps higher than your LightBoost refresh rate (100Hz or 120Hz), would G-Sync be as great a benefit? Is it more to address the input lag and glitches of using vsync for people with more common, lower-framerate setups? When you run your fps higher than 100 or 120 respectively with vsync off, wouldn't your experience already be free of the vsync glitches that G-Sync tries to compensate for? Is its larger benefit that people can crank up display settings on non-extreme GPU setups -- resulting in lower framerates -- without suffering input lag, stutter, etc.?
 
So for now, the fears seem likely correct that you cannot combine G-Sync and 3D Vision on existing monitor tech, and therefore cannot combine G-Sync and LightBoost 2D gaming.

Yes, the fear is that this is an either/or scenario. Fast 120+ Hz TN panels will not benefit much from G-Sync, as you would rather be in LightBoost mode given the choice. LightBoost doesn't work on anything aside from fast 120+ Hz TN panels.

G-Sync works well on everything else: low refresh rates, slow panel types with high resolutions, etc. The hurdle will be combining the two, like Mark said. How soon they could do it is a huge question.

As far as 3D glasses are concerned, I think varying the flicker rate of the glasses to match G-Sync would not turn out well, either from a practical usage standpoint or from a design and implementation standpoint.
 