Wonder how Lightboost will fit into this equation...
I think Nvidia mentioned something about working on that as well. I'm kind of excited for this, but will reserve judgement.
Personally I wish Nvidia would release a video card that could do 4K @ 120 Hz or more. This seems more like a Moore's Law band-aid, or a remedy for the garbage Rage and id Tech 5 became. Maybe I'm being too cynical, but this doesn't excite me. I just don't see everyone running out to buy new monitors, especially if they've already invested in a 120 Hz monitor.
The next innovation in monitors will be getting rid of TN panels, not propping the manufacturers up for another decade.
LightBoost is designed to address pixel persistence to reduce ghosting in 3D. This is more like Adaptive Vsync 2.0.

Well, if it's doing 144 Hz with reduced input lag and variable vsync, I'd say G-Sync is LightBoost 2.0.
LightBoost is designed to address pixel persistence to reduce ghosting in 3D. This is more like Adaptive Vsync 2.0.
What does G-Sync have to do with Rage? LOL. Rage ran at a buttery smooth 60 fps; it was just ugly and boring.
If you aren't locked at 120 fps and vsync'ed, then yes, you have tearing and stuttering.
G-Sync is practical; 3D Vision is a gimmick. G-Sync improves many negative aspects of display technology industry-wide. No comparison whatsoever.
Interesting technology out of left field. I'm not sure I agree with what seem like over-the-top assertions of how much of a problem screen refreshing is, but it's certainly interesting anyway. It's almost as if they're trying to do to frame rate what high resolutions did to jagged lines.
Unfortunately, this seems sort of like high-end audio: you don't know what you're missing until you try it, but it costs money to try, so no one does. It's particularly tough to ask people to buy new displays. If there were an external module that could plug in between the DisplayPort connector on the monitor and the DisplayPort output on the GPU, then it might take off.
From: New Section in "Electronics Hacking: Creating a Strobe Backlight"
With nVidia's G-Sync announcement, variable refresh rate displays are now a reality. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow's perfect Holodeck display.
However, one additional idea that Mark Rejhon of Blur Busters has come up with is a new, creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm that allows variable-rate backlight strobing without creating flicker at lower frame rates.
It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep the strobing duty cycle percentage constant when varying the strobe rate. This requires careful and precise strobe-length control during variable refresh rate, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower frame rates: strobing at low refresh rates causes uncomfortable flicker.
Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low frame rates all the way to square-wave strobing at high frame rates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually becomes soft gaussian/sinewave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as you head toward higher frame rates, such as 120fps@120Hz. At the monitor's maximum frame rate, the strobing more resembles a square wave with large totally-black gaps between strobes.
Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost
This would be a dynamically variable continuum everywhere in between too, much like a CVT instead of discrete gears in an automobile transmission. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.
Simpler algorithm variations are also possible (e.g. keeping a square wave, and using only pulsewidth / pulseheight manipulation to achieve the blending effect, but without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates, to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.
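As a rough illustration of that blend, here is a minimal Python sketch that generates one refresh period of a normalized backlight-drive waveform. The trigger thresholds, the raised-cosine pulse shape, and the sharpening factor are illustrative assumptions rather than values from any actual monitor firmware; the only property the sketch tries to preserve is the constant time-averaged brightness described above.

```python
import math

# Illustrative thresholds and target; real trigger rates could differ, or could
# even be exposed as a user flicker-threshold setting, as the text above suggests.
PWM_FREE_BELOW_HZ = 45.0   # at or below this rate, keep a steady (PWM-free) backlight
FULL_STROBE_AT_HZ = 120.0  # at this rate, approach square-wave LightBoost-style strobing
MEAN_LEVEL = 0.5           # time-averaged drive level to hold constant (perceived brightness)

def blend(refresh_hz):
    """0.0 = flat backlight, 1.0 = full strobing; linear ramp in between (CVT-like)."""
    x = (refresh_hz - PWM_FREE_BELOW_HZ) / (FULL_STROBE_AT_HZ - PWM_FREE_BELOW_HZ)
    return min(max(x, 0.0), 1.0)

def waveform(refresh_hz, samples=256):
    """One refresh period of backlight drive, sampled uniformly in phase."""
    b = blend(refresh_hz)
    if b == 0.0:
        return [MEAN_LEVEL] * samples        # steady backlight: no flicker at low rates
    sharpness = 1.0 + 10.0 * b               # higher value = narrower, more square-like pulse
    raw = [(0.5 * (1.0 - math.cos(2.0 * math.pi * i / samples))) ** sharpness
           for i in range(samples)]
    # Rescale so the time-average stays at MEAN_LEVEL regardless of pulse shape.
    # Peaks above 1.0 correspond to the boosted LED drive that strobed backlights already use.
    scale = MEAN_LEVEL * samples / sum(raw)
    return [scale * v for v in raw]
```

With these assumptions, waveform(30) returns a flat level, while waveform(120) returns a short bright pulse followed by a long dark gap, roughly the LightBoost-style extreme of the continuum.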
If nVidia or any monitor manufacturer uses this idea (and no patent application dated before October 19, 2013 covers it), please give Mark Rejhon / Blur Busters appropriate credit. It is understood that nVidia has several patents, but none appears to cover this additional improvement for combining strobing with variable refresh rates. As of this writing, research into prior art is being done to determine whether anyone has considered dynamically blending from PWM-free operation to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know and due credit will be given here.
Wonder how Lightboost will fit into this equation...
I've solved the flicker problem of combining G-Sync with LightBoost.

With G-Sync, your refresh rate is not constant anymore, so the times between refreshes are unknown. This could be solved by displaying the image (backlight on) longer if the current frame took longer to render, so that overall brightness stays the same.
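A minimal sketch of that idea, assuming the scaler knows how long the last frame took: hold the lit fraction of each frame constant, so a longer frame simply gets a proportionally longer backlight pulse and the time-averaged brightness stays the same. The names and the 25% duty figure are made up for illustration.

```python
# Hypothetical duty fraction: the portion of each frame the backlight is lit.
# Holding it fixed keeps time-averaged brightness constant whatever the frame rate.
DUTY = 0.25

def strobe_on_time(frame_time_s):
    """Backlight on-time (seconds) for the frame that just arrived.
    A 60 fps frame (~16.7 ms) gets ~4.2 ms of light; a 30 fps frame (~33.3 ms)
    gets ~8.3 ms, so brightness stays the same as G-Sync varies the refresh rate."""
    return DUTY * frame_time_s
```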
A number of users of the XL2420TE have mentioned it has better colors than both the XL2420T and the VG248QE. How they compare to previous panels is an interesting question, but I thought I'd mention relative differences between current 24" panels, too.

The 2nd-gen 120 Hz TN panels (Asus VG236H, Planar SA2311W, Acer GN245, Samsung Series 7 & 9) all have much better colors than the current 144 Hz options.
It would be nice, but I don't suppose it is doable without running into nonlinearity issues from varying the voltage applied to the LEDs. It could be countered with calibration, though, and because we know that adds to the cost, it makes it highly unlikely to happen.
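For what it's worth, the usual calibration counter to that nonlinearity is a measured lookup table that gets inverted at runtime. A hedged sketch follows; the calibration points are invented for illustration and would in practice come from measuring the actual LED string.

```python
from bisect import bisect_left

# Hypothetical measured points: (LED drive level 0..1, relative luminance 0..1).
# A real table would come from calibrating the actual LED string.
CAL = [(0.0, 0.0), (0.25, 0.18), (0.5, 0.42), (0.75, 0.70), (1.0, 1.0)]

def drive_for_luminance(target):
    """Invert the measured curve: return the drive level that should produce the
    target luminance, linearly interpolating between calibration points."""
    lums = [lum for _, lum in CAL]
    i = bisect_left(lums, target)
    if i == 0:
        return CAL[0][0]
    if i >= len(CAL):
        return CAL[-1][0]
    (d0, l0), (d1, l1) = CAL[i - 1], CAL[i]
    return d0 + (d1 - d0) * (target - l0) / (l1 - l0)
```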
You mean AMD is still worrying about crap like this and Nvidia is increasing costs with proprietary stuff?

And while AMD is increasing performance with their proprietary stuff, Nvidia still worries about crap like this...
Damn, they're hitting rock bottom...
Increasing costs? As if anyone would bother paying more for an Nvidia card that performs worse than an AMD card.

You mean AMD is still worrying about crap like this and Nvidia is increasing costs with proprietary stuff?
Because AMD still hasn't solved frame pacing completely (multi-monitor, for example), and Nvidia seems to keep making proprietary stuff like G-Sync and Project Shield.
Something that will badly need this is the new consoles.
Too bad they aren't using Nvidia tech.
Most latency is due to televisions having motion compensation, dynamic contrast, and other image-analysis processing that introduces the delays.
And while AMD is increasing performance with their proprietary stuff, Nvidia still worries about crap like this...
Damn, they're hitting rock bottom...
Setting a cap above the refresh frequency with v-sync on doesn't do anything at all.

But there's already a fix for this: cap fps at 60~61 and use vsync. No input lag, and vsync is still on.
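For readers who want to experiment, an external frame cap is nothing more than a timed sleep in the render loop. Below is a minimal sketch of capping to the ~60-61 fps figure mentioned above; the function names are hypothetical, and this makes no claim about how the cap interacts with vsync buffering or input lag.

```python
import time

TARGET_FPS = 60.5            # the ~60-61 fps cap suggested above (assumes a 60 Hz panel)
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(render_frame):
    """Call render_frame() no faster than TARGET_FPS by sleeping off the remainder."""
    deadline = time.perf_counter()
    while True:
        render_frame()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()   # fell behind: don't try to catch up
```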
Then you're going to have to explain why my 19" Viewsonic CRT running at 120 Hz doesn't murder my eyes, but all three LightBoost monitors I've played with in person have had obvious strobe...

No buddy, I do understand how they work. I'm a long time CRT guy.
lol. It's no fun when you are that obvious.

Increasing costs? As if anyone would bother paying more for an Nvidia card that performs worse than an AMD card.
Unless AMD is completely retarded and relies on Mantle to compete, it's gonna completely obliterate Nvidia in the next years...
Performance over gimmicks...
Why would you say such a thing? You clearly have no idea how things work. Read up a bit more and you will see why G-Sync should greatly improve the gaming experience. Of course, based on comments I see all the time, most people can't even figure out the simple concept of adaptive vsync, so there's probably no hope for them understanding G-Sync.

But there's already a fix for this: cap fps at 60~61 and use vsync. No input lag, and vsync is still on.
It's just sad to see how many people don't realize the benefits of this new technology.
It's either that they are not able to perceive the stutter/tearing, or they are so ...... that they will not embrace, or at least give the benefit of the doubt to, a great new concept.
So this raises the following question for me:
Are there really a lot of people who can't tell the difference in a video game when there is stutter/tearing?