G-Sync First Impressions from TR Forums

I don't see how a higher pixel response time is going to make G-Sync not work all of a sudden.
G-Sync will still work, the panel just won't be able to keep up with the changes correctly. I also mentioned more than just blurring... First you'll lose contrast ratio and color accuracy on anything in motion, then smearing will start to appear.

There's a reason there aren't any 144 Hz IPS screens on the market...
 
G-Sync doesn't require a 144 Hz display.

Uh... sure about that?

G-Sync does, in fact, rely on high refresh rates in order to operate optimally. Not sure where you got the idea that it doesn't...

The goal of G-Sync is to refresh the monitor every time a new frame is ready. New frames can be ready at ANY time. Even if you average 60 frames per second over the course of 1 second, those 60 frames could have been delivered with any number of delays between them.

There might have been an 8ms gap between some of those frames, a 16ms gap between others, and a 24ms gap between a few stragglers. A monitor that can only handle 60Hz max can only service requests every 16ms... so what happens to the frames that only had an 8ms gap? A gap that small between frames requires a refresh rate equivalent to 120Hz... even though the average framerate for that single second was 60, some frames needed a MUCH higher refresh rate to be delivered EXACTLY as they were ready.

If you have a monitor that can't handle these fast on-demand refreshes, a frame either has to be skipped entirely (which is bad) or has to wait until the next time the display can refresh to be displayed (which lands you with the same input-lag issues as V-Sync)
With the above taken into consideration, how would G-Sync not require a display capable of high refresh rates?

As far as I can tell, a display that maxes out at 60Hz would seriously hamper G-Sync.
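To put rough numbers on it, here's a quick sketch (the frame gaps are made up for illustration, and the model is simplified; treat it as a back-of-the-envelope, not a description of the actual G-Sync module):

```python
# Sketch: uneven frame gaps vs. a fixed minimum refresh interval.
# Made-up numbers, not measurements from real hardware.

MIN_REFRESH_MS = 1000 / 60  # 60 Hz panel: one refresh every ~16.6 ms

# "Average 60 FPS" doesn't mean even 16.6 ms gaps; here's a mixed second:
frame_gaps_ms = [8, 8, 24, 16, 8, 24, 16, 16]

display_ready_at = 0.0  # earliest time the panel can start its next refresh
now = 0.0

for gap in frame_gaps_ms:
    now += gap  # a new frame finished rendering at this moment
    wait = max(0.0, display_ready_at - now)  # panel still busy? frame waits
    display_ready_at = now + wait + MIN_REFRESH_MS
    print(f"frame ready at {now:6.1f} ms, displayed {wait:4.1f} ms late")
```

Set MIN_REFRESH_MS to 1000 / 144 (~6.9ms) and every frame in that list goes out with zero wait.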
 
G-Sync does, in fact, rely on high refresh rates in order to operate optimally. Not sure where you got the idea that it doesn't...

The goal of G-Sync is to refresh the monitor every time a new frame is ready. New frames can be ready at ANY time. Even if you average 60 frames per second over the course of 1 second, those 60 frames could have been delivered with any number of delays between them.

There might have been an 8ms gap between some of those frames, a 16ms gap between others, and a 24ms gap between a few stragglers. A monitor that can only handle 60Hz max can only service requests every 16ms... so what happens to the frames that only had an 8ms gap? A gap that small between frames requires a refresh rate equivalent to 120Hz... even though the average framerate for that single second was 60, some frames needed a MUCH higher refresh rate to be delivered EXACTLY as they were ready.

If you have a monitor that can't handle these fast on-demand refreshes, a frame either has to be skipped entirely (which is bad) or has to wait until the next time the display can refresh to be displayed (which lands you with the same input-lag issues as V-Sync)

You have the basics down, but you're not understanding how G-Sync sends frames to the display. There's a reason it can scale down to 30 Hz/30 FPS. Are more frames better? Yes, of course. But a high frame and refresh rate is NOT necessarily optimal for G-Sync. Anything above 30 Hz is fine, which is the entire point of the technology.
 
Don't be obtuse. 144 Hz is a "high refresh rate", but it is not the only high refresh rate.

Agreed! There's also the fact that nvidia specifically stated that they are working on making g-sync available with 1440p and higher panels. They are working on 4k panels as well.

That wouldn't exactly make sense if G-Sync didn't offer a substantial benefit for such things. The Asus Lightboost panel is simply the first because Asus stepped up to the plate first, and it's actually a very popular gaming monitor. In fact, this is completely anecdotal, but I've noticed among pure gamers that it is probably *the* second most popular PC gaming screen, right behind the BenQ 24-incher that also has Lightboost. Browsing some of the hardcore gamers who stream on Twitch, that Asus panel is more or less ubiquitous among streamers. It's also highly popular for pure PC gaming on many forums, such as OCN and here. Now, I personally prefer high-resolution IPS, but I think the VG248QE being the first panel to offer G-Sync isn't related to the refresh rate. It's related to the fact that the screen is massively popular among PC gamers.

That doesn't mean that g-sync won't benefit higher resolution IPS panels just as well. IMO anyway.
 
You have the basics down, but you're not understanding how G-Sync sends frames to the display. There's a reason it can scale down to 30 Hz/30 FPS. Are more frames better? Yes, of course. But a high frame and refresh rate is NOT necessarily optimal for G-Sync. Anything above 30 Hz is fine, which is the entire point of the technology.
I never said anything about low framerates being an issue, so I'm not sure what you're getting at here... It appears as though you skipped over a huge chunk of my post. Just because you averaged 30 FPS over 1 second doesn't mean there was a 33.3ms gap between all 30 of those frames. Just like averaging 60 FPS over the course of 1 second doesn't mean there was a 16.6ms gap between all 60 of those frames.

G-Sync's normal mode of operation is to refresh the display exactly as a frame is delivered, perfectly in-sync. This works great as long as the gap between frames is no shorter than the minimum refresh interval of the panel in question.

Here's the thing, though (the thing you missed). If a new frame is ready SOONER than 16.6ms after the previous one, you immediately run into trouble on a monitor that maxes out at 60 Hz.


Let's say a frame was suddenly ready only 7ms after the previous one. The monitor is unable to comply; it still has another 9.6ms to wait before it can reliably refresh the panel again (it's still waiting for the previous frame to fully draw). Two things can happen at this point:
1. The frame has to sit in a buffer and wait 9.6ms for the display to be ready again. The user sees old data.
2. Yet another frame is rendered before the display is ready. The previous frame is dropped and the newest frame is sent to the display.

Worst-case-scenario for a 60Hz monitor?
1. A delay of ~16ms between a frame being ready and the frame making it to the display.
2. Multiple dropped frames (frames which would have been at least partially displayed on the screen if both G-Sync and V-Sync were disabled). The user is stuck staring at old data for up to 16.6ms instead of getting partial new data + tearing.


So, what happens on a 144Hz-capable panel in the same scenario? A panel that can be updated every 7ms?
1. The monitor refreshes on-demand, as usual.

Worst-case-scenario for a 144Hz monitor? A delay of ~7ms between the frame being ready and the frame making it to the monitor. This MORE THAN cuts the worst-case-scenario delay in half. This scenario will also be encountered far less often due to the increased number of chances for a given frame to have a refresh ready and waiting for it.
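If it helps, the single-frame case above boils down to one line of arithmetic. A minimal sketch (illustrative only; it assumes the panel began a refresh the instant the previous frame arrived):

```python
# How long a frame sits in the buffer if it arrives sooner than the
# panel's minimum refresh interval allows (illustrative sketch).

def gsync_wait_ms(frame_gap_ms: float, panel_hz: float) -> float:
    min_interval = 1000 / panel_hz
    return max(0.0, min_interval - frame_gap_ms)

# A frame ready only 7 ms after the previous one:
print(gsync_wait_ms(7, 60))   # ~9.7 ms of added delay on a 60 Hz panel
print(gsync_wait_ms(7, 144))  # 0.0 -- a 144 Hz panel refreshes on demand
```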

Don't be obtuse. 144 Hz is a "high refresh rate", but it is not the only high refresh rate.
Uh...ok? I never said 144Hz was the only high refresh rate.

How do active 3D TVs work at 720p 60fps again? Most of which use an equally slow VA panel, if I'm not mistaken.
Can you frame your question more clearly? I'm not sure what you're asking.

I don't see how 60fps content (requiring the display be updated every 16.6ms) would cause a problem for a slow VA panel. :confused:

Sorry, not overly familiar with active 3D TVs. Don't know how anyone uses them with the horrible flicker...
 
I never said anything about low framerates being an issue, so I'm not sure what you're getting at here...

You are discounting G-Sync at "low" refresh rates like 60 Hz, when G-Sync shines at lower refresh rates. There is no problem with G-Sync at 60 Hz. I'd even go so far as to say it's ideal.
 
Agreed! There's also the fact that nvidia specifically stated that they are working on making g-sync available with 1440p and higher panels. They are working on 4k panels as well.
Just an FYI, the first 4K panel with G-Sync is a 144Hz display as well:

http://techreport.com/news/25854/philips-intros-4k-and-g-sync-monitors

You are discounting G-Sync at "low" refresh rates like 60 Hz, when G-Sync shines at lower refresh rates. There is no problem with G-Sync at 60 Hz. I'd even go so far as to say it's ideal.
No, I am not. Please re-read my post.

I said there are issues with using G-Sync on displays that MAX OUT at 60Hz, because the monitor will be unable to service requests faster than every 16.6ms.

I've explained this point multiple times now, and elaborated on it at length...
 
No, I am not. Please re-read my post.

I did re-read your post. Specifically, this one:

I'd like to see this in an IPS monitor as well, but IPS seems to have trouble with high refresh rates.

The fact that you think G-Sync is going to have trouble in an IPS display due to IPS having problems with high refresh rates tells me you don't really understand what G-Sync is doing. It has very little to do with refresh rates. It could plausibly be described as independent of refresh rate above 30 Hz.
 

Take your post and s/16.6/8.3/ and the same logic now explains how 120hz is not fast enough.

Yeah, if frames come faster than the panel's minimum refresh time, then you have to wait to display them. Doesn't matter what the panel's minimum refresh time is; nothing special about 60Hz or 120Hz.

In general, frames coming too fast is also not the big problem. Frames coming too slowly is the bigger problem. You either have to display the previous frame twice or tear while displaying the late frame. Now, instead of having to make this choice after 16.6ms, we can wait till 18ms to display the new frame w/o tearing. If it comes too fast, e.g. after 14ms, then yeah, we wait 2.6ms and display it. But that is nothing new.

In general, though, a panel's minimum refresh time is not that high. I'd almost say every panel can be refreshed faster than 60Hz. My IPS is fine up to 120Hz. The biggest reason we won't be seeing tons of 60Hz monitors w/ G-Sync is that the panel is rarely what holds the refresh rate back. Bandwidth is going to be a greater issue at higher resolutions, though, so it's more likely we'll see 60Hz 4K monitors w/ G-Sync.
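Back-of-the-envelope on the bandwidth point (raw pixel data only, ignoring blanking and protocol overhead, so the real numbers are somewhat higher):

```python
# Raw pixel-data rate for a given mode, ignoring blanking overhead.
def gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(f"1080p @ 144 Hz: {gbps(1920, 1080, 144):.1f} Gbps")  # ~7.2 Gbps
print(f"4K    @  60 Hz: {gbps(3840, 2160, 60):.1f} Gbps")   # ~11.9 Gbps

# DisplayPort 1.2 (HBR2) carries roughly 17.3 Gbps of usable data, so
# 4K60 already eats most of the link; 4K at 120/144 Hz wouldn't fit.
```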
 
Anand just posted an article about AMD's 'FreeSync', which purports to take advantage of the V-Blank stuff already built into DisplayPort.

I'm not sure if it's the same thing, but it is nice to know AMD isn't resting on their laurels!
 
The fact that you think G-Sync is going to have trouble in an IPS display due to IPS having problems with high refresh rates tells me you don't really understand what G-Sync is doing.
I understand what it's doing just fine. I've explained the problem I'm pointing out at length.

I quite clearly stated what G-Sync does. It attempts to refresh the display exactly in-sync with frame delivery. What you don't seem to understand is that there IS an upper limit on how often you can refresh a display (even a G-Sync display), and it's a lot easier to smack into that limit when the display maxes out at 60Hz (the soonest it can be updated after a frame is sent is 16.6ms) than when the display maxes out at 144Hz (the soonest it can be updated after a frame is sent is 7ms).

IPS displays do not handle fast transitions well; they require more time than TN to get their pixels from a given starting value to a given requested value. You don't see 144Hz IPS monitors because the panel can't keep up without artifacts. Driving an IPS monitor that fast leads to reduced contrast and smearing when things are in motion. Attempting to overdrive an IPS panel to compensate for this leads to overshoot and shorter panel lifespan.

It has very little to do with refresh rates. It could plausibly be described as independent of refresh rate above 30 Hz.
You're not listening...

If the display is incapable of refreshing faster than 60Hz (that's once every 16.6ms, which I've mentioned before), then G-Sync will run into trouble when a frame is ready SOONER than 16.6ms.

I don't know what you don't understand about the above. It's extremely simple... I even explained the consequences of this problem in post #128 of this thread. I'll spell it out as clearly as I possibly can.

With G-Sync, a 144Hz-max monitor must wait 7ms (at minimum), and can wait as long as 33ms (at maximum). If a frame is ready at any point between 7ms and 33ms after the previous frame was delivered, the display can be refreshed on-demand and the frame is G-Synced.
With G-Sync, a 60Hz-max monitor must wait 16ms (at minimum), and can wait as long as 33ms (at maximum). If a frame is ready at any point between 16ms and 33ms after the previous frame was delivered, the display can be refreshed on-demand and the frame is G-Synced.

In the latter case, if a frame is ready sooner than 16ms, G-Sync has to sit and wait to deliver it. The monitor is still busy drawing the previous frame, which must be completed before another can be delivered (unless you want tearing). This adds delay.
It might even have to wait so long that yet ANOTHER frame is ready before the monitor can be refreshed again. This leads to dropped frames.

A monitor that can be updated more quickly reduces both of the above problems.

Edit: And just to reiterate, this isn't just a problem at high framerates. You can bump into this limitation even if you're averaging a solid 60 FPS, because some of those frames will be delivered faster than 16ms (which a 60Hz-max monitor obviously cannot directly handle).
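Here's the window I keep describing, in sketch form (the ~33.3ms ceiling is the 30Hz lower bound; the exact behavior at the edges is my assumption, not something Nvidia has documented):

```python
# A frame can be G-Synced on-demand only if its gap from the previous
# refresh falls inside [panel's min refresh interval, ~33.3 ms].
def in_gsync_window(gap_ms: float, panel_hz: float, min_hz: float = 30.0) -> bool:
    return (1000 / panel_hz) <= gap_ms <= (1000 / min_hz)

for gap in (5, 10, 20, 33):
    print(f"{gap:2d} ms gap -> 60Hz-max: {in_gsync_window(gap, 60)}, "
          f"144Hz-max: {in_gsync_window(gap, 144)}")
# A 10 ms gap misses the window on the 60Hz panel but fits on the 144Hz one.
```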

Take your post and s/16.6/8.3/ and the same logic now explains how 120hz is not fast enough.
Already covered this. Moving up to 120Hz or 144Hz more than cuts the worst-case scenario conditions in half.

Not only can the display service frames with smaller gaps between them, but the higher potential refresh rate means there are more opportunities in total for a frame to be displayed before it has to be skipped because a newer one has already been rendered before the display was ready.
 
If the display is incapable of refreshing faster than 120Hz (that's once every 8.3ms, which I've mentioned before), then G-Sync will run into trouble when a frame is ready SOONER than 8.3ms.

I don't know what you don't understand about the above. It's extremely simple... I even explained the consequences of this problem.

Fixed.
 
So in theory, if you already have a built-in high-end scaler and LUT table and are using DisplayPort, is it possible this could be done in software without having to void the warranty on your monitor? There is no way in hell I'm opening up my monitor, but if they just work with NEC to get access to the tech that is already in there, that sounds like the best of both worlds. This won't work for everyone, since only TVs and workstation monitors come with scalers and LUT tables.
 
Yes, 120Hz pretty much fixes the problem by cutting the worst-case scenario in half. What's your point?

120Hz-max = half the potential delay of 60Hz-max if a frame is delivered faster than 16ms.
120Hz max = twice as many opportunities for a potential refresh to be ready for any frame that's about to be completed.

This is compounded by the fact that it's FAR less likely that a frame will be ready sooner than 8.3ms than it is for a frame to be ready sooner than 16ms. More frames will fall within bounds on a 120Hz monitor.
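A crude simulation makes the point (the jitter distribution here is a pure assumption; real frame times aren't Gaussian, but the shape of the result holds):

```python
# Rough simulation: how often frames beat each panel's minimum refresh
# interval while averaging ~60 FPS. Assumed jitter, not real game data.
import random

random.seed(1)
frame_times = [random.gauss(16.6, 4.0) for _ in range(10_000)]  # ~60 FPS avg

under_16_6 = sum(t < 16.6 for t in frame_times) / len(frame_times)
under_8_3 = sum(t < 8.3 for t in frame_times) / len(frame_times)

print(f"faster than a 60 Hz panel allows:  {under_16_6:.0%}")  # roughly half
print(f"faster than a 120 Hz panel allows: {under_8_3:.0%}")   # a few percent
```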
 
No, I proved 120Hz was not fast enough. Perhaps we should try 240Hz. :p

The worst-case scenario is if a frame takes 35ms. That is a problem. A frame taking 14ms and having to wait 2.6ms to display it is not a problem. It will happen less often if the panel supports higher refresh rates, but it's not really a problem in the first place.
 
No, I proved 120Hz was not fast enough. Perhaps we should try 240Hz. :p
How is 120Hz not fast enough? I mean, sure, 240Hz would reduce the problem still further, but you start seeing minimal gains after 120 / 144Hz because frames simply never get delivered that fast in most scenarios.

It allows many more frames to fall within the bounds of G-Sync, without having to wait for the display to finish drawing a previous frame, than a display that maxes out at 60Hz does.

If you were playing a game that averages 60 FPS, a large portion of frames would be forced to wait for the display if it were only capable of being updated once every 16+ ms.
The same scenario on a monitor that can refresh as often as every 8.3ms wouldn't have much of a problem, as it's highly unlikely any frames were delivered faster than that while averaging 60 FPS.
 
Anand just posted an article about AMD's 'FreeSync', which purports to take advantage of the V-Blank stuff already built into DisplayPort.

I'm not sure if it's the same thing, but it is nice to know AMD isn't resting on their laurels!

The chances of this actually making it to market? Being that it's AMD? Zero.
 
The chances of this actually making it to market? Being that it's AMD? Zero.

No one knows this answer, but AMD is going to be a lot more competitive with Nvidia than ever before. That said, I'd be more wary of thinking they aren't a threat to NV and all of their proprietary shit. If anything, AMD wants a lot of features free and compatible with everyone's hardware. It costs more money to be proprietary, and last I looked, Nvidia overprices everything.

Seeing as you like being a cheerleader for the green team, at least be smart and buy NV stock... This way you can actually have a legit excuse for constantly crapping all over AMD every chance you get. Being an unpaid cheerleader just makes everyone laugh at you... And believe me, a lot of people laugh at you and the rest of your minions.
 
No one knows this answer, but AMD is going to be a lot more competitive with Nvidia than ever before. That said, I'd be more wary of thinking they aren't a threat to NV and all of their proprietary shit. If anything, AMD wants a lot of features free and compatible with everyone's hardware. It costs more money to be proprietary, and last I looked, Nvidia overprices everything.

Seeing as you like being a cheerleader for the green team, at least be smart and buy NV stock... This way you can actually have a legit excuse for constantly crapping all over AMD every chance you get. Being an unpaid cheerleader just makes everyone laugh at you... And believe me, a lot of people laugh at you and the rest of your minions.

Pot, kettle, black, good sir, whatever you say. I'm a cheerleader? You're the AMD cheerleader. I don't give any fucks about what you or anyone else thinks. I just tell it like it is based on products that I've owned and purchased. I used AMD GPUs plenty over the years, and I'm willing to bet good money that I gave AMD more money in the past 5 years than you did. I know what my experience tells me in terms of who makes a better product. Who delivers on their promises. Meanwhile, I had 7970 CF for nearly a year dealing with bullshit continually. If I'm supposed to keep my mouth shut about that so as not to hurt your feelings, whatever. You can fuck right off. :cool: But by all means, call me a fanboy. Whatever you say. ;) I just know what product is better, and I know which company delivers on promises, and they now get my business. By the way, you're in the Nvidia forum now. You should go back to fanboying for your precious AMD in the AMD forum.

You know who I laugh at? The idiots who keep buying into AMD's promises. Oh hey, yeah, they'll fix frame pacing on 7970 CF DX9. Eyefinity. Here we are nearly 3 years later. What happened to that promise? How many beta drivers resulted in me rebooting to a perpetual black screen? 5 months straight that The Witcher 2 crashed in crossfire. Oh hey, they fixed it 5 months after I submitted a support ticket. The tickets I submitted for non-functioning crossfire in Ubi games. "We're working on it." That was always the answer. Only they NEVER fixed their shit. I could go on here; there are SO MANY problems I had with CF 7970 in 2012 that I don't even feel bothered to list them here. I'm not a miner, so I don't give a flying fuck about mining - I want a functioning PC GAMING setup. And on that note, I grew tired of their bullshit. Grew tired of their promises that never transpired. You can keep fanboying for AMD and giving them your money, I don't give a flying fuck. I would suggest getting AMD stock, but with their financial situation? Good luck with that shit. :D You can laugh all you want. I laugh at those who buy into AMD's bullshit and listen to a goddamn word they say. I know better now. Listen to nothing they say. LOL.

You know what is even more goddamn hilarious? The fact that the AT article plainly states that desktop panels don't support variable vblank. AMD has to talk to panel manufacturers. But will they? According to the AT article, they have no plans or time frame for this to come to market. They don't know when or IF it's coming out. How many panel manufacturers did AMD consult with? Not a single goddamn one. But you took their marketing bullshit at face value. You keep on doing that. I'll bet 6 months from now that FreeSync will be nowhere in sight. I do hope that AMD proves me wrong, but history says AMD is full of shit and always has been.

You laugh at me all you want, while I laugh at you for buying into AMD's marketing bullshit. I hope I'm wrong on FreeSync, but like I said, history on AMD delivering on promises says they won't deliver. I used to listen to their bullshit back when I bought AMD hardware. I now know better. But hey, at least there are suckers who will buy their shit and listen to their bullshit. Like you, apparently. There's a sucker born every minute. I bet you get some fucking awesome hash rates on your AMD gear, though! Who needs PC gaming when you can get a good hashrate. Who needs a product that actually excels at games, with working features and continual feature updates, when you can just get the inferior card with a better hashrate. Whatever.
 
Can we stop with the AMD/Nvidia bs? If this starts again, and the thread goes super off-topic (which it's probably going to do), then congratulations on yet another thread being closed.

10-4, I have no issues with that. I do get rather annoyed at a fanboy throwing the fanboy card at me when I've given AMD a ton of money for their products over the years. But you know how I feel about that. ;)

Nonetheless, I'm all done on that topic. Let's talk G-Sync again.
 
10-4, I have no issues with that. I do get rather annoyed at a fanboy throwing the fanboy card at me when I've given AMD a ton of money for their products over the years. But you know how I feel about that. ;)

Nonetheless, I'm all done on that topic. Let's talk G-Sync again.

Gladly. :) I can't wait to see what the future has in store. G-Sync, perfect motion clarity, and good color accuracy are three things that I'd love to see in one monitor.
 
Ugh, $200... Guess I have a couple of days to decide whether to go with this or save that money and put it toward one of the new monitors coming in Q2.
 
The chances of this actually making it to market? Being that it's AMD? Zero.

Based on the article, 'FreeSync' worked by enabling options (or some such) in AMD's Catalyst drivers that aren't available in the release versions, but otherwise required no actual coding.

Now, I don't expect that monitors will be supporting 'FreeSync' anytime soon. Rather, the technology already built into the DisplayPort spec (and mentioned heavily in this thread) will allow AMD to support G-Sync displays fully with very little effort.
 
Someone's a cynic. :D
AMD doesn't have a great track record when it comes to monitor-centric tech. They mismanaged HD3D (their 3D monitor tech) pretty badly.

Instead of writing universal 3D support into their driver that monitors could standardize on, AMD only went as far as providing an API... the idea being that 3rd parties could plug in their own 3D driver.

This caused some pretty severe confusion. Simply buying an AMD card with HD3D support and pairing it with a monitor with HD3D support didn't actually get you working 3D. You still needed middleware like iZ3D or TriDef (both of which were/are NOT free and must be purchased in addition to the graphics card and monitor). Currently, TriDef is the only middleware provider still in business...

AMD set up some licensing deals to get the limited-functionality versions of the middleware into the hands of users for free, but it still looks like a pretty poor solution compared to plugging a 3D Vision monitor into an Nvidia card.
 
No, I proved 120Hz was not fast enough. Perhaps we should try 240Hz. :p

The worst-case scenario is if a frame takes 35ms. That is a problem. A frame taking 14ms and having to wait 2.6ms to display it is not a problem. It will happen less often if the panel supports higher refresh rates, but it's not really a problem in the first place.

This isn't a problem for any of the refresh rates (30/60/120/144) being discussed. With G-Sync, all of them can deliver frames every 35ms without issue.

Unknown-One is spot on. If one of those frames comes in at a 28ms render, the 30Hz monitor cannot push it at render, and has to wait 5.3ms before it can refresh. 60Hz lets you render anything above 16.6ms without issue; 120Hz does 8.3ms and above without issue.

Generally you will be able to crank up your settings to bump up these quick frame renders, to keep the display state more in sync with the game state.

All of these monitors can handle very low frame rates with seemingly identical performance.

240Hz with G-Sync might be great; it will be a while before we know if it offers any visual improvement.
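Same math in sketch form, for anyone who wants to plug in other numbers (illustrative only):

```python
# Wait time for a just-rendered frame that beats the panel's minimum
# refresh interval; 0.0 means it is displayed the moment it renders.
def wait_ms(frame_time_ms: float, panel_hz: float) -> float:
    return max(0.0, 1000 / panel_hz - frame_time_ms)

for hz in (30, 60, 120, 144):
    print(f"{hz:3d} Hz panel, 28 ms frame: waits {wait_ms(28, hz):.1f} ms")
# Only the 30 Hz panel makes the 28 ms frame wait (~5.3 ms).
```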
 
Except it's not necessarily bringing any of CRT's good qualities to the table...

Lightboost causes severe flicker, something that bothers a lot of people. I personally can't stand monitors that have PWM flicker, and feel that it's the sign of cheap low-quality monitors that couldn't afford a proper dimming circuit.

And then there's the intentional stroboscopic effect, which DESTROYS persistence-of-vision. This will make motion on low-framerate content appear LESS smooth than a normal monitor, as you end up seeing "a series of stills" rather than fluid movement. While this effect does make it easier to do things like read text that's being rapidly panned across the display, it will make the motion of said text far less fluid (again, that "series of stills" effect).


The display doesn't need to operate below 30Hz; just run it double-time when the framerate is too low...

Nothing wrong with running the monitor as follows. No issues with decay, and sub-30-FPS works:
30 FPS / 60 Hz
29 FPS / 58 Hz
28 FPS / 56 Hz
27 FPS / 54 Hz
26 FPS / 52 Hz
25 FPS / 50 Hz
etc, etc, etc...
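Something like this, in sketch form (an assumed strategy on my part; I don't know what the G-Sync module actually does below 30Hz):

```python
# Assumed frame-doubling strategy: scan each frame twice (or more)
# whenever the frame rate falls below the panel's minimum refresh rate.
def refresh_hz_for(fps: float, panel_min_hz: float = 30.0) -> float:
    hz = fps
    while hz < panel_min_hz:
        hz *= 2  # redraw the same frame again to avoid pixel decay
    return hz

for fps in (45, 30, 29, 25, 14):
    print(f"{fps} FPS -> refresh at {refresh_hz_for(fps):g} Hz")
# 29 FPS scans at 58 Hz, 25 FPS at 50 Hz, and 14 FPS doubles twice to 56 Hz.
```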


If G-Sync can't intelligently switch to double-time (knowing full well there are decay issues when using low refresh rates), then that seems like a G-Sync limitation.

Do research then. Lightboost is, first and foremost, a gaming feature. If your eyes can't handle it, don't complain. I personally think it's close enough for my 2D/3D needs, as do the BB crew and many others.
 
$200 for the add-in kit is a little overpriced, but honestly the effect is probably worth it.

Waiting for reviews on the XL2420G.
 
This isn't a problem for any of the refresh rates (30/60/120/144) being discussed. With G-Sync, all of them can deliver frames every 35ms without issue.

I used 35ms because none of them fix this. 35ms is <30Hz, so after 33.3ms with no new frame, the old frame has to be displayed again; then at 35ms the new frame shows up, but now we cannot show it for the panel's minimum refresh period minus 1.7ms. So on a 60Hz panel we have to wait 14.9ms to show it.

Unknown-One is spot on. If one of those frames comes in at a 28ms render, the 30Hz monitor cannot push it at render, and has to wait 5.3ms before it can refresh. 60Hz lets you render anything above 16.6ms without issue; 120Hz does 8.3ms and above without issue.

I've not said anything that disagrees with that. My claim is that frames showing up too early are not what degrades gameplay.

On any panel with no G-Sync, a frame must show up exactly on time, or else:
X ms early and there is X ms of delay before it is displayed. (2ms early on a 60Hz panel = 2ms delay)
X ms late and there is the panel's refresh period minus X ms of delay. (2ms late on a 60Hz panel = 14.6ms delay)

The former here is not that big a deal; it is the latter that affects gameplay.

G-Sync solves both of these, at least for a range. Instead of exactly on time, it has a range from the panel's minimum refresh period to 33.3ms where both issues are fixed. If a frame falls outside this range, we have the exact same problems again.

All a faster panel does is increase this range such that we avoid the former case. My claim is that the former case was never the issue that was causing all the problems. And certainly not to the point of claiming that G-Sync REQUIRES a 120/144Hz panel, as was said earlier.

You can compare these yourself: just compare VSYNC enabled vs. disabled in a game when it's always under 60fps and when it's always over 60fps. Ignoring screen tearing, just focus on how smooth the game feels. Enabling VSYNC when the game is always over 60fps has a much smaller effect than when the game is always under 60fps.
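The early/late asymmetry above, in sketch form (rounded intervals; the helper is hypothetical):

```python
# Delay before display when a frame misses a fixed 60 Hz refresh tick.
# offset_ms < 0 means the frame is early by that much; > 0 means late.
def no_sync_delay(offset_ms: float, panel_hz: float) -> float:
    interval = 1000 / panel_hz
    if offset_ms <= 0:
        return -offset_ms        # early: short wait for the current tick
    return interval - offset_ms  # late: wait almost a full extra tick

print(no_sync_delay(-2, 60))  # 2 ms early -> 2.0 ms delay
print(no_sync_delay(2, 60))   # 2 ms late  -> ~14.7 ms delay

# With G-Sync, both delays are zero for any frame gap inside
# [1000 / panel_hz, ~33.3] ms; outside that window the same math applies.
```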
 
Do research then.
Research on what, exactly? I already know how it works and pointed out exactly what I do not like about LightBoost's implementation.

Lightboost is, first and foremost, a gaming feature. If your eyes can't handle it, don't complain.
A feature that makes my monitor flicker and decreases the perceived fluidity of motion by doing its best to disable persistence-of-vision is "gaming centric"? You're effectively forcing your monitor to emulate a stroboscope.
Remember when I compared LightBoost to viewing "a series of stills" as opposed to fluid animation? The strobe causes this.

People shouldn't complain that it bothers their eyes? Nobody should try to come up with something better? :rolleyes: Get a clue.

I personally think it's close enough for my 2D/3D needs, as do the BB crew and many others.
Too bad I can see the flicker, which makes it worthless to me. As far as I'm concerned, Lightboost is more of an annoyance than anything else, and I'd much prefer a proper flicker-free dimming circuit be included in my monitors.
 