Variable Refresh Rate Monitors (G-Sync) --- Refresh Rate Varies While You Play!!!

Try again with VSYNC ON, or comparing to www.testufo.com
Yes, with vsync on, in every game and on every computer.
Your testufo site is very useful for determining which refresh rate is needed to create perfect smoothness for each individual human at each speed.

Using the testufo moving-photo test at 60fps@60hz vsynced, 240 pixels/s looks perfectly smooth to me. If the monitor had perfect quality, I wouldn't be able to tell it from a roll with the same image printed on it, mechanically rotated smoothly in front of my face.
At 360 pixels/s, that is no longer true. It doesn't stutter in an annoying way, but it is most certainly not flawless motion. In a blind test with the monitor vs the physical roll, I could tell them apart without a doubt.
At 480 pixels/s, the effect is about the same as 360 pixels/s, just a touch worse. It is still not bothersome. I can detect it, but I could live with this.
At 600 pixels/s however, the stuttering becomes annoying @60hz. It still looks ok while tracking the movement, but it looks choppy in an annoying fashion for objects I do not currently focus on (as in, it annoys the entire peripheral vision). Typical "real world" scenario: you aim at something, and an enemy comes running or driving across your field of vision from the side. CHOP-CHOP-CHOP-CHOP.
You focus on a person in a movie, while the camera pans, peripheral vision goes CHOP-CHOP-CHOP-CHOP.
This is at a speed much, much slower than an actual pan in a fps game.
You always focus on something or track something, the rest is peripheral vision that makes out the vast majority of what you see at any given time.

This is easy to test using http://testufo.com/#test=photo&photo=quebec.jpg&pps=600&pursuit=0 . Just focus on the dropdown box directly above the moving picture. Does the moving picture still look smooth when not actively tracked? Probably not, but if it does, raise the speed a bit.
This is because 60fps simply doesn't carry enough visual information for motion at this speed to look real/smooth.
Just raise the speed until the movement looks choppy in an annoying fashion. I'm certain every human can see this at some speed at 60hz. You can easily compare this to reality by holding a photo/whatever else next to the screen and moving it at the same speed.
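
If it helps, here's a trivial way to generate links for a whole sweep of speeds (just a sketch; the pps values below are only examples, and the other parameters are the same ones as in the link above):

[CODE]
# Print testufo.com links for a sweep of motion speeds (pixels per second),
# using the same photo-test parameters as the link above.
BASE = "http://testufo.com/#test=photo&photo=quebec.jpg&pursuit=0"

for pps in (240, 360, 480, 600, 960, 1920):   # example speeds only
    print(BASE + "&pps=" + str(pps))
[/CODE]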

To achieve smoothness at greater speeds, there are as far as I know two options:
1. Add artificial motion blur to mask the stutter (movies, tv, consoles (pc games optionally) usually do this). I don't want this.
2. Simply raise the refresh rate/fps.

Try this on a 120 or 144hz monitor too, and find the limit there as well. It seems to be proportional.
It should be possible to calculate which refresh rate is the maximum needed to make the fastest object each human can track look real.
I estimate that I can't track details moving much faster than 3840 pixels/s, as that speed really strains me, so let's just take that as my maximum.
If this is proportional, and 60hz was needed at 480 pixels/s (the level I felt was smooth, but could still tell from reality) to look smooth, then I would need 3840*60/480 = 480 fps/hz for the fastest object I can track to move smoothly.
To not be able to tell it from reality at all, quite a bit higher (960fps).
These numbers are based on such big pixel-speed steps that they are not really spot on, but I think they are reasonably close to the truth (let's say not more than 20-30% off).
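
For what it's worth, the arithmetic behind that estimate is trivial (a rough sketch only; the 480 and 3840 pixels/s figures are my own thresholds from above, and it assumes the relationship really is linear):

[CODE]
# Refresh rate needed so the fastest motion I can track looks as good as
# 480 px/s does at 60 Hz, assuming the required Hz scales linearly with speed.
smooth_speed  = 480    # px/s that looks smooth (but not "real") at 60 Hz
smooth_hz     = 60
fastest_track = 3840   # px/s, roughly the fastest motion I can still track

needed_hz = fastest_track * smooth_hz / smooth_speed
print(needed_hz)       # 480.0 Hz for "smooth"
print(needed_hz * 2)   # ~960 Hz to be indistinguishable from reality
[/CODE]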

The point is just that even synced, 30 fps is nowhere near smooth, and neither is 60fps, nor even 144fps for fast enough motion. If the framerate drops that much with gsync on during fast motion, I really think it will easily be detected as stutter (during fast motion).
If the framerate stays reasonably close i think it will work really well, and gsync in general seems like an awesome step forward.
 
On a slightly different note, will variable refresh rate implementations remove the cap on maximum framerate for a display? In other words, there's no longer any reason why a display couldn't be refreshed as fast as the connection bandwidth allows, but will G-Sync actually allow this? If it does, I find this exciting for display use in general and not just gaming, since 95% of the time I'm in 2D mode, not gaming, where the video card should have no trouble rendering at 300+fps. For a DP1.2 connection at 2560x1440, roughly 200fps should be sustainable except when the bottleneck lies elsewhere than the connection, and at 1920x1080 the limit is about 350fps.
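
Back-of-the-envelope numbers behind those figures (a sketch only: it assumes roughly 17.28 Gbit/s of usable DP1.2 bandwidth and 24-bit color, and ignores blanking/protocol overhead, so treat the results as upper bounds):

[CODE]
# Rough upper bound on refresh rate over DisplayPort 1.2,
# assuming ~17.28 Gbit/s usable bandwidth, 24 bpp, and no blanking overhead.
DP12_BANDWIDTH = 17.28e9   # bits per second
BPP = 24                   # bits per pixel

for w, h in [(2560, 1440), (1920, 1080)]:
    max_fps = DP12_BANDWIDTH / (w * h * BPP)
    print("%dx%d: ~%.0f fps" % (w, h, max_fps))   # ~195 fps and ~347 fps
[/CODE]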
 
Fixed-Framerate Video is INVALID proof.

However, you can "approximate" the feel by recording at a significantly higher framerate than the content being shown. E.g. in the demo they ran it at 45-60fps variable frame rate with vsync off/gsync on; if you recorded that at 1000fps and played back a slowed-down version, you would be able to see the difference.

An example of 120fps recording, here you can see the tearing on the vsync off vs the gsync monitor:
http://www.youtube.com/watch?v=NffTOnZFdVs#t=24
 
However, you can "approximate" the feel by recording at a significantly higher framerate than the content being shown. E.g. in the demo they ran it at 45-60fps variable frame rate with vsync off/gsync on; if you recorded that at 1000fps and played back a slowed-down version, you would be able to see the difference.

An example of 120fps recording, here you can see the tearing on the vsync off vs the gsync monitor:
http://www.youtube.com/watch?v=NffTOnZFdVs#t=24
True, high speed video helps. It's still not fully representative of what human eyes actually see in person, but it demos better than non-highspeed video. It still looks far more dramatic in person.

But yes, if you must show off video of G-SYNC, you need high speed video (if using fixed-framerate video)
 
The only benefit I see in G-Sync for people who do not play games is 48fps (HFR) cinema.
Other than that, who cares?

If a monitor overclocks to 72Hz (3x24p), that's all I need for movies and the Internet.
 
No, there are just nowhere near enough frames to do fast pans at 60fps. 60fps vsynced is fast enough for movie-speed movement/pans, but not for fps-gaming-speed pans, objects/players quickly moving past at an angle, etc.

Ok, I know what you mean now, but that is not really stuttering; call it judder or jitter (like the famous 24fps judder) and it will be easier to understand you. It's not really related to g-sync or anything though, it's just a reason we should keep pushing for higher hz on monitors (since yes, the human eye can perceive hundreds of frames per second, and definitely more than the 24 or 60 some would have us believe).

G-sync still remains an amazing breakthrough and step in the right direction, personally this is more for single player gaming with demanding games and video playback though, since I play competitive fps games with a frame rate of 100-120fps or more when possible and I'm not too sure what hz the first g-sync monitors will have. I mean I would still gladly buy a 60hz g-sync monitor but hopefully I won't have to. My only gripe is really that we'll probably only see TN panels for a while...
 
Both hardware and software are cooperating.

For example, the frame is pushed to the monitor the moment the Direct3D Present() API is called. Basically, the existing Direct3D software API now triggers delivery of the frame to the monitor if the monitor is currently waiting for a frame; nothing is forced to wait for the next scheduled refresh interval. Very close integration of the software with the hardware is necessary in order to support variable-refresh-rate monitors.
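
As a toy illustration of the difference (just my own sketch of the timing logic, not how the actual driver/monitor handshake is implemented): with a fixed 60Hz refresh a finished frame waits for the next scheduled refresh, while with variable refresh the scan-out starts as soon as Present() is called, provided the panel isn't still busy scanning out the previous frame.

[CODE]
import math

# Toy model: when does a frame actually start scanning out?
present_times = [0.0, 22.0, 31.0, 55.0]   # ms at which the game calls Present() (made-up values)

REFRESH_60HZ = 1000 / 60                  # 16.67 ms fixed refresh interval
SCANOUT_MS   = 8.0                        # assumed time the panel needs to scan out one frame

def fixed_vsync(t):
    # Frame waits for the next scheduled 60 Hz refresh boundary.
    return math.ceil(t / REFRESH_60HZ) * REFRESH_60HZ

busy_until = 0.0
for t in present_times:
    # Variable refresh: scan-out starts immediately, unless the previous refresh is still in flight.
    start = max(t, busy_until)
    busy_until = start + SCANOUT_MS
    print("Present at %5.1f ms -> fixed vsync shows it at %5.1f ms, variable refresh at %5.1f ms"
          % (t, fixed_vsync(t), start))
[/CODE]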

The post of mine you quoted was in response to "The solution to a problem that isn't a problem since it has a software solution?", which continued the insinuation that triple buffering does everything needed and there is no need for G-Sync hardware. Obviously it does require the hardware (as well as software), and your post explains how the two are integrated. :)
 
G-Sync is probably the biggest and best development in screens in many many years. Let's hope it expands from PC monitors into mobile devices and televisions!

I'll be buying a G-Sync monitor, but I'll be waiting for a VA panel most likely. I can't stand TN. And have yet to find an IPS with great contrast/blacks.
 
G-Sync is probably the biggest and best development in screens in many many years. Let's hope it expands from PC monitors into mobile devices and televisions!

I'll be buying a G-Sync monitor, but I'll be waiting for a VA panel most likely. I can't stand TN. And have yet to find an IPS with great contrast/blacks.

I really don't think you will ever see a good monitor with gsync.
I really doubt that Eizo, NEC, HP or Dell will do a "one graphics card only" monitor.
 
I really don't think you will ever see a good monitor with gsync.
I really doubt that Eizo, NEC, HP or Dell will do a "one graphics card only" monitor.

I would think people who don't use NVidia can still use a G-Sync enabled monitor; the G-Sync feature just won't be enabled for them? It would act as a standard monitor?

But the monitor makers will make what they can sell. And there are a lot of NVidia customers.... And a lot of new potential customers with G-Sync!
 
I would think people who don't use NVidia can still use a G-Sync enabled monitor; the G-Sync feature just won't be enabled for them? It would act as a standard monitor?

But the monitor makers will make what they can sell. And there are a lot of NVidia customers.... And a lot of new potential customers with G-Sync!

This technology could be awesome if implemented as a standard that works with every card, something like a new HDMI port that enables variable refresh rate on the monitor.
The way gsync works means that your monitor works with one graphics card.

What is the reason gsync will work on series 6 only?
Nvidia is creating too much fragmentation (FXAA, TXAA, G-SYNC). I am a green fanboy, but I'm annoyed with nvidia; if they continue this way my next card will be an ATI.
 
This technology could be awesome if implemented as a standard that works with every card, something like a new HDMI port that enables variable refresh rate on the monitor.
The way gsync works means that your monitor works with one graphics card.

What is the reason gsync will work on series 6 only?
Nvidia is creating too much fragmentation (FXAA, TXAA, G-SYNC). I am a green fanboy, but I'm annoyed with nvidia; if they continue this way my next card will be an ATI.

FXAA isn't really fragmentation, as it works on both nVidia and AMD cards. AMD is just as guilty of creating "fragmentation". AMD has TressFX (it works on nVidia, but the performance hit is severe), and now Mantle.

nVidia put the R&D into G-Sync, so absolutely they should benefit from it. There is a reason the tech uses DisplayPort, and I would assume that is one of the reasons it isn't supported on lower cards: they lack the newer DisplayPort.

Perhaps down the road nVidia will open it up to other vendors.
 
...
I'll be buying a G-Sync monitor, but I'll be waiting for a VA panel most likely. I can't stand TN. And have yet to find an IPS with great contrast/blacks.
...
G-sync still remains an amazing breakthrough and step in the right direction, personally this is more for single player gaming with demanding games and video playback though, since I play competitive fps games with a frame rate of 100-120fps or more when possible and I'm not too sure what hz the first g-sync monitors will have. I mean I would still gladly buy a 60hz g-sync monitor but hopefully I won't have to. My only gripe is really that we'll probably only see TN panels for a while...


Imo gaming at sub 100hz-120hz is vastly inferior. To date there are no non-TN panels intentionally manufactured for over 60hz input that I know of. The only ips option is an "overclocked" korean 2560x1440, whose resolution then lowers your fps considerably vs maintaining 100 to (optimally) 120fps+, unless you spend a lot more on your gpu budget or turn down the most demanding games' settings considerably. 1080p is still the sweet spot of GPU power vs cost for fairly robust enthusiast budgets on 120hz/144hz monitors, without going to extreme gpu budgets.

60hz maintaining 60fps has the worst blurring and shows half (or fewer) as many recent game-world state/action slices per second, which makes motion transitions much less defined and accurate. High-hz monitors at high fps have what I call "high definition motion" as opposed to higher definition resolution. Filling the screen with many more new views into the game world per second (not interpolated/fake frames) increases accuracy of movement, which can result in better timing and reaction time. Its benefits are very appreciable even outside of "hard core" competitive gaming, looking and feeling/flowing better aesthetically, and timing is always appreciated in any game. Combined with a much greater reduction in blur (50% or 60% less) of the entire viewport during FoV movement and of isolated high-speed moving objects/players (or blur elimination with lightboost 2d mode), this is a huge difference from 60hz.
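
To put rough numbers on the blur part (a sketch only: it assumes an ideal sample-and-hold panel, where perceived smear while eye-tracking is roughly frame persistence times tracking speed, and the 960 px/s tracking speed is just an example):

[CODE]
# Approximate eye-tracking smear on an ideal sample-and-hold display:
# smear (px) ~= frame persistence (s) * tracking speed (px/s).
TRACK_SPEED = 960   # px/s, example tracking/panning speed

for hz in (60, 120, 144):
    persistence_ms = 1000 / hz
    smear_px = TRACK_SPEED / hz
    print("%3d Hz: %4.1f ms persistence, ~%4.1f px of smear" % (hz, persistence_ms, smear_px))
# 120 Hz halves the 60 Hz smear (50% less); 144 Hz cuts it by ~58% (the "60% less" above).
# A strobed backlight (lightboost) shortens persistence further, largely independent of the refresh rate.
[/CODE]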

There are no non-TN panels manufactured with lightboost (3d vision) that I know of either, so you wouldn't have a 2D zero-blur gaming option in non-TNs, nor the lowest response times that the 120hz-144hz TNs have. Nvidia did mention releasing higher resolution monitors with g-sync boards in them next year, but I suspect those would just be similar to existing 2560x1440 and 4k monitors/tvs with a g-sync board added. Perhaps a 4k one would be VA like a 4k tv, though who knows; VA tends to have ghosting and/or trailing issues, with a few rare exceptions. Regardless, 60hz wouldn't interest me for 1st/3rd person cgi-world perspective gaming. I would never game at 60hz again on my main rig if I could help it, due to the much greater motion and animation definition combined with the great reduction or elimination of the constant full-viewport FoV movement smearing that high hz at high fps provides.

I agree that if you are keeping above your high hz setting (e.g. 100fps+, 120fps+, 133fps+, 144fps+) on your high-motion-definition, blur-reduced (or eliminated) tn gaming monitor, g-sync wouldn't be a huge gain, since you can already run without vsync with insignificant tearing. If g-sync affects zero-blur lightboost mode negatively, it wouldn't be a boon there either for those who already get zero-blur LB 2d gaming. I think g-sync will allow users content with inferior sub-100hz or 60hz style motion definition and blur to crank up their in-game graphics settings and suffer even lower fps, combined with considerable fps dips, without suffering the side effects of not using vsync (screen aberrations) nor the side effects of using vsync (input lag mainly). So I'm guessing a scenario such as someone with a 144hz monitor cranking up eye-candy could run it at 80fps average (and 80hz) and dynamically dip down to the 70s and 60s or lower throughout the game's more demanding scenes/action without vsync/non-vsync complications. When they are in a small room somewhere with nothing going on, perhaps their fps would shoot up over 100 on occasion and g-sync would raise the monitor back to 100hz. Another scenario would be a 2560x or especially 4k monitor/tv user suffering bad fps due to crippling resolutions, able to dynamically dip below their 60hz refresh rate (ewww) without vsync/non-vsync side effects. It is an interesting tech as an addition, since any game can have a min-fps sinkhole or two in its "graph", but not if it comes at the cost of eliminating other techs like lightboost (or 3d vision) - and it would be no substitute for the actual high-definition motion and blur reduction of adjusting your settings (and/or gpu budget) to maintain higher fps more consistently using a 120hz+ monitor instead.

The tradeoffs between desktop/app usage and gaming are so great that I keep a different monitor for each at my desk. I do enjoy a 2560x1440 ips for everything pc outside of games, and a VA based samsung "led" tv in living room (though I probably would have gone plasma there in hindsight for movies).
 
Imo gaming at sub 100hz-120hz is vastly inferior. To date there are no non-TN panels intentionally manufactured for over 60hz input that I know of. The only ips option is an "overclocked" korean 2560x1440, whose resolution then lowers your fps considerably vs maintaining 100 to (optimally) 120fps+, unless you spend a lot more on your gpu budget or turn down the most demanding games' settings considerably. 1080p is still the sweet spot of GPU power vs cost for fairly robust enthusiast budgets on 120hz/144hz monitors, without going to extreme gpu budgets.

60hz maintaining 60fps has the worst blurring and shows half (or fewer) as many recent game-world state/action slices per second, which makes motion transitions much less defined and accurate. High-hz monitors at high fps have what I call "high definition motion" as opposed to higher definition resolution. Filling the screen with many more new views into the game world per second (not interpolated/fake frames) increases accuracy of movement, which can result in better timing and reaction time. Its benefits are very appreciable even outside of "hard core" competitive gaming, looking and feeling/flowing better aesthetically, and timing is always appreciated in any game. Combined with a much greater reduction in blur (50% or 60% less) of the entire viewport during FoV movement and of isolated high-speed moving objects/players (or blur elimination with lightboost 2d mode), this is a huge difference from 60hz.

There are no non-TN panels manufactured with lightboost (3d vision) that I know of either, so you wouldn't have a 2D zero-blur gaming option in non-TNs, nor the lowest response times that the 120hz-144hz TNs have. Nvidia did mention releasing higher resolution monitors with g-sync boards in them next year, but I suspect those would just be similar to existing 2560x1440 and 4k monitors/tvs with a g-sync board added. Perhaps a 4k one would be VA like a 4k tv, though who knows; VA tends to have ghosting and/or trailing issues, with a few rare exceptions. Regardless, 60hz wouldn't interest me for 1st/3rd person cgi-world perspective gaming. I would never game at 60hz again on my main rig if I could help it, due to the much greater motion and animation definition combined with the great reduction or elimination of the constant full-viewport FoV movement smearing.

I agree that if you are keeping above your high hz setting (e.g. 100fps+, 120fps+, 133fps+, 144fps+) on your high-motion-definition, blur-reduced (or eliminated) tn gaming monitor, g-sync wouldn't be a huge gain, since you can already run without vsync with insignificant tearing. If g-sync affects zero-blur lightboost mode negatively, it wouldn't be a boon there either for those who already get zero-blur LB 2d gaming. I think g-sync will allow users content with inferior sub-100hz or 60hz style motion definition and blur to crank up their in-game graphics settings and suffer lower fps, combined with considerable fps dips, without suffering the side effects of not using vsync (screen aberrations) nor the side effects of using vsync (input lag mainly). So I'm guessing a scenario such as someone with a 144hz monitor cranking up eye-candy could run it at 80fps average (and 80hz) and dynamically dip down to the 70s and 60s or lower throughout the game's more demanding scenes/action without vsync/non-vsync complications. When they are in a small room somewhere with nothing going on, perhaps their fps would shoot up over 100 on occasion and g-sync would raise the monitor back to 100hz. Another scenario would be a 2560x or especially 4k monitor/tv user suffering bad fps due to crippling resolutions, able to dynamically dip below their 60hz refresh rate (ewww) without vsync/non-vsync side effects. It is an interesting tech as an addition, since any game can have a min-fps sinkhole or two in its "graph", but not if it comes at the cost of eliminating other techs like lightboost - and it would be no substitute for the actual high-definition motion and blur reduction of adjusting your settings or gpu budget to maintain higher fps more consistently instead.

The tradeoffs between desktop/app usage and gaming are so great that I keep a different monitor for each at my desk. I do enjoy a 2560x1440 ips for everything pc outside of games, and a VA based samsung "led" tv in living room (though I probably would have gone plasma there in hindsight for movies).

120hz can give you what you call "high definition motion", but gives you low definition in all the other areas.
TN panels are the worst panels you can buy on the market; VA or IPS gives outstanding images compared to TN, and I prefer outstanding images over higher definition motion.
 
I consider a consistently/constantly smeared entire viewport during FoV motion low definition.
.
http://www.blurbusters.com/faq/60vs120vslb/
.
High-definition motion and more recent action shown are additional benefits of high hz + high fps beyond the blur reduction/elimination - making 60hz not even close for gaming.
.
I use a higher resolution ips for desktop/still imagery.
 
I consider a consistently/constantly smeared entire viewport during FoV motion low definition.
.
http://www.blurbusters.com/faq/60vs120vslb/
.
High-definition motion and more recent action shown are additional benefits of high hz + high fps beyond the blur reduction/elimination - making 60hz not even close for gaming.
.
I use a higher resolution ips for desktop/still imagery.

Personally I prefer a fast IPS, or better yet a VA, to a TN; a question of taste.
 
Personally I prefer a fast IPS, or better yet a VA, to a TN; a question of taste.

If you want the best image quality: VA or IPS.

If you want the best response: TN.

I'm willing to give up some response in favor of better image quality!
 
If you want the best image quality: CRT.

If you want the best response: CRT.

Problem solved :D
 
Yes, this is becoming an endless loop.
.
Response time:

A gaming TN is about much more than just response time these days.
120hz/144hz is a huge difference in several facets from the low-response-time 60hz TNs of the past.
.
Image quality:
The entire viewport of gorgeous architecture/geography, creatures, high-detail textures and depth via bump mapping being smeared out constantly during FoV movement is not image quality.
It is as if you are wearing goggles filled with viscous fluid, and every time you turn your "head"/FoV, the fluid smears out your eyesight (quite horribly at 60hz).

http://www.blurbusters.com/faq/60vs120vslb/ <-- monitors in 2013 are still not ahead of a professional crt in motion, btw; even lightboost mode trades off some color, but it is close. We may get closer if it starts getting fully supported by nvidia as a 2d blur-elimination feature.

If you want the best of both, use a separate monitor for desktop/apps and for gaming.

-----------------------------------------------------------------------
.
I was replying to the effect that g-sync seems more useful for lower hz and lower fps, to avoid vsync/non-vsync side effects, which would make you lose the 120hz+ advantages much or all of the time, depending on your fps "graph" due to settings vs gpu power and the demands of a particular game.

120hz advantages lost when allowing low fps or hitting low fps often (fps dynamically lower than your monitor's max refresh, relying on g-sync to match hz so it at least prevents the vsync/non-vsync side effects of dipping below your hz):

- loss of constantly more recent action shown
(8.3 ms sooner/twice as early at 120fps & 120hz, ~9.7ms sooner at 144hz - while the low-fps user's action is "freeze-framed" a full 16.6ms at 60fps, or longer at lower fps; see the sketch after this list)
especially useful during competitive/scored games, but useful to your survival and performance in any game.

- much higher definition motion transitions, animations, accuracy/timing/reaction-time, and aesthetic qualities; again useful for survival, but also a form of image quality: motion transition quality and control sophistication quality.

- large blur reductions during your constant FoV movement (50% less blur at 120hz and high fps, ~60% less at 144hz, or complete blur elimination w/ lightboost)
going back to the full baseline 60hz blur of the entire viewport, "outside of the lines of the shadow masks" of all onscreen objects, high-detail textures, landscapes, and depth via bump mapping. 60hz's smear "resolution" during these FoV movements must equate to a horribly low resolution that is probably not even definably a solid uniform grid relative to the material it is attempting to display. 120hz/144hz non-lightboost still blurs, but at least it stays within the "shadow mask" of all onscreen objects and architecture/geography; all object detail, texture detail, and depth via bump mapping is still lost across the entire viewport.
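
Where those "sooner" numbers come from (a quick sketch, assuming the game actually delivers a new frame for every refresh):

[CODE]
# How much more recent the newest displayed frame is at high refresh rates,
# assuming a new frame is ready at every refresh.
def frame_time_ms(hz):
    return 1000.0 / hz

base = frame_time_ms(60)   # 16.7 ms between new frames at 60 fps / 60 Hz
for hz in (120, 144):
    print("%d Hz: %.1f ms per frame, %.1f ms less 'freeze-frame' time than 60 Hz"
          % (hz, frame_time_ms(hz), base - frame_time_ms(hz)))
# 120 Hz: 8.3 ms per frame (8.3 ms sooner); 144 Hz: 6.9 ms per frame (~9.7 ms sooner).
[/CODE]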

----------------

While g-sync allows you to use higher settings ~ lower fps, or regular lower fps swings, without vsync/non-vsync side effects, it probably won't be a great boon to high-hz gamers other than allowing their sub-100hz / sub-120hz fps dips at the most demanding parts of a game to remain free of screen aberrations (tearing). So maybe instead of maintaining over 120fps constantly, you could tweak graphics settings to get a 120fps average, and the dips somewhat beneath wouldn't tear when running non-vsync. You could perhaps run a 100hz-120hz range at a 120fps average without side effects. It's a good thing.
.
If I were running lightboost for zero blur and were happy with it, I probably would not trade it for using g-sync if the two end up being mutually exclusive though.
.
 
I don't mind having this if it's really a $100 premium. I wonder if TV makers will adopt this as well, particularly Sony.
 
I wonder if TV makers will adopt this as well, particularly Sony.

Why?

All video content runs at a fixed frame rate.

Anyone care to expand my awareness as to why we need g-sync outside of games (floating, non-stable frame rates)?

What percentage of people care about tearing, lag, and stutter among ALL (TV / PC / studios / etc.) flat-panel owners?
That's what the big corporate sharks will be asking themselves.
 
Why?

All video content runs at a fixed frame rate.

Anyone care to expand my awareness as to why we need g-sync outside of games (floating, non-stable frame rates)?

What percentage of people care about tearing, lag, and stutter among ALL (TV / PC / studios / etc.) flat-panel owners?
That's what the big corporate sharks will be asking themselves.

It's useful to someone who only watches pure video as well, since there are so many video frame rates.
24p video cannot play smoothly at 60hz (24 doesn't divide evenly into 60, so frames are held for alternating 3 and 2 refreshes: 3:2 pulldown judder), etc.
 
People want to match their hz to their framerate so that uneven multiples (i.e. a 60hz input not evenly divisible by 24 frames-per-second blurays) don't result in judder or other artifacts, and so they don't have to rely on the production of fake frames (interpolation) after the 60hz input, which produces its own artifacts and unwanted screen presentation.

A 72hz or 120hz screen refresh rate can match 24fps x3 or x5, respectively, or you could match the hz exactly (dropping a 60hz set down to a 24fps movie, or down to a 48fps movie); you would be at a 1:3, 1:5, or 1:1 relationship between fps and hz, which is a clean display without any oddball fractions/uneven numbers between the two. Personally I wish tv manufacturers would do true 120hz input in the back + g-sync to avoid vsync and interpolation screen aberrations, + backlight strobing to eliminate sample-and-hold blur. Then everything would have crt-like zero blur and avoid uneven hz-to-fps quirks. At 120hz you would see each 24fps bluray frame 5 times, with no fakes... a 48fps movie like The Hobbit would use g-sync to drop down to showing each frame 2 times (96hz). That would also allow all of the further pc-gaming 120hz, g-sync, and backlight-strobing benefits if you hooked a pc or steambox up to the tv too.
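
A sketch of that matching logic (assuming a hypothetical set that accepts any refresh rate up to 120hz): pick the largest whole-number multiple of the content frame rate that still fits, so every source frame is shown the same number of times.

[CODE]
# Pick a refresh rate that is an exact integer multiple of the content frame rate,
# assuming a display that can run anywhere up to 120 Hz.
MAX_HZ = 120

def matched_refresh(content_fps):
    multiple = MAX_HZ // content_fps      # how many refreshes each source frame gets
    return content_fps * multiple, multiple

for fps in (24, 48, 60):
    hz, n = matched_refresh(fps)
    print("%d fps content -> %d Hz (%d refreshes per frame)" % (fps, hz, n))
# 24 fps -> 120 Hz (x5), 48 fps -> 96 Hz (x2), 60 fps -> 120 Hz (x2):
# no pulldown judder and no interpolated frames needed.
[/CODE]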
 
Why? All video content runs at a fixed frame rate. Anyone care to expand my awareness as to why we need g-sync outside of games (floating, non-stable frame rates)?
Why is "outside of games" a stipulation? TVs are used almost exclusively as the output devices for consoles.
 
Console games and HTPC.
The SteamBox can support G-SYNC, as can various other upcoming consoles that use nVidia. Also, many people use TVs with Home Theater PCs for PC gaming on the big screen. AVSFORUM.com has something like a million members and contains the world's most popular HTPC forum. (I was a moderator there from 1999-2001, and it's even more popular now.)

Anyone care to expand my awareness as to why we need g-sync outside of games (floating, non-stable frame rates)?
You don't. But TVs are being used for interactive content more and more often, with:
-- SteamBox
-- HTPC's
-- Settop boxes now containing GPU's
-- AppleTV has a GPU built in, and Apple is considering turning it into a console in the next model (App Store on AppleTV 3; download games!)
-- Etc.

Yes, they may not be interested in G-SYNC today. But if the HDMI organization adopts a variable refresh rate specification, some TV manufacturers follow suit, and so on, then G-SYNC becomes a reality for the XBoxTwo and PlayStation5, the next round of consoles, which nVidia could theoretically win back (or AMD could license G-SYNC). The game industry earns more revenue than the movie industry, and this is growing, so variable refresh rate technology could theoretically be adopted as part of a display specification within the next 10-20 years to better support future interactive content.

The friendly "corporate sharks" that you speak of, may include nVidia, STEAM, and a few desparate TV manufacturers "wanting to specialize". Also, generically, variable refresh rate technology is intrinsically useful in our journey to tomorrow's refreshrateless Holodeck displays. Real life has no refresh rate.
 
I've been Team Red for a long time now, but this is seriously making me consider switching to Team Green....
 
We will be stuck with xbox 1 and Ps4 for another 8 years, so by that time OLED will be more mainstream, which will easily outweigh the benefits of anything LCD has to offer.
 
LCDs may still have a considerable cost advantage, even at that time. It's difficult to say.
 
We will be stuck with xbox 1 and Ps4 for another 8 years, so by that time OLED will be more mainstream, which will easily outweigh the benefits of anything LCD has to offer.

I hope I don't have to wait 8 years for 120+ Hz oled displays with strobing and g-sync :D
 
by that time OLED will be more mainstream which will easily outweigh the benefits of anything LCD has to offer.

OLED will still blur badly during FoV movement unless it adds some sort of backlight strobing or scanning. They still haven't figured out how to fix the burnout/fade issue of certain colors either, afaik. If it is 60hz input, it will also lack the large benefits of running high hz, which include large blur reduction but much more: the 120hz+120fps advantages.

We will be stuck with xbox 1 and Ps4 for another 8 years

Speak for yourself. Plenty of people will be pc gaming at 120 or more Hz (incl. some with lightboost zero blur) and at higher graphics settings, both of which are far superior to anything consoles do. Others will be pc gaming at 60hz. Soon some people will be running steamOS or steamboxes. I'm sure some people who don't want to learn the nuances of windows will get their feet wet in pc gaming with a steambox too.
.
Unfortunately all consoles are limited (severely limited imo) to 60hz, and practically all TVs are 60hz input (a very few have recently been found to accept high-hz input using a forced workaround <- but much of that is hearsay and not reliably reproducible). Many tvs also have considerably higher input lag, ghosting/trailing, and full-blast 60hz blur during FoV movement. Console controls are also hampered by hard-coded gamepad "enhancements" and forced controller limitations. Their controls and game interfaces are much less sophisticated, and their graphics quality and fps much more limited. Their modding communities and private server capabilities? Even simple server browsing by ping is missing from some console games. Steam on a windows pc and SteamOS promise a much more open game-dev, modding, and overall environment; the OS is and always will be free, as is belonging to and using the online services. Another bonus is good game sales/prices.
.
If they release some larger g-sync screens (or tvs), they could be useful for both console and pc/steambox users satisfied with the lower performance, on several levels, that 60hz gaming gives. Sony also has a backlight-strobing technology that reduces blur in a game mode on some of its tvs, even if it can only do so at 60hz, which many assume would look somewhat flickery at that rate.
.
Don't tie yourself to a console if you can help it, even if you have to play in a living room. Plenty of people won't, including me. Personally I'd rather sock the console+peripherals+premium-game-price money away toward gpus and a g-sync 120hz-input monitor (hopefully glossy and with a lightboost option as well) next year.
 
G-Sync + strobing + 1440p = hurry up and take my money!

I think I'd be fine with 85Hz; in the CRT days, as soon as I reached 75Hz I stopped noticing the flicker, which drove me absolutely nuts at 60Hz. The faster the better, though!

This is so exciting, I hope it all pans out.
 
G-Sync + strobing + 1440p = hurry up and take my money!

I think I'd be fine with 85Hz; in the CRT days, as soon as I reached 75Hz I stopped noticing the flicker, which drove me absolutely nuts at 60Hz. The faster the better, though!

This is so exciting, I hope it all pans out.

CRT flicker has nothing to do with LCD: 60Hz on a CRT makes you go blind, but there's no such problem with 60Hz on an LCD.
 
G-Sync + strobing + 1440p = hurry up and take my money!

I think I'd be fine with 85Hz

Be aware that 2560x will be more demanding and works best with dual gpus in more demanding games, which could add a lot to gpu budgets (it could require pushing enthusiast budgets into extreme gpu budgets), or require you to turn your settings down considerably on a single card.

With a single gtx780 you wouldn't even hit 80fps on some of the more demanding games on max settings.

2560x1440, single gtx 780:

Shogun 2 DX11, ultra quality, AF 16x, AA off: min 57, avg 74 fps
Shogun 2 DX11, ultra quality, AF 16x, AA 8x: min 31, avg 43 fps

BF3, ultra quality, AF 16x, AA off: min 81, avg 99 fps
BF3, ultra quality, AF 16x, AA 4x: min 63, avg 74 fps

Tomb Raider, ultra + TressFX, AF 16x, FXAA: min 42, avg 56 fps
Tomb Raider, ultra + TressFX, AF 16x, 2xSSAA: min 32, avg 40 fps

Bioshock Infinite, ultra quality, AF 16x, AA off: min 14, avg 84-86 fps
Bioshock Infinite, ultra quality, AF 16x, AA 4x: min 12-14, avg 68-69 fps

Company of Heroes DX11, max quality, AF 16x, AA off: min 29-30, avg 48 fps
Company of Heroes DX11, max quality, AF 16x, AA 8x: min 12, avg 26 fps

I'm sure you could adjust some settings to get 85fps on a single card in some of those games. High hz + higher fps has several advantages though, even beyond the greater blur reduction (without lightboost: 50% blur reduction at 120hz and ~60% at 144hz).

As I understand it,
80 fps would attempt to send a new screen update every 12.5ms. On a g-sync monitor, the refresh rate would drop to 80hz and would actually update the screen every 12.5ms, 1:1.
The 120fps+ at 120hz user would obviously be shown 40 more frames in the same period.
The 120fps+ at 120hz user would also see each new frame up to 1/3 sooner (8.3ms vs 12.5ms) and have 50% more motion definition than the 80fps g-sync user.

Comparing 80fps on a 120hz monitor (non-gsync) to a 120fps+120hz user:
120hz monitors update every 8.3ms. 80 frames sent would leave 40 updates "empty", requiring the same frame to be frequently shown more than once ("freeze framed").
1/2 (40) of the 80 frames would be shown 1:1 at 8.3ms each.
1/2 (40) of the 80 frames would have to be "freeze-framed" to 16.6ms each.
40 "frozen" through 2 updates (80 updates) + 40 at 1 frame per update (40) = 120 screen updates (120hz).
2/3 of the time, the 80fps user at 120hz is seeing 16.6ms "freeze frames" continue through two 8.3ms 120hz screen updates by comparison.
1/3 of the time, the 80fps user at 120hz is seeing the other 40 frames at 8.3ms.

Of course the real timing/rendering of frames could be much more imperfect than that, incl. glitches/judder, etc. - this is just the raw-numbers comparison.
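
The same raw numbers as a sketch (idealized: frames arrive perfectly evenly and every refresh shows the newest completed frame):

[CODE]
# 80 fps source on a fixed 120 Hz display vs. on a g-sync display running at 80 Hz.
FPS, HZ = 80, 120

# Fixed 120 Hz: frame index shown at refresh r, assuming perfectly even frame delivery.
shown = [(r * FPS) // HZ for r in range(HZ)]            # one second of refreshes
repeats = [shown.count(f) for f in range(FPS)]
print("frames shown for one refresh:", repeats.count(1),
      "| frames held for two refreshes:", repeats.count(2))
# -> 40 frames shown once (8.3 ms) and 40 frames held twice (16.7 ms) = 120 refreshes.

# G-sync at 80 fps: the panel simply refreshes every 1000/80 = 12.5 ms, one new frame per refresh.
print("g-sync: every frame shown exactly once, for 12.5 ms each")
[/CODE]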

Edit: followed up on links:

"We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor."

John Carmack (@ID_AA_Carmack) tweeted:
"@GuerillaDawg they didn't talk about it, but this includes an improved lightboost driver, but it is currently a choice - gsync or flashed."
 
Note that while the article did say "available on all g-sync monitors", nowhere did it indicate any intention to make 2560x1440 or 4k g-sync monitors with over 60hz input. :rolleyes:
.
If that were the case, it would rule the higher resolution ones out for me.
 