[AnandTech] Nvidia G-Sync Review

I really hate people crying foul when one company or the other makes something proprietary. Let's get something straight: companies aren't looking out for you, they're looking out for your money. They'll look out for you as little as possible, but enough to get your money. That's the way things have always been. If it looks like X company is trying to compete with Y company to "give customers a choice and stop monopolization"... no, they're trying to be your choice, to get your money, and hopefully end up in the top spot themselves. Competition is two companies competing for your money. It's true that it's good for us, but it's not for them. If Nvidia can get you locked into using their GPUs because of some new technology they developed for monitors... you better bet they'll do it. AMD will do the same with Mantle. There's no point in pouring R&D dollars into something and then just handing it to the other party when you're competing with them. I don't see why people find this surprising or questionable. It's simply your job as a consumer to evaluate what each of them gives you and then choose the one you want based on the factors they present.

I think it's important to not try to personify these companies and give them certain traits (evil, moneygrubbing... which company does that not apply to?). That just begs for glossing over X or Y facts because you see them as being Z types... this is common "fanboy" behavior, I suppose.
 
Nvidia spent their own R+D funds, presumably millions of dollars, on developing this technology. So, with that being the case, they should just give it away to AMD? Give me a break. Open standards are rarely in the interest of a corporation that develops and spends R+D funds on creating a new technology. Now if it were a standards board such as VESA, which is non-profit, it would make sense. But Nvidia poured their own R+D funds into developing this, therefore it will be Nvidia only. I don't see why anyone would expect Nvidia to simply give it away to AMD or Intel. "We're going to spend millions on R+D, but we'll give it away" - that doesn't happen.

If gsync is as revolutionary as reviewers indicate, I fully expect AMD and Intel to develop something similar, because people will buy gsync-enabled steamboxes and Nvidia GPUs due to g-sync down the road (IMHO). Nvidia is just ahead of the curve here in developing this.
 
It will include a driver for an improved version of lightboost, yes.


That's not what his question implied. The 1st iteration of the gsync module will have 3 modes.

Gsync - which is what everyone is reviewing right now
3D Vision - everyone knows what this is
Low persistence - This is Nvidia's implementation, which makes the Lightboost mod a lot better than it currently is.

http://www.youtube.com/watch?v=KhLYYYvFp9A&t=47m39s


For now Nvidia hasn't found a way to get the module to support lightboost (low persistence) and gsync at the same time.
 
If they did a 3440x1440 monitor (since that is my next purchase) with G-Sync, I would be interested. My problem is that, from what I was reading on guru3d, anything below 40fps is really bad. Well, in some MMOs and even FPSes you go below 40fps no matter what, so that is my only concern.

Otherwise I'm all for a smoother gaming experience!
 
Hmmm, I wonder if G-Sync will be included in the Rift (John Carmack was pretty pumped about it at the announcement). Considering one of the advantages is the reduction of latency introduced by vsync, I would imagine it would have a positive impact in the VR space.
 
If they did a 3440x1440 monitor (since that is my next purchase) with G-Sync, I would be interested. My problem is that, from what I was reading on guru3d, anything below 40fps is really bad. Well, in some MMOs and even FPSes you go below 40fps no matter what, so that is my only concern.

Otherwise I'm all for a smoother gaming experience!

It's no worse than V-Sync would be, except without the input lag.
 
Hmmm, I wonder if G-Sync will be included in the Rift (John Carmack was pretty pumped about it at the announcement). Considering one of the advantages is the reduction of latency introduced by vsync, I would imagine it would have a positive impact in the VR space.

I've seen people quoting him as saying that they want to include it, but that it won't be in the first release of the Rift. And that's understandable, given the opposing development schedules. I don't expect G-Sync to be *everywhere* for a couple of years, until they get the ASICs figured out and production recovers from the high-volume buyers getting in on the action.
 
Hmmm, I never play with Vsync on... I know it can be a stuttering mess sometimes though.

Increases input lag and introduces stuttering at lower framerates; basically a no-go for 'competitive' FPS. Fine for Mass Effect, not so much for BF4, etc.
 
Increases input lag and introduces stuttering at lower framerates; basically a no-go for 'competitive' FPS. Fine for Mass Effect, not so much for BF4, etc.

Yea I never run with Vsync. In BF4 I leave it off, and rarely see any tearing since I run at 100hz.
 
AMD has never claimed that Mantle will be "open source". In fact, when asked whether Mantle is open source, AMD's response has simply been "no". What they have claimed is that Mantle will allow for some level of functionality with non-GCN-based architectures.

Your link is a bit old. This has all changed as I told you over in the other thread. Mantle will be open to all vendors. They stated this during the Developers Summit. It's only tied to GCN at the moment while it's in development.
 
Novel idea, but at the end of the day 96+ fps gaming @ 120hz will solve 99% of the tearing issues people perceive.

Why "96+"?

96+ is just a rule of thumb from my observation. I really start to see the improvement of 120hz at 96+ fps. Technically, anything over 60fps at 120hz is an improvement, with 120+ being ideal.

The reason people throw out 96 fps is that it's a direct multiple of 24 fps, which is what film is commonly shot at. The idea is that you'll reduce/eliminate judder by running your display at 24, 48, 72, 96, or 120 fps.
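
If anyone wants to see the arithmetic, here's a quick Python sketch (just my own illustration of the pulldown math, nothing from the article):

# Which common refresh rates divide evenly by 24 fps film?
# An even multiple means each film frame is held for a whole number of
# refreshes, so there's no 3:2-style pulldown and no judder.
for hz in (60, 72, 96, 120, 144):
    even = hz % 24 == 0
    print(f"{hz:3d} Hz -> {hz / 24:.2f} refreshes per film frame "
          f"({'even, no judder' if even else 'uneven, judder'})")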
 
Your link is a bit old. This has all changed as I told you over in the other thread. Mantle will be open to all vendors.
Nothing has changed. Mantle is open to implementation. It is not open source.

The reason people throw out 96 fps is that it's a direct multiple of 24 fps, which is what film is commonly shot at. The idea is that you'll reduce/eliminate judder by running your display at 24, 48, 72, 96, or 120 fps.
As it pertains to games, however, it makes absolutely no sense.
 
The reason people throw out 96 fps is that it's a direct multiple of 24 fps, which is what film is commonly shot at. The idea is that you'll reduce/eliminate judder by running your display at 24, 48, 72, 96, or 120 fps.

Didn't know the technical material behind it. Thanks!
 
I like G-Sync, but it's a solution to a problem that shouldn't exist.

There's no reason a digital display needs to refresh at a fixed rate. Actually it doesn't need to refresh at all.
 
I like G-Sync, but it's a solution to a problem that shouldn't exist.
It's annoyingly true. Display vendors are, by and large, absolutely content with mediocrity and the same-old-same-old. That a GPU vendor has to get into the display controller business to fix something that display vendors themselves should've fixed a long time ago is ridiculous.
 
If they did a 3440x1440 monitor (since that is my next purchase) with G-Sync, I would be interested. My problem is that, from what I was reading on guru3d, anything below 40fps is really bad. Well, in some MMOs and even FPSes you go below 40fps no matter what, so that is my only concern.

Otherwise I'm all for a smoother gaming experience!

Pretty much the same for me.

I have three 1920x1200 panels invested in Eyefinity (four if you include my reserve); it would have to be compelling for me to junk that.
 
It's annoyingly true. Display vendors are, by and large, absolutely content with mediocrity and the same-old-same-old. That a GPU vendor has to get into the display controller business to fix something that display vendors themselves should've fixed a long time ago is ridiculous.

Well, it has more to do with the decisions made to maintain compatibility with analog displays.

But the standards need to be revised to allow for variable refresh rates.
 
But the 120Hz 4k part is $8900 of the $9000 :cool:

I like you.

People are willing to spend nearly $1000 on a single GPU that gets replaced in months, but refuse to spend, at most, $200 more on a monitor that will last and get more value out of your outdated GPU.
 
I like G-Sync, but it's a solution to a problem that shouldn't exist.

There's no reason a digital display needs to refresh at a fixed rate. Actually it doesn't need to refresh at all.

Hmmm I wonder who is going to work on that solution...
Not Nvidia.
 
I like you.

People are willing to spend nearly $1000 on a single GPU that gets replaced in months, but refuse to spend, at most, $200 more on a monitor that will last and get more value out of your outdated GPU.

I would if they could get their crap together. The fact of the matter is that monitors are riddled with QC issues, and I'm not going to pay anything over $500 for something that could not only have problems out of the box but isn't guaranteed to last. I've gone through around 8 mainstream 1440p monitors, all of which I had to return due to flaws, so I'm speaking from experience. My Qnix ended up being better than all of them (after some slight panel bending for backlight bleed, which took like 10 minutes). That's pretty pathetic. I don't trust these people now, not when something I buy from Korea for less than half the price is better than what they give me.
 
I would if they could get their crap together. The fact of the matter is that monitors are riddled with QC issues, and I'm not going to pay anything over $500 for something that could not only have problems out of the box but isn't guaranteed to last. I've gone through around 8 mainstream 1440p monitors, all of which I had to return due to flaws, so I'm speaking from experience. My Qnix ended up being better than all of them (after some slight panel bending for backlight bleed, which took like 10 minutes). That's pretty pathetic. I don't trust these people now, not when something I buy from Korea for less than half the price is better than what they give me.

I'm curious about your experience. I noticed a small vendor offers in-house panel inspection, guaranteeing a certain level of quality, for a significantly higher fee than a monitor they don't personally check. Have you tried paying for such a service and still gotten burned?
 
I was going to upgrade, but I have no real reason to since I'm getting a faster connection next week. Unless they make an IPS 21.5" G-Sync.
 
For now Nvidia hasn't found a way to get the module to support lightboost (low persistence) and gsync at the same time.

What's your source for this specific morsel? I've been trying to verify this for a while.

Edit: To clarify, I get that Tom Petersen describes the NEW low persistence mode as a separate, fixed-refresh-rate mode. But I am unsure whether someone with a VG248QE, the G-Sync mod kit, AND the 'unofficial' LightBoost utility could run G-Sync with LB enabled.
 
What's your source for this specific morsel? I've been trying to verify this for a while.

Edit: To clarify, I get that Tom Petersen describes the NEW low persistence mode as a separate, fixed-refresh-rate mode. But I am unsure whether someone with a VG248QE, the G-Sync mod kit, AND the 'unofficial' LightBoost utility could run G-Sync with LB enabled.

I haven't seen any evidence or quotes, but just a cursory understanding of the technologies involved shows that trying to run Lightboost properly alongside G-Sync would be very difficult.
 
Well, first of all, hopefully "Lightboost" will die and we'll get strobing implemented in the monitor firmware like the Eizo FG2421 as the new standard for these monitors... 2d Lightboost is a hack, and one that is not free of issues to say the least.

Running strobing and g-sync would require variable rate strobing, and it also doesn't make much sense. You can't realistically strobe below 60hz, or people will perceive flickering. Even 60hz is very low: many if not most people are bothered by flickering below 100hz or so, so that's essentially the strobing "floor". And you wouldn't want your monitor to refresh at a different time than your backlight strobes, so you would also end up with a 100hz floor on the refresh rate, instead of the normal 30hz g-sync floor.

All of those limitations together mean that it's hard to see why you would want g-sync at all if you were using strobing, as g-sync's big benefit is that it maintains smooth movement at low framerates via adaptively low refresh rates. If you have GPU power capable of maintaining the framerate minimums for strobing to work well, g-sync isn't a huge benefit for you in the first place.
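
To put some rough numbers on that (the ~100hz flicker threshold is the only figure I'm taking from above; the constants and names are just mine for illustration):

# Back-of-envelope: a strobed backlight flashes once per refresh, so the
# flicker you perceive is at the refresh rate. G-Sync alone can drop to ~30hz,
# but with strobing the practical floor is wherever flicker stops being
# visible (assumed ~100hz here).
GSYNC_FLOOR_HZ = 30
FLICKER_FLOOR_HZ = 100  # assumed perceptual threshold

for fps in (30, 45, 60, 75, 100, 120, 144):
    gsync = "ok" if fps >= GSYNC_FLOOR_HZ else "out of range"
    strobed = "ok" if fps >= FLICKER_FLOOR_HZ else "visible flicker"
    print(f"{fps:3d} fps: g-sync {gsync}, g-sync + strobing {strobed}")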
 
Well, first of all, hopefully "Lightboost" will die and we'll get strobing implemented in the monitor firmware like the Eizo FG2421 as the new standard for these monitors... 2d Lightboost is a hack, and one that is not free of issues to say the least.

Running strobing and g-sync would require variable rate strobing, and it also doesn't make much sense. You can't realistically strobe below 60hz, or people will perceive flickering. Even 60hz is very low: many if not most people are bothered by flickering below 100hz or so, so that's essentially the strobing "floor". And you wouldn't want your monitor to refresh at a different time than your backlight strobes, so you would also end up with a 100hz floor on the refresh rate, instead of the normal 30hz g-sync floor.

All of those limitations together mean that it's hard to see why you would want g-sync at all if you were using strobing, as g-sync's big benefit is that it maintains smooth movement at low framerates via adaptively low refresh rates. If you have GPU power capable of maintaining the framerate minimums for strobing to work well, g-sync isn't a huge benefit for you in the first place.

I believe those new Benq monitors are supposed to strobe in the 75-144Hz range

http://www.blurbusters.com/xl2411z-and-xl2420z-announced-on-benqs-website/

So it should be possible to have both in the future.
 
Yeah, I have 2 of the 248QEs that are 2 months old. When I purchased these, they said they would be G-Sync upgradeable, but I'm not sure if I have to go direct through Asus or whether third parties will be selling these. I am most likely going GTX 780 SLI since I can't get a 290 for any less than high-end 780s.
 
I'm curious about your experience. I noticed a small vendor offers to do in house panel inspection for a significantly higher fee than a monitor they don't personally check and guarantee a certain level of quality. Have you tried paying for such a service and still gotten burned?

I don't have any of those vendors near me as I have never heard of that service.
This is what I have went through:
1. U2711 (technically nothing wrong, just horrible anti glare coating and uses 10 bit oversaturation)
2. Samsung 970D (ugly stuck pixel center)
3. PB278Q (dead pixel lower left).
4. U2713HM (crosshatching and such).
5. VP2770 (smudges and some backlight bleed)
6. VP2770 (terrible backlight bleed, something else too I don't remember)
7. VP2770 (backlight was going out on it as soon as I got it, had worse backlight bleed, etc).


Hm... I guess it was actually 7 mainstream monitors then, not 8. Huh. Or I'm forgetting the other one. And good riddance, that whole ordeal was a nightmare. I'm quite glad I got this Qnix.
 
Increases input lag and introduces stuttering at lower framerates
The input lag and stuttering issues inherent to V-Sync only come into play if you can't maintain a frame rate equal to or higher than your display's refresh rate.

If you have a 60 Hz monitor and you set the game up so that it never drops below 60 FPS, you should be able to hum along happily with regular old V-Sync.
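
To illustrate why it only falls apart below that (a toy model of my own, not anything from the review): under V-Sync a frame can only be swapped on a refresh boundary, so a frame that misses the 16.7 ms budget even slightly is held for a whole extra refresh.

import math

REFRESH_MS = 1000 / 60  # 60 Hz display, ~16.7 ms per refresh

def onscreen_ms(render_ms: float) -> float:
    # V-Sync only swaps on refresh boundaries, so render time rounds up
    # to a whole number of refresh intervals.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.0, 16.0, 17.0, 20.0, 34.0):
    print(f"render {render_ms:5.1f} ms -> displayed for {onscreen_ms(render_ms):5.1f} ms")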
 
The input lag and stuttering issues inherent to V-Sync only come into play if you can't maintain a frame rate equal to or higher than your display's refresh rate.

If you have a 60 Hz monitor and you set the game up so that it never drops below 60 FPS, you should be able to hum along happily with regular old V-Sync.

If only that were possible. You're always going to have framerate drops, and 60FPS on FRAPS doesn't mean solid 16.7ms frametimes.
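
Just to show what I mean (numbers made up for illustration): both of these traces average out to roughly 60 FPS over a second, but one of them is full of 33 ms frames that V-Sync turns into visible hitches.

# Two frame-time traces with roughly the same 60 FPS average but very
# different pacing; an FPS counter reports both as "60 FPS".
smooth = [16.7] * 60
spiky = [11.1, 11.1, 11.1, 33.4] * 15

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    avg_fps = 1000 / (sum(trace) / len(trace))
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {max(trace):.1f} ms")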
 
I believe those new Benq monitors are supposed to strobe in the 75-144Hz range

When the refresh rate is set to a specific rate, sure. Changing the strobe speed constantly is a whole other story, and frankly 75hz strobing sounds unpleasant to say the least. If you think PWM is a problem...
 