G-Sync First Impressions from TR Forums

That strobe light mod will melt your eyeballs out... never use it. For $200 I'll pass... probably. I thought it would be more like $100-$125.
 
I don't care if this new technology manufactures blowjobs: it's vendor specific and will only feature in $500+ monitors. All of these articles about G-Sync are talking it up like it's the next generation of all computer graphics, when really, it's never going to take off. Like PhysX, Nvidia will spend tons of money to force it down developers' throats. It will never receive wide adoption because it, by nature, is unable to be adopted by all: it is Nvidia only, and Nvidia is spending a lot of money to make sure that remains the case. Funnily enough, Nvidia was asked about proposing this as an update to the DisplayPort standard, so that it could be used by all, on any DisplayPort monitor regardless of price. As one of the biggest members of VESA (the people who control new and existing display standards), their reply:

Nvidia said:

Kind of obvious what their intentions are.
 
Wouldn't they make more money by increasing their market?
What does Nvidia have to gain by keeping things so exclusive?
 
Wouldn't they make more money by increasing their market?
What does Nvidia have to gain by keeping things so exclusive?

Really, this is round one: what they have is an expensive custom processor that does what a cheap ASIC could do. At the same time, all of the 'work' is done on the monitor, which means that the technical barriers for other GPU vendors to implement support for G-Sync monitors are pretty low.
 
I don't care if this new technology manufactures blowjobs: it's vendor specific and will only feature in $500+ monitors. All of these articles about G-Sync are talking it up like it's the next generation of all computer graphics, when really, it's never going to take off. Like PhysX, Nvidia will spend tons of money to force it down developers' throats. It will never receive wide adoption because it, by nature, is unable to be adopted by all: it is Nvidia only, and Nvidia is spending a lot of money to make sure that remains the case. Funnily enough, Nvidia was asked about proposing this as an update to the DisplayPort standard, so that it could be used by all, on any DisplayPort monitor regardless of price. As one of the biggest members of VESA (the people who control new and existing display standards), their reply:



Kind of obvious what their intentions are.

Basically nothing you write here is true. Your implication that developers need to support it is particularly outlandish; attempting to equate it with PhysX is nearly as much so.
 
Funnily enough, Nvidia was asked about proposing this as an update to the DisplayPort standard, so that it could be used by all, on any DisplayPort monitor regardless of price.
There are already provisions for dynamic refresh rates within the DisplayPort standard. No need to ask Nvidia for anything...

In fact, Intel has already implemented it on their latest integrated graphics chips. They can already support monitors with dynamic refresh rates...
 
Would prefer it to be available on all cards, but this is Nvidia, after all. Let's see how it performs, though, once a few reviews have been published.
 
There are already provisions for dynamic refresh rates within the DisplayPort standard. No need to ask Nvidia for anything...

In fact, Intel has already implemented it on their latest integrated graphics chips. They can already support monitors with dynamic refresh rates...

Provisions for switching between fixed refresh rates. Not quite the same thing.
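To spell out the difference (just my own toy sketch of the concept in Python, not the actual DisplayPort or G-Sync mechanism): mode switching picks one fixed interval ahead of time and every refresh lands on that schedule, whereas per-frame variable refresh lets each refresh interval follow the frame that was actually rendered, clamped to whatever range the panel supports.

Code:
# Conceptual sketch only -- not the actual DisplayPort or G-Sync logic.
frame_times_ms = [14.0, 22.0, 35.0, 16.0, 18.0]  # hypothetical render times

# Switching between fixed rates: pick one mode (say 50 Hz) and every refresh
# interval is 20 ms, no matter when frames actually finish.
fixed_interval_ms = 1000.0 / 50
fixed_schedule = [fixed_interval_ms] * len(frame_times_ms)

# Per-frame variable refresh: each refresh waits for its frame, clamped to the
# panel's supported range (here 30-144 Hz).
min_interval_ms = 1000.0 / 144
max_interval_ms = 1000.0 / 30
variable_schedule = [min(max(t, min_interval_ms), max_interval_ms)
                     for t in frame_times_ms]

print("fixed-rate refresh intervals :", fixed_schedule)
print("variable refresh intervals   :", [round(t, 1) for t in variable_schedule])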
 
Would prefer it to be available on all cards, but this is Nvidia, after all. Let's see how it performs, though, once a few reviews have been published.

I'm in for whoever gets both G-Sync and Mantle on the same card first. I prefer Nvidia for everything except for pricing, but my bet's on AMD.
 
He says in the review that once the FPS got near 30 FPS, G-Sync stopped working. I thought it was supposed to work across the whole range of frame rates?
 
That strobe light mod will melt your eyeballs out... never use it. For $200 I'll pass... probably. I thought it would be more like $100-$125.

He is commenting on a slide that quoted $200 for parts and service to get one installed.
 
I don't care if this new technology manufactures blowjobs: it's vendor specific and will only feature in $500+ monitors. All of these articles about G-Sync are talking it up like it's the next generation of all computer graphics, when really, it's never going to take off. Like PhysX, Nvidia will spend tons of money to force it down developers' throats. It will never receive wide adoption because it, by nature, is unable to be adopted by all: it is Nvidia only, and Nvidia is spending a lot of money to make sure that remains the case. Funnily enough, Nvidia was asked about proposing this as an update to the DisplayPort standard, so that it could be used by all, on any DisplayPort monitor regardless of price. As one of the biggest members of VESA (the people who control new and existing display standards), their reply:



Kind of obvious what their intentions are.

Literally everything you stated here is not true. The part about shoving it down developers' throats is pretty hilarious as well, considering it doesn't require software modification at all.
 
But, once I enabled SSAA and my FPS started dipping into the 30fps range, the effects of G-Sync disappeared. It was like running on a normal setup. Tons of stutter and tearing. Of course, this is to be expected since all the reviewers mention that.

Well what the fuck, isn't that what this shit was supposed to fix? If I need to maintain 60fps for g-sync to work, I'd rather just have a set refresh rate and throw more graphics cards at the problem.
 
You might want to look into what gsync is. A low refresh rate is still going to look like shit no matter what.
 
Well what the fuck, isn't that what this shit was supposed to fix? If I need to maintain 60fps for g-sync to work, I'd rather just have a set refresh rate and throw more graphics cards at the problem.

G-Sync can't invent frames to send to the display.
 
Wait, so does G-Sync improve smoothness in the 40-60 FPS range? It simply has to stay above 30 FPS to work, right?
 
Supposedly g-sync has a range of 30 Hz to 144 Hz, so in order for there to be tearing the FPS would have to drop BELOW 30.
 
Supposedly g-sync has a range of 30 Hz to 144 Hz, so in order for there to be tearing the FPS would have to drop BELOW 30.

Correct. You still need to mind system requirements and specs when using G-Sync.

SSAA is a massive FPS killer.
 
G-Sync can't invent frames to send to the display.

Nobody said that, but if my GPU is producing the frames then G-Sync doesn't need to invent them. It just doesn't make sense that it stops working once the frame rate drops below a certain point.
 
Correct. You still need to mind system requirements and specs when using G-Sync.

SSAA is a massive FPS killer.

If you don't have enough horsepower to keep something above 29fps, your computer isn't strong enough.
 
Nobody said that, but if my GPU is producing the frames then G-Sync doesn't need to invent them. It just doesn't make sense that it stops working once the frame rate drops below a certain point.

The fact that it doesn't invent frames is the sensible reason why it doesn't work once the frame rate dips below 30. Think about it.

And frankly, you should never be below 30 FPS anyway.
 
The fact that it doesn't invent frames is the sensible reason why it doesn't work once the frame rate dips below 30. Think about it.

And frankly, you should never be below 30 FPS anyway.

Well no, the only reason it doesn't work below 30 FPS is that the display can't refresh any slower than that. If the panel had a 15 Hz minimum, the effect would work as low as 15 FPS.
 
Well no, the only reason it doesn't work below 30 FPS is that the display can't refresh any slower than that. If the panel had a 15 Hz minimum, the effect would work as low as 15 FPS.

Precisely.

Meaning, it's not a limitation of G-Sync, and expecting it to invent functionality the panel doesn't have is getting ridiculous.
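If it helps to see the numbers, here's a rough back-of-the-envelope check (my own toy Python, not anything out of Nvidia's docs): the panel's minimum refresh rate sets the longest it can hold a single refresh, so a 30 Hz floor means anything slower than ~33.3 ms per frame falls outside the window, while a hypothetical 15 Hz floor would stretch that to ~66.7 ms, i.e. 15 FPS.

Code:
# Toy model of a variable-refresh panel's usable window -- not Nvidia's logic.
def can_match(fps, min_hz=30.0, max_hz=144.0):
    """True if the panel can time a refresh to this frame rate directly."""
    if fps <= 0:
        return False
    frame_ms = 1000.0 / fps
    longest_hold_ms = 1000.0 / min_hz    # 30 Hz floor -> ~33.3 ms max wait
    shortest_hold_ms = 1000.0 / max_hz   # 144 Hz ceiling -> ~6.9 ms min wait
    return shortest_hold_ms <= frame_ms <= longest_hold_ms

for fps in (25, 30, 45, 60, 144, 160):
    print(fps, "FPS ->", "matched" if can_match(fps) else "outside the panel's range")

# A hypothetical 15 Hz floor would extend the match down to 15 FPS:
print(20, "FPS on a 15 Hz-floor panel ->", can_match(20, min_hz=15.0))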
 
Seems like it is addressing a problem that would be just as well solved by adding $200 to the graphics card instead.

Neat idea, but I'm not interested yet.
 
Right, but it wouldn't be the panel "inventing frames"; it's that it simply lacks the functionality. It technically IS a limitation of g-sync, because this is the only panel that can do it. However, I expect to see more enthusiast monitors that take advantage of the tech, perhaps with a range that goes as low as 10 Hz.

That said, even though I have the VG248QE, I have no intentions of using g-sync. I like static high refresh rates, and vsync works well.
 
Bingo. It's like PhysX: neat idea, not very useful.

Not really. Sometimes with taxing games, you have areas where you're hitting 50-60fps, and then you move to a more demanding scene and the FPS drops. I see it as worth it if I don't have to fiddle much with settings and can leave all the eye candy on without having to worry about or adjust for those dips. I can't wait for it to become more commonplace. Much more useful than PhysX, which, funnily enough, can cause exactly the kind of dips I'm talking about.

This is w/my own personal comp -- specs in signature.
 
Right, but it wouldn't be the panel "inventing frames"; it's that it simply lacks the functionality.

What I'm saying is that G-Sync cannot compensate for a shortcoming of the panel. The only way it might compensate for a very low refresh rate would be to invent frames to keep the refresh rate up, but it can't do that—thankfully.
 
Not really. Sometimes with taxing games, you have areas where you're hitting 50-60fps, and then you move to a more demanding scene and the FPS drops. I see it as worth it if I don't have to fiddle much with settings and can leave all the eye candy on without having to worry about or adjust for those dips. I can't wait for it to become more commonplace. Much more useful than PhysX, which, funnily enough, can cause exactly the kind of dips I'm talking about.

This is w/my own personal comp -- specs in signature.

When I get less than 60fps, I just keep adding more video cards. In fact, that's precisely the reason I'm going to be selling my current rig and building a 4930K & 880 Quad SLI (or 390X Quadfire) rig soon.
 
Tearing at 30fps definitely seems a bit odd. Tearing happens when the video card's frame output isn't synced to the display's refresh... so g-sync should be able to sync those appropriately.
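For whatever it's worth, here's how I picture the timing (my own simplified scanout model in Python, not anything from the review): without any sync, a buffer swap lands partway through the panel's scanout, and the tear shows up at whatever line the panel happened to be drawing at that moment.

Code:
# Simplified scanout model (illustrative only).
refresh_hz = 60
lines = 1080
refresh_ms = 1000.0 / refresh_hz  # ~16.7 ms to scan out one refresh

def tear_line(swap_time_ms):
    """Scanline the panel is drawing when an unsynced buffer swap arrives."""
    progress = (swap_time_ms % refresh_ms) / refresh_ms
    return int(progress * lines)

# Hypothetical unsynced swap times: old frame above the tear, new frame below it.
for swap_ms in (5.0, 21.0, 40.5):
    print(f"swap at {swap_ms:5.1f} ms -> tear near line {tear_line(swap_ms)}")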
 
Bingo. It's like PhysX: neat idea, not very useful.

How is it "not very useful"? You don't notice, when playing a game with vsync, when the frame rate dips below the monitor's refresh rate? I sure as hell do.

It also sounds like a godsend for 3D Vision.
 
When I get less than 60fps, I just keep adding more video cards. In fact, that's precisely the reason I'm going to be selling my current rig and building a 4930K & 880 Quad SLI (or 390X Quadfire) rig soon.

....and you still get tearing when you turn vsync off, and input lag when you turn it on. When you enable vsync, frames are delayed from showing on your output device, which causes input lag. Always. Prior solutions such as adaptive vsync have helped tremendously, but they don't eliminate it the way G-Sync does. I suppose this is where you tell us "well, I don't get input lag," which is flat-out not true.

As far as I'm concerned, if it removes the perception of stutter from a 30-60 fps variable frame rate, removes input lag, and allows you to increase image quality, it's a godsend.

There are also plenty of games which aren't solid at 60 fps even at 1080p with ultra settings. You cannot always have a solid 60 fps frame rate. G-Sync gives you way more leeway for completely smooth gaming without input lag. With vsync, you get input lag, and the second it dips below 60 fps, you get a perception of stutter. Vsync isn't a solution. Not unless the game is designed around YOUR hardware and is always at 60 fps. That, of course, is an unrealistic expectation, unless you plan on spending $2000 on GPUs.

Take, for instance, Metro: LL. Even at 1080p, if you play it maxed out, it's going to dip below 60 FPS with a 290X or 780 Ti. With G-Sync? That's a non-issue. With vsync? The second it dips under 60, HELLO STUTTERING. And of course, since you're using vsync, you have input lag. Okay, to solve this problem, add another $600 for another GPU. Okay, now you're at 60 frames solid. But you still have input lag. And even with the additional GPU, if you have G-Sync you can pile on more image quality such as SSAA without stuttering when you go below 60 fps.
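To put some rough numbers on the input-lag side (this is just my own back-of-the-envelope model in Python, not measured data): with vsync on a 60 Hz panel, a finished frame sits in the buffer until the next fixed refresh boundary, up to ~16.7 ms, while a variable-refresh display can start scanning it out pretty much as soon as it's done, subject only to the panel's 144 Hz ceiling.

Code:
# Back-of-the-envelope display-side wait comparison (toy model, not measured data).
FIXED_HZ = 60      # conventional vsync refresh rate
VRR_MAX_HZ = 144   # variable-refresh ceiling

frame_finish_ms = [10.0, 31.0, 55.0, 74.0]  # hypothetical frame completion times

def vsync_display_times(finishes, hz=FIXED_HZ):
    """With vsync, each frame waits for the next fixed refresh boundary."""
    interval = 1000.0 / hz
    return [(int(t // interval) + 1) * interval for t in finishes]

def vrr_display_times(finishes, max_hz=VRR_MAX_HZ):
    """With variable refresh, a frame goes out as soon as it's ready, but no
    sooner than the panel's minimum refresh interval after the previous refresh."""
    min_interval = 1000.0 / max_hz
    shown, last = [], -min_interval
    for t in finishes:
        now = max(t, last + min_interval)
        shown.append(now)
        last = now
    return shown

for done, v, g in zip(frame_finish_ms,
                      vsync_display_times(frame_finish_ms),
                      vrr_display_times(frame_finish_ms)):
    print(f"frame done at {done:5.1f} ms: vsync shows it at {v:5.1f} ms, "
          f"variable refresh at {g:5.1f} ms")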
 
....and you still get tearing when you turn vsync off, and input lag when you turn it on. When you enable vsync, frames are delayed from showing on your output device, which causes input lag. Always. Prior solutions such as adaptive vsync have helped tremendously, but they don't eliminate it the way G-Sync does. I suppose this is where you tell us "well, I don't get input lag," which is flat-out not true.

As far as I'm concerned, if it removes the perception of stutter from a 30-60 fps variable frame rate, removes input lag, and allows you to increase image quality, it's a godsend.

There are also plenty of games which aren't solid at 60 fps even at 1080p with ultra settings. You cannot always have a solid 60 fps frame rate. G-Sync gives you way more leeway for completely smooth gaming without input lag. With vsync, you get input lag, and the second it dips below 60 fps, you get a perception of stutter. Vsync isn't a solution. Not unless the game is designed around YOUR hardware and is always at 60 fps. That, of course, is an unrealistic expectation, unless you plan on spending $2000 on GPUs.

Easy there, ace. There are a ton of good solutions for both tearing and input lag, none perfect, but none require you to spend $200 on adapting a monitor you own, or buying a new one altogether.

And on the second point, that's exactly what I said: I AM going to spend $2000 on GPUs.
 