[AnandTech] Nvidia G-Sync Review

It's not my job to support your arguments. That's your job. Back up what you say.

You're confused by the difference between a panel and a display. There is nothing about G-Sync that makes it exclusive or inherent to TN panels or unable to operate on IPS panels. However, a display must have a G-Sync module installed to use the technology.

That "specific hardware" is a Displayport connector.
Right, but it helps when the person disagreeing with facts stated by Nvidia has actually done some research on the subject.

Correct, but it will not work with all "displays."

False. Do some research.
 
It's amusing to watch a person walk backward toward a cliff. Spectators know there's nowhere for that person to go but down, yet that person continues stepping backward, thinking he'll somehow switch places with the spectators if he simply keeps moving toward the edge.
 
Right, but it helps when the person disagreeing with facts stated by Nvidia has actually done some research on the subject.

Agreed; it would be very nice if you did some research. You've made posts that have been contradicted and corrected by something as simple as the official G-Sync FAQ. It's pretty disheartening to see someone so completely and stubbornly wrong as you are.
 
Agreed; it would be very nice if you did some research. You've made posts that have been contradicted and corrected by something as simple as the official G-Sync FAQ. It's pretty disheartening to see someone so completely and stubbornly wrong as you are.

Yes, I was talking about myself... :rolleyes:

Here is why it simply "doesn't work with every panel/display" and takes additional R&D/engineering.
There was a longer/more in-depth article/post somewhere, but I can't find it. It detailed exactly what is needed to implement the G-Sync module into a display and what the specific requirements/specifications are.

Here it explicitly states that G-Sync isn't some sort of standard, i.e. it is proprietary.

Here is where it states that G-Sync doesn't work in all display modes, not just windowed, nor with all DisplayPort standards...

Here is where it states that some games don't show a benefit.
There was an interview around the release of G-Sync where someone from Nvidia (Tom Petersen?) mentioned that not all games work with G-Sync and that certain engines/builds show no benefit. I'm sorry I can't recall where it was from.

I apologize for keeping up with all the information/interviews out there about G-Sync instead of going off Nvidia's PR slides.
 
You were asked what the specific requirement of the game engine is, not whether Petersen said something we already agreed is detailed in the FAQ (why would it need repeating?). You suggested that there was a requirement above and beyond the game being able to run in full-screen, but made no mention of what that requirement was.
 
Oh ok. Let me know when that happens...
This isn't some sort of open standard; there is specific hardware in Kepler to allow G-Sync, as well as multiple patents.
The same way that PhysX isn't proprietary?
Ok, now I KNOW you're not reading.

I've said it three times now: there are already Embedded DisplayPort devices that use dynamic refresh rates. They change the v-blank interval to instruct the display when to refresh (exactly what G-Sync does).

Nothing had to be licensed from Nvidia to do this, it's simply part of the DisplayPort spec. It has been since October 2012.
There is NO "specific hardware in Kepler" to enable this. It's part of the DisplayPort spec. Any graphics card with a full DisplayPort implementation could potentially alter its v-blank timings to instruct an attached monitor to refresh on demand (if the monitor supports it).
Just because Nvidia cards prior to the GTX 650 Ti don't support altered v-blank timings doesn't make it proprietary; it just makes it apparent that Nvidia's early DisplayPort implementations are inflexible.
Nvidia's patents do not apply to variable refresh rate based on v-blank period. That functionality is already part of the DisplayPort spec.
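
To make the mechanism concrete, here's a rough back-of-the-napkin sketch (Python, with made-up panel numbers; this is my own illustration, not anything from the DisplayPort spec or Nvidia):

Code:
# Toy model of variable refresh via v-blank stretching.
# PANEL_MIN_HZ / PANEL_MAX_HZ are assumed example limits, not real specs.
PANEL_MIN_HZ = 30.0   # below this an LCD image starts to decay
PANEL_MAX_HZ = 144.0  # fastest the panel can physically refresh

def vblank_interval(frame_render_time_s):
    """Hold the display in v-blank until the frame is ready,
    clamped to what the panel can tolerate."""
    fastest = 1.0 / PANEL_MAX_HZ  # ~6.9 ms
    slowest = 1.0 / PANEL_MIN_HZ  # ~33.3 ms
    return min(max(frame_render_time_s, fastest), slowest)

print(vblank_interval(1 / 45))  # ~0.0222 s: a 45 FPS frame gives a 45 Hz refresh

The source simply delays the refresh until the frame swap happens, so the panel redraws exactly once per rendered frame.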

So I ask AGAIN: how is this in any way proprietary? I don't see how shoving normal video data down a DisplayPort cable is comparable to the totally in-house, proprietary PhysX tech.

Here it explicitly states that G-Sync isn't some sort of standard, i.e. it is proprietary.
That article says nothing of the sort. There are no explanations or details about how it's proprietary or non-standard. All it says is what Nvidia requires to get the tech working with their cards.
 
Yes, I was talking about myself... :rolleyes:

Here is why it simply "doesn't work with every panel/display" and takes additional R&D/engineering.
There was a longer/more in-depth article/post somewhere, but I can't find it. It detailed exactly what is needed to implement the G-Sync module into a display and what the specific requirements/specifications are.

Here it explicitly states that G-Sync isn't some sort of standard, i.e. it is proprietary.

Here is where it states that G-Sync doesn't work in all display modes, not just windowed, nor with all DisplayPort standards...

Here is where it states that some games don't show a benefit.
There was an interview around the release of G-Sync where someone from Nvidia (Tom Petersen?) mentioned that not all games work with G-Sync and that certain engines/builds show no benefit. I'm sorry I can't recall where it was from.

I apologize for keeping up with all the information/interviews out there about G-Sync instead of going off Nvidia's PR slides.

All this tells me is that you have no idea what you're reading, or that you like to extrapolate and twist information.

Your first link is not new information. You present it as proof that G-Sync does not work with everything; instead, it's proof that there is no standard component in display design and construction that allows swapping out a daughterboard or PCB add-on like G-Sync. This is a huge Captain Obvious moment; of course a BenQ display won't have the same standardized internals as an Asus display. Of course the G-Sync module has to be adapted for the displays. There is no standard way of connecting a G-Sync module, or any other kind of module, to a display's internal PCB. If there were, we wouldn't need to wait for releases; we could walk into a store, buy a G-Sync module, remove the cover of our displays, and insert the module into an expansion slot. Displays don't work that way. You are not presenting new or perception-altering information.

Your second link is also Captain Obvious; no, G-Sync is not a standard, because it has not been submitted to standardization bodies, nor should it be. It interacts with standards, namely DisplayPort, to accomplish its function. The module itself has nearly 1 GB of RAM onboard to hold frames and manage the display buffer. It's a middleman between your DisplayPort 1.2 cable and the panel output. That's the entire point of the module. You are not presenting new or perception-altering information.

Your third link also contains no new information. It's not even worth the same amount of words as the first two links.

Your fourth link is why no one takes Tom's seriously; there is no point to enabling v-sync with G-Sync. G-Sync replaces v-sync. It's not additive, like triple buffering. And if you're using the Gamebryo engine as an example of how things should work, you have fundamental issues in the formulation of your views. Your link even says: "At 60 Hz, adding G-Sync to the equation actually has a detrimental effect, likely because V-sync is forced on and the technology is meant to operate with V-sync off." And "For Skyrim, turning G-Sync off and playing at 60 Hz is probably the most natural approach, providing you get more than 60 FPS all of the time using your desired quality settings (not difficult)."

Your posts are the digital equivalent of hot air.
 
What I find even funnier is that he's using one source for three links. There isn't any cross-referencing or official set of statements to back anything up, just Tom's Hardware as the holy grail of G-Sync info.

And again I must ask: what is the point of arguing about this? Perhaps we should start a thread where we argue endlessly that Mantle is crap and isn't gonna mean anything because of <insert subjective opinions on the direction in which the industry should be heading>. It just seems like pulling reasons to keep arguing... why?
 
I would assume that Maxwell will probably get rid of the whole modular G-Sync and just implement it on the card (kind of like the AMD sound chip).
 
What I find even funnier is that he's using one source for three links. There isn't any cross-referencing or official set of statements to back anything up, just Tom's Hardware as the holy grail of G-Sync info.

And again I must ask: what is the point of arguing about this? Perhaps we should start a thread where we argue endlessly that Mantle is crap and isn't gonna mean anything because of <insert subjective opinions on the direction in which the industry should be heading>. It just seems like pulling reasons to keep arguing... why?
Because I'm not going to waste my time linking to the dozens of interviews and articles that say the same thing when I can source my statements from a single reference.

That has already been happening in the Mantle thread, mostly by the same people who don't want anything negative said about G-Sync.

I would assume that Maxwell will probably get rid of the whole modular G-Sync and just implement it on the card (kind of like the AMD sound chip).
Nope, not possible.

All this tells me is that you have no idea what you're reading, or that you like to extrapolate and twist information.

Your posts are the digital equivalent of hot air.
Thanks. Your opinion on the matter is duly noted, since you simply dismiss proof that what I was stating is true.

Ok, now I KNOW you're not reading.


That article says nothing of the sort. There are no explanations or details about how it's proprietary or non-standard. All it says is what Nvidia requires to get the tech working with their cards.
And I see you aren't either.
Very amusing.
 
Because I'm not going to waste my time linking to the dozens of interviews and articles that say the same thing when I can source my statements from a single reference.

You're not making any substantive points that affect perception or operation of G-Sync. You're acting as if you've uncovered something that makes G-Sync undesirable. You are incorrect.

Enough of this. On to the ignore list you go.
 
You're not making any substantive points that affect perception or operation of G-Sync. You're acting as if you've uncovered something that makes G-Sync undesirable. You are incorrect.

Enough of this. On to the ignore list you go.

That is cute. I'm sorry for offering facts that show it isn't the perfect solution that you rabid Nvidia fans choose to think it is.

Pretty sure I've offered up sources to back my statements, while you choose to ignore them. If you didn't wish to have a mature discussion of the technology, I don't understand what you are doing here.
 
That is cute. I'm sorry for offering facts that show it isn't the perfect solution that you rabid Nvidia fans choose to think it is.

Pretty sure I've offered up sources to back my statements, while you choose to ignore them. If you didn't wish to have a mature discussion of the technology, I don't understand what you are doing here.

Wait wait wait, what? Since when did anyone claim it was a "perfect solution"? If that's essentially the bulk of what you're arguing against, then this entire thread has been wasted on pages and pages of your utterly useless drivel. I'm really curious where that claim came from. People are well aware of the cons you're bringing up (they've probably been stated in several reviews), and I'm really confused about what you're trying to accomplish here. Dissuading potential buyers? No, you're really not. You're just a troll. You're annoying. You're a nuisance.

Whether it's proprietary or not shouldn't even factor into this, because I don't think anyone on the AMD side is expecting to be able to use this when Nvidia is the one that spent the R&D on it. They're going to reap some benefit from it. I'm not blaming AMD for currently keeping Mantle proprietary (and you'll see AMD people raving about it as a boon), and I'm not sure why anyone's going to blame Nvidia if they choose to keep this entirely proprietary.

Also, the "sources" you provided were all of two, one of which was just Tom's Hardware. I don't add people to ignore lists because I believe that at some point perhaps they can all make a reasonable contribution to a thread (regardless of past bouts of idiocy). But I do ignore them on a per-thread basis, and you're close.
 
And I see you aren't either.
Very amusing.
Incorrect.

I've read every one of your posts, all of which ignore the blatantly obvious and continue to blindly claim Nvidia is doing something proprietary here. They aren't; it's something inherent to modern DisplayPort that any manufacturer can implement.
You build a graphics card and alter the v-blank interval on your DisplayPort output to fire as soon as a frame is swapped to the screen buffer? Congrats, you're doing exactly what G-Sync does.

There's nothing amusing about your willful ignorance.
 
That is cute. I'm sorry for offering facts that show it isn't the perfect solution that you rabid Nvidia fans choose to think it is.

Pot, meet kettle. You're definitely not a rabid AMD fan, right?
 
You're definitely not a rabid AMD fan, though, right?

Right now I am 100% confused as to what LordEC911's point actually is! G-Sync is bad??? Well, he sucks at proving it! Oh, that's right, he doesn't have to back up anything he says.

That is the reader's job.

Would love to see him bullet-point the issues with G-Sync, especially why it is a failure, and give one factual link for each.
 
Apparently the burden of proof for HIS arguments is on us. It is our duty to find corroborating evidence for his claims. ;)

I guess not everyone can be an all-star on their debate team.
 
May I ask you guys something about G-Sync? I read a lot on various forums and review sites about G-Sync and couldn't find any hard evidence of G-Sync being exclusive to ASUS till Q3 2014, just a stupid post here, which I doubt is true: http://wccftech.com/nvidia-g-sync-asus-monitors-brands-q3-2014/
My friends all teamed up and insist it's true (a rumour, lol), so they're basically going off this rumour and saying G-Sync will fail right out of the gate.
I saw over at the Overlord Computer forum that an admin stated this is false: http://overlordforum.com/topic/603-nvidia-g-sync/page-2
I do believe that.
 
May I ask you guys something about G-Sync? I read a lot on various forums and review sites about G-Sync and couldn't find any hard evidence of G-Sync being exclusive to ASUS till Q3 2014, just a stupid post here, which I doubt is true: http://wccftech.com/nvidia-g-sync-asus-monitors-brands-q3-2014/
My friends all teamed up and insist it's true (a rumour, lol), so they're basically going off this rumour and saying G-Sync will fail right out of the gate.
I saw over at the Overlord Computer forum that an admin stated this is false: http://overlordforum.com/topic/603-nvidia-g-sync/page-2
I do believe that.

This post?

Posted 18 December 2013 - 05:09 PM
GSYNC only works on the ASUS 248 monitor.

DIY Kits for the ASUS 248 monitor were going to be released this week, but Nvidia nixed that - no idea why.

Right now, we have no stock on either the kit (so we can mod 248s) or new 248s (they are out of stock until mid-January.) That could mean the next time 248s are in stock they are GSYNC models. Again, no idea.

I would have to say this GSYNC release was pretty jacked up - no stock for either the PCB kit to us or now the displays. Seems strange for Nvidia to drop the ball this badly on such an awesome tech.

Now if only we can get them to focus on the proper panel and size that gamers want - 1440 IPS, then we would be in business!
???
 
Basically, the DIY retrofit modules that Nvidia promised would be available for the Asus units are not going to be available. Instead, they are going to offer a retrofit service where you send in your old monitor, they retrofit it, and they send it back for ~$300, from what I understand.

Quite a few people are upset because they bought the monitors when Nvidia said they could always add the module themselves when it arrived. Now that's looking to be a $$$ proposition.
 
I like the concept of G-Sync, but I'm not sure Nvidia did the right thing by building its own ASIC.

As I said before, G-Sync is a solution to a problem that shouldn't exist. It's an expensive solution, and since it doesn't work below 30 Hz it can't be used for movies at 24 FPS.

It would be better to propose a new standard to the VESA and HDMI groups (hopefully this will come eventually); that way the cost would be negligible and there would be fewer limitations.
 
Quite a few people are upset because they bought the monitors when Nvidia said they could always add the module themselves when it arrived. Now that's looking to be a $$$ proposition.
I'd wager those guys are probably still within their return policies, depending on the vendor. In any case, they took a fairly big gamble on that and had to expect that they might not win on it. Such is life.

It would be better to propose a new standard to the VESA and HDMI groups (hopefully this will come eventually); that way the cost would be negligible and there would be fewer limitations.
I tried to do a little digging on this, but I couldn't find anything suggesting that NVIDIA has tried to propose something of this nature to VESA (which they're a part of). I'd suggest that NVIDIA went the direction they did for the sake of competitiveness.

It's unfortunate, but I think they're feeling the pressure to maintain dominance in the PC space with so many of AMD's wins in the console space. G-Sync is one way they're doing that. As others have said, though, it doesn't seem like G-Sync (the module itself) is genuinely a vendor-locked technology.
 
Well, that is upsetting news that there will not be a DIY kit. However, I can understand that Nvidia and Asus do not want to deal with people destroying the panel because of an error made during disassembly.

I think it should be noted that the service should cost less, given that the word was these COULD be upgradeable and people made the purchase based on that. So good faith (yeah, right) would mean giving people a free installation, minus shipping and the G-Sync module cost. However, I personally can't confirm that either Asus or Nvidia said this G-Sync module for the VG248s was user-upgradeable. Was this said in a review? I've only read about it on the forum myself.

I bought two of these and later learned that they could be upgraded to G-Sync. Hopefully that option may still exist, but I'm not going to cry about it. I don't plan to stay with 1080p for more than an additional year.
 
I like the concept of G-Sync, but I'm not sure Nvidia did the right thing by building its own ASIC.

They didn't; they customized an FPGA, i.e. a reprogrammable logic chip rather than a fixed ASIC, to do the job. The R&D and the price of the FPGA, along with the initial exclusivity, make it expensive for now.

Assuming that G-Sync stays mostly proprietary, we could easily see the manufacturing margin for G-Sync, built into an ASIC used in place of one that monitors already need, approach $5, with retail cost maybe around $50 over a non-G-Sync monitor.

If G-Sync doesn't stay proprietary, you may not be able to get a monitor without it in a few years, and the marginal cost would be nothing.
 
As I said before, G-Sync is a solution to a problem that shouldn't exist. It's an expensive solution, and since it doesn't work below 30 Hz it can't be used for movies at 24 FPS.
Uh, just because it doesn't work below 30 Hz doesn't mean it won't work with 24 FPS content...

The reason G-Sync doesn't work below 30 Hz is that the image on the LCD will start to fade if you wait much longer than that between refreshes... the simple fix is for G-Sync to double-refresh when the framerate is below 30 FPS.

Problem solved: 24 FPS content would cause the display to run at a perfectly synced 48 Hz.
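
For what it's worth, here's a trivial sketch of that double-refresh idea (my own illustration of how it could work, not a confirmed G-Sync implementation detail):

Code:
# Repeat each source frame enough times to keep the panel at or
# above its assumed 30 Hz minimum (assumed limit, not a real spec).
PANEL_MIN_HZ = 30.0

def refresh_for(content_fps):
    repeats = 1
    while content_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return content_fps * repeats, repeats

print(refresh_for(24))  # (48, 2): every film frame scanned out twice at 48 Hz
print(refresh_for(14))  # (42, 3): even very low framerates stay above 30 Hz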
 
Well, that is upsetting news that there will not be a DIY kit. However, I can understand that Nvidia and Asus do not want to deal with people destroying the panel because of an error made during disassembly.

The DIY kit involves soldering and wiring, according to Tom Petersen (from NV), so it isn't something that the average consumer can deal with, although I'm sure there are exceptions. Just to be clear, from what I've read and seen in videos, it is absolutely not a "plug and play" device. It is difficult to add to a monitor even with a DIY kit, and it would be very easy to end up with a non-working monitor afterwards. With that being the case, I can see why NV would rather people ship their monitors in to have the kit installed. I dunno. There's really no easy answer here. Do you let people destroy their monitors? Or is it better to have them ship their monitors for a professional job?

I could see why some would be upset. But I can also see why Nvidia would be hesitant to ship DIY kits out to individuals. I mean, if you put me near a soldering iron, bad things will happen. LOL. I'm sure I'm not the only one in this category.
 
"I HERP DERP FULLOWED INSTRACTIONS RAIGHT BUT ITZ BROKEN, RAISING HELL WITH RANDOM INSTITUTIONS HERP DERP"

Might be part of the reason.
 
The DIY kit involves soldering and wiring, according to Tom Petersen (from NV), so it isn't something that the average consumer can deal with, although I'm sure there are exceptions. Just to be clear, from what I've read and seen in videos, it is absolutely not a "plug and play" device. It is difficult to add to a monitor even with a DIY kit, and it would be very easy to end up with a non-working monitor afterwards. With that being the case, I can see why NV would rather people ship their monitors in to have the kit installed. I dunno. There's really no easy answer here. Do you let people destroy their monitors? Or is it better to have them ship their monitors for a professional job?

I could see why some would be upset. But I can also see why Nvidia would be hesitant to ship DIY kits out to individuals. I mean, if you put me near a soldering iron, bad things will happen. LOL. I'm sure I'm not the only one in this category.

Well that sucks, but at the same time, understandable.
 
"I HERP DERP FULLOWED INSTRACTIONS RAIGHT BUT ITZ BROKEN, RAISING HELL WITH RANDOM INSTITUTIONS HERP DERP"

Might be part of the reason.

A simple "YOU BROKE IT, YOU OWN IT - VOIDS ALL WARRANTIES - *WARNING* USE AT YOUR OWN RISK - DON'T CALL US, WE SURE AS HELL WON'T CALL YOU" disclaimer should take care of that.
 
The problem with that is that they also lose a potential customer that way. Say the guy screws up putting it in and then says "screw this": Nvidia possibly loses a future customer, and all they got was what the guy paid for the module. In the meantime, more people might stay away from the product because of other people's pains... which means overproduced stock sitting in stores and a dinged reputation. I dunno, I think having trained people do it is probably going to be better for them. Since they're hosting this service, consider also that they're going to be paying labor costs for these upgrades, so they're not necessarily going to make a huge profit off the price hike over the DIY... unless they outsource it hard. Possible.

And as a benefit, they have a 100% chance (assuming they do this right) of getting a satisfied customer who's hooked in for at least the near future. Sounds logical. Mind you, I'm hoping they can eventually install it on my QNIX. I'm not even touching it until they get it onto a 1440p monitor with good quality control.
 
So I have a question:

It's mentioned here, multiple times, that G-Sync is not really a proprietary thing, because there is a variable v-blanking signal built into the DisplayPort standard.

The implication I am getting is that monitor manufacturers have not upgraded or altered their internals to take advantage of this capability and went the cheaper route of using a fixed refresh, ignoring the possible variable-rate signal.

That's fine, I get that.

Ok, so if nVidia's module is a middleman that reads and utilizes this signal, enabling the monitor to use a variable refresh rate, and it meets the DisplayPort standard, then shouldn't it work with any video card (AMD or nVidia) that outputs variable-timed frames along with the correct v-blank refresh interval? That is, if the card meets the DisplayPort standard in its output, shouldn't it work with the G-Sync monitor?

Everything I have read suggests that this is not the case and that it is only compatible with nVidia cards. Maybe they are saying this now because AMD has not built this into their drivers, and a driver update could make them compatible with G-Sync monitors. Is that the case? If not, then I would argue that this is a proprietary implementation that goes beyond merely meeting the DisplayPort standard. If so, it would simply be enabling monitors to take advantage of the DP tech, given that the driver/software was outputting it (had it enabled).

I don't mean to support that guy; I will be getting an nVidia card and a G-Sync monitor as soon as there is a non-TN panel available. I am very excited about the tech, and I just want this point cleared up.
 
Oh, and on his other point, there technically could be a small requirement on game engines for G-Sync to be useful:

It has to be capable of rendering frames at variable intervals, i.e. it has to be able to turn off v-sync. There may be some very simple games where it is always on. Without rendering frames on demand, or as fast as possible, and instead rendering at a fixed 60 FPS, for example, G-Sync would have negligible effect. This is obviously only the case for graphically simple games where it wouldn't be beneficial anyway.
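
If it helps, here's a toy loop (illustrative names only; no real engine API is being modeled) showing why a fixed-tick renderer gets nothing out of variable refresh:

Code:
import time

def render_frame():
    time.sleep(0.012)  # pretend the GPU needs ~12 ms per frame

def run(frames=5, fixed_hz=None):
    """Render some frames; with fixed_hz set, pad each frame out to the
    tick the way v-sync would. The 'present' point is the end of each loop."""
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()
        if fixed_hz:
            tick = 1.0 / fixed_hz
            time.sleep(max(0.0, tick - (time.perf_counter() - t0)))
        # a variable-refresh display would sync its scan-out to this moment
    return time.perf_counter() - start

print(run(fixed_hz=60))  # ~0.083 s: presents land on a rigid 16.7 ms grid
print(run())             # ~0.060 s: presents happen the moment frames finish

With a fixed 60 Hz cadence the presentation intervals are already constant, so there's nothing for G-Sync to adapt to.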
 
Alternatively, if you’re a dab hand with a Philips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.

(Update December 20, 2013: We are excited to confirm that the NVIDIA G-SYNC Do-It-Yourself Kits will be available for purchase in early January. Further details will be announced shortly.)

Right off of Nvidia's page.
Only one monitor is getting a DIY kit.
 
It has to be capable of rendering frames at variable intervals, i.e. it has to be able to turn off v-sync. There may be some very simple games where it is always on.
Good point. The driver should be able to force it off at its level, however.
 
I love the idea of G-Sync, but I'll probably wait a little longer since my monitors are fine for my needs as of now.
 
So I have a question:
Ok, so if nVidia's module is a middleman that reads and utilizes this signal, enabling the monitor to use a variable refresh rate, and it meets the DisplayPort standard, then shouldn't it work with any video card (AMD or nVidia) that outputs variable-timed frames along with the correct v-blank refresh interval? That is, if the card meets the DisplayPort standard in its output, shouldn't it work with the G-Sync monitor?

Anyone have any insight on this? Still curious, can't find the information explicitly
 
Anyone have any insight on this? Still curious, can't find the information explicitly

My understanding is that the display drivers need to be able to talk to the G-Sync module, and AMD's drivers naturally would not be able to talk to an Nvidia product. So while the method used by Nvidia is standards-based, it's still a proprietary module that only talks with Nvidia products.

AMD is purportedly working on a "response" to G-Sync, but quite frankly I doubt they're going to just pull out something that's as good or better. If it were that easy, it would have been done years ago.
 