G-sync proven to be just vendor lock-in on top of AdaptiveSync?

If this proves to be true, would it be possible for AMD's drivers to be similarly modded to enable "free"sync on the same monitors using 29x-series cards?
 
The allegation is that at least some Nvidia GPUs can talk to eDP/Adaptive-sync monitors by bypassing the ping-pong checking, not that arbitrary GPUs can talk to G-sync FPGA modules, which seems less likely.
 
The allegation is that at least some Nvidia GPUs can talk to eDP/Adaptive-sync monitors by bypassing the ping-pong checking, not that arbitrary GPUs can talk to G-sync FPGA modules, which seems less likely.

Fair enough. If this opens the pool of monitors available, that's still a good thing.
 
questionable site... though still curious

then found this:

from LTT forum
http://linustechtips.com/main/topic/299222-is-nvidia-g-sync-a-scam/?p=4064652
G-SYNC doesn't use VESA Adaptive-Sync. It was created before Adaptive-Sync was added to the DisplayPort spec, and it works over DisplayPort 1.2, which predates Adaptive-Sync's inclusion. The G-SYNC module also has a physical 768MB memory cache to buffer frames; you can't just software your way around that. Adaptive V-SYNC is also something completely different: it's just a dynamic on/off toggle for V-SYNC, and has nothing to do with changing the monitor's refresh frequency.

If the monitor's scaler is not designed for constantly varying frequencies, it will not have a fun time trying, in most cases. But since I have an NVIDIA GPU (780 Ti) and two DP 1.2a monitors (Dell U2414H and U2415), I decided to put this little theory to the test and installed the modded driver. As expected, it did not work: no G-SYNC option. I also noticed that in the comments of the article there isn't a single comment from someone saying "it worked for me" or anything, just some people saying it didn't work, or asking technical questions about how it's possible for this to work. Which it isn't.

seems like just another BS article from some random website... but of course the nature of the internet will let it spread like wildfire anyway
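Side note on the Adaptive V-Sync point in that quote, since the names get mixed up constantly: Adaptive V-Sync only decides *whether* to sync against a fixed refresh, nothing more. A minimal sketch of the idea (my own illustration, not Nvidia's actual driver logic; all names invented):

```python
# Hypothetical sketch of Adaptive V-Sync's decision logic (invented names).
# V-Sync is toggled based on achievable frame rate, while the monitor's
# refresh rate itself never changes (unlike G-Sync/Adaptive-Sync).

REFRESH_HZ = 60.0

def adaptive_vsync_enabled(recent_frame_times_s):
    """Enable V-Sync only when the GPU is keeping up with the display."""
    fps = len(recent_frame_times_s) / sum(recent_frame_times_s)
    # At or above the refresh rate: sync, to eliminate tearing.
    # Below it: unsync, so frames aren't held for the next 60 Hz boundary
    # (which would drop effective output straight to 30 fps).
    return fps >= REFRESH_HZ

print(adaptive_vsync_enabled([1 / 100] * 10))  # True  -> V-Sync on
print(adaptive_vsync_enabled([1 / 45] * 10))   # False -> V-Sync off
```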
 
This could very easily be wrong, fake, or even malware, but it'll be interesting to see. The underlying truth regardless is that there's no real reason for the G-sync ping-pong protocol to exist and that eDP panels have been capable of dynamic sync for years, so nothing in the problem requires magic to resolve.
 
Well, obviously there is no magic in computing technology, but the "problem" originally solved by design in eDP was not motion clarity but power saving. If existing specs also provided a complete solution to the problem of motion clarity with comparable success to G-sync, it was through serendipity and not design; this is the first time I've heard that claim.
 
G-sync/Adaptive-sync/FreeSync have little to do with motion clarity; they just eliminate the tradeoff between V-sync stutter and no-sync frame tearing.

LightBoost/ULMB/other strobing are what reduce motion blur, and they are currently not enabled simultaneously with any form of dynamic syncing.
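To make the tradeoff concrete, here's a toy timing model (my own illustration; the numbers are made up) of when a finished frame actually hits the screen under each mode:

```python
# Toy timing model of the stutter/tearing tradeoff (illustrative only).
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz scanout interval (~16.7 ms)

def vsync_ms(render_ms):
    # V-Sync: the flip waits for the next fixed refresh boundary, so a
    # 17 ms frame is held until ~33.3 ms -> visible stutter/judder.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def nosync_ms(render_ms):
    # No sync: the flip happens immediately, mid-scanout -> tearing,
    # but with no added delay.
    return render_ms

def vrr_ms(render_ms):
    # Variable refresh (G-sync/Adaptive-sync): the monitor starts a new
    # refresh when the frame is ready -> no wait and no tear, as long as
    # render_ms stays inside the panel's supported refresh range.
    return render_ms

for r in (12.0, 17.0, 25.0):
    print(f"{r:4.1f} ms frame -> vsync {vsync_ms(r):4.1f} ms, "
          f"no-sync {nosync_ms(r):4.1f} ms (torn), vrr {vrr_ms(r):4.1f} ms")
```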
 
G-sync/Adaptive-sync/FreeSync have little to do with motion clarity; they just eliminate the tradeoff between V-sync stutter and no-sync frame tearing.

LightBoost/ULMB/other strobing are what reduce motion blur, and they are currently not enabled simultaneously with any form of dynamic syncing.

Judder is an issue of motion clarity.
 
True, but it's arguably secondary compared to sample-and-hold blur.

Arguably, but the importance of the problem wasn't my point. Variable refresh in eDP was pitched as a way to save power when not playing games or watching video. It's not impossible that it happens to be perfectly compatible with G-sync without modifications, but it would be at least a little surprising if it weren't kinda broken in some way.
 
http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/

Somebody claims to have modified Nvidia drivers to enable G-sync on any eDP/Adaptive-sync panel by disabling the monitor polling/FPGA module authentication check.

If true, this basically forces Nvidia to openly support Adaptive-sync or admit that they invented nothing but a $150 anti-consumer lock-in chip.
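For what "disabling the check" would even mean in practice, here's a purely hypothetical sketch of a feature gated behind a hardware handshake. None of these names correspond to anything in Nvidia's actual driver; it's just the shape of the allegation:

```python
# Purely hypothetical sketch of the alleged gating; all names invented.

def module_answers_challenge(monitor) -> bool:
    """Stand-in for the alleged ping-pong exchange with a G-sync module:
    only 'licensed' hardware answers correctly."""
    return getattr(monitor, "has_gsync_module", False)

def variable_refresh_allowed(monitor, check_patched: bool = False) -> bool:
    # The allegation: the panel-side capability (eDP/Adaptive-sync) is
    # already sufficient, and this software gate is all that stops it.
    if check_patched:  # what the modded driver supposedly does
        return getattr(monitor, "supports_edp_vrr", False)
    return module_answers_challenge(monitor)
```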

Oh dear, if this is in any way correct, the last week has not been good for Nvidia at all :eek:
 
Obvious fake (or maybe he just managed to modify the driver so the OPTION appears but does nothing).

The video of the supposed G-sync pendulum has tearing and stutter, so it doesn't work.
(You can pause the video and see tearing; the camera can't artificially add tearing artifacts.)

IF it had been true, be sure AMD would've rushed out months ago and said "monitors x, y, z are already compatible!!!111".
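For anyone who'd rather not argue over paused video frames: a rough way to look for a tear line, given two consecutive frames exported from the video (my own sketch, not from the article; the threshold is arbitrary and would need tuning):

```python
# Rough tear-line check between two consecutive exported video frames
# (HxWx3 uint8 arrays). My own sketch; the threshold is arbitrary.
import numpy as np

def find_tear_row(prev_frame: np.ndarray, curr_frame: np.ndarray):
    """In a torn frame, scanlines above the tear still match the older
    source frame while those below come from the newer one, so per-row
    change shows a sharp step. Returns the step row, or None."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    per_row = diff.mean(axis=(1, 2))   # mean pixel change per scanline
    step = np.abs(np.diff(per_row))    # row-to-row jump in that change
    row = int(step.argmax())
    return row if step[row] > 10 * step.mean() else None
```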
 
Obvious fake (or maybe he just managed to modify the driver so the OPTION appears but does nothing).

The video of the supposed G-sync pendulum has tearing and stutter, so it doesn't work.
(You can pause the video and see tearing; the camera can't artificially add tearing artifacts.)

IF it had been true, be sure AMD would've rushed out months ago and said "monitors x, y, z are already compatible!!!111".

Exactly. And ASUS and all the other G-SYNC monitor makers wouldn't have added all that costly hardware to their monitors just to get them working, which for the ROG Swift even required overclocking the G-SYNC unit.
 
It's adding a non-functional menu option. You could add a menu option saying "4K120Hz", but that doesn't mean the hardware will magically conform to your menu option.
 
It's adding a non-functional menu option. You could add a menu option saying "4K120Hz", but that doesn't mean the hardware will magically conform to your menu option.

I get that that's likely what's happening, I just haven't seen anyone demonstrate that it's not doing anything...

It's certainly hoax-ish? Hoax-y? I'm just not seeing anyone clearly show it to be b.s.
 
The pendulum demo video he posted was absolutely using Gsync. Not sure how he was faking that.
 
The pendulum demo video he posted was absolutely using Gsync. Not sure how he was faking that.

I see one likely possibility. There's a good chance that the pendulum demo is specifically coded so that when it's showing 'vsync active' it's artificially inducing a bit of stutter. I have played several games where there is absolutely no difference between gsync and vsync if you are operating at max fps (capped at the refresh rate).
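Something this trivial would do it; the demo only has to jitter its own frame pacing when the 'vsync' mode is selected. A hypothetical sketch, just to illustrate the suspicion (invented names, not decompiled demo code):

```python
# Hypothetical sketch of a demo sabotaging its own "V-Sync" mode.
# Invented names; this is the suspicion, not actual demo internals.
import random
import time

def present_frame(mode: str, frame_budget_s: float = 1 / 60):
    time.sleep(frame_budget_s)                # stand-in for render + flip
    if mode == "vsync":
        time.sleep(random.uniform(0, 0.008))  # inject 0-8 ms of stutter
    # the "gsync" path gets clean pacing, so it looks smoother by design
```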
 
It is not faked. But it will only work with eDP monitors.

3 different monitors that I own do NOT have an interface conforming to eDP but LVDS, and they are all brand new (Dell U2414H, Samsung S27D850T, Samsung U3415W). You can check the type of interface here:
http://www.panelook.com/modelsearch...resolution_pixels=34401440&production_state=1 (e.g. this one is the panel in the U3415W)

It does work on one coworker's laptop with a dedicated GPU, though. Had two other coworkers try their panels: all LVDS!


Certainly not what most of us are looking for, so really he should have mentioned this fact more clearly in the beginning. But he appears to have mostly been working on laptop-related topics? Probably did not expect us PC monitor fanatics to swing by in force :p. So not sure if this was intentional clickbait or if he ever seriously thought he could bring this to LVDS displays. I am not going to buy a monitor from some weird company just so I can use this workaround.
 
I'd wait for someone to point a high-speed camera at a display and measure frame intervals. Trying to tell by eye whether something is refreshing asynchronously is going to be very hard to verify (particularly if something else is going on to artificially induce additional judder in the test application for VSYNC on/off), and trying to tell from a 24/30 fps (or even 60 fps) video of a screen is nearly impossible due to how cameras work.
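Concretely, the test would look something like this (a sketch, assuming you've already extracted from the footage the capture timestamps at which the panel visibly starts each refresh, e.g. via a flashing test pattern):

```python
# Sketch of the high-speed-camera test. Assumes you've already pulled the
# timestamps of each visible refresh start out of the footage.
import statistics

def refresh_intervals(refresh_starts_s):
    """Seconds between consecutive panel refreshes."""
    return [b - a for a, b in zip(refresh_starts_s, refresh_starts_s[1:])]

def looks_variable(intervals_s, jitter_floor_s=0.0005):
    # A fixed 60 Hz scanout gives ~16.7 ms intervals with only tiny,
    # camera-induced spread; genuine variable refresh shows spread far
    # beyond that floor while the game's frame rate fluctuates.
    return statistics.pstdev(intervals_s) > jitter_floor_s
```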
 
For the people who got the G-Sync option to show up with this driver, I would like them to give more impressions of how actual games perform and feel with it. The Pendulum Demo is not the end-all, be-all indicator of how well G-Sync performs. One guy on the Overclock.net forum thread mentioned getting screen flashes every 5-10 seconds in BF4. If this is the case with a lot of games, then this driver hack is pretty much worthless even if it is enabling some kind of variable refresh rate.
 
For the people who got the G-Sync option to show up with this driver, I would like them to give more impressions of how actual games perform and feel with it. The Pendulum Demo is not the end-all, be-all indicator of how well G-Sync performs. One guy on the Overclock.net forum thread mentioned getting screen flashes every 5-10 seconds in BF4. If this is the case with a lot of games, then this driver hack is pretty much worthless even if it is enabling some kind of variable refresh rate.

Because people with official Gsync monitors never run into issues, right?
 
Because people with official Gsync monitors never run into issues, right?

Haha, you've got a point, but a lot of the early G-Sync issues have been ironed out, at least for the games I've tried on my Swift. Checking the Overclock.net thread again, one of the users said that Shadow of Mordor felt much smoother on his laptop with the hack's G-Sync option turned on than with VSync. Also, it looks like PC Perspective has confirmed that the driver works, but that's because the leaked driver on the ASUS website was an alpha driver for mobile G-Sync. Here is the article and video:

http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver
 
That video claims that the displays disconnect, don't load the proper refresh rate, etc.

Close this thread, it's nonsense, and stop wearing those tinfoil hats, people.
 
So it sounds like eDP/Adaptive-sync is "good enough" to earn the G-sync moniker in the mobility space at least.

Even though there have been a lot of insinuations about what G-sync controllers might be capable of, it's unlikely we'll hear anything substantial until Adaptive-sync/FreeSync monitors are on the market for true side-by-side comparisons. If Nvidia thinks what they have is worth a $150 premium over Adaptive-sync/FreeSync, they had better have some amazing new features or comparative strengths to show.

I'm of the school of thought that smarter scalers/TCONs probably do help image and motion quality, but that different scaler manufacturers should be given an open communications protocol and allowed to compete as they see fit. G-sync might have a higher quality floor for its implementations than Adaptive-sync ends up having initially, but Nvidia's walled-garden approach may see it passed up on quality before they know it.
 
Still, this doesn't change the fact that Nvidia is lying about the extra equipment needed, if eDP is the only thing needed to run G-Sync on laptops.
 
Still, this doesn't change the fact that Nvidia is lying about the extra equipment needed, if eDP is the only thing needed to run G-Sync on laptops.

No, Nvidia is not lying. Desktop monitors are not set up like laptop displays. Go back a year, to January 8, 2014, to see Nvidia explain the difference:

"Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand. That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction."

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

Nvidia's nudge is working: the first eDP TCON for 4K desktop monitors was announced on October 22, 2014:

“Traditionally only used between a GPU and embedded display, eDP is now making inroads as the panel interface within a computer monitor,” said Jimmy Chiu, Executive VP of Marketing at Parade Technologies. “As monitors move towards higher resolution such as 4K and beyond, the pin and wire count to support the existing LVDS interface, or even MIPI, is simply not practical."

http://www.paradetech.com/2014/10/parade-announces-dp667/
 
So it looks like gsync on mobile will just be freesync? Wonder what this means for gsync on the desktop in the future...
 
While Gamenab might be an idiot, I think it's a stretch to call him a liar. He found something and showed it working and then made a bunch of extrapolations that were far-fetched and founded in ignorance, but it's not like he knew the truth and was saying the opposite.

I'm glad he provoked places to look into this though. It got some good info out there.
 
Yeah, the gamenab guy is a tinfoil-hat nutjob and makes accusations based on assumptions, but he apparently did figure something cool out, possibly something Nvidia might not be happy about. It remains to be seen, though, once FreeSync/Adaptive-sync monitors get released.
 