G-Sync: why was it not added to GPU cards?

_l_

I just started reading about G-Sync tonight and saw displays that have this G-Sync tech inside. Was wondering why nVidia didn't just add the tech to their GPU cards instead of marketing those very expensive displays?
 
It's the G-Sync module (the hardware added to the display) and the Nvidia licensing fees that make G-Sync monitors so expensive. This article explains it:
https://wccftech.com/amd-freesync-nvidia-gsync-verdict/

If Nvidia wanted, they could support the Adaptive-Sync open standard as AMD has, which provides the technology mostly on the GPU side, making the monitors much less expensive.

Though I guess you could make the argument that G-Sync pioneered the concept in proprietary form, and that it wouldn't have existed as an open standard without them.
 
I just started reading about G-Sync tonight and saw displays that have this G-Sync tech inside. Was wondering why nVidia didn't just add the tech to their GPU cards instead of marketing those very expensive displays?

Because it's handled in the monitor, just like Freesync. It requires something on both ends. And now the actual standard is a 3rd option.
 
I just started reading about G-Sync tonight and saw displays that have this G-Sync tech inside. Was wondering why nVidia didn't just add the tech to their GPU cards instead of marketing those very expensive displays?

Freesync and G-Sync are based on an embedded DisplayPort (eDP) standard that has existed as a power-saving feature in laptops and APUs for years. There was nothing like this on desktop displays, and the hardware needed to drive a sync display wasn't on any GPU.

To get sync tech you need a framebuffer, a timing controller and a suitable scaler in the monitor. Nvidia went down the route of using an FPGA chip to do everything: the chip is the scaler, framebuffer and timing controller all wrapped into one. You don't need any new hardware on the GPU side, just a DisplayPort output. Nvidia licenses this tech to monitor manufacturers, so Asus, Acer etc. have to buy the module from Nvidia, which pushes up the cost of the monitor. This is called G-Sync.

You can't do it all on the GPU side. The route Nvidia took means you don't need anything extra on the GPU side, only an Nvidia GPU with a DisplayPort output.

The other sync tech is Adaptive-Sync. There is no license fee for monitor manufacturers to use it, so it's cheaper to make monitors using Adaptive-Sync. This needs updated hardware on both the GPU and the monitor to work. AMD uses this tech, and their method of connecting to an Adaptive-Sync monitor is called Freesync.
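
To make the "why can't the GPU do it alone" point more concrete, here is a toy Python sketch (purely illustrative, not any vendor's actual driver or scaler code, and the numbers are made up) comparing a fixed 60 Hz refresh, where a finished frame has to wait for the next scanout tick, with a variable refresh, where the panel refreshes whenever the frame is ready.

Code:
# Toy model of fixed vs. variable refresh (illustrative only).
# "Frames" arrive from the GPU at irregular intervals; we measure how long
# each finished frame waits before the display can actually show it.
import random

def simulate(frame_times_ms, fixed_refresh_hz=None):
    # fixed_refresh_hz=None models variable refresh: the panel refreshes as
    # soon as a frame is ready (its supported min/max range is ignored here).
    refresh_interval = 1000.0 / fixed_refresh_hz if fixed_refresh_hz else None
    now = 0.0
    delays = []
    for ft in frame_times_ms:
        now += ft  # time at which this frame finishes rendering
        if refresh_interval is None:
            delays.append(0.0)  # panel refreshes on demand
        else:
            # the frame has to wait for the next fixed scanout tick
            next_scanout = ((now // refresh_interval) + 1) * refresh_interval
            delays.append(next_scanout - now)
    return sum(delays) / len(delays)

random.seed(0)
frames = [random.uniform(14, 24) for _ in range(1000)]  # roughly 42-70 fps, uneven pacing
print("avg wait, fixed 60 Hz:", round(simulate(frames, 60), 2), "ms")
print("avg wait, variable   :", round(simulate(frames), 2), "ms")

The point of the toy model is just that with a fixed refresh the wait (and the resulting stutter) depends on where each frame happens to land relative to the refresh tick, while with variable refresh that wait disappears; the real work of making that possible lives in the monitor's scaler/timing controller, which is exactly the part the G-Sync module or an Adaptive-Sync scaler provides.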
 
Because it's handled in the monitor, just like Freesync. It requires something on both ends. And now the actual standard is a 3rd option.

There is no 3rd option? There are Adaptive-Sync monitors and G-Sync monitors.
 
Gsync, Freesync and VRR. 3 different technologies. Actually 4 with Adaptive Sync.

These are all just different implementations of adaptive sync. VRR is part of HDMI 2.1 and will be implemented in TVs as a software update; it can also work over HDMI 2.0.

G-Sync is the only one that seems to do a bit more on top, at least according to Wikipedia, with things like overdrive prediction and frame collision avoidance. So far it seems to work a bit better than the Freesync options, but I don't know if Freesync 2 levels that playing field.
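
For what it's worth, my reading of the "frame collision" term is the case where a new frame finishes while the panel is still mid-scanout of the previous one, so it can't be flipped immediately and has to be held briefly. A rough, hypothetical sketch of that timing decision (the names and the 144 Hz figure are mine, not Nvidia's):

Code:
# Hypothetical sketch of the "frame collision" case in a variable-refresh scheduler.
# If a new frame arrives while the previous scanout is still in progress, it cannot
# be sent to the panel immediately and has to wait for that scanout to finish.

SCANOUT_MS = 1000.0 / 144  # assumed time to scan one frame out to a 144 Hz panel

def earliest_flip_ms(frame_ready_ms, last_scanout_start_ms):
    # Return the earliest time the new frame can actually be sent to the panel.
    scanout_end = last_scanout_start_ms + SCANOUT_MS
    if frame_ready_ms >= scanout_end:
        return frame_ready_ms   # panel is idle: refresh immediately (the VRR ideal)
    return scanout_end          # collision: hold the frame until scanout completes

print(earliest_flip_ms(frame_ready_ms=20.0, last_scanout_start_ms=18.0))  # collides, waits
print(earliest_flip_ms(frame_ready_ms=30.0, last_scanout_start_ms=18.0))  # no collision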

The way I see it, Nvidia has picked a very Apple-like way to approach adaptive sync where they control both the hardware and software. This means they can optimize G-Sync to work well on a far more limited set of hardware than whatever gets put into Freesync displays. This does make the displays more expensive for the OEM and that cost gets passed down to the consumer.

I just wish Nvidia let people choose what they want to use. Enthusiasts might still consider displays with G-Sync modules worth the premium price. If you look at this list of Freesync monitors, you can see that several don't have any kind of motion blur reduction tech (backlight strobing) built in, whereas nearly all G-Sync displays that can run at 120 Hz or above (you need 120 Hz for ULMB) support G-Sync's ULMB mode, which is a great alternative to G-Sync in games that run at high framerates. I use ULMB in a lot of games that hold 60+ fps consistently, as it makes a big difference in motion clarity.
 