Slightly lower latency is about it nowadays. Also, FreeSync isn't the same thing as Adaptive-Sync: FreeSync is AMD's branding and certification layered on top of the VESA Adaptive-Sync standard.
With G-Sync Ultimate you get HDR EOTF tracking tuned to the panel for gaming. For example, AW3423DW vs AW3423DWF: the latter has much worse EOTF tracking (Dell has said it will be fixed, though). Or the X32 FP vs older module-equipped monitors like the X27: the X27 produces a much better-looking image in games that aren't well calibrated for HDR.
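For context, EOTF tracking means how closely the panel's actual light output follows the reference curve that HDR content is mastered against, the SMPTE ST 2084 (PQ) curve. A minimal sketch of that reference curve is below; the constants come from the ST 2084 spec, while the sample code values printed at the end are just illustrative:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 content is
# mastered against. "EOTF tracking" compares a panel's measured output
# against this reference curve. Constants are from the ST 2084 spec.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0..1) to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y

if __name__ == "__main__":
    # A perfectly tracking panel hits these targets; deviation from them at a
    # given stimulus level is what reviewers call poor EOTF tracking.
    for code in (0.25, 0.50, 0.75, 1.00):
        print(f"PQ code {code:.2f} -> {pq_eotf(code):8.1f} nits")
```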
Though Nvidia has already lowered the bar for G-Sync Ultimate from true HDR 1000 to any HDR certification, the G-Sync module is still the most powerful processing unit for handling the image and the backlight in various ways, not just latency.
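"Handling the backlight" here mostly means full-array local dimming: the processor picks a brightness level for each backlight zone from the frame content, in step with the variable refresh. A toy sketch of the idea follows; the 8x4 zone grid and the simple max-of-zone statistic are assumptions for illustration, not how any particular G-Sync module actually works:

```python
# Toy illustration of full-array local dimming (FALD): derive a backlight
# level per zone from the incoming frame. Real scalers/modules do this in
# hardware with much more spatial and temporal filtering; the zone layout
# and the max statistic here are assumptions for illustration.

from typing import List

ZONE_COLS, ZONE_ROWS = 8, 4   # hypothetical zone layout

def zone_levels(frame: List[List[float]]) -> List[List[float]]:
    """frame: 2D list of pixel luminance in 0..1; returns per-zone backlight in 0..1."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // ZONE_ROWS, w // ZONE_COLS
    levels = []
    for zr in range(ZONE_ROWS):
        row = []
        for zc in range(ZONE_COLS):
            pixels = [frame[y][x]
                      for y in range(zr * zh, (zr + 1) * zh)
                      for x in range(zc * zw, (zc + 1) * zw)]
            # Drive the zone for the brightest content inside it so highlights
            # aren't crushed (at the cost of some blooming around them).
            row.append(max(pixels))
        levels.append(row)
    return levels
```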
G-Sync is more important than you thought. It controls the backlight as well.

It matters much less than it used to. The main thing I've noticed is that G-Sync Ultimate, or whatever they want to call the ones with the actual module, just works, and works well at all refresh rates. G-Sync Compatible monitors CAN work well, but I've seen a few cases where they have issues. When Nvidia has their hardware in the monitor and in the computer, they seem able to make it work flawlessly.
As for things like latency, for the most part that doesn't seem worth worrying about these days with a good screen. Maybe if you're a top-tier pro gamer, but all in all the good displays are very low latency, be they G-Sync or FreeSync. It doesn't seem to matter enough to get worked up over.
In terms of what a G-Sync module actually IS: it's an FPGA and memory in the monitor with Nvidia's programming on it. Normally a monitor uses some kind of ASIC that handles things like the interfaces, scaling, VRR, all that sort of thing; MediaTek makes some really popular ones. Nvidia designed their own, but they don't ship enough volume to warrant making an ASIC, so they just do it with an FPGA.
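The VRR part of that job is conceptually simple: refresh when a new frame arrives if it falls inside the panel's supported range, and repeat refreshes when the game renders slower than the panel's minimum refresh rate (low framerate compensation). A rough sketch of that decision logic is below; the 48-120 Hz range and the function name are made up for illustration:

```python
# Rough sketch of the refresh decision a VRR scaler / G-Sync module makes.
# The 48-120 Hz panel range is an assumption for illustration.

PANEL_MIN_HZ = 48.0
PANEL_MAX_HZ = 120.0
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ   # fastest the panel can refresh
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ   # longest the panel can hold a frame

def refresh_plan(frame_interval: float) -> tuple[int, float]:
    """Given the time between rendered frames (seconds), return
    (refreshes per frame, interval between those refreshes)."""
    if frame_interval <= MIN_INTERVAL:
        # Game is faster than the panel: refresh at the max rate; extra frames
        # tear or get dropped depending on sync settings.
        return 1, MIN_INTERVAL
    if frame_interval <= MAX_INTERVAL:
        # Inside the VRR window: refresh exactly when the frame arrives.
        return 1, frame_interval
    # Below the VRR window: show the same frame multiple times so each
    # individual refresh stays inside the supported range (LFC).
    repeats = int(frame_interval // MAX_INTERVAL) + 1
    return repeats, frame_interval / repeats

if __name__ == "__main__":
    for fps in (144, 90, 60, 30, 20):
        n, interval = refresh_plan(1.0 / fps)
        print(f"{fps:3d} fps -> {n} refresh(es) at {1.0 / interval:6.1f} Hz each")
```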
I suspect it'll go away in the long run. When it was introduced, G-Sync was the only way to do VRR, period. Now VRR support is getting super common and works pretty well with most things.