AMD Talks About Enhanced Sync Technology

DooKey

AMD is talking up Enhanced Sync, claiming it reduces tearing and lowers latency compared to VSync. That's a nice bonus for Radeon users, and it's even better paired with a FreeSync monitor. You can go to their website and read up on it, but before you do, take a look at Scott Wasson explaining how it works.

Watch the video here.
 
It's like hammering my head into a wall complaining that nVidia refuses to support FreeSync VRR while I own a FreeSync monitor and can't take advantage of it (╯°□°)╯︵ ┻━┻. I really want VRR back, and it sucks having to replace a high-dollar monitor with an even more expensive one that is basically the same, just inflated in price because it has an nVidia stamp on it.
 
Is this new? I could have sworn it has been in the drivers for months at the very least. Did they just enable it for more products or something?
 
So all this does is discard excess frames over the max refresh rate rather than display multiple frames on a single refresh?

NVIDIA calls this 'Fast Sync,' and in my experience it works very poorly on 60 Hz displays with framerates in the 60s and 70s. The pacing gets all wonky when you're simply discarding ~10% of the frames.

I'd have to see this in practice but I suspect a double-buffered adaptive sync is still the best solution for fixed refresh monitors.
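For anyone who wants to see the numbers, here's a toy model of that discard behavior (illustrative Python only, not anyone's actual driver logic; the 66 fps figure just stands in for "framerates in the 60s"):

Code:
# At each refresh, scan out the newest completed frame; any other frames
# completed since the last refresh are simply thrown away.
REFRESH_HZ = 60.0
RENDER_FPS = 66.0   # hypothetical framerate "in the 60s"

def shown_vs_discarded(duration_s=10.0):
    rendered = int(duration_s * RENDER_FPS)
    shown, last_shown = 0, -1
    for k in range(1, int(duration_s * REFRESH_HZ) + 1):
        t = k / REFRESH_HZ                # time of the k-th refresh
        newest = int(t * RENDER_FPS) - 1  # newest frame finished by then
        if newest > last_shown:
            shown += 1
            last_shown = newest
    discarded = rendered - shown
    print(f"rendered={rendered} shown={shown} "
          f"discarded={discarded} (~{100 * discarded / rendered:.0f}%)")

shown_vs_discarded()   # at 66 fps on 60 Hz, ~9% of frames get discarded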
 
It's like hammering my head into a wall complaining that nVidia refuses to support FreeSync VRR while I own a FreeSync monitor and can't take advantage of it (╯°□°)╯︵ ┻━┻. I really want VRR back, and it sucks having to replace a high-dollar monitor with an even more expensive one that is basically the same, just inflated in price because it has an nVidia stamp on it.

Don't worry, both FreeSync and G-Sync become OBE (overcome by events) once HDMI 2.1 hits. That should make VRR displays standard, finally freeing us all from the shackles of fixed-rate refresh.
 
So all this does is discard excess frames over the max refresh rate rather than display multiple frames on a single refresh?

NVIDIA calls this 'Fast Sync,' and in my experience it works very poorly on 60 Hz displays with framerates in the 60s and 70s. The pacing gets all wonky when you're simply discarding ~10% of the frames.

I'd have to see this in practice but I suspect a double-buffered adaptive sync is still the best solution for fixed refresh monitors.

I agree. If this is anything like Fast Sync then IMHO it's nearly useless. You trade severe stuttering and a GPU stressed at 100% all the time for a minor input-lag reduction over normal vsync methods.
 
It has been available for quite some time (1 year+?).
In my experience it's useful when the framerate occasionally dips below the refresh rate: no tearing when you're above it, and it doesn't fall to half the refresh rate when you're below it (which happens with vsync).
It depends on the game; it has improved my experience in some and caused problems in others, so I usually enable/disable it on a per-game basis.
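To make the above/below-refresh behavior concrete, here's a rough decision table (an illustrative sketch; the actual driver heuristics are AMD's and aren't shown here):

Code:
# What each mode does on a fixed 60 Hz display, per the description above.
def behavior(mode: str, fps: float, refresh_hz: float = 60.0) -> str:
    above = fps >= refresh_hz
    if mode == "vsync":
        # Below refresh, a missed vblank forces a wait for the next one,
        # which is the classic 60 -> 30 fps drop.
        return ("capped at refresh, no tearing, added latency" if above
                else "falls to refresh/2 (60 -> 30), no tearing")
    if mode == "enhanced_sync":
        # Below refresh, it presents immediately instead of halving.
        return ("newest frame shown each refresh, extras discarded" if above
                else "runs at native fps, occasional tearing possible")
    raise ValueError(mode)

for mode in ("vsync", "enhanced_sync"):
    for fps in (75.0, 45.0):
        print(f"{mode:13s} @ {fps:.0f} fps: {behavior(mode, fps)}")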
 
You're assuming NV will support HDMI VRR...

It would be silly if they don't. It's part of the HDMI standard, there's no losing face since the AMD brand isn't attached to HDMI 2.1 VRR, and there's nothing for them to add to a G-Sync 2 that would justify it as a separate standard over HDMI VRR.
 
This is great news for all the 580 owners, since that's the only card you can actually get right now. I really want an AMD 1070 Ti for $450, but the AMD lineup stops at 580s for $250...

COME ON VEGA2!!!!
 
There is an AMD 1070 Ti; it's called the Vega 56, in stock at Newegg for $480.
To approximately a third of the board here (mostly Nvidia users), it seems all Vega cards are still 1,200 USD and/or about as fast as a 580.
 
To approximately a third of the board here (mostly Nvidia users), it seems all Vega cards are still 1,200 USD and/or about as fast as a 580.

To approximately a third of the board here (mostly AMD users), it seems all Vega 56 cards are still 400 USD and about as fast as a 1080 Ti.

:ROFLMAO:
 
You're assuming NV will support HDMI VRR...

It's built into the mainline specification. There really isn't anything NVIDIA can do to disable it either; the monitor at the other end should automatically match whatever the current output is.
 
To approximately a third of the board here (mostly AMD users), it seems all Vega 56 cards are still 400 USD and about as fast as a 1080 Ti.

:ROFLMAO:
Oi you cheeky bastard lol!

It's true though. The number of 'AMD is not competitive' posts I see each day from people running 1080s and down is insane. And AMD is competitive there.
TBH I'm surprised they're even available at $570 at the egg with the FreeSync TV news breaking..
 
To approximately a third of the board here (mostly Nvidia users), it seems all Vega cards are still 1,200 USD and/or about as fast as a 580.

Well, I'm an Nvidia user, but saying AMD doesn't have a 1070 Ti equivalent is ludicrous.
 
The number of 'AMD is not competitive' posts I see each day from people running 1080s and down is insane.

You're right that people would be wise to keep such assertions in context. Overall, AMD is less than fully competitive; that's true, and not just in terms of performance.

However, in terms of outright performance they're fairly competitive in the consumer space, and as pricing has gravitated back to earth, AMD's GPU offerings certainly present decent price/performance solutions.
 
You're assuming NV will support HDMI VRR...

It would be silly if they don't. It's part of the HDMI standard, there's no losing face since the AMD brand isn't attached to HDMI 2.1 VRR, and there's nothing for them to add to a G-Sync 2 that would justify it as a separate standard over HDMI VRR.

It's built into the mainline specification. There really isn't anything NVIDIA can do to disable it either; the monitor at the other end should automatically match whatever the current output is.

Right, but Adaptive Sync (aka FreeSync) is in the VESA standard for DisplayPort 1.2a and Nvidia has completely ignored it. I will personally be shocked if Nvidia supports HDMI 2.1 Game Mode VRR in the next GPU release. I reeeeeally hope they do, but it will likely mean they'll have to be willing to let G-Sync die. I mean... why pony up a zillion dollars for a new proprietary G-Sync HDR monitor when presumably many 4K TVs next year will support high refresh 4K with HDMI 2.1 VRR?

Let's all just hope Nvidia does right by their customers this time. Unfortunately that's never really been their jam.
 
Adaptive Sync (aka FreeSync) is in the VESA standard for DisplayPort 1.2a

Optional, using the same optional channel that G-Sync uses.

It's a nice PR stunt on AMD's part, but being 'officially optional for DisplayPort' doesn't change the base fact, or that it was literally the only way AMD could get traction for their half-assed implementation at all.

I will personally be shocked if Nvidia supports HDMI 2.1 Game Mode VRR in the next GPU release

I'm about 50/50; HDMI VRR isn't 'FreeSync' and it isn't AMD, and G-Sync is DisplayPort only. I just hope that initial HDMI VRR implementations are less half-assed than the first FreeSync ones were/are.
 
So all this does is discard excess frames over the max refresh rate rather than display multiple frames on a single refresh?

NVIDIA calls this 'Fast Sync,' and in my experience it works very poorly on 60 Hz displays with framerates in the 60s and 70s. The pacing gets all wonky when you're simply discarding ~10% of the frames.

I'd have to see this in practice but I suspect a double-buffered adaptive sync is still the best solution for fixed refresh monitors.


Compared to what? No sync, vsync, or FreeSync?

Triple buffering (aka Fast Sync, aka AMD Enhanced Sync) has always been preferable to double buffering when running vsync, in my opinion. You also get lower input latency.

Where it really shines, though, is when your FPS dips below 60 and you don't drop to the awful 30 FPS you get with double-buffered vsync.

Granted, FreeSync/G-Sync is better there, but on a non-variable-refresh monitor, triple buffering has been very good for me.
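A quick back-of-the-envelope on that 30 FPS drop (a toy calculation, assuming a 60 Hz panel and a constant, hypothetical 18 ms frame time):

Code:
REFRESH = 1.0 / 60.0   # ~16.7 ms between vblanks on a 60 Hz display
FRAME_TIME = 0.018     # assume the GPU needs 18 ms per frame (just misses vblank)

# Double-buffered vsync: after missing a vblank, the finished frame has to
# wait for the next one, so every frame ends up occupying two refresh
# intervals and the effective rate halves.
double_buffered_fps = 1.0 / (2 * REFRESH)    # -> 30.0 fps

# Triple buffering: the GPU keeps rendering into a third buffer instead of
# stalling, so the effective rate stays near the actual render rate.
triple_buffered_fps = 1.0 / FRAME_TIME       # -> ~55.6 fps

print(f"double-buffered vsync: {double_buffered_fps:.1f} fps")
print(f"triple buffering:      {triple_buffered_fps:.1f} fps")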
 
Say what you want about AMD cards, but they're the cheaper path to VRR, which I think is one of the best innovations in gaming in the last 10 years.
 
I agree. If this is anything like Fast Sync then IMHO it's nearly useless. You trade severe stuttering and a GPU stressed at 100% all the time for a minor input-lag reduction over normal vsync methods.
Maybe that perception depends on the monitor's refresh rate. Fast Sync entirely fixed G-Sync's issue for me at 120 Hz and above when I transition past my monitor's refresh rate, and I notice no stuttering.

https://hardforum.com/threads/my-experience-with-freesync-vs-g-sync.1952358/
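That tracks with simple math: the worst-case pacing error from displaying a slightly stale frame is roughly one refresh interval, and that window shrinks quickly at higher refresh rates. A toy illustration:

Code:
# Worst-case pacing error when excess frames are discarded is about one
# refresh interval, so it shrinks as the refresh rate climbs.
for refresh_hz in (60, 120, 144):
    interval_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:3d} Hz: worst-case pacing error ~{interval_ms:.1f} ms")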
 
Optional, using the same optional channel that G-Sync uses.

It's a nice PR stunt on AMD's part, but being 'officially optional for DisplayPort' doesn't change the base fact, or that it was literally the only way AMD could get traction for their half-assed implementation at all.

Freesync does the bare minimum to make VRR workable. The limits on the FPS range it works in are a major killer. And both are hobbled by being DisplayPort-only, since there's no way to move the spec over to TVs. That's the main reason both are DOA once HDMI 2.1 comes out. [Hell, I'd argue DisplayPort as a specification goes the way of FireWire once the new HDMI spec hits.]

I'm about 50/50; HDMI VRR isn't 'FreeSync' and it isn't AMD, and G-Sync is DisplayPort only. I just hope that initial HDMI VRR implementations are less half-assed than the first FreeSync ones were/are.

From what I've read, it looks like the HDMI VRR can be thought of as a better implementation of Freesync. Gsync is still technically superior since you're relying on an ASIC to handle the processing on the display side, but the HDMI solution should cover the holes that Freesync doesn't really address.
 
Freesync does the bare minimum to make VRR workable. The limits on the FPS range it works in are a major killer. And both are hobbled by being DisplayPort-only, since there's no way to move the spec over to TVs. That's the main reason both are DOA once HDMI 2.1 comes out. [Hell, I'd argue DisplayPort as a specification goes the way of FireWire once the new HDMI spec hits.]



From what I've read, it looks like the HDMI VRR can be thought of as a better implementation of Freesync. Gsync is still technically superior since you're relying on an ASIC to handle the processing on the display side, but the HDMI solution should cover the holes that Freesync doesn't really address.
Freesync has always worked over HDMI, and only the first version of G-Sync was tied to DisplayPort. More and more televisions that support Freesync are being released now that the Xbox One S and X support it.

Seeing as Freesync is AMD's implementation of Adaptive Sync, VRR is really HDMI's version of the latter. It still grinds my gears that people conflate the two...
 