AMD finally updating the FreeSync spec

I don't see how this is a good thing in any way, shape, or form. They didn't do ANYTHING that actually improves Freesync or the way it works. It seems like the ONLY impact that this change will have is that a lot of displays that would have otherwise come with Freesync support will now no longer have Freesync support. That's NOT a good thing; why would it be?

Creating a 144Hz minimum is going to cut off a lot of displays, yet IMO Freesync is most useful when FPS dips lower. The idea that Freesync isn't useful just because your monitor can only do 120Hz, or 85Hz, or whatever, is silly. The benefits definitely still exist.

It seems like this will impact TVs especially hard, and cut down on the number of TVs that can be effectively used as good gaming monitors. That's a shame. I just hope that these "low-end" monitors are still able to come with some form of Adaptive Sync, and that this is just an irrelevant branding issue by AMD. I don't care about "Freesync" anyway, I just need it to be "G-Sync Compatible" ;)

Were there really people out there that were making monitor choices based on "Freesync" branding?

My current monitor has an actual G-Sync module and so far it's provided me with a better experience than any Freesync monitor I ever used. Maybe they should work on actually improving the tech instead of limiting which monitors it works on.
 
I don't think this means actually disabling FreeSync on monitors that don't have 144Hz capability; how would they do that without retroactively affecting older monitors? Surely this only means such monitors can't be sold with FreeSync branding, so this is purely an optics change. You'll still be able to turn on FreeSync on 120Hz monitors.

I couldn't care less what branding is on the display as long as it works.
 
It seems like this will impact TVs especially hard, and cut down on the number of TVs that can be effectively used as good gaming monitors

AMD is enforcing a 144Hz refresh rate or greater for displays that have a horizontal resolution of less than 3440 pixels. For displays that exceed that resolution limit, a refresh rate limit is not enforced.
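That rule is simple enough to sketch as a quick check (this is just my paraphrase of the requirement as described in the article, not AMD's official spec text, and the function name is mine):

```python
def meets_refresh_requirement(h_res: int, max_refresh_hz: int) -> bool:
    """Check a display against the refresh-rate rule as described:
    below 3440 horizontal pixels you need 144Hz or greater, while
    at or above 3440 the bar drops to 120Hz. (My reading of the
    article, not AMD's official wording.)"""
    required = 144 if h_res < 3440 else 120
    return max_refresh_hz >= required

# A 1440p 165Hz monitor passes; a 1080p 120Hz one doesn't;
# a 4K 120Hz display passes thanks to the resolution exception.
print(meets_refresh_requirement(2560, 165))  # True
print(meets_refresh_requirement(1920, 120))  # False
print(meets_refresh_requirement(3840, 120))  # True
```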

Any TV that is any good for gaming isn't going to be lower than UHD resolution in the first place, because TV manufacturers don't care at all about making sub-UHD displays with good total latency and panel response times.
 
Yeah, I missed that exception for 4K. I think that an important exception such as that should have been included in the graph in that article that outlined what the specific requirements were for each tier, because that's kind of significant... then again this is Tomshardware we're talking about.

But I still don't think that an announcement that does nothing to actually further the tech, and only serves to exclude more displays, is the right direction to go.
 
You mean this one?

[attached image: the article's FreeSync tier requirements chart]

It shows very clearly right there, less than 3440 res: max refresh equal to or greater than 144Hz.
 
It shows very clearly right there, less than 3440 res: max refresh equal to or greater than 144Hz.

It also says below that, "≥ 3440 Horizontal resolution: Max. Refresh Rate: ≥ 120 Hz", so with no exception mentioned in the graph itself, I assumed 4K displays fell into that category. Either way, I'm not interested in nitpicking that any further. My main point was simply that this change does nothing other than exclude displays that wouldn't have been excluded previously, which isn't a good thing IMO, and that's still true even with the exception.

[attached image: the same chart with the ≥ 3440 row marked]
 
My main point was simply that this change does nothing other than exclude displays that wouldn't have been excluded previously, which isn't a good thing IMO, and that's still true even with the exception.

Given that this isn't a change that's going to retroactively strip FreeSync compatibility out of existing budget displays, I don't see what the big deal is. This is a branding issue more than anything, because there are a lot of terrible "gaming" displays that try to leech off of AMD's image by slapping on FreeSync compatibility to make it seem like some sort of premium feature, and because of that, many people equate FreeSync with cheap shit compared to G-Sync. They need to reduce the number of low-quality displays that try to bank off of their branded tech, so it doesn't look like the Great Value alternative to Nvidia's option.

These inexpensive display makers always have the option of going with the generic VESA VRR standard.

As for the TV angle, this isn't anywhere near as big a deal as you think it is. It's not 2014 anymore. The number of UHD TVs, even inexpensive ones, far exceeds the number of FHD models nowadays. That automatically means that most of them are going to be at the very least FreeSync compliant, with most mid-to-upper tier models also hitting the FreeSync Premium minimum of 120Hz.
 
It also says below that, "≥ 3440 Horizontal resolution: Max. Refresh Rate: ≥ 120 Hz", so with no exception mentioned in the graph itself, I assumed 4K displays fell into that category. Either way, I'm not interested in nitpicking that any further. My main point was simply that this change does nothing other than exclude displays that wouldn't have been excluded previously, which isn't a good thing IMO, and that's still true even with the exception.

That's just for FreeSync Premium. Displays/TVs which don't meet that req but meet the one for FreeSync can still be called FreeSync displays. Not that the name really matters other than to let you know what level of features it supports.

A monitor could not meet any of those reqs and still do the same things a FreeSync display does. Just maybe not as well. 🤷‍♂️
 
My first taste of VRR was a 75Hz Freesync/G-Sync Compatible 1080p display... It was AWESOME (at the time).

Me personally, I just cap everything at 90 FPS to keep frame times in check, and everything appears smooth as butter to me.
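For anyone curious why a 90 FPS cap feels smooth, the frame-time arithmetic is just the reciprocal of the frame rate (plain math, nothing vendor-specific):

```python
# Frame time in milliseconds is simply 1000 divided by the frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_time_ms(90), 2))   # 11.11 ms per frame at a 90 FPS cap
print(round(frame_time_ms(144), 2))  # 6.94 ms at 144 FPS
print(round(frame_time_ms(30), 2))   # 33.33 ms back in the 30 FPS days
```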

- 'Course I grew up in an era where 30+ FPS was considered amazing.
 
Updating the specs for higher tiers to align with what are premium specs today isn't a bad thing. I dislike them setting different refresh rates for laptop and desktop displays, though; that's just going to add confusion. I do wish they'd added a basic tier at the bottom to capture variable-refresh 60/75Hz displays, though; they're still better than fixed 60Hz models.

- 'Course I grew up in an era where 30+ FPS was considered amazing.

I'm having flashbacks to trying to play Duke Nukem 3D at <5 FPS. It actually sorta worked, because the game engine handled keyboard input faster than that, so I could turn for 1/3rd of a frame and shoot and have a halfway decent chance of hitting my target. Otherwise it was still awful, as you can imagine. 😂

A few years and a new computer later 20 FPS made me feel like a god in Quake 2 deathmatches. /old man screaming at clouds
 
I do wish they'd added a basic tier at the bottom to capture variable-refresh 60/75Hz displays, though; they're still better than fixed 60Hz models.
That would be covered by the VESA Certified Media Display.

AMD’s biggest issue is that the VESA Certified Adaptive Sync program announced in 2022 does a better job at ensuring quality than the FreeSync one did, and the VESA programs are scammy as hell.
 