Why G-Sync and FreeSync are important for emulation and old games

Good video, makes some interesting points. I always wondered why some older emulated games had microstutter in them.

I've been meaning to get a G-Sync monitor, but from what I've seen there aren't any glossy ones :(
 
Keep in mind that modern emulator front-ends like RetroArch can re-clock games to eliminate stutter and correct the audio pitch.
Though old arcade games may run at around 55Hz, newer systems are typically within ±2% of 60Hz.
I doubt most people would notice - or care - if a Neo-Geo game is running at 59.185606Hz or 60.0Hz, or if a SNES game is running at 60Hz instead of 60.08Hz.

The old problems - games stuttering if the refresh rate was not exact, and pitch errors or audio glitches if games were not run at their original speed - are no longer issues today.
That's not to say that Variable Refresh Rate tech isn't important - you will probably notice if a game is sped up from 55Hz to 60Hz, and VRR eliminates V-Sync latency - but for most of the games/systems that people want to emulate, it's much less of a problem now.
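To put rough numbers on that, here is a quick sketch of the speed change you get by re-clocking to a 60Hz display (the refresh rates are the ones quoted in this thread):

```python
# Rough sketch: percent speed change when re-clocking a game's native
# refresh rate to a 60Hz display. Rates are the ones quoted in this thread.
native_rates = {
    "Neo-Geo": 59.185606,
    "SNES": 60.08,
    "Mortal Kombat": 54.706840,
}
for system, hz in native_rates.items():
    change = (60.0 - hz) / hz * 100.0
    print(f"{system}: {change:+.2f}% speed change at 60Hz")
# Neo-Geo: +1.38% speed change at 60Hz
# SNES: -0.13% speed change at 60Hz
# Mortal Kombat: +9.68% speed change at 60Hz
```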

I'd be more concerned about having a good strobe option on the display than VRR support, since 2D graphics blur horribly on sample-and-hold displays - and all VRR modes must use S&H.
 
Actually, this is incorrect. Run Samurai Shodown II in Retroarch and you'll see the same scrolling and flicker irregularities. What's even worse about Retroarch is that it screws up variable refresh monitors, too, making it impossible to get perfectly smooth movement on them. Retroarch is garbage.
 
If it's configured correctly - and RetroArch can be confusing to set up if you're new to it - then things like 30Hz flicker in games work as intended. There is no irregular flickering or stutter.

I even use RetroArch on my CRT with its black frame insertion option enabled, so that I can use rates such as 110.035212Hz for old arcade games, because my CRT won't sync to anything below 60Hz. (55.017606Hz x2)

If you're running it in Windowed Full-Screen mode or don't set up the refresh rate, then you do get irregular flickering - which is really bad with BFI - but if you set it up correctly it works as you would expect.
 
Upload a 60fps video of Samurai Shodown II running in Retroarch with regular flickering, because I don't believe you. I know how to configure Retroarch, and even though its rate control gives you results somewhat better than just out of the box MAME at the wrong refresh rate, it is NOT perfect.
 
It doesn't really seem possible to upload a video to demonstrate that - at least not at 60 FPS.
  1. The game itself is not running at exactly 60 FPS (RetroArch measured this screen as 60.002399Hz)
  2. The recording is not going to be perfectly synchronized to that refresh rate (my camera will only shoot 1/125, not 1/120 - and even if it did, it would not be genlocked)
  3. YouTube playback of that recording on another machine is not going to be perfectly in sync with the video either
I did record and upload a video, but got different results on each system I tried playing it on - all of which showed varying degrees of irregular flicker that was not present in the source video.
Stepping through the recording itself showed that 30Hz flicker was working correctly in RetroArch - not that I needed a recording to tell me that, since black frame insertion would be completely unusable if RA was not synchronizing things correctly.
All the video highlighted was problems with YouTube playback, and not the game.
 
Your contention is that Retroarch can magically smooth out game updates even when your monitor doesn't run at the game's refresh rate, and your statements just proved that that's incorrect. If Retroarch could truly do that, there's no reason you shouldn't be able to create a 60fps video showing perfect updates, because Retroarch can smooth it all out, right? If I run MAME with syncrefresh, I CAN get perfectly regular updates in a Youtube video. The game will just be running too fast. Alternatively, I could just turn off vsync and record a video and have perfectly smooth updates, but you'd see tear lines.

You can't create a 60fps video in Retroarch without irregular flickering for the same reason Retroarch can't run Samurai Shodown II perfectly: the game isn't running at its native refresh rate, but it's still trying to run at the right speed. Retroarch's rate control is NOT perfect. The reason you got different irregularities on each computer is that pretty much every monitor has a slightly different "true" refresh rate.

Outputting to a CRT monitor at the game's native refresh rate or using a variable refresh monitor are the only ways to get truly perfect, bullet smooth updates in these games.

Also, no idea why you keep referring to "30Hz flicker," which is nonsense, because Samurai Shodown II doesn't run at 30hz.

I made a 60fps video of MAME with syncrefresh on, and as long as your computer/browser doesn't chug, the updates should be perfectly consistent. Browsers are piles of crap, though, so you might have to play it more than once to go all the way through without the browser stuttering. Running this in Chrome might be the best way to go.

https://www.youtube.com/watch?v=tk65th1w2uE
 
You can't create a 60fps video in Retroarch without irregular flickering for the same reason Retroarch can't run Samurai Shodown II perfectly: the game isn't running at its native refresh rate, but it's still trying to run at the right speed. Retroarch's rate control is NOT perfect. The reason you got different irregularities on each computer is that pretty much every monitor has a slightly different "true" refresh rate.
  1. In your RetroArch config, set: audio_max_timing_skew = "10.000000" - the rest can be set via the GUI. This removes the rate limiter and will allow the audio sync feature to work at any speed.
  2. Set Maximum Run Speed to 0.0x in the Frame Throttle section. This lets the game run at an uncapped speed.
  3. Disable Windowed Fullscreen Mode. If it was previously enabled, toggle full-screen off/on to enter full-screen exclusive (FSE) mode. Windowed Mode presents frames to the desktop compositor instead of directly to the display, and will not be properly synchronized.
  4. Enable V-Sync, but not Hard GPU Sync. This will match the game speed to your refresh rate.
  5. Hit the enter key on your keyboard with "Estimated Monitor Framerate" selected to reset it. Wait at least 2048 samples until it is reading a fixed value with a deviation of less than 1%. Ideally it will be less than 0.5%. Hit the accept key (X) to set the refresh rate.
  6. Enable the audio sync feature. Leave audio latency at the default setting of 64.
  7. Run the game. Things should be perfectly in sync. No stuttering or irregular flicker.
This configuration synchronizes the game speed to your monitor's refresh rate, with the audio resampled to stay in perfect sync and pitch.
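For reference, those steps correspond roughly to the following retroarch.cfg entries. This is a sketch from memory - only audio_max_timing_skew is quoted verbatim above, so treat the other key names as assumptions and prefer setting them through the GUI:

```
audio_max_timing_skew = "10.000000"   # step 1: remove the rate limiter
fastforward_ratio = "0.0"             # step 2: uncapped run speed
video_windowed_fullscreen = "false"   # step 3: use full-screen exclusive
video_vsync = "true"                  # step 4: match game speed to refresh
video_hard_sync = "false"             # leave off until tuning latency
video_refresh_rate = "60.002399"      # step 5: your own measured value here
audio_sync = "true"                   # step 6: resample audio to stay in sync
audio_latency = "64"                  # default; lower it later for latency
```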

Now that it's running correctly, you can tune it for latency.
Enabling Hard GPU Sync and setting it to 0 will give you the minimum latency possible with V-Sync enabled.
However, most systems won't be able to handle a setting of 0, and you will see stutter and irregular flicker again. Increase the Hard GPU Sync value until this disappears. On my system, a setting of 1 is sufficient.
With audio latency, reduce the value until you start to get audio glitches/crackling. Then bump it up one step higher. That should give you the lowest audio latency that your device can handle. This is largely determined by the audio device in use, rather than how fast your system is.

As I said before, with a display running at 60Hz, this means a speed difference of ±2% compared to the original hardware for most systems - often below 0.5%.
Only old arcade systems will be running at anything significantly different from 60Hz, which is where you are likely to notice the speed difference.

Outputting to a CRT monitor at the game's native refresh rate or using a variable refresh monitor are the only ways to get truly perfect, bullet smooth updates in these games.
No, that's the only way to get truly smooth updates at the original speed. If you don't care about a <2% variance - and most people won't - then you can get perfectly smooth updates on any 60Hz display.
On most PC CRTs, you will have to run at 2x the original system's speed with black frame insertion enabled, as most PC CRTs will not sync to anything below 60Hz. Enabling BFI effectively halves the refresh rate to run at the correct speed, at a cost of reduced brightness and increased flicker.
Of course video timing is never perfect, so RetroArch's sync capabilities are still required, even if that timing error is a fraction of a percent.

The problem with VRR is that it requires the display to be in a flicker-free mode to operate, and with emulators you are running at a fixed refresh rate anyway.
The only advantage that VRR brings compared to a multisync display which supports that refresh rate - or a multiple of it - is that you don't have to manually change the refresh rate for each system that you run.

The downside is that since VRR modes require the display to be flicker-free, you cannot use backlight strobing or black frame insertion to improve motion clarity - and 2D games are where the motion blur from a sample & hold display is most obvious.
Minor timing differences are far less noticeable than sample & hold motion blur in my opinion.

Also, no idea why you keep referring to "30Hz flicker," which is nonsense, because Samurai Shodown II doesn't run at 30hz.
The game runs at 60Hz but the shadows flicker at 30Hz, since they are only on for half the time.
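Put another way, a shadow drawn on alternate frames completes one on/off cycle every two frames, so it flickers at half the refresh rate - a toy sketch:

```python
# A shadow drawn on every other frame of a 60Hz game: one full on/off
# cycle spans two frames, so the flicker rate is 60 / 2 = 30Hz.
refresh_hz = 60.0
frames = ["shadow" if f % 2 == 0 else "blank" for f in range(4)]
print(frames)                # ['shadow', 'blank', 'shadow', 'blank']
print(refresh_hz / 2, "Hz")  # 30.0 Hz
```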

I made a 60fps video of MAME with syncrefresh on, and as long as your computer/browser doesn't chug, the updates should be perfectly consistent. Browsers are piles of crap, though, so you might have to play it more than once to go all the way through without the browser stuttering. Running this in Chrome might be the best way to go.

https://www.youtube.com/watch?v=tk65th1w2uE
This demonstrates exactly the problem that I described in my previous post.
There are too many variables when recording and then trying to play back this footage, which can make it look as though you have stuttering and irregular flicker, even if you do not see any when looking at the actual display.
Playing back your video in a browser shows irregular flicker and stutter for me.
Assuming that your video contained perfect 30Hz sprite flicker without any sync issues, all that video does is demonstrate sync issues with playback, since it's running in Windowed Mode, in a browser.
It's not a case of the system being unable to handle it, but the reality of watching videos on a PC, in a web browser.
 
Uhhhh, all you did was restate what I originally said in the first place.

You changed the speed of the game. What you described is no different than syncrefresh in MAME, which is awful.

If you do that for Mortal Kombat, you're running the game 10% faster than it should be. That's a terrible solution.

You're completely wrong about G-Sync and FreeSync. There is no "problem" with G-Sync or FreeSync. They allow you to run perfectly smoothly WITHOUT altering the speed the games run at. That's why they're better than the garbage you described. Also, you CAN actually do software black frame insertion combined with G-Sync to make games run at exactly double their original speed (because most G-Sync monitors are 144hz). There is no downside to G-Sync or FreeSync at all. On top of all that, they have less input latency because there's no v-sync lag.

"The only advantage that VRR brings compared to a multisync display which supports that refresh rate - or a multiple of it - is that you don't have to manually change the refresh rate for each system that you run."

No, the advantage is that you can actually run games at the right speed smoothly.

"The game runs at 60Hz but the shadows flicker at 30Hz, since they are only on for half the time."

No, the shadows flicker at 59.1hz, because that's the speed the game updates at. You're confusing what shadows look like and how often they're updated. They change every frame. Run the game in MAME in debug mode. Every frame they're in a different state. They don't change every other frame. If they did, the flicker would look slower than it currently does, because there would be two frames of nothing and two frames of black, but there's only one frame of nothing and one frame of black each loop.

Finally, I want to point out that strobed backlights and black frame insertion are more problematic than G-Sync. They destroy brightness and color quality. They dramatically reduce the quality of the image. Combined with the fact that you're going to be using some CRT shader for things like this, and scanlines already reduce image brightness, the combined effect is pretty brutal. Whereas I can say that G-Sync and FreeSync are objectively better than standard v-sync in every way with no tradeoffs, strobing and black frame insertion are objectively just tradeoffs. They're not a clear win across the board. Really, to me, motion blur on modern 144hz 1ms monitors (which isn't even that bad) is preferable to sucking down the gray haze of backlight strobing.

Geezus.
 
Uhhhh, all you did was restate what I originally said in the first place.
Then I'm not sure what you were disputing before, since I stated that in my very first post, and you repeatedly said I was wrong about RetroArch being able to run games in sync with your refresh rate so that they don't stutter at all, so long as you don't mind what is typically less than a 2% speed error.

If you do that for Mortal Kombat, you're running the game 10% faster than it should be. That's a terrible solution.
I agree that it's far from ideal for old games which run significantly below 60Hz. The Mortal Kombat games are probably the newest titles running on hardware like that, however - it's typically much older games that ran significantly below 60Hz.
Keep in mind that RA should still correct the pitch for those games, they'll just run a bit faster is all. Whether that's a big deal or not depends on the person.
I don't care about fighting games, but I had no problem playing older games like R-Type at a higher speed. Smoothness was more important than 100% accuracy in my opinion.

You're completely wrong about G-Sync and FreeSync. There is no "problem" with G-Sync or FreeSync. They allow you to run perfectly smoothly WITHOUT altering the speed the games run at. That's why they're better than the garbage you described. Also, you CAN actually do software black frame insertion combined with G-Sync to make games run at exactly double their original speed (because most G-Sync monitors are 144hz). There is no downside to G-Sync or FreeSync at all. On top of all that, they have less input latency because there's no v-sync lag.
Black frame insertion on an LCD is terrible. It barely improves motion clarity and tends to wash out the image, since the LCD pixels are so slow to change.
You need backlight strobing/scanning with an LCD display. BFI only works with CRT or OLED.

No, the advantage is that you can actually run games at the right speed smoothly.
A multisync display which can run at 54.706840Hz, or 109.413680Hz if it won't sync below 60Hz, will be exactly as smooth as a VRR display - except it can also have the option of using backlight strobing, since VRR modes must disable that feature to work correctly.
VRR is a convenience feature for emulation compared to a multi-sync display, as long as that display is capable of supporting the same rate as - or a multiple of - the system that you're trying to run.

No, the shadows flicker at 59.1hz, because that's the speed the game updates at. You're confusing what shadows look like and how often they're updated. They change every frame.
If something is only displayed 30 times a second, it's flickering at 30Hz.

Finally, I want to point out that strobed backlights and black frame insertion are more problematic than G-Sync. They destroy brightness and color quality. They dramatically reduce the quality of the image.
Black frame insertion does ruin the image on an LCD.
On a CRT or OLED, the only thing BFI does is reduce the brightness and the effective refresh rate. BFI at 120Hz looks identical to running the screen at 60Hz once the brightness level is matched up.
Backlight strobing/scanning on an LCD should only reduce the brightness if properly implemented - as is the case with my Sony TV.
I think it's only LightBoost monitors that showed problems with color rendering when it was enabled, since that: a) was intended for use with 3D glasses, and b) boosted the backlight brightness higher than the non-strobed modes allowed.

Combined with the fact that you're going to be using some CRT shader for things like this, and scanlines already reduce image brightness, the combined effect is pretty brutal.
I don't think it should be assumed that someone is going to be using CRT shaders. Frankly I think most of them look terrible.
Only a heavily tweaked CRT-Royale running on a 4K monitor can look anything close to a real CRT, and you really have to crank up the backlight for that, since it requires that you disable the bloom effects and the brightness enhancements. Without disabling that sort of thing, CRT shaders look garish.
At anything less than 4K, I avoid CRT shaders. The most I'll do is add some scanlines.

Whereas I can say that G-Sync and FreeSync are objectively better than standard v-sync in every way with no tradeoffs, strobing and black frame insertion are objectively just tradeoffs. They're not a clear win across the board.
Not being able to use backlight strobing is a significant disadvantage when the content you're viewing does not benefit from VRR support (only multisync support) and is the type of content where motion blur is most easily seen.

Really, to me, motion blur on modern 144hz 1ms monitors (which isn't even that bad) is preferable to sucking down the gray haze of backlight strobing.
On a flicker-free display, which VRR requires, your motion blur is directly related to your framerate.
So if a game is running at 54.706840 FPS, you have an effective 18.3ms of motion blur - which is significant compared to the <2ms that you get with a CRT or strobed LCD.
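That figure is just the frame period - on a sample & hold display, each frame stays lit until the next one arrives, so persistence is 1/framerate. A quick sketch:

```python
# Sample & hold persistence is simply the frame period (1 / framerate):
# each frame is held on screen for its full duration.
for fps in (54.706840, 60.0, 144.0):
    print(f"{fps:.2f} FPS -> {1000.0 / fps:.2f} ms persistence")
# 54.71 FPS -> 18.28 ms persistence
# 60.00 FPS -> 16.67 ms persistence
# 144.00 FPS -> 6.94 ms persistence
```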
 
I think it's only LightBoost monitors that showed problems with color rendering when it was enabled

Yep, my xl2720z has no changes in color or anything except becoming dimmer when the strobing backlight is enabled.
 
insults removed.

I'm not sure what point you are trying to make.
I thought I was quite clear that I have to run any systems which are <60Hz at double the refresh rate with BFI enabled on my CRT, since it won't sync to anything below 60Hz and enabling BFI halves the effective refresh rate.

109.413680 Hz plus BFI on a CRT = 54.706840 Hz without BFI
Since it's a CRT that will give you <2ms persistence as they are scanned displays.
Without BFI you end up with double-images when anything moves across the screen, if you're running at twice the refresh rate.

With a flicker-free LCD there should be zero difference between running at 54.7Hz or 109Hz.
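The doubling arithmetic, for concreteness (a trivial sketch using the rates quoted above):

```python
# A CRT that won't sync below 60Hz can still show a 54.706840Hz game:
# drive it at exactly double the rate and let BFI black out every other
# frame, halving the effective refresh rate back to the original speed.
native_hz = 54.706840
crt_hz = native_hz * 2
print(f"{crt_hz:.6f} Hz")   # 109.413680 Hz - above the CRT's 60Hz floor
```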
 
insults removed.

I suggest you refrain from insulting and flaming people when YOU are the one who is wrong.

I've been reading all of Zone74's posts and I know that HE is the one who is fully on the mark here.

Not only are you trolling him, you don't even know what black frame insertion IS or how it works.

You can't CREATE a well done strobed backlight with black frame insertion. It's not that simple. You can only create a strobed backlight by actually shutting off the LCD backlight between frames, and then you have to control the duration and time of the strobe as well. The reason is due to how strobe persistence works.

With black frame insertion, you're effectively just reducing pixel persistence by half by doubling the refresh rate in software, so that 120hz with black frame insertion is effectively running at 240hz for games that are running at 120 fps (meaning the games will have 4.16ms of persistence instead of 8.3ms).

So for an emulator running at 55 fps, with gsync at 55hz, your game would then be running at 110hz (pixel persistence of 9.1 ms).

This isn't true strobing. It's just a software method to double the refresh rate, reducing motion blur by half compared to the original refresh rate.
This is no different than what those interpolating televisions do... taking a 60hz signal and creating an emulated 120hz from it. Or those 240hz TVs that accept 120hz signals.

The black frame insertion that MATTERS is when you are running a STROBED BACKLIGHT - and gsync and freesync are incompatible with strobing - because then you can REMOVE the drawback of running a game at half the refresh rate of the strobed backlight.

This is when for example, you have a 60 FPS emulator and a 120hz ULMB / benq blur reduction / motion 240 / turbo240 strobed backlight. Your emulator is going to show a double image since its running at half the frames.

So what do you do?
You use black frame insertion on the game to effectively double the game's refresh rate so that it matches the strobed backlight's refresh rate. Then you get perfect smoothness, without needing to speed the game up to 120hz/120 fps.

And just to educate you on strobing since you don't seem to know how it works:
Strobing turns the backlight off and on ONCE per refresh.
THIS IS NOT BLACK FRAME INSERTION !

Because the TIME the backlight is ON is NOT the same as the time it is off!
For example, for 120hz strobing, the backlight is cycled on and off every 8.3 milliseconds. If you want 1.0ms pixel persistence, the backlight would be *OFF* for 7.3 milliseconds and *ON* for 1.0 milliseconds for a total of an 8.3ms strobe.

You also have to deal with the strobe PHASE as well.
At what point in that 8.3 millisecond duration is the backlight turned on for the 1.0 millisecond time? The beginning or the end of the frame?

And then you have to deal with crosstalk ...
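Putting numbers on those two cases (a small sketch of the arithmetic described above):

```python
# 1) Software BFI merely halves sample & hold persistence: a 55 fps game
#    flashed at 110hz is lit for 1/110th of a second per frame.
print(f"BFI at 110hz: {1000.0 / 110.0:.1f} ms persistence")  # 9.1 ms

# 2) A true strobed backlight sets persistence directly: at 120hz the
#    period is ~8.3 ms, and for 1.0 ms of persistence the backlight is
#    ON for 1.0 ms and OFF for the remaining ~7.3 ms of each refresh.
period_ms = 1000.0 / 120.0
on_ms = 1.0
off_ms = period_ms - on_ms
print(f"Strobed 120hz: ON {on_ms:.1f} ms, OFF {off_ms:.1f} ms "
      f"per {period_ms:.1f} ms refresh")
# Strobe phase - where the ON window falls within the period - is the
# separate knob mentioned above, and mistiming it shows up as crosstalk.
```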
 
Black frame insertion isn't the same as interpolation, and it works without any strobing. So many clueless people on this site. I'm outta here. Enjoy your ignorance!
 
Too bad interpolation *IS* black frame insertion.
There are just different METHODS of it.

Some people with big egos really don't like to be proven wrong :) Cya and don't knock yourself out with the door as you stumble out of this thread!
 
most PC CRTs will not sync to anything below 60Hz
Almost all CRT monitors I've used, including the one I currently use, support 50Hz fine. I remember one monitor having some issues.

What CRT do you use that cannot handle <60Hz?
When you use >100Hz, are you also using the native resolution to avoid the need for scanlines? It would make perfect sense to use something like 320x240@100Hz instead of 640x480@100Hz with scanlines, because in the first case you are already above the >30kHz required by most CRT monitors, and you don't get such a drastic luminance reduction. Personally I prefer something like 640x480@50Hz with scanlines, because the flicker is already bad, but if the monitor cannot handle 50Hz it is better to avoid scanlines and use the native resolution. Imho.

Anyhow, I have no issues comprehending what you mean by anything you are saying, so do not be discouraged by the OP. It is the rare case where I actually agree with all you are saying. LOL

@bigbluefe
Gosh, you made a thread with a valid point, then made an idiot out of yourself. That is pretty lame.

Maybe if you didn't try to negate everything others say, you would not have such issues comprehending what they try to say.
 
Black frame insertion isn't the same as interpolation, and it works without any strobing. So many clueless people on this site. I'm outta here. Enjoy your ignorance!

Just... wow. What a spectacular display of arrogance. This thread should be saved and studied by students of the ego's effect on human reasoning.

Zone, you are a true gentleperson :)
 