166Hz Has Ruined 60 fps Forever

You don't, as long as you maintain the original framerate (or a multiple thereof). (You can also drop frames, repeat frames, draw partial frames, etc., which is how judder is introduced.) If you deviate from that at all, the audio will become pitch shifted. You compensate by stretching (or compressing) the audio.

Anyway, I was leaning heavily on hyperbole, as a form of sarcasm. Without knowing exactly how PowerDVD works internally, I can't possibly know. The point is, they could have gone with either solution (or a hybrid).

The point I'm really trying to make is you end up with some morphed output either way.

A real-world example of using audio stretching: say I have a 24 fps source. The closest of my monitor's available refresh rates to a multiple of 24 is 95Hz (96Hz being the ideal). Being 1Hz off is a big deal, as it would introduce motion judder. So I have two options: 1) use frame interpolation to hit a target framerate of 95/4 = 23.75 fps, or 2) just display the video as-is by slowing it down to 23.75 fps, which also requires stretching the audio. As a personal preference, I'm going with option #2 so as to maintain the original picture, since my eyes are far more sensitive than my ears. You can use ReClock with MPC to achieve that result.
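
For anyone who wants the arithmetic spelled out, here's a minimal sketch (plain Python, using the numbers from the example above; the variable names are just illustrative):

```python
# Slowing a 24 fps source to match a 95 Hz refresh (95 / 4 = 23.75 fps),
# as in the example above.
source_fps = 24.0
refresh_hz = 95.0
frames_per_refresh = 4                         # show each frame for 4 refreshes
target_fps = refresh_hz / frames_per_refresh   # 23.75 fps

speed_factor = target_fps / source_fps         # ~0.9896 -> ~1.04% slower playback
# To stay in sync without pitch shifting, the audio has to be time-stretched
# by the inverse ratio (which is what a tool like ReClock does).
audio_stretch = source_fps / target_fps        # ~1.0105 -> ~1.05% longer audio

print(f"video speed factor: {speed_factor:.4f}")
print(f"audio stretch factor: {audio_stretch:.4f}")
```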

TL;DR: unless you game 24/7, there is nothing "glorious" about a new and fancy monitor that doesn't support a refresh rate that is a multiple of 24.

Eh, variable refresh makes this a non-issue. This goes back to devs being lazy fucks again. If PowerDVD doesn't have a full-screen exclusive mode that actually runs at 24hz (or 48hz) for G-Sync monitors, they suck cock. 24fps movies should not have judder in a world with variable refresh displays.
 
I know exactly how you feel, OP. I've been using 144hz+ monitors for a few years now, and when I recently purchased a budget gaming laptop with an admittedly nice IPS display, I realized immediately that 60hz was no longer enough for me. That panel got upgraded to a 120hz TN within the week.

60hz isn't "garbage," "rubbish," or "unplayable" by any means. It's just that... well, once you get used to the silky smooth movement of higher refresh rates, the step back down to 60hz becomes instantly apparent.
 
You should look into the 43" 4k 120hz Mango. I've got one sitting on my desk next to the tiny X27, and the Mango is pretty much my go-to display for everything these days. I like a big, fast display with low lag, and the Mango delivers.

The problem is you can only get it direct from SK, and when both DP 1.4 ports die a month apart to some strange glitch, it does not bode well that they got the design totally correct on the controller (it's obviously a dual-zone design of some sort).
Basically perma-tearing on the exact middle row of pixels; the image is there fine, but god help you if you move anything on it. The hdmi ports are fine, but fuck 60hz, I have much better displays for media.

I'm working on sending it back as it's still under warranty "in Korea", but boy is that a bitch. Thankfully I saved the really good packaging it came in.

Also the panel is rather mediocre; granted, I was stepping down from a JS9000, which is literally the cream of the crop in panel image quality (/gush), but still. The Mangos have a pretty washed-out, too-blue IPS panel that you can't totally correct for.
 
Without supplying high frame rates to fill the hz, you aren't going to get the motion clarity (blur reduction) and motion definition (more articulated pathing and smoother "glassy" motion, higher animation definition, etc.) benefits.

So obviously a desktop/app use scenario outside of games, just moving a mouse and windows around, should usually supply as many frames of motion as the monitor can handle, but when you go into a game, especially more demanding 1st/3rd-person games at higher resolutions, your frame rate is severely limited.
Even the "frame rate" you meter and quote is really a wider range of fluctuating frame rates, +/- that figure, which can be graphed.


The article talking about 1000hz recognizes that 1000fps gaming outright is unrealistic and says that advanced interpolation of 100fps x 10 would be a likely scenario for 1000hz monitors (mathematically, 125fps x 8 would also work :) ). Even then, in futuristic 180-degree VR at extreme resolutions you'd sense the difference due to the stroboscopic effect unless it was 10,000 hz. So while they are saying 10,000hz fed massive fps at extreme resolutions would be indistinguishable from reality per se, 1000fps (100fps interpolated 10x) at 1000hz would be essentially zero blur, like a high-end fw900 crt, for the purposes of gaming.
This is the blur reduction you get at different frame rates at 1000hz
View attachment 64347
And the image below shows a visual representation of the blur at each persistence amount via pursuit camera. However, realize that in a 1st/3rd-person game you are moving the entire game world, full of high-detail textures and depth via bump mapping, around relative to you in the viewport when mouse looking and movement keying, rather than a simple singular UFO bitmap graphic.
Display persistence is more noticeable for bigger FOV (bigger displays or virtual reality) and for higher resolutions (retina resolutions) due to bigger clarity differences between stationary & moving images.
In the most extreme future case (theoretical 180+ degree retina-resolution virtual reality headsets), display refresh rates far beyond 1000 Hz may someday be required (e.g. 10,000 Hz display refresh rate, defined by the 10,000 Hz stroboscopic-artifacts detection threshold). This is in order to pass a theoretical extreme-motion “Holodeck Turing Test” (becoming unable to tell apart real life from virtual reality) for the vast majority of the human population.
However, for general CRT-quality sports television watching,
1000fps at 1000Hz would sufficiently approximately match 1ms CRT phosphor persistence, for a flicker-free sample-and-hold display.
Technologically, this is achievable through interpolation on an ultra-high refresh rate display.
Note that while many of these replies are focused on blur reduction, higher frame rate on a high hz monitor (without using duplicated/interpolated frames) also provides greatly increased motion definition, motion path articulation, and smoothness (and even animation cycle definition) of individual objects and of the entire game world moving in relation to you while mouse looking and movement keying in 1st/3rd person games. So even if you had a 1000hz monitor using advanced interpolation, you would still need to run it at 100fps x 10 (or 125fps x 8, 200fps x 5) in order to get the greater motion-definition benefit of higher hz.
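
As a rough illustration of the persistence math referenced above (the rule of thumb that 1ms of persistence is about 1 pixel of blur per 1000 px/sec of motion; the panning speed here is a made-up example, not from the article):

```python
# Sample-and-hold motion blur, roughly:
#   blur (px) ≈ eye-tracking speed (px/sec) * frame persistence (sec)
speed_px_per_sec = 1000.0   # hypothetical panning speed

for fps in (60, 120, 240, 1000):
    persistence_ms = 1000.0 / fps                      # full persistence, no strobing
    blur_px = speed_px_per_sec * persistence_ms / 1000.0
    print(f"{fps:>4} fps -> {persistence_ms:5.2f} ms persistence, ~{blur_px:4.1f} px blur")

# The interpolation factors mentioned above are just integer divisors of 1000 Hz:
for base_fps in (100, 125, 200):
    print(f"{base_fps} fps x {1000 // base_fps} = 1000 Hz")
```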






I had fw900's. They aren't the answer. More of a pain in the end to keep looking good enough, and the size of the screen sucks even if you dismiss the size of the monitor itself. I held on to using that for years with an lcd alongside, but I'm well past that now.

Most people avoid PWM like the plague. LCD strobing is essentially PWM. It will give you eyestrain. In order to use ulmb/lightboost properly you have to keep really high minimum frame rates. People are all looking to 3440 x 1440 and 4k resolutions now, and are looking to HDR luminance ranges and color volumes. Strobing is really not applicable to the bar that modern premium gaming monitors are setting (high resolutions at high Hz, HDR luminance and color volume, VRR with higher graphics settings to avoid judder on frame-rate dips, potholes, and variance).

The real answer is a fairly high frame rate to start with, multiplied by advanced, high-quality interpolated (directly repeated, not 'manufactured') frames, combined with extremely high hz, but that is still years off. The hz ceilings are making some progress now, though, at least.

"So while they are saying 10,000hz fed massive fps at extreme resolutions would be indistinguishable from reality per se.. 1000fps (100fps interpolated 10x) at 1000hz would be essentially zero blur like a high end fw900 crt for the purposes of gaming"

View attachment 76265


As per blurbusters.com 's Q and A:
-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.
This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.
G-SYNC monitors allow you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.
Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. Also, there can be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
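(The "119fps @ 120Hz can cause 1 stutter per second" line above is just a beat-frequency effect; a minimal sketch of that arithmetic, purely illustrative:)

```python
# With VSYNC OFF and a frame rate slightly offset from the refresh rate,
# a periodic stutter / slowly crawling tear line appears at roughly the
# beat frequency between the two rates (valid for fps near the refresh rate):
def beat_hz(fps: float, refresh_hz: float) -> float:
    return abs(fps - refresh_hz)

print(beat_hz(119, 120))   # 1.0 -> about one visible hitch per second
print(beat_hz(118, 120))   # 2.0 -> about two per second
```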
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).
  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.
--------------------------------------------------------------
Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.

"If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.
View attachment 62006
==================================
Easier-to-render games with very high fps work pretty well with ulmb.
Running 1440p or higher rez with any kind of high-to-ultra settings on the most demanding games won't let you sustain high fps, only average it.

A lot of people won't even buy a monitor with a PWM backlight because it causes eye strain over time; ulmb strobing can be (is) eye fatiguing, especially at 100hz or lower strobe rates. The higher the resolution on more demanding games, the more ulmb fails to sustain the higher frame rates it needs to avoid tearing and judder without turning things down enough to stay ahead of the refresh cap. A lot of people used to cap the monitor at 100hz and turn things down enough to stay over 100fps (sustained) for this reason. It also dulls the screen. Anyway, I doubt it will work with true HDR color volume either as things go forward (and especially FALD HDR), which is the future of gaming and movies.
"Slideshow"
typically refers to the motion definition aspect. Motion definition provides additional smooth detail in the pathing and animation cycles, and even shows more smooth motion definition of the entire game world moving relative to you when movement keying and mouse looking in 1st/3rd person games.
Mentioning different hz and features without including the accompanying frame rates each are typically running doesn't really tell what you are comparing.
60fps (average which ranges even lower rate part of the time) at 60hz+ is like molasses to me.
100fps at 100hz or better shows 5 new unique frames to every 3 at 60fps-hz.
120fps at 120hz or better doubles the motion definition.
The "slideshow" nickname is because there are few new frames of action being shown at low frame rates. The same frame being shown for a longer time time like a flip book animation with less pages being flipped slower.
At high fps on a high hz monitor, you will get much higher motion definition regardless of whether you have strobing or black frame insertion, crt redraw, etc.
------------------------------------------
The Motion Clarity (blur reduction) aspect is also improved by running at high fps+hz ranges on a high hz monitor, and is nearly pristine using backlight strobing (with some major, in my opinion critical, tradeoffs).
Non-strobe mode, at speed (e.g. mouse-looking the viewport around):
60fps solid ... is a full smearing "outside of the lines" blur. -- At variable hz, or at 60hz, 100hz, 120hz, 144hz, 240hz.
120fps solid .. halves that blur (50%) to more of a softening blur inside the masks of objects -- At variable hz, or at 120hz, 144hz, 240hz.
144fps solid .. drops the blur a bit more, 60% less blur -- At variable hz, or at 144hz, 240hz.
240fps solid .. drops the blur down to a very slight blur showing most of the texture detail -- At variable hz, or at 240hz.
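
Those ratios follow from blur scaling inversely with the sustained frame rate on a full-persistence (non-strobed) display; a small sketch, using 60fps as the baseline:

```python
# Tracked-motion blur on a non-strobed display scales with frame visibility
# time, i.e. inversely with the sustained frame rate.
baseline_fps = 60

for fps in (60, 120, 144, 240):
    remaining = baseline_fps / fps      # blur relative to the 60 fps baseline
    print(f"{fps:>3} fps: {remaining:.0%} of the 60 fps blur ({1 - remaining:.0%} less)")

# Motion definition scales the same way: 100 fps shows 100/60 = 5 new frames
# for every 3 shown at 60 fps, and 120 fps doubles them.
```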

In recent news, this sounds interesting....
http://www.blurbusters.com/combining-blur-reduction-strobing-with-variable-refresh-rate/
I have doubts going forward that these strobing techs will work with HDR gaming monitors and FALD HDR gaming monitors by default (and at the sustained, not average, frame rates strobing requires at 1440p and 4k), which looks like where things are going eventually.
 
Last edited:
So, I have 2 setups. One with a 55" 4K TV and another with a recently bought 166Hz Ultrawide.

I've enjoyed 4K 60Hz gaming for the last year or so, but decided to try 1080p ultrawide for a change.

OMG, it is so much better. I had a 144Hz screen before, but 166Hz is a noticeable improvement for me.

Been gaming on that for the last few weeks, and went to try the TV again to test something. I can't go back.

Played a few minutes at 4K60 and I honestly got a headache. It was physically painful to watch, it was so choppy.

Not sure I can ever go back to 60Hz if I have a choice.


I have a 240hz panel and yet people ON THIS FORUM tell me I'm a fool or an idiot or a dumbass and there is no way to tell. I also have 100hz x34 gsync and 144hz mg something or another free sync Asus panel. My 240 gsync is Acer. The 240 crushes the other panels and it actually looks nice too.
 
I have a 240hz panel and yet people ON THIS FORUM tell me I'm a fool or an idiot or a dumbass and there is no way to tell. I also have 100hz x34 gsync and 144hz mg something or another free sync Asus panel. My 240 gsync is Acer. The 240 crushes the other panels and it actually looks nice too.
Absolutely, I'm sure 240Hz looks sick!
 
I have a 240hz panel and yet people ON THIS FORUM tell me I'm a fool or an idiot or a dumbass and there is no way to tell.
I think a lot of this stems from folks who can't (or just won't) justify spending that kind of money on a monitor, though they secretly want to, and are telling themselves (and you, secondarily) that it's a dumb waste of money.

Source: I've been that guy
 
It's possible, but it's dependent on your frame rate, not just on having a higher hz ceiling.

Without supplying high frame rates to fill the hz, you aren't going to get the greater motion clarity (blur reduction) and motion definition (more articulated pathing and smoother "glassy" motion, higher animation definition, etc.) increase that a higher hz ceiling is capable of.

-----------------------------------------------
The "slideshow" nickname is because there are few new frames of action being shown at low frame rates. The same frame being shown for a longer time time like a flip book animation with less pages being flipped slower
At high fps on a high hz monitor, you will get much higher motion definition regardless of whether you have strobing or black frame insertion, crt redraw, etc.

------------------------------------------
The Motion Clarity (blur reduction) aspect is also improved by running at high fps ranges on a high hz monitor, and is nearly pristine using backlight strobing (with some major, in my opinion critical, tradeoffs).

Non-strobe mode, at speed (e.g. mouse-looking the viewport around):
60fps solid ... is a full smearing "outside of the lines" blur. -- At variable hz, or at 60hz, 100hz, 120hz, 144hz, 240hz.
120fps solid .. halves that blur (50%) to more of a softening blur inside the masks of objects -- At variable hz, or at 120hz, 144hz, 240hz.
144fps solid .. drops the blur a bit more, 60% less blur -- At variable hz, or at 144hz, 240hz.
240fps solid .. drops the blur down to a very slight blur showing most of the texture detail -- At variable hz, or at 240hz.



-------------------------------------------------

So yes, 240hz is capable of being better, but only on games and settings supplying over, say, 200 fps average (or at least at higher frame rates than its nearest neighbors of 120 fps+hz, 144 fps+hz and 166 fps+hz), and only on a monitor whose response time and overdrive can keep up with ~4.2ms frames.

(Similarly, 166hz shows gains at over 120 or 144fps+Hz on a monitor that can keep up with ~6ms frames.)
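
(For reference, the per-frame times in those parentheses are just 1000 divided by the refresh rate; a quick sketch:)

```python
# Per-refresh frame time: pixel response (GtG + overdrive) needs to finish
# well within this window for the extra Hz to pay off.
for hz in (60, 120, 144, 166, 240):
    print(f"{hz:>3} Hz -> {1000.0 / hz:.1f} ms per refresh")
# 166 Hz -> ~6.0 ms, 240 Hz -> ~4.2 ms, matching the figures above.
```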
 
Last edited:
In a lot of ways, LCDs still suck. You're bound to native resolution for the best results. Pixel density comes into play instead of bigger just being better. You have different panel technologies with their own drawbacks and advantages. The viewing angles aren't quite there either.

There was still a dot pitch with CRTs. A lot of CRTs would accept resolutions they could not really display without blurriness. It is true you could go down pretty well, though. I could never go back to a CRT, honestly. They just have a softness I don't like compared to fixed pixel displays.

Fixed pixel displays are here to stay. Single inorganic LED per pixel is going to solve the majority of the other problems (MicroLED).
 
CRTs were always a pain to me. Even at 85Hz, the flicker would cause noticeable eye strain and headaches.

For all the faults in sample-and-hold, at least it doesn't cause physical discomfort.
 
There was still a dot pitch with CRTs. A lot of CRTs would accept resolutions they could not really display without blurriness. It is true you could go down pretty well, though. I could never go back to a CRT, honestly. They just have a softness I don't like compared to fixed pixel displays.

Fixed pixel displays are here to stay. Single inorganic LED per pixel is going to solve the majority of the other problems (MicroLED).

Absolutely. I agree with you in most cases. They also had an aperture grille that affected the shape of the dots. You also had Sonys with their "horizontal stabilizer" horseshit, which put a horizontal line in the lower third of the screen. And yes, there are resolutions that would cause the screen to appear blurry, typically when you went to the edge of its capabilities.
 
The problem is you can only get it direct from SK, and when both DP 1.4 ports die a month apart to some strange glitch, it does not bode well that they got the design totally correct on the controller (it's obviously a dual-zone design of some sort).
Basically perma-tearing on the exact middle row of pixels; the image is there fine, but god help you if you move anything on it. The hdmi ports are fine, but fuck 60hz, I have much better displays for media.

I'm working on sending it back as it's still under warranty "in Korea", but boy is that a bitch. Thankfully I saved the really good packaging it came in.

Also the panel is rather mediocre; granted, I was stepping down from a JS9000, which is literally the cream of the crop in panel image quality (/gush), but still. The Mangos have a pretty washed-out, too-blue IPS panel that you can't totally correct for.

I got mine from Daysale out of California. It's my second one; the first one I had only had one DP port working.
As Vega said, the Mango's picture quality leaves a lot to be desired... especially when compared to the X27 or OLED.

However, it's the only game in town if you want a massive high-refresh display. Personally, I am eagerly anticipating the 43" BFGD and 4k 120hz OLEDs.
 
Yeah, not worth the price tag given the quality and lack of features compared to just waiting for the 2019 models: sizes, hdmi 2.1, 4k 120hz native 4:4:4, full hdr 1000+, p3/high % rec 2020 color, va and fald black depths, etc. I did pay $600 for my 32" lg, but that was about as far as I was willing to go for a monitor just to hold me over. I've had it several months already too. The closer the premium specs and hdmi 2.1 get, the less sense it makes to buy anything as a temporary placeholder imo.
 
Last edited:
Yeah, not worth the price tag given the quality and lack of features compared to just waiting for the 2019 models: sizes, hdmi 2.1, 4k 120hz native 4:4:4, full hdr 1000+, p3/high % rec 2020 color, va and fald black depths, etc. I did pay $600 for my 32" lg, but that was about as far as I was willing to go for a monitor just to hold me over. I've had it several months already too. The closer the premium specs and hdmi 2.1 get, the less sense it makes to buy anything as a temporary placeholder imo.

Yea, the waiting game. But how long do you have to wait: 3 months, 6 months, 1 year, 2 years? I have been waiting for 4k 120hz OLED for years now, and 2020 looks like the best case scenario at this point.

How about the 43" BFGD? I would love that, but what are we looking at? 6 months at least I think.

At least in the here and now I can get my ass whipped in PUBG @4k120hz while I wait for those other displays
 
It saddens me that there is no 31.5" IPS 144Hz with strobing (144Hz strobing capable preferably, but with 120Hz + strobing I would still consider).

I'm annoyed by the fact they jumped from 27" to 31.5"; it makes no sense, 30" or 30.5" would have been a better standard. Still, even 31.5" 1440p is slightly higher density than the 24" 1080p (24.5" in my case) that I feel comfortable with, so it's even a marginally higher-density "upgrade"; but I disliked 27" 1440p when I used it at work, I had to use some zoom to get a comfortable view.

It's not only about what's easy on the eyes, though; I also dislike it when the GUI or objects appear tiny, proportion-wise. From that point of view I prefer roughly the 91-96 PPI range, which seems ideal for me, and that's why I won't go above 1440p even in the very distant future: it would require such a large 4K display to even get close to that PPI value, and games cannot be zoomed the way Windows can. There's a balance with everything, perceived object size (the immersion factor and, to a slightly lesser extent, the competitive shooter factor, where it's easier to hit guys in shooters with lower pixel density) vs. image quality for me. I hardly see people talk about anything other than image quality; too high a density isn't good either IMO, reasonable is always best. High PPI and display size are things I consider people being "crazy" with these days, like bigger and more pixels can't be enough of both... I don't think I'd honestly want a monitor bigger than 31.5".
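
(For reference, the PPI figures behind that comfort range work out from the usual diagonal formula; a small sketch with the sizes mentioned above, the ppi() helper being just illustrative:)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch along the panel diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24"   1080p: {ppi(1920, 1080, 24.0):.1f} PPI')   # ~91.8
print(f'31.5" 1440p: {ppi(2560, 1440, 31.5):.1f} PPI')   # ~93.2
print(f'27"   1440p: {ppi(2560, 1440, 27.0):.1f} PPI')   # ~108.8
```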
 
Last edited:
When it's in up-to-$1k-or-more territory I prefer to pick my battles, especially when I know 2019+ will have 4k 120hz tvs and monitors. So I am not as trigger happy, but personalities and budgets differ of course. I'm really not into spending that kind of money on ips and tn black levels ever again either, but again that's my taste and spending path.
 
Speaking of the Hz debate, it's worth noting that the 144Hz-vs-165Hz blur differences vary a lot.

LCD GtG becomes a huge bottleneck once it hits the majority of a refresh cycle, so we need to keep true GtG to as tiny a fraction of a refresh cycle as possible. That's easy at 60Hz, but incredibly hard at 240Hz, and it will be a mountain at 480Hz, since "mostly imperfect 1ms GtG" can be too slow for good 480Hz.

Though well-overdriven 1ms TN can now finally be good enough for 480Hz, I've seen many unoptimized 240Hz monitors where 240fps@240Hz doesn't quite exactly halve the motion blur of 120fps@120Hz. With good GtG that doesn't bottleneck -- then perfect 240fps (at full Hz) has exactly half the motion blur of perfect 120fps (at full Hz) without using blur reduction modes.

If we're talking about IPS, the 5ms GtG really bottlenecks (1/200sec GtG), so 144Hz-vs-165Hz becomes really subtle, being bottlenecked by LCD GtG. Often it SHOULD look 165/144ths better than 144Hz, but doesn't quite reach that, because GtG starts hiding the magnitude of improvement as the refresh rate starts encroaching into GtG time.

That said, even ignoring GtG issues... the 1/165sec spacing in phantom-array effects does make 165fps (or 166fps) look smoother than 144fps by a bit.

[image: mouse-arrow phantom-array comparison photo from the Blur Busters 480Hz tests]


(...You probably recognize this image from the 480Hz tests and the 1000Hz-journey articles on Blur Busters)

The 165Hz would be roughly 33% of the way between 120Hz and 240Hz in terms of mouse-arrow spacing (and likewise, the phantom-arraying effect of bright objects past your crosshairs when your gaze is fixed on the crosshairs while you turn).

We definitely need bigger jumps up the curve: 60Hz -> 120Hz -> 240Hz -> 480Hz. Incremental improvements are quite minor, especially if you get a lot of stutter, have GtG taking more than half a refresh cycle, and have a stuttery mouse... all of that muddies the high-Hz benefits.
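
A rough sketch of the two effects described above: how much of a refresh cycle a 5ms GtG transition eats, and how the phantom-array step spacing shrinks with Hz (the motion speed is a made-up example):

```python
gtg_ms = 5.0                  # e.g. a typical IPS spec figure
speed_px_per_sec = 2000.0     # hypothetical fast pan / mouse flick

for hz in (120, 144, 165, 240, 480):
    refresh_ms = 1000.0 / hz
    gtg_share = gtg_ms / refresh_ms      # > 100% means GtG spans multiple refreshes
    step_px = speed_px_per_sec / hz      # gap between discrete phantom-array images
    print(f"{hz:>3} Hz: {refresh_ms:4.2f} ms refresh, GtG = {gtg_share:.0%} of a cycle, "
          f"array step ~{step_px:4.1f} px")
```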
 
I know exactly how you feel, OP. I've been using 144hz+ monitors for a few years now, and when I recently purchased a budget gaming laptop with an admittedly nice IPS display, I realized immediately that 60hz was no longer enough for me. That panel got upgraded to a 120hz TN within the week.

60hz isn't "garbage," "rubbish," or "unplayable" by any means. It's just that... well, once you get used to the silky smooth movement of higher refresh rates, the step back down to 60hz becomes instantly apparent.
So my main gaming box is 1070 SLI with a 60Hz ips @ 1440p. The monitor is fantastic for work & works OK for gaming.

Just picked up an XG2402 to play Metro Last Light Redux on with my spare box (RX 580/4GB) and wow! What a difference with FreeSync/144hz on! With all details up @ 1080p I hit 80-115 fps. I think I may finish the game on my RX 580.
 
Nice. I've been meaning to play Redux. I beat the first 2 original games, and they were awesome.

Right now I'm playing Black Mesa (fan made HL1 remake). Then I will play the rest of the HL2 series.

Trying to go back and play game series I liked. Probably do Bioshock at some point too.

Always nice to see older games looking and playing way better than they did originally (for example, at higher refresh rate).
 
Nice. I've been meaning to play Redux. I beat the first 2 original games, and they were awesome.

Right now I'm playing Black Mesa (fan made HL1 remake). Then I will play the rest of the HL2 series.

Trying to go back and play game series I liked. Probably do Bioshock at some point too.

Always nice to see older games looking and playing way better than they did originally (for example, at higher refresh rate).
While it's awesome when that works, I find it depressingly uncommon that it does - especially with console ports like Bioshock which often have their physics engines timed by the framerate and thus get really wonky when you increase it. =(
 
Just wanted to give an update. It turns out my machine was jacked. I believe I was experiencing Crossfire microstutter.

I have since removed the 2x Vega 64 cards and replaced with 1x Radeon VII and the choppiness is GONE!!!

Granted, 166Hz still looks much better for me but 60fps is totally acceptable now that things are working.

Sorry for the troll.
 
I think just like some people don't experience higher framerate the same as others, some people don't witness microstutter issues. I think if you can tell the difference of higher framerate you are more susceptible to microstutter. I had two HD6870s back in the day and the microstutter killed me to where I had to go to one GPU.
 
Just wanted to give an update. It turns out my machine was jacked. I believe I was experiencing Crossfire microstutter.

I have since removed the 2x Vega 64 cards and replaced with 1x Radeon VII and the choppiness is GONE!!!

Granted, 166Hz still looks much better for me but 60fps is totally acceptable now that things are working.

Sorry for the troll.

Same issue I had. I gave up running SLI and Crossfire because of it. Some games were fine, some literally gave me a headache.

Glad you got it sorted.
 
I saw in your other thread you bought a 1660ti.

When I was on 1080p my heavily overclocked 980ti fell a bit short of keeping above 60fps minimums in everything, which annoyed me.
But the 1660ti is slower than the 980ti and you are using a 166Hz display.

I appreciate the high Hz lag difference and really like it myself.
But I would want high fps to go with that.
Is this card the best choice for you?
 
At work I use 4k 27" at 200% scaling - now 27" 1080p is so grainy it gives me a headache. But 4k in Windows 10 is so choppy.
At home I use 32" 1440p 165 Hz - it is blissfully smooth, desktop and games. We could have both in the same display, right?
 
I'm about to keep 43" 4k 60hz VAs at my desk on each side of a 32" LG 32GK850G. I'm going to set the larger monitors back a bit on the sides of the large desk. Currently I have one 43" 4k TCL VA (~ $250 , ~4200:1 contrast ratio) on the left and my old 27" 1440p cinema display on the right of the LG but I've purchsed the second 43" 4k (a $330 samsung with 6100:1 contrast this time) but have yet to set it up replacing the cinema display.

Personally I prefer getting the massive increase in desktop/app space at essentially the same ppi as 27" 1440p at the 4k's distance to shrinking the pixels on a 27".

[image: same-PPI size comparison of 4k, 21:9, and 27"/30" 1080p screens]



My middle monitor sounds similar to yours: 32", has g-sync, and is 120hz (capable of being tight with VA response times) but can go to 165hz; it's 1440p so that I can keep higher frame rates on demanding games in order to get appreciable gains out of the higher hz with higher graphics settings. The 3000:1 VA contrast is nice. I'll never go back to ips and tn sub-1000:1 contrast ratios and weak black depths.

So no, not in one display. I've run two or more monitors since the early 2000s, because there are always trade-offs.

-----------------------

Speaking of contrast :)
I am interested in the Dell alienware OLED gaming display if it has good variable refresh rate off of nvidia gpus, but I suspect the price might be a little too high for my taste. I'm willing to drop good money but the BFGDisplay's rumors of $5000 are way out of bounds for me if that is any indication. For a big modern living room TV I'm normally willing to pay $1800 - $3200 ish range but this would be dedicated to my pc not my living room. I won't go below 70" for my living room tv anymore anyway so it doesn't fit the bill there either being 55". It would also require a big re-design of my pc room and desk to give enough distance. Beyond that, OLED is protectively limited with ABL to around +200 to +250 nit of HDR color volume past SDR's ~350 to 400 nit range in order to avoid burn in - so while it would be a great display for the next few years, it won't be for the future of greater HDR color volume displays that get a much larger % color volume out of HDR 1000, HDR4000 and HDR 10000 capable content.
 
Last edited:
This article has a few good comments about microstutter in modern dual gpu setups with sli and nvlink:
https://www.gamersnexus.net/guides/3366-nvlink-benchmark-rtx-2080-ti-pcie-bandwidth-x16-vs-x8

SniperElite4:
"The next major consideration is frametime consistency: Multi-GPU has traditionally shown terrible frame-to-frame interval consistency, often resulting in things like micro-stutter or intolerable tearing. For this pairing, as you can see in our frametime plot, the lows scale pretty well. It’s not a near-perfect 2x scaling like the average, but it’s pretty close. As a reminder, these plots are to be read as lowest is best, but more consistent is more important than just being a low interval. 16ms is 60FPS. Very impressive performance in this game, which is more a testament to Sniper 4’s development team than anything else – they have continued to build some of the best-optimized games in the space."


FarCry5:
"In our frametime chart, we can see the limitations of scaling. Although the NVLinked cards run a higher average, they fail to sustain similar scaling in frametime consistency. Frametimes are spikier and potentially more jarring, although raw framerate alone makes up for much of this lost frame-to-frameEV interval consistency."

... because your actual frame times are much lower at high fps: 100fps = 10ms per frame, 120fps = 8.3ms per frame, 144fps = 6.9ms per frame.

That's only two games. Some games are much worse than others, and if your frame rate graphs are lower, the effects of microstutter can appear worse. Think of the rapidity of the frame delivery: a stutter becomes more of a quiver, and a quiver becomes not much of anything. Again, dependent on the game engine and the graphics settings.
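
To make the frametime-consistency point concrete, here's a toy sketch (entirely made-up frametimes, not benchmark data) showing how an AFR-style alternating cadence hides behind the same average fps:

```python
# Two runs with identical average frame rate but very different pacing.
smooth_ms  = [10.0] * 8            # steady 10 ms frames  -> 100 fps average
afr_ish_ms = [5.0, 15.0] * 4       # alternating 5/15 ms  -> also 100 fps average

def summarize(label, frametimes):
    avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
    worst_ms = max(frametimes)
    print(f"{label}: avg {avg_fps:.0f} fps, worst frame {worst_ms:.0f} ms "
          f"(~{1000.0 / worst_ms:.0f} fps equivalent)")

summarize("smooth ", smooth_ms)
summarize("afr-ish", afr_ish_ms)
# Both average ~100 fps, but the second delivers every other frame as slowly
# as a ~67 fps title would -- that uneven cadence is the microstutter.
```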

-----------------------

There are games where microstutter can be horrible, and if you aren't using a high-bandwidth sli bridge and if you run lower fps graphs, you'll experience it even more across the board. In addition to running DX11, several SLI-capable games also require you to turn AA down to minimal levels or off to avoid obnoxious microstutter as well, e.g. "you have to turn AA down to low so that TAA gets turned off. taa uses the last frame to AA the next frame which causes a problem with afr and sli."

SLI doesn't work on all games, and some of the ones it works on require workarounds/DIY fixes.

SLI scaling varies - even when it is scaling, some games do 30% or 60%, while some still do ~90%.

SLI requires DX11.

SLI official support can potentially take months of game patches and nvidia driver updates before the wrinkles are ironed out - though some top games work great near/at launch.

SLI microstuttering can be overt on some game engines (usually ones poorly optimized to start with), when running frame rate graphs with low bottoms, and when not using a high-bandwidth SLI bridge.

SLI is very expensive, and you can get by just fine (probably better served, price/performance-wise) with a single card at a lower-resolution monitor like 1440p for 100fpsHz+ gameplay in order to get appreciable gains out of higher Hz.

It's definitely not for everyone and unnecessary at 1440p for the most part with the top tier modern gpus

(even if you'd still have to dial some over the top settings down on the most demanding or otherwise poorly optimized games to keep 100fpsHz+ average).

-------------------------------

So it is far from a perfect solution for every game and every game's graphics/AA settings, but playing beneath a 100fpsHz average (sample-and-hold blur and lack of motion definition) on a high hz monitor is useless to me, so SLI gives me that at higher resolutions and on higher-demand games that have supported it adequately in the last few years (Witcher 3, GTA V, Dishonored 2, Prey, Overwatch, Dark Souls 3 (at least able to maintain 60fps with cranked settings, even modded graphics), Shadow of Mordor/War, FarCry 5, Vermintide 2, etc. I haven't tried Black Ops 4 yet but I heard it runs well in SLI). I wouldn't even consider buying into 4k 120hz on some games without it. 1440p has now moved into the sweet spot, more or less, for a single top-tier gpu though, even if you have to dial some of the over-the-top settings down on the most demanding or otherwise unoptimized games to hit a 100FpsHz+ average.


By the time I upgrade again it will be to an HDMI 2.1 GPU, so perhaps a die shrink will make 4k 100FpsHz+ on a high hz monitor more reasonable on a single gpu by then. Of course the graphics ceilings are really arbitrary, so they will just increase with more gpu power eventually - so it's a game of leapfrog and otherwise dialing settings in/down as necessary (to get appreciable gains of 100FpsHz+ on a higher-resolution high hz monitor) for single gpus once you reach their limit.
 
Last edited:
I saw in your other thread you bought a 1660ti. Is this card the best choice for you?
I have multiple computers. The machine that got the 1660 Ti is for my living room, paired with a 60Hz 1080p projector (125" screen) and for that it is perfect.

My main machine has the 34" 166Hz ultrawide from LG, and that machine is running a 2080 Ti for best performance (see the top rig in my sig).
 
-------------------------------

So it is far from a perfect solution for every game and every game's graphics/AA settings, but playing beneath a 100fpsHz average (sample-and-hold blur and lack of motion definition) on a high hz monitor is useless to me, so SLI gives me that at higher resolutions and on higher-demand games that have supported it adequately in the last few years (Witcher 3, GTA V, Dishonored 2, Prey, Overwatch, Dark Souls 3 (at least able to maintain 60fps with cranked settings, even modded graphics), Shadow of Mordor/War, FarCry 5, Vermintide 2, etc. I haven't tried Black Ops 4 yet but I heard it runs well in SLI). I wouldn't even consider buying into 4k 120hz on some games without it. 1440p has now moved into the sweet spot, more or less, for a single top-tier gpu though, even if you have to dial some of the over-the-top settings down on the most demanding or otherwise unoptimized games to hit a 100FpsHz+ average.


By the time I upgrade again it will be to an HDMI 2.1 GPU, so perhaps a die shrink will make 4k 100FpsHz+ on a high hz monitor more reasonable on a single gpu by then. Of course the graphics ceilings are really arbitrary, so they will just increase with more gpu power eventually - so it's a game of leapfrog and otherwise dialing settings in/down as necessary (to get appreciable gains of 100FpsHz+ on a higher-resolution high hz monitor) for single gpus once you reach their limit.

Black Ops 4 scales great with no microstutter, but like other CPU-hungry games, you might make your frame-time consistency worse if your CPU isn't up to the task. BO4 even scales just as well with temporal AA enabled for me.
 
I have a PG278QR that can do 165 but in overclock mode. It concerns me to run it at 165 since it is in overclock mode and is out of warranty. On the other hand, it is hard for me to notice a difference between 144 and 165. Plus my GTX1080 sometimes has a hard time keeping 144 at max settings so I don't even know why I'm concerned.
 
I have a PG278QR that can do 165 but in overclock mode. It concerns me to run it at 165 since it is in overclock mode and is out of warranty. On the other hand, it is hard for me to notice a difference between 144 and 165. Plus my GTX1080 sometimes has a hard time keeping 144 at max settings so I don't even know why I'm concerned.
Does overclock mode void warranty on a monitor? I have a hard time believing that a feature they provide a setting for in the OSD can void your warranty...

Been running my 144/165hz panel in overclock mode since day one, not sure why they even sell it as a 144hz panel when it can do 165 no problem.
 
It's not that it voids the warranty.... it's that the monitor is out of warranty due to time passing.
 
It's not that it voids the warranty.... it's that the monitor is out of warranty due to time passing.
Ah! I misread.

Well, take it from another owner of that monitor, for what it's worth - I don't think OC mode puts you in any danger. Mine has been in 165hz for months and I can't see any evidence that it's hurting the thing.

Warranties are in place to capture manufacturing defects, not provide customers with a free replacement when their equipment dies from natural wear and tear - if your monitor hasn't exhibited any failures yet, it's unlikely that it will die prematurely.

Ymmv of course.
 
Last edited:
Yeah, I think the reason they call it OC is because it's not guaranteed to work on every panel. Not that it would cause damage.
 
I can feel a difference in smoothness at a consistent 75hz on my son's freesync monitor, but anything higher and I can't tell much of a difference. I went to a local LAN cafe and they had both 240hz 1080p and 144hz 1440p monitors, and I honestly didn't see an improvement. Their machines had 8400 i7s and 2080 RTXs btw.
 
I never used to think the high refresh rate made much of a difference, but in more recent years I have struggled with eye fatigue even with good glasses. I noticed that after switching to 144hz and 120hz monitors I am no longer having the issue. I am a productivity user, so not really gaming much at all.
 