ASUS PG35VQ 3440x1440 200Hz Gsync


Um, that's a TCL TV... this is an Asus G-Sync monitor. You can't compare these two things at all. It's extremely unlikely that local dimming in this product will be turned off in any mode (apart from maybe ULMB, if it even supports it); that would defeat the whole point. We don't have very much information on how this monitor will work, but it doesn't make any sense to build a gaming monitor with FALD and then say 'oh, but all serious gamers need to turn it off'. No one would buy it.
 
Um, that's a TCL TV... this is an Asus G-Sync monitor. You can't compare these two things at all. It's extremely unlikely that local dimming in this product will be turned off in any mode (apart from maybe ULMB, if it even supports it); that would defeat the whole point. We don't have very much information on how this monitor will work, but it doesn't make any sense to build a gaming monitor with FALD and then say 'oh, but all serious gamers need to turn it off'. No one would buy it.

Why would no one buy it? It has a 512-zone FALD backlight on a VA panel, plus HDR, 200 Hz and G-Sync.
I think 512 zones for a 35'' ultrawide will be enough and will give us great black levels.
 
Nobody would buy a FALD gaming monitor if you have to turn off the FALD to avoid input lag. Fortunately, you don't have to, because TVs have nothing to do with monitors.
 
There's another point people seem to be missing with these high refresh rate gsync monitors. With gsync enabled, when you hit the max refresh rate of your monitor, you have two options. vsync on = increased input lag. vsync off = potential screen tearing.
Fortunately, some games offer in-game frame limiters that can be configured below your max refresh rate, so that your screen's max is never hit, providing the most responsive picture without tearing. However, not all games do. Sometimes this can be overridden through Nvidia Inspector, but this method can also introduce input lag.

But if you have a monitor with an unusually high refresh rate, it's less likely that your framerate will ever reach the max refresh rate. This means, even without specific in-game support or 3rd party tools, you're more likely to remain in the gsync range, providing the best image quality and lowest input lag. So there is some benefit to having a refresh rate higher than 144hz, even if it's not visibly smoother. Sure, Quake 3 Arena and CS 1.6 will blow past those frame rates, but I'd imagine most games made in the last decade wouldn't.

I'm already deciding which body parts I'm willing to part with the most for one of these screens.
this really isn't an issue at all. you can use MSI Afterburner to cap your framerate in every game, and if you don't do that and just run vsync off with the framerate above the refresh rate, i'd be surprised if you could notice tearing at 144+ fps.
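For anyone who wants to see what a cap like that actually does, here is a rough, hypothetical sketch of what an in-game limiter or a tool like Afterburner/RTSS effectively amounts to: pace frame submission a little below the panel's maximum so G-Sync stays engaged and vsync never takes over at the cap. The render_frame callback and the 190 fps figure are illustrative placeholders, not anything from a real game or driver API.

Code:
import time

def run_frame_capped(render_frame, fps_cap=190, frames=1000):
    """Toy frame limiter: keep the framerate just under a 200 Hz panel's maximum.

    render_frame is any callable that draws one frame; fps_cap should sit a few
    fps below the refresh rate so G-Sync never hands off to vsync at the cap.
    """
    frame_budget = 1.0 / fps_cap
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += frame_budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # wait out the rest of this frame's budget
        else:
            deadline = time.perf_counter()  # fell behind; resync the pacing clock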
 
Nobody would buy a FALD gaming monitor if you have to turn off the FALD to avoid input lag. Fortunately, you don't have to, because TVs have nothing to do with monitors.
Yes, I agree with you. Why did they not make it permanent? They know nobody would turn off the FALD, so why include an on/off option at all?
Hopefully it is built into the monitor, i.e. it is permanent and works all the time, so you benefit from HDR and FALD for both games and movies in any mode.
 
this really isn't an issue at all. you can use MSI Afterburner to cap your framerate in every game, and if you don't do that and just run vsync off with the framerate above the refresh rate, i'd be surprised if you could notice tearing at 144+ fps.
Fast Sync, instead. This way your GPU is still working hard.
Yes, I agree with you. Why did they not make it permanent? They know nobody would turn off the FALD, so why include an on/off option at all?
Hopefully it is built into the monitor, i.e. it is permanent and works all the time, so you benefit from HDR and FALD for both games and movies in any mode.
There is no similarity between the TV you posted and this monitor. You don't know what firmware features this monitor will have, yet. TCL is a Walmart budget brand, ASUS is a premium enthusiast brand.

There are no such things as processing "modes" in gaming monitors. The only thing worth doing is changing the pixel response time, which would improve motion clarity while input lag stays the same. We all know there will be some overhead with HDR, but we do not know yet how much it will be with this monitor. It could be as little as <= 1ms or as much as 1 frame.
 
Fast Sync, instead. This way your GPU is still working hard.

There is no similarity between the TV you posted and this monitor. You don't know what firmware features this monitor will have, yet. TCL is a Walmart budget brand, ASUS is a premium enthusiast brand.

There are no such things as processing "modes" in gaming monitors. The only thing worth doing is changing the pixel response time, which would improve motion clarity while input lag stays the same. We all know there will be some overhead with HDR, but we do not know yet how much it will be with this monitor. It could be as little as <= 1ms or as much as 1 frame.
fast sync doesn't work super well unless you're way, way above your refresh rate, ideally 2x refresh rate or higher, which is why no vsync is usually a better option for high refresh rate screens. counter-strike and overwatch are probably the only games today where it would make sense for people on 120+ Hz monitors, but then again the refresh rate & framerate are already so high that it's kind of unnecessary to have any sync. plus those are competitive games where every millisecond matters so the slight increase in smoothness and lack of tearing isn't really worth the input lag penalty.

with all of this taken into account, i think fast sync is almost exclusively for 60 Hz gaming.
 
What about this case?
The Asus and Acer (PG35VQ and X35) have 4:2:0 chroma subsampling while the other high-end monitors have 4:2:2 chroma subsampling, and we don't know if HDR is going to work for HDR movies, because HDR only works with G-Sync and G-Sync only works while you're gaming.
 
What about this case?
The Asus and Acer (PG35VQ and X35) have 4:2:0 chroma subsampling while the other high-end monitors have 4:2:2 chroma subsampling, and we don't know if HDR is going to work for HDR movies, because HDR only works with G-Sync and G-Sync only works while you're gaming.

high end monitor and chroma subsampling should never be in the same sentence. unless it's an OLED monitor or VA + FALD, every time i see HDR on a monitor spec, i understand that they are talking about "HDR compatible", never "100% true and real HDR".
 
high end monitor and chroma subsampling should never be in the same sentence. unless it's an OLED monitor or VA + FALD, every time i see HDR on a monitor spec, i understand that they are talking about "HDR compatible", never "100% true and real HDR".
I've read this in a post on reddit, and the guy said we can set a lower refresh rate and get around that issue (to get 4:4:4).
 
What about this case?
The Asus and Acer (PG35VQ and X35) have 4:2:0 chroma subsampling while the other high-end monitors have 4:2:2 chroma subsampling, and we don't know if HDR is going to work for HDR movies, because HDR only works with G-Sync and G-Sync only works while you're gaming.

Where did you read that HDR only works with gsync? That's news to me.
 
I've read this in a post on reddit, and the guy said we can set a lower refresh rate and get around that issue (to get 4:4:4).
I don't think that is confirmed, just conjecture. Hopefully true though.
 
Anyone want to guess what this monitor will retail for at release? I'm thinking $1999.99.
 
Anyone want to guess what this monitor will retail for at release? I'm thinking $1999.99.
For the price it's not a problem; I can wait until we see these monitors at around $1500.
But how can we do 200 Hz + HDR + 3440x1440 + G-Sync over a single DP 1.4 cable?
I want to run 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with at least 4:2:2 chroma subsampling, not 4:2:0. Can this monitor do that at 200 Hz, or do we need to set a lower refresh rate such as 144 Hz?
 
So, does HDR work with any content, or does the content have to be made for HDR?
 
For the price it's not a problem; I can wait until we see these monitors at around $1500.
But how can we do 200 Hz + HDR + 3440x1440 + G-Sync over a single DP 1.4 cable?
I want to run 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with at least 4:2:2 chroma subsampling, not 4:2:0. Can this monitor do that at 200 Hz, or do we need to set a lower refresh rate such as 144 Hz?
It can for sure do 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with at least 4:2:2 chroma subsampling, since that's about 20% less bandwidth than 4K 144 Hz 4:2:2.
 
It can for sure do 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with at least 4:2:2 chroma subsampling, since that's about 20% less bandwidth than 4K 144 Hz 4:2:2.
Do you mean a single DP 1.4 cable will be enough to do 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with 4:2:2, which is roughly equal to 4K 144 Hz 4:2:2?
I don't want to see 4:2:0 at 200 Hz, or have to set a lower refresh rate to get 4:2:2.
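For anyone who wants to sanity-check these bandwidth claims, here is a rough back-of-the-envelope calculation (my own approximation, ignoring exact blanking timings and any compression): DisplayPort 1.4 over HBR3 carries roughly 25.92 Gbit/s of video payload after 8b/10b encoding, and the chroma format sets the bits per pixel at 10 bits per component.

Code:
# Rough DisplayPort 1.4 bandwidth check (approximate; real limits depend on blanking timings).
LINK_GBPS = 25.92            # HBR3 payload, 4 lanes x 8.1 Gbit/s after 8b/10b encoding

BITS_PER_PIXEL = {           # at 10 bits per component
    "4:4:4": 30,
    "4:2:2": 20,
    "4:2:0": 15,
}

def required_gbps(width, height, hz, chroma):
    """Raw pixel data rate for a mode, not counting blanking overhead."""
    return width * height * hz * BITS_PER_PIXEL[chroma] / 1e9

modes = {
    "3440x1440 @ 200 Hz 4:2:2": (3440, 1440, 200, "4:2:2"),
    "3440x1440 @ 200 Hz 4:4:4": (3440, 1440, 200, "4:4:4"),
    "3840x2160 @ 144 Hz 4:2:2": (3840, 2160, 144, "4:2:2"),
}

for label, (w, h, hz, chroma) in modes.items():
    need = required_gbps(w, h, hz, chroma)
    verdict = "fits" if need <= LINK_GBPS else "exceeds"
    print(f"{label}: ~{need:.1f} Gbit/s ({verdict} DP 1.4)")

On these rough numbers, 3440x1440 at 200 Hz with 10-bit 4:2:2 needs about 17% less bandwidth than 4K 144 Hz 4:2:2, which roughly matches the "about 20% less" figure above, while 10-bit 4:4:4 at 200 Hz would not fit in the link.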
 
it is amazing how much nonsense one can find while browsing a vaporware thread:

- 200Hz will be totally useless without a scanning/strobing backlight. it will fall behind any OLED panel at 60Hz, and a 100Hz panel with a scanning backlight would run circles around this daydreaming 200Hz.
- 10-bit color is completely lost running anything that is not 4:4:4
- HDR loses much of its appeal for PC usage without 4:4:4

what we really need:

- more contrast with OLED or FALD
- better motion with a scanning/strobing backlight
- HDR at 4:4:4

instead, we get this moronic Hz race to the bottom.
 
it is amazing how much nonsense one can find while browsing a vaporware thread:

- 200Hz will be totally useless without a scanning/strobing backlight. it will fall behind any OLED panel at 60Hz, and a 100Hz panel with a scanning backlight would run circles around this daydreaming 200Hz.
- 10-bit color is completely lost running anything that is not 4:4:4
- HDR loses much of its appeal for PC usage without 4:4:4

what we really need:

- more contrast with OLED or FALD
- better motion with a scanning/strobing backlight
- HDR at 4:4:4

instead, we get this moronic Hz race to the bottom.
You can wait your whole life for new tech. I don't think OLED gaming monitors will come this year or even next year. The next big thing is direct LED, which televisions have had for some years now.
So once we see OLED gaming monitors, whether 16:9 or ultrawide 21:9, we'll see them at 60 Hz first, and that may be by 2020. Then we'll wait another 2 years to see them with high refresh rates + G-Sync/FreeSync.
And another 2 years to see them with HDR at 1000+ nits. So for gaming, 4:2:2 will be enough, whether at 4K 144 Hz or 3440x1440 200 Hz. I hope a single DP 1.4 cable will be enough to do 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with 4:2:2, without having to set a lower refresh rate to get 4:2:2.
 
Do you mean a single DP 1.4 cable will be enough to do 200 Hz + HDR + 3440x1440 + G-Sync + 512-zone FALD with 4:2:2, which is roughly equal to 4K 144 Hz 4:2:2?
Yes. And if I calculate it correctly, 4:4:4 should be possible around 160 Hz, assuming the hardware permits.
 
Yes. And if I calculate it correctly, 4:4:4 should be possible around 160 Hz, assuming the hardware permits.
Thank you, bro.
For me 4:2:2 will be enough at 200 Hz. I asked because I've seen a post on reddit saying the X35 and PG35VQ will have 4:2:0 chroma subsampling at 200 Hz.
Does this monitor have true 8-bit color depth or 10-bit?
 
Thank you, bro.
For me 4:2:2 will be enough at 200 Hz. I asked because I've seen a post on reddit saying the X35 and PG35VQ will have 4:2:0 chroma subsampling at 200 Hz.
Does this monitor have true 8-bit color depth or 10-bit?
They claim HDR10, so it must be 10-bit.
 
Yes. And if I calculate it correctly, 4:4:4 should be possible around 160 Hz, assuming the hardware permits.
How can we notice with the naked eye whether it's 4:4:4 or 4:2:2 at the different high refresh rates from 60 Hz up to 200 Hz, especially for gaming?
How would you notice 4:4:4 at, for example, 144 Hz versus 4:2:2 at 200 Hz with HDR?
I want to know the method to calculate the exact refresh rate for 4:4:4 and the exact refresh rate for 4:2:2.
I can try many refresh rates from 60 Hz up to 200 Hz, but with HDR + 3440x1440 the situation is different; I mean the maximum refresh rate for 4:4:4, above which a higher refresh rate will only work with 4:2:2 and not 4:4:4.
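There is no published "exact" formula for how these monitors will behave, but a rough way to estimate the crossover refresh rates is to divide the DP 1.4 payload by the bits per frame for each chroma format. The blanking factor below is my own guess at timing overhead, so treat the results as ballpark, not exact:

Code:
LINK_BPS = 25.92e9     # DP 1.4 HBR3 payload after 8b/10b encoding
BLANKING = 1.06        # assumed overhead for reduced-blanking timings (rough guess)

def approx_max_hz(width, height, bits_per_pixel):
    """Ballpark ceiling on refresh rate for a mode over DP 1.4."""
    return LINK_BPS / (width * height * bits_per_pixel * BLANKING)

for chroma, bpp in {"4:4:4": 30, "4:2:2": 20, "4:2:0": 15}.items():
    print(f"3440x1440 10-bit {chroma}: up to ~{approx_max_hz(3440, 1440, bpp):.0f} Hz")

That works out to roughly 165 Hz for 10-bit 4:4:4 and well past 200 Hz for 4:2:2, which lines up with the "around 160 Hz" estimate earlier in the thread.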
 
According to the tftcentral review of the older z35 200hz VA, the VA response time and that particular monitor's overdrive implementation were good to about 120hz before it would lose it again. So for a typical high hz gaming monitor, you start getting around 40% blur reduction at 100fps-hz, 50% blur reduction at 120fps-hz, 60% blur reduction at 144fps-hz. So I think if you shoot for 100fps-hz average or so, your frame rate would probably be a ranged band from 70 - 100 - 130(160), or scaled a little higher if you manage to get above 100fps-hz average. The idea being to either straddle the 120hz sweet spot as close as possible or perhaps cap the frame rate at 120hz or 144hz.

The point of the above is, unless the response time and/or overdrive implementation is much better on these newer "200hz" VA monitors, you would be better off setting a hard limit of 120hz or 144hz if possible or otherwise shooting for a frame rate band that hovers around 120 or 144 average. Frame rate limiting would probably be best, preserving the tightest blur reduction through the top 2/3 of a high average frame rate graph instead of a band varying across 1/3 low range, 1/3 tighter (100fps to 144fps-hz), and the top 1/3 of the graph losing it again above 120hz.

http://i.imgur.com/ALY9lQS.png

If you are able to get 4:4:4 limiting the frame rate to 144hz as people are wondering, you are probably already better off limiting it to 120hz or 144hz when possible regardless.

4:2:2 isn't bad at all for movies and games though, either. It is not 4:2:0.
[image: 3AOh1Lp.png]


Besides all of that, you can set the frame rate in nvidia 3d settings to use a higher setting in games and use a default rate for desktop/apps (where 4:4:4 might come into play more for text). A lot of people have been doing that for years, running 120hz at desktop and 144hz in games because nvidia g-sync gpus weren't idling on desktop use at 144hz.
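One way to read those blur-reduction percentages (my own interpretation of the arithmetic, not tftcentral's methodology): on a sample-and-hold display each frame persists for the full frame time, so comparing that persistence against a 60 fps baseline reproduces the 40/50/60% figures quoted above, provided the panel's response time can actually keep up.

Code:
def persistence_ms(fps_hz):
    """Sample-and-hold persistence: each frame stays lit for the whole frame time."""
    return 1000.0 / fps_hz

def blur_reduction_vs_60(fps_hz):
    """Fraction by which eye-tracking blur shrinks relative to a 60 fps-Hz baseline."""
    return 1.0 - persistence_ms(fps_hz) / persistence_ms(60)

for fps_hz in (100, 120, 144, 200):
    print(f"{fps_hz} fps-Hz: {persistence_ms(fps_hz):4.1f} ms persistence, "
          f"~{blur_reduction_vs_60(fps_hz):.0%} less blur than 60 fps-Hz")

At 200 fps-Hz the math says roughly 70%, but as the post above notes, that only materializes if the VA response time and overdrive keep up, which is exactly what the z35 review suggested they stop doing much past 120 Hz.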
 
According to the tftcentral review of the older z35 200hz VA, the VA response time and that particular monitor's overdrive implementation were good to about 120hz before it would lose it again. So for a typical high hz gaming monitor, you start getting around 40% blur reduction at 100fps-hz, 50% blur reduction at 120fps-hz, 60% blur reduction at 144fps-hz. So I think if you shoot for 100fps-hz average or so, your frame rate would probably be a ranged band from 70 - 100 - 130(160), or scaled a little higher if you manage to get above 100fps-hz average. The idea being to either straddle the 120hz sweet spot as close as possible or perhaps cap the frame rate at 120hz or 144hz.

The point of the above is, unless the response time and/or overdrive implementation is much better on these newer "200hz" VA monitors, you would be better off setting a hard limit of 120hz or 144hz if possible or otherwise shooting for a frame rate band that hovers around 120 or 144 average. Frame rate limiting would probably be best, preserving the tightest blur reduction through the top 2/3 of a high average frame rate graph instead of a band varying across 1/3 low range, 1/3 tighter (100fps to 144fps-hz), and the top 1/3 of the graph losing it again above 144hz.

http://i.imgur.com/ALY9lQS.png

If you are able to get 4:4:4 limiting the frame rate to 144hz as people are wondering, you are probably already better off limiting it to 120hz or 144hz when possible regardless.

4:2:2 isn't bad at all for movies and games though, either. It is not 4:2:0.
[image: 3AOh1Lp.png]


Besides all of that, you can set the frame rate in nvidia 3d settings to use a higher setting in games and use a default rate for desktop/apps (where 4:4:4 might come into play more for text). A lot of people have been doing that for years, running 120hz at desktop and 144hz in games because nvidia g-sync gpus weren't idling on desktop use at 144hz.

Thanks for your great response, but I want to know: what is the relation between chroma subsampling (whether 4:4:4 or 4:2:2) and setting a lower refresh rate?
I hope I can stay at 60 Hz for RPG games and normal use, and 144-200 Hz for shooter games, but I don't know what will happen to chroma subsampling in both cases, especially with HDR + FALD + G-Sync.
 
Thanks for your great response, but I want to know: what is the relation between chroma subsampling (whether 4:4:4 or 4:2:2) and setting a lower refresh rate?
I hope I can stay at 60 Hz for RPG games and normal use, and 144-200 Hz for shooter games, but I don't know what will happen to chroma subsampling in both cases, especially with HDR + FALD + G-Sync.
No one except Asus/Acer/Nvidia can answer that. It's fully possible 4:4:4 works at ~160Hz or lower, but it's completely up to them how, and whether, they implement it.
The best answer right now is simply that it is highly probable that 4:4:4 will work at roughly 160Hz or lower.

It's not a technical limitation, but, for example, Nvidia has only allowed 10-bit color on their Quadro cards in the past. If we are unlucky they will go "yep, this monitor has 10-bit color.. but you're locked to 4:2:2 chroma unless you buy a Quadro card SORRY GUYS!"
 
4:2:2 isn't bad at all for movies and games though, either. It is not 4:2:0.

Any chroma subsampling breaks ClearType text rendering. for PC usage it means get 4:4:4 or deal with fuzzy text, including, but not limited to, in-game menus.

144-200 Hz for shooter games

you manage to miss both points in one sentence:

- as i said, it is useless to go down the Hz race to the bottom, because blur reduction without a strobing backlight is moot.

- as elvn said, it is useless to go much higher than 120-144Hz because the overdrive artifacts would make the image appear WORSE than at 120Hz.

beware, image heavy examples:

samsung-c34f791, a 100Hz VA:
[image: C34F791-blur.png]

AOC AG352UCG, a 35" VA at 100Hz:
[image: AG352UCG-blur.png]

AOC AG322QCX, a 144Hz VA:
[image: AG322QCX-blur.png]

And to finish, here is how a VA panel behaves with a strobing backlight (samsung c24fg70):
[image: C24FG70-blur.png]


100Hz with strobe is much better than 200Hz and no strobe. i would go further and say that as of Q2 2017, all VA panels behave better at 120Hz than at 144Hz.
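That claim makes sense once you look at persistence: on a sample-and-hold panel it equals the frame time, while on a strobed panel it tracks the backlight pulse width regardless of refresh rate. A small sketch of the comparison (the 2 ms pulse width is an illustrative guess, not a measured value for any of these panels):

Code:
def sample_and_hold_persistence_ms(hz):
    # Each frame is lit for the entire refresh interval.
    return 1000.0 / hz

def strobed_persistence_ms(pulse_width_ms):
    # Only the backlight pulse is visible, independent of refresh rate.
    return pulse_width_ms

print(f"200 Hz sample-and-hold: ~{sample_and_hold_persistence_ms(200):.1f} ms persistence")
print(f"100 Hz with a 2 ms strobe: ~{strobed_persistence_ms(2.0):.1f} ms persistence")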
 
I know g-sync/variable hz does not currently work with strobing mode. I highly doubt a high density FALD HDR implementation will work with strobing. Even if it did (almost certainly it won't be possible with this monitor), it would mute/dim the overall and perceived brightness and dynamic range of the monitor which would defeat the purpose of having a minimum of 1000nit peak HDR.

Assuming the above is the case, referencing VA strobing results is irrelevant to the usage I'm assuming most people are buying the monitor for. The blur amount examples are always interesting and appreciated as a comparison though, and show how the existing high hz VA panels "lose" their blur reduction past 120fps-hz.

The main point is that the previous gen of 200hz VA monitor's response times and overdrive implementation only provided a tighter non-strobe blur reduction up to 100fps-hz to 120fps-hz. That is a 40% and 60% blur reduction which tightens up to more of a soften blur within the "shadow mask" of onscreen objects and architecture during high speed FoV movement and individual virtual object movement - rather than the smearing outside of the lines blur of lower fps-hz average bands. It is also a 5:3 to 2:1 motion definition increase. If that scenario based on the previous gen of 200hz VA gaming screens is the case here you would probably be better off capping the hz regardless of whether doing so allows you to get 4:4:4 or not because anything higher would be wasted by going back into a smearing range outside of the 'sweet spot'. Hopefully it will be able to get the 4:4:4 capped at 120hz or 144hz. We will have to wait and see unless someone from asus (or perhaps an early review at tftcentral) confirms ahead of time at some point.
 
No one except Asus/Acer/Nvidia can answer that. It's fully possible 4:4:4 works at ~160Hz or lower, but it's completely up to them how, and whether, they implement it.
The best answer right now is simply that it is highly probable that 4:4:4 will work at roughly 160Hz or lower.

It's not a technical limitation, but, for example, Nvidia has only allowed 10-bit color on their Quadro cards in the past. If we are unlucky they will go "yep, this monitor has 10-bit color.. but you're locked to 4:2:2 chroma unless you buy a Quadro card SORRY GUYS!"

They added a new option in the NVCP a few months back, and if you connect a 10-bit display you can enable it now (on consumer cards, yes). At least it works over DP, but that's what those G-Sync monitors are going to use anyway.
edit: but there are some limitations IIRC, DirectX (11+) only and fullscreen exclusive mode only.
 
Hopefully it will be able to get the 4:4:4 capped at 120hz or 144hz. We will have to wait and see unless someone from asus (or perhaps an early review at tftcentral) confirms ahead of time at some point.
120 Hz or 144 Hz are both fine, but what is the point of buying a 200 Hz monitor and running it at 120 Hz or 144 Hz all the time?
If we can get 4:4:4 capped at 120 Hz or 144 Hz + 3440x1440 + HDR 1000 nits + 512-zone FALD + G-Sync at the same time, it doesn't matter that we're not using 200 Hz.
 
I'm aware of the blurbusters link about the potential for combining strobe and g-sync techs, and I have actually linked it in this forum before but thanks. I'm not familiar with that dell.

I seriously doubt that the 1000 nit high density dynamic FALD array, and its effect of greatly increasing contrast and black depth and improving HDR capability, which I'd assume is the major feature most people are willing to pay for in these monitors, will be able to work fully - likely not at all - with backlight strobing mode enabled.

As for the 200hz comment, that was the point. It sounds like you answered your own question. It's not "needed" if the response time and overdrive can't keep up. So if it's similar to the z35 VA, it would just allow your frame rate range to go over the sweet spot into a blur range again on the high end of a high fps average graph. The difference is you can cap the highest range of fps-hz to stay out of that end; you can't lock the lower half of a frame rate graph out of it without lowering settings until the minimums are very high.

Quoting myself (4-27-2017):

"If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.

[image: ALY9lQS.png]

==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher rez with any kind of high to ultra settings on the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com 's Q and A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. There can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutters more with VSYNC ON, while others stutters more with VSYNC OFF. Test opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------
In recent news, this sounds interesting....

http://www.blurbusters.com/combining-blur-reduction-strobing-with-variable-refresh-rate/

I have doubts going forward that these strobing techs will work with HDR gaming monitors and FALD HDR gaming monitors by default (and at the sustained-not-avg frame rates strobing requires at 1440p and 4k), which looks like where things are going eventually.

A lot of people won't even buy a monitor with PWM backlight because it causes eye strain over time, ulmb strobing can be (is) eye fatiguing especially at 100hz or less strobes. The higher the resolution on more demanding games, the more ulmb fails to sustain the higher frame rates it needs to avoid tearing and judder without turning things down enough to stay ahead of the refresh cap. A lot of people used to cap the monitor at 100hz and turn things down enough to stay over 100fps (sustained) for this reason. It also dulls the screen. Anyway I doubt it will work with HDR either as things go forward (and especially FALD HDR) which is the future of gaming and movies.

---end of my quotes------
 
thank you for the yuge self quotes

actually, FALD allows for a better implementation of LMB than old-school strobing: a scanning backlight, like the (simple but effective) solution on the Samsung c24fg70. a relevant side effect of a scanning backlight is that total contrast and brightness levels do not suffer as much as with strobing solutions.
 
A 1000nit backlight would counter the dimming effect of strobe mode compared to say, strobing mode on a 350 - 400nit peak monitor in strobe mode's muted levels, but I'll believe fully perceived/uncompromised 1000nit FALD HDR achieved in strobing mode on this monitor when I see it. I'd gladly be proven wrong.

Strobe mode turns off the backlight entirely. FALD and HDR are not simply on/off; they are varying levels, and on this monitor they act across a back grid array of 512 lights. One is essentially shuttering; the other is a dynamic, dense array modulation (light off / varied dimming / bright flaring) across detailed scene maps on the fly, like some kind of animated mosaic. Shuttering is like PWM, eye fatiguing, with the side effect of dimming/muting the entire screen, and in this case on a monitor most are buying to see the highest dynamic ranges available and squeeze the best (P3) color they can get out of it.

That of course ignores the rest of the factors mentioned about strobe mode in great detail in the main body of the quote I posted (necessarily long, as it was a large body of information from blurbusters). I won't post the highlighted info again.

If the strobing mode option is the draw for you, that is great. Having owned a few near pristine motion clarity FW900 CRTs and a few strobe capable gaming monitors, I actually like the idea of strobe mode were it not for the tradeoffs I highlighted in my previous reply. Regardless, I doubt most who buy a $1400 - $2000 1000nit FALD HDR g-sync monitor are going to compromise (likely would have to turn off) those features to get strobing mode (and most likely at lower graphics settings for uncompromised strobe mode fps exceeding a 100hz or greater hz cap).
 
thank you for reminding me of another moronic number race: the 1000 nits PoS.

i have an ACD 30", Apple rated it for 270 nits, most reviews actually measure it around 343 nits.
simply put: there is no way to use it at maximum brightness as a PC monitor.

but now people try to convince me that it will be great to have a PC monitor 3 times brighter...
how about a monitor with blacks 30,000 times darker instead? my eyes will still be hurt at maximum brightness, but overall picture quality will be much better than this 1000 nits PoS.

who is more out of his depth: those that want 4:2:2 12-bit color over 4:4:4 10-bit, those that want 200Hz over a scanning backlight, or those that desire 1000 nits in a PC monitor?
 
I doubt most who buy a $1400 - $2000 1000nit FALD HDR g-sync monitor are going to compromise (likely would have to turn off) those features to get strobing mode (and most likely at lower graphics settings for uncompromised strobe mode fps exceeding a 100hz or greater hz cap).
Sorry, but I'm an Arabic speaker, so I don't know what "strobing mode" means.
Can you give me a simple explanation of it, and of how strobing mode affects HDR 1000 nits + FALD + G-Sync at the same time?
For me, I only knew about 120 Hz / 144 Hz / 200 Hz + HDR + FALD + G-Sync, and having all these features in one monitor is great work, so I never thought about strobing mode at all.
 
You obviously don't understand how brightness translates in high dynamic range scenes, FALD HDR in this case, (unless you are being deliberately obtuse). I suggest you read up on it.

Even though you sound triggered and are ignorant (literal term) of how high brightness is realized on a high dynamic range monitor I agree that black depth is also HIGHLY desired, which is why I feel a high density FALD array VA is the best bet until something better in a full featured gaming monitor comes out.

A lot of people, myself included, didn't get how effective 120hz+ monitors or g-sync were at first (or what lightboost was effective at, and not, for that matter). HDR will have a learning curve too. For most of these technologies, you ultimately have to see them in person to really understand. The fact that they come with a premium price tag for the first few years doesn't help the more stubborn people to accept the effectiveness of newer, superior features either.

By the way, the HDR Premium label has 1000 nits (peak) as the MINIMUM requirement for the standard. HDR movies are mastered at up to a 4000 nit peak, and the format tops out at 10,000 nits max (theoretically, perhaps someday for very large screens). The Premium label for LCDs also requires a 0.05 nit black depth. I think this monitor should be able to get that in FALD mode. I have my doubts about the IPS version though and am curious to see both reviewed properly.

=============================================================================

HDR is a completely different way of coding brightness.
"It replaces gamma with a scheme that is far more suited to flat panel displays that respond linearly (or close to it) unlike CRTs that naturally have a non-linear response, which gamma was near-enough the inverse of."

"HDR benefits OLED displays just as much as it benefits LCD displays - though OLEDs are less suited to the high brightness capabilities that HDR brings.

SDR video was designed to be viewed at 100 nits brightness.
HDR video does not place any real restrictions on brightness, so content can be mastered up to 10,000 nits.

Realistically, most content will be mastered at 1,200-2,000 nits.
But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits.

If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display.

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display.

HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays."
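For anyone curious how "an exact value, not a relative value" works mechanically, HDR10 uses the SMPTE ST 2084 (PQ) transfer function to map code values to absolute nits. Below is a minimal sketch of that curve plus the highlight-clipping behaviour described above; the 500 and 1000 nit peaks are just the example values from the quote, not specs of any particular monitor.

Code:
# SMPTE ST 2084 (PQ) EOTF: a normalized code value maps to an absolute luminance in nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """Decode a PQ code value in [0.0, 1.0] to absolute nits."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def displayed_nits(code, panel_peak):
    # Displays without enough headroom clip highlights at their own peak.
    return min(pq_to_nits(code), panel_peak)

for code in (0.5, 0.6, 0.75):
    print(f"code {code:.2f}: mastered {pq_to_nits(code):6.0f} nits | "
          f"1000-nit panel {displayed_nits(code, 1000):6.0f} | "
          f"500-nit panel {displayed_nits(code, 500):6.0f}")

A code value around 0.75 decodes to roughly 980 nits: a 1000-nit panel can show it, while a 500-nit panel clips it to its own peak, which is exactly the difference described in the quote.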

 
This monitor might actually be able to do quite nice strobing via simple black frame insertion on every 2nd frame at 200 Hz, if that can be made to work with G-Sync (perhaps in the drivers, or even in games/mods if developers just add an option).
Assuming the FALD works flawlessly and without input lag every single frame, it would then black out all light and effectively work as 100 Hz strobing: 5 ms strobe, 5 ms black, and that would automatically work with G-Sync, etc.
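A quick sanity check on that idea, using the post's own 200 Hz / 5 ms figures and assuming the FALD really could blank completely on alternate refreshes (the brightness number is just the duty-cycle implication, before any backlight boost):

Code:
REFRESH_HZ = 200
frame_ms = 1000 / REFRESH_HZ            # 5 ms per refresh at 200 Hz

# Black frame insertion on every 2nd refresh: lit, dark, lit, dark...
effective_strobe_hz = REFRESH_HZ / 2    # 100 Hz of visible frames
persistence_ms = frame_ms               # each visible frame lasts one 5 ms refresh
duty_cycle = 0.5                        # the backlight is dark half the time

print(f"~{effective_strobe_hz:.0f} Hz effective strobing, {persistence_ms:.0f} ms persistence, "
      f"roughly {duty_cycle:.0%} of full brightness unless the backlight compensates")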
 