ASUS PG35VQ 3440x1440 200Hz G-Sync

I am not aware of an X35. Google tells me that name was used back in 2015 in the rumor mill around the Z35, a 2560x1080 PoS with a 200Hz VA panel full of ghosting and overshoot, probably the worst monitor under the Predator brand.
The Z35P, a 3440x1440 100Hz model that promises overclocking to 120Hz, is already on sale, but there are no reviews yet.
 
Intriguing.

I was just thinking about how we should have 2560x1440 @ 240Hz by now. This is a bit different, of course, but still nice. Really hoping we get QHD 240Hz soon...
 

NOPE. 240Hz means nothing without a strobing/scanning backlight. I would rather have 100Hz with strobing than 240Hz without it.
 
Depends on user preferences. On my XL2730Z I play at 144Hz with strobing when I am a try-hard (OW, CSGO), but in most other cases I do not.

What I am curious about is whether 240Hz is "good enough" for motion tracking compared to 100Hz-144Hz strobing, such that the lack of strobe crosstalk makes up for any minor deficit.

You can have the option for either, btw. PG258Q supports 144hz strobing and 240hz non-strobing so I'd be curious what people felt about it.
http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg258q.htm

It looks like it is approaching "good enough" when comparing UFO screenshots of 200Hz/240Hz vs 144Hz strobing. I'd have to see it in person to judge, though. Crosstalk can be very annoying when I notice it.

Non-strobing: http://www.tftcentral.co.uk/images/asus_rog_swift_pg258q/pursuit_1.jpg
Strobing (top, middle, bottom of screen): http://www.tftcentral.co.uk/images/asus_rog_swift_pg258q/ulmb.jpg
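To put rough numbers on the strobed-vs-unstrobed comparison, here's a quick Python sketch using the usual approximation that perceived blur trail length is scroll speed times pixel persistence. The 1 ms strobe pulse width is an assumption for illustration; real ULMB pulse widths vary and are often adjustable.

```python
# Rough comparison of perceived motion blur (in pixels) for a moving object,
# using the approximation: blur ~= scroll speed (px/s) * pixel persistence (s).

def blur_px(speed_px_s, persistence_s):
    """Approximate blur trail length in pixels."""
    return speed_px_s * persistence_s

speed = 960  # px/s, the standard TestUFO pursuit speed

# Sample-and-hold: persistence is the full refresh period.
for hz in (60, 100, 144, 200, 240):
    print(f"{hz:3d} Hz sample-and-hold: {blur_px(speed, 1 / hz):5.1f} px of blur")

# Strobed backlight: persistence is the strobe pulse width, not the refresh period.
pulse = 0.001  # 1 ms pulse, assumed for illustration
print(f"144 Hz strobed (~1 ms pulse): {blur_px(speed, pulse):5.1f} px of blur")
```

By this back-of-envelope math, 240Hz sample-and-hold still leaves a ~4 px blur trail at TestUFO speed, while a ~1 ms strobe gets it down near 1 px, which matches what the pursuit photos show.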

I want this in 27" QHD.
 
60hz-200hz.png


Maybe I'm missing something, but is there a difference between these two sides?
 

The back of the chopper has obviously taken less damage, which means 200Hz helps make you less scrubby at flying helis. Kidding aside, the 200Hz image even shows slightly blurrier text, which shouldn't be the case.
 
Last time there was a 200Hz VA panel it was way too slow to make proper use of such a high refresh rate. If I could maintain 200Hz consistently I'd rather use 120Hz ULMB instead. But WILL this monitor have that or not?!
 
I would expect this to use an AHVA panel, not a VA panel. AHVA panels typically have a 4ms GTG response time, so they can theoretically go up to 250 Hz before smearing becomes an issue. No word on ULMB yet, but I have yet to see a G-Sync display that doesn't have ULMB as an option.
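That 4ms-to-250Hz figure is just the reciprocal of the quoted response time; a quick sketch of the arithmetic (keeping in mind that quoted GtG specs are best-case marketing numbers, so treat the result as an upper bound):

```python
# If a pixel takes t seconds to finish a transition, the refresh period must be
# at least that long for the transition to complete within one frame, so:
#   max_refresh ~= 1 / response_time
# Quoted GtG figures are best-case marketing numbers; real averages are slower.

def max_refresh_hz(gtg_response_s):
    return 1.0 / gtg_response_s

print(max_refresh_hz(0.004))   # 4 ms GtG -> 250.0 Hz (the claim above)
print(max_refresh_hz(0.0085))  # 8.5 ms, an assumed slower measured average
```

The second line shows why measured averages matter: at 8.5 ms average GtG the same math gives only ~118 Hz before transitions start spilling across frames.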
 

Acer X34 and Asus PG348Q don't have ULMB and are G-Sync.
 
it was way too slow to make proper use of such a high refresh rate.
That isn't how that works. They are two separate issues: 200Hz still looks like 200Hz even if the pixel response times are slower than the refresh period.
I would expect this to use an AHVA panel, not a VA panel. AHVA panels typically have a 4ms GTG response time, so they can theoretically go up to 250 Hz before smearing becomes an issue. No word on ULMB yet, but I have yet to see a G-Sync display that doesn't have ULMB as an option.
shouldn't you know better than to quote bullshit marketing like that as if it has any meaning at all by now?
 
200Hz isn't going to matter much versus the 100Hz 21:9s of today. 100Hz is getting close to maxing out these IPS pixel speeds anyway.
 
Yeah, the HDR feature is the only thing that would sell it for me. I have one foot in Microcenter right now on the X34 deal.
 
I'll preface this by saying we finally have GPUs that can push 2560x1440 at high Hz on a single card: the Pascal Titan and the 1080 Ti.

ALY9lQS_d.jpg


I do somewhat understand the resolution vs. frame rate argument GoGaming is making, though, especially for people who buy mid-range GPUs, where 1080p at high settings gives them a lot more room to crank up higher settings while keeping high fps-Hz.

If you aren't getting at least 100fps-hz average, you aren't getting any appreciable gains out of a high hz monitor.

If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.


=========================================================

Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher rez at any kind of high-to-ultra settings in the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com 's Q and A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. Also, there can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutters more with VSYNC ON, while others stutters more with VSYNC OFF. Test opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------
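The "119fps @ 120Hz can cause 1 stutter per second" line in the quote above is easy to sanity-check: every second the frame stream falls behind the refresh stream by the difference between the two rates, and each shortfall shows up as one repeated frame. A minimal sketch:

```python
# Beat-frequency stutter: when fps runs close to (but below) the refresh rate,
# the display runs out of new frames |hz - fps| times per second, and each
# shortfall appears as one repeated frame (a visible stutter).

def stutters_per_second(fps, hz):
    return abs(hz - fps)

print(stutters_per_second(119, 120))  # -> 1, matching the Blur Busters example
print(stutters_per_second(110, 120))  # -> 10
```

This is why the quote recommends either capping well below, or running far above, the refresh rate rather than hovering just under it.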

In recent news, this sounds interesting....

http://www.blurbusters.com/combining-blur-reduction-strobing-with-variable-refresh-rate/

I doubt, going forward, that these strobing techs will work by default with HDR gaming monitors and FALD HDR gaming monitors (and at the sustained, not average, frame rates strobing requires at 1440p and 4k), which looks like where things are going eventually.

I agree 200Hz is out of reach of both the monitor and the most demanding games' frame rates (200 fps-Hz). If you aren't feeding the monitor a new, unique frame of action/world state, you are just seeing "freeze frames" across multiple Hz. If the monitor's response time capability is surpassed, you likewise won't see new, unique frames past some fps-Hz threshold, which the reviews of the older 35" 200Hz models go into in detail.

I'm still interested in what these monitors can do. Hopefully they will be tight enough up to 120fps-Hz. Their contrast and black depth should be 3x that of IPS monitors (or more, considering a large, fine FALD array backlight). 1000-nit FALD HDR like the 27" 4k too.
 
I agonized over 3x 1440 G-sync monitors vs 1x 34" G-sync in April.

I went with NV Surround and while I am mostly happy with it, 3 screens is always a hassle and PITA - doubly so when using an HDMI accessory monitor for audio.

I don't care about 200hz but HDR seems like a good addition.

I think by March of next year, I'll keep 1 of these S2417DG monitors and give 2 to my boys or maybe sell one and get one of these beasties for my main screen with a TN 2nd screen on the side.
 
This is obviously gonna be the 21:9 counterpart to the 27" 4K, 144hz, HDR G-sync PG27UQ.

That makes it a compelling product. Even though 200Hz is unrealistic in most games, it gives you HDR with FALD (and thus should have much better contrast and black levels than ANY other LCD on the market, including any VA panels) in a larger format (35") for people who prefer that, as well as being easier to drive at 3440x1440 and high refresh rates than 3840x2160.

I think this is a great product just based on the display size alone, a lot of people are going to want G-sync 2.0 w/ HDR in a larger format than 27". If I'm paying $1500-$2000 or even more for one of these, I'm going to want the bigger display.
 
In a way. But 5160x2160 @ 100Hz is also a "21:9" counterpart to UHD 144Hz. It uses about the same bandwidth. I weep for the GPU driving that, but I think some people would like it. Probably coming eventually.
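The "about the same bandwidth" claim checks out on raw pixel rate. Actual link bandwidth also depends on blanking intervals and bit depth, so this is only a rough proxy:

```python
# Raw pixel-rate comparison (pixels per second) between the hypothetical
# 5160x2160 @ 100 Hz ultrawide and UHD at 144 Hz.

def pixel_rate(w, h, hz):
    return w * h * hz

uw = pixel_rate(5160, 2160, 100)   # -> 1,114,560,000 px/s
uhd = pixel_rate(3840, 2160, 144)  # -> 1,194,393,600 px/s

print(f"{uw:,} vs {uhd:,} pixels/s, ratio {uw / uhd:.2f}")  # ratio ~0.93
```

So the ultrawide at 100Hz is about 93% of the pixel rate of UHD at 144Hz, i.e. close enough to "about the same" for a single cable and a weeping GPU.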
 
I have 3 980 Tis and can't max out 2560x1920 @ 120Hz. What GPUs would you need for this monitor?!
 

Unless you typoed your resolution, 2560x1920 is the same number of pixels as 3440x1440. And anyway, sure, but in what games at what settings? 3-way SLI scaling is very bad and offers little to no improvement over 2-way SLI in many games, and at framerates in the 200 range games are often CPU bound anyway. CSGO and Overwatch probably cover the vast majority of FPS gaming people do in 2017, for example, and it's not a problem to get Overwatch to average 144fps @ 3440x1440 with one GTX 1080 Ti at Ultra settings; with settings tweaked down slightly, or with overclocking, a 200fps average is probably achievable. With CSGO, framerates obviously aren't an issue at any resolution.

No you're not gonna play Witcher 3 at 200fps, but that's what G-sync is for, you don't need 200fps to enjoy the great black levels and insane contrast provided by HDR/FALD. And in the games where high refresh rate really matters, it's usually worth it to tune settings down a little for the improved motion resolution.
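For reference, the two resolutions really do work out to nearly the same pixel count:

```python
# Pixel-count comparison: 4:3 2560x1920 vs 21:9 3440x1440.
a = 2560 * 1920
b = 3440 * 1440
print(f"{a:,}")                   # 4,915,200
print(f"{b:,}")                   # 4,953,600
print(f"{(b - a) / a * 100:.1f}%")  # 3440x1440 has ~0.8% more pixels
```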
 

I have a 6800K at over 4GHz, so there shouldn't be a bottleneck. The 980 Tis are overclocked to 1550MHz. I play War Thunder at 1600x1200, max settings, 8xMSAA, at 180Hz to match the framerate to the refresh rate. At 2560x1920 I can get about 100FPS at medium settings, no AA. Overwatch runs nicely at 2560x1920 120Hz without AA, but MSAA will cause framerate dips. CSGO I run at 1024x768 300Hz with supersampling AA, because FPS matters most and at such a low res the CPU bottlenecks, so I can downsample.

I have a CRT, so it is like having a more powerful version of ULMB that is always on. Small drops in FPS under the refresh rate are pretty bad, but since it is a tube, there isn't a fixed resolution. You can always find a good res to play at, and there is never motion blur.
 
Some people say it's AHVA (IPS) and others say it's AMVA.
Which is correct?
 
I hope it's VA for the contrast ratio, black depth, and detail in blacks.

VA is way better for HDR since the contrast is much higher and the black depth much deeper. This results in not only much deeper blacks but more detail-in-blacks instead of black blobs. People mention IPS color, but these monitors are quantum dot and going for P3 color, which is a few percent wider than Adobe RGB. I was disappointed that the 4k one was IPS, actually, so I am happy to see that the 21:9 version is supposedly VA.

The HDR Premium label standard requires at least 1000-nit peak brightness but also 0.05-nit black depth. Most IPS monitors are 0.12 to 0.14 black depth and lucky to get over 900:1 contrast ratio. I'm not sure how much the 384-zone FALD array will do for the 4k IPS HDR models, but the fact that no contrast ratio numbers have been posted for the 4k IPS one is suspect.

The 1080p Eizo FG2421 VA was over 4000:1 contrast ratio, but a modern, lower-response-time VA computer monitor tops out around 3000:1. My VA TV is over 5000:1 with FALD off, and over 8000:1 with FALD on, and it's nowhere near a 384-zone array. I'm not sure how much the 1000-nit peak brightness could give a false impression of the contrast ratio numbers, but that is why the black depth figure is so important.
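Since static contrast ratio is just peak luminance divided by black luminance, the black depth figure is doing all the work here; a quick sketch (the 120-nit white / 0.13-nit black IPS numbers are illustrative, chosen to roughly match the "lucky to get over 900:1" figure for typical IPS panels):

```python
# Static contrast ratio = white luminance / black luminance, both in nits.
# High peak brightness alone says nothing; the black floor sets the ceiling.

def contrast(white_nits, black_nits):
    return white_nits / black_nits

print(f"{contrast(1000, 0.05):,.0f}:1")  # HDR Premium floor -> 20,000:1
print(f"{contrast(120, 0.13):,.0f}:1")   # illustrative IPS  -> ~923:1
```

This is why a 1000-nit panel that can't hit the 0.05-nit black floor doesn't actually deliver the contrast the brightness number implies.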
 
Well, at least we can stop speculating now.

So with VA we get smearing among other things, but it seems to me haloing as a result of the FALD would be less of an issue compared to IPS?
 
Viewing angles in this vid at the marked timestamp do exhibit the contrast shift typical of a VA panel. That's disappointing. I guess it will be okay if you run it at 100 Hz...

 

Maybe. Someone over on overclock forum bought the Dell HDR 384 zone FALD IPS monitor and the halo effect was absurd. Will the gsync monitors suffer from it too? Maybe. But none of those monitors are even out yet.
 
The Predator Z35 review from TFT Central said it was OK up to 120Hz. We will see what these newer models can do. IMO, if the IPS versions can't do the 0.05 black depth that the HDR Premium label standard outlines, then they aren't really HDR. IPS and TN black depth is terrible. Again, choose your tradeoffs.

Z35 TFT Central quote... I'm assuming these newer ones will be at least as good, perhaps with better overdrive if we're lucky:

We felt the screen was ok at up to around 100 - 120Hz without the overshoot becoming distracting and noticeable. At these levels there was little excess blurring or smearing introduced due to some slow pixel changes in most cases, although some scenes showed problems. The overshoot was kept at fairly low levels and not really a problem though up to 120Hz. Sadly we felt the panel was not capable of achieving low enough response times to reliably support anything higher, including the 144Hz native maximum refresh rate, and the overclocked range from 160 - 200Hz. The smearing becomes more obvious and the overdrive impulse starts to get applied too aggressively to try and keep up, and so the resulting overshoot is a big problem. You might want to experiment with different settings above 120Hz perhaps, although we felt the performance at 120Hz gave a better overall appearance.

As for the shift in that video, who is going to be looking at the monitor from 4' to the left, standing 5' away at 45 degrees? Within the focal point of the curvature it should be fine.
 
There's another point people seem to be missing with these high-refresh-rate G-Sync monitors. With G-Sync enabled, when you hit the max refresh rate of your monitor, you have two options: vsync on = increased input lag; vsync off = potential screen tearing.
Fortunately, some games offer in-game frame limiters that can be configured below your max refresh rate, so that your screen's max is never hit, providing the most responsive picture without tearing. However, not all games do. Sometimes this can be overridden through Nvidia Inspector, but that method can also introduce input lag.

But if you have a monitor with an unusually high refresh rate, it's less likely that your framerate will ever reach your max refresh rate. This means that, without specific in-game support or 3rd-party tools, you're more likely to remain in the G-Sync range, getting the best image quality and lowest input lag. So there is some benefit to having a refresh rate higher than 144Hz, even if it's not visibly smoother. Sure, Quake 3 Arena and CS 1.6 will blow past those frame rates, but I'd imagine most games made in the last decade won't.
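For what it's worth, the usual workaround when a game does offer a limiter is to cap a few fps below the refresh ceiling; a sketch (the 3 fps margin is a common rule of thumb from G-Sync testing, not a hard spec):

```python
# Cap the framerate slightly below the panel's max refresh so frametimes stay
# inside the VRR window and V-Sync backpressure (or tearing) never kicks in.
# The 3 fps margin is a widely used rule of thumb, not a hard requirement.

def gsync_frame_cap(max_refresh_hz, margin_fps=3):
    return max_refresh_hz - margin_fps

for hz in (144, 200, 240):
    cap = gsync_frame_cap(hz)
    print(f"{hz} Hz panel -> cap at {cap} fps ({1000 / cap:.2f} ms frametime)")
```

On a 200Hz panel that would mean capping at 197 fps, which is exactly the "stay in the G-Sync range" behavior described above, just made explicit instead of relying on the game never reaching the ceiling.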

I'm already deciding which body parts I'm willing to part with the most for one of these screens.
 
We know FALD is a great feature, especially with VA + 512 zones, but what about this case? You can't enable local dimming while game mode is turned on. Game mode makes the input lag feel incredibly quick and smooth, but it disables local dimming, which means the HDR experience in HDR-enabled games will not be as good as it could be.
 