G-Sync 4K 144Hz HDR monitor prices 'released'

Doesn't limiting the color information kinda go against the whole HDR thing?..
I was gonna say...

I thought half of what was needed with modern HDR displays was basically doing what Matrox was doing back in the '90s/early 2000s and using 10-bit color channels instead of just 8-bit (yeah, that's right, DisplayPort and HDMI are catching up to decades-old VGA!), since the expanded gamut would make banding more obvious with the limited color resolution. Needless to say, that's just even more data to be shunting through the video interface at high refresh rates - not a good thing when we're hitting bandwidth limits.
 
Doesn't limiting the color information kinda go against the whole HDR thing?..
HDR is about luminance. 4:2:2 subsampling still provides full luminance information. The range of each color component is still 10 bits. Besides, if you're watching a UHD HDR video from any source right now, it is going to be 4:2:0.
 
HDR is about luminance. 4:2:2 subsampling still provides full luminance information. The range of each color component is still 10 bits. Besides, if you're watching a UHD HDR video from any source right now, it is going to be 4:2:0.
By switching from 4:4:4 to 4:2:2 you halve the chroma information while keeping full luminance; 4:2:0 keeps only a quarter of the chroma information of 4:4:4. HDR is about luminance and tone mapping, no doubt. I still don't want to sacrifice one for the other at $2k+, though, so I just won't be an early adopter :)
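For anyone curious why these panels fall back to 4:2:2 at 144Hz, here's a back-of-the-envelope sketch (my numbers; it ignores blanking/protocol overhead, so real margins are tighter):

```python
# Back-of-the-envelope data rates for 4K @ 144 Hz, 10 bits per channel.
# Ignores blanking/protocol overhead, so real margins are tighter.
def video_rate_gbps(w, h, hz, bpc, chroma):
    # Average bits per pixel for each Y'CbCr subsampling mode:
    # 4:4:4 keeps all chroma samples, 4:2:2 half, 4:2:0 a quarter.
    bpp = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bpc
    return w * h * hz * bpp / 1e9

DP14_PAYLOAD = 25.92  # DisplayPort 1.4 HBR3 payload in Gbit/s (after 8b/10b coding)

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    rate = video_rate_gbps(3840, 2160, 144, 10, mode)
    verdict = "fits" if rate <= DP14_PAYLOAD else "exceeds DP 1.4"
    print(f"{mode}: {rate:5.1f} Gbit/s -> {verdict}")
```

4:4:4 comes out around 35.8 Gbit/s against DP 1.4's ~25.9 Gbit/s of payload, while 4:2:2 lands near 23.9 Gbit/s, which is presumably the whole reason the subsampling exists on these monitors.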
 
The measurements of the Philips 43" again leave me much more interested in the 35" VA versions of these monitors. If you can achieve 18000:1 contrast with a good VA panel and a 32-zone edge-lit backlight, the performance of a 512-zone backlight operating at the low latencies the PG27UQ is capable of should make for a very impressive level of LCD contrast. Clearly it is possible to build a VA panel with better dark transition times than we've seen in the past.
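A rough way to think about it (a toy model with made-up numbers, not measurements): FALD contrast is roughly native panel contrast multiplied by how far a zone can dim, when the bright and dark parts of the scene land in different zones.

```python
# Toy model, hypothetical numbers: with local dimming, a dark zone can cut
# its backlight while a bright zone stays at full power, so same-scene
# contrast is roughly native panel contrast x zone dimming ratio,
# *provided* the bright and dark areas fall in different zones
# (haloing is what you get when they don't).
def effective_contrast(native_contrast, zone_dim_ratio):
    return native_contrast * zone_dim_ratio

print(effective_contrast(3000, 6))  # 3000:1 VA, zones dim to 1/6 -> 18000:1
print(effective_contrast(1000, 6))  # same dimming on a 1000:1 IPS -> 6000:1
```

This is also why the same zone count buys a VA panel so much more than an IPS panel: the multiplier starts from a far higher base.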
 
The measurements of the Philips 43" again leave me much more interested in the 35" VA versions of these monitors. If you can achieve 18000:1 contrast with a good VA panel and a 32-zone edge-lit backlight, the performance of a 512-zone backlight operating at the low latencies the PG27UQ is capable of should make for a very impressive level of LCD contrast. Clearly it is possible to build a VA panel with better dark transition times than we've seen in the past.

The original Philips 4065 already proved that VA panels can be fast. That thing had an average GtG of 7.4ms with NO overdrive at all, and its worst transition, 0-255, was a respectable 18ms. Why no manufacturer has created an equally fast high-refresh VA panel to this day just puzzles me.
 
The original Philips 4065 already proved that VA panels can be fast. That thing had an average GtG of 7.4ms with NO overdrive at all, and its worst transition, 0-255, was a respectable 18ms. Why no manufacturer has created an equally fast high-refresh VA panel to this day just puzzles me.

Because VA & TN panels need to DIE!!! I think IPS + FALD will be barely tolerable after OLED has spoiled my eyes.
 
Because VA & TN panels need to DIE!!! I think IPS + FALD will be barely tolerable after OLED has spoiled my eyes.

VA looks better than IPS outside of the dark transition issue. VA FALD is vastly superior to IPS FALD, and it's not close. There's a reason that almost all LCD TVs are VA. You simply can't overcome a 5x native contrast difference via FALD, and VA's light blocking also makes a big difference in visible haloing. In terms of viewing angle, VA does lose contrast off-angle, but this is pretty irrelevant given the contrast starting point is so much better. In terms of color reproduction they're pretty much the same; the best IPSes can get a little more accurate, but both are easily capable of calibrating to below 2.0 dE, which is all that really matters.

Of course OLED beats everything, but the idea that IPS is somehow better than VA is wrong. IPS is worse at pretty much everything other than off-angle color reproduction and dark pixel response time.
 
VA looks better than IPS outside of the dark transition issue. VA FALD is vastly superior to IPS FALD, and it's not close. There's a reason that almost all LCD TVs are VA. You simply can't overcome a 5x native contrast difference via FALD, and VA's light blocking also makes a big difference in visible haloing. In terms of viewing angle, VA does lose contrast off-angle, but this is pretty irrelevant given the contrast starting point is so much better. In terms of color reproduction they're pretty much the same; the best IPSes can get a little more accurate, but both are easily capable of calibrating to below 2.0 dE, which is all that really matters.

Of course OLED beats everything, but the idea that IPS is somehow better than VA is wrong. IPS is worse at pretty much everything other than off-angle color reproduction and dark pixel response time.

VA, TN and IPS are all Cat Poo....and this IPS FALD is the last plate of it that I am going to eat!
 
They all have deficiencies, one way or another.

Have a 165Hz IPS 1440p, 60Hz VA "HDR" 4k, and B7 OLED, and they all have their own issues. Actually have a few ancient TN panels that I don't mind either.

Maybe 'Micro LED' will be the one to rule them all?
 
They all have deficiencies, one way or another.

Have a 165Hz IPS 1440p, 60Hz VA "HDR" 4k, and B7 OLED, and they all have their own issues. Actually have a few ancient TN panels that I don't mind either.

Maybe 'Micro LED' will be the one to rule them all?

That's the problem. There is no perfect monitor. All panel technologies have some sort of inherent trade-off compared to other types. If you go with a super fast panel you end up constrained on size, resolution and image quality. If you go with a larger display that has better image quality, you will sacrifice response times and refresh rates. Going too large has its own issues as well. Going with some super ultrawide display comes with a different host of problems.

At the end of the day you have to look at all the options at the time you're making a decision and buy according to a hierarchy of priorities that you set. For me, that means no 1440 or 1080 bullshit. HDR is a must and I won't accept less than 40" in size. I don't want badly supported aspect ratios either. I want a higher refresh rate and G-Sync, but that's not offered in the size range I like, so I have to lose refresh rate to get the size. I want a fast refresh rate, but being stuck below 40" bothers me too much to go with any monitor that fits that description. Again, everyone is different. Many gamers just want fast refresh rates and care very little about size. 34-38" is acceptable to them.
 
The tradeoffs, other than size, should narrow in 2019+ with HDMI 2.1 offering native 120Hz at 4k, VRR (variable refresh rate, standardized), and QFT low input lag... on HDR OLED displays. The only tradeoff then, other than waiting for GPU manufacturers to support VRR and the likely GPU upgrade $$$$ (why support it in old models when you can force an upgrade!), is that the size might actually be TOO BIG for some people, since LG OLEDs are 55" at the smallest so far. TCL is supposed to print and manufacture their own OLED lines in 2019 though, including a 31". For me personally it would be worth having the monitor/TV set further back from the desk in order to get those features, even if I had to redesign a room around it.


========================================

HDMI 2.1

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1511934073
The first phase of HDMI 2.1 certification starts in the second quarter, with full certification expected to begin in the third or fourth quarter of the year. Products with the official stamp of approval can be launched following successful certification.
Besides support for 8K and 10K resolution as well as 4K resolution at 120fps, HDMI 2.1 supports Dynamic HDR to enable “multiple static and dynamic HDR solutions”. It features eARC, which “supports the most advanced high bitrate home theater audio formats, object-based audio, uncompressed 5.1 and 7.1, and 32-channel uncompressed audio” with audio bandwidth up to 37 Mb/s.

  • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better detailed gameplay.
  • Quick Media Switching (QMS) for movies and video eliminates the delay that can result in blank screens before content is displayed.
  • Quick Frame Transport (QFT) reduces latency for smoother no-lag gaming, and real-time interactive virtual reality.
  • Auto Low Latency Mode (ALLM) allows the ideal latency setting to automatically be set allowing for smooth, lag-free and uninterrupted viewing and interactivity.
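For context on what that bandwidth bump buys, a rough sketch (same back-of-the-envelope style as above; blanking overhead ignored, so real margins are somewhat tighter):

```python
# Rough check of what HDMI 2.1's bandwidth buys (blanking overhead ignored).
# FRL signaling is 48 Gbit/s raw; 16b/18b coding leaves ~42.7 Gbit/s payload.
HDMI21_PAYLOAD = 48 * 16 / 18  # ~42.7 Gbit/s

rate_4k120 = 3840 * 2160 * 120 * 30 / 1e9  # 10 bpc, 4:4:4 -> ~29.9 Gbit/s
print(f"4K120 10-bit 4:4:4 needs ~{rate_4k120:.1f} Gbit/s "
      f"of ~{HDMI21_PAYLOAD:.1f} Gbit/s available")
```

That headroom is why 4K 120Hz HDR shouldn't need chroma subsampling over HDMI 2.1.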

==========================================


2018 LG OLEDs are already making progress...
https://www.forbes.com/sites/johnar...what-a-difference-a-brain-makes/#5fc38ccd7ab7

The most impressive improvements are the OLED65E8’s sharpness with 4K pictures, upscaling of sub-4K sources, HDR Game mode, and handling of color noise.

Looking at the last of these improvements first, previous LG OLED TVs have struggled to resolve subtle color tone shifts within HDR/wide color sources, causing various types of noise (including some messy magenta blocking) in tricky image areas such as the ‘strata’ of different tones evident in the skies of the 4K Blu-ray of Mad Max: Fury Road. On the OLED65E8, though, this color noise is pretty much completely eradicated.

Great news for gamers
Gamers frustrated by the relative darkness of the low-latency HDR Game preset on LG’s previous two OLED generations will be elated to learn that the OLED65E8’s HDR Game mode looks far brighter.
In fact, its measured light output on a 10% white HDR window is as high as that of any of the other presets. And thanks to differences in its gamma and color temperature settings, it actually looks brighter than some of the HDR Video picture presets.

This, of course, makes games look much more like the dynamic visual experiences most of them are supposed to be, and no longer makes HDR gaming feel like a compromise versus the video HDR presets.

LG has been able to deliver this boost in HDR Game brightness without compromising input lag (the time the screen takes to ‘draw’ its pictures). In fact, I measured input lag at under 20ms on average, with a few measurements as low as 12ms. Excellent.
On top of all this, the OLED65E8 supports 120fps play (though only at HD resolutions) and 4:4:4 color handling from PCs.

It doesn’t, however, join Samsung’s Q9FN range in supporting variable refresh rates or automatic game mode switching - and so far as I can tell from LG, these features are not going to be addable via firmware updates. It seems they will only come with 2019 OLED sets, which will presumably carry the HDMI 2.1 ports set to make such features commonplace.

I should probably add a word of caution here regarding the potential for a much brighter gaming mode to cause screen burn; a phenomenon where prolonged exposure to bright and/or richly colored and relatively static image content (such as channel logos or gaming HUDs) may leave permanent traces of those static elements behind on an OLED screen.

In fact, I’d previously considered that the relatively dark look of the HDR Game mode on LG’s TVs had something to do with the brand’s wariness over the screen burn issue.

LG is presumably pretty confident, though, that it has largely tackled the screen burn issue under typical viewing conditions/usage patterns with its latest screens, or else it surely wouldn’t have ramped up the Game mode brightness so much.

There are definitely still a few wrinkles/tradeoffs in the 2018 ones, outlined on page 2. Hopefully 2019 sets will continue to improve:
https://www.forbes.com/sites/johnar...-tv-review-what-a-difference-a-brain-makes/2/
 
That's the problem. There is no perfect monitor. All panel technologies have some sort of inherent trade-off compared to other types. If you go with a super fast panel you end up constrained on size, resolution and image quality. If you go with a larger display that has better image quality, you will sacrifice response times and refresh rates. Going too large has its own issues as well. Going with some super ultrawide display comes with a different host of problems.

At the end of the day you have to look at all the options at the time you're making a decision and buy according to a hierarchy of priorities that you set. For me, that means no 1440 or 1080 bullshit. HDR is a must and I won't accept less than 40" in size. I don't want badly supported aspect ratios either. I want a higher refresh rate and G-Sync, but that's not offered in the size range I like, so I have to lose refresh rate to get the size. I want a fast refresh rate, but being stuck below 40" bothers me too much to go with any monitor that fits that description. Again, everyone is different. Many gamers just want fast refresh rates and care very little about size. 34-38" is acceptable to them.

The 39" Zisworks X39 was a great compromise....except it lacked overdrive which made motion smear really horrid.

Oh and it broke last week and does not work anymore so that sucks too.

Back to working on my 55" C6 Oled.....and will have this 27" next to it.

Wish we could just have a 40" 4k120 HDR VRR OLED and be done with this dual display on the desk nonsense
 
That's the problem. There is no perfect monitor. All panel technologies have some sort of inherent trade-off compared to other types. If you go with a super fast panel you end up constrained on size, resolution and image quality. If you go with a larger display that has better image quality, you will sacrifice response times and refresh rates. Going too large has its own issues as well. Going with some super ultrawide display comes with a different host of problems.

At the end of the day you have to look at all the options at the time you're making a decision and buy according to a hierarchy of priorities that you set. For me, that means no 1440 or 1080 bullshit. HDR is a must and I won't accept less than 40" in size. I don't want badly supported aspect ratios either. I want a higher refresh rate and G-Sync, but that's not offered in the size range I like, so I have to lose refresh rate to get the size. I want a fast refresh rate, but being stuck below 40" bothers me too much to go with any monitor that fits that description. Again, everyone is different. Many gamers just want fast refresh rates and care very little about size. 34-38" is acceptable to them.
Seconded on the "no perfect monitor" bit. For someone who's laid eyes on a GDM-FW900 in proper working order (before something blew inside mine; I'm waiting on a replacement D board/flyback transformer to rejuvenate it after two or three years of being a big paperweight), everything that's displaced aperture grille CRTs has frankly been a load of suck in one respect or another, aside from size, bulk, and, in very recent years, resolution and refresh rate.

IPS glow, VA dark transitions, godawful TN gamma shift, OLED being far more susceptible to burn-in than late-generation CRT phosphors... MicroLED may be the key, having all of the OLED advantages without the burn-in problem, but who knows when that'll hit the market - at an assuredly high price at launch.

With all of that said, I do care about size (because split/shared-screen gaming with friends in the same room is fun too), and a 65" panel would be sweet to sit back a few feet from. The problem is that going that big means paying HDTV prices for sets that generally have ridiculous amounts of input lag, 60 Hz limitations on video inputs (something that's only just now starting to turn around), and no VRR. NVIDIA seems primed to change this with the BFGD initiative, but I'll be damned if those launch at a price below $5,000 at this rate.
 
The " 31.5" 2560x1440 165 Hz VA G-Sync - LG 32GK850G " actually seems like a good monitor for the moment until hdmi 2.1 VRR 120hz 4k OLED HDR is actualized but dropping $800+ of what I could have saved for the latter doesn't seem like a wise decision to me considering I have a ROG swift TN that works well enough for now at arms reach distance. It's still tempting though. Dropping $2k + on another 27" isn't even in the running. The BFGD prices will be insane and that money again could be saved for hdmi 2.1 OLED in 2019, even moreso... so too little too late. $800 - $2200 - $BFGD pricetag.. instead of waiting. We are already 5 months into 2018.
 
CRTs aren't perfect either. They are heavy, use a ton of power and create a great deal of radiant heat. Not only that, but their picture degrades over time. You are also severely limited on size. The upside is that you don't have to worry about using a native resolution, and the high-quality units were capable of upwards of 120Hz at some resolutions. You also have geometry issues with CRTs. I found that LCD panels were much easier on my eyes than CRTs were. After 10+ hours of use my eyes are in far better shape using an LCD compared to a CRT, despite my CRT use taking place while I was significantly younger.

The FW900 was a truly magnificent display but I was much happier with my Dell 30" 3007WFP despite its limitations.
 
Yeah I agree with LCDs being much easier on the eyes. They have also always looked sharper to me, with the pixels being more distinct. That has downsides and upsides, but I prefer it in most cases. Not sure if that is down to subpixel structure or what, exactly.
 
Yeah I agree with LCDs being much easier on the eyes. They have also always looked sharper to me, with the pixels being more distinct. That has downsides and upsides, but I prefer it in most cases. Not sure if that is down to subpixel structure or what, exactly.

Agreed. This may be the reason they are easier on the eyes. Another thing to consider is refresh rates. On LCDs the refresh rate is harder to perceive outside of frame rates in games. For general desktop use it doesn't make any real difference. I think the flicker inherent to CRTs is always perceivable on some level.
 
I figure LCDs are easier on the eyes because, for the most part, they don't strobe. CRT monitors at 60 Hz tend to be painfully flickery, to say nothing of 50 Hz (read: Europe and PAL in general, hence why Amiga users in NTSC territories want PAL switching), because they're tuned with the expectation that the user's going to crank up the refresh rate, unlike a TV where it's fixed.

On the other hand, motion clarity suffers hard because of that. There's a reason BlurBusters.com exists, documenting all that LightBoost hackery until monitors implemented ULMB natively in some form. Thankfully, my Eizo FG2421 is one of those monitors, and having Turbo240 in action has to be seen to be believed, although anything running at 60 Hz gets noticeably double-strobed with after-images and input lag increases a bit.

Note that a CRT's relative softness isn't always a disadvantage; it provides a sort of natural anti-aliasing, for starters, one that some people consider as crucial to the retrogaming aesthetic as the scanlines. Of course, I wouldn't call an FW900 or any other FD Trinitron chassis close to it "soft"; it's practically LCD quality compared to an SDTV.

I also recall reading about how 3dfx cards use a sort of analog dithering to make 16-bit output look closer to 24/32-bit output than screencaps would suggest, but that it also only looks right on CRTs. Then, of course, there's the fact that native resolutions aren't a thing and the resolutions you can push are simply a matter of dot/grille pitch and the monitor's sync limitations.

Geometry and convergence errors really suck if the OSD and/or knobs won't let you tweak 'em right, though. That part I certainly don't miss, though all the late FD Trinitrons I'm talking about have computer-controlled calibrations that WinDAS and a TTL/UART serial cable let you tweak, up to and including grid-based convergence on the FW900. I dialed mine in real nice while it worked...
 
Yes, once again: without supplying high frame rates to fill the Hz, you aren't going to get the motion clarity (blur reduction) and motion definition (more articulated pathing, smoother "glassy" motion, higher animation definition, etc.).
As I've experienced it, the biggest difference between 144 and 240 was on the desktop itself, and even that was marginal. In games? I am not sure, really.
I'm not trying to sound like a broken record, but unless you are filling the higher Hz with new unique frames, by supplying frame rates that span well into the higher Hz ceiling, you aren't going to get any benefit out of the higher Hz. (I showed an explanation of this in graphs in my previous posts in this thread.)
So obviously a desktop/app scenario outside of games, just moving a mouse and windows around, should usually supply as many frames of motion as the monitor can handle, but when you go into a game, especially a more demanding 1st/3rd person game at higher resolutions, your frame rate is severely limited.
Even the "frame rate" you meter and quote is really a wider range of fluctuating frame rates, +/- around that number, that can be graphed.

The article talking about 1000Hz realizes that 1000fps gaming outright is unrealistic and says that advanced interpolation of 100fps x 10 would be a likely scenario for 1000Hz monitors (mathematically, 125fps x 8 would also work :) ). Even then, in futuristic 180-degree VR at extreme resolutions you'd sense the difference due to the stroboscopic effect unless it was 10,000Hz. So while they are saying 10,000Hz fed massive fps at extreme resolutions would be indistinguishable from reality per se, 1000fps (100fps interpolated 10x) at 1000Hz would be essentially zero blur, like a high-end FW900 CRT, for the purposes of gaming.
This is the blur reduction you get at different frame rates at 1000hz
[Attachment: chart of blur reduction at different frame rates on a 1000Hz display]
And the image below shows a visual representation of the blur at each persistence amount via pursuit camera. However, realize that in a 1st/3rd person game you are moving the entire game world, full of high-detail textures and bump-mapped depth, around relative to you in the viewport when mouse-looking and movement-keying, rather than a simple singular UFO bitmap graphic.
Display persistence is more noticeable for bigger FOV (bigger displays or virtual reality) and for higher resolutions (retina resolutions) due to bigger clarity differences between stationary & moving images.
In the most extreme future case (theoretical 180+ degree retina-resolution virtual reality headsets), display refresh rates far beyond 1000 Hz may someday be required (e.g. 10,000 Hz display refresh rate, defined by the 10,000 Hz stroboscopic-artifacts detection threshold). This is in order to pass a theoretical extreme-motion “Holodeck Turing Test” (becoming unable to tell apart real life from virtual reality) for the vast majority of the human population.
However, for general CRT-quality sports television watching, 1000fps at 1000Hz would sufficiently approximately match 1ms CRT phosphor persistence, for a flicker-free sample-and-hold display. Technologically, this is achievable through interpolation on an ultra-high refresh rate display.
Note that while many of these replies are focused on blur reduction, higher frame rate on a high-Hz monitor (without using duplicated/interpolated frames) also provides greatly increased motion definition, motion path articulation, smoothness (and even animation cycle definition) of individual objects and of the entire game world moving in relation to you while mouse-looking and movement-keying in 1st/3rd person games. So even if you had a 1000Hz monitor using advanced interpolation, you would still need to run it at 100fps x 10 (or 125fps x 8, 200fps x 5) in order to get the greater motion definition benefit of higher Hz.
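To put rough numbers on the blur side of that, here's a minimal sketch of the standard sample-and-hold persistence math (real displays add pixel response time on top):

```python
# Sample-and-hold persistence math (the Blur Busters rule of thumb):
# each frame is held on screen for 1000/fps milliseconds, and an object
# moving at v pixels/second smears across v * persistence pixels.
def persistence_ms(fps):
    return 1000.0 / fps

def blur_px(speed_px_per_s, fps):
    return speed_px_per_s * persistence_ms(fps) / 1000.0

# Example: a 1920 px/s pan (one screen width per second at 1080p).
for fps in (60, 120, 240, 1000):
    print(f"{fps:4d} fps: {persistence_ms(fps):5.2f} ms held, "
          f"{blur_px(1920, fps):5.1f} px of smear")
```

1000fps at 1000Hz works out to 1ms of persistence, which is where the CRT phosphor comparison above comes from.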

I had FW900s. They aren't the answer. More of a pain in the end to keep looking good enough, and the size of the screen sucks even if you dismiss the size of the monitor itself. I held on to using that for years with an LCD alongside, but I'm well past that now.

Most people avoid PWM like the plague. LCD strobing is essentially PWM. It will give you eyestrain. In order to use ULMB/LightBoost properly you have to keep really high minimum frame rates. People are all looking to 3440x1440 and 4k resolutions now, and to HDR luminance ranges and color volumes. Strobing is really not applicable to the bar that modern premium gaming monitors are setting (high resolutions at high Hz, HDR luminance and color volume, VRR with higher graphics settings to avoid judder on dips, potholes and variance).

The real answer is a fairly high frame rate to start with, multiplied by advanced high-quality interpolated (directly repeated, not 'manufactured') frames, combined with extremely high Hz, but that is still years off. The Hz ceilings are making some progress now though, at least.

"So while they are saying 10,000hz fed massive fps at extreme resolutions would be indistinguishable from reality per se.. 1000fps (100fps interpolated 10x) at 1000hz would be essentially zero blur like a high end fw900 crt for the purposes of gaming"



As per blurbusters.com's Q&A:
-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.
This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.
G-SYNC monitors allow you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.
Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate during VSYNC OFF operation. There can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
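(My aside, not Blur Busters': the 1-stutter-per-second figure is just the beat frequency between the two clocks.)

```python
# Beat frequency between the frame clock and the refresh clock: two nearby
# rates drift in and out of phase, and each full cycle shows as a stutter.
def stutters_per_second(fps, hz):
    return abs(hz - fps)

print(stutters_per_second(119, 120))  # -> 1 per second, as quoted above
print(stutters_per_second(90, 120))   # -> 30, frequent enough to blend into general judder
```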
-------------------------------------------------------------
Pre-requisites:
  • Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).
  • LightBoost motion blur elimination is not noticeable at 60 frames per second.
--------------------------------------------------------------
Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.

"If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.
==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher rez with any kind of high to ultra settings on the most demanding games won't let you sustain high fps, only average it.
A lot of people won't even buy a monitor with a PWM backlight because it causes eye strain over time; ULMB strobing can be (is) eye-fatiguing, especially at 100Hz or fewer strobes. The higher the resolution on more demanding games, the more ULMB fails to sustain the higher frame rates it needs to avoid tearing and judder without turning things down enough to stay ahead of the refresh cap. A lot of people used to cap the monitor at 100Hz and turn things down enough to stay over 100fps (sustained) for this reason. It also dulls the screen. Anyway, I doubt it will work with true HDR color volume either as things go forward (and especially FALD HDR), which is the future of gaming and movies.
"Slideshow"
typically refers to the motion definition aspect. Motion definition provides additional smooth detail in the pathing and animation cycles, and even shows more smooth motion definition of the entire game world moving relative to you when movement keying and mouse looking in 1st/3rd person games.
Mentioning different Hz and features without including the accompanying frame rates each is typically run at doesn't really tell what you are comparing.
60fps (an average, which ranges even lower part of the time) at 60Hz+ is like molasses to me.
100fps at 100hz or better shows 5 new unique frames to every 3 at 60fps-hz.
120fps at 120hz or better doubles the motion definition.
The "slideshow" nickname is because there are few new frames of action being shown at low frame rates. The same frame being shown for a longer time time like a flip book animation with less pages being flipped slower.
At high fps on a high hz monitor, you will get much higher motion definition regardless of whether you have strobing or black frame insertion, crt redraw, etc.
------------------------------------------
The Motion Clarity (blur reduction) aspect is also improved by running at high hz ranges on a high hz monitor, and is nearly pristine using backlight strobing (with some major, in my opinion critical, tradeoffs).
Non-strobe mode, at speed (e.g. mouse-looking the viewport around):
60fps solid ... is a full smearing "outside of the lines" blur -- at variable Hz, or at 60Hz, 100Hz, 120Hz, 144Hz, 240Hz.
120fps solid .. halves that blur (50%) to more of a soft blur inside the masks of objects -- at variable Hz, or at 120Hz, 144Hz, 240Hz.
144fps solid .. drops the blur a bit more, ~60% less blur -- at variable Hz, or at 144Hz, 240Hz.
240fps solid .. drops the blur down to a very slight blur showing most of the texture detail -- at variable Hz, or at 240Hz.
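Those figures track simple persistence scaling, where blur relative to 60fps is 60/fps; a quick sanity check (my arithmetic):

```python
# Sanity check: on sample-and-hold, blur scales with frame persistence,
# so blur relative to a 60fps baseline is simply 60/fps.
for fps in (60, 120, 144, 240):
    print(f"{fps:3d} fps solid: {(1 - 60 / fps) * 100:4.1f}% less blur than 60 fps")
# -> 0%, 50.0%, ~58.3% (the "60%" above), 75.0%
```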

In recent news, this sounds interesting....
http://www.blurbusters.com/combining-blur-reduction-strobing-with-variable-refresh-rate/
I doubt, going forward, that these strobing techs will work by default with HDR gaming monitors and FALD HDR gaming monitors (and at the sustained, not average, frame rates strobing requires at 1440p and 4k), which looks like where things are going eventually.
 
I actually have very little interest in this thing anymore. I expect nothing but disappointment, as has been par for the course for the last 16 years of LCD tech. I gave up and bought the Dell 144Hz TN, and actually am pretty happy with it for the money. It'd be pretty damn hard to justify 2k dollars (or even 800 dollars, really) for just about anything in its place, which is why I bought it instead of the IPS variants. However, IF this thing has no glow, no haloing and blackity black blacks, I may not be able to resist; however, I believe the chances of that are exactly 0.
 
 
Not sure if this has been discussed yet, but it looks like they are making 2 versions of this monitor: the X27, which is on pre-order right now, and the X27T, the only difference being the latter has the eye tracking that we saw in the prototype. I would imagine they would charge ~$100-200 more.

Newegg updated their pages with this video pointing it out.

https://smedia.webcollage.net/rwvfp...d-bcb6-4996-b368-f6e3c2369037.mp4.mp4full.mp4
 
Looks like my X27 should get delivered just in time for my new full-time game "Active Shooter" to be released.

Oh man, it looks like we are all going to have to re-evaluate the purchase of these 144hz 4k gaming panels, as "Active Shooter" just got pulled from Steam. Also, with the removal of Dab, Dance & Twerk, I think I may as well cancel my 1180GTX pre-order :-(

https://twitter.com/i/web/status/1001592220783431680



 
Not sure if this has been discussed yet, but it looks like they are making 2 versions of this monitor: the X27, which is on pre-order right now, and the X27T, the only difference being the latter has the eye tracking that we saw in the prototype. I would imagine they would charge ~$100-200 more.

Newegg updated their pages with this video pointing it out.

https://smedia.webcollage.net/rwvfp...d-bcb6-4996-b368-f6e3c2369037.mp4.mp4full.mp4

I'm still not buying one. I want something bigger in size.
 