ASUS TUF Gaming VG32VQ: World’s 1st display with concurrent motion blur reduction & Adaptive-Sync

Well, the 2019 LG OLEDs did have BFI options that were removed before they shipped. With the G-Sync firmware update coming later this year, it seems LG is aware of PC gamers.

I'm hopeful the 2020 models will have their own version of ELMB, with simultaneous BFI and VRR.

It may seem fast, but LG surprised everyone by having full HDMI 2.1 support in 2019 models.

The thing is, you will potentially be able to have a PS5 with 120Hz quasi-4K and VRR by the end of next year, playing on HDMI 2.1 TVs available now and those due out in 2020. PC monitors and GPUs have roadmapped way behind that, making some of us very frustrated once we realize what is being marketed to us for now: bandwidth-limited DP 1.4 high-resolution gaming monitors (and non-HDMI 2.1 GPUs).


NO HDMI 2.1 GPUs for over 60fps on HDMI 2.1 TVs
--------------------------------------------------------------------------
-Having HDMI 2.1 4K VRR on a giant TV but running at most 60fps to it from a PC is inadequate for higher-Hz gameplay in terms of motion blur and motion definition/smoothness/pathing.


BFI problems/trade-offs are severe in the tech's current state
-----------------------------------------------------------------------------------

-BFI/strobing has to run at a 120Hz flicker rate or higher, or people will get PWM eyestrain/fatigue. I don't know if this can be done without interpolation to raise effective frame rates to match the strobing, which in current generations usually produces artifacts and adds a lot of input lag.

-BFI or strobing also has to start from a much higher brightness; otherwise SDR color/brightness will be much dimmer and HDR color volume is thrown out entirely. This is a big problem for all displays, but especially for OLEDs, which are already dimmer in SDR and ABL-capped in HDR. I've read reports that strobing can cut the display's color brightness/overall brightness by 2/3.

-BFI has limited motion-definition gains unless you are running higher fps (impossible from a non-HDMI 2.1 GPU to an HDMI 2.1 4K 120Hz TV) or using motion interpolation to get a soap-opera or VR time-warp/space-warp effect.

https://forums.blurbusters.com/viewtopic.php?t=5262
motion_blur_from_persistence_on_impulsed-displays.png


This means:
- A non-strobed OLED has identical motion blur to a fast non-strobed TN LCD (a model with excellent overdrive tuning).
- An OLED with a 50%:50% on:off BFI will reduce motion blur by 50% (half the original motion blur)
- An OLED with a 25%:75% on:off BFI will reduce motion blur by 75% (a quarter of the original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would thus be able to do so.
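The persistence math above can be sketched in a few lines of Python. The duty cycles and the blur rule of thumb (blur in pixels ≈ eye-tracking speed in px/s times persistence in seconds) come from the Blur Busters material quoted here; the function names and the 960px/s pursuit speed are my own illustrative choices:

```python
# Rule of thumb from Blur Busters: perceived blur (px) ~= eye-tracking speed (px/s)
# multiplied by how long each frame stays lit, i.e. persistence (s).

def bfi_persistence_ms(refresh_hz: float, on_fraction: float) -> float:
    """Persistence when each frame is lit for only `on_fraction` of the refresh period."""
    return (1000.0 / refresh_hz) * on_fraction

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Length of the motion-blur trail for a given tracking speed and persistence."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, an illustrative UFO-test-style pursuit speed

for label, on in [("sample-and-hold", 1.0), ("50%:50% BFI", 0.5), ("25%:75% BFI", 0.25)]:
    p = bfi_persistence_ms(60, on)  # 16.7 / 8.3 / 4.2 ms at 60Hz
    print(f"{label}: {p:.1f} ms persistence -> {blur_px(speed, p):.1f} px of blur")
```

At 60Hz the 50%:50% and 25%:75% duty cycles land on the 8ms and 4ms persistence steps mentioned above, and the blur trail halves and quarters accordingly.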

Vega from that thread:
I've tested this on my LG C8 OLED versus my 165 Hz 1440p TN. With the OLED set to 1080p 120 Hz, the OLED pixels are so fast (sample and hold in this case) that I can see each individual frame. It doesn't "appear" as smooth as say the TN panel set to 1080p/120 Hz because even a fast TN will "smear" the images together. OLED doesn't blur one frame to the next.

To me, seeing as OLED pixels go dark so fast, if kept sample-and-hold the refresh rate needs to be even higher than an LCD's to get that silky-smooth fast-refresh feeling.

https://www.blurbusters.com/faq/motion-blur-reduction/
Sometimes blur reduction looks very good — with beautiful CRT-style motion clarity, no microstutters, and no noticeable double images.

Sometimes blur reduction looks very bad — with distracting side effects such as double images (strobe crosstalk), poor colors, dimness, heavy microstutter, and flicker.


You need an extremely high frame rate in order to keep a high minimum frame rate unless you are using some kind of interpolation. Running lower strobe rates in order to accommodate lower frame rate ranges means worse PWM flicker.
Strobing looks highest-quality when you have a consistent frame rate that matches the refresh rate. If your framerate slows down, you may see dramatic microstuttering during blur reduction.

In some cases, it is sometimes favourable to slightly lower the refresh rate (e.g. to 85Hz or 100Hz for ULMB) in order to allow blur reduction to look less microstuttery — by more easily exactly matching frame rate to a lower refresh rate — if your GPU is not powerful enough to do consistent 120fps.

Frame rates lower than the strobe rate will produce multiple-image artifacts.

  • 30fps at 60Hz has a double image effect. (Just like back in the old CRT days)
  • 60fps at 120Hz has a double image effect.
  • 30fps at 120Hz has a quadruple image effect.
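The duplicate counts in those bullets fall out of a one-line ratio; here is a quick sketch (the function name is mine, not from any tool):

```python
def image_duplicates(strobe_hz: int, fps: int) -> int:
    """Each unique frame gets flashed strobe_hz/fps times, so the eye sees that many copies."""
    if strobe_hz % fps != 0:
        raise ValueError("non-integer ratios give microstutter rather than clean duplicates")
    return strobe_hz // fps

print(image_duplicates(60, 30))   # 2 -> double image, like 30fps on a 60Hz CRT
print(image_duplicates(120, 60))  # 2 -> double image
print(image_duplicates(120, 30))  # 4 -> quadruple image
```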

Inadequate frame rates at higher strobing Hz:
strobed-display-image-duplicates.png

Even if you had VRR + BFI/strobing to match the frame rate to the strobe rate, riding a roller coaster of varying strobe lengths down into lower strobe rates and back as your frame rate fluctuates would be eye-fatiguing, and anything under 120Hz strobing/blanking is bad in the first place imo.

You'd also be running lower frame rate ranges at 4K resolution, with no hope of keeping the raw frame rate high enough to avoid sinking below 120 matched strobes, unless you were running a very easy-to-render or old game that averages over 150fps.
 
Actual test of FW900:
https://apps.dtic.mil/dtic/tr/fulltext/u2/a415156.pdf

Contrast is about 300:1.

That ancient test with ancient equipment wasn't a standard checkerboard "contrast ratio" test like those done today with a single sensor. The test in the citation is a "Dynamic Range and Screen Reflectance" test. That is not how contrast ratios are obtained today; it's apples to oranges. The test above used a polystyrene box, an angled photometer doing god knows what to the measurement through a reflective glass screen, ancient dB scales based on input command level for the measurements, etc.

From Displaymate:

Dynamic Range
Dynamic Range is simply the ratio of peak white luminance to black-level luminance that a display can produce. The values are measured separately – one screen for peak white and the other for the black-level. This is frequently referred to as “contrast,” “full field contrast,” or “full on/off contrast,” but the term contrast should be reserved for measurements on a single image, not on different screens. The ratio of the peak white to black-level luminance values tells us the maximum range of brightness that the display can produce. So Dynamic Range is especially important in imaging and home theater applications, where, for example, bright/day scenes and dark/night scenes both need to be rendered accurately. The higher the Dynamic Range the better the display will be able to reproduce wide differences in scene brightness. Note that a high Dynamic Range will also yield a dark black-level unless the peak brightness is very high. Here are the ratios calculated from the peak white and black-level values measured above in our testing:

CRT - 17,600:1
Sony PVM-20L5

LCD - 595:1
NEC LCD4000

"The CRT wins by a huge factor. (We’ve measured Dynamic Range values as high as 36,500 for a CRT using a sensitive photometer.) The CRT’s enormous lead in Dynamic Range is another major reason why it remains the technology of choice for home theater perfectionists."

Black-Level Measurements:

Here are the black-levels measured with the Konica Minolta CS-1000 Spectroradiometer:

CRT
Sony PVM-20L5 - 0.01 cd/m2

LCD
NEC LCD4000 -
0.72 cd/m2 Max Backlight
0.27 cd/m2 Min Backlight
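Displaymate's Dynamic Range figure is just peak-white luminance divided by black-level luminance, measured on separate full-field screens. As a sanity check, here is that arithmetic in Python; note the peak-white values (176 and 428.4 cd/m²) are back-derived from the quoted ratios and black levels, not measurements from the article:

```python
def dynamic_range(peak_white_cdm2: float, black_cdm2: float) -> float:
    """Full-field peak-white luminance divided by full-field black-level luminance."""
    return peak_white_cdm2 / black_cdm2

# CRT (Sony PVM-20L5): 0.01 cd/m2 black; 176 cd/m2 white is implied by the 17,600:1 ratio.
print(f"CRT: {dynamic_range(176.0, 0.01):,.0f}:1")
# LCD (NEC LCD4000): 0.72 cd/m2 black at max backlight; 428.4 cd/m2 white implied by 595:1.
print(f"LCD: {dynamic_range(428.4, 0.72):,.0f}:1")
```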

The great black levels of CRTs are also one of the reasons CRT typically looks far superior to LCD. Just put an FW900 next to a TN panel, set them both to the same luminance, and tell me the TN panel has three TIMES the contrast ratio. You will get an eye-opener (pun intended).

EDIT: I think I found the problem with that test that showed a terrible 301:1 CR on the FW900. The input command level was set to 0.1 foot-lambert, which is 0.34 candela per square meter, artificially raising the black-depth reading significantly. Not sure why they would do this. It's almost like an artificial baseline to meet the VESA Flat Panel Display Measurements Standard, Version 1.0, May 15, 1998 reference standards, without realizing it would hobble displays that have great black depth. By that test's method, an OLED would also get a terrible measurement.
 
That ancient test with ancient equipment wasn't a standard checkerboard "contrast ratio" test like those done today with a single sensor. The test in the citation is a "Dynamic Range and Screen Reflectance" test. That is not how contrast ratios are obtained today; it's apples to oranges. The test above used a polystyrene box, an angled photometer doing god knows what to the measurement through a reflective glass screen, ancient dB scales based on input command level for the measurements, etc.

Neither is what you are quoting. Here is Displaymate's checkerboard contrast result, apparently from the same page you are quoting, but conveniently you left that part out. ;)

http://www.displaymate.com/ShootOut_Part_1.htm
"A standard way to measure Display Contrast is to use a black and white checkerboard test pattern and measure the luminance at the center of the white blocks and then the black blocks. The smaller the blocks the greater the bleed, resulting in lower contrast values. We’ve done this for a 4x4 checkerboard, which is a standard pattern, and then for a much finer 9x9 checkerboard to see how much more the contrast falls when the blocks are reduced by an additional factor of 5 in area."

CRT Sony PVM-20L5

4x4 Checkerboard Contrast: 219
9x9 Checkerboard Contrast: 75

LCD NEC LCD4000

4x4 Checkerboard Contrast: 586
9x9 Checkerboard Contrast: 577


It's worse than the 300:1 I originally quoted, and that's with a generously large 4x4 checkerboard. With a 9x9 checkerboard it falls to an absolutely laughable 75:1 contrast.

I lived for decades with CRTs. We had to use them in the dark to make them decently usable, and as the above reveals, their real-world black levels were garbage once anything else on the screen lit it up. Or heck, if any room lighting was on.

Sure you could get a dark full black screen, but soon LCD manufacturers learned that trick as well, and could shut off the back-light on a black screen.

As you said the Standard checkerboard "Contrast" ratio is the revealing one, and CRTs have very low contrast when needing to display it on one screen at the same time.

There is really too much nostalgia glorifying the CRT these days. They were not nearly as good as some portray them.
 
The FW900, a graphics-professional flat-screen Trinitron CRT, was 16:10 and 22.5" diagonal, usually run at 1920x1200 (~100.5 ppi) and 85Hz.

They were crisp and well saturated, and their greatest feature was their 1ms persistence: 1 pixel of motion blur, which is essentially "zero" blur from screen redraw.

That isn't just a single pixel cell shaded looking ufo crossing the screen, it's the entire game world of high detail textures, depth via bump mapping, geography and architecture, skies, etc. moving relative to you in the viewport while mouse looking and movement keying or controller panning in 1st/3rd person games. Everything perfectly crisp and readable and detailed no matter how you moved.
okw997S.png


KlIRG0B.png


* The images below do not show how dim and muted lightboost would be since they also represent other screen tech.

L7F09ME.png


The only way LCD and OLED can do that is with strobing/screen blanking/BFI which has major issues that I outlined in my previous post, or someday using interpolation to get to 1000fps on some future 1000hz display.


The CRTs were huge, very heavy, and could run hot, and their settings would get "loose," requiring some expertise in getting under the hood and tweaking things, which could be a big pain. They also had to warm up for a while after you turned them on. Ultimately it was their age and the fact that they would eventually fade and/or bloom, even pop the screen periodically, and otherwise be unsuitable in the long run. That, along with their screen size compared to modern LCDs and the fact that they obviously aren't made anymore, made people like me ditch them, though I did run through a few for several years of gaming alongside a 60Hz LCD for desktop/apps until 120Hz gaming monitors came out.
 
Neither is what you are quoting. Here is Displaymate checkerboard contrast result. Apparently from the same page you are quoting, but conveniently you left that part out. ;)

http://www.displaymate.com/ShootOut_Part_1.htm
"A standard way to measure Display Contrast is to use a black and white checkerboard test pattern and measure the luminance at the center of the white blocks and then the black blocks. The smaller the blocks the greater the bleed, resulting in lower contrast values. We’ve done this for a 4x4 checkerboard, which is a standard pattern, and then for a much finer 9x9 checkerboard to see how much more the contrast falls when the blocks are reduced by an additional factor of 5 in area."

CRT Sony PVM-20L5

4x4 Checkerboard Contrast: 219
9x9 Checkerboard Contrast: 75

LCD NEC LCD4000

4x4 Checkerboard Contrast: 586
9x9 Checkerboard Contrast: 577


It's worse than 300:1, I originally quoted, and that's with a generously large 4x4 checkerboard. With a 9x9 Checkerboard it falls to an absolutely laughable 75:1 contrast.

I lived for decades with CRTs. We had to use them in the dark to make them decently usable, and as the above reveals, the real world black levels were garbage, once anything else was on the screen lighting it up. Or heck, if any room lighting was on.

Sure you could get a dark full black screen, but soon LCD manufacturers learned that trick as well, and could shut off the back-light on a black screen.

As you said the Standard checkerboard "Contrast" ratio is the revealing one, and CRTs have very low contrast when needing to display it on one screen at the same time.

There is really too much nostalgia glorifying the CRT these days. They were not nearly as good as some portray them.

You are talking about an "ANSI" contrast ratio, the checkerboard pattern, which the original test didn't even use. It used an "on/off" contrast ratio, which is totally different. So how could one CRT show a 309:1 contrast ratio using the "on/off" method and another CRT show 17,600:1 using the same "on/off" method? That's why you have to compare apples to apples. And you are forgetting one very big piece of the puzzle: dynamic range is just as important to image quality as contrast ratio.

LCDs typically have good "static contrast ratios" but terrible dynamic range. CRTs have great dynamic range yet bad static contrast ratios. And LCDs get a huge boost in their static contrast ratio numbers because they can get a lot brighter. But in a dark room, a good CRT is going to look way better than your average LCD due to its black depth and dynamic range. On top of that, there is no standard for comparing contrast ratios across different equipment and varying technologies over decades.

LCD left, FW900 right:





Due to CRTs' great black depth and dynamic range, in a dim room their picture quality destroys mainstream LCDs. I know, I've had them side by side. But of course in a bright room, watching TV with huge windows shining light in or working on an Excel spreadsheet, LCD will dominate.
 
LCD's typically have good "static contrast ratios" but terrible dynamic range. CRT's have great dynamic range yet bad static contrast ratios. And LCD's get a huge boost in their static contrast ratio numbers due to them being able to get a lot brighter. But in a dark room, a good CRT is going to look way better than your average LCD due to the black depth and dynamic range. Then on top of it there is no standard for comparing contrast ratios with different equipment, for varying technologies over decades.

Dynamic range is meaningless if it only applies to an all-black screen. LCDs have had "dynamic" backlights to play that game for years.

I have my current LCD set up to vary the backlight with its light sensor, so when I dim the lights at night it automatically lowers the backlight and never looks remotely like your LCD image. My blacks still look black.
 
As nice as high resolution CRTs are, I still find the way they render pixels to be less pleasing. They just look fuzzy and indistinct compared to LCD/OLED pixels. This is a "benefit" when you're viewing low resolution pixel art, but the rest of the time it's bad, to me at least.

Even though CRTs have excellent motion resolution, I'd still take a 60Hz OLED over any previously made CRT for literally any usage except maybe first-person shooter play.
 
That's dynamic (moving) contrast ratio you mention, by turning off part of the back-light, not dynamic range.
 
That's dynamic (moving) contrast ratio you mention, by turning off part of the back-light, not dynamic range.

I am not talking about local dimming (that would be turning off part of the backlight). I am talking about dynamic backlights, turning down the backlight in response to content, so LCD can do the same dark black screen trick as CRTs, even on cheap LCDs:
https://www.amazon.com/HP-27w-Monit...r_1_1?keywords=HP+-+27w&qid=1568751627&sr=8-1
"10,000,000:1 dynamic contrast ratio "

If you want to play the full on-off contrast ratio game, LCDs have had that in the bag for a LONG time. It's meaningless.

Checkerboard contrast is what is used everywhere to measure contrast. Nice how you brought it up first, but then tried to ignore it when it turned out even worse than I first mentioned.
 
There is no "dark screen trick"; that is literally how a CRT's scanning works to achieve great black levels and dynamic range. That's like calling per-pixel OLED light control a "trick" too. LCD dynamic contrast ratio has been and always will be nonsense, and lumping CRT dynamic range into that category is nonsensical.

I want to play the "full on/off" contrast ratio game? You literally linked a reference test that ONLY had a full on/off contrast ratio test and no ANSI static contrast test. And no, LCDs haven't had on/off contrast ratios "in the bag," since turning on "dynamic contrast" settings on LCDs makes image quality worse; those settings never get enabled by anyone who knows what they are doing. Like the maximum overdrive settings no one uses, they are only there for the marketing material. Talking about image quality by the single metric of static contrast ratio, as if that is the only thing that matters, is ridiculous. Black depth and dynamic range have tons to do with it. They are the big reasons OLED picture quality is astounding, not just static contrast ratio.

Hell, you could have an LCD with a black depth of 1.0 cd/m² that goes to 5000 nits and has a great static contrast ratio, and it would still look like garbage.
 
There are some serious rose colored nostalgia glasses when it comes to CRTs. I sure don't miss 100 nits max brightness, 300:1 contrast ratios, poor geometry, blurry analog connections, inherently blurry displays...

Not really. Up until this year I was still using one as my main display. I'm a developer by trade, so I use LCDs all the time, and I have a Samsung gaming VA that ultimately replaced it. My reason for replacing my CRT was downsizing my living space. Believe me, if I could have kept my CRT monitors I still would. Yes, CRTs had poor ANSI contrast (that's where you get the checkerboard values :) ), but they still had much better full on/off contrast and near-perfect motion clarity.

I don't hate LCD monitors, but I do prefer CRT. If OLEDs became more prevalent in the PC monitor world, I'd definitely jump on one, providing it has rolling scan like the Sony BVM's.

But hey, if you don't like CRT then that's fine. You do you. But don't dismiss people who like a tech that you don't like as being blinded by nostalgia.
 
Checkerboard Contrast is what is used everywhere to measure contrast. Nice how you brought this up first, but try to ignore it when it's even worse than I first mentioned.

To clarify, since you're arguing the point: checkerboard patterns are used only to measure ANSI (intra-field) contrast. To measure a display's contrast you ALSO need to do full-on/full-off. CRTs don't have good ANSI contrast because of their inner-glass reflections, but they have amazing full-on/full-off contrast.

EDIT: Full on/off contrast, by the way, is FAR from meaningless. I don't know where in the world you got that from. It's literally the thing that separates a top-end projector like a JVC D-ILA from an Epson, for example. Both have similar ANSI contrasts, but the JVC has far deeper blacks and higher native contrast, and ultimately a much more profound image (you pay for it though). :)
 
The search for an upgrade to my XG248Q continues....next contender is the HP Omen X27 1440p 240Hz
 

To clarify, since you're arguing the point, checkerboard contrast is used only to measure ANSI (intrafield) contrast. To measure a display's contrast you ALSO need to do full-on/full-off. CRT's don't have good ANSI because of their inner-glass reflections. But they have amazing full-on/full-off contrast.

EDIT: Full on/off contrast, by the way, is FAR from meaningless. I don't know where in the world you got that from. It's literally the thing that separates a top-end projector like a JVC D-ILA from an Epson, for example. Both have similar ANSI contrasts, but the JVC has far deeper blacks and higher native contrast, and ultimately a much more profound image (you pay for it though). :)

Full on/off contrast is meaningless because all you have to do to score infinity on that test is turn off the light source. That might be difficult with big light bulbs and CCFL tubes, but it is trivial once you move to LED lighting.
 
Full on/off contrast is meaningless because all you have to do, to score infinity on that test, is turn off the light source. That might be difficult with Big light bulbs and CCFL tubes, but it is trivial when you move to LED lighting.

Context is everything though. Yes, a projector with a dynamic iris can pump out good on/off contrast; that's cheating, and the number means very little. But in the case of, say, a closed-iris system (static, not dynamic), it means a lot. A panel's native on/off also means a lot, because the lower the native contrast, the wider the range the dynamic iris (or FALD backlight, in the case of LCD monitors) has to cover. You'll notice the transitions more in contrast (hur hur) to a panel with high native contrast.

Take a Sony SXRD - HW65ES vs JVC (anything in the last 6 years). They have very similar on/off contrasts, but the Sony requires a dynamic iris to achieve it. The JVC projectors have better contrast without relying on dynamic irises. So yes, you're partially right that on/off doesn't mean anything, but in the correct context it means a lot.
 
Context is everything though. Yes, a projector with a dynamic iris that can pump out good on/off contrast - it's cheating and the number means very little. But in the case of say, a closed iris system (that's not dynamic but static), it means a lot. A panel's native on/off also means a lot, because the lower the native contrast, the wider the dynamic iris range (or backlight FALD in the case of LCD monitors) have to be. You'll notice the transitions more in contrast (hur hur) to a panel with high native contrast.

Take a Sony SXRD - HW65ES vs JVC (anything in the last 6 years). They have very similar on/off contrasts, but the Sony requires a dynamic iris to achieve it. The JVC projectors have better contrast without relying on dynamic irises. So yes, you're partially right that on/off doesn't mean anything, but in the correct context it means a lot.

Basically you are arguing in favor of high native/static/ANSI contrast, which is what I have been saying all along. The JVC is better because it has better ANSI/static/native contrast. You can't fake that.
 
Basically you are arguing in favor of high native/static/ANSI contrast, which is what I have been saying all along. The JVC is better because it has better ANSI/static/native contrast. You can't fake that.

You have not been saying that all along. I think you're confusing ANSI contrast with native contrast. They're not the same thing. Again, ANSI contrast refers to intra-field contrast, hence the checkerboard pattern: what's the contrast ratio between a full-white square and the full-black square next to it? That's ANSI. Native contrast refers to full on/off. To test native contrast you display a full-white pattern and take a luminance reading, then display a full-black pattern and take another. Divide the white luminance by the black luminance and you have your native/static contrast ratio.

CRT has good native/static contrast ratio but mediocre ANSI contrast. I think what's confusing this conversation is Displaymate's use of the term "dynamic range". From what I can tell by their measurements and how they get it, they're just talking static/native contrast and nothing more.

Here's an example of how a good reviewer measures contrast: https://www.soundandvision.com/content/benq-ht9060-dlp-projector-review-test-bench

The reviewer starts with full-on/full-off and gives numbers using the projector's "native" (no light-source dimming) and "dynamic" (light-source dimming) and moves up to 20% ADL (Average display luminance). If I remember correctly, 20% ADL means you have a white square consuming 20% of the visible space on the screen. You take the reading of that square, and then take the reading of the black that surrounds the white square and then calculate the contrast ratio. This is similar to ANSI contrast, but better in my opinion, because you can see how the projector behaves in different mixed-content situations.

EDIT: Also notice how, even though the projector has good ANSI contrast, the full on/off (especially without the assistance of lightsource-dimming is pretty bad). This display would only be good for things that have reasonably bright content on it. Dark content would look bad with such a small contrast ratio, especially in a darkened theater room, where the projector is intended to sit.

Anyways, I'm done arguing. TL;DR: native contrast and ANSI contrast aren't the same thing; you're confusing the two. A great display rocks at both; CRT and LCD each really excel at only one. OLED all the way. :)
 
You have not been saying that all along. I think you're confusing ANSI contrast for Native Contrast.

I have most often seen "native contrast" used by the big monitor reviewers to refer to ANSI/checkerboard contrast. I haven't seen even one of them bother with full-on, full-off contrast testing.

Rtings:
https://www.rtings.com/monitor/tests/picture-quality/contrast-ratio
To measure the native contrast ratio, we use a black and white checkerboard pattern to determine the black and white luminance, as described above. We calibrate the monitor to have a white luminance as close as possible to 100 cd/m², and then we take five luminance measurements using an LS-100 luminance meter.

The white luminance, which should be close to 100 cd/m², is measured in the center white square. For the black luminance, we measure the luminance of each of the 4 black squares surrounding the white center square. This is to reduce the impact of black uniformity on our contrast measurement. The final black luminance number is the average of these 4 squares.

Once we have both of these values, the contrast ratio is simply the white luminance divided by the black luminance.
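The RTings procedure quoted above reduces to a few lines of arithmetic. This sketch uses made-up luminance readings purely to illustrate the averaging step; the function name is mine:

```python
def rtings_contrast(white_cdm2, black_readings_cdm2):
    """White luminance from the centre square divided by the average of the four
    surrounding black squares (averaging reduces the impact of black uniformity)."""
    avg_black = sum(black_readings_cdm2) / len(black_readings_cdm2)
    return white_cdm2 / avg_black

# Hypothetical VA-panel readings, white calibrated near 100 cd/m2:
ratio = rtings_contrast(100.2, [0.033, 0.035, 0.031, 0.034])
print(f"{ratio:.0f}:1")  # a ~3000:1 VA-class result
```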

TFTCentral (apparently don't use a checkerboard but keep something on screen)
https://www.tftcentral.co.uk/reviews/asus_rog_swift_pg35vq.htm#brightness
At each brightness level we also measure the contrast ratio produced by the screen when comparing a small white sample vs. a black sample (not unrealistic full screen white vs. full screen black tests). The contrast ratio should remain stable across the adjustment range so we also check that.
....

Remember this is without the FALD active, which can allow for MUCH higher contrast ratios in both SDR and HDR content. What we are comparing here is the native contrast ratio of the panel really.


Native contrast WRT monitors means the contrast capable of being displayed on the screen simultaneously, without interference from the backlight.
 
Native contrast I have most seen used to refer to ANSI/Checkerboard by most of the big monitor Reviewers. I haven't even seen one of them bother to do a full on, full off testing of contrast.

Rtings:
https://www.rtings.com/monitor/tests/picture-quality/contrast-ratio


TFTCentral (apparently don't use a checkerboard but keep something on screen)
https://www.tftcentral.co.uk/reviews/asus_rog_swift_pg35vq.htm#brightness



Native contrast, WRT monitors, means the contrast capable of being displayed on the screen simultaneously, without interference from the backlight.

Interesting. I didn't know reviewers were doing this these days. In my opinion, full on/off numbers should be reported along with ANSI numbers. For LCDs the two shouldn't really differ unless something interesting is going on. I'm also curious why the Rtings test averages the four black squares to eliminate black-uniformity issues while measuring only a single center white square (theoretically, white uniformity could be off too). Anyways, thanks for sharing.
 
Does anyone with one of these ELMB monitors use an AMD card? Before I publish my review of the VG27AQ and look like an idiot for getting it wrong, does ELMB work properly on AMD cards? TFTCentral mentioned getting it working on AMD but not on Nvidia, and that's the situation I'm in. I don't currently have an AMD card to test this with.



No one has seemed super impressed with ELMB so far, and I'm wondering if it's just broken on Nvidia cards.
 
Crosstalk is the problem. The implementation on the monitor is what's "broken". It doesn't matter which card you use.
 
So guys if I ignore the touted feature of this monitor then how is it? The thing is I am looking to buy a 32" 144hz 1440p VA monitor with FreeSync and this is by far the cheapest among the products we have in my country. I really don't care about ELMB as I don't even know what it is lol. I just want to use FreeSync.
 
So guys if I ignore the touted feature of this monitor then how is it? The thing is I am looking to buy a 32" 144hz 1440p VA monitor with FreeSync and this is by far the cheapest among the products we have in my country. I really don't care about ELMB as I don't even know what it is lol. I just want to use FreeSync.

You can read my impressions of it earlier. Basically, it's a bust. You cannot adjust the overshoot while using the strobing mode, so if you get a monitor with a particularly aggressive setting (like I did), you're SOL. I returned my screen and got an AOC 1440p VA monitor. It's far better and cheaper. My AOC is a 27 incher.
 
You can read my impressions of it earlier. Basically, it's a bust. You cannot adjust the overshoot while using the strobing mode, so if you get a monitor with a particularly aggressive setting (like I did), you're SOL. I returned my screen and got an AOC 1440p VA monitor. It's far better and cheaper. My AOC is a 27 incher.
What is strobing mode and what happens if I don't use it? AOC monitors aren't available in my country. ASUS is really the only option as far as VA monitors are concerned.
 
Does anyone with one of these ELMB monitors use an AMD card? Before I publish my review of the VG27AQ and look like an idiot for getting it wrong, does ELMB work properly on AMD cards? TFTCentral mentioned getting it working on AMD but not on Nvidia, and that's the situation I'm in. I don't currently have an AMD card to test this with.



No one has seemed super impressed with ELMB so far, and I'm wondering if it's just broken on Nvidia cards.


Do you mean while using GSync? It worked just fine on my Nvidia card (without using GSync).
 
What is strobing mode and what happens if I don't use it? AOC monitors aren't available in my country. ASUS is really the only option as far as VA monitors are concerned.

It's what I'm calling their "ELMB" mode. Basically, the screen strobes the backlight on and off for every frame displayed. This way your eye doesn't see the pixel transitions and your perceived motion clarity shoots through the roof.
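The clarity gain can be ballparked with a simple persistence model: perceived blur is roughly eye-tracking speed multiplied by how long each frame stays lit. The speed and pulse width below are illustrative assumptions, not measured values for this monitor.

```python
# Rough sketch of why strobing improves motion clarity: a sample-and-hold
# frame stays lit for the whole refresh interval, while a strobed backlight
# only lights it for a brief pulse. Numbers are illustrative assumptions.

def motion_blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """Approximate perceived blur trail length in pixels."""
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 960.0                      # px/s, a common UFO-test panning speed
sample_and_hold = 1000.0 / 144.0   # full-frame persistence at 144 Hz (~6.9 ms)
strobed = 1.0                      # assumed 1 ms backlight pulse per frame

print(motion_blur_px(speed, sample_and_hold))  # ~6.7 px trail
print(motion_blur_px(speed, strobed))          # ~1 px trail
```

Same refresh rate, roughly 7x less smear, which is why well-tuned strobing looks so much clearer.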
 
Can you set this monitor to 120 Hz? I would be using it on the Xbox One X (capped at 120 Hz), but there have been reports that the monitor has to be set to exactly 120 Hz to work.

If anyone has used this on the Xbox, or can recommend another monitor for it, it would help a lot.
 
Neither is what you are quoting. Here is DisplayMate's checkerboard contrast result, apparently from the same page you are quoting, but conveniently you left that part out. ;)

http://www.displaymate.com/ShootOut_Part_1.htm
"A standard way to measure Display Contrast is to use a black and white checkerboard test pattern and measure the luminance at the center of the white blocks and then the black blocks. The smaller the blocks the greater the bleed, resulting in lower contrast values. We’ve done this for a 4x4 checkerboard, which is a standard pattern, and then for a much finer 9x9 checkerboard to see how much more the contrast falls when the blocks are reduced by an additional factor of 5 in area."

CRT Sony PVM-20L5

4x4 Checkerboard Contrast: 219
9x9 Checkerboard Contrast: 75

LCD NEC LCD4000

4x4 Checkerboard Contrast: 586
9x9 Checkerboard Contrast: 577


It's worse than the 300:1 I originally quoted, and that's with a generously large 4x4 checkerboard. With a 9x9 checkerboard it falls to an absolutely laughable 75:1 contrast.

I lived with CRTs for decades. We had to use them in the dark to make them decently usable, and as the above reveals, real-world black levels were garbage once anything else on the screen lit it up. Or heck, if any room lighting was on.

Sure, you could get a dark full-black screen, but soon LCD manufacturers learned that trick as well and could shut off the backlight on a black screen.

As you said, the standard checkerboard contrast ratio is the revealing one, and CRTs have very low contrast when they need to display it all on one screen at the same time.

There is really too much nostalgia glorifying the CRT these days. They were not nearly as good as some portray them.

I think people who have fond memories of CRTs are just wearing rose-colored glasses. I had Iiyama and Sony CRTs and I'd never pick those over my 1 ms TN 144 Hz panel. Ideally I want a MicroLED display with a high refresh rate, but that's far off.
 
It's what I'm calling their "ELMB" mode.

Call it 'ELMB-Sync'.

The basic problem that this monitor is mostly failing to solve is the application of strobing to variable refresh rates. Strobing works great for increasing motion clarity with fixed refresh rates, but if refresh rates vary cycle to cycle with video card framerates, then the intensity and length of each strobe must vary to compensate, and as seen here, that problem hasn't quite been solved.
 
Call it 'ELMB-Sync'.

The basic problem that this monitor is mostly failing to solve is the application of strobing to variable refresh rates. Strobing works great for increasing motion clarity with fixed refresh rates, but if refresh rates vary cycle to cycle with video card framerates, then the intensity and length of each strobe must vary to compensate, and as seen here, that problem hasn't quite been solved.

From what I've seen this monitor does fine adjusting the intensity and length of the strobe. There isn't a noticeable brightness fluctuation.

It's failing because the timing of the strobe and the pixel transitions is off, causing bad crosstalk. And it has this problem even when using strobing without G-Sync.

Bad crosstalk makes strobing completely useless.
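A toy model of why that crosstalk appears, assuming the pixel settles exponentially toward its new value: if the backlight pulse fires before the transition completes, the strobe freezes a partially updated frame into a visible ghost. The time constant is an assumption, not a measured VG32VQ value.

```python
# Toy crosstalk model: fraction of the OLD frame still visible when the
# strobe fires, modeling the LCD pixel as a simple exponential settling.
# The 4 ms time constant is an assumed value for a slow VA transition.

import math

def transition_remaining(elapsed_ms: float, response_tau_ms: float) -> float:
    """Fraction of the previous frame still showing after elapsed_ms."""
    return math.exp(-elapsed_ms / response_tau_ms)

tau = 4.0  # assumed pixel response time constant, ms
for strobe_delay in (2.0, 6.0, 12.0):  # ms between refresh and backlight pulse
    ghost = transition_remaining(strobe_delay, tau)
    print(f"strobe at {strobe_delay} ms: {ghost:.0%} ghost image")
```

Firing the pulse later reduces the ghost but eats into the frame time, which is part of why tuning this per refresh rate is hard, and why a fixed mistimed strobe looks so bad.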
 
I think people who have fond memories of CRTs are just wearing rose-colored glasses. I had Iiyama and Sony CRTs and I'd never pick those over my 1 ms TN 144 Hz panel. Ideally I want a MicroLED display with a high refresh rate, but that's far off.

Nope. Not even close. There are people who still use CRT, who have used high-refresh LCD and still prefer CRT. I am one of those people. There are still areas in image quality in which CRT reigns supreme. I don't understand this dismissiveness. Just look at the FW-900 thread that's still alive after all these years. Believe me, if there was a truly superior-in-every-way monitor, we'd have abandoned our tubes long ago.

That said, I'd love to have a Micro-LED monitor.
 
Nope. Not even close. There are people who still use CRT, who have used high-refresh LCD and still prefer CRT. I am one of those people. There are still areas in image quality in which CRT reigns supreme. I don't understand this dismissiveness. Just look at the FW-900 thread that's still alive after all these years. Believe me, if there was a truly superior-in-every-way monitor, we'd have abandoned our tubes long ago.

That said, I'd love to have a Micro-LED monitor.

Yeah, and how many people are hanging on to those dinosaurs? Not even 0.1% of gamers, I bet. Shit's dead, let it rest in the past where it belongs.
 
Yeah, and how many people are hanging on to those dinosaurs? Not even 0.1% of gamers, I bet. Shit's dead, let it rest in the past where it belongs.

Does it matter? Dude - we're all enthusiasts here. Seriously. None of us CRT fans (well the sane ones at least) are shitting all over LCD users.

Honestly, if you like what you like then I'm happy for you. Isn't this ultimately what it's all about?
 