Asus VG248QE 144hz 3D Vision 2: The Official Thread

What's the difference between the BenQ XL2420T and the BenQ XL2411T ?

(and for that matter the Asus VG248QE)
 
Which has less ghosting/crosstalk with LightBoost?

I didn't take notes about the cross-talk in games when I had the BenQ (my fail), but it is just as cross-talk free in the 3D movies I tested (Tron: Legacy [except one scene where I saw a tiny bit, though I can't remember how the BenQ handled it], Priest, Fright Night and Tintin).

IMO the VG23AH > VG236H > S23A700D > Nvidia 3D Vision 2 displays for 3D movies

There is lots of cross-talk when playing L4D2, but it's a cross-talky mess on all the 3D displays I've used except the VG23AH. I can remember things about the VG23AH's 3D performance, but I doubt anyone cares, and I already installed my 7950.
 
I'm only interested in LightBoost 2D gaming personally, but having some color vibrance would be nice, like the Samsung A750D and 950s have (though they have some input lag unfortunately, and can't compare to LightBoost 2D gaming's complete blur elimination). Of course the Samsungs are glossy, which helps how the color vibrance, black depth, and clarity appear - which is why I have been asking a lot about what the Asus VG248QE ends up like after the AG coating is removed.
 
IMO the VG23AH > VG236H > S23A700D > Nvidia 3D Vision 2 displays for 3D movies
Yes, the LightBoost displays have far less crosstalk.
And amongst LightBoost displays:

2ms panels have more crosstalk than 1ms panels (2ms > 1ms in terms of crosstalk visibility)

Also, reducing Contrast massively reduces crosstalk (making it as much as 90% fainter) on some models of LightBoost displays. I have found that the ASUS VG278H has far less 3D crosstalk with Contrast=60 than with Contrast=90 -- almost an order of magnitude less crosstalk for some GtG transitions, almost approaching the 1ms panels. It also reduces the LightBoost 2D "faint trailing razor-sharp ghost" effect on the VG278H/VG278HE, which was already almost nonexistent on my XL2411T.
 
The 2ms panel is greater than 1ms in what aspect? Crosstalk?
 
The 2ms panel is greater than 1ms in what aspect? Crosstalk?
No, it's worse.

To clarify, we were talking in the context of LightBoost crosstalk:
"2ms > 1ms" means
"2ms crosstalk > 1ms crosstalk" means
"2ms panels have more visible 3D LightBoost crosstalk than 1ms panels"

This is consistent with crosstalk visibility comparison:
VG23AH > VG236H > S23A700D > Nvidia 3D Vision 2 displays for 3D movies
Which goes from worse-to-best in left-to-right order.

Which is expanded to:
VG23AH > VG236H > S23A700D > Nvidia 3D Vision 2 (2ms panels) > Nvidia 3D Vision 2 (1ms panels)
 
Ah, I was looking at it in a different context lol. I was thinking of > as meaning "better", versus > as meaning "more cross-talk/worse". ;)
 
Vega is correct. None of the 3D monitors have any real problems with cross-talk in movies. I was referring to the 3D image quality (colors and blacks), and the VG23AH is naturally the best since it is IPS and 72 Hz, while the XL2420T (grainy screen-door effect) and VG248QE (mediocre greyish blacks) are the worst. Mark has the order correct for 3D cross-talk, except that some games in passive 3D (L4D2 & Mass Effect) are leagues better in terms of cross-talk.
 
NCX, were you able to correct for the weak blacks in 3D on the VG248QE through calibration, or was it still far short of the xl2420t's blacks in 3D?
 
The black level remains at 0.29 cd/m2; only the brightness and gamma change. The contrast ratio ranges from 700:1 at max contrast (but any setting over 65 causes bleaching and loss of white/light color detail while also further washing out the colors) down to less than 400:1 (at a contrast setting of 45 or lower).

The optimal setting on mine was a contrast setting of 65, which provided a 500:1 contrast ratio and 2.10 average gamma. All of the other 3D monitors I have used can do >900:1 in 3D mode. The ICC profile only helps with the Delta E, gamma and color temperature. The contrast remains the same; fortunately it isn't lowered even further after calibration.
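To put those numbers in context: contrast ratio is just white luminance divided by black luminance, so with the black level pinned at 0.29 cd/m2 the quoted ratios imply the white levels below. A minimal sketch of that arithmetic (the ratios and black level come from the post above; everything else is my own illustration):

# Sketch: contrast ratio = white luminance / black luminance.
# The 0.29 cd/m^2 black level and the 400:1/500:1/700:1/900:1 figures are
# from the post above; the implied white levels are simple arithmetic.

BLACK_LEVEL = 0.29  # cd/m^2, reported as fixed in the 3D/LightBoost mode

def white_needed(contrast_ratio, black=BLACK_LEVEL):
    """White luminance required to hit a given contrast ratio."""
    return contrast_ratio * black

def contrast_ratio(white, black=BLACK_LEVEL):
    """Contrast ratio from measured white and black luminance."""
    return white / black

for ratio in (400, 500, 700, 900):
    print(f"{ratio}:1 implies roughly {white_needed(ratio):.0f} cd/m^2 of white")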
 
Asus has really good quality control with these monitors. I just de-matted my 9th customer's monitor, and they have all had zero dead/stuck pixels.
 
Asus has really good quality control with these monitors. I just de-matted my 9th customer's monitor, and they have all had zero dead/stuck pixels.
I have heard that cheap ASUS products can have poor quality control, but ASUS didn't skimp on the VG248QE. It costs twice as much as a common 60 Hz monitor, so there's more incentive to keep quality high.

$300 is still pretty cheap for a 120 Hz LCD -- 120 Hz displays used to cost a lot more than that.
 
I haven't seen any dead/stuck pixels on mine and I'm pretty sure it has no back-light bleeding but I haven't done a dark room test yet.
 
I have heard that cheap ASUS products can have poor quality control, but ASUS didn't skimp on the VG248QE. It costs twice as much as a common 60 Hz monitor, so there's more incentive to keep quality high.

$300 is still pretty cheap for a 120 Hz LCD -- 120 Hz displays used to cost a lot more than that.

You can get this display as cheap as $265 shipped new. That is an absolute steal for what this monitor is and can do. :eek: Best deal in displays out there IMO.
 
Which has less ghosting/crosstalk with LightBoost?
I own both the BENQ XL2411T and ASUS VG278H, so I can comment:

If you adjust your Contrast downwards to about 65, the ghosting/crosstalk on the VG278H goes way down. It's apparently very sensitive to the Contrast setting since the Contrast setting also seems to affect the LightBoost overdrive artifact.

On my BENQ XL2411T, the LightBoost has virtually no ghosting, no coronas, and no motion blur (especially LB=10%) -- it's amazingly clear. The VG278H has the trailing sharp ghost effect but even that diminishes significantly if I adjust the contrast downwards to 65.
 
NCX, the link to your ICC profile isn't working. Please fix, or my gang steals the rest of the maple syrup.
 
Correct, 3D vision is polarized light.
Minor correction: the nVidia version of 3D Vision does not use polarized light; no LightBoost monitor utilizes polarized light as part of its 3D function. (You are referring to passive 3D.) Active 3D Vision doesn't use polarized light.
 
Minor correction: the nVidia version of 3D Vision does not use polarized light; no LightBoost monitor utilizes polarized light as part of its 3D function. (You are referring to passive 3D.) Active 3D Vision doesn't use polarized light.

If you rotate the 3D glasses 90 degrees relative to the screen, such as in portrait mode, the image goes away. That is why NVIDIA disables 3D in portrait mode. If that isn't a polarization problem, I don't know what it would be. ;) (But then I haven't tested in years, as I don't care for 3D.)
 
If you rotate the 3D glasses 90 degrees relative to the screen, such as in portrait mode, the image goes away. That is why NVIDIA disables 3D in portrait mode. If that isn't a polarization problem, I don't know what it would be. ;) (But then I haven't tested in years, as I don't care for 3D.)
That's the LCD polarizer, and it has nothing to do with the 3D function. It's just a side effect of how LCDs operate: the panel adjusts light polarization to open/close pixels. That's how LCDs have operated since the 1970s. Even yesterday's LCD wristwatches had polarizers.

Portrait mode 3D would end up working if you modified the 3D glasses and rotated the LCD lenses 90 degrees (which requires hacking apart the glasses), to make the LCD polarizer of the shutter glasses compatible with the LCD polarizer of the monitor. As you can see, the polarized light in this case has no role in producing the 3D effect (it's unrelated to the polarized version of 3D that does exist). It is a side effect of an LCD being used in the shutter glasses and an LCD being used in the monitor.

Citations of LCD polarizers, unrelated to 3D:
- Google: LCD polarizer
- Wikipedia: Liquid Crystal Display
- Animation of polarization operating to turn on/off LCD pixels on TomsHardware
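For anyone curious why the picture goes black when the glasses are rotated 90 degrees: the light leaving any LCD monitor is linearly polarized, and the LCD shutters in the glasses have their own polarizer, so the transmitted fraction follows Malus's law. A minimal sketch (my own illustration, ignoring the real glasses' extra layers and losses):

# Malus's law: transmitted fraction = cos^2(theta), where theta is the angle
# between the monitor's output polarization and the polarizer in the glasses.
# Idealized numbers only; real shutter glasses have additional losses.
import math

def transmitted_fraction(theta_degrees):
    return math.cos(math.radians(theta_degrees)) ** 2

for angle in (0, 45, 90):
    print(f"{angle:2d} degree offset -> {transmitted_fraction(angle):.2f} of the light passes")
# 0 degrees (aligned, landscape): ~1.00
# 90 degrees (crossed, portrait): ~0.00 -> the screen looks black through the glasses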
 
That's the LCD polarizer, and it has nothing to do with the 3D function. It's just a side effect of how LCDs operate: the panel adjusts light polarization to open/close pixels. That's how LCDs have operated since the 1970s. Even yesterday's LCD wristwatches had polarizers.

It may not explicitly have anything to do with the 3D function, but it sure has an effect on 3D: it makes it impossible to use 3D in portrait mode with these monitors, which was exelias0's question.

My eyes don't care about light orientation = portrait is a go.
3D glasses care about light orientation = 3D glasses in portrait is a no go.

Regular sun glasses don't care about light orientation = portrait a go.
Polarized sun glasses care about light orientation = polarized portrait sunglasses a no go.

It's so crippling that NVIDIA just outright disables 3D in portrait, getting back to exelias0's question.
 
3D Vision = the entire package: glasses, emitter, software. How is my statement incorrect? If the glasses are polarized such that they only work in landscape, 3D Vision only accepts polarized light. Yes, LCD tech is polarized by default, but only once you introduce the polarized 3D Vision glasses does anyone care. If they introduced non-polarized glasses, sure, 3D would still work. As it stands right now, 3D Vision has a polarized-light limitation.

So that begs the question: why are they polarized? It sure must provide some benefit. So unless exelias0 hires a manufacturer to make him custom portrait-polarized glasses and hacks the NVIDIA drivers, it's a no go. ;)
 
Correct; it's just that your original statement is inaccurate:
... "3D vision is polarized light."

Polarization is an intrinsic feature of LCDs, so the effect caused by the glasses is simply a side effect of one LCD panel (the glasses) interacting with another LCD panel (the monitor). Both are polarized panel technologies, and both are subject to the polarizer effect you saw. But it doesn't play a role in 3D, except to introduce an incompatibility. You could even replace the 3D Vision glasses with mechanical shutter glasses (a Nipkow-style disc with just one big hole, or an old movie-projector-style 180-degree shutter, spun by a 60Hz synchronous motor), and it'd work in both portrait and landscape mode. It would be quite noisy (it actually was in tests). They did, in fact, use mechanical shutter viewers in an old experimental movie show, back in a 1922 trial:

[Image: the Teleview System's seat-mounted shutter viewer, from IGN's History of 3D Movies]

"Filmmakers and theater owners continued to experiment with the growing 3D market. Laurens Hammond and William F. Cassidy debuted their Teleview System in late 1922. This form of projection rapidly alternated frames from two film reels. Small viewers attached to the seats were synchronized to open and close their displays in accordance with the projector. Because of the cumbersome nature of the format, only one movie was ever developed specifically for the Teleview System."

Source: IGN History of 3D Movies

It would work successfully with 3D Vision -- simply replace the LCD glasses with non-LCD shutter glasses (that operate the shutters at the same speed), and it'd work in both portrait and landscape. So 3D Vision, as you can see, isn't using polarization as part of the 3D function. In fact, it would already work in landscape mode right away with the existing 3D Vision (no driver modifications, no monitor modifications) in a science experiment using a synchronous motor (sync'd to the refreshes) and a disc with a hole or cutout. Just activate the emitter (point the real glasses at the emitter, then put those glasses down and pick up your homebrew mechanical shutter contraption "glasses") -- and nVidia 3D Vision would work through the homebrew mechanical shutters. It would certainly be an interesting steampunk project, and it would actually work with absolutely no 3D Vision hardware or software modifications!

I agree that the polarization limitation is technically silly, but it's difficult to create wearable, noiseless, lightweight, low-power shutter glasses that aren't LCD. Currently, that's the problem; there's no really practical non-LCD shutter glasses technology yet. A couple of weeks ago I made a high-speed video illustrating shutter glasses operation with my 3D Vision glasses; all they are doing is performing a noiseless version of mechanical shuttering (synchronized with the emitter and strobe backlight) -- I'll post it on my blog at some point. It would be nice if there were a breakthrough in this, though!
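To make the shutter timing concrete, here is a minimal sketch of frame-sequential (active shutter) operation at 120 Hz. The numbers are my own illustration rather than measurements of 3D Vision hardware, but the principle is the same whether the shutters are LCD or mechanical:

# Frame-sequential (active shutter) 3D at 120 Hz: the display alternates
# left-eye and right-eye frames, and the emitter opens the matching shutter.
# Each eye therefore sees 60 of the 120 refreshes per second.
# Example timing only, not measured from actual 3D Vision hardware.

REFRESH_HZ = 120
frame_time_ms = 1000 / REFRESH_HZ  # ~8.33 ms per refresh

for frame in range(6):
    eye_open = "LEFT" if frame % 2 == 0 else "RIGHT"
    eye_closed = "RIGHT" if eye_open == "LEFT" else "LEFT"
    t = frame * frame_time_ms
    print(f"refresh {frame}: t = {t:5.2f} ms, {eye_open} shutter open, {eye_closed} shutter closed")

The polarizers inside the LCD shutters are only an implementation detail of how the shutters darken; a mechanical disc following the same open/close schedule would produce the same 3D effect.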
 
3D vision = the entire package, glasses, emitter, software. How is my statement incorrect?
Technically? It depends on how other readers interpret it.

... "3D vision is polarized light."
Very Correct if interpreted as "does the whole technology chain utilize any polarized components?"
Very Incorrect if interpreted as "does the 3D feature use the polarized principle to separate left/right eye images?"

The original statement is definitely easy to misinterpret. People will misunderstand it as being connected to passive 3D, which definitely does utilize the polarization principle to separate the left/right eye images. Active 3D such as 3D Vision does not. One of Blur Busters' mandates is to help fact-check the Internet on motion blur as well as related topics (the Blur Busters Squad responds to "HUMANS CAN'T TELL APART 30FPS AND 60FPS" emergency calls! :D) -- and to reduce potential misunderstandings. Even if you think it is correct, other people will potentially misunderstand it, depending on interpretation.

Therefore, change:
... "3D vision is polarized light."
Into:
... "3D Vision glasses, although they do not use polarized light to separate the left/right eye images, use LCDs, which require polarizers as part of all LCD technology. That polarization is why the 3D Vision glasses are not compatible with portrait mode."
(or something similar)

Perhaps I responded too fast; it's just a potential misinterpretation issue, even if both you and I understand the intent of the statements. Anyone who understands the difference between passive 3D and active 3D will instantly realize the potential misinterpretation issue at hand.
 
In theory, it should work if you flip the LCD panels in the 3D Vision glasses inside out. Of course, then it would no longer work in landscape mode ;)
 
Ya, I only meant the limitation of using polarized 3D glasses in portrait, nothing more. ;)
 
Hi all. I'm new to the forum. I've read all the pages of this thread and I have a question that I'm sure you folks can answer. I am going to purchase a three-monitor (VG248QE) setup and two 7970s or 680s to drive it. The question is: which cards will have the biggest advantage? I was leaning toward the 7970s because of better pricing and CF performance, but some of the disadvantages concern me: no LightBoost, needing 3 DisplayPorts on one card to run them all @ 144Hz, and the previous posts saying no 3D. The 680s are more costly, but from what I read they will do LightBoost and 3D but not 144Hz. Maybe those of you that have done either setup can lay it all out straight. This will be a pretty good chunk of cash, and some good advice will go a long way. Thanks for the help.
 
I suggest you buy a Titan instead!

You need at least 2 Titans or 3 7970s to run three of these monitors @ 144Hz with the most eye candy and highest FPS.

With AMD there is no LightBoost and no 3D.

With Nvidia there is LightBoost and 3D.
 
Sorry, Lightboost set at 10 and contrast set at 100.
Just so you know, setting contrast to 100 will cause some white clipping (see the Lagom contrast test), and will also amplify the LightBoost trailing "sharp ghost".

IMHO, if you need extra brightness, it's better to set LightBoost to 50% and lower the contrast to 80% or less. (I nowadays use 75%, after realizing that Contrast amplified the faint remaining LightBoost overdrive artifact; TFTCentral observed this too.) The sweet spot for maximum brightness while reducing/eliminating the LightBoost overdrive artifacts seems to be Contrast=65% or so (even on the ASUS).
 
Hi all. I'm new to the forum. I've read all the pages of this thread and I have a question that I'm sure you folks can answer. I am going to purchase a three-monitor (VG248QE) setup and two 7970s or 680s to drive it. The question is: which cards will have the biggest advantage? I was leaning toward the 7970s because of better pricing and CF performance, but some of the disadvantages concern me: no LightBoost, needing 3 DisplayPorts on one card to run them all @ 144Hz, and the previous posts saying no 3D. The 680s are more costly, but from what I read they will do LightBoost and 3D but not 144Hz. Maybe those of you that have done either setup can lay it all out straight. This will be a pretty good chunk of cash, and some good advice will go a long way. Thanks for the help.
Just so you know, LightBoost is only supported on nVidia products.

Doing 3 monitors at 30-60 fps is a lot of fun, but if you plan to use LightBoost, keep in mind that you need fps=Hz (just like you needed with CRT) to completely eliminate visible motion blur (caused by display limitations), so you need 100fps @ 100Hz or 120fps @ 120Hz to get the full zero-motion-blur effect. This is because LightBoost is only enabled at 100-120Hz; LightBoost is not enabled at 144 Hz. LightBoost at 60 fps looks fine, but it definitely won't look nearly as "wow" as LightBoost at 120 fps.

nVidia cards can run at 144 Hz, but you need to go down to 120 Hz to enable LightBoost.
Remember that a higher refresh rate does not necessarily mean less motion blur.
This is why CRT 60fps@60Hz has sharper and clearer motion than traditional LCD 120fps@120Hz.
This has to do with the sample-and-hold effect of traditional LCDs.
(Your eyes are continuously moving when tracking moving objects, so static continuously-shining frames get blurred across the retina by that eye-tracking motion.)
LightBoost stroboscopically eliminates that problem (just like CRT flicker does).
So this is what you get:
PixPerAn Tests on BENQ XL2411T and ASUS VG278H

baseline - 60 Hz mode (16.7ms continuously-shining frame)
50% less motion blur (2x clearer) - 120 Hz mode (8.33ms continuously-shining frame)
60% less motion blur (2.4x clearer) - 144 Hz mode (6.94ms continuously-shining frame)
85% less motion blur (7x clearer) - 120 Hz LightBoost, set to 100% (2.4ms frame strobe flashes)
92% less motion blur (12x clearer) - 120 Hz LightBoost, set to 10% (1.4ms frame strobe flashes)
Thus, there is a LOT less motion blur at 120Hz LightBoost than at 144Hz non-LightBoost.
(It would be nice if LightBoost supported 60 Hz and 144 Hz though; but nVidia limited its function to 100 through 120Hz)
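Those percentages follow from how long each frame stays visible: blur from eye tracking is roughly proportional to the frame's visibility time (persistence). A quick sketch of the arithmetic, using the frame/strobe durations quoted above; the eye-tracking speed is just an illustrative assumption:

# Sketch: perceived motion blur while eye-tracking is roughly proportional
# to how long each frame stays lit (persistence). The durations below are
# the ones quoted in the list above; the 1000 px/s tracking speed is an
# assumption purely for illustration.

BASELINE_MS = 16.7  # 60 Hz sample-and-hold frame

modes = {
    "60 Hz sample-and-hold": 16.7,
    "120 Hz sample-and-hold": 8.33,
    "144 Hz sample-and-hold": 6.94,
    "120 Hz LightBoost=100%": 2.4,
    "120 Hz LightBoost=10%": 1.4,
}

tracking_px_per_s = 1000  # assumed eye-tracking speed

for name, visible_ms in modes.items():
    clearer = BASELINE_MS / visible_ms
    less_blur = (1 - visible_ms / BASELINE_MS) * 100
    smear_px = tracking_px_per_s * visible_ms / 1000
    print(f"{name:24s} ~{clearer:4.1f}x clearer, ~{less_blur:3.0f}% less blur, "
          f"~{smear_px:4.1f} px of smear at {tracking_px_per_s} px/s")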
 
Thanks for info. It sounds like the best setup for the three VG248QE would be to run at 100-120Hz, with Lightboost enabled at 10% and contrast set to around 65%. Would two 680 in SLI be able to handle this with setting set to reasonable levels on a game like BF3 multiplayer? Is Lightboost a hardware or software thing and is this something AMD might address?
 
Thanks for info. It sounds like the best setup for the three VG248QE would be to run at 100-120Hz, with Lightboost enabled at 10% and contrast set to around 65%. Would two 680 in SLI be able to handle this with setting set to reasonable levels on a game like BF3 multiplayer? Is Lightboost a hardware or software thing and is this something AMD might address?
LightBoost is a monitor feature licensed by nVidia graphics to monitor manufacturers, but it essentially requires nVidia drivers to unlock it in the monitor. So, it's software unlocking a hardware feature.

AMD users have gotten it to work by enabling it on an nVidia system and then hot-plugging the monitor into a pre-configured (ToastyX CRU-tweaked) Radeon system, and the LightBoost effect persists on the Radeon. However, it's quite a cumbersome procedure.

Two 680's with some tweaking will give you a very good LightBoost effect in Battlefield 3, especially if you can eliminate all other sources of bottleneck (a recent CPU -- a good i7). An SSD helps a little bit in eliminating disk-access-based stutters too. A gaming mouse eliminates the stutters (of a cheap mouse) that can reduce the zero-motion-blur effect. You will still get frequent frame drops below 120fps in busy moments. Vega and others can chime in to confirm their experiences.

____

I personally set my VG278H to 75% contrast, LightBoost to 50%. Otherwise, it is too dim for me. I like this brightness a lot; not too bright and not too dim -- so it doesn't cause brightness-related eyestrain. I can't really tell apart LightBoost=10% vs 50% in video games, but I notice it in test patterns. LightBoost=10% versus LightBoost=100% is slightly noticeable. It's the jump from non-LightBoost to LightBoost that's most massive, so further improvements are already at the point of diminishing returns. However, some people swear by LightBoost=10%.
Having LightBoost at 100% already gives you the vast majority of the motion blur elimination benefits; the rest of the tweaking is just a bonus, especially if even the dimmed LightBoost=100% is still too bright in a dark room. (Some say it's too bright; some say too dark.)
 
I just got my ASUS VG248QE and am spoiled by the LightBoost@120Hz tweak. I currently stream, and I used to use a DVI-to-HDMI adapter where I split the HDMI signal with a box to two other outputs: one went to my monitor and the other went to my Avermedia capture card (which only accepts HDMI) in the second PC.

Now I would like to keep playing on this nice new monitor @120HZ with lightboost and avoid having to use software to clone my display, taking a performance hit.

So what I am currently seeking, if a decent one exists, is a DVI-D splitter to split the output to my monitor then somehow get it to the capture card. I've done some Googling and found some adapters but I am not sure if they support 120hz and if they would affect the way lightboost would work with the monitor.

In summary:
-Like to stream
-Want to play on 120hz lightboost monitor.
-Want to split the signal from the video card to the monitor (DVI-D, 120Hz & LightBoost) and to the Avermedia HD Gamer capture card (HDMI)

Any feedback?
 
This review of quad titans has bar graphs showing a host of different gpu's from 7870's and 660's all the way up to 7990, 690, 680sli, and multiple titans. It seems like a good way to get an idea of single and dual setups and overall performance scale even outside of the crazy expensive multiple titan setups.

us.Hardware.nfo: Nvidia GeForce GTX Titan 3-way/4-way SLI review incl. 5760x1080 and frametimes

According to that review, a single 680 at 1080p seems capable of hitting 120 fps in unmodded Skyrim, and 118 fps at medium settings in BF3.

Some newer games like Hitman and Far Cry 3 have much lower numbers; whether that's due to more demanding graphics or poorer optimization, I don't know. I'm curious what a heavily modded Skyrim with much greater graphical eye candy and some of the most demanding modern MMOs would do. Also notably absent is the newest Tomb Raider, but I don't think that was out when the testing was done.

According to the review, a single Titan at 1080p in BF3 on ultra with 4x AA gets 100.7 fps, enough for the 100 Hz LightBoost mode. Tweaking the settings down some would hit 120 fps, though, since a single 680 at medium hits it. So probably "high" settings rather than medium or ultra would push you over 120 fps.

Farcry3 doesn't get over 80fps on ultra even with the beefiest gpus/combos. A single 680 and greater seem to hit near/at 100fps though.

I wish there were more games on that review but it might give some idea anyway.
 
I just got my ASUS VG248QE and am spoiled by the LightBoost@120Hz tweak. I currently stream, and I used to use a DVI-to-HDMI adapter where I split the HDMI signal with a box to two other outputs: one went to my monitor and the other went to my Avermedia capture card (which only accepts HDMI) in the second PC.

Now I would like to keep playing on this nice new monitor @120HZ with lightboost and avoid having to use software to clone my display, taking a performance hit.
I think that's a distribution amplifier you're looking for. (DVI distribution amp, HDMI distribution amp). TV stores use them to display the HDMI signal from one box to multiple TV's simultaneously.

I'm curious how much lag cloning adds, versus how much lag a good distribution amplifier adds. They act as repeaters; and repeaters can add lag. Some repeaters are worse than others.
I think using cloning mode with two identical displays (to minimize scaling lag) is probably no better. If you want to clone and record gameplay while you play, try VGA output and a simple analog splitter; that will allow cloning with zero input lag. It's the only way to really guarantee elimination of the lag caused by splitting a digital signal (which ends up passing the signal through a repeater). Also, you might not be able to play your Blu-rays (e.g. HDCP copy-protected content) when using some distribution amplifiers.
 