Asus VG248QE 144hz 3D Vision 2: The Official Thread

I have a 750D currently, which is the 27" version of the 700D, and I'll replace it with the BenQ / ASUS because of this LightBoost trick. See my experience so far with it:

http://hardforum.com/showpost.php?p=1039578000&postcount=576

http://hardforum.com/showpost.php?p=1039481085&postcount=98

It's a good monitor as far as TN goes, but not blur-free enough for me. Admittedly, I seem to be a bit sensitive; I also can't stand bright light.

I think the Catleap / Overlord will do better than speed 5 in PixPerAn once overclocked
(I need a better card first, as I wrote in the zero blur thread)

The Samsung has very bright colors for a TN monitor, although it looks rather pale next to the Overlord. It's hard to imagine a TN screen with AG coating can top that. Sure, I'd like it glossy too, but peeling the coating off is a bit risky IMO.

Regarding picture quality, that's a massive difference indeed; just look at the guy with the torch, it's like a different picture.

What guy with a torch? I'll take your opinion at your word, but if you are viewing pictures of other people's monitors you have to take them with a grain of salt. Each monitor's calibration, combined with the room lighting it sits in, affects those displays. Then the camera itself has a bias, and is affected by lighting as well. The image format rendered can affect the photos if not RAW. Then you are viewing said pictures on a different monitor, with different calibration and a different room lighting bias. :rolleyes:
.
I dislike AG for the same reasons, and I own a 750D as well as a glossy 2560x1440 IPS. What you said makes sense, but I think people put too much stock in photos of monitors for the above reasons, among others. I don't think the Samsung looks pale when tweaked. It's pretty saturated, just not uniform, given the TN shading/shift/shadow.
 
You don't get any screen tearing in LB mode when keeping FPS 120 or above without VSync. Granted, you need a pretty powerful computer to keep FPS 120 or above at all times in all games.

Enabling LB locked my fps to 120. I could uncheck the "Enable 3D" box, but I had to do it while the game was already launched to keep LightBoost on. I'm not sure if that's the case with every game, but for Black Ops 2 it was for me.
 
If Mark R. gets a chance, maybe he can chime in.
.
I'd like to know what the costs of using vsync (double buffered) are in regard to input lag at 100hz and 120hz on modern, enthusiast~beastly gaming systems... assuming you are trying to keep the fps higher than the refresh rate all the time for 1:1 processing. If it still adds input lag, how many milliseconds?
.
I'd also like to know if high levels of AA and other eye candy processing add to input lag at all, and if so, how many milliseconds?
COMPETITION: For competition gaming, you really do want to minimize input lag.

FACTORS: That said, you must consider the lag in the whole chain from game software to the human vision system. A 1000Hz mouse gives position updates faster, reducing input lag. A faster GPU and CPU turns game controls into rendered 3D frames faster, reducing lag. Etc. Increasing gamma on the monitor can improve reaction time (less brain lag) in dark environments. Etc. For some people (not all), LightBoost reduces brain lag more than the increase in display lag (which could be a couple of milliseconds). You do not want a weak link in your entire chain in competition gaming.

Does LightBoost outweigh its slight added input lag?: Depends. Several people agree wholeheartedly, others disagree. My take: definitely yes, for some gaming styles and some people, due to less brain lag. It depends on your gaming style, especially if you are used to CRT chiefly because of its zero motion blur and the fast reaction times possible in panning-heavy maneuvers such as circle strafing. Others are more used to LCD and have adapted their gaming styles. The LightBoost zero motion blur effect does outweigh (for me) one or two extra milliseconds of lag, thanks to the faster reaction times made possible by quicker identification of non-blurred objects while in motion. Some people prefer the motion blur, but others like me dislike it. It's up for debate how many milliseconds of lag are worth giving up to get 90% less motion blur than a standard 60Hz LCD, 75% less motion blur than a 120Hz TN without LightBoost, and 70% less motion blur than an overclocked Catleap 2B. Some say it may only be worth giving up 1ms, others say it's worth giving up one full frame of lag.
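Those blur-reduction percentages can be sanity-checked with a quick persistence calculation. This is a rough sketch: the ~2ms LightBoost strobe length is my assumption (it varies with the LightBoost OSD setting), and sample-and-hold blur is taken as proportional to how long each frame stays lit.

```python
# Rough sanity check of the blur-reduction figures above.
# A sample-and-hold display keeps each frame lit for the whole refresh
# period; a strobed backlight only shows it for the strobe flash.
# Motion blur scales roughly with that visible persistence.

def persistence_ms(refresh_hz):
    """Full-persistence (sample-and-hold) display: lit the whole frame."""
    return 1000.0 / refresh_hz

lightboost_strobe_ms = 2.0  # assumed strobe flash length

for name, hold in [("60Hz LCD", persistence_ms(60)),
                   ("120Hz LCD (no strobe)", persistence_ms(120)),
                   ("Catleap 2B @ 130Hz", persistence_ms(130))]:
    reduction = 100.0 * (1 - lightboost_strobe_ms / hold)
    print(f"{name}: {hold:.1f}ms hold -> ~{reduction:.0f}% less blur with a ~2ms strobe")
```

With a ~2ms assumed strobe this comes out around 88%, 76%, and 74%, in the same ballpark as the 90/75/70% figures quoted.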

SOLO gaming: Sometimes you are into solo gaming & want the prettiest pictures. When I game solo, I like using VSYNC because I am unusually sensitive to tearing (even beyond 120fps). I am able to detect tearing during fast turns at 180fps @ 120Hz with LightBoost, unless I'm using "adaptive vsync" (which has its own pros and cons). Once it goes beyond 200-300fps, the tearing becomes weak, but remarkably I can still detect it; tearing just ceases to bother me above about ~250fps. I'm more sensitive to tearing than input lag for solo gaming, so when I play solo I turn on VSYNC, since I can still see tearing above the refresh rate. You might not be sensitive, but I am.

EYE CANDY: Depends. Sometimes it has no cost, but it can accumulate slight amounts of input lag (mainly via lower framerate); some games are worse than others, and it's usually insignificant. Most of the input lag is in the reduced framerate. Even slowing down from 240fps to 120fps (even with vsync off or triple buffering) adds a few milliseconds of input lag, even if you don't feel it. It's hard to notice, and needs high speed equipment and hours/days of multipass scientific testing to be sure. Regardless, the difference between 240fps and 120fps works out to a calculated 4.2ms improvement in latency. (The difference in "freshness" of a frame rendered 1/240th of a second ago versus a frame rendered 1/120th of a second ago!)
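The 4.2ms figure is straight arithmetic on frame age (nothing here is measured; it's just the "freshness" difference the post describes):

```python
# Worst-case "frame freshness" difference between two framerates:
# a frame displayed now was rendered up to one frame-time ago, so
# dropping from 240fps to 120fps means the newest frame can be up to
# 1/120s old instead of 1/240s old.

def freshness_penalty_ms(fps_high, fps_low):
    return (1.0 / fps_low - 1.0 / fps_high) * 1000.0

print(round(freshness_penalty_ms(240, 120), 1))   # 4.2 -- the figure above
print(round(freshness_penalty_ms(1000, 120), 1))  # 7.3 -- the old-game 1000fps case mentioned later
```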

THE "SHOOT FIRST" effect: Sometimes input lag becomes small enough that you can't feel any difference, and it takes the "shoot first" effect to notice. Two guys in a game shoot simultaneously. The person that shoots 1ms sooner can win. It's like the 100 meter sprint, where milliseconds matter even if you can't feel a single 1ms. I do find that human brain lag (reaction time) for me is less with LightBoost, despite LightBoost's extremely tiny added display lag.
 
So roughly,

input lag using LightBoost 2 synchronized backlight strobing in 2D gaming --> 1ms - 2ms

aggressive eye candy settings' possible resultant frame rate fluctuations/dips ---> ~2ms ("a few ms"; "hard to notice" without testing equipment, if noticeable at all)


SLI / Crossfire ------> ??? input lag

No vsync and much higher than refresh rate fps ---> not enough tearing to bother a lot of people

Vsync ---> input lag??? None at much higher fps than refresh rate? Not sure at lower fps. There are double buffered and triple buffered versions too.


I've never had any issue with screen tearing while keeping the FPS faster than the refresh rate (in this case 120+). I never use VSync. In games that don't use triple buffering, like Skyrim, VSync adds a ton of input lag. Contrary to popular belief, triple buffering decreases VSync lag over double buffering.

There are many things that increase input lag, but by far double buffered VSync is the largest culprit. While AA and technologies like SLI and Crossfire may add a hint of input lag, it's generally fairly low.

It would be interesting to see some solid numbers in that regard, but proper high-speed equipment is needed.

Thanks for that detailed reply btw.
 
So roughly,
By an order of magnitude.

input lag using LightBoost 2 synchronized backlight strobing in 2D gaming --> 1ms - 2ms
Unconfirmed at this time. Some suggest it is half a frame of input lag, due to the top edge of the screen staying dark for a whole frame until the pixel scanout hits the bottom of the screen and the backlight flashes. There is more input lag for the top edge of the screen than the bottom edge during strobe backlight operation because of this. It's visible in my LightBoost high speed video. I need to do some tests this month; it is on my todo list. That said, it's the world's lowest-lag synchronized backlight. The Sony/Samsung HDTV scanning backlights (e.g. Motionflow XR 960) add a lot more input lag, and are optimized for video instead of videogames. LightBoost is the first really good synchronized backlight for games.
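A simplified model of that top-vs-bottom difference (my assumptions: instant pixel response, and a strobe fired exactly when scanout reaches the bottom; real panels differ):

```python
# Why strobing adds more lag at the top of the screen than the bottom:
# pixels are scanned out top-to-bottom over one refresh period, but the
# backlight flashes only after scanout completes. A top-edge pixel
# therefore waits nearly a full refresh in the dark before becoming
# visible; a bottom-edge pixel waits almost none.

REFRESH_HZ = 120
frame_ms = 1000.0 / REFRESH_HZ

def strobe_lag_ms(y_fraction):
    """Added wait for a pixel at vertical position y (0=top, 1=bottom)."""
    return frame_ms * (1.0 - y_fraction)

print(f"top edge:    {strobe_lag_ms(0.0):.1f} ms")  # a full frame
print(f"mid screen:  {strobe_lag_ms(0.5):.1f} ms")  # the 'half frame' average
print(f"bottom edge: {strobe_lag_ms(1.0):.1f} ms")
```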

aggressive eye candy settings' possible resultant frame rate fluctuations/dips ---> ~2ms ("a few ms"; "hard to notice" without testing equipment, if noticeable at all)
Let's simplify this. It's simply lower framerate with eye candy. There's always more input lag at a lower framerate. If you turn on an eye candy setting that slows framerate by 50%, you generally also have about 50% more input lag. This is regardless of whether the frame rate is above or below the refresh rate. Eye candy settings exist that slow framerates by 10x (e.g. laptops running Crysis at 5 frames per second), and 5 frames per second forces a minimum input lag of 200 milliseconds! So you are off by two orders of magnitude for this example.
Rather, you should instead say "Eye candy lowers framerate, which increases input lag", which is the most common case of input lag caused by eye candy.

No vsync and much higher than refresh rate fps ---> not enough tearing to bother a lot of people
It's a complex science. Tearing can be a beat frequency effect, too. For example, 119fps, 120fps, 121fps, 239fps, 240fps, or 241fps @ 120Hz creates much more visible tearing than 171fps @ 120Hz. Setting fps_max to 119 or 121fps causes a slowly-moving tearline that slides up or down vertically. Setting fps_max to 239fps or 241fps causes two slowly-moving minor tearlines, both sliding vertically. Setting fps_max to 120fps @ 120Hz with VSYNC off can cause an annoying perfectly-stationary tearline, staying in the middle of your screen. (Note: theoretically, it is possible for game logic to move the tearline just below the bottom edge of the screen, gaining the full benefit of double buffering without the input lag of traditional double buffering -- essentially render-timing-optimized double buffering.) Ditto for 240fps @ 120Hz (two stationary tearlines, half a screen height apart). 171fps @ 120Hz causes a random tearline that's never in one position.
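That tearline behavior falls out of a one-line phase calculation. An idealized sketch (my simplifications: perfectly constant frametimes, vsync off):

```python
# Where the tearline lands each frame: its vertical position is the
# scanout phase at the moment of the buffer flip, which advances by
# (refresh/fps mod 1) per flip.

def tearline_positions(fps, refresh_hz, frames=6):
    """Fractional tearline position per flip: 0.0 = top, 1.0 = bottom."""
    return [(n * refresh_hz / fps) % 1.0 for n in range(1, frames + 1)]

# 121fps @ 120Hz: position creeps ~0.8% per frame -> a slow vertical slide
print(["%.2f" % p for p in tearline_positions(121, 120)])
# 120fps @ 120Hz: position never moves -> one stationary tearline
print(["%.2f" % p for p in tearline_positions(120, 120)])
# 171fps @ 120Hz: jumps ~0.7 of the screen per flip -> looks random
print(["%.2f" % p for p in tearline_positions(171, 120)])
```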

Vsync input lag??? None at much higher fps than refresh rate? Not sure at lower fps. There are double buffered and triple buffered versions too.
There is usually reduced input lag at higher framerates, regardless of whether VSYNC is enabled (triple buffered) or not. This is due to frames being fresher when the visible frames are finally delivered to the screen.
....There can be more input lag if the game does not use the traditional (proper) triple buffering flipping technique but instead stacks multiple frames ahead (multi-GPU setups complicate frame rendering scheduling). Sometimes games get more input lag at higher framerates due to frame buffers being piled up more, but good programming (drivers and game) and proper configuration file settings should prevent this from ever occurring. There are also other variables, such as a slower CPU with drivers consuming more CPU reading from a 1000Hz mouse, or CPU starvation (e.g. threads so busy rendering frames that other important things, such as loading graphics or reading input, don't happen nearly as quickly).
....But assuming no petty bottlenecks like those, higher framerates should always result in less input lag, even when framerate is higher than refresh rate. If you ran a 15 year old 3D game at 640x480, the difference between 120fps@120Hz and 1000fps@120Hz can be a whopping 7 millisecond improvement in input lag (frame freshness 1/120sec vs 1/1000sec).

....And I'm telling less than 5% of the complex story of frame rates and input lag! It'll always be a contentious topic.
 
COMPETITION: For competition gaming, you really do want to minimize input lag.

FACTORS: That said, you must consider the lag in the whole chain from game software to the human vision system. A 1000Hz mouse gives position updates faster, reducing input lag. A faster GPU and CPU turns game controls into rendered 3D frames faster, reducing lag. Etc. Increasing gamma on the monitor can improve reaction time (less brain lag) in dark environments. Etc. For some people (not all), LightBoost reduces brain lag more than the increase in display lag (which could be a couple of milliseconds). You do not want a weak link in your entire chain in competition gaming.

Does LightBoost outweigh its slight added input lag?: Depends. Several people agree wholeheartedly, others disagree. My take: definitely yes, for some gaming styles and some people, due to less brain lag. It depends on your gaming style, especially if you are used to CRT chiefly because of its zero motion blur and the fast reaction times possible in panning-heavy maneuvers such as circle strafing. Others are more used to LCD and have adapted their gaming styles. The LightBoost zero motion blur effect does outweigh (for me) one or two extra milliseconds of lag, thanks to the faster reaction times made possible by quicker identification of non-blurred objects while in motion. Some people prefer the motion blur, but others like me dislike it. It's up for debate how many milliseconds of lag are worth giving up to get 90% less motion blur than a standard 60Hz LCD, 75% less motion blur than a 120Hz TN without LightBoost, and 70% less motion blur than an overclocked Catleap 2B. Some say it may only be worth giving up 1ms, others say it's worth giving up one full frame of lag.

SOLO gaming: Sometimes you are into solo gaming & want the prettiest pictures. When I game solo, I like using VSYNC because I am unusually sensitive to tearing (even beyond 120fps). I am able to detect tearing during fast turns at 180fps @ 120Hz with LightBoost, unless I'm using "adaptive vsync" (which has its own pros and cons). Once it goes beyond 200-300fps, the tearing becomes weak, but remarkably I can still detect it; tearing just ceases to bother me above about ~250fps. I'm more sensitive to tearing than input lag for solo gaming, so when I play solo I turn on VSYNC, since I can still see tearing above the refresh rate. You might not be sensitive, but I am.

EYE CANDY: Depends. Sometimes it has no cost, but it can accumulate slight amounts of input lag (mainly via lower framerate); some games are worse than others, and it's usually insignificant. Most of the input lag is in the reduced framerate. Even slowing down from 240fps to 120fps (even with vsync off or triple buffering) adds a few milliseconds of input lag, even if you don't feel it. It's hard to notice, and needs high speed equipment and hours/days of multipass scientific testing to be sure. Regardless, the difference between 240fps and 120fps works out to a calculated 4.2ms improvement in latency. (The difference in "freshness" of a frame rendered 1/240th of a second ago versus a frame rendered 1/120th of a second ago!)

THE "SHOOT FIRST" effect: Sometimes input lag becomes small enough that you can't feel any difference, and it takes the "shoot first" effect to notice. Two guys in a game shoot simultaneously. The person that shoots 1ms sooner can win. It's like the 100 meter sprint, where milliseconds matter even if you can't feel a single 1ms. I do find that human brain lag (reaction time) for me is less with LightBoost, despite LightBoost's extremely tiny added display lag.

Thanks for an informative post.

Do you mean that having 240fps is better vs 120fps on a 120hz screen?

I'm playing CS:GO and my fps is between 150-220; it's usually solid around 190fps.
Should I then OC my GPU even more to get the FPS solidly as high as possible?

I always thought that if never going under 120fps on a 120hz display then no worries.

I do have AA at maximum, if someone can show proof of added input lag I'll consider disabling AA.

Edit..> BTW, CS:GO is one of those games that I can't play smoothly in Crossfire, so I use just one GPU for CS:GO
 
Thanks for an informative post.
Do you mean that having 240fps is better vs 120fps on a 120hz screen?
I'm playing CS:GO and my fps is between 150-220; it's usually solid around 190fps.
Should I then OC my GPU even more to get the FPS solidly as high as possible?
I always thought that if never going under 120fps on a 120hz display then no worries.
It depends on how competitive you are. If you're doing the "big leagues" (paid professional gaming), then every little bit counts and you're open minded about testing different configurations, etc. All these will shave milliseconds off your whole input lag chain from game software to your eyeballs/brain.

For me, it doesn't matter as much (as pretty in-game visuals & zero motion blur) since I am an enthusiast/casual gamer rather than pro-league gamer. Sometimes I even prefer VSYNC-on 120fps@120Hz when playing online and live with that tiny bit of lag -- but it depends on the game. Beyond a certain point, for casual gaming, a few milliseconds of lag (for much worse visuals) is not going to matter much (unless "SHOOT FIRST" (even by 1ms) means winning a gaming championship).

I do have AA at maximum, if someone can show proof of added input lag I'll consider disabling AA.
More accurately, to rephrase: AA usually has nothing *directly* to do with input lag. However, AA can reduce framerate, which does increase input lag. If your fps decreases from 200fps to 100fps when enabling AA, your frames are potentially up to 1/100sec (10ms) less fresh (frames rendered 1/100sec ago rather than frames rendered 1/200sec ago). Then again, it is possible AA can make it easier to identify far-away objects if you're distracted by pixellation.

Personally: I prefer AA on, as long as I can hit 120fps@120Hz.
I keep the game's built-in GPU motion blur turned off ("mat_motion_blur_enabled 0" in the Portal 2 console, for example). The blur effect in Portal 2 is very subtle, so it's reasonably nice, but if I am running at 120fps@120Hz using LightBoost, it looks better to my eyes (personally) with it turned off.

Personal preferences vary, but, generally for me:
SOLO settings: AA maxed out, VSYNC ON, LightBoost on, 120fps@120Hz, for the "perfect CRT motion" effect.
COMPETE settings: AA low (2x), VSYNC OFF, LightBoost on, fps_max as high as possible (without causing slowdown side effects).
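For Source-engine games, those two profiles could be expressed as console aliases in autoexec.cfg. A hypothetical sketch: the cvar names (mat_vsync, mat_antialias, mat_motion_blur_enabled, fps_max) are Source-specific, and the values shown are illustrative, not recommendations.

```
// Hypothetical autoexec.cfg sketch of the SOLO/COMPETE profiles above.
// Verify cvar names in your particular game before relying on them.
alias solo    "mat_vsync 1; fps_max 120; mat_antialias 8; mat_motion_blur_enabled 0"
alias compete "mat_vsync 0; fps_max 999; mat_antialias 2; mat_motion_blur_enabled 0"
// Then type "solo" or "compete" in the console to switch.
```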

IMHO, now we are getting offtopic.
Please don't reply to this post, but post a link to a new thread.
This thread is about ASUS VG248QE.
 
Mark, can I run LightBoost on one monitor and have my VH242H run normally? I have LightBoost running on the VG248QE and my VH242H plugged in, but whenever I turn PixPerAn on, LightBoost shuts off

VH242H plugged in via HDMI for sound
 
Mark, can I run LightBoost on one monitor and have my VH242H run normally? I have LightBoost running on the VG248QE and my VH242H plugged in, but whenever I turn PixPerAn on, LightBoost shuts off
VH242H plugged in via HDMI for sound
Make the VG248QE the primary monitor. Some said they needed to install the .reg and .inf files from my HOWTO to keep LightBoost persistent. Also, make sure PixPerAn is launching into 120Hz instead of switching refresh rates.

To allow me to give courtesy to others -- for further questions about LightBoost/strobe backlights, post your questions in the "official" LightBoost thread on this forum. Though I enthusiastically answer questions directly addressed to me about this topic in any thread, there's now an overflow of LB-specific information in this thread. I want to avoid saturating this VG248QE-specific thread with "too much" LB-specific information (even though the VG248QE is a LB monitor).
 
Can anyone confirm getting 144Hz from an AMD card, via either DL-DVI or DP 1.1/1.2, on the VG248QE?

Thanks!
 
Personal preferences vary, but, generally for me:
SOLO settings: AA maxed out, VSYNC ON, LightBoost on, 120fps@120Hz, for the "perfect CRT motion" effect.
COMPETE settings: AA low (2x), VSYNC OFF, LightBoost on, fps_max as high as possible (without causing slowdown side effects).

IMHO, now we are getting offtopic.
Please don't reply to this post, but post a link to a new thread.
This thread is about ASUS VG248QE.

It went off a long time ago. I think there is a thread about LightBoost... what do I know...
 
RavnosCC,

Yes, I have a 6950 and 144Hz works fine with the provided dual-link DVI cable
 
DisplayPort: I have not been able to get 144Hz using 1.1 cables... 1.2 cables will be here Wednesday for further testing...
 
Just ordered one of these to replace my ZR30w (had to return it to work when laid off).. Hopefully it's badass..

Can anyone let me know exactly what cables I'd need to get the full 144Hz? I have an EVGA GTX 680
 
DisplayPort: I have not been able to get 144Hz using 1.1 cables... 1.2 cables will be here Wednesday for further testing...

There is no difference between a DP 1.1 cable and a DP 1.2 cable. That is all marketing and may be what they are tested up to. Unless the cables are really shitty quality or you are trying to go longer distances, they won't matter.

Read Q #2:

http://www.displayport.org/faq/
 
There is no difference between a DP 1.1 cable and a DP 1.2 cable. That is all marketing and may be what they are tested up to. Unless the cables are really shitty quality or you are trying to go longer distances, they won't matter.

Read Q #2:

http://www.displayport.org/faq/

Well I'm screwed... LoL

The 1.1 cable I have is from Monoprice... So IDK about the quality... The new ones are from Accell and are supposedly " Eyefinity Certified " ... Might be more BS IDK
 
Well I'm screwed... LoL

The 1.1 cable I have is from Monoprice... So IDK about the quality... The new ones are from Accell and are supposedly " Eyefinity Certified " ... Might be more BS IDK

Haha "Eyefinity Certified". Now I've heard everything. Marketing departments are just as clueless as the people they market to. It's symbiotic really. :eek:

But are we quite sure those cables will work with "LED" monitors, whatever those are. I must see the certification first. :)
 
Haha "Eyefinity Certified". Now I've heard everything. Marketing departments are just as clueless as the people they market to. It's symbiotic really. :eek:

But are we quite sure those cables will work with "LED" monitors, whatever those are. I must see the certification first. :)

LoL

I've got "sucker" written all over my forehead.

Still kinda bummed I can't run @ 144Hz using 3 DP cables to my 7970 Lightnings.
 
I don't think I've seen this asked yet in this thread.

Has anyone compared this "Hack" with an IPS Panel? Maybe even a Yamakasi Catleap 2B OC Monitor that is at 120hz?

This may be like comparing apples to oranges since the resolutions are totally different but the thought did cross my mind.

I own one of the Yamakasi's and couldn't be happier, and may even order a 2nd. lol
 
Check Vega's post history, he has commented several times on the tradeoffs between them. I would also consider that maintaining 120Hz at 1080p is quite a bit easier to do than 1440p on a single gpu.
 
Check Vega's post history, he has commented several times on the tradeoffs between them. I would also consider that maintaining 120Hz at 1080p is quite a bit easier to do than 1440p on a single gpu.
Vega, using motion tests, measured 7x less motion blur on the VG248QE (1.4ms) than the Catleap 2B 130Hz (10.1ms). He is torn between these two displays sometimes. Having perfectly-clear fast motion, that looks as perfectly sharp as a stationary image, has been a sight to behold.

Tough tradeoffs. If you don't do much FPS gaming and don't mind motion blur, definitely keep the 1440p display. But if you're used to CRT and are bothered by motion blur during FPS gaming... there's lots of motion blur on every 1440p IPS panel ever made. Different tools for the different jobs they're best at.
 
Well, I got the monitor yesterday and so far it's very nice. I expected to see light bleed since it's an Asus, yet I can't see any. I didn't crank the brightness to check, though. 144/120Hz is very, very nice.

Not sure if I will ever use LightBoost (thanks Mark); I don't game that much. Since my 680 does not show the "keep 3D always on" again, it's really OK.
 
I don't think I've seen this asked yet in this thread.

Has anyone compared this "Hack" with an IPS Panel? Maybe even a Yamakasi Catleap 2B OC Monitor that is at 120hz?

This may be like comparing apples to oranges since the resolutions are totally different but the thought did cross my mind.

I own one of the Yamakasi's and couldn't be happier, and may even order a 2nd. lol


Better for Online Gaming, 120hz Strobed or 144hz Not Strobed?


You might also want to check the main lightboost 2 zero blur thread starting around post #300 .
ASUS/BENQ LightBoost owners!! Zero motion blur setting!

.
 
Hey guys,

Well, I installed my VG248QE, and now I feel like an idiot, because for the life of me I cannot get 120Hz or 144Hz to work without horrible artifacting all over the screen -- unusable... I have 2x 5850s in Crossfire, and I'm using the supplied DL-DVI cable that came with the monitor. I've tried installing the monitor drivers from the ASUS website, I've rebooted, I've searched Google and a few support forums, I've tried different DVI ports on my video card, I've disabled Crossfire, and I've unplugged my secondary screen. I'm going to try updating or rolling back AMD drivers at this point because I've got the latest installed, but no luck... any help would be greatly appreciated. Leave it to me to screw up taking a thing out of a box and plugging it in :p (btw, it works fine at 60Hz, but that AINT why we got this ;-) )

Thanks again for any assistance...
 
Hey guys,
Well, I installed my VG248QE, and now I feel like an idiot because for the life of me, I cannot get 120Hz or 144Hz to work without horrible artifacting all over the screen, unusable...
Do you have "DVI noise" -- green pixel noise, colored sparkles, etc? If it looks half resolution horizontally, it looks like single-link DVI. This can also happen with defective dual-link DVI ports and cables.

Regardless, try another cable, maybe the monitor's included cable is defective. Try a high-end thick monoprice.com cable (cheap).
Or bring it into the computer store, and troubleshoot it with them -- they would try their DVI cable, and try their own computers; and they might subsequently discover it's a defective DVI port on your monitor, discover it's a defective cable, or discover it's your 5850's.
 
Which 5850 do you have? Did you check to make sure that your card has a DL-DVI output? If that doesn't work... does your card have a DisplayPort out? You could always try that.
 
Can anyone post their calibrated non-LightBoost settings please? I only have a 7970 right now, so I can't test LB yet, and I can't seem to get this monitor calibrated correctly. THX!
 
Wow, so simple so far. I replaced the DL-DVI cable that came from ASUS with a cheapo extra I had lying around the house, and bam, it's working now.

Thanks for the fix Mark :D Hopefully it goes smoothly from here on out.
 
I feel like I have a lot of backlight bleeding on mine. What's the best way to test, and does lightboost affect it?
 
I feel like I have a lot of backlight bleeding on mine. What's the best way to test, and does lightboost affect it?

Lightboost won't change the BLB pattern.

To test BLB just fullscreen a black image and it should be immediately visible. Something to keep in mind is that setting the Brightness too high will exaggerate BLB. I have my monitor at 48, and I think someone else said they set it in the 30s (probably in a dimmer room).
 
Started testing, but a quick FYI: I'm running 144Hz just fine using this $6.50 DisplayPort cable from Monoprice on my 670.

EDIT: IGNORE. Using this cable with Lightboost resulted in occasional BSODs upon application starting. Working over dual link DVI fixed the problem.
 
How do I get PixPerAn to run at 120Hz to test? Each time I open it, it's at 30fps...
 
I'm about to order this monitor. Without going through the whole thread, can someone tell me if there is an advantage to using HDMI vs DVI vs DisplayPort with this monitor, or will they all work equally well?
 
I'm about to order this monitor. Without going through the whole thread, can someone tell me if there is an advantage to using HDMI vs DVI vs DisplayPort with this monitor, or will they all work equally well?

You will need to use Dual Link DVI or DisplayPort for 120Hz/144Hz and 3D mode.
 
Hummm..... Very weird.... The new 1.2 DisplayPort cables are working @ 144Hz.... So far I've been playing Black Ops 2 multiplayer @ 5760x1080 for the past hour... Let's try BF3....

Edit.... after playing BF3 for about 45 mins, it seems to be working too.... WOOT!!!

Only thing is, I had a random whole-computer crash that hasn't happened before :-/
 