ASUS/BENQ LightBoost owners!! Zero motion blur setting!

I cannot find the BenQ XL2420TE on the BenQ website.
Check the USA website:
http://shop.benq.us/xl2420te.html

So how on earth can I make sure that I'll receive a T-version with the new specs when I order an XL2420T? How would the vendor even know?
This is completely crazy -- or am I missing something here?
It's crazy. You can't tell before buying. For a guarantee of 144Hz/ZeroFlicker, you should order the XL2420TE from Amazon or another source that advertises it as the XL2420TE.

A very interesting characteristic of the XL2420TE is its ability to dim over a much wider range -- useful for those people who find Brightness=0% on current 120Hz monitors still way too bright for them.
 
You were looking at BenQ's global site there. I think it's only the UK and Euro XL2420T that is upgraded. The XL2420TE is only available in America, and it would make no sense for the XL2420T there to be the same monitor but cheaper. This is still the only info on the US site - http://www.benq.us/product/monitor/xl2420t
 
For the record, I just purchased the BenQ XL2420TE. It is really nice! Right now I have it sitting next to an IPS (2209WA), which I'm selling Saturday though. The colors are a little off, yes, but they aren't that bad. Now one nice thing for myself: NO MORE EYE STRAIN!!!!!!!!

So worth it to me!
 
Seeing this is a ZeroFlicker monitor (PWM-free in non-LightBoost mode), and LightBoost uses flicker to eliminate motion blur, I'm pretty curious: Are you using LightBoost, or are you using the monitor in non-LightBoost mode?
 
Non-LightBoost mode. I did my research and knew going in that if I used that mode it would flicker. Now I may test it out just to see what the hubbub is all about.
 
One of these two situations will happen:

(1) You won't get eyestrain. No difference.
This will happen if your eyes were fine with high-refresh-rate CRTs in the past. The eyestrain comes from a different cause: PWM motion artifacts, excess brightness, and/or a bad spectrum (blue light, excess contrast, bad ambient lighting). There are people who get PWM eyestrain but do not get CRT / LightBoost eyestrain (above ~85Hz).

(2) You will get eyestrain.
This will happen if your eyes were _directly_ sensitive to CRT flicker / PWM flicker in the first place and you had eyestrain because of that. Simply discontinue use of LightBoost.

Several people, including myself and Vega, get less gaming eyestrain with LightBoost than without, since motion blur can tire the eyes if you track your eyes all over the place during motion (depending on the gameplay style, but usually during FPS games such as Battlefield 3, Counter-Strike, or Team Fortress 2). Today, the easiest way to enter the LightBoost world is ToastyX Strobelight, which lets you turn LightBoost ON/OFF with an easy keypress. The current LightBoost HOWTO has been updated with the Strobelight instructions.

Either way, glad you like the XL2420TE. So far, it's the world's lowest-eyestrain 120Hz monitor -- everyone who's had eyestrain with 120Hz monitors and has tried the XL2420TE has been happy with it from the perspective of lack of eyestrain, even if the colors aren't ideal. Expensive monitor, but worth it if the VG248QE gives eyestrain. BenQ hit it from multiple angles -- better LED spectrum, PWM-free backlight dimming, a wide dimming range all the way down to "barely visible", the reduced motion blur of 120Hz -- AND still providing the optional LightBoost for further blur elimination -- apparently giving monitor users a lot of choices to relieve monitor eyestrain. Now if we could only get a gaming monitor similar to the XL2420TE in VA or IPS format, that could be heaven.
 
What is the main difference between the XL2411T and the XL2420TE? Both are 144Hz, but here in the Netherlands one is nearly $150 cheaper than the other.....
 
The TE is flicker-free -- NO PWM. That's about the only difference I can see. So basically no eyestrain for those of us who are sensitive to it.
 
According to the BenQ website, the XL2411T is also Flicker Free:

http://www.benq.com/product/monitor/xl2411t/

If they are both Flicker Free and both 144Hz, is there any other reason for the big price difference? The only thing I can find is that the XL2420TE has a 3 button switch for custom profiles....
 
The XL2411T was not originally flicker-free. It appears the new flicker-free version is being sold under the same model number. Good luck finding one.
 
I own an XL2411T, and it uses PWM dimming.

Find the ones that say "ZeroFlicker". Only the XL2420TE is guaranteed to have PWM-free dimming.
 
Thanks for the responses, but I'm still not clear on any other differences between the two models. I have found stores here which list both as newer versions that are 144Hz and flicker-free, so that is fine; I am just trying to determine any other differences to account for the $150 price difference. I find it hard to believe that the custom profile switch could cost that much...
 
See my above post. The XL2420TE is a North American model; Europeans like us don't need it. The new-revision XL2420T AND XL2411T both use flicker-free panels and are 144Hz, which is why BenQ's EU websites list them as such.
 
Thanks, as I already mentioned I understand that. What I am trying to figure out is why the €100 price difference between the two (revised) models, given that they are both Flicker Free and 144Hz....
 
If you see the XL2420TE in Europe, it must be an import with a higher price. Comparing US stock, there's a premium on the XL2420TE because 'they can'. The American 20T is only 120Hz and uses PWM, AFAIK.
 
The XL2420T(E) has a fancier design than the XL2411T, more inputs (additional HDMI, DP), headphone jack, USB hub, and the wired "remote" control.

Regarding the version confusion about the upgraded T vs. TE. I wonder if somebody here can confirm that there are actually flicker-free AND 144Hz-capable XL2420T or XL2411T on the market for real?
BTW, Amazon does not sell the XL2420TE to Europe.
 
That would be nice to know. I know for sure the TE is, as I'm sitting in front of one now. I didn't notice all the extra inputs. With all those extra inputs, I wonder how they keep the input lag so low. Like, if I were to use the USB hub for my mouse and keyboard, would that raise the input lag?

Also, I'm on the American site and I can't see where the other models are listed as flicker-free, at least not in the store information.
 
With all those extra inputs I wonder how they keep the input lag so low.
Actually, keeping the input lag low is not so difficult if the timing of the input signal is compatible with the panel's timing range anyway, and the number of inputs has nothing to do with it. However, the manufacturer might decide to feed the signal through a processing chain which can handle all kinds of input signals, whether or not a particular input signal benefits from such processing. The more inputs there are the more possibilities the signal processing has to cover and the more exceptions might have to be implemented to efficiently bypass time-consuming processing stages when the signal actually does not need it. So providing low input lag is not a matter of speeding things up but of not slowing them down by unnecessary signal processing.

Like if I were to use the USB hub for my mouse and keyboard would that raise the input lag?
Maybe the lag for the mouse and keyboard is increased as the data travels through the hub, but the input lag of the display is not affected. Internally (and unless proven otherwise), the USB is just an add-on which works completely independently of all other components.

Also Im on the American site and I cant see where the other models are listed as flicker free at least not in the store information.
Yes, it is because in addition to changing the specifications without changing the model names, BenQ apparently also lists different specifications depending on the country the monitor is sold in. However, there seems to be no confirmation yet that any of the upgraded T models actually exist in the countries/continents for which BenQ lists them. So the only flicker-free model so far is the XL2420TE, and it is only available in the U.S.
 
StrobeMaster said:
The XL2420T(E) has a fancier design than the XL2411T, more inputs (additional HDMI, DP), headphone jack, USB hub, and the wired "remote" control.

Thanks, that's more what I was after!

I have found the XL2420T Rev 2.0 (or TE in the U.S.) in the UK:

http://www.overclockers.co.uk/showproduct.php?prodid=MO-088-BQ&groupid=17&catid=510

and one E-tailer that I use a lot here in the Netherlands has the XL2411T listed as 144Hz, which I believe is also the Flicker Free version (although I am waiting on an email from them to confirm):

http://www.4launch.nl/shop/get/p-4-productid-141595

The TE is also available in Australia:

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1094&products_id=24851
 
I just got a response back from Benq regarding the XL2411t.

Thank you for contacting BenQ.

All the monitors XL2411T manufactured after July 2013 are implemented with the new flicker free technology and they will retain the same model name.

Kind Regards

Ugo Turcio
 
So it appears all of the monitors that BenQ makes will eventually be ZeroFlicker -- probably even including the XL2720T. I guess it's a matter of looking at the manufacture dates.

One could technically resell and repurchase their monitor, to get the new PWM-free technology.
 
I just got a new Asus VG248QE (manufactured in June 2013) and was positively surprised to measure a PWM frequency of 864Hz.
When LightBoost is unlocked (w/o being active though), the PWM frequency depends on the refresh rate and is always 6 times the refresh rate. Note that 864Hz happens to be 6 times 144Hz. Moreover, the PWM signal is then phase-locked to VSYNC, which helps to make things look even better.
Apparently, older VG248QE units use a lower PWM frequency, something like 360Hz according to this review from June 2013 on prad.de.
I consider 864Hz basically flicker-free, especially when compared to the 180Hz used in the BENQs which do not have the flicker-free option. I wonder if the "flicker-free" BENQs are really continuous-mode or just use a higher PWM frequency. Anyway, I doubt that anyone would be able to tell the difference between 864Hz and continuous-mode just by looking at it.
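The 6x-refresh relationship described above is easy to tabulate; here is a minimal sketch, assuming the multiplier of 6 holds across refresh rates (an observation from this one unit, not an official spec):

```python
# Sketch of the reported relationship on an unlocked VG248QE:
# PWM backlight frequency = 6 x refresh rate (observed, not an official spec).
def pwm_frequency_hz(refresh_hz, multiplier=6):
    """PWM backlight frequency implied by the 6x-refresh observation."""
    return refresh_hz * multiplier

for hz in (60, 100, 120, 144):
    print(f"{hz}Hz refresh -> {pwm_frequency_hz(hz)}Hz PWM")
# 144Hz refresh -> 864Hz PWM, matching the measured value above
```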
 
It's basically flickerfree, but there are extreme cases where it is detectable.

People can detect 864Hz PWM artifacts in certain situations:
(1) Brightness is dimmed down to near-minimum settings (for large dark duty cycle in the PWM dimming); AND
(2) Fast synchronized panning such as www.testufo.com/photo#photo=toronto-map.png&pps=1920 or www.testufo.com/blurtrail

It is more comfortable to watch the TestUFO Panning Map Test without PWM than with PWM, even at 864Hz. A soft blur is always more comfortable to look at than a jagged blur (example of jagged motion blur). (Incidentally, some people get eyestrain simply from soft motion blur too -- e.g. certain testimonials.) So I still do not think 864Hz is sufficient, at least during FPS gaming, where people move their eyes faster than 864 pixels per second (less than half a screen width of panning at framerate=Hz motion). So when eyes track at about ~1700 pixels/second, you're going to notice the strobe multi-edge artifact 2 pixels apart. The mathematics is rather simple -- 864Hz PWM means dotting every 1 pixel at 864 pix/sec, every 2 pixels at (2*864) pix/sec, every 3 pixels at (3*864) pix/sec. This is easily confirmed with a motion test.
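The dotting arithmetic above boils down to dividing eye-tracking speed by PWM frequency; a small sketch of that relationship (the tracking speeds are illustrative):

```python
def strobe_dot_spacing_px(eye_speed_px_per_s, pwm_hz):
    """Spacing of the repeated edges ("dotting") left by PWM strobing
    when the eye tracks motion at the given speed."""
    return eye_speed_px_per_s / pwm_hz

# The arithmetic from the post, at 864Hz PWM:
print(strobe_dot_spacing_px(864, 864))       # dots every 1.0 px at 864 px/sec
print(strobe_dot_spacing_px(2 * 864, 864))   # every 2.0 px at 1728 px/sec
print(strobe_dot_spacing_px(4000, 864))      # ~4.6 px at 4000 px/sec panning
```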

I still don't think 864Hz should be the final frontier, especially for wide fields of view -- e.g. VR goggles -- where a slow 30-degrees-per-second head turn on a 1080p VR panel can create panning motion over 2000 pixels/second, enough to make PWM artifacts noticeable / bothersome to human vision, unfortunately still creating eyestrain at 864Hz (not from the flicker, but from harsh motion artifacts). There are humans, Blur Busters has found, who get eyestrain from motion artifacts rather than from flicker. 864Hz should NOT be the final frontier, and can still potentially create "eyestrain by motion artifact", even if not "eyestrain by flicker". I pretty much guarantee it: far fewer reports than at 360Hz, but it won't zero out the eyestrain reports. (I am backed by enough user reports to be willing to make a huge bet on this.) Incidentally, there is also a segment of the population that gets eyestrain from PWM, but no CRT eyestrain and no LightBoost eyestrain -- an interesting segment of the human population that needs further scientific study. (Blur Busters believes this eyestrain is caused by motion blur, and sometimes even by artifacts inside the motion blur.) Usually the eyestrain is caused by flicker, but apparently flicker isn't the only cause of eyestrain from PWM.

Yesterday, video and movies had a softness, and we didn't have high-def. Today we have computers, graphics & high resolutions (soon to be 4K). Human sensitivity to motion imperfections is amplified as a result by the sharp graphics and wider vision coverage. 864Hz PWM means strobing ~5 pixels apart at 4000 pixels/second of horizontal panning motion on a 4K display that might be strapped to your head in tomorrow's VR displays, as we very slowly approach Holodeck capability. Content is also getting faster: unlike slow sports panning, we've got fast FPS panning, racing panning, and virtual-reality panning. PWM sensitivity is thus amplified even further. So the PWM threshold needs to be raised _very_, _very_, VERY dramatically (>10,000Hz minimum, preferably 20,000Hz). 864Hz will still create eyestrain, based on my pre-existing confirmations.

Due to the relationship between motion & strobing creating stroboscopic issues, humans can indirectly tell that something is strobing, up to 10 kilohertz:
stroboscopic_detection.png

(percent flicker = the flicker duty cycle)

This is in the lighting industry, and this is why the lighting industry now uses 20 kilohertz electronic ballasts for fluorescent lights.
Citation: http://www.lrc.rpi.edu/programs/solidstate/assist/pdf/AR-Flicker.pdf
My eyes have also confirmed this -- I've detected 5,000Hz with my bare eyes in the fast-moving "strobing-LED-in-an-Arduino" test, or the eye-roll test: moving at an apparent 5 meters per second relative to my vision, the strobe creates an "array effect" of dotting 1 millimeter apart (a 5000Hz flickering LED combined with 5000mm/second motion = 1mm dotting).

Therefore, the same old-fashioned knowledge should also apply to displays' own PWM dimming circuits -- PWM dimming should be at similar frequencies, e.g. 20 kilohertz, to eliminate issues for far beyond 5-sigma of the human population. I believe the monitor industry SHOULD steal a page from the fluorescent-ballast lesson, and maybe governments SHOULD legally mandate a 20kHz minimum for PWM dimming. This includes monitors. This includes LED taillights (I can easily detect 2000Hz PWM taillights at night on the freeway / highway / autobahn). It annoys the hell out of me.
Also see Ask Slashdot: Does LED Backlight PWM Drive You Crazy?. I agree too!

I still keep running into LED fixtures that flicker at 120Hz, and I can easily tell simply by saccading my eyes (the scenery goes into a stroboscopic blur rather than a continuous blur). Often an engineer tests 1000Hz PWM dimming of an LED-driven product (lighting, monitors, televisions) on 100 people, nobody complains, but then when it goes out to millions of people, you get hundreds of complaints. Such engineers thought it was common sense that 1kHz was enough when it was not; 100 people isn't 5-sigma of the population. Based on Blur Busters' existing research, I can guarantee that 1kHz (with a large "off" duty cycle) will still not be fully strain-free at 5-sigma of the human population, if you started such a controlled study.

StrobeMaster, as you are a vision researcher / scientist, I hereby invite you to eventually study this interesting topic if your work allows you to do so. Combine the due-diligence of the old lighting research, with the due-diligence of PWM research.
You'd get a lot of KickStarter funds if you increased the umbrella to include PWM headlights, PWM dimming, PWM LED bulbs. You'd even be getting a 3-figure donation from me, personally. And if you can't start a study, tell another scientist.

Some scientist worldwide, please start a study on this.
 
The flicker-free BenQs would have to be continuous voltage, or that would be straight-up false advertising. There are a lot of monitors that flicker at many thousands of Hz, but they are still considered PWM monitors.
 
Oscilloscope tests have confirmed the BenQ is definitely continuous voltage. There's no PWM involved on the BenQs (and good riddance).

Strobing should only be aimed at motion blur reduction, not for dimming.
 
I admit I overlooked the strobing effect (shouldn't have happened to me as "Strobe"Master). So even if 864Hz might be considered flicker-free it is not strobe-free.
Thank God there is an optimum that can be, and has been, implemented: a continuous backlight. If there weren't, we would have to debate forever how high we needed to go to make everybody happy.
 
Very noob question here, but how do I check to see if my VG248QE has a PWM frequency of 864Hz? Special equipment required or is it findable with software/etc?
 
Indeed, I think both you and I agree that there's no finite limit for the stroboscopic effect (theoretically).

Imagine a tiny ultrabright LED blinking at 1 million Hertz PWM.
Strap it to the side of a rocket sled going supersonic at 1000 meters per second (3600 km/h).
Have a human stand one meter to the side of the track, staring stationary ahead directly through the expected trajectory of the 1MHz blinking LED.
Zoom the rocket sled past.
The human will detect the stroboscopic effect (dotting) of one million Hz, 1mm apart. (1,000,000 mm/sec divided by 1,000,000 Hz = stroboscopic dots 1mm apart.)
But obviously, that's not practical. The human would probably be blown backwards by the whoosh (unless well protected behind a strong wall with a plexiglas window!).

Or likewise, a wagon wheel spinning at 1 million RPM will still clearly exhibit the wagon-wheel effect under an ultrashort-flashing (nanoseconds) strobe backlight. But obviously, that's not practical either, since the wheel would be ripped apart by centripetal forces.

Discussion over. I've hereby proven there's no finite limit. It's all a matter of the numbers you punch into the variables. ;)

Realistically and practically, I do think about 10kHz+ is the practical minimum that PWM should use. Fortunately, at least one monitor actually has 10kHz PWM, according to a measurement by one of the familiar sites (prad / pcmonitors / TFTCentral). Apparently, at least one manufacturer thinks 5-digit PWM frequencies are easier to do than DC.
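Turning the thought experiments around, the same spacing formula gives the minimum PWM frequency for a target dot spacing; a sketch (the 0.2px tolerance is an assumed, hypothetical threshold, not a measured one):

```python
def min_pwm_hz(motion_speed, max_dot_spacing):
    """Lowest PWM frequency that keeps stroboscopic dot spacing at or
    below max_dot_spacing (both args in the same length unit)."""
    return motion_speed / max_dot_spacing

# VR-style panning at 2000 px/sec with an assumed 0.2 px tolerance:
print(min_pwm_hz(2000, 0.2))   # 10000.0 Hz -- the ~10KHz+ figure above
# The eye-roll LED test: 5000 mm/sec motion, 1 mm dotting observed:
print(min_pwm_hz(5000, 1.0))   # 5000.0 Hz, matching the 5,000Hz detection
```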
 
Very noob question here, but how do I check to see if my VG248QE has a PWM frequency of 864Hz? Special equipment required or is it findable with software/etc?
Set the brightness of your monitor rather low and take a picture while moving the camera laterally along the screen. Count the ghost images in that picture and multiply the number by the inverse of the shutter speed used for the picture (the shutter speed or exposure time can also be found in the EXIF data saved with the picture). This should give you a rough estimate of the PWM frequency. Say the shutter speed was 1/120 s and there were 7 ghosts, then you'd get 120Hz*7=840Hz which would indicate that your monitor uses the high PWM frequency.
Note that for this crude test you don't need to put much effort in maintaining a specific speed when moving the camera, as long as you can identify the ghosts. Of course, it helps if the screen content is static while doing this test, with a distinct small object on an otherwise uniform background.
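The ghost-counting estimate can be written down directly; a sketch using the numbers from the worked example above:

```python
def estimate_pwm_hz(num_ghosts, shutter_s):
    """Rough PWM frequency: ghost images counted in a panned photo,
    multiplied by the inverse of the shutter speed."""
    return num_ghosts / shutter_s

# 7 ghosts at a 1/120 s shutter, as in the example:
print(round(estimate_pwm_hz(7, 1 / 120)))  # 840 (Hz), suggesting the high-PWM backlight
```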
 
Well, I think I answered my own question. My build date is AUG 13, and I am guessing that, since it is more recent than the JUNE build above with 864Hz PWM, I am sitting pretty.
 
So with the difficulty of maintaining such a high frame rate for LightBoost, is flicker-free a worthwhile alternative?
 
Future strobe backlights shouldn't have a vendor lock on the refresh rate range. LightBoost is simply vendor-limited to strobe at 100 to 120Hz -- like owning a CRT that only operates at 100 to 120Hz. Strobe backlights can technically operate at lower refresh rates (at the tradeoff of increased flicker, like a lower-refresh-rate CRT).

Someday we'll have the frame length of LightBoost (1/700 sec) combined with flicker-free operation. But that will require 700fps@700Hz to achieve totally flicker-free LightBoost-quality motion. So for a while, we need strobing as a cheap method of motion blur elimination. According to scientific references, we cannot have our cake (motion blur reduction) and eat it too (flicker-free), unless we use an insanely high Hz -- to get the really short frame visibility time, i.e. really short samples in the sample-and-hold effect. (Otherwise, you get the motion blur you see at www.testufo.com/eyetracking during non-LightBoost mode.)

Yes, flicker free is definitely a worthwhile alternative when motion blur reduction isn't important. It is a better mode for static content, such as photography, software development, graphics art development, etc.

If you want 60Hz LightBoost, check out Sony's Motionflow Impulse, which is Sony's version of LightBoost for consoles. We need to see this built into computer monitors too, without the artificial LightBoost 120Hz vendor restriction.
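The sample-and-hold tradeoff above reduces to multiplying eye-tracking speed by frame visibility time; a minimal sketch (the 1000 px/sec tracking speed is illustrative):

```python
def motion_blur_px(eye_speed_px_per_s, persistence_s):
    """Perceived blur width under sample-and-hold: how far the eye
    moves while one frame stays visible."""
    return eye_speed_px_per_s * persistence_s

speed = 1000  # px/sec of eye tracking, for illustration
print(motion_blur_px(speed, 1 / 120))  # ~8.3 px: flicker-free 120Hz, full persistence
print(motion_blur_px(speed, 1 / 700))  # ~1.4 px: LightBoost's 1/700 sec strobe length
# Getting 1/700 sec visibility without strobing is the 700fps@700Hz requirement.
```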
 
So between the BenQ and Asus 27" models, which is better for always-on LightBoost and highest brightness? I do some gaming but use the HDMI as a TV (for PS3) as well.

I had fantastic experience with my now dead S27A750D which did a great job at both.

Edit: I am in the red (AMD) camp, if that makes a difference. I know Sammys work with both, but it seems like Samsung stopped making anything 3D or 120Hz like the 750D or 950D.
 
I'm having issues with ToastyX's util... the only way I can get LightBoost enabled is by starting the 3D Vision setup wizard, which seems to make it persistent, but not across reboots for some resolutions (100Hz).

Am I doing it wrong? How are AMD users initializing LightBoost? Thanks
 
You should report the issue at the Monitor Tests forum (home of ToastyX Strobelight) -- tell ToastyX which monitor you have, and which AMD card you have. Sometimes there's an issue with Strobelight successfully doing the LightBoost unlocking. (There's a one-time nVidia encrypted lock that needs to be unlocked before LightBoost works.)
 
Done, thank you.
 
Tom's Hardware's review of the VG248QE is up. However, they don't seem to believe in LightBoost, according to the forum board comments:

Christian Eberle posted on TomsHardware:

Wow, you didn't test using lightboost in 2D for better motion clarity?

Frankly, there was no need to improve motion clarity because we didn't see any motion blur at all. The super-fast screen draw time means you don't have to flash the backlight (thereby reducing light output) to combat this issue. Even less-responsive panels these days don't exhibit much motion blur.

-Christian

You LightBoost users might want to follow up on the Tom's Hardware reviewer's post in the Tom's Hardware comments section.
 
Very noob question here, but how do I check to see if my VG248QE has a PWM frequency of 864Hz? Special equipment required or is it findable with software/etc?
Another good test -- view www.testufo.com/blurtrail while your monitor is at minimum brightness.

If you see one moving line during maximum brightness, and multiple moving lines during minimum brightness, you have PWM.
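If the test does show multiple lines, their count gives a rough PWM estimate too; a sketch, assuming one line copy per PWM pulse per refresh (the example numbers are the VG248QE-style figures from earlier in the thread):

```python
def expected_line_copies(pwm_hz, refresh_hz):
    """Approximate number of repeated lines per refresh in the
    blur-trail test when the backlight is PWM-dimmed."""
    return round(pwm_hz / refresh_hz)

print(expected_line_copies(864, 144))  # 6 line copies per refresh
print(expected_line_copies(360, 120))  # 3, for the older 360Hz backlights
```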
 
Hah... no blur. What are they smoking? That is pretty sad to hear from a "technical" source. I've heard a lot of bad things about Tom's Hardware reviews and benchmarks for a long time though. It hasn't been a solid source of benchmarks for the enthusiasts I know in a very long time: tests pushed to preconceived results, omitting factors and tests that more thorough sites use, overlooking important factors in their tests, etc.

Mark's replies were very clear, accurate, substantive, and sourced. Unfortunately a lot of comments sections are full of disregard and omissions, which is one reason I prefer HardForum, since I don't find that to be the case so much here.

You can get a Korean 2560x1440 pretty cheap for all the IPS desktop benefits outside of gaming. Some people are getting 4K TVs as monitors now, but I hear the colors aren't as good as a good IPS. That's great for everything outside of games.

For gaming, I don't get people who want 60Hz: horrible smearing and obliterating of detail outside the "shadow mask" of everything in the entire viewport, half (or fewer) of the scene's action slices shown per second, and half (or worse) the smoothness/fluidity of motion. The UFO test may be good as a plain example, but in actual 1st/3rd-person games the entire viewport -- CGI scene architecture and "geology", all onscreen creatures and objects, and all high-detail textures and shaders -- will be smeared out.

Many also demand resolutions that cripple fps for gaming. Even 2560x resolutions are prohibitive on $750 - $1k in GPUs if you want high fps at medium, high, or high+/custom settings. 1080p looks like the sweet spot for enthusiast-level budgets without going extreme ($1500 - $2k+ in GPUs alone). You can get 100 - 120+ fps in a lot of games with a GTX 780 or a Titan at 1080p with the settings very high. On BF3 you can get 120fps with a GTX 680 on medium, for that matter.

I find it very aggravating that several review/benchmark sites are not even including 1080p in their tests lately, by the way.

I wish Blur Busters would consider doing some of its camera motion-tracking tests (60Hz, 120Hz, 144Hz, LightBoost modes) on some popular games, to show the effect of the whole-viewport smear on very high-detail scenes/textures/shaders, and post the pictures in addition to the UFO photos. I think it would provide further enlightenment.
Unfortunately, unless you have a 120Hz-or-greater monitor, such examples are limited to still shots showing the blur amount, and cannot show the increased action slices or the aesthetic smoothness of motion outright.

A quote of something I wrote in a different forum to similar effect, in response to a Steam Box GPU-choices discussion which also mentioned the PS4, etc.:
Personally, I want 100fps+ on a 120Hz monitor for my games (over 120fps optimally). Whatever my GPU budget is vs. how demanding a game is limits what video quality settings I will use. For me, sub-100fps without a 120Hz monitor is not an "ultra" graphics/display experience at all. It is not max settings -- or perhaps not "max configuration" and maximum presentation of the game world -- to me.
At high fps, 60Hz monitors/TVs blur the entire viewport during FoV movement twice as much as 120Hz, and about 2.4x as much as 144Hz (put differently, 120Hz cuts the blur by 50% and 144Hz by ~58%). Low fps and/or low Hz also show half or fewer of the most recent action slices per second, which means lower accuracy, lower motion tracking, and lower aesthetic smoothness/fluidity of motion. Maximum blur and the worst aesthetic motion smoothness (and reduced accuracy) is not the best/max visual presentation of a game.
 