Acer 120Hz 23.6" 1920x1080 LCD... but it's orange!

"starting to get", not "has"

Digitalversus has one and I'm just being naive and assuming there are a few more in the wild.

Gotcha. Well, count me as excited. If the input lag is low, I want one of these ASAP!
 
Can't wait to buy this monitor, dude. My GTX 280 was giving me pink screens, so I should have that sorted before New Year's; hopefully this monitor arrives within a week of 2010.
 
I can't wait, but it seems I'll need a new video card too if I want a solid 120 FPS.
 
I had the 120Hz Samsung and I can confirm it's way better for Counter-Strike: Source. You don't even need vsync on, because with a GTX 280 I was getting 200+ FPS at 120Hz... But one time I accidentally had it on 60Hz, probably because I restarted, and I noticed it was playing badly, so I checked my Windows refresh rate setting and put it back at 120Hz, and then it felt good again, lol.... 60Hz is shit compared to 120Hz.
 
If you haven't read it already, here's the link to the first test of this Acer, which looks very promising so far: http://www.google.com/translate?hl=...ww.lesnumeriques.com/article-240-7268-38.html or do the comparisons here: http://www.digitalversus.com/duels.php?ty=6&ma1=38&mo1=626&p1=7268&ma2=41&mo2=596&p2=6823&ph=1

It beats the VX2268wm in ghosting by about the same margin as the VX2268wm beats the 2233rz, and it comes better factory color-calibrated than either of the two! Now we just need to await some input lag tests...

EDIT: Damn, I just checked the "Input lag vs CRT" test on digitalversus... doesn't look too good... but then again, digitalversus' method of testing input lag doesn't usually match other input lag tests I've seen; for example, they state the VX2268wm has 14ms average input lag vs 12ms average on the 2233rz. Either digitalversus' input lag test method is way more accurate than other tests, or it's flawed in some way.
 
Yeah, digitalversus has the prices wrong as well. I dunno how they tested it, since it's not out yet.
 
Digitalversus is a site that does reviewing for a living, so it's not like receiving a sample for review before units appear in stock is anything special for them. :p

overclockers.co.uk is the only site so far that lists these for preorder, at £299.99 incl. VAT or £260.86 excl. VAT: http://www.overclockers.co.uk/showproduct.php?prodid=MO-049-AC

That should translate to around 330 EUR in Germany; I expect 350 EUR in my home country Finland (VAT incl.), and probably $399.99 at Newegg. It could very well end up even cheaper at the egg, seeing what a ridiculously low price the VX2265wm sells for over there ($249), and in Finland you can buy the VX2268wm for 299 EUR, lol. So if you're lucky it might even end up at $350 in the US, but $399 sounds more likely for a new product (£260 = $416 USD, but the UK is expensive compared to the USA).
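
For anyone who wants to sanity-check that math, here's a quick sketch (UK VAT was 15% at the time; the exchange rate is a rough late-2009 figure, so treat both as stated assumptions):

# Sanity check of the preorder price math.
UK_VAT = 0.15                          # UK VAT rate in late 2009
price_incl_vat_gbp = 299.99

price_excl_vat_gbp = price_incl_vat_gbp / (1 + UK_VAT)
print(round(price_excl_vat_gbp, 2))    # 260.86 -- matches the overclockers.co.uk listing

GBP_TO_USD = 1.60                      # rough late-2009 exchange rate, assumption
print(round(price_excl_vat_gbp * GBP_TO_USD))  # 417 -- close to the $416 quoted above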
 
Will this monitor work fine with ATI cards?

I remember reading about people having problems with 120Hz monitors like the 2233rz on newer ATI cards, with a lot of flickering issues.
 
Yes, and that's not a specific monitor issue but an ATI driver issue, so the flicker problem should be there with this Acer as well; at least I wouldn't buy one of these on an ATI card and just hope it would work. If I absolutely had to have a 120Hz LCD, I'd either wait until Fermi arrives or look up some Nvidia GTX 260/275/280/285/295 offers in forums' sales sections, as lots of people are selling those cheap at the moment. I personally just bought a lightly used, nicely clocking EVGA GTX 280 cheap when my GTX 260 was starting to fail.
 
I think there's a workaround for ATI cards where you have to change the clock speeds at idle. During gaming there shouldn't be an issue using 120Hz. I was hoping ATI would have resolved this already, though. :(
 
If you haven't read it already, here's the link to the first test of this Acer, which looks very promising so far: http://www.google.com/translate?hl=...ww.lesnumeriques.com/article-240-7268-38.html or do the comparisons here: http://www.digitalversus.com/duels.php?ty=6&ma1=38&mo1=626&p1=7268&ma2=41&mo2=596&p2=6823&ph=1

It beats the VX2268wm in ghosting by about the same margin as the VX2268wm beats the 2233rz, and it comes better factory color-calibrated than either of the two! Now we just need to await some input lag tests...

EDIT: Damn, I just checked the "Input lag vs CRT" test on digitalversus... doesn't look too good... but then again, digitalversus' method of testing input lag doesn't usually match other input lag tests I've seen; for example, they state the VX2268wm has 14ms average input lag vs 12ms average on the 2233rz. Either digitalversus' input lag test method is way more accurate than other tests, or it's flawed in some way.


Nice, thank you for the full review from digitalversus.


I just read the short review they did, and they say it is the most responsive monitor they have ever tested, beating out the VX2268wm, which is really great. I too was worried when I saw the vs-CRT thing, but I'm relieved now after reading the actual review.
 
There is NO advantage in 120Hz, your eye doesn't work that fast. The ONLY thing 120Hz is good for is 3D, you idiots; 60Hz for each eye = good 3D quality.

Retards.
 
There is NO advantage in 120Hz, your eye doesn't work that fast. The ONLY thing 120Hz is good for is 3D, you idiots; 60Hz for each eye = good 3D quality.

Retards.

chillpill.jpg

The 60/75Hz limitation is THE reason some people kept using CRTs until 120Hz LCD monitors finally arrived. I'd consider whoever hasn't discovered the awesomeness of 120Hz gaming lucky; it has certainly made my life easier. Since I got used to how smooth 100Hz gaming felt on CRTs, there's no going back to 60Hz. Sad, really, because there are tons of good-quality 60Hz LCDs out there today, like the Dell 2209WA for example, but unfortunately it isn't 120Hz. I'd use any 120Hz LCD rather than any 60Hz one, no matter the price; that's how important 120Hz is for me. ^^
 
There is NO advantage in 120Hz, your eye doesn't work that fast. The ONLY thing 120Hz is good for is 3D, you idiots; 60Hz for each eye = good 3D quality.

Retards.

OMG, you're so clueless about how things really work.
 
There is NO advantage in 120Hz, your eye doesn't work that fast. The ONLY thing 120Hz is good for is 3D, you idiots; 60Hz for each eye = good 3D quality.

Retards.
Obvious troll is obvious. I don't think anyone would disagree that gaming is the primary reason to get 120Hz. In that regard the difference is significant; even 75Hz vs. 60Hz is a very noticeable improvement to me.
 
Well, I just saw Avatar in IMAX 3D...........

And now I am selling both of my monitors just to buy this thing.

And maybe an extra GTX 260, just to make sure.
 
Wrong. The human eye does perceive the fluidity of really fast-moving objects way beyond 60Hz. 60Hz is very stuttery compared to 85Hz+ refresh rates.

Yeah, the 120Hz on TVs certainly helps smooth things out. If you watch a side-by-side demo comparing 60Hz to 120Hz, it makes you wonder how we ever put up with it.

What gets me a little confused is that most TVs are doing true 120Hz compared to the few monitors that do 120Hz, yet I believe that a DLP TV will let you use Nvidia 3D just fine. :confused:
 
There is NO advantage in 120Hz, your eye doesn't work that fast. The ONLY thing 120Hz is good for is 3D, you idiots; 60Hz for each eye = good 3D quality.

Retards.

Someone needs a temp ban to cool down, sheesh. It baffles me how many people make this claim without having actually tried an above-60Hz refresh rate.

Since around the release of HL1 I've been messing around with monitor refresh rates, and I can only say I miss my CRT, a lot. We'll see how this monitor pans out, though.
 
Does ATI still have 120Hz problems? /sigh...


Yes, but only in 2D mode. That's because the GPU core clock speed is so low at idle that it causes problems in 120Hz mode. If I understood correctly, it's a technical problem with the card itself and not a driver one. However, just "overclock" your GPU clocks a bit in 2D mode and it should be fixed. I think ATI refuses to update their drivers to make the overclocked numbers stock, because it would make the card draw more power in 2D mode than they originally advertise.
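
To spell out the logic of that workaround, here's a minimal sketch. The clock values are made-up placeholders (what you'd actually raise depends on your card and your overclocking tool); it just illustrates the idea of keeping the 2D clocks up whenever a 120Hz mode is active:

# Sketch of the ATI 120Hz flicker workaround: don't let the card drop to its
# deep idle (2D) clock while a 120Hz desktop mode is active.
# Clock values are made-up placeholders, not real numbers for any specific card.
STOCK_2D_CLOCK_MHZ = 157
RAISED_2D_CLOCK_MHZ = 400

def idle_clock_for(refresh_hz):
    """Pick the 2D core clock to use for a given desktop refresh rate."""
    if refresh_hz >= 100:
        return RAISED_2D_CLOCK_MHZ  # raised idle clock keeps the 120Hz signal stable
    return STOCK_2D_CLOCK_MHZ       # stock idle clock is fine at 60/75Hz

print(idle_clock_for(120))  # 400
print(idle_clock_for(60))   # 157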
 
What gets me a little confused is that most TVs are doing true 120Hz compared to the few monitors that do 120Hz, yet I believe that a DLP TV will let you use Nvidia 3D just fine. :confused:

I don't know what you think it means, but "true 120Hz" means (to me, at any rate) that the display accepts 120Hz from a source outside the TV, such as a video card. Only three LCD monitors do this (Samsung 2233rz, Viewsonic VX2265wm, Viewsonic VX2268wm). No LCD TV does this.

A DLP is not an LCD TV. Completely different technology. I don't know enough about it to comment. Edit: after some brief googling, it seems like DLPs would be ideal for computer gaming monitors. They seem to start at 40+", and don't tend to be particularly expensive. There must be some good reason we don't talk about them at all in this forum?
 
Evilsofa> Because they're 40"+, they weigh a lot and take up a lot of space, and they still probably cost a lot more than the average 20-30" LCD. For 40" you'd want a somewhat higher resolution, but even so you'd need to sit at quite a distance to make usage comfortable, I think. If they had made 30" DLP TVs, maybe it would have been more common. Oh, and these are pretty much only available in the US market, sadly; they are almost non-existent in Europe. I haven't seen anyone try hooking a 120Hz DLP TV up to a computer yet to see if it actually works at 120Hz, and not all of them are 120Hz either, but I know at least some of Mitsubishi's are.

What I'd want is IPS + LED + 120Hz, preferably in 16:10 aspect ratio, at 22" @ 1680x1050 or 24" @ 1920x1200, with input lag <15ms. But when that day comes, I'll probably be fed up with gaming already.
 
Damn, I just read one of the reviews and it can't do 120Hz through HDMI.

I'm definitely waiting till CES '10 to see what the Blu-ray 3D specs are before I go ahead and purchase this monitor. I was hoping to use this monitor with my PS3, but.....
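
For what it's worth, the HDMI limitation comes down to raw bandwidth. A rough pixel-clock estimate (the ~25% blanking overhead is an approximation) shows why 1920x1080 @ 120Hz needs dual-link DVI:

# Rough pixel-clock estimate for 1920x1080 @ 120Hz.
width, height, refresh_hz = 1920, 1080, 120
blanking_overhead = 1.25  # ~25% extra for blanking intervals; approximation

pixel_clock_mhz = width * height * refresh_hz * blanking_overhead / 1e6
print(round(pixel_clock_mhz))  # ~311 MHz

# Single-link DVI tops out at a 165 MHz pixel clock (and these monitors' HDMI
# inputs are similarly limited), so 120Hz needs dual-link DVI (2 x 165 MHz).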
 
Maybe I can help clear this up. Or maybe not, lol. But I have worked for some of the top 3D game companies, like Interplay, so I've heard many conversations on this. The brain only detects about 8 to 12 frames per second, and can clearly see that anything below that is nothing more than flipping pictures at around 6 FPS. About 8 FPS is all that is needed to give the illusion of motion, since the brain starts to average all the images together. Back in the day when Descent was big, the very first version was set to hit a minimum of 8 FPS, and that wasn't great, but it was OK and did look fluid; it just wasn't perfectly fluid. Around 15 FPS it was really smooth. The reason is partly to do with how even the frame rate is during gameplay, and that depends on the 3D engine along with the visibility scheme. If you jump from displaying 800 polys to 4,000, that makes the frame rate choppy, and the eye sees that right off. Actually, most of the time we notice the motion of models becoming jerky because they get rendered more quickly and their movement changes speed. But that jerkiness isn't due to lack of frame rate; it's due to the 3D engine speeding up and slowing down (depending on the technology). Again, being locked to any frame rate makes a game look very smooth. But locking is hard..

If we assume that we are locked to a frame rate, and it's smooth, then at about 10 to 15 FPS you get very fluid motion. At 15 you can barely discern jumping in slow-moving scenery like backgrounds, or even in fairly quick-moving objects, but in a fast-moving model you may see some slight jumping. Moving up toward 20 begins to erase the problem completely. Each time you move up 5 FPS, the details that seem to jump get smaller and smaller, and less significant or noticeable. So moving from 8 to 15 FPS gives a huge improvement; moving from 15 to 30 gives a smaller improvement, but it still helps. Moving beyond 30 does practically nothing, if anything at all. Again, this assumes we are locked at 30 and the frame rate will not dip.

But the truth is, in many games the frame rate DOES dip. Someone may get 80 FPS in one part of the game, then have it dip down to 10 FPS when they turn their view in the world. So one of two things happens: either the frame rate dips far enough that they see jumping, or, because the overall speed of objects moving in the game changes so drastically during the dip, the player senses that and assumes it's the frame rate. It's partly dependent on how the game was created, but this is why someone who peaks at 150 FPS can sometimes kill other people in the game faster: they are getting more hits in, more accuracy, etc. But it's NOT because they can see the true frames per second; it's just how well the 3D engine is responding, and gamers can feel it. They are wrong when they think they can see 60 FPS vs 150 FPS. They can NOT. They can only sense relative world and object movement. And a game with poor frame rate management, with the FPS jumping everywhere, usually doesn't play well. Flight Simulator X has this problem with autogenerated trees on: it will create a spurt of trees and you can see them filling in. Looks ugly, lol. But again, it's not the total frame rate that is the culprit, and locking FSX to, say, 30 FPS helps a lot.

So again, 30 FPS, if it's always the same, is plenty in terms of fluid motion. But even if you want to be sure, say 60. And where does that number come from? It was a nice number to coincide with monitor refresh and to avoid video tearing: as the screen is redrawn 60 times a second, it looks better if the frames are also fed in at that rate, and it can prevent tearing during page flipping of graphics. It looks really bad when the screen updates while only half the frame is drawn. But now the CRT is out and LCD is the rage, and tech is changing, blah blah, lol, so we don't have to worry as much about the refresh rate being harsh, or flickering, etc.

In short, a high rate of 140 FPS for some gamers can mean that when they do dip down, it doesn't go down as far as for someone who peaks at 80 FPS. But again, this is a game performance issue, nothing to do with the rate at which we perceive motion. Getting back to the start: for the most part, you can not appreciate over 30 FPS, and especially not over 60, if it's pure and locked down. It's only a display of computer power, which I think is important. If you've got the power, it always helps. Just not how you may have originally thought.....retards.
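
Whatever you make of the perception argument, the per-frame time budget behind that "locked frame rate" point is easy to put in numbers (pure arithmetic, a quick sketch):

# Per-frame time budget at common frame/refresh rates.
for hz in (15, 30, 60, 85, 100, 120):
    print(f"{hz:>3} fps -> {1000 / hz:6.2f} ms per frame")

# 30 fps leaves 33.33 ms per frame, 60 fps 16.67 ms, 120 fps 8.33 ms.
# A 'locked' rate means every frame fits its budget; a dip from 80 fps to
# 10 fps blows the budget by ~8x, which is the hitching people actually see.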
 
So WildMonkey, according to you, the LCD monitors that were out 7 years ago are comparable to the ones today in terms of gaming and responsiveness? I mean, they all run at 60Hz, and therefore there is no difference between them....

120Hz is not only helpful for smoother gameplay, but also for achieving fewer artifacts, less ghosting/smearing, and lower input lag.

Read the reviews on these monitors

http://www.xbitlabs.com/articles/monitors/display/viewsonic-fuhzion-vx2268wm_8.html#sect0

"If you are not interested in the stereo mode, but want a good and fast gaming monitor, you should still consider 120Hz models. My tests have proved it once again that this refresh rate helps combine a very high speed of the matrix with a very low level of RTC artifacts.

Moreover, the 120Hz refresh rate provides substantial benefits of its own. I had not expected a reduction in response time (it is smaller than the refresh rate even at 120Hz) but the considerable reduction of the intensity of RTC artifacts was a nice surprise. This is a rare thing indeed: a 3ms matrix with no RTC artifacts!"


http://www.xbitlabs.com/articles/monitors/display/viewsonic-fuhzion-vx2268wm_8.html#sect0

"Then, the 120Hz refresh rate ensures smoother motion in games and at ordinary work. Perhaps it is not a critical improvement for office applications, yet an advantage anyway, and I guess that gamers will welcome 120Hz monitors warmly."


But hey, these guys have actually tested these monitors and have first-hand knowledge of them, unlike you.
 
Yes, yes, yes, whatever. Next year you'll be all like "my 120Hz is shit, 150Hz is so much better". The fact is, you morons just don't fucking understand anything.
 
Yes, yes, yes, whatever. Next year you'll be all like "my 120Hz is shit, 150Hz is so much better". The fact is, you morons just don't fucking understand anything.
The fact that you said 150Hz shows you don't know much either. :D

And instead of saying "whatever", why don't you actually read that review? Perhaps xbitlabs are morons too?
 
Or they're trying to sell you stuff...either way.
They are a tech site that reviews products, so are you actually saying they are taking money just to lie and say 120Hz is better? How about the digitalversus site that also says it's better? I doubt it's a conspiracy from tech sites just to get you to buy into it. I have seen the difference side by side, and 120Hz is way smoother.
 
Oh okay, first, you're incredibly stupid. Second, if done right it could be better, but not by much. Third, you could go up to 10K FPS and still have room for improvement, because the human eye does NOT see in frames, dipshit. Fourth, do you really think review sites don't get paid by manufacturers and advertisers?
Fifth, retard.
Wow. Well, please take your superior intellect and start your own site.
 
Damn, I just read one of the reviews and it can't do 120Hz through HDMI.

I'm definitely waiting till CES '10 to see what the Blu-ray 3D specs are before I go ahead and purchase this monitor. I was hoping to use this monitor with my PS3, but.....

Where did you see this? Will the PS3 be 3D-compatible with this monitor?
 
Yes, but only in 2D mode. That's because the GPU core clock speed is so low at idle that it causes problems in 120Hz mode. If I understood correctly, it's a technical problem with the card itself and not a driver one. However, just "overclock" your GPU clocks a bit in 2D mode and it should be fixed. I think ATI refuses to update their drivers to make the overclocked numbers stock, because it would make the card draw more power in 2D mode than they originally advertise.

So the only problem is when I'm not running a gaming application, just desktop surfing? On the OC: how much of an overclock does it take to remove the issue, for example on a 5870, if you have one or happen to know?

I've been running my new 5870 for a week with the 2233rz and I haven't seen any issues so far.

The screen is always @ 120Hz, correct?
 
So the only problem is when I'm not running a gaming application, just desktop surfing? On the OC: how much of an overclock does it take to remove the issue, for example on a 5870, if you have one or happen to know?

Yes. 3D never had problems with ATI and 120Hz. How much you have to OC, I don't know; it shouldn't be hard to Google, though.

To WildMonkey:

Of course the human eye cannot see individual frames beyond a certain point, correct, but that's not the point of 120Hz. It's about the fluidity of motion, which your eyes and brain CAN sense. Again, not as individual frames but as smoothness of movement, especially with fast-moving objects, provided your FPS and 3D engine can keep up.
 
Maybe I can help clear this up. Or maybe not, lol. [...] Getting back to the start: for the most part, you can not appreciate over 30 FPS, and especially not over 60, if it's pure and locked down. [...] Just not how you may have originally thought.....retards.
Mr. WildMonkey, what you have just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this thread is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.
 