New EIZO Foris FS2332, gaming IPS monitor

probably locked to 60 Hz again like the FS2331
no-go for gaming
the 1920x1200 is really nice of course
edit: it's 1080p
 
Waste of money. It can't even do higher refresh rates, so what makes this monitor so special for gaming? There are already plenty of low-lag IPS monitors, and some of them can even do higher refresh rates. Maybe the overdrive is slightly better on the EIZO, but that hardly makes a difference at 60 Hz.

The only interesting thing is the 10-bit LUT, which might make it a nice compromise as a low-lag semi-pro monitor, but I have no way of knowing if the output is 8-bit or 6-bit with dithering unless I try it myself. I can't trust the specs because they have been wrong before. There's also no indication of uniformity compensation or the A-TW polarizer or anything else that might make it worth considering.
 
There's also no indication of uniformity compensation or the A-TW polarizer or anything else that might make it worth considering.

Uniformity compensation?
Lol, we are talking about a general-purpose monitor focused on gaming, not about high-end professional monitors.
 
probably locked to 60 Hz again like the FS2331
no-go for gaming
the 1920x1200 is really nice of course

? It's 1920x1080.

Uniformity compensation?
Lol, we are talking about a general-purpose monitor focused on gaming, not about high-end professional monitors.

Sorry, but I'm a gamer and I care about uniformity too. Not to mention I do other things with my PC and don't have the room for a second monitor (nor the will to live with the bugs of a dual-display setup).
 
Can't decide between the SA850 and the 750D. Just wish they could make a 120 Hz panel with low response times and great colors.
 
Hopefully this doesn't use the same potentially POS panels the 2011 LG/Asus/NEC IPS LED models use, which have wildly varying quality (a 200:1-1000:1 contrast lottery, plus tinting and backlight bleeding), slow response times and aggressive AG coating (the Asus was confirmed not to have the aggressive AG coating by Simon Baker on the TFT Central forums).

If it is good and doesn't have the aggressive AG then it could be well worth the Eizo premium.
 
Just read the link...

OK "smart resolution" can go to hell. I'm surprised a company like Eizo would push technology that changes things like sharpening based on content. Sheesh. Some of us struggle for "neutral" and "reference" and yet companies keep throwing more and more processing at us disguised as "features".

"Power gamma for gaming"? Again, if the display was following the PC standard of 2.2 accurately, why on earth would I be messing with it? How does changing gamma make a game feel more "two dimensional"?

Fast response and 10-bit gamma correction sound cool.

I don't know why the thread title says that it's a "gaming IPS display". It's marketed as an "entertainment" display which is not gaming specific. It's an interesting niche, I guess, for people who want some of the pro features plus some stuff they feel is tailored to entertainment purposes.

When I first saw the thread I thought that Eizo was selling it as a "gaming" display and was just cashing in on making something black with blue LEDs and calling it "gaming". Of course all the gamers run out and buy it because it says it's for "gaming" and will somehow make it better.
 
I'll wait for reviews to comment on quality, but the bezel looks like ass. I prefer a low profile like NEC's 90 series. Also, 16:9 is garbage. Third, the stand looks weak, as in no elevation or rotation.
 
"Power gamma for gaming"? Again, if the display was following the PC standard of 2.2 accurately, why on earth would I be messing with it? How does changing gamma make a game feel more "two dimensional"?
Spreading the tonal values raises the perceived contrast.

http://www.prad.de/new/monitore/test/2010/test-eizo-fs2331-bk-teil9.html

Personally I don't recommend it. A real visual improvement would come from using the sRGB gradation on the display side instead of a fixed gamma of 2.2: it enhances detail in dark tonal values (for material that was gamma-corrected for 2.2).
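The difference between the two gradations is easy to sketch; here is a rough Python illustration (the 8-bit code values are just examples):

```python
# Rough sketch: sRGB decoding vs. a pure 2.2 power law for dark inputs.
# The sRGB curve is linear near black, so dark code values come out
# brighter (more shadow detail) than with a fixed gamma of 2.2.

def gamma_22(v):
    """Pure power-law decoding with gamma 2.2 (input normalized 0..1)."""
    return v ** 2.2

def srgb_decode(v):
    """Piecewise sRGB decoding: linear segment near black, power 2.4 above."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for code in (8, 16, 32):  # a few dark 8-bit code values
    v = code / 255.0
    print(code, round(gamma_22(v), 5), round(srgb_decode(v), 5))
```

On a display using the sRGB gradation those darkest steps stay distinguishable, while a strict 2.2 power law pushes them much closer to black.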

Best regards

Denis
 
Spreading the tonal values raises the perceived contrast.

http://www.prad.de/new/monitore/test/2010/test-eizo-fs2331-bk-teil9.html

Personally I don't recommend it. A real visual improvement would come from using the sRGB gradation on the display side instead of a fixed gamma of 2.2: it enhances detail in dark tonal values (for material that was gamma-corrected for 2.2).

Best regards

Denis

Yeah, but WHY? If you have a display that's performing to standard spec, then you see exactly what the game designers / movie director intended you to see. People spend lots of time and money trying to calibrate and get displays operating properly, then we'll just throw some extra contrast and non-linearities in and call it a "feature"? I just don't see it belonging on a high end display...
 
2.2 gamma is overrated. It is a standard made for offices where monitors are stared at in bright rooms. In darker rooms, which gamers and movie watchers often use, it is hopelessly flat and dull. 2.3-2.35 is much preferable (if the monitor can do it without crushing blacks) for movies and games, generally. It increases perceived contrast and makes colors pop more. Hell, IIRC old uncalibrated CRTs usually had a gamma of 2.4-2.5! That's why the CRT picture is often perceived as very deep.

Ideal gamma is very dependent on the lighting of your room, the capabilities of your monitor, your own tastes AND what you are watching. There is no universal standard that is exactly right and accurate everywhere. In photo editing the gamma of your monitor and the picture should match. But 2.2 is just one standard amongst many; it's not universal. There is no way to know what standard a given game/movie uses, as there is NO de facto standard that everyone follows. I'm not sure if I am putting all this exactly right, but the point is: don't fret about it...
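For what it's worth, the effect of nudging gamma up is easy to compute; a quick Python sketch (the input values are illustrative only):

```python
# Sketch: raising display gamma from 2.2 to 2.35 darkens the midtones,
# which reads as higher perceived contrast ("pop") in a dim room.

def lum(v, g):
    """Relative output luminance for normalized input v at display gamma g."""
    return v ** g

for v in (0.25, 0.5, 0.75):
    l22, l235 = lum(v, 2.2), lum(v, 2.35)
    drop = (1 - l235 / l22) * 100
    print(f"input {v:.2f}: g2.2={l22:.4f}  g2.35={l235:.4f}  ({drop:.0f}% darker)")
```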
 
2.2 gamma is overrated. It is a standard made for offices where monitors are stared at in bright rooms. In darker rooms, which gamers and movie watchers often use, it is hopelessly flat and dull. 2.3-2.35 is much preferable (if the monitor can do it without crushing blacks) for movies and games, generally. It increases perceived contrast and makes colors pop more. Hell, IIRC old uncalibrated CRTs usually had a gamma of 2.4-2.5! That's why the CRT picture is often perceived as very deep.

Ideal gamma is very dependent on the lighting of your room, the capabilities of your monitor, your own tastes AND what you are watching. There is no universal standard that is exactly right and accurate everywhere. In photo editing the gamma of your monitor and the picture should match. But 2.2 is just one standard amongst many; it's not universal. There is no way to know what standard a given game/movie uses, as there is NO de facto standard that everyone follows. I'm not sure if I am putting all this exactly right, but the point is: don't fret about it...

Flat and dull? Well not on my monitors.
But wait now, isn't that just entirely subjective?
 
Flat and dull? Well not on my monitors.
But wait now, isn't that just entirely subjective?

Yeah, I dunno.... My HCFR calibrated plasma isn't "flat and dull". Neither is my Spectraview II calibrated desktop display... Sure, neither is sear-your-retina vivid but on the other hand skin looks like skin and grass looks like grass, so...

I guess if one is a stalwart fan of "vivid" mode where "punch" is most highly valued, skies must always be electric blue and grass nuclear green for it to look good to you then calibrated, profiled and standards-compliant displays would look "flat and dull" just like real life does. *shrug* I guess it's a personal preference thing...

I just don't know why studios, film makers, photographers etc... use such expensive equipment to accurately and realistically capture and process images according to standards but J.Q.Public always knows better and comes up with all kinds of settings that make it "better". Does everyone feel that the movies shown at your local IMAX theater are "flat and dull"? You can bet that they are standards-compliant and reference grade (to whatever standards for gamut, gamma etc... apply to IMAX). Some projectionist didn't just read on a forum somewhere that a higher gamma would make his movie "punchier" and start screwing around with it. I didn't hear anyone walking out of Avatar IMAX 3D complaining that it wasn't punchy enough or the shadows needed to be opened up... I know that IMAX will have technical specs superior to HDTV, but the concept is the same. Today's IMAX will be tomorrow's home environment.

If you use a calibrated display you're seeing what the director and the editor saw when they made the film. Unfortunately, in the digital TV/movie world, doing things to "increase pop" or "open up the shadows" results in crush, blocking or artifacting. It's different if you're doing post on 14-bit RAW files from a dSLR, which have more information than our 8-bit displays can show.
 
In other words, 2.2 gamma isn't overrated, it's the proper one for video/gaming.

AFAIK it can be different with photography or printing, but I'm no expert on that stuff.
 
In other words, 2.2 gamma isn't overrated, it's the proper one for video/gaming.

AFAIK it can be different with photography or printing, but I'm no expert on that stuff.


If they are made with the strict sRGB standard specifications in mind then yes, provided you watch the picture in a similar lighting environment. In a dark room, no, you do not see a proper picture; you see a brighter, more washed-out one that varies from what was intended. That's how the human eye works.


Start reading this thread.
http://www.avsforum.com/avs-vb/showthread.php?t=1008297


These guys are more about TVs and movies, but the same things apply to computer monitors used for media. The only reason you would want to stare at 2.2 is if you do photo editing to sRGB specifications, but then you have to take room lighting and such into account. Why do you think monitor colorimeters come with an ambient room light checker? I am not a photo editor either and definitely not an expert on these things; I am merely a gamer and movie watcher too.
 
Why do you think monitor colorimeters come with an ambient room light checker? I am not a photo editor either and definitely not an expert on these things; I am merely a gamer and movie watcher too.

The ambient light checker will (1) look at the room light level and guide you towards an appropriate target luminosity, and (2) in some models, look for significant colour tinting of your ambient light and produce warnings or take corrective action.

Neither has anything to do with gamma. The suggested display luminosity varies with ambient room lighting: you need to see what's on the screen, so the target is higher in a bright room. This does not affect the target whitepoint colour temperature, the primaries or the gamma curve. It may affect the blackpoint, depending on the technology of the display. If a display technology is able, you may be able to increase the whitepoint without increasing the blackpoint, thus increasing the contrast ratio. White point and black point must be set prior to calibrating/profiling gamma.
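To put numbers on that last point, a trivial sketch (the luminance figures are made up):

```python
# Trivial arithmetic: raising the whitepoint luminance while the blackpoint
# stays fixed increases the contrast ratio (CR = white / black).

def contrast_ratio(white_cdm2, black_cdm2):
    return white_cdm2 / black_cdm2

# hypothetical display whose black level stays at 0.15 cd/m2
print(contrast_ratio(120, 0.15))  # dim-room luminosity target
print(contrast_ratio(200, 0.15))  # bright-room target: higher CR
```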

I've read the thread you've referenced. 2.5 looks pretty poor on my plasma (where gamma variance is "permitted" according to viewer taste) and is clearly incorrect for colour mastering work on my PC. In the days of CRT and analog it was easier to play with these things without experiencing crush and banding. It's harder to get away with it in the digital world as it applies to "home" applications.
 
The ambient light checker will 1/ look at room light level and guide you towards an appropriate target luminosity and 2/ some will look for significant colour tinting of your ambient light and produce warnings or take corrective action.

Neither has anything to do with gamma. Suggested display luminosity varies with ambient room lighting. You need to see what's on the screen so the target is higher in a bright room. This does not affect the target whitepoint, the primaries or the gamma curve. It may affect the blackpoint, depending on the technology of the display. If a display technology is able, you may be able to increase the whitepoint without increasing the blackpoint thus increasing the contrast ratio. Whitepoint must be set prior to calibrating/profiling gamma.

I've read the thread you've referenced. 2.5 looks pretty poor on my plasma (where gamma variance is "permitted" according to viewer taste) and is clearly incorrect for colour mastering work. In the days of CRT and analog it was easier to play with these things without experiencing crush and banding. It's harder to get away with it in the digital world as it applies to "home" applications.


Then I have understood its use wrong. However, it is true that too low a gamma in a dark environment results in a washed-out picture, and too high a gamma results in too-dark greys and crushed blacks in bright environments. Our eyes are what cause it.

And yes, 2.5 IS way too high. I doubt any modern flat panel can achieve it without crushing greys and blacks anyway. I didn't mean to imply 2.5 is correct (I got the impression that it is only good for CRTs); I think it is even mentioned in that thread at some point. I meant that the thread is good info about achieving ideal gamma and why 2.2 is not always correct. It IS a long thread with a lot of techbabble.

I for example ended up using 2.35 and it looks great in both mild light and a dark environment. Only the deepest blacks are crushed (the last few steps), but I believe that's because of my SPVA panel.

*edit* This may be a better thread: less techbabble, less confusion, more to the point.
http://fractal.avsforum.com/avs-vb/showthread.php?t=1281326
*edit2* Again I must clarify that the above forums are mostly about video. However, I do believe this applies to video games too. Photo editing is a different ballpark, a subject I won't dare to touch.
 
About gradation: There is always one "right" gradation: the inverse of the gamma correction of the material. In an ICC workflow this is transparent for the user as long as the actual display characteristic is described correctly in the profile. In the video sector it gets more complicated, as we have a "black box" system. When looking at my material (DVB, DVD, Blu-ray) it's nevertheless quite obvious that the Rec.709 gradation isn't used for HD material - that's not astonishing, because it would make no sense, nor regarding backward compatibility (no CRT has a native behaviour that is close to this characteristic). In most cases I assume that a gradation with a fixed gamma of 2.2 is assumed on the output side. Does it make sense to use another gradation than the one which was assumed during gamma correction? This depends on the "rendering conditions": we have very different and often not very ideal playback situations, so it can make sense to adapt to them. If, for example, details get lost in darker areas, I would recommend the sRGB gradation, which "moves" dark tonal values earlier into a brighter region without deviating too much from a fixed gamma of 2.2.

And yes, 2.5 IS way too high. I doubt any modern flat panel can achieve it without crushing greys and blacks anyway.
For material that was gamma-corrected with respect to a display-side gamma of 2.2, this will always lead to black crush. Exception: a managed workflow that transforms the data correctly on the basis of the participating profiles.

The ideal gradation for transforming tonal values from a linear space to a gamma-corrected space with a limited number of tonal values is L*, because it reflects human perception, so we have density in the places where we need it. ECI-RGB 2.0 is based on this gradation and is a good choice for RAW development. We won't see it in the video sector because of backward-compatibility reasons and the high requirements it places on the display electronics.
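For the curious, the L* gradation (CIE 1976 lightness) can be sketched in a few lines of Python (the luminance samples are chosen arbitrarily):

```python
# Sketch: encoding linear luminance with the L* gradation (CIE 1976
# lightness, scaled to 0..1) vs. a fixed 1/2.2 power-law encoding.
# L* places 18% grey near the middle of the code range, matching perception.

def lstar_encode(y):
    """Normalized L* encoding of linear luminance y in 0..1."""
    eps = (6 / 29) ** 3          # threshold of the CIE f() function
    if y > eps:
        f = y ** (1 / 3)
    else:
        f = y / (3 * (6 / 29) ** 2) + 4 / 29
    return (116 * f - 16) / 100

def gamma_encode(y, g=2.2):
    """Simple power-law encoding with exponent 1/g."""
    return y ** (1 / g)

for y in (0.01, 0.05, 0.18, 0.5):
    print(f"Y={y}: L*={lstar_encode(y):.3f}  gamma2.2={gamma_encode(y):.3f}")
```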

Best regards

Denis
 
About gradation: There is always one "right" gradation: the inverse of the gamma correction of the material. In an ICC workflow this is transparent for the user as long as the actual display characteristic is described correctly in the profile. In the video sector it gets more complicated, as we have a "black box" system. When looking at my material (DVB, DVD, Blu-ray) it's nevertheless quite obvious that the Rec.709 gradation isn't used for HD material - that's not astonishing, because it would make no sense, nor regarding backward compatibility (no CRT has a native behaviour that is close to this characteristic). In most cases I assume that a gradation with a fixed gamma of 2.2 is assumed on the output side. Does it make sense to use another gradation than the one which was assumed during gamma correction? This depends on the "rendering conditions": we have very different and often not very ideal playback situations, so it can make sense to adapt to them. If, for example, details get lost in darker areas, I would recommend the sRGB gradation, which "moves" dark tonal values earlier into a brighter region without deviating too much from a fixed gamma of 2.2.


For material that was gamma-corrected with respect to a display-side gamma of 2.2, this will always lead to black crush. Exception: a managed workflow that transforms the data correctly on the basis of the participating profiles.

The ideal gradation for transforming tonal values from a linear space to a gamma-corrected space with a limited number of tonal values is L*, because it reflects human perception, so we have density in the places where we need it. ECI-RGB 2.0 is based on this gradation and is a good choice for RAW development. We won't see it in the video sector because of backward-compatibility reasons and the high requirements it places on the display electronics.

Best regards

Denis


Thank you. A good post from a much more informed and knowledgeable person than I am.
 
Here are some pics from a product event showing the new FS2332 next to the older FS2331-

http://www.flickr.com/photos/24042854@N03/

and here are two short videos from the same event-

EIZO FORIS FS2332_001 - YouTube: http://www.youtube.com/watch?v=_PpEH8IYjDg

EIZO FORIS FS2332_002 - YouTube: http://www.youtube.com/watch?v=QYG7X-yUq1k
 
The PRAD.de sneak peek review is up; they gave it a very good rating.
 
Got it today.
So far I see no "sparkle effect".
The screen looks like the one on my Lenovo X220:

matte + a *bit glossy* look.
Excellent
 
It does look good (true 8-bit, light AG, good uniformity, calibrated...), but I don't know if it could handle higher refresh rates. That would make it great, even though the black levels aren't too impressive at low brightness, sadly.
 
It seems slower than my 2 ms TN (maybe like a 5 ms one).

I will check the contrast ratio once I get my new i1 Display Pro.
 
For some reason, I have opened my FS2332.

Surprisingly, the panel is a Samsung PLS LTM230HL01, not an LG IPS.
 
The PRAD.de sneak peek review is up; they gave it a very good rating.

Horrible black level, as expected. I don't understand why Eizo switched to an IPS panel when the predecessor, the FS2331, had a VA panel.
 
Yes, there's a Samsung sticker with the part number on the back.
I thought maybe it was only the backlight, but the part number seems to be for the whole panel.
(That explains the normal coating.)

During use, you can't tell the difference from an IPS panel (same "IPS" glow, etc.).


http://www.samsung.com/us/business/oem-solutions/pdfs/PSG2011_FINAL-092011.pdf
(page 27)

I really can't believe that EIZO sells a PLS monitor as an IPS one.
This is really disappointing, not because I prefer IPS over PLS, but because they are selling something different from what they advertise.
I hope that someone will shed some light on this soon.
 
I really can't believe that EIZO sells a PLS monitor as an IPS one.
This is really disappointing, not because I prefer IPS over PLS, but because they are selling something different from what they advertise.
I hope that someone will shed some light on this soon.

PLS is IPS.

edit: I guess this is pretty interesting. I wonder how long it'll be until other manufacturers start picking up these 23" panels for use in their displays? We might actually see a decent alternative to the U2312m and similar displays that use LG panels... without a grainy coating. I didn't even know Samsung was producing a 23" PLS panel.
 
Can't decide between the SA850 and the 750D. Just wish they could make a 120 Hz panel with low response times and great colors.

It's called a CRT. Well, it's not really a panel, but still...
 