What's the smaller compromise - 6 bit with FRC or using non-native resolution?

I'm in the market for a new 27 inch IPS screen, which I also plan to use for image processing (I use DxO Optics Pro 10 to convert my DSLR raw files, and Corel Paint Shop Pro X7 for editing). My current screen is a Dell 2209WA (8-bit), calibrated with a Spyder2Express. I'm perfectly OK with its colors, but would like something larger. The problem is that I can only find screens that belong to one of these two groups:

- 1920x1080 with 6 bit + FRC. The pixel size is also convenient for general use, but I suspect the lower color depth will be a regression from my current screen.
- 2560x1440 with 8 bit. Colors should be fine, but the resolution is too high. In the past I returned a Dell 2007FP because the pixels were too small and I didn't like any of the workarounds to make Windows more usable (like setting higher DPI). Will using a modern 2560x1440 screen at 1920x1080 provide good sharpness?

Thanks! :)
 
Huh? 1440p at 27" is quite optimal and should not have PPI issues. Unless you have your screen WAY too far away, but that's a separate issue. The PPI is more or less the same as 1080p on your current 22" screen, you just have more room on the sides.
 
My current 22" is not FHD. It uses 1680x1050, which is perfect for my preference and eyesight. I wouldn't want smaller pixels.
 
27" 1080p would be perfect in terms of resolution, but all these screens seem to use 6 bit color which I'm concerned is worse than my current screen.

32" is too big for my desk constraints and probably out of my budget anyway.
 
Oh. Then I can't help. It has been many, many years since I had a 22" screen with a 1680x1050 resolution. I don't have even the slightest memory of what kind of pixel density it had.

Samsung has a curved 27" VA panel in 1080p resolution and it's 8-bit. Color accuracy is not as good as IPS, especially towards the edges (though the curve helps in that regard), but miles ahead of any TV panel. I would say this is a perfect compromise.
 
This monitor and its siblings.
https://pcmonitors.info/reviews/samsung-s27e650c/

The default calibration is damn good too.
 
6-bit+FRC is terrible, especially if you're planning on using it for image editing work. Even 8-bit+FRC is pretty bad.
VA or TN panels are not suitable for photo editing, you need IPS.
Considering that you plan on using the display for photo editing, I'm surprised that you're looking for something so low resolution. All the photographers that I know, including myself, want the highest resolution display possible.
The higher the resolution is, the more difficult it is to see pixels, and the more accurate a representation it is of your work - this is especially important if you do any kind of print work.

I hate to ask the obvious, but have you had your vision tested recently?
Your current screen is ~90 PPI, which is already lower resolution than most.
Typically most monitors are in the 100-110 PPI range, with notebooks being closer to 130. 1920x1080 at 27" is only 80 PPI.

What I would suggest for you is going in the opposite direction: buy a 4K screen and run it at 2x scaling.
Non-integer scaling can be problematic (e.g. 1.25x on 2560x1440) but exact scales like 2x generally work quite well in Windows now - especially if you're on Windows 10.
At 24" that gives you a 180 PPI screen - which is a significant upgrade over an 80-90 PPI display for photo work. With 2x scaling, the UI is equivalent to a 90 PPI screen - the same as you have now.
There are also 27/32" 4K screens, which would still be a good upgrade over your current screen for photo work (160/140 PPI) while giving you even larger text. (80/70 PPI equivalent)
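
Those round figures follow directly from the diagonal-PPI formula; here's a minimal sketch (assuming a 3840x2160 panel, at the sizes mentioned above) if you want to check them yourself:

```python
import math

# A 4K panel's physical pixel density vs. the workspace it behaves like at 2x DPI scaling.
for size in (24, 27, 32):
    physical_ppi = math.hypot(3840, 2160) / size
    print(f'{size}" 4K: {physical_ppi:.0f} PPI panel, ~{physical_ppi / 2:.0f} PPI-equivalent UI at 2x')
# 24" 4K: 184 PPI panel, ~92 PPI-equivalent UI at 2x
# 27" 4K: 163 PPI panel, ~82 PPI-equivalent UI at 2x
# 32" 4K: 138 PPI panel, ~69 PPI-equivalent UI at 2x
```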
 
Now, photo editing is not my area, but are you sure VA panels are worthless for photo editing? They do have a shift towards the edges, but it's all gamma that shifts. There is no "going negative" effect like TN has towards the top. Eizo, for example, has offered several VA-panel models meant for photo editing over the years.

But yeah, IPS is greatly preferable, especially for a professional (and the top-of-the-line Eizos and NECs are all IPS, I think), but for a hobbyist I believe VA could be sufficient.
 
zone74, photo editing is not the only use of the screen. It also needs to be suitable for general use and gaming.

I can read the small text of high DPI screens, but I find it's less convenient and less enjoyable compared to lower DPI screens. In a way it's like reading the fine print of a medicine leaflet or the ingredients list of a snack bar, compared to reading the larger print of a book. Don't you find the latter more comfortable to read?

I can see how a 4K screen would solve all my problems, but I'm afraid I don't have the budget for it. A 4K screen like the Dell P2715Q is 50% more expensive, where I live, than a 2560x1440 screen like the U2715H, which is already a bit more expensive than what I'd like to pay and is 50% more expensive than a 6 bit FHD screen like the P2714H. Choosing between the last two is basically choosing between the compromise of using 6 bit for photo editing, and the compromise of using non-native resolution for general use. Which of these two do you consider to be the lesser evil?
 
I can read the small text of high DPI screens, but I find it's less convenient and less enjoyable compared to lower DPI screens. In a way it's like reading the fine print of a medicine leaflet or the ingredients list of a snack bar, compared to reading the larger print of a book. Don't you find the latter more comfortable to read?
The point of a high DPI 4K screen is that you run it at 2x scaling.
Text size/workspace would be identical to a 1920x1080 screen, but everything has double the resolution. Small text in particular is significantly clearer and easier to read as a result.
So a 27" 4K panel will give you the 1920x1080 workspace that you want (big text) but without the terrible pixelation of an 80 PPI display, and without stepping down to 6-bit+FRC color. In fact, there are a number of 10-bit 4K panels in that size - though that may be outside your budget.

It amazes me how many people on PC forums don't seem to understand this.
To them, resolution is workspace, so they are "throwing away" workspace buying a 4K panel and running it at 2x scaling. Never mind that text and images look significantly better when rendered at twice the resolution.

I can see how a 4K screen would solve all my problems, but I'm afraid I don't have the budget for it. A 4K screen like the Dell P2715Q is 50% more expensive, where I live, than a 2560x1440 screen like the U2715H, which is already a bit more expensive than what I'd like to pay and is 50% more expensive than a 6 bit FHD screen like the P2714H. Choosing between the last two is basically choosing between the compromise of using 6 bit for photo editing, and the compromise of using non-native resolution for general use. Which of these two do you consider to be the lesser evil?
A 6-bit FRC display is absolutely unsuitable for image work in my opinion. So is running at a non-native resolution.

If your choice is between those two, get the 1440p screen and pull the monitor closer if you find 110 PPI too small.
Or try non-integer scaling (1.25x/1.5x) with it - though I have generally not had great results with non-integer scaling, even on Windows 10.
Again: I would suggest that you get your vision checked out if you haven't recently, if you are finding 110 PPI text difficult to read. That is the standard pixel density for nearly all monitors sold these days.
The reason you're only finding 6-bit panels with a 1920x1080 resolution at that size is that only the lowest-end panels have such a low pixel density now.
 
There's a world of difference between a 6-bit + FRC TN panel and a proper 8-bit IPS panel. Get the 8-bit, and make resolution changes as needed.
 
pixelblue, I wasn't referring to TN panels. I'm looking at IPS panels with 6-bit + FRC, like the Dell P2714H. According to the TFT Central review, the difference compared to 8-bit is negligible. Quote:

The panel is capable of producing 16.7 million colours. According to the detailed panel spec sheet this is done with a 6-bit colour depth and an additional Frame Rate Control (FRC) stage (6-bit + Hi-FRC) as opposed to a true 8-bit panel. This is a measure commonly taken on modern panels, and the FRC algorithm is very well implemented to the point that you'd be very hard pressed to tell any difference in practice compared with an 8-bit panel.

This is why I'm considering this screen as a replacement for my 8-bit 2209WA.
 
Pixel Per Inch Calculator.

22" 1680x1050 is 90.05 PPI
24" 1920x1080 is 91.79 PPI
27" 1920x1080 is 81.59 PPI
27" 2560x1440 is 108.79 PPI
The Dell 2007FP 20.1" 1600x1200 you returned had 99.5 PPI.

If you wanted 2560x1440 to be the same PPI as 22" 1680x1050, it would need to be 32.6", hence the recommendation for a 32".

Since 99.5 PPI bothered you so much, forget about 27" 2560x1440 because the pixels are much smaller. If you want very similar PPI to your 2209WA - not enough to matter even to you - you'll want a 24" 1920x1080. If bigger pixels are good, then the 27" 1920x1080 panels might make you happy, if the pixels aren't so big that you are seeing them.
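
For reference, all of those numbers come out of the same calculation; a quick Python sketch reproducing them:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

for w, h, d in [(1680, 1050, 22), (1920, 1080, 24), (1920, 1080, 27),
                (2560, 1440, 27), (1600, 1200, 20.1)]:
    print(f'{d}" {w}x{h}: {ppi(w, h, d):.2f} PPI')
# 22" 1680x1050: 90.05 PPI
# 24" 1920x1080: 91.79 PPI
# 27" 1920x1080: 81.59 PPI
# 27" 2560x1440: 108.79 PPI
# 20.1" 1600x1200: 99.50 PPI
```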
 
Again: that's why he should be looking at 4K panels.
27" 4K is 160 PPI, which gives you a 1920x1080 workspace with 2x scaling - equivalent to 80 PPI - without the pixelation or drop to a 6-bit panel.
 
The real question is not where you edit your photos, but where they will be seen.

The majority of web content is seen on smartphones and low-grade TN panels with 6-bit + FRC.

So if you are editing photos to produce web content, your work's fidelity will actually improve on a native 6-bit + FRC display, in true what-you-see-is-what-you-get fashion: editing on the very same kind of screen where the content will be seen.

On the other hand, if you edit photos for printing, you should aim for the highest color accuracy possible within your budget.
 
Great comments everyone, thank you!

evilsofa, 2560x1440 is definitely too much for me. I need to find a showroom where I could check how things look if I use it at the non-native 1920x1080 resolution. If the text doesn't look sharp then it's a non-starter. A 32" screen won't fit reasonably well on my L-shaped corner desk.

zone74, I understand and agree with what you say. The problem is the budget. Sometimes in life we have to make compromises and buy products with less than perfect features.

geok1ng, I hardly ever print. It's all for display on screens (and projection). It's good to know 6-bit + FRC is satisfactory (and possibly optimal) if viewing will be on similarly limited displays.
 
Using non-native resolutions will give terrible image quality due to how scaling works and how fonts depend on the subpixel structure of the panel; even a 4K panel with perfect integer/point scaling would produce worse image quality than a native 1080p panel. It is the worst option possible. You would end up using 2560x1440 anyway. Besides, you can test how it will look: just run 1280x800 on your current monitor and it will be roughly the same thing, just more pronounced due to the larger pixels.

6-bit vs 8-bit is blown out of proportion. In the past it was necessary to use 8-bit because dithering algorithms were terribly bad, usually simple alternating patterns that were easily visible, especially when your eyes moved across the screen. Today's dithering algorithms are the result of careful study of human perception and use sub-pixel noise, eliminating nearly all the flaws of the old algorithms.

8-bit is still better, but I would advise against being biased toward or against a monitor solely on this one parameter. In the real world what matters is what gamut the monitor has and whether its gamma response is good or needs correction. If you e.g. use a GPU from Nvidia or Intel and calibrate the display, you will get banding, which means losing gradation steps (see the sketch below). In comparison, 6-bitness in itself doesn't introduce any banding. 8-bit proponents often don't realize this or dismiss it as unimportant, yet it has more impact than the 6-bit panel does.
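
To illustrate that point, a minimal sketch (the gamma value is made up purely for illustration, and this models only a plain 8-bit gamma ramp with no dithering) showing how a calibration applied in an 8-bit pipeline loses gradation steps:

```python
import numpy as np

# Apply a hypothetical calibration curve through an 8-bit LUT, as a GPU gamma ramp
# without dithering would, and count how many distinct output levels survive.
levels_in = np.arange(256)
correction = (levels_in / 255) ** (2.2 / 2.4)        # made-up correction curve
lut = np.round(255 * correction).astype(np.uint8)
print(f"distinct output levels: {len(np.unique(lut))} of 256")   # fewer than 256 -> banding
```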

Incidentally, on IPS 6-bit looks and behaves much, much better than on TN panels. For one, TNs are much faster, which gives the dithering a sharp 'clicking' appearance, while IPS, due to its slower response time, looks more like a gentle pulsing, much less noticeable or irritating. Also, because of how viewing angles affect near-black shades on TN, the dithering was very noticeable in some colors (near black) from some angles. Of all panel types, IPS is the best suited to FRC algorithms.

Actually, 6-bit + A-FRC has more color precision than an 8-bit panel without any FRC at all, and is less limiting. Many 8-bit monitors had banding in every setting except one: panel-native gamma with the RGB controls set to 255/255/255. I do not know if the 2209WA is like that, but many monitors were, and a properly implemented 6-bit monitor will simply be better. Anyhow, it is best to look at the monitor as a whole. What matters is what gamut and gamma response those monitors have, not bitness.
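
To make concrete what FRC does, here is a minimal sketch of the averaging principle only (real implementations spread the pattern spatially and randomize it; this is not any vendor's actual algorithm):

```python
def frc_frames(level_8bit, num_frames=4):
    """Approximate an 8-bit level (0-255) on a 6-bit panel (0-63) by temporal dithering.

    The two nearest 6-bit levels are alternated so that the time-averaged output
    matches the requested 8-bit value (top levels clamp at the 6-bit maximum in
    this toy model).
    """
    low = level_8bit // 4                    # nearest 6-bit level below
    high = min(low + 1, 63)                  # nearest 6-bit level above
    remainder = level_8bit % 4               # how many of the 4 frames must show 'high'
    return [high] * remainder + [low] * (num_frames - remainder)

frames = frc_frames(130)                     # an 8-bit value that falls between two 6-bit steps
print(frames)                                # [33, 33, 32, 32]
print(sum(f * 4 for f in frames) / len(frames))   # 130.0 -> the eye averages back to the target
```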
 
Seriously, just go to the nearest Apple store and they might still have a 27" 2560x1440 iMac or display available to give you an idea what it looks like.

Your options are really only two: get a 27" 1440p display and use scaling, or get a 4K display and use scaling. It takes much less hardware to run games well at 1440p.

This really seems to be a case of "I'm used to this and don't want to change". It takes a while to get used to smaller, sharper text or alternatively sharper text at the same size you are used to.
 
I have never had an issue with my UltraSharps that have 6-bit + FRC. Granted, I am no pro, but I have not noticed any difference between monitors that have it and those that don't.
 
Using non-native resolutions will give terrible image quality due to how scaling works and how fonts depend on the subpixel structure of the panel; even a 4K panel with perfect integer/point scaling would produce worse image quality than a native 1080p panel.
For sub-pixel anti-aliased text, you may be correct.
For all other uses, integer scaling looks better on a 4K panel than the 1080p native screen.
You eliminate the screen-door of the low resolution panel and the image appears sharper. Small details are easier to see.

I can't think of an example where sub-pixel anti-aliased text would ever have integer scaling applied though. Anything rendering text that way should support proper DPI scaling. It's mainly things like games where you would want integer upscaling.

6-bit vs 8-bit is blown out of proportion. In the past it was necessary to use 8-bit because dithering algorithms were terribly bad, usually simple alternating patterns that were easily visible, especially when your eyes moved across the screen. Today's dithering algorithms are the result of careful study of human perception and use sub-pixel noise, eliminating nearly all the flaws of the old algorithms.
You've clearly not looked at a 6-bit LCD panel any time recently. They are not suitable for photography/image editing.

I do agree with you about never running a non-native resolution on the display though. 1920x1080 on a 1440p panel will look terrible.
The solution is to use non-integer DPI scaling (1.25x/1.5x) instead of selecting a non-native resolution.
 
For sub-pixel anti-aliased text, you may be correct.
For all other uses, integer scaling looks better on a 4K panel than the 1080p native screen.
You eliminate the screen-door of the low resolution panel and the image appears sharper. Small details are easier to see.

I am not sure I follow your point on a theoretical basis. :confused:

In practice I wholly disagree: all 4K panels for PC usage have a PPI equivalent of a 20" 1080p screen or smaller. Any detail invisible at a 24" 1080p resolution will not be "easier to see" on a 4K panel of 32" or 28".

Even at 39", upscaling 1080p content to a 4K screen does not bring more detail, at least on my Seiki 39". Upscaling is overrated.
 
When you are doing a 2x Nearest Neighbor scale on a display, your smallest image pixel is 4x the size of a physical pixel on the display.

This largely eliminates the "pixel grid" from over the image.
It is not adding detail, it is increasing the clarity of the existing image.
Note the elimination of color fringing and the pixel grid, due to the greatly increased panel resolution.
More examples here.

Even if the image resolution is the same, elimination of the pixel grid/color fringing from the panel is a good improvement when sitting close to the display - as you would with a monitor - in my opinion.
Of course if you have a higher resolution panel, it is best to take advantage of that extra resolution instead of just upscaling.
That's why you should use the DPI scaling options in Windows instead of rendering at a lower resolution and upscaling, if text is too small when things are displayed 1:1 at 1x scale.
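
To illustrate the difference being argued about here, a minimal sketch with NumPy and a toy 2x2 "image" (not how any particular monitor or OS scaler is implemented):

```python
import numpy as np

def nearest_neighbor_2x(img):
    """Integer 2x upscale: every source pixel becomes a crisp 2x2 block."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def bilinear_2x(img):
    """Filtered 2x upscale: output pixels are weighted averages of neighbours, softening edges."""
    h, w = img.shape
    img = img.astype(float)
    ys = np.linspace(0, h - 1, 2 * h)
    xs = np.linspace(0, w - 1, 2 * w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# A hard black/white edge: nearest-neighbour keeps it hard, bilinear smears grey across it.
edge = np.array([[0, 255],
                 [0, 255]])
print(nearest_neighbor_2x(edge))
print(bilinear_2x(edge).round())
```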
 
The majority of web content is seen on smartphones and low-grade TN panels with 6-bit + FRC.

So if you are editing photos to produce web content, your work's fidelity will actually improve on a native 6-bit + FRC display, in true what-you-see-is-what-you-get fashion: editing on the very same kind of screen where the content will be seen.

If this statement is accurate, then it's a key takeaway from the discussion for me, as I don't plan to print and don't target users with large color gamut screens. I wonder what I'd be missing (in terms of suitability of the screen for web display) if I don't purchase an 8 bit screen?
 
@zone74
Do not use images with point scaling as a showcase of why 4K displays should be used, because it is not something users will get. Windows uses bilinear scaling, GPUs use bilinear scaling, and monitors use something very similar to bilinear scaling, so a comparison of a 1080p panel vs. 4K at a lower resolution and/or Windows 200% DPI looks like this:

http://abload.de/img/lo-res43p30.jpg
http://abload.de/img/hi-resdisplayw6o6h.jpg

with point scaling being something that won't happen.

Any argument for a 4K display made by showing pixel-doubled images is misleading! In reality the image will be blurred compared to a native 1080p display.

As for 6-bit, it adds noise, but that doesn't make a display unsuitable for professional use. It is just a bad and stupid choice for professional use, because noise (when properly implemented) makes an image look better than it really is. Here only slightly, because only a small amount of noise is needed. On plasma, where there are literally only a few brightness levels, it is a big issue, yet most plasma users never noticed their displays were noisy. Here it isn't an issue, especially on IPS. What makes a display unsuitable is uncorrectable flaws like bad viewing angles. A modern 6-bit IPS with tons of IPS glow is much better than the best 8-bit VA, because the latter never displays colors as they really are. Still, as bad as VA panels are, they are widely used professionally. That is much worse than if everyone who uses those VA panels used a modern 6-bit IPS instead.
 
Do not use images with point scaling as a showcase of why 4K displays should be used, because it is not something users will get. Windows uses bilinear scaling, GPUs use bilinear scaling, and monitors use something very similar to bilinear scaling, so a comparison of a 1080p panel vs. 4K at a lower resolution and/or Windows 200% DPI looks like this:
http://abload.de/img/lo-res43p30.jpg
http://abload.de/img/hi-resdisplayw6o6h.jpg
with point scaling being something that won't happen.
Windows uses point scaling at integer scales for non-DPI-aware applications. It only uses bilinear filtering with non-integer scales.
OS X can optionally be set to use point scaling for non-retina apps on a per-app basis.

Any argument for a 4K display made by showing pixel-doubled images is misleading! In reality the image will be blurred compared to a native 1080p display.
4K does not inherently blur the image.
With point scaling, a 1080p native image on a 4K display will look better than on a 1080p native display.
But I was not suggesting sending a 1080p source image to a 4K display, I was suggesting buying a 4K monitor and running at 2x DPI scaling, which renders at 4K but gives you a 1080p workspace.
This is significantly higher quality than a 1080p display of equal size, especially for photo editing.

And some displays - though not many yet - do use point scaling when sent a non-native resolution.
With filtered scaling, I agree that things may be blurred - though that is not a bad thing for video, only desktop/games.

As for 6-bit, it adds noise, but that doesn't make a display unsuitable for professional use. It is just a bad and stupid choice for professional use, because noise (when properly implemented) makes an image look better than it really is. Here only slightly, because only a small amount of noise is needed. On plasma, where there are literally only a few brightness levels, it is a big issue, yet most plasma users never noticed their displays were noisy. Here it isn't an issue, especially on IPS. What makes a display unsuitable is uncorrectable flaws like bad viewing angles. A modern 6-bit IPS with tons of IPS glow is much better than the best 8-bit VA, because the latter never displays colors as they really are. Still, as bad as VA panels are, they are widely used professionally. That is much worse than if everyone who uses those VA panels used a modern 6-bit IPS instead.
The added noise and/or potential banding, depending on how well the display converts to 6-bit, makes it unsuitable for professional photo work in my opinion.

I don't agree with geok1ng's assessment of smartphone displays.
The average smartphone display is better than the average desktop monitor now, with considerably higher pixel density, ≥8-bit IPS panels or OLEDs being preferred, and a focus on sRGB accuracy.
Despite that, however, if the target for his work is only smartphone displays, perhaps a 6-bit panel would be sufficient. But I still could not recommend it due to the low pixel density.
 
Have you not used a Dell U2412M? I have like four of them and have had no issues with them for PS editing and printing. I calibrated it, and I can edit an image, go to my school where they have calibrated iMacs, and the image is nearly the same and prints just fine.

It is 6-bit with A-FRC too.
http://www.tftcentral.co.uk/reviews/dell_u2412m.htm

PS editing works great on my P2715Q too, and isn't that an 8-bit + A-FRC screen, to simulate 10-bit? I tried googling that but couldn't find it.
 
I'm considering the option of getting a QHD screen and trying to work around the small text by changing Windows DPI settings (and living with the occasional broken app and truncated button labels). The question is: would a QHD screen like the Dell U2715H be competitive with an FHD screen (e.g. the P2714H) for gaming, assuming both are run at FHD for this purpose? I seriously doubt my Radeon HD7850 can run many games at QHD with a reasonable framerate, so I would likely need to use FHD. I'm not planning to upgrade my video card in the near future.

For reference, the CPU is i7 4770K and I have 8GB of RAM.
 
You mean QHD and not 4K, right? If so, never mind - what I was going to say was useless.
 
Yeap, QHD. 4K is out of my budget.

They have the P2715Q for 400-500 on sale days. I use it and it's quite nice. Not so good for gaming in terms of speed, but games at 4K are pretty awesome looking, even though blurring can be bad at times.
 
In Europe you can get the very well reviewed LG 27MU67 for less than €500. 4K UHD monitors are virtually in the same price range as QHD monitors. If small text is what you are concerned about, it's better to use a UHD monitor with 200% scaling than to try a QHD monitor with a non-integer scaling factor.
 
The LG 27MU67 isn't available where I live, but I guess I should wait a few more months until a reasonably priced UHD screen with IPS panel is available. My current screen is working fine, so I should just be patient.
 
I was not aware that newer Windows versions did point scaling. In Win7 there is only bilinear, and even when an application seems to just be scaled, it has lots of issues with the sizes of elements inside the window. I have a spare SSD lying around, so I will install Win10 and do some testing there.

The reason why 6-bit is considered unsuitable for professional work might actually be that it makes images seem to have more visual fidelity than they really do, because dithering, when it is very random, makes images look better than they are. When I was using a 37" 720p plasma from about 1 to 1.5 m away, it seemed like every game and movie looked good on it. All textures had an infinite amount of detail in them. The same games looked worse on CRT and LCD, even though CRT and LCD have no such ridiculously drastic dithering. The only reason games looked as good as they did on that plasma was the dithering.

I am not suggesting one should prefer a 6-bit display, but it is not so clear-cut what is 'better'. If a 6-bit display is much cheaper and has good parameters, it might not be so bad. It all depends on the implementation of A-FRC and on the person using the monitor. If someone has an adverse reaction to random noise, then it is probably best to use high-bit-depth displays, very much as when someone has an adverse reaction to matte coatings it is best to use glossy displays.
 