High Refresh Rate or Overall Image Quality?

Greyson

Weaksauce
Joined
Jan 27, 2012
Messages
78
Yep, it's what you'd expect from the title. I'm in the market for a monitor and I'm not sure which I should prioritize. Should I go for a pro-grade IPS, which tends to have better uniformity, contrast, and accuracy than its gaming-grade counterparts? Or go with one of the 144 Hz IPS/VA displays on the market, which may not be as good in other areas but include technologies like G-Sync? (I use Nvidia.)

Size and resolution are secondary right now; I'm just trying to figure out which side of the fence I fall on. I do play games, but a lot of them are RPGs or RTSes, and I've never had a problem with 60 Hz up to this point, even in the shooters I play. That's not to say a higher refresh rate wouldn't be better, and if I could have everything I would, but it seems like I have to pick one or the other. I'm sure the phrase "you don't know what you're missing out on" is true, but I've been "surviving" on 60 Hz all the same.

Basically, for anyone else who has considered these options: does having a high refresh rate outweigh everything else? Is it worth sacrificing a bit of quality in all areas for extreme performance in one particular area?

I'm sure this thread is a bit derivative and the topic has come up before; I'm just completely stuck even trying to narrow down my options.
 
I had a 240 Hz G-Sync 1080p TN and loved the smoothness, but I sold it due to image quality and just got a 60 Hz IPS because I want correct IQ.

Now I am looking for something that does it all, and the only monitor might be the LG 32GK850G... Very, very good reviews. It's said to have good gamma and colors, 165 Hz, G-Sync, and the best-ever VA pixel response... It sounds too good, and that's the reason I haven't bought it yet.
 
RTS means frames to me. I never "see" tearing, so 1080p and 60-200+ frames per second is the way I play. It's really a personal decision, and no one can tell you what you'll find visually enjoyable. From your original post, I'd say stick with 60 Hz since it doesn't bother you, and go for more image quality given those two choices. Maybe you can demo something in a store or at a friend's to "see what you are missing" and help you make a better decision.
 
This is a very interesting question. I also always wonder if I should get a 144 Hz screen or "better" pixels like a QLED monitor. Usually I decide for image quality, as that never stops being noticeable, unlike 144 Hz in most of one's desktop use. 60 to 75 Hz is fine most of the time for me, but then again I don't play competitive shooters.
 
I use a setup consisting of an Acer XB271HK 27" 16:9 4K G-Sync monitor and an HP DreamColor LP2480zx 24" 16:10 professional-grade monitor with an A-TW polarizer.
The Acer's colors are good enough for games, web, productivity, etc.
The HP's perfect gamut and A-TW polarizer make colors so much better and more life-like. For games I do not need life-like colors, though.
Thing to note: wide-gamut monitors are worse at text rendering.

A setup consisting of a 144 Hz G-Sync IPS (avoid VA - they are terrible at color quality) and an Eizo CX240 (or, if you can get your hands on one in good condition, an HP LP2480zx - it actually has a far superior panel) would be great and would probably fulfill all your needs.

Unfortunately price would be pretty high :dead:
 
Give a high refresh monitor a try if you can. Some people think it's the best thing ever even for desktop use (scrolling and dragging windows will be smoother).

I personally think high DPI and image quality are far more important. I increase settings in games until I'm below 60FPS anyway, so have an IPS 4K 60Hz. To me high refresh is "that's kinda nice I guess" but high DPI is "wow everything looks so nice and smooth".
 
There is no reason that a 144 Hz display cannot also be accurate when it comes to color reproduction. The 8-bit TN panel on my 144 Hz G-Sync display is accurate for the sRGB color space. The newer IPS panel versions should be accurate for that and for higher gamuts as well.

Then it comes down to what you want for gaming: do you turn settings down (depending on your GPU) to get 100+ frame rates, or do you rely on G-Sync to keep things smooth at 40-60+ fps with high graphics settings?
 
There is no reason that a 144 Hz display cannot also be accurate when it comes to color reproduction. The 8-bit TN panel on my 144 Hz G-Sync display is accurate for the sRGB color space. The newer IPS panel versions should be accurate for that and for higher gamuts as well.

Everything I've looked at suggests that most (or all) of the 144 Hz+ IPS displays on the market have mediocre uniformity (exacerbated by backlight bleed), worse Delta E performance, and lower contrast than professional-grade models, and the prices aren't all that different due to the premium that G-Sync fetches. If you can suggest something to me, please do; I don't claim to know about every monitor out there, but I have looked at many.

Give a high refresh monitor a try if you can. Some people think it's the best thing ever even for desktop use (scrolling and dragging windows will be smoother).

Yeah, I've been thinking about buying a monitor from somewhere with a good return policy just to give it a shot.

Thanks to everyone who has replied. I agree about image quality being something you "never stop noticing" vs. refresh rate being a thing that mainly benefits gaming, though I understand it has some fringe benefits for general tasks too. That's kind of where I'm leaning right now, because gaming is only one of many things I use this computer for.
 
So far, the 240 Hz TN monitors I've tried have all looked much worse than 60 Hz TN monitors from 10 years ago.
The colors are fine, but gamma performance is very poor and contrast goes to hell when using high-refresh modes. That is for TN panels.
 
I suggest you consider VA gaming panels. IPS and TN have 860:1 to 990:1 contrast ratios and poor black levels around 0.14 nits. While a modern gaming VA isn't as good as a TV, it still has roughly triple the contrast ratio (~3000:1) and black depth of IPS and TN.
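To put those numbers in perspective: black level is just white luminance divided by contrast ratio. A quick sketch of the arithmetic (the ~126 nit calibrated white point is an assumed figure for illustration; it varies per monitor and calibration target):

```python
# Black level = white luminance / contrast ratio.
# 126 nits is an assumed calibrated white point, not a measured value.

def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Luminance of 'black' (in nits) for a given panel contrast ratio."""
    return white_nits / contrast_ratio

print(f"IPS/TN ~900:1  : {black_level(126, 900):.3f} nits")   # ~0.14 nits
print(f"VA    ~3000:1  : {black_level(126, 3000):.3f} nits")  # ~0.04 nits
```

Same white, three times the contrast, one-third the black level - which is why VA blacks look noticeably deeper in a dim room.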

From TFTCENTRAL


And there is an Rtings list of all the TVs they reviewed, sorted by contrast ratio, here:
https://www.rtings.com/tv/tests/picture-quality/contrast-ratio

Some other things to consider:

We are on the verge of moving to HDMI 2.1 in TVs next year, and most likely in consoles in 2020. There will be some really good TVs that check all the boxes, if re-doing your room setup for increased size and viewing distance isn't a problem. The Samsung Q9FN 65" line has a 480-zone FALD backlight, ~1700 nit HDR 1000, a 19018:1 dynamic FALD contrast ratio, and near-perfect Rec. 2020 color in its HDR color volume. Samsung's 2018 models already support VRR (the variable refresh rate HDMI standard) via HDMI 2.0b, as do the Xbox and AMD cards. In 2019, TVs should have HDMI 2.1 for 120 Hz native 4K at 4:4:4 color, VRR, and QFT (quick frame transport, for low input lag gaming).

The problem, other than the sizes available and the high price, is that Nvidia has a monopoly on the most powerful gaming GPUs and a vested interest in G-Sync, so it's likely they won't support VRR for a few years, if ever. The fact that they released their 2000-series GPU line early, without HDMI 2.1, says something. Even though the 480-zone Q9FN currently slots in at $3700 and the 80-zone Q8 at $2500, they are still $1700 and $2500 less, respectively, than the Nvidia 65" BFGD models, which are over $5000 and which, like Nvidia's over-$2000, small 27" 4K FALD HDR models, will still be on HDMI 2.0b - meaning they can't do 4:4:4 color over 98 Hz and they lack standardized HDMI 2.1 VRR, QFT, dynamic HDR, and overall bandwidth. Hopefully in 2019 and 2020 we will start seeing more options and sizes as real HDR 1000 displays with HDMI 2.1 become more common.



Regarding high Hz benefits:
.....

120 Hz at a high frame rate is a HUGE increase in display experience, especially for 1st/3rd-person gaming aesthetics (and even for watching sports, if it were recorded and transmitted at high frame rates).

In 1st/3rd-person games you are constantly moving your viewport around at speed, so it's not just a simple flat-colored UFO-test bitmap smearing. The entire viewport and game world (high-detail textures, depth via bump mapping, etc.) is smearing relative to you during movement-keying and mouse-looking. 120 fps at 120 Hz cuts sample-and-hold blur by 50% and doubles your motion definition and motion-path articulation, bringing things up to glassy smoothness (more dots per dotted line; twice the unique animation cells in a flip book that's also paging twice as fast, so to speak). 100 fps at 100 Hz cuts sample-and-hold blur by 40% and gives a 5:3 motion-definition improvement (10 unique frames shown at 100 fps-Hz for every 6 shown at 60 fps-Hz).
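Those percentages fall out of simple frame-time arithmetic, if you accept the (simplifying) assumptions that sample-and-hold blur scales with frame persistence (1/fps) and that "motion definition" is just unique frames per second. A rough sketch:

```python
# Back-of-envelope sketch of the sample-and-hold numbers above.
# Assumes perceived blur is proportional to frame persistence (1/fps)
# and "motion definition" is simply unique frames shown per second.

def blur_reduction_vs_60(fps: float) -> float:
    """Fraction of 60 fps sample-and-hold blur eliminated at `fps`."""
    return 1 - 60 / fps

def motion_definition_ratio(fps: float) -> float:
    """Unique animation frames shown, relative to 60 fps."""
    return fps / 60

for fps in (100, 120):
    print(f"{fps} fps-Hz: blur cut by {blur_reduction_vs_60(fps):.0%}, "
          f"motion definition x{motion_definition_ratio(fps):.2f}")
```

At 120 fps-Hz that's a 50% blur cut and 2x definition; at 100 fps-Hz it's 40% and 5:3, matching the figures above.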



At a 60 fps or 60 Hz cap you are getting smearing blur during viewport movement. At 100-120 fps on a high-Hz monitor you cut that down to more of a soft blur within the "shadow masks" of everything on the screen - within the lines of the coloring book, so to speak. Modern gaming overdrive and low response times help mitigate this, or it would be much worse.

Variable refresh rate is another HUGE bump in display experience. It allows you to straddle, or at least dip well into, those higher Hz and frame-rate ranges on a frame-rate graph full of spikes, dips, and potholes, without experiencing judder, stutter, stops, or tearing. So you can tweak your graphics settings higher for a better balance and avoid the bad effects of v-sync or of no syncing at all.

There are a few TVs that can do 60 Hz to quasi-120 Hz with interpolation, even with game mode active, without adding a ton of input lag, but it still jumps from 15 ms up to 20 ms on the best Samsung LCD screens; most add a lot of input lag. Interpolation does reduce blur in a way, but it does not add more actual frames of action. It's more like repeating a frame and floating it, which can give an odd effect, and it is usually not without artifacts, dark halos, etc.

The backlights and uniformity of LCD tech in general are terrible, and the black levels are really bad, especially on IPS and TN, which are usually 880:1 to 980:1. The blur is, at best, a soft blur at high fps + high Hz, and at worst a smearing blur on typical 60 Hz LCD tech (and even at lower frame rates on a high-Hz monitor). A good FALD VA can help a lot with black levels and contrast, up to 5000:1, 8000:1, or much more with a denser FALD array. But especially with HDR going forward, the haloing glow and/or dimming of areas is a big issue, since the ratio of FALD backlight zone size to pixel size is huge, and HDR often shows extremely bright, full-color-volume highlights and edges right next to darker scene sections or inky blacks, in a dynamically changing and panning scene at that.


OLED would be a great PC gaming leap once it gets HDMI 2.1 120 Hz 4K + VRR, except for the fact that the organics degrade over time no matter what and, more importantly, carry a risk of permanent burn-in. The screens shift to lower brightness modes, use screen/pixel-saving methods, and limit peak brightness for a reason. From what I've read they are usable for thousands of hours without issue, but some static content increases the risk (per Rtings' "real life" scenario OLED burn-in test, the CNN logo seems to be the worst), and there would always be that fear in the back of my mind after spending $1200-$2600 on an OLED for PC. It's been 30 weeks of testing over at Rtings. I'm still on the fence, but I won't be in the market until 2019's HDMI 2.1 LG OLEDs are out anyway. The other issues with current LG OLEDs are lower-than-optimal HDR peak brightness and some banding in large areas of similar color, which loses detail if you turn on the MPEG noise-reduction feature to reduce it.
 
.....

60 fps is molasses, with the worst smearing blur of the whole screen in 1st/3rd-person games where you are continually moving your viewport around. High fps + high Hz is not just for twitch gaming; it is a huge aesthetic benefit in both motion clarity (blur reduction) and motion definition (double or more the unique motion-state images, like a flip book with twice the pages flipping twice as fast). It tightens sample-and-hold blur into more of a soft blur (better still with good overdrive), instead of the smearing blur you get at 60 fps...

When you say a game "gets X fps," you are talking about the AVERAGE, so you are really ranging down into 50 fps, and in some games even down to 30 fps, in the bottom third of your frame-rate graph. This is sludge to me. Think of a strobe light cutting away motion definition, except instead of seeing the black state you just see the last action frozen through the black states of the strobe. That is what's happening to everything in the game world, and to the motion of the viewport itself, when you run 60 fps-Hz instead of 100-120 fps-Hz, where you would get glassy motion, more defined pathing (more dots per dotted line), more animation-cycle definition, and the movement-keying and mouse-looking of the entire game world moving relative to you with more definition and glassiness at half the blur. So it is very much an aesthetic thing. 4K, at least below 100 fps-Hz, makes for good screenshots.

...
 
After seeing it all, new and old, including plenty of CRT time back in the day: IQ over Hz, especially as I mostly do workstation stuff. But a good balance is best for when I have some gaming time :)
Something like 10-bit, 80-90 Hz would be the optimal minimum. But if things get crappy-looking, dull, and less vibrant, which is often the case, then nope. I'll go slower and nicer-looking.
 
Something like 10-bit, 80-90 Hz would be the optimal minimum. But if things get crappy-looking, dull, and less vibrant, which is often the case, then nope. I'll go slower and nicer-looking.

This, exactly. That's why I don't care about 144 Hz. It never looks as good as better lower-Hz panels. Smoother? Sure, but I'd rather look at a slower, nicer image than a really fast, dull one.
 
In the market right now? Then yes, there is still a lot of fragmentation in the trade-offs.

However, that's going to change in the next two years, with HDMI 2.1, HDR 1000 (1000 nit+ peak), FALD contrast ratios of 10000:1 or more, Rec. 2020 color, and native 4K 120 Hz 4:4:4 becoming standard in displays over time.


Frankly, this will blow all current displays away especially for gaming and media/movies.

The only displays even close are the Nvidia FALD HDR 4K models, but they don't have HDMI 2.1, so they can't do 4:4:4 color past 98 Hz, and they lack VRR (including for consoles) and a few other HDMI 2.1 enhancements. Their pricing is also insane even for high-end tiers: over $2000 for a 27" 4K (tiny for 4K, imo), and over $5000 for a BFGD, when a top-of-the-line Samsung tier with 100 more FALD zones (480 vs. 384) costs $1700 less, and an 80-zone model costs $2500 less (half). And again, no HDMI 2.1 on the Nvidia displays or GPUs, while TVs in 2019 and the next generation of consoles in 2020 will all have HDMI 2.1 and its features.





Non-HDR (SDR) color would be the flat base plane of those HDR color volumes, and for most monitors it would be a smaller cutout within even that 2D plane.

 
If Samsung gets their shit together and puts DP 1.4 on their premium screens, we've got a pretty solid winner for PCs on most fronts. Making the smaller sizes more available would help a lot too. No G-Sync, but possibly VRR over that port, though you can always just go all-in on GPU power to mostly make up for the lack of whatever-sync.

For now I got one of the Korean 43" 120 Hz 4K IPS monitors using Make an Offer on a killer-discount day, and it's not bad. I run it at 10-bit 96 Hz most of the time, though it's a tad on the "IPS blue" side no matter what color settings I tweak.
 