OLED Gaming Displays

I said that the benefits of high hz + high fps are not just a twitch-shooter advantage - it is a very real aesthetic difference.
That was the main point of my post. While people continually post that it's just for twitch shooters and ignore or dismiss its massive aesthetic benefits in games, I'll keep posting examples of what they are ignoring.

The graphics I post are the ones I took the time to find that give some glimpse of what people are missing out on or trading off, since you can't "see" it without a high hz monitor running high frame rates. If I find examples that relay that information more clearly I'll save them and start posting those :b Unlike a still-frame screenshot wallpaper showing off high-resolution 4k, you can't show high hz + high frame rate, or the HDR, high contrast, and black depth of OLED, etc., to someone without the hardware to see it.

-----------------------------------

You can surely enjoy gaming on 60hz screens and consoles and handhelds. You can enjoy gaming at 1080p resolution too.

In going 4k at 60hz or 60fps currently, you are losing one of the biggest benefits of PC gaming since 2009: the huge aesthetic benefit of high hz. Running a 60hz-max 4k monitor or a 60fps average you are actually, somewhat ironically, dropping the image clarity (and motion definition) of the entire viewport in 1st/3rd person games where you movement-key and mouse-look at speed - going from glassy motion and softened viewport contents during movement to slideshow motion and smearing viewport movement.

People gaming on 60hz OLED tvs currently are going even further and dropping g-sync/VRR capability. Luckily, in the future (perhaps the 2019 LG tvs) there will be hdmi 2.1 displays with 120hz native 4k, HDR, VRR (variable refresh rate), and QFT (quick frame transport, for low input lag gaming)... and hopefully support for those features in future, more powerful gpu generations.


We are just at the point where a single 1080 Ti can run decent enough frame rate averages to get appreciable benefits from high hz at 1440p in demanding games. Personally I am not willing to give up 120hz+ gaming (with enough fps to benefit from it), which I've had since 2010, or VRR/g-sync, which I've had since 2014.




--------------------------------------------------------------------

I agree g-sync is great.

Regarding backlight strobing / ULMB mode, however, the graphics settings and other tradeoffs are huge. I doubt ULMB will work with HDR going forward either.

"If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.
==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher resolution at any kind of high-to-ultra settings in the most demanding games won't let you sustain high fps, only average it.


As per blurbusters.com's Q&A:
-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.
This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.
G-SYNC monitors allow you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.
Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate during VSYNC OFF operation. There can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
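
As a quick illustration of that beat-frequency point (a rough sketch of my own, not from Blur Busters): when the frame rate sits just below or above the refresh rate, the mismatch produces a beat at roughly |refresh - fps| Hz, which shows up as periodic stutter.

# Approximate beat-frequency (harmonic stutter) events per second.
def beat_stutter_per_second(refresh_hz: float, fps: float) -> float:
    return abs(refresh_hz - fps)

print(beat_stutter_per_second(120, 119))  # 1.0  -> "119fps @ 120Hz can cause 1 stutter per second"
print(beat_stutter_per_second(120, 110))  # 10.0 -> a faster, less obviously periodic beat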
-------------------------------------------------------------
Pre-requisites
  • Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz). LightBoost motion blur elimination is not noticeable at 60 frames per second.
--------------------------------------------------------------
Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.
 
I disagree, for my tastes at least. 60fps is molasses and the worst smearing blur of the whole screen in 1st/3rd person games where you are continually moving your viewport around. High fps + high hz is not just for twitch gaming, it is a huge aesthetic benefit in both motion clarity (blur reduction) and motion definition (double or more the unique motion-state images, in a flip book that is flipping twice as fast). It tightens sample-and-hold blur from a smearing blur at 60fps down to more of a soft blur, better still with good overdrive ...

When you say "gets X fps" you are talking about the AVERAGE so you are really ranging down into 50 and on some games even down to 30 fps in your fps graph 1/3 of the graph. This is sludge to me. Think of a strobe light cutting away motion definition but instead of seeing the black state you just see the last action frozen through the black states of the strobe light. That is what's happening to everything in the game world and the motion of the viewport itself when you run 60fps-hz instead of 100 to 120fps-hz. where you would get glassy motion and more defined pathing (more dots per dotted line) and even more animation cycle definition.. as well as the movement keying and mouse looking of the entire game world moving in the viewport relative to you moving with more definition and glassiness with half the blur. So it is very aesthetic. 4k, at least sub 100fps-hz, makes for good screenshots.

I don't disagree, and I can see the extra frames bringing smoothness to everything, not just games. However, for lots of cinematic games 60fps is well beyond the consoles' 30fps or movies. There are plenty of games where I'd rather have real 4k at max settings and 60fps, like the [H] graph I showed, than 144hz or whatever.

Of course, it would be awesome to never have to choose what compromise you want to make, but realistically, if a 1080Ti could do 4k@144hz in every game, we'd just start talking about how much motion blur there is and how we need 200/240hz or more. 60fps isn't perfect, it's just the fps benchmark for PC land gaming. At 4k, the pixel pitch is finally getting small enough that increases in resolution matter less; at 8k I expect it to mostly max out. Can't wait for my 8k@240hz 42" HDR display someday and whatever GPU is required to drive it.

For now it's moving goal posts and individuals deciding what compromises they want to make. I have no issue with someone gaming at 1080p@144hz or 4k@60fps. Find the settings that make you happy and games do vary. Just don't tell everyone else they're wrong for choosing a different compromise than you. Saying you can't game at 4k on a 1080Ti is just silly, you can, [H] has shown it, but you make different compromises than 1080p@144hz.
 
I said that the benefits of high hz + high fps are not just a twitch-shooter advantage - it is a very real aesthetic difference.
That was the main point of my post. While people continually post that it's just for twitch shooters and ignore or dismiss its massive aesthetic benefits in games, I'll keep posting examples of what they are ignoring.
The problem you face isn't technical. The loudest advocates of the ultra-high refresh monitors, and whom the manufacturers market to, are twitch gamers, not everyone else. They go so far as to start using TN panels which have awful colors to get there too. Some day we may not have to make those compromises, but for now, I haven't seen a high refresh rate monitor outside of OLED that looked even remotely good.

Thus, when people point out that currently, all the high refresh rate is primarily for twitch gaming, they are correct. I do wish it wasn't like that.
 
Sure, they are heavily marketed "TO" twitch gamers, with plenty of marketing speech about kills and annihilating your competition, etc.

However, people in discussions like these often use "FOR twitch gaming" to dismiss how great the aesthetic benefits of running higher fps on a high hz monitor are, as if it's not a night and day difference. It is a huge difference in visual eye candy.



whom the manufacturers market to, are twitch gamers, not everyone else. They go so far as to start using TN panels which have awful colors to get there too.

Marketed heavily to, true.. but not exclusively.
ROG Swift PG27UQ is the very first monitor to run 4K UHD (3840 x 2160) content at a 144Hz refresh rate, providing gamers with detailed ultra-high definition visuals at extremely smooth frame rates and a 4ms response time. Get unmatched levels of details, sharp images and crisp text.

The people buying 4k 120hz FALD HDR monitors care about color (even into HDR color volume), contrast/black depth, and across-the-board aesthetics, including the motion clarity and motion definition of a high hz g-sync monitor, and apparently they are willing to pay a lot for it.



Knowing that in 2019 there should be an LG 4k HDR OLED tv with hdmi 2.1 for 120hz input at 4k, plus VRR and QFT support, is making me skip the $2k+ price tag on those FALD monitors, even if I have to wait a while for a gpu that supports VRR.

Someone supposedly did a workaround recently to make freesync work off an nvidia card, so who knows. If it works 100% and lasts without being blocked, maybe there is hope for an hdmi VRR gpu sooner than I'd thought. Xbox already supports VRR over hdmi 2.0b, incidentally.
 
I just think it's funny that I can get a 55" 2017 OLED for $1100 but I can't get a desktop sized display for anything less than multiples of that! I can't even get a 37" HDTV OLED to run as a monitor, they all come 55" or larger :p
 
I hear you, but
I'd pay over $2000 for a 2019 55" LG 4k HDR OLED w/ HDMI 2.1 + 4k 120hz native input + VRR + QFT
before I'd buy a $2000+ 27" FALD IPS. Even if I have to rearrange my pc room so the desk faces the tv from further away.

I think the first 55" LG oled was $3500 in 2014. I think they were $2300 in 2016. So new model lines can be pricey at release.

I'm saving for the 4k 120hz VRR OLED. If those FALD HDR monitors came out 2 - 3 years ago it might be different.
 
Thus, when people point out that currently, all the high refresh rate is primarily for twitch gaming, they are correct. I do wish it wasn't like that.
120Hz is starting to become the normal "office" standard even in phones because of its better feel. They have OLED too. Phones get all the good stuff while we're still stuck in 2005 :p

I don't see why 120Hz shouldn't be the standard for office monitors as well; it does allow smoother operation even in the desktop UI.
 
120Hz is starting to become the normal "office" standard even in phones because of its better feel. They have OLED too. Phones get all the good stuff while we're still stuck in 2005 :p

I don't see why 120Hz shouldn't be the standard for office monitors as well; it does allow smoother operation even in the desktop UI.

This is correct. I can tell the difference between 60, 75, 120 and 144Hz just in how moving windows around feels... as the refresh goes up, it's so silky buttery smooth, like everything is coated in lube. It's GREAT.
 
The problem you face isn't technical. The loudest advocates of the ultra-high refresh monitors, and whom the manufacturers market to, are twitch gamers, not everyone else. They go so far as to start using TN panels which have awful colors to get there too. Some day we may not have to make those compromises, but for now, I haven't seen a high refresh rate monitor outside of OLED that looked even remotely good.

Thus, when people point out that currently, all the high refresh rate is primarily for twitch gaming, they are correct. I do wish it wasn't like that.

The 8-bit TN panels found in some high refresh rate displays are far from awful. I've been using an ASUS PG278Q for years and it has been wonderful. It has no issues like dithering and barely any vertical color shift. I am fully aware of how awful the TN panels in cheap laptops and monitors are, and this one is nothing like those. It's accurate for the sRGB color space, which is what most content uses. It won't do those oversaturated colors some love in higher-gamut displays.

The pros are, of course, the lowest pixel response times, but to be honest, with the advancements in IPS displays I don't think I would notice.

I don't disagree, and I can see the extra frames bringing smoothness to everything, not just games. However, for lots of cinematic games 60fps is well beyond the consoles' 30fps or movies. There are plenty of games where I'd rather have real 4k at max settings and 60fps, like the [H] graph I showed, than 144hz or whatever.

I agree with this. On console games that have the option for 1080p at 60 fps vs 4K (or checkerboard version) at 30 fps I often opt for the 4K option instead because it looks better. These are usually games that don't require super fast reflexes and are very story and visual focused like the new God of War.

On PC I aim for 60 fps at 1440p with my current GPU but with G-Sync I don't care too much if it varies around that figure. I think the best thing PC gamers could do is turn off their damn fps counter software. Seeing that number in the corner will mentally fuck you up as you become more aware of framerate differences than you would be without them. While you can feel the difference between 120 and 60 fps I don't think it's anywhere near as jarring as 30 vs 60 fps or dropping under 30 fps.

With the compromises we have to make to get our desired playing experience, we need better ways to handle resolution. Checkerboard rendering, integer scaling, and user-configurable dynamic resolution scaling (e.g. "scale between 1440p and 4k") would help tremendously, especially now that we are entering a time where ray tracing means huge drops in framerates until hardware catches up.
 
The problem you face isn't technical. The loudest advocates of the ultra-high refresh monitors, and whom the manufacturers market to, are twitch gamers, not everyone else. They go so far as to start using TN panels which have awful colors to get there too. Some day we may not have to make those compromises, but for now, I haven't seen a high refresh rate monitor outside of OLED that looked even remotely good.

Thus, when people point out that currently, all the high refresh rate is primarily for twitch gaming, they are correct. I do wish it wasn't like that.
Not necessarily. 240 Hz displays certainly do use poor-quality TN panels these days, but they are often 1920x1080 and marketed directly at the twitch gamer who refuses to play at anything more than 800x600 resolution in the mistaken perception that it gives them an advantage. 120 Hz is becoming ubiquitous in nearly every display market segment because the advantages outside of gaming are apparent. Even most gaming displays with high refresh are using IPS or IPS-type panels these days. The panel in the PG27UQ is one of the best LED LCDs I have ever seen.
 
The PG27UQ's FALD makes up for what would otherwise be a few issues.. but you can just leave it on dynamic for SDR content, so the end result is what's important.

The white / black / contrast performance is great on those (variable backlight):
  • Varying-sized white patches on a black background (and black patches on white) yielded black depths of 0.03 to 0.08 and 0.15 to 0.37 nits
  • and contrast ratios of 946:1, 2,320:1, 4,225:1, 4,986:1, 8,725:1 and 11,900:1
  • HDR overall: 1285 nit white, 0.03 nit black depth, 42,833:1 contrast
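
Just to show where those ratios come from (my own arithmetic; the 350-nit pairing below is a hypothetical illustration, not a TFTCentral measurement) - contrast ratio is simply measured white luminance divided by measured black luminance:

# Contrast ratio = white luminance / black luminance, both in cd/m2 (nits).
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

print(round(contrast_ratio(1285, 0.03)))  # ~42833 -> the quoted HDR 42,833:1 figure
print(round(contrast_ratio(350, 0.37)))   # ~946   -> e.g. a hypothetical 350 nit white over a 0.37 nit black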


You do go back up to a 9.3 ms response time at 60fps-hz on it (and combine that with the sample-and-hold blur at 60fps-hz or other low frame rate average ranges). You only get the 6.6ms or 5.x ms response times on the PG27UQ and the PG279Q IPS screens when you are at very high hz + high fps rates, which can be impossible to achieve in demanding games at any kind of very high to ultra settings at 4k resolution. It is definitely the best monitor out right now, but IMO the price/size and the upcoming 2019 hdmi 2.1 120hz-native 4k HDR OLED TVs with VRR and QFT move my $2k budget elsewhere. I can be patient and pick my battles sometimes :)

TFTcentral about PG279Q gaming IPS
So what does this all mean? Well it means that the pixel response times of the screen will vary a little depending on the refresh rate you're using. If you plugged in a 60Hz console, the response times would be ~8.5ms G2G, still very good for an IPS panel. If you use G-sync and the refresh rate fluctuates between 30 and 144Hz, the response times are controlled dynamically and will vary a little as refresh rate changes. To be honest we aren't talking huge differences, although when you combine the slightly higher response time impact on blurring, with the impact of lower refresh rates on perceived blur, you will notice some difference in motion clarity depending on your active refresh rate. The variation in response times isn't really a big factor, and you're more likely to notice the difference in motion clarity caused by the changes in refresh rate anyway.

TFT central regarding the PG27UQ
"the 'normal' mode showed a good improvement compared with at 60Hz, with average G2G response time improving from 9.3ms to 6.9ms when at 98Hz, and a little lower at 6.6ms at 120Hz. "


--------------
elvn:
hopefully in 2019 I'll end up with a 55" LG 4k HDR OLED with HDMI 2.1's 120hz at 4k, VRR (variable refresh rate), and QFT (quick frame transport, for low input lag gaming)... but it will require powerful gpu(s) in the future to run it. At that size my desk would have to sit further away, but with the right kind of setup I'd have the option, when running more demanding games, to run a smaller 21:9 or 16:9 resolution within that screen to get higher fps while still having quite a large screen with infinite black level, per-pixel emission (which avoids halos, glows and uniformity issues), tiny response times, and HDR with HDR color volume.

It would be great if 1080p-to-4k scaling worked properly as another option, but that doesn't seem to be happening. 1440p scaled to 4k usually ends up looking better, but I've never been a fan of improper scaling.. Spending $2k+ now on a 27" FALD isn't happening knowing I could save that money toward an HDR OLED 4k 120hz VRR in 2019 and gpus in 2019 - 2020. For me 4k would be good at 40" (or bigger, further away) to get more desktop real estate and allow me windowed/other resolution and aspect ratio options without going tiny.
 
Not necessarily. 240 Hz displays certainly do use poor-quality TN panels these days, but they are often 1920x1080 and marketed directly at the twitch gamer who refuses to play at anything more than 800x600 resolution in the mistaken perception that it gives them an advantage. 120 Hz is becoming ubiquitous in nearly every display market segment because the advantages outside of gaming are apparent. Even most gaming displays with high refresh are using IPS or IPS-type panels these days. The panel in the PG27UQ is one of the best LED LCDs I have ever seen.
I personally can't wait for 120/144hz to become ubiquitous. 4k HDR 100/120/144hz panels are very new. I can see the effects of a low 60hz panel, I can see the effects of a low resolution panel, and I can see the effects of a panel with poor color and dynamic range. For now it's a trade-off and does depend on what games you play and your preferences. Personally, I can tolerate 60hz better than I can tolerate low resolution in most cases. That includes the desktop and most games I play. 60hz is still a huge increase over consoles or movies!

Edit: this all started with people lamenting that you can't play games at 4k, when you can. It's just a different set of compromises depending on your preferences; resolution is one of them. The monitor that has greater specs than your eyes doesn't exist quite yet, but we're getting closer.
 
I personally can't wait for 120/144hz to become ubiquitous. 4k HDR 100/120/144hz panels are very new. I can see the effects of a low 60hz panel, I can see the effects of a low resolution panel, and I can see the effects of a panel with poor color and dynamic range. For now it's a trade-off and does depend on what games you play and your preferences. Personally, I can tolerate 60hz better than I can tolerate low resolution in most cases. That includes the desktop and most games I play. 60hz is still a huge increase over consoles or movies!

Edit: this all started with people lamenting that you can't play games at 4k, when you can. It's just a different set of compromises depending on your preferences; resolution is one of them. The monitor that has greater specs than your eyes doesn't exist quite yet, but we're getting closer.

Very much this for me.

I've been gaming on BenQ 3201s for a while now, and while more than 60hz would be nice, anything smaller than 32" seems tiny to me, and lower res than 4k bothers me greatly too. Everyone has their individual tastes, and there's nothing out there right now that covers all the bases.
 
I may be interested in the samsung QLEDs due to OLED burn-in concerns in the back of my mind. The samsung "QLEDs" and the LG OLEDs are very good 4k HDR monitors already. They just need hdmi 2.1 bandwidth, VRR and QFT.

The 27" FALDS are the best monitors out right now but that is only because hdmi 2.1 wasn't ready for 2018. If you want size, get a LG 4k HDR OLED with hdmi 2.1 120hz native 4k with VRR + QFT in 2019 , or a Samsung "QLED" 4k HDR VA LCD with 120hz native 4k, VRR, QFT.. The smallest they go is 55" but if you have the room to rearrange your desk to be further from the monitor I see no problem there. In fact, it would allow me to run 21:9 or 21:10 or even smaller 16:9/:10 resolution(s) 1:1 for higher frame rates while still having a very large viewport/monitor.

----------------------------------------------------------------

LG 4k HDR OLED with hdmi 2.1 120hz native 4k with VRR + QFT in 2019


LG 2018 C8 60Hz Rtings review

OLED's per-pixel emission avoids FALD halos/glow and any screen uniformity issues, and it has an INFINITE:1 contrast ratio, which is amazing, but there is still the chance of burn-in over time.

  • Real scene HDR Brightness is very good, but still short of the 1000-4000 cd/m² HDR is mastered for. Large bright scenes are very dim due to the Automatic Brightness Limiter(ABL).
  • Black Level.. Infinite:1
  • The OLED55C8PUA has perfect black uniformity, with no clouding due to its ability to turn off black pixels.
  • Excellent color and white balance dE after calibration, better than the C7 and Samsung's Q9F. While the calibration out of the box was already very good, after calibration the colors were nearly perfect. Gamma follows our target almost perfectly.
  • The C8 has decent coverage of the P3 color space, but is unable to produce overly bright, saturated colors.
  • C8 displays our test gradient smoothly with no significant banding. In certain scenes there is some banding noticeable in large areas of similar color. This can be reduced by enabling 'MPEG Noise Reduction', which toggles the gradient smoothing feature of the C8. This reduces the visible banding but also results in a loss of fine detail.
  • OLED TVs such as the LG OLED C8 have an inherent risk of experiencing permanent image retention.
  • C8 handles motion extremely well. The near instantaneous response time is excellent for watching sports or playing video games, as there is no ghosting or trailing during fast motion. Also, there is no visible flicker since there is no traditional backlight on OLED TVs, unlike Samsung's QLED technology. One downside to OLED technology is that there is some stutter when playing low frame rate content, especially when watching movies or TV Shows.
  • Like all OLED TVs, there is no visible backlight flicker which helps motion appear smoother, but it does result in some persistence blur.
  • 4k @ 60Hz + HDR : 29.4 ms
  • 4k @ 60Hz @ 4:4:4 : 21.1 ms
  • 1080p @ 120Hz : 21.9 ms
  • Great choice for PC use. Image remains accurate when viewed at an angle so the sides of the screen are uniform. Supports chroma 4:4:4 for clear text across all backgrounds
  • the brightness of the screen changes depending on the content and areas of static content may have a risk of burn-in (see here)

An alternative, given burn-in concerns, would be whatever the samsung Q8F series equivalent will be in 2019. They are HDR 1000 FALD VA tvs.
The high-end samsung "QLED"s already support VRR/freesync with amd gpus and the xbox one in their 2018 models; they just can't do 4k 120hz native input yet since there is no hdmi 2.1 circuitry in 2018 tvs.

Samsung Q8F (rtings review)
  • Excellent wide color gamut
  • Feels responsive due to low input lag
  • Great motion handling
  • the viewing angles are poor so the sides of the screen lose accuracy when viewed from up-close.
  • "Excellent contrast ratio on the Samsung Q8F. It features a full array local dimming feature and is able to get very deep blacks. 7957:1 "
  • "Very good brightness with HDR content. Small highlights are hitting the target 1000 cd/m² that HDR is mastered for. The screen brightness dips considerably with very bright scenes, but is still good for a bright room. Similar brightness to the LG C8, but with brighter highlights in very dark scenes, as shown by the small window tests."
  • "Excellent wide color gamut. The Q8FN can display nearly 100% of the P3 color space, and has the highest Rec.2020 coverage we have ever seen, although it is very close to the 2017 Q9F"
  • Update 06/08/2018: FreeSync has been tested and the score has been updated. FreeSync was supported from our Xbox One S and our Radeon RX 580 GPU, in 1080p, 1440p and 4k resolutions. FreeSync is activated by enabling the TV's Game mode and FreeSync settings
  • Excellent low input lag on the Samsung Q8FN QLED TV. Input lag is exceptionally low with 120 Hz content, similar to the NU8000, and better than the LG C8. It can display most resolutions without any issues, but chroma 4:4:4 is not supported in PC Mode with a 1440p@120Hz signal (likely a bandwidth limitation that will be overcome with hdmi 2.1 models at 1440p and 4k 120hz - see the rough bandwidth math after this list)
  • 4k with Variable Refresh Rate : 15.4 ms
  • 4k @ 60Hz @ 4:4:4 + 8 bit HDR : 16.7 ms
  • 1080p with Variable Refresh Rate : 6.5 ms
  • 1440p @ 120 Hz: 10.0 ms
  • can also interpolate games while keeping a low input lag, which is great for smooth play. 4k interpolated: 20.8ms
  • Great choice for a PC monitor. Picture quality is good. The TV supports chroma 4:4:4 for clear text across all backgrounds, and it has low input lag so the TV feels very responsive. It also has a low response time
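
For anyone wondering why 4k 120hz with full chroma has to wait for hdmi 2.1, here is a rough back-of-the-envelope estimate (my own numbers, assuming roughly 20% blanking overhead and 8-bit RGB/4:4:4, so treat it as an approximation rather than exact HDMI timing math):

# Rough uncompressed video bandwidth estimate (ignores exact blanking timings).
def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int, blanking_overhead: float = 0.2) -> float:
    return width * height * hz * bits_per_pixel * (1 + blanking_overhead) / 1e9

print(round(raw_gbps(3840, 2160, 60, 24), 1))   # ~14.3 Gbps -> right at hdmi 2.0's ~14.4 Gbps payload limit
print(round(raw_gbps(3840, 2160, 120, 24), 1))  # ~28.7 Gbps -> needs hdmi 2.1 (~42 Gbps usable)
print(round(raw_gbps(3840, 2160, 120, 30), 1))  # ~35.8 Gbps -> 10-bit 4k 120hz HDR, still within hdmi 2.1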
 
You are living in a golden era of display technologies

The progress made in the last 15 years is absolutely astonishing. 15 years ago I was using a monitor that was not that different from what was available 30 years ago.

Today I have a huge display wall with lifelike colors, no distortions, no artifacts, smearing or noise (it is in fact perfect). It does not tire my eyes or irradiate my face when I watch it all day. It is light and power efficient. And I can get it new for less than $2000.

4k @120hz will happen one day, but it will be a very minor bump in display experience and not the nirvana you imagine it to be.
 
You are living in a golden era of display technologies

The progress made in the last 15 years is absolutely astonishing. 15 years ago I was using a monitor that was not that different from what was available 30 years ago.

Today I have a huge display wall with lifelike colors, no distortions, no artifacts, smearing or noise (it is in fact perfect). It does not tire my eyes or irradiate my face when I watch it all day. It is light and power efficient. And I can get it new for less than $2000.

4k @120hz will happen one day, but it will be a very minor bump in display experience and not the nirvana you imagine it to be.

No smearing? Bullfuck.
 
What? Golden age? Literally everything after CRT and plasma is worse, except OLED.
 
FW900 CRTs had essentially zero blur, low input lag and great screen uniformity decades ago. LCD was a huge step down for moving content back when I got one, to the point where for years I kept an LCD and a FW900 at the same desk, up until several years ago. I also had a Sony XBR960 34" widescreen tv with hdmi input for years. Their size most of all, the "crispness" of their pixels to a degree, geometry management issues, and their overall age and support now make them a bad choice for me.

4k @120hz will happen one day but it will be a very minor bump in display experience and not the nirvana you imagine it to be.

120hz at high frame rates is a HUGE increase in display experience, especially for 1st/3rd person gaming aesthetics (and even for watching sports, if it were recorded and transmitted at high frame rates).

In 1st/3rd person games you are constantly moving your viewport around at speed, so it's not just a simple flat-colored bitmap UFO-test object smearing. The entire viewport and game world (of high-detail textures and depth via bump mapping, etc.) is smearing in relation to you during movement-keying and mouse-looking. 120fps at 120hz cuts sample-and-hold blur by 50% and doubles your motion definition and motion path articulation, stepping up to glassy smoothness (more dots per dotted line; twice the unique animation cells in a flip book paging twice as fast, so to speak). 100fps-hz cuts sample-and-hold blur by 40% and gives a 5:3 motion definition improvement (10 unique frames shown at 100fps-hz for every 6 shown at 60fps-hz).
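
To put rough numbers on that (a minimal sketch of my own, ignoring pixel response time and strobing): on a sample-and-hold display each frame is held for roughly 1/fps seconds, and that hold time is what smears across your eye during viewport movement.

# Approximate per-frame hold time and blur reduction on a sample-and-hold display.
def persistence_ms(fps: float) -> float:
    return 1000.0 / fps

def blur_reduction_vs_60(fps: float) -> float:
    """Fraction of sample-and-hold blur removed compared to 60fps-hz."""
    return 1.0 - persistence_ms(fps) / persistence_ms(60)

print(round(persistence_ms(60), 1))        # ~16.7 ms held per frame
print(round(persistence_ms(120), 1))       # ~8.3 ms -> half the hold time
print(round(blur_reduction_vs_60(120), 2)) # 0.5 -> the "cuts blur by 50%" figure
print(round(blur_reduction_vs_60(100), 2)) # 0.4 -> the "cuts blur by 40%" figure
# Motion definition scales directly with unique frames per second: 100:60 reduces to 5:3.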

Today I have a huge display wall with lifelike colors, no distortions, no artifacts, smearing or noise (it is in fact perfect). It does not tire my eyes or irradiate my face when I watch it all day. It is light and power efficient. And I can get it new for less than $2000.

At a 60fps or 60hz cap you are getting smearing blur during viewport movement. At 100 - 120fps on a high hz monitor you cut that blur down to more of a softened blur within the "shadow masks" of everything on the screen - within the lines of the coloring book, so to speak. Modern gaming overdrive and low response times help mitigate this, or it would be much worse.

Variable refresh rate is another HUGE bump in display experience. It allows you to straddle or at least dip well into those higher hz and frame rate ranges in a frame rate graph that has spikes, dips, and potholes without experiencing judder, stutter, stops, or tearing. So you can tweak your graphics settings higher for a better balance and avoid the bad effects of v-sync or no-syncing at all.

There are a few tvs that can do 60hz to quasi-120hz with interpolation, even with game mode active, without adding a ton of input lag, but input lag still jumps from 15ms up to about 20ms on the best samsung LCD screens. Most add a lot of input lag. Interpolation does reduce blur in a way, but it does not add more actual frames of action. It's more like repeating a frame and floating it, which can give an odd effect.. and it is usually not without artifacts, dark halos, etc.

The backlights and uniformity of general LCD tech are terrible, the black levels are really bad (especially on IPS and TN, which usually sit at 880:1 to 980:1 contrast), and the blur is at best a softened blur at high fps + high hz and at worst a smearing blur on typical 60hz LCD tech, or even at lower frame rates on a high hz monitor. A good FALD VA can help with the black levels and contrast a lot - up to 5000:1, 8000:1 or much more with a denser FALD array.. but especially with HDR going forward, the haloing glow and/or dimming of areas is a big issue, since the ratio of FALD backlight zones to pixels is huge and HDR often shows extremes of bright, full-color-volume highlights and edges right next to darker scene sections or inky blacks, in a dynamically changing and panning scene at that.


OLED would be a great pc gaming leap once it gets hdmi 2.1 120hz 4k + VRR, except for the fact that the organics degrade over time no matter what and, more importantly, have a risk of permanent burn-in. The screens shift to lower brightness modes, use screen/pixel saving methods, and limit peak brightness for a reason. From what I've read they are usable for thousands of hours without issue, but some static colors increase the risk (in Rtings' "real life" scenario OLED burn-in test, the CNN logo seems to be the worst).. and there would always be that fear in the back of my mind after spending $1200 - $2600 on an OLED for PC. It's been 30 weeks of testing over at Rtings. I'm still on the fence, but I won't be in the market until 2019's hdmi 2.1 LG OLEDs are out anyway.. The other issues with the current LG oleds are lower than optimal HDR peak brightness and some banding in large areas of similar color, which you can reduce by turning on the MPEG noise reduction feature at the cost of some fine detail.
 
A few of the most relevant quotes I found in the discussions, rather than just re-writing in my own words:

https://www.rtings.com/tv/learn/real-life-oled-burn-in-test/discussions

-----------------------------------------------------------------------------------------------------


https://www.rtings.com/tv/discussio...-was-wondering-how-burn-in-works-with-oled-so

Burn-in on OLED screens is caused by the diodes emitting less light as they age. In theory any color except for true black can cause burn-in on an OLED screen, but we don't quite know the long term causes yet. In theory the brighter your screen is, the faster the pixels will age and you will see long term effects. Initial symptoms will be some areas of the screen appearing dimmer than the rest of the screen. Older screens also tended to show a slight red-green shift in colors since the blue diodes aged faster, but newer pixel structures have helped to alleviate that. There are some tools that claim to reverse burn-in; all they do is age the rest of the screen to the same levels as the burnt-in part.

https://www.rtings.com/tv/discussio...nt-red-uniformity-issues-for-the-fifa-and-ncb

Hi and thanks for contacting us and pointing this out. These uniformity issues are visible in person, and the brighter areas resemble the Live CNN TV around week 6 (see here). We expect that this is overcompensation (a result of the algorithm).

https://www.rtings.com/tv/discussio...tnite-on-the-attached-xbox-one-s-2-hours-at-a

As long as more varied content is displayed during the rest of the week, you should not see permanent burn-in. There might be temporary image retention around the HUD (Heads-Up Display) elements after a 2-hour play session, but they usually go away after displaying more varied content. Running the fix program in the TV menu can help remove temporary image retention. It is important to note that permanent burn-in is caused by the pixels emitting less light as they age. Therefore, we expect the effect to be cumulative, so the total time during the life of the TV that the same pixels are illuminated has more importance than the length of the playing session. A good way to reduce the risk of burn-in is to display varied content as the pixels will wear more evenly across the screen.
 
I'm enjoying this thread very much, but I gotta say, whenever I read posts like "4K 144HZ GSYNC HDR OR BUST" I can't help but roll my eyes.

20 years ago, I was playing on crappy 15" CRTs on 640x480 @ 60hz and felt like that was glory. SDR, garbage refresh with our game hardware back then, lol at adaptive sync anything... and it was GREAT anyway. Now I'm using a frigging 4K 40" display. Sure, I wish I didn't have trouble with some games when trying to force 21:9 within it, but you know what? Anything above 1080p60 is freaking amazing and you should be glad that's baseline quality these days. But it's all a matter of perspective.

Sometimes I force games to 640x480 to scratch the nostalgia itch of super blurry graphics... and if you want to have a laugh, try 320x240... that was AMAZING because it finally ran SMOOTHLY back in the day before GPUs or even earlier on my Voodoo Banshee :D
 
People also demand a lot from games, but there are still indie titles and platformers out there, so it's not all like that. You can still play a lot of demanding games really well on a 1080p screen with a halfway decent gpu, and you can turn the graphics settings up while getting really high frame rates on a high hz monitor if you choose to as well. Compared to the 800x600 and 1024x768 17" monitors and graphics of old, that would be mind blowing, but that's not the point. A lot of technology would be mind blowing to people in the past :b

Unless you called someone, if you wanted to post in a discussion you'd have to drop something in a mailbox writing to a magazine or a newspaper, or find a physical public bulletin board to pin your note to, and wait a long time for a reply if you got any at all. You'd type with an ink-slamming hammer typewriter.. Back in my day we had pong on green ray tubes, not even black and white! Eventually low-baud modems dialing up ascii bulletin boards on a single voice phone line late at night to download tiny games at incredibly slow speeds.. and to play text games! TEXT stories and text rpgs (MUDs). 5.25" floppies. Porn was mostly magazines; you had to go to a shady grocery store to get one, or sneak a peek at someone else's stash if you were a kid. RADIO and TV on crunchy, cardboard-cereal-box-sounding speakers. We had our phones on 12' tangled cables tethered to a wall because they weren't even wireless with big antennas yet! We had no answering machines for a long time! What was a microwave oven? No cable tv or any kind of huge selection of your own chosen content outside of planning it out with a TV guide <--- that's a little magazine, not a guide on the TV! Three to five channels, one being pbs. Convex curved small screens with signal noise and picture interference, and only one color tv in the house.

Time moves on. People demand better screens, reliable cellular and home internet service at decent speeds, etc. Many of us know the display industry can do better. High-end CRTs were better in many ways. Other tech and potential inventions were sat on for a long time or abandoned so they could milk consumers with inferior cheap LCDs longer. We have made some advancements since 2013, especially with higher hz, variable hz, higher resolution screens (2560x1440 mostly), and prices that dropped to $600 - $800 or less for a full-featured one for a while, rather than the $1100 - $1600 of the old 2560x1600 and 2560x1440 ips screens before the korean B-grade knockoffs hit and 2560x1440 became more common. We are on the verge of 4k 120hz, denser FALD arrays in gaming-capable screens, and a more standardized VRR now. Samsung, and perhaps apple, are taking the first steps toward making micro-led displays in the years to come too. But there was a long time when display tech was more or less stagnant. Now consoles like the xbox and amd gpus are supporting variable refresh rate hdmi standards even on hdmi 2.0b, and there are tvs that have HDR 1000 FALD arrays with deep black depth, near-perfect high-volume color, VRR, and low input lag at 55" and larger. Meanwhile nvidia is trying to force g-sync at a huge markup onto what would already be expensive displays 4 - 5 months or so before hdmi 2.1 hits (sometime in 2019 supposedly) and will likely keep a wall up vs supporting hdmi 2.1 VRR on their gpus.


When people say "OR BUST" about this currently, I think we mean that we know the page is turning and HDMI 2.1 120hz 4k HDR VRR is just around the corner. No matter what they won't be very cheap so holding out long enough is a wise move considering the timeframe. There are TCL 4k tvs that work great as desktop/app monitors (and even 60hz gaming) for $240 - $280 and decently priced g-sync gaming monitors or even used ones in the meantime if you need something "right now" , without blowing $1k - $2200 -3k+ on something before hdmi 2. 1. People here probably already have a decent gaming screen for the time being anyway I think. Personally I'm all about picking my battles and waiting for things to be ripe especially with 1k - 2k - 3k price tags of living room tvs or heavy prices on computer hardware/upgrades. That kind of money is nothing to roll eyes at to me personally and demands quality and features.
 
I'm enjoying this thread very much, but I gotta say, whenever I read posts like "4K 144HZ GSYNC HDR OR BUST" I can't help but roll my eyes.

20 years ago, I was playing on crappy 15" CRTs on 640x480 @ 60hz and felt like that was glory. SDR, garbage refresh with our game hardware back then, lol at adaptive sync anything... and it was GREAT anyway. Now I'm using a frigging 4K 40" display. Sure, I wish I didn't have trouble with some games when trying to force 21:9 within it, but you know what? Anything above 1080p60 is freaking amazing and you should be glad that's baseline quality these days. But it's all a matter of perspective.

Sometimes I force games to 640x480 to scratch the nostalgia itch of super blurry graphics... and if you want to have a laugh, try 320x240... that was AMAZING because it finally ran SMOOTHLY back in the day before GPUs or even earlier on my Voodoo Banshee :D

Most people did not play at 60 Hz as that is quite horrible flicker for a CRT. I remember most displays doing 75-100 Hz for everything but unusable resolutions for the sizes at the time. I have a TV studio CRT at home for use with a Raspberry Pi and emulators or old DOS games and those games benefit a lot from the CRT tech. They look more vivid and the lack of resolution is less jarring due to the small screen size and the slight smoothing the technology causes.

By comparison LCD display tech is only now entering a time where a lot of displays have all the goodies needed for a great gaming experience: high resolution, high refresh rate, low input lag and response times.

That said, a lot of people are way too dogmatic about their game performance. The most recent game I've been playing has been God of War on the PS4. I opted for the "favor resolution" option, which means 4K checkerboard at 30 fps, over the "favor performance" option, which is unlocked 30+ fps at 1080p. The game is immensely detailed, so for me the higher res was worth the tradeoff in framerate, considering it doesn't run at a constant 60 fps in 1080p either. People need to turn off their framerate counters and just enjoy games on their chosen platform(s). While PC gaming is known for being the best in terms of performance and visuals, it is now starting to come at a very hefty cost if you want 4K @ 60+ fps. At this point, with GPU prices at an all-time high, I just hope we can at least get some smaller options from TV manufacturers so we don't have to buy hugely expensive desktop monitors just to get high refresh rate 4K.
 
When people say "OR BUST" about this currently, I think we mean that we know the page is turning and HDMI 2.1 120hz 4k HDR VRR is just around the corner.

A) I loved everything about your post.
B) I'm aware I have a historical perspective that I can't forget, so I still judge my current equipment through the eyes of 16-year-old me.
C) Indeed, there is zero point in buying anything that's not HDMI 2.1 or 1 billion colors. It seems that both of these things are going to explode in a big way in 2019; manufacturers know this and are trying to clear inventory any way they can with overpriced 6-bit+dither crap panels.
 