Rant: switched back to IPS after being on OLED for a year

jyi786

Supreme [H]ardness
Joined
Jun 13, 2002
Messages
5,810
I might catch some flak from folks, but here goes: I switched back to an IPS screen from being on my LG 48" C3 OLED for over a year. The IPS screen I switched to is an Asus PG32UQX.

Sure, this is a "you do you" type of deal, but the sorry state of the industry when it comes to displays really has me ticked off. I spent a boatload of money, paired it with an RTX 4090, and got a really subpar experience with my LG C3 OLED. When it worked, it worked FABULOUS. But there were so many small annoyances that it became unbearable to do anything with: gaming, productivity, watching movies, everything.

I chronicled my issues here:
https://hardforum.com/threads/48-lg-c3-oled-vs-the-best-ips.2036445/

Ok, so some of this was to be expected going from a computer monitor to a TV. I fully own that. However, trying to get OUT of that situation by getting another monitor with the same technology is next to impossible. I cannot believe we are in a time where we have to worry about DisplayPort and HDMI CABLES NOT WORKING. Seriously? I bought 3 expensive HDMI 2.1b certified cables, and all of them degraded. Every single one of them. Then, every single OLED screen out there has some ridiculous shortcoming. Not waking up from sleep. Flickering. Color corruption. Poor HDR over time. Cables not working/degrading over time. Blacks looking purple. Ports so loose they feel like they're going to fall out. Needing to select special resolutions in graphics card control panels. Needing to reset graphics drivers when switching power states. VRR flicker of varying degrees, all terrible. Backlight dimming. ABL on white screens, impossible to overcome in HDR. And worst of all? Pretty much every OLED computer monitor out there is trying to be a stupid freaking smart TV instead of a computer monitor. That one is by far the most egregious.

I sat here trying to fix my experience with the LG C3 for over a month. I was at my wits' end. I tried everything under the sun, and everything was a half-assed workaround at best. Then, when I started researching a new monitor to buy, I found everything I listed above. I'm not going to try to sell you on the Asus PG32UQX I switched to. It has some serious flaws of its own (like comical FALD blooming on white-on-dark backgrounds). But they can all be mitigated, which I've done. And best of all? Everything. Just. Works. And it's 2+ years old.

It's just a really sad state of affairs that I've found. I want to be excited, but if I need to worry about cables not working, there's a serious problem with a lack of quality in the industry and it needs to be rectified.

Rant over. TLDR: switched to IPS from OLED and am much happier. Cool story bro. Thanks for reading.
 
Do you live somewhere humid if your cables are degrading? I've had the same HDMI cables (CableMatters and Club3D) hooked up to my LG CX 48" for a couple of years now, and had zero issues with the cables.

I do feel OLED tech is currently better suited for TVs than for desktop displays. There are still too many caveats to make switching to an OLED full-time worthwhile, and that's coming from someone who used the CX for 2 years as my desktop display. But I have pretty much no complaints about it as a media-watching and PC-gaming setup.

I hate the "let's just cram our smart TV stuff into monitors" trajectory. Its only purpose seems to be pushing ads at you rather than providing a better experience. Those smart TV operating systems are generally not designed for quickly changing settings, either, or for monitor-specific features.

The smart TV functionality should be more separated from the monitor functionality, and possible to simply ignore if you don't want to use it. For example, the Samsung ARK's picture-by-picture and screen-resizing features are all cool stuff, but at the same time they are clearly made for a TV rather than a monitor, without any real consideration of desktop use.
 
I didn't see any smart TV features on gaming monitors, to be honest.
The bigger issue is the dumbing-down of OSD controls: almost no control over basic settings, and settings locked into profiles with what look like arbitrary values that aren't even consistent between monitors from the same brand. Like on one IPS I apparently want a bluish image with totally washed-out blacks in the FPS profile, and a reddish, over-sharpened look on another monitor... yeah.
And don't get me started on HDR: I am totally perplexed how, in a standard where each code value maps to a specific luminance, manufacturers still manage to screw it up by adding arbitrary amounts of black crush.
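The "each value has a specific luminance" part is literal, by the way: HDR10 uses the SMPTE ST 2084 (PQ) transfer function, where every code value maps to an absolute luminance in nits. A minimal sketch of that mapping, with the constants straight from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: absolute luminance from a normalized signal value.
# Constants are the ones defined in the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code: float) -> float:
    """Map a PQ signal value in [0, 1] to luminance in cd/m^2 (nits)."""
    if not 0.0 <= code <= 1.0:
        raise ValueError("PQ code value must be in [0, 1]")
    p = code ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# Signal 0.0 is exactly 0 nits and 1.0 is exactly 10,000 nits by design,
# so per-monitor "black crush" is a firmware choice, not ambiguity in
# the standard.
```

There is simply no room in the math for a monitor to decide that the bottom few percent of code values are all black.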

---
Cable issues - never had any.
Maybe the HDMI receiver in your TV was bad, or the cable picked up interference from somewhere, or both. If there is some kind of noise issue, the link can sit close to the threshold of getting errors.
Otherwise, as standards push bandwidth higher and higher, this will become a much bigger issue, and especially with cheap cables it may be hit-or-miss. DP 2.1 at its highest bandwidth apparently doesn't have normal-length cables, and we will most probably need optical fiber just to get 5 m cables.
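The bandwidth squeeze is easy to put rough numbers on. A back-of-the-envelope sketch (the ~10% blanking overhead is an assumption and varies by timing; the payload rates are the published ones: HDMI 2.1 FRL carries about 42.7 Gbps after 16b/18b coding, DP 1.4a HBR3 about 25.9 Gbps after 8b/10b):

```python
def raw_gbps(w, h, hz, bpc, blanking=1.10):
    # Uncompressed RGB video data rate in Gbit/s.
    # blanking=1.10 is a rough reduced-blanking estimate (an assumption;
    # real timings vary a few percent either way).
    bits_per_px = 3 * bpc
    return w * h * blanking * hz * bits_per_px / 1e9

HDMI21_PAYLOAD = 48 * 16 / 18   # ~42.7 Gbps after FRL 16b/18b coding
DP14_PAYLOAD   = 32.4 * 8 / 10  # ~25.9 Gbps after HBR3 8b/10b coding

uhd120 = raw_gbps(3840, 2160, 120, 10)  # ~33 Gbps: fits HDMI 2.1 uncompressed
uhd240 = raw_gbps(3840, 2160, 240, 10)  # ~66 Gbps: needs DSC even on HDMI 2.1
print(f"4K120: {uhd120:.1f} Gbps, 4K240: {uhd240:.1f} Gbps")
print(f"DSC ratio needed on DP 1.4a for 4K240: {uhd240 / DP14_PAYLOAD:.1f}:1")
```

Which is roughly why 4K240 panels lean on DSC over DP 1.4a, and why every bit of extra cable loss matters at these symbol rates.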

---
OLED and desktop: again, courtesy of manufacturers treating users like idiots, we get forced auto-dimming and other nonsense.
If only these auto-dimming features worked in a way that made sense, I would not complain. If I'm using the desktop at 250 nits and it auto-dims when displaying full-screen white or when the same image stays up for a while, fine: it dimmed to X nits, so let me just set X as my maximum brightness... nope. Even if I set OLED brightness to 0, it will still auto-dim aggressively on full-screen white or with static elements...

---
Gaming monitors... they make a special 1080p mode at 480Hz because DP 1.4a + DSC only allows 240Hz at 4K.
Integer scaling? No, hell no, let's make it blurry, rendering the feature totally useless and hurting sales in the process!

You can immediately tell the people making these products are not enthusiasts and have no idea what they are doing.
A certain amount of compromise comes with whatever tech is used, because no tech is perfect, but beyond that you pretty much always need to expect some things to be flocked up by the firmware.

And of course reviewers never give honest opinions; they always justify flaws and don't even mention obvious things.
They focus on whether the HDR peaks a few nits higher, but ignore that the monitor doesn't show dark grays at all, just pure black, so the image looks broken, and then they present this as a good thing, claiming the monitor has deep blacks...
I would say monitor manufacturers are less responsible for the state of monitors than reviewers are. Any case where the sRGB mode has locked settings, or where there is any black crush at all, should get a giant DO NOT BUY anti-recommendation, and a lack of integer scaling should lower the score in each and every review.
Nothing has integer scaling, so why lower the score for it? Because it is so easy to implement in hardware that each and every monitor should have it, at least for basic modes like scaling 1080p or 720p to 2160p.
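"Easy to implement" is no exaggeration: integer scaling is the trivial case of scaling, where each source pixel just becomes an s-by-s block with no filtering and no invented intermediate values. The whole algorithm, sketched with NumPy standing in for what would be a simple line-repeating circuit in the scaler:

```python
import numpy as np

def integer_scale(frame: np.ndarray, s: int) -> np.ndarray:
    """Nearest-neighbor upscale by integer factor s: every source pixel
    becomes an s-by-s block. No filtering, no blur, perfectly sharp."""
    return frame.repeat(s, axis=0).repeat(s, axis=1)

# 1080p -> 2160p is just s=2:
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
assert integer_scale(frame_1080p, 2).shape == (2160, 3840, 3)
```

Anything blurrier than this on an exact 2x or 3x ratio is a deliberate firmware decision to interpolate.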
 
I have never had HDMI cables degrade. Well, other than the dog chewing up my cable... I ran a 60 ft fiber optic cable and switched to a 25 ft HDMI 2.1 cable after the dog ate the fiber one, and haven't had any issues.
 
I pray that we can get a 144Hz+ version of the PA32UCXR as a successor to the PG32UQX: 2x the zone count, a faster panel, and 2000-nit capable. I just don't see it happening without a dedicated Gsync module, which seems to no longer be a thing.
 
Some really, really great feedback from everyone, indicative of pretty much everything I said.

To clear up a few things: I do not live in a humid location, nor do I have a faulty power grid that would make the cables or any of my other equipment susceptible to damage. The problem would have been either the screen or the cables. More than likely the cables, though, judging by how they behaved from initial installation compared to over time.

Believe it or not, "dedicated" Gsync DOES still seem to be a thing, but it's just way more convoluted now because of nVidia and manufacturer shenanigans. This Asus PG32UQX has Gsync Ultimate. My old Asus PG279Q also had dedicated Gsync, albeit the regular kind. Both of these screens perform flawlessly with whatever I throw at them with Gsync enabled. Gsync COMPATIBLE is where I think the problems are. Gsync has specific standards that need to be adhered to, and as pointed out earlier in this thread, manufacturers apply them as loosely as loose change in a water fountain. If no real standard is being followed, would one expect Gsync to work properly? Chances are it won't. And to my eyes, native Gsync does make a difference, enough for me to seek out native Gsync instead of Gsync Compatible, especially after a year of Gsync Compatible mess.
 
I have never had HDMI cables degrade. Well, other than the dog chewing up my cable... I ran a 60 ft fiber optic cable and switched to a 25 ft HDMI 2.1 cable after the dog ate the fiber one, and haven't had any issues.
Still using my hdmi cable my xbox one preorder came with on my pc.
 
Still using my hdmi cable my xbox one preorder came with on my pc.
I wish I could, but these are all the fancy-schmancy 48Gbps cables. The ones where 99% of them come from one country and are all garbage. I've bought hard jacket, soft jacket, copper core, fiber, long, short; they all ended up failing. Well, either they fail or the screen is killing them. Either way, it's absolutely ridiculous.

By comparison, I'm using an old 15 foot DisplayPort cable on my new Asus PG32UQX. I bought this cable 10 years ago. And it's still going strong.
 
And of course reviewers never give honest opinions; they always justify flaws and don't even mention obvious things.
They focus on whether the HDR peaks a few nits higher, but ignore that the monitor doesn't show dark grays at all, just pure black, so the image looks broken, and then they present this as a good thing, claiming the monitor has deep blacks...
I would say monitor manufacturers are less responsible for the state of monitors than reviewers are. Any case where the sRGB mode has locked settings, or where there is any black crush at all, should get a giant DO NOT BUY anti-recommendation, and a lack of integer scaling should lower the score in each and every review.
Nothing has integer scaling, so why lower the score for it? Because it is so easy to implement in hardware that each and every monitor should have it, at least for basic modes like scaling 1080p or 720p to 2160p.
I think you make an excellent point here. Reviewers bear responsibility for part of this. I had NEVER even remotely heard of anything called "VRR flicker" until AFTER I had spent $1300 on my new monitor and it was sitting at my desk all set up, all my old monitors given away, and I'm gaming and seeing this atrocious flickering when the FPS drops. Not a single OLED reviewer mentioned this; only recently have they started to. They never said a single thing about ABL and HDR, or skimmed over it very briefly. Never mentioned anything about the horrible gradients, even though the screen was 10/12-bit. Never anything about that. But for the short bursts in which you CAN do these things under optimal conditions, it works gloriously, and THAT is what the reviewers hyper-focus on.

Lesson learned, I won't be doing it again. Computer monitor only from now on, and I'll need a bulletproof return policy. Even going to be using a credit card that offers a buyer's protection plan for a year, because this is just absurd.
 
Never had any issues with NVIDIA (RTX 3080) and my LG C2, worked perfectly and smoothly from day one.
You probably got a bad product and/or bad cables.
Yep. Similar situation with a 4090, couldn't be happier with the setup for the last 24 months. Put 7000 hours on my C2 from new entirely as a monitor.
 
I think you make an excellent point here. Reviewers bear responsibility for part of this. I had NEVER even remotely heard of anything called "VRR flicker" until AFTER I had spent $1300 on my new monitor and it was sitting at my desk all set up, all my old monitors given away, and I'm gaming and seeing this atrocious flickering when the FPS drops. Not a single OLED reviewer mentioned this; only recently have they started to. They never said a single thing about ABL and HDR, or skimmed over it very briefly. Never mentioned anything about the horrible gradients, even though the screen was 10/12-bit. Never anything about that. But for the short bursts in which you CAN do these things under optimal conditions, it works gloriously, and THAT is what the reviewers hyper-focus on.

Lesson learned, I won't be doing it again. Computer monitor only from now on, and I'll need a bulletproof return policy. Even going to be using a credit card that offers a buyer's protection plan for a year, because this is just absurd.
In my history of buying new monitors I have been burned a few times and got stuck with something reviewers advertised as an amazing display but which had immediately noticeable issues and deficiencies. I always used them for a while and lost some money selling them on.

The last two OLED monitors are such cases, and I noticed the flaws very quickly after getting them. At the very least I can use these monitors with workarounds. There are three ways to mitigate the issue of the latest 360Hz QD-OLED, which is otherwise almost perfect: none of the issues you mentioned on it, and no near-black chrominance overshoot. Still bad for desktop use, though if I were mad I could use it for desktop 🙃

I pray that we can get a 144hz+ version of the PA32UCXR as a successor to the PG32UQX. 2x the zone count, faster panel and 2000nits capable. I just don't see it happening without a dedicated Gsync module which seems to no longer be a thing.
It would be even more useful if IPS panels had a contrast ratio higher than 1200-1300:1.

LG announced some time ago efforts to bring out high-refresh-rate IPS Black panels, and even some kind of A-TW-like filter to eliminate IPS glow.
Current IPS Black panels are 2000:1, already a pretty good improvement, but apparently new models will have the contrast ratio improved further, to 2500:1 or even 3000:1.

My experience with an A-TW-equipped IPS panel is that it maybe doesn't give IPS perfect (read: CRT/PDP/OLED-like) viewing angles, but it is really good enough, and you only see image degradation at viewing angles so steep they are totally useless anyway. Otherwise there is no nonsense like needing to sit far away from the screen so the corners won't glow.
 
I didn't see any smart TV features on gaming monitors to be honest.
I'm going to make your world a bit worse by introducing you to the delightful Odyssey G80SD, as reviewed by TFT Central. It has wondrous features, such as

Smart TV streaming apps are available as you’d expect, like Netflix, Amazon Prime, Disney+ etc.

How can you not be enthused by the idea of being able to use streaming apps on the monitor itself? Much more convenient than using the PC that's already connected, right? I really, really hope this stuff doesn't become unavoidable in the future, but I'm not hopeful after seeing "dumb" TVs go extinct.
 
I'm going to make your world a bit worse by introducing you to the delightful Odyssey G80SD, as reviewed by TFT Central. It has wondrous features, such as

How can you not be enthused by the idea of being able to use streaming apps on the monitor itself? Much more convenient than using the PC that's already connected, right? I really, really hope this stuff doesn't become unavoidable in the future, but I'm not hopeful after seeing "dumb" TVs go extinct.
The hilarious thing is that THIS was the monitor I was just about to get to replace my LG C3. I read that it had perfect text quality and great performance. But then I read that it had issues waking from sleep, ports that rattled in the case like they were about to fall out, and the stupid smart TV garbage in the OS of a device that's supposed to be a monitor and not a TV. Then I confirmed every single one of these in person at Microcenter yesterday.

Know what I did? I went right to the salesperson and picked up the Asus PG32UQX.
 
The hilarious thing is that THIS was the monitor I was just about to get to replace my LG C3. I read that it had perfect text quality and great performance. But then I read that it had issues waking from sleep, ports that rattled in the case like they were about to fall out, and the stupid smart TV garbage in the OS of a device that's supposed to be a monitor and not a TV. Then I confirmed every single one of these in person at Microcenter yesterday.

Know what I did? I went right to the salesperson and picked up the Asus PG32UQX.
QD-OLED isn't perfect for text quality: there is obvious color fringing on fonts, on high-contrast horizontal lines in particular.
Default ClearType looks better on these latest QD-OLEDs than on the old WOLED subpixel layout, but grayscale font rendering, which looks identical on RGB and those older WOLEDs, doesn't look good on QD-OLED. In fact, just as with RGB subpixel layouts, subpixel font rendering looks better than grayscale on QD-OLED.

Still, it is an OLED, so it has all the usual issues. It's also Samsung with smart TV features, so I doubt it allows disabling auto-dimming.
Issues with wake-up... yeah, this is just a small TV and not a monitor.

I'm going to make your world a bit worse by introducing you to the delightful Odyssey G80SD, as reviewed by TFT Central. It has wondrous features, such as



How can you not be enthused by the idea of being able to use streaming apps on the monitor itself? Much more convenient than using the PC that's already connected, right? I really, really hope this stuff doesn't become unavoidable in the future, but I'm not hopeful after seeing "dumb" TVs go extinct.
Doesn't Samsung already make TVs?
Imho it is a great idea to make a small 32-inch 4K smart TV. If you are going to watch Netflix or YT or whatever, it makes perfect sense to have such features.

I don't see the existence of such a product as a sign of things to come. Typical monitor manufacturers are unlikely to start putting such features into their monitors.
On the other hand, LG might... but given how terrible the firmware is on their gaming monitors, I would not complain, really.

One issue would be price, but for now smaller TVs are cheaper than even smaller gaming monitors, so I don't know if I should worry. Then again, as long as LG uses WOLED I will just avoid their OLED tech 🙃
 
I've only been using OLED for a couple of months and have experienced none of the issues you mentioned. I moved from a 27" Acer Predator 1440p 144Hz IPS without HDR to an OLED Alienware 32" 4K 240Hz with HDR. Worth every penny. I have a Dell 4K IPS on another computer in the same room and its picture is inferior.

I haven't used a TV as a monitor since Unreal Tournament 3 came out. Gaming on a 32" TV was good, all other uses were less than ideal.
Found some pictures while trying to remember when I used to run a television :)

[attachments: two photos]
 
I've only been using OLED for a couple of months and have experienced none of the issues you mentioned. I moved from a 27" Acer Predator 1440p 144Hz IPS without HDR to an OLED Alienware 32" 4K 240Hz with HDR. Worth every penny. I have a Dell 4K IPS on another computer in the same room and its picture is inferior.
Every single OLED on the market has VRR flicker. It's inherent to the technology. Additionally, your screen also has ABL dimming when viewing pure white on HDR. Once again, another issue that is inherent to the technology. The reason you probably don't see it is because you have a 4090, I'm guessing. Crank up the details enough and you'll see it. You can mitigate it to an extent, but you have it, period.
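The usual explanation for VRR flicker (this is a simplified toy model, not measured panel behavior) is that a pixel's delivered brightness drifts slightly with how long each frame is held, so when frame times fluctuate, brightness fluctuates with them. A sketch of why uneven frame pacing becomes visible luminance ripple, with an assumed sensitivity coefficient `k` that is purely illustrative:

```python
# Toy model (an assumption, not a panel datasheet): effective luminance
# drifts by a small factor k per unit of frame-time deviation from the
# refresh period the panel was tuned for.
def perceived_nits(target_nits, frame_ms, tuned_ms=1000 / 120, k=0.02):
    deviation = (frame_ms - tuned_ms) / tuned_ms
    return target_nits * (1 + k * deviation)

steady = [perceived_nits(100, 1000 / 120) for _ in range(4)]
# Fluctuating frame times (e.g. dips from 120 to 70 fps) modulate brightness:
jittery = [perceived_nits(100, 1000 / fps) for fps in (120, 70, 110, 80)]
print(steady)   # constant: a locked frame rate produces no flicker
print(jittery)  # varies frame to frame: that variation IS the flicker
```

This also matches the observation that capping the frame rate, or dropping to a fixed 120Hz mode, hides the problem: it removes the frame-time variation, not the underlying sensitivity.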
 
Every single OLED on the market has VRR flicker. It's inherent to the technology. Additionally, your screen also has ABL dimming when viewing pure white on HDR. Once again, another issue that is inherent to the technology. The reason you probably don't see it is because you have a 4090, I'm guessing. Crank up the details enough and you'll see it. You can mitigate it to an extent, but you have it, period.
Using a 3080 Ti, and everything I play gets cranked to max visuals. DLSS Balanced or Quality, 50-110 fps usually. I haven't seen any flicker. I've been playing Everspace 2, with some Space Marine 2 and Ghost of Tsushima, since I got the monitor.
Edit: Worth noting that I had a "real" Gsync monitor with the chip before. I'd like to think that if there was tearing or flicker, I'd see it. I played Everspace 2 for hundreds of hours on that monitor before I switched. Honestly it's been nothing but an upgrade, outside of a big FPS hit due to the 1440p-to-4K transition.
Edit 2: A downside I have noticed is that for HDR in games to work right, I need to run them in fullscreen. That makes alt-tabbing a pain, as it flashes and takes a couple of seconds to get to the desktop. I used to run everything in windowed fullscreen, where alt-tab was instant.
 
I can only attest to the OLED flickering, which is certainly annoying. The rest reads more like personal, individual issues 😁
I'd like to go back to IPS, but I couldn't stand the IPS glow/bleed and gray blacks, or objects glowing due to the FALD tech mimicking OLED performance.
 
I'm going to make your world a bit worse by introducing you to the delightful Odyssey G80SD, as reviewed by TFT Central. It has wondrous features, such as



How can you not be enthused by the idea of being able to use streaming apps on the monitor itself? Much more convenient than using the PC that's already connected, right? I really, really hope this stuff doesn't become unavoidable in the future, but I'm not hopeful after seeing "dumb" TVs go extinct.
I picked up an open-box G80SD yesterday. The smart TV is a nice value-add in theory, except that the only digital audio connection is an eARC HDMI port (the Asus has optical), and the TV side has motion smoothing, which I hate and which you can't turn off. It has other quirks too. Sad, because it was $620.
 
Every single OLED on the market has VRR flicker. It's inherent to the technology.
The flicker doesn't have the same profile between different monitors, though... or even between firmware versions!!!

Remember how I wrote that I didn't see much, if any, VRR flicker on the MSI MAG 271QPX in this post https://hardforum.com/threads/48-lg-c3-oled-vs-the-best-ips.2036445/post-1045964604 ?
I did eventually notice it in Overload (note: an amazing Descent descendant, do recommend!!!!), which is just about the perfect game for VRR flicker visibility: quite dark and GPU-demanding, with framerate fluctuations. Anyway, I launched it and... OMG, this looks terrible! There is this cyclical brightening of all tones 🤮
RTSS didn't help, using AMD's limiter didn't help... using the 120Hz mode fixed the issue, but... yeah, VRR vs. V-Sync is too big a difference in lag for such solutions to really be an option.

What did help was... downgrading the firmware from the latest 0.24 to 0.12.
Now I still get VRR flicker, but at a level where I actually need to peer at the screen to notice slight per-frame brightness changes; otherwise there is no cyclical brightening of the whole screen, and overall it feels quite stable. Not IPS-perfect stable, but something "most users wouldn't necessarily notice in normal use".

Yeah... on the older firmware the MSI did have about 5 ms higher input lag at 60Hz (and less than 1 ms at 120Hz) compared to reference, and they apparently fixed that, but screw that jazz. Lag seemed good enough for me at 60Hz when I tested it with a PS5. The only issue is that the newer firmware is also what fixed the power LED so it could be disabled, so it's back to the electrical tape workarounds 🫣

BTW, fun note: MSI changed the name of VRR on my 271QPX from VESA Adaptive-Sync to FreeSync Premium Pro in the newer firmware, which is why I quickly figured I should downgrade the FW. Apparently, to get VESA's certification, a monitor actually needs to pass quite elaborate VRR flicker testing.
Not sure if that has anything to do with anything, but maybe, just maybe, MSI decided to use the FreeSync name because, with their changes, they had no chance of passing VESA's re-certification? Seems at least somewhat plausible...

...so yeah... apparently there are things monitor manufacturers can adjust to mitigate VRR flicker.
And again: WE NEED VRR FLICKER TESTING IN REVIEWS.
For the MSI monitors we got complaints about input lag, not even from users but from users who had read reviews: "OMG, a few milliseconds of lag at 60Hz!!! MSI needs to get their sheet together!!!" and that kind of nonsense.
I assume they dropped the workarounds for VRR flicker to get better lag numbers, and that is the issue here. It might be totally unrelated, and I might be drawing overarching conclusions without doing extensive testing, but these are my initial impressions and ideas from what I have noticed so far.

For now I am going to use the older firmware, as I had a good time with it, and much less so with the flickery mess on the newer firmware. It was quite noticeable in-game, so I will know if it happens again, in case the FW change wasn't actually what caused the observed differences. I might also want to re-test the LG 48GQ900, as it got a FW update since I last specifically tested VRR, though I haven't really noticed VRR flicker with PS5 games, and there was no mention of input lag or VRR in the FW notes.

Additionally, your screen also has ABL dimming when viewing pure white on HDR.
For desktop usage at least, that MSI MAG 271QPX does not have any ABL, or rather, all this sheet can be disabled. HDR has two modes, and in the mode with higher peaks there is ABL on full white, for whatever stupid reason, as the ability to display full white at 250 or so nits shouldn't depend on how high the highlights can go. But anyway, the VESA mode is still quite HDR-y for me. I don't want 1000-nit peaks all that badly, and if I do, I can always switch modes and go back to VESA HDR for the desktop afterwards. ABL in games isn't nearly as noticeable or annoying.

As for this sheet being inherent to the technology: yes and no.
There is definitely a power limit: as with e.g. plasma, or even CRT, if you crank it too high there is a limit to how much power the panel can draw, and it will darken the screen when there is too much white on-screen. The bigger issue, at least to me, is ABL kicking in when I am already operating well below such limits, because the firmware artificially limits brightness. If at 100% brightness and a 100% window I drop from, say, 250 nits to 150 nits for white, then with my white point set to 140 nits I should never, ever experience such effects. This is how it works on plasma. LG, however, will dim the screen even at 0% brightness, well below 100 nits for white, and for desktop use at least this is absolutely, ridiculously irritating, and it is NOT a technology limitation.
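That power-budget argument can be written down directly. Under the simple assumption that panel power scales with nits times lit area, a "sane" ABL is just a clamp, and a 140-nit desktop white point would never trigger it (the numbers below are illustrative, not any specific panel's limits):

```python
def abl_limit_nits(window_fraction, full_field_nits=250.0, peak_nits=1000.0):
    """Max sustainable white level for a given lit-window fraction,
    assuming power ~ nits * area and a budget set by what the panel can
    sustain at full-field white. Illustrative numbers, not a real spec."""
    budget = full_field_nits * 1.0            # power budget at a 100% window
    return min(peak_nits, budget / window_fraction)

print(abl_limit_nits(1.0))   # 250.0: full-screen white capped at 250 nits
print(abl_limit_nits(0.10))  # 1000.0: small highlights can hit peak
# A 140-nit desktop white sits below the budget at ANY window size,
# so an ABL implemented this way would never dim it:
assert all(140 <= abl_limit_nits(f) for f in (1.0, 0.5, 0.25, 0.1))
```

Dimming below that curve, as described above, is a firmware policy layered on top of the physical limit, not the limit itself.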

Also, OLEDs could get brighter with better power delivery and cooling. 1000 nits full-screen is probably a pipe dream unless we move to inorganic emitters, but even then, since it isn't actually needed to sell products, would increase production cost, and would raise the maximum rated power draw (which could negatively affect sales), I don't think we will see all-white brightness on OLEDs much higher than maybe 400 nits tops.

-----------------
OLEDs are issue-ridden.
My plan was to replace the 48GQ900 with a bigger QD-OLED TV, especially after seeing the excellent performance of that MSI monitor. Now I am not at all sure that is such a good idea. The LG 48GQ900 maybe has flocked-up HDR, but at least on PC it is usable with workarounds, and while its VRR does flicker, it isn't too bad. With its showing so far, I certainly would not bother disabling VRR like some WOLED users (including you) decide to do.

I think I should just calm down and wait until something breaks or burns in; in the meantime the OLED situation might improve enough that an upgrade will be meaningful, and not just swapping one set of issues for another.
 
Every single OLED on the market has VRR flicker. It's inherent to the technology. Additionally, your screen also has ABL dimming when viewing pure white on HDR. Once again, another issue that is inherent to the technology. The reason you probably don't see it is because you have a 4090, I'm guessing. Crank up the details enough and you'll see it. You can mitigate it to an extent, but you have it, period.
I've only seen VRR flicker on my LG CX 48" in loading screens, which is pretty irrelevant. I know this will depend a lot on the game. I've used the CX with a 2080 Ti and now a 4090, so that probably helps.

I hear the new Silent Hill 2 remake is particularly bad for this, but it seems to have a bunch of technical issues that will hopefully be solved by patches.

Overall I'm actually super happy with how trouble-free the CX has been as my living-room gaming rig, with both PC and consoles. I'd gladly upgrade it for another LG, but only a 55" model with higher HDR brightness and 4K @ 240 Hz. Just the higher HDR brightness of the G4 is not enough to make me interested in spending that much money.
 
I think you make an excellent point here. Reviewers bear responsibility for part of this. I had NEVER even remotely heard of anything called "VRR flicker" until AFTER I had spent $1300 on my new monitor and it was sitting at my desk all set up, all my old monitors given away, and I'm gaming and seeing this atrocious flickering when the FPS drops. Not a single OLED reviewer mentioned this; only recently have they started to. They never said a single thing about ABL and HDR, or skimmed over it very briefly. Never mentioned anything about the horrible gradients, even though the screen was 10/12-bit. Never anything about that. But for the short bursts in which you CAN do these things under optimal conditions, it works gloriously, and THAT is what the reviewers hyper-focus on.

Lesson learned, I won't be doing it again. Computer monitor only from now on, and I'll need a bulletproof return policy. Even going to be using a credit card that offers a buyer's protection plan for a year, because this is just absurd.
Hmmm. Rtings definitely mentions VRR flicker, ABL, and the other items brought up here in their reviews of HDR displays. I ignore all YouTube reviews. If it's not being reviewed by Rtings, I wait.
 
I was running an OLED above my LG 38GL950 (the best monitor I've ever owned) for the last few weeks, and when I went with the 4K Samsung it was just too big and too close, so I put the LG back in its box, just a few months shy of its 5th birthday.

I've decided I'm just going to run an OLED, for work and for play. On the work side it's set to 15/50 brightness, and 30/50 on the gaming PC. I've got a 4-year warranty from Best Buy. I'm probably going to have to return this Samsung and either get the HP Omen or the Asus UCDM.

If it burns in, it burns in, but I think the panels will get better so if I have to get it replaced in a couple years, I'm fine with that.
 
I've only seen VRR flicker on my LG CX 48" in loading screens, which is pretty irrelevant. I know this will depend a lot on the game. I've used the CX with a 2080 Ti and now a 4090, so that probably helps.

I hear the new Silent Hill 2 remake is particularly bad for this, but it seems to have a bunch of technical issues that will hopefully be solved by patches.

Overall I'm actually super happy with how trouble-free the CX has been as my living-room gaming rig, with both PC and consoles. I'd gladly upgrade it for another LG, but only a 55" model with higher HDR brightness and 4K @ 240 Hz. Just the higher HDR brightness of the G4 is not enough to make me interested in spending that much money.

The Anti Flicker function on my PG32UCDP actually works at reducing VRR flicker somehow, which doesn't make any sense to me given that all it supposedly does is change the effective VRR range from 40-240Hz up to something like 80-240Hz. HDTVTest did mention in one of his videos that Anti Flicker is something beyond a simple VRR range modifier, but did not dive into any technical details; he probably wouldn't know anyway, since he's not an Asus engineer. Regardless, whatever magic Asus did with Anti Flicker does work. I played Silent Hill 2 on my PG32UCDP and it does indeed have less flicker, almost none to be honest. It just had some flickering on the first load-up, probably due to shader compilation.
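One plausible mechanism (my speculation; Asus has not documented Anti Flicker's internals) is low-framerate-compensation-style frame multiplication: with the range narrowed to 80-240Hz, frames below 80 fps get scanned out two or three times each, so the panel always refreshes inside a narrower, less flicker-prone band. The selection logic would look something like:

```python
def lfc_refresh_hz(fps, vrr_min=80, vrr_max=240):
    """Pick an integer multiple of the content frame rate that lands
    inside the VRR range (classic LFC; whether Anti Flicker actually
    works this way is an assumption, not documented behavior)."""
    m = 1
    while fps * m < vrr_min:
        m += 1
    return min(fps * m, vrr_max), m

print(lfc_refresh_hz(55))  # (110, 2): each 55 fps frame scanned out twice
print(lfc_refresh_hz(45))  # (90, 2)
print(lfc_refresh_hz(30))  # (90, 3)
```

That would explain why a seemingly simple range change reduces flicker: the panel never dwells at the long, low-refresh frame times where brightness drift is worst.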
 
I was running an OLED above my LG 38GL950 (the best monitor I've ever owned)
Given the gamut and light spectrum of the 38GL950 and my LG 27GP950, it seems it's the same panel, just in a different aspect ratio and size.
These Nano IPS panels may not have the best contrast ratio, but they surely have lovely colors.
I wouldn't say it's the best monitor ever, but it is pretty much the best LCD I've had when all things are considered.

I went with the 4K Samsung
Good choice.
Image-wise, WOLED is nothing like these Nano IPS panels, while QD-OLED is quite similar.
Of course, I'm not talking about things like contrast ratio.

The Anti Flicker function on my PG32UCDP actually works at reducing VRR flicker somehow, which doesn't make any sense to me given that all it supposedly does is change the effective VRR range from 40-240Hz up to something like 80-240Hz. HDTVTest did mention in one of his videos that Anti Flicker is something beyond just a simple VRR range modifier, but did not dive into any technical details - though he probably wouldn't know anyway, since he's not an Asus engineer. Regardless, whatever magic Asus did with Anti Flicker does work. I played Silent Hill 2 on my PG32UCDP and it does indeed have less flicker - almost none, to be honest. It just had some flickering on the first load-up, probably due to shader compilation.
I would say it has to do with timings. Might be good to compare the video modes in the EDID.
It can also be a slight modification of when the frame is displayed. Delaying frames slightly might have a big impact on flickering while not having that much impact on motion or lag.
Is the lag at a given frame rate - within the part of the VRR range where there was flicker before - the same with this function enabled and disabled?

From my side, I experienced VRR flicker after upgrading the firmware on my MSI MAG 271QPX, and a reduction after downgrading. There were supposed to be some input-lag optimizations - personally, I preferred the higher input lag and being able to use VRR, which itself reduces input lag quite a lot.
 
I would say it has to do with timings. Might be good to compare the video modes in the EDID.
It can also be a slight modification of when the frame is displayed. Delaying frames slightly might have a big impact on flickering while not having that much impact on motion or lag.
Is the lag at a given frame rate - within the part of the VRR range where there was flicker before - the same with this function enabled and disabled?
It actually has nothing to do with timings. You can do this manually by creating a custom resolution and limiting the VRR range. I did it on my OLED and it works 70% of the time, so you can mitigate it. You just have to jump through hoops.
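A rough way to see why narrowing the range helps - this is a sketch of my own, not how any vendor actually implements it, and it assumes the panel serves frame rates below the floor with LFC-style frame doubling: OLED brightness/gamma shifts with refresh rate, so flicker tracks big jumps in the per-frame refresh interval, and a higher floor keeps those intervals in a tighter band.

```python
# Illustrative sketch (assumed model, not vendor code): why raising the
# VRR floor can reduce flicker. OLED gamma shifts with refresh rate, so
# large swings in the per-frame refresh interval show up as brightness
# flicker. A higher floor (80 Hz instead of 40 Hz) forces low frame
# rates to be served by frame doubling (LFC), keeping the panel's
# refresh interval in a narrower band.

def effective_refresh_hz(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Map a game frame rate to the panel refresh rate under VRR + LFC."""
    if fps > vrr_max:
        return vrr_max          # capped at the panel maximum
    rate = fps
    while rate < vrr_min:       # below the floor, repeat each frame (LFC)
        rate *= 2
    return min(rate, vrr_max)

def refresh_swing(fps_trace, vrr_min, vrr_max):
    """Largest jump in refresh interval (ms) across consecutive frames."""
    intervals = [1000.0 / effective_refresh_hz(f, vrr_min, vrr_max)
                 for f in fps_trace]
    return max(abs(a - b) for a, b in zip(intervals, intervals[1:]))

# A loading-screen-like trace: frame rate bouncing between extremes.
trace = [45, 200, 50, 180, 42, 210]
print(refresh_swing(trace, 40, 240))   # wide range: big interval jumps
print(refresh_swing(trace, 80, 240))   # narrow range: smaller jumps
```

With the wide 40-240Hz range the interval swings by ~19 ms frame to frame; clamped to 80-240Hz it stays within ~7 ms, which matches the observation that a narrower range (or the Anti Flicker mode) calms things down.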
 
I can understand - I'm also a PG32UQX fanboy. I got one for a number of reasons; worry about burn-in was a big one, and I figured it would be a decent compromise. However, my computer is also wired to an S95B TV, so I can just play games on the OLED when I want. I figured this would probably not be as good, but it would let me have something that could do HDR gaming when my girlfriend is using the TV... Well, it turns out I actually prefer the monitor overall. I find that HDR gaming looks better on it than on the TV. It isn't without issues, of course - the smearing in bright-to-dark transitions being the most noticeable to me, and backlight bloom being the next - but all in all it just looks better with its brighter image.
 
OP - could your issues have been with THAT particular OLED and not OLEDs in general? Yes, there's ABL in them, but if you've ever used a Plasma before then you're used to it (or a CRT, for that matter). It eventually grows on you. But I want to say that your particularly bad experience seems to stem from the LG C3 and not OLEDs in general. Still, I'm glad you found something that works for you. For what it's worth, I have yet to get on the OLED bandwagon. I see that ViewSonic has a BFI monitor, but I've not read super great things about it.
 
Ok, so some of this was to be expected going from a computer monitor to a TV. I fully own that. However, trying to get OUT of that situation by getting another monitor with the same technology is next to near impossible. I cannot believe we are in a time where we have to worry about DisplayPort and HDMI CABLES NOT WORKING. Seriously? I bought 3 expensive HDMI 2.1b certified cables, and all of them degraded. Every single one of them. Then, every single OLED screen out there has some ridiculous shortcoming. Not waking up from sleep. Flickering. Color corruption. Poor HDR over time. Cables not working/degrading over time. Blacks looking purple. Ports unsecured like they are going to fall out. Needing to select special resolutions in graphics card control panels. Needing to reset graphics drivers on switching power state. VRR flicker of varying degrees, all terrible. Backlight dimming. ABL on white screens, impossible to overcome on HDR. And worst of all? Pretty much every OLED computer monitor out there is trying to be a stupid freaking smart TV instead of a computer monitor. This one by far is the most egregious
You must have gotten the worst TV to sneak through the QC line, because I've never had any of these issues with my C3 long term. I have a launch model that had extremely aggressive ABL out of the box, but that was fixed in later software updates. I also had one incident where the TV wouldn't wake up, and that was also fixed with a software update. Color issues, flickering, unsecured ports, custom resolutions, and "resetting" graphics drivers are all issues I have never encountered.

I would also buy my cables based on brand and reputation rather than how expensive they are. I have never spent more than $20 on an HDMI or DisplayPort cable in the past 15 years and all of them still work.

If you don't want to use any of the "smart" features on a display, then never connect it to the internet.
Some really, really great feedback from everyone, indicative of pretty much everything I said.

To clear up a few things: I do not live in a humid location, nor do I have a faulty power grid that would degrade the cables or any of my other equipment. The problem would have been either the screen or the cables - more than likely the cables, judging from how they behaved from initial installation to performance over time.

Believe it or not, "dedicated" G-Sync DOES still seem to be a thing, but it's just way more convoluted now because of NVIDIA and manufacturer shenanigans. This Asus PG32UQX has G-Sync Ultimate. My old Asus PG279Q also had dedicated G-Sync, albeit the regular one. Both of these screens perform flawlessly with whatever I throw at them with G-Sync enabled. G-Sync COMPATIBLE is where I think there are some problems. G-Sync has specific standards that need to be adhered to, and as pointed out earlier in this thread, manufacturers apply it as loosely as loose change in a water fountain. If there is no real standard being followed, would one think that G-Sync would work properly? Chances are it won't. And to my eyes, native G-Sync does make a difference - enough for me to seek out native G-Sync instead of G-Sync Compatible, especially having been through a year of G-Sync Compatible mess.
The PG32UQX came out a little over 2 years ago. NVIDIA has recently said that they will no longer be making G-SYNC FPGAs.
I wish I could, but these are all the fancy schmancy 48Gbps cables - the ones where 99% of them come from one country and are all garbage. I've bought hard jacket, soft jacket, copper core, fiber, long, short; they all ended up failing. Well, either they fail or the screen is killing them. Either way, it's absolutely ridiculous.

By comparison, I'm using an old 15 foot DisplayPort cable on my new Asus PG32UQX. I bought this cable 10 years ago. And it's still going strong.
"Fancy schmancy 48Gbps cables" are a marketing trick. All certified HDMI Ultra High Speed cables are the same.
I'm going to make your world a bit worse by introducing you to the delightful Odyssey G80SD, as reviewed by TFT Central. It has wondrous features, such as built-in streaming apps.

How can you not be enthused by the idea of being able to use streaming apps on the monitor itself? Much more convenient than using the PC that's already connected, right? I really, really hope this stuff doesn't become unavoidable in the future, but I'm not hopeful after seeing "dumb" TVs go extinct.
The integrated apps work much better than the PC apps. Prime Video, for one, doesn't even let you stream in 4K if you're using the website or Windows app. The Netflix Windows app is honestly the only one that I think is better than the Android TV or WebOS version.
Using a 3080 Ti, and everything I play gets cranked to max visuals. DLSS Balanced or Quality. 50-110 fps usually. Haven't seen any flicker. I've been playing Everspace 2 with some Space Marine 2 and Ghost of Tsushima since I got the monitor.
Edit: Worth noting that I had a "real" G-Sync monitor with the chip before, so I'd like to think that if there was tearing or flicker I'd see it. I was playing Everspace 2 for hundreds of hours on that monitor before I switched. Honestly, it's been nothing but an upgrade, outside of a big FPS hit due to the 1440p-to-4K transition.
Edit 2: A downside I have noticed is that for HDR in games to work right, I need to run them in full screen. That makes alt-tabbing a pain, as it flashes and takes a couple of seconds to get to the desktop. I used to run everything in windowed fullscreen, and alt-tab was instant.
I only noticed it in Ratchet & Clank: Rift Apart. I haven't noticed any VRR flicker in anything else.
 
I had a C1 OLED for a while before switching to a Samsung QN85A a couple of years ago... there were for sure some flickering issues with powered HDMI cables on the LG that I didn't have with the standard IPS TV. Good to know this is still an issue.
 
Probably the big difference in cable experience is length. Past 6.6 ft (2 m), an Ultra High Speed cable has to be really good to be problem-free. At 2 m and under, mediocre quality/build is a lot more universally OK when you need 4K 120+ HDR.
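For context on why these cables are stressed at all, here's a back-of-the-envelope calculation - my own sketch, assuming the standard CTA 4K blanking totals (4400 x 2250) and HDMI 2.1's 16b/18b FRL coding - of the link rate that 4K 120 Hz 10-bit RGB needs:

```python
# Back-of-the-envelope check of why 4K 120 Hz HDR needs an Ultra High
# Speed (48 Gbps) cable. Assumes standard CTA-861 blanking for 4K
# (4400 x 2250 total pixels per frame) and HDMI 2.1's 16b/18b FRL coding.

def hdmi_link_rate_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    pixel_clock = h_total * v_total * refresh_hz          # pixels/s
    payload = pixel_clock * 3 * bits_per_channel          # RGB bits/s
    return payload * 18 / 16 / 1e9                        # FRL 16b/18b overhead

# 4K 120 Hz, 10-bit RGB (HDR):
rate = hdmi_link_rate_gbps(4400, 2250, 120, 10)
print(f"{rate:.1f} Gbps")   # ~40 Gbps: far beyond HDMI 2.0's 18 Gbps,
                            # but within Ultra High Speed's 48 Gbps
```

At ~40 Gbps there's not much margin left, which is why a marginal cable that passes at 2 m can fall over at longer lengths.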
 
You must have gotten the worst TV to sneak through the QC line, because I've never had any of these issues with my C3 long term. I have a launch model that had extremely aggressive ABL out of the box, but that was fixed in later software updates. I also had one incident where the TV wouldn't wake up, and that was also fixed with a software update. Color issues, flickering, unsecured ports, custom resolutions, and "resetting" graphics drivers are all issues I have never encountered.

I would also buy my cables based on brand and reputation rather than how expensive they are. I have never spent more than $20 on an HDMI or DisplayPort cable in the past 15 years and all of them still work.

If you don't want to use any of the "smart" features on a display, then never connect it to the internet.

The PG32UQX came out a little over 2 years ago. NVIDIA has recently said that they will no longer be making G-SYNC FPGAs.

"Fancy schmancy 48Gbps cables" are a marketing trick. All certified HDMI Ultra High Speed cables are the same.

The integrated apps work much better than the PC apps. Prime Video, for one, doesn't even let you stream in 4K if you're using the website or Windows app. The Netflix Windows app is honestly the only one that I think is better than the Android TV or WebOS version.

I only noticed it in Ratchet & Clank: Rift Apart. I haven't noticed any VRR flicker in anything else.
1. To clarify, I'm not saying my C3 has all those issues. I'm saying that all OLEDs have one or more of those issues.
2. It's kinda hard to "not use" the smart features on the display when the screen's OS is basically built to be/mimic a smart TV instead of a PC monitor.
3. The "certified" in HDMI Ultra High Speed cables is part of the problem. I've bought at least two that are no longer certified, leading me to believe that I got duped.
4. I notice VRR flicker in all games - Destiny 2, Cyberpunk, MechWarrior, etc. Mostly in the menus; rarely in game, unless there is a truly gray scene. This is even with my RTX 4090.
I had a C1 OLED for a while before switching to a Samsung QN85A a couple of years ago... there were for sure some flickering issues with powered HDMI cables on the LG that I didn't have with the standard IPS TV. Good to know this is still an issue.
Yep, this is definitely still an issue.
Probably the big difference in cable experience is length. Past 6.6 ft (2 m), an Ultra High Speed cable has to be really good to be problem-free. At 2 m and under, mediocre quality/build is a lot more universally OK when you need 4K 120+ HDR.
Believe it or not, I picked up on this, which is why every subsequent cable I bought was shorter than the last. My latest cable is now only 6 feet.
 
Glad you made the right move. The PG32UQX is the best gaming PC monitor for HDR right now. Sure, it has many shortcomings, but HDR image quality is not one of them.
 
Glad you made the right move. The PG32UQX is the best gaming PC monitor for HDR right now. Sure, it has many shortcomings, but HDR image quality is not one of them.
It's bright and colors are good. It's definitely not as good as recent OLED monitors when it comes to color.
 
It's bright and colors are good. It's definitely not as good as recent OLED monitors when it comes to color.
It's so close that the difference is not noticeable unless they are side by side. That is in comparison to QD-OLED.
 
It's so close that the difference is not noticeable unless they are side by side. That is in comparison to QD-OLED.
Actually, the PG32UQX is superior to almost all of the OLEDs in color. Since it is an IPS and not a VA panel (the latter has been seen in recent televisions such as the Bravia 9), color is one of its advantages.

It has a MUCH wider color gamut than the WOLED variants, and even higher than QD-OLED. This is a common misconception among OLED users.

According to TFTCentral:
Rec 2020 coverage:

PG32UQX ~82%
PG32UCDM (QD-OLED): ~79%
WOLED: ~74%


This is to not even touch on color volume; given its DisplayHDR 1400 certification, it would eclipse any OLED monitor on the market there.

It is also, in my experience, more accurate in HDR than the QD-OLEDs, which tend to oversaturate colors in HDR. I have both.
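For the curious, coverage figures like the ones above come from comparing gamuts on a chromaticity diagram. Here's a crude sketch of my own - just triangle area ratios in CIE 1931 xy, whereas published numbers intersect the measured gamut with the target and often use the u'v' diagram instead - that still lands in the right ballpark:

```python
# Rough illustration of how gamut "coverage" figures compare: the ratio
# of triangle areas in CIE 1931 xy, using the standard primaries of each
# color space. (This is not the TFTCentral methodology - real coverage
# intersects the measured gamut with the target - but it shows the scale
# of the differences.)

def xy_area(r, g, b):
    """Shoelace area of a gamut triangle from (x, y) primaries."""
    (xr, yr), (xg, yg), (xb, yb) = r, g, b
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

REC2020 = ((0.708, 0.292), (0.170, 0.797), (0.131, 0.046))
REC709  = ((0.640, 0.330), (0.300, 0.600), (0.150, 0.060))
DCI_P3  = ((0.680, 0.320), (0.265, 0.690), (0.150, 0.060))

a2020 = xy_area(*REC2020)
print(f"Rec.709 vs Rec.2020: {xy_area(*REC709) / a2020:.0%}")   # ~53%
print(f"DCI-P3  vs Rec.2020: {xy_area(*DCI_P3) / a2020:.0%}")   # ~72%
```

Even DCI-P3, which the wide-gamut monitors above roughly cover, is only about 72% of Rec.2020 by this measure, so the ~74-82% Rec.2020 coverage numbers quoted above represent genuinely wide gamuts.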
 
It's bright and colors are good. It's definitely not as good as recent OLED monitors when it comes to color.
Some of the new OLED monitors have very impressive colour accuracy out-of-the-box. I'd like someone to measure that over time though. I find that I have to recalibrate LCD monitors every 6-12 months to keep them looking right. I would expect OLED monitors to be worse than that, but I don't know.
 
Some of the new OLED monitors have very impressive colour accuracy out-of-the-box. I'd like someone to measure that over time though. I find that I have to recalibrate LCD monitors every 6-12 months to keep them looking right. I would expect OLED monitors to be worse than that, but I don't know.

The Samsung panels do. LG OLEDs have always had bad color accuracy because the panel itself tilts toward a blue/aqua color spectrum once you go around 20-30 degrees off-axis, which means it's something you can't even calibrate out, because it's a progressive color shift issue in the panel itself. Not sure if it's an effect of the anti-glare coating they are using or something else.
 
On OLED now. Never going back. The only real annoyance was VRR flicker, but I turned off G-Sync/VRR and it's gone now. Not an issue when you have 4K 240 Hz, as I see no tearing, and I can cap frame rates in case I do. Once you go OLED you can't really go back, imo.
 