Why OLED for PC use?

Also, like I said before, we just need to go back to CRT. Nerds these days are pussies. Can't even lift a 70 pound monitor without pulling a muscle. Re-calibrating a CRT will make a man out of you.
 
Good sir, I hauled my 21" FD Trinitron monitor to lan parties. I am no pussy :p
Indeed, my 22" Iiyama Vision Master Pro 510 regularly made its way from London to LAN parties in the SouthWest.
Easiest way to keep fit, lugging that monitor around... you'd think.
Fk me, what a monitor, in many ways! Dangerous to health though.
 
Also, like I said before, we just need to go back to CRT. Nerds these days are pussies. Can't even lift a 70 pound monitor without pulling a muscle. Re-calibrating a CRT will make a man out of you.

Holy fuck, the back footprint of the 21-inch Sony Trinitrons back in the day. Thank god we have moved on from monitors with HUGEEE footprints.
 
Yup.

I've noticed that my OLED is (overall) easier on my eyes too. Hundreds of hours of Visual Studio, and no eyestrain whatsoever.

Displays are woefully imperfect windows to real life -- flicker is not the only eyestrain attribute, and the OLED flicker duty cycle is invisible.

The variables of the flicker matter, compared to this 240 Hz shallow-depth, short-pulse flickergraph. It is vastly easier to see 2000 Hz flicker (via stroboscopics) at a 10%:90% duty cycle with full-depth ON:OFF flicker than this 240 Hz flickergraph showing brief-and-shallow flicker that's even shallower than an average incandescent light bulb's (incandescents actually flicker at roughly 5%-10% flicker depth in sync with the AC sinewave, from the cooldown of the tungsten during the zero-volt crossing events). Follow the science rather than the nonsense.
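To make the "depth" part concrete, here is a minimal toy sketch (my own illustration, not from the post) of "percent flicker" as commonly defined, (max - min) / (max + min), comparing a full-depth ON:OFF waveform against a shallow incandescent-style dip. The sample waveforms are invented for illustration.

```python
def percent_flicker(samples):
    """Modulation depth of a brightness waveform, as a 0-100% figure."""
    hi, lo = max(samples), min(samples)
    return 100.0 * (hi - lo) / (hi + lo)

# Full-depth ON:OFF flicker (e.g. 10% on, 90% off): brightness swings 1.0 -> 0.0.
full_depth = [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

# Shallow flicker like an incandescent bulb: the tungsten only cools slightly
# during the AC zero crossing, so brightness dips to ~0.9 instead of to 0.
shallow = [1.0, 0.97, 0.92, 0.90, 0.92, 0.97, 1.0, 0.97, 0.92, 0.90]

print(percent_flicker(full_depth))  # 100.0
print(percent_flicker(shallow))     # ~5.3
```

Same fundamental frequency, wildly different modulation depth, which is the point: the depth and duty cycle drive visibility, not the Hz number alone.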

Now, that being said, fast GtG (on any display, like OLED) also amplifies the visibility of stutter, of which stutter-edge flicker is by far the biggest problem. But this is a problem with low frame rates on any fast-GtG display. Look at the 15-30fps UFOs at www.testufo.com for example and the edge-flicker effects -- they visibly stutter (edge-flicker) more on fast-GtG displays than on slow-GtG displays. Low-framerate stutter is a vastly bigger cause of eyestrain on fast-GtG displays, but this is mitigatable (either by higher frame rates, or by intentionally adding GPU motion blur effects). If you stick to high-Hz OLEDs (120Hz+) and keep framerates high, the edge-flicker of stutter on fast-GtG displays can cease to be an eyestrain. More OLED eyestrain cases were traced to this than to the nonsense posted by a specific forum member here.
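A quick back-of-envelope sketch of why low framerates stutter so visibly on fast-GtG panels (my own numbers, not from the post): on a sample-and-hold display with instant pixel response, a moving object jumps a fixed number of pixels between frames, and bigger jumps read as stutter/edge flicker. The 960 px/sec speed is just an assumed example value.

```python
def step_px(speed_px_per_sec, fps):
    """Pixels the object jumps between consecutive displayed frames."""
    return speed_px_per_sec / fps

speed = 960  # px/sec, an assumed example motion speed
for fps in (15, 30, 120, 240):
    print(fps, step_px(speed, fps))
# 15 fps -> 64 px jumps (very visible stutter-edge flicker)
# 240 fps -> 4 px jumps (jumps shrink below easy visibility)
```

Slow-GtG LCDs smear each jump into blur, partially hiding it; fast-GtG OLED shows each jump crisply, which is the amplification being described.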

And oh yes, there's the pros/cons of OLED and LCD.

Yes -- there are problems with OLED PWM on some OLEDs (e.g. low-Hz PWM on some mobile OLEDs that have very deep duty cycles -- 0%-100%-0%-100% instead of 95%-100%-95%-100% -- with shorter pulsewidths and longer dark time). Famously mentioned elsewhere, obviously.

But scapegoating flicker on these specific 240Hz OLEDs, when they're now among the best-in-class for lowest OLED flicker during everyday use?
It is more accurate to say that setting the monitor brightness at 100nits for static images represents only 1% of real-world use.

A brightness level of 250-300 nits covers the most common use cases. When playing games, the brightness can only increase, as bright colors and high contrast make game environments and characters more visually distinct and easier to differentiate. Higher-range games can also showcase a game's graphics and visual effects more effectively, making them visually stunning. If you choose an OLED display, you will be permanently locked into images with a very dull, limited dynamic range. Just like how you can only see 100 nits instead of 300 nits.

The sub-1000 Hz flicker from OLED displays induces significantly more eye strain than the 40 kHz flicker from LCD displays or DC-dimming displays. OLED flicker can easily induce more saccades and blinks, which are indicators of visual discomfort and eye strain. A flicker of only several hundred Hz is far from enough once the OLED gets a little brighter, if it can get brighter at all.
 
This takes the crown for most stupid post in this topic. Quite an achievement, given how much stupidity is posted here every day 😵
Hear that? Look up, things are flying over your head.
 
VA ultrawides and/or 4K monitors + OLED is the rich man's "have your cake and eat it too". Most VA ultrawides have good contrast (3000:1 and up), look terrific, and you don't have to worry about burn-in. Want to go to games? OLED. :) My VA monitor was exceptionally good for coding on. I miss it so much that I sometimes fantasize about buying another one just for work. But... I only have one desk, so my XG-4321 pulls double duty.
How about that VA motion handling and off axis contrast?
 
Holy fuck, the back footprint of the 21-inch Sony Trinitrons back in the day. Thank god we have moved on from monitors with HUGEEE footprints.
I'm pretty confident that the rise in man buns and the ending of CRT monitor production was no coincidence. Just saying.
 
How about that VA motion handling and off axis contrast?
I'd say motion handling is very good on the latest Samsung VAs, and for most users viewing angles are not that big a deal. Like with everything, you pick your compromises.
 
Hmm could this be why my OLED gives me less eye strain than all the miniLEDs I tried (when fald was on)?
Very possible. For me OLED (using LG CX since 2020 as my main display at home, I think I am around the ~12k hours mark now, without burn-in btw) has been way easier on my eyes than any LCD I have owned. Which is great since LCD looks terrible in the dark room that I like to sit in for gaming and home theatre usage anyway.
 
It is more accurate to say that setting the monitor brightness at 100nits for static images represents only 1% of real-world use.

A brightness level of 250-300 nits covers the most common use cases. When playing games, the brightness can only increase, as bright colors and high contrast make game environments and characters more visually distinct and easier to differentiate. Higher-range games can also showcase a game's graphics and visual effects more effectively, making them visually stunning. If you choose an OLED display, you will be permanently locked into images with a very dull, limited dynamic range. Just like how you can only see 100 nits instead of 300 nits.

The sub-1000 Hz flicker from OLED displays induces significantly more eye strain than the 40 kHz flicker from LCD displays or DC-dimming displays. OLED flicker can easily induce more saccades and blinks, which are indicators of visual discomfort and eye strain. A flicker of only several hundred Hz is far from enough once the OLED gets a little brighter, if it can get brighter at all.
Damn! Chief Blur Buster just got SCHOOLED! This is just beautiful. I'm literally clapping right now. You completely ripped his arguments apart. Almost brings a tear to my eye. I take my hat off to you, kram. I doubt this guy will show his face around here again after this total humiliation. Sorry I ever doubted your infinite wisdom, kram.
 
Damn! Chief Blur Buster just got SCHOOLED! This is just beautiful. I'm literally clapping right now. You completely ripped his arguments apart. Almost brings a tear to my eye. I take my hat off to you, kram. I doubt this guy will show his face around here again after this total humiliation. Sorry I ever doubted your infinite wisdom, kram.
Somebody better pretend to be out staring at the sun lol.

It's so funny to claim OLED causes less eye strain than a DC-dimming backlight at the same brightness.
 
Somebody better pretend to be out staring at the sun lol.

It's so funny to claim OLED causes less eye strain than a DC-dimming backlight at the same brightness.
Funny you are the one wasting your life pretending to see the sun on your dim display in your mom's dark basement. I can actually see real life.
 
And he's back at it. I almost missed it.
There was probably an automatic windows update that restarted his computer and the program didn't start back up automatically after, then just now volatix realized it wasn't running and started it back up again. That happens with my chatbot. I would switch it to linux but it's my backup gaming PC.
 
Somebody better pretend to be out staring at the sun lol.

New video today for what HUB considers best HDR PC Monitors.

[embedded video]
Much like RTINGS (and several others), this is the 4th HDR video in a row in which OLED gets the overall win from HUB. They completely stopped even mentioning the PG32UQX a few videos back. Perhaps there is more to a display than just the highest nits?

Does this mean miniLED is bad and your perspective on it is wrong? NO! It means people have different preferences; for you it's clearly highest nits. And that's fine. But bashing people for preferring OLED for PC, when it continuously gets the win for PC gaming at the best display review sources, is kind of sus.

It's so funny to claim OLED causes less eye strain than a DC-dimming backlight at the same brightness.

I tried each for long periods of time. With FALD on I get more strain than using OLED. I will probably be going back to miniLED when LG releases theirs in 2024, and I'm hoping this does not happen again.
 
Funny you are the one wasting your life pretending to see the sun on your dim display in your mom's dark basement. I can actually see real life.
Funny it's you pretending to see the sun with your 100nits OLED in a cave.
 
New video today for what HUB considers best HDR PC Monitors.

[embedded video]
Much like RTINGS (and several others), this is the 4th HDR video in a row in which OLED gets the overall win from HUB. They completely stopped even mentioning the PG32UQX a few videos back. Perhaps there is more to a display than just the highest nits?

Does this mean miniLED is bad and your perspective on it is wrong? NO! It means people have different preferences; for you it's clearly highest nits. And that's fine. But bashing people for preferring OLED for PC, when it continuously gets the win for PC gaming at the best display review sources, is kind of sus.



I tried each for long periods of time. With FALD on I get more strain than using OLED. I will probably be going back to miniLED when LG releases theirs in 2024, and I'm hoping this does not happen again.


Preference is not a fact. OLED is not bright enough to be a true HDR monitor. The HDR mode on OLED is dimmer than SDR. Once you go a little bit brighter you get eye strain from its flicker. OLED is not even that fast.

I don't even recommend monitors. You choose whatever you want. The point is always that OLED is worse for PC use. The market will show this trend. It's your business to choose OLED to display SDR and then pretend it's HDR. It's your business to use an OLED that's flickering all the time. It's your business to believe whatever the media says.
 
Preference is not a fact.

You're confused. The display you prefer does reach higher nits. This IS A FACT. I know, we all know. But the PREFERENCE part is you wanting these high nits over something like response time or 0 blooming.

This logic should be understandable to any adult.

The point is always OLED being worse for PC use

This is not a fact. I disagree with this opinion for PC gaming as do most professional reviewers. I agree with HUB in that OLED is overall better for PC Gaming.
 
Good thing kramkram wasn't born when CRTs were still a thing. He would have probably died from the flicker lmao.

Funny it's you pretending to see the sun with your 100nits OLED in a cave.
Oh wow, is it really? Where? I'll wait for you to back up any of your claims. And by claims I mean nonsensical and moronic lies.
 
It's your business to believe whatever the media says.

I'm not using RTINGS or HUB to form my opinion. I'm using them as citation to back up my already formed opinion after trying various miniLEDs and OLEDs for months on end. I merely agree with their own analysis, and they are far superior sources than you. IE: I can cite both sources in various display community "debates". However, I can't cite you (nor should anyone due to large amounts of misinformation and not understanding what preference means).
 
You're confused. The display you prefer does reach higher nits. This IS A FACT. I know, we all know. But the PREFERENCE part is you wanting these high nits over something like response time or 0 blooming.

This logic should be understandable to any adult.

This is not a fact. I disagree with this opinion for PC gaming as do most professional reviewers. I agree with HUB in that OLED is overall better for PC Gaming.

Higher range can give better images. This is why HDR exists in the first place. And OLED is too dim to be used effectively enough to give an overall advantage. You don't have many options with OLED.

You won't see many people using OLED for PC. The market already shows it. Competitive players won't use it because it is dim and it's not that fast. Casual players won't use it because it is, again, dim, and it doesn't have better images compared to miniLED.

Besides that, OLED is flickering. You get eyestrain when it gets a little bit brighter. You cannot even use your AW3423DW at 250 nits in a dim room.
 
The market already shows it.

Wait. Do you have a breakdown of the sales of the AW34 QDOLED vs PG32UQX vs Neo G7/G8?

I'm still going to go with my own analysis and that of HUB, RTINGS, etc. Sorry kram, I disagree with your opinion.
 
Wait. Do you have a breakdown of the sales of the AW34 QDOLED vs PG32UQX vs Neo G7/G8?
The only thing he has is whatever he pulls out of a certain place within arm's reach. Actually that's where he gets all his "facts", meaning practically everything he says.
 
The competitive players won't use it because it is dim and it's not that fast
Yeah, the skill of all those quake and cs pros just skyrocketed when they switched from those dim CRTs where they couldn't see anything to ultra bright LCDs.
Oh wait, no, that only happened in kramkram's own imagination land.
 
Wait. Do you have a breakdown of the sales of the AW34 QDOLED vs PG32UQX vs Neo G7/G8?

I'm still going to go with my own analysis and that of HUB, RTINGS, etc. Sorry kram, I disagree with your opinion.
You think AW3423DW sales are good against miniLED? Maybe it's good against the PG32UQX, since the PG32UQX is like a top-of-the-line Pagani without much production anyway.
They are not good at all against the G7/G8. And there are way more miniLEDs out there.

AW3423DW
[screenshot: 1678128710746.png]

Neo G8
[screenshot: 1678128954727.png]

Neo G7
[screenshot: 1678129006448.png]
 
You think AW3423DW sales are good against miniLED?

Those aren't sales figures. So not only do most major display reviewers disagree with you, but you lied about your only crutch (sales) this whole thread?
 
[concentrated idiocy about sales figures]
Disregarding the fact that even using sales figures as any sort of argument for something being good is beyond moronic... actually no, let's not disregard that so I'll just end my post here.
 
Those aren't sales figures. So not only do most major display reviewers disagree with you, but you lied about your only crutch (sales) this whole thread?
Of course actual figures won't be available to show you. The higher the review numbers, the higher the sales numbers lol.

It's you who needs a crutch, pretending to have a PG32UQX while owning a Neo G8 instead lol. I have an AW3423DW. I know how shitty it looks compared to a good 4-year-old miniLED.

PG35VQ HDR vs AW3423DW HDR
[screenshot: 52143868601_ecce55839d_o_d.png]

PG35VQ SDR vs AW3423DW HDR
[screenshot: 52157878212_2055476ea2_o_d.png]



Tell me OLED is good for the eyes lol.

 
Of course actual figures won't be available to show you.

Thought so. Anyway I'm going to use the display I found to work out better for ME overall. You do the same, no one will judge you. But if you start to judge others, don't get upset when they start citing major reviewers who all happen to disagree with your opinion.
 
Thought so. Anyway I'm going to use the display I found to work out better for ME overall. You do the same, no one will judge you. But if you start to judge others, don't get upset when they start citing major reviewers who all happen to disagree with your opinion.
Again, your preference is not a fact. That it works for you doesn't mean it has better images. I never judged you. I don't recommend monitors to you. All I am saying is that OLED is worse for PC use. And the market will keep the trend.
 