Why OLED for PC use?

Strong example of justifying ownership of a certain product. Everything has flaws or compromises; nothing is perfect.

But it's easier to endlessly praise what you purchased than to accept that it has issues of its own.
Every tech has its own flaws. You're assuming I don't realize what they are for both techs. I just prefer one and am sharing my opinion, Senn.
 
Your problem is that you use a single display, the QN90B, to generalize about ALL FALD displays and say that you will never notice blooming. Sure, you won't notice it on the QN90B, but that doesn't mean you won't notice it on other FALD displays.
 

Yeah, it's not just the true/ultra/"infinite" black depths... it's that it can do that whole downward range of depth right next to a lit colored pixel, on a per-pixel basis, pixel by pixel by pixel x 8 million pixels, to a razor's edge, instead of light-bucket dumps in less than a 45x25 grid. FALD is clever for what it can do but it's a major tradeoff for me. Ultimately everything will be per-pixel emissive someyear. FALD is a stopgap solution. Soon all flagship VR headsets will be microOLED, and in later years microLED, so they are all per-pixel emissive going forward.
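
Rough back-of-envelope math on that, just as an illustration (assuming a 4K panel and the sub-45x25 grid mentioned above; real zone counts vary by model):

# Per-pixel OLED vs. a FALD backlight grid, ballpark numbers only
pixels = 3840 * 2160   # ~8.3 million individually lit OLED pixels on a 4K panel
zones = 45 * 25        # ~1,125 FALD backlight zones (assumed grid size)
print(f"pixels sharing one backlight zone: {pixels / zones:,.0f}")  # ~7,373

That's where the "cells of 7000 pixels each" figure below comes from.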



Not exactly. As I said, I'm under no illusions. I'm going with measurable results in both technologies, as reported by major trusted review sites like HDTVTest, RTINGS, and TFTCentral. OLED has major tradeoffs too. You will see ABL periodically. Scenes are dynamic, so it might not happen horribly often or last long in typical content, especially in games with near-constant FoV movement, but it will happen periodically and be noticeable when it does, with a big drop (including the aggressive ABL on the 2000nit+ Samsung TVs). So ABL is a big tradeoff on any screen that has it, compared to not having to suffer it. You will also get a much lower % of screen brightness / HDR color volume overall in HDR on OLEDs, and much shorter sustained periods. Those are all major tradeoffs. Taking the monitor tradeoffs you value more, and then trying to say no one will see the measured and reported cons of the competing display tech you like better, is different imo. There are visible and measurable tradeoffs on both monitor technologies, and they are imo major on both sides. You can tell me all of the tradeoffs of OLED are bad.. yeah, they are pretty bad, and I'd love a microLED per-pixel emissive instead someday (someyear) - but the per-pixel emissive nature of the OLED display is a better choice for what I value more, compared to what FALD is good and bad at currently. That's what it comes down to.



Yes, that is a good point. I'm aware from every tech site I've followed that different firmwares for different models can lean differently. However, all FALD will have fluctuating luminance in their large tetris-block lighting cells as the tech stands currently. Whether that luminance is biased + or -, it's non-uniform, lifting or dulling and dimming large cells of 7000 pixels each, perhaps more if using a wider FALD "gradient" of more cells like the Samsungs reportedly do in game mode. There are some reviews on HDTVTest and TFTCentral, etc., where the subtitles or credits are all dim grey instead of white, for example, or the mouse pointer, etc., because a firmware is leaning toward the dim side to prevent the contrasted edges from blooming into the black areas / lifting the black areas more. The same thing happens to the edge of a bright starship in space - dimmed instead of a bright metallic ship edge on black space down to the pixel level (or, in the opposite firmware method, the blackness of space lifted around the spaceship's edge). Even textured objects with dark details, dark details and recesses in architecture and geology/landscapes, faces + hair, bright reflections in eyes, etc. will have their tetris-block backlights dimming or lightening/lifting blocks of 7000 pixels at a time in a brickwork assortment of cells. I can see why you might prefer the dark-side method though.

It's not like I couldn't use a FALD display, or even an edge-lit one, but I wouldn't be blind to their limitations any more than to those of OLEDs. Per-pixel emissive offers the best of what we have available right now for my taste, mainly per-pixel down to oblivion right next to color pixels x 8 million pixels individually (with some of those pixels hitting HDR mids and compressed highlights). Per-pixel emissive is limited for now to OLED in the enthusiast consumer space. Once we get a better per-pixel emissive tech in the consumer space I'd be all over it. Eventually everything will be per-pixel emissive, but we have these stopgap solutions for now with their limitations.

I would happily jump back to OLED as my primary display if they could just do two things:

1. Start offering high refresh OLED options in something around 32"
2. Bring the overall brightness/colors up

Unfortunately, it seems like you cannot do #2 if you are trying to do #1. The smaller the OLED gets, the dimmer it seems to get as well. That 2023 Samsung S95C looks mighty impressive being able to keep up with a MiniLED in brightness and colors, but that thing is freaking 77 inches. If there were a 32" version of the S95C, no doubt it would be nowhere NEAR as bright as the 77". Of course the smaller MiniLED options also seem to lose brightness as the size decreases, but not to the same degree.
 
Do you think bias lighting is good for an OLED monitor? I tried it for a week and found it distracting from what I was looking at. I was just using a Power Practical LED Luminoodle strip, but now that I have it unplugged it's easier to focus.
I mean, the entire point is to get away from LED lights with an OLED monitor.
I would think so. My CRTs weren't infinite contrast like OLED, but they were deep. It helped my eyes see a little better than the alternative, which was to sit in a pitch-black room. But I think this kinda goes to the rest of the thread - to each their own.
 
I just took the bias LED strip off and it's a lot better for some reason. I still have a lamp nearby, so it's not like I'm in total darkness. With the white bias light strip it was like my eyes didn't know what to look at, the surrounding color or the monitor.
 
I used a desk lamp with a 5000K bias light. I think it was 400 or so lumens and I aimed it at the wall behind the monitor for a more even spread. This was back when my CRT Sony F520 was my daily driver.
 
I miss the awesome visual pop of the Samsung QN90A mini-LED that I used, but eyestrain is my issue with it. It did hurt my eyes looking at it, due to the brightness I think. And I've heard about settings I can tweak to minimize the eye problems. If you know of those settings that don't sacrifice too much IQ, I can try my Samsung again.

So now I'm using the C2 42 to ease the eyestrain, and it has the great colors and IQ, but not always the wow rush you get from the vividness of videos and PC games on the Samsung. And some of the Samsung TVs out now are apparently better at dealing with bloom than my QN90A, as you implied.

Btw, does OLED have the edge over Samsung mini-LED in any IQ situation in games? If I'm asking the question with the proper terms. I hate to bring that stuff up all over again, but I'm just wondering about your take on any advantages LG OLED may have in any category of image quality in PC games, or possibly movies too.

I'm a casual-type gamer playing PC games for the graphics, just touring around in them for the scenery and other IQ things. Constantly on Nexus looking for visual mods.
I am not that serious about it like others in this thread. I like to play games; I tried the C2 and returned it because it was too dim and the ABL was bothering me. The OLED has better viewing angles and faster pixel response, but it's limited to 120Hz where the QN90B is 144Hz, so the QN90B is definitely smoother; even with the OLED's faster pixel response, the extra 24fps is noticeably smoother. The wow factor, like you said, or impact of the overall image was much more desirable for me with the mini-LED. Also I don't want to deal with burn-in lol.
 
Every tech has its own flaws. You're assuming I don't realize what they are for both techs. I just prefer one and am sharing my opinion, Senn.
At least you know when something is a preference and opinion, unlike a certain person you were oddly backing on this thread.

We prefer the same tech as we've spoken about before, though I wouldn't claim that blooming isn't noticeable. The dimming zones aren't quite small enough yet but I'm sure we'll get there eventually.

This year's 90C range seems promising, but I don't believe it has any more zones, just better management of those zones (correct me if I'm wrong).
 


Not that I actually care about this, but from that video at 9:59, apparently Hollywood does in fact use OLED for grading. So much for all the claims of "nobody grades on 100 nits OLED". lmfao.
 

OLED PVMs and BVMs have been around for over a decade now. :)
 

Right, but now they are just using off-the-shelf TVs that anybody can buy from their local Best Buy. So wouldn't that mean that anyone watching a film mastered on an LG G3, on their own calibrated G3 at home, would be seeing the film with the best accuracy and as close to the creative intent as possible?
 
At least you know when something is a preference and opinion, unlike a certain person you were oddly backing on this thread.

We prefer the same tech as we've spoken about before, though I wouldn't claim that blooming isn't noticeable. The dimming zones aren't quite small enough yet but I'm sure we'll get there eventually.

This year's 90C range seems promising, but I don't believe it has any more zones, just better management of those zones (correct me if I'm wrong).

Once the dimming zones are small enough that there is no discernible blooming, there will be no reason to have an LCD layer on top of them. Which is what microLED is.
 
Depends on what level you work at in the industry.

At the top end, OLED is mostly used to show clients.
The top of the line monitors used for grading are all from Flanders Scientific.
https://flandersscientific.com/XM312U/
This is what you'd find at places such as Dolby for mastering. Most can't afford to spend $20k on a monitor though.

If you have a smaller budget, then yeah, people have been using OLED TVs to grade on for a while. There are quite a few guides on how to get into the service menu and turn off all the "safety features". That does mean users have to be much more careful with static imagery, but when "used right", it's simply the output monitor with the graded image on it, and all the tools are on a separate monitor, meaning static images are minimized. The C2, as an example, more or less destroys all other desktop monitors for accuracy, while also considering price, when things like ABL are turned off.

I'm sure the G3 will be the go-to as a client display... well, until there is something better.
 

I posted that video mostly because of this false claim right here:

[screenshot of the quoted claim]


Nobody uses a 200 nits OLED to grade HDR1000 eh?
 
The top of the line monitors used for grading are all from Flanders Scientific.
https://flandersscientific.com/XM312U/
Unless I'm crazy, it looks like their OLED monitors (the large format ones at least) are using LG panels.
 
Nobody uses a 200 nits OLED to grade HDR1000 eh?
While not aware of that specific post, I'm very aware of kamnelis. I'm more or less done having discussions with him though; let him shout into the void. Stop feeding the troll. He says basically the same 5 or so statements over and over, and then when challenged will call you a liar or otherwise insult you. His playbook could be written in about 5 minutes.

Unless I'm crazy, it looks like their OLED monitors (the large format ones at least) are using LG panels.
Yes. The context for those, again, is to show clients or to use on sets - either as a display to feed multiple camera images into at once for programming, or as a reference for the DIT/Director, or both. Not specifically for mastering.
While you could master with those displays, that isn't their intended purpose.
 
So it's intended for eye candy then?
 

I mean, in the video he clearly states that the displays are being used for actual mastering work. They are REPLACING the old Sony BVM monitors with the G3s. But it's not like I can verify that myself as fact; I'm just taking his word for it.

[screenshots from the video]
 
So it's intended for eye candy then?
Not sure how you came to that conclusion based on what I said.

Perhaps, said another way: it's good at showing a reference image that's final or near-final to clients. Using a nice 65" display matters, vs trying to have 5-6 people surround the colorist when discussing a grading session. Generally speaking, Hollywood at the top levels is a very "touch based" interpersonal-relationship industry. You want your clients comfortable on a nice comfy couch, with a convenient display that everyone can see comfortably while the colorist is buried behind his/her desk and tools.

Before the advent of HDR and OLED, there were projection rooms for clients and the colorist would be using CRT grading displays. Again for the same reasons.

I could reiterate the same points for on set, but there you'd also have to consider how inconvenient it is for the DP/camera operators to look through the viewfinder or the 5-7" display on a camera, and how that would slow down production, vs having a large display at the DIT station. Generally, crowding the people working isn't the best situation.

The short answer is: ergonomics. You want a large display of reasonably excellent quality and accuracy that people can reference on set or in the grading room so that the people working aren't getting crowded and it's generally a better experience.
 
I mean, in the video he clearly states that the displays are being used for actual mastering work. They are REPLACING the old Sony BVM monitors with the G3s. But it's not like I can verify that myself as fact; I'm just taking his word for it.
No, he's probably right. But he's also speaking specifically about one mastering house, the path they're choosing to take, and the rationale behind it. LG also has a vested interest in selling more displays, so they are definitely trying to push these for sure.

I would just say that that isn't the majority of mastering houses. It might literally just be the one. Certainly though, almost all of them use LG displays as a reference for exactly what he said (what a top-end consumer display will look like in a home) and also as a client monitor. I also noted in my first post that there are definitely people using OLEDs at different levels to grade with. So it's not as if I'm disputing that either.



As a counterpoint, here is a video of Jill Bogdanowicz with her at-home grading setup and BlackMagic Resolve color panels. You can very obviously see her Flanders Scientific display in the Zoom call:

Jill is basically a top person to reference in the industry. She works for Company3 and has credits as long as they come. She's done most of the color work for Marvel (most recently Thor: Love and Thunder, The Eternals, Black Widow, Shang-Chi), stuff for DC (like Joker and most recently Black Adam), the Spider-Man series (Far From Home, No Way Home, etc.), John Wick 3, etc.

Here is a BTS at Dolby showing the color grading process there, also using a Flanders Display:


EDIT: Here's a little BTS of IMM creative:

Looks like they're still using Sony reference displays.



I realize you're not picking a fight; I'm just showing you other data.
 
Once the dimming zones are small enough that there is no discernible blooming, there will be no reason to have an LCD layer on top of them. Which is what microLED is.
Kind of. MicroLED is technically 3 times the density of any equivalently dense array of white(/blue) LEDs, due to needing R, G, and B per pixel.

But yeah, I get what you're saying. I'm sure we'd all rather see the jump to microLED rather than a FALD so dense it might as well be it.
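
To put rough numbers on that (a hypothetical 4K panel, with the zone count from the grid size tossed around earlier; microLED needs three emitters per pixel):

# Independent light sources: microLED panel vs. a dense FALD backlight
pixels = 3840 * 2160                   # 4K pixel count
microled_emitters = pixels * 3         # separate R, G and B emitters per pixel
fald_zones = 45 * 25                   # ~1,125 zones on an assumed dense FALD grid
print(f"{microled_emitters / fald_zones:,.0f}x more independent light sources")  # ~22,118x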
 
Or we could just not obsess too much over displays and enjoy what we have? Then when something you like comes along, we have the choice to buy it. Either we buy it or we pass for something else. Mini-LED and OLED are fine for now. Pick one and enjoy it is my take on it. Lol
 
But yeah, you will... and like I quoted above, especially in game mode on Samsungs, where the blooming / luminance fluctuation / dark-area lifting / contrast loss will be even worse across a wider number of zones, and with slower transitions.






People can say they won't see VA trailing/ghosting on VA screens, uniformity issues, etc., too. I'll trust the professional reviews. I'm under no illusions about OLED's own tradeoffs and current limitations either, and even the tradeoffs of Samsung QD-OLEDs vs LG's. You can try to ignore the cons of your screen tech and model, or say it's "not *that* noticeable" etc., but there are major tradeoffs on each display type and even between models/mfgs.
Actually no, you really won't notice it. Rtings is correct when they pick displays apart looking for every single pro and con. Sure, they are correct. That doesn't mean you will notice it. The OLED has a bit faster pixels and less blooming; of course we already know that. In practice is the point I'm making. I game fast-paced shooters at 144Hz on Ultra detail and no, you don't notice anything distracting. No smear, no blur, no blooming. If all those things were an "issue" I would have a 240Hz monitor right now, but I simply don't need it. You think I can't just go out and get even a 360Hz monitor to be faster than any mini-LED or OLED? Of course I can, but I don't need it because it is "fast enough". I'm not a pro gamer, I'm just a hardcore casual lol, and trust me, reviewers always have pros and cons about monitors and displays; that doesn't mean they are deal breakers for everyone or even come into play on all content. Maybe with certain content you can trigger a weak point of a display, but that is a specific case.

For me the overall brightness of an OLED was sad. I didn't wanna look at a dim screen and didn't want to worry about burn-in. Those are noticeable, and 100% of the time you will have a dim image, ABL fluctuating, and burn-in accumulating as you use it. Those are absolutely noticeable drawbacks, for a fact. With the mini-LED I have no complaints, because the speed and blooming just are not a "deal breaker" or "issue" or "problem" or anything to worry about; you actually don't see or perceive bloom or smear. It is controlled and fast to the majority of users, except people who need the highest specs on paper, or reviewers. My guess is 9 out of 10 players are clueless and wouldn't notice anything about mini-LED, but would absolutely notice how dim an OLED is, while being none the wiser to its burn-in to top it off. It's a badass gaming monitor and you don't see or experience any of the extreme conditions because they are "good enough", if that makes sense.
 
The main place I noticed blooming on the FALD IPS I tried was desktop use (Discord, my calendar, etc.), and it was way too distracting there to be acceptable for me (I haven't tried the Samsung, so can't comment - I require between 27" and 32", which narrows things down a good bit; the FALD I tried was the ProArt UCG). I didn't really want to have to toggle it on and off each time I entered a game, etc., especially since use of the backlight was one of the main benefits of the high-cost monitor. (I will say in gaming it looked quite good for the limited testing I did, but for the price the tradeoffs weren't worth it, not to mention it would have needed to be exchanged even if I'd kept it, because of QC.) Like others have said, we all have different sensitivities, so we just have to pick the tradeoffs we have no problem living with at the end of the day; for me, that's OLED, but I can certainly see different appeals to FALD.
 

When playing SDR games or using Windows on my VA monitor, HDR is not activated, so the LEDs are at much lower brightness.
Because it's already in the dimmed state, the lower brightness lets black look very dark and there is no need for the display to activate FALD.
Desktop and SDR gaming look great.

Activating HDR while using an SDR desktop isn't the best idea :)
 
With all this oled vs qned debate going on, is there a consensus on what the better option is for text? I look at walls of text for 8 hours a day for my job, and from what I understand oled's pixel layout is not the best for text. What about qled for text?
 
Meanwhile it's illegal to sell Incandescent light bulbs end of February 2023. Just noticed they were on clearance at Walmart; luckily I have about 70 bulbs as backups. Just changed two bulbs to LEDs that I don't use, and the main lights I'll use Incadesants LEDs bulbs make me squint and blink.
 
When playing SDR games or using Windows on my VA monitor, HDR is not activated, so the LEDs are at much lower brightness.
Because it's already in the dimmed state, the lower brightness lets black look very dark and there is no need for the display to activate FALD.
Desktop and SDR gaming look great.

Activating HDR while using an SDR desktop isn't the best idea :)

Unfortunately I no longer have the ProArt to play with, but I did play around with both SDR and HDR modes in Windows 11. I remember not being totally happy with either, but I can't remember if the blooming was as much of a problem in SDR with HDR off (I know I played around with a bunch of both). This could vary by display too, and by the algorithm controlling backlight behavior. I've had a few people tell me I might have been happier with the ROG UQX instead of the ProArt, and that it allows local dimming to be off only for SDR without having to manually toggle, so maybe that would have been a better experience. At any rate, I found something that works for me, but that's good feedback to keep in mind if I ever decide to give FALD another try in the future.
 
With all this oled vs qned debate going on, is there a consensus on what the better option is for text? I look at walls of text for 8 hours a day for my job, and from what I understand oled's pixel layout is not the best for text. What about qled for text?
No doubt about it: LCD. That said, I did not mind looking at text all day long on the LG CX 48" I used for several years, but part of that is definitely attributable to having it 1m away from me.

4K+ LCD is just going to be a simpler choice. QLED, if we're talking about TVs, might have BGR subpixel layouts, which are not ideal but can be adjusted to work with ClearType.
 
Actually no, you really won't notice it. Rtings is correct when they pick displays apart looking for every single pro and con. Sure, they are correct. That doesn't mean you will notice it. The OLED has a bit faster pixels and less blooming; of course we already know that.

My guess is 9 out of 10 players are clueless and wouldn't notice anything about mini-LED, but would absolutely notice how dim an OLED is, while being none the wiser to its burn-in to top it off. It's a badass gaming monitor and you don't see or experience any of the extreme conditions because they are "good enough", if that makes sense.

OLED doesn't bloom; that's the point. It doesn't lift large cells' dark levels (or dim bright levels) in contrasted areas and scenes where dark areas detail things.. which is most scenes. Not only are the FALD cells lifting contrasted dark areas, that luminance is fluctuating as the camera in media or the FoV movement in games moves or changes, puddle-jumping across the large sub-45x25 zones. OLED lights each pixel individually, times 8 million pixels, down to a razor's edge. FALD has to lift 7000-pixel light buckets at a time in a tetris brickwork - even more in a wider splay of backlights with Samsung's QD LED FALD LCDs when in game mode apparently, causing worse bloom and with slower, noticeable transitions in game mode. For now we are also still able to get some OLEDs with glossy screens too, which makes blacks and saturation look even better compared to matte AG with any ambient lighting hitting the screen (though you can use a matte screen in a dim to dark viewing environment, where HDR is best to begin with, but it's still an abraded surface).

I'm not going to argue whether people can see 60fpsHz vs 120fpsHz, or can see VA ghosting/trailing, etc. on various screens. People Redboxed 480p DVDs on 720p screens for years after there were 1080p screens and blurays, and wouldn't know or care about the difference on 1080p content and screens (and now 1080p to 4k). That kind of argument is weak for this forum imo. Let's just put aside the sugar coating on both sides. There are major tradeoffs in both technologies. Pick the tradeoffs you are happy with, that don't bother you as much personally / that you can live with - until we can all get a better tech across the board.

When playing SDR games or using Windows on my VA monitor, HDR is not activated, so the LEDs are at much lower brightness.
Because it's already in the dimmed state, the lower brightness lets black look very dark and there is no need for the display to activate FALD.
Desktop and SDR gaming look great.

Activating HDR while using an SDR desktop isn't the best idea :)

The UCX goes back to 1300:1 contrast and the accompanying black depths when FALD isn't activated. The Samsungs can't turn theirs off unless you use the service menu, but if that is done they are back to their native contrast of 3170:1 for the Q95B/90B. OLED is "infinite" black depth contrasting next to individual black details and colored pixels to a razor's edge. I don't think I'd be happy going back from that; it would be a huge downgrade in that facet to me. Once we get per-pixel emissive microLED in the enthusiast consumer space someyear, no one will use either of these technologies if they can afford not to. Everything will go per-pixel emissive across the board someyear. VR's flagship units from major mfgs are set to all go microOLED soon, so they will all be per-pixel emissive way before PC. They are also going to start doing HDR in VR. MicroOLED goes much brighter (some claims are extremely high), plus they are right next to your eyes, so the brightness is even higher perceptually. Later they'll probably all move to microLED someyear as well. FALD is a clever stopgap solution, like OLED's burn-in avoidance tech. They are both heavily flawed in major facets. Pick your poison.
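
To put those contrast ratios in perspective, here's a quick sketch assuming a 200 nit white level (just an illustrative number):

# Black level = white level / native contrast ratio (illustrative only)
white_nits = 200
for name, contrast in [("UCX, FALD off (1300:1)", 1300),
                       ("Q95B/90B native (3170:1)", 3170)]:
    print(f"{name}: blacks bottom out around {white_nits / contrast:.3f} nits")
# vs. OLED: each pixel simply switches off, so blacks sit at ~0 nits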

With all this oled vs qned debate going on, is there a consensus on what the better option is for text? I look at walls of text for 8 hours a day for my job, and from what I understand oled's pixel layout is not the best for text. What about qled for text?

LCD has better text with its RGB subpixel layout (though some LED LCD gaming TVs are BGR, with some issues). This is because we have to resort to masking how low the PPD of our screens is by using text sub-sampling to smudge the edges of the text (and that smudging tech is not designed for non-standard subpixel layouts).
We also have to mask how large the pixel structure appears in games with aggressive AA. Highly contrasted edges are the most noticeable, but the PPD just isn't high enough overall, so it requires masking methods. If your PPD is too low, even those masking methods won't be able to avoid a degree of visible fringing on the most contrasted edges. The 2D desktop's graphics and imagery have no game AA or text sub-sampling to compensate for the perceived pixel sizes either.

That said, most people are using the larger gaming TVs, even 42" ones, at well beneath 60 PPD, which will make all pixels look larger than the fine pixels you'd expect from a 4k screen normally. So they are making the non-standard pixel structures look worse than they otherwise would. Higher PPD always gives better picture quality, within the constraints of the 50 to 60 degree human viewing angle. At a 60 deg viewing angle, any size 4k screen gets ~64 PPD; at a 50 deg viewing angle, any size 4k screen gets around 77 PPD. Unfortunately, the 2560x1440 screens that are out will never have that kind of 64 - 77 PPD at normal viewing angles, so everything will look more pixelated/fringed than on a 4k at the same viewing angles.
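
Quick sketch of where those PPD figures come from (a simple pixels-per-degree average, i.e. horizontal resolution divided by the horizontal viewing angle):

# Approximate pixels-per-degree at a given viewing angle
def ppd(h_pixels, view_angle_deg):
    return h_pixels / view_angle_deg

for h_pixels, name in [(3840, "4K"), (2560, "1440p")]:
    for angle in (60, 50):
        print(f"{name} at {angle} deg: {ppd(h_pixels, angle):.0f} PPD")
# 4K: 64 PPD at 60 deg, ~77 PPD at 50 deg
# 1440p: only ~43-51 PPD at the same angles, hence the extra fringing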

For FALD LCD, the PPD still matters, but there are 31" and smaller options that would sit well on a desk. Those desk-sized screens wouldn't require the separate simple TV stand or wall mount that a larger 42"+ screen needs to decouple it from the desk in order to get the optimal 50 to 60 deg human viewing angle and the accompanying 64 - 77 PPD. However, for FALD screens it's usually recommended you turn off the FALD for 2D desktop use, as the large sideways ice-cube trays of backlight zones will otherwise make everything on the screen non-uniform. FALD arrays are always non-uniform, even in media. Once you disable FALD you are back to a very low contrast ratio and its accompanying black depths though; on some screens more like a typical IPS 1000:1 - 1300:1. The Samsung QD LED LCD TVs get 3170:1 for the Q95B/90B, but that is still extremely poor compared to an OLED. The larger Samsung TVs still have the viewing angle + PPD issues unless you decouple them from the desk. You have to disable their FALD from the service menu for 2D desktop use if you desire to; there's no way to do it in the TV's OSD.
 
Meanwhile it's illegal to sell Incandescent light bulbs end of February 2023. Just noticed they were on clearance at Walmart; luckily I have about 70 bulbs as backups. Just changed two bulbs to LEDs that I don't use, and the main lights I'll use Incadesants LEDs bulbs make me squint and blink.
I'm still Googling incandescent LEDs (and incadesants LEDs too) but I can't find them on the net. Want to try this new tech.
 
OLED's faster response times vs LCD make very little difference, and a 60fps game on both will look terrible compared to CRT or even PDP.
OLED with its 0.1ms response times does absolutely nothing to make moving objects appear sharp; they blur as much as on a fast IPS panel.

That said, the characteristics of pixel changes on OLED and IPS are different.
After getting an OLED I am just not sure which is more distracting. LCD, while it blurs slightly more, also adds some fake details to sample-and-hold blur, especially with higher overdrive levels, making sample-and-hold blur appear different than the near-perfect precision of OLED pixel transitions. On both displays it's best not to track moving objects so much when playing games...

Of course, the advantage of OLED shows more at higher frame rates.
That said, what we need is to combine VRR with strobing. OLED, due to struggling with pixel luminance, is nowhere closer to realizing the perfect gaming monitor than LCDs are. On LCD this whole thing is easier from the point of light output, but driving backlight strobing is a bit more complicated, and the manufacturers who tried it usually failed to avoid obvious issues.
 
I'm assuming most here don't use gaming mode on an LED or OLED display or anything. Yeah, there are some situations with a screen and PC rig where it works well. But some have posted over the months and years on the forums that going the manual way with settings, and not gaming mode, produces as good or better IQ and can still compete fairly well in performance. I bring up gaming mode because it's used in testing rigs, as mentioned in earlier posts.
 
You have to use game mode on most gaming TV screens in order to get the lowest input lag the screen is capable of. It may depend on the screen tech and firmware - maybe there are some backdoor workarounds to get back to a game-mode-like state that you saw somewhere - but game mode is pretty standard, and it is how all of the review testing of input lag I've seen is done. For example, the Samsung S95B pentile QD-OLED is at 5.6ms input lag at 4k 120Hz in game mode, but jumps to 30ms to 60ms outside of it. If you aren't getting the lowest input lag the display is capable of, it wouldn't be worth the tradeoff to go another route imo.

There are other ways to improve the picture quality to your taste, like adjusting the color saturation slider in the TV's OSD in game mode to bump it up slightly, and then using ReShade (or Nvidia Freestyle I guess, which is similar but requires GeForce Experience to be running and the game to be compatible) to tweak a number of settings, including that saturation, up or down.. but also brightness, contrast, sharpness, gamma, whites, blacks, etc., on a per-game basis that it will remember. That's for SDR in game mode though. HDR looks great in games in game mode if the game implements it well enough and has user-definable HDR settings in-game, plus now there's the Windows HDR tuner to begin with.
 
When playing SDR games or using Windows on my VA monitor, HDR is not activated, so the LEDs are at much lower brightness.
Because it's already in the dimmed state, the lower brightness lets black look very dark and there is no need for the display to activate FALD.
Desktop and SDR gaming look great.

Activating HDR while using an SDR desktop isn't the best idea :)
A calibrated 3000:1 screen with a good bias light can look nice and punchy, and dare I say “inky”.
 
Actually no, you really won't notice it. Rtings is correct when they pick displays apart looking for every single pro and con. Sure, they are correct. That doesn't mean you will notice it. The OLED has a bit faster pixels and less blooming; of course we already know that. In practice is the point I'm making. I game fast-paced shooters at 144Hz on Ultra detail and no, you don't notice anything distracting. No smear, no blur, no blooming. If all those things were an "issue" I would have a 240Hz monitor right now, but I simply don't need it. You think I can't just go out and get even a 360Hz monitor to be faster than any mini-LED or OLED? Of course I can, but I don't need it because it is "fast enough". I'm not a pro gamer, I'm just a hardcore casual lol, and trust me, reviewers always have pros and cons about monitors and displays; that doesn't mean they are deal breakers for everyone or even come into play on all content. Maybe with certain content you can trigger a weak point of a display, but that is a specific case.

For me the overall brightness of an OLED was sad. I didn't wanna look at a dim screen and didn't want to worry about burn-in. Those are noticeable, and 100% of the time you will have a dim image, ABL fluctuating, and burn-in accumulating as you use it. Those are absolutely noticeable drawbacks, for a fact. With the mini-LED I have no complaints, because the speed and blooming just are not a "deal breaker" or "issue" or "problem" or anything to worry about; you actually don't see or perceive bloom or smear. It is controlled and fast to the majority of users, except people who need the highest specs on paper, or reviewers. My guess is 9 out of 10 players are clueless and wouldn't notice anything about mini-LED, but would absolutely notice how dim an OLED is, while being none the wiser to its burn-in to top it off. It's a badass gaming monitor and you don't see or experience any of the extreme conditions because they are "good enough", if that makes sense.
This sounds a bit too close to that other poster for comfort. Just because you don't notice something doesn't mean that no one else will. And vice versa.

Is it not the case that, rather than having not noticed the inherent flaws of the technology, you've simply accepted them because they don't bother you, in your opinion?
 
This sounds a bit too close to that other poster for comfort. Just because you don't notice something doesn't mean that no one else will. And vice versa.

Is it not the case that, rather than having not noticed the inherent flaws of the technology, you've simply accepted them because they don't bother you, in your opinion?
Nah, it's fine. If you want anything faster, go 240 or 360Hz. If you like OLED, go OLED. There is no motion blur or blooming that an average gamer would notice. Only pixel peepers maybe, but in that regard a pixel peeper should find something wrong with any display tech, right lol. There are other options out there. For me it's beautiful 😍
 
You have to use game mode on most gaming TV screens in order to get the lowest input lag the screen is capable of. It may depend on the screen tech and firmware - maybe there are some backdoor workarounds to get back to a game-mode-like state that you saw somewhere - but game mode is pretty standard, and it is how all of the review testing of input lag I've seen is done. For example, the Samsung S95B pentile QD-OLED is at 5.6ms input lag at 4k 120Hz in game mode, but jumps to 30ms to 60ms outside of it. If you aren't getting the lowest input lag the display is capable of, it wouldn't be worth the tradeoff to go another route imo.

There are other ways to improve the picture quality to your taste, like adjusting the color saturation slider in the TV's OSD in game mode to bump it up slightly, and then using ReShade (or Nvidia Freestyle I guess, which is similar but requires GeForce Experience to be running and the game to be compatible) to tweak a number of settings, including that saturation, up or down.. but also brightness, contrast, sharpness, gamma, whites, blacks, etc., on a per-game basis that it will remember. That's for SDR in game mode though. HDR looks great in games in game mode if the game implements it well enough and has user-definable HDR settings in-game, plus now there's the Windows HDR tuner to begin with.
Ya, game mode is a game changer. I permanently keep it in game mode with HDR on. Looks fantastic and very snappy. Different firmwares behave differently though; I'm lucky my firmware is known to be a good one. That's why I disconnected my display from the internet, to make sure it stays offline. I also cleared all the bloatware apps and have it barebones. The only fancy features I keep on are HDR, VRR, game mode, FALD, and contrast enhancer; everything else off.
 