LG 48CX

The resale of my 2080 Ti paid for 73% of my 3090, before state tax. Insane market indeed. I do admit I was lucky as hell to snatch a 3090 at MSRP from Best Buy. I hesitated at the time, but who could predict pulling the trigger on a $1500 GPU would be a good financial decision? What an upside down world.
Yeah, I also randomly found a 3090 in stock at Best Buy at MSRP soon after it launched and casually checked out! Now they are never in stock even if you use alerts. At the time I thought the price was insane over the 3080 but now it's the best purchase I made in the last 1.5 years.
 
I bought the Gigabyte Aorus external box with the 3090 AiO inside and shucked it (they don't run as fast in the box over Thunderbolt cables as on the motherboard in a PCIe slot). You can still find the eGPU units in stock pretty regularly on Newegg, but the water lines on the stock AiO are very short. You can strip it and put a third-party AiO + bracket on it for longer lines, though. The 3090 eGPUs aren't cheap, but they were available for just over $2k for quite some time; they've gone up several hundred dollars since then. I had a spare NZXT AiO bracket from a long time ago. Besides, the brackets are just sheet metal, so you can drill them to fit any holes really, especially if you have a drill press. So the older NZXT bracket or the modern one will fit one way or the other. I just needed an AiO in the style with tabs/notches around a disc/puck heatsink that lock into the bracket's mounting tabs.

I still have the broken-down eGPU case in the box it came in and could put some other GPU in it someday if I really wanted to. It ran OK shucked in my PC with the short lines and the stock AiO, but it wasn't an optimal config, with the radiator sitting awkwardly in my case, versus going the whole nine yards with new pads, paste, fans, AiO, etc.

In general, a lot of people choose to replace the thermal pads on the backplate and any on the front heatsink of their 3000-series GPUs, and then add one or more fans to the back, because the memory junction runs hot. People also usually undervolt them slightly, since it doesn't cause a performance hit (in some cases there are even slight gains).

The pics below show the general idea, including a few pictures from someone else who did the same mod on a 3090 eGPU.
Mine is a different AiO (NZXT Kraken) with third-party fans swapped onto the radiator and different fans on the GPU PCB (I don't like the look of Noctua browns). I kept the huge copper front heatsink that covers all of the front components. It came with the card, so I might as well use it instead of putting tiny heatsink stacks on. I paid for it, after all..

[attached images]




This is what it looks like right out of the case (from the Linus Tech Tips review of it):

[attached images]



Pic of one with the heavy copper front heatsink and the rear black backplate taken off:

[attached images]
 
Quick question:
My Dell 3011 just died. So I have 3 choices:
1. the 48" C1,
2. something else,
3. wait for the 42" C2.

Any advice would be welcome :)
 
Depends what you want to do with it. Personally I'm not a proponent of using them as a desktop monitor; I'd rather use them as a media/gaming "stage" - though you'll find plenty in the thread who disagree. I use side screens for static desktop/app stuff.

Samsung is coming out with all-blue QD-OLEDs, so you might want to wait and see what that's about; they have roughly 15% to 25% higher peak brightness and potentially less chance of burn-in, since they get more output per energy state (or burn down the burn-in wear-evening buffer less rapidly, perhaps). Their smaller gaming monitor is 175 Hz (and too tiny at only ~13" tall for my taste).. but the TVs with the same tech are going to be able to do 4k at 144 Hz. Though if price is a big consideration, they will likely be expensive, and the CX/C1s will be cheaper.. in some cases $1k or less. When the 42" comes out, prices may shift too.




On the QD-OLED TV - he said they measured 1000 nits on a 1% patch and 1500-nit color volume on a 3% patch:
[attached image]


Reports say they have increased saturation, so much more vivid greens and reds: an all-blue (high-energy) OLED array stepped down with a quantum-dot filter to red and green.


https://www.theverge.com/2022/1/4/2...-34-inch-qd-oled-samsung-gaming-monitor-specs




The ~13"-tall 3440x1440 Alienware gaming monitor peaks at 1000 nits instead of 1500, as a safer desktop usage scenario. They are giving it a 3-year exchange policy that includes burn-in. I don't know if they will do the same with their 144 Hz 4k TVs at the higher peaks.
 
I'm almost hoping QD-OLED won't be as good as I have been reading, as now I will want to wait for its prices to come down rather than buying the 42" C2..
 
There is always something better around the corner, or around 2 corners... marginally at least. Early adopters pay max price. Even the CX dropped to around $900-$1000 a few times after many bought it for $1500.
 
I'm almost hoping QD-OLED won't be as good as I have been reading, as now I will want to wait for its prices to come down rather than buying the 42" C2..

Going from LCD to OLED = massive upgrade 1000% better
Going from LG C2 to Alienware QD-OLED = kind of an upgrade, sort of a side grade because of the aspect ratio and lower resolution.

Also we don't know how much the QD-OLEDs will cost yet, but they could be well over 2X as expensive. It's going to be years until they come down to current LG OLED prices.


If you're only willing to spend $1000 my recommendation is go with the LG C2, you could have a very long wait for QD-OLED.
Enjoy the C2 which is a massive upgrade over any LCD, and almost as good as the QD-OLED.
Don't suffer for years waiting for QD-OLED to come down to the price you want.
 
Going from LG C2 to Alienware QD-OLED = kind of an upgrade, sort of a side grade because of the aspect ratio and lower resolution.

Not bad advice, but just to be clear, they are also going to release 4k 144 Hz QD-OLED TVs. The Alienware monitor is probably going to be overpriced for what it is by comparison, at least in my opinion, as long as you have the space for the PPD of a larger screen. The TVs will be expensive too, sure. No word on whether they will have any smaller TVs (42"-43"-48"), at least that I've heard yet, but I'm hoping so.
 
I have been using an Acer X35 for about 5 years now, so an OLED will be a massive upgrade. So maybe the best upgrade path for me is the C2 42" in the spring, and a QD-OLED (micro-LED?) 5 years from now..
In reality I'm probably not going to get anything, as I'm about to have two kids in daycare, so my money has gone bye-bye for the next few years anyhow lol.
 
I have been using an Acer X35 for about 5 years now, so an OLED will be a massive upgrade. So maybe the best upgrade path for me is the C2 42" in the spring, and a QD-OLED (micro-LED?) 5 years from now..
In reality I'm probably not going to get anything, as I'm about to have two kids in daycare, so my money has gone bye-bye for the next few years anyhow lol.

Sounds like a now or never situation.
 
Quick question:
My Dell 3011 just died. So I have 3 choices:
1. the 48" C1,
2. something else,
3. wait for the 42" C2.

Any advice would be welcome :)
Since you seem to need a monitor now, I would look at the 48" C1. The fascination with the 42" is mostly, I feel, due to people having smaller desks.
 
Since you seem to need a monitor now, I would look at the 48" C1. The fascination with the 42" is mostly, I feel, due to people having smaller desks.

It's not even that big of a difference in viewing distance, whether you want to stay at a minimum of 60 PPD - where text subsampling and AA are better able to compensate for the perceived pixel granularity at that distance - or keep closer to 80 PPD for a more optimal viewing angle, a somewhat less aggressive pixel structure, and not having to lean quite as hard on AA.


60PPD 64 degree viewing angle
=======================

(..technically a bit too close of a viewing angle - the periphery of the screen gets pushed out too far - but the pixel granularity will at least be low enough that subsampling and AA can compensate for the most part, at a performance hit)

98" 4k screen at ~ 68.5" away has the same PPD and viewing angle and looks the same as:

77" 4k screen at ~ 54" away (60PPD, 64deg viewing angle)

65" 4k screen at ~ 45" away

55" 4k screen at ~ 38.5" away

48" 4k screen at ~ 33.5" away

42" 4k screen at ~ 29" away

27" 4k screen at ~ 19" away

---------------------------------------------------------

80 PPD 48 deg viewing angle (optimal viewing angle is typically 45 - 55 deg)
========================================================

..reduced pixel granularity, so you can probably get away with more moderate AA, and text (with tweaked subsampling) will look a little better.
..until we get to something like 150 PPD+, the pixels won't appear fine enough to stop relying on AA and subsampling. However, the GPU demand would counteract that resolution gain (8k+) anyway, losing motion clarity and motion definition, so you're probably better off using an optimal PPD on a 4k screen along with AA and text subsampling for the coming years (though using an 8k screen on the side for desktop/apps would be good).

98" 4k screen at ~ 96" away has the same PPD and viewing angle and looks the same as:

77" 4k screen at ~ 75.5" away (80PPD, 48deg viewing angle)

65" 4k screen at ~ 64" away

55" 4k screen at ~ 54" away

48" 4k screen at ~ 47" away

42" 4k screen at ~ 41" away

27" 4k screen at ~ 26.5" away

-------------------------------------------------------------

You can see the 80 PPD point (on a 4k screen) is where the screen's diagonal measurement and the viewing distance form more or less an equilateral triangle (or pyramid/cone) with your view. A view distance approaching the screen's diagonal is in the neighborhood of the optimal viewing angle for anything with HUDs, notifications, pointers, text windows, etc., in my opinion, regardless of the PPD.

Coincidentally, a 48" 4k screen at ~47"-48" away gives a 48-degree viewing angle: 48" diagonal ~ 48" view distance ~ 48° angle.

The thing is, if you were trying to use a 42" at 80 PPD, you'd actually only be saving 6" - you'd be able to set it up at a 41" viewing distance instead of 47".
However, it's more likely that people choosing a 42" over a 48" don't have the space for a 41" viewing distance on the 42" screen (or 47" on the 48") to start with, so they'll use 60 PPD, which is only a ~4" difference: 33.5" vs 29.3" distance. I doubt a desk setup that can't achieve 33.5" is going to manage 29.3", but maybe there are a few setups within that 4" threshold. heh... That, or they'll use it below 60 PPD, where the pixel structure is too aggressive for AA and text subsampling to compensate effectively without compromised fidelity - let alone desktop content that can't benefit from 3d-mode AA and text subsampling - and the viewing angle would be poor.

Say you had a desk that only let you sit ~29" away max. You'd be at 60 PPD on the 42" and 54 PPD on the 48" at that distance, so there the 42" screen would make sense to me.
If you could get a TV stand or other surface to put the screen on, I suspect you could add the other 4" of view distance, at which point the 48" screen would look exactly the same as the 42": same PPD, viewing angle, and perceived screen size from your perspective.

If you had a desk setup that let you sit 41" away for a 42" 4k screen to have 80PPD, I bet you could manage the extra 6" to 47" view distance for a 48" screen to be at 80PPD too. There are some setups that are probably cutting it close and so could benefit from a 6" difference though.

The 42" also might change things and create different options for setups using multiple screens.
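For anyone who wants to check or extend the tables above, the math is straightforward. Here's a sketch in Python under the same assumptions (flat 16:9 screen, 3840 horizontal pixels, viewer centered); the helper names are my own, not from any library:

```python
import math

H_PIXELS = 3840                         # horizontal pixels on a 4k panel
ASPECT_W = 16 / math.hypot(16, 9)       # width as a fraction of the diagonal, 16:9

def width_in(diag_in):
    """Screen width in inches, given the diagonal size."""
    return diag_in * ASPECT_W

def view_angle_deg(diag_in, dist_in):
    """Horizontal viewing angle in degrees at a given viewing distance."""
    return math.degrees(2 * math.atan(width_in(diag_in) / (2 * dist_in)))

def ppd(diag_in, dist_in):
    """Pixels per degree across the horizontal field of view."""
    return H_PIXELS / view_angle_deg(diag_in, dist_in)

def distance_for_ppd(diag_in, target_ppd):
    """Viewing distance (inches) that yields the target PPD."""
    angle = math.radians(H_PIXELS / target_ppd)
    return width_in(diag_in) / (2 * math.tan(angle / 2))
```

For example, `ppd(48, 47)` comes out to ~80 with a ~48° viewing angle, and `distance_for_ppd(42, 60)` lands at ~29.3", matching the figures in the tables.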
 
Whether it be the C2 or some QD-OLED, I'm committed to upgrading this year. I'll take some pictures of my B6 to show what extremely heavy use as a monitor over 5+ years, including two years of constant WFH, has done to the panel.

EDIT

NVM, forgot how hard it is to capture a 55" screen on a tiny phone without losing all the detail. :/
 
I'm definitely interested in Samsung's consumer QD-OLED solution. Hopefully it's going to be great!
 
I'm definitely interested in Samsung's consumer QD-OLED solution. Hopefully it's going to be great!
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.
 
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.
What do you run your OLED light level at? And do you move your windows around periodically or leave them in the same position all day when WFH?

Either way, the good news is that their panel tech has been revised to be more robust since the 6 series, so your next one should fare better.
 
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.

It will not be a competitive price, because yields are low. And don't bet on Samsung not having any SW pains; in fact, it might be even worse, given what the Neo G9 experienced. Besides, Samsung will not be offering QD-OLED in a 4k 42" form factor - just a 34" ultrawide and 55" and 65" sizes - so they are already not even competing in the same display-size category.
 
It will not be a competitive price, because yields are low. And don't bet on Samsung not having any SW pains; in fact, it might be even worse, given what the Neo G9 experienced. Besides, Samsung will not be offering QD-OLED in a 4k 42" form factor - just a 34" ultrawide and 55" and 65" sizes - so they are already not even competing in the same display-size category.
I am interested in the QD-OLED tech, but you just reminded me of why I like my LG CX so much more than my Samsung Q90R. The Q90R has a more natural picture to it, which I really like, but I HATE the post-launch support. There are still many unresolved software bugs and at least one feature (eARC) which was promised but never delivered. Compare that to the LG CX, which has gotten amazing post-purchase support with continual improvements and feature adds, and it's no contest; if my LG CX quit working after the warranty expired, I'd probably buy the newest model of the same product only because I know LG is committed to their products.
 
The same bad track record can be said of Sony TV support - lack of promised features and their proprietary bent. LG really stands out.
 
A friend who recently bought a Samsung TV told me that none of them support Dolby Vision (!!!). Doesn't necessarily matter right now for monitor use... but DV gaming in Windows is coming. They may also get with the times and add it to the QD-OLED TVs.
 
A friend that recently bought a Samsung TV told me that none of them support Dolby Vision (!!!).
you’re very very late to the party…

Btw: there isn't a monitor (not a TV) out there with Dolby Vision support, and there weren't any announcements at CES 2022 either.
Dolby Vision on monitors is not a thing, so it seems… unfortunately.
 
you’re very very late to the party…

Btw: there isn't a monitor (not a TV) out there with Dolby Vision support, and there weren't any announcements at CES 2022 either.
Dolby Vision on monitors is not a thing, so it seems… unfortunately.
The PA32UCG does support Dolby Vision input over HDMI.
 
The same bad track record can be said of Sony TV support - lack of promised features and their proprietary bent. LG really stands out.
Have never owned an LG TV. I will say I'm a little disappointed that of all manufacturers, Samsung is making the next OLED panel. Every Samsung I've ever owned, from my plasma TV to my VA monitors, has had some *thing* wrong with it. They just don't seem to be built very well. Either there's a dumb oversight that hobbles the display (a VA gaming panel that doesn't allow brightness adjustments in ULMB mode), or there's an obvious corner cut on the build quality.
 
jbltecnicspro Pretty sure Samsung is the AMD 6500 XT of TV manufacturers: they cut features to make price points fit the TV. I can only imagine the next thing on the chopping block is what they pick and choose out of HDMI 2.1 or HDMI 2.1a - then cutting the number of ports, which features each port supports, and the speed of each port.
 
What do you run your OLED light level at? And do you move your windows around periodically or leave them in the same position all day when WFH?

Either way, the good news is that their panel tech has been revised to be more robust since the 6 series, so your next one should fare better.
Most of the damage on my B6 was done early, when I was running the OLED backlight at 100; it's at 30 now. But there is definitely wear due to WFH, and other apps (the Chrome top bar especially) have done a fair bit of damage after nearly six years of constant use.
 
I am interested in the QD-OLED tech, but you just reminded me of why I like my LG CX so much more than my Samsung Q90R. The Q90R has a more natural picture to it, which I really like, but I HATE the post-launch support. There are still many unresolved software bugs and at least one feature (eARC) which was promised but never delivered. Compare that to the LG CX, which has gotten amazing post-purchase support with continual improvements and feature adds, and it's no contest; if my LG CX quit working after the warranty expired, I'd probably buy the newest model of the same product only because I know LG is committed to their products.
Yep, my B6 saw a new firmware just a few months ago. Say what you will, LG has done a good job supporting their TVs and fixing problems as they come up.
 
What do you run your OLED light level at? And do you move your windows around periodically or leave them in the same position all day when WFH?

Either way, the good news is that their panel tech has been revised to be more robust since the 6 series, so your next one should fare better.

Most of the damage on my B6 was done early, when I was running the OLED backlight at 100; it's at 30 now. But there is definitely wear due to WFH, and other apps (the Chrome top bar especially) have done a fair bit of damage after nearly six years of constant use.

From what I've read, the modern LG OLEDs reserve the top ~25% of their brightness/energy range outside of the user-available range for their wear-evening routine, which runs periodically in standby while the set is plugged in and powered. Primarily that, along with the other brightness limiters, logo dimming, pixel shift, and the turn-off-the-"screen" (emitters) trick if utilized, should extend the life of the screens considerably. With the ~25% wear-evening buffer, you won't know how much of the emitter range you are burning down until after you bottom out that buffer, though. As far as I know there is no way to determine what % of the buffer remains. So you could abuse the screen outside of recommended usage scenarios for quite some time thinking you aren't damaging it - and you aren't, sort of - but you will be shortening its lifespan by wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, a persistent toolbar, or a cross of bright window frames in the middle of the same 4 window positions, or whatever, might be the first thing to burn in when the time comes - but on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, since the routine would have been wearing down the rest of the screen to compensate all along.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery with an extra 25% charge module, except that once you turn the device on you have no idea what your charge level is. You can run power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness than you need, leave the screen on when you aren't looking at it, etc., and still get full-charge performance for quite some time - but eventually you'd burn through the extra 25%.

The QD-OLEDs are all blue, and blue is higher energy to start with, so they can hit average screen brightness at a lower energy level, which might reduce the chance of burn-in and increase the lifespan of the QD-OLED screens. However, they have a higher peak brightness spec for HDR, so depending on how much HDR material you watch, the burn-in/down difference might be slightly smaller, since they'd be back to using the higher energy states more often, at least in some bright highlights and light sources (though those usually move around dynamically in scenes). Some OLEDs have also used a heatsink across the entire back of the panel, which reduces OLED heat at higher energy states and can reduce the burn-in/down rate, but not all OLED screens have this.
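If it helps, the candle/battery idea can be written down as a toy model. To be clear, this is purely illustrative - the zone count, the 25% figure, and the wear units are made-up assumptions, not LG's actual compensation algorithm:

```python
# Toy model of a wear-evening buffer: static bright content wears one zone
# faster, and the periodic routine burns every zone down to match the worst,
# consuming the reserved headroom instead of showing visible burn-in.
# All numbers and the zone granularity here are illustrative assumptions.

class OledPanel:
    def __init__(self, zones=4, headroom=0.25):
        self.headroom = headroom       # reserved fraction above user brightness
        self.wear = [0.0] * zones      # cumulative wear per screen zone

    def display(self, zone_loads):
        """Accumulate wear; a static taskbar loads its zone more than others."""
        for i, load in enumerate(zone_loads):
            self.wear[i] += load

    def wear_evening(self):
        """Standby routine: even all zones to the most-worn one, so the wear
        stays invisible. Returns the headroom left before burn-in would
        start to show."""
        worst = max(self.wear)
        self.wear = [worst] * len(self.wear)
        return self.headroom - worst
```

The point the model makes: an abused zone doesn't show burn-in at first - it just makes every zone eat the shared headroom faster, and once that bottoms out, the whole screen is vulnerable at once.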
 
From what I've read, the modern LG OLEDs reserve the top ~25% of their brightness/energy range outside of the user-available range for their wear-evening routine, which runs periodically in standby while the set is plugged in and powered. Primarily that, along with the other brightness limiters, logo dimming, pixel shift, and the turn-off-the-"screen" (emitters) trick if utilized, should extend the life of the screens considerably. With the ~25% wear-evening buffer, you won't know how much of the emitter range you are burning down until after you bottom out that buffer, though. As far as I know there is no way to determine what % of the buffer remains. So you could abuse the screen outside of recommended usage scenarios for quite some time thinking you aren't damaging it - and you aren't, sort of - but you will be shortening its lifespan by wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, a persistent toolbar, or a cross of bright window frames in the middle of the same 4 window positions, or whatever, might be the first thing to burn in when the time comes - but on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, since the routine would have been wearing down the rest of the screen to compensate all along.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery with an extra 25% charge module, except that once you turn the device on you have no idea what your charge level is. You can run power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness than you need, leave the screen on when you aren't looking at it, etc., and still get full-charge performance for quite some time - but eventually you'd burn through the extra 25%.

The QD-OLEDs are all blue, and blue is higher energy to start with, so they can hit average screen brightness at a lower energy level, which might reduce the chance of burn-in and increase the lifespan of the QD-OLED screens. However, they have a higher peak brightness spec for HDR, so depending on how much HDR material you watch, the burn-in/down difference might be slightly smaller, since they'd be back to using the higher energy states more often, at least in some bright highlights and light sources (though those usually move around dynamically in scenes). Some OLEDs have also used a heatsink across the entire back of the panel, which reduces OLED heat at higher energy states and can reduce the burn-in/down rate, but not all OLED screens have this.
Do remember the B6 was an *early* OLED, before a lot of the enhancements to fight this type of damage were fully implemented. I'm *very* careful to specify that when talking about the wear I've experienced. But it is worth noting that long-term persistent placement of "stuff" has caused long term issues over a 6+ year timespan.
 
Most of the damage on my B6 was done early, when I was running the OLED backlight at 100; it's at 30 now. But there is definitely wear due to WFH, and other apps (the Chrome top bar especially) have done a fair bit of damage after nearly six years of constant use.
Ah thanks, that helps put things into perspective. Regarding the WFH wear, I can't imagine the Chrome top bar or any other element doing damage unless left in a static position for, well, really long periods of time. Like hour after hour and day after day. That kind of constant static "abuse" is what it takes for OLED TVs in airports and electronics showrooms to show noticeable and permanent burn-in. So I was just kinda curious about how you tend to use yours when working.

Speaking for myself, I find that running most applications (I'm obviously not talking about games, movies, and most other sources of entertainment) full screen on a 48" 4K display is undesirable...as an example, I find reading forums like this one much easier when your eyes aren't scanning super long lines of small text that span the entire length of the monitor...not to mention many websites don't take advantage of the additional space and put the majority of their content in the middle of the browser window with huge patches of blank space or ads on the left and right. So I tend to run most of my applications in smaller-than-full-screen windows that I juggle depending on what I'm doing. When WFH I am constantly shuffling between a mix of browsers, RDP sessions, password managers, document editors, etc. so I tend to keep each window sized roughly anywhere from 27"-32"-40" and I also tend to move them around a bit as I'm working so nothing stays in a static position for hours. Between the fact that I switch between so many applications and I also move things around a bit as I'm working, I don't worry much about wear.

I imagine that if I was a programmer or worked in finance, I'd use it in much the same way. I can't imagine that running a full screen IDE or Excel window would be very desirable on a 48" so I'd likely run them as smaller windows in conjunction with other applications that I used so that nothing stays in one spot for a ridiculous amount of time. I guess there might be folks that want to use the display to run the same 4 applications that when combined are full screen (one in each quadrant, like Chrome in top left, stock ticker in top right, Outlook in bottom left, something else in bottom right) or split up in any number of other ways but that's just not how I, personally, have ever worked or desire to work. There's nothing wrong with that if that's what someone finds works best for them, but if you're going to do that on an OLED you just need to be aware of how it might affect things long term.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Haha, I've seen you talk about this 25% reserved buffer before, but IMO that candle analogy really does a great job of helping people understand how it works. That's probably the simplest explanation I've seen yet, so thanks. :)

Do remember the B6 was an *early* OLED, before a lot of the enhancements to fight this type of damage were fully implemented. I'm *very* careful to specify that when talking about the wear I've experienced. But it is worth noting that long-term persistent placement of "stuff" has caused long term issues over a 6+ year timespan.
And I think that's reasonable. Few who have kept up with this technology should be surprised that the early 6 series panel that was more susceptible to burn-in, combined with the 100 OLED backlight level, combined with long-term persistent placement of stuff resulted in noticeable wear. Had any one of those factors been different from the get-go, your panel likely would have fared a lot better. Lessons learned and all that. Still, 6 years with much of that being worst-case usage isn't terrible and like I said earlier (and you surely know by now), your next one should do a lot better even if you change -nothing- about how you use your applications and you start off with a lower OLED backlight level on the more refined/robust panel tech that they're using today.
 
And I think that's reasonable. Few who have kept up with this technology should be surprised that the early 6 series panel that was more susceptible to burn-in, combined with the 100 OLED backlight level, combined with long-term persistent placement of stuff resulted in noticeable wear. Had any one of those factors been different from the get-go, your panel likely would have fared a lot better. Lessons learned and all that. Still, 6 years with much of that being worst-case usage isn't terrible and like I said earlier (and you surely know by now), your next one should do a lot better even if you change -nothing- about how you use your applications and you start off with a lower OLED backlight level on the more refined/robust panel tech that they're using today.
Oh, I agree. And I try to share my experiences with everyone so they avoid making mistakes down the road.

Also note I am a *very* heavy user; 12+ hours a day for six years. So I'm basically OLED's "worst case" user.
 
I don't even turn off my OLED lately when I leave the PC for an hour or so with windows all over the display. There is visible temporary image retention afterwards, but it disappears with use or when the OLED is turned off. Really, burn-in is not a problem at all.
 
I don't even turn off my OLED lately when I leave the PC for an hour or so with windows all over the display. There is visible temporary image retention afterwards, but it disappears with use or when the OLED is turned off. Really, burn-in is not a problem at all.
I'm at 6088 hours of powered-on time now, zero burn-in.
 
I'm at 6088 hours of powered-on time now, zero burn-in.
My 55" C7 was perfect (no burn-in) until the 8000-hour mark or so (at 100 OLED light and contrast). By 14,000 hours, when I replaced it with the 48" C1, it was noticeably less bright in the center. The red went first, then green, then blue; the white went last.
 
I don't even turn off my OLED lately when I leave the PC for an hour or so with windows all over the display. There is visible temporary image retention afterwards, but it disappears with use or when the OLED is turned off. Really, burn-in is not a problem at all.
I'd caution against this, but to each his own.
I'm at 6088 hours of powered-on time now, zero burn-in.
Is there a way to check? Would be interesting to see what I'm at.
 