Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Looks like I'm starting to get some burn-in from using two Chrome windows side by side and watching a lot of YouTube. The faint line going down the center of the screen seems to be from the scroll bar, and the one towards the lower part of the screen is from the bottom of the YouTube video player window. I tried running a panel refresh but it didn't fix it. Has anyone else here come across this? I'm hoping it goes away over time, but the panel refresh failing to fix it has me a bit worried.
 

Attachments

  • OLEDBurnIn.jpg (309.5 KB)
I'm expecting that in 6 months every forum will be filled with burn-in-related topics about these monitors. OLED is still OLED.
 
GameLifter
I have my Edge windows arranged like this and usually use the PC 12-16 hours a day. I just checked and don't see any burn-in or retention from the scroll bars or the taskbar.

[Attached screenshot: desktop-temp.jpg]
 
Last edited:
GameLifter
I have my Edge windows arranged like this and usually use the PC 12-16 hours a day. I just checked and don't see any burn-in or retention from the scroll bars or the taskbar.

View attachment 475646
Interesting. I use mine about 6 hours a day and I regularly move the windows around and minimize the one I'm not using. The center scroll bar on mine isn't dark like yours so maybe that's why mine caused image retention. I'm also using the default bright theme.
 
Looks like I'm starting to get some burn-in from using two Chrome windows side by side and watching a lot of YouTube. The faint line going down the center of the screen seems to be from the scroll bar, and the one towards the lower part of the screen is from the bottom of the YouTube video player window. I tried running a panel refresh but it didn't fix it. Has anyone else here come across this? I'm hoping it goes away over time, but the panel refresh failing to fix it has me a bit worried.
From that picture it just looks like a normal vertical OLED band to me, if there is any doubt in your mind. Though if that were the case, it would have always been there.
 
Yeah if you look closely on grey or dark grey you can see tons of vertical streaks. These are visible on both this monitor and my LG OLED but you have to be pretty close and in a dark room to see them.
 
Are y'all using screen savers? I use these aggressively on my CRTs. And was thinking this would need to be treated the same.
 
Are y'all using screen savers? I use these aggressively on my CRTs. And was thinking this would need to be treated the same.
No, but if I do plan on leaving the computer for more than an hour I just turn it off.
 
Are y'all using screen savers? I use these aggressively on my CRTs. And was thinking this would need to be treated the same.
Really Slick Screensavers... just saying... I have used them for, I guess, close to 15 years now? I don't know when he started them. Love the fireworks one; every 4th it's on my computers. And on an OLED with perfect blacks... they are amazing.
 
Are y'all using screen savers? I use these aggressively on my CRTs. And was thinking this would need to be treated the same.
Since I keep my screens on for up to 12-16 hours a day, I decided to use the ultimate screen saver: a pitch-black wallpaper with no taskbar (not even auto-hidden). The screen's pixels only work when I need them. Mind you, I also have two other IPS screens I'm actively using (with my usual wallpaper, etc.), so this may not be the best solution for you depending on your monitor setup.

I started using this setup a couple of months before receiving the monitor, to test it out and see if it would be a bother or not. So far it hasn't been a problem. The biggest issue, I would say, is that the screen turns off after a pixel refresh, which makes it disconnect from the PC and messes up the taskbar arrangement, etc. For such a good screen, it's a real shame that they couldn't get it perfect and fix all these small quirks... it's to the point that I'm still wondering if I'll keep it or not, or maybe wait for a refresh that fixes the annoyances. I'll probably end up keeping it since I do not have any other HDR screen of this calibre.
 
I saw several high-Hz OLEDs on exhibit at DisplayWeek 2022.

I'm very excited about the arrival of these high-Hz OLED displays.

120Hz-vs-240Hz is much easier to see on OLEDs than LCDs because slow GtG pixel response doesn't throttle the refresh rate differences.

This makes it easier to see the geometric upgrades of a sample-and-hold display (60 -> 120 -> 240 -> 480 -> 1000Hz) -- the kind of Hz jumps most everyday users need in order to see differences along this curve of diminishing returns. The benefit will be more noticeable on OLEDs, unclouded by LCD GtG, reducing display motion blur without needing strobing or BFI.
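To put rough numbers on that geometric curve, here is a quick illustrative sketch (my own, not from the post; it assumes an ideal zero-GtG panel and the usual approximation that full-persistence sample-and-hold MPRT is about 1/refresh-rate):

```python
# Rough sketch: perceived motion blur on an ideal (0ms GtG, flicker-free)
# sample-and-hold display, where each frame is held for the full refresh period.

def sample_and_hold_blur_px(refresh_hz: float, speed_px_per_sec: float) -> float:
    """Blur trail length in pixels: eye-tracking speed times frame hold time."""
    persistence_sec = 1.0 / refresh_hz
    return speed_px_per_sec * persistence_sec

for hz in (60, 120, 240, 480, 1000):
    blur = sample_and_hold_blur_px(hz, speed_px_per_sec=960)
    print(f"{hz:>4} Hz: {1000.0 / hz:5.2f} ms persistence, ~{blur:4.1f} px blur at 960 px/s")

# Each doubling of Hz halves the blur (~16 -> 8 -> 4 -> 2 -> ~1 px),
# which is why the upgrade steps need to be geometric to stay visible.
```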

Even 175 Hz OLED looks slightly clearer than most 240 Hz LCD (non-strobed).

That being said, we're still a long way to perfectly matching CRT motion clarity (requires some form of a strobing/scanning/flicker mode, unless you use ultra high refresh rates on a sample-and-hold display).
 
Last edited:
What are you using to disable the taskbar?
I drag and drop the main taskbar onto one of my other displays. While I don't think it's necessary, I also use DisplayFusion and hide the secondary taskbar on the OLED (right-click the taskbar --> Multi-Monitor Taskbar --> Position --> Disable).
 
I drag and drop the main taskbar onto one of my other displays. While I don't think it's necessary, I also use DisplayFusion and hide the secondary taskbar on the OLED (right-click the taskbar --> Multi-Monitor Taskbar --> Position --> Disable).
Oh, one additional consideration is that gaming performs with less stutter on the primary monitor when you play games on different-Hz multi-monitor setups. It's a very annoying problem.

So, one has to skip the windowed/borderless and go FSE with the OLED as primary. Or simply disable all except the one you play games on.

Then again -- DisplayFusion can automatically help with that (switch to the OLED as the only monitor when you're playing a game) to fix those "different Hz multimonitor game stutters" that happen too often on current versions of Windows (except in true FSE).
 
I've been using Buttery Taskbar to hide the taskbar. It only appears when you hit the Windows key and doesn't have that 2-pixel line that is always visible like the default auto-hide behavior does. It's pretty slick, but there is one downside - the invisible taskbar still takes up space when you maximize a desktop window.

[Attached screenshot: 1653157273178.png]


EDIT: NM - I figured it out. Maximized windows will use the full vertical real estate if I tell Windows to auto-hide the taskbar, even though I am using Buttery to hide it too. Kind of unintuitive, but it works.
 
Last edited:
Oh, one additional consideration is that gaming performs with less stutter on the primary monitor when you play games on different-Hz multi-monitor setups. It's a very annoying problem.

So, one has to skip the windowed/borderless and go FSE with the OLED as primary. Or simply disable all except the one you play games on.

Then again -- DisplayFusion can automatically help with that (switch to the OLED as the only monitor when you're playing a game) to fix those "different Hz multimonitor game stutters" that happen too often on current versions of Windows (except in true FSE).
Good advice. I've always tried to play in FSE so I haven't really noticed it. It's good to know though, and certainly worth keeping in mind when unexplained stutter appears. I've sure seen it while watching Netflix though; I have been switching to "PC screen only" in the Project menu in Windows, in part to fix that but also to turn off my other screens without losing HDCP status.
 
From that picture it just looks like a normal vertical OLED band to me, if there is any doubt in your mind. Though if that were the case, it would have always been there.
It wasn't there before. In the pic I was using the Steam chat window and if I move it over the scrollbar area in the center it lines up perfectly. Same with the lines from the bottom part of the YouTube player.
 
I saw several high-Hz OLEDs on exhibit at DisplayWeek 2022.

I'm very excited about the arrival of these high-Hz OLED displays.

120Hz-vs-240Hz is much easier to see on OLEDs than LCDs because slow GtG pixel response doesn't throttle the refresh rate differences.

This makes it easier to see the geometric upgrades of a sample-and-hold display (60 -> 120 -> 240 -> 480 -> 1000Hz) -- the kind of Hz jumps most everyday users need in order to see differences along this curve of diminishing returns. The benefit will be more noticeable on OLEDs, unclouded by LCD GtG, reducing display motion blur without needing strobing or BFI.

Even 175 Hz OLED looks slightly clearer than most 240 Hz LCD (non-strobed).

That being said, we're still a long way to perfectly matching CRT motion clarity (requires some form of a strobing/scanning/flicker mode, unless you use ultra high refresh rates on a sample-and-hold display).
Not to digress too much off topic, but what kind of refresh rates would be needed in sample-and-hold displays to match CRT motion clarity?
 
It wasn't there before. In the pic I was using the Steam chat window and if I move it over the scrollbar area in the center it lines up perfectly. Same with the lines from the bottom part of the YouTube player.
Yikes - well that's not good. Even WOLED supposedly takes a few thousand hours to burn in and QD-OLED is supposed to be better. I guess you'll be one of the first ones to claim a burn-in warranty.
 
Not to digress too much off topic, but what kind of refresh rates would be needed in sample-and-hold displays to match CRT motion clarity?
It's hard to find actual response times for CRTs. According to this paper, a ViewSonic G90fB has a 0-255 response time of 0.58 ms and a 255-0 response time of 2.45 ms. According to this paper, a Mitsubishi RDF193H has a 0-255 response time of 150 μs and a 255-0 response time of 2 ms.

For a sample and hold display to have persistence of 150 μs would require a refresh rate of 6,666 Hz. 1,724 Hz for 0.58 ms, 500 Hz for 2 ms, and 408 Hz for 2.45 ms.

Suffice to say we're a long way away from CRT rise times in sample and hold displays, especially considering we don't even have a panel technology that can actually achieve those response times. OLED comes close, but it's still slower.
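As a quick sanity check of those figures (my own sketch; it just applies the rule that a frame held for t milliseconds blurs like a 1000/t Hz sample-and-hold display):

```python
# Equivalent sample-and-hold refresh rate for a given persistence:
# holding a pixel lit for t milliseconds blurs like a (1000 / t) Hz display.

def equivalent_refresh_hz(persistence_ms: float) -> float:
    return 1000.0 / persistence_ms

crt_response_times_ms = [
    ("Mitsubishi RDF193H 0->255", 0.150),
    ("ViewSonic G90fB 0->255", 0.58),
    ("Mitsubishi RDF193H 255->0", 2.0),
    ("ViewSonic G90fB 255->0", 2.45),
]

for label, ms in crt_response_times_ms:
    print(f"{label:26s} {ms:5.3f} ms -> ~{equivalent_refresh_hz(ms):,.0f} Hz")

# ~6,667 Hz, ~1,724 Hz, ~500 Hz and ~408 Hz respectively.
```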
 
Last edited:
I saw several high-Hz OLEDs on exhibit at DisplayWeek 2022.

120Hz-vs-240Hz is much easier to see on OLEDs than LCDs because slow GtG pixel response doesn't throttle the refresh rate differences.

Even 175 Hz OLED looks slightly clearer than most 240 Hz LCD (non-strobed).
Just the person I wanted to hear from; very glad you are active here.
Any updated ETA on 240Hz? Guess we are waiting on Realtek for updated HDMI/DP interfaces anyway...

I'm looking to buy an OLED this year and the 42" or refurb 48" LG is likely the way I'll go. But the Alienware also has me interested. Is the extra 55Hz worth it? I'd love the vertical area of the 16:9 panel for content, gaming and productivity, but I'm willing to compromise; it's also a bit easier to fit.

Not to digress too much off topic, but what kind of refresh rates would be needed in sample-and-hold displays to match CRT motion clarity?
Look at the 8K CRT thread; Chief covers that answer in extreme detail.

It's hard to find actual response times for CRTs. According to this paper, a ViewSonic G90fB has a 0-255 response time of 0.58 ms and a 255-0 response time of 2.45 ms. According to this paper, a Mitsubishi RDF193H has a 0-255 response time of 150 μs and a 255-0 response time of 2 ms.

For a sample and hold display to have persistence of 150 μs would require a refresh rate of 6,666 Hz. 1,724 Hz for 0.58 ms, 500 Hz for 2 ms, and 408 Hz for 2.45 ms.

Suffice to say we're a long way away from CRT rise times in sample and hold displays, especially considering we don't even have a panel technology that can actually achieve those response times. OLED comes close, but it's still slower.
LEDs have rise times in the low nanoseconds, depending on the driver design and type. They're the way to go. There are already lithography-based MicroLED arrays driven at 1 kHz, which is why I'd expect this first in VR goggles, and some MiniLED panels are internally running at ~2 kHz. The base tech is there; it's just not cost-effective yet and still at the lab-prototype stage for now.
 
Last edited:
It's hard to find actual response times for CRTs. According to this paper, a ViewSonic G90fB has a 0-255 response time of 0.58 ms and a 255-0 response time of 2.45 ms. According to this paper, a Mitsubishi RDF193H has a 0-255 response time of 150 μs and a 255-0 response time of 2 ms.

For a sample and hold display to have persistence of 150 μs would require a refresh rate of 6,666 Hz. 1,724 Hz for 0.58 ms, 500 Hz for 2 ms, and 408 Hz for 2.45 ms.

Suffice to say we're a long way away from CRT rise times in sample and hold displays, especially considering we don't even have a panel technology that can actually achieve those response times. OLED comes close, but it's still slower.
Blur Busters Approved answer.

But I must add the nuance that the consistency of the GtG heatmap of an OLED is much better than both CRT and LCD.

This is because 0-255 and 255-0 (or any color) is almost identical on many OLEDs and direct-view discrete MicroLED displays (no LCD layer).

Based on tests done so far, 1000fps 1000Hz with near-0ms GtG, is sufficient to be competitive with medium-phosphor-decay CRTs like Sony FW900 CRT. This isn't as fast as the 0-255, but much faster than 255-0. You have perfectly symmetric motion blur with no phosphor trails, ghosting or coronas, when it comes to OLEDs and direct-view MicroLEDs.

Let's talk about the leading edge. A display's pixel response can actually be overkill-league fast (a 150-microsecond rise response could easily be a 500-microsecond rise response instead and you couldn't tell the difference), and in some cases, intentionally slowing it down to 1ms would not be visible at most motion speeds (e.g. 480 pixels/sec motion would be only ~0.5 pixels of blurring for a 1ms MPRT). But you still see the phosphor trail behind, even if you intentionally slow down the rise. "150us rise with 2ms* fall" versus "1ms rise with 2ms* fall" can look identical at slow motion speeds. What this means is that "1ms rise with 1ms fall" will look superior at these motion speeds -- which is why you only need roughly ~1000fps ~1000Hz to match medium-persistence CRT motion clarity with just sample-and-hold (give or take).
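A rough numeric illustration of that point (my own sketch, assuming the blur contributed by each edge is simply the motion speed multiplied by that edge's transition time):

```python
# Why an overkill-fast rise buys little once the fall is slow:
# the slower edge dominates the visible smear at a given motion speed.

SPEED_PX_PER_SEC = 480  # the motion speed used in the example above

def edge_blur_px(edge_time_ms: float) -> float:
    return SPEED_PX_PER_SEC * edge_time_ms / 1000.0

cases = [
    ("CRT-like:    0.15 ms rise, 2 ms fall", 0.15, 2.0),
    ("slowed rise: 1.00 ms rise, 2 ms fall", 1.0, 2.0),
    ("symmetric:   1.00 ms rise, 1 ms fall", 1.0, 1.0),
]

for label, rise_ms, fall_ms in cases:
    print(f"{label}: leading edge ~{edge_blur_px(rise_ms):.2f} px, "
          f"trailing edge ~{edge_blur_px(fall_ms):.2f} px")

# The first two cases look nearly identical (sub-pixel leading edge, ~1 px trail);
# only shortening the fall to match the rise removes the visible trail.
```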

More Hz is better -- even 6666 Hz, as 1000Hz isn't even the final frontier. However, what this means is that CRT-matching sample-and-hold (assuming consistent all-colors near-0ms GtG tech like OLED or MicroLED) is coming sooner rather than later -- my vision is to see this happen by the end of this decade.

____

*Now, it takes more than 2ms for the phosphor to decay all the way back to black, though. It's more like >20ms if you use a more accurate oscilloscope.

In ApertureGrille's CRT test, the phosphor decay of CRT is visible for more than 20 milliseconds:

[Attached image: ApertureGrille CRT phosphor decay capture (1653197341340.png)]


Most of the time, CRT phosphor decay is measured to 90% decay. I am not sure if the paper did this, but I immediately noticed the noise-floor problem in this paper: it is potentially more of a 255->25 test, approximately. The reason 10%-vs-90% thresholds are used is the noise floor of oscilloscopes (the vertical thickness of the oscilloscope trace is almost 10% of the brightness range), so it would not be able to easily distinguish colors below roughly RGB(25,25,25).*

*Correction (2022-07-24) - It's actually roughly ~RGB(88,88,88), not RGB(25,25,25), because of gamma 2.2, and the GtG cutoffs are photon-based. You need roughly RGB(88,88,88) to have 10% of the photons of RGB(255,255,255). So, effectively, GtG cutoff thresholds are omitting any blurs from one-third of the possible numeric RGB values, from a programming perspective. That's worse than I expected.
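For anyone curious about the arithmetic behind that correction, here is a small sketch (mine, assuming a simple 2.2 power-law display gamma rather than the exact sRGB transfer function):

```python
# Which 8-bit grey emits roughly 10% of the photons of RGB(255,255,255)?
# With a power-law gamma of 2.2, light output scales as (code / 255) ** 2.2.

GAMMA = 2.2

def relative_light_output(code: int) -> float:
    return (code / 255.0) ** GAMMA

code_for_10_percent = 255 * 0.10 ** (1.0 / GAMMA)
print(round(code_for_10_percent))   # ~90, i.e. in the ballpark of RGB(88,88,88)
print(relative_light_output(88))    # ~0.096 -> just under 10% of full white
print(relative_light_output(25))    # ~0.006 -> far below a 10% cutoff

# So a photon-based 10% GtG cutoff ignores everything up to roughly RGB(88,88,88),
# about a third of the 8-bit code range, not just up to RGB(25,25,25).
```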

In practice, watching www.testufo.com in full-screen mode with the stars disabled (black background only) on a CRT and looking for phosphor ghosts, the ghosts trail multiple refresh cycles behind -- that's more than 1/60sec worth of phosphor decay, not 2ms. Faint as it may be, phosphor ghosting can be distracting to some people who are picky about that stuff -- utterly unimportant to some, important to others. On a typical medium-persistence-phosphor CRT like the Sony FW900, it takes much longer to decay to fully black (>20ms when recorded with a noise floor tighter than the difference between RGB(0,0,0) and RGB(1,1,1), using a very sensitive Thorlabs photodiode plus an accurate oscilloscope such as a good Tektronix). The researchers probably used a simple photodiode without an op-amp, directly connected to the leads of an oscilloscope -- which is fine enough for 10%-90% measurements, but it makes it hard to push the noise floor much below the 10% threshold, closer to 0%.

Look at how noisy the oscilloscope trace is in the research paper you linked; the noise is almost 10% of the greyscale range -- it would be hard to measure the difference between, say, RGB(0,0,0) and RGB(10,10,10), which is below that specific rig's noise floor.

[Attached image: oscilloscope trace from the linked paper (1653195124002.png)]


Observe that the oscilloscope noise floor is extremely thick (covering almost a 25-level greyscale span in 8-bit greyscale space), so more accurate measurements show more than 2ms of decay. Rise time is almost instant, but decay takes much longer. It's still much better than most LCDs, though not as good in motion as a cherrypicked strobed LCD (the best <1% of LCDs).

Also, the Oculus Quest 2 uses a 0.3ms strobe flash, so you'd need 3333fps 3333Hz sample-and-hold to match that. The Quest 2 does a perfect symmetric 0.3ms in both directions (255-0 and 0-255), with strobe crosstalk below the human-visibility noise floor (less than the difference between RGB(254,254,254) and RGB(255,255,255), for every pixel on the surface, top/center/bottom). Few LCDs can pull that "massively better than LightBoost" feat off and look subjectively clearer in motion than a CRT, even with a cherrypicked tuning on a cherrypicked LCD. But the easiest way to witness this is just to borrow an Oculus Quest 2 or one of the fast-switching VR LCDs that manages to have a virtually perfect GtG heatmap (complete transitions during the dark cycle of the strobe).

Since we can't do such refresh rates yet, we have to use strobing to simulate a CRT for now -- with its attendant compromises (squarewave flash is a bit more harsh on eyes than rolling-scan decay).

For all practical intents and purposes, all pixel colors have 0.3ms MPRT and 0ms GtG on the Quest 2 (since strobe crosstalk went to zero, the effective GtG is controlled by the backlight, ala strobe backlight, with the LCD-layer GtG completely hidden in the dark periods between strobe backlight flashes). Most geeks here won't want to get a Quest 2 because they don't want Facebook, but I still have to give accolades to John Carmack's superlative work on the Quest 2 LCD as one of the best CRT-beating LCDs I have ever seen, with perfect ~0ms GtG & ~0.3ms MPRT bidirectional symmetry for all possible color-pair combinations -- at least borrow a friend's Quest 2 and see that LCD for yourself. You'll quickly agree with me that it (on average) beats the motion resolution of a CRT tube -- try a good fast-moving dark TestUFO Panning Photo Test With Stars at 1920 pixels/sec in the Oculus Browser, full-screen mode (bigger browser canvas in VR than the default).

The bonus is that the Quest 2 is pre-calibrated out of the box, with a perfect GtG=0 (for human vision), since the LCD-layer GtG is successfully hidden by the dark cycle of the backlight and the strobe backlight becomes the vision-effective GtG instead. On top of that, it is heavily voltage-boosted to very bright and very brief flashes (0.3ms MPRT), with no Strobe Utility needed, unlike the ViewSonic XG2431. Though the Quest 2 and XG2431 can achieve similar zero-crosstalk quality, the Quest 2 does it more easily (preinstalled QFT mode -- specs revealed at DisplayWeek 2022 informed me they are also using a fast-scanout / QFT technique for an ultralarge VBI to hide LCD GtG) and it strobes brightly thanks to all the engineering work they put into that VR LCD. So for the world's easiest better-than-CRT-motion-clarity LCD, just test-drive a Quest 2.

Try that on a Sony FW900 (medium-persistence phosphor) at night with the lights out. You'll see that phosphor decay somewhat blurs the fast-moving stars. Now put on your Quest 2 VR headset, load this forum URL into it, use your VR controller to click the link in this post, then click the full-screen button in TestUFO (to make the browser bigger), and look for any motion blur, ghosting, fuzziness, or phosphor trails -- there is none. The star pixels are still perfectly resolvable with no ghost-behind effects. That being said, the blacks are crappy. The FW900 CRT beats the Quest 2 in blacks for sure (better stars), but the motion clarity of the stars is much better on the Quest 2 than on the Sony FW900. Do this, and you'll see that symmetric ultrafast effective-visible response (for both GtG *and* MPRT simultaneously, for ALL colors) actually makes the stars look significantly clearer. Use very fast pixel speeds like 1920 pixels/sec -- to push the limits of the CRT as well as the Quest 2's motion clarity.

You can get better consistency than a CRT (a perfect GtG heatmap) with a cherrypicked strobed LCD like the Oculus Quest 2. If you have ever worn a Quest 2 VR headset, dark TestUFO motion looks significantly clearer in the in-VR web browser than on a medium-persistence-decay CRT tube like the FW900. There are faster CRTs, but I like talking about the FW900 because it's a famous model, it is a widescreen that is more "comparable" to a 24" LCD for easier 1080p-vs-1080p comparisons, and it's an easier benchmark to hit for showing that the Venn diagram of LCD vs CRT now overlaps in motion clarity.

Now with that in mind, we don't need 3333fps 3333Hz to roughly match the averaged (all colors, response in both directions) response of a Sony FW900 CRT. It's more like 1000fps 1000Hz. Some CRT transitions (0->255) will always be faster, but a consistent near-0ms GtG obliterates the cherrypicked-colors advantage, so an OLED-format or MiniLED-format 1000fps 1000Hz display should be "within territory of achieving CRT clarity via blurless sample-and-hold". Also, CRTs have limited spatial resolution compared to digital panels, so fast motion speeds like 1920 pixels/sec are always faster on a CRT than on a 4K display.

TL;DR: To have a non-strobed sample-and-hold display match the motion clarity of a typical medium-persistence CRT, you need approximately 1000fps 1000Hz on an OLED or MicroLED display to avoid the need for strobing technology. The required number is lowered to only 1000, from the near-five-digit Hz range, by the achievement of bidirectional near-zero GtG & MPRT with virtually perfect consistency (within human-visible error margins) for all color combinations of pixel transitions (zero phosphor ghosting).
 
Last edited:
Looks like I'm starting to get some burn-in from using two Chrome windows side by side and watching a lot of YouTube. The faint line going down the center of the screen seems to be from the scroll bar, and the one towards the lower part of the screen is from the bottom of the YouTube video player window. I tried running a panel refresh but it didn't fix it. Has anyone else here come across this? I'm hoping it goes away over time, but the panel refresh failing to fix it has me a bit worried.
F
 
I'm expecting that in 6 months every forum will be filled with burn-in-related topics about these monitors. OLED is still OLED.
Even more than that, this first-gen panel is guaranteed to be Samsung's learning period (so it's going to be years before Samsung catches up with LG's vast array of WOLED lessons learned).

Remember when the LG C6 was a magnet for burn-in? It took several years of careful firmware updates plus panel tweaks before they had a usable long-life OLED TV.
 
Last edited:
Even more than that, this first-gen panel is guaranteed to be Samsung's learning period (so it's going to be years before Samsung catches up with LG's vast array of WOLED lessons learned).

Remember when the LG C6 was a magnet for burn-in? It took several years of careful firmware updates plus panel tweaks before they had a usable long-life OLED TV.
I have the LG 2016 C6 and it has burn-in, which has severely lowered my confidence in investing too much money into an OLED display if I feel it will just be defective within 4-5 years. The OLED C6 had such great picture quality (to me it was a perfect TV), so it is sad to see it deteriorate when it should have been usable for 10+ years; instead, it will become waste sooner than expected.

Plus, it's not like the average consumer buying these types of TVs is told about burn-in, or that an OLED screen will not last as long as other screen tech despite your best efforts.

Heck, where does Dell's general marketing mention that the 3423 pixel-shifts so often, the non-standard sub-pixel arrangement and its cons, the constant pixel refresh every 4 hours, the SDR nits, etc.? All things that are just conveniently left out. Sure, the warranty covers burn-in for 3 years, but they don't say whether burn-in is still practically guaranteed to happen at 4-5 years. Many of us here may be looking to upgrade by then or earlier, but if your screen has burn-in, good luck selling it or having it retain any value.
 
Even more than that, this first-gen panel is guaranteed to be Samsung's learning period (so it's going to be years before Samsung catches up with LG's vast array of WOLED lessons learned).

Remember when the LG C6 was a magnet for burn-in? It took several years of careful firmware updates plus panel tweaks before they had a usable long-life OLED TV.
Assuming it's calibrated to a standard 120 nits for professional computer use, I rarely see OLED burn-in on newer LG panels. I think you're OK with the eight-series and newer.

Although the jury is still out, I feel that Samsung is not going to burn in as easily as the very old LG C6, because they've had many years of OLEDs in Samsung Galaxy tablets and they've had time to refine the technology.

So they'll essentially be jumping straight to Version 2.0 technology, although it might not be as immune as 2022 LG panels, who knows?

Time will tell, but I don't think Sammy's will burn in as quickly as early LG C6s did. They've had many years of manufacturing Samsung Galaxy OLEDs, as well as years of internally testing larger OLEDs (probably more years of internal beta than LG, because they refrained from launching large OLEDs for so long). LG may still be ahead in burn-in-immunity experience now, but I don't think the years-delta will be as big as feared.
 
Last edited:
OLED TVs have come a long way. My 2017 LG OLED TV has no burn-in and it's my living room TV now… meaning my 2- and 4-year-olds use it and pay no attention to static content or turning it off.
Yeah, I've been running my B7 in the living room for four years now as my HTPC.

The new auto-power-off mode, combined with my own existing protections (dark-mode browsing, Windows dark mode, black desktop, black screensaver), has kept mine burn-in free so far (and should last me years more).
 
I swear sometimes this monitor's firmware was designed by idiots. The monitor sits off all night doing nothing, then in the morning it asks me to do a "panel refresh" right when I need to use it.
 
I am concerned about the sub-pixel sizes (look at how much dead space there is) and the lax limiter employed on the AW screen. Time will tell, but I'm not expecting this to survive without burn-in. The 3-year warranty gives peace of mind, but at what threshold will they accept burn-in claims? Existing dead-pixel policies from other manufacturers were almost not worth the storage space taken online to serve them. Dell did have excellent support for panels when I went through 3 of them 10+ years ago, but how about now?
 
I am concerned about the sub-pixel sizes (look at how much dead space there is) and the lax limiter employed on the AW screen. Time will tell, but I'm not expecting this to survive without burn-in. The 3-year warranty gives peace of mind, but at what threshold will they accept burn-in claims? Existing dead-pixel policies from other manufacturers were almost not worth the storage space taken online to serve them. Dell did have excellent support for panels when I went through 3 of them 10+ years ago, but how about now?
OLED is not for you, if you're that worried.
 
I swear sometimes this monitor's firmware was designed by idiots. The monitor sits off all night doing nothing, then in the morning it asks me to do a "panel refresh" right when I need to use it.
That's a feature... to give you personal time for yourself. Reflect in a zen-like state and fiddle with the blue button, which is now the red button.
 
OLED is not for you, if you're that worried.
The CX/C2 is fine for me and is what I'm aiming at now. Later models do not exhibit burn-in when used with some precautions, as mentioned earlier. I am just concerned about the sub-pixel structure and the small size of the OLED elements on this panel, plus the lax ABL, leading to some decent burn-in. I was actually leaning towards the AW for some time, until very recent user gripes (slow res switching, input lag, fuzzy image) surfaced, which really puts me off for something I want to keep at least 5 years. I'll try to see one in person before making a call on it, though.
 
WTF, the monitor spent over two hours doing a pixel refresh and shut down. Then I go to use my PC and press the monitor's power button, and it says it needs to start a pixel refresh. What clueless idiots designed this firmware.
 
I've been perplexed by the pixel refresh frequency as well. There have been a couple nights where I turned off the monitor (using the power button) and it went into a refresh and then it asked to do another when I turned it on in the morning.
 
You guys should not ever be turning this thing off manually with the power button, which is probably what's contributing to all this nonsense. I was never prompted to do a refresh after the initial time because I don't slam the power button every 15 minutes when stepping away.
 
The pixel refresh logic seems a bit broken if you're still getting the window where it asks you. It liked to ask me to do a pixel refresh on power-on as well. Since I told it to never ask again, it's not been an issue.

There is no benefit to having it ask you as far as I can tell.
 