OLED 4K 30" 60 Hz - Dell UP3017Q

This Dell has strobing pixel refresh and pixel shift, two anti-image retention features. Zero image retention experienced so far.

Well I would hope you aren't having image retention (burn in) after a couple of days.

Though I don't see strobing refresh as burn-in prevention. For the same perceptual brightness you end up doing similar damage: if you run at a 50% duty cycle, you need to run at twice the brightness, so it's a wash.

But strobing should eliminate the sample-and-hold effect, for those who have a fetish about that.
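For what it's worth, the "it's a wash" arithmetic checks out under the simplest model, where perceived brightness tracks time-averaged luminance and wear scales linearly with emitted light (a rough sketch; the nit figures are made up for illustration):

```python
def required_peak_brightness(target_nits: float, duty_cycle: float) -> float:
    """Peak brightness needed so the time-averaged output matches the target.

    Assumes perception tracks average luminance, which holds once the
    flicker is above the fusion threshold.
    """
    return target_nits / duty_cycle


def emissive_dose(nits: float, duty_cycle: float) -> float:
    """First-order wear proxy: brightness integrated over on-time."""
    return nits * duty_cycle


# A 50% duty cycle needs twice the peak to look equally bright...
peak = required_peak_brightness(150, 0.5)  # 300 nits
# ...and the integrated dose matches 150 nits driven constantly.
assert emissive_dose(peak, 0.5) == emissive_dose(150, 1.0)
```

Whether the damage really scales linearly with brightness is the open question later in the thread; this only shows the linear case is a wash.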
 
RGB stripe is the best choice for things like text.

Only if you are using an old and outdated OS, like those from Apple.
Under Windows, ClearType settings have let you optimize text rendering for other subpixel layouts since at least Windows XP, AFAIR.
 
Is there any blurring while smooth-scrolling this forum?

At 60 Hz it's quite clear, since it single-strobes. With double strobing there is a clear second trailing image, just due to the nature of double-strobing a single frame. Still looks WAY better than any 60 Hz LCD. Double-strobed, it has around the same MPRT as my 144 Hz G-Sync TN panel!
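The MPRT comparison is plausible with back-of-envelope numbers: persistence is roughly frame time times the lit fraction, and tracking blur is tracking speed times persistence (a sketch; the 25% duty cycle and 960 px/s figures are assumptions, not measured values for this panel):

```python
def persistence_ms(refresh_hz: float, duty_cycle: float) -> float:
    """Approximate MPRT: how long each frame stays lit, in milliseconds."""
    return 1000.0 / refresh_hz * duty_cycle


def blur_px(track_speed_px_s: float, persistence: float) -> float:
    """Perceived smear while eye-tracking, in pixels."""
    return track_speed_px_s * persistence / 1000.0


hold = persistence_ms(60, 1.0)     # sample-and-hold 60 Hz: ~16.7 ms
strobe = persistence_ms(60, 0.25)  # 60 Hz strobe at 25% duty: ~4.2 ms
# At 960 px/s tracking speed the hold case smears ~16 px, the strobed one
# ~4 px, which is below 144 Hz sample-and-hold (1000/144 ≈ 6.9 ms).
```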

Well I would hope you aren't having image retention (burn in) after a couple of days.

Though I don't see strobing refresh as burn-in prevention. For the same perceptual brightness you end up doing similar damage: if you run at a 50% duty cycle, you need to run at twice the brightness, so it's a wash.

But strobing should eliminate the sample-and-hold effect, for those who have a fetish about that.

I read an article about Samsung using PWM on their OLEDs. It said it was for preventing image retention; there would be no other reason for an OLED to use PWM (LG OLEDs do not use it). Completely turning off the OLED pixels is what helps prevent image retention. It works better to drive an OLED pixel twice as bright and then turn it completely off than to keep the pixel on at all times at half the brightness.
 
I read an article about Samsung using PWM on their OLEDs. It said it was for preventing image retention; there would be no other reason for an OLED to use PWM (LG OLEDs do not use it). Completely turning off the OLED pixels is what helps prevent image retention. It works better to drive an OLED pixel twice as bright and then turn it completely off than to keep the pixel on at all times at half the brightness.

I don't think there is any practical basis for that claim.

Samsung OLED phones get burn-in from hell in a couple of months.

Also, there is a valid reason for strobing: blur reduction.
 
Only if you are using an old and outdated OS, like those from Apple.
Under Windows, ClearType settings have let you optimize text rendering for other subpixel layouts since at least Windows XP, AFAIR.

Cleartype doesn't make Pentile good for text, and it doesn't improve the fill factor of RGBW.
 
It would be cool if you could tweak the duration of the single-strobe to reduce flickering (at the cost of slightly more blurring).

Did you ever try changing refresh rates over HDMI?

The monitor is locked to 60 Hz as the maximum without frame skipping over all three inputs.

I don't think there is any practical basis for that claim.

Samsung OLED phones get burn-in from hell in a couple of months.

Sure there is. Burn in could have happened even quicker if it didn't use PWM.
 
Sure there is. Burn in could have happened even quicker if it didn't use PWM.

You could make the same kind of claim if you sprinkled it with chicken entrails as burn in protection:

"Sure it still gets massive burn in, but think how much worse it would be if we didn't use chicken entrails". :D

Generally, in most degradation processes, running at a higher percentage of maximum is more damaging than spending more time at a lower percentage.
 
You could make the same kind of claim if you sprinkled it with chicken entrails as burn in protection:

"Sure it still gets massive burn in, but think how much worse it would be if we didn't use chicken entrails". :D

Generally, in most degradation processes, running at a higher percentage of maximum is more damaging than spending more time at a lower percentage.

Not the same thing at all. Samsung went out of their way to implement PWM on their OLEDs at a time when PWM-free dimming is becoming ubiquitous. Non-refresh-rate-synced 240 Hz PWM, as used on Samsung OLEDs, doesn't have any practical motion clarity benefit. Not only that, it would be inane to flicker the OLED with PWM at 100% brightness, which it does. If PWM were used to control brightness, there would be no PWM at 100% brightness, as on virtually all LCDs. So there is another reason: reduction of image retention.

Also, this Dell OLED is a professional display. It does not need good motion clarity, yet they designed it to strobe at 60 and 120 Hz. Once again, to prevent image retention. Turning an OLED pixel off for 2/3 of the time and then on bright for 1/3 of the time, versus keeping the pixel on constantly at 1/3 of the brightness, can completely change the image retention characteristics and the lifespan of the pixels. Those are characteristics only the OLED manufacturers know.

Simply saying that flashing an OLED pixel at three times the brightness but keeping it off 2/3 of the time would have identical image retention and lifespan characteristics to a constantly on pixel at 1/3 of the brightness is simply not true. Unless we see some data sheets or a manufacturer talks about it, it's all speculation.
 
Simply saying that flashing an OLED pixel at three times the brightness but keeping it off 2/3 of the time would have identical image retention and lifespan characteristics to a constantly on pixel at 1/3 of the brightness is simply not true. Unless we see some data sheets or a manufacturer talks about it, it's all speculation.

Generally speaking it is essentially true.

Lifespan ≈ brightness level × on-time. It is generally irrelevant how you slice it up, until you get to extremes, like very high brightness causing disproportionate damage.

I'd like to see some real data on the supposed mechanism that reverses the typical behavior in this case.
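Whether the slicing matters comes down to the exponent in the damage law. A toy model makes the disagreement concrete (the acceleration exponent here is an assumption for illustration, not a measured OLED figure):

```python
def wear(drive_frac: float, on_time: float, accel: float = 1.0) -> float:
    """Toy degradation model: damage = (drive level ** accel) * on-time.

    accel = 1.0 reproduces "lifespan = brightness level x on-time";
    accel > 1.0 means high drive levels do disproportionate damage.
    """
    return drive_frac ** accel * on_time


# Same average output: 1/3 drive always on vs. full drive 1/3 of the time.
dc = wear(1 / 3, 1.0)
pwm = wear(1.0, 1 / 3)
assert abs(dc - pwm) < 1e-12  # linear law: a wash

# With superlinear damage (accel = 1.5, hypothetical), flashing ages the
# pixel faster than constant low drive, not slower:
assert wear(1.0, 1 / 3, accel=1.5) > wear(1 / 3, 1.0, accel=1.5)
```

So for PWM to help with retention, the damage law would have to be sublinear in drive level, which is exactly the reversal of typical behavior that needs real data.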
 
Not the same thing at all. Samsung went out of their way to implement PWM on their OLEDs at a time when PWM-free dimming is becoming ubiquitous. Non-refresh-rate-synced 240 Hz PWM, as used on Samsung OLEDs, doesn't have any practical motion clarity benefit. Not only that, it would be inane to flicker the OLED with PWM at 100% brightness, which it does. If PWM were used to control brightness, there would be no PWM at 100% brightness, as on virtually all LCDs. So there is another reason: reduction of image retention.

Are you positive Samsung OLEDs use PWM at 100% brightness? Maybe you could test this monitor at different brightness settings. I'm quite sure PWM is used to regulate brightness: it's very difficult to drive OLEDs accurately at low currents, especially over time, which can make the brightness of the OLED pixels non-uniform. By using PWM, OLEDs can be driven at higher currents, which improves grayscale and image uniformity.

Also, the emission spectrum of an OLED shifts slightly with current, but I don't know whether this effect is significant.
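The uniformity argument boils down to: run the emitter at one well-characterized current and do the dimming in the time domain, so the average output is just peak times duty cycle. A minimal sketch of that idea (the 400-nit peak and 8-bit level mapping are hypothetical):

```python
def duty_for_level(level: int, bits: int = 8) -> float:
    """Map a digital brightness level to a PWM duty cycle."""
    return level / (2 ** bits - 1)


def pwm_average_nits(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged output with the drive current held constant."""
    return peak_nits * duty_cycle


# Roughly half brightness without ever changing the drive current,
# sidestepping the low-current accuracy problem described above:
assert pwm_average_nits(400, duty_for_level(255)) == 400
assert round(pwm_average_nits(400, duty_for_level(128)), 1) == 200.8
```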

Also, this Dell OLED is a professional display. It does not need good motion clarity. Yet they designed it to strobe at 60 and 120 Hz. Once again, to prevent image retention. Turning off an OLED pixel for 2/3rd's the time and then on bright for 1/3rd the time versus having a OLED pixel on 1/3rd the brightness constantly can completely change the image retention characteristics and the lifespan of the pixels. Those are characteristics only known to OLED manufacturers.

Simply saying that flashing an OLED pixel at three times the brightness but off 2/3rd's the time versus a constantly on OLED pixel at 1/3rd the brightness would have identical image retention and lifespan characteristics is simply not true. Unless we see some data sheets or a manufacturer talk about it, it's all speculation.

Yes, this is speculation on your part. I would say it's just the opposite: the voltage across any LED increases as the current increases, albeit slowly, so it's more efficient to drive an LED at low current than at high.

It seems the panel was intended to be driven at 120 Hz to prevent noticeable flicker, so it likely has two TCONs like current 5K monitors. Then why didn't they enable 120 Hz using two cables? Maybe this could still be implemented with a firmware update.
 
PWM and strobing induce eye strain and vision degradation over time, correct? I thought that was the whole reason we've seen a transition away from PWM on LCDs over the years. Now we're going back the other way with OLED? Is this simply the nature of the beast when dealing with OLED image retention?
 
Are you positive Samsung OLEDs use PWM at 100% brightness? Maybe you could test this monitor at different brightness settings. I'm quite sure PWM is used to regulate brightness: it's very difficult to drive OLEDs accurately at low currents, especially over time, which can make the brightness of the OLED pixels non-uniform. By using PWM, OLEDs can be driven at higher currents, which improves grayscale and image uniformity.

Also, the emission spectrum of an OLED shifts slightly with current, but I don't know whether this effect is significant.

Yes, this is speculation on your part. I would say it's just the opposite: the voltage across any LED increases as the current increases, albeit slowly, so it's more efficient to drive an LED at low current than at high.

It seems the panel was intended to be driven at 120 Hz to prevent noticeable flicker, so it likely has two TCONs like current 5K monitors. Then why didn't they enable 120 Hz using two cables? Maybe this could still be implemented with a firmware update.

Yes, reviews of laptops with Samsung OLED panels show PWM in use at 100% brightness:

https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Yoga-OLED-Convertible-Review.169458.0.html

There is ZERO reason to do that besides anti-image-retention measures. Pointing out speculation with more speculation is helpful how? LEDs and OLEDs respond very differently in many respects.

The entire electronics package on this monitor was redesigned after the 120 Hz version was shown at CES 2016, so talk of two cables, 120 Hz, etc. doesn't apply.
 
"100%" brightness is just a user control. It doesn't mean the panel couldn't get brighter with different firmware. So, there is nothing strange about PWM being active at 100%. It just means that the panel isn't actually running at 100%.
 
"100%" brightness is just a user control. It doesn't mean the panel couldn't get brighter with different firmware. So, there is nothing strange about PWM being active at 100%. It just means that the panel isn't actually running at 100%.

Yep, we need TFT Central to get hold of it and see what the PWM looks like at different brightness levels. The Lenovo shows >90% duty cycle at 100%. I am betting that Samsung just uses PWM for brightness control because it is easier.

[Attached image: pwm_oled.png]
 
Yes, reviews of laptops with Samsung OLED panels show PWM in use at 100% brightness:

https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Yoga-OLED-Convertible-Review.169458.0.html

Ok, but this is one display only. Anyway, I looked at the article, and the picture Snowdog posted of the screen flicker at 100% brightness shows a pulse width of almost a full cycle. I'm sure that if they had taken a similar measurement at a lower brightness setting, it would show a shorter pulse width, which means PWM is used for brightness regulation.

There is ZERO reason to do that besides anti-image-retention measures. Pointing out speculation with more speculation is helpful how? LEDs and OLEDs respond very differently in many respects.

No, PWM provides better grayscale and image uniformity. LG's WOLED TVs have had issues with near-black uniformity whereas, to my knowledge, Samsung's OLEDs have not, although forgoing PWM is not the only reason WOLED is inferior in this respect.

The entire electronics package on this monitor was redesigned after the 120 Hz version was shown at CES 2016, so talk of two cables, 120 Hz, etc. doesn't apply.

Yes, I'm just disappointed it wasn't redesigned in a way that would allow 120 Hz with two cables. The panel itself is clearly capable of that.
 
No, PWM provides better grayscale and image uniformity. LG's WOLED TVs have had issues with near-black uniformity whereas, to my knowledge, Samsung's OLEDs have not, although forgoing PWM is not the only reason WOLED is inferior in this respect.

I don't think so. PWM is just easier to implement.

LG's WOLED near-black issues seem to be more about grayscale mapping, and I recall reading that the issues disappear on a Panasonic (IIRC) TV using the same LG panel.

Also, near black doesn't get the same kind of scrutiny on a phone as it does from the AV community on a TV.
 
You're missing one point. Virtually all monitor manufacturers have phased out PWM on their LCDs. Is there one modern LCD that just came out that uses PWM dimming? So you think someone at Samsung would look at all of their top panels (OLED) and say, hey, what we need is some PWM on these things, for the hell of it? Or because it's cheaper? If PWM dimming were cheaper, why would they use DC dimming on their lowly $200 monitors and then select PWM for expensive OLEDs?

As far as I know, the 2017 LG OLEDs using DC brightness control don't have any appreciable issues with grayscale or uniformity; that was more of a 2015 issue. And PWM dimming isn't just on that one panel, it's on all Samsung OLEDs.
 
You're missing one point. Virtually all monitor manufacturers have phased out PWM on their LCDs. Is there one modern LCD that just came out that uses PWM dimming? So you think someone at Samsung would look at all of their top panels (OLED) and say, hey, what we need is some PWM on these things, for the hell of it? Or because it's cheaper? If PWM dimming were cheaper, why would they use DC dimming on their lowly $200 monitors and then select PWM for expensive OLEDs?

As far as I know, the 2017 LG OLEDs using DC brightness control don't have any appreciable issues with grayscale or uniformity; that was more of a 2015 issue. And PWM dimming isn't just on that one panel, it's on all Samsung OLEDs.

Samsung also uses PWM on all (or nearly all) of its LED products as well. Maybe Samsung just uses it because it's cheaper to use the same basic controller design across the board?
 
So, Vega? Is the Dell the BOTB? The MVP of monitors? I want to trade out my X34; I tire of the lack of 21:9 support for even new titles. YET, I know I would definitely miss G-Sync. I guess the only other option is to wait for the 4K G-Sync monitors on the horizon?
 
Samsung also uses PWM on all (or nearly all) of its LED products as well. Maybe Samsung just uses it because it's cheaper to use the same basic controller design across the board?

Well, even a Samsung monitor from three years ago was PWM-free all the way down to 31% brightness:

http://www.tftcentral.co.uk/reviews/samsung_u28d590d.htm#panel

So, Vega? Is the Dell the BOTB? The MVP of monitors? I want to trade out my X34; I tire of the lack of 21:9 support for even new titles. YET, I know I would definitely miss G-Sync. I guess the only other option is to wait for the 4K G-Sync monitors on the horizon?

Yes, G-Sync is easy to miss. I love me some G-Sync. But of course a 30" 4K OLED is like god mode: best image quality in the world, plus good motion clarity due to strobing. But it's still only 60 Hz, and there is some input lag. So... all personal preference, of course.
 
Well, even a Samsung monitor from three years ago was PWM-free all the way down to 31% brightness:

http://www.tftcentral.co.uk/reviews/samsung_u28d590d.htm#panel

PWM is getting rarer on LCDs.

But brightness control may be more difficult on OLED.

We will likely have to wait for a TFT Central review to get to the bottom of what is going on.

BUT, for people bothered by PWM, this one is probably a no-go, since it always seems to be active and at a fairly low frequency.

I expect it wouldn't bother me. Back in the CRT days 60 Hz drove me nuts, but 85 Hz was fine all day long, so I imagine 120 Hz would be fine for me and most people.
 
Well, strobing is much better than PWM; PWM isn't linked to the refresh rate the way strobing is. Anyone who likes ULMB would have no problem with this display.
 
Well, strobing is much better than PWM; PWM isn't linked to the refresh rate the way strobing is. Anyone who likes ULMB would have no problem with this display.

Don't you get double images, though, if you have 120 Hz strobing but 60 fps content?
 
Don't you get double images, though, if you have 120 Hz strobing but 60 fps content?

Yes. Single-strobe motion clarity is obviously superior, but I can live with the slight double image. It is still far better motion clarity than any sub-120 Hz sample-and-hold LCD.
 
Damn, if this monitor really were 120 Hz with 120 Hz strobing, my lord, it would've been so epic I would've instantly bought one too lol.
 
Yes. Single-strobe motion clarity is obviously superior, but I can live with the slight double image. It is still far better motion clarity than any sub-120 Hz sample-and-hold LCD.

Why would you get a double image for flashing the same thing twice in the same place?

I can see that with LCD you might get that because the LCD is in transition, and the backlight is flashing.

But with OLED flashing the same frame twice, doesn't seem like it should create a double image unless you are moving your head or the camera.
 
But with OLED flashing the same frame twice, doesn't seem like it should create a double image unless you are moving your head or the camera.

That's exactly the issue. We move our eyes to track objects, and when they're flashed twice, we see them twice. Same with a camera as you move it with the target.
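Back-of-envelope, the ghost offset is just tracking speed divided by strobe rate: between the two flashes of the same frame the eye keeps moving, so the second flash lands a fixed distance away on the retina (a sketch; the 960 px/s tracking speed is an arbitrary example):

```python
def ghost_offset_px(track_speed_px_s: float, strobe_hz: float) -> float:
    """Separation between the two retinal images when one frame is
    flashed twice while the eye tracks a moving object."""
    return track_speed_px_s / strobe_hz


# 60 fps content double-strobed at 120 Hz, tracked at 960 px/s:
assert ghost_offset_px(960, 120) == 8.0  # an 8-pixel double image
```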
 
That's exactly the issue. We move our eyes to track objects, and when they're flashed twice, we see them twice. Same with a camera as you move it with the target.

It really shouldn't be that much of an issue. Back when movie theaters used film, they would double- or triple-flash each frame to cut down on flicker. I don't remember ever seeing double images in a theater.
 
Not the same thing at all. 24 fps film has massive native blur that blends everything together. A very fast display like OLED is literally like taking super-crisp still images and stitching them together very rapidly. OLED pixels are so fast there is nothing to mask motion artifacts.
 
Why would you get a double image for flashing the same thing twice in the same place?
I can see that with LCD you might get that because the LCD is in transition, and the backlight is flashing.
But with OLED flashing the same frame twice, doesn't seem like it should create a double image unless you are moving your head or the camera.
The image stays in the same place on the display.
It has moved position on your retina if you are tracking a moving object.

Not the same thing at all. 24 fps film has massive native blur that blends everything together. A very fast display like OLED is literally like taking super-crisp still images and stitching them together very rapidly. OLED pixels are so fast there is nothing to mask motion artifacts.
While they do shoot movies with extra motion blur to help smooth things over, it's actually less than you might think.
It's just that, since movies are never seen with a native low-persistence 24Hz presentation any more, people are used to seeing a lot more blur than is actually there.
If you display a 24 FPS source on a CRT at 48Hz with black frame insertion for an effective 24Hz, motion is super smooth and very clear.
If you thought flicker was bad on your OLED at 60Hz though...
 
It really shouldn't be that much of an issue. Back when movie theaters used film, they would double- or triple-flash each frame to cut down on flicker. I don't remember ever seeing double images in a theater.
You get used to it, but at first it was extremely noticeable to me in games like Skyrim and Fallout 4. It took about half an hour for me to get used to the double-image effect.
 
Great thread, Vega, describing the OLED display. Sounds like it's outstanding for professional work.

Now, my burning question, can you game on it? Can it run Crysis?! Have you tried playing twitch shooters, BF1 or anything else.

Many thanks!
 
I'd imagine it's a really, really good monitor for twitch gaming if you set the strobe to 60 Hz, assuming that ~20 ms of input lag is acceptable.

How does it fare for reading strain? 120 Hz strobing isn't very noticeable, I hope?
 
That sounds like a lot of input lag, but I don't want to speculate. It would be really cool to hear from Vega whether he has been gaming on the monitor, to get his impressions.
 
Just my opinion, but 20 ms is not significant for anything other than very high-level CS:GO or fighting games. For example, the good old Dell U2412M had 17 ms of input lag, and many of us gamed on those for years without complaint. Even the well-regarded Eizo FG2421 had 10 ms of input lag plus 4 ms of pixel response time to deal with. Outside of very specific game scenarios played by highly sensitive people, anything below 25 ms is fine. It's at 30-50 ms and up that it starts to get really noticeable even for average gamers, and mildly irritating.

All of this assumes that the 20ms ballpark is correct, of course.
 