OLED 4K 30" 60 Hz - Dell UP3017Q

Just my opinion, but 20ms is not significant for anything other than very high-level CS:GO or fighting games. For example, the good old Dell U2412M had 17ms of input lag and many of us gamed on those for years without complaint. Even the well-regarded Eizo FG2421 had 10ms of input lag plus 4ms of pixel response time to deal with. Outside of very specific game scenarios played by highly sensitive people, anything below 25ms is fine. It's at 30-50ms+ that it starts to get really noticeable, and mildly irritating, even in average games.

All of this assumes that the 20ms ballpark is correct, of course.

I agree. My PG278Q has a response time of 1ms and input lag well under 10ms, and I can't really tell much of a difference compared to my Samsung KS8000, which has much higher response time and input lag. In short, I enjoy playing games on both of them, but while I do play games that require accurate timing (e.g. Dark Souls), I don't play competitive FPS or fighting games. While I would prefer input lag as low as possible as well as minimal response times, TVs generally tend to be worse about this, and it seems OLED hasn't quite caught up with LCDs in the input lag department.

Ultimately I would like more effort spent on backlight strobing solutions, as those do a lot more for motion clarity than shaving a few ms off response time.
 
The 2017 LG OLED basically has the same input lag as the Dell, if the 20ms ballpark figure is correct. The LG doesn't strobe, but it does have 1080p @ 120Hz... would be interesting to hear from someone who has both.
 
The pricing is very similar to the first 4K monitors that used 31.5" Sharp IGZO panels.
 
I don't think so. PWM is just easier to implement.

LG WOLED near-black issues seem to be more about greyscale mapping, and I recall reading that the issues disappear on the Panasonic (IIRC) TV using the same LG panel.

Also, near black doesn't get the same kind of scrutiny on a phone as it does from the AV community on a TV.

I see no basis for why PWM would be easier in OLEDs; LG's WOLEDs are incapable of it altogether.

Look, I haven't done work on OLEDs, but from my previous experience characterizing other optoelectronic components, I can tell you that despite careful manufacturing, there are differences between similar units. Especially at low drive currents, there can be a significant difference in light output power. Those differences tend to become relatively smaller as you increase the drive current.

An OLED is an LED and, contrary to what Vega stated, they behave in kind. Likewise, all those millions of pixels (or subpixels, if you want to be pedantic) in a display are not identical; their light output varies. At low currents those differences are relatively larger, and on top of that, it's actually very difficult to drive the pixels accurately at very low currents. Now, suppose that instead of driving them at low currents, you drive them at higher currents and use PWM to regulate the brightness. This way, you get a smaller variation in the light output of the pixels and a smaller error in the drive current through them. Those tiny pixels are still different, but in the larger group you stare at on a display, things tend to even out, and if the manufacturing process is mature enough, the whole screen may appear uniform.

An additional benefit of using PWM is that each pixel has a similar emission spectrum at all brightness settings. Thus, you only need to calibrate the screen at one setting.
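
To put rough numbers on that argument, here is a minimal sketch (not a real OLED model) which assumes each subpixel has a small fixed current-setting error, so the relative brightness error is large at low drive currents and shrinks when every level is produced at high current via PWM. All figures are invented for illustration.

Code:
import random

random.seed(0)

FULL_CURRENT_UA = 100.0    # hypothetical full drive current per subpixel (uA)
CURRENT_ERROR_UA = 0.5     # hypothetical fixed per-subpixel current error (uA)
TARGET_LEVEL = 0.02        # we want 2% of full brightness

def relative_spread(samples):
    mean = sum(samples) / len(samples)
    return (max(samples) - min(samples)) / mean

errors = [random.uniform(-CURRENT_ERROR_UA, CURRENT_ERROR_UA) for _ in range(10000)]

# Analog dimming: drive each subpixel at 2% current; the error stays the same in uA.
analog = [TARGET_LEVEL * FULL_CURRENT_UA + err for err in errors]

# PWM dimming: drive each subpixel at full current with a 2% duty cycle.
pwm = [(FULL_CURRENT_UA + err) * TARGET_LEVEL for err in errors]

print(f"analog dimming spread: {relative_spread(analog):.1%}")   # large (~50%)
print(f"PWM dimming spread:    {relative_spread(pwm):.1%}")      # small (~1%)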
 
I see no basis for why PWM would be easier in OLEDs; LG's WOLEDs are incapable of it altogether.

Really? Because the rest of your post helps make the case that modulating different light levels is harder without PWM.

PWM is easier because you don't have to bias voltage circuits at different levels. It is simply ON or OFF. ON or OFF is something you have to do anyway.

PWM is by far the simplest form of brightness control.

Brightness control without PWM may be that much more difficult for Samsung because it uses R, G, B OLED materials, and each of these materials may behave differently. Samsung can bias each color differently to balance them and just use PWM for brightness control, but it would be a lot more complex to manage brightness while biasing different curves for each color subpixel. Not impossible, but harder than it is for LG, which has the same material at each subpixel.
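
For illustration only, here is a hypothetical sketch of that difference: with analog dimming, the drive value for a subpixel depends on both its color and the brightness setting, because the R/G/B materials respond differently; with PWM, one calibrated curve per color is reused and overall brightness becomes a single duty cycle. The response exponents below are invented stand-ins, not real material data.

Code:
# Made-up response exponents standing in for different R/G/B OLED materials.
RESPONSE = {"R": 2.1, "G": 1.9, "B": 2.3}

def drive_analog(color, level, brightness):
    """Analog dimming: drive value depends on the color AND the brightness setting."""
    return (level * brightness) ** (1.0 / RESPONSE[color])

def drive_pwm(color, level, brightness):
    """PWM dimming: one per-color curve, brightness becomes a global duty cycle."""
    drive = level ** (1.0 / RESPONSE[color])   # calibrated once per color
    duty = brightness                          # applied identically to all colors
    return drive, duty

for color in "RGB":
    print(color, round(drive_analog(color, 0.5, 0.3), 3), drive_pwm(color, 0.5, 0.3))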
 
Really? Because the rest of your post helps make the case that modulating different light levels is harder without PWM.

I meant that PWM is not easier to implement, despite what you claimed and still claim.

PWM is easier because you don't have to bias voltage circuits at different levels. It is simply ON or OFF. ON or OFF is something you have to do anyway.

Maybe I misunderstood, but no, in OLED displays PWM only controls the overall white luminance. If you have, say, an 8-bit panel, you still have to bias every subpixel to one of 256 voltage levels, and that doesn't change whether you use PWM or not.
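
A minimal sketch of that point, under the assumed model that a global PWM duty cycle scales the whole panel's luminance while each subpixel is still driven at one of 256 levels (gamma ignored, figures made up):

Code:
MAX_NITS = 300.0   # hypothetical full-white luminance of the panel

def subpixel_luminance(level_8bit, pwm_duty):
    """level_8bit: per-subpixel drive level 0-255; pwm_duty: global duty cycle 0.0-1.0."""
    assert 0 <= level_8bit <= 255 and 0.0 <= pwm_duty <= 1.0
    return MAX_NITS * (level_8bit / 255.0) * pwm_duty

# Dimming the panel changes the duty cycle, not the 256 per-subpixel levels.
print(subpixel_luminance(128, 1.0))   # ~150 nits at full duty
print(subpixel_luminance(128, 0.5))   # ~75 nits at half duty, same 8-bit level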
 
Maybe I misunderstood, but no, in OLED displays PWM only controls the overall white luminance. If you have, say, an 8-bit panel, you still have to bias every subpixel to one of 256 voltage levels, and that doesn't change whether you use PWM or not.

That makes no sense, since white luminance is just a combination of the individual RGB subpixels. You are really demonstrating that you have no idea what you are talking about.
 
I see no basis for why PWM would be easier in OLEDs; LG's WOLEDs are incapable of it altogether.

Look, I haven't done work on OLEDs, but from my previous experience characterizing other optoelectronic components, I can tell you that despite careful manufacturing, there are differences between similar units. Especially at low drive currents, there can be a significant difference in light output power. Those differences tend to become relatively smaller as you increase the drive current.

An OLED is an LED and, contrary to what Vega stated, they behave in kind. Likewise, all those millions of pixels (or subpixels, if you want to be pedantic) in a display are not identical; their light output varies.
OLEDs are very different from LEDs. LEDs require a backlight, OLEDs do not. Light output on OLEDs is completely different from LEDs.
 
Yep. LEDs on TVs are passive and require a backlight and OLEDs are emissive and don't require a backlight.
You really don't know what you're talking about, do you? LED and OLED are both emissive. OLED is a kind of LED. For "LED TVs", LEDs are the backlight.

If you mean that LED LCDs are very different from OLED displays, then you are right. But the way you've phrased it, it just came out as nonsense.
 
That makes no sense, since white luminance is just a combination of the individual RGB subpixels. You are really demonstrating that you have no idea what you are talking about.

Eh, care to elaborate? I'm not a native English speaker; maybe I haven't grasped the meaning of your every word, or managed to express myself clearly. Of course luminance is a combination of the individual RGB subpixels; have I ever stated otherwise?
 
In OLED displays, every subpixel (red, green, blue, and white, if white is used) emits its light straight from the OLED subpixel itself. Each subpixel's luminance level is kept constant (no refreshing required) until that subpixel's luminance changes.
Because there is no backlight, no PWM control of a backlight is needed (in fact, a backlight cannot be used at all - if OLED displays used a backlight, 100% black wouldn't be possible).
 
Yep. LEDs on TVs are passive and require a backlight and OLEDs are emissive and don't require a backlight.
In an LED LCD monitor, the LEDs ARE the backlight; the LCD is the part of the monitor that requires the backlight. Backlit panels work by selectively blocking light to display different colours, which means that 100% black is impossible, because the liquid crystal cannot block 100% of the light.
 
Awwwww doods, all I wanna know is how is the screen for gaming!? Is OLED the holy grail of high refresh rate, no lag, instant response, worthy to be the successor of the FW900 CRT? Is this the tech that SED displays promised but got killed off in patent, licensing & probably tooling wars amongst Asian display manufacturers?
 
Well, at 4K they are all still limited to 60 Hz. To me, 60 Hz doesn't cut it for gaming. All display technology is limited by the slow roll-out of connectivity standards.

Input lag is dependent on the electronics package, not the display type itself. OLED has the fastest pixels of any display type, less than 0.1ms, which is quite a bit faster than the FW900's phosphor glow decay.

Once we get DisplayPort 1.4 and/or HDMI 2.1 into OLED displays, there will never be a reason to buy an LCD again.
 
Is OLED the holy grail of high refresh rate, no lag, instant response, worthy to be the successor of the FW900 CRT?
OLED has the potential to be better than a CRT, but won't be until someone builds a gaming-focused display which supports high refresh rates and scans the image like a CRT to eliminate motion blur.
You'll never get a digital display which is as low-latency as a CRT. <5ms is possible - we have LCD monitors which achieve that today - but not 0ms like a CRT.

At the same time, G-Sync/FreeSync are very popular, and those trade motion resolution for motion smoothness.
If 2018 OLEDs support 120Hz inputs and variable refresh rates via HDMI 2.1, they have the potential to be the best of that type of display.
That said, with OLED TVs starting at 55" in size, the pixel density is pretty low compared to a monitor.
 
Many thanks. IMO, the best gaming screen at the moment is a 1ms TN FreeSync / G-Sync panel. At this time, the less costly the better!
 
At the same time, G-Sync/FreeSync are very popular, and those trade motion resolution for motion smoothness.
If 2018 OLEDs support 120Hz inputs and variable refresh rates via HDMI 2.1, they have the potential to be the best of that type of display.
That said, with OLED TVs starting at 55" in size, the pixel density is pretty low compared to a monitor.

WTF are you talking about? G-Sync and FreeSync have no tradeoffs. There's no trading "motion resolution" for "motion smoothness." They simply eliminate the need for traditional vertical sync. There's no downside to variable refresh. You're not getting less motion resolution with it.
 
WTF are you talking about? G-Sync and FreeSync have no tradeoffs. There's no trading "motion resolution" for "motion smoothness." They simply eliminate the need for traditional vertical sync. There's no downside to variable refresh. You're not getting less motion resolution with it.
Variable refresh rate displays need to be flicker-free for it to work correctly.
Since you can't strobe the image to improve motion resolution, you're stuck with sample-and-hold levels of motion blur.
Therefore you're trading motion resolution for motion smoothness.
 
Actually that is not true. NVIDIA are currently working on ULMB operating in conjunction with G-Sync. Over at the Blur Busters forum, a way was found to get them working together on certain monitors. I even tried it myself. It was strobing the backlight at G-Sync's variable refresh rate, but it was buggy. It would insert black frames and flicker at inappropriate times. No one expects a bug to work correctly, though.

I cannot wait until NVIDIA reveals the real deal.
 
Actually that is not true. NVIDIA are currently working on ULMB operating in conjunction with G-Sync. Over at the Blur Busters forum, a way was found to get them working together on certain monitors. I even tried it myself. It was strobing the backlight at G-Sync's variable refresh rate, but it was buggy. It would insert black frames and flicker at inappropriate times. No one expects a bug to work correctly, though.

I cannot wait until NVIDIA reveals the real deal.
A bug that enables the two modes simultaneously does not mean that NVIDIA are working on such a thing.
Currently, at least, this is not an option that is actually provided to users.

Think about it logically though: you have complained about 60Hz flicker on this Dell OLED yourself.
One of the benefits to VRR is that you can run games at low framerates in the 30-60 FPS range with far less judder and input lag.
How do you think flicker is going to be at 30-60Hz? NVIDIA already restrict ULMB to a minimum of 85Hz.

That is not to say that I don't think it can work.
It might be possible to have a variable strobe duration which transitions to sample & hold below a lower threshold for example.
By doing this it might be possible to minimize the varying levels of flicker that will be seen when the framerate changes.
I hope it is something that NVIDIA are working on, but I'm not convinced it is anything more than a bug right now.
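
For what it's worth, here is a rough sketch of what such a policy might look like. Only NVIDIA's 85Hz ULMB floor comes from the discussion above; the thresholds, pulse lengths and behaviour are made up for illustration.

Code:
def strobe_plan(frame_rate_hz, min_strobe_hz=85.0, base_pulse_ms=1.0):
    """Return (mode, pulse_ms) for a hypothetical VRR + strobing controller."""
    frame_ms = 1000.0 / frame_rate_hz
    if frame_rate_hz >= min_strobe_hz:
        # High frame rate: short single strobe per frame for CRT-like clarity.
        return "strobe", base_pulse_ms
    if frame_rate_hz >= 60.0:
        # Mid range: lengthen the pulse so flicker stays tolerable.
        return "strobe", base_pulse_ms * (min_strobe_hz / frame_rate_hz)
    # Low frame rate: give up on strobing and fall back to sample & hold.
    return "sample-and-hold", frame_ms

for fps in (144, 100, 75, 45):
    print(fps, strobe_plan(fps))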

Now I am not bothered by flicker at all, even at low framerates - so long as it is a single strobe per frame.
I really want to be able to play old arcade games which run at <60Hz refresh rates with a single strobe mode and none of the V-Sync latency. (then again, I'd have to get them working properly with G-Sync at all first)
But that's not true for most people. 60Hz flicker is already unacceptable to them.
 
A bug that enables the two modes simultaneously does not mean that NVIDIA are working on such a thing.
Currently, at least, this is not an option that is actually provided to users.

Think about it logically though: you have complained about 60Hz flicker on this Dell OLED yourself.
One of the benefits to VRR is that you can run games at low framerates in the 30-60 FPS range with far less judder and input lag.
How do you think flicker is going to be at 30-60Hz? NVIDIA already restrict ULMB to a minimum of 85Hz.

That is not to say that I don't think it can work.
It might be possible to have a variable strobe duration which transitions to sample & hold below a lower threshold for example.
By doing this it might be possible to minimize the varying levels of flicker that will be seen when the framerate changes.
I hope it is something that NVIDIA are working on, but I'm not convinced it is anything more than a bug right now.

Now I am not bothered by flicker at all, even at low framerates - so long as it is a single strobe per frame.
I really want to be able to play old arcade games which run at <60Hz refresh rates with a single strobe mode and none of the V-Sync latency. (then again, I'd have to get them working properly with G-Sync at all first)
But that's not true for most people. 60Hz flicker is already unacceptable to them.

What is this 30-60fps range you keep talking about? Who runs at that crap range? When we talk about G-Sync combined with strobing, we are talking at least 120+ fps, which many of us do with Titan Xps.
 
What is this 30-60fps range you keep talking about? Who runs at that crap range? When we talk about G-Sync combined with strobing, we are talking at least 120+ fps, which many of us do with Titan Xps.
Oh how stupid of me. I forgot that everyone here has spent $2400 on video cards, only plays games with SLI support, is blind to microstutter, never runs into CPU bottlenecks, and never plays games with locked framerates.
 
Oh how stupid of me. I forgot that everyone here has spent $2400 on video cards, only plays games with SLI support, is blind to microstutter, never runs into CPU bottlenecks, and never plays games with locked framerates.

Dude, I think you're lost... musta taken a wrong turn back there at AMD Ave. Yes, if you're looking at a $2,000 display like the upcoming FALD 4K or, in this case, a $3,000 OLED display, please take your poverty GPUs elsewhere. Nobody looking at displays this high-end is worrying about 35fps in a game. Gaaahhh, there is always one of you guys lol.
 
Dude, I think you're lost... musta taken a wrong turn back there at AMD Ave. Yes, if you're looking at a $2,000 display like the upcoming FALD 4K or, in this case, a $3,000 OLED display, please take your poverty GPUs elsewhere. Nobody looking at displays this high-end is worrying about 35fps in a game. Gaaahhh, there is always one of you guys lol.
Do you even game at 4K?
Here's a selection of games from the last year, benched using GTX 1080 SLI.
An extra 25% performance (Titan Xp) is not going to suddenly bring framerates anywhere near 120 FPS. Most of them will still be well below 60.
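
Just to put numbers on the "extra 25%" point, with made-up example framerates rather than the actual benchmark results:

Code:
# Hypothetical 4K SLI framerates; a 25% faster GPU doesn't get close to 120 FPS.
for base_fps in (35, 45, 55):
    print(f"{base_fps} FPS x 1.25 = {base_fps * 1.25:.0f} FPS")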
 
So this monitor doubles the refresh rate to reduce burn-in...
If going from 60Hz to 120Hz reduces burn-in, then either:
a) at 60Hz the pixel is lit for the same time in each pulse, but twice as bright as at 120Hz
b) at 60Hz the pixel is lit at the same brightness, but for twice as long as at 120Hz
Without proper lab measurements it is impossible to say which is true.

If we assume the drive logic is simple, these options become most likely:

a1) scan time is the same 8.3ms in both cases, and pixel brightness is controlled with voltage
a2) (!least likely!) scan time is twice as long at 60Hz, but brightness is still controlled with voltage (/!least likely!)
b1) scan time is the same 8.3ms in both cases, but the display can control the brightness of each line by controlling how long it is displayed (I assume something like what Later suggests as 'PWM')
b2) scan time is twice as long at 60Hz, at 16.6ms, and brightness is mostly controlled by simply displaying each line for twice as long - not via dedicated control logic but simply by waiting twice as long for new line data to arrive <- simplified design

Scan time can be checked by how much e.g. windows bend when moved around (Aero enabled, or Win8/10) in the 60Hz and 120Hz modes. If they bend the same amount, the scan time is the same 8.3ms.

To differentiate between a1) and b1) we would need to check a single horizontal line on an oscilloscope (I assume the display drives a whole line at once).
If it is option a1), then it is genuinely a burn-in-reducing feature, since a higher drive level shortens the life of any light-emitting diode, organic or not.
If it is option b1) or b2), then it is mostly a flicker-reducing feature that has very little to do with burn-in, and the manufacturer mostly lied in this regard.
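
A back-of-the-envelope sketch of the a) vs b) distinction, assuming a hypothetical 100-nit average luminance target (only the refresh rates come from the reasoning above; the other numbers are made up):

Code:
# Hypothetical comparison of the two drive strategies for the same average luminance.
TARGET_AVG_NITS = 100.0

def option_a_peak_nits(refresh_hz, pulse_ms=2.0):
    """Option a: pulse length fixed, so the drive level has to carry the load."""
    frame_ms = 1000.0 / refresh_hz
    return TARGET_AVG_NITS * frame_ms / pulse_ms

def option_b_on_time_ms(refresh_hz, peak_nits=400.0):
    """Option b: drive level fixed, so the on-time per frame carries the load."""
    frame_ms = 1000.0 / refresh_hz
    return TARGET_AVG_NITS * frame_ms / peak_nits

# Option a: 60Hz needs twice the peak drive of 120Hz -> more pixel wear.
print(option_a_peak_nits(60), option_a_peak_nits(120))    # ~833 vs ~417 nits
# Option b: 60Hz just holds each pixel lit twice as long -> same drive level.
print(option_b_on_time_ms(60), option_b_on_time_ms(120))  # ~4.2 vs ~2.1 ms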

Of course, due to heat distribution it would slightly reduce burn-in even in the b) options, but the difference would be very small and insignificant, and certainly not worth calling burn-in prevention - in which case the manufacturer would just be using the term for marketing purposes. It is better to reassure the customer that there are all sorts of preventive measures in place, even if it is just flicker reduction, and at the same time not mention that it is flicker reduction, so as not to make the customer think about flicker =0

After carefully considering each option, I have no idea which one the manufacturer would use, as each has its advantages and disadvantages XD. We really do need lab-quality measurements with an oscilloscope, and preferably some ultra-high-speed camera footage, to get a fuller picture of how this monitor works.

I would, however, not believe a single bit of what manufacturers say. It is not like the military, where if you have a spec it has to be true. Manufacturers can say whatever they want as long as they include the magic lawsuit-preventing sentences stating that specs in manuals and marketing material can differ from the actual product - which they always do...

In any case, it seems like a great monitor ^_^
Now please make it 120Hz with G-Sync, HDR and point filtering of 1080p, put a price tag of no more than $999 on it, and I'll run to the store... For the time being it isn't worth the money IMHO, at least for games, as it doesn't provide anything better for gaming than e.g. my plasma.

BTW, HDR is not only ridiculous brightness but also wide color space support, and without proper HDR support using it might be completely impossible, except for some movie player hacks on PC. No HDR on consoles and such... =(
 
I never said you were making it up, only that people enabling the two via what appeared to be a bug was not confirmation of NVIDIA working on such a thing.
I had no idea that they had even demonstrated it. That's very exciting - though I am a little concerned that they were double-strobing at 70 FPS.
 
[Attached: two close-up photos of the panel's subpixel structure]
 
I wonder if that means it's a Samsung Panel.

RGB stripe is the best choice for things like text. Which is kind of important on a desktop monitor.

I have this same monitor, and my admittedly blurry picture of the pixel structure looks completely different from the previously posted picture. Note that the red subpixels seem to alternate with blue. I can notice color fringing around fine black text and the pixel structure is unusually noticeable along diagonals. I think I have a "Faux K" PenTile version.
 

[Attachment: IMG_0990.JPG - blurry close-up photo of the pixel structure]
I have this same monitor, and my admittedly blurry picture of the pixel structure looks completely different from the previously posted picture. Note that the red subpixels seem to alternate with blue. I can notice color fringing around fine black text and the pixel structure is unusually noticeable along diagonals. I think I have a "Faux K" PenTile version.

It's too overexposed for me to tell anything.

I processed Vega's image and there does seem to be something a little odd with the red pixels. They seem to be grouped in pairs a little closer to each other, and the pair stagger changes in each column. It's kind of strange but not PenTile (unless those paired reds are really one big pixel).

[Attachment: Vega_proc.jpg - processed version of Vega's pixel-structure photo]
 
It's too overexposed for me to tell anything.

I processed Vega's image and there does seem to be something a little odd with the red pixels. They seem to be grouped in pairs a little closer to each other, and the pair stagger changes in each column. It's kind of strange but not PenTile (unless those paired reds are really one big pixel).

[Attachment: Vega_proc.jpg - processed version of Vega's pixel-structure photo]

I'm going to see if I can find something to magnify my screen to get a proper picture, as I no longer have access to a decent camera. But even with the overexposed photo, the vertical axis seems to alternate color consistently - there is no column that is completely red, for example, and I doubt the overexposure would cause that problem.

This is mainly to corroborate what I can see with my eyes. There is no way my screen is a traditional RGB stripe based upon what I see in person.
 
The only way this OLED is not RGB stripe is if your Dell OLED has a different panel than my Dell OLED.
 
I'm going to see if I can find something to magnify my screen to get a proper picture, as I no longer have access to a decent camera. But even with the overexposed photo, the vertical axis seems to alternate color consistently - there is no column that is completely red, for example, and I doubt the overexposure would cause that problem.

This is mainly to corroborate what I can see with my eyes. There is no way my screen is a traditional RGB stripe based upon what I see in person.

Really, the focus is bad and it's overexposed - you really can't tell anything. But...

The only way this OLED is not RGB stripe is if your Dell OLED has a different panel than my Dell OLED.

But there is still something odd in your shot in the way those red pixels are grouped.

It would be interesting to see a shot with a 1-pixel checkerboard background to get another view of what is going on:
http://carltonbale.com/pixel_by_pixel_checkerboard/

Here is my LCD displaying the above checkerboard. A very clear, normal RGB stripe.
[Attachment: IMG_Checker.jpg - close-up of the LCD displaying the checkerboard]
 
The only way this OLED is not RGB stripe is if your Dell OLED has a different panel than my Dell OLED.

Yes, this is what I should've said outright. I suspect they have more than one panel being used.
 
Yes, this is what I should've said outright. I suspect they have more than one panel being used.

Can you take a clearer picture at 8x zoom? That is the only way to be certain, short of taking the back housing off.
 