Dell UP3017Q - 30" 4K 120 Hz OLED

Anyone who buys a $5K OLED monitor will sell it and have a new display within two years. Tech moves that fast. Worrying about 14K-30K hour lifespans is a bit silly.

With the exception of my FW900, I can't remember the last time I kept a display longer than 2 years.

Based on my observations, high-end Dell displays tend to halve in price within 12-24 months of their release. Remember their 24" 4K and 27" 5K? Those prices fell like stones :eek: So now this sucker comes along and costs $5,000, and I think the same will happen, especially if LG makes a high-refresh OLED gaming display... what I am saying is, if you pay $5,000 for a display and it's worth $2,000 a year later, you're probably just gonna hold on to the damn thing forever... or in this case 5 years lol

But then again, what do I know... I'm selling my Dell 5K, which I thought I was gonna keep forever, to put towards this, and my Rift DK2 to put towards the CV1... sucks being a technocrackhead!
 
Lifespan numbers are also to 50% brightness, not to panel death. Pretty much all monitors are still entirely usable at 50% brightness in home/consumer settings. Less so in office settings because of the typical extremely bright overhead fluorescents.
 
Lifespan numbers are also to 50% brightness, not to panel death. Pretty much all monitors are still entirely usable at 50% brightness in home/consumer settings. Less so in office settings because of the typical extremely bright overhead fluorescents.

Monitors with a 50% dimmer backlight are usable, and the difference is hardly even noticeable (my NEC 2490 is 50% dimmer or more). Also, lifespan numbers are often exaggerated.

But regardless, OLEDs won't dim evenly.

I wouldn't buy one, because I like to buy good monitors and get as much out of them as possible, amortizing the cost over many years.

I bought my NEC 2490 in 2008, and it has been my only monitor since and I will keep using it till it dies. OLED won't work for me, since I probably would have burned in 3 of them in that time.
 
But regardless, OLEDs won't dim evenly.

That's just speculation. Uneven dimming is what kept Samsung's OLED technology out of the TV space, but it's also one of the major problems that LG solved to get their OLED TVs on the market.

I haven't yet seen a source for who is manufacturing this panel.
 
I want to make sure I'm understanding the potential 0.1 ms capability of OLED correctly (and what it means for this monitor).

This will not vanquish the need for Adaptive Sync or G-Sync, right? That involves frametime consistency between GPU and monitor. But it will do away with the need for things like ULMB? As I understand it, ULMB exists to compensate for the relatively high response times native to LED/IPS. But it seems you can't currently use ULMB with either sync at the same time.

Since the response time of OLED is almost non-existent, does that get rid of the need for ULMB?
 
I want to make sure I'm understanding the potential 0.1 ms capability of OLED correctly (and what it means for this monitor).

This will not vanquish the need for Adaptive Sync or G-Sync, right? That involves frametime consistency between GPU and monitor. But it will do away with the need for things like ULMB? As I understand it, ULMB exists to compensate for the relatively high response times native to LED/IPS. But it seems you can't currently use ULMB with either sync at the same time.

Since the response time of OLED is almost non-existent, does that get rid of the need for ULMB?

Not exactly. The sample-and-hold blur is still there, but that comes from your eyes and not the monitor. 120 Hz reduces it, provided that the game also runs at that speed, but it's still there. Honestly though, we are now talking about such a minimal amount of motion blur that you have to be pretty anal to care about it. And IF you are a person who keeps motion blur on in game settings, all arguments about monitor blurring are out the window anyway. ;)
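
For anyone who wants a feel for the numbers: the blur you perceive on a full-persistence (sample-and-hold) display is roughly your eye-tracking speed times how long each frame stays lit. Here's a back-of-the-envelope sketch of that (my own simplification; the 960 px/s pan speed is just an example, not anything from Dell's spec):

```python
def sample_and_hold_blur_px(tracking_speed_px_per_s: float, refresh_hz: float) -> float:
    """Rough perceived blur width on a full-persistence (sample-and-hold) display.

    Assumes each frame stays lit for the whole refresh interval and that your
    eye tracks the moving object smoothly. Strobing backlights (ULMB) shorten
    the lit time, which is why they reduce blur beyond what refresh rate alone does.
    """
    frame_persistence_s = 1.0 / refresh_hz
    return tracking_speed_px_per_s * frame_persistence_s

# An object panning at 960 px/s (a fairly quick scroll):
print(sample_and_hold_blur_px(960, 60))   # ~16 px of smear at 60 Hz
print(sample_and_hold_blur_px(960, 120))  # ~8 px at 120 Hz - halved, but not gone
```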
 
That's just speculation. Uneven dimming is what kept Samsung's OLED technology out of the TV space, but it's also one of the major problems that LG solved to get their OLED TVs on the market.

I haven't yet seen a source for who is manufacturing this panel.

It isn't just speculation. OLED burns in. LG OLED TVs would burn in if you tried to use them like a monitor.

If you only display video on this monitor and don't actually use it for a windows Desktop, then it would likely be ok.

But static images burn into OLED. There is no real cure for this. Except don't display static elements.
 
It isn't just speculation. OLED burns in. LG OLED TVs would burn in if you tried to use them like a monitor.

Well, if you ignore all the reports on the AVS forums of people using LG TVs to play video games for hours and not having their UIs burn in, sure, OLED burns in.

Again, we don't know who is manufacturing this panel, or anything really. Plasma burned in, too, until it just didn't anymore. Plasmas from the last generation are completely and utterly immune to burn-in (I've had a static Windows desktop up on my PN60F8500 for in excess of 12 hours at 3/4 cell light brightness, for weeks on end). It's ENTIRELY possible that if you don't do something idiotic like turn off the pixel shifting, there are no meaningful burn-in problems on this panel.

It's also possible that it burns in after 10 mins of a Windows desktop. We have to see. We don't know.
 
I've used my OLED TV for many hours at a time for desktop and games and never had any signs of burn-in. This Dell also has pixel shift technology, which the LG TV doesn't. Unless you abuse the display, the Dell OLED will be just fine.
 
It isn't just speculation. OLED burns in. LG OLED TVs would burn in if you tried to use them like a monitor.

If you only display video on this monitor and don't actually use it for a windows Desktop, then it would likely be ok.

But static images burn into OLED. There is no real cure for this. Except don't display static elements.

Well... I own 3 LG OLED TVs and play games on them with HUDs for hours, and no burn-in.
Also, let's use the right term. Burn-in doesn't really happen these days; the correct term we should be using is image retention (IR). I had these issues on my plasmas before going OLED, where they would hold a static image that had been up for a long time. IR goes away.
That said, I'm having no IR issues.

Also, it is clear that you won't be owning any new tech for a long time and are proud of it. Trolling with misinformation isn't helpful to anyone.
 
I've owned old IPS panels that were ruined by IR. Newer IPS panels don't have this problem.

This is a premium monitor costing several grand. Call me naive but why would Dell bring one of these out at this price for it to only last 5 minutes?

Personally I would not buy one of these as I do not want 4K and I don't want to empty my savings just to buy a monitor, plus I am happy with my Swift, but I am still very excited by this thing. When the technology starts to trickle down to the 27" range at 2560x1440 and comes with G-Sync I'll bite. Otherwise I'll enjoy looking at the reviews you savvy lads put up here when you buy one :)
 
Anyone who buys a $5K OLED monitor will sell it and have a new display within two years. Tech moves that fast. Worrying about 14K-30K hour lifespans is a bit silly.

I'm going to disagree. Tech moves slow unless you change just for the sake of change. I've been waiting years (6 now) to replace my 3008WFP. There has been nothing in those 6 years that has been tempting until 4K hit the scene. So I can see myself easily using the OLED display for 6 years. Tech moves all over the place fast, but not necessarily forward fast. It took years to finally get DisplayPort to be commonplace on NVIDIA cards (ATI was ahead here). There are always new things coming out, but they aren't necessarily interchangeable with or replacements for existing tech. For example, I don't care about G-Sync/FreeSync, 3D, 120 Hz, or sub-30" 16:9 screens. So as a replacement for my uses, nothing had really changed or improved until 4K hit the scene, along with large 40" monitors (or TVs as monitors). For a long time, if you wanted something with lots of screen space like the 30" 2560x1600 monitors, the area was really stagnant.

If I wanted to try different types of tech then sure lots of stuff comes out, but replacements for tried and true tech is slow.

Lifespan numbers are also to 50% brightness, not to panel death. Pretty much all monitors are still entirely usable at 50% brightness in home/consumer settings. Less so in office settings because of the typical extremely bright overhead fluorescents.
I use my 3008WFP at 0 brightness (zero); I prefer a dimmer screen, so maybe I could get the panel to last longer. I wonder if Dell covers burn-in under their extended warranties. Probably not, but Best Buy does in theirs, so I can hope :p.
 
It isn't just speculation. OLED burns in. LG OLED TVs would burn in if you tried to use them like a monitor.

If you only display video on this monitor and don't actually use it for a windows Desktop, then it would likely be ok.

But static images burn into OLED. There is no real cure for this. Except don't display static elements.

Do you own an OLED? No? Okay then.

I have about 300 hours on my LG OLED, ALL PC use. Zero "burn-in." I've had instances of image retention of, say, desktop icons, but other than that, nothing. And it was only noticeable on a gray screen when I was specifically trying to find it. And guess what, it disappeared. I solved this by putting all my icons in the center of the desktop, so that when I'm using my browser or something, they are covered up.
 
How do you think the pixel shift thing works? Does the screen have a couple extra rows and columns, and occasionally the whole screen shifts 1-2 rows over?

Do OLEDs have bad pixels just like LCDs?

Would anyone rather have this Dell 30" OLED in 2560x1600 vs 4K, so scaling isn't needed?
 
How do you think the pixel shift thing works? Does the screen have a couple extra rows and columns, and occasionally the whole screen shifts 1-2 rows over?

Do OLEDs have bad pixels just like LCDs?

Would anyone rather have this Dell 30" OLED in 2560x1600 vs 4K, so scaling isn't needed?

Yes, OLED still suffers from bad pixels.

I'd take a 2560x1600 OLED right now if it came at a more reasonable price. Unfortunately it's not going to happen though, the production lines are optimized for 4K.
 
How do you think the pixel shift thing works? Does the screen have a couple extra rows and columns, and occasionally the whole screen shifts 1-2 rows over?

Do OLEDs have bad pixels just like LCDs?

Would anyone rather have this Dell 30" OLED in 2560x1600 vs 4K, so scaling isn't needed?

Yes, same potential issue of dead pixels. I'm not sure on "stuck" pixels - those might actually not be a problem?

I had a Samsung plasma with pixel shift. My guess is no extra lines; it only moves over, so it'll clip the edges of content when it's active. You'll never notice though, given the 4K res at 30". Pixel shift really isn't going to do much though... items that may have image retention are typically icons and such, which won't be affected by a slight shift.
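
To make the "how does pixel shift work" question concrete, here's how I'd picture an orbit scheme; purely my own illustration, not Dell's actual implementation (the offsets, step interval, and edge handling are all guesses):

```python
# Hypothetical pixel-orbit pattern: every few minutes the panel redraws the
# whole frame nudged by a pixel, so static elements don't age exactly the same
# subpixels. Edge content is either cropped or drawn into spare rows/columns,
# depending on the design.
ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1),
         (-1, 0), (-1, -1), (0, -1), (1, -1)]

def orbit_offset(step: int) -> tuple:
    """Offset (dx, dy) to apply at a given orbit step (advanced every few minutes)."""
    return ORBIT[step % len(ORBIT)]

# Over nine steps the image wanders one pixel around its home position:
print([orbit_offset(s) for s in range(9)])
```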

The best defense that I've seen is from LG, where the screen detects if the content has remained mostly the same, and will actively reduce brightness down to very dim levels in slow, gradual steps. Even if you left the display on for hours at that level of brightness, I'd bet money no permanent IR would occur. It can be a little jarring sometimes. When typing a forum post, or reading/looking at something static for a while, the screen will become noticeably dimmer after several minutes. When I cause a large change on the screen, the brightness jumps back up. Not a big deal in my opinion though. It only ever happens on the desktop in my use.
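
If you're curious what that dimming logic might look like under the hood, here's a rough sketch of the idea (my guess at the behavior, not LG's actual firmware; all the numbers are made up):

```python
def next_luminance(current_nits: float, frames_static: int,
                   full_nits: float = 300.0, min_nits: float = 40.0,
                   step_nits: float = 2.0, static_threshold: int = 600) -> float:
    """Step brightness down slowly once content has been static for a while.

    frames_static counts consecutive frames that were (mostly) unchanged; any
    large change on screen resets it, which snaps brightness back to full.
    """
    if frames_static < static_threshold:        # content changed recently
        return full_nits
    return max(min_nits, current_nits - step_nits)  # slow, gradual dimming
```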

I also use mostly dark desktop backgrounds (reduces energy use/panel wear and prevents ABL from kicking in so easily). But the best prevention, in my opinion, is how I set my screensaver - after 3 minutes my screensaver goes to a blank/black image. The pixels are "off" thanks to OLED, but the display is basically in an instant-wake sleep mode as soon as I move the mouse. After 30 minutes of inactivity, I set the display to actually turn off.

I just played GTA 5 for the first time in months again. The game is mouthwatering, especially at night, with 4K and 55" of OLED goodness. I actually don't think I could justify spending the asking price for this monitor when OLED TVs are so readily available, more immersive, and more useful. 120Hz input would be fantastic, but at 30" - meh. I'm trying to work with LG to improve the input lag. It's not terrible, but it's far from great for gaming. If nothing else, hopefully their new 2016 displays will be much better in that regard.

Love that floating display effect at night.

 
I'm going to disagree. Tech moves slow unless you change just for the sake of change.

Display tech moves in steps. Once an area is broken into, it can move fast. Seeing this Dell come out not only with OLED but 120 Hz 4K is a double stepping stone in display technology in a single monitor!

I could see purchasing this $5K Dell and then within a year or two another 4K 120+ Hz OLED monitor comes out once they design a new G-Sync chip to handle that much bandwidth. Then it would be pretty much a mandatory upgrade for me.

I just played GTA 5 for the first time in months again. The game is mouthwatering, especially at night, with 4K and 55" of OLED goodness. I actually don't think I could justify spending the asking price for this monitor when OLED TVs are so readily available, more immersive, and more useful. 120Hz input would be fantastic, but at 30" - meh. I'm trying to work with LG to improve the input lag. It's not terrible, but it's far from great for gaming. If nothing else, hopefully their new 2016 displays will be much better in that regard.

Love that floating display effect at night.

Yes, playing games on an OLED at night is an almost otherworldly experience. I did enjoy my 55" 4K OLED but the combination of only 60 Hz and noticeable input lag on top of that was just too much for me. :(
 
Pretty excited about this. I've tried other displays but keep going back to my old 3007WFP. Hoping this is a true replacement and upgrade to the Dell 30.
 
Yes, playing games on an OLED at night is an almost otherworldly experience. I did enjoy my 55" 4K OLED but the combination of only 60 Hz and noticeable input lag on top of that was just too much for me. :(

So you got rid of yours, eh? I almost did... bought another JS9000 to go back to, since its 120Hz mode basically matches the OLED's input lag. Game mode is pristine... but after playing games on it, and trying to watch a movie with it, I just couldn't. OLED has ruined me. LCD is no longer an option. The JS9000 is going back LOL.
 
I've had instances of image retention of, say, desktop icons, but other than that, nothing. And it was only noticeable on a gray screen when I was specifically trying to find it.

My LCD MacBook does the exact same thing (only noticeable on a grey screen subsequent to the static image). A surprising number of LCD panels have this behaviour, even high-end ones. Not at all unique to OLED. Most people just never notice.
 
It isn't just speculation. OLED burns in. LG OLED TVs would burn in if you tried to use them like a monitor.

If you only display video on this monitor and don't actually use it for a windows Desktop, then it would likely be ok.

But static images burn into OLED. There is no real cure for this. Except don't display static elements.

I have over 1000 hours just on Destiny alone on my OLED tv (look under stats)
http://destinystatus.com/psn/Fleat

If we look in more detail, almost 350 hours of that is mostly AFK in orbit (with a hud). My previous plasma would have had absolutely horrible image retention from this (and it did), and my LG OLED shows 0 IR. This doesn't even include the roughly 1000+ hours of PC use and other gaming, and still 0 problems.

I was in a bad motorcycle accident, and used this thing non-stop as my only display for 3+ months while I was recovering. I was probably the "worst case scenario" user outside of a display model in a Best Buy store. I would often fall asleep from pain meds and leave my PC desktop up on the display for hours at a time with nothing changing, without any lingering issues. Just my 2 cents
 
So you got rid of yours, eh? I almost did... bought another JS9000 to go back to, since its 120Hz mode basically matches the OLED's input lag. Game mode is pristine... but after playing games on it, and trying to watch a movie with it, I just couldn't. OLED has ruined me. LCD is no longer an option. The JS9000 is going back LOL.

Yes, the last LCDs I will ever own are in my possession right now.
 
I have over 1000 hours just on Destiny alone on my OLED tv (look under stats)
http://destinystatus.com/psn/Fleat

If we look in more detail, almost 350 hours of that is mostly AFK in orbit (with a hud). My previous plasma would have had absolutely horrible image retention from this (and it did), and my LG OLED shows 0 IR. This doesn't even include the roughly 1000+ hours of PC use and other gaming, and still 0 problems.

I was in a bad motorcycle accident, and used this thing non-stop as my only display for 3+ months while I was recovering. I was probably the "worst case scenario" user outside of a display model in a Best Buy store. I would often fall asleep from pain meds and leave my PC desktop up on the display for hours at a time with nothing changing, without any lingering issues. Just my 2 cents

My plasma never had any retention problems until I played Kingdom Hearts Remix. And really, it's only had an issue with that game, as I've had other HUDs after that with no image retention. I think a lot of this has to do with the colors displayed on the screen.

I've used OLED screens at work though, and all of them had some issues when left on for a day or more. My previous boss would leave the screen on all the time, and the screen would be ruined after a weekend of a static desktop. Fortunately these units should take care of some of those problems, but I probably won't be an early adopter of this monitor. I'll probably wait until maybe June or so for other people to review it and give a better idea of desktop usage.
 
My plasma never had any retention problems until I played Kingdom Hearts Remix. And really, it's only had an issue with that game, as I've had other HUDs after that with no image retention. I think a lot of this has to do with the colors displayed on the screen.

I've used OLED screens at work though, and all of them had some issues when left on for a day or more. My previous boss would leave the screen on all the time, and the screen would be ruined after a weekend of a static desktop. Fortunately these units should take care of some of those problems, but I probably won't be an early adopter of this monitor. I'll probably wait until maybe June or so for other people to review it and give a better idea of desktop usage.

Based on my own personal experience and all of the reading I have done on the OLED owner threads, I wouldn't hesitate to be an early adopter of this monitor. I am hoping there are far more people like me out there so that this technology really gets appreciated the way it should be.

LCDs (even modern ones) provide a truly subpar experience in comparison to OLED displays from a picture quality perspective. I will be elated when traditional LCDs are relegated to being only the budget display option.
 
I have over 1000 hours just on Destiny alone on my OLED tv (look under stats)
http://destinystatus.com/psn/Fleat

If we look in more detail, almost 350 hours of that is mostly AFK in orbit (with a hud). My previous plasma would have had absolutely horrible image retention from this (and it did), and my LG OLED shows 0 IR. This doesn't even include the roughly 1000+ hours of PC use and other gaming, and still 0 problems.

I was in a bad motorcycle accident, and used this thing non-stop as my only display for 3+ months while I was recovering. I was probably the "worst case scenario" user outside of a display model in a Best Buy store. I would often fall asleep from pain meds and leave my PC desktop up on the display for hours at a time with nothing changing, without any lingering issues. Just my 2 cents

very useful post about burn-in, thank you
 
I have over 1000 hours just on Destiny alone on my OLED tv (look under stats)
http://destinystatus.com/psn/Fleat

If we look in more detail, almost 350 hours of that is mostly AFK in orbit (with a hud). My previous plasma would have had absolutely horrible image retention from this (and it did), and my LG OLED shows 0 IR. This doesn't even include the roughly 1000+ hours of PC use and other gaming, and still 0 problems.

I was in a bad motorcycle accident, and used this thing non-stop as my only display for 3+ months while I was recovering. I was probably the "worst case scenario" user outside of a display model in a Best Buy store. I would often fall asleep from pain meds and leave my PC desktop up on the display for hours at a time with nothing changing, without any lingering issues. Just my 2 cents

Yeah, thanks for this. I'm just in the camp that believes that, if it's not used in some ridiculous fashion (24/7 static image), burn-in in today's high-end displays is a thing of the past. I have an 8-year-old Panny plasma which has gone through hell in terms of static images, etc., and it has shown no burn-in after all these years.

With today's displays, if one uses even slight prudence, it should be fine. You would think that with video games being one of the most popular forms of entertainment in the world, companies would have improved their tech to mitigate IR in today's displays. These aren't first-gen plasmas we are talking about. Just as mentioned by Nitemare3219, LG implements a bunch of fail-safes anyway.
 
Nice, an actual shot of the inputs. Looks like HDMI, mini-DP (which was running the display at CES), and a Thunderbolt 3 / USB Type-C port.

Unfortunately there is no label, so we don't know if that's a DP 1.2 or 1.3 speed yet.

I am going to just pretend the mini-DP is 1.3... otherwise I begin to get nightmarish flashbacks to when Accell did those DP to DVI adapters, uggghhhhhh... I just refuse to believe that Thunderbolt 3 is the only way to hit 4K 120 Hz... are NVIDIA or AMD even toying with the idea of using TB3 connections? Or is that just an Apple thing.
 
Nice, an actual shot of the inputs. Looks like HDMI, mini-DP (which was running the display at CES), and a Thunderbolt 3 / USB Type-C port.

Unfortunately there is no label, so we don't know if that's a DP 1.2 or 1.3 speed yet.

You think that the HDMI input accepts 4K 60 Hz? If so I may jump on this - that is, if the input lag is next to none like most PC monitors. But I am not confident this will have low input lag; the pixel shifting is a giveaway, and if this has ABL as well, that will increase input lag too.
 
Nice, an actual shot of the inputs. Looks like HDMI, mini-DP (which was running the display at CES), and a Thunderbolt 3 / USB Type-C port.

Unfortunately there is no label, so we don't know if that's a DP 1.2 or 1.3 speed yet.

DP 1.2 can't do 4K@120Hz though, can it? Wouldn't it necessarily have to be DP1.3? We have that Dell rep saying it's capable of 120Hz, although I'm not placing much faith in a twitter post.

Also, I came across this interesting bit on VESA's site:

Q: Will DisplayPort 1.3 enable further performance enhancements to 4K UHD displays?

A: Yes, when including the new HBR3 link rate option, DisplayPort 1.3 will enable a 4K UHD display to operate at a 120Hz refresh rate using 24-bit pixels, or a 96Hz refresh rate using 30-bit pixels.

So this pretty much dashes any hopes of being able to run this display (or any 4K display) yet at 120Hz with 10 bit per channel color. Going to have to dial it back to 8 bits per channel, which is unfortunate since it's OLED.

http://www.vesa.org/faqs/
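
The VESA numbers check out on a napkin (this ignores blanking and protocol overhead, so treat it as approximate): HBR3 is 4 lanes x 8.1 Gbps with 8b/10b encoding, leaving roughly 25.9 Gbps for pixel data, which is enough for 4K at 120 Hz at 24 bpp but not at 30 bpp.

```python
# Approximate DP 1.3 HBR3 payload math, ignoring blanking/protocol overhead.
HBR3_PAYLOAD_GBPS = 4 * 8.1 * 0.8   # 4 lanes x 8.1 Gbps, 8b/10b encoding ~= 25.9 Gbps

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(pixel_rate_gbps(3840, 2160, 120, 24))  # ~23.9 Gbps -> fits  (8 bpc @ 120 Hz)
print(pixel_rate_gbps(3840, 2160, 120, 30))  # ~29.9 Gbps -> won't (10 bpc @ 120 Hz)
print(pixel_rate_gbps(3840, 2160, 96, 30))   # ~23.9 Gbps -> fits  (10 bpc @ 96 Hz)
```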
 
I have over 1000 hours just on Destiny alone on my OLED tv (look under stats)
http://destinystatus.com/psn/Fleat

If we look in more detail, almost 350 hours of that is mostly AFK in orbit (with a hud). My previous plasma would have had absolutely horrible image retention from this (and it did), and my LG OLED shows 0 IR. This doesn't even include the roughly 1000+ hours of PC use and other gaming, and still 0 problems.

I have no issue with OLED TVs and even playing games on them. I am talking about Windows Desktop usage.

Destiny is hardly a static game, like a Windows Desktop is:
https://www.youtube.com/watch?v=XKQfiwPzjmc

The owner's manual warns about static images and burn-in, and even warns against watching too much 4:3 video. Burn-in is not covered by the warranty, and it is happening to display models.

Using OLED like an office work screen (8 hrs/day, 5 days/week with a Windows desktop) will cause burn-in. It is pretty much the same kind of abuse that display model TVs get.
 
You think that the HDMI input accepts 4K 60 Hz? If so I may jump on this - that is, if the input lag is next to none like most PC monitors. But I am not confident this will have low input lag; the pixel shifting is a giveaway, and if this has ABL as well, that will increase input lag too.

Yes, I highly doubt Dell would put HDMI 1.4 (4K limited to 30 Hz) on this display. Fairly positive it will be an HDMI 2.0 chip for 4K @ 60 Hz.

DP 1.2 can't do 4K@120Hz though, can it? Wouldn't it necessarily have to be DP1.3? We have that Dell rep saying it's capable of 120Hz, although I'm not placing much faith in a twitter post.

Also, I came across this interesting bit on VESA's site:

Q: Will DisplayPort 1.3 enable further performance enhancements to 4K UHD displays?

A: Yes, when including the new HBR3 link rate option, DisplayPort 1.3 will enable a 4K UHD display to operate at a 120Hz refresh rate using 24-bit pixels, or a 96Hz refresh rate using 30-bit pixels.

So this pretty much dashes any hopes of being able to run this display (or any 4K display) yet at 120Hz with 10 bit per channel color. Going to have to dial it back to 8 bits per channel, which is unfortunate since it's OLED.

http://www.vesa.org/faqs/

DP 1.2 cannot do 4K @ 120 Hz. That doesn't mean it's not a DP 1.2 port. I have my suspicions that DP 1.3 transmission controller chips still aren't on the market. I hope I am wrong.

That leaves USB Type-C Thunderbolt 3, which can do 120 Hz 4K. The problem with that: it means that to run the monitor to its full capabilities, you would need to connect it to a laptop or desktop that has Thunderbolt 3 fed by the integrated GPU. This basically rules out running the display up to its potential with modern high-performance GPUs. You are going to need some serious GPU horsepower to run 4K at 120 Hz.

Who knows, maybe AMD/NVIDIA will surprise us with their 2016 GPUs and provide a Thunderbolt 3 port (one could hope).
 
I have no issue with OLED TVs and even playing games on them. I am talking about Windows Desktop usage.

Destiny is hardly a static game, like a Windows Desktop is:
https://www.youtube.com/watch?v=XKQfiwPzjmc

The owner's manual warns about static images and burn-in, and even warns against watching too much 4:3 video. Burn-in is not covered by the warranty, and it is happening to display models.
Although Destiny is not completely static, orbit does have elements that do not change.
i.e. https://www.youtube.com/watch?v=apWyXoEHG3U
And if you follow my link from my original post, you can see I spent 300+ hours in orbit or the Tower, which also has static elements.

Using OLED like an office work screen (8 hrs/day, 5 days/week with a Windows desktop) will cause burn-in. It is pretty much the same kind of abuse that display model TVs get.
Oddly enough though, you ignored the exact part of my post where I said I have done just that -
I was in a bad motorcycle accident, and used this thing non-stop as my only display for 3+ months while I was recovering. I was probably the "worst case scenario" user outside of a display model in a Best Buy store. I would often fall asleep from pain meds and leave my PC desktop up on the display for hours at a time with nothing changing, without any lingering issues. Just my 2 cents
Although the scenario is not exactly what you are describing, it is close enough in my mind that the current performance, along with the other technical changes they are making, should alleviate serious IR or burn-in concerns for OLED monitors. Even if Dell gets it slightly wrong, if it pushes the OLED agenda forward and we get better monitors, it is better for enthusiasts all around.
 
Yes, I highly doubt Dell would put HDMI 1.4 (4K limited to 30 Hz) on this display. Fairly positive it will be an HDMI 2.0 chip for 4K @ 60 Hz.

DP 1.2 cannot do 4K @ 120 Hz. That doesn't mean it's not a DP 1.2 port. I have my suspicions that DP 1.3 transmission controller chips still aren't on the market. I hope I am wrong.

That leaves USB Type-C Thunderbolt 3, which can do 120 Hz 4K. The problem with that: it means that to run the monitor to its full capabilities, you would need to connect it to a laptop or desktop that has Thunderbolt 3 fed by the integrated GPU. This basically rules out running the display up to its potential with modern high-performance GPUs. You are going to need some serious GPU horsepower to run 4K at 120 Hz.

Who knows, maybe AMD/NVIDIA will surprise us with their 2016 GPUs and provide a Thunderbolt 3 port (one could hope).

I'm confused. I thought it was as follows (for 4K):

DP 1.3: max 4K@120Hz (@8 bits per color channel).

DP 1.2: max 4K@60Hz

TB 3: max (2) 4K@60Hz, no 120Hz support; supports DP 1.2 but not 1.3.

HDMI 2.0: max 4K@60Hz.

HDMI 1.4: max 4K@30Hz.

I do agree that at this time it's *unlikely that Dell has a DP 1.3 controller, since we won't be seeing DP 1.3 on discrete GPUs until NVIDIA's Pascal and AMD's Polaris are released later this year.

*But according to the Dell rep and what we know, it'll be capable of 4K@120Hz. So that would seem to imply that it does, in fact, have a DP 1.3 controller? I guess it hinges on that USB Type-C port, since AFAIK, it comes "DP 1.3 ready."
 
*But according to the Dell rep and what we know, it'll be capable of 4K@120Hz. So that would seem to imply that it does, in fact, have a DP 1.3 controller? I guess it hinges on that USB Type-C port, since AFAIK, it comes "DP 1.3 ready."


Hopefully they aren't trying to pull a fast one like with TVs... and only mean 120 Hz internal processing rather than true 120 Hz input. I doubt that is the case.

RE: IR (image retention - burn-in is something long dead unless someone is doing something really dumb)
Also, there are three of us in the thread who, in fact, own OLED screens - and they don't have this extra tech that Dell says can help with IR. We aren't seeing any problems.

That said, some units before fall 2014 had some IR issues (nothing close to the IR seen on even the best plasmas - and it would go away pretty fast).
It is my understanding that the IR comes not from the pixels themselves, but from the circuits sending power to them - and LG made a change that randomized that connection. My first OLED screen in late 2014 was after the fix and I didn't notice anything related to IR - even with hours of a goddamn pause screen my wife would leave on the TV!!! (But honestly I didn't touch that TV much.)
My two new flat 4K LG EF9500 screens (65" and 55") I got at launch have zero IR issues too, and sometimes I'll leave a Windows desktop sitting on them for a few days (oops) - HTPC connected to them :)

None use the pixel shifting that plasmas used (which you could see), and I don't know if the LG strategy changed.
I did have a friend with one of the early 4K LG TVs, and he does use it as a desktop and had some minor IR if something was static for days, but it quickly went away. Was that before LG changed power to the pixels? I'm not sure. (I'm also not sure how he dealt with the unstoppable auto-dim power saving.)
 
Who knows, maybe AMD/NVIDIA will surprise us with their 2016 GPUs and provide a Thunderbolt 3 port (one could hope).

I can easily see NVIDIA putting a port on their Titan line, assuming there's an upcoming Titan in 2016 (and AMD doing the same on their top end). If you're the type of person to spend $1000+ on a graphics card, and probably purchase multiples of them, you're probably going to be the type of person who buys the best-of-the-best monitor, which this obviously will be.

Now, for their lower-tiered graphics cards, such as the NVIDIA 1080 and 1080 Ti (and I'm just assuming the numbering here goes in 100 increments), I can see them not adding the port. Same with AMD.
 
I can easily see NVIDIA putting a port on their Titan line, assuming there's an upcoming Titan in 2016 (and AMD doing the same on their top end). If you're the type of person to spend $1000+ on a graphics card, and probably purchase multiples of them, you're probably going to be the type of person who buys the best-of-the-best monitor, which this obviously will be.

Now, for their lower-tiered graphics cards, such as the NVIDIA 1080 and 1080 Ti (and I'm just assuming the numbering here goes in 100 increments), I can see them not adding the port. Same with AMD.

Do we really think AMD or Nvidia would pay Intel for Thunderbolt 3 controllers on their GPUs? Also, my understanding is that in addition to 4x PCIe 3.0 lanes for the Thunderbolt 3 controller, the motherboard requires a direct I/O Thunderbolt "header" to work with Thunderbolt. Thus, I don't imagine AMD or Nvidia would build Thunderbolt support into any of their GPUs, because it would require particular motherboards.

What I could see happening, however, is USB-C-shaped DisplayPort 1.3 ports that could carry two 4K@60Hz signals. This would be similar to how the current MacBook has a USB-C port that carries display output, but no Thunderbolt support.
 
I'm confused. I thought it was as follows (for 4K):

DP 1.3: max 4K@120Hz (@8 bits per color channel).

DP 1.2: max 4K@60Hz

TB 3: max (2) 4K@60Hz, no 120Hz support; supports DP 1.2 but not 1.3.

HDMI 2.0: max 4K@60Hz.

HDMI 1.4: max 4K@30Hz.

I do agree that at this time it's *unlikely that Dell has a DP 1.3 controller, since we won't be seeing DP 1.3 on discrete GPUs until NVIDIA's Pascal and AMD's Polaris are released later this year.

*But according to the Dell rep and what we know, it'll be capable of 4K@120Hz. So that would seem to imply that it does, in fact, have a DP 1.3 controller? I guess it hinges on that USB Type-C port, since AFAIK, it comes "DP 1.3 ready."
USB Type-C with Thunderbolt 3 has 40 Gbps of bandwidth available. 3840x2160 at 120 Hz and 8 bpc requires approximately 24 Gbps of bandwidth (30 Gbps at 10 bpc), so the USB Type-C connector at least has the aggregate bandwidth to support the resolution and refresh rate of this display. I assume that, just like 5K displays being driven by two DP 1.2 inputs, you could combine the 8 lanes of DP 1.2 in USB Type-C and leverage all the available bandwidth.
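
Quick sanity check on those figures (raw pixel data only, with no blanking or protocol overhead included, so it's a ballpark):

```python
TB3_LINK_GBPS = 40.0  # Thunderbolt 3 aggregate link rate

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel data rate in Gbps (no blanking/overhead included)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for bpc in (8, 10):
    rate = pixel_rate_gbps(3840, 2160, 120, bpc * 3)
    print(f"4K @ 120 Hz, {bpc} bpc: {rate:.1f} Gbps, fits in TB3: {rate < TB3_LINK_GBPS}")
```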
 