Dell UP3017Q - 4K 120 Hz OLED 30"

What I could see happening, however, is USB-C-shaped DisplayPort 1.3 ports that could carry two 4K 60 Hz signals. This would be similar to how the current MacBook has USB-C ports for display output, but no Thunderbolt support.

Exactly this. Probably some combination of a full-size HDMI 2.0, a full-size or mini DP 1.3, and one or two USB-C ports capable of sending DP 1.3.

It's unlikely that AMD or NVIDIA would include a Thunderbolt 3 port on their next-gen cards, since TB uses DP to send video. Currently, TB3 only supports DP 1.2, which makes it inferior to both DP 1.3 and USB-C (which supports DP 1.3) for video purposes.

It's ultimately Intel's call to update Thunderbolt for DP 1.3, and as such AMD and NVIDIA will probably avoid it.
 
Oddly enough though, you ignored the exact part of my post where I said I have done just that -

I am confused. You both played Destiny on it nonstop and had the Windows desktop on nonstop?

During those two nonstop activities, you never watched any TV or movies on it?

Also, you have an OLED display on your PC and disabled your screen saver?
 
It's unlikely that AMD or NVIDIA would include a Thunderbolt 3 port on their next-gen cards, since TB uses DP to send video. Currently, TB3 only supports DP 1.2, which makes it inferior to both DP 1.3 and USB-C (which supports DP 1.3) for video purposes.

It's ultimately Intel's call to update Thunderbolt for DP 1.3, and as such AMD and NVIDIA will probably avoid it.

Excellent point about TB3 being inferior to DisplayPort 1.3 by itself. Even more reason that AMD and Nvidia would not include a TB3 controller onboard. I suspect there will not be DP 1.3 support on Thunderbolt until TB4. By that point maybe it'll be on PCIe 4.0, which has been in development for such a long time!
 
Note it here, boys. When I win the Powerball tonight, I'll buy one of these bad boys for 50 random people on this post. I get to choose, just in case someone is an asshole on the forum. Then I'll send that guy a 768p TN panel and a box of sand.
 
Would be nice to have this monitor, but the cost and the GPU demands mean that this probably isn't going to be mine anytime soon (my current budget 4K TV does the trick for now).
 
I am confused. You both played Destiny on it nonstop and had the Windows desktop on nonstop?

During those two nonstop activities, you never watched any TV or movies on it?

Also, you have an OLED display on your PC and disabled your screen saver?

My OLED TV has been used for a variety of tasks, just as many people use high-end TVs. I purchased it in August of 2014, and it has served as my primary screen for console gaming, auxiliary and later main PC use, and other media viewing. I initially held many of the same concerns you have (especially as a plasma owner myself). With that in mind, I did initially treat my screen with care, as I was concerned about burn-in. What I personally found over time for my use case is that I just wasn't having any issues, so I stopped caring as much.

I never said I have done anything "nonstop". I have had the TV for around 500ish days at this point, and I have used it for a large variety of things, including being unintentionally abusive towards it at times. Fortunately, they seem to be pretty resilient. Other owners in this very thread have chimed in and stated essentially the same thing.

If you need any further clarification, please refer to my initial post. In any case, I understand why you might not be comfortable being an early adopter with OLED. Hopefully many people will make the leap and we will have an exciting couple of years for monitors!
 
Is this thing's wide color gamut a problem for the pros it's targeted at?
 
... and I have used it for a large variety of things ...

Hence why it isn't the same as using an OLED for office use, which is what I have been trying to convey from the start.

If you do a mix of gaming, movie/TV watching, and some desktop use, I never said there would be an issue, and I would be comfortable with an OLED TV for that, including using it for HTPC and PC gaming (as I do with my current TV).

But it wouldn't work out for my desktop monitor, which probably has about 20,000 hours with the Windows Start button, taskbar, and Computer icon in the same place.

There is a massive difference in scale between those.
 
Hence why it isn't the same as using an OLED for office use, which is what I have been trying to convey from the start.

If you do a mix of gaming, movie/TV watching, and some desktop use, I never said there would be an issue, and I would be comfortable with an OLED TV for that, including using it for HTPC and PC gaming (as I do with my current TV).

But it wouldn't work out for my desktop monitor, which probably has about 20,000 hours with the Windows Start button, taskbar, and Computer icon in the same place.

There is a massive difference in scale between those.

What if you have a moving screen saver (especially if it's on for 2-3 hours at night)? Wouldn't that solve your potential problem?
 
What if you have a moving screen saver (especially if it's on for 2-3 hours at night)? Wouldn't that solve your potential problem?

If you want to counter having the taskbar on the screen all day, you would have to leave the inverse image on overnight.

Though this is "fixing" your problem by wearing out the rest of the display to match the burn-in, which will contribute to dimming of the whole display.
 
If you want to counter having the taskbar on the screen all day, you would have to leave the inverse image on overnight.

Though this is "fixing" your problem by wearing out the rest of the display to match the burn-in, which will contribute to dimming of the whole display.

My point was that if the other poster's mix of video and games evens out the wear, then I would think a screen saver with a variety of pictures (possibly shifting around the display) would accomplish essentially the same thing. That said, it sounds like Dell is claiming they have some tech that counteracts this automatically.
 
Thunderbolt 3 has eight lanes of DP 1.2, exactly the setup the Dell 5K uses but with one cable instead of two. Thunderbolt 3 has plenty of bandwidth for 4K @ 120 Hz.

I believe this is the only way the monitor in its release form will be able to run 4K at 120 Hz (via laptop or integrated desktop graphics), and that the mini-DP is DP 1.2.

I hope I am wrong and that the mini-DP is actually 1.3, which would allow this to be a "gaming" display.
 
If this thing were half the price and nearer to 40", I'd be all over it like a fat kid in a candy shop!
 
My point was that if the other poster's use of video or games changes it, then I would think a screen saver with a variety of pictures (possibly shifting around the display) would essentially accomplish the same thing. That said, it sounds like Dell is claiming they have some tech that counteracts this automatically.

That worked because it is likely that the Windows desktop was actually a fraction of his overall usage.

Just for example, pretend it is 25% of his overall usage.

Now if you run with the Windows desktop on your screen for 6 hours and you want to do the same thing with a screen saver, you would need to run the screen saver for the other 18 hours a day. This wouldn't be the best way to do it.

An intelligent burn-in compensation circuit would have something like counters for each pixel to measure usage. Then when you hit power off at the end of the day, it would figure out an inverse mask for the usage pattern, put up the reverse image at super high brightness (higher than normally allowed), and do a quick reverse burn to level out the day's damage. Done correctly, you wouldn't be able to burn an image into the screen.
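As a rough sketch of that counter-and-inverse-mask idea (not any shipping controller's actual firmware; the tiny 4x6 "panel" and the linear wear model are assumptions for illustration):

```python
# Hypothetical per-pixel wear-leveling sketch, assuming wear is simply
# proportional to brightness * time for every pixel.
H, W = 4, 6  # a toy 4-row by 6-column panel

def accumulate_wear(wear, frame_luma, seconds):
    """Add each pixel's brightness-seconds to its wear counter."""
    return [[w + l * seconds for w, l in zip(wr, lr)]
            for wr, lr in zip(wear, frame_luma)]

def inverse_mask(wear):
    """Compensation image: the LEAST-worn pixels get driven the
    hardest, so total wear ends up level across the panel."""
    peak = max(max(row) for row in wear)
    return [[peak - w for w in row] for row in wear]

# Example: only the bottom row (the "taskbar") lit for 6 hours
wear = [[0.0] * W for _ in range(H)]
frame = [[1.0 if y == H - 1 else 0.0 for _ in range(W)] for y in range(H)]
wear = accumulate_wear(wear, frame, 6 * 3600)

mask = inverse_mask(wear)
leveled = [[w + m for w, m in zip(wr, mr)] for wr, mr in zip(wear, mask)]
# Every pixel now carries equal total wear
assert len({v for row in leveled for v in row}) == 1
```

In practice the compensation pass would run briefly at elevated brightness rather than literally replaying 18 hours of wear, which is exactly the trade-off the later replies point out: it levels the image by dimming the whole panel.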
 
An intelligent burn-in compensation circuit would have something like counters for each pixel to measure usage. Then when you hit power off at the end of the day, it would figure out an inverse mask for the usage pattern, put up the reverse image at super high brightness (higher than normally allowed), and do a quick reverse burn to level out the day's damage. Done correctly, you wouldn't be able to burn an image into the screen.
LG's televisions have a similar compensation circuit built in, though they don't unnecessarily wear down the rest of the screen. It's better to reduce the brightness of the other areas of the screen, since that lets you increase it again later if those areas end up receiving more wear.
That's why burn-in has not appeared to be a problem with them yet, but I'm sure it has a negative effect on the display's overall brightness.
 
Thunderbolt 3 has eight lanes of DP 1.2
DP 1.2 is 21.6 Gbps. "8 lanes" implies it's 8 times 21.6 Gbps, but it appears TB3 does 40 Gbps, which is less than 2 times DP 1.2! (I'm not getting into any 8b/10b differences, and 4K 120 Hz can run off 25 Gbps of throughput, so your point stands that TB3 has plenty of bandwidth.)

at the end of the day, it would figure out an inverse mask for the usage pattern, it would then put up the reverse image
Good concept, but technically you wouldn't want it to run every day. Maybe once every few months or once a year, to prevent excessive counteracting wear. Of course, natural usage will cause more averaged and diverse wear than looking at it day by day.
 
DP1.2 is 21.6Gbps. "8 lanes" implies it's 8 times 21.6Gbps, but it appears TB3 does 40Gbps, which is less than 2 times DP1.2! (now I'm not getting into any 8b:10b differences, and 4K120 can run off 25Gbps throughput, so your point remains that TB3 has plenty of bandwidth).

Genuine question: where are you and Vega getting this info that TB3 supports 4K @ 120 Hz? The only place I've seen this mentioned is on Reddit, and even then I couldn't find a link to back up the claim. I've gone through Intel's site, Wikipedia, etc., and haven't found any claim that TB3 can do this. It has the bandwidth, yes, but isn't that moot if 4K @ 120 Hz isn't defined or supported by the spec?
 
That worked because it is likely that the Windows desktop was actually a fraction of his overall usage.

Just for example, pretend it is 25% of his overall usage.

Now if you run with the Windows desktop on your screen for 6 hours and you want to do the same thing with a screen saver, you would need to run the screen saver for the other 18 hours a day. This wouldn't be the best way to do it.

An intelligent burn-in compensation circuit would have something like counters for each pixel to measure usage. Then when you hit power off at the end of the day, it would figure out an inverse mask for the usage pattern, put up the reverse image at super high brightness (higher than normally allowed), and do a quick reverse burn to level out the day's damage. Done correctly, you wouldn't be able to burn an image into the screen.

I assume that if this is still a problem, Dell must have some sort of workaround. After all, with a 3-year warranty, burn-in is likely to show up before it's up. Of course, if Dell says the warranty doesn't cover burn-in, then we'll know it's still an issue.
 
DP 1.2 is 21.6 Gbps. "8 lanes" implies it's 8 times 21.6 Gbps, but it appears TB3 does 40 Gbps, which is less than 2 times DP 1.2! (I'm not getting into any 8b/10b differences, and 4K 120 Hz can run off 25 Gbps of throughput, so your point stands that TB3 has plenty of bandwidth.)

Good concept, but technically you wouldn't want it to run every day. Maybe once every few months or once a year, to prevent excessive counteracting wear. Of course, natural usage will cause more averaged and diverse wear than looking at it day by day.
A single lane of DisplayPort 1.2 carries 5.4 Gbps of bandwidth. A single DP 1.2 connection carries 4 lanes for a total of 21.6 Gbps. 8 lanes * 5.4 Gbps = 43.2 Gbps total, or a little more than the nicely rounded number of 40 Gbps in the USB Type-C marketing materials.
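To make the lane arithmetic above concrete (using the raw rates quoted in this thread, and ignoring blanking intervals and link-coding overhead):

```python
# Lane math from the posts above; figures are raw signaling rates,
# not usable payload after encoding overhead.
DP12_LANE_GBPS = 5.4                 # one HBR2 lane

dp12_link = 4 * DP12_LANE_GBPS       # a single DP 1.2 connection
tb3_video = 8 * DP12_LANE_GBPS       # eight lanes carried over TB3

# Raw pixel rate for 4K @ 120 Hz at 24 bits per pixel
gbps_4k120 = 3840 * 2160 * 120 * 24 / 1e9

print(dp12_link)              # 21.6
print(tb3_video)              # 43.2 (marketed as ~40)
print(round(gbps_4k120, 1))   # 23.9, so it fits with room to spare
```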
 
Genuine question: where are you and Vega getting this info that TB3 supports 4K @ 120 Hz? The only place I've seen this mentioned is on Reddit, and even then I couldn't find a link to back up the claim. I've gone through Intel's site, Wikipedia, etc., and haven't found any claim that TB3 can do this. It has the bandwidth, yes, but isn't that moot if 4K @ 120 Hz isn't defined or supported by the spec?

Thunderbolt 3 doesn't have to "support" 4K @ 120 Hz; Dell can use the 8 lanes of DP 1.2 however they like, just like they did with the 5K Dell that uses 2x DP 1.2 cables.
 
I get what you mean by 8 lanes now... I guess more specifically, TB3 gives the bandwidth of eight 1-lane DP 1.2 links at 5.0 Gbps each, using a different physical layer.
 
If this thing were half the price and nearer to 40", I'd be all over it like a fat kid in a candy shop!

Not sure about the size, but it'll definitely be half the price in a year. The price of these monitors NEVER holds. A year ago, this monitor was $2,500.

http://www.newegg.com/Product/Product.aspx?Item=9SIA4P02R19217&cm_re=UP2715K-_-24-260-302-_-Product

The reality is that nobody's paying $5,000 for this thing, so there's no way the price will stay there. We also all know that it's going to have a ton of first-generation problems (burn-in, Dell probably found a way to give it a lot of input lag, cable bandwidth not really there yet).

In two years these things will be $1,200 with FreeSync or G-Sync.
 
Uhhhh, if it has DP 1.3, or there is an alternative to make TB3 work with high-end Nvidia/AMD GPUs, then consider me nobody, because I do not see any other display manufacturers giving any indication of producing anything close to this in the foreseeable future.

Basically, we have two and a half months to see if anybody else steps up to the plate. Two and a half months to prepare: save, donate blood and semen, and sell off crap we don't really need! If nobody does in that two and a half months, then you will not see a similar display until 2017-2018. Look at how long it is taking Asus to bring their Acer X34 clone to market... gawwwwd!

And as far as price, well, some of us do more with displays than eat Cheetos, watch cam whores, and play COD (although those are all very noble endeavors). I am producing a video game and constantly review artists' work, test gameplay mechanics, etc. I also run another business and have to deal with spreadsheets and constant precision mouse movements, so something like this display is a perfect tool for me, as I am currently balancing that work between my X34 and Dell 5K. Selling those displays would put me halfway to bada bing, which I would gladly do as long as we have DP 1.3.

Let's all hope, for the sake of humanity, that Dell does not screw the pooch with DP 1.2.
 
You really don't want to spend 5 grand for this thing. The price is going to drop like a rock very fast. LG is pushing OLED hard. They're going to be common soon.
 
You really don't want to spend 5 grand for this thing. The price is going to drop like a rock very fast. LG is pushing OLED hard. They're going to be common soon.

Pretty much. Unless you are someone who routinely drops $5K on displays each year, most should probably hold off.
 
You really don't want to spend 5 grand for this thing. The price is going to drop like a rock very fast. LG is pushing OLED hard. They're going to be common soon.


The OLED TV market is about to explode, but the computer display market is still up in the air. I don't see or hear about ANYBODY else coming out with OLED computer displays. The $5K expense sucks, but there does not seem to be anything else on the horizon... although this Dell dropped out of the sky from the heavens without any kind of warning, so I am hoping we see some information from other manufacturers (looking at LG) in the next month and a half...
 
The OLED TV market is about to explode, but the computer display market is still up in the air. I don't see or hear about ANYBODY else coming out with OLED computer displays. The $5K expense sucks, but there does not seem to be anything else on the horizon... although this Dell dropped out of the sky from the heavens without any kind of warning, so I am hoping we see some information from other manufacturers (looking at LG) in the next month and a half...

Several companies are releasing laptops with OLED displays, but this is the only game in town for a standalone, consumer-focused OLED monitor at the moment.
 
The supply probably isn't there yet. Most of the panels produced are going into the TV market, which is doing very well for LG. From what I've read, LG is opening a new plant around late 2017 or early 2018. I don't think the OLED market will really explode until then.

It's a good bet that this display will be top dog for at least a full year. I think it's unlikely we will see any more OLED consumer products this year, but certainly in 2017, and the tsunami will really start in 2018.

Wonder what manufacturer is behind this panel, though. I'm guessing LG, because 30" would be a huge step up from the OLED panel sizes Samsung has produced up until now.
 
The supply probably isn't there yet. Most of the panels produced are going into the TV market, which is doing very well for LG. From what I've read, LG is opening a new plant around late 2017 or early 2018. I don't think the OLED market will really explode until then.

It's a good bet that this display will be top dog for at least a full year. I think it's unlikely we will see any more OLED consumer products this year, but certainly in 2017, and the tsunami will really start in 2018.

Wonder what manufacturer is behind this panel, though. I'm guessing LG, because 30" would be a huge step up from the OLED panel sizes Samsung has produced up until now.

I would agree that LG is the most likely candidate due to the size. So far, the laptops getting the 13" OLED displays seem to indicate that Samsung is making that panel.
 
Hoping for a decent % off coupon to bring this below $4K when it releases.

 
According to Digitimes, the OLED panel is provided by Samsung. This most likely means the panel has an RGB pixel structure with self-emitting subpixels and should achieve higher brightness than the WOLED structure LG is employing. Maybe the reason for the 4K/120 Hz choice is that the panel is actually flickering rather than sample-and-hold, but this is just speculation.
 
I would think about getting this if it has no DRM (HDCP) and supports strobing. It would be a nice side monitor, as I already have two very nice CRTs.
 
I would think about getting this if it has no DRM (HDCP) and supports strobing. It would be a nice side monitor, as I already have two very nice CRTs.

Tell us something we haven't heard 10 million times :rolleyes:
 
Q: Will DisplayPort 1.3 enable further performance enhancements to 4K UHD displays?

A: Yes, when including the new HBR3 link rate option, DisplayPort 1.3 will enable a 4K UHD display to operate at a 120Hz refresh rate using 24-bit pixels, or a 96Hz refresh rate using 30-bit pixels.

So this pretty much dashes any hope of running this display (or any 4K display) at 120 Hz with 10-bit-per-channel color for now. You're going to have to dial it back to 8 bits per channel, which is unfortunate since it's an OLED.

http://www.vesa.org/faqs/
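Running the numbers from the VESA answer (raw pixel rates only; actual blanking intervals add a bit more, so treat this as a sanity check rather than the spec's exact accounting):

```python
# Why HBR3 allows 4K @ 120 Hz at 8 bpc (24-bit) but not at 10 bpc (30-bit).
# 8.1 Gbps/lane and 8b/10b coding are the commonly published DP 1.3 figures.
effective_gbps = 4 * 8.1 * 8 / 10      # 4 HBR3 lanes after 8b/10b coding

def pixel_rate_gbps(hz, bits_per_pixel):
    """Raw pixel data rate for a 3840x2160 mode, blanking ignored."""
    return 3840 * 2160 * hz * bits_per_pixel / 1e9

print(round(effective_gbps, 2))            # 25.92
print(round(pixel_rate_gbps(120, 24), 1))  # 23.9 -> fits
print(round(pixel_rate_gbps(96, 30), 1))   # 23.9 -> fits
print(round(pixel_rate_gbps(120, 30), 1))  # 29.9 -> does not fit
```

Which matches the FAQ exactly: 120 Hz at 24-bit and 96 Hz at 30-bit land at nearly the same data rate, while 120 Hz at 30-bit overshoots the link.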

And a big fat sigh right there..... Had my hopes up. Ah well.. another OLED panel it shall be eventually.
 
Virtually every computer monitor out there runs at 8-bit color. Hell, most TN panels are 6-bit.
 
And a big fat sigh right there..... Had my hopes up. Ah well.. another OLED panel it shall be eventually.

In programs where 10-bit color would help, 96 Hz should be OK. It sure beats 60!
However, Dell might screw with this thing to block modes that the idiots at Dell don't like. Dell's LCDs all have artificial refresh rate caps that block anything over about 75-77 Hz, even though the hardware would probably support more.

If I were buying an LCD, I'd take a Korean 1440p IPS over the name brands, as they are free of DRM and are not capped. I can't wait until generic OLEDs come out.
 
Virtually every computer monitor out there runs at 8-bit color. Hell, most TN panels are 6-bit.

I ran 12- or 16-bit color (it was a while ago, I don't remember the exact depth) with madVR and MPC-HC to my CRT via an AMD 7970. It is a CRT, though, so I don't know if that's relevant.
 
In programs where 10-bit color would help, 96 Hz should be OK. It sure beats 60!
However, Dell might screw with this thing to block modes that the idiots at Dell don't like. Dell's LCDs all have artificial refresh rate caps that block anything over about 75-77 Hz, even though the hardware would probably support more.

If I were buying an LCD, I'd take a Korean 1440p IPS over the name brands, as they are free of DRM and are not capped. I can't wait until generic OLEDs come out.

What does a DRM-free monitor get you? It's the endpoint, so what difference does it make?
 
And a big fat sigh right there..... Had my hopes up. Ah well.. another OLED panel it shall be eventually.

Yes, because we all know how important high refresh rates are in 10-bit workspaces.
 