If the Dell U2713HM has anywhere near a semi-gloss AG, I'm going to have a monitorgasm.
Beware the uniformity issues that may disturb you. Or wait for the coupon.
The luminance uniformity of the U2713HM was very good overall.
> With 14/16 bit internal display LUT (EIZO/NEC) and 8 bit output, what you can get is 100% grayscale on a variety of gamma curves.

Yes, there is no loss of tonal values, and the possibility of precise gradation in combination with excellent grey balance.
> What screen uniformity issues?

Without active homogeneity compensation, the panel area may suffer from unpleasant chroma or luminance differences. That is subject to individual panel variability, of course.
The screen coating on the U2713HM is a normal anti-glare (AG) offering. This is in contrast to a lot of other screens using variants of the LM270WQ1 panel, which offer a glossy screen coating. Readers will be pleased to hear, though, that the AG coating is actually nice and light, and is not the usual grainy and aggressive solution you would normally find on an IPS panel. In fact, in practice it is almost what you might call a semi-gloss coating, being quite similar to AU Optronics' AMVA offerings. Dell seem to have toned down the AG coating, which is great news. It retains its anti-glare properties to avoid unwanted reflections, but does not produce the overly grainy or dirty image that some AG coatings can.
With a 14/16 bit internal display LUT (EIZO/NEC) and 8 bit output, what you can get is 100% grayscale on a variety of gamma curves.
> I still don't get it, why can't you do that with an 8 bit LUT?

Because many input values would be mapped to the same output value; just think of gradation or whitepoint adjustments. Apart from that, the calculations (whose results are written into the LUT) wouldn't be as precise.
> Because many input values would be mapped to the same output value; just think of gradation or whitepoint adjustments – apart from that, the calculations (that are written into the LUT) wouldn't be as precise.

The calculations could and should be more precise without the table being wider than 8 bit. Could you give an example of what exactly would go wrong with an 8 bit table?
> The calculations could and should be more precise without the table being wider than 8 bit.

No. The calculations are far more precise in a wider space. This is best practice not only regarding display LUTs of high-end LCDs. Another example in the same context would be CMM transformations in the PCS.
> Could you give an example of what exactly would go wrong with an 8 bit table?

A display with 100% tonal range at a gamma 2.2 gradation would lose about 25% of the 8-bit input signal when calibrated via an 8-bit LUT to achieve the L* characteristic.
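The mechanism can be sketched numerically. The snippet below is my own illustration, not from the thread: it assumes an idealised panel with a native gamma of 2.2 being remapped to the CIE L* characteristic, and counts how many distinct output codes survive when the correction table is quantised to 8 bits versus 16 bits. The exact percentage lost depends on the assumptions, so this shows the mechanism rather than reproducing the 25% figure.

```python
# Sketch (assumed idealised panel, gamma 2.2 native, L* target curve):
# count distinct output codes after quantising the correction LUT.

def lstar_to_Y(v):
    """CIE L* EOTF: relative luminance for lightness L = 100*v."""
    L = 100.0 * v
    return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3

def lut_entry(i, bits):
    """LUT value that drives a gamma-2.2 panel so input code i displays L*."""
    v = i / 255.0
    panel_code = lstar_to_Y(v) ** (1.0 / 2.2)  # invert the panel gamma
    levels = (1 << bits) - 1
    return round(panel_code * levels)

unique8 = len({lut_entry(i, 8) for i in range(256)})
unique16 = len({lut_entry(i, 16) for i in range(256)})
print(f"8-bit LUT:  {unique8}/256 distinct output codes "
      f"({256 - unique8} input values collapsed together)")
print(f"16-bit LUT: {unique16}/256 distinct output codes")
```

With the 8-bit table, neighbouring input codes round to the same output wherever the correction curve's slope drops below one, so tonal values are permanently merged; the wide internal table keeps all 256 levels distinct before the final output stage.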
Honestly – judging by reviews, this is my dream monitor come true. It's as if Dell read all my posts about the U2412M and created a monitor specially for me... Can't wait to see it in person.
Best regards
Denis
> I think we're talking about different things here.

I don't think so, but to clear it up: it makes sense to implement a LUT far more precise than the input signal, to avoid a loss of tonal values and to raise the precision of the tonal transformations. Of course you don't raise the tonal range above the input signal; there is no interpolation of new colors which aren't in the input signal.
A dithering stage at the end (in the scaler, or panel-internal) helps to "rescue" the tonal range.
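How dithering "rescues" levels that plain requantisation would merge can be shown with a toy example (my illustration, not the panel's actual circuit): a shallow ramp of fine-grained levels is reduced to 8-bit codes, once by plain rounding and once through a simple 1-D error-diffusion dither.

```python
# Requantising a fine-grained signal to 8 bits, with and without dithering.

def quantize(values):
    """Plain 8-bit rounding."""
    return [round(v) for v in values]

def dither(values):
    """1-D error diffusion: carry each rounding error into the next pixel."""
    out, err = [], 0.0
    for v in values:
        q = round(v + err)
        q = min(255, max(0, q))
        err += v - q            # error = desired minus emitted
        out.append(q)
    return out

# A shallow ramp of sub-8-bit levels between adjacent integer codes.
ramp = [100 + i / 64 for i in range(256)]   # 100.0 .. ~104.0

plain = quantize(ramp)
dith = dither(ramp)

print("distinct levels after plain rounding:", len(set(plain)))
avg_err_plain = sum(abs(p - v) for p, v in zip(plain, ramp)) / len(ramp)
avg_err_dith = abs(sum(dith) - sum(ramp)) / len(ramp)
print(f"mean level error: plain {avg_err_plain:.3f}, dithered {avg_err_dith:.4f}")
```

Plain rounding collapses the ramp onto a handful of codes and carries a sizeable mean error; the dithered version reproduces the intermediate levels on average, at the cost of slight local noise – which matches the thread's point that the only real drawback is mild noise in dark tones.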
> Circuit delay = input lag

I don't agree. Circuit delay is unambiguous and a term used in industry, whereas input lag is a term coined by gamers posting on forums; it isn't described or measured consistently.
> A 30" U3013, with sRGB native...

That's a nice thought and it could happen. <8ms latency would make it a winner for me, but the scaler and image processing are probably more useful from Dell's perspective.
> Ah! Output is not 8 bit.

OK, there was our misunderstanding. Yes, FRC dithering stages are used. Here is a simplified workflow:
> I thought it degraded other aspects of quality.

There are only minor drawbacks – for example, slight noise in dark tonal values.
> If dithering is indeed being used, wouldn't the video card be able to take care of it?

One could think of such solutions, but when calibrating via the 8-bit video card LUT you will experience a "hard cut", with increasing banding the more corrections are calculated. My external video processor from Lumagen, with its extensive CMS, uses at least spatial dithering for 8-bit output.
> Ich spreche keine Deutsch. [I don't speak German.]

I haven't found the whitepaper in English, but the diagram is quite helpful.
> i know this is off-topic but you really have to write "Best regards, Denis" on every single post? put it in your sig then, this is annoying.

Consuetudo est altera natura ("habit is second nature"). I will take this into consideration.
> Unfortunately they missed the posts about input lag. The HP is still quite a bit faster.

Um, yeah. But the Samsung S27B970 was measured as having 29ms of input lag, and it seemed to be very responsive in practice. This one is measured to have 22ms of input lag, which is even faster, so that's very OK by me.
Off-topic to talk about signatures, but since it has come up already, perhaps it's worth pointing out that aside from good etiquette, the forum rules ask for post signing to be confined to the signature box:

> (25) You are allowed one signature per post, do not put your signature in the body of your post. A signature section is supplied for that usage. No one wants to see your name on your posts multiple times as it is annoying.

Best regards
Murzilka
Not only that, but the HP has much more ghosting, almost as if it doesn't really have active overdrive.
It often looks like there is a tradeoff between good overdrive circuitry and lag. Almost all PVA/MVA panels lag a fair bit; there seem to be a few that lag minimally, but they tend to have fairly horrible ghosting.
I had a Dell 2405 that must have had horrendous lag, because I could feel it – it was like everything was moving through molasses (though it also had tons of ghosting). Then I got my NEC, which has about 33ms of lag. I don't feel that at all.
Here we get 22ms lag, but with very good overdrive. Only the 120Hz TN panels beat it in the ghosting/blur tests. This is an awesome overall result for a good all-round IPS screen.
> Anyone know if this display can accept a 2560 signal over HDMI?

Most unlikely. ;-)
> Any reason for that? HDMI 1.4 supports 4k resolutions

At 24Hz, yes...
> ...and I know some of the Korean panels (the microcenter one in particular) can be driven at 2560 by HDMI. Considering this is a fairly modern monitor, I think it's plausible.

Is it actually outputting 2560x1440 over the HDMI, or is it being scaled by the monitor? I'm not saying it isn't doing it; I'm just curious if you know for sure that it is properly running at 1440p over HDMI. I have had... limited success using HDMI for almost anything.
Any reason for that? HDMI 1.4 supports 4k resolutions, and I know some of the Korean panels (the microcenter one in particular) can be driven at 2560 by HDMI. Considering this is a fairly modern monitor, I think it's plausible.
I'm not sure, but from what I understand HDMI supports 4K resolution at a 24Hz refresh rate; it will not support 2560x1440 @ 60Hz.
Sn0_Man beat me to it
I was definitely getting 2560x1440 – no scaling whatsoever. The picture was identical to what I got over DisplayPort. HDMI definitely supports it – the signal is 340MHz, which should allow for slightly higher resolutions than dual-link's 165MHz per link. Admittedly there aren't many devices that support it, but the $400 Micro Center monitor definitely does. So I was hoping someone had tried it on this Dell.
Some simple math would show 4K at 24Hz needs about the same bandwidth as 2560x1440 at 60Hz. It's under HDMI's limit either way at 8-bit color.
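The "simple math" can be spelled out. This is a back-of-the-envelope sketch counting active pixels only; real HDMI timings add horizontal and vertical blanking, which I approximate here with an assumed ~15% overhead rather than exact CEA/CVT timings.

```python
# Compare raw pixel rates of 4K@24Hz and 2560x1440@60Hz against the
# 340 MHz TMDS clock ceiling of HDMI 1.3/1.4.

def active_pixel_rate(w, h, hz):
    """Pixels per second for the active area only (no blanking)."""
    return w * h * hz

r4k24 = active_pixel_rate(3840, 2160, 24)     # "4K" at 24 Hz
r1440p60 = active_pixel_rate(2560, 1440, 60)  # 2560x1440 at 60 Hz

print(f"4K@24Hz    : {r4k24 / 1e6:.1f} Mpix/s")
print(f"1440p@60Hz : {r1440p60 / 1e6:.1f} Mpix/s")

HDMI_MAX = 340e6        # max TMDS clock, HDMI 1.3/1.4, 8-bit color
BLANKING = 1.15         # assumed rough reduced-blanking overhead

for name, r in [("4K@24Hz", r4k24), ("1440p@60Hz", r1440p60)]:
    verdict = "fits" if r * BLANKING < HDMI_MAX else "does not fit"
    print(f"{name}: {verdict} under the 340 MHz limit")
```

The two modes land within about 10% of each other (roughly 199 vs 221 Mpix/s of active pixels), and even with blanking overhead both sit comfortably under 340 MHz – consistent with the post's claim.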
Edit: Found this – a U2711 running at 2560x1440@60Hz over HDMI by using a hacked driver. So it is plausible that the U2713 natively supports it. Maybe.