Samsung 31.5" 4K monitor model UD970 support DVI, HDMI 2.0, Display Port

Happy Hopping

http://www.pcauthority.com.au/Featu...amsungs-28-and-32-inch-ultra-hd-monitors.aspx

The UD970's stand and bezel are crafted from aluminium, which looks great and should stand up to a few years of punishing use. Like any serious pro-level monitor, the 32-inch display has a USB hub built into the left-hand rear edge of the screen, with four USB 3.0 inputs stacked vertically. Two DisplayPort 1.2, one HDMI 2.0 and one dual-link DVI port make for a pretty hefty connectivity suite.

If it really has HDMI 2.0, then it's a winner, but what on earth is dual-link DVI doing here? How can you have enough bandwidth for 3840x2160 over dual-link DVI?
 
For future-proofing, IMHO it needs a bigger size (for its resolution) more than it needs HDMI 2.0, e.g. starting from 39". Otherwise the new 50" Panasonic TC-50AX800U, with HDMI 2.0 and DP 1.2 for 60p and at a cheaper price (if this Samsung is yet another Sharp 32" clone), seems the better choice to me.
 
what on earth is dual-link DVI doing here? How can you have enough bandwidth for 3840x2160 over dual-link DVI?
I run a 1440p monitor at 120Hz over dual-link DVI every day; that's roughly equivalent bandwidth to 4K at 60Hz (quick back-of-envelope check at the end of this post). The dual-link DVI spec actually has no cap, merely a required "floor" that every cable has to meet.

Now, the hardware at either end to produce and accept that much bandwidth hasn't really been officially supported very much (It's an overclocked Korean monitor and the 670 I use has modded drivers) but hey, the spec can do it.

Now, with DisplayPort and HDMI 2.0, DVI starts looking pretty useless, but it's not as out of place as you'd think.

PS @others: 31.5 inches is too BIG not too SMALL for 4k res.
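
Quick back-of-envelope check, as promised above (a rough sketch in Python; this compares raw active-pixel rates only and ignores blanking overhead, which adds roughly 5-20% depending on the timing used):

```python
# Raw active-pixel rates (no blanking): 1440p @ 120 Hz vs 4K @ 60 Hz.
rate_1440p120 = 2560 * 1440 * 120   # ~442 Mpx/s
rate_4k60     = 3840 * 2160 * 60    # ~498 Mpx/s

print(f"1440p @ 120 Hz: {rate_1440p120 / 1e6:.0f} Mpx/s")
print(f"4K    @  60 Hz: {rate_4k60 / 1e6:.0f} Mpx/s")
print(f"ratio: {rate_1440p120 / rate_4k60:.2f}")  # ~0.89, i.e. the same ballpark
```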
 
For future-proofing, IMHO it needs a bigger size (for its resolution) more than it needs HDMI 2.0, e.g. starting from 39". Otherwise the new 50" Panasonic TC-50AX800U, with HDMI 2.0 and DP 1.2 for 60p and at a cheaper price (if this Samsung is yet another Sharp 32" clone), seems the better choice to me.

I went from a 52" Bravia on my desktop--which was like sitting in the front row at a movie theater--to a 32" Vizio 120Hz OC
to a 39" Vizio 120Hz OC
to a 27" Asus 144Hz
and finally to a 23.5" Eizo.

I can't see how any desktop monitor over 32" is practical. The 39" was too big. The 32" was just inside of acceptable, but still overpowered my desk.
24"-32" looks normal and I can see all of it without turning my neck.
 
PS @others: 31.5 inches is too BIG not too SMALL for 4k res.

Depends what you want. A small monitor with high-dpi OS support will look really smooth, but won't give any additional information on the screen.

A big 40-50" 4K monitor without high-dpi OS support is essentially a single-screen replacement for current dual or triple-display setups in terms of the amount of information visible on the screen at any given time.
 
PS @others: 31.5 inches is too BIG not too SMALL for 4k res.
I'll disagree. For a while I used a 50" Skyworth UHD TV as a monitor. It was just the right size, but 30Hz was below comfortable, both on the desktop and in games.

My reasoning about size: most OSes (and the applications written for them, including the most widely used ones, and often ones written by the OS vendor itself) have varying amounts of trouble with font/UI element scaling. With a 50" UHD screen it's like you have four normal 24" Full HD monitors, just without the bezels in between. No scaling issues at all, just a quadruple desktop.

Head turning? No problem, just like the guys who have used multi-monitor setups for years: run apps on part of the screen and leave the rest, covered by peripheral vision, for the apps you aren't currently focused on, for multitasking and occasionally switching or glancing over. Even on a 27" at common resolutions I usually have at least two apps side by side and focus on part of the screen. At UHD you just have more screen estate: more browsers/terminals/mail clients/miscellaneous apps side by side, without window switching, that you can easily glance at and copy-paste between.

In games, the big screen size was to my taste a godsend, covering a wider angle of vision and adding a lot to the sense of immersion. With UHD you get that bigger screen without the pixels getting ugly big; they stay normal, like the ones you had on 27-30" before. It's the big screen size I enjoyed most in games; the resolution kept it from getting ugly.

Just don't use maximized apps (except, of course, movies and games, where that wide field of view is nice); use more of them simultaneously. Pity that the lameness of the new Metro-ish GUIs, inherited from mobile gadgets, goes in the other direction. Heck, even a browser maximized at 1920x1200 is ugly and uncomfortable UI-wise, yet on half the screen, in portrait proportions, it's just the right thing for reading longer forum threads or books.
If some complained that even 39" seemed a bit too small to read comfortably at default DPI, then 31.5", 28", or, for god's sake, 24" UHD is much worse. Hello to turning DPI up to 150%+, hello to pixel hunting in unscaled app UIs, hello to dialog windows with text that doesn't fit and misplaced buttons. Hence for myself I'd prefer at the very least 39" for such resolutions (pity Asus didn't start making/selling the 39" model it announced), preferably 42-50". At least while living in a real world with far-from-ideal font/UI scaling support across apps, I have no wish to fight with that or live with it. Resolution and screen size should go hand in hand for the most enjoyable and problem-free experience. 39" is too large... for Full HD. We are talking about UHD. Hence I mentioned screen size as another thing I require from my next monitor, to be 'future-proofed' for the years to come.
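
For what it's worth, the "four 24-inch Full HD monitors without bezels" comparison checks out numerically. A quick pixel-density sketch (Python, standard library only; it assumes the quoted sizes are exact diagonals):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'24.0" 1080p: {ppi(1920, 1080, 24.0):6.1f} PPI')  # ~91.8
print(f'50.0" UHD:   {ppi(3840, 2160, 50.0):6.1f} PPI')  # ~88.1, nearly the same
print(f'31.5" UHD:   {ppi(3840, 2160, 31.5):6.1f} PPI')  # ~139.9, needs DPI scaling
```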
 
I run a 1440p monitor at 120Hz over dual-link DVI every day; that's roughly equivalent bandwidth to 4K at 60Hz. The dual-link DVI spec actually has no cap, merely a required "floor" that every cable has to meet.

Now, the hardware at either end to produce and accept that much bandwidth hasn't really been officially supported very much (It's an overclocked Korean monitor and the 670 I use has modded drivers) but hey, the spec can do it.

Now, with DisplayPort and HDMI 2.0, DVI starts looking pretty useless, but it's not as out of place as you'd think.

PS @others: 31.5 inches is too BIG not too SMALL for 4k res.

Are you saying the UD970 is overclocked? If so, by how much?

They said dual-link DVI doesn't have enough bandwidth to support 4K resolution.

======================================

All I care about is whether the source is accurate. He's the only one who mentioned HDMI 2.0; the other link only says HDMI, although given there is an HDMI port, it would make sense for it to be HDMI 2.0.

And of course, the release date, which Samsung still hasn't announced.
 
I have no idea regarding the veracity of the source; I was just saying that dual-link DVI can in fact do 4K 60Hz, despite people not realizing it. There is no cap in the spec.

I have no idea regarding the overclockability of this monitor obviously, as I don't have one. I have no idea what HDMI revision will be included either.
 
You're the only one I know who has said that. I can't remember which thread it was, either the Asus PQ321 thread or this one, but other members said it has to be DisplayPort, that dual-link DVI simply doesn't have enough bandwidth.

http://en.wikipedia.org/wiki/Dvi

and according to the above, it can only support 3840x2400 at 33Hz, which is too low
 
You're the only one I know who has said that. I can't remember which thread it was, either the Asus PQ321 thread or this one, but other members said it has to be DisplayPort, that dual-link DVI simply doesn't have enough bandwidth.

http://en.wikipedia.org/wiki/Dvi

and according to the above, it can only support 3840x2400 at 33Hz, which is too low

You're probably right. But don't overthink it: 1440p over DVI is what it's there for. I imagine the monitor can do 120Hz at 1440p, but maybe not.
 
I'd try my best to avoid trusting Wikipedia on technical specifications.

The actual DVI specification is available online (http://www.microprocessor.org/DVISpec.pdf).

The specification avoids putting a cap on the total bandwidth available in dual-link scenarios. It also explains that the reason there is a cap on single-link is to give a defined point where dual-link operation is required, so that there is never any question as to which link mode should be enabled (ensuring broader compatibility). It further states that the 165MHz limit for each link may be exceeded when the total required bandwidth is above 330MHz. Direct quote:
[...] the first link can operate at above 165MHz T.M.D.S. clock only in the case of the total bandwidth requirement surpassing 330MHz T.M.D.S clock.

Basically, Wikipedia doesn't know shit, and quality DVI cables are capable of pushing bandwidths far in excess of 330MHz, in fact sufficient to run 4K at 60Hz. I've been running that bandwidth for almost 2 years.

However, it would not be surprising if the monitor in this thread lacks the hardware on its side to accept that much bandwidth through the DVI connection, in which case the DVI input on this monitor would be pretty pointless for users who can't accept 30Hz.
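
Rough numbers to make that concrete (a sketch in Python; the 533.25MHz figure is the commonly cited CVT reduced-blanking pixel clock for 3840x2160@60, and dual-link DVI carries alternate pixels on each of its two TMDS links):

```python
# Dual-link DVI splits the pixel stream across two TMDS links,
# so each link runs at half the total pixel clock.
PIXEL_CLOCK_4K60_RB = 533.25e6  # Hz, CVT-RB timing for 3840x2160 @ 60 Hz

per_link_mhz = PIXEL_CLOCK_4K60_RB / 2 / 1e6
print(f"per-link TMDS clock: {per_link_mhz:.1f} MHz")  # ~266.6 MHz

# Far above the 165 MHz single-link ceiling, but the spec text quoted
# above explicitly allows a link to exceed 165 MHz once the total
# requirement passes 330 MHz, which 4K60 clearly does.
print("total requirement exceeds 330 MHz:", PIXEL_CLOCK_4K60_RB > 330e6)
```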
 
I seriously doubt 4K@60Hz will ever be widely supported over DVI, regardless of the fact that the spec doesn't forbid it, simply because DVI is rare on the new crop of 4K monitors (probably considered a legacy connector, since this is a professional-targeted product) and DVI in general is on the way out. Perhaps you'll be able to make it work on Linux or with hacked Nvidia drivers, but that's about it, and that sort of usage doesn't matter to very many people.
 
And just like with HDMI 2.0, it will take ages for vendors to actually implement DP 1.3 in real products, and on both sides of the cable at that. At some point one has to decide how long to wait, as there will always be something better in the future.
 
You're the only one I know who has said that. I can't remember which thread it was, either the Asus PQ321 thread or this one, but other members said it has to be DisplayPort, that dual-link DVI simply doesn't have enough bandwidth.

http://en.wikipedia.org/wiki/Dvi

and according to the above, it can only support 3840x2400 at 33Hz, which is too low

Well, the very Wikipedia link you used states:

Dual link maximum data rate is limited only by the bandwidth limits of the copper the DVI cable is constructed of and by the DVI signal's source.

Also, 3840x2400 is an EXAMPLE RESOLUTION, not a cap/limit.

Re-read the article and pay attention to this part:
Minimum clock frequency: 25.175 MHz
THERE IS NO MAXIMUM CLOCK FREQUENCY IN THE DVI SPEC!!

Grossly applying the typical 300-340MHz pixel clock limit of HDMI 1.4 implementations to each link, 3840x2160@72Hz under DL-DVI is not unfeasible (rough math at the end of this post). Under HDMI 2.0's 600MHz pixel clock limit, the monitor engineer would be constrained by the TCON long before any bandwidth limits kicked in.

DVI was built to last; HDMI was built to force us to change TVs, players and graphics cards every 2 years.
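
The rough math (a sketch in Python; the ~5% blanking overhead is an assumed reduced-blanking figure, and real timings vary):

```python
# Approximate max refresh at 3840x2160 for a given total pixel clock.
ACTIVE_PIXELS = 3840 * 2160
BLANKING = 1.05  # assumed ~5% reduced-blanking overhead

for label, clock_mhz in [("DL-DVI at 2 x 340 MHz", 680),
                         ("HDMI 1.4 at 340 MHz", 340),
                         ("HDMI 2.0 at 600 MHz", 600)]:
    max_hz = clock_mhz * 1e6 / (ACTIVE_PIXELS * BLANKING)
    print(f"{label}: ~{max_hz:.0f} Hz max at 4K")

# DL-DVI at 2 x 340 MHz: ~78 Hz -> 4K72 is indeed not unfeasible
# HDMI 1.4 at 340 MHz:   ~39 Hz -> hence the 4K30 ceiling on HDMI 1.4
# HDMI 2.0 at 600 MHz:   ~69 Hz -> comfortably covers 4K60
```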
 
If we get this monitor and use the HDMI 2.0 interface, would that fix the infamous MST problem?
 
But there are no games with 4K support, right? Maybe 1-2.
Also, Blu-ray is only 1080p.

So I think 4K is too much at the moment.
 
HDMI 1.4 increases the maximum resolution to 4K × 2K, i.e. 3840×2160 (4K Ultra HD) at 24 Hz/25 Hz/30 Hz or 4096×2160 at 24 Hz (which is a resolution used with digital theaters);

HDMI 1.4b was released on October 11, 2011.[153] One of the new features is that it adds support for 3D 1080p video at 120 Hz, allowing the frame-packing 3D format at 1080p60 per eye (120 Hz total).

HDMI 2.0 increases the maximum TMDS per channel throughput from 3.4 Gbit/s to 6 Gbit/s which allows for a maximum total TMDS throughput of 18 Gbit/s.[156][157] This allows HDMI 2.0 to support 4K resolution at 60 frames per second (fps)

The goal of DisplayPort 1.3 is to support 8K resolution, possibly with light compression (HDMI can use color channel compression). The bandwidth of DisplayPort 1.3 is said to reach 8.1 Gbps per channel, or 32.4 Gbps in total. For comparison, HDMI 2.0 caps out at 18 Gbps. Besides 8K support (7680 × 4320 and 8192 × 4320), it will enable users to power two 4K displays with a single cable, and it will also add support for modes such as 4K@120 Hz and 3D in 4K resolution. Like DisplayPort 1.2, it will also support color depths higher than 8-bit, such as 10-bit and 12-bit per primary color. In other words: amazing picture quality.
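
To sanity-check the HDMI 2.0 figure against 4K60 (a sketch in Python; it assumes the CEA 4K60 timing of 4400x2250 total at a 594MHz pixel clock, and TMDS's 8b/10b encoding, which leaves 80% of the raw rate for payload):

```python
# Does 4K60 at 8 bits per color fit HDMI 2.0's 18 Gbit/s?
raw_gbps = 18.0
payload_gbps = raw_gbps * 8 / 10     # 14.4 Gbit/s after 8b/10b encoding
needed_gbps = 594e6 * 24 / 1e9       # 594 MHz x 24 bits/pixel = 14.256 Gbit/s

print(f"payload {payload_gbps:.2f} Gbit/s vs needed {needed_gbps:.2f} Gbit/s:",
      "fits" if needed_gbps <= payload_gbps else "does not fit")
# It fits, with almost nothing to spare: 4K60 at 8 bpc is just about
# exactly what HDMI 2.0 was sized for.
```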


Of course, at the moment you need at least two powerful GPUs to average 100fps in the more demanding games at max settings even at 2560x1440 (and a few are more like 60fps). As newer GPUs come out (20nm at the end of the year), more demanding games will come out too (2015), so in regard to graphics ceilings I doubt you would see much fps gain at very high resolution + high Hz for a long time, unless there were a huge leap in GPU power across the pricing hierarchy. That makes a 4K 120Hz panel using DisplayPort 1.3 a far-off release, and high fps achievable by me on one even more distant. The cart is well before the horse on this one as far as gaming goes. Like others have said, the goal would be 4K at 120Hz (with g-sync/dynamic Hz and backlight-strobing modes) and the GPU power to average 100fps, so personally that is off the table completely for a few years, considering the state of things. I have no interest in playing games at 30fps and 60Hz. I could see these being cool as desktop/app monitors in the meantime, though.

GTX 780Ti Benchmarks 1x-4x SLI (Work in Progress)

Putting this here just for some insight since people are mentioning sizes and comparisons.
http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_30in_2560_same-ppi.jpg
I agree about giant monitors not being great for gaming. I've had a 37" one before. Unless games let you define a virtual primary monitor space in the middle, with all the extra extent added as FoV for immersion, it would just be a giant wall of the same screen elements and eye-bending to the periphery.
 
But there are no games with 4K support, right? Maybe 1-2.
Also, Blu-ray is only 1080p.

So I think 4K is too much at the moment.

So is 2560x1600.
Odd that I've been using it since 2007, waiting for a worthwhile upgrade...
(which will likely be the first larger PWM-problem-free 4K monitor)

For those only interested in 1080-ish content, 4K also provides the option for flicker-free 3D at full resolution, in case a manufacturer happens to combine it with a polarization filter as we already see on 4K TVs.
 
I'm running 2560 x 1440 with a 6ft dual-link DVI cable.

Will I lose bandwidth if I switch to a 10 ft cable?

Why does fps drop from 60 to 55 while I'm running the Unigine benchmark in full screen?
 
Bandwidth is the same for all cable lengths on the same standard interface. Very long runs might just produce more transmission errors, and some out-of-spec modes may not work.
There could be lots of reasons for such a small fps drop. Most probable to me: slightly higher resolution / more GPU work in full screen versus windowed mode, where part of the screen goes to window UI controls and the like rather than the benchmark's 3D content.
 
Bandwidth is the same for all cable lengths on the same standard interface. Very long runs might just produce more transmission errors, and some out-of-spec modes may not work.
There could be lots of reasons for such a small fps drop. Most probable to me: slightly higher resolution / more GPU work in full screen versus windowed mode, where part of the screen goes to window UI controls and the like rather than the benchmark's 3D content.

Thank you. I thought fps was constant, determined by the hardware at any given resolution. BTW, I used glxgears in Linux to measure the framerate.
 
Speaking of cable length, for those of us who like to do a longer run from the computer to the desk: DisplayPort 1.2 drops out on extended runs. I've run two 25' 24AWG extensions off my 6990 for a few years without issue, but with my new 780 Ti the output signals are weaker and I've had less success keeping both monitors on that setup (a 2560x1440 60Hz IPS and a 1080p 120Hz TN). From what Vega has told me, the higher bandwidth requirements of 4K 60Hz and 2560x1440 120Hz fail on DisplayPort cables longer than 10' as well. There are no active DisplayPort 1.2 extender cables that either of us could find, only 1.1, and that isn't enough bandwidth. I don't know if DisplayPort 1.3 will be any better.

Both of us are going to have to move our PCs a lot closer to our desks and monitors now. I'm eyeing the upcoming Asus 2560x1440 120Hz 27" monitor with g-sync/ULMB modes (it uses DisplayPort 1.2 without any dual-monitor workarounds).

Edit: I also run HDMI 35' from my HTPC to my living room, and I've done 25' DVI-D and HDMI runs to my PC no problem. DisplayPort has much weaker signal strength even if it is higher bandwidth.
 
If we get this monitor and use the HDMI 2.0 interface, would that fix the infamous MST problem?

Depends on whether you have an Nvidia or AMD card.

AMD cards have the requisite 600MHz pixel clock and can do 4K60 through HDMI with a firmware update. Nvidia cards only have a 400MHz pixel clock, so bandwidth-wise they can only pass 4K30. An HDMI 2.0 monitor would therefore be useless for Nvidia hardware right now. Nvidia probably won't have a 600MHz RAMDAC until 20nm Maxwell at the end of the year or in 2015. AMD has been ready for HDMI 2.0 for over a year.
 
dragonageinquisition: one thing makes me very suspicious, though: if AMD cards are capable of fully using HDMI 2.0's capabilities, why has AMD never claimed so? It would rake in a lot of potential buyers seeking more future-proof GPUs, so if AMD hasn't claimed it, I strongly suspect the hardware is not capable, or has other potential problems/bugs in such a mode. Hence the only sure way with current GPUs from either camp to get 4K60p is via DP 1.2 MST with the few current UHD monitors and Panasonic UHD TVs, or with the few older, very expensive 4K monitors/TVs whose input was fed by several HDMI 1.4 / dual-DVI or similar interfaces (which again limits the choice to AMD, as Nvidia disabled 2xN setups for Nvidia Surround on gaming GPUs (Quadro cards IIRC have that support enabled), except for hacked-in support for a few whitelisted EDIDs of the newest DP 1.2 MST UHD monitors). HDMI 2.0 on GPUs will be usable at best with next-gen GPUs (hmm, half a year or a year in the future?).
 
Depends on whether you have an Nvidia or AMD card.

AMD cards have the requisite 600MHz pixel clock and can do 4K60 through HDMI with a firmware update. Nvidia cards only have a 400MHz pixel clock, so bandwidth-wise they can only pass 4K30. An HDMI 2.0 monitor would therefore be useless for Nvidia hardware right now. Nvidia probably won't have a 600MHz RAMDAC until 20nm Maxwell at the end of the year or in 2015. AMD has been ready for HDMI 2.0 for over a year.

So the whole "bug" problem that people only see 1/2 a screen, among other small bugs on the MST is because of Nvidia? I thought it's a design flaw on Display port?

And I thought Nvidia is ahead of AMD, there is no way they would let AMD moves ahead by that much on this issue for 2 yr.?
 
So the whole "bug" problem that people only see 1/2 a screen, among other small bugs on the MST is because of Nvidia? I thought it's a design flaw on Display port?

And I thought Nvidia is ahead of AMD, there is no way they would let AMD moves ahead by that much on this issue for 2 yr.?

Well... what that post is saying is that AMD cards have a RAMDAC with enough bandwidth to allow 4K at 60fps on a single video stream, whereas on Nvidia cards the RAMDAC doesn't have enough bandwidth, so they would need two streams, and HDMI 2.0 is impossible on Nvidia cards.
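
For context, here's why the first-generation 4K monitors resort to MST at all (a rough sketch in Python; the two-tile 1920x2160 arrangement is how panels like the PQ321Q and UP3214Q present themselves, and the ~5% blanking overhead is an assumption):

```python
# DP 1.2 MST 4K monitors appear as two side-by-side 1920x2160 tiles,
# so hardware limited to ~400 MHz per stream can still drive 4K60.
def pixel_clock_mhz(w, h, hz, blanking=1.05):  # assumed reduced blanking
    return w * h * hz * blanking / 1e6

print(f"single 4K60 stream: ~{pixel_clock_mhz(3840, 2160, 60):.0f} MHz")  # ~523
print(f"per MST tile:       ~{pixel_clock_mhz(1920, 2160, 60):.0f} MHz")  # ~261

# The infamous "half a screen" bug is exactly what it looks like when
# only one of the two tiles gets initialized.
```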
 
So the whole "bug" problem that people only see 1/2 a screen, among other small bugs on the MST is because of Nvidia? I thought it's a design flaw on Display port?

And I thought Nvidia is ahead of AMD, there is no way they would let AMD moves ahead by that much on this issue for 2 yr.?

No, the MST issues aren't because of Nvidia. Check the Dell forum: plenty of people with AMD cards are having the same issues as Nvidia users with the UP3214Q.

I've done some research, though, and apparently Asus has fixed all those issues with a firmware update on their 4K monitor.
 
I'm running 2560 x 1440 with a 6ft dual-link DVI cable.

Will I lose bandwidth if I switch to a 10 ft cable?

Why does fps drop from 60 to 55 while I'm running the Unigine benchmark in full screen?

I have pushed 100+ Hz @ 2560x1440 even on a 50-foot cable. As long as it's a high-quality/decent cable, you will be fine.


Well... what that post is saying is that AMD cards have a RAMDAC with enough bandwidth to allow 4K at 60fps on a single video stream, whereas on Nvidia cards the RAMDAC doesn't have enough bandwidth, so they would need two streams, and HDMI 2.0 is impossible on Nvidia cards.

I don't see what a RAMDAC has to do with the limitations of HDMI/DVI or DP. The RAMDAC is only used for the old 15-pin VGA analog output. I believe the current drivers have a hard-set 400MHz pixel clock limit, which probably does stem from the RAMDAC (the highest limit until DP came along) but has nothing to do with the hardware's capabilities. We already know plenty of people running well over a 500MHz pixel clock on Nvidia with a simple patch that removes the limit. No display that needs >400MHz pixel clock has shipped yet, so I think Nvidia simply hasn't updated its drivers. I don't believe it's a hard hardware limit anywhere, unless you have a quote from Nvidia specifying otherwise.
 
I've done some research, though, and apparently Asus has fixed all those issues with a firmware update on their 4K monitor.

I read that Dell thread. And in another thread

http://hardforum.com/showthread.php?t=1794348&highlight=

I have been saying that I have those same problems and I DO NOT use a 4K monitor. I don't own one. So I thought it was a DisplayPort technology screw-up in conjunction with the Nvidia driver.

Now, Asus can't just have a firmware update; they don't make that 4K monitor, Sharp did, so if Asus has a fix, so should Dell. Where's your Asus link?
 
I read that Dell thread. And in another thread

http://hardforum.com/showthread.php?t=1794348&highlight=

I have been saying that I have those same problems and I DO NOT use a 4K monitor. I don't own one. So I thought it was a DisplayPort technology screw-up in conjunction with the Nvidia driver.

Now, Asus can't just have a firmware update; they don't make that 4K monitor, Sharp did, so if Asus has a fix, so should Dell. Where's your Asus link?

There is a firmware update for the ASUS in this thread... https://forums.geforce.com/default/...ting-it-to-work-and-debunking-a-few-myths-/1/

It fixes the cold-boot issue that others have been experiencing with the Dell units. I think this issue is somehow related; I looked around and can't find anyone complaining about sleep issues with the Asus. A rep also mentioned in that thread that new units are shipping with the updated firmware.

Lastly, I was contacted by the rep on the Dell forums, who told me I could now request an exchange for a monitor with the updated firmware. He didn't say what the firmware fixes, only that new units would be shipping with it in 3 months.

I'm going to try reaching out to the Asus contact listed in that thread to see if they've seen this issue on the PQ321Q.
 
But there are no games with 4K support, right? Maybe 1-2.
Also, Blu-ray is only 1080p.

So I think 4K is too much at the moment.

Movies such as Skyfall and Oblivion are only mastered at 2K as well due to greed. Even more infuriating given the latter used a state-of-the-art digital camera. :mad:

Unless Universal is willing to spend the few million dollars and roughly 4 months needed to re-render at 4K. Like that will happen. :rolleyes:
 
The only people getting 4 Titans are people benching Heaven or chasing 3DMark world records. In real-life gaming, even 4K doesn't scale beyond 3 cards anyway, and sometimes even 2 cards are faster.
 