Ultrawide or 16:9 4K?

Hi guys, I'm wondering if any of you have tried both Dell options, the 32" 4K and the ultrawide U3415W, and what you'd recommend. I'm mostly going to be gaming: FPS, TPS, and action games like Arkham Knight, The Witcher 3, Metal Gear Solid V, etc. I currently have a Dell U3011 and am looking to upgrade, I just don't know which to choose. Is it worth the extra money for the 4K one, or is the ultrawide better? What do you guys think? :confused:
 
Ultrawide is easier to drive, and you potentially get G-Sync. At 4K you're pretty much limited to FreeSync, aside from a crappy TN Acer 28" 4K G-Sync monitor.
 
Sorry, but what advantage can you show for G-Sync over FreeSync, exactly? I have seen many head-to-head matchups between the two technologies, and the difference is not distinguishable on decent or better monitors from a user's perspective. They both have pluses and minuses and are about equal overall. That is a really bad thing for G-Sync when it comes with a $100 to $200 premium on the cost of the monitor.
 
The advantage of G-Sync is that, currently, SLI 980 Tis get you better minimum framerates than CrossFire Fury/Fury X in DX11.

The other issue is that in a handful of AAA games there's more than a 50% delta in minimum or median framerates between aftermarket SLI 980 Tis and CF Fury/Fury X, and FreeSync currently only goes down to 40 or 42 Hz on most displays. The delta between 42 and 60 Hz is about 42.9%, so in a few AAA games there'd be no visual advantage for FreeSync CF Fury X vs. 60 Hz SLI OC'd 980 Tis.
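Quick check on that number, just the arithmetic from the figures above, in Python:

```python
low, high = 42, 60                 # FreeSync floor vs. a 60 Hz target
delta = (high - low) / low * 100   # percentage headroom between the two rates
print(f"{delta:.1f}%")             # 42.9%
```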

Finally, the uniformity is pretty bad on the big Korean monitors, and the only real big 4K FreeSync "monitor", outside of some questionably QC'ed TVs with FreeSync slapped onto them, is the Samsung U32E850R, and that has a matte display.
 
You can mod it to support 21:9 :p

Really, got a link? I don't really like playing the game in 16:9 on my 21:9 screen.
The mod doesn't stretch it, does it? I can make the game use the entire screen, but everything becomes squashed.
 
The advantage of G-Sync is that, currently, SLI 980 Tis get you better minimum framerates than CrossFire Fury/Fury X in DX11.

The other issue is that in a handful of AAA games there's more than a 50% delta in minimum or median framerates between aftermarket SLI 980 Tis and CF Fury/Fury X, and FreeSync currently only goes down to 40 or 42 Hz on most displays. The delta between 42 and 60 Hz is about 42.9%, so in a few AAA games there'd be no visual advantage for FreeSync CF Fury X vs. 60 Hz SLI OC'd 980 Tis.

Finally, the uniformity is pretty bad on the big Korean monitors, and the only real big 4K FreeSync "monitor", outside of some questionably QC'ed TVs with FreeSync slapped onto them, is the Samsung U32E850R, and that has a matte display.

That isn't a FreeSync limitation, it's a monitor manufacturer limitation. The manufacturers set those limits themselves, based on what they feel the monitor is capable of. That's because the monitors we have now were never designed with FreeSync in mind, but since it's part of the standard now, they can take advantage of it even if it isn't ideal. There are already monitors out that go beyond those ranges for FreeSync, and ones announced that completely blow them away.
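If you're curious where that manufacturer-set limit actually lives, here's a minimal sketch (my own illustration, not from the thread) that pulls the vertical refresh range out of a monitor's EDID, which is where the advertised min/max range comes from on most current displays:

```python
def vertical_range(edid: bytes):
    """Return (min Hz, max Hz) from a base EDID block, or None.

    Walks the four 18-byte descriptors looking for the Display Range
    Limits descriptor (tag 0xFD). Simplified: ignores the rate offsets
    in byte 4 that some EDID 1.4 displays use for ranges above 255 Hz.
    """
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]   # min and max vertical rate in Hz
    return None

# Example: on Linux you could feed it the raw EDID the kernel exposes
# with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
#     print(vertical_range(f.read(128)))
```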

The fact that Nvidia hasn't activated FreeSync on their hardware, which they are free to do at any time (for free), and instead forces you to pay $100 to $200 more for their exclusive product, is proof that Nvidia is in the wrong on this. The advantages of G-Sync (if you can even see them from the user's perspective) are dwindling fast while the price premium remains... it's a technology doomed to failure.
 
A single higher-end card will drive either at full resolution, but you'll have to compromise on quality settings at 4K. I was happily running 4K on a 780 Ti over a year ago, with older games like Borderlands 2 maxed out and then-newer games like Tomb Raider at a mix of medium and high settings.

If you have two higher-end cards, go for 4K. Do note that it might be cheaper to buy a 4K monitor and a second card than a single UW monitor.
 
If you want 4K, I don't recommend the Dell 32-inch 4K. I would advise getting the BenQ 32-inch 4K instead. Between the Dell 34-inch ultrawide and the BenQ 4K, the Dell would be easier on the wallet, since the display itself is cheaper by a few hundred bucks and you don't need as much GPU horsepower to pair with it.
 
I just came up with a new term for the ultrawide format. In the cinema world it's known as the Scope aspect ratio. Compared to projector screens and TVs, desktop monitors are microscopic. Therefore, how about "micro-scope display"! I can't wait for marketing (or some witty tech reviewers) to start using it.
 
Sorry, but what advantage can you show for G-Sync over FreeSync, exactly? I have seen many head-to-head matchups between the two technologies, and the difference is not distinguishable on decent or better monitors from a user's perspective. They both have pluses and minuses and are about equal overall. That is a really bad thing for G-Sync when it comes with a $100 to $200 premium on the cost of the monitor.

http://www.tomshardware.com/reviews/amd-freesync-versus-nvidia-g-sync-reader-event,4246.html

Regardless of the features, specifications, marketing, pricing, etc., it appears that people tend to prefer the performance and appearance of G-Sync over FreeSync. I'm not a fan of Tom's reviews, but this might be helpful information.
 
Gaming people seem to love 21:9 after they get it. But remember, the U3011 has an expanded color gamut, which, if that matters to you, limits your upgrade to a 31.5" 16:9. If that doesn't matter, 21:9 is a size that EVERY major monitor seller has taken up. For me, I can't give up the colorspace, but I'm hoping we eventually get 10-bit 21:9 monitors when DP 1.3 becomes the standard.
 
Gaming people seem to love 21:9 after they get it. But remember, the U3011 has an expanded color gamut, which, if that matters to you, limits your upgrade to a 31.5" 16:9. If that doesn't matter, 21:9 is a size that EVERY major monitor seller has taken up. For me, I can't give up the colorspace, but I'm hoping we eventually get 10-bit 21:9 monitors when DP 1.3 becomes the standard.

Why isn't DP 1.3 on monitors yet? Wasn't it finalized a year ago?
 
Why isn't DP 1.3 on monitors yet? Wasn't it finalized a year ago?

Well, it wouldn't really matter, since there are no video cards that have DP 1.3 yet. It just takes a long time; even HDMI 2.0 didn't show up on video cards until a year later, with Nvidia Maxwell. Heck, AMD's latest cards today, like the Fury X, still do NOT have HDMI 2.0.
 
There should be a sticky thread: "Ultrawide or 16:9 4K -- why is it even a question?"

One-line answer: who in their right mind would want an ultrashort micro-scope display? I have yet to invent an adjective that reflects its ridiculous cost inefficiency. Suggestions welcome.
 
Takes time for 1.3.
First you get the fully finalized spec, then development starts on adapter/controller boards, and finally they're manufactured. Then the monitor makers get to use those boards in developing products. If the boards are buggy, you're back in the board maker's redesign loop, and even IF it all works on the monitor side, you need the spec to be "final and available" before it's incorporated into GPU designs. 1.3 may well move to market a bit faster than 1.2 did, because 4K above 60 Hz needs it, as do 5K designs. Deeper color needs the higher bitrate too.
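Back-of-the-envelope numbers behind that (my own arithmetic, pixel data only; real links also carry blanking overhead, so the margins are tighter):

```python
# Effective link bandwidth after 8b/10b encoding, in Gbps
dp12 = 4 * 5.4 * 0.8   # DP 1.2 (HBR2): 17.28
dp13 = 4 * 8.1 * 0.8   # DP 1.3 (HBR3): 25.92

def pixel_gbps(w, h, hz, bits_per_channel):
    """Raw RGB pixel data rate in Gbps, ignoring blanking."""
    return w * h * hz * bits_per_channel * 3 / 1e9

print(pixel_gbps(3840, 2160, 120, 8))  # ~23.9: 4K @ 120 Hz busts DP 1.2, fits 1.3
print(pixel_gbps(5120, 2880, 60, 8))   # ~21.2: 5K @ 60 Hz busts DP 1.2, fits 1.3
```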

Nvidia couldn't include it in the Ti series because they were too far along and the 1.3 spec wasn't finished. Not sure if it will make it into the next gen, but I'd expect it to, since they are leading in this area. I'd expect G-Sync and FreeSync to get more advanced with more bits to work with as well.

So it's just going to take time. In 2016 you'll probably see the first wave; by 2017 it'll be everywhere, like 1.2 is today.
 
Seen 'em both side by side: 40"+ 4K makes the ultrawides look tiny, not immersive, and a ripoff at the same or higher cost versus a 40"-plus 4K.
4K at that size has PPI like a good 1440p screen but also fills your vision immensely. It was an experience, instead of just looking at a monitor. Hard to explain.
And glorious colour: 10-bit is available on the 4K screens.
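The PPI comparison checks out, assuming a 40" 4K panel against a 27" 1440p one (the sizes are my assumption):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 40), 1))  # 110.1 for a 40" 4K
print(round(ppi(2560, 1440, 27), 1))  # 108.8 for a 27" 1440p
```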

4K is beneficial in all uses other than perhaps pro twitch FPS.

A 5K 47"+ ultrawide would do the trick for me, though... when they exist one day.

And the crapping on Fury X CF? The 980 Ti does not beat Fury X CrossFire at 4K in many scenarios. With two cards it's quite even; with three cards the limitations of SLI vs. CF are clear to see. Scaling is CF's ballpark for sure.

HDMI 2.0 can't do 10-bit 4K at 60 fps either.
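The bandwidth math on that last point, using the standard CTA-861 4K60 timing (my own back-of-envelope, not from the thread):

```python
hdmi20 = 3 * 6.0 * 0.8           # 14.4 Gbps effective after 8b/10b encoding

# CTA-861 4K60 timing: 4400 x 2250 total pixels (incl. blanking) -> 594 MHz
pixel_clock = 4400 * 2250 * 60
needed = pixel_clock * 30 / 1e9   # 10-bit RGB = 30 bits per pixel

print(f"{needed:.2f} Gbps needed vs {hdmi20} available")  # 17.82 vs 14.4: no fit
```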
 
Well, it wouldn't really matter, since there are no video cards that have DP 1.3 yet. It just takes a long time; even HDMI 2.0 didn't show up on video cards until a year later, with Nvidia Maxwell. Heck, AMD's latest cards today, like the Fury X, still do NOT have HDMI 2.0.

Chicken and the egg. As a rule, I keep monitors longer than GPUs (but there are exceptions).
 
There should be a sticky thread: "Ultrawide or 16:9 4K -- why is it even a question?"

One-line answer: who in their right mind would want an ultrashort micro-scope display? I have yet to invent an adjective that reflects its ridiculous cost inefficiency. Suggestions welcome.

For gaming, it might be good, since it gives you a very wide angle of view, which is a benefit in many games. I don't want one, but I can see why one would want it, just like I can see why someone might want 2 to 4 monitors for a Flight Sim.

Of course, if you meant why someone would want a 4K display, then I'd go with: it's good for photos/video, as well as a future-proof way to go 1440p.
 