Why ultrawide when 4K+custom res is possible?

euskalzabe
[H]ard|Gawd · Joined May 9, 2009 · Messages: 1,478
Genuine question here, not trying to start some war.

I was considering buying an ultrawide monitor a couple of months ago, but lately I've been much more attracted to affordable 4K TVs like the Samsung KU6300, which goes for ~$450. Doing some measurements, I realized that a 40" TV (or any 40" 4K monitor, for that matter) would actually give me a bigger diagonal than any 34" monitor - nearly 38" at an ultrawide aspect ratio.

So why get an actual ultrawide for not much less money (or actually more money, if we compare ultrawides to 40" 4K monitors instead of 40" 4K TVs) when you can just set a custom resolution of 2560x1080, 3440x1440 or even 3840x1620 - more than what WQHD ultrawides offer - and still fit it in a 4K 16:9 panel? You get the same or better resolution at a bigger size for a lower price (the most comparable size would be LG's new 38UC99, and that costs $1499! You could buy three KU6300s for that amount).
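For anyone checking the math, here's a quick back-of-the-envelope sketch (my own numbers, assuming an ideal 21:9 strip across the full panel width):

```python
import math

def ultrawide_equivalent(diag_16x9, uw_ratio=21 / 9):
    """Diagonal of the widest 21:9 strip that fits inside a 16:9 panel."""
    width = diag_16x9 * 16 / math.hypot(16, 9)  # panel width from its diagonal
    height = width / uw_ratio                   # the 21:9 strip keeps the full width
    return math.hypot(width, height)

print(round(ultrawide_equivalent(40), 1))  # 37.9 -> a 40" 16:9 panel holds a ~38" ultrawide
print(round(ultrawide_equivalent(43), 1))  # 40.8
```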

Unless you want adaptive sync, or the ~20ms of input lag terribly bothers you (I hear your complaints, competitive gamers, but not everybody plays twitch-reflex games), the panels in TVs like the KU6300 have excellent color and contrast (just check the rtings.com reviews). What's the point of an ultrawide, then? It seems it would hurt productivity by eliminating the vertical space option, whereas a 40" 4K 16:9 display gives you the best of both worlds: ultrawide FOV for games, and 4K plus vertical space for productivity.

I'll give you a quick example: I was playing Arkham Knight yesterday on my 1080p 40" TV. 16:9 gets a bit too restrictive for my taste, so I made a custom resolution of 1920x810 and, while not losing any of the vertical space advantage for productivity, I still gained the ultrawide FOV for gaming that I prefer:

20160831185559_1_zpsborbvey2.jpg


Am I crazy here? How is this not a win-win?
 
Hmm... considering the amount of interest in ultrawides and 4K displays, I thought this thought experiment would be more interesting to [H] users. It seems I was mistaken.
 
Because most people (including myself) don't want huge bars of black, unused pixels at the top and bottom of the screen. It's a preference thing, but I like my viewport to stop at the bezels. The reason monitors have evolved over the years toward smaller and smaller bezels is that almost nobody likes the look of massive, thick frames. Those black bars of not-in-use pixels just add extra bezel space.
 
If you have a 4K display, then you'll never use the ultrawide option. Why would you make the picture shorter when you don't have to?

If you're asking yourself why you would get an ultrawide when you can get a 4K, then that's the right question and the answer is obvious.
 
There are a number of games that I play at 3840x1600 on my 43" 4K. When I do, I move the monitor closer to me, like I would if it were an ultrawide. The black bars kind of just disappear. Some games are genuinely better like this (Project CARS), and in some games it's nice to have fewer pixels to render. I came from a 3440x1440 ultrawide previously.
 
I'm not a competitive gamer but I love the 100 hz that my X34 puts out.

I came from a 60 hz monitor, and what they say is true. You won't go back to a slower monitor / TV.

When we have 4K at 120hz, then I'm interested. Not any sooner though.
 
If you're asking yourself why you would get an ultrawide when you can get a 4K, then that's the right question and the answer is obvious.

Actually, it's not. There are bigger, better 4K displays at cheaper prices than ultrawides, which necessarily complicates the question. You're ignoring the adaptability I proposed in my post.

There are a number of games that I play at 3840x1600 on my 43" 4K. When I do, I move the monitor closer to me, like I would if it were an ultrawide. The black bars kind of just disappear. Some games are genuinely better like this (Project CARS), and in some games it's nice to have fewer pixels to render. I came from a 3440x1440 ultrawide previously.

Precisely. You can adapt depending on your use-case scenario. Doing productivity work? 16:9 at 4K. Playing a game that can benefit from ultrawide? Custom res.

Why would you buy a part of a monitor and not use it?

Did you bother reading my post? I said you use 16:9 at 4K for productivity and switch to a 21:9 custom res in games that support it, when that's more pleasing to you. You gain FOV. Nothing is lost. You'll use the vertical pixels again the second you step out of the game. What you have is options.

Changing the aspect ratio wider changes the FOV which can be beneficial in games. Then on the desktop for productivity you can use the whole panel. I don't see why not adjust it depending on what you are doing.

My point exactly. Adaptability. For the same or a cheaper price. Unless users are so horrifyingly bothered by black bars, I don't see why you would get an ultrawide over a large ~40" (or bigger) 4K monitor.

I came from a 60 hz monitor, and what they say is true. You won't go back to a slower monitor / TV.

Please don't extrapolate from your own experience to everybody's. I've used 120Hz and 144Hz panels before and, while nice, I would never spend that kind of money on them when I can get better color reproduction or 4K resolution on other panels at 60Hz for similar money. It's perfectly fine that you prefer 120Hz, but don't assume or imply that everybody wants higher refresh. If higher fps didn't come with the color sacrifices, then yes, I'd go for it - a net win. However, I will not go to a TN panel at this point in time. Unless you need the high refresh for something specific - like competitive gaming - or you simply prefer it despite the sacrifices you're making, there's no point in losing the VA/IPS color reproduction. Yes, you can now get IPS at 144Hz in some of the new LG panels, but if high refresh is not your ultimate goal, why pay $700 when you can get a 40" 4K panel for $450?
 
It ends up being a wiser move... especially considering that even the one title you cite, Arkham Knight, has quite a few scaling issues in 21:9:

WSGF said:
Ultra-Widescreen (21:9) Specific Solution & Issues
Native Hor+ support with quite a few oddities.

  • HUD position is identical to 16:9
  • Zoom is pillarboxed
  • Those real-time rendered cut-scenes which are pillarboxed are Hor-/Vert-
  • City map is Vert-
  • Performance Test is Vert-
  • Main menu backdrop is Hor+/Vert+
Although it is possible to address some of these using MaintainYFOV (BaseEngine.ini) and FOV keys (BmInput.ini), this solution is too fiddly and imperfect to be really worth it.
 
It ends up being a wiser move... especially considering that even the one title you cite, Arkham Knight, has quite a few scaling issues in 21:9

You don't have to use the 21:9 FOV when games don't support it well. But you have the option to do so.

That said, I don't know when that WSGF post was written, but AK doesn't have bad ultrawide support now. It's not great - cutscenes revert to a forced 16:9 and the HUD is indeed placed where it would be at 16:9 - but the game has most definitely been updated to hor+, as I'm clearly seeing more on the sides, not a stretched image. I haven't tried the performance test to check whether it's hor+, but frankly, using the performance test as an argument against the game's quality is malarkey. You'll fire that up once, if at all, and never run it again - the goal is to play the actual game.
 
Like you said, if people don't mind 60 hz, input lag, fixed refresh, larger footprint and even making custom resolutions a 4K TV is the way to go, and has been for a while. You're not crazy, but neither is anyone else. Getting proper 4:4:4 color out of a TV is also a problem, and only specific models will do it.
 
As someone that has gone from ultrawide to Nvidia Surround to 4K in less than a year, I agree with the OP. Overall, 4K is far superior to and more flexible than 21:9. I loved my 34" ultrawide until I ran into my first AAA game that didn't support it (Fallout 4), and then ran into a few more. 4K gives you the flexibility to run various resolutions, including 21:9 if wanted, without having to worry about aspect ratio/resolution support.

I think the current downside is 4K display availability. 4K monitors have been hit and miss (mostly miss), and 4K TVs (I currently have the Samsung KS8000 myself) have the downside of not being a monitor (input lag, refresh rate limits, etc.). As 4K matures in the monitor space, I think it will make 21:9 a less and less compelling option.
 
Like you said, if people don't mind 60 hz, input lag, fixed refresh, larger footprint and even making custom resolutions a 4K TV is the way to go, and has been for a while. You're not crazy, but neither is anyone else. Getting proper 4:4:4 color out of a TV is also a problem, and only specific models will do it.

If we are comparing 21:9 to 4K exclusively in this thread, refresh rate should not be an argument (yet)... the large majority of 21:9 monitors are 60Hz (I believe only Acer offers something more, but I could be wrong).

Input lag is definitely an issue, but has improved.

Custom resolutions are an option, not a requirement, and should be looked at as a pro since you have that option. I would rather have custom resolutions than have to use a program like Flawless Widescreen.

Plenty of options support 4:4:4 or I can at least say there are more 4:4:4 4K TVs than there are 21:9 monitors that do more than 60hz :p.

I think the OPs point is that with a 4K display you can do everything a 21:9 display can do while still supporting games that a 21:9 display cannot. The trade off is the input lag (if you are using a 4K tv) and not having access to more than 60hz (assuming you have access to the 1 or 2 21:9 displays that have it).
 
You're not crazy, but neither is anyone else. Getting proper 4:4:4 color out of a TV is also a problem, and only specific models will do it.

Yeah, this thread has made that clear: it's all about preference and what bothers each particular user: input lag, refresh rate limitations, higher/lower black bars... 4:4:4 is definitely a problem in many TVs (for example none of the Vizio 2016 4K D-series support it, which is too bad as they're so cheap) and many users don't even know about it being a problem. However, if we're talking about users who know, this is less of a problem - we know to look for 4:4:4 support.

As someone that has gone from ultrawide to Nvidia Surround to 4K in less than a year, I agree with the OP. Overall, 4K is far superior to and more flexible than 21:9... As 4K matures in the monitor space, I think it will make 21:9 a less and less compelling option.

Agreed. The whole point is flexibility. Having both options in 1 piece of hardware.

I think the OPs point is that with a 4K display you can do everything a 21:9 display can do while still supporting games that a 21:9 display cannot.

OK, I'm glad to see there are more people who see this like I do. I never really saw this talked about in forums, and once I figured out myself that a custom res would give me the ultrawide FOV when I want it (and a game supports it), it just seemed like the obvious path. Clearly it's only plausible as long as the various downsides don't bother the user, but if you're a happy camper on a 60Hz panel, the flexibility of a large 4K panel is all gains for that kind of user.

Personally, finding out that I can even do this will give my 7-year-old 1080p 40" TV-monitor another year of life - now I try 1920x810 in every new game I play, and if it supports it decently, that's how I game; if there's no support, back to 16:9 it is. I'd get a 4K TV now, but I'm waiting/hoping for better HDR and a quantum dot display in 2017 for ~$500. That'll make for a glorious PC display.
 
(all imo of course)

Because everyone overhypes 21:9 (both flavors - I think 2560x1080 is completely stupid) and everyone over-worries about the requirements to run 4K

4K is just flat-out better than 21:9. I've lived with both Surround and 21:9 for a year or longer, and while the occasional annoyance of having to fix games without native support doesn't seem too bad, once you go back to a normal 16:9 4K you question why you ever bothered. You just open the game, turn off AA (or maybe leave 2x, depending on how picky you are), and play without worrying whether it will look satisfying.

I'm still running 3x GTX Titans - the original GTX Titans. Very far from the most ideal or most optimized setup, but in BF4 multiplayer (my most played game) they still hit ~85fps, with GPU #1 downclocking to ~900MHz (I'm assuming the driver sees no reason to push power consumption and keeps downclocking it no matter what I try) and GPUs #2 and #3 at 1202MHz. 4K is nowhere near as hard to run as people freak out about, so quit overhyping 3440x1440 as the better solution.

3440x1440 panels should not cost as much as they do (though they're finally coming down), but customers just seem to blindly follow the hype train, driving prices up.
 
Changing the aspect ratio wider changes the FOV which can be beneficial in games.
Then on the desktop for productivity you can use the whole panel.
I don't see why not adjust it depending on what you are doing.
Changing the FOV changes the FOV.
If using a letterboxed resolution is giving you a wider view in the game, you need to increase the FOV setting when displaying 16:9.
Then the central area which was letterboxed will look the same, but you'll have more height.
 
(all imo of course)

Because everyone overhypes 21:9 (both flavors - I think 2560x1080 is completely stupid) and everyone over-worries about the requirements to run 4K

4K is just flat-out better than 21:9. I've lived with both Surround and 21:9 for a year or longer, and while the occasional annoyance of having to fix games without native support doesn't seem too bad, once you go back to a normal 16:9 4K you question why you ever bothered. You just open the game, turn off AA (or maybe leave 2x, depending on how picky you are), and play without worrying whether it will look satisfying.

I'm still running 3x GTX Titans - the original GTX Titans. Very far from the most ideal or most optimized setup, but in BF4 multiplayer (my most played game) they still hit ~85fps, with GPU #1 downclocking to ~900MHz (I'm assuming the driver sees no reason to push power consumption and keeps downclocking it no matter what I try) and GPUs #2 and #3 at 1202MHz. 4K is nowhere near as hard to run as people freak out about, so quit overhyping 3440x1440 as the better solution.

3440x1440 panels should not cost as much as they do (though they're finally coming down), but customers just seem to blindly follow the hype train, driving prices up.

Anything with Gsync is overpriced. And when the tech is new, yep, it's going to cost a premium. 3440x1440 is a small sector of the display industry, they aren't producing nearly the same amount of panels as your typical resolution 16:9's.

I like my X34, but I would have *never* paid full retail for it - I think the price I got it from Acer refurb was just right. And I've never had a single complaint beyond some really old games that don't natively support 21:9
 
Changing the FOV changes the FOV.
If using a letterboxed resolution is giving you a wider view in the game, you need to increase the FOV setting when displaying 16:9.
Then the central area which was letterboxed will look the same, but you'll have more height.

The problem is that many games don't allow changing the FOV. If you force a custom ultrawide resolution, though, a game that supports it will switch to hor+, which changes the FOV even when there's no menu option to do so.
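To make the hor+ behavior concrete: in a hor+ game the vertical FOV stays fixed and the horizontal FOV is derived from the aspect ratio. A quick sketch (my own illustration; the 90-degree 16:9 baseline is arbitrary):

```python
import math

def horplus_hfov(vfov_deg, aspect):
    """Horizontal FOV for a hor+ game: the vertical FOV is held fixed,
    and the horizontal FOV grows with the aspect ratio."""
    half_v = math.radians(vfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

# Vertical FOV that corresponds to a 90-degree horizontal FOV at 16:9.
vfov = 2 * math.degrees(math.atan(9 / 16))

print(round(horplus_hfov(vfov, 16 / 9)))  # 90 at 16:9
print(round(horplus_hfov(vfov, 21 / 9)))  # 105 at 21:9, for free
```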
 
Anything with Gsync is overpriced. And when the tech is new, yep, it's going to cost a premium. 3440x1440 is a small sector of the display industry, they aren't producing nearly the same amount of panels as your typical resolution 16:9's.

I like my X34, but I would have *never* paid full retail for it - I think the price I got it from Acer refurb was just right. And I've never had a single complaint beyond some really old games that don't natively support 21:9

21:9 support is not just an old-game problem. Plenty of new AAA games also lack support (Bethesda, I'm looking at you).
 
21:9 support is not just an old-game problem. Plenty of new AAA games also lack support (Bethesda, I'm looking at you).
Yeah, it's slightly annoying, but in the games I've played over the 9 months I've had the monitor, it's really only been an issue a couple of times, and I didn't lose any sleep over it.
 
When 4K has 100Hz or better, input lag comes down further than it is currently, and video cards get powerful enough to drive games at 4K, then I will make the switch. Until then I'll stick with 21:9. I figure in 2017 we'll see the first two happen, but the GPU issue is going to take a while longer. It's all about personal preferences, I guess. How picky do you want to be? What are your demands? What are you willing to put up with?
 
This idea is at least 1 year old:
I teach you how to widescreen (21:9) native 16:9 monitor

However, I challenge the very right of that odd aspect ratio (21:9, properly 7:3) to exist. Human vision is close to 4:3, the original IMAX aspect ratio, so height-challenged monitors are just moronic. Yes, sometime in the 1950s Hollywood wanted to differentiate itself from TV, so after long, hard thought ("How can we make the movie-watching experience for TV viewers as miserable as possible?") they came out with this idiotic aspect ratio. In the meantime, the world has moved on, and virtual reality (where each eye sees a square 1:1 image) will soon render this "cinematic" movie-watching experience a historical footnote.
 
I love this idea. I just got a 4K TV to use as a monitor, coming from triple-1440p Surround. I like Surround, but it's very intensive, and most of the periphery is wasted since you can hardly see the edges of the screens.

4K is also really intensive, but I think 21:9 on a 4K panel could actually be a good compromise. You'd get the extra immersion of the wider aspect ratio without wasting any pixels, and (in fact) it would perform better than both native 4K and a Surround/Eyefinity setup.

Let me do some tests now and see what I think. Thanks for bringing this to light.
 
OK, I did some tests. This method actually works well. I was finally able to get Deus Ex running decently on my 4K set, and it looks great.

The performance gains are huge versus native 4K, and the letterboxing doesn't diminish the experience too much. I mean, it's less screen real estate, sure, but you gain some FOV, so it's a trade-off.

DXMD_219.jpg


DOOM_219.jpg


The downside is that you then have to deal with ultrawide compatibility issues, like you would with Surround/Eyefinity or ultrawide monitors (stretched HUDs, UI glitches, incorrect FOV or aspect ratio, etc.).

It also took me a while to get a custom resolution to work. I tried a bunch of different resolutions around 4K, but all of them gave me nothing but a black screen. 1920x820 @ 60Hz did work, so I knew custom resolutions were possible; they just weren't working at higher resolutions. Finally I tried a 4K-width res at 50Hz and that worked (specifically 3840x1640 @ 50Hz) on my Samsung KU6300. Honestly, 50Hz looks fine to me, maybe even smoother than 60Hz, but that might just be placebo on my part.
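My guess (and it is only a guess) as to why 60Hz failed is bandwidth: if the TV's input is running in HDMI 1.4 mode (e.g., with the set's UHD Color option disabled), the link tops out around a 340MHz pixel clock, and 3840x1640 @ 60Hz exceeds that on active pixels alone:

```python
def active_pixel_clock_mhz(width, height, refresh_hz):
    """Rough active-pixel clock estimate. Real pixel clocks are higher
    once blanking intervals are added (e.g. with CVT-RB timings)."""
    return width * height * refresh_hz / 1e6

print(active_pixel_clock_mhz(3840, 1640, 60))  # 377.856 MHz, over HDMI 1.4's ~340 MHz cap
print(active_pixel_clock_mhz(3840, 1640, 50))  # 314.88 MHz, fits
```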

I'm not sure I would play every game like this. I kind of like 16:9, and one of the reasons I bought a 4K TV to use as a monitor was to avoid all the multi-monitor software glitches of Nvidia Surround. But this honestly may be the better choice over a Surround setup. While you get more pixels and a wider FOV with Surround, a lot of those pixels are wasted because you can barely see the outer limits. Three monitors are also *way* more intensive graphically, meaning you either have to buy better hardware, turn settings down, or both. Not to mention looking at those bezels. And, finally, the cost of three high-res monitors will likely be more than one TV (I paid $500 for the Samsung KU6300 I'm using). This virtual 21:9 trick seems like a more economical choice, and maybe better quality as well.

In terms of comparing to ultrawides, you have the same cons (i.e. flaky software support), but you can easily switch back to 16:9 for games that work better at that ratio. So this seems like the best of both worlds.
 
1920x810 or 3840x1620 are good options - they're the exact same ratio as 21:9 ultrawide resolutions, and they should work fine at 60Hz. From your screenshots, DOOM seems to work fine (though that HUD should be on the sides...), but your Fallout image seems clearly distorted - did you try forcing a different aspect ratio in the in-game options? Fallout doesn't support 21:9 natively, in which case you may want to check Flawless Widescreen - Gaming the way it should be! to get it working at ultrawide resolutions.

Personally, if a game doesn't support ultrawide, I'm happy playing it at 16:9 - doing the work developers should have done is just too annoying for me to even bother.
 
The pic is from Deus Ex (not Fallout), and I didn't think it was distorted, but I can double-check. It supports native widescreen, so it should be automatic, but maybe I messed something up.

The reason I used 3840x1640 is that I read games perform better when the resolution is divisible by 8. Not sure what the performance difference would be, though.
 
Here's a comparison in DOOM between 16:9 (1920x1080) and 24:10 (1920x800).

doom_fovmgsea.gif


If an ultrawide is giving you a wider view, it's only because the FoV is scaled differently.
This comparison is also pretty good at demonstrating what it's like to switch between a 27" 16:9 panel, a 34" ultrawide, and a 36" 16:9 panel, since larger displays allow for higher FoV settings.
If you use high FoV settings on a small display, it tends to look distorted at the edges of the screen.

Also: I've never been able to get Flawless Widescreen to work.
I always get an error saying that it couldn't open the game process. I'm guessing that this has something to do with the fact that I'm running the games on a regular user account rather than an admin account, but I really have no idea.
 
That above comparison is not correct, as the horizontal is the same. When I play BF4, for example, the vertical image is the same on both 16:9 and 21:9, but I get more image on the left and right.
The comparison you are showing is how Blizzard did 21:9 in Overwatch: keeping the same horizontal view for both 16:9 and 21:9, but with 21:9 zoomed in so you lose vertical image.
 
Yeah, euskalzabe, you were correct. The aspect ratio was wrong. I assumed that since Deus Ex: Mankind Divided had multi-monitor support, a custom ultrawide resolution would be automatic, but I was wrong. I did get it working after tweaking the registry, though.

DXMD_219_Fix.jpg


While the custom resolution worked (in my case 3840x1640 @ 50Hz), the game still rendered at a 16:9 aspect ratio and stretched the image horizontally. I could not find an aspect ratio option in the menu, but there is one in the registry. To edit this value, open regedit and go to:

HKEY_CURRENT_USER\Software\Eidos Montreal\Deus Ex: MD\Graphics\

Look for the key AspectRatio (it defaults to 0). Find your ratio by dividing pixel width by height, then multiplying by 1000 to get a whole number.

So, for me, I did (3840 / 1640) x 1000 = 2341

In the AspectRatio registry key, set the base to Decimal and enter the four-digit number. Now the game should render correctly.
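The same calculation as a tiny helper, for anyone scripting it (my own snippet; int() truncates the fractional part, which matches the 2341 above):

```python
def dxmd_aspect_value(width, height):
    """AspectRatio registry value for DXMD: (width / height) * 1000 as an integer."""
    return int(width / height * 1000)

print(dxmd_aspect_value(3840, 1640))  # 2341
print(dxmd_aspect_value(3440, 1440))  # 2388
```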

zone74: No, that's not correct. I've gamed a bunch with Nvidia Surround, and (when everything is set up correctly) the center monitor renders exactly the same as normal, while the side monitors contain extra information from the periphery.
 
Awesome! And yes, I meant Deus Ex, not sure why I typed Fallout (I'm doing way too many things simultaneously this morning) :)

Either way, I'm usually hyper sensitive to proportions being wrong, which is why I immediately noticed it. I'm glad you got it working properly.

The only reason I mentioned those resolutions is that they're directly proportional to 1080p. Quadruple 1080p (twice vertical, twice horizontal) and you get 4K. If 2560x1080 is ultrawide, 1920x810 is the exact same ratio at a lower res; quadruple that and you get the 4K equivalent, 3840x1620. It works perfectly well for me, and it should give you an exactly calculated ultrawide resolution.

But hey, whatever works for you!
 
Please, don't extrapolate from your own experience to everybody's experience. I've used 120hz and 144hz panels before and while nice, I would never spend that kind of money on them when I can get better color reproduction or 4K resolution on other panels at 60hz for similar money. It's perfectly fine that you would prefer 120hz, but don't assume or imply that everybody wants higher refresh. If higher fps didn't come with the color sacrifices, then yes, I'd go for it, it's a net win. However, I will not go to a TN panel at this point in time - unless you need the high refresh for something specific - like competitive gaming - or you simply prefer the high refresh despite the sacrifices you're making, there's no point on losing on the VA/IPS color reproduction. Yes you can now do IPS at 144hz in some of the new LG panels, if high refresh is not your ultimate goal, why pay $700 when you can get a 40" 4K panel for $450?

Maybe I'm wrong - I thought the OP was a gamer? As primarily a gamer, after using even just a 100Hz X34, when I went back to test my LG at 60Hz, the only thing I could come up with was "Is this broken??" Just that solid 40fps difference (at the time I was running 2x Titan X (Maxwell) in SLI) was enough to make 60Hz gaming feel busted.

Now, if I were a Photoshop or spreadsheet guy, I wouldn't care one bit about 60 vs 100 vs 144Hz. In the same way, I won't go to a 144 or now even 200Hz TN panel, because (to me) the colors look like crap, and I'm not willing to use anything smaller than 34", let alone 27".

I think it all comes down to what you'll do with your monitor and what you want to spend on it. And honestly, as a gamer, I think VR is going to make desktop monitors something for "poor people" within the next 3 years.
 
That above comparison is not correct, as the horizontal is the same. When I play BF4, for example, the vertical image is the same on both 16:9 and 21:9, but I get more image on the left and right.
The comparison you are showing is how Blizzard did 21:9 in Overwatch: keeping the same horizontal view for both 16:9 and 21:9, but with 21:9 zoomed in so you lose vertical image.
Display size determines FoV, not aspect ratio.

sizesv6qa7.png


A 34" 21:9 display (blue) gives you more width than a 27" 16:9 display (green).
But a 36" 16:9 display (red) gives you more height than the 21:9 display.

Look at the image again.
The 24:10 image is giving you a wider FoV than a 16:9 image of equal height.
However a larger 16:9 display gives you more vertical FoV than the ultrawide.

An ultrawide only increases horizontal FoV when you're comparing it to panels of an equal height.
 
No, that is completely false.

Aspect ratio is the primary factor that determines FOV, screen size is actually irrelevant.

To the computer or video card, and certainly to a game engine, there is no concept of the size of the screen. Only the resolution matters (which dictates the aspect ratio, which then dictates the FOV).

You can have a 21" 1080P monitor or a 115" 1080P projector, and to a game it's exactly the same thing. Take a screenshot on either system and it will be pixel for pixel identical.
 
To the computer or video card, and certainly to a game engine, there is no concept of the size of the screen. Only the resolution matters (which dictates the aspect ratio, which then dictates the FOV).
You can have a 21" 1080P monitor or a 115" 1080P projector, and to a game it's exactly the same thing. Take a screenshot on either system and it will be pixel for pixel identical.
Well, that's not strictly true: most displays report their size now, and that's used for DPI scaling and/or 3D output.
But yes, as far as FoV is concerned I can't think of any games which adjust things automatically based on display size.

For the image to look correct, however, you need to increase the FoV as the display size increases.
If you use a high FoV on a small display, everything looks distorted.
If you use a low FoV on a large display you get tunnel vision.

Technically, FoV should be a calculation that takes both screen size and viewing distance into account. But in the comparison image I posted, viewing distance does not change, which is probably true for most PC gamers sitting at a desk: if they get a larger display, they aren't going to sit any further from it.
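That calculation is just the angle the screen subtends at your eye. A sketch of the "geometrically correct" horizontal FOV for a given screen width and viewing distance (my own illustration; games typically default to a wider FOV than this for playability):

```python
import math

def subtended_hfov(screen_width_in, distance_in):
    """Horizontal angle (degrees) the screen occupies in your vision."""
    return math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))

# A ~34.9"-wide panel (i.e. a 40" 16:9) viewed from 24 inches away:
print(round(subtended_hfov(34.9, 24)))  # 72 degrees
```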
 
Maybe I'm wrong - I thought the OP was a gamer? As primarily a gamer, after using even just a 100Hz X34, when I went back to test my LG at 60Hz, the only thing I could come up with was "Is this broken??" Just that solid 40fps difference (at the time I was running 2x Titan X (Maxwell) in SLI) was enough to make 60Hz gaming feel busted.

Now, if I were a Photoshop or spreadsheet guy, I wouldn't care one bit about 60 vs 100 vs 144Hz. In the same way, I won't go to a 144 or now even 200Hz TN panel, because (to me) the colors look like crap, and I'm not willing to use anything smaller than 34", let alone 27".

I think it all comes down to what you'll do with your monitor and what you want to spend on it. And honestly, as a gamer, I think VR is going to make desktop monitors something for "poor people" within the next 3 years.

This is the kind of crap that always seems to find its way onto the doorstep...

There is nothing wrong with gaming at 60Hz, and it doesn't make you less of a gamer to do so. This attitude usually comes from FPS-centric clowns who don't realize there are other games out there.
 
Well, that's not strictly true: most displays report their size now, and that's used for DPI scaling and/or 3D output.
But yes, as far as FoV is concerned I can't think of any games which adjust things automatically based on display size.

For the image to look correct, however, you need to increase the FoV as the display size increases.
If you use a high FoV on a small display, everything looks distorted.
If you use a low FoV on a large display you get tunnel vision.

Technically, FoV should be a calculation that takes both screen size and viewing distance into account. But in the comparison image I posted, viewing distance does not change, which is probably true for most PC gamers sitting at a desk: if they get a larger display, they aren't going to sit any further from it.
I like this line of thinking, especially as larger displays and VR bring real, effective field of view into gaming, with screen elements corrected for the viewer's particular viewing configuration and interpupillary distance.
 
If you have a 4K display, then you'll never use the ultrawide option. Why would you make the picture shorter when you don't have to?

If you're asking yourself why you would get an ultrawide when you can get a 4K, then that's the right question and the answer is obvious.

4K and ultra-wides are completely different things.

Ultrawides are not 16:9 with the top and bottom chopped off to make 21:9; rather, they're 16:9 with the SIDE VIEW STRETCHED to become 21:9. (The vertical FoV does not decrease; it's the horizontal FoV that increases.)

All 16:9 resolutions have the same FoV, regardless of actual resolution, except in games that specifically scale it on purpose, or games that let you change the FoV (the benefit of which depends almost entirely on screen size).

The problem with the OP's suggested usage is that some people fail to appreciate the size difference: comparing, say, a 32" 4K and a 34" ultrawide, the 34" is significantly wider than the 32", so a 21:9 resolution on the 32" 16:9 gives you an image only about the size of a 30" ultrawide. Emulating a 34" ultrawide requires roughly a 36" 16:9 screen.

Also, monitors tend to display the emulated screen in the middle rather than letting you choose where it sits, so your FoV may actually end up higher than an actual ultrawide would give you.

Finally, some monitors simply do not like custom resolutions. For example, my Swift can handle 2560x1080 only at 24Hz, which, as you can see, basically renders that resolution useless for gaming.
 
Here's a comparison in DOOM between 16:9 (1920x1080) and 24:10 (1920x800).

Thanks for this. Up until now I had always assumed that an FPS's FoV is not in any way affected by its rendering resolution - that resolution only affects graphical fidelity (so far, every game I've played has conformed to this).

It looks like DOOM might be an exception. I'll have to try this on the demo, though.
 