Why ultrawide when 4K+custom res is possible?

Thanks for this. Up until now, I have always assumed that an FPS's FoV is not in any way affected by its rendering resolution and that resolution only affects graphical fidelity (so far, every game I have played has conformed to this).

It looks like Doom might be an exception; I will have to try this on the demo, though.

Your initial assumption was the correct one. Rendering resolution, given the same aspect ratio, does not in any way affect the FOV of the game. If you run 1280x720 or 1920x1080 or 2560x1440, the image will be identical in terms of the size and position of objects. The only difference will be the graphical fidelity. This is really easy to test and should be intuitive to anyone who has ever played a PC game and changed the resolution (which I'm hoping is everyone here).

The DOOM gif in the above post is bogus. The creator must have tweaked the FOV manually, and it seems intentionally misleading. I went ahead and did my own tests to prove my point. You can see below that I ran the exact same scene in DOOM, only changing the resolution between 2560x1440 (16:9) and 3440x1440 (21:9). Notice that the scene is in the exact same position, and the only major difference is that you can see extra pixels in your peripheral view at the sides. The gun does shift slightly, but this is likely because guns in FPS games have their own independent FOV, so it may not be exact. But the entire background is nearly identical in the middle. The view you are seeing vertically and in the center does not change. So running ultra-wide or Surround/Eyefinity basically just adds visible area on the sides and does not take anything away.

DOOM_AB.gif


One thing to note is that not every game supports ultra-wide resolutions, and some don't work well at all. But if they do work, it behaves as shown here.
 
4K and ultra-wides are completely different things.
Ultra-wides are not a 16:9 ratio with the top chopped off to make it 21:9; rather, it's a 16:9 with the SIDE VIEW STRETCHED to become 21:9. (The vertical FoV does not decrease; it's the horizontal FoV that gets increased.)
Physical size determines this, nothing else.

A 25" 21:9 display is shorter and narrower than a 27" 16:9 display.
A 34" 21:9 display is wider than a 27" 16:9 display.
A 36" 16:9 display is taller than a 34" 21:9 display.
A 40" 16:9 display is larger in all dimensions.

You can't make a blanket statement about ultrawides being "short" or "wide" displays. That depends on what you're comparing it against.

All 16:9 displays have the same FoV, regardless of actual resolution, except for games that specifically scale it on purpose, or games that allow you to change your FoV (the benefit of which depends almost entirely on screen size).
FoV is a physical thing.
It's affected by your screen size, aspect ratio, and your distance from the screen.
The FoV control in a game is there so that you can adjust what the game is displaying based on your physical FoV - since that can be highly variable.
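To put a rough number on that, here's a minimal sketch of how the physical horizontal FoV follows from screen size, aspect ratio, and viewing distance. The 27"/32" example values are my own illustrative assumptions, not anyone's quoted setup:

import math

def physical_hfov(diagonal_in, aspect_w, aspect_h, distance_in):
    # Width of the panel, derived from its diagonal and aspect ratio
    width = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    # Angle the panel subtends at the viewer's eye, in degrees
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

# e.g. a 27" 16:9 monitor viewed from ~32" away subtends roughly 40 degrees
print(round(physical_hfov(27, 16, 9, 32), 1))   # ~40.4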

If you set up FoV to be mathematically correct, the display becomes a window into another world. Everything should be displayed at true scale.

Here's an example of that:

(not my setup, see this post for details)

The problem with this, is that a mathematically correct FoV is usually far lower than gamers are used to.
Most PC gamers probably have less than a 40° physical FoV with their current monitor setup.
You either need to have a very large display and be sitting close to it, or be using a VR headset for accurate FoV values to be usable.

But while using accurate FoV values is not feasible in most setups, it doesn't change the fact that larger displays require higher FoV settings.
If you didn't change the FoV, you would just be stretching out the same image over a larger area and would end up with "tunnel vision" because you're now seeing a much smaller region of the game world in the central portion of your vision.

With a larger display you have to increase FoV so that the central portion of the image is displayed at the same scale and you gain peripheral vision at the edges.
That's true whether your display is simply wider (moving from 27" 16:9 to 34" 21:9) or whether your display is larger in all dimensions. (moving from 27" 16:9 to 36" 16:9)

That's why it's stupid to say that "ultrawides give a wider FoV" as a blanket statement.
Yes, your 34" ultrawide gives a wider field of view compared to a 27" 16:9 monitor.
But a 36" or larger 16:9 display will give a larger field of view in every dimension compared to the 34" ultrawide when both are set to display an image at the same scale.

Finally, some monitors simply do not like custom resolutions. For example, my Swift can handle 2560x1080 only at 24hz, which, as you can see, basically renders that resolution useless for gaming.
I don't think you're setting up the custom resolution correctly.
When creating a new display mode, you should first open up the timing section and set it to manual before making any changes to the display mode.
That ensures that the signal being sent to your monitor is its native resolution.

Then you can set the display mode to whatever you want.
In this case, I have created a custom 1920x800 resolution. (24:10)

custom-resq5sgc.png


The GPU is only rendering 1920x800 but my display receives a 1920x1080 signal.
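If you want to sanity-check a mode like that, the letterbox arithmetic is trivial. A small sketch, assuming the GPU centers the smaller image within the native signal:

# Quick check of the letterboxing for a custom mode inside a native 16:9 signal
native_w, native_h = 1920, 1080
custom_w, custom_h = 1920, 800            # the 24:10 custom mode above

bars_total = native_h - custom_h          # 280 rows of black
bar_each = bars_total // 2                # 140 px top and bottom if the image is centered
print(bar_each, round(custom_w / custom_h, 2))   # 140 2.4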


Your initial assumption was the correct one. Rendering resolution, given the same aspect ratio, does not in any way affect the FOV of the game. If you run 1280x720 or 1920x1080 or 2560x1440, the image will be identical in terms of the size and position of objects. The only difference will be the graphical fidelity. This is really easy to test and should be intuitive to anyone who has ever played a PC game and changed the resolution (which I'm hoping is everyone here).

The DOOM gif in the above post is bogus. The creator must have tweaked the FOV manually, and it seems intentionally misleading. I went ahead and did my own tests to prove my point. You can see below that I ran the exact same scene in DOOM, only changing the resolution between 2560x1440 (16:9) and 3440x1440 (21:9). Notice that the scene is in the exact same position, and the only major difference is that you can see extra pixels in your peripheral view at the sides. The gun does shift slightly, but this is likely because guns in FPS games have their own independent FOV, so it may not be exact. But the entire background is nearly identical in the middle. The view you are seeing vertically and in the center does not change. So running ultra-wide or Surround/Eyefinity basically just adds visible area on the sides and does not take anything away.

DOOM_AB.gif


One thing to note is that not every game supports ultra-wide resolutions, and some don't work well at all. But if they do work, it behaves as shown here.
I thought I had been pretty clear about this in my posts. I've updated the image so that you understand:

doom_fov_29irrn.gif


DOOM seems to automatically adjust the FoV when changing aspect ratio, but you need to adjust it manually when using a larger display.

Why buy a 4k monitor when you have Ultrawide?
A 40" UHD display is typically less expensive than a 34" 3440x1440 ultrawide.
 
In your gif, going from 27" 16:9 to 34" 21:9 is exactly what happens in the game.

What does NOT happen is the 27" 16:9 to 36" 16:9 case: there, you have to manually increase your FOV in order to get that kind of FoV.

With games that use a FIXED FoV, when you go from any aspect ratio to the same aspect ratio at a higher resolution, you do not benefit from the higher resolution except in graphical fidelity.

Here is a select set of images I just took from the Doom demo. All of the images have a fixed FOV of 110, and I only changed the resolution (all 16:9; 4K and 5K are done using DSR).

https://puu.sh/qZmjI/ccca7bf370.jpg (1280x720)
https://puu.sh/qZmjF/a8369a86ca.jpg (1920x1080)
https://puu.sh/qZmjv/a3cb789aab.jpg (2560x1440)
https://puu.sh/qZmji/798c758c67.jpg (3840x2160)
https://puu.sh/qZmhC/6d05008c14.jpg (5120x2880)

All of these images share the exact same FOV despite being rendered at different resolutions. Increasing the screen size only increases the PHYSICAL size of the things you are seeing, but if all of the screen sizes are normalized, the amount of information displayed on the screen is actually identical.

Now, if you meant that having a bigger screen would allow you to increase FOV while keeping the physical size of the things you are seeing the same, then I'd agree, but this is detached from any resolution, and not all games allow FoV adjustments.

A wider aspect ratio is one way of forcing an FOV increase (if only on one axis) in games that support the resolution but don't allow you to modify the FOV. DOOM and BioShock Infinite are two games where I know you can modify your FOV so as to benefit from larger screens.

But again, not all games support FOV changes, just like not all games support Ultra-Wides.
 
It's hilarious; some people literally cannot see outside of the box. He knows what FOV technically means, and he's introducing an accommodation of the term to account for an increase in screen size while preserving angular resolution at a set viewing distance. It's only confusing if you insist on being pedantic.
 
In your gif, going from 27" 16:9 to 34" 21:9 is exactly what happens in the game.
What does NOT happen is the 27" 16:9 to 36" 16:9 case: there, you have to manually increase your FOV in order to get that kind of FoV.
That's why the images show that I had to increase the FoV setting from 100° to 117° when moving to a larger display to keep scale the same.

Now, if you meant that having a bigger screen would allow you to increase FOV while keeping the physical size of the things you are seeing the same, then I'd agree, but this is detached from any resolution, and not all games allow FoV adjustments.
Yes, that is my point.
In order to keep scale the same, you need to increase FoV as the display gets larger.
I never said anything about resolution affecting FoV, only that larger displays require higher FoV values.
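If it helps, the 100° to 117° jump is roughly what the geometry predicts. A quick check, with my assumption being a move from a 27" to a 36" 16:9 panel at the same viewing distance:

import math

# Widths scale 36:27 (i.e. 4:3), FoV treated as a horizontal angle
print(round(math.degrees(2 * math.atan(math.tan(math.radians(100 / 2)) * 36 / 27)), 1))
# -> ~115.6, in the same ballpark as the 117 shown; the exact value depends on
#    how the game defines its FoV setting.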

A wider aspect ratio is one way of forcing an FOV increase (if only on one axis) in games that support the resolution but don't allow you to modify the FOV. DOOM and BioShock Infinite are two games where I know you can modify your FOV so as to benefit from larger screens.

But again, not all games support FOV changes, just like not all games support Ultra-Wides.
It's just as likely that an ultrawide will reduce your FoV, as many games reduce vertical FoV with wider aspect ratios rather than increasing the horizontal FoV.
Not that it should matter, because you can fix that by increasing the FoV setting. It just takes a bit of work to calculate the correct value.
I can't think of any games which would have proper ultrawide support, but no FoV controls.
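For what it's worth, the "bit of work" is one line of trigonometry. A sketch of the usual correction, assuming the game's FoV slider is a 16:9-referenced horizontal angle (a common but not universal convention; check per game):

import math

def corrected_fov(fov_169_deg, target_aspect, base_aspect=16 / 9):
    # FoV value that restores the 16:9 vertical FoV on a wider screen, for games
    # whose slider is a 16:9-referenced horizontal FoV
    half = math.radians(fov_169_deg / 2)
    return math.degrees(2 * math.atan(math.tan(half) * target_aspect / base_aspect))

print(round(corrected_fov(90, 21 / 9), 1))       # ~105.4 to enter on a 21:9 panel
print(round(corrected_fov(90, 2560 / 1080), 1))  # ~106.3 for an actual 2560x1080 mode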

And most games - even if they don't offer FoV settings in the menus - can usually have it adjusted via config files or modifications.
If you can't adjust FoV at all, then it's going to look bad on many screens. It amazes me that game developers don't seem to understand that FoV controls are to adjust for your display setup, not to see behind you when playing a competitive FPS.
It's ridiculous that a game like Overwatch doesn't offer an unlocked FoV control so that things can be displayed correctly on a 21:9 panel or a large 16:9 display. By switching to VERT− they give 16:9 players an advantage, rather than leveling the playing field.
 
Right, so you manually changed the FOV. That explains things.

If you don't manually change the FOV, and use a game that supports wide-screen (i.e. Hor+) or use mods like FlawlessWidescreen, then Ultra-Wide and/or multi-monitor definitely DOES add more information on the sides (see my image above). The physical size of the monitor DOES NOT come into play at all; it's not a factor for the game. The only factor that the game takes into account is the aspect ratio, which is determined by the resolution. That is all.

Looking back at your posts, I see you did admit to manually adjusting the FOV but later started on the screen size thing, which is misleading to anyone not carefully following everything you said.
 
Too much detail on FOV in some posts, I'd say. It's simple:

1080p at 27" = 1080p at 1 billion". If both the 27" and billion" screens are 16:9, the image will be 100% identical, just larger, with no FOV changes, period.

If the aspect ratio of the screen deviates from 16:9 to something else, then FOV will change as you'll be seeing a different image - not just in size, but also in terms of what is actually rendered.

Unless, of course, you change it manually. That, however, is a completely different conversation.
 
Right, so you manually changed the FOV. That explains things.

If you don't manually change the FOV, and use a game that supports wide-screen (i.e. Hor+) or use mods like FlawlessWidescreen, then Ultra-Wide and/or multi-monitor definitely DOES add more information on the sides (see my image above). The physical size of the monitor DOES NOT come into play at all; it's not a factor for the game. The only factor that the game takes into account is the aspect ratio, which is determined by the resolution. That is all.

Looking back at your posts, I see you did admit to manually adjusting the FOV but later started on the screen size thing, which is misleading to anyone not carefully following everything you said.
Really, you thought the massive 100° / 100° / 117° labels that I put on the images were somehow misleading?

So because a game does not automatically scale FoV based on display size - which I've already explained is not possible since FoV is a physical metric that is calculated using display size, aspect ratio, and viewing distance - that somehow invalidates the fact that larger displays need to use higher FoV values?

It's okay to change the FoV value or modify games to properly support ultrawides, but not okay to properly adjust the FoV on larger 16:9 panels because that means they show more than your 21:9 display when set to the same scale?
 
OK, well neither of us is wrong. But we are talking about different things. It really depends what you are trying to accomplish.

When I use a bigger display (like the 40" TV I just bought), I expect to play the game same as with a small monitor (say a 27") just bigger. This, I believe, is what most people expect.

What you are saying is increasing the FOV manually in the game, such that the center 27" on a 40" display is exactly the same as playing on the 27", and the surrounding screen area includes more information.

That's fine if that is what you want. I hadn't really considered that option, and it doesn't seem like the common use-case. But that's what PC gaming is about: tweaking the settings to get what you're looking for, and that's fine.
 
When I use a bigger display (like the 40" TV I just bought), I expect to play the game same as with a small monitor (say a 27") just bigger. This, I believe, is what most people expect.

What you are saying is increasing the FOV manually in the game, such that the center 27" on a 40" display is exactly the same as playing on the 27", and the surrounding screen area includes more information.
Why would you expect that making the display bigger in one dimension would enable the use of a larger field of view, but making the display bigger in both dimensions would not? That doesn't make any sense.

Then your game would be scaling like this:
doom_fov32msme.gif
 
As someone who has used both a large 48" 4k and an ultrawide, I can tell you they are different experiences. When doing a custom 21:9 resolution on your big TV, you still have a HUGE TV in front of your field of vision with bezels and black screen everywhere, whereas with a 21:9 monitor, the monitor is all you see.

It's the same reason some people use ultrawide 21:9 projection home theater screens with anamorphic lenses vs one big 16:9 screen with black bars.
 
Why would you expect that making the display bigger in one dimension would enable the use of a larger field of view, but making the display bigger in both dimensions would not? That doesn't make any sense.
Clearly, you've not read anything I've written. The situation has nothing to do with big or small. The size of the screen is completely irrelevant. The only factor is the aspect ratio. Not sure why this is so hard to understand.
 
Clearly, you've not read anything I've written. The situation has nothing to do with big or small. The size of the screen is completely irrelevant. The only factor is the aspect ratio. Not sure why this is so hard to understand.
Only if you use the default FoV that the game gives you without correcting it.
That would be like me saying that ultrawide screens all give you a tiny FoV because some games are VERT− rather than HOR+, even if you can just increase the FoV setting to fix it.
 
It's ridiculous that a game like Overwatch doesn't offer an unlocked FoV control so that things can be displayed correctly on a 21:9 panel or a large 16:9 display. By switching to VERT− they give 16:9 players an advantage, rather than leveling the playing field.

The competitive scene is a different story. Blizzard is probably playing it safe here, lest they be CONTINUOUSLY bombarded with complaints that players with an ultra-wide aspect ratio have an 'edge' because of their larger horizontal FOV - they can simply move their eyes to see a bit further to each side rather than needing to move their mouse.

Not saying I agree with it, or that there is any actual indication that ultra-wides actually give a significant advantage if used that way, but it's something the players will cry about.

With regards to FoV control, I must admit I don't really play FPS games religiously (I play a lot of top-downs or side-scrollers), so I never looked too deeply into FoV changes, nor do I bother with it outside of in-game option controls.
 
Here's a video of me playing Deus Ex MD at the custom 21:9 resolution. Though slightly less immersive than using the full screen, once you get into the game, you hardly notice the letterboxing anymore.

 
As someone who has used both a large 48" 4k and an Ultrawide I can tell you they are different experiences. When doing a custom 21:9 resolution on your big TV, you still have a HUGE TV in front of your field of vision with bezels and black screen everywhere, where as with a 21:9 monitor, the monitor is all you see.

Its the same reason some people use Ultrawide 21:9 projection home theater screens with anamorphic lenses vs 1 big 16:9 screen with black bars.

Yeah, not sure I understand your reasoning. The black bars are annoying, but the wall behind the monitor is not? Either way, the break at the edge of the image kills immersion. The fact that you can accept the bezel and the wall behind it but not the black bars just means that you believe the bezel affords you better immersion. To me, they both break immersion. I agree that a big 4K gives you more options: you can make the screen smaller and wider, but you cannot make it bigger on a widescreen.
 
With HOR+ games and most graphics suites, the virtual camera is just like the lens of a real camera and uses virtual cinematography. Just like a real lens, any wider aspect will show more of a scene. Leapfrogging FoVs one over another becomes a bit pedantic as you zoom out by steps over and over and get into some kind of chicken-or-the-egg argument.

I think the OP's question was a good one. If all things were equal regarding the monitor technologies included, performance, and sizes available, it would make a lot more sense. I think the limitations of 4k monitors vs the higher-end high-hz + variable-hz gaming monitors are the major hurdle to using one vs a high-hz 16:9 or 21:9 with all of the modern gaming advancements included (high hz, g-sync, modern gaming overdrive, ultra-low response time even at 60 - 75hz or less, low input lag). I can do 120hz native at 1080p on my 4k VA tv, but 21:9 and 4k on it would be down to 60hz again. It has 15 - 20ms input lag in game mode but it lacks 4:4:4 (not a deal breaker to me on a tv), has no modern gaming overdrive, no variable hz/g-sync, and is way too big for any desk scenario even if it were the 55" model imo. It does have a direct-LED zone-lit/local-dimming backlight and is VA, so the contrast and black level depth/detail in blacks blow away my ROG Swift gaming monitor, and any monitor I've ever owned for that matter.

There will be 4k 120hz+ monitors, but GPU setups usually can't do the 4k at 100fps-hz average required to appreciably realize the benefits of a high-hz monitor (which would typically ride an fps-hz graph "vibrating" dynamically in a range from around 70fps to 140fps with variable hz). The arbitrary "ultra" ceiling cutoff set by devs in any generation isn't going to stand pat either; it's only going to go higher with the newer gpus as well.

For my next desk gaming monitor, I am looking forward to reviews of the 27" 16:9 2560x1440 and 35" 21:9 3440 x 1440 200hz VA panels due out end 2016/Q1 2017. I will likely get one of those.
These should carry me for what I guess will be 3 - 5 yrs (probably the longer end for me) until I feel OLED and HDR have matured enough and are available in more tv and monitor models (as well as having more available content), and become merely expensive rather than what I consider cost prohibitive.

VA screens have much greater black depth and detail in blacks than IPS and TN. A calibrated IPS or TN screen usually has a contrast ratio of 750:1 to 900:1 (some list 1000:1). This is also compromised by your room lighting and any ambient lighting level/location shifts relative to your eyes post-calibration, however. A non-zone-lit, non-dynamic VA monitor is usually around 4800:1 contrast ratio (listed as 5000:1).

======================================================
post by igluk

Here is yet another update to the TftCentral articles, only concerning AUO:
LCD and TFT Monitor News

In short:
- 25'' & 27'' 1080p 240hz TN panels end of 2016
- 35'' 3440x1440 now 200hz VA end of 2016
- 31.5'' 1440p 144hz VA Q4 production (same as planned Samsung panel)
- 27'' 1440p 144hz VA in planning phase
- 27'' 4k 144hz AHVA (IPS) mass production in 2017
- 240hz 1440p planned in 2017
 
Doesn't VA have horrid response times, though, that basically render (no pun intended) 200hz useless on a VA panel?

My next gaming monitor will probably be a high-refresh-rate OLED. VA, TN, and especially IPS all have shortcomings big enough that I do not plan to use any of them for gaming purposes if OLED becomes available.

I am pretty content with what I have currently, though; even the TN monitor is doing great for me. But both my PG278Q and BL3201PT have flaws big enough that I would not use either one alone, and I would prefer my next monitor to be devoid of the flaws of both if alternatives are available.
 
Well yeah, we all want OLED. Problem is, right now the cheapest you can buy is the LG B6 and that goes for $2500. Hardly affordable. If it were cheaper, we'd all be buying OLEDs.
 
$2500?

That's a LOT more reasonable than what LG are charging over here.

It's been a while since I looked up the prices of OLED TVs, but back then, I remember those 4k TVs costing nearly $7000; I'm just not sure what the prices are right now.

4k 144hz @ $2500, assuming my no financial responsibility situation continues, I would probably bite.

Edit: I had a quick skim around the net; the OLED55B6T is currently being sold for a little under US$4000, which is basically the US price for the 65" version.

55" is the smallest 4k TV's (of any panel) we have, any smaller I would have to choose more asian local brands, which probably no one in US have heard of.

EDIT: I have forgotten to mention that Taiwan's taxes are very crooked when it comes to displays.

If a Smart TV is sold as a single unit (I'll explain this in a minute), then it gets charged higher taxes than a Smart TV that has the 'Display' and the 'Smart' part separated. I.e. if the TV is literally just a display, then it's taxed as a 'computer monitor', and the 'Smart' part (which resembles a brick-sized PC) gets taxed separately, leading to lower taxes. So I think it's partly because of that tax.

But this is also why I am eagerly waiting for computer monitors. If TVs and monitors have the same MSRP, monitors will be cheaper for me (notwithstanding actual availability).
 
https://www.amazon.com/LG-Electroni...?ie=UTF8&qid=1473253153&sr=8-1&keywords=lg+b6

That's the B6 at 55" for $2500. I don't think this is mainstream pricing though, and certainly not "affordable" for most people. When this goes down to 800, that's when it's mainstream. So, ~%25 of the current price. It will happen, but probably not in another 5/6 years. You can buy a new screen now and have it comfortably last until then :)
 
That's the one I was looking at. But as I was saying, if the Dell 4k 144hz OLED monitor ever gets a $2500 MSRP, I am probably biting.

TVs probably won't EVER get that sort of refresh rate, as they seem to be much more focused on resolution than refresh rate, and HDMI 2.0 falls far short of the bandwidth required.

Getting even a $2500 OLED TV at the moment is basically just upgrading the size and the panel of my BL3201PT, which is a little steep. A $2500 OLED 4k 144hz monitor is a lot more reasonable, even if it isn't 55".
 
The 2560x1080 21:9 VAs were only tight enough in terms of response time and ghosting up to about 120hz (per the review on tftcentral), so yes, in their case the 200hz would just be overage. Some people cap their high-hz monitors at or below their maximum hz to prevent input lag, so it could be useful for that.

The Eizo FG2421 was 120hz but also used some kind of frame duplication combined with screen blanking to do what they were calling "240hz". I have heard nothing about that kind of tech being included in the upcoming high-rez, high-hz VA gaming monitors due out, but it would be a great addition. Probably too much to hope for. Of course, if they have g-sync models they should have "lightboost", but that isn't exactly the same.

================================================
Eizo Foris FG2421 Review
In order to help improve the gaming performance and smoothness of moving content an additional feature has been added to the display by Eizo. This is independent of the 120Hz refresh rate support and Eizo refer to it as "Turbo 240". This is available in the OSD menu 'color' section, in the 'advanced settings' as shown above. In some preset modes (FPS1 and 2) the Turbo 240 is enabled by default and cannot be turned off. In the user modes you can choose whether to have it enabled or not.


This feature is designed to reduce the perceived motion blur much like the LightBoost system does. Since it is natively supported by the screen and easily turned on and off at a screen level from the OSD, there is no need for "hacking" your graphics card settings or trying to enable backlight strobing like on LightBoost monitors. It also means the feature can be used with non-NVIDIA graphics cards which is useful. Turbo 240 involves doubling the frames of the input signal (replicating each frame) and then blinking the backlight off and on in sync with these new frames like an impulse-type display. This should help reduce perceived motion blur much like the LightBoost backlight strobing does on TN Film screens.
JUphNYA.png


===========================================================

I'm interested in those VA monitors coming out, obviously. For me, OLED computer monitors can wait until there are more models out, with variable-hz tech included, at closer to $2k (or less). By the time the Dell comes down in price there should be others on the market, I'd think, and hopefully with OLED wrinkles worked out, variable hz, and maybe even some kind of image retention/blur reduction tech.
OLED tv's are not the same thing as a modern gaming monitor.
 
That's the one I was looking at. But as I was saying, if the Dell 4k 144hz OLED monitor ever gets a $2500 MSRP, I am probably biting.
TVs probably won't EVER get that sort of refresh rate, as they seem to be much more focused on resolution than refresh rate, and HDMI 2.0 falls far short of the bandwidth required.
LG Wows IFA with OLED Tunnel & HFR HLG HDR TV Demo
lg-2160p100jeu1e.jpg


Seems like there's a good chance of it happening next year.
 
The LG OLEDs are beautiful. I did love the quality, and I may have actually spent the money, but 40" was about the largest that would fit on my desk in this space, and it looks like they start at 55".

I'm not sure which is more likely: 4K monitors getting larger and getting TV features like OLED or HDR, or them making smaller high-end TVs. And what's the deal with 4K projectors? The prices are out of this world.
 
I bought a 4k 40" TV (KU7000) to use its native res (3840x2160) but found 3840x1600 (21:9) very useful to me as a custom res. Reason I prefer that is that the vertical height of a 40" monitor is too high for desktop usage, I have to look upwards at many things on screen, ie, browser controls, etc. But at 1600p is perfect. The black bar at top lowers it by about 2".
 
Personally I agree that 40" is the maximum size to use as desktop monitor. I've also been waiting on OLED, and I expected to keep my 40" 1080p screen until OLED became affordable. However there was a good 5-year hiatus where OLED went from promise to... nowhere, so I bought a 40" 4K LCD to hold me off, and for $300 it was a wonderful deal (compared to what I paid for te 1080p one in ~2009).

Only in the past couple years did OLED start becoming commercially feasible, so I'm guessing it'll be another ~3 years until OLED becomes cheap enough that 40" panels can be justified. Meanwhile, cost pretty much ensures more expensive tech like OLED will be tied to bigger panels - it makes it even more expensive to produce the new tech, but manufacturers can sell it at much higher price margins, so it's worth it for them.

Mark my words - 2020 is when we'll be able to buy a decently priced (~$500?) 40" OLED 4K panel, and by then it'll support HDR perfectly too. What do I base my prediction on? 20 years of consumer experience buying and researching all display technologies because they fascinate me.
 
I was going to say by 2020 we would have holograms or something, but then I realized that's only in 3 years, lol.
 
Is it possible to move the picture to the bottom of the screen so you don't have black bars on the top and bottom? Instead you'll have one big black bar on top.
 
Yeah, not sure I understand your reasoning. The black bars are annoying, but the wall behind the monitor is not? The fact that you can accept the bezel and the wall behind it but not the black bars just means that you believe the bezel affords you better immersion. To me, they both break immersion.

Simple, run in windowed mode and set your desktop background to match the color of your walls :D

Seriously though, you have to factor in the human perception of form over function; it's not always logical, but a wide 21:9 display might just feel a lot more immersive, even elegant, to some than a standard 16:9 with thick black bars on the top and bottom. Here we all talk about technical aspects, numbers, and facts, but there is still preference and subjectivity involved.


I bought a 4k 40" TV (KU7000) to use its native res (3840x2160) but found 3840x1600 (21:9) very useful to me as a custom res. Reason I prefer that is that the vertical height of a 40" monitor is too high for desktop usage, I have to look upwards at many things on screen, ie, browser controls, etc. But at 1600p is perfect. The black bar at top lowers it by about 2".

Which is why the LG38UC99 kind of makes some sense. It is not pushing the physical height limits like a 40" 16:9, but it's also not too vertically squat like a lot of ultrawides. It is native 3840x1600. Its PPI is just slightly higher than 27" 1440p, so it's ideal and requires little to no scaling. As with most monitors, it has low input lag and 1:1 mapping and uncorrupted color support like any other monitor (I think it might support 10-bit, but it's not HDR). The GPU load is less, horizontally at least it is the same pixel count as 4k, and it supports freesync. Now if this screen was $500 - 700 less and supported 120hz & HDR + 4k native input (it might be able to do that bit already), then for me I'd be happy to lose that extra vertical resolution for the vast curved display. Sadly this probably won't happen for another few years, and by that point I'm hoping VR is good enough to use as a part-time desktop replacement (that probably won't be ready either :shifty: )
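The PPI claim checks out, for what it's worth. A quick sketch; the 37.5" diagonal for the 38UC99 is my assumption:

import math

def ppi(w_px, h_px, diag_in):
    # Pixels per inch along the diagonal of a flat panel
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(3840, 1600, 37.5), 1))   # ~110.9, assuming a ~37.5" diagonal for the 38UC99
print(round(ppi(2560, 1440, 27.0), 1))   # ~108.8 for a 27" 1440p monitor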

All that said, I'd imagine a reasonably priced 40" 4k monitor with 120hz/144hz, HDR, and freesync/adaptive sync would make a lot of people consider that over a 21:9 screen. They can't be too far away now.
 
There's an interesting irony in that 2 years after I created this post, I'm now contemplating going from my 40" 4K screen to a 29" ultrawide, because there are enough games that cannot force-res within 16:9, and I'm willing to lose the big, defined 4K screen to gain the 21:9 aspect ratio.

Oh, how we change as years go by. Now I long for a 34" 1440p DCI P3 FALD HDR screen, but won't be able to justify the high price for a long, long time...
 
I have zero interest in 4K - I don't think the pixel density is worth the extra GPU horsepower it takes to run it - if I'm going to push more pixels, I want more FOV width.

I'm happy with my 3x 1440P monitors and the only thing I'd consider as a replacement are the new 32:9 panels that are like a pair of 2560x1440 screens melded together. Has to have G-Sync though or no sale.

So it's gonna be a while.
 
I've considered swapping out my triple 1440p set for an ultrawide, but productivity is really nice with 3 monitors. I can have an IDE on one screen, web browser on another, and the app on the third screen. I'm not sure it would be the same with one screen.

Also, more on topic, running custom 21:9 on AMD Vega is a big pain; it basically doesn't work. I had no problem when I was using an Nvidia Titan X Pascal, but so far I have not been able to play at 21:9 or any custom res with Vega.
 
Am I crazy here? How is this not a win-win?

You're not crazy. When VRR TVs are common, this will make the most sense across the board. The only thing PC monitors have over TVs right now is variable refresh and (sometimes) lower input latency. With HDMI 2.1, low latency modes, and VRR, there's going to be pretty much no reason to buy a computer monitor anymore.

27" 16:9 is too small, too. The shitty computer industry doesn't even make 30"+ displays normally. It's insanity.
 
Also, more on topic, running custom 21:9 on AMD Vega is a big pain; it basically doesn't work. I had no problem when I was using an Nvidia Titan X Pascal, but so far I have not been able to play at 21:9 or any custom res with Vega.

In fact, when I bought my current 40" 4K display, the *only* reason I sold my RX 480 and got a 1060 was that I couldn't manage to force-res games to 21:9 within 16:9. I knew Nvidia could do it (they have a checkbox to ignore whatever the display/game wants to do, which is why it works) and upon switching, it took 5 minutes to get it working. At least, on the games that allow you to do this.

You're not crazy. When VRR TVs are common, this will make the most sense across the board. The only thing PC monitors have over TVs right now is variable refresh and (sometimes) lower input latency. With HDMI 2.1, low latency modes, and VRR, there's going to be pretty much no reason to buy a computer monitor anymore.

This was my mentality 2 years ago when I opened this thread. However, I've found since then that some games refuse to let you force them into a non-native aspect ratio, appearing stretched even though the 21:9 dimensions are correct. I've fought with several Tomb Raider and AC games that do support 21:9, but refuse to play along with 21:9 within 16:9. It's become so annoying that I'm now waiting for Black Friday discounts on a 29" $200 ultrawide to make do for the next few months, until a decent 34" 1440p DisplayHDR600 monitor comes down to a sane price (which could be about a year). So, while my initial thoughts haven't changed, the reality of game software has gotten in the way of that dream, and has overcome my desire for flexibility - guess I'm going back to :eek: 1080p, but at least I'll get freesync, since I'm seriously considering going back to a Radeon next spring (because honestly, subjectively, I feel like my games ran more fluidly and with better color on the Radeons... and I say that as someone who historically has mainly owned Nvidia GPUs).
 
I've fought with several Tomb Raider and AC games that do support 21:9, but refuse to play along with 21:9 within 16:9. It's become so annoying,

If you change your monitor resolution on the desktop/gpu driver level rather than keeping the desktop full 16:9 resolution and only trying to force resolution in game to 21:9, does it have the same effect?

----------------------

I'm eyeing the 65" Samsung Q9 line whenever their 2019 models get hdmi 2.1 so would have my desk set ~ 5' away and would like the option to run 21:9 or 21:10 resolution on it at times for higher frame rate and wide aspect. The Samsung Q9's have 480 zone FALD 1000nit+ HDR, VRR, and when they get hdmi 2.1 models should have 120hz native 4k with QFT (low input lag gaming spec of hdmi 2.1 .. quick frame transport). The Q8 series is similar for half the price but has 80 fald zones.

At this point I'm willing to wait until 2020 if I have to though, whenever strong enough pc gpus that actually support VRR and hdmi 2.1 come out. The Xbox One already supports VRR on hdmi 2.0b - so unfortunately it's up to Nvidia, whose monopoly on the most powerful gaming cards is keeping everyone hostage on g-sync longer, and whatever AMD can come up with in the next two years, I guess.
 
If you change your monitor resolution on the desktop/gpu driver level rather than keeping the desktop full 16:9 resolution and only trying to force resolution in game to 21:9, does it have the same effect?

Exactly the same problem. The trick, I think, is that the OS tells the game - this is the native res/aspect ratio of this monitor - and some games operate with that information, nothing else. So when you create a different aspect ratio inside it, the game is unable to adapt. Let's be clear - this is 100% a software support problem; there is zero problem with the hardware. But when two of the game series I want to play most have the exact same issue... yeah, I can forgo the 4K just so I can have 21:9, even if at a lower resolution. A 29" ultrawide will get me going for a little while for dirt-cheap money, and it won't be that much smaller horizontally than my 40" 16:9.

My only warning for your Samsung plans is to be careful about games that will refuse to let you force-res. They may show the correct aspect ratio, but appear wrong - see this previous thread I started. It's the only reason I'm now moving to an ultrawide.
 
This was my mentality 2 years ago when I opened this thread. However, I've found since then that some games refuse to let you force them into a non-native aspect ratio.

While I hear that, what you describe is a software problem and not a hardware problem. People shouldn't support games with ass engineering.
 
Oh and what's funny is that my version of Final Fight actually explicitly supports this. 2m47s in the video.

 