Are you planning to buy a 4K monitor for gaming soon?

I’d like to know if you are planning to purchase a 4K monitor in the near future.

  • I plan to buy a 4K monitor within the next month or so.
    Votes: 18 (12.8%)
  • I want a 4K display but can't afford it.
    Votes: 17 (12.1%)
  • I want a 4K display but my graphics card isn't powerful enough for it.
    Votes: 32 (22.7%)
  • 1080p is good enough, bugger off!
    Votes: 49 (34.8%)
  • I already have a 4K display.
    Votes: 25 (17.7%)

  Total voters: 141
My dual 290X setup drives BF4 on ultra just fine at 3840x2160.

I've been playing BF4, Alien Isolation, Titan Fall, and the Metro Redux games... no scaling issues for text/elements.

+1, I had no issues with GRID 2, Civ 5: BE, Endless Legend, Endless Space (the only game with a crummy UI at 4k), or X-Com with a single R9 280X. I guess those aren't the most demanding games in the world, though? Looked as pretty as could be.
 
My experience with 4k has been rather different, very different, but my decision not to go 4k comes from extrapolating on-paper performance and my personal experience with DSR'ed 4k resolution to an actual 4k monitor. Since no shops here actually display their 4k monitors, I have no idea what the increased PPI really looks like.

The games where I found UI issues were Civ 5 (not BE), Mass Effect 3, DA:O, and a few others. Civ 5 did let me see more of the map, but the UI was so small that the text was literally illegible. I must admit that I did not search for a UI scaling option in Civ 5, but I had to disable DSR to see things again. DA:O and ME3 are not exactly the most demanding games, and while ME3 does look a bit better at 4k, I could not actually use the UI.

I tend to be very picky about stuff like this. If there is a UI scaling problem in one game, even an obscure one, I will revert to what worked originally. For example, while I haven't played CoH2 for a while, the game completely refused to run under Win 8.1 for some reason, and I wiped the OS for Win 7 because of it.

And, yes, SLI 980s would give good performance at 4k, but when I compared SLI 970s at 1440p versus 4k, the difference was that at 1440p the second 970 was a bonus, while at 4k the second 970 was required, which broke the deal. For example, I thought I wasn't going to be able to play Wolfenstein: The New Order at 4k since it doesn't support SLI, and for what it's worth, I can play AC:U at 1440p while it's probably unplayable at 4k at the same level of detail (in hindsight). Sure, I can lower the resolution, but lowering the resolution on a high-res screen seems a bit counterproductive.

Lastly, I wanted to try high-refresh monitors, and 4k doesn't have them due to DP 1.2 bandwidth limitations.
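
For anyone curious about the numbers, here's a rough back-of-envelope sketch in Python. The ~17.28 Gbit/s figure is DP 1.2's effective payload rate after 8b/10b encoding; the ~10% blanking overhead is my own ballpark assumption, so treat the output as approximate.

```python
# Rough bandwidth check for 4k modes against DisplayPort 1.2.
# DP 1.2 (HBR2, 4 lanes) carries about 17.28 Gbit/s of video data after
# 8b/10b encoding; the 10% blanking overhead below is an assumed ballpark.

DP12_GBPS = 17.28        # effective DP 1.2 payload bandwidth, Gbit/s
BLANKING = 1.10          # assumed ~10% overhead for reduced-blanking timings

def needed_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Approximate video data rate for a display mode, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * BLANKING / 1e9

for hz in (60, 120, 144):
    rate = needed_gbps(3840, 2160, hz)
    verdict = "fits" if rate <= DP12_GBPS else "exceeds"
    print(f"4k @ {hz} Hz needs ~{rate:.1f} Gbit/s -> {verdict} DP 1.2")
```

By that rough measure, 4k at 60 Hz fits comfortably while 120 Hz and up does not, which lines up with current 4k panels being capped around 60 Hz over DP 1.2.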

TL;DR: My decision not to go 4k was about making sure I can play every game without problems, and on paper I found more potential and current problems with 4k than with 1440p, hence I went with the Swift.

my 2c
 
Also, increasing the resolution can make games look WORSE as the detail-per-pixel gets lopsided. Texture detail is usually fine, but polygonal/geometric detail is lost per-pixel.
 
Also, increasing the resolution can make games look WORSE as the detail-per-pixel gets lopsided. Texture detail is usually fine, but polygonal/geometric detail is lost per-pixel.

What are you talking about? Detail per pixel? A pixel only has one detail: its color. You're basically saying that switching one single-colored pixel to 4 unique ones equals less detail.
 
I was perfectly fine with my 23" 1080p until... DSR. Now at 3200x1800 I feel like I need (or rather, really want) a larger monitor. I'm even considering up to 4K. For now I will wait patiently, as I really don't want to spend the money just yet.
 
What are you talking about? Detail per pixel? A pixel only has one detail: its color. You're basically saying that switching one single-colored pixel to 4 unique ones equals less detail.

It's not that individual pixels get 'less' detail (as you stated, that's impossible), but rather that an edge that looked 'round' may not look 'round' when four times the pixels are used to define its geometry. Jagged edges and low-res textures start to show through MUCH more when more pixels are available to render them.
 
No way. If you've been an A/V enthusiast for a long time, you'd know that there are other more important things to IQ than just pixel density. From your monitor, you also want low black levels, good contrast, good color accuracy, good viewing angles, etc. - and just about every one of these aspects is significantly more noticeable than pixel density. Resolution isn't a game about higher numbers - there's actually a sweet spot based on your viewing distance, quality of your eyesight, and the size of your screen. From my experience, 4K is a long jump off into the realm of diminishing returns. You're paying a lot more for your pixel density, as well as possibly sacrificing the more important aforementioned aspects of IQ. Hell, from an appropriate viewing distance, it's difficult to tell the difference between a 720P TV and 1080P TV watching a Blu-Ray (assuming the sets produce similar quality aside from resolution).

Seems more like a marketing gimmick than a practical feature in a monitor I'm looking to buy. If I wanted fewer jaggies or sharper detail, I'd be perfectly satisfied with a 1440p monitor and using DSR from there. I'd honestly be surprised if there are any 4k monitors or TVs that even come close to lower resolution sets in overall image quality. And I think it's a little sad so many people have already bought into 4K despite there not really being much warrant or much in the way of 4k content. There's barely even an existing 4k market, yet the majority of people want to hop on the bandwagon (if they haven't already)? Either people like to have their faces right up against their monitors, or they're very eager to spend huge amounts of money for a negligible boost in sharpness. I mean, what are you guys going to do when 8k sets pop up? Upgrade your rigs more and jump onto those too? 144Hz is way more important than 4k, and I don't even think 144Hz is all that important in the grand scope of things. I guess I shouldn't be surprised; many PC gamers buy into $800 flagship gaming monitors that deliver worse image quality than much larger, more beautiful $400 plasma sets. Of course, the plasma sets don't do high framerates, but I think PC gamers are really "feature" oriented.
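
To put that "sweet spot" idea into rough numbers, here's a quick sketch. The one-arcminute-per-pixel rule of thumb for 20/20 vision and the 28-inch viewing distance are my own assumptions, not anything from this thread.

```python
# How large does one pixel appear (in arcminutes) at a given viewing distance?
# The usual rule of thumb is that 20/20 vision resolves about 1 arcminute,
# so pixels much smaller than that are hard to tell apart.
import math

def arcmin_per_pixel(diag_inches, horiz_px, vert_px, distance_inches):
    aspect = horiz_px / vert_px
    width_in = diag_inches * aspect / math.sqrt(1 + aspect ** 2)
    pixel_in = width_in / horiz_px                      # physical pixel pitch
    return math.degrees(math.atan(pixel_in / distance_inches)) * 60

for name, diag, w, h in [("27in 1080p", 27, 1920, 1080),
                         ("27in 1440p", 27, 2560, 1440),
                         ("28in 4k",    28, 3840, 2160)]:
    a = arcmin_per_pixel(diag, w, h, 28)                # assume ~28 in away
    print(f"{name}: {a:.2f} arcmin per pixel at 28 inches")
```

By that measure a 28" 4k panel at normal desktop distance is already below the ~1 arcminute mark, which is the diminishing-returns argument in numeric form.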
 
I'm getting one of those 120+ Hz 1440p screens this year (hopefully), so the next step will be a 120+ Hz 4K OLED after that, once we actually have enough GPU power.
 
It's not that individual pixels get 'less' detail (as you stated, that's impossible), but rather that an edge that looked 'round' may not look 'round' when four times the pixels are used to define its geometry. Jagged edges and low-res textures start to show through MUCH more when more pixels are available to render them.
Uhhhh. That's not how that works.
 
70 Hz Philips 40" 4k monitor here... had a Samsung S27A950D... will never go back after seeing the beauty that is VA.
 
Uhhhh. That's not how that works.

I'm pretty certain it is. I'm not talking about the "jagged" edges commonly associated with raster aliasing; I'm talking about the ability of a low resolution to 'hide' a lack of polygonal detail/texture detail.

An example is older PSX or N64 games rendered at 1080p. The added definition makes the lack of detail much more apparent. Basically there is more screen resolution, but no detail in the scene to utilise it.

I'm not saying that 4k is guaranteed to make a game look worse than the same game at 1080p, but rather that the added resolution may reveal some nasty bits of short-cutting you wouldn't have/couldn't have noticed otherwise.
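
To put a rough number on the "more screen resolution, but no detail in the scene to utilise it" point, here's a tiny sketch. The triangle count and screen coverage are made-up illustrative figures, not measurements from any real game.

```python
# Illustrative only: a hypothetical low-poly model with a fixed triangle
# budget gets roughly four times as many screen pixels per triangle at 4k,
# so flat facets and straight silhouette edges have more room to show.
TRIANGLES_ON_SCREEN = 5_000     # made-up figure for an old, low-poly asset
SCREEN_COVERAGE = 0.25          # assume the model fills a quarter of the frame

for name, w, h in [("1080p", 1920, 1080), ("4k", 3840, 2160)]:
    pixels = w * h * SCREEN_COVERAGE
    print(f"{name}: ~{pixels / TRIANGLES_ON_SCREEN:.0f} pixels per triangle")
```

Same asset, same triangle count, roughly four times the pixels asking it to look smooth.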
 
Im good with my two 1440p monitors for now. Might be into 4k maybe in the future, just right now it isn't worth it (for me at least).
 
I'm pretty certain it is. I'm not talking about the "jagged" edges commonly associated with raster aliasing; I'm talking about the ability of a low resolution to 'hide' a lack of polygonal detail/texture detail.

An example is older PSX or N64 games rendered at 1080p. The added definition makes the lack of detail much more apparent. Basically there is more screen resolution, but no detail in the scene to utilise it.

I'm not saying that 4k is guaranteed to make a game look worse than the same game at 1080p, but rather that the added resolution may reveal some nasty bits of short-cutting you wouldn't have/couldn't have noticed otherwise.

I understand what you are saying, but that is pretty silly. It's like people who complain about movies over 24fps because you notice the extra detail that exposes the effects or makeup or whatever.

Like in this example: I would still take the higher resolution image even though it no longer looks like a sphere.

[image: eeia0gD.png]

[image: XYTyyA0.png]
 
I just don't see the point of going 4k on a monitor...

A friend has a 4k monitor (I think it's 29") and I really can't tell the difference from my 2560x1600.

Now, if you're gaming on a 50" screen, then yeah, absolutely. But how many are?
 
I understand what you are saying, but that is pretty silly. It's like people who complain about movies over 24fps because you notice the extra detail that exposes the effects or makeup or whatever.

Like in this example: I would still take the higher resolution image even though it no longer looks like a sphere.

[image: eeia0gD.png]

[image: XYTyyA0.png]

That's exactly what I'm talking about. I would still prefer the higher res image, but the game does look less realistic/convincing. It's like when HDTV launched and suddenly all of the newscasters who previously looked fine at 480i looked like hell at 1080i.
 
I will be at 1080p for a while. GPUs aren't fast enough for 1440p yet, and 4K runs like a slide show. I have neither the desire nor the money to upgrade two GPUs every year to get 60 fps. I also don't want to deal with SLI/Crossfire issues.

When a single $250-300 GPU can run the vast majority of new PC games at 60 fps with 2-4x AA, I will move to 1440p.
 
and 4K runs like a slide show.

Um, what? :rolleyes:

[image: qO9bQ9g.png]


EDIT: Also, the comments about 4k being pointless or even detrimental to image quality are ignoring how computer rendering works. I'll quote an old post of mine...


GoldenTiger said:
Yep, higher resolution gives you more pixels for a given screen area representing in-game objects/textures/etc. Say you're at a door that takes up the center bottom sixth of your screen. At 1920x1080, you might have roughly 800 pixels across and 500 high representing that door. The texture on the door has to be scaled by the game engine to fit in those dimensions. If the source texture is larger, it gets scaled down, thus reducing its visible quality. If you were on a monitor with twice the total pixel count and the same aspect ratio, you'd have a lot more of the texture visible in the same overall viewport area. This happens dynamically and scales as you move around and the door takes a different amount of screen space.

The same concept applies to geometry and the effect it has, especially on object edges in regards to antialiasing. If you have an edge that runs nearly horizontally, at a slight downward angle from left to right, think of looking through a grid (graph paper) where each square represents a pixel, and think about how you could draw the line between its start and end points. You'll probably see the problem already: you don't have enough squares vertically to keep it from coming out jagged and rough, and it doesn't look like the straight, smooth line you see in real life on an edge. Screen resolution acts like this; a good analogy is a "screen door" that you are looking through to the virtual world, where each square of the screen is one pixel. The more resolution you have, the more pixels there are to represent a given object taking up the same amount of viewport space, and thus smoother-looking, less jaggy edges (never perfect, though, even at super-high resolution) and much more of a texture represented in comparison to its source.
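
Putting the quoted door example into rough numbers: the ~800x500 pixel coverage at 1080p comes from the quote above, while the 2048x2048 source texture is an assumption of mine for illustration.

```python
# The quoted door example in numbers: the door covers a fixed fraction of the
# viewport, so at 4k it gets roughly four times the pixels and the source
# texture is downscaled far less aggressively.
SOURCE_TEXTURE_WIDTH = 2048                 # assumed source texture size
DOOR_FRACTION = (800 / 1920, 500 / 1080)    # share of the frame the door covers

for name, w, h in [("1920x1080", 1920, 1080), ("3840x2160", 3840, 2160)]:
    door_w, door_h = round(w * DOOR_FRACTION[0]), round(h * DOOR_FRACTION[1])
    shown = door_w / SOURCE_TEXTURE_WIDTH
    print(f"{name}: door drawn at ~{door_w}x{door_h} px, "
          f"~{shown:.0%} of the texture's source width")
```

With those assumed numbers, only about 40% of the texture's width survives the downscale at 1080p, versus closer to 80% at 4k, which is the "more of the texture visible in the same viewport area" effect described above.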
 
EDIT: Also, the comments about 4k being pointless or even detrimental to image quality are ignoring how computer rendering works. I'll quote an old post of mine...

No, he is sort of right. I wouldn't call it detrimental to image quality, more like exposing the flaws of a low triangle count, especially on objects with curved edges. I doubt it makes enough of a difference at 4k to be noticeable, though, especially since most games already have a pretty high polygon count as it is. And even if it does expose them, that would mean the image was blurry or aliased to begin with, which is why the flaws weren't noticeable at the lower resolution, and I'd rather take a crisp image any day.

Playing Wii games at high resolution in Dolphin shows how low the polygon counts are in certain Wii games, but nobody ends up playing those games at 480p by choice because of that.

So all in all, it's not a very good argument against 4k. If the flaws are exposed at 4k, then it must have already looked terrible at 1080p or whatever other resolution.
 
I wasn't contesting that higher display quality can show off art flaws; in fact, I basically explained further why it can :p. I was contesting people pretending there is no benefit to anything at 4k, or that it will somehow magically tank your performance without producing a sharper and more detailed image (especially noticeable at mid to long distances in games).
 