chaikovski2002
2[H]4U
- Joined
- Nov 19, 2003
- Messages
- 2,247
1200p at 60hertz. Now and forever. Amen.
That's what I'm on right now, a 28-inch 1920x1200.
1200p at 60hertz. Now and forever. Amen.
My dual 290X setup drives BF4 on ultra just fine at 3840x2160.
I've been playing BF4, Alien: Isolation, Titanfall, and the Metro Redux games... no scaling issues for text/elements.
Also, increasing the resolution can make games look WORSE as the detail-per-pixel gets lopsided. Texture detail is usually fine, but polygonal/geometric detail is lost per-pixel.
What are you talking about? Detail per pixel? A pixel only has one detail: its color. You're basically saying that splitting one single-colored pixel into 4 unique ones equals less detail.
Uhhhh. That's not how that works. It's not that individual pixels get 'less' detail (as you stated, that's impossible), but rather that an edge that looked 'round' may not look 'round' when four times the pixels are used to define its geometry. Jagged edges and low-res textures start to show through MUCH more when more pixels are available to render them.
I can see those pixels from here.
That's what I'm on right now, a 28-inch 1920x1200.
Uhhhh. That's not how that works.
I'm pretty certain it is. I'm not talking about the "jagged" edges commonly associated with raster aliasing; I'm talking about the ability of a low resolution to 'hide' a lack of polygonal detail/texture detail.
An example is older PSX or N64 games rendered at 1080p. The added definition makes the lack of detail much more apparent. Basically there is more screen resolution, but no detail in the scene to utilise it.
I'm not saying that 4K is guaranteed to make a game look worse than the same game at 1080p, but rather that the added resolution may reveal some nasty bits of short-cutting you wouldn't have/couldn't have noticed otherwise.
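For what it's worth, the "low resolution hides low polygon counts" point can be put in rough numbers. A minimal sketch (the 24-sided circle and the 100-pixel radius are made-up illustration values, not taken from any actual game): the worst-case gap between a regular n-gon and the circle it approximates is the sagitta r·(1 − cos(π/n)), and since the radius in pixels doubles going from 1080p to 4K, so does the on-screen size of that gap.

```python
import math

def facet_error_pixels(n_sides, radius_px_at_1080p, scale):
    """Worst-case gap (sagitta) between a regular n-gon and the true
    circle it approximates, measured in on-screen pixels.
    `scale` is the resolution multiplier relative to 1080p (2.0 for 4K)."""
    radius_px = radius_px_at_1080p * scale
    return radius_px * (1 - math.cos(math.pi / n_sides))

# A 24-sided "circle" with a 100-pixel radius at 1080p:
err_1080 = facet_error_pixels(24, 100, 1.0)  # ~0.86 px: hides inside one pixel
err_4k   = facet_error_pixels(24, 100, 2.0)  # ~1.71 px: the flat facets show
```

At around one pixel of error the facets disappear into the raster; at two pixels they start to read as flat edges, which is exactly the "revealed short-cutting" effect described above.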
I understand what you are saying, but that is pretty silly. It's like people who say movies shouldn't go over 24fps because you notice the extra detail that exposes the effects or makeup or whatever.
Like this example, I would still take the higher resolution image even though it no longer looks like a sphere.
and 4K runs like a slide show.
GoldenTiger said:
Yep, higher resolution gives you more pixels for a given screen area representing in-game objects/textures/etc. Say you're at a door that takes up the center bottom sixth of your screen. At 1920x1080, you might have roughly 800 pixels across and 500 high representing that door. The texture on the door has to be scaled by the game engine to fit in those dimensions. If the source texture is larger, it gets scaled down, thus reducing its visible quality. If you were on a monitor with twice the total pixel count and the same aspect ratio, you'd have a lot more of the texture visible in the same overall viewport area. This happens dynamically and scales as you move around and the door takes a different amount of screen space.
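The door arithmetic above can be sketched directly. A minimal illustration (using 4K rather than the post's "twice the total pixel count" case, and its rough 800x500 figure; the 2048-texel source texture is a made-up example value):

```python
def object_pixels(res_w, res_h, frac_w, frac_h):
    """On-screen pixel dimensions of an object covering the given
    fractions of the viewport width and height."""
    return round(res_w * frac_w), round(res_h * frac_h)

# The door from the post: ~800 x 500 pixels at 1920x1080.
door_frac_w, door_frac_h = 800 / 1920, 500 / 1080

print(object_pixels(1920, 1080, door_frac_w, door_frac_h))  # (800, 500)
print(object_pixels(3840, 2160, door_frac_w, door_frac_h))  # (1600, 1000)

# If the source texture were 2048 texels wide, the engine keeps roughly
# 800/2048 = 39% of its horizontal detail at 1080p but 1600/2048 = 78%
# at 4K: same door, same viewport fraction, twice the visible texture.
```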
The same concept applies to geometry and the effect it has, especially on object edges in regards to antialiasing. If you have an edge that is horizontal at a slight angle downward left to right, think of looking through a grid (graph paper) where each square is representing a pixel. Think about how you could try to draw the line between its start and endpoints. You'll probably realize the problem here already: you don't have enough squares vertically to make it not come out looking very jagged and rough, and it doesn't look like a straight, smooth line like you see in real life on an edge.
Screen resolution acts like this, a good analogy being a "screen door" that you are looking through to the virtual world, where each square of the screen was one pixel. The more resolution you have, the more pixels you have to represent a given object taking the same amount of viewport space, and thus smoother-looking, less-jaggy edges (never perfect though, even at super-high resolution) and much more of a texture represented in comparison to its source.
EDIT: Also, the comments about 4k being pointless or even detrimental to image quality are ignoring how computer rendering works. I'll quote an old post of mine...