Difference between 4K resolution and 4K textures

edo101

Limp Gawd · Joined: Jul 16, 2018 · Messages: 480
So I recently got a 4K OLED and paired it with my R9 290. Suffice it to say, the card can't handle 4K well. I was playing Ace Combat 7, and though the game looks good, I had to turn the resolution down to 1440p on my 4K TV, and I didn't notice much of a difference between it and 4K, not texture-wise anyway. The only thing I noticed was my RivaTuner FPS overlay being bigger instead of smaller and finer, as it would be at 4K. I thought I'd have more screen real estate with 4K as opposed to 1440p, like you do on the Windows desktop, but no, the FOV remained the same.


Today I fired up Gears of War Ultimate and noticed the same pattern, except Gears of War Ultimate actually has a texture slider. At 4K, the game is barely playable on Ultra textures (4 GB VRAM) and even less so on 4K textures (6 GB). What I ended up doing was going down to 1440p and selecting 4K textures. The game looks just as good, maybe with a little less pop than it would have had at 4K resolution plus 4K textures.

Is this really it? Is that all there is to 4K gaming? It appears to me that 4K resolution means nothing if the game doesn't have 4K textures. I took some pics but was too lazy to upload them lol
 
4K textures don't equal 4K resolution, which is how many pixels the GPU is drawing.

Textures are just what gets mapped over the models and geometry. The resolution is how many pixels your computer has to draw to represent things.
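As a rough illustration (a minimal Python sketch with made-up names, not any engine's actual API): the render resolution decides how many pixels get shaded each frame, while the texture resolution only decides how much detail each of those pixels can look up.

def sample_nearest(texture, u, v):
    # Nearest-neighbour texture lookup: UV coordinates in [0, 1) mapped to a texel.
    tex_h, tex_w = len(texture), len(texture[0])
    x = min(int(u * tex_w), tex_w - 1)
    y = min(int(v * tex_h), tex_h - 1)
    return texture[y][x]

def render(width, height, texture):
    # Shade width x height pixels (the "resolution"), sampling the texture for each one.
    frame = []
    for py in range(height):
        row = []
        for px in range(width):
            u = (px + 0.5) / width   # screen position -> UV, independent of texture size
            v = (py + 0.5) / height
            row.append(sample_nearest(texture, u, v))
        frame.append(row)
    return frame

# The same tiny 2x2 "texture" drawn at two different resolutions (stand-ins for
# 3840x2160 vs 2560x1440): more pixels get shaded, but the texture detail is the same.
checker = [[0, 255],
           [255, 0]]
print(render(8, 4, checker))
print(render(4, 2, checker))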
 

I know it doesn't equal resolution, but what I don't understand is why I don't see a difference in image quality unless the textures are changed. What is the point of 4K when 1440p, at least in my opinion, looks just as good? The only thing I can say is that a 1440p 55" screen won't pop as much as a 4K 55". But aside from screen size, it seems you don't notice the difference unless the details are actually scaled up. Get what I'm saying?
 

And sensory input has its own limited resolution, aka you might just not be able to see the difference.
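As a back-of-the-envelope check (a Python sketch with a hypothetical helper; the ~60 pixels-per-degree figure is the commonly quoted limit of 20/20 vision, and the 55-inch panel and 7-foot distance are just example values):

import math

def pixels_per_degree(horizontal_pixels, diagonal_inches, distance_inches, aspect=16 / 9):
    # Physical width of a 16:9 panel from its diagonal, then the angle it subtends.
    width = diagonal_inches * aspect / math.sqrt(aspect ** 2 + 1)
    fov_deg = 2 * math.degrees(math.atan((width / 2) / distance_inches))
    return horizontal_pixels / fov_deg

print(pixels_per_degree(3840, 55, 84))  # 4K at 7 ft: ~120 px/deg
print(pixels_per_degree(2560, 55, 84))  # 1440p:      ~80 px/deg
print(pixels_per_degree(1920, 55, 84))  # 1080p:      ~60 px/deg, right around the limit
# Both 4K and 1440p land above ~60 px/deg at this distance, which is why they
# can look nearly identical from the couch.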
 

Can you? What difference do you see when you set a game to 4K vs 1440p? I can tell the difference between 4K and 1080p in video content. For games, it depends on the textures for me.
 

It's pretty obvious, TBH. However, depending on how far away you're sitting from the display and the quality of the scaler, which in HDTVs is usually pretty good, the difference won't be massive. That being said, I can notice the difference when not running at native resolution; it's pretty obvious. Not to mention the loss of detail in distant geometry: there are fewer pixels to describe smaller objects.
 

So the difference is not in FOV or stuff close to you, but in smaller/distant objects? Does the image on screen look sharper at native 4K as opposed to, say, 1440p?
 

Yes, 100%. Even the best scalers in high-end HDTVs aren't perfect. Higher PPI matters up to a certain point. Ultimately though, if you're having to upscale, you will have some quality loss.
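A rough sketch of where that loss comes from (plain Python, one axis only; a real scaler works in both axes and usually uses a fancier kernel than plain bilinear). Going from 1440 source rows to 2160 output rows is a non-integer 1.5x ratio, so no output pixel centre lines up exactly with a source pixel centre and everything gets interpolated:

import math

def bilinear_taps(dst_size, src_size):
    # For each destination pixel, the two source pixels that get blended and their weights.
    taps = []
    scale = src_size / dst_size
    for dst in range(dst_size):
        src = (dst + 0.5) * scale - 0.5      # where this output pixel falls in the source
        left = math.floor(src)               # real scalers clamp this at the image edges
        frac = src - left
        taps.append((left, round(1 - frac, 3), left + 1, round(frac, 3)))
    return taps

for tap in bilinear_taps(2160, 1440)[:6]:
    print(tap)   # every output pixel is a weighted mix of two source pixels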
 
4K textures are not necessarily seen on screen at a 1-to-1 pixel ratio. At a distance, filtering will downscale them appropriately and cache that at a reduced resolution. Up close, the higher resolution of the textures becomes apparent.

Video cards have supported even higher resolution textures for a while, and only now are we starting to see developers leverage them. The catch is the memory requirement for such textures versus the gain they provide while on screen. The larger the texture, the less likely it is to be wholly sampled at full native resolution. There is a cost-benefit trade-off on traditional architectures here, in that the storage set aside for the texture isn't put to full use: the full texture isn't on screen, but it still occupies memory. For reference, an uncompressed 4096 x 4096 texture takes about 48 MB, plus roughly another third for the filtered (mipmapped) versions. Partially resident textures are one solution, since only the areas of the texture and its filtered samples that are actually needed get stored in the card's memory. I just don't know offhand how widespread hardware support for partially resident textures is, or how much it's being leveraged, since it was first proposed years ago. It's supposed to be the key feature that heralds widespread use of greater-than-4K textures.
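A quick sketch of the arithmetic behind those figures (assuming plain uncompressed textures; block compression, alignment and driver padding change the real numbers):

def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    # Base level plus, optionally, the full chain of half-size mip levels.
    total, w, h = 0, width, height
    while True:
        total += w * h * bytes_per_texel
        if not mipmapped or (w == 1 and h == 1):
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

MiB = 2 ** 20
print(texture_bytes(4096, 4096, bytes_per_texel=3, mipmapped=False) / MiB)  # 48.0  (24-bit RGB, no mips)
print(texture_bytes(4096, 4096, bytes_per_texel=3, mipmapped=True) / MiB)   # ~64.0 (RGB + full mip chain)
print(texture_bytes(4096, 4096, bytes_per_texel=4, mipmapped=False) / MiB)  # 64.0  (32-bit RGBA, no mips)
print(texture_bytes(4096, 4096, bytes_per_texel=4, mipmapped=True) / MiB)   # ~85.3 (mips add about a third)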
 

I figured. It's the resolution vs. texture thing that gets me.
 
So, not all eyes are the same. Some people can pick out 4K versus 1440p on a 28-inch monitor, while others have a hard time telling the difference between 1080p and 4K on a 55" TV. Secondly, you can have a 3D asset with '4K textures' look blurry or low-res when wrapped around a model that takes up a considerable amount of screen space, even at 1080p. This is because the surface area and visible area of the model will NEVER match, and you will rarely see all of a model's surface at once. That means a region of the texture spanning only 1k x 1k texels can be stretched over a model covering a huge area of your screen, so each texel looks like it's 4x4 pixels in size.
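A rough illustration of that texel-to-pixel ratio (a toy Python helper; real UV stretching is never this uniform):

def pixels_per_texel(screen_pixels_across, texels_across):
    # How many screen pixels one texel covers when a strip of 'texels_across'
    # texels is stretched over 'screen_pixels_across' pixels.
    return screen_pixels_across / texels_across

print(pixels_per_texel(3840, 1024))  # ~3.75: a 1k-wide UV region across a 4K-wide frame,
                                     # each texel is roughly a 4x4-pixel block
print(pixels_per_texel(1920, 1024))  # ~1.9:  still visibly chunky even at 1080p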
 
You also have to factor in that anti-aliasing is just a shitty approximation of running at a higher resolution. Ideally, you want to run at a resolution so high you can't even see pixels and wouldn't need anti-aliasing.
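For illustration, this is what that approximation is standing in for: supersampling literally renders more pixels and averages them down (a toy Python sketch, greyscale values only):

def downsample_2x(image):
    # Box-filter a 2x-supersampled image: each output pixel is the mean of a 2x2 block.
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block_sum = image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]
            row.append(block_sum / 4)
        out.append(row)
    return out

# A hard diagonal edge rendered at double resolution...
hi_res = [
    [0,   0,   0,   255],
    [0,   0,   255, 255],
    [0,   255, 255, 255],
    [255, 255, 255, 255],
]
# ...averages down to intermediate greys along the edge instead of a hard staircase.
print(downsample_2x(hi_res))  # [[0.0, 191.25], [191.25, 255.0]]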
 

Which isn't really happening at current resolutions. Even at 4K, anti-aliasing still removes issues like shimmering in fine detail and stabilizes the image overall. Different anti-aliasing methods give different results: for example, SMAA tends to handle shimmer less well but doesn't blur textures the way FXAA does, while MSAA does a great job but comes at a high performance cost.

4K resolution and 4K textures are two separate matters. Higher-resolution textures allow finer surface detail on models, while higher display resolution lets you actually see that detail. 1440p already has plenty of pixels, so the jump to 4K isn't as drastic as 1080p to 1440p. Games usually use a variety of texture sizes, since a high-res texture for something small like an apple on a table is pointless; you will never look at it as close up as you would a character model.
 

8K resolution pretty much makes anti-aliasing unnecessary.
 
Can you? What difference do you see when you set a game to 4K vs 1440p? I can tell the difference between 4K and 1080p in video content. For games, it depends on the textures for me.
Why does it matter what I can see? Do you want to run at 4K native resolution because some stranger on the internet might be able to discern the difference?
 

You can do whatever you want. No one is telling you otherwise.

I personally enjoy upscaling from 1 pixel to 4K. Looks brilliant. It greatly enhances my response times in video games. A simple on/off switch, of sorts.
 
Can you? What difference do you see when you set a game to 4K vs 1440p? I can tell the difference between 4K and 1080p in video content. For games, it depends on the textures for me.

It's pretty obvious, TBH. However, depending on how far away you're sitting from the display and the quality of the scaler, which in HDTVs is usually pretty good, the difference won't be massive. That being said, I can notice the difference when not running at native resolution; it's pretty obvious. Not to mention the loss of detail in distant geometry: there are fewer pixels to describe smaller objects.

So, not all eyes are the same. Some people can pick out 4K versus 1440p on a 28-inch monitor, while others have a hard time telling the difference between 1080p and 4K on a 55" TV. Secondly, you can have a 3D asset with '4K textures' look blurry or low-res when wrapped around a model that takes up a considerable amount of screen space, even at 1080p. This is because the surface area and visible area of the model will NEVER match, and you will rarely see all of a model's surface at once. That means a region of the texture spanning only 1k x 1k texels can be stretched over a model covering a huge area of your screen, so each texel looks like it's 4x4 pixels in size.


This is basically my experience. I have a 55” 4K TV that I play The Witcher on from about 7’ away. I actually play it at 1080p because I’d rather it be smooth as a baby’s ass, and I have a hell of a time telling the difference between 4K and 1080p at that distance / screen size. It’s my 1200/1080 Ti rig driving it... certainly capable of 4K/60 with some adjusting of settings.

I always took “4K textures” as more of a marketing gimmick; as others said, it depends on how close you are to the object in game, how far you are from your monitor, etc. If you want high-resolution textures, go for it. If the game looks fine as is and you don’t feel like bothering with it, that’s also acceptable.
 
The largest issue is that the resolution of the texture itself does not equate to quality. If you have a 4K texture of a single shade of brown with some lazy artist work, it's still going to look like ass.

A lower-resolution texture can look better if the artist who made it did a better job. Higher resolution just gives them more pixels to work with, but it's a diminishing-returns kind of thing beyond a certain point.
 
You also have to factor in that anti-aliasing is just a shitty approximation of running at a higher resolution. Ideally, you want to run at a resolution so high you can't even see pixels and wouldn't need anti-aliasing.

Rasterized pixels are also an approximation of vector graphics, which conceptually scale to infinite resolution.

Anti-aliasing helps in other areas, even at higher resolutions. Indeed, at higher resolutions the anti-aliasing effects become more subtle and manifest more as improved contrast between objects than as raw blurring of pixels. Moiré patterns with lots of positive space are a good example of this: only where the lines converge do the raw pixels become a really muddy mess, but the eye tends to focus along the lines crossing the positive space. Anti-aliasing still helps make the lines 'pop' from the background there, even at higher resolution.
 

At 8K+ you really don't need anti-aliasing anymore.
 
You would need to sit uncomfortably close to a 4K TV to tell the difference between 1440p and 4K (assuming your TV has decent upscaling). 4K textures, on the other hand, are very useful and can make an object look more detailed.
 