Let's be real: does 4K actually make a difference?

24 inch - 1080p
27 inch - 1440p
30+ inch - 4k
If you follow the 110 PPI logic then the breakpoints are:

1920x1080 = 20"
2560x1440 = 27"
3840x2160 = 40"
Is there a technical definition of what exactly "4K textures" are? Because they can't possibly be 4000x4000 pixels, that would be absurd... just like conversations about pixel density without considering viewing distance.
I always think the same thing when I see this. 4096x4096 textures are actually tiny by today's standards and wouldn't look good at 4K rendering resolution.
 
If you follow the 110 PPI logic then the breakpoints are:

1920x1080 = 20"
2560x1440 = 27"
3840x2160 = 40"

This is pretty much right on the money. I have two 27.5" 4k screens and I just run them at 1440, which is perfect for this size.
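
For anyone who wants to sanity-check that 110 PPI math, here's a quick back-of-the-napkin sketch (my own Python, nothing official): diagonal pixel count divided by the target PPI gives the diagonal size.

```python
import math

def diagonal_at_ppi(width_px, height_px, target_ppi=110):
    """Diagonal size (inches) at which a resolution lands on the target PPI."""
    return math.hypot(width_px, height_px) / target_ppi

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f'{w}x{h}: {diagonal_at_ppi(w, h):.1f}" at 110 PPI')
# 1920x1080: 20.0" at 110 PPI
# 2560x1440: 26.7" at 110 PPI
# 3840x2160: 40.1" at 110 PPI
```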
 
Textures are almost always powers of two, so 4096 x 4096, because that's ideal for mip-mapping. It's certainly possible to do mipmapping on NPOT (non-power-of-two) textures as well, just not cleanly down to the last few mip levels, at which point it probably doesn't matter.
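
A rough illustration of why powers of two are convenient (a quick sketch of my own, not from any particular engine):

```python
# A power-of-two texture halves cleanly at every mip level all the way down to 1x1.
def mip_chain(size):
    chain = []
    while size >= 1:
        chain.append(size)
        size //= 2
    return chain

print(mip_chain(4096))
# [4096, 2048, 1024, 512, 256, 128, 64, 32, 16, 8, 4, 2, 1] -> 13 clean levels

print(mip_chain(4000))
# [4000, 2000, 1000, 500, 250, 125, 62, 31, 15, 7, 3, 1] -> odd sizes creep in,
# which is where NPOT mipping gets messy toward the bottom of the chain.
```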
I don't think there has been any real disadvantage to non-power-of-two textures for years now, at least going back to G80 and GCN1. As far as I know, DirectX handles them transparently at this point.
 
Textures are almost always powers of two, so 4096 x 4096, because that's ideal for mip-mapping. It's certainly possible to do mipmapping on NPOT (non-power-of-two) textures as well, just not cleanly down to the last few mip levels, at which point it probably doesn't matter.
My point was, it depends on the scaling of the texture on a model... stretch that "4K texture" across a huge surface and it looks like you're playing N64. Game editors use whatever texture resolution you throw at them; sometimes you only need 400 pixels for smaller objects, sprites, or decals. So is this "4K texture" phrase really just "a texture that looks good on 4K displays," or is there some other definition?
 
My point was, it depends on the scaling of the texture on a model... stretch that "4K texture" across a huge surface and it looks like you're playing N64. Game editors use whatever texture resolution you throw at them; sometimes you only need 400 pixels for smaller objects, sprites, or decals. So is this "4K texture" phrase really just "a texture that looks good on 4K displays," or is there some other definition?

I wouldn't equate textures with the display resolution regardless of any marketing terms. A texture should be sized to fit the application. I'd be more apt to say "4K Ready Textures" as that would imply what you're saying. A texture that will look appropriate for the display resolution. If someone is touting "4K Textures" I tend to think it's marketeering, nothing technical.
 
Which is why I don't get people who buy 65-70 inch TVs to watch at 1080p.

On a 65" 4k isn't even discernible until you're within about 8' and you don't get the full benefit of it until you're about 4' from it.

Everyone has different preferences, but the "mixed usage" rule of thumb (as opposed to cinema) is 30 degrees, which works out to a 65" TV at a 9' viewing distance, at which point 4K has basically no benefit.

Using the cinema 40-degree FoV, it'd be 6.5' for a 65" TV, so 4K becomes beneficial.

I don't know what the average viewing distance is for people but I suspect that 4k is rarely beneficial on your typical 65" TV setups.
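
If anyone wants to play with the geometry themselves, here's a rough sketch (my own Python, assuming the usual rule-of-thumb figure of about 60 pixels per degree for 20/20 vision). It lines up with the 30-degree / 8' / 4' numbers above.

```python
import math

ACUITY_PX_PER_DEG = 60  # assumed ~20/20 vision; rule-of-thumb, not from this thread

def screen_width_in(diagonal_in, aspect=(16, 9)):
    """Width in inches of a 16:9 screen with the given diagonal."""
    return diagonal_in * aspect[0] / math.hypot(*aspect)

def fov_deg(diagonal_in, distance_ft):
    """Horizontal field of view the screen fills at a given viewing distance."""
    half_width = screen_width_in(diagonal_in) / 2
    return math.degrees(2 * math.atan(half_width / (distance_ft * 12)))

def pixel_visible_within_ft(diagonal_in, horiz_px):
    """Distance inside which a single pixel is still resolvable to the eye."""
    pitch_in = screen_width_in(diagonal_in) / horiz_px
    return pitch_in / math.tan(math.radians(1 / ACUITY_PX_PER_DEG)) / 12

print(f'65" at 9 ft fills ~{fov_deg(65, 9):.0f} degrees')                                  # ~29
print(f'1080p on 65": pixels resolvable within ~{pixel_visible_within_ft(65, 1920):.1f} ft')  # ~8.5
print(f'4K on 65":    pixels resolvable within ~{pixel_visible_within_ft(65, 3840):.1f} ft')  # ~4.2
```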
 
I don't know what the average viewing distance is for people but I suspect that 4k is rarely beneficial on your typical 65" TV setups.

I'm not going to disagree: well-mastered and delivered 1080p content is going to look pretty good, and on 'average' I doubt most could tell the difference, or care if they could.

But 4k is just one facet of home theater advancement...

...on the desktop, the difference can be huge.
 
I say

27" and under 2K
28" and over 4K

It's not just the pixel density, it's the real estate and what each individual can handle with no scaling.
 
I think it's one of those things you don't always notice until you do a direct comparison. Kind of like when a new processor doesn't seem all that much quicker than one 4-5 gens ago...until you go back and actually use that old one again.
4K doesn't look that much better than 1080p until you toggle between the two. Well, that and the game/movie has to take advantage of it. A movie that letterboxes 1/2 your screen and has a shitload of film grain is going to look the same. A game with textures from 2008 will, too.
I do think it's a smaller leap than 480p to 720p and 1080p, but it's still pretty significant if the content supports it.
 
I think it's one of those things you don't always notice until you do a direct comparison. Kind of like when a new processor doesn't seem all that much quicker than one 4-5 gens ago...until you go back and actually use that old one again.
4K doesn't look that much better than 1080p until you toggle between the two. Well, that and the game/movie has to take advantage of it. A movie that letterboxes 1/2 your screen and has a shitload of film grain is going to look the same. A game with textures from 2008 will, too.
I do think it's a smaller leap than 480p to 720p and 1080p, but it's still pretty significant if the content supports it.
Actually, wide-aspect (letterboxed) movies stand to benefit as much, if not more. A wide aspect ratio limits the amount of vertical information possible in the frame. On a 1080p Blu-ray, a scope movie only uses what, around 800 vertical pixels? With 4K, you get a huge boost there.
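
Quick back-of-the-envelope on that (my own arithmetic, assuming a 2.39:1 scope film letterboxed into a 16:9 frame):

```python
# Active picture lines when a 2.39:1 (scope) film is letterboxed into a 16:9 frame.
def active_lines(frame_width_px, content_aspect=2.39):
    return round(frame_width_px / content_aspect)

print(active_lines(1920))  # ~803 lines used on a 1080p Blu-ray
print(active_lines(3840))  # ~1607 lines used on a 4K disc
```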
 
Actually, wide-aspect (letterboxed) movies stand to benefit as much, if not more. A wide aspect ratio limits the amount of vertical information possible in the frame. On a 1080p Blu-ray, a scope movie only uses what, around 800 vertical pixels? With 4K, you get a huge boost there.

I'm just thinking that if 1/2 your screen is made up of black bars (and the picture is therefore smaller), it's not like you're going to see a huge jump in quality over 1080p.
 
I don’t know how you guys deal with the response times on tvs, especially oled. Most oled TVs sit at about a half second response time.
 
I don’t know how you guys deal with the response times on tvs, especially oled. Most oled TVs sit at about a half second response time.

Huh? Rtings rates my LG B7 10/10 for response time and 9.1/10 for input lag. While it's not a hardcore 165Hz gaming monitor, it's quite excellent for gaming and general use, especially where picture quality is concerned. Been using it since Nov. 2017 with zero complaints.
 
Huh? Rtings rates my LG B7 10/10 for response time and 9.1/10 for input lag. While it's not a hardcore 165Hz gaming monitor, it's quite excellent for gaming and general use, especially where picture quality is concerned. Been using it since Nov. 2017 with zero complaints.

I'm looking forward to using mine now that I have the 7.1.2 setup rolling ;)

[of course I'll be streaming to a Shield, but I'll be playing games that are amenable to that method, and it's nice to know that the TV isn't adding any extra lag!]
 
I don’t know how you guys deal with the response times on tvs, especially oled. Most oled TVs sit at about a half second response time.
Did you accidentally add a zero?
500ms is 0.5 seconds.
Lag figures I see on TVs are often under/around 50ms, sometimes up to 100ms.

I look for displays with 30ms or less, which is difficult in UHD projector land.
I'm still waiting for a good enough projector to be released with the features I need (like HLG HDR, 2500+ lumens, wide colour gamut, good bulb life, at a sensible price...).
 
I'm just thinking that if 1/2 your screen is made up of black bars (and the picture is therefore smaller), it's not like you're going to see a huge jump in quality over 1080p.
Newer televisions often have input lag of 30ms or better, even at 4K with HDR. You don't even have to spend a lot of money for it. Most current models in Samsung's lineup have input lag better than 20ms (some as low as 12ms). Not as good as, say, a PG279Q is at 3ms, but they're getting there. As long as the lag is less than the length of a frame (16.67ms at 60 Hz) then it's good.
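
For reference, the frame-length budget at common refresh rates (simple arithmetic, my own numbers):

```python
# Frame duration at common refresh rates: the budget input lag should stay under
# if you want the lag to be less than one frame.
for hz in (60, 120, 144, 165):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms
# 120 Hz -> 8.33 ms
# 144 Hz -> 6.94 ms
# 165 Hz -> 6.06 ms
```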
 
Depending on the resolution, the bigger the screen, the worse it will look at a close distance. Which is why I don't get people who buy 65-70 inch TVs to watch at 1080p.

If you sit at or around 6 to 7 ft, it's fine if you've got good tech. I love my 65-inch VT50. It has served me well as I wait for HDR to get its shit together.
 
I don’t know how you guys deal with the response times on tvs, especially oled. Most oled TVs sit at about a half second response time.
especially OLED??

OLED TVs are generally a lot more responsive than most LCDs and virtually all plasmas. Most of the OLED reviews I have seen are at or under 20ms. There are only a few LCDs under 30ms and most plasmas were over 50ms.
 
Where things have been challenging is that until this MONTH, any game that had 4K support has been - either somewhat or definitely - some sort of CPU pig. Burnout Paradise Remastered stands that theory absolutely on its ear. (I didn't discover it, and I won't claim it; besides, the data is on YouTube. YouTuber Santiago Santiago put FRAPS to work on the PC version of Remastered and got 4K/60 fps on a combo of a dead-stock G4560 and GTX 750 Ti. That is not even CPU piglet turf.) Now, I own Remastered myself and have scarily similar hardware (check the sig below); all I need to change is the display, not a thing else. I can upgrade either my PC display OR TV display for less than $400 USD, or both for $800 USD. Naturally, if I upgrade EVERYTHING else, we're talking on the order of $1500 USD, including the same two display upgrades. That merely assumes the RTX 2060 releases at the planned price point without pricing insanity due to mining, and that the rest of the upgrades remain current-gen Intel quad-core or AMD Ryzen equivalent. Still, a sensible upgrade path. However, I hate being held hostage to miners, which is the case for upgraders today. (Let's get real: how many of those RTX 2080s wound up in the mitts of miners, as opposed to benchmarketers, benchmarkers, or gamers?)
 
I ended up playing games for the last few days (after vacation activities) on a 4K Sony 77" OLED. I had brought my main gaming system along with arcade controllers in case we wanted to play some arcade games at night. I knew upfront that I'd have to lower settings way too far at 4K to get the frame rates I'm accustomed to, but I tried a few anyway, just to see how they looked. As I expected, they were gorgeous on that display, and unsurprisingly Doom was one of the few that could still keep up a decent frame rate with mostly decent settings.

Still, I ended up dropping things back to 1080, cranking all detail levels, running good levels of AA, and just enjoying things on that size and quality of display while running at flawless rates. I have to say, even 1080 on a display that size from maybe 7 feet away was eye-opening compared to my 65" (not OLED) at home. We played a lot of things, but I ended up getting totally sucked back into Doom and finished about two-thirds of the game just because it was a new experience like that.

Anyway, this said more to me about just the quality of display tech than additional resolution obviously. Maybe if I had a 1080Ti+ I would have come away with better opinions of 4K overall. As I thought, once I can get my hands on a card that isn't ridiculously priced that will play at that resolution without breaking a sweat, I'll be all about it. :D
 
Where things have been challenging is that until this MONTH, any game that had 4K support has been - either somewhat or definitely - some sort of CPU pig. Burnout Paradise Remastered stands that theory absolutely on its ear. (I didn't discover it, and I won't claim it; besides, the data is on YouTube. YouTuber Santiago Santiago put FRAPS to work on the PC version of Remastered and got 4K/60 fps on a combo of a dead-stock G4560 and GTX 750 Ti. That is not even CPU piglet turf.) Now, I own Remastered myself and have scarily similar hardware (check the sig below); all I need to change is the display, not a thing else. I can upgrade either my PC display OR TV display for less than $400 USD, or both for $800 USD. Naturally, if I upgrade EVERYTHING else, we're talking on the order of $1500 USD, including the same two display upgrades. That merely assumes the RTX 2060 releases at the planned price point without pricing insanity due to mining, and that the rest of the upgrades remain current-gen Intel quad-core or AMD Ryzen equivalent. Still, a sensible upgrade path. However, I hate being held hostage to miners, which is the case for upgraders today. (Let's get real: how many of those RTX 2080s wound up in the mitts of miners, as opposed to benchmarketers, benchmarkers, or gamers?)

fwiw I was able to run Burnout Remastered at 11500x2160 in triple screen and it was surprisingly playable.
 
Anyway, this said more to me about just the quality of display tech than additional resolution obviously. Maybe if I had a 1080Ti+ I would have come away with better opinions of 4K overall. As I thought, once I can get my hands on a card that isn't ridiculously priced that will play at that resolution without breaking a sweat, I'll be all about it. :D

That's really the key. It's night and day better to look at 4K vs. 1080p on a large screen. You can toggle your resolution and see that much.
Yet when you factor performance in, it takes a LOT of horsepower to get acceptable framerates. Like you mention, basically a 1080Ti or better.
I'd rather play at 1080p/60 vs. 4K/<40fps myself, too.
 
The OP was probably complaining because his R9 290 graphics card was running super low FPS at 4K (which he/she admitted), not realizing this compromises the experience. The OP was probably thinking that is what 4K looks like, but in actuality what he/she was seeing was an extremely compromised version of 4K graphics.

I've done this before, back in the day, maxing out Far Cry at 1080p to see what it looked like at around 20 FPS and thinking the graphics weren't that great. Later, when I upgraded, I realized it looked amazing and what I saw before was severely compromised.

I game at both 4K and 1080p, and 4K definitely looks better and sharper, with more real estate. But gaming at 1080p still looks good to me as well. If I can't max out a game at 4K, I play at 1080p.
 
This is somewhat off topic, but it seems like there are some knowledgeable people posting in here, so I'll go for it.

On a 1440p monitor, will a Blu-ray (1080p) look better or worse than a 4K Blu-ray (2160p)? One is getting stretched, the other is getting squished.
 
I have a 4K tv and a 21:9 ultrawide...

4K may not make a difference, but an OLED with HDR sure does make a difference, and the only way to get that is to go 4K...
 
This is somewhat off topic, but it seems like there are some knowledgeable people posting in here, so I'll go for it.

On a 1440p monitor, will a Blu-ray (1080p) look better or worse than a 4K Blu-ray (2160p)? One is getting stretched, the other is getting squished.

While I don't have a 1440p monitor to test, I would assume 4K downscaled would look better than 1080p stretched. Sort of like viewing 720p on a 1080p screen: it will probably be slightly blurred or have a soft look to it.
I do find that running 1080p on 4K actually doesn't look too bad. It actually just looks like native 1080p (at least on my 4K TV; I haven't tested any monitors). Probably because 1080p divides evenly into 2160p.
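
To put numbers on the "divides evenly" point, here's a quick sketch of my own (hypothetical cases, not anything a display actually reports):

```python
# Vertical scale factor from source to display. Integer ratios map each source
# pixel onto a whole block of display pixels; non-integer ratios force
# interpolation, which tends to read as softness.
cases = [
    ("1080p Blu-ray -> 1440p monitor", 1080, 1440),
    ("4K Blu-ray    -> 1440p monitor", 2160, 1440),
    ("1080p Blu-ray -> 4K display   ", 1080, 2160),
]
for label, src_lines, dst_lines in cases:
    ratio = dst_lines / src_lines
    kind = "integer scale (clean)" if ratio == int(ratio) else "non-integer (interpolated)"
    print(f"{label}: {ratio:.3g}x  {kind}")
# 1080p Blu-ray -> 1440p monitor: 1.33x   non-integer (interpolated)
# 4K Blu-ray    -> 1440p monitor: 0.667x  non-integer (interpolated)
# 1080p Blu-ray -> 4K display   : 2x      integer scale (clean)
```

The 4K-to-1440p case is still non-integer, but downscaling has extra source detail to average from, which is why it tends to hold up better than stretching 1080p up.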
 
Yeah, viewing 1080p video on a 4K monitor doesn't look that different from a native 1080p display. I'm sure you can nitpick it, but I doubt most people would notice, especially with film grain, letterboxing, video compression, etc. factoring into most content.
 
I've been playing around with resolutions a lot as of late. My opinion lands on it being a personal choice. Myself, I play all first-person multiplayer games, where frame times and rates come first (more so since my reaction time isn't anything near what it used to be). The two sweet spots I've found are 1920x1080 at 240Hz (27" screens) and 2560x1080 at 200Hz (34/35" screens).

The size of characters comes into play, and even the map area; if there's too much, it's hard to keep your eyes directly in front of you and the weapon pulled up mid-way. Then of course at 4K you bring all the performance challenges into play for multiplayer FPS games.

Again though, to me, it's a very personal choice. I'm not even a fan of 4K at my work desktop; my old eyes can't take all those tiny fonts and images. If I wanted that, I'd use my cell phone for everything.
 
PC Gaming:

Naw, not really. Mostly because 100Hz+ isn't supported on 4K IPS monitors yet. I've actually been waiting on it for a while. Asus was the first to announce one, but they've delayed it over a year now. Supposedly Acer is now making plans to beat them to it. I went through another of my seasonal emotional purchase weeks and tried the Acer Predator 4K, but it's only 60Hz (and standard 16:9). I couldn't stand it. I ended up settling on an Acer Predator X34P. Very happy with it. I ended up parting out my Nvidia Surround setup; I just didn't need it anymore. For those not familiar, it's a 120Hz ultrawide 1440p IPS. The Alienware equivalent has been on sale lately.

TV:

Yeah, 4K is the standard now, and with games and Blu-rays I can very much see the difference at 6-10 ft.
 
For competitive FPS games (or anything where shooting accuracy matters), lower resolution is often better because it gives you more wiggle room. You're just shooting at the upper half of a blob in the distance instead of clearly having to hit something in the head.
 
I still need to test more games to see what I prefer but right now I'm jumping between my 24" monitor (1080p@144hz) and my new 65" 4k. I'm probably going to play more casual controller friendly games (and games that I can max out on my 1080) on the 4k screen and play more quick response kb/mouse games on my monitor.

I do have to say though, Forza Horizon 4 and Motorsport 7 both look amazing in 4K with HDR enabled.
 
For competitive FPS games (or anything where shooting accuracy matters), lower resolution is often better because it gives you more wiggle room. You're just shooting at the upper half of a blob in the distance instead of clearly having to hit something in the head.

I still run 800x600 or 1024x768 in CS:GO, but not for something like Battlefield.
 
I would say so. Mainly because a 4K display will generally have better picture quality than a monitor. After playing a PS4 Pro on my quantum dot HDR television going back to my 144 Hz IPS monitor for PC games is disappointing.
 
I would say so. Mainly because a 4K display will generally have better picture quality than a monitor. After playing a PS4 Pro on my quantum dot HDR television going back to my 144 Hz IPS monitor for PC games is disappointing.
Quantum dot monitors from Samsung might work for PC gaming, especially the nice ultrawides (my personal preference for any PC gaming).
 