Let's be real: does 4K actually make a difference?

edo101

Limp Gawd
Joined
Jul 16, 2018
Messages
480
I moved from my 1440p monitor to a 55" 4K OLED for gaming. Granted, I haven't played any games from 2016 or newer; the ones I play a lot are Witcher 3 and GTA V. What I've noticed with these two after moving to 4K is that the textures are less than stellar. They look a little pixelated, and I can tell they're below 4K resolution.

So then what good is 4K if games don't have 4K textures? And this might be a stupid question, but why does it take more resources?

I have seen a difference when it comes to video, however: movies and videos in 4K, with and without HDR.
 
You have to consider that with a 55" screen, even a 4K OLED, pixels are going to be bigger than on smaller 4K or 1440p screens. Do you sit close to the TV? What graphics card do you have?
 
You have to consider that with a 55" screen, even a 4K OLED, pixels are going to be bigger than on smaller 4K or 1440p screens. Do you sit close to the TV? What graphics card do you have?

Eh, I sit about 1.5 to 2 meters from the TV. Card is an R9 290. I get low FPS with the highest textures, but it was to see if the game looked better/sharper. Not really. Colors and vibrancy, yes, but that's because of the display tech.
 
Try FFXV with the 4k texture pack, or another game with 4k textures.
 
HDR will make a much more noticeable improvement than 4K by itself... if 4K TVs had launched without HDR support, it would have been a failure.
 
Gaming? Not really.

Movies? Marginal.

Text/graphics? Very noticeable.

I agree with polonyc2 - HDR is a bigger deal than 4K when done properly.
 
Depending on the resolution, the bigger the screen, the worse it will look at a close distance. Which is why I don't get people who buy 65-70 inch TVs to watch at 1080p.
 
Gaming looks way better at 4K on 32 inches or smaller. Your PPI is too low at 55 inches.

Depending on the resolution, the bigger the screen, the worse it will look at a close distance. Which is why I don't get people who buy 65-70 inch TVs to watch at 1080p.
You have to consider that with a 55" screen, even a 4K OLED, pixels are going to be bigger than on smaller 4K or 1440p screens. Do you sit close to the TV? What graphics card do you have?

Someone told me it's because a 27" 1440p panel has ~110 PPI while a 55" 4K panel is much lower. Is this what's really happening?
 
Post-processing looks better at higher res. So if you run ReShade with AdaptiveSharpen, Clarity, SMAA and so on, you're going to get a sharper and less artifact-prone image than at 1440p.
 
Yes

any other questions?

Of course no resolution is going to make a difference if you just keep getting a larger screen and moving further and further away from it.
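
To put rough numbers on the distance point, here's a quick back-of-the-envelope Python sketch (my own, not from anyone in this thread) that estimates how many pixels land in each degree of your vision for a given diagonal, resolution, and viewing distance. The sizes and distances in the example are just placeholders:

```python
import math

def pixels_per_degree(diagonal_in, h_res, v_res, distance_m):
    """Rough pixels per degree of visual angle for a flat screen viewed head-on."""
    aspect = h_res / v_res
    # Physical width of the panel, derived from its diagonal and aspect ratio.
    width_m = diagonal_in * aspect / math.sqrt(aspect**2 + 1) * 0.0254
    # Horizontal field of view the screen covers at this viewing distance.
    fov_deg = 2 * math.degrees(math.atan((width_m / 2) / distance_m))
    return h_res / fov_deg

# Hypothetical examples: a 55" 4K TV from ~2 m vs a 27" 1440p monitor at desk distance.
print(round(pixels_per_degree(55, 3840, 2160, 2.0)))
print(round(pixels_per_degree(27, 2560, 1440, 0.6)))
```

The further back you sit, the more pixels fall into each degree of vision, which is why the same panel can look razor sharp from the couch and pixelated up close.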
 
55" My parents want one of those but they sit like 12 feet away. I told them smaller is better I hope their eyes are not pryed to oblivious after buying one. Your eyes are designed for hunting food so basically these big screens are messing people up.

Gaming is all about reading stuff on the screen, whether it's text or visual elements that tie the entire game together. Even though you might see more with a bigger screen, it doesn't engage you the same way; you pick up information better on a smaller screen. It might look better because it has more volume and looks bigger, but that is about it, like sitting in the front row of a theater for Star Wars Episode 10.
 
55" My parents want one of those but they sit like 12 feet away. I told them smaller is better I hope their eyes are not pryed to oblivious after buying one. Your eyes are designed for hunting food so basically these big screens are messing people up.

Gaming is all about reading stuff on the screen weather it's text or visual elements that tie the entire game together even though you might see more with a bigger screen it doesn't engage you the same way you pick up information on a smaller screen. It might look better because it has more volume and looks bigger but that is about it like sitting in the front row of a theater of Star War Episode 10.

What? I game on a 55" OLED. It's fine. Glorious, actually. By what you're saying I should go back to the days of a 19-22" monitor and low res?

One of the benefits of 4K, besides sharper text, is getting more information on the screen (as in I can view more forum posts without constantly scrolling than I could at 1080p or 1440p, or have more windows open concurrently). But a 19-27" 4K would be darn near unreadable for me - most people seem to need at least 32-40" for it to be comfortably legible without scaling. And you can't get an OLED smaller than 55" so here I am, but I'd definitely not go smaller than 40" even if the option were available.

I don't think I understood your post at all. A 55" is an extremely common TV size, and it's not surprising that your parents are considering one. Are you suggesting that they go with something smaller that will make watching TV less engrossing and make it harder to pick up fine detail? Not trying to be argumentative at all, just trying to see where you're coming from with the "smaller is always better and eyes fried into oblivion" thing.
 
Yes

any other questions?

Of course no resolution is going to make a difference if you just keep getting a larger screen and moving further and further away from it.

So what you're implying is that I should sit a little farther away so it becomes sharper?
 
55" My parents want one of those but they sit like 12 feet away. I told them smaller is better I hope their eyes are not pryed to oblivious after buying one. Your eyes are designed for hunting food so basically these big screens are messing people up.

Gaming is all about reading stuff on the screen weather it's text or visual elements that tie the entire game together even though you might see more with a bigger screen it doesn't engage you the same way you pick up information on a smaller screen. It might look better because it has more volume and looks bigger but that is about it like sitting in the front row of a theater of Star War Episode 10.

Ugh. I really hate that myth. Sitting "too close" to a TV does NOT hurt your eyes. This has been disproven multiple times and actual eye doctors (as in the American Optometric Association) say the myth is bunk.
 
So what you're implying is that I should sit a little farther away so it becomes sharper?
No, what I'm saying is that a 27" 4K screen will look much better than a 27" 1080p screen. But if you compare a 27" 1080p screen to a 55" 4K screen, you have roughly the same pixel size, so it would not seem better, just larger. Despite having more detail, the jagged edges would be just as visible. That is what you're experiencing. With a 55" 4K you still need AA to obscure jagged edges, but a 290 won't be able to handle 4K on its own, let alone with AA. However, with a regular 27-28" 4K monitor you could get away with not using AA, and the image would look much sharper.
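
For anyone who wants to sanity-check the pixel-size comparison, here's a minimal PPI calculation (my own sketch, assuming standard 16:9 panels) showing that a 55" 4K panel and a 27" 1080p panel land at nearly identical pixel density, while a 27" 4K panel roughly doubles it:

```python
import math

def ppi(diagonal_in, h_res, v_res):
    """Pixels per inch along the diagonal of a flat panel."""
    return math.hypot(h_res, v_res) / diagonal_in

print(f'55" 4K:    {ppi(55, 3840, 2160):.0f} PPI')  # ~80
print(f'27" 1080p: {ppi(27, 1920, 1080):.0f} PPI')  # ~82, nearly the same pixel size
print(f'27" 1440p: {ppi(27, 2560, 1440):.0f} PPI')  # ~109
print(f'27" 4K:    {ppi(27, 3840, 2160):.0f} PPI')  # ~163, much finer, far less need for AA
```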
 
Ugh. I really hate that myth. Sitting "too close" to a TV does NOT hurt your eyes. This has been disproven multiple times and actual eye doctors (as in the American Optometric Association) say the myth is bunk.
That myth was created in the time of CRT screens, and it had some merit then, because CRT screens did emit radiation, so sitting too close was bad for you. But this effect was negligible in modern CRT monitors, as most radiation was deflected away from the viewer. With old TVs, however, you could feel your hair stand up if you stood close to them.
 
BS, my main rig has a 55" C6 as its monitor and it's amazing. It might not have super high PPI, but it looks amazing in a dark game.
 
Someone told me it's because a 27" 1440p panel has ~110 PPI while a 55" 4K panel is much lower. Is this what's really happening?

No. I believe a 55 inch panel has about 80 PPI, if I recall correctly. At 49" you have about 90 PPI, which isn't too far off my old Dell 3007WFP-HC, so I went for that. 49" borders on being too large for desktop use. I wouldn't want to go to a 55". If all I did was play games I might do that, but I also work from my PC.
 
55" inch 4k is about the same PPI as a 27.5" 1080p screen exactly half of its diagonal, since the resolution is halved in both dimensions. Which is IMHO a tad too low.

4k @ 32" definitely has the increased textures and sharpness going for it, everything looks a lot smoother than my 27" 1440p, but I still play on my 27" because of the 144hz
 
I play on a 55" 4K as my main PC/console monitor. I sit about 4 ft away, as I have a wide desk (purposefully). I can definitely tell a difference in 4K vs 1080p/1440p on my secondary/third monitors. Wouldn't ever go back.
 
So then what good is 4K if games don't have 4K textures? And this might be a stupid question, but why does it take more resources?

Those... have nothing to do with each other.

A 4K texture just means the texture image itself has roughly 4K resolution. However, a texture is not shown across your full screen; it's mapped onto a surface in the game, which is usually much smaller than your screen.
So the pixel density of the texture can be a lot higher than your screen's.
In other words, even with a lower-res texture you might still need a higher-res screen to show the detail, because of the size difference.

It takes more resources because there are more pixels to be generated: each pixel has data associated with it that needs to be calculated and moved around.
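
To put rough numbers on the "more pixels to generate" part, here's a small sketch (mine, deliberately ignoring textures, depth buffers, overdraw and everything else a real renderer does) that just counts the pixels that have to be shaded and written out each frame:

```python
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    # Very rough: one 4-byte RGBA write per pixel per frame at 60 fps,
    # ignoring depth buffers, MSAA, post-processing passes, textures, etc.
    bandwidth_gb_s = pixels * 4 * 60 / 1e9
    print(f"{name}: {pixels / 1e6:.1f} Mpix per frame, "
          f"~{bandwidth_gb_s:.2f} GB/s just writing the framebuffer")
```

4K works out to 2.25x the pixel count of 1440p and 4x that of 1080p, which is roughly why a card that's comfortable at 1440p starts struggling at 4K.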
 
Eh, I sit about 1.5 to 2 meters from the TV. Card is an R9 290. I get low FPS with the highest textures, but it was to see if the game looked better/sharper. Not really. Colors and vibrancy, yes, but that's because of the display tech.
Well there's your problem.
A 290 will choke on itself trying to run FFXV at max like that.
An R9 280 meets the minimum requirements for FFXV if you plan on running it at 1280x720 ;).
 
It makes a very big difference... ...in your frame rate :p

I still play on a 65" 1080 TV in my living room. From where I sit, I'm not exactly seeing individual pixels. Though obviously if I move a bit closer I will. I find being able to crank up all of the settings in my game, and either sync at 60 or 120 without dropping (depending on the game) is much more of a benefit than having more pixels crammed into the same space. I know I'm probably in the minority there. I also have another system with a 47" that's wall mounted with a desk brought out from the wall a few feet (for more comfortable viewing for that size). I don't find 1080 to be an issue there either, but I think 4K would be more of a benefit there than in the living room scenario since it's up closer. I also have a couple of systems with 24" monitors, and while they're ok at 1080, I think 1440 would be better for these ones.

I guess a better question in my mind, is what technology the TV is using, which features you want, which frame rates it's capable of. I'd personally prefer an OLED for example. HDR could be nice, though not completely necessary. Also, which video card are you going to buy that will drive it at the performance you want out of your games, and the detail levels that you're accustomed to?

I will eventually jump to 4K. I've got nothing against it as a technology. The thing is, until I can buy a video card in the $400-500 range, that will allow 4K at 60FPS minimum with all effects and detail settings maxed, it really doesn't interest me all that much.
 
I just bought a 4K 65", but I sit about 10-13 feet away from it. I have a GTX 1080, so I turned up all the settings on Witcher 3 and I can assure you, it made a huge difference!
 
I just bought a 4K 65", but I sit about 10-13 feet away from it. I have a GTX 1080, so I turned up all the settings on Witcher 3 and I can assure you, it made a huge difference!

That's actually really good to hear. How are the frame rates with everything up at that res?

I haven't been following too closely as I haven't quite been ready to jump for the living room yet anyway. I need to upgrade the HTPC in there as well. Right now it's an i5 with a GTX-660 (my 1070 is in my desktop). I'll probably build a Ryzen system with an 1170 (or whatever they call it) and then upgrade the TV.

That's really the main thing I was holding off for. I want a large 4K eventually, but really want a video card that can properly keep up, and don't want to pay $700-1200 to do it. I was thinking maybe the 1170 would be up to the task, but we'll find out when that releases I suppose.
 
That's actually really good to hear. How are the frame rates with everything up at that res?

I haven't been following too closely as I haven't quite been ready to jump for the living room yet anyway. I need to upgrade the HTPC in there as well. Right now it's an i5 with a GTX-660 (my 1070 is in my desktop). I'll probably build a Ryzen system with an 1170 (or whatever they call it) and then upgrade the TV.

That's really the main thing I was holding off for. I want a large 4K eventually, but really want a video card that can properly keep up, and don't want to pay $700-1200 to do it. I was thinking maybe the 1170 would be up to the task, but we'll find out when that releases I suppose.
Everything but Hairworks.


I get around 50 FPS on my machine with HW 4xAA.
 
Everything but Hairworks.

I get around 50 FPS on my machine with HW 4xAA.

Not too bad. So, we're approaching the sort of performance I want if I'm going to jump, but not quite there yet in the price bracket of video cards that I typically buy these days.

There was a time when I'd pay for the top end. That was before the days of Titan cards. I've never paid $1000+ for a single video card. I've paid $600 or $400x2 for SLI setups before. However, since I have other hobbies, kids, wife, house etc. these days, I like to stay in the GTX x70 range or basically the $350-$500 range if we're going to skip the brand, and I don't do dual setups anymore. Say the GTX-1170 (assuming name) comes in at $499 (maybe even $550 as I'm fairly flexible) and maybe it edges out the 1080Ti. (Hypothetical of course.) That might push me to finally jump to 4K. That's a lot of IFs though. I'll probably be waiting one more generation if I'm honest. I do plan to grab the 1170 though assuming it's interesting, so maybe I'll change my mind around then. I REALLY like things synced at 60, so that's my personal target. (without dropping a bunch of settings)
 
At 4K you can sit close, even with a 49" TV, and not need any AA at all. Which means no AA artifacts or blurring when looking at in game objects like cables/wires/ropes, chain link fences, vegetation etc. And of course, no jaggies on edges. The only reason I don't run 4K is my SLI 980Tis can't run it fast enough to be fluid in Ultra in all games. Waiting for 1180 :)
 
That's actually really good to hear. How are the frame rates with everything up at that res?

I haven't been following too closely as I haven't quite been ready to jump for the living room yet anyway. I need to upgrade the HTPC in there as well. Right now it's an i5 with a GTX-660 (my 1070 is in my desktop). I'll probably build a Ryzen system with an 1170 (or whatever they call it) and then upgrade the TV.

That's really the main thing I was holding off for. I want a large 4K eventually, but really want a video card that can properly keep up, and don't want to pay $700-1200 to do it. I was thinking maybe the 1170 would be up to the task, but we'll find out when that releases I suppose.
Don't ask me about framerate, I don't monitor it and I couldn't tell the difference between 30 and 60 anyways. It's not an issue for me. <shrug>
 
I play at 144Hz with 144 FPS, and I can really feel a lower frame rate or refresh.

4K, as it is now, isn't for gaming. It's fantastic for productivity though.
 
Is there a technical definition as to what exactly 4K textures are? Because they can't possibly be 4000x4000 pixels, that would be ridiculous... just like conversations about pixel density without considering viewing distance.
 