Do you regret buying a 4K?

Comixbooks

Fully [H]
Joined
Jun 7, 2008
I read these reviews on Amazon where people dabble with 4k but end up going back to 1080p, or more often 1440p, just because the FPS is bad in newer games at 4k, either because the optimization is bad or their computer can't handle it. I'll never run SLI myself due to the cost and the problems of running SLI.
 
Decided to stay 1440 at 144hz running Crossfire (smooth for me).

Glad I did, and until 4K refresh can at least pass 120hz, I ain't moving.
 
Like you said Comixbooks - unless you have the hardware for it, it's only going to disappoint. If I were a more hardcore gamer, I'd stick with 1080, 1440 or 1600 - but given that I mostly just watch YouTube (4K videos are eye-popping) and maybe play an RPG or Civ5 now and again, I don't really have much need for maximum uber frames per second. Hell, I'm happy at a smoothed-out 32+ FPS, much less 60; 120 or 144 is just overkill for my needs. If I get the urge, I guess I can always add another Titan X. Seems people are selling them off to move to Ti's.
 
From some of the reviews I've seen, quite a few 4K monitors run 120 or higher @ 1080p so Aadik has a point.
 
The hardware doesn't run 4k very well yet. Should we wait for the next generation of video cards and CPUs next year? That's what bothers me about choosing between 4k, 2k, 1080p, and screen size... oh god...
 
Just lower the game quality settings. I tried the dam scene in Crysis 3 several times at high and at medium, and didn't notice any difference. Perhaps in a side-by-side comparison it would be more obvious; otherwise you have to know in advance which artifacts to focus your attention on. I was able to raise the FPS from 30 to 40 on a humble GTX 970, and 40 was quite acceptable for me.
 
I read these reviews on Amazon where people dabble with 4k but end up going back to 1080p, or more often 1440p, just because the FPS is bad in newer games at 4k, either because the optimization is bad or their computer can't handle it. I'll never run SLI myself due to the cost and the problems of running SLI.

1) 4k is 2x the width and 2x the height of 1080p. So no need to "go back": you can still use your 4k monitor at 1080p without any visual problems, and at the same time use the full 4k for desktop & apps.
2) Why does everyone need to run 4k Ultra with 4xAA? AA at 4k is close to useless.
 
As to the OP's original question - no, I do not regret buying a 4K :)
 
I regret buying a 4k, not because of bad frame rates or anything, but because back then I failed to do any sort of research whatsoever and simply bought what was available. I ended up with a grainy matte TN panel with 35ms of input lag, and I wasn't willing to pay the restocking fee to return it.
 
Yes, I did when I got a Seiki 39" 4k last year. It just felt chunky. I got a BenQ 4k IPS 5 months ago and it got better. What really changed it, though, was finally setting up the 980 Ti in SLI. Wow. 4k is glorious now.
 
I just picked up a Samsung JU7500 40 inch and all I can say is wow. Running dual 970s, and I only had to drop AA in BF4 to get 60fps. Makes all games look better; hell, even Diablo 3 looks nice now lol. The black levels on the Samsung kill my old monitor. Really enjoying it. I'm not a twitch gamer, so the input lag hasn't bothered me; haven't really noticed it at all. I will be keeping this monitor for a long time, or until there is a 4k OLED to replace it.
 
What people don't realize is that actual picture quality matters far more for games than the number of pixels.
 
1) 4k is 2x the width and 2x the height of 1080p. So no need to "go back": you can still use your 4k monitor at 1080p without any visual problems, and at the same time use the full 4k for desktop & apps.
Without integer scaling it makes little difference whether it's 2x or some non-integer factor; 1080p on a 2160p monitor will look almost identical to how 1080p looks on a 1440p monitor. That holds for most images, except rare cases like an image with a scanline effect.

What you wrote would make sense for a monitor that has integer/point scaling. In that case, 1080p on a 2160p monitor would look as good as, if not better than, on a native 1080p monitor.
 
Decided to stay 1440 at 144hz running Crossfire (smooth for me).

Glad I did, and until 4K refresh can at least pass 120hz, I ain't moving.

Same. I have just gone from an Asus 28" TN 4k to a BenQ 32" VA 1440p and the overall image, although not quite as sharp, is just miles better (same price). I hardly ever return things, but the Asus lasted about half a day! My last screen lasted 6 years and had 8000 hours on it.

(The build quality and plastics were terrible, garbage in fact; I don't think I'd buy another Asus monitor at that price range.)


Simply having more pixels doesn't mean better image quality past a certain point. I thought we all learned that when plasma was still out and producing better images at 720p than LCD at 1080p.

I expect the same would occur in a comparison between a nice 1080p OLED and a mediocre no-name 4k screen. The OLED would trounce it.



Btw, the performance cost of 4k is overrated. Sure, I don't play heavy-duty games, but I dropped AA from x4 to off, selected medium shadows, and I could hit 40-60fps on a single GTX 760! Sure, there were often times when my VRAM ran out and there were dips to 25-30fps in some games, but if you have a 980 Ti, for example, and do what I did to the settings, you're pretty much there. Next gen, single cards will handle 4k in all the current-gen games, but you're always going to be spending more, the more pixels you choose to have in your rig.
 
Forgot to add: with scaling on a 28" I did notice a big loss in screen real estate, but without it you risk hurting your eyes long term. I had to have the 28" screen less than 30cm away from my eyes to read the Steam fonts! You probably need about 34" to be good with 4k, or a 39.5".

I have always defended TN because my last panel was actually really nice for its time, but now that I'm using VA I wouldn't want another TN ((sorry TN owners)).
 
Never bought a 4K for monitor use, but I had a 55" Samsung HU8550. I eventually returned it and moved to the EC9300 OLED (1080p). There's just no comparison; the OLED blows it away. Contrast makes a far bigger impact on PQ than resolution, in my opinion.

I also have a LG 34UM95 but still end up using the 1080p OLED for most of my gaming now. I'll probably move to a 4K OLED next year though assuming they get the input lag under control.
 
Decided to stay 1440 at 144hz running Crossfire (smooth for me).

Glad I did, and until 4K refresh can at least pass 120hz, I ain't moving.

This

After reading the [H] write-up about how two Fury X cards barely keep games at 70 FPS at 4K, it would seem GPUs need more time for 4K to hit a nice price/performance point.
 
I'm waiting until GPUs mature. Another two generations and 4K will be attainable on mid-range cards.
 
Without integer scaling it makes little difference whether it's 2x or some non-integer factor; 1080p on a 2160p monitor will look almost identical to how 1080p looks on a 1440p monitor. That holds for most images, except rare cases like an image with a scanline effect.

What you wrote would make sense for a monitor that has integer/point scaling. In that case, 1080p on a 2160p monitor would look as good as, if not better than, on a native 1080p monitor.

But 1080p on UHD (because that is what most people here mean by 4k) IS integer scaling. 3840/2=1920, 2160/2=1080. If you display one 1080p pixel on a UHD screen, you display it as a 2x2 block of "UHD pixels".
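That 2x2 replication is just nearest-neighbour scaling at an integer factor, and it can't introduce blur because every output pixel is an exact copy of a source pixel. A minimal sketch in plain Python over a tiny hypothetical grayscale grid (illustrative only, not how any monitor's scaler chip actually works):

```python
# Integer (nearest-neighbour) 2x upscaling: each source pixel becomes
# a 2x2 block of identical pixels, so edges stay perfectly sharp.

def integer_upscale(image, factor=2):
    """Replicate each pixel of a 2D grid into a factor x factor block."""
    out = []
    for row in image:
        scaled_row = []
        for px in row:
            scaled_row.extend([px] * factor)        # widen: 1920 -> 3840
        for _ in range(factor):
            out.append(list(scaled_row))            # heighten: 1080 -> 2160
    return out

frame = [[10, 20],
         [30, 40]]                                  # tiny stand-in for a 1080p frame
print(integer_upscale(frame))
# -> [[10, 10, 20, 20], [10, 10, 20, 20], [30, 30, 40, 40], [30, 30, 40, 40]]
```

The complaints in the thread boil down to monitors running a generic interpolating scaler even for this exact-factor case, instead of the trivial replication above.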
 
I have an Acer XB280HK and Titan SLI.

I mostly do "regret" it and I am now thinking of 1440p @ 120+ Hz instead. I ran a Dell 1600p screen for years and I think that resolution (and 1440p) is great at the 27" - 30" segment.

I can tell you for sure that this particular monitor doesn't scale 1080p perfectly. It does look blurry. I try to play CS:GO at 1080p because at 4K I just can't be precise or quick enough with headshots in my advanced age and on 1080p it's not even close to crisp.

A lot of games have a very hard time running on 4K at high or ultra. Far Cry 4 was really tough to get 60 FPS on (low/medium),

Some games run pretty well. BF4 is surprisingly optimized. GTA5 worked pretty well. Diablo 3 and Starcraft 2 are fine. Skyrim works fairly well.

And then some older game engines can barely do it, or can't do it. Stalker struggles. Metro Redux is tough.

Anything in DX9 usually has some strange shading or sharpness problem at 4K. I had to upgrade to the Dark Souls 2 Scholar of the First Sin edition because the original game was DX9 and had serious issues with the Steam overlay and in-game text being over-sharpened.

The desktop experience is also tough. I run 125% scaling in Windows because 150% breaks a lot of video games. Most apps are blurry at 125% - Steam, Trillian, older apps, Adobe AIR apps... but Chrome is fine and that saves the day for me.
 
But 1080p on UHD (because that is what most people here mean by 4k) IS integer scaling. 3840/2=1920, 2160/2=1080. If you display one 1080p pixel on a UHD screen, you display it as a 2x2 block of "UHD pixels".
Nice theory. If only it happened this way in practice. :D
 
I can tell you for sure that this particular monitor doesn't scale 1080p perfectly. It does look blurry. I try to play CS:GO at 1080p because at 4K I just can't be precise or quick enough with headshots in my advanced age and on 1080p it's not even close to crisp.

My brief experience with the Asus PB287Q was that 1080p scaled to 4k looked really blurry. I had bought the Asus expecting a better scaler, and since 1080p maps exactly to 4k at 4x the pixels, I had expected a very clean image running scaled.

Such irony, then: I can run 1080p on my new BL3200PT and it looks good enough to play, with hardly any LCD scaling 'blur'. In fact it's a godsend where some 2D games are only drawn for 1080p; they look clean, yet of course chunky, as the pixels are fatter. That's a screen with far fewer pixels and 3.5" bigger. Looking forward to plugging a console in over HDMI, tbh; for the first time I think it will be perfectly doable without looking terriblur..

That said, when GPUs can handle 4k better, it's an eventual logical move to use 4k, so long as the panel chosen is suitably big (34"+).
 
1) 4k is 2x the width and 2x the height of 1080p. So no need to "go back": you can still use your 4k monitor at 1080p without any visual problems, and at the same time use the full 4k for desktop & apps.
2) Why does everyone need to run 4k Ultra with 4xAA? AA at 4k is close to useless.

How small is the desktop at 4K on a 27 inch display?
How small are webpages, buttons, text, etc.?
 
Without integer scaling it makes little difference whether it's 2x or some non-integer factor; 1080p on a 2160p monitor will look almost identical to how 1080p looks on a 1440p monitor. That holds for most images, except rare cases like an image with a scanline effect.

What you wrote would make sense for a monitor that has integer/point scaling. In that case, 1080p on a 2160p monitor would look as good as, if not better than, on a native 1080p monitor.

I don't catch your point.
I use DSR 4x on my 1080p monitor and it looks amazing: no blurred image, and every scaler handles a 4x image well.
 
Nice theory. If only it happened this way in practice. :D

It's so frustrating that the handful of display scaler designers apparently live under a rock and consistently refuse to spin a chip that hits all the checkboxes: 4k 60Hz 4:4:4, 1080p 120Hz with correct scaling, low input lag, ...
 
Nope. Bought a 40" Crossover and a 295X2, loving that combo. Only into the whole setup for around $1100.
 
Nope, I have a CRT, so every resolution is native. 4K when I want it, 1600x1200 when I don't.
 
4k @ 28" didn't feel worth the compromises for gaming. (gpu upgrade, lower settings, 60 hz.)

4k @ 40" was a different story.
 
I don't catch your point.
I use DSR 4x on my 1080p monitor and it looks amazing: no blurred image, and every scaler handles a 4x image well.

DSR has absolutely nothing to do with monitor scaling.

Downscaling as a result of DSR is done entirely on the GPU, so a 4k image will look sharp on a 1080p screen because it is an integer multiple of the native resolution.

Upscaling is, to my knowledge, done on the monitor itself, and almost all monitors use some form of internal scaler for all resolutions, even for half-resolution images, which blurs the image.

A 4k image displayed on a 1080p screen will look good and sharp. A 1080p image displayed on a 4k display is almost always blurred, even when there shouldn't be any blur.
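The GPU-side downscale described above goes the other direction: render at 4k, then collapse each 2x2 block back into one output pixel. A rough sketch of 2x box-filter downscaling over a hypothetical grayscale grid (DSR's actual filter is a configurable smoothness filter, not a pure box average, so this is only the simplest case):

```python
# Rough sketch of 2x box-filter downscaling, the kind of reduction a GPU
# can do when shrinking a 4k render onto a 1080p panel.
# Each factor x factor block is averaged into a single output pixel.

def box_downscale(image, factor=2):
    """Average each factor x factor block of a 2D grid into one pixel."""
    height, width = len(image), len(image[0])
    out = []
    for y in range(0, height, factor):
        row = []
        for x in range(0, width, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))   # mean of the block
        out.append(row)
    return out

rendered = [[0, 4, 8, 8],
            [4, 8, 8, 8],
            [0, 0, 2, 2],
            [0, 0, 2, 2]]                         # stand-in for a 4k render
print(box_downscale(rendered))
# -> [[4.0, 8.0], [0.0, 2.0]]
```

Because the averaging happens on the GPU before the signal leaves the card, the monitor receives a native-resolution frame and its internal scaler never gets involved, which is why DSR stays sharp while monitor upscaling often doesn't.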
 
Well I have my sights on BL3201PT to complement my PG278Q, so I will let you know.

But I chose the PG278Q due to a lot of on-paper reasons. I moved from a 24" 1080p TN to a 27" 1080p IPS, then to the Swift. At the time, I didn't want anything larger than 27" because I like everything in my sights.

Many prefer larger screens due to immersion. I didn't want a large screen because in a lot of games it wouldn't let me see more, but actually less (resolution doesn't change your FOV unless you go ultra-wide, so the FOV remains the same, and a larger screen would let you see less if it became bigger than your eyes' field of view).

I am getting the 3201PT solely as an IQ monitor, for those top-down games and movies; otherwise I will still use my Swift as my main FPS monitor.
 
Well I have my sights on BL3201PT to complement my PG278Q, so I will let you know.

But I chose the PG278Q due to a lot of on-paper reasons. I moved from a 24" 1080p TN to a 27" 1080p IPS, then to the Swift. At the time, I didn't want anything larger than 27" because I like everything in my sights.

Many prefer larger screens due to immersion. I didn't want a large screen because in a lot of games it wouldn't let me see more, but actually less (resolution doesn't change your FOV unless you go ultra-wide, so the FOV remains the same, and a larger screen would let you see less if it became bigger than your eyes' field of view).

I am getting the 3201PT solely as an IQ monitor, for those top-down games and movies; otherwise I will still use my Swift as my main FPS monitor.



Have you checked out the brand new Samsungs, or the Asus PA328G, Philips BDM3275UP, etc.? The BenQ has flickering... derp.



Also... I have no idea what the heck you are talking about with regards to 4k not having more on screen. Perhaps you never played on older monitors, or in 16:10 format, but it is superior to wide format for gaming, because of the FOV issues in wide-format gaming.

There are diminishing returns to FOV on ultra-wide screens, especially when you have to lose sight of the (horizontal) ground just to aim up at a 2nd-story window (ie: Battlefield). Your eyesight does not take in information the way ultra-wide presents it.

At 16:10, a single pan (horizontally) in either direction is all you need, ground or sky.
In ultra-wide format, you literally have to lift your whole field of view off the ground just to pan above the horizon. It is unnatural and a gimmick. Although kewl as phuk.. (Acer X34)


A 4k 39" monitor means you are essentially in the game and have no need of panning all over; instead you ARE moving through the grass. The immersion is there, but the performance is not. But obviously if you are going gaming, the 32" is the way to go. That new Asus 32"... I can't find it anywhere for sale. I'd get that over the BenQ.


I am buying the Acer X34 because I want to see what all the fuss is about.
But for me, a 32" 4k at 120hz with low latency & response time is all I need right now. Question is, why won't they build it? A: Because DisplayPort 1.3 is not being supported yet.
 