Need a new monitor and pulling my hair out trying to choose one. Please Help

Spyhawk

Yep... I just recently upgraded my video card to an MSI GTX 980 Ti Gaming 6G OC edition and figured I should also upgrade my simple 1080p, 60Hz, 27" TN monitor to something much better now that I have the horsepower. But I've been out of the loop on monitors for so long that trying to decide between 4K, 1440p, 144Hz, ultrawide, G-Sync, and curved is driving me nuts. I don't upgrade often, so I really need to get this right.

I'm willing to pay up to $1k CAD for a solid monitor that's an all-around great performer with a good reputation and great reviews.

Was thinking 4K, but I don't wanna saturate my video card to the point that I'll need to lower in-game settings so soon after upgrading LOL... so maybe I'd be better off at 1440p. I really don't know; I've never had a premium monitor before. I was looking at the Asus ROG Swift, but the reviews on that monitor are up and down.

I will be using this monitor as an all-around multi-purpose display: playing games (all genres) 40%, watching Netflix 20%, HD TV 20%, and surfing the net 20%.

1) Is G-Sync really worth the added price?
2) Does FreeSync work with Nvidia cards?
3) Is there really a noticeable advantage to a 144Hz display?
4) Is going 4K on a single 980 Ti a wise choice?
 
Given what I have seen on a 4K monitor and on a 1440p monitor, personally I would prefer 1440p 144Hz over 4K 60Hz, especially if you are working with just one 980 Ti.

A 980 Ti can manage medium settings in most games at 4K, though not much more than that on its own, and there are issues with SLI that would make 4K far too dependent on SLI for my taste (my benchmark game is Wolfenstein: The New Order; if a game works on a single GPU at that resolution, I'd consider it, otherwise I'd stick to a lower resolution).

In cases where your GPU is too powerful for the game, there is DSR, which *still* doesn't work in SLI UNLESS you use a non-G-Sync monitor. MFAA flat out doesn't work with SLI.

1) Depends on your tolerance for screen tearing. Personally, I find tearing wholly intolerable, so I'd jump on any tech that helps eliminate it with as few side effects as possible.

2) No, otherwise G-Sync monitors would have tanked.

3) There are plenty of advantages to going 144Hz, but a lot of it comes down to motion clarity (there is less sample-and-hold blur) if you can maintain more than 60fps, compared to 60Hz panels (rough numbers at the end of this post).

4) Personally, I wouldn't use 4K as my primary resolution.

I would only consider 4K if you are willing to buy a quality 4K panel (VA or IPS; I don't think 4K should ever use TN, even though I find TN perfectly acceptable if the quality is good enough) AND go SLI, and are willing to put up with any potential SLI issues.

I use 4K only for games that do not require a high degree of motion clarity (basically most non-FPS games) and games that are not demanding, or for movies and such. I switch back to my Swift for FPSs, especially demanding ones like TW3. 4K will look good; performance is another issue. I strive for a balance of visuals and performance, which is why I went to 1440p initially and only added 4K later as part of my learning process, to find out whether 4K is all hype or all glory. I found it to be a mix of both, but if I could go back in time I would not get 4K as my primary screen, for the same reason then as now: SLI dependency, which itself brings other dependencies.
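To put rough numbers on the motion-clarity point in 3) above: on a sample-and-hold display each frame stays lit for the whole refresh interval, so an object your eye tracks smears across roughly (pan speed × hold time) pixels. A minimal sketch; the 1000 px/s pan speed is purely an illustrative assumption:

```python
# Rough sample-and-hold blur estimate. Assumption: the eye smoothly tracks
# an object moving at `pan_speed_px_s`, and each frame is held on screen
# for the full refresh interval (no strobing).

def blur_width_px(refresh_hz: float, pan_speed_px_s: float = 1000.0) -> float:
    hold_time_s = 1.0 / refresh_hz
    return pan_speed_px_s * hold_time_s

for hz in (60, 100, 144):
    print(f"{hz:>3} Hz: ~{blur_width_px(hz):.1f} px of perceived smear")

# 60 Hz -> ~16.7 px, 144 Hz -> ~6.9 px. Strobing shortens the hold time
# further still, which is why strobed motion looks even sharper.
```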
 
I recently had a similar decision to make.
First off, ultrawide G-Sync is out of your budget.
Second, 144Hz is an absolute must and can only barely be managed at 1440p.
Third, G-Sync is a must for everything that can't hold a full 144 FPS at 144Hz.

A real upgrade for you is the Acer XB271HU, which has the right panel quality and features and won't make your video card bleed. 4K is nonsensical for a single graphics card, and DSR massively compromises IQ. I didn't know it last week, but this week, with my new Acer, I found out what I was missing running DSR on a 1080p monitor. Let me just say 1440p native looks better than 4K DSR by a mile.
 
SINKs, whether F or G, are pointless and should not be considered. You cannot use SINKs with strobing, so they are a moot point.

Look for a 100Hz+ monitor with backlight strobing, such as the EIZO FG2421 or ASUS PG278Q. The FG2421 is a VA panel, so it is better quality but lower-res than the Asus.
 
^ Again, that is entirely dependent on personal perspective. Rabidz does not mind tearing, which makes the majority of the G-Sync/FreeSync benefit a moot point, whereas others, like myself, find tearing extremely distracting, so G-Sync has been a godsend.

The previous poster also prefers motion clarity over everything else. I don't, since I have been using LCDs for many years and am so used to the blur that I don't notice it; it has gotten to the point where I don't really notice what backlight strobing adds unless I go out of my way to look for it (in other words, when I am not playing a game).
 
He doesn't mind tearing because originally he may not have had the option to mind it.
Once he sees these displays in action, he will realize what he is missing.

Same thing with me. I used to tout that DSR 4K is fantastic. Not even a week later, and I can't imagine playing anything with DSR on.

You can't comment or form a perception on something you have not experienced. Once you experience it, you will learn to appreciate it.
 
1) Is G-Sync really worth the added price? YES. The reduction in input lag, up to 3 frames less than running with V-Sync on, plus the lack of tearing, is very much worth the additional cost; we are talking NES-on-a-CRT-type responsiveness with your mouse and controller! (See the rough math below.)
2) Does FreeSync work with Nvidia cards? NO.
3) Is there really a noticeable advantage to a 144Hz display? YES, for games that support higher than 60 fps.
4) Is going 4K on a single 980 Ti a wise choice? If you run games without AA, then one 980 Ti is just fine for 4K gaming. I used a Titan X with a 4K TV, and as long as I ran with no AA, no game gave me any real issues.
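A quick sanity check on that "3 frames" figure; a sketch, assuming V-Sync can buffer up to three full refresh intervals before a frame actually hits the screen:

```python
# Worst-case extra latency from a V-Sync frame queue, in milliseconds.
# Assumption: up to `queued_frames` full refresh intervals of buffering.

def added_lag_ms(refresh_hz: float, queued_frames: int = 3) -> float:
    return queued_frames * 1000.0 / refresh_hz

print(f"60 Hz + V-Sync:  up to {added_lag_ms(60):.0f} ms extra")   # ~50 ms
print(f"144 Hz + V-Sync: up to {added_lag_ms(144):.0f} ms extra")  # ~21 ms
# G-Sync sidesteps the queue by refreshing the panel when the frame is ready.
```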
 
This thread is something I could have made for myself and it's already helped me. I'm running a single 980Ti and intend to stick with it for a while.

I've ruled out 4K for myself, and I'm sick of tearing, so I want G-Sync. I really prefer to play at or close to 60fps whenever I can, so I know 4K is out of the question for me.

Back this past late spring I took that BenQ 24G... forget what it's called, the 1080p G-Sync 144Hz 6-bit TN, for a spin, and despite the drawbacks of that TN panel, which I personally couldn't live with, I haven't forgotten what the tech brought to the table, and it's been haunting me ever since. ;)

I went from CRT to plasma for TV, and VA and IPS panels for LCD, so that was my first and likely my last foray into TN land.

The split market of G-Sync and FreeSync is a true shame, and I think that split is holding some of this back. I do hope it can be resolved somehow in the near future.

I'd feel a lot more confident about making a purchase if I didn't see so many outrageous panel-roulette situations here and abroad, most notably in the AU Optronics panel threads. For $800 or even more, I need to not see what too many people have been seeing on their expensive purchases.

Rabidz named the Eizo monitor, which was something I had high hopes for, but it definitely was and is a hit-and-miss product at best, which is a shame.

I really don't have a lot of confidence in this display market right now, all told.

Now we're pretty close to CES so there's always another excuse and reason to wait.
 
Now we're pretty close to CES so there's always another excuse and reason to wait.


I would wait and see what CES 2016 comes out with; chances are you will at least be able to make a more informed decision after that. Keep in mind most monitors announced at CES can take as much as 11 months to come to market (i.e., the Swift IPS and 4K models).

If I absolutely had to get a monitor right now and G-Sync and 4K were not important, I would go with the HP 27XW or HP 25XW. You could run a 980 Ti at 1080p all maxed out at 60 fps and get the benefits of the LG AH-IPS panel those monitors use. That 980 Ti will last a LONG time at 1080p resolution :)
 
So, I think it's time someone from the 4K side chimed in. All you G-Sync fanboys back off; it's my turn!

I am a long-time user of 3 normal 24" 1080p monitors who just moved to a single JU7500 40" 4K. I recently moved from an R9 290x to a GTX 970, an upgrade made to gain the 4:4:4 HDMI support on the 9xx-series cards. My personal opinion: 4K, yeah, it's not 144Hz, yeah, it's not G-Sync, and yeah, it's not without its issues, but I think the other people in this thread are not quite giving the whole story.

With a single 980 Ti it is unlikely you will be able to max out the FPS in Fallout 4 at 4K with all the highest settings, but even with two 980 Tis on a 40" 4K I doubt you could max out the FPS at 4K. You CAN, however, max out the settings with a 970 at 1440p. I have every setting as high as I can get it, I am running 32 ultra-HD texture-enhancing mods for damn near everything, and my game sits firmly at 60fps (a cap, I guess, idk, haven't had time to look into it), only dipping into the 40s when I am either in a super-populated area like Diamond City or in the Prydwen looking over the entire world.

Not to mention that if you want to keep the FPS even higher, just run the game in windowed mode at 1440p. Now it's the same dimensions as those other 27" monitors you are looking at, it has a high FPS, and if you feel like it, you can switch from gaming to surfing to gaming again without delay.

The second thing is screen size. The JU7500 (what I am using, but essentially any 40" 4K monitor) is huge, with so much room to do everything at once. You say you spend 40% of your time watching Netflix and surfing (not sure about the missing 20% :p), and doing those on a huge screen is amazing. 4K Netflix movies, huge browser windows open WHILE watching a movie at the same time; it's quite the spectacle.

Not gonna lie, I've never used a 144Hz monitor, nor a G-Sync monitor. I have, however, been a competitive (for fun) FPS gamer for 10+ years, and I have no issues with this monitor. The ONLY negative thing I can say about this particular panel is that I think the PWM is causing eye strain, since I like to keep the screen at a low brightness, which intensifies the invisible strobe effect. Then again, I have also been under the weather and working late for a week or so now, which could just as well be the root cause of the eye strain, so take that with a grain of salt.

In the end it really does not matter what you buy =P. They all have their individual features, each excelling in its own particular specialty, and you will likely love each one for different reasons.
 
You can't comment or form a perception on something you have not experienced. Once you experience it, you will learn to appreciate it.

I have tried playing on a GSINK monitor at a computer store. It wasn't impressive. It was just V-Sync without input lag. I would much rather have my CRT, but if I wanted an LCD, I'd buy one with strobing. If SINKs worked with strobing, then it would be a slight benefit. But SINKs do not work with strobing, so using them actually significantly reduces quality vs strobing.
 
Also, if you get a 3840x2160 LCD you should be able to scale cubically to 1920x1080, which shouldn't add blur. If you get a CRT, you could just drop the res.

A 980 Ti is OK for some things at 3840x2160, some things not.
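For what it's worth, 1080p to 2160p is an exact 2x ratio, so the cleanest form of that scaling is plain pixel doubling: every source pixel becomes a 2x2 block, no interpolation, no added blur. A minimal numpy sketch of the idea (a dummy frame, not any monitor's actual scaler):

```python
import numpy as np

def pixel_double(frame: np.ndarray) -> np.ndarray:
    """2x nearest-neighbor upscale: repeat each pixel into a 2x2 block.

    No interpolation is involved, so edges stay exactly as sharp as the
    source -- this is what blur-free 1920x1080 -> 3840x2160 would look like.
    """
    return frame.repeat(2, axis=0).repeat(2, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder frame
print(pixel_double(frame_1080p).shape)  # (2160, 3840, 3)
```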
 
Also, if you get a 3840x2160 LCD you should be able to scale cubically to 1920x1080, which shouldn't add blur.

And the number of 4K screens that actually do this... pretty much none of them. Thank the scaler chip engineers that apparently live under rocks.
 
In fact, I know of exactly 0 monitors that can do perfect half-resolution scaling.

The only thing that I know of that can help alleviate the blur is getting a smaller 4K screen, which makes the blur less noticeable (the P2415Q, VX2475-SMHL and EA244UHD are the smallest 4K monitors available, at 24").

Now, whether 4K is actually beneficial at 24" at all is a completely different debate.

Bottom line is, if you are going to rely on a lower resolution to keep your game performance up, you are much better off buying a lower native-resolution monitor, or at least having one such monitor handy.
 
And the number of 4K screens that actually do this... pretty much none of them. Thank the scaler chip engineers that apparently live under rocks.

Who the hell uses a scaler inside a monitor?! If I knew how, I'd desolder the little shit from the PCB.

Scaling is always done on the GPU, at least if your IQ is 30+.
 
In fact, I know of exactly 0 monitors that can do perfect half-resolution scaling. [...] Now, whether 4K is actually beneficial at 24" at all is a completely different debate.

See my other post about scaling.

24" is a good size monitor. Going 27+ is excessive.
 
If scaling were done on the GPU, it would not explain why each and every monitor (Swift included) has blurry edges when upscaling half-resolution images, nor would it explain why different monitors with the exact same size and resolution behave differently across scaled resolutions.

CRTs may not have any internal scalers, but a lot of modern LCDs do, which makes no sense for half-resolution images, but there you go. I'd imagine removing the scaler would completely destroy the functionality of the controller.

Compare that with 4x DSR, which is done entirely on the GPU, does the reverse, and shows no artifacts.
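The downsampling direction really is that clean at integer ratios: 4x DSR renders at twice the output width and height, then collapses each 2x2 block of rendered pixels into one. A rough numpy sketch of that reduction, assuming a plain box filter (NVIDIA's actual implementation adds a configurable Gaussian smoothness pass on top):

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block into one pixel -- the core idea of 4x DSR.

    Every output pixel maps to exactly four rendered pixels, so there is
    no fractional resampling and nothing to produce edge artifacts.
    """
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

rendered_4k = np.random.rand(2160, 3840, 3)  # stand-in for a 4K render
print(downsample_2x(rendered_4k).shape)      # (1080, 1920, 3)
```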
 
If scaling were done on the GPU, it would not explain why each and every monitor (Swift included) has blurry edges when upscaling half-resolution images [...]

A lot of people just send a non-native-res signal to the monitor. Most folks don't use GPU scaling or don't use it properly.

CRTs don't ever have scalers, as they don't have native resolutions. Most LCDs have internal scalers for the idiots that can't figure out how to use a proper resolution. I have some LCD screens without scalers; you just have to send the right res. Wrong res = uselessly garbled picture.
 
I've taken a look at the purely gaming monitors, and although they seem excellent for playing games, they have downsides, for example just one DisplayPort input due to G-Sync restrictions, so I can't connect anything else to them. That's a showstopper for me. I'm not a professional gamer, nor do I only play FPS shooters; I play a whole range of genres. I also don't only play games on my monitor: I watch NHL hockey from a cable box, I play console games through my PS4, I watch episodes on Netflix, and I surf the net. So I want my next monitor to be an all-around great multimedia performer for all those things, at 27 to 32 inches and at 1440p.

I was starting to warm up to this one; it's got excellent reviews as well: the Samsung S32D850T.
 
The PG279Q now has 1 HDMI port as well, and so does Dell's S2716DG, so that restriction no longer applies, unless you connect more than one HDMI device to your monitor at the same time.

The lack of HDMI on the older iteration of G-Sync monitors was totally inconsequential to me, since I didn't have any HDMI devices to connect, nor was I going to connect any. Also, I figured that even if I had one, I would rather get a second monitor for it than have several devices sharing the same monitor (not only is switching between them continuously a PITA, but I often want to use different devices at the same time, and a single monitor would defeat that entire purpose).

I'll have to look into GPU scaling, but I seem to distinctly remember being unimpressed with the result; I can't recall whether that was because it was unable to get rid of the blurred edges (I am very picky about this) or because it simply refused to work in games (most likely the latter).
 
rabidz7, you apparently have an IQ of less than 30 yourself.
GPUs use a bilinear scaling algorithm, and there is no way to change it to point scaling.

"GSINK" monitors usually come with ULMB, which is full-blown strobing, so getting a G-Sync monitor is simply the best choice.

I have but one question: are you
a) trolling?
b) high on DXM (or a similar neuron-killing brain melter) all the time?
c) both of the above?

@Spyhawk
In this day and age, 4K is a bad idea.
There are more issues with application scaling on smaller monitors than it is worth.
Games require significantly more GPU performance to run at 4K, and they simply do not have enough detail for it to make any difference. Framerate will be decimated by the sheer number of pixels that need to be drawn.

Imho, the best choice for you is the Acer Predator XB270HU (or the XB271HU, I dunno what was changed in it), which is a 27" 1440p panel with G-Sync (for more demanding games) and ULMB (when you can push >85Hz, this looks ridiculously awesome, butter smooth and sharp like on a CRT). You will be able to skip scaling entirely because fonts are big enough to read, and a 980 Ti is just about right for 1440p, so you won't have to worry too much about performance, especially with G-Sync.

If anything, do not sacrifice G-Sync/100Hz+ for a larger monitor. Some people get those curved super-wide-screen monitors for 'gaming'. It is just stupid. What immersion can there be with input lag and stuttering? I hate 60Hz monitors (for gaming; for desktop they are fine) with a passion and am proud of it :cool:
 
GPUs use a bilinear scaling algorithm, and there is no way to change it to point scaling.

"GSINK" monitors usually come with ULMB, which is full-blown strobing, so getting a G-Sync monitor is simply the best choice.

IIRC, I have used cubic GPU scaling in the past. Monitors that have GSINK only sometimes support strobing, and only if the SINK refresh rate is ~100+. All 4K LCDs lack strobing, same with all 3440x1440 LCDs.
Some 2560x1440 units can strobe, but only the TN models do 120Hz; the IPS 1440p strobes at 100Hz.
 
Throwing in my 2 cents: I recently upgraded to a Samsung U3415W and love it. Beautiful screen, great for throwing up multiple windows or for ultrawide gaming/videos. There are some compatibility issues out there. For me, the format ended up being more important than the refresh rate and response time that others chase. I can definitely see the advantage of 144Hz... it just isn't a big deal to me.
I will say, though, that I wouldn't buy one of these unless you could get it for $500 or less. That's where my purchase was.
 
I've looked at the Asus ROG Swift PG278Q and PG279Q and the Acer Predator XB270HU. First things first: I ain't touching the Asus monitor with a 100-yard pole LOL. I know some of you have had awesome experiences with that monitor, but the chances of getting a bad one are just too high for me to fork out $1,000 for a POS. Negative QC reviews on those monitors by customers are horrendous on the net, and I don't wanna be caught up in that. As for the Acer, well, they ain't in stock and I want my monitor before December 22, so there is always the Dell option with the S2716DG. So far most customer reviews I've seen on that monitor are solid, so I'm leaning in that direction. Oh, and btw... thanks to everyone for the advice. Appreciate the effort. :)
 
At first I thought I wanted an IPS panel, but after reading up on them I decided that IPS glow, backlight bleed, slower response times, and heftier pricing weren't worth it. Like, who the fook cares about 178-degree viewing angles on a PC monitor? This isn't a freaken TV in the living room where people watch from all angles. It's a PC monitor, where generally one dude sits right in front of it. As for color accuracy... well, I ain't a content-creation perfectionist; having perfect color isn't a must for me. As long as it's near enough, I'm satisfied, and today's high-end TN panels are damn well near enough... for me. :)
 
Also, if you get a 3840x2160 LCD you should be able to scale cubically to 1920x1080, which shouldn't add blur.
The problem is that the number of 4K displays which do this can be counted on one hand.
I've been trying to get NVIDIA to add this to their drivers, but no luck yet. It would be great if you could take the time to post your interest over on their forums. Hopefully it will eventually get enough attention that they do something about it, as 4K/5K monitors are becoming more common, and we should see 4K120 monitors next year.

[...] Not gonna lie, I've never used a 144Hz monitor, nor a G-Sync monitor. [...]
I don't think that really makes you qualified to compare a 4K60 monitor against a 1440p ≥144Hz monitor then.
I get that you're happy with your 4K monitor, but the difference between 60Hz and ≥144Hz is significant.

The only thing that I know of that can help alleviate the blur is getting a smaller 4K screen, which makes the blur less noticeable (the P2415Q, VX2475-SMHL and EA244UHD are the smallest 4K monitors available, at 24").
It really doesn't help in my opinion. Apple do the same thing when running older apps on iPads/iPhones, and even at phone/tablet sizes the blur is very obvious.

Now, whether 4K is actually beneficial at 24" at all is a completely different debate.
I'd be surprised if anyone argued that it wasn't beneficial. 4K at 24" is only around 180 pixels per inch. That's high resolution compared to a standard PC monitor, but low resolution compared to just about every other device, be it phones, tablets, or notebook displays.
8K panels can't get here soon enough, and hopefully 8K will be used at all sizes. 5K only works at 29" (200 PPI) or 58" (100 PPI) while 8K works at 22/29/44/88". (400/300/200/100 PPI)

I have tried playing on a GSINK monitor at a computer store. It wasn't impressive. It was just V-Sync without input lag. I would much rather have my CRT, but if I wanted an LCD, I'd buy one with strobing. If SINKs worked with strobing, then it would be a slight benefit. But SINKs do not work with strobing, so using them actually significantly reduces quality vs strobing.
It's amazing to me that you still haven't figured out the difference between "sink" and "sync", and that you don't understand what G-Sync or Adaptive-Sync/FreeSync actually do.
Unless you think you're being clever by calling it that, which you're not.

CRTs don't ever have scalers, as they don't have native resolutions.
CRTs do have native resolutions. Get up close to your monitor and you should be able to see that quite clearly. There is a finite number of phosphor dots/stripes on the screen.
CRTs certainly look a lot better than a flat panel when displaying a non-native resolution due to their analog nature, but they do have a "native" resolution.

Who the hell uses a scaler inside a monitor?! If I knew how, I'd desolder the little shit from the PCB.
Scaling is always done on the GPU, at least if your IQ is 30+.
If properly implemented, it would be better to use the scaler in the display.
With the current bandwidth limitations of DisplayPort/HDMI, you could theoretically have a 4K native monitor which accepts 4K60 and 1080p240 as inputs. This would be ideal for gaming if the scaler just used a low latency "pixel doubling" solution.
Some televisions currently support 4K60 and 1080p120.
If you were scaling to 4K on the GPU, you would be limited to 1080p60.
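The arithmetic behind that trade-off is straightforward: 4K60 and 1080p240 carry exactly the same number of pixels per second, so a link that can do one could in principle do the other. A quick check, ignoring blanking intervals and encoding overhead:

```python
# Raw pixel rate a DisplayPort/HDMI link must carry for each mode.

def pixel_rate(width: int, height: int, hz: int) -> int:
    return width * height * hz

print(pixel_rate(3840, 2160, 60))   # 497,664,000 px/s -- 4K60
print(pixel_rate(1920, 1080, 240))  # 497,664,000 px/s -- 1080p240
# Identical rates: a 4K60-capable input could also accept 1080p240 and
# pixel-double it to the native panel, if the scaler supported it.
```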
 
I'd be surprised if anyone argued that it wasn't beneficial. 4K at 24" is only around 180 pixels per inch. That's high resolution compared to a standard PC monitor, but low resolution compared to just about every other device, be it phones, tablets, or notebook displays.
8K panels can't get here soon enough, and hopefully 8K will be used at all sizes. 5K only works at 29" (200 PPI) or 58" (100 PPI) while 8K works at 22/29/44/88". (400/300/200/100 PPI)

This is coming from someone who compared the ZenFone 5 (the older ZenFone) and the PadFone S. The two phones have similar screen sizes; the former is 720p, while the latter is 1080p. I for one cannot notice the difference in resolution between the two. All I know is that the 1080p screen burns through the battery much faster than the 720p screen, and the GPU in the PadFone isn't actually good enough to drive 1080p, so for all intents and purposes I would have MUCH preferred a 720p screen on the phone itself (the Pad add-on can stay 1200p, no arguments).

My argument is that if text/icons must be scaled up to stay legible on a higher-resolution screen, then the majority of the benefits of that high resolution are lost, because the increase in screen real estate no longer applies.

Now, if we are talking about looking at images, then sure, the higher the better, the more detail you can squeeze in, but there are increasingly diminishing returns as the PPI rises to the point where we can no longer distinguish the difference.
 
This is coming from someone who compared the ZenFone 5 (the older ZenFone) and the PadFone S. The two phones have similar screen sizes; the former is 720p, while the latter is 1080p. I for one cannot notice the difference in resolution between the two. [...]
You're talking about ~300 PPI vs ~450 PPI on a 5" screen.
That's significantly higher resolution and a significantly smaller panel than monitor-sized displays.
A 24" 4K monitor is only ~180 PPI, not even two-thirds the resolution of the 720p ZenFone 5.

My argument is that if text/icons must be scaled up to stay legible on a higher-resolution screen, then the majority of the benefits of that high resolution are lost, because the increase in screen real estate no longer applies.
You're missing the point then. Workspace ≠ resolution.

Computers have basically been designed around 90-110 PPI displays for decades now.
Though there are exceptions, that's the "standard" resolution for a monitor.
It didn't matter what size the monitor was; resolution generally increased with size to maintain ~100 PPI, because that is required to keep unscaled text at a comfortable size for most people.
It's not too big, and it's not too small.

So until the last few years, your workspace was defined by the size of your monitor.
Resolution had nothing to do with it, since that was fixed at ~100 PPI.
The larger your monitor was, the more pixels it needed to stay at ~100 PPI, giving you a bigger workspace.

Now what would happen in an ideal world is that everything would always scale perfectly, so no matter what your display resolution is, be it 100 PPI or 130/150/180/220 etc. text would always remain the same size. It would just get sharper/clearer as the resolution increased.
What happens today is that, without scaling, things get bigger or smaller depending on the pixel density. That shouldn't be happening.
I realize that some people do like to view things unscaled on a 24" 4K monitor or even higher density displays than that, but most people will find text to be unacceptably small.

While arbitrary scaling doesn't work well today, what does work well is integer scaling. Scaling to 2x/3x/4x generally works fine.
And 8K is the perfect resolution for this:
  • A 22" 8K monitor gives you a 400 PPI display with a 1920x1080 workspace.
  • A 29" 8K monitor gives you a 300 PPI display with a 2560x1440 workspace.
  • A 44" 8K monitor gives you a 200 PPI display with a 3840x2160 workspace.
So you get nice clean scaling, and workspace scales perfectly with display size.
For the people that want it, you also have the option of changing the scale.
You could easily have a 7680x4320, 3840x2160, 2560x1440, or 1920x1080 workspace on any of those sized monitors. You'd just end up with really big, or really small text.
Actually, for the 22/29" displays, I could see moving up one workspace size working quite well for the people that don't mind the smaller text.

Current 4K monitors are the wrong size (24" is too big for a clean 200 PPI display, 40" is too small for ~100 PPI native; the clean sizes would be 22" and 44") and you have to compromise on resolution. I want a 3840x2160 workspace, but I don't want a ~100 PPI display.
With every display I look at these days having >200 PPI resolution, except for my PC monitors, the monitors are looking positively archaic.
But I accept it for now because I need the workspace more than I need the higher quality text, and because ≤60Hz sucks for gaming. (since no-one seems to have figured out that they could support 4K60 and 1080p120 or even 1080p240)

8K really seems like the "sweet spot" for resolution, since it works perfectly at all current monitor sizes, and it's at the point where I can't imagine a need for any greater resolution.
 
I've looked at the Asus ROG Swift PG278Q and PG279Q and the Acer Predator XB270HU. First things first: I ain't touching the Asus monitor with a 100-yard pole LOL. [...]

The Acers and Asuses are all using the same AU Optronics panel (for the IPS models), so it's roulette no matter which of these brands you pick. A complete coin toss and a crap shoot.
 
Get as much refresh rate as you can. G-Sync would be ideal, but using Adaptive Vsync is acceptable in most scenarios, at least in my personal experience... and it may be a viable alternative, since G-Sync monitors usually come with a hefty price premium that some may find hard to bite and chew. It's up to you. :)
 
Er... you mean Adaptive Sync? Adaptive V-Sync is an nVidia feature that dynamically switches V-Sync on and off depending on your framerate. Adaptive Sync changes the refresh rate of the monitor to match your framerate.

nVidia cannot use DP 1.2a's variable-refresh capabilities, that is true, though whether that is because the hardware physically can't or because it is merely unsupported at the driver level is a different question.
 
Er... you mean Adaptive Sync? Adaptive V-Sync is an nVidia feature that dynamically switches V-Sync on and off depending on your framerate. [...]

Ahh yes thanks for the info.

I'm really torn on this gen's IPS vs TN vs VA. They all have pros and cons. One minute I want IPS; then, after reading a few reviews, I want TN or VA. I can't seem to make up my mind. The reason for the holdup on IPS is that I've read so many horror stories about 27" 1440p 144Hz IPS panels from all manufacturers that it's spooking me LOL.
 
I know that feeling. I went through it when I decided to purchase my Swift amidst the QC crapstorm, but I got a very good panel (no issues so far, BLB is minimal, and I use it as a standard to compare IPS panels against).

You could try taking a look at the XB271HU (which is an improvement over the 270HU) and see if the horror stories are as prevalent for that model as for the 270HU or 279Q.

I can definitely say that no one monitor can do everything perfectly (at least none available right now), so I ended up getting two monitors with different panel types.

I ruled out VA because the BDM4065 uses PWM (I wasn't sure if I was sensitive to it, but I wasn't going to find out the hard way; return policies here in Taiwan just suck, period), and the other VA option is the BL3200PT, but that is 1440p, which I didn't want because I already have a 1440p monitor (I actually wanted a monitor of a different resolution; I would have preferred 1080p over another 1440p). Add to that the BL3201PT's low IPS glow, and I went with that panel instead. Good monitor, but the IPS glow still, for some reason, manages to rear its ugly head on blacks; I am very picky about the glow, and I find it borderline acceptable. If you are any less picky, I reckon you'd be fine with it.

If I could go back and do it all over again, I would probably get the 271HU as my main gaming monitor and wait for another 4K VA panel for movies or non-performance-heavy games, or wait until OLED drops. But I am happy enough with the Swift that I won't replace it with the 271HU.
 
I bought the Dell S2716DG and paired it with my EVGA 980 Ti FTW. It's a perfect combination; I couldn't be happier with my decision. The jump from 1080p/60Hz to 1440p/144Hz/4ms/G-Sync has completely changed my gaming experience. It's quite breathtaking. 1440p is the perfect resolution for a single 980 Ti.
 
Well, I've finally made my decision. I picked up an Acer Predator XB271HU a couple of days ago, and all I can say is... WOW. Playing BF4 feels like playing a new game LOL. Compared to my old TN monitor (a SyncMaster 245BW), the 271HU is just flat-out awesomeness. Everything is crisp, clear, and fast.

As for quality control, well, it's almost perfect. There were no scratches to be found. No dead pixels. BLB is extremely hard to find, though I think I have some in the bottom right-hand corner. I've also found 2 specks of dust, one in the bottom right and the other on the left-hand side, that are only noticeable on a white background. All in all, nothing that bothers me, so this is a keeper. :)

The DisplayPort cable they ship with this monitor is too short. I had to rearrange my whole setup to connect the monitor because of it. I mean, why can't they ship longer cables? Do they think most people keep their PC right next to the monitor? To make matters worse, stores (Walmart, Best Buy, Costco, Staples) don't even sell DP cables, so I have to order one online. :mad:

That's the only negative I have. Oh, and the price too. $950 CAD is the most I've ever paid for a monitor, by $450, but then again it's an IPS panel with G-Sync and 165Hz.

Thanks everyone for helping me out here. Appreciated. :)
 
That's the only negative I have. Oh, and the price too. $950 CAD is the most I've ever paid for a monitor, by $450, but then again it's an IPS panel with G-Sync and 165Hz.

Good that you are happy with it. However, you are now locked into the Nvidia ecosystem for a long time. Next time you upgrade your GPU, you have to choose Nvidia if you want to keep using G-Sync. And the time after that. And the time after that...

I guess Nvidia says "mission accomplished" every time someone buys a G-Sync monitor. Personally, I think it's very sad that we are forced to couple monitor and GPU by brand, especially when the monitors cost a kidney.
 
Good that you are happy with it. However, you are now locked into the Nvidia ecosystem for a long time. [...]

I agree, it sucks. It's something I thought about for a while before this purchase, but being stuck with Nvidia isn't all that bad. Based on past gens, there's a good chance Nvidia will again be the best option, and it's not like I'll be upgrading anytime soon anyway.
 