1440p G-Sync or 4K monitor

Feellia

I'm getting a free 980 hand-me-down =) and have been in the market for a new display. My current display is 1680x1050 and I'm sick of it... Should I go 1440p G-Sync or standard 4K?

Thoughts? Is G-Sync that good?

I will be upgrading the 980 whenever the next-gen cards hit... so the 980 is just a holdover. And who knows, I may go AMD, which would make G-Sync useless, hence the request for objective responses.

Cheers
 
I enjoyed G-Sync when I used it. It's almost a requirement for my gaming monitor purchases now.
 
G-Sync is that good. I don't even really care about hardware all that much anymore. G-Sync makes my games work, period. Now if only I could have a high-contrast G-Sync display.

If you don't know, there are 4K G-Sync monitors out there as well.
 
There is precisely one 4K G-Sync monitor currently available: the Acer XB280HK.

It is also a TN panel, which, if you are seriously considering a 4K screen, is IMHO a bad idea. TN is not actively bad; it's just that a lot of its benefits are negated by the 60Hz panel (120Hz 4K does not exist yet), and 28" is quite close to the upper limit for TN before brightness shift becomes almost completely unavoidable.

The only other 4K G-Sync monitor that I am aware of on the way is the XB328HK, which is a 32" 4K IPS G-Sync display (ETA unknown). The IPS glow on that monitor could go either way; if they use the same panel as the BL3201PT/PH it should be good, but I don't think they are using that panel.

If I were in the market for a monitor today, then depending on what you can afford, I would pick a lower-resolution monitor over a higher one any day, especially if you are going to be relying on a single 980. I got a Swift back when the 980 was released, and I was nowhere near impressed with the performance of either the 980 or 970s in SLI at 4K. Hell, I am not that impressed with the 980 Ti or Fury X at 4K for that matter, but I use a different benchmark.

If you are going for 4K, I highly recommend that you either go all the way (980 Ti SLI and a good 4K monitor/TV) or don't go down that road at all.
 
I have an Acer Predator XB270HU, which is 1440p, G-Sync, IPS, and 144Hz. It's not cheap, but I couldn't go back to non-G-Sync now. It really makes that much difference.

Seriously.

I wouldn't go 4K unless you're also getting 980 Tis in SLI. My friend has a 4K monitor with a single card, and I get a much better gameplay experience than he does.
 
I wouldn't go 4K unless you're also getting 980 Tis in SLI. My friend has a 4K monitor with a single card, and I get a much better gameplay experience than he does.

Is that so?
Because I'm running a 48" 4K screen with a single 980 Ti and enjoying my immersive gameplay.
The Witcher 3 at 4K, everything maxed out including HairWorks, runs at ~40 fps. That's good enough for me, since most games are not that demanding and easily hit 60 fps.
 
Is that so?
Because I'm running a 48" 4K screen with a single 980 Ti and enjoying my immersive gameplay.
The Witcher 3 at 4K, everything maxed out including HairWorks, runs at ~40 fps. That's good enough for me, since most games are not that demanding and easily hit 60 fps.

Immersion has nothing to do with performance, and the definition of immersion changes from one person to another.

For example, the fact that I control my character with a KB/M or gamepad rather than gestures pretty much negates the entire 'immersion' factor for me, not to mention that immersion is much less pronounced when I am not in first-person mode, which I tend to switch out of when given the opportunity.

But I digress; this is not relevant to this thread. If you can stand 40 fps, then great, since there are people who cannot.
 
Is that so?
Because I'm running a 48" 4K screen with a single 980 Ti and enjoying my immersive gameplay.
The Witcher 3 at 4K, everything maxed out including HairWorks, runs at ~40 fps. That's good enough for me, since most games are not that demanding and easily hit 60 fps.

I would have to agree. Games that don't rely on very fast reflexes are the ones where I don't worry about my frame rate; GTA V and other single-player games look good above 30 fps.

The answer to Feellia's original question depends more on what the monitor will be used for. For strictly gaming, I would go with G-Sync. If you need more desktop real estate for programming or what have you, I would get the 4K.
 
I wouldn't go 4K unless you're also getting 980 Tis in SLI. My friend has a 4K monitor with a single card, and I get a much better gameplay experience than he does.

One GTX 970 here with a 40" JU7500. Fallout 4 works great at max settings at 1080p. I wouldn't say you need SLI; it depends on what else you plan on doing. Some other dude has 980 Tis in 3-way SLI, lol; some people like overkill :D
 
TVs may handle scaling differently than computer monitors in general, but after getting a BL3201PT (whose scaling abilities reviews had praised) and subsequently being completely disappointed by it (I have no idea if this is the fault of the monitor, the game, the GPU, the driver, or something else at work), I would only get any *monitor* if you are going to game exclusively at its native resolution.

If you are planning to use 1080p as a fallback resolution due to lack of GPU power, for the sole purpose of gaming, I would stick with a native 1080p screen. At least 4K DSR on a 1080p screen won't look bad, whereas 1080p on a 4K monitor looks abysmal.

I have looked far and wide for monitors that can handle native and half-native resolutions equally well, with the closest I am aware of being the BL3201PT, but after using it I am convinced that such a computer monitor does not exist, and that one is better off taking a lower native resolution and using DSR than taking a higher native resolution and dropping down.
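
If you want to see the kind of softening I mean without buying anything, it's easy to reproduce in software. A quick sketch with Pillow (the filenames are made up; any 1920x1080 capture will do):

```python
from PIL import Image

img = Image.open("screenshot_1080p.png")  # any 1920x1080 capture (hypothetical file)

# Ideal 1080p -> 4K: integer scaling, each pixel becomes a clean 2x2 block.
sharp = img.resize((3840, 2160), Image.NEAREST)

# What monitor scalers roughly do: interpolate between neighboring pixels,
# which produces the soft, smeared look at non-native resolutions.
soft = img.resize((3840, 2160), Image.BILINEAR)

sharp.save("nearest_4k.png")
soft.save("bilinear_4k.png")
```

Compare the two outputs side by side and you'll see exactly the softening I'm talking about.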

Again, this ultimately depends on your taste. I like my images looking as sharp as possible, so I find upscaling softness completely unacceptable.
 
Is that so?
Because I'm running a 48" 4K screen with a single 980 Ti and enjoying my immersive gameplay.
The Witcher 3 at 4K, everything maxed out including HairWorks, runs at ~40 fps. That's good enough for me, since most games are not that demanding and easily hit 60 fps.


Trying to push an actual computer monitor at 4K/144Hz is very different from a TV.

I find 40 fps utterly unplayable, and in an FPS I wouldn't even bother unless I'm getting 120+ fps.

I'd love to see manufacturers make a 24" 4K monitor, as I cannot fit a 27" on my desk. Till then I'll stick with my measly 24" 144Hz HD res :/
 
Acer's releasing a 32" IPS G-Sync monitor soon. Wait for that if you're going for 4K.
 
GSINK is a gimmick.
I would recommend choosing a 2560x1440 monitor that supports strobing at high refresh rates. You could also consider getting a CRT. The PG278Q is a good LCD choice.
 
G-Sync and FreeSync aren't gimmicks. Only a complete idiot would have a problem with eliminating the possibility of tearing without any negative performance impact.

Unlike 21:9 displays and higher resolutions, the great thing about G-Sync and FreeSync is that they JUST WORK. Every game is improved by these technologies, whereas only a very small percentage of games actually support exotic aspect ratios and high resolutions well. You're guaranteed to get use out of G-Sync in every game you play.

People's tolerance for dog shit makes me laugh. You LIKE tearing? You LIKE extra input lag to get rid of tearing? What kind of psychopath wouldn't want to eliminate tearing without eating more input lag? What kind of moron doesn't think tearing is one of the most horrible visual defects you can experience on a computer?
 
Is it just me, or is there an irony overload when someone recommends the PG278Q, renowned for being the very first G-Sync monitor on the planet, while also thinking that G-Sync is a gimmick?

Also, rabidz has said, multiple times, that tearing doesn't bother him, so we all have our tastes (I find tearing extremely distracting).
 
Yeah, tearing doesn't bother him, CRT flicker doesn't bother him. Low resolutions don't bother him. He seems like a well-adjusted human being.

 
G-Sync without a doubt. It makes everything so much smoother. 4K looks great, but it comes at a huge performance cost; you'll end up sacrificing other graphical niceties to maintain a playable framerate. With a 1440p G-Sync display, I see no sacrifices being made. It's still a very nice bump in resolution for you, will make everything buttery smooth, and gives you a lot more wiggle room with other graphics settings (both by way of pushing far fewer pixels, and the fact that G-Sync makes sub-60 fps a lot more tolerable).
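
To put rough numbers on the "pushing far fewer pixels" part (plain arithmetic, nothing vendor-specific):

```python
# Pixels per frame at each resolution - the raw load the GPU has to shade.
p1080 = 1920 * 1080   # 2,073,600
p1440 = 2560 * 1440   # 3,686,400
p4k   = 3840 * 2160   # 8,294,400

print(round(p4k / p1440, 2))    # 2.25 -> 4K pushes 2.25x the pixels of 1440p
print(round(p1440 / p1080, 2))  # 1.78 -> 1440p is still a solid bump over 1080p
```

So a card that manages 60 fps at 1440p is looking at something in the high 20s at 4K, all else being equal.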
 
Yeah, tearing doesn't bother him, CRT flicker doesn't bother him. Low resolutions don't bother him. He seems like a well-adjusted human being.

Honestly, it just seems like you are very, very sensitive to everything except other people's opinions; you shit all over those... we get it, you loooooooooooove your G-Sync, we're all happy for you, now go get a room and stop being a dick to other members.

Not everyone finds G-Sync as revolutionary as you do. Some of us (me included) prefer screen real estate over slight (if any) increases in gaming prowess. I just got my 40" JU7500 and love the hell out of it. If the day comes when there is a 40" 4K 144Hz G-Sync monitor, hell yes, grab that, but for now I prefer 4K. However, that is an opinion; we all have them, and the OP asked for them, so stop being rude to other people posting theirs.

/me sits here waiting for an impressively articulated rebuttal from bigbluefe.
 
Is that so?
Because I'm running a 48" 4K screen with a single 980 Ti and enjoying my immersive gameplay.
The Witcher 3 at 4K, everything maxed out including HairWorks, runs at ~40 fps. That's good enough for me, since most games are not that demanding and easily hit 60 fps.

Please post a FRAPS log and video of your forged-in-darkness, magical rig.

I have a 980 Ti and a 6700K. With The Witcher 3 maxed out at 1440p, frame rates dip into the 20s in Novigrad.
 
Do they make a 1920x1080 24" G-Sync 144Hz monitor?

Yes, two actually: the Acer XB240HA and the AOC G2460PG, but both are TN panels.

IPS 1080p 144Hz panels do not exist, be it G-Sync, FreeSync, or sync-less. They only come in the 1440p variety.
 
Honestly, it just seems like you are very, very sensitive to everything except other people's opinions; you shit all over those... we get it, you loooooooooooove your G-Sync, we're all happy for you, now go get a room and stop being a dick to other members.

Not everyone finds G-Sync as revolutionary as you do. Some of us (me included) prefer screen real estate over slight (if any) increases in gaming prowess. I just got my 40" JU7500 and love the hell out of it. If the day comes when there is a 40" 4K 144Hz G-Sync monitor, hell yes, grab that, but for now I prefer 4K. However, that is an opinion; we all have them, and the OP asked for them, so stop being rude to other people posting theirs.

/me sits here waiting for an impressively articulated rebuttal from bigbluefe.

I don't care about any particular technology specifically. If you want a 40"+ monitor, there are going to be plenty of FreeSync screens that size. Why wouldn't you go that route?

Here's the thing: it's a clear technological win. There is no downside to variable refresh whether it be FreeSync or G-Sync.
 
I was very disappointed when I learned that G-Sync only takes away tearing (I've never had big problems with tearing, maybe because I use V-Sync). I always thought it made games much smoother at lower FPS, but I was wrong. Paying $200+ for a G-Sync module seems like a waste of money unless you are rich, with a very powerful rig where everything has to be high end. But I like that FreeSync is free, so casual gamers can get it too now.
 
A 60Hz 4K G-Sync monitor won't look any different from a non-G-Sync monitor in games that have proper triple buffering anyway.

But what about the NVIDIA Control Panel triple buffering option? Does it work with games that don't have that option in their own settings?
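
For context, here's my understanding of what "proper triple buffering" buys you, as a toy sketch (idealized numbers, constant render time, no real API):

```python
import math

REFRESH = 1000 / 60   # 60Hz vblank interval, ms
FRAME = 18.0          # render time just over one refresh, ms

# Double buffering + v-sync: the renderer stalls until the next vblank,
# so an 18 ms frame occupies two refresh intervals -> locked to 30 fps.
double_buffered = 1000 / (math.ceil(FRAME / REFRESH) * REFRESH)

# Triple buffering: a spare back buffer lets the GPU start the next frame
# immediately, so output tracks the true render rate (~55.6 fps).
triple_buffered = 1000 / FRAME

print(round(double_buffered, 1))  # 30.0
print(round(triple_buffered, 1))  # 55.6
```

That's the sense in which the claim above makes sense to me, at least for average frame rate; frame pacing is another matter.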
 
I was very disappointed when I learned that G-Sync only takes away tearing (I've never had big problems with tearing, maybe because I use V-Sync). I always thought it made games much smoother at lower FPS, but I was wrong. Paying $200+ for a G-Sync module seems like a waste of money unless you are rich, with a very powerful rig where everything has to be high end. But I like that FreeSync is free, so casual gamers can get it too now.

FreeSync and G-Sync are both still outside the realm of 'casual gamers'; the cheapest FreeSync monitors are still way above the budget of normal people (e.g. there are no FreeSync 1080p 60Hz panels).

The pockets required for FreeSync are less deep than for G-Sync, sure, but the overall cost is not that much different (e.g. the cost of a 1440p 144Hz rig differs by roughly $200 between AMD and NVIDIA, and AMD's easily totals over $1500).

FreeSync and G-Sync won't factor into the purchasing decisions of the average buyer until one of them breaks the $200 barrier.
 
I have seen the LG 29UM67P, a cheap 29" 21:9 FreeSync screen, for €300; add a good AMD GPU at €300 and that's only €600. That's not very much for a casual gamer. I bet prices in the USA are better than in the EU too, so you can get these things even cheaper there.
 
My definition of a 'casual gamer' is someone who spends as little as possible to achieve 'playability' (across all platforms, basically), so GPUs and monitors above $150 are generally out of reach for such people, especially compared to the appeal of consoles. $600 for a GPU and monitor is generally considered high end, and anything above $1000 is 'rich kids only' territory.

'Casual PC gamer' is a completely different league, but those are getting rare.
 
I was very disappointed when I learned that G-Sync only takes away tearing (I've never had big problems with tearing, maybe because I use V-Sync). I always thought it made games much smoother at lower FPS, but I was wrong. Paying $200+ for a G-Sync module seems like a waste of money unless you are rich, with a very powerful rig where everything has to be high end. But I like that FreeSync is free, so casual gamers can get it too now.

Sounds like you fell for all the marketing bullshit and nonsense the YouTube reviewers spout. Of course G-Sync won't make low frame rates feel smoother; G-Sync doesn't perform some frame-rate interpolation to make it appear like a higher frame rate. At the end of the day, low frame rates are still low frame rates, and they suck no matter what. I seriously don't understand how people can say that a G-Sync monitor will last you through many generations as games get more demanding, as if G-Sync will do some magic to make low fps feel like high fps again. It does NOT. I've been enjoying the lack of screen tearing, though.
 
The thing that G-Sync has done for me is let me turn off the fps counter and just play. My Acer Predator XB270HU is fantastic, and until we have 4K 144Hz screens that can be driven by a single GPU at 100 fps or so, I don't see myself changing from the Predator.
 
I was very disappointed when I learned that G-Sync only takes away tearing (I've never had big problems with tearing, maybe because I use V-Sync). I always thought it made games much smoother at lower FPS, but I was wrong. Paying $200+ for a G-Sync module seems like a waste of money unless you are rich, with a very powerful rig where everything has to be high end. But I like that FreeSync is free, so casual gamers can get it too now.

Say what? G-Sync absolutely makes lower fps smoother. There are limits, sure; 30 fps still sucks no matter what. But I notice a pretty big difference in the 45 to 60 fps range.

Back when The Witcher 3 first came out, my game would randomly launch in borderless mode, which at the time would shut off G-Sync. My settings have me running right around 50 fps. When I'd get in game, I'd notice instantly that borderless was on and G-Sync was off. There was an unquestionable difference in how smoothly the game ran.
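
To put numbers on that difference: at ~50 fps on a fixed 60Hz panel, frames can't land evenly on the scanout grid, so some frames stay up for one refresh and some for two. A rough, idealized sketch (constant 20 ms frame time assumed, renderer never blocked):

```python
import math

REFRESH = 1000 / 60   # fixed 60Hz scanout interval, ms
FRAME = 20.0          # constant render time, ms (50 fps)

# Each frame finishes at n*FRAME and is shown at the next vblank.
shown = [math.ceil(n * FRAME / REFRESH) * REFRESH for n in range(1, 12)]
holds = [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print(holds)          # mix of 16.7 and 33.3 ms holds -> visible judder
print([FRAME] * 10)   # with G-Sync every frame is held exactly 20 ms
```

Same average frame rate, but v-sync delivers it as an uneven 16.7/33.3 ms stutter while G-Sync delivers a steady 20 ms cadence. That's the smoothness difference I was seeing.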
 
G-Sync and FreeSync aren't gimmicks. Only a complete idiot would have a problem with eliminating the possibility of tearing without any negative performance impact.

Unlike 21:9 displays and higher resolutions, the great thing about G-Sync and FreeSync is that they JUST WORK. Every game is improved by these technologies, whereas only a very small percentage of games actually support exotic aspect ratios and high resolutions well. You're guaranteed to get use out of G-Sync in every game you play.

People's tolerance for dog shit makes me laugh. You LIKE tearing? You LIKE extra input lag to get rid of tearing? What kind of psychopath wouldn't want to eliminate tearing without eating more input lag? What kind of moron doesn't think tearing is one of the most horrible visual defects you can experience on a computer?

SINKs cannot be used with backlight strobing or any rolling-scan display. This means that there is a VERY big problem with SINKs. Tears are mildly annoying at worst; sample-and-hold monitors are near-unplayable. Other than shoddy Ubishit games, I have never had a problem with any resolution on my CRT, including oddballs like 3200x2400, 2560x1920, 2304x1728, 1920x1440, etc. Nor have I had a problem with any aspect ratio I've tried (5:4, 4:3, 16:10, 16:9, 21:9).
 
Say what? G-Sync absolutely makes lower fps smoother. There are limits, sure; 30 fps still sucks no matter what. But I notice a pretty big difference in the 45 to 60 fps range.

Back when The Witcher 3 first came out, my game would randomly launch in borderless mode, which at the time would shut off G-Sync. My settings have me running right around 50 fps. When I'd get in game, I'd notice instantly that borderless was on and G-Sync was off. There was an unquestionable difference in how smoothly the game ran.

Don't play at 60 fps or below; tears are far less noticeable above that. I own a CRT, so I can easily reduce the resolution without scaling. If you have an LCD, try messing with the texture quality first.
 
SINKs cannot be used with backlight strobing or any rolling-scan display. This means that there is a VERY big problem with SINKs. Tears are mildly annoying at worst; sample-and-hold monitors are near-unplayable. Other than shoddy Ubishit games, I have never had a problem with any resolution on my CRT, including oddballs like 3200x2400, 2560x1920, 2304x1728, 1920x1440, etc. Nor have I had a problem with any aspect ratio I've tried (5:4, 4:3, 16:10, 16:9, 21:9).

Then you have different tastes, but a lot of us would appreciate it if you refrained from stating that your tastes should be everyone else's.

I can't stand tearing on any level, and I haven't used a CRT for gaming since the Pentium II days. I started using LCDs as soon as they became available, and I don't find LCD blur annoying.
 
SINKs cannot be used with backlight strobing or any rolling-scan display. This means that there is a VERY big problem with SINKs. Tears are mildly annoying at worst; sample-and-hold monitors are near-unplayable. Other than shoddy Ubishit games, I have never had a problem with any resolution on my CRT, including oddballs like 3200x2400, 2560x1920, 2304x1728, 1920x1440, etc. Nor have I had a problem with any aspect ratio I've tried (5:4, 4:3, 16:10, 16:9, 21:9).

Actually, you can have variable refresh with strobed backlights or black frame insertion. There is no technical reason that someone couldn't make a monitor where the backlight strobing is synced with the frame rate. It'd be pretty simple, really. Just because something hasn't been done yet doesn't mean it won't be done. Have you heard of RetroArch? I can play a game in RetroArch today using G-Sync with software black frame insertion and get no v-sync input lag, no tearing, and low motion blur. Yeah. Welcome to 2015.
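
The software side of this is almost trivial. Conceptually it's just this (hypothetical render/present stand-ins, not RetroArch's actual code):

```python
import time

INTERVAL = 1.0 / 60          # target frame interval, seconds

def render_game_frame():     # hypothetical stand-in for the emulator core
    return "frame"

def present(image):          # hypothetical stand-in for the buffer swap
    pass

BLACK = "black frame"

for _ in range(600):         # ten seconds' worth of frames
    start = time.perf_counter()
    present(render_game_frame())   # real frame, shown for half the interval
    time.sleep(INTERVAL / 2)
    present(BLACK)                 # black frame halves the hold time,
                                   # cutting sample-and-hold motion blur
    leftover = INTERVAL - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```

With variable refresh driving the actual scanout, the panel updates when each of those presents lands, so you get the blur reduction without being chained to a fixed strobe clock.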

The main reason that many of these things haven't been done is that hardware manufacturers like to trickle out features as slowly as possible so that they can sell 10 generations of hardware instead of 1. Strobing/flickering is still arguably an awful tradeoff anyway (the brightness loss and flicker are almost worse than the motion blur of a fast display).

Backlights are kind of a backward idea anyway. The future of displays isn't in backlights at all but in self-lit technologies such as OLED or pure LED displays. It just goes to show how small your thinking is, really, that you're obsessed with crappy backlights.

No company ever even made a high-resolution 16:9 CRT (by far the most commonly supported aspect ratio in games). Enjoy your laughably tiny screen and letterboxing.
 
Actually, you can have variable refresh with strobed backlights or black frame insertion. There is no technical reason that someone couldn't make a monitor where the backlight strobing is synced with the frame rate. It'd be pretty simple, really. Just because something hasn't been done yet doesn't mean it won't be done. Have you heard of RetroArch? I can play a game in RetroArch today using G-Sync with software black frame insertion and get no v-sync input lag, no tearing, and low motion blur. Yeah. Welcome to 2015.

The main reason that many of these things haven't been done is that hardware manufacturers like to trickle out features as slowly as possible so that they can sell 10 generations of hardware instead of 1. Strobing/flickering is still arguably an awful tradeoff anyway (the brightness loss and flicker are almost worse than the motion blur of a fast display).

Backlights are kind of a backward idea anyway. The future of displays isn't in backlights at all but in self-lit technologies such as OLED or pure LED displays. It just goes to show how small your thinking is, really, that you're obsessed with crappy backlights.

No company ever even made a high-resolution 16:9 CRT (by far the most commonly supported aspect ratio in games). Enjoy your laughably tiny screen and letterboxing.

A strobing backlight would have visible changes in brightness and flicker at different refresh rates/frame rates. It would look messed up.

OLEDs also can't be run with black frame insertion and GSINK, for the same reason strobed backlights would be problematic with SINKs.

This thread isn't even about CRTs, but the InterView is a 28" tube that does 1920x1080@85Hz and 2048x1152@80Hz. I've never had an aspect ratio problem in games other than Ubishit console ports.
 
FreeSync and G-Sync are both still outside the realm of 'casual gamers'; the cheapest FreeSync monitors are still way above the budget of normal people (e.g. there are no FreeSync 1080p 60Hz panels).

The pockets required for FreeSync are less deep than for G-Sync, sure, but the overall cost is not that much different (e.g. the cost of a 1440p 144Hz rig differs by roughly $200 between AMD and NVIDIA, and AMD's easily totals over $1500).

FreeSync and G-Sync won't factor into the purchasing decisions of the average buyer until one of them breaks the $200 barrier.

SAMSUNG S24E370DL Glossy White PLS 23.6" 4ms Widescreen LED Backlight LCD Monitor; Free-Sync Compatible w/ Wireless Phone Charging Capability

$250
 
A strobing backlight would have visible changes in brightness and flicker at different refresh rates/frame rates. It would look messed up.

OLEDs also can't be run with black frame insertion and GSINK, for the same reason strobed backlights would be problematic with SINKs.

This thread isn't even about CRTs, but the InterView is a 28" tube that does 1920x1080@85Hz and 2048x1152@80Hz. I've never had an aspect ratio problem in games other than Ubishit console ports.

You don't even want FreeSync and G-Sync for variable refresh as much as you want them for eliminating tearing without introducing any additional input lag.

There's no reason you couldn't just cap the framerate, strobe at that framerate, and get the best of everything: low motion blur, low input lag, no tearing.
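
As a toy sketch of that loop (hypothetical strobe-driver stand-in; I'm not claiming any monitor exposes this today):

```python
import time

CAP_FPS = 85              # capped frame rate = strobe rate
PERIOD = 1.0 / CAP_FPS
PULSE = PERIOD * 0.2      # backlight lit for ~20% of each period

def backlight(on):        # hypothetical stand-in for a strobe driver
    pass

for _ in range(CAP_FPS * 5):   # five seconds of frames
    start = time.perf_counter()
    # ... render and present the frame here ...
    backlight(True)            # flash the finished frame once
    time.sleep(PULSE)
    backlight(False)           # hold dark for the rest of the period
    leftover = PERIOD - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```

Cap, flash, wait, repeat: low persistence from the short pulse, no tearing because you only flash completed frames, and no v-sync queue because the frame rate never exceeds the cap.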

2048x1152 is a laughably low resolution in late 2015.
 