Help Me in Leveling Up from 1080p

osrk

[H]ard|Gawd
Joined
Jan 10, 2003
Messages
2,032
Time to update my monitor setup (my least favorite thing, due to the panel lottery). I'm currently rocking two 24" Dell UltraSharps (1080p). They're from 2009, so they've had a good life. I mainly do some office work and movie watching, and I'd like to game more, but I'm not hardcore. I'd like it to have FreeSync if possible; G-Sync seems too expensive for the premium it carries. Getting a graphics card (or cards) to drive it isn't really an issue.

My questions are:
1. Should I keep the dual screen setup?
2. If so/if not what is a good size?
3. 2k or 4k?


Any recommendations? There are so many monitors to choose from. I've already spent hours looking.
 
All I will say, once you get used to multi-monitor, it's hard to go back to a single screen, no matter what resolution it is.

I just went from two old 24" 1920x1200 UltraSharps (one was '09, the other was '04, I think) to two new P2715Qs (caught them on sale). By the time I got the scaling where I wanted it, I was glad I kept dual monitors, even though these are 4K panels. Stuff isn't really any smaller, just sharper. For my old eyes, the sharpness makes a noticeable difference on text.

I do game some, but I have no problem turning down options, or turning the resolution down to 1080p, if I need to in order to keep performance up. The fact that I have 4K monitors hasn't seriously changed the way I game at all, even though I don't have what most folks would recommend for gaming at 4K (I have a GTX 980).

Not really trying to push these particular monitors or anything, just providing some insight on single vs. dual and 2K/4K. I was holding out for 4K HDR 120Hz+ VRR for the longest time... first, I'm glad I stopped holding out, and second, I'm glad I didn't spend a lot of money settling for something I don't really want. If all of that comes about this year or next, I won't feel bad about my current purchase as I upgrade to better monitors sooner than I usually do.
 
Last edited:
My questions are:
1. Should I keep the dual screen setup?


Since you do office-type work, a dual-screen setup is pretty handy. However, have you checked out 21:9 monitors? Many offer the same real estate and resolution you'd get from two separate 1080p monitors; it's just one big screen.


2. If so/if not what is a good size?

If you want to leave 1080p behind, 1440p at 27" is very common, although there are 25" 1440p options out there. If you got two 25s, you'd have monitors about the same physical size as your 24s, but with much better resolution and clarity. With 4K, anywhere between 32" and 43" is the sweet spot, although at the upper end you're getting into TVs, if that's your thing. If you want to consider a 21:9 monitor, most generally fall somewhere between 29" and 35" in size.
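If it helps to put rough numbers on "better clarity", here's a quick pixel-density sketch (plain Python, using the sizes mentioned above; treat it as back-of-the-envelope math, not a recommendation):

[CODE]
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Sizes/resolutions mentioned above.
candidates = [
    ("24in 1080p (current)", 1920, 1080, 24),
    ("25in 1440p",           2560, 1440, 25),
    ("27in 1440p",           2560, 1440, 27),
    ("32in 4K UHD",          3840, 2160, 32),
    ("43in 4K UHD",          3840, 2160, 43),
]

for name, w, h, diag in candidates:
    print(f"{name:22s} {ppi(w, h, diag):6.1f} ppi")
[/CODE]

The old 24" 1080p panels sit around 92 ppi; the 25"/27" 1440p and 32" 4K options land roughly in the 109-138 ppi range, while 43" 4K drops back to about 103 ppi.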


3. 2k or 4k?

Some clarification: by "2K" are you referring to 1440p or 1080p? I assume you mean 1440p (which is more accurately described as "2.5K").
 
Time to update my monitor setup (my least favorite thing, due to the panel lottery). I'm currently rocking two 24" Dell UltraSharps (1080p). They're from 2009, so they've had a good life. I mainly do some office work and movie watching, and I'd like to game more, but I'm not hardcore. I'd like it to have FreeSync if possible; G-Sync seems too expensive for the premium it carries. Getting a graphics card (or cards) to drive it isn't really an issue.

My questions are:
1. Should I keep the dual screen setup?
2. If so/if not what is a good size?
3. 2k or 4k?


Any recommendations? There are so many monitors to choose from. I've already spent hours looking.

There are so many UHD monitors out there, and yes, it's a nightmare. Don't choose Acer or BenQ; they use the same panel. You could try an LG or Samsung, or a UHD TV or monitor from LG at 42 inches, but if you think that's too big there is always a 32 inch. And yes, G-Sync is not worth the price on a UHD monitor, IMO.

Best to stick with FreeSync.
 
Last edited:
HD = 720
FHD = 1080
QHD = 1440
UHD = 2160

'2K' is slang for 1440p, even though 1080p is closer to '2k' than 2160p is to '4k'. It's a terrible term that everyone needs to unlearn.
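For anyone keeping score, here's a trivial sketch of the horizontal pixel counts behind those labels (Python; the DCI widths are included only for reference):

[CODE]
# Horizontal pixel counts behind the common marketing labels.
modes = {
    "1080p / FHD": 1920,
    "1440p / QHD": 2560,
    "2160p / UHD": 3840,
    "DCI 2K":      2048,
    "DCI 4K":      4096,
}

for name, width in modes.items():
    print(f"{name:12s} {width:4d} px wide  (~{width / 1000:.2f}K)")
[/CODE]

1920 is only 80 pixels shy of a round 2000, while 3840 is 160 shy of 4000, which is the point above; and 2560 is nowhere near 2000 at all.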

It's just an old format name, like 1080p; 2160p is UHD but they call it 4K. It's strange, but I am not getting into that here.
 
All I will say, once you get used to multi-monitor, it's hard to go back to a single screen, no matter what resolution it is.

I just went from two old 24" 1920x1200 UltraSharps (one was '09, the other was '04, I think) to two new P2715Qs (caught them on sale). By the time I got the scaling where I wanted it, I was glad I kept dual monitors, even though these are 4K panels. Stuff isn't really any smaller, just sharper. For my old eyes, the sharpness makes a noticeable difference on text.

I do game some, but I have no problem turning down options, or turning the resolution down to 1080p, if I need to in order to keep performance up. The fact that I have 4K monitors hasn't seriously changed the way I game at all, even though I don't have what most folks would recommend for gaming at 4K (I have a GTX 980).

Not really trying to push these particular monitors or anything, just providing some insight on single vs. dual and 2K/4K. I was holding out for 4K HDR 120Hz+ VRR for the longest time... first, I'm glad I stopped holding out, and second, I'm glad I didn't spend a lot of money settling for something I don't really want. If all of that comes about this year or next, I won't feel bad about my current purchase as I upgrade to better monitors sooner than I usually do.
This is what I wanted to know. Two 4K monitors at 27" sound better than two 2K monitors at 27".
 
HD = 720
FHD = 1080
QHD = 1440
UHD = 2160

'2K' is slang for 1440p, even though 1080p is closer to '2k' than 2160p is to '4k'. It's a terrible term that everyone needs to unlearn.
I only say what Fox News and Yum Brands advertising tells me to say.
 
This is what I wanted to know. Two 4K monitors at 27" sound better than two 2K monitors at 27".

You mean UHD panels on the P2715Q. If you're going to run two monitors, you cannot play UHD/4K with a single GTX 980 at max settings on most modern games. You will need two cards to average around 60+ fps on today's titles.
 
You mean UHD panels on the P2715Q. If you're going to run two monitors, you cannot play UHD/4K with a single GTX 980 at max settings on most modern games. You will need two cards to average around 60+ fps on today's titles.
Figure out the panels first and then the video card(s). 1080 Tis are getting cheaper (no FreeSync, but that's not a huge deal for a once-in-a-while gamer).
 
Figure out the panels first and then the video card(s). 1080 Tis are getting cheaper (no FreeSync, but that's not a huge deal for a once-in-a-while gamer).

A single GTX 1080 Ti will do it just fine, but the prices right now are not worth it, because the new GTX 2080 with 16GB of HBM2 is supposed to come out this year. We will know on the 29th of March 2018 whether the online rumours are true.
 
Last edited:
You mean UHD panels on the P2715Q. If you're going to run two monitors, you cannot play UHD/4K with a single GTX 980 at max settings on most modern games. You will need two cards to average around 60+ fps on today's titles.

You're right, I cannot. I'm the one running that with a 980 right now, nor do I really mind it. But I said as much. Monitors I upgrade... once every 10 years or so. Video cards, every 3-4 years. I don't mind having a monitor I can't quite drive fully just yet. Most of what I do will play at 4K just fine on a toaster (Factorio, some older MMOs, etc.) once I turn off the FSAA (you don't really need it at 4K), and what doesn't, I run at 1080 and hardly notice the difference. If I were getting a gaming monitor, I would have gotten something much different.

The 4K, for me, is mostly for sharpness, not gaming. It makes smaller text much more legible. It makes pictures more clear. I don't use it for additional real estate. Maybe Apple has spoiled me with high PPI displays.

I have no idea what the OP will get with respect to video card though.
 
Also, I will say, going from 24" to 27" wasn't a huge jump; the panel isn't all that much bigger. Going up to 32"+ though, for me, where my monitors sit on my desk (a little over 2' to 2.5' away), was enough that I couldn't see the entirety of a single monitor without moving my head. I'd have to move my monitor back another several inches, which I can't really do with my current setup.

I tinkered with going with a 42" 4K panel and didn't like it; by the time I got the monitor back far enough that I could see the entire panel, and the text big enough that I could read it, it wasn't as good as smaller screens up closer. I really, really wanted to go to a 55" OLED (the screen is gorgeous), but I just couldn't make it fit unless I could get back 5+ feet, and that won't work in my current home office.

I did not try an ultrawide, because of how I use dual monitors: usually work/game running on the left, web browser/Netflix/music player on the right. The bezel in the middle doesn't bother me, and doing more or less the same workflow on a UW would mean running games in windowed mode and dealing with the title bar and window borders.
 
Last edited:
You're right, I cannot. I'm the one running that with a 980 right now, nor do I really mind it. But I said as much. Monitors I upgrade... once every 10 years or so. Video cards, every 3-4 years. I don't mind having a monitor I can't quite drive fully just yet. Most of what I do will play at 4K just fine on a toaster (Factorio, some older MMOs, etc.) once I turn off the FSAA (you don't really need it at 4K), and what doesn't, I run at 1080 and hardly notice the difference. If I were getting a gaming monitor, I would have gotten something much different.

The 4K, for me, is mostly for sharpness, not gaming. It makes smaller text much more legible. It makes pictures more clear. I don't use it for additional real estate. Maybe Apple has spoiled me with high PPI displays.

I have no idea what the OP will get with respect to video card though.

Everyone has their own way of gaming; there are no rules, and there are still lots of gamers on 1080p who don't make the jump to UHD monitors because it's not worth it at this time. As for me, I came from 1080p to UHD and made a big mistake buying a G-Sync monitor. What a big waste of money; it is all hype. I have had it repaired two times in one year, and I'll be lucky if it lasts another year, let alone 10, so now I have asked for a refund on my Acer XB321HK. Also, Acer and BenQ monitors share the same panel problem.

My old Samsung 27 inch 1080p lasted me six years. I am using it now while I wait for the refund, which I had to fight to get.
 
Last edited:
I have to admit, for gaming, the jump from 1080 to 4K was ... hardly noticeable to me. I did notice a big difference in reading text and images/pictures.

I don't know about high refresh rate; a lot of people say that, for gaming, it's very impactful. If I were buying a monitor for gaming, this is where I would consider spending some money.

I can say, having recently purchased an OLED TV, that HDR makes a big difference when it's used. And OLED in general is very, very nice, but again, no smaller OLED panels are out there (except a new 21", I think).

If I were buying more for gaming right now, nothing out there ticks all my boxes. Right now it would probably be a tossup between an LG 27UK650 (IPS, 4K, HDR, 60Hz, FreeSync) or a Samsung C27HG70 (VA, 1440p, HDR, 144Hz, FreeSync). You pay through the nose for G-Sync; even though I currently have an Nvidia card, it is a tough sell to get me to shell out another $150-$300 just for that. I only picked up the P2715Q Dells because they were on fire sale and my older 24" was on its last legs.
 
Last edited:
1440p+high refresh rate >>>>> 4K+60Hz
It's not just for gaming, everything is smoother.
But of course 4K+144Hz, which is rumored to be coming very soon, would be perfect, as long as you have deep pockets.

Think twice before getting high refresh rate.
It's so gorgeous that once you have it you can't go back. And that pretty much limits your future monitor choices to the high end.
 
I have to admit, for gaming, the jump from 1080 to 4K was ... hardly noticeable to me. I did notice a big difference in reading text and images/pictures.

I don't know about high refresh rate; a lot of people say that, for gaming, it's very impactful. If I were buying a monitor for gaming, this is where I would consider spending some money.

I can say, having recently purchased an OLED TV, that HDR makes a big difference when it's used. And OLED in general is very, very nice, but again, no smaller OLED panels are out there (except a new 21", I think).

If I were buying more for gaming right now, nothing out there ticks all my boxes. Right now it would probably be a tossup between an LG 27UK650 (IPS, 4K, HDR, 60Hz, FreeSync) or a Samsung C27HG70 (VA, 1440p, HDR, 144Hz, FreeSync). You pay through the nose for G-Sync; even though I currently have an Nvidia card, it is a tough sell to get me to shell out another $150-$300 just for that. I only picked up the P2715Q Dells because they were on fire sale and my older 24" was on its last legs.

I did have a Gigabyte GTX 1080 Ti card, but that only lasted me 7 months; it had a fault with the RGB and kept turning off my screen, so I had to return it to Amazon and got a refund. Now it's my crappy G-Sync monitor from Acer; I am hoping nothing else goes wrong. I built my old computer over six years ago and have not had one fault; come to think of it, this is the first time I have gotten two faulty items. Anyway, at this time I am using my motherboard's onboard graphics to play some old games, but that's OK. I will wait for the new UHD monitors without the hyped-up G-Sync, and for the new GTX 2080 cards to come out this year, I hope.

The OLED looks nice to me, but again its resolution is misleading: at only 3840 x 2160 it is UHD, not 4K. Now if it had 4096 x 2304, then yes, it would tick my box and I would buy one. But all this hype in marketing and media over 4K is just 'BS' marketing, so when gamers say "I got a 4K monitor" it's not true; it is only the UHD format, in fact 256 pixels narrower than 4K. Why on earth did the manufacturers go in this direction? Idiots. I never know; maybe it was just for the cost? Only 256 more pixels and it would be great and worth the money. If you look at the GTX 1080 Ti spec, it lists nearly 8K max resolution, so 4K wouldn't be a problem, but then the manufacturers went in another wrong direction again and put DP1.2 on all the monitors, when the GTX card had DP1.4, built for a future of 4K at 4096 x 2304 at 60Hz.

I don't listen to all the hype anymore; it's a headache over FPS, G-Sync, 4K/UHD. I learned a lot and will not be conned again.

BTW, nice talking with you. Have a nice day :)
 
Last edited:
The OLED looks nice to me, but again its resolution is misleading: at only 3840 x 2160 it is UHD, not 4K. Now if it had 4096 x 2304, then yes, it would tick my box and I would buy one. But all this hype in marketing and media over 4K is just 'BS' marketing, so when gamers say "I got a 4K monitor" it's not true; it is only the UHD format, in fact 256 pixels narrower than 4K. Why on earth did the manufacturers go in this direction? Idiots. I never know; maybe it was just for the cost? Only 256 more pixels and it would be great and worth the money.

Everyone understands that "4K" is parlance for 3840 X 2160, even if 4096 x 2304 is the "real" 4K resolution. No one uses that resolution except for the movie studios. I don't see why you're so hung up on it.
 
Can you provide a link where I can read about 1440p being referred to as 2K by any monitor manufacturer?

though no freesync but not a huge deal for a once in a while gamer
If you do not intend to buy a Radeon, then FreeSync is a bad choice.
Get a G-Sync monitor and I can guarantee you that you won't remain a 'once in a while gamer' for very long.

My recommendation is a 1440p G-Sync IPS monitor.
4K is harder to drive GPU-wise and you are limited to 60Hz for now (which isn't such a big issue anyway) - which is totally justified if you want to have super nice sharp fonts while browsing the web :cool:

I always wonder why so many people choose to save an insignificant amount of money, with which they cannot buy anything significant anyway, only to then spend long hours watching stuttering :confused:

It is like anything related to audio-video quality. Life is short; why spend it using low-quality, crappy hardware when these things are not that expensive anyway? The G-Sync premium, compared to what life costs, is nothing.
 
Everyone understands that "4K" is parlance for 3840 X 2160, even if 4096 x 2304 is the "real" 4K resolution. No one uses that resolution except for the movie studios. I don't see why you're so hung up on it.

Right. Please stop getting distracted with your pointless bitching in every single thread that even mentions 4K resolution. The pixel difference between Cinema 4K and UHD "4K" is about 7%. You wouldn't notice the difference unless someone told you it was there.

Anyway, you can cool your jets about this. You have HDMI to blame for being lazy and reusing the DVI spec, which didn't have the bandwidth to display 2048x1080 because it was built to handle 1600x1200. So they compromised on 16:9 2K and called it 1080p to keep the two distinct.

Cinema 2K got dropped from consumer use for two reasons: it's pretty wide at a 1.89:1 ratio, and we didn't have enough bandwidth in the standards of the day. Most of the time when you display a 2.39:1 aspect-ratio film, it ends up having lower resolution than 1080p anyway. So fighting over these minuscule differences between differently named standards is a waste of effort, BECAUSE YOU ALMOST NEVER SEE FULL Cinema 2K on a screen!

I think it's stupid calling UHD 4K, but I didn't make the naming convention. Just roll with the branding, because no matter how much you bitch, the industry has already made up its mind.
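To put numbers on that "about 7%" remark, here's a quick sketch (Python; the 4096x2304 figure is the "true 4K" number quoted earlier in the thread):

[CODE]
# How big is the gap between "Cinema" 4K and consumer UHD, really?
uhd = (3840, 2160)   # what consumer "4K" actually is
dci = (4096, 2160)   # DCI 4K cinema container
alt = (4096, 2304)   # the 16:9 "true 4K" figure quoted earlier in the thread

def extra_pixels(a, b):
    return (a[0] * a[1]) / (b[0] * b[1]) - 1

print(f"DCI 4K vs UHD, extra width : {dci[0] / uhd[0] - 1:.1%}")        # ~6.7%
print(f"DCI 4K vs UHD, extra pixels: {extra_pixels(dci, uhd):.1%}")     # ~6.7%
print(f"4096x2304 vs UHD, extra pixels: {extra_pixels(alt, uhd):.1%}")  # ~13.8%
[/CODE]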
 
Last edited:
Anyway, you can cool your jets about this. You have HDMI to blame for being lazy and reusing the DVI spec, which didn't have the bandwidth to display 2048x1080 because it was built to handle 1600x1200. So they compromised on 16:9 2K and called it 1080p to keep the two distinct.
HDMI can handle 2048x1080 just fine

There is the Dell SP2309, with HDMI and a resolution of 2048 x 1152.
 
HDMI can handle 2048x1080 just fine

There is the Dell SP2309, with HDMI and a resolution of 2048 x 1152.

Once they invented CVT-R (reduced blanking), you could do 2048x1152@60Hz, or 1920x1080@75Hz. They found roughly 20% more usable bandwidth by shrinking the blanking intervals in the signal. Once video cards produced the signal timing AND new monitors accepted it, you could overclock (75Hz from 1080p) or bump the resolution.

https://en.wikipedia.org/wiki/Coordinated_Video_Timings#Reduced_blanking


When DVI launched, the practical ceiling was 1600x1200@60Hz, because the timings were kept loose for easy compatibility with CRTs that accepted DVI input (which almost never happened), and with the first-generation VGA-to-digital processors used by most early flat panels, none of which used CVT-R.
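A rough sketch of the arithmetic (Python; the total-timing numbers are approximate VESA/CEA figures, and 165 MHz is the single-link TMDS pixel-clock limit):

[CODE]
# Rough look at why reduced blanking mattered on single-link DVI / early HDMI.
# Total timings are approximate VESA/CEA figures; single-link TMDS tops out
# around a 165 MHz pixel clock.
SINGLE_LINK_LIMIT_MHZ = 165.0

modes = [
    # name,                             active (w, h),  total (w, h),  Hz
    ("1600x1200@60 (classic blanking)", (1600, 1200), (2160, 1250), 60),
    ("1920x1080@60 (CEA timing)",       (1920, 1080), (2200, 1125), 60),
    ("2048x1152@60 (reduced blanking)", (2048, 1152), (2208, 1185), 60),
]

for name, (aw, ah), (tw, th), hz in modes:
    pclk_mhz = tw * th * hz / 1e6        # pixel clock including blanking
    useful = (aw * ah) / (tw * th)       # share of the clock spent on visible pixels
    verdict = "fits" if pclk_mhz <= SINGLE_LINK_LIMIT_MHZ else "too fast"
    print(f"{name:34s} {pclk_mhz:6.1f} MHz  ({useful:.0%} useful) -> {verdict}")
[/CODE]

The reduced-blanking mode spends about 90% of the clock on visible pixels versus roughly 71% for the classic 1600x1200 timing, which is where the extra headroom comes from.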
 
Last edited:
Everyone understands that "4K" is parlance for 3840 X 2160, even if 4096 x 2304 is the "real" 4K resolution. No one uses that resolution except for the movie studios. I don't see why you're so hung up on it.

It's not just for movie studios anymore; many will use true 4K, 4096 x 2304, for photographic work, design work, and video editing. There's the LG UltraFine 4K Display at 21.5 inches with a 4096x2304 resolution, and also the LG UltraFine 5K Display at 27 inches; Apple is bringing out their 5K monitor for the iMac Pro at 27 inches this year. All these monitors are just too small for me.

I am not as hung up on it as you said; if you don't believe me, Google it. Yes, I can't change the manufacturers' idiotic ways; why do they call 3840 x 2160 "4K UHD"?

Lg 22MD4KA-B
http://www.lg.com/us/monitors/lg-22MD4KA-B-4k-uhd-led-monitor

Lg 27MD5KA-B
http://www.lg.com/us/monitors/lg-27MD5KA-B-5k-uhd-led-monitor
 
Last edited:
True 4K did not become the standard, and thus no one is using it. No one even cares.

"4K UHD" is the standard; everyone has it, everyone is using it, and all the so-called "4K" content is mastered exclusively for it.

Please stop your nonsensical rant ASAP!
 
True 4K did not become the standard, and thus no one is using it. No one even cares.

"4K UHD" is the standard; everyone has it, everyone is using it, and all the so-called "4K" content is mastered exclusively for it.

Please stop your nonsensical rant ASAP!

I am not going to make an argument of it; there is no point. You have your opinion and I have mine.

It was said (but I did not write it):
LG UH6100, UH6400 and UF6800 series TVs do indeed offer 2160 horizontal scan lines with 3840 pixels in each line but unlike regular real UHD displays, these models pull a sort of trick which reduces image sharpness by substituting red, green and blue subpixels in some of their full pixels for white subpixels. This mechanism is called RGBW and it’s distinctly inferior to the RGB found in real 4K TVs. In the classical sense of the spec. Instead, what they do offer is a sort of pseudo-4K which really consists of 2.8K native full color resolution

Here is the link to read more, from 2016:
http://4k.com/news/lthree-of-lgs-4k...eal-for-consumers-uh6400-uh6100-uf6800-16649/

I found this topic on this forum, from 2017:
https://forums.redflagdeals.com/folks-beware-fake-4k-lg-tvs-2066358/

Have a nice day:D
 
Last edited:
1. Should I keep the dual screen setup?
That depends on how you use your monitors. I've had dual 27" and even a triple 27" setup; I grew tired of not being able to track that much space off to the sides.

2. If so/if not what is a good size?
Again, it depends on your usage and viewing distance, but I switched my triple 27" setup for a single 40" KU6290 4K TV and haven't looked back. I'm delighted with it; it's wall-mounted about 3 feet from my face, everything is right in front of me, and there's tons of clarity and desktop space.

3. 2k or 4k?
Guess what, it depends... on the size and the power of your hardware. Anything bigger than 30", go 4K. Anything smaller, 1440p is perfectly fine. Those people who complain about, say, 1440p at 27" have absolutely perfect vision that most people don't have. 4K at 40" is a "low" 110 ppi and I couldn't discern individual pixels even if I wanted to.
 
It was said (but I did not write it):
LG UH6100, UH6400 and UF6800 series TVs do indeed offer 2160 horizontal scan lines with 3840 pixels in each line but unlike regular real UHD displays, these models pull a sort of trick which reduces image sharpness by substituting red, green and blue subpixels in some of their full pixels for white subpixels. This mechanism is called RGBW and it’s distinctly inferior to the RGB found in real 4K TVs. In the classical sense of the spec. Instead, what they do offer is a sort of pseudo-4K which really consists of 2.8K native full color resolution
This is a valid reason to be angry and rant, unlike the 4096-pixel-wide resolution, which would be totally pointless for TV anyway.

But keep in mind that for typical TV content this doesn't matter much at all, because most sources use 4:2:2 or even 4:2:0 chroma subsampling, and in that light the RGBW subpixel structure isn't really harming picture quality much, if at all.
I would still prefer an OLED with RGBW over a full-RGB LCD for my next TV, simply because of black levels and motion performance.
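To put rough numbers on both points, a small sketch (Python; the last line just restates the "2.8K" figure from the quoted article, it's not an exact model of LG's subpixel layout):

[CODE]
# Rough numbers behind "4:2:0 sources don't really suffer on RGBW".
W, H = 3840, 2160
luma_samples = W * H

# Chroma samples per plane for the common subsampling schemes.
subsampling = {
    "4:4:4": W * H,
    "4:2:2": (W // 2) * H,
    "4:2:0": (W // 2) * (H // 2),
}

for name, chroma in subsampling.items():
    print(f"{name}: chroma carries {chroma / luma_samples:.0%} of the luma resolution")

# The "2.8K" figure from the quoted article: roughly 3 of every 4 subpixel
# positions on an RGBW panel carry R/G/B, so full-color width works out to about:
print(f"approx. full-RGB width on RGBW: {W * 3 // 4} px")   # 2880, i.e. "2.8K"
[/CODE]

With 4:2:0 sources carrying only a quarter of the chroma resolution in the first place, the panel's reduced full-color resolution is rarely the limiting factor for video.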
 
Those people who complain about, say, 1440p at 27" have absolutely perfect vision that most people don't have. 4K at 40" is a "low" 110 ppi and I couldn't discern individual pixels even if I wanted to.
I can see individual pixels from a normal viewing distance (about 0.6m) on my 27" 4K UHD monitor. 100% scaling on the desktop is just fine. If I sit closer I can see the screen-door effect that is normally obvious on any other LCD.
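Whether you can pick out pixels really comes down to pixel pitch versus viewing distance. A rough sketch, using the common ~1 arcminute rule of thumb for 20/20 vision (the ~3 ft / 0.9 m figure for the TV comes from the earlier post; the 0.6m is from the post above):

[CODE]
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi   # 20/20 vision resolves roughly 1 arcminute

def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

def pixel_arcmin(w_px, h_px, diag_in, distance_m):
    pitch_m = 25.4e-3 / ppi(w_px, h_px, diag_in)   # size of one pixel in metres
    return (pitch_m / distance_m) * ARCMIN_PER_RAD

setups = [
    ("40in 4K TV at ~0.9m (3ft)", 3840, 2160, 40, 0.9),
    ("27in 4K monitor at ~0.6m",  3840, 2160, 27, 0.6),
]

for name, w, h, d, dist in setups:
    print(f"{name:28s} {ppi(w, h, d):5.1f} ppi, one pixel ~{pixel_arcmin(w, h, d, dist):.2f} arcmin")
[/CODE]

Both land just under 1 arcminute per pixel, so someone with slightly better than 20/20 vision can plausibly pick out pixels while most people can't, which squares with both posts.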

40" is nice target for 8K Hyper Ultra Ridiculous HD (not to be confused with real true 8K) :astronaut:

There is a book by William Horatio Bates about improving eyesight. Best book I have ever read; very enlightening.
 
This is a valid reason to be angry and rant, unlike the 4096-pixel-wide resolution, which would be totally pointless for TV anyway.

But keep in mind that for typical TV content this doesn't matter much at all, because most sources use 4:2:2 or even 4:2:0 chroma subsampling, and in that light the RGBW subpixel structure isn't really harming picture quality much, if at all. I would still prefer an OLED with RGBW over a full-RGB LCD for my next TV, simply because of black levels and motion performance.

Technology just keeps changing, just like computers; maybe in 2020 it will all change again when the 8K TVs come out! I read that the new HDMI 2.1 is now in testing; it significantly increases bandwidth, from 18 Gb/s to 48 Gb/s, and supports 8K and 10K resolutions as well as 4K resolution at 120fps. An OLED TV with HDMI 2.0b now would be a waste of money if 2.1 were to come out later this year, just like buying a GTX 1080 Ti card now when the 2080 is coming out. IMO.
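As a sanity check on those bandwidth figures, here's a quick sketch of raw pixel rates (Python; blanking, link-encoding overhead and DSC compression are all ignored, so real requirements are somewhat higher):

[CODE]
# Ballpark payload rates behind those HDMI figures (blanking, link-encoding
# overhead and DSC compression are all ignored, so real requirements are higher).
def raw_gbps(w, h, hz, bits_per_channel, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

targets = [
    ("4K60,  8-bit RGB",  3840, 2160,  60,  8),
    ("4K120, 10-bit RGB", 3840, 2160, 120, 10),
    ("8K60,  10-bit RGB", 7680, 4320,  60, 10),
]

for name, w, h, hz, bits in targets:
    print(f"{name:18s} ~{raw_gbps(w, h, hz, bits):5.1f} Gb/s raw  "
          f"(HDMI 2.0 ~18 Gb/s, HDMI 2.1 ~48 Gb/s)")
[/CODE]

4K at 120Hz is well past what HDMI 2.0's ~18 Gb/s can carry, and 8K60 at 10-bit exceeds even 48 Gb/s raw, which is part of why HDMI 2.1 adds DSC compression for the top modes.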
 
I am excited for variable refresh rate TVs and new consoles supporting it, making 30fps hell a thing of the past.
Increased resolution... not so much.
UHD seems more than enough for the next few years on TVs.
Computer monitors are of course a completely different story :)
 
Computer monitors are too expensive today for UHD; it would be cheaper to get a 42 inch UHD TV with IPS. I have to wait for the new TVs coming this year before buying one. OLED, yes, it's nice, but way overpriced IMO :)
 
IMHO even G-Sync UHD monitors have completely acceptable prices.
Try to buy something that actually has good colors and viewing angles and you end up paying considerably more for a small 24" 1200p monitor :hungover:
Boy, am I glad I have no such issues with my HP LP2480zx... and I have two of these, the second one acting as a spare... you know, just to be on the safe side :cool:

TVs are usually very bad for computer usage; only some of them have 4:4:4 and the ability to disable all the 'image enhancers' to display an undistorted image. Lag is at least 24ms, and they are only 60Hz with no VRR. Cheap TVs have panels with very low image quality, use W-LEDs of the absolutely worst sort, and always do some gimmicky thing with colors like conversions from YUV to RGB. Some TVs do not even have an RGB pixel arrangement...
 
Yes, there are always going to be drawbacks to using a TV as a monitor, plus they don't have the same quality as a monitor panel, I know that, but times are changing. I was looking at the LG 43UD79-B: 8ms, DisplayPort @60Hz, plus HDCP 2.2 and HDMI 2.0a if you use a UHD Blu-ray disc player, but it doesn't have HDR and many gamers are having problems trying to update the firmware on it. That is just a small problem, though. Then there is the Sony X800E; I like that it supports 4096x2160 @60Hz with HDR10, but there is no DP port, only HDMI 2.0b, and I can't find the 'ms' (response time) spec for it.

So I will have to wait a bit longer for the next new TV/monitor products coming onto the market this year.
 
Yes, there are always going to be drawbacks to using a TV as a monitor, plus they don't have the same quality as a monitor panel, I know that, but times are changing.

Yup, I can assure you my humble 40" KU6290 looks way better than 90% of the monitor models out there today. Better color depth, a 10-bit panel, ~6000:1 native contrast ratio. And it only cost me $300.
 
HP Omen 32 with FreeSync

https://hardforum.com/threads/hp-omen-32-1440p-monitor-refurb-239-shipped.1954537/

https://hardforum.com/threads/hp-omen-32-qhd-monitor-2560x1440-5ms-75hz-freesync-va-panel.1909919/


FREESYNC USE INSTRUCTIONS
turn off VSync
turn on FreeSync
enjoy less input lag with your mouse
a noticeably smoother feel and look to your games between 48-75fps (48-75Hz, which is the FreeSync range on the Omen 32)
and no screen tearing between 48-75Hz

If you want to try to keep the monitor in the FreeSync range at all times (I advise yes), then also set a frame rate target of 72, 73, or 74 FPS in the Catalyst Control Center. (You have to go a bit below 75Hz because this driver setting isn't an exact limiter; it's a target, and the FPS will fluctuate 2-3 frames up or down from it.)

RESULT: Everything in games will just feel buttery velvety smooth.

It's one of those things, IMO, that at first is not completely obvious, but after you've used it for a while, it's disgusting when you don't have it.
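If it helps, the cap-picking rule boils down to something this simple (a toy Python sketch, not anything from the driver; the 48-75 default is the Omen 32's range and the 3 fps slack mirrors the 72-74 advice above):

[CODE]
# Toy sketch of the cap-picking rule above: aim a few fps below the top of the
# FreeSync range so an inexact limiter still keeps you inside it.
def suggest_fps_target(vrr_min=48, vrr_max=75, slack=3):
    target = vrr_max - slack
    if target <= vrr_min:
        raise ValueError("FreeSync range too narrow for this much slack")
    return target

print(suggest_fps_target())          # 72 for a 48-75Hz panel
print(suggest_fps_target(40, 144))   # 141 for a hypothetical 40-144Hz panel
[/CODE]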

Case in point: I got used to FreeSync on my HP Omen monitors over the course of six months. I replaced my Fury X cards in CrossFire with a pair of 1080 Tis in SLI. I plugged in the new 1080 Tis just a few days before a weekend-long annual LAN party I hosted. I dropped the refresh rate to 60Hz to avoid the frame skipping that occurs on the Omens with Nvidia at 75Hz.

I was still so disgusted by the way my computer felt and played with the 1080 Tis at 60Hz vs. the AMD Fury X cards at 75Hz that I didn't even enjoy the LAN party. It made everything feel choppy and gross, re-introduced screen tearing, which I hadn't seen in 6 months, and overall just ruined the gaming experience I was accustomed to.

FreeSync and G-Sync are, in the final effect, the same experience IMO (right now I'm using two 1080 Tis and an Alienware G-Sync 34" 3440x1440 monitor at 120Hz), but I'm one of the people that really likes the tech and notices it when it's gone. I'm NOT one of the people who notice super high FPS by comparison. 60Hz is fine, heck, 50Hz is fine when FreeSync or G-Sync is on, and I can't tell the difference between 75Hz and 144Hz with FreeSync monitors (I've tried them side by side), but I can absolutely tell, even in blind testing (which I did with some friends), when FreeSync is on or off.
 
Last edited: