Is 1440p or 1080p best for gaming?

I have 1920x1200 and it suits me. I don't want to have to spend $800 on video cards and/or lower settings to get a decent framerate. If you sit more than 32 inches (two feet, eight inches) away from your monitor, the human eye can't distinguish 1440p on a 27 inch 16:9 monitor from 1200p on a 24 inch monitor... something to keep in mind. And if the games don't have the textures to support that resolution, you will still need anti-aliasing. It's just a bigger investment than I need to make, given I don't game as much as many here do.
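Rough back-of-the-envelope math on that viewing-distance claim, for anyone who wants to check it (just a sketch: the 32 inch distance is the figure quoted above, and ~1 arc-minute per pixel is the commonly cited 20/20 acuity threshold, so treat the output as ballpark numbers):

import math

def ppi(diag_in, w_px, h_px):
    # pixels per inch from the panel diagonal and native resolution
    return math.hypot(w_px, h_px) / diag_in

def arcmin_per_pixel(ppi_val, dist_in):
    # angular size of one pixel at the given viewing distance, in arc-minutes
    return math.degrees(math.atan((1.0 / ppi_val) / dist_in)) * 60.0

for name, diag, w, h in [("27in 2560x1440", 27, 2560, 1440),
                         ("24in 1920x1200", 24, 1920, 1200)]:
    density = ppi(diag, w, h)
    print(f"{name}: {density:.0f} PPI, {arcmin_per_pixel(density, 32):.2f} arcmin/pixel at 32in")

# roughly 109 PPI / 0.99 arcmin vs 94 PPI / 1.14 arcmin -- both hover around
# the ~1 arcmin threshold, which is what the claim above rests on.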

I'm not saying I'll never upgrade... but I'm just not impressed with the current offerings, which still have backlight issues, IPS glow, and lackluster black levels. There's not enough incentive for me to upgrade from my Dell 24 inch to a Dell 27 inch. Start using ATW polarizers and I might bite... I'd even pay $100 extra to have one on my screen. Honestly, the idea of trying a 120Hz TN panel and using Nvidia's LightBoost, or whatever it's called, has crossed my mind too. I never really paid attention to the motion blur on the panel until recently, and now I do see it... I should bring the old CRT back from my parents' place and do a little comparison testing.
 
Depends on the game. You don't want 1440p if you're only going to get 30 FPS or lower because the resolution is too high. And running anything without supersampling AA these days just looks too ugly... shimmering = YUCK.

Frankly, no (single-chip) video card in the world can run 1080p in almost any current or semi-current (FPS) game with high detail and 8x supersampling AA without the game slowing down to molasses (I don't consider CS:GO to be a "current" game, btw). Hell, you can't even run the six-year-old Crysis 1 at something super low like 1024x768 (CRT) with 8x supersampling in DX10 and get truly playable framerates! DX9 mode is playable at such a low resolution at 60 FPS, but 120 for the 120Hz LCD screens? Forget it.

The day a video card comes out that can run every single current FPS release at 1080p with 8x supersampling AA at 120 FPS ideally (60 minimum) will be a happy day for all gamers. 1440p will probably stay impossible on a single card.
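Some rough numbers on why supersampling is that heavy (a sketch, assuming "Nx SSAA" means N shaded samples per output pixel, which is how it's usually counted):

# shaded samples per frame under Nx supersampling (assumed N samples per pixel)
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440)}
for name, (w, h) in resolutions.items():
    base = w * h
    for n in (1, 4, 8):
        print(f"{name} @ {n}x SSAA: {base * n / 1e6:.1f}M samples/frame")

# 1080p with 8x SSAA works out to ~16.6M samples per frame, roughly double
# the pixel count of native 4K (8.3M) -- and that's before asking for
# 60-120 FPS on top of it.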
 
At the present time I am comfortable at 1920x1080. I own a 120Hz display, so higher resolutions would require some serious horsepower to maintain a high frame rate, and I do not wish to invest that much in video cards.
 
Who cares what our preferences are? Your preference is what is important.

Try both a 2560x1440 display and 120Hz and decide for yourself.
 
Same aspect ratio between 1440p and 1080p, so the FOV doesn't increase.
Thus... what?


It seems the only difference is text size

Here is 1080p:

[1080p.jpg]

Here is 1440p:

[1440p.jpg]
 
I just upgraded from a 32 inch full HD TV to the new UltraSharp 27 from Dell.
I used to game a lot at a competitive level, and nothing comes close to a good old CRT for that!
But the 120Hz panels are also a great choice, though they lack colors and pixels!

My LCD TV had an input lag of +-45ms vs a CRT monitor.
My UltraSharp has +-60ms compared to that CRT.

I was obviously freaked out a bit! So I fired up a game of Counter-Strike and immediately and easily became MVP. So no worries there.

On the subject of GPU power: it takes a bit more power, yes, but at those resolutions and DPI you don't really need anti-aliasing anymore.
 
Depends. 1440p is WAY better than 1080p @ 60Hz,

but I think 1080p @ 120Hz is WAY better than 1440p.

Now if you are into fuckin' with drivers, PCBs and shit, you can get 1440p @ 120Hz, which is WAY better.

But this is just what I think.
 
1440p means image quality is more important to you than high frame rates, as you need a beefy rig to run modern games at that resolution. Choose what suits you best.
 
Just got a 27" Catleap. Awesome monitor and, surprisingly, a lot better than my U2410.

It seems the only difference is text size

Here is 1080p:

Here is 1440p:

The menu is smaller, not the image. I use 75 FOV.
I see you're using 90? Turn it down; you could be getting a fish-eye effect.
 
And running anything without supersampling AA these days just looks too ugly... shimmering = YUCK.

WRONG!! :D

Higher resolutions need less AA, and not all game engines have the shimmering effect without SSAA; there are other, less demanding AA methods available. At 1080p one may need 8x AA (I completely disagree that it must be SSAA), but at 1440p or 1600p, 2x AA (yeah, pretty much any AA looks good at 1600p :D) is enough to make things look beautiful. I have higher quality demands in slower-paced games like MMOs and strategy games, but in FPS games I don't feel great returns from overkill quality settings - not if I'm dead every couple of seconds anyway ;)

I would say the truth goes exactly the other way around: pretty much any modern card today achieves gaming-grade performance at 1080p, most mid-range VGAs are OK for 1600p gaming, and high-end VGAs are multi-monitor territory. At 120Hz things are much more demanding, especially with high AA levels. A 256-bit memory bus is almost too little for such setups :cool:

If half of what the LightBoost maniacs say is true, 60Hz strobed will deliver better visual quality than 144Hz non-strobed, and 1440p+ monitors would get even more value.

Back to topic - resolutions for gaming: 1080p if your card is slow, 1440p if it's fast, an FW900 if you are a gaming genius with no other income source than professional gaming :cool:
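On the 120Hz point, raw fill-rate arithmetic makes it concrete (just a sketch that counts output pixels per second and ignores AA and overdraw):

# output pixels per second for a few resolution / refresh-rate combos
combos = [("1080p @ 60Hz", 1920, 1080, 60),
          ("1080p @ 120Hz", 1920, 1080, 120),
          ("1440p @ 60Hz", 2560, 1440, 60),
          ("1440p @ 120Hz", 2560, 1440, 120)]
for name, w, h, hz in combos:
    print(f"{name}: {w * h * hz / 1e6:.0f}M pixels/s")

# 1080p @ 120Hz (~249M px/s) already outpaces 1440p @ 60Hz (~221M px/s),
# and 1440p @ 120Hz (~442M px/s) is over 3.5x the work of 1080p @ 60Hz.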
 
I disagree; I still use some form of supersampling whenever possible on my 1440p display.

Also, MSAA only takes care of certain aliasing. At the very least I like to use some form of transparency AA, and I prefer some form of SSAA whenever possible. For example, 4x MSAA won't take care of the shimmering in the trees or grass in Crysis.
 
Everyone would prefer a higher resolution if they could afford to push the FPS on it and there was support for it in programs.

That said, nowadays I think 1920x1080 is the sweet spot for performance; if you tend to run higher-end hardware, though, 2560x1440 is very reasonable.
 
Everyone would prefer a higher resolution if they could afford to push the FPS on it and there was support for it in programs.

That said, nowadays I think 1920x1080 is the sweet spot for performance; if you tend to run higher-end hardware, though, 2560x1440 is very reasonable.

I agree. People worry about 4K when we don't even have the bandwidth to do 1080p properly; it's the same thing... 4K is not going to be in consumer-grade equipment at a reasonable price for a decade or more; only the rich need access to such resolutions, and they will pay for the privilege. There's little to gain from such resolutions unless you sit only a few feet from your display. Televisions don't need it; the human eye simply cannot perceive the difference at normal viewing distances. I'd rather they keep working on getting OLED panels down in price...
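For anyone curious, the rough bandwidth arithmetic for uncompressed video looks like this (a sketch assuming 24 bits per pixel and ignoring blanking and protocol overhead, so real link requirements are somewhat higher):

# uncompressed video bandwidth at 24 bits/pixel, no blanking or overhead
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

for name, w, h, hz in [("1080p60", 1920, 1080, 60),
                       ("1440p60", 2560, 1440, 60),
                       ("4K60", 3840, 2160, 60)]:
    print(f"{name}: {gbps(w, h, hz):.1f} Gbit/s uncompressed")

# roughly 3.0, 5.3 and 11.9 Gbit/s -- 4K60 alone is more than dual-link DVI
# or HDMI 1.4 can carry, before any overhead.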

If I were going to upgrade my monitor I'd probably get the 30 inch 1600p and go all out. My common-sense gene says this is unnecessary in my case, though. But that doesn't mean other people who game more shouldn't buy one. :)
 
Sort of on topic... I'm looking for a monitor for both photo editing and playing games. If I use the monitor at its native 1440p for photo editing, can I run games on it at 1080p? Will they look alright?
 
Sort of on topic... I'm looking for a monitor for both photo editing and playing games. If I use the monitor at its native 1440p for photo editing, can I run games on it at 1080p? Will they look alright?

1920x1080 will look noticeably worse on a 1440p display, though you could always play at 1080p in a window instead of stretching it to full screen. It's hard to capture on camera; you really need to see it for yourself. The ViewSonic VP2770 is the best for photo editing and gaming:

http://hardforum.com/showthread.php?t=1732378
 
Do you live in Canada or the US? There are better deals in Canada.

AMD:
Gigabyte 7970 with BioShock Infinite & Crysis 3 ($380 from NCIX Canada)
XFX 7970 with Far Cry 3, Sleeping Dogs, BioShock Infinite & Crysis 3 ($390 from NCIX Canada)

Nvidia:
MSI GTX 680 Lightning ($500)
SPARKLE 700021 ($490, Newegg Canada/US)
EVGA GTX 680 Superclocked Signature 2, $490 from Newegg US and $470 from NCIX Canada (02G-P4-2687-KR is the specific model number... EVGA sells many 680s)
Galaxy GTX 680 4GB, $460 after $40 MIR from NCIX Canada
Galaxy GTX 670 GC with dual fans, $330 after rebate from TigerDirect (US) and $330 from NCIX Canada

If you have $600 to spend on a single 680, I would get 2x Galaxy GCs instead, since a 670 is only 5-10% slower than a 680 but costs less.

Amazon Canada & US sell the ViewSonic VP2770 and have the best return and exchange policy.
 
Swap the Dell for the ViewSonic. The Dell has matte-coating cross-hatching issues, burn-in, and higher input lag than the ViewSonic. The ViewSonic is also less likely to have screen uniformity issues.

The Samsung SSD you wanted isn't in stock; get this instead: http://www.newegg.com/Product/Product.aspx?Item=20-148-443&ParentOnly=1

I live in Canada and haven't really checked US prices, so I'll leave that to you. 32GB of RAM is overkill, and the Windows 7 you picked seems really expensive...

http://www.amazon.com/s/ref=nb_sb_n...ows 7&sprefix=window,aps&rh=i:aps,k:windows 7
 
I've had both; 1440p certainly looks better.

My opinion -
Casual gamer who wants the best graphics - 1440p
Competitive gamer interested in performance - 1080p 144Hz or CRT
 