1080P vs 1440P for gaming?

Drawmonster

Why 1440p for gaming? Are the differences enough to justify 1440 over 1080? Are they enough to justify a dual-card setup, with all the issues and expense that involves, vs. a single-card setup with no issues and half the cost? Or is this in the same realm as the "audiophiles" who insist on buying third-party sound cards, studio-quality speakers, and premium cables for gaming? Blu-ray discs store video at 1080P. Why isn't that quality enough for gamers, aside from pure epeen value? It's easy to quote the pixel increase from 1080P to 1440P, but who cares? When did some extra pixels that you won't notice trump a beautiful game with an acceptable FPS?

If you have a response, please post proof or data to back it up. Not looking for people who are trying to justify their 1440P monitor purchases....
 
I have a 23.5 inch 1080p surround setup and a 32 inch 1440p. There's a pretty big difference in image quality. Once you go 1440p you most likely won't want to go back to 1080p
 
1440p doesn't require dual card setups if you're content at 60Hz
 
Data? How about simple math.

1080p = 2073600 pixels
Spread over a 23" display gets you 95.78 PPI

1440p = 3686400 pixels
Spread over a 27" display gets you 108.79 PPI

More pixels, larger area, higher pixel density.

I chose 23 and 27" because those are typically the most common sizes for the respective resolution.
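If you want to double-check those figures, here's a quick Python sketch of the PPI calculation (diagonal pixel count divided by the physical diagonal):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 23), 2))   # 95.78 PPI for 23" 1080p
print(round(ppi(2560, 1440, 27), 2))   # 108.79 PPI for 27" 1440p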
 
Straight off the bat it seems like you're taking a very aggressive stance towards 1440p; for what reason, I do not know. Your excessive use of question marks is also unnecessary, just FYI. I'll clear some things up for you, though. First: you by no stretch of the imagination need an SLI or CrossFire config to run 1440p. A 970 or the AMD equivalent will do just fine at 60Hz. Look up benchmarks if you need 'data'.

Judging by your attitude, specifically your comment about 'a bunch of extra pixels that you won't notice,' your assumption that the jump from 1080p to 1440p isn't noticeable is, to be honest, wrong. You notice when you go from a 1080p display to a 1440p display to a 4K display. Pixel density determines how crisp and clean an image looks.

I would say yes, 1440p is worth it for gaming. A 27" monitor at 1080p, in my opinion, looks fairly horrid. I wanted a bigger screen, so I chose one with a higher resolution to maintain image quality. I very much appreciate the added screen real estate of a 27" over a 24".
 
1080P was yesterday and today.
1440P is today and tomorrow.
4K is tomorrow (unless you're rich enough to build a rig for it today).
 
OP, what you can justify is clearly not the same as what others can.
If you find you can't afford it and are jealous enough to create a thread about it, perhaps you need a better job.

If you are happy at 1080p (as you said in another thread while throwing an earlier tantrum on this topic), then why are you giving us brain ache in that thread, and now in this new thread you created, because you're so happy?
Clearly not happy, lol.
 
If you are looking at 27" 1080p vs 27" 1440p, then it is in fact purely about pixels per inch. Higher PPI will give you a better picture. For gaming, 1080p is a bit on the low side for 27" at 81.59 ppi, while 27" 1440p is 108.79 ppi, which is a noticeable improvement. This is why you're seeing so many 27" 1440p monitors now; 1080p is the standard for 24" monitors, 1440p is the standard for 27" monitors.

In terms of screen size, here's how a 24" 16:9 and a 27" 16:9 compare. If you happen to have been using a 24" 16:10 display, a 27" 16:9 will be almost the same height, which does surprise some buyers.
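If you want to work out the physical dimensions yourself, here's a rough sketch; width and height follow directly from the diagonal and the aspect ratio:

Code:
import math

def dimensions(diagonal_in, aspect_w, aspect_h):
    """Width and height in inches for a given diagonal and aspect ratio."""
    diag_units = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / diag_units, diagonal_in * aspect_h / diag_units

for name, d, w, h in [('24" 16:9', 24, 16, 9), ('24" 16:10', 24, 16, 10), ('27" 16:9', 27, 16, 9)]:
    width, height = dimensions(d, w, h)
    print(f'{name}: {width:.1f}" wide x {height:.1f}" tall')
# 24" 16:9:  20.9" wide x 11.8" tall
# 24" 16:10: 20.4" wide x 12.7" tall
# 27" 16:9:  23.5" wide x 13.2" tall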

officer921 said:
When did some extra pixels that you won't notice trump a beautiful game with an acceptable FPS?

If you go forward with the assumption that you won't notice the extra pixels, then by all means save yourself the money and make a 1080p gaming setup. 1080p is still perfectly valid, especially if you're interested in 120Hz monitors. The answer to the argument between higher FPS and higher resolution always depends on how you game.
 
My brief experience with a 32" 1440p on the "flashy" BenQ 3200pt revealed two things to me (compared to an ASUS 120Hz 27" panel).

For reference I have an overclocked gtx 780.

If you are concerned about frame rate, then 1440p stresses your system significantly more than 1080p. If you have a high-end system, or are not sensitive to low frame rates (I need a minimum of 85Hz to see fluid movement), then this should not be a worry.

1440p looked much clearer in the games I was playing (Battlefield 3 and Company of Heroes 2). In Battlefield 3 it was easier to identify targets in the far background, so I got more kills, but because of the lower frame rate I probably missed more in my peripheral vision. Company of Heroes looked a lot better, BUT the minimap was now too small (even on a 32" monitor).

For desktop usage I found 1440p is a lot better.
 
The increase in pixel density is amazing for gaming. 1440p is pretty demanding with recent games, but it would be hard for me to go back. All 3D games look so much better.

Some games have bad UI scaling, which can be an issue. For example, I play Age of Conan (an MMO, where the higher the resolution, the more you can see on screen) and had to edit the UI files to increase the size of the fonts and icons because they were way too small; on the other hand, the scaling is perfectly adequate in BF3/4.

Also, with 1440p at 27" I really don't care about aliasing anymore. Not saying I don't see it, but it's not enough for me to say "whoa, I really need some AA there". This will vary from person to person, and I don't have good vision at all myself (though I'm always super close to my monitors), but even at 1080p on a smaller screen I was rarely bothered by aliasing.
 
I've been using a 1440p monitor for probably close to 2 years at this point and I would highly recommend it to anyone. Not only is the image quality great but for games (especially BF4), it will enable you to see a lot more detail (a distant sniper on a ridge, for example... which is often tough to see) than a 1080p. I jump back and forth between a 150ppi 1080p display and a 110ppi 1440p quite often and even though the 1080p trumps it in pixel density... I still prefer the higher resolution display.

I am running a dual GPU setup (HD7990) which is nice for some of the more modern/demanding games, but due to occasional crossfire issues, I often find that a single 7970 is plenty to run games at 1440p. So don't let that scare you away.

Another huge advantage of 1440p is using your PC for anything outside of gaming. Having all that extra screen real estate is simply awesome. I do 3D modeling for a living, so having the ability to view more of my work on screen at once is a great help.

As soon as a company releases a good 30" 120hz 4K IPS, I'll be jumping in head first.
 
Just know it would make all the difference in your beloved EVE Online.
 
You mention Blu-ray being 1080p; well, Blu-ray doesn't think 1440p is worth it. They think you should skip 1440p and go straight to 4K. http://www.engadget.com/2015/01/09/4k-blu-ray-hdd/ The worth of a higher resolution is situation-specific, based on viewing distance and screen size. For many situations, 1080p is the highest quality needed, but other scenarios require more pixels to achieve the same overall image quality.
[Chart: recommended resolution by screen size and viewing distance]
 
The Windows desktop displays best with a pixel density of ~100-110 PPI.

So as for 1080p vs 1440p, for me it's whatever size of display gets that pixel density. I want a monitor 27" or above, and 1440p at 27" or ultrawide at 34" gets me that proper pixel density, so those were my choices.
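Here's a rough sketch of how you'd work out which diagonal sizes land in that ~100-110 PPI window for a given resolution (the window itself is just my preference from above):

Code:
import math

def size_range_for_ppi(width_px, height_px, ppi_low=100, ppi_high=110):
    """Diagonal sizes (inches) that keep a resolution inside a target PPI window."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / ppi_high, diag_px / ppi_low  # smaller diagonal -> higher PPI

for res in [(2560, 1440), (3440, 1440)]:
    low, high = size_range_for_ppi(*res)
    print(f'{res[0]}x{res[1]}: {low:.1f}" to {high:.1f}"')
# 2560x1440: 26.7" to 29.4"
# 3440x1440: 33.9" to 37.3"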
 
Resolution depends on the monitor size, IMO. If you want a 32" screen, then 1440 makes a lot of sense. If you're at 27", then 1440 is fine, I suppose, but not mandatory; it's a much heavier load on your GPU and a sizeable drop in frame rates. Yeah, you don't "need" dual GPUs for 1440, but you do if you want to play at the same frame rates a single GPU gives you at 1080 and stay around 60. Looking at Anand's bench and using my 290X: in BF4, for example, you drop from 72 to 47. Metro LL goes from 86 to 49. Crysis 3 from 79 to 52. Company of Heroes from 62 to 42. Crysis 3 isn't playable at 1440 on max settings, but it is at 1080.

So to me 1440 ain't worth it, at least not at 27". 32" makes a better argument, because it's a big enough screen that the extra pixels make a bigger difference, but at 27" you're looking at a marginal increase in image quality at the cost of a big hit in frame rates and overall cost.
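To put those numbers in perspective, here's a quick sketch using only the figures quoted above: the pixel count rises by roughly 78%, while the frame rates fall by roughly a third:

Code:
# Frame-rate figures quoted above (290X, 1080p -> 1440p, Anand's bench)
benches = {"BF4": (72, 47), "Metro LL": (86, 49), "Crysis 3": (79, 52), "CoH": (62, 42)}

pixel_increase = (2560 * 1440) / (1920 * 1080) - 1
print(f"Pixel count increase: {pixel_increase:.0%}")          # 78%

for game, (fps_1080, fps_1440) in benches.items():
    drop = 1 - fps_1440 / fps_1080
    print(f"{game}: {fps_1080} -> {fps_1440} fps ({drop:.0%} drop)")  # 32-43% drops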
 
forget 1440p...the real gamers resolution is 16:10...loving my 1920 x 1200 setup :D
 
It's not just about pixel density increases though (a small 1080p might have the same pixel density as a large 1440p monitor). Don't textures become more detailed at 1440p and beyond? Specifically, distant textures?
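My rough understanding of the distant-texture part (a sketch of generic mipmap selection, not any particular engine's code): with more output pixels, each pixel covers fewer texels of a far-away surface, so the renderer samples a more detailed mip level.

Code:
import math

def mip_level(texels_per_pixel):
    """Generic mipmap LOD: log2 of how many texels one screen pixel spans."""
    return max(0.0, math.log2(texels_per_pixel))

# Hypothetical distant wall: one pixel spans ~4 texels at 1080p,
# but only ~3 texels at 1440p (2560/1920 = 1.33x more pixels across it)
print(mip_level(4.0))  # 2.0  -> a quarter-resolution mip is sampled
print(mip_level(3.0))  # ~1.58 -> a sharper mip is sampled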

Also, you don't need SLI to run 1440p. I was running with a single 970 GTX at 1600p for a while with buttery smooth frames.

The way I see it is that 1080p is quickly becoming the new 640x480. A resolution that nobody wants anymore. 1440p or 1600p is the sweet spot for gaming. It's a resolution that the latest cards can handle just fine without the need for SLI. 4k is indeed tomorrow's resolution. Today's hardware just isn't up to snuff to have it running >= 60 fps at ultra settings.
 
I don't have a 1440p monitor, and yet I still consider the TS's question a bit silly. Ever since the dawn of PC gaming, higher resolution and greater fidelity have always been desired, from EGA/VGA to SVGA and so on. Unlike a TV, which sits far away, a computer screen is usually within a meter of your eyes, often much closer. Even with small changes in resolution, an increase or loss in detail is easy to spot.

Now, to be fair, 1080p is kind of a sweet spot. The picture is already very detailed, and with the help of good antialiasing, jaggies are pretty much gone (provided the screen size is sane for the resolution; the 32" TV I use as a monitor is severely pushing it), so going higher is not "necessary" so to speak. But if your graphics card has the horsepower to do it, there is no reason why you should NOT buy a 1440p monitor if you are in the market for a new one. This is not a hi-fi audiophile situation where differences are minute and sometimes even placebo imagination, not even close. The differences are objective and measurable.
 
At the core of any argument for using one resolution over another is picture/image clarity and edge crispness/sharpness of the displayed content. A gamer more than likely wants to be as immersed as possible into his/her game and would desire an increased level of realism/graphical fidelity to simulate the "realness" of the experience.

As pointed out previously, pretty much every 1440p display is around 110 PPI, and most 1080p displays are somewhere between 91-95 PPI (most "gaming" monitors at that resolution tend to be 23 or 24 inches; note that 27-inch 1080p displays drop to about 82 PPI). For rendering an image, having more pixels available means the image can be displayed more accurately; it's essentially the same concept as integrals, data sampling, bitrates for music playback, and so on. Sure, you can approximate the area under a curve with a handful of rectangles and triangles and get a close-enough answer, but if you use an "infinite" (very, very large) number of rectangles and triangles, you get the exact answer.

So we've tackled the ability of an image to be represented more accurately by having more pixels per inch to draw it with; now the clarity comes in - aka the jaggies/blurring/aliasing issues. It's the same issue as before: not enough pixels exist in the panel to properly draw out the edges of objects like grass, trees, small objects, etc., so a blurred edge is created and AA has to be applied to the image. Now, it's true that with a good AA technique you can turn that less-than-ideal image at 1080p into something crisp, but the performance hit to the GPU is massive with anything but FXAA, and FXAA is only enough to remove the blur, not the rough edges of objects. Enter 1440p: with roughly 17% more pixels per inch and 77% more pixels on the screen to work with, objects are naturally rendered on screen with much more detail and clarity, and blur reduction and edge smoothing are not as important.

There is one more thing, though: nVidia's new DSR technology for the 900 series cards. It functions essentially as a supersampler: the card renders the game at a higher resolution and then downscales the output to your native resolution. While it takes a good amount of GPU power, it's still only rendering the image "once", versus applying some form of AA that takes an already-rendered frame and re-processes it for smoother, less blurred edges.
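For reference, the DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor; a quick sketch with 1920x1080 as the assumed native resolution:

Code:
import math

def dsr_render_resolution(native_w, native_h, factor):
    """DSR factor scales total pixel count, so each axis scales by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(dsr_render_resolution(1920, 1080, 2.25))  # (2880, 1620) - 1.5x per axis
print(dsr_render_resolution(1920, 1080, 4.0))   # (3840, 2160) - 4K rendered, then downscaled to 1080p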

Personally, though, there is no comparison between 1080p and 1440p; one is clearly superior as far as gaming immersion goes. I went from a 1920x1080 23" IPS to a 3440x1440 34" IPS and have never been happier with the image quality of my games. I was able to bump my graphical settings down an entire level in almost every game, as well as knock AA down to FXAA instead of MS/SSAA x2 or x4, and still get the same quality image with significantly less strain on my old 680.

Since I got my 980, I've been able to turn the settings at 1440p back up to what they were when I was still doing 1080p, and I literally said "I can't believe it's not butter" and spent the next 20 minutes flipping the settings up and down in awe, because I never would have been able to see that kind of clarity at 1080p (never mind the fact that I'm now seeing more of my game, with an extra 360 pixels vertically and an extra 1520 pixels horizontally) without making my card beg for mercy with some ridiculous 4x or 8x MS/SSAA option, which the majority of games don't even support.
 
If you watch a 1080p video on a 4K TV, do you think you will get better video quality?
The same goes for games: if they are not made for high resolutions, then only a few things will look better.
One game I know is made for high resolutions (4K) is the soon-to-be-released PC version of GTA 5, but I don't think we will see a lot of games made for 4K, because most game developers will give us a crappy 1:1 console port.
I will only buy a 4K display if the price is almost the same as a 1080p display, or if the panel quality is better, or if most media content is done in 4K.
 
Personally, though, there is no comparison between 1080p and 1440p; one is clearly superior as far as gaming immersion goes. I went from a 1920x1080 23" IPS to a 3440x1440 34" IPS and have never been happier with the image quality of my games. I was able to bump my graphical settings down an entire level in almost every game, as well as knock AA down to FXAA instead of MS/SSAA x2 or x4, and still get the same quality image with significantly less strain on my old 680.

Well, that's not exactly a comparison of resolution, is it? I mean sure, going from a 23" screen to a 34" screen is obviously going to be a huge move in performance and immersion. That's why I think it's more an argument of screen size than of resolution. 1080 @ 27" is still perfectly fine; at 32" and above, 1440 is where you want to be.

I don't think it's an argument of whether or not 1440 is superior to 1080. It all depends on how big the monitor is.
 
If you watch a 1080p video on a 4K TV, do you think you will get better video quality?
The same goes for games: if they are not made for high resolutions, then only a few things will look better.
One game I know is made for high resolutions (4K) is the soon-to-be-released PC version of GTA 5, but I don't think we will see a lot of games made for 4K, because most game developers will give us a crappy 1:1 console port.
I will only buy a 4K display if the price is almost the same as a 1080p display, or if the panel quality is better, or if most media content is done in 4K.
Game graphics are rendered at the output resolution, not drawn at fixed sizes like they were decades ago.
It all depends on how big the monitor is.
It's a function of size *and* viewing distance, and viewing distance seems to have the larger impact.
 
I went from using a 24" 1080P monitor to 27" 1440P. It's a great improvement (and yes, I want to justify my significant investment in these monitors. Bite me).

I'd recommend 27", even at 1080P, as the size makes games more immersive. The jump from 1080P to 1440P is certainly noticeable, and gives the picture a finer quality. I think there are other features that make more of a difference, like G-Sync or 3D Vision; those completely change the landscape and are a fundamental addition. The resolution bump from 1080P to 1440P does not fundamentally change the experience; it's an incremental improvement.
 
I think another important consideration you need to seriously look into is that everything is 1080p due to standardisation, so is it worth going against the grain for not much benefit? Maybe jump from 1080 to 4K, but 1440? Most gamers don't care about resolution; they want FPS and visibility. Not sure why the guys above me said people have always been into high resolutions, as that is not entirely true. Back in the day most people were gaming at 800x600 because it's easier to hit larger pixels. Also, picmip was set to pretty much no textures, just solid colors. It's one thing to play a game and another to view a game casually. I'd say stick to 1080 and don't pay much for the monitor either, as soon there will be 144Hz IPS panels.
 
Are there any games that actually take advantage of the extra screen real estate? I've gone from 1080 to 1440 for workstation purposes, which is completely phenomenal, but when I play games here and there, there isn't any extra openness from the extra resolution.
 
I used to be in the camp of 1080p@60Hz being optimal, as it's easier to hit a constant 60 fps in games to avoid tearing.
Which realistically means your game is running at 80-90 fps most of the time.

But FreeSync/G-Sync changes that logic by 180 degrees. The same GPU that never fell below 60 fps at 1080p will be able to average 60 fps at 2560x1440, with some drops, but the drops won't be that noticeable.

And if you can run the most demanding games at 60 fps, then there are dozens of smaller/older titles that will run at 100+ fps.

So I'll try to upgrade to one of those new 144Hz IPSes as soon as money allows :D
 
Well, that's not exactly a comparison of resolution, is it? I mean sure, going from a 23" screen to a 34" screen is obviously going to be a huge move in performance and immersion. That's why I think it's more an argument of screen size than of resolution. 1080 @ 27" is still perfectly fine; at 32" and above, 1440 is where you want to be.

I don't think it's an argument of whether or not 1440 is superior to 1080. It all depends on how big the monitor is.

Actually, it kind of is a comparison of resolution; notice the mention of turning down graphical settings while still getting the same quality image despite almost doubling the screen area. Go take a screenshot or a thumbnail of some avatar, then use an editor to double its resolution and see how poor the new image looks. Then take the original image, halve its resolution, and see if it looks any crisper than the original. That's the exact opposite of what happens when you increase the rendering resolution: increasing the rendering resolution increases quality.

That ties into the other point somebody made (whom I forgot to quote), which is: who cares about the extra resolution if content isn't made for it? Well, PC games are almost always able to do more (there are some exceptions... I'm thinking of a specific Ubisoft title that was released last year and was a major disappointment). Yes, your 1080p or 720p content from your console or BD won't look any prettier, but if I recall, this thread was about gaming.

Are there any games that actually take advantage of the extra screen real estate? I've gone from 1080 to 1440 for workstation purposes, which is completely phenomenal, but when I play games here and there, there isn't any extra openness from the extra resolution.

Head on over to the Wide Screen Gaming Forum and look for games that are listed as Vert+ (for 16:9 1440p) or Hor+, or that have a silver or gold star in the Ultrawide category (for 21:9 1440p). Lots of games have some way of making use of the extra pixels, beyond just getting a prettier picture without needing large AA multipliers.
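For what Hor+ means in practice: the vertical FOV stays fixed and the horizontal FOV is derived from it and the aspect ratio, so a wider screen genuinely shows more of the world. A sketch of the standard conversion (not any specific game's code; the 60-degree vertical FOV is just an example value):

Code:
import math

def horizontal_fov(vertical_fov_deg, width_px, height_px):
    """Hor+ scaling: horizontal FOV derived from a fixed vertical FOV and the aspect ratio."""
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * width_px / height_px)
    return math.degrees(h)

print(round(horizontal_fov(60, 2560, 1440), 1))  # ~91.5 degrees at 16:9
print(round(horizontal_fov(60, 3440, 1440), 1))  # ~108.1 degrees at 21:9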
 
Old post, but an interesting topic now that 165Hz and even 240Hz (coming this fall) 1440P monitors (including IPS) exist. In first-person shooters, how is target size affected? 27" 1080P vs 27" 1440P: how are the targets you're aiming at scaled?
 
Old post, but an interesting topic now that 165Hz and even 240Hz (coming this fall) 1440P monitors (including IPS) exist. In first-person shooters, how is target size affected? 27" 1080P vs 27" 1440P: how are the targets you're aiming at scaled?

I think it really depends on the game.

I'm playing Modern Warfare on 3x 1440p/144Hz G-Sync panels, but I'm running at 75% resolution (5760x1080 instead of the native 7680x1440) and getting about 100 fps with a 1080 Ti. It's not sharp, but for a fast-moving, short-range shooter, this works great for me.

For Apex Legends, I play at 1440p/144Hz, as the engagement range and the need to spot people from far away make clarity much more important.

Will probably go to the LG 38" 144Hz monitor if it's not a lemon.
 
For the most part, going from 1080p to 1440p on the same size monitor (say 27") will have all game objects remain the same size, just be sharper.

I do recall a few older (like really old) games where the HUD overlays were a fixed pixel size, but I believe most newer games do not have this problem.
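To illustrate why those fixed-pixel HUDs shrink on a denser panel, here's a rough sketch; the 64-pixel icon is just a hypothetical element size:

Code:
import math

def physical_size_inches(element_px, width_px, height_px, diagonal_in):
    """Physical size of a fixed-pixel UI element on a given panel."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return element_px / ppi

icon_px = 64  # hypothetical HUD icon drawn at a fixed pixel size
print(round(physical_size_inches(icon_px, 1920, 1080, 27), 2))  # 0.78" on a 27" 1080p panel
print(round(physical_size_inches(icon_px, 2560, 1440, 27), 2))  # 0.59" on a 27" 1440p panel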
 
My experience is that pixel density matters more for the Windows desktop than for games.

When I tried 1080p on a 27" I was horrified by the desktop, but games were actually fine. Just slap on some extra anti-aliasing if you must.
 
When I had an ultrawide, 2560x1080 was pretty nice for gaming and, as I discovered upon returning to 16:9, does not tax hardware nearly as much as 2560x1440. I am actually waiting for price drops so I can get another ultrawide. 1440 and a midrange GPU don't play well when you're aiming for a consistent 100 FPS.
 
When I had an ultrawide, 2560x1080 was pretty nice for gaming and, as I discovered upon returning to 16:9, does not tax hardware nearly as much as 2560x1440. I am actually waiting for price drops so I can get another ultrawide. 1440 and a midrange GPU don't play well when you're aiming for a consistent 100 FPS.
Yeah, I'm on 2560x1080 and honestly it looks great. I've done 1080p to triple 1440p to 4K and then to 1080p ultrawide. I like the ultrawide the best.

It is not as sharp as 4K or even 1440p, but once you get into the game it is not an issue. Also, performance is much better, so you can max things out with no problem, use high anti-aliasing, etc., which you can't do at 4K.

I've even run DSR at 5K res and it looks really nice, much better than you'd think 1080p could look. So it is flexible.
 
Honestly is 1080p + anti-aliasing (or super resolution thingy) going to be all that far off from 1440p?

Desktop is another story of course.
 
Honestly is 1080p + anti-aliasing (or super resolution thingy) going to be all that far off from 1440p?

Desktop is another story of course.

Depends on the size of the monitor. As for technologies that render at higher resolutions and then scale back down: they are going to increase input lag by quite a bit.
 
As for technologies that render at higher resolutions and then scale back down: they are going to increase input lag by quite a bit.
Do you have a source for this? In my experience, I've run 5K DSR at 166 fps and everything felt super smooth.
 
Do you have a source for this? In my experience, I've run 5K DSR at 166 fps and everything felt super smooth.

"Quite a bit" may have been overstating it, but it does add some amount due to the processing required to first upscale and then downscale the image. It could be only in the ~5ms range, so those not particularly perceptible to changes in input lag may not notice.
 
Okay. I agree that there must be some cost, but it seems very small in my experience.

But I've also played on 4K on a TV (even with consoles) and that is acceptable to me so I'm not too sensitive.
 