You don't really need Ampere or Big Navi... UNLESS! (Display tech post)

Wow, guess I am just a huge loser. I "game" on a 32" curved 1080p 60hz and enjoy it. I play Wow, Overwatch, CoD, and several other titles, but according to this group, unless you use a 4k monitor you are just a huge piece of sh*t.

Why the animosity and contempt for people who cannot afford bleeding edge or high level tech?

Quote: "You can't be serious with "most people still run 1080p 60hz monitors." Normal people might be running that. Gamers started moving on about 10 years ago. Playing games on a PC does not make it a gaming PC. No one building a gaming PC has cared about 1080p for at least 5 years." :end Quote

Guess I am not a gamer....
You built your PC for gaming in the last 5 years and purposefully went with 1080p 60hz? It's just hard to imagine. 970s were doing 1440p. 1440p monitors are cheap now.

There's no animosity there by the way.
 
Wow, guess I am just a huge loser. I "game" on a 32" curved 1080p 60hz and enjoy it. I play Wow, Overwatch, CoD, and several other titles, but according to this group, unless you use a 4k monitor you are just a huge piece of sh*t.

Why the animosity and contempt for people who cannot afford bleeding edge or high level tech?

Quote: "You can't be serious with "most people still run 1080p 60hz monitors." Normal people might be running that. Gamers started moving on about 10 years ago. Playing games on a PC does not make it a gaming PC. No one building a gaming PC has cared about 1080p for at least 5 years." :end Quote

Guess I am not a gamer....

Sensitive, much? You missed the whole point.
 
Just how many copies do people think MSFS sells? Mainstream game?
 
Just how many copies do people think MSFS sells? Mainstream game?

I am not sure, but it is the type of game that reaches non-gamers and non-PC-enthusiasts disproportionately. They expect it to sell over 2.2 million copies in the first 3 years; we'll see after the Xbox release in 2021 (and I imagine that's why there is talk of such high sales figures). People who have 10-year-old PCs will get a new one just for it (or an Xbox).
 
You can't take this that seriously. Look how many people are UNDER 1080p.
1366x768 was still extremely common on laptops. I know people who work on a computer for a living who are under 1080p; imagine people who just game. I think it's possible that 20% are under 1080p.
 
Just how many copies do people think MSFS sells? Mainstream game?
Fortnite is a mainstream game... You can play it on a phone. Not really a driver of PC sales - or likely to rekindle an interest in former gamers.

MSFS is a totally different beast. A lot of people have good memories of the game from way back when. Combine nostalgia with time, opportunity and disposable income - you get a surge in PC sales from an unexpected quarter.
 
More people are at home more than before, so investing in stuff to do at home will go up. That's why gaming PCs, HDTVs and consoles are seeing great sales. Once that is over and people start being out more, I can see a sizable slump coming. So investing in a 3080, with enough good information, can give a rather big positive result for those who are gaming more now and are maybe more interested the next go around, helping the PC gaming industry grow. That would be good.

I don't see need as the driving force; having fun and doing something different with the extra income not going to movies, baseball/football/hockey games, vacations, etc. is the driving force. If the experience can be improved - more fun, in other words - there is money there for it. People spending $700+ on GPUs may also have money for better monitors. There are monitors and HDTVs that would allow even the 3090 to use its full potential for gaming.

It would be nice to see a user-end type scenario viewpoint, as in: here you have a $2500 65" OLED TV - what would it cost to actually drive detailed 4K gaming content on it? The cheapest great PC gaming build, for example a 5600X, 3080, etc. to drive it. Why would it be better than a PS5 or Xbox Series X?
 
I think the problem in the GPU market right now is that the older cards have not really dropped in price at retail. You can still buy a 2080 Super for 600-700 dollars even though the 3080 is out. The market correction hasn't happened yet and probably won't until after next week.

It's not in Nvidia's interest to drop the price. All they did was stop making them. It's been hard-ish to find a "new" 2080 for a while.
 
I did, at Micro Center in the showroom; they were all hooked up to machines with keyboards and mice, so I played around with many models. While 4k was for sure gorgeous, I found my eyes straining hard in general computing/reading, etc. on the 4k. 1440p won out for that reason. I would have needed to upgrade to 4k-capable hardware and was willing to, but ultimately I did not like the eye strain with the 4k. And for now, my 980 is limping along with older titles at 1440p until I build a new rig early next year.

If it was a bigger display 4k would probably be fine (I'm thinking 40" +), but at 32" it was too straining for me.

Text on a 4K 32" monitor is MUCH clearer than a 1440P monitor. The issue is that you need to use scaling on the text, so as an example on my 4K 32" the text is at 150%. It is not straining at all to read. Demo monitors often do not have scaling on, and at 100% yeah its completely awful to read.
 
Well I think the kind of person who has a 1080p/60 screen and then buys and installs a 3080 will probably also be the type of person who will educate themselves somewhat when they get the new toy and then decide to upgrade their monitor?
 
I bought my first 1440p monitor in late 2010: the Dell UltraSharp U2711. I went to 3440x1440 in 2015 with the LG 34UM95, then got a 4K TV in 2016. Then I bought 3x 27" 4K monitors in 2017 and recently got my first 1440p 165Hz monitor. Now I am going 4x 27" 4K 144Hz with the LG 27GN950B. I will always try to buy the most pixel-dense displays possible; if they make them, I will probably buy them. I skipped the RTX 2000 series because it just wasn't a big enough upgrade to play at 4K. The RTX 3000 series, however, is more than good enough for 4K60+.
 
I try to explain to people looking for a 3080 or 3090 that you need a MINIMUM of 1440P 144Hz in order to truly explore the capabilities of an Ampere GPU. Even a 1080P 360Hz monitor is mostly wasted because you're limited by the CPU most of the time. I specifically got a 3080 for my LG CX monitor/TV, and let me tell you... it's a match made in heaven. The 3080/3090 are able to spread their wings at 4K120. It is truly glorious.

It is time for the industry to move on from 1080P. 1440P should be the new standard for computer monitors, with 4K/8K being premium options.
 
Not sure industry needs to move on from 1080p. I recently picked up a couple 24" Asus 1080p 165hz monitors because I can play games on a system that costs 1/2 as much and only needs to be upgraded 1/2 as often.

This way when I upgrade I get great fps for the first few years and can either upgrade to keep the fps high or wait it out to save money.

There's also tons of professional gamers that prefer high and steady fps to higher resolution.

The nice thing is we don't all have to game for the same reasons and want the same gear. I can happily build a new pc that only needs a 3060 or 3070 in it.

Enjoy your 3080/3090 on 4k.
 
I try to explain to people looking for a 3080 or 3090 that you need a MINIMUM of 1440P 144Hz in order to truly explore the capabilities of an Ampere GPU. Even a 1080P 360Hz monitor is mostly wasted because you're limited by the CPU most of the time. I specifically got a 3080 for my LG CX monitor/TV, and let me tell you... it's a match made in heaven. The 3080/3090 are able to spread their wings at 4K120. It is truly glorious.

It is time for the industry to move on from 1080P. 1440P should be the new standard for computer monitors, with 4K/8K being premium options.
The problem is there aren't a lot of 4K high-refresh monitors out there, and they're expensive.
 
Not sure industry needs to move on from 1080p. I recently picked up a couple 24" Asus 1080p 165hz monitors because I can play games on a system that costs 1/2 as much and only needs to be upgraded 1/2 as often.

... .

There's also tons of professional gamers that prefer high and steady fps to higher resolution.
... .

Enjoy your 3080/3090 on 4k.

This is kind of the whole point of what I wrote. Higher resolution is not a goal in itself. Rather, you need the optimal resolution for the size of screen you use - and that, along with your desired refresh rate, should drive your card purchases.

What the new cards are doing is providing incentives for the industry to build capable large-format screens, given that the tech involved is way past pushing 1080p at 24"... but may only (we don't know yet) be satisfactory for 4K 144Hz. If the new cards are sweet at that pixel count and refresh rate, you know the competitive crew will be moving up to 32" because the larger screen size can finally give them an advantage - and the wannabe crew will follow.
 
I plan on a 38" 3800x1600 ultrawide res so I think I would benefit from a 3080. Not quite the pixels of mainstream 4K but def almost there.
 
What will Cyberpunk2077 do then, trillions?
Cyberpunk isn't likely to drive unusual purchases. Rather, the usual crew of us who game and are interested in this will do what we always do: update already capable systems to let us see all of the goodies the new title offers. The gaming industry has already mapped us out in their forecasts.

The point of interest with MSFS is that people who have not bought anything new in years, who have been happy playing solitaire on the Intel on-board GPU, are suddenly getting new systems for a game.

No one forecasted that.
 
I read this post and while I think it makes some great points, I disagree with some of it.

Thesis: The mainstream gamer is wasting their time and money looking at Big Navi or Ampere. This is because the 'mainstream' gamer is looking at all of the artificial benches and hype reviews, without really thinking about how it will affect the system they're actually gaming on.

Note: this is for the 'average' user. You [H]Gawds should already know this; but we get tourists, too

You would need to define the mainstream gamer. What does that mean? What behavioral characteristics do mainstream gamers exhibit? This also assumes that all mainstream gamers have hardware suitable to meet their needs currently. While one could make a case for older hardware being sufficient in many of these cases, it wouldn't make any sense to buy an older card when newer cards will supplant them at similar price points. If someone is rocking a GTX 960 or older card now, it wouldn't make sense to buy an RTX 2060 or 2070 now when the 3070, etc. are inbound. Regardless of your tastes as a gamer, everyone wants to get the most bang for their buck hardware wise and they want that hardware to be able to serve them as long as possible should they choose to avoid more incremental upgrades.

I write this because of an epiphany I had while trying to explain to a friend why he needed a 27 inch 1440 120hz monitor to enjoy the 3080. The realization: most people look at the bench / hype reviews without fully understanding what they're actually saying.

I don't agree with this. Everyone's threshold for enjoyment is different. Moreover, people using 1920x1080 monitors can still benefit from faster cards in some cases. Features such as DLSS and ray tracing are nice, but they carry massive performance penalties. In some games, they might even need the additional graphics "horsepower" as it were. Your "thesis" as you put it relies on mainstream gamers having no knowledge of these features or how to use them. However, you do not define a mainstream gamer and have no evidence to back up a definition for one. Granted, it's likely that most mainstream gamers (assuming we could agree on what that means) do not always understand those features or know anything about hardware. However, that doesn't mean that they can't see some benefit to having a card like the 3080. It also assumes that they don't max out games when they have the hardware for it. There is no basis for this presented in your post.

Most people still run 1080p 60hz monitors. That is the most mainstream and common monitor -- and if you are running a system with that monitor, Ampere and Navi are a waste of money for you. It's like having a 500 hp engine on a gokart. You're not going to get the most out of the engine, despite the noise and cost and bragging rights. Read about CPU limitations for a bit of this; but the bottom line is that for most monitors, the newest cards are not necessary.

Evidence? While there is evidence to suggest that the bulk of users are still using 1920x1080 monitors, you have zero evidence that they are limited to 60Hz refresh rates. This is especially true since 1920x1080 monitors have been exceeding 60Hz refresh rates for more than 10 years.

Until, and unless you are running 1440 or 4k, and trying to push 120hz or more... you are really wasting time, energy and money on the 'New Shiney'.

Because that's what these cards are: door openers for the industry to start making and pushing better display tech.

Look, I agree that buying an RTX 3080 or 3090 for 1080P is ridiculous, but I wouldn't go so far as to say that you need to run at 2560x1440 or 3840x2160 to take advantage of the extra power. This is simply not accurate. You can leverage these cards at lower resolutions, but it does take some doing and specific game features to do it.

...

(Long post, I know: read on if you want my justifications for the thesis)

...


Last Gen - even 2 Gen old cards run 1080p just fine. With overhead even, on a 1080p 60hz monitor in most, even the latest games. Scroll through any of the bench mark threads you want - see the comparisons... and anywhere you see a game on an old card still putting out @ 100 fps at 1080p; that card is JUST FINE - at 60hz.

The most interesting thing is that 2080 promised 4k gaming... and there are 27 and 32 inch 4k monitors out there (at 60hz) -- but the combo of the gold standard of high pixels and high refresh DOES NOT EXIST in the wild yet. The most promising 32 inch monitors with 4k, IPS and 120(+)hz don't even come onto the market until Q1 2021... if they stay on track.

Other than the part about older cards running games at 1920x1080 at 100FPS, the rest of this is irrelevant. I don't think the 2080 promised 4K gaming either. Do you have evidence that such a claim was made by NVIDIA? It was barely faster than a GTX 1080 Ti, and much slower than the RTX 2080 Ti. The RTX 2080 Ti sort of made that promise and actually, for the most part, it delivered on that. I actually ran 4K@60Hz with an RTX 2080 Ti and when combined with the right CPU, it can achieve that most of the time.

So let's talk panels.

You have to - before you go into thinking about buying the new cards - understand the relationship between cards, pixels, refresh rate and panel sizes. You need a new card to push high pixels and high refresh rate (FPS is NOT the whole story). You don't need a this-gen card to push a 3-gen-old panel. Pixels means 1080p, 1440, or 4k. Refresh rate is Hertz (Hz), which ties into FPS. If you have a card that lets you push 120 FPS but only a 60hz monitor... pretty much (there's more to this, btw) you're only seeing 60 FPS - despite what the game / bench says you're getting. On to the argument:

Pixels:

1080p is JUST FINE for a 24 inch monitor; in fact, it's the sweet spot. Higher resolutions can actually make gaming harder on a panel this small. The key is - given the distance most computer users sit (this is for PC, not Console gamers, btw) from the monitor, if you don't have the right, correct pixel pitch for the size of the monitor, you are going to dislike what you see. The TLDR here (for 60hz) is this:
  • 24 go with 1080p -- almost all cards from the last 3 gens work fine
  • 27 go with 1440 -- almost all cards from last gen work, and some from 2 gens ago still work
  • 32 (and up) go with 4k -- only high end cards from last gen, and very, very few (if any) older are playable/competitive

What PPI, monitor size, and resolution is best for each is largely subjective territory. I've got a 20" 1920x1080 monitor around here and it looks way nicer than any 24" 1080P monitor that I've seen in terms of pixel pitch. The image is clearer, and I think that smaller size is better than a 24" for that resolution. However, 1920x1200 and later 1920x1080 monitors did debut around the 24" panel size originally. I think more PPI is almost always better, regardless of size unless we are talking about extreme examples such as 27" 4K displays or something along those lines. However, what people can see and where they find the optimal range may be different. To me, the PPI of a 55" TV @ 4K is horrible for anything but gaming and therefore, is unacceptable. Plenty of people on these forums disagree as they run such a setup. I'd also argue that 4K isn't ideal for monitor sizes until you reach 40". I've seen 32's and less at 4K, and I can't stand font scaling and the way it looks in Windows. It's unacceptable to me on anything under 40" in size.

This is still subjective territory. That said, I can agree that your basic thought process isn't too far off the mark in general terms. 24" is fine at 1920x1080, 27's at 2560x1440 look great, and less than 32" for 4K is really a bad idea. It's too small of a monitor to really benefit from that resolution.

At 1080p, on a 24 inch monitor you've got roughly 92 pixels per inch (PPI). That's pretty durn good, and quite enjoyable. Full HD. Step it up to a 27 inch monitor, at 1080p and you've got 82 PPI - which, suddenly, can make the images on screen noticeably pixelated (you can see the spaces between the pixels). At 32 with 1080p, you get 69 PPI - which will look horrible, unless you are far away. (FYI - the farther away you are, the less your eye can tell the difference. e.g. a 42 inch 720p TV across the room from you looks fine, but up close, not so much).

At 1440, on a 24 you're going to have 122 PPI. Note here, that more does not equal better; the higher PPI makes everything look smaller, and you need a better card to push the pixels. Trying to shoot that guy far away? He's tiny. At 27 inches - you're hitting the sweet spot; 109 PPI at 1440. Trying to run 1440 on a 32? That's okay; you're getting the equivalent of Full HD on a 24 in a much larger screen -- 93 PPI. For old eyes trying to see the far away guy - he's a lot bigger now. And things look good.

A 24 inch 4k monitor is a waste of money. You're pushing 185 PPI - everything is tiny, and you need a good card. At 4k on a 27, again you're hitting that 'too many pixels in too small a space' problem; 169 PPI... The far away guy is teeny tiny. (Yes, you can play with the settings; but why go to the effort?) At 32, 4k settles down a bit to 138 PPI - but some say anything larger than 110 is too much for normal Windows use - especially on older programs without good native scaling, but others say that 32 is the sweet spot for 4k -- plenty of real estate and all the visual goodies.
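For anyone who wants to check those pixel-density numbers, PPI is just the diagonal resolution divided by the diagonal size in inches. A minimal sketch (Python; the size/resolution pairs are the examples used above, and the helper name is just for illustration):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal pixel count divided by diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

for size, (w, h) in [(24, (1920, 1080)), (27, (1920, 1080)), (32, (1920, 1080)),
                     (27, (2560, 1440)), (32, (3840, 2160))]:
    print(f'{size}" at {w}x{h}: {ppi(w, h, size):.0f} PPI')
# 24" 1080p ~92, 27" 1080p ~82, 32" 1080p ~69, 27" 1440p ~109, 32" 4K ~138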

I don't disagree with you entirely, but again this is somewhat subjective territory. It really depends on what your use case scenario is. I use my machine for a mix of productivity and gaming. PPI isn't as important for gaming, so you can afford to go larger at lower resolutions than you can for productivity tasks. Again, a 49" 3840x2160 Samsung KS8500 had pretty shitty PPI for work, but for gaming it was rather nice. I have a 28" LG 3840x2160 display and its PPI is way too high. I need font scaling to make it work, which looks almost as bad. Text is even hard to read with font scaling at times. For gaming, it doesn't look any better to me than it does on slightly larger monitors at lower resolution.

I'd also disagree with the notion that 32" is the sweet spot for 4K. For me, 38"-43" is the optimal range for such a display.

Refresh Rate (hz):

The other thing that you need to know - and that everyone's talking about, but some folks don't understand - is refresh rate. If you're running a 60hz monitor in today's games, you are actually gimped against a player running a 144 (presuming both players have cards that can push frame rates that high). If you have a 60 hz monitor, and the card / game reports you getting 100 (+) FPS - you can still be competitive, but you would do better on a higher refresh rate monitor. If both of you are getting 100 FPS reported, you only see 60, he sees all of them - and there's some complexity both in the hardware and wetware that I won't go into... but the higher refresh rate and actually seeing more of those frames does translate into faster human reaction, or, an advantage to the guy with the higher hz monitor.
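A rough way to see the 60hz cap (ignoring VRR / G-Sync and frame-pacing subtleties): the panel can only show one new frame per refresh, so the frames you actually see are capped at the refresh rate, and a new frame can appear at most every 1000/Hz milliseconds. A minimal sketch of that simplification (Python; helper names are just for illustration):

Code:
# Simplified model: the display shows at most one new frame per refresh,
# so visible FPS is capped by the refresh rate (ignores tearing/VRR details).
def visible_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

def frame_gap_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz} Hz: see {visible_fps(100, hz)} of 100 rendered FPS, "
          f"new frame every {frame_gap_ms(hz):.1f} ms")
# 60 Hz: see 60 of 100 rendered FPS, new frame every 16.7 ms
# 144 Hz: see 100 of 100 rendered FPS, new frame every 6.9 ms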

Note: Refresh rate is totally distinct from pixels; Great cards are needed to push both, or you can choose a good card to go for one over the other. Don't get sucked into the "I need a 4k monitor" thinking without understanding when/ why you want / need 4k. There's lots of competitive gamers that are buying and using 24 inch 1080p monitors; they're just spending money on getting way higher refresh rates to be as competitive as possible. The key here is - they know that at 24 the sweet spot is 1080p, but by buying a monitor and card that can push frames into the 200s... they've got a competitive advantage over the casuals playing 1080p at 60hz.

No argument with any of this.

So - Final Thought: you DON'T need a new card... UNLESS. Unless you're trying to game at a higher refresh rate than 120, or at higher pixels than 1080p, don't worry about the New Shiney.

If you want the new shiney anyway; be dad gummed sure you've got a panel that will take advantage of it!




Resources:

https://levvvel.com/pixel-density-resolution/
https://en.wikipedia.org/wiki/Pixel_density

This is a very broad statement and one that isn't necessarily accurate. While it's true that the vast majority of people need either a high refresh rate panel or a higher resolution panel to benefit from such a graphics card, this isn't universally true. Again, you can leverage DLSS and ray tracing at 1080P and get something out of that. You also do not factor in the various reasons someone might buy such a graphics card. There are more reasons than one to do it. If you have an aging system, it doesn't make sense to buy an older video card right now if you plan on keeping it over the long haul. Some people will buy ultra-high end now so that they don't need to buy anything for years on end. This actually makes some sense if you can do it.

If you buy a 3080 10GB now for $699 you might just be able to run that thing for the next 5-6 or more years. That's going to be less money in the long term than buying $400+ mid-range GPU's every two to three years. Also, those weaker cards tend to run into some issues that the more expensive cards won't. Down the road, less VRAM often becomes problematic earlier leading to a worse experience than you'd have had running a high end card over a longer period of time.
 
Fantastic, on point critique and discussion of what I wrote. I've addressed some points in posts after the OP - but a couple of quick notes.

For the sake of brevity (insert wry humor) I skipped over some things like defining and linking to sources indicating who the 'mainstream gamer' is, although in reference to the use of 24 inch 1080p others have provided links to Steam surveys etc. There are other resources out there - none of them disprove my assertion (see, e.g. https://gs.statcounter.com/screen-resolution-stats/desktop/United-States-of-America)

My shorthand definition was clearly 24" 1080p - and I assumed 60hz because most non-gamers, casuals - and frankly non-enthusiasts - likely have 60hz machines. Note: I did not (in the first part) talk about refresh rate on purpose - but I did come to it near the end. The main reason for this is that I had a specific type of audience in mind: the person who has a capable rig but hasn't done a major overhaul in years, along with people looking to get back into gaming (or first-time buyers) who have been enthused by the new tech and hype, but who don't recognize the balancing act we go through in assessing how the components of a system work together - i.e., where the 3080 (etc.) cards are really going to improve their overall performance. Specifically, I've talked to lots of people who think 4k must be better than 1080p because it's more. Hard stop. Other people literally have no idea what refresh rate they have on the desktop they currently use, but figure that because the reviews say they need a 3080 to play with good frames - that's the word.

Even the term 'gamer' triggered some people - as the term is highly subjective. One person may consider themselves a gamer if they only play WOT, AOE, Rome TW, or BFBC2 on their 24 inch 1080p 60hz machine, where others will say you are not a gamer unless you are gunning on a 240hz machine in CS:GO or LOL. 'Gamer' means different things to different people.

Back to the point about 'mainstream gamers' -- given how capable the cards of the last 3 generations have been - outside of the enthusiasts, few people have really needed to upgrade their monitors. Anyone who gets 100 + frames on a 60hz machine enjoys a relatively smooth experience. The residents of this forum are enthusiasts... But tourists deserve good info.

At the end of my post, I would hope that someone who is not an enthusiast, but is considering upgrading, would know to ask themselves what they hope to achieve by purchasing a new GPU. If they want the highest refresh rate and fps, are they willing to accept a smaller monitor? If they want a bigger monitor, what is a good target native resolution given the size of the panel, what are the trade-offs in choosing between panels, and what if they want something in between? IOW - how do you choose a card and monitor that are best for your goals?

... On to other stuff...

Yep - the 2080 (et al.) introduced RT capabilities - but much of the hype was also about 4k - which it could do, if not excel at: https://www.google.com/amp/s/www.th...x-2080-review-benchmarks-performance-features. https://www.techradar.com/reviews/nvidia-geforce-rtx-2080

...

Your 20 inch 1080p probably looks fantastic b/c the ppi is likely around 110. Clearly crisper than the 92 on a 24. You are also correct that panel choice is hugely personal and subjective. Factors like view distance, age and use all inform the choice and enjoyment of a given ppi / monitor size. 32 is, IMO, the 'entry size' for 4k. The ppi of a 32 inch 4k is higher than the recommended 110 for Windows use - but larger monitors can be overwhelming on a desktop. You can look at some of my other posts about vertical screen space for why the 32 appeals to me - the 38 inch ultrawides also have similar vertical space, which is important to some (but not all) people.

...

I won't quibble every point - because I agree with much of your criticism. My post is more of a primer than an attempt at a definitive guide to GPU and panel selection. To the extent that we have a good discussion going on here, I hope folks searching for advice on what to buy now know enough to ask questions and make informed decisions.

Thanks for your post!
 
...
Look, I agree that buying an RTX 3080 or 3090 for 1080P is ridiculous, but I wouldn't go so far as to say that you need to run at 2560x1440 or 3840x2160 to take advantage of the extra power. This is simply not accurate. You can leverage these cards at lower resolutions, but it does take some doing and specific game features to do it.
...

While it's true that the vast majority of people need either a high refresh rate panel or a higher resolution panel to benefit from such a graphics card, this isn't universally true. Again, you can leverage DLSS and ray tracing at 1080P and get something out of that. You also do not factor in the various reasons someone might buy such a graphics card. There are more reasons than one to do it. If you have an aging system, it doesn't make sense to buy an older video card right now if you plan on keeping it over the long haul. Some people will buy ultra-high end now so that they don't need to buy anything for years on end. This actually makes some sense if you can do it.

If you buy a 3080 10GB now for $699 you might just be able to run that thing for the next 5-6 or more years. That's going to be less money in the long term than buying $400+ mid-range GPU's every two to three years. Also, those weaker cards tend to run into some issues that the more expensive cards won't. Down the road, less VRAM often becomes problematic earlier leading to a worse experience than you'd have had running a high end card over a longer period of time.

I failed to address this in the OP, and wish I had. In fact, I will add an edit to it - because I think it's worth noting in the OP as a use-case scenario that does militate in favor of someone buying Ampere or BNavi for those features.

...

What specifically did you not like about 32-inch 4k? This is of particular interest to me (as you can see below)


...

If interested /willing - maybe you can help with a use-case scenario that will illustrate both our points to someone who's stumbled across this thread looking for info on an upgrade?

...


Specifically, let me ask a question about your experience with 32 inch monitors. My opinions on this are based upon research and in-store observation rather than extended personal use. Here's where I'm going, and perhaps you can lend your experience to increase my knowledge:

I've got two computers. One for (mostly) productivity runs the HPZR24W, a 16:10 1920x1200 60hz IPS display that I've been satisfied with for years. The other, almost exclusively for gaming, runs an Asus 24, 16:9 TN at 1080p with 144hz. I found over time that I really dislike the Asus - mostly because it lacks the vertical real estate, is a TN panel and the images on the HP just look much better overall. In fact, for almost every use, I prefer the HP. About the only time I actually use the Asus machine is when I'm competitively gaming and don't care so much about the visuals, but want the extra frames and information advantage the higher refresh offers. As soon as I'm done with that, it's back to my old, comfortable IPS.

The goal is to upgrade the HP, get a larger monitor and plug in Ampere or Big Navi, which as you note, will future-proof me for a while. For brevity, presume that the system overall is, aside from the monitor, a good candidate for one of the newest cards.

Observations:
  • I want a fast-refresh IPS panel and am willing to pay for good build quality.
  • I've looked at 27s, and the size increase (3 inches in width, almost none in height) doesn't excite. http://www.displaywars.com/27-inch-16x9-vs-24-inch-16x10 From a 'if I'm spending money on an upgrade' standpoint - I want a real upgrade. A 27 could be acceptable because I don't lose vertical real estate - but the math tells me that I'd be spending a pretty penny for an extra 1.5 inches on either side of my current screen.
  • I don't like curved monitors. They seem like if you're not in the exact place designed for the curve that the images look weird. I also do a lot of design work - requiring precise layout of parallel lines. Curved monitors make it difficult (for me) to be sure I'm laying things out parallel - so the monitor I need is a flat panel.
  • Industry does offer 32 inch flat panels - both in 1440p and 4k. The high refresh rate panels are "coming soon". 32 is a solid upgrade both vertically and horizontally from my 24 inch panel. http://www.displaywars.com/32-inch-16x9-vs-24-inch-16x10 The problem is, as noted in the OP - for productivity, it's not exactly an ideal size for either 1440 or 4k, given how Windows scales.
  • There are, as far as I can tell, no flat panel 34s. Certainly no 4k, IPS fast refresh panels.
  • 38s are almost all curved ultrawides
  • 40 is a TV size - and given my typical view distance (between 38 and 46 inches) seems way too big for desktop use. Pixel pitch at 4k is about 108; in the 'recommended' range for Windows scaling; but such a huge panel overwhelms the desktop and creates ergonomic problems at desktop use distances.
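To put a rough number on that last point, here's a minimal sketch (Python) of how much of your horizontal field of view a 16:9 panel occupies at a given distance; the sizes and the 38 inch viewing distance are just the figures from this list, and the helper name is mine for illustration:

Code:
import math

def horizontal_fov_deg(diagonal_in, distance_in, aspect=(16, 9)):
    # Angle the panel's width subtends at the given viewing distance.
    width = diagonal_in * aspect[0] / math.hypot(*aspect)
    return math.degrees(2 * math.atan((width / 2) / distance_in))

for size in (24, 32, 40):
    print(f'{size}": {horizontal_fov_deg(size, 38):.0f} degrees wide at 38 inches')
# roughly 31 degrees for a 24", 40 for a 32", 49 for a 40"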

You wrote: "I'd also disagree with the notion that 32" is the sweet spot for 4K. For me, 38"-43" is the optimal range for such a display."

So - the question: what's a feller to do in this scenario?

As far as I can see, the options are limited:
  • Keep the HP, skip the card
  • Get a 27" fast-refresh 1440p and the card
  • Wait for Q1 2021, get a 32" fast-refresh 4k and the card and twiddle with the scaling to ensure I can still work on it.
Do you perceive other options? What recommendations would you have for this scenario?

(and thanks in advance if you're willing to play the game!)
 
Specifically, let me ask a question about your experience with 32 inch monitors. My opinions on this are based upon research and in-store observation rather than extended personal use. Here's where I'm going, and perhaps you can lend your experience to increase my knowledge:
I should preface this that I have 3 monitors, all of them 27 inches. Two of them are 4k60, one is 1440p165hz. They're all on a single arm and aligned horizontally. Combined, it's WELL outside of a single field of vision, but that's not my use case that way.

4k at 150% scaling at the same physical dimensions as 1440p is basically the exact same size when it comes to menus and such. There is, for me, at least, a very noticeable difference in picture detail and especially in video, between the two monitors*. But I've always had really good eyesight. I game at both 4k60 or 1440p165hz depending on the style (and demands) of the game, but I absolutely will NOT watch 4k video on the 1440p monitor.

I think that 8k at 32inch with 200% scaling would be the sweet spot to finally erase jaggies and individual pixels.

*There's also an extremely noticeable framerate difference jumping from 60hz to 165hz, but I feel the detail loss at 1080p isn't worth it, regardless of how many FPS it is.

Also, only vaguely related. I just got a new 75inch 4k TV and 4K just isn't enough pixel density for a TV of that size. It's still beautiful, mostly because it has an HDR implementation that's far beyond anything I've ever used before - but even across the living room I can absolutely see the pixels on the most beautiful 4k content that exists.
 
I should preface this that I have 3 monitors, all of them 27 inches. Two of them are 4k60, one is 1440p165hz. They're all on a single arm and aligned horizontally. Combined, it's WELL outside of a single field of vision, but that's not my use case that way.

4k at 150% scaling at the same physical dimensions as 1440p is basically the exact same size when it comes to menus and such. There is, for me, at least, a very noticeable difference in picture detail and especially in video, between the two monitors*. But I've always had really good eyesight. I game at both 4k60 or 1440p165hz depending on the style (and demands) of the game, but I absolutely will NOT watch 4k video on the 1440p monitor.

I think that 8k at 32inch with 200% scaling would be... .

Interesting setup - are you driving all three monitors off the same gpu? If so are you limited to the lower resolution?

Also do you get UI problems in games?
 
I failed to address this in the OP, and wish I had. In fact, I will add an edit to it - because I think it's worth noting in the OP as a use-case scenario that does militate in favor of someone buying Ampere or BNavi for those features.

...

What specifically did you not like about 32-inch 4k? This is of particular interest to me (as you can see below)

To be clear, I have limited experience with 32" 4K monitors. However, I've seen them in stores and I do own a 28" 4K display, a 34" 3440x1440 display, a 32" 2560x1440 display, 3x 2560x1600 displays, and I used a Samsung 48" TV (can't recall the model) and a Samsung KS8500 49" 4K TV as a main display on my PC. I used each of these for roughly a year apiece.

All of these use cases have led me to conclude that the optimal display size for me would be around 43". 48" and 49" displays were great for gaming, but weren't great for productivity given that the top corners were almost too far away to be of any use. The 34" 3440x1440 display isn't quite 4K, but the pixel density for the size is good. The 32" 2560x1440 and 30" 2560x1600 monitors are also about perfect for their sizes. Basically, PPI and pixel density are, in my opinion, roughly optimal for those sizes. 28" 4K is terrible as 150% font scaling still isn't that great for visibility. It does something to the fonts where they appear thin and too light. I can work with it, but it's less than ideal. While 150% scaling was fine on the 48 and 49" TVs, I found it unnecessary and I could view text perfectly without the scaling. Again, those were ultimately too large for productivity, based on the corners being "too far" away from my vision. This could be mitigated by different stands, but as they sit they were less than ideal.

In short, it's my experience with monitors below and above 32" in size, at various resolutions, that leads me to conclude that 32" isn't the sweet spot for 4K. Around 40-43" @ 4K would be ideal in a 16:9 configuration in my opinion. Again, I've used a lot of different monitor options and I have more "keyboard time" on a computer than anyone should. 20+ hours a day for me isn't uncommon. Anything over 16 occurs multiple times a week. That 28" display is on my test bench, so it gets used extensively. The others have all been on my main rig at various points and sometimes multiple monitors at once.

...

If interested /willing - maybe you can help with a use-case scenario that will illustrate both our points to someone who's stumbled across this thread looking for info on an upgrade?

I'm not sure what I can do here. Monitors are about the most subjective thing about computing next to input devices. Your eyesight, preferences, system configuration and what games you like all factor in here. There are people who can't tell the difference between medium and high settings in games. Some people can't tell the difference between 1920x1080 and 2560x1440. Just as some people don't notice the inherent lag in older displays and some types of configurations. There are people who probably wouldn't notice if you turned off chroma subsampling on their 4K TV's.

...


Specifically, let me ask a question about your experience with 32 inch monitors. My opinions on this are based upon research and in-store observation rather than extended personal use. Here's where I'm going, and perhaps you can lend your experience to increase my knowledge:

As I said above, I have extensive usage experience with monitors of similar sizes but different resolutions. I've also got lots of experience at 4K, albeit smaller and much larger than 32". It's that experience that leads me to the conclusion I've made: I think 32" is still too small for 4K use. Font scaling is the answer for some people, and for games it can be fine, but I disagree with it as a solution. Font scaling looks weird, I don't like what it does to application windows, and the way fonts look isn't the most readable to me. This can also vary with panel types and specific font selections, ClearType settings and so on. In my experience, I have found 150% scaling to be workable, but not ideal.

When it comes to gaming, there is no doubt that at 32" the pixel density will make games at 3840x2160 look fantastic provided the display has good colors and all that. However, it is my belief that at this size, the performance impact of running at 3840x2160 resolution isn't worth the trade off. At that size, I don't think you really benefit that much from the pixel density. It's too small for us to really notice the improvement a whole lot over something like 2560x1440. At sizes around 40" or so, it starts to make more sense as the dot pitch, PPI etc. at that size would look terrible going any lower. I remember the 1920x1080P Westinghouse 37" craze and I've tried running 1080P TV's upwards of 55" in games. It's unusable for the desktop and crap for gaming in my opinion. Console gamers are used to it, but I think even they might agree that at some point the image gets so big that it doesn't look that great without a resolution increase.

I've got two computers. One for (mostly) productivity runs the HPZR24W, a 16:10 1920x1200 60hz IPS display that I've been satisfied with for years. The other, almost exclusively for gaming, runs an Asus 24, 16:9 TN at 1080p with 144hz. I found over time that I really dislike the Asus - mostly because it lacks the vertical real estate, is a TN panel and the images on the HP just look much better overall. In fact, for almost every use, I prefer the HP. About the only time I actually use the Asus machine is when I'm competitively gaming and don't care so much about the visuals, but want the extra frames and information advantage the higher refresh offers. As soon as I'm done with that, it's back to my old, comfortable IPS.

The goal is to upgrade the HP, get a larger monitor and plug in Ampere or Big Navi, which as you note, will future-proof me for a while. For brevity, presume that the system overall is, aside from the monitor, a good candidate for one of the newest cards.

That's all fine and I don't disagree with any of what you're talking about here. I have used such displays on both sides of the aisle. I switched from 3x 30" Dell 3007WFP-HC IPS monitors to 3x 27" 2560x1440 ROG Swift 144Hz monitors and while they were fantastic for gaming, I hated them for literally everything else. The viewing angles were so bad they didn't work in a multi-monitor array at all.

Observations:
  • I want a fast-refresh IPS panel and am willing to pay for good build quality.

About the fastest I know of right now are only 144Hz. Most of those, I believe are going to be 32" 2560x1440 or 3440x1440 ultra-wides.

  • I've looked at 27s, and the size increase (3 inches in width, almost none in height) doesn't excite. http://www.displaywars.com/27-inch-16x9-vs-24-inch-16x10 From a 'if I'm spending money on an upgrade' standpoint - I want a real upgrade. A 27 could be acceptable because I don't lose vertical real estate - but the math tells me that I'd be spending a pretty penny for an extra 1.5 inches on either side of my current screen.

I couldn't agree more. That prospect by itself isn't exciting either. 27's aren't a size I like at all. The problem may be unique to me, but it mostly comes from the fact that I went from a 20" 4:3 (1600x1200) display to a 30" 2560x1600 display. I literally never owned a 1080P display on my main gaming rig. I've used plenty for work and on my test bench, but I never played games on them.

  • I don't like curved monitors. They seem like if you're not in the exact place designed for the curve that the images look weird. I also do a lot of design work - requiring precise layout of parallel lines. Curved monitors make it difficult (for me) to be sure I'm laying things out parallel - so the monitor I need is a flat panel.

I'm a fan of curved monitors in specific cases. Going over 43", absolutely. My 3440x1440 ultra-wide is curved, but its an IPS panel and the viewing angles etc. are pretty good. That being said, I rarely look at it off angle. I normally sit right in front of it and use it for work and playing games. In general, I like this display albeit, I'd prefer a larger one. 38" ultra-wides are something I've looked into, and 40" 4K's appeal to me as well.

  • Industry does offer 32 inch flat panels - both in 1440p and 4k. The high refresh rate panels are "coming soon". 32 is a solid upgrade both vertically and horizontally from my 24 inch panel. http://www.displaywars.com/32-inch-16x9-vs-24-inch-16x10 The problem is, as noted in the OP - for productivity, it's not exactly an ideal size for either 1440 or 4k, given how Windows scales.

I disagree. 32" at 2560x1440 is perfectly fine. It's actually a good size for that resolution. I used 30" 2560x1600 and a 32" 2560x1440 monitor quite a bit. They are both fine choices for productivity and play. That said, I'd want more than one if I were going with that size. It's actually something I've debated given some of the cons of ultra-wide displays. There are also serious cons with 40"+ displays at 4K as well.

  • There are, as far as I can tell, no flat panel 34s. Certainly no 4k, IPS fast refresh panels.

No, you basically have 34" Ultra-wides. Those are 3440x1440. Stepping up to 38" is a big jump in cost and a resolution increase to 3840x1600. Nearly 4K. Refresh rates can go upwards of 144Hz on the latest 34" ultra-wides but you will pay for that. It's more than $1,000 for a good one.

  • 38s are almost all curved ultrawides

No, I'm pretty sure all of them are.

  • 40 is a TV size - and given my typical view distance (between 38 and 46 inches) seems way too big for desktop use. Pixel pitch at 4k is about 108; in the 'recommended' range for Windows scaling; but such a huge panel overwhelms the desktop and creates ergonomic problems at desktop use distances.

Actually, this isn't true. There are a handful of genuine 40" or 43" 4K displays. The problem is that most are 60Hz. Even the 120Hz Acer Predator X43 has drawbacks. HDR is present, but not really enough to be a worthwhile feature on its own. You have to use dual displayport cables to run it. I believe there are issues running G-Sync like that, and so on. Having said that, I used a 48" and later a 49" TV for Windows use without font scaling and found that if I could have gone a bit smaller, about 43", it would have been nearly perfect in my opinion.

You wrote: "I'd also disagree with the notion that 32" is the sweet spot for 4K. For me, 38"-43" is the optimal range for such a display."

So - the question: what's a feller to do in this scenario?

As far as I can see, the options are limited:
  • Keep the HP, skip the card
  • Get a 27" fast-refresh 1440p and the card
  • Wait for Q1 2021, get a 32" fast-refresh 4k and the card and twiddle with the scaling to ensure I can still work on it.
Do you perceive other options? What recommendations would you have for this scenario?

(and thanks in advance if you're willing to play the game!)

The problem is that all displays are going to be a trade off for one reason or another. That may be something that persists forever. It's pretty much that way with any technology I'm familiar with including things like firearms or cars. I'd probably still buy the upgraded graphics card as soon as you can and then wait a bit for the better monitor. Then again, I don't think we agree on what's ideal. In your shoes I'd probably go with a faster 32" 2560x1440 monitor and enjoy being able to use some DLSS and potentially ray tracing in newer games. 34" Ultra-wides are an option, but they are very expensive for good ones. Again, I wouldn't go 4K unless the display was larger than 32". So far, I've not been satisfied with font scaling.

Personally, I ended up with an Alienware AW3418DW the last time I went looking for a display. Despite the ultra-wide cons, it's pretty good for a productivity and gaming display. Nothing I play has an issue at this aspect ratio. At 120Hz, it's reasonably fast. It's IPS, which is good for photo editing, which I do a lot of because I'm shit with a camera. It supports G-Sync, which helps make for a smooth gaming experience even when I can't sustain 120FPS or higher in a game. Color reproduction is good, contrasts are reasonable for an IPS display. Pixel density at this size is excellent. I don't need font scaling or anything like that. The images are sharp and clear. So it works for me. I do wish it were larger and supported HDR, but we can't have everything.
 
I failed to address this in the OP, and wish I had. In fact, I will add an edit to it - because I think it's worth noting in the OP as a use-case scenario that does mitigate in favor of someone buying Ampere or BNavi for those features.

...

What specifically did you not like about 32-inch 4k? This is of particular interest to me (as you can see below)


...

If interested /willing - maybe you can help with a use-case scenario that will illustrate both our points to someone who's stumbled across this thread looking for info on an upgrade?

...


Specifically, let me ask a question about your experience with 32 inch monitors. My opinions on this are based upon research and in-store observation rather than extended personal use. Here's where I'm going, and perhaps you can lend your experience to increase my knowledge:

I've got two computers. One for (mostly) productivity runs the HPZR24W, a 16:10 1920x1200 60hz IPS display that I've been satisfied with for years. The other, almost exclusively for gaming, runs an Asus 24, 16:9 TN at 1080p with 144hz. I found over time that I really dislike the Asus - mostly because it lacks the vertical real estate, is a TN panel and the images on the HP just look much better overall. In fact, for almost every use, I prefer the HP. About the only time I actually use the Asus machine is when I'm competitively gaming and don't care so much about the visuals, but want the extra frames and information advantage the higher refresh offers. As soon as I'm done with that, it's back to my old, comfortable IPS.

The goal is to upgrade the HP, get a larger monitor and plug in Ampere or Big Navi, which as you note, will future-proof me for a while. For brevity, presume that the system overall is, aside from the monitor, a good candidate for one of the newest cards.

Observations:
  • I want a fast-refresh IPS panel and am willing to pay for good build quality.
  • I've looked at 27s, and the size increase (3 inches in width, almost none in height) doesn't excite. http://www.displaywars.com/27-inch-16x9-vs-24-inch-16x10 From a 'if I'm spending money on an upgrade' standpoint - I want a real upgrade. A 27 could be acceptable because I don't lose vertical real estate - but the math tells me that I'd be spending a pretty penny for an extra 1.5 inches on either side of my current screen.
  • I don't like curved monitors. If you're not sitting in exactly the spot the curve was designed for, the image looks off. I also do a lot of design work requiring precise layout of parallel lines, and curved monitors make it difficult (for me) to be sure I'm laying things out parallel, so the monitor I need is a flat panel.
  • The industry does offer 32-inch flat panels, in both 1440p and 4K; the high-refresh-rate versions are "coming soon". 32 is a solid upgrade both vertically and horizontally from my 24-inch panel. http://www.displaywars.com/32-inch-16x9-vs-24-inch-16x10 The problem is, as noted in the OP, that for productivity it's not exactly an ideal size for either 1440p or 4K, given how Windows scales.
  • There are, as far as I can tell, no flat-panel 34s, and certainly no 4K IPS fast-refresh ones.
  • 38s are almost all curved ultrawides.
  • 40 is a TV size, and given my typical viewing distance (between 38 and 46 inches) it seems way too big for desktop use. Pixel density at 4K is about 108 PPI, in the 'recommended' range for Windows scaling, but such a huge panel overwhelms the desktop and creates ergonomic problems at desktop viewing distances. (The size/PPI math behind these comparisons is sketched just after this list.)
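
For anyone following along, here's a quick back-of-the-envelope sketch (in Python, purely illustrative) of the math behind those displaywars links: width and height fall out of the diagonal and aspect ratio, and PPI is just the pixel diagonal divided by the physical diagonal. The sizes listed are simply the ones discussed above.

```python
import math

# Physical width/height from diagonal + aspect ratio, and pixel density (PPI)
# from the pixel diagonal divided by the physical diagonal.
def screen_stats(diagonal_in, aspect_w, aspect_h, res_w, res_h):
    aspect_diag = math.hypot(aspect_w, aspect_h)
    width = diagonal_in * aspect_w / aspect_diag
    height = diagonal_in * aspect_h / aspect_diag
    ppi = math.hypot(res_w, res_h) / diagonal_in
    return width, height, ppi

for name, args in [
    ('24" 16:10 1920x1200', (24, 16, 10, 1920, 1200)),
    ('27" 16:9  2560x1440', (27, 16, 9, 2560, 1440)),
    ('32" 16:9  3840x2160', (32, 16, 9, 3840, 2160)),
    ('40" 16:9  3840x2160', (40, 16, 9, 3840, 2160)),
]:
    w, h, ppi = screen_stats(*args)
    print(f'{name}: {w:.1f}" x {h:.1f}", {ppi:.0f} PPI')
```

Running that gives roughly 20.4x12.7" at 94 PPI for the 24" 16:10, 23.5x13.2" at 109 PPI for the 27", 27.9x15.7" at 138 PPI for the 32" 4K, and 34.9x19.6" at about 110 PPI for a 40" 4K, which is where the "extra 1.5 inches per side" and the roughly-108-PPI figures above come from.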

You wrote: "I'd also disagree with the notion that 32" is the sweet spot for 4K. For me, 38"-43" is the optimal range for such a display."

So - the question: what's a feller to do in this scenario?

As far as I can see, the options are limited:
  • Keep the HP, skip the card
  • Get a 27" fast-refresh 1440p and the card
  • Wait for Q1 2021, get a 32" fast-refresh 4k and the card and twiddle with the scaling to ensure I can still work on it.
Do you perceive other options? What recommendations would you have for this scenario?

(and thanks in advance if you're willing to play the game!)
As you mentioned, high-refresh 32-inch IPS panels aren't available yet. Most of the focus has been on 27-inch, because that's considered a sweet spot in terms of size vs. the physical field of view the monitor occupies. At common desk gaming distances you get a pretty large monitor, but you still don't have to physically turn your head to see the edges of the screen. Also, 1440p and 1080p both have good PPI there, so you can choose whichever suits your needs and still drive consistently high framerates, assuming you have the hardware.

Due to all of that, I do not think pro gamers will be moving up to 32-inch screens. It's a bit big for most desk setups or setups at a competition hall.

*Also, some pros still prefer TN panels anyway, and the bigger those get, the worse their image quality becomes: after about 24 inches you basically have permanent color shift at each corner, because the physical distance means the viewing angle is always sub-par.


Personally, another positive for a larger monitor like a 27 is that it's more comfortable for my eyes to focus on for long periods of time than a 24-inch or smaller panel.


And back on 32-inch screens: the good 32-inch IPS screens are really expensive, like $700+, and once high-refresh equivalents are available, prices are going to be even higher. I doubt there will be any 32-inch 144Hz 1440p or 4K IPS worth owning for pricing similar to current 32-inch high-refresh VA ($350 and up) or current 27-inch high-refresh IPS ($350 and up). Heck, some of the 32-inch VA screens are over $450. I mean, look at how much the best 27-inch high-refresh IPS cost... ugh.

The monitor market is nuts right now. Not only due to pricing, but also the "quality" you get for that price. Even the best monitors are often subject to backlight bleed, poor uniformity, dirty screen effect, etc.



For me personally, if I were seriously considering a high-dollar monitor for gaming, I would instead save my money a little longer and then mount one of LG's or Vizio's smallest 120Hz-capable OLEDs on the wall behind my desk, pulling the desk away from the wall a bit if I had to in order to achieve a balanced physical FOV. If you can't mount to a wall, get a second desk behind your desk and do an arm mount.

Yeah, it's twice the money, but it's a dramatically better display. OLED response times are a fraction of the best IPS, on par with high-refresh TN, and the image quality is essentially perfect. Input lag in game mode is as good or better. Yes, it's true that a 240Hz TN technically has a bit better motion clarity than a 120Hz OLED, but I'm willing to bet the perfect contrast and color response of the OLED benefits competitive gaming as much or more, in terms of competitive scores, and it absolutely benefits non-competitive gaming.
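
To put rough numbers on that motion-clarity point (a purely illustrative sketch, assuming a full-persistence sample-and-hold display and ignoring pixel response time): perceived motion blur on sample-and-hold screens is dominated by how long each frame stays on screen, so the refresh rate sets the floor.

```python
# Frame persistence on a full-persistence sample-and-hold display:
# each frame is held for roughly 1000 / refresh_rate milliseconds,
# which sets the floor for motion blur (pixel response adds on top).
for hz in (60, 120, 144, 240):
    print(f'{hz:>3} Hz -> {1000 / hz:.1f} ms per frame')
```

That works out to about 16.7 ms at 60Hz, 8.3 ms at 120Hz, 6.9 ms at 144Hz and 4.2 ms at 240Hz, which is why a 240Hz TN keeps a clarity edge over a 120Hz OLED even though the OLED's pixel response is near-instant.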

If you use it only for gaming and watching movies/video, burn-in shouldn't be an issue (I speak as the owner of an LG plasma TV that suffered minor, temporary burn-in). Have a second, actual monitor for your general PC use. Worst case, I might have a life bar or something burnt into the screen that I'd need to wipe out with 10 minutes of color slides before watching a wide-aspect, letterboxed movie (black bars). If the movie or TV show I watched was 16:9, so it took up the whole screen, I wouldn't need to bother; the first few minutes of viewing would clear out the burn-in.


Currently, I have a high-refresh, 1440p, 32-inch VA panel. It's great for gaming and was affordable at $350, but it's not good for sharing a movie with my girlfriend; VA viewing angles just aren't good enough.
 
Text on a 4K 32" monitor is MUCH clearer than a 1440P monitor. The issue is that you need to use scaling on the text, so as an example on my 4K 32" the text is at 150%. It is not straining at all to read. Demo monitors often do not have scaling on, and at 100% yeah its completely awful to read.
Would that be a monitor setting or something in Windows?
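
For what it's worth, that scaling percentage is a Windows setting (Settings > System > Display > Scale), not something on the monitor's OSD. Here's a rough, purely illustrative sketch of what a given scale factor does to your effective desktop workspace (the helper name is just made up for the example):

```python
# UI elements are drawn scale-times larger, so the desktop "feels" like a
# lower-resolution screen while text still renders at full native density.
def effective_workspace(res_w, res_h, scale_percent):
    s = scale_percent / 100
    return res_w / s, res_h / s

for scale in (100, 125, 150):
    w, h = effective_workspace(3840, 2160, scale)
    print(f'4K at {scale}%: workspace behaves like {w:.0f} x {h:.0f}')
```

So a 4K panel at 150% gives you roughly the workspace of a 2560x1440 screen, but with much sharper text, which is the trade-off being described above.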
 
You built your PC for gaming in the last 5 years and purposefully went with 1080p 60hz? It's just hard to imagine. 970s were doing 1440p. 1440p monitors are cheap now.

There's no animosity there by the way.
Especially during the mining hype, GPUs were really not cheap over the last 5 years:
https://ca.camelcamelcamel.com/product/B00R3NK2LE?context=search

And they didn't even hit 60 FPS at 1080p in all games at high settings (depending on what type of game you play, obviously), and 1440p was a good step up in price.
 
Sitting here playing World of Warcraft on a 43" 4K IPS monitor with a 2080. I also have 32GB of RAM for good measure.

Didn't read all the shenanigans in this thread. To each their own and what they want.

I also like a fast system for editing my photography.

(attached screenshot: 4k.jpg)
 
I'm buying an LG CX 77" OLED for my home theater. With anything other than a 3xxx card I might as well game on a console...
 
I use an Eizo FG2421, 1080p 120Hz, as my main monitor. Not much by modern standards, but it'll tide me over until I can treat myself to an LG C9 or CX once 4K 120Hz G-SYNC support has been verified by the people lucky enough to actually have Ampere cards.

Meanwhile, I'm still itching for an Ampere or Big Navi card. Why?

Two words: Valve Index. Driving my Oculus Rift CV1 was hard enough; driving the Index at 120 or even 144Hz is a true challenge. VR does not handle frame drops below the refresh rate gracefully, and reprojection artifacts look just plain awful.
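
To put rough numbers on that (an illustrative sketch, assuming the half-rate fallback that runtimes like Oculus ASW and SteamVR motion smoothing typically use): the frame-time budget shrinks quickly as headset refresh rates climb, and missing it usually means dropping to reprojection at roughly half rate.

```python
# Frame-time budget per headset refresh rate, and the effective frame rate
# you land at if the GPU misses the budget and the runtime reprojects at
# roughly half rate (the usual fallback behaviour).
for hz in (90, 120, 144):
    budget_ms = 1000 / hz
    print(f'{hz:>3} Hz: {budget_ms:.2f} ms budget, '
          f'miss it and reprojection drops you to ~{hz // 2} FPS')
```

So at 144Hz you have less than 7 ms per frame, versus about 11 ms on the old 90Hz CV1, which is part of why the Index is so much harder to drive.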

The other reason is that I'm still on a five-year-old GTX 980; Pascal wasn't available yet when I bought it, and I was put off by Turing's price hike. Ampere is what I've been waiting for: an upgrade I can readily feel without paying through the nose for it, except I just can't get my hands on a damn 3080 despite my best efforts. We'll see if the 3070 fares any better.

Said GTX 980 is still okay for gaming at 1080p, but for VR? Let me put it this way: DCS World went from reprojecting all the time, even in just a free flight with nobody else around, to actually quite playable, simply by swapping the GTX 980 for an RTX 2080 (non-Super) for an evening. My performance literally DOUBLED, all without changing the rest of my seven-year-old i7-4770K @ 4.5 GHz system.

To be fair, DCS is notoriously unoptimized. All you people calling MSFS 2020 the next Crysis are only just now discovering the dilemma that sim gamers outside of racing sims get to deal with: IL-2 Great Battles isn't much better, ArmA 3 isn't even an in-depth vehicle sim and still performs like jank... and the kicker? DCS performs much, MUCH better in flat/pancake monitor mode than it does in VR.

I'm due for an upgrade already, just let me get Ampere or Big Navi now, and I'll probably go Zen 4 next year to bring the rest of the computer up to speed. Having proper NVMe boot support without any UEFI hacking kludgery is gonna be critical after this next gen of consoles.

Oh, wait, there's one more reason: HDMI 2.1. That's going to be critical for having decent monitor and HDTV options, aforementioned LG OLED offerings included. We need the new cards to get that.
 
There are more things that 1080p has to offer now than before, as AMD added Image Sharpening and Radeon Boost all the way back to the RX 570.

When you offer 4GB and 8GB versions of the same card on Marketplace, a card known for 1080p, buyers all want the 8GB card, because new games like Crysis Remastered now use over 7GB of VRAM at 1080p medium settings, and DX12 is only adding more life for the 1080p gamer who can get by with an RX 5500 XT 8GB / RX 570 8GB.

Hopefully Zen 3 can scale with every video card made and give them all a bump in overall performance if it exceeds Intel in gaming. Everyone gets the uplift, and that's why prices would hold firm on DX12-ready cards with 6GB or more; that's my take from the Marketplace sales questions I get asked.
 
Wow, guess I am just a huge loser. I "game" on a 32" curved 1080p 60hz and enjoy it. I play Wow, Overwatch, CoD, and several other titles, but according to this group, unless you use a 4k monitor you are just a huge piece of sh*t.

Why the animosity and contempt for people who cannot afford bleeding edge or high level tech?
Use whatever you want and makes you happy. People here are enthusiasts, and not everyone has the same priorities. Case in point: I could easily spend $5K on an Asus PA32UCX-PK mini-LED monitor. Heck, I could buy several if I wanted to right now. Instead, I only finally upgraded to a 27" 1440p 100Hz monitor (not even 144Hz, the nerve!) and spent that $5K redoing my house's crawl space and cutting down two trees that were starting to loom dangerously over my bedroom. Won't game much if my house sinks into rotten wood or a tree falls on me, will I!

That’s just my point, and I think it’s yours too. Instead of $5K I spent $300 on a monitor because it was good value and it serves me well for work and play. Meanwhile, my family and house are higher priorities than gaming. Not everyone here gets that people have more priorities than playing videogames (to be fair, plenty of people do! Those who don’t tend to be the most vocal, sadly).

You and I are as much gamers as anyone who spends all their money upgrading their PC all the time. Nobody needs to justify their spending habits, and if anyone here judges you for it... well, mute them from the forum. I've created a nicely curated Hardforum limited to respectful people that way.
 
I upgraded my whole system to be able to game at 120 FPS or higher at 1440p with high settings. That was the whole reason for getting the system in my specs. Before, I was getting 75-90 FPS in most games, with some on high detail, and that just wasn't enough anymore.
 