DarkSideA8
Gawd
Joined: Apr 13, 2005
Messages: 990
Thesis: The mainstream gamer is wasting time and money looking at Big Navi or Ampere. That's because the 'mainstream' gamer is looking at all the artificial benches and hype reviews without really thinking about how a new card will affect the system they're actually gaming on.
I write this because of an epiphany I had while trying to explain to a friend why he needed a 27 inch 1440p 120(+)Hz monitor to enjoy the 3080. The realization: most people read the bench / hype reviews without fully understanding what they're actually saying.
Most people still run 1080p 60Hz monitors. That is the most mainstream and common monitor, and if that's what you're running, Ampere and Navi are a waste of money for you. It's like putting a 500 hp engine on a go-kart: you're not going to get the most out of the engine, despite the noise, cost and bragging rights. Read up on CPU limitations for part of the reason; the bottom line is that for most monitors, the newest cards are not necessary.
Unless and until you are running 1440p or 4k and trying to push 120Hz or more... you are really wasting time, energy and money on the 'New Shiney'.
Because that's what these cards are: door openers for the industry to start making and pushing better display tech.
...
(Long post, I know: read on if you want my justifications for the thesis)
...
Last-gen, even two-gen-old cards run 1080p just fine. With overhead even, on a 1080p 60Hz monitor, in most games, even the latest. Scroll through any of the benchmark threads you want and see the comparisons... anywhere you see an old card still putting out ~100 fps at 1080p, that card is JUST FINE at 60Hz.
The most interesting thing is that the 2080 promised 4k gaming... and there are 27 and 32 inch 4k monitors out there (at 60Hz), but the gold-standard combo of high pixels AND high refresh DOES NOT EXIST in the wild yet. The most promising 32 inch 4k IPS 120(+)Hz monitors don't even come onto the market until Q1 2021... if they stay on track.
So let's talk panels.
Before you go thinking about buying the new cards, you have to understand the relationship between cards, pixels, refresh rate and panel size. You need a new card to push high pixel counts and high refresh rates (FPS is NOT the whole story), but you don't need a current-gen card to push a three-gen-old panel. Pixel count means 1080p, 1440p, or 4k. Refresh rate is Hertz (Hz), which ties into FPS: if you have a card that can push 120 FPS but only a 60Hz monitor, then pretty much (there's more to this, btw) you're only seeing 60 FPS, despite what the game / bench says you're getting. On to the argument:
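If you want to see why pixels and refresh rate multiply together into GPU load, here's a back-of-the-envelope sketch. It counts raw pixels per second only; real GPU load also depends on shading cost and settings, so treat the ratios as ballpark:

```python
# Rough pixel-throughput comparison: why high resolution AND high refresh
# both demand a stronger card. Raw pixels per second only -- real GPU load
# scales with shading cost too, so these ratios are ballpark figures.

def pixels_per_second(width, height, hz):
    return width * height * hz

modes = {
    "1080p @ 60Hz": pixels_per_second(1920, 1080, 60),
    "1440p @ 144Hz": pixels_per_second(2560, 1440, 144),
    "4k @ 120Hz": pixels_per_second(3840, 2160, 120),
}

baseline = modes["1080p @ 60Hz"]
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.0f} Mpx/s ({pps / baseline:.1f}x the 1080p/60 load)")
```

4k at 120Hz works out to roughly eight times the raw pixel throughput of 1080p at 60Hz, which is the whole "new card for new panel" point in one number.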
Pixels:
1080p is JUST FINE for a 24 inch monitor; in fact, it's the sweet spot. Higher resolutions can actually make gaming harder on a panel this small. The key is that, given the distance most PC users sit from the monitor (this is for PC, not console gamers, btw), if you don't have the right pixel pitch for the size of the monitor, you are going to dislike what you see. The TLDR here (for 60Hz) is this:
- 24 inch: go with 1080p -- almost all cards from the last 3 gens work fine
- 27 inch: go with 1440p -- almost all cards from last gen work, and some from 2 gens ago still work
- 32 inch (and up): go with 4k -- only high-end cards from last gen, and very, very few (if any) older ones, are playable/competitive
1080p vs Screen Size: At 1080p on a 24 inch monitor you've got roughly 92 pixels per inch (PPI). That's pretty durn good, and quite enjoyable: Full HD. Step up to a 27 inch monitor at 1080p and you've got 82 PPI, which, suddenly, can make the images on screen noticeably pixelated (you can see the spaces between the pixels). At 32 inches with 1080p you get 69 PPI, which will look horrible unless you are far away. (FYI, the farther away you are, the less your eye can tell the difference; e.g. a 42 inch 720p TV across the room from you looks fine, but up close, not so much.)
1440 vs Screen Size: At 1440p on a 24 inch you're going to have 122 PPI. Note here that more does not equal better: the higher PPI makes everything look smaller, and you need a better card to push the pixels. Trying to shoot that guy far away? He's tiny. At 27 inches you're hitting the sweet spot: 109 PPI at 1440p. Trying to run 1440p on a 32? That's okay; you're getting roughly the equivalent of Full HD on a 24 in a much larger screen -- about 92 PPI. For old eyes trying to see the far-away guy, he's a lot bigger now. And things look good.
4k vs Screen Size: A 24 inch 4k monitor is a waste of money. You're pushing 184 PPI; everything is tiny, and you need a good card. At 4k on a 27 you again hit the 'too many pixels in too small a space' problem: 163 PPI... The far-away guy is teeny tiny. (Yes, you can play with the settings, but why go to the effort?) At 32 inches, 4k settles down a bit to 138 PPI. Some say anything over 110 PPI is too much for normal Windows use, especially in older programs without good native scaling, but others say 32 is the sweet spot for 4k -- plenty of real estate and all the visual goodies.
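All the PPI numbers above come straight from the Pythagorean diagonal (diagonal pixel count divided by diagonal inches). If you want to check a panel size or resolution I didn't list, a few lines of Python will do it:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Print the full size-vs-resolution grid used in the post.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
for inches in (24, 27, 32):
    for name, (w, h) in resolutions.items():
        print(f"{inches} inch {name}: {ppi(w, h, inches):.0f} PPI")
```

Running it reproduces the figures in the three paragraphs above (92 / 82 / 69 for 1080p, 122 / 109 / 92 for 1440p, 184 / 163 / 138 for 4k).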
Refresh Rate (hz):
The other thing you need to know, the thing everyone's talking about but some folks don't understand, is refresh rate. If you're running a 60Hz monitor in today's games, you are actually gimped against a player running 144Hz (presuming both players have cards that can push frame rates that high). If you have a 60Hz monitor and the card / game reports 100(+) FPS, you can still be competitive, but you would do better on a higher refresh rate monitor. If both of you are getting 100 FPS reported, you only see 60 of them; he sees all of them. There's some complexity in both the hardware and the wetware that I won't go into, but the higher refresh rate does translate into faster human reactions, i.e. an advantage for the guy with the higher Hz monitor. Image smoothness is what's usually shown when comparing 60Hz vs 144Hz, and there are plenty of videos out there about it. The big thing to know is that the major improvement is the jump from 60 to 120(+), and that many, many people cannot really see the difference between 144 and 240(+).
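The "you only see 60 of them" point can be sketched as a toy model. This is the simplified, vsync-on version: with vsync off you get tearing and partial frames, which is part of the complexity glossed over above, so take the hard cap as an approximation:

```python
# Toy model of what a fixed-refresh monitor can actually show. With vsync on,
# the panel presents at most one new frame per refresh cycle, so the number
# of distinct frames you SEE per second is capped at the refresh rate, no
# matter what the game's FPS counter reports. (Vsync-off tearing complicates
# this -- this is the simplified version of the argument.)

def visible_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(visible_fps(100, 60))    # 60Hz panel: you see 60 of the 100 frames
print(visible_fps(100, 144))   # 144Hz panel: you see all 100
```

Same reported FPS, very different experience, which is exactly the 60Hz-vs-144Hz gap described above.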
Note: Refresh rate is totally distinct from pixel count. Great cards are needed to push both, or you can choose a good card that favors one over the other. Don't get sucked into "I need a 4k monitor" thinking without understanding when and why you want or need 4k. There are lots of competitive gamers buying and using 24 inch 1080p monitors; they're just spending their money on much higher refresh rates to be as competitive as possible. The key here is that they know the sweet spot at 24 inches is 1080p, but by buying a monitor and card that can push frames into the 200s... they've got a competitive advantage over the casuals playing 1080p at 60Hz.
Dan_D pointed out a use case justifying the purchase of a 3080 or RX 6000 class card that I failed to write about originally: getting one of the cards for the 'new goodness' of ray tracing, DLSS, higher VRAM, and cost/benefit longevity. These are absolutely valid considerations, as old cards don't (necessarily) offer these techs / advantages, regardless of panel size. Most of the people who know about this stuff are enthusiasts. The user I originally wrote about is someone on a 24 inch 1080p monitor who either doesn't know their refresh rate or has a 60Hz monitor... and that person might want to play Minecraft or some other title (Cyberpunk, etc.) that offers RT, and the new cards are ideal for that. There are also quite a few games becoming demanding in terms of VRAM, and the new cards have lots of it. The final point is cost/benefit and longevity: you certainly should not buy an old card for anything close to the price points the new cards are offered at.
So, final thought: you DON'T need a new card... UNLESS you're trying to game at a higher refresh rate like 120(+)Hz, at higher pixel counts than 1080p, or want to leverage the new technologies. Otherwise, don't worry about the New Shiney.
If you want the New Shiney anyway, be dad gummed sure you've got a panel that will take advantage of it!
--EDIT 2 --> The DisplayNinja link below has some interesting data about how far from a monitor you have to sit before you stop noticing the pixels.
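For the curious, that "when do pixels disappear" distance can be ballparked with the common 1-arcminute rule of thumb for 20/20 vision. To be clear, this is an illustration I'm adding, not data from the DisplayNinja article; individual eyesight and subpixel layouts vary:

```python
import math

# Rough "retina" distance: how far away you must sit before one pixel
# subtends less than ~1 arcminute, a common rule of thumb for 20/20 acuity.
# An approximation for illustration only -- real vision varies per person.

def retina_distance_inches(ppi, acuity_arcmin=1.0):
    pixel_pitch = 1.0 / ppi  # inches per pixel
    angle_rad = math.radians(acuity_arcmin / 60.0)
    return pixel_pitch / math.tan(angle_rad)

print(f"{retina_distance_inches(92):.0f} inches")   # 24 inch 1080p (92 PPI): ~37 inches
print(f"{retina_distance_inches(163):.0f} inches")  # 27 inch 4k (163 PPI): ~21 inches
```

By this rule of thumb, a 24 inch 1080p panel stops looking pixelated at roughly arm's length plus a bit, which lines up with the "sit at normal desk distance and it's fine" advice above.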
Resources:
https://levvvel.com/pixel-density-resolution/
https://en.wikipedia.org/wiki/Pixel_density
https://www.displayninja.com/what-is-pixel-density/
Note: this is for the 'average' user. You [H]Gawds should already know this; but we get tourists, too