You don't really need Ampere or Big Navi... UNLESS! (Display tech post)

DarkSideA8

Thesis: The mainstream gamer is wasting their time and money looking at Big Navi or Ampere. This is because the 'mainstream' gamer is looking at all of the artificial benches and hype reviews without really thinking about how a new card will actually behave on the system they're gaming on.

Note: this is for the 'average' user. You [H]Gawds should already know this, but we get tourists, too.

I write this because of an epiphany I had while trying to explain to a friend why he needed a 27-inch 1440p 120(+)Hz monitor to enjoy the 3080. The realization: most people look at the bench / hype reviews without fully understanding what they're actually saying.

Most people still run 1080p 60Hz monitors. That is the most common, most mainstream monitor -- and if you are running a system with that monitor, Ampere and Navi are a waste of money for you. It's like putting a 500 hp engine on a go-kart: you're not going to get the most out of the engine, despite the noise and cost and bragging rights. Read up on CPU limitations for part of the reason; but the bottom line is that for most monitors, the newest cards are not necessary.

Unless and until you are running 1440p or 4K and trying to push 120Hz or more... you are really wasting time, energy and money on the 'New Shiney'.

Because that's what these cards are: door openers for the industry to start making and pushing better display tech.

...

(Long post, I know: read on if you want my justifications for the thesis)

...


Last-gen - even two-gen-old - cards run 1080p just fine, with overhead even, on a 1080p 60Hz monitor in most games, including the latest. Scroll through any of the benchmark threads you want and see the comparisons... anywhere you see a game on an old card still putting out around 100 fps at 1080p, that card is JUST FINE - at 60Hz.

The most interesting thing is that the 2080 promised 4K gaming... and there are 27- and 32-inch 4K monitors out there (at 60Hz) -- but the gold-standard combo of high pixel count and high refresh rate DOES NOT EXIST in the wild yet. The most promising 32-inch monitors with 4K, IPS and 120(+)Hz don't even come onto the market until Q1 2021... if they stay on track.

So let's talk panels.

Before you even start thinking about buying the new cards, you have to understand the relationship between cards, pixels, refresh rate and panel sizes. You need a new card to push a high pixel count and a high refresh rate (FPS is NOT the whole story). You don't need a this-gen card to push a three-gen-old panel. Pixels means 1080p, 1440p or 4K. Refresh rate is Hertz (Hz), which ties into FPS. If you have a card that lets you push 120 FPS but only a 60Hz monitor, then - pretty much (there's more to this, btw) - you're only seeing 60 FPS, despite what the game / bench says you're getting. On to the argument:
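To make that concrete, here's a deliberately crude sketch (Python; the simplification is mine - the real story involves tearing, V-Sync and adaptive sync, which I'm skipping, as noted above):

```python
# Crude model: a panel can only show as many unique frames per second as its
# refresh rate allows, no matter what the in-game FPS counter reports.

def frames_you_actually_see(gpu_fps: float, monitor_hz: float) -> float:
    """Upper bound on unique frames per second the panel can display."""
    return min(gpu_fps, monitor_hz)

print(frames_you_actually_see(120, 60))   # -> 60: the 60Hz panel is the cap
print(frames_you_actually_see(120, 144))  # -> 120: here the GPU is the cap
```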

Pixels:

1080p is JUST FINE for a 24-inch monitor; in fact, it's the sweet spot. Higher resolutions can actually make gaming harder on a panel this small. The key is that, given the distance most computer users sit from the monitor (this is for PC, not console gamers, btw), if you don't have the correct pixel pitch for the size of the monitor, you are going to dislike what you see. The TLDR here (for 60Hz) is this:
  • 24-inch: go with 1080p -- almost all cards from the last 3 gens work fine
  • 27-inch: go with 1440p -- almost all cards from last gen work, and some from 2 gens ago still do
  • 32-inch (and up): go with 4K -- only high-end cards from last gen, and very, very few (if any) older ones, are playable/competitive

1080p vs Screen Size: At 1080p on a 24-inch monitor you've got roughly 92 pixels per inch (PPI). That's pretty durn good, and quite enjoyable: Full HD. Step it up to a 27-inch monitor at 1080p and you've got 82 PPI - which, suddenly, can make the images on screen noticeably pixelated (you can see the spaces between the pixels). At 32 inches with 1080p you get 69 PPI - which will look horrible unless you are far away. (FYI: the farther away you are, the less your eye can tell the difference - e.g. a 42-inch 720p TV across the room from you looks fine, but up close, not so much.)

1440p vs Screen Size: At 1440p on a 24-inch you're going to have 122 PPI. Note here that more does not equal better; the higher PPI makes everything look smaller, and you need a better card to push the pixels. Trying to shoot that guy far away? He's tiny. At 27 inches you're hitting the sweet spot: 109 PPI at 1440p. Trying to run 1440p on a 32? That's okay; you're getting the pixel density of Full HD on a 24 in a much larger screen -- 92 PPI. For old eyes trying to see the far-away guy, he's a lot bigger now. And things look good.

4K vs Screen Size: A 24-inch 4K monitor is a waste of money. You're pushing 184 PPI - everything is tiny, and you need a good card. At 4K on a 27 you again hit the 'too many pixels in too small a space' problem: 163 PPI... the far-away guy is teeny tiny. (Yes, you can play with the scaling settings; but why go to the effort?) At 32 inches, 4K settles down a bit to 138 PPI. Some say anything higher than 110 PPI is too much for normal Windows use - especially in older programs without good native scaling - but others say that 32 is the sweet spot for 4K: plenty of real estate and all the visual goodies.
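If you want to check those PPI numbers yourself, the math is just the diagonal resolution in pixels divided by the screen diagonal in inches. A quick sketch that reproduces every figure above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    for size_in in (24, 27, 32):
        print(f"{name} on a {size_in}-inch panel: {ppi(w, h, size_in):.0f} PPI")
```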

Refresh Rate (Hz):

The other thing you need to know - which everyone's talking about but some folks don't understand - is refresh rate. If you're running a 60Hz monitor in today's games, you are gimped against a player running 144Hz (presuming both players have cards that can push frame rates that high). If you have a 60Hz monitor and the card / game reports you getting 100(+) FPS, you can still be competitive, but you would do better on a higher-refresh monitor. If both of you are getting a reported 100 FPS, you only see 60 of them; he sees all of them - and there's some complexity in both the hardware and the wetware that I won't go into, but the higher refresh rate does translate into faster human responses: an advantage to the guy with the higher-Hz monitor. Image smoothness is what's usually shown when comparing 60Hz vs 144Hz, and there are plenty of videos out there about this. The big thing to know is that the major improvement is the jump from 60 to 120(+), and that many, many people cannot really see the difference between 144 and 240(+).
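One way to see why 60 to 120 is the major improvement (simple interval math only - this says nothing about the hardware/wetware complexity above):

```python
# Each refresh-rate step shaves less time off the gap between displayed
# frames, so the returns diminish as the numbers climb.

def frame_interval_ms(hz: float) -> float:
    """Time between screen refreshes, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> a new frame every {frame_interval_ms(hz):.1f} ms")

# 60 -> 120 Hz cuts ~8.3 ms per frame; 144 -> 240 Hz cuts only ~2.8 ms.
```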

Note: refresh rate is totally distinct from pixel count. A great card is needed to push both, or you can choose a good card and go for one over the other. Don't get sucked into the "I need a 4K monitor" thinking without understanding when and why you want or need 4K. Lots of competitive gamers are buying and using 24-inch 1080p monitors; they're just spending their money on much higher refresh rates to be as competitive as possible. The key here: they know that at 24 inches the sweet spot is 1080p, but by buying a monitor and card that can push frames into the 200s... they've got a competitive advantage over the casuals playing 1080p at 60Hz.
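To put rough numbers on that trade-off: counting raw pixels pushed per second (a crude proxy I'm using for illustration - real rendering cost doesn't scale perfectly with pixels times FPS), a 1080p/240 esports setup asks about as much of a card as 4K/60 does:

```python
# Raw pixel throughput for a few setups mentioned in this thread.

def pixels_per_second(width: int, height: int, fps: float) -> float:
    return width * height * fps

setups = {
    "1080p @ 240 fps": (1920, 1080, 240),
    "1440p @ 144 fps": (2560, 1440, 144),
    "4K @ 60 fps": (3840, 2160, 60),
}
for name, (w, h, fps) in setups.items():
    print(f"{name}: {pixels_per_second(w, h, fps) / 1e6:.0f} million pixels/s")
```

Both the 1080p/240 and 4K/60 rows land at ~498 million pixels per second, which is why the competitive crowd can spend their GPU budget on refresh instead of resolution.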

-- -- EDIT -- --

Dan_D pointed out a use case justifying the purchase of a 3080 or RX 6000 class card that I failed to write about originally: getting one of the cards for the 'new goodness' of ray tracing, DLSS, higher VRAM, and cost/benefit longevity. These are absolutely valid considerations, as old cards don't (necessarily) offer these techs / advantages, regardless of panel-size considerations. Most of the people who know about this stuff are enthusiasts. The user I originally wrote about is someone on a 24-inch 1080p monitor who either doesn't know their refresh rate or has a 60Hz monitor... and that person might want to play Minecraft or some other title (Cyberpunk, etc.) that offers RT, and the new cards are ideal for that. There are also quite a few games becoming demanding in terms of VRAM, and the new cards have lots of it. The final point is cost/benefit and longevity: you certainly should not buy an old card at anything similar to the price points at which the new cards are offered.

-- -- EDIT -- --

So - final thought: you DON'T need a new card... UNLESS. Unless you're trying to game at a higher refresh rate like 120(+)Hz, or at a higher resolution than 1080p, [or want to leverage the new technologies] - don't worry about the New Shiney.

If you want the New Shiney anyway, be dad-gummed sure you've got a panel that will take advantage of it!


--EDIT 2 --> The DisplayNinja link below has some interesting data about how far away you have to sit from a monitor before you stop noticing the pixels.
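For the curious, the usual rule of thumb behind that kind of data is that 20/20 vision resolves about one arcminute; the acuity constant below is that standard assumption, not a figure from the linked article:

```python
import math

def pixel_blend_distance_in(ppi: float) -> float:
    """Viewing distance (inches) beyond which single pixels blur together,
    assuming the eye resolves ~1 arcminute."""
    one_arcmin = math.radians(1 / 60)
    return (1 / ppi) / math.tan(one_arcmin)

for density in (69, 92, 109, 138):
    print(f"{density} PPI: pixels blend beyond ~{pixel_blend_distance_in(density):.0f} inches")
```

At 109 PPI (a 27-inch 1440p panel) that works out to roughly 32 inches - about typical desk distance.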

Resources:

https://levvvel.com/pixel-density-resolution/
https://en.wikipedia.org/wiki/Pixel_density
https://www.displayninja.com/what-is-pixel-density/
 
I doubt there's anything mainstream about a video card at that price point:

https://store.steampowered.com/hwsurvey/videocard/

The mainstream user is still playing on a GTX 1060 or equivalent - a 2060 at most, I would imagine - if he is not playing on a console.

Anything above that is reaching niche enthusiast level. Like you said, most people still have somewhat low-resolution / low-FPS screens. I don't imagine that type of person lining up or fighting the F5 battle for a $700 USD video card either, right?
 
I doubt there's anything mainstream about a video card at that price point:

https://store.steampowered.com/hwsurvey/videocard/

The mainstream user is still playing on a GTX 1060 or equivalent - a 2060 at most, I would imagine - if he is not playing on a console.

Anything above that is reaching niche enthusiast level. Like you said, most people still have somewhat low-resolution / low-FPS screens. I don't imagine that type of person lining up or fighting the F5 battle for a $700 USD video card either, right?
The interest in the Ampere cards released thus far is way above and beyond anything I've seen before (paper launch notwithstanding). I also know several people, inspired by the new release of MSFS, who are currently trying to build systems (for the first time) simply to play the game. They've all bought into the hype train - and it's folks like that I'm trying to help.

When you look at how shiney the New Shiney is, people read the hype and say 'gotta have that' -- but without a bit of extra thought... this isn't the typical upgrade. Ampere is actually calling for an upgrade in panels - which has really been the realm of enthusiasts. In other words, Ampere and Navi, along with the new consoles, really kick in the door for 4K and higher refresh... and people are responding. But the industry isn't really there yet for PC players (console folks can get away with TVs; we tend to prefer monitors, with all that implies).
 
I've had a 4K 60Hz monitor since 2015.
I've also had a 1440p 165Hz monitor since 2018.
(I have both attached to my computer, as well as a second, HDR 4K monitor.)

My laptop is a 15-inch 1080p 120Hz screen. And yeah, 1080p isn't good enough for it; it's too obviously low-res. Fuck, my 5.7-inch PHONE has a higher resolution than 1080p, and it's 3 years old.

I'm not even that heavy of an enthusiast, I haven't spent more than $250 on a monitor.
 
I've got a 4K TV (one of the LG OLEDs - stunning images). I have no current desire to upgrade my computers to 4K... atm. My computers are attached to monitors pretty much in line with what the OP has stated: 24" 1080 (okay, 1920x1200, with all the gimped refresh rates those monitors have), or a lone 27" 1440 (Asus MG279Q, running less than its advertised 144Hz). I sit about 32" back from the 24" 1080s and a bit closer to the 27" 1440. Right now? I'm typing on the 1080s because they're easier to read due to the pixel pitch, just like the OP discussed. I find fonts just a wee too small on the 27" 1440, but games do look good.

Right now? Well, I'm going to get a gaming 1440 monitor to add to one of the rigs. That'll run ~$450-500. (Hoping for a deal in November, maybe a bit lower.) To drive it at its advertised 144Hz refresh rate, yes, I'll also buy a new GPU.

Shopping for a new GPU, there's no way I'm going to spend RTX3090-level money. My budget will cap out around $500.

That puts me spending ~$1,000 to add a little more real estate and faster refresh to what I've got going. (I'll pass down the GPU, so there's that savings.) Budgets are a reality; everyone spends as they can on what they think is important. I have other things pulling on my wallet, and I still have to recognize that my preferred computer screens for work, browsing, etc. are the Dell U2415s and Asus 1920x1200s (had 4, down to 3). Two such screens, side by side, are sweet... for me: 3840x1200 at a pixel pitch of ~94 PPI.

The OP's thesis strikes a chord with me, and I find my purchases aligned with what he states.
 
Thesis: The mainstream gamer is wasting their time and money looking at Big Navi or Ampere. This is because the 'mainstream' gamer is looking at all of the artificial benches and hype reviews without really thinking about how a new card will actually behave on the system they're gaming on.

You'll have to excuse me for being obtuse, but I'd argue that most who seriously consider spending $700+ (or $300+, for that matter) on a video card are not really mainstream.
 
It's all fair enough, but many people on here have monitors that are able to wring every ounce of performance out of a 3090 and still want more.
 
I'm in the market for a card that will run all graphics settings on high/ultra while maintaining 144Hz on my 1440p monitor. At some point it would be nice to switch to a 360Hz monitor, which sounds like it would most likely be tied to a 1080p panel with current display tech.

I'm sure tech in the future will get to a point where I might see a 360Hz 32" IPS panel, but I won't hold my breath. Even then, I doubt the card I buy today would be able to push 360 fps consistently unless all graphics settings were set to low.
 
Just picked up my 1080p/144Hz last December. The 1070 Ti runs it just fine. If the card dies within this generation, I'll be waiting for the 3060 or 3050, unless I can pick up a 2060 SUPER for cheap.

That, and I play really old games... new games just can't seem to catch my interest anymore.
 
You can't be serious with "most people still run 1080p 60hz monitors." Normal people might be running that. Gamers started moving on about 10 years ago. Playing games on a PC does not make it a gaming PC. No one building a gaming PC has cared about 1080p for at least 5 years.
 
You'll have to excuse me for being obtuse, but I'd argue that most who seriously consider spending $700+ (or $300+, for that matter) on a video card are not really mainstream.
I'll agree - in the main. But I must reiterate that I'm seeing interest in the 3080 / 3090 from people who have never built a computer... all asking: 'How hard is it to upgrade?' 'How good is that Alienware with the 3090?' 'Should I get the iBuy with the 3080 just to get the card?'

I started this insane hobby by adding a Diamond card to the Dell I bought in '96. Those mainstream guys who have always been scared to open up the box are about to become enthusiasts... Might as well help them along the way.
 
I started this insane hobby by adding a Diamond card to the Dell I bought in '96. Those mainstream guys who have always been scared to open up the box are about to become enthusiasts... Might as well help them along the way.
No
 
I did see a nice FPS bump for 1440p 120Hz gaming going from a 1080 to a 1080 Ti to a 2080 Ti to a 3090. The lows are what concern me for gaming. I am considering two upgrades in the near future: a 4K 120Hz G-Sync monitor (November estimated release date, says Amazon) and/or a CPU/mobo/RAM bump that is worthwhile. So far my own testing showed that in the games I play, a 5.0GHz 7700 is still cutting it nicely compared to a current-gen i7, also at 5.0GHz (i.e. <10% difference in FPS).

On the flip side, I had this discussion last night with a friend who just decided to get back into PC gaming thanks to the hype train. I found myself saying: get a used 2070 Super / 2080 / 2080 Ti at a good used price, since he's not splurging on more than 60Hz gaming. I even offered my old dust-collecting 27-inch 120Hz monitor for him to use, just to see some benefit over 60Hz gaming at 1080p.
 
People still running 1080p monitors are not the customers for high-end video cards. This is a dumb article. To not call them gamers because they still run 1080p is dumb. Not everyone can afford mid-range hardware, let alone high-end. Most kids just play Fortnite, CoD, Minecraft and that Roblox game, most of which are not very demanding. A lot of the most popular games are really not that demanding.
 
I think this "epiphany" is for us to handle how we recommend pc building to those not in the know. I was in a computer store and had some talks with people on launch day of the 3080 and although a small sample size, it showed just how much gaming and tech news has now hit the mainstream consciousness. These are the people outside the tech forums and enthusiast level who ARE the typical customers of the hw. We are also now the victims of technology going mainstream now too where average joe/mom is buying a 3080 for their teenage son to play.... fortnight. Simply cause their kid wants the best.

I don't need a 3090 for my gaming PC. It just means that, since I am working from home, I can do some of my workflow from my home PC, which is a nice option to have. I've been spoiled by working with Tesla and Volta cards and DGX100/200 for work, and I have 2x DGX A100 on the way, along with A6000 cards, all on order.
 
I think you're in a bubble and want to believe this. I was out of PC gaming for a few years. During that time, I talked to maybe 5 people who would even know what a GTX 1080 / RTX 2080 / RTX 3080 is and what it does. PC gaming is still not mainstream. Gaming might be, but building a PC to play games definitely is not.
I think this "epiphany" is for us to handle how we recommend pc building to those not in the know. I was in a computer store and had some talks with people on launch day of the 3080 and although a small sample size, it showed just how much gaming and tech news has now hit the mainstream consciousness. These are the people outside the tech forums and enthusiast level who ARE the typical customers of the hw. We are also now the victims of technology going mainstream now too where average joe/mom is buying a 3080 for their teenage son to play.... fortnight. Simply cause their kid wants the best.

I don't need a 3090 for my gaming pc. It just means that since I am working from home, I can do some of my workflow from my home pc which is a nice to have option. I've been spoiled by working with tesla, volta cards and DGX100/200 for work and I have 2x DGX A100 on the way along with A6000 cards all on order.
 
I think the problem in the GPU market right now is that the older cards have not really dropped in price at retail. You can still buy a 2080 Super for 600-700 dollars even though the 3080 is out. The market correction hasn't happened yet and probably won't until after next week.
 
I think you're in a bubble and want to believe this. I was out of PC gaming for a few years. During that time, I talked to maybe 5 people who would even know what a GTX 1080 / RTX 2080 / RTX 3080 is and what it does. PC gaming is still not mainstream. Gaming might be, but building a PC to play games definitely is not.
I agree with you that the last 5 years (and not just those) showed little interest from the average person in PC gaming... Consoles were for gaming - and people bought them for their kids.

But something has changed this year.

I think Slade has the truth of it. Tech news (and hype) is reaching the mainstream audience. MSFS and Covid also cannot be ignored as factors - but population dynamics could be in effect as well.

The people I am talking to are all in their 40s-50s with tweens and teens. They're not just buying the XBone for Little Johnny - they're getting a PC for themselves. They remember playing Doom on their Mom's laptop or MSFS on a CRT - and they don't want that. These are people with considerable disposable income and time to kill, and they want a good machine.

The only problem is that most everything they read supports, or is supported by, the companies' hype train. We need to help people separate the wheat from the chaff. Case in point: LG is purportedly offering a 32-inch 8K monitor. Some people will buy it because 'more is better', right?

This forum generally supports those who want to build killer machines, but it is also traditionally used to find the best bang-for-the-buck machine... and in that effort we need to help each other (and the new guys).
 
I've been thinking about this a bit as well. I'm on a 1080 Ti with a 27"/1440p/165Hz G-Sync monitor. I've mostly been playing The Division lately, and with G-Sync even 80-110 fps looks super smooth to me. The fact that it can't even hit 165 fps doesn't really bother me. I don't own a 4K monitor. Do I need a 3080? I'd LIKE one, but need one? No.
 
I've been thinking about this a bit as well. I'm on a 1080 Ti with a 27"/1440p/165Hz G-Sync monitor. I've mostly been playing The Division lately, and with G-Sync even 80-110 fps looks super smooth to me. The fact that it can't even hit 165 fps doesn't really bother me. I don't own a 4K monitor. Do I need a 3080? I'd LIKE one, but need one? No.
This shows how much Nvidia botched the launch of the 3000 series.
 
Yeah, most of my computer purchases are on a whim more than anything. I really had no reason to go from a 7820X to a 3950X. I just wanted one. I could still be chugging along with my 7820X and 1080 Ti.
 
Well sure.
I run things at 4K and am a stickler for 60fps, so here we are :p
 
I think the problem in the GPU market right now is that the older cards have not really dropped in price at retail. You can still buy a 2080 Super for 600-700 dollars even though the 3080 is out. The market correction hasn't happened yet and probably won't until after next week.
Exactly. As much as the average gamer doesn't "need" a 3080, it doesn't make any sense for them to spend the same amount of money on older tech. I wouldn't say the 2060 or 2070 are smart buys either, with replacements right around the corner. It probably seems like a good time to buy a new graphics card to a lot of people, but the average gamer would be better off waiting a few months for a 3070, 3060, or one of AMD's new offerings.
 
I run a 1440p monitor and a 1080, plus VR with an Index. The 1080 just isn't cutting it anymore in a lot of games, and especially in VR with the higher-res HMD.
 
Exactly. As much as the average gamer doesn't "need" a 3080, it doesn't make any sense for them to spend the same amount of money on older tech. I wouldn't say the 2060 or 2070 are smart buys either, with replacements right around the corner. It probably seems like a good time to buy a new graphics card to a lot of people, but the average gamer would be better off waiting a few months for a 3070, 3060, or one of AMD's new offerings.
This is, in fact, the crux of the problem for one guy I've worked with extensively. The impending release of MSFS got him all excited about building a rig specifically for it. When we started this process, the 3080 was 'just around the corner', and the promise of all its goodness and its price (leaked info at the time) compared to the current (2080) offerings made it a 'no-brainer.' He did not 'need' it either, but it was the only card that looked, at the time, like it would give him the playability he desired. We worked up a killer system spec and waited. We all know the rest.

It is almost criminal how thoroughly Nvidia stoked the hype fires... and then pulled off this paper launch. I'm not an economic psychologist, but I don't think they've quite got the 'mindshare' they were hoping for. I know people are gleefully waiting to see if AMD puts out a 'good enough / competitive' card against the 3080 with sufficient stock. Nvidia may be trying to steal AMD's thunder with a dump of 3080 and 3070 cards during the AMD launch - but it could backfire if AMD cards all read "In Stock" on Newegg, Amazon, BB and Micro Center at a similar price point. If that happens, lots of boxes will be Red for a long time.
 
Yeah, most of my computer purchases are on a whim more than anything. I really had no reason to go from a 7820X to a 3950X. I just wanted one. I could still be chugging along with my 7820X and 1080 Ti.
That's how I bought the 3900X. Walked into the Goodwill on 58 with the plan to window shop. Saw the 3900X... Left with enough components to build a new computer.
 
I think folks are also forgetting that if you are a member of this forum and are posting here, talking about things like Hz/FPS/etc, you're likely not the majority.
 
That's how I bought the 3900X. Walked into the Goodwill on 58 with the plan to window shop. Saw the 3900X... Left with enough components to build a new computer.
You found a 3900X at a Goodwill, lol? How much did you pay?
 
Regarding monitor size versus resolution: I have a 32" 1440p monitor. 4K was very uncomfortable for me for reading. 1440p is a nice upgrade from 1080p and a nice middle ground.
 
Regarding monitor size versus resolution: I have a 32" 1440p monitor. 4K was very uncomfortable for me for reading. 1440p is a nice upgrade from 1080p and a nice middle ground.
It's really interesting reading the 'user' reviews of 32s. Lots of people, depending on their eyes, viewing distance, etc., prefer one or the other. As far as I can tell it's pretty split: lots of likes for 1440p, lots of likes for 4K. What would be awesome is a place where you could check them out side by side; but BB doesn't carry anything, there are very few Micro Centers, and otherwise everyone is stuck with reading reviews, ordering and hoping.

I presume from what you wrote that you tried a 4k 32?
 
It's really interesting reading the 'user' reviews of 32s. Lots of people, depending on their eyes, viewing distance, etc., prefer one or the other. As far as I can tell it's pretty split: lots of likes for 1440p, lots of likes for 4K. What would be awesome is a place where you could check them out side by side; but BB doesn't carry anything, there are very few Micro Centers, and otherwise everyone is stuck with reading reviews, ordering and hoping.

I presume from what you wrote that you tried a 4k 32?
I did, at the Micro Center showroom. They were all hooked up to machines with keyboards and mice, so I played around with many models, and while 4K was for sure gorgeous, I found my eyes straining hard in general computing/reading, etc. on the 4K. 1440p won out for that reason. I would have needed to upgrade to 4K-capable hardware and was willing to, but ultimately I did not like the eye strain with the 4K. And for now, my 980 is limping along with older titles at 1440p until I build a new rig early next year.

If it were a bigger display, 4K would probably be fine (I'm thinking 40"+), but at 32" it was too straining for me.
 
The interest in the Ampere cards released thus far is way above and beyond anything I've seen before (paper launch notwithstanding). I also know several people, inspired by the new release of MSFS, who are currently trying to build systems (for the first time) simply to play the game. They've all bought into the hype train - and it's folks like that I'm trying to help.
Trying to play a flight simulator and building a system just for it is a case where non-gamers who know little or nothing will ask their kids to set them up, and they could buy a 3080. But I am not sure a 3080 is overkill for what Flight Simulator will become over the years (especially if a transition is made toward VR, something very natural for a well-off flight sim player). And for a game like that, where 60 FPS is a natural and extremely hard-to-achieve target (arguably a bit high for a locked minimum), a high-FPS monitor is not required anyway.

I still have the feeling that going for a 3080/3090 over the 3060/3070/AMD equivalent is, almost trivially, above the mainstream. We'll see with the numbers on Steam in two years, but they will probably stay niche.

PC gamers are getting very old and quite rich, I imagine, so it will happen that someone buys a 3080 and plugs it into a 1080p/60Hz monitor for that card's whole life. But I doubt it will be common at all; that person will almost surely have a 4K TV - just as I imagine some Blu-ray players were never plugged into nice TVs, some nice TVs never had nice content played on them, and other suboptimal affairs.

That said, I imagine consoles will be wise to target either 4K or high FPS, but never both, for nice-looking big titles (the first titles seem to be headed in that direction), so I am not sure an HDMI 2.0-class panel (120Hz at 1440p, 60Hz at 4K) will be a real issue this console generation.

We are talking about video cards that cost anywhere from a bit less than to about two times the price of a brand-new console; that ought to be out of the mainstream by definition.
 
Wow, guess I am just a huge loser. I "game" on a 32" curved 1080p 60Hz and enjoy it. I play WoW, Overwatch, CoD, and several other titles, but according to this group, unless you use a 4K monitor you are just a huge piece of sh*t.

Why the animosity and contempt for people who cannot afford bleeding edge or high level tech?

Quote: "You can't be serious with "most people still run 1080p 60hz monitors." Normal people might be running that. Gamers started moving on about 10 years ago. Playing games on a PC does not make it a gaming PC. No one building a gaming PC has cared about 1080p for at least 5 years." :end Quote

Guess I am not a gamer....
 
Variable refresh is the most significant thing to happen to displays in the 21st century. It's not even close.

Resolution has been increasing very slowly (I had a 1920x1200 monitor in like 2004), and flat panels are just now catching up with CRT refresh rates.
 