Is medium- to high-end PC gaming becoming too niche?

It's becoming too goddamn expensive, which in turn makes it niche.

$2000 for a graphics card is absurd!

No one is forcing you to buy a $2,000 GPU. So what if Nvidia made a $3,000 or $4,000 card? You would still try to buy it. I have no problem if Nvidia or ATI charged $1,000 or $2,000 or even $3,000 for a card. It's like with cars: you can buy a Buick Encore or a Ferrari. I don't whine that I can't afford a Ferrari.
 
It's like with cars: you can buy a Buick Encore or a Ferrari. I don't whine that I can't afford a Ferrari.
What you don't whine about regarding pricing, you make up for by whining about other people's opinions.

It's not about the money, it's about the precedent.
 
Again, you have to think of these as professional-grade GPUs they're making available to consumers. For an engineer or scientist or whatever making $100+/hr, the time a 4090 saves that worker compared to a 3090 in one month more than pays for the cost of the card.
Nice try Nvidia marketing team.
 
Again, you have to think of these as professional-grade GPUs they're making available to consumers. For an engineer or scientist or whatever making $100+/hr, the time a 4090 saves that worker compared to a 3090 in one month more than pays for the cost of the card.
I couldn't care less about the professional features any GPU offers. I don't care about CUDA, mining performance or any other professional features on graphics cards. I wish I could get cards that were just as fast and didn't necessarily do things other than play games.
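
Whether or not you buy the "pays for itself" pitch, the arithmetic behind the quoted claim is easy to check. A rough back-of-the-envelope sketch, where every number is an assumption rather than a measurement:

# Hypothetical numbers to illustrate the quoted "pays for itself in a month" claim.
hourly_rate = 100.0             # $/hr for the engineer/scientist in the quote
gpu_bound_hours_per_month = 80  # assumed hours/month of GPU-bound work on a 3090
speedup_4090_vs_3090 = 1.6      # assumed average speedup for that workload
card_price = 1600.0             # assumed 4090 street price, in dollars

hours_saved = gpu_bound_hours_per_month * (1 - 1 / speedup_4090_vs_3090)
value_of_time_saved = hours_saved * hourly_rate
print(f"Hours saved per month: {hours_saved:.0f}")  # 30 hours
print(f"Value of time saved: ${value_of_time_saved:,.0f} vs a ${card_price:,.0f} card")  # $3,000 vs $1,600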
 
Nice try Nvidia marketing team.
Well, people are trying to cure cancer, design bridges and buildings, build an alternative global financial system, make $200 million movies, and develop self-driving cars on very similar hardware (if not the exact same, or even lower-end).

It is quite different from the '80s and early '90s, when the people doing that work had very different video processors in $15-45k SGI workstations than what people played games with.
 
No one is forcing you to buy a $2,000 GPU. So what if Nvidia made a $3,000 or $4,000 card? You would still try to buy it. I have no problem if Nvidia or ATI charged $1,000 or $2,000 or even $3,000 for a card. It's like with cars: you can buy a Buick Encore or a Ferrari. I don't whine that I can't afford a Ferrari.
That's all well and good if the performance justifies the cost with a certain amount of cachet. But what has happened is that the normal price/performance curve we've seen over the past 30 years has been utterly obliterated by nV's mahoosive price increases across the board. Time was, every two years or so, you'd get a ~50% performance increase for around the same price as the prior gen, +/- inflation. Now what you have is a complete stagnation, nay abaissement, of the historical (I dare say, natural) price/performance improvement over time, i.e. you're paying the same or more for the performance you had years back.
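
To put rough numbers on that historical curve (the figures below are illustrative assumptions, not benchmark data): if performance per dollar really did improve about 50% every two-year generation, the gains compound quickly, which is exactly why a flat generation feels like a regression.

# Illustrative only: compound an assumed ~50% perf-per-dollar gain each 2-year generation.
perf_per_dollar = 1.0    # normalize some baseline generation to 1.0
gain_per_gen = 1.50      # assumed historical ~50% improvement per generation
for gen in range(1, 6):  # five generations, roughly a decade
    perf_per_dollar *= gain_per_gen
    print(f"Gen {gen}: {perf_per_dollar:.2f}x the baseline perf per dollar")
# After five such generations the trend implies ~7.6x; holding perf per dollar flat
# for even one generation is a large deviation from that expectation.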
 
If you're happy to play at 1080p 60fps, PC gaming is cheap as chips.

And 1080p 60fps is honestly more than what the last gen consoles could accomplish. And with only a small bump in price you can graduate to 1080p 120+FPS.

If you wanted 1440p 60fps it's still cheap.

And let's be 1000% honest with ourselves:

4K 144FPS ALL settings maxed (including raytracing) has never been cheaper.

It was never possible before the 4090, but I'm still technically correct.
 
People just envy what they can't get. No one needs a 4090 and a 13900K. These are luxury items and are priced accordingly. You can build a solid gaming PC for under $1,000 ATM. It won't play 4K at max settings, but it will still play all the latest games for years to come. Stop acting like Nvidia owes you something. Honestly, if you were in charge of Nvidia and could sell every GPU at $2,000, why would you charge $700? You know you wouldn't. Even if you still made a healthy profit at $700, it would be bad business sense. Would I like a 4090 for $700? Sure I would, but I live in the real world and understand corporations are not charities.
 
Well, people are trying to cure cancer, design bridges and buildings, build an alternative global financial system, make $200 million movies, and develop self-driving cars on very similar hardware (if not the exact same, or even lower-end).

It is quite different from the '80s and early '90s, when the people doing that work had very different video processors in $15-45k SGI workstations than what people played games with.
Speaking of the shrinking gap between workstation GPUs and desktop gaming GPUs, I just saw this:

The computers used to do 3D animation for Final Fantasy VII... in 1996.

https://lunduke.substack.com/p/the-computers-used-to-do-3d-animation



Today it would not surprise me if the vast majority of people currently working on a game have less GPU power than a single 4080 in their machines.
 
To me, the advent of forced crossplay with consoles and heavy aim assist for controller players feels like the beginning of the end of high-end PC gaming.

The financial incentives are aligned to ensure this trend continues.

It's infuriating, but making console plebs feel good is a winning business model.
In my experience, even with aim assist, consoles don't stand a chance. Sometimes I play with my console friends from an Xbox and they dominate most matches; if I join them from a PC we end up getting wrecked every match. If one PC person is in the party, they have to play against other PC players -- now they don't invite me to play anymore, because I'd rather lose from my PC than be carried on an Xbox. (I absolutely hate FPS with a controller.)
 
In my experience, even with aim assist, consoles don't stand a chance. Sometimes I play with my console friends from an Xbox and they dominate most matches; if I join them from a PC we end up getting wrecked every match. If one PC person is in the party, they have to play against other PC players -- now they don't invite me to play anymore, because I'd rather lose from my PC than be carried on an Xbox. (I absolutely hate FPS with a controller.)
Have you played a recent AAA FPS game?

People are calling the aim assist in Modern Warfare 2 an "aimbot".

90% of the top pro players in Apex Legends are controller (i.e. aim assist) players.

Here are actual titles of YouTube videos:
 

Attachments: Screenshot_20221209_174038_Discord.jpg, Screenshot_20221209_174012_Discord.jpg
Aimbots are a different beast altogether. Unless we're talking about how much you're willing to spend for one.

Here's another avenue for high end PC gaming.
Internet connection. Back in the day, hypothetically, you could have two people with nearly identical PC hardware and identical skill levels. If one player was paying for a better connection to the internet than the other, I would say that six times out of ten they'd have the upper hand.

I also knew of people who would tune their audio setup to hear a game's footstep sounds more clearly.
That's why "Gamer" headphones became a market.
Paying upwards of a grand for headphones would enable someone to hear magnificent soundscapes.
Dropping fifty to a hundred bucks on these weird studio monitors would muffle the unimportant sounds so we could focus on what part of the level a gunshot sounded from.

High DPI mice used to come at a premium.
Faster RAM and hard drives meant that we could load into the next level faster than the other team.

High end PC gamers would scoff at being restricted to only using a mouse and keyboard for all games. They had bespoke input devices for each type of game they played.


My income during the mid-to-late '90s came from birthday cards or saving my school lunch money. I didn't assemble my own PC until around 2003.
By that time nearly everyone had broadband that was halfway decent in bad weather, and the ability to launch a game like Quake 3 or Morrowind, if not play it at a respectable frame rate.
 
Looking at the "professional gamers" out there, they're certainly more likely to spend top dollar to keep themselves at the top of the game.

That's their living, and those who excel at the hottest games can certainly rake in very good money, but again, they're professionals, not casual or even advanced gamers like most of us here are.

It's no different than back in the '70s and early '80s, when most of the kids playing tennis out there were using standard to midsized aluminum rackets, and there was always some top-notch kid out there whose parents were willing to buy him a Prince Graphite tennis racket, or even to spend the big bucks for a Prince Boron.

For most tennis players, they wouldn't have the skills to justify playing with a Prince Boron (500 dollars back then), or even a Prince Graphite (300+ back then) or Wilson Pro Staff ($250+), when a Prince Classic (40 bucks) or a Prince Pro (70 bucks) was about as good as what they needed and wouldn't hold them back.

Eventually graphite (and composite) tennis rackets took a huge plunge in price, and it wasn't long before you could buy a Prince Graphite for about $150 through the various mail order sources. Shortly after that, almost all kids were using graphite rackets, which, again, they didn't really benefit from using other than to stave off having to cast one aside once their skills were good enough.




The way I see it, if you're a true, competitive high end gamer, and can really benefit from every last possible advantage, then by all means, knock yourself out. If that's how you make your money, then the old adage of "it takes money to make money" can certainly apply, and to stay at the top of the game in such a manner does require more resources. Yes, it costs more these days, but from what I've seen, the prize monies are also inflated.

If you're not a pro, then you have to ask yourself if it's really worth spending all of that money to squeeze out every last drop of performance. After all, you have a day job, and need to put food on the table, a roof over your head, gas in your car, etc., and your gaming certainly can't cover it if you aren't among the best of the best.

As I've said earlier in this thread, you can still enjoy excellent gaming, even among today's latest games, using a moderately priced rig, as long as you aren't trying to game in 4K with max detail, etc. If the gameplay is that entertaining for you, then detail and eye candy play a distant secondary role in the game itself.
 
Have you played a recent AAA FPS game?

People are calling the aim assist in Modern Warfare 2 an "aimbot".

90% of the top pro players in Apex Legends are controller (i.e. aim assist) players.

Here are actual titles of YouTube videos:
We are playing Apex; we haven't been on Call of Duty since Black Ops on the OG Xbox One. All I can say is what we experience: my friends win a lot when it's console vs. console, and lose a lot when I join them from PC. Possibly the average console player is below mediocre in skill.
 
Have you played a recent AAA FPS game?

People are calling the aim assist in Modern Warfare 2 an "aimbot".

90% of the top pro players in Apex Legends are controller (i.e. aim assist) players.

Here are actual titles of YouTube videos:
I play cash cups in Fortnite, and yes, controller players have a huge advantage; sometimes they didn't even have to aim to shoot you in the head.
 
The main thing is that consoles are now the most powerful they've ever been relative to gaming PCs. At this point, outside of niche overpowered builds, the console version gets you proper HDR, high-speed asset streaming from the SSD, and low-level access to the hardware, which means the gaming experience on a console is just better outside of the highest-end builds.

Yeah, you can see some high fps numbers, but what you don't get on consoles is the random hitching, driver problems, data-streaming stutter, and missing HDR support, and you don't need to spend hours trying to troubleshoot random software problems. Really, unless I'm playing a couple of titles with full-blown ray tracing and the most cutting-edge games, I'd rather play the console version 99% of the time these days.
 
Aimbots are a different beast altogether. Unless we're talking about how much you're willing to spend for one.

Here's another avenue for high end PC gaming.
Internet connection. Back in the day, hypothetically, you could have two people with nearly identical PC hardware and identical skill levels. If one player was paying for a better connection to the internet than the other, I would say that six times out of ten they'd have the upper hand.

My income during the mid-to-late '90s came from birthday cards or saving my school lunch money. I didn't assemble my own PC until around 2003.
By that time nearly everyone had broadband that was halfway decent in bad weather, and the ability to launch a game like Quake 3 or Morrowind, if not play it at a respectable frame rate.

I have seen very few obvious aimbots in Apex - we ran into some this morning. Super accurate shots from the edge of visibility.

People who use aimbots should be given capital punishment. In public.

If you have not been playing FPS games on a PC vs consoles in the last few years, then you're basically posting from - what, 2012?
Before aim assist, PCs would always wipe the floor with consoles - most reasonable people think a little aim assist is OK on consoles.

The problem is that it's no longer "a little" and "assist" is a serious undersell. It's a massive crutch and an unfair advantage and PC players can't avoid it. I believe PS5 players can opt out of crossplay and only play with PS5 players. In virtually all games, PC players must play with aim assisted console players.

At this point, you probably can't sell a AAA FPS game without it - people expect it. It's straight up dopamine.

If you're not a pro, then you have to ask yourself if it's really worth spending all of that money to squeeze out every last drop of performance. After all, you have a day job, and need to put food on the table, a roof over your head, gas in your car, etc., and your gaming certainly can't cover it if you aren't among the best of the best.

As I've said earlier in this thread, you can still enjoy excellent gaming, even among today's latest games, using a moderately priced rig, as long as you aren't trying to game in 4K with max detail, etc. If the gameplay is that entertaining for you, then detail and eye candy play a distant secondary role in the game itself.

I don't mind paying for a high end PC. I probably wouldn't play if I had to use a mid range system. For me, the issues are not money but the shifts in the market toward over-monetized titles that sell new skins every month and let the bugs and cheating go unchecked and, as mentioned above, force "differently abled" aiming systems to play together.

We are playing Apex; we haven't been on Call of Duty since Black Ops on the OG Xbox One. All I can say is what we experience: my friends win a lot when it's console vs. console, and lose a lot when I join them from PC. Possibly the average console player is below mediocre in skill.
If good players play with casuals on a pre-formed team in Apex, everyone will get matched at the level of the best player and you may not win as much. There are a lot of really good console players.

The main thing is that consoles are now the most powerful they've ever been relative to gaming PCs. At this point, outside of niche overpowered builds, the console version gets you proper HDR, high-speed asset streaming from the SSD, and low-level access to the hardware, which means the gaming experience on a console is just better outside of the highest-end builds.

Yeah, you can see some high fps numbers, but what you don't get on consoles is the random hitching, driver problems, data-streaming stutter, and missing HDR support, and you don't need to spend hours trying to troubleshoot random software problems. Really, unless I'm playing a couple of titles with full-blown ray tracing and the most cutting-edge games, I'd rather play the console version 99% of the time these days.
I couldn't focus and communicate playing on a console in my living room. It's not comfortable and not conducive to the right mindset. So if I were going to play on a console, it would just be another system on my desk. I know they've improved, but they are not immune to hitches and lag issues. I'm not going to spend $800 setting up a console to supplant an $8,000 PC so I can beat aim assist by joining aim assist. Apex is already circling the drain due to mismanagement - if I move on, it will be to something where I don't have to play against a business model disguised as a feature.
 
Yeah, you can see some high fps numbers, but what you don't get on consoles is the random hitching, driver problems, data-streaming stutter, and missing HDR support, and you don't need to spend hours trying to troubleshoot random software problems. Really, unless I'm playing a couple of titles with full-blown ray tracing and the most cutting-edge games, I'd rather play the console version 99% of the time these days.
That's not true at all. And it has been less true since the PS4 Pro launched, arguably longer. Remember the Game Boy Color, or the RAM carts for the Sega Saturn and Nintendo 64?

The Xbox One and PlayStation 4 were virtually the same compared to their predecessors.
That still didn't stop games that barely worked out of the box from being released. The dog shit performance of Control on my launch PS4 is what compelled me to bellyflop back into PC gaming.

The major difference between the ideology, if you will, of PC and console gaming is the type and options available to the end user.

There's another thread on here asking about the death of HEDT. I bring that up because it's essentially the identical question with similar answers, I imagine.

The answer to the question posed in this thread is no, because it's always been a niche. Unless you were in search of the ultimate vehicle simulator, there's been a box for every price range to play games with.

As long as the most profitable/popular games allow cross-platform play, the difference between each platform will cease to matter.
 
The main thing is that consoles are now the most powerful they've ever been relative to gaming PCs. At this point, outside of niche overpowered builds, the console version gets you proper HDR, high-speed asset streaming from the SSD, and low-level access to the hardware, which means the gaming experience on a console is just better outside of the highest-end builds.

Yeah, you can see some high fps numbers, but what you don't get on consoles is the random hitching, driver problems, data-streaming stutter, and missing HDR support, and you don't need to spend hours trying to troubleshoot random software problems. Really, unless I'm playing a couple of titles with full-blown ray tracing and the most cutting-edge games, I'd rather play the console version 99% of the time these days.
The only time this was true was at the beginning of gen 7 (Xbox 360, PlayStation 3). It took a year for PC hardware to catch back up. Consoles of gen 8-9 are midrange PCs, at best.

I can tell you with gen 9 since I still play regularly on consoles next to my PC that the idea of consoles not having game issues is just not true. Quality control has been greatly slipping to the point that I regularly experience crashes to dashboard on both my Series X and PS5 in a number of games. Dying Light 2 has actually made my Series X hard lock on a few occasions to the point I needed to pull the power cord to turn it off (the power button wasn't responding). The PS5 version of Call of Duty: Cold War did the same thing to me at one point. Stuttering is also still a reality on consoles. Try playing the console version of A Plague Tale: Requiem and tell us it's smooth. Proper HDR is also a laugh when several games still ignore the console calibration and set their own reference white level that decreases the luminance range and makes blacks appear grey. I have also yet to run into a game that loads faster on either of my consoles than on my PC where the game is available on both.

My most recent experience of a new game on console, The Callisto Protocol, experienced as many issues as the PC version. The only thing it did better was not hitching when a particle effect loaded, but the latest PC patch fixed that. The Series X version stutters a lot including in the performance mode. HDR is worse than the PC version as it clamps luminance to the point that the spotlights in the prison at the beginning of the game didn't break through the fog and blended in with the background. On the PC version the spotlights are blinding. Once shaders are compiled on the PC version it loads into the game faster on a gen 3 SSD than the Series X version.

Consoles are closer to PC than ever before, but they are still severely lacking in comparison.
 
This is the important consideration. IMHO mid is 1440p60 and high is 4k60.

We've just been able to hit 4k60 on max settings with almost every game with the 6800XT/6900XT/3080/3090 in the last couple years. In my mind the newest generation of GPUs, the 7000-series and 4080/4090, is not high end, it is beyond high end. They're effectively professional-grade cards for people who use their PCs to do work, not just play games. That's why you see them at such high prices. I didn't pay for my 3090s, my lab did. Nvidia simply realized there are enough gamers with this kind of disposable income that they could sell $1,000+ GPUs that aren't Quadros. The current generation of GPUs is not niche high end, it is a new niche entirely for consumer-grade hardware.

I think mid-range 1440p60 gaming is as accessible as ever: $150-200 for a 5600X and $350 for a 6700XT or 3060 Ti and you're set. This is close to what I paid for an i5-4670K and GTX 970 almost a decade ago, which could do 1080p60 at max settings for just about every game (which the GTX 700-series could not do).

In a sense, it took about a decade to go from 1080p to 1440p gaming as the mid-range, and you're seeing this transition from 1440p to 4k gaming happening right now. You can already get used 6900XTs/3080s for $500-600. Not sure what 'mid range' GPUs this generation will cost, but I would not be surprised if they can do 4k60 for $500-600, which the previous gen GPUs at that price point could not do.

Totally untrue. You may think a 4090 is just for professionals but in reality it is the first 4K card that truly allows you to do high refresh rate gaming at 4K in almost all titles. That is a big deal. IMO 4K native gaming is MUCH nicer than 1440.

And you could make a decent gaming PC even with a 4090 for around $3K
 
It's always been a niche, but what I've really noticed as of late is that the PS5 and Xbox Series X have seriously stepped things up to where they don't feel immediately dated on release compared to a PC like their predecessors did, and while CPU prices are quite fair given the sheer leap in performance we've had recently compared to Haswell or Skylake builds from nearly a decade ago where 4C/8T was all you got until you went HEDT, GPU prices are getting absurd for non-professional models.

The RTX 4090 still sells out at Micro Center, even for $1,800 AIB models with a $200 price premium, because it's the literal best consumer GPU on the market - something that not just gamers want, but creative professionals. While the miners finally bowed out because of Ethereum going proof-of-stake, the AI crowd is picking up the slack a bit, so it's not just gamers with too much money to burn.

Personally, under normal circumstances, I would not have spent $1,000 on an RX 7900 XTX when it's such absolute overkill and an RTX 3080 at $500 would've been more than enough for 1080p120 as is, but my GTX 980 wasn't getting any newer and VR is a niche within the niche of PC gaming that requires the most powerful single GPU you can afford to get acceptable performance in some titles.

Want to run No Man's Sky maxed out without reprojection/synth frames kicking in or running DLSS/FSR and ruining your visual quality? You'll need at least an RTX 4080, more comfortably a 4090. Want to run MSFS 2020 without reprojection? The hardware literally does not exist because even the RTX 4090 still hovers around 45 FPS, so the most you can settle for is all synth frames without any dropped ones.

$1,600-1,800 for a GPU to back up a $1,000 Valve Index full kit (and there are even more excruciatingly expensive ones out there that make the Index look cheap, like the Varjo Aero), just to try and get these games to run smoothly? Yeah, I think most people are just gonna go PS5 + PSVR2 for that at around $950-1,050 in all likeliness, but my extensive SteamVR library wouldn't carry over.

Thing is, if you're not enthused about VR like I am and you're only targeting 1080p60 like the average person, a 12600K or 12700K + Z690 DDR4 mobo combo at the local Micro Center provides by far the most bang for the buck for starting a new build at roughly $250-350, possibly even less than that, and then you can scrounge together some cheap DDR4, a cheap case, a decent CPU cooler (wouldn't skimp too much on the 12700K, it gets real toasty if you like to overclock), enough PSU for your chosen GPU, and you should have a decent PC cobbled together for about $500-600 sans GPU, especially with a willingness to buy the right used parts.

Yeah, without the GPU, you're definitely not gaming anywhere near PS5 or XSX level, but the rationale here is that you're going to need a PC for various non-gaming tasks anyway, and then you take that money you could've spent on the next-gen console, put it toward a GPU, and know that the rest of your system won't be holding it or any newer GPUs back for a good while. (Some games are horrendously CPU-bottlenecked; ArmA III saw ZERO benefit from upgrading from the GTX 980 to the RX 7900 XTX, but while moving from 4770K to 7700K provided a minor improvement at best, the 12700K blows them both out of the water.)

Then you save lots of money on Steam sales and get to enjoy literal generations worth of gaming on the same system, which consoles just can't offer. The closest any particular console got to the long legacy of PC gaming is the early CECHA/CECHB PS3 with a whole PS2 section on the motherboard, capable of running three generations of gaming in one box, but later consoles lost the ability to run PS2 disc games, and PS4 and PS5 won't even run PS1 disc games despite happily letting you run downloaded, emulator-packed versions from PSN.

Speaking of the shrinking gap between workstation GPUs and desktop gaming GPUs, I just saw this:

The computers used to do 3D animation for Final Fantasy VII... in 1996.

https://lunduke.substack.com/p/the-computers-used-to-do-3d-animation

Today it would not surprise me if the vast majority of people currently working on a game have less GPU power than a single 4080 in their machines.
After tinkering a bit with some Silicon Graphics hardware myself at VCFMW and taking home a modestly equipped Octane workstation, it really dawns on me that SGI's true strength was in those clustered Onyx supercomputers, though workstations like the Indigo2 and Octane were no slouch in and of themselves. They simply offered graphics performance you couldn't obtain any other way, like having a computer from the future, and IRIX is marvelously responsive in a way that many newer Windows and macOS systems fail to accomplish, never mind actual '90s PC and Mac hardware. The catch was, as always, in the exorbitantly expensive pricing that could easily buy an entire car, and a fairly luxurious one at that.

But then NVIDIA suddenly upended them with a little something called the GeForce 256, which rivaled SGI's VPro graphics options in sheer performance at a fraction of the price, and I don't think their older IMPACT/MARDIGRAS architecture from the Indigo2/Octane era would've stood a chance, especially not with the 4 MB RDRAM tops you got for texture memory. Unsurprisingly, the first GeForce architecture is also the first Quadro, as NVIDIA knew they could move into professional 3D markets with hardware that powerful.

Suddenly, commodity PCs gave you all the SGI power at a fraction of the price, and now SGI is just a fondly-remembered footnote in history while NVIDIA occupies the HPC space they once did, with their DGX Station and Tesla hardware setups.

I want everyone to think about that for a moment - that we have GPUs now that do ray-tracing in real time, which Blender gladly utilizes, and that's a task that people used to need literal clusters of SGI Origin and Onyx systems totaling out to six or even seven figures to get anywhere near accomplishing. We have more graphics processing power than we all know what to do with, and what do we do with it? Play video games with it, or better yet, make those games in the first place!
 
$1,600-1,800 for a GPU to back up a $1,000 Valve Index full kit (and there are even more excruciatingly expensive ones out there that make the Index look cheap, like the Varjo Aero), just to try and get these games to run smoothly? Yeah, I think most people are just gonna go PS5 + PSVR2 for that at around $950-1,050 in all likeliness, but my extensive SteamVR library wouldn't carry over.

And then some people just do not want to ever use a controller.
 
Then you save lots of money on Steam sales and get to enjoy literal generations worth of gaming on the same system, which consoles just can't offer. The closest any particular console got to the long legacy of PC gaming is the early CECHA/CECHB PS3 with a whole PS2 section on the motherboard, capable of running three generations of gaming in one box, but later consoles lost the ability to run PS2 disc games, and PS4 and PS5 won't even run PS1 disc games despite happily letting you run downloaded, emulator-packed versions from PSN.

I wouldn't use Steam or any other digital marketplace as a selling point for either of the reasons you brought up.
At most it breaks even, on both points.
Steam has nothing to do with backwards compatibility. Not really.
 
I think a big part of why high-end gaming is becoming too niche is that mainstream gaming has ground to a standstill over the last half decade. I mean, the most popular mainstream GPU is arguably still the GTX 1060, which is now a six-year-old GPU. When I bought a GPU this year, the fastest GPU I could buy that used a 6-pin power connector was the three-year-old 1650 Super.

It used to be that mainstream gaming would get you about half the performance of enthusiast-class cards, and that every generation of enthusiast-class GPUs would have a corresponding mainstream product line. Now enthusiast-class cards are like six times faster than mainstream cards, because mainstream cards are basically not getting updated anymore and we are still seeing ancient GPUs getting enormous sales. Mainstream graphics is falling behind, and that is why enthusiast gaming is becoming so niche. It used to be that mainstream gamers would get a taste of enthusiast gaming with their hardware, and it was enough to justify developers putting a ton of effort into pushing ahead with better graphics. The fact is that 90% of gamers are not enthusiasts, and if the only people getting better hardware year on year are enthusiasts, that is not really enough to compel developers to invest the insane $$ needed to push next-gen graphics.

The fault here is Nvidia and AMD. We should not be seeing GTX 1060s (or even 1650 Supers) still being sold in 2022; these cards are ancient. Nvidia should have discontinued these product lines years ago and should be releasing RTX 2650s and 3650s with performance comparable to RTX 2060 GPUs and above, with lower power requirements and better thermals. This is what used to happen, and it pushed the industry ahead. Similarly, AMD should not be releasing RX 6500 XTs in 2022 that are no faster than the 1650 Super: AMD is part of the problem. If AMD did their job and actually competed with Nvidia, it would not allow for this blatant neglect of the mainstream gaming market segments.

Anyway, I firmly believe that the answer to fixing the problem with enthusiast gaming is to look at the enormous neglect of the mainstream market. These two market segments have a symbiotic relationship and are actually very dependent on each other; if one is not healthy, neither will the other be.
 
You are exactly correct, but I guess the game plan is to let the glut of high-end used cards filter down and become the mainstream cards. I can't believe Nvidia and AMD intend this to be permanent, but perhaps until the weird supply chain issues of the last few years sort out, they're both OK with trickle-down from gamers and miners selling old cards.

But let's not forget that the 6600 XT became available in August 2021 and has been carried on by the 6650 XT. These are available new today from Newegg or Amazon at $270. So there ARE a few new-ish mainstream cards out there.

Yes, they "should" be sub $200 cards but we all know the inflation ship sailed years ago.
 
I feel there is some tension going on if we look at Cyberpunk, Elden Ring, Flight Sim, etc., their PC sales, and the type of computers they need to run comfortably:

) How niche is it really?
) How true is it that devs do not push graphics? The new Plague Tale, Callisto, etc. are quite hard to run and seek to find the limit; CoD needs to run on more hardware and perform better, but they still pushed the graphics to an incredible level. I suspect the new God of War will look fantastic.

) How hard and how high-priced is it really? The demand seems huge (i.e., again, how niche is it really? High-performance PC gaming became incredibly mainstream, at least that is my feeling. I was one of the very few with a serious 3D card in 1998 in my whole small-city school, and a household had a single game-capable computer if it had one at all; I would imagine it is now incredibly common to have one per kid.)

) The Steam hardware survey seems to show about 30-35 million computers with a video card significantly stronger than 1060-class; even if only 80% of such machines have a Steam account, we are above 40 million. Is that niche? Worldwide, obviously, but that is more than the TV rating of the most-watched free-to-watch NBA Finals game.
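
For what it's worth, the adjustment in that last point is just a coverage correction: scale the observed Steam number up by the share of such machines Steam actually sees. A rough sketch with the poster's own figures (the 80% coverage share is an assumption):

# Rough sketch of the survey-based estimate; every input here is an assumption.
steam_machines_above_1060_class = 32_500_000  # midpoint of the quoted 30-35 million
steam_coverage = 0.80                         # assume Steam sees only 80% of such machines

estimated_total = steam_machines_above_1060_class / steam_coverage
print(f"Estimated machines worldwide: ~{estimated_total / 1e6:.0f} million")  # ~41 million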
 
) The Steam hardware survey seems to show about 30-35 million computers with a video card significantly stronger than 1060-class; even if only 80% of such machines have a Steam account, we are above 40 million. Is that niche? Worldwide, obviously, but that is more than the TV rating of the most-watched free-to-watch NBA Finals game.
That is an interesting point. Just coming from the perspective of the console wars, a 30 million install base is generally considered a middle-of-the-road success (i.e. not a flop, but not a massive success either). For example, the N64 sold about 33 million units lifetime, and it did get some decent third-party exclusives, but third-party developers definitely preferred the PS1 and its 100 million install base. That said, the even bigger appeal to a developer would be gaining access to a 133 million unit install base by developing a multi-platform third-party title that is easily ported with minimal extra development time. The problem with the N64 was a little bit similar to the problem that I believe enthusiast-level PC gaming is encountering today: it was tough for developers to make games for and port to other platforms. If you developed a game for the N64, you were in a tough spot if you wanted to port it: you almost had to re-develop the game from scratch for other platforms.

The problem with enthusiast gaming isn't quite the same as with the N64 but it is similar in that if you want to develop a game that will truly push an RTX 4090 and show off what it can do with ray-tracing, etc you almost have to re-develop a completely separate game that will look great and play well on a GTX 1650 because that card is just so far behind now. It's easier for developers to just develop for the 1650 and then add a few underwhelming bells and whistles (like ultra mode or whatever that maybe uses more than 4GB of VRAM) for the enthusiasts than it is the other way around.

I truly believe that the solution to the problem is to close the gap between the midrange and enthusiast-level cards. It is a problem when I read, a month ago, that the 1060 is finally being dethroned as the most popular GPU... by the 1650. The 1650 is even slower than the 1650 Super or 1060! What is going on here? If an RTX 3650 that is twice as fast as the 1650, ray-tracing capable, and maybe 1/3 to 1/2 as fast as the 4090 could become the most popular card, it would be a lot easier to develop a killer, demanding game for the 4090 that still plays decently on a mid-range computer without a ton of extra work. As for the 1650 and 1060, they need to be discontinued ASAP. In 2022 they shouldn't be in production, period, let alone be dethroning each other for the crown of most popular GPU out there right now.
 
At least the 1650 has the redeeming quality of being the fastest slot-powered GPU (discounting the A2000),
and of not using a dumb 4x lane configuration like the 6500 XT.
Still a sad state of affairs, though.
 
It's easier for developers to just develop for the 1650 and then add a few underwhelming bells and whistles (like ultra mode or whatever that maybe uses more than 4GB of VRAM) for the enthusiasts than it is the other way around.
I think more and more we will see games developed to run at just 60 fps, with variable-rate shading and aggressive upscaling for people who sit far from a TV, on roughly 6700 XT / 2070 Super / 2080-class cards (which is what the PS5 and Xbox Series X are), with a 30 fps high-quality mode. That by itself would make them hard to run at native 4K 120 fps out of the box for most high-end cards, even before adding higher settings; you need to be more than four times faster than a PS5.

How many cards will comfortably run, at native 4K, what a PS5 runs at 30 fps with all the upscaling, variable-rate shading, and device-specific optimization going on? I feel the strongest cards already do not have an easy time with the latest Fortnite, and that will be the norm for Nanite UE5 titles.
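
The "more than four times a PS5" figure follows from simple pixel-and-frame-rate arithmetic. A sketch with assumed numbers (the ~1440p internal render target for the console's quality mode is an assumption; real titles vary):

# Why native 4K at 120 fps demands several times a PS5's throughput (rough model).
console_pixels = 2560 * 1440   # assumed internal resolution before upscaling to 4K
console_fps = 30               # the console's high-quality mode mentioned above
native_4k_pixels = 3840 * 2160
target_fps = 120

pixel_ratio = native_4k_pixels / console_pixels  # 2.25x the pixels per frame
fps_ratio = target_fps / console_fps             # 4x the frames per second
print(f"Raw throughput ratio: ~{pixel_ratio * fps_ratio:.1f}x")  # ~9x, before any extra
# cost from higher settings or from losing per-console optimizations.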
 
Slightly off topic, but related to these very high prices... I think I'm just gonna focus on older games and less demanding indie games until I decide I'm ready to build a new system. Problem was, last year I bought the LG 48X, which is a fantastic 4K TV computer monitor, but I can't run it at native res at a full 120 Hz for any newer games. So I'm just gonna hold off on playing the likes of Red Dead Redemption 2, Cyberpunk, et cetera until I can build a new system, which probably won't be until the next generation of cards, when the prices on the current gen of AMD and Nvidia are lower.
 
I am quite happy with my used parts from HardForums :D I picked up an Asus TUF Z690 WiFi (1-2 year warranty) + i5-12600K for $325 shipped, and an EVGA RTX 3080 XC3 with 2 years of warranty left for $525 shipped!

Total price: $850

I am running a 27" 1440p 165 Hz monitor. I play BF1, BF5, WoW, Diablo, and COH2, and will play COH3 when it releases. COH3 I am excited about.
 
I wouldn't use Steam or any other digital marketplace as a selling point for either of the reasons you brought up.
At most it breaks even, on both points.
Steam has nothing to do with backwards compatibility. Not really.
Never intended to say Steam has anything to do with backwards compatibility in the midst of my rambling; it's more of a side effect of PC games not being locked to certain generations of hardware for the most part, at least until you start getting back into the Windows 9x era - that's when you start needing more period-appropriate builds. (I actually have a P4EE 3.2 GHz build for that exact purpose.) Newer stuff past the Windows XP and especially Vista era tends to run fine on today's hardware and OSes.

I just like being able to buy a game once and be fairly confident that it'll run on whatever computer I get down the road, often far better to boot, and consoles didn't really permit that even when they could. Why not let a PS5 run a PS1 disc game out of the box, for instance? Gotta keep my PS3 around for that.

I see value in not being locked into a certain generation of hardware, especially after seeing how owners of PS4 games might actually have to pay to upgrade to PS5-native versions once they have the new console whereas a PC version was already "upgraded" from the start if your hardware could handle it.

Also, yeah, I'm one of those guys who will casually fire up Doom and UT'99 on a modern 12700K/7900 XTX/Windows 11 build, because why not?

Problem is, on PC, we lost the option for physical, no-Internet-needed releases of games a long time ago. The closest thing we've got now is backing up DRM-free GOG installers to a physical medium once downloaded, and even then, a lot of retail, physical boxed copies of PC games nowadays are really just a Steam code, or an Origin/EA Downloader code, or a Ubisoft Uplay code, or a Rockstar Games Launcher code... yeah, there's too damn many of those launchers now.

Then you look at publishers like Limited Run Games, Strictly Limited Games, etc. that offer fancy physical releases of games, sometimes old ones, sometimes new ones, often old ones on new platforms, and almost every time, it's console-only. Switch and PS4 releases for days, but not a single PC release even when the games in question obviously have PC releases. Makes me wonder if it's because of the above point about those PC versions just being a DRM key for one of the aforementioned services.
 
Never intended to say Steam has anything to do with backwards compatibility in the midst of my rambling; it's more of a side effect of PC games not being locked to certain generations of hardware for the most part, at least until you start getting back into the Windows 9x era - that's when you start needing more period-appropriate builds. (I actually have a P4EE 3.2 GHz build for that exact purpose.) Newer stuff past the Windows XP and especially Vista era tends to run fine on today's hardware and OSes.

Problem is, on PC, we lost the option for physical, no-Internet-needed releases of games a long time ago. The closest thing we've got now is backing up DRM-free GOG installers to a physical medium once downloaded, and even then, a lot of retail, physical boxed copies of PC games nowadays are really just a Steam code, or an Origin/EA Downloader code, or a Ubisoft Uplay code, or a Rockstar Games Launcher code... yeah, there's too damn many of those launchers now.

I dislike keeping track of which service a game is on as much as the next person, but you have rose-tinted glasses if you think it was any less of a shit show back then.
Especially if you are bringing up Vista, an operating system that couldn't run on most of the computers that bore its sticker.

It's merely a different kind of shit show these days.
... never mind... I pretty much agree with everything else you're saying. I just... let's agree to disagree.
 