Well, developers have to focus on the consoles, then offer upgrades for the PC audience in terms of textures and bling. But the consoles, for now and the foreseeable future, are the baseline.

Looks like if I'm upgrading to 4xxx, I'd be kissing my mini-ATX cases goodbye. The performance of the new cards looks...nice...but for what purpose? So I can get 4K 100+ FPS instead of 4K 95 FPS in Spider-Man: Remastered? Back in the day, graphics in software were improving at roughly the same rate as GPU horsepower. But now? Software developers have no use for that much GPU power. SLI has been dead/redundant for quite a long time now (and I remember having to convince people it was dead a few years back). We're entering a new ballgame. I don't blame the developers; they're just trying to target and accommodate the common consumer rather than the enthusiast, and the buy-in on a regular GPU has increased an incredible amount over the last 5 years.
The bulk of gamers are still at 1080p, with 1440p coming in second and 4K way in the back.
But the last 2 years didn't help: the average gamer (60% by the Steam hardware survey) is still running 16 GB of RAM on a GTX 1060 at 1080p, with a 6-core CPU in the 2.3–2.7 GHz range. If Steam makes up 90% of game sales, more than half your potential buyers are still at or below what the modern consoles are capable of.
So, to ensure a game is available to the widest audience possible, developers, for the first time probably ever, have to look at the base specs of the consoles and dumb it down for PC users…