Looks like if I'm upgrading to 4xxx, I'd be kissing my mini-ATX cases goodbye. The performance of the new cards looks...nice...but for what purpose? So I can get 4K 100+ FPS instead of 4K 95 FPS in Spiderman: Remastered? Back in the day, graphics in software were improving at roughly the same rate as GPU horsepower. But now? Software developers have no use for that much GPU power. SLI has been dead/redundant for quite a long time now (and I remember having to convince people it was dead a few years back). We're entering a new ballgame. I don't blame the developers; they're just trying to target and accommodate the common consumer rather than the enthusiast, and the buy-in on a regular GPU has increased an incredible amount over the last 5 years.
I feel the same way. There was a time when my FPS peaked around 30-50 (without VRR) maxed out at whatever was considered high resolution at the time. And so the jump to 60 fps the following gen was very noticeable. And then even from 60 to, say, 80-100. And then VRR came along and made anything 75+ great. Now the baseline with a 3090 is 4K 80-100fps with VRR. So I can't see myself bothering with a 4090 just to get 100-110fps. It's such a minimal difference in my gaming experience (and I hardly game anymore) that I'm staying put this gen.