I have said this 1000x and I am saying it again for myself because I am irrationally aroused by the 3090.
Whenever you buy hardware based on synthetic benchmarks, relative performance to previous generations, or features alone, you are probably going to make an expensive mistake.
I have been doing this since 1998. The questions are:
(1) What is the most demanding game I will be playing;
(2) What resolution will I use;
(3) What is my minimum frame rate requirement;
(4) What features/settings do I want/need;
(5) Does the potential card hit all of the above with room to spare?
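The checklist above really boils down to one comparison: does the card's real-world frame rate in your killer app clear your minimum with headroom? A toy sketch (the function name, numbers, and 25% headroom factor are my own illustration, not real benchmark data):

```python
def card_meets_needs(bench_fps, min_fps, headroom=1.25):
    """Return True if the card's measured FPS in your most demanding
    game, at your resolution and settings, clears your minimum frame
    rate with room to spare (default: 25% headroom)."""
    return bench_fps >= min_fps * headroom

# Example: 4K/120fps target, and the candidate card benches 150fps
# in your killer app -> 150 >= 120 * 1.25, so it just barely passes.
print(card_meets_needs(150, 120))
```

The point of the headroom factor is item (5): a card that only just hits your target today is the one that falls short when the next Battlefield or GTA lands.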
So, for example, if you play Battlefield, we know Battlefield 6 is coming. We know GTA 6 is coming. We know certain games tend to push hardware (anything set outdoors, with foliage).
Can you imagine how sick you would feel if you dropped $1,500 on an RTX 3090 and then a year or two later a next-gen game launched (we are on the cusp of a new console gen, after all) and it didn’t quite hit that 4K/120fps HDR target you wanted? It is one thing to upgrade from an $800 card in that time, but $1,500?
I think you not only need to have the $1,500 but also be willing to risk that it will fall short when it really matters.
I mean, are we really building PCs around this Cyberpunk game?
I always look at the Frostbite engine and Grand Theft Auto as my benchmarks because there are TREES that get rendered. Pick your killer app and then go one level ABOVE what it requires.
Or you just wait until you load a game you are dying to play and you just can’t get the FPS you need.
I bought two 980s at launch for SLI, and then right after launch they announced that Shadow of Mordor could use more VRAM than I had. That setup ended up SUCKING in Battlefield and got replaced with a single 1080 Ti. That card was not THAT much faster, but it lasted three years because it had the bus speed and VRAM to knock everything out of the park at 1080p.
I also once bought a 590, which was touted as a giant killer. It also sucked, running so hot it destabilized the whole system.
The RTX 3080 Ti will most likely be the card to own for 4K, probably next year. There will probably be better displays by then as well, and better benchmarks. I just have a bad feeling about the 3080 at only 10GB and the 3090 at $1,500. My gut says wait it out.
And the fact that Fortnite and Minecraft are our RTX benchmarks does not bode well for the longevity of these cards.
I think NVIDIA accomplished a lot but the purchase decision is always subjective and based on the games you play and will play, and your monitor!
I think very similar to you. In my case if I were going to jump this year it would be a 3080 realistically.
In the last several iterations I've definitely gravitated towards Ti cards, and I will "likely" wait for the 3080 Ti, but it's going to be tough. I'm currently on a 1080 Ti and I use a 4K 120Hz (potential) monitor with G-Sync. I'm not terribly picky; as long as I can get the game in the "G-Sync zone," I'm happy.
The "step up" and trade-in programs only go out so far.