I thought 2k meant 1440p?
According to Wikipedia, 2K ≈ 1080p, at least for television and consumer media. In digital cinema it's a bit different: DCI 2K is 2048×1080, but I don't know much about that side of things. Monitor marketing sometimes slaps "2K" on 1440p panels too, which is probably where the confusion comes from.
Yeah, it is extremely unclear. I'm just going to assume the OP means 1080p. I honestly have no idea why they stopped labelling resolutions by the number of horizontal pixel rows (1080p, 1440p) and started using the approximate number of vertical columns (2K, 4K). It makes converting between the two confusing, especially with different aspect ratios floating around out there.
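To illustrate why the mixed labelling is annoying to convert, here's a quick Python sketch (the function name is mine) that derives the column count from the row count and an aspect ratio:

```python
from fractions import Fraction

def horizontal_pixels(rows: int, aspect: Fraction) -> int:
    """Number of pixel columns for a given row count and aspect ratio."""
    return int(rows * aspect)

# 16:9 panels: the "p" number is the row count, the "K" number is (roughly) the column count
print(horizontal_pixels(1080, Fraction(16, 9)))  # 1920 -> 1080p / Full HD
print(horizontal_pixels(1440, Fraction(16, 9)))  # 2560 -> 1440p / QHD, often marketed as "2K"
# Ultrawide marketed as "21:9" is really 64:27 on most panels,
# so the same 1080 rows gives a very different column count:
print(horizontal_pixels(1080, Fraction(64, 27)))  # 2560 -> 2560x1080
```

Same 1080 rows, two different "widths" depending on aspect ratio, which is exactly why a single number is ambiguous.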
As for the topic: at 1080p, I'm pretty sure a GTX 1660 Super is enough to max out even the most demanding games at close to a 50 fps minimum framerate. That said, in 2023 there seem to be more gamers who care about maintaining very high framerates, whereas in the old days all we cared about was that the minimum fps didn't fall below 30 in the most demanding segments. If the OP is absolutely set on 120 fps in even the heaviest games, then maybe a 4090 could pay off on a 1080p monitor. I'm just not sure that jumping from 50 fps to 100+ fps is worth the price tag of a 4090 (I honestly don't think my eyes could even perceive the difference between those two framerates).
My recommendation to the OP would be a GTX 1660 Super for 1080p gaming (or an RTX 3060 if you want some future-proofing). Without more detail or clarification, I can't see a clear benefit to buying a 4090 for this use case.
I blame this on Samsung, since they used "2K" in the marketing for the 2560×1440 screens on their cell phones. I don't get why no one just writes the resolutions out like they used to... It's a lot clearer.