4K Gaming: Need your advice for a new graphics card

Cannibal Corpse

[H]ard|Gawd
Joined
Sep 22, 2002
Messages
1,277
Hello all,
I recently finished putting together the following system (see signature, or specs below), and I need to know which nVIDIA graphics card would suffice for 4K gaming (with every game's settings at Ultra).

CPU: Ryzen 7800X3D
MoBo: GIGABYTE B650 Aorus Elite AX
RAM: CORSAIR VENGEANCE RGB DDR5 RAM 32GB (2x16GB) 6000MHz CL36
SSD: Crucial T700 2TB Gen5 NVMe M.2 SSD
GPU: RTX 4060 Ti
SOUND: CREATIVE Sound Blaster ZxR Sound Card
PSU: SEASONIC 850W Prime Titanium PSU
DISPLAY: Sony BRAVIA 55" OLED Display (XR-55A80K) (4K @ 120Hz)
CASE: CORSAIR Crystal Series 570X RGB ATX Mid-Tower Case


Should I get an RTX 4080 or 4090?

Thanks for your insights and advice!
 
Last edited:
Hello all,
I recently finished putting together the following system (see signature, or specs below), and I need to know which nVIDIA graphics card would suffice for 4K gaming (with every game's settings at Ultra).

CPU: Ryzen 7800X3D
MoBo: GIGABYTE B650 Aorus Elite AX
RAM: CORSAIR VENGEANCE RGB DDR5 RAM 32GB (2x16GB) 6000MHz CL36
SSD: Crucial T700 2TB Gen5 NVMe M.2 SSD
GPU: RTX 4060 Ti
SOUND: CREATIVE Sound Blaster ZxR Sound Card
PSU: SEASONIC 850W Prime Titanium PSU
DISPLAY: Sony BRAVIA 55" OLED Display (XR-55A80K) (4K @ 120Hz)
CASE: CORSAIR Crystal Series 570X RGB ATX Mid-Tower Case


Should I get an RTX 4080 or 4090?

Thanks for your insights and advice!
4090. Go all in.

You didn't mention what you play, but 4K 120 at Ultra is tough in some recent titles; some aren't doable even on a 4090. Some popular, but less recent, titles are a piece of cake, so you can cap the frame rate and reduce power, heat and noise.
 
Get the 4090, as we are already getting games that give it a workout at 4K Ultra. Surely it does not have to be said after all this time that the 4080 makes no sense for 1200 bucks anyway.
 
Get the 4090, as we are already getting games that give it a workout at 4K Ultra. Surely it does not have to be said after all this time that the 4080 makes no sense for 1200 bucks anyway.
It's linear price/performance at $1200, and the 4080 is often available for $1000 now, which actually makes it the better value. I mean, it's great if you can afford to go all out, but the 4090 is not 60% faster than the 4080 :p.
 
It's linear price/performance at $1200, and the 4080 is often available for $1000 now, which actually makes it the better value. I mean, it's great if you can afford to go all out, but the 4090 is not 60% faster than the 4080 :p.
Well, if it's 1000 bucks then of course things change. At 1200 bucks it made more sense to get a 4090, since it provided the same or better price/performance, when you normally get a way worse card for your money at the top of the stack. And he said he wanted 4K Ultra, so I still say get the 4090 if he can afford it.
 
Well, if it's 1000 bucks then of course things change. At 1200 bucks it made more sense to get a 4090, since it provided the same or better price/performance, when you normally get a way worse card for your money at the top of the stack. And he said he wanted 4K Ultra, so I still say get the 4090 if he can afford it.
At $1200, as I said, it's linear price/performance for the 4080: around a third more money gets you around a third more performance from the 4090. If he has the cash, it's just a matter of how fast he wants to go, not value.
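
To sanity-check the "linear price/performance" point, here's a minimal sketch in Python. The $1,199/$1,599 figures are the launch MSRPs; the ~30% 4K performance gap is a rough assumed number, so substitute whatever benchmark averages you trust.

# Rough price/performance comparison; the relative-performance figure is an assumption.
cards = {
    "RTX 4080": {"price": 1199, "rel_perf": 1.00},  # baseline
    "RTX 4090": {"price": 1599, "rel_perf": 1.30},  # assumed ~30% faster at 4K
}

for name, c in cards.items():
    print(f"{name}: ${c['price']}, {c['rel_perf']:.2f}x perf, "
          f"${c['price'] / c['rel_perf']:.0f} per unit of performance")

print(f"price ratio: {1599 / 1199:.2f}x vs. perf ratio: 1.30x")

Drop the 4080 to $999 and the same arithmetic swings the value math back toward the 4080, which is the point being made above.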
 
That's kind of the crazy thing about the 4090... or rather, this generation. At the 4080's default MSRP, the 4090's performance gains actually scaled pretty much linearly with (or arguably sometimes better than) the price increase over the 4080. That held true down most of the product stack, AFAIK. At $1k, from a pure performance standpoint, the 4090 starts losing a bit... but honestly? If it was even close before, that's saying a lot; it's very unusual, since the top-end splurgy halo card of a generation is normally never even a remotely good value proposition.

However, then we get into VRAM. The 4090 has 1.5x the VRAM of the 4080 (24GB vs 16GB)... which might or might not matter, depending on how much you intend to use DLSS. At 4K you might always be forced to use DLSS anyway. At 3440x1440 I can actually get away without using DLSS in many games while getting reasonable/playable framerates, and I am already hitting around 15-16GB of VRAM in some of them (a quick way to log your own usage is sketched below). The 4080 would not be able to do this.

That said, where are all of you getting 1k 4080s? I see 1.1k-ish at minimum: https://pcpartpicker.com/products/video-card/#c=542&sort=price&xcx=0

Tbh, if I were considering either card, I would at least wait to hear some details on the 4080 Super, though. While the 4090s are selling out, they don't exactly seem to be flying off the shelves quite yet.
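
On the VRAM point above: if you want to see where your own games land, a simple way is to poll nvidia-smi while playing. A minimal sketch (assumes nvidia-smi is on your PATH; the query fields used are standard ones):

import subprocess
import time

# Poll GPU memory use once per second; run this in the background while gaming.
QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    used_mb, total_mb = map(int, subprocess.check_output(QUERY, text=True).split(","))
    print(f"VRAM: {used_mb} / {total_mb} MiB ({used_mb / total_mb:.0%})")
    time.sleep(1)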
 
Last edited:
If you are doing 4k or other high resolution displays for gaming, you should buy the absolute fastest card money can buy at the time you are building or upgrading the system.

Value really isn't part of the equation in most instances. I get that people want value, or what they perceive to be value, but it really doesn't apply here in the current GPU landscape. If you want that, you need to stick to 2560x1440 or even 1920x1080 displays. At 4K, you have to pay to play.

4090 struggles at times in the latest AAA games. Anything below that is going to be objectively worse. There really isn't much else to say about it.
 
Last edited:
Not that my opinion counts in any way since he specified nVidia, but my Radeon 6800XT has been amazing for me at 4K with the games I play. I also don't play a shit-ton of cutting-edge games...
 
I also don't play a shit-ton of cutting-edge games...
And there's the difference between High vs Ultra, RT on vs RT off, whether a lot of games are playable at peasant-tier sub-90 fps performance, etc...

But if you want to play the current and upcoming console games that run at 30 fps on the Xbox Series X/PS5 at 3-4x the resolution and 3-4x the frame rate with higher detail, a video card at least 8 times as powerful as a 6700 (with a CPU to match) would be quite useful, and that's around 2.5 times a 4090... or just make peace with upscaling.
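
To make that arithmetic explicit, here's a tiny sketch; all of the multipliers are rough assumptions for illustration, not benchmark numbers.

# Rough sketch of the console-to-PC scaling argument above (all figures assumed).
res_scale = 3.5                      # ~3-4x the console's render resolution (pixel count)
fps_scale = 3.5                      # ~3-4x the console's 30 fps target
raw_target = res_scale * fps_scale   # naive ~12x the console's pixel throughput

gpu_needed_vs_6700 = 8               # the "at least 8x a 6700" figure above
rtx4090_vs_6700 = 3.2                # assumed ballpark raster gap between a 4090 and a 6700

print(f"naive throughput target: ~{raw_target:.0f}x a console-class GPU")
print(f"8x a 6700 works out to ~{gpu_needed_vs_6700 / rtx4090_vs_6700:.1f}x a 4090")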
 
If you are doing 4k or other high resolution displays for gaming, you should buy the absolute fastest card money can buy at the time you are building or upgrading the system.

Value really isn't part of the equation in most instances. I get that people want value, or what they perceive to be value, but it really doesn't apply here in the current GPU landscape. If you want that, you need to stick to 2560x1440 or even 1920x1080 displays. At 4K, you have to pay to play.

4090 struggles at times in the latest AAA games. Anything below that is going to be objectively worse. There really isn't much else to say about it.
It is as simple as that. There is no such thing as too much GPU for Ultra settings at 4K high refresh, as games are getting more and more demanding.
 
The 4090 is the only card available that's worth bothering with at 4K. Speaking as a 7900 XTX owner trying to make do at 4K.
I’m running a 7900 XT (pretty heavily overclocked, 3.16/2.75) and I’m not having any trouble with 4K gaming. What games are you seeing disappointing performance in? The only game I have that doesn’t run as well as I’d like at 4K is Cyberpunk, and that’s only when enabling RT; rasterized, it runs great. Perhaps it’s your CPU that’s holding you back?
 
I’m running a 7900 XT (pretty heavily overclocked, 3.16/2.75) and I’m not having any trouble with 4K gaming. What games are you seeing disappointing performance in? The only game I have that doesn’t run as well as I’d like at 4K is Cyberpunk, and that’s only when enabling RT; rasterized, it runs great. Perhaps it’s your CPU that’s holding you back?
It's a 5800X3D, so no, not really.

Cyberpunk is the worst, but Alan Wake 2 is going to be worse (granted, even the 4090 isn't acceptable there), and I have a bad feeling that trend will continue.
 
Hello all,
I recently finished putting together the following system (see signature, or specs below), and I need to know which nVIDIA graphics card would suffice for 4K gaming (with every game's settings at Ultra).

CPU: Ryzen 7800X3D
MoBo: GIGABYTE B650 Aorus Elite AX
RAM: CORSAIR VENGEANCE RGB DDR5 RAM 32GB (2x16GB) 6000MHz CL36
SSD: Crucial T700 2TB Gen5 NVMe M.2 SSD
GPU: RTX 4060 Ti
SOUND: CREATIVE Sound Blaster ZxR Sound Card
PSU: SEASONIC 850W Prime Titanium PSU
DISPLAY: Sony BRAVIA 55" OLED Display (XR-55A80K) (4K @ 120Hz)
CASE: CORSAIR Crystal Series 570X RGB ATX Mid-Tower Case


Should I get an RTX 4080 or 4090?

Thanks for your insights and advice!
3090, 3090 Ti, 4090, 7900 XT, 7900 XTX. Any card with 20GB of VRAM or more is gonna last until 2025 at the earliest for 4K maxed out + RT/PT. DLSS can also help reduce VRAM usage, but as someone who wants perfect PQ, I never bother with that crap.
 
From someone who went from a 4090 to a 4080 for less power consumption and heat: the 4080 does great at 4K. I still play all my games with the same Ultra/Max settings as on my previous 4090.
I would not sacrifice a 4090 for a 4080 at 4K. No way. The 4080 is a 1440p card if you want to run games at max settings, not a 4K card.

Also, you don't have to run the 4090 in 600W mode. You can plug in 3 cables instead of 4 and the vBIOS will default to a 450W-capped mode.
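
On the power side, you can also just set a software power limit instead of (or on top of) the cable trick. A minimal sketch using nvidia-smi from Python; the 320 W value is only an example, setting the limit needs admin rights, and it resets on reboot unless you script it at startup:

import subprocess

# Show the current and maximum board power limits.
print(subprocess.check_output(
    ["nvidia-smi", "--query-gpu=power.limit,power.max_limit", "--format=csv"],
    text=True))

# Cap the board at 320 W (example value); requires admin/root privileges.
subprocess.run(["nvidia-smi", "-pl", "320"], check=True)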
 
I would not sacrifice a 4090 for a 4080 at 4K. No way. The 4080 is a 1440p card if you want to run games at max settings, not a 4K card.

Also, you don't have to run the 4090 in 600W mode. You can plug in 3 cables instead of 4 and the vBIOS will default to a 450W-capped mode.
The 4080 isn’t just a 1440p card, no matter how much you wish it to be. Not trying to get into a thing here either; I’m just letting people who haven’t owned either card know that the 4080 is a fine 4K card so they can be more informed.
 
Last edited by a moderator:
From someone who went from a 4090 to a 4080 for less power consumption and heat: the 4080 does great at 4K. I still play all my games with the same Ultra/Max settings as on my previous 4090.
What you're saying doesn't really make a lot of sense, because if you were pushing your 4090 to the point where it drew noticeably more power than the 4080, then you can't possibly be running the same settings and getting the same performance.
 
The 4080 isn’t just a 1440p card, no matter how much you wish it to be. Not trying to get into a thing here either; I’m just letting people who haven’t owned either card know that the 4080 is a fine 4K card so they can be more informed.
I own both. The 4080 is not a 4K card unless you're fine with turning settings down.
 
Not for nothing, but I'm running all my stuff at 1440@144 on a 3070 without much of a hitch.

/shrug
Running games at 2560x1440 and running them at 3840x2160 are two very different things; the demand on your hardware is much greater. 1920x1080 or 2560x1440 can be done pretty well on midrange hardware for the most part. At 4K, we can't run these games at native resolution full time; we need to employ things like resolution scaling, dynamic resolution and DLSS just to make some of these games playable. This is especially true with ray tracing or path tracing, and it's even true with the RTX 4090. If the RTX 4090 can't do it in every game out there, then the 4080, 7900 XTX and everything below those damn sure can't.
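
The raw pixel counts make the gap concrete: 4K pushes 2.25x the pixels of 1440p and 4x the pixels of 1080p every frame, before ray tracing raises the per-pixel cost.

# Pixels per frame at common gaming resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} Mpx ({w * h / base:.2f}x 1080p)")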
 
I don't think 4K 120 is possible on anything other than a 4090. Even then, most games won't hit that target rate, especially if RT is enabled and frame gen is not offered. I can't even imagine playing the games I play on a 4080 when a 4090 is barely making ends meet.
 
Running games at 2560x1440 and running them at 3840x2160 are two very different things; the demand on your hardware is much greater. 1920x1080 or 2560x1440 can be done pretty well on midrange hardware for the most part. At 4K, we can't run these games at native resolution full time; we need to employ things like resolution scaling, dynamic resolution and DLSS just to make some of these games playable. This is especially true with ray tracing or path tracing, and it's even true with the RTX 4090. If the RTX 4090 can't do it in every game out there, then the 4080, 7900 XTX and everything below those damn sure can't.
I was only saying that because someone mentioned the 4080 was a 1440p card, while my lower-tier, previous-generation card is playing 1440p just fine. /shrug
 
I was only saying that because someone mentioned the 4080 was a 1440p card, while my lower-tier, previous-generation card is playing 1440p just fine. /shrug
My 3080 Ti will do most of the latest releases at 1440p OK, but it struggles compared to a 4080. The 4080 is the current-gen '1440p card'; that is evident in games like Alan Wake 2 in particular. The 4080 will play every current game with max settings at 1440p; any lower-end card won't. I guess I should add the qualifier of 40-50 FPS+, because some people might say their 2070 is just fine at 1440p at 30 FPS, but 30 FPS is not playable to me.
 
My 3080 Ti will do most of the latest releases at 1440p OK, but it struggles compared to a 4080. The 4080 is the current-gen '1440p card'; that is evident in games like Alan Wake 2 in particular. The 4080 will play every current game with max settings at 1440p; any lower-end card won't. I guess I should add the qualifier of 40-50 FPS+, because some people might say their 2070 is just fine at 1440p at 30 FPS, but 30 FPS is not playable to me.
Maybe I'm just not playing new enough games, which could totally be it. There aren't any really recent games that interest me.

In Path of Exile I get over 100 fps. When I was playing World of Warcraft, 200 fps mid-raid with maxed graphics. I unlocked the FPS in Fallout 4 and was at 200+. Doom 2016 I think is locked at 60. I played Warzone a few years back and got 100+ fps.

I guess, yeah, in newer games like whatever the current Call of Duty or Battlefield is, I'd see the difference.

Running a 5800X3D by the way.
 
Maybe I'm just not playing new enough games, which could totally be it. There aren't any really recent games that interest me.

In Path of Exile I get over 100 fps. When I was playing World of Warcraft, 200 fps mid-raid with maxed graphics. I unlocked the FPS in Fallout 4 and was at 200+. Doom 2016 I think is locked at 60. I played Warzone a few years back and got 100+ fps.

I guess, yeah, in newer games like whatever the current Call of Duty or Battlefield is, I'd see the difference.

Running a 5800X3D by the way.
Those are all last-gen or older games.

If you look at the current crop of AAA 2022/2023 titles, yes, you'll run into issues: Cyberpunk, Alan Wake 2, etc., assuming you want ray tracing. But again, I could just as well claim that my 1660 is a 4K card because all I ever do is play Oxygen Not Included or MS Minesweeper. Like, come on guys, we shouldn't have to keep qualifying that we're talking about the latest games in 2023.
 
Those are all last-gen or older games.

If you look at the current crop of AAA 2022/2023 titles, yes, you'll run into issues: Cyberpunk, Alan Wake 2, etc., assuming you want ray tracing. But again, I could just as well claim that my 1660 is a 4K card because all I ever do is play Oxygen Not Included or MS Minesweeper. Like, come on guys, we shouldn't have to keep qualifying that we're talking about the latest games in 2023.
I explained that in the first line you quoted.
 
What you're saying doesn't really make a lot of sense, because if you were pushing your 4090 to the point where it drew noticeably more power than the 4080, then you can't possibly be running the same settings and getting the same performance.
I wasn’t really pushing the 4090 to its upper power limits, but then I also didn’t need to, I guess. The heat is definitely less, though. I see that no matter what is said here, everyone wants to justify their 4090 purchase. It’s a great card... enjoy.
 
I wasn’t really pushing the 4090 to its upper power limits, but then I also didn’t need to, I guess. The heat is definitely less, though. I see that no matter what is said here, everyone wants to justify their 4090 purchase. It’s a great card... enjoy.
I'm not trying to justify anything and would be giving the same advice no matter what hardware I was using. Having a 4090, though, I know firsthand that a 4080 is absolutely not capable of maintaining even 60 FPS in some of the latest games with all settings cranked.
 
I was only saying that because someone mentioned the 4080 was a 1440p card, while my lower-tier, previous-generation card is playing 1440p just fine. /shrug
Avatar's requirements just recommended a 4080 to play at 1267p/60 Hz on Ultra settings.

Saying a card can do 1440p, 4K, etc. without specifying which game and settings is a bit empty; many cards can play CS:GO/Rocket League at 1440p, while about none can play Alan Wake 2 or Lords of the Fallen at 1440p with everything maxed out.

The terms make more sense for memory bus/cache type design decisions, I would guess, as some cards clearly drop off at 4K because of cost-saving design decisions.
 
Last edited:
As always, what you play is going to be the biggest determinant of this. That said, if you're going Nvidia, 4080 or higher. Anything less than a 4080 for 4K is a waste of time (plus you'll be getting less than 16GB of VRAM below that tier, which is a non-starter in my view at that price level). A 4090 if you can afford it and/or are willing to spend that kind of money. I'm gaming at 4K on my 7900 XTX and it's crazy demanding. I wouldn't go less than a 4080/7900 XTX at that resolution, and even then, there are certain titles that put up quite the fight.
 
wat

holy crap, haha

wow, I'm so far behind the times. XD
He read the requirements wrong because they are for 4K, not 1080p. That said, they do show Balanced upscaling, which is not much above 1080p.
 
He read the requirements wrong because they are for 4K, not 1080p. That said, they do show Balanced upscaling, which is not much above 1080p.
They are not for 4K at all, but yes, my mistake, and I just posted the 1267p figure (being under 1440p and only 60 fps are the two important points).
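
For reference, here's roughly where the common upscaler presets land at a 4K output. The per-axis scale factors below are the documented FSR 2 ones (DLSS uses very similar ratios; its Balanced preset is slightly more aggressive at 58%), which is also where a figure like 1267p comes from.

# Internal render resolution at 3840x2160 output for common upscaler presets.
# Per-axis scale factors follow the FSR 2 presets; DLSS uses very similar values.
out_w, out_h = 3840, 2160
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

for name, factor in presets.items():
    w, h = round(out_w / factor), round(out_h / factor)
    print(f"{name:>17}: {w}x{h} (~{h}p)")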
 
You'll be spending upwards of $1800-$2100 for a 4090 at this point because of external factors.
 