Conventional wisdom: Need 4k+ monitors to make use of the RTX 3080/90 or RX 6800 series?

j_smithy

n00b
Joined
Jul 4, 2020
Messages
11
I say nah...

Just earlier this year I was still running my 1920x1200 60Hz monitor, and totally using up all the GPU performance I could get from my 2070 Super. I don't have eagle eyes but can easily notice aliasing in many games when rendered at native resolution. And even in some games with TAA the aliasing is still obvious, especially in motion (e.g. Microsoft Flight Simulator 2020). So I either crank up the internal render resolution, or use DSR if a game doesn't have an internal render resolution slider. Brute-force anti-aliasing, but it always looks great.

Now on my 2560x1440 60Hz monitor, and upgraded to a 30-series card. Throwing the newfound GPU performance at internal render resolution/DSR, and more pixels for ray tracing (higher pixel density, less noisy ray tracing too). Again using up most of the GPU performance I can get.
 
It all depends on what games and what settings you like. I'm on a 1440p 145Hz monitor OC'd to 165Hz now, and a 6800 would keep my FPS high and keep my in-game details high as well, vs. say a 3060, which struggles at 1440p, or a 3070, which does well in most cases...but a 6800 seems like the better choice.
 
The way I look at it, beyond cost vs. value (which is often very subjective), is to ask where you are going to run into hardware constraints.

The problem is that reviews have conditioned people to think some artificial benchmark or the raw frame numbers are the most important thing. And rather than think about it, people just buy the most expensive card they feel comfortable buying. That was fine when new cards were pushing frames into the 100s-150s. Nowadays you can get 300s or more - and much of that overage is wasted (undercutting any cost-benefit case for just buying the most you can afford). So it makes sense to look at how the system integrates with the card and where you have bottlenecks - and then not overbuy your system.

Quite often with the lower resolution monitors, reviews have demonstrated that the cards outrun the ability of CPUs to push more frames - thus the term 'CPU limited'. This is the most common explanation for why the top end of the new cards is suggested for 4k and not the lower resolutions. Sadly, the reviews don't take the next step and recommend against overbuying.
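A rough way to picture 'CPU limited' (illustrative numbers only, not benchmarks): the frame rate you actually get is roughly the lower of what the GPU can render and what the CPU can prepare, and dropping the resolution only raises the GPU side of that pair.

```python
# Illustrative sketch of the "CPU limited" idea, with made-up numbers.
def effective_fps(gpu_fps, cpu_fps):
    """The slower stage (GPU rendering vs. CPU feeding it) wins."""
    return min(gpu_fps, cpu_fps)

cpu_fps = 150  # hypothetical: frames this CPU can prepare per second in some title

for res, gpu_fps in [("4K", 90), ("1440p", 160), ("1080p", 240)]:
    limiter = "GPU" if gpu_fps < cpu_fps else "CPU"
    print(f"{res}: {effective_fps(gpu_fps, cpu_fps)} fps ({limiter} limited)")

# At 1080p the extra GPU headroom is wasted; the CPU caps the result at 150 fps.
```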

Software-counted frames alone are not the end of the discussion. We all understand about monitors with high refresh rates. But some people don't understand that the perceptual jump from 60 to 120 is far greater than the jump from 120 to 240. Meaning a 144Hz monitor is a huge upgrade over a 60Hz one...but a 240Hz is a negligible upgrade over the 144Hz. (Confirmation bias: people saying they prefer a more expensive 240Hz monitor is not borne out by blind testing.)
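To put numbers on that diminishing return - frame time is just 1000 ms divided by the refresh rate:

```python
# Frame time saved at each step up in refresh rate.
for lo, hi in [(60, 120), (120, 240)]:
    saved = 1000 / lo - 1000 / hi
    print(f"{lo}Hz -> {hi}Hz: {1000/lo:.1f} ms -> {1000/hi:.1f} ms per frame "
          f"(saves {saved:.1f} ms)")

# 60 -> 120 Hz shaves ~8.3 ms off every frame; 120 -> 240 Hz only shaves ~4.2 ms more.
```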

Again - this is not the whole show. A card pushing 100 frames in a game played on a 60Hz monitor does actually provide a better user experience than a lower-class card pushing 60 frames on the same monitor. But buying a 300-frame pusher for a 60Hz panel is a total waste - because the hardware limitations and human perception are such that the extra 200 frames won't improve the experience at all.

Thus - this generation of cards is so powerful that you need to look at the whole system to see how it fits. If you are running 1080p at 60Hz... you don't need a 3080. Hard stop. If you are running 1440p at 144Hz... a 3070 may actually be a good choice. Look at the reviews - any title getting over 144 frames means you are getting full value. If the title you want is getting less - and it's not because of CPU limitations - maybe a 3080 is required.

But from everything I've seen - the 3080 is made for 4k at 120/144Hz. It is both an invitation to the industry to make and sell everyone new large-format 4k high-refresh monitors...and a case of the GPU makers chasing the market as people's tastes shift toward large-format 4k TVs - which lack the features of a true monitor.
 
Last edited:
I'm planning to get a 3080 or equivalent for my 2k 144Hz display just so I can maintain high fps with high quality settings.
 
There is a lot more than just resolution (though that is a big differentiator).

For example, what refresh rate are you targeting? Do you have a FreeSync/GSync monitor? Do you wish to use ray-tracing? Any specific needs like HDMI 2.1?

I think that now is the time for 1440p ultrawide high refresh. I've gamed on 4K60 before, and it's nice but honestly not worth the performance hit. Also 60fps sucks even at 4K.

And I realize 4K120 is just now available, but I still think for the price you'd be better off with a 1440p ultrawide (cheaper and honestly a much better experience).
 
By the time I can afford a 6800XT it'll be a 1080P 60fps card lol

But if I was to buy one now I'd pair it with a 1080P 144hz panel for longevity and room for maxing out RT. With good AA I'm ok with 22-24" 1080P and I'd rather have room for games to grow into the gpu.
 
You'll want 4k monitors TODAY.

But what about in two years? Games will be more advanced and the top tier cards won't stretch as far.

No issue with using them on a 1440p
 

1) Not sure why, but last year when I tried my friend's 120Hz monitor for about an hour, I struggled to feel the extra fluidity I was hoping for (kinda like way back when I used to play at 90-100Hz refresh on my CRT). Maybe something about it wasn't set right, the game wasn't fast enough, or maybe something about my perception has slowed over the past decade or two? Anyway, my gaming refresh rate needs appear to be pretty pedestrian. If I get a locked 60fps in games with input lag <20ms I'm happy. I'm no longer that guy who needs to spot someone else running across and score the headshot.

2) While my framerate needs are pretty low by today's standards, my appetite for minimal aliasing artifacts is high. So the money goes towards making that minimal aliasing happen. Something about a very clean, aliasing free image is super pleasing to me lol.

So even though I run 1200p and 1440p monitors, I essentially place 4K+ monitor loads on my video cards through DSR or internal render resolution settings.
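As a sanity check on that '4K+ loads' claim, here is the pixel arithmetic (DSR factors are multiples of the panel's total pixel count):

```python
# Rendered pixel load: native panels vs. DSR on a 1440p panel.
# A DSR factor multiplies the panel's total pixel count
# (e.g. 2.25x on 2560x1440 renders 3840x2160 internally).
def megapixels(w, h):
    return w * h / 1e6

native_4k = megapixels(3840, 2160)      # ~8.3 MP
native_1440p = megapixels(2560, 1440)   # ~3.7 MP

for factor in (1.78, 2.25, 4.00):
    rendered = native_1440p * factor
    print(f"1440p @ DSR {factor}x = {rendered:.1f} MP "
          f"({rendered / native_4k:.2f}x a native 4K load)")
```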
 
Last edited:
I'm planning to get a 3080 or equivalent for my 2k 144Hz display just so I can maintain high fps with high quality settings.
This is what I did for my 1440p 165Hz monitor and I'm very happy with my choice. Not only does it allow me to push all max settings and hit my monitor's max FPS (or slightly under in the worst case), it also future-proofs me: while it's excellent today, graphics will only get more demanding, and the card you buy today for 165 fps is not going to handle games as easily 2-3-4 years down the line - and I plan to keep this GPU for 5 years. A stronger GPU today means I won't need to upgrade for longer in the future.

Even now, with all max settings in some older games such as Subnautica and The Evil Within, I sometimes get drops into the 120/130 FPS zone - on a 3080, at 1440p. And graphically intensive triple-A games? Those obviously run at even lower frames. So I'm not sure where this stigma that a 3080+ is for 4k only comes from.. It's ridiculous IMO

Also, the difference between a $500 (at minimum) 3070 and a $750-800 3080 is, let's say, $300. If you keep the card for 3 years, that's 300/36, or only about $8 more a month (less if you plan on re-selling the card, and why wouldn't you) for a much nicer experience that will let you run everything maxed out without worrying or fussing with anything.
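Spelling out that monthly arithmetic (prices and the 36-month horizon are the assumptions from the paragraph above):

```python
# Monthly cost difference between the two cards, using the assumed prices above.
price_3070 = 500   # assumed minimum street price
price_3080 = 800   # assumed upper-end price
months = 36        # planned 3 years of ownership

extra_per_month = (price_3080 - price_3070) / months
print(f"${price_3080 - price_3070} over {months} months = ${extra_per_month:.2f}/month")
# About $8.33/month, before accounting for any resale value.
```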
 
Last edited by a moderator:
I felt that the 2080 Ti was just about the perfect card for 1440p/120Hz. Sure, there can be tangible gains at that res going to the 3080/90, depending on the game and settings. But 4k >60Hz is really where these cards show their muscle.

Side note: on the issue of aliasing, be aware that 4k is much cleaner than 1440p. I honestly don't need to use AA anymore after stepping up to a 4k TV - a little added sharpening via the Nvidia in-game overlay is all. So just be aware that whatever edge-smoothing techniques you require to enjoy looking at 1440p, you won't necessarily need at 4k.
 
1) Not sure why, but last year when I tried my friend's 120Hz monitor for about an hour, I struggled to feel the extra fluidity I was hoping for
I'm in the same boat and just consider myself lucky. My 1080 Ti still has no problem playing everything great at 1440p 60Hz. At the rate games are progressing, I won't need a new GPU for another 5 years.

I tried it on a friend's rig and couldn't tell a difference. The funny thing is, despite swearing up and down about better "fluidity", he couldn't distinguish between 60Hz and 144Hz in the games of his choice when we did some quick experiments. (Of course he still swears by it despite his guesses being less accurate than a coin flip would have been lol.)
 
Maybe you weren't playing the right game. Try Doom Eternal or even an older game like Half-Life 2, the fluid motion of high refresh can't be missed.
 
Maybe you weren't playing the right game. Try Doom Eternal or even an older game like Half-Life 2, the fluid motion of high refresh can't be missed.
I disagree. Sure, there is a pretty big difference in technical terms, but for a less technical person I could absolutely imagine that a locked 60 FPS can look and feel very similar to 120 FPS if you're not paying close attention. I'm relatively technical, but I was playing Black Ops at 60Hz for a few matches last night (it was set that way from bringing my PC to my parents') and it eventually hit me that I wasn't even playing at 120Hz. Far from a bludgeoning difference. Motion clarity is certainly very nice...but I would completely understand if someone was hard-pressed to notice the difference in most gaming situations. If I was forced to pick between the two, I would pick higher IQ over higher refresh rate every time.
 
Maybe you weren't playing the right game. Try Doom Eternal or even an older game like Half-Life 2, the fluid motion of high refresh can't be missed.
Blind tests in the past, sponsored by a 120Hz monitor maker (so presumably set up to make 120Hz shine as much as possible), still showed that 10-15% of gamers were not able to tell the difference.

A minority, but I imagine it's still possible - a bit like in the past, when some people told you they either didn't see or didn't mind the flicker of CRT monitors under 85Hz. It felt impossible to many, but they must have been honest.
 
Last edited:
Yeah, I have a feeling the people that were bothered by CRT flicker (me included) are the same people that value high refresh.
 
The 3080/3090 or 6800 series cards make 4K120 a reality. Can you use a 1440P 144hz+ monitor for these cards? Absolutely. In fact, I would say that the RX 6000 cards shine most at 1440p.

But the 3080/3090 shine most at 4K+.

I maintain, though, that the biggest advantage of the RTX 3000 and RX 6000 cards is HDMI 2.1 support. This allows people to take full advantage of high-end TV sets, like the LG CX. The LG CX supports 4K, 120hz, GSYNC, 4:4:4, 10-bit color, and HDR... all at the same time. Previous GPUs did not support this... and it's a game changer for large displays.
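A rough bandwidth estimate shows why HDMI 2.1 matters here - the raw pixel data alone for that 4K/120Hz/4:4:4/10-bit signal already exceeds what HDMI 2.0 can carry (the sketch below ignores blanking and encoding overhead, so the real requirement is higher still):

```python
# Back-of-the-envelope bandwidth for 4K, 120Hz, 4:4:4, 10-bit color.
width, height, refresh = 3840, 2160, 120
bits_per_pixel = 3 * 10   # three channels at 10 bits each, no chroma subsampling

raw_gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"Raw pixel data: {raw_gbps:.1f} Gbit/s")   # ~29.9 Gbit/s, before blanking overhead

# HDMI 2.0 tops out at 18 Gbit/s (roughly 14.4 Gbit/s usable), so this signal simply
# doesn't fit; HDMI 2.1 raises the ceiling to 48 Gbit/s.
```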
 
Yeah, I have a feeling the people that were bothered by CRT flicker (me included) are the same people that value high refresh.
Hmm, maybe related, but unlikely to be the main mechanism.

CRT flicker annoys people because they notice the brightness dip during each refresh: the beam excites the phosphors left to right, top to bottom, and parts of the image dim as the phosphors lose excitation.

LCDs don't flicker due to the refresh mechanism. They may flicker due to backlight issues or something else, but the refresh itself should not be the reason.
 
Yeah, I have a feeling the people that were bothered by CRT flicker (me included) are the same people that value high refresh.
I knew a guy in college who couldn't tell the difference between 60Hz and 85Hz. He had his CRT set to 60Hz. I showed him how to fix it and he didn't notice any change. On CRTs I was generally ok with 75Hz but 80+ was better. Past 80 I didn't notice the difference.

As for the original topic, I'd say the higher end cards are worthwhile at 1440p if you want to run with raytracing on. I would exclude the 3090 for being too expensive, but the price is making it easier to get.
 
I've had a chance to reflect on this since all the new cards came out. Having read numerous reviews, I'm left with the perception that 3080 and 6800xt are solid performers at 1440 - but sort of entry level to mid level at 4k.

My reasoning? With the top-tier cards running at 4k, very very few of the 'benchmark titles' the review sites are using get above 100 FPS - with just under half being below or right at 60 FPS. Contrast this with 50% of those titles getting above 144 fps at 1440 and all of them being comfortably above 60 frames. Clearly, these cards are for - and competing at - 1440p. At 4k? They're merely competent.
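One way to make that kind of judgment reproducible is to tally a review's average-FPS table against a few thresholds. The numbers below are placeholders for illustration, not figures from any actual review:

```python
# Tally hypothetical 4K average-FPS results against the thresholds discussed above.
avg_fps_4k = {"Title A": 58, "Title B": 62, "Title C": 74,
              "Title D": 88, "Title E": 95, "Title F": 105}

for threshold in (60, 100, 144):
    hits = sum(fps >= threshold for fps in avg_fps_4k.values())
    print(f">= {threshold} fps: {hits} of {len(avg_fps_4k)} titles")
```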

Which brings me to a dilemma: Seeing as I want to buy one of the new 32" 4k IPS 144hz monitors expected next year... the 'lifespan' or value proposition of a 3080/6800xt doesn't look all that hot. For gamers running 1440 displays, the value proposition is easy: buying an enthusiast level card will provide great value for years to come.

The new displays are gonna be pricey. Given that the 3070 and 6800 offer good (not great) to 'acceptable' entry-level performance at 4k (less than half showing comfortably above 60 FPS, with the same number peaking below) - the current 'second tier' cards are probably a good 'placeholder decision' while we wait and see what RDNA3 and whatever NV throws out in the next iteration bring to 4k gaming. The $200 price difference matters if you're going to be turning down visuals either way. Choosing one of the top tier cards would be much easier if they were crushing it - or just consistently above 100 FPS at 4k across the board... but they're not.

Decisions... Decisions...


*Note: not considering the 6900xt or 3090 - because those cards are way overpriced for what you get.
 
Agree with you on that. I think you'd have to run the high-end cards in SLI/Crossfire (if it were possible) to get enthusiast-grade performance at 4k. I don't consider 60 FPS at the enthusiast resolution (4k) to be enthusiast-grade performance.

DLSS may be the interim answer. If only more real/meaningful games took advantage of it..

Either 4k is just too demanding a resolution for the technology available, or perhaps 4k is such a small share of gamers/displays that it doesn't make financial sense just yet for nVidia and AMD to put more R&D into developing enthusiast cards better suited to that resolution.
 
You know - in my dithering, I did not even consider the RT experience. I think everything is 'entry level' RT at this point, and most people will turn off RT first in favor of performance. I have to admit, the reflections alone look stunning with RT on (from the vids I've watched) - but when you are getting 40 FPS... bleah.
 
Funny...I've been using a 4K G-Sync monitor for over 4 years now, starting with my GTX 1080. At the time I mainly got it for photo editing, but it also held up pretty well over the years, and the GTX 1080 did a pretty decent job holding its own for a while with most games I played. GTA V was one of the few in my library that wouldn't hold 60fps with most of the settings turned all the way up; the game was more often sitting in the 40s. But for that game type it wasn't that impactful, and G-Sync really helped keep the experience smooth. Same goes for RDR2, though I would have to play at lower details/resolution for that game.

Now fast forward to today and I am using an RTX 3090 FE. The problem I run into now is framerate stutter if I don't cap the framerate of my games to 58fps to keep it from hitting the V-sync/refresh-rate limit. I would love to get a monitor with a 120/144Hz refresh rate that is also G-Sync capable, to let my video card flex its muscles and benefit from the fluidity of higher framerates. I also have a Valve Index and I can see the improvement that a higher refresh rate offers; I even notice the difference going from its 90Hz to 120Hz mode. I want to experience that on my monitor. And while I would love to stick with 4K, there are not too many options in the 32-36" size range that are 120-144Hz, HDR capable, and G-Sync. And the ones that are available are pricey - and this is coming from someone who had no problem spending $900 4+ years ago on an Asus ROG Swift PG27AQ. If I had the room, it would be cheaper to get an LG CX 48", but I don't. The 48" needs 42-43" of width and I am limited to about 36" or less. Even the 43" 4K options out there are too big for me at over 38" wide.
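On the stutter point: capping a few fps under the panel's refresh, exactly as described above, is the usual workaround so frames never pile up at the V-sync ceiling. A minimal sketch (the exact margin is personal preference):

```python
# Cap the frame rate slightly under the refresh rate, like the 58 fps cap on a 60Hz panel above.
def frame_cap(refresh_hz, margin=2):
    return refresh_hz - margin

for hz in (60, 120, 144):
    print(f"{hz}Hz panel -> cap at {frame_cap(hz)} fps")
```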

I really don't want to drop down to a 2K 2560x1440 monitor just because of selection and pricing if I can help it. I've been too spoiled by the higher resolution. But I have toyed with the idea of maybe going wide instead, preferably 3840x1600 if possible. For games that support the wider resolutions, it would enhance the experience while still not being quite as taxing as 4K - good for games that push features like ray tracing, while still maintaining a higher framerate than at 4K. That Alienware AW3821DW doesn't look too shabby, especially at the current sale price.
 
1440P / 120Hz here. I did the 3080 because I want to set everything to max and not worry about it. Ever.
 
I have been on 1920x1080 forever because I am getting older lol .. I decided to just go high refresh - 144Hz over HDMI and 165Hz over DP at 1080p with the new FreeSync engine - and get some more life out of my RX 5700 while I wait for this dumb azz pricing to go back to affordable .. I want a Ryzen 5000 CPU more than an RX 6800 video card, because the power these new GPUs use is insane - like running my R9 290X all over again.
 
I played half of AC: Valhalla with a 3080 and the other half with a 3060Ti on the same 1440p/144Hz monitor. After GFE adjusted the settings, I didn't really notice a big difference between the two. Obviously some of the "volumetric fog" type settings were on medium instead of Ultra High, but it didn't feel any different.

I mean in a lot of cases below 4k, you're paying a substantial amount of money to run on Ultra High vs. Medium in a lot of fringe effects and often the difference isn't that great. At 4K, you need more power.
 
I have been on 1920x1080 forever because I am getting older lol .. I decided to just go high refresh - 144Hz over HDMI and 165Hz over DP at 1080p with the new FreeSync engine - and get some more life out of my RX 5700 while I wait for this dumb azz pricing to go back to affordable .. I want a Ryzen 5000 CPU more than an RX 6800 video card, because the power these new GPUs use is insane - like running my R9 290X all over again.
Heh. I'm old too. But I am still one of the people who can see the spaces between pixels at 1080p (I may need glasses now - but I still can). As I had spent a bit of money on a good IPS way back when, I just dealt with it - by sitting further back. But now I want a bigger monitor - and a 27 is only a tad bigger than a 16:10 24.

So I need a 32 - and that wants 4k, which depresses me, because it means I will be upgrading the GPU again in the near future.
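Pixel density is the arithmetic behind that (PPI = diagonal pixel count / diagonal inches) - stretched to 32", 1080p gets noticeably coarser, while 4K at 32" is denser than a 24" 1200p panel:

```python
import math

# Pixels per inch: diagonal pixel count divided by diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [('24" 1920x1200', 1920, 1200, 24),
                         ('32" 1920x1080', 1920, 1080, 32),
                         ('32" 3840x2160', 3840, 2160, 32)]:
    print(f"{name}: ~{ppi(w, h, diag):.0f} PPI")

# Roughly 94, 69 and 138 PPI respectively.
```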
 
It's funny, as my 1080p is a 32" VA panel that Wal-Mart sold for Black Friday @ $155 .. I sit back from it about 3 to 4 feet and run Image Sharpening and Radeon Boost. It has been a joy moving up to 144Hz for the price, as my gaming is mostly limited to World of Tanks or Ships ..
 
I say nah...

Just earlier this year I was still running my 1920x1200 60Hz monitor, and totally using up all the GPU performance I could get from my 2070 Super. I don't have eagle eyes but can easily notice aliasing in many games when rendered at native resolution. And even in some games with TAA the aliasing is still obvious, especially in motion (e.g. Microsoft Flight Simulator 2020). So I either crank up the internal render resolution, or use DSR if a game doesn't have an internal render resolution slider. Brute-force anti-aliasing, but it always looks great.

Now on my 2560x1440 60Hz monitor, and upgraded to a 30-series card. Throwing the newfound GPU performance at internal render resolution/DSR, and more pixels for ray tracing (higher pixel density, less noisy ray tracing too). Again using up most of the GPU performance I can get.

It's nonsense. While you may have to use DSR or ray tracing to fully leverage the cards at 1920x1080, it's possible. Depending on the game, you can leverage those GPUs with 2560x1440 monitors, which are becoming more common. Of course, as higher refresh rates become available at more monitor resolutions, achieving those frame rate targets without running the game in potato mode becomes more difficult.
 
It's nonsense. While you may have to use DSR or ray tracing to fully leverage the cards at 1920x1080, it's possible. Depending on the game, you can leverage those GPUs with 2560x1440 monitors, which are becoming more common. Of course, as higher refresh rates become available at more monitor resolutions, achieving those frame rate targets without running the game in potato mode becomes more difficult.
He's playing flight sims so it's not surprising he does not value high fps/refresh.
 
The most realistic goal for 4K is 60fps, even for the 3090. So lower resolutions at higher frame rates still seem very relevant. My current primary display is a 4K/60 display w/o Freesync or Gsync. Early next year I'm hoping to add a second 4K display, with Freesync/Gsync and 144Hz. Even with a 3080, I'm somewhat tempted to get a 1440p/144 panel instead. Still leaning 4K but haven't made up my mind yet.
 
I don't know man, I have a 240Hz 1440p 32" Samsung G7 along with a 6900 XT and an RTX 3090, and those two GPUs can peg this monitor.

Some games I get 170 fps, others 140, some 60 - like CP2077, which requires a Starfleet supercomputer to get over 100 fps regularly.

But many titles just peg this monitor. So I don't personally think 4k is worth it. I'd stick with 1440 but get a 144Hz minimum display.

Now I can't see what you see, but a 60Hz 1080p monitor shouldn't even come close to pegging a 2070 Super, much less a 3070 or 3080 class card. Unless you're using rays. I never use rays - only at first, to see the game in all its glory, then I turn them off for performance.
 
It's funny, as my 1080p is a 32" VA panel that Wal-Mart sold for Black Friday @ $155 .. I sit back from it about 3 to 4 feet and run Image Sharpening and Radeon Boost. It has been a joy moving up to 144Hz for the price, as my gaming is mostly limited to World of Tanks or Ships ..
How is WOT at 32? Gotta be fun
 
Try maxing out everything in Cyberpunk 2077 with RT and you will find even a 3090 can't stay above 60fps all the time...at 1080p. Everything is about what settings you are satisfied with and what you are willing to give up.
 
Try maxing out everything in Cyberpunk 2077 with RT and you will find even a 3090 can't stay above 60fps all the time...at 1080p. Everything is about what settings you are satisfied with and what you are willing to give up.
Even though it is the popular new kid on the block, seeing as how buggy and unoptimized Cyberpunk 2077 appears to be, maybe we should be waiting a few months for CDPR to try and straighten things out before we start using that as our GPU/monitor litmus test.
 
4K gaming monitors are not in a good spot right now. There is not one VA panel option between 32" and 40"; the 43" is all that I know of. If you want G-Sync as well, the options become even fewer. I am hoping that 2021 brings some options.
 
The most realistic goal for 4K is 60fps, even for the 3090. So lower resolutions at higher frame rates still seem very relevant. My current primary display is a 4K/60 display w/o Freesync or Gsync. Early next year I'm hoping to add a second 4K display, with Freesync/Gsync and 144Hz. Even with a 3080, I'm somewhat tempted to get a 1440p/144 panel instead. Still leaning 4K but haven't made up my mind yet.
I will be perfectly happy with 4K and 60 FPS. The way I see it is that consoles are now running at 4K, so a new high end PC build should aim for that as well.
 