If you were buying a 4K card today

David97

I really want 4K gaming but cannot afford a Ti.

The RTX 2080 can be had for less now that the Super is out. From what I have read, the Super only gains 4-5 fps over the non-Super.

The 2080 can be had for around $150 less than the Super.

Yes, I know even a 2080 can't play all games at 4K.
 
The problem is not that a 4K solution does not exist; I used my five-year-old 390X to drive an AOC FreeSync 4K panel at 60 Hz. The problem is that after the novelty wears off, the lag and micro-stutter become an issue, so I went to an LG VA 2K panel at 144 Hz, and you will not want to go back to 60 Hz.

Whether the FPS is 75 or 144, there is no tearing or lag; it's butter smooth.

I was able to hold 60 FPS at 4K by making some reductions in settings: AO and DOF lowered, motion blur off, and render resolution at 85%.
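For anyone curious about the math behind a render-resolution slider, here's a rough back-of-the-envelope sketch (the 85% and 4K figures are from the post above; the rest is just illustrative, and whether a game applies the scale per axis or to the total pixel count varies by title):

```python
# Rough sketch: how much shading work an 85% render scale saves at 4K.
# Assumes the scale is applied per axis, which is the common convention.

NATIVE_W, NATIVE_H = 3840, 2160            # 4K UHD output resolution
RENDER_SCALE = 0.85                        # 85% render resolution

render_w = round(NATIVE_W * RENDER_SCALE)  # 3264
render_h = round(NATIVE_H * RENDER_SCALE)  # 1836

native_pixels = NATIVE_W * NATIVE_H        # 8,294,400
rendered_pixels = render_w * render_h      # 5,992,704

print(f"Internal render target: {render_w}x{render_h}")
print(f"Pixels shaded vs native 4K: {rendered_pixels / native_pixels:.0%}")
# ~72% of the pixels, so roughly a quarter less per-frame shading work,
# which is often the difference between dipping below 60 FPS and holding it.
```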
 
If you're going to go the console route, at least get the PS4 instead. It actually has games worth playing that can't be found elsewhere.

Actually, I already have a PS4 Pro, a One X, and a Nintendo Switch. Games from the Switch look bad on a TV; you should see Wolfenstein on it! It's bad! The PS4 Pro fan is loud, so I go for the Xbox more often. I paid under $370 for the One X brand new months ago. I never buy games or consoles when they first come out; I only paid $17 for Horizon Zero Dawn and $19 for God of War 4, both games I completed more than once.

BUT I have an overclocked, water-cooled 6700K PC here with no GPU and a bunch of Steam games.
 
Save money for a few more months and buy a 2080 Ti.
The problem is not that a 4K solution does not exist; I used my five-year-old 390X to drive an AOC FreeSync 4K panel at 60 Hz. The problem is that after the novelty wears off, the lag and micro-stutter become an issue, so I went to an LG VA 2K panel at 144 Hz, and you will not want to go back to 60 Hz.

Whether the FPS is 75 or 144, there is no tearing or lag; it's butter smooth.

I was able to hold 60 FPS at 4K by making some reductions in settings: AO and DOF lowered, motion blur off, and render resolution at 85%.
A 390X is 40% of a 1080 Ti. And 1920x1080 in 2019? :vomit:
 
Actually, I already have a PS4 Pro, a One X, and a Nintendo Switch. Games from the Switch look bad on a TV; you should see Wolfenstein on it! It's bad! The PS4 Pro fan is loud, so I go for the Xbox more often. I paid under $370 for the One X brand new months ago. I never buy games or consoles when they first come out; I only paid $17 for Horizon Zero Dawn and $19 for God of War 4, both games I completed more than once.

BUT I have an overclocked, water-cooled 6700K PC here with no GPU and a bunch of Steam games.
I also have a PS4 Pro and can't hear the fan over whatever I'm playing or watching. Switch games look fine to me, but YMMV. I wouldn't play a game like Wolfenstein on the Switch anyway. There's nothing on the Xbox that you can't get on PC, and both games you mentioned are on PS4.
 
4K is nice, I mean it looks great, but you will spend a lot of money getting there and performance continues to be an issue.

I went from 1440p (actually triple-screen) to a 4K TV, then back down to 1080p for an ultrawide. I've been running old games at virtual 5K resolution on the screen and they look great.

Even native 1080p (2560x1080) isn't so bad, and it makes it viable to play newer games with ray tracing and get above 60 fps. There's also some flexibility with DSR on older titles.

I would say if you want to try 4K it can be nice, but the resolution is not everything. I think a 2070 Super could do 4K with reduced settings.
 
The problem is, 4K is very demanding and an RTX 2080 Ti can't even maintain 60FPS with maximum settings in some games all the time. It's barely adequate. Lesser cards will provide a very sub-optimal experience. Really, the best thing you can do is either abandon the 4K idea entirely, or save up and buy the RTX 2080 Ti, or if you wait long enough, its replacement.

4K is nice, I mean it looks great, but you will spend a lot of money getting there and performance continues to be an issue.

I went from 1440p (actually triple-screen) to a 4K TV, then back down to 1080p for an ultrawide. I've been running old games at virtual 5K resolution on the screen and they look great.

Even native 1080p (2560x1080) isn't so bad, and it makes it viable to play newer games with ray tracing and get above 60 fps. There's also some flexibility with DSR on older titles.

I would say if you want to try 4K it can be nice, but the resolution is not everything. I think a 2070 Super could do 4K with reduced settings.

I agree with you for the most part, but I was never one to reduce settings in games if there is a technical solution that allows me to avoid it. Cost be damned. I had 3x 30" 2560x1600 monitors for gaming, then 3x 27" 2560x1440 monitors, then 4K @ 60Hz, and now a 1440p ultrawide. When you play the 4K game, you end up needing to buy not only the fastest GPUs available, but the fastest gaming processors. Minimum frame rates and frametimes definitely change with different CPUs. High-resolution gaming has always been a costly endeavor.

I've been doing it for years. When SLI support was reasonably good, I bought cards in pairs: two cards in SLI (or more) from the 6800 GT all the way up to the GTX 1080 Tis. I have skipped very few card generations. I skipped Pascal, and that's about all I can think of. The RTX 2080 Ti is the very first NVIDIA card I've run by itself since the AGP slot days.

Processors? I haven't been nearly as diligent about changing those, but I have gone through a number of them over the years, typically laying out the cash for an Extreme Edition of some sort. I also ran a dual QX9775 system and most generations of Intel's HEDT processors. That's a grand every tock of Intel's tick/tock strategy, and another one during the tick if there was enough of an improvement to warrant it in my mind.

I have spent $7,000-$10,000 a year on high-resolution gaming for the last 15 years or so. Not all of my enormous computing budget was spent on video cards, monitors, and CPUs; it's also gone towards RAM, SSDs, cooling hardware, and peripherals.

And I'm not kidding on the CPU front. Specifically, look at Destiny 2 in my MSI MEG X570 GODLIKE review. I compare minimum, average, and maximum frame rates across different CPUs. It may surprise you to see some very powerful CPUs providing pathetic minimum frame rates in this game.

In fact, here's how I arrived where I am today. I had been using a Threadripper 2920X since roughly April of this year. That was somewhat of a side-grade, gaming-wise, from my Intel Core i9 5960X @ 4.5GHz. Nevertheless, I figured it would be fine, considering conventional wisdom suggests your GPU is far more important than your CPU at 4K. That isn't entirely wrong, but the CPU has far more of an impact than is generally known. The reason is that most reviewers do not report anything more than average frame rates, and even providing minimums and maximums doesn't tell the whole truth.

In Destiny 2, I saw occasional slowdowns in intense combat in certain areas. Interestingly, my Intel setup with dual GTX 1080 Tis didn't have this problem. As it turns out, Destiny 2 is one of the few titles that likes SLI, so going to an RTX 2080 Ti and Threadripper was actually a downgrade in that game. I was at a friend's house working on his ancient machine and fired up Destiny 2; I didn't think his hardware was up to the challenge, as it was a 3770K and dual 980 Tis. In every way it was vastly inferior to my machine, but it held 60FPS in Destiny 2 better than mine did. Another friend uses a 2700X and a single reference RTX 2080 Ti. Again, it shouldn't be faster than my setup, at least not on the video card side, though his CPU should be slightly better than my Threadripper due to Threadripper's configuration.

I was wrong. These guys had worse systems than mine in just about every way you can imagine. I have faster SSDs, faster RAM, more RAM, and a factory-overclocked RTX 2080 Ti that I've overclocked further, and yet their gaming experience was better than mine. This led me down a rabbit hole of testing, which brought me to the data point above. I've investigated this in some other titles as well, and the reality is that at 4K, getting 60FPS all the time in modern games with the best hardware available is simply not possible.

That's right. Even on an i9 9900K clocked at 5.0GHz with an RTX 2080 Ti, it won't happen in every game. It does happen a lot, but not all the time. At 4K without variable refresh, you have to contend with V-Sync and all kinds of things. Variable refresh helps, but even then you still won't get 60FPS all the time; you just won't notice the impact as much.

At 4K, you end up tweaking every last thing in your system to achieve optimal results. Frankly, it renewed my interest in tuning, but it drove me to a costly revamp of my entire system, which I'm actually still in the middle of. My quest for smoothness in games like Destiny 2, Ghost Recon Wildlands, and the upcoming Ghost Recon Breakpoint (which I participated in the technical test for) led me to ditch 4K for a G-Sync monitor. Even though G-Sync, 120Hz, and 3440x1440 improved things, my system still wasn't as smooth as I would have liked.

So I'm off Threadripper and onto an Intel Core i9 9900KF that I'm tuning. What did this entail? It led me to buy a processor, switch out the memory for faster modules, and change motherboards. Getting back to the processor: the AIO wasn't enough, so I went back to custom water cooling. That's basically $500 or more right there on cooling alone. I completely gutted and rebuilt my machine, which meant moving the power supply down, getting extension cables, re-wiring, plumbing the system, bleeding the system, leak testing, and all of that. I've thoroughly enjoyed the ride, but again, it was costly, and because I had some parts on hand, it was cheap for me compared to what it would be for most people.

Oh, and then there is the cost of the 34" Alienware AW3418DW G-Sync 3440x1440 monitor I bought. I got it open-box at Microcenter at a discount, but it was still $670. Without motherboard, processor, and RAM, I'm already at $1,100. This was all part of my quest for smoothness. Keep in mind, the minimum FPS I saw on my TR 2920X was 26FPS (4.2GHz all-core) or 36FPS using PBO. Comparatively, the Core i9 9900K achieves a minimum frame rate of 56FPS at 5.0GHz and 54FPS at stock speeds. That is at 4K; I don't recall what the 3440x1440 numbers were, but the minimums were actually similar.

BTW, I'm not even done. I will be buying a water block for my video card and another radiator for my system. That's another $250 right there. That will of course mean additional fittings and tubing which are incidental, but it all adds up.

The moral of the story is: do not take 4K gaming lightly. If you can't afford to do it right, then you're better served by 2560x1440. Even 3440x1440 is considerably less demanding. That's still an expensive resolution to game at, but it's doable on less with good results.

Hope this helps.
 
Yeah, I guess what I am saying is that 4K looks nice but maybe not worth the cost.

I think it's more accurate to say that while it looks nice, it may not be worth the performance hit you take to do it. Even if you go all out as I've done, it still doesn't perform all that well. If you don't go all out, the performance hit is an even larger issue.
 
Eh, some of the settings that have the largest impact on performance sometimes have the smallest impact on image quality. I've only run into a couple games so far where I had to turn any settings down to get a consistent framerate on my system. But as you can see in my sig, I did figuratively pay out the ass for my PC.
 
Yeah, I guess what I am saying is that 4K looks nice but maybe not worth the cost.

You have to define "nice"; personally, I think slideshows look like ass ;).

If a game requires responsive input (I'm still playing BF4, for example), I'm not opposed to cratering every setting available until I get the performance I'm looking for.
 

I am a huge fan of 3440x1440.

That review was really interesting. What are your min frames? Is it the minimum over x amount of time? 0.1%, 1%, etc.? Just curious, because I am still debating between Intel and AMD.
 
I am a huge fan of 3440x1440.

That review was really interesting. What are your min frames? Is it the minimum over x amount of time? 0.1%, 1%, etc.? Just curious, because I am still debating between Intel and AMD.

All my testing was done at 4K. The 3440x1440 monitor is fairly new, and I haven't done much testing with it. I've played some games here and there, but mostly I'm trying to get through my backlog of hardware reviews. 3440x1440 is easier than 4K, but the same advice would apply. Tons of testing has been done on AMD vs. Intel, and generally speaking, Intel has an aggregate lead of about 6% over AMD in the relevant titles. That said, the lead is larger in some titles, with Destiny 2 being an especially egregious example. I haven't had a chance to retest since the AGESA 1.0.0.3 ABB fix was issued, which presumably resolves the issue in a better fashion than the chipset driver workaround.
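Since the 0.1%/1% question came up: reviewers differ in exactly how they compute these lows, but a common approach is to average the slowest slice of frames from a frametime capture rather than reporting a single worst frame. A minimal sketch of that idea (the frametime numbers below are made up purely for illustration):

```python
# Minimal sketch of "1% low" / "0.1% low" FPS from a frametime log.
# Methodology varies between reviewers; this version sorts the frametimes,
# takes the slowest 1% (or 0.1%) of frames, and reports the average FPS of
# just that slice. All the data below is synthetic.
import random

def percentile_low_fps(frametimes_ms, fraction):
    """Average FPS of the slowest `fraction` of frames (0.01 = 1% low)."""
    slowest = sorted(frametimes_ms, reverse=True)   # worst frames first
    n = max(1, int(len(slowest) * fraction))
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms

random.seed(0)
# A fake 10,000-frame run hovering around 16.7 ms (60 FPS) with some spikes.
frametimes = [random.gauss(16.7, 1.5) for _ in range(10_000)]
frametimes += [random.uniform(28, 40) for _ in range(50)]    # stutter spikes

print(f"Average FPS:  {1000.0 / (sum(frametimes) / len(frametimes)):.1f}")
print(f"1% low FPS:   {percentile_low_fps(frametimes, 0.01):.1f}")
print(f"0.1% low FPS: {percentile_low_fps(frametimes, 0.001):.1f}")
```

This is also why an average can look fine while the experience still stutters: the low percentiles are where the CPU differences in that review show up first.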
 
David, what are the specs of the system you're choosing a GPU for?
 
I think it's more accurate to say that while it looks nice, it may not be worth the performance hit you take to do it. Even if you go all out as I've done, it still doesn't perform all that well. If you don't go all out, the performance hit is an even larger issue.
I meant cost both in the financial sense and performance sense. Also the reduction in quality of settings to hit 60fps. It is all a type of cost.

However, I should say that 4K is certainly possible with current tech. I think if you have G-Sync/FreeSync that can help. I did 4K with a Radeon VII and it was working well. I think a 2070 Super should be in that same range, so it should be possible. You just have to be willing to tweak settings to get the needed performance.
 
I meant cost both in the financial sense and performance sense. Also the reduction in quality of settings to hit 60fps. It is all a type of cost.

However, I should say that 4K is certainly possible with current tech. I think if you have G-Sync/FreeSync that can help. I did 4K with a Radeon VII and it was working well. I think a 2070 Super should be in that same range, so it should be possible. You just have to be willing to tweak settings to get the needed performance.

It's a matter of personal preference, but I for one do not like turning in-game quality settings and visual effects down to achieve the desired frame rates. You can get by with a card like the RTX 2070 Super if you are fine with doing that.
 
You can get a reference 2070 Super for $499. I would go with that. A used 1080 Ti is an option, but I would go new, get the 2070 Super, and overclock it.
 
I love my 4K monitors for general use.

I don't love them for gaming, though. My biggest issue is that too many older games have problems with UI scaling. And even when you don't have UI issues, the screen isn't "OMG, so much better" than 1080p. It's better, don't get me wrong, but given what it takes to get there, it is an entirely underwhelming experience for gaming.

I'm driving two 4K panels with a GTX 980. For what I play and the detail/frame rate I am comfortable with, performance isn't an issue. I'm also willing to dial the settings off "MAX MAX ULTRA" if needed, but generally can't tell a huge difference.

Going from 1080p to 4K on my computer: meh. Text and pictures look great; games are just meh.
Going from a VA LCD to an HDR OLED on my TV: OMG, what else have I been missing all my life?
 
Many ARPGs and MMOs look fantastic on a 32" 4K screen and don't necessarily need anything over a 2070 to run really well.

Obviously, if all you play are the latest AAA titles, then you either up the GPU or get a second monitor more suitable for your card.
I have a 1080p monitor next to my 4K one; it's only for FPS games like Quake II RTX. I also use it for Discord when playing games on the main 4K screen.
PNY RTX 2070 OC.
 
As for 4K gaming and settings, I have no qualms about reducing a setting when it has zero or indistinguishable IQ differences. In fact, some settings that hurt performance I consider IQ-degrading, such as motion blur and film grain. Some games' DOF settings look more like nearsightedness, and lower values can look better. I'm immune to the feel-good placebo effect of seeing a slider at its highest position; it had better do something worthwhile, and if not, it can be totally off as far as I'm concerned.

Some of what I look for when 4K gaming is smooth gameplay, which for me means keeping frame rates within the FreeSync range. If it's not smooth enough, it does not matter what the IQ looks like for most of the games I play; it had better be smooth. Rendering at 3200x1800 and upscaling worked great with the Radeon Nano before the 1080 Tis. Plus, older games never really had much of an issue.

Lately I'd rather play on the 144Hz monitor with HDR. HDR to me is superior to resolution, and the wider FreeSync range makes any gaming experience very enjoyable.

Now, what I would call an ideal monitor would be HDR1000+, 5K (5120x2160), 100Hz+, 34"-38". Unfortunately, they are not made yet.
 
The screen size matters too, and some people want 4K just to say they have 4K. It makes very little sense to get a 27-inch 4K screen like many people are doing. There is nearly zero visual difference in games between a native 1440p 27-inch and a native 4K 27-inch, yet performance can be cut nearly in half at times. A 1440p native 27-inch with graphics turned up will look better than a native 4K 27-inch with reduced settings, and in some games at 4K you still won't hold 60 fps even after reducing some settings. IMO, if you are buying a monitor smaller than 32 inches, just go 1440p. You can really crank the settings in games and use DSR when you have plenty of headroom, not to mention not having to worry about Windows scaling, as it's not needed at 1440p.
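The pixel-density side of this 27-inch debate is easy to put numbers on, for whatever that's worth; here's a quick sketch using the standard PPI formula (the 27-inch diagonal is assumed from the posts above, and whether the density difference is visible at desk distance is exactly what's being argued):

```python
# Pixel density and GPU workload for the 27" 1440p vs 27" 4K comparison.
# PPI = sqrt(width^2 + height^2) / diagonal_inches  (standard formula).
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

panels = [("1440p", 2560, 1440), ("4K UHD", 3840, 2160)]
for name, w, h in panels:
    print(f'27" {name}: {ppi(w, h, 27):.0f} PPI, {w * h:,} pixels')

# 27" 1440p  -> ~109 PPI,  3,686,400 pixels
# 27" 4K UHD -> ~163 PPI,  8,294,400 pixels (2.25x the rendering load)
```

Whether that roughly 50% jump in density is visible at normal viewing distance is the subjective part; the 2.25x rendering load is not.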
 
I'm not sure what you guys' FPS expectations are, but my VII (and 1080 Ti before it) ran pretty much all games at highest IQ and 40 FPS or higher. Heck, even my laptop with a 2070 runs a good many games at acceptable levels hooked up to my monitor.

As for 4K vs. 1440p on a 27" display, the difference is entirely noticeable to my eyes. The difference was even noticeable when I was running a 24" 4K and a 1440p display side by side. Having 4K on a 27" monitor is by no means "just to say" you have 4K.
 
I'm not sure what you guys' FPS expectations are, but my VII (and 1080 Ti before it) ran pretty much all games at highest IQ and 40 FPS or higher. Heck, even my laptop with a 2070 runs a good many games at acceptable levels hooked up to my monitor.

As for 4K vs. 1440p on a 27" display, the difference is entirely noticeable to my eyes. The difference was even noticeable when I was running a 24" 4K and a 1440p display side by side. Having 4K on a 27" monitor is by no means "just to say" you have 4K.

I play shooters primarily. If I can't get at least 60FPS all the time in the latest shooters at maximum settings, then the card isn't good enough for 4K. An overclocked RTX 2080 Ti is barely adequate for this purpose in many games.
 
So, the OP never said whether they were buying a new monitor.

If so, I would probably agree that 4K is not the best choice, especially on a budget.

1440p is probably the best overall for cost and performance; definitely try to find G-Sync/FreeSync and high refresh if possible. It makes a much bigger difference than 4K.
 
I play shooters primarily. If I can't get at least 60FPS all the time in the latest shooters at maximum settings, then the card isn't good enough for 4K. An overclocked RTX 2080 Ti is barely adequate for this purpose in many games.

As long as the fps stays within the FreeSync range, I'm fine. They say the bottom of the range is 52 for my monitor, but I think it actually goes down to around 45.

That said, I enjoyed 4K on a 290X five years ago.
 
As long as the fps stays within the FreeSync range, I'm fine. They say the bottom of the range is 52 for my monitor, but I think it actually goes down to around 45.

That said, I enjoyed 4K on a 290X five years ago.

We are going to disagree on that. Sub-60FPS is simply not ok. FreeSync and G-Sync mask it to an extent, but no thanks.
 
I'm not sure what you guys' FPS expectations are, but my VII (and 1080 Ti before it) ran pretty much all games at highest IQ and 40 FPS or higher. Heck, even my laptop with a 2070 runs a good many games at acceptable levels hooked up to my monitor.

As for 4K vs. 1440p on a 27" display, the difference is entirely noticeable to my eyes. The difference was even noticeable when I was running a 24" 4K and a 1440p display side by side. Having 4K on a 27" monitor is by no means "just to say" you have 4K.
You have some magical vision if you can spot the difference in a game between a native 1440p 27-inch and a native 4K 27-inch. If you look hard enough in some games, you might see a very slight aliasing difference at best.
 
We are going to disagree on that. Sub-60FPS is simply not ok. FreeSync and G-Sync mask it to an extent, but no thanks.
The purpose of FreeSync and G-Sync is not to mask "fps"; it's to eliminate tearing without adding stutter.

Well, if you don't want to play at sub-60 fps, that's OK. But as I've said, I'm fine with it. I probably wouldn't be able to tell the difference between 50 and 60 fps in a blind test anyway. Are you confident that you could?
 
The purpose of FreeSync and G-Sync is not to mask "fps"; it's to eliminate tearing without adding stutter.

Well, if you don't want to play at sub-60 fps, that's OK. But as I've said, I'm fine with it. I probably wouldn't be able to tell the difference between 50 and 60 fps in a blind test anyway. Are you confident that you could?

I understand that. But it has a side effect of making the game feel smoother than it is when the FPS drops suddenly.

And yes, I'm confident that I could spot the difference between 60 and 50FPS.
 
I'm not sure what your guy's FPS expectations are, but my VII (and 1080 ti before it) ran pretty much all games at highest IQ and 40 FPS or higher. Heck, even my laptop with a 2070 runs a good many games at acceptable levels hooked up to my monitor.

As for 4k vs 1440p on a 27" display, the difference is entirely noticeable to my eyes. The difference was even noticable when I was running a 24" 4k and 1440p display side by side. Having 4k on a 27" monitor is by no means "just to say" you have 4k.
Agreed. Also, most ultra settings usually can't be distinguished without zoomed-in screenshots, let alone in motion. The only thing I'm a stickler for is textures, which I never drop.

I've had 4K monitors from 24" through 32", and it is a drastic, noticeable quality upgrade compared to the 2560 resolutions I used for many years.
 
Agreed. Also, most ultra settings usually can't be distinguished without zoomed-in screenshots, let alone in motion. The only thing I'm a stickler for is textures, which I never drop.

I've had 4K monitors from 24" through 32", and it is a drastic, noticeable quality upgrade compared to the 2560 resolutions I used for many years.
Sorry, but I do not believe you can see a "drastic, noticeable quality upgrade" in a game between, say, a native 1440p 27-inch and a native 4K 27-inch. I think it is quite ironic that you say ultra settings "can't be distinguished without zoomed-in screenshots," yet claim you can easily see the difference between a native 1440p and a native 4K small screen in a game. I have excellent vision and am quite picky, and I still have to stare closely to see any difference at all, even with the monitors side by side. It will of course vary by game.
 
I understand that. But it has a side effect of making the game feel smoother than it is when the FPS drops suddenly.

And yes, I'm confident that I could spot the difference between 60 and 50FPS.

Dan, completely random, but, do you have a link to pictures of your build? I'm curious and would like to see your setup, =)
 
Sorry, but I do not believe you can see a "drastic, noticeable quality upgrade" in a game between, say, a native 1440p 27-inch and a native 4K 27-inch. I think it is quite ironic that you say ultra settings "can't be distinguished without zoomed-in screenshots," yet claim you can easily see the difference between a native 1440p and a native 4K small screen in a game. I have excellent vision and am quite picky, and I still have to stare closely to see any difference at all, even with the monitors side by side. It will of course vary by game.
What you believe is irrelevant. I've tried 4K at 27" and it was noticeably better than 1440p, but in the end I chose 2560x1440 because in games that didn't support CrossFire the FPS was simply too low, even for me. And some say I played F1GP2 at 15 fps.
 