Will lack of high refresh rate 4k monitors hurt Nvidia RTX Turing sales?

The main reason I upgrade my GPU is that I've usually just purchased a new monitor, one where the refresh rate and resolution are significantly higher than the monitor before it, and I want to hit that new monitor's high refresh rate when gaming. In the past, when I made a monitor upgrade like that, my GPU could no longer take full advantage of the new monitor's features, which forced me to upgrade the GPU as well; that has been my upgrade cycle for the last 10 years or so. Now, it's believed that the Nvidia 2080/2080 Ti will be a healthy jump in performance over the Pascal line of GPUs, and that's great. However, I'm having trouble seeing why the average PC gamer would be clawing at the chance to buy the upcoming Turing line of GPUs if their current GPU already pushes their monitor to its limits. At this point in time, a Pascal 1080 Ti, or even a 1080, seems like it could push 95% of the monitors on the market to their limit. We are not swimming in monitors that laugh at our puny GPUs, like so many times in the past.
(I fully understand there are data scientists, computer artists, etc. out there that are excited for Turing, but right now I'm focusing on FPS hunters.)


I personally cannot think of a single monitor that I would like to purchase now, or in the near future, and that significantly reduces my reasons to replace my 1080 Ti. Don't get me wrong, I would love to have one of those $2000 120 Hz 4K 27" screens, but I'm not buying a 27" 4K monitor, especially at two grand. So, as of right now, a 2080 would be useless to me, because my 1080 Ti already pushes my 3440x1440 screen to the 100 fps it can handle, at max settings.

When I can get a 34" 4K G-Sync 144 Hz monitor for less than $600, I will have a new Turing GPU under my other arm, but we could be waiting years for that screen to come out, as there is little evidence a screen with those specs will be released even next year.


Anyway, I'm just killing time before 5 PM, when I punch out for the weekend. I was just wondering whether I'm alone in thinking this.

Have a good weekend bois.
 
The real thing hurting 4k144hz is DisplayPort not being able to do 4k144hz with HDR. I hope to see some 4k144hz displays without HDR to fill in the price gap.

I (and many others) will probably wait for the next revision of DP/HDMI.

Such a bummer. I am really excited about Ultrawide 4k with a high refresh rate. That is a long ways off.

Can you see yourself upgrading to a 3440x1440@200hz display? That could be pretty sweet.
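For the curious, the back-of-envelope math on why uncompressed 4K 144 Hz HDR doesn't fit DP 1.4 (active pixels only; blanking overhead only makes it worse):

```python
# Back-of-envelope link-bandwidth check (active pixels only, no blanking).
def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_PAYLOAD_GBPS = 25.92  # DP 1.4: HBR3 x4 lanes after 8b/10b coding

# 4K 144 Hz, 10-bit HDR, full 4:4:4 chroma = 30 bits/pixel
needed = required_gbps(3840, 2160, 144, 30)
print(f"4K144 10-bit 4:4:4 needs ~{needed:.1f} Gbit/s vs {DP14_PAYLOAD_GBPS} available")
# Well over the limit, so something has to give: lower refresh,
# 8-bit color, chroma subsampling, or (in later hardware) DSC.
```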
 
I'd personally take a good curved 21:9 high-Hz display over 4K any day.

But yeah... generally the last gen's top card doesn't give you a good reason to upgrade for a while anyway. I definitely feel like we're getting into diminishing returns, like CPUs have been.

Eventually there will be a flood of mining GPUs that should hurt sales as well.
 
Yeah, definitely. My point was that reaching 60fps in 4K today is a struggle to begin with for many games.

We should learn to walk before we run, so when we can reliably get 60fps in high-end games at 4K, then we can think about high refresh.

I bring up Deus Ex because I had a LOT of trouble getting it playable, even with a Titan X Pascal. Eventually I dropped to a "virtual" 21:9 resolution and at around High settings I got it to 60fps.

But, to stay on topic, I would be interested in high refresh 4K, when it's available. Currently mostly gaming at 60Hz on a 4K TV but I don't mind playing older games where high refresh is possible.
 
You don't need to max out a game for it to still look good at 4K. Games with reduced settings at 4K look FAR better to me than maxed-out games at 1080p.

False; a lot of factors have to be taken into consideration first. It depends more on screen size at a given resolution: medium settings at 4K@60Hz on a 48" panel vs ultra settings at 1080p@60Hz on a 24" panel will look better on the 1080p panel. DPI plays a huge role in a game's perceived resolution quality. This is assuming the same type and kind of panel is used, not things like jumping from TN to IPS and so on.
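The DPI point checks out with simple geometry; a quick sketch (assuming flat 16:9 panels):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f'48" 4K:    {ppi(3840, 2160, 48):.0f} PPI')  # ~92, same density as 24" 1080p
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
```

A 48" 4K panel has exactly the same pixel density as a 24" 1080p one, so at the same viewing distance neither resolves finer detail; screen size and distance matter as much as the resolution number.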
 
I don't think so. If the GTX 2080 is only a little faster than a 1080 Ti, then there's no reason the lack of high-end monitors will hurt its sales. The 2080 Ti will be a different story, though.
 
False; a lot of factors have to be taken into consideration first. It depends more on screen size at a given resolution: medium settings at 4K@60Hz on a 48" panel vs ultra settings at 1080p@60Hz on a 24" panel will look better on the 1080p panel. DPI plays a huge role in a game's perceived resolution quality. This is assuming the same type and kind of panel is used, not things like jumping from TN to IPS and so on.
Speaking from personal experience: I bought a 4K Samsung TV for my bedroom and ended up swapping my 27" 1440p IPS for it on my PC. Size is an important factor in this, though not in the same context you argue (which I do not necessarily disagree with). Size is a MAJOR element in immersive gaming for me. I also have a 24" IPS Dell 1080p in a spare PC, so I have extensive experience with all of these resolutions and game settings.

"Medium" settings on a 4K display is a poor way to go about it. You have to judiciously choose which settings matter for IQ, and it can be a mix of very high, low, and medium. I always end up blown away by any game on my 4K unit vs the 1440p and 1080p maxed out. Things that would be virtually invisible on a 'tiny' 24" display stand out on a 40" 4K screen (ideal DPI @ 2' away). Everyone who has seen my gaming rig is similarly blown away vs my smaller displays. A 24", no matter how good or maxed out the game, is soooo underwhelming in comparison.
 
You don't need to max out a game for it to still look good at 4K. Games with reduced settings at 4K look FAR better to me than maxed-out games at 1080p.
There is nothing magical that happens at 4K to make a game look better. The reduced aliasing is certainly nice, but other than that it's the same damn game, so if you have lower settings it still does not look as good as having higher settings. Of course it will look better than 1080p if it's on a native 4K monitor, but that's just because 1080p is going to look kind of blurry and fuzzy compared to the native resolution. I would much rather have a 27-inch 1440p monitor and be able to max out settings than have a 27-inch 4K monitor and have to run greatly reduced settings just to get the same frame rate.
 
There is nothing magical that happens at 4K to make a game look better. The reduced aliasing is certainly nice, but other than that it's the same damn game, so if you have lower settings it still does not look as good as having higher settings. Of course it will look better than 1080p if it's on a native 4K monitor, but that's just because 1080p is going to look kind of blurry and fuzzy compared to the native resolution. I would much rather have a 27-inch 1440p monitor and be able to max out settings than have a 27-inch 4K monitor and have to run greatly reduced settings just to get the same frame rate.
You completely missed the point. It's not the resolution per se that makes it look better; it's the resolution plus SIZE that is the critical factor making it so much better and more immersive. Tiny details that you cannot see or make out on a 24" 1080p screen suddenly spring to life on a 40" 4K display. Test this 4K image at its full size, then reduce its size by half (to very roughly represent a smaller screen). The window frames in the right-foreground building disappear, and the whole image suddenly seems more distant and lifeless. Vibrant colors of small objects on small screens are not as impressive as they would be on larger, higher-res screens.

Another good example is watching a good movie on a 72" screen, then going into another room and seeing it on a 32" screen. The level of immersion is suddenly diminished, if not gone. That, IMO, is the main point of 4K from a PC perspective: being able to increase the size of your screen (within reason) and thus significantly increase your level of immersion in it.

I understand that it is not for everyone, whether from a practical standpoint (not many are willing to live with 40" screens on or above their desk) or where strict, competitive gaming at 144 Hz or above is the main priority. But those who can live with reduced FPS in slower-paced games are in for a treat. Again, I have had many games maxed out on lower-res, smaller screens, and they pale in comparison to how they appear on a 40" 4K screen even with lowered settings. I still own my 1440p and 1080p IPS monitors but am not willing to go back to them.
 
The real thing hurting 4k144hz is DisplayPort not being able to do 4k144hz with HDR. I hope to see some 4k144hz displays without HDR to fill in the price gap.

Can you see yourself upgrading to a 3440x1440@200hz display? That could be pretty sweet.

Yeah, that's what I was thinking too. It seems like monitor tech has failed to really improve over the last few years, and we are waiting for the next 'must have' monitor spec. The problem now is that the next must-have monitor is not even in sight.

And yes, I would love a 200 Hz 3440x1440 ultrawide. We will see what the future brings.
 
Yeah, that's what I was thinking too. It seems like monitor tech has failed to really improve over the last few years, and we are waiting for the next 'must have' monitor spec. The problem now is that the next must-have monitor is not even in sight.

And yes, I would love a 200 Hz 3440x1440 ultrawide. We will see what the future brings.

This is coming early 2019, apparently. But it will be VA. And expensive (G-Sync + HDR).
 
You won't need a 60Hz+ monitor unless you buy the RTX Titan. The RTX 2080 won't be much faster than a 1080 Ti, and you won't see the 2080 Ti until the start of next year.
 
The first Korean 120 Hz 42" IPS monitors are available (~$1k), which means the panels exist. As someone in the thread pointed out, the DisplayPort issue is not helping; however, DP 1.4 supports 4K 120 Hz + HDR, as does HDMI 2.1. Hopefully both are present on the new cards.

As for not being able to drive that with a new RTX processor: several newer games have options for dynamic resolution, display scaling, checkerboarding, and other techniques which, while they might at first glance seem gimmicky, do work quite well. The deeper analyses by Eurogamer / Digital Foundry of some of these techniques in newer engines and game releases have been very interesting to watch. I would welcome those techniques if they let me keep my frame rates hovering around 90+, which seems to be the minimum at which you get most of the benefits of higher frame rates. Of course, true 120Hz+ 4K with no compromises would be best, and we are at least another generation or two away from that. I remember when high frame rate 1440p seemed forever over the horizon, and now we are there, so 4K HFR will come too.

I hope to see more new technologies in the new cards, more than just Pascal +25% performance bump.
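The dynamic-resolution idea boils down to a feedback loop on the render scale. A toy sketch of the concept (not any engine's actual implementation; all numbers and names here are made up for illustration):

```python
# Toy dynamic-resolution controller: nudge the render scale each frame to
# hold a target frame time. Real engines (and the checkerboard rendering
# mentioned above) are far more sophisticated.
TARGET_MS = 1000 / 90  # aiming for 90 fps, ~11.1 ms per frame

def adjust_scale(scale, last_frame_ms, step=0.05, lo=0.5, hi=1.0):
    if last_frame_ms > TARGET_MS:           # missed budget: render fewer pixels
        scale -= step
    elif last_frame_ms < TARGET_MS * 0.9:   # comfortable headroom: sharpen up
        scale += step
    return max(lo, min(hi, scale))

# A 14 ms frame at full scale drops the next frame to 0.95x resolution.
print(adjust_scale(1.0, 14.0))
```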
 
There are a lot of people throwing around performance numbers for a card that hasn't even been announced, much less benchmarked yet.
 
Even if the rumors are bunk, we can probably expect at least a 30% gain over the current line-up. At least I would hope so.

50% would be amazing, and may actually get me to revive my 3D Vision Surround setup.
 
There are a lot of people throwing around performance numbers for a card that hasn't even been announced, much less benchmarked yet.

Pretty much par for the course. Nvidia rarely says much before launch, though.

It's just that a lot of these comments seem pretty confident for being based on nothing... not even prior history.
 
Seeing how the Titan RTX is just going to be a $3000 rebranded Titan V CEO Edition, I don't think the performance we are looking at is so earth-shattering that we need to consider high refresh rate 4K monitors just yet. A single Titan RTX isn't going to net you more than 80-85 fps in most AAA titles at 4K.

The 7nm Titan in 2019, however, will certainly be interesting for high refresh rate 4K gaming. But by the time that comes around, I'm sure there will be another price bump ($4000 Titan?). At some point you really have to ask whether it's the kind of investment you want to make just to achieve 120+ Hz 4K gaming.
 
The real thing hurting 4k144hz is DisplayPort not being able to do 4k144hz with HDR. I hope to see some 4k144hz displays without HDR to fill in the price gap.

I (and many others) will probably wait for the next revision of DP/HDMI.

Such a bummer. I am really excited about Ultrawide 4k with a high refresh rate. That is a long ways off.

Can you see yourself upgrading to a 3440x1440@200hz display? That could be pretty sweet.

HDMI 2.1 has enough bandwidth for 8K HDR while putting Variable Refresh Rate (VRR) into the mainline specification. I'm hoping 2.1 makes fixed refresh rates a thing of the past.
 
Seeing how the Titan RTX is just going to be a $3000 rebranded Titan V CEO Edition, I don't think the performance we are looking at is so earth-shattering that we need to consider high refresh rate 4K monitors just yet. A single Titan RTX isn't going to net you more than 80-85 fps in most AAA titles at 4K.

The 7nm Titan in 2019, however, will certainly be interesting for high refresh rate 4K gaming. But by the time that comes around, I'm sure there will be another price bump ($4000 Titan?). At some point you really have to ask whether it's the kind of investment you want to make just to achieve 120+ Hz 4K gaming.
Source on your first point, since you stated it as fact?
HDMI 2.1 has enough bandwidth for 8K HDR while putting Variable Refresh Rate (VRR) into the mainline specification. I'm hoping 2.1 makes fixed refresh rates a thing of the past.
At 60 Hz, but otherwise I agree.
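Rough numbers behind that 60 Hz caveat (active pixels only, so real requirements are a bit higher):

```python
def raw_gbps(w, h, hz, bits_per_pixel):
    """Raw video data rate in Gbit/s, active pixels only."""
    return w * h * hz * bits_per_pixel / 1e9

HDMI21_PAYLOAD_GBPS = 42.67  # 48 Gbit/s FRL link after 16b/18b coding

# 8K 60 Hz, 10-bit HDR, 4:4:4 = 30 bits/pixel
needed = raw_gbps(7680, 4320, 60, 30)
print(f"8K60 10-bit 4:4:4 needs ~{needed:.1f} Gbit/s vs {HDMI21_PAYLOAD_GBPS} available")
# Doesn't fit uncompressed; HDMI 2.1 relies on DSC (or chroma subsampling)
# to reach 8K60 HDR.
```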
 
Source on your first point, since you stated it as fact?

Wccftech speculation, based on speculation from AdoredTV about the Titan RTX being 50% faster than a 1080 Ti. "The fact" I am stating is that someone speculated about speculation on a speculated product which may or may not exist. Which is, in fact, a factual description.

In fact, the topic of this thread, "Nvidia RTX", is based on those same recent rumors, so the source material for the thread topic is the same as the source for my post. To make it easier for you in the future: if a thread is about hardware rumors, I suggest reading the original source material on the news sites.
 
I am kind of like the OP: I upgrade my GPU after upgrading monitors, and I typically upgrade my monitor when there is something wrong with it or it breaks. My last monitors were two Hanns G 27" 1080p monitors that I had owned since 2012. Because my eyesight is getting worse (I'm almost 50), I upgraded to a 43" 4K Acer that runs at 60 Hz. I don't game as much as I have in the past, but I do use my monitor for watching movies, Fusion 360, 3D printing, and Photoshop. After the monitor purchase I upgraded my GPU from a GTX 970 to a GTX 1080. At this point, I don't see myself purchasing anything higher than what I have. I imagine we will be looking at 8K monitors by then, and I am only likely to upgrade if my current GPU or monitor dies. I think we have reached a point of diminishing returns. I can only see upgrades making sense for those coming from a GTX 900 series or older, unless you are a "must have" junkie.
 
Good grief, OP. How many times do we have to go over this? 4K can bury a 1080 Ti in the dirt with some high settings.

4K desperately needs the 2080/2080 Ti, even at 60 Hz.
 
Forget 4K monitors, try running an Oculus Rift or HTC Vive with significant amounts of supersampling! For those things, 90 FPS minimum is critical - not average, MINIMUM.

That is why we need way better GPUs than the 1080 Ti right now; VR is just too demanding if you like turning up the details, and future HMDs will assuredly have much higher-resolution screens, closer to 4K if not greater overall, while still having to refresh at least 90 Hz.
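Rough pixel-throughput math for the CV1-era headsets (2160x1200 combined across both eyes at 90 Hz) shows why supersampled VR is so demanding:

```python
def mpix_per_sec(w, h, hz, supersample=1.0):
    """Millions of pixels rendered per second; the supersample factor is
    per axis, so pixel cost grows with its square."""
    return w * supersample * h * supersample * hz / 1e6

# Rift CV1 / original Vive: 2160x1200 combined across both eyes, 90 Hz
print(f"VR @ 1.5x SS: {mpix_per_sec(2160, 1200, 90, 1.5):.0f} Mpix/s")
print(f"4K  @ 60 Hz:  {mpix_per_sec(3840, 2160, 60):.0f} Mpix/s")
# 1.5x-supersampled VR already pushes more pixels per second than 4K60,
# and it has to hit that 90 fps floor on every single frame.
```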
 
The real thing hurting 4k144hz is DisplayPort not being able to do 4k144hz with HDR. I hope to see some 4k144hz displays without HDR to fill in the price gap.

I (and many others) will probably wait for the next revision of DP/HDMI.

Such a bummer. I am really excited about Ultrawide 4k with a high refresh rate. That is a long ways off.

Can you see yourself upgrading to a 3440x1440@200hz display? That could be pretty sweet.

Nvidia Big Format Gaming Displays
 
Nvidia Big Format Gaming Displays
No doubt these will move things forward and put options on the market, not to mention there will be more volume on the parts being ordered (panels etc.).
But these displays will probably be $3k+. For a 55" FALD IPS or VA display, that's competitive with TVs. I would still like to see stuff in the 30-42" range that you can put on a desk.
Non-gaming-specific TVs with HDMI 2.1 and variable refresh / high frame rate at CES 2019 in January will make a good counterpoint to the BFGDs, and I do wonder which will actually be available first.
 
The real thing hurting 4k144hz is DisplayPort not being able to do 4k144hz with HDR.

Uh, ya, it can. I run 4K 144 Hz 10-bit HDR just fine with 4:2:2. And before you say "ugh, 4:2:2", it literally makes no difference in games versus 4:4:4.

Almost all HDR content (including all UHD Blu-rays) is 4:2:0.
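Rough math on why 4:2:2 makes that work: 10-bit 4:4:4 is 30 bits per pixel, while 4:2:2 halves the chroma samples, giving 20 bits per pixel (ignoring blanking overhead):

```python
# Bits per pixel at 10-bit color depth for common chroma subsampling:
# 4:4:4 -> 3 samples/pixel (30 bpp), 4:2:2 -> 2 (20 bpp), 4:2:0 -> 1.5 (15 bpp)
def gbps_4k144(bits_per_pixel):
    """Raw 4K 144 Hz data rate in Gbit/s, active pixels only."""
    return 3840 * 2160 * 144 * bits_per_pixel / 1e9

print(f"4:4:4 (30 bpp): {gbps_4k144(30):.1f} Gbit/s")  # ~35.8, over DP 1.4's 25.92
print(f"4:2:2 (20 bpp): {gbps_4k144(20):.1f} Gbit/s")  # ~23.9, fits
```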
 
A Turing 2080 Ti and this monitor would fit well together; even two 1080 Tis in games that scale well would be awesome:
Upcoming monitors:
https://www.displayninja.com/new-monitors-in-2018-and-2019/

A single 2080 Ti pushing 4K past 60 FPS at max or near-max settings is asking a lot, but for ultrawides such as 3440x1440 and higher it seems to be right at the sweet spot for 100 Hz+ monitors.
 
Quieter is better; it would be nice to have minimum frame rates over 60 without the fans breaking 50%.

56 was the minimum in the Strix 2080 Ti preview on Shadow of the Tomb Raider, so that is close.
 