So... when will we see 4K, VA/IPS, 120Hz, G-Sync monitors?

And video cards that can drive them....

2017? Or am I wishing too hard?

I could see it happening by then, but the "video cards that can drive them" part will depend on the games and what graphical settings you find acceptable.
 
4K G-Sync monitors? 2017 maybe... MAYBE.

The display/monitor world honestly moves at a snail's pace. We've only just started to get monitors with backlight strobing that can keep up with old CRT displays. That only took, what, 12 years?

A lot will depend on the drive from gamers to adopt 4K, and that means games that keep pushing the envelope. Hopefully Mantle will help push that along, but developers will keep churning out lazy console ports. Once indie developers can really push the envelope graphically without needing tons of money, we'll see plenty of great-looking PC-only titles that warrant the upgrade.

But if all we're going to get is Crysis 3-style junk, games that are like supermodels, gorgeous but devoid of any intelligence, then it's going to be a very long haul to 4K G-Sync monitors.
 
The DisplayPort 1.3 spec was just announced, and is expected to be ratified Q2 2014 with end-products by 2015. This will bring both 8K and 4K @120Hz, so perhaps now that even VA can manage 120Hz (e.g. Eizo Foris FG2421) we'll see either/both VA/IPS in 4K @120Hz by 2015 or 2016. It's exciting to hear about Dell's upcoming P2815Q 4K 28-incher at "under $1000" early next year, but I don't think we've had any confirmation as to panel type. Regardless, it's a relief to see 4K quickly trickle down from the "prosumer" market without all the wide-gamut backlighting and gaggle of redundant inputs that plagued 2560x1440 models for years after their introduction.
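
For anyone who wants to sanity-check the 4K @120Hz claim, the napkin math is straightforward. Here's a rough Python sketch; the effective link rates are the published figures (4 lanes, minus 8b/10b coding overhead), while the ~8% blanking factor is my own approximation rather than an exact CVT-R2 timing:

Code:
# Does DP1.3 have the bandwidth for 4K @ 120Hz, 8-bit RGB?
def required_gbps(w, h, hz, bpp=24, blanking=1.08):  # ~8% blanking assumed
    return w * h * hz * bpp * blanking / 1e9

links = {"DP 1.2 (HBR2)": 17.28,   # Gbps effective, 4 lanes after 8b/10b
         "DP 1.3 (HBR3)": 25.92}

need = required_gbps(3840, 2160, 120)
for name, cap in links.items():
    print(f"{name}: need {need:.1f} Gbps, have {cap:.2f} -> "
          f"{'fits' if need <= cap else 'does not fit'}")

It comes out to roughly 25.8 Gbps against DP1.3's 25.92 Gbps effective, so 4K @120Hz fits, but only just; full 8K at 60Hz would need chroma subsampling on the same link.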

G-sync support will roll out through 2014, and I've heard several non-mouthpieces use the term "game-changer" to describe its effect, so hopefully it'll have widespread support by the time DP1.3 products come to market. Graphically, Moore's Law is definitely slowing its roll, and two years after the first 28nm GPUs we still have no new 20nm node to look forward to anytime soon. What's the most recent projection? Maybe H2 2014?

We'll have another year of 28nm, and therefore no outrageous graphical gains, but I imagine that by 2015 or 2016 at the latest (mindless optimism or not, I think we can beat your guarded hope of 2017) we'll have DP1.3 G-Sync 4K 120Hz monitors, 20nm-class Nvidia/AMD GPUs that also support DP1.3, and DPI-scaling improvements in Windows 8.2 (or whatever it ends up being called; "Threshold" is the codename, and it's expected H1 2015).

Exciting stuff, and by then we should even have some 4K game benchmarks from hardware reviewers that don't spell doom and gloom by idiotically applying 32x SSAA or something equally imbecilic to show that Crysis gets 3 FPS @4K. Without AA, I have no doubt that a couple of $250 20nm GPUs in 2015 could run any ol' PS4/XBO console port at a glorious 4K 120Hz, and if someone with hawk eyes still wanted anti-aliasing at that insane PPI, they could throw on 2x or 4x CSAA and still have great frame rates.
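
To put some numbers behind that: raw shading work scales with pixels per second, and supersampling multiplies it again. A quick sketch (the frame rates and AA factors are just illustrative):

Code:
# Pixel throughput relative to a 1080p60 baseline.
base = 1920 * 1080 * 60            # ~124 Mpix/s
target = 3840 * 2160 * 120         # ~995 Mpix/s

print(f"4K120 is {target / base:.0f}x the pixel rate of 1080p60")

# Supersampling shades every pixel N times over:
for ssaa in (2, 4, 32):
    print(f"{ssaa}x SSAA at 4K120 = {target * ssaa / base:.0f}x 1080p60")

So 32x SSAA at 4K120 is literally 256x the pixel work of vanilla 1080p60, which is why those benchmarks look so apocalyptic.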
 
The limiting factor for high-refresh monitors usually seems to be the release of a display interface capable of what you want it to do, followed by a 3-5+ year wait until said interface reaches a fairly large graphics-card market share (the DisplayPort standard was set in 2006, and only now are we starting to see screens that really make full use of it; all the others have simply had it as a bonus input, using only a fraction of the capacity).
1440p 120Hz has been possible for years (strictly speaking, since DisplayPort 1.2 brought the bandwidth in 2010), but no one has bothered releasing one because of the small market of people with DisplayPort graphics cards, combined with the small market of computers capable of driving 1440p at 120Hz.

For 4K 120Hz, we need the interface meant for 8K 60Hz: DisplayPort 1.3, which isn't finalized yet (I recall an article saying it would be released in 2014/2015 or so).
After it's released, we need to wait for it to gain a fairly large share of the GPU market, as well as for mainstream GPUs with the power to drive those displays.
I think it's realistic to say that 120Hz 4K probably won't hit mainstream until around 2020, or a year or two before that at best. Expensive niche monitors might be an exception, of course.
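
The interface bottleneck is easy to see with the same kind of napkin math. A sketch; the effective bandwidths are the published per-spec figures, and the ~8% blanking factor is an approximation:

Code:
# Which modes fit which link? (Gbps effective, 24bpp RGB, ~8% blanking)
links = [("Dual-link DVI", 7.92), ("DP 1.1 (HBR)", 8.64),
         ("DP 1.2 (HBR2)", 17.28), ("DP 1.3 (HBR3)", 25.92)]

modes = [("1440p 120Hz", 2560, 1440, 120), ("4K 60Hz", 3840, 2160, 60),
         ("4K 120Hz", 3840, 2160, 120), ("8K 60Hz", 7680, 4320, 60)]

for mname, w, h, hz in modes:
    need = w * h * hz * 24 * 1.08 / 1e9
    fits = [lname for lname, cap in links if cap >= need]
    print(f"{mname}: {need:.1f} Gbps -> "
          f"{fits if fits else 'needs 4:2:0 or compression'}")

Note that full 8K 60Hz RGB actually overshoots even DP1.3; the 8K support in the spec leans on 4:2:0 subsampling.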

As for panel types, OLED or VA seem most likely in my opinion. IPS has always been too slow for a real 120Hz refresh, and all the new 4K IPS/IGZO panels are even slower than the old lower-res ones.
 
Some very good comments here. I'll add, too, that it's been years since AMD and NVIDIA were pressured by external hardware spec requirements, as opposed to merely software; 4K is definitely bringing the heat. I think the last time we saw such a sea change throughout the industry was the proliferation of 1080p, a decade ago. Sure, current solutions provide a passable 4K experience, but it's far from ideal. There's a whole checklist of baseline conditions that need to be met before we can really take advantage of 4K:

- Proper DPI (PPI) scaling in Windows 8.x (on the way in 2014-2015)
- Non-MST connectivity for 4K displays (on the way in 2014)
- HDMI 2.0 to merely enable 60Hz @ 4K (on the way in 2014)
- DisplayPort 1.3 for 120Hz @ 4K (on the way in 2015)
- Advancements in both AMD and NVIDIA single-card GPUs to power those resolutions.
- Huge advancements in Intel chipsets to power those resolutions (sans graphics card); currently, I think the Z87 platform is only capable of 4K @ 25Hz (rough numbers sketched below). I can't speak for AMD, as I'm not overly familiar with their current APUs.
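
For the HDMI and chipset items, the ceiling falls out of the max pixel clock: refresh = pixel clock / (htotal x vtotal). A rough sketch using the standard CEA 4K timing (4400 x 2250); the HDMI clock caps are from the specs, while the motherboard-out figure is just an assumption picked to match that ~25Hz observation:

Code:
# Max 4K refresh given a link's max pixel clock.
HTOTAL, VTOTAL = 4400, 2250   # CEA-861 3840x2160 timing

for name, mhz in [("HDMI 1.4 (340 MHz)", 340),
                  ("HDMI 2.0 (600 MHz)", 600),
                  ("Motherboard HDMI out (~250 MHz, assumed)", 250)]:
    print(f"{name}: ~{mhz * 1e6 / (HTOTAL * VTOTAL):.0f} Hz max at 4K")

That's why HDMI 1.4 tops out around 30Hz at 4K and HDMI 2.0 is needed for 60Hz.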

It's going to take a while for all of these to align on a mass-market scale. And that's to say nothing of OLED, which is going to take at least a few more years of early-adopter -> refine -> re-release cycles before it starts showing up, affordably (south of $5000), on smaller TVs/monitors. And unless NVIDIA makes G-Sync an open standard, I have a feeling it's going to stay pretty niche.
 
Interesting. Thanks, guys. Sounds like my random guess of 2017 wasn't TOO far off for cheap-ish hardware. I guess the DisplayPort 1.3 spec is going to be really important.

Do we know for sure that it supports 120Hz at 4K?
 
- Proper DPI (PPI) scaling in Windows 8.x (on the way in 2014-2015)
- Non-MST connectivity for 4K displays (on the way in 2014)
- HDMI 2.0 to merely enable 60Hz @ 4K (on the way in 2014)
- DisplayPort 1.3 for 120Hz @ 4K (on the way in 2015)
- Advancements in both AMD and NVIDIA single-card GPUs to power those resolutions.
- Huge advancements in Intel chipsets to power those resolutions (sans graphics card); currently, I think the Z87 platform is only capable of 4K @ 25Hz. I can't speak for AMD, as I'm not overly familiar with their current APUs.

Several of those things I really don't see as 'baseline' conditions. Not everyone would even use DPI scaling (I know I wouldn't, if I even ran Windows). I think MST is a fine solution, with no issues on at least some OS/configurations. Since DP can drive it, I don't think HDMI 2.0 is a requirement, though it would be nice so people could use TVs as monitors. And I don't think 120Hz @ 4K should be a baseline requirement; not everyone games, or plays the newest games (I run older titles like Counter-Strike: Source and QuakeLive, which run fine at 4K on mid-range hardware).

Just saying... most of the things you listed would be nice, but I think we already have the 'baseline' conditions for 4K. I assume this is about generic 4K and not the 4K/120Hz the OP asked about?
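
On the MST point: the first-wave 4K monitors (the Sharp/ASUS/Dell 32-inchers) present themselves as two 1920x2160 tiles, and a quick check suggests DP1.2 has the bandwidth for 4K60 either way; MST looks like a workaround for the panels' tile controllers rather than a link limit. A sketch, with the same assumed ~8% blanking factor as above:

Code:
# DP1.2 (HBR2): one 4K60 stream vs. two MST tiles, 24bpp RGB.
CAP = 17.28                               # Gbps effective, 4 lanes HBR2

def gbps(w, h, hz, bpp=24, blank=1.08):   # ~8% blanking assumed
    return w * h * hz * bpp * blank / 1e9

sst = gbps(3840, 2160, 60)                # single full-res stream
mst = 2 * gbps(1920, 2160, 60)            # two half-width tiles
print(f"SST: {sst:.1f} Gbps, MST: {mst:.1f} Gbps, link: {CAP} Gbps")

Both come out around 12.9 Gbps, comfortably inside the link, which supports the view that MST itself isn't the real problem.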

Of course, I've been way ahead of the curve, having used 4K since 2005. Even the first card I had capable of 4K @ 60Hz (a GeForce 6600 GT AGP) provided an acceptable 4K experience if you didn't game at 4K. It took a few more years for fast-enough cards to come out to handle the games I play at 4K.
 
Once they've sold everyone 4K monitors, they'll finally add one of the other features to resell those same people more monitors. That's how the display industry works.
 