Gsync 4k 144hz HDR monitor prices 'released'

Been playing the new God Of War game on my OLED and I can't even begin to imagine just how awful the blooming would look in scenes like this if I was playing on a FALD IPS instead.
 

[Attachment: 20180422_214924.jpg]
At these prices, flawed pixels are inexcusable.

I think it's becoming a case of gamer = audiophile to manufacturers.

Watch those 65" gsync monitors be in the realm of flagship 85-100" television prices.
 
If this came out 2-3 years ago I might have been willing to spend that much. But I've been spoiled by OLED PQ, and it's pretty affordable now considering the size you get, so there's no way I'd buy one of these. I'm pretty content with just waiting for a high refresh OLED display. It seems almost inevitable at this point, just a matter of time.
 
According to an Asus rep that has been answering questions on the ROG forums, the FALD cannot operate in SDR mode. So yea I'm not buying a $3000 monitor that only uses its local dimming in like 30 games. By the time HDR PC game support is widespread, there will be far better display options.
 
According to an Asus rep that has been answering questions on the ROG forums, the FALD cannot operate in SDR mode. So yea I'm not buying a $3000 monitor that only uses its local dimming in like 30 games. By the time HDR PC game support is widespread, there will be far better display options.

LOL wtf. Asus/Nvidia really shouldn't have tried to push so hard for HDR on this thing and should've just aimed for standard 4k144Hz. Then it likely wouldn't have faced so many delays and also would have been better priced due to having no FALD.
 
Oh my god, this is another FALD-only-in-HDR monitor? Epic failure! A failure just like the 32" ProArt FALD that maxes out at 300 nits in SDR mode. Display manufacturers are incredibly incompetent.

I think I'll stick with my C8 OLED.
 
I'm thinking about saying fudge it and starting to look at the 4K monitors that are already out there. I really want to upgrade my monitor, and waiting on these doesn't seem worth it at this point.
 
Holy cow that's insane for just HDR

Glad I bought my GSYNC monitor for $900 now :ROFLMAO:
 
Holy cow that's insane for just HDR

HDR is hard.

Right now it's just a direction - there's literally no display in existence that can do HDR right. These displays are decent efforts considering the other features packed in.
 
HDR is hard.

Right now it's just a direction - there's literally no display in existence that can do HDR right. These displays are decent efforts considering the other features packed in.


Yea man, but the FALD implementation is a total Neg if it's gonna be gimped like the 60hz models.

Two years ago they should have given us a 32" IPS 4k120hz MST at $1,200 - $1,500 price points. Instead we got bled and bled and bled with tired 1440p... and now when we finally do get a high refresh 4k gaming display they go and muck it up with a totally half-baked product... and if/when it fails all the suits will say, SEE, WE SHOULD HAVE JUST KEPT GIVING THEM MORE 1440PEEEEEE!
 
Yea man, but the FALD implementation is a total Neg if it's gonna be gimped like the 60hz models.

Two years ago they should have given us a 32" IPS 4k120hz MST at $1,200 - $1,500 price points. Instead we got bled and bled and bled with tired 1440p... and now when we finally do get a high refresh 4k gaming display they go and muck it up with a totally half-baked product... and if/when it fails all the suits will say, SEE, WE SHOULD HAVE JUST KEPT GIVING THEM MORE 1440PEEEEEE!

I think the problem simply lies with display manufacturers not making the screens en masse or not having the ability to manufacture them at all
 
I think the problem simply lies with display manufacturers not making the screens en masse or not having the ability to manufacture them at all

What are you talking about? We have had 4k60hz IPS displays since at least 2015, and MST has been a viable technology powering 5k displays since then as well.

Right now I am typing on a Zisworks X39 4k120 running by the magic of MST. If one man in a garage can do it, WHY can't any of the major players in the display market? Not only is a 30"-40" 4k120 a hot commodity for gaming, but it's also completely excellent for work productivity and CAD drafting. And yet we have been force fed bottom feeder 1440P and UWs for the last three years.

And then when we do get a 4k144hz, it's gimped all to high hell and most likely completely overpriced.
I swear the display industry is the WORST in terms of innovation and advancement :-(
 
It really is mind boggling. Same with the criminalization of glossy gaming displays.

They are so out of touch.
 
According to an Asus rep that has been answering questions on the ROG forums, the FALD cannot operate in SDR mode. So yea I'm not buying a $3000 monitor that only uses its local dimming in like 30 games. By the time HDR PC game support is widespread, there will be far better display options.

LOL wtf. Asus/Nvidia really shouldn't have tried to push so hard for HDR on this thing and should've just aimed for standard 4k144Hz. Then it likely wouldn't have faced so many delays and also would have been better priced due to having no FALD.

Oh my god, this is another FALD-only-in-HDR monitor? Epic failure! A failure just like the 32" ProArt FALD that maxes out at 300 nits in SDR mode. Display manufacturers are incredibly incompetent.

I think I'll stick with my C8 OLED.
MarshallR posted a followup saying that FALD is not disabled in SDR.

https://rog.asus.com/forum/showthread.php?99007-Pg27uq/page26#post717065
MarshallR said:
FALD is not disabled for SDR - I got that wrong. The engineers got back to me since yesterday to clarify.
 
So if my ten year old monitor kicks the bucket on me... and that can happen any day now... what should I buy? Seriously? Vega, l88bastard... some of you others... where would you go in 2018 if you had to make a decision? Because I share l88's (and others') assessment about the appalling and frustrating state of this industry.
 
Personally, I'd wait on HDMI 2.1 and a clearer HDR monitor landscape - 120hz OLED with VRR over HDMI 2.1, etc. But if I had to buy this year, I'd look to the sub-$1k 120hz+ VA G-sync monitors at 1440p (for 3x the black depth and contrast ratio vs IPS and TN) and not blow a ton more money (up to triple or more!) on these FALD HDR ones. I would save my piggy bank breaking for 120hz native OLED on HDMI 2.1 (whose spec includes 120hz, VRR variable refresh rate, Low Latency tech, dynamic HDR, etc).

I'd take a good look at this one, which a few regulars replying in this thread have, I think:

https://hardforum.com/threads/31-5-2560x1440-165-hz-va-g-sync-lg-32gk850g.1947751/

https://www.newegg.com/Product/Product.aspx?Item=N82E16824025836&ignorebbr=1

Otherwise, with a mid range gpu and a more affordable monitor budget, I'd even look at the 1080p 120hz+ G-sync VAs so I could crank all the settings up. That, or score a used TN or IPS ROG Swift 1440p. Either would keep you from blowing too much before a few HDMI 2.1 tv/gaming displays are out in 2019. I can't see blowing $3k - $5k on a "super monitor" of the moment considering what is just around the corner. That kind of purchase should last me a while.
 
Yes, I'm glad he corrected that. But we still need a review from a respected publication to see if they even got the FALD right, or if they just gave up and dumped it on the market because the development costs were getting out of control and they were afraid that 2019 HDMI 2.1 TV competition would render it irrelevant to the market.
 
What is this shit? These manufacturers are fucking crazy.
I'm not buying anything 4k below 40" and especially not for that price.
27", lol. People can actually play on that?
My friend is even calling me out for cheating in FPS because I don't even need a scope on my 50" TV. What he sees as a tiny dot I clearly see as a player.
Even these 34" ultra wide screens are anything special. Still small, just wider.

I couldn't agree more. I'd love the better refresh rates, but the sizes on all the fast displays are just insufficient.
 
So if my ten year old monitor kicks the bucket on me... and that can happen any day now... what should I buy? Seriously? Vega, l88bastard... some of you others... where would you go in 2018 if you had to make a decision? Because I share l88's (and others') assessment about the appalling and frustrating state of this industry.

Well, that is exactly the problem. For the last three years your choice has been:

- Overpriced 27" 144-165 hz TN, VA or IPS displays with shit quality control and terrible AG.

- Overpriced 4k 60hz panels that suck for competitive FPS

- Overpriced and over hyped Ultra Wides that create more problems than they solve.

I use a C7 OLED 55" when I want the most beautiful immersive experience, and for faster paced gaming I built a Zisworks X39 4k120, which cost me under $500 total out the door. The X39 is not perfect, as it has the slow pixel transition drawbacks of VA; however, what you get for the price cannot be beat.

I am sitting things out until we get proper high refresh 4k displays for realistic prices or LG releases a 40"-55" 4k120hz OLED.
 
So if my ten year old monitor kicks the bucket on me... and that can happen any day now... what should I buy? Seriously? Vega, l88bastard... some of you others... where would you go in 2018 if you had to make a decision? Because I share l88's (and others') assessment about the appalling and frustrating state of this industry.


If my Acer XB321HK died (and warranty didn't apply for some reason), I'd replace it with a cheap 32" 4k IPS display, e.g. Monoprice's. I want the extra height vs a 27"; playing games with corner HUDs, I absolutely do not want an ultrawide, even if I could get one tall enough to not be a downgrade. The extra sharpness of 140dpi vs 100 was totally worth it over my old 30" 2560x1600 monitor, so I want to keep that. OTOH, as someone who never noticed tearing while actually gaming (vs stepping through video one frame at a time), GSYNC hasn't had any noticeable effect on my gaming experience; and the single glitched frame per day from what passes for quality control at AUO is bad enough that I'm not willing to pay a premium to replace like with like.

My dissatisfaction with the QC on my current display is such that even if AUO's new panel came out at $500 instead of $2500, I'd refuse to buy it and would instead wait for a similar Samsung, etc. panel.
 
I would save my piggy bank breaking for 120hz native OLED on HDMI 2.1 (whose spec includes 120hz, VRR variable refresh rate, Low Latency tech, dynamic HDR, etc).

Knowing this industry it'll probably be another 10 years before we see all those specs together in one monitor.

I'm also not so sure HDMI 2.1 is right around the corner, at least when it comes to monitors. TVs will, I'm sure, feature it next year. But I'd be very surprised to see any HDMI 2.1 on monitors <$1,000 for a couple of years yet.
 
2018 LG OLEDs already have 120hz native 4k panels; the HDMI 2.1 spec wasn't available in time to fab the direct input, so they can only stream 120fps/hz on their 2018 models. 2019 will almost certainly bring HDMI 2.1 LG OLED TVs...

The HDMI 2.1 spec has a low latency mode for gaming. It also has a low latency transition spec so you don't get black screen pauses when switching rez/sources, etc. The HDR spec includes P3 color, so the color spectrum should be pretty good, especially for gaming purposes, and of course with a much wider HDR color volume. The top end of OLED's HDR color volume is a bit lower, but its emissive "per pixel FALD" has unequaled black depth.

The only real question is whether HDMI VRR will be supported by gpu manufacturers. XBOX supporting it already is a good sign, I hope. I guess one other issue would be someone who is unwilling to (or lacks the space to) set their desk apart from a wall like a service desk, or use a monitor mount, so they can sit a reasonable +/- 6' away from a 55" OLED. I don't see any reason to buy a $3k to $5k FALD monitor if you can have 4k 120hz native over HDMI 2.1, HDMI 2.1 low latency input, HDR, and OLED in 2019... with hopefully VRR support coming down the pipe, since it's part of the spec. Any initial lack of VRR support on the gpu end is the only thing that would give me a second thought.

----------------------------------------

HDMI 2.1 https://www.flatpanelshd.com/news.php?subaction=showfull&id=1511934073
The first phase of HDMI 2.1 certification starts in the second quarter, with full certification expected to begin in the third or fourth quarter of the year. Products with the official stamp of approval can be launched following successful certification.
Besides support for 8K and 10K resolution as well as 4K resolution at 120fps, HDMI 2.1 supports Dynamic HDR to enable “multiple static and dynamic HDR solutions”. It features eARC that “supports the most advanced high bitrate home theater audio formats, object-based audio, uncompressed 5.1 and 7.1, and 32-channel uncompressed audio” with audio bandwidth up to 37 Mb/s

  • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better detailed gameplay.
  • Quick Media Switching (QMS) for movies and video eliminates the delay that can result in blank screens before content is displayed.
  • Quick Frame Transport (QFT) reduces latency for smoother no-lag gaming, and real-time interactive virtual reality.
  • Auto Low Latency Mode (ALLM) allows the ideal latency setting to automatically be set allowing for smooth, lag-free and uninterrupted viewing and interactivity.
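
To put rough numbers on why everyone keeps pointing at HDMI 2.1's 48 Gbit/s link for this stuff, here's a quick back-of-the-envelope bandwidth calc in Python (a sketch only - it ignores blanking intervals, so real link requirements run somewhat higher):

[CODE]
# Rough uncompressed video bandwidth: pixels per second * bits per pixel.
# Ignores blanking overhead, so actual requirements are somewhat higher.
def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

print(gbps(3840, 2160, 120, 30))  # 4k120 at 10-bit RGB -> ~29.9 Gbit/s

# Usable payload after line-encoding overhead:
hdmi_20 = 18.0 * 8 / 10    # ~14.4 Gbit/s (8b/10b encoding)
hdmi_21 = 48.0 * 16 / 18   # ~42.7 Gbit/s (16b/18b encoding)
# 4k120 at 10-bit RGB blows way past HDMI 2.0 but fits comfortably in HDMI 2.1.
[/CODE]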
 
If only they had gsync. Or even freesync. I would downgrade to an amd card to use it.
 
Ok, so FALD works in SDR mode. Now my question is: what will the contrast ratio be with FALD, and will it suffer from insane amounts of blooming and dimming lag?
 
My question is: are they actually going to give us a fuckin US release date, or even a price? So much for Nvidia's promise of this fiscal quarter :p
 
So if my ten year old monitor kicks the bucket on me... and that can happen any day now... what should I buy? Seriously? Vega, l88bastard... some of you others... where would you go in 2018 if you had to make a decision? Because I share l88's (and others') assessment about the appalling and frustrating state of this industry.

I don't really "do" monitors anymore. I rock a 2018 LG OLED (C8). I may get another monitor when OLED with HDR comes down the pipe.

I was originally excited when this monitor was revealed almost a year and a half ago. Having played around with FALD, I found it is not the answer (ESPECIALLY with an IPS panel). Plus, IMO 27" is too small for 4K. Is it clear? Of course, but the increase is largely wasted I think versus a more immersive 32". And now that you toss on lowered chroma just to get the higher refresh rate, I don't think these will be particularly hot sellers unless the price drops significantly.

Will the monitor be pretty good for bright games like say PUBG? Sure. I'm almost certain anyone who loves dark games will be in for quite the disappointment.
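
For anyone wondering why they had to drop chroma to hit 144Hz at all: these monitors run over DisplayPort 1.4, whose HBR3 link carries only about 25.9 Gbit/s of payload. A rough calc (ignoring blanking intervals, so a best case) shows full RGB at 4k144 simply doesn't fit:

[CODE]
# Payload needed for 4k144, ignoring blanking (best case):
px_per_sec = 3840 * 2160 * 144               # ~1.19 Gpx/s

rgb_10bit       = px_per_sec * 30 / 1e9      # ~35.8 Gbit/s
rgb_8bit        = px_per_sec * 24 / 1e9      # ~28.7 Gbit/s
ycbcr_422_10bit = px_per_sec * 20 / 1e9      # ~23.9 Gbit/s

dp14_payload = 4 * 8.1 * 8 / 10              # HBR3, 4 lanes: ~25.9 Gbit/s after 8b/10b
# Only 4:2:2 squeezes under the limit at 144Hz; full RGB means
# backing off to roughly 120Hz at 8-bit, or lower.
[/CODE]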
 
I don't really "do" monitors anymore. I rock a 2018 LG OLED (C8). I may get another monitor when OLED with HDR comes down the pipe.

I was originally excited when this monitor was revealed almost a year and a half ago. Having played around with FALD, I found it is not the answer (ESPECIALLY with an IPS panel). Plus, IMO 27" is too small for 4K. Is it clear? Of course, but the increase is largely wasted I think versus a more immersive 32". And now that you toss on lowered chroma just to get the higher refresh rate, I don't think these will be particularly hot sellers unless the price drops significantly.

Thing to keep in mind with the "too small for 4k" is that there are different reasons to want 4k. If you want it to get more screen real estate, then ya you want a larger display, though they need to get pretty large to offer you the same pixel density as 24" monitors at 1920x1080. However the reason that others are looking at 4k (or 5k or more) is higher pixel density. You have your programs scale things up, so that everything looks more smooth. The idea being eventually get pixels small enough that you can't perceive the individual pixels, and everything is perfectly smooth. For that you need a higher resolution on an existing size.
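
Just to quantify that density point, the standard diagonal PPI formula makes it obvious how big a 4k panel has to get before it stops being a density upgrade (a quick illustrative sketch):

[CODE]
import math

def ppi(h_px, v_px, diagonal_inches):
    # Pixels per inch along the panel diagonal.
    return math.hypot(h_px, v_px) / diagonal_inches

print(ppi(1920, 1080, 24))  # ~92 PPI  - classic 24" 1080p
print(ppi(3840, 2160, 27))  # ~163 PPI - 27" 4k
print(ppi(3840, 2160, 32))  # ~138 PPI - 32" 4k
print(ppi(3840, 2160, 48))  # ~92 PPI  - 4k needs ~48" to match 24" 1080p density
[/CODE]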
 
Well, this is a gaming monitor, and for games high ppi is nice, but I and many others prefer a larger screen for a better viewing experience. I have a 24 inch 4k monitor, so I know how good games can look when you're running close to 200ppi, but I would not trade screen size for ppi when playing games. 4k at 40 inches with lower ppi just gives me a better viewing experience than 4k at 27 inches with higher ppi.
 
I have a 32" 4K monitor and honestly, high PPI is really nice for things that are often very close to your eyes like phones, but on monitors? It's a marginal improvement for text only, and I'm not really reading a large amount of text and focused on comfortable text aliasing when I'm playing video games. Not to mention, the additional fidelity you get from 4K 27" PPI goes immediately to hell the second you move your viewport, because even at 144hz the motion resolution of a non-strobing LCD is nowhere NEAR a static 4K image.

Viewing distance is a significant variable in perceived resolution, and while the PPI of a 4K 40" screen is "only" 110, similar to a 1440p 27" screen, the effective PPI is higher because you are going to be further away. If you put your 40" screen at the same distance I have my 27" -- about 20 inches -- you would have to turn your head constantly, which you don't want.

Even on 55" 4K you only really perceive screen door effect if you inspect the screen at very close distances that you would not normally use for viewing.
 
Sooo who's gonna take one for the team?

Who's gonna be the first to drop $2,500 and give these a go?

* assuming they are $2,500 lol
 
In 2016, a crack commando unit was sent to eternal timeout by the display manufacturer oligarchies for wanting high refresh 4k displays at a reasonable price. These men promptly escaped from a maximum-security stockade to the internetz underground. Today, still wanted by the noobz, they survive as soldiers of fortune. If you have a problem... if no one else can help... and if you can find them... maybe you can hire... The [H]-Team
 
So I would be happy to pay that for a 65" 2160p OLED 144Hz G/Freesync display.

I would have been in line right behind you if they'd been released in the last year...

And I'll get in line tomorrow if they become available :D.
 
No biggie, this badass LG 32GK850G will do me just fine until 2020, when these 4k/120Hz/VRR monitors and the (single) GPUs to drive them at their full potential actually exist.
 
No biggie, this badass LG 32GK850G will do me just fine until 2020, when these 4k/120Hz/VRR monitors and the (single) GPUs to drive them at their full potential actually exist.

Tempting for sure... larger format than my 27" Swift gaming display, maintains a more reasonable resolution for high end gpu(s), and has g-sync and VA black levels at that. However, even though I realize HDR is still a bit immature, dropping $800+ on a non-HDR, non-OLED/FALD monitor this late is still a hard sell for me considering the alternative of skipping it, saving, and shelling out for a 120hz native, VRR, HDR, 4k OLED in the next year or two.

Regarding a single GPU driving 4k to its full potential - I think the graphics ceiling is an arbitrary set point that can be blown out immensely. The challenge for devs is to whittle virtual world rendering down to "real time", not the other way around. It would be very easy to turn the graphics sliders up 2x, 3x and more with detailed view distances and animated objects in those distances, scene complexity, shadows, hair, reflections, fx, textures, supersampling, etc. For example, in actual complex cgi movie making, they use low-rez dummy versions of characters and assets to animate and work in real time, then bake renders very slowly to produce the actual scene at full complexity. Dynamic resolution in games similarly adjusts the operating complexity down to hit real-time frame rates.
So while I understand what you are getting at - screenshots sell, and I don't think devs are going to just keep the current gen's complexity going forward. It would probably be accurate to say that current games, without their complexity being modified in the future, could be driven at 100fps+ at 4k with a next gen single card. Future generations of games' ultra/max graphics ceilings will probably still be out of reach at high frame rates unless gpus make a huge leap beyond whatever devs decide the next gen's arbitrary graphics ceilings are.
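
Since dynamic resolution came up: the whole trick is just a feedback loop on frame time. A minimal illustrative sketch of how a game might scale render resolution to hold a frame rate target (the names and thresholds here are made up for illustration, not any engine's actual API):

[CODE]
TARGET_MS = 1000 / 120   # frame budget for a 120fps target
scale = 1.0              # fraction of native resolution per axis

def adjust_render_scale(last_frame_ms):
    global scale
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: render fewer pixels
        scale = max(0.5, scale - 0.05)
    elif last_frame_ms < TARGET_MS * 0.85:  # comfortably under: claw quality back
        scale = min(1.0, scale + 0.02)
    return int(3840 * scale), int(2160 * scale)

# e.g. a 10.5ms frame against the 8.3ms budget drops the next frame's
# render resolution by 5% per axis, then upscales the result to native 4k.
[/CODE]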
 
Tempting for sure... larger format than my 27" Swift gaming display, maintains a more reasonable resolution for high end gpu(s), and has g-sync and VA black levels at that. However, even though I realize HDR is still a bit immature, dropping $800+ on a non-HDR, non-OLED/FALD monitor this late is still a hard sell for me considering the alternative of skipping it, saving, and shelling out for a 120hz native, VRR, HDR, 4k OLED in the next year or two.

Regarding a single GPU driving 4k to its full potential - I think the graphics ceiling is an arbitrary set point that can be blown out immensely. The challenge for devs is to whittle virtual world rendering down to "real time", not the other way around. It would be very easy to turn the graphics sliders up 2x, 3x and more with detailed view distances and animated objects in those distances, scene complexity, shadows, hair, reflections, fx, textures, supersampling, etc. For example, in actual complex cgi movie making, they use low-rez dummy versions of characters and assets to animate and work in real time, then bake renders very slowly to produce the actual scene at full complexity. Dynamic resolution in games similarly adjusts the operating complexity down to hit real-time frame rates.
So while I understand what you are getting at - screenshots sell, and I don't think devs are going to just keep the current gen's complexity going forward. It would probably be accurate to say that current games, without their complexity being modified in the future, could be driven at 100fps+ at 4k with a next gen single card. Future generations of games' ultra/max graphics ceilings will probably still be out of reach at high frame rates unless gpus make a huge leap beyond whatever devs decide the next gen's arbitrary graphics ceilings are.

It'll be both moving at the same time. Over the previous decade we've seen what counts as high quality settings go up while, simultaneously, the top end card went from delivering them comfortably at 1080p60, to comfortably at 1440p60, to barely at 4k60.

The next generation or two will probably see 4k60 become comfortably reachable in most games with a top end card at high/ultra settings, while the ability to go above 60hz gradually builds up in the years after that.
 
Yes, I'd guess two or more, but who knows. We just got to the point where a single top end gpu (1080ti) can average 100fps+ at 2560x1440 on games of its generation - and even then with some over the top settings turned off on a few of the most demanding games. If high hz 4k screens become more prevalent, perhaps that would drive gpu manufacturers to achieve more sooner.
 