Acer X27 4K 144Hz HDR G-Sync

Just in general, 3440x1440 is a fairly low resolution, roughly 110 PPI on the 34/35-inch models. Granted, the FALD and HDR are going to help the image quality of a 35" 3440x1440 panel a lot, but there is only so much you can do with such a large size at a lower resolution. Also, VA pixel response times run out of headroom the quickest, so I'm not sure how 200 Hz isn't going to be a gimmick.

I agree it's a low resolution, but I'm also completely happy with 1440p at 27" and see no reason to upgrade to 4K. I just don't perceive much improvement. I've used 27" 4K monitors before and they're meh as far as I'm concerned for gaming. I do notice the improvement in desktop text rendering, but that's a very low priority, especially when you're paying refresh rate for it, ANY refresh rate. 4K is just not exciting at all. I'm never going to do SLI, so framerates at 4K are also a concern even with a 1080 Ti/1180 Ti (whenever that comes out).

HDR and black levels are much more important to me, and I expect the VA panel to outperform in those areas precisely because it is VA. Plus it's ultrawide, which is nice because I mostly play MOBAs outside of AAA games. I agree there's a risk the dark transitions will be a problem above 144 Hz, but fast VA panels do exist, so maybe they've done a good job, who knows.
 
I agree it's a low resolution, but I'm also completely happy with 1440p at 27" and see no reason to upgrade to 4K. I just don't perceive much improvement. I've used 27" 4K monitors before and they're meh as far as I'm concerned for gaming. I do notice the improvement in desktop text rendering, but that's a very low priority, especially when you're paying refresh rate for it, ANY refresh rate. 4K is just not exciting at all. I'm never going to do SLI, so framerates at 4K are also a concern even with a 1080 Ti/1180 Ti (whenever that comes out).

HDR and black levels are much more important to me, and I expect the VA panel to outperform in those areas precisely because it is VA. Plus it's ultrawide, which is nice because I mostly play MOBAs outside of AAA games. I agree there's a risk the dark transitions will be a problem above 144 Hz, but fast VA panels do exist, so maybe they've done a good job, who knows.

I agree. I think VA + FALD with 512 zones + 1000-nit HDR will give us the closest thing to OLED.
Look at the Samsung Q9FN with 480 zones and the Panasonic DX902B with 512 zones and how they perform against OLED.
 
I've used 27" 4K monitors before and they're meh as far as I'm concerned for gaming. I do notice the improvement in desktop text rendering, but that's a very low priority.
That's the reason to get 4K: it improves text clarity by a lot. You're right that 4K is not for gaming at the moment, but you don't buy monitors every year (at least I don't), and in two years at most, pushing 4K in games won't be a problem anymore, even for mid-range cards. Still, there are very few 4K high-refresh monitors on the market now and they cost an arm and a leg. I wouldn't advise buying one now.
 
I decided to wait for the 32" 4K 144Hz non-HDR flat panel. I will not support these prices, especially with these drawbacks (fan, plasticky styling, blooming). If LG does a 4K 144Hz panel I am sold, because they have way better quality control. I've had so many monitors in the past and the LG ones were all flawless.
Now on the 32GK850G. Perfect monitor: super blacks, good colors out of the box (but not oversaturated), fastest VA at the moment. The only drawback is the PPI, which is the same as 24" 1080p.
 
That's the reason to get 4K: it improves text clarity by a lot. You're right that 4K is not for gaming at the moment, but you don't buy monitors every year (at least I don't), and in two years at most, pushing 4K in games won't be a problem anymore, even for mid-range cards. Still, there are very few 4K high-refresh monitors on the market now and they cost an arm and a leg. I wouldn't advise buying one now.

I doubt that unless you mean at 60Hz and not 144Hz, and not at max settings.
 
Some serious hardware in these monitors:

https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea

Also interesting is the quality of the components that went into the new HDR G-Sync module. That is some serious hardware for a monitor and makes the price not seem so crazy anymore. But everyone wants everything on the cheap these days...

Unfortunately, if Ken's right about the FPGA driving the 4K/144 G-Sync module being a $500 part, it suggests that the non-HDR version is going to be around $1500. The $100-200 G-Sync premium in early-generation models was painful enough, but $500 is just brutal. If they actually want to keep growing their share of the market, they desperately need to either create their own ASIC or do the same thing for desktop displays that they've done for laptops and implement it as custom drivers/firmware on an existing commodity panel controller.
 
I have a question for the owners of the X27 or PG27UQ: as we know, colors are worse at 144 Hz, even in SDR mode. So how does that subsampling work? Can I set 144 Hz on the monitor, cap at 120 FPS in-game, and still get full 8-bit color? Or is the degraded color active as soon as I turn on 144 Hz? I'm asking because of G-Sync + V-Sync: can I play at 120 FPS without V-Sync and full color, or do I have to use the 120 Hz mode and cap at 118 FPS so V-Sync doesn't kick in? In other words, is the subsampling applied on the fly, or do I get the worse colors whenever the 144 Hz OC is enabled on the monitor, even with lower or capped framerates?
 
Unfortunately, if Ken's right about the FPGA driving the 4K/144 G-Sync module being a $500 part, it suggests that the non-HDR version is going to be around $1500. The $100-200 G-Sync premium in early-generation models was painful enough, but $500 is just brutal. If they actually want to keep growing their share of the market, they desperately need to either create their own ASIC or do the same thing for desktop displays that they've done for laptops and implement it as custom drivers/firmware on an existing commodity panel controller.

Got to be a production volume/complexity/longevity question.

No doubt they could build an ASIC, but it would still be expensive on a per-unit basis given the performance requirements; given the prices of the displays involved, the volume will likely not be there; and, as has been somewhat ignorantly discussed above, we still need another advancement in controller/cable technology, which puts a short shelf life on anything developed today.
 
G-Sync is just an interim solution. The entire industry is a fucking mess and disaster right now.

Everything other than the computer industry standardized around HDMI and home theater receivers. That's exactly what PCs should be doing, too. HDMI 2.1 should completely replace FreeSync and G-Sync and everything should be using the variable refresh standard there, and you should just be able to hook your computer monitor into a home theater receiver's HDMI and drive everything through that.

The current situation of having "phantom monitors" because your display uses DisplayPort and your audio uses HDMI is such a fucking farce. Only PC enthusiasts have enough patience and tolerance for horseshit like this. It's just disgraceful. These shitty industries need to clean up their acts.
 
G-Sync is just an interim solution.

It's also been the superior technological solution since release. FreeSync has almost caught up, and we've yet to see where HDMI VRR really stands; in both cases, a lot depends not on the standard but on the implementations. See the shit-show that FreeSync was pre-FreeSync 2.
 
G-Sync is superior, yeah, but IIRC the main things that made it superior were frame doubling, which FreeSync now has (LFC), and variable overdrive, which FreeSync still doesn't have AFAIK, though that's also an LCD-specific feature. In the end, I'd rather have Nvidia GPUs be compatible with HDMI 2.1 VRR so I can use an OLED TV, at which point I don't care about overdrive at all. Unfortunately it seems like we're going to get a continuing shit show instead of any of that.
 
I found out why my HDR games look relatively "washed out" compared to SDR. It appears there is something broken in the chain of game -> driver/NVIDIA control panel -> Windows 10 -> HDR display. Normally, if you set digital vibrance above 50% in the NVIDIA control panel, it will start to clip bright colors on a color chart. With Windows 10 in HDR mode (or in HDR games), that doesn't happen until above 65%. Digital vibrance set on the desktop actually affects full-screen games, so digital vibrance at 65% for HDR games gives roughly the correct color when tested against SDR at 50%. You can switch digital vibrance automatically for gaming with a utility called VibranceGUI.
Now my HDR games look as they are supposed to, absolutely glorious!
 
Got to be a production volume/complexity/longevity question.

No doubt they could build an ASIC, but it would still be expensive on a per-unit basis given the performance requirements; given the prices of the displays involved, the volume will likely not be there; and, as has been somewhat ignorantly discussed above, we still need another advancement in controller/cable technology, which puts a short shelf life on anything developed today.

The proliferation of much cheaper FreeSync displays, combined with Nvidia themselves implementing G-Sync for laptops using custom firmware on a commodity ASIC controller, says that there's no reason they couldn't make it work at a much cheaper price using something other than an FPGA.
 
The proliferation of much cheaper FreeSync displays, combined with Nvidia themselves implementing G-Sync for laptops using custom firmware on a commodity ASIC controller, says that there's no reason they couldn't make it work at a much cheaper price using something other than an FPGA.

For previous versions sure, but this is a bit more 'bleeding edge'.

And it's not like I don't want them to develop said ASIC- I just understand why they may not yet have.
 
Wouldn't continued sales of G-Sync make it cheaper though via production volume? By that I mean, mass manufacturing makes it cheaper, like we see with a lot of other products?
 
Wouldn't continued sales of G-Sync make it cheaper though via production volume? By that I mean, mass manufacturing makes it cheaper, like we see with a lot of other products?

There has to be a significant 'scale' for there to be economy of scale effects. To compare with FreeSync, FreeSync is technologically simpler (and technically inferior), primarily because it is a slight tweak on the power-saving mode that was already in use. This made it inexpensive to implement in a cost/benefit scenario for ASIC makers.

I think one of the issues with G-Sync is that balance of volume; Nvidia is in it to make money, after all, and there's no guarantee that if they invest in producing an ASIC with a lower unit cost that the sales would cover their investment.

Which is why they may wait until we have a more 'future-proof' DisplayPort spec before doing so.
 
I doubt that unless you mean at 60Hz and not 144Hz, and not at max settings.
Not everybody has to play on max settings, you know; I for one never do unless it's a ten-year-old game. If you want max settings and high refresh, you'll pony up the cash for the latest and greatest GPU anyway. Also, you don't need 144 Hz in every game. I would have liked it if monitor manufacturers offered integer scaling and higher refresh at lower resolutions, but they don't seem to give a crap about usability, only their pockets...
 
Not everybody has to play on max settings, you know; I for one never do unless it's a ten-year-old game. If you want max settings and high refresh, you'll pony up the cash for the latest and greatest GPU anyway. Also, you don't need 144 Hz in every game. I would have liked it if monitor manufacturers offered integer scaling and higher refresh at lower resolutions, but they don't seem to give a crap about usability, only their pockets...

But this monitor is 120-144Hz...why would you not want to achieve that frame rate or at least close to it? That's one of the selling points of the monitor.

And I'd much rather increase graphics options at a resolution such as 1440p over driving a 4K resolution with medium graphics on a 27" screen. Why would I want to play a game at medium settings and at 4K with HDR? On a 27" monitor graphics options will usually make a much greater impact on visuals than bumping up to a 4K resolution; and the 4K resolution will also be a much greater performance hit than the vast majority of graphics options.

But then again...if you can drop $2k on this monitor, I assume you can also afford SLI 1080 Ti's or a Titan V to drive this monitor as intended.
 
But this monitor is 120-144Hz...why would you not want to achieve that frame rate or at least close to it? That's one of the selling points of the monitor.
Agreed, my comment wasn't geared toward this particular monitor, but at 4K in general. In time, driving it even at high refresh rates will become affordable, and 4K really shines for work.
 
I have a question for the owners of the X27 or PG27UQ: as we know, colors are worse at 144 Hz, even in SDR mode. So how does that subsampling work? Can I set 144 Hz on the monitor, cap at 120 FPS in-game, and still get full 8-bit color? Or is the degraded color active as soon as I turn on 144 Hz? I'm asking because of G-Sync + V-Sync: can I play at 120 FPS without V-Sync and full color, or do I have to use the 120 Hz mode and cap at 118 FPS so V-Sync doesn't kick in? In other words, is the subsampling applied on the fly, or do I get the worse colors whenever the 144 Hz OC is enabled on the monitor, even with lower or capped framerates?
On the PG27UQ at least, the monitor automatically switches to subsampling based on the resolution/refresh settings. So if you have it set to 144 Hz, it is going to use 4:2:2 subsampling regardless. I couldn't get 144 Hz to activate on my PG27UQ anyway, but I am running mine at 98 Hz because none of my games can reach framerates higher than that without compromising image quality, so I may as well set it universally and take advantage of full chroma in HDR games. I noticed some loss in mouse precision on the desktop at first compared to 144 Hz on my PG278Q, but it doesn't bother me anymore. More importantly, all the games I've played so far are smooth and beautiful.
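For anyone wondering why these panels fall back to 4:2:2 at high refresh, a rough back-of-the-envelope bandwidth check makes it clear. This is only a sketch: it ignores blanking intervals and assumes DisplayPort 1.4's roughly 25.92 Gbit/s of effective payload bandwidth, so the exact cutover points in the real monitor will differ slightly.

```python
# Rough DisplayPort 1.4 bandwidth check (simplified: blanking/overhead ignored).
# HBR3 is 32.4 Gbit/s raw; about 25.92 Gbit/s remains after 8b/10b coding.
DP14_GBPS = 25.92

def required_gbps(width, height, refresh_hz, bits_per_channel, chroma="4:4:4"):
    """Approximate video data rate in Gbit/s for a given mode."""
    # RGB/4:4:4 carries 3 samples per pixel; 4:2:2 averages 2 per pixel
    # (full-resolution luma, half-resolution chroma).
    samples_per_pixel = 3 if chroma == "4:4:4" else 2
    return width * height * refresh_hz * bits_per_channel * samples_per_pixel / 1e9

modes = [
    ("4K 144 Hz 10-bit RGB",   (3840, 2160, 144, 10, "4:4:4")),
    ("4K 144 Hz 10-bit 4:2:2", (3840, 2160, 144, 10, "4:2:2")),
    ("4K 120 Hz  8-bit RGB",   (3840, 2160, 120,  8, "4:4:4")),
    ("4K  98 Hz 10-bit RGB",   (3840, 2160,  98, 10, "4:4:4")),
]

for name, args in modes:
    gbps = required_gbps(*args)
    verdict = "fits" if gbps <= DP14_GBPS else "exceeds DP 1.4"
    print(f"{name}: {gbps:5.1f} Gbit/s -> {verdict}")
```

Which lines up with how the monitor behaves: 10-bit 4:4:4 tops out around 98 Hz, 8-bit RGB is fine at 120 Hz, and 144 Hz forces 4:2:2.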
But this monitor is 120-144Hz...why would you not want to achieve that frame rate or at least close to it? That's one of the selling points of the monitor.

And I'd much rather increase graphics options at a resolution such as 1440p over driving a 4K resolution with medium graphics on a 27" screen. Why would I want to play a game at medium settings and at 4K with HDR? On a 27" monitor graphics options will usually make a much greater impact on visuals than bumping up to a 4K resolution; and the 4K resolution will also be a much greater performance hit than the vast majority of graphics options.

But then again...if you can drop $2k on this monitor, I assume you can also afford SLI 1080 Ti's or a Titan V to drive this monitor as intended.
I'm getting a lot out of this monitor right now with a single Titan X, and I plan to keep it for at least as long as my PG278Q (four years). The most I've had to compromise is turning some options down from "Ultra" or "Very High" to "High," and for the most part I can't tell what I'm missing by doing so. The pixel density of the display more than makes up for any potential loss in the big performance killers like shadows or particle density. So I'd say a single 1080 Ti is enough to enjoy this monitor right now.
 
I'm getting a lot out of this monitor right now with a single Titan X, and I plan to keep it for at least as long as my PG278Q (four years). The most I've had to compromise is turning some options down from "Ultra" or "Very High" to "High," and for the most part I can't tell what I'm missing by doing so. The pixel density of the display more than makes up for any potential loss in the big performance killers like shadows or particle density. So I'd say a single 1080 Ti is enough to enjoy this monitor right now.

What performance were you getting, and what games were you playing on your Titan X with Very High and High settings? When I tested on my 1080 Ti, Very High/High settings in most modern games gave me 80-100 FPS at 4K. It wasn't until I dropped to medium settings at 4K that I began to reach 120-144 FPS.
 
What performance were you getting, and what games were you playing on your Titan X with Very High and High settings? When I tested on my 1080 Ti, Very High/High settings in most modern games gave me 80-100 FPS at 4K. It wasn't until I dropped to medium settings at 4K that I began to reach 120-144 FPS.
I'm capping at 98 Hz.

I didn't have to change any settings in Mass Effect: Andromeda or Far Cry 5 compared to my PG278Q. ME:A ran 60-90 FPS at 1440p and 30-60 FPS at 4K. FC5 ran 100-150 FPS at 1440p and runs 50-80 FPS at 4K.

I had to take Final Fantasy XV down from the "Highest" preset to "High" to get acceptable framerate. It ran around 70 FPS using the Highest preset at 1440p, but frequently dipped into the 20s at 4K. It runs around 50 FPS with the High preset.
 
Let me ask another dumb question. Assuming someone actually throws down for one of these regardless of the dubious (IMO) cost... if you're rolling with a 980Ti or even a 1080ti, there's no way you're going to get the most out of these specs. So do you just switch on Gsync, run at 4K ac
I'm capping at 98 Hz.

I didn't have to change any settings in Mass Effect: Andromeda or Far Cry 5 compared to my PG278Q. ME:A ran 60-90 FPS at 1440p and 30-60 FPS at 4K. FC5 ran 100-150 FPS at 1440p and runs 50-80 FPS at 4K.

I had to take Final Fantasy XV down from the "Highest" preset to "High" to get acceptable framerate. It ran around 70 FPS using the Highest preset at 1440p, but frequently dipped into the 20s at 4K. It runs around 50 FPS with the High preset.

Here's a stupid question but as good a place as any. In the situation you're talking about here... where does G-sync kick in and what does it do in a situation like this?
 
I'm capping at 98 Hz.

I didn't have to change any settings in Mass Effect: Andromeda or Far Cry 5 compared to my PG278Q. ME:A ran 60-90 FPS at 1440p and 30-60 FPS at 4K. FC5 ran 100-150 FPS at 1440p and runs 50-80 FPS at 4K.

I had to take Final Fantasy XV down from the "Highest" preset to "High" to get acceptable framerate. It ran around 70 FPS using the Highest preset at 1440p, but frequently dipped into the 20s at 4K. It runs around 50 FPS with the High preset.

Ah ok, I tend to target the max refresh rate of the monitor (100 Hz on my X34 and 144 Hz [OC'd] on the X27). I think 100 Hz is the sweet spot before refresh rate hits diminishing returns for me. My 1080 Ti is perfect for 3440x1440 at 100 Hz and mostly max settings, but it struggles to retain the same settings and/or performance once bumped up to 4K.

Pixel density is nice, but the difference between 109 and 163 PPI was not as large as I thought it'd be. I sit about 2-2.5 feet away from the monitor, if that makes any difference. I ended up having to increase the desktop scaling because everything seemed a tad too small.
 
Let me ask another dumb question. Assuming someone actually throws down for one of these regardless of the dubious (IMO) cost... if you're rolling with a 980Ti or even a 1080ti, there's no way you're going to get the most out of these specs. So do you just switch on Gsync, run at 4K ac


Here's a stupid question but as good a place as any. In the situation you're talking about here... where does G-sync kick in and what does it do in a situation like this?
There is an optimal range where G-Sync is the smoothest. Capped at 144 Hz this was 80-110 FPS on my PG278Q. I haven't narrowed it down on my PG27UQ yet, but I seem to be getting the best experience in the 50-70 FPS range, which would match the 50-70% of max refresh rate I've experienced thus far. But I've always said that with G-Sync you don't have to worry about trying to hit the maximum refresh rate so long as you're happy with the image quality. Just make sure you never go below 30 FPS, as that is the floor for G-Sync.
Ah ok, I tend to target the max refresh rate of the monitor (100 Hz on my X34 and 144 Hz [OC'd] on the X27). I think 100 Hz is the sweet spot before refresh rate hits diminishing returns for me. My 1080 Ti is perfect for 3440x1440 at 100 Hz and mostly max settings, but it struggles to retain the same settings and/or performance once bumped up to 4K.

Pixel density is nice, but the difference between 109 and 163 PPI was not as large as I thought it'd be. I sit about 2-2.5 feet away from the monitor, if that makes any difference. I ended up having to increase the desktop scaling because everything seemed a tad too small.
With G-Sync you don't have to worry about hitting the max framerate, as I explain above, with 50-70% of your refresh rate being the sweet spot in my experience. The pixel density is pretty amazing in my opinion compared to the 109 PPI I just came from. Windows defaulted to 150% scaling automatically when I plugged this monitor in, and it seems perfect to me; considering 163 PPI is about 150% of 109 PPI, that makes sense. And compared to the DPI scaling I've dealt with in the past, everything I've used so far seems to respect the scaling properly.
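For anyone who wants to sanity-check those numbers, here's a quick sketch of the arithmetic behind the 109 vs 163 PPI figures, the 150% scaling default, and the pixel-count comparison in the next post. It uses the standard diagonal-based PPI formula; perceived sharpness obviously also depends on viewing distance.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi_1440p_27 = ppi(2560, 1440, 27)   # ~109 PPI (27" 1440p, e.g. a PG278Q)
ppi_4k_27    = ppi(3840, 2160, 27)   # ~163 PPI (27" 4K, e.g. the X27/PG27UQ)

print(f'27" 1440p: {ppi_1440p_27:.0f} PPI')
print(f'27" 4K:    {ppi_4k_27:.0f} PPI')

# Windows' 150% scaling default matches the PPI ratio almost exactly:
print(f"Scaling to keep UI elements the same physical size: {ppi_4k_27 / ppi_1440p_27:.0%}")

# Total pixel counts: 4K is 2.25x 1440p, i.e. about 125% more pixels.
print(f"Pixel ratio: {3840 * 2160 / (2560 * 1440):.2f}x")
```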
 
PPI isn't the only way to look at clarity. I'd go by pixel count: 4K has about 125% more pixels than 1440p, 8.3 million versus 3.68 million. The difference is quite startling once you get used to 27" 4K. Looking at a 27" 1440p display after getting used to 4K, it looks absolutely terrible.
 
It took 7 business days and 2 calls to Newegg, but I finally got my money back for this monitor. Looking forward to the next latest-and-greatest one to come out. From the looks of it, it will be Mini LED and, fingers crossed, a better fan or cooling solution. The biggest thing I'm hoping for is HDMI 2.1 or DP 1.5, where the bandwidth will allow something better than 98 Hz HDR without compromising picture quality. http://www.guru3d.com/news-story/mi...may-reach-market-at-the-end-of-this-year.html

Vega, I'm sure I will be seeing you then, haha.
 
It took 7 business days and 2 calls to Newegg, but I finally got my money back for this monitor. Looking forward to the next latest-and-greatest one to come out. From the looks of it, it will be Mini LED and, fingers crossed, a better fan or cooling solution. The biggest thing I'm hoping for is HDMI 2.1 or DP 1.5, where the bandwidth will allow something better than 98 Hz HDR without compromising picture quality. http://www.guru3d.com/news-story/mi...may-reach-market-at-the-end-of-this-year.html

Vega, I'm sure I will be seeing you then, haha.

The next DisplayPort spec is going to be a ways out. In January of this year VESA stated they intended to have the spec finished within 18 months, i.e. mid-2019. Then you've got the delay from spec completion to actual product launches; HDMI 2.1 is a year and a half and counting on that front. DP 1.5 may move faster, since it'll be a much smaller jump over the previous state of the art than HDMI 2.1 was, but I really doubt we'll see anything supporting it before 2020.
 
With higher-bandwidth transmission, faster controller chips, and brighter backlights with more LEDs, fans aren't going anywhere. They will actually become ubiquitous in computer monitors, like they have in high-end LCD TVs. You simply cannot passively cool the amount of wattage going through these displays.

And picture quality is in no noticeable way compromised by running HDR at 4K 144 Hz 10-bit 4:2:2 in games or movies. I've done a side-by-side comparison between the X27 and the PG27UQ, one set at 98 Hz 4K 10-bit 4:4:4 and the other at 144 Hz 10-bit 4:2:2, and you cannot tell the difference. Only with text on the desktop can you tell, and 120 Hz 8-bit full-range RGB works brilliantly there. 4:2:2 at 144 Hz for games and movies is a total non-issue. I'd wager 100% of people would fail a side-by-side test of telling which one is which in a game.
 
With higher-bandwidth transmission, faster controller chips, and brighter backlights with more LEDs, fans aren't going anywhere. They will actually become ubiquitous in computer monitors, like they have in high-end LCD TVs. You simply cannot passively cool the amount of wattage going through these displays.

I strongly disagree with the first part of your conclusion. These displays need active cooling because they're being driven by a really large FPGA. One of the downsides of using an FPGA instead of an ASIC (in addition to cost) is that its power efficiency stinks. Do a proper controller and the need for active cooling will go away.

It's possible that backlight power may become an issue. OTOH, the cult of insanely-thin-at-all-costs is less strong in the PC market (not to mention it precludes the extra space needed in strategic spots to install frag-harder lights); building a chassis substantial enough for passive cooling is a lot more practical if you're allowed to make something a few mm thicker than the absolute minimum possible.
 
NVIDIA is not going to redesign their HDR G-Sync chip anytime soon after spending millions of dollars developing it. So fans are here to stay on all HDR G-Sync monitors moving forward for the foreseeable future. Heck, the PG27UQ actually has two fans, one for the G-Sync module and one for the chassis/FALD.

 
NVIDIA is not going to redesign their HDR G-Sync chip anytime soon after spending millions of dollars developing it. So fans are here to stay on all HDR G-Sync monitors moving forward for the foreseeable future. Heck, the PG27UQ actually has two fans, one for the G-Sync module and one for the chassis/FALD.


I don't care if they use fans, as long as the design and quality of the fans isn't so low-end that it causes more noise than necessary. I highly doubt there isn't any room for improvement in the cooling design and fans the X27 used; it and the ASUS equivalent are the first revisions of this type of monitor. It would not surprise me if more efficient cooling solutions are implemented in later revisions.
 
I'm in when they release the 3840x1600 21:9 version running true 4:4:4 at 144 Hz. 21:9 is too good to give up IMO, having tried going back to 16:9. Plus it's roughly 26% fewer pixels to drive than 4K (6.1 million vs 8.3 million), and even with a Titan V pushing high refresh at this resolution, that will help a lot.
 
So for those of us with deep pockets, is it worth the toy factor, or is it a hard pass? It wouldn't be the most expensive screen I've ever bought. (NEC3090WQXi, which I am still using at this very moment actually.)
 
So for those of us with deep pockets, is it worth the toy factor, or is it a hard pass? It wouldn't be the most expensive screen I've ever bought. (NEC3090WQXi, which I am still using at this very moment actually.)
I probably would've kept it if it weren't for the fan noise. Maybe it could've been something I got used to, but who knows.
 
I probably would've kept it if it weren't for the fan noise. Maybe it could've been something I got used to, but who knows.

My NEC3090WQXi has one or two fans, I never even realized until my ear was right by the top one day a couple years later. Seems like maybe the Acer is more obnoxious though.
 
If I could get the $900 back I spent on my 35" VA ultrawide I'd sooooo have one of these on the way.

Side note: I know this isn't a good solution for most people, but ya know what completely kills fan noise? Noise canceling headphones. I use Bose quiet comfort 20's when I game on my laptop and you can't hear a thing from any fans. Try it!
 