ASUS announces Swift PG27UQ 4K IPS 144Hz G-Sync HDR monitor

I see someone has a penchant for hyperbole. A budget CRT from the '90s was a 15" 800x600 60 Hz flickering 4:3 display. A 27" 4K 144 Hz display with FALD and HDR can't compete with that? lol. OLED doesn't have any crippling, ridiculous problems. The only reason the motion quality isn't better is the way it's packaged in today's 60 Hz TVs, not the technology itself. OLED picture quality absolutely destroys CRT.



If the FALD is driven from the previous complete frame, the backlight will lag the image. There is a reason why on every TV made, turning off FALD is a requirement for any decent input lag. I'm just wondering what the geniuses at NVIDIA came up with to minimize the lag.


It seems like they could just control the FALD on the fly with essentially no latency, setting each zone's brightness as it gets the data, just like it sets the color of each pixel as it gets that data.
If each pixel has a brightness value, it just keeps a running value for each zone and, once all the pixels for that zone have been received, adjusts the zone's brightness. It would basically be driven during scanout like the pixels are, only instead of millions of pixels it would be hundreds of lighting zones.
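Rough sketch of what I mean, assuming a hypothetical 24x16 grid (384 zones, like the PG27UQ is supposed to have) and a simple "zone brightness = brightest pixel in the zone" policy:

```python
# Sketch: driving FALD zones during scanout instead of waiting for a full frame.
# Hypothetical 3840x2160 panel with a 24x16 zone grid (384 zones).

import numpy as np

W, H = 3840, 2160            # panel resolution
COLS, ROWS = 24, 16          # zone grid
ZONE_W, ZONE_H = W // COLS, H // ROWS

def scanout(frame, set_zone_row):
    """frame: H x W array of pixel luminance (0.0-1.0), arriving line by line."""
    zone_peak = np.zeros((ROWS, COLS))          # running peak per zone
    for y in range(H):
        zr = y // ZONE_H                        # zone row this scanline belongs to
        # fold this scanline's peaks into the 24 zones of that row
        line_peaks = frame[y].reshape(COLS, ZONE_W).max(axis=1)
        zone_peak[zr] = np.maximum(zone_peak[zr], line_peaks)
        # once the last scanline of a zone row arrives, its brightness is final:
        # commit that row of backlight zones immediately, not a frame later.
        if (y + 1) % ZONE_H == 0:
            set_zone_row(zr, zone_peak[zr])
```

Worst case, a zone's backlight value lands about one zone row (~135 scanlines, well under a millisecond at 144 Hz) after its first pixel arrives, instead of a whole frame behind. No idea if that's what NVIDIA's module actually does, but it shows there's no inherent need to buffer a full frame just to drive the backlight.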
 
I see someone has a penchant for hyperbole. A budget CRT from the '90s was a 15" 800x600 60 Hz flickering 4:3 display. A 27" 4K 144 Hz display with FALD and HDR can't compete with that? lol. OLED doesn't have any crippling, ridiculous problems. The only reason the motion quality isn't better is the way it's packaged in today's 60 Hz TVs, not the technology itself. OLED picture quality absolutely destroys CRT.

That's bullcrap. I had a cheap display in 1998 that ran at 100hz and 1600x1200.
 
That's bullcrap. I had a cheap display in 1998 that ran at 100hz and 1600x1200.
I had a Sun Microsystems GDM-5010PT that did 85 Hz at 1600x1200 and it certainly wasn't cheap. But I paid $100 for it in a fire sale in 2004 from a LAN center that was going out of business after Hurricane Charley. While the Trinitron tubes were amazing for the time, my PG278Q beats it in every way but viewing angles.
 
Except you are comparing a 2017 Ferrari to a 2007 Toyota. This new monitor completely blows everything out of the water. But remember there is also an Acer version in the works.

Honestly, I would pay $2,500 to $3,000 if it were 32". But I know they stopped at 27" because even at $2,000, sales numbers were going to drop drastically as fewer and fewer people could afford it. Going larger probably wasn't an option as far as the bean counters were concerned.

I don't know where you get the idea that the old monitors are the Toyota and the new one is the Ferrari.
The only additions over previous models are 144 Hz at 4K, which is simply not useful, and HDR.

HDR is awesome but surely doesn't justify a 220% price increase.
 
I will buy this thing and be grateful if it ACTUALLY comes out (ahem dell OLED)

People have zero reason to complain about the price, there is plenty of time between now and October 2017 to raise funds by selling plasma and hooking the streets for spare cash!

This reasoning is pretty simplistic.
It's not a question of having the money or not; it's simply a question of how much you're willing to spend on a gaming monitor that adds two features over the previous model.
 
The left display looks like ULMB is on, which naturally decreases perceived brightness.

For $2k I'd rather get a large OLED TV... For $1,199 USD this would have been a day one buy for me.

Same here.
I have no problem spending €2,000 on something I like, but I don't accept being robbed in the name of fanboyism.
 
That's bullcrap. I had a cheap display in 1998 that ran at 100hz and 1600x1200.
That I find hard to believe, simply because FD Trinitron G1 chassis monitors like the Sun GDM-5410 and Dell P1110 can only hit 95 Hz max at that res, and those are the next best thing in the whole FD Trinitron lineup after the G1W chassis monitors like the FW900.

Perhaps there's a 22" Diamondtron NF that could do it, but I'm not as well-informed on those. Even then, we're not talking 1998, but roughly 2002-2003.

I had a Sun Microsystems GDM-5010PT that did 85 Hz at 1600x1200 and it certainly wasn't cheap. But I paid $100 for it in a fire sale in 2004 from a LAN center that was going out of business after Hurricane Charley. While the Trinitron tubes were amazing for the time, my PG278Q beats it in every way but viewing angles.
I doubt that the PG278Q could touch it in terms of color (even without taking the usual crappy TN viewing angles into account), black levels, input lag and non-native resolution handling.

I'll give it this, though: the PG278Q was on display at the local Micro Center a while back and looked surprisingly tolerable for a TN panel. Not tolerable enough for the price premium it's commanding, though, especially given today's AHVA options.
 
You could get a PG278Q new for $650 at one point. Considering the price of high-end GPUs, I don't find that crazy anymore. Of course, they were $800 or so at launch. I think the FW900, the 24" widescreen Trinitron graphics-professional CRT, was over $3,000 in today's dollars when it was released.

There are budget monitor options for budget-minded people, even if at 1080p. Like anything else, the bleeding edge carries the highest price and it trickles down over the years. You could probably score a 1080p static-Hz monitor cheap now, or even a 1440p static-Hz or 60 Hz one. I think a good 1080p 144 Hz G-Sync monitor is around $300 now too, and the 21:9s have come down in price considerably since they came out (owing to the 100 Hz versions' release).

As shown in my previous posts, you pretty much need dual 1080s in SLI on the most demanding games, even running 1440p upscaled on these monitors, to get a 100 fps-Hz average, the point where the higher refresh rate is utilized appreciably throughout the frame rate graph. (The only outlier I saw among demanding games was GTA V, which could get almost 100 fps-Hz at 4K with dual GTX 1080s.) In either of those GPU scenarios, you are talking about $1,200+ in GPU cost.


That's what G-Sync is for, though it's not a "fix" for low frame rates IMO. I'd shoot for a 100 fps-Hz average as a target, which means I'd probably have to run 1440p upscaled on the most demanding games when I upgrade. Looking up 1080 SLI vs. a single Titan: on a lot of games it's about 16 fps less on the Titan at 1440p, GTA V is a 20 fps difference, and at 4K GTA V is 94.9 fps vs. 62.8 fps on the single Titan.

GTX 1080 SLI vs. single Titan (Pascal):
Witcher 3: 1440p 112.3 / 96.6 fps ... 4K 75.6 / 57.6 fps
GTA V: 1440p 140.4 / 120.7 fps ... 4K 94.9 / 62.8 fps

Dual cards seem more like a requirement for 4K at high fps-Hz, unsurprisingly, and even then would require dialing down a bit from the arbitrary "ultra" ceiling on the most demanding games (like Witcher 3) to hit the 100 fps-Hz average target with GTX 1080 SLI. GTA V is very close, with 1080 SLI at 4K averaging 94.9 fps. The best I could find for BF1 in SLI was an intense multiplayer benchmark showing GTX 1080 SLI at an 82.7 fps-Hz average at 4K and the Titan Pascal at a 69.2 fps-Hz average at 4K.
By upscaling 1440p you could hit the 100 fps-Hz average mark (much higher in some games) and use G-Sync to ride the graph dynamically from 70 to 100 to 140 (even 160 depending on the game). Here is an example of a frame rate graph showing fps (fps-Hz with G-Sync) at around 100 fps-Hz vs. 60 fps-Hz to give an idea of the actual ranges.

Blur reduction / motion clarity increase (vs. 60 fps-Hz) at a 100 fps-Hz average:
0% <- 20% (~80 fps-Hz) <-- <<40% (100 fps-Hz)>> --> 50% (120 fps-Hz) -> 60% (144 fps-Hz)
And motion definition / path articulation / smoothness-wise:
1:1 <- ~1.3:1 (~80 fps-Hz) <-- <<5:3 (100 fps-Hz)>> --> 2:1 (120 fps-Hz) -> 2.4:1 (144 fps-Hz)

So even with G-Sync, if you are around a 60 fps-Hz average, your actual graph is more like 30 to 90, which means you are only hitting 20-30% blur reduction in the top third of a wildly dynamic graph. You'd get a dynamic, "vibrating" mix of that top third blended into two thirds of full blur and very low motion definition, and that top third is only a 20-30% reduction at that. That is why I shoot for a 100 fps-Hz average as a target, so my range follows the example range I posted above: that way G-Sync gives me 100 to 144 fps across two thirds of the graph in the mix/blend. A 120 or 140 fps-Hz average would obviously be even better when possible, like GTA V can do at 1440p, so the low end blended in would only dip to 90 or 110 fps-Hz, respectively.
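For reference, the percentages in that scale just fall out of sample-and-hold persistence math relative to a 60 fps-Hz baseline: blur reduction ≈ 1 − 60/fps and motion definition = fps/60 (so my ~20% figure at 80 fps-Hz is actually a bit conservative). Quick sketch of the arithmetic:

```python
# Sketch: sample-and-hold blur reduction and motion-definition ratio,
# assuming a 60 fps-Hz baseline and blur proportional to frame persistence.

BASELINE = 60.0

def blur_reduction(fps):
    """Persistence shrinks with frame time, so blur drops by 1 - baseline/fps."""
    return 1.0 - BASELINE / fps

def motion_definition(fps):
    """Unique frame positions drawn per baseline frame (path articulation)."""
    return fps / BASELINE

for fps in (80, 100, 120, 144):
    print(f"{fps:>3} fps-Hz: {blur_reduction(fps):4.0%} less blur, "
          f"{motion_definition(fps):.2f}:1 motion definition")
# 80 fps-Hz:  25% less blur, 1.33:1
# 100 fps-Hz: 40% less blur, 1.67:1  (i.e. 5:3)
# 120 fps-Hz: 50% less blur, 2.00:1
# 144 fps-Hz: 58% less blur, 2.40:1
```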
 


When it was shown next to the other monitor at an angle the black levels on the PG27UQ looked much better and I couldn't see the back light shining through like the other one. Very Impressive! Linus was mentioning DP 1.4 and that the cards in the demo system may have been Quadros. Does this mean Quadros are the only cards with full DP 1.4 support? I looked up the GTX 1080 specs and it says it's "DP 1.3/1.4 Ready" but it's only DP 1.2 certified. I'm guessing the GTX 1080 would still work fine with this monitor?
 
Yes, Pascal GPUs will run DP 1.3/1.4. But Volta will be out by the time this monitor releases anyway. I plan on getting SLI Volta once this display hits.
 
Any word on hdcp on these?

That's a good question, since you could watch 4K HDR UHD Blu-rays on this display if it has an HDMI 2.0 port with HDCP. I'd assume NVIDIA made that happen whilst designing this new DP 1.4 G-Sync chip (and how the HDMI port is configured in the G-Sync package), but we all know where assumptions get you...
 
That's a good question, since you could watch 4K HDR UHD Blu-rays on this display if it has an HDMI 2.0 port with HDCP. I'd assume NVIDIA made that happen whilst designing this new DP 1.4 G-Sync chip (and how the HDMI port is configured in the G-Sync package), but we all know where assumptions get you...

There's that, and I'd love to pair my PS4 Pro with it. Also, which "HDR" standard(s) does it support?
 
Being 1,000 nits with P3 color, it could conform to the UHD Alliance's HDR Premium label as long as its black depth is deeper than 0.05 nits (via its FALD), among other things. Hopefully the combination of quantum dot and the large number of dimming zones will allow it to have deep enough blacks considering it isn't a VA panel.

Requirements it would have to meet:

• Image Resolution: 3840×2160
• Color Bit Depth: 10-bit signal
• Signal Input: BT.2020 color representation
• Display Reproduction: More than 90% of P3 colors
• Color Palette (Wide Color Gamut)
• More than 1000 nits peak brightness and less than 0.05 nits black level
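
Put as a quick checklist (just a sketch; the PG27UQ inputs below are announced or assumed figures, and its real black level and measured P3 coverage aren't confirmed yet):

```python
# Sketch: checking a display spec against the Ultra HD Premium device
# requirements listed above.

def meets_uhd_premium(width, height, bit_depth, p3_coverage, peak_nits, black_nits):
    resolution_ok = width >= 3840 and height >= 2160
    signal_ok = bit_depth >= 10                  # 10-bit signal
    gamut_ok = p3_coverage > 0.90                # more than 90% of P3
    # Either brightness/black-level combination qualifies; the quoted spec says
    # "more than 1000 nits", treated here as a 1,000-nit floor.
    hdr_ok = ((peak_nits >= 1000 and black_nits < 0.05) or
              (peak_nits >= 540 and black_nits < 0.0005))
    return resolution_ok and signal_ok and gamut_ok and hdr_ok

# Announced: 3840x2160, 10-bit, 1,000 nits; P3 coverage and black level are guesses.
print(meets_uhd_premium(3840, 2160, 10, 0.95, 1000, 0.04))   # True only if the guesses hold
```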




http://www.uhdalliance.org/uhd-alliance-press-releasejanuary-4-2016/

UHD Alliance Technical Specifications Overview
The UHD Alliance has developed three specifications to support the next-generation premium home entertainment experience. The three specifications cover the entertainment ecosystem in the following categories:
• Devices (currently, television displays, with other devices under consideration)
• Distribution
• Content
A high level overview of each technical specification can be found below. Please join the UHD Alliance for full access to all technical and test specifications.

Devices
The UHD Alliance supports various display technologies and consequently, have defined combinations of parameters to ensure a premium experience across a wide range of devices. In order to receive the UHD Alliance Premium Logo, the device must meet or exceed the following specifications:
• Image Resolution: 3840×2160
• Color Bit Depth: 10-bit signal
• Color Palette (Wide Color Gamut)
• Signal Input: BT.2020 color representation
• Display Reproduction: More than 90% of P3 colors
• High Dynamic Range
• SMPTE ST2084 EOTF
• A combination of peak brightness and black level either:
• More than 1000 nits peak brightness and less than 0.05 nits black level
OR
• More than 540 nits peak brightness and less than 0.0005 nits black level

Distribution
Any distribution channel delivering the UHD Alliance content must support
• Image Resolution: 3840×2160
• Color Bit Depth: Minimum 10-bit signal
• Color: BT.2020 color representation
• High Dynamic Range: SMPTE ST2084 EOTF

Content Master
• The UHD Alliance Content Master must meet the following requirements:
• Image Resolution: 3840×2160
• Color Bit Depth: Minimum 10-bit signal
• Color: BT.2020 color representation
• High Dynamic Range: SMPTE ST2084 EOTF

The UHD Alliance recommends the following mastering display specifications:
• Display Reproduction: Minimum 100% of P3 colors
• Peak Brightness: More than 1000 nits
• Black Level: Less than 0.03 nits

The UHD Alliance technical specifications prioritize image quality and recommend support for next-generation audio.

UHDA Member Companies
Company Membership Level
The DIRECTV Group, Inc. Board Member Company
Dolby Laboratories, Inc. Board Member Company
LG Electronics Board Member Company
Netflix, Inc. Board Member Company
Panasonic Corporation Board Member Company
Samsung Electronics Corporation Board Member Company
Sony Corporation Board Member Company
Technicolor Board Member Company
The Walt Disney Studios Board Member Company
Twentieth Century Fox Board Member Company
Universal Pictures Board Member Company
Warner Bros. Entertainment Inc. Board Member Company

Amazon.com Contributor
ARRI, Inc. Contributor
Dreamworks Contributor
DTS Contributor
Fraunhofer Gesellschaft Contributor
Hisense Contributor
HiSilicon Technologies Contributor
Intel Corporation Contributor
Koninklijke Philips N.V. Contributor
MStar Semiconductor, Inc. Contributor
Nanosys Inc. Contributor
Novatek Contributor
NVIDIA Contributor
Orange Contributor
Realtek Semiconductor Corp. Contributor
Rogers Communications Contributor
Sharp Corporation Contributor
Shenzhen TCL New Technology Co., Ltd. Contributor
Sky UK Ltd Contributor
THX Ltd Contributor
Toshiba Lifestyle Products & Services Corporation Contributor
TP Vision Europe B.V. Contributor
 
There's that, and I'd love to pair my PS4 Pro with it. Also, which "HDR" standard(s) does it support?

It uses the HDR10 standard, same as the consoles.

Nvidia article: (http://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-announced-at-ces-2017)
"These G-SYNC HDR monitors also come with support for HDR10 the HDR format being adopted for PC gaming."

This contradicts one of the older Mass Effect videos where the dev said it's going to have Dolby Vision on PC.
Apparently they must have had talks in the meantime, and now they're aiming for the HDR10 standard in PC gaming.
It remains to be seen whether it will work with consoles and HDR video just like that, or whether NVIDIA has put some arbitrary proprietary lock on it so it only works for PC games (hopefully not).
 
You can't see HDR without a 1,000-nit HDR monitor, and any picture of two monitors with one brighter than the other is going to show one as much darker and less saturated than in real life due to camera bias. That picture is nothing like what it would look like in person to your eyes. I would completely disregard it.
1,000 nits is the minimum requirement for HDR and is still not true HDR by the standards that have been set; 10,000 nits is needed for a true HDR experience, which no TV or monitor currently has. There are a few TVs shown recently that reach 4,000 nits, making 1,000 nits the baseline.
 
Blockbuster movies are mastered on 4,000-nit Dolby Pulsar prototype displays.
Various HDR10 content is currently being mastered at levels between 1,000 and 4,000 nits.
From the specs presented so far, most 2017 HDR TVs are expected to reach up to around 2,000 nits this time.
Tones that extend beyond the display's range have to be remapped by the manufacturer's algorithm.
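
As a toy illustration of that remapping (not any manufacturer's actual algorithm): pass everything below a knee point through unchanged, then compress the range between the knee and the content's mastering peak into the display's remaining headroom:

```python
# Sketch: remapping tones mastered above a display's peak brightness.
# Hypothetical knee-based roll-off; every TV/monitor maker uses its own curve.

def tone_map(nits, display_peak=1000.0, master_peak=4000.0, knee=0.75):
    """Map scene luminance (nits) into a display_peak-limited range."""
    knee_nits = knee * display_peak              # below this, pass through unchanged
    if nits <= knee_nits:
        return nits
    # linearly compress [knee_nits, master_peak] into [knee_nits, display_peak]
    t = (nits - knee_nits) / (master_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

for n in (500, 1000, 2000, 4000):
    print(n, "->", round(tone_map(n)))
# 500 -> 500, 1000 -> 769, 2000 -> 846, 4000 -> 1000
```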
 
HDR Premium vs. HDR10 vs. Dolby Vision
--------------------------------------
Governance: HDR Premium: UHD Alliance standard · HDR10: open standard · Dolby Vision: proprietary
Implementation: HDR Premium / HDR10: software upgradable · Dolby Vision: SoC hardware-embedded
Bit depth: HDR Premium: 10-bit · HDR10: 10-bit · Dolby Vision: up to 12-bit
Black depth: HDR Premium: 0.05 nits min (LCD) · HDR10: ?? · Dolby Vision: ??
Peak brightness: HDR Premium: 1,000 nits min (to 4,000) · HDR10: 1,000 nits (to 4,000) · Dolby Vision: target/mastered at 4,000 nits (spec max 10,000 nits)
Metadata: HDR Premium: static · HDR10: static · Dolby Vision: static plus dynamic (scene-by-scene)

Color Gamut
Standard HD TV and Blu-ray use 8-bit color encoding and display colors in the Rec. 709 color space. Rec. 709 covers approximately 33% of the human visible spectrum.
Dolby Vision uses 12-bit color encoding and can display colors in the Rec. 2020 color space, which encompasses a bit more than 57% of the visible spectrum.
The Ultra HD Premium standard for HDR TVs stipulates that a screen must be capable of handling 10-bit color encoding and be able to display 90% of the colors in the DCI-P3 color space, which falls between the narrower gamut of Rec. 709 and the wider gamut of Rec. 2020.
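
If anyone wants to check the relative sizes themselves, the gamut triangles can be compared straight from the published CIE 1931 xy primaries (note these xy-area ratios are on a different basis than the "% of visible spectrum" figures above, which are usually quoted in CIE 1976 u'v'):

```python
# Sketch: relative area of the Rec. 709, DCI-P3 and Rec. 2020 gamut triangles
# in CIE 1931 xy, using the published primaries and the shoelace formula.

PRIMARIES = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
for name, area in areas.items():
    print(f"{name}: {area:.3f} ({area / areas['Rec. 709']:.2f}x Rec. 709)")
# Rec. 709: 0.112 (1.00x) · DCI-P3: 0.152 (1.36x) · Rec. 2020: 0.212 (1.89x)
```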

[Image: color gamut comparison chart]


Dynamic metadata
The Dolby Vision spec also allows the screen displaying a video to know about the screen used to master the scene, and can automatically account for the differences between the two displays. This leads to an image that is automatically adapted to fit the best it can to your individual display, rather than relying on assumptions made by the mastering engineer.

"Dolby Vision is a proprietary solution. To take advantage of it, you need Dolby Vision-mastered content played through a Dolby Vision-compatible player and outputted to a Dolby Vision-enabled display. This requires Dolby’s system-on-a-chip, certification process, and licensing fees–which is more expensive for manufacturers and for you."
 
I'm aware of the higher nit ceiling outlined in all three specs, and of the Dolby spec ceiling of 10,000 nits, which will stay well beyond what enthusiast-priced displays can do for the foreseeable future (Dolby Vision content is mastered at 4,000).

My point was that the picture comparison given as an example is not what it would look like in person: camera bias makes one display look drastically more "muted" than the other, and vice versa, compared to what either would look like to your eyes. And even if you had a few separate photos with some level of accuracy, you wouldn't be able to see how the difference actually looks while viewing it on your typical 400-nit-or-less monitor or TV. i.e., "Look how much brighter and more contrasted that monitor is than this monitor I'm viewing it on!" ;b

I don't know how critical the difference between 1,000 nits / 0.05 black depth and Dolby Vision (content currently mastered at 4,000 nits) would be in games for the next few years, or even what range limits games' HDR is coded to currently.

Dolby Vision sounds like G-Sync in that it needs a proprietary chip and a certification process, as well as licensing fees. It is obviously superior in range for any content mastered in those ranges, primarily the color range increase (mostly at the green end of the spectrum) and the dynamic scene-by-scene handling (for movies, not games). Mastering at 4,000 nits is a good set point, however, and all of the standards include 4,000 nits even though they don't require it, opting for a 1,000-nit minimum. The 10,000-nit Dolby Vision brightness ceiling is going to be a tech demo for a long time yet, I'm guessing, so I don't think it's relevant personally (especially considering the content is mastered at 4,000 nits).

I think both HDR standards are drastic improvements over previous tech. This monitor's max brightness is 1,000 nits, obviously, and it's designed for gaming. That's still much higher than the typical 350-nit gaming monitor, and it's the first with FALD, with a lot of zones at that.

Until this monitor's color bit depth (what does its quantum dot color equate to?) and black depth are tested, it may not pass even the HDR Premium label standard.

I'm holding my breath on the black depth and the detail in blacks it's capable of with its high number of dynamic backlight zones combined with quantum dot color tech. My interest is in its capabilities and aesthetics for gaming, even if it gets some misc. desktop/app use.
 
I really want a slightly bigger version of this. 30-32" would be perfect.
That is exactly what I was thinking, but I would have said 32-34". I have a 30" now (2560x1600) and I would really like a little more screen real estate when talking 4k. Otherwise this monitor hits all my other desired points (4K, 144hz, HDR). I guess wake me up when they can find another 7"?
 
That is exactly what I was thinking, but I would have said 32-34". I have a 30" now (2560x1600) and I would really like a little more screen real estate when talking 4k. Otherwise this monitor hits all my other desired points (4K, 144hz, HDR). I guess wake me up when they can find another 7"?

TBH I want a 40-inch version. Kinda tired of seeing such tiny 4K monitors. Sure, the higher PPI is a nice bonus, but when it comes to 4K I'll take size/immersion over PPI.
 
I agree. This tiny thing isn't even approaching consideration for me. Going for a 40-50" TV this year it looks like.
 
I agree. This tiny thing isn't even approaching consideration for me. Going for a 40-50" TV this year it looks like.

While I love my B6 OLED, the 60 Hz, high input lag, and lack of VRR kill it for me as a gaming display. TVs just lack these key features for me atm. Seems like the best bet for an immersive but also smooth gaming experience is something like the Samsung CF791 100 Hz VA ultrawide, but that monitor supposedly has quite a few problems of its own... sighhh, it looks pretty depressing this year for gaming monitors.
 
Really, if I'm wishing, I'd like all of this in a large 21:9 or even wider (32:9?), 2160 pixels high and nearly as wide as triple monitors without bezels. As long as the resolution is high enough, you could run whatever res you wanted 1:1 in a smaller or narrower window anyway. I think in the far future that's what monitors, VR, and AR will be like when a 2D viewing pane is desired: an overabundance of resolution where you just place your virtual display wherever you want in your field of vision.

I've heard nothing about any other full-featured gaming monitors (low input lag, low response time, modern gaming overdrive, G-Sync/variable Hz, 144 Hz, high resolution) having 1,000-nit peak brightness, HDR, and a high-density FALD array. This breaks new ground. No TV is going to match these features other than in physical size vs. distance. There are, however, a few 4K TVs that can take 120 Hz native input at 1080p with fairly low input lag (16-20 ms with FALD and other processing off), but they still lack all of the other modern gaming monitor features.

I've read that Samsung and AUO are going to make some other 144 Hz 3440x1440 and 2560x1440 VA gaming screens in 2017, including a 34" 2560x1440 if I remember correctly. So perhaps there will be some other full-featured gaming monitor options this year without 1,000 nits, HDR, and 384-zone FALD.

Edit: These below, among others listed at that link. I guess I was remembering the 31.5" 2560x1440 16:9 144 Hz VA due out this year (4.5" larger diagonally than a 27" 16:9). There are larger IPS ones planned too, like a 37.5" 3840x2160 at 144 Hz. My own interest was the 35" 3440x1440 VA when it was 144-200 Hz, but now it's only listed at 100 Hz on DP 1.2. None of these has 1,000 nits + FALD (and many of them are FreeSync-only), so I'm more interested in the PG27UQ now.

http://www.tftcentral.co.uk/articles/high_refresh_rate.htm

Article Change Log - Update 20/10/16

  • Updated status of 34" IPS Ultra-wide panel with 3440 x 1440 res @ 144Hz. Planned production delayed from Q1 to Q3 2017.

  • Added new detail of planned 37.5" IPS panels with 3840 x 2160 @ 144Hz

  • Correction to panel part numbers for 24.5" TN Film panels at 240Hz

  • Updated mass production dates for 240Hz TN Film panels. 24.5" now in mass production from Oct, and 27" from Nov 2016.

  • Updated mass production date for 27" 240Hz TN Film panels, Oct/Nov 2016. Panel part numbers also updated

  • Update to 35" 3440 x 1440 VA panels from AUO. 100Hz versions mass production delayed from June/July to Sept 2016. 200Hz version no longer listed (now 100Hz).

  • Update panel part for AUO 31.5" VA panel with 2560 x 1440 @ 144Hz. Mass production expectation of January 2017.
 
I agree. This tiny thing isn't even approaching consideration for me. Going for a 40-50" TV this year it looks like.

We are talking about PC monitors, peripherals generally used for PC work at a desk.
A 50-inch display is a TV and is useful for film and console-style gaming. So you don't need a PC monitor but a TV.

A PC monitor will never be 50 inches.
 
[Image: post-calibration color accuracy comparison across panels]

Pretty much anything worth buying can be calibrated to essentially perfect color accuracy, with few exceptions. As you can see from this image, even high-end TN panels can be calibrated to be as accurate as or even more accurate than some IPS panels.
The problem is most of them won't keep the calibration in games.
 
The problem is most of them won't keep the calibration in games.
There are a few ways around that. ReShade IIRC has a color profile shader; Color Clutch, which is similar to ReShade in being an injector; and I think AMD cards can maintain color/gamma ramps in fullscreen. There's also borderless windowed, which is the easiest way, and even most old games can be run in that mode with tools.
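
For what it's worth, the "gamma ramp" part is just the 3x256-entry lookup table the GPU applies on output; tools like CPKeeper / Color Clutch essentially keep re-asserting the calibrated ramp when a fullscreen game resets it, as I understand them. A minimal sketch of setting one on Windows via the Win32 SetDeviceGammaRamp call, with a made-up correction value; a real tool would load the ramp from the ICC profile's VCGT data instead:

```python
# Sketch: applying a simple gamma ramp on Windows with SetDeviceGammaRamp.
# Illustration only: the single gamma value below is made up; calibration
# software derives the real ramp from the monitor's ICC profile.

import ctypes

def build_ramp(gamma=0.92):                       # hypothetical correction factor
    ramp = (ctypes.c_ushort * (3 * 256))()
    for i in range(256):
        v = int(min(65535, ((i / 255.0) ** gamma) * 65535 + 0.5))
        ramp[i] = ramp[i + 256] = ramp[i + 512] = v   # same curve for R, G and B
    return ramp

def apply_ramp(ramp):
    user32, gdi32 = ctypes.windll.user32, ctypes.windll.gdi32
    hdc = user32.GetDC(None)                      # device context for the whole screen
    try:
        return bool(gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp)))
    finally:
        user32.ReleaseDC(None, hdc)

if __name__ == "__main__":
    print("ramp applied:", apply_ramp(build_ramp()))
```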
 
There are a few ways around that. ReShade IIRC has a color profile shader; Color Clutch, which is similar to ReShade in being an injector; and I think AMD cards can maintain color/gamma ramps in fullscreen. There's also borderless windowed, which is the easiest way, and even most old games can be run in that mode with tools.

Running games in borderless windowed mode does not add any color profile support to games, so running games in fullscreen or borderless mode is exactly the same.
 
Running games in borderless windowed mode does not add any color profile support to games, so running games in fullscreen or borderless mode is exactly the same.
It has nothing to do with adding color profile support to games, and you're completely wrong. I just tested it with CPKeeper.
 
TBH I want a 40-inch version. Kinda tired of seeing such tiny 4K monitors. Sure, the higher PPI is a nice bonus, but when it comes to 4K I'll take size/immersion over PPI.

Yup. Needs to be 34"+ to get me to replace my Acer X34 (I only use my PC for gaming and light internet browsing).

I'd say TFTCentral hit the nail on the head when they said this about it:
"Perhaps the real benefits for many will appear when we see larger screen sizes appear with 4K @ 144Hz."
 
Nice monitor, but way too small, especially for 4K. I'd prefer a 32" WQHD at 144 Hz or a 37.5" 4K at 144 Hz. This industry is really stupid...
 
It has nothing to do with adding color profile support to games, and you're completely wrong. I just tested it with CPKeeper.

Borderless windowed does not bring any improvement over fullscreen.
If the game is not color managed, there is no improvement either in fullscreen or borderless.
 
Nice monitor, but way too small, especially for 4K. I'd prefer a 32" WQHD at 144 Hz or a 37.5" 4K at 144 Hz. This industry is really stupid...

The industry follows the general consensus.
I don't like monitors bigger than 27 inches for PC usage. I use my PC at a desk with a mouse and keyboard, not on a couch, so the monitor is pretty close to my eyes, and having a monitor bigger than 27 inches ruins the experience.
 