NVIDIA Big Format Gaming Display

Now, if Samsung puts HDMI 2.1 or G-Sync on "The Wall", THAT would be nice. 146" is almost exactly the size I need :)
Not going to happen though, most likely... 99% chance it will be a $30,000-50,000 60Hz product :( If they put G-Sync or HDMI 2.1 with VRR on it and 120Hz+, I'd buy one in that price range (if it had an estimated lifespan of 10+ years, or a 5-10 year warranty / the possibility to replace single failed MicroLED modules).
Products like this always cheap out on the inputs, which I find ridiculous.

"A press event scheduled for March is said to offer more details"
 
I'll definitely be getting one to try out on release day.

The more I see of this OLED ProArt the more I question its practicality for desktop users. Seems to be a hybrid between a traditional monitor and a portable tablet.

- Visible matte anti-glare coating
- No VESA mount
- Questionable placement of USB-C and HDMI ports along the left vertical edge.

Some vids for our pleasure:



 
You're not wrong. Most likely I will punt the whole dumpster fire to 2019 and buy a PG27UQ or PG35VQ now that it seems we have confirmation those are still slated to come out this year. Kinda leaning towards the PG35VQ. You're gonna have smear on these BFGDs anyway since they're VA panel based, so might as well get 21:9 and a usable desktop size.

Well, the 21:9 is a VA panel too. Looks to be the same 35" VA panel that has been around, just with a FALD slapped on the back. So 200 Hz on a VA panel isn't really practical, as seen on the Acer Z35. Pixel speeds cannot keep up with 200 Hz. I'm leaning towards the 27" FALD, which is IPS.

The more I see of this OLED ProArt the more I question its practicality for desktop users. Seems to be a hybrid between a traditional monitor and a portable tablet.

- Visible matte anti-glare coating
- No VESA mount
- Questionable placement of USB-C and HDMI ports along the left vertical edge.

Ya, who puts a matte coating on a super high PPI OLED? Wtf.
 
Well, the 21:9 is a VA panel too. Looks to be the same 35" VA panel that has been around, just with a FALD slapped on the back. So 200 Hz on a VA panel isn't really practical, as seen on the Acer Z35. Pixel speeds cannot keep up with 200 Hz. I'm leaning towards the 27" FALD, which is IPS.
It would be interesting if they added driver-level black frame insertion at 200Hz, with the GPU pushing 100fps.
The driver could FALD-dim the backlight fully on every second refresh, letting the pixels transition to the next frame while the backlight is off, effectively hiding the slow VA response times.
It would be quite an awesome 100Hz screen with the equivalent of 5ms persistence.
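Rough napkin math for that idea (a sketch only; the assumption that persistence equals backlight-on time per displayed frame, and the 200Hz/100fps numbers, are just the ones floated above):

```python
# Sketch: treat persistence as the time the backlight is lit per displayed frame.
# A 200Hz panel showing each rendered frame for two refreshes, with the backlight
# FALD-dimmed on every second refresh, lights each frame for only one 5ms window.

def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
    """Normal sample-and-hold display: the frame is lit for the whole refresh."""
    return 1000.0 / refresh_hz

def bfi_persistence_ms(panel_hz: float, lit_refreshes_per_frame: int = 1) -> float:
    """Backlight lit for only `lit_refreshes_per_frame` refreshes of each frame."""
    return (1000.0 / panel_hz) * lit_refreshes_per_frame

print(sample_and_hold_persistence_ms(100))   # 10.0 ms at plain 100Hz
print(bfi_persistence_ms(200))               # 5.0 ms -> the "5ms persistence, 100Hz" case
```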
 
It would be interesting if they added driver-level black frame insertion at 200Hz, with the GPU pushing 100fps.
The driver could FALD-dim the backlight fully on every second refresh, letting the pixels transition to the next frame while the backlight is off, effectively hiding the slow VA response times.
It would be quite an awesome 100Hz screen with the equivalent of 5ms persistence.

If this could be made to work with G-Sync...
 
If this could be made to work with G-Sync...
Yeah.. it would face the exact same issues as ULMB+G-Sync, so probably not trivial to solve that part. There's nothing stopping it from working, although the brightness would appear to fluctuate as the framerate drops with G-Sync enabled.. but why not let the user have a checkbox: [x] enable G-Sync with BFI - warning, only use with stable high framerates.
But with these new shiny 1000-nit backlights they could probably compensate for the G-Sync+ULMB dimness problem by capping the max brightness at, say, 500 nits and using the full 1000 to compensate when the framerate drops. That would require a lot of tweaking and testing though. The "old" ULMB displays had no headroom to do that, as they were already too dim at max ULMB brightness.
On the old-gen monitors ULMB+G-Sync dimming was a problem that couldn't be solved. Now it probably can be.
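Quick sketch of that compensation idea, assuming perceived brightness of a strobed backlight is roughly peak nits times duty cycle; the pulse width, nit targets, and panel ceiling here are made-up illustration values, not anything confirmed for these monitors:

```python
# Sketch: perceived brightness of a strobed backlight ~= peak_nits * duty_cycle,
# with duty_cycle = pulse_width_ms * fps / 1000. Shows how much peak luminance a
# hypothetical 1000-nit panel would need to hold a steady 250-nit image as the
# framerate drops under VRR with a fixed strobe pulse.

PANEL_PEAK_NITS = 1000.0   # assumed FALD HDR backlight ceiling
TARGET_NITS = 250.0        # brightness we want the user to perceive
PULSE_MS = 2.5             # assumed fixed strobe pulse length per frame

def required_peak_nits(fps: float) -> float:
    duty_cycle = PULSE_MS * fps / 1000.0
    return TARGET_NITS / duty_cycle

for fps in (144, 120, 100, 80, 60):
    need = required_peak_nits(fps)
    verdict = "ok" if need <= PANEL_PEAK_NITS else "runs out of headroom, image dims"
    print(f"{fps:>3} fps: needs ~{need:4.0f} nits peak -> {verdict}")
```

With these placeholder numbers the headroom covers the drop down to about 100fps, below which the image would start dimming again, which is roughly the trade-off described above.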
 
I highly doubt screen blanking or ULMB will be used with HDR, and HDR is the future, even for still photos and the static desktop as well as YouTube and web content, as cameras and apps start using it more. Going to SDR mode would clip or blow out highlights on HDR content. 500 nits IMO isn't going to show HDR highlights well enough, and anything weaker than a VA's black depth isn't going to deliver enough black depth.

These very high density dynamic FALD backlights may improve the ~3000:1 contrast ratios of modern gaming VAs to more like 3800 - 3900:1 or more (+30% or so).. hopefully 4100+, but maybe I'm just being hopeful. A TN or IPS is usually around 850:1 to 1000:1 for reference, though the Asus/Acer 27" IPS ones with high density FALD should push that higher.. if we use a 30% improvement via high density FALD as a guesstimate, then perhaps 1300:1 or more, which is still quite low.

I'd guess if anything it would be an either/or scenario with screen blanking. G-Sync (VRR) + HDR with full-brightness highlights of up to 1000 nits would be superior to SDR-range ULMB/screen-blanking once HDR content becomes ubiquitous. Remember, 1000 nits is the minimum acceptable peak brightness for HDR content, while HDR movies are mastered at 4000 nits and most other content is likely mastered at at least 1200 - 2000 nits.

SDR is a limited (narrow) band; having a 1000-nit HDR display doesn't push that same rendered band 2x to 3x higher like it would if you ramped up the brightness on an SDR screen. SDR brightness is relative; HDR uses a different system for brightness which uses absolute values. It opens up the spectrum so content's highlights, shadows, etc. can go into a much broader band (1000-nit peak down to .05 black depth). When SDR goes "outside" of its narrow band, it crushes colors to white and muddies dark detail to black. HDR will show the actual colors at higher-brightness highlights (gleaming reflections, edges) without crushing to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene like turning the brightness of an SDR screen up would.

"HDR video increases the range of brightness in an image to boost the contrast between the lightest lights and the darkest darks. If you’re having difficulty grasping how that translates into a more realistic image on your screen, think of the subtle tonal gradations a fine artist creates in a charcoal drawing to build the illusion of volume, mass, and texture, and you should begin to get the picture. But HDR doesn’t just improve grayscale; its greater luminance range opens up a video’s color palette as well. “Basically, it’s blacker blacks, whiter whites, and higher brightness and contrast levels for colors across the spectrum,” says Glenn Hower, a research analyst at Parks Associates.

The result is richer, more lifelike video images. Rather than washing out to white, as it would in conventional video, a ray of sunlight reflecting off a lake in HDR will gleam, and a bright cloud will appear soft and cottony. Basically any image your current TV would render shadowed, dull, muddy, or bleached out will look nuanced, vibrant, and strikingly realistic in HDR."

--------------------------------------------------------

"Realistically, most content will be mastered at 1,200-2,000 nits.
But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits."

--- This is what the uninformed think happens:
"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."

---This is what really happens:

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.

HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays."
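A toy illustration of the clipping point in that quote, assuming a display that simply hard-clips anything coded above its peak (real displays tone-map more gracefully; the scene values are made up):

```python
# Toy model: HDR pixel values are absolute luminances (nits); a display with a
# lower peak cannot show detail coded above that peak. Hard clipping is used here
# for simplicity, though real displays usually roll off / tone-map instead.

scene_nits = [0.05, 1, 50, 120, 300, 650, 900, 1500, 4000]   # made-up highlight values

def shown_on(display_peak_nits, pixels):
    return [min(p, display_peak_nits) for p in pixels]

print(shown_on(1000, scene_nits))  # only 1500 and 4000 clip to 1000
print(shown_on(500, scene_nits))   # everything above 500 flattens to 500 (highlight detail gone)
# Everything at or below 500 nits renders identically on both displays -
# the "absolute, not relative" point in the quote above.
```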

------------------------------------------------------

http://www.creativeplanetnetwork.com/news/shoot/understanding-gamma-and-high-dynamic-range/615702

"To achieve HDR, it is necessary to use bit-depth more efficiently while keeping average brightness about the same as an SDR image. This means more bit-levels for low light levels where the eye is more sensitive and fewer bit-levels for the high brightness areas where the eye cannot see the contouring. In others words, we need a Perceptual Quantizer or PQ that does a better job than the PQ of the current BT.709 gamma."

"Modern cameras are capable of capturing a wide dynamic range. But unfortunately SDR displays will either clip or blow out highlights in images."

"All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it’s the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content."
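For reference, the ST 2084 PQ curve that article is talking about is easy to write out; this is just a transcription of the published constants, nothing specific to these monitors:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear code value E' in [0, 1] to absolute
# luminance in cd/m^2 (nits), up to 10,000 nits. Constants as published in the standard.

M1 = 2610 / 16384           # 0.1593017578125
M2 = 2523 / 4096 * 128      # 78.84375
C1 = 3424 / 4096            # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32       # 18.6875

def pq_eotf(code: float) -> float:
    """Decode a PQ code value (0..1) to luminance in nits."""
    e = code ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ code {v:4.2f} -> {pq_eotf(v):9.2f} nits")
```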
 
I highly doubt screen blanking or ULMB will be used with HDR, and HDR is the future, even for still photos and the static desktop as well as YouTube and web content, as cameras and apps start using it more. Going to SDR mode would clip or blow out highlights on HDR content. 500 nits IMO isn't going to show HDR highlights well enough, and anything weaker than a VA's black depth isn't going to deliver enough black depth. These very high density dynamic FALD backlights may improve the ~3000:1 contrast ratios of modern gaming VAs to more like 3800 - 3900:1 or more (+30% or so).. hopefully 4100+, but maybe I'm just being hopeful.

I'd guess if anything it would be an either/or scenario with screen blanking. G-Sync (VRR) + HDR with full-brightness highlights of up to 1000 nits would be superior to SDR-range ULMB/screen-blanking once HDR content becomes ubiquitous. Remember, 1000 nits is the minimum acceptable peak brightness for HDR content, while HDR movies are mastered at 4000 nits and most other content is likely mastered at at least 1200 - 2000 nits.
I think HDR with 250-300 nits will be fine in a light-controlled room.. it's not like we max out brightness on displays as it is anyway. I go for around 100 nits (often around 20% brightness); I highly doubt I will allow anything to blast my eyes with anything over 250-300 nits at monitor distance. 1000 nits must be more for rooms with daylight at TV range. The best HDR TVs (OLED) max out at 500-600 nits. But if it turns out we really DO want 1000 nits, it's still a sound concept if they just made the backlight brighter (2000 nits, or w/e, to compensate).

My example would not provide 500 nits btw; it would be 250 nits (500 nits displayed 50% of the time, the same dimming effect as ULMB or 100Hz PWM).
 
That's not the way HDR works, if you read what was included in the boxed-off quotes.

HDR is not like the brightness setting of an SDR display.

A lot of the scene remains in the SDR range. 1000 nits is not even bright enough to display what full HDR movies are mastered at (4000 nits, mostly much higher-ranged color highlights), bright room or not.. it is used as a minimum for the spec.

What happens is any color information that is out of SDR range in media (including stills) will be clipped to white or crushed to mud on more limited range displays.

When it comes to screen blanking it's not just the dimness, though that is a huge issue.. it's that it is also variable and probably won't mesh with HDR well at all. It also requires high frame rates to reduce the PWM effect of strobing... and most people nowadays avoid PWM monitors entirely, so they might not want to re-introduce it to their eyes at 85Hz or 100Hz.

You either do not understand the relevant features for a gaming monitor or underestimate the color capabilities of high end "gaming monitors".
100Hz with a strobing backlight beats 200Hz without it any day of the week.
The X34 Predator does not have a strobing backlight, so its motion clarity is way below the 24" and 27" pro gaming alternatives. But its color accuracy is excellent after calibration, making the X34, and most other displays using the same 34" LG panel, good value for productivity.

21:9 for me is a gimmick. At the same refresh rate and feature set, I would always pick the higher resolution available. No f'ing way one can convince me that 2560x1080 looks better than 2560x1600, or that paying more than twice the price asked for a 40" 3840x2160 to get a 34" 3440x1440 is a smart move.


I'll preface this by saying we finally have GPUs that can singly drive 2560 x 1440 at high Hz, in the Titan (Pascal) and the 1080 Ti.

I do understand the resolution vs frame rate argument GoGaming is making though, somewhat, especially for people who buy mid-range GPUs, where 1080p high gives them a lot more room to crank up higher settings while keeping high fps-Hz.

If you aren't getting at least 100fps-hz average, you aren't getting any appreciable gains out of a high hz monitor.

If you are trying to run strobing on a high resolution monitor, you have to run much lower settings in order to get sustained (not average) 100fps or better. It's a huge difference.



=========================================================
=========================================================

Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher rez with any kind of high-to-ultra settings on the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com's Q and A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. Also, there can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutters more with VSYNC ON, while others stutters more with VSYNC OFF. Test opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------

In recent news, this sounds interesting....

http://www.blurbusters.com/combining-blur-reduction-strobing-with-variable-refresh-rate/

I have doubts going forward that these strobing techs will work with HDR gaming monitors and FALD HDR gaming monitors by default (and at the sustained, not average, frame rates strobing requires at 1440p and 4k), which looks like where things are going eventually.
 
That's not the way HDR works, if you read what was included in the boxed-off quotes.
You do not need 1000 nits at monitor distance to have the same brightness as 1000 nits at TV distance... 1000 nits at short range is a _lot_ brighter than 1000 nits a few meters away (basically, consider reading a book under your bedside reading light; now put that light a few meters away and try). Observed brightness is inversely affected by the square of the distance (sorry if I worded that wrong, I haven't used those terms in English).
However, unless you use a display type with perfect blacks, the dynamic range is still reduced - not because of the brightness, but because the not-black blacks of an LCD will be diminished the same way by the distance. As long as the FALD actually dims the blacks properly, however, 1000 nits are not needed at monitor distance to provide the same dynamic range as 1000 nits at TV range.
 
Morkai, you're correct about light over distance (we use the same rule in flash photography, your translation is good!), however I think the point is that if the display isn't bright enough (and dark enough!), then it won't be capable of displaying HDR properly regardless of view distance.
 
Morkai, you're correct about light over distance (we use the same rule in flash photography, your translation is good!), however I think the point is that if the display isn't bright enough (and dark enough!), then it won't be capable of displaying HDR properly regardless of view distance.
Except if blacks are truly black, such as OLED or correctly dimmed blacks with FALD. Then, and I'm guesstimating here, let's say 250 nits at monitor range equals 1000 nits at TV range, and the blacks in both scenarios are absolute, so also equal - same dynamic range.
 
Sure, dynamic range is comparable - remembering that dynamic range is the difference between the darkest and brightest possible outputs - but you still need to be able to hit a certain defined maximum brightness for HDR, while simultaneously hitting a minimum darkness.

OLEDs have the darkness down, but brightness is a challenge, while LCDs can get as bright as you want them, but darkness is a challenge...

:D
 
I don't know how true that kind of drop-off is, and at which viewing distances. By that math, a 1000-nit monitor is perceived as 4000 nits because of the near distance. Our eyes also perceive lighting intensity (and contrast, and even saturation) relative to the ambient lighting, of course, which you hinted at.

What I was saying is that a 250nit or 400 nit SDR scene, in HDR , would still have much of the scene at 250 or 400nit.. but it wouldn't clip or blow out the highlights to white for all of the added scene information in the HDR range.

Think of how dodge and burn works (light and dark in drawing).. if you made a perspective art scene with a limited-brightness color range of chalk (bland) and shaded it for 3D depth and shadow effects, then made the same scene with a much brighter and darker full color range of chalk added to your palette - it would not only make a much more vibrant and colorful image but would affect the way your eyes perceive the depth of the scene.

HDR brightness does not work like dimming a bulb the way SDR does. HDR uses absolute values for brightness.
I understand what you are saying though. You are saying that in a dark room brights appear brighter (like a flashlight) - so if you were able to turn your 1000-nit peak down to 500 in a dark room, and the SDR range of the scene becomes 250, it will look pretty similar to a 400 or 500 nit scene with 1000-nit peak highlights. This might work somewhat in theory. If you are starting at a lower value and going 250 to 500, you will not get the same range of colors though.

http://www.avsforum.com/forum/166-lcd-flat-panel-displays/2812161-what-color-volume.html



--- This is what the uninformed think happens:
"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."
---This is what really happens:

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.

HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays."
 
It will be really interesting to see what they could do on that FALD VA 35". They could do a scanning backlight and some other really interesting motion clarity stuff. But I haven't heard a peep about ULMB on the 27" or 35" FALDs, so I wonder if they are even going to mess with it.

Here on the 65" FALD once again zero mention of ULMB or any strobing/scanning of the back-light. :( Probably because the brightness to allow HDR and a strobing/scanning back-light isn't bright enough even with direct LEDs? Maybe they figure people would much rather have sample-and-hold G-Sync + HDR.
 
HDR is a much bigger deal; once HDR content becomes ubiquitous for most types of content (including photography, digital art, games, home-made videos, in addition to studio-made movies, etc.), it will make SDR look as dated as a low color gamut, low contrast monitor looks compared to a modern SDR screen now.

Simulation from the same article on avs: http://www.avsforum.com/forum/166-lcd-flat-panel-displays/2812161-what-color-volume.html

[image: SDR vs HDR simulation from the linked AVS article]


The way it is now in SDR, your range is very narrow, so you can move that range higher or lower. Higher, you might see the wheel's hub and the shadow detail, but you'd lose the black depth, the contrast would go pale and you'd crush the highs to white. Lower, and you'd get more black depth but lose the brightness and pop of the saturation, and the detail in black would be lost to mud. With HDR you get a broad spectrum of color volume across the brightness and darkness range. It opens it up a lot more than the SDR middle ground.
 
I spent some time researching how the standard has been tested.
"Experimental results (1) show that a simple mean of displayed pixel luminances provides a good correlation with subjective brightness at 3.2 picture heights from the screen"
The example I found from HDR standard testing used a 47" TV, with subjects seated 3.2x the panel height away (roughly 2 meters). At that distance, 1000 nits is apparently proper for HDR. The same luminance at close range will be brighter.
https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2408-2017-PDF-E.pdf

They also seem to consider automatic range measurement + luminance adjustments for future HDR standards.

There is also data for when it gets annoying:
"Images with average luminance greater than 25% of peak luminance began to be judged as “too bright” by many viewers."
 
65" diagonal 16:9 = 65.7 W x 31.9 H ... 55" diagonal 21:9 = 50.6 W x 21.7 H .. 35" diag. 21:9 = 32.2 W x 13.8 H ... 27" 16:9 = 23.5 W x 13.2 H

3.2x the panel height ..
65" = 102.08" view dstance (8.5', pretty accurate normal view distance for a 65" to 70" tv).
35" 21:9 = 44.2" (3.7' , a bit further than typical 1.5 - 2.5' desk viewing distances)
27" 16:9 = 42,4" (3.5' , again about 1.5' further away than normal).

How much that extra 1' matters on desktop monitors, versus it being "too much" overall screen brightness for broad dynamic range highlights of saturated color, could be negligible.
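Those numbers fall straight out of the geometry; a quick sketch of the same arithmetic if anyone wants to plug in other sizes:

```python
# Screen width/height from diagonal + aspect ratio, plus the 3.2x picture-height
# viewing distance used in the BT.2408 testing referenced above.
import math

def dims_inches(diag_in, aspect_w, aspect_h):
    d = math.hypot(aspect_w, aspect_h)
    return diag_in * aspect_w / d, diag_in * aspect_h / d

for label, diag, aw, ah in [('65" 16:9', 65, 16, 9), ('55" 21:9', 55, 21, 9),
                            ('35" 21:9', 35, 21, 9), ('27" 16:9', 27, 16, 9)]:
    w, h = dims_inches(diag, aw, ah)
    dist = 3.2 * h
    print(f'{label}: {w:.1f} x {h:.1f} in -> 3.2H distance {dist:.1f} in ({dist / 12:.1f} ft)')
```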

Average luminance would vary a lot depending on the scenes and content you are watching, and would be biased by your ambient lighting.

You also might consider how surround sound systems work. If you want it set up for "realism" rather than normalized, you'd like to hear the wind in the sails and the pirates whispering below deck, but when the sea combat starts and the cannons start firing many listeners in a test might report it as "too loud".
 
Huh, funny. Large format monitors - I own one, and absolutely love it! I even have it overclocked to 90Hz. AND it's a PVA monitor to boot.

Good to see Nvidia trying to make them a thing!
 
You do not need 1000 nits at monitor distance to have the same brightness as 1000 nits at TV distance... 1000 nits at short range is a _lot_ brighter than 1000 nits a few meters away (basically, consider reading a book under your bedside reading light; now put that light a few meters away and try). Observed brightness is inversely affected by the square of the distance (sorry if I worded that wrong, I haven't used those terms in English).
However, unless you use a display type with perfect blacks, the dynamic range is still reduced - not because of the brightness, but because the not-black blacks of an LCD will be diminished the same way by the distance. As long as the FALD actually dims the blacks properly, however, 1000 nits are not needed at monitor distance to provide the same dynamic range as 1000 nits at TV range.
Very wrong. As you double the distance, only a quarter of the light reaches your eyes. But the display will also fill just a quarter of the field of view. The perceived brightness remains the same. So you need less total light output on a monitor, but the same cd/m2 (nits), which is a measure of luminous intensity per area. Observed brightness only changes with distance once the light source is so far away that it's effectively a point to your eyes, like stars in the sky.
 
I have a Samsung JS7550 - do you ever get flickering with yours?
Before the last firmware update, it would go black for a second a few minutes after coming out of sleep, then come back on and never do it again. Since the last firmware, it does it probably 1 out of 10 times coming out of sleep. But I have never seen any kind of flickering at all. This JS9000 has been, without a doubt, the best computer display I have ever had. No regrets at all. Zarathustra[H] was the reason I bought it. I waited 6 years to move from the triple 24" Eyefinity/Surround. I knew large format 4K was my next upgrade. I can't even imagine going back. Gaming on it is friggin incredible. Between the JS9000 and the bookshelf speaker setup that Zarathustra[H] and cageymaru spec'd, gaming has never kicked so much ass. Even "twitch gamer" friends that come over and use the system are blown away. I love the fact that NVIDIA is pushing this, but I would like to see a bit smaller formats for sure. We are really starting to move beyond the multi-monitor era. That only started a short 8 years ago. :) Shot this video in 2009.

 
Very wrong. As you double the distance, only a quarter of the light reaches your eyes. But the display will also fill just a quarter of the field of view. The perceived brightness remains the same. So you need less total light output on a monitor, but the same cd/m2 (nits), which is a measure of luminous intensity per area. Observed brightness only changes with distance once the light source is so far away that it's effectively a point to your eyes, like stars in the sky.
But you don't double the distance. If you compare a 27" monitor with a 55" TV which is almost exactly 4x the area, you are likely to quadruple the distance (at least 50cm distance for a 27" monitor, at least 2m for a 55" tv). If you follow the hdr testing recommendation of sitting 3.2x the panel height away , then yes, it's double the distance. But that's insane, just doesn't scale for monitor sized displays - 1.07m away from a 27" monitor. (I measured and my eyes are 50-55cm from my 27". I doubt people sit much further away than me. I also believe people sit more than 2m away from their 55" tvs on average).


And there are plenty of real world examples where observed brightness changes with distance, even if it's not "so far away that it's effectively a point to your eyes, like stars in the sky".
*Turn on a lightbulb in one room and try to read a book in the next room, in direct line of sight of the light source. You can probably do it, but it will be really dim and tiresome.
*Try using a home projector on the wall in the next room. The picture will be really dim. More extreme: try it in a cinema and you could probably not even see the picture.
*Try to read a book in darkness with just the display light from a mobile phone; you need to have it right next to the page for it to work. Lift the phone 1m up and it will be too dim.

The point of HDR is also not to use the whole display area to illuminate you. It is to have a small pinpoint part of the screen, like a sun or a lamp or a fireball, stand out from the average. Unless I calculate it wrong, a 27" at 50cm at 250 nits will equal a 55" at 2m at 1000 nits. (But a 27" at 1000 nits at 1.07m would equal a 55" at double the distance at 1000 nits, yes).
 
And there are plenty of real world examples where observed brightness changes with distance, even if it's not "so far away that it's effectively a point to your eyes, like stars in the sky".
*Turn on a lightbulb in one room and try to read a book in the next room, in direct line of sight of the light source. You can probably do it, but it will be really dim and tiresome.
*Try using a home projector on the wall in the next room. The picture will be really dim. More extreme: try it in a cinema and you could probably not even see the picture.
*Try to read a book in darkness with just the display light from a mobile phone; you need to have it right next to the page for it to work. Lift the phone 1m up and it will be too dim.
These are different: the light per area of what you are looking at changes. Not so when looking directly at something and varying the distance to it; then the light that reaches your eyes is proportional to how much of your visual area the object covers.

But you don't double the distance. If you compare a 27" monitor with a 55" TV which is almost exactly 4x the area, you are likely to quadruple the distance (at least 50cm distance for a 27" monitor, at least 2m for a 55" tv). If you follow the hdr testing recommendation of sitting 3.2x the panel height away , then yes, it's double the distance. But that's insane, just doesn't scale for monitor sized displays - 1.07m away from a 27" monitor. (I measured and my eyes are 50-55cm from my 27". I doubt people sit much further away than me. I also believe people sit more than 2m away from their 55" tvs on average).

The point of HDR is also not to use the whole display area to illuminate you. It is to have a small pinpoint part of the screen, like a sun or a lamp or a fireball, stand out from the average. Unless I calculate it wrong, a 27" at 50cm at 250 nits will equal a 55" at 2m at 1000 nits.

Well, I wouldn't call them equal. Your eyes will see a difference. Larger highlights will appear half the size but brighter on the 55" (at 4x distance). But I think you are onto something. The perceived difference in brightness could be far smaller than the cd/m2 values suggest. It depends on the human visual system, though, not something you can determine with simple calculations.
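Back-of-the-envelope version of both sides of this exchange, using the rough approximation that the total light the screen throws at your eye scales with the solid angle it covers (area over distance squared; the 50cm/2m distances are the ones assumed above):

```python
# Luminance (cd/m^2, "nits") is distance-invariant, but the total light a screen
# sends into the eye scales with the solid angle it subtends, approximated here
# as area / distance^2 (rough at very short viewing distances).
import math

def screen_area_m2(diag_in, aspect_w=16, aspect_h=9):
    d = math.hypot(aspect_w, aspect_h)
    w = diag_in * aspect_w / d * 0.0254
    h = diag_in * aspect_h / d * 0.0254
    return w * h

def flux_proxy(nits, diag_in, dist_m):
    # luminance * solid angle ~ total light delivered to the eye by the whole screen
    return nits * screen_area_m2(diag_in) / dist_m ** 2

monitor = flux_proxy(250, 27, 0.5)    # 27" at 250 nits viewed from 0.5 m
tv = flux_proxy(1000, 55, 2.0)        # 55" at 1000 nits viewed from 2 m
print(f"monitor {monitor:.0f}, tv {tv:.0f}, ratio {monitor / tv:.2f}")
# The totals come out nearly equal, yet the per-area brightness of any given
# highlight is still 4x higher on the 1000-nit screen.
```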
 
If the VA is tight to 120hz at 120fps or more (to 130 if lucky), that is still great no matter what they claim the max hz is.
A VA triples the black depth and contrast of an IPS or TN, and FALD probably amps that up by an additional 30% or more, and on a high dynamic range (luminous, saturated color) display.

Your scene is blurrier (less clarity) below the 40% blur reduction you get at 100hz at 100fps, and with the VAs you are probably dropping off in clarity again beyond 120hz@120fps or 130hz+fps. Therefore you can dial in your settings and cap your frame rate to stay in the sweet spot as much as possible.

------------------------------------

80hz at 80fps gives you 25% blur reduction vs 60hz@60fps, and gives you 1.33:1 motion definition increase (8 frames to every 6 shown at 60fps-hz)

100hz at 100fps gives you 40% blur reduction vs 60hz @ 60fps, and gives you 1.6:1 motion definition increase. (5 frames to every 3 shown at 60fps-hz)

120hz at 120fps gives you 50% blur reduction vs 60hz @ 60fps, and gives you 2:1 motion definition increase (double the motion definition, including viewport movement of the entire game world in 1st/3rd person games).

130hz at 130fps is prob ~54% blur reduction vs 60hz@60fps, and prob gives around ~2.17:1 motion def increase (13 frames to every 6 shown at 60fps-hz)
... that blur reduction is, if the VA still stays tight enough at this point. Prob not much past 120fps-hz, but we'll see.

144hz at 144fps is ~58% blur reduction and gives 2.4:1 motion def increase (12 frames to every 5 shown at 60fps-hz)
................... ... a VA's response time wouldn't be tight enough over 120 or 130, so this would probably in effect sink back to less clarity, like the rates below 120 fps+hz. aka outside of the "sweet spot" - like a bell curve, perhaps even like a cliff slope on the far end, depending how bad the response time effects are at any given fps-hz range.

------------------------------------
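The percentages above come from treating sample-and-hold blur as proportional to frame time and motion definition as the unique-frame ratio; a quick sketch of that arithmetic (it ignores panel response time, which is the VA caveat):

```python
# Simple model behind the figures above: sample-and-hold blur is proportional to
# frame time (1 / fps-hz) and motion definition to the number of unique frames.
# Panel response time (the VA caveat) is ignored.

BASELINE_HZ = 60

def blur_reduction_pct(fps_hz):
    return (1 - BASELINE_HZ / fps_hz) * 100

def motion_definition_ratio(fps_hz):
    return fps_hz / BASELINE_HZ

for rate in (80, 100, 120, 130, 144):
    print(f"{rate:>3} fps-hz: {blur_reduction_pct(rate):4.1f}% less blur, "
          f"{motion_definition_ratio(rate):.2f}:1 motion definition vs 60")
```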

So if you can achieve 100fps-hz average or more, then you can cap it at 120fps+hz or 130fps+hz.
Even if you do 100fps-hz average, for more like a mainly 70 - 100 - 130 range, utilizing g-sync for the fluctuations and lows, you'd still get quite an appreciable benefit.
That is a great spot to be in on a high hz 4k (or a 3440 x 1440) VA 1000nit FALD HDR Quantum-Dot Filtered/P3 color monitor, with more than 3x the black depth and contrast of an IPS or TN to start with (and a lot more than that with FALD).

Realistically, most people would not be capable of even hitting 100fps-hz average (70 - 100 - 130 range for the most part) on the most demanding games without dialing graphics settings down a lot more than usual, especially at 4k - so 200hz range would not be pushed into in most cases even on the high end of the graph.

In my opinion you'd be much better served dialing in the best graphics settings you can tweak to achieve 100fps-hz average (~70 - 100 - 130) or 120fps average (~90 - 120 - 150, capped at 120 or 130) when possible. Try to keep the low end at least 70+ for the most part and cap the top at 120 or 130. You'd still have a few potholes and spikes in addition to the (sometimes abrupt) sinks to and from 70hz@70fps+, so g-sync is still valuable in this usage, preventing judder/stutter and stops.
 
For all of you complaining about the size: this is exactly what I want, because now I can move my monitor back against the wall and still view it clearly at a distance. It also makes 100% DPI scaling at 4K sensible to use, too; even 42" is too small for that. Also, HDTVs suck at the very things I want out of a gaming monitor, especially for the ridiculous prices they generally go for.

You might wonder why the hell I'd want to do the wall-mount thing for PC gaming, and it's actually pretty simple: I'm trying to minimize desk depth to maximize the possible room space in this cramped computer room, where an 8' square doesn't make for a very comfortable room-scale VR experience if you want to walk around and still be sure you won't hit something by mistake. In fact, the end goal is to remove the desk altogether; what I need is more akin to a chair flanked by cockpit control mounts that can be moved aside when I'm feeling that room-scale itch.

It also means I can move back to the middle of the room, have a few friends beside me for local multiplayer of the sort usually only found on consoles due to the general expectation of living-room play, and everyone can have a good view of what's going on. Heck, I actually do intend to use my more modern consoles with it that way, too, seeing as the living room in this house isn't exactly conducive to long hours of PS3/Wii U/docked Switch gaming. My FG2421 feels a bit small for that, albeit usable.

The only thing is that this is almost certainly going to destroy my wallet the same way a typical 65" HDTV that isn't garbage would. NVIDIA loves their profit margins.
 
The more I think about it, the more I come to the conclusion that these are not usable for a desk, as much as I want them to be. And it's mostly because of the PPI, not the size. I think a 65" 8K would be usable at 40" away (within desk range), you just gotta train your neck muscles. It's like having several monitors on your desk.

For gaming, you would have to lean back a bit, but not too much. I can fit my 43" into my field of vision at 30" away no problem. For a 65", I'd lean back a bit more, but it would still work with a desk (maybe a bit deeper). But, again, the problem is the resolution. 4K is too little for 65" at desk range. 8K would be perfect with 125-150% scaling. But 8K is obviously nowhere near ready for prime time with 120Hz, HDR, and having the hardware to maintain a frame rate. So I hope Nvidia listens and releases 45" versions for people who actually want to put a huge monitor on their desks (a lot already have).
 
Nvidia 65" @4k with Gsync and 120hz I don't know why Nvidia calls it 'Gsync' it should be Esync (Expensive sync) I think a 43" with 8k @200hz would be better IMO, but having this big format on a desktop is way too big and bad for the eyes and neck you will have to sit at lease 42" away this is just graphics & marketing 'BS' only made for the console Fanboys not PC gamers in mind.
 
Nvidia 65" @4k with Gsync and 120hz I don't know why Nvidia calls it 'Gsync' it should be Esync (Expensive sync) I think a 43" with 8k @200hz would be better IMO, but having this big format on a desktop is way too big and bad for the eyes and neck you will have to sit at lease 42" away this is just graphics & marketing 'BS' only made for the console Fanboys not PC gamers in mind.

...what?

Yes, G-Sync is expensive! Well, more expensive, since there's dedicated hardware that brings more features than say FreeSync. FreeSync 2.0, which will bring some of those features, is also more expensive, because it also requires more dedicated hardware...

...and I cannot see where this is 'only made for the console Fanboys'- do any of the consoles that purport to play games at 4k support G-Sync over DisplayPort?
 
I think the point of these is you sit at a normal monitor distance and use it instead of a triple monitor surround setup. You use a high field of view and get a lot of peripheral vision. Your entire real life fov is supposed to be filled.
 
...what?

Yes, G-Sync is expensive! Well, more expensive, since there's dedicated hardware that brings more features than say FreeSync. FreeSync 2.0, which will bring some of those features, is also more expensive, because it also requires more dedicated hardware...

...and I cannot see where this is 'only made for the console Fanboys'- do any of the consoles that purport to play games at 4k support G-Sync over DisplayPort?

I think the PS4 Pro and Xbox One X support 4K at 60Hz, so this BFGD won't be an issue for playing, whether PC gaming or consoles.
For 65", I think it is the perfect size for everything, whether movies or gaming. Many people love their 65" 4K TVs, such as the Sony Z9D, X900E and the OLEDs. I didn't see anyone complain about 4K at 65", and you can enjoy different movies with good shadow detail and highlights.
Now you don't need to choose between a monitor for the PC and a big TV; you can do everything on one monitor, using it for gaming and movies at the same time.
All you have to do is choose the best distance for your eyes when you are gaming or watching movies, and use a small ultrawide for surfing the internet and productivity.
I'm very happy to see Nvidia making this great thing: TV features and PC monitor features in a giant PC monitor. Let's let technology take its route to the best.
It's human nature that we criticize everything; whenever we see something new, everyone begins to point out its disadvantages and forgets its advantages.
 
I think the point of these is you sit at a normal monitor distance and use it instead of a triple monitor surround setup. You use a high field of view and get a lot of peripheral vision. Your entire real life fov is supposed to be filled.

You do that with a 43". 65" is too big and the PPI is too low.
 
I watch my 70" 4k (low density FALD) VA TV on my sectional couch with a couchmaster couch desk at 8' away to my eyeballs. It does 4k 60hz and can do 1080p at 120hz native on modern gpus. Though it is mainly for streaming movies and youtube, etc off of my shield - It has low input lag and I've played a few games on it on occasion, including the witcher 3. I think 8' away is perfect. In fact, for the money this is going to cost, It'd be better at 70". At 65" this would actually be a size downgrade for me in regard to my living room.

I wouldn't consider using this *at* a desk, but the room setups with a desk facing AWAY from a wall instead of stuffed up against the wall like a bookshelf, with the giant display on the other wall, would work out great.

This is how nvidia had it set up distance wise:

[image: NVIDIA's BFGD living-room demo setup]



Though the CGI scene example in their promo video was more like this distance, the actual virtual camera distance of the scene was further away to see the whole screen.
 
This is how nvidia had it set up distance wise:

Good luck running 4k 120Hz at more than 3 feet distance, or using a wireless "gaming grade" keyboard and mouse for FPS. Big format looks great on paper, but be ready for some compromises.
 
I think gamers are missing the point on this new Nvidia big format, thinking 120Hz on a single GTX 1080 Ti card is not going to cut it. I think this 120Hz is for a setup of two or more cards, not one, because the 1080 Ti @ 4k can only run around 60Hz, even the Titan V, but we will see when it comes out this summer.

Don't build up too much hope; it all looks very good on paper and in the video graphics made by their marketing team.

Also, I am all for the new technology, but only when they get it right or support it. But true 4K is 4,096 by 2,160, not 3840x2160, Nvidia.
 
Good luck running 4k 120Hz at more than 3 feet distance, or using a wireless "gaming grade" keyboard and mouse for FPS. Big format looks great on paper, but be ready for some compromises.


I can run USB cables around the back wall in conduit, or even down to the basement ceiling and back up. The Couchmaster couch desk setup comes with USB 3 cables and a USB 3 hub too.

[image: Couchmaster couch desk setup]




Other than that, what is the problem with using a big screen more than 3' away? Visually it would work a lot better further away. At 8' away, my 70" is bordering on requiring me to tilt my head to focus on the extremes. Unless you mean the video cables?

You'd keep the computer near the screen (like consoles and other media devices are) and run the USB cable around the back wall.
You could alternately deploy a no-trip floor cabling strip directly across to your seating position (at least pulling it out from under the entertainment center when switching to wired gaming peripherals for gaming sessions).

An 8' viewing distance to your eyeballs means you could probably do a 16' USB cable run (maybe a bit more if not using a no-trip strip directly across), and feed it around the back wall in some conduit or some sort of wire-management shielding through to under the couch. Or you could feed it down behind the wall behind the entertainment center, through the basement rafters and back up behind the couch, drilled through the moulding at the wall's base, or via a wall plate if you want to get fancy.

You can run another USB cable, or utilize the hub, for a G29 steering wheel and use one of those metal pop-up stands they mount on.. You could also use an Xbox One controller, wired or wireless, for a lot of fun games.

Not every game is extremely demanding on GPUs, and some can be dialed down a bit without losing too much. I agree 4k is GPU-crushing on demanding games though. See my previous posts about the 100Hz @ 100fps average as a minimum target for gaming on high-Hz displays. In fact, for people with moderate GPU budgets, a high-Hz 1080p where you can crank up all the graphics settings is probably a much better idea. You could alternately run a lower rez on a high-Hz monitor like this for the most demanding games and rely on scaling if you had to, or even letterboxing depending how much smaller you go.
 
I think gamers are missing the point on this new Nvidia big format, thinking 120Hz on a single GTX 1080 Ti card is not going to cut it. I think this 120Hz is for a setup of two or more cards, not one, because the 1080 Ti @ 4k can only run around 60Hz, even the Titan V, but we will see when it comes out this summer.

Don't build up too much hope; it all looks very good on paper and in the video graphics made by their marketing team.

So turn a few settings down on the latest games, and still have 10x better graphics than a console?

Also, I am all for the new technology, but only when they get it right or support it. But true 4K is 4,096 by 2,160, not 3840x2160, Nvidia.

This is UHD 4k, and yes, it is a standard.
 
I think gamers are missing the point on this new Nvidia big format, thinking 120Hz on a single GTX 1080 Ti card is not going to cut it. I think this 120Hz is for a setup of two or more cards, not one, because the 1080 Ti @ 4k can only run around 60Hz, even the Titan V, but we will see when it comes out this summer.

Don't build up too much hope; it all looks very good on paper and in the video graphics made by their marketing team.

Also, I am all for the new technology, but only when they get it right or support it. But true 4K is 4,096 by 2,160, not 3840x2160, Nvidia.

If you ask me, I'll tell you that a GTX 1070 will be enough for many games at 4k 120Hz, not just the 1080 Ti. You can play with the settings and turn some down until you get the best fps and fast motion for your eyes.
 