NVIDIA Big Format Gaming Display

Discussion in 'Displays' started by realworld, Jan 8, 2018.

  1. Vega

    Vega [H]ardness Supreme

    Messages:
    5,389
    Joined:
    Oct 12, 2004
What sucks is that VRR is the real deal, so it is hard to brush off G-Sync easily. Especially in a very demanding 4K scenario at high refresh rates, VRR is needed more than ever.

    I'll definitely be getting one to try out on release day.

    We can only hope...

That is what I fear will happen. If NVIDIA has refused to support FreeSync this entire time, why would they change?
     
    IdiotInCharge likes this.
  2. Sancus

    Sancus Gawd

    Messages:
    534
    Joined:
    Jun 1, 2013
You're not wrong. Most likely I will punt the whole dumpster fire to 2019 and buy a PG27UQ or PG35VQ now that it seems we have confirmation those are still slated to come out this year. Kinda leaning towards the PG35VQ. You're gonna have smear on these BFGDs anyway since they're VA panel based, so you might as well get 21:9 and a usable desktop size.
     
  3. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
Now, if Samsung puts HDMI 2.1 or G-Sync on "The Wall", THAT would be nice. 146", quite exactly the size I need :)
Not going to happen though, most likely... 99% chance it will be a $30,000-50,000 60Hz product :( If they put G-Sync, or HDMI 2.1 with VRR, on it at 120Hz+, I'd buy one in that price range (if it had an estimated lifespan of 10+ years, or a 5-10 year warranty / the possibility to replace single failed MicroLED modules).
Products like this always cheap out on the inputs, which I find ridiculous.

    "A press event scheduled for March is said to offer more details"
     
  4. gan7114

    gan7114 [H]Lite

    Messages:
    93
    Joined:
    Dec 14, 2012
    The more I see of this OLED ProArt the more I question its practicality for desktop users. Seems to be a hybrid between a traditional monitor and a portable tablet.

    - Visible matte anti-glare coating
    - No VESA mount
    - Questionable placement of USB-C and HDMI ports along the left vertical edge.

    Some vids for our pleasure:



     
    Armenius likes this.
  5. Vega

    Vega [H]ardness Supreme

    Messages:
    5,389
    Joined:
    Oct 12, 2004
Well, the 21:9 is a VA panel too. Looks to be the same 35" VA panel that has been around, just with FALD slapped on the back. So 200 Hz on a VA panel isn't really practical, as seen on the Acer Z35; pixel response cannot keep up with 200 Hz. I'm leaning towards the 27" FALD, which is IPS.

Yeah, who puts a matte coating on a super-high-PPI OLED? WTF.
     
    Armenius likes this.
  6. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
It would be interesting if they added driver-level black frame insertion at 200Hz, with the GPU pushing 100fps.
The driver could fully dim the backlight via FALD every 2nd frame, letting the pixels transition to the next frame while the backlight is off, effectively masking the slow VA response times.
The result would be quite awesome: a 100Hz screen with the equivalent of 5ms persistence.
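
Roughly, the numbers work out like this (a quick Python sketch of the idea above; the function names are just mine for illustration, not any real driver API):

```python
# Strobe math for the BFI idea: at 200 Hz with the backlight blanked every
# other refresh, you get 100 visible frames per second, each lit for a
# single 5 ms refresh period, and average brightness halves.

def persistence_ms(frame_rate_hz: float, duty_cycle: float) -> float:
    """How long each frame stays lit, in milliseconds."""
    return 1000.0 / frame_rate_hz * duty_cycle

def average_nits(strobe_nits: float, duty_cycle: float) -> float:
    """Average luminance when the backlight is only on part of the time."""
    return strobe_nits * duty_cycle

print(persistence_ms(100, 0.5))   # 5.0 -> the "5ms persistence equivalent"
print(average_nits(500, 0.5))     # 250.0 -> a 500-nit strobe averages 250 nits
```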
     
    IdiotInCharge likes this.
  7. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,653
    Joined:
    Jun 13, 2003
    If this could be made to work with G-Sync...
     
  8. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
Yeah.. it would face the same exact issues as ULMB + G-Sync, so probably not trivial to solve that part. There's nothing stopping it from working, although the brightness would appear to fluctuate as the framerate drops with G-Sync enabled.. but why not let the user have a checkbox: [x] enable G-Sync with BFI - warning, only use with stable high framerates.
But with these new shiny 1000-nit backlights they could probably compensate for the G-Sync + ULMB dimness problem by capping the max brightness at, say, 500 nits and using the full 1k to compensate when the framerate drops. That would require a lot of tweaking and testing though. The "old" ULMB displays had no possibility to do that, as they were already too dim at max-brightness ULMB.
On the old gen monitors ULMB + G-Sync was a problem that couldn't be solved. Now it probably can be.
     
    Last edited: Jan 11, 2018
    IdiotInCharge likes this.
  9. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
I highly doubt screen blanking or ULMB will be used with HDR, and HDR is the future, even for still photos and the static desktop as well as YouTube and web content, as cameras and apps start using it more. Going to SDR mode would clip or blow out highlights on HDR content. 500 nits IMO isn't going to show HDR highlights well enough, and anything weaker than a VA's black depth isn't going to show enough black depth.

These very high density dynamic FALD backlights may improve the ~3000:1 contrast ratios of modern gaming VAs to more like 3800-3900:1 or more (+30% or so).. hopefully 4100+, but maybe I'm just being hopeful. A TN or IPS is usually around 850:1 to 1000:1 for reference, though the Asus/Acer 27" IPS ones with high density FALD should push that higher.. if we use a 30% improvement via high density FALD as a guesstimate, then perhaps 1300:1 or more, which is still quite low.
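
Spelling out that guesstimate (just the arithmetic from the paragraph above in Python; the 30% improvement factor is an assumption, not a measurement):

```python
# Native contrast scaled by an assumed FALD improvement factor.
def fald_contrast(native_ratio: float, improvement: float = 0.30) -> float:
    return native_ratio * (1.0 + improvement)

print(f"{fald_contrast(3000):.0f}")  # 3900 -> the "3800-3900:1" ballpark for a gaming VA
print(f"{fald_contrast(1000):.0f}")  # 1300 -> the "1300:1 or more" ballpark for IPS/TN
```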

I'd guess if anything it would be an either/or scenario with screen blanking. G-Sync (VRR) + HDR with highlights up to the minimum acceptable 1000 nits would be superior to SDR-range ULMB/screen-blanking once HDR content becomes ubiquitous. Remember, 1000 nits is the minimum acceptable peak brightness for HDR content, while HDR movies are mastered at 4000 nits, and most other content is likely mastered at at least 1200-2000 nits.

     
  10. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
I think HDR with 250-300 nits will be fine in a light-controlled room.. it's not like we max out brightness on displays as it is anyway. I go for around 100 nits (often around 20% brightness), and I highly doubt I will allow anything to blast my eyes with anything over 250-300 nits at monitor distance. 1000 nits must be more for rooms with daylight at TV range. The best HDR TVs (OLED) max out at 500-600 nits. But if it turns out we really DO want 1000 nits, it's still a sound concept if they just made the backlight brighter (2000 nits, or whatever, to compensate).

My example would not provide 500 nits, btw. It would be 250 nits (500 nits displayed 50% of the time, the same dimming effect as ULMB or 100Hz PWM).
     
    Last edited: Jan 11, 2018
  11. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
That's not the way HDR works, if you read what was included in the boxed-off quotes..

HDR is not like the brightness setting of an SDR display.

A lot of the scene remains in the SDR range. 1000 nits is not even bright enough to display what full HDR movies are mastered at (4000 nits, mostly much higher color-range highlights), bright room or not.. it is used as a minimum for the spec.

What happens is any color information that is out of SDR range in media (including stills) will be clipped to white or crushed to mud on more limited-range displays.

When it comes to screen blanking, it's not just the dimness, though that is a huge issue.. it's that it is also variable and probably won't mesh with HDR well at all. It also requires high frame rates to reduce the PWM effect of strobing... and most people nowadays avoid PWM monitors entirely, so they might not want to re-introduce it to their eyes at 85Hz or 100Hz.

     
    Last edited: Jan 11, 2018
    Armenius and IdiotInCharge like this.
  12. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
You do not need 1000 nits at monitor distance to have the same brightness as 1000 nits at TV distance... 1000 nits at short range is a _lot_ brighter than 1000 nits a few meters away (basically, consider reading a book under your bedside reading light; now put that light a few meters away and try). Observed brightness falls off with the square of the distance (sorry if I worded that wrong, I haven't used those terms in English).
However, unless you use a display type with perfect blacks, the dynamic range is still reduced - not because of the brightness, but because the not-quite-black blacks of an LCD will be diminished the same way by the distance. As long as the FALD actually dims the blacks properly, though, 1000 nits is not needed at monitor distance to provide the same dynamic range as 1000 nits at TV range.
     
  13. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,653
    Joined:
    Jun 13, 2003
    Morkai, you're correct about light over distance (we use the same rule in flash photography, your translation is good!), however I think the point is that if the display isn't bright enough (and dark enough!), then it won't be capable of displaying HDR properly regardless of view distance.
     
    Armenius likes this.
  14. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
Except if blacks are truly black, such as on OLED or with correctly dimmed FALD blacks. Then, and I'm guesstimating here, let's say 250 nits at monitor range equals 1000 nits at TV range, and the blacks in both scenarios are absolute, so also equal - same dynamic range.
     
  15. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,653
    Joined:
    Jun 13, 2003
    Sure, dynamic range is comparable- remembering that dynamic range is the difference between the darkest and brightest possible outputs- but you still need to be able to hit a certain defined maximum brightness for HDR, while simultaneously hitting a minimum darkness.

    OLEDs have the darkness down, but brightness is a challenge, while LCDs can get as bright as you want them, but darkness is a challenge...

    :D
     
    Armenius likes this.
  16. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
I don't know how true that kind of drop-off is, or at which viewing distances. By that math, a 1000-nit monitor is perceived as 4000 nits because of the near distance. Our eyes also perceive lighting intensity (and contrast, and even saturation) relative to the ambient lighting, of course, which you hinted at.

What I was saying is that a 250-nit or 400-nit SDR scene, in HDR, would still have much of the scene at 250 or 400 nits.. but it wouldn't clip or blow out the highlights to white for all of the added scene information in the HDR range.

Think of how dodge and burn works (light and dark in drawing).. if you made a perspective art scene with a limited-brightness color range of chalk (bland) and shaded it for 3D depth and shadow effects, then made the same scene with a much brighter and darker full color range of chalk added to your palette - it would not just make a much more vibrant and colorful image, it would affect the way your eyes perceive the depth of the scene.

HDR brightness does not work like dimming a bulb the way SDR does. HDR uses absolute values for brightness.
I understand what you are saying though. You are saying that in a dark room, brights appear brighter (like a flashlight) - so if you were able to turn your 1000-nit peak down to 500 in a dark room, and the SDR range of the scene became 250, it would look pretty similar to a 400-500 nit scene with 1000-nit peak highlights. This might work somewhat in theory. If you are starting at a lower value and going 250 to 500, you will not get the same range of colors though.
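
To make "absolute values" concrete: HDR10's PQ curve (SMPTE ST 2084) maps each signal level to a fixed luminance in nits, unlike SDR gamma, which is relative to whatever the display's brightness is set to. A quick Python sketch (the constants are the published ST 2084 values; the function name is mine):

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal -> absolute luminance in nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(signal: float) -> float:
    """Map a normalized PQ signal value (0.0-1.0) to luminance in nits."""
    e = signal ** (1.0 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)

print(round(pq_to_nits(1.0)))    # 10000 -- the ceiling of the PQ container
print(round(pq_to_nits(0.75)))   # 983 -- ~75% signal is already near 1000 nits
```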

    http://www.avsforum.com/forum/166-lcd-flat-panel-displays/2812161-what-color-volume.html

[IMG]

     
    Last edited: Jan 11, 2018
    Armenius and IdiotInCharge like this.
  17. Vega

    Vega [H]ardness Supreme

    Messages:
    5,389
    Joined:
    Oct 12, 2004
It will be really interesting to see what they could do on that FALD VA 35". They could do a scanning backlight and some other really interesting motion clarity stuff. But I haven't heard a peep about ULMB on the 27" or 35" FALDs, so I wonder if they are even going to mess with it.

Here on the 65" FALD, once again zero mention of ULMB or any strobing/scanning of the backlight. :( Probably because even direct LEDs aren't bright enough to allow HDR with a strobing/scanning backlight? Maybe they figure people would much rather have sample-and-hold G-Sync + HDR.
     
    Armenius and IdiotInCharge like this.
  18. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
HDR is a much bigger deal: once HDR content becomes ubiquitous for most types of content (including photography, digital art, games, home-made videos in addition to studio-made movies, etc.), SDR will look worse by comparison than a low color gamut, low contrast monitor looks compared to a modern SDR screen now.

    Simulation from the same article on avs: http://www.avsforum.com/forum/166-lcd-flat-panel-displays/2812161-what-color-volume.html

[IMG]

The way it is now in SDR, your range is very narrow, so you can move that range higher or lower. Higher, and you might see the wheel's hub and the shadow detail, but you'd lose the black depth, the contrast would go pale, and you'd clip the highs to white. Lower, and you'd get more black depth but lose the brightness and the pop of the saturation, and the detail in black would be lost to mud. With HDR you get a broad spectrum of color volume across the brightness and darkness range. It opens things up a lot more than the SDR middle ground.
     
    Last edited: Jan 11, 2018
    Armenius and mms like this.
  19. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
I spent some time researching how the standard has been tested.
    "Experimental results (1) show that a simple mean of displayed pixel luminances provides a good correlation with subjective brightness at 3.2 picture heights from the screen"
The example I found from HDR standard testing was done with a 47" TV, with subjects seated 3.2x the panel height away (roughly 2 meters). At that distance, 1000 nits is apparently proper for HDR. The same luminance at close range will be brighter.
    https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2408-2017-PDF-E.pdf

They also seem to be considering automatic range measurement + luminance adjustment for future HDR standards.

    There is also data for when it gets annoying:
    "Images with average luminance greater than 25% of peak luminance began to be judged as “too bright” by many viewers."
     
    GoldenTiger and IdiotInCharge like this.
  20. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
    65" diagonal 16:9 = 65.7 W x 31.9 H ... 55" diagonal 21:9 = 50.6 W x 21.7 H .. 35" diag. 21:9 = 32.2 W x 13.8 H ... 27" 16:9 = 23.5 W x 13.2 H

3.2x the panel height..
65" 16:9 = 102.08" view distance (8.5', a pretty accurate normal view distance for a 65" to 70" TV).
35" 21:9 = 44.2" (3.7', a bit further than typical 1.5 - 2.5' desk viewing distances).
27" 16:9 = 42.4" (3.5', again about 1.5' further away than normal).

How much that extra foot matters on desktop monitors, versus the overall screen brightness being "too much" for broad-dynamic-range highlights of saturated color, could be negligible.

    Average luminance would vary a lot depending on the scenes and content you are watching, and would be biased by your ambient lighting.

You might also consider how surround sound systems work. If you want it set up for "realism" rather than normalized, you'd like to hear the wind in the sails and the pirates whispering below deck, but when the sea combat starts and the cannons start firing, many listeners in a test might report it as "too loud".
     
    Armenius likes this.
  21. Brackle

    Brackle Old Timer

    Messages:
    7,016
    Joined:
    Jun 19, 2003
Huh, funny. Large format monitors: I own one, and absolutely love it! I even have it overclocked to 90Hz. AND it's a PVA monitor to boot.

Good to see NVIDIA trying to make them a thing!
     
    Kyle_Bennett likes this.
  22. Desert Fish

    Desert Fish n00bie

    Messages:
    29
    Joined:
    May 30, 2016
    Very wrong. As you double the distance, only a quarter of the light reaches your eyes. But the display will also fill just a quarter of the field of view. The perceived brightness remains the same. So you need less total light output on a monitor, but the same cd/m2 (nits), which is a measure of luminous intensity per area. Observed brightness only changes with distance once the light source is so far away that it's effectively a point to your eyes, like stars in the sky.
     
    Armenius likes this.
  23. Elf_Boy

    Elf_Boy [H]ard|Gawd

    Messages:
    1,823
    Joined:
    Nov 16, 2007
I have a Samsung JS7550 - do you ever get flickering with yours?

I really like the idea of a large monitor; it would solve the too-small-to-read game UI and too-small Fantasy Grounds issues for me (or I hope it would).
     
  24. Kyle_Bennett

    Kyle_Bennett El Chingón Staff Member

    Messages:
    48,750
    Joined:
    May 18, 1997
Before the last firmware update, it would go black for a second after coming out of sleep, then after a few minutes come back on and never do it again. Since the last firmware, it does it probably 1 out of 10 times coming out of sleep. But I have never seen any kind of flickering at all.

This JS9000 has been, without a doubt, the best computer display I have ever had. No regrets at all. Zarathustra[H] was the reason I bought it. I waited 6 years to move from the triple 24" Eyefinity/Surround. I knew large format 4K was my next upgrade. I can't even imagine going back. Gaming on it is friggin incredible. Between the JS9000 and the bookshelf speaker setup that Zarathustra[H] and cageymaru spec'd, gaming has never kicked so much ass. Even "twitch gamer" friends that come over and use the system are blown away.

I love the fact that NVIDIA is pushing this, but I would like to see a bit smaller formats for sure. We are really starting to move beyond the multi-monitor era. That only started a short 8 years ago. :) Shot this video in 2009.

     
  25. Morkai

    Morkai Limp Gawd

    Messages:
    324
    Joined:
    May 10, 2011
But you don't double the distance. If you compare a 27" monitor with a 55" TV, which is almost exactly 4x the area, you are likely to quadruple the distance (at least 50cm for a 27" monitor, at least 2m for a 55" TV). If you follow the HDR testing recommendation of sitting 3.2x the panel height away, then yes, it's double the distance. But that's insane; it just doesn't scale for monitor-sized displays - 1.07m away from a 27" monitor. (I measured, and my eyes are 50-55cm from my 27". I doubt people sit much further away than me. I also believe people sit more than 2m away from their 55" TVs on average.)

And there are plenty of real-world examples where observed brightness changes with distance even if the source is not "so far away that it's effectively a point to your eyes, like stars in the sky":
* Turn on a lightbulb in one room and try to read a book in the next room, in direct line of sight of the light source. You can probably do it, but it will be really dim and tiresome.
* Try using a home projector on the wall of the next room; the picture will be really dim. More extreme: try it in a cinema and you could probably not even see the picture.
* Try to read a book in darkness with just the display light from a mobile phone; you need to have it right next to the page for it to work. Lift the phone 1m up and it will be too dim.

The point of HDR is also not to use a square display area to illuminate you. It is to have a small pinpoint part of the screen, like a sun or a lamp or a fireball, stand out from the average. Unless I calculate it wrong, a 27" at 50cm at 250 nits will equal a 55" at 2m at 1000 nits. (But a 27" at 1000 nits at 1.07m would equal a 55" at double the distance at 1000 nits, yes.)
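
For what it's worth, the total light reaching the eye scales with luminance times the solid angle the screen subtends, and under that rough model the two cases do land close together (a small-angle Python sketch; the function names and the 16:9 assumption are mine):

```python
import math

# Luminance (nits) x solid angle subtended ~ total light at the eye
# (arbitrary units, small-angle approximation, 16:9 panels assumed).

def screen_area_m2(diagonal_in: float) -> float:
    diag_m = diagonal_in * 0.0254
    k = math.hypot(16, 9)
    return (diag_m * 16 / k) * (diag_m * 9 / k)

def eye_flux(nits: float, diagonal_in: float, distance_m: float) -> float:
    return nits * screen_area_m2(diagonal_in) / distance_m ** 2

print(round(eye_flux(250, 27, 0.5)))    # 201 -- 27" at 50cm, 250 nits
print(round(eye_flux(1000, 55, 2.0)))   # 208 -- 55" at 2m, 1000 nits: close
```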
     
    IdiotInCharge likes this.
  26. geok1ng

    geok1ng [H]ard|Gawd

    Messages:
    1,985
    Joined:
    Oct 28, 2007
Just?? A true FALD is big news for a PC monitor.

Make the FALD scan, and the end result is about as good as an IPS panel without scanning.
     
    Armenius likes this.
  27. Desert Fish

    Desert Fish n00bie

    Messages:
    29
    Joined:
    May 30, 2016
Those examples are different: the light per area of what you are looking at changes. Not so for looking directly at something and varying the distance to it. Then the light that reaches your eyes is proportional to how much of your visual field the object covers.

Well, I wouldn't call them equal. Your eyes will see a difference. Larger highlights will appear half the size but brighter on the 55" (at 4x distance). But I think you are onto something. The perceived difference in brightness could be far smaller than the cd/m2 values suggest. It depends on the human visual system, though; it's not something you can determine with simple calculations.
     
  28. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
If the VA stays tight to 120Hz at 120fps or more (to 130 if lucky), that is still great no matter what they claim the max Hz is.
A VA triples the black depth and contrast of an IPS or TN, and FALD probably amps that up by an additional 30% or more - and on a high dynamic range (luminous, saturated color) display.

Your scene is blurrier (less clarity) below 100Hz at 100fps with its 40% blur reduction, and with VAs you are probably dropping off in clarity again beyond 120Hz@120fps or 130Hz+fps. Therefore you can dial in your settings and cap your frame rate to stay in the sweet spot as much as possible.

    ------------------------------------

80Hz at 80fps gives you a 25% blur reduction vs 60Hz@60fps, and gives a 1.33:1 motion definition increase (8 frames to every 6 shown at 60fps-Hz).

100Hz at 100fps gives you a 40% blur reduction vs 60Hz@60fps, and gives a 1.67:1 motion definition increase (5 frames to every 3 shown at 60fps-Hz).

120Hz at 120fps gives you a 50% blur reduction vs 60Hz@60fps, and gives a 2:1 motion definition increase (double the motion definition, including viewport movement of the entire game world in 1st/3rd person games).

130Hz at 130fps is a ~54% blur reduction vs 60Hz@60fps, and gives around a ~2.2:1 motion definition increase (about 13 frames to every 6 shown at 60fps-Hz).
... that blur reduction is, if the VA still stays tight enough at this point. Probably not much past 120fps-Hz, but we'll see.

144Hz at 144fps is a ~58% blur reduction and gives a 2.4:1 motion definition increase (12 frames to every 5 shown at 60fps-Hz).
... a VA's response time wouldn't be tight enough over 120 or 130, so this would in effect probably sink back to less clarity, like the rates below 120fps+Hz, i.e. outside of the "sweet spot" - like a bell curve, perhaps even like a cliff slope on the far end, depending on how bad the response time effects are at any given fps-Hz range.
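
The rule of thumb behind those figures, spelled out in Python (on a sample-and-hold display, blur scales with frame time, so blur reduction vs a 60fps-Hz baseline is 1 - 60/rate, and motion definition is simply rate/60; the function names are mine):

```python
# Blur reduction and motion definition vs a 60 fps-Hz sample-and-hold baseline.

def blur_reduction_pct(rate_hz: float, baseline_hz: float = 60.0) -> float:
    return (1.0 - baseline_hz / rate_hz) * 100.0

def motion_definition(rate_hz: float, baseline_hz: float = 60.0) -> float:
    return rate_hz / baseline_hz

for rate in (80, 100, 120, 130, 144):
    print(f"{rate} fps-Hz: {blur_reduction_pct(rate):.0f}% less blur, "
          f"{motion_definition(rate):.2f}:1 motion definition")
# 80: 25% / 1.33:1   100: 40% / 1.67:1   120: 50% / 2.00:1
# 130: 54% / 2.17:1  144: 58% / 2.40:1
```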

    ------------------------------------

So if you can achieve a 100fps-Hz average or more, then you can cap it at 120 or 130fps+Hz.
Even if you do a 100fps-Hz average across more like a 70 - 100 - 130 range, utilizing G-Sync for the fluctuations and lows, you'd still get a quite appreciable benefit.
That is a great spot to be on a high-Hz 4K (or 3440x1440) VA 1000-nit FALD HDR quantum-dot-filtered/P3 color monitor, with more than 3x the black depth and contrast of an IPS or TN to start with (and a lot more than that with FALD).

Realistically, most people would not be capable of even hitting a 100fps-Hz average (a 70 - 100 - 130 range for the most part) on the most demanding games without dialing graphics settings down a lot more than usual, especially at 4K - so the 200Hz range would not be pushed into in most cases, even at the high end of the graph.

In my opinion you'd be much better served dialing in the best graphics settings you can tweak to achieve a 100fps-Hz average (~70 - 100 - 130) or a 120fps average (roughly a 90 - 120 - 150 range, capped at 120 or 130) when possible. Try to keep the low end at least 70+ for the most part and cap the top at 120 or 130. You'd still have a few potholes and spikes in addition to the (sometimes abrupt) sinks down to 70Hz@70fps and back, so G-Sync is still valuable in this usage, preventing judder/stutter and stops.
     
  29. NamelessPFG

    NamelessPFG Limp Gawd

    Messages:
    371
    Joined:
    Oct 16, 2016
For all of you complaining about the size: this is exactly what I want, because now I can move my monitor back against the wall and still view it clearly at a distance. It also makes 100% DPI scaling at 4K sensible to use; even 42" is too small for that. And HDTVs suck at the very things I want out of a gaming monitor, especially for the ridiculous prices they generally go for.

You might wonder why the hell I'd want to do the wall-mount thing for PC gaming, and it's actually pretty simple: I'm trying to minimize desk depth to maximize the possible room space in this cramped computer room, where 8' square doesn't make for a very comfortable room-scale VR experience if you want to walk around and still be sure you won't hit something by mistake. In fact, the end goal is to remove the desk altogether; what I need is more akin to a chair flanked by cockpit control mounts that can be moved aside when I'm feeling that room-scale itch.

    It also means I can move back to the middle of the room, have a few friends beside me for local multiplayer of the sort usually only found on consoles due to the general expectation of living-room play, and everyone can have a good view of what's going on. Heck, I actually do intend to use my more modern consoles with it that way, too, seeing as the living room in this house isn't exactly conducive to long hours of PS3/Wii U/docked Switch gaming. My FG2421 feels a bit small for that, albeit usable.

    The only thing is that this is almost certainly going to destroy my wallet the same way a typical 65" HDTV that isn't garbage would. NVIDIA loves their profit margins.
     
    Armenius and IdiotInCharge like this.
  30. QuantumBraced

    QuantumBraced Limp Gawd

    Messages:
    379
    Joined:
    Nov 21, 2015
The more I think about it, the more I come to the conclusion that these are not usable at a desk, as much as I want them to be. And it's mostly because of the PPI, not the size. I think a 65" 8K would be usable at 40" away (within desk range); you just gotta train your neck muscles. It's like having several monitors on your desk.

For gaming, you would have to lean back a bit, but not too much. I can fit my 43" into my field of vision at 30" away no problem. For a 65", I'd lean back a bit more, but it would still work with a desk (maybe a slightly deeper one). But, again, the problem is the resolution. 4K is too little for 65" at desk range. 8K would be perfect with 125-150% scaling. But 8K is obviously nowhere near ready for prime time with 120Hz and HDR, and the hardware to maintain a decent frame rate doesn't exist. So I hope NVIDIA listens and releases 45" versions for people who actually want to put a huge monitor on their desks (a lot already have).
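
The pixel density numbers behind that argument (a quick Python calc; standard PPI math, nothing official):

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(3840, 2160, 65):.0f} PPI")  # 68 -- 4K at 65", coarse at desk range
print(f"{ppi(7680, 4320, 65):.0f} PPI")  # 136 -- 8K at 65", fine with ~125-150% scaling
print(f"{ppi(3840, 2160, 43):.0f} PPI")  # 102 -- the 43" 4K mentioned above
```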
     
  31. AORUS

    AORUS [H]Lite

    Messages:
    111
    Joined:
    Oct 31, 2017
    Nvidia 65" @4k with Gsync and 120hz I don't know why Nvidia calls it 'Gsync' it should be Esync (Expensive sync) I think a 43" with 8k @200hz would be better IMO, but having this big format on a desktop is way too big and bad for the eyes and neck you will have to sit at lease 42" away this is just graphics & marketing 'BS' only made for the console Fanboys not PC gamers in mind.
     
  32. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    4,653
    Joined:
    Jun 13, 2003
    ...what?

Yes, G-Sync is expensive! Well, more expensive, since there's dedicated hardware that brings more features than, say, FreeSync. FreeSync 2.0, which will bring some of those features, is also more expensive, because it also requires more dedicated hardware...

    ...and I cannot see where this is 'only made for the console Fanboys'- do any of the consoles that purport to play games at 4k support G-Sync over DisplayPort?
     
    Armenius likes this.
  33. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,165
    Joined:
    Nov 12, 2012
I think the point of these is that you sit at a normal monitor distance and use it instead of a triple-monitor surround setup. You use a high field of view and get a lot of peripheral vision. Your entire real-life FOV is supposed to be filled.
     
    NamelessPFG and mms like this.
  34. mms

    mms n00bie

    Messages:
    41
    Joined:
    Apr 7, 2017
I think the PS4 Pro and Xbox One X support 4K at 60Hz, so this BFGD won't be an issue for gaming, whether on PC or consoles.
I think 65" is the perfect size for everything, whether movies or gaming. Many people love their 65" 4K TVs, such as the Sony Z9D, X900E, and the OLEDs. I haven't seen anyone complain about 4K at 65", and you can enjoy all kinds of movies with good shadow detail and highlights.
Now you don't need to choose a monitor for the PC and a big screen for TV; you can do everything with one monitor, using it for gaming and movies at the same time.
All you have to do is choose the best distance for your eyes when you are gaming or watching movies, and use a small ultrawide for surfing the internet and productivity.
I'm very happy to see NVIDIA making this great thing: TV features and PC monitor features in a giant PC monitor. Let's let technology take its route to the best.
It's human nature that we have to criticize everything; whenever we see something new, everyone begins to list its disadvantages and forgets its advantages.
     
  35. QuantumBraced

    QuantumBraced Limp Gawd

    Messages:
    379
    Joined:
    Nov 21, 2015
    You do that with a 43". 65" is too big and the PPI is too low.
     
  36. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006
    I watch my 70" 4k (low density FALD) VA TV on my sectional couch with a couchmaster couch desk at 8' away to my eyeballs. It does 4k 60hz and can do 1080p at 120hz native on modern gpus. Though it is mainly for streaming movies and youtube, etc off of my shield - It has low input lag and I've played a few games on it on occasion, including the witcher 3. I think 8' away is perfect. In fact, for the money this is going to cost, It'd be better at 70". At 65" this would actually be a size downgrade for me in regard to my living room.

    I wouldn't consider using this *at* at desk, but the room setups with a desk facing AWAY from a wall instead of stuffed up against the wall like a bookshelf, with the giant display on the other wall, would work out great.

This is how NVIDIA had it set up, distance-wise:

[IMG]


Though the CGI scene example in their promo video looked more like this distance, the actual virtual camera of the scene was further away, in order to see the whole screen.
     
    Last edited: Jan 13, 2018
    mms and IdiotInCharge like this.
  37. geok1ng

    geok1ng [H]ard|Gawd

    Messages:
    1,985
    Joined:
    Oct 28, 2007
Good luck running 4K 120Hz at more than 3 feet of distance, or using a wireless "gaming grade" keyboard and mouse for FPS. Big format looks great on paper, but be ready for some compromises.
     
  38. AdamK47

    AdamK47 Limp Gawd

    Messages:
    272
    Joined:
    Feb 7, 2004
    This is my computer chair. My home PC doesn't require a desk.

[IMG]
     
    Armenius, elvn and Kyle_Bennett like this.
  39. AORUS

    AORUS [H]Lite

    Messages:
    111
    Joined:
    Oct 31, 2017
I think gamers are missing the point on this new NVIDIA big format: 120Hz on a single GTX 1080 Ti is not going to cut it. I think this 120Hz is for a setup with two or more cards, not one, because at 4K even a 1080 Ti - or a Titan V - can only really run 60Hz. But we will see when it comes out this summer.

Don't build up too much hope; it all looks very good on paper and in the video graphics made by their marketing team.

Also, I am all for new technology, but only when they get it right or support it. And true 4K is 4096x2160, not 3840x2160, NVIDIA.
     
    Last edited: Jan 13, 2018
  40. elvn

    elvn 2[H]4U

    Messages:
    2,791
    Joined:
    May 5, 2006

I can run USB cables around the back wall in conduit, or even down to the basement ceiling and back up. The Couchmaster couch desk setup comes with USB 3.0 cables and a USB 3.0 hub too.

[IMG]


Other than that, what is the problem with using a big screen more than 3' away? Visually it would work a lot better further away. At 8' away, my 70" is bordering on requiring me to tilt my head to focus on the extremes. Unless you mean the video cables?

You'd keep the computer near the screen (like consoles and other media devices are) and run the USB cable around the back wall.
You could alternately deploy a no-trip floor cabling strip directly across to your seating position (or at least pull it out from under the entertainment center when switching to wired gaming peripherals for gaming sessions).

An 8' viewing distance to your eyeballs means you could probably do a 16' USB cable run (maybe a bit more if not using a no-trip strip directly across), feeding it around the back wall in some conduit or some sort of wire-management shielding through to under the couch. Or you could feed it down behind the wall behind the entertainment center, through the basement rafters, and back up behind the couch, drilled through the moulding at the base of the wall, or via a wall plate if you want to get fancy.

You can run another USB cable, or utilize the hub, for a G29 steering wheel and use one of those metal pop-up stands they mount on.. You could also use an Xbox One controller, wired or wireless, for a lot of fun games.

Not every game is extremely demanding on GPUs, and some can be dialed down a bit without losing too much. I agree 4K is GPU-crushing on demanding games though. See my previous posts about the 100Hz @ 100fps average as a minimum target for gaming on high-Hz displays. In fact, for people with moderate GPU budgets, a high-Hz 1080p display where you can crank up all the graphics settings is probably a much better idea. You could alternately run a lower resolution on a high-Hz monitor like this for the most demanding games and rely on scaling if you had to, or even letterboxing, depending on how much smaller you go.
     
    Last edited: Jan 13, 2018