OLED Gaming Displays

Discussion in 'Displays' started by Vega, Aug 6, 2018.

  1. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,607
    Joined:
    Jul 9, 2012
    Personally, with HDMI 2.1 adding Variable Refresh Rate to the mainline specification, I'm hoping displays get away from fixed refresh rates entirely.
     
  2. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    I said that the benefits of high hz + high fps are not just a twitch-shooting advantage - they make a very real aesthetic difference.
    That was the main point of my post. While people continually post that it's just for twitch shooters and ignore or dismiss its massive aesthetic benefits in games, I'll keep posting examples of what they are ignoring.

    The graphics I post are the ones I took the time to find that give some glimpse into what people are missing out on or trading off, since you can't "see" it without a high hz monitor running high frame rates. If I find examples that relay that information more easily I'll save them and start posting those :b Unlike a still-frame screenshot wallpaper showing off high-resolution 4k, you can't show high hz + high frame rate, or HDR, or the high contrast and black depth of OLED, etc. to someone without the hardware to see it.

    -----------------------------------

    You can surely enjoy gaming on 60hz screens and consoles and handhelds. You can enjoy gaming at 1080p resolution too.

    By going 4k at 60hz or 60fps currently, you are losing one of the biggest benefits of pc gaming since 2009: the huge aesthetic benefits of high hz. Running a 60hz-max 4k monitor or a 60fps average, you're actually, somewhat ironically, dropping the image clarity (and motion definition) of the entire viewport in 1st/3rd person games where you use movement keys and mouse-look at speed - from glassy motion and softened viewport contents during movement down to slideshow motion and smearing viewport movement.

    People gaming on 60hz OLED tvs currently are going even further and dropping g-sync/VRR capability. Luckily, in the future (perhaps 2019 LG tvs) there will be hdmi 2.1 displays with 120hz native 4k HDR, VRR (variable refresh rate), and QFT (quick frame transport, for low input lag gaming)... and hopefully support for those features in future, more powerful gpu generations.


    We are just at the point where a single 1080 Ti can run decent enough frame rate averages to get appreciable benefits of high hz at 2560x1440 on demanding games. Personally, I've had 120hz+ gaming (with enough fps to benefit from it) since 2010 and VRR/g-sync since 2014, and I'm not willing to drop either.




    --------------------------------------------------------------------

    I agree g-sync is great.

    Regarding backlight strobing / ULMB mode, however, the graphics-settings and other tradeoffs are huge. I doubt ULMB will work with HDR going forward, either.


    As per blurbusters.com's Q&A:
    -----------------------------------------------------
    Q: Which is better? LightBoost or G-SYNC?
    It depends on the game or framerate. As a general rule of thumb:
    LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
    G-SYNC: Better for games that have lower/fluctuating variable framerates.
    This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.
    G-SYNC monitors allow you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
    ......
    Main Pros:
    + Elimination of motion blur. CRT perfect clarity motion.
    + Improved competitive advantage by faster human reaction times.
    + Far more fluid than regular 120Hz or 144Hz.
    + Fast motion is more immersive.
    Main Cons:
    – Reduced brightness.
    – Degradation of color quality.
    – Flicker, if you are flicker sensitive.
    – Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

    --------------------------------------------------------
    During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
    --------------------------------------------------------
    Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
    -------------------------------------------------------
    If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate during VSYNC OFF operation. There can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second (see the quick sketch of that math after this quote).
    Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
    Alternatively, use Adaptive VSYNC as a compromise.
    -------------------------------------------------------------
    Pre-requisites:
    • Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz). LightBoost motion blur elimination is not noticeable at 60 frames per second.
    --------------------------------------------------------------
    Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

    Computer Factors That Hurt LightBoost

    • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
    • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
    • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
    • Faster motion benefits more. Not as noticeable during slow motion.
    • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
    • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.
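
    To make the beat-frequency stutter math above concrete, here's a quick sketch (my own illustration in Python, not anything from Blur Busters):

    # With VSYNC OFF, a frame rate close to but not equal to the refresh rate
    # slips against the scanout, producing roughly |Hz - fps| visible
    # "harmonic" stutters per second.

    def stutters_per_second(fps: float, hz: float) -> float:
        """Beat frequency between frame rate and refresh rate."""
        return abs(hz - fps)

    print(stutters_per_second(119, 120))  # 1.0 -> one stutter per second, as quoted
    print(stutters_per_second(117, 120))  # 3.0 -> three stutters per second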
     
    Last edited: Aug 13, 2018
  3. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    I don't disagree, and I can see the extra frames bringing smoothness to everything, not just games. However, for lots of cinematic games, 60fps is already well beyond the consoles' 30fps or movies. There are plenty of games where I'd rather have real 4k at max settings and 60fps, like the [H] graph I showed, than 144hz or whatever.

    Of course, it would be awesome to never have to choose what compromise to make, but realistically, if a 1080 Ti could do 4k@144hz in every game, we'd just start talking about how much motion blur there is and how we need 200/240hz or more. 60fps isn't perfect, it's just the benchmark fps for PC gaming. At 4k, the pixel pitch is finally getting small enough that increases in resolution matter less; at 8k I expect it to more or less max out. Can't wait for my 8k@240hz 42" HDR display someday, and whatever GPU is required to drive it.

    For now it's moving goalposts and individuals deciding what compromises they want to make. I have no issue with someone gaming at 1080p@144hz or 4k@60fps. Find the settings that make you happy; games do vary. Just don't tell everyone else they're wrong for choosing a different compromise than you. Saying you can't game at 4k on a 1080 Ti is just silly; you can, [H] has shown it, but you make different compromises than at 1080p@144hz.
     
  4. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    The problem you face isn't technical. The loudest advocates of ultra-high refresh monitors, and the people manufacturers market them to, are twitch gamers, not everyone else. They go so far as to use TN panels, which have awful colors, to get there. Some day we may not have to make those compromises, but for now I haven't seen a high refresh rate monitor outside of OLED that looked even remotely good.

    Thus, when people point out that currently high refresh rate is primarily for twitch gaming, they are correct. I do wish it weren't like that.
     
  5. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    Sure, they are heavily marketed "TO" twitch gamers, with plenty of marketing speech about kills and annihilating your competition, etc.

    However, people in discussions like these often use "FOR" twitch gaming to dismiss how great the aesthetic benefits of running higher fps on a high hz monitor are, as if it's not a night and day difference. It is a huge difference in visual eye candy.



    Marketed heavily to, true, but not exclusively.
    The people buying 4k 120hz FALD HDR monitors care about color (even into HDR color volume), contrast/black depth, and across-the-board aesthetics, including the motion clarity and motion definition of a high hz g-sync monitor, and they are apparently willing to pay a lot for it.



    Knowing that in 2019 there should be an LG 4k HDR OLED tv with hdmi 2.1 (120hz native input at 4k, plus VRR and QFT support) is making me skip the $2k+ price tag on those FALD monitors, even if I have to wait awhile for a gpu that supports VRR.

    Someone supposedly did a workaround recently to make freesync work off of an nvidia card, though, so who knows. If it works 100% and lasts without being blocked, maybe there is hope for an hdmi VRR gpu sooner than I'd thought. The Xbox already supports VRR over hdmi 2.0b, incidentally.
     
    Last edited: Aug 27, 2018
  6. Ryom

    Ryom [H]ard|Gawd

    Messages:
    1,615
    Joined:
    Oct 11, 2006
    I just think it's funny that I can get a 55" 2017 OLED for $1100 but can't get a desktop-sized display for anything less than multiples of that! I can't even get a 37" HDTV OLED to run as a monitor; they all come 55" or larger :p
     
  7. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    I hear you, but I'd pay over $2000 for a 2019 55" LG 4k HDR OLED w/ HDMI 2.1 + 4k 120hz native input + VRR + QFT before I'd buy a $2000+ 27" FALD IPS, even if I have to rearrange my pc room to face the desk toward the tv, farther away.

    I think the first 55" LG oled was $3500 in 2014, and they were around $2300 in 2016, so new model lines can be pricey at release.

    I'm saving for the 4k 120hz VRR OLED. If those FALD HDR monitors had come out 2-3 years ago, it might be different.
     
    Last edited: Aug 27, 2018
    Stryker7314 likes this.
  8. datum

    datum n00b

    Messages:
    51
    Joined:
    Aug 27, 2018
    Higher cost and lower runtime?
    Hm, I don't think so.

    A solution in search of a problem. I am happy with the PQ of current machines; I just want the refresh rate as high as possible, 240 Hz or higher.
     
  9. Lepardi

    Lepardi Limp Gawd

    Messages:
    205
    Joined:
    Nov 8, 2017
    120Hz is starting to become the normal "office" standard even in phones because of its better feel. They have OLED too. Phones get all the good stuff while we're still stuck in 2005 :p

    I don't see why 120Hz shouldn't be the standard for office monitors as well; it allows smoother operation even in the desktop UI.
     
    Last edited: Aug 28, 2018
    Armenius likes this.
  10. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,842
    Joined:
    Sep 7, 2011
    This is correct. I can tell the difference between 60, 75, 120 and 144Hz just in how moving windows around feels... as the refresh goes up, it's so silky buttery smooth, like everything is coated in lube. It's GREAT.
     
    kalston and Armenius like this.
  11. kasakka

    kasakka [H]ard|Gawd

    Messages:
    1,173
    Joined:
    Aug 25, 2008
    The 8-bit TN panels found in some high refresh rate displays are far from awful. I've been using an ASUS PG278Q for years and it has been wonderful. It has no issues like dithering and barely any vertical color shift. I am fully aware of how awful the TN panels in cheap laptops and monitors are, and this one is nothing like those. It's accurate for the sRGB color space, which is what most content uses. It won't do those oversaturated colors some love on wider-gamut displays.

    The pros are of course the lowest pixel response times, but to be honest, with advancements in IPS displays I don't think I would notice.

    I agree with this. On console games that have the option for 1080p at 60 fps vs 4K (or checkerboard version) at 30 fps I often opt for the 4K option instead because it looks better. These are usually games that don't require super fast reflexes and are very story and visual focused like the new God of War.

    On PC I aim for 60 fps at 1440p with my current GPU, but with G-Sync I don't care too much if it varies around that figure. I think the best thing PC gamers could do is turn off their damn fps counter software. Seeing that number in the corner will mentally fuck you up as you become more aware of framerate differences than you would be without it. While you can feel the difference between 120 and 60 fps, I don't think it's anywhere near as jarring as 30 vs 60 fps or dropping under 30 fps.

    With the compromises we have to make to get our desired playing experience, we need better ways to handle resolution. Checkerboard rendering, integer scaling, and user-configurable dynamic resolution scaling (e.g. "scale between 1440p and 4k") would help tremendously, especially now that we are entering a time where ray tracing means huge framerate drops until hardware catches up.
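
    Something like this is what I have in mind for user-configurable dynamic resolution scaling - a rough Python sketch (the names and numbers are my own illustration, not any engine's actual API):

    # Pick the next frame's render height inside a user-set window (here
    # "scale between 1440p and 4k") so the measured frame time tracks the
    # refresh target.

    TARGET_FRAME_MS = 1000.0 / 120   # aiming for 120 fps
    MIN_H, MAX_H = 1440, 2160        # user-configured bounds

    def next_render_height(current_h: int, last_frame_ms: float) -> int:
        # GPU cost grows roughly with pixel count, so height scales with the
        # square root of the (target / measured) frame-time ratio.
        ratio = (TARGET_FRAME_MS / last_frame_ms) ** 0.5
        return max(MIN_H, min(MAX_H, int(current_h * ratio)))

    # e.g. a 2160p frame took 11.2 ms (~89 fps): scale down toward ~1860p
    print(next_render_height(2160, 11.2))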
     
    kalston and Armenius like this.
  12. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    18,152
    Joined:
    Jan 28, 2014
    Not necessarily. 240 Hz displays certainly do use poor-quality TN panels these days, but they are often 1920x1080 and marketed directly at the twitch gamer who refuses to play at anything more than 800x600 with the mistaken perception that it gives them an advantage. 120 Hz is becoming ubiquitous in nearly every display market segment because the advantages outside of gaming are apparent. Even most gaming displays with high refresh rates are using IPS or IPS-type panels these days. The panel in the PG27UQ is one of the best LED LCDs I have ever seen.
     
  13. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    The PG27UQ's FALD makes up for what would otherwise be a few issues... and you can just leave it on dynamic for SDR content, so the end result is what's important.

    The white level, black depth, and contrast are great on those.

    You do go back up to a 9.3 ms response time at 60fps-hz on it (combine that with the sample-and-hold blur at 60fps-hz or other low frame rate averages). You only get the 6.6 ms or 5.x ms response times on the PG27UQ and the PG279Q IPS screens when you are at very high hz + high fps, which can be impossible to achieve on demanding games at very high to ultra settings at 4k resolution. It is definitely the best monitor out right now, but IMO the price/size and the upcoming 2019 hdmi 2.1 120hz native 4k HDR OLED TVs with VRR and QFT move my $2k budget elsewhere. I can be patient and pick my battles sometimes :)

    TFT central regarding the PG27UQ

     
    Last edited: Aug 28, 2018
  14. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    I personally can't wait for 120/144hz to become ubiquitous. 4k HDR 100/120/144hz panels are very new. I can see the effects of a low 60hz panel, I can see the effects of a low resolution panel, and I can see the effects of a panel with poor color and dynamic range. For now it's a tradeoff and depends on what games you play and your preferences. Personally, I can tolerate 60hz better than I can tolerate low resolution in most cases. That includes the desktop and most games I play. 60hz is still a huge increase over consoles or movies!

    Edit: this all started with people lamenting that you can't play games at 4k, when you can. It's just a different set of compromises depending on your preferences; resolution is one of them. The monitor with greater specs than your eyes doesn't exist quite yet, but we're getting closer.
     
    Dahkoht likes this.
  15. Dahkoht

    Dahkoht Limp Gawd

    Messages:
    432
    Joined:
    Jan 2, 2014
    Very much this for me.

    I've been gaming on BenQ 3201s for a while now, and while more than 60hz would be nice, anything smaller than 32" seems tiny to me, and lower res than 4k bothers me greatly too. Everyone has their individual tastes, and there's nothing out there right now that covers all the bases.
     
    LuxTerra likes this.
  16. JRUHg

    JRUHg Limp Gawd

    Messages:
    385
    Joined:
    Jan 5, 2016
    Interesting discussion on OLED gaming displays.
     
  17. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    I may be interested in the Samsung QLEDs due to OLED burn-in concerns in the back of my mind. The Samsung "QLEDs" and the LG OLEDs are very good 4k HDR monitors already; they just need hdmi 2.1 bandwidth, VRR, and QFT.

     
    Last edited: Sep 4, 2018
  18. Ziran

    Ziran Limp Gawd

    Messages:
    142
    Joined:
    Mar 26, 2015
    You are living in a golden era of display technologies

    The progress made in the last 15 years is absolutely astonishing. 15 years ago I was using a monitor that was not that different from what was available 30 years ago.

    Today I have a huge display wall with lifelike colors, no distortions, no artifacts, smearing, or noise (it is in fact perfect). It does not tire my eyes or irradiate my face when I watch it all day. It is light and power efficient. And I can get it new for less than $2000.

    4k@120hz will happen one day, but it will be a very minor bump in display experience and not the nirvana you imagine it to be.
     
  19. bigbluefe

    bigbluefe Gawd

    Messages:
    648
    Joined:
    Aug 23, 2014
    No smearing? Bullfuck.
     
    IdiotInCharge likes this.
  20. HiCZoK

    HiCZoK Gawd

    Messages:
    820
    Joined:
    Sep 18, 2006
    What? Golden age? Literally everything after CRT and plasma is worse, except OLED.
     
  21. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    FW900 CRTs had essentially zero blur, low input lag, and great screen uniformity decades ago. LCD was a huge step down for moving content back when I got one, to the point where for years I kept an LCD and a FW900 at the same desk, up until several years ago. I also had a Sony XBR960 34" widescreen tv with hdmi input for years. Their size most of all, plus the "crispness" of their pixels to a degree, geometry management issues, and their overall age and support, now make them a bad choice for me.

    120hz at high frame rates is a HUGE increase in display experience, especially for 1st/3rd person gaming aesthetics (and even for watching sports, if it were recorded and transmitted at high frame rates).

    In 1st/3rd person games you are constantly moving your viewport around at speed, so it's not just a simple flat-colored bitmap UFO-test object smearing. The entire viewport and game world (high-detail textures, depth via bump mapping, etc.) is smearing in relation to you during movement-keying and mouse-looking. 120fps at 120hz cuts sample-and-hold blur by 50% and doubles your motion definition and motion path articulation, increasing to glassy smoothness (more dots per dotted line; twice the unique animation cells in a flip book paging twice as fast, so to speak). 100fps-hz cuts sample-and-hold blur by 40% and gives a 5:3 motion definition improvement (10 unique frames shown at 100fps-hz for every 6 shown at 60fps-hz).

    At a 60fps or 60hz cap you get smearing blur during viewport movement. At 100-120fps on a high hz monitor you cut that down to more of a soft blur within the "shadow masks" of everything on the screen - within the lines of the coloring book, so to speak. Modern gaming overdrive and low response times help mitigate this, or it would be much worse.
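
    If you want the arithmetic behind those percentages, here's a tiny sketch (my own illustration): on a sample-and-hold display each unique frame persists for 1000/fps milliseconds, and the perceived blur scales with that persistence.

    # Compare sample-and-hold persistence at various fps-hz rates to 60fps-hz.

    def persistence_ms(fps: float) -> float:
        """How long each unique frame stays on screen (ms) when fps = Hz."""
        return 1000.0 / fps

    def blur_cut_vs_60(fps: float) -> float:
        """Fraction of 60fps-hz sample-and-hold blur eliminated."""
        return 1.0 - persistence_ms(fps) / persistence_ms(60)

    for fps in (100, 120):
        print(f"{fps}fps-hz: {blur_cut_vs_60(fps):.0%} less blur, "
              f"{fps} unique frames for every 60 at 60fps-hz")
    # -> 100fps-hz: 40% less blur (a 100:60 = 5:3 motion definition improvement)
    # -> 120fps-hz: 50% less blur (120:60 = 2:1)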

    Variable refresh rate is another HUGE bump in display experience. It allows you to straddle or at least dip well into those higher hz and frame rate ranges in a frame rate graph that has spikes, dips, and potholes without experiencing judder, stutter, stops, or tearing. So you can tweak your graphics settings higher for a better balance and avoid the bad effects of v-sync or no-syncing at all.

    There are a few tvs that can interpolate 60hz to quasi-120hz even with gaming mode active without adding a ton of input lag, but it still jumps from 15ms up to 20ms on the best Samsung LCD screens, and most add a lot more. Interpolation does reduce blur in a way, but it does not add more actual frames of action; it's more like repeating a frame and floating it, which can give an odd effect... and it is usually not without artifacts, dark halos, etc.

    The backlights and uniformity of general LCD tech are terrible, the black levels are really bad (especially on IPS and TN, which are usually 880:1 to 980:1 contrast), and the blur is at best a soft blur at high fps + high hz and at worst smearing blur on typical 60hz LCD tech, and even at lower frame rates on a high hz monitor. A good FALD VA can help the black levels and contrast a lot, up to 5000:1, 8000:1, or much more with a denser FALD array. But especially with HDR going forward, the halo glow and/or dimming of areas is a big issue, since each FALD backlight zone is huge relative to pixel size, and HDR often puts extremely bright, full color volume highlights and edges right next to darker scene sections or inky blacks, in a dynamically changing and panning scene at that.
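
    To put rough numbers on how huge the zone-to-pixel ratio is, here's a quick sketch (the 384-zone count matches the PG27UQ class of FALD monitors; the 24x16 grid layout is my assumption):

    # Every FALD zone covers tens of thousands of pixels, so even a tiny
    # bright HDR highlight lights up a whole patch of screen -> halo glow.

    H_PIX, V_PIX = 3840, 2160   # 4k panel
    ZONES = 384                 # FALD zone count (PG27UQ class)

    print(f"{H_PIX * V_PIX / ZONES:,.0f} pixels share each backlight zone")   # 21,600
    print(f"each zone ~ {H_PIX // 24} x {V_PIX // 16} pixels")                # 160 x 135, assuming a 24x16 grid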


    OLED would be a great pc gaming leap once it gets hdmi 2.1 120hz 4k + VRR, except for the fact that the organics degrade over time no matter what and, more importantly, carry a risk of permanent burn-in. The screens shift to lower brightness modes, use screen/pixel saving methods, and limit peak brightness for a reason. From what I've read they are usable for thousands of hours without issue, but some static content increases the risk (in Rtings' "real life" OLED burn-in test, the CNN logo seems to be the worst)... and there would always be that fear in the back of my mind after spending $1200 - $2600 on an OLED for PC. It's been 30 weeks of testing over at Rtings. I'm still on the fence, but I won't be in the market until 2019's hdmi 2.1 LG OLEDs are out anyway. The other issues with current LG oleds are lower-than-optimal HDR peak brightness and some banding in large areas of similar color, which loses detail if you turn on the mpeg noise reduction feature to reduce it.
     
    Last edited: Sep 5, 2018
  22. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    A few of the most relevant quotes I found in the discussions, rather than just rewriting in my own words:

    https://www.rtings.com/tv/learn/real-life-oled-burn-in-test/discussions

    -----------------------------------------------------------------------------------------------------


    https://www.rtings.com/tv/discussio...-was-wondering-how-burn-in-works-with-oled-so

    https://www.rtings.com/tv/discussio...nt-red-uniformity-issues-for-the-fifa-and-ncb

    https://www.rtings.com/tv/discussio...tnite-on-the-attached-xbox-one-s-2-hours-at-a

     
    Last edited: Sep 5, 2018
  23. euskalzabe

    euskalzabe Gawd

    Messages:
    987
    Joined:
    May 9, 2009
    I'm enjoying this thread very much, but I gotta say, whenever I read posts like "4K 144HZ GSYNC HDR OR BUST" I can't help but roll my eyes.

    20 years ago, I was playing on crappy 15" CRTs at 640x480 @ 60hz and felt like that was glory. SDR, garbage refresh rates with our game hardware back then, lol at adaptive-sync anything... and it was GREAT anyway. Now I'm using a frigging 4K 40" display. Sure, I wish I didn't have trouble with some games when trying to force 21:9 on it, but you know what? Anything above 1080p60 is freaking amazing, and you should be glad that's baseline quality these days. But it's all a matter of perspective.

    Sometimes I force games to 640x480 to scratch the nostalgia itch of super blurry graphics... and if you want a laugh, try 320x240... that was AMAZING because it finally ran SMOOTHLY back in the day before GPUs, or even earlier on my Voodoo Banshee :D
     
  24. elvn

    elvn 2[H]4U

    Messages:
    3,255
    Joined:
    May 5, 2006
    People also demand a lot from games, but there are still indie titles and platformers out there, so it's not all like that. You can still play a lot of demanding games really well on a 1080p screen with a halfway decent gpu, turning the graphics settings up while getting really high frame rates on a high hz monitor, if you choose to. Compared to 800x600 and 1024x768 17" monitors and the graphics of old, that would be mind-blowing, but that's not the point. A lot of technology would be mind-blowing to people in the past :b

    Unless you called someone, if you wanted to post in a discussion you'd have to drop something in a mailbox writing to a magazine or a newspaper, or find a physical public bulletin board to pin your note to, and wait a long time for a reply, if you got one at all. You'd type with an ink-slamming hammer typewriter. Back in my day we had Pong on green ray tubes, not even black and white! Eventually, low-baud modems dialed up ASCII bulletin boards on a single voice phone line late at night to download tiny games at incredibly slow speeds... and to play text games! TEXT stories and text RPGs (MUDs). 5.25" floppies. Porn was mostly magazines; you had to go to a shady grocery store to get one, or sneak a peek at someone else's stash if you were a kid. RADIO and TV on crunchy cardboard-cereal-box-sounding speakers. We had our phones on 12' tangled cables tethered to a wall, because they weren't even wireless with big antennas yet! We had no answering machines for a long time! What was a microwave oven? No cable tv or any kind of huge selection of your own chosen content, outside of planning it out with a TV guide <--- that's a little magazine, not a guide on the TV! Three to five channels, one being PBS. Convex curved small screens with signal noise picture interference, and only one color tv in the house.

    Time moves on. People demand better screens, reliable cellular and home internet service at decent speeds, etc. Many of us know the display industry can do better. High-end CRTs were better in many ways, and other tech and potential inventions were sat on for a long time or abandoned so manufacturers could milk consumers with inferior cheap LCDs longer. We have made some advancements since 2013, especially with higher hz, variable hz, and higher resolution screens (2560x1440 mostly), with prices dropping to $600 - $800 or less for a full-featured one for awhile, rather than the $1100 - $1600 of the old 2560x1600 and 2560x1440 ips screens before the Korean B-grade knockoffs hit and 2560x1440 became more common. We are on the verge of 4k 120hz, denser FALD arrays in gaming-capable screens, and a more standardized VRR now. Samsung, and perhaps Apple, are taking the first steps toward micro-LED displays in the years to come too. But for a long time display tech was more or less stagnant. Now consoles like the Xbox and amd gpus are supporting the variable refresh rate hdmi standard even on hdmi 2.0b, and there are tvs with HDR 1000 FALD arrays, deep black depth, near-perfect high-volume color, VRR, and low input lag at 55" and larger. Meanwhile nvidia is trying to force g-sync at a huge markup on what would already be expensive displays, 4-5 months or so before hdmi 2.1 hits (sometime in 2019 supposedly), and will likely keep a wall up rather than supporting hdmi 2.1 VRR on their gpus.


    When people say "OR BUST" about this currently, I think we mean that we know the page is turning and hdmi 2.1 120hz 4k HDR VRR is just around the corner. No matter what, they won't be very cheap, so holding out long enough is a wise move considering the timeframe. There are TCL 4k tvs that work great as desktop/app monitors (and even for 60hz gaming) for $240 - $280, and decently priced g-sync gaming monitors (or used ones) in the meantime if you need something "right now", without blowing $1k - $3k+ on something before hdmi 2.1. People here probably already have a decent gaming screen for the time being anyway. Personally I'm all about picking my battles and waiting for things to be ripe, especially with the $1k - $3k price tags of living room tvs or heavy prices on computer hardware upgrades. That kind of money is nothing to roll your eyes at to me personally, and it demands quality and features.
     
    Last edited: Sep 6, 2018
    kasakka likes this.
  25. kasakka

    kasakka [H]ard|Gawd

    Messages:
    1,173
    Joined:
    Aug 25, 2008
    Most people did not play at 60 Hz, as that is quite horrible flicker on a CRT. I remember most displays doing 75-100 Hz for everything but resolutions unusable for the screen sizes at the time. I have a TV-studio CRT at home for use with a Raspberry Pi and emulators or old DOS games, and those games benefit a lot from CRT tech. They look more vivid, and the low resolution is less jarring due to the small screen size and the slight smoothing the technology causes.

    By comparison, LCD display tech is only now entering a time where a lot of displays have all the goodies needed for a great gaming experience: high resolution, high refresh rate, and low input lag and response times.

    That said, a lot of people are way too dogmatic about their game performance. The most recent game I've been playing is God of War on the PS4. I opted for the "favor resolution" option, which means 4K checkerboard at 30 fps, over the "favor performance" option, which is unlocked 30+ fps at 1080p. The game is immensely detailed, so for me the higher res was worth the tradeoff in framerate, considering it doesn't run at a constant 60 fps at 1080p either. People need to turn off their framerate counters and just enjoy games on their chosen platform(s). While PC gaming is known for being the best in terms of performance and visuals, it now comes at a very hefty cost if you want 4K @ 60+ fps. At this point, with GPU prices at an all-time high, I just hope we can at least get some smaller options from TV manufacturers so we don't have to buy hugely expensive desktop monitors just to get high refresh rate 4K.
     
    vegeta535 likes this.
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,346
    Joined:
    Jun 13, 2003
    No.
     
  27. euskalzabe

    euskalzabe Gawd

    Messages:
    987
    Joined:
    May 9, 2009
    A) I loved everything about your post.
    B) I'm aware I have a historical perspective that I can't forget, so I still judge my current equipment through the eyes of 16-year-old me.
    C) Indeed, there is zero point in buying anything that's not HDMI 2.1 or 1 billion colors. It seems both of these things are going to explode in a big way in 2019; manufacturers know this and are trying to clear inventory any way they can with overpriced 6-bit+dither crap panels.
     
    LuxTerra likes this.