OLED Gaming Displays

Discussion in 'Displays' started by Vega, Aug 6, 2018.

  1. Vega

    Vega [H]ardness Supreme

    Messages:
    5,703
    Joined:
    Oct 12, 2004
  2. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
  3. Vega

    Vega [H]ardness Supreme

    Messages:
    5,703
    Joined:
    Oct 12, 2004
That doesn't look like a very reliable source, especially as translated. There was an ASUS rep at CES who said on video something more like "around" $1,000.
     
  4. thelead

    thelead [H]ard|Gawd

    Messages:
    1,986
    Joined:
    May 28, 2005
Why would they ruin that OLED panel with an AG film? Even true blacks look gray when light reflects off it.
     
  5. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
Personally, given that it's JOLED, I'd trust a Japanese-language source in a more recent Computex article over a much older statement from ASUS, but shrug, I guess we'll see.

    If it is $1000 and available in October I will definitely be picking one up, but I'm skeptical -- even limited to 22", a production $1000 PC OLED display would be a huge game changer and yield coup for JOLED. $5K pilot price and $1.8K once mass production settles in sounds a lot more realistic to me.
     
  6. bigbluefe

    bigbluefe Limp Gawd

    Messages:
    444
    Joined:
    Aug 23, 2014
    It's pointless if it ships without G-Sync.

Really, it's time for these shitty companies to pony up: 4K, 32", variable refresh, 120 Hz, low input lag, and either GOOD mini-LED or something better.

Not buying another monitor until all those boxes are ticked. They can fuck off in the meantime.
     
  7. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,519
    Joined:
    Sep 7, 2011
OLEDs are sure taking their sweet ass time popping up in a gaming-focused screen...
     
  8. Lepardi

    Lepardi [H]Lite

    Messages:
    117
    Joined:
    Nov 8, 2017
    4K goes straight into the trashcan in a 120Hz+ monitor.

Reality is that even a 1080 Ti can only guarantee 120+ FPS at 1080p. At 1440p, nope. 2160p is just a distant dream well into the 2020s.

A 30" 1440p 240-480 Hz OLED would be awesome; the PPI is good enough. Heck, OLED response times would easily allow even 960 Hz refresh rates.
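To put rough numbers behind the resolution claim (my own arithmetic, not from the post): GPU load scales roughly with pixels pushed per second, and 4K is a much bigger ask than it sounds.

```python
def pixel_ratio(res_a: tuple, res_b: tuple) -> float:
    """How many times more pixels per frame res_a pushes than res_b."""
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

# 4K is 4x the pixels of 1080p and 2.25x the pixels of 1440p,
# so a GPU averaging 120 fps at 1440p lands near 50-55 fps at 4K
# if performance scaled linearly with pixel count (it roughly does).
print(pixel_ratio((3840, 2160), (1920, 1080)))  # 4.0
print(pixel_ratio((3840, 2160), (2560, 1440)))  # 2.25
```

The linear-scaling assumption is a simplification (CPU-bound scenes scale less), but it matches the benchmark numbers quoted later in the thread reasonably well.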
     
    Stryker7314, Krisium and jnemesh like this.
  9. ors

    ors Limp Gawd

    Messages:
    140
    Joined:
    Jan 30, 2016
No it doesn't. They just need to add integer scaling to the display and the problem is solved: you can play at 1080p until GPUs catch up, while the desktop stays at 4K. Best of both worlds, and once GPUs do catch up you already have the monitor for them...
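For anyone unfamiliar with the term, this is all integer scaling is: each source pixel is duplicated into an exact N×N block, so a 1080p frame maps perfectly onto a 4K grid (2× per axis) with no interpolation blur. A minimal sketch using NumPy as a stand-in for what a monitor's scaler would do in hardware:

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor integer upscale: duplicate each pixel into a
    factor x factor block. No interpolation, so edges stay razor sharp."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# A 1920x1080 frame scaled 2x fills a 3840x2160 panel exactly.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```

Because the mapping is exact, no detail is invented or smeared, which is why 1080p-on-4K with integer scaling can look like native 1080p rather than a blurry upscale.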
     
    jfreund and Armenius like this.
  10. Lepardi

    Lepardi [H]Lite

    Messages:
    117
    Joined:
    Nov 8, 2017
But GPU manufacturers won't add nearest neighbor; only the blurry algorithm is available.
     
  11. Morkai

    Morkai Limp Gawd

    Messages:
    328
    Joined:
    May 10, 2011
I think that, simply based on the fact that Nvidia decided to spend years of time and resources making their latest G-Sync module an HDR FALD controller, according to Nvidia's knowledge of the monitor market and info from their monitor partners, OLED monitors are not coming anytime soon.
If Nvidia thought anyone was making viable mass-market OLED monitors, they would have just scrapped the FALD HDR controller and sold monitors with the old G-Sync module for another year. For this to make business sense, I highly doubt there will be any OLED/MicroLED G-Sync monitors in the next 2 or 3 years.
Based on this logic I gave up and bought the PG27UQ (which IS excellent, despite what I first anticipated) instead of waiting.

There might be a chance we get surprised by FreeSync MicroLED though, who knows.
     
    Last edited: Aug 7, 2018
  12. ors

    ors Limp Gawd

    Messages:
    140
    Joined:
    Jan 30, 2016
And they don't need to; you can disable GPU scaling and let the monitor handle it (as is the default on most systems). So they can implement it in the monitor's scaler if they want...
     
  13. Lepardi

    Lepardi [H]Lite

    Messages:
    117
    Joined:
    Nov 8, 2017
    But do you see this feature even in the $2500 4K gaming monitors? Nope.
     
    jnemesh likes this.
  14. ors

    ors Limp Gawd

    Messages:
    140
    Joined:
    Jan 30, 2016
Easy: let's all stop buying the crap they are pushing right now, and they will start shipping what people want. You have to vote with your wallet; that's the only language they understand... I for sure won't buy any sub-4K monitors anymore...
     
    jfreund likes this.
  15. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Nov 12, 2012
Anyone know how JOLED's panels fare when it comes to burn-in and aging? Are they using all-white OLEDs with color filters like LG?

It says they make OLEDs for cars, so you would think they would be good at dealing with image retention.
     
  16. Lepardi

    Lepardi [H]Lite

    Messages:
    117
    Joined:
    Nov 8, 2017
They now have limited "Gen 4.5" panel production, and they will open a bigger factory with more capacity producing "Gen 5.5" panels. Whatever those gen numbers mean.
     
  17. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
Nobody can; LG has patents on that manufacturing method, which is why no one else has been able to manufacture reasonably priced larger OLED panels. JOLED is using an RGB printing process, and if they can get panels out at anything remotely like a reasonable price, they will be the first manufacturer, ahead of even Samsung, to do it at sizes larger than 14" or so.

Honestly, if they can produce a 32" high refresh rate panel for less than $5,000 it would be an instant buy for me, no questions asked. I don't care about burn-in as long as it's not completely ridiculous, like showing up in under an hour. But I would expect these to be at least as susceptible to burn-in as any LG panel, if not more so.
     
    Armenius likes this.
  18. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Nov 12, 2012
    Thanks, that's what I thought but I wasn't sure if they had some alternative method to get around the patent or got a license to use LG's method.

Hopefully it's a real high refresh rate monitor and not a dud like the Dell. I probably won't get it without G-Sync, but if they do something crazy like support 4K 120 Hz and 1080p 480 Hz, I just might.
     
  19. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    817
    Joined:
    Mar 23, 2013
Or just run any resolution your PC can handle, 1440p, 1800p, whatever, since you have an insanely high pixel density:

     
    admiralperpetual likes this.
  20. jnemesh

    jnemesh Gawd

    Messages:
    984
    Joined:
    Jan 21, 2013
Samsung offered a 55" RGB OLED for sale several years ago. Then they got smart: they realized that yields were terrible, so large panels would NEVER become cost effective, and the downsides with image retention (screen burn) along with other issues led them to abandon large-screen OLEDs and focus on non-organic light-emitting displays. "QLED" as it stands today is just a stopgap. The real fun begins when they release their emissive QLED displays in another couple of years: all the advantages of OLED without the organic emitters degrading and without screen burn problems. They are already showing prototype displays of this tech. It's not far off... and when it hits, OLED is dead.
     
    euskalzabe likes this.
  21. jnemesh

    jnemesh Gawd

    Messages:
    984
    Joined:
    Jan 21, 2013
    You are an overly entitled prick who thinks that somehow, these companies aren't spending MILLIONS and employing THOUSANDS of engineers to bring you the latest tech? YOU fuck off! People like you are everything wrong with this world. No appreciation for the technology involved, the challenges in design or manufacturing, you just expect someone to wave a magic wand and give you the tech you want. Piss off.
     
  22. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
Yes, but I said reasonably priced. The KN55S9C was 1080p and $9,000, and Samsung was never able to improve yields to the point where it was actually viable as a product.

    You're very optimistic about this lol.
     
    Stryker7314 and Armenius like this.
  23. jnemesh

    jnemesh Gawd

    Messages:
    984
    Joined:
    Jan 21, 2013
    Not really, they are sampling small displays right now...
     
  24. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
Source? AFAIK, Samsung's next plan is micro-LED-backlit LCDs, and then maybe, if they ever manage to scale down and make the non-organic LEDs manufacturable in small sizes (not guaranteed this is even possible), what you are talking about will happen.
     
    Armenius likes this.
  25. AORUS

    AORUS Limp Gawd

    Messages:
    417
    Joined:
    Oct 31, 2017
Why play at 1080p upscaled on a 4K panel, what's the point? And if you're going to wait for GPUs to catch up before playing on an OLED, MicroLED will have replaced it by then. Problem solved.
     
  26. DoubleTap

    DoubleTap [H]ard|Gawd

    Messages:
    1,908
    Joined:
    Dec 16, 2010
    Consider getting over yourself.

Non-G-Sync displays are a hard pass for a lot of people, myself included. Unless AMD decides to make a decent GPU, but I'm not holding my breath.
     
    Stryker7314 likes this.
  27. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,519
    Joined:
    Sep 7, 2011

Hear, hear.

    The truth is that we know these three things exist:

4K OLED panels that can be driven beyond 120 Hz

VRR-capable display driver boards able to run 4K at 120 Hz

Cables/interfaces with enough bandwidth to push 4K 4:4:4 at over 120 Hz.


    So these things are not science fiction. Display manufacturers could make these. They choose not to.
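The bandwidth point checks out with back-of-the-envelope arithmetic (my numbers, not from the post; 10 bits per channel assumed for HDR):

```python
def raw_video_bandwidth_gbps(width: int, height: int, hz: int,
                             bits_per_channel: int, channels: int = 3) -> float:
    """Raw active-pixel bandwidth in Gbit/s, ignoring blanking intervals
    and link-layer encoding overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

bw = raw_video_bandwidth_gbps(3840, 2160, 120, 10)
print(f"{bw:.1f} Gbit/s")  # ~29.9 Gbit/s before blanking overhead
```

That raw ~30 Gbit/s already exceeds DisplayPort 1.4's roughly 25.9 Gbit/s effective payload, which is why the 2018 4K 144 Hz monitors fall back to chroma subsampling or DSC at the top of their range, while HDMI 2.1's ~42 Gbit/s effective rate covers it with headroom.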

And to the person who says display manufacturers have 'thousands of people' making products for us: this is half true, but these companies would rather NOT release something amazing and revolutionary, something 100% better than the competition at a similar price. If they did, they would lose the opportunity to release something that's only 50% better and charge the same for it: 50% less effort, and it's still "better than the competition", will still make waves, and people will still buy it. But why stop there? By releasing something 50% better, you miss the opportunity to release something 25% better, with half the effort, at the same price. And why release something 25% better when you can release something 10% better, with even less effort, and charge the same for it?

This is why tech manufacturers aim for the smallest incremental updates possible. In the realm of innovation you can only go 'up'; once you take a step, you can't go back down. This is why monitor manufacturers have been milking the same old tech as long as possible, and why panel manufacturers RARELY release anything groundbreaking.
     
    Krisium and jfreund like this.
  28. kasakka

    kasakka Gawd

    Messages:
    890
    Joined:
    Aug 25, 2008
To add to this, manufacturers make TVs first and gaming displays second. 120 Hz panels stem more from things like motion smoothing for watching sports than from making low-motion-blur gaming happen. On the software side TVs are downright horrible, with poor support for anything but the latest generation, crappy UI design, and bugs that never get fixed. Samsung would even have the capability to upgrade the One Connect boxes on a few-years-older models to HDMI 2.1 for a price, but I bet that will never happen; they'd rather just sell you the slightly upgraded version.

I'm surprised the new 4K 27" 144 Hz displays even have firmware that can be upgraded without sending the device back to the manufacturer. Things we take for granted on a lot of other devices seem to trickle down to TVs and monitors at a very slow rate.
     
  29. jnemesh

    jnemesh Gawd

    Messages:
    984
    Joined:
    Jan 21, 2013
    euskalzabe likes this.
  30. XoR_

    XoR_ Limp Gawd

    Messages:
    376
    Joined:
    Jan 18, 2016
Nothing to hold your breath for.
     
  31. elvn

    elvn 2[H]4U

    Messages:
    2,919
    Joined:
    May 5, 2006
    ...
" If I have to, I'll wait into 2019 or whenever LG OLED 4K HDR sets with HDMI 2.1 120 Hz, VRR (variable refresh rate, standardized in the HDMI spec), and QFT (quick frame transport, for low-lag gaming) are out, and there are eventually GPUs with HDMI 2.1 + VRR support... even if I have to change camps to AMD, should NVIDIA drag their feet on VRR support in future GPU lines and I end up waiting until 2020 for such GPUs. As it is, the 32" 21:9 and BFGD HDR FALD models I had initially expressed some interest in (especially the BFGD) are delayed at least to mid 2019 anyway. 27" is OK, I've been using one for years... but I decided to drop $600, and sell my PG278Q for ~$300 if I can do it locally, making it a net ~$300 display upgrade with 3x the contrast and black depth of IPS and TN panels, plus a 4.5" diagonal size increase, which is considerable. And I stay at 2560x1440 gaming, where I can dial in settings to average 100 fps+ (a 70 - 100 - 130+ band) or more and get more out of the high-Hz capability; 4K would require me to turn the graphics down a lot more to achieve that, even with powerful GPUs.

I would have loved these a few years ago, but with the roadmap I'm looking at going forward, this $300 32" VA upgrade will hold me over until I'm willing to drop thousands on an HDMI 2.1 4K HDR 120 Hz VRR OLED and new GPUs that support it. By that time there should hopefully be a lot more HDR content too.

Glad a lot of people are happy with these FALD models though. I have a FALD VA TV and love it even with a fraction of the zones. I wouldn't buy another non-OLED, non-HDMI-2.1 (~120 Hz native + VRR) TV going forward either, considering what I already have for now and what I know is coming. I'm not dropping $2k+ on a monitor (and more on a 70" TV) every 1 to 2 years when I can be smart and pick my battles. HDMI 2.1 120 Hz 4K + VRR on HDR OLED TVs and GPUs is the holding point for me now."
     
  32. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    200
    Joined:
    Jul 3, 2017
Have you bothered to read any [H] GPU reviews? You know that there are a few image quality sliders that have a huge hit on frame rates but limited impact on image quality, right? Does it count if the 1080 Ti can play Pong in 4K at far greater than 60 fps?

Sorry, that's a very broad overgeneralization. Besides, I buy my monitors to last 5-10 years, and 4K plays just fine at 60+ fps on my 1080 Ti for lots of games.
     
    Armenius and GoldenTiger like this.
  33. elvn

    elvn 2[H]4U

    Messages:
    2,919
    Joined:
    May 5, 2006
Witcher 3 runs a 108 fps average at 2560x1440 on ultra with HairWorks disabled.
    DX11
    Ultra mode
    AA enabled
    16x AF enabled
    SSAO enabled
    Nvidia hairworks OFF
    Other settings ON

Far Cry 5, 2560x1440, 1080 Ti = 101 fps (78 min)
Far Cry 5, 3840x2160, 1080 Ti = 54 fps (45 min)

Of course you can turn some of that stuff down or off. The point is that we are just now at the stage where you can run 2560x1440 in demanding games at very high to ultra settings (with a few over-the-top settings turned off, perhaps) on a single GPU (1080 Ti) at frame rates high enough to get appreciable benefit out of a high-Hz monitor.
The same settings at 4K result in a 66 fps average in Witcher 3, and Far Cry 5 gets a 54 fps average at 4K, which leaves little wiggle room and gets practically nothing out of the higher-Hz capability of a 120-144 Hz monitor. ...
     
  34. Sancus

    Sancus Gawd

    Messages:
    729
    Joined:
    Jun 1, 2013
I spend only about 10% of my gaming time playing games like Witcher 3, and the other 90% playing games like Overwatch, Heroes of the Storm, League of Legends, etc., i.e. competitive multiplayer games. Those all hit 120+ fps at 4K with a 1080 Ti, no problem at all.

So I don't really care much about the performance limitations of AAA games that push GPUs to the max, but I do care about the refresh rate limitations of my display.
     
    Armenius, GoldenTiger and LuxTerra like this.
  35. elvn

    elvn 2[H]4U

    Messages:
    2,919
    Joined:
    May 5, 2006
Sure, isometric games and titles with cartoonish Team Fortress 2-era graphics will run at higher frame rates, but there are a lot more games out there than arena smashups.


    "
    As for just how beefy a PC is required, it turns out Monster Hunter World is a fairly demanding beast. With a GeForce GTX 1080, Intel Core i7-4790K and 16GB RAM, the following average frame rates were achieved in the starting hub at 1440p resolution:

    • Low: 108fps
    • Medium: 65fps
    • High: 60fps
    • Ultra: 44fps
    Unsurprisingly, volumetric lighting is a demanding graphics setting in Monster Hunter World, and dropping this from Ultra to High pushes the average frames per second up to a respectable 50fps."

    http://www.game-debate.com/news/254...on-pc-confirmed-gtx-1080-performance-revealed

Final Fantasy 15 gets around 71 fps on high settings at 1440p with a 1080 Ti too.

Far Cry 5: 101 fps with a 1080 Ti at 1440p, which is the target I aim for (or higher) to get appreciable benefit out of a higher-Hz monitor.

Kingdom Come: Deliverance gets ~71 fps at 1440p with a 1080 Ti, and 37 fps at 4K.


To be clear, I'm saying 4K is too demanding for state-of-the-art games to hit high fps+Hz at higher graphics settings, outside of cartoonish rompers and isometrics. However, I'm agreeing that, for my tastes, refresh rates have to be high and frame rate averages relatively high to fill that Hz (100 fps average or better), which is why 2560x1440 is still the better resolution for gaming (for now) across the board.
     
    Last edited: Aug 11, 2018
    ncjoe likes this.
  36. Lepardi

    Lepardi [H]Lite

    Messages:
    117
    Joined:
    Nov 8, 2017
Yeah, so basically run at 1440p so it looks crappier than native 1440p? Nope.

So you're buying a 4K monitor now with a 5-year plan, and maybe in the last year you'll have a GPU capable of running it at a minimum of ~140 fps with games maxed out. Sure, if you only play single-player, co-op, or RTS games, 60 fps will suffice.
     
  37. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    200
    Joined:
    Jul 3, 2017
What you and others fail to realize is that there is more than one type of game to play; not everything is a first-person shooter. The world does not revolve around awful graphics and silly high refresh rates. There are lots of games, and some of them look spectacular at 4K (not scaled), at or near max settings, at 60 fps or more on a 1080 Ti.

    This literally came out the other day from HardOCP (I just picked one I've actually played at 4k and enjoyed):
[HardOCP 4K benchmark graph]

IIRC, that's a stock 1080 Ti FE. OC it and those few dips below 60 fps go away. At 7nm next year we should get another 980 Ti-to-1080 Ti uplift, but this year it's probably more of a 780 Ti-to-980 Ti uplift across the board.

So yeah, you can run nice games at 4K@60fps on a single GPU, and have been able to for over a year.
     
    jedolley and Armenius like this.
  38. elvn

    elvn 2[H]4U

    Messages:
    2,919
    Joined:
    May 5, 2006
I disagree, for my tastes at least. 60 fps is molasses, with the worst smearing blur of the whole screen in 1st/3rd person games where you are continually moving your viewport around. High fps + high Hz is not just for twitch gaming; it is a huge aesthetic benefit in both motion clarity (blur reduction) and motion definition (double or more the unique motion states, like a flip book flipping twice as fast). With good overdrive this tightens sample-and-hold smearing into more of a soft blur, instead of the smearing you get at 60 fps...

When you say a game "gets X fps" you are talking about the AVERAGE, so you are really ranging down into the 50s, and in some games down to 30 fps, for a third of the frame rate graph. This is sludge to me. Think of a strobe light cutting away motion definition, but instead of seeing the black state you see the last action frozen through the black phases of the strobe. That is what happens to everything in the game world, and to the motion of the viewport itself, when you run 60 fps-Hz instead of 100 to 120 fps-Hz. At the higher rate you get glassy motion and more defined pathing (more dots per dotted line), more animation-cycle definition, and movement keying and mouse-looking of the entire game world moving relative to you with more definition and half the blur. So it is very much an aesthetic thing. 4K, at least below 100 fps-Hz, makes for good screenshots.
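The sample-and-hold smear described above can be put into rough numbers. A sketch (my own figures; the 960 px/s tracking speed is an arbitrary illustrative value): on a sample-and-hold display the eye keeps moving while each frame stays frozen for 1/fps seconds, so the image smears across roughly speed/fps pixels on the retina.

```python
def hold_blur_px(tracking_speed_px_s: float, fps: float) -> float:
    """Approximate perceived smear (in pixels) on a sample-and-hold display:
    the eye travels tracking_speed_px_s while each frame is held for 1/fps s."""
    return tracking_speed_px_s / fps

print(hold_blur_px(960, 60))   # 16.0 px of smear at 60 fps
print(hold_blur_px(960, 120))  # 8.0 px at 120 fps
```

This is why doubling the frame rate halves the blur, and why strobing backlights (ULMB) attack the same problem from the other direction by shortening the hold time instead.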

[frame rate / motion blur comparison graphs]
     
    Last edited: Aug 12, 2018
  39. kasakka

    kasakka Gawd

    Messages:
    890
    Joined:
    Aug 25, 2008
This is more about your desire to have 100+ fps, which I agree current GPUs, and probably next-gen ones, won't deliver without cutting detail settings. Your examples aren't the best because several of them are console ports, MHW seems to be a crappy one in the first place, and Nvidia's driver mess doesn't help. I game at 1440p with a 980 Ti and a G-Sync display, and I'm perfectly happy with frame rates that hover around 60 fps most of the time. G-Sync really helps make frame rate fluctuations less jarring. If I get 60+ fps in a game I typically opt for ULMB instead, because it does a better job of blur reduction than anything else. It's a downright shame that it is not available on the latest 4K 144 Hz displays.

You love posting those same graphs in a lot of posts. Nobody is disagreeing that higher frame rates are better, but with current tech we have to choose which compromises to make. A lot of people are enjoying games at 30 fps on consoles or around 60 fps on PC.
     
    Armenius likes this.