OLED Gaming Displays

That doesn't look like a very reliable source, especially as translated. There was an ASUS rep at CES who said on video it would be more "around" $1000.
 
Why would they ruin that OLED panel with AG film? Even true blacks look gray when light reflects off of it.
 
That doesn't look like a very reliable source, especially as translated. There was an ASUS rep at CES who said on video it would be more "around" $1000.

Personally, given that it's JOLED, I'd actually trust a Japanese-language source in a more recent Computex article more than I'd trust a much older statement from ASUS, but shrug, I guess we'll see.

If it is $1000 and available in October I will definitely be picking one up, but I'm skeptical -- even limited to 22", a production $1000 PC OLED display would be a huge game-changer and a yield coup for JOLED. A $5K pilot price and $1.8K once mass production settles in sounds a lot more realistic to me.
 
It's pointless if it ships without G-Sync.

Really, it's time for these shitty companies to pony up: 4K, 32", variable refresh, 120Hz, low input lag, and either GOOD mini-LED or something better.

Not buying another monitor until all those boxes are ticked. They can fuck off in the meantime.
 
It's pointless if it ships without G-Sync.

Really, it's time for these shitty companies to pony up: 4K, 32", variable refresh, 120Hz, low input lag, and either GOOD mini-LED or something better.

Not buying another monitor until all those boxes are ticked. They can fuck off in the meantime.
4K goes straight into the trashcan in a 120Hz+ monitor.

The reality is that even a 1080 Ti can only guarantee 120+ FPS at 1080p. 1440p? Nope. 2160p is just a distant dream in the 2020s.

A 30" 1440p 240-480Hz OLED would be awesome; the PPI is good enough. Heck, the response times of OLED would easily allow even 960Hz refresh rates.
 
4K goes straight into the trashcan in a 120Hz+ monitor.
No it does not. They just need to add integer scaling to the display and the problem is solved: you can play at 1080p until GPUs catch up, while on the desktop you get 4K. Best of both worlds, and once GPUs catch up you already have the monitor for it...
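
Integer scaling really is as simple as it sounds: 1080p maps onto a 4K grid by copying each source pixel into an exact 2x2 block. A toy sketch of the idea (not any vendor's actual scaler implementation), using numpy purely for illustration:

```python
# Toy integer (nearest-neighbor) upscale: each source pixel becomes a
# crisp factor-by-factor block instead of being smeared by bilinear
# filtering. 1080p -> 4K is an exact 2x fit in both dimensions.
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Duplicate each pixel into a factor x factor block."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # H x W x RGB
frame_4k = integer_upscale(frame_1080p, 2)
assert frame_4k.shape == (2160, 3840, 3)
```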
 
No it does not. They just need to add integer scaling to the display and the problem is solved: you can play at 1080p until GPUs catch up, while on the desktop you get 4K. Best of both worlds, and once GPUs catch up you already have the monitor for it...
But GPU manufacturers won't add nearest neighbor; only the blurry algorithm is available.
 
I think that, simply based on the fact that Nvidia decided to spend years of time and resources making their latest G-Sync module an HDR FALD controller, according to Nvidia's knowledge of the monitor market and info from their monitor partners, OLED monitors are not coming anytime soon.
If Nvidia thought anyone was making viable mass-market OLED monitors, they would've just scrapped the FALD HDR controller and sold monitors with the old G-Sync module for another year. For this to make business sense, I highly doubt there will be any OLED/MicroLED G-Sync monitors in the next 2 or 3 years.
It was based on this logic that I gave up and bought the PG27UQ (which IS excellent, despite what I first anticipated) instead of waiting.

There might be a chance we get surprised by FreeSync MicroLED though, who knows.
 
But GPU manufacturers won't add nearest neighbor; only the blurry algorithm is available.
And they don't need to; you can disable GPU scaling and let the monitor handle it (as is the default on most systems). So they can implement it in the monitor's scaler if they want...
 
And they don't need to; you can disable GPU scaling and let the monitor handle it (as is the default on most systems). So they can implement it in the monitor's scaler if they want...
But do you see this feature even in the $2500 4K gaming monitors? Nope.
 
But do you see this feature even in the $2500 4K gaming monitors? Nope.
Easy: let's all stop buying this crap they are pushing right now, and they will start shipping what people want. You need to vote with your wallet; that's the only language they understand... I for sure won't buy any sub-4K monitors anymore...
 
Anyone know how JOLED's panels fare when it comes to burn-in and aging? Are they using all-white OLEDs with color filters like LG?

It says they make OLEDs for cars, so you would think they would be good at dealing with image retention.
 
Anyone know how JOLED's panels fare when it comes to burn-in and aging? Are they using all-white OLEDs with color filters like LG?

It says they make OLEDs for cars, so you would think they would be good at dealing with image retention.
They now have limited "4.5 gen" panel production, and they will open a bigger factory with more capacity producing "5.5 gen" panels. Whatever those gen things mean.
 
Anyone know how JOLED's panels fare when it comes to burn-in and aging? Are they using all-white OLEDs with color filters like LG?

Nobody can; LG has patents on that manufacturing method, which is why no one else has been able to manufacture reasonably priced larger OLED panels. JOLED is using an RGB printing process, and if they are even able to get the panels out at anything remotely like a reasonable price, they will be the first manufacturer, including Samsung, able to do this at sizes larger than 14" or so.

Honestly, if they can produce a 32" high-refresh-rate panel for less than $5000 it would be an instant buy for me, no questions asked. I don't care about burn-in as long as it's not completely ridiculous, like in <1 hour or something. But I would expect these to be at least as susceptible to burn-in as any LG panel, if not more so.
 
Thanks, that's what I thought, but I wasn't sure if they had some alternative method to get around the patent or got a license to use LG's method.

Hopefully it's a real high-refresh-rate monitor and not a dud like the Dell. I probably won't get it without G-Sync, but if they do something crazy like support 4K 120Hz and 1080p 480Hz I just might.
 
No it does not. They just need to add integer scaling to the display and the problem is solved: you can play at 1080p until GPUs catch up, while on the desktop you get 4K. Best of both worlds, and once GPUs catch up you already have the monitor for it...

Or just run any resolution your PC can handle, 1440p, 1800p, whatever, since you have insanely high pixel density.
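
As a rough sanity check on the density claim, assuming the ~22" 4K panel size rumored earlier in the thread (the exact diagonal is this thread's rumor, not a confirmed spec):

```python
# Pixel density (PPI) = diagonal resolution / diagonal size in inches.
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(3840, 2160, 21.6)))  # ~204 PPI for the rumored panel
print(round(ppi(3840, 2160, 32.0)))  # ~138 PPI for a 32" 4K, by comparison
print(round(ppi(2560, 1440, 27.0)))  # ~109 PPI for a 27" 1440p
```

At 200+ PPI, scaling artifacts from non-native resolutions are far harder to see than on a typical ~110 PPI desktop monitor.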

 
Nobody can; LG has patents on that manufacturing method, which is why no one else has been able to manufacture reasonably priced larger OLED panels. JOLED is using an RGB printing process, and if they are even able to get the panels out at anything remotely like a reasonable price, they will be the first manufacturer, including Samsung, able to do this at sizes larger than 14" or so.

Honestly, if they can produce a 32" high-refresh-rate panel for less than $5000 it would be an instant buy for me, no questions asked. I don't care about burn-in as long as it's not completely ridiculous, like in <1 hour or something. But I would expect these to be at least as susceptible to burn-in as any LG panel, if not more so.

Samsung offered a 55" RGB OLED for sale several years ago. Then they got smart: they realized that yields were terrible, so large panels would NEVER become cost-effective, and the downsides with image retention (screen burn) along with other issues led them to abandon large-screen OLEDs and focus on NON-organic light-emitting displays. "QLED" as it stands today is just a stopgap. The real fun begins when they release their emissive QLED displays in another couple of years. All the advantages of OLED without the emitters degrading and without screen-burn problems. They are already showing prototype displays of this tech. It's not far off... and when it hits, OLED is dead.
 
It's pointless if it ships without G-Sync.

Really, it's time for these shitty companies to pony up. 4k, 32", variable refresh, 120hz, low input lag, and either GOOD mini LED or something.

Not buying another monitor until all those boxes are ticked. They can fuck off in the mean time.

You are an overly entitled prick. Do you somehow think these companies aren't spending MILLIONS and employing THOUSANDS of engineers to bring you the latest tech? YOU fuck off! People like you are everything wrong with this world. No appreciation for the technology involved or the challenges in design and manufacturing; you just expect someone to wave a magic wand and give you the tech you want. Piss off.
 
Samsung offered a 55" RGB OLED for sale several years ago.

Yes, but I said reasonably priced. The KN55S9C was 1080p, $9,000, and Samsung was never able to improve yields to the point it was actually viable as a product.

The real fun begins when they release their emissive QLED displays in another couple of years.

You're very optimistic about this lol.
 
Yes, but I said reasonably priced. The KN55S9C was 1080p, $9,000, and Samsung was never able to improve yields to the point it was actually viable as a product.



You're very optimistic about this lol.

Not really, they are sampling small displays right now...
 
Not really, they are sampling small displays right now...

Source? AFAIK, Samsung's next plan is to go to micro-LED-backlit LCDs, and then maybe, if they ever manage to scale down and make the non-organic LEDs manufacturable in small sizes (it's not guaranteed this is even possible), what you are talking about will happen.
 
No it does not, they just need to add integer scaling to their display, problem solved, you can play in 1080p until GPUs catch up, on the desktop you will have 4k, best for both and once GPUs catch up you already have the monitor for it...

Why play at 1080p scaled, what's the point? And if you're going to wait for GPUs to catch up before playing on an OLED, MicroLED will have replaced it by then. Problem solved.
 
You are an overly entitled prick. Do you somehow think these companies aren't spending MILLIONS and employing THOUSANDS of engineers to bring you the latest tech? YOU fuck off! People like you are everything wrong with this world. No appreciation for the technology involved or the challenges in design and manufacturing; you just expect someone to wave a magic wand and give you the tech you want. Piss off.

Consider getting over yourself.

Non-G-Sync displays are a hard pass for a lot of people, including myself. Unless AMD decides to make a decent GPU - but I'm not holding my breath.
 
No, YOU fuck off. Those companies live and die by consumers like me patronizing them, so you can sit on my cock and rotate, chump.

These displays SUCK COCK. Do you get it now? They fucking SUCK. They aren't delivering what I want, and they can go fuck themselves until they do.

I'd seriously rather have new CRTs manufactured than the shit they're squatting out right now.


Hear, hear.

The truth is that we know these three things exist:

  • 4K OLED panels able to be driven beyond 120Hz
  • VRR-capable display driver boards able to run 4K at 120Hz
  • Cables/interfaces that can carry the bandwidth needed to push 4K 4:4:4 at over 120Hz (see the quick bandwidth math below)

So these things are not science fiction. Display manufacturers could make these. They choose not to.
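
To put numbers on that third bullet, a back-of-the-envelope calculation (the ~10% blanking overhead is a rough assumption; exact CVT-RB timings differ slightly):

```python
# Raw link bandwidth needed for 4K RGB 4:4:4 at 120 Hz, 10 bits/channel.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10            # RGB 4:4:4, 10 bpc
blanking_overhead = 1.10           # rough allowance for blanking intervals

gbps = width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9
print(f"{gbps:.1f} Gbit/s")        # ~32.8 Gbit/s

# For comparison: DisplayPort 1.4 carries ~25.9 Gbit/s of payload (so it
# needs DSC or chroma subsampling for this), HDMI 2.1 ~42.7 Gbit/s.
```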

And to the person who says display manufacturers have 'thousands of people' making products for us: this is half true, but these companies would rather NOT release something amazing and revolutionary, something 100% better than the competition at a similar price. If they do that, they lose the opportunity to release something that's only 50% better and charge the same for it: 50% less effort, and it's still "better than the competition", will still make waves, and people will still buy it. But why stop there? By releasing something 50% better, you miss the opportunity to release something 25% better, doing half the effort and charging the same for it. And why release something 25% better when you can release something 10% better, do less than half the effort, and charge the same for it?

This is why tech manufacturers strive for the smallest incremental updates possible. In the realm of innovation you can only go 'up'; once you take that step, you can't go back down. This is why monitor manufacturers have been milking the same old tech as much as possible, and why panel manufacturers RARELY release anything groundbreaking.
 
Hear, hear.

The truth is that we know these three things exist:

  • 4K OLED panels able to be driven beyond 120Hz
  • VRR-capable display driver boards able to run 4K at 120Hz
  • Cables/interfaces that can carry the bandwidth needed to push 4K 4:4:4 at over 120Hz

So these things are not science fiction. Display manufacturers could make these. They choose not to.

And to the person who says display manufacturers have 'thousands of people' making products for us: this is half true, but these companies would rather NOT release something amazing and revolutionary, something 100% better than the competition at a similar price. If they do that, they lose the opportunity to release something that's only 50% better and charge the same for it: 50% less effort, and it's still "better than the competition", will still make waves, and people will still buy it. But why stop there? By releasing something 50% better, you miss the opportunity to release something 25% better, doing half the effort and charging the same for it. And why release something 25% better when you can release something 10% better, do less than half the effort, and charge the same for it?

This is why tech manufacturers strive for the smallest incremental updates possible. In the realm of innovation you can only go 'up'; once you take that step, you can't go back down. This is why monitor manufacturers have been milking the same old tech as much as possible, and why panel manufacturers RARELY release anything groundbreaking.

To add to this, manufacturers make TVs first and gaming displays second. 120Hz panels stem more from things like motion-smoothing for watching sports than from making low-motion-blur gaming happen. On the software side TVs are downright horrible, with poor support for anything but the latest gen, crappy UI design, and bugs that never get fixed. Samsung would even have the capability of upgrading the One Connect boxes of models a few years old to HDMI 2.1 for a price, but I bet that will never happen, as they'd rather just sell you the slightly upgraded version.

I'm surprised the new 4K 27" 144Hz displays even have upgradeable firmware without sending the device back to the manufacturer. Things we take for granted on a lot of other devices seem to trickle down to TVs and monitors at a very slow rate.
 
...
I went with the (non-HDR, non-FALD) 32" 16:9 LG 32GK850G VA G-Sync 144Hz/165Hz on sale for $600, free shipping, no tax, at the Egg instead. It will probably arrive tomorrow.

"If I have to, I'll wait into 2019 or whenever LG OLED 4K HDR sets with HDMI 2.1 120Hz, VRR (variable refresh rate, standardized in the HDMI spec), and QFT (quick frame transport, for low-lag gaming) are out and there are eventually GPUs with HDMI 2.1 + VRR support... even if I have to change camps to AMD, should NVIDIA drag their feet on VRR support in future GPU lines, and I end up waiting until 2020 for such GPUs. As it is, the 32" 21:9 and BFGD HDR FALD models I had initially expressed some interest in (especially the BFGD) are delayed at least to mid-2019 anyway. 27" is OK, I've been using one for years, but I decided to drop $600 and sell my PG278Q for $300 +/- if I can get it locally. That makes for a ~$300 +/- display upgrade with 3x the contrast and black depth of IPS and TN panels, plus a 4.5" diagonal size upgrade, which is considerable, while still staying at 2560x1440 gaming, where I can dial in settings for a 100fps+ average (a 70 - 100 - 130+ band) or more to get more out of the high-Hz capability. 4K would require me to turn the graphics down a lot more to achieve that, even with powerful GPUs.

I would have loved these a few years ago, but with the roadmap I'm looking at going forward, this $300 32" VA upgrade will hold me over until I'm willing to drop thousands on an HDMI 2.1 4K HDR 120Hz VRR OLED and new GPUs that will support it. Also, by that time there should hopefully be a lot more HDR content.

Glad a lot of people are happy with these FALD models though. I have a FALD VA TV and love it, even with a fraction of the zones. I wouldn't buy another non-OLED, non-HDMI 2.1 ~120Hz-native + VRR TV going forward either, considering what I already have for now and what I know is coming. I'm not dropping $2k+ on a monitor (and more on a 70" TV) every 1 to 2 years when I can be smart and pick my battles. HDMI 2.1 4K 120Hz + VRR on HDR OLED TVs and GPUs is the holding point for me now."
 
4K goes straight into the trashcan in a 120Hz+ monitor.

The reality is that even a 1080 Ti can only guarantee 120+ FPS at 1080p. 1440p? Nope. 2160p is just a distant dream in the 2020s.

A 30" 1440p 240-480Hz OLED would be awesome; the PPI is good enough. Heck, the response times of OLED would easily allow even 960Hz refresh rates.
Have you bothered to read any [H] GPU reviews? You know there are a few image-quality sliders that have a huge hit on frame rates but limited impact on image quality, right? Does it count if the 1080 Ti can play Pong in 4K at far greater than 60fps?

Sorry, that's a very broad overgeneralization. Besides, I buy my monitors to last 5-10 years, and 4K plays just fine at 60fps+ on my 1080 Ti for lots of games.
 
Witcher 3 runs a 108fps average at 2560x1440 on ultra with HairWorks disabled:
DX11
Ultra mode
AA enabled
16x AF enabled
SSAO enabled
Nvidia HairWorks OFF
Other settings ON

Far Cry 5 2560x1440, 1080 Ti = 101fps (78 min)
Far Cry 5 3840x2160, 1080 Ti = 54fps (45 min)

Of course you can turn some of that stuff down/off. The point being, we are just at the point where you can run 2560x1440 in demanding games at very-high to ultra settings (with a few over-the-top settings turned off, perhaps) on a single GPU (1080 Ti) at high enough frame rates to get appreciable benefits out of a high-Hz monitor.
The same settings at 4K result in a 66fps average on Witcher 3, and Far Cry 5 gets a 54fps average at 4K, which leaves little wiggle room and gets practically nothing out of the higher-Hz capability of a 120Hz-144Hz monitor. ...
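
As a quick sanity check on those numbers (rough, assuming the games are mostly pixel-bound at these settings):

```python
# 4K pushes 2.25x the pixels of 1440p, so a purely fill-rate-bound game
# should land near (1440p average) / 2.25 at 4K.
pixels_1440p = 2560 * 1440
pixels_4k = 3840 * 2160
ratio = pixels_4k / pixels_1440p
print(ratio)                   # 2.25

# Far Cry 5 above drops 101 -> 54 fps (a 1.87x drop), a bit better than
# 2.25x because per-frame CPU and geometry costs don't scale with pixels.
print(round(101 / ratio, 1))   # 44.9 fps if it were purely pixel-bound
```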
[fps-over-time benchmark graph]
 
I spend only about 10% of my gaming time playing games like Witcher 3, and the other 90% playing games like Overwatch, Heroes of the Storm, League of Legends, etc., i.e. competitive multiplayer games. Those all hit 120+ fps at 4K with a 1080 Ti, no problem at all.

So I don't really care very much about the performance limitations of AAA games that push GPUs to the max, but I do care about the refresh-rate limitations of my display.
 
Sure, isometric games and cartoonish Team Fortress 2-style graphics of the modern era will run at higher frame rates, but there are a lot more games than arena smashups.


"
As for just how beefy a PC is required, it turns out Monster Hunter World is a fairly demanding beast. With a GeForce GTX 1080, Intel Core i7-4790K and 16GB RAM, the following average frame rates were achieved in the starting hub at 1440p resolution:

  • Low: 108fps
  • Medium: 65fps
  • High: 60fps
  • Ultra: 44fps
Unsurprisingly, volumetric lighting is a demanding graphics setting in Monster Hunter World, and dropping this from Ultra to High pushes the average frames per second up to a respectable 50fps."

http://www.game-debate.com/news/254...on-pc-confirmed-gtx-1080-performance-revealed

Final Fantasy XV gets around 71fps on high settings at 1440p with a 1080 Ti too.

Far Cry 5 gets 101fps with a 1080 Ti at 1440p, which is the target I aim for (or higher) to get appreciable benefit out of a higher-Hz monitor.

Kingdom Come: Deliverance gets ~71fps at 1440p with a 1080 Ti, and 37fps at 4K.


To be clear, I'm saying 4K is too demanding for state-of-the-art games to hit high fps+Hz at higher graphics settings, outside of cartoonish rompers and isometrics. However, I'm agreeing that, for my values, refresh rates have to be high and frame-rate averages relatively high to fill that Hz (100fps average or better), which is why 2560x1440 is still a better resolution for gaming (for now) across the board.
 
Have you bothered to read any [H] GPU reviews? You know there are a few image-quality sliders that have a huge hit on frame rates but limited impact on image quality, right? Does it count if the 1080 Ti can play Pong in 4K at far greater than 60fps?

Sorry, that's a very broad overgeneralization. Besides, I buy my monitors to last 5-10 years, and 4K plays just fine at 60fps+ on my 1080 Ti for lots of games.
Yeah, so basically drop the resolution to 1440p so it looks crappier than on a native 1440p panel? Nope.

So you're buying a 4K monitor now with a 5-year plan, and maybe in the last year you'll have a GPU capable of running it at a minimum of ~140 FPS with games maxed out. Sure, if you only play single-player, co-op, or RTS games, that 60fps will suffice.
 
Yeah, so basically drop the resolution to 1440p so it looks crappier than on a native 1440p panel? Nope.

So you're buying a 4K monitor now with a 5-year plan, and maybe in the last year you'll have a GPU capable of running it at a minimum of ~140 FPS with games maxed out. Sure, if you only play single-player, co-op, or RTS games, that 60fps will suffice.
What you and others fail to realize is that there is not just one type of game available to play, i.e. first-person shooters. The world does not revolve around awful graphics and silly-high refresh rates. There are lots of games, and some of them look spectacular at 4K (not scaled), at or near max settings, at 60fps or more on a 1080 Ti.

This literally came out the other day from HardOCP (I just picked one I've actually played at 4K and enjoyed):
[benchmark graph]

IIRC, that's a stock 1080 Ti FE. OC it and those few dips below 60fps go away. At 7nm next year we should get another 980 Ti-to-1080 Ti uplift, but this year it's probably more of a 780 Ti-to-980 Ti uplift across the board.

So ya, you can run nice games at 4K@60fps on a single GPU for over a year.
 
What you and others fail to realize is that there is not just one type of game available to play, i.e. first-person shooters. The world does not revolve around awful graphics and silly-high refresh rates. There are lots of games, and some of them look spectacular at 4K (not scaled), at or near max settings, at 60fps or more on a 1080 Ti.

This literally came out the other day from HardOCP (I just picked one I've actually played at 4K and enjoyed):
[benchmark graph]

IIRC, that's a stock 1080 Ti FE. OC it and those few dips below 60fps go away. At 7nm next year we should get another 980 Ti-to-1080 Ti uplift, but this year it's probably more of a 780 Ti-to-980 Ti uplift across the board.

So ya, you can run nice games at 4K@60fps on a single GPU for over a year.

I disagree, for my tastes at least. 60fps is molasses, with the worst smearing blur across the whole screen in 1st/3rd-person games where you are continually moving your viewport around. High fps + high Hz is not just for twitch gaming; it is a huge aesthetic benefit in both motion clarity (blur reduction) and motion definition (double or more the unique motion states, like a flip book flipping twice as fast). With good overdrive this tightens sample-and-hold smearing into more of a soft blur, instead of the outright smearing you get at 60fps-Hz.

When you say a game "gets X fps" you are talking about the AVERAGE, so you are really ranging down into the 50s, and in some games even down to 30fps, for a third of the fps graph. This is sludge to me. Think of a strobe light cutting away motion definition, but instead of seeing the black state you just see the last action frozen through the black periods of the strobe. That is what's happening to everything in the game world, and to the motion of the viewport itself, when you run 60fps-Hz instead of 100 to 120fps-Hz, where you would get glassy motion and more defined pathing (more dots per dotted line), more animation-cycle definition, and the movement keying and mouse-looking of the entire game world relative to you rendered with more definition and half the blur. So it is very much an aesthetic thing. 4K, at least below 100fps-Hz, makes for good screenshots.

[fps-over-time benchmark graphs]
 
Sure, isometric games and cartoonish Team Fortress 2-style graphics of the modern era will run at higher frame rates, but there are a lot more games than arena smashups.


"
As for just how beefy a PC is required, it turns out Monster Hunter World is a fairly demanding beast. With a GeForce GTX 1080, Intel Core i7-4790K and 16GB RAM, the following average frame rates were achieved in the starting hub at 1440p resolution:

  • Low: 108fps
  • Medium: 65fps
  • High: 60fps
  • Ultra: 44fps
Unsurprisingly, volumetric lighting is a demanding graphics setting in Monster Hunter World, and dropping this from Ultra to High pushes the average frames per second up to a respectable 50fps."

http://www.game-debate.com/news/254...on-pc-confirmed-gtx-1080-performance-revealed

Final Fantasy XV gets around 71fps on high settings at 1440p with a 1080 Ti too.

Far Cry 5 gets 101fps with a 1080 Ti at 1440p, which is the target I aim for (or higher) to get appreciable benefit out of a higher-Hz monitor.

Kingdom Come: Deliverance gets ~71fps at 1440p with a 1080 Ti, and 37fps at 4K.


To be clear, I'm saying 4K is too demanding for state-of-the-art games to hit high fps+Hz at higher graphics settings, outside of cartoonish rompers and isometrics. However, I'm agreeing that, for my values, refresh rates have to be high and frame-rate averages relatively high to fill that Hz (100fps average or better), which is why 2560x1440 is still a better resolution for gaming (for now) across the board.

This is more about your desire to have 100+ fps, which I agree current GPUs, and probably next-gen ones, won't deliver without cutting detail settings. Your examples aren't the best, because several of them are console ports; MHW seems to be a crappy one in the first place, and Nvidia's driver mess doesn't help. I game at 1440p with a 980 Ti and a G-Sync display, and I'm perfectly happy with framerates that are roughly around 60fps most of the time. G-Sync really helps make framerate fluctuations less jarring. If I get 60+ fps in a game I typically opt for ULMB instead, because it does a better job of blur reduction than anything else. It's a downright shame that it is not available on the latest 4K 144Hz displays.

You love posting those same graphs in a lot of posts. Nobody is disagreeing that higher framerates are better, but with current tech we have to choose which compromises to make. A lot of people are enjoying playing games at 30fps on consoles or around 60fps on PC.
 