OLED HDMI 2.1 VRR LG 2019 models!!!!

Really, it's time for the Jedi special purpose monitors to end. It's bullshit. All monitors should have accurate colors, low latency, variable refresh, etc. It doesn't have to be and shouldn't be a tradeoff. Anything less is a scam.
 
Talk about tripling or quadrupling the average price of a monitor...

I get the sentiment, but most computer uses (well, nearly all) only need 'close enough' at best.
 

Honestly, it's all economies of scale. If you just make more of the same product, you can make it much cheaper. Consumers are just getting fucked hard here. The industry has sucked and stagnated for a very long time.
 

The only part that would be difficult to scale is accurate colors. Perhaps they could devise a production process that produces monitors that are accurate without calibration, but generally speaking, in order to guarantee calibration, each unit must be tested and adjusted where necessary.
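To make the test-and-adjust step concrete, here's a minimal sketch of a per-unit grayscale calibration pass. Everything hardware-facing is a hypothetical stand-in: `read_luminance(level)` represents whatever colorimeter probe a production line would use, and the returned table represents the 1D LUT that would be written into the panel's scaler. Only the correction math is shown.

```python
# Minimal per-unit grayscale calibration sketch (hypothetical hardware API).
# read_luminance(level) stands in for a real colorimeter; the returned list
# stands in for the panel's internal 1D LUT.

TARGET_GAMMA = 2.2
STEPS = 17  # measure a sparse 17-point gray ramp, interpolate the rest

def build_correction_lut(read_luminance, lut_size=256):
    # Measure the panel's native response on the gray ramp (cd/m^2).
    levels = [round(i * 255 / (STEPS - 1)) for i in range(STEPS)]
    measured = [read_luminance(lv) for lv in levels]
    black, white = measured[0], measured[-1]

    # Normalize to 0..1 so the response can be compared to the target curve.
    norm = [(m - black) / (white - black) for m in measured]

    lut = []
    for code in range(lut_size):
        # Relative luminance this input code *should* produce (gamma 2.2).
        target = (code / (lut_size - 1)) ** TARGET_GAMMA
        # Invert the measured response: find the drive level that actually
        # produces that luminance, interpolating between ramp samples.
        for i in range(STEPS - 1):
            if norm[i] <= target <= norm[i + 1]:
                span = (norm[i + 1] - norm[i]) or 1e-9
                frac = (target - norm[i]) / span
                level = levels[i] + frac * (levels[i + 1] - levels[i])
                lut.append(round(level / 255 * (lut_size - 1)))
                break
        else:
            lut.append(lut_size - 1)  # target above measured range
    return lut
```

Even boiled down this far, the point stands: it's a serial measure-then-adjust loop run against each physical unit, which is exactly the part that doesn't get cheaper just by making more panels.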
 
Calibration AI
 
Really, it's time for the Jedi special purpose monitors to end. It's bullshit. All monitors should have accurate colors, low latency, variable refresh, etc. It doesn't have to be and shouldn't be a tradeoff. Anything less is a scam.

This won't happen as long as most computer buyers keep buying the cheapest displays available.

The whole monitor and TV industry is fucked up. TV manufacturers barely bother to support their products; my 2016 Samsung TV isn't getting things like FreeSync support even though it's most likely 90% the same hardware as the models made a year later. Meanwhile, computer monitor manufacturers just want to peddle the same 27" panels over and over with extremely slow development.
 

That's definitely a big part of it. It seems like a majority of PC users have always focused more on the FRAPS counter in the corner of the screen than on the quality of the visuals actually on the screen.
 
Interesting that someone took the idea of how white OLEDs are sort of per-pixel FALD and made a dual-layer VA LCD with the rear LCD as the backlight. It's capable of .003 black depth, but it isn't very bright, definitely not HDR-color bright. Around 25ms or less input lag, so that has room for improvement. They claim 4K HDMI 2.1 is possible. Wonder why no one thought of it before. I don't know why they used a 1080p panel as the backlight display either.

 

I hear about progress on this every year. And they always have problems like it isn't bright enough or has terrible viewing angles or strange artifacts or something.

It seems like they've finally got it working decently here, but you can tell it's far from perfect even through the YouTube video. You can see pretty significant haloing, and who knows how the colors and everything else are. I think it's a long way from actually competing with OLED.
 
Panasonic demoed a very similar dual-layer LCD tech a couple of years ago, and there are now a couple of pro monitors you can buy that use it (e.g., Eizo CG3145). Very expensive though.

I can see what looks like some haloing in the video, but that's on the single grayscale module that's there to show how the tech works. The full display (second half of the video) looks pretty good as far as I can tell. It's a VA panel, of course, so it's going to have more limited viewing angles than IPS or OLED, but if they can actually deliver at a low-ish price this might be a great alternative to OLED or conventional FALD LCD.
 
made a dual-layer VA LCD with the rear LCD as the backlight. Wonder why no one thought of it before. I don't know why they used a 1080p panel as the backlight display either.

They used 1080p to reduce backlight attenuation, which increases as pixel density increases.

Stacking LCD panels on top of each other isn't novel technology; that's how the Eizo CG3145 works, and its specs are indeed impressive. But common sense tells you why this isn't a viable consumer product: two LCDs cost twice as much AND vastly attenuate the backlight, making the display much more expensive and power-hungry. Not to mention that LCDs have slow pixel response times, and calibrating the two panels to respond identically and consistently is probably very challenging. I'm sure there are a ton of other issues.

Hell, even getting a FALD backlight to play nice is incredibly challenging and R&D-intensive; that's why Nvidia had to sink so much effort into getting G-Sync HDR backlights to respond fast enough to even look correct at gaming framerates and viewport movement speeds. Now you're talking about coordinating two LCDs AND an extra-powerful, extra-dense FALD backlight.
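For what it's worth, the figures quoted in this thread are consistent with simple stacking arithmetic: the contrast ratios of the two layers multiply. A back-of-envelope sketch, assuming roughly 1000:1 per layer (an illustrative figure, not a spec) and the ~300-nit peak quoted from the video:

```latex
\[
C_{\text{stack}} = C_1 \times C_2 \approx 1000 \times 1000 = 10^6:1
\]
\[
L_{\text{black}} \approx \frac{L_{\text{peak}}}{C_{\text{stack}}} = \frac{300\ \text{nits}}{10^6} = 0.0003\ \text{nits}
\]
```

which lands right on the .0003-nit black depth mentioned later in the thread. The catch is that transmittance losses multiply the same way, which is the brightness and power problem.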

It really doesn't make much sense; even microLED is more practical, and that isn't ready yet.

E: Yeah, viewing angles are another issue. Since you have two panels, each with its associated structure around the subpixels, any color variation or brightness reduction off-axis will be at least doubled.
 
The video did say the idea wasn't new, that Dolby did some kind of demo years ago but with a lot of artifacts. It sounds difficult, but it looked decent at a glance in the video, and they said they had made advancements past some of the previous hurdles, especially artifacts (going by what Linus said in the video). Effectively 2 million zones of local dimming, multiple layers of light blocking, etc.

The .0003 black depth, the 100% DCI-P3 color, Linus saying it's "cheaper than OLED to make", and 120Hz if HDMI 2.1 models arrive all sounded good. The brightness ("just shy of 300nit") is severely lacking, and I think he said input lag was at 25ms due to the panel-matching chip, which, while OK, isn't great compared to modern 120Hz TVs. Two LCDs' worth of power wouldn't bother me at all though; I already run three LCDs at my desk plus a surround receiver and speakers :b If all else were good, the brightness would still be a dealbreaker. As it is, it's like an even dimmer peak-color pseudo-OLED-black-depth display with no burn-in risk, instead of the high HDR color nits of a high-end FALD LCD and future microLED tech. Now, if it had no artifacting or screw-ups with VRR and HDR, and had 2000nit or more HDR color with 10ms input lag and good response time, it would look more like a microLED option.

It'll still be interesting to see if it goes anywhere or improves.
 
Source? How would ray-tracing be useful for self-driving cars?

Just something I saw in an article somewhere or other back toward the beginning/middle of last year. Nvidia had been bringing the tech into their servers to do properly accurate simulations for autonomous driving for both Tesla and Uber, and then both of those hardware and software deals fell through after Tesla went a different direction and Uber slowed things way down after the Arizona crash. With the hardware developed and extra servers around, and no customers wanting to be associated with autonomous-driving failures and fatal accidents, Nvidia instead put the hardware to other use.
 
Personally I think we should make the infrastructure space age..

... That is, by making factory-prefabricated smart road sections and turning the entire road infrastructure into smart roads, starting with the highways, instead of rock-bed concrete and asphalt-frosting driveways. With both the roads and the cars being smart, and with every car equipped with smart emitters/transmitters/receivers by law, the margin of error would be extremely small, and it would also help in bad weather conditions. Phones or some other smart devices could also help pedestrians be "seen". You could also lay massive stretches of fiber in the roads at the same time, to reach everywhere roads go in the long run. This would also go a long way toward making AR on roadways more doable, and sooner, to enhance things even more. Of course, it would also put a lot of (taxable and spending) people to work for decades too. We've had automated assembly lines for products that fly down conveyor belts and rollers and auto-sort and divert etc. for decades; why can't we do something similar and make the roads into a smart machine? You could alternatively retrofit existing roads with smart technology, which would be a lot cheaper of course, but probably not as grand in vision and results.
/gets off soapbox.

TLDR: Rather than trying to make mainly optics-based singular cars, make the roads smart, all crossings smart, all vehicles smart, and pedestrians smart entities via smart devices as part of the system framework too. Make jobs doing it.
 
It'll still be interesting to see if it goes anywhere or improves.

Yeah, it seems like it could be good for mid-range screens.

But they still have a long way to go for the high end, and by the time they improve it enough to compete with OLED, microLED will be here. If they ever even get to that point.
 
The brightness ("just shy of 300nit") is severely lacking...
Brightness seems to be one of the biggest sticking points for display color: as brightness rises, you start to lose color volume. Conversely, it's probably easier to hit 100% of DCI-P3 at lower brightness levels.
 
It's capable of .003 black depth but it isn't very bright, definitely not HDR color bright. Around 25ms or less input lag so that has room for improvement.

Linus is being a goof. It's marketed as 2900 nits, not 300 nits as the video suggests. The input lag is because their chipset is mediocre.

If Samsung/LG made one of these using their own processors, I'd go for it.

Really, the sacrifices versus OLED here are the perfect viewing angles and the panel response times.

I could live with that trade-off for high brightness, guaranteed no burn-in, and a lower price, and hopefully either 8K (for upscaled 4K120) or a smaller panel size. Then again, we'd need to see the blacks in real life to judge the capabilities of the tech.
 
The best professional monitors also used to be the best gaming monitors (FW900). There is no fucking excuse for why it isn't like that today. High end professional monitors should not have high input lag and be useless for gaming. It's a fucking scam to try to sucker people into buying more monitors than they need.
 
Just want to correct that the first time I gave figures from the video, I misquoted and typed .003 black depth when the screen is actually .0003 nit black depth, at least according to Linus in that video. So the .003 quoted from my reply a few times here is wrong; it is actually much darker, at .0003. (Not that .003 is bad at all.)
 
The best professional monitors also used to be the best gaming monitors (FW900). There is no fucking excuse for why it isn't like that today. High end professional monitors should not have high input lag and be useless for gaming. It's a fucking scam to try to sucker people into buying more monitors than they need.

We'll get back there; the challenge is cost. CRTs were downright cheap compared to what we've spent on 'gaming' and 'professional' monitors, and the best of each have wildly varying objectives that don't mesh well with current LCD technology, mostly because LCD technology will likely never be 'enough'. And hell, OLED might not be either.
 
The best professional monitors also used to be the best gaming monitors (FW900). There is no fucking excuse for why it isn't like that today.
I'm not sure that's true. You might have a point on input lag, but it's entirely possible that when it comes to refresh rate, there's a trade-off between speed and image quality. The very best pro monitors still use IPS-Pro panels, which notably have never achieved higher refresh rates, AFAIK.

A bit more broadly, different panel types excel in different areas: IPS has the best colors and viewing angle, VA the best blacks and contrast, and TN the fastest pixel transitions. Ideally, you'd want the best of all three, but that hasn't been possible due to inherent limitations of the technology. I have some optimism that we might finally see some "best of all worlds" LCDs, with high zone-count mini LED FALD and dual-layer tech coming online in the next couple of years, plus ongoing improvements to the traditional panel types (e.g., Samsung's VA panel TVs have gotten better than ever).
 
The very best pro monitors still use IPS-Pro

I suppose it depends on the purpose. I strongly suspect that the OLED-based Sony mastering monitors are better in accuracy and uniformity than even the best Eizo (or whoever) LCDs, but I'm sure long hours of photo editing on one of those would produce burn-in eventually. And pretty much any OLED panel can be operated at 120Hz or above if the supporting parts are there (TCONs, input standards, etc.).
 
Sony is actually moving away from OLED for their reference monitors. The new BVM-HX310 will be LCD, likely the same panel as the Eizo CG3145 and FSI XM311K. These use Panasonic's dual-layer tech to achieve very high contrast ratios (similar to what Hisense showed at CES), and can maintain color volume at the high brightness needed for HDR, unlike OLED.
 
Yes, I'd hope they'd have something like that. Typical TN and IPS panels are 860:1 or 980:1 out of a 1000:1 contrast ratio (other than the Nvidia FALD ones), with .14 nit black depths too. That's not the best for media playback and games by any means.

While it's possible to master broader-ranged content, and much broader HDR content like games, using color-gradient maps, that is, tone maps (much like temperature maps), to represent the higher peaks and deeper blacks, or otherwise using numbers to map the brightness/color ranges, you wouldn't actually be seeing the content as the end viewer would.
https://www.resetera.com/threads/hdr-games-analysed.23587/
Shadow of Mordor
SoM offers a super simplified approach, the game is always outputting 10k nit maximum and the tone mapping is left entirely to the TV. This is interesting, as we know that the developers have never actually seen the game outputting at 10k nits as there is no such display available.
There is a traditional Brightness slider which allows users to specify the black point to their taste/viewing conditions.

Here we can see the obvious place to look for 10k nits, the sun and the specular highlights. You can also see that as it should be, the shadowed sided of the player character is as dark as it should be.

For home theater, media, and games, typical IPS black depth and contrast ratios are not good enough anymore imo. Neither is standard edge-lit backlighting.

Some of the high-end Eizos are only 1500:1 contrast, 350nit. That Sony dual-layer sounds interesting and a step in the right direction.

We'll get back there; the challenge is cost. CRTs were downright cheap compared to what we've spent on 'gaming' and 'professional' monitors, and the best of each have wildly varying objectives that don't mesh well with current LCD technology, mostly because LCD technology will likely never be 'enough'. And hell, OLED might not be either.

I'd also like to point out that though the FW900 graphics-professional CRT could be found refurbished for $350 - $450 in later years, when it was released in 2001 it was $2000. In 2018 dollars, that is $2870. It was a graphics-professional monitor, not a cheap CRT. ;)
 
Sony is actually moving away from OLED for their reference monitors. The new BVM-HX310 will be LCD, likely the same panel as the Eizo CG3145 and FSI XM311K. These use Panasonic's dual-layer tech to achieve very high contrast ratios (similar to what Hisense showed at CES), and can maintain color volume at the high brightness needed for HDR, unlike OLED.

Hmm, that's unfortunate. I guess they don't need to care about pixel response times, since HFR hasn't really caught on for anything except maybe sports. It's definitely true that LCDs have a color volume advantage at high brightness (at least for now), but to me this is a very minimal upside compared to the order-of-magnitude faster pixel response times on self-emissive displays.

There really isn't one magic display tech that does everything perfectly, unfortunately, even at the $30K price point.
 
What benefit does the fast pixel response give over, say, the Samsung Q9 series?
 
Where OLED's response times would make a bigger difference is above 120Hz, but we are just about to see 4K 120Hz 4:4:4 chroma on HDMI 2.1 displays... without any HDMI 2.1 GPUs for probably another year :b

Modern VA gaming monitor response time
================================
is good to 120hz or so before it starts to smear the worst (slowest) black/white/grey transitions.

--That is, modern gaming VA panels are tight until fed over 120fps on an over-120Hz display, where frame times shorter than 120fps's 8.3ms start dropping below the response time of the display.--

TFTCentral LG GK850G review:
"The average G2G figure was measured at 8.3ms which was good for a VA panel. In fact it would have been even better at about 6.4ms G2G if we didn't have those few particularly slow changes from black > grey. " ..
.."the best case the response time actually reached down to 2.8ms G2G which was impressive"


Then there are modern feature rich VA tvs like the samsung Q9FN ...
=====================================================

4k.com Review of Samsung Q9FN
----------------------------------------------
"The Q9FN is an ultra-premium 4K HDR TV so you can expect some great motion handling across the board. However, even by the standards of many older and many current premium 4K TV models, this baby delivers the goods stunningly well. For one thing, it has one of the best response times we’ve ever seen in an LCD TV, sitting as it does at a little over 3 milliseconds. This alone means that the Q9FN offers very smooth sharp handling of fast movement on the screen. Beyond this, the Q9FN’s native 120Hz display panel interpolates all sorts of lower frame rate content for very smooth handling on the screen. The inclusion of Black Frame Insertion technology spreads this feature to games played on the TV as well for a more fluid gameplay experience."

It also flickers at 480Hz, and the quoted optional BFI and interpolation help the end-result transition times too, which helps cut the worst offender: sample-and-hold blur, which still affects both OLEDs and LCDs no matter what the OLED response time is.

- I'm sure that 3ms quote is, if anything, an average, but theoretically a 3ms solid response time would be tight to 333FpsHz (6ms tight to 166FpsHz, etc.). That is without considering overdrive quality or the lack of it, high-Hz flicker capability, BFI, or interpolation. And of course that's "tight" only in regard to ghosting/trailing/black-smearing transitions and doesn't mean anything vs. sample-and-hold blur.
---------------------------------------------------

Running high frame-rate averages on a high-Hz monitor, such as 120Hz 4K, can also cut the sample-and-hold blur (suffered by both LCDs and OLEDs) down by about 50% at 120FpsHz, so it is highly dependent on how high your frame-rate average is vs. the graphics demands and settings of any given game.
-------
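To put rough numbers on the sample-and-hold point: perceived blur width is approximately pan speed times persistence, and on a pure sample-and-hold display the persistence is the full frame time. A quick sketch, using an assumed 1000 px/s pan speed purely for illustration:

```latex
\[
\text{blur width} \approx v \cdot t_{\text{persist}}, \qquad t_{\text{persist}} = \frac{1}{f}
\]
\[
60\ \text{FpsHz}: 1000 \cdot \tfrac{1}{60} \approx 16.7\ \text{px}
\qquad
120\ \text{FpsHz}: 1000 \cdot \tfrac{1}{120} \approx 8.3\ \text{px}
\]
```

hence the roughly 50% blur reduction at 120FpsHz. The "tight to 333FpsHz" figure above is the same arithmetic from the other side: a 3ms G2G transition only completes within a single frame while the frame time 1000/f stays at or above 3ms, i.e. f ≤ ~333.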

The dual layer LCD tech could be a great next step if it comes around in the next few years while we wait on microLED's release and for microLED's prices to be high rather than astronomical.
 
Thank you for your concise answer.
By the time HDMI 2.1 is taking off, we should be seeing the merits of microLED displays.
MicroLED will use large-ish LEDs at first (in comparison), which limits the minimum screen size, but they will shrink rapidly and drop in price some time after.
If the problems of OLED blue brightness/longevity and decent colour volume throughout the brightness range can be solved, we should see serious competition between OLED and microLED.
But I fear OLED will be left behind. Development has been slow, demonstrated by the need to add white subpixels to improve brightness beyond 400 nits (ish), which does nothing positive for OLED colour volume.
I'm interested in what you think about this.

PS: I have a Q9FN but have been following OLED since around Y2K, when Cambridge Display Technology in the UK developed them.
 
What benefit does the fast pixel response give over, say, the Samsung Q9 series?

I'm merely speaking about the technology's potential in general. Current OLEDs mostly do very basic sample-and-hold with a maximum refresh rate of 60Hz, so you don't see as much of the benefit (aside from absolutely no ghosting/smearing) as you could. Samsung Q9s are VA, which is "good enough on average" for 120Hz, but that doesn't mean there's no smearing. Even the fastest VA panels always have degenerate cases going from black to dark or medium grey where they are 2-3x slower than their average response time, which isn't great. Of course, this doesn't matter that much with normal TV content; in TV and film, panning happens very slowly and you are hamstrung by the low framerates of the content. In video games, however, it's easy to introduce visible smearing on any non-strobed LCD panel by just spinning the viewport quickly. Most FPS players, for example, pan their viewports far, far faster than any TV show or film ever would, because it would be extremely disorienting.

To get an idea of just how big the difference is, you can just look at the response time sections of LG C8 and Q9FN reviews. OLED is 17.5x faster on the 80% transition, and 9x faster on the average full transition. MicroLED should be just as fast.

Of course, if you are viewing 30fps content on a 60Hz sample-and-hold display, there's so much motion blur inherent to the content that you may not notice the blur added by the panel, especially one as good as the Samsung Q9's. PC monitors don't seem to get these excellent VA panels.

What this all means is that current self-emissive panels (OLED or MicroLED) are capable of correctly displaying content at 240Hz, 480Hz, or even higher, or of using rolling scans to achieve a similar effect (which is apparently what the Oculus VR display does), which LCD really can't do properly. They are mostly hamstrung by manufacturers and electronics. We've been waiting up until this year just to get enough HDMI bandwidth to do 4K 120Hz, let alone higher refresh rates... and even DP 1.4 lags behind HDMI 2.1.
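For intuition on the rolling-scan point: with a rolling scan, each row is lit for only a fraction (the duty cycle) of the frame, so persistence shrinks without raising the input refresh rate. With an assumed 25% duty cycle at 120Hz (illustrative numbers, not any product's spec):

```latex
\[
t_{\text{persist}} = d \cdot \frac{1}{f} = 0.25 \cdot \frac{1}{120\ \text{Hz}} \approx 2.1\ \text{ms}
\]
```

which is about the persistence a ~480Hz sample-and-hold display would have, while still only taking a 120Hz signal.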
 
But they aren't capable of correctly displaying at 240Hz+; as far as I am aware, no OLED display on sale can do this yet.
They are also hamstrung by electronics.
 

Hm? That's what I said. The panels are capable of it, if there were electronics to do it and if manufacturers cared to implement it. If you're doing your own electronics like Oculus did, you can absolutely make a rolling-scan OLED display with super low persistence using existing panels and technology.
 
If the problems of OLED blue brightness/longevity and decent colour volume throughout the brightness range can be solved, we should see serious competition between OLED and microLED. But I fear OLED will be left behind.
I wouldn't expect MicroLED to become cost competitive with OLED in the next few years. OLED, meanwhile, has a number of improvements that should be landing in the next year or two: top emission, TADF, and QD-OLED all have the potential to offer marked improvements in brightness and/or lifetime.
 
I'd guess that things like the Dell Alienware 55" OLED gaming panel with HDMI 2.1 and DisplayPort inputs will be the top-end "right now" gaming displays for 2019. This is because Nvidia doesn't support variable refresh rate over HDMI, at least not yet...
https://www.anandtech.com/show/1384...5inch-4k-120-hz-oled-gaming-monitor-showcased
while Dell says that the Alienware 55 display is set to support an adaptive refresh rate technology, the manufacturer does not disclose whether it will eventually support AMD’s FreeSync/FreeSync 2 or NVIDIA’s G-Sync/G-Sync HDR when it is finalized. As for connectivity, the monitor features DisplayPort 1.4 and HDMI 2.1 (with the latter possibly pointing not only to a new cable requirement, but also to variable refresh rate (VRR) and other HDMI 2.1 features support).

Any 2019 LG OLED TVs or high-end Samsung Q series with HDMI 2.1 and 4K 120Hz 4:4:4 will lack variable refresh rate from Nvidia GPUs because Nvidia doesn't support VRR over HDMI (and, I suppose you could say, because the TVs lack a DisplayPort).

You could of course write off VRR and get one of the high-end TVs anyway. There may still be hope that Nvidia will someday support VRR over HDMI too, but if I had to guess, I'd say it would be after they add a line of GPUs with HDMI 2.1 outputs, which could be 2020 or later.

So there is that, and the fact that HDR games are not ubiquitous yet is another factor in the timeline.

So the forecast, to my sensibilities, seems to be the Dell HDMI 2.1 OLED gaming display or a Samsung Q9FN-series 2019 HDMI 2.1 TV... and perhaps in a few years a dual-layer LCD with 3000nit HDR color and .0003 black depth, assuming they roll out from manufacturers, which could potentially hold ground for 2-3 more years until microLED becomes available at high prices rather than astronomical ones.
 
Hm? That's what I said. The panels are capable of it, if there were electronics to do it and if manufacturers cared to implement it. If you're doing your own electronics like Oculus did, you can absolutely make a rolling-scan OLED display with super low persistence using existing panels and technology.
I agree, OLED gives the better experience currently for VR.
My question is related to how well OLED can compete with microLED over the coming years; it should be serious competition.
i.e., both will be capable of ultra-high Hz, but will OLED be able to overcome the lack of brightness and colour volume?
If they can raise brightness enough for HDR (without the loss of colour volume), it should solve the problem of image persistence under normal PC use, as long as HDR is turned off.
At the moment OLED uses a hack to achieve 800 nits, which doesn't solve the problems OLED has and cannot get close to the 2000nit displays we currently have.
 
I would expect microLED to supersede OLED when it becomes price-competitive, but I also don't see microLED being price-competitive for at least 5-10 years, and it's entirely possible that OLED improvements will happen in the meantime. LG has mainly been producing the exact same panel tech with very incremental improvements; we'll see if anything changes when they move to top emission next year. 2500 cd/m^2 top-emission OLEDs have been demonstrated.

It is pretty difficult to tell what improvements can be made to both OLED and microLED without reading a lot of research studies and probably talking directly to the engineers actually working in R&D at LG and Samsung (and they probably can't tell you much without risking their jobs :p).

Purely speaking about currently buyable tech, I game more than I watch TV, and I find all the FALDs in LCD TVs way too laggy to provide a good experience there. The one in the Acer X27/PG27UQ is very, very impressive and much faster than any TV FALD, but it also took Nvidia over 2 years to tune, and I still can't justify spending $2K USD on a 27" screen... lol. So FALD latency alone means that I'm looking at the 120Hz OLED Dell 55" (it will have DP 1.4 AND HDMI 2.1, so you can use it now with existing video cards and be future-proofed with HDMI 2.1) or maaaaaaaaaybe the PG35VQ/Acer X35 (pending reviews/backlight performance/response times/price) as my next display.
 
TLDR: Rather than trying to make mainly optics-based singular cars, make the roads smart, all crossings smart, all vehicles smart, and pedestrians smart entities via smart devices as part of the system framework too.

Or we can do this instead and eliminate the need for all that shit!
[image: 6202684692_c5f6a48d6d_b.jpg]
 
If you're doing your own electronics like Oculus did, you can absolutely make a rolling-scan OLED display with super low persistence using existing panels and technology.
This is kind of a selling point of these 2019 LGs, though. They support 4K 120Hz playback with a rolling display mode (top to bottom, presumably), for an effective motion rate of 240. The point is, the actual panel is effectively being written to at 240Hz, with about 4ms between writes.

They definitely won't support taking in 240Hz signals, but the BFI motion will make these appear silky smooth.


Any 2019 LG OLED TVs or high-end Samsung Q series with HDMI 2.1 and 4K 120Hz 4:4:4 will lack variable refresh rate from Nvidia GPUs because Nvidia doesn't support VRR over HDMI...

I mean, at this point, before whenever the next GPU generation comes out (hopefully Computex for Navi), we can't make that call.

Especially since Nvidia doesn't have ANY HDMI 2.1 capability yet, it's highly likely that they'll implement VRR when they do, as it's part of the spec.

The Q9FN is a bad buy currently, unless you don't care about the monitor functionality.

It'll be at least a few more months until 4K120 TVs are purchasable (~March-ish?) and then a few more until the potential HDMI 2.1 GPUs release (June/July for Navi). Pricing on the OLEDs also does its declining thing over the year, so that's a small extra bonus for waiting for an HDMI 2.1-capable GPU first.
 
I haven't noticed, what am I missing?
The main selling point: the 4K120 video mode.

Downscaling to 1080p for full-chroma content, or to 1440p for reduced-chroma content, is a severe limitation that HDMI 2.1 alleviates.
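The limitation is straightforward bandwidth arithmetic. A rough sketch, counting pixel data only (8 bits per channel assumed; blanking and protocol overhead glossed over):

```latex
\[
3840 \times 2160 \times 120\ \text{Hz} \times 24\ \text{bpp} \approx 23.9\ \text{Gbit/s}
\]
\[
\text{HDMI 2.0: } 18\ \text{Gbit/s raw} \approx 14.4\ \text{Gbit/s of data}
\qquad
\text{HDMI 2.1: } 48\ \text{Gbit/s raw} \approx 42.7\ \text{Gbit/s of data}
\]
```

4K120 4:4:4 simply doesn't fit through HDMI 2.0, which is why today's sets fall back to lower resolution or reduced chroma; under HDMI 2.1 it fits with headroom to spare.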

If you wanted to buy something _RIGHT_NOW_, sure, but otherwise it's LG/Alienware all the way. Hopefully the 2019 Samsung flagship comes in with 4K120 + BFI as well.

Also, the QD-OLED by Samsung looks really interesting, but that's a next-year thing by the looks of it.
 