Really, it's time for special-purpose monitors to end. It's bullshit. All monitors should have accurate colors, low latency, variable refresh, etc. It doesn't have to be and shouldn't be a tradeoff. Anything less is a scam.
Talk about tripling or quadrupling the average price of a monitor...
I get the sentiment, but most computer uses (well, nearly all) just need 'close enough', at the very best.
Honestly it's all economies of scale. If you just make more of the same product you can make it much cheaper. Consumers are just getting fucked hard here. The industry has blown and stagnated for a very long time.
The only part that would be difficult to scale is accurate colors. Perhaps they could devise a production process that produces monitors that are accurate without calibration, but generally speaking, to guarantee calibration, each unit must be tested and adjusted where necessary.
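To make the per-unit step concrete, here is a minimal sketch of what "tested and adjusted where necessary" amounts to: measure a unit's grayscale output, compare it to a target gamma curve, and build a correction LUT. All the numbers and the `target_luminance`/`build_correction_lut` helpers are illustrative assumptions, not any manufacturer's actual process.

```python
# Sketch of per-unit grayscale calibration: measure the panel's actual
# output, then build a per-channel correction LUT toward a gamma-2.2 target.
# Measurement values below are illustrative, not from any real unit.

def target_luminance(level, peak_nits=300.0, gamma=2.2):
    """Ideal luminance for an 8-bit input level under a pure power-law gamma."""
    return peak_nits * (level / 255.0) ** gamma

def build_correction_lut(measured):
    """measured: dict mapping drive level -> measured nits for this unit.
    Returns a 256-entry LUT: for each desired level, the drive level whose
    measured output lands closest to the target curve."""
    levels = sorted(measured)
    lut = []
    for want in range(256):
        goal = target_luminance(want)
        # pick the measured drive level whose output is nearest the goal
        best = min(levels, key=lambda lv: abs(measured[lv] - goal))
        lut.append(best)
    return lut

# Hypothetical measurements from one unit (10% too bright in the shadows):
sample = {lv: target_luminance(lv) * (1.1 if lv < 64 else 1.0)
          for lv in range(0, 256, 8)}
lut = build_correction_lut(sample)
```

Real calibration uses a colorimeter and corrects chromaticity as well as luminance, but the shape of the problem (measure each unit, fit a correction) is the part that resists economies of scale.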
Really, it's time for special-purpose monitors to end. It's bullshit. All monitors should have accurate colors, low latency, variable refresh, etc. It doesn't have to be and shouldn't be a tradeoff. Anything less is a scam.
This won't happen as long as most computer buyers keep buying the cheapest displays available.
Interesting that someone took the idea of how white OLEDs are sort of per-pixel FALD and made a dual-layer VA LCD with the rear LCD as the backlight. It's capable of .003 black depth, but it isn't very bright, definitely not HDR-color bright. Around 25 ms or less input lag, so that has room for improvement. They claim 4K HDMI 2.1 is possible. Wonder why no one thought of it before. No idea why they used a 1080p panel as the backlight display either.
Panasonic demoed a very similar dual-layer LCD tech a couple of years ago, and there are now a couple of pro monitors you can buy that use it (e.g., Eizo CG3145). Very expensive though.

I hear about progress on this every year, and they always have problems: it isn't bright enough, or it has terrible viewing angles, or strange artifacts, or something.
It seems like they've finally got it working decently here, but you can tell it's far from perfect even through the YouTube video. You can see pretty significant haloing, and who knows how the colors and everything else are. I think it's a long way from actually competing with OLED.
made a dual layer VA LCD with the rear LCD as the backlight. Wonder why no one thought of it before. Idk why they used a 1080p as the backlight display either.
Source? How would ray-tracing be useful for self-driving cars?
The video did say the idea wasn't new: Dolby did some kind of demo years ago but it had a lot of artifacts. It sounds difficult, but it looked decent at a glance in the video, and they said they had made advancements past some of the previous hurdles, especially artifacts (going by what Linus said in the video). Effective 2 million zones of local dimming, multiplied light blocking, etc.
The .0003 black depth, 100% DCI-P3 color, and, according to Linus, "cheaper than OLED to make", plus 120 Hz if HDMI 2.1 models arrive, all sounded good. The brightness ("just shy of 300 nits") is severely lacking, and I think he said 25 ms input lag due to the panel-matching chip, which, while OK, isn't great compared to modern 120 Hz TVs. Two LCDs' worth of power wouldn't bother me at all though; I already run three LCDs at my desk plus a surround receiver and speakers :b

If all else were good, the brightness would still be a dealbreaker. As it is, it's like an even dimmer peak-color, pseudo-OLED-black-depth display with no burn-in risk, instead of getting the high HDR color nits of a high-end FALD LCD or future microLED tech. Now if it had no artifacting or screw-ups with VRR and HDR, and had 2,000 nits or more of HDR color with 10 ms input lag and good response time, it would look more like a microLED option.
It'll still be interesting to see if it goes anywhere or improves.
Interesting that someone took the idea of how white OLEDs are sort of per-pixel FALD and made a dual-layer VA LCD with the rear LCD as the backlight. It's capable of .003 black depth, but it isn't very bright, definitely not HDR-color bright. Around 25 ms or less input lag, so that has room for improvement. They claim 4K HDMI 2.1 is possible. Wonder why no one thought of it before. No idea why they used a 1080p panel as the backlight display either.
Linus is being a goof. It's marketed as 2,900 nits, not 300 nits as the video suggests. The input lag is because their chipset is mediocre.
If Samsung/LG made one of these using their own processors, I'd go for it.
Really, the sacrifices versus OLED here are the perfect viewing angles and the panel response times.
I could live with that tradeoff for high brightness, guaranteed no burn-in, and a lower price, and hopefully either 8K (for upscaled 4K120) or a smaller panel size. Then again, we'd need to see the blacks in real life to judge the capabilities of the tech.
The best professional monitors also used to be the best gaming monitors (FW900). There is no fucking excuse for why it isn't like that today. High end professional monitors should not have high input lag and be useless for gaming. It's a fucking scam to try to sucker people into buying more monitors than they need.
I'm not sure that's true. You might have a point on input lag, but it's entirely possible that when it comes to refresh rate, there's a trade-off between speed and image quality. The very best pro monitors still use IPS-Pro panels, which notably have never achieved higher refresh rates, AFAIK.
Sony is actually moving away from OLED for their reference monitors. The new BVM-HX310 will be LCD, likely the same panel as the Eizo CG3145 and FSI XM311K. These use Panasonic's dual-layer tech to achieve very high contrast ratios (similar to what Hisense showed at CES), and can maintain color volume at the high brightness needed for HDR, unlike OLED.

I suppose it depends on the purpose. I strongly suspect that the OLED-based Sony mastering monitors are better in accuracy and uniformity than even the best Eizo/whoever LCDs, but I'm sure long hours of photo editing on one of those would produce burn-in eventually. And pretty much any OLED panel can be operated at 120 Hz or above if the supporting parts are there (TCONs, input standards, etc.).
Shadow of Mordor
SoM takes a super-simplified approach: the game always outputs a 10,000-nit maximum, and the tone mapping is left entirely to the TV. This is interesting, as we know the developers have never actually seen the game output at 10,000 nits, since no such display exists.
There is a traditional Brightness slider which allows users to specify the black point to their taste/viewing conditions.
Here we can see the obvious places to look for 10k nits: the sun and the specular highlights. You can also see that the shadowed side of the player character is as dark as it should be.
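The job the TV is left with, then, is compressing a signal mastered up to 10,000 nits into its own real peak. A minimal sketch of one such curve follows; this Reinhard-style knee-and-rolloff is purely illustrative (real TVs use their own proprietary tone-mapping curves, and the `knee` parameter is an assumption):

```python
# Sketch of the tone-mapping job left to the TV: pass mid-tones through
# untouched, then roll highlights off asymptotically toward the display's
# real peak. Purely illustrative; not any TV's actual curve.

def tone_map(scene_nits, display_peak=1000.0, knee=0.75):
    """Pass values below knee*peak through; compress the rest toward peak."""
    k = knee * display_peak
    if scene_nits <= k:
        return scene_nits
    span = display_peak - k
    excess = scene_nits - k
    # approaches display_peak as scene_nits grows, never exceeds it
    return k + span * excess / (excess + span)

for nits in (100, 500, 2000, 10000):
    print(nits, "->", round(tone_map(nits), 1))
```

Note how a 10,000-nit sun and a 2,000-nit highlight both land just under the display's peak while shadow detail is untouched, which is exactly why leaving this step to the TV preserves the dark side of the character.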
We'll get back there- the challenge is cost. CRTs were downright cheap compared to what we've spent on 'gaming' and 'professional' monitors, and the best of each have wildly varying objectives that do not mesh well with current LCD technology. Mostly because LCD technology will likely never be 'enough', and hell, OLED might not either.
Sony is actually moving away from OLED for their reference monitors. The new BVM-HX310 will be LCD, likely the same panel as the Eizo CG3145 and FSI XM311K. These use Panasonic's dual-layer tech to achieve very high contrast ratios (similar to what Hisense showed at CES), and can maintain color volume at the high brightness needed for HDR, unlike OLED.
What benefit does the fast pixel response give over, say, the Samsung Q9 series?

Hmm, that's unfortunate. I guess they don't need to care about pixel response times, since HFR hasn't really caught on for anything except maybe sports. It's definitely true that LCDs have a color volume advantage at high brightness (at least for now), but to me this is a very minimal upside compared to the order-of-magnitude faster pixel response times on self-emissive displays.
There really isn't one magic display tech that does everything perfectly, unfortunately, even at the $30K price point.
What benefit does the fast pixel response give over say the Samsung Q9 series?
But they aren't capable of correctly displaying at 240 Hz+; as far as I am aware, no OLED display on sale can do this yet.

I'm merely speaking about the technology's potential in general. Current OLEDs mostly do very basic sample-and-hold with a maximum refresh rate of 60 Hz, so you don't see as much of the benefit (aside from absolutely no ghosting/smearing) as you could. Samsung Q9s are VA, which is "good enough on average" for 120 Hz, but that doesn't mean there's no smearing. Even the fastest VA panels have degenerate cases going from black to dark or medium grey where they are 2-3x slower than their average response time, which isn't great. Of course, this doesn't matter much with normal TV content: in TV and film, panning happens very slowly, and you are hamstrung by the low framerates of the content. In video games, however, it's easy to introduce visible smearing on any non-strobed LCD panel just by spinning the viewport quickly. Most FPS players, for example, pan their viewports far faster than any TV show or film ever would, because it would be extremely disorienting.
To get an idea of just how big the difference is, look at the response-time sections of LG C8 and Q9FN reviews. OLED is 17.5x faster on the 80% transition and 9x faster on the average full transition. MicroLED should be just as fast.
Of course, if you are viewing 30fps content on a 60hz sample and hold display, there's so much motion blur inherent to the content that you may not notice the blur added by the panel, especially one as good as the Samsung Q9. PC monitors don't seem to get these excellent VA panels.
What this all means is that current self-emissive panels (OLED or MicroLED) are capable of correctly displaying content at 240 Hz, 480 Hz, or even higher, or of using rolling scans to achieve a similar effect (which is apparently what the Oculus VR display does), which LCD really can't do properly. They are mostly hamstrung by manufacturers and electronics. We've been waiting until this year just to get enough HDMI bandwidth to do 4K 120 Hz, let alone higher refresh rates... and even DP 1.4 lags behind HDMI 2.1.
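The refresh-rate and rolling-scan points above come down to persistence: on a sample-and-hold display, the smear you perceive while eye-tracking is roughly tracking speed times how long each frame stays lit. A sketch with an illustrative pan speed (the 3840 px/s figure is an assumption, one 4K screen-width per second):

```python
# Perceived smear on a sample-and-hold display while the eye tracks motion:
# roughly tracking speed x frame persistence. A rolling scan or strobed
# backlight lowers persistence without raising the frame rate.

def blur_px(speed_px_per_s, refresh_hz, persistence_fraction=1.0):
    """Smear width in pixels for an object tracked by the eye.
    persistence_fraction < 1 models a rolling scan / strobing."""
    return speed_px_per_s * persistence_fraction / refresh_hz

speed = 3840  # a fast FPS viewport pan: one 4K screen-width per second
for hz in (60, 120, 240, 480):
    print(f"{hz} Hz, full persistence: {blur_px(speed, hz):.0f} px smear")
print(f"120 Hz, 25% persistence: {blur_px(speed, 120, 0.25):.0f} px smear")
```

This is why a 120 Hz rolling scan can match 480 Hz sample-and-hold for motion clarity, and why instant pixel response alone (OLED's strength) isn't the whole story without the electronics to drive it that way.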
But they aren't capable of correctly displaying at 240 Hz+; as far as I am aware, no OLED display on sale can do this yet.
They are also hamstrung by electronics.
I wouldn't expect MicroLED to become cost-competitive with OLED in the next few years. OLED, meanwhile, has a number of improvements that should be landing in the next year or two: top emission, TADF, and QD-OLED all have the potential to offer marked improvements in brightness and/or lifetime.

Thank you for your concise answer.
By the time HDMI 2.1 is taking off, we should be seeing the merits of microLED displays.

MicroLED will use large-ish LEDs at first (in comparison), which limits the minimum screen size, but they will shrink rapidly and drop in price some time after.

If the problems of OLED blue brightness/longevity and decent colour volume throughout the brightness range can be solved, we should see serious competition between OLED and microLED.

But I fear OLED will be left behind. Development has been slow, demonstrated by the need to add white subpixels to improve brightness beyond 400 nits (ish), which does nothing positive for OLED colour volume.

I'm interested in what you think about this.

PS: I have a Q9FN but have been following OLED since around Y2K, when Cambridge Display Technology in the UK developed them.
while Dell says that the Alienware 55 display is set to support an adaptive refresh rate technology, the manufacturer does not disclose whether it will eventually support AMD’s FreeSync/FreeSync 2 or NVIDIA’s G-Sync/G-Sync HDR when it is finalized. As for connectivity, the monitor features DisplayPort 1.4 and HDMI 2.1 (with the latter possibly pointing not only to a new cable requirement, but also to variable refresh rate (VRR) and other HDMI 2.1 features support).
I agree, OLED gives the better experience currently for VR.
Personally I think we should make the infrastructure space age..
... That is, by making factory-prefabricated smart road sections and making the entire road infrastructure smart, starting with the highways, instead of rock-bed concrete and asphalt-frosting driveways. With both the roads and the cars being smart, and every car equipped with smart emitters/transmitters/receivers by law, the margin of error would be extremely small, and it would also help against bad weather. Phones or other smart devices could also help pedestrians be "seen". You could lay massive stretches of fiber in the roads at the same time, to reach everywhere roads go in the long run. This would also go a long way toward making AR more doable, and sooner, on roadways, to enhance things even more. Of course, it would also put a lot of (taxable and spending) people to work for decades. We've had automated assembly lines for decades, where products fly down conveyor belts and rollers and auto-sort and divert; why can't we do something similar and make the roads into a smart machine? You could alternatively retrofit existing roads with smart technology, which would be a lot cheaper, of course, but probably not as grand in vision or results.
/gets off soapbox.
TLDR: Rather than trying to make mainly optics-based singular cars, make the roads smart, all crossings smart, all vehicles smart, and make pedestrians smart entities via smart devices as part of the system framework too. Make jobs doing it.
This is kind of a selling point of these 2019 LGs, though. They support 4K 120 Hz playback with a rolling display mode (top to bottom, presumably) for an effective motion rate of 240. Point is, the actual panel is effectively being written to at 240 Hz, with 4 ms between writes.

Hm? That's what I said. The panels are capable of it, if there were electronics to do it and if manufacturers cared to implement it. If you're doing your own electronics like Oculus did, then you can absolutely make a rolling-scan OLED display with super low persistence using existing panels and technology.
I'd guess that something like the Dell Alienware 55" OLED gaming panel, with HDMI 2.1 and DisplayPort inputs, will be a top-end "right now" gaming display for 2019. This is because Nvidia doesn't support variable refresh rate over HDMI, at least not yet.
https://www.anandtech.com/show/1384...5inch-4k-120-hz-oled-gaming-monitor-showcased
Any 2019 LG OLED TVs or high-end Samsung Q-series with HDMI 2.1 and 4K 120 Hz 4:4:4 will lack variable refresh rate off Nvidia GPUs, because Nvidia doesn't support VRR over HDMI (and, I suppose, because the TVs lack a DisplayPort).

You could of course write off VRR and get one of the high-end TVs anyway. There may still be hope that Nvidia will someday support VRR over HDMI too, but if I had to guess, I'd say it would come after they ship a line of GPUs with HDMI 2.1 outputs, which could be 2020 or later.

So there is that, and the fact that HDR games are not yet ubiquitous is another factor in the timeline.

So the forecast, to my sensibilities, seems to be the Dell HDMI 2.1 OLED gaming display or a Samsung Q9FN 2019 HDMI 2.1 series TV... perhaps in a few years, a dual-layer LCD with 3,000 nits of HDR color and .0003 black depth, assuming they roll out from manufacturers, which could potentially hold ground for 2-3 more years until microLED becomes available at high prices rather than astronomical ones.
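The reason 4K 120 Hz 4:4:4 keeps coming back to HDMI 2.1 is simple arithmetic: the uncompressed data rate is well beyond what HDMI 2.0 can carry. A rough back-of-the-envelope check (10 bits per channel and ~10% blanking overhead are assumptions; the usable-rate figures reflect HDMI 2.0's 8b/10b and HDMI 2.1's 16b/18b coding):

```python
# Why 4K 120 Hz 4:4:4 waits on HDMI 2.1: the uncompressed data rate
# exceeds HDMI 2.0's ~14.4 Gbit/s of usable bandwidth.

def video_gbps(w, h, hz, bits_per_channel=10, channels=3, blanking=1.1):
    """Approximate raw video data rate in Gbit/s; blanking adds ~10%."""
    return w * h * hz * bits_per_channel * channels * blanking / 1e9

need = video_gbps(3840, 2160, 120)
print(f"4K120 10-bit 4:4:4 needs ~{need:.1f} Gbit/s")
print("HDMI 2.0 usable:   ~14.4 Gbit/s (18 Gbit/s link, 8b/10b coding)")
print("HDMI 2.1 usable:   ~42.7 Gbit/s (48 Gbit/s link, 16b/18b coding)")
```

The same arithmetic is why DP 1.4 (~25.9 Gbit/s usable) needs compression or chroma subsampling for this mode, while HDMI 2.1 does not.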
I haven't noticed, what am I missing?

Q9FN is a bad buy currently, unless you don't want the monitor functionality.
The main selling point: the 4K120 video mode.

I haven't noticed, what am I missing?