ASUS 25" 500Hz TN panel

Image quality is probably meh on this panel, but it's also probably very responsive, especially at 500fps
 
I'm gonna be one of those dudes that comes on this thread and says "OLED has spoiled me". Never going back to TN.

Now if they were to find a way to create this performance in an IPS or VA panel, it would be far more interesting.
 
I'm gonna be one of those dudes that comes on this thread and says "OLED has spoiled me". Never going back to TN.

Now if they were to find a way to create this performance in an IPS or VA panel, it would be far more interesting.

Nothing is more suitable for higher refresh rates than OLED. The fact that there hasn't been even a 240Hz OLED monitor yet is mind boggling.
 
Nothing is more suitable for higher refresh rates than OLED. The fact that there hasn't been even a 240Hz OLED monitor yet is mind boggling.
What's even more mind boggling is how burn-in and ABL have not been fixed and still plague all OLED panels. At 500Hz this thing should have outstanding motion clarity. I'm on a 120Hz display and it's barely acceptable imo, but it's also 4K, so yeah, it's gonna be a while before we have it all.
 
What's even more mind boggling is how burn-in and ABL have not been fixed and still plague all OLED panels. At 500Hz this thing should have outstanding motion clarity. I'm on a 120Hz display and it's barely acceptable imo, but it's also 4K, so yeah, it's gonna be a while before we have it all.

Well, unfortunately that's just inherent to the technology and will never be fixed. That's like asking the crystals in an LCD to be self-emissive or have no response time.
 
What's even more mind boggling is how burn-in and ABL have not been fixed and still plague all OLED panels. At 500Hz this thing should have outstanding motion clarity. I'm on a 120Hz display and it's barely acceptable imo, but it's also 4K, so yeah, it's gonna be a while before we have it all.
The best displays in our history have all been self-emissive. Even the new Micro LED will be prone to burn-in. Maybe not as much as OLED, but if there's a chance that some pixels get used more than others, then they will degrade sooner (and thus exhibit burn-in). That's the reality. Nothing is perfect, nothing lasts forever.

Edit - self-emissive displays will always have the propensity to burn in.
 
What's even more mind boggling is how burn-in and ABL have not been fixed and still plague all OLED panels. At 500Hz this thing should have outstanding motion clarity. I'm on a 120Hz display and it's barely acceptable imo, but it's also 4K, so yeah, it's gonna be a while before we have it all.
I have a 3-year-old C9 OLED with over 4,000 hours on it, and it still looks as good as the day I got it. I also have a CX with over 1,000 hours, used exclusively with a Windows PC, still with no issues.
 
The best displays in our history have all been self-emissive. Even the new Micro LED will be prone to burn-in. Maybe not as much as OLED, but if there's a chance that some pixels get used more than others, then they will degrade sooner (and thus exhibit burn-in). That's the reality. Nothing is perfect, nothing lasts forever.

Edit - self-emissive displays will always have the propensity to burn in.
I thought Micro LED was supposed to be immune to burn-in because it doesn't rely on organic materials.
 
I thought Micro LED was supposed to be immune to burn-in because it doesn't rely on organic materials.

Nope, just a myth. The longevity depends on what they're made of, but a non-organic LED isn't immune to decay and death. Just like the non-organic LED light bulbs in your house and the backlights in LCD TVs, they will dim over time and eventually die, and the harder they're driven, the faster this happens.
"Burn-in" is just when some of the pixels don't get as bright anymore, so you would see a grayish burned-in logo on a bright white screen.

There's always a risk with any display unless it's being used at a very low brightness compared to what it's capable of. Even for the backlights on LCDs.
 
I thought Micro LED was supposed to be immune to burn-in because it doesn't rely on organic materials.
Depends how hard it's driven.
Well-heatsinked LEDs have exceeded projected longevity in the testing data I have seen. The longest test I know of was/is Cree with an XR-E? (one of their first LEDs). Degradation was less than 1% per year at outrageous drive/temperature conditions. Later models have also exceeded projections even when run at very high TJ and high drive.
What this means is that if you run an LED at around 60-70% current and keep TJ around, preferably under, 40 degrees C (i.e. better heatsinked than most consumer shit out on the market), it will outlive anyone currently reading this forum, excluding freak bathtub-curve exceptions/defective parts. I have some with 20-30k hours and basically no measurable degradation (within margin of error) over a decade of doing this. OLED appears to be quite similar if treated nicely; if you run them moderately, the electronics/capacitors will be more of a worry than the emitters when looking at extreme lifetime use. In short, I'd say MicroLED (either discrete or lithographic MicroLED, which is what I'm really looking at for HMD asskicking: 1000Hz already a year or more ago) should be very reliable, and I'd rate it a little above OLED, as the heatsinking is easier and the materials are better understood, running at higher power in other applications and thus less subject to degradation.
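To put that rate in perspective, here's a quick compounding calculation (a minimal sketch, purely illustrative, assuming the <1%/year figure above stays constant):

[CODE=python]
# Illustrative only: fraction of original light output left after N years,
# assuming a constant fractional loss per year (the <1%/year figure above).

def remaining_output(years: int, loss_per_year: float = 0.01) -> float:
    """Compounded brightness remaining after the given number of years."""
    return (1.0 - loss_per_year) ** years

for years in (5, 10, 20):
    print(f"{years:2d} years: ~{remaining_output(years) * 100:.0f}% of original output")
# Prints roughly 95%, 90%, and 82% respectively.
[/CODE]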
 
LEDs degrade. Not as quickly as organic material, but they do degrade.
That is true. Journalism has certainly made people, including me, believe that the tech is immune to burn-in. But it does make sense that any LED will degrade over time.
 
That is true. Journalism has certainly made people, including me, believe that the tech is immune to burn-in. But it does make sense that any LED will degrade over time.

It's not immune, but it really shouldn't even be considered an issue at all. You know even LCDs can burn in, right? Yet nobody ever considers burn-in to be a problem with LCD. Much like nobody considers burn-in a problem with LCD, most people wouldn't even bother mentioning burn-in with Micro LED; but sure, if you want to be technical about it, then I guess Micro LED can burn in.
 
Do large TV/monitor OLEDs flicker at low brightness like phones do? Now, I haven't laid hands on a 90-120Hz phone, but every other OLED I have flickers. Samsung tablets and phones and LG phones are my only experience, though. I keep thinking how neat a non-flickering OLED monitor would be. I honestly don't think I'd care if it exceeded 120Hz.
 
Do large TV/monitor OLEDs flicker at low brightness like phones do? Now, I haven't laid hands on a 90-120Hz phone, but every other OLED I have flickers. Samsung tablets and phones and LG phones are my only experience, though. I keep thinking how neat a non-flickering OLED monitor would be. I honestly don't think I'd care if it exceeded 120Hz.
I keep my CX at 0 brightness most of the day. No detectable flicker from what I've seen. It's a sample-and-hold system, so it doesn't change the pixel until it has to. LG's OLED TVs do have a Black Frame Insertion feature which does flicker the screen to improve motion clarity in gaming, but that's an option and not on by default.
 
So if you look at a picture of a white vertical line on a black screen and look left and right quickly, it's a solid blur and not flashing?
 
So if you look at a picture of a white vertical line on a black screen and look left and right quickly, it's a solid blur and not flashing?
That is correct. There is no flickering in the image unless you turn on BFI.

(Embedded video: the CX shown in slow motion during a frame refresh.)

 
People really care about the longevity of their TVs? Most will likely replace them long before they die. I have never kept a TV for 10 years.
 
For laptops lol, where are the desktop monitors?
Most people don't leave their laptop lids open all the time. They close them when they aren't being used. Burn-in is less of a problem on a laptop compared to a desktop monitor, where people are more reckless about leaving static elements on screen.
 
The best displays in our history have all been self emissive. Even the new micro led will be prone to burn-in. Maybe not as much as OLED but if there’s a chance that any pixels would be more used than others then they will degrade sooner (and thus exhibit burn-in). That’s the reality. Nothing is perfect, nothing lasts forever.

Edit - self-emissive displays will always have the propensity to burn in.
I guess it depends on how you define "self-emissive", but a raster-scanning display, for example a Laser Phosphor Display, could potentially have less burn-in risk than any OLED or μLED while retaining excellent contrast and having inherently superior motion.
 
Do large TV/monitor OLEDs flicker at low brightness like phones do? Now, I haven't laid hands on a 90-120hz phone, but every other OLED I have flickers. Samsung tablets and phones and LG phones are my only experience, though. I keep thinking how neat a non-flickering OLED monitor would be. I honestly don't think I'd care if it exceeded 120hz.
Don't know if it's the same kind of flicker as phones/tablets, but all the displays based on LG's panels do flicker. Not just at low brightness but at all times. Seems to be the way they refresh. Unfortunately in my case it's enough to cause eyestrain / headache within several minutes of dark room viewing.
(Image: RTINGS flicker measurement of the LG C2 OLED.)
 
I guess it depends on how you define "self-emissive", but a raster-scanning display, for example a Laser Phosphor Display, could potentially have less burn-in risk than any OLED or μLED while retaining excellent contrast and having inherently superior motion.
Yes! I mentioned this on the CRT thread. I would love to see a laser phosphor display in action.
 
Nothing is more suitable for higher refresh rates than OLED. The fact that there hasn't been even a 240Hz OLED monitor yet is mind boggling.
240Hz OLED is finally coming. I saw Samsung demo one at DisplayWeek 2022.

I guess it depends on how you define "self-emissive", but a raster-scanning display, for example a Laser Phosphor Display, could potentially have less burn-in risk than any OLED or μLED while retaining excellent contrast and having inherently superior motion.
I love raster-laser displays (most CRT-accurate look temporally) but I'm also a big fan of eventually simply using brute refresh rate (1000Hz+ OLED) to simulate a CRT electron beam in firmware or software (GPU shader). This allows a display-independent method of simulating a CRT tube.

Basically a temporal equivalent of a spatial CRT filter, utilizing rolling scan and simulated phosphor decay.
As you retina-out resolution, spatial CRT filters become more and more accurate.
As you retina-out refresh rate, temporal CRT filters become more and more accurate.
(Doing both simultaneously would be golden -- e.g. 4K+ 1000Hz+ OLED or MicroLED)

It's the same way that playing back a high-speed 1000fps video of a CRT tube on a 1000Hz display (in real time) starts to temporally look like the original CRT itself. It's pretty interesting when this is tested on laboratory displays. Here, instead, you'd emulate the CRT raster electron beam in software, in fine-granularity Hz increments. For a 960Hz display, you'd use 16 digital refresh cycles to emulate 1 CRT Hz (for a 60Hz CRT emulation). For a 1920Hz display, you'd use 32 digital refresh cycles to emulate 1 CRT Hz. Today, we can already do it (sorta) using 6 digital refresh cycles per CRT Hz on commercially available 360Hz monitors, but we can't get simulated phosphor decay fast enough to look fully realistic, and current 360Hz displays aren't HDR (poor brightness during CRT beam emulation tests). But if all the technological pieces of the puzzle converge simultaneously someday, then that is a path to display-independent CRT look-and-feel preservation.
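If you want the bookkeeping spelled out, here is a minimal sketch in Python (purely illustrative, not any actual shader; the even row split per cycle and the simple exponential decay are my own simplifying assumptions):

[CODE=python]
# Hypothetical sketch of rolling-scan CRT emulation on a high-Hz panel.
# One emulated CRT refresh is spread across display_hz / crt_hz digital
# refresh cycles; each cycle lights one horizontal band of rows while the
# bands swept earlier fade like decaying phosphor.

def cycles_per_crt_frame(display_hz: int, crt_hz: int = 60) -> int:
    """Digital refresh cycles per emulated CRT refresh (960 -> 16, 1920 -> 32, 360 -> 6)."""
    return display_hz // crt_hz

def beam_band(cycle: int, cycles: int, height: int) -> range:
    """Rows 'hit by the beam' during a given digital refresh cycle (even split assumed)."""
    rows = height // cycles
    return range(cycle * rows, (cycle + 1) * rows)

def phosphor_brightness(cycles_since_swept: int, decay: float = 0.5) -> float:
    """Simulated phosphor persistence: brightness falls off after the beam moves on."""
    return decay ** cycles_since_swept

if __name__ == "__main__":
    cycles = cycles_per_crt_frame(display_hz=960, crt_hz=60)  # 16 cycles per emulated 60Hz frame
    for c in range(cycles):
        band = beam_band(c, cycles, height=1080)
        print(f"cycle {c:2d}: beam on rows {band.start}-{band.stop - 1}, "
              f"previous band dimmed to {phosphor_brightness(1):.2f}x")
[/CODE]

The faster the underlying panel, the more cycles you get per emulated CRT frame, so the rolling band gets thinner and the simulated phosphor decay gets finer-grained; that's why brute refresh rate is the key ingredient here.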

Once you provide retina gamut (a superset of the CRT color gamut), retina resolution (enough to resolve CRT phosphor dots), retina refresh (enough to simulate the CRT electron beam), and sufficient HDR surge room (for the bright CRT beam), you can theoretically make a digital flat panel pass a CRT Turing test (A/B test a flat tube vs an OLED/MicroLED/similar behind thick CRT-mimic glass). This isn't possible for a while yet, but definitely within my lifetime in an upcoming decade (in time, before current CRTs die of wear and tear, etc.).

Configurable shadow masks/aperture grilles. Configurable phosphor chemistry. Configurable phosphor decay. Mimic a specific CRT fairly accurately, once gamut, resolution, and refresh are all sufficiently retina'd out at the same time to allow virtually human-vision-perfect simulation.

Although CRTs have instant rise time (sometimes nanoseconds), we only effectively need it to be retina (e.g. in the hundred-microsecond league), which is within the range of the projected refresh rate race over the next 20 years. At least until displays finally reach the retina refresh rate where no further human-visible benefits are possible (currently computed to be approximately ~4000Hz for a 24" 1080p display at arm's length, but ~20,000Hz for a 180-degree 16K VR headset; ideally 2x oversampled, if you want to account for known temporal Nyquist factors). There are already engineering plans for 8K 1000Hz test prototypes.

P.S. I am cited in 25+ peer-reviewed research papers in the display-temporals sphere; I hope to collaborate on a new one related to CRT electron beam simulation. I have access to prototype quad-digit Hz displays.
 
240Hz OLED is finally coming. I saw Samsung demo one at DisplayWeek 2022.


I love raster-laser displays (most CRT-accurate look temporally) but I'm also a big fan of eventually simply using brute refresh rate (1000Hz+ OLED) to simulate a CRT electron beam in firmware or software (GPU shader). This allows a display-independent method of simulating a CRT tube.

Basically a temporal equivalent of a spatial CRT filter, utilizing rolling scan and simulated phosphor decay.
As you retina-out resolution, spatial CRT filters become more and more accurate.
As you retina-out refresh rate, temporal CRT filters become more and more accurate.
(Doing both simultaneously would be golden -- e.g. 4K+ 1000Hz+ OLED or MicroLED)

It's the same way that playing back a high-speed 1000fps video of a CRT tube on a 1000Hz display (in real time) starts to temporally look like the original CRT itself. It's pretty interesting when this is tested on laboratory displays. Here, instead, you'd emulate the CRT raster electron beam in software, in fine-granularity Hz increments. For a 960Hz display, you'd use 16 digital refresh cycles to emulate 1 CRT Hz (for a 60Hz CRT emulation). For a 1920Hz display, you'd use 32 digital refresh cycles to emulate 1 CRT Hz. Today, we can already do it (sorta) using 6 digital refresh cycles per CRT Hz on commercially available 360Hz monitors, but we can't get simulated phosphor decay fast enough to look fully realistic, and current 360Hz displays aren't HDR (poor brightness during CRT beam emulation tests). But if all the technological pieces of the puzzle converge simultaneously someday, then that is a path to display-independent CRT look-and-feel preservation.

Once you provide retina gamut (a superset of the CRT color gamut), retina resolution (enough to resolve CRT phosphor dots), retina refresh (enough to simulate the CRT electron beam), and sufficient HDR surge room (for the bright CRT beam), you can theoretically make a digital flat panel pass a CRT Turing test (A/B test a flat tube vs an OLED/MicroLED/similar behind thick CRT-mimic glass). This isn't possible for a while yet, but definitely within my lifetime in an upcoming decade (in time, before current CRTs die of wear and tear, etc.).

Configurable shadow masks/aperture grilles. Configurable phosphor chemistry. Configurable phosphor decay. Mimic a specific CRT fairly accurately, once gamut, resolution, and refresh are all sufficiently retina'd out at the same time to allow virtually human-vision-perfect simulation.

Although CRTs have instant rise time (sometimes nanoseconds), we only effectively need it to be retina (e.g. in the hundred-microsecond league), which is within the range of the projected refresh rate race over the next 20 years. At least until displays finally reach the retina refresh rate where no further human-visible benefits are possible (currently computed to be approximately ~4000Hz for a 24" 1080p display at arm's length, but ~20,000Hz for a 180-degree 16K VR headset; ideally 2x oversampled, if you want to account for known temporal Nyquist factors). There are already engineering plans for 8K 1000Hz test prototypes.

P.S. I am cited in 25+ peer-reviewed research papers in the display-temporals sphere; I hope to collaborate on a new one related to CRT electron beam simulation. I have access to prototype quad-digit Hz displays.
The King has spoken lol. Bro, I'll be honest, I didn't understand anything you said, but boi did I enjoy reading it, like the wannabe nerd that I am :woot:
 