Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Little late to this discussion, but I just wanted to chime in and say that IPS panels are still the gold standard for color grading professionals. Not all IPS are created equal, of course, but the best monitors for accurate color grading still appear to use IPS panels.
Yep. The mini-LED IPS panels that Apple is using are amazing. They aren't great for gaming, but for everything else they're excellent.
 
"everything else" . Well, edge lit ips are not great for media/HDR and black depths and the only way to get uniformity on a FALD IPS for accurate work is to turn FALD off. It's only a 45x25 lighting resolution, or maybe 125x75 on the small 10,000 zone mac screens.

With FALD off you are back to 1000:1 to 3000:1 contrast on mixed content/small checkerboard tests, with the accompanying black depths, which is very poor compared to FALD enabled or OLED.

With FALD on, contrasted details (lights in darks and darks in lights) will still only be around 3000:1 to 5000:1 in and around those areas, blended cleverly across multiple zones where possible rather than hard haloing, where that's avoidable anyway, while the rest of the screen, where larger fields of bright and dark are well separated, can be much higher contrast, much brighter and much darker. So FALD won't be uniform. It works well as a workaround for media, but it's not uniform enough to be accurate.
 
"everything else" - yeah not for media/ HDR . The only way to get uniformity on a FALD IPS for accurate work is to turn FALD off. It's only a 45x25 lighting resolution, or maybe 125x75 on the small 10,000 zone mac screens.

With FALD off you are back to 1000:1 to 3000:1 contrast on mixed content/small checkerboard tests, with the accompanying black depths, which is very poor by comparison.

With FALD on, contrasted details (lights in darks and darks in lights) will still only be around 3000:1 to 5000:1 in and around those areas, while the rest of the screen, where larger fields of bright and dark are well separated, can be much higher contrast, much brighter and much darker. So FALD won't be uniform. It works well as a workaround for media, but it's not uniform enough to be accurate.
Apple does a far better job with their FALD setup because they program the firmware better.

Also, keep in mind that anyone buying these Apple displays for productivity work is using them in a well-lit room/office. The FALD blooming becomes far less noticeable in that environment.

I've used these displays frequently on both a MacBook Pro and the Studio Display, and they are absolutely killer. Gaming is the only area where I wouldn't use them, though; for that I far prefer this AW3423DW.
 
It still won't be uniform for accurate work unless you turn the FALD off. FALD is non-uniform by nature. Most people don't have that requirement in general usage, but it's worth saying in regard to "everything else".

That the 16" 10,000 zone FALD apple displays are around 76 PPD to 83PPD at 18" to 20" viewing distance and are glossy means they'd look pretty tight besides.

That said, it would be great if someone extrapolated that 10,000-zone mini-LED laptop density up to a 55" 4K or 8K screen at 100,000 zones someday. Price/profit and maybe heat/energy concerns are probably preventing that, though.

It looks like Apple will eventually go the OLED route in the long run, however. OLED should be brighter by then too:

MacRumors: April 4th, 2023

In a tweet, Young shared a new Reuters report detailing Samsung Display's $3.1 billion investment in OLED production in Asan, South Korea and said that the facilities will be used to make OLED displays for 14- and 16-inch MacBook Pro models from 2026.


Last month, Young said that the MacBook Pro is unlikely to adopt an OLED display until 2026, when Apple's supply chain is expected to have sufficient notebook-optimized OLED display production capacity. Until then, Young said suppliers will be focused on OLED displays for tablets, such as the iPad Pro.

In the same report, Young explained that the first OLED Mac is expected to be a MacBook Air with a slightly smaller, 13.4-inch display. Simultaneously, it was reported that Samsung Display has started development of OLED displays that will be used for this future MacBook Air model.

In addition to the 2026 time frame, the information suggests that despite Apple's wish to get away from Samsung displays and switch to its own custom MicroLED technology, Samsung Display will have an omnipresent role in supplying OLED panels for Apple's next-generation devices – contributing to the 11.1-inch iPad Pro, 13.4-inch MacBook Air, and 14- and 16-inch MacBook Pro.
 
It still won't be uniform for accurate work unless you turn the FALD off. FALD is non-uniform by nature. Most people don't have that requirement in general usage, but it's worth saying in regard to "everything else".

That said, it would be great if someone extrapolated that 10,000-zone mini-LED laptop density up to a 55" 4K or 8K screen at 100,000 zones someday. Price/profit and maybe heat/energy concerns are probably preventing that, though.
There are already micro-LED displays in the 55" range; rather than upscaling Apple's mini-LED tech, I look forward to the micro-LED tech being downsized.
 
There are already micro-LED displays in the 55" range; rather than upscaling Apple's mini-LED tech, I look forward to the micro-LED tech being downsized.

Not 100,000-zone ones. I was hypothetically extrapolating the density/small array of the 16" MacBook to 55", in terms of the number of backlights per degree, for the near term. Where the roughly "4K" MacBook gets 10,000 mini-LED zones for 4K of resolution, that same density could (at least space-wise) fit over 100,000 mini-LED zones on a 55" 4K screen. We don't have any 100,000-zone FALD screens yet, at least in the enthusiast consumer pricing space. That is probably a long way off yet for microLED.
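Back-of-the-envelope, that space-wise extrapolation is just area scaling by the diagonal ratio (ignoring the laptop's slightly different aspect ratio; the 16.2" diagonal is my assumption):

```python
zones_macbook = 10_000        # ~10,000 mini-LED zones on the 16"-class panel
diag_macbook_in = 16.2        # assumed diagonal of that panel
diag_tv_in = 55

area_ratio = (diag_tv_in / diag_macbook_in) ** 2
print(round(zones_macbook * area_ratio))   # ~115,000 zones at the same zone density
```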

There are ultrawide screens more or less as wide as 55" 16:9s already, and wider, just cut in half vertically. A 55" 1000R 8K, for example, could theoretically host a quad of 4K screen spaces: run a high-Hz 4K game 1:1 centrally, 4K AI-upscaled to 8K full screen, various-width ultrawide resolutions, or 5K/6K 16:9s, etc.

A 45" 21:9 ultrawide is around 42" wide ignoring any curves.

49" super ultrawide is 47" wide
55" 16:9 screen is 48" wide.

57" super ultrawide is 55" wide
 
Not 100,000-zone ones. I was hypothetically extrapolating the density/small array of the 16" MacBook to 55", in terms of the number of backlights per degree, for the near term. Where the roughly "4K" MacBook gets 10,000 mini-LED zones for 4K of resolution, that same density could (at least space-wise) fit over 100,000 mini-LED zones on a 55" 4K screen. We don't have any 100,000-zone FALD screens yet, at least in the enthusiast consumer pricing space. That is probably a long way off yet for microLED.

There are ultrawide screens more or less as wide as 55" 16:9's already, and wider, just cut in half vertically. A 55" 1000R 8k for example could theoretically have a quad of 4k screen spaces, run high hz 4k game 1:1 centrally, 4k AI upscaled to 8k full screen, various width ultrawide resolutions or 5k, 6k 16:9's etc.

A 45" 21:9 ultrawide is around 42" wide ignoring any curves.

49" super ultrawide is 47" wide
55" 16:9 screen is 48" wide.

57" super ultrawide is 55" wide

I see, maybe there's a disconnect in our understanding of each other. I was just referencing the micro-LED TVs that have millions of LEDs.
 
I see, maybe there's a disconnect in our understanding of each other. I was just referencing the micro-LED TVs that have millions of LEDs.

You made microLED sound almost ubiquitous. So far they have been priced out to the extreme: tens of thousands of dollars. Everyone seems to be dumping money into brighter, higher-tech OLED for displays and microOLED for VR for now. MicroLED might not be in the enthusiast consumer price range for years yet, but they supposedly are making somewhat smaller ones, so that might change. Samsung didn't release any pricing information at all yet AFAIK, so I'd expect sticker shock for the next few years. "Most affordable microLED to date" isn't saying much versus an $80,000 TV lol.

https://www.sammobile.com/news/samsung-micro-led-tv-2023-launched-50-140-inch/

Four years after introducing The Wall at CES 2018, Samsung is poised to take its MicroLED technology mainstream. At CES 2023, the company announced it would offer 50-, 63-, 76-, 89-, 101-, 114- and 140-inch MicroLED models, greatly expanding the amount of choice consumers have when it comes to the new display technology. Samsung didn’t provide pricing and availability information for the expanded line, but the company claims the new models are its most affordable MicroLED TVs to date. Since a few of the sets are smaller than any of the MicroLED TVs Samsung has offered in the past, you also won’t need to pay for a professional to install them in your home.


Samsung claims its MicroLED line will set the standard for picture quality in 2023. And judging from the 76-inch model’s topline features, that’s not a mere boast from the company. The set sports a 240Hz variable refresh rate and 2-nanosecond response time. It also offers 20-bit black detail for “intense”
 
You made microLED sound almost ubiquitous. So far they have been priced out to the extreme: tens of thousands of dollars. Everyone seems to be dumping money into brighter, higher-tech OLED for displays and microOLED for VR for now. MicroLED might not be in the enthusiast consumer price range for years yet, but they supposedly are making somewhat smaller ones, so that might change. Samsung didn't release any pricing information at all yet AFAIK, so I'd expect sticker shock for the next few years. "Most affordable microLED to date" isn't saying much versus an $80,000 TV lol.

https://www.sammobile.com/news/samsung-micro-led-tv-2023-launched-50-140-inch/
Well, I don't disagree, but we were just talking dream hypotheticals, and with micro-LED the technology is there; it's only going to get cheaper and smaller. I remember when plasma TVs first came out: my dad bought a 40" Sony for $5000, and within a year they were sub-$1000 and the Sony already had burn-in lol.

Anyway, just putting this out into the universe, but a 27" micro-LED would be awesome!
 
You made microLED sound almost ubiquitous. So far they have been priced out to the extreme: tens of thousands of dollars. Everyone seems to be dumping money into brighter, higher-tech OLED for displays and microOLED for VR for now. MicroLED might not be in the enthusiast consumer price range for years yet, but they supposedly are making somewhat smaller ones, so that might change. Samsung didn't release any pricing information at all yet AFAIK, so I'd expect sticker shock for the next few years. "Most affordable microLED to date" isn't saying much versus an $80,000 TV lol.

https://www.sammobile.com/news/samsung-micro-led-tv-2023-launched-50-140-inch/
Manufacturers also aren't expecting MicroLED to get that much cheaper any time soon. If you look at cost projections, it is expected to stay very expensive for quite a while. It's cool tech, to be sure, but a good ways out from being made cheap enough for most people.

Also, while it has advantages, it still can suffer from burn-in. Burn in isn't an OLED problem, it is an emissive display problem. Light sources fade with use. If you have a display that uses a uniform backlight that is all on at the same level, like an LCD, then it'll fade (mostly) uniformly and so get dimmer, but no image related burn in. If you have a display that is emissive, where each element emits its own light then it will fade based on the usage of each element, which means it can burn in. CRT, Plasma, OLED, MicroLED, doesn't matter. If you display something static with a brighter area, it will slowly burn in as those elements are used more. MicroLED having higher peak brightness and long overall life will deal with burn in MUCH better than OLED, but it still can happen.

For now, and for the immediate future (like 5-ish years at least), OLED seems to be the way to go if you want a per-subpixel emissive tech.
 
Manufacturers also aren't expecting MicroLED to get that much cheaper any time soon. If you look at cost projections, it is expected to stay very expensive for quite a while. It's cool tech, to be sure, but a good ways out from being made cheap enough for most people.

Also, while it has advantages, it still can suffer from burn-in. Burn in isn't an OLED problem, it is an emissive display problem. Light sources fade with use. If you have a display that uses a uniform backlight that is all on at the same level, like an LCD, then it'll fade (mostly) uniformly and so get dimmer, but no image related burn in. If you have a display that is emissive, where each element emits its own light then it will fade based on the usage of each element, which means it can burn in. CRT, Plasma, OLED, MicroLED, doesn't matter. If you display something static with a brighter area, it will slowly burn in as those elements are used more. MicroLED having higher peak brightness and long overall life will deal with burn in MUCH better than OLED, but it still can happen.

For now, and for the immediate future (like 5-ish years at least), OLED seems to be the way to go if you want a per-subpixel emissive tech.

Hey, I never said this would happen soon; I just said it would be cool if it happened.

I just bought two new monitors, and my last pair lasted me 10 years and was still going strong at the time of replacement, so I'll be happy if micro-LED tech or an equivalent is available in a monitor-size solution at an affordable price point in the next 7-10 years.

I've read online that micro-LED is impervious to burn-in; are you sure it's not?
 
I've read online that micro-LED is impervious to burn-in; are you sure it's not?
I am, because for it to be impervious to burn-in, the LEDs would have to never fade, and LEDs do fade. They fade slowly, particularly if you don't drive them too hard, but they do fade. If you compare the brightness of a brand-new LED to one that has been providing light for 10 years, the difference is noticeable.

Now what I imagine they mean is that under normal use, the fade will be so minor, and something the electronics can compensate for, that you'll never see any burn-in. But physically, the LEDs will wear out at different rates based on what they are displaying and how bright, which means burn-in is possible. I don't imagine it'll be a concern in most practical scenarios though. Even OLED is getting much better on that front and while still a concern, it has gone from "this is a big problem be really careful" to "you probably won't have an issue don't worry too much".
 
They aren't organic like OLED, but heat will probably be an issue, especially in 4000-nit screens. I wouldn't be surprised if active cooling became a thing on most HDR screens eventually, perhaps with profiles like GPUs use. The ProArt 1400-1600-nit FALD screens already have boxy, vented grille housings with active cooling fans on a cooling profile of some kind. The 2000-nit 4K and 8K LED FALD LCD Samsung screens went for a slim chassis and no active cooling, and they both suffer aggressive ABL like an OLED.

Not the same kind of LEDs, but there are LED lighting systems for reef aquariums that grow corals. Aquarium lighting was previously hot metal halide bulbs, which ran very hot (like an arc welder) and used a lot of energy due to being inefficient. The metal halide bulbs also burned down enough to affect the corals, with a spectrum shift/drop in bulb output in about 6 months. The LEDs are cooler, much more energy efficient, and much, much longer lived, but they still require active cooling to deal with heat. They are usually blasting at 100% output all of the time they are on, though, probably 6-10 hours a day.

This one is 200W but reef aquariums of decent size might have several of these panels above them.


[Product images: Neptune Systems SKY LED fixture, diffuser and circulation diagrams]

"
Advanced Thermal Management

The SKY has an advanced approach to thermal management that is more than just the two fans you see.

So, why two fans? With two fans the SKY can operate them at much lower RPM while keeping the LED cooler. Cooler LEDs mean a longer lifespan for your light. Also, running the fans slower means that the SKY is quieter, in fact, it's the quietest actively-cooled LED in its power-class. The SKY puts out just 33dB at one meter and set to 100%. The leading competitor clocks in at >40dB at 100% power.

Along with these fans a temperature and power monitoring algorithm manages their speed. If one fan were to fail, the other will pick up the slack and keep the SKY running and safely cooled until a replacement is obtained. Also, if for some other reason the SKY cannot adequately cool itself, such as excessively high ambient air temperature, it will throttle the power to the LED to keep them safely cool."

. . . .

. . . .

They claim that, when adequately cooled, the LEDs should last 50,000 hours, which is a crazy amount of time. 10 hours a day, 365 days a year, is 3,650 hours a year; 10 years would be 36,500 hours. That is when cooled, though, so it would be shorter if they ran hotter, and the heat could even damage the components if hot enough (and, in the case of an aquarium, heat the water unsafely or trigger other aquarium cooling compensation methods unnecessarily).

It's not an apples-to-apples comparison with microLED, but for reference those reef lighting units have only 104 LEDs per lighting unit. A microLED display will be jam-packed if it's at the subpixel level (4K is over 8 million pixels). The microLEDs won't all be on at 100% output all of the time during content viewing, though. Most scenes are mixed.
 
If I had to look at my plant-growing aquarium LEDs all day, I would be blind.

IMO, more than 1000 nits just isn't needed on a display that you are sitting 6 inches from. My X27 burns my eyeballs out when it's operating at 1000 nits. We just need OLED that is capable of doing 1000 nits full panel in a reliable fashion.
 
They claim that, when adequately cooled, the LEDs should last 50,000 hours, which is a crazy amount of time. 10 hours a day, 365 days a year, is 3,650 hours a year; 10 years would be 36,500 hours. That is when cooled, though, so it would be shorter if they ran hotter, and the heat could even damage the components if hot enough (and, in the case of an aquarium, heat the water unsafely or trigger other aquarium cooling compensation methods unnecessarily).

That's in line with most inorganic LEDs. They usually have a 30k-50k hour, or sometimes even longer, listed life. That isn't actually their MTBF or how long until they die, but how long until they reach X% of their rated output level. What X is varies; 70% and 50% seem common. For a room light or something like that, that's all that matters: so long as the light output is still enough, the light is still useful. A display is more complex, because while 70% could easily be more than enough brightness, if part of it has faded that much and part has not... that's burn-in.

Now, if the process is slow enough, it really won't be noticeable with normal content use, and for inorganic LEDs it should be. It's also something the display processor can compensate for to a degree, which is part of how OLEDs handle it with image refreshing and such. I imagine that in real-world use MicroLED displays will be resistant enough to burn-in for it to be a non-issue. It's always a possibility, though; I would still probably prefer LCD for digital signage displays.
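Just to put rough numbers on that, here's a toy lumen-depreciation sketch. The exponential fade model and the assumption that the earlier 50,000-hour figure is an L70 rating (time to 70% of initial output) are mine, not anything from a datasheet:

```python
import math

rated_hours = 50_000      # claimed lifetime from the earlier post
rated_fraction = 0.70     # assume that figure is an L70 rating (70% of initial output)
k = -math.log(rated_fraction) / rated_hours   # decay constant for a simple exponential fade

def output_fraction(hours):
    """Estimated fraction of original brightness after a given number of hours."""
    return math.exp(-k * hours)

# 10 hours/day for 10 years is ~36,500 hours (the arithmetic from the earlier post)
print(round(output_fraction(36_500), 2))   # ~0.77 of original output
```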
 
If I had to look at my plant-growing aquarium LEDs all day, I would be blind.

IMO, more than 1000 nits just isn't needed on a display that you are sitting 6 inches from. My X27 burns my eyeballs out when it's operating at 1000 nits. We just need OLED that is capable of doing 1000 nits full panel in a reliable fashion.



That's not how HDR works, though (or real life for that matter, which goes much higher even in non-extreme lighting environments). You wouldn't be looking at full brightness full screen all day, not by a long shot.

Brighter HDR in parts of a scene will look more realistic, but it will still not be bright compared to reality overall.

Dolby engineers have done talks where they said they master a typical 10,000-nit HDR scene as:

50% of screen/scene space (cumulatively) at 0 to 100 nits
25% at 100 to 1,000 nits
25% at 1,000 to 10,000 nits for highlights, reflections and light sources (cumulatively, not necessarily all in one part of the screen: e.g. scintillations on water, starfields, metallics, a bright sun or moon as a small part of a dynamic scene, headlights, bright flames and light sources, etc.)

An HDR display typically maps the first half of the display's capability 1:1 (e.g. up to 500 nits on a 1000-nit-capable screen), then compresses the rest of the HDR curve into the remaining half of the screen's limits, intelligently, based on the manufacturer's display design/firmware. I'm guessing an HDR 4000-nit screen would map that first 75% of a scene 1:1 up to 1000 nits, then compress the last 25% (the scene's 1,000-nit to 10,000-nit range) into the 1,000-nit to 4,000-nit range of the display, but Dolby and/or the display manufacturer could do it however they decide to.
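A toy version of that kind of roll-off might look like the sketch below. To be clear, this is not the actual PQ/Dolby Vision math; the half-way knee and the log compression are just stand-ins for the behaviour described above:

```python
import math

def tone_map_nits(content_nits, display_peak, knee_fraction=0.5, content_max=10_000):
    """Map 1:1 up to a knee, then log-compress the rest into the panel's remaining headroom."""
    knee = display_peak * knee_fraction          # e.g. 500 nits on a 1000-nit panel
    if content_nits <= knee:
        return content_nits                      # mapped 1:1 below the knee
    # squeeze the [knee, content_max] range into [knee, display_peak]
    t = math.log(content_nits / knee) / math.log(content_max / knee)
    return knee + (display_peak - knee) * t

print(round(tone_map_nits(4_000, 1_000)))   # a 4,000-nit highlight lands around 847 nits here
```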

There are scenes where a nuclear bomb goes off, or scenes where a character has a single torch in a cave, so it varies from that typical breakdown, and it also depends on the camera's point of view at any given time (e.g. zoomed in on a character's face while they hold a bright torch or lightsaber alongside it, taking up a large part of the screen momentarily, versus a long shot where the torch/lightsaber is a small portion of the screen).
You also aren't getting a normal viewing angle six inches away from a screen, unless maybe it's a phone, but I'm guessing you were exaggerating there.


. .

I was citing the grow lights, which instead do full-field max output the whole time they are on, just as an example of long-lived, bright LEDs that get hot enough to require an active fan cooling curve even with a low-density array of them.

Hypothetically, a 4,000 to 10,000-nit HDR display manufacturer could allow for user-selectable performance vs. cooling curves, like a desktop GPU or gaming laptop has:

Less performance/lower HDR peaks ("throttled"/capped nits outright, and/or aggressive ABL) with passive cooling

Moderate performance with normalized fan curves and safe fan-boosting ranges (+ ABL if necessary)

Extreme peak range with a higher "throttle" point, using a more aggressive active cooling profile and higher fan RPM


Then you'd be able to set it how you'd like: capped lower, with more compressed/squashed color volume and little to no active cooling fan noise, or middle or peak HDR performance. Being able to swap modes on the fly (like you can with named OSD picture settings) depending on what you were viewing would be nice. That would be the best way to do it, I think (a rough sketch of the idea follows below). I hope that kind of thing happens, rather than there being no option other than OSD brightness settings, or worse, only weak passive cooling with maximum ABL/restrictions from the factory.
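Purely as a what-if, the kind of selectable profile I mean could be as simple as something like this; every name and number here is invented, not any real monitor's firmware:

```python
# Hypothetical user-selectable brightness/cooling profiles, in the spirit of GPU fan curves.
PROFILES = {
    "silent":   {"peak_nits": 1000, "max_fan_rpm": 0,    "abl": "aggressive"},
    "balanced": {"peak_nits": 2500, "max_fan_rpm": 1200, "abl": "moderate"},
    "extreme":  {"peak_nits": 4000, "max_fan_rpm": 2400, "abl": "minimal"},
}

def allowed_peak_nits(profile, panel_temp_c, throttle_temp_c=60):
    """Cap peak brightness for the chosen profile, backing off above a temperature limit."""
    p = PROFILES[profile]
    if panel_temp_c <= throttle_temp_c:
        return p["peak_nits"]
    # back off 100 nits per degree over the limit, but never below 1000 nits (made-up behaviour)
    return max(1000, p["peak_nits"] - 100 * (panel_temp_c - throttle_temp_c))
```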
 
Hypothetically, a 4,000 to 10,000-nit HDR display manufacturer could allow for user-selectable performance vs. cooling curves, like a desktop GPU or gaming laptop has.
They are going to have to as peak brightness grows, because some directors are not going to keep it reasonable. You see the same shit with audio. THX specs 105 dB peaks on the mains and 115 dB on the sub. That is LOUD, way louder than you should listen to for any extended period of time. However, the intent wasn't that you'd listen at that level for a long time; it is there for big hits: be able to peak to high levels, but normal dialogue should sit around 65-75 dB and music maybe 80-85 dB.

Ya well, some directors like, say, Michael Bay, abuse the shit out of that and push it up loud throughout a whole lot of the movie. It is not something most want to listen to, particularly at home, so the ability to turn down the volume and/or change dynamic range compression is necessary.

I'm sure the same shit will happen with HDR. It should be something where, as you say, most of it is in SDR range with some HDR highlights, but some will go apeshit, turn everything up to 11, and have full-scene brightness of 2000 nits or more all the time.
 
They are going to have to as peak brightness grows, because some directors are not going to keep it reasonable. You see the same shit with audio. THX specs 105 dB peaks on the mains and 115 dB on the sub. That is LOUD, way louder than you should listen to for any extended period of time. However, the intent wasn't that you'd listen at that level for a long time; it is there for big hits: be able to peak to high levels, but normal dialogue should sit around 65-75 dB and music maybe 80-85 dB.

Ya well, some directors like, say, Michael Bay, abuse the shit out of that and push it up loud throughout a whole lot of the movie. It is not something most want to listen to, particularly at home, so the ability to turn down the volume and/or change dynamic range compression is necessary.

I'm sure the same shit will happen with HDR. It should be something where, as you say, most of it is in SDR range with some HDR highlights, but some will go apeshit, turn everything up to 11, and have full-scene brightness of 2000 nits or more all the time.

Yeah, the peak brightness of small highlights is much more impactful than full screen brightness. Your eyes adjust to full screen brightness just like IRL. So it's only impactful at the very beginning of a scene change, and it can be uncomfortable to viewers to have their eyes adjusting.

But if it's just a tiny area of the screen that is very bright and the overall picture level remains the same, your eyes do not adjust. Or if it's a brief, very bright explosion and then back to normal, it's also more impactful.

And same for audio. I don't want to have permanent hearing damage just because there is an earthquake going on, and I don't want to wear sunglasses because a scene takes place midday in the desert.
 
Yeah, the peak brightness of small highlights is much more impactful than full screen brightness. Your eyes adjust to full screen brightness just like IRL. So it's only impactful at the very beginning of a scene change, and it can be uncomfortable to viewers to have their eyes adjusting.

But if it's just a tiny area of the screen that is very bright and the overall picture level remains the same, your eyes do not adjust. Or if it's a brief, very bright explosion and then back to normal, it's also more impactful.

And same for audio. I don't want to have permanent hearing damage just because there is an earthquake going on, and I don't want to wear sunglasses because a scene takes place midday in the desert.
Yep. Dynamic range, in sound or light, is what makes for impact. However some directors don't seem to understand that and turn it up to 11.
 
As far as I know, there have been no apps that allow the 2D desktop's actual running app windows to be placed in 3D space, or even rendered flat onto a front-facing cube (rather than as a rotating-axis object) in a 3D engine where graphics anti-aliasing could be active.

Virtual Desktop at www.vrdesktop.net is a fantastic third-party alternative.

It lets you render the desktop in many simulated environments, from outer space to a walkable 3D render of an ordinary computer room.

A workaround is to use multiple monitors and treat each monitor as its own application window.

The monitors become resizable, repositionable panes in 3D space, or sit on simulated screens of your choice (a rendering of a computer monitor, a television, or an IMAX screen).

Utilities such as OVR ToolKit, which add multiple screens, can also be helpful in getting you a multi-monitor experience in VR as a substitute.
 
Virtual Desktop at www.vrdesktop.net is a fantastic third-party alternative.

It lets you render the desktop in many simulated environments, from outer space to a walkable 3D render of an ordinary computer room.

A workaround is to use multiple monitors and treat each monitor as its own application window.

The monitors become resizable, repositionable panes in 3D space, or sit on simulated screens of your choice (a rendering of a computer monitor, a television, or an IMAX screen).

Utilities such as OVR ToolKit, which add multiple screens, can also be helpful in getting you a multi-monitor experience in VR as a substitute.


I used that on an Oculus Quest before, now that you mention it (before my Quest 1 bricked to the firmware/reset screen at the 2-year point).

. . . but I never tried to see if those desktop apps were capable of applying anti-aliasing to a 2D desktop in a 3D environment, full screen on a computer monitor/TV. That was what I was getting at.

I'd assume it would just apply the 2D desktop as a projection/texture on a virtual polygon or box, like compositing a video onto the side of a cube in a 3D scene, for example. Not every type of AA affects textures; some only affect the edges of polygons. But maybe there could be a way to use a 3D desktop rendering app to apply a small amount of some type of AA that could anti-alias the whole 2D desktop's imagery (including text), experimentally.

I realize that text sub-sampling designed specifically for a given subpixel layout, on a letter-by-letter basis, would be better, but with non-standard layouts like WRGB and PenTile, applying some anti-aliasing to the entire desktop might be a better compromise (along with viewing at a high enough PPD). The other benefit of that method would be that the rest of the 2D desktop's content, graphics and imagery, would also be getting anti-aliased.
 
RedMagic have announced a 49" QD-OLED super ultrawide. They label it as "4K" but it's actually 5120x1440, so a bit of a PR goof. Still, it's 240 Hz, and if you enable DLDSR you get the 2160p vertical res in games, so technically you could say it is 4K... if you have an Nvidia card that can do DLDSR.

https://wccftech.com/redmagic-massi...ng-display-240hz-samsung-panel-cheaper-price/
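For reference, Nvidia's DSR/DLDSR factors apply to the total pixel count, so each axis scales by the square root of the factor; a quick check of the numbers involved:

```python
import math

def dldsr_res(width, height, factor=2.25):
    """Render resolution for a given DSR/DLDSR factor (factor multiplies total pixel count)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dldsr_res(5120, 1440))   # (7680, 2160) on the 49" 32:9, i.e. 2160 vertical
print(dldsr_res(3440, 1440))   # (5160, 2160) on the AW3423DW
```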

I'm probably just doing something wrong, but honestly I've found that DLDSR just makes the game look WORSE, like a lot blurrier. I even turned the "DSR Smoothness" all the way down to 0, which is supposed to be no smoothing, and it still looks worse than native 4K.
 
RedMagic have announced a 49" QD-OLED super ultrawide. They label it as "4K" but it's actually 5120x1440, so a bit of a PR goof. Still, it's 240 Hz, and if you enable DLDSR you get the 2160p vertical res in games, so technically you could say it is 4K... if you have an Nvidia card that can do DLDSR.

https://wccftech.com/redmagic-massi...ng-display-240hz-samsung-panel-cheaper-price/
I've been using the Nubia RedMagic 5G smartphone for several years, and it has been stable as a rock: not a single software or hardware hiccup in all the years of operation (unlike the fancy Sony that I use now). They also have a high-quality ecosystem for their phones, like BT headphones that sound awesome, and all sorts of joysticks. I mean, I've been a fan of Nubia RedMagic gear for some time now and can only say good things about it. If they make something, I rather trust it will be a quality product.

There you have it, my free-of-charge opinion on Nubia RedMagic gear.
I believe this is a case of praise where praise is due.
 
Yeah, their phones are good too (just lacking in customisation features compared to, say, Samsung, and no IPX rating), but otherwise great performers.
I'm probably just doing something wrong, but honestly I've found that DLDSR just makes the game look WORSE, like a lot blurrier. I even turned the "DSR Smoothness" all the way down to 0, which is supposed to be no smoothing, and it still looks worse than native 4K.
Something is definitely being done wrong then, because this certainly isn't the case!
 
Something is definitely being done wrong then, because this certainly isn't the case!

My guess is that DSC is playing a role here. My monitor uses DSC in order to achieve 4K 144 Hz 10-bit HDR, so probably the combination of DSC + DSR screws up the image. Perhaps on a display that doesn't rely on DSC the results would be different.
 
Oh, I thought you were referencing the QD-OLED, hence this thread lol. The QD-OLED looks amazing with DLDSR at 5160x2160 (2.25x), 144 Hz 10-bit.
 
I may finally have to get one of these. I decided to stick with my AW3821DW because I like the bigger size, and I worry a little about burn-in given what I do on the desktop, things like Nuendo that keep lots of static elements on screen for long periods. That works fine; I do most of my gaming on my Q95B. Well, that is just totally ruining me when it comes to HDR games. For SDR I'm still OK with my LCD. No, it isn't as nice as an OLED, but it's fine. However, I am an HDR addict, I love it, I can't go back, and HDR is SOOOOO much better on OLED.

So I'm seriously thinking about getting one for when I am gaming at the desktop rather than on the TV (both are hooked to the same PC; it's a matter of whether the kind of game is better played at a desk, or whether the GF is using the TV for something else).
 
I may finally have to get one of these. I decided to stick with my AW3821DW because I like the bigger size, and I worry a little about burn-in given what I do on the desktop, things like Nuendo that keep lots of static elements on screen for long periods. That works fine; I do most of my gaming on my Q95B. Well, that is just totally ruining me when it comes to HDR games. For SDR I'm still OK with my LCD. No, it isn't as nice as an OLED, but it's fine. However, I am an HDR addict, I love it, I can't go back, and HDR is SOOOOO much better on OLED.

So I'm seriously thinking about getting one for when I am gaming at the desktop rather than on the TV (both are hooked to the same PC; it's a matter of whether the kind of game is better played at a desk, or whether the GF is using the TV for something else).
I've had mine since launch, use a static taskbar, and usually have 2 Edge windows open all the time, and I haven't noticed any burn-in or anything.
Mine is set to do the Pixel Refresh every time I turn it off.

[screenshot]
 
I've had mine since launch, use a static taskbar, and usually have 2 Edge windows open all the time, and I haven't noticed any burn-in or anything.
Mine is set to do the Pixel Refresh every time I turn it off.

Bullshit, more like 80 windows open! OMFG!

(I know what you meant, but HOLY SHIT)
 
I've had mine since launch, use a static taskbar, and usually have 2 Edge windows open all the time, and I haven't noticed any burn-in or anything.
Mine is set to do the Pixel Refresh every time I turn it off.
Ya, I'm still considering it, particularly since I am in the fortunate financial position of being able to buy shiny new toys fairly often, meaning even if I do burn it in, I can replace it with newer technology in a couple of years if I wish. I'll probably get one if I decide I want to do any serious gaming at the desktop instead of on the TV. I will really miss the extra vertical res in Nuendo, though.
 
Question for you all. I got my monitor fairly early on and have the M0B101 firmware. I don't use this monitor a ton, as I normally game on my LG OLEDs, but occasionally I do use this display.
The panel is in great shape, and overall, I'm satisfied with the monitor (I don't hear fan noise, colors seem decent).

Would it be worth contacting Dell for an RMA to get a later firmware? I would like the display to not have to shut off to run the pixel refresh when I do use it, but it isn't the end of the world. I hear about bad experiences where people get RMA monitors worse than their original one.
 
Question for you all. I got my monitor fairly early on and have the M0B101 firmware. I don't use this monitor a ton, as I normally game on my LG OLEDs, but occasionally I do use this display.
The panel is in great shape, and overall, I'm satisfied with the monitor (I don't hear fan noise, colors seem decent).

Would it be worth contacting Dell for an RMA to get a later firmware? I would like the display to not have to shut off to run the pixel refresh when I do use it, but it isn't the end of the world. I hear about bad experiences where people get RMA monitors worse than their original one.
I wouldn't. All the new firmware gives you is the automated pixel refresh when the monitor goes into standby. That's pretty much it. Not worth playing the panel lottery for just that feature alone.
 
Have you guys had your monitors not auto turn off / go to sleep? I’ve had to manually power off since day one.
 
Have you guys had your monitors not auto turn off / go to sleep? I’ve had to manually power off since day one.
Make sure you have eco mode enabled. If you don't, the monitor won't go into standby mode. Otherwise, it's likely an issue with whatever devices you have plugged into the monitor giving it an active signal.
 
Make sure you have eco mode enabled. If you don't, the monitor won't go into standby mode. Otherwise, it's likely an issue with whatever devices you have plugged into the monitor giving it an active signal.
Eco mode is enabled...so yeah must be the devices. Dang. Thanks for the reply!
 
On my 4th DW now; the first one had panel marks, and the 2nd and 3rd had burn-in after 2 months and 1.2 years respectively. I started the replacement process on Thursday via WhatsApp and the replacement arrived today. I can report that they sent me a brand-new one made in August 2023.

The burn-in, only visible on this dark grey type of BG:
[photo of the burn-in]


You now also get a microfibre cleaning cloth in the box to wipe off the marks left by the packing sheet that sits on the front panel; not bad.

My old one was the 3rd AW3423DW I've had: the first one was bought at launch but had panel marks, so they replaced it; the 2nd one ended up with screen burn after 2-3 months, so they replaced it with another new one, which was a newer revision (A03) with the same firmware (101) as the 2nd one.

This 4th one is Rev A07 and runs firmware M0B204, so both the firmware and revision are significantly newer than the 3rd AW's, which was made in August 2022.

I can report that the fans appear to be quieter on this new one. They are still audible, but now it's an ambient airflow noise rather than the fan motor hum of the previous ones.

The pixel refresh dialogue is revised too; maybe the refresh algorithm is different as well, to work more effectively. Only time will tell.

[screenshot of the revised pixel refresh dialogue]
 