Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

and they'll cover the same area of your field of vision if you're at the right viewing distance (which is obviously quite close for a phone).

I can guarantee you that I do not cover as much of my field of vision with my phone as I do with a computer monitor when gaming; nor do I sit that close to my TV when watching something.

I can also guarantee you that you won't find more than a few scattered examples of people who do.

The point being, because one usually sits closer to a monitor/TV when gaming, having a super-bright display isn't really that important and could even be a detriment with careless developers.
 
If you say so. For me, we're at least 3-4x away from having bright enough displays to really display HDR as intended in the sense that it duplicates the full dynamic range you can see in real life. More like 10x away at 400 nits.
 
10,000 nits has been thrown around, but I think that's just the HDR spec.

Far, far more important is the ability to show per-pixel brightness down to zero, and to have very fine transitions in luminosity where present in the source material.
 
Nope, they flat out told me they do not care about my business, do not care if I take the issue to social media and do not care if I try to sue them. Every number leads to their India call center, where you are greeted by thick-accented reps who get argumentative when you ask to speak with a United States-based CSR escalation tier.

Also, they were HELL to deal with on getting the replacement exchange. It took three weeks and countless hours / days of riding their ass to get it done.

I get that the fine print says 30 days, but screwing your customer at the 32-day mark is third-rate customer service. Long story short, don't buy from Dell on day one, wait for their inevitable price drop and pray your display has no issues. Otherwise kiss your sanity goodbye.....actually long, LOOOOOOOOONG story short, just don't buy from Dell at all.....unless they release a 32" version lmao

With that said, the AW55 is incredible. It's a shame Dell's marketing/pricing & customer support suck, because the Alienware department did an awesome job. Been playing the new Pacific maps on BFV and the campaign on the new COD and my jaw constantly hits the floor!
I have to admit, it's really weird that this display isn't available through any 3rd party reseller like Amazon or B&H. I'd have recommended buying through them in that case.
The point being, because one usually sits closer to a monitor/TV when gaming, having a super-bright display isn't really that important and could even be a detriment with careless developers.
I sit close to my monitors when doing anything that doesn't involve media consumption. When I'm watching something or playing a game, I move further back. I thought this was the more common way of doing it. Pretty surprised.
 
10,000 nits has been thrown around, but I think that's just the HDR spec.

Far, far more important is the ability to show per-pixel brightness down to zero, and to have very fine transitions in luminosity where present in the source material.

Agree on that. And yeah, 10,000 nits is the ultimate spec, but I think that is just to account for future technological developments and the most extreme possible artistic needs. Few scenes would ever actually call for 10k anywhere.
 
HDR is highlights. The average scene brightness is still quite low; in fact, some people complain that it's too low, since it uses absolute values designed for a dim-to-dark home theater viewing environment. HDR is not like SDR, where the whole scene's brightness is relative and turned up or down.

HDR 400 is not really HDR. Movies are made based on a 10,000 nit color volume. UHD HDR discs so far are HDR 1000, HDR 4000, and at least one is HDR 10,000 (Blade Runner 2049). When a display hits its color brightness ceiling, it clips those colors to white and they are lost. LG OLED TV tech now offers some optional compensatory processing that tries to remap that detail down to the ~600 nit ABL limits they have, since the content is made for HDR 1000 and 600 nit color falls short. 400 really isn't much. OLED displays already look great in SDR though, so I'm not trying to say they don't.
 
Actually not aware of any Panasonic TVs being sold in the US, at least with respect to any real retail presence. Do have an LG TV with a Panasonic panel in it though that's about nine years old.
 
The Panasonic GZ2000 is actually capable of full 1000-nit highlights at up to 10% windows, interestingly. I believe they do this with customized cooling and panel binning. I've never seen one but I'd like to. It's more than double the price of a same-size C9 around here, unfortunately, and I'm not even sure if it's sold in the US at all.

No HDMI 2.1 might rule that out from the start imo when LG C9's have it for future compatibility with 4k 444 120hz bandwidth. Tradeoffs as always I guess.


LG C9 per rtings.com review lists
peak 2% window 855 cd/m², peak 10% window 845 cd/m²

The C9 can reach very good brightness levels with HDR content; slightly better than the 2018 LG C8, but still not as good as top LED models like the Samsung Q90R or Sony Z9F. Unfortunately, it doesn't perform as well in all scenes, due to the C9's aggressive ABL that dims the screen with different content. This is especially noticeable in content with large bright areas.
HDR Real Scene Peak Brightness: 726 cd/m²
HDR Peak 2% Window: 855 cd/m²
HDR Peak 10% Window: 845 cd/m²
HDR Peak 25% Window: 530 cd/m²
HDR Peak 50% Window: 301 cd/m²
HDR Peak 100% Window: 145 cd/m²
HDR Sustained 2% Window: 814 cd/m²
HDR Sustained 10% Window: 802 cd/m²
HDR Sustained 25% Window: 506 cd/m²
HDR Sustained 50% Window: 286 cd/m²
HDR Sustained 100% Window: 138 cd/m²
HDR ABL: 0.109

Panasonic GZ2000 Review
----------------------------------------------------------------
https://www.flatpanelshd.com/review.php?subaction=showfull&id=1568954772

GZ2000 is the first OLED TV capable of hitting 1000 nits peak brightness. It can maintain this level of brightness up to a 10% window (10% of the screen area), while it drops to around 630 nits with a 25% window. To put that into perspective, Sony's latest A9G OLED has peak brightness of around 650 nits.

You may be wondering if it even constitutes a difference and yes, it does. Comparing GZ2000 side-by-side to a Sony OLED, GZ2000 packs serious punch. It is clear that GZ2000 is capable of maintaining a high brightness level with larger coverage area than other OLED TVs on the market today. Together with pixel-level luminance control it is an ideal partner for HDR video.

Furthermore, GZ2000 adheres almost perfectly to the reference EOTF curve from start to end. Panasonic has opted for a modest roll-off; almost a 'hard clip' at the top end. This does not leave the panel with much headroom to resolve highlight details in content mastered to brightness above 1000 nits, which also brings us to our main point of criticism with GZ2000 (yes, so you can skip to the end after this). In highlight details we spotted occasional banding effects, mostly visible in sunset scenes (as seen below). If you look closely at the details around the sun (please ignore that the camera struggles to capture the full dynamic range), you will see some hard gradient transitions.

With that being said, I must emphasize that I almost never spotted the issue on real content, meaning HDR video.

OLED continues to be at the center of a sometimes heated debate about burn-in. Since OLED is a self-emitting display technology, where the light source ages during use, it is indisputable that each light emitting diode will age at a different pace, meaning inhomogeneously, if you abuse the screen by constantly displaying static content. The real question is "how much does it take"? We cannot answer that question during our 2-3 week review period, but curiously, GZ2000 exhibits different behavior than other OLED TVs. Normally, we can provoke temporary retention through the use of our test patterns for calibration. This retention will disappear again soon after. GZ2000 seemed almost immune to our torture tests. Even after long test sessions with a 1000 nits static window there was no retention to be found on the panel - not even on a grey verification pattern. Of course, it is far too early to conclude that Panasonic has cracked the code but one possible explanation could be that since the panel is equipped with a more effective heat dissipation solution, cool off time for the diodes is reduced. If this is indeed the case, the risk of burn-in is most likely reduced, too, since ageing of diodes is greatly accelerated the warmer they get (that's why accelerated tests typically take place at elevated temperatures).

Like other OLED TVs, GZ2000 comes with firmware that acts as a screen saver solution. One feature is the infamous “dimming” function that detects static content on-screen and dims brightness if it is deemed critical. The problem with this algorithm - which appears to be implemented on the panel level - is that there is no option to turn it off, even if you are willing to take the chance. GZ2000 will dim its panel in both SDR and HDR mode, if you let static content remain on the screen. With SDR content, GZ2000 reduces panel brightness by approx. 20% after 5 minutes. With HDR content, panel brightness continues to drop until it almost hits the floor - or you change the content.
 
Panasonic GZ2000 Review
----------------------------------------------------------------
https://www.flatpanelshd.com/review.php?subaction=showfull&id=1568954772
Meh. I love my 65" C9 that I paid like half of what the Panasonic will cost.
 
The C9 has HDMI 2.1 on all ports, where the Panasonic is still HDMI 2.0b.

The difference in peaks, C9 vs GZ2000, is
at a peak 10% window: 845 cd/m² vs 1000 cd/m²
at a peak 25% window: 530 cd/m² vs ~630 cd/m²

The GZ2000 reportedly has much less IR occurrence and perhaps less burn-in risk from a better cooling solution, hand in hand with the higher nit tech.
The C9's ABL is also more aggressive, snapping down to a level about 100 nits lower than the GZ2000's.

And yes like you said it's a big price difference, but it's much worse than what you said according to https://www.techradar.com/reviews/panasonic-gz2000-4k-oled-tv-review
from 11 days ago,
The Panasonic GZ2000 is available in two sizes, 55-inch and 65-inch, priced at £3,299 and £4,399 respectively.

That converts to $5691.87 for the 65" and $4268.58 for the 55" when the C9 has been on sale from authorized LG reseller warehouse sellers on ebay for $1750 - $2100 for the 65" C9 and $1280 - $1400 for the 55" C9. So yeah no contest.
 
But it accepts an 'HDR' signal? Cause 400 is plenty for desktop use and more than I think I'd ever want for gaming. Could you imagine a flashbang at HDR1000?

I've had the 27" Acer X27 give me a full 1000nit blast before and it was like:
[reaction GIF]
 
Idk maybe your display or content had screwed up the white point. HDR is generally perceived as too dim on average scene brightness to a lot of people who aren't using a dim to dark home theater environment.

--- This is what the uninformed think happens:

"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.

If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."

---This is what really happens:


If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.

For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the color spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.
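
Here's a rough Python sketch of that behavior, using the SMPTE ST 2084 (PQ) constants to decode a few 10-bit code values to absolute nits and then "showing" them on a 500 nit and a 1000 nit display. The hard clip at the display peak is a simplification I'm assuming for illustration; real TVs tone map or roll off instead of clipping that abruptly.

Code:
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Decode a PQ code value into an absolute luminance in cd/m2 (nits)."""
    e = code / (2 ** bits - 1)              # normalized signal, 0..1
    ep = e ** (1 / M2)
    num = max(ep - C1, 0.0)
    den = C2 - C3 * ep
    return 10000.0 * (num / den) ** (1 / M1)

def shown(nits, display_peak):
    """Hypothetical display that simply hard-clips above its peak."""
    return min(nits, display_peak)

# These code values decode to roughly 100, 500, 640 and 1000 nits.
for code in (520, 692, 720, 770):
    n = pq_to_nits(code)
    print(f"code {code}: mastered {n:6.1f} nits -> "
          f"500 nit display shows {shown(n, 500):6.1f}, "
          f"1000 nit display shows {shown(n, 1000):6.1f}")

Everything mastered below 500 nits comes out identical on both; only the highlights above the lower display's peak differ.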

=====================================

That means more detail and brighter colors throughout the heights of the 3D color gamut, instead of clipping or roll-off at SDR peaks. It also means that when HDR is done properly, the gamma and white point are absolute, with diffuse white around 200 nits. When you are using a "hybrid" HDR and changing the white point, you are changing everything. If you scale the white point up and/or change your gamma curve to compensate for a poor viewing environment ("It's too dim, I'll turn it up"), your highlights and peaks are going to be darker or much brighter than the HDR standards based on human eyesight.

I'm pretty sure most displays (and users) are not using HDR properly right now. In fact, with true PQ HDR, which uses absolute values, the common complaint is that the average image brightness is too dark for rooms that aren't dim-to-dark home theater rooms. HLG gamma is relative, which allows the white point to be all over the place.


https://www.lightillusion.com/uhdtv.html
"Although the nominal nits value for HLG diffuse white will vary with the peak brightness of the display, a 1000 nit display will have place diffuse white around 200 nits, similar to PQ based HDR, while a 5000 nits HLG display will have diffuse white around 550 nits, depending system gamma (see later for info on HLG system gamma).

So the reality is that HDR should mainly add to the existing brightness range of SDR displays, so more detail can be seen in the brighter areas of the image, where existing SDR images simply clip, or at least roll-off, the image detail."
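
For the HLG side of that, here's a minimal Python sketch of where nominal diffuse white lands, assuming the BT.2100 HLG inverse OETF constants and the reference system gamma of 1.2 for a 1000 nit display (the exact figure shifts if the display applies a different system gamma):

Code:
import math

# BT.2100 HLG inverse OETF constants
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_inverse_oetf(signal):
    """HLG signal (0..1) -> normalized scene-linear light (0..1)."""
    if signal <= 0.5:
        return (signal ** 2) / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def hlg_display_nits(signal, peak_nits=1000.0, system_gamma=1.2):
    """Apply the HLG OOTF: display light = peak * scene_light ** gamma."""
    return peak_nits * hlg_inverse_oetf(signal) ** system_gamma

# Nominal HLG diffuse white sits around a 75% signal level.
print(round(hlg_display_nits(0.75)))    # -> ~203 nits on a 1000 nit display

That lines up with the "around 200 nits" figure in the quote.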


-----------------------------------------------
https://www.lightillusion.com/uhdtv.html


HDR (High Dynamic Range) via PQ (Perceptual Quantizer) or HLG (Hybrid Log-Gamma) EOTF (Gamma) and WCG (Wide Colour Gamut) imagery

Absolute vs. Relative - PQ vs. HLG
One of the things we have simply come to accept when watching TV at home is that we set the peak brightness of the TV to accommodate the existing viewing environment within the room that houses the TV - most commonly the lounge. This is obviously ignoring the videophiles that have environment controlled man-caves with true home cinema setups, but they are not the norm for home TV viewing.

Whilst we know and understand that the SDR grading display will have been calibrated to 100 nits, we also understand that it will have been housed in a controlled grading environment, with little ambient light. The beauty of SDR's relative approach to gamma is that the TV can simply be made brighter to overcome uncontrollable light contaminated environments, including the use of different gamma values.

One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is 'absolute' there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.


As mentioned above, with PQ based HDR the Average Picture Level (APL) will approximately match that of regular SDR (standard dynamic range) imagery. The result is that in less than ideal viewing environments, where the surrounding room brightness level is relatively high, the bulk of the PQ HDR image will appear very dark, with shadow detail potentially becoming very difficult to see. This is still true with a diffuse white target of 200 nits, rather than the original 100 nits diffuse white.


To be able to view PQ based 'absolute' HDR imagery environmental light levels have to be very carefully controlled. Far more so than for SDR viewing. This really does mean using a true home cinema environment.

Or, the PQ EOTF (gamma) has to be deliberately 'broken' to allow for brighter images - which many home TVs do.

To back this statement up, the average surround illumination level that is specified as being required for PQ based HDR viewing is 5 nits, while for SDR it has always been specified as 10% of the maximum brightness of the display. Unfortunately, the surround illumination specification for SDR has since been (incorrectly) changed to 5 nits as well...

--

PQ - An Absolute Standard

Referring to PQ as an 'absolute' standard means that for each input data level there is an absolute output luminance value, which has to be adhered to. There is no allowance for variation, such as changing the gamma curve (EOTF), or increasing the display's light output, as that is already maxed out.

With HLG based 'relative' HDR this can be less of an issue, as the HDR standard can be scaled in exactly the same way as traditional SDR TVs, and further, includes a system gamma variable based on surround illuminance, specifically aimed at overcoming environmental lighting issues.

But, having said that, HLG based HDR has its own issues if the peak luma of the display is below approx. 1000 nits, as the average picture level of the HDR image will appear dimmer than the equivalent SDR image. This is due to the nominal diffuse white point being lower than the actual peak luma of an SDR TV set to 200 to 250 nits, as is normal for home viewing environments.
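
A toy Python sketch of the absolute-vs-relative distinction the article is describing; the 2.2 gamma for SDR and the hard clip for PQ are simplifications of my own, just to show the shape of the behavior:

Code:
def sdr_output_nits(signal, display_peak, gamma=2.2):
    """SDR is relative: the same signal gets brighter if you turn the TV up."""
    return display_peak * signal ** gamma

def pq_output_nits(coded_nits, display_peak):
    """PQ is absolute: the coded value *is* a luminance; the display either
    reproduces it or runs out of headroom (hard clip here for simplicity)."""
    return min(coded_nits, display_peak)

# The same 75% SDR signal on a TV set to 100 nits vs 400 nits: image brightens.
print(sdr_output_nits(0.75, 100), sdr_output_nits(0.75, 400))   # ~53 vs ~212 nits

# A 300 nit PQ highlight on a 400 nit vs a 1000 nit display: identical.
print(pq_output_nits(300, 400), pq_output_nits(300, 1000))      # 300 vs 300 nits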
 
No HDMI 2.1 might rule that out from the start imo when LG C9's have it for future compatibility with 4k 444 120hz bandwidth. Tradeoffs as always I guess.

If you're deliberately buying a TV that costs twice as much as the C9 purely for the slightly better HDR effect, you have enough money to just buy next year's version and swap it out anyways :p

LG C9 per rtings.com review lists
peak 2% window 855 cd/m², peak 10% window 845 cd/m²

Yeah but rtings is a different site. I don't really like to compare measurements across sites, as methodology and tools vary and brightness numbers are sometimes VERY different.

Of course there is panel variation as well. The same site (flatpanelshd.com) found 725 nits for the C9. I prefer rtings and would have used them if they reviewed both TVs, but they haven't reviewed the Panasonic, likely because they're US-based.

And yes like you said it's a big price difference, but it's much worse than what you said according to https://www.techradar.com/reviews/panasonic-gz2000-4k-oled-tv-review
I live in Canada and was referring to local prices. EU pricing is a whole other ball game for sure.
 
Idk maybe your display or content had screwed up the white point. HDR is generally perceived as too dim on average scene brightness to a lot of people who aren't using a dim to dark home theater environment.


No man, sometimes on the X27 you would get a pure white screen of 1000 nits in your face like Peter North Bustin on Jilly Kelly's tits!
 
If you're deliberately
Idk if you are agreeing with me or not lol. I think you are thinking I'm saying the GZ2000 is worth the difference, and I am not saying that at all. It seems like a good step for the tech going forward, though, if others do it.

C9 = yes
-------------------
I'm saying C9 = yes, even though it isn't based on a 1000 nit peak and so has to roll down, tone map, and use ABL, losing detail in the brightest colors. The value of the C9s is huge at the overstocked prices.

Also C9 = yes for the HDMI 2.1 inputs. I don't think everyone buying a GZ2000 is going to upgrade it in a year like you are proposing, especially since a lot of people who buy them do so for movies and not for gaming, where HDMI 2.1 will make a much bigger difference once there are HDMI 2.1 gaming sources, in or by the end of 2020.

Peaks
-------------
There are several different measurements. There is peak "burst" brightness at 10%, 25%, 50%, and 100% windows, and sustained brightness at the same windows, as well as what ABL drops down to when it kicks in and how aggressively/quickly it does so.
You can tell pretty much where you are at with the C9 compared to the GZ2000 and other 1000 nit+ FALD LCD displays - which is roughly 150 to 300 nits short of a 1000 nit HDR basis at its peaks, and down into the 500s (or lower on large bright areas) when ABL kicks in. Past those peaks, the color brightness and detail in those areas is lost to roll-down, tone mapping, or clipping to white.

Even though the Panasonic GZ2000 can do 1000 nits, it can't do it over much of the screen or for very long. It still uses ABL down to ~630 nits, so it's not a perfect solution for HDR 1000 content's color brightness volume either.
 
No man, sometimes on the X27 you would get a pure white screen of 1000 nits in your face

Yes, I'm saying that games use hybrid log gamma, where you can move the white point and gamma brighter, which can push things out of bounds at one extreme or the other. IDK of much content that would blast max nits fullscreen sustained unless it was a poorly done game with a bad white point/gamma scale.

An 800 nit or higher peak is impossible on OLEDs outside of a peak 10% window of highlights, and even then only before ABL kicks in.

A C9 OLED's HDR can only do (per the rtings numbers above)
Fullscreen peak/burst: 145 nit ... Fullscreen sustained: 138 nit
50% peak/burst: 301 nit ... 50% sustained: 286 nit
25% peak/burst: 530 nit ... 25% sustained: 506 nit
10% peak/burst: 845 nit ... 10% sustained: 802 nit

A Samsung Q90's HDR for comparison since it's LED LCD and very bright
Fullscreen peak/burst: 536 nit ... Fullscreen sustained: 532 nit
50% peak/burst: 816nit ... 50% sustained: 814nit
25% peak/burst: 1275nit ... 25% sustained: 1235nit
10% peak/burst: 1487nit .. 10% sustained: 1410nit



For comparison, typical incandescent bulb output:

200 W ≈ 3000 lumens
100 W ≈ 1600 lumens
75 W ≈ 1100 lumens
40 W ≈ 450 lumens
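
For a rough sense of how those bulb figures relate to display nits: nits are luminance (cd/m²) while lumens are total light output, so you need the screen area and an emission assumption to compare them. A quick sketch assuming a 55" 16:9 panel behaving as an ideal Lambertian emitter (flux ≈ π × luminance × area), which is only an approximation:

Code:
import math

def screen_area_m2(diagonal_inches, aspect=(16, 9)):
    """Viewable area of a flat panel in square meters."""
    d = diagonal_inches * 0.0254
    w, h = aspect
    return (d * w / math.hypot(w, h)) * (d * h / math.hypot(w, h))

def fullscreen_lumens(nits, diagonal_inches):
    """Approximate total luminous flux of a uniformly lit Lambertian panel."""
    return math.pi * nits * screen_area_m2(diagonal_inches)

for nits in (145, 400, 1000):
    print(f'{nits:4d} nits full screen on a 55" panel ~ '
          f'{fullscreen_lumens(nits, 55):.0f} lumens')

So a sustained 400 nit full screen on a 55" panel is roughly a 75 W bulb's worth of light (~1050 lumens), and a hypothetical 1000 nit full screen would be up in 150-200 W bulb territory, which is part of why ABL exists.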


However, think of HDR as normally being sort of like a screen-sized slide of film with varied levels of transparency that the light shines through, where the brightest pass-throughs are mostly highlights of bright colors. On SDR screens and low peak nit HDR screens those colors would be clipped to white, or rolled down from a much lower color volume ceiling.

Then consider that those locations are often moving: the glint of a rifle barrel, the oscillating reflections of the sun on water, moving cars, etc. And if you are in a virtual-reality type of scenario while gaming, you probably shouldn't stare directly into the sun or a nuclear blast for too long, eh?


HDR 10,000 is the benchmark for the most realism. Even HDR 1000 as a content set point is fractional HDR. There are a lot of HDR 1000 UHD discs, a bunch at HDR 4000, and only 1 or 2 afaik that are HDR 10,000 (because no one has the hardware to display them at those levels yet). Some games can also technically do HDR 10,000 when tested with color maps.


https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright

"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.


So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better.
"
 

I think what gets lost in the discussion about HDR displays is all the other advancements in PQ (picture quality) we've seen over the last few years.

It's not a dig at you or anyone else, and I'm not arguing against the information you've provided (which is true); I just find it funny that the discussion always devolves into this nit-picking over peak nits at such-and-such a window (or what is or isn't bright -- largely a subjective thing), when in terms of overall display PQ it contributes very little in my opinion. Personally, I'd rather have a panel that has excellent grey and black uniformity, contrast, color gamut and volume, and motion response.

HDR is good and all, but it's not the end all be all that it's made out to be.
 
It's just like anything else in threads going way back: 16:9 vs 4:3 gaming, higher resolutions like 1080p "HD", 1440p, 4K... 120 Hz, variable refresh rate... and yes, per-pixel contrast with ultra-dark black depths, etc. It'll become somewhat common, then eventually ubiquitous, and at that point most people would never choose to do without it.

My point in the last comment was not really to diss the limitations of the C9 (or the lack of HDR on the Dell Alienware); it was to show that it's impossible to have 1000 nits full screen on current displays, even the well-over-1000-nit-capable Q90s and Q9FNs.... and that most people do prefer a more realistic (higher) range of color volume/color brightness rather than clipping to white at a low SDR ceiling or rolling down from a lower ceiling. The brightest HDR is in the highlights; the average scene brightness generally remains the same. A real HDR experience is often cited as being more impressive and important than 4K vs 1080p even is, especially for movies/media.

There are a lot of dismissive comments about HDR and misunderstandings about how HDR color brightness/color volume and overall scene brightnesses work. It gets repeated a lot.
 

To me HDR is not even about having blindingly bright colors, it's about being able to show more detail. In Uncharted 4 I was amazed that in HDR it was able to show me more gradation in the sand, and in Shadow of the Tomb Raider it showed better skin tones on Lara, things like that. At its best it works really nicely.

I just wish Microsoft implemented a "Play HDR games and apps...but not unless it's fullscreen" toggle so HDR would only activate in games that support it instead of on the desktop. HDR is very seamless on consoles and easy to toggle. I get that MS wanted windowed HDR, but in most cases it just makes SDR content awful to use for what is ultimately a niche feature compared to fullscreen HDR.
 
Yes, if you follow HDTVTest on YouTube, he always shows examples of how details are lost when the lower *color* brightness limits (color volume height limits) are reached: the colors then clip to white (or roll down to the highest color brightness the screen can actually reach, losing gradation), so any detail past that point is lost and left as a uniform area.

People need to realize that having colors at full brightness in highlights, details, and a few colored light sources, reflections, etc., while the rest of the scene generally remains in SDR ranges, is not the same as taking the relative brightness "knob" of an SDR screen and turning the whole screen up to 11.
 
@l88bastard Did Dell put some crap Matte coating over the screen or is it a semi-gloss like the LG OLEDs and most TV's out there? It's hard to tell from the pics and not sure if the final production models made the swap or not. Per their website:

Screen Coating
Anti-reflective, 2H Hard Coating
 

It's flossy glossy just like the LG OLEDs. I have a C9 and it's identical. It's a fabulous display, just don't let Dell rape you in the ass!
 
Congrats! It was getting lonely on this island :)

It has very good motion clarity; the only displays I have seen with better clarity are the new 240 Hz Lenovo Legion 1440p and that one curved Asus TN 165 Hz 1440p.

This video gives a good example of its blur clarity


Either I'm having a false memory, or CRT is still better in this department. But still, we're getting closer. All we need is rolling scan and I think we'll be set.
 
CRT is still the motion king, but the 120 Hz OLEDs are far nicer than LCDs in that department. I have an FW900 and the AW55 dismantles it on everything else and has the throne of most holy kingdom godlike grail display blessed beauty.....just don't let Dell price rape you in the ass.
 
I'm pretty sure a rolling scan will fix this. That or have an image algorithm that mimics the phosphor decay of a CRT and call it a day. Then we'd be good.
 
The AW55 is nearly damn perfect compared to "everything else" out there. However, I think we are 3 years away from the most blessed beauty holy grail of the hubba hubba order! 32" 4K 240 Hz VRR + BFI/rolling scan would be so good everybody from nuns to disco kings would rejoice!
 
55" is way to big for a desktop display.
I tried 50" samsung last year and couldn't handle it.
Everything above 40" is overkill, imo.
 
55" is way to big for a desktop display.
I tried 50" samsung last year and couldn't handle it.

Guess that means nobody else can? I'll agree that it's large, but it's all about viewing distance and personal preference. Stating these things like they're facts is crazy.

It's like me saying "600 horsepower is way too much for a street car." For many, it is.

The way I use mine is I don't run apps full screen (a 55" Excel spreadsheet or browser window is nuts to me). I resize my apps to roughly what a 40" monitor would be. Because the OLED has a completely black background outside of that window, it looks quite nice.

Then for gaming, full screen obviously. It's immersive as heck, but I do want to make the jump to the 48" OLED if and when it emerges. I used a 48" Samsung JS9000 very comfortably before this. It wasn't too big.


Right.
 
55" is way to big for a desktop display.
I tried 50" samsung last year and couldn't handle it.
Everything above 40" is overkill, imo.

If you have the space, you can make a separate command-center desk island and put a TV on a freestanding pillar stand or on a wall. I wouldn't sit closer than 4 feet to a 55 inch at the closest, perhaps a little farther for comfort, especially if using another screen along with it. Maybe with 21:9 or 21:10 resolutions letterboxed, using the extra side game-world real estate peripherally for immersion in things like driving games, I could sit nearer.

Currently I sit 3 feet or so from a 43 inch, but I have more than one monitor so I could probably sit a bit farther back. The perceived pixel density is a function of viewing distance, of course, so as you drop back, the pixels shrink from your perspective too. If I were much farther than 3 feet I'd probably have to start scaling past 100% on a 43 inch 4K.
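
A quick way to put a number on "perceived pixel density changes with distance" is pixels per degree of visual angle. A small Python sketch with the 43" 4K case above as the example (where the comfortable threshold sits is obviously personal):

Code:
import math

def pixels_per_degree(h_res, diagonal_inches, distance_m, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle at a given viewing distance."""
    d = diagonal_inches * 0.0254
    w, h = aspect
    width = d * w / math.hypot(w, h)                # physical screen width, meters
    hfov = 2 * math.degrees(math.atan(width / (2 * distance_m)))
    return h_res / hfov

for feet in (3, 5):
    ppd = pixels_per_degree(3840, 43, feet * 0.3048)
    print(f'43" 4K at {feet} ft: ~{ppd:.0f} pixels per degree')

At 3 ft that works out to roughly 70 pixels per degree and at 5 ft over 100, which is why 100% scaling starts to feel too small as you move back.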
 
Is this monitor recommended for gaming?
Image burn-in .... that's why I ask.
Thank you
 

It's not recommended. If you need to ask basic questions about it, it's not for you honestly. Too many caveats and conditions for the price.
 
Is this monitor recommended for gaming?
Image burn-in .... that's why I ask.
Thank you

https://www.cnet.com/news/oled-screen-burn-in-what-you-need-to-know/ (2018)
"What's colloquially called "burn-in" is actually, with OLED, uneven aging. They don't "burn in" as much as they "burn down." ... OLED pixels very, very slowly get dimmer as they're used. In most cases this isn't an issue since you're watching varied content and all the pixels, on average, get used the same amount."

Always a chance with OLED, but realistically probably not much danger on this monitor, since it has a 400 nit SDR brightness limit, unless you keep static UI elements on the screen all the time and don't vary the content. They can also use pixel shifting and an idle screensaver built into the screen, I think. They (the LG C9 TVs at least) also run a maintenance program when turned "off" in standby mode which attempts to intelligently even out the wear on the OLEDs that had miles put on them that day.

Personally I would use an OLED screen with black wallpaper and no icons as a "media stage" for games, movies and streams, picture slideshows etc. and use a different monitor for desktop/apps and the taskbar.

-----------------------------------

The brighter the screen, the hotter the OLEDs and the greater the risk of burn-in, so brighter HDR OLED TVs use ABL (automatic brightness limiter) tech, which kicks in when higher-brightness content is detected in HDR scenes and snaps the screen down to 500 or 600 nits from ~700-800 nit spikes on bright color gradations in HDR content/highlights. The Alienware gaming OLED uses a 400 nit maximum brightness.


LG OLEDs also use an all-white OLED layer with a color filter above it. This avoids uneven color wear (but not uneven screen-location wear). They use a WRGB subpixel array with one clear spot in the color filter that lets the white straight through. On brighter HDR-capable screens, this means the color accuracy is off and a little "white washed" at the higher color brightness HDR highlights.

Brightness tests are usually measured using a bright patch covering a percentage of the screen area, which they call a % window. So if there is a 780 nit 10% window, it means only 10% of the screen is lit at 780 nits. A 530 nit 50% window means 50% of the screen was bright, and obviously a 100% window is the whole screen. However, it's also measured in peak and sustained brightness, because a screen can peak or burst at a higher brightness than it can sustain. This comes into play a lot on HDR screens, since HDR content is based on HDR 1000 nit (or HDR 4000, or HDR 10,000 on future capable hardware). Anything under HDR 1000 has to use a hybrid gamma and more questionable tone mapping to try to make up for the more limited sub-1000 nit scale.
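
To make those window sizes concrete, here is a small sketch of what such a test patch works out to on a 4K panel; the centered-square interpretation is just the common one, not any particular reviewer's exact methodology:

Code:
import math

def window_patch(window_percent, width=3840, height=2160):
    """Side length (px) of a centered square patch covering window_percent
    of a width x height screen."""
    return int(round(math.sqrt(width * height * window_percent / 100.0)))

# A 100% window is simply the whole screen lit.
for pct in (2, 10, 25, 50):
    side = window_patch(pct)
    print(f"{pct:3d}% window on 4K: ~{side} x {side} px patch "
          f"({side * side / (3840 * 2160):.0%} of the screen area)")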

The expensive Panasonic GZ2000 OLED TVs use a metal sheet behind the OLED array which acts like a heat sink. This allows the screen to hit 1000 nit peaks on smaller percentages of the screen for highlights, and it reportedly makes IR (image retention) happen less often, if at all, and more briefly when it does. Reviewers are guessing it would also reduce the chance of burn-in, but that would take a long time to test and would still probably be an unreliable test scenario anyway. Hopefully in the following years more OLEDs will use cooling technologies like this.

----
 
After some delay and playing on a C9, finally picked one up.

Will be running 3440x1600 120 Hz via DisplayPort with Nvidia scaling feeding the monitor 4K. Gonna check if monitor scaling works at lower-than-4K resolutions to try and feed it sub-4K rezzes with 10-bit 4:4:4.

Might run 3440x1440, gonna try both and see what I like best.

Finally have a no-compromise endgame monitor: OLED, ultrawide-capable, 120 Hz, VRR, variable resolution with driver scaling to map pixels 1:1 up to 4K. Flawless Victory!

Blown away this can all be achieved TODAY with this monitor, at a price lower than ridiculous local-dimming IPS-glow displays.

Spare me the burn-in remarks, I have over 4 years' experience with other OLED TVs and no burn-in with all kindsa content from letterboxed to gaming long hours with HUDs. My C6 still looks as bright and beautiful as the day I got it (it was also more expensive than this monitor).

To use the full width at a 21:9 aspect ratio, gonna check out 3840x1648. May be the way to go, to only letterbox the top and bottom and not the sides.
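
For anyone wanting to sanity-check those custom resolutions, the arithmetic is just "keep the full 3840 width and shrink the height to the target aspect ratio"; rounding to a multiple of 8 lines below is my own assumption to keep the numbers tidy, not an Nvidia requirement:

Code:
def letterboxed_height(width, aspect_w, aspect_h, multiple=8):
    """Height that gives aspect_w:aspect_h at the full panel width,
    rounded to the nearest multiple of 'multiple' lines."""
    exact = width * aspect_h / aspect_w
    return int(round(exact / multiple)) * multiple

for name, (aw, ah) in {"21:9": (21, 9), "21:10": (21, 10), "2.39:1": (239, 100)}.items():
    print(f"{name} at the full 3840 width -> 3840 x {letterboxed_height(3840, aw, ah)}")

The 21:9 case comes out to 1648 lines, which matches the 3840x1648 mentioned above.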
 
Custom resolutions work pretty well with Nvidia (less so with AMD, but I did eventually get something working).

I used to game on a 40" 4K with custom 21:9. It's really a win/win: better performance and a more immersive experience.

These days I use a native 21:9 monitor, but the virtual 21:9 did me good for a while.
 