Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

I'm on here every day, haven't heard anything about an HDMI 2.1 firmware upgrade. News to me at least.

Well technically, it is NOT an hdmi 2.1 firmware upgrade. All the driver update is going to do is enable VRR over hdmi on the RTX cards and make the LG C9 sets "gsync compatible", but it won't turn those hdmi 2.0 ports into hdmi 2.1 ports.
 
Must have missed the bolded text, so let me help you out with what was emphasized: Nvidia's imminent firmware upgrade to 2.1 for its RTX graphics cards.

LMAO and you actually believe we will get REAL HDMI 2.1 through a firmware upgrade? Oooooooooooookay then.
 
LMAO and you actually believe we will get REAL HDMI 2.1 through a firmware upgrade? Oooooooooooookay then.

Quoting my first post because reading is hard before replying:

"If the firmware upgrade thing is true, or even possible, then LG is the way to go."
 
There is no IF dude. I didn't miss your nicely bolded text, I just ignored it because I figured it was pretty obvious to anyone that you cannot just magically turn hdmi 2.0b ports into 2.1 through software updates, but apparently using logic before replying is hard.
 
There is no IF dude. I didn't miss your nicely bolded text, I just ignored it because I figured it was pretty obvious to anyone that you cannot just magically turn hdmi 2.0b ports into 2.1 through software updates, but apparently using logic before replying is hard.

Ignore a quote and the poster's comments in that post just to reply to it with something completely irrelevant to that post. meh
 
Ignore a quote and the poster's comments in a post just to reply to it with something completely irrelevant to that post. meh

Dang it's that important that I must quote reply to you? Anyways I'm not gonna drag this further on and derail the thread man. Good things are coming to the display market and that's all we need to know.
 
I just skimmed through the cnet article, and the author clearly has no idea what they're talking about. LOL.
 
https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

meh review, but it's somethin', not sure if it has HDR...

Has some questionable statements:

https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

"If you're familiar with Dell monitors, you'll recognize the Smart HDR modes -- in this case, game, movie, desktop and reference -- which report to Windows whether it can toggle its HDR settings. If you've chosen the game setting, Windows can enable it for games or wide-color gamut. If you want it for movies, you'll have to enable that setting instead."

"In fact, G-Sync support is slated to come to LG's TVs soon, thanks to the company's rollout of HDMI 2.1 in its 2019 TVs and Nvidia's imminent firmware upgrade to 2.1 for its RTX graphics cards."



Make this a 30-40" monitor, add G-Sync Ultimate and I'll pay the early adopter tax.
If the firmware upgrade thing is true, or even possible, then LG is the way to go.

A bit late to the party. The gsync driver update was confirmed a while back already. RTX cards will be limited to 60Hz VRR on the LG 2019 sets, but the main point is that nvidia has confirmed HDMI VRR support, so once they launch cards with HDMI 2.1 we will have full 4k120Hz VRR over hdmi.

Dang it's that important that I must quote reply to you? Anyways I'm not gonna drag this further on and derail the thread man. Good things are coming to the display market and that's all we need to know.

Apparently so.

Anyway, need a real review for this thing with metrics.
 
I'm sensing confusion about two things:
  • Turing GPUs getting HDMI VRR that will work with LG monitors
  • LG monitors having HDMI 2.1 such that they can accept a 4k120 input
These are two different things. I'm only confused as to why Turing isn't getting HDMI VRR enabled for everything.
 
I'm only confused as to why Turing isn't getting HDMI VRR enabled for everything.
From my understanding, FreeSync over DisplayPort is using the DP VRR standard, which is what Nvidia now supports.

FreeSync over HDMI is actually a custom AMD thing, not the HDMI standard VRR, which is why Nvidia does not support it.

I believe LG is using the actual HDMI 2.1 standard VRR, which Nvidia seems to have found a way to support on Turing (though obviously within the limits of HDMI 2.0 bandwidth and thus not 4K120).
 
I'm sensing confusion about two things:
  • Turing GPUs getting HDMI VRR that will work with LG monitors
  • LG monitors having HDMI 2.1 such that they can accept a 4k120 input
These are two different things. I'm only confused as to why Turing isn't getting HDMI VRR enabled for everything.

It should work on any HDMI VRR capable display. The only difference is that the LG TVs will have the "Gsync Compatible" label on them while anything else wouldn't, but it will not stop you from using HDMI VRR on it anyways. Sorta like how my Omen X27 isn't "Gsync Compatible" but I can still use Gsync on it regardless.
 
I'm sensing confusion about two things:
  • Turing GPUs getting HDMI VRR that will work with LG monitors
  • LG monitors having HDMI 2.1 such that they can accept a 4k120 input
These are two different things. I'm only confused as to why Turing isn't getting HDMI VRR enabled for everything.

It most likely has to do with the fact that HDMI VRR is only available on HDMI 2.1 displays at the time of writing. The number of models with HDMI 2.1 is very limited in the first place. While technically it would be possible to make it work on HDMI 2.0, it would also require TV manufacturers to add support for it, which they won't because they want you to buy their 2020 model instead.

As said, Freesync over HDMI has been a proprietary AMD solution, and Nvidia is unlikely to want to reverse engineer that; instead they will add support for the standardized HDMI 2.1 VRR, which will be available in at least most higher-end TVs sold in 2020.
 
The samsung Q series, Q9fn and the c9 have supported freesync/VRR at 1080p and 1440p on amd gpus and xbox one for a while now, since somewhere around may 2018 - so there's no reason it can't be done at those resolutions on hdmi 2.0b. It's really not worth it to me right now though: 4k at 60hz, even with VRR, has only a tiny range of hz variance, plus the full smearing sample-and-hold blur across the viewport at 60fps, on displays without displayport.


The next gen of consoles will probably be hdmi 2.1 since they are touting 120hz gaming capability at their quasi-4k resolutions (most likely a performance vs static eyecandy choice in supported games' settings). That is, unless they are running non-native resolution more of the time to hit 120hz with more aggressive checkerboarding and dynamic-rez 1440p etc., but I'm assuming they will have hdmi 2.1, and they are rumored to have gpu power similar to a 2080. Some year nvidia should finally release hdmi 2.1 gpus, hopefully with a die shrink. Until then (nvidia hdmi 2.1), I'd consider a hdmi 2.1 OLED TV for the living room for media playback and ps4 60hz (and later ps5 120hz) gaming, but not for my gaming pc's gaming monitor.

There are obviously no hdmi 2.1 output nvidia gpus, and likely won't be for a while yet. So that's the main price gouge for now on a bunch of displays like the alienware 55", the FALD BFGs, the 43" 120hz 4k's and several other monitors - they have a displayport connection so they can squeeze watered-down 4k 120hz through it: either dropping to 8 bit color (HDR?), dropping chroma (which can affect text and fine details), or, on the 2000 gpu series with a compatible display, using DSC, which is Display Stream Compression at roughly 3:1 (4k clarity?). In other words, one of what are essentially various forms of compression - or capping it at 98Hz. None of those are optimal, but they are what is required to fit high-hz 4k down too narrow a pipe.

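To put rough numbers on how narrow that pipe is, here's a back-of-the-envelope sketch in python (raw pixel rates only; real signals add blanking overhead and the effective link rates are approximate, so treat it as ballpark math):

```python
# Can a given link carry uncompressed 4k120? Pixel data only --
# blanking intervals are ignored, so real requirements run a bit higher.

def required_gbps(width, height, hz, bits_per_channel, channels=3):
    """Raw pixel-data rate in Gbit/s for an uncompressed RGB / 4:4:4 signal."""
    return width * height * hz * bits_per_channel * channels / 1e9

# Approximate effective (post-encoding) data rates:
links = {
    "hdmi 2.0 (18 Gbps raw, 8b/10b)":      14.4,
    "dp 1.4 HBR3 (32.4 Gbps raw, 8b/10b)": 25.92,
    "hdmi 2.1 (48 Gbps raw, 16b/18b)":     42.67,
}

for bpc in (8, 10):
    need = required_gbps(3840, 2160, 120, bpc)
    print(f"4k120 @ {bpc}-bit 4:4:4 needs ~{need:.1f} Gbps")
    for name, cap in links.items():
        print(f"  {name}: {'fits' if need <= cap else 'does NOT fit'}")

# The 98Hz trick: 4k 10-bit 4:4:4 at 98hz is ~24.4 Gbps, which just
# squeezes under dp 1.4's ~25.9 Gbps without dropping bit depth or chroma.
print(f"4k98 @ 10-bit 4:4:4 needs ~{required_gbps(3840, 2160, 98, 10):.1f} Gbps")
```

So 8-bit 4:4:4 4k120 (~23.9 Gbps) just fits dp 1.4, 10-bit (~29.9 Gbps) does not, and that gap is the whole reason for the 8-bit / reduced-chroma / DSC / 98Hz juggling act.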
Btw - products can claim support for individual hdmi 2.1 features without being true/full-featured hdmi 2.1, which muddies the waters. Similarly, a lot of displays slap an HDR graphic on their products when they really aren't capable of true HDR.
 

But here is a breakdown from the vid...

Pros:
- no burn-in concerns or limitations
- 1000nit color brightness in HDR movies and games looked amazing to him
- no glow around letterboxing, even in a completely dark room at night
- no glow around small white game bitmap objects floating around (snowflakes on a black background in an indie game tested around 16:26)
- lower input lag than TVs
- no pixel response time/overdrive issues
- gsync: has g-sync, and g-sync still has some advantages over freesync
- high refresh rate


Cons:
- some vertical banding/lines, possibly from the local dimming zone grid, when viewed up very close in certain games (he had his nose touching the screen)
- glow around the mute icon during HDR movie playback, which means there is likely glow around closed captions, at least in HDR. That's what you'd expect on a FALD, but:
.... odd, since the small bitmap snowflakes on a black background he showed didn't have glow haloing around them in games (though that was SDR)
.... no mention of glow haloing in shadow of the tomb raider in HDR, or in any of the other bitmap games or HDR movies he ran

Misc:
- 4k 120hz displayport limitations (chroma subsampling, 8bit, or 98Hz)
- size/viewing distance outside of living room setups
- living room setups requiring a couchmaster or other lapdesk-type setup, preferably with long wired peripherals to avoid input lag
- requires a powerful gaming pc and gpu for 4k in the living room
- doesn't like the stand aesthetic compared to his other tv
- whining that his logitech remote doesn't already have support for the BFG
 

But here is a breakdown from the vid...

Pros:
- no burn-in concerns or limitations
- 1000nit color brightness in HDR movies and games looked amazing to him
- no glow around letterboxing, even in a completely dark room at night
- no glow around small white game bitmap objects floating around (snowflakes on a black background in an indie game tested around 16:26)
- lower input lag than TVs
- no pixel response time/overdrive issues
- gsync: has g-sync, and g-sync still has some advantages over freesync
- high refresh rate


Cons:
- some vertical banding/lines, possibly from the local dimming zone grid, when viewed up very close in certain games (he had his nose touching the screen)
- glow around the mute icon during HDR movie playback, which means there is likely glow around closed captions, at least in HDR. That's what you'd expect on a FALD, but:
.... odd, since the small bitmap snowflakes on a black background he showed didn't have glow haloing around them in games (though that was SDR)
.... no mention of glow haloing in shadow of the tomb raider in HDR, or in any of the other bitmap games or HDR movies he ran

Misc:
- 4k 120hz displayport limitations (chroma subsampling, 8bit, or 98Hz)
- size/viewing distance outside of living room setups
- living room setups requiring a couchmaster or other lapdesk-type setup, preferably with long wired peripherals to avoid input lag
- requires a powerful gaming pc and gpu for 4k in the living room
- doesn't like the stand aesthetic compared to his other tv
- whining that his logitech remote doesn't already have support for the BFG
Thanks for the incredibly detailed summation. I just wanted a few words since I won’t be able to watch until tonight, but it’s appreciated regardless.

I have to say that I disagree with a lot of his pros. Only a few of them seem like things I’d care to want improved. I guess the big thing with me is that I’m going to be using the screen for desktop use the majority of the time. That DOES include games and movies, but always in SDR. The best SDR experience hands down comes from OLED.
 

- no pixel response time/overdrive issues

I find that hard to believe without any actual measurements/pursuit camera footage. I wouldn't trust Linus's eyes on something like that. Yes I'm well aware of the fact that the monitor has REAL gsync so that helps a lot in terms of overdrive implementation, but without actual numbers and footage there's no reason to believe him.
 
Ah right -- it goes back to AMD halfassing it again, got it.
No, not exactly. HDMI 2.1 VRR didn't exist when FreeSync came out (and barely even exists today) so AMD came up with their own proprietary solution, just like GSync is proprietary to Nvidia.
 
I find that hard to believe without any actual measurements/pursuit camera footage. I wouldn't trust Linus's eyes on something like that. Yes I'm well aware of the fact that the monitor has REAL gsync so that helps a lot in terms of overdrive implementation, but without actual numbers and footage there's no reason to believe him.

Of course testing would be best, but there was nothing overt that he saw to call out, even playing pixel bitmap games like Super Meat Boy, which is very fast moving over different solid backgrounds, as well as some other bitmap quest RPG.

Thanks for the incredibly detailed summation. I just wanted a few words since I won’t be able to watch until tonight, but it’s appreciated regardless.

I have to say that I disagree with a lot of his pros. Only a few of them seem like things I’d care to want improved. I guess the big thing with me is that I’m going to be using the screen for desktop use the majority of the time. That DOES include games and movies, but always in SDR. The best SDR experience hands down comes from OLED.

I wrote it out for myself as well as the community and was planning on posting it into the BFG thread when I finished.

Personally, 65" is a downgrade for my living room since I've gotten used to a 70" 4k Vizio FALD VA for the last 2.5 years or so. If anything I'd consider a 77" C9 OLED when they get closer to $3000 - $3200 (they've already been $3800 from a reputable vendor on ebay I guess, so that is promising). The Q9's from samsung are out of the running for not having 48gbps HDMI 2.1 ports.

For my pc desk, even with the combo desktop island I have going, 65" is way too big for the small spare room I have my pc in right now, so that is out. Even a 55" OLED would probably not work out, and without hdmi 2.1 gpus there's no real reason for me to do that until nvidia has hdmi 2.1 on a die shrink in the top-level Ti tier. This Dell OLED is 400 nit, non-HDR, and doesn't even have HDMI 2.1 for futureproofing (to get 4:4:4 uncompressed 4k 120hz) while being well overpriced, so that is out of the running too.
 
No, not exactly. HDMI 2.1 VRR didn't exist when FreeSync came out (and barely even exists today) so AMD came up with their own proprietary solution, just like GSync is proprietary to Nvidia.

On DisplayPort, they used the standard 'better'; both FreeSync and G-Sync use the available modes, but FreeSync became the standard in large part because it is "open". That's also why it was a shitshow in terms of implementation -- you'd still prefer G-Sync where possible.

On HDMI, AMD made up their own standard that is not the standard that's getting used by the industry, thus another shitshow.

Go AMD...
 
This Dell OLED is 400 nit, non-HDR, and doesn't even have HDMI 2.1 for futureproofing (to get 4:4:4 uncompressed 4k 120hz) while being well overpriced, so that is out of the running too.
Would it matter whether or not it had an HDMI 2.1 port? After all, the panel itself is limited to 60Hz regardless. This is the monitor I'm looking at right now, because despite being overpriced (it really is, when there are $1.5K 55" versions of it with high refresh and G-sync), it's the only monitor that can actually fit on my desk that seems worth buying right now. Everything else is a gross compromise. If I don't end up getting it, picking up a $500 4K TV for my desktop use would be the only other justifiable purchase. It's a time when I really want to upgrade, because my current monitor is a 2009 Apple Thunderbolt Display, but I can't seem to find anything worth upgrading to. I suppose there's also Apple's own successor, their upcoming 6K Pro Display XDR, which comes as close to OLED quality as we're ever going to get with non-microLED, but that's not really for the market I inhabit. That's why its price is so high.

I wish LG would just stop idling and release a 43" version of their OLED TVs. I wish ANYONE would release a 43" OLED. Even if it's crazy in price. If this Dell 55" was 43" instead, I'd have swallowed that $4K price tag in a heartbeat.

Based on the parts of your comment I didn't quote, it looks like you're also unsure what to get going forward. Anything looking decent to you at the moment?
 
Would it matter whether or not it had an HDMI 2.1 port? After all, the panel itself is limited to 60Hz regardless. This is the monitor I'm looking at right now, because despite being overpriced (it really is, when there are $1.5K 55" versions of it with high refresh and G-sync), it's the only monitor that can actually fit on my desk that seems worth buying right now. Everything else is a gross compromise. If I don't end up getting it, picking up a $500 4K TV for my desktop use would be the only other justifiable purchase. It's a time when I really want to upgrade, because my current monitor is a 2009 Apple Thunderbolt Display, but I can't seem to find anything worth upgrading to. I suppose there's also Apple's own successor, their upcoming 6K Pro Display XDR, which comes as close to OLED quality as we're ever going to get with non-microLED, but that's not really for the market I inhabit. That's why its price is so high.

I wish LG would just stop idling and release a 43" version of their OLED TVs. I wish ANYONE would release a 43" OLED. Even if it's crazy in price. If this Dell 55" was 43" instead, I'd have swallowed that $4K price tag in a heartbeat.

Based on the parts of your comment I didn't quote, it looks like you're also unsure what to get going forward. Anything looking decent to you at the moment?


There's no market for OLEDs that small, that's the problem. And look how cheap 55" are getting... LG's 48" next year will be even more affordable. Great news, but a 43" probably wouldn't be much cheaper to produce, therefore they couldn't sell it for much less... but for argument's sake let's say they could, it would then undercut LCD prices, and they'd never do that.

You're in an UNIMAGINABLY small minority of people who would pay $4K for a 43" OLED lol! Any manufacturer releasing such a product at that price would be a laughing stock. Heck, Alienware are being mocked as harshly as it gets for this 55", and I doubt they will sell many at all.

This doesn't even begin to address the burn-in issue though, which absolutely becomes a problem when you get down to the size that people would be using as a dedicated PC monitor for daily (continuous) use. A 55" is not that... it will be a lounge gaming set-up, far more casual use. This is the domain for OLED and where it will remain for the foreseeable future... and it's great for that. For someone using their PC 12+ hours a day, web browsing, Photoshop, video editing... no way will OLED ever make a dent here. Will they develop a smaller OLED monitor for gamers, with a whopping great warning label on the box warning against excessive use, due to burn-in risk? I doubt it, but only time will tell.

I think the future for more practical (32"-43") Mini LED monitors looks far more promising than OLED, for which I see absolutely nothing on the horizon.
 
There's no market for OLEDs that small, that's the problem. And look how cheap 55" are getting... LG's 48" next year will be even more affordable. Great news, but a 43" probably wouldn't be much cheaper to produce, therefore they couldn't sell it for much less... but for argument's sake let's say they could, it would then undercut LCD prices, and they'd never do that.
No. This doesn't HAVE to be true. Look at Samsung's Serif and Frame TVs. Their 40-43" models are over $1000, which is considered ridiculous for a TV of that size. But Samsung still put them out, and people still buy them, because their physical design (and to a lesser extent, image quality) is beyond anything else in the category. But, for only around a hundred more, you can get the 48" Frame. And I think for only a few hundred more than that, around $1500, you can get the 55" version. Do you see what I'm getting at here? The difference in price is minimal. I can EASILY see the 48" LG OLED being only a hundred or so cheaper than the 55", and I doubt it would even remotely affect their sales. And a potential 43" model would be only a hundred or so less than that, still over a thousand. If Samsung is smart enough to figure out that just because people want a small TV, it doesn't necessarily mean they want a shit TV, then LG should be able to understand it too. A 43" OLED would not be much cheaper to make, but they wouldn't have to sell it much cheaper, and it would STILL not undercut any 43" TV. It would only match the price of the 43" monitors and the Samsung Frame. There is no reason this isn't feasible.
You're in an UNIMAGINABLY small minority of people who would pay $4K for a 43" OLED lol! Any manufacturer releasing such a product at that price would be a laughing stock. Heck, Alienware are being mocked as harshly as it gets for this 55", and I doubt they will sell many at all.
This was hyperbole to show how bad I want one.
This doesn't even begin to address the burn-in issue though, which absolutely becomes a problem when you get down to the size that people would be using as a dedicated PC monitor for daily (continuous) use. A 55" is not that... it will be a lounge gaming set-up, far more casual use. This is the domain for OLED and where it will remain for the foreseeable future... and it's great for that. For someone using their PC 12+ hours a day, web browsing, Photoshop, video editing... no way will OLED ever make a dent here. Will they develop a smaller OLED monitor for gamers, with a whopping great warning label on the box warning against excessive use, due to burn-in risk? I doubt it, but only time will tell.
Do you think I haven't considered this? I'll deal with it. I swipe between different virtual desktops every few minutes. I'll set the computer to launch a screensaver after 30 seconds. It's worth it for me, and for many others too.
I think the future for more practical (32"-43") Mini LED monitors looks far more promising than OLED, for which I see absolutely nothing on the horizon.
That's the whole problem.
 
Do you think I haven't considered this? I'll deal with it. I swipe between different virtual desktops every few minutes. I'll set the computer to launch a screensaver after 30 seconds. It's worth it for me, and for many others too.


Yes, but again, you're in a very small minority here. Most consumers won't accept this. And the problem isn't desktop wallpapers... no one sits staring at their desktop all day lol! It's programs... Photoshop, video editing, MS Word, web browsing... people who work in front of their screen continuously for 12-16+ hours a day, and also game... that's me, and that's A LOT of PC users. OLED would be a disaster in those circumstances, and manufacturers know it.

Case in point, EIZO have just announced the Foris Nova, a 21.6" 4K OLED, 60Hz, non-HDR monitor... limited to 500 units worldwide. No price yet, but obviously it's going to be extortionate. This just demonstrates how far we have to go. Furthermore, Eizo have issued a warning on their product specs stating "If the monitor is left on continuously over a long period of time, dark smudges or burn-in may appear. To maximize the life of the monitor, we recommend the monitor be turned off periodically." People turning their PC monitors off frequently simply isn't the future, and it's the predominant reason why OLED will never get a foothold in the monitor market... manufacturers won't even try. Obviously, I can't know for sure that no one will ever make a limited edition 32" 144Hz OLED gaming monitor, and charge the GDP of a small country for it, but if they do it will be many years away yet.

The ONLY hope is if LG's 48" OLED next year sells exceptionally well, and they then decide to put out a 40-43" model... that would then offer something more practical for PC users. With HDMI 2.1 the standard by then, people should be able to enjoy high refresh OLED for PC use without incurring insane costs, therefore mitigating fear of burn-in somewhat.
 
And the problem isn't desktop wallpapers... no one sits staring at their desktop all day lol! It's programs... Photoshop, video editing, MS Word, web browsing... people who work in front of their screen continuously for 12-16+ hours a day, and also game... that's me, and that's A LOT of PC users. OLED would be a disaster in those circumstances, and manufacturers know it.
I think you misunderstood what I meant. When I mentioned switching between multiple virtual desktops, I wasn’t talking about switching wallpapers or anything like that. See, I don’t work by having my applications open in multiple floating windows like most other users. I maximize one application (say, an IDE), and then use virtual desktops to gesture back and forth between different applications. This is a central feature on macOS, and I’m pretty sure it’s on Windows now too. Each of these applications is maximized in its own desktop. My point is, I don’t stare at one static screen; I’m interacting with the contents of each application. Because I work this way, the screen is changing every 2-8 minutes while I’m at it. I’m not disputing that I’m in a niche minority, but I think you just misunderstood what I meant by that snippet.
The ONLY hope is if LG's 48" OLED next year sells exceptionally well, and they then decide to put out a 40-43" model... that would then offer something more practical for PC users. With HDMI 2.1 the standard by then, people should be able to enjoy high refresh OLED for PC use without incurring insane costs, therefore mitigating fear of burn-in somewhat.
I really hope this ends up being the case. 48” is a big step forward. It’s exponentially easier to use as a monitor as compared to a 55”, but still way too big for most people (including me). A 43” isn’t too nuts to hope for, but it’s just that: a hope.

I have no idea what I can buy today that can come even close to OLED’s overall image quality, especially if I want to spend $2K or below.
 
I'm probably a minority too in how I use my OLED but what I do is I simply use it exclusively for games and movies, never any desktop work as 55" is just damn impossible to work with. It's kept on a separate desk near my main setup and I have wireless peripherals + xbox s controller that I can easily move around whenever it's time to fire up a game on it. Been doing this for about 2 years now and zero signs of burn in. If the biggest burn in concern is desktop usage, well then avoiding that altogether just might be the best solution?
 
I think JOLED will have their 32" 4K OLED panel out before LG comes out with something smaller than 48".
 
All joking aside, does anyone plan on getting this thing? Wouldn't mind some user impressions besides LTT...

I was tempted since I wanted to continue my Alienware-themed setup (to match my son's setup), but then my senses came back to me and I pulled the trigger to go back to a custom-build setup. I already got a mountain mods case on the way and am just waiting another week or so to see if I want to go with a 9900 KFC edition setup or a Core i9-10900XE setup to go with my LG OLED 55" C9 & 2x 2080 Ti's (as I only game 100%, I'm leaning more towards the 5.0+ghz 9900KS + faster memory vs a slower 10-core, even though I can get full 16x/16x GPU bandwidth).
 
Well, I started this thread so I guess I'll chime in my 2 cents.

The big bitch arrived today, next to smallfry.
AW55-Incognito.jpg


Cleaned off all the dried semen on my desk and set it up.
big-bitch.jpg

Upon close inspection of sensitive materials I noted a screen defect of about 15 pixels on the right buttock!
AW55-Blemish-pic-2.jpg


Which upset RastaRaffee.com
AW55-Rasta-Raffee.jpg


I played around with it for a while:
- BF4 is pretty amazing on it
- PUBG was very intense
- BFV failed to launch and still won't launch even after the latest massive update

I have used a C6, C7 & C9 55" in the past, so this was very much like those experiences except with the 120hz, no screen tearing, and limited brightness. It's an impressive display, very impressive indeed.

Had Alienware / Dell gone with their 2017 30" OLED panel mated with these electronics, they would have had a real standout winner IMHO. Unfortunately, it's too comparable to the C9's, which are much cheaper and will have superior features once GPUs with HDMI 2.1, or DP1.4-to-HDMI 2.1 adapters from Realtek, come out.

For today, right now the AW55 is awesome and amazing, but tomorrow when the C9's offer 4k120 + 10bit color + HDR, this AW55 will be a relic.

Also, resale value is a major item for me. Part of the fun of the hobby is selling the old to get the new and I think the resale value of the AW55 is going to be horrendous once the c9's & c10s get fully supported.

Dell CSR support for the return ticket has been good and the full return experience remains to be seen.
 
Thanks for the quick write up. Interesting with that cluster of pixel defects. Never seen that before so unsure what it could be. I’m not sure if OLEDs are prone to dead/stuck pixels like LCD’s but clearly it’s something else.
 
Thanks for the quick write up. Interesting with that cluster of pixel defects. Never seen that before so unsure what it could be. I’m not sure if OLEDs are prone to dead/stuck pixels like LCD’s but clearly it’s something else.

To my eye, it looks like a tiny nick in the screen. They package these far better than the C9's and even add an additional layer of cardboard sheath over the screen. Looks to me like something QC overlooked because 15 pixels is relatively smallish vs the ocean size of the screen.
 


Interesting OLED gaming display impressions video. His distance evaluation is pretty much what I've been saying... 3.8' to 4' away on a 55" for things like anno/RTS and first-person games, and you can sit closer for things like driving games. That's where a huge monitor arm would come in handy (or a 2nd desk space like I have, which he actually mentions as an option to make it much more usable).

He did a lot of evaluation at first just on the size/rez effect on gaming, and was not directly comparing the oled to the FALD. Then he hit the major drawback: no HDR. For that kind of money heading toward xmas 2019, having no HDR on such an expensive monitor (one obviously sized well enough to play movies as well as HDR games on) is a huge mistake, and he relegated it to something of a 2nd monitor in a spare space, which this flagship pricing is not really for. I think he was just being kind at that point.

Once nvidia has hdmi 2.1 output gpus, all of these monitors cashing in on 98hz - 120hz over displayport in order to do 4k at highly inflated prices are going to be obsolete. If you are strategically planning the spending of $2k to $5k on a monitor or tv with forward thinking, you really shouldn't buy anything without hdmi 2.1 48gbps, full eARC uncompressed sound capability, and a good HDR implementation (preferably a 1000nit color brightness ceiling or higher). If you want to blow your money for a year I guess these are for you, but it's a lot of money for hobbled tech.
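To see why that 48gbps figure matters, here's the same sort of back-of-the-envelope pixel math as the sketch earlier in the thread (raw pixel rates only, blanking ignored, and the usable-rate figure assumes hdmi 2.1's 16b/18b coding, so it's ballpark):

```python
# Rough headroom check on a full-bandwidth hdmi 2.1 FRL link
# (48 Gbps raw; 16b/18b coding leaves ~42.7 Gbps usable).
# Pixel data only; blanking intervals are ignored.

USABLE_HDMI_21 = 48e9 * 16 / 18  # ~42.67 Gbps effective

def stream_gbps(width, height, hz, bpc):
    # Uncompressed RGB / 4:4:4: three channels of bpc bits per pixel.
    return width * height * hz * bpc * 3 / 1e9

for label, bpc in [("8-bit", 8), ("10-bit HDR", 10), ("12-bit", 12)]:
    need = stream_gbps(3840, 2160, 120, bpc)
    fits = need * 1e9 <= USABLE_HDMI_21
    print(f"4k120 {label}: ~{need:.1f} Gbps -> {'fits' if fits else 'does not fit'}")
```

Even 12-bit 4:4:4 at 4k120 (~35.8 Gbps) clears the ~42.7 Gbps a full 48gbps port can actually carry, which is exactly the headroom the displayport 1.4 monitors don't have.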
 
Got BFV working last night.....now typically I'm not a fan of Zeeeergfest shooters these days and more a fan of getting murdered relatively fast by the stone cold killers in PUBG.

However, with that said...OMFG BFV was AMAZING on this thing...I had to sit about 5-6 feet away but the detail of that game really shines on a screen this large. I could see ALL of the battlefield, tanks battling off to the far right, little ant-size people scurrying in the distance, planes swooping in dropping their payload into a massive explosion right in my face.....it was glorious seeing it all in 4k120 with zero screen tearing, no BLB, and MASSIVE.

I also played some Project Cars 2 and it was really, really good. Played some old need for speed hot pursuit and was having a blast with that too....
 
Got BFV working last night.....now typically I'm not a fan of Zeeeergfest shooters these days and more a fan of getting murdered relatively fast by the stone cold killers in PUBG.

However, with that said...OMFG BFV was AMAZING on this thing...I had to sit about 5-6 feet away but the detail of that game really shines on a screen this large. I could see ALL of the battlefield, tanks battling off to the far right, little ant-size people scurrying in the distance, planes swooping in dropping their payload into a massive explosion right in my face.....it was glorious seeing it all in 4k120 with zero screen tearing, no BLB, and MASSIVE.

I also played some Project Cars 2 and it was really, really good. Played some old need for speed hot pursuit and was having a blast with that too....

Ohh god, you're killing me, these are all the games I play, so reading this makes me want to run out and get this monitor. (I'm not, I can wait till RTX 3000.)
But damn, I know I'm missing out on beautiful detail using my x34. The x34 has served me fine for four years now, but I'm ready for an upgrade.
 