Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

Here is my C9 setup. Just got it going a few days ago. Gaming at 1440p 120Hz right now with VRR. Works very well with the latest Nvidia drivers and latest OLED firmware. Sorry for the mess, still straightening it up. For the $1200 I paid for it, it works great and will last me just fine until an HDMI 2.1 card is available. I'd rather keep the $2800 difference in my pocket. IMHO

View attachment 195836

Just curious, is it 1440p mapped 1:1 with black bars around it, or is it interpolated to fill the entire panel?
 
How's the LG C9 in comparison to this now that it's supported by Nvidia? They go for less than 1/2 the price of the Alienware.

Had originally planned to wait for the ASUS/Acers, but the ASUS looks pretty shit in the recent testing, so I might stick with TVs. My only qualm is that 55" is pushing the limits of what I want to sit 2 feet away from. My Samsung has been great, but after 3 years I finally have 1 stuck pixel and it's annoying the everliving shit out of me, and there's nothing in the 43-inch space worth upgrading to. Seems like all the TV manufacturers quit making nice 43s.

You're paying for a DisplayPort, since Nvidia has no GPUs with a 48Gbps HDMI 2.1 port. DisplayPort 1.4 is still too narrow in bandwidth to do 4:4:4 120Hz 4k at 10-bit though, so you have to run 8-bit, or 4:2:2, or cut the refresh down to 98Hz. So they are gouging the hell out of having a DisplayPort for now, upcharging thousands of dollars on this display and considerably on a few other high-rez, high-Hz DP 1.4 displays.
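Napkin math on the bandwidth, if anyone wants to check me. Assuming roughly 7% reduced-blanking overhead and DP 1.4's 25.92 Gbps effective payload (illustrative numbers; exact timings vary):

# Rough DP 1.4 bandwidth check -- illustrative numbers, not exact timings.
DP14_GBPS = 25.92   # HBR3, 4 lanes, after 8b/10b coding
OVERHEAD = 1.07     # assumed reduced-blanking overhead

def needed_gbps(w, h, hz, bits_per_channel):
    # 3 channels for RGB / 4:4:4
    return w * h * hz * OVERHEAD * bits_per_channel * 3 / 1e9

for bpc, hz in [(10, 120), (8, 120), (10, 98)]:
    g = needed_gbps(3840, 2160, hz, bpc)
    print(f"4k {hz}Hz {bpc}-bit 444 needs ~{g:.1f} Gbps vs {DP14_GBPS} available")

10-bit 4:4:4 at 120Hz needs ~32 Gbps, well past the link. 8-bit at 120Hz just squeaks under, and 10-bit lands right around the limit at 98Hz, which is presumably where that cutoff comes from (displays squeeze it in with tighter custom blanking).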

Alienware 55-inch OLED gaming display:
- 98Hz to 120Hz (8-bit or 4:2:2) over DP 1.4
- high Hz: 100fps+Hz solidly cuts persistence blur by 40%, 120fps+Hz cuts it by 50% (to more of a soften than a smear; see the sketch after these lists)
- FreeSync VRR
- no HDMI 2.1 48Gbps for the next gen of consoles, no full 4k 10-bit 4:4:4 120Hz bandwidth on future HDMI 2.1 GPUs
- no HDR
- price

LG C9 TV:
- stuck at 60Hz max at native rez since there are no HDMI 2.1 output GPUs, which means smearing sample-and-hold blur of the entire viewport when moving, and a low number of motion-definition frames
- a VRR range topping out at 60Hz doesn't mean as much IMO, since it's a narrow, low range of frame rates compared to the 98Hz-120Hz top range where you'd want to be to get appreciable gains out of high Hz
- has optional BFI, but only at a 60Hz flicker rate, which to me would be annoying PWM
- has HDMI 2.1 for future 120Hz consoles and PC GPUs
- has HDR, but ABL protection against burn-in risk cuts it back to 600 nits on persistent brights and overall bright scenes, so it's really +200 nits over 400-nit SDR plus some spikes. The snap-in ABL dimming can be noticeable to some people too. Still quite good looking on OLEDs, and some games like Tomb Raider and the new CoD reportedly look a lot better in HDR
- couldn't do uncompressed HDMI audio over eARC, which is a big deal to some people with surround systems and/or good headphones. I'm not sure if it's been fixed or if it's even possible to enable on C9s ever
- very fairly priced, especially from LG-partnered warehouse vendors on eBay deals lately, ~$1300 to $1400
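The sketch behind those blur numbers, by the way: on sample-and-hold, persistence is roughly one full refresh period, so the reduction vs. 60Hz is just 1 - 60/Hz (simple math, not measured figures):

# Sample-and-hold persistence: a frame stays lit for the whole refresh
# period, so tracked motion smears across roughly 1000/Hz milliseconds.
def persistence_ms(hz):
    return 1000.0 / hz

base = persistence_ms(60)
for hz in (60, 100, 120):
    cut = (1 - persistence_ms(hz) / base) * 100
    print(f"{hz:3d} fps+Hz: {persistence_ms(hz):4.1f} ms -> {cut:.0f}% less blur than 60")

That prints 16.7 ms at 60, 10.0 ms (40% less) at 100, and 8.3 ms (50% less) at 120.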

lmk if I'm forgetting anything.
 

If the C9 had a 43" I'd upgrade to that, even if it was still 60Hz native. It's definitely a nice screen. I just don't have anywhere I can put it other than right on my desk, which will put it 2 feet from my face lol. Desk has 2 windows behind it, so no option for wall mounting it to get another foot further back.
 
Filled. It's scaled to fit by the display. More resolution would be great (4k) and will be here in due time. For now, from an image-quality perspective it creates a very eye-pleasing image. I've had friends over who all said the same: they thought 1440 would've looked much worse, then went on to say it's the best image they've seen. I had the PG27UQ before this and it doesn't even come close. The obvious downside being everyone says it's just physically too big for their setup.

I'll post images of some games either in this thread or my other OLED thread. The Alienware not having HDR is just flat-out stupid to me. I can't even rationalize why they wouldn't, especially at that price. I'm not even going to go into the burn-in debate because it'll just annoy me, as I've had multiple OLEDs without issue. I'll trust what I see, not what I read. It's never been a problem for me, so not worried there.

The very surprising part, which people commented on, is the input delay. I was skeptical that it could be that good for such a "TV". With VRR 120Hz, it is just madness how quick it is. I feel zero difference between this and my old PG27UQ. Hell, even that 240Hz Acer in the picture feels the same. Whether I'm a better gamer on one or the other hasn't manifested itself, so it could be I'm just not good enough, but I seem to do well no matter the display. What is apparent is that TN 240Hz flat-out looks like a pile of donkey dick next to the OLED. The fact they even charge $500 for the G-Sync 240s is just obnoxious. I think it boils down right now to limited available bandwidth and companies creating many different panels that appeal to small markets, then having to charge a massive premium to justify the manufacturing. It's dumb. I'll take OLED and know it'll only get better as the hardware becomes available.
 
Displayport is still too narrow in bandwidth to do 444 120hz 4k at 10bit though so you have to run 8 bit or 422 or cut the hz down to 98hz. So they are gouging the hell out of having a displayport for now upcharging thousands of dollars on this display and considerably on a few other high rez high hz dp 1.4 displays.

Games barely use more than 8-bit 4:2:2 these days... 8-bit 4:4:4 is still overkill, and the AW55 does 4k120 4:4:4 VRR HDR400 8-bit no problem. We are still years away from seeing 10-bit + high refresh rate being mainstream. The picture quality & visceral experience of the AW55 is PHENOMENAL and well worth the four grand if one can afford it.
 
Desk has 2 windows behind it, so no option for wall mounting it to get another foot further back.

Just put it outside and open the windows.
 
View attachment 195884



Games barely use more than 8-bit 4:2:2 these days... 8-bit 4:4:4 is still overkill, and the AW55 does 4k120 4:4:4 VRR HDR400 8-bit no problem.

That doesn't address the main point of the passage you lifted, which is that they are gouging thousands of dollars for a DisplayPort addition on a 2019 LG OLED. $1400 vs $4000.

That DisplayPort 1.4 is not enough bandwidth to run 4k 10-bit at 120Hz is a fact, so you do have to run 8-bit, or 4:2:2, or cut the refresh down to 98Hz. If you like that and your games all run on it anyway, that's fine; it was simply part of a comparison of the C9 TV's current and future capability against the Dell Alienware 55 OLED, as asked for. I was actually listing a lot of pros for the Alienware's 98Hz-120Hz OLED relative to gaming, other than price and lack of HDR, since there are no HDMI 2.1 GPUs in sight to utilize most of the C9's pros. I personally wouldn't buy any monitor without high Hz and variable Hz in 2019+, and on a very high-priced one I'd want HDR at this point too.

400 nits isn't HDR. SDR screens do 400 nits and slap HDR on the box. 600-nit ABL is like SDR + 200 to 250 nits into a 3D gamut of HDR highlights. Still appreciable, especially with OLED side-by-side contrast/blacks, but even then there are tradeoffs in ABL and lost detail past the ABL ceiling (vs. 1000-nit HDR content), which LG OLED TVs now let you (optionally) attempt to remap back down.

I'm pretty sure Shadow of the Tomb Raider and the new CoD both support 10-bit color in HDR on 10-bit-capable HDR TVs, but I'd have to check, and it could be 4:2:2 rather than 4:4:4 at that, since there are no HDMI 2.1 GPUs. 10-bit probably isn't as big an issue in SDR media's 2D color gamut and relative color brightness as it is throughout HDR's 3D color volume anyway. Still, there's a limited number of games that do HDR right, and most people are going to push graphics limits and suffer ~60fps smear blur of the viewport with low motion smoothness, rather than aim for a 100fps average or better to get any appreciable blur reduction and motion-definition increase out of 98Hz, let alone 120Hz, so your point is taken. However, how is a $4000 55" gaming monitor mainstream? "We are still years away from seeing 10bit + high refresh rate being mainstream".. And if, in addition to not getting higher color and HDR color volumes, you aren't trying to get high-refresh-rate gaming since it's "years away from being mainstream" (especially at 4k), why wouldn't you get the C9 TV instead?

I'm sure this monitor looks amazing. I was trying to post an honest comparison, as requested, of the C9 and its current 60Hz limits vs. the Dell Alienware OLED -- the huge differences being DisplayPort allowing 98Hz-120Hz (depending), for an extra $2600, with no HDR vs. HDR600 with ABL (up to 700-something-nit spikes), and no HDMI 2.1 for future compatibility.
 
That doesn't address the main point of that passage you lifted which is they are gouging thousands of dollars for a displayport addition on a 2019 LG oled. $1400 vs $4000.

You're comparing a monitor to a TV. You're comparing mass consumer-grade electronics geared towards Joe Six-Pack sitting on the couch playing a console vs. a refined display tuned for PC gaming use. Comparing a monitor to a TV is like comparing an RTX Titan to an Xbox One X. It's like you're telling me that I can play 4k on the Xbox One X, so no need for an expensive PC lol

I am not quite convinced that the C9 is going to deliver the same or even better experience for gaming once GPUs support HDMI 2.1...I find it kind of funny that many assume it automatically will. I have tried my C9 55" @ 1440p 120hz and it felt like chewing ABC gum found under a desk at the DMV.

As for the price, four grand is reasonable for what you get, since your only other OLED monitor option is a 60Hz non-VRR 22" postage stamp for $5,000!

It does not matter how many stats, bar graphs, pie charts, regression analyses, line graphs, etc, etc that you pull out of your ass. The AW55 is 4k120 VRR OLED right fucking now, which means it sits on top of the heap today, and that is all that matters in this hobby. Tomorrow something better and cheaper will top it, but we have no idea how long tomorrow will take....whether the C9 ends up being better or not is irrelevant; what is important is that high-refresh OLED draws gaming consumers away from supporting LCD, which should lead to manufacturers offering more OLED options. Wake me up when a 32" 4k144 OLED with VRR is available.

I don't know about you, but I was so sick of trash LCD and slow 60Hz that the AW55 is pure gold. The experiences I have had playing BF4, BFV, PUBG, BF2, CODMW, GTAV, etc, etc in the last month alone on this thing have been mind-blowing.....bar none the best gaming experience I have ever had. My jaw has constantly been on the floor; it's that damn good.
 
I mean there's no difference between the Alienware and a 55" C9 except that the Alienware probably uses a 2018 panel to save money and it has DP1.4. That's it.

I'm sure it's a nice display but it's a lot of money to pay for the privilege of 120hz VRR 6 months early.

E: Both are still a super fucking awkward size for any normal sized office.
 
Here is my C9 setup. Just got it going a few days ago. Gaming at 1440p 120Hz right now with VRR.

I have the exact same setup but with a narrower desk and an SFF case. Curious how you got VRR working already? The latest C9 firmware update isn't available yet in my region (Canada) unless I mess around with a USB stick upgrade. Once it is available for download, is it just a matter of running the latest Nvidia driver? Do I need to use the beta driver? TIA

M.
 
You're comparing a monitor to a TV. Comparing a monitor to a TV is like comparing an RTX Titan to an Xbox One X.

So ... the more apt comparison is two graphics cards based on the same chip. Their components are largely the same, but with slightly different I/O.

I applaud you for spending $4k to play video games though. When I bought my C9 the AW55 wasn't out yet and I wasn't convinced it was going to make it out before the end of the year. The C9 definitely seems to be the more future-proof choice at this point. However it very well could have sub-optimal results with HDMI 2.1 4k VRR .... have to wait and see.
 
So ... the more apt comparison is two graphics cards based on the same chip.

The RTX Titan was what, three grand? I had no interest in that, as I learned a long time ago to get the best mainstream GPU since it tends to have the most support, a la the 2080 Ti. Yes, the RTX Titan is 15% faster, or whatever, but the cost value was not there for me.

The AW55 on the other hand is like ice cream for my eyes. I had the Acer X27 FALD and this AW55 beats the living shit out of it!

I simply cannot stand LCD for gaming anymore and gaming is one of my hobbies.
- I don't golf but dudes spend way more than four grand a year golfing
- I don't fish but dudes spend way more than four grand a year fishing
- I don't go to strip clubs but dudes spend way more than four grand a year in clubs
- I don't buy firearms.....oh wait I do lmao!

I have said a dozen times in this thread already that the smart money is on waiting for the C9 / HDMI 2.1 for value. However, I have not been enjoying gaming at all the last couple of years. I'm middle-aged and 27" screens are too small for me to process....maybe it's my eyes or slipping reflexes, but modern games have so much detail in them now, I don't know how anybody sees shit on the smaller displays. I would rather have too big than too small (that's what she said).

My original plan was to try the AW55 and then return it and wait for HDMI 2.1 / C9-C10, but after a weekend with it I was undone! Now I am having a blast with every game I play.
 
I'm middle aged and 27" screens are too small for me to process....maybe its my eyes or slipping reflexes, but modern games have so much detail in them now I don't know how anybody sees shit on the smaller displays. I would rather have too big than too small (thats what she said).
It’s very much a personal preference, but I’m the opposite way around. I’ll take too small as opposed to too big. Don’t get me wrong, when it comes to displays I definitely sing to the tune of “bigger is better.” It’s only when I have to decide between too big and too small that I go with too small.

In this case, the Alienware is literally too huge for me to accommodate into my set up. No matter what I do, I simply cannot use it. Those 21 inch OLEDs, on the other hand, I can just use a foot from my eyeballs and get a pretty decent experience. Being mildly myopic, I’d even have the benefit of not needing to use my glasses. Now, would I pay $5,000 for that honor? No, of course not.

My point is, this 55” OLED is not necessarily the only or best option for everyone.
 
So the Nvidia press release earlier in the month said they'd be bringing HDMI 2.1 to the 20-series RTX cards, correct? This should be out pretty soon then, and we can finally use TVs like the C9 @ 4K/120Hz? Or did I read the article wrong? Just trying to clarify; if so, it makes more sense for my upgrade path, even though I'm not entirely sold on the idea of a 55" 2 feet from my face. I wish someone would've made a capable 43", but it is what it is. Likely won't rely on ASUS or Acer to make something not shitty.
 
So the Nvidia press release earlier in the month said they'd be bringing HDMI 2.1 to the 20-series RTX cards, correct?

Nope. Just the VRR over HDMI feature. The hardware isn't capable of full HDMI 2.1 bandwidth.
 
So the Nvidia press release earlier in the month said they'd be bringing HDMI 2.1 to the 20-series RTX cards, correct?
Not possible; it needs HDMI 2.1 hardware, which wasn't available when those cards were designed.
They will be enabling some HDMI 2.1 features on HDMI 2.0,
in a similar fashion to HDMI 2.0 TVs and receivers also getting HDMI 2.1 features.
 
I'm not entirely sold on the idea of a 55" 2 feet from my face. I wish someone would've made a capable 43", but it is what it is.

I'm using an Ikea Galant corner desk, and currently maintain a 2.5' distance. I could easily increase that to 3' by moving my keyboard back or wall mounting the display, which I'm seriously thinking of doing after seeing tigger's picture above. That looks pretty great (and would allow me to move it down a couple of inches by eliminating the stand). Since this works for now though, I'll probably wait to see if the magical 48" LG appears. If it doesn't and we're stuck with 55" as the smallest size, I might do the wall mount thing but 48" is perfectly usable at 2.5' so I'm hoping it's coming.

Just wondering if the 2' you mention is because of a desk limitation (i.e. yours not being very deep) or something else.

As for the Nvidia announcement, sharknice and Nenu covered it but they can't bring HDMI 2.1 to the current cards without releasing new hardware, which I don't see them doing until a new model of card. The limitation is the hardware (port), not software or firmware.
 
Yeah I saw that, just released today. Getting mixed responses though so what does that actually mean?

It means you will get VRR over HDMI on current RTX cards, BUT you will be limited to 4k60Hz. No 4k120Hz until we get HDMI 2.1 GPUs.
 
40fps+Hz to 60fps+Hz range at native rez without HDMI 2.1 output GPUs, I think, on the C9s, though a lot of FreeSync monitors in the past were 48Hz-60Hz. 40 to 60 is much less useful than a 40fps/Hz to 120fps/Hz range.

Your frame rate is an average, which means on most demanding games it can be a roller coaster of +/- 30fps around that average, with a few potholes and upshoot spikes outside of that, unless you turn the graphics down a ton, especially at 4k resolution. If your frame-rate graph dips under the lower variable-Hz limit part of the time, with G-Sync the module will repeat frames 2x or 3x as required to keep the refresh "high enough" (really more of a placeholder filler), which turns your frame-rate motion definition to molasses at those rates, but still mostly avoids stutter and judder unless you are bottoming out with extremely-low-to-zero frame-rate potholes at times. I think G-Sync does LFC better on the module; originally FreeSync didn't do this as well, or at all, if I recall correctly, but it now has a method of LFC (low framerate compensation) using the video card driver on the GPU itself to compensate for going under the bottom end of the FreeSync range.

The recommendation for the top end with G-Sync is usually to cap the fps 3fps beneath the max refresh rate of the monitor so that it doesn't revert to V-Sync when going over the max refresh, since that would add input lag. You can do this with in-game frame-rate limiters when available, or use the RTSS app rather than the Nvidia drivers, since RTSS adds essentially no input lag to the chain where Nvidia's driver method adds a slight bit. However, according to the quote below, you might have to cap a FreeSync/VRR display's "G-Sync" off of Nvidia GPUs even more than -3fps for the method to be reliable. If that's the case, then with the HDMI 2.1 LG C9s and no HDMI 2.1 GPU currently, you'd potentially be down to less than the recommended (40fps-Hz to) 57fps-Hz range cap. Not interesting to me for modern PC gaming at all.
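To make the cap and the LFC behavior concrete, here's a toy sketch of the general scheme (my own illustration under the assumptions above, not any vendor's actual algorithm):

# Toy illustration of VRR bookkeeping (not any vendor's real algorithm).

def fps_cap(max_hz, margin=3):
    """Cap fps below max refresh so VRR never hands off to v-sync."""
    return max_hz - margin

def lfc(fps, vrr_min):
    """Below the VRR floor, repeat frames 2x/3x/... so the panel keeps
    refreshing inside its supported range."""
    mult = 1
    while fps * mult < vrr_min:
        mult += 1
    return fps * mult, mult

print(fps_cap(120))   # 117 (or lower, per the quote below)
print(lfc(25, 40))    # (50, 2): 25fps shown as 50Hz, each frame twice
print(lfc(55, 40))    # (55, 1): already in range, no repeating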


This is a quote from months ago from the head of the blurbusters.com website regarding FreeSync. I haven't seen any more recent info changing this, but I'll look around. If anyone has updates, feel free to contribute.

  1. The V-SYNC option doesn’t appear to be working as a frametime compensation mechanism to 100% prevent tearing during frametime variances in the upper and lower range, and simply reverts to V-SYNC behavior when falling out of the VRR (variable refresh rate) range.

  2. The minimum refresh range (aka LFC: low framerate compensation) appears to be currently limited in functionality on the 12 odd FreeSync monitors that Nvidia officially supports, and not functioning at all on unsupported monitors, which means when the framerate drops below the supported physical minimum refresh rate of the given Freesync panel, no refresh duplication behavior occurs to compensate, and it instead reverts either to full V-SYNC behavior (V-SYNC option on) or tearing (V-SYNC option off).

  3. There are some reports that -3 FPS isn’t currently enough to always stay in the VRR range with this driver feature, as there appears to be a slower communication rate between the driver and the GPU when directly compared to a genuine G-SYNC module. What that required number is, is unknown until high speed tests are done.

Since the V-SYNC option doesn’t appear to be working (as it does on a genuine G-SYNC module) with the new “G-SYNC on FreeSync” feature, my current recommendation for those that have an Nvidia GPU paired with a FreeSync monitor is to use G-SYNC + V-SYNC “Off” (both in-game and in the NVCP) and limit the FPS to 120 on a 144Hz monitor, for instance.

This should reduce or remove the tearing seen in the upper FPS range near the bottom of the screen by giving enough “breathing room” for frametime variances. However, tearing will still be seen occasionally during frametime spikes, and/or whenever the framerate exceeds or drops below the Freesync monitor’s VRR range.

For lower or higher refresh rates, the number required to reduce tearing in this area may be different, and should be lowered gradually until it diminishes or disappears.

To be extremely clear here, this is currently ONLY necessary for the G-SYNC feature on FreeSync displays, and ONLY because the V-SYNC option doesn’t appear to be working (as it does on a genuine G-SYNC monitor) when paired with this driver-only version of G-SYNC.
----

You can opt to run a non-native 2560x1440 rez at 120Hz on demanding games on the C9 TVs, but non-native tends to look a bit muddy and less crisp. I'm not sure what the resolution cutoff is for 120Hz, but I think it is 2560x1440 on the C9s. Perhaps you could try resolutions between 2560x1440 and 3840x2160 and see if 90Hz or 98Hz works.
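Same napkin math as the DP 1.4 sketch earlier, but against HDMI 2.0b's ~14.4 Gbps effective payload, backs that cutoff up (again assuming ~7% blanking overhead and 8-bit 4:4:4; illustrative numbers only):

# Which resolutions fit at 120Hz over HDMI 2.0b? (illustrative numbers)
HDMI20_GBPS = 14.4   # effective payload after 8b/10b coding
OVERHEAD = 1.07      # assumed reduced-blanking overhead

def needed_gbps(w, h, hz, bits_per_channel=8):
    return w * h * hz * OVERHEAD * bits_per_channel * 3 / 1e9

for w, h in [(1920, 1080), (2560, 1440), (3440, 1440), (3840, 2160)]:
    g = needed_gbps(w, h, 120)
    verdict = "fits" if g <= HDMI20_GBPS else "too much"
    print(f"{w}x{h} @ 120Hz: ~{g:.1f} Gbps -> {verdict}")

1080p and 1440p fit at 120Hz; anything much wider than 2560x1440 blows past the link.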

Personally, regarding the screen size of a 55" for gaming, I'd consider running a 3840x1600 or 3840x1440 ultrawide rez to get a huge ultrawide out of it and squeeze some extra frame rate out of it, vs. dialing graphics settings down even further on more demanding games at 4k. I'd be interested to see impressions from current owners of the Alienware 55" OLED (and C9 TVs) at ultrawide resolutions if any owners get around to it at some point. You'd be getting a giant ultrawide resolution granting more game world in your FoV, plus a slightly higher frame rate, which would be very helpful for getting any appreciable gains out of the higher-Hz capability of the Alienware (or other 120Hz HDMI 2.1 displays eventually).
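For a sense of the physical sizes, the 1:1 letterboxed modes on a 55" 16:9 panel work out like this (straight geometry, assuming exact 1:1 pixel mapping):

import math

# Physical size of 1:1 letterboxed modes on a 55" 3840x2160 panel.
DIAG_IN, W_PX, H_PX = 55.0, 3840, 2160
ppi = math.hypot(W_PX, H_PX) / DIAG_IN   # ~80 pixels per inch

for w, h in [(3840, 1600), (3840, 1440), (2560, 1440)]:
    print(f"{w}x{h} 1:1 -> {w/ppi:.1f} x {h/ppi:.1f} in, "
          f"{math.hypot(w, h)/ppi:.1f} in diagonal")

So the full-width modes keep the whole ~48-inch panel width as a roughly 51-52 inch diagonal ultrawide, while a 1:1 1440p box comes out to roughly a 37-inch 16:9 display.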
 
If 40 is the bottom it's limited from 40-120hz. It wouldn't meet gsync compatible certification if the maximum wasn't double the minimum.

Yep, so Nvidia (or AMD lmao) have to release an HDMI 2.1 GPU for that to happen @4k, unless you wanna rock 1440p 120hz with 200% SS...or sell your ass on the street corner for a week like I did to get the AW55....and yes I did sell my ass, here is part of my He-Bitch journal for proof!

Besides the AW55, which is amaze balls, and support for the C9/C10 LGs, I don't see anything on the display horizon which interests me. That ProArt 32" 4k120 looks nice, but inferior LCD motion blur and the lack of G-Sync overdriving those pixels is probably gonna dud that display out. So basically 2021 at best for JOLED 32" 4k120 VRRs....uuugghhhhh

I see the C9 55s going for as low as $1199 new on the bay now, which is incredible value. There is literally no excuse for anybody not to get one of those instead of the shitastic LCDs being forced out right now. Fuck the $2500 UW FALD and fuck the 512-zone 27" 4k bullshit!
 
So the C9's VRR is limited to 40-60hz? That's worthless.

40fps+Hz to 60fps+Hz range without hdmi 2.1 output gpus at native rez
edited to clarify.. it's limited in range by the native resolution and the lack of HDMI 2.1 output hardware. The panel itself is 120Hz 4k. It can do 1920x1080 and 2560x1440 at 120Hz off of HDMI 2.0b, but I'm not a fan of non-native mud, so for me it's 4k or otherwise 1:1 pixel mapping with bars.

I was talking about its VRR range as of now at 4k, driven off of non-HDMI 2.1 gaming hardware sources. It's even less than 40 to 60 if capped properly to avoid input lag, especially with FreeSync/VRR rather than a G-Sync hardware module, unless something has changed.

according to the quote below, you might have to cap a FreeSync/VRR display's "G-Sync" off of Nvidia GPUs even more than -3fps for the method to be reliable. If that's the case, then with the HDMI 2.1 LG C9s and no HDMI 2.1 GPU currently, you'd potentially be down to less than the recommended (40fps-Hz to) 57fps-Hz range cap. Not interesting to me for modern PC gaming at all.

that is, until hdmi 2.1 gpus and full 120hz 4k VRR functionality of course.
 
Tested new 441.08 drivers and Gsync compatibility is working great on C9. Just in case anyone cares.

Can we just stop and appreciate for a moment that, here in 2019, we finally have a major GPU vendor working in conjunction with a major TV vendor, resulting in a world-class, PC-level gaming experience in the living room (or rearranged office).

Once we have HDMI 2.1 hardware on the GPU side (likely next year), it’s basically game over for all these overpriced monitors. Who would have thought, even 5 years ago, that it would be TVs blazing the way for the finest desktop experience.
 
Once we have HDMI 2.1 hardware on the GPU side (likely next year), it’s basically game over for all these over priced monitors. Who would have thought, even 5 years ago, that it would be TVs blazing the way for the finest desktop experience.

That's going a bit far because large TVs will still be an option for a small group of people willing to figure out how to set up a large TV for comfortable desktop use. I would call it a game over when they can offer flagship specs, VRR and high refresh rates at something around 43" size rather than the subpar stuff we have seen like the Samsung Q60 series.
 
Filled. It's scaled to fit by the display.

Do you have the option to use 1:1 and not scale?
 
Once we have HDMI 2.1 hardware on the GPU side (likely next year), it’s basically game over for all these overpriced monitors.

Maybe this will persuade some of the TV manufacturers to bring in more 43-inch panels. Bring curved back too, FFS; it's great for PC gaming.
 
Bring curved back too, FFS; it's great for PC gaming.

Agreed, I had a curved C6 OLED and it was pretty awesome. It helped cut back the need to sit back the extra foot that flat panels require.
 
Can we just stop and appreciate for a moment that, here in 2019, we finally have a major GPU vendor working in conjunction with a major TV vendor, resulting in a world-class, PC-level gaming experience in the living room (or rearranged office).

I'm blown away when I see a screen of this size be that fast. Whether it's the Alienware or the C9, I agree it is shocking. Hopefully this is a case where we all win, whether you want a monitor or a TV display. What's shocking is that size used to be what really dictated price; now, especially with monitors, it's what features one has. I paid $600 less for a 55" OLED than my PG27UQ, which is a whole 27" haha.

Do you know if it only works with RTX cards?

I have a 2080 Ti, don't have anything else to test. I don't see why not, as it's using HDMI 2.0 and G-Sync. There is nothing specific to the RTX cards unless Nvidia themselves restrict it artificially. Wouldn't put it past them.

Do you have the option to use 1:1 and not scale?

Yep. Just go into nvidia control panel and turn off scaling. You will get black bars all around the image. Honestly, the scaling is pretty decent to me. I was surprised. I'm normally super against scaling anything but I can live with it for now.
 
Agreed, I had a curved C6 OLED and it was pretty awesome.

The w models are pretty flexible. You could probably mod it to be curved. It's only a $6000 experiment.
 
Yep. Just go into nvidia control panel and turn off scaling. You will get black bars all around the image.

Then you are actually sending a 4k signal, which defeats the purpose of 120Hz over HDMI right meow. Question is, can the TV display native 1:1 without scaling if it gets a 1440p signal? Makes for a 36.5" display.
 
Then you are actually sending a 4k signal which defeats the purpose of 120hz over hdmi right meow. Question is can the tv display native 1:1 without scaling it if it gets a 1440p signal. Makes for a 36.5" display.

The OLED is 3840x2160 no matter what. Nothing besides 3840x2160 would be true 1:1 without going through a scaler, either in the display or the GPU. The OLED is a fixed-pixel-array display. There would be no such thing as 2560x1440 showing 1:1 on OLED.
 
OLED is 3840x2160 no matter what. Nothing else besides 3840x2160 would be true 1:1 without going through a scaler either in the display or the GPU. OLED is a fixed pixel array display. There would be no such thing as 2560x1440 showing 1:1 on OLED.

The question then is: can the TV's scaler assign 1:1 pixels if it receives a 2560x1440 signal, doing no interpolation and just surrounding the 2560x1440 pixels with black bars?
 
I'm interested in how the 55" OLED screens look and function at ultrawide resolutions, not just for the added game-world real estate compared to 16:9, but because it would make the goal of 100fps-or-better rates more achievable with less dialing down of graphics settings. Otherwise 120Hz is pretty meaningless on more demanding titles. It could also make a better screen size for a nearer viewing distance, with the sides adding immersion while keeping the regular 16:9-sized portion in your full-focus viewing angle.

I'd be interested in full-width 3840x1600, 3840x1440, and even other resolutions with a full black frame all around at 1:1, to see what kind of frame rates are possible on more demanding games and at what graphics settings (med-high, regular high, high-plus to ultra-minus, etc), but that can be found from game benchmarks of screens with those resolutions, so I guess I'm really interested in what the new viewport sizes look like on a 55" OLED screen. I'm only interested in 1:1, so that would mean black bars (completely "off" blacks on OLEDs make that great). You can also optionally run games in windowed mode 1:1 (with a black wallpaper and no icons, of course, vs. IR/burn-in risk).


If LG C9 OLED menus are the same as their regular LG webOS then this would still be possible.
https://www.lg.com/ca_en/support/product-help/CT20098005-1437128729864-others
● In the HDMI-PC mode, you can only select 4:3, 16:9 or Just Scan aspect ratio options.
► Picture ► Aspect Ratio

16:9
Displays the image on screen within the confines of a 16:9 aspect ratio (also known as the 1.78:1 picture format), making use of the entire screen, as adopted for high-definition digital television broadcasting.

Just Scan
Displays the picture on screen in its original 16:9 widescreen format without cutting away any of the edges (not subject to overscanning, unlike other widescreen aspect-ratio options), matching pixel for pixel when the video source supplies a 1920x1080 image resolution. The most ideal option for Blu-ray players, digital game consoles and personal computers.
● Just Scan can be selected for DTV/HDMI/Component (720p or higher).

Full Wide
When the TV receives a widescreen signal (generally in any proportion other than 1.78:1), it lets you adjust the picture horizontally or vertically, in a linear proportion, to fully fill the entire screen. 4:3 and 14:9 video is supported in full screen without any video distortion through the DTV input.

Set by Programme / Original
Alters the screen aspect ratio between 4:3 and 16:9 depending on the incoming video signal format.

4:3
Displays the image on screen in a 4:3 aspect ratio, notably to accommodate classic TV and old black & white movies. Note that the lateral black bars at both extremities are a necessary evil to preserve image proportions on screen. Be forewarned that excessive use of this option can produce uneven wear & tear across the screen, where in time the center of the screen will appear washed out compared to the sides.

Zoom
Enlarges the image to match the screen width. The top and bottom portions of the video may be cut off. May distort the appearance of objects and persons on screen, stretching them or making them appear short and stubby.

Cinema Zoom
Resizes any picture in the CinemaScope 2.35:1 or 2.50:1 extra-widescreen format without image distortion, maintaining proportion accuracy. In doing so it rids the screen of the top & bottom black bars, but also suppresses parts of the image at the far left and far right.

● The configurable items differ depending on model or country.
● Viewing content from an external device, having fixed text such as a program name on screen for an extended period of time, or using a 4:3 aspect ratio may result in uneven wear & tear.
● Depending on the input signal, the available screen size options may differ.
● In the HDMI-PC mode, you can only select the 4:3, 16:9 or Just Scan aspect ratio options.

-------------------

TVs are leading the way because of consoles. They aren't really making 120Hz HDMI 2.1 4k VRR TVs for PCs when you think about it. The BFGDs are all massively overpriced DisplayPort screens in the meantime. At this rate, a PS5 could have HDMI 2.1 out on what's rumored to be 2080-level GPU power (plus utilizing quasi-4k resolution via dynamic-resolution and checkerboard tricks), with VRR for 120Hz-performance-mode-enabled titles, before PCs get HDMI 2.1 output GPUs, but I do hope Nvidia comes out with a die-shrink HDMI 2.1 output GPU line before then.
I wouldn't be so quick to pat Nvidia on the back for allowing 40-60 (55? capped) worth of VRR on 120Hz 4k TVs while their display partners overcharge thousands for DisplayPort on 120Hz 4k monitors. AMD GPUs on PC, and the Xbox One, enabled FreeSync VRR on capable TVs over HDMI 2.0b way before Nvidia did, since April 2018. Nvidia started supporting FreeSync in January 2019, but only over DisplayPort, and only started supporting VRR on LG OLEDs (over HDMI 2.0b out) in September/October 2019.
 
If LG C9 OLED menus are the same as their regular LG webOS then this would still be possible.

That Just Scan setting would probably do the trick!
 