LG 48CX

There was talk about OLED preservation and wallpapers.

I can absolutely recommend Wallpaper Engine. With that I made a playlist of a few hundred wallpapers (animated even) and they change every 2 minutes.
It also has the option to adjust the desktop icon opacity. I have it at 50% now.

It only costs a few bucks/euros.

I know this can work alongside DisplayFusion, but I haven't tried it. Apparently Wallpaper Engine works as an overlay, so it will "overwrite" the DisplayFusion wallpaper(s). What I'm not sure about is how it handles multiple monitors. With DisplayFusion, for example, you can set a different wallpaper for each monitor, and you can set the wallpaper to be a slideshow pulled from various sources, free or otherwise.

DisplayFusion supports a number of wallpaper sources you can use.


-------------------------

To be honest, even with all of the desktop real estate I have, I usually have most of my monitors tiled with apps, so while wallpaper is neat for posting pictures of my setup, it's not a big deal day to day in the way I actually use it. I minimize all of my windows if I'm leaving the PC, and the monitors eventually time out on power saving. Once I get an OLED as a gaming/media "stage," having a slideshow kick in to help reduce burn-in risk could be a good use for cycling wallpaper on that monitor, at least until Windows 10 times out the display and puts it in standby. I'd still rely more on the OLED's own OSD screensaver or power-saving timeout, because a PC can still crash to a frozen screen or window, a BIOS boot screen, etc., even if that's a pretty rare fluke nowadays.
 
I just reordered since it seems the TVs are getting here sooner now. I planned to wait until Vizio released theirs, but that is too far off, and I am weak and want OLED NOW!

Incidentally, I have a store where pickup is available today; it's the same store where I initially ordered the 48 for pickup, so it must have been routed there because of my earlier order.
 
I'm debating whether or not to keep a 27" portrait monitor on the side. Reason being, an LCD looks really terrible next to the OLED with a black wallpaper.

EDIT: Looks like almost everywhere got some stock of this today, including Microcenter and Walmart, so those of you still looking should easily get hold of one now.
 
I'm debating whether or not to keep a 27" portrait monitor on the side. Reason being, an LCD looks really terrible next to the OLED with a black wallpaper.

EDIT: Looks like almost everywhere got some stock of this today, including Microcenter and Walmart, so those of you still looking should easily get hold of one now.

Yeah, I have my good-quality 32" VA 1440p monitor next to mine, one that was known for its good native contrast ratio. Black is laughably bright on it in comparison.
 
I'm debating whether or not to keep a 27" portrait monitor on the side. Reason being, an LCD looks really terrible next to the OLED with a black wallpaper.
It's true, it's not the same quality-wise, but I do keep a 27" on the side on portrait (it fits perfectly, width-to-height wise) and I like having somewhere to keep emails/web/video/music players etc. so I can flip windows from one screen away from games etc.
 
You can use displayfusion or other apps to set a different wallpaper on each monitor just so you are aware.
 
Yeah, I have my good-quality 32" VA 1440p monitor next to mine, one that was known for its good native contrast ratio. Black is laughably bright on it in comparison.

120Hz is just not enough for me; I would get this OLED for my PC if it went up to 165Hz. But for consoles, this is a killer display.
 
SDR should fit, but from what I've gathered, HDR will not. HDR supposedly adds a 10% overhead to your total bandwidth; some sources even say 20%.
I think the overhead is purely the extra bits needed for the color depth if you use more than 8-bit.
For 8-bit HDR, the requirements are roughly the same as for SDR. I created a custom resolution for 3840x2160 at 120Hz so that it just fits within the bandwidth for SDR, and 8-bit HDR works perfectly as well.
Maybe HDR10+ or Dolby Vision would require more, but I doubt 8-bit HDR10 does.
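The "overhead is just the extra color-depth bits" idea is easy to sanity-check with napkin math. Here's a rough sketch; the blanking figures are assumed approximate CVT-R2 (reduced blanking) values, not the TV's actual EDID timings:

```python
# Rough uncompressed-bandwidth estimate for 3840x2160 @ 120 Hz.
# Blanking defaults below are approximate CVT-R2 (reduced blanking)
# values, assumed for illustration -- real EDID timings differ slightly.

def video_gbps(h_active, v_active, refresh_hz, bits_per_channel,
               h_blank=80, v_blank=62):
    """Data rate in Gbit/s for uncompressed RGB / 4:4:4 video."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    pixel_clock = h_total * v_total * refresh_hz        # pixels/s
    return pixel_clock * bits_per_channel * 3 / 1e9     # 3 channels

sdr_8bit = video_gbps(3840, 2160, 120, 8)
hdr_10bit = video_gbps(3840, 2160, 120, 10)

print(f"4K120 8-bit RGB:  {sdr_8bit:.1f} Gbit/s")
print(f"4K120 10-bit RGB: {hdr_10bit:.1f} Gbit/s")
# 10-bit needs exactly 25% more than 8-bit -- the "overhead" is
# just the extra color-depth bits, as suggested above.
print(f"overhead: {hdr_10bit / sdr_8bit - 1:.0%}")
```

With these assumptions, 8-bit lands just under DP 1.4's ~25.92 Gbit/s payload while 10-bit lands well over it, and the 10-bit "overhead" comes out to exactly 25%, consistent with the experience described above.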
 
120Hz is just not enough for me; I would get this OLED for my PC if it went up to 165Hz. But for consoles, this is a killer display.
I would challenge this statement; I expect 120Hz on an OLED to be superior to anything less than 200Hz on an LCD in terms of realized motion resolution. Really, anything that's not a gaming TN. And even then, OLED would obviously be superior in every other metric, so beating OLED's motion resolution is likely to require significant compromises.
 
120Hz is just not enough for me; I would get this OLED for my PC if it went up to 165Hz. But for consoles, this is a killer display.
I just swapped between 144Hz and 120Hz on my 27" monitor, and the difference is pretty subtle but there. The real question is what you are playing that can hit 163FPS at 4K for it to matter. Or is this a desktop-smoothness thing?
 
Most people aren't going to be getting 120fps average let alone minimum to feed this 4k resolution monitor fully on games with gorgeous graphics and settings. Even somewhat older games can have 4k texture mods and updated meshes, FX, lighting, etc.

=========================================

100fps average (10ms per frame) would get you a FPS graph something like:

70 to 85fpsHz <<<-------------100fpsHz ------------>>> 115 to 130fpsHz (capped at 115 or 117fpsHz to avoid v-sync input lag)

To get 120fps minimum to feed this monitor fully you'd probably need to be running 135 or up to 150fps+ average to be sure. That limits you to CS:GO and other simple to render games, or games with their graphics settings stripped.


-------------------------------------------------------------------------

165 fps average would be something like

135 to 150fpsHz <<<----- 165fpsHz ----->>> 180 to 195fpsHz




To get 165 fps minimum you'd probably need to be running 180fps or 195fps average to be sure.
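The ranges above are simple math: average fps to frame time, plus a dip/peak band. A quick sketch; the 15-30% swing is an assumption eyeballed from the graphs in this post, not a measured distribution:

```python
# Frame-rate math sketch: convert an average fps to per-frame time
# and estimate the dip/peak band. The 15-30% swing is an assumed
# figure mirroring the ranges in this post, not measured data.

def frame_time_ms(fps):
    """Milliseconds per frame at a given fps."""
    return 1000 / fps

def dip_band(avg_fps, swing=(0.15, 0.30)):
    """Rough low/high fps bands around an average."""
    lo = (avg_fps * (1 - swing[1]), avg_fps * (1 - swing[0]))
    hi = (avg_fps * (1 + swing[0]), avg_fps * (1 + swing[1]))
    return lo, hi

avg = 100
lo, hi = dip_band(avg)
print(f"{avg} fps avg = {frame_time_ms(avg):.1f} ms/frame, "
      f"dips ~{lo[0]:.0f}-{lo[1]:.0f} fps, peaks ~{hi[0]:.0f}-{hi[1]:.0f} fps")

# Average needed so mild dips (-15%) still hold a 120 fps floor:
target_min = 120
print(f"~{target_min / (1 - 0.15):.0f} fps average needed "
      f"for a {target_min} fps minimum")
```

For a 100fps average this reproduces the 70-85 / 115-130 band drawn above, and it puts the average needed for a 120fps floor in the ~140fps range, in line with the 135-150fps estimate.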

========================================



We'll see what a 3080 or "3090" Ti's performance is, but the graphics ceilings of games aren't going to be stagnant on newer generations of graphics cards either. The graphics ceilings are pretty arbitrary to begin with.

There is also some hope of higher frame rates from DLSS 2.0, though. I think things like DLSS / AI upscaling and some kind of frame duplication or interpolation are going to be necessary in the future to get very high frame rates.
 
I just swapped between 144Hz and 120Hz on my 27" monitor, and the difference is pretty subtle but there. The real question is what you are playing that can hit 163FPS at 4K for it to matter. Or is this a desktop-smoothness thing?
We're going to need 1000Hz or more for smooth cursor movement at speed across a dark background... :)
 
I would challenge this statement; I expect 120Hz on an OLED to be superior to anything less than 200Hz on an LCD in terms of realized motion resolution. Really, anything that's not a gaming TN. And even then, OLED would obviously be superior in every other metric, so beating OLED's motion resolution is likely to require significant compromises.
Agreed. Went from a 144Hz G-Sync monitor to a 120Hz OLED, and the difference in motion is almost imperceptible. The strength of OLED is its near-instant pixel response, so even though you see fewer frames than on a 144/165Hz monitor, the frames are clearer and cleaner than on a comparable LCD. The only real way that 120Hz OLED is a downgrade is if you've already got a 240Hz monitor. In that case, the screen response of the 240Hz monitor alone will win.
 
120Hz is just not enough for me; I would get this OLED for my PC if it went up to 165Hz. But for consoles, this is a killer display.

Not enough for what, exactly? You'd be hard-pressed to hit more than 120fps at 4K unless you are playing Overwatch or CS:GO. And in that case you might as well say 165Hz isn't enough either and get a 240Hz monitor.
 
120Hz is just not enough for me; I would get this OLED for my PC if it went up to 165Hz. But for consoles, this is a killer display.

My 32" VA LCD is 165hz. Motion clarity on the OLED at 120hz is better than the LCD at 165hz. Motion clarity at 120hz with BFI on high on the OLED blows the LCD at 165hz out of the water, just like it does with blacks. The max refresh rate on LCD monitors is essentially a fake metric driven by overdrive and leading to overshoot/smearing/ghosting. Yeah the screen accepts that signal, but in many (most) situations the pixels don't react fast enough to display it cleanly.
 
Not true. Club 3D says the input to the adapter (Display port 1.4) accepts a display stream compression signal (DSC), which the RTX series can do. So the bandwidth for 4K/120Hz 4:4:4 or RGB at 10-bit HDR should be there on both the input (DSC DP 1.4) and output (HDMI 2.1 without DSC) ends of the adapter.

I believe he is right, DP1.4 supports 4K 4:4:4 10Bit HDR up to 96Hz. At least that is what I found online.

Found out the issue. A current-gen GPU (2080 Ti) can do 4K 120Hz RGB, or 4K 120Hz / 144Hz HDR 10-bit with YCbCr 4:2:2, but not the latter with 4:4:4.

Try making a custom resolution for 96Hz. See if that allows you to apply 4K 4:4:4 10Bit HDR.
Thanks for giving us feedback on your adapter. I should get mine on Thursday so I will also try it.
 
looks like my order is ready for pickup

I'll swing by after "work" from home and get it.

In the meantime, I should probably get rid of some of the clutter to make room.
 
The Club3D adapter doesn't even support G-Sync, so if you want G-Sync for gaming you'd have to unplug the TV from DisplayPort and then plug it into HDMI, which sounds like even more of a pain in the ass to do every time vs. just switching back and forth between 60Hz 4:4:4 and 120Hz 4:2:0 lol.
 
The Club3D adapter doesn't even support G-Sync, so if you want G-Sync for gaming you'd have to unplug the TV from DisplayPort and then plug it into HDMI, which sounds like even more of a pain in the ass to do every time vs. just switching back and forth between 60Hz 4:4:4 and 120Hz 4:2:0 lol.

You COULD leave both hooked up and switch inputs on the TV for less of a pain in the ass; without that, you can be stuck with the pain in the ass of switching between 4K 60Hz RGB and 4K 120Hz 4:2:0 in display settings depending on what you're doing. Nothing is ideal till there are GPUs that support HDMI 2.1 natively.
 
I believe he is right, DP1.4 supports 4K 4:4:4 10Bit HDR up to 96Hz. At least that is what I found online.



Try making a custom resolution for 96Hz. See if that allows you to apply 4K 4:4:4 10Bit HDR.
Thanks for giving us feedback on your adapter. I should get mine on Thursday so I will also try it.
98 Hz, not 96.
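The ~98Hz figure falls out of the bandwidth math. A sketch assuming approximate CVT-R2 blanking (real EDID timings differ slightly, which is how you land on 98 rather than a round number):

```python
# Solve for the max refresh of 3840x2160 RGB / 4:4:4 that fits in
# DisplayPort 1.4's payload without DSC. Blanking defaults are
# assumed approximate CVT-R2 values, for illustration only.

DP14_PAYLOAD_GBPS = 32.4 * 0.8   # HBR3 raw 32.4 Gbit/s, 8b/10b coding

def max_refresh_hz(h_active, v_active, bpc,
                   h_blank=80, v_blank=62,
                   link_gbps=DP14_PAYLOAD_GBPS):
    """Highest refresh rate the link can carry uncompressed."""
    pixels_per_frame = (h_active + h_blank) * (v_active + v_blank)
    bits_per_pixel = bpc * 3          # RGB / 4:4:4
    return link_gbps * 1e9 / (pixels_per_frame * bits_per_pixel)

print(f"4K 10-bit RGB max: {max_refresh_hz(3840, 2160, 10):.0f} Hz")
print(f"4K  8-bit RGB max: {max_refresh_hz(3840, 2160, 8):.0f} Hz")
```

With these assumed timings, 10-bit tops out just shy of 100Hz, and the same math puts 8-bit above 120Hz, matching the custom-resolution reports earlier in the thread.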
 
You COULD leave both hooked up and switch inputs on the TV for less of a pain in the ass; without that, you can be stuck with the pain in the ass of switching between 4K 60Hz RGB and 4K 120Hz 4:2:0 in display settings depending on what you're doing. Nothing is ideal till there are GPUs that support HDMI 2.1 natively.

Does that work? If it does, that's definitely viable, and I'd probably consider doing it myself. I never said switching between 4:4:4 and 4:2:0 WASN'T a pain in the ass, I just said unplugging and replugging DP/HDMI on your PC seems like a bigger one.
 
Does that work? If it does, that's definitely viable, and I'd probably consider doing it myself. I never said switching between 4:4:4 and 4:2:0 WASN'T a pain in the ass, I just said unplugging and replugging DP/HDMI on your PC seems like a bigger one.

I haven't tried it because I don't personally have the adapter but I don't see why it wouldn't. I'll try it today with 2x straight HDMI though and see, it might be convenient to have a Gsync and non-Gsync (disable instant game response on a port) connection to switch between.
 
Could you upload a video please?

Sure.




Looking at it again, I think this issue has nothing to do with LG's OLED; instead, it's a G-Sync issue itself. I have played game after game after game on my CX, and there has been absolutely zero "brightening of near-black objects". None. I've encountered this weird flickering in The Witcher 3 before on an older G-Sync monitor of mine, so I'm positive this is just a G-Sync problem with this game in particular.
 
^ Nice.

I don't know that I care enough about 120Hz 4:4:4 to buy that adapter which I'd only use temporarily. It for sure looks nicer on the desktop, but I think I'm OK with just waiting for the 3080Ti for now. I'll be keeping an eye on the reviews of it here though, so bring it!

I'm running my OLED light level at 35 and lowered the contrast from 85 to 80, and I still notice the ABL from time to time. 80 should be low enough to disable it, yeah?
 
Most people aren't going to be getting 120fps average let alone minimum to feed this 4k resolution monitor fully on games with gorgeous graphics and settings. Even somewhat older games can have 4k texture mods and updated meshes, FX, lighting, etc.

You heard of resolution scaling? You don't have to play EVERY game at 4k:



And there will be even better methods in the future, like checkerboard rendering.
 
Yes, because you are limited to DisplayPort 1.4's bandwidth of 32.4 Gbps with the adapter.

The 48CX does not support DSC. AVS confirmed this by looking at the EDID.

48CX doesn’t need to support DSC. HDMI 2.1 has enough bandwidth without compression to do what we need it to do. DSC is required only from the GPU to the adapter via DP 1.4 because that connection DOESN’T have enough bandwidth to do what we need it to do. Precisely why Club 3D says DSC is only used on the DP 1.4 side of the adapter, not the HDMI 2.1 side. The chip inside the adapter literally decodes the DSC in real time to send it out the HDMI port without compression.
 
I'm running my OLED light level at 35 and lowered the contrast from 85 to 80, and I still notice the ABL from time to time. 80 should be low enough to disable it, yeah?

What worked for me was changing the port you have your PC plugged into to "PC". You can do this by:
  1. hitting the home button on your magic remote
  2. in the upper-right corner of the screen click on the settings (gear) button
  3. click "Edit"
  4. click on the icon to the left of the text label of the port your PC is plugged into
  5. select "PC" from the list (this will automatically change the text label to PC as well)
This should fix your ABL issues (even with contrast above 80). Just keep in mind that when you have it set to PC it will disable/grey out many picture settings for the TV.
 
Did you get a tracking number?

Yeah, I got a shipping update this morning and an OnTrac # just now.

This whole experience was such a joke. I preordered the first week of June from a different retailer and they are still not scheduled to ship for another week.

Went over to the Bestbuy site that had a July 27th ETA and ordered on Sunday and it shipped today. Should have just ordered from BB to begin with and had the TV for the past 2 weeks.
 
Back when I was figuring out monitor sizes and layouts, I considered that 27" match in portrait, but at the distances I'd be using a 48" or 55" screen, the 27" would be very narrow and not enough desktop real estate at the scaling I'd end up using at its 108.8 PPI. I'd rather have an oversized portrait-mode monitor, and I'm considering filling in the gap at the top or bottom with different monitors.


You heard of resolution scaling? You don't have to play EVERY game at 4k:



And there will be even better methods in the future, like checkerboard rendering.

Or DLSS 2.0, which we have today and which is far superior to any resolution scaling on consoles.


Which I mentioned in that post if you bothered to read it.

Personally, I have zero interest in traditional non-native resolutions scaled beyond 1:1. They look muddy and soft. I'm not buying a 4K OLED to run shit on it.

DLSS / AI upscaling has potential though, which is why I mentioned it.

I'm also not opposed to running an ultrawide resolution if I can get away with it, not just for the extra frame rates but in order to show more game world in certain games.

-----------------

The point still stands. Most aren't going to be getting 120fps average, let alone minimum, on demanding games at a native, non-mud resolution, and graphics ceilings aren't going to get any lower once the next gen of 7nm GPUs is out. So in my opinion, a 117fps-Hz cap on an OLED with its incredible aesthetics will be quality.
 
Yeah, I got a shipping update this morning and an OnTrac # just now.

This whole experience was such a joke. I preordered the first week of June from a different retailer and they are still not scheduled to ship for another week.

Went over to the Bestbuy site that had a July 27th ETA and ordered on Sunday and it shipped today. Should have just ordered from BB to begin with and had the TV for the past 2 weeks.


I ordered mine on June 18 through Discount Bandit for $1392 at the time, figuring I might save $100. The order got processed through AppliancesConnection, but the shipping date keeps getting pushed; it's now scheduled to ship on July 21. Wish I had just ordered through Best Buy....
 
I ordered mine on June 18 through Discount Bandit for $1392 at the time, figuring I might save $100. The order got processed through AppliancesConnection, but the shipping date keeps getting pushed; it's now scheduled to ship on July 21. Wish I had just ordered through Best Buy....

Cancel it and order with Bestbuy.
 
Yeah, I got a shipping update this morning and an OnTrac # just now.

This whole experience was such a joke. I preordered the first week of June from a different retailer and they are still not scheduled to ship for another week.

Went over to the Bestbuy site that had a July 27th ETA and ordered on Sunday and it shipped today. Should have just ordered from BB to begin with and had the TV for the past 2 weeks.
I'm with ya. Only I've got hate for Best Buy. Ordered June 25th and haven't heard a thing. Still saying July 21 ship date, despite the thing being in stock w/ 2-day delivery all morning.

So don't do what I did folks, keep your current orders. Best Buy is a shit show as well.
 
I'm with ya. Only I've got hate for Best Buy. Ordered June 25th and haven't heard a thing. Still saying July 21 ship date, despite the thing being in stock w/ 2-day delivery all morning.

So don't do what I did folks, keep your current orders. Best Buy is a shit show as well.

You should try talking to them through chat, because I saw the same thing this morning (it said free delivery tomorrow) while my order said July 27th. I got into chat and told the rep that I could order again right now and have it tomorrow, so why was my current order still showing July 27th? After the guy looked into my order, it miraculously shipped 5 minutes later.

He also said that I'm "lucky" to have it shipping soon because demand is huge with a long order back log.

EDIT: The reason I gave up and ordered at BB is that their ETAs, as evident in this thread and elsewhere, are very conservative. People have been ordering and receiving it in a few days or weeks. Yours seems to be an outlier.

FYI Walmart has it in stock online too last I checked and it said I'd have it in a few days.
 
I'm also not opposed to running an ultrawide resolution if I can get away with it, not just for the extra frame rates but in order to show more game world in certain games.
For this, you can just widen the FOV, unless changing the resolution is to get around an application limitation. Both methods should result in the same number of degrees of FOV per pixel, and the same effective in-game view distance (which is what you give up when you widen the FOV, at least if it's widened linearly).
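The Hor+ relation behind this (vertical FOV held fixed, horizontal FOV derived from the aspect ratio) can be sketched in a few lines; the 60° vertical FOV and the pixel counts are assumed example values, not from any particular game:

```python
# Hor+ FOV scaling sketch: horizontal FOV as a function of aspect
# ratio with vertical FOV held fixed (common engine behavior).
# The 60-degree vertical FOV is an assumed example value.
import math

def hfov_deg(vfov_deg, aspect):
    """Horizontal FOV for a given vertical FOV and aspect ratio."""
    vfov = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * aspect))

vfov = 60  # assumed vertical FOV, degrees
for name, aspect, h_pixels in (("16:9", 16 / 9, 3840),
                               ("21:9", 21 / 9, 3840)):
    h = hfov_deg(vfov, aspect)
    print(f"{name}: hFOV {h:.1f} deg, "
          f"{h_pixels / h:.1f} px/deg horizontally")
```

Either way you widen the view, the same horizontal pixel count gets spread over more degrees, so pixels-per-degree (and effective view distance) drops identically: that's the equivalence described above.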
 
Referencing my post again about tick rates in games too, since a lot of these higher-than-120Hz-at-4K people seem to think they are seeing online "competitive" game-world data sooner because they have a high-Hz monitor. For single-player local and LAN gaming, yeah.

If you read through what I quoted about tick rates on servers, you'd see that you'll still likely be getting 15.6ms per tick on a 128-tick server unless you have pristine ping and are willing to risk 250ms delays whenever your second packet is lost. For most games' 64-tick, 22-tick, and 12-tick servers, your ms from the server would be much longer.

" Put into English this means that once you pull the trigger and this information package gets sent to the server, it then goes back from the current server time (the time the pulling the trigger package was received) by your ping plus your interpolation time. Only then it is determined if the client hit the shot or not. "

Then consider that everyone else in an online game is subject to the same lag compensation formulas.

I really think the response-time argument for extreme Hz and mouse usage is overblown considering all of that, unless you are playing LAN-only (or single-player local) games at very high frame rates, rather than using VRR to ride a roller coaster of frame rates that are moderate or low at the middle and low end of a frame-rate graph. I'm guessing most people buying an HDR-capable 4K OLED are buying it for some serious eye candy, not playing a very-high-frame-rate competitive game at low settings, or one that is low-graphics by design. What I do agree with is that at very high frame rates on very-high-refresh monitors, sample-and-hold blur would be reduced (without suffering the tradeoffs of BFI). That is advantageous not only for image clarity while moving, for targeting purposes, but also aesthetically.

Anyone testing would have to make sure they are running 120fps (115 or 117fps capped) as a minimum, not an average, in order to get 8.3ms per frame at 120fps (or 8.7ms at 115fps), the upper limit of what the monitor can do. However, a more realistic test would be a 90 to 100fps average, where people are using higher graphics settings on demanding games and relying on VRR to ride a roller coaster of frame rates.

-------------------------------------------
If someone goes overboard on graphics settings, or has a modest GPU and cranks up the graphics at 4K so that they are getting, say, 75fps average, they'd be seeing frame durations somewhere in the ranges of:

25ms / 16.6ms <<< 13.3ms >>> 11.1ms / 9.52ms
at 40fps / 60fps <<< 75fps >>> 90fps / 105fps

------------------------------------------
https://win.gg/news/4379/explaining-tick-rates-in-fps-games-difference-between-64-and-128-tick
  • CSGO official matchmaking: 64-tick
  • CSGO on FACEIT: 128-tick
  • CSGO on ESEA: 128-tick

Valorant tick rates:
  • Valorant official matchmaking: 128-tick

Call of Duty: Modern Warfare tick rates:
  • COD multiplayer lobbies: 22-tick
  • COD custom games: 12-tick
While that sounds fast, many CSGO players have monitors capable of running at 144Hz. In simple terms, the monitor can show a player 144 updates per second, but Valve's servers only give the computer 64 frames total in that time. This mismatch in the server's information getting to the computer and leaving the server can result in more than a few issues. These can include screen tearing, a feeling like the player is being shot when protected behind cover, and general lag effects.
---------------------------------------------------

You'd think that a tick of 128 would be 7.8ms and a tick of 64 would be 15.6ms , but it's not that simple... (see the quotes below)

----------------------------------------------------


http://team-dignitas.net/articles/b...-not-the-reason-why-you-just-missed-that-shot

interpolation. When your game client receives a package from the server, it doesn’t simply show you the updated game world right away. This would result in everyone breakdancing in 128 or 64 tick intervals across the map. Rather, it waits a set interpolation time called “lerp”, whose name probably originated by a network engineer stepping on a frog.

During this time, a set number of further packages arrived on the client’s side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don’t get mad yet). Interpolation time is determined by the simple equation

cl_interp = cl_interp_ratio / cl_updaterate
So in our 128-tick server example from above, on otherwise default settings this would mean: you receive a new packet every 7.8 milliseconds (cl_updaterate 128), but the client waits until you have received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 milliseconds, assuming all other hardware and software variables are optimal.
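The cl_interp equation from the quote, in code, reproducing the article's example numbers:

```python
# cl_interp = cl_interp_ratio / cl_updaterate, from the quoted
# article, converted to milliseconds.

def interp_ms(cl_updaterate, cl_interp_ratio):
    """Client interpolation delay in milliseconds."""
    return cl_interp_ratio / cl_updaterate * 1000

print(f"128-tick, ratio 2: {interp_ms(128, 2):.1f} ms")  # 15.6 ms
print(f"128-tick, ratio 1: {interp_ms(128, 1):.1f} ms")  # 7.8 ms
print(f" 64-tick, ratio 2: {interp_ms(64, 2):.1f} ms")   # 31.2 ms
```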



Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.

Again, the people with unreliable connections are better off accepting higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms for another package before flashing that red warning message in the top-right corner of the screen.


For someone who tends to experience any package loss at all, it is safer to set cl_interp_ratio to 2, especially since you regain the "lost" time in the lag compensation.

Lag Compensation


The inevitable conclusion from the preceding segment and also the fact that all players on the server have a ping is, that everything you see on your screen has happened on the server already a few Milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)


Put into English this means that once you pull the trigger and this information package gets sent to the server, it then goes back from the current server time (the time the pulling the trigger package was received) by your ping plus your interpolation time. Only then it is determined if the client hit the shot or not.
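The lag-compensation formula quoted above, as a quick sketch with assumed example numbers (50ms ping, 128-tick server):

```python
# The quoted lag-compensation formula in code. The 50 ms ping and
# 128-tick settings are assumed example numbers, for illustration.

def rewind_ms(packet_latency_ms, client_interp_ms):
    # Command Execution Time = Current Server Time
    #                          - (Packet Latency + Client View Interpolation)
    return packet_latency_ms + client_interp_ms

for ratio in (1, 2):
    interp = ratio / 128 * 1000            # 7.8 or 15.6 ms
    print(f"ping 50 ms, cl_interp_ratio {ratio}: "
          f"server rewinds {rewind_ms(50, interp):.1f} ms")
```

The point being: the server judges your shot tens of milliseconds in the past either way, which dwarfs the few-ms frame-time differences argued about above.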


Combining all of these factors, these tiny ms differences on the LG are really moot, especially when arguing latency for online (rather than LAN) gameplay without solid frame rates that never dip below the max Hz of the monitor.
 