LG 48CX

4K is super demanding, so no. In a lot of modern demanding games you are lucky to get a 60 or 75fps average with all the graphical bells and whistles turned on, and that is with a top-end GPU. In your scenario, you would have to push 120fps SOLID.. that is, your minimum frame rate would have to be 120fps to avoid any frame rate fluctuation. Fluctuating frame rates with no compensating tech cause judder, and when the actual frame rate (not just the average) ranges over the monitor's top Hz, they cause tearing. Using V-sync adds a lot of input lag, and double-buffered V-sync can cut your frame rate in half whenever your frame rate can't match your monitor's max refresh rate.
 
With HDMI 2.1, is it now preferable to set your HDMI input to PC mode vs game mode? I remember over the summer people were recommending Game Mode with HDMI 2.0, but I don't remember why...
These are different things. You want the picture mode to be "Game" still (there is no "PC" here), and you want the label on the input in the "Home" menu to be "PC".
 
These are different things. You want the picture mode to be "Game" still (there is no "PC" here), and you want the label on the input in the "Home" menu to be "PC".

Yes, thank you. I meant PC vs Console in the HDMI input screen. Thanks!
 
@Whoever just deleted the gsync-off-no-tearing post: If you turn gsync off and vsync on you'll just have VSync at > 120Hz.
 
This monitor is pretty amazing. With G-Sync off (to avoid the stuttering), I can barely notice the tearing. Is that because of the way OLED refreshes the screen? The difference between G-Sync on and off is incredibly subtle (but I can notice it if I actively look for it). I'm guessing this is why people think the workaround of toggling "enable for selected display" actually works, because they don't even notice the tearing.

Compare this to LCDs - where I found the tearing to be SUPER noticeable even at high refresh rates of 165Hz, and games unplayable w/o g-sync. I'm not even sure I need g-sync on this screen.

Regarding input lags and online multiplayer - your latency etc. doesn't really matter when almost all games implement some form of rollback netcode. You are basically playing locally with rollback netcode, so all that really matters is your local input delay.

Rollback, aka rubberbanding, attempts to keep you up to the tick rate to compensate for ping times; it's not compensating to keep everyone else up to your local fps/Hz rate. At best you would theoretically get the tick rate of the server, which with interp 2 on a 128-tick server is 15.6ms, but that is not how the formula works once your ping time contributes to it. More likely you end up getting several frames of your local game between each server state update. For example, the game server could effectively be delivering you game action slices every 55ms once interpolation time + ping are considered, even if you are running 120fps/Hz or 360fps/Hz. On servers with a worse tick rate than 128, which is most games, the interp time itself is much worse to start with before your ping time is factored in.

So you are running ~15.6ms interp + your ping time on the best 128-tick servers, with the client averaging across three frames (the current one plus 7.8ms x 2 at interp 2) that have been "buffered". By comparison, a LAN game typically has a latency of 3ms to 7ms.


If you're now sitting there asking yourself how only 128, or much worse 64, ticks/s produce a playable game that doesn't stutter crazily when receiving only 128 updates a second, you deserve a cookie. The answer is interpolation. When your game client receives a package from the server, it doesn't simply show you the updated game world right away. This would result in everyone breakdancing across the map in 128 or 64 tick intervals. Rather, it waits a set interpolation time called "lerp", whose name probably originated from a network engineer stepping on a frog.

During this time, a set number of further packages arrive on the client's side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don't get mad yet). Interpolation time is determined by the simple equation


cl_interp = cl_interp_ratio / cl_updaterate


So in our 128-tick server example from above, on otherwise default settings this would mean: you receive a new packet every 7.8 milliseconds (cl_updaterate 128), but the client waits until it has received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 milliseconds, assuming all other hardware and software variables are optimal.
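As a quick sanity check of the numbers above, here is a tiny Python sketch (my own helper, not part of any game's console or SDK) that evaluates the cl_interp formula:

```python
def interp_time_ms(cl_interp_ratio: int, cl_updaterate: int) -> float:
    """cl_interp = cl_interp_ratio / cl_updaterate, converted to milliseconds."""
    return cl_interp_ratio / cl_updaterate * 1000.0

# 128 tick, default ratio of 2: wait for a third packet before displaying
print(interp_time_ms(2, 128))  # 15.625 ms -> the 15.6ms quoted above
print(interp_time_ms(1, 128))  # 7.8125 ms -> a renewed state every ~7.8ms
print(interp_time_ms(2, 64))   # 31.25 ms on a 64-tick server
```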


Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.


Again, people with unreliable connections are better off accepting higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms for another package before flashing that red warning message in the top right corner of the screen.


For anyone who tends to experience any package loss at all, it is safer to set cl_interp_ratio to 2, especially since you regain the "lost" time through lag compensation.

.... So you are buffering multiple ticks sent to you (delayed by your latency), which on the best tick servers means a new packet + 7.8ms x 2 ≈ 15.6ms of tick frames, then figuring out what happened between them. This is like buffering a few frames and making a frame out of the difference between them.


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

So you pull the trigger or peek around a corner = current server time (say 100min:00sec:00ms) minus (packet latency (25ms to 40ms typical) + client view interp (15.6ms)) =

128 tick at interp_2 = 15.6ms + (25 - 40ms) ping = 40ms to 55ms ~~~~~~~~~> 5 to 7 frames before new action/world state data is shown / new action is registered (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~~~~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~~~~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms) ping = 186ms to 206ms ~~~~~~~> 22 to 24 frames
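The rows above can be reproduced with a short Python sketch (illustrative only; the ping values and the 117fps frame time are the same assumptions the table uses):

```python
FRAME_MS = 1000 / 117  # ~8.55 ms per local frame at a solid 117fps

def total_delay(tick_rate: int, ping_ms: float, ratio: int = 2):
    """Interp time + ping, and how many local frames that delay spans."""
    interp_ms = ratio / tick_rate * 1000
    total_ms = interp_ms + ping_ms
    return total_ms, total_ms / FRAME_MS

for tick, ping in [(128, 25), (128, 40), (64, 40), (22, 40)]:
    ms, frames = total_delay(tick, ping)
    print(f"{tick:>3} tick, {ping}ms ping: {ms:5.1f}ms ≈ {frames:4.1f} frames")
```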


Lag Compensation


The inevitable conclusion from the preceding segment, and from the fact that all players on the server have a ping, is that everything you see on your screen has already happened on the server a few milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
Put into English, this means that once you pull the trigger and this information package gets sent to the server, the server then goes back from the current server time (the time the trigger-pull package was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.

... So the rollback you are talking about
-starts from the current server time (the server time logged when your latency-delayed packet was received)
-minus
-your ping time (25ms - 40ms) plus your interpolation time (128 tick = 15.6ms, 64 tick = 31.2ms, 22 tick = 90ms, 12 tick = 166ms)
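A toy sketch of that rewind step (hypothetical data and names, nothing from an actual engine): the server keeps a short history of each target's position and, when a shot packet arrives, looks up where the target was at current server time minus (ping + interp):

```python
import bisect

# (server_time_ms, x_position) snapshots for one target, oldest first
history = [(0, 0.0), (100, 1.0), (200, 2.0), (300, 3.0), (400, 4.0)]

def position_at(t_ms: float) -> float:
    """Linearly interpolate the target's recorded position at time t_ms."""
    times = [t for t, _ in history]
    i = max(bisect.bisect_right(times, t_ms) - 1, 0)
    t0, x0 = history[i]
    t1, x1 = history[min(i + 1, len(history) - 1)]
    if t1 == t0:
        return x0
    return x0 + (x1 - x0) * (t_ms - t0) / (t1 - t0)

current_server_time = 400
ping, interp = 40, 15.625            # 128 tick at interp_ratio 2
rewound = current_server_time - (ping + interp)
print(position_at(rewound))          # where the shooter actually saw the target
```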

There's no factoring in of 120fps/Hz's 8.3ms frames or 360fps/Hz's 2.8ms frames. You are bound by the tick rate. The best tick rate's effective update is closer to 60Hz, so a higher fps/Hz isn't doing anything to improve that. If anything, your latency is the deciding factor (at least until there are higher-tick servers someday).

Imagine you want to reach the boxes on top mid on Mirage while an AWP is perched in window. You jump out of the cover of the wall, fly and land safe behind the boxes. In the moment you land, the AWP shot goes off and you somehow die and end up right on the edge of the boxes, a good half meter away from where you stood on your screen. In the German scene, you would have just been “interped”, even though “being lag compensated” might be the more accurate term (I acknowledge that is way more clunky and less easy to complain about).

As the peeking CT moves into the gap of the double doors, his lag compensated hitbox and model are still behind the door, giving the Terrorist no real chance to respond. However, it is imperative in this scenario for the peeking player to actually hit (and in most cases kill) his opponent in the time it takes the server to compute all executed commands and the appropriate lag compensation. Of course, the showcased example is taken with a ping of 150ms, which is unrealistically high for most people, artificially lengthening that time.


Should any of you reading this have the good fortune to play on LAN one day, keep in mind that peeker's advantage is solely dependent on lag compensation, a big part of which is made up by a player's ping. With the typical LAN ping of 3-7ms, peeker's advantage is practically non-existent. Together with many other factors, this is one of the reasons why CS:GO has to be played differently in certain aspects on LAN than on the internet.

--------------------------

https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/

Note: In an example where two players shoot each other and both shots are hits, the game may behave differently. In some games, e.g. CS:GO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive at the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within one tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.

  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".

So the client is never feeding you new information, or registering your new actions, faster than the compensation across [[ two received 7.8ms tick action slices (15.6ms) + your local current frame state ]] allows. That is not fast enough for 13.x ms of input lag to be a competitive factor in online gaming imo, even if you had a ping of zero, and especially with online latency factored in.
 
4K is super demanding, so no. In a lot of modern demanding games you are lucky to get a 60 or 75fps average with all the graphical bells and whistles turned on, and that is with a top-end GPU. In your scenario, you would have to push 120fps SOLID.. that is, your minimum frame rate would have to be 120fps to avoid any frame rate fluctuation. Fluctuating frame rates with no compensating tech cause judder, and when the actual frame rate (not just the average) ranges over the monitor's top Hz, they cause tearing. Using V-sync adds a lot of input lag, and double-buffered V-sync can cut your frame rate in half whenever your frame rate can't match your monitor's max refresh rate.
This is not true at all; V-sync will not add any input lag if you cap your framerate correctly. You can find multiple tests showing that. V-sync only adds input lag when it's used without G-Sync.
https://hardforum.com/data/attachment-files/2019/01/298588_Q0SEx8F.png
 
This is not true at all; V-sync will not add any input lag if you cap your framerate correctly. You can find multiple tests showing that. V-sync only adds input lag when it's used without G-Sync.
https://hardforum.com/data/attachment-files/2019/01/298588_Q0SEx8F.png

Correct.. I was replying to this
I have a stupid question or comment. Through the years, I have always gamed at 1080 and 1440p without Gsync and freesync. Sometimes when playing on a 75hz refresh monitor (most professional monitors are 75hz), I would turn on Vsync only. On other gaming monitors (144-165), I simply leave Gsync and/or freesync off.
You guys state there is flickering with Gsync/Freesync? Why not just leave it off? Am I missing something?
Surely your graphics cards can push 120 fps without breaking a sweat? I do not understand the problem?

Another point. When you activate 1440p in windowed mode, what does the screen size become? For example, on a 32" 1440p monitor, when you activate a lower resolution (1080p) the screen size goes from 32" to 24"?
 
No issue so far at 4K 120hz 4:4:4 using the following combination on my 3080 and 48CX setup
1) IBRA 2.1 HDMI 10ft Cable
2) Monoprice 48Gbps HDMI 3ft extension
 
Just posted this on the AVS CX gaming thread, but I've been following both of these threads since purchasing the CX.

I've had my CX for a week now and have done a lot of messing around with a 3070, and here is my experience.

I am using the Zeskit 6ft. HDMI 2.1 cable and I do get random lost-signal issues here and there. It is usually when I am changing Nvidia control panel settings: 4:4:4 to 4:2:0, 10-bit to 8-bit, or G-Sync on or off. Sometimes the signal comes back after 15 to 20 seconds, sometimes I can power the CX off and on and it returns. Unplugging and replugging the HDMI always makes the signal come back. This doesn't happen every time, and it is fine more often than not. It doesn't happen during normal steady use, so it shouldn't be a problem long term, especially once the G-Sync stutter fix firmware is out and I no longer need to use 4:2:0 8-bit for stutter-free G-Sync.

As for black levels and VRR, the issue isn't only flickering... the raised gamma is easily more noticeable when the frame rate dips for whatever reason. In Control, pausing the game for instance, or the frames dipping in a dark area, the gamma shifts and it is annoying. BUT with VRR I can get a stable enough 60fps at 4K with DLSS. Yes, the game looks great and the picture is overall awesome, but you are still dealing with a noticeable CONSTANT raise in gamma.

You can easily see the difference in the same scene by popping out and switching G-Sync off. Pop back into the game and only then did I notice it was back to perfect black. It's not something you easily pick up on; this was in a pitch-black room, and even then I was only just able to pick it out. Fired up Gears 5, same result. Playing during the day I don't think I would notice it, and with VRR smoothing things out it's a trade-off I will make for now.

Testing what people have said about dropping brightness to compensate...
With the raised blacks in VRR I could see steps down in what should have been perfect black all the way to 47; going to 46 is when I could no longer see the true-black area of the screen get any darker. Switching back and forth again, VRR off vs. on, the blacks are now right where they need to be. Of course, at the expense of some black crush. Nothing that's a dealbreaker, but still a shame to have to make compromises to get an accurate picture because features the TV is advertised as having are not where they need to be.

I am going to play around and find something that works for me, all issues aside the CX is beautiful and it is a joy to play on. I think console users should not be worried. Most games are going to be at a more steady 30/60 and in most cases you can have VRR off, have a beautiful image and still enjoy all the benefits of an oled.

I know there are journalists covering this issue in some news stories, but I wish more of the outlets touting this as the best/perfect next-gen gaming TV would update their coverage and let people know of the issues. Even Rtings' CX vs. Vizio OLED comparison video posted yesterday glosses over the issue completely. Don't quote me, but when they get to VRR on the CX they say it works perfectly without a hitch.

That all being said I still love it and look forward to the gsync firmware. The VRR / gamma issues are a bummer but I can do without it and still be extremely happy to own this TV.
 
Been testing my RTX 3090 FTW3 Ultra that I got yesterday and I've been really enjoying it. I think there might still be some firmware issues because I have been experiencing cut-outs when changing the color or G-Sync settings on the TV. Otherwise I haven't noticed the raised blacks on the VRR or any stuttering during my usual routine. If anything, everything looks quite amazing, and a great step up from my C6 OLED.
 
Those of you with HDMI 2.1, are you going with RGB 10 bit or 444? What are you setting your dynamic output in NVCP as well as black level for either?
 
Those of you with HDMI 2.1, are you going with RGB 10 bit or 444? What are you setting your dynamic output in NVCP as well as black level for either?
I'm at 10 bit 444 with HDR enabled. OLED light is set at 65.

I recently tried switching the Input (via the Home Panel) to PC, and it made my text blurry. Previously the text appeared crisp.

Not quite sure what the issue is. I have the Screen Shift off. I suppose I'll keep playing with the settings.
 
Some guy on Reddit says we can get the new firmware via "engineer mode", but I don't know exactly what that means???
He also says the new firmware corrects the G-Sync stuttering issue.
So can anyone explain to me how, and whether it is safe?
I have an extra code for Watch Dogs: Legion (redemption via GeForce Experience >> must have a 3080 or 3090); if you want to trade, message me.
 
I'm at 10 bit 444 with HDR enabled. OLED light is set at 65.

I recently tried switching the Input (via the Home Panel) to PC, and it made my text blurry. Previously the text appeared crisp.

Not quite sure what the issue is. I have the Screen Shift off. I suppose I'll keep playing with the settings.
If PC mode is off then you are not getting 4:4:4 output, the display will turn it into 4:2:2. If text becomes blurry make sure pixel shift is off. Make sure Quickstart+ is off too or pixel shift will turn itself back on. It's a bug LG hasn't bothered to fix.

Some guy on Reddit says we can get the new firmware via "engineer mode", but I don't know exactly what that means???
He also says the new firmware corrects the G-Sync stuttering issue.
So can anyone explain to me how, and whether it is safe?
I have an extra code for Watch Dogs: Legion (redemption via GeForce Experience >> must have a 3080 or 3090); if you want to trade, message me.
This can be done via the service menu, which requires the service remote or Android phone with IR blaster and LG service remote app. Poking in the service menu is not safe.
 
Screen Shift doesn't seem to zoom the image anymore even on 03.11.25 - it only pans the image.
Can confirm, so at least they fixed that. I guess the display not retaining the screen-shift-off state when Quickstart+ is enabled is still broken though?
 
I ended up turning Quickstart+ off and leaving it off. Yeah the TV boots noticeably faster with it enabled, but I can tolerate a few extra seconds per "session" for the other benefits it brings leaving it disabled. Fixing that "bug", if that's what it is, was really low on my priority list compared to the VRR/G-Sync stuff. I'm glad that LG remains committed to working hard on steady firmware updates. Someone said it earlier but the HDMI 2.1 implementation combined with all of the various gaming features was uncharted territory and I'm not surprised that everything wasn't working 100% perfectly out of the box with the 3xxx series cards. That being said, it was STILL a best-in-class display before, and they've worked to deliver the features as promised. Like a couple of people have said, anything else is a nitpick at this point. I bought the 48CX at launch and feel that it was money well spent (and due to GPU supply issues I'm still not able to take advantage of all of its features). It will be interesting to see how the AMD 6xxx series GPUs do with this display (and in general - drivers, etc.).
 
I ended up turning Quickstart+ off and leaving it off. Yeah the TV boots noticeably faster with it enabled, but I can tolerate a few extra seconds per "session" for the other benefits it brings leaving it disabled. Fixing that "bug", if that's what it is, was really low on my priority list compared to the VRR/G-Sync stuff. I'm glad that LG remains committed to working hard on steady firmware updates. Someone said it earlier but the HDMI 2.1 implementation combined with all of the various gaming features was uncharted territory and I'm not surprised that everything wasn't working 100% perfectly out of the box with the 3xxx series cards. That being said, it was STILL a best-in-class display before, and they've worked to deliver the features as promised. Like a couple of people have said, anything else is a nitpick at this point. I bought the 48CX at launch and feel that it was money well spent (and due to GPU supply issues I'm still not able to take advantage of all of its features). It will be interesting to see how the AMD 6xxx series GPUs do with this display (and in general - drivers, etc.).
Yeah it's the bottom of the list of my problems too. The only reason I care about it at all is with Quickstart+ enabled turning off the TV doesn't trigger an "unplug" event on the display and rearrange all your windows. Since I'm obviously turning the display off when I'm leaving the computer, it's an annoyance, albeit a minor one.
 
Correct, and it's still blurring text; that's how I noticed it.
But if Screen Shift doesn't zoom and only shifts the picture around, leaving the borders opposite the direction it's temporarily shifted to blank, shouldn't it have no effect on whether or not text is blurry? It's still 1:1 pixel usage; all the pixels are just shifted a few pixels in whichever direction it's currently shifted.

Trying to understand this, because if my description is correct and it's NOT zooming like I guessed it would, and is therefore still 1:1, I might actually start using it.
 
But if Screen Shift doesn't zoom and only shifts the picture around, leaving the borders opposite the direction it's temporarily shifted to blank, shouldn't it have no effect on whether or not text is blurry? It's still 1:1 pixel usage; all the pixels are just shifted a few pixels in whichever direction it's currently shifted.

Trying to understand this, because if my description is correct and it's NOT zooming like I guessed it would, and is therefore still 1:1, I might actually start using it.
I personally think it's still zooming; it definitely is on the settings menu. Some people never believed it was zooming in the first place, so I think it's hard for some people to notice. Or perhaps, in an effort to be less noticeable, it shifts by fractional pixel amounts and leaves artifacts from the processing.
 
It would have to be zooming, unless you have a black line down one or more edges of the screen.
 
It would have to be zooming, unless you have a black line down one or more edges of the screen.
That's what we're trying to nail down. I'm trying to determine if there are black edges from the screen being shifted. But it's hard to tell because the pixels are so small, and them being off looks the same as the bezel.
And then there's the question of if it's both that AND zooming. Not zooming would be better because then it would be 1:1 pixel usage across the TV minus the blank edges.
 
To the guys saying 117fps is smooth using the latest 03.11.30 firmware: I'm using the same firmware myself and still getting 116fps when using G-Sync + V-Sync with a 117fps cap, so I'm not sure what's going on. What drivers are you using?
 
To the guys saying 117fps is smooth using the latest 03.11.30 firmware: I'm using the same firmware myself and still getting 116fps when using G-Sync + V-Sync with a 117fps cap, so I'm not sure what's going on. What drivers are you using?
117fps limit set where, and 116fps measured where?

I have 117fps set in NVCP (not in game or RTSS), and am measuring 117fps with RTSS. I'm on 457.30 (latest I believe).
 
117fps limit set where, and 116fps measured where?

I have 117fps set in NVCP (not in game or RTSS), and am measuring 117fps with RTSS. I'm on 457.30 (latest I believe).
The 117fps cap was set in game and V-sync set from NVCP, but it was apparently Nvidia Reflex that caused the issue (not sure why it affects G-Sync + V-Sync; probably a bug). Disabled it and I'm getting the full 117fps now!
 
That's what we're trying to nail down. I'm trying to determine if there are black edges from the screen being shifted. But it's hard to tell because the pixels are so small, and them being off looks the same as the bezel.
And then there's the question of if it's both that AND zooming. Not zooming would be better because then it would be 1:1 pixel usage across the TV minus the blank edges.
I have a Philips OLED in my living room, and its screen shift moves the picture a couple of pixels left and right every once in a while. My LG CX in the bedroom just seems to "zoom in" one pixel, so I believe it uses some other method: on my Philips I can still see my entire cursor at the top and bottom of the monitor, but on my LG the cursor gets "cut off" at all the screen edges. If that makes sense...
 
A few months ago the scrollbar auto-hide option deep in Chrome flags stopped working; now there is a huge, always-present white scrollbar by default.

What do you OLED guys do about such things? A quick glance at the extension store didn't turn up any extremely simple (no other bloat) scrollbar mod with high ratings.
 
But if Screen Shift doesn't zoom and only shifts the picture around, leaving the borders opposite the direction it's temporarily shifted to blank, shouldn't it have no effect on whether or not text is blurry? It's still 1:1 pixel usage; all the pixels are just shifted a few pixels in whichever direction it's currently shifted.

Trying to understand this, because if my description is correct and it's NOT zooming like I guessed it would, and is therefore still 1:1, I might actually start using it.
That's how it's supposed to work, and that's how it actually works now with 03.11.25 or higher firmware.

The bug was that when running at 4K 120Hz, it caused some sort of overscan (underscan?) effect that zoomed the image a bit, so pixel mapping was no longer 1:1.

Now it will just randomly cut a few pixels off the edge of your screen.

A few months ago the scrollbar auto-hide option deep in Chrome flags stopped working; now there is a huge, always-present white scrollbar by default.

What do you OLED guys do about such things? A quick glance at the extension store didn't turn up any extremely simple (no other bloat) scrollbar mod with high ratings.
Nothing much you can do about it other than switch to macOS, which hides scrollbars automatically. There just isn't much capability to style or change the behavior of scrollbars in Windows, which is a bummer.

Best case scenario would be apps starting to use the small sliver scrollbar you can see in, for example, Windows Settings instead of the legacy big bar.
 
You can try the Rescroller addon for Chrome, editing the scrollbar size to "none" as described in the link below.

https://www.quora.com/How-would-I-remove-or-hide-the-scrollbars-from-Google-Chrome

You can also try the WinAeroTweaker app to set all of the Windows scrollbars and window frames to be very slim.

I would experiment with dark themes too to see if you can make the bars black.

You could also try some software with window placement functions (DisplayFusion or others) so that you can activate/hotkey exact window placements. Set the Chrome window's placement so that its scrollbar is just offscreen, as if you dragged the window there yourself.

But for me personally, I'll be using separate monitor(s) for desktop/apps like some others do in this thread. The OLED will be a blacked out wallpaper media and gaming "stage".
 
A few months ago the scrollbar auto-hide option deep in Chrome flags stopped working; now there is a huge, always-present white scrollbar by default.

What do you OLED guys do about such things? A quick glance at the extension store didn't turn up any extremely simple (no other bloat) scrollbar mod with high ratings.
I just don't leave a browser maximized on the screen for long periods of time, and non-maximized browsers vary in their location, so it's no big deal. Theoretically I guess this wears the right half of the screen more... but I HIGHLY doubt it will have any noticeable impact.

And I'm not sitting on the desktop looking at white browser scrollbars at max brightness.
 
Yeah it's the bottom of the list of my problems too. The only reason I care about it at all is with Quickstart+ enabled turning off the TV doesn't trigger an "unplug" event on the display and rearrange all your windows. Since I'm obviously turning the display off when I'm leaving the computer, it's an annoyance, albeit a minor one.
Interesting that you're having that issue. I do not. My window positions remain after powering on the display with Quickstart+ disabled. I did have that issue with my 2015 Samsung 48JS9000 Quantum Dot display, though. If I remember correctly, all of my application windows would snap to the upper left hand corner of the screen and I'd have to reposition them. But no such issues with the 48CX.
 
You can try the Rescroller addon for Chrome, editing the scrollbar size to "none" as described in the link below.

https://www.quora.com/How-would-I-remove-or-hide-the-scrollbars-from-Google-Chrome

You can also try the WinAeroTweaker app to set all of the Windows scrollbars and window frames to be very slim.

I would experiment with dark themes too to see if you can make the bars black.

You could also try some software with window placement functions (DisplayFusion or others) so that you can activate/hotkey exact window placements. Set the Chrome window's placement so that its scrollbar is just offscreen, as if you dragged the window there yourself.

But for me personally, I'll be using separate monitor(s) for desktop/apps like some others do in this thread. The OLED will be a blacked out wallpaper media and gaming "stage".

Thanks!
The "can view and modify all content on every webpage" permission kind of freaks me out... hope it's not going to cost me my Battle.net account or something! :eek:
 
Thanks!
The "can view and modify all content on every webpage" permission kind of freaks me out... hope it's not going to cost me my Battle.net account or something! :eek:

I can move my browser off to the side just enough to hide the scroll bar, then just use the mouse wheel to scroll. With DisplayFusion or other window-placement software you can set up window positions per app, so Chrome would pop to that position when you hit a hotkey. That seems like a fairly easy workaround where you could put the scrollbar exactly offscreen.
 