LG 48CX

Not true. I haven't tried that specific resolution at 120 Hz, but several others.

Did you try a CUSTOM resolution? That's the key thing here; 1440p and 1080p work at 120 Hz as they aren't really custom resolutions but ones you could already pick from the TV by default.
 

Yes, several, both on my C7 and on my GX. I use it for CS:GO and similar. It can be a bit tricky to get it to work though, especially with DSR.
 
OK. I've played HLL at 3840x1600 because on Epic settings it's demanding at 4K. But there was no way I could get it higher than 60 Hz. After that I've tried 3840x1440. Also no success. Maybe I'm doing it wrong.

1440p at higher Hz, yes. But only at 2560. Haven't tried 3440.
 

Not sure what problems you are experiencing, but maybe your HDMI cable is not good enough to handle the needed bandwidth?
 
This guy is a known idiot.
Yeah, I can't stand that QUANTUMTV channel. He's one of those people who has more rhetoric and controversy than facts/analysis; and he has some weird hard-on about his life's mission to bash LG. In short, a hack.
 
Not sure what problems you are experiencing, but maybe your HDMI cable is not good enough to handle the needed bandwidth?
No. I've just replaced it with a proper 2.1 cable to prep for my 3080 Ti's arrival :)

If I set a higher frequency it'll not pass the test; it'll flicker to black and back. And if I set the frequency to 60 it works.
 

There are no HDMI 2.1 cables, only bandwidth :)

Can't really come up with any good ideas unfortunately, just that I know for a fact that custom resolutions at 120 Hz work.
 
I really want this. I am most interested in how 4K 120 Hz G-Sync works for those that have it and can report back on the gaming experience.

If anyone here has switched from the Samsung 2015 JS9000 55" series would LOVE to hear your feedback.

It could be fun for us to put together another post for those with the CX/GX for a guide on the correct and 'best' settings for gaming.

The previous 2015 Samsung post has helped a lot of people in the world (at least I think so... :) )
 
I really want this. I am most interested in how 4K 120 Hz G-Sync works for those that have it and can report back on the gaming experience.

Unfortunately, 4K 120 G-Sync does not work on HDMI 2.0. We're waiting for a video card with HDMI 2.1 to see if it actually works. The good news is that 1440p / 120 / G-Sync / HDR works just fine.
 
Looks like ABL (automatic brightness limiter) has come up again. Honestly, for gaming & movies I've never once noticed it, and I have very high standards.

But for Windows / office work, my god it's horrible. I'm one of the fortunate ones who has been able to work from home due to COVID-19 for the past 3 months. I've been using my OLED as my secondary monitor since I had a 4x1080p monitor setup at work. It's so annoying using spreadsheets, word documents, or websites in general. The screen constantly changes brightness almost any time I scroll or do anything.

I'm a fanboy/white knight of OLEDs, but even I couldn't recommend it for any type of PC work.
 

Can you lower brightness when doing desktop/browsing activities such that the ABL would never actually kick in (for example, say ABL knocks full-screen white down to 150 nits... what if you set brightness low enough that the screen never went higher than 150 nits, period)? For desktop-type activities, I imagine 150 nits is still reasonably bright and contrasty (especially with OLED blacks and some amount of light control in your space), but I've never had an OLED and have not tried this. I'm curious if any OLED owners can comment on this kind of scenario.
 

I use mine for 6-8 hours per day on average since starting WFH in December 2018 and I do notice the ABL if there is a lot of white on the screen, but unlike Seyumi it doesn't bother me in the least. It is good to point out that it exists, as it could be annoying to some, but I am not bothered by it and usually welcome the reduction in brightness, because even at the lower brightness settings I run, these can feel like staring at the sun if you have light sensitivity, which my eyes seem to (I frequently find myself squinting when outside on bright days, lol). I would say that this will vary per person. It is certainly not a dealbreaker for PC use for me. I use mine for everything and love it - by far the best monitor I've ever had in 25 years of owning and using PCs. /shrug
 
Just mentioning a few things again in case they got lost in the wash...
---------------------------

VRR Near Blacks reportedly have nasty gradients around whites
====================================================
I'm still not convinced that adjusting the gamma will fix the VRR "brighter near blacks" issue, since in some reports the issue presents itself as gradients around white areas. They are saying something weird is going on where the dithering is turned off and bright gradient areas show up around white objects on an otherwise pure black screen.

https://www.avsforum.com/forum/40-o...9-dedicated-gaming-thread-consoles-pc-31.html

"I've actually just noticed it for the first time over the weekend, and it really stuck out and smacked me in the face. It really bugged me. So much so, that I disabled G-sync for the time being. The game was Ori and the Will of the Wisps. The gamma is all jacked up when G-S ync is on, highly visible in the map screen as posterization in the gradients surrounding anything white (like an icon, or the cursor) on the otherwise pure black screen. The same thing happens during gameplay in the Mouldwood Depths area in game that takes place in almost pure darkness; nasty gradients surrounding the illuminated areas. Pure black is pure black (no glow) but it's obvious there's something wonky going on.

--------------------------------------------------------------------------------

Settings to avoid ABL in SDR mode for SDR content.
================================================
According to RTings CX review, the CX has aggressive ABL like the C9, E9.
"The CX has decent HDR peak brightness, enough to bring out highlights in HDR. There's quite a bit of variation when displaying different content, and it gets the least bright with large areas, which is caused by the aggressive ABL. "
That is how it is with HDR.
With SDR, there is a Peak Brightness setting; since it limits the peak brightness, it doesn't seem compatible with HDR.
[screenshot: the SDR Peak Brightness setting in the OSD]
From the Rtings C9 Review, regarding SDR settings concerning ABL:
"If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."
 
Yeah, I can't stand that QUANTUMTV channel. He's one of those people who has more rhetoric and controversy than facts/analysis; and he has some weird hard-on about his life's mission to bash LG. In short, a hack.
Considering the very poor FALD contrast in the 2020 sets with 1 HDMI 2.1 port, the glow (or dim corona) halos in FALD zone balancing along with the other weaknesses shown, OLED ABL is a fair tradeoff. I didn't even think it was that bad in the hate video personally, and he was playing an SDR game off of a console or emulator, which means he could have easily set SDR mode to never have ABL. A more relevant example would be ABL kicking in on HDR content side by side with the color-temperature-mapped version.

I thought there might be a 2020 version of the Q9 but with HDMI 2.1, so I was looking out of curiosity for a future living-room purchase. The Q9FN has native contrast of 6055:1, but more importantly, with FALD active it has a 19,018:1 contrast ratio. The 2020 QLED/LED LCD sets with 1 HDMI 2.1 port that I mentioned are horrible by comparison, and their game mode makes them even worse.

Edit:
The Q90/Q90R RTings review quotes an 11,200:1 contrast ratio with FALD active, so there is that one at least.
"The Q90 has a great local dimming feature. There's very little blooming, but it tends to dim the edges of bright objects, causing a vignetting effect, and small highlights like stars are crushed. In Game Mode, the local dimming doesn't react as quickly to changes in a scene, leading to more visible blooming. "

" Unfortunately the TV's 'Ultra Viewing Angle' optical layer makes the pixels hard to see clearly. We observed the same issue on the Q900R pixel photo. "

"HDMI 2.1 : UNKNOWN"

--------------------------------------------------------------


There are no HDMI 2.1 cables, only bandwidth :)

Can't really come up with any good ideas unfortunately, just that I know for a fact that custom resolutions at 120 Hz work.

The current HDMI 2.0b cables for 4K carry 18 Gbps.

HDMI 2.1 cables have 48 Gbps of bandwidth.
https://www.hdmi.org/spec21Sub/UltraHighSpeedCable

https://www.hdmi.org/spec/hdmi2_1
"Q: Will existing HDMI High Speed Cables deliver the HDMI 2.1 features also?

A: Existing HDMI High Speed Cables with Ethernet can only deliver some of the new features, and the new Ultra High Speed HDMI Cable is the best way to connect HDMI 2.1 enabled devices to ensure delivery of all the features with improved EMI characteristics."
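To put rough numbers on those cable figures, here's a back-of-the-envelope sketch (my own approximation: blanking intervals and HDMI's line-coding overhead are ignored, so real-world requirements come out somewhat higher than these raw rates):

```python
# Approximate uncompressed video data rate for a given mode. Blanking
# intervals and HDMI line-coding overhead are ignored (assumption), so
# real-world link requirements are roughly 10-20% higher than these.

def data_rate_gbps(width, height, hz, bits_per_channel=8, chroma="4:4:4"):
    """Raw video bandwidth in Gbit/s."""
    # Average channel samples per pixel: 3 for 4:4:4, 2 for 4:2:2, 1.5 for 4:2:0
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits_per_channel * channels / 1e9

# HDMI 2.0 signals at 18 Gbps on the wire; HDMI 2.1 FRL goes up to 48 Gbps.
print(f"4K  60 Hz  8-bit 4:4:4: {data_rate_gbps(3840, 2160, 60):.1f} Gbps")
print(f"4K 120 Hz  8-bit 4:4:4: {data_rate_gbps(3840, 2160, 120):.1f} Gbps")
print(f"4K 120 Hz 10-bit 4:2:0: {data_rate_gbps(3840, 2160, 120, 10, '4:2:0'):.1f} Gbps")
```

4K 60 Hz 8-bit RGB lands around 12 Gbps, which is why it fits HDMI 2.0, while 4K 120 Hz 8-bit RGB is roughly double that and needs an HDMI 2.1 link.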
 
I really hope the popularity of these TVs gets Nvidia to reimplement 10-bit support in their consumer GPUs. They must already be aware of the demand. Has anyone posted this to their user forum?
They have for DP; for HDMI, not yet, but they very likely will for HDMI 2.1 to ensure proper HDR support.
 
On the note of the whole 10-bit thing: I have an Acer X27 which does 10-bit at 98 Hz or 8-bit at 120 Hz, both modes RGB 4:4:4. You can run HDR content at either setting, 8- or 10-bit, and honestly I can't really tell a major difference in banding between them. However, the X27 doesn't have a TRUE 10-bit panel; it's actually 8-bit + FRC, so that could be a contributing factor, or I'm simply just blind. I'm not the only one though, as there have been others who have also been able to support my claims of not seeing a huge difference between 120 Hz 8-bit and 98 Hz 10-bit in HDR games. Perhaps if someone can give me a specific scene in an HDR game that clearly shows more banding in 8-bit vs 10-bit, I'd gladly test it out.
 
It seems like a lot of forms of compression: they don't necessarily make something unwatchable or un-listenable, but they are still cutting the source material down from the original 1:1 fidelity. Some of us demand, or at least strongly prefer, source fidelity, i.e. uncompressed and uncut, aka "lossless", wherever possible, so it becomes a factor in our purchases of displays and audio hardware, especially where other models are available that do not force "lossy" transmission.

I'm not sure exactly what the order of fidelity downgrade would be but I'm guessing something like:

Source/Lossless > 8bit dithered > DSC > lower chroma > upscaled lower resolution (without AI upscaling/DLSS).

--------------------------------------------------------------------------------------
8bit Banding vs 10bit on 10 bit content

"From Nvidia's own white paper in 2009: "While dithering produces a visually smooth image, the pixels no longer correlate to the source data "

From nvidia's studio drivers pages, showing the difference between 24bit (8bit) color banding and 30bit (10bit) color on a 10bit panel. You can use dithering to add "noise" as a workaround on 8bit in order to smooth/smudge/haze the banding out but it will degrade from 1:1 "lossless" source material fidelity.

"By increasing to 30-bit color, a pixel can now be built from over 1 billion shades of color, which eliminates the abrupt changes in shades of the same color. "

[images: Nvidia's 24-bit vs 30-bit color banding comparison]

Your viewing distance would definitely matter in how visible degradation of the original source material would be when using 8-bit, 8-bit dithered, or lower chroma, just as seeing the difference between 1080p, 1440p, 4K, and 8K native panels is relative to your viewing distance. Since these are going to be used as monitors by some of us at around a 4' viewing distance, rather than from 8' to 10' away on a couch, that could be very visible (even then, people notice and complain about banding).

Lower chroma definitely affects text detail/fidelity and so will affect high-detail textures and photos, etc., and as shown in the example below it also affects graphics in games. Whether that bothers you or not is another matter.
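To make the chroma cost concrete, here's a small sketch of my own using the standard J:a:b bookkeeping: it computes the average samples transmitted per pixel in each mode, which is both the bandwidth factor and the reason single-pixel color detail (like colored text edges) smears below 4:4:4:

```python
# Average transmitted samples per pixel under J:a:b chroma subsampling.
# Luma (Y) stays full resolution; the two chroma channels (Cb, Cr) are
# shared across pixels, which is why 4:2:0 carries half the data of
# 4:4:4 and why fine colored detail loses sharpness.

def samples_per_pixel(mode):
    j, a, b = (int(x) for x in mode.split(":"))
    luma = 1.0                       # every pixel keeps its own Y sample
    chroma = 2 * (a + b) / (2 * j)   # Cb + Cr sites averaged over a 2xJ block
    return luma + chroma

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    print(mode, samples_per_pixel(mode))
```

So 4:2:2 transmits 2 samples per pixel and 4:2:0 only 1.5, versus 3 for full 4:4:4.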
 
Last edited:
There SHOULD be a difference, but I haven't really noticed any, at least not in games. However, the number of HDR games I've actually played on it is very small... maybe 2 or 3, I think, so it's not like I've had much experience seeing 8-bit vs 8-bit + FRC.
 
OK. I've played HLL at 3840x1600 because on Epic settings it's demanding at 4K. But there was no way I could get it higher than 60 Hz. After that I've tried 3840x1440. Also no success. Maybe I'm doing it wrong.

1440p at higher Hz, yes. But only at 2560. Haven't tried 3440.

This is my experience on the C9 as well. Only 1920x1080 and 2560x1440 work at 120 Hz, custom resolutions like 3840x1080, 3840x1440 and 3840x1600 only work at 60 Hz until we get HDMI 2.1 GPUs.
 
No. I've just replaced it with a proper 2.1 cable to prep for my 3080 Ti's arrival :)

If I set a higher frequency it'll not pass the test; it'll flicker to black and back. And if I set the frequency to 60 it works.
I've done some additional testing. I think all custom resolutions work (tested many normal monitor formats without scaling), but ONLY if the frequency is 60 Hz or less. Also, I've tested G-Sync in the pendulum demo, for instance at 3440x1440. It smooths out significantly. It works up to 60 Hz... anything above will cause tearing again.
 

First you said that custom resolutions did not work at all above 60 Hz, but now you say you have tearing, which would be hard to see if you had no picture :)

Are you maybe referring to custom resolutions with working VRR or something like that? I've not tried that, as I currently have no GPU that supports VRR via HDMI.
 
This is my experience on the C9 as well. Only 1920x1080 and 2560x1440 work at 120 Hz, custom resolutions like 3840x1080, 3840x1440 and 3840x1600 only work at 60 Hz until we get HDMI 2.1 GPUs.
Even with scaling set to GPU? Very odd if so.
 
LG CX48 prices are likely to drop rapidly.
In Finland, LG CX48 is now offered for EUR 1399 (EUR 1599 original price)
 
Yes. It refuses to output anything higher than 60 Hz.

Will try this again when I can find the time for it; perhaps something has changed recently. Just remember that 120 Hz uses a lot more bandwidth than 60 Hz, so a high resolution with 120 Hz, 4:4:4, HDR, etc. is not doable without HDMI 2.1. But I would imagine that most people in here already knew that :)
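Under that same bandwidth reasoning, you can sanity-check which of the modes people in this thread have been trying fit HDMI 2.0 at 120 Hz. This is my own back-of-the-envelope sketch: it ignores blanking overhead, assumes roughly 14.4 Gbps of video payload (18 Gbps wire rate minus 8b/10b coding), and note that a set can also refuse timings its EDID doesn't advertise even when the bandwidth would allow them:

```python
# Which resolutions fit HDMI 2.0 at 120 Hz, 8-bit RGB? Assumptions:
# ~14.4 Gbps of video payload after 8b/10b coding, blanking ignored.
# A display may still reject timings its EDID doesn't list.

HDMI20_PAYLOAD_BPS = 14.4e9

def fits_hdmi20(width, height, hz=120, bits_per_channel=8):
    return width * height * hz * bits_per_channel * 3 <= HDMI20_PAYLOAD_BPS

for res in ((1920, 1080), (2560, 1440), (3840, 1440), (3840, 1600), (3840, 2160)):
    verdict = "fits at 120 Hz" if fits_hdmi20(*res) else "needs HDMI 2.1"
    print(f"{res[0]}x{res[1]}: {verdict}")
```

This lines up with the reports in the thread: 1080p and 2560x1440 fit at 120 Hz on HDMI 2.0, while the 3840-wide custom resolutions and full 4K do not.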
 
I am very confused about 4K 120 G-Sync not working, because it should work, right? The G-Sync range is 40-120... Is it possible that it is a firmware bug or an Nvidia driver issue? Because of that it's a no-buy for me.
 

How do we know it does not work?
 
I am very confused about 4K 120 G-Sync not working, because it should work, right? The G-Sync range is 40-120... Is it possible that it is a firmware bug or an Nvidia driver issue? Because of that it's a no-buy for me.
In the Nvidia pendulum demo at 4K 120 Hz you cannot enable G-Sync. The motion is smooth though... and there is NO tearing. I've tested it at 40-115 fps.

If you enable the G-Sync compatible indicator in the Nvidia control panel, it will NOT show up at 4K 120 Hz... if I switch to supported resolutions it shows. So I think the pendulum G-Sync indicator aligns with the one set up through the Nvidia control panel.

If you set the display to 3840x1440, pendulum shows G-Sync is enabled. But I still get massive tearing. So it might not actually work with custom resolutions.
 

And you have a GPU that supports it?
 
In the Nvidia pendulum demo at 4K 120 Hz you cannot enable G-Sync. The motion is smooth though... and there is NO tearing. I've tested it at 40-115 fps.

If you enable the G-Sync compatible indicator in the Nvidia control panel, it will NOT show up at 4K 120 Hz... if I switch to supported resolutions it shows. So I think the pendulum G-Sync indicator aligns with the one set up through the Nvidia control panel.

If you set the display to 3840x1440, pendulum shows G-Sync is enabled. But I still get massive tearing. So it might not actually work with custom resolutions.

I would expect it to work fine with custom resolutions as long as GPU scaling is used. To the display it should be the same as receiving 4K input, if I understand it correctly, which is also why it is limited to 60 Hz. On some previous firmware version I would get these purple anomalies if I tried to set the display to 4K 120 Hz, and the same was true for custom resolutions.
 
I use mine for 6-8 hours per day on average since starting WFH in December 2018 and I do notice the ABL if there is a lot of white on the screen, but unlike Seyumi it doesn't bother me in the least. It is good to point out that it exists, as it could be annoying to some, but I am not bothered by it and usually welcome the reduction in brightness, because even at the lower brightness settings I run, these can feel like staring at the sun if you have light sensitivity, which my eyes seem to (I frequently find myself squinting when outside on bright days, lol). I would say that this will vary per person. It is certainly not a dealbreaker for PC use for me. I use mine for everything and love it - by far the best monitor I've ever had in 25 years of owning and using PCs. /shrug

The key to success is to not run things fullscreen, which in most cases is pointless anyway, even though lots of people still do it because they have always done it. Regardless of whether one is using an OLED or not, I would recommend installing something like PowerToys and dividing the screen into different areas, as it is not that often you really need 4K fullscreen (at least I don't). Besides being more productive, that will also improve the ABL situation, especially with some theme changes. PowerToys is free and it also remembers the position and size of applications (if you want it to), so you won't have to constantly rearrange things.

https://github.com/microsoft/PowerToys
 

Window management is a great thing when you have a lot of desktop real estate. :) There are other apps out there like Divvy (if that one is still a thing) and a few others. Winaero Tweaker has some neat stuff for general Windows 10 things too.

DisplayFusion Pro is the app I use for window management and multi-monitor functionality. Along with a large number of other things it can do, it remembers window positions and you can add hotkeys for window placement. There are a lot of other window/desktop functions using its built-in pre-made scripts, finding more on the forum, editing a little, etc. I use a bunch of DisplayFusion hotkeys with a Stream Deck so I can pop windows all over my 3 monitors (mainly my two 43" 4K monitors).

Streamdeck layout I made with some screenshots and a little simple editing below. I put it in quotes to keep the reply tight.
[image: Stream Deck layout screenshot]

..The 4k quadrants are not the same size intentionally, with one larger quadrant and 3 more tiled to fill the rest, leaving one small window area "gap" in the middle top if I have 4 windows going.
..When I click each named monitor button it moves the active window to a full height centered or other favorite position I set specifically to each monitor on each of those "monitor buttons".
..The corner XL is for media generally when I want a big window but not fullscreen, it leaves about 25% of the screen available at the top for tiling my screenshot app, several chat windows, and foobar across in a row.
..The up left and mid left come in handy for getting a hold of misc apps.
..The 40% left / 60% right buttons are not only useful for splitting the screen perfectly on the fly, but they correspond to the quadrant window widths. That means they can fill the left or right side of a quad of windows without covering the monitor's other two open windows.

I also use a few other arrays of buttons that I get to with that "level up" arrow. I set up some apps so that when I click their Stream Deck button it focuses the app. I'm working on figuring out a script so that when I click the app's Stream Deck button it will check to see if it's open, open it if not, restore it if minimized, or minimize it... all with one button. There are a few functions that try to do a few of those but I haven't found a pre-made one that does all of that yet, so I'll have to experiment. For now I can, for example, click the "foobar" icon to focus it, then use a play/pause toggle button whose graphic changes ( >, || ), plus forward and back buttons.

The focusing of apps would come in handy with the window management buttons, but as it is now I only have 11 keys at a time with the recursion button available, so I have to jump up to the main directory/level and then back down to the app buttons. I was thinking of getting a Stream Deck XL someday if they go on sale again since that has a lot more buttons, but now that I've been watching a few YouTube videos about it, it might work better just adding a second Stream Deck so that only half of the buttons change when you recurse directories.

There is a lot more stuff you can do with both DisplayFusion Pro and Stream Deck themselves, and with other software like OBS, etc., if you wanted to put a lot of time into it - but you don't really have to go crazy to get some very convenient functionality. Also worth mentioning that there is a Stream Deck app for phones and tablets, so I'm pretty sure you don't have to buy one if you just want to try it out that way.

I also use a 3rd party dual-pane file manager and set the background to a darker medium grey. I already mentioned the browser stuff I use like "Turn off the Lights" and "Color Changer".

--------------------

I agree that you don't need to be running things fullscreen when you have a ton of desktop real estate on wall(s) of high-res monitors available. However, I think cutting 4K into different windows could in some cases be worse for an OLED than running something fullscreen because of the window borders, even if you use a custom theme or window-tweaking app to make them tiny. Apps with their own taskbars, floating toolboxes, parameter windows, etc. are all bad static elements. With a lot of apps you still don't have the option of running a dark theme at all, especially design suites, office apps, and utility apps. As mentioned a few replies back, you can avoid ABL with specific settings, so breaking the space down into windows to hit a different screen-area-vs-brightness window probably wouldn't be necessary in SDR. You could probably use those settings on a specific screen mode (e.g. PC mode or another named mode rather than Game mode or movie mode).

From the C9 review, but the CX has the same setting in the OSD: "If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes)."


Personally I'm intending to use the OLED as a media "stage" (PC and console games, full screen videos, streams, art/graphics/image slideshows, audio visualizations, etc) with a black wallpaper and no taskbar or icons on it. There is supposed to be a 3rd party app that can hide the taskbar sliver completely that I linked earlier in the thread. I'm not going to be leaving static app windows and text on the OLED. It is going to be part of an array with other monitor(s) for that stuff.



Quote below has the links to the taskbar-hiding app (also mentioning that DisplayFusion has its own taskbars for side monitors). There is also a link to another taskbar app that keeps it hidden and uses a show/hide toggle hotkey you can set to whatever you like.

DisplayFusion Pro has a full-featured taskbar that includes all of the Windows taskbar functions and more. I've been using DisplayFusion Pro for years for all of its multi-monitor functions. The DisplayFusion Pro taskbar completely hides itself when set to auto-hide.
https://www.displayfusion.com/Features/Taskbar/

Thread from 2020 with the displayfusion author referencing the 2nd thread as being a valid workaround:
https://www.displayfusion.com/Discu...f39s/?ID=46c7fa88-585f-463a-adec-1acaebb5a1b9

The original displayfusion forum thread with the author detailing how to hide the windows taskbar on the primary monitor completely in order to replace it with displayfusion's taskbar there:
https://www.displayfusion.com/Discu...kbar/?ID=3d0cad5b-2ab0-4b2d-acbc-cd2b45389c38


------------------------------------------------------

Taskbar Magic is referenced by most of the pages that mention completely hiding the default Windows taskbar. You can then add the full-featured and completely hideable DisplayFusion taskbar in its place (or set the primary monitor's apps to show on the DisplayFusion taskbars on other monitors in an array).

https://www.404techsupport.com/2015/09/16/hide-taskbar-start-button-startup/




https://superuser.com/questions/219605/how-to-completely-disable-the-windows-taskbar/641719

reddit r/windows/comments/atj5l2/is_there_a_way_to_hide_the_taskbar_completely/

-------------------------------------------------------

You can also use Taskbar Hider, which sets the show/hide of the taskbar to a hotkey toggle, but I don't think this one gets rid of the sliver of taskbar showing by itself.

http://www.itsamples.com/taskbar-hider.html

------------------------------------------------------

Use any at your own risk but they seem to be working methods.

https://www.dropbox.com/s/480hjn1n0thpl3j/Taskbar-Magic.zip

https://www.displayfusion.com/
 
DisplayFusion is great software. I augment it with the free MS PowerToys FancyZones, which lets you quickly drag and drop windows to different regions, resizing them to those regions. I would consider these essential on super-ultrawide displays because Windows is not at all designed to work with them.
 
I guess I didn't consciously register the fact until just now: after I installed the Translucent Taskbar app (TranslucentTB) a while ago and set my taskbar to hide, being 100% see-through it doesn't leave the taskbar sliver anymore. As long as the icons are hidden there is nothing static in the taskbar area at all.

TranslucentTB - Microsoft Store


You can also install Taskbar Hider to always hide the taskbar and show/hide toggle it with a custom hotkey.

http://www.itsamples.com/taskbar-hider.html

A few things I found out with taskbar hider after experimenting with it:

- You have to close and restart the app in the system tray if you change the hotkey for the new hotkey to work.
- If you have Windows set to hide the taskbar automatically ("hide taskbar in desktop mode"), the Taskbar Hider app only works as a "lock" on the taskbar state, e.g. hiding the taskbar even on mouseovers until you hit the hotkey to unlock it again.
- If you have Windows' "hide taskbar in desktop mode" turned off, the Taskbar Hider app's hotkey acts as a toggle, showing or hiding the taskbar. It stays locked in either position until the hotkey is hit again.
- Unlike when your taskbar is hidden normally, hitting the Windows key to bring up the Windows menu does not "wake up" and show the taskbar anymore. That's a good thing to me.
- You can still just alt+tab / shift+alt+tab (reverse), ctrl+alt+tab (leaves the open-apps selection window open even after the keys are released), and win+tab opens tiles of all open apps on all monitors to choose from, so unless you are launching an app from the taskbar or right-clicking its icon, you could just keep the taskbar hidden more often.
- There is about a 1 to 2 second delay on bringing the taskbar up, but I haven't found that bothersome. It hides pretty instantaneously.
- It can show/hide the Windows taskbar on all monitors in the array if you run it in admin mode.
- It obviously doesn't work with the DisplayFusion Pro taskbars on other monitors, but DisplayFusion has its own hotkeys and functions if you need that.

I set my Taskbar Hider hotkey to ctrl + shift + Z since I'm unlikely to use modifier keys with Z in a game much, and Z is at the bottom of the keyboard so it seems to fit there. I could set it to a Stream Deck key but that wouldn't be as quick.
 