LG 48CX

After doing some reading, it looks like I really won't have to worry about burn-in for a good number of years, since I'll be using it in a dark room with a brightness setting (for SDR) equivalent to about 100 cd/m2. Most people seem to be using their sets with far, far brighter settings. That, and the improvements they keep making on the newer models.

Of course when watching HDR content it will "burn" a bit but that is really going to be very far from 100% of the time.
 
After doing some reading, it looks like I really won't have to worry about burn-in for a good number of years, since I'll be using it in a dark room with a brightness setting (for SDR) equivalent to about 100 cd/m2. Most people seem to be using their sets with far, far brighter settings. That, and the improvements they keep making on the newer models.

Yeah, that's why I asked about the OLED light setting. The information that I was given by a pretty knowledgeable OLED monitor user here a couple of years ago indicated that a high OLED light level setting accelerates retention, but the brightness setting does not. It was my assumption that the OLED light setting changes how much voltage goes to each pixel or something, but I followed his advice and haven't had any burn-in.

He recommended that people who are using them as monitors keep the OLED light level no higher than 30 for full time use with lots of static content. After some experimentation, I found that even 20 was too blinding for me. :) I initially tried a setting of 8 before settling on 12. Although, even that can feel too bright to my eyes when there's lots of white on the screen, so I've been meaning to back it down to 10 and see how that feels. Obviously it'll depend on your individual eyes and viewing environment, but hopefully this helps someone. I'm sure it's been discussed to death over at AVS Forum, but I haven't seen too many people mention it here.
 
Yeah, that's why I asked about the OLED light setting. The information that I was given by a pretty knowledgeable OLED monitor user here a couple of years ago indicated that a high OLED light level setting accelerates retention, but the brightness setting does not. It was my assumption that the OLED light setting changes how much voltage goes to each pixel or something, but I followed his advice and haven't had any burn-in.

He recommended that people who are using them as monitors keep the OLED light level no higher than 30 for full time use with lots of static content. After some experimentation, I found that even 20 was too blinding for me. :) I initially tried a setting of 8 before settling on 12. Although, even that can feel too bright to my eyes when there's lots of white on the screen, so I've been meaning to back it down to 10 and see how that feels. Obviously it'll depend on your individual eyes and viewing environment, but hopefully this helps someone. I'm sure it's been discussed to death over at AVS Forum, but I haven't seen too many people mention it here.
On an OLED screen the Brightness setting adjusts the luma (the video signal level / black point), while OLED Light controls the luminance (how much light the panel actually emits). On a backlit LCD display the brightness setting often affects both. Some backlit LCD televisions, like those from Samsung, separate backlight luminance and pixel luma the way an OLED does.
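Just to illustrate the difference (a simplified, made-up model, not LG's actual processing), here's roughly how the two controls act on the signal: Brightness shifts the video level (luma / black point) before the gamma curve, while OLED Light scales how much light the panel puts out afterwards. The peak_nits value and the scaling factors below are placeholders, not measured numbers.

```python
# Simplified, illustrative model only -- not LG's actual pipeline.
# "brightness" offsets the video signal (luma / black level),
# "oled_light" scales peak panel luminance (light output).

def panel_output_nits(signal, brightness=50, oled_light=100, peak_nits=400, gamma=2.2):
    """signal: 0.0-1.0 video level; brightness 0-100 (50 = neutral);
    oled_light 0-100; peak_nits: assumed full-white output at OLED Light 100."""
    # Brightness shifts the signal around its neutral point (black-level offset)
    shifted = min(max(signal + (brightness - 50) / 500.0, 0.0), 1.0)
    # Gamma expansion approximates the SDR transfer function
    linear = shifted ** gamma
    # OLED Light scales how many nits the panel emits for a given signal
    return linear * peak_nits * (oled_light / 100.0)

# Lowering OLED Light dims everything (less pixel stress);
# lowering Brightness mostly pushes shadow detail toward black instead.
print(panel_output_nits(1.0, oled_light=100))   # ~400 nits full white
print(panel_output_nits(1.0, oled_light=30))    # ~120 nits full white
print(panel_output_nits(0.05, brightness=45))   # near-black detail gets crushed darker
```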
 
I use these on Firefox. I used to use Chrome sometimes, but Chrome still does a white page-load flash almost every time, so I stopped using it for most things.

This puts a little brightness slider in the bottom-right corner of the web page. You can turn it on or off per page via an icon on the browser's toolbar. It also has a whitelist/blacklist in the settings (I set it so hardforum doesn't get affected, for example, since it's dark by default).
https://addons.mozilla.org/en-US/firefox/addon/turn-off-the-lights/

This changes the colors of the background and text on sites with a sort of color wheel and a quick color brightness slider for it:
https://addons.mozilla.org/en-US/firefox/addon/site-color-changer/

I also keep a zoom + / - tool on the firefox toolbar since a lot of pages have jumbo text.

Using those, setting the Windows 10 color preference to dark or using dark themes, and using custom grey backgrounds in the Directory Opus third-party file manager, the EditPad Lite notepad app set to a grey background, dark-themed chat apps, etc., I have no problems with screens being overly bright or eye-fatiguing at my preferred OSD brightness settings for everything else, even with two big 43" 4k screens at my desk.

Also, though it was hinted at, remember that when it comes to HDR, most of an HDR scene stays within the same SDR range, even in the game example below which shows color mapping up to 10,000 nit HDR. HDR is more commonly about point sources, edges, scintillation, highlights... and maintaining detail across the varying fields of color on brightly colored objects (and dynamically in relation to movement and light sources in a scene), instead of capping it to a blob of color at the SDR mode limit or a low-nit quasi-HDR display's color brightness ceiling.

[attached image: game example showing HDR color mapping up to 10,000 nits]
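To put rough numbers on that (my own back-of-the-envelope math using the SMPTE ST 2084 PQ curve that HDR10 uses, not values from the image above): 100-nit SDR-level content already occupies about half of the PQ signal range, so most of a typical scene sits in SDR territory while the upper part of the range is reserved for small, bright highlights.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal 0..1.
# Constants are from the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_signal(nits):
    y = max(nits, 0.0) / 10000.0          # normalize to the 10,000-nit PQ ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

for nits in (100, 400, 1000, 4000, 10000):
    print(f"{nits:>6} nits -> PQ signal {pq_signal(nits):.3f}")

# Roughly: 100 nits (typical SDR reference white) already uses ~0.51 of the PQ
# range, 1,000 nits ~0.75, and the last ~25% of the code range covers
# 1,000-10,000 nits, i.e. small speculars and highlights.
```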
 
$1,499? Wow, at that price you don't even need to wait for the sales. Also makes it less likely that the 48CX will see as deep a price cut during the holidays. That's a solid win for gamers, right there.

With that, LG didn't just put all these gaming monitor manufacturers to shame, they took a machete and butchered them hard.

Folks, we just entered a new desktop and PC gaming era. For years we have waited, and we have finally passed through the gates to nirvana.
 
My Best Buy card is ready lol... damn, June though... I had hoped next month at the most. I'm using a crappy 23-inch Dell right now and it's killing me. Might go steal my bedroom Samsung 40-inch off the wall lol.
 
The "Gaming monitor" killer.
Just add a hardware G-Sync module and make it in a 43" size for total annihilation.

A G-Sync module would provide no benefit for an OLED. Variable refresh overdrive wouldn't be used or mean anything, since OLEDs don't have or need overdrive.

Even if there were an input lag difference, it would be so minuscule as to not be worth considering.
 
Gah this is going to be the longest 3 months ever. Would've been super nice to have it now since I'm working from home atm due to covid19.
 
A G-Sync module would provide no benefit for an OLED. Variable refresh overdrive wouldn't be used or mean anything, since OLEDs don't have or need overdrive.

Even if there were an input lag difference, it would be so minuscule as to not be worth considering.
I think it would, for V-Sync. There is a difference between cutting the number of frames in half and displaying each of them regardless of the framerate, as in syncing every frame. If you play without V-Sync then it wouldn't matter, and you can enjoy your tearing.
 
Are you talking about the g-sync frametime compensation with v-sync on?

I think that's actually one thing we might lose from the lack of hardware module because from what I'm reading it's not something Freesync or "g-sync compatible" monitors have. But don't quote me on this (yet), it's hard to find any info about it and maybe it just depends on the specific monitor. And perhaps it can be dealt with via the drivers.

If not it would cause an occasional tearline near the bottom of the screen when very close to the refresh rate. Could deal with it by capping the framerate at the highest value where it doesn't happen (with 120hz that's probably around 110 or something).
 
Are you talking about the g-sync frametime compensation with v-sync on?

I think that's actually one thing we might lose from the lack of hardware module because from what I'm reading it's not something Freesync or "g-sync compatible" monitors have. But don't quote me on this (yet), it's hard to find any info about it and maybe it just depends on the specific monitor. And perhaps it can be dealt with via the drivers.

If not it would cause an occasional tearline near the bottom of the screen when very close to the refresh rate. Could deal with it by capping the framerate at the highest value where it doesn't happen (with 120hz that's probably around 110 or something).
Yeah, that's it: G-Sync frame synchronization with V-Sync ON. The G-Sync compatible monitors can also do that, but only starting at 40 fps. So when you go below 40 fps you start getting stutter, because your actual frame rate gets halved by the V-Synced buffering... This doesn't happen with hardware G-Sync, because it starts synchronizing frames from 1 fps! So the smoothness of hardware G-Sync (with V-Sync ON) is impeccable.
I don't care about input lag. I am perfectly fine with anything below 15ms. And I know that OLEDs have the best pixel response times on the market.
 
Solution: don't go under 40fps. That is molasses anyway. Even with G-Sync duplicating frames under 40fps, you are only seeing the same frozen frame snapshots through multiple screen refreshes. For example, when your graph is at 30fps you are seeing the same frame held for 4 refreshes on a 120Hz monitor. That's 8.3ms x 4 = 33ms per frame, and that is page-y sludge even if G-Sync is avoiding tearing. Even having a low of around 40 with a graph like 40 <~~ around 70 average ~~> 100 isn't a very good graph for a high-Hz-capable monitor imo. That 60 to 70 fps average isn't really doing much with 120Hz capability. Under 40 would be what? 30 ~60~ 90? 20 ~50~ 80? Ugh.

-You get half the sample-and-hold blur (image persistence) while moving the whole viewport around when running 120fps solid at 120Hz. That brings it down to a softer blur, and you get double the motion definition/pathing articulation compared to 60fps-Hz (2 frames for every 1 shown at 60fps/Hz). That's 8.3ms per frame.
-You get about 40% blur reduction at 100fps solid at 100Hz, and your motion definition compared to 60fps-Hz is 5 frames for every 3. That's 10ms per frame.
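Here's the persistence math behind those numbers as a quick sketch (same sample-and-hold reasoning, assuming no BFI/strobing):

```python
# Sample-and-hold persistence math (no BFI/strobing). Each unique frame is
# held on screen for 1/fps seconds, so blur scales with the hold time.

def persistence_ms(fps):
    """How long each unique frame stays on screen, in milliseconds."""
    return 1000.0 / fps

def blur_vs_60(fps):
    """Relative sample-and-hold blur compared to 60 fps (1.0 = same, 0.5 = half)."""
    return 60.0 / fps

for fps in (30, 60, 100, 120):
    print(f"{fps:>3} fps: {persistence_ms(fps):5.1f} ms per frame, "
          f"{blur_vs_60(fps):.2f}x the blur of 60 fps")

# On a fixed 120 Hz refresh (e.g. V-Sync without VRR), a frame rate that divides
# evenly just repeats frames: 30 fps means the same frame is scanned out
# 120 / 30 = 4 times, i.e. 4 x 8.3 ms = ~33 ms of the same image.
```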

Shooting for a 100fps average results in something like a 70 <~~ 100 ~~> 130 fps/Hz graph, hopefully on a 3080 Ti in the future with HDMI 2.1. A 100 - 120 - 150 graph capped at 145 would be even better, but a 100fps-Hz average is typically hard enough to achieve at 4k resolution. You could also run an ultrawide 3840x1600 rez (24:10) for gaming on a 48" LG CX to get a wider-FoV aspect and a slightly higher frame rate than full 4k would give.

-A few games use dynamic resolution like consoles do, so you can set a minimum frame rate or a frame rate target where the game will dynamically lower the render resolution during the very GPU-demanding parts of the game. I'm not a big fan, but it does seem to work and should avoid dropping below a FreeSync minimum.
-Checkerboarding is another console trick that saves frame rate/gpu budget but I haven't heard of it on pc games.
-Other techniques used by VR systems include a form of interpolation that cuts any framerate below 90fps down to 45 and then doubles it back to 90 (with less motion definition, of course), or filling in the next frame with a reprojected guess based on the viewing direction and motion flow.
-Nvidia, AMD, and Facebook/Oculus are all working on machine-learning tech that upscales a lower resolution to look like a higher-fidelity resolution's render, saving a lot of GPU power (some say over 60% more).
-Eventually there will hopefully also be very advanced interpolation multiplying frame rates by 3x or more without artifacts or a noticeable input lag increase, so that after using some of the other tricks listed above, you could get a decent motion definition of 90 or 100 raw fps (minimum) and crank it up to 180 to 300 so you'd never go below the max refresh rate of the monitor. That could be farther off though, and potentially tied to future monitor generations.
 
-Checkerboarding is another console trick that saves frame rate/gpu budget but I haven't heard of it on pc games.

Apparently DLSS 2.0 is actually very good, although it did take quite a while to finally reach a point where it's useable.



I would imagine a lot of people are going to be pairing up this bad boy with a 3080 Ti class gpu so in theory we should already have the power to do ~90/100fps average in most games perhaps with a few minor graphical tweaks but I guess DLSS 2.0 will now give us that extra boost that we need once it gets deployed in those newer next gen games. Especially once you factor in ray tracing performance hits.
 
Apparently DLSS 2.0 is actually very good, although it did take quite a while to finally reach a point where it's useable.



I would imagine a lot of people are going to be pairing up this bad boy with a 3080 Ti class gpu so in theory we should already have the power to do ~90/100fps average in most games perhaps with a few minor graphical tweaks but I guess DLSS 2.0 will now give us that extra boost that we need once it gets deployed in those newer next gen games. Especially once you factor in ray tracing performance hits.


DLSS 2.0 sounds nice and all, but I worry about input lag increases. COD Warzone is incredible, and I'm planning to drop the in-game render scale to 75% of 4k to get higher FPS if I need to. The 55-inch models are hitting shelves soon, so I hope to hear about the BFI soon. I hope Rtings gets one ASAP.
 
Has anyone tried segmenting larger screens like this so that you're only playing on part of the screen? Using it as a monitor which you're seated close to, the upper section of the screen is going to feel fairly high so might benefit from being cropped?

Below shows what it would look like at the same aspect ratio as the LG 38GL950G, albeit physically wider and taller.

monitor_01.png


I guess you could even crop it to a 34" ultrawide and use the top portion for something like browsing etc.?

monitor_02.png


Or is this folly for those who have tried it and if so, what was not so great?
 
Has anyone tried segmenting larger screens like this so that you're only playing on part of the screen? Using it as a monitor which you're seated close to, the upper section of the screen is going to feel fairly high so might benefit from being cropped?

Below shows what it would look like at the same aspect ratio as the LG 38GL950G, albeit physically wider and taller.

View attachment 230730

I guess you could even crop it to a 34" ultrawide and use the top portion for something like browsing etc.?

View attachment 230731

Or is this folly for those who have tried it and if so, what was not so great?

It would have all the issues of running games in a window, and it can be hacky to get them borderless in this case or to move them around. It's a shame there are no PbP features on these TVs.

I don't think it would be useful to use this layout for anything but some Discord messages or something. A browser up there is going to be like a landscape smartphone, so pretty awkward to use.
 
What setting is your OLED light on in the menu?

I don't think I've ever had the taskbar stay open like that. That sucks. The good news is that your set is one of the early ones and they've made improvements to the panels since then to make them less susceptible to burn-in.

For PC mode, it was apparently still set to 70; it was set to 50 for all other modes. Not sure why it was still so high on the one mode where it would conceivably be a problem.

The root problem was a lot of my emulators, which when maximized [not fullscreen] would not get rid of the taskbar. Combined with the OLED light set higher than it should have been and almost three years of use at 6+ hours per day, there's some minor burn-in that is really only noticeable on a mono-color background. I've been manually throwing the pixel refresher at it, but the retention hasn't cleared up yet (though I think it *is* getting a bit fainter...).
 
I think it would, for V-Sync. There is a difference between cutting the number of frames in half and displaying each of them regardless of the framerate, as in syncing every frame. If you play without V-Sync then it wouldn't matter, and you can enjoy your tearing.

Not sure what you are referring to. G-Sync compatible works the same as hardware G-Sync with respect to V-Sync ON. If you are referencing going below 40 FPS/Hz, well then I suggest adjusting hardware and/or game settings as that low of FPS is a terrible experience anyway.
 
Not sure what you are referring to. G-Sync compatible works the same as hardware G-Sync with respect to V-Sync ON. If you are referencing going below 40 FPS/Hz, well then I suggest adjusting hardware and/or game settings as that low of FPS is a terrible experience anyway.
I'd rather just have a G-Sync module that would synchronize frames starting from 1 fps. Also, G-Sync compatibility works only in exclusive fullscreen mode, while hardware G-Sync works in all modes, even windowed mode.
I currently play Vermintide 2, which is very demanding in all aspects - CPU, RAM and GPU. During explosions I constantly go well below 40 fps, and I am not ready to lower the settings just so the explosions go smoother. I also run this game in windowed mode, because I always alt-tab during loading screens and between matches. So a G-Sync compatible monitor would do nothing for me and this game, because I would still have stuttering during the explosions and it's not even run in fullscreen mode. Also, fullscreen and exclusive fullscreen are two different modes... And it's just one game. I am sure there are plenty of other games with the same limitations when running on G-Sync compatible displays. A hardware G-Sync module doesn't have any of these limitations.
 
Ya I think part of the problem can be windowed mode. I've never seen windowed mode work as well/fast as exclusive full-screen. I avoid non exclusive full screen whenever possible.
 
Ya I think part of the problem can be windowed mode. I've never seen windowed mode work as well/fast as exclusive full-screen. I avoid non exclusive full screen whenever possible.
Absolutely.
 
Has anyone tried segmenting larger screens like this so that you're only playing on part of the screen? Using it as a monitor which you're seated close to, the upper section of the screen is going to feel fairly high so might benefit from being cropped?

Below shows what it would look like at the same aspect ratio as the LG 38GL950G, albeit physically wider and taller.

View attachment 230730

I guess you could even crop it to a 34" ultrawide and use the top portion for something like browsing etc.?

View attachment 230731

Or is this folly for those who have tried it and if so, what was not so great?
=====================================================
I wrote this below to post here but mainly just to figure it out for myself.
Thanks for reading if anyone read it or got anything out of it.
=====================================================

I'm planning on running a 3840x1600 (24:10 ultrawide) rez on mine with bars at the top and bottom. If my calculations are correct, the visible gaming portion keeps the full panel width and comes out to roughly the height of a 35-36" diagonal 16:9 screen, which is still quite large.

Specifically, the numbers I came up with are:
17.4" tall and 41.8" wide visible display area at 24:10, compared to around
23.5" tall and 41.8" wide for the full 48" 16:9 display area.

So at about 6.1" shorter, that would mean roughly a 3" bar at the top and bottom of the normally 23.5"-tall display area, at least according to a quick dimension calculator online.
[attached image: screen dimension calculator output]
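If anyone wants to double-check or rerun the numbers, here's the geometry as a quick sketch (my own calculation for a flat 48" 16:9 panel; the letterboxed height is just the 1600-of-2160 vertical pixel fraction):

```python
import math

def panel_dimensions(diagonal_in, aspect_w=16, aspect_h=9):
    """Width and height in inches of a flat panel from its diagonal and aspect ratio."""
    d = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / d, diagonal_in * aspect_h / d

# 48" 16:9 panel (LG CX), 3840x2160 native
width, height = panel_dimensions(48)
print(f"full panel: {width:.1f}\" x {height:.1f}\"")        # ~41.8" x ~23.5"

# A letterboxed 3840x1600 region uses 1600 of the 2160 vertical pixels
visible_h = height * 1600 / 2160
bar_h = (height - visible_h) / 2
print(f"3840x1600 area: {width:.1f}\" x {visible_h:.1f}\", "
      f"bars ~{bar_h:.1f}\" top and bottom")                # ~41.8" x ~17.4", ~3.0" bars

# Equivalent 16:9 diagonal with that same height, for comparison
equiv_diag = visible_h * math.hypot(16, 9) / 9
print(f"same height as a {equiv_diag:.1f}\" 16:9 screen")   # ~35.5"
```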

I'll still work my 43" 4k VA tvs as monitors into the array somehow for use as desktop/app monitors , keeping the OLED as the media/gaming "stage". The 43" displays are around 37.5"w x 21.1" tall so they would be way too tall in portrait mode to match up, at least cleanly so they would be side ear panels at best. 48" CX 16:9 is 23.5" tall and the 43" portrait would be 37.5" tall. 14" total or 7" overhang top and bottom when centered. It could work but I'll see how it looks. For the record, a 27" 16:9 is about 23.5" wide and would match the height in portrait mode but it would be quite narrow and the text might look tiny at 1:1 zoom level due to the viewing distance required for the 48" screen of 40"+ distance imo.

It would still be a very wide array though. Even with the side 43" monitors in portrait orientation it would be up to 84" wide or more (7 feet long) if flat - though less in reality with the side monitors angled inward.
-----> I just measured my current all-landscape array of 43" - 32" - 43" 16:9 monitors with the side 43" ones angled in a bit, and coincidentally it measured exactly 7 feet to the outer edges of the bezels. So I think I should be good, at least width-wise, if I make the side ones portrait and center them relative to the 48".

What I have now:
[attached image: current monitor layout]

Replacing the 32" 16:9 with a 48" LG CX:

[attached images: three possible layouts with the 48" CX]

---------------------------------------------------------------------------------------------------------------------
In mapping the monitor layouts at multimonitorcalculator.com, I came to some interesting options and realized one of them left a space that is 14" tall and 41.8" wide. A 43.4" 32:10 monitor would fit there pretty well, since those are about 12.9" tall and 41.4" wide, leaving only about an inch of height to spare. If I could swing buying that 4th monitor price-wise I'd consider this eventually, but I'd start out with one of the three layouts I showed above, since those 32:10 aspect 43.4" monitors still go for almost $800 after tax at the moment. But wow, I'd be closer to my wall-of-monitors dream, even if it's still with bezels for now.

[attached image: four-monitor layout from multimonitorcalculator.com]

Option 4 with all 4 monitors - note that the Samsung C43J890 43.4" 32:10 has a curve to it, though it's slight. I wasn't sure how much that would shorten the width physically. By Microcenter's measurement of 43.84" physical width it would match the LG OLED's 43.8" width perfectly. By the Samsung site's spec of 41.84" physical width, it would be about an inch short on each side, but that gap could perhaps be lessened a bit, from your perspective, by moving it a little closer relative to the other monitors. If they were all on arms they should end up with some wiggle room to work with.
--------------------------------------------------------------------------------------------------------------------
 
I'd rather just have a G-Sync module that would synchronize frames starting from 1 fps. Also, G-Sync compatibility works only in exclusive fullscreen mode, while hardware G-Sync works in all modes, even windowed mode.
I currently play Vermintide 2, which is very demanding in all aspects - CPU, RAM and GPU. During explosions I constantly go well below 40 fps, and I am not ready to lower the settings just so the explosions go smoother. I also run this game in windowed mode, because I always alt-tab during loading screens and between matches. So a G-Sync compatible monitor would do nothing for me and this game, because I would still have stuttering during the explosions and it's not even run in fullscreen mode. Also, fullscreen and exclusive fullscreen are two different modes... And it's just one game. I am sure there are plenty of other games with the same limitations when running on G-Sync compatible displays. A hardware G-Sync module doesn't have any of these limitations.

Adaptive sync on a FreeSync 2 display with an Nvidia GPU works just fine in windowed, borderless, and exclusive fullscreen modes. Below 40 fps it triggers LFC, which duplicates frames - not as good as G-Sync, but on average the experience is not much different. I don't know if HDMI 2.1 VRR is any different in this respect, but that too works just fine for me. I have a first-gen G-Sync monitor, an LG C9 OLED and a Samsung CRG9 (FreeSync 2) here.

Windowed mode will always have some drawbacks in input lag etc.
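For anyone wondering how the below-range frame duplication works, here's a rough conceptual sketch of the LFC idea (my own illustration, not AMD's or Nvidia's exact algorithm, and the 40-120 Hz window is just an assumed example range):

```python
# Conceptual sketch of Low Framerate Compensation (LFC) -- not any vendor's
# exact algorithm. Below the VRR floor, each frame is shown N times so the
# panel's actual refresh rate stays inside its supported range.

VRR_MIN_HZ = 40
VRR_MAX_HZ = 120

def lfc_refresh(fps, vrr_min=VRR_MIN_HZ, vrr_max=VRR_MAX_HZ):
    """Return (frame_repeats, effective_refresh_hz) for a given game frame rate."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)      # inside the VRR window: 1 scanout per frame
    repeats = 2
    while fps * repeats < vrr_min:       # find a multiple that lands back in range
        repeats += 1
    return repeats, fps * repeats

for fps in (120, 60, 45, 30, 20, 10):
    n, hz = lfc_refresh(fps)
    print(f"{fps:>3} fps -> each frame shown {n}x, panel refreshing at {hz:g} Hz")

# e.g. 30 fps -> 2x at 60 Hz, 10 fps -> 4x at 40 Hz. New frames still arrive on
# their own schedule, so there's no tearing, but motion is only as smooth as
# the underlying frame rate.
```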
 
Adaptive sync on a FreeSync 2 display with an Nvidia GPU works just fine in windowed, borderless, and exclusive fullscreen modes. Below 40 fps it triggers LFC, which duplicates frames - not as good as G-Sync, but on average the experience is not much different. I don't know if HDMI 2.1 VRR is any different in this respect, but that too works just fine for me. I have a first-gen G-Sync monitor, an LG C9 OLED and a Samsung CRG9 (FreeSync 2) here.

Windowed mode will always have some drawbacks in input lag etc.
Really? That is nice to hear.
 
Ya I think part of the problem can be windowed mode. I've never seen windowed mode work as well/fast as exclusive full-screen. I avoid non exclusive full screen whenever possible.

Hate to burst your bubble here gents but all DX12 games render Full Screen Exclusive just like Borderless fullscreen so you get none of the FPS or input lag benefits that DX11 had. I hate it but unless Microsoft changes DX12 that's what your future looks like.

If you're obsessed with ultra-low-latency gaming, this guy's YouTube channel is a life-eater:
 
Hate to burst your bubble here gents but all DX12 games render Full Screen Exclusive just like Borderless
Isn't it the opposite? DX12 renders everything as Full Screen Exclusive? As in Full Screen Borderless with Full Screen Exclusive benefits?
 
Isn't it the opposite? DX12 renders everything as Full Screen Exclusive? As in Full Screen Borderless with Full Screen Exclusive benefits?

Nope, not according to the testing Chris (dude that makes video above) has done. DX12 renders everything to full screen borderless. Edit: I suspect because there is a lot of pressure to have overlays. There's a whole community of input lag fanatics that follow him and they spend 100s of hours trying to get DPC latency, input lag, windows settings and bios settings perfect for zero lag. These guys refuse to use on-board audio because it can cause DPC latency spikes. They even refuse to use more recent versions of Windows 10 because of DPC latency spikes. It's too much for me, makes gaming feel like a job and I'm an IT professional!

Here's Chris spending 30 minutes making Windows changes just for COD MW. At 2:50 he talks about FSE:
 
Nope, not according to the testing Chris (dude that makes video above) has done. DX12 renders everything to full screen borderless because there is a lot of pressure to have overlays.
It does, I just thought that it also carries all the features from DX11 Full Screen Exclusive mode... hmm. Got to read more about that.
 
Personally I still play some DX11 games, or set some games to DX11 mode for SLI compatibility, and run them fullscreen. Some games just run better in fullscreen mode regardless - better frame rate, and less of the stutter that some seem to get in windowed mode, resource-wise. o_O

-----------------------------

Regarding monitor space usage outside of games while playing the games fullscreen (without minimizing them):

I tried that MouseMux app that allows multiple mouse pointers on the desktop controlled by different mice. Both mice can move independently at the same time, but only one can click at a time. Clicking changes which mouse owns the "clicker" capability until you click the other. Ownership is shown by the pointer changing to the default size while the one that doesn't own the click-ability gets a little larger and has a little colored square next to it.

It works really well for desktop apps, but when I launched Grim Dawn and switched it to fullscreen mode, left-clicking with the 2nd mouse cursor would always just pull the character in its direction, as if all of my monitors were one big game overlay even though the game was only visible on my middle one.

I suspect that the only way to do this properly in a full screen locked game is to use some of the last several virtual desktop style options on that page I linked: https://www.raymond.cc/blog/install-multiple-mouse-and-keyboard-on-one-computer/

Those methods allow you to run a different user's desktop on the other monitor(s), with that user having their own mouse and pointer, keeping your gaming monitor dedicated to gaming with your gaming mouse - in some ways as if they were connected to different PCs. That means you wouldn't be able to drag or move windows onto your gaming monitor anymore, but it might be a worthwhile tradeoff to me for being able to use a 2nd mouse independently, specifically a thumb trackball mouse with no need for actual mouse pad travel space. You'd still have access to all of the installed apps and drives of the computer, and you could export/import browser profiles and bookmarks and such pretty easily. I'm not sure how CUDA acceleration for video playback, browser acceleration, and apps would work on the 2nd user/desktop with a game running though, since I don't do VM stuff much.

Another way I could see it potentially working on a single monitor, more like what fishcakes was asking about, is by running the game "fullscreen" inside the virtual desktop (user 2) window, while the primary Windows 10 user keeps its own mouse outside of that VM frame. I'm not sure how the GPU power is assigned across virtual desktops though, so I'd have to ask around and dig up info on it to see if it's worth doing.
 
Nope, not according to the testing Chris (dude that makes video above) has done. DX12 renders everything to full screen borderless. Edit: I suspect because there is a lot of pressure to have overlays. There's a whole community of input lag fanatics that follow him and they spend 100s of hours trying to get DPC latency, input lag, windows settings and bios settings perfect for zero lag. These guys refuse to use on-board audio because it can cause DPC latency spikes. They even refuse to use more recent versions of Windows 10 because of DPC latency spikes. It's too much for me, makes gaming feel like a job and I'm an IT professional!

Here's Chris spending 30 minutes making Windows changes just for COD MW. At 2:50 he talks about FSE:



 
Has anyone tried segmenting larger screens like this so that you're only playing on part of the screen? Using it as a monitor which you're seated close to, the upper section of the screen is going to feel fairly high so might benefit from being cropped?

Below shows what it would look like at the same aspect ratio as the LG 38GL950G, albeit physically wider and taller.

View attachment 230730

I guess you could even crop it to a 34" ultrawide and use the top portion for something like browsing etc.?

View attachment 230731

Or is this folly for those who have tried it and if so, what was not so great?
To follow up about Virtual desktops on a big monitor like this:

The scenarios I talked about, and more or less what fishcakes requested, were about having usable space outside of a fullscreen game, isolated to a section of the monitor, without minimizing the game or having to run windowed or fullscreen-windowed mode (or at least isolating those spaces outside of a windowed-mode game with no possible overlap).

-------------------------------------
I found that VMs most certainly cannot utilize a "shared" GPU fully, so running a game IN a VM user's desktop window is a no-go. That's a noob find, but I am a VM noob admittedly.
Regarding the 3D acceleration capabilities of VirtualBox and VMware: as of Jan 2019, if GPU acceleration in the guest VM is important to you, then VMware is your best bet - especially if your guest OS is Windows. VirtualBox's GPU acceleration capabilities are a bit less developed, as you can only give the guest VM a max of 128MB of video RAM, while VMware allows up to 2GB to be allocated to video memory. VMware supports DirectX 10, and VirtualBox supports DirectX 9.
Don't expect to be able to run games in a guest VM; 3D acceleration pass-through still is not quite that far along. But you can expect to have a modern OS and UI in your guest, and an acceptable experience. One would be able to play older games in the guest VM.

So it seems like, if anything, a virtual desktop for a second user account could be useful assigned to additional monitor(s) in an array, with its own hardware mouse (and even keyboard if desired), rather than as a periphery around the ("fullscreen" ultrawide rez) game window on the same monitor. At least the VM user account can apparently still get some amount of hardware acceleration for browsers, video playback, and apps, so it isn't entirely hobbled in that respect (unlike gaming). The other option would be to run the other monitor(s) off of another PC or laptop and just network the drives and share some folders.
-------------------------------------

Whether using a 2nd computer or a VM on the same machine with a 2nd user account, the account would have to be locked to the non-gaming monitors while gaming. At least with the VM method on the secondary monitors, you could just close the VM after gaming and go back to full PC power, plus back to utilizing all of your monitors outside of games, so I will be experimenting with that to see how I like it. You could do the same sort of thing by swapping between inputs on the other monitor(s) in the 2nd-computer scenario, assuming you had a suitable laptop or PC to connect while at your desk. The latter is a lot easier if your displays come with remote controls to swap between inputs, of course.

Otherwise, if you decide you want to run a game in windowed mode rather than fullscreen - for example an ultrawide rez, or a 1:1 mapped 2560x1440 in the middle of a big 48" LG CX - while not having to worry about moving things behind the game window, you could experiment with some monitor-dividing/tiling software solutions like desktop divider, AquaSnap, or the advanced window-placement features of DisplayFusion Pro that remember app window sizes and locations.
 
Per fishcakes' idea (and I raised it a few pages ago, for turning it into a 21:9 by chopping the bottom off below the desk)

It's more work than VMware or VirtualBox (Oracle *spit*), but using GPU passthrough on a Linux hypervisor gives you damn close to 100% performance. We use that at work pretty extensively, albeit for clients doing CUDA workloads, not 'graphics'. As an aside, it's even better now that you can do it in Docker too. I'm sure it wouldn't be that hard to set a custom output to only take a portion of the screen; the challenge is placing it. I've not seen how that could be done, but I'm sure it can be. Linux is good for weird shit.

I'm in the process of putting together the spec for my new build, when I'll finally move myself onto a hypervisor for everything, so I'll find out at some point in the next few months - but I'd be interested to know.
 