31.5" 2560x1440 165 Hz VA G-Sync - LG 32GK850G

I am again considering returning this monitor and going 4k for sharper text, which is my primary concern, with gaming being secondary. I could run games at 1080p to avoid loading the GPU too much.

I think it should be stressed that the choice between 4k60hz and 1440p144hz depends largely on the type of games you're playing. FPS and other shooters benefit greatly from higher refresh rates and a more "connected" reticle, but RPG and RTS fans will probably appreciate the extra detail in a 4k panel far more than a barely perceptible smoothness.

I play both, but I'm happy enough with the bump from 1080 to 1440. The extra refresh rate is rarely fully utilized by my system anyway, but 100+ fps is nice on the eyes (y).
 

It's certainly playable. I've even played a few PS4 games at 30fps, but they are very page-y: clunky, chuggy movement, and if you actually pan the viewport around much beyond your regular gamepad movement (i.e. freelook) it's a smeary mess. A solid 60fps with G-Sync is better, but it still can't compare to 120Hz+ at 100fps average or better.


Regarding high hz benefits:

120Hz and higher at high frame rates is a HUGE increase in display experience, especially for 1st/3rd person gaming aesthetics.

In 1st/3rd person games you are constantly moving your viewport around at speed, so it's not just a simple flat-colored bitmap UFO-test object smearing. The entire viewport and game world (with high-detail textures and depth via bump mapping, etc.) is smearing relative to you during movement-keying and mouse-looking.



120fps at 120Hz cuts sample-and-hold blur by 50% and doubles your motion definition and motion-path articulation, increasing to a glassy smoothness (more dots per dotted line; twice the unique animation pages/cells in a flip book, paged twice as fast, so to speak).

100fps-Hz cuts sample-and-hold blur by 40% and gives a 5:3 motion-definition improvement (10 unique frames shown at 100fps-Hz for every 6 shown at 60fps-Hz).

At a 60fps or 60Hz cap you are getting smearing blur during viewport movement. At 100-120fps on a high Hz monitor you cut that blur down to more of a soften blur within the "shadow masks" of everything on the screen, within the lines of the coloring book so to speak. Modern gaming overdrive and low response times help mitigate this, or it would be much worse.
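To put rough numbers on that, here's a back-of-the-napkin Python sketch. It assumes blur scales directly with frame persistence (1/fps on a plain sample-and-hold panel with no strobing, frame rate matched to Hz), which is a simplification and not a measurement:

```python
# Toy model: on a sample-and-hold display with fps matched to Hz,
# each unique frame persists for 1/fps seconds, and perceived motion
# blur scales roughly with that persistence. "Motion definition" here
# is just unique frames per second relative to a 60fps-Hz baseline.

BASELINE_FPS = 60

def persistence_ms(fps: float) -> float:
    """Time each unique frame stays on screen, in milliseconds."""
    return 1000.0 / fps

def blur_reduction_vs_60(fps: float) -> float:
    """Fractional reduction in sample-and-hold blur relative to 60fps-Hz."""
    return 1.0 - persistence_ms(fps) / persistence_ms(BASELINE_FPS)

for fps in (60, 100, 120, 144, 240):
    print(f"{fps:>3} fps-Hz: {persistence_ms(fps):5.2f} ms persistence, "
          f"{blur_reduction_vs_60(fps) * 100:3.0f}% less blur than 60fps-Hz, "
          f"{fps / BASELINE_FPS:.2f}x the unique frames")
```

That's where the 40% / 50% figures and the 5:3 and 2:1 ratios above come from (and 240fps-Hz works out to roughly 75% less blur, which is why it reads as only a very slight soften).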

Variable refresh rate is another HUGE bump in display experience. It allows you to straddle or at least dip well into those higher hz and frame rate ranges in a frame rate graph that has spikes, dips, and potholes without experiencing judder, stutter, stops, or tearing. So you can tweak your graphics settings higher for a better balance and avoid the bad effects of v-sync or no-syncing at all.
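Here's a toy illustration of the judder that VRR sidesteps (hypothetical frame times, and it ignores details like the VRR range floor, LFC, and scanout time). With a fixed 60Hz refresh and v-sync, any frame that misses the 16.7ms deadline gets held for a whole extra refresh; with VRR the display just waits for the frame, so on-screen time roughly equals render time:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # 16.67 ms per refresh at 60 Hz

# A spiky frame-time graph: a few fast frames, a couple of dips/potholes.
render_times_ms = [14.0, 15.0, 19.0, 25.0, 16.0, 33.0, 15.0]

def vsync_fixed_60(t_ms: float) -> float:
    """On-screen time with v-sync at fixed 60 Hz: rounded up to whole refreshes."""
    return math.ceil(t_ms / INTERVAL_MS) * INTERVAL_MS

for t in render_times_ms:
    print(f"render {t:5.1f} ms -> fixed 60 Hz v-sync shows it for {vsync_fixed_60(t):5.1f} ms, "
          f"VRR shows it for ~{t:5.1f} ms")
```

The 19-33ms frames all jumping to a full 33.3ms on the fixed display is exactly the stutter you feel on dips and potholes; VRR smooths that out, which is what lets you afford higher graphics settings.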



The higher your frame rate + Hz, the better. We won't get to 1ms CRT-level "zero" blur until we get high-end motion interpolation, something like 100fps x 10 interpolated on a 1000Hz monitor.

-------------------------------------------
https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

Motion definition when filled with new frames at high frame rates... [image]


Sample and hold blur reduction of the entire viewport during mouse looking and movement keying (panning).

Someday we'll hopefully get 1000Hz displays and use something like 100fps x 10 interpolated frames to achieve "zero" motion blur.
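For a ballpark of how far sample-and-hold is from that target (my assumption here: roughly 1ms of persistence for a CRT/strobed image, and every refresh carrying a new frame):

```python
CRT_PERSISTENCE_MS = 1.0  # rough persistence of a CRT / strobed-backlight image

for hz in (60, 120, 240, 1000):
    persistence_ms = 1000.0 / hz           # ms each unique frame stays on screen
    ratio = persistence_ms / CRT_PERSISTENCE_MS
    print(f"{hz:>4} fps-Hz: {persistence_ms:5.2f} ms persistence "
          f"(~{ratio:.0f}x the hold blur of a ~1 ms CRT image)")
```

Only the 1000fps-Hz row lands at ~1ms, which is why interpolation up to those rates (rather than brute-force rendering) looks like the eventual path to CRT-like clarity without strobing.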

 
I think a higher frame rate being better is relative to use case. If you spend a lot of time playing fast-paced shooters, a high frame rate might be better for you. And that might even have a limit; I have seen a number of hardcore gamers mention that they see no advantage to 240Hz over 100-120Hz, for example. And if you don't play fast-paced shooters, a high frame rate might not matter to you at all. It's all relative. Personally, I'm not seeing any benefit to the higher frame rate for my use cases and with this monitor. I definitely could benefit from sharper text, though.

I think any benefits of a higher refresh rate are also very relative to pixel color transition time, input lag time, and whether the monitor has ULMB. This all gets pretty weedy, and it definitely is not a simple case of a higher refresh rate alone being better, even for those use cases which would benefit from a higher refresh rate. For example, some people think that having a higher refresh rate without ULMB is a waste and that it is better to have ULMB at a somewhat lower refresh rate than to have a higher refresh rate without ULMB.

On refresh rate alone, I personally do prefer 60 hz over 30 hz. Depending on the game, I think that 30 hz can feel choppy or it can feel fine. But 60 hz doesn't feel choppy to me for any games that I play.
 
My point is that high Hz plus high frame rates gives huge aesthetic gains. You get huge gains aesthetically in 1st/3rd person game worlds, not just twitchy shooters.

It's not just about gameplay accuracy, aim, spotting, etc. at all. I used to use a FW900 CRT, and it has 1ms persistence, essentially "zero" blur. LCDs smear badly with sample-and-hold blur. High fps at high Hz cuts that down from smearing to more of a "soften" blur within the lines (even tighter on very high Hz screens with the accompanying fps), and doubles or more the number of frames, so way more unique pages in an animated flip book so to speak, more pathing dots per dotted line, and even more animation cycle definition depending on the game. Those are huge aesthetically. Having a 4k screen that smears at lower fps ranges every time you mouse-look or movement-key at speed is not showing you higher fidelity during movement.

You can play just fine without bigger and better improvements in frame rates, variable Hz, resolutions, contrast ratios, HDR, etc., and can even play at lower GPU-power frame rates or on consoles, but there is better. If every screen had 4k 144Hz HDMI 2.1 with VRR and every PC and console had the GPU power to push 100fps-Hz average and better, everyone would be using it. Instead people are taking trade-offs (choosing which negatives to stomach along with their gains and budget) as necessary.

--------------------------------------------------------------------------------------
ULMB / STROBING
--------------------------------------------------------------------------------------

ULMB was a cool idea at the time, but it really requires very, very high frame rates to be used properly, so much lower graphics settings or very simple-to-render games, which is pretty much the opposite of using VRR to enable higher graphics settings and straddle higher variable frame rate graphs without hiccups. It's also incompatible with VRR, and it dims/mutes the screen. It also won't be compatible with HDR going forward, so it's really not the answer imo.

Most people avoid PWM like the plague. LCD strobing is essentially PWM; it will give you eyestrain. In order to use ULMB/LightBoost properly you have to keep really high minimum frame rates. People are all looking to 3440x1440 and 4k resolutions now, and to HDR luminance ranges and color volumes. Strobing is really not applicable to the bar that modern premium gaming monitors are setting (high resolutions at high Hz, HDR luminance and color volume, VRR with higher graphics settings to avoid judder on dips, potholes, and variance).

The real answer is a fairly high frame rate to start with, multiplied by advanced, high-quality interpolated frames (directly repeated, not 'manufactured'), combined with extremely high Hz, but that is still years off. The Hz ceilings are making some progress now, though, at least.

"So while they are saying 10,000hz fed massive fps at extreme resolutions would be indistinguishable from reality per se.. 1000fps (100fps interpolated 10x) at 1000hz would be essentially zero blur like a high end fw900 crt for the purposes of gaming"


As per blurbusters.com's Q and A:
-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.
This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.
G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.
Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate during VSYNC OFF operation. Also, there can be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
  • Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).
  • LightBoost motion blur elimination is not noticeable at 60 frames per second.
--------------------------------------------------------------
Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutters more with VSYNC ON, while others stutters more with VSYNC OFF. Test opposite setting.
==================================

"Slideshow"
typically refers to the motion definition aspect. Motion definition provides additional smooth detail in the pathing and animation cycles, and even shows more smooth motion definition of the entire game world moving relative to you when movement keying and mouse looking in 1st/3rd person games.
Mentioning different hz and features without including the accompanying frame rates each are typically running doesn't really tell what you are comparing.
60fps (an average, which ranges even lower part of the time) at 60Hz+ is like molasses to me.
100fps at 100hz or better shows 5 new unique frames to every 3 at 60fps-hz.
120fps at 120hz or better doubles the motion definition.
The "slideshow" nickname is because there are few new frames of action being shown at low frame rates. The same frame being shown for a longer time time like a flip book animation with less pages being flipped slower.
At high fps on a high hz monitor, you will get much higher motion definition regardless of whether you have strobing or black frame insertion, crt redraw, etc.
------------------------------------------
The Motion Clarity (blur reduction) aspect is also improved by running at high fps+hz ranges on a high hz monitor, and is nearly pristine using backlight strobing (with some major, in my opinion critical, tradeoffs).
Non strobe mode, at speed (e.g. mouse looking viewport around):
60fps solid ... is a full smearing "outside of the lines" blur. -- At variable hz or at 60hz or at 100hz, 120hz, 144hz, 240hz.
120fps solid .. halves that blur (50%) to more of a soften blur inside the masks of objects -- At variable hz or at 120hz, 144hz, 240hz.
144fps solid .. drops the blur a bit more, 60% less blur -- At variable hz or at 144hz, 240hz.
240fps solid .. drops the blur down to a very slight blur showing most of the texture detail -- At variable hz or at 240hz
 
The extreme anti-Windows 10 stuff reminds me of dealing with countless old people and the Windows XP -> 7 transition in the IT world.
 

And it still boils down to 'let me run insecure software because I hate any and all change!'.

I say give them Macs. Maybe they'll quit.
 
And it still boils down to 'let me run insecure software because I hate any and all change!'

The idea that people dislike change is a myth. People like change where the change has enough pros to outweigh the cons.

Windows 10 brought too much unbeneficial change for many Windows 7 users. The GUI doesn't know if it wants to be a desktop, a tablet, a web storefront, or a thin client in Microsoft's 'cloud'. The privacy issues and the Microsoft privacy policy which arrived after Windows 10 have turned Microsoft into a trust disaster. The constant and mandatory updating is breaking the user experience and sometimes breaking functionality. None of this is beneficial change. And even if it were only change for change's sake, it wouldn't be enticing for many Windows users. But it isn't that. It is change for Microsoft's further benefit, to the detriment of users.

None of this is to say that Windows 10 users haven't seen any benefits to updating. But at the same time, Windows 7 users have seen anti-benefits to updating.

Anyway, these sorts of things aren't so simple as 'it boils down to'. And by the way, according to statistics over the last few months from various web sources, the number of Windows 7 users isn't far behind the number of Windows 10 users, at something like 35% and 40% respectively of the total pool of Windows users. That is very telling considering that computers have been shipping with Windows 10 installed for years now, and Microsoft tried VERY HARD to get everyone to update to Windows 10 for free. And I would say that it has been more difficult not to update to Windows 10 than to update. A new Windows computer can't be found without Windows 10.
 

We saw the same thing with Windows XP, and the only benefit needed is security patches. It's expensive for Microsoft (or anyone) to continue to support software that they've moved on from. Further, unpatched systems are a danger to everyone.
 

Security from who? If users can't trust the company pushing out security patches, then it becomes something of a wash. Microsoft created a situation of distrust.

And look at the same situation on phones. Probably hundreds of millions to billions of users are running unpatched older phones. I'm not saying that it is ideal, but in reality, the world hasn't fallen apart over it.

And the Linux desktop is becoming more attractive all the time. I think that once Windows 7 reaches end of support, there will be many more Linux desktop users. And it really isn't a bad experience these days, especially for desktop users who don't run niche software. Using the internet and playing games covers most use cases, and that is very doable on the Linux desktop.
 
I don't think Linux has the performance in games. It doesn't have DirectX natively, so it's probably missing a lot of optimizations and even shader and lighting stuff. I also don't know how it works with VRR and eventually HDR. Major devs don't support it enough. Running games on it can almost be like running an emulator in some cases, trying to fit a square peg into a round hole. Steam does have some Linux games, but you are really just being stubborn I think. Linux is fine, macOS is fine too, but DirectX, Nvidia drivers, game support, and game performance are all way better on Windows.

You can also dual or multi boot operating systems, and even run VMs/virtual desktops.

Security-wise, some articles say motherboards themselves are compromised, Intel chips, phones. I'm not saying ignore security entirely, but again, to me you sound like the guy who wants to keep his clamshell phone and doesn't want to pay for a data plan :b


--------------------------------------------------------------------------------

Windows 10 has a lot of tweaks.
It's not a touchscreen metro desktop front end by default anymore and it's gone through a lot of updates.

---------------------------------------------------------------------------------

I've been using most of these apps and interface tweaks since Windows XP, and on the front end, visually and functionality-wise, little has changed:

I run a grey interface theme on mine and I use the Winaero Tweaker app to shrink all the window border frames to ultra slim, along with a bunch of other nice tweaks.

I use the DisplayFusion multi-monitor app to customize some nice slim taskbars on all screens and set up some cool window placement shortcuts, as well as the app's default memorizing/mapping of apps to locations/monitors.

For my file manager, I use Directory Opus, which has way more functionality than I even know, but even using the tip of the iceberg it is great. Visually I can use it as a dual-pane file browser with tabbed browsing that has folder tabs on the bottom of each pane (you can also drag a tab from one side to the other, duplicate tabs, etc). It has favorites, file collections, enhanced file copy/move operations, and a million other things. It also allows you to customize the fonts, font sizes, and all of the colors and background colors, as well as resizing text like a browser on a per-tab basis, independent of other tabs, by holding ctrl and moving the mouse wheel up/down.

In the Chrome web browser I use a full-window video addon; the uMatrix URL/script blocker (global, custom per site, etc., and it remembers your choices); Tabli, to list all my tabs across multiple browser windows in a concise index; Session Buddy, to save sessions (including selectively) and re-open them selectively from a URL index; the NoSquint addon to change background colors (medium to dark grey usually) and sometimes text color, as well as full zoom level or just text zoom level on a per-site basis; along with a few more.

So, all of those along with a few others.

The only complaint I have about Windows 10 is that they dumbed down some of the deeper customization applets/config panels, but with Winaero Tweaker bringing deeper registry tweaks to an easy interface with sliders, etc., and the fact that my 3rd party apps are all still pretty much the same interface-wise, that really isn't a problem.

-------------------------------------------------------------------------------

A HUGE security problem with using an end-of-life version of Windows is that when Microsoft continually patches the newer version (Windows 7/8 vs WinXP formerly, now Windows 10 vs Win7/8), it is basically broadcasting where the security hole was. These security holes are often applicable to the older version of the operating system. So you are really leaving yourself wide open unless you are not connected to the internet. Even then, even if it's not a phone-home type of malware, if you are updating your apps or installing things you could still be compromised maliciously.
 
I don't think Linux has the performance in games.

Something to point out: there doesn't seem to be anything egregiously wrong with Linux game performance. Generally speaking, you're correct in stating that there is a performance issue, it's just that the issue isn't really Linux itself as much as it is native development and API and driver support. With respect to the latter, Nvidia drivers on Linux are essentially the same performance as on Windows, as are AMD's most recently, and we can expect the same from Intel. To wit: I've played Windows games, including the sloppily coded League of Legends, through WINE on a laptop with integrated Intel graphics. Performance wasn't the same, but it was playable, suggesting that the Linux side of the equation was functional.

We're currently seeing non-native games running on the WINE translation layer run nearly as well as on Windows and sometimes faster; further, games written with Vulkan need no translation for graphics. Those calls can be made directly to the Vulkan stack in Linux.

Now, I will also echo your thoughts on HDR and VR: I also have no idea how this will work. We do know that 'Linux' can handle HDR, as we've seen that on Android (my Nvidia Shield is a testament to that), but hell if I know about VR and integrating that support into a system. I also have no idea how HDR would work on a desktop.

I'd just further point out that development in these and all areas of the Linux desktop and Linux gaming is ongoing at a rapid pace, and that Intel and AMD (and Microsoft!) are massive contributors alongside Valve and Google, for example.
 
Security-wise, some articles say motherboards themselves are compromised, Intel chips, phones. I'm not saying ignore security entirely, but again, to me you sound like the guy who wants to keep his clamshell phone and doesn't want to pay for a data plan :b

I don't doubt that. Linus (creator of Linux) has admitted that he was approached about backdooring Linux. And we have been seeing more and more hardware vulnerabilities revealed recently. To think that those haven't been taken advantage of is probably naive.

But all of that is beside the point. Point being that Microsoft has outright created an OS that is spyware and boldly told us such within their privacy policy, while at the same time very aggressively pushed for Windows users to update to it. And while many Windows users have resisted updating or reluctantly updated (or had no choice when purchasing new devices), a minority of users (along with the monotone of tech media) are acting just as aggressively as Microsoft themselves in trying to convince other users to update to Windows 10. It's as if the minority of enthusiastic Kool-Aid drinkers have set up their own Kool-Aid stands and are shouting at passersby to also drink the Kool-Aid. Drink the Kool-Aid, heathens, or ye shall perish in the flames.

I guess I see myself as being agnostic to the fearmongering. My plan is to continue with Windows 7 until I switch to Linux.
 
Point being that Microsoft has outright created an OS that is spyware and boldly told us such within their privacy policy, while at the same time very aggressively pushed for Windows users to update to it.

This boils down to trust.

If you fundamentally distrust Microsoft, then your perspective makes sense.

However, I'll point out that nearly all pieces of technology in use by consumers contain similar clauses. Yes, most Linux distros don't do this, yet. But the software that runs on them can.
 

I guess I look at it such that if I know for sure that something is a problem, it makes sense to address it. Outside of Android, software running on Linux is a maybe in the spyware department, and the chances of it being such on the Linux desktop are vastly lower than for Windows 10, which comes with a privacy policy that essentially states that it is spyware.
 

Really, any Linux software can be spyware- or worse. Any rogue dev can insert malware into a PPA and have that automatically installed across the world. It's happened, and it'll happen again.

With respect to Microsoft, they're probably the most trustable of their type of organization. To hold that against them would be pretty hypocritical unless one also refused to use any other piece of technology that also does the same kind of collection, publicly acknowledged or otherwise.

With respect to Windows 10: the main reason for the push is that too many users refuse to patch their systems. That's why Windows 10 forces patching; or rather, why it's acceptable for them to do so, and not particularly reasonable to run a version of Windows that doesn't get patched.

I get not running Windows, but I don't get refusing to run a patched system that's connected to the Internet.
 

Where is the logic?

Any software can be spyware. What we are talking about here is probability. And the probability of Windows 10 being such is at the extreme 'yes' end of the scale. Microsoft tells us that it is in their privacy policy. I don't get the trustworthiness of a company telling users that the OS is spyware. I suppose you could look at it as being a good thing that at least they do tell users that it is, but it is spyware nonetheless. Linux is on the other extreme 'no' end of the scale. Sure, there has been some malicious software available on Linux over the years, but in comparison to Windows it is a drop in the bucket. And we have no reasons to believe that the Linux OS itself is spyware.

There is a presupposition here that Windows 7 users are running without security updates. Where is that coming from? Windows 7 support hasn't ended yet and it won't until the end of the year. But we still have to keep in mind that those security patches are coming from an untrustworthy company.

And on running unpatched systems in general (aside from any presumptions), consider how many phone users are running unpatched systems right now and have been for many years and the reality of that situation.

So then, it all becomes a question of probability. At one end of the scale, Microsoft's Windows 10 is spyware. On the other, Linux is not, as far as we know. Tangent to that is the possibility of getting hacked in the future by running an unpatched Windows 7 system. Running a spyware OS is a sure thing. The tangent possibility of getting hacked by running an unpatched Windows 7 system (after the end of the year) must be hashed out as a probability (if someone plans to run an unpatched Windows 7) given the reality of running unpatched systems in general, rather than as a given of absolutely getting hacked.
 
Any software can be spyware. What we are talking about here is probability. And the probability of Windows 10 being such is at the extreme 'yes' end of the scale.

Are they collecting metadata and selling it? Of course. So does every other system like Windows, which Linux isn't (and it's far from a desktop replacement). This is coming from a Linux desktop user.

There is a presupposition here that Windows 7 users are running without security updates. Where is that coming from? Windows 7 support hasn't ended yet and it won't until the end of the year. But we still have to keep in mind that those security patches are coming from an untrustworthy company.

Here's the thing: people easily can. Thus they do. Further, at the end of support, they'll have no choice.

And on running unpatched systems in general (aside from any presumptions), consider how many phone users are running unpatched systems right now and have been for many years and the reality of that situation.

Consider that the 'userspace' on Android presents a smaller attack surface than desktop or server Linux distributions. Apples and oranges, really.

So then, it all becomes a question of probability.

The probability is 100% that attackers are targeting unpatched systems, of every type. With respect to 'spyware', this depends highly on calling metadata collection that all enterprises perform 'spyware', versus something designed to maliciously record user details such as bank credentials.

Microsoft isn't going to do that (probability, as you claim), while they are working to prevent others from doing it.
 
A HUGE security problem with using an end-of-life version of Windows is that when Microsoft continually patches the newer version (Windows 7/8 vs WinXP formerly, now Windows 10 vs Win7/8), it is basically broadcasting where the security hole was. These security holes are often applicable to the older version of the operating system. So you are really leaving yourself wide open unless you are not connected to the internet. Even then, even if it's not a phone-home type of malware, if you are updating your apps or installing things you could still be compromised maliciously.

Quoting myself in case it got lost in the larger reply there.

The chance goes up exponentially when you consider that the holes they are patching in the still-supported current version (Windows 10 in this case) can exist in the previous, unsupported versions of Windows, holes which will never be patched in those versions. The security patches detail what vulnerability existed and is being patched. This shows authors of malicious software where the security holes are, so they can then design exploits to target the now unsupported systems.
 
Recently got myself an 850G and tried some settings that look pretty good in the Gamer 1 picture mode, but when I'm gaming I'd like to have different settings. Am I missing something, or is that not possible with this monitor? When I change to the Gamer 2 picture mode, the brightness and contrast settings remain the same; I'd like them to be different, like 2 different profiles. Also, isn't there a saturation setting? Or is that only controllable with RGB, which is more complex overall than a simple saturation setting to make the colors in game pop a bit more?

Any help would be appreciated guys.
 
The extreme anti-Windows 10 stuff reminds me of dealing with countless old people and the Windows XP -> 7 transition in the IT world.

"Software as a service" is a scam, and Windows 10 continues Microsoft's growing trend of decreasing the amount of control the user has over their own computer. Change is all well and good but some things are just worse. When i'm blocked from removing what Microsoft decides are "core system services" even when using Powershell on Enterprise, this is a problem. Having to use the Group Policy Editor just to accomplish minor tasks without the system getting in the way and trying to tell me the best way to use my system is a problem. Problems that don't exist on Windows 7. I don't care what number they stick on the end, I just want an OS that works for me and not against me. 10 is still built on the same decades-old NT code that every other iteration of Windows has been built on since, it's not a new OS, it's the same old shit in a new package with less end-user control.

The LTSB version of 10 strips some of the crap out, but MS downplays the availability of that branch. I'll update when I have to, it is what it is.
 
The 850F version (FreeSync 2, Wide Colour Gamut) of this monitor is currently $299.99 on Costco.com with membership.
 
1440p is so 2012; the rtings.com review shows blue oversaturation and poor viewing angles.
There are plenty of better displays.
 
I use 32/4K/60/200% scaling at work. It is much better than 27/1080/60/100%.
I am OK with my 32/1440/165/150% at home where I play FPS/RaceSims a lot. But I will change it to 32/4K/165/200% very soon.
 
If you're into gaming, 4K is not a good option until we get way better graphics cards IMO. 1440p is the sweet spot at the moment.

I'd say this display is very good for overall usage once calibrated and amazing for gaming, very good contrast, low input lag (I play fighting games so I can easily sense that), viewing angles are good and colors are pretty good too once you adjust the settings.
 
it's 4k gaming, framerate looks more than acceptable

Straddling a 60fps average's higher and lower (30-50fps mud) frame rate range isn't really getting anything out of the higher Hz capability of a high Hz monitor, though.
You really start getting appreciable benefits at a 100fps average (a 70 - 100 - 130 graph for the most part). A 120fps-Hz average (90 - 120 - 150) or more is even better.
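Translating those example graphs into frame times makes the point (illustrative numbers only, using the rough ±30fps swings around each average mentioned above):

```python
# Hypothetical low / average / high fps for each scenario described above.
ranges = {
    "60 fps average":  (30, 60, 90),
    "100 fps average": (70, 100, 130),
    "120 fps average": (90, 120, 150),
}

for label, (low, avg, high) in ranges.items():
    print(f"{label:>15}: dips ~{1000/low:5.1f} ms/frame, "
          f"average ~{1000/avg:5.1f} ms, peaks ~{1000/high:5.1f} ms")
```

At a 100fps+ average even the dips stay down around 14ms frames, while a 60fps average spends real time up in 20-33ms territory, which is exactly the mud a high Hz panel can't do anything about.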




"1440p is the sweet spot at the moment."
I agree.
While there are some games that can get 100fps-Hz (average) on a single high-end 2000 series GPU at 4k now, it's usually with some of the over-the-top settings turned down or off to "very high+" or lower, or they are games with simpler graphics to start with, of course. The most demanding and/or unoptimized games are still low frame rate, well under 100fps-Hz average at 4k, so I agree that 1440p is still the sweet spot for modern high Hz gaming's aesthetic benefits in motion clarity (blur reduction) and motion definition (more pages in an animation flip book flipping much faster, more dots per dotted-line path). Getting up into the higher Hz ranges affects the entire game world moving in relation to you when mouse-looking and movement-keying (or gamepad panning) at speed. This usually takes things from a smeary viewport at lower fps-Hz (so much for 4k detail during fast viewport movement) to a soften effect clarity-wise, with a slick glassy motion from the increased frames. That is much less aggressive to your eyes' focus and looks way better and clearer, especially with a high quality overdrive implementation. It's still not CRT-level 1px of motion blur (essentially "zero") clarity-wise, but it's much better than sinking your frame rate's middle and bottom range down into 60 to 30fps. Higher frame rate graphs' motion definition increases aren't something people should dismiss either. I mean, that is why people supposedly buy high Hz monitors... if you aren't filling many more of those Hz with new, unique game world/action states consistently, you aren't really getting much out of a higher Hz monitor other than the label.

"this display is very good for overall usage once calibrated and amazing for gaming, very good contrast, low input lag (I play fighting games so I can easily sense that), viewing angles are good and colors are pretty good too once you adjust the settings."
I wouldn't call this very good for desktop/apps unless you have it set back a bit farther so the pixels shrink to your perspective.
The contrast and black depth are great compared to TN and non-FALD IPS... triple or more.
The input lag and overdrive vs response time are very good, especially for a VA.

The colors are pleasantly saturated once tweaked/bumped slightly up. This is great for gaming, combined with the contrast and black depth. I turn the nvidia desktop color vibrance up to +52% from 50 and tweak the OSD's R/G/B, contrast, and brightness, etc. for desktop usage. I then use nvidia Freestyle sliders in the Freestyle overlay on a per-game basis to tweak the color saturation, contrast, and sharpness per game (which it remembers). This 32" LG is completely usable as a general desktop monitor for simple stuff like browsing and even viewing some pictures once tweaked, but text and interfaces will look jumbo and a bit "soft" at nearer distances. Unfortunately there are no sharpness controls in the OSD for desktop/app use, but there is a sharpness adjustment in the nvidia Freestyle settings for games. The uniformity, and bumping the saturation up, make this not suitable for color work of course. Games are where this monitor really shines.

In games this monitor is well contrasted, vibrant, fast, better sized (including height) than the ~13" height of most other monitors, has a good native resolution for gaming in order to get a lot more out of its higher Hz capabilities, plus has the benefits of G-Sync.

"viewing angles"
Reading this statement reminded me of the moment when Bush claimed Iraq was developing WMDs.

I have three VA screens at my desk, on monitor arms. As long as I angle them to my line of sight they all look great. This 32" LG is the worst of the three by far, to be fair, but that is only obvious when looking at a solid background of bright color full screen, where the edges next to the monitor's side borders shade slightly, somewhat like TN shift. Since I generally use a medium grey, dark grey, or black background in my apps and Windows theme, this isn't obnoxious when it's used as a secondary or tertiary desktop/browsing monitor. In games it's a complete non-issue. The size, contrast, and black levels are huge gains. I'd never go back to non-FALD IPS or TN at .14 black depths and 860:1 to 980:1 contrast ratios, which are around a 1/3 of those of this monitor at best.
 
Sorry to interrupt this debate, but am I the only one with this problem? My displays are set to turn off after a short while to save power. All other displays seem to turn off but are still visible to the system, while the LG turns off completely. This leads to an extreme mess when I turn the displays back on - Windows thinks that the LG got disconnected, so it rearranges all programs all over the other displays. Any ideas how to get rid of it? Tried turning off "Automatic Standby" in the General settings, but no result.
 

Ok, when I tried this a month ago it didn't appear to make a difference, but during the last week or so I noticed my windows weren't jumping between my screens any more when I turn the panel back on, everything stays where I left it now (exactly what I wanted).

Thinking back, the only thing I changed was this "Deep Sleep" setting to off, and then a few weeks ago the monitor asked to "restart" when I was playing with the overclocked 165Hz setting, which seems to have done the trick.

So for future reference, if you're tired of all your windows jumping to your other monitors when you turn your DP monitor off for the night, try turning off Deep Sleep AND ALSO do a full monitor restart.
 
I use displayfusion with hotkeys mapped to window locations. You can micromanage the settings deeper to customize it but it also remembers window states for re-opening them the next time by default. So using the hotkey method, with a few clicks I can move windows to a bunch of my home positions I pre-set to hotkeys in displayfusion, and I can shuffle the active window around to any of the positions if you get what I mean. I use an elgato stream deck's hardware array of buttons to activate the hotkeys. I also recently installed a browser addon called tab session manager which saves all the browser tabs and window states periodically as well as allowing you to save named sessions manually. When you re-open the named session, all of the browser windows re-open with all of their tabs and window size and position states. Along with Tab Suspender and a few other addons it works great. I can close my browser or reboot the pc and just re-open the session and everything goes back to where it was browser wise. If I need to shuffle anything else I just click on it and hit a window location hotkey.
 
Anyone having issues with coil whine on these?

I recently picked up a great deal on one of these panels (looks like it's the 650 rather than the 850 variant?)

and good lord almighty, the coil whine is killing me from anywhere in the same room. I know I'm more sensitive than most when it comes to this, but it's slaying my ears.
 
I have no coil whine on my GK850G.

Are you overclocking your 650? I'd experiment with seeing if it still whines at 120hz limit or 144hz.

Are you using FreeSync on that model? If so, with what GPU? Is it an Nvidia-certified FreeSync monitor? I'd try disabling FreeSync as well, in the interest of a process of elimination, just to see if it makes a difference.

In addition to that I would see what the ambient temperature is in the room perhaps as these tend to run hot (at least the ones with a g-sync module do).

I'd also consider the power source and how many things are drawing on it. Things often whine when underpowered, almost like a water pump sucking a vacuum or air. Does it whine all the time or only when stressed in games (using more power)?

In the old days when I ran FW900 CRTs, I had to get a Tripp Lite line conditioner, which provided clean sine-wave power (avoiding brown-dim or surge-bright power fluctuations) to the CRT and eliminated other line interference. I still use one on my rig now, before the UPS, since a UPS won't recharge and keep the system powered on off of dirty wall power when running off a generator during power outages. That's even when the UPS has its own (inferior) "pure sine wave" line conditioner in it.

I'd return it if it ends up being a mfg defect or something.
 
I own the 650F and there is no coil whine here; for all intents and purposes the monitor is completely silent.

Have you tried plugging the monitor into a different power point? Usually coil whine is related to the power source/supply. I live in a newer home (5 years old), so all my wiring is new. In my old house, before we demolished it, with its old wiring I used to experience more coil whine and power device failures due to the poorer quality power source.
 
So there is no way I can adjust sharpness on my LG 32GK850G outside of games? I had the F version before but didn't like the colors, so I returned it. Also, pixel response was a bit slower than on the G version, or at least that was my feeling. But sadly now I see the G version doesn't have a "sharpness" option, which makes me sad. I really want to keep this monitor because I find the colors on the G model more natural. Anyway, is there any kind of 3rd-party app that can sharpen my screen image, or should I simply forget about such a thing?
 