LG Launches 32UD99-W 32" Class 4K UHD IPS LED Monitor with HDR10 and FreeSync


Deleted member 93354

Guest
Because G-Sync is proprietary and adds to the overall cost, while the VESA Adaptive-Sync standard is supported by AMD and Intel (probably starting with Cannon Lake CPUs).

NVIDIA is really hurting its customers by not supporting a VESA standard.
It's ironic that NVIDIA does have the better-quality tech in terms of released monitors. As it is now, there are quite a number of FreeSync monitors out there that aren't worth a @#$@#$ because their refresh rate range is limited. The ones that are worth anything are ALMOST as expensive as the G-Sync ones. And G-Sync does have the display buffer built in for low-refresh-rate frame doubling. But NVIDIA is so freakin' expensive. This is a sad state of affairs because I always prefer open standards to proprietary ones.
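The "low-refresh-rate frame doubling" mentioned above is what AMD later branded Low Framerate Compensation (LFC): when the frame rate drops below the panel's minimum variable refresh, frames are repeated so the effective refresh stays inside the supported range. A minimal sketch of that selection logic (the range numbers are hypothetical examples, not this monitor's spec):

```python
def lfc_multiplier(fps: float, min_hz: float, max_hz: float) -> int:
    """Smallest whole-number frame repeat that lifts the effective
    refresh back inside the panel's variable range; 1 means no
    compensation (either unneeded or impossible)."""
    if fps >= min_hz:
        return 1  # already inside the variable range
    m = 2
    while fps * m < min_hz:
        m += 1
    if fps * m > max_hz:
        return 1  # range too narrow to multiply into -- LFC impossible
    return m

# Hypothetical wide 30-144 Hz range: compensation works
print(lfc_multiplier(20, 30, 144))  # 2 -> panel refreshes at 40 Hz
print(lfc_multiplier(10, 30, 144))  # 3 -> panel refreshes at 30 Hz
# Hypothetical narrow 48-75 Hz range: 40 fps * 2 = 80 Hz > 75, no help
print(lfc_multiplier(40, 48, 75))   # 1
```

This is also why the narrow-range FreeSync monitors complained about above are a problem: compensation only works when the maximum refresh is at least roughly double the minimum.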
 

v6maro

[H]ard|Gawd
Joined
Oct 10, 2002
Messages
1,552
Is this 10- or 12-bit (HDR)? In other words, what's the HDMI/DP bandwidth rating, 10.2 or 18 Gbps? Also, why no G-Sync? *cry*
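For rough context on the bandwidth question, the uncompressed RGB pixel data rate for 4K at 60 Hz can be estimated as width × height × refresh × bits per pixel. This sketch ignores blanking intervals, so real link requirements are somewhat higher:

```python
def data_rate_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    """Uncompressed RGB video data rate in Gbit/s (blanking overhead ignored)."""
    bits_per_pixel = bits_per_channel * 3  # R, G, B
    return width * height * hz * bits_per_pixel / 1e9

print(round(data_rate_gbps(3840, 2160, 60, 8), 2))   # ~11.94 Gbps
print(round(data_rate_gbps(3840, 2160, 60, 10), 2))  # ~14.93 Gbps
print(round(data_rate_gbps(3840, 2160, 60, 12), 2))  # ~17.92 Gbps
```

Note that HDMI 2.0's 18 Gbps is a raw link rate (roughly 14.4 Gbps of payload after 8b/10b encoding), which is why 4K60 at 10-bit over HDMI 2.0 typically requires chroma subsampling; DisplayPort 1.4 has more headroom.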
 

Neurofreeze

2[H]4U
Joined
Jan 11, 2002
Messages
2,523
Guys, don't fall for it. HDR requires a high native contrast ratio. Unless this comes with local dimming, an IPS panel simply cannot cut it. This monitor has marketing HDR, meaning it's compatible with the format and maybe shows a bigger color space, but without a high-contrast panel things just get washed out in brighter scenes. Hell, even VA panels have this problem to some extent if there is no local dimming.
I believe the HDR spec requires local dimming, but it doesn't matter.

Even if it had a natively good contrast ratio, which it doesn't, since IPS pretty much never does compared to the world of TVs, LG for some reason loves edge-lit displays even on their highest-end LCDs, so the local dimming zones are utter junk (blooming everywhere, giant horrible light banding, etc.).

From casual googling, it appears this monitor is also edge-lit.
 

Deleted member 93354

Guest
I believe the HDR spec requires local dimming, but it doesn't matter.

Even if it had a natively good contrast ratio, which it doesn't, since IPS pretty much never does compared to the world of TVs, LG for some reason loves edge-lit displays even on their highest-end LCDs, so the local dimming zones are utter junk (blooming everywhere, giant horrible light banding, etc.).

From casual googling, it appears this monitor is also edge-lit.
Local dimming alone helps with contrast, but only within a limited range. Combined with more light output, local dimming creates a greater luminosity range. This is where the 1000 nits comes in, to help achieve that contrast.

Standard set, full white, without local dimming: 500 nits * 100% = 500 nits
Standard set, broadcast black, without local dimming: 500 nits * 15% = 75 nits

Contrast is 500/75 ≈ 6.7:1

Standard set, full white, with local dimming: 500 nits * 100% = 500 nits
Standard set, broadcast black, with local dimming: 10 nits * 15% = 1.5 nits

Contrast is 500/1.5 ≈ 333:1

HDR set, full white, with local dimming: 1000 nits * 100% = 1000 nits
HDR set, broadcast black, with local dimming: 10 nits * 15% = 1.5 nits

Contrast is 1000/1.5 ≈ 667:1

Still not ideal, but much better.
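The arithmetic above can be written out directly. The 15% figure here models the fraction of backlight the panel leaks through on "black"; all numbers are the post's illustrative values, not measurements of this panel:

```python
def perceived_nits(backlight_nits: float, panel_fraction: float) -> float:
    """Luminance reaching the viewer: backlight level times the fraction
    the panel passes (1.0 = full white, 0.15 = the leaky 'black' here)."""
    return backlight_nits * panel_fraction

def contrast(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# Standard set, no local dimming: backlight stays at 500 nits everywhere
print(contrast(perceived_nits(500, 1.0), perceived_nits(500, 0.15)))   # ~6.7:1
# Standard set, local dimming drops the backlight to 10 nits in dark zones
print(contrast(perceived_nits(500, 1.0), perceived_nits(10, 0.15)))    # ~333:1
# HDR set: 1000-nit peak white plus local dimming
print(contrast(perceived_nits(1000, 1.0), perceived_nits(10, 0.15)))   # ~667:1
```

The takeaway matches the post: dimming the backlight in dark zones lowers the black floor, and the higher HDR peak then stretches the ratio further.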
 
Last edited by a moderator:

travisty

Gawd
Joined
Feb 3, 2016
Messages
815
It's ironic that NVIDIA does have the better-quality tech in terms of released monitors. As it is now, there are quite a number of FreeSync monitors out there that aren't worth a @#$@#$ because their refresh rate range is limited. The ones that are worth anything are ALMOST as expensive as the G-Sync ones. And G-Sync does have the display buffer built in for low-refresh-rate frame doubling. But NVIDIA is so freakin' expensive. This is a sad state of affairs because I always prefer open standards to proprietary ones.
I won't argue which is the better approach. All I'm saying is that NVIDIA should support the open standard even if they think G-Sync is better. If they support both, they get to claim as much, and it cuts into AMD's position.
 

Deleted member 93354

Guest
I won't argue which is the better approach. All I'm saying is that NVIDIA should support the open standard even if they think G-Sync is better. If they support both, they get to claim as much, and it cuts into AMD's position.
That only makes sense halfway. You have to remember that the G-Sync processing boards stuck in the back of monitors are a profit maker for NVIDIA, so supporting FreeSync works against their interest. Buying a monitor becomes a no-brainer if both major competitors support FreeSync.
 