LG 34GK950F-B (3440x1440 144 Hz QD strobing backlight)

Vega

https://www.lg.com/us/monitors/lg-34GK950F-B-gaming-monitor

This could be very interesting. The first strobing back-light 144 Hz ultra-wide.

 
Well, strobing and FreeSync don't work at the same time anyway, so the Nvidia cards would be a better fit... a 2080 Ti might even be able to hold 144 fps in most games.
 
Nvidia G-Sync on this monitor's G variant, the 34GK950G, is limited to 120 Hz because they state that's all that G-Sync can do.

PS: Nvidia is going to block any use of their cards with FreeSync.
 
Nvidia G-Sync on this monitor's G variant, the 34GK950G, is limited to 120 Hz because they state that's all that G-Sync can do.

PS: Nvidia is going to block any use of their cards with FreeSync.

Are there not 4K 120 Hz HDR displays that support G-Sync? So why would this one be any worse? It's a lower resolution :)
 
G-Sync being proprietary and expensive when there are free equivalents is bad enough. But for the G-Sync modules to be so outdated that the free versions are markedly BETTER just makes NV's refusal to support it all the more egregious.

UGH.
 
Are there not 4K 120 Hz HDR displays that support G-Sync? So why would this one be any worse? It's a lower resolution :)
Looks like it is not using G-Sync HDR, as that version lists no HDR support unlike the Freesync version. That would mean it's using DisplayPort 1.2, not 1.4, which is limited to about 17 Gbps with the former compared to 25 with the latter. 3440x1440 8bpc @ 144 Hz is a little over 17 Gbps uncompressed, and G-Sync adds some overhead.
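
To put rough numbers on that (my own back-of-envelope sketch, assuming CVT-R2-style blanking figures; the monitor's real timings may differ):

```python
# Estimate uncompressed DisplayPort bandwidth for 3440x1440 @ 144 Hz, 8 bpc RGB.
# Blanking values are assumptions for illustration, not the panel's actual timings.
H_ACTIVE, V_ACTIVE = 3440, 1440
H_BLANK, V_BLANK = 80, 41           # assumed CVT-R2-style blanking
REFRESH_HZ = 144
BITS_PER_PIXEL = 24                 # 8 bpc x RGB

pixel_clock = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * REFRESH_HZ  # Hz
required_gbps = pixel_clock * BITS_PER_PIXEL / 1e9

DP12_HBR2 = 17.28   # Gbps of payload over 4 lanes after 8b/10b overhead
DP14_HBR3 = 25.92   # Gbps of payload over 4 lanes

print(f"required: {required_gbps:.2f} Gbps")
print(f"fits DP 1.2 (HBR2): {required_gbps <= DP12_HBR2}")
print(f"fits DP 1.4 (HBR3): {required_gbps <= DP14_HBR3}")
```

That works out to roughly 18 Gbps, already past the DP 1.2 ceiling before any G-Sync overhead, but well within DP 1.4.
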
G-Sync being proprietary and expensive when there are free equivalents is bad enough. But for the G-Sync modules to be so outdated that the free versions are markedly BETTER just makes NV's refusal to support it all the more egregious.

UGH.
Freesync 2 isn't free like the original. It now goes through the same approval process G-Sync displays go through. Licensing is still free, though.

I have yet to see a comparison between Freesync 2 HDR and G-Sync HDR. G-Sync was originally better than Freesync objectively, but it didn't matter all that much perceptibly. Truth is, all VRR tech is great for gaming.
 
Yes, the "G" model is using the old/cheap DP 1.2 G-Sync chip, not the new/expensive DP 1.4 HDR G-Sync chip.

I'm more interested in the plain Freesync "F" model posted here that can run 3440x1440 at 144 Hz STROBE!
 
Does it indicate what the strobe rate is?
The backlight strobes in sync with the refresh rate (so a 120 Hz refresh means a 120 Hz strobe, or one flash every 8.33 ms). It says "1ms MBR", so I assume the "on" period of the backlight is 1 ms. On ULMB displays it was adjustable in a range from 0.25 ms to 1.875 ms (the shorter the period, the less persistence blur but the dimmer the peak brightness).
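
If you want to see what that 1 ms on-time buys you, here's a crude persistence sketch (the pan speed is an assumed example value, not measured data):

```python
# Perceived motion smear is roughly eye-tracking speed x how long each frame stays lit.
def smear_px(lit_time_ms, speed_px_per_s):
    return speed_px_per_s * lit_time_ms / 1000.0

REFRESH_HZ = 120
frame_ms = 1000.0 / REFRESH_HZ       # 8.33 ms between refreshes/strobes
SPEED = 960                          # px/s, an assumed fast horizontal pan

print(f"sample-and-hold (lit ~{frame_ms:.2f} ms): {smear_px(frame_ms, SPEED):.1f} px of smear")
print(f"1 ms strobe (MBR):               {smear_px(1.0, SPEED):.1f} px of smear")
print(f"0.25 ms strobe (ULMB minimum):   {smear_px(0.25, SPEED):.1f} px of smear")
```

So going from full-persistence 120 Hz to a 1 ms strobe cuts the smear from about 8 px to about 1 px at that pan speed, at the cost of brightness.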
 
Nvidia G-Sync on this monitor's G variant, the 34GK950G, is limited to 120 Hz because they state that's all that G-Sync can do.

That sounds like a DP 1.2 limit to me; 3440x1440 @ 144 Hz exceeds the 17.28 Gbit/s limit. So they're probably using an old G-Sync module. They would need the new G-Sync module that the PG27UQ/PG35VQ are using, and my guess is they just feel it's too expensive. Based on the fact that the G-Sync non-FALD XB273K is $400 more expensive than its FreeSync brother, the DP 1.4 G-Sync module is Ridiculously Expensive.
 
It's dumb that Nvidia hasn't updated the standard G-Sync module with DP 1.4 and HDMI 2.0. I get that the new G-Sync HDR module has all this crazy shit in it that requires active cooling and crap, but how about at least modernizing the original G-Sync module. It's like 3 years old now and is obviously holding back monitors like this one where the Freesync version is BETTER than the G-Sync version.
 
Nvidia is just saying "pay us $400 for a $25 module."
I remember when they first introduced it, they said the price difference would never be over $50.
Times change, but N does not... greed is good in their eyes.
 
There’s going to be so many DisplayHDR 600 monitors next year, nothing else is worth buying right now - seems to me like they’re clearing inventory. Personally I’d ride out 2018 and wait for next year. DCI p3 is great, but if contrast and luminance aren’t there to help display it, the gain is minimal.
 
So for Nvidia card owners -- it seems like the choice here is 120 Hz + G-Sync or 144 Hz? How does the extra 24 Hz compare to G-Sync considering you're already at high refresh rates?
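
For what it's worth, the raw frame-time difference is small (simple arithmetic, not a claim about how it feels in practice):

```python
# Frame time at each refresh rate; the extra 24 Hz is worth about 1.4 ms per frame.
for hz in (120, 144):
    print(f"{hz} Hz -> {1000.0 / hz:.2f} ms per frame")
# 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms
```

Whether ~1.4 ms per frame beats tear-free variable refresh is really the question being asked here.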
 
It's a shame there are not more G-Sync monitor choices. It's also a shame AMD doesn't have better GPUs to take advantage of all these FreeSync monitors. Not sure which is worse, really.
 
It's dumb that Nvidia hasn't updated the standard G-Sync module with DP 1.4 and HDMI 2.0. I get that the new G-Sync HDR module has all this crazy shit in it that requires active cooling and crap, but how about at least modernizing the original G-Sync module. It's like 3 years old now and is obviously holding back monitors like this one where the Freesync version is BETTER than the G-Sync version.

If AMD doesn't capitalize on this, they're completely fucking stupid. When HDMI 2.1 OLED monitors ship with VRR, AMD can swoop in, have the ONLY GPUs that actually support them, and be extremely attractive to gamers even if their performance isn't as good as Nvidia's. This is their big chance. Nvidia has kind of shot themselves in the foot: they can't keep peddling their proprietary G-Sync crap and also support the new tech.
 
There’s going to be so many DisplayHDR 600 monitors next year, nothing else is worth buying right now - seems to me like they’re clearing inventory. Personally I’d ride out 2018 and wait for next year. DCI p3 is great, but if contrast and luminance aren’t there to help display it, the gain is minimal.

Bingo. 2018 was a complete dud for monitors. They're all crap. There wasn't a single noteworthy monitor release this year. The 4K variable refresh monitors were all too small (not a single one released at 30"+ that wasn't a turd), and the rest are just ancient rebranded crap (we've had 1440p 144 Hz for like 4 years now). Everyone's just treading water until HDMI 2.1 monitors come out. It's stupid to buy anything right now.
 
Bingo. 2018 was a complete dud for monitors. They're all crap. There wasn't a single noteworthy monitor release this year. The 4K variable refresh monitors were all too small (not a single one released at 30"+ that wasn't a turd), and the rest are just ancient rebranded crap (we've had 1440p 144 Hz for like 4 years now). Everyone's just treading water until HDMI 2.1 monitors come out. It's stupid to buy anything right now.

LOL... you just never stop beating that drum do ya bluewaffle? The X27 is an incredible display and I for one am glad that you are missing out on its goodness....because you are not.......worthy ROFL
 
AMD is already talking mid-2019 and wow, it will be super duper blow everything away.
Just like 2017.
 
LOL... you just never stop beating that drum do ya bluewaffle? The X27 is an incredible display and I for one am glad that you are missing out on its goodness....because you are not.......worthy ROFL

It sure is incredible. Incredibly small.

 
It sure is incredible. But that is too rich for my blood and it depresses me to think about.


I disagree..... you should beat yourself up over it because you are missing out on the best display since the FW900
 
I disagree..... you should beat yourself up over it because you are missing out on the best display since the FW900

The FW900's screen height is greater than the X27's even though it's only 24". Pretty much says it all. I don't like monitors made for ants.
 
The 27" FALDs are the best monitors out right now, but that is only because HDMI 2.1 wasn't ready for 2018. If you want size, get an LG 4K HDR OLED with HDMI 2.1, native 4K 120 Hz, VRR + QFT in 2019, or a Samsung "QLED" 4K HDR VA LCD with native 4K 120 Hz, VRR and QFT. The smallest they go is 55", but if you have the room to rearrange your desk to sit further from the monitor, I see no problem there. In fact, it would allow me to run 21:9, 21:10, or even smaller 16:9/16:10 resolutions 1:1 for higher frame rates while still having a very large viewport/monitor.
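
For a sense of scale (my own geometry sketch, assuming a 55" 16:9 panel; not a figure from any review), a 3440x1440 window run 1:1 on a 55" 4K screen is still enormous:

```python
import math

# Physical size of a 3440x1440 window displayed 1:1 on a 55" 16:9 3840x2160 panel.
DIAG_IN = 55.0
PANEL_W_PX = 3840
WIN_W_PX, WIN_H_PX = 3440, 1440

panel_w_in = DIAG_IN * 16 / math.hypot(16, 9)   # panel width from diagonal + aspect ratio
pitch_in = panel_w_in / PANEL_W_PX              # inches per pixel

win_w_in = WIN_W_PX * pitch_in
win_h_in = WIN_H_PX * pitch_in
win_diag = math.hypot(win_w_in, win_h_in)
print(f'3440x1440 at 1:1 ~ {win_diag:.1f}" diagonal ({win_w_in:.1f}" x {win_h_in:.1f}")')
```

That comes out to roughly a 46-47" diagonal, far larger than the 34" ultrawide this thread is about.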

----------------------------------------------------------------

LG 4K HDR OLED with HDMI 2.1, native 4K 120 Hz, VRR + QFT in 2019

LG 2018 C8 60 Hz Rtings review

OLED's per-pixel emissive nature avoids FALD halos/glow and any screen uniformity issues, and it has an INFINITE:1 contrast ratio, which is amazing, but there is still the chance of burn-in over time.

  • Real scene HDR Brightness is very good, but still short of the 1000-4000 cd/m² HDR is mastered for. Large bright scenes are very dim due to the Automatic Brightness Limiter(ABL).
  • Black Level.. Infinite:1
  • The OLED55C8PUA has perfect black uniformity, with no clouding due to its ability to turn off black pixels.
  • Excellent color and white balance dE after calibration, better than the C7 and Samsung's Q9F. While the calibration out of the box was already very good, after calibration the colors were nearly perfect. Gamma follows our target almost perfectly.
  • The C8 has decent coverage of the P3 color space, but is unable to produce overly bright, saturated colors.
  • C8 displays our test gradient smoothly with no significant banding. In certain scenes there is some banding noticeable in large areas of similar color. This can be reduced by enabling 'MPEG Noise Reduction', which toggles the gradient smoothing feature of the C8. This reduces the visible banding but also results in a loss of fine detail.
  • OLED TVs such as the LG OLED C8 have an inherent risk of experiencing permanent image retention.
  • C8 handles motion extremely well. The near instantaneous response time is excellent for watching sports or playing video games, as there is no ghosting or trailing during fast motion. Also, there is no visible flicker since there is no traditional backlight on OLED TVs, unlike Samsung's QLED technology. One downside to OLED technology is that there is some stutter when playing low frame rate content, especially when watching movies or TV Shows.
  • Like all OLED TVs, there is no visible backlight flicker which helps motion appear smoother, but it does result in some persistence blur.
  • 4k @ 60Hz + HDR : 29.4 ms
  • 4k @ 60Hz @ 4:4:4 : 21.1 ms
  • 1080p @ 120Hz : 21.9 ms
  • Great choice for PC use. Image remains accurate when viewed at an angle so the sides of the screen are uniform. Supports chroma 4:4:4 for clear text across all backgrounds
  • The brightness of the screen changes depending on the content, and areas of static content may have a risk of burn-in.

An alternative to burn-in concerns would be whatever the Samsung Q8F series equivalent will be in 2019. They are HDR 1000 FALD VA TVs.
The high-end Samsung "QLED"s already support VRR/FreeSync on AMD GPUs and the Xbox One in their 2018 models; they just can't do native 4K 120 Hz input yet since there is no HDMI 2.1 circuitry in 2018 TVs.

Samsung Q8F (rtings review)
  • Excellent wide color gamut
  • Feels responsive due to low input lag
  • Great motion handling
  • The viewing angles are poor, so the sides of the screen lose accuracy when viewed from up close.
  • "Excellent contrast ratio on the Samsung Q8F. It features a full array local dimming feature and is able to get very deep blacks. 7957:1 "
  • "Very good brightness with HDR content. Small highlights are hitting the target 1000 cd/m² that HDR is mastered for. The screen brightness dips considerably with very bright scenes, but is still good for a bright room. Similar brightness to the LG C8, but with brighter highlights in very dark scenes, as shown by the small window tests."
  • "Excellent wide color gamut. The Q8FN can display nearly 100% of the P3 color space, and has the highest Rec.2020 coverage we have ever seen, although it is very close to the 2017 Q9F"
  • Update 06/08/2018: FreeSync has been tested and the score has been updated. FreeSync was supported from our Xbox One S and our Radeon RX 580 GPU, in 1080p, 1440p and 4k resolutions. FreeSync is activated by enabling the TV's Game mode and FreeSync settings
  • Excellent low input lag on the Samsung Q8FN QLED TV. Input lag is exceptionally low with 120 Hz content, similar to the NU8000, and better than the LG C8. It can display most resolutions without any issues, but chroma 4:4:4 is not supported in PC Mode with a 1440p@120Hz signal (Likely a bandwidth limitation that will be overcome with hdmi 2.1 models in 1440p and 4k 120hz)
  • 4k with Variable Refresh Rate : 15.4 ms
  • 4k @ 60Hz @ 4:4:4 + 8 bit HDR : 16.7 ms
  • 1080p with Variable Refresh Rate : 6.5 ms
  • 1440p @ 120 Hz: 10.0 ms
  • can also interpolate games while keeping a low input lag, which is great for smooth play. 4k interpolated: 20.8ms
  • Great choice for a PC monitor. Picture quality is good. The TV supports chroma 4:4:4 for clear text across all backgrounds, and it has low input lag so the TV feels very responsive. It also has a low response time
 
inherent risk of experiencing permanent image retention.
All the OLED TVs tested showed it.

Proven, and with a monitor... walk away and let's see how long that pretty screen lasts.
 
inherent risk of experiencing permanent image retention.
All the OLED TVs tested showed it.

Proven, and with a monitor... walk away and let's see how long that pretty screen lasts.

The second one I outlined, the Samsung QLED VA LCD line, is a safer bet. Some haloing (e.g. white text on a black background) compared to OLED, but they are feature rich: HDR 1000 nit, native contrast 6055:1, contrast with local dimming 19018:1, a huge color volume with near-perfect Rec. 2020 coverage, low input lag, variable refresh rate, 4:4:4 on the desktop, and they're only limited at the highest resolutions + Hz by not having HDMI 2.1. Input lag is low: 10 ms at 120 Hz, ~15-16 ms at 60 Hz, and 20.8 ms at 60 Hz even with interpolation on. The Samsung QLED Q8 model has 40 FALD zones, the Samsung QLED Q9 series 480 FALD zones. HDMI 2.1 and native 4K 120 Hz will be in next year's models.
 