Future 38" IPS panel 98% DCI P3 144Hz HDR 600

the Freesync version goes for 1200 while the ‘proper’ G-Sync version would go for ~1700

Based on uncited 'speculation'. Also remember that Freesync isn't 'free' from a hardware standpoint; Freesync 2, which seeks to close the gap with G-Sync somewhat, is even more expensive.

And so long as AMD maintains their steady two to three generation disadvantage, G-Sync + a top-end Nvidia GPU is that much better.
 
Based on uncited 'speculation'. Also remember that Freesync isn't 'free' from a hardware standpoint; Freesync 2, which seeks to close the gap with G-Sync somewhat, is even more expensive.

And so long as AMD maintains their steady two to three generation disadvantage, G-Sync + a top-end Nvidia GPU is that much better.
I don’t disagree with your assessment. I would pay it but I doubt it would be much of a commercial success with the price difference.
 
G-Sync Compatible might well explain the need for DP 1.4 to keep up with HDR/10-bit bandwidth and the 144/175 Hz refresh rate. Wonder what the range would be if it did have variable refresh.
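For reference, a rough back-of-envelope bandwidth check. Assumptions (mine, not from any spec sheet): the panel is the usual 3840x1600 38" ultrawide resolution, RGB 4:4:4 with no DSC, blanking overhead ignored (real figures land a few percent higher), and the usual usable payloads of ~17.28 Gbit/s for DP 1.2 HBR2 and ~25.92 Gbit/s for DP 1.4 HBR3:

```python
# Rough DisplayPort bandwidth check for a 38" ultrawide.
# Assumptions (mine, not from the thread): 3840x1600 resolution, RGB 4:4:4,
# no DSC, blanking overhead ignored (real figures land a few percent higher).
# Usable payload after 8b/10b coding: DP 1.2 HBR2 ~17.28 Gbit/s,
# DP 1.4 HBR3 ~25.92 Gbit/s.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed RGB video data rate in Gbit/s, blanking not included."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

DP12_HBR2 = 17.28
DP14_HBR3 = 25.92

for hz, bpc in [(144, 10), (144, 8), (175, 10), (175, 8)]:
    rate = data_rate_gbps(3840, 1600, hz, bpc)
    fits_12 = "yes" if rate <= DP12_HBR2 else "no"
    fits_14 = "yes" if rate <= DP14_HBR3 else "no"
    print(f"3840x1600 @ {hz} Hz, {bpc}-bit RGB: {rate:5.1f} Gbit/s "
          f"-> fits DP 1.2: {fits_12}, fits DP 1.4: {fits_14}")
```

Even before blanking, 10-bit at 144 Hz lands right at the HBR3 ceiling (~26.5 Gbit/s vs ~25.9 Gbit/s) and 175 Hz 10-bit is well past it, which would explain both why DP 1.4 is the minimum and why the top refresh rates may have to fall back to 8-bit or chroma subsampling.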
 
Hopefully they just had the brightness cranked to max for that video; the black levels were terrible.
 
It's said that for a while now. TFTCentral is now saying Q4.

EDIT: Dan from LG over on the OCUK forums said July/Aug, but we all know that in the display business that means Q4.
 
Hope the delay also means they do a Freesync-only version (or just ditch the G-Sync version completely and sell more at the lower price bracket, hint hint), as the 34" out right now is great and the best UW option so far.
 
Hope the delay also means they do a Freesync-only version (or just ditch the G-Sync version completely and sell more at the lower price bracket, hint hint), as the 34" out right now is great and the best UW option so far.

Only worry here is that 'Freesync version' is not specific enough to ensure that the included Freesync capability is as close to G-Sync capability as it can get. That's one of the benefits with G-Sync: you get the best VRR support without question. Sometimes worth paying for.
 
As long as the G-Sync module doesn’t need a fan, I would rather have it than a Freesync version.
 
As long as the G-Sync module doesn’t need a fan, I would rather have it than a Freesync version.

Yeah, that's a goofy one. It's clear that there's an economy-of-scale delta that is pushing Nvidia toward using expensive and power-hungry FPGAs as opposed to ASICs for G-Sync. Can't argue with the results, though.
 
Yeah, that's a goofy one. It's clear that there's an economy-of-scale delta that is pushing Nvidia toward using expensive and power-hungry FPGAs as opposed to ASICs for G-Sync. Can't argue with the results, though.
Could you explain more? There are multiple types of Gsync chips?
 
Could you explain more? There are multiple types of Gsync chips?

At least two or three: there might have been a revision of the first version, between the modules that users could retrofit into a monitor and the monitors that then shipped with G-Sync built in, and then there are the ones with the HDR FALD capability. The first two might be the same. Could have been more, but functionality-wise there have only been two.
 
Giant company designing multiple N-billion transistor chips continues to peddle custom FPGAs for consumer products many years later. Sums up how much they believe in their own bullshit while trying to pawn it off to monitor OEMs.

Also, on the 34GK950, the F version is now objectively & technically superior to the G version, so at least there is precedent for LG doing it right.
 
That LG monitor has had many complaints as far as quality control goes. I had some flickering and had to return mine. There’s a reason why it’s not in stock for multiple vendors.
 
Sums up how much they believe in their own bullshit while trying to pawn it off to monitor OEMs.

It's custom. Custom isn't cheap, and you can bet that if they could reduce costs and increase margins (or just sales, period), they'd do it.

Also, on the 34GK950, the F version is now objectively & technically superior to the G version

I'm not familiar with the models, do you have references that explain the disparity?
 
Custom is bullshit. An ASIC company using FPGAs for years in consumer products is laughable. They never believed in their own custom smoke and always expected to give in to the actual standard when it was convenient. They are throwing billions at smart car chips that are still shipping in nothing; that they believe in.

Same glorious LG DCI-P3 98% panel, not the older LG panel that people still fawn over because it has an Alienware™ stamp:
http://www.tftcentral.co.uk/reviews/lg_34gk950f.htm 144Hz, 10-bit input, no FPGA, no RGB frag-harder disco light.
http://www.tftcentral.co.uk/reviews/lg_34gk950g.htm 120Hz, 8-bit input, FPGA, RGB frag-harder disco light.

Custom FPGA literally cripples it.
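
For what it's worth, the link-budget arithmetic lines up with that spec split. A rough sketch, assuming the G version carries the original DP 1.2 (HBR2) G-Sync module, RGB 4:4:4, and ignoring blanking overhead:

```python
# Back-of-envelope link budget for the 34GK950 F vs G spec split.
# Assumptions (mine): RGB 4:4:4, blanking ignored, original G-Sync module
# limited to DP 1.2 / HBR2 (~17.28 Gbit/s usable payload), while DP 1.4 /
# HBR3 offers ~25.92 Gbit/s usable.

def gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed RGB video data rate in Gbit/s (no blanking)."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

HBR2, HBR3 = 17.28, 25.92  # usable Gbit/s after 8b/10b coding

modes = [
    ("F version: 3440x1440 @ 144 Hz, 10-bit", gbps(3440, 1440, 144, 10)),
    ("G version: 3440x1440 @ 120 Hz,  8-bit", gbps(3440, 1440, 120, 8)),
]
for name, rate in modes:
    print(f"{name}: {rate:4.1f} Gbit/s "
          f"(HBR2 {'ok' if rate <= HBR2 else 'exceeded'}, "
          f"HBR3 {'ok' if rate <= HBR3 else 'exceeded'})")
```

Roughly 21 Gbit/s for 144 Hz 10-bit simply doesn't fit in an HBR2 link, while 120 Hz 8-bit does, so under that assumption the G version's cap follows from the older module's DP 1.2 ceiling rather than from the panel.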
 
Custom is bullshit. An ASIC company using FPGAs for years in consumer products is laughable.

Nvidia makes money. No reason to spend on an ASIC when an FPGA is cheaper at a given volume. Expecting them to spend unnecessarily is 'laughable'.

They never believed in their own custom smoke and always expected to give in to the actual standard when it was convenient.

Citation needed. They're still shipping G-Sync solutions.

Custom FPGA literally cripples it.

This is on LG- the current G-Sync modules support everything that the panel is capable of.
 
This is on LG- the current G-Sync modules support everything that the panel is capable of.
I don't think that's correct. I think that you're assuming that the "current G-Sync modules" are all the G-Sync Ultimate modules. I believe that the LG is using the standard G-Sync module (the OG one, still "current") because the panel doesn't support everything necessary for the "G-Sync Ultimate" experience, like HDR1000.
 
I don't think that's correct. I think that you're assuming that the "current G-Sync modules" are all the G-Sync Ultimate modules. I believe that the LG is using the standard G-Sync module (the OG one, still "current") because the panel doesn't support everything necessary for the "G-Sync Ultimate" experience, like HDR1000.

They could use that module if they chose to fully support the panel, at the very least, and that assumes that there's not another option. Still LG's choice here.
 
They could use that module if they chose to fully support the panel, at the very least, and that assumes that there's not another option. Still LG's choice here.
Which G-Sync module could LG use that supports 3440x1440 at 144 Hz and HDR400 or HDR600?
 
...if it's going to be a G-Sync Ultimate display.
It's not an HDR1000 display...
Therefore it does not meet G-Sync Ultimate requirements from Nvidia...
Therefore it cannot use the G-Sync Ultimate module...
Therefore it must employ FreeSync, FreeSync 2, or G-Sync standard...

G-Sync standard module does not support HDR (or 3440x1440 above 120Hz, IIRC)

Therefore this is a limitation of Nvidia and their ASIC, not of LG's implementation of G-Sync for this panel.

Not trying to argue with you; these are Nvidia's rules as I understand them.
 
Therefore it does not meet G-Sync Ultimate requirements from Nvidia...
Therefore it cannot use the G-Sync Ultimate module...

This logic doesn't follow. The module presents a set of capabilities. G-Sync Ultimate is a product labeling / branding thing. LG could easily use the module for its expanded capabilities beyond the standard module without labeling and marketing the monitor as 'G-Sync Ultimate'. And again that assumes that an 'in between' module doesn't exist.
 
This logic doesn't follow. The module presents a set of capabilities. G-Sync Ultimate is a product labeling / branding thing. LG could easily use the module for its expanded capabilities beyond the standard module without labeling and marketing the monitor as 'G-Sync Ultimate'. And again that assumes that an 'in between' module doesn't exist.
I don't believe that is correct. From what I recall and understand (and I could totally be wrong here), Nvidia won't sell the G-Sync Ultimate module for a panel that doesn't meet the G-Sync Ultimate requirements (like HDR1000). This module is very expensive, rumored to be around $500, and wouldn't make sense in a mid-range monitor anyway. There is no in-between module. (Though specs of unreleased G-Sync "standard" monitors with HDMI 2.0 ports suggest that maybe a refresh is coming - but these don't show HDR support.)

When LG releases two monitors using the same panel and the G-Sync version doesn't support HDR and >120Hz while the FreeSync version does, do you just assume that it's LG's mistake in its implementation of G-Sync? There are many, many FreeSync 2 HDR monitors. How many G-Sync HDR monitors are there that aren't "G-Sync Ultimate" certified at HDR1000? None that I'm aware of. I have a hard time concluding that it's the monitor manufacturer's shortcoming and not an Nvidia/ASIC limitation as you have done.
 
I have a hard time concluding that it's the monitor manufacturer's shortcoming and not an Nvidia/ASIC limitation as you have done.

Because of this assumption:

From what I recall and understand (and I could totally be wrong here), Nvidia won't sell the G-Sync Ultimate module for a panel that doesn't meet the G-Sync Ultimate requirements (like HDR1000). This module is very expensive, rumored to be around $500, and wouldn't make sense in a mid-range monitor anyway.

And I don't see a 34" 144Hz 10-bit panel as 'mid-range'; that's definitely high-end.

I'm not really trying to be argumentative- it seems that LG could have used a better part to fully support their panel. At the same time any of the reasons you've brought up (cost, branding) could have been involved, but there's no technical reason for the lack of support.

What's really hurting G-Sync is the economy of scale involved. By requiring fewer monitor-side features, Freesync rides existing economies of scale and becomes an 'affordable' value add for monitor ASIC producers.
 
Here, this is where I read it:

https://www.anandtech.com/show/13060/asus-pg27uq-gsync-hdr-review/2

While there are specific G-Sync HDR standards as part of their G-Sync certification process, those specifications are only known to NVIDIA and the manufacturers. Nor was much detail provided on minimum requirements outside of HDR10 support, peak 1000 nits brightness, and unspecified coverage of DCI-P3

That link references the Nvidia G-Sync HDR whitepaper.

This is where I concluded that Nvidia won't let somebody release a G-Sync display with HDR support that isn't "Ultimate". The fact that none exists, and that LG was forced to nerf this display in the G-Sync version, seems to support that conclusion.
 
This is where I concluded that Nvidia won't let somebody release a G-Sync display with HDR support that isn't "Ultimate". The fact that none exists, and that LG was forced to nerf this display in the G-Sync version, seems to support that conclusion.

It's a guess, and it may be right, but I don't see anything to support the idea that they wouldn't sell the module for use in a non-Ultimate display. I get that earning the certification is a hard requirement, but the part is just a part.
 
It's a guess, and it may be right, but I don't see anything to support the idea that they wouldn't sell the module for use in a non-Ultimate display. I get that earning the certification is a hard requirement, but the part is just a part.
I sort of thought the same, so I'm not sure why we haven't seen any. With Freesync support on Nvidia GPUs, it may not matter anymore.
 
I sort of thought the same, so I'm not sure why we haven't seen any. With Freesync support on Nvidia GPUs, it may not matter anymore.

G-Sync is going to be a hard sell going forward. I have two G-Sync displays (and one FreeSync) and I can't complain, but I realize that Freesync 2 tightening the standard, plus Nvidia enabling it over DisplayPort and maybe HDMI on future generations, means G-Sync will have to be price-competitive to sell- and that's not going to happen even if G-Sync hardware is the same price as Freesync hardware. Economy of scale and all. We'd need to see something radical, like G-Sync support from typical ASIC makers that would likely also support Freesync, or a lot of work on Nvidia's side to lessen the burden on display manufacturers.
 
There is only one DP 1.4 G-Sync module; it uses the Intel Altera Arria 10 GX480 FPGA. A monitor doesn't need to have HDR, let alone HDR1000, to use this G-Sync module. NVIDIA simply doesn't use the FALD backlight channel that the FPGA has on a monitor like the LG 38GL950G.

Not all DP 1.4 G-Sync monitors, now or in the future, will be HDR1000 "Ultimate". The new 27" 4K 144 Hz monitors that are the FALD-less (and basically HDR-less) versions of the PG27UQ and X27 use the same DP 1.4 G-Sync module/FPGA, fans to cool it, etc.
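
As an aside, the same back-of-envelope bandwidth math shows why 4K 144 Hz is tight even on that DP 1.4 module. A rough sketch, ignoring blanking and assuming ~25.92 Gbit/s of usable HBR3 payload:

```python
# Why 4K 144 Hz pushes past DP 1.4 without subsampling (rough numbers,
# blanking ignored; HBR3 usable payload assumed ~25.92 Gbit/s).

def gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

HBR3 = 25.92
cases = [
    ("3840x2160 @  98 Hz, 10-bit 4:4:4", gbps(3840, 2160, 98, 30)),
    ("3840x2160 @ 120 Hz,  8-bit 4:4:4", gbps(3840, 2160, 120, 24)),
    ("3840x2160 @ 144 Hz, 10-bit 4:4:4", gbps(3840, 2160, 144, 30)),
    ("3840x2160 @ 144 Hz, 10-bit 4:2:2", gbps(3840, 2160, 144, 20)),
]
for name, rate in cases:
    verdict = "fits" if rate <= HBR3 else "exceeds"
    print(f"{name}: {rate:4.1f} Gbit/s ({verdict} HBR3)")
```

That squeeze is why the PG27UQ-class monitors drop to 8-bit or 4:2:2 at their highest refresh rates, independent of whether FALD/HDR1000 is present.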
 