LG 48CX

Yeah, every day I check Best Buy and the date keeps getting pushed back further and further, so I haven't even bothered to place an order yet. Currently it's sitting at July 21st if I order today... and what happened to all the people who claimed that they ordered one when it was listed at $1,499 and would receive it in a matter of days? All those people should've received it by now, yet there's been little update, unless of course they all got delayed too and the original dates were never correct.

Yeah, Best Buy has been strange. I put it in my cart when it first released and it showed a date in July. A couple days later I checked back and it showed available for pickup "same day" in a nearby city, with the delivery option only 3 days out. I decided against the purchase, waiting on a drop to $1,399 or so (given the expense of burn-in protection and a $200 mount) and some kind of firmware update re: VRR/black levels. Now it's showing a late July date again.

I'm beginning to wonder if almost any of the early BB people are getting their TVs this month.
 
First time using a TV as a monitor... but is it "normal" to have a TINY bit of overscan in certain games intermittently? And I mean tiny... like 1 mm shaved off the right side of the screen that goes away after turning the TV off and on, and even then it doesn't always show up. I have the latest firmware and the HDMI port is set to "PC" with Just Scan set to on.

And what is considered the best image quality settings for 4k60hz? 4:4:4 chroma and 8 bit?

I think you're describing the pixel orbiter to help prevent burn in. It periodically shifts the whole screen up/down/left/right a pixel or two.
 
Is the drop in chroma noticeable?
Not in graphics (4K Blu-ray uses 4:2:0, for example; color resolution is the one tradeoff humans are not very sensitive to). Very noticeable in text though, especially pixel-thin colored text, which gets really distorted.
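To picture why, here's a tiny, purely illustrative NumPy sketch (my own toy example, not from any real pipeline): 4:2:0 keeps luma at full resolution but stores only one chroma sample per 2x2 block, so a one-pixel-wide colored stroke gets smeared over two pixels at half intensity once the chroma is scaled back up.

```python
import numpy as np

# Toy illustration of 4:2:0 chroma subsampling on a thin colored stroke.
# Real pipelines convert RGB -> YCbCr and only subsample the chroma planes;
# here we just operate on a stand-in "chroma" channel to show the smearing.

chroma = np.zeros((4, 8))
chroma[:, 3] = 1.0   # a 1-pixel-wide vertical colored stroke

# 4:2:0 keeps one chroma sample per 2x2 block (the average of the block)...
subsampled = chroma.reshape(2, 2, 4, 2).mean(axis=(1, 3))

# ...and the decoder/display upscales it back to full resolution.
rebuilt = subsampled.repeat(2, axis=0).repeat(2, axis=1)

print(rebuilt)
# The stroke is now 2 px wide at half intensity: exactly the fringing you see
# on pixel-thin colored text, while full-resolution luma keeps plain
# black-on-white text looking mostly fine.
```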
 
I didn't realize that the C9 does not have 4K @ 120Hz @ 4:2:0 ...... This is only a feature for the CX.

It certainly does; it's the same panel as the CX. You need HDMI 2.1 to have enough bandwidth for any form of 4k120 though.

I should be able to get 4:4:4 @ 4K @ 120Hz once I install my Nvidia 3090 this Sept.

If NVIDIA allows it. For some odd reason their current cards don't support 10-bit output @ 4:4:4, and HDMI 2.1 doesn't have the bandwidth for 4k120 12-bit 4:4:4, so if NVIDIA doesn't add (really unlock) said feature you'll have to drop to 4:2:2.
 
It certainly does; it's the same panel as the CX. You need HDMI 2.1 to have enough bandwidth for any form of 4k120 though.

If NVIDIA allows it. For some odd reason their current cards don't support 10-bit output @ 4:4:4, and HDMI 2.1 doesn't have the bandwidth for 4k120 12-bit 4:4:4, so if NVIDIA doesn't add (really unlock) said feature you'll have to drop to 4:2:2.

Small correction to your post, maybe: HDMI 2.1 does have enough bandwidth for 4K120 12-bit 4:4:4 HDR at 48 Gbps. It's just that LG decided to nerf the 2020 CX line and make its HDMI 2.1 ports 40 Gbps instead of 48 Gbps, so you're stuck with 4K120 10-bit 4:4:4 HDR (which currently is not supported on Nvidia RTX gaming GPUs, since they output 8-bit or 12-bit only; so you'll be stuck with 8-bit on the CX, or the better 12-bit on the C9, unless Nvidia changes the software/hardware on the RTX 3000 series to allow 10-bit).
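For anyone who wants the rough math behind those figures, here's a quick back-of-the-envelope sketch (raw active-pixel rates only; real HDMI timings add blanking plus FRL coding overhead of very roughly 10-20%, which is what pushes 12-bit 4:4:4 past a 40 Gbps port while 10-bit still squeezes in):

```python
# Back-of-the-envelope data rates for 4K120 modes (active pixels only).
# Blanking intervals and HDMI FRL coding overhead add very roughly 10-20%
# on top of these, so treat the numbers as ballpark, not exact.

WIDTH, HEIGHT, HZ = 3840, 2160, 120

def raw_gbps(bits_per_component, samples_per_pixel=3.0):
    """Raw pixel data rate in Gbit/s for the given bit depth and sampling."""
    return WIDTH * HEIGHT * HZ * bits_per_component * samples_per_pixel / 1e9

print("4:4:4 12-bit:", round(raw_gbps(12), 1), "Gbps")        # ~35.8 -> needs the full 48G link
print("4:4:4 10-bit:", round(raw_gbps(10), 1), "Gbps")        # ~29.9 -> fits a 40G (CX) link
print("4:4:4  8-bit:", round(raw_gbps(8), 1), "Gbps")         # ~23.9
print("4:2:2 12-bit:", round(raw_gbps(12, 2.0), 1), "Gbps")   # chroma halved horizontally
print("4:2:0 12-bit:", round(raw_gbps(12, 1.5), 1), "Gbps")   # chroma quartered
```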
 
Small correction to your post, maybe: HDMI 2.1 does have enough bandwidth for 4K120 12-bit 4:4:4 HDR at 48 Gbps. It's just that LG decided to nerf the 2020 CX line and make its HDMI 2.1 ports 40 Gbps instead of 48 Gbps, so you're stuck with 4K120 10-bit 4:4:4 HDR (which currently is not supported on Nvidia RTX gaming GPUs, since they output 8-bit or 12-bit only; so you'll be stuck with 8-bit on the CX, or the better 12-bit on the C9, unless Nvidia changes the software/hardware on the RTX 3000 series to allow 10-bit).

Yeah, I forgot about that. If NVIDIA supports the full 48 Gbps then the C9 actually ends up being the better purchase. That being said, there are rumors that the new consoles may only support up to 40 Gbps, and the GPU makers may follow suit...
 
I was one who was able to purchase the LG 48CX from Best Buy when it was listed at $1,499. I purchased it for console gaming and received it this past Monday. TV is absolutely incredible.
 
Small correction to your post, maybe: HDMI 2.1 does have enough bandwidth for 4K120 12-bit 4:4:4 HDR at 48 Gbps. It's just that LG decided to nerf the 2020 CX line and make its HDMI 2.1 ports 40 Gbps instead of 48 Gbps, so you're stuck with 4K120 10-bit 4:4:4 HDR (which currently is not supported on Nvidia RTX gaming GPUs, since they output 8-bit or 12-bit only; so you'll be stuck with 8-bit on the CX, or the better 12-bit on the C9, unless Nvidia changes the software/hardware on the RTX 3000 series to allow 10-bit).

The LG OLEDs are 10-bit panels. While sending 12-bit signal might allow for more accurate internal processing, I would assume that the end result would be impossible to tell apart. It gets converted to 10-bit color in the end no matter what.

Really the only concern is Nvidia fixing their shit so that 10-bit color over HDMI works correctly. It should be a no-brainer when 12-bit color works just fine and they already support 10-bit color with Displayport.

I dislike Nvidia's bullshit with 10-bit color in the first place by requiring crap like the rarely updated Studio drivers if you want 10-bit in pro apps like the Adobe suite. It's such a stupid thing to gate behind a specific driver.
 
I doubt Nvidia GPUs will cut off at 40 Gbps, since they are usually similar to the Quadros and historically could be flashed to work as one. The RTX 2000 series can now install Studio drivers for 10-bit DisplayPort output, and later generations will probably have DP 2.0 as well. AMD GPUs already support 10-bit out over HDMI. Consoles capped at 40 Gbps could happen, but it won't matter much since they should output 10-bit for HDR and they are based on AMD GPUs.

Nvidia does support 10-bit out via DisplayPort... they just choose not to support 10-bit out on the HDMI ports of gaming GPUs for some reason, most likely to market their more expensive Studio/Quadro graphics cards.

https://www.videomaker.com/news/nvidia-geforce-rtx-cards-now-support-10-bit-color-in-adobe/ (july 2019)
"GeForce RTX on par with Quadro RTX after 10-bit color in Adobe update
It looks like this brings the GeForce RTX closer to the standard of the Quadro RTX. It may even be a better bargain for someone that wants to save some cash; it’s half the price of the Quadro RTX. Though, you still won’t get the ECC-enabled memory for mission-critical applications."


As far as I know, the studio drivers still only allow 10bit over displayport, not hdmi. Correct me if I'm wrong.

The LG OLEDs are 10-bit panels. While sending 12-bit signal might allow for more accurate internal processing, I would assume that the end result would be impossible to tell apart. It gets converted to 10-bit color in the end no matter what.

Really the only concern is Nvidia fixing their shit so that 10-bit color over HDMI works correctly. It should be a no-brainer when 12-bit color works just fine and they already support 10-bit color with Displayport.

I dislike Nvidia's bullshit with 10-bit color in the first place by requiring crap like the rarely updated Studio drivers if you want 10-bit in pro apps like the Adobe suite. It's such a stupid thing to gate behind a specific driver.

In fact, if you read the AVSForum threads on LG OLEDs, some people claim there is some loss in processing 10-bit or 12-bit signals, so using a display capable of accepting a 12-bit signal, even on a 10-bit panel, will potentially get a very slightly better picture, kind of like downsampling from a higher resolution (a higher color resolution in this case).
 
In fact, if you read the AVSForum threads on LG OLEDs, some people claim there is some loss in processing 10-bit or 12-bit signals, so using a display capable of accepting a 12-bit signal, even on a 10-bit panel, will potentially get a very slightly better picture, kind of like downsampling from a higher resolution (a higher color resolution in this case).
This sounds like the US$50,000 speaker cable argument... with the main problem being someone with too much money and not enough sense used a cheap cable, experienced a problem, and decided to throw money at it.
 
They are talking about loss in processing of the source signal to be clear, not cable signal loss. But it's probably negligible.


The LG OLEDs are 10-bit panels. While sending 12-bit signal might allow for more accurate internal processing, I would assume that the end result would be impossible to tell apart. It gets converted to 10-bit color in the end no matter what.

Really the only concern is Nvidia fixing their shit so that 10-bit color over HDMI works correctly. It should be a no-brainer when 12-bit color works just fine and they already support 10-bit color with Displayport.

I dislike Nvidia's bullshit with 10-bit color in the first place by requiring crap like the rarely updated Studio drivers if you want 10-bit in pro apps like the Adobe suite. It's such a stupid thing to gate behind a specific driver.
 
MC warranties are the best.

Absolutely and AMEN!

I have both an extended warranty on my 2080 Ti and new LG OLED 55" C9.

btw for those of you on the fence, Microcenter carries a refurb'd 55" and 65" C9 OLED. The 55" is $999... before you guys say "ewwww", read the slickdeals.net forums on this deal. These displays come directly from LG, and once you register the TV with LG you get the full 12-month warranty, which also covers burn-in. They will basically replace your TV. Also, it's a new set from top to bottom. The remote is new, and the TV and screen are new. My display even had the weird, very thin factory plastic you have to remove. It was in perfect new condition. You can verify everything I've just said if you want to hop over to the slickdeals.net forums that originally posted this deal.

Microcenter also offers an extended warranty for $129 that covers 24 months and allows you to return the TV if there are "any" issues with your set. It's a win/win.
 
Also, it doesn't matter about LG nerfing HDMI 2.1. Anyone can purchase this Club 3D adapter within the next week or two.

https://www.club-3d.com/en/detail/2496/displayport_1.4_to_hdmi_4k120hz_hdr_active_adapter_m-f/

Also: https://www.techpowerup.com/forums/...-displayport-to-hdmi-4k-120hz-adapter.268589/

I'm no expert, but I've read that the DP protocol has built-in data compression (DP1.4 DSC video compression technology), which means the available bandwidth on your C9/CX will be sufficient.

I've also read that there should be no latency introduced from using this adapter.
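For what it's worth, a rough sketch of why the adapter leans on DSC (approximate figures, my own arithmetic): DP 1.4 carries only about 25.92 Gbps of usable payload, which is short of uncompressed 4K120 10-bit 4:4:4, but DSC at up to ~3:1 brings the signal comfortably under that limit.

```python
# Rough check on why the CAC-1085 needs DSC (approximate figures only).
# DP 1.4 HBR3: 4 lanes x 8.1 Gbps = 32.4 Gbps raw, ~25.92 Gbps after 8b/10b coding.
DP14_PAYLOAD_GBPS = 25.92

def raw_gbps(w, h, hz, bits_per_component, samples_per_pixel=3):
    return w * h * hz * bits_per_component * samples_per_pixel / 1e9

uncompressed = raw_gbps(3840, 2160, 120, 10)   # ~29.9 Gbps before blanking overhead
with_dsc_3to1 = uncompressed / 3               # DSC at its maximum ~3:1 ratio

print(f"4K120 10-bit 4:4:4 uncompressed: ~{uncompressed:.1f} Gbps (> {DP14_PAYLOAD_GBPS} Gbps, won't fit)")
print(f"Same signal with ~3:1 DSC:       ~{with_dsc_3to1:.1f} Gbps (fits easily)")
```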






DisplayPort 1.4 to HDMI 4K120Hz HDR Active Adapter M/F



The Club 3D CAC-1085 is the perfect solution to connect to any HDMI™ 4K120Hz ready displays*. If you have a DisplayPort™ 1.4 ready PC or any other device that lacks the new HDMI™ 4K120Hz specification, the Club3D CAC-1085 will be the simple way to upgrade your device and connect to your new TV*.


With its DP1.4 DSC video compression technology, this adapter is able to convert DP1.4 video signals to HDMI™ 2.1, supporting video display resolutions up to 8K (7680 x 4320) @ 60Hz** and creating lifelike colors and movement with HDR, giving users the ultimate visual experience. The adapter is powered through a USB Type-C to USB Type-A cable (provided with the product).

* Please update your TV Firmware to the version which supports these resolutions/refresh rates !
** Please update your graphics drivers on your PC and make sure that DSC 1.2 is supported on your devices to support these resolutions/refresh rates!

Please use one of our Extension/Adapter cables to connect to your devices:
In case you need assistance to choose the correct cable, please visit our
website www.club-3d.com or feel free to mail us at [email protected] and it will be our pleasure to assist you.
 
I prefer a native, 1:1 signal in both video and audio, avoiding compression whenever possible, but that does seem like a useful device for now, and perhaps for other sources on pre-2000/3000-series GPUs. Question is, how much does it cost?
 
Yeah, every day I check Best Buy and the date keeps getting pushed back further and further, so I haven't even bothered to place an order yet. Currently it's sitting at July 21st if I order today... and what happened to all the people who claimed that they ordered one when it was listed at $1,499 and would receive it in a matter of days? All those people should've received it by now, yet there's been little update, unless of course they all got delayed too and the original dates were never correct.
My 48" arrived this past Sunday. Setting up this weekend.
 
I prefer a native, 1:1 signal in both video and audio, avoiding compression whenever possible, but that does seem like a useful device for now, and perhaps for other sources on pre-2000/3000-series GPUs. Question is, how much does it cost?

I think you are thinking of a different kind of compression, and/or different results of compression. Quality is not affected at all. In fact, if you've ever used DP, chances are the signal was compressed.
 
I think you are thinking of a different kind of compression, and/or different results of compression. Quality is not affected at all. In fact, if you've ever used DP, chances are the signal was compressed.
It's up to 3:1 compression, so it's not the native signal anymore. While testing with ordinary people results in reports of "virtually lossless," it's still another watered-down version of the original signal, however slightly. DSC, 8-bit dithered on a 10-bit panel, 4:2:2, 4:2:0, non-native resolution: they are all usable, and some are worse than others, but they aren't 1:1 native and uncompressed anymore. The same goes for audio compression. Where I have a choice I'll always shoot for 1:1 native, uncompressed in both video and audio.

It still sounds like a worthwhile device (price depending) considering the wait on the 3000 series and using previous generations of gpus.
 
I think it's around $65, but I'm not 100% sure on the currency conversion of the price I saw. I also read that it's shipping around the 4th or 5th of July.
 
It's up to 3:1 compression, so it's not the native signal anymore. While testing with ordinary people results in reports of "virtually lossless," it's still another watered-down version of the original signal, however slightly. DSC, 8-bit dithered on a 10-bit panel, 4:2:2, 4:2:0, non-native resolution: they are all usable, and some are worse than others, but they aren't 1:1 native and uncompressed anymore. The same goes for audio compression. Where I have a choice I'll always shoot for 1:1 native, uncompressed in both video and audio.

It still sounds like a worthwhile device (price depending) considering the wait on the 3000 series and using previous generations of gpus.

Compression, yeah, but what if you really can't tell an obvious difference, like with chroma subsampling? I say DSC is totally fine if you cannot actually tell a difference at all, including tests with up-close, zoomed-in shots. If it's really that good and you're still somehow against it regardless, well, I guess you must also be one of those people who swear that audio files must be in FLAC and that 320kbps MP3 is totally unacceptable? DLSS 2.0 must also be a no-go for you since it's no longer 1:1 pixels being rendered. And what about the frame rate interpolation that you keep bringing up to reach 1000Hz? Not a true 1000 frames now, is it? 😉
 
Last I read the Club3d adapter wouldn't support any form of VRR.

So far that is pretty much the only known issue using it. Club3D has suggested that it might be possible to fix with a firmware update to the adapter, but that is not guaranteed to happen.

It's up to 3:1 compression, so it's not the native signal anymore. While testing with ordinary people results in reports of "virtually lossless," it's still another watered-down version of the original signal, however slightly. DSC, 8-bit dithered on a 10-bit panel, 4:2:2, 4:2:0, non-native resolution: they are all usable, and some are worse than others, but they aren't 1:1 native and uncompressed anymore. The same goes for audio compression. Where I have a choice I'll always shoot for 1:1 native, uncompressed in both video and audio.

It still sounds like a worthwhile device (price depending) considering the wait on the 3000 series and using previous generations of gpus.

As always it's what compromises you are willing to accept. To me 4K 120 Hz with DSC over DP sounds like a much better idea than being stuck at 60 Hz for custom resolutions and desktop. You can just hook up two inputs if you want to use say 4K 120 Hz 8-bit 4:2:0 or 1440p 120 Hz with VRR. Then you can use the TV with and without the adapter simply by changing inputs.

People already have a hard time telling the difference between 8- and 10-bit, chroma subsampling etc unless you look at very specific things (gradients, text rendering). If DSC has some noticeable image quality reduction then it would have been mentioned already in reviews of the few displays using DSC. Or maybe we just don't know where to look yet.
 
For a mix of SDR (mostly) and HDR gaming at 4K 60Hz until HDMI 2.1 GPUs come out... what's the general consensus on RGB 8-bit Full vs. 4:2:2 10/12-bit Limited? Lots of conflicting descriptions out there, but these two settings seem to be the most often recommended.
 
Yeah, every day I check Best Buy and the date keeps getting pushed back further and further, so I haven't even bothered to place an order yet. Currently it's sitting at July 21st if I order today... and what happened to all the people who claimed that they ordered one when it was listed at $1,499 and would receive it in a matter of days? All those people should've received it by now, yet there's been little update, unless of course they all got delayed too and the original dates were never correct.

I think the date just gets pushed back by a day every day. It means nothing, though. When I ordered mine 2 days ago it said it would arrive July 13th. But then this morning I got an email that it shipped and will arrive on Monday, June 29th. I'm not sure why they are vague on the date, but they do seem to have them and are shipping. Can't wait to get mine!
 
Last I read the Club3d adapter wouldn't support any form of VRR.

Not a big deal really; who cares if you drop or dupe a frame? I mean, when you're in the 90fps/100fps to 120fps range (hopefully), depending on the game of course and how many bells and whistles you want on screen at 4K 120Hz, I don't see a problem with that.

I just recently let my Samsung 55" NU8000 set go and I didn't have VRR on it and I never ever saw much of a difference in a negative way.

I will say, gsync is a bit smoother on the LG C9 @ 1440p @ 120hz

But, no, you're correct. This adapter doesn't appear to support VRR.

Besides, this is a temp stop gap measure for all interested parties I would imagine. Two things are going to happen eventually. People are going to get the new Radeon that supports HDMI 2.1 or a new nVidia GPU that supports HDMI 2.1 and use them on these LG's this year or next. It's going to happen.

Regardless of how any one company would hope to control what end users can or cannot do, it's going to happen this year or next.

I suspect that Samsung, Sony, and LG will be at the top when it comes to VRR, 120Hz @ 4K, and low latency, along with maybe other possible players. The future looks bright.

I also wonder if makers of typical desktop monitors, Acer and others, intend to enter the larger display market to cash in.

I am positive other manufacturers are paying attention to this 48" CX with G-Sync.
 
Compression, yeah, but what if you really can't tell an obvious difference, like with chroma subsampling? I say DSC is totally fine if you cannot actually tell a difference at all, including tests with up-close, zoomed-in shots. If it's really that good and you're still somehow against it regardless, well, I guess you must also be one of those people who swear that audio files must be in FLAC and that 320kbps MP3 is totally unacceptable? DLSS 2.0 must also be a no-go for you since it's no longer 1:1 pixels being rendered. And what about the frame rate interpolation that you keep bringing up to reach 1000Hz? Not a true 1000 frames now, is it? 😉

Interpolation can be repeated frames rather than imagined in-between frames, so it would depend on how it's implemented and what base frame rate you are starting from. For example, some VR tech cuts anything sub-90fps to a solid 45fps and doubles it. That differs from "time-warp" guesstimation methods and quasi-manufactured "tween" frames. A solid frame rate of, say, 100fps could potentially be multiplied 3x or even 10x for current and future high-Hz displays (a real 100fps would be a good base rate for non-quasi frame motion definition to be multiplied from).
Interpolation/multiplication of frames in this scenario would be to reduce/eliminate blur, that is, to increase motion clarity without having to use BFI, as opposed to the increased-motion-definition type of interpolation usually used on 24fps or 60fps content, which manufactures in-between frame states/positions since that content's motion definition, pathing articulation, etc. are so low to start with.
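As rough arithmetic for that idea (my own sketch, assuming every displayed frame actually differs, whether rendered, interpolated, or reprojected, since merely repeating identical frames doesn't shorten how long any one image sits on a sample-and-hold screen):

```python
# Sample-and-hold persistence: each displayed frame is held for 1/refresh
# seconds, and perceived motion blur scales with that hold time. Multiplying
# a solid base frame rate (with frames that actually differ) shortens the
# hold without the brightness penalty of BFI.

for base_fps, multiplier in [(100, 1), (100, 3), (100, 10)]:
    shown_hz = base_fps * multiplier
    persistence_ms = 1000 / shown_hz
    print(f"{base_fps} fps base x{multiplier}: {shown_hz} Hz shown, "
          f"~{persistence_ms:.1f} ms per displayed frame")
```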

DLSS 2.0 supposedly can make the result look BETTER than native. It has to know the game already and fills in extra data from known renders, sort of, so again it's not compressing the data; it's actually ADDING data to a lower-resolution signal, and so much resolution/scene data that it reportedly can look better than native resolution without DLSS 2.0 (kind of like supersampling a higher resolution down to your native resolution). DLSS 2.0 also has its own AA, so it probably saves a lot of frame rate there too, incidentally.

I do prefer FLAC though where I can get it and I do have a library of FLAC. :ROFLMAO:
Like I said, where I have the choice I'll always prefer native 1:1 and uncompressed original source material. That doesn't mean DSC, if a club3D adapter supports it, wouldn't look decent and be playable or that I never use compressed material (e.g. streaming services like netflix, amazon). Most likely DSC and 8bit dithered are both top tier "nearly native" results compared to other compromises like 4:2:2, 4:2:0, or running a lower non native resolution.
 
Not a big deal really; who cares if you drop or dupe a frame? I mean, when you're in the 90fps/100fps to 120fps range (hopefully), depending on the game of course and how many bells and whistles you want on screen at 4K 120Hz, I don't see a problem with that.

I just recently let my Samsung 55" NU8000 set go and I didn't have VRR on it and I never ever saw much of a difference in a negative way.

I will say, gsync is a bit smoother on the LG C9 @ 1440p @ 120hz

But, no, you're correct. This adapter doesn't appear to support VRR.

Besides, this is a temp stop gap measure for all interested parties I would imagine. Two things are going to happen eventually. People are going to get the new Radeon that supports HDMI 2.1 or a new nVidia GPU that supports HDMI 2.1 and use them on these LG's this year or next. It's going to happen.

Regardless of how any one company would hope to control what end users can or cannot do, it's going to happen this year or next.

I suspect that Samsung, Sony, and LG will be at the top when it comes to VRR, 120Hz @ 4K, and low latency, along with maybe other possible players. The future looks bright.

I also wonder if makers of typical desktop monitors, Acer and others, intend to enter the larger display market to cash in.

I am positive other manufacturers are paying attention to this 48" CX with G-Sync.

If you are riding a roller coaster of frame rates in order to play a very demanding game on very demanding settings especially considering 4k resolution, you can get stutter/judder at quick frame rate drop offs. That would especially be true if you are dropping in the low end of the pool on a frame rate graph. Games also can get the odd frame rate "pothole" besides. The higher you keep the lower end of your frame rate graph, the less jarring the regular variance outside of "pot holes" might be but it still won't be as smooth as using VRR. G-sync (and VRR in general) has very appreciable gains. G-Sync monitors have been around since 2014 and for many of us, like higher Hz, there is no looking back.

I agree that there are ok compromises for early adopters though, as long as hdmi 2.1 gpus and HDMI 2.1 VRR consoles come out later and work correctly like you said.
 
My 48" arrived this past Sunday. Setting up this weekend.

Dang. I'm getting really antsy now that people are starting to receive them. Seems like it wasn't long ago that some people thought that the 48" was going to be vaporware. Now we know that they actually exist as a final and (somewhat) available product!

I probably would've jumped on the 55" since it was available sooner, but I wanted to go down in size. I’ve been waiting on the 48” since it was first rumored/announced. Gaming on a 55" is pretty rad, but if I'm honest with myself, 48" is a better size for desktop use...at least in my setup. I can only put so much distance between myself and my monitor because I have an Ikea Galant corner desk that sits up against the wall:

(screenshot of the desk setup)


I don't normally run my application windows like that. I generally try to keep stuff towards the lower portion of the screen. Looking forward to the 48" not being so tall, and the sharper text will be nice (the 40" 4K Samsung that I owned briefly was soo crisp). I'm going to repurpose my 55" as an actual TV. I just don't really enjoy watching movies from my mesh PC chair so it will be nice to put the B7 somewhere where I can enjoy the stunning PQ in movies from a couch.
 
For a mix of SDR (mostly) and HDR gaming at 4K 60Hz until HDMI 2.1 GPUs come out... what's the general consensus on RGB 8-bit Full vs. 4:2:2 10/12-bit Limited? Lots of conflicting descriptions out there, but these two settings seem to be the most often recommended.
4:2:2 is perfectly fine for multimedia content, including games. If the fringing in text bothers you in other applications, you can always turn the color depth down (and go back to RGB/4:4:4) for those scenarios.
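For the bandwidth reasoning behind those two options, a rough sketch (assuming the standard 4K60 timing with a 594 MHz pixel clock and HDMI 2.0's roughly 14.4 Gbps usable payload; note HDMI carries 4:2:2 in a fixed 24-bits-per-pixel container whatever the bit depth, which is why 4:2:2 10/12-bit fits where 10-bit RGB doesn't):

```python
# Why 4K60 over HDMI 2.0 boils down to "RGB/4:4:4 8-bit" or "4:2:2 10/12-bit":
# HDMI 2.0 = 18 Gbps raw TMDS, ~14.4 Gbps payload after 8b/10b coding.
HDMI20_PAYLOAD_GBPS = 14.4
PIXEL_CLOCK = 4400 * 2250 * 60   # standard 4K60 timing incl. blanking = 594 MHz

modes = {
    "RGB / 4:4:4  8-bit": 24,
    "RGB / 4:4:4 10-bit": 30,
    "4:2:2 10/12-bit (fixed 24 bpp container)": 24,
    "4:2:0 10-bit": 15,
}

for name, bits_per_pixel in modes.items():
    gbps = PIXEL_CLOCK * bits_per_pixel / 1e9
    verdict = "fits" if gbps <= HDMI20_PAYLOAD_GBPS else "exceeds HDMI 2.0"
    print(f"{name}: ~{gbps:.1f} Gbps -> {verdict}")
```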
 
Interpolation can be repeated frames rather than imagined in-between frames, so it would depend on how it's implemented and what base frame rate you are starting from. For example, some VR tech cuts anything sub-90fps to a solid 45fps and doubles it. That differs from "time-warp" guesstimation methods and quasi-manufactured "tween" frames. A solid frame rate of, say, 100fps could potentially be multiplied 3x or even 10x for current and future high-Hz displays (a real 100fps would be a good base rate for non-quasi frame motion definition to be multiplied from).
Interpolation/multiplication of frames in this scenario would be to reduce/eliminate blur, that is, to increase motion clarity without having to use BFI, as opposed to the increased-motion-definition type of interpolation usually used on 24fps or 60fps content, which manufactures in-between frame states/positions since that content's motion definition, pathing articulation, etc. are so low to start with.

DLSS 2.0 supposedly can make the result look BETTER than native. It has to know the game already and fills in extra data from known renders, sort of, so again it's not compressing the data; it's actually ADDING data to a lower-resolution signal, and so much resolution/scene data that it reportedly can look better than native resolution without DLSS 2.0 (kind of like supersampling a higher resolution down to your native resolution). DLSS 2.0 also has its own AA, so it probably saves a lot of frame rate there too, incidentally.

I do prefer FLAC though where I can get it and I do have a library of FLAC. :ROFLMAO:
Like I said, where I have the choice I'll always prefer native 1:1 and uncompressed original source material. That doesn't mean DSC, if a club3D adapter supports it, wouldn't look decent and be playable or that I never use compressed material (e.g. streaming services like netflix, amazon). Most likely DSC and 8bit dithered are both top tier "nearly native" results compared to other compromises like 4:2:2, 4:2:0, or running a lower non native resolution.

I guarantee if I were to blind test you, you would not be able to tell the difference between FLAC and mp3 with 100% accuracy. 😉 Point is if you can't tell any difference in reality then avoiding compression for nothing more than the sake of avoiding compression is just dumb. Technologies like DSC are things that should be appreciated. Because without it, the alternative is chroma subsampling until we can brute force it with more physical bandwidth, and even then we will hit a wall eventually with just how much bandwidth we can cram into a cable.
 
I guarantee if I were to blind test you, you would not be able to tell the difference between FLAC and mp3 with 100% accuracy. 😉 Point is if you can't tell any difference in reality then avoiding compression for nothing more than the sake of avoiding compression is just dumb. Technologies like DSC are things that should be appreciated. Because without it, the alternative is chroma subsampling until we can brute force it with more physical bandwidth, and even then we will hit a wall eventually with just how much bandwidth we can cram into a cable.

FLAC vs. MP3 is the wrong comparison here, because FLAC is also compressed. Lossless compression is a thing, like FLAC vs. WAV. I believe DSC is not lossless though. Whether someone can actually tell the difference... that I don't know. It's been barely used and may end up being a dead technology with DP 2.0 and HDMI 2.1 having enough bandwidth it isn't necessary.
 
FLAC vs. MP3 is the wrong comparison here, because FLAC is also compressed. Lossless compression is a thing, like FLAC vs. WAV. I believe DSC is not lossless though. Whether someone can actually tell the difference... that I don't know. It's been barely used and may end up being a dead technology with DP 2.0 and HDMI 2.1 having enough bandwidth it isn't necessary.

I've never seen DSC in person, so I can't say whether it is actually visually lossless or not. DP 2.0 and HDMI 2.1 have enough bandwidth for the specs we want right now, which is 4K 120/144Hz, but later on in the future, with 8K high-refresh monitors or maybe 4K 240/360Hz monitors, DSC may be needed.

Also if FLAC is compressed too then that just makes things even MORE ironic 🤣
 
I've never seen DSC in person, so I can't say whether it is actually visually lossless or not. DP 2.0 and HDMI 2.1 have enough bandwidth for the specs we want right now, which is 4K 120/144Hz, but later on in the future, with 8K high-refresh monitors or maybe 4K 240/360Hz monitors, DSC may be needed.

Also if FLAC is compressed too then that just makes things even MORE ironic 🤣

I mean...it shouldn't come as a big surprise that audio/video can be compressed losslessly. People use lossless compression all the time...like zip files, heh.
 
Thanks everyone for all the info. I just started looking at upgrading my current setup (triple Samsung 48" 4K TVs - JU6700, JS9000, JU6700) and have been reading through this thread.
I've been enjoying my setup but started looking at upgrading the center display and think that either the 48" CX or 55" CX might work for me.

Might have to make a run to Costco and get the 55" and see if that is "just right" or "too much"...!
 
Thanks everyone for all the info. I just started looking at upgrading my current setup (triple Samsung 48" 4K TVs - JU6700, JS9000, JU6700) and have been reading through this thread.
I've been enjoying my setup but started looking at upgrading the center display and think that either the 48" CX or 55" CX might work for me.

Might have to make a run to Costco and get the 55" and see if that is "just right" or "too much"...!


I'm virtually settled on my upgrade pathways. I'm getting two displays.

48" cx for desktop/gaming use.

65" H9g for general tv use (900 dollars gets shockingly good looking tv for the money) - This one is the most up in the air, as I still want to wait and see what TCL and Vizio have to offer this year, but if they can't top the H9g in terms of picture quality per dollar, I'll just go with that.
 
Bit rate, bit depth (the range available between the quietest and loudest moments), and the sample rate of audio do matter: a 320kbps MP3 compared to a 1,411kbps 16-bit/44.1kHz CD-quality PCM file (or a 24-bit file at an even higher rate), for example, or lossy DTS and Dolby Digital compared to the lossless audio you can bitstream over HDMI (Dolby TrueHD, Atmos).
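The arithmetic for the uncompressed PCM side, as a quick sketch:

```python
# Uncompressed PCM bit rate = sample rate x bit depth x channels.
def pcm_kbps(sample_rate_hz, bit_depth, channels=2):
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_kbps(44_100, 16))   # 1411.2 kbps -> the familiar CD-quality stereo figure
print(pcm_kbps(96_000, 24))   # 4608.0 kbps -> 24-bit/96 kHz stereo
# A 320 kbps MP3 keeps well under a quarter of the CD-rate data; whether the
# difference is audible on a given system is the part people argue about.
```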

It's one of those things... with the right hardware, lossless/native/less-compressed is best. You might not even always be consciously aware of it, especially without a side-by-side comparison, but some would argue your brain will still hear it or see it. It's like, how much water can you add to this drink before it starts being noticeably different in taste and color? I'll take my drink without any water, or with as little water added as possible, thanks. I agree that the best of the compression methods are good compromises and for most are "good enough" - but most things are just that, compromises. That also goes for tone mapping, which there is no real way to get away from with OLED's color brightness limitations without otherwise clipping colors ~ spectral highlights ~ detail-in-colors at their height ceiling.

I've said it's not that I won't use compression ever, for example netflix and amazon streaming videos, but if I can get somewhat better hardware/configurations and I can find a better, less watered down version of content I'll use it.
 
I've never seen DSC in person, so I can't say whether it is actually visually lossless or not. DP 2.0 and HDMI 2.1 have enough bandwidth for the specs we want right now, which is 4K 120/144Hz, but later on in the future, with 8K high-refresh monitors or maybe 4K 240/360Hz monitors, DSC may be needed.

Also if FLAC is compressed too then that just makes things even MORE ironic 🤣

I'm definitely curious as to whether I can see any differences between with and without DSC, but since the compression ratio is ~3:1 (and you can get ~2:1 with lossless compression), I would expect that the difference is very small if perceptible at all.

Uncompressed 8k @ 60hz 4:4:4 exceeds HDMI 2.1's bandwidth, so it will require DSC. DisplayPort 2.0 has the bandwidth to carry it uncompressed, but it will also require even thicker and shorter passive cables. Given this impending physical limitation, I'd rather take a visually lossless compressed feed at 1/3 the bandwidth over a 6' cable than an uncompressed feed over a 2' cable (or having to use expensive active cables).
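A quick sanity check on the 8K claim (raw active-pixel rate with blanking ignored; the usable-payload figures are approximate):

```python
# Raw data rate for 8K60 10-bit 4:4:4 vs. approximate usable link payloads.
def raw_gbps(w, h, hz, bits_per_component, samples_per_pixel=3):
    return w * h * hz * bits_per_component * samples_per_pixel / 1e9

eightk60 = raw_gbps(7680, 4320, 60, 10)   # ~59.7 Gbps before blanking
print(f"8K60 10-bit 4:4:4: ~{eightk60:.1f} Gbps raw")

# HDMI 2.1: 48 Gbps FRL with 16b/18b coding -> ~42.7 Gbps usable, so DSC is required.
# DP 2.0 UHBR20: 80 Gbps with 128b/132b coding -> ~77.4 Gbps usable, enough uncompressed.
print("Fits HDMI 2.1 (~42.7 Gbps usable)?", eightk60 < 42.7)
print("Fits DP 2.0 UHBR20 (~77.4 Gbps usable)?", eightk60 < 77.4)
```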
 
Bit rate, bit depth (the range available between the quietest and loudest moments), and the sample rate of audio do matter: a 320kbps MP3 compared to a 1,411kbps 16-bit/44.1kHz CD-quality PCM file (or a 24-bit file at an even higher rate), for example, or lossy DTS and Dolby Digital compared to the lossless audio you can bitstream over HDMI (Dolby TrueHD, Atmos).

It's one of those things... with the right hardware, lossless/native/less-compressed is best. You might not even always be consciously aware of it, especially without a side-by-side comparison, but some would argue your brain will still hear it or see it. It's like, how much water can you add to this drink before it starts being noticeably different in taste and color? I'll take my drink without any water, or with as little water added as possible, thanks. I agree that the best of the compression methods are good compromises and for most are "good enough" - but most things are just that, compromises. That also goes for tone mapping, which there is no real way to get away from with OLED's color brightness limitations without otherwise clipping colors ~ spectral highlights ~ detail-in-colors at their height ceiling.

I've said it's not that I won't use compression ever, for example netflix and amazon streaming videos, but if I can get somewhat better hardware/configurations and I can find a better, less watered down version of content I'll use it.

Yeah, but then here you are using FLAC audio instead of WAV, which goes against your whole philosophy of always shooting for a 1:1 native uncompressed source. Are you gonna go and replace your whole library of FLAC with WAV now? :LOL: Anyways, the biggest issue with the Club3D adapter isn't even DSC, it's the lack of VRR support. Between using 4K120Hz 4:2:0 with VRR and getting 4K120Hz 4:4:4 with no VRR, I'd take the former. Not to mention we are so close to the arrival of HDMI 2.1 GPUs that the adapter is also a case of too little, too late; it would've been a HUGE benefit to C9 owners had it come out last year when we were still very far off from new cards.
 
Yeah, but then here you are using FLAC audio instead of WAV, which goes against your whole philosophy of always shooting for a 1:1 native uncompressed source. Are you gonna go and replace your whole library of FLAC with WAV now? :LOL: Anyways, the biggest issue with the Club3D adapter isn't even DSC, it's the lack of VRR support. Between using 4K120Hz 4:2:0 with VRR and getting 4K120Hz 4:4:4 with no VRR, I'd take the former. Not to mention we are so close to the arrival of HDMI 2.1 GPUs that the adapter is also a case of too little, too late; it would've been a HUGE benefit to C9 owners had it come out last year when we were still very far off from new cards.


Unfortunately no, since FLAC is the least-compressed format that is the most widely available; WAV is unrealistic. If Atmos music actually took off and became a thing (new lossless Atmos surround recordings made from the studio master tracks), I would definitely start building a library of that, though. I am definitely looking forward to lossless HDMI audio with eARC from Dolby TrueHD and Atmos movies.

About the VRR-less (and presumably DSC-based) Club 3D adapter: yeah, it's a short stopgap, and VRR with 4:2:0 would be preferable in the meantime, I agree. I won't be getting my LG OLED until November (only 4 months or so away, really), so hopefully the 3090 Ti/hybrid will be out by then with HDMI 2.1 (otherwise an AMD GPU?), making the adapter unnecessary.
 