LG 48CX

Cool I'll be interested to hear how that cable works out for you off of a hdmi 2.1 gpu later. Did you get the 20' one?

Yes. Never paid that much for a cable but from what I have been reading 6-8' is the limit for HDMI 2.1 via copper.
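For context on why these runs get demanding, here's a rough back-of-the-envelope in Python (my numbers; it ignores blanking intervals and TMDS/FRL encoding overhead, so real link requirements are somewhat higher):

```python
# Approximate uncompressed video data rate, assuming full RGB/4:4:4 and 10-bit color.
def data_rate_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

# 4K60 10-bit: ~14.9 Gbps -- right at the edge of HDMI 2.0's ~14.4 Gbps payload.
print(f"4K60:  {data_rate_gbps(3840, 2160, 60):.1f} Gbps")
# 4K120 10-bit: ~29.9 Gbps -- only HDMI 2.1 (~42 Gbps payload of the 48 Gbps link) carries it.
print(f"4K120: {data_rate_gbps(3840, 2160, 120):.1f} Gbps")
```

Pushing roughly 30 Gbps of payload through a long passive copper cable is what makes that 6-8' figure plausible, and why the longer runs move to active or fiber cables.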
 
Given how close in price the 20' cable is to the shorter ones, I'd get the 20' one regardless of what I need and coil the excess, in case I need the extra length later, given how expensive the cable is. If I end up needing one at all (I have a 2m copper cable now, so we'll see), I'll get a 50' one in case I ever need to run it to my living room for VR; I have a 10m active copper 2.0 cable for that now.
 
Yeah TBH I'd rather experiment with 3 $20 cables off Amazon and return the 2 that don't work vs dropping $120 on one.
 
Planning to (hopefully) get an RTX 3080 later this week and also a CX 48". The card I’m looking at only has 1 HDMI out port, which would run to the CX for Gsync. If I want to run audio out to my AVR, can I use a separate HDMI cable from my motherboard to the AVR?
 
Wait for others to reply to this, but I'm fairly sure you can audio out from the HDMI-ARC on the CX.
 
I use HDMI-eARC on my second computer, from the CX to a Denon AVR. I do have to change the Windows sample rate from 24-bit to 16-bit and then back to 24-bit every time I boot up, though.
 
I’ve heard of issues with eARC, including this one, which is why I’m hoping to bypass it entirely by using a separate cable from my motherboard.
 
Don't know about Win10, but on Win7 I used a cheapo GPU (GT710) as a sound card by running it into my AVR. That was when nVidia had the "never clock down on multiple displays" issue.
 
Planning to (hopefully) get an RTX 3080 later this week and also a CX 48". The card I’m looking at only has 1 HDMI out port, which would run to the CX for Gsync. If I want to run audio out to my AVR, can I use a separate HDMI cable from my motherboard to the AVR?

You can also use a DisplayPort-to-HDMI cable. I'm doing this on my 2080 Ti right now.
 
If you already have an 18 AWG, 4K-rated cable, you should be good; 18 AWG cables have been around for a good while now.
 
Years ago I had a 25' HDMI/DVI and DisplayPort run, plus a slightly shorter USB run, from a storage room to my desk setup. I also ran a media server PC from that room 50' through the basement ceiling to my living room TV.

Right now I use a 32' active USB 3.1 extension cable, with a wall-wart power adapter in the middle of it, run to my Oculus Quest. It's adapted to USB-C at the headset end, so for that device fiber HDMI isn't going to do me any good. However, I looked it up and Monoprice does make a fiber USB-C cable in a similar price range to the HDMI ones.

https://www.monoprice.com/product?p_id=38579
Transmits Up To 100 Feet: SlimRun™ AV can perfectly transmit 4K@60Hz HDR video to distances up to 100 feet without extenders by using optical fiber to replace copper wire as the high-speed signal transmission medium.
That's 4K 60 Hz; the Quest runs at 72 Hz (supposedly 90 Hz capable internally, but not supported). I don't know what the Quest 2's connection will be, but I'm guessing it would work since the Quest compresses the video stream over the cable anyway.
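Rough numbers on that guess (my assumptions: Quest 1 panels at 1440x1600 per eye, 72 Hz, 8-bit color, and a ~150 Mbps Link encode bitrate, which is a common setting rather than a spec):

```python
# Compare a hypothetical uncompressed Quest video stream with the compressed Link stream.
def gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

uncompressed = gbps(2 * 1440, 1600, 72)   # both eyes, 72 Hz, 8-bit RGB -> ~8 Gbps
link_stream  = 0.150                      # Gbps; assumed H.264/H.265 encode bitrate for Link
usb3_budget  = 5.0 * 0.8                  # USB 3.0 "5 Gbps" minus rough protocol overhead

print(f"uncompressed ~{uncompressed:.1f} Gbps vs Link ~{link_stream * 1000:.0f} Mbps "
      f"(USB 3.0 budget ~{usb3_budget:.0f} Gbps)")
```

The compressed stream sits far below the cable's budget, which is why a long active or fiber USB-C run should be fine where an uncompressed HDMI signal wouldn't be.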

It would be interesting to be able to move my PC back to the storage room again after all these years, running fiber to my regular HDMI displays, peripherals, and my VR headset. As quiet as I can make my PC, nothing is as quiet as when the PC is off or in another room, and it doesn't dump heat into the room in the summer. I can crank up the AC some, but in between cycles, running a room fan just adds noise again. A 100' USB-C run would probably even reach from my storage room across my breezeway and into my garage for VR, if I ever get around to remodeling the garage as a workout and VR space with rubberized tile floor, etc.
 
5-10 ms input lag is pretty dang good for a TV, and it's OLED too. Can't wait to see benchmarks between the 3080 and 3090 at 4K.
 
So we have confirmation that, on launch, there's a good chance HDMI 2.1 with VRR won't even work.

No wonder no one's covering it. Nvidia probably told shills to downplay it.
 
Does this have any limitations on what it supports? E.g. does it pass through uncompressed 5.1 audio or support stuff like Dolby Atmos?
Doing the same thing with mine. Sadly the receiver I have it on is not Atmos capable. If I schlep my PC downstairs and connect it to that receiver I will update the forum.
 
I'll finally see how the 25' 48 Gbps HDMI cable I bought from Monoprice a few months back works out. I'm not expecting much, but I needed a long cable to run from my computer to my C9. I will probably have to fork over the cash for the fiber HDMI.
 
Does this have any limitations on what it supports? E.g. does it pass through uncompressed 5.1 audio or support stuff like Dolby Atmos?

I use it with 7.1 LPCM; to my knowledge it can carry any audio stream HDMI can. Not every DP-to-HDMI cable does audio at all, though, so be sure to check the comments when you're deciding what to order.
 
Does this have any limitations on what it supports? E.g. does it pass through uncompressed 5.1 audio or support stuff like Dolby Atmos?

I'm using this adapter: STARWARE DP Male to HDMI Female Passive...$8.99. It works great...no limitations in audio support. I'm running 6.1.2 and Atmos is fine where supported. I had a terrible time with eARC going from the CX to a Denon x4700h before I bought the adapter.
 
So we have confirmation that, on launch, there's a good chance HDMI 2.1 with VRR won't even work.

No wonder no one's covering it. Nvidia probably told shills to downplay it.

If it makes you feel any better, and it may, I turned off G-Sync on my C9. Guess what? I can't tell any difference whatsoever.

But maybe it's my hardware; I'm on a 2080 Ti / 10900K @ 5.3 GHz with 32 GB of DDR4-4400.

I wonder if Nvidia is purposely blocking VRR from working on the LGs in its drivers. Is that possible? Forcing people to buy G-Sync-branded products.
 
If it makes you feel any better, and it may, I turned off G-Sync on my C9. Guess what? I can't tell any difference whatsoever.

But maybe it's my hardware; I'm on a 2080 Ti / 10900K @ 5.3 GHz with 32 GB of DDR4-4400.

I wonder if Nvidia is purposely blocking VRR from working on the LGs in its drivers. Is that possible? Forcing people to buy G-Sync-branded products.

I think you just didn't have VRR configured correctly, have v-sync on, run everything at 60+ FPS (the most you can get at 4K on a C9 until you get a new video card anyway), or don't recognize tearing. The LG TVs' boxes are covered in Nvidia logos, and they're listed on the official G-Sync Compatible list. It's also easy to run the pendulum demo and see whether it's working or not (assuming you can recognize tearing).
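Beyond eyeballing the pendulum demo, here's a minimal sketch (not an NVIDIA tool) of how a frame rate maps onto the VRR window in the 4K120 HDMI 2.1 mode; the 40-120 Hz range is my assumption based on the commonly cited G-Sync Compatible window for the C9/CX:

```python
# Where a given frame rate lands relative to an assumed 40-120 Hz HDMI VRR window.
VRR_MIN_HZ, VRR_MAX_HZ = 40, 120

def vrr_status(fps):
    if fps > VRR_MAX_HZ:
        return "above the window: capped or tearing unless v-sync/a frame limiter is used"
    if fps >= VRR_MIN_HZ:
        return "inside the window: the panel refresh tracks the frame rate, no tearing"
    # Below the floor, LFC repeats each frame so the panel stays in range.
    return f"below the window: LFC shows {fps} fps as {fps * 2} Hz"

for fps in (35, 57, 95, 130):
    print(f"{fps:>3} fps -> {vrr_status(fps)}")
```

The practical point: if you're pegged at the panel's maximum (e.g. 60 FPS at 4K over HDMI 2.0), VRR has nothing to do, which is one reason it can be hard to "see".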

 
If it makes you feel any better, and it may, I turned off G-Sync on my C9. Guess what? I can't tell any difference whatsoever.

But maybe it's my hardware; I'm on a 2080 Ti / 10900K @ 5.3 GHz with 32 GB of DDR4-4400.

I wonder if Nvidia is purposely blocking VRR from working on the LGs in its drivers. Is that possible? Forcing people to buy G-Sync-branded products.

Try Samurai Shodown 2 in MAME and let me know if the shadows flicker irregularly and if the scrolling is smooth. That's the real VRR test.
 
3080 reviews are up, and ~30% better than a 2080 Ti at 4K while having only a measly 10 GB of VRAM is definitely lackluster. Can't wait for the big boy 3090.
 
Well, next year the 3080 Ti model will come out, or the 3080 Super or whatever Nvidia calls it, and it should bridge the gap between the 3080 and 3090 in performance. I'm gonna guess it'll be $999. One could wait for it, which I might do before upgrading my 1080 Ti. Right now I don't see myself doing anything before November, but that's just me; too many home projects need my money this year. I do want to see those 3090 benchmarks when they come out. Next week, I think, isn't it?
 
3080 reviews are up, and ~30% better than a 2080 Ti at 4K while having only a measly 10 GB of VRAM is definitely lackluster. Can't wait for the big boy 3090.

Cat's out of the bag. Nvidia is pulling the wool over rubes' eyes.

They wanted to trick people into thinking that they lowered prices, but they actually raised them.

The 3090 is the "real" 3080. The 3080 is the real 3070, etc.

Last gen was $1,200 for a good high end part. Now it's $1,500 for a good high end part.

More snake oil. More deceptive dog shit.

AMD has been so incompetent, and these companies are so corrupt with so much collusion going on, that nothing would surprise me, but I hope AMD shocks everyone and actually blows Nvidia away this time. It needs to happen for the good of the market.
 
Cat's out of the bag. Nvidia is pulling the wool over rubes' eyes.

They wanted to trick people into thinking that they lowered prices, but they actually raised them.

The 3090 is the "real" 3080. The 3080 is the real 3070, etc.

Last gen was $1,200 for a good high end part. Now it's $1,500 for a good high end part.

More snake oil. More deceptive dog shit.

AMD has been so incompetent, and these companies are so corrupt with so much collusion going on, that nothing would surprise me, but I hope AMD shocks everyone and actually blows Nvidia away this time. It needs to happen for the good of the market.

I don't expect something faster than the 3090 because there really isn't room in the die, and I don't expect them to make a whole new die, so I think the 3090 is the Titan or the **80 Ti... but I do expect something with 20 GB and possibly slightly higher clocks to come out between the 3080 and 3090. Whether that's called a "3080 Super" or a 3080 Ti, that doesn't mean "a 3080 is actually a 3070" if you start from the 3090 being a Titan (because there's not really room on the die above it for a separate Titan). You're forgetting about the Titan last gen that was $2,500.
 
Last gen was $1,200 for a good high end part. Now it's $1,500 for a good high end part.

Well to be fair, $700 will still get you a good high end part in the 3080. It's just not going to be that massive of an upgrade for 2080Ti owners...hence the 3090. But for those like myself with a 1080Ti that skipped the 2080Ti, in some games the 3080 provides 2x the framerates. That's huge.

AMD has been so incompetent, and these companies are so corrupt with so much collusion going on, that nothing would surprise me, but I hope AMD shocks everyone and actually blows Nvidia away this time. It needs to happen for the good of the market.

I hope they bring it for the sake of competition as well, but given their history of disappointment there I'm just not holding my breath. I miss when they were neck and neck, like in the X800XT/6800GT days. Now it seems as though they can only compete in the midrange. But, I'd be happy to see that change for everyone's sake.
 
Does AMD even have anything going with AI upscaling at the level of Nvidia's DLSS at this point? Cyberpunk could be the showcase game that sets the bar for 4K gameplay with all the bells and whistles running on Nvidia GPUs. Nvidia's top-end cards are always the flagship GPUs, and they've been leading the way: forcing VRR to market with G-Sync in a well-performing state, supporting VRR on HDMI TVs "ahead of HDMI 2.1", starting the push into ray tracing, and appearing to be way ahead of the pack with their DLSS implementation (machine-learning AI upscaling that ends up looking like supersampling). Some other things like Freestyle, the GeForce Experience suite, etc. are nice too.

It seems like Nvidia is the solid choice, and at the top end of gaming cards there really isn't any competition. I'd gladly be shown otherwise in the future by AMD, but it would have to be across the board, including AI upscaling and all of the other things Nvidia is doing. Considering their position as the underdog, they would have to do it all at the same level (for cheaper), with the same level of game and driver support, or do it considerably better.

Tick - TOCK
------------------
I also stuck with my 1080 Ti(s) and skipped the 2000 series, as it was a mediocre upgrade for 120 Hz+ gameplay on a 1440p monitor and it didn't have HDMI 2.1. The 3000 series die shrink, HDMI 2.1, and VRR on OLED TVs, along with the other features of the GPU, are a big step forward. I used to wait for the Ti to drop and save ~30% compared to paying the "Titan" tier's early adoption tax, but this time the VRAM and the fact that I'm not doing SLI anymore make me not want to bother waiting 8 to 12 months for this generation's Ti version to drop. The other big factor with that timing, of course, is that I'm planning on buying an LG OLED during the November deals season, and I don't want to wait that long afterward for the "Ti" tier card to drop to get HDMI 2.1 output.

Personally, I considered the 2000 series a "tick" and the 3000 series a big "TOCK". I skipped the 2000 series as I found it incremental and mediocre from my position (although I have 1080 Ti SLI, not a single card). That the 2000 series was incremental compared to the 3000 series was even admitted by Jensen Huang in his presentation and his charts showing 1000, 2000, and 3000 series performance.



"Every couple of generations, the stars align as it did with Pascal and we get a giant generational leap"
"Pascal was known as the perfect 10. Pascal was a huge success and set a very high bar."

"It took the "Super Family" of Turing to meaningfully exceed Pascal on game performances without ray tracing"
"With ray tracing turned on, Pascal, using programmable shaders to compute ray-triangle intersections, fell far behind Turing's RT core."
(However)
"Turing with ray tracing ON reached the same performance as pascal with raytracing OFF."

"So we doubled down on everything. Twice the shaders, twice the ray-tracing, and twice the Tensor Core. The triple double. Ampere knocks the daylights out of Pascal on ray tracing. And even with ray tracing on, crushes Pascal on frame rate. To all my Pascal gamer friends, it is safe to upgrade now".
 
I don't expect something faster than the 3090 because there really isn't room in the die, and I don't expect them to make a whole new die, so I think the 3090 is the Titan or the **80 Ti... but I do expect something with 20 GB and possibly slightly higher clocks to come out between the 3080 and 3090. Whether that's called a "3080 Super" or a 3080 Ti, that doesn't mean "a 3080 is actually a 3070" if you start from the 3090 being a Titan (because there's not really room on the die above it for a separate Titan). You're forgetting about the Titan last gen that was $2,500.

I'm not ruling out a 7nm TSMC variant Titan card.
 
Well to be fair, $700 will still get you a good high end part in the 3080. It's just not going to be that massive of an upgrade for 2080Ti owners...hence the 3090. But for those like myself with a 1080Ti that skipped the 2080Ti, in some games the 3080 provides 2x the framerates. That's huge.



I hope they bring it for the sake of competition as well, but given their history of disappointment there I'm just not holding my breath. I miss when they were neck and neck, like in the X800XT/6800GT days. Now it seems as though they can only compete in the midrange. But, I'd be happy to see that change for everyone's sake.

If AMD's recent high-performance parts are any indication, even if they can make it performance-competitive, it's going to be non-competitive in power draw and heat. Considering these new 3000 series GPUs are already using 350 W, that's a scary proposition.
 
If AMD's recent high-performance parts are any indication, even if they can make it performance-competitive, it's going to be non-competitive in power draw and heat. Considering these new 3000 series GPUs are already using 350 W, that's a scary proposition.

Right, and I'd likely still go with Nvidia, but let's say it was possible. If they could somehow manage a card that was 85-90% of a 3080, or (fantasy land here, I realize) a 3090, for $100-$200 less, then I'd have to think that some people would buy them despite being power-hungry space heaters. Anything that could nip at the heels of NV's top cards would certainly put some pressure on them not to try to get away with the launch prices we've been seeing. Having said that, I actually don't think the 3080 is terribly unreasonable for the performance and features it's bringing, but it's always nice to have other options for those who seek them. Competition is good for everyone: AMD fans, who could finally buy a card from their team that's near the top of the benchmark charts, and NV users, who benefit from lower prices and/or NV's drive to innovate with performance and features that further justify their prices.

But like I said...extremely skeptical considering the past few launches that were supposed to be "The Big One".
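On the "space heaters" point above, the heat math is simple, since essentially all of a PC's electrical draw ends up as heat in the room; the 500 W whole-system figure below is my assumption for a gaming load built around a ~350 W GPU:

```python
# Convert steady-state electrical draw to heating load (1 W = 3.412 BTU/h).
WATT_TO_BTU_PER_HR = 3.412

for label, watts in (("GPU alone", 350), ("whole system under load (assumed)", 500)):
    print(f"{label}: {watts} W = {watts * WATT_TO_BTU_PER_HR:.0f} BTU/h")
# ~1200-1700 BTU/h, i.e. a small space heater running for as long as you game.
```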
 