LG 48CX

There likely will be if you’re willing to wait on the 3080Ti/Super...but yeah, I feel yah. The 3090 isn’t a good value, but it’s the one that will push the most frames at 4K which matters more for the CX. I’m still leaning toward the 3080 since even that will be a huge upgrade from my 1080Ti but we will see after the reviews come out.

The RTX 3090 isn't a good value? LUL WHAT? It's the fastest GPU on the planet. Do some of you really want this for free, or cheap? I really really really hate to see this logic within our HardOCP community, where it's expected that we're a community of "Hard" advanced users who chase performance. We do it hard or we go home.

Every GPU generation I have to see, hear, and witness this attitude on these forums. You people have had 2 years to tuck away $20 here and $40 there. You have old hardware and electronics you could have sold, and buying habits you could have changed, to make this a reality.

Value is a very personal perception that is going to differ from one person to the next. Please, speak for yourself.

I challenge all of you to start selling the crap you have laying around to make this 3090 happen. 2 weeks is a long time to clear out $1,000 worth of stuff. Wife has an old juicer she isn't using or going to use? That's $50 right there. Old bar stools in the garage? That's $30 or $40. Random memory? That's $10 or $20 right there. Old 1TB or 2TB hard drives? People pay $30 or $40 for these all day long. Old CPUs, video cards, motherboards? $100 ... $200? Etc, etc. A PS4 or Xbox that you really don't use? That's another thing: get rid of your old PS4 and Xbox shit now. This is definitely the time to do that. It's estimated that the average household has $2,000 to $3,000 in unwanted items laying around that have current market value. Yes, I've read that before and it's probably true.

Getting back to this "value" comment. They are giving us a TITAN-class GPU with 24GB of GDDR6X, not at $2,400 but at the incredible value of $1,500. That's $900 off. And this is still not a good value to some of you? ... Jesus.

Anyways, good luck.

And seriously, use these next 2 weeks to sell off your unwanted crap. Tech hoarding is lame, and a year from now all that random shit you have laying around will be almost worthless.

See you guys at Microcenter at 4AM the morning of the 24th. And, have your damn money together. I've dreamed ... dreamed of owning a Titan class GPU. Who is going to join me?
 
This is a community service announcement for LG OLED C9 owners. Turn ... OFF ... automatic firmware updates NOW.

With the 3xxx series launching in a few short weeks, I'm seeing speculation on various forums that LG may nerf, disrupt, change, or implement "fixes" to these 2019 sets and their full 48Gbps HDMI 2.1 performance in some as-of-yet unknown ways. It's absolutely better to be safe than sorry.

This logic of course stems from the fact that LG could have added 4K@120Hz 4:2:2 support along with FreeSync support to the 2019 C9, etc., but for "market" focused reasons did not. This is, at the end of the day, an unfair game LG has chosen to play with its 2019 sets and their owners. Simply unacceptable.

It's being suggested that now that HDMI 2.1 is launching, on both Nvidia and AMD cards, LG may pull a "firmware" update to give their newer 2020 sets clear performance and feature advantages over the 2019 sets.

It's generally understood and accepted now within the community of large format display users that the C9 has some advantages over the CX series.

I not only turned off automatic firmware updates but I have also blocked my LG C9 at the router level. No data for you!

This seems like weird conspiracy theorizing to me. The LG C9 series, afaik, is already discontinued, so people are buying whatever is left. LG claimed in CES interviews that 4K 120 Hz 4:2:0 was not possible on the C9 due to a hardware issue. It is well known that they use a very different setup for HDMI 2.1 compared to the CX. Even now the performance difference between C9 and CX is completely negligible, and you won't see a difference unless you have them side by side and calibrated. I own both and can't tell any real difference.

Lack of FreeSync support is a bummer, but overall it's a use case that only concerns Xbox One X and AMD GPU users. For Xbox Series X, PS5, and PCs running Nvidia GPUs it's not an issue.
 
The hardware is nearly identical from what I've read. This claim is just marketing BS. Someone did an in-depth specification / feature set deep dive and determined from the numbers that the 2019 sets could very easily do 4K@120Hz 4:2:2. This lie was enacted and perpetrated to sell 2020 sets to uneducated consumers, which is very often the case. It's no secret manufacturers want you buying all their new stuff year in and year out.

And, a conspiracy theory? Really?

I guess you missed the whole settlement Apple agreed to after it was discovered they ... secretly ... again, secretly "nerfed" the batteries in an entire series of phones, millions of them, which ultimately resulted in a vast, VAST number of people buying Apple's new phones a lot sooner than they should have. That settlement was in the hundreds of millions of dollars. Google it. Conspiracy theory ....... reeeeeaaaaaaaaaaaaaallllly. O-KAY.
...
You sound like the very trusting type ..... I admire that in you. When you get to my age, the world becomes ... very clear. Your eyes pop open and you grow jaded and cynical as you realize the entirety of humanity wants to fuck you from behind while reaching into your pockets to take all your money.
 
What Apple does has no bearing on LG. Apple's "throttling the phone to maximize battery life" was a clear technical implementation that was most likely not intended as a malicious feature. I have a hard time assuming malicious intent over pure incompetence. There are no "uneducated" consumers who will even be aware of the 4K 120 Hz 4:2:0 thing. It is not a feature that concerns anyone using these TVs in their living room as, you know, TVs.

We don't know what actually goes on inside the LG TVs. Also, the CX series is not capable of 4:2:2 at 120 Hz without an HDMI 2.1 source; only 8-bit 4:2:0 without HDR. That this is not achievable on the C9 could be for a number of reasons, and just looking at bandwidth numbers and specs does not tell the whole tale. As a software developer I know these things can be far more complicated than they seem. Maybe the feature was considered so niche that putting in the man-hours to make it work on the C9 was not worth the cost. This happens all the time in modern software development, with bug fixes, new features, and newer-model development getting preference from management when allocating how developers spend their time.
 
"CX series is not capable of 4:2:2 at 120 Hz without HDMI 2.1 source. Only 8-bit, 4:2:0 without HDR"

You knew what I meant.

And saying what Apple does has no bearing on LG misses the point, maybe purposely so. Of course it doesn't, directly. I'm speaking to the fact, FACT, that corporations constantly play these marketing BS games: removing features, misinformation, misrepresentation, etc. You make it sound as if everyone plays nice and farts rainbows, and as if Apple, Samsung, etc. have never removed or broken a feature for their own benefit. Pleaaaaaaaseeee.

Again, without overcomplicating my initial post (and please don't take it where it doesn't belong), to safeguard against any unexpected "feature downgrades" I would disable automatic firmware updates on high-end LG 2019 sets. Samsung was caught red-handed purposely breaking a handful of features on 2015 or 2016 series sets via firmware updates. This, you can Google. Apple was recently caught doing this with the battery management back end on some of their phones, which resulted in very substantial settlements.

With HDMI 2.1, the CX having only 40Gbps vs the C9 having 48Gbps, I dare not take a chance on LG pushing any updates. If LG did push an update in the next 2 or 3 weeks, I would be very, very suspicious of ... why.

Yes, I am sore that LG could have added FreeSync and 4K@120Hz 4:2:0 for C9 owners but didn't.

Hisense and TCL just released their new 2020 sets, and so far they are receiving glowing reviews. I cannot wait to see what the Chinese brands can beg, borrow, and steal from the other manufacturers moving forward. No doubt in the next few years we are going to see some very high performance, budget-friendly sets.
 
40 vs 48 Gbps has absolutely no real-world relevance to users. The LG OLED panels are 10-bit, and 40 Gbps is enough for 4K 120 Hz 10-bit 4:4:4 with HDR. HDMI 2.1 AV receivers will also only have 40 Gbps, and the same seems to be the case for the Xbox Series X and most likely the PS5. These are all done for the same end goal: less cost to the manufacturer. The folks crying over 10 vs 12 bit are probably the ones whose favorite movie is "4K test images".
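If you want to sanity-check that, here's a rough back-of-envelope sketch in Python. These are my own numbers, not anything from LG or Nvidia: blanking intervals are ignored and FRL's 16b/18b coding is assumed for link overhead, so treat the margins as approximate.

```python
# Rough HDMI 2.1 bandwidth check for 4K120 signals.
# Approximation only: blanking ignored, FRL overhead assumed to be 16b/18b.

def video_rate_gbps(h, v, hz, bpc, channels=3):
    """Raw active-video data rate in Gbit/s."""
    return h * v * hz * bpc * channels / 1e9

rate_10bit = video_rate_gbps(3840, 2160, 120, 10)  # ~29.9 Gbps
rate_12bit = video_rate_gbps(3840, 2160, 120, 12)  # ~35.8 Gbps

usable_40 = 40 * 16 / 18  # ~35.6 Gbps of payload on a 40 Gbps link
usable_48 = 48 * 16 / 18  # ~42.7 Gbps of payload on a 48 Gbps link

print(f"10-bit 4:4:4: {rate_10bit:.1f} Gbps, fits 40G: {rate_10bit < usable_40}")
print(f"12-bit 4:4:4: {rate_12bit:.1f} Gbps, fits 40G: {rate_12bit < usable_40}, "
      f"fits 48G: {rate_12bit < usable_48}")
```

10-bit 4:4:4 at 4K120 fits inside the CX's 40 Gbps, and the only thing that actually needs the C9's 48 Gbps is a 12-bit signal feeding a 10-bit panel.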

LG would really have to go out of their way to break things, and to what end? To make C9 owners sell theirs and buy a CX? To make the CX sell better? They are already having difficulty keeping up with demand! I just have a hard time being cynical enough to believe that management at major companies rub their hands and tell their engineers to break features on purpose in hopes of selling more. "Never attribute to malice that which is adequately explained by stupidity."

But if it helps you sleep at night by all means turn off firmware updates. I'll leave them on on my C9 65" and CX 48".
 
The RTX 3090 isn't a good value? LUL WHAT? It's the fastest GPU on the planet. ... See you guys at Microcenter at 4AM the morning of the 24th. And, have your damn money together. I've dreamed ... dreamed of owning a Titan class GPU. Who is going to join me?
Haha, I love your passion.

I spent probably the equivalent of $1800 on SLI dual 1080 cards in 2016. The 3090 is my next upgrade, let's do it HARD!

(But I won't be queueing at Microcenter; I'll probably just be boring and wait for reviews of quiet-fan AIB models so I can pick one that's nice and silent.)
 
Hi guys,
Is anyone using the CX for playing battle royale games (like Fortnite, Warzone, or PUBG)? If so, can you please post your feedback on the experience and input lag?
Thanks.
 
I spent probably the equivalent of $1800 on SLI dual 1080 cards in 2016. The 3090 is my next upgrade, let's do it HARD! ...

I'm still rocking 1080 Ti SC Hybrids in SLI on a high-bandwidth bridge. I skipped the 2000 series as I found it incremental and mediocre from where I sat (this was all but admitted by Jensen Huang in yesterday's presentation, with his charts comparing 1000, 2000, and 3000 series performance).

I try to buy in the deep end every so many years, picking my battles. Traditionally I've waited on the Ti cards, getting almost the same performance without paying the "Titan early tax". This time, however, I'm going to get a 3090 "Neptune" with the AIO water cooler, hopefully from EVGA whenever I can get one, and not at "limited stock" / "OOS" upcharges of several hundred dollars above retail, so I'll have some wait ahead of me as it is.

I'm still not decided on which OLED I'll get between November deals on the 55" C9, E9, and CX, and the possibility of a 55" Vizio built on an LG OLED panel with its own gaming chip. I'm still leaning heavily toward the 48" CX for its smaller size, though.
 
Ok, I just got my new 48 CX set up. It is a total stunner on the desktop so far. I mean - WOW. The inky blacks, the punchy image.

So, anxious to try a game and test whether I can do 120Hz at 4K with an old and shoddy HDMI cable, I loaded up Doom. On first load, the right monitor of my triple-screen setup was set as the default monitor - that one and the left screen are my old 40" 4K monitors on DisplayPort. So I set the 48CX in the middle to be my main monitor. The right monitor was still trying to display the game on launch, except this time I got a black screen and had to CTRL-ALT-DEL and quit. I tried again, verifying the CX was the main display - same issue. Then I unplugged the old 40"s, going with just the CX, and Doom fired up fine.

So I went into the menus and turned V-Sync on, and lo and behold, I was running at 120Hz! Great, I thought. Then I realised I was only in 1080p. I went to change the res, and 4K was not an option! I think the max was 1440p. I run SLI Pascal Titans... so it's not the video card. Does anyone have any ideas why this is happening? Is it my crappy HDMI cable?

I will order better cables. I want to get 2 more of these CXs. But I want to make sure I can get it running 4K @ 120Hz.

And how do I test the desktop Hz? It's set to 120 in the Nvidia Control Panel... but it seems as responsive as my 60Hz screens. I am in Game mode, btw.

Thanks for any help! I am going to try to find another HDMI cable to test with...
 
So I went into the menus and turned V-Sync on, and lo and behold, I was running at 120Hz! Great, I thought. Then I realised I was only in 1080p. I went to change the res, and 4K was not an option! I think the max was 1440p. I run SLI Pascal Titans... so it's not the video card. Does anyone have any ideas why this is happening? Is it my crappy HDMI cable?

Yes, you need an HDMI cable capable of transmitting 4K120, which old ones cannot do. I personally hate HDMI's cable branding; I think 2.1 cables are branded as "Ultra High Speed", but it's really impossible to keep track anymore.
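For what it's worth, this is my understanding of the official cable branding tiers and their rated bandwidths (names and numbers from memory, so verify before buying); a quick lookup table in Python:

```python
# HDMI cable certification tiers and rated bandwidth in Gbps, as I
# understand the HDMI.org branding (double-check before purchasing).
CABLE_TIERS = {
    "Standard": 4.95,            # 1080i / 720p era
    "High Speed": 10.2,          # up to 4K30
    "Premium High Speed": 18.0,  # HDMI 2.0: 4K60 8-bit 4:4:4
    "Ultra High Speed": 48.0,    # HDMI 2.1 FRL: 4K120 and up
}

def cable_for(required_gbps):
    """Return the lowest tier rated at or above the required bandwidth."""
    for tier, gbps in CABLE_TIERS.items():
        if gbps >= required_gbps:
            return tier
    return None

print(cable_for(32.0))  # roughly 4K120 10-bit RGB -> 'Ultra High Speed'
```

So for 4K120 on these TVs, "Ultra High Speed" is the label to look for.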
 
Thanks! Yes, I just found a 6' HDMI cable upstairs, and I'm pretty sure I'm at 120Hz on the desktop now. It feels much smoother. But I still have no 4K option in Doom. Should I change my desktop Hz to 60 to test?

EDIT - Ok, I still have no 4K in DOOM. But I loaded F1 2019, and I was able to get 4K 120Hz on this single screen (not triples) on low graphics settings, and it ran at 120fps with V-Sync, but I lost FPS going to higher graphics settings. Which is shocking, given I am driving the game with SLI Pascal Titans... and a 9900K with 64GB of RAM.

But anyway, if I can get 4K in F1 2019, why not in DOOM? I don't get it...
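One way to rule out the Windows side when a game won't offer 4K is to list the modes the driver is actually advertising for the display. A rough sketch, assuming pywin32 is installed (pip install pywin32); if 3840x2160 @ 120 Hz isn't in this list, the cable, port, or driver is the bottleneck rather than the game:

```python
# Enumerate every mode Windows advertises for the primary display (sketch,
# assumes pywin32). Games generally build their resolution menus from this
# same list, so a missing 3840x2160 @ 120 Hz points away from the game.
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except Exception:
        break  # index ran past the last mode, so the underlying call failed
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz} Hz")
```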
 
The RTX 3090 isn't a good value? LUL WHAT? It's the fastest GPU on the planet. Do some of you really want this for free, or cheap?

It has nothing to do with expecting it to be free or cheap. That's a ridiculous statement. No one is expecting top-tier performance at a bargain basement cost. Perhaps you misunderstood me, and maybe that's my fault for not putting it into context. If the rumors are true that the 3090 will be roughly 15-20% faster than the 3080, yet costs over twice as much, it does not represent a good value compared to the 3080. Just like with the Inland NVMe drive that was all the rage last year: it was slightly slower than some other high-end drives like the Samsung 970 EVO Plus, but it was also about half the price, which made it a great value. Most people wouldn't pay twice as much for slightly higher read/write speeds, and the same principle applies here (assuming the rumors are true and the 3080 has most of the real-world performance of the 3090).
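To make that bang-for-buck point concrete, here's a toy calculation using the rumored numbers in this thread (a ~15-20% uplift at $699 vs. $1,499 MSRP; all speculative until reviews land):

```python
# Toy perf-per-dollar comparison built on rumored, pre-review numbers:
# 3090 assumed ~17.5% faster (midpoint of 15-20%) at about 2.1x the price.
cards = {
    "RTX 3080": {"price": 699,  "perf": 1.000},
    "RTX 3090": {"price": 1499, "perf": 1.175},
}

base = cards["RTX 3080"]
for name, c in cards.items():
    value = (c["perf"] / base["perf"]) / (c["price"] / base["price"])
    print(f"{name}: {value:.2f}x perf per dollar (3080 = 1.00x)")
```

By that yardstick the 3090 delivers roughly half the performance per dollar of the 3080, which is all "not a good value" means here.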

Now with that being said, I'm glad that you brought up the point about the Titan because when I look at it from that perspective, I can see where you're coming from. I've been buying the top Ti card the last few cycles (780Ti, 980Ti, 1080Ti, skipped the 2080 series) and so I haven't kept up with Titan prices. Compared to Titan RTX @ ~$2500, heck yes this new 3090 seems like a bargain! But I think that Nvidia was on some sort of crack binge when they decided on that MSRP, particularly when previous Titan models were priced in the $1200 range.

I really really really hate to see this logic within our HardOCP community, where it's expected that we're a community of "Hard" advanced users who chase performance. We do it hard or we go home.

Yes, that's the motto and there have always been people around here who have chased performance. But, it's also a well known fact that you have to pay to play! And that chasing that last 10-15% of performance will always, always cost you. Diminishing returns. It's not unique to the computer hobby, or the 3090, or [H]. It's that way with virtually everything. Audio gear...a pair of headphones or an amp that has slightly better measurements than another piece of gear that costs half as much. Cars that only go a little bit faster than a Corvette but cost several times the price. Custom firearms that shoot slightly tighter groups for 4x the price of an off the shelf rifle. Air coolers that cost 2x-3x what a budget cooler does, but only perform a few degrees cooler. Countless Intel CPU releases where the top end i7 was priced much higher than the mainstream variant (such as the i7-920 or i7-2600K), yet only offered maybe a 10% performance gain, particularly if you overclocked the cheaper one to narrow the gap.

So let's try to agree on common ground here. Compared to the 3080 which should offer most of the performance of the 3090 at less than half the cost, I do not think that the latter represents a good value (i.e. bang for buck). But compared to the previous Titan which cost roughly $1000 more, yeah, I can see why one would think that this new one was a good value! It's going to outperform the old card at a lower price. And that's the same reason why the 3080 is looking like an incredible value compared to the older, costlier 2080Ti. The 3080 is looking like it'll significantly outperform the 2080Ti for ~$500 less, and that's kind of mind-blowing. And those who absolutely must chase that last bit of FPS can step up and shell out another $800 (more than the price of the 3080!) for the top tier card, while still feeling like they're getting a bargain compared to Titan RTX.
 
As far as the price vs. value title goes, reviewers are speculating it will go to the 3070. We won't know for sure until the actual benchmarks are in, however. That's not to say the 3080 and 3090 are bad in any way; they just won't take the price vs. value crown is all. That's the early speculation at this time.
 
Agreed. The 3070 may very well end up being the "sweet spot" for mainstream gamers. Most people are still gaming at 1440p and may not benefit as much as 4K users from the 3080/3090, or be willing to pay the premium for them. I expect it'll sell extremely well.
 
what did you do exactly to make it work?

In Windows 10:

- Open "Bluetooth & other devices"
- "Add Bluetooth or other device"
- "Everything else"
- Pick your TV from the list it populates; repeat for all the copies of the TV (it shows up as a few different devices with the same name)
- In Device Manager, verify you see your TV listed under "Digital Media Devices"
 
After crunching some numbers, I decided the 3090 is worth it. I want a significant upgrade over my 2080 Ti, and a 3080 may not be enough. The 3080 is looking to be anywhere from a 30-40% gain over a 2080 Ti. So if we assume a 15% gain from the 3080 to the 3090, that would make it about 50-60% faster than a 2080 Ti. The leap from 2080 Ti to 3080 isn't enough for me.
 
I think the big "what if?" with the 3080/90 is overclocking. It looks like clock speeds are coming down into the 1.7-1.8GHz range. If that's just for power and heat, and the silicon supports 2.1GHz like Turing (or more), they'll be utter monsters overclocked, and even a 3080 would be a huge gain over a 2080Ti. If Samsung's fab process isn't so hot and 1.8GHz is all they can actually do, yeah, it might take a 3090 to be a real solid upgrade.
 
The 3080 may definitely be a big gain over a 2080 Ti. But that would make the 3090 an even bigger one. If the 3080 is 50% over a 2080 Ti and the 3090 is an additional 15% over a 3080, the total gain going from a 2080 Ti to a 3090 is roughly 72%.
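Spelling out the arithmetic, since the gains multiply rather than add:

$$1.50 \times 1.15 = 1.725 \quad\Rightarrow\quad \text{roughly a } 72\% \text{ gain over a 2080 Ti}$$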
 
Well, just bought my 48" CX. It arrives tomorrow. This thread was instrumental in my decision to purchase. Now if I can get a 3090 sometime around launch I will be ready to roll. RIP: Wallet 1970-Sept 2020.
 
It's ok, you won't regret it.

You can always tuck your empty wallet under your head at night and whisper "sorry" to it.
 
It seems we are talking about the new Nvidia cards. Hope this is relevant.

I have a brand new 48 CX. Still figuring it all out, and I need a better HDMI cable. But ultimately I want to run 3 of them for sim racing in triple 4K @ 120Hz.

I currently run SLI Pascal Titans, previously on 3 40" 4K screens, and juuust got 60fps/60Hz. Do you guys think the new 3090 is an upgrade on those, even as a single card? It's been 4 years on this setup.

I have a 9900K too. Two of these 3090s in SLI would slay triple 4K at 120Hz, right?
 
Definitely an upgrade. Dual 3090s should obliterate 4K120.
 
Thank you, brother. Glad I could skip a couple of generations. Time to dig deep again! Do you have any advice on what the Pascal Titans are worth? I have so not been gaming the past 3 years; I bet they have 100 hours of gaming on them at the absolute upper limit.
 
The RTX 3090 isn't a good value? LUL WHAT? It's the fastest GPU on the planet. ... I've dreamed ... dreamed of owning a Titan class GPU. Who is going to join me?

The problem is you can really only evaluate the card’s value, retrospectively, at the END of its life, based on how many games it handled and how well it handled them on your display.

$1,500 isn't a lot for THIS card, but it is a lot for ANY card, especially at the start of a new console gen, when we know so little about the real demands 4K gaming will put on a system going forward. It could end up having much more VRAM than needed and not enough RTX performance, for example. Or maybe it isn't quite enough for 4K 120fps but almost gets there. There is more to this hobby than throwing money at it.

I’ve been playing with cards since 1998. Don’t buy into hype. Even the professional reviews can be shallow. This is still only the second iteration of RTX tech. Infancy.
 
There's a Q&A thread going on Reddit with some actual Nvidia employees. Sounds like the new cards are full 48Gbps and, more importantly, will have 10-bit 4:4:4 4K 120Hz available over HDMI 2.1!
 

He didn't specifically say 10-bit over HDMI with 4:4:4, but I'll just assume they read the question properly before answering. Finally, we can put this to rest.
 
And let's not forget next year: Super or Ti variants of the 3xxx series, or both, will be released. It will also be interesting to see what the TV manufacturers roll out.
 
I want some unambiguous confirmation that Nvidia is going to support HDMI 2.1 VRR given their conflict of interest with G-Sync.

You can support HDMI 2.1 without VRR.
 
Nvidia just CONFIRMED this is fixed!

The GeForce RTX 30 series will support the following over HDMI:

- RGB/YUV444: 8, 10, 12bpc
- YUV422/420: 8, 10, 12bpc

The same formats are supported on HDMI 2.1 Fixed Rate Link.

So what does this mean on the 48" LG CX in comparison to new OLEDs from Sony and Vizio that support the full HDMI 2.1 48Gbps? Neither of them will produce a 48" OLED display, correct? Are there any other non-OLED displays incoming that are <48" that will support the full HDMI 2.1 48Gbps bandwidth?
 
I am by no means an expert, but basically Nvidia has just confirmed the new RTX 30 series and its drivers will fully support / match the max color depth capabilities of all HDMI 2.1 enabled TVs on the market today, including the LG OLED CX. Since I believe all of these TVs have 10-bit panels, even a cut-down 40Gbps of bandwidth will still support the max native capabilities of the display.


 
You can enable 12-bit color on those displays, same as on the LG C9. The end result will maybe - with emphasis on maybe - show some improvement in banding, due to getting higher bit depth content into the TV's processor. 10-bit color is more than enough, and the actual panels are also just 10-bit.
 
Ok, so I had planned on SLI 3090s for my triple-CX48 setup. But I just learned the 3090 is over 5 inches WIDE. Like, how in the HELL are you supposed to fit those in even the biggest of cases, let alone keep a second sound card for SimVibe? Madness.

I need to find out if a single 3090 is better than two Pascal Titans. I hate that the new cards are this wide. Insane.
 
This talk of getting a 12-bit signal into the TV to be downsampled to 10-bit for the panel as somehow higher quality is nonsense. People cannot even tell the difference between 8-bit+FRC and 10-bit. 40 Gbps for 10-bit 4K120 full RGB is all you need.
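For scale, here's the step size per color channel at each bit depth; this is plain arithmetic, nothing vendor-specific:

```python
# Levels per color channel at each bit depth. Banding visibility scales
# with the step between adjacent levels.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:>4} levels/channel, "
          f"step = {100 / (levels - 1):.3f}% of range")
```

Going from 10-bit to 12-bit shrinks an already sub-0.1% step, on a panel that can only resolve 10 bits in the first place.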
 
I should have thought of this before, but it just slipped my mind. I currently have my monitor connected to my 2080 Ti via DisplayPort, with an HDMI cable to my AV receiver for 5.1 sound. How can I accomplish this with the OLED? I could buy a good sound card and use optical, but I'd rather not. Has anyone done this?
 
I want some unambiguous confirmation that Nvidia is going to support HDMI 2.1 VRR given their conflict of interest with G-Sync. ...

The LG TVs have giant "G-SYNC COMPATIBLE" logos on the box and website. VRR over HDMI is supported right now on C9s and CXs with Turing cards; many of us are using it.
 