LG 48CX

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,580
..

Might even swap it out for a 7900 XTX if the raster performance is close enough, as I honestly can't tell a damn difference between RT on and off at a glance. I either need a direct side by side or I have to really analyze parts of the image to notice, and nobody plays games like that.

I know that shadows can make a huge difference. People used to turn them off or set them to low, perhaps more often in the past, but good global dynamic shadows can deliver an almost 3D ~ holographic look to a game. I notice that kind of aesthetic gain in a big way personally. It takes away the flatness of the scene and gives things depth, especially in large outdoor games with long view distances and animated objects (and shadows) both near and far, all the way out in the distance. So RT can add a very dynamic 3D interplay of shadows, highlights, and reflections in those types of games, depending on how well it's implemented.

I don't use RT due to the performance hit, but I can see how it could be valuable if the big performance drop wasn't the tradeoff. The catch-22 right now might be that the games with frame rate to spare aren't always as detailed or as open-world, so the effect isn't quite as great overall. Even a less detailed, stylized RPG/adventure/MMO can be very demanding with very high view distances and lots of animated objects in the distance. Games with very long view distances unlocked and a high number of animated objects and effects in the distance maxed out in settings usually destroy frame rates when outdoors. It could also depend to a degree on how well the dev designed the game. Still, even simple Minecraft-like/pixelated games look cool with the dynamic lighting of raytracing, so I definitely see the benefit outside of the frame rate hit in games.

. . .

Even dynamic shadows on high+ without raytracing, in large outdoor games with very far view distances and a large number of animated objects in the distance, can deliver a more 3D/holographic feel - so RT could do that and better, with highlight/reflection/shadow dynamism. You might not notice it as much in a corridor shooter or some demolition derby arena shooter. AC Valhalla HDR or Odyssey HDR should benefit from RT though, as they are large adventure games with long view distances (depending on your settings and where you are in the game worlds).
 
Last edited:

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
I know that shadows can make a huge difference. People used to turn them off or set them to low, perhaps more often in the past, but good global dynamic shadows can deliver an almost 3D ~ holographic look to a game. I notice that kind of aesthetic gain in a big way personally. It takes away the flatness of the scene and gives things depth, especially in large outdoor games with long view distances and animated objects (and shadows) both near and far, all the way out in the distance. So RT can add a very dynamic 3D interplay of shadows, highlights, and reflections in those types of games, depending on how well it's implemented.

I don't use RT due to the performance hit, but I can see how it could be valuable if the big performance drop wasn't the tradeoff. The catch-22 right now might be that the games with frame rate to spare aren't always as detailed or as open-world, so the effect isn't quite as great overall. Even a less detailed, stylized RPG/adventure/MMO can be very demanding with very high view distances and lots of animated objects in the distance. Games with very long view distances unlocked and a high number of animated objects and effects in the distance maxed out in settings usually destroy frame rates when outdoors. It could also depend to a degree on how well the dev designed the game. Still, even simple Minecraft-like/pixelated games look cool with the dynamic lighting of raytracing, so I definitely see the benefit outside of the frame rate hit in games.

. . .

Even dynamic shadows on high+ without raytracing, in large outdoor games with very far view distances and a large number of animated objects in the distance, can deliver a more 3D/holographic feel - so RT could do that and better, with highlight/reflection/shadow dynamism. You might not notice it as much in a corridor shooter or some demolition derby arena shooter. AC Valhalla HDR or Odyssey HDR should benefit from RT though, as they are large adventure games with long view distances (depending on your settings and where you are in the game worlds).

Well don't get me wrong, I do think RT is cool and that it's the future of gaming. I think the main issue right now is that RT in just about every game is limited to only certain lighting. It's not like, say, Portal vs. Portal RTX, where the entire game is path traced and the difference in visuals is immediately obvious and huge. Turning RT from off to the maxed out preset in games like CP2077, Control, Dying Light 2, etc., there is a difference in visual fidelity, sure, but it's far more subtle and doesn't make an immediate day and night difference like Quake and Portal RTX do IMO. And yeah, cutting my fps by ~40-50% on average also isn't a good proposition for turning it on, since I'm now turning what is a 120+ fps experience into an 80 fps one, on a $1599+ GPU to boot. The performance hit for turning on RT is still pretty big even on Lovelace; in fact, in some games it's actually no better than Ampere, in that both architectures lose the same % of performance when turning RT on.


I was hoping that by the 3rd generation of RTX GPUs we could turn on RT without cutting fps by up to 66%. Hell, even the FIRST gen RTX GPU, the 2080 Ti, isn't that far off from the latest and greatest Lovelace when it comes to % performance loss from turning on RT, which is kinda sad actually. It seems like nvidia isn't really making these new RT cores more efficient at their jobs at all; instead we are just getting more brute force to push higher frame rates in general. Until either games look massively upgraded with RT on, or the performance hit drops below 20% in even the heaviest RT implementations, I'd prefer to just leave it off.
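
By "% performance loss" I just mean the relative fps drop when RT gets switched on. A quick sketch of the arithmetic, with made-up illustrative numbers rather than benchmark data:

Code:
def rt_penalty_pct(fps_rt_off: float, fps_rt_on: float) -> float:
    """Relative performance lost by enabling RT, as a percentage."""
    return (1.0 - fps_rt_on / fps_rt_off) * 100.0

# Illustrative numbers only, not measured results:
print(rt_penalty_pct(120, 80))   # ~33% - the "120+ fps becomes 80 fps" case above
print(rt_penalty_pct(150, 50))   # ~67% - roughly the "up to 66%" worst case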
 
Last edited:
  • Like
Reactions: elvn

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,580
It sounds like it's going to take a dev shift for RTX, but probably also for the other thing we'd talked about: instead of nvidia guessing vectors based on a buffered image, the games themselves would report the vectors of objects, backgrounds, cameras, etc., while the OS/drivers would report those of the peripherals. I believe VR dev already does this, but VR game devs expect that going in, whereas devs targeting pancake screens might be slow to adopt it. The VR frame projection / spacewarp stuff is based on the vectors of in-game virtual objects, backgrounds, cameras, etc., but also the headset, where on PC desktop it would be the mouse and keyboard or gamepad input (mouse-looking, movement keying, controller panning).

They have to up their game, literally:

- HDR dev (maybe even Dolby Vision)
- RTX dev
- Frame amplification dev (based on actual reported game engine and peripheral/OS driver vectors projecting the next frame, rather than nvidia's "look at the next frame and guess the vectors from the difference in the two images" method)
- Surround sound / Atmos

I'd also like it if they did gaming AI upscaling hardware on the displays themselves if possible, to avoid the port+cable bottleneck - even if just 4k to 8k on future 8k screens (or their ultrawide resolutions when gaming).
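
To put rough numbers on that port+cable bottleneck (pixel data only, ignoring blanking and link encoding overhead, so treat these as ballpark figures):

Code:
# Uncompressed video data rate in Gbit/s (pixel payload only; real links add
# blanking/encoding overhead and can also use DSC compression).
def gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(gbps(3840, 2160, 120))   # ~29.9 Gbit/s - 4k120 10-bit fits within HDMI 2.1's 48 Gbit/s link
print(gbps(7680, 4320, 120))   # ~119.4 Gbit/s - 8k120 10-bit blows way past it, hence the appeal
                               # of sending 4k down the cable and upscaling on the display itself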
 
Last edited:

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
It sounds like it's going to take a dev shift for RTX, but probably also for the other thing we'd talked about: instead of nvidia guessing vectors based on a buffered image, the games themselves would report the vectors of objects, backgrounds, cameras, etc., while the OS/drivers would report those of the peripherals. I believe VR dev already does this, but VR game devs expect that going in, whereas devs targeting pancake screens might be slow to adopt it. The VR frame projection / spacewarp stuff is based on the vectors of in-game virtual objects, backgrounds, cameras, etc., but also the headset, where on PC desktop it would be the mouse and keyboard or gamepad input (mouse-looking, movement keying, controller panning).

They have to up their game, literally:

- HDR dev (maybe even Dolby Vision)
- RTX dev
- Frame amplification dev (based on actual reported game engine and peripheral/OS driver vectors projecting the next frame, rather than nvidia's "look at the next frame and guess the vectors from the difference in the two images" method)
- Surround sound / Atmos

I'd also like it if they did gaming AI upscaling hardware on the displays themselves if possible, to avoid the port+cable bottleneck - even if just 4k to 8k on future 8k screens (or their ultrawide resolutions when gaming).

I'm just wondering why everyone believes nvidia has been making improvements to RT performance every gen when the numbers don't line up with that. Say the upcoming 4070 Ti is on par with the 3090 Ti in raster but faster in ray tracing because of its 3rd gen RT cores vs 2nd gen RT cores, and both cards get 100fps in a game with RT disabled. If the 3090 Ti turns on RT it loses 40% performance and drops to 60fps; the 4070 Ti should then suffer a much smaller performance penalty for turning RT on and lose something like 25%, but I'm pretty sure it's also going to lose 40% and be right in the same ballpark as the 3090 Ti in RT as well. So how exactly is nvidia improving things on the RT performance front? Seems like all they did was make more powerful GPUs in general. Anyway, I'm getting off topic here, so I'll end my rant about RT performance. Seeing the raster performance of the 4090 though really makes me pray for a 4K 160+Hz OLED next year. CES 2023 maybe?
 

Attachments

  • HZD 4090.png
Last edited:

kasakka

2[H]4U
Joined
Aug 25, 2008
Messages
2,802
I'm just wondering why everyone believes nvidia has been making improvements to RT performance every gen when the numbers don't line up with that. Say the upcoming 4070 Ti is on par with the 3090 Ti in raster but faster in ray tracing because of its 3rd gen RT cores vs 2nd gen RT cores, and both cards get 100fps in a game with RT disabled. If the 3090 Ti turns on RT it loses 40% performance and drops to 60fps; the 4070 Ti should then suffer a much smaller performance penalty for turning RT on and lose something like 25%, but I'm pretty sure it's also going to lose 40% and be right in the same ballpark as the 3090 Ti in RT as well. So how exactly is nvidia improving things on the RT performance front? Seems like all they did was make more powerful GPUs in general. Anyway, I'm getting off topic here, so I'll end my rant about RT performance. Seeing the raster performance of the 4090 though really makes me pray for a 4K 160+Hz OLED next year. CES 2023 maybe?
If we look at a purely path traced game (Quake 2 RTX) you start to see where the performance gains are. Unfortunately these are rarely seen in tests.

I don't have exact numbers, but if I set a target framerate of 144 fps and let Quake 2 RTX use dynamic resolution scaling to maintain it, my 2080 Ti would be running at sub-1080p resolutions, probably closer to 720p if not even lower. My 4090 runs at around 2880x1620 so 75% of 4K.
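
Dynamic resolution scaling like that is basically a feedback loop on frametime. A minimal sketch of the idea (not Quake 2 RTX's actual implementation):

Code:
# Each frame, nudge the render scale so measured frametime converges on the target.
def update_render_scale(scale, frametime_ms, target_ms=1000 / 144,
                        min_scale=0.25, max_scale=1.0, gain=0.1):
    error = (target_ms - frametime_ms) / target_ms   # positive = headroom, negative = running behind
    scale *= 1.0 + gain * error                      # raise resolution with headroom, lower it when behind
    return max(min_scale, min(max_scale, scale))

# e.g. a 9 ms frame against a ~6.9 ms (144 fps) target pulls the scale down a notch:
print(update_render_scale(0.75, 9.0))   # ~0.73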

My guess is that even if the percentages for framerate drops look the same, just the fact that the 4090 runs RT games at massively higher framerates means it has to process the RT effects faster because they are going to take a good chunk of frametime.
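
That's the key point: an equal percentage drop at a much higher framerate means far less absolute time spent on RT per frame. Rough illustration with made-up numbers:

Code:
# Same 40% hit, very different absolute RT cost per frame (illustrative numbers only):
def rt_cost_ms(fps_rt_off, fps_rt_on):
    return 1000 / fps_rt_on - 1000 / fps_rt_off   # extra frametime spent on RT effects

print(rt_cost_ms(60, 36))    # older card:  60 -> 36 fps is ~11.1 ms of RT work per frame
print(rt_cost_ms(150, 90))   # 4090-class: 150 -> 90 fps is ~4.4 ms of RT work per frame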

As another empirical example, with the 2080 Ti I pretty much turned RT off in Shadow of the Tomb Raider because it was a good way to reach better framerates. With my 4090 I can turn off DLSS, turn RT effects to max, and still maintain around 200 fps at 4K!

TechPowerUp has really done a disservice by not publishing minimum framerates too. I tried my 4090 with my old Ryzen 3700X and then replaced it with a 13600K system, and the minimum framerates improved massively while the maximum framerates were not always that much higher, but the overall experience is so much smoother. Also note that a lot of TechPowerUp data uses a 5800X CPU, which is fine, but they have themselves tested that the 5800X3D or latest gen AMD/Intel CPUs give up to 30% boosts in some games.
 

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
If we look at a purely path traced game (Quake 2 RTX) you start to see where the performance gains are. Unfortunately these are rarely seen in tests.

I don't have exact numbers, but if I set a target framerate of 144 fps and let Quake 2 RTX use dynamic resolution scaling to maintain it, my 2080 Ti would be running at sub-1080p resolutions, probably closer to 720p if not even lower. My 4090 runs at around 2880x1620 so 75% of 4K.

My guess is that even if the percentages for framerate drops look the same, just the fact that the 4090 runs RT games at massively higher framerates means it has to process the RT effects faster because they are going to take a good chunk of frametime.

As another empirical example, with the 2080 Ti I pretty much turned RT off in Shadow of the Tomb Raider because it was a good way to reach better framerates. With my 4090 I can turn off DLSS, turn RT effects to max, and still maintain around 200 fps at 4K!

TechPowerUp has really done a disservice by not publishing minimum framerates too. I tried my 4090 with my old Ryzen 3700X and then replaced it with a 13600K system, and the minimum framerates improved massively while the maximum framerates were not always that much higher, but the overall experience is so much smoother. Also note that a lot of TechPowerUp data uses a 5800X CPU, which is fine, but they have themselves tested that the 5800X3D or latest gen AMD/Intel CPUs give up to 30% boosts in some games.

Yeah, but again that's kinda like just saying the 4090 is a faster card in general, and that's why it does RT better, not that it does RT more efficiently. I'd prefer a GPU generation where I can turn on RT effects in games and experience a minimal performance penalty for doing so. But it seems like we aren't going to head in that direction and will instead just brute force it, so that even with all RT maxed out we can still get 200fps at 4K - but then if you turn RT off you will get 400fps, and that reason for not using RT is still going to be there.
 

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,580
Frame amplification tech seems like a good way to at least double frame rates to start with. Perhaps even more in the future. That would have a huge effect on frame rates which could help RT be more easily digested.

. . . . . .

However this is how I understand it:

Imagine an animation flip book with a clear overlay sheet on each drawn page, and a blank page in between every drawn page cell.

(Clear overlay) on Drawn Cell 1 + [Blank Cell 2] + Drawn Cell 3

Nvidia's frame generation form of frame amplification tech starts at the drawn Cell 1 page and then flips ahead "two pages" to the next drawn Cell 3 page. Then it does its best AI thinking to guess what the vectors on the 1st cell are, in order to figure out how to build a "tween" cell to get to the 3rd frame cell, and it draws that best guess on the middle blank page based on its imagined/guessed vectors. This has a lot of trouble in a lot of things, especially 3rd person adventure games or any game where the virtual camera moves around independently of the character, causing artifacts from bad guesses: the camera pathing makes some things in the scene remain motionless, or move at different rates relative to the FoV, even though they are technically moving, or moving at different speeds, in the game world itself. It also artifacts more the lower the foundation frame rate is (e.g. beneath 120fps), since that has larger time gaps between animation cells (an animation flip book with far fewer pages, so it transitions between fewer animation cell states, with more staggered/choppy animation cell cycles).

VR's space warp style of frame amplification tech instead has a clear page overlaid on top of the first page. That clear page has a z-axis grid, with a number of arrows and rates for some of the vectors already written on it. The VR system's apps can use actual vector data, including rates, arcs, player inputs, etc., and then plot and draw measured results on the middle page going forward. So it's not completely guessing what the vectors are for everything in the scene, including the player's inputs and the effects of the virtual camera (or the player's head and hand motions in VR).

Unfortunately nvidia went with the former, uninformed version, at least in the near timeframe. Hopefully the whole industry will switch to the VR method at some point. That would require PC game devs to write their games using the same kind of VR tools and do the work to code for that. They have a lot of room to advance. Unfortunately, progress is a lot slower on the PC and PC display front than the VR front, though in aspects like PPD, VR is way "behind" or inferior and will continue to be for some time yet.
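
To make the flip book concrete, here's a rough sketch of the difference between the two approaches, boiled down to a single moving object (this is illustrative only, not nvidia's or any VR runtime's actual code):

Code:
import numpy as np

# "Guessed vectors": estimate motion purely from two already-rendered frames,
# then synthesize the in-between cell. Bad guesses show up as artifacts.
def interpolated_tween(pos_cell1, pos_cell3):
    guessed_velocity = (pos_cell3 - pos_cell1) / 2.0   # inferred; wrong for occluded or erratic motion
    return pos_cell1 + guessed_velocity                # the synthesized "Cell 2"

# "Reported vectors": the engine already knows object velocity and camera motion,
# so the next cell is projected from real data instead of image differencing.
def reprojected_frame(pos_now, reported_velocity, camera_delta, dt):
    return pos_now + reported_velocity * dt + camera_delta

pos1, pos3 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
print(interpolated_tween(pos1, pos3))                        # guess: [1.0, 0.5]
print(reprojected_frame(pos1, np.array([180.0, 90.0]),       # reported velocity in units/s
                        camera_delta=np.array([0.1, 0.0]), dt=1 / 180))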

. . .

from blurbusters.com's forum replies (Mark R.):

reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it.

One interesting innovation of reprojection that has not yet been done is making sure the pre-reprojected framerate is above flicker fusion threshold. That causes stutters in reprojection to disappear!

For VR, We Already Have Hybrid Frame Rates In The Same Scene
----------------------------------------------------------------------------------------------------


For example on Oculus Rift 45fps to 90fps, sometimes certain things stutter (hand tracking 45fps) while the background scrolls smooth (90fps) via head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still be smooth looking, just simply more motionblurred (due to frametime persistence) than turns (due to reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate; if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), which is ugly when some of the framerates are below stutter/flicker detection threshold.

But if all framerates could be guaranteed perfect framepaced triple-digit, then no stuttering is visible at all! Just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This will be something I will write about in my sequel to the Frame Rate Amplification Article.

Hybrid Frame Rates Stop Being Ugly if 100fps Minimum + Well Framepaced + Sample And Hold
------------------------------------------------------------------------------------------------------------------------------------------


Hybrid frame rates will probably be common in future frame rate amplification technologies, and should no longer be verboten, as long as best practices are done:

(A) Low frame rates are acceptable for slow enemy movements, but keep it triple-digit to prevent stutter
(B) High frame rates are mandatory for fast movements (flick turns, pans, scrolls, fast flying objects, etc)
(C) If not possible, then add GPU motion blur effect selectively (e.g. fast flying rocket running at only 100 frames per second, it's acceptable to motionblur its trajectory to prevent stroboscopic stepping)

The frame rates of things like explosions could continue at 100fps to keep GPU load manageable, but things like turns (left/right) would use reprojection technology. The RTX 4090 should easily be capable of >500fps reprojection in Cyberpunk 2077 at the current 1 terabyte/sec memory bandwidth -- and this is a low lying apple just waiting to be milked by game developers!

In other words, don't use 45fps. Instead of 45fps-reproject-90fps, use min-100fps-reproject-anything-higher. Then frame rate amplification can be hybridized at different frame rates for different objects on the screen -- that becomes visually comfortable once every single object is running at triple-digit frame rates!

Technically, reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it. Except it's overlaid on top of non-VR games.

One major problem occurs when you're doing this on strobed displays -- sudden double/multi-image effects -- and requires GPU motion blur effect (mandatory) to fix the image duplicating (akin to CRT 30fps at 60Hz). However, this isn't as big a problem for sample-and-hold displays.
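
The "render the scene at 100fps, reproject camera motion at display rate" idea boils down to a loop like this (a toy sketch of the concept, not any shipping VR runtime's API):

Code:
DISPLAY_HZ = 500    # reprojection / scanout rate
SCENE_HZ = 100      # full render rate for physics, enemies, explosions

def render_scene(world):              # expensive: full raster/RT pass
    return {"image": "frame", "camera": dict(world["camera"])}

def reproject(last_frame, camera):    # cheap: warp the last frame to the newest camera pose
    yaw_delta = camera["yaw"] - last_frame["camera"]["yaw"]
    return f"warp({last_frame['image']}, yaw_delta={yaw_delta:+.4f})"

world = {"camera": {"yaw": 0.0}}
last_frame, next_scene_time = render_scene(world), 1.0 / SCENE_HZ
for tick in range(10):                            # ten display refreshes as a demo
    world["camera"]["yaw"] += 0.002               # mouse-look keeps updating every refresh
    if tick / DISPLAY_HZ >= next_scene_time:      # full re-render only at 100 Hz
        last_frame = render_scene(world)
        next_scene_time += 1.0 / SCENE_HZ
    print(reproject(last_frame, world["camera"])) # each refresh warps the latest full frame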
 
Last edited:

hhkb

Limp Gawd
Joined
May 24, 2013
Messages
147
Decided to keep my 48CX and buy the new zowie 25" XL2566K instead for FPS. I need to find a monitor arm though that can reach ~70-80cm so I can put it in front of my CX and then move it out of the way when not using it. Anyone have suggestions?
 

sharknice

2[H]4U
Joined
Nov 12, 2012
Messages
3,036
I need to clean my 48CX after a move is there anything I need to be careful not to use on the screen?
You're not really supposed to use any cleaners on screens. Just use a dry microfiber cloth; if it's still dirty, get it a little damp, wipe off any spots, then go back to a dry microfiber cloth.

I've heard people say X cleaner is fine to use on screens, but I've never had anything on a screen that a damp microfiber cloth couldn't get off. I wouldn't risk it if you don't have to.
 

hhkb

Limp Gawd
Joined
May 24, 2013
Messages
147
I need to clean my 48CX after a move is there anything I need to be careful not to use on the screen?
I use Ecomoist cleaner (available in the UK, there is probably an NA equivalent) - spray it on the cloth, then wipe. Alternatively, if you want to be really careful, dampen a cloth with distilled water (tap water has minerals and will streak; I wouldn't use it, especially if you have hard water). The LG manual says to use a dry cloth, but it's impossible to clean it with a dry cloth imo, and you don't want to rub hard when cleaning the screen.
 
Last edited:

Lateralus

More [H]uman than Human
Joined
Aug 7, 2004
Messages
17,494
^ Agree with those folks. There may be some screen cleaners that are safe to use, but I've rarely had to use anything more than filtered water + a microfiber cloth. Anything beyond that could be detrimental and/or a waste of $. Some of them could work well but make sure you read the fine print (and try water first because why not?).
 

hhkb

Limp Gawd
Joined
May 24, 2013
Messages
147
I've gotten used to the 48" over the last 2 years, so I don't see much reason to upgrade if the image quality and Hz are the same. Some would argue that the image quality would be downgraded due to the 42's lower peak brightness vs the 48. The only thing that will get me to upgrade at this point is noticeably better image quality, like a QD-OLED, or a decent bump in Hz to something like 160Hz.

Yeah I think you're right. Still, the prices this Black Friday are so good (£700 right now in the UK). I still find the 48" slightly too big occasionally (like if I have to do a work call on Zoom and need to fullscreen share), but the immersion is great at around ~1m away.

edit: I couldn't help myself, I pulled the trigger on the 42, that price is too good. I'll be trying out FPS which was my main problem game and report how improved it is.
 
Last edited:

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
Yeah I think you're right. Still, the prices this Black Friday are so good (£700 right now in the UK). I still find the 48" slightly too big occasionally (like if I have to do a work call on Zoom and need to fullscreen share), but the immersion is great at around ~1m away.

edit: I couldn't help myself, I pulled the trigger on the 42, that price is too good. I'll be trying out FPS which was my main problem game and report how improved it is.

FPS games are epic on a big OLED, unless you want a competitive edge for multiplayer - then yeah, I can see 48" being a little problematic. I'm enjoying Crysis Remastered on my CX with Auto HDR and maxed out RT on a 4090 and it's amazing.
 

Attachments

  • 20221124_213225.jpg

hhkb

Limp Gawd
Joined
May 24, 2013
Messages
147
FPS games are epic on a big OLED, unless you want a competitive edge for multiplayer - then yeah, I can see 48" being a little problematic. I'm enjoying Crysis Remastered on my CX with Auto HDR and maxed out RT on a 4090 and it's amazing.
Yeah that’s why I keep thinking 42” will feel like a downgrade, for immersive gaming at least. Been playing cyberpunk with a 4090 on it at max ray tracing settings and it’s glorious. It really is a great size for gaming and watching movies, at about 1m away. It’s my fave monitor of all time and I’ve put thousands of hours of gaming and productivity work on it now.

BUT, I do find it a bit too large occasionally. I can’t see the full screen at 1m when playing FPS so I have to look around to see hud elements. I’m pretty used to it now (have 400 hours playing DRG on it for example) - but it can be annoying sometimes. I basically can’t full screen apps at all on it either, which is annoying 1% of the time for screen sharing at work.

Maybe I should invest in an ergotron HX so I can push it farther back when I need to. It’s sort of hard to justify a £250 arm though when the C2 is on sale for £700. 42 would be better ultimately as a pc monitor I think - would lose some immersion but it is a more practical size.

Okay, I convinced myself to cancel my 42" order and look out for a used HX arm :D.
 
Last edited:

kasakka

2[H]4U
Joined
Aug 25, 2008
Messages
2,802
Yeah that’s why I keep thinking 42” will feel like a downgrade, for immersive gaming at least. Been playing cyberpunk with a 4090 on it at max ray tracing settings and it’s glorious. It really is a great size for gaming and watching movies, at about 1m away. It’s my fave monitor of all time and I’ve put thousands of hours of gaming and productivity work on it now.

BUT, I do find it a bit too large occasionally. I can’t see the full screen at 1m when playing FPS so I have to look around to see hud elements. I’m pretty used to it now (have 400 hours playing DRG on it for example) - but it can be annoying sometimes. I basically can’t full screen apps at all on it either, which is annoying 1% of the time for screen sharing at work.

Maybe I should invest in an ergotron HX so I can push it farther back when I need to. It’s sort of hard to justify a £250 arm though when the C2 is on sale for £700. 42 would be better ultimately as a pc monitor I think - would lose some immersion but it is a more practical size.

Okay, I convinced myself to cancel my 42" order and look out for a used HX arm :D.

When I moved, I was originally thinking of selling my CX 48", but it happened to be a great size for our small living room. I was actually happy to go back to a smaller LCD for desktop use, though. The CX was good, but I was often not using its full desktop, instead using maybe the bottom 2/3rds, sort of like a 3840x1600 display I suppose. Even at 1m viewing distance I found it a bit large otherwise, even though it was lovely for media and games. I think the 42" size would be more desktop friendly.

I tried putting the CX on a monitor arm initially but the problem is that it is of course limited by the depth of your desk. I bought a cheap floorstand for it instead and that's a more practical solution though not very adjustable. I figured that I am not very interested in moving the screen back and forth but will instead find a good spot and leave it there.

At this point there is not much reason to buy the C2 42" if you have a CX/C1/C2 48". It's pretty much the same thing but smaller. Wait for C3 to be unveiled to see if it brings anything new at least.

What I am really hoping LG would do is a fixed curve C3 that is not much more expensive than the flat C3. To me the motorized curve of the LG Flex is a stupid feature. You will play with it a bit and then probably park it on one setting. LG should just make a 42", 4K 240 Hz version of the 45GR95QE (the curved 45" 3440x1440 OLED).
 

hhkb

Limp Gawd
Joined
May 24, 2013
Messages
147
Yup I'm just going to wait until we get something significantly better at 42". Not worth the side grade right now. Still have 0 burn in and only like 3-4 dead pixels on the edges at about 6.5k hours so far.
 

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
When I moved, I was originally thinking of selling my CX 48", but it happened to be a great size for our small living room. I was actually happy to go back to a smaller LCD for desktop use, though. The CX was good, but I was often not using its full desktop, instead using maybe the bottom 2/3rds, sort of like a 3840x1600 display I suppose. Even at 1m viewing distance I found it a bit large otherwise, even though it was lovely for media and games. I think the 42" size would be more desktop friendly.

I tried putting the CX on a monitor arm initially but the problem is that it is of course limited by the depth of your desk. I bought a cheap floorstand for it instead and that's a more practical solution though not very adjustable. I figured that I am not very interested in moving the screen back and forth but will instead find a good spot and leave it there.

At this point there is not much reason to buy the C2 42" if you have a CX/C1/C2 48". It's pretty much the same thing but smaller. Wait for C3 to be unveiled to see if it brings anything new at least.

What I am really hoping LG would do is a fixed curve C3 that is not much more expensive than the flat C3. To me the motorized curve of the LG Flex is a stupid feature. You will play with it a bit and then probably park it on one setting. LG should just make a 42", 4K 240 Hz version of the 45GR95QE (the curved 45" 3440x1440 OLED).

I expect the C3 to be exactly like how the C1 was to the CX: basically the same thing and not worth it over a heavily discounted C2. After that, in 2024, maybe we'll get the next big thing for WOLED, but I'm pretty sure LG's main focus for their TVs isn't the PC gaming/desktop crowd, so instead of giving us higher refresh rates or a curve, they'll probably be releasing that MLA panel that supposedly increases brightness significantly - and even that may not hit the C series, but instead be limited to the larger G series TVs. I think we've reached the end of the road here for WOLED TVs, but hey, at least LG is actually making monitors now, starting with that 27" 240Hz, so hopefully we'll get a 32" 4K 144Hz+ WOLED monitor next year or in 2024 and never have to bother with using TVs as a monitor again.
 

kasakka

2[H]4U
Joined
Aug 25, 2008
Messages
2,802
I expect the C3 to be exactly like how the C1 was to the CX: basically the same thing and not worth it over a heavily discounted C2. After that, in 2024, maybe we'll get the next big thing for WOLED, but I'm pretty sure LG's main focus for their TVs isn't the PC gaming/desktop crowd, so instead of giving us higher refresh rates or a curve, they'll probably be releasing that MLA panel that supposedly increases brightness significantly - and even that may not hit the C series, but instead be limited to the larger G series TVs. I think we've reached the end of the road here for WOLED TVs, but hey, at least LG is actually making monitors now, starting with that 27" 240Hz, so hopefully we'll get a 32" 4K 144Hz+ WOLED monitor next year or in 2024 and never have to bother with using TVs as a monitor again.
That's kind of why I was hoping they would focus on things like increased refresh rate and a fixed curve model. It's hard to sell a C3 if it's just the same thing as the C2 but more expensive unless they delay release until C2 supplies are depleted.
 

MistaSparkul

2[H]4U
Joined
Jul 5, 2012
Messages
2,302
That's kind of why I was hoping they would focus on things like increased refresh rate and a fixed curve model. It's hard to sell a C3 if it's just the same thing as the C2 but more expensive unless they delay release until C2 supplies are depleted.

Well, again, higher Hz and curved screens are features that appeal more to the PC crowd, and I don't think that's LG's main focus when it comes to their TV lineup, so I don't see them bothering with it. I'd be happy to be wrong, but I don't expect much out of the C3 that would interest us PC people who use TVs as a monitor.
 