Quiz
Gawd · Joined: Aug 25, 2010 · Messages: 660
What are the main differences between using a TV (such as an LG C3) vs. a "real" gaming monitor?
> TVs flicker while monitors with hardware G-Sync modules are steady as fucks. Also TVs don't have the cool UW aspect ratios.

That said, with OLED-type panels where black bars are less of an issue, and considering their size, one could consider buying a TV and using it in UW mode. It will often be cheaper than a UW monitor of the same size, and when you want 16:9 (games that don't support UW, watching 16:9 content, etc.) you get the extra top/bottom real estate for free.
> As kasakka said, this is getting to be less and less of an issue every year, as many name-brand and some off-brand TVs have adequate gaming modes now.

I was thinking more along the lines that gaming monitors are starting to employ smart TV features, whereas most TVs you'd want to buy come with well-working game modes, low input lag, etc. As long as you have an HDMI 2.1 port on your source device, the biggest difference is form factor. Modern gaming monitors tend to have more HDMI 2.1 ports than DP ports too.
> Higher Hz. TVs tend to top out at 120 Hz, while some gaming monitors can go 300+ Hz.

Aside from playing video games, there aren't many use cases that demand 144+ Hz. I couldn't find any channel in my region that broadcasts 4K 144 Hz, let alone 240+ Hz.
> Do gaming monitors that have very high refresh rates (300+ Hz) actually make a difference over something more common like 120 Hz? Don't you have to actually reach 300+ fps to benefit from the very high refresh rate? Or does a very high refresh rate also benefit lower-fps gaming?

You can think of it as headroom. As an example, if we have a 120 Hz monitor but our GPU can run a game at 200 fps, then there are 80 frames per second we cannot display. So a higher refresh rate is better, but yes, you would also need to reach those high framerates to make full use of it.
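The headroom arithmetic above is trivial but easy to sketch; here's a minimal Python illustration using the same numbers from the example (200 fps render rate into a 120 Hz panel):

```python
# Frames rendered vs. frames a fixed-refresh display can actually show.
def wasted_fps(render_fps: float, refresh_hz: float) -> float:
    """Frames per second the monitor cannot display (0 if the GPU is the bottleneck)."""
    return max(0.0, render_fps - refresh_hz)

print(wasted_fps(200, 120))  # 200 fps into a 120 Hz panel -> 80 fps undisplayed
print(wasted_fps(90, 120))   # below the refresh rate, nothing is wasted -> 0
```

In other words, the extra headroom of a 300+ Hz panel only pays off once the game can actually render past 120 fps.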
> Aside from playing video games, there aren't many use cases that demand 144+ Hz. I couldn't find any channel in my region that broadcasts 4K 144 Hz, let alone 240+ Hz.

Well, sure. Also, 144 Hz vs 120 Hz isn't as big a deal... but 240 Hz is nicer for gaming.
Most 300+ Hz monitors are 1080p eSports-focused displays, so they throw image quality, HDR support, etc. out of the window for the best pixel response times and highest refresh rates. It's pretty easy to run those games at 300+ fps.
Meanwhile, getting even 120 fps at 4K can be a challenge unless you own an RTX 4090.
> For me it just came down to display size preference. There is no TV in a 32" size that offers all the features I want. I did use a 48" OLED for almost 3 years and kinda got used to it, but recently went back to a 32" monitor and probably won't be going back to a big TV-sized display again.

Same here. I enjoy my TV just fine in the living room, but for the desktop I'd prefer something smaller, or an ultrawide option. Using a 28" 4K LCD as a "waiting until something better gets released" option.
I must be crazy because most of you guys prefer smaller screen sizes.... I went for a 65" Samsung QD OLED this year after 3 years gaming at 55". Was worried it would be too much but I love it. For me it's all about nailing that perfect viewing distance using the "island desk" setup advocated by elvn. I don't think I'm going back to a smaller screen unless it's for something compelling, like OLED @ 4k/240Hz.
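Much of the size debate above comes down to viewing distance: a big screen far away and a small screen up close can subtend a similar angle. A rough way to compare setups is pixels per degree of visual angle. The sizes and distances below are illustrative guesses, not measurements of anyone's actual setup:

```python
import math

def pixels_per_degree(diag_in: float, h_res: int, v_res: int, distance_in: float) -> float:
    """Approximate pixels per degree at the center of a flat screen."""
    width_in = diag_in * h_res / math.hypot(h_res, v_res)  # physical width from the diagonal
    px_pitch = width_in / h_res                            # inches per pixel
    # angle one pixel subtends at the given distance, in degrees
    deg_per_px = math.degrees(2 * math.atan(px_pitch / (2 * distance_in)))
    return 1 / deg_per_px

# A 65" 4K TV at ~40" (island desk) vs. a 32" 4K monitor at ~24" (normal desk)
print(round(pixels_per_degree(65, 3840, 2160, 40), 1))
print(round(pixels_per_degree(32, 3840, 2160, 24), 1))
```

The two come out within roughly 20% of each other, which is why a pushed-back big screen doesn't necessarily look coarser than a 32" at arm's length.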
> Do gaming monitors that have very high refresh rates (300+ Hz) actually make a difference over something more common like 120 Hz?

An objective question demands an objective answer, and that is YES.
> Just a matter of preference. For some people even 32" is too large and they would only use 27" displays. I have a few friends who play nothing but competitive shooters like Valorant and OW2, so they only use 24" screens because that's what they prefer for their use case. Having personally tried 21, 24, 27, 32, 34 ultrawide, 40, 48, and 55 inch displays, I just so happen to land on 32" as my preferred display size.

So whatever the best combination of value and performance is in a 32-inch size, but with size being the fundamental parameter?
I'm kind of that way, but with regard to TV. In that I just want the biggest one possible. I found I had one good wall here. So I chucked the couch I had against it. And now with a projector that wall's a TV. It's definitely at a loss of black level and contrast compared to OLED and QLED for that matter. However, it's all about the size for me. It does have another redeeming feature in that the RGB laser system creates a very colorful picture. (For motion, I do turn the motion system, "MEMC" on for sports, but with other content the picture looks very odd to me.)
For my main computer display, I basically want it to be a CRT, but bigger so I can throw up more windows and such for work. CX with the 120 Hz BFI on all the time is basically the closest I've found so far. (Some have mentioned input lag with it on, but I don't feel it. Seems instant when I'm moving the mouse and such. FWIW.)
If my neck had a vote, it would opt for a smaller display. Trying to discipline myself so I don't strain it with the height of the CX...
> The only thing I would caution on, monitors vs TVs, is that monitors typically have the "gaming mode" input lag without all the caveats that TVs have with those modes. For example, my Samsung TV has a gaming mode that cuts input lag from 32 ms to 16 ms. Not bad, not great either. But in gaming mode I lose the ability to have correct colors: it will only run in the screen's native gamut, and all color processing/correction features are disabled. Monitors don't have that kind of drama; you usually get to have your cake and eat it too. I also notice that newer OLED TVs that do support BFI do not support BFI and gaming mode together. So for LG C2s and newer, you're going to have about 33 ms of input lag if you want BFI, which you typically want for games. Just some food for thought. Otherwise I think that for the most part TVs and monitors are more or less the same.

It's a fair point about gaming modes. You absolutely want to use it even just for desktop use, because it makes the mouse more responsive.
> It's a fair point about gaming modes. You absolutely want to use it even just for desktop use, because it makes the mouse more responsive.

Yes! I remember using my father-in-law's screen, and without game mode even desktop usage was crap.
> I totally disagree about BFI. It's not even a real option on anything newer than the LG CX series (and apparently the Samsung S95C as well) because it works only at 60 Hz, with increased input lag too. It makes HDR look like SDR too. Without BFI, the input lag on the C2 is ~6 ms at 4K 120 Hz and ~10 ms at 60 Hz, which makes input lag a total non-issue as long as you are using game mode but not BFI.

Subjective. To me, 60 Hz BFI is better than nothing. Not everyone is interested in high-framerate gaming or minds 60 Hz. I want to connect my game consoles (Wii U and Switch, basically 60 fps only machines) and benefit. I also don't mind the brightness drop; I know that it comes with the territory and am willing to put up with it. But yeah, I wish 120 Hz BFI for OLED was still a thing. Shame on you, LG...
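For anyone wondering why BFI matters even at a fixed 60 fps: on a sample-and-hold display, perceived motion blur scales with how long each frame stays lit, so blanking part of each refresh cycle cuts blur without needing more frames. A back-of-the-envelope sketch (the pixel speed and duty-cycle numbers are made-up illustrative values, not measurements of any specific TV):

```python
def motion_blur_px(speed_px_per_s: float, refresh_hz: float, duty: float = 1.0) -> float:
    """Approximate perceived blur width: pixel speed times the time each frame is lit.
    duty=1.0 is plain sample-and-hold; BFI lowers the effective duty cycle."""
    return speed_px_per_s * duty / refresh_hz  # persistence = duty / refresh_hz seconds

speed = 960  # an object panning across the screen at 960 px/s
print(motion_blur_px(speed, 60))        # 60 Hz sample-and-hold -> 16 px of blur
print(motion_blur_px(speed, 120))       # 120 Hz halves it -> 8 px
print(motion_blur_px(speed, 60, 0.5))   # 60 Hz with 50% BFI -> also 8 px
```

This is why 60 Hz BFI can make console games look about as smooth in motion as native 120 Hz, at the cost of brightness.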
> Monitors also do have some of "that drama". My Samsung G70A locks out several picture options just by enabling VRR. Most displays lock out picture options if HDR is used too.

Yep - something to read reviews for. In general terms, though, monitors have less of that drama.
> Yes! I remember using my father-in-law's screen, and without game mode even desktop usage was crap.

Shame on LG indeed. Though I guess Samsung's reentry into the OLED TV market spooked LG such that they redirected resources towards increasing brightness; I think Vincent Teoh reported something like that. The loss of such BFI sucks though. Hope it comes back...
A remarkable display. Not that I ever thought otherwise. It's just I thought they would keep getting better. Like that more neck friendly 42" version on the horizon with 120 Hz BFI. Until it wasn't...
So word on the street (Blur Busters) indicates that monitors with 60-120 Hz BFI are coming. He's probably under NDA, so he hasn't said anything definitive. I'll keep taking shots of that hopium. Lol