Difference between 4K 42" TVs vs 4K 42" Monitors?

Cerulean

Howdy! What is the difference between 4K 42" TVs and 4K 42" computer monitors? Why would I want to choose a TV over a computer monitor, or vice versa?
 
Inputs and the amount of processing. TVs usually have only HDMI, while desktop monitors come with DisplayPort and HDMI. TVs do more image processing, which leads to high input lag if game mode (which removes most of the processing) is not enabled. TVs also come with TV tuners and smart TV crap. The reason to choose a TV as a monitor is wanting a big screen, which isn't that common in desktop monitors. TVs also tend to be cheaper.

That doesn't mean computer monitors don't have their perks too. If you need things like variable refresh rate support, a TV limits you to FreeSync on AMD cards, because Nvidia does not support adaptive sync over anything but DisplayPort at the moment. Computer monitors, depending on what you buy, can come with better factory calibration and support for higher refresh rates.
 
Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.
 
Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.
1. At that size, PBP (picture-by-picture): the option of viewing multiple inputs on the same screen simultaneously. TVs cannot do this.
2. To a lesser extent, sleep capability. Most, but not quite all, big-screen TVs do not have sleep options like monitors do.
Number 1 is the biggest item for me and for quite a few people who use anything from 40"+ screens for PC use.
 
Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.
For gaming, the best televisions still have more than one frame of lag at 60 Hz, with up to 20 ms of input lag, and you can only get there by turning off all the features that give the TV its potentially better picture. The input lag on my PG27UQ is only 8 ms including HDR and G-SYNC, which is less than one frame at 98 Hz.
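Quick sanity check on that frame math (just a sketch in Python converting the lag figures quoted above into frames):

# Convert input lag in milliseconds to frames of lag at a given refresh rate.
def frames_of_lag(lag_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz  # duration of one frame in ms
    return lag_ms / frame_time_ms

print(frames_of_lag(20.0, 60.0))  # ~1.2 frames: more than one frame at 60 Hz
print(frames_of_lag(8.0, 98.0))   # ~0.78 frames: less than one frame at 98 Hz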

Another issue is that generally any TV smaller than 50" will have quality compromises compared to its larger siblings. Some even use a completely different panel. And as far as I'm aware, there are no quality TV displays within "optimal" desktop size (32" or smaller, generally).
 
Other influencing factors:
  • The stand. Can you find a 42" 4K TV that has a stand like most computer monitors do? E.g. the HP Z43 computer monitor has a "proper" stand (granted, if your environment permits, a VESA mount works as an alternative whether a TV or a computer monitor is used)

  • Matte screen. Can you find a 42" 4K TV that has a matte screen? E.g. the HP Z43 computer monitor has a matte screen
 
DisplayPort
Did some research and I definitely agree, this is a big one. I didn't know that HDMI was limited to 24/30p @ 4K and handled half the bandwidth of DisplayPort. DisplayPort is clearly the winner, and the next revision of DisplayPort will dominate even more with 8Kp60 support and 32 Gbps throughput. So if you need 4K for productivity, a computer monitor with DisplayPort is the way to go.
 
Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.


EDIT - er, I read that backwards. I am in favor of TVs being used alongside monitors. Depends what you need. - end edit.

Can you point me to where I can get a 48" and 56" monitor please? Each under $1000 US.

Each with a good quality image, chroma 444 and all that.

It would be nice if more TVs had DisplayPort.
 
Did some research and I definitely agree, this is a big one. I didn't know that HDMI was limited to 24/30p @ 4K and handled half the bandwidth of DisplayPort. DisplayPort is clearly the winner, and the next revision of DisplayPort will dominate even more with 8Kp60 support and 32 Gbps throughput. So if you need 4K for productivity, a computer monitor with DisplayPort is the way to go.

HDMI 2.0, as found in most 4K TVs, tops out at 4K @ 60 Hz. HDMI 2.1, found in some upcoming 2019 TVs, can handle 8K @ 60 Hz or 4K @ 120 Hz. There is no GPU with HDMI 2.1 ports out yet, though.

HDMI 2.0's limitation is mostly with HDR, where it doesn't have enough bandwidth, so to get 4K @ 60 Hz + HDR you need to subsample the chroma to 4:2:2 or 4:2:0.
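Rough numbers behind that, for anyone who wants to check them (a back-of-the-envelope sketch; it assumes the standard CTA-861 4K60 total timing of 4400 x 2250 pixels and HDMI 2.0's ~14.4 Gbps of effective video bandwidth after 8b/10b encoding):

# Back-of-the-envelope HDMI 2.0 bandwidth check for 4K @ 60 Hz.
TOTAL_W, TOTAL_H, REFRESH = 4400, 2250, 60   # active 3840x2160 plus blanking
HDMI20_EFFECTIVE_GBPS = 14.4                 # 18 Gbps raw minus 8b/10b overhead

def needed_gbps(bits_per_component: int, chroma: str) -> float:
    # average color samples per pixel for each chroma subsampling mode
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return TOTAL_W * TOTAL_H * REFRESH * bits_per_component * samples_per_pixel / 1e9

for bits, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (10, "4:2:0")]:
    need = needed_gbps(bits, chroma)
    fits = "fits" if need <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K60 {bits}-bit {chroma}: {need:.1f} Gbps -> {fits}")

8-bit 4:4:4 just squeezes in at ~14.3 Gbps, but 10-bit HDR at 4:4:4 needs ~17.8 Gbps, which is why HDR at 4K 60 Hz forces 4:2:2 or 4:2:0 over HDMI 2.0.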

One benefit HDMI has over DisplayPort is the ability to work with longer cables. If your computer is close to your display, as most are, it doesn't matter, but in my case, with a 4K TV in the living room and the computer in the next room, it took a pretty long and expensive HDMI cable to get things to work right.
 
This thread is full of valid points, so I'll just preach some more to the choir.

It's only now that the best TVs are becoming (near) ideal gaming displays, and soon, with 8K, productivity displays as well. Before, they were middling, and you couldn't really get high FPS unless you reduced the resolution.

Another factor is the sheer range of monitors available in terms of sizes and pricing. A lot of people consider 1440p at 27 inches to be the sweet spot in terms of PPI, and TVs, outside of 8K, simply don't do that kind of PPI.
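For anyone who wants the actual PPI numbers, it's simple Pythagoras (a quick sketch; the sizes are just common examples):

import math

# PPI = pixels along the diagonal divided by the diagonal in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(2560, 1440, 27))  # ~109 PPI, the usual "sweet spot"
print(ppi(3840, 2160, 42))  # ~105 PPI on a 42" 4K screen
print(ppi(3840, 2160, 55))  # ~80 PPI at typical living-room TV sizes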

The pricing for TVs is different: a midrange TV is usually priced the same as a high-end gaming monitor, and a high-end TV's price is only matched by a top-of-the-line monitor.

And a cheap monitor will be vastly more desirable for PC usage than a cheap TV. Input lag, PPI, chroma subsampling, processing, EOTF/gamma, etc. all add up to monitors simply being the variant of flat-panel tech better suited to PC usage, since that is, after all, what the engineers intended.
 
TVs usually don't do Adobe RGB or DCI-P3 well at all.

Feel free to correct me if that has changed.
 
Both of my 4K TV/monitors are at 60 Hz.

The new one says it is doing HDR, but who knows. I can't tell with what I have done so far.

The first time I watched Batman Beyond (remastered), Windows popped up a full-screen alert about switching to HDR output.

The text looks OK on the desktop. How would I tell if it were at 4:2:2? Or 4:2:0?

Edit: spell check issues
 
HDMI 2.0, as found in most 4K TVs, tops out at 4K @ 60 Hz. HDMI 2.1, found in some upcoming 2019 TVs, can handle 8K @ 60 Hz or 4K @ 120 Hz. There is no GPU with HDMI 2.1 ports out yet, though.

HDMI 2.0's limitation is mostly with HDR, where it doesn't have enough bandwidth, so to get 4K @ 60 Hz + HDR you need to subsample the chroma to 4:2:2 or 4:2:0.

One benefit HDMI has over DisplayPort is the ability to work with longer cables. If your computer is close to your display, as most are, it doesn't matter, but in my case, with a 4K TV in the living room and the computer in the next room, it took a pretty long and expensive HDMI cable to get things to work right.
HDMI runs into the same run-length limitations DisplayPort does once you saturate the bandwidth. You can't beat physics. HDMI is recommending Ultra High Speed cables shorter than 3 meters for HDMI 2.1.
TVs usually don't do Adobe RGB or DCI-P3 well at all.

Feel free to correct me if that has changed.
There are a lot of televisions now that cover almost all of the DCI-P3 color space. The more expensive ones are even starting to get close to covering Rec.2020.
Both of my 4K TV/monitors are at 60 Hz.

The new one says it is doing HDR, but who knows. I can't tell with what I have done so far.

The first time I watched Batman Beyond (remastered), Windows popped up a full-screen alert about switching to HDR output.

The text looks OK on the desktop. How would I tell if it were at 4:2:2? Or 4:2:0?

Edit: spell check issues
Well, all Blu-ray video is mastered using 4:2:0 chroma subsampling, so it wouldn't even matter in that context. If you want to test, use this image, displaying it 1:1 on your screen (no zoom or scaling). The third line from the bottom should be crisp and easy to read, and the bottom two lines should be clear.

http://i.rtings.com/images/test-materials/2017/chroma-444.png
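If you'd rather see the effect than squint at a chart, here's a rough sketch using Pillow (assuming it's installed; the file names are just placeholders) that fakes 4:2:0 subsampling on a screenshot so you can compare it against the untouched original:

# Simulate 4:2:0 chroma subsampling: keep luma at full resolution,
# store the two chroma channels at half resolution, then scale them back up.
from PIL import Image

def simulate_420(src_path: str, dst_path: str) -> None:
    img = Image.open(src_path).convert("YCbCr")
    y, cb, cr = img.split()
    half = (img.width // 2, img.height // 2)
    cb = cb.resize(half).resize(img.size)
    cr = cr.resize(half).resize(img.size)
    Image.merge("YCbCr", (y, cb, cr)).convert("RGB").save(dst_path)

simulate_420("screenshot.png", "screenshot_420.png")

Take a screenshot of some small colored text, run it through this, and compare: if your desktop already looks like the output, you're probably not getting full 4:4:4.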
 
Input lag? Duh. Some people don't like all the extra processing TVs these days force down your throat.

My NU8500 has:

1080p @ 60Hz : 15.4 ms
1080p @ 60Hz + HDR : 15.6 ms
1080p @ 60Hz Outside Game Mode : 77.0 ms
1080p @ 120Hz : 9.3 ms
4k @ 60Hz : 15.4 ms
4k @ 60Hz + HDR : 15.7 ms
4k @ 60Hz @ 4:4:4 : 15.8 ms
4k @ 60Hz @ 4:4:4 + 8 bit HDR : 15.9 ms
4k @ 60Hz Outside Game Mode : 63.4 ms
4k With Interpolation : 21.4 ms
4k @ 120 Hz : N/A
4k with Variable Refresh Rate : 15.2 ms
1080p with Variable Refresh Rate : N/A

Yes, a 144 Hz monitor is better; my point is that these numbers are not bad, and this is not even the best TV for gaming on the market.

When/if NV pulls its head from its ass and supports FreeSync over HDMI...
 
My NU8500 has:

1080p @ 60Hz : 15.4 ms
1080p @ 60Hz + HDR : 15.6 ms
1080p @ 60Hz Outside Game Mode : 77.0 ms
1080p @ 120Hz : 9.3 ms
4k @ 60Hz : 15.4 ms
4k @ 60Hz + HDR : 15.7 ms
4k @ 60Hz @ 4:4:4 : 15.8 ms
4k @ 60Hz @ 4:4:4 + 8 bit HDR : 15.9 ms
4k @ 60Hz Outside Game Mode : 63.4 ms
4k With Interpolation : 21.4 ms
4k @ 120 Hz : N/A
4k with Variable Refresh Rate : 15.2 ms
1080p with Variable Refresh Rate : N/A

Yes, a 144 Hz monitor is better; my point is that these numbers are not bad, and this is not even the best TV for gaming on the market.

When/if NV pulls its head from its ass and supports FreeSync over HDMI...
Adaptive Sync is not part of HDMI 2.0. AMD's implementation of FreeSync over HDMI is proprietary, as far as I'm aware.
 
TVs usually don't do Adobe RGB or DCI-P3 well at all.

Feel free to correct me if that has changed.

Good ones cover close to 100% of DCI-P3 nowadays, and some even go beyond that, so that's no longer a weak point for TVs, I suppose. Midrange Samsungs (NU8000 and up) do ~85%; higher-end ones like the Q7FN do 99.8%.
 