Cerulean ([H]F Junkie, joined Jul 27, 2006, 9,476 messages)
Howdy! What is the difference between 4K 42" TVs and 4K 42" computer monitors? Why would I want to choose a TV over a computer monitor, or vice versa?
> Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.

1. At that size, PBP (picture-by-picture): the option of viewing multiple inputs on the same screen simultaneously. TVs cannot do this.
> Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.

For gaming, the best televisions still have more than one frame of lag at 60 Hz, with up to 20 ms of input lag, and you only get there by turning off all the features that give the TV a potentially better picture. The input lag on my PG27UQ is only 8 ms including HDR and G-SYNC, which is less than one frame at 98 Hz.
> Displayport

Did some research and I definitely agree, this is a big one. I didn't know that HDMI was limited to 24/30p @ 4K and handled half the bandwidth of DisplayPort. DisplayPort is clearly the winner, and the next revision of DisplayPort will dominate even more with 8Kp60 support and 32 Gbps throughput. So if you need 4K for productivity, a computer monitor and DisplayPort is the way to go.
HDMI runs into the same run-length limitations DisplayPort does if you saturate the bandwidth; you can't beat physics. HDMI is recommending Ultra High Speed cables shorter than 3 meters for HDMI 2.1.

HDMI 2.0, as found in most 4K TVs, tops out at 4K @ 60 Hz. HDMI 2.1, found in some upcoming 2019 TVs, can handle 8K @ 60 Hz or 4K @ 120 Hz. There is no GPU with HDMI 2.1 ports out yet, though.

HDMI 2.0's limitation is mostly with HDR, where it doesn't have enough bandwidth, so to get 4K @ 60 Hz + HDR you need to subsample the chroma to 4:2:2 or 4:2:0.
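To put rough numbers on that: HDMI 2.0's link rate is 18 Gbps, of which 8b/10b coding leaves 14.4 Gbps for pixel data, and the standard CTA-861 4K60 raster (active video plus blanking) runs a 594 MHz pixel clock. A back-of-the-envelope sketch of what fits (the bits-per-pixel figures are the usual HDMI packings):

```python
# Which 4K60 formats fit in HDMI 2.0's effective bandwidth?
# CTA-861 4K60 total raster (active + blanking): 4400 x 2250 @ 60 Hz
PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # 594 MHz
HDMI20_DATA_RATE = 18e9 * 8 / 10         # 14.4 Gbps after 8b/10b coding

def needed_bps(bits_per_pixel):
    return PIXEL_CLOCK_HZ * bits_per_pixel

formats = [
    ("RGB / 4:4:4, 8-bit",  24),  # 3 components x 8 bits per pixel
    ("RGB / 4:4:4, 10-bit", 30),  # what HDR wants
    ("4:2:2, 12-bit",       24),  # HDMI packs 4:2:2 into 24 bits/pixel
    ("4:2:0, 10-bit",       15),  # chroma pair shared across 4 pixels
]
for name, bpp in formats:
    bw = needed_bps(bpp)
    verdict = "fits" if bw <= HDMI20_DATA_RATE else "does NOT fit"
    print(f"{name:22s} {bw / 1e9:5.2f} Gbps -> {verdict}")
```

8-bit RGB squeaks in at 14.26 Gbps, but 10-bit RGB needs 17.82 Gbps, which is exactly why 4K60 HDR on HDMI 2.0 forces 4:2:2 or 4:2:0.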
One benefit HDMI has over DisplayPort is the ability to work with longer cables. If your computer is close to your display, as most are, it doesn't matter; but I have a 4K TV in the living room, and connecting my computer in the next room required a pretty long and expensive HDMI cable to get things working right.
> TVs usually don't do Adobe RGB or DCI-P3 well at all. Feel free to correct me if that has changed.

There are a lot of televisions now that cover almost all of the DCI-P3 color space. The more expensive ones are even starting to get close to covering Rec. 2020.
> Both of my 4K TV/monitors are at 60 Hz. The new one says it is doing HDR, but who knows; I can't tell with what I have done so far. The first time I watched Batman Beyond (remastered), Windows popped up a full-screen alert about switching to HDR output. The text looks OK on the desktop. How would I tell if it were at 4:2:2? Or 4:2:0?

Well, all Blu-ray video is mastered with 4:2:0 chroma subsampling, so it wouldn't even matter in that context. If you want to test, use this image, displaying it 1:1 on your screen (no zoom or scaling). The third line from the bottom should be crisp and easy to read, and the bottom two lines should be clear.
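Colored fine text is the standard test because 4:2:0 keeps full-resolution luma but shares one chroma sample per 2x2 pixel block, so single-pixel color detail averages away. A toy numpy sketch of the effect (illustrative BT.601-style matrix and box filter, not how any real encoder does it):

```python
import numpy as np

# BT.601-style RGB -> YCbCr (float math, illustration only)
def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299  * r + 0.587  * g + 0.114  * b
    cb = -0.1687 * r - 0.3313 * g + 0.5    * b
    cr =  0.5    * r - 0.4187 * g - 0.0813 * b
    return np.stack([y, cb, cr], axis=-1)

def subsample_420(ycbcr):
    """Average each 2x2 chroma block, then replicate it back:
    luma keeps full resolution, chroma loses single-pixel detail."""
    out = ycbcr.copy()
    h, w = ycbcr.shape[0], ycbcr.shape[1]
    for c in (1, 2):
        ch = ycbcr[..., c]
        avg = ch.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out[..., c] = np.repeat(np.repeat(avg, 2, axis=0), 2, axis=1)
    return out

# Alternating 1-pixel red/blue columns: pure chroma detail, like colored text
img = np.zeros((4, 4, 3))
img[:, ::2, 0] = 1.0   # red columns
img[:, 1::2, 2] = 1.0  # blue columns

yuv = rgb_to_ycbcr(img)
sub = subsample_420(yuv)

print(np.allclose(sub[..., 0], yuv[..., 0]))  # luma survives untouched
print(np.ptp(yuv[..., 1]), "->", np.ptp(sub[..., 1]))  # Cb detail flattened
```

On this pattern the Cb channel swings between the red and blue values before subsampling and is completely flat afterward, which is exactly the smearing you see on red text in the test image.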
> HDMI 2.0 a 6' cable should work, no?

Absolutely. If I remember correctly, 5 m / 16 ft is the recommended limit for 4K/60 on HDMI 2.0.
> Personally, I still do not understand why people choose monitors over TVs. What allure at all does a monitor have over a TV? Beyond me.
Input lag? Duh. Some people don't like all the extra processing TVs these days force down your throat.
> My NU8500 has:
> 1080p @ 60 Hz: 15.4 ms
> 1080p @ 60 Hz + HDR: 15.6 ms
> 1080p @ 60 Hz outside game mode: 77.0 ms
> 1080p @ 120 Hz: 9.3 ms
> 4K @ 60 Hz: 15.4 ms
> 4K @ 60 Hz + HDR: 15.7 ms
> 4K @ 60 Hz @ 4:4:4: 15.8 ms
> 4K @ 60 Hz @ 4:4:4 + 8-bit HDR: 15.9 ms
> 4K @ 60 Hz outside game mode: 63.4 ms
> 4K with interpolation: 21.4 ms
> 4K @ 120 Hz: N/A
> 4K with variable refresh rate: 15.2 ms
> 1080p with variable refresh rate: N/A
>
> Yes, a 144 Hz monitor is better; my point is that these numbers are not bad, and this is not even the best TV for gaming on the market.

Adaptive Sync is not part of HDMI 2.0. AMD's implementation of FreeSync over HDMI is proprietary, as far as I'm aware.
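For perspective, input lag divided by frame time gives frames of delay: at 60 Hz a frame lasts about 16.7 ms, so 15.4 ms in game mode is just under one frame while 63.4 ms outside game mode is nearly four. A quick sketch using a few of the NU8500 figures quoted above:

```python
# Convert a measured input lag (ms) into frames of delay at a refresh rate
def frames_of_lag(lag_ms, refresh_hz):
    frame_time_ms = 1000.0 / refresh_hz
    return lag_ms / frame_time_ms

for label, lag_ms, hz in [
    ("4K @ 60 Hz, game mode",         15.4, 60),
    ("4K @ 60 Hz, outside game mode", 63.4, 60),
    ("1080p @ 120 Hz",                 9.3, 120),
]:
    print(f"{label}: {frames_of_lag(lag_ms, hz):.2f} frames")
```

Note that 9.3 ms at 120 Hz is actually slightly more than one 120 Hz frame (8.3 ms), even though it is the lowest absolute latency in the table.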
When/if NV pulls its head out of its ass and supports FreeSync over HDMI...