Running 24p on 120Hz monitor

baco80
Weaksauce · Joined: Jan 22, 2010 · Messages: 111
If I'm looking to run 24p as well as possible,

do I have to set the graphics card output to 24Hz and let the monitor do the 5:5 pulldown...?

...or do I have to set the graphics card output to 120Hz?
 
What is your 24p source? Do you have a BD player in your PC, or are you hooking up a PS3 to the monitor, or are you playing video from a hard drive?

Most of the info I find is about people dealing with displaying 24p content on 60Hz LCDs. A true 120Hz LCD should make this much simpler. For starters, you probably want to use the latest version of MPC-HC (the sourceforge version is ancient).
 
I'm considering purchasing a 120Hz display, and in that case I'm wondering if I should set the graphics card output to 24Hz or 120Hz.

I mean, is it usually the display that takes the 24Hz signal and does the 5:5 pulldown to show the video at 24p@120Hz... or does it just take a 120Hz signal straight out of the PC/Blu-ray player?

A separate question is how the software player or Blu-ray device handles 23.976 video and converts it to a 24p cadence.
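On the 23.976 question: "23.976" is really 24000/1001 fps, and a quick back-of-the-envelope calculation (plain Python, nothing player-specific, just illustrating the timing) shows why a player can't simply clock it at exactly 24Hz without drifting against the audio:

```python
from fractions import Fraction

# "23.976 fps" is exactly 24000/1001 fps (NTSC-derived timing).
true_fps = Fraction(24000, 1001)
display_fps = Fraction(24, 1)

# A 2-hour film measured by each clock.
frames = true_fps * 7200                  # total source frames in 2 hours
runtime_at_24 = frames / display_fps      # seconds if each frame is held 1/24 s

drift = float(runtime_at_24 - 7200)       # negative: video finishes early
print(f"source fps: {float(true_fps):.3f}")
print(f"drift over 2 hours at exactly 24 Hz: {drift:.2f} s")  # ~ -7.19 s
```

So a player either drops the refresh rate to 23.976Hz to match the content, or resamples/slews the audio; showing 24000/1001 fps material on an exact 24Hz clock leaves it about 7 seconds out of sync after two hours.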
 

The monitor will run at whatever frequency the source tells it to. If your computer is sending the monitor a 120Hz signal, that's what it will display. If it's sending a 59 or 60Hz signal, that's what the monitor will display.

So with that said, if your Blu-ray player/PS3/DVD player/etc. is sending out a 60Hz signal, that's what the monitor will use.
 
Blu-ray players can send either 1080p at 24fps or 1080i at 60Hz. No Blu-ray player will send 1080p at 60fps.
 

Umm, can't the monitor take a 60Hz signal, double it, and show a refresh rate of 120Hz?

I mean, if I'm going for a 240Hz-capable TV, do I absolutely need a 240Hz source to get the TV working at those advertised 240Hz?
 

24 does not divide evenly into 60, but it does into 120 and 240.

If you are looking at a 120Hz or 240Hz LCD TV, it will only take 60Hz at maximum from a PC's video card, and then insert the rest of the frames internally in the TV. If you are looking at a 120Hz LCD monitor, it will take the full 120Hz from the PC's video card.
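The divisibility point is easy to sanity-check: a refresh rate shows 24p judder-free exactly when it's an integer multiple of 24, so every film frame gets held on screen for the same amount of time. A trivial Python check, purely for illustration:

```python
# A refresh rate is judder-free for 24p when it divides evenly by 24:
# each film frame is then repeated the same number of times.
for hz in (60, 72, 96, 100, 120, 240):
    repeats, rem = divmod(hz, 24)
    if rem == 0:
        print(f"{hz} Hz: even {repeats}:{repeats} cadence, no judder")
    else:
        print(f"{hz} Hz: uneven cadence (e.g. 3:2 pulldown at 60 Hz) -> judder")
```

That's why 72, 96, 120 and 240Hz all work cleanly for film, while 60Hz forces the alternating 3:2 frame-repeat pattern.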
 

Why can't a 120Hz LCD TV take a 120Hz signal, just like the monitor?

...and on the other side, why can't a 120Hz LCD monitor take 60Hz and double it, the way you described for the TV?
 
Why can't a 120Hz LCD TV take a 120Hz signal, just like the monitor?

Because it's a marketing gimmick. They never should have called them 120Hz or 240Hz TVs because that's not what they're doing. It's causing endless amounts of confusion.

...and on the other side, why can't a 120Hz LCD monitor take 60Hz and double it, the way you described for the TV?

The TV has internal circuitry that generates the additional frames, while the monitor does not - on a PC, that sort of thing would be handled by the video card, or (most likely) the media player that you play the video with.
 

Very clear, but in this case I'm not talking about generating frames or frame interpolation, just about Hz. Why can't the monitor simply take that 60Hz refresh rate and double it to 120Hz, without generating new or interpolated frames?


About the TV marketing gimmick, fair enough. But then what does the "24p capable" feature on some TVs imply? Being capable of taking a 24Hz signal (from a Blu-ray player, for example) and refreshing it at 72, 96 or 120Hz?
 

Why would you want to? It would just be showing each frame twice at that point (or you'd get a blank frame displayed in between each picture frame). You wouldn't see any difference from viewing it on a 60Hz display. Actually, I think this is what happens on a 120Hz TV when you turn off frame interpolation.
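For what it's worth, that "show each frame twice (or five times)" idea is just a repeat cadence, and it's easy to sketch. Here's a hypothetical `pulldown_sequence` helper (plain Python, purely illustrative, not anything a real display does in software) that expands film frames into display refreshes:

```python
def pulldown_sequence(cadence, frames):
    """Expand source frames into display refreshes using a repeat cadence.

    cadence is a list of repeat counts applied cyclically:
    [3, 2] models 3:2 pulldown (24p -> 60 Hz),
    [5]    models 5:5 pulldown (24p -> 120 Hz).
    """
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * cadence[i % len(cadence)])
    return out

src = ["A", "B", "C", "D"]            # four consecutive film frames
print(pulldown_sequence([3, 2], src)) # 3:2 -> A A A B B C C C D D (uneven hold times)
print(pulldown_sequence([5], src))    # 5:5 -> every frame held exactly 5 refreshes
```

The 5:5 case is exactly "each frame repeated", with no new image content generated, which is why it looks identical in motion to 24Hz output, just without judder.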
 
Why can't a 120Hz LCD TV take a 120Hz signal, just like the monitor?

...and on the other side, why can't a 120Hz LCD monitor take 60Hz and double it, the way you described for the TV?

For the first part, it has to do with the bandwidth of HDMI 1.3. It can't handle the amount of data required for 120 screen refreshes @ 1920x1080. Now, with that said, I'm not sure why they can't make a 120Hz TV that takes a 120Hz signal over dual-link DVI (or 120Hz @ 720p over HDMI). I can only assume it's not something most people would pay a premium for.

For the second part, that's because the monitor relies 100% on its source to tell it what frequency (and resolution) to run at. If the source is telling it to run at 59, 60, 100, 110 or 120Hz, the monitor will run at that frequency. It doesn't have the circuitry to know that it should be doubling the frequency of a 60Hz input. Even if it did, it would do absolutely nothing for the image quality.
 
Why can't a 120Hz LCD TV take a 120Hz signal, just like the monitor?
for the first part, it has to do with the bandwidth of HDMI 1.3. It can't handle the amount of data that's required for 120 screen refreshes @ 1920x1080.

That doesn't make much sense; if that were the case, neither the TV nor the monitor would be able to take any 120Hz signal over HDMI. I figured the difference came down to the monitor/TV input capability, not the interface.

What's the point, then, of 120Hz monitors with no dual-link DVI? By your statement, HDMI would be useless for benefiting from the 120Hz feature.

By the way..:"HDMI 1.3 was released June 22, 2006 and increased the single-link bandwidth to 340 MHz (10.2 Gbit/s)"

1920x1080 @ 120Hz x 24-bit color = 5.97 Gbit/s
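Spelling that arithmetic out (a rough sketch that ignores blanking intervals and TMDS 8b/10b coding overhead, both of which push the real link requirement higher than the raw pixel figure):

```python
# Raw active-pixel bandwidth for 1080p @ 120 Hz with 24-bit color.
# Blanking intervals and HDMI's 8b/10b TMDS coding are ignored here,
# so a real link needs noticeably more than this number.
width, height, hz, bpp = 1920, 1080, 120, 24
payload_gbps = width * height * hz * bpp / 1e9
print(f"active pixel data: {payload_gbps:.2f} Gbit/s")  # ~5.97 Gbit/s
```

So the raw pixel math alone does fit inside HDMI 1.3's quoted 10.2 Gbit/s; the practical question is how much of that is left after coding overhead and blanking.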
 
From what I understand, the TV inserts the frames via post-processing, as TVs have specific chips designed to do this. Now why can't a computer monitor do this? Well, they don't have these chips, since they are intended for different uses. The post-processing adds a lot of input lag, so the vast majority of computer users would leave it off. It doesn't make sense for manufacturers to sell a more expensive monitor with this feature, since the market for it would be highly limited.
 