Higher Frames Per Second = Bad

And now something completely different ...
You know you all want to be the first to overclock a movie projector to get more FPS out of the movie! Custom watercool and volt mod that biach!
 
There will definitely be good and bad that comes with this, that much is obvious. We'll have studios jump on the bandwagon and completely butcher it, and others who embrace and improve it.

Just like everything else. I'm definitely curious as to how it looks, however.
 
I'd prefer it if filmmakers filmed everything at 1k fps and released it at all the currently popular frame rates.
I am sick and tired of panning scenes looking like a quick slideshow.
I am sick of watching a massive blur attack on my screen during action scenes.
I am sick of stupid assumptions that high fps and realistic footage will be "too real", because really, why would I pay for a Blu-ray if I didn't want it to be more "realistic"?

I want 72 fps films right now. It's closer to what almost everyone views at home anyway (60-75Hz), and there is no reason not to film at a higher fps. If people really DON'T like it, you can still release it at 24fps, but you can't release a 24fps film at 48 or 72.

Thank you!!! When I saw Bourne Ultimatum in theaters there were only front row seats left. The movie was not enjoyable in that setting....
 
A Blu-ray won't do 1080p/48fps anyway, so it's completely optional, and not 100% of theaters will be showing it this way either.
 
And now something completely different ...
You know you all want to be the first to overclock a movie projector to get more FPS out of the movie! Custom watercool and volt mod that biach!

No need for that, put two in SLI and bang! 48FPS!
You could try CrossFireX too, but you'll have to wait for working drivers :D

(post being deleted by Kyle in 3... 2... 1...) :p
 
I think people are attributing other issues to the 48FPS.

I recall something about the cameras having a hard time with the color red, so the props had to be modified for the cameras. Apparently another issue is the contrast seems to be a bit wonky as well.

This is due to the camera; you can have 48FPS without these issues. Assuming the contrast claims are true, that needs to be compensated for until a camera capable of proper capture is made. I'm not sure why they had to redo paint/makeup for the cameras, but in Peter Jackson's behind-the-scenes Hobbit videos they said they had to repaint props and redo makeup for the scenes.
 
I think Jackson is banking on 240Hz displays in the future (48 × 5). On 120Hz displays there's going to be a small amount of judder. It's half as much and twice as fast, so it will barely be noticeable, but it will still be there.

The main reason for 48 fps over 60 is that it's easier to convert down to 24 fps for cinemas that don't have 48 fps projectors.
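A rough sketch of the arithmetic behind both points (purely illustrative Python, not from any of the posts): a 120Hz panel can't divide 48 fps evenly (120/48 = 2.5), so frames are held for alternating refresh counts, which is the judder; a 240Hz panel divides it exactly. And 48 → 24 conversion is trivial because you just drop every other frame.

```python
def repeat_pattern(display_hz, fps, n=6):
    """How many display refreshes each source frame is held for.
    An uneven pattern is judder; a constant one is smooth playback."""
    step = display_hz / fps          # refreshes per source frame
    pattern, shown_total, acc = [], 0, 0.0
    for _ in range(n):
        acc += step
        pattern.append(int(acc) - shown_total)
        shown_total = int(acc)
    return pattern

print(repeat_pattern(120, 48))   # [2, 3, 2, 3, 2, 3] -> alternating hold times (judder)
print(repeat_pattern(240, 48))   # [5, 5, 5, 5, 5, 5] -> even hold, no judder

# Converting 48 fps down to 24 fps for older projectors: drop every other frame.
frames_48 = list(range(12))      # stand-in for 12 captured frames
frames_24 = frames_48[::2]       # [0, 2, 4, 6, 8, 10]
```

The same function with (60, 24) gives the uneven [2, 3, 2, 3, ...] cadence of 3:2 pulldown, which is why 60 fps would be the harder source to bring down to 24.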
 
I think people are attributing other issues to the 48FPS.

I recall something about the cameras having a hard time with the color red, so the props had to be modified for the cameras. Apparently another issue is the contrast seems to be a bit wonky as well.

This is due to the camera; you can have 48FPS without these issues. Assuming the contrast claims are true, that needs to be compensated for until a camera capable of proper capture is made. I'm not sure why they had to redo paint/makeup for the cameras, but in Peter Jackson's behind-the-scenes Hobbit videos they said they had to repaint props and redo makeup for the scenes.

Yeah, apparently a common problem with the RED EPIC cameras. I saw a comparison video of various movie cameras and the difference in saturation is clear. It doesn't matter too much these days with color correction software.
 
Higher fidelity recording, whether it be resolution or framerate, often has the unfortunate side effect of exposing poor production quality. Obvious stunt doubles, cheap looking props, badly designed sets, etc.
 
A downside that I didn't think of at first is that blu-ray won't support the higher frame rates. Blu-Ray can only do 1080p24, or 1080i60. To keep the framerate and introduce interlacing, the resolution needs to drop to 720p.

...I'm sure that they'd love to introduce a new format.
 
A Blu-ray won't do 1080p/48fps anyway, so it's completely optional, and not 100% of theaters will be showing it this way either.

But a fair amount are likely capable. 35mm is slated to be on less than 37% of global cinema screens by the end of this year. Most of these are at least 4K capable. I wouldn't doubt that most could handle 48fps since it's been a known logical step up for the past couple of years.

Not to mention that adding an additional 48fps profile to a cinema system that uses Blu-ray-like discs would be nowhere near as disruptive as it would be at home. Not that it matters; I wouldn't doubt Sony would do a new home profile like they did for 3D TVs when they became affordable. Most cinema projectors store the movie on a hard drive anyway.
 
While I haven't seen a movie natively recorded in 48fps, I have to say that what I have seen upscaled to that framerate has been absolutely horrid and unenjoyable to watch. I have to agree with every complaint people have made: it makes the movie feel like a soap opera on VHS. I'll give The Hobbit a chance since both Jackson and James Cameron are vouching for this and they aren't going to screw around, but I have already given it a fair shot in the TV market and absolutely despise it.

And I don't think this is one of those "you get used to it" sort of things either. Motion blur and a lack of clarity are part of what adds to the cinematic experience. Remember, these are movies, people, not real life. Hollywood can't quite emulate real life yet, so when you introduce something like 48fps, your eyes can pick apart the pieces of every scene and recognize just how fake it is. Sets look like sets instead of real locations, props are easily noticeable, and special FX look like cheap homebrew trash. Everything just looks utterly fake.

Film-making is an art. Much like a painter may decide to blur the lines of his subject, or paint something completely abstract, so may a movie. Not every movie needs to look "uber-real" to give us the impression of realism.
 
I did. If frame rate and shutter speed have no relation to each other, it shouldn't be a problem, right?
Nope. A camera which supports 48 fps and a 1/24 shutter will shoot 48 fps at 1/24. You wouldn't actually do this, but you could, as the frame rate and shutter speed are independent, as I said earlier. Whether the effect of a slower-than-frame-rate shutter speed is desirable is a separate debate.
 
There's a Sharp TV at the Fry's nearby playing Tron over and over. The contrast ratio is great, but motion looks... odd. I don't think it's the movie, but I honestly can't recall. Is that TV 120 or 240Hz? Or "smooth vision"? I have a 120Hz PC display, but that doesn't do anything funky to movies. Overall, I would say it makes the film look like a cheap made-for-TV movie.

The refresh rate has absolutely nothing to do with it, not at all.

The reason it looks strange is frame interpolation, in which the TV set fabricates frames in between the actual frames of the movie.

This can be disabled in any good set, and all reference standards recommend doing so.
 
48FPS should be the new standard, period. That way you can continue to show both old 24fps content and new 48fps content... I fail to see the debate here. Just because The Hobbit is filmed at 48fps doesn't mean all scenes will be output at 48fps. If you want stuttery film, you want a faster shutter speed anyway, to eliminate motion blur from being captured, making longer pauses between frames.

This is NOT motion compensation, frame doubling, etc.; it's simply a new tool. Just like with all new tools, there will be a lot of bad examples and only a few good ones at first. Once the methodology of using this new format gets hammered out, people will wonder why we didn't get it sooner.

I'm sure the studios will start selling us updated copies of all the classics "remastered" to 48fps, though... that will be a disaster...
 
48Hz is a step in the right direction...but it should be at least 60Hz. It's worse with some movies and better with others, but some of them are intolerable for me at 24Hz. I remember Avatar was especially bad, I actually couldn't watch parts of the movie because the jerking was going to make me throw up.
 
Remember the FMV scenes in the early Final Fantasy games on the PSX? I think the framerate there was capped at something like 15-20fps. I remember when the PS2 came out, how much cooler and more fluid its 30fps FMV scenes looked. Of course, they also had much higher quality and more detailed scenery to work with.

You can't just take a movie, film it in 48fps, and expect it to suddenly look better. There needs to be a proportional increase in set detail to go with it.
 
48Hz is a step in the right direction...but it should be at least 60Hz. It's worse with some movies and better with others, but some of them are intolerable for me at 24Hz. I remember Avatar was especially bad, I actually couldn't watch parts of the movie because the jerking was going to make me throw up.

Just about every movie is 24fps.
 
This is just a bunch of artsy-fartsy film academy snobs turning their noses up at what is a long-overdue and logical improvement. I hate low FPS in games, and I hate it in movies. I look forward to seeing The Hobbit in 48FPS!
 
I tried watching The King's Speech on a smooth motion 50", it looked like news footage. I disliked it enough that I finished watching it on my 15" laptop.
 
I tried watching The King's Speech on a smooth motion 50", it looked like news footage. I disliked it enough that I finished watching it on my 15" laptop.

Can't you turn smooth motion off? I have it off on my TV because when it's on the video will often stutter.
 
It can be weird and disturbing at first. It's called the "soap opera effect" because those cheap productions used cheap 60 fps cameras, and sometimes they sped up the non-talking frames to save air time. Just give it a go until you get used to it; then when you see 24 fps material, you'll be like, "How did I put up with that choppy crap?"
 
Can't you turn smooth motion off? I have it off on my TV because when it's on the video will often stutter.

In reality, the majority of 120 and 240Hz sets run at those speeds all the time. The toggle just enables/disables the extra frames added via software (interpolation) to fill in the gaps in the normal 60fps picture. This is why some people can still see a stutter even with the feature turned off.
 
Why are so many people here confusing smooth motion or post-production interpolation with filming at a higher framerate?
 
Why are so many people here confusing smooth motion or post-production interpolation with filming at a higher framerate?

I don't know, I don't understand it either.

Smooth motion looks nothing like actual higher-fps footage. Frame interpolation is an ugly gimmick.
 
I don't get why all of you are complaining about what you think 48fps will look like. Have any of you ever seen a 48fps source? Didn't think so.

You complain that 120Hz looks "fake" or "artificial" - THAT'S BECAUSE IT IS. They are adding artificial frames with algorithms.

Don't compare your Walmart Samdung LCDs in Soap Opera mode to what a true 48fps source will look like. Holy crap.

A 48fps source is going to eff'ing rock. I cannot stand watching movies (especially in 3D IMAX) with a lot of close-up action shots - it's a damn slideshow, as previously stated. The technology is not sufficient.

48fps will be a GIANT leap forward in fidelity, just like 4K.
 
Here's an interesting thing for us gamers...

Think about all the Call of Duty: Modern Warfare / Black Ops / COD5 commercials you see on actual television. All the visuals in those are rendered at super high res on powerful PCs and passed off as "gameplay footage" for consoles.

Those are running at very high framerates despite the actual commercial being only 24 FPS. You get the same soap opera effect!
 
48Hz is a step in the right direction...but it should be at least 60Hz. It's worse with some movies and better with others, but some of them are intolerable for me at 24Hz. I remember Avatar was especially bad, I actually couldn't watch parts of the movie because the jerking was going to make me throw up.

I doubt you saw "jerking" in the theater. That was probably the effect of 24fps->60Hz 3:2 pulldown. The only way to eliminate that is a TV that properly handles 1080p/24fps... and a display device which outputs it. The output part is actually somewhat uncommon; even my almost-top-end Panasonic plasma flickers enough at 24fps that I don't use the mode and just deal with the stutter caused by 3:2 pulldown.
 
I doubt you saw "jerking" in the theater. That was probably the effect of 24fps->60Hz 3:2 pulldown. The only way to eliminate that is a TV that properly handles 1080p/24fps... and a display device which outputs it. The output part is actually somewhat uncommon; even my almost-top-end Panasonic plasma flickers enough at 24fps that I don't use the mode and just deal with the stutter caused by 3:2 pulldown.

I most certainly did. In panning scenes, the camera "jumps" from frame to frame, which is especially noticeable on IMAX. This isn't surprising, considering the math...

I recently saw The Hunger Games at the Lincoln Square IMAX. The screen there is 97 feet wide. If a scene is translating horizontally across a distance of 100 or so feet in 2-3 seconds (the "reaping" scene), the image is moving at 33-50 feet per second. At 24 frames per second, that means every frame on that 97-foot-wide screen is offset from the previous one by roughly 1.4 to 2.1 feet. That jump is *huge* and makes the characters look like they're stuttering across the screen.
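The per-frame offset worked out above can be sketched in a few lines (using the post's own numbers; the 100 ft / 2-3 s figures are the poster's estimates, not measurements), and it also shows how doubling the frame rate halves the jump:

```python
def offset_per_frame(distance_ft, seconds, fps):
    """On-screen displacement between consecutive frames of a pan."""
    return distance_ft / (seconds * fps)

# ~100 ft of travel in 2-3 s at 24 fps on a 97-ft IMAX screen:
fast = offset_per_frame(100, 2, 24)        # ~2.08 ft per frame
slow = offset_per_frame(100, 3, 24)        # ~1.39 ft per frame
print(round(slow, 2), round(fast, 2))      # 1.39 2.08

# The same pan at 48 fps: each jump is exactly half the size.
print(round(offset_per_frame(100, 2, 48), 2))   # 1.04
```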
 
Nope. A camera which supports 48 fps and a 1/24 shutter will shoot 48 fps at 1/24. You wouldn't actually do this, but you could, as the frame rate and shutter speed are independent, as I said earlier. Whether the effect of a slower-than-frame-rate shutter speed is desirable is a separate debate.

You realize that on a camera that allows what you describe, it'll hold one image for both exposed frames, essentially halving your frame rate, right? Like it or not, you cannot divorce frame rate and shutter speed. As I've said, while they are independently adjustable, they are linked.
 
Sure, you'll get an effective frame rate of 24 fps. But your frame rate will still be 48 fps: the sensor is sampled at a rate of 48 Hz.

Your first statement, "the faster your frame rate, the faster your shutter speed," was wrong. That's what I corrected, and I was right to have done so. End of story.
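The independence being argued here can be put in numbers (a sketch of the standard relationships; the 180-degree convention is the usual cinema pairing, not something either poster cited): frame rate fixes the sampling interval, shutter speed fixes the exposure within it, and the edge case in dispute is an exposure longer than the interval.

```python
def frame_interval(fps):
    """Time between successive sensor samples, in seconds."""
    return 1.0 / fps

def shutter_180(fps):
    """Conventional '180-degree' shutter: exposure is half the frame interval."""
    return 1.0 / (2 * fps)

print(frame_interval(48))   # ~0.0208 s between samples at 48 fps
print(shutter_180(48))      # 1/96 s exposure, the usual pairing

# The disputed edge case: 48 fps sampled with a 1/24 s shutter.
# Each exposure outlasts the frame interval, so adjacent frames overlap
# heavily: still 48 samples per second, but an effective look near 24 fps.
exposure = 1.0 / 24
print(exposure > frame_interval(48))   # True
```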
 
Does 60 fps count?

Ditto here; I work with 60fps footage many days a week. A well-photographed scene recorded in 4K@60fps turns into ugly crap. Geez, even progressive 30fps doesn't look as good as 24fps.

But all this is probably a subjective matter: some of us love the "lack of realism" of low-framerate cinema, while others would kill for 120fps footage.

What I can't stand is ignorant people thinking that 24fps lovers are all neanderthals fighting modern tech. I'll be one of the first guys to get a 4K or 8K TV if it doesn't force me to watch movies at interpolated 120Hz (not even porn :D )

I doubt you saw "jerking" in the theater. That was probably the effect of 24fps->60Hz 3:2 pulldown. The only way to eliminate that is a TV that properly handles 1080p/24fps... and a display device which outputs it. The output part is actually somewhat uncommon; even my almost-top-end Panasonic plasma flickers enough at 24fps that I don't use the mode and just deal with the stutter caused by 3:2 pulldown.

*brofist*
 
I doubt you saw "jerking" in the theater. That was probably the effect of 24fps->60Hz 3:2 pulldown. The only way to eliminate that is a TV that properly handles 1080p/24fps... and a display device which outputs it. The output part is actually somewhat uncommon; even my almost-top-end Panasonic plasma flickers enough at 24fps that I don't use the mode and just deal with the stutter caused by 3:2 pulldown.

I notice it at 60Hz too. Just much less often.
 
I cannot watch or own a plasma.
The 60Hz flicker kills my head after about 30 seconds, similar to a 60Hz CRT. DLP and LCD are the only techs I can stand.
It affects about 15% of people.

And that's a cool story bro.
 
I cannot watch or own a plasma.
The 60Hz flicker kills my head after about 30 seconds, similar to a 60Hz CRT. DLP and LCD are the only techs I can stand.
It affects about 15% of people.

And that's a cool story bro.

People say I'm crazy when I say I can see the flickering you mentioned. On every computer I ever had to use, I ended up changing the monitor's refresh rate.
 