
Why Do People Still Shoot at 24FPS? It always ruins the footage for me


herein2020

I know this is probably very controversial, but I ask myself this question every time I see a video shot in the USA at 24FPS instead of 30FPS: why did they do that? I am mainly talking about the USA because I know overseas there is PAL, 50Hz refresh rates, and some other things involved in that formatting which I know nothing about. I will assume that if my TV were set to PAL and the frame rate were 25FPS, it would look the same to my eyes as my TV set to NTSC with the frame rate at 30FPS. So back to shooting at 24FPS or 23.976FPS in the USA... I just don't get it; I have never seen 24FPS footage (that I am aware of) anywhere other than Hollywood that does not look like it is stuttering badly. If there is no motion, or it's a talking head, then sure, I can't tell the difference; but most of the time the footage looks great... except it is stuttering along due to the frame rate whenever there is fast motion.

To me, on the TVs and monitors that I use to view YouTube and online content, I can almost always tell when it's not 30FPS, and there's nothing "cinematic" about it. I even researched the history of frame rates, and I know they started out that way to save film stock, but those days are long gone. Motion simply isn't smooth if it is not shot at 29.97FPS (30FPS), in my opinion. Somehow Hollywood gets away with it; maybe it's their post processing, their camera equipment, etc., but any other footage at that frame rate is just a stuttering mess to me if there is fast action or a lot changes between frames. I have also watched a lot of videos on frame rates, and they describe the problems that occur when you shoot at 59.94FPS and then try to slow the footage down to 50% on a 24FPS timeline... let alone a 23.976FPS one.

Even with Hollywood, playing a movie straight from a DVD, there have been scenes that were hard for me to watch because the frames appeared to stutter. So am I the only one who thinks this way? Is it something with H.265/H.264, YouTube compression, Long GOP compression, bitrates, or something else that makes 24FPS look so terrible most of the time when motion is involved?


I see that too: a mantra, repeating what they heard. "Cinematic look." 3:2 pulldown. I also disagree with the dogma of the 1/48s exposure. I understand the 180° shutter for the rotating reflex camera, but today it is no longer necessary; the extra blur drag is not too bad compared to what it offers in quality of movement, not to mention the increased exposure when necessary.


Funny how perception and difference coexist amongst the human race, distinct periods of life, backgrounds, etc...

In a line:

Cinema is a lie told @ 24 frames-per-second; it doesn't matter what Godard thought ; ) Irreal, a replaced, pictured reality as alternative is everything we want when we suspend our own disbelief (E ;- )


I think it just boils down to what people are used to. When I watch a film that isn't 24fps, it's jarring. For better or worse, it's the "standard." 

That translates to others in the video world wanting to film in 24 fps because they think it's more "cinematic," even if they're just YouTubers in their "studio" (aka converted guest room). The jittering you're noticing might be related to shutter speed more than frame rate? A lot of YouTubers leave their exposure on auto or set it to aperture priority (blurry backgrounds!) because it's easier and they're more concerned with bokeh than motion cadence. There are other reasons their footage might look jittery too, like filming at 23.976 but editing or exporting at 24 fps (though I think most software these days is smart enough to "fix" this?) 

Usually, though, I don't notice 24 fps issues with most of the people I watch who film in it. It is kinda silly when you think about grown men in their guest bedrooms talking to a camera being worried about whether their videos look "cinematic." I don't think anyone really notices or cares, except maybe other grown adults who do the same thing!

Most of my work is 29.97 fps, since that's what most event videography calls for. 24 fps is generally for weddings and narrative projects or when a client specifically asks for it. 


Human vision blurs while watching fast action or moving your gaze in real life. When I watch 60fps, it strikes me as much less natural: too fast and sharp to feel like natural vision. 

Not to mention you'll never get a high frame rate movie distributed unless you're a big deal, and even then...


1 hour ago, newfoundmass said:

I think it just boils down to what people are used to. When I watch a film that isn't 24fps, it's jarring. For better or worse, it's the "standard." 

That translates to others in the video world wanting to film in 24 fps because they think it's more "cinematic," even if they're just YouTubers in their "studio" (aka converted guest room). The jittering you're noticing might be related to shutter speed more than frame rate? A lot of YouTubers leave their exposure on auto or set it to aperture priority (blurry backgrounds!) because it's easier and they're more concerned with bokeh than motion cadence. There are other reasons their footage might look jittery too, like filming at 23.976 but editing or exporting at 24 fps (though I think most software these days is smart enough to "fix" this?) 

Usually, though, I don't notice 24 fps issues with most of the people I watch who film in it. It is kinda silly when you think about grown men in their guest bedrooms talking to a camera being worried about whether their videos look "cinematic." I don't think anyone really notices or cares, except maybe other grown adults who do the same thing!

Most of my work is 29.97 fps, since that's what most event videography calls for. 24 fps is generally for weddings and narrative projects or when a client specifically asks for it. 

I think maybe it is a combination of frame rate and shutter speed when filming in 24FPS, but the thing is, I've never noticed jittering at 30FPS. Here is an example: a great looking video until the 0:28 mark, where it goes from very nice to a jittery mess on my screen. Every single time a video I'm watching becomes unwatchable due to stutter and I check the frame rate, it is always 24FPS; and in my own testing, even with the proper shutter angle, 24FPS footage just didn't look as good to me, so years ago I decided never to film in 24FPS unless a client specified it. 

 

51 minutes ago, TheRenaissanceMan said:

Human vision blurs while watching fast action or moving your gaze in real life. When I watch 60fps, it strikes me as much less natural: too fast and sharp to feel like natural vision. 

Not to mention you'll never get a high frame rate movie distributed unless you're a big deal, and even then...

The 60FPS was just the capture frame rate; the final delivery is something lower, like 30FPS or 24FPS. My point was, I wonder if some of the jitter at 24FPS in some of the videos I have seen comes from shooting at 60FPS and then delivering at 24FPS. I guess for me, I've never noticed a motion blur difference between 30FPS and 24FPS, but I notice the stuttering all the time when 24FPS is used. 


30fps can look more video-like to me, but then, living in a PAL country, I'm used to 25fps anyway. 30fps can certainly look smoother in playback, though 60fps is even smoother, and that didn't do much for the Hobbit movies.  

In the video above, I saw some great 24fps and some not so great. Without knowing how it was shot and with what settings, I can't blame all of it on the frame rate, since some of the shots look fine. I think motion is harder to do right at 24fps, and therefore any deficiencies in that area are less well hidden.  

I shoot 24fps for my own work and weddings. I notice more stutter if I shoot at a higher frame rate and then play it back at normal speed in a 24p timeline, effectively speeding up the footage. However, I can see this effect in a 25p timeline too.


I'm curious @herein2020 if what you are seeing is judder (from playing 24 fps on a 30 or 60 Hz screen) or something else. I've never been able to verify that I can see judder personally. Once 120 Hz monitors are the norm, of course judder won't be an issue.

I watched that drone video you posted, and at the 28s mark there is definitely something wrong. It's jumping all over the place without that much motion, so I would guess at operator error somewhere. Certainly dropping a 60fps clip on a 24 fps timeline will cause noticeable problems without good frame blending.
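Judder of this kind is a cadence problem, and the cadence is easy to compute. Here is a minimal sketch (the function name is mine, purely for illustration): snap each frame's start time to the display's refresh ticks and count how many refreshes each frame is held for.

```python
# Sketch: how long each video frame is held on screen when the display's
# refresh rate is not an integer multiple of the video frame rate.

def repeat_pattern(fps, refresh_hz, n_frames):
    """Refresh counts per video frame, snapping each frame's start
    time down to the display's refresh ticks."""
    starts = [int(i * refresh_hz / fps) for i in range(n_frames + 1)]
    return [b - a for a, b in zip(starts, starts[1:])]

# 30 fps on a 60 Hz display: every frame held for exactly 2 refreshes.
print(repeat_pattern(30, 60, 6))    # [2, 2, 2, 2, 2, 2]

# 24 fps on a 60 Hz display: frames alternate between 2 and 3 refreshes
# (the classic 3:2 cadence); the uneven hold times are what read as judder.
print(repeat_pattern(24, 60, 6))    # [2, 3, 2, 3, 2, 3]

# 24 fps on a 120 Hz display: an even 5 refreshes per frame, no judder.
print(repeat_pattern(24, 120, 6))   # [5, 5, 5, 5, 5, 5]
```

This is the ideal cadence; a real OS compositor can additionally drop or repeat frames under load, which only makes things worse.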


I think with YouTube in general the problem is people are told to set their camera to 24fps because it's the movie standard. That is fine. But then they go on to record their video without thinking about their frame rate: camera pans that are too fast, subjects moving in the frame too fast, parallaxing shots that are too fast and look weird.

I see this commonly with drone footage and parallax shots. Maybe the shutter speed was off, or they filmed in 30fps and then forgot to conform it. I definitely notice it at the 0:28 mark in the video.

Maybe in the future the world will standardize on PAL (or something close), and we can make 25fps the new cinema standard and do away with the weird (but genius) quirks of the NTSC cathode-ray-tube jiggery they did.


10 hours ago, herein2020 said:

I think maybe it is a combination of frame rate and shutter speed when filming in 24FPS, but the thing is, I've never noticed jittering at 30FPS. Here is an example: a great looking video until the 0:28 mark, where it goes from very nice to a jittery mess on my screen.

Yeah, to my eyes, that looks like low frame rate and high shutter speed.


5 hours ago, KnightsFan said:

I'm curious @herein2020 if what you are seeing is judder (from playing 24 fps on a 30 or 60 Hz screen) or something else. I've never been able to verify that I can see judder personally. Once 120 Hz monitors are the norm, of course judder won't be an issue.

I watched that drone video you posted, and at the 28s mark there is definitely something wrong. It's jumping all over the place without that much motion, so I would guess at operator error somewhere. Certainly dropping a 60fps clip on a 24 fps timeline will cause noticeable problems without good frame blending.

The funny thing is, I see it literally all the time in online videos; not as much in Hollywood productions, though even some of those have some really bad scenes where it is jittery. I think the biggest problem is that the typical camera doesn't offer recording rates that are multiples of 24; they are mostly multiples of 30 (30FPS, 60FPS) rather than multiples of 24 (48FPS, 72FPS; 120FPS is an exact multiple of both), so any time you record at a higher frame rate that is a multiple of 30 (just talking NTSC here), getting it to conform properly to a 24FPS timeline requires some additional configuration and thought.
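The conform problem is the same cadence arithmetic in the other direction. A rough sketch (my own illustration, not any NLE's actual algorithm) of which source frames survive when high-frame-rate footage is dropped on a slower timeline at 1x speed with no frame blending:

```python
# Sketch: which source frame ends up on each timeline frame when
# high-frame-rate footage is conformed to a slower timeline at normal
# speed, each timeline frame simply taking the source frame playing
# at that instant (no optical flow, no frame blending).

def conform(source_fps, timeline_fps, n_timeline_frames):
    """Source frame index used for each timeline frame."""
    step = source_fps / timeline_fps   # source frames per timeline frame
    return [int(i * step) for i in range(n_timeline_frames)]

# 60 fps onto a 30 fps timeline: a clean 2:1 cadence, every other frame.
print(conform(60, 30, 6))   # [0, 2, 4, 6, 8, 10]

# 60 fps onto a 24 fps timeline: a 2.5:1 ratio, so the skip alternates
# between 2 and 3 source frames; that uneven cadence reads as stutter.
print(conform(60, 24, 6))   # [0, 2, 5, 7, 10, 12]
```

This is why 60 to 30 "just works" while 60 to 24 needs optical flow or blending to hide the uneven steps.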

So years ago, I concluded: why do any of that? Why worry about panning speeds, reconforming clips in post, optical flow, frame blending, etc., just to achieve extra motion blur which my clients will never know or appreciate? Is all that work and careful planning really worth it in the end for anything less than high-end commercial and Hollywood work? It gets even better... want motion blur? You can add that back in post using DaVinci Resolve. I mean, come on... 24FPS is at the very edge of what the human eye perceives as smooth motion. Why not step back from that edge and give yourself a little more breathing room? Is motion blur really that important, or even noticeable, when 90% of your viewers will be watching your creation on their cell phones?

I watched an excellent video from Gerald Undone years ago, back when I started questioning my frame rate choices again because I kept seeing people recommend 24FPS; after watching it, I never questioned my 30FPS decision again.

 


Shoot what you're happy with. If I listened to every opinion on YouTube or forums, I'd be a confused mess. I've had suggestions even here to shoot 1080p, and 8K and 4K and 6K, and then there are the thoughts on 8-bit vs 10-bit vs 12-bit, and 60fps and 24fps and 25fps and 30fps. You can find a YouTube video suggesting each and every one if you look.  

It's like upscaling 1080p to 4K: what's the point, when you can shoot 4K? If you want the motion blur of 24p, film it. If not, film something else. I'll use motion blur to correct for where I couldn't use the right shutter speed. I won't use it to pretend I'm shooting something I wasn't, unless a client asks me to.


5 minutes ago, SteveV4D said:

Shoot what you're happy with. If I listened to every opinion on YouTube or forums, I'd be a confused mess. I've had suggestions even here to shoot 1080p, and 8K and 4K and 6K, and then there are the thoughts on 8-bit vs 10-bit vs 12-bit, and 60fps and 24fps and 25fps and 30fps. You can find a YouTube video suggesting each and every one if you look.  

It's like upscaling 1080p to 4K: what's the point, when you can shoot 4K? If you want the motion blur of 24p, film it. If not, film something else. I'll use motion blur to correct for where I couldn't use the right shutter speed. I won't use it to pretend I'm shooting something I wasn't, unless a client asks me to.

I definitely do that already; 29.97FPS and 60FPS are the only frame rates I use. It's just incredible to me that so many people release YouTube videos with that kind of stutter and either don't notice it or don't care; yet the rest of the footage is great (color grade, camera movements, content, etc.), and they had to have seen the stuttering before uploading. I'm with @Video Hummus: I would think by now it would all be standardized, and something like 30FPS vs 24FPS wouldn't even need to be a consideration.


1 hour ago, herein2020 said:

The funny thing is, I see it literally all the time in online videos; not as much in Hollywood productions, though even some of those have some really bad scenes where it is jittery. I think the biggest problem is that the typical camera doesn't offer recording rates that are multiples of 24; they are mostly multiples of 30 (30FPS, 60FPS) rather than multiples of 24 (48FPS, 72FPS), so any time you record at a higher frame rate that is a multiple of 30 (just talking NTSC here), getting it to conform properly to a 24FPS timeline requires some additional configuration and thought.

If you see it in YouTubers' shots but not Hollywood shots with similar amounts of motion, then it's user error and not a problem of 24 fps per se. Certainly any time you put footage on a timeline with a mismatched frame rate you'll have problems unless you know what you're doing.

I've shot a lot of projects at 24.00 fps, but they were shown in theaters on native projectors (nothing big, just screenings for friends, small festivals). On a proper projector it's pretty easy to see the difference between 24 and 30 fps, with the former feeling more like a classic movie to most people. So at this point, under proper conditions and with proper technique, there is certainly an argument to be made for 24 if you are trying to conjure a traditional movie impression.

Once we move to 120 and 240 Hz monitors, though, the technical limitation will be gone, and then any problems you see will purely be user error with mismatched settings, or bad technique. Then it will just be a question of the impression you want to give: 24 vs 30 vs 60, or even 120. With 240 Hz you could even add 48 to that list of native formats.

I do think we should move away from the fractional rates though... 29.97 instead of 30 is just annoying to think about.

 

53 minutes ago, herein2020 said:

It's just incredible to me that so many people release YouTube videos with that kind of stutter and either don't notice it or don't care; yet the rest of the footage is great (color grade, camera movements, content, etc.), and they had to have seen the stuttering before uploading.

Probably because it's a lot easier to stare at a still image and painstakingly grade it than to let something play and evaluate the overall effect.


For me it takes the edge off of reality. It's a visual pretense that tells you what you're watching is a bit of a conceit. There's some real psychological power in that. Assumptions and biases are made by the viewer. These are good things, depending on what narrative you'd like to present. 
 

As for YouTubers: they'll often mess up the shutter speed regardless of the frame rate, so I wouldn't base any decisions about your video settings on random content creators. 


The quality of movement in cinema (film) comes from these characteristics:

  • A global shutter with a smooth in-out
  • 1:1 projection speed (no pulldown)
  • An acceptable number of frames (I think 24 is the minimum acceptable; no one creates wine-tasting arguments to defend 20fps)

24p and a 1/48s exposure are standards that let projectionists, camera manufacturers and filmmakers talk to each other. If the standard had been 30p it would be a little better, but with higher costs for film stock, development, prints, copies...

In digital, a global shutter is possible, and it handles movement much better than a rolling shutter. The smooth in-out, created by the mirror in a plane away from the film, is another of the mechanical and chemical wonders we lost when embracing digital conveniences (there are exotic solutions, like the mirror in the Alexa Studio, or the old Tessive Time Filter...).

Now, 1:1 viewing is a mess in digital. There was the old 24p standard, but for TVs they created 60i and 50i, and for electronic devices, monitors, phones... 60p.

Even TVs that can display true 24p depend on the signal being sent that way. I heard that with the NVIDIA Shield it is possible to send a 24p signal from applications like Netflix.

TVs that offer 120Hz usually like to meddle with and interpolate movement, and sell it as "incredible"; often, even with all those features turned off, the image still shows signal intervention.

That said, 30p manages to play 1:1 in almost all cases, except on TVs in Europe (and maybe other places in the world, I don't know). Little cost involved (25% more storage). It improves movement without making it clinical, like 60p.

As for the 1/48s exposure: without the smooth in-out of the mirror hinting at the trend of the next frame, I think it does very little. It leaves a very large gap that digital, with its implacable hard cuts, only worsens. Here I still prefer a longer exposure (tests in progress): less than 360° but greater than 180°.

More frames (30p) and a longer exposure (maybe keeping 1/48, which at 30p is more than 180°) also soften the effects of rolling shutter. But for me that is one of the unacceptable characteristics of digital, and I think that is part of why, as our friend said, Hollywood has less of that jitter effect.
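The shutter figures in this discussion all follow one relation: exposure = (angle / 360) / fps. A small sketch of that conversion (the function names are mine, for illustration only):

```python
# Shutter angle <-> exposure time at a given frame rate.
# exposure = (angle / 360) * (1 / fps), i.e. an exposure of 1/D s
# where D = 360 * fps / angle.

def shutter_denominator(angle_deg, fps):
    """The D in a 1/D s exposure, for a shutter angle at a frame rate."""
    return 360 * fps / angle_deg

def shutter_angle(exposure_denominator, fps):
    """Shutter angle in degrees for an exposure of 1/denominator seconds."""
    return 360.0 * fps / exposure_denominator

print(shutter_denominator(180, 24))  # 48.0  (the classic 180 degrees at 24p: 1/48 s)
print(shutter_denominator(270, 24))  # 32.0  (a 270-degree shutter at 24p: 1/32 s)
print(shutter_angle(48, 30))         # 225.0 (keeping 1/48 s at 30p: a 225-degree shutter)
```

The last line shows why carrying the 1/48 habit to 30p quietly gives you more than a 180° shutter.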


This thread is mashing together two very distinct and completely separate topics: one is filming in 24p, and the other is getting smooth playback of 24p through the whole image pipeline. One is an aesthetic choice; the other is a technical configuration issue.

Most computers aren't set up for a 24fps refresh rate, so when you play back a video the system will be doing its own horrible things to it. That's on top of all the other horrible things that amateurs already do to frame rates.

It's very easy to see what is going on with your playback: set your smartphone to its super-slow-motion mode, play a video on your computer, and film your screen with your phone. Import the footage from your phone, then count how many capture frames each frame of video is displayed for.
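The counting step of that test can be automated. A sketch (assuming you have some way to identify which video frame is visible in each capture frame, e.g. a burnt-in frame counter; the function name is mine):

```python
# Sketch: measure display cadence from a high-speed capture of a screen.
from itertools import groupby

def display_durations(observed_frame_ids):
    """Run lengths: how many capture frames each video frame stayed on screen."""
    return [len(list(run)) for _, run in groupby(observed_frame_ids)]

# An even cadence: every video frame held for the same number of capture frames.
print(display_durations([0, 0, 1, 1, 2, 2, 3, 3]))        # [2, 2, 2, 2]

# An uneven, 3:2-style cadence: alternating hold times show up as judder.
print(display_durations([0, 0, 0, 1, 1, 2, 2, 2, 3, 3]))  # [3, 2, 3, 2]
```

If the run lengths are not all equal, your playback chain is not showing the frame rate you think it is.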

There's a reason that Resolve is designed to work with BM's hardware interface cards: it means Resolve can drive your reference monitor at the right settings without the OS getting in the way and doing horrible things to the colour, bit depth and frame rate.

Until someone has confirmed with a high-speed camera that their display is actually showing 24p (or any other frame rate) with an equal amount of time per frame, they can't speak with any credibility about the aesthetics of that frame rate.


On 1/26/2021 at 12:36 PM, SteveV4D said:

Shoot what you're happy with. If I listened to every opinion on YouTube or forums, I'd be a confused mess. I've had suggestions even here to shoot 1080p, and 8K and 4K and 6K, and then there are the thoughts on 8-bit vs 10-bit vs 12-bit, and 60fps and 24fps and 25fps and 30fps. You can find a YouTube video suggesting each and every one if you look.  

Ask four videographers, get six opinions 🙂

