When you go to the theater to see the newest feel-good romantic comedy of the summer, you’re seeing 24 frames per second. This is the standard for film, adopted decades ago when filmmakers were looking for the cheapest practical way to make their films. 24 frames per second was identified as the lowest frame rate that could produce believable motion while staying in sync with the audio track. Film was (and largely still is) very expensive to work with — it must be sent away to be developed; processing, color correction, and effects are tedious to perform; editing is more difficult; prints are expensive to produce. In light of all this, it’s no wonder that filmmakers wanted to get away with the fewest frames possible.
Fast forward to the 21st century. It’s the age of digital video. High definition is the norm, with its standards topping out at 1920×1080 pixels at 30 frames per second (referred to as 1080p, 1080p/30p, 1080/30p, etc.). In fact, we’re seeing 60 frames per second at that resolution, although there’s really no broad, official standard for it as of this writing. In the past decade, we’ve even seen the emergence of Ultra High Definition, which offers up to 16 times the resolution of HD! These are truly exciting times we’re living in. “24p” is the name given to video shot at 24 progressive frames per second (more details on its Wikipedia page), which corresponds to the 24 frames per second film standard.
More frames equal smoother motion. This is most noticeable when you watch something shot at 15 or 20 frames per second — you get a jittery, strobe-like effect, especially when there’s camera movement or fast motion on screen. And when you compare 24fps to 30 or 60fps, you’ll likely be able to see the jittery quality that can plague 24p. Camera panning, unless done very slowly, will exhibit jitter that many find unpleasant. Quick movement, such as someone running across the screen, will also look like it’s slightly stuttering.
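The jitter comes down to simple arithmetic: the fewer frames per second, the farther the image jumps between each one. A quick sketch makes the point, using a hypothetical pan speed I picked for illustration (half the frame width per second on a 1920-pixel-wide image):

```python
# Back-of-the-envelope look at why low frame rates make pans jitter.
# The pan speed below is a hypothetical example, not a measured value.
PAN_SPEED_PX_PER_SEC = 960  # image moves half a 1920px frame per second

for fps in (15, 20, 24, 30, 60):
    step = PAN_SPEED_PX_PER_SEC / fps  # pixels the image jumps between frames
    print(f"{fps:2d} fps -> image jumps {step:5.1f} px per frame")
```

At 24fps that pan jumps 40 pixels every frame; at 60fps, only 16. The bigger the per-frame jump, the more your eye reads the motion as strobing rather than gliding.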
Despite this, 24p is still a highly popular video format to shoot in. Why? The main answer seems to be that 24p gives that “film look” — the look audiences are used to seeing in the movie theater, as opposed to what they see on the nightly network news. And while I can understand this, the fact remains that “the film look” is the result of much more than just 24fps. We’re talking about film camera processing versus video camera processing; lens selection; inherent color differences between the two media; and much more. Believe me — I’ve shot plenty of 24p footage, and it doesn’t look like film. A lot of the time, it just looks like jittery video.
In my experience, I’ve found that a lot of videographers like to shoot in 24p just for the sake of saying that they do so. For many, 24p communicates “art” — again, the difference between a film and the nightly news. It raises the perceived value of the production. Unfortunately, when many of those people watch their 24p footage, they are disappointed.
Just in the past month, director Peter Jackson announced that he’ll be shooting The Hobbit in 48p — twice the frame rate of the current film standard. Why? According to the Variety article, he wants the movie to appear more lifelike. He also prefers 48p for “smoother action and crisper 3D visuals.”
This has sent waves throughout the film industry. Many modern theaters can handle 48p — but the majority can’t yet. Both a 48p version and a 24p version will be cut, complicating the production process significantly and leaving most theater-goers out of the smoother, higher-quality experience. As technology changes, theaters will be forced to upgrade their equipment or be left behind. Ultimately, however, the determining factor will be box office revenues — whether audiences will be able to tell the difference, and whether or not they care, remains to be seen.
So what does this all have to do with me, and whether I shoot in 24p or not? For me, the negatives far outweigh the positives. I see 24fps as a lingering dinosaur in the industry, kept for two main reasons: the majority of movie theaters can’t support anything else, and it’s perceived to be the “film style.” While some artists can work in 24p and produce amazing results, I can’t see why higher frame rates (and thus more visual information and smoother motion) aren’t utilized in a day and age when it’s all digital.
And it’s also due to personal preference. When I watch 24p, and then watch something similar in 30p, there’s a big difference for me. It’s amazing what a mere 6 frames per second can do! I simply don’t like the jitter. Any sort of animation looks really bad to me in 24p. Plus, artificial slow motion from 24p footage is simply atrocious — whereas you can get away with it most of the time with 30p.
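The slow-motion point is worth unpacking with numbers. When you slow footage down, the editor has to fill the delivery timeline with frames; anything the source can’t supply gets duplicated or interpolated. Here’s a simplified model I put together (the 30fps timeline and 2x slowdown are my example assumptions, not a claim about any particular editing software):

```python
# Simplified model of artificial slow motion: what fraction of output
# frames must be faked (duplicated or interpolated) rather than taken
# straight from the source? Timeline/slow-factor values are assumptions.

def synthetic_fraction(source_fps, timeline_fps, slow_factor):
    """Fraction of output frames not directly available from the source."""
    real_per_sec = source_fps / slow_factor  # real frames per output second
    return max(0.0, 1 - real_per_sec / timeline_fps)

for src in (24, 30, 60):
    frac = synthetic_fraction(src, timeline_fps=30, slow_factor=2)
    print(f"{src} fps source: {frac:.0%} of slow-mo frames are synthetic")
```

In this model, 2x slow motion from 24p leaves 60% of the output frames synthetic versus 50% from 30p — and 0% from 60p, which is why overcranking at 60fps is the clean way to get slow motion.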
Everything I shoot is 1080/30p or 720/60p. And it’s beautiful. If I find the need to imitate the “film look,” there are a variety of techniques I can employ before, during, and after production to achieve this. I believe that 24p will eventually die out as technology races forward, and I don’t want to be stuck with a bunch of jittery 24p footage.
There is one instance where I would consider working in 24p. If I were to become involved in a project destined to be shown at a theater, I would likely shoot 24p. Like I said, the vast majority of theaters can only handle 24fps projection. Sure, you can convert 30p or 60p footage to 24p, but it is far from ideal, and often gets messy. This is a very isolated circumstance, and while I would certainly love to be part of the production of short/feature films, it’s probably not something I’ll be doing anytime soon.
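To see why 30p-to-24p conversion gets messy, look at the ratio: 30 and 24 sit in a 5:4 relationship, so a naive conversion has to discard one of every five source frames. This sketch is an illustrative nearest-frame model, not the algorithm any specific editing tool uses:

```python
# Illustrative model of naive 30p -> 24p conversion: for each output
# frame, grab the most recent source frame. One in five source frames
# gets dropped, producing an uneven 1-2-3-4-skip cadence.

SRC_FPS, DST_FPS = 30, 24

# int() floors here (values are non-negative), picking the source frame
# whose timestamp has most recently passed.
picked = [int(i * SRC_FPS / DST_FPS) for i in range(DST_FPS)]
dropped = sorted(set(range(SRC_FPS)) - set(picked))

print("source frames kept in one second:", picked)
print("source frames dropped:", dropped)  # every 5th frame vanishes
```

The dropped frames land at regular intervals (frames 4, 9, 14, 19, 24, 29 in each second), which the eye perceives as a periodic hiccup in motion. Going the other way, 60p to 24p, is no cleaner: 60/24 is 5:2, so the conversion alternates between skipping two and three source frames. Fancier converters blend or motion-interpolate instead of dropping, trading stutter for ghosting or artifacts.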
Make that two instances where I would work in 24p — if someone paid me for it.