Evolution for film?
The latest film from Ang Lee will go down in history with a curious distinction: a production boldly shot in a format virtually nobody in the world could watch it in.
Lee’s preferred way for audiences to see Billy Lynn's Long Halftime Walk, about a 19-year-old US soldier haunted by memories of Iraq, is at 120 frames per second, in 3D and in 4K resolution. The teensy issue with this is that only five cinemas in the world are equipped to screen it that way: two in America, two in China and one in Taiwan.
A number of cinemas, however, met the first stipulation, bringing the film to audiences at 120 frames per second (or fps). The result was a procession of slack-jawed critics coming to terms with a highly unusual look.
Sadly, Australians were not invited to Lee’s eyeball-bewildering party. Billy Lynn’s local distributor, Village Roadshow, decided to release it in garden variety, plain old 24fps. As we contemplate why that does or doesn’t matter, and what we see here versus the spectacle exhibited overseas, now is a good time to take stock of how frame rates came into existence, what they mean for viewers and why big-time directors such as Lee seem intent on messing with the formula.
James Cameron is among the industry titans to extol the supposed benefits of higher frame rates. The best-known example of a director upping the frame rate ante is Peter Jackson, whose Hobbit movies were shot and widely screened in 48fps. Audiences responded less than enthusiastically, many grumbling that the high-end production had a soapy, distractingly glossy look.
Given Jackson’s failure to part his high-definition Red Sea and lead cinema patrons to a promised new land, why would Ang Lee not only follow in his footsteps but push the rate further – all the way up to 120fps?
The Chief Innovation Officer of visual technology company RealD, Pete Lude, has a theory: “When you only go to 48, or 50, or maybe even 60 FPS, you're not getting the full physiological/neurological benefit of the [High Frame Rate],” he recently said. “The brain still knows something's going on. When you start to get to 120 FPS, it seems that is a much more engaging experience for your human visual system.”
Our “human visual system” (that sounds like a robot trying to explain eyesight) is all about optical illusion. The crux of this illusion is known as the Phi phenomenon, which refers to how a series of images, if displayed in rapid succession, can provide the impression of movement.
During the silent film era, hand-cranked cameras meant frame rates varied during shooting (cinematographers could easily “undercrank” or “overcrank”). The same went for exhibition, with some cinemas screening films faster in order to squeeze in extra sessions. Frame rates during this period ranged from about 14 to 26 frames per second.
The introduction of sound (or “the talkies”) standardised frame rates. Given audio was recorded as a track running down the side of the film strip, the film had to be fed through machines at an even pace. In the late 1920s, an international standard of 24 frames a second was established. This number is actually substantially lower than the figure Thomas Edison once recommended as the minimum: 46fps (“anything less will strain the eye,” the inventor famously said).
Edison’s advice could still be honoured, however, if films shot at 24fps were run through a double or triple shutter projector. With a double shutter, for example, each of the 24 frames would be flashed on screen twice. So, while shot at 24fps, films would be projected at twice that rate.
Twenty-four was not a number ordained by the heavens; plenty of others could have been used. One of the reasons 24 was settled upon is that it divides easily: an editor would know that half a second equals 12 frames – a third, eight; a quarter, six – and could chop the film accordingly. Also, film is pricey stuff, so the business incentive was to err lower rather than higher.
The cost of film and the ease with which it can be manually edited no longer apply in the age of digital cameras and projectors. In that sense it is a brave new world, with the floodgates open for higher frame rates. When footage is filmed and exhibited at a higher rate (such as videos on YouTube that run at 60fps), more visual information is packed into each second.
This creates a crisper and sharper image. You can particularly notice the difference when fast motion is being displayed: an athlete running, for example, or a person standing still and waving their hand. The look it creates is largely associated with high-def television (American football is broadcast in the US in a high frame rate) and video games. It is not considered “cinematic”.
That last point is perhaps why critics have been reluctant to champion higher frame rates. There is a general feeling, however, that the industry is moving there with or without them. TV, games and online videos will lead the charge. As audiences become increasingly familiar with that crisp, glossy look, there is every chance cinema will follow suit.
What the medium might need to get there is a game-changing movie spruiking the new format, much as James Cameron’s Avatar ushered in a new era of modern 3D films. It was never likely to be Billy Lynn’s Long Halftime Walk, a psychological drama largely devoid of sensational imagery. There's little doubt, however, that higher frame rates will be back. Literally in greater numbers.
- Luke Buckmaster