I for one would love to see us drop the fractional frame rates (29.97, etc). They're an archaic technical relic that causes trouble when working with timecode. At Sphere we debated standardizing on 30/60/120fps but ultimately decided it was a battle we didn't want to fight in an already complicated building.
mrandish|1 year ago
I think very few people (myself included) have ever seen a true side-by-side test where everything other than 24fps vs 30fps is perfectly identical. That's because correctly engineering such a head-to-head test is surprisingly difficult. In addition to having identical (or nearly identical) content shot in cinematic style, there are several other variables, each of which has to be technically correct. These include having the same signal chain from camera shutter speed, capture, compression, edit and grading through to distribution format, playback device and display.
One thing that's especially tricky is whether the 24fps content ever goes through a 3:2 pulldown conversion (or similar). A significant amount of high-quality, big-screen, film-sourced content originally shot at 24fps goes through this sort of pulldown when viewed at home, whatever the source (Blu-ray, Netflix, Amazon or Apple). This pulldown process definitely imparts a look many associate with being "cinematic". Yet what we see in an actual theater is native 24fps, so that's what we need to match for an accurate comparison.
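For anyone unfamiliar with how 3:2 (often written 2:3) pulldown works mechanically, here's a toy sketch: each group of 4 film frames is stretched into 10 interlaced video fields (alternating 2 fields, then 3), yielding 5 video frames per 4 film frames, i.e. 24fps → 30fps. The frame labels are just illustrative placeholders.

```python
from itertools import cycle

def pulldown_32(film_frames):
    """Map film frames (24 fps) onto video fields using a 2:3 cadence,
    then pair fields into interlaced video frames (30 fps)."""
    cadence = cycle([2, 3])  # fields emitted per film frame, alternating
    fields = []
    for frame, n in zip(film_frames, cadence):
        fields.extend([frame] * n)
    # pair consecutive fields into interlaced video frames
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

video = pulldown_32(["A", "B", "C", "D"])
# → [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
```

Note that two of the five resulting video frames mix fields from two different film frames; that uneven cadence is what produces the characteristic pulldown judder (and combing on interlaced displays).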
Having recently upgraded my dedicated high-end home theater, I was surprised that every device in the chain - playback source (streaming box or Blu-ray player), AVR and 4K HDR projector - while natively 24fps capable, defaulted to having native 24fps output turned off in its settings (thus silently applying a real-time 3:2 pulldown to the native 24fps source). I only discovered this during detailed calibration using test signals. This means many people's impressions of 24fps may actually have been formed watching 24fps content automatically converted to 30fps with 3:2 pulldown by their source, AVR or display.
I suspect associating my subjective sense of "cinematic" with the label "24fps" may be not only erroneous but unfair to 30fps. Technically, 30fps has advantages in reducing motion judder on fast-moving objects and camera pans. A good example is the Hollywood-produced, pre-digital, 24fps Oliver Stone football movie "Any Given Sunday", which was shot entirely on film. They did the best they could with 24fps, but some of the fast, ball-tracking camera pans are extremely distracting - something 30fps would definitely have helped with, had it been an option back then. Nowadays, for the first time, the industry has some freedom to choose frame rates, and I wonder whether, done properly, 30fps might be the better option: we film-look purists would lose nothing of what we love, but gain by reducing some of the unavoidable artifacts of 24fps's limitations.
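The pan-judder point above comes down to simple arithmetic: for a pan at a fixed speed, the image jumps a larger distance between consecutive frames at 24fps than at 30fps, and bigger per-frame jumps read as judder. A minimal sketch, with a purely illustrative pan speed:

```python
def pan_step_px(pan_speed_px_per_s, fps):
    """Per-frame displacement of a panning shot, in pixels.
    Larger steps between successive frames read as judder."""
    return pan_speed_px_per_s / fps

# hypothetical fast pan: half a 1920-px frame width per second
speed = 960
step_24 = pan_step_px(speed, 24)  # 40.0 px jump between frames
step_30 = pan_step_px(speed, 30)  # 32.0 px jump between frames
```

Same shot, same pan speed: 30fps cuts each inter-frame jump by 20%, which is exactly the kind of reduction that would have smoothed those ball-tracking pans.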
burntwater|1 year ago