Did you know you can take Apple's new 'Live Photos' and convert them into usable smooth video using Final Cut Pro X? Kenny Yin shows us how...
Following on from Hubert's interlacing trick last week, Kenny Yin got in touch to tell us about a discovery he made converting Apple's iPhone 6s 'Live Photos' into usable video. We will let Kenny show us how after a side-by-side demonstration of his findings.
Make sure you watch in the highest resolution so you can see the differences.
Live Photos were never designed with video editing in mind. They are meant to delight users by making their photographic memories 'come to life', and the feature is turned on by default on the iPhone 6s and iPhone 6s Plus.
According to Apple's PR, “Live Photos are not videos.” From the end user's perspective this is true: Live Photos behave just like photos (no video trimming, no visible video component) and can be taken as casually as any still.
This is where Live Photos shine. As an editor, you may find that while a client has some great still shots, they would, at best, make for boring Ken Burns videos (unless you do some really good parallax compositing). Having a Live Photo's video component to work with essentially makes the difference between having video and having only stills.
From a technical perspective, every Live Photo is composed of a 12MP JPEG still and a 3-second, 15fps, 1440x1080 H.264-encoded MOV video.
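When exported, that still/video pair shares a base name on disk. A minimal sketch of gathering the pairs for a batch workflow (the IMG_0001.JPG / IMG_0001.MOV naming is an assumed export convention, not something Apple guarantees):

```python
# Sketch: match exported Live Photo stills to their video components by
# shared base name. The IMG_xxxx.JPG / IMG_xxxx.MOV filenames are an
# assumed export convention; adjust the globs to your actual exports.
from pathlib import Path

def pair_live_photos(folder):
    """Return {basename: (jpeg_path, mov_path)} for complete pairs only."""
    folder = Path(folder)
    stills = {p.stem: p for p in folder.glob("*.JPG")}
    videos = {p.stem: p for p in folder.glob("*.MOV")}
    return {stem: (stills[stem], videos[stem])
            for stem in stills.keys() & videos.keys()}
```

Stills without a matching MOV (ordinary photos) are simply skipped.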
The resolution of the video is not a problem, especially if the Live Photo is taken horizontally (at 1440x1080, the full 1080 lines of a 1080p frame are there). Even if taken vertically, there are always creative solutions, such as cropping a (tiny) bit, feathering out the edges, and adding a scaled-up, blurred version of the footage as a background.
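For the vertical case, the two-layer treatment comes down to two scale factors: "fit" for the sharp foreground copy and "fill" for the blurred background copy. A quick calculation, assuming a vertical 1080x1440 clip placed in a 1920x1080 timeline (the two-layer technique itself is the creative treatment described above; the arithmetic here is just standard letterboxing math):

```python
# Scale factors for placing a source clip in a timeline of a different
# aspect ratio: "fit" keeps the whole clip visible, "fill" makes the
# clip cover the entire frame (for the blurred background layer).
def fit_and_fill(src_w, src_h, tl_w, tl_h):
    fit = min(tl_w / src_w, tl_h / src_h)    # whole clip visible
    fill = max(tl_w / src_w, tl_h / src_h)   # clip covers the frame
    return fit, fill

# Vertical 1080x1440 Live Photo video in a 1920x1080 timeline:
fit, fill = fit_and_fill(1080, 1440, 1920, 1080)
# fit = 0.75 (75% scale foreground), fill ≈ 1.78 (178% scale background)
```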
While the past two paragraphs make Live Photos sound desirable for video creation, there's a huge catch. For video to appear natural, 24fps is the bare minimum. Live Photos' 15fps clips already look quite choppy on their own; juxtaposed with 30fps clips on the same timeline, the choppiness becomes far more pronounced.
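The choppiness is easy to quantify: without any retiming, putting a 15fps clip on a 30fps timeline means each source frame is simply held for two timeline frames. A toy illustration (nearest-frame conform is an assumption about how the NLE handles the mismatch; the point is the duplication):

```python
# Toy illustration: which source frame a timeline shows at each timeline
# frame when the source frame rate is lower (nearest-frame conform).
def conformed_frames(src_fps, timeline_fps, timeline_frames):
    return [int(i * src_fps / timeline_fps) for i in range(timeline_frames)]

print(conformed_frames(15, 30, 8))  # → [0, 0, 1, 1, 2, 2, 3, 3]
```

Every source frame appears twice in a row, which is exactly the stutter you see next to genuine 30fps material.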
This is where Final Cut Pro X's Optical Flow retiming comes in. Optical Flow analyses pixel changes from frame to frame, then applies a complex mathematical algorithm to calculate (or rather, guess) the missing frames in between. While Optical Flow sounds like magic, it is really just very advanced guesswork, and it usually takes a lot of time and CPU/GPU power to analyze a clip.
In this March's Final Cut Pro X 10.2 release, Apple vastly improved Optical Flow performance (on a 2014 MacBook Air, I saw a 6-10x speedup). The output of Optical Flow is also much more consistent in 10.2 and later releases. This makes Live Photos plus Optical Flow a viable combination.
Smoothing out Live Photos is easy. Export them through Image Capture or OS X’s Photos app, then drag the 15fps MOV file onto a 30fps timeline. Don't worry if Optical Flow is greyed out by default.
There’s a little trick: first change the speed of the clip to 99% or 101%, then set Image Quality to “Optical Flow.” ('The Hubert Retiming Trick' - Editor)
Voila! Perfectly smooth and usable footage (as long as there’s not too much shaking or rapid movement).
If desired, the clip speed can be changed back to 100% while preserving Optical Flow; that 1% really doesn’t make any measurable difference. I have also attached three short video clips for FCP.co readers to try out (in case they don't have an iPhone 6s handy and wish to)...
Download Demo iPhone Live Photo Clips.
A great tutorial from Kenny. As with all contributors, we asked if Kenny would like to supply a bio. He told us that he hasn't got one yet as he is a high school student, but would appreciate a follow on Twitter. You can find him at @iosight.
We don't think this is going to be the last article on Optical Flow discoveries you didn't expect either!