On its own, FCPX cannot edit and output a 3D master. However, coupled with Resolve in the middle to do the 3D heavy lifting, it performs rather well. Here's how Matt Brading did exactly that on his latest short film.
Will Final Cut Pro X ever support native 3D editing? We don't know and would suggest that the importance of such a feature has been demoted a few cells in the development team's spreadsheet of future desirable features.
3D films do have their niche and when Matt Brading decided to make the short film 'The Hub' he used FCPX as a 3D edit tool with Resolve in the middle of the workflow.
We will let Matt take up the story; it is well worth a read (and a view of the video) even if you are not doing 3D production, as there is a lot of deep Final Cut Pro X knowledge in there. Matt gets our third FCPX expert badge of the week!
The Hub is a short, stereoscopic 3D film that shows a world where smartphones and the internet have taken over people’s everyday lives. The film was a test to see whether low-budget stereoscopic filmmaking was a reality and was shot using a DIY beam-splitter rig with two Canon 550Ds.
My main role in the project was to design and test an end-to-end post production workflow for stereoscopic film. This workflow was then used to handle the editing of the film. The workflow centred around using Resolve to carry out the stereoscopic grading and Final Cut to handle media organisation and editing.
Post Production Workflow
The first stage of the process was to synchronise the left eye, right eye and audio files for each take together. This was done by adding scene, take and angle metadata to every file and then using FCP X to synchronise clips based upon their audio waveforms. This produced a synchronised clip that was assigned time-of-day timecode. Using the Roles feature of FCP X, the synchronised clips were exported with Left, Right and Audio as separate files with matching timecode, duration and naming.
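To make the grouping step concrete, here is a minimal sketch of how files tagged with scene, take and angle can be collected into trios ready for synchronisation. The naming convention (`S01_T03_L.mov` and so on) is an assumption for illustration, not the one used on the production:

```python
import re
from collections import defaultdict

# Hypothetical naming convention (not from the article): scene, take and
# angle are encoded in the filename, e.g. "S01_T03_L.mov" for scene 1,
# take 3, left eye. "A" marks the separately recorded audio.
PATTERN = re.compile(r"S(?P<scene>\d+)_T(?P<take>\d+)_(?P<angle>[LRA])\.\w+")

def group_takes(filenames):
    """Group left, right and audio files by (scene, take) so each trio
    can be synchronised together as one clip."""
    takes = defaultdict(dict)
    for name in filenames:
        m = PATTERN.match(name)
        if m:
            key = (int(m["scene"]), int(m["take"]))
            takes[key][m["angle"]] = name
    # Keep only complete trios: left eye, right eye and audio all present.
    return {k: v for k, v in takes.items() if {"L", "R", "A"} <= v.keys()}

files = ["S01_T01_L.mov", "S01_T01_R.mov", "S01_T01_A.wav",
         "S01_T02_L.mov", "S01_T02_R.mov"]  # take 2 is missing its audio
print(group_takes(files))  # only (1, 1) survives as a complete trio
```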
These exports became the online media files and were imported into DaVinci Resolve. Inside Resolve, the stereoscopic sequences were created using a stereo EDL. Each clip was then groomed to correct any geometric and colour disparities between the left and right cameras in order to produce a smooth stereoscopic effect. Once all clips were groomed, they were exported from Resolve as side-by-side offline files.
These offline files were imported into FCP X in order for the offline edit to take place. The offline process was essentially the same as for a 2D project; the only difference was that the external monitor was set to translate the side-by-side files and produce a watchable stereoscopic effect. This allowed the editor to monitor in 3D and make sure the edit suited the 3D format. Once the offline edit was completed, an XML was exported from FCP X.
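For readers unfamiliar with the side-by-side format the monitor was decoding, here is a toy sketch of how a "squeezed" side-by-side frame is packed: each eye is horizontally subsampled to half width and the two halves share one normal-width frame. This illustrates the layout only, not FCP X or Resolve internals:

```python
def squeeze_half_width(frame):
    """Crude 2:1 horizontal subsample: keep every second pixel per row."""
    return [row[::2] for row in frame]

def pack_side_by_side(left, right):
    """Left eye fills the left half of the frame, right eye the right half.
    A 3D-capable monitor reverses this to show each eye its own image."""
    return [l_row + r_row
            for l_row, r_row in zip(squeeze_half_width(left),
                                    squeeze_half_width(right))]

# Tiny 2x4 "frames" with one value per pixel, purely for illustration.
left  = [[1, 1, 1, 1], [1, 1, 1, 1]]
right = [[2, 2, 2, 2], [2, 2, 2, 2]]
print(pack_side_by_side(left, right))  # each row: [1, 1, 2, 2]
```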
This XML was then re-imported into the same Resolve project as was used earlier, which meant that the files could be linked back to the online media with all the previous grade nodes still attached. At this point, any clips that needed VFX could simply be rendered out of Resolve and forwarded to the VFX team.
Inside Resolve, the film was graded first in 2D using only the left eye files. These grades were then automatically rippled across to the right eye and were manually checked and tweaked to make sure there were no remaining disparities.
Once the film had been graded and the VFX were finished, a final result was rendered out of Resolve as full-frame left and right eye files. These were then imported back into FCP X in order to add the audio mix and any titles or credits. A custom 3D export filter was built in Motion that would allow FCP X to export the film in a number of different 3D formats (discrete, side-by-side, interlaced).
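The delivery formats listed above differ only in how the two eyes are packed: discrete keeps separate left and right files, side-by-side squeezes them into one frame, and interlaced alternates rows between eyes (the layout used by passive 3D displays). A toy sketch of the row-interleaved case, assuming even rows come from the left eye, follows; it is an illustration of the format, not the actual Motion filter:

```python
def interlace_rows(left, right):
    """Row-interleaved stereo: even rows taken from the left eye,
    odd rows from the right eye (assumed passive-display convention)."""
    return [left[i] if i % 2 == 0 else right[i]
            for i in range(len(left))]

# Tiny 4x4 "frames", one label per pixel, purely for illustration.
left  = [["L"] * 4 for _ in range(4)]
right = [["R"] * 4 for _ in range(4)]
for row in interlace_rows(left, right):
    print(row)  # rows alternate: all-L, all-R, all-L, all-R
```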
©2013 FCP.co/Matt Brading