Patrick Southern and Sam Mestman take us on a detailed walk-through of the workflow involved in the post production of a feature. We start with part one, on set, where capturing logging information is essential to cutting time and costs in post production. With the first part containing over 4000 words of wisdom, this series will be the 'go to' post production reference for future filmmakers.

It is completely beyond me why an editorial department wouldn't either be living on set or in the production office cutting away while everyone else is shooting the movie. Currently, most people have a DIT making dailies and creating LUTs that then get delivered to the editorial department either later that night, the next day, or sometimes even weeks later if post production starts after the shoot.

For the We Make Movies original TV pilot, Off The Grid, we decided to do things a little differently. You can watch our behind-the-scenes workflow video to get a general sense of what we did. 

 

On Off The Grid, post production started the second the footage was placed on our LumaForge Jellyfish. We had a 6K Dragon as our A camera, with 5K Epic as our B camera. On-set editorial looked like this:

[Image: OTG part one 01]

 

Here's why:

Filmmaking is the only artistic medium where most artists can’t afford to make their art the way they want to. My aim is to remove that hurdle along with all others so the only limitation in making a movie is one's own creativity.

It has been a long and winding road to get to where we are now as digital filmmakers at We Make Movies and LumaForge. In 2008 I produced, edited, and colored an independent feature film appropriately titled How I Got Lost.

We may have been the first independent film shot on the RED camera (Build 15, for those who know what that means), and if I’d known anything about marketing at the time, I would have made a bigger deal out of that. We shot in 4K and finished at 1080. The following is a list of things I shouldn’t have had to experience when I made How I Got Lost in my Final Cut Pro 7 - RED - Apple Color workflow:

  • Work offline at 1080p with a complicated conform to R3Ds at the end
  • Sync sound manually
  • Name clips manually
  • Work from a handwritten Script Supervisor binder and camera/sound reports
  • Work with no searchable metadata, which meant remembering all of my footage
  • Burn dailies onto DVDs with timecode burn-in
  • Manually create track-based OMFs for sound
  • Go back and forth, color grading blindly on my computer and then checking how it looked in an expensive grading theater with a DCI-P3 projector
  • Master to HD-Cam SR and make a DCP
  • Work via sneakernet, reconnect my whole movie at every turn, and deliver a new drive to every collaborator
  • Rely on someone else to distribute the movie and get pennies on the dollar
  • Spend $400,000, go deep into debt, and become unable to make a much better movie later on due to cost

2009

In 2009, having no idea what else to do, I started We Make Movies, a filmmaking community designed to reboot how Hollywood makes movies. It would later become, as far as I know, the world’s first community funded production company.

That same year, Blackmagic Design purchased DaVinci Resolve, which would truly democratize film finishing for colorists around the world.

2011

In 2011, Final Cut Pro X was introduced to rave reviews throughout the professional editing community!… Ok, maybe not so much… It was left for dead by the pros for reasons that are well documented.

In the long run, it turned the application into the best kept secret in post production. It also allowed for a really strong, passionate community of users and 3rd party developers to rise up around it largely without interference from the established players. FCPX was free to grow and develop without any limitations. I was one of the people who left it for dead at launch, and then picked it back up when multicam support was added in 10.0.3.

2013

In 2013, after the 10.0.6 release that added audio component editing and the world’s best implementation of RED RAW support, I posted a three-part FCPX-RED workflow series for Moviemaker Magazine that spelled out a better way to make movies using Final Cut Pro X, and how it represented the future of filmmaking.

I had been using a cobbled together version of RED, Final Cut Pro 7, and Apple Color for years, and was continually frustrated by the ridiculously difficult conform process.

FCPX changed the game for me when I realized I could grade my RED RAW in REDCINE-X and have it automatically update in FCPX, transcode my RED footage down to proxy within FCPX, batch-sync audio to video in Sync-N-Link without the need for exported dailies, and batch-rename my clips using metadata.

This would forever change the way I made movies. When I wrote those articles, it was because I wanted to help clear up a lot of misinformation around this App that I felt had been wrongly dismissed by professionals. To me, it represented a forward thinking way of making content.

[Image: Expendables 3 EPK editor article]

Later in 2013, I went to Bulgaria to do what I still consider to be the craziest doc project I’ve ever seen done in FCPX (and where I threw Gergana Angelova into the deep end…somehow she came out the other side and now works at LumaForge).

Then, as part of LumaForge, I collaborated with Jan Kovac and Mike Matzdorff to set up their workflow for Focus.

From there, I worked with Kevin Bailey to help build Shot Notes X, which was then used extensively on Whiskey Tango Foxtrot. Along the way, I helped launch FCPWORKS with Noah Kadner, and we got to take a deep dive into some of the largest FCPX implementations in the world. This helped us understand what pros really wanted to be able to do.

We recognized that FCPX could never work for post production facilities unless there was an affordable shared storage solution that was optimized for it. With that information, I returned to LumaForge and launched the Jellyfish and ShareStation.

FCPX 10.1 fundamentally changed the way Libraries worked. The "Trashcan" Mac Pro was released in late 2013. 3D Text was added in 10.2 (which I still think is an awesome feature). FCPX also saw massive enhancements to effects and XML handling. A massive ecosystem of 3rd party apps emerged, including Color Finale, Lumberjack, and the updated DaVinci Resolve.

More people began dipping their toes in the water, and a steady stream of case studies followed as facilities became willing to be open about how and why they were using FCPX in professional environments.

Ronny Courtens and Jesus Perez Miranda led a sustained charge throughout Europe with the FCPX Tour, aimed at fundamentally changing how facilities approach content creation. Things began to ramp up at IBC in 2016 with the European community’s FCPX Tour presentations.

And finally, in October of 2016, we got the massive FCPX 10.3 update we had been waiting for. This landed the day before the FCPX Creative Summit, which was, hands down, the most fun I’ve ever had at a post production event, and was pretty much summed up by Patrick Southern’s recent article.  This culminated with Apple returning to LACPUG in November 2016, and allowing a taped presentation of the new features in 10.3, which is something that they hadn’t done in years. LumaForge was lucky enough to follow Apple with our demonstration on collaborative narrative workflow in FCPX 10.3.

Final Cut Pro X 10.3 Workflow at LACPUG

Final Cut Pro X 10.3 - 10 Tips and Tricks at LACPUG

Lumaforge : FCPX 10.3 Workflow

 

This five-part series should be looked at as a cheat sheet on how to make a movie, pilot, or doc without limits in the modern age.

This is a much needed update to those Moviemaker articles I wrote back in 2013. Everything you are about to read has actually been done with a real world project called Off The Grid, which is We Make Movies’ first original TV Pilot that premiered at the Sundance theater in Hollywood. Off The Grid represents 7 long years of me trying to create a reproducible model for content creation that removes all the barriers in the way of you and your team telling your story the way you want to tell it.

Here is what we were able to accomplish with Off The Grid (and you’ll be able to do this too after reading this series):

  • Work in 6K from native RED RAW footage and original audio with no LUTs or dailies required
  • Automatically batch-sync jam-synced sound, create synchronized/multicam clips, and automatically apply sub-roles to multichannel WAV files with Sync-N-Link X
  • Add the Script Supervisor's notes to footage via Shot Notes X, making them searchable in the Browser
  • Automatically batch-rename clips based on metadata
  • Start editing while importing footage and create same-day rough cuts of scenes
  • Easily work in FCPX in a shared environment
  • Create synced dailies for review in the cloud with Frame.io
  • Easily flip between offline and online resolution with the click of a button
  • Automate a perfectly organized sound delivery with no track preparation using X2Pro
  • Manage the color pipeline for theatrical, web, and broadcast, and color grade for the theater in DCI-P3 with an inexpensive broadcast monitor
  • Make a color-accurate DCP in-house for free
  • Retain connections to media when moving between machines/collaborators
  • Pass VFX renders back and forth through the cloud in a mastering codec
  • Spend less than 5% of what we spent in 2008 making How I Got Lost

Enough preamble. Patrick Southern and I have put together this guide to help you make your movie, TV show, web series, or documentary without needing anyone else along the way. There are no more gatekeepers between you and showing your movie on the big screen. The following can be looked at as the ultimate cheat sheet to make and deliver your movie in the easiest way possible, and still have it look like someone in Hollywood made it.

System Requirements - Getting Started

The official minimum system requirements from Apple are as follows. They are NOT the minimum requirements for this workflow:

  • OS X 10.11.4 or later
  • 4GB of RAM (8GB recommended for 4K editing and 3D titles)
  • OpenCL-capable graphics card or Intel HD Graphics 3000 or later
  • 256MB of VRAM (1GB recommended for 4K editing and 3D titles)
  • 4.15GB of disk space

We recommend:

  • At least 16GB of RAM (32GB+ preferable)
  • 1-4GB of VRAM on your graphics card

If you’re working in a shared environment, we would also recommend putting your editorial team on a Jellyfish (mention fcp.co and save 10%) so everyone has simultaneous access to the same media on storage that is designed specifically for the demands of FCPX Libraries. Here was our Jellyfish setup for Off The Grid:

  • Jellyfish Tower
  • MacBook Pro for Ingest
  • Mac Pro (2013 or later) as Assistant Editor's machine for transcoding, syncing, and organizing footage
  • iMac 5K Retina as Editor's machine for cutting the show

You’ll want licenses of Shot Notes X and Sync-N-Link X for adding metadata and batch syncing your footage. Finally, download the Frame.io app from the App Store and create a Frame.io account on their website. We’ll cover other 3rd party apps and plugins later.

Part 1: On-set Post Production

Camera Dept. Workflow

For the most flexibility, record RED RAW on Set. This will allow you to do a transcode-free one light color pass in REDCINE-X. Any grades you make in your REDCINE-X bin will pass directly into FCPX. That's because the changes are metadata based rather than burned into the footage.

Also make sure that your camera department jam-syncs timecode with your sound recordist throughout the day. You should also slate every take to provide a visual representation of sync, scene, and take just in case something goes wrong down the line.

DIT-Asst. Editor Workflow

When ingesting footage from camera cards, use an app like Shot Put Pro or Resolve’s Clone Tool to make sure your footage copies to the drive with no errors. Shot Put Pro and Resolve will also allow you to copy to multiple destinations at once, allowing instant backups of your Original Negative (O-Neg).
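Shot Put Pro and Resolve handle this for you, but the underlying idea (copy once, then verify every destination against a source checksum) can be sketched in a few lines of Python. The paths and file names here are hypothetical:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Hash a file in chunks so large .R3D files never load fully into RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verified_copy(source: Path, destinations: list[Path]) -> str:
    """Copy one camera file to several destinations (e.g. working drive plus
    O-Neg backup) and confirm each copy matches the source checksum.
    Raises IOError if any copy is corrupt."""
    source_hash = sha256_of(source)
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        target = dest_dir / source.name
        shutil.copy2(source, target)  # copy2 preserves timestamps
        if sha256_of(target) != source_hash:
            raise IOError(f"Checksum mismatch copying to {target}")
    return source_hash
```

Dedicated ingest apps add niceties on top of this (MHL reports, progress UI, card detection), which is why they are worth the money on a real shoot.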

Once ingested, the RED RAW can be graded in REDCINE-X, which will update the .RMDs that FCPX reads. For us, this made it possible to immediately check a scene and ensure the footage was workable. You can transcode your Proxy Media while building your Events by running RED clips and production audio through Shot Notes X and Sync-N-Link X.

Adjust metadata as needed to deliver final scene-based Events to the editor to work from in a master FCPX Editorial library. On Off The Grid, three computers were involved in what essentially became a fully operational on-set lab and post house.

Production Sound Workflow

Communicate with your production sound recordist ahead of time and make sure they can record iXML data to their multichannel WAV files. What does this mean? Basically, they should label each microphone track they record after the mic type (i.e. Boom, Plant Mic) or after the character who is speaking (if it’s a lav mic). That way, when you get the files into post, they will be labeled something like this in FCPX (and will later show up this way on the proper tracks when you deliver to your sound designer in Logic, Reaper, or Pro Tools):

Track 1 - Boom 1
Track 2 - Boom 2
Track 3 - Brynn
Track 4 - Sara
Track 5 - Pete
Track 6 - Leslie
Track 7 - Terrence
Track 8 - Wallace

This used to be possible only when using jam-synced timecode with Sync-N-Link X (which still works). New to FCPX 10.3, however, is the automatic labeling of sub-roles with iXML data in the event that you did not record jam-synced timecode and need to sync by hand or in PluralEyes. I can’t even express how much of a timesaver this is.
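If you are curious where those labels actually live, a broadcast WAV stores them in an "iXML" RIFF chunk as plain XML, which apps like FCPX and Sync-N-Link X read. Here is a minimal Python sketch of pulling the track list out of a WAV file; the element names follow the iXML specification, though your recorder's output may include more fields:

```python
import struct
import xml.etree.ElementTree as ET

def ixml_track_names(wav_bytes: bytes) -> dict[int, str]:
    """Walk the RIFF chunks of a WAV file, find the 'iXML' chunk, and
    return {channel_index: track_name} from its TRACK_LIST."""
    assert wav_bytes[:4] == b"RIFF" and wav_bytes[8:12] == b"WAVE"
    pos = 12
    while pos + 8 <= len(wav_bytes):
        chunk_id = wav_bytes[pos:pos + 4]
        (size,) = struct.unpack("<I", wav_bytes[pos + 4:pos + 8])
        data = wav_bytes[pos + 8:pos + 8 + size]
        if chunk_id == b"iXML":
            root = ET.fromstring(data.decode("utf-8").rstrip("\x00"))
            return {
                int(t.findtext("CHANNEL_INDEX")): t.findtext("NAME")
                for t in root.iter("TRACK")
            }
        pos += 8 + size + (size % 2)  # RIFF chunks are padded to even sizes
    return {}
```

Running this on a production WAV from a recorder set up as described above would return something like {1: "Boom 1", 3: "Brynn"}, which is exactly the mapping FCPX uses to name the Role Components.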

[Image: OTG part one 06]

Script Supervisor Workflow

It used to be that you’d pay a script supervisor good money to write detailed, handwritten notes that would be photocopied and placed in a binder for the editor, who would either never look at them again or certainly have no immediate use for them within their NLE. No longer. FCPX restores value to the script supervisor position, and all you need is the Shot Notes X app from the App Store.

From there, send your Script Supervisor a “Shot Log Template” to work from. This will make it possible to add their notes to the footage in Final Cut Pro X later in the process. They can use the “Shot Log Template” to build a database in Excel, Numbers, or even FileMaker. When they are finished with their notes, they will need to export a .CSV (Comma-Separated Values) file for import into Shot Notes X (more on the Shot Notes workflow in Part 02).

Managing your .R3Ds with REDCINE-X for use with FCPX

If you’re working with RED RAW, a great benefit is the ability to manage the look of your footage using metadata (the RAW controls) in either REDCINE-X or FCPX. Probably the single biggest reason I switched to FCPX in the first place was the ability to seamlessly switch between Proxy (ProRes 422 Proxy files generated by FCPX from the RED media) and Original media (the native .R3D files) at the click of a button. Your workflow with RED media on-set should be as follows:

  • DIT copies .R3Ds onto the Jellyfish.
  • DIT imports .R3Ds into REDCINE-X and makes sure .RMDs have been saved for all RED clips. There is a simple option in REDCINE-X that will batch-create these, and it can be done at any time. This video I did back in 2013 is still relevant.
  • DIT applies a one light grade to the RED RAW footage. If the assistant is already working from the footage in FCPX, this one light will automatically update the clips in FCPX.
  • In FCPX, the assistant sets all Spatial Conform settings to “Fill” upon importing RED clips.
  • In FCPX, the assistant transcodes the original RED media to proxy once the one light has been applied, keeping the generated proxy files on the Jellyfish.

[Image: OTG part one 02]

  • Editor will work in Proxy mode in FCPX throughout the offline edit. Once the finishing stage is reached, simply switch back to your Optimized/Original media to start manipulating the original .R3D media (this can be toggled from the View dropdown in the Viewer in FCPX).

Your Libraries

There are two types of Libraries you should work from while on-set. The first is the Dailies Library.

This is where you will initially organize all of the footage as it comes to you from the camera department. It’s also where you’ll assemble timelines to export dailies for review to a service like Frame.io.

The second type of Library is a Scene Library. This is where you’ll edit your movie. If you are working on a project under 30 minutes, a single Master Scene Library should suffice. However, depending on the scope of your project, you may want to break your Master Library down into more workable Scene or Reel Libraries. You’ll want to determine ahead of time which scenes will go in which reels. A good rule of thumb for features is breaking them into 20-scene increments.

The Dailies Library

[Image: OTG part one 03]

On day one, you’ll want to start by creating your Dailies Library. In FCPX, select “File>New Library”. You’ll then want to set your Library Settings. This is an important step, as it ensures quick and easy collaboration between editors later down the road. 

To access your Library Settings, select “File>Library Properties” or use the keyboard shortcut “Command+Control+J”. Click the “Modify Settings” button next to “Storage Locations”. Set your “Media" to a folder on your Jellyfish, either at the root of the volume or within a directory specific to your movie. Set “Motion Content” to “In Library” so that the Titles, Effects, Generators, and Transitions you use can pass between editors. Set your “Cache” to either an “FCPX Cache” folder you create at the root of the Jellyfish, or place it in the same folder as your “Media”. Finally, leave “Backups” set to “Final Cut Backups”. This ensures that you have a backup of your Library somewhere other than your working drive.

Next, scroll to the bottom of your “Library Properties” and change your “Color Processing” from “Standard” to “Wide Gamut”. This will allow you to create Projects in Rec. 2020 Color Space, making it possible to color grade accurately on a P3 monitor for cinema delivery.

Now that you’ve set all of your Library settings, go into Finder and duplicate your Library to create your remaining Libraries. This ensures your Library Properties are retained across the board. Back in FCPX, create an Event for your first day of shooting within your Dailies Library.

Importing

[Image: OTG part one 05]

When importing your .R3Ds and audio into FCPX, make sure to leave “Assign iXML track names if available” checked. To make editing fast and efficient across all machines, check “Create proxy media” under “Transcoding”.

You can assign an Audio Role to the audio of the clips under the "Assign Role" drop-down. If you want to create a separate Audio Role for “Location Sound”, do that before importing and select your “Location Sound” Role from the “Assign Role” drop-down.

[Image: OTG part one 04]

Now the track names that your sound recordist set for each mic will pass into the “Location Sound” Role and will be properly assigned to each Role Component (Similar to Tracks or Channels).

Syncing, Multicams & Metadata 

Now that you’ve imported your footage, select all of your R3Ds and set their “Spatial Conform” to “Fill” in the Video tab of the Inspector (Command+4 if it isn’t already open). Doing so makes sure the footage will fit properly in your Multicams, Projects, and when turning over to DaVinci Resolve.

From here, you’re going to want to add your Script Supervisor’s notes to your footage with Shot Notes X. Before opening Shot Notes X, double-check the .CSV provided by your Script Supervisor. Make sure to fix any misspellings and verify that the .CSV matches the layout of the “Shot Log Template” you supplied them before shooting began. Once you’re sure that the .CSV is correct, open Shot Notes X. Drag your .CSV to Shot Notes X, and then drag the Event containing your footage to Shot Notes X. Save your XML and tell Shot Notes X which Library to add the updated footage to.
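The "double-check the .CSV" step above can be partially automated. A small Python sketch that flags missing columns and blank rows before you hand the file to Shot Notes X (the column names here are hypothetical; match them to whatever your actual Shot Log Template uses):

```python
import csv
import io

# Hypothetical template columns -- substitute the headers from your own
# Shot Log Template.
REQUIRED_COLUMNS = {"Scene", "Take", "Camera Roll", "Notes"}

def check_shot_log(csv_text: str) -> list[str]:
    """Return a list of problems found in a Script Supervisor CSV:
    missing columns, plus any row with a blank Scene or Take."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        problems.append(f"Missing columns: {sorted(missing)}")
        return problems
    # Row numbers start at 2 because line 1 is the header.
    for line_num, row in enumerate(reader, start=2):
        if not row["Scene"].strip() or not row["Take"].strip():
            problems.append(f"Row {line_num}: blank Scene or Take")
    return problems
```

An empty result means the layout is intact; anything else is worth fixing in the spreadsheet before import, since a malformed CSV is much easier to repair there than after the notes have landed on your clips.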

FCPX will ask if you want to “Keep Both” or “Replace” similar items found in the Event. Select “Replace”. Once the footage has updated, select the original clips and hit “Delete” to reject them. Next, drag your Event to the Sync-N-Link X icon on your computer’s dock. A few helpful settings to select:

  • Name synchronized clips using video
  • Trim clips to common video and audio duration
  • Also make Multicams
  • Use Subrole Names for audio component names
  • Import automatically after saving

Once the footage is back in FCPX, go through and Reject any Synchronized Clips that have a matching Multicam Clip. Double-check the sync of your Multicams by opening the Angle Editor and verifying that the slate clap is simultaneous between cameras. Additionally, in the Audio tab of the Inspector, make sure to uncheck the camera audio if it hasn’t already been disabled.

Digital Dailies 

Now that your footage carries your Script Supervisor’s notes and has been properly synced, it’s time to export your dailies and upload them for your producers to check out.

To do this, we highly recommend getting a Frame.io account, because it enables batch uploading of dailies from FCPX and is a fantastic platform for client notes and review. To get going, select all your synced footage and hit “E” to append all of it to the Primary Storyline.

In the “Clips” tab of the Timeline Index ("Command+Shift+2” or the “Index” button on the far left side of the screen), search “Multicam”. Use the results to jump to each of your Multicam Clips. Duplicate each Multicam (Option+Drag Right) and switch to your other camera angle.

When you’ve done this for every Multicam Clip, use “Command+A” to select all of the clips in your timeline. Open your Effects Browser (Command+5), select “Basics” and double click on the “Timecode” effect. This will apply Burn-In timecode to all of your clips. Click the “Share” button in the upper right corner of FCPX, scroll down and click the “Frame.io H.264” setting. Now sit back, relax, and watch your dailies export and upload to Frame.io, right before your eyes!

 

That pretty much takes you through Part 1. In Part 2 we’ll be doing a deep dive into on-set organization. If you have any deeper questions about how to make some of this work for your productions, please reach out to us at www.lumaforge.com. For existing LumaForge customers: we have your back with this stuff, and when you bought our storage, you bought our brains too. We want to work with you to take your productions to the next level.

Lastly, if you’re a filmmaker in Los Angeles looking to find other people to make movies with, there’s no better place than We Make Movies and if you’d like to become a member, you can do so here.

 

Sam Mestman is the CEO of LumaForge, maker of the Jellyfish and the ShareStation, a shared storage platform optimized for media and entertainment. He is also the founder of We Make Movies (www.wemakemovies.org), the world’s first community funded production company, as well as a workflow architect for FCPWORKS. As a professional editor and colorist, he has worked for Apple, ESPN, Glee, and Break Media (to name a few), and has edited or colored hundreds of shorts, features, web series, and just about every other type of content you can think of. He is also one of the world’s leading experts on Final Cut Pro X workflow, and is responsible for some of the largest professional FCPX integrations in the world.

Patrick Southern is the Chief Workflow Engineer at LumaForge in Hollywood, CA. He previously worked as an Editor and Assistant Editor on documentary projects for A&E, Riot Games, Smithsonian, National Geographic, and the Lifetime Movie Network. He has helped develop and refine a number of software tools for documentary editing, and has also acted as an FCPX post production consultant on a number of independent feature documentaries.