A production in Australia used Final Cut Pro X to edit films for multiple screens in a live performance. Video, music, dancers and over 60 live cues in the show - how was it done?
We have to thank Tim Maisey for this unique FCPX user story. We will let him take up the tale of this fabulous community project:
Rhythms of Life: FCPX
This was a seat-of-your-pants-tight-deadlines-low-budget-live-performance-community-involvement-real-life project involving 26 interviews, 2 filmmakers/editors, 17 musicians and performers, 14 live production crew, 8 projection screens, and 2 live performances.
This write-up may help others with basic FCPX features, my (imperfect) workflow, tools and ideas - or help them avoid the mistakes I made. I hope it is interesting and useful.
The project began back in early 2014 - I had been asked by Peter Keelan, the Creative Director, if I would be interested in running some workshops and filming some interviews for Denmark Arts (subject to getting the funding). Sounds good. Months later, the project had developed into a full-blown multi-screen live event.
The green light was given in mid June 2014. Pete envisioned filming local community elders about their life and having the interviews projected onto multiple screens. The project was called 'Rhythms of Life' and it would involve other community members responding to the interviews with music, dance, and writing. I had not been involved in a community project before and trying to communicate what the project was about to others was difficult.
Was it a film? No, but there would be film in it. Will there be live music? Yes - we hope so. Who are we interviewing? We don't know yet - probably 10-15 people. Who is involved? Me, Pete and Nic Duncan were the core group. Pete had lots of experience in music and community projects and would be getting musicians, performers and general helpers involved.
The questions started coming: how would the projections work (one projector or multiple projectors)? How would we produce and edit content for each screen? And how would the live performances fit in?
For the first interview, on a Saturday morning in late July at a house way out of town, we turned up with everything - 5 cameras, tripods, lights, booms, greenscreen, slider - and took over the house. The original plan was to film outside, but it was raining and the couple we were interviewing were going on holiday the next day.
We tried using a clapperboard, but it was off-putting - hopefully we could sync it up in FCPX later. The shoot took about 4 hours with intro, setup, filming and some impromptu lunch. We would have to slim down on equipment and interview time! (See Typical Setup Photo above and early promo produced in August).
First Rough Cut
Fast-forward to early September, there was a meeting for all those involved going through what the project was about - time for a quick rough edit from the first 13 interviews.
At that time, I only had the footage from my camera so it was easy to import each interview as a new event into one library (see FCPX screenshot below). But it was already getting a bit out of hand. The new FCPX project snapshot feature provided some security and versioning, but sometimes I was editing the snapshot rather than the original project file.
I've always wanted some way of locking different parts of an edit, as it is so easy to move something and mess everything else up - a check-in/check-out system would be a good way to go. Due to the pressure of getting a rough edit done, I had not looked at all the footage; I only skimmed to the interesting bits, used Favourites and some keywords, and put together a timeline.
The 7-minute cut was well received, but I learned that it was not a film project, but a community project, which would require some different skills!
Multiple Screen Edit
We had to figure out how the projections would work. Luckily Pete had met Matt Andrews who was a VJ and had used Mad Mapper and VDMX5 at live events - perfect!
We set up a trial with one projector mapped to multiple virtual screens (see Mapping Setup Photos). I learned that some of the screens were not going to be 16:9 format, but round - the projections would be going onto huge drums! The framing during filming had not taken this into account - hopefully it would be ok.
I also learned that the easiest process would be to produce multiple master files for each interview - one for each projection-mapped screen. Master files are quick to export, though they are the largest format. That suited us, as compressing the files would have taken too long whenever quick changes were needed.
I had only produced one timeline and one master file, so whilst everyone was reviewing Pete's edit I went to work on producing files for a multi-screen extravaganza!
I copied my project to a new project and began editing on my laptop. In the new project, I disabled the sound (we decided that only the centre screen/one project would have sound) and edited the shots. I then copied the project again and repeated.
I then went back to the original project, disabled some of the B roll, and quickly produced some new master files. This was all very easy in FCPX and gave us an idea of the workflow required. It was also the first chance to see what it would look like across multiple screens. The advantages are that jump cuts are disguised by 'jumping' across to a different screen, and that B roll and stills can appear at different times on other screens. It certainly requires a different style of editing, and keeping each separate project in visual sync is a challenge.
We continued our interviews, up to two a day and eventually ended up interviewing 26 people and filming 2 school choirs.
With each interview came suggestions of other people to interview - it could easily have reached 30 and kept going. We interviewed a Stanford professor who researched gravity waves, a professor who plays trombone to cows, a hippy, a woman in a coffin planning her funeral, a Vietnam veteran, and a pilot who had flown for the Onassis family and had met Picasso.
We learned what it means to be part of a community and also the tragedy of family members and partners who had died. We had some set key questions, and got answers that ranged from a few seconds to over 10 minutes. And, we had a deadline of a live performance on 8 November.
Each interview would have to be edited to between 2 and 3 minutes to fit into the final live performance time of around 90 minutes.
The performers got the raw interview footage and were working on their music, poetry, writing, and singing at the same time as we were working on the edits. Sometimes the performers would have a strong creative idea from the interview that would influence the edit, and sometimes the edit would come first. It was very dynamic with new performers found and some dropping out.
The live performances could also occur at any point within the interview as well as at the start and end meaning that gaps or fades were required at certain points within the edit. Occasionally I got very detailed instructions on the editing required or particular points or words in the interviews that were needed in particular sequences. Being able to skim to these sections or review in person with the performers in FCPX made editing a breeze.
Once we had completed most of the filming, we split the editing. Pete and I swapped hard drives containing each other's camera and audio output and started the real edit.
Pete was on FCP7 and I was on FCPX - we kept our projects separate. Very soon I realised that having one library wasn't a good idea, so I created new libraries for each interview. It would be more manageable.
I used Final Cut Library Manager, which is great for organising different libraries. I imported the footage into each library so that it was self-contained - a good idea at the time, I thought, as I could then copy across the whole library as a backup. However, copying into the library meant I had the original camera files on the drive and another copy inside the library.
The libraries became big, up to 50GB each. I began to run out of space on my 1TB drive and at the time I didn't want to un-self-contain the libraries as I didn't know how easy this would be. I later found out this was very easy (Thank you, Larry Jordan). I had to upgrade to a 4TB drive, which took many hours of copying.
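When libraries grow like this, it helps to see exactly where the space is going before buying a bigger drive. As an illustration only (not something I used at the time), here is a minimal Python sketch that totals the size of each FCPX library bundle (`.fcpbundle` package) at the top level of a drive; the drive path you pass in is up to you:

```python
import os

def folder_size_gb(path):
    """Walk a folder tree and total its file sizes in gigabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):  # skip broken symlinks
                total += os.path.getsize(fp)
    return total / 1e9

def report_libraries(drive):
    """Print each FCPX library bundle on a drive, largest first."""
    libs = [d for d in os.listdir(drive) if d.endswith(".fcpbundle")]
    for size, name in sorted(((folder_size_gb(os.path.join(drive, d)), d)
                              for d in libs), reverse=True):
        print(f"{size:7.2f} GB  {name}")
```

Running `report_libraries("/Volumes/Edit4TB")` (or wherever your libraries live) lists the biggest offenders first - handy for deciding which libraries to make un-self-contained.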
With an hour of footage per camera, additional audio, 100 stills, and B roll for each interview, we had a lot to get through.
I edited the main project as if it was one film, then copied the project and re-edited muting the sound and disabling (V) the clips that I didn't want in the new project/screen. I would then try and see what it looked like, but it is actually not that easy to fire off more than one master file at the same time.
A mock up of the screen mapping was produced by Matt in VDMX5. You drag the master files into each screen and then 'arm' and 'launch' to watch all the screens. It wasn't an exact replica of how it would be live, but a lot better than nothing.
I felt I needed to watch all the footage, as on each shoot I was either the interviewer while also running camera and sound (too much to do!) or just on camera/sound. Either way, with very shallow depth of field and only 12 minutes per reel, there was a lot to think about during the shoot.
The FCPX skimmer came in really handy when looking for times when the shot changed on one of the cameras. The audio waveforms were also good for finding out when the next question was asked (different levels of interviewer/interviewee - we weren't filming the interviewer).
At other times we shot B roll or filmed at different locations, which again was easy to find later. Originally I organised by events, but then moved to keywords and then smart collections that were refined over time. I now have a separate library that stores standard smart collections. The projects and old project snapshots had their own events for ease of finding.
I had used the multicam feature successfully a number of times in FCPX, but I found that I couldn't sync between the two cameras. I think it was because one camera (a Canon 5D) produced multiple files in one codec, while the other (a Sony) produced a single file in a different codec.
However, syncing my camera to my audio recorder (Zoom H4N) works fine. Due to time pressures, I didn't investigate and went with whichever sound was going into each camera/sound recorder. Not perfect. Later on I tried PluralEyes; it synced on the audio, but seemed to have issues with the Sony codec.
We decided that we needed to get the audio mastered. We therefore had to complete the edit to get an audio file ready for mastering. Yet, I knew we still had some more filming and separate vocals to do.
I used the Timeline Index to check that the roles had been automatically set correctly to Video, Dialogue, Music, and Effects - I corrected a few. The effects I got from the standard Apple packs in FCPX and any non-live music was either previously recorded or royalty-free music bought from the iTunes store.
I then exported the roles as separate audio tracks - this step was quick. The files were put on Dropbox to be picked up by an audio engineer in Perth (500km away). They were then mastered over the weekend and the final single audio track inserted back into the main project file. The other audio tracks were muted, again using the Timeline Index.
The projects were checked to make sure everything was in sync, audio ok and then new master files produced.
My Time Machine drive did not have enough free space to use as a backup. I manually copied the libraries across to a separate backup drive. This was taking too long.
I decided on setting up a new 'transfer' library to be used as an incremental backup on my laptop. Any changes required when we were doing rehearsals would be copied onto this transfer library and then copied to the other drive when I got back to my studio. I used Compare Folders app to check that everything in the FCPX folders and master files were in sync across my drives. Again, not ideal.
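Compare Folders does this through a GUI. For the curious, a rough equivalent in Python - comparing two drive locations by relative path and file size only, not checksums, so treat it as a sketch rather than a substitute - might look like:

```python
import os

def snapshot(root):
    """Map each file's path (relative to root) to its size in bytes."""
    snap = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            snap[os.path.relpath(full, root)] = os.path.getsize(full)
    return snap

def compare_trees(src, dst):
    """Return files missing from dst, extra in dst, and differing in size."""
    a, b = snapshot(src), snapshot(dst)
    missing = sorted(set(a) - set(b))
    extra = sorted(set(b) - set(a))
    changed = sorted(p for p in set(a) & set(b) if a[p] != b[p])
    return missing, extra, changed
```

Anything reported in `missing` or `changed` would need copying across again before the drives could be trusted to match.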
The project files were labelled by interviewee and also by screen position. Each master file was in ProRes422 at 1-3 GB in size and the audio was stored separately.
I couldn't run more than 4 master files at a time - my system wouldn't handle it. So I couldn't see what it would look like for real. I would have to wait for technical rehearsal, the week before the live performance. The master files were converted to .hap files, a larger format favoured by VJs and used by VDMX5 to trigger the multiple screens.
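We used a dedicated converter for the HAP step, but for anyone wanting to script it: ffmpeg builds compiled with the HAP encoder can do the same conversion (`-c:v hap`, with `hap`, `hap_alpha` or `hap_q` variants). A small Python sketch, with hypothetical filenames, that just assembles the command:

```python
import subprocess

def hap_convert_cmd(src, dst, variant="hap"):
    """Build an ffmpeg command converting a ProRes master to HAP.

    variant can be 'hap', 'hap_alpha' (with transparency) or
    'hap_q' (higher quality, larger files).
    """
    return ["ffmpeg", "-i", src,
            "-c:v", "hap", "-format", variant,  # ffmpeg's HAP encoder
            "-an",  # strip audio - only the centre screen carried sound
            dst]

# To actually run it (requires an ffmpeg build with HAP support):
# subprocess.run(hap_convert_cmd("master_left.mov", "left.mov"), check=True)
```

One command per screen per interview adds up quickly, which is exactly the sort of repetitive job worth scripting in a loop.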
Rehearsals and Live Performance
At the first technical rehearsal the door projector, a critical part of the performance, was not talking to VDMX5.
I also got handed some music that needed to go into one of the scenes - it was an easy job in FCPX to put the track in the timeline, fade it down for the interview, and then fade it up for 30 seconds at the end for the live performance. I also changed the framing using Position and Scale in FCPX, as some parts needed realigning due to the round drum shape.
Some old master files were being used - the correct ones were remastered during the rehearsal. All of this meant a midnight finish and we still hadn't seen how it would finally look. This was a week before the live event.
We sorted out the rest of the technical issues during the next technical and one and only dress rehearsal 2 days before the performances. Apart from the projection mapped screens, there was the sound, lighting, live band, separate live musicians, dancers, live writing on a separate projector, stage management, promotion, ticket sales, etc. etc.
The final live performances had each interview scene with its set of mapped projection screens triggered/faded live using VDMX5 in response to the action on stage; there were around 60 live cues across 8 mapped projection screens.
The performances did sell out, the reviews did rave, there were tears and laughter, and the whole community benefitted from making new connections.
FCPX gave me the confidence to make sense of 1000s of different files across 26 interviews, multiple screens and with live performers - and still make last minute editing changes as required.
Tim Maisey of TMCG productions grew up in England and worked in London for various banks, computer and media companies before moving to the small town of Denmark on the south coast of Western Australia. There he discovered filmmaking and community.