Final Cut Pro X isn't ready for prime time? Think again. The recent series of BBC1's Have I Got News For You was post produced on FCPX. Could it handle the multiple angles of composited layers in a complex multicam timeline with a tight turnaround? The editor James Branch details the workflow.
Hi. My name is James Branch and I make my living editing sales promos and sizzle reels for production companies and broadcasters.
I worked at a trailer house in London for 18 years but ever since I turned freelance I have been working from home and using Final Cut Pro X - and only Final Cut Pro X. Promos and sizzle reels are 90% of my work but I have also post produced a feature, edited commercials, short films, educational materials and stand-up comedy specials.
On the 18th March 2020 I received an email from one of my regular clients, Hat Trick Television. They produce the satirical news show, ‘Have I Got News For You’ - a show that, in September, will reach its 30th year and 60th series.
It’s a prime-time BBC1 show that takes a hard look at the week’s news. A studio-based panel show with a guest host each week and two teams of two, each team captained by one of the show’s two original stars, Ian Hislop and Paul Merton.
Due to COVID-19 they were planning to shoot series 59 in the studio as normal, but without an audience. However, they needed a back up plan if lockdown was called by the Government. So, the question was, could it be done over Zoom?
Having literally used Zoom for the first time that same day, I had a think and said, ‘Yes. I can do that!’ I quickly filmed a Zoom call with my 12-year-old daughter, put the two video files into a Multicam clip, edited it and sent it to the production manager.
Happy with that, we moved on to a bigger test. A Zoom call with Ian and Paul was arranged to simulate the show, with the producers standing in as host and guests. We recorded each feed by ‘pinning’ the camera view, and so I had five files to work with.
The show’s design team had come up with a ‘virtual studio’ graphic for each person and team so I made a compound clip for each contestant and put one CC into each angle in a Multicam clip. That all worked great, so an 8 minute proof-of-concept edit of the show was sent over to the BBC.
It quickly became apparent that doing it all via Zoom wasn’t viable due to audio delay, video quality and, over the course of nine weeks, many of the guests simply wouldn’t have a good enough internet connection. But, it works in principle!
Then what happened? Lockdown! No studio version anymore, it’s going to be done remotely.
Now, I won’t go into a blow-by-blow account of how the workflow grew and changed over the course of the nine-week run, so I’ll take you through the edit and delivery of TX9, the last show. Needless to say, there were some very late nights/early mornings required during the production of the early shows as we worked things out and identified areas where we could save precious time.
After the first show it quickly became apparent my trusty 2015 5k iMac simply couldn’t cope with the demands I was making of it so an 8-core iMac Pro was ordered and I never looked back. The other essential element in my workflow was my internet connection. The 50 Mbps upload speed I was getting from my BT Infinity 2 service was crucial.
Over the course of the run I had to learn and use some pieces of software and online services I’d never used before and, as ever, whenever I reached out to the FCPX community for help and advice, they didn’t let me down.
So, before we start here is the final camera and production set up.
A one man camera team was sent to each participant’s home and, wearing appropriate masks and gloves, they set up camera, lights and sound and left the room during the broadcast.
An SDI feed was transmitted to the machine room at the company base and these feeds were combined into a five-way split. This was then broadcast back to each participant via a low latency feed so they could see themselves and everyone else during the show. They each had earpieces connected to a Unity system so they could hear each other clearly and the producers could talk to the host during recording.
This five way split was also available over Zoom so that I and all the other members of the crew could watch and listen. The ProRes rushes were delivered to my home in South London on SSD cards as soon as each camera team had finished on location.
So, I would watch the recording via the Zoom link and record it. That file would then be the basis for my master Multicam clip. So, here’s a stage by stage breakdown of recording to delivery. I also had to produce a 42 minute extended edition of the show which broadcasts on the Monday night following the 28 minute Friday night show.
The aim of the workflow was pretty simple. As soon as the cut was approved by the producers, it would be exported; there was no time for post production.
I had to prepare some stills and simple animations to be played to the contestants during the show. This was done on Tuesday and Wednesday morning and was very simple. I set up a frame.io project for all the researchers and associate producers to upload all the stills, VTs, clarities and other graphic elements to, rather than be inundated with links.
The show would start around 3pm and here’s my set up for viewing the recording.
I would watch the feed in one window, follow the running order of the show in another and also log sections of the show and key moments using Lumberjack Systems’ Logger. On TX9 I was watching a Zoom feed (and recording it) and used Lumberjack’s noteLogger app which is faster and more responsive than the browser-based logger pictured here.
Recordings would last between two and three and a half hours. Once completed, I would take my Zoom recording, import it into FCPX and, once I had altered the start time to match the BITC, create my Multicam clip with the Zoom recording as Angle One, export an XML and then use Lumberjack’s Lumberyard app to apply the logging information I had captured, making it all available and searchable in the browser.
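Since everything hinged on the Multicam clip sharing the Zoom recording’s time-of-day BITC, the underlying arithmetic is worth a quick sketch. Here is a minimal illustration (not the show’s actual tooling) of converting between timecode and frames at the UK broadcast rate of 25 fps; the timecode values are made up:

```python
# Minimal sketch of time-of-day BITC arithmetic.
# Assumes 25 fps (UK broadcast) and non-drop-frame timecode.

FPS = 25

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    total_seconds = frames // fps
    hh, rem = divmod(total_seconds, 3600)
    mm, ss = divmod(rem, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# If a producer references 15:42:10:00 on their Zoom recording and the
# Multicam clip starts at 15:01:00:00, the offset into the clip is:
offset = tc_to_frames("15:42:10:00") - tc_to_frames("15:01:00:00")
print(frames_to_tc(offset))  # position within the Multicam clip
```

Because the Multicam clip’s start timecode matched the Zoom recording, any time-of-day reference from the producers mapped straight onto a timeline position with no further conversion.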
Using a Smart Collection I could quickly navigate my way around the recording. If I wanted to find the Pick Ups done for section 4a of the show, I could get there very fast.
I’d then wait for the rushes to be delivered to my house. Some I would get in under an hour, others wouldn’t arrive until the next morning. As they were delivered on SSD cards I could slot them into the Blackmagic Multidock we had rented, connected via Thunderbolt to my iMac Pro, so I could start work immediately.
So, now we come to the secret of the success of this workflow - Final Cut Pro X’s brilliant Multicam system.
Building the Master Multicam clip.
After importing the rushes for one of the participants I would place them into their ‘Virtual Studio’. Very simple really. A graphical representation of the normal studio set. The design team supplied me with a shot for each person and a two-shot for each team with a matte.
I placed the matte above their virtual studio shot and applied the Silhouette Luma blend mode in the Inspector.
I then made a Compound Clip of these two elements and created another Compound Clip to include the rushes for the participant below my Virtual Studio CC.
I positioned the person into their frame and colour corrected the shot. I’d then take that CC and place it into my Multicam Clip.
I could, at any time during the edit, go back into an angle’s CC and make any adjustments to the grading, positioning or audio and it would immediately ripple through to my edits.
The Multicam is very flexible and robust and, quite simply, without it, doing this in the time wouldn’t have been possible.
So, while I was doing this, two of the producers would divide the show into two parts and start compiling a list of the sections of the recording they wanted to use, referencing the time-of-day BITC on their Zoom recording.
After a few hours I’d get both sets of time-codes and then, as my MC had the same start TC as the Zoom recording, assembling the first rough cut could be done in about an hour and a half.
On most edits you would do the clock and end credits last. In this case, they were always the first things I did! In the still above, the bars, tone, clock, titles sequence, music, end credits and end logo are already in place.
A little help from my friends…
Although I was alone in my home office, I needed help with a few things like sound and motion graphics. At the end of each show in the studio there is a crane shot away from the desks and over the audience as the lights are dimmed. The design team created a virtual version of this shot, but there was no way I’d have time to build it each week.
So I called in my friend Dawn Donohoe to take care of that and do any other Photoshop and motion graphics work the team needed that I couldn’t do myself.
All I had to do was supply the five individual shots to length from the end of the show and Dawn would place them and export this end shot to me.
As each person was filmed in their own home there were always issues with background noise or reverb. Again, to alleviate my workload, I asked my dubbing mixer friend Ian Marriot-Smith to do a pass in ProTools on all the rushes’ audio. I’d supply the channel recorded for each person as a mono WAV file.
And finally, there was one more person to get onboard when the question was asked by the production team, ‘James, what happens if you get ill?’
Good question. ‘No show!’, was the answer so I called in fellow FCPX editor Gavin Burridge to shadow my work. Luckily he only lives about 20 mins away from me in south London so getting a back up copy of the rushes to him was easy. This is where PostLab came into play but more of that later…
So, by the end of the day Wednesday I had a fully logged, synced, graded and complete Multicam clip. An assembly edit was done and a link emailed to the producer for the next morning, and all rushes were copied over from the SSD cards onto my Pegasus RAID. A further back up copy of the rushes would be made overnight and couriered to Gavin first thing. All audio was exported to a shared DropBox folder for Ian to work on.
9am. Zoom call with the series producer. Over the next four hours I’d share my screen and we’d get the edit down to around the half hour mark.
At this point every shot would still be the five way split but I’d switch angles here and there and isolate audio where people were talking over each other; a common occurrence due to the audio and video delay during recording. We’d export a cut at half twelve and send password protected DropBox links and frame.io links to the production team.
While awaiting feedback, I’d edit the trail, provide Dawn with the shots for the end crane shot and put in the processed audio from Ian as it arrived back to me over the shared DropBox folder.
I just added this treated audio into the appropriate CC and muted the original audio. So now all the audio and video was ‘post-produced’. All I had to do was complete the show!
The series producer and I would convene again on Zoom for another hour to do further notes from the channel, the producers and legal team before I was left alone to get the show looking like a show.
I’d now import all the materials the team had uploaded to frame.io overnight, making sure everything had keywords and Roles applied.
Then, at about 2pm on Thursday, the real grunt work started. Refining the cut into the fast-paced, smooth final piece would take me 8-10 hours. Every audio edit, laugh, pause and stumble had to be smoothed and cleaned up as I went. Every still and VT had to be graded as I went. Here’s an example of an unedited portion of the show:
And here it is finished:
And a section from the end of the show:
8.30am. Zoom call with series producer. Over the next couple of hours we’d refine the cut further and add/remove sections to get the show to time.
We had to have another set of links ready for the channel and producers to sign off by 11am. Once that link was sent I would watch the show through from end to end on a domestic HD TV set via an UltraStudio Express HDMI feed, checking for any interlacing and audio issues and correcting as I went.
Soon, we’d get more notes to do and then we’d get final sign off around 2pm.
Once approved, the show is already post produced! As the show was due to be broadcast in 7 hours’ time we had to get it to the BBC fast, so I’d export the mix, run it through the Nugen Audio LM2 stand alone app to comply with EBU R128 and lay that mix back into the edit.
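For readers unfamiliar with EBU R128: the target for a programme’s integrated loudness is -23 LUFS, so once a meter such as the LM2 has measured the mix, compliance comes down to a simple gain offset. A tiny illustrative sketch of that arithmetic (the measured value here is made up, and real measurement needs a K-weighted loudness meter, not this code):

```python
# Sketch of the loudness arithmetic behind EBU R128 compliance.
# Integrated loudness target for broadcast programmes: -23 LUFS.

TARGET_LUFS = -23.0

def r128_gain_db(measured_lufs: float) -> float:
    """Gain in dB to bring a measured integrated loudness to -23 LUFS."""
    return TARGET_LUFS - measured_lufs

def db_to_linear(db: float) -> float:
    """Convert a dB gain to a linear sample multiplier: 10^(dB/20)."""
    return 10 ** (db / 20)

print(r128_gain_db(-20.5))        # mix is 2.5 LU too loud -> -2.5 dB
print(round(db_to_linear(3), 3))  # a +3dB bump is roughly a 1.413x multiplier
```

The same dB-to-linear conversion applies to the +3dB per-channel adjustment mentioned in the MXF export notes further down.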
I apply the broadcast safe filter to an adjustment layer over all the video and export a ProRes HQ stereo file. I then copy the file to an ExFAT-formatted SSD drive and await the arrival of a courier to take the file to a facility in central London, who will then play the show down the line to the BBC ready for broadcast.
As a back up I send the file to the facility using the MASV file transfer service.
By 6pm the show is with the BBC and ready to go.
However, by the time the drive is ready, I will have already started on the extended version of the show. An associate producer will have emailed me selects to add a further 12-15 mins of material, and so a mini version of the last 24 hours starts. By the end of Friday I’ll have completed around two-thirds of this cut.
Throughout the morning I’ll have a few short Zoom calls with the associate producer and we’ll get a final cut to the team by around lunchtime. This will eventually be viewed and signed off by early evening - after a few more tweaks.
As this episode doesn’t go out until later Monday night I export a 4 track mono AVC Intra 100 file and produce an AS-11 DPP file using the DPP Metadata app. I upload this file with the paperwork (including a Harding Test) to the BBC Cloud Portal.
I’ll have to deliver Friday’s show this way too so it goes onto the BBC server. I’ll wake up on Sunday morning with an email from the BBC saying the AS-11 files have been accepted and validated and that’s that!
Episode 6 was due to be transmitted on Friday 8th May but that was VE Day and the BBC had scheduled alternate programming that night so the show would go out on Thursday - but still record on a Wednesday. Uh, okay.
So what do you do when you have half the time? Double the editors!
Gavin and I had started using PostLab so he could monitor my progress through the edit. He would download all the elements from frame.io and link up his back up copy of the rushes when accessing my read-only library on PostLab. That way he would be able to take over the edit at a moment’s notice if needed. So here is the timeline we had for getting the show out in half the time.
Wednesday - Show records from 12 noon to 3pm.
All rushes came to me, were imported and then sent with the BM Dock to Gavin’s house. I build the MC clip, import all elements from frame.io and then duplicate my library and post it to Gavin via PostLab. So, this time he had his own library to work with while I worked on mine.
One producer works with me on the first half of the show, one with Gavin on the second half. By around 9pm we each have a fine cut of our halves of the show. We each then start to refine the shots etc.
Thursday - 2am. We are both finished and Gavin uploads his library to PostLab. I open it, create a new event and copy his edit into it. Then I copy that event to my Library and close Gavin’s library. Everything links immediately to my media. Flawless. Even if I do ‘Reveal in Finder’ on a clip from Gavin’s half of the show it links back to the original MC - not a duplicate. Brilliant. I watch the show through, export a link and go to bed at 4am.
9am Zoom call with series producer and hey presto! We are now where we would normally be on a Friday morning. The rest of the day proceeds like a normal Friday and the BBC gets its finished show by 6pm - 30 hours after recording started. Not bad.
Notes and observations
On the Monday morning we had to change a caption on both the Friday show and the Monday show. That meant redelivering both DPPs. The Monday one urgently!
I had to change a caption and the clock but, instead of re-exporting the entire episode, I just exported a file of the revised clock and a file of the revised shot with the new caption and made an insert edit on a copy of the master MXF file using the rather brilliant CineXtools.
Exporting the file and remaking the DPP would have taken around 45 mins but with CineXtools I did it all in about 10.
Export time for a 30 minute HD AVC 100 file on an 8 Core iMac Pro 3 GHz Intel Xeon W 64GB was 15 minutes. On my 2017 5k iMac it was around double that.
For the end credit roller I took the basic one included with FCPX and opened it in Motion to make my own version: HD, 25fps, interlaced, upper field first.
As I’m not a sound mixer I felt reassured having Ian do a pass on the audio, but it’s entirely possible to do all the audio in FCPX. I need to get myself more training! Thanks to Peter Wiggins, though, for showing me around the AU Dynamics Processor in FCPX as a tool to normalise audio.
Thanks to Alex Snelling and his excellent white paper on creating AS-11 files and for helping me out!
More on creating MXF files within FCPX without using a Compressor setting here. Take extra care to note the section where it mentions increasing the audio levels on all channels by +3dB prior to export!
More on using the DPP Metadata app here
Further thanks to Peter Zacaroli at Western Digital, Peter Wiggins, Andy Quested (BBC Tech Guru!), everyone at Lumberjack Systems for all their help and support, to Ripple Training and Ross and his fantastic team at Electric Robin.
Thanks again to Gavin, Dawn and Ian! To Sam Ames, my superb Production Manager and everyone at Hat Trick Television for their brilliance and professionalism. And thanks for the use of stills from the show.
And thanks to my wife and family for their patience and love. x
Products and services I used are here:
I graduated from the London Film School in 1992 and got my first job as an assistant editor on a BBC drama shot and edited on 16mm film.
After that I started working at a trailer house in Soho, London - initially doing post production on 35mm but soon started editing trailers on AVID.
After 18 years there I went freelance and have been based at home ever since. I also have 6 years’ experience doing stand-up comedy and have written for BBC radio.