8K60p REDRAW, a Mac Pro, a Pro Display XDR, Final Cut Pro and the legendary Nile Rodgers. Another superb FCP user story from Ronny Courtens.
In 2017, we published a case study with Tom Sefton from Pollen Studio about how he used Final Cut Pro in the production of the groundbreaking Forever Project, which allowed live audiences to interact in real time with life-size photorealistic 3D recordings of Holocaust survivors.
This award-winning project was so successful that a new company was founded with the goal of creating a new kind of virtual experience, one that could revolutionize how music stars and other famous people interact with their fans:
Just imagine you could sit down with any of your favorite artists in a virtual room and talk with them about their work and life, asking them questions and getting immediate answers, feeling their presence and emotions as if you were having a real-life conversation with them.
Forever Holdings and its associate on the project, Universal Music UK, invited Nile Rodgers to take part in this ground-breaking project. Created in collaboration with the National Portrait Gallery, the experience is being called “the world’s first voice interactive digital portrait”:
- One year of preparation and testing. Recording Nile Rodgers in 8K60p 3D REDRAW in one of the largest sound stages in London. Another year of post production, reviewing and fine-tuning, collaborating with people all over the world in the midst of a global pandemic.
- Organizing, editing and grading eight hours of stereoscopic 8K60p rushes playing in full quality in Final Cut Pro. Delivering hundreds of 8K stereoscopic SDR and HDR masters for voice interactive VR installations and online platforms. Sharing FCP metadata and assets between several teams specialized in 3D video production, speech recognition, AI driven natural language processing, and advanced immersive VR technologies...
How did the project evolve from start to finish? How did Final Cut Pro and Mac hardware contribute to the process? And were the producers able to achieve their ambitious goals? To get answers to these questions, I sat down again with Tom Sefton who managed the 3D production and directed Nile Rodgers during the shoot.
What was your reaction when you heard you were going to work with Nile Rodgers?
Tom: I was stoked! I mean, he’s the guy who co-founded the world-famous band “Chic”. He wrote and produced mega hits like “Let’s Dance” for David Bowie, “Like A Virgin” for Madonna, “The Reflex” for Duran Duran, just to name a few. He has worked with Diana Ross, Mick Jagger, INXS, Bryan Ferry, Lady Gaga, Daft Punk... 3 Grammys and 50 million album sales... His career and his life are truly exceptional. I mean... yeah!
You already had a very extensive experience with interactive VR projects. Did this project present any new challenges to you?
Tom: Oh, absolutely!
- We would work 8 hours with Nile, spread over 2 days. In that time, we needed to film him answering no less than 350 questions collected from research into his life, people he'd worked with, his fans... And we needed to enable the dozens of people responsible for the project to review and evaluate every session right away, both on set and remotely.
- This project was intended to be viewed on multiple platforms: from physical VR installations with HTC Vive Pro headsets to interactive websites and much more. To produce the best possible quality for the Vive Pro VR installations, we wanted to shoot in 8K60p. So we would have to organize, tag, align, edit, grade, QC and review around 8 hours of stereoscopic 8K60p footage and render out many hundreds of stereoscopic master clips in different formats and color spaces. More than we ever did before.
We started our research in April 2019. Our goal was to make a recorded human appear perfectly lifelike and have the best possible emotional connection to a person watching it. Over the course of 9 months, we tested different cameras from ARRI, Sony, Blackmagic and RED with different setups and different backdrops. We finally decided on using a RED Helium 8K 3D rig for the shoot.
To get a perfectly clean image of Nile in VR, we wanted to de-noise the background in all of the footage and export the processed footage to ProRes 4444, our final delivery format. Noise reduction is a very time-consuming process, so we tested how much time this would take.
For our tests, we had a fully maxed-out 18-core iMac Pro with an external Radeon VII GPU. With 2 nodes of NR, we got around 0.5 fps of rendering speed going from 8K60p REDRAW to 8K60p ProRes 4444. When you extrapolate that figure, the entire de-noising and rendering process would have taken more than 3 months!
But two days before I packed the van to go to London for the shoot, our 2019 Mac Pro arrived: a 16-core model with 192GB of RAM, dual Vega II GPUs plus an extra Radeon VII card, and an Afterburner card. We tested rendering again and all of a sudden we jumped from half a frame per second to something like 13 frames per second!
Using the new Mac Pro, our NR processing and rendering time suddenly went from 3 months to around 3 weeks. I was flabbergasted. It’s the first time in years that a piece of hardware has just blown away my expectations of what it would do.
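As a back-of-envelope check of those figures, the raw render math can be sketched like this (treating the two eyes as separate streams; the real schedule was longer because it also included exports, reviews and re-renders):

```python
# Rough render-time estimate for de-noising 8 hours of stereoscopic
# 8K60p footage, using the fps figures quoted in the interview.
# Assumes the left and right eyes render as two separate streams.

HOURS_OF_RUSHES = 8          # hours of footage per eye
FPS = 60                     # 8K60p
EYES = 2                     # stereoscopic: left + right
SECONDS_PER_DAY = 86_400

frames_per_eye = HOURS_OF_RUSHES * 3600 * FPS
total_frames = frames_per_eye * EYES

def render_days(render_fps: float) -> float:
    """Wall-clock days to push total_frames through NR at render_fps."""
    return total_frames / render_fps / SECONDS_PER_DAY

imac_pro_days = render_days(0.5)   # iMac Pro + Radeon VII: ~0.5 fps
mac_pro_days = render_days(13.0)   # 2019 Mac Pro: ~13 fps

print(f"{total_frames:,} frames total")
print(f"iMac Pro @ 0.5 fps: {imac_pro_days:.0f} days of pure rendering")
print(f"Mac Pro  @ 13 fps:  {mac_pro_days:.1f} days of pure rendering")
```

At 0.5 fps the pure render time alone comes out around 80 days, which lines up with the "more than 3 months" estimate once exports and QC are added; at 13 fps it collapses to a few days of machine time, with the remaining weeks going to review and re-renders.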
We filmed Nile in late February 2020. We had rented an enormous sound stage at Malcolm Ryan Studios, one of the major studio facilities in London. The stage is so big you can get like a double decker bus parked in it.
Nile was on his own in the middle of the stage with blackout everywhere and a black floor and, I mean, it was amazing. We also had our Mac Pro and the Pro Display XDR on set. I played some 8K HDR test footage of my daughter on the Apple display and it looked absolutely gorgeous.
We used dual RED Helium cameras with the 60fps upgrade. Both cameras had their own additional cooling systems. The cameras were aligned using our custom-built 3D rig that allows for minute adjustments of camera alignment and inter-ocular distance.
How did you manage the audio?
Tom: We had a lot of microphones. Some were for the crew to communicate, some were for Nile and his guitar because he also plays some music as part of the VR experience. Our head of audio, Martin Hudson, did a brilliant job of organizing a live recording desk which allowed wireless communication alongside hard drive recording of the raw files. We had a 24-channel Allen & Heath desk and we actually used every input and output channel.
The audio stems were recorded in Logic Pro running on an iMac Pro. The Allen & Heath desk also lets you plug in a fast SSD with a USB 3 connection, such as the OWC Envoy Pro. We used this option to record all of the raw audio as another safety backup.
We recorded each session onto RED Mini-Mags, 24 mags in total. Right after every recording, our DIT used Hedge to back up the cards onto several RAIDs. We knew we would not have time to import the RED clips into Final Cut Pro between sessions, so we used a Blackmagic SSD recorder to record live 1080p30 stereoscopic proxies with audio. We used these proxy clips in Final Cut Pro for immediate review on set and for the initial edit layout.
I had created a Final Cut Pro Library for the entire production. As soon as the proxy files came in, I imported the stereo files into a separate Event for each session and we could immediately output a full stereoscopic 3D signal from the Mac Pro to a 4K 3D monitor.
My only regret was that Nile would not see the full-quality 3D image during the shoot. On the evening of the first day, I decided to prepare one shot in 8K that would show our work in all its glory. I knew the workflow: I could just export a short 8K 3D segment of a session from REDCINE-X and bring it into Final Cut Pro. I used the Noise Reduction filter in Final Cut Pro to eliminate the background noise and it did an amazing job; the image was crystal clear. Then I graded the stereo clip using the Color tools and set up the Right and Left eyes for 3D playback.
The next morning, Nile walked onto set. He got changed and then he just came and sat next to me to have a look at what we had prepared. The segment showed him answering a question while throwing his hands out towards the screen. As soon as he saw the movement in full 3D on the monitor, his face transformed and he smiled and he was like “I get it, that looks impressive”.
That was the best feedback we could have expected, Nile was happy. We finished the shooting and started prepping for an intensive post production process.
Back home, our DIT exported 4K ProRes 30p stereoscopic files from the raw 8K RED footage. We would use these to start editing and reviewing the footage in Final Cut Pro on our MacBook Pros, while he used the Mac Pro for the time-consuming noise reduction of the 8K RED footage, exporting the processed files as 8K60p ProRes 4444.
How would you manage to relink the 8K60p footage to your 4K30p edits?
Tom: I love working with multicam and compound clips. And this is the beauty of multicam in Final Cut Pro: Create a 60p multicam from your 30p proxy footage, it will play perfectly. When you get the 8K60 files, just open the Angle Editor, add an angle, drop the 8K60p file into that angle and align it with the 30p clip using the clapper board as a reference.
Now you can instantly switch between 4K30p and 8K60p just by changing the Active Angle in the multicam. You can use the same method to add the mastered audio clips to the edited multicam clips later on.
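Under the hood, a multicam clip like the one Tom describes is just a container of angles, which you can see if you export the project as FCPXML. The sketch below is a simplified, illustrative fragment only (file paths, IDs and names are made up, and attribute details vary between FCPXML versions), not a complete importable file:

```xml
<!-- Illustrative FCPXML fragment: one multicam with a proxy angle
     and an online angle. Paths and IDs are hypothetical. -->
<fcpxml version="1.8">
  <resources>
    <format id="rProxy" width="3840" height="2160" frameDuration="100/3000s"/>
    <format id="rOnline" width="8192" height="4320" frameDuration="100/6000s"/>
    <asset id="aProxy" src="file:///Volumes/RAID/proxy/session01_4K30.mov"
           format="rProxy" hasVideo="1" hasAudio="1"/>
    <asset id="aOnline" src="file:///Volumes/RAID/online/session01_8K60.mov"
           format="rOnline" hasVideo="1" hasAudio="1"/>
    <media id="mSession01">
      <multicam format="rOnline">
        <mc-angle name="4K30p proxy" angleID="A1">
          <asset-clip ref="aProxy" offset="0s" duration="3600s"/>
        </mc-angle>
        <mc-angle name="8K60p online" angleID="A2">
          <asset-clip ref="aOnline" offset="0s" duration="3600s"/>
        </mc-angle>
      </multicam>
    </media>
  </resources>
</fcpxml>
```

Switching the Active Angle in the editor simply changes which angle the multicam clip instances reference, which is why the 30p-to-60p "relink" is instant and non-destructive.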
We started editing a few weeks after the shoot, and then came COVID. If the pandemic had started 4 weeks earlier, before the shoot with Nile, it would have been the end of this project. We knew we would have to find a way to collaborate remotely between myself and my assistant editor, so we decided to use Postlab. And it was just in time.
On a Sunday evening, we got a warning from the government that the country would go into full lockdown the next day. So I rushed to the studio, disconnected a RAID with a dupe of the 4K ProRes footage and took it back to my house with a MacBook Pro, literally like the last chopper out of Saigon. I left the Mac Pro in the studio to render out the processed RED footage and everyone started working from home.
The powerful metadata tools in Final Cut Pro allowed us to quickly organize and tag many hundreds of assets and to include that information with the video files we sent for review to a large number of collaborators working from home.
Besides Keywords and Notes, Markers were extremely powerful for visually tracking and tracing each individual asset across a project with 8 hours of stereo footage. We also added captions to each clip in Final Cut Pro as a visual support.
The lockdown ended in May and we could go back into our studio. We had done a lot of work in Postlab. I’d pretty much got all of the sessions edited, I had assets ready to be assigned, we knew what was going where. Gradually, we started to “relink” the 4K ProRes proxy footage to the noise-reduced 8K ProRes 4444 clips. We mastered the audio stems in Logic Pro, adding subtle layers of noise reduction, compression, leveling and normalizing, and brought the mastered files into Final Cut Pro.
I now had a Final Cut Pro Library that I could use with the 4K ProRes clips if I wanted to work on my MacBook Pro. Or with the 8K60p ProRes 4444 clips when I had the Mac Pro. The Mac Pro has an Afterburner card inside and we could play the stereo 8K60p clips side by side in realtime at Best Quality without any issues.
The performance of Final Cut Pro running on the Mac Pro is jaw-dropping.
We had several remote edit sessions on Zoom with our VR producer and programmer to try out different takes and reactions in VR. Finally, we used Final Cut Pro with the Pro Display XDR to grade the footage for different deliverables. A general grade was not ideal, it was way more accurate for us to actually eyeball the clips in Final Cut Pro and use the XDR display and the waveforms to grade our footage. And it was brilliant, it just worked really well and when you look at the footage on the XDR monitor, the quality is just stunning.
To deliver HDR masters, we needed to create a Wide Gamut HDR Library. So we duplicated the original Library and set the color space of the duplicate Library to High Dynamic Range. That was the only hairy moment we had with Final Cut Pro. When we changed the color space, we got a long spinning beach ball. But 20 minutes later, 8 hours of stereo rushes had been duplicated into a different color space! We delivered hundreds of stereo master clips to the VR specialists in different versions and different color spaces.
Once the VR experience for the Vive Pro was ready, we started testing it with an audience. Due to COVID, we could only bring one person at a time into our office in York to watch the experience on the HTC Vive Pro. The Vive was then cleaned and left for 24 hours before the next person could come in. The feedback we got was amazing. We had people who literally thought Nile was going to come and sit on their knee and they were like: this is so real, it’s scary! I am going to blow our own trumpet because it is simply spine-tingling when you see him in the Vive Pro headset. He is really there in front of you and he sits down and you actually can feel he is there with you...
Unfortunately, it will take some more time before audiences can watch the 3D experience in physical VR installations. Luckily, Innovate UK were very helpful in enabling us to deliver the project as a shorter web-based interactive experience for audiences at home while we wait for things to return to normal.
Looking back on this project, how do you feel?
Tom: It's been a fantastic experience. Nile is an absolute gentleman - warm and funny, captivating and friendly - and he didn't once complain about the very long filming days with few breaks. I can't compliment him enough, other than to say I was a fan before and a bigger one after. Not only is his impact on popular music unfathomable, he is a forward thinker and keen technologist. He really got what we were doing, and I was delighted to have him sign our Mac Pro!
Final Cut Pro, the Mac Pro and the Pro Display XDR have been essential in getting our project delivered on time.
Many thanks to Tom Sefton for taking the time to explain his workflows. Also a big thank you to Katie Blake / Bright White Ltd. for the beautiful photos that were taken on set.