McC VR-01-Jaunt-page

Every month new VR videos appear that push the boundaries of what this medium can do. This is the story of how editor Duncan Shepherd put Final Cut Pro X at the heart of ‘Pure McCartney VR,’ a series of five innovative Paul McCartney VR video films made by Jaunt.

 

FCP.co is very pleased to be able to publish the story behind the making of Paul McCartney's 360 videos. But how did Paul get involved with 360 production?

Cliff Plumer, president of Jaunt Studios, explains:

'Paul has been a big believer in VR ever since Jaunt filmed a concert of his at Candlestick Park. When it came time to launch the box set of his music in June, it seemed like a good opportunity to team up again to promote his music.

I’m old enough to remember buying albums just for the art, so why not do something more creative, leveraging Paul’s storytelling techniques, reminiscing over his career. We wanted to put those stories into 360 and create something special, get the viewer to watch more than once and discover something new.'

 

 So the challenge was set. Alex Gollner talks to editor Duncan Shepherd:

  

What is VR video? If you take a look at any of these five short films, you can look in any direction inside a sphere of video: straight ahead, to your right, behind, below or above you. 

Apple says that Final Cut Pro X provides “Unprecedented power for the next generation of post.” It was Final Cut’s real-time ability to play back large numbers of rendered high-resolution images and videos on top of a high-resolution video background without delay that made it the right tool for this job. Also important was the wide range of third-party workflow tools and features unique to Final Cut such as ‘roles.’

Paul McCartney was filmed in long VR video takes that lasted for almost the entire length of each film. To these background spheres of video, director Tony Kaye and editor Duncan Shepherd overlaid animations, graphics, pictures and normal ‘flat’ videos.

Many videos and most films have surround sound. We are used to hearing sound as if it comes from any point: a 360° soundscape. In the case of these videos, Kaye and Shepherd’s strategy was to fill out the 360° ‘viewscape’ - wherever viewers can look, there is something to see. These overlay elements were generated in other applications. They were positioned, timed and combined in Final Cut Pro X to work in sync with McCartney’s words and music.

(View in Chrome for VR)

 

The shoot

The films were commissioned to promote and be part of the ‘Pure McCartney’ collection of songs and videos. The initial idea was to return to where his music videos were created. This evolved to shooting all the films in a single location: a room where McCartney might go for songwriting inspiration:

McC VR-02a-Set-Fisheyes

McC VR-02b-Set-Elements

 

An important aspect of staging the shoot was to allow space for content that would be overlaid onto the full sphere of video. For example, white and blue screens could either show what was visible on the day (imagery created using on-set projectors) or be overlaid with new content.

 

Editor Duncan Shepherd:

Just one shoot, no re-shoots. I think it was only two and a half hours. We had a very, very small window in which to capture him. It was quite low impact. There was a crew of 16. From Paul's point of view he just sort of stood around waiting for Tony to say, “Okay, now talk to me about this,” and he would talk for 10 minutes and Tony would say, “Great, amazing. That's great. Carry on.” Off we went. There was a couple of takes where Paul just sat and jammed on his keyboards or guitar, and he used the mandolin to illustrate the inspiration behind “Dance Tonight”. We had about 90 minutes of footage that actually came out of the VR camera, of which we used roughly 35 minutes in the end.

McC VR photo 01

 

Paul was interviewed five times by Tony Kaye, who was nearby but not always visible in the 360° space. Tony has been a huge fan of Paul and the Beatles for as long as I've known him (since the 80s), and he was able to develop a friendly rapport with Paul, which allowed Paul to be himself. He's genuinely charming and interesting as a subject, and has led a unique and storied life. We ended up with great footage: Paul talking about writing songs, his time with The Beatles, or making a specific music video. We brought that footage to Los Angeles, reviewed the dailies in the VR goggles, and started immersing ourselves in the experience to figure out how to make what we’d captured into VR films.

Tony was adamant that we take nothing out of what McCartney said or did; he just wanted absolutely everything in. He considered it to be like a simpler work on Paul McCartney. It was a matter of making space in the 360 world for 2D content: you've got to fill every direction someone might look.

Gradually, through listening to the interviews and realizing that each one clearly had a different feel because it was about a different song or a different video, Tony started to think in terms of what we could fill out the 360 world with, given what Paul was talking about.

McC VR photo 04

 

For example, one of the films is about the song Dance Tonight, which Paul describes in terms of his liking for visiting guitar shops and the time he bought a mandolin. He talks about learning to play it, realizing that it had a peculiar tuning, and then writing a song for his daughter on the mandolin. He talks you through the creative process of that songwriting and then making the video with Michel Gondry.

 

In that instance, we latched on to the Michel Gondry aspect and injected little bits of cutout cardboard graphics and cool diagrams (as an homage). The kind of things Michel puts into some of his music videos and feature films. So we show people how Paul played the song and little bits of fun graphics of London imagery such as double-decker buses and policemen - in a Michel Gondry style. 

Art direction was provided by Tony Kaye’s designer Alex Kohnke. Alex would think of what tone was needed, and would design something in 2D in Adobe Photoshop for us. We’d look at it and try and work out a way to make that work in the 3D space. 

We’d either give it to our 3D spatial designers, Nick and Zach Young of MachinEyes, or we’d do it in Final Cut Pro X and then we’d give it to the designers or we’d just do the whole thing in Final Cut using the Dashwood VR tools, depending on what needed to be done. 

Each one started to develop its own flavour. Tony would say, “Well I want Early Days to be about light. I want it to be the lights coming out of the projectors.” We would gradually work our way towards a solution that was almost like 3D light painting - you know, the light-painting technique Picasso used to do with photography - trying to think of a way to illustrate that, but in a three-dimensional world.

My Valentine was about the process of making a film, so that one became quite cinematic.

Mull of Kintyre was really a story about writing the song, and Paul and Linda raising their family in Scotland. 

 

Linda took loads of wonderful photos of that time, and we took the opportunity to raid the internet for assets. Once we settled on the ones we wanted, we sent Paul McCartney’s archive a detailed stills request list that we’d produced from FCPX using Clip Exporter. They provided high-quality versions that we could use in the final edit.

Ultimately, everything had to have a sort of overriding design and narrative idea, then within those ideas, we’d keep filling out the shots. Rather than being restricted by the VR environment, we just went with the creative idea that Tony wanted, and would make the VR system work with that. We tried to impose an artistic overlay on to the VR system.

 

VR video preview

Once the footage was in Final Cut, the team viewed the clips playing back from the timeline using the free Dashwood 360VR Viewer app. This application adds a window overlying the FCPX UI to simulate what viewers would see when looking around the VR video sphere in a web browser, in an app, or on a mobile phone. For desktop use, they could scroll around with a mouse to look in any direction. They also used the application with a developer version of the Oculus Rift VR headset (the DK2) attached to their Macs to simulate the experience in Oculus Rift, HTC Vive and other commercial ‘Head-Mounted Displays.’ 

Duncan Shepherd:

I think it is pretty much mandatory to review edits and outputs in a headset to get any real sense of viewer experience of the edit.

 

The VR viewscape in Final Cut Pro X

Here’s a frame from the original shot of ‘Dance Tonight’ showing what the Jaunt camera captured:

McC VR-03a-Mono-no-overlays

 

The individual cameras in the Jaunt rig were pointed outwards from the centre of a sphere to capture the view in all directions. Stitching software then takes the multiple camera shots and combines them into a sphere of video. In the case of the Jaunt system, this software runs on their servers.

McC VR photo 05

 

As video editing software cannot yet interpret a sphere of video, the stitching software maps that sphere onto a rectangle. This ‘equirectangular’ mapping is the same one we use to convert the features of the Earth - a sphere - into the maps we see on flat paper and screens. Final Cut is designed to work with rectangles of video. In this case, the sphere of video is mapped onto a rectangle, so any images or videos overlaid onto that rectangle appear pasted onto the inside of the sphere.

The convention is that the centre of the equirectangular frame is what is straight ahead when you start watching the video, the top line of the rectangle is what is directly above you should you decide to look up, and the bottom line of the rectangle is what is directly below you. As the left- and right-hand edges of the rectangle will be wrapped around and joined together to make the back of the sphere of video, that seam represents what you see when you look directly behind you.
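As a concrete illustration of that convention, here is a minimal Python sketch (purely illustrative, not part of the production workflow) that converts a viewing direction - yaw left/right, pitch up/down - into pixel coordinates in a 3840 x 1920 equirectangular frame:

```python
def direction_to_equirect_pixel(yaw_deg, pitch_deg, width=3840, height=1920):
    """Map a viewing direction to a pixel in an equirectangular frame.

    yaw_deg:   0 = straight ahead, +90 = to your right, +/-180 = behind you
    pitch_deg: 0 = horizon, +90 = straight up, -90 = straight down
    The frame centre is 'straight ahead'; the left/right edges meet behind you.
    """
    # Horizontal: -180..+180 degrees of yaw spread across the full frame width.
    x = (yaw_deg / 360.0 + 0.5) * width
    # Vertical: +90 (up) maps to the top row, -90 (down) to the bottom row.
    y = (0.5 - pitch_deg / 180.0) * height
    return int(x) % width, min(max(int(y), 0), height - 1)

# Straight ahead lands in the centre of the frame:
print(direction_to_equirect_pixel(0, 0))    # (1920, 960)
# Looking directly behind you lands on the left/right seam:
print(direction_to_equirect_pixel(180, 0))  # (0, 960) - wraps to the edge
# Looking straight up lands on the top line:
print(direction_to_equirect_pixel(0, 90))   # (1920, 0)
```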

As these were 3D films (known as ‘Stereo’ in post production), there is a different view sent to each eye, which produces the feeling of 3D depth as you look around inside the sphere. As with normal human vision, there is more difference between the left and right views for objects closer to the camera than those objects that are further away. In the 3840x3840 square of 3D VR video that FCPX works with, the 3840x1920 view for the left eye is placed above the 3840x1920 view for the right eye. 
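To make the over/under layout concrete, here is a small sketch (an assumption-level illustration using NumPy, not any tool named in this article) that slices a decoded 3840 x 3840 stereo frame into its two per-eye views:

```python
import numpy as np

def split_over_under(frame):
    """Split a 3840x3840 over/under stereo frame into per-eye views.

    frame: NumPy array of shape (3840, 3840, 3) - rows, columns, RGB.
    Returns (left_eye, right_eye), each 1920 rows tall and 3840 wide.
    """
    half = frame.shape[0] // 2
    left_eye = frame[:half]    # top half of the square frame
    right_eye = frame[half:]   # bottom half
    return left_eye, right_eye

# Example with a dummy frame:
dummy = np.zeros((3840, 3840, 3), dtype=np.uint8)
left, right = split_over_under(dummy)
print(left.shape, right.shape)  # (1920, 3840, 3) (1920, 3840, 3)
```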

McC VR-03b-Stereo-no-overlays

 

The workflow also allowed for the 2D illustrations and pieces of video to be positioned and rotated in 3D space within the VR sphere. This was initially done in Adobe After Effects and Houdini and overlaid in Final Cut. As production continued Shepherd did more and more of the 3D animation within FCPX using Dashwood Cinema Solutions’ 360VR Toolbox plugins. 

When viewed through a VR headset or on a phone using VR video-enabled playback software, the source frame looks like this in three different directions:

McC VR-03c-Direction1-no-overlays

McC VR-03d-Direction2-no-overlays

McC VR-03e-Direction3-no-overlays

 

Shepherd and the rest of his team then added pictures, animation and film overlays so that the same frame looked like this:

McC VR-03f-Mono-overlays

 

The view through a headset then looked like this:

McC VR-03g-Direction1-overlays

McC VR-03h-Direction2-overlays

McC VR-03i-Direction3-overlays

 

Here is the Final Cut timeline showing that moment in the film:

McC VR-03j-timeline

 

The background sphere of video is the ‘SHT060’ clip. All the screen content has been combined into a compound clip named ‘DANCE TONIGHT SCREENS COMP’ - this content is designed so that it seems that it was being shown on the various screens visible in the 360° view. The animated overlays that seem to float in the air in front of the screens and McCartney are combined in the ‘4.3 DANCE TONIGHT v6a Clip.’

Here is Shepherd’s diagram of the workflow:

McC VR-04-workflow

 

Storing low-bandwidth and high-quality media files in compound clips

The films were shot on a Jaunt One VR camera. VR video cameras are actually multiple cameras arrayed in a ‘rig.’ Each individual camera records footage that must be combined to form the sphere of video being captured. This process is called ‘stitching.’ Sometimes this is done in an application running on a PC or Mac. In the case of Jaunt, servers they control do the stitching remotely.

As the stitching process is complex, even on high-end servers it can take a long time to produce stitches that are seamless. In most VR video workflows, all the footage is first combined using a ‘quick stitch’ method. These not-quite-seamless video spheres are good enough to make editing decisions with. Once Kaye and Shepherd determined which parts would be used in the final production, full-quality high-resolution stitches were generated for only those parts.

That meant Final Cut needed to work with two versions of each VR video shot: a low-bandwidth 2304 x 2304 ProRes quick-stitch version that could be used efficiently by Final Cut while editing, and the full-quality high-bandwidth 3840 x 3840 version used when closer to final output.
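As a rough sketch of how such a pair of versions could be produced - the file names, frame sizes and ProRes profile below are illustrative assumptions, not Jaunt’s actual stitching pipeline - a full-quality stitch could be downscaled to a light editing version with ffmpeg driven from Python:

```python
import subprocess

def make_edit_proxy(full_stitch, proxy_out):
    """Downscale a 3840x3840 stitch to a 2304x2304 ProRes Proxy file.

    Hypothetical example: paths and settings are illustrative only.
    Requires ffmpeg to be installed and on the PATH.
    """
    subprocess.run([
        "ffmpeg", "-y",
        "-i", full_stitch,         # full-quality 3840x3840 master
        "-vf", "scale=2304:2304",  # resize to the lighter editing size
        "-c:v", "prores_ks",       # ProRes encoder
        "-profile:v", "0",         # 0 = ProRes Proxy profile
        "-c:a", "copy",            # keep any audio untouched
        proxy_out,
    ], check=True)

# make_edit_proxy("SHT060_full.mov", "SHT060_proxy.mov")  # hypothetical file names
```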

In conjunction with LA-based freelance Final Cut Pro X workflow consultant Darren Roarke, Shepherd developed a process where the quick stitches could be used until the full-quality versions were generated. 

Shepherd and Roarke put each quick stitch into its own Compound Clip. Compound clips are mini-timelines that can be used on the main timeline as self-contained clips. While the main edit was being roughed out, each compound clip contained only the preview version of the stitch. Eventually the team were working with final-quality stitches. At that point the compound clips were used to hold either low-bandwidth proxy versions of the final stitches that Final Cut could display easily, or full versions for when the main timeline was shared with collaborators or exported for distribution.

 McC VR photo 06

 

Keywording to store interview transcription

In order to make the overlays match what McCartney was saying, Shepherd used Final Cut’s Notes metadata field to associate specific words with specific time ranges in clips. 

The team selected short time ranges of each compound clip associated with a spoken sentence. These were given a keyword - in this case ‘**LINE.’ When the browser was set to show segments of any media with the **LINE keyword, the video of each line appeared as a separate clip. For each of these clips Shepherd entered a transcription of the sentence spoken in its Notes field. This could be done quickly using macOS’s speech-to-text feature.

Once the transcription was entered, any piece of dialogue could be found using the search feature of the media browser. With the **LINE keyword being displayed, the team could type in a word or phrase in the search field to find all the clips where that was said. 

Because the note associated with each favourited clip is also carried into the timeline, any piece of dialogue could be found using the search field in the timeline index. This meant the team could quickly jump to the point in the timeline when that dialogue was said.
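The same transcription idea can also be applied outside Final Cut. The sketch below is a hypothetical illustration: it assumes an exported FCPXML file in which each clip carries its Notes text in a child <note> element (adjust the tags for your own exports), and simply prints every clip whose note contains a search phrase:

```python
import xml.etree.ElementTree as ET

def find_dialogue(fcpxml_path, phrase):
    """Print clips in an exported FCPXML whose note text contains `phrase`.

    Assumes clips carry their Notes text in a child <note> element,
    as in a typical FCPXML export; adjust the element names as needed.
    """
    tree = ET.parse(fcpxml_path)
    phrase = phrase.lower()
    for clip in tree.iter():
        note = clip.find("note")
        if note is not None and note.text and phrase in note.text.lower():
            print(clip.get("name", "<unnamed clip>"), "->", note.text.strip())

# find_dialogue("dance_tonight.fcpxml", "mandolin")  # hypothetical file name
```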

McC VR-05-transcription-in-notes

 

Editing

Duncan Shepherd on his collaborators: 

Probably the most important creator was Alex Kohnke. If I send him some interesting stills from a video, he'll print them onto paper and then paint on them with acrylic paint, and then photograph those paintings and send those back to me. Or if I say to him, "I want some Michel Gondry lettering," he'll cut out some letters of cardboard, hang them off a coat hanger or a fishing line against a green screen background, paint them himself at home, and then photograph them and give me a thousand images of what he's photographed. Out of those thousand, there'll be a hundred where the letters are in the right order and they are swinging around the right way. He's a very tactile designer, and that was part of the brief. There's a photographer called Peter Beard who does artwork where he draws over the pictures and adds found objects and sticks and bits of string and stuff like that. That was a look we were going for.

Alex would produce either graphical guides - he'd build a still frame that looked nice to him and we'd try and copy it - or he'd produce elements that we could throw into our cuts. He might make a paper cut-out of a dove and animate it by unfolding it one frame at a time, just capturing it on a stills camera. He'd paint a little bit and take a picture, then paint another little bit, and we'd turn these things into little animations - for a one-second animation we might get 50 or 60 frames or something. We'd mess around with that.

We'd get from Alex all sorts of bits of media. He'd give us ideas, and it was up to us to make – this is one of the reasons Final Cut Pro X is so good at this sort of thing. You could just throw all of this stuff into FCPX, and it just kind of consumes it.

There's a few file types that it will choke on, but on the whole, layered Photoshop files and PNGs, and stills and all sorts of stuff, we would just keep throwing into the machine from time to time. My workflow with stills is if you were to give me 60 stills, at a certain resolution, I'll make an edit at the same resolution, and I'll put all those frames on, so they are one frame long. I’d export that as a ProRes file at whatever whacky frame size Alex shot at. 

Apart from Tony Kaye, Alex was probably our most significant artistic collaborator. Tony wanted Alex's art to come through in all aspects of the visual design of the work we were doing.

After roughing out in FCPX, Nick and Zak Young of MachinEyes would take these elements and build a 3D world in Houdini. They’re really great guys, very creative and energetic, good people to have in the room.

Whether it came directly from Alex or went through MachinEyes, it would all finally come back into Final Cut, to be either positioned, re-timed or just to be tweaked, polished or sometimes just put on top of the timeline as a ProRes 4444 render with alpha. 

 

This 3D animated layer would then be overlaid on top of the live 3D footage of McCartney in Final Cut. As the live footage was a sphere of 3D video, any content overlaid on it requires adjustment in Z space to ensure there are no “clashes” of depth perception between the live footage and added elements. Overlays are effectively set on a concentric sphere inside the live footage sphere.
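One way to reason about those depth “clashes” is angular disparity: an element placed at a given virtual distance produces a particular left/right eye offset, and an overlay must read as closer to the viewer than the live footage behind it. The sketch below is only that back-of-the-envelope calculation, assuming a typical 6.4 cm interocular distance; it is not how the production tools computed depth:

```python
import math

def angular_disparity_deg(distance_m, interocular_m=0.064):
    """Approximate left/right eye angular disparity for an object at a given distance.

    A typical human interocular distance of 6.4 cm is assumed.
    Closer objects produce a larger disparity (a stronger 3D effect).
    """
    return math.degrees(2.0 * math.atan((interocular_m / 2.0) / distance_m))

# An overlay 1.5 m from the viewer versus set walls roughly 4 m away:
print(round(angular_disparity_deg(1.5), 2))  # ~2.44 degrees
print(round(angular_disparity_deg(4.0), 2))  # ~0.92 degrees
# The overlay's larger disparity keeps it reading as 'in front of' the live footage.
```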

 

Duncan Shepherd:

Eventually we'd get to a stage where we'd send a version off to Paul. If he liked it, it would be colour corrected in Assimilate Scratch by Dave Franks. Dave taught me a lot about stereo imaging, and did his best to prevent me from breaking people’s eyeball muscles!

On a day-to-day basis I primarily work with Brian Zwiener, who's my assistant editor. We work very closely with each other on lots of projects. Brian's somebody that I trust in the FCPX world. Over the last couple of years, we've done a few projects together for advertising agencies and we did some films for Samsung, and I'm a big advocate of giving assistants a proper job to do. I like to see what people can do if you say, “Can you edit this job? Off you go and start making selects.” I like it when they grab the whole job, because I think it brings the best out in people. I gave Brian parts of Mull of Kintyre and he did parts of Coming Up – he shared a lot of that creative work with me.

McC VR photo 02

 

Audio

The Frame.io content review service was used to send updates to the sound team:

We were lucky to get Geoff Emerick, who was Paul's recording engineer for most of his career with the Beatles, has done a lot of stuff with Wings, and is very well known in audio mixing and engineering circles. He had such a good relationship with Paul that we were able to get the stems of his songs, which is unheard of in the recording world - they are the ‘golden goose’; you never give anybody the stems because they can remix all your albums and release bootlegs. With Geoff onboard, he and Tony worked with Luke Bechthold at Subtractive to mix immersive soundscapes that had our dialogue and some evocative bits of music that were unique - sounds that Geoff had come up with out of Paul's archive, or had found or re-used in some way that hadn't been used before.

Geoff and Luke would arrange these audio stems inside the virtual reality soundscape. There are different kinds, but they were generally mixing for Dolby Atmos. Instead of being 5.1 or 7.1, Dolby Atmos can make sound come from anywhere, in such a way it might as well be called Dolby 128.1. The audio side was handed off to them in its entirety because it was so much down to Geoff. In a sense Geoff Emerick was the designer: like Alex Kohnke was the designer of the videos, Geoff was the designer of the soundscape, because he really knew Paul and had access to Paul’s music. Luke put that together in the same way I put the video side of things together with FCPX; he worked in Avid Pro Tools. Geoff and Luke had help from Dolby; they've made a film about that process.

 

FCP X’s Roles feature for collaboration

Editors on complex projects need to share their work with other professionals. During production, parts of timelines need to be shared with visual effects suppliers and sound designers. Towards the end of production, colourists and music editors need to get precise versions of the edit. When the editor’s job is over, they need to pass on their timeline in various formats depending on what device the video will be seen on. Usually this means different TV station delivery standards or streaming service encodings.

In traditional editing software, these kinds of ‘turnovers’ require a special version of the timeline. Assistants usually have to make a copy of the current timeline and customize it for each specific collaborator. Colourists want all the audio combined together. Sound people don’t want a timeline with hundreds of video clips, they want video combined into a single flat clip. Also it is common for the sound team to want all the audio from a specific source to be combined in the same or adjacent audio tracks throughout the timeline. 

The problem with creating a specially customized timeline for collaborators is that it goes out of sync with the ‘real’ editor’s timeline. If video edits continue while a copy of the timeline is prepared for the sound team, then the sound will be designed based on an older version of the edit. That means the editing has to stop while the turnovers are done.

Final Cut Pro’s ‘Roles’ feature makes turnovers to collaborators and for delivery to different outputs more straightforward. Instead of having to make a special version of the timeline for each collaborator, assistants can use the editor’s main timeline to share a version customized to the collaborator’s needs.

In timelines, editors use FCPX to mark which clips have which roles in the production. Clips can be defined as graphics, titles, titles in a specific language, audio associated with a specific character, 2D, 3D, or any category the editor decides.

In the case of Pure McCartney VR, Shepherd set up roles based on what his VFX collaborators needed and what his audio collaborators needed:

McC VR-06a-roles-timeline

 

Here you can see that MachinEyes VFX like to receive separate files for background video, screen content, overlaid graphics and title clips: 1 BACKGROUND, 2 SCREENS, 3 OVERLAYS and 4 TITLES. Geoff and Luke had their audio organised around dialogue and music, with dialogue-specific roles mostly based around what kinds of microphones were used on set: BOOM, CH3, MixL, MixR, RAD1 and Other Dialogue.

At this stage the music mix had already come back from Luke Bechthold at Subtractive, so it has its own role. The audio elements that were sent to his Avid Pro Tools system were divided into roles useful for his mix. The dialogue-specific roles were assigned to multichannel audio clips using Intelligent Assistance’s Sync-N-Link X. The audio captured on set had iXML data marking which microphones were on which channels. This meant that instead of audio tracks in FCPX being labeled as ‘Left’ and ‘Right’ or ‘Mono 1’ and ‘Mono 2,’ when Shepherd started to edit, Sync-N-Link X labeled them as Boom, RAD1 (radio microphone) and Other Dialogue.

This meant that Shepherd could put audio elements anywhere he wanted on the timeline - above the main storyline or below - in any order. He didn’t have to make sure the clips from the boom mic were in a specific layer in the timeline. When exported, all the boom mic recordings were sent to the same set of audio tracks for the sound team.

McC VR-06b-roles-export-options

The screenshot shows how the video elements are divided into background, screens, overlays and titles. When exports are done from this timeline, FCPX can be set to export any combination of video elements based on their role in the story. Here is the pop-up menu that shows the roles options when exporting files:

 


The creative side

As VR video is new to many people, it is important to help creative collaborators understand how it works and what it can or can’t do.

Duncan Shepherd:

[Director] Tony Kaye got onboard with the project without really caring what VR was like; he just thought this project would be a great thing. I brought him around to my house and showed him some footage inside the headset, inside the Oculus goggles. He said to me, “Get Breathless, and get Raging Bull and cut bits of film together, and then we'll have a look at that.” I cut some footage together from The Shining and from Tree of Life and a few other films, and others he asked for.

You always have this moment when you first put the 360° video goggles on, when you just slam some footage in, and everybody has the same response: “Oh, this is all we need.” That’s why many VR video productions are 2D spheres of video viewed in 3D goggles. There is a valid case for making interesting stuff just like that.

So Tony saw movies in a sphere through the goggles, was kind of blown away with it, and then went away to think about it. I think he became quite process agnostic. He is a filmmaker so in a sense, actually making a film is no different in a VR world than it is with a RED camera or a film camera or a hand crank camera. You’ve still got a narrative to tell, and you’ve got a star to put on the screen, and it’s a question of organizing the imagery so that it doesn’t take away from the star but also allows them to tell their story. I think he wasn’t that worried about the VR side of it apart from putting the camera in an interesting environment and letting Paul just kind of run with it. 

McC VR photo 07

 

Once we got into the edit suite, that side of things became more useful to think about, and Tony made a decision quite early on which I think was interesting and agree with: if somebody doesn’t see something on the first run-through, then that’s fine, they’ll just have to watch it again.

As a director in VR, you can’t insist that people look at a particular thing. You can goad them into it. You can try and ask them to look in a particular direction, but ultimately the viewer has a choice of any one of six directions if you like - north, south, east, west, up and down - and they will use it. Whenever we show people anything we’ve done, they always look in a different direction to the previous people we showed it to. You can never really tell what your viewer is going to look at specifically. You just have to allow the environment to have enough of the things you want to tell a story with in every direction for it to make some sort of sense. This is true whether that story is documentary, narrative or any other kind.

There are different ways and tricks for doing things. I’m finding that with the sort of narrative idea, it’s actually not too difficult. If there’s only one person in the story, that’s quite easy because they are the only person you are looking at. I think if you had a room full of people arguing, which I haven’t tried to edit yet, that would be harder. It would be more like watching a bunch of people on a theatrical stage arguing. That’s actually not how you would write a play. In a play, you have one person doing the talking at any one time, generally speaking, so it’s analogous to what happens in VR. Even if you’re watching a film in a cinema, your eyes can wander, can’t they? But certainly, if you’re watching a play, you can look all over the place; it’s up to you.

Some of those guidelines we were made aware of, but in general we ignored them. We all sit in swivel chairs when we are working. We don’t mind if the viewer is missing something directly behind where they happen to be looking. There were places that we could easily put imagery or media, but in general we just made sure there was something everywhere. So even if you were looking in the ‘wrong’ direction, you’d get something. There would always be some little Easter egg or something worth seeing in some direction, whether you had your head twisted around 180°, or you were still looking at Paul McCartney, or you were just wandering around. We tried to make it as immersive an experience as we could.

I think that’s part of the fun of it. You can present people with so many different bits of visual imagery, all of which can support the central idea. As an editor, you’re always having to make decisions when you would really rather keep both shots, but you have to take one out. Sometimes in this sort of an environment, you can keep both. If you’ve got too many great shots for your running time, you can show them all in VR. You can have a 30-second sequence which has six great shots all around you. It actually gives you more freedom in that respect.

You just can't guarantee which one of those images people are going to look at, but you can say to yourself, “These programs are quite short, they're five minutes long or they’re ten minutes long, people might watch them twice.” I like to think that music fans will tend to dig into these things more than once if they're given enough reason to. If they realize that there's some benefit to watching it twice, they will watch it twice.

 

Shepherd has worked on many 360° videos already, but working on this one was different organisationally:

Usually there are layers and layers and layers before you get to the person making the final decision. With this project Paul was only one layer away. We worked for a month preparing some cuts so that he could see them; we’d decide when we thought they were good enough for him to see. That was just within our own little creative team, and it was quite a horizontal team. We were all pretty much on the same level.

When we felt like something was good enough to show Paul, or if there was a particular date we were trying to hit because he was available, we'd send ProRes QuickTime files to the Jaunt cloud. Their computers would render versions for Samsung Gear, for desktops, and a version for web 360, taking into account the various different qualities of those different deliverables. A couple of days later we would get concise notes that were consistent and cogent, and made sense to us: usually just about a very specific thing. They weren’t finicky about specific frames: notes we could act on without them being too detailed. 

A good note is obviously when they just love what you've done. In general people don't give you good notes, they don't really, apart from saying, "I like it." You don't get a list of things that they like, you just get a list of things that people don't like. So a good note is no note, and a bad note is one that's just technically difficult to act on.

 

 

Artistic evolution

The first video was Coming Up. That was simple, and that became the first one because it was so simple. Essentially it was inspired by the colourful blocks used in the original video. It was Paul talking about making the video, one of the first to employ multiple performances, marrying takes of him playing a whole big-band-style orchestra of rock and pop stars. Our VR video didn't really need to have a lot of stuff going on in the 360 space. Paul sits in front of you. It is a colourful kind of soundscape. There is a big screen next to him playing the old video. Towards the end, we placed lots of stills from behind the scenes of the shoot, so that was relatively simple. We were sharpening our claws and getting ourselves ready, even though we had actually started the work. It was still a process of working out how far you could go in the medium.

 

For My Valentine we had Paul playing a Wurlitzer keyboard, playing the song he was talking about, just jamming away with people milling around in the background between takes, whilst we also had Paul describing writing the song and making the video.

 

McC VR-07a-My-Valentine-mono

 

McC VR-07b-My-Valentine-direction-1

McC VR-07c-My-Valentine-direction-2

McC VR-07d-My-Valentine-direction-3

 

Then we could overlay video. He talked about Natalie Portman and Johnny Depp, who featured in the video, so we overlaid frames of Natalie Portman and Johnny Depp. Towards the end we ran the video for My Valentine, which Paul directed and which was made in three different versions: one as a regular music video with both Natalie Portman and Johnny Depp performing sign language, a version that was only Natalie Portman, and a version that was only Johnny Depp. It made perfect sense for us to show those three videos in sync, all together at the end of the film, so that if you looked in any one of three directions, you would see that particular version of the video. That was a reasonable way to show the three videos together. It was probably the best way of them actually existing, rather than as three separate entities. It was nice to see them all married up together.

McC VR-07e-My-Valentine-three-up

 

It was as if you had a three-channel multicam. As an editor, you get to see this all the time. When you’re watching multicam takes you go, "This is great. I love seeing all these things in sync, but only I ever get to see it,” because you are sitting there in front of the computer. But it's better when it surrounds you, when you've got screens to both sides and one in front of you. We just did it as three, because I thought that was a nice shape to build in VR. There was no real “behind you”, because it was a triangle.

So those are probably the two ends. One end was the simple construction, which just used slabs of colour to generate a kind of editorial tempo in time with the beat, and then later on we got to a stage where we were really trying to show nine different interesting bits of art going on all around you all the time for a period of seven or eight minutes, and that was hard work. It was worthwhile because it was enjoyable to watch, but it took a long time.

 

Technical evolution

As the world of VR video is at a very early stage, even over the weeks these films were made, the workflow evolved. By the time the last film was made, the process for Shepherd was simpler. He was able to animate 2D elements in 3D space within Final Cut Pro:

On My Valentine I did a lot of the 3D work myself in Final Cut, because Tim Dashwood helped me with some image quality issues I was having. He sent me new beta versions of his 3D 360VR Toolbox plugins for FCPX. It was good to be able to build these stereo 360 worlds myself, rather than relying on a designer doing it in an external application and saying, “Can you make them this size, and this far away from the viewer.” It was good to do it myself, even if I got it wrong. Ultimately you need to play with the controls yourself, to see what you can do and what you can’t. You often break things or make mistakes, but eventually you sort of blunder towards a solution.

I’ve always gone by the maxim that, “If I like it, then it's good.” I have to hope that people hire me because they think that the rest of the world might agree with me. I can't personally judge whether somebody else would like it. That's not my job. That's the way I’ve always worked.

 

Final Cut Pro X for VR video

Duncan Shepherd concludes:

I have a very long list of reasons why I think FCPX is the correct editing tool for me, and none of them are technological. They are all about the creative freedom it gives me to dive deep and explore the available story and imagery I’m presented with.

 

Final Cut Pro’s ability to work with very high resolution video and to composite multiple layers of pictures, graphics and video made it the perfect editing application for Pure McCartney VR.

Its advanced metadata features meant that Duncan Shepherd and his team were able to find any piece of media in moments using keywording and were able to collaborate with other professionals using Roles.

Also important for VR video production is Final Cut’s community of workflow consultants and developers whose tools support advanced productions worldwide.

So next time you are planning leading-edge VR video, use Final Cut Pro X to take the medium forward – to inspire others to tell their story in VR.

 

Cliff Plumer, president of Jaunt Studios, adds:

'There is a lot going on in VR, but it’s mainly driven by the technology of VR. We wanted to take some creative chances to push the boundaries, to create a medium that consumers want more of. We are just beginning.'

 

 

 Written by Alex Gollner.

 

Want to find out more? Duncan Shepherd and Alex Gollner will be demonstrating 360 VR production techniques with Final Cut Pro X at the FCPX TOUR @IBC.

 

©2016 FCP.co

No part of this article may be reproduced without written consent from FCP.co