
TOPIC: Remote Documentary Workflow Suggestions

Remote Documentary Workflow Suggestions 24 Jul 2020 18:39 #108918

  • bcassin (Topic Author)
  • Senior Boarder
  • Posts: 45
Hi all,

Looking for workflow recommendations from seasoned FCPX documentary editors.

I'm about knee-deep in the rough cut of a 2-hour doc for broadcast, and this is the first time I've used FCPX for a project this large. I usually use Avid for long-form programming; I'm very familiar with FCPX but haven't applied it to this type of project yet.

Here is the technical situation:
  • 15,600 clips, approx. 212.2 hrs of footage
  • Working remotely with the producer (using PostLab)
  • Producer has a Pegasus RAID with the original 4K ProRes footage
  • I have a GLYPH drive with the footage transcoded to 1920x1080 HEVC

Here is how I have it organized:
  • One library for the cuts, with each act separated into its own event.
  • Four libraries of footage, split by where and when things were shot, which lets the producer and me rate and keyword concurrently in separate libraries.

For the most part this works, but I'm missing the ability to see in the browser what I've already used in the act I'm cutting (since the footage is in a different library than the cut). I separated the acts of the show into their own library so it opens faster; in the past I noticed that the longer timelines got, and the more versions we accumulated, the slower the library opened. This is also what I did when I cut shows in FCP7.

The other thing I've noticed is that some of the HEVC footage occasionally causes FCPX to beach-ball before it plays. I'm not sure why some clips do this and others don't, since they were all transcoded at the same time. I'm also wondering whether I should have had the footage transcoded at the same frame size as the original 4K.

As an aside, I wish FCPX could show duplicate media in the timeline. Especially on a show this long, it's bound to happen :) Maybe in 10.5?

Before I get deeper into the project, any constructive feedback on my approach would be most welcome.

Brian
Owner InnerParakeet
www.innerparakeet.com


Remote Documentary Workflow Suggestions 25 Jul 2020 10:42 #108924

  • FCPX.guru
  • Platinum Boarder
  • bbalser.com
  • Posts: 3595
  • Karma: 34
To show Used Media, use a Smart Collection. It will show you used media for the currently active Timeline.


Remote Documentary Workflow Suggestions 25 Jul 2020 10:53 #108925

  • bcassin (Topic Author)
  • Senior Boarder
  • Posts: 45
Thanks, FCPX.guru! I do know about that feature. Rereading my post, I realized I wasn't entirely clear.

By "used media in the timeline" I meant duplicate media. I'll edit my post to reflect that.

Thanks for the quick reply!
Owner InnerParakeet
www.innerparakeet.com


Remote Documentary Workflow Suggestions 25 Jul 2020 18:55 #108930

  • joema
  • Platinum Boarder
  • Posts: 1556
  • Karma: 27

  • 15,600 clips, appx 212.2 hrs of footage.
  • working remotely with producer (using PostLab)

I worked on a doc last year which had 8,500 clips, 220 camera hours and 130 multi-camera interviews. It was all in one library, using just three events to contain each major shooting location. All other curation and organization used FCPX ratings and keyword collections. This worked fairly well on an iMac Pro and dual 32TB OWC Thunderbay 4 RAID-0 arrays, one primary and one backup.

On previous docs we tried a more detailed keyword approach but this didn't prove useful to the lead editor. It is tempting to use lots of events and keyword collections, even when this doesn't actually help. Using multiple libraries can be a problem since each one is a stand-alone entity and cross-library querying isn't possible. There can also be complications when doing cross-library copying of clips and projects.

People from other NLEs tend to create lots of events because they superficially resemble bins or folders. However, you can very effectively manage huge amounts of data in a single event by using ratings and keyword collections. Whether you click on an event or a keyword collection, both retrieve tagged data. Ingested data can easily be auto-keyworded by folder name or Finder tag, making it as easy to locate as with event-based ingest. I'm not recommending using only one event, simply saying don't overuse them.

It is easy to fall into a reactive keywording approach, e.g., the assistant editor (AE) goes through the material and creates many similar keywords based on first impressions: "happy", "smiling", "kids", "children", etc. It is much better to first carefully survey the material, then devise a sparse, truly useful keyword scheme.

Ultimately the editor must become very familiar with the material; there is no way around that. During that familiarization they'll mark their own favorites and keyword collections. The question is how the AE can prep the material and best assist the lead editor. This is not by over-categorizing material, but by doing things that are difficult, tedious, or require first-hand knowledge from the production staff. Examples:

(1) Mark all roles immediately after ingest. See available tutorials about using roles.
(2) Sync all multicam and external audio material.
(3) Mark any ratings or keyword ranges on the MC clips, not the parent clips.
(4) Reject all parent clips and advise the lead editor to run the browser filter in "Hide Rejected". This avoids accidentally inserting a non-MC clip into the timeline, which would otherwise require tedious frame matching back to the MC clip.
(5) Study closely the material and devise a sparse keyword system which is truly helpful and avoids redundancy.
(6) Keyword items that would otherwise be difficult to find, e.g., use the interview subject's name to keyword relevant b-roll.
(7) Do not keyword unnecessary things. E.g., on a wedding shoot, do not keyword the pre-ceremony bridal shots, the ceremony, and the reception; these are inherently sequential and chronologically sorted in the event browser, so broad keywords typically aren't needed.
(8) If the AE marks favorites, these should be saved as a keyword collection, then the favorites un-marked to free up the favorites system for the lead editor.

In general the preferred FCPX approach is to do lots of curating, classifying and organizing in the Event Browser before putting anything on a timeline. If you want to string clips together in a provisional sequence, use a compound clip for that (which is skimmable in the browser), not multiple timelines (which are not skimmable).

Re performance, there is some degradation if you create many project snapshots. There is no hard limit, but above 30 or so (depending on platform and timeline complexity) it can slow down opening the library.

I've edited complex 2-hr products in FCPX containing many thousands of edits, and performance was generally good. That was on a 2017 iMac 27 and a 10-core iMac Pro. Last year I edited a 6-camera, 2-hr stage performance on a 2018 MacBook Pro 15, and it was OK. It's best to defer adding compute-intensive effects to the last phases; otherwise they slow down the timeline.

For collaborative work on a NAS, each editor can have their own NAS-based library and send project XMLs back and forth. However with FCPX a lot of work takes place *before* the timeline phase so the issue is how to achieve non-conflicting collaborative work for that.

We simply told each remote AE to stay in their own assigned event. There they did keywording, rating, multicam creation, etc. They then sent us the event XML, and we used MergeX to merge their updates into that event in a master library. MergeX is now owned by PostLab; I don't know the current availability: www.merge.software
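
To see what an AE's event XML actually contains before merging, a few lines of Python are enough. This is only an illustrative sketch, not MergeX: it assumes clips show up in the export as <asset-clip>, <clip> or <mc-clip> elements with nested <keyword value="..."> children, which is how recent FCPXML versions appear to lay it out; check against your own export.

```python
# Sketch: list each clip in an exported event FCPXML with its keyword tags.
# Element names (asset-clip / clip / mc-clip, keyword) are assumptions about the
# FCPXML version in use; adjust to match your export.
import sys
import xml.etree.ElementTree as ET

CLIP_TAGS = {"asset-clip", "clip", "mc-clip"}

def summarize_event_xml(path):
    root = ET.parse(path).getroot()
    for elem in root.iter():
        if elem.tag not in CLIP_TAGS:
            continue
        keywords = sorted({kw.get("value", "") for kw in elem.findall("keyword")})
        name = elem.get("name", "<unnamed>")
        print(f"{name}: {', '.join(keywords) if keywords else '(no keywords)'}")

if __name__ == "__main__":
    summarize_event_xml(sys.argv[1])  # e.g. python summarize_event.py Interviews.fcpxml
```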

There are several issues related to collaborative work and media management. I recommend all filenames be globally unique, due to an issue FCPX has when using XML. During ingest, FCPX will attempt to "uniquify" duplicate filenames in different folders by adding a suffix such as (fcp1), (fcp2), etc. However, this doesn't work perfectly in the XML situation. To avoid this, and for general media hygiene, it's good to rename files or add an incrementing serial number to ensure global uniqueness before ingest. This can be done with the Finder or a 3rd-party tool like A Better Finder Rename: www.publicspace.net/ABetterFinderRename/index.html
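
If you'd rather script the serial-number step than use a renaming utility, it's trivial to do. A minimal sketch; the folder path is just a made-up example, and you'd obviously preview with dry_run=True on a copy of the media first:

```python
# Sketch: prepend an incrementing serial number to every file under a folder so
# filenames are globally unique before FCPX ingest. Same idea as a batch rename
# in Finder or A Better Finder Rename, just expressed in code.
from pathlib import Path

def add_serials(root_dir, start=1, dry_run=True):
    counter = start
    for path in sorted(Path(root_dir).rglob("*")):
        if not path.is_file() or path.name.startswith("."):
            continue  # skip folders and hidden files such as .DS_Store
        new_path = path.with_name(f"{counter:05d}_{path.name}")  # e.g. 00001_C0042.MOV
        print(f"{path} -> {new_path}")
        if not dry_run:
            path.rename(new_path)
        counter += 1

# Example (hypothetical path): preview first, then rerun with dry_run=False.
# add_serials("/Volumes/Pegasus/OriginalMedia", dry_run=True)
```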

If any camera or recorder did not have the correct time of day, immediately after ingest bulk-update those using Modify>Adjust Content Created Date and Time. This ensures they sort adjacently based on time.

When copying clips or projects between libraries, this should be done inside a "transfer event". You create the event, copy the file or project to that, then copy the event to the destination library. This avoids some FCPX media management issues that can create duplicate clips.

There is a known narrow issue with some HEVC codecs on both Mojave and Catalina which can cause FCPX to hang or crash. To avoid this, transcode all HEVC material to optimized media.

Do not use the Chrome browser or any browser based on Chrome, such as Brave. These misuse the macOS VideoToolbox framework and can cause FCPX to hang or crash.

Re detecting duplicate clips in the timeline: this is a commonly requested feature, but FCPX does not have it. In theory someone could write an XML-parsing app which analyzes a project-level XML, or converts it to CSV for a duplicate check in a spreadsheet or other utility. I don't know if such a tool exists.
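
For what it's worth, that XML duplicate-check idea is only a screenful of Python. A rough sketch, assuming timeline clips appear as <asset-clip>/<clip>/<mc-clip> elements whose "ref" attribute points at the source asset (element names vary by FCPXML version, and a real tool would also compare source in/out ranges rather than whole clips):

```python
# Sketch: flag any source clip that is used more than once in a project-level FCPXML.
import sys
from collections import Counter
import xml.etree.ElementTree as ET

CLIP_TAGS = {"asset-clip", "clip", "mc-clip"}  # assumed timeline clip element names

def find_duplicate_clips(fcpxml_path):
    root = ET.parse(fcpxml_path).getroot()
    usage = Counter()
    for elem in root.iter():
        if elem.tag in CLIP_TAGS:
            key = elem.get("ref") or elem.get("name")  # key on resource reference or clip name
            if key:
                usage[key] += 1
    dups = [(k, n) for k, n in usage.most_common() if n > 1]
    if not dups:
        print("No clip appears more than once.")
    for key, count in dups:
        print(f"{key}: appears {count} times")

if __name__ == "__main__":
    find_duplicate_clips(sys.argv[1])  # e.g. python dup_check.py Act1_v12.fcpxml
```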
