Joe M. replied to the topic 'Multicam editing' in the forum. yesterday

RickLFoy wrote: ... I always did the syncing after I imported my content never before importing. It worked like that with me before so I don't know what would have been different. now... I will try your steps on a different project since already started manually editing it. Thanks again I'll let you know how it turns out.


I didn't mean to imply you sync before importing to FCPX. Rather you sync *after* importing but *before* you add anything to the timeline. The same applies to curating the material. You mark favorites, rejects and keywords *before* putting anything on a timeline. For multicam material you only add curated multicam ranges to the timeline, not the constituent or parent clips. To avoid accidentally adding a parent clip it may be useful to reject those after multicam sync and run the event browser in "hide rejected".

As FCPX.guru said, tagging all clips from a given camera is important. There are various ways to do this. As I described above, this is easy if you offload each camera to a uniquely-named folder, then import with the option of adding keywords from folder names and Finder tags.

As FCPX.guru said, you can also simply create keyword collections named after each camera or camera+operator, etc, click on each one then import. Each batch of clips will be assigned that keyword, making it very easy to identify what clips came from what camera. This is especially important if shooting with several similar cameras. E.g, most of the Panasonic and Sony mirrorless cameras use similar filenames and the same internal codec, and the video header contains no data that would distinguish material from, say, a Sony A6300 vs an A7R3.

Re camera/recorder time of day, if you have many clips from many cameras shot over several days, it's often helpful to ensure they sort adjacently by time of day. My team has shot 1,000 multicam interviews over the past three years and 50 multicam interviews in one weekend, so if a camera or recorder has the incorrect TOD it can make picking the right clips for multicam sync difficult -- even IF they are tagged with the correct camera name. The importance of this was discussed on MacBreak Studio #223 from 02:00 to 04:00:
youtu.be/azJ4J41JaZk?t=113

For later archival search and retrieval, it's useful if the clip date and time of day are correct at a file level. FCPX gives the option for updating TOD of the disk file vs just TOD within FCPX, however the option to update the disk file may not work correctly due to disk permissions issues.


Joe M. replied to the topic 'Multicam editing' in the forum. 2 days ago

RickLFoy wrote: ...I select 3 different files (recorded at the same time) with the syncing using audio.....



To make the FCPX sync work properly, the clips must be labeled with an angle name or camera name or recorder name in the Inspector before doing the sync. This isn't that difficult and the entire batch of clips shot by one camera can be named in a single step. In general create multicam clips, not sync clips -- even if you only have a single camera and external audio source. Overall procedure:

1. Before importing anything, put all clips from each camera in a separate uniquely-named disk folder. E.g, Wide_DVX200, CloseUp_S1H, etc.

2. When importing, make sure the FCPX Preferences > Import options to create keywords from folders and from Finder tags are set. This will automatically tag the imported clips with the camera name. This is especially important when using similar cameras where you can't tell the shots apart from the filenames.

3. Set Event Browser to List View to better see the clip data. Pick menu View>Browser>Toggle Filmstrip/List View or press OPT+CMD+2.

4. If the time of day is incorrect for certain clips because the operator did not set the camera or recorder clock, those clips will not sort next to the others chronologically. This can complicate picking all the clips from a given take to sync into a multicam. It can be fixed for all clips from that camera in one step: select the batch of problem clips and use Modify>Adjust Content Created Date and Time.

5. After clip time of day is correct, in the Event Browser, select all clips from a given audio recorder or camera. If the Inspector is not on, use CMD+4 to show it.

6. In the Inspector, click the "i" button to show information about the clip. This will reveal a "Camera Name" edit box at the bottom.

7. Enter a Camera Name for that group of clips and press return. This batch labels all selected clips with that camera name.

8. Repeat for each camera AND audio recorder. Note all clips from a given device can be labeled in one step, so this is quick.

9. Select all the clips in the Event Browser you wish to sync.

10. Right-click and select "New Multicam Clip". Some experienced editors suggest using multicam clips even for single-camera shoots with external audio: www.fcpworks.com/sync-or-multicam-clips/

11. For either multicam or sync clips make sure the checkbox "use audio for synchronization" is checked.

This will create a new multicam or sync clip which is synchronized using audio. You can double-click the multicam clip and open it in the Angle Editor to verify or adjust the sync. SHIFT+Z to fit to screen. If the sync is not correct you can drag the problem clip left/right or use the comma/period keys for single frame adjustment or left and right angle bracket for 10-frame adjustments.

If you later want to add another camera to an existing multicam, click on down arrow next to the angle name at left of the timeline when in the Angle Editor, and select "Add Angle", then drag the new clip from the Event Browser to the new lane on the timeline.

If a clip is synced incorrectly you can selectively re-sync that one clip within the multicam. In the Angle Editor, click down arrow next to the angle name for the master audio you wish to sync to, and select "Set Monitoring Angle". Then select the clip which is synced wrong, and on that lane's name select the down arrow and pick "Sync Selection to Monitoring Angle". Only that one clip will be re-synced.

Likewise if you forget to add a clip to the multicam during the initial sync, it's not necessary to rebuild the multicam. Just drag/drop the additional clip to the proper angle in the Angle Editor, then use the above procedure to sync that one clip within the multicam.

After the multicam is created, do not simply dump that on a timeline. Use FCPX favorites, rejects and keywords on the MC clip to tag it and curate the content. Have the Angle Viewer enabled when doing this -- SHIFT+CMD+7. Ideally mark all the individual "parent" clips comprising the multicam as "rejected" and run the browser filter with "hide rejected". This will help prevent accidentally adding a single-cam clip to the timeline, which would later require manual frame matching with the multicam clip.

"Sync and adjust camera angles": support.apple.com/guide/final-cut-pro/sy...gles-ver23c76b1a/mac


Joe M. replied to the topic 'media asset management system' in the forum. 3 days ago

Tom Wolsky wrote: If you try to relink a file the relink dialog I think still shows the expected filepath.

Thanks for that correction. Yes, if you select a bunch of red "missing media" files, then do File>Relink Files and pause at that screen, you can cursor down and, one by one, it will show the expected file pathname at the bottom. However there's no way to obtain a comprehensive list.

But in some cases all you may need is a few file pathnames to help determine why they are missing.

This is a good recommendation because at first glance the UI does not obviously show the expected file pathname. It is easy to miss.


Joe M. replied to the topic 'media asset management system' in the forum. 3 days ago

DaveMaine wrote: I am using FCPX as a media management tool as well. What I do differently is leave the media in place wherever it exists, and just bring in the file reference. It keeps the library much smaller in the single library...


You are proving it can be done. Thanks for posting that.

Even though your media is carefully placed on a certain disk, you could also store with the library a list of all media paths used. Reason: if in the future you get red "missing media" clips, the first thing you want to know is where FCPX is looking for the now-missing media. Unfortunately there is no tool within FCPX or Finder which lists the path of each media file, or which displays the symlink pointers. Finder is especially frustrating because "Get Info" only resolves the symlink IF the target file is there, in which case you don't need it. When the target file is missing, Finder only shows the path of the symlink itself, which is useless.

In that situation you can use Terminal: navigate inside the library to the Original Media folder and run ls -l to list the full pathnames the symlinks point to. Or you can just use Final Cut Library Manager, which has an optional feature to export all this to CSV. The future person might not have this knowledge, so including the CSV in the same folder as the library might be a good idea: www.arcticwhiteness.com/finalcutlibrarymanager/
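A minimal sketch of what that looks like in Terminal (the volume, library and event names are hypothetical -- substitute your own):

    cd "/Volumes/Media/MyLibrary.fcpbundle/My Event/Original Media"
    ls -l    # each symlink prints "-> /original/path/clip.mov" even when the target file is missing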


Joe M. replied to the topic 'media asset management system' in the forum. 4 days ago

cofe wrote: ... Why not FCPx as your MAM? Depending on your project sizes you can make libraries self contained (store media within the container)...I store the original media externally, but keep the FCPx proxies within the library container. I edit in proxy mode and when I need footage from another project I just open that Lib and FCPx will pull the enquired proxy into the current Lib when I edit the clip into a timeline. When I need to online I either just connect the online drive(s) or copy it into the current projects external online media location via the consolidate command.

You could even create custom Libs that hold all those generic clips you encounter over time for quicker access.
This system needs a little planning and discipline but you avoid another software and all the potential hassles Joema listed. It also might be nicely assisted with PostLab, even more so with a proxy workflow in the cloud via their upcoming 'Drive' feature.


Yes, that can work. A few possible issues: each major FCPX upgrade requires a library upgrade. In general this is reliable, but if you had a 4-yr-old archival library then opened it with the latest FCPX, that would be a big upgrade. That problem is not unique to FCPX -- any database-backed content system will periodically require database upgrades.

There is a soft upper limit on the current FCPX library database. I have used libraries with 8,500 clips and 200+ camera hours of material and it worked OK, but the headroom is not infinite. Each event is implemented as a separate SQLite database, so splitting a library across events may help, but there is no need to over-do it. A single event can manage a huge amount of content.

There is another soft limit based on # of projects or snapshots. Each project also is implemented as a separate SQLite database, and each open db entails system and memory overhead. FCPX uses a "deferred opening" algorithm to try and avoid opening an excessive number of projects or events but sometimes with over 30-50 projects it will slow down.

In general these limits are very high (except the # of projects) and even a huge feature film (inc'l all footage from all cameras) can be managed in a single library, given appropriate hardware. But there's a difference between that and a long-term archival library which might contain clips from multiple films or documentaries. The main constraint is each library is absolutely independent and FCPX cannot query across libraries.

Proxies, if stored externally, can be somewhat fragile. The original proxy design was in-library only and that is more reliable, but it bloats library size. For longer-term archival storage I recommend also grabbing the FCPX auto-backups for a given library, which are stored in ~/Movies/Final Cut Backups. These are very small and contain no proxies, cache or even symlinks. If after years you try to open or upgrade a regular library, having these around might be useful. I would also suggest exporting library and project XMLs as part of the archive package.
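For example, a minimal Terminal sketch of grabbing the auto-backups (the library name and archive path are hypothetical):

    mkdir -p "/Volumes/Archive/2020_Doc/FCP Backups"
    cp -R ~/Movies/"Final Cut Backups"/"MyDoc Library" "/Volumes/Archive/2020_Doc/FCP Backups/"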

Re cross-library copying of projects and clips, there are some FCPX media management issues to be aware of. In general you want to only copy that material inside a "transfer event". Sam Mestman discusses this from 06:30 to 11:00 in the below video. He demonstrates this on a Lumaforge NAS but it's not unique to a NAS. "Final Cut Pro X Virtual User Group #7":
youtu.be/NAv89cGexIM?t=386

There is also an FCPX media management issue if duplicate filenames are imported, even if those were within separate disk folders. FCPX places the media or symlinks within a simplified library folder structure, which creates filename conflicts. It solves this by adding a "uniqueifier" of (fcp1), (fcp2), etc. However this doesn't work correctly in all cases and, when exporting/importing XMLs, can create spurious duplicate clips. To avoid this (as well as for good data hygiene) it's best if all media filenames are globally unique across all libraries and all time. To achieve this, before import you could add an incrementing 6-digit serial number, then save the high value and use the next value for the next import. Files can easily be batch renamed to add that serial number using either Finder or a 3rd party tool like A Better Finder Rename.

Finder batch renaming (MacMost): youtu.be/rRIZAjylKDw
A Better Finder Rename: www.publicspace.net/ABetterFinderRename/index.html
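If you'd rather script the renaming, here is a minimal bash sketch (the card path and starting serial are hypothetical; test on copies first):

    #!/bin/bash
    # Prefix every clip on the card with an incrementing 6-digit serial number.
    n=${1:-1}                                    # starting serial, e.g. saved from the last import
    for f in /Volumes/CARD/CLIPS/*.MOV; do
      [ -e "$f" ] || continue                    # skip if nothing matches the pattern
      printf -v serial '%06d' "$n"
      mv "$f" "$(dirname "$f")/${serial}_$(basename "$f")"
      n=$((n + 1))
    done
    echo "Next free serial: $n"                  # save this for the next batch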


Joe M. replied to the topic 'Macbook Pro GPU upgrade worth it for performance boost' in the forum. 4 days ago

jinhim wrote: ...I work with many effects and find my gpu processor has a large amount of headroom but GPU memory is maxed out when I render and export.

I've read that the GPU speed improvement is potentially double with the 5600M, will I see a speed boost if I upgrade to identical machine with the 5600 M ?..


Render and export are two different things. Rendering is translating edit directives and effects to the render cache. When an unrendered timeline is exported, that translation is done a few frames at a time before handing them to the encode phase.

In general a faster GPU will not help encoding. It might help rendering -- IF the effect is highly GPU-bound. As a generalization effects use the GPU but the split between CPU and GPU varies. In the case of Neat Video the user can actually adjust the CPU/GPU split.

You can help characterize this by turning off background rendering, deleting the render files, selecting all timeline clips with CMD+A, then timing the render phase with a stopwatch when you press CTRL+R. After the timeline is fully rendered, export to your preferred format and time that. This will give you a very rough idea of render vs encode time.

After rendering the timeline, try this export preset: File>Share>Master File>Settings, Format: Computer, Video Codec: H.264 Faster Encode, Resolution: 1920x1080.

The 5600M is a nice upgrade and a useful option but I'm not sure it will totally transform a performance issue. The Apple Silicon Macs will be out soon and might have significantly better performance. Supposedly a 12-core 13" MacBook Pro will be released this year, which might give the first hint. I don't think the 16" MBP or iMac will be released until next year.


Joe M. replied to the topic 'Color Finale 2 Pro Crashing FCPX -Any tips?' in the forum. 4 days ago

substance3 wrote: ...I've not heard back yet from support but hopefully this is something that can be fixed.


It is the responsibility of the plugin developer to promptly debug and fix problems like this. Apple has provided lots of info and tools about how to do this: developer.apple.com/videos/play/wwdc2018/414/

Ultimately when plugin developers move their code to the FxPlug 4 framework, it will optionally enable running the plugin in a separate "sandboxed" address space, so it should not crash the host app (FCPX in this case). However the plugin code itself can still crash, which will be disruptive. IOW the plugin functionality will probably freeze or malfunction, but FCPX will stay up and you can do an orderly re-launch without risking data loss.

So FxPlug 4 will help contain the problem but there's no substitute for developers debugging and supporting their code. Ideally customers could provide a small, reproducible scenario, but if not possible there are plenty of ways for a developer to debug and fix things like this.

When plugin developers do not expeditiously investigate and fix their problems it taints the entire plugin ecosystem with a reputation for poor reliability and poor support. This makes customers less likely to purchase plugins -- from any vendor.


Joe M. replied to the topic 'media asset management system' in the forum. 5 days ago

Glitchdog wrote: ...What I was hoping to find is a MAM that can sync with all the metadata in FCPX. BUT since I've just started wrapping my head around the FCPX metadata, I wasn't sure if I really needed something (if it exists) that can sync with all the FCPX metadata. Do I need all the FCPX metadata accessible in an asset management setup?


FCPX metadata is stored within various SQL tables within several SQLite databases within an FCPX library. That info is undocumented and in a format that cannot readily be understood by humans. Normally with a SQL database a human devises the schema - table names, column names, datatypes, etc - and also writes the queries. However internally FCPX uses Apple's "Core Data" framework, where the programmer-facing data view is an object graph, and Core Data itself creates and uses the underlying SQL data store. So while the tables can be queried with a 3rd party tool, the info is not understandable.
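You can see this for yourself with the sqlite3 command-line tool (the library and event names below are hypothetical, and you should only ever poke at a copy of a library, never a live one):

    # List the tables in one event's Core Data store inside a *copy* of the library.
    sqlite3 "/Volumes/Scratch/LibCopy.fcpbundle/My Event/CurrentVersion.fcpevent" ".tables"

The output is a set of Core Data tables with opaque Z-prefixed names (Z_METADATA, ZOBJECT-style entities, etc), which illustrates the point: the data is there, but the schema isn't meant for human consumption.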

This leaves XML as the only practical way to get metadata out. The application-specific details are also not documented, but it's possible to figure some things out, so some 3rd parties have written tools for that, such as MergeX which is now owned by PostLab: www.merge.software

With any long-duration MAM approach (whether primitive and manual or a sophisticated 3rd-party database), durability and support are major issues. E.g, say you spend thousands of hours tagging photos using Lightroom. In 8 years will that database be available, readable and intact? Does the app developer allow exporting the data (with all features) in a standards-based format such as CSV or something else, in case their database crashes or goes unsupported?

CatDV has been around a long time, now has an FCPX workflow extension, is widely used in the industry and has the features you want: www.squarebox.com/fcpx/

However it is likely expensive. Also (even given the funding) you can't just pick a product and commit. What if under high stress or heavy load it became unreliable or damaged data? What if that only happened using a certain client OS or only on a NAS if using a certain network protocol? Who would support and debug that? These issues apply to any 3rd-party MAM.

Even with the relatively simple Mac Spotlight indexing, not all features work consistently on all NAS platforms. IOW you could spend lots of time applying Finder tags on a local drive, move it to the NAS, then find an indexed search of those tags doesn't work consistently.

For these reasons some people fall back and use the simple approach of embedding metadata in filenames. It is painfully primitive but at least filenames are durable. E.g, camera file C00001.mp4 becomes 2018_ATL_Wedding_Smith_Reception.mp4. Whether using a database or rudimentary filenames, you must decide on a consistent naming convention ahead of time then stick to it -- it can't be "stream of consciousness".

However the above filename approach can conflict with FCPX's reliance on filenames at ingest time, esp if initial curation is done within FCPX. That raises yet another issue: traditionally initial curation took place outside the NLE because (a) NLE tools for tagging data were poor, or (b) the assistant editor didn't want "all that junk in my library".

With FCPX it's often faster to just ingest everything using "leave files in place" and curate with the skimmer, ratings and keywords. No external tool is that fast, not even Kyno: lesspain.software/kyno/

But once media is ingested, FCPX relies on the filename, and changing it later breaks the link (unless the file is still on the original drive, where "inode lookup" is possible). This in turn argues for deciding for all time on a filename convention *before* ingest. There is no good answer for this issue.

That said, I recommend you investigate Kyno closely. It is non-subscription, cross-platform and may fit your needs:

lesspain.software/kyno/


Joe M. replied to the topic 'media asset management system' in the forum. 6 days ago

Glitchdog wrote: ... budget non-profit, for a media asset management system. As I look more at the rich metadata within Final Cut, I don't see much out there so far that has a really good ability to read/write FCPX metadata.

Our setup is currently one editor with a second one in place in the next year or two. We want to work off our internal Windows server currently (a MacMini server in-between is fine), then eventually migrate to the cloud, once pricing has dropped to fit our budget. Would like a system that doesn't nickel and dime you to death. Wondering what others are using that you feel is a good marriage between FCPX and the MAM?


There are now FCPX workflow extensions that integrate and provide MAM capabilities. However you don't necessarily need these and can "roll your own" collaborative system, but at a possibly significant cost in investigative and testing time.

Frame.io: frame.io
KeyFlow Pro 2: www.keyflowpro.com

See also PostLab: hedge.video/postlab/concepts

FCPX by itself is really good at data organization, but its scope is limited to a single library. What you'd ideally like is cross-library searching. I think Frame.io or KeyFlow Pro give that, but you can do it yourself manually with the inexpensive tool FindrCat. Using this method you'd curate, rate and keyword imported media within a given FCPX library, then export using FindrCat, which preserves those as Finder tags on the media files, making them searchable using the MacOS Spotlight indexing system.
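As a rough illustration (the tag name and volume are hypothetical), once the keywords exist as Finder tags they become searchable from Terminal via Spotlight:

    # Find every file on the media volume carrying the Finder tag "Interview_Smith":
    mdfind -onlyin /Volumes/Media 'kMDItemUserTags == "Interview_Smith"'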

Data organization using the FCPX rating and keywording system is heavily range-based, not just clip-based. I think FindrCat and Spotlight indexing are both clip-based. Ideally you'd like a cross-library MAM workflow extension which preserved FCPX's range-based tagging. I don't remember if Frame.io or KeyFlow Pro do that.

Theoretically you could save FCPX keywords and ratings across multiple libraries using FindrCat, but that assumes those datasets are always online. It's probably possible to subsequently index those with NeoFinder, producing a searchable database of offline media: cdfinder.de/guide/8/8.7/neofinder_tags.html

Note Spotlight indexing may have limitations on a NAS drive and is not designed for intermittently-connected cloud assets.


greg2020 wrote: I made a new archive on my external hard drive.
I entered some big movie source files and did cut and edit them in a project / event.
I exported some of the movies and deleted all the footage and projects.
Anyway the archive file Testmovie.fcpbundle is 20GB!

Disable background rendering in FCPX preferences, then select library in left sidebar and delete all render files with File>Delete Generated Library Files>Delete Render Files>All.

To do this in a more guided and safe fashion, get Final Cut Library Manager: www.arcticwhiteness.com/finalcutlibrarymanager/




dgbarar wrote: ...The thesis that we have had is that the 10 bit files showed Matrix Coefficients and Transfer Characteristics for BT.601 and that is why Compressor is shifting the RGB parades up when transcoding to Proress 422. Yet, so does the 8 bit .h264 file. Yet, Compressor transcodes to Prores 422 without shifting up the RGB parade...


The Matrix Coefficients thing was a guess. I think the issue is how gamma is handled between various encodings and playback methods. I'm not sure how gamma is encoded in the video header. It is common to see gamma differences on playback, esp between Quicktime and other container types. There may be no single rigid spec but various software may be allowed to interpret gamma as they see fit, based on whether TV or cinema playback is expected. I know VLC player has various configurable gamma settings.

I will try to explore this tomorrow; I was busy today on work items.


dgbarar wrote: ...I know that the file was HEVC 10 bit 422. I believe I set the camera up Long GOP--but I do not recall. The file did open in FCPX. It did not open in Compressor...I will upload the file to the URL that you provided....


The file you sent was UHD 4k/29.97, 10-bit 4:2:2, 160 mbps, AVC (H264). It imports OK to FCPX 10.4.8 and can be shared to Compressor and then exported as ProRes 422. As with several Panasonic 10-bit formats it's not compatible with Quicktime and won't play in Quick Look or Quicktime. Maybe for that same reason it won't transcode in Compressor if directly loaded.

I think your original question about waveforms may involve a difference in how various apps handle gamma encoding. I'll look at this more tomorrow. See: larryjordan.com/articles/caution-premier...-color-the-same-way/


dgbarar wrote: I would like to read up more on this potential issue with REC601 in metadata. You mentioned "Invisor" but I do not know who this is. Can you reference a URL?


Invisor is a utility that displays metadata from video files. You can compare side-by-side the differences between multiple files. After installation it integrates with Finder and you can right-click on a file and select Services>Analyze with Invisor, then at top select Comparison Mode, then drag/drop additional files from Finder to Invisor and it displays them in a comparison grid: apps.apple.com/us/app/invisor-media-file...or/id442947586?mt=12

REC601 is an older standard for color space which was embedded in the header of the Fuji HEVC files: en.wikipedia.org/wiki/Rec._601

dgbarar wrote: There seems to be more issues with Compressor. I went to camera store today and recorded a 10 bit HEVC files from a GH5. I wanted to see what Compressor would do with these files. No go. Compressor put it in Completed as Failed. So there seems to be other issues with Compressor and 10 bit HEVC. Even QuickTime would not open this file...


Can you import this file into FCPX? Some files from some cameras are not viewable in Finder or Quicktime Player, but can be imported to FCPX. One example is the 4k 10-bit 4:2:2 H264 All-Intra material from a GH5.

Can you upload the GH5 HEVC sample to me? If so I'll examine it. If possible just drop it here: www.transferbigfiles.com/dropbox/joema


dgbarar wrote: ...I also transcoded joema's file with Adobe Media Encoder (AME) to Prores 422. AME does not shift the RGB parade up and does a better job of preserving the original histogram...


I confirm this happens if the Fuji 10-bit HEVC file is imported directly to Compressor and exported as ProRes 422, but not if exported from FCPX and not if sent from FCPX to Compressor.

I'm not sure why that is. Invisor shows there is some REC601 metadata in the video file header; maybe that is a factor.

As an immediate workaround you can import all those files to FCPX and batch export them all to ProRes 422. Try this:

youtu.be/5tF9wFw99ew


Joe M. replied to the topic 'Transcoding/Analyzing stops overnight' in the forum. 1 week ago

Turn off Enable Power Nap, also turn off screen saver. Make sure there is no timeout whatsoever on screen activity, whether for security, power savings or anything else, including screen auto-lock timeout. Turn down the brightness. Let it run overnight and see if that makes a difference. When you check in the morning, the last window config should still be up.

If, despite making those changes, the transcode still seems to have halted, open a Terminal window and type "last reboot". That will show the macOS reboot history. If the machine somehow encountered a fault and restarted, it will tell you.
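A minimal check the next morning might look like this (assuming a recent macOS; the pmset line is just to spot unexpected sleep overnight):

    # Most recent restarts recorded by macOS:
    last reboot | head -n 5

    # Power-management log -- look for unexpected Sleep/Wake entries overnight:
    pmset -g log | grep -E "Sleep|Wake" | tail -n 20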


I just tested this using X-T3 4k/24 10-bit HEVC F-log material, sending it from FCPX 10.4.8 to Compressor 4.4.6 on MacOS 10.15.6 on an iMac Pro, and a standard gamut library. I used both ProRes 422 and ProRes 4444XQ outputs.

I don't see any shift on the RGB parade between the original camera clip and the ProRes exports. Try this test clip and see if you get the same or different results: www.imaging-resource.com/PRODS/fuji-x-t3...-h265-FLog-10bit.MOV


Joe M. replied to the topic 'Import clips in chronological order.' in the forum. 2 weeks ago

jacob.brown wrote: I have hundreds of clips which I want to put into one timeline. When I try to import all the clips from my browser to my timeline all the clips are in reverse chronological order. I tried to first sort them by date created but this didn't fix when I imported. Anyone know how I can do this?...


On my system running 10.4.8, when I add clips from the browser to the timeline using the E key, they are placed on the timeline in the current sorted order in the browser. I have Group By set to none, and tried sorting by name and date in filmstrip mode. In list mode you can click on the column heading to re-sort them in the browser by date ascending or descending, alphabetic ascending or descending, etc. They will be placed on the timeline in that order.

That said I'd generally advise against dumping hundreds of clips on a timeline. With FCPX the preferred approach is curate, rate and keyword the clips in the browser, then use the query tools to find the clips you want, and only then add those to the timeline. You typically end up rejecting a bunch of media, then run the browser filter in "hide rejected".

Some people attempt to do that at the Finder level before importing the clips, reasoning "I don't want all that stuff junking up my library". That is the old way. I did that with Premiere, since scrubbing through media was so slow. The new way is accepting and leveraging the database, ultra-fast skimmer and query features of FCPX. No external tool is as fast as FCPX, not even Kyno. Importing with "leave files in place" takes no additional space. It is typically faster to import more broadly then curate and reject within FCPX.

If you want provisional sequences of clips in a certain order, use compound clips for that, not creating lots of timelines. Compound clips are skimmable in the browser, whereas projects (aka timelines) are not.

Re the default duration of stills added to the timeline, this only affects the selected range when you click on a single clip. E.g, if set to 2 sec, you click on a clip in the browser and the yellow selection box will be 2 sec. Unfortunately that only works for single clips, not a range of clips. As Larrie said, you can add a bunch of clips to the timeline, select them all, then do CTRL+D and enter 200, and it will make them all 2 sec; however there will be gaps you must then close.

You can set it to 2 sec in prefs, then rapidly go through the browser and click on each still, then click on the 1st clip, then shift-click on the last clip, and all clips with that 2 sec range will be selected. You can then do Q to add them as 2-sec connected stills.

This is a bit unwieldy, but in theory using the FCPX curation tools you will have already favorited or keyworded the stills you want and will be filtering on ONLY those. IOW if you import 500 provisional stills you won't use them all. At some point you must evaluate those and separate the keepers. The FCPX method is that curation is done using favorites, rejects and keywords -- ahead of time. You're then presented with a much smaller group of stills or clips to work with.


Joe M. replied to the topic 'Remote Documentary Workflow Suggestions' in the forum. 2 weeks ago

  • 15,600 clips, appx 212.2 hrs of footage.
  • working remotely with producer (using PostLab)

I worked on a doc last year which had 8,500 clips, 220 camera hours and 130 multi-camera interviews. It was all in one library, using just three events to contain each major shooting location. All other curation and organization used FCPX ratings and keyword collections. This worked fairly well on an iMac Pro and dual 32TB OWC Thunderbay 4 RAID-0 arrays, one primary and one backup.

On previous docs we tried a more detailed keyword approach but this didn't prove useful to the lead editor. It is tempting to use lots of events and keyword collections, even when this doesn't actually help. Using multiple libraries can be a problem since each one is a stand-alone entity and cross-library querying isn't possible. There can also be complications when doing cross-library copying of clips and projects.

People from other NLEs tend to create lots of events because they superficially appear like bins or folders. However you can very effectively manage huge amounts of data in only one event by using ratings and keyword collections. Whether you click on an event or a keyword collection, they both retrieve tagged data. Ingested data can easily be auto-keyworded by folder name or Finder tag, making them as easy to locate as event-based ingest. I'm not recommending using only one event, simply saying don't over use them.

It is easy to fall into a reactive keywording approach, e.g, the assistant editor (AE) goes through the material and creates many similar keywords based on the first impression, e.g, "happy", "smiling", "kids", "children", etc. It is much better to first carefully survey the material then devise a sparse, truly useful keyword approach.

Ultimately the editor must become very familiar with the material. There is no way around that. During that familiarization they'll mark their own favorites and keyword collections. The question is how the AE can prep the material and best assist the lead editor. This is not by over-categorizing material, but doing things that are difficult, tedious or require 1st hand knowledge from the production staff. Examples:

(1) Mark all roles immediately after ingest. See available tutorials about using roles.
(2) Sync all multicam and external audio material.
(3) Mark any ratings or keyword ranges on the MC clips, not the parent clips.
(4) Reject all parent clips and advise lead editor run browser filter in "hide rejected", which avoids accidental insertion of non-MC clip into the timeline, in turn requiring tedious frame matching to the MC clip.
(5) Study closely the material and devise a sparse keyword system which is truly helpful and avoids redundancy.
(6) Keyword items that would otherwise be difficult to find. E.g, use the interview subject's name to keyword relevant b-roll.
(7) Do not keyword unnecessary things. E.g. on a wedding shoot, do not keyword pre-ceremony bridal shots, the ceremony and the reception. These are inherently sequential and chronologically sorted in the event browser; broad keywords typically aren't needed for that.
(8) If the AE marks favorites, these should be saved as a keyword collection, then the favorites un-marked to free up the favorites system for the lead editor.

In general the preferred FCPX approach is do lots of curating, classifying and organizing in the Event Browser, before putting anything on a timeline. If you want to string clips together in a provisional sequence, use a compound clip for that (which is skimmable in the browser), not multiple timelines (which are not skimmable).

Re performance there is some degradation if creating many project snapshots. There is no hard limit but above 30 or so (depending on platform and timeline complexity) it can slow down opening the library.

I've edited complex 2 hr products on FCPX containing many thousands of edits and performance was generally good. That is on a 2017 iMac 27 and a 10-core iMac Pro. Last year I edited a 6-camera 2-hr stage performance on a 2018 MacBook Pro 15, and it was OK. It's best to defer adding compute-intensive effects to the last phases, otherwise it slows down the timeline.

For collaborative work on a NAS, each editor can have their own NAS-based library and send project XMLs back and forth. However with FCPX a lot of work takes place *before* the timeline phase so the issue is how to achieve non-conflicting collaborative work for that.

We simply told each remote AE to stay in their own assigned event. There they did keywording, rating, multicam creation, etc. They then sent us the event XML, and we used MergeX to merge their updates into that event in a master library. MergeX is now owned by PostLab. I don't know the current availability: www.merge.software

There are several issues related to collaborative work and media management. I recommend all filenames be globally unique, due to an issue FCPX has using XML. During ingest FCPX will attempt to "uniquify" duplicate filenames in different folders by adding a suffix such as (fcp1), (fcp2) etc. However this doesn't work perfectly in the XML situation. To avoid this, and for general media hygiene, it's good to rename files or add an incrementing serial number to ensure global uniqueness before ingest. This can be done with Finder or a 3rd party tool like A Better Finder Rename: www.publicspace.net/ABetterFinderRename/index.html

If any camera or recorder did not have the correct time of day, immediately after ingest bulk-update those using Modify>Adjust Content Created Date and Time. This ensures they sort adjacently based on time.

When copying clips or projects between libraries, this should be done inside a "transfer event". You create the event, copy the file or project to that, then copy the event to the destination library. This avoids some FCPX media management issues that can create duplicate clips.

There is a known narrow issue with some HEVC codecs on both Mojave and Catalina which can cause FCPX to hang or crash. To avoid this transcode all HEVC material to optimized media.

Do not use the Chrome browser or any browser based on Chrome such as Brave. These mis-use the MacOS VideoToolBox framework and can cause FCPX to hang or crash.

Re detecting duplicate clips in the timeline, this is a commonly-requested feature but FCPX does not have that. In theory someone could write an XML parsing app which analyzes a project-level XML, or else converts it to CSV for a duplicate check using a spreadsheet or other utility. I don't know if that exists.
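As a rough do-it-yourself sketch (assuming the project is exported via File > Export XML... to a hypothetical project.fcpxml), you could at least count from Terminal how often each media resource is referenced; anything with a count above 1 appears more than once:

    # Count occurrences of each resource reference (ref="rNN") in the exported project XML.
    grep -o 'ref="[^"]*"' project.fcpxml | sort | uniq -c | sort -rn | head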


Joe M. replied to the topic 'Missing Files-RAID 5 Array issue' in the forum. 2 weeks ago

Below is the response I posted to the same question on another forum:

You say Reveal in Finder showed zero length files for the FCPX media files (apparently inside the library). This implies the full contents of the library (inc'l those files) were not successfully copied off the failed RAID array -- despite the message "you can still copy files off the disk".

In theory a single drive can fail on a RAID-5 array and the data will be OK but in reality many things can go wrong at file system level or in an application database as used by FCPX or Lightroom. This shows how RAID by itself is not a reliable backup, but that's a separate issue.

Even given the original camera card with intact files, it might not relink to those if FCPX renamed the disk files as part of the ingest. In some cases FCPX will append a "uniquifier" suffix of (fcp1), etc to the file to prevent duplicate filenames. If it does that it might not relink to the original card file because the filenames are different.

Due to the filesystem damage from your disk crash, apparently the disk filenames have also been changed and have a square bracket suffix containing numbers. So you might not be able to look at the salvaged disk file and see the original disk filename, which FCPX may have itself renamed before the crash.

Here is one thing you might try. If possible rename the current disk drive where the salvaged data and libraries are. Rename it to the original disk name used before the RAID failure. Also place the library and media on the salvaged drive in the same path locations as on the RAID before the failure. Then see if that makes any difference. There is a very low probability of this helping but in some cases FCPX resolves file paths based partially on the drive name. That info is stored in the FCPX library which we cannot change, but changing the drive name to the original name might help (although I'm not optimistic).

If that doesn't work, the only remaining step might be to try filesystem repair or data recovery on the failed RAID. Something about that crash damaged the filesystem, causing the zero-length files. The RAID-5 failover might have protected the data but the filesystem got munged because of an abrupt shutdown. Sometimes that can be fixed. You could either try Disk Utility First Aid or a 3rd party utility like Disk Warrior (which is very good). Note Disk Warrior only works on HFS+, not on APFS drives.
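If you want to try the Terminal equivalent of First Aid first (the volume name is hypothetical), diskutil can verify and attempt to repair the salvaged volume:

    diskutil list                                    # identify the salvaged volume
    diskutil verifyVolume "/Volumes/SalvagedRAID"    # read-only check
    diskutil repairVolume "/Volumes/SalvagedRAID"    # attempt repair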

Don't try that if the data is of such extreme importance that a professional recovery service is warranted. They can sometimes recover the data (at a price) but only if you don't mess it up trying yourself.
