After FCP is updated and I try to open a library, it states that the library needs to be updated to work with this version of FCP. No problem... I understand the need. I have a "best practices" question. I have 20+ libraries (1 per year). I rarely open them, but "some day" I hope to use them, and I want them to work with the latest version of FCP. Whenever there is an update, I go through and open each library and update it to work with the latest version of FCP that I am running. Is this wise or necessary? Should I just wait and do it every few years? If necessary, is there a shortcut (rather than opening each library one at a time)? I read that after 10.1 there was an "Update Projects and Events" item in the File menu that would update everything. Did that go away? I don't see it in 10.5.1.
First problem: we have absolutely no access to the FCP API; it's private.
And even if we had it, we think it would be dangerous to automate this batch update process. To be safe, a blind batch update would require a prior backup of each library. If all the libraries kept their media exclusively on external storage, we could possibly do some cleanup, then backups, before running the batch update. But in most cases, libraries contain their media. They are heavy. Launching a general batch backup would then be catastrophic...
No, we believe that such a feature would put a lot of users in very critical situations. This is a process that should be considered on a case-by-case basis.
Final Cut Library Manager can scan drives, locate FCP libraries, and perform various tasks such as deleting render files. However, this is only possible because those items live in well-known folders within the library, and no interpretation of their contents is required; only deletion. It is a great and valuable utility, but it is constrained by the opaque "black box" nature of FCP libraries.
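To illustrate the "well-known folders" point, here is a hypothetical Python sketch that walks a volume for .fcpbundle packages and totals the size of their "Render Files" folders. The .fcpbundle extension and folder name match what recent FCP versions put on disk, but none of this is a documented API, so treat the layout as an assumption.

```python
import os

def find_libraries(root):
    """Yield every .fcpbundle package found under root (without
    descending into the packages themselves)."""
    for dirpath, dirnames, _ in os.walk(root):
        for d in list(dirnames):
            if d.endswith(".fcpbundle"):
                yield os.path.join(dirpath, d)
                dirnames.remove(d)  # treat the package as a leaf

def render_file_bytes(library):
    """Total bytes stored under any 'Render Files' folder in the library."""
    total = 0
    for dirpath, _, filenames in os.walk(library):
        if "Render Files" in dirpath.split(os.sep):
            total += sum(os.path.getsize(os.path.join(dirpath, f))
                         for f in filenames)
    return total
```

Note this only ever deletes or measures whole folders by name; it never has to understand what is inside them, which is exactly the constraint such utilities work under.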
The media file metadata and the edits themselves are stored in various SQLite databases within the library: one database per event and one per project. Each database contains several SQL tables with relational links between them. It is possible to open those tables and read the contents using a SQLite utility or code library, but the schema and data dictionary are not documented. In other words, the meaning of the values at each row/column position, what the columns within a table signify, and what function each table serves are all undocumented.
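For the curious, this is roughly what "reading the tables without documentation" looks like: list every table via sqlite_master and its columns via PRAGMA table_info. A toy in-memory database (with invented table names) stands in for one of FCP's event/project stores, whose real names and layouts are undocumented.

```python
import sqlite3

def dump_schema(conn):
    """Map each table name to its column names, read from sqlite_master."""
    schema = {}
    for (name,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"):
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info('{name}')").fetchall()
        schema[name] = [c[1] for c in cols]
    return schema

# Toy stand-in for one of the undocumented event/project databases.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clip (id INTEGER PRIMARY KEY, name TEXT, start REAL)")
conn.execute("CREATE TABLE marker (id INTEGER PRIMARY KEY, clip_id INTEGER)")
print(dump_schema(conn))
```

This gets you the shape of the data, but not its meaning; names and types alone don't tell you which column is a duration, a rate, or an opaque reference.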
Moreover, there are primary-to-foreign key links between tables, and some of those reference tables with binary ("blob") columns. There is no apparent way to decipher, for example, which byte offsets within a blob row hold a URL, inode, or bookmark locator. A single SQL table must be updated in coordination with the other tables, or the update will break referential integrity and corrupt the database.
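A small demonstration of the referential-integrity hazard, using invented table names (asset, clip) rather than FCP's real, undocumented schema: with foreign-key enforcement on, SQLite refuses to remove a row another table still references; with enforcement off, the same statement would silently leave a dangling reference.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves this OFF by default
conn.execute("CREATE TABLE asset (id INTEGER PRIMARY KEY, url TEXT)")
conn.execute("""CREATE TABLE clip (
                    id INTEGER PRIMARY KEY,
                    asset_id INTEGER REFERENCES asset(id))""")
conn.execute("INSERT INTO asset VALUES (1, 'file:///Volumes/Media/a.mov')")
conn.execute("INSERT INTO clip VALUES (10, 1)")

# Naively rewriting the asset table breaks the link from clip:
try:
    conn.execute("DELETE FROM asset WHERE id = 1")
except sqlite3.IntegrityError as e:
    print("update blocked:", e)
```

In a real library the danger is worse than this demo suggests, because some of the cross-references live inside blobs where no declared constraint protects them at all.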
It appears the programmer-facing API within FCP, as in other macOS apps, is Core Data, which implements an object graph (normally in memory) that a lower layer can translate to a persistent data store implemented on SQLite.
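Core Data's SQLite stores do follow recognizable conventions: entity tables are prefixed with Z, and bookkeeping tables named Z_PRIMARYKEY and Z_METADATA are present. A rough heuristic (no more than that) for spotting such a store:

```python
import sqlite3

def looks_like_core_data(conn):
    """Heuristic: Core Data stores carry Z_PRIMARYKEY and Z_METADATA tables."""
    names = {row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")}
    return "Z_PRIMARYKEY" in names and "Z_METADATA" in names

# A toy store shaped like Core Data output (an entity table named ZEVENT
# plus the two bookkeeping tables), versus a plain database.
store = sqlite3.connect(":memory:")
for ddl in ("CREATE TABLE ZEVENT (Z_PK INTEGER PRIMARY KEY, ZNAME TEXT)",
            "CREATE TABLE Z_PRIMARYKEY (Z_ENT INTEGER, Z_MAX INTEGER)",
            "CREATE TABLE Z_METADATA (Z_VERSION INTEGER, Z_PLIST BLOB)"):
    store.execute(ddl)
plain = sqlite3.connect(":memory:")
plain.execute("CREATE TABLE notes (id INTEGER, body TEXT)")
print(looks_like_core_data(store), looks_like_core_data(plain))
```

Recognizing the store format doesn't help much in practice: the object model those Z-tables serialize is FCP's private data model, which is the undocumented part.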
It is possible to observationally determine a few things by exporting the before/after state of the FCP SQL tables to CSV and using a utility like Beyond Compare for a "diff" comparison. However, that is not remotely sufficient to write a utility that updates the tables, or even to determine where the URL or file locators are stored.
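The observational approach can be sketched in a few lines: snapshot a table to CSV before and after a change, then diff the snapshots. The table and column names below are invented for the illustration; a real experiment would open the database files inside a copy of a library.

```python
import csv
import difflib
import io
import sqlite3

def table_to_csv(conn, table):
    """Render one table as CSV text, header row first."""
    buf = io.StringIO()
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY rowid")
    writer = csv.writer(buf)
    writer.writerow([d[0] for d in cur.description])
    writer.writerows(cur)
    return buf.getvalue()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE asset (id INTEGER PRIMARY KEY, url TEXT)")
conn.execute("INSERT INTO asset VALUES (1, 'file:///old/path.mov')")
before = table_to_csv(conn, "asset")

conn.execute("UPDATE asset SET url = 'file:///new/path.mov' WHERE id = 1")
after = table_to_csv(conn, "asset")

for line in difflib.unified_diff(before.splitlines(), after.splitlines(),
                                 lineterm=""):
    print(line)
```

This finds rows that changed, which is useful for reverse-engineering guesses, but it says nothing about rows whose meaning is hidden inside blob columns, and nothing about which coordinated changes a safe update would require.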
It will likely not be possible to write a utility of the type discussed until and unless Apple publishes an API for 3rd-party access to FCP libraries. I don't see that ever happening.
Edit/add: I just remembered that Final Cut Library Manager can optionally export to CSV the full pathname of every media file used in an FCP library. I don't know how they do that, but it is impressive!