A week later and the information from a packed WWDC is beginning to sink in. The move to Apple Silicon will bring advantages to FCPX such as speed and resource use, but what about new possibilities with new technology?
It was a curious clash of dates: the keynote presentation from Apple's Worldwide Developers Conference happened a day after Final Cut Pro X's ninth birthday.
We all know that Apple have repeatedly said that a 10-year plan was drawn up for FCPX during development. The Pro Apps team's road map must be getting slightly dog-eared by now having been regularly thumbed on the journey.
But was the Arm-based transition ever on that map?
For the last couple of years, yes, and it's grown ever larger in the view through the windscreen, taking attention away from a big update. Transitioning to a Universal App that runs on both Intel and Apple Silicon must have used up many of the engineering resources that would otherwise have gone into a major release.
What is more interesting though is that the move to Apple Silicon has unfolded another road map, a much larger area to navigate, with wider highways and faster cars.
OK, enough with the driving analogies. You only have to take a look at the State of the Union presentation at about 14 minutes in to realise that the move to new chips will have a huge impact on us video editors.
I'm sure you all saw that automatic labelling of ranges on clips isn't that far away. This is where FCPX will really dominate the other NLEs.
The 2011 relaunch was painful, but it gave us a new framework built on up-to-date technologies that can be expanded on. Adobe, Avid and Blackmagic will find it hard to lever equivalent technology back into their older NLEs.
You might say Adobe is ahead with the auto-reframing of objects for the creation of different aspect ratio deliverables. You would be correct, but anyone who works in both Premiere and FCPX will tell you that being able to do something, and being able to do it well without problems or work-arounds, are two different things. I'll mention multicam as a decent example for measuring the differences between the two.
Apple gave us a quick flash of reframing in the Keynote, but that's not what's getting us excited, this is:
Every iPhone developer will now be able to write or port their apps for macOS. Having done the hard work of harnessing AR or ML on the phone, they will be able to incorporate all their knowledge and resources into apps or plugins to run with FCPX.
This will start off a whole new sphere of plugin and app development. Here are a few ideas that sprang to mind:
Voice to Text and Translation
Yes, there are apps and plugins that do this already, but this would happen on the machine, in real time.
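To give a flavour of why this is plausible, Apple's Speech framework already exposes on-device recognition to developers. Here's a minimal Swift sketch, not a shipping implementation: the file name and locale are placeholders, and the authorization request a real app must make is omitted for brevity.

```swift
import Speech

// Placeholder audio file — in a plugin this would come from the FCPX timeline.
let audioURL = URL(fileURLWithPath: "interview.wav")

guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
      recognizer.supportsOnDeviceRecognition else {
    fatalError("On-device recognition unavailable for this locale")
}

let request = SFSpeechURLRecognitionRequest(url: audioURL)
request.requiresOnDeviceRecognition = true   // audio never leaves the machine

recognizer.recognitionTask(with: request) { result, _ in
    guard let result = result else { return }
    // Each segment carries a timestamp — exactly what a plugin would need
    // to drop markers or captions onto a timeline.
    for segment in result.bestTranscription.segments {
        print(String(format: "%6.2fs  %@", segment.timestamp, segment.substring))
    }
}
```

The interesting part for editors is the timestamped segments: a transcript that knows where each word sits in the clip is a caption track or a searchable marker list waiting to happen.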
The next 10 years are going to be an interesting ride. If you have any ideas for cool plugins, why not post them in the comments below?