Back in March 2016 I took a prototype Jellyfish portable shared storage unit out on the road for a broadcast Final Cut Pro X edit. A few months later, I took a shipping model out on another high-pressure, quick-turnaround broadcast job. What's new, what's changed and how did it perform?

Looking back at the Jellyfish's first outing, it suited the job really well and, one hiccup aside, worked exceedingly well.

On the road with Final Cut Pro X and a Jellyfish, the new budget shared storage system

A few months later, another similar broadcast job came along and the question of edit kit was raised.

The Outside Broadcast company were quite happy to supply a Synology drive, but as the job increased in complexity, a quicker and beefier storage solution was needed. The job needed two edit machines, an ingest station, a graphics station and connection to the truck EVS system.

Hiring an ISIS system for the weekend would involve an engineer attending and that would blow the budget. So I suggested that I could supply and look after a newer, faster shared storage system.

A quick call to Ronny Courtens and he was more than happy to loan a Jellyfish for a week.

fcpx jellyfish 2

Plugging the just-delivered LumaForge Jellyfish into a MacBook Pro over GigE.

So what has changed about the Jellyfish? Quite a lot actually.

It now has a smaller form factor than the ‘cube’ before. Internally, things are a lot more secure and space has been saved by racking all the drives vertically. Should a drive fail, swapping one out is easy as they just slide out of the rack.

fcpx jellyfish 3

fcpx jellyfish 4

I’d be a lot happier transporting the new model in an overhead bin on a plane or shipping it in a foam lined flight case. (FedEx delivered the machine to me in the UK from Ronny in Belgium.)

A spare USB key boot drive is also supplied; after the Jellyfish’s first outing, LumaForge now includes one with every machine. Luckily we didn’t need it this time!

We also have to mention the handle. Portable shared storage needs to be exactly that and the handle really helps.

The machine supplied had two 10GigE ports and four GigE ports. As two top-of-the-range 27-inch Retina iMacs were to be used for the edit stations, these were connected to the 10GigE ports via Sonnet Thunderbolt 2 adaptors.

fcpx jellyfish 5

The bottom row of four connectors is GigE; above that are the two 10GigE connectors.

fcpx jellyfish 6

The Sonnet adaptor boxes need to have a driver installed on the client machines.

Although we expected high read/write speeds, we were quite taken aback when we managed to achieve the same figures on two machines at the same time. This is a fast box! Check out the video.

I think anybody, including the old ‘seen it all before’ shared storage technicians, would be impressed by 900 MB/s read/write on two iMacs from a desktop unit.
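(If you want to sanity-check a setup like this yourself, a crude sequential read/write test can be scripted in a few lines. The mount point and test size below are assumptions, and a dedicated disk speed test app will give figures more representative of video work; this is just a rough sketch.)

```python
# Crude sequential throughput check on a mounted share -- a rough sketch,
# not a proper benchmark. Mount point and test size are assumptions.
import os
import time

TEST_FILE = "/Volumes/Jellyfish/speedtest.tmp"  # assumed mount point
CHUNK = 64 * 1024 * 1024                        # 64 MB per write/read
CHUNKS = 64                                     # 64 x 64 MB = 4 GB total
data = os.urandom(CHUNK)

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(CHUNKS):
        f.write(data)
    f.flush()
    os.fsync(f.fileno())          # make sure the data actually hit the server
write_mbs = (CHUNK * CHUNKS) / (1024 * 1024) / (time.time() - start)

start = time.time()
with open(TEST_FILE, "rb") as f:
    while f.read(CHUNK):
        pass
# Note: the read figure can be flattered by local caching on a small test file.
read_mbs = (CHUNK * CHUNKS) / (1024 * 1024) / (time.time() - start)

os.remove(TEST_FILE)
print("Write: %.0f MB/s, Read: %.0f MB/s" % (write_mbs, read_mbs))
```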

The machines were configured before the event as there was editing to be done beforehand. This is where the Jellyfish’s benefits start to shine, as the same storage unit is used for the live event.

This might sound obvious, but most OBs have a day of edit suite configuration where versions are checked; storage, cache and autosave directories are set; plugins are loaded; speeds are tested; and all the little things are fixed, right down to loading the editors’ preferences. Then starts the laborious process of loading up the pre-cut media and relinking everything back together again.

Connecting to the Jellyfish can be done in seconds with no IT experience needed. LumaForge has updated the client login application to make mounting NFS and SMB shares even easier.

fcpx jellyfish 7

With the Jellyfish, we just turned up, got the machines out and carried on editing. For an afternoon we were even based in a dressing room as the OB truck wasn’t ready yet!

Two more connections to the Jellyfish were needed. The first was a graphics machine that was running Mavericks. A quick call to LumaForge and they walked us through how to connect the old 10.9 machine without using the normal app.
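I won’t reproduce LumaForge’s exact instructions, but for anyone curious, manually attaching an NFS share on an older Mac comes down to creating a mount point and calling mount. The server address, export path and options in this sketch are placeholder assumptions, not what we actually typed on the day.

```python
# Sketch of a manual NFS mount on an older macOS client (needs sudo).
# The server address, export path and mount options are assumed placeholders.
import os
import subprocess

SERVER = "10.1.1.10"               # assumed Jellyfish address on the edit network
EXPORT = "/jellyfish/media"        # assumed NFS export published by the server
MOUNT_POINT = "/Volumes/Jellyfish"

if not os.path.isdir(MOUNT_POINT):
    os.makedirs(MOUNT_POINT)

# TCP transport and larger read/write sizes, roughly the sort of options a
# shared-storage client app would set for video playback.
subprocess.check_call([
    "mount", "-t", "nfs",
    "-o", "vers=3,tcp,rsize=65536,wsize=65536",
    SERVER + ":" + EXPORT, MOUNT_POINT,
])
print("Mounted " + SERVER + ":" + EXPORT + " at " + MOUNT_POINT)
```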

fcpx jellyfish 7

Edit 1 setup in the back of a tender. Note the fairly complex timeline!

fcpx jellyfish 8

Edit 2 slightly more snug in the back of the replay area.

Then we had to connect to the truck EVS system. A quick SMB share from the XTAccess box and we could see growing EVS streams in the FCPX browser. This process is vital in today’s fast turnaround OB environment.
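How the SMB side is wired up will vary from truck to truck, but it essentially amounts to the EVS/XTAccess machine seeing a share on the Jellyfish as a network drive. As a purely illustrative sketch (the drive letter, UNC path and credentials are all invented here), mapping that share on a Windows box can be scripted along these lines:

```python
# Illustrative only: mapping an SMB share as a drive letter on a Windows
# machine such as the XTAccess box. Drive letter, UNC path and credentials
# are invented placeholders.
import subprocess

subprocess.check_call([
    "net", "use", "J:",        # assumed free drive letter
    r"\\jellyfish\media",      # assumed SMB share published by the server
    "password",                # placeholder password
    "/user:editor",            # placeholder username
    "/persistent:no",          # don't remap automatically after a reboot
])
```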

fcpx jellyfish 9

The Jellyfish hooked up to the truck's EVS machines. (XTAccess in the background)

There then followed two days of hassle free Final Cut Pro X editing. The Jellyfish didn’t miss a beat. It only took me a few minutes each morning to go around all the clients to reconnect them after the trucks (and Jellyfish) were powered down overnight.

The Suicide Squad-inspired opener for the show, which was half edited before the event and finished on site. It is the export from the rather complex timeline in the picture above!

One thing I’d like to emphasise here is the speed. The combination of the Jellyfish, the NFS file system, the Retina iMacs and FCPX provided just about the fastest OB edit system you can buy (or borrow!). It makes other similar OB setups using ISIS and Adobe Premiere systems look like they are back in the ’90s using SCSI drives.

The opener for the last show, made up of EVS match streams, archived EVS clips, Sony F5 slo-mo and full-speed camera originals, and a Canon 5D second angle. Cut and ready just a few minutes before TX.

Waveform generation for long files, especially multitrack and/or growing recordings, is the Achilles' heel of NLEs. I was very impressed with how fast the Jellyfish/iMac/FCPX combination produced the waveforms. This video shows the speed on a closed file with multiple audio tracks.

But the benefits didn’t stop there.

Traditionally, at the end of an OB, the EVS operators produce ‘melts’, which are all the best clips from the different angles of the coverage put together. Normally this takes time to copy over, but as each clip was being ‘archived’ onto the Jellyfish throughout the transmission, we were able to derig the machine 15 minutes after coming off air.

In the pre-tournament edit and two days of transmission, we stored just over 2TB of camera originals, stills and music. The EVS transferred just over 1.5TB of streams and archived clips to the Jellyfish. NFS was used for FCPX and the media; the EVS was connected via SMB.

fcpx jellyfish 10

The work was tough; edits were going through the suites so fast that there wasn’t any time for media management. No problem, as the Jellyfish was going back to my studio for a day to copy the media, matches and masters that were needed onto another drive. It was also a good opportunity to make split-track exports.

If you have ever had a generator driver standing over you at the end of a job waiting for you to power off, you will know the benefit of being able to tidy up and media manage at your leisure.

Time is money, or in this case, time is diesel.

I can see more jobs where the editors are in control of the technical aspects of OB post production. They know how things should be configured and how fast things should run (and why), as they are the end users. The Jellyfish is not only a fast shared storage box, it is very easy to hook up to each client. No IT or field engineer required.

And the bottom line really is the bottom line. If a client has a fully working, fast edit system on location on a tight budget, everybody wins.

 

Peter Wiggins is a freelance broadcast editor based in the UK, although his work takes him around the world. An early adopter of FCP, setting up pioneering broadcast workflows, his weapon of choice is now Final Cut Pro X.

You can follow him on Twitter as @peterwiggins or as he runs the majority of this site, you can contact him here.