
Syncing Workflow

Okay, this is something I've never understood, and it makes me seriously itch whenever I read articles about it.

I know the real way to do it is to use timecode. As guerrilla filmmakers we don't have that, so moving on.

The way I usually do it is to clap, using hands or a slate. Then I drag every production file (audio and video) into Premiere, go through every single file, and relabel it in a clear way (scene, take).

Then I start the actual edit, and when I decide on a take, I check whether the matching audio is good. If yes, I use that take; if not, I look for another take or a way around it. And ONLY THEN do I sync the files, manually, by finding the clap in both files. While that's easy, it also feels stupid: after I've trimmed everything down to what I need, I have to untrim back to the start of the file to find the clap, and once I've linked audio and video, I retrim again and put it back into place.

It works, but I feel stupid every time I do it, and I can't think of any other practical way. A better approach would be to sync the files before editing, i.e. merge audio and video before bringing everything into the editing suite.
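For what it's worth, the waveform-matching trick that tools like PluralEyes rely on can be sketched in a few lines: cross-correlate the camera's scratch audio with the recorder track, and the peak tells you the offset, no hunting for the clap by eye. A rough sketch (the function name is mine; for real files you'd first decode the audio to arrays with something like ffmpeg):

```python
import numpy as np

def estimate_offset(camera_audio, recorder_audio, sample_rate):
    """Return how many seconds to delay the recorder track so it lines
    up with the camera's scratch track."""
    # Cross-correlate the two tracks; the peak marks the alignment where
    # the waveforms (and hence the clap) line up best.
    corr = np.correlate(camera_audio, recorder_audio, mode="full")
    # Convert the argmax index into a lag in samples.
    lag = int(np.argmax(corr)) - (len(recorder_audio) - 1)
    return lag / sample_rate
```

This brute-force correlation is O(n·m), so in practice you'd downsample both tracks first, but the principle is the same.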


What's the professional way to do it (without timecode)?
 
Ideally, there would be software that merges both files into a single one containing the video and the "good" sound.

At the moment, there isn't (that I'm aware of), but if you're willing to wait for Adobe Premiere Pro CS7, from what I've seen they have that function built in. For now you'll have to deal with PluralEyes: drag your video and audio files onto the timeline (with space between each shot), export that timeline, have PluralEyes process it, and import the new file.
 
That's not the issue. I feel frustrated not being able to convey what I want to say :(

The question is "how". I have the audio and I have the video and I want every video file to be synced to its own audio file so that when I actually dive into the edit, I see the video file and listen to the synced audio. It's like replacing the reference audio track in the video file with the synced one.

But with PluralEyes or whatever, that's just not possible, because the files stay distinct unless you tie them together inside a sequence.
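If the goal is literally one file per take with the good sound baked in, ffmpeg can do that mux once you know the offset: keep the camera's video stream, drop its scratch audio, and pull in the recorder track delayed by the measured amount. A sketch that just builds the command (the helper name, paths and offset are placeholders of mine):

```python
def build_merge_command(video_path, audio_path, offset_seconds, out_path):
    """Build an ffmpeg command that replaces the camera's scratch audio
    with the external recorder track, shifted by the measured offset."""
    return [
        "ffmpeg",
        "-i", video_path,                       # input 0: camera file (video + scratch audio)
        "-itsoffset", f"{offset_seconds:.3f}",  # delay applied to the NEXT input
        "-i", audio_path,                       # input 1: recorder track
        "-map", "0:v",                          # keep the video stream from the camera
        "-map", "1:a",                          # take audio only from the recorder
        "-c:v", "copy",                         # don't re-encode the picture
        "-shortest",                            # stop at the end of the shorter stream
        out_path,
    ]
```

Note that `-itsoffset` must appear before the `-i` it applies to; with `-c:v copy` the video is untouched, so this runs at disk speed and you get exactly the "video plus good sound in one file" clip you described.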

Ideally, there would be software that merges both files into a single one containing the video and the "good" sound.

You'd either:
Use a sequence to line up and sync audio with video, using either a slate or a guide track (or both) and use the sequence for editing, or as a reference for an 'in' point.

OR:
Set an 'in' point at the clap in the video, and at the clap in the audio, hit 'merge' or 'sync', and it creates a sync'd file for you. Easy peasy. I have a feeling the command in APP (Adobe Premiere Pro) is 'Merge Clips', whereas in Avid and FCP7 it's Sync or AutoSync.

Just make sure you tell it to sync using the 'in' points (or 'out' points if it's a tail slate).
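The arithmetic behind that in-point method is just timecode subtraction: the audio needs to slide by the difference between the two clap positions. A quick sketch assuming non-drop-frame timecode (helper names are mine, not any NLE's API):

```python
def tc_to_frames(tc, fps):
    """Convert 'HH:MM:SS:FF' non-drop-frame timecode to a frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def sync_offset_frames(video_in, audio_in, fps):
    """Frames to slide the audio earlier so its clap lands on the video's.
    Positive means the audio's 'in' point sits later than the video's."""
    return tc_to_frames(audio_in, fps) - tc_to_frames(video_in, fps)
```

Drop-frame timecode (29.97 fps) skips frame numbers at minute boundaries, so this naive conversion would be off there; the NLE handles that for you, which is one more reason to let 'Merge Clips' do the subtraction.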
 
You're kind of being pedantic in this case, APE. Even if I wanted to, I can't use timecode on my camera or my audio equipment. I'm not looking for the easiest way to do it, because the easiest way is to buy timecode-compliant equipment. I'm looking for the smartest way to do it. But the way I see it, smartest and professional go hand in hand.

The way I see it, you are looking for the smartest way to do it within the constraints of non-professional equipment, which IMHO is not at all the same as the "professional way to do it", because the professional way is entirely different from what you or anyone else here is suggesting. The professional way may not be applicable to your equipment or workflow, but it's still worth stating what it is and making clear that a significant difference exists.

Also, APE, even when using timecode you often have to double-check and/or re-sync things manually anyway. Timecode makes things heaps easier, but it's not 100% perfect. That's why there's still a clap at the start of every take.

I've AC'd on docos where cam and sound were syncing simply via TOD (time-of-day) timecode, with no slate at all. Luckily I was checking and re-syncing TC every hour or so to make sure there was no drift. That's a lot of trust put in your AC: it was shot on S16mm, so there was no guide track to attempt to sync it up later if the TC did drift!

I'm not so much of a production sound expert but as I understand it, the slate in a timecode locked system broadcasts a timecode sync location, forcing the timecode slaves to re-sync and therefore avoiding the likelihood of drift. So I'm not sure why you are having to re-sync TC every hour when the TC should be automatically re-synced every take. One of the main reasons for using TC in the first place is to avoid drift, so something strange appears to be going on with your setup.

G
 
I'm not so much of a production sound expert but as I understand it, the slate in a timecode locked system broadcasts a timecode sync location, forcing the timecode slaves to re-sync and therefore avoiding the likelihood of drift. So I'm not sure why you are having to re-sync TC every hour when the TC should be automatically re-synced every take. One of the main reasons for using TC in the first place is to avoid drift, so something strange appears to be going on with your setup.

With a TC slate you can either have just the TC running on the display, or you can also have a device 'jam' everything so it's running in sync TC. Assuming everything jams correctly, that's your best/easiest way of working (and generally, at least on high-budget productions, that's what happens).

Simply having the smart slate itself, though, does not necessarily mean anything has been jam sync'd. The readout certainly shows TC, but whether the smart slate is the 'master' or the 'slave' can be up to the production. Generally you're using a master ClockIt to set TC on sound, camera (if your camera accepts it) and slate, so the readout on the slate is (should be) the correct TC, but nothing is necessarily jam sync'd at each clap, so there's still potential for drift. You also need to re-sync whenever the slate is turned off.


The doco I was ACing on wasn't using smart slates at all - we were shooting AatonCode on an S16 camera, using time-of-day/free-run TC. We slated with a dumb slate at the start of every roll to account for global discrepancies, but I also had to check every hour or so to ensure the time of day was still correct and accurately sync'd. Not the most ideal way to do it on location, but syncing in post was a breeze (apparently).

The best/professional way to run TC with a smart slate would be to jam sync - though it can depend on the preferences of the production, and I always have my dumb slate in my bag, even on sets that provide smart slates, as a backup just in case something happens with the smart slate (or the batteries die at an inopportune time!).

Just my experiences, I certainly haven't AC'd on Hollywood features, so their workflow might be different.
 
Yep, jam sync'ing would certainly be best for production sound. It's worth noting that in post even jam sync'ing is not accurate enough and full TC lock is required. Having TC and scene and take numbers in the metadata transforms the audio post process. Locating and editing in alt takes for auditioning or permanent use becomes a task which takes a mouse click rather than 5 minutes (or considerably more if the files are not very well labelled and managed). 5 (or 20+) minutes doesn't sound like much of a time saver but multiply that by the few hundred times you need to do it during post for a feature and you're looking at huge amounts of saved time, not to mention fewer mistakes and less ADR. All in all, way better and way cheaper than the cost of hiring TC equipment for production sound.
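The one-click alt-take lookup that good metadata buys you is essentially this: once scene and take live in the files rather than in someone's memory, finding every matching clip is a dictionary lookup instead of a hunt through filenames. A toy sketch (the record fields are made up for illustration):

```python
def index_takes(clips):
    """Group clip file paths by (scene, take) for instant alt-take lookup.
    Each clip is a dict with 'scene', 'take' and 'path' keys."""
    by_take = {}
    for clip in clips:
        key = (clip["scene"], clip["take"])
        by_take.setdefault(key, []).append(clip["path"])
    return by_take
```

With well-labelled files you can build `clips` from filenames alone; with TC and scene/take in the metadata, the post software builds and uses this index for you.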

G
 
With PSMs being able to name their files (or at least save scene, shot, take etc. as metadata), it'd be really great if digital cameras stored that metadata as well. I imagine a digital camera report (iPad?) that syncs the information wirelessly with the camera as metadata. It could be a two-way sync: white balance, shutter and fps data, plus lens data from /i lenses, could sync to the camera report, along with scene/shot/take and whether the take was good or bad.

It would make things on set a lot easier, and I imagine in post a lot easier too, both for the sync/edit and for vfx shots - you could have all the needed data (lens height, lens selection, lens distance, angle etc. etc.) right there in the metadata of the file.
 