Festival Screening Formats

A recurring theme is many filmmakers' preoccupation with the visual side of filmmaking at the expense of knowledge and care on the audio side. The result is that many films which succeed at the most difficult step, the selection process, end up either not being screened or, possibly worse, being screened with fairly disastrous consequences.

Most film festivals require entrance submissions on DVD (and sometimes other consumer formats). This is fine and easy for virtually all low/no budget filmmakers. However, once accepted, the bigger film festivals will require a projection (or exhibition) copy of your film, and the specifications for this projection copy are far trickier to comply with and, particularly when it comes to the audio specs, poorly understood by many filmmakers. One area which causes many problems is specs which appear to allow the submission of stereo sound.

As I discussed and explained in this Stereo Warning thread, stereo sound is far and away the most popular format for TV broadcast, YouTube (and other internet distribution) and for the music business, but it is not, nor has it ever been, a film audio format. The mistake made by many is not appreciating that the term "stereo" frequently has a very different meaning in the film industry than in any of those other industries!

This confusion stems from the fact that the term "stereophonic" (or "stereo" for short) is widely misused. It is commonly used to mean 2-channel (left, right) sound but is actually defined as a method of sound reproduction which creates an illusion of audible perspective. Therefore, even 5.1 surround sound can legitimately be called "stereophonic" or stereo.

The confusion is made worse by the fact that from the mid 70s to the mid 90s "Dolby Stereo" was the most common film audio format. Dolby Stereo (and its consumer equivalent, Dolby ProLogic) is recorded to two channels and is compatible with all 2.0 stereo systems, but it is in fact a 4-channel format (Left, Centre, Right, Surround). During re-recording, a Dolby encoder takes the 4 channels and encodes them into two channels. These two channels are referred to as Left total and Right total (LtRt) to differentiate them from the Left only, Right only (LoRo) channels of standard 2.0 stereo. On playback in a cinema, this LtRt mix passes through a Dolby decoder which recreates the original 4-channel LCRS mix.
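To make the 4:2 matrix idea concrete, here's a deliberately simplified Python sketch of my own (not Dolby's actual algorithm: a real encoder band-limits the surround, applies noise reduction and uses ±90° phase shifts rather than the simple polarity flip used here):

```python
import numpy as np

G = 10 ** (-3 / 20)  # -3 dB pan-law gain, roughly 0.707

def encode_ltrt(L, C, R, S):
    """Simplified 4:2 matrix encode (LCRS -> LtRt).
    Centre is mixed equally into both channels; surround is mixed into
    both with opposite polarity, so it sits 180 degrees out of phase
    between Lt and Rt."""
    Lt = L + G * C + G * S
    Rt = R + G * C - G * S
    return Lt, Rt

def decode_lcrs(Lt, Rt):
    """Naive passive 2:4 decode: in-phase content folds back into the
    centre, out-of-phase content into the surround."""
    return Lt, G * (Lt + Rt), Rt, G * (Lt - Rt)  # L, C, R, S
```

Round-tripping a centre-only signal through these two functions puts the energy back in the centre output and none in the surround; flip one channel's polarity and the energy lands in the surround instead, which is the mechanism behind the speaker-jumping problem.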

It's actually quite difficult (until it's decoded) to tell the difference between an LtRt mix and an LoRo mix, and some festivals may not bother checking. If you submit your film with an LoRo stereo mix when an LtRt mix is required, your film may be rejected for screening. Those festivals which don't check will screen your film and the Dolby decoder will attempt to decode your LoRo mix. The result of this decoding process on an LoRo mix is unpredictable but commonly, parts of the music mix, certain stereo sound effects and sometimes even the dialogue itself can play from the wrong speaker or jump to a different speaker. Unfortunately, the speaker it's most likely to jump to is the surround speaker. Obviously, having parts of your sound mix (possibly the dialogue itself) suddenly jump from, say, the centre speaker to the surround speaker in mid sentence is going to pretty much destroy your screening.

In short, you should never use 2.0 stereo for any screening in a cinema, even if the specifications appear to imply that stereo is an accepted audio format. However, small, regional film festivals usually have very limited funds/resources, maybe using a temporary venue rather than a cinema for screening and may only have a 2.0 stereo system. So, you need to check and be sure you clearly understand what the festival is after, especially if they specify stereo as the audio format!

G
 
Those festivals which don't check will screen your film and the Dolby decoder will attempt to decode your LoRo mix. The result of this decoding process on an LoRo mix is unpredictable but commonly, parts of the music mix, certain stereo sound effects and sometimes even the dialogue itself can play from the wrong speaker or jump to a different speaker. Unfortunately, the speaker it's most likely to jump to is the surround speaker. Obviously, having parts of your sound mix (possibly the dialogue itself) suddenly jump from, say, the centre speaker to the surround speaker in mid sentence is going to pretty much destroy your screening.

Well that explains a lot. At the Columbia Gorge Int. Film Fest, the audio for my movie did exactly what you describe. It jumped from speaker-to-speaker, almost randomly, and to the best of my recollection, the majority of the speaker-jumping involved dialogue. It confused the hell out of me, and yes, of course it was VERY distracting. I can't think of any better way to kill an audience's suspension of disbelief than to constantly remind them that they're watching an amateur movie. 2001 and wheaty were in attendance to witness the debacle, and the optimist in me thinks that they would've enjoyed the movie more, if it were not for this ugly distraction. :D

I hope you don't mind the digression, but since you've answered one formerly mind-boggling audio question for me, may I ask you another?

When we screened at our local 2-buck theater, the audio often went out of sync with the video. It seemed to happen at the beginning of new scenes. When a new scene started, the audio would go silent for a couple seconds, then come back in, a couple seconds out of sync. This would last for maybe 30 seconds or so, then it would correct itself and come back in sync. WTF?

It was a 2.0 stereo mix, played on DVD. I've discussed this issue with other local indie filmmakers and they've experienced the same problem at this theater. And yet, this theater screens Hollywood movies every single night without a hitch. Is it that our 2.0 mixes are not compatible with their hardware/software? Would the problem be fixed with a 4-channel Dolby Stereo mix?

Thanks for the info!
 
And a follow-up question:

Obviously, I'd greatly prefer the 4.0 screening format of Dolby Stereo. However, if in a pinch, might we be able to trick the 4.0 system into behaving like a 2.0 system? Could we not mix down to 4.0, while leaving the centre and surround channels completely silent?
 
At the Columbia Gorge Int. Film Fest, the audio for my movie did exactly what you describe. It jumped from speaker-to-speaker, almost randomly, and to the best of my recollection, the majority of the speaker-jumping involved dialogue. It confused the hell out of me, and yes, of course it was VERY distracting. I can't think of any better way to kill an audience's suspension of disbelief than to constantly remind them that they're watching an amateur movie.

All the commercial theatrical projection systems I know of will automatically assume that a 2.0 mix is an LtRt mix and will try to decode it, potentially resulting in exactly the symptoms you experienced. It is also possible the decoding process will turn your LoRo mix into a reasonable sounding surround mix but there's no way of knowing whether you will get an acceptable surround mix or a disaster until you actually feed your LoRo mix through a theatrical Dolby Decoder Unit.

When we screened at our local 2-buck theater, the audio often went out of sync with the video. It seemed to happen at the beginning of new scenes. When a new scene started, the audio would go silent for a couple seconds, then come back in, a couple seconds out of sync. This would last for maybe 30 seconds or so, then it would correct itself and come back in sync. WTF? ... It was a 2.0 stereo mix, played on DVD. I've discussed this issue with other local indie filmmakers and they've experienced the same problem at this theater. And yet, this theater screens Hollywood movies every single night without a hitch. Is it that our 2.0 mixes are not compatible with their hardware/software? Would the problem be fixed with a 4-channel Dolby Stereo mix?

To be honest, I'm not sure what would be causing that problem. It could be that an LtRt mix would solve it but, IMHO, a more likely culprit is the DVD format itself. Theatrical projection systems are designed for either 35mm film or DCP, not for DVD, BluRay, HDCAM or any other format. To play a DVD or other format through a theatrical projection system essentially requires some sort of hardware hack, both for the picture and for the sound. It sounds like the audio routing is a bit fouled up, which would not be a surprise. In my experience, once the Dolby installation techs have been in to calibrate the cinema's sound system, there's usually no one at most cinemas who has any clue how the sound system works. Dolby actually directly supports many international film festivals, not only supplying the audio/visual equipment to project and play back the various exhibition formats accepted by the festival but also supplying their own technical staff for the duration of the festival to help eliminate the type of problem you describe.

Obviously, I'd greatly prefer the 4.0 screening format of Dolby Stereo. However, if in a pinch, might we be able to trick the 4.0 system into behaving like a 2.0 system? Could we not mix down to 4.0, while leaving the centre and surround channels completely silent?

No, Dolby Stereo uses a technology which makes this impossible. Explaining why is tricky without going into a lot of technical detail, but I'll try to keep it as simple as I can. Modern digital audio formats like Dolby Digital, DTS and even poly-wavs create digital bitstreams; the decoder just has to read the metadata in the bitstream to know which audio channel (speaker) each bit of digital audio data belongs to. Dolby Stereo, on the other hand, was an analogue format: the output of the encoder was not a digital bitstream but just two analogue audio channels. Part of the encoding process (converting the LCRS to LtRt) involves taking the surround channel and mixing it into the left and right channels slightly out of phase. Not enough out of phase to cause too many problems when played back on a standard (LoRo) stereo system, but enough for the decoder to detect the difference, extract the "out of phase" part of the LtRt signal and route it to the surround speaker.

When you feed a 2.0 mix to a Dolby Stereo decoder, there are only the two audio signals, no other info or data, so the decoder does not know whether it's getting an LoRo mix or an LtRt mix; it will just treat everything it's given as an LtRt mix and try to decode it. The problem arises because it's almost impossible to create an LoRo mix without phase discrepancies, which will confuse the Dolby Stereo decoder into outputting that part of the signal to the wrong speaker. As you cross the threshold of the decoder's phase detection circuitry the sound will suddenly jump to a different speaker; cross the threshold again and the sound will jump again. The only practical way of knowing with any certainty where the decoder is going to place the sound elements in your mix is to have a full LCRS sound system, place a Dolby Stereo encoder and decoder in the monitoring chain while mixing, and listen to what's happening!
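The steering can be shown numerically with a toy sketch (my own illustration; a real Pro Logic decoder adds dynamic steering logic on top of this basic sum/difference idea): mono material that is in phase between the two channels decodes to the centre, while the very same material with a phase flip on one channel decodes entirely to the surround.

```python
import numpy as np

def passive_decode(Lt, Rt):
    """Naive sum/difference stage at the heart of a matrix decoder:
    in-phase content steers to the centre, anti-phase content to the
    surround."""
    centre = 0.707 * (Lt + Rt)
    surround = 0.707 * (Lt - Rt)
    return centre, surround

t = np.linspace(0, 1, 1000, endpoint=False)
dialogue = np.sin(2 * np.pi * 5 * t)  # stand-in for a mono dialogue track

# Ordinary LoRo mix, dialogue panned dead centre: decodes to the centre.
c_ok, s_ok = passive_decode(dialogue, dialogue)

# Same dialogue with a polarity flip on one channel (a stereo widener or
# processing error can introduce exactly this): it all lands in the surround.
c_bad, s_bad = passive_decode(dialogue, -dialogue)
```

Any intermediate phase relationship sits between these two extremes, which is why an un-encoded LoRo mix can hover around the decoder's detection threshold and jump between speakers.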

I worked exclusively in Dolby Stereo for a period of nearly 3 years about 15 years ago and, to be honest, it's a real PITA! I've no idea how many hours I wasted jiggling sounds around, trying to pan them so they played out of the speakers I wanted! Working with a "discrete" system like Dolby Digital is a dream compared to the "matrix" technology used in Dolby Stereo: when you want to position a sound somewhere, all you have to do is pan it there; wow, how quick and easy is that! Believe me, in practice you wouldn't want to use the 4.0 Dolby Stereo screening format; it can sound very good when it's finished but creating it is a hassle. It should in theory be possible to get a Dolby Stereo decoder to output just a 2.0 mix, but in practice it's unworkable: you wouldn't be able to pan anything to the centre of your mix, you'd need almost perfect phase coherency and you'd still need to monitor it all through a 4.0 (LCRS) sound system and a Dolby decoder anyway, to know whether or not you were actually getting a 2.0 mix. In practice it would be far easier just to create a 4.0 mix!

As Dolby Digital is a discrete system rather than a matrix system (like Dolby Stereo), you can create a 2.0 (LoRo) mix in Dolby Digital, and indeed this is often done in the HDTV broadcast world. If a festival has the facility to play Dolby Digital (or other 5.1 formats), using it to create an LoRo Dolby Digital mix is pointless though, as LoRo is the worst of all possible theatrical screening audio formats! LoRo mixes should be completely discounted as a film festival exhibition format unless the festival provides no other option!! For theatrical screening, 3.0 is the absolute minimum channel count.

G
 
My most recent film has been in a lot of festivals in the last 2 years. The majority of them, even the big ones that will take DCP and HDCam, are now accepting digital files, like .mov and .mp4.

So what does this do to the sound? I've seen my digital files projected in theaters of various sizes at all these festivals. Sometimes it sounds great. Other times, meh. Occasionally, it's just way too loud or soft, but I usually assume that's the theater.

I had a professional 5.1 mix (done by a guy who works at Disney, so it was about as pro as you go). I walked out with a 5.1 set of tracks for DCP (which I couldn't afford to have made) and LtRt set for HDCam (which I always send to festivals if they accept them) and a plain old stereo mix which I put on my digital files. That's what I've been sending to festivals. More and more of the second tier fests - the ones that still play in theaters and nice venues but aren't your Cannes, Sundance etc. seem to be going this route, since it's way easier than shipping a bunch of tapes or drives all over the world.

So with this new situation, how do we make our films sound their best? Can you do a 3.0 or higher mix in a digital file and can Mr. Projectionist's MacBook Air play it through the theater's sound system?

Thanks for this very helpful post :)
 
My most recent film has been in a lot of festivals in the last 2 years. The majority of them, even the big ones that will take DCP and HDCam are now accepting digital files, like .mov and .mp4.

So what does this do to the sound? I've seen my digital files projected in theaters of various sizes at all these festivals. Sometimes it sounds great. Other times, meh. Occasionally, it's just way too loud or soft, but I usually assume that's the theater.

No, it's not the cinema. Cinemas are carefully calibrated to theatrical standards which have been around for 30+ years, and commercial dubbing theatres (mix stages) are calibrated exactly the same. The end result is that a mix produced in a dubbing theatre will sound the same (or at least fairly close) in any given cinema. So all's well and good in the cinema world, but at the mid and lower tier film festivals we're generally looking at filmmakers who cannot afford a commercial dubbing theatre and who often don't know that theatrical audio specs even exist, let alone try to get close to them!

This presents festivals with a dilemma. If they leave the cinema calibrated to standard theatrical levels, many or most of the films, which are probably mixed closer to music industry levels, will be deafeningly loud. If they turn the cinema sound system down so music-industry-type levels don't cause the entire audience to run out with their hands over their ears, then those films at or close to actual theatrical levels will sound extremely quiet.

At the major festivals the sound systems are left calibrated to theatrical standards and filmmakers are expected to produce an exhibition copy suitable for standard cinema screening; if they don't, their film will simply be rejected for screening. Some fests, Sundance for example, will check the exhibition copy early enough to give filmmakers a bit of time to try to rectify the mix. If the mid and lower tier fests took this approach, they would probably have to reject the vast majority of films selected for screening, if they even had the manpower and facilities to check the exhibition copies before screening in the first place! So generally they just "take a view" and turn the sound system down by whatever amount they think is most appropriate. Unfortunately, this effectively penalises those who have made a film at or close to actual theatrical standards!

[2] So with this new situation, how do we make our films sound their best? [1] Can you do a 3.0 or higher mix in a digital file and [2a] can Mr. Projectionist's MacBook Air play it through the theater's sound system?

[1] Yes, the digital video file formats you mentioned both allow for multi-channel sound in the form of a poly-wav (which is converted to a multi-channel AAC format), so everything from 3.0 all the way up to 7.1 is supported.
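For what it's worth, writing a multi-channel poly-wav needs nothing exotic. Here's a sketch using only Python's standard library; the 5.1 channel order, the filename and the test tone are all illustrative, and real deliverables are usually 24-bit rather than the 16-bit used here:

```python
import math
import struct
import wave

RATE = 48000   # standard film/video sample rate
CHANNELS = 6   # 5.1, e.g. L, R, C, LFE, Ls, Rs (order conventions vary!)

with wave.open("mix_51.wav", "wb") as w:
    w.setnchannels(CHANNELS)
    w.setsampwidth(2)        # 16-bit PCM for this sketch
    w.setframerate(RATE)
    frames = bytearray()
    for i in range(RATE):    # one second of audio
        # quiet 440 Hz test tone on every channel; real mix stems go here
        sample = int(8000 * math.sin(2 * math.pi * 440 * i / RATE))
        frames += struct.pack("<h", sample) * CHANNELS
    w.writeframes(bytes(frames))
```

The samples for all six channels are interleaved within each frame, which is exactly what "poly-wav" means: one file, many channels, as opposed to six separate mono files.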

[2] and [2a] are really the same question. Mr. Projectionist's MacBook Air can output up to 7.1, but does Mr. P. know how to set up his Mac to do this and integrate that output with the theatrical system? Even if the answer is "yes", there's still the issue of levels at mid and lower tier fests I mentioned above. Conclusion: it's a crap shoot! Logically, a festival which advertises itself as "a film festival" should be able to perfectly screen an actual theatrical film! In practice though, a theatrical film is specifically a DCI/SMPTE DCP with a theatrical sound mix, and anything else is going to require some jerry-rigging, with unpredictable results.

The only sensible advice I can give is check with the individual festival/s and only assume an actual theatrical mix will play as intended at the top festivals.

G
 