
Editing Sound Like a Pro

Professional-quality sound editing is crucial to a finished film. While not all films can afford a dedicated sound editor, with some effort, it’s possible to edit sound yourself. The minimum requirements are a copy of Pro Tools (or a comparable Digital Audio Workstation, though Pro Tools is the standard in the film industry) and patience. Sound editing is primarily concerned with:

  • Ensuring proper sync between video and audio.
  • Eliminating clicks, pops and other artifacts that arise during the video-editing process.
  • Eliminating extraneous sounds from production.
  • Ensuring proper timing relationships between music, sound effects, Foley and ADR (automatic dialogue replacement).

In this short guide, we will go over the basics of completing each of these tasks. Before we begin, it’s important to note that sound editing is NOT the same as sound mixing. Audio mixing is concerned with blending the levels of all the tracks in a mix and with adding effects such as compression and reverb; its last step for film is exporting the sound locked to the video in both stereo and surround versions. Mastering is a subset of mixing that deals with optimizing a mix for a variety of listening environments. Sound editing, by contrast, goes hand-in-hand with video editing and is focused on proper timing and the seamless assembly of all the sound in a film.

Ensuring Sync

Sync issues arise for a number of reasons, including incompatible frame rates between versions of the video and transfer issues between the audio and video-editing programs. Note that audio DOES NOT have a frame rate. This is a common misconception. Digital audio is recorded with a sample rate (the number of times per second the signal is sampled) and a bit depth (the number of bits used to store the amplitude at each sample). Technical standards for recording and editing digital audio for video are a 48-kHz sample rate and 24-bit depth, which should be maintained from the recording stage through the video-editing and sound-editing stages. Changes in sample rate and bit depth won’t normally cause syncing issues. They will, however, lead to other problems, including digital sound artifacts and a general degradation of audio quality.
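For a concrete sense of what those numbers mean, here is a quick back-of-the-envelope calculation in plain Python (a single mono track is assumed) of the data rate implied by the 48-kHz, 24-bit standard:

    # Rough arithmetic for uncompressed PCM audio at the film-standard
    # 48 kHz / 24-bit settings mentioned above (one mono track assumed).
    sample_rate = 48_000   # samples per second
    bit_depth = 24         # bits stored per sample
    channels = 1           # a single boom or lavalier track

    bits_per_second = sample_rate * bit_depth * channels
    megabytes_per_minute = bits_per_second * 60 / 8 / 1_000_000

    print(f"Data rate: {bits_per_second / 1000:.0f} kbps")          # 1152 kbps
    print(f"Storage:   {megabytes_per_minute:.2f} MB per minute")   # 8.64 MB

Nothing here is specific to Pro Tools; it is simply the arithmetic of sample rate times bit depth.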

Counterintuitively, the more common cause of audio sync issues lies with changes in the video frame rate. It is crucial that the frame rate be standardized between recording and every stage of editing (standard frame rates for different media are available here). It will save time during editing if the audio is synced during the production process, which can be achieved through the use of a cable following S/PDIF or TDIF protocols linking the camera and the sound recorder. A full breakdown of the terminology and consumer/professional variations is available here.
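To see why even a small frame-rate mismatch matters, the sketch below (plain Python, purely illustrative numbers) estimates the drift that accumulates when picture shot at 23.976 fps is conformed to a 24 fps timeline:

    # Illustrative only: how much sync drift a frame-rate mismatch causes.
    # Assumes picture shot at 23.976 fps (24000/1001) but cut on a 24 fps timeline.
    shot_fps = 24000 / 1001     # the exact "23.976" rate
    conform_fps = 24.0
    runtime_seconds = 60 * 60   # one hour of material

    # The conformed picture plays back slightly fast relative to real time,
    # so audio laid against it slips by the ratio of the two rates.
    drift_seconds = runtime_seconds * (1 - shot_fps / conform_fps)
    print(f"Drift after one hour: {drift_seconds:.2f} seconds")  # roughly 3.6 s

A few seconds of drift over an hour is more than enough to make lip sync visibly wrong, which is why the frame rate has to be locked down before anyone starts cutting.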

Sound devices with syncing capabilities are significantly more expensive, so many low-budget productions choose instead to sync the external-microphone audio to the camera’s internal scratch audio during editing. Sometimes a perfect sync actually sounds hollow, because it ignores the time sound takes to travel across a room and destroys the illusory space between the audience and the camera’s subject. Possible corrections include adding reverb with a plug-in (though we won’t get into that here, as it’s more of a mixing topic) and manually un-synchronizing the audio by delaying it 5-15 milliseconds, a change that isn’t consciously noticeable but can be psychologically effective.
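If you do try that small manual delay, it helps to think of it both in samples and as a physical distance. A minimal sketch, assuming the 48-kHz standard and sound travelling at roughly 343 m/s in air:

    # Translate the 5-15 ms delay described above into samples at 48 kHz
    # and into the equivalent mic-to-subject distance in air.
    SAMPLE_RATE = 48_000      # Hz, the film-standard rate
    SPEED_OF_SOUND = 343.0    # m/s at roughly room temperature

    for delay_ms in (5, 10, 15):
        delay_samples = round(delay_ms / 1000 * SAMPLE_RATE)
        distance_m = delay_ms / 1000 * SPEED_OF_SOUND
        print(f"{delay_ms:2d} ms -> {delay_samples:4d} samples "
              f"(~{distance_m:.1f} m of acoustic travel)")

In other words, a 10 ms delay is roughly what a listener standing three or four metres from the subject would hear naturally, which is why the trick works.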

The final important way to ensure sync during the editing process is to use the OMF (Open Media Framework) format when transferring projects between video and audio-editing programs. An OMF export preserves the edit information along with the audio media, so the session survives the round trip between Final Cut or Avid and Pro Tools or Logic intact.

Eliminating Clicks and Pops

Clicks and pops arise when two clips that were not originally adjacent are butted up against each other: the abrupt jump in the waveform at the cut point reads as a short percussive tick. The fix is a short crossfade between the two clips. You’ll want an equal-power rather than an equal-gain crossfade for a smoother transition; the short version of the science is that equal-power fades keep the combined loudness constant through the transition, while equal-gain fades dip in perceived volume at the midpoint. Projects with many dialogue, music, sound-effects and Foley tracks (grouped into what the sound world calls audio stems) and a complicated video edit will often have thousands upon thousands of cuts in the audio, and it’s not always possible to manually crossfade every single transition, especially when on a deadline.
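The difference between the two fade types is easy to show numerically. Here is a minimal numpy sketch (not tied to any particular DAW) demonstrating that an equal-gain fade dips to half power at its midpoint while an equal-power fade keeps the combined power constant:

    import numpy as np

    # Compare equal-gain (linear) and equal-power crossfade curves.
    # Assumes the two clips are roughly uncorrelated, the usual case
    # when joining non-adjacent material.
    t = np.linspace(0.0, 1.0, 5)   # a handful of points across the fade

    # Equal-gain: the amplitudes sum to 1, but combined power dips to 0.5.
    out_lin, in_lin = 1 - t, t
    power_lin = out_lin**2 + in_lin**2

    # Equal-power: the squared gains sum to 1, so loudness stays constant.
    out_ep, in_ep = np.cos(t * np.pi / 2), np.sin(t * np.pi / 2)
    power_ep = out_ep**2 + in_ep**2

    print("equal-gain power: ", np.round(power_lin, 2))  # [1. 0.62 0.5 0.62 1.]
    print("equal-power power:", np.round(power_ep, 2))   # [1. 1. 1. 1. 1.]

That midpoint dip in the equal-gain curve is what makes linear crossfades sound like a brief drop-out on unrelated material.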

The best way to deal with clicks and pops, in my opinion, is to solo out tracks one at a time and use your ears, with the volume turned up. Any audible clicks should be immediately dealt with; on the other hand, if you can’t hear a click with the track soloed, it will almost certainly be fine in the final mix. Clicks and pops occur at every stage of editing, as every video pass implies a new audio edit, and are generally still being removed during the final mixdown.
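If you want a rough automated pre-pass before the listening stage, one simple approach is to flag abnormally large sample-to-sample jumps. The helper below is hypothetical and its threshold is an arbitrary starting point; it generates a list of places to audition, not a substitute for soloing tracks and using your ears:

    import numpy as np

    def flag_possible_clicks(samples, sample_rate, jump_threshold=0.3):
        """Return the times (in seconds) of sample-to-sample jumps large
        enough to read as a click. Assumes the signal is normalized to
        the -1..1 range; the threshold is a rough, adjustable guess."""
        diffs = np.abs(np.diff(samples))
        hit_indices = np.where(diffs > jump_threshold)[0]
        return [float(i) / sample_rate for i in hit_indices]

In practice you would run something like this around clip boundaries rather than across a whole stem, and still confirm every hit by ear.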

Eliminating Extraneous Sounds From Production

With this step, we see one of the bigger differences between a professional and an amateur sound editor. Without a serious time investment, it’s difficult to rescue audio from a bad take or to selectively edit frequency ranges to remove wind noise without harming the dialogue. Fine-tuned elimination of extraneous sounds often involves complex multi-band, time-dependent equalization of the signal, as well as manually redrawing short stretches of the waveform. This is where sound editors really earn their pay; on a low-budget production, it’s probably best to simply scrap any audio with too much extraneous production noise.
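That said, the simplest first move on something like wind rumble is often a gentle high-pass filter on the dialogue track, which most DAWs provide as a stock EQ. The sketch below shows the same idea in Python, assuming scipy is available; the 80 Hz corner is a common starting point for dialogue, not a universal rule, and real cleanup quickly becomes the multi-band, time-dependent work described above:

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def highpass_dialogue(samples, sample_rate, cutoff_hz=80.0):
        """Apply a 4th-order Butterworth high-pass filter to tame
        low-frequency rumble (wind, handling noise) on a dialogue track."""
        sos = butter(4, cutoff_hz, btype="highpass", fs=sample_rate, output="sos")
        return sosfiltfilt(sos, samples)

Anything more surgical than this (notching out a passing car, ducking a hum only while it is audible) is where the serious time investment begins.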

Ensuring Proper Timing Relationships

Foley is the overdubbing of (generally naturalistic) sound effects onto a production in post. Hiring a Foley artist for a day is well worth it for verisimilitude and will often resolve many of the issues related to production noise. ADR, on the other hand, generally sounds terrible unless great attention is paid to it. That means rerecording the actors in a proper sound studio, giving them prep time to get back into their roles, and taking care that the artificial reverberation on their voices matches the room in which the scene was originally recorded. In other words, unless the film has a proper budget for it, ADR often leads to less rather than more professional-sounding audio.

Properly placing ADR and Foley is a time-consuming task that demands attention to both sync and the contextual flow of the audio. For the placement of music, it’s best to work with a dedicated music editor. If that isn’t possible, film composers are often competent music editors themselves; the catch is that they tend to over-prioritize their own contributions. This is often resolved during the rerecording mixdown session.

This covers the main concerns of a professional sound editor. Poor sound in a final film is a giveaway of amateurism, even if the picture looks pristine. Proper editing and mixing are crucial to professional-quality audio. By following these guidelines and consulting the help files of your DAW, you’ll be well on your way to sound-editing your film projects like a pro.