Chapter 16 Main Points

►Sound effects (SFX) can be classified as anything sonic that is not speech or music. They bring realism and added dimension to a production; they shape what you see.

►Sound effects perform two general functions: contextual and narrative. Contextual sound emanates from and duplicates a sound source as it is. It is also known as diegetic sound, coming from within the story space. (Nondiegetic sound, or extra-sound, comes from outside the story space; music underscoring is an example of nondiegetic sound.) Narrative sound adds more to a scene than what is apparent.

►Narrative sound can be descriptive or commentative.

►Sound effects can break the screen plane; define space; focus attention; establish locale; create environment; emphasize, intensify, and exaggerate action; depict identity; set pace; provide counterpoint; create humor, metaphor, paradox, and irony; symbolize meaning; animate inanimate objects; and unify transition.

►Silence can be used to enhance sonic effect, particularly in situations where sound is expected or anticipated.

►Generally there are six types of sound effects: hard (or cut) effects, soft effects, Foley effects, ambience (or background) effects, electronic effects, and design effects.

►The two primary sources of sound effects are prerecorded and produced.

►Prerecorded sound effects, which can number from several dozen to several thousand, are available in libraries. The major advantage of sound-effect libraries is that for relatively little cost many different, perhaps difficult-to-produce, sounds are at your fingertips. The disadvantages include lack of control over the dynamics and timing of the effects, possible mismatches in ambience, the possibility that the effects may sound canned, and effects that are not long enough for your needs.

►Sound-effect libraries (and any recorded sound) can be manipulated with a variety of methods, such as varying a sound’s playing speed, playing it backward, looping a sound, and altering it through signal processing.

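The manipulations just listed can be sketched in a few lines of code. The sketch below is a minimal illustration, not any particular workstation’s feature set; it assumes Python with NumPy and a hypothetical 16-bit mono WAV file named door_slam.wav.

```python
# Minimal sketch of common library-effect manipulations: reversing, looping,
# and varying playing speed. Assumes a 16-bit mono WAV file ("door_slam.wav")
# and NumPy; file names and the speed factor are illustrative.
import wave
import numpy as np

def read_wav(path):
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        data = np.frombuffer(wf.readframes(wf.getnframes()), dtype=np.int16)
    return rate, data

def write_wav(path, rate, data):
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)            # 16-bit samples
        wf.setframerate(rate)
        wf.writeframes(data.astype(np.int16).tobytes())

rate, fx = read_wav("door_slam.wav")

write_wav("reversed.wav", rate, fx[::-1])        # play the effect backward
write_wav("looped.wav", rate, np.tile(fx, 4))    # loop it to extend its length

# Vary playing speed by stepping through the samples faster than real time;
# a step of 1.5 raises the pitch and shortens the effect.
idx = np.arange(0, fx.size, 1.5).astype(int)
write_wav("faster.wav", rate, fx[idx])
```
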
►Producing live sound effects in the studio goes back to the days of radio drama. Producing effects in synchronization with the picture is known as Foleying, named for former film soundman Jack Foley, although today Foley refers to produced effects in general.

►Many types of sound effects can be produced vocally.

►Foley effects are produced in a specially designed sound studio known as a Foley stage, an acoustically dry studio to keep ambience from adding unwanted sounds to the recording.

►The keys to creating a sound effect are analyzing its sonic characteristics and then finding a sound source that contains similar qualities, whatever that source may be.

►The capacitor microphone is most frequently used in recording Foley effects because of its ability to pick up subtleties and capture transients (fast bursts of sound). Tube-type capacitors help smooth transients and warm digitally recorded effects.

►A critical aspect of Foley recording is making sure that the effects sound consistent and integrated and do not seem to be outside or on top of the sound track.

►Microphone placement is critical in Foley production because the on-screen environment and perspective must be replicated as closely as possible on the Foley stage.

►Important to successful Foleying is having a feel for the material: to become, in a sense, the on-screen actor by recognizing body movements and the sounds they generate.

►In addition to the obvious preparations for Foley recording—making sure that all props are on hand and in place—it is also important to be suitably dressed and physically fit.

►Instead of, or in addition to, Foley recording, many directors choose to capture authentic sounds either as they occur on the set during production or by recording them separately in the field, or both.

►If sounds are recorded with the dialogue as part of the action, getting the dialogue is always more important than getting the sound effect.

►When a director wants sound recorded with dialogue, the safest course of action is to record it with a different mic from the one being used for the talent.

►An advantage of multiple-miking sound effects is the ability to capture various perspectives of the same sound.

►In field recording, utmost care should be taken to record the sound effect in perspective, with no unwanted sound, especially wind and handling noise.

►As with most sound-effect miking, the directional capacitor is preferred for field recording. Parabolic, middle-side (M-S), and stereo mics may also be used in the field.

►A key to collecting successful live effects in the field is making sure that the recorded effect sounds like what it is supposed to be, assuming that is your intention.

►Ambience sound effects, also known as background effects or backgrounds, are used to fill the screen space, providing a sense of environment, location, time of day, indoor or outdoor settings, and so on. They also smooth changes in presence, provide continuity in scene changes, give context to the principal sound, and mask noise on dialogue tracks.

►One approach to creating ambience is known as worldizing: recording room sound to add the sound of that space to a dry recording, or using it to enhance or smooth ambient backgrounds that are already part of the dialogue track.

►Sound effects can also be generated electronically with synthesizers and computers and by employing MIDI, an approach called electronic Foley. A synthesizer is an audio instrument that uses sound generators to create waveforms. Computer-generated sound effects can be created with preprogrammed software or software that allows sounds to be produced from scratch.

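As a rough illustration of producing an effect from scratch, the sketch below generates a simple swept tone with a decaying envelope and writes it to a WAV file. It assumes Python with NumPy; the frequencies, envelope, and file name are arbitrary choices, not a prescribed electronic-Foley method.

```python
# Minimal sketch: synthesizing a simple "electronic Foley" effect from scratch.
# A downward frequency sweep with an exponentially decaying envelope stands in
# for a designed effect (e.g., a falling "zap"). All parameters are illustrative.
import wave
import numpy as np

SAMPLE_RATE = 44100                              # samples per second
DURATION = 1.0                                   # seconds

t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
freq = np.linspace(880, 110, t.size)             # downward pitch sweep (Hz)
phase = 2 * np.pi * np.cumsum(freq) / SAMPLE_RATE
envelope = np.exp(-4 * t)                        # decay shapes the attack and fade
signal = 0.8 * envelope * np.sin(phase)

# Write a 16-bit mono WAV file using only the standard library.
pcm = (signal * 32767).astype(np.int16)
with wave.open("zap_effect.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)                           # 16-bit
    wf.setframerate(SAMPLE_RATE)
    wf.writeframes(pcm.tobytes())
```
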
►An important sound-shaping capability of many electronic keyboards and computer software programs is sampling, a process whereby digital audio representing a sonic event, acoustic or electroacoustic, is stored on disk or in electronic memory. It can then be signal-processed into a different sound or expanded to serve as the basis for a longer sonic creation.

►In recording samples it is important to take your time and record them with the highest fidelity; if a sample has to be rerecorded, it might be difficult to reconstruct the same sonic and performance conditions.

►The main difference between produced and design effects is that design effects are created after recording, during editing, and usually involve deconstructing the waveform to create an almost entirely new effect.

►Database systems and software programs used to manage prerecorded sound libraries facilitate searching, locating, and auditioning a sound effect in seconds.

►Ripping, also referred to as digital audio extraction, is the process of copying audio (or video) data from one media form, such as a CD or DVD, to a hard disk. To conserve storage space, copied data are usually encoded in a compressed format.

►Organizing and naming sound-effect files in a production requires a system whereby a file can be identified in any combination of ways: by category, name of effect, take number, length, source, sampling rate, bit depth, channels, and any remarks. This requires a software management program to organize the data and make it easily retrievable.

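One minimal way to model such a system is a structured record per effect plus a simple keyword search, as sketched below. The field names follow the list above; the records, values, and search logic are illustrative assumptions, not a specific commercial library-management product.

```python
# Minimal sketch of a sound-effect metadata record and a simple keyword search.
# Field names mirror the identification criteria listed above; all values are
# hypothetical examples.
from dataclasses import dataclass

@dataclass
class SoundEffectRecord:
    category: str          # e.g., "Doors"
    name: str              # e.g., "Wooden door slam"
    take: int
    length_sec: float
    source: str            # e.g., "Foley stage" or a library name
    sample_rate: int       # Hz
    bit_depth: int
    channels: int
    remarks: str = ""

library = [
    SoundEffectRecord("Doors", "Wooden door slam", 3, 1.8, "Foley stage", 48000, 24, 1),
    SoundEffectRecord("Weather", "Distant thunder", 1, 12.4, "Field recording", 96000, 24, 2,
                      remarks="Slight wind noise at tail"),
]

def search(records, keyword):
    """Return records whose category, name, or remarks mention the keyword."""
    kw = keyword.lower()
    return [r for r in records
            if kw in r.category.lower() or kw in r.name.lower() or kw in r.remarks.lower()]

for hit in search(library, "door"):
    print(f"{hit.category}/{hit.name} take {hit.take}: {hit.length_sec}s @ {hit.sample_rate} Hz")
```
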
►Spotting sound effects involves going through the script or edited work print and specifically noting on a spotting sheet each primary and background effect that is called for. A spotting sheet indicates not only the sound effect but also whether it is synchronous or nonsynchronous, its in- and out-times, and its description. If the sounds have been recorded, their dynamics, ambience, the way they begin and end, and any problems that require attention are also noted.
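
A spotting sheet can likewise be modeled as structured data. The sketch below uses hypothetical field names based on the items mentioned above (effect, sync/nonsync status, in- and out-times, description, notes); it is an illustration, not a standard spotting-sheet format.

```python
# Minimal sketch of a spotting-sheet entry; all field names, timecodes, and
# example entries are illustrative.
from dataclasses import dataclass

@dataclass
class SpottingEntry:
    effect: str                  # e.g., "Car door close"
    synchronous: bool            # True if the effect must sync to picture
    time_in: str                 # timecode, e.g., "01:02:10:12"
    time_out: str
    description: str
    notes: str = ""              # dynamics, ambience, attack/decay, problems

sheet = [
    SpottingEntry("Car door close", True, "01:02:10:12", "01:02:11:00",
                  "Hero exits car, slams door",
                  notes="Needs harder attack than library take"),
    SpottingEntry("Distant traffic", False, "01:02:05:00", "01:03:40:00",
                  "Background ambience, city street at night"),
]

for entry in sheet:
    sync = "SYNC" if entry.synchronous else "NONSYNC"
    print(f"[{sync}] {entry.time_in}-{entry.time_out}  {entry.effect}: {entry.description}")
```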