Module 53
Updated: 08/25/2005
Technical Continuity

Any noticeable, abrupt, or undesirable change in audio or video during a production is referred to as a technical continuity problem. We tend to accept some technical continuity problems; others we don't. News and documentaries are often shot under drastically different conditions, so we tend to accept such things as changes in video color balance or audio ambiance between scenes. But in dramatic productions we don't want technical inconsistencies diverting our attention from the storyline. In this type of production the medium (television) should be totally "transparent" so there's nothing to get in the way of the message (the story).
Audio Continuity Problems

Audio continuity problems can be caused by a wide range of factors, including shot-to-shot variations in audio levels, microphone distance, acoustics, and background sound.
In single-camera production most of these inconsistencies may not be easy to detect on location; it's only when the various shots or takes start to be assembled during editing that you discover the problem. As you cut from one scene to another, you may discover that the talent suddenly seems to move closer or farther away from the mic, or that the level or type of background sound changes (passing traffic, an air conditioner, or whatever). Some problems can be helped with the skilled use of graphic equalizers or reverberation units. Changes in background sound can sometimes be masked by recording a bed of some additional and consistent sound, such as music or street noise. As in most of life, it's easier to avoid problems than to fix them - assuming there even is a way to fix them.
First, be aware that mics used at different distances reproduce sounds differently. This is due to changes in the surrounding acoustics, as well as the fact that certain frequencies diminish over distance. Although some expensive directional mics will minimize the effect of distance, most mics exhibit proximity or presence effects. A good pair of padded earphones placed on top of a set of well-trained ears can help in detecting audio differences.

With the increased reliability of wireless mics, many production facilities are equipping actors with their own personal mics. Since the mic is generally hidden in the person's clothing, its distance from the mouth doesn't change, and because of the mic's proximity, background sounds tend to be suppressed. The points made earlier about using personal mics should be kept in mind here.

Finally, you need to watch for changes in background sounds. For example, the sound of a passing car or a motorcycle may abruptly appear or disappear when you cut to a shot that was recorded at a different time. Even if an obvious background sound doesn't disappear, its level may change when you cut from one person to another. This may be due to differences in microphone distance coupled with the level adjustments needed to compensate for the different strength of voices.

A scene with running water would make a beautiful background for an interview, but the water could create major sound problems - especially for a single-camera interview or a dramatic production. Audio technicians will typically want to keep the camera or audio recorder running for a minute or so after an interview so that the ambient sound of the location can be recorded. A moment of "silence" may be needed when editing the interview together, or a bed of background sound may be desirable to even out differences between speakers. Sounds of traffic, which could even come from a sound-effects CD, can also be used in this way.
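If you want to experiment with this idea outside an editing program, here is a minimal sketch of mixing a consistent room-tone bed under an edited dialogue track. It is only an illustration under assumptions: it uses the Python pydub library and made-up file names, and it stands in for what you would normally do on an audio timeline rather than for any particular editing system.

```python
from pydub import AudioSegment

# Load the edited dialogue track and the minute of room tone recorded on location.
dialogue = AudioSegment.from_file("interview_edit.wav")   # hypothetical file name
room_tone = AudioSegment.from_file("room_tone.wav")       # hypothetical file name

# Loop the room tone until it is at least as long as the dialogue, then trim it
# to the same length (pydub lengths and slices are in milliseconds).
loops = len(dialogue) // len(room_tone) + 1
bed = (room_tone * loops)[:len(dialogue)]

# Keep the bed well below the dialogue so it masks ambience jumps
# without competing with the voices.
bed = bed - 18  # reduce the bed's level by 18 dB

# Mix the bed under the dialogue, easing it in and out at the edges.
mixed = dialogue.overlay(bed.fade_in(500).fade_out(500))
mixed.export("interview_with_bed.wav", format="wav")
```

The 18 dB figure is only a starting point; in practice you would set the bed level by ear so the ambience differences disappear while the bed itself goes unnoticed.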
Music can smooth the transition between segments and create overall production unity - if it's used in the right way. Background music should add to the overall mood and effect of the production without calling attention to itself. The music selected should match the mood, pace, and time period of the production. Vocals should be avoided when the production contains normal (competing) dialogue.

Ideally, the beginning of a musical selection should coincide with the start of a video segment and end as the segment ends. This almost never happens, of course, at least without a little production help. With today's computer-based editing systems musical selections can be shortened or lengthened to accommodate the accompanying video. This is not too hard when a piece contains repetitive sections separated by momentary pauses - you can simply take out segments to shorten the piece and repeat segments to lengthen it. To a limited degree you can electronically speed up or slow down instrumental segments with digital editing equipment, especially if the music is not well known. (To a degree you can do the same thing with video. Rather than edit out segments, feature films shown on television are sometimes speeded up slightly to provide time for extra commercials.)

Fading music out "midstream" so it concludes at the end of a video segment creates a noticeable continuity problem. You can avoid this by backtiming the music. If the music is longer than the video, you start the music a predetermined amount of time before starting the video, then fade the music in as the video starts. If you calculate things correctly, the music and the video will end at the same time. Let's assume, for example, that a music selection is two minutes and 40 seconds long and the video is only two minutes long. By starting the audio 40 seconds before the video and fading it in with the start of the video, they should both end together. As we'll see later, all of this is fairly easy when you are using a nonlinear, computer-based editing system. (Everything is visible on the computer screen's timeline.) With linear editing the process takes a bit more work and planning.
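To make the backtiming arithmetic concrete, here is a small Python sketch (illustration only; the function name is made up) that computes how far ahead of the video the music should start:

```python
def backtime_offset(music_seconds, video_seconds):
    """Return how many seconds before the video the music must start
    so that both end together."""
    if music_seconds < video_seconds:
        raise ValueError("Music is shorter than the video; backtiming won't help.")
    return music_seconds - video_seconds

# The example from the text: a 2:40 music cue against a 2:00 video segment.
offset = backtime_offset(2 * 60 + 40, 2 * 60)
print(f"Start (and fade in) the music {offset} seconds before the video.")  # 40
```

On a nonlinear timeline this amounts to sliding the music clip back by that amount; with linear editing you would roll the music that many seconds before cueing the video.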
Video Continuity Problems

Video has its own continuity problems - for example, shot-to-shot changes in color balance, skin tones, luminance levels, and subject position.
Intercutting scenes from cameras with noticeably different color characteristics (color balance) will be immediately apparent to viewers. To alleviate this problem, all cameras should be carefully color balanced and compared before a production. This is especially important if multiple cameras are being used and the shots will later be cut together. (You may remember that we previously discussed setting up both monitors and cameras.) Once the cameras are color balanced and matched, an electronic test pattern containing all of the primary and secondary colors should be recorded at the beginning of the videotape. This is later used to color balance the video playback.

When mismatched shots are intercut, several things subtly change - especially skin tones and overall color balance. Cutting from a closeup to a two-shot can also create a problem if the subject's position changes between the shots.

Editing systems often make use of a vectorscope for adjusting colors on tapes before editing starts. A vectorscope and a waveform monitor are both part of the software of professional nonlinear editing systems, and these editors allow you to change the basic color balance of scenes. However, when you try to match different video sources, subtle differences between some colors may not be satisfactorily correctable. This is why the initial color balancing of cameras is so important. Sophisticated nonlinear editing systems provide a range of color balance and luminance controls for this kind of correction.
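The idea behind using a recorded neutral reference (such as the gray areas of a test pattern) can be illustrated with a short sketch. This is a simplified, assumed approach using Python and NumPy with stand-in data - professional editors give you far more refined controls - but it shows how per-channel gains derived from a neutral patch can pull one camera's footage toward a common balance.

```python
import numpy as np

def balance_gains_from_reference(reference_patch):
    """Given RGB samples from a neutral (gray/white) patch in the recorded
    test pattern, return per-channel gains that make the patch neutral."""
    means = reference_patch.reshape(-1, 3).mean(axis=0)
    return means.mean() / means  # scale each channel toward the overall average

def apply_balance(frame, gains):
    """Apply the per-channel gains to a frame and clip back to 8-bit range."""
    corrected = frame.astype(np.float32) * gains
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Hypothetical usage: 'frame' stands in for an H x W x 3 RGB frame from one
# camera, and 'patch' for a small region of that camera's recorded gray chip.
frame = np.random.randint(0, 256, (480, 720, 3), dtype=np.uint8)
patch = frame[10:30, 10:30]
gains = balance_gains_from_reference(patch)
balanced = apply_balance(frame, gains)
```

In real work the reference would be the test pattern recorded at the head of each camera's tape, and the correction would be judged on a vectorscope and waveform monitor rather than computed blindly.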