The movie Twelve Angry Men is a classic example of how filmmakers brilliantly maneuvered their way around potential crossing-the-line traps, because virtually the whole movie is a group shot of twelve jurors seated around a table. I will show examples below with diagrams and frame grabs from two scenes, and I will refer to them again as I deconstruct this movie in Chapter 13.
The way the editor, Carl Lerner, takes on the dreaded group scene is to initially establish where all the jurors are seated. He does this by cutting to shots which angle down at the jurors from all four corners of the table. After the first high shot (shown in frame grab #1-3), the editor cuts to another shot (seen in frame grab #1-4) that crosses the line completely and reverses everyone’s direction, so the audience sees two clearly opposing angles. The same is true of another pair of shots (frame grabs #1-8 and #1-9) which then completes the coverage of all four corners of the table, illustrated in Diagram 1.
frame grab 1-3
frame grab 1-4
frame grab 1-8
frame grab 1-9
In the next two shots, the actors are positioned so that when the editor cuts from one group (frame grab #1-10) to the next (frame grab #1-11), only actors Fonda and Binns appear in both shots. Thus he has to honor only one stage line, as shown in Diagram 2.
frame grab 1-10
frame grab 1-11
frame grab 1-14
frame grab 1-15
frame grab 1-16
Frame grab #1-14 is the same angle as frame grab #1-10, but now the interaction shifts to a different pair of actors, Warden and Fonda, and a new stage line is established, as seen in Diagram 3. When the editor cuts to the next two shots (shown in frame grabs #1-15 and #1-16), only those two actors appear and they continue to stay on the left and right sides of the screen, respectively. If the camera had crossed that line from frame grab #1-14 to frame grab #1-15, the actors’ positions would have been flipped and the audience would have become confused.
The editor is sometimes able to violate a stage line rule when he knows the audience is looking elsewhere. For instance, when the editor cuts from one group of jurors (frame grab #2-5) to a “reverse angle,” the opposite perspective of a shot already taken (frame grab #2-6), there are six actors seen in each shot. The dramatic focus is between actors Balsam and Fonda, who remain in left-right positions from one shot to the next, honoring Stage Line A, as seen in Diagram 4. The positions of the jurors sitting between them are flipped from one shot to the next — their Stage Line B is crossed — but the editor knows the audience won’t notice, since they’re not focused on the other jurors.
frame grab 2-5
frame grab 2-6
Honoring the stage line is most important in dialogue scenes, because it’s crucial that the audience knows where the actors are looking and at whom. On the other hand, if the editor is cutting an action scene, he will probably want to cross the line and purposely break the rules, because that visual disorientation gives the scene energy and excitement.
CAMERA ANGLES
Where the camera is placed in relation to its subject has a distinct effect on the audience’s perception. Because of this, an editor has to be aware of the impact of various camera angles and how to use them most effectively to convey the story.
In terms of camera distance, there is a lot of variation in how filmmakers label different shots. This is especially true of mid-range shots. For example, a “medium full shot” is sometimes called a “medium shot.” For the purposes of this book, the choices will be simplified to the following:
Overview of Shot Sizes
Tight close-up: Cuts off part of chin and top of head
Close-up: Head
Close shot: Head to shoulders or breast
Medium close shot: Head to waist
Medium shot/medium full shot: Head to knees
Full shot: Whole body
Medium long shot: Middle distance showing small group or some geography
Long shot: Full geography
The long shot has enough distance to show the actors in relation to each other and their surroundings. If it’s completely inclusive, it’s one and the same as the master shot. The opposite extreme, the close-up, is an editor’s most powerful weapon, so it should be saved for when it can be used to its maximum effect. It can show the interior life of a character or be used just for dramatic emphasis. Because close-ups have less physical context, they can also be used as setups to surprise or scare the audience, or as cutaways to fix a problem. But if a close-up isn’t motivated or necessary, the audience will feel bounced around and be too aware of the cutting. If it’s not clear which actor to cut to, a “two shot” (two actors within the frame), or an “over-the-shoulder shot” (shooting past an actor’s shoulder or part of his head to another actor) may be preferable. Also, usually the editor should not cut to a shot that’s only slightly closer or further away than the previous shot, because the change will not give the audience enough new information and it will be disorienting.
The same may happen with a slight change in the positioning of the subject in relation to its background. For example, imagine an actor standing in front of a tree. If his position changes just a little from shot to shot, the audience will be confused and wonder if the tree behind the actor moved slightly to one side. But if the audience sees the actor in profile after a straight-on shot, which is a dramatic change in angle, they will expect the tree to be in an entirely different position, and they won’t be confused. In fact, the greater the change in angle, the more an editor will be able to create distraction and a smoother cut. A cut that involves no change in angle but a change in distance — such as cutting in close from the same vantage point — can be very effective, but it is not naturally as smooth as a change in angle.
An editor must be aware of the psychological impact of angles. For instance, when the camera looks down on an actor, it creates a perspective that can make him appear more benign or powerless. If this angle represents the point of view of another actor looking down at him, he will appear even more victimized. An angle looking up at an actor usually implies a more menacing perspective. This angle can also have a different impact depending on the context. If the audience admires this character he may seem more heroic. All told, extreme perspectives usually have a powerful effect of some sort.
CAMERA LENSES
An editor must also know the visual and psychological impact of lenses so he can use the best one to serve the scene. The length of the lens has its own particular impact; it greatly affects the “depth of field,” or the range of sharp focus. For instance, if an actor is shot with a long lens, the background will be less in focus than if he is shot with a shorter lens. If an editor cuts between two actors who are shot with different lenses but from the same distance, the focus in the background will be somewhat different between the two shots. If done for no good reason, it will confuse the audience. However, if there are other changes between the two shots in addition to lenses — in distance or angle, assuming this is properly motivated — then the new information will be useful and not disorienting. The following two extremes illustrate the different effects.
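Before turning to those extremes, a rough numerical sketch may help show why the longer lens throws the background further out of focus. This is a standard photographic approximation, not a formula from this book, and the numbers are purely hypothetical: f is the focal length, N the f-number (the aperture setting), u the distance to the actor the lens is focused on, and b the blur circle describing how smeared a distant background point becomes.

```latex
\[
  b \;\approx\; \frac{f^{2}}{N\,u}
\]
\[
  \text{With } u = 3\,\mathrm{m} \text{ and } N = 2.8:\qquad
  b_{28\,\mathrm{mm}} \approx \frac{28^{2}}{2.8 \times 3000} \approx 0.09\,\mathrm{mm},
  \qquad
  b_{100\,\mathrm{mm}} \approx \frac{100^{2}}{2.8 \times 3000} \approx 1.2\,\mathrm{mm}
\]
```

In other words, with the actor, his distance, and the aperture all unchanged, swapping the short lens for the long one makes the background more than ten times blurrier.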
Wide angle
Using a wide-angle lens (a lens of less than normal length) makes the background seem more in focus, but the images appear to be farther from the camera and from each other. As a result, an actor or object moving toward or away from the camera will appear to change more dramatically in size and to move faster than it actually does. The wide-angle lens also has a large “field of view,” the area covered by a lens, which is useful when a lot of visual information must be included in confined locations. It can also magnify and distort a subject very close to the camera, possibly making it seem more threatening or disturbing.
Telephoto
Using a telephoto lens (a very long lens) makes images seem closer to the camera and to each other, so that the foreground, middle ground, and background are more compressed. Because the background seems closer, when the camera is following a traveling actor or object in the foreground, its speed will seem greater than it really is. The telephoto lens has a unique versatility when multiple cameras are used, because it can capture an effective close-up from a distance, without getting in the way of the other cameras. When the camera is close, it will also flatten the perspective of the subject, often making it appear more benign.
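To put hypothetical numbers on the difference between the two extremes just described: the horizontal field of view follows from simple geometry, where w is the width of the film gate or sensor and f the focal length. The gate width of roughly 25 mm below is assumed purely for illustration, not taken from this book.

```latex
\[
  \theta \;=\; 2\arctan\!\left(\frac{w}{2f}\right),
  \qquad
  \theta_{28\,\mathrm{mm}} \approx 2\arctan\!\left(\frac{25}{2 \times 28}\right) \approx 48^{\circ},
  \qquad
  \theta_{100\,\mathrm{mm}} \approx 2\arctan\!\left(\frac{25}{2 \times 100}\right) \approx 14^{\circ}
\]
```

That much narrower angle is what lets the telephoto magnify the background and compress the apparent distance between the actor and everything behind him.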
Sidney Lumet, the director of Twelve Angry Men, described how the psychological and visual aspects of both lenses and angles affected his choices in making his courtroom drama. (The length of a lens is measured in millimeters.)
One of the most important dramatic elements for me was the sense of entrapment these men must have felt in that room. Immediately a “lens plot” occurred to me as the picture unfolded. I wanted the room to seem smaller and smaller. That meant that I would slowly shift to longer lenses as the picture continued. Starting with the normal range (28 mm to 40 mm), we progressed to 50 mm, 75 mm, and 100 mm lenses. In addition, I shot the first third of the movie above eye level and then, by lowering the camera, shot the second third at eye level, and the last third from below eye level. In that way, toward the end, the ceiling began to appear. Not only were the walls closing in, the ceiling was as well. The sense of increasing claustrophobia did a lot to raise the tension of the last part of the movie. On the final shot, an exterior that showed the jurors leaving the courtroom, I used a wide-angle lens, wider than any lens that had been used in the entire picture. I also raised the camera to the high above-eye-level position. The intention was to literally give us all air, to let us finally breathe after two increasingly confined hours.2
CAMERA MOVES
An editor must understand the impact that camera moves have on the audience and the cumulative result of cutting those shots together. Here is a basic list of shots, ranging from least flexible to most flexible:
Zoom: The camera lens moves in or out with no loss of focus. (Because the zoom has an unnatural two-dimensional effect, it’s difficult to cut in smoothly on a zoom. It works best when shot without any real background, with a very slow move or combined with a more flexible move like a tilt, pan, or dolly. However, the very inflexibility of the zoom lens can make it useful for the editor if he wants to create a jarring effect.)
Tilt: The camera pivots up or down.
Pan: The camera pivots left or right.
Swish pan: Like a basic pan but very fast with blurs between its beginning and end points.
Dolly (aka Traveling or Tracking Shot): A move from a wheeled platform, or along tracks, traveling forward and backward.
Crane: A move from a wheeled platform with a boom, the arm on which the camera is mounted, which can raise and lower itself to many levels and swing to many angles.
Crab dolly: A move from a wheeled platform with a mounted camera and steering control, which can have a combination of moves in nearly any direction.
Steadicam: A handheld but stable camera, which is the most flexible of all.
OPTICALS
Opticals are effects that were originally created by optical printers, which used film projectors mechanically linked to a movie camera. This technique allowed filmmakers to re-photograph previously processed film to create transitions and composite effects.
To review the basics:
Dissolve: One shot gradually fades out at the same time the next shot fades in, so that at midpoint each shot is equally superimposed over the other. It is often used as a transition between scenes, either to slow time down or to show the passage of time, a change in place, or a connection between ideas, moods, or interior thoughts.
Fade: The outgoing shot gradually disappears into blackness or washes out to a white screen, and is called a “fade-out”; a “fade-in” is the reverse. Although both dissolves and fades can indicate a change in time or place, the fade creates a more complete, distinct break in the narrative.
Wipe: A dividing line — horizontal, vertical, diagonal, straight, jagged, or even invisible — sweeps across the screen and wipes out the shot to reveal an entirely new shot. It can be used like a fade or dissolve, but it’s a rather old-fashioned device.
Skip framing: By optically eliminating every second or third frame, the editor can speed up a sluggish shot, but he can only get away with it in a shot with very little motion, such as a static shot of an actor. “Double framing” has the opposite effect, slowing down the action by repeating frames two or three times. The same limitations apply.
Freeze frame: Repeated printing of a frame to extend the moment and make it more dramatic. (The dissolve, skip framing, double framing, and freeze frame are sketched below as simple operations on the frames of a shot.)
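As noted above, several of these opticals are, at bottom, simple arithmetic on the frames of a shot. The following is a minimal sketch of my own, not code from any real editing system; it treats a shot as a Python list of NumPy image arrays and, for the dissolve, assumes both shots run at least `length` frames.

```python
import numpy as np

def dissolve(shot_a, shot_b, length):
    """Cross-dissolve: over `length` frames, shot A fades out while shot B fades in;
    at the midpoint the two images are equally superimposed."""
    frames = []
    for i in range(length):
        t = i / max(length - 1, 1)            # 0.0 at the first frame, 1.0 at the last
        blended = (1 - t) * shot_a[i].astype(float) + t * shot_b[i].astype(float)
        frames.append(blended.astype(shot_a[i].dtype))
    return frames

def skip_frame(shot, keep_every=2):
    """Skip framing: keep only every 2nd (or 3rd) frame, so the action plays faster."""
    return shot[::keep_every]

def double_frame(shot, repeat=2):
    """Double framing: print each frame 2 (or 3) times, slowing the action down."""
    return [frame for frame in shot for _ in range(repeat)]

def freeze_frame(shot, at, hold):
    """Freeze frame: repeat the frame at index `at` an extra `hold` times to extend the moment."""
    return shot[:at + 1] + [shot[at]] * hold + shot[at + 1:]
```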
Since the 1980s, digital compositing has, for the most part, replaced optical printing, and has vastly expanded filmmakers’ options. Also, directors and editors are less dependent on transition opticals for telling a story, because today’s audiences are so savvy and quick to accept plot complexities. They don’t need, for example, a dissolve to tell them there’s been a passage of time. With digital technology, filmmakers can also create much more elaborate transitions. One of many possibilities is to have part of an outgoing frame inserted into the first frame of the next shot, creating a kind of mosaic. Digital art can also be animated and used to create a transition from one shot to the next.
Before computer technology, simple matte shots were created by photographically combining two or more elements and masking parts of each image to avoid double-exposure. An example would be combining a studio shot of an actor in a boat with a background “stock shot” (pre-existing film from a library) of a lake. Computers made it easier to create “traveling mattes” (mattes that change), because the images that are composited could be more easily synchronized. With the advent of CGI (computer-generated imagery), opticals have become even more complex. For example, actors can be filmed at different times against any imaginable background or special effect.
In this case, when the editor initially puts the movie in a cut, the actors may just be reacting to a blue or green screen that will later be replaced by the CGI-created background. The editor probably will have to continue cutting all the way through “postproduction,” the period after the movie is shot, until all the effects are completed and perfected. The editor may also have to cut in a digitally generated “character” that interacts with other live characters but may not yet be perfected or even created when the editor first puts the sequences together. He will have relatively little flexibility with such a character. He can’t, for instance, choose another take or change the length of the shot, except possibly to shorten it slightly, since each frame of animation is very expensive. He may even have to mentally act out the moves of the character, to time the scene out correctly and give the optical house guidelines.
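To make the idea of a matte concrete, the sketch below keys an actor off a green screen like the one just described and combines him with a background plate. It is my own simplification, not the workflow of any real compositing package, and the function names, the threshold value, and the image format (NumPy RGB arrays) are all hypothetical.

```python
import numpy as np

def green_screen_matte(frame, threshold=60):
    """Crude key: treat a pixel as background wherever green strongly dominates red and blue."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    is_green = (g - np.maximum(r, b)) > threshold
    return np.where(is_green, 0.0, 1.0)          # 1 = keep the actor, 0 = reveal the background

def composite(foreground, background):
    """Mask each image with the matte so neither is double-exposed, then add them together."""
    matte = green_screen_matte(foreground)[..., None]   # one matte value per pixel
    mixed = foreground * matte + background * (1.0 - matte)
    return mixed.astype(foreground.dtype)

# A "traveling matte" is this same matte recomputed on every frame,
# so the mask follows the actor as he moves in front of the background plate.
```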
A director may want to use “previsualization” (digital graphics used to create a rough version of a sequence often with digital counterparts to the actual actors), which can then be edited, and even have music and dialogue added before the director films the final version. An editor can also add or remove elements from the frame digitally. He can cut people out of the scene, move them around, move in for a close-up; basically, he can direct the movie in three dimensions in the cutting room. These types of innovations are virtually unlimited. Although they dramatically increase the editor’s options, he and the director have to decide to what degree the technology serves the movie.
CHAPTER 9
* * *
THE SOUND AND MUSIC
SOUND
The film editor is, in a sense, both a picture and sound editor, because the two elements are completely dependent on each other, and sound is always part of the editor’s decision-making process. In many ways, sound and film editing are similar: both are often the most effective when they are imperceptible, giving invisible support to the story. However, there are differences. For example, in a movie the sound is instantaneous, but the picture is slightly delayed, since it takes a few frames to absorb it. And because sounds can be heard from many directions, one sound doesn’t necessarily have to replace another. Sound can also be isolated, made louder, or heard off screen, all of which makes it more flexible and versatile than the movie’s images.
Sound can be “synchronized” (recorded during shooting) and “post-synchronized” (recorded afterward), and it can be used to add either imaginative elements, such as the sounds of a monster, or realistic “foley” sounds (recorded in sync with the onscreen action, such as footsteps or punches).
Film editors will add some sound effects to their first cut to punch up a scene or just to enhance the synchronized sound that already exists. The “sound designer,” who is responsible for the postproduction effects, may end up keeping some of this temporary sound, but will usually replace virtually all of it with elements from his own library for the “final mix,” when separate tracks of dialogue, music, and sound effects are equalized and combined into one track by technicians on a soundstage.
Dialogue
Dialogue is a crucial part of the film editor’s process. It, too, will be both synchronized and post-synchronized, and there are three kinds of post-synchronized dialogue:
—“Looping” or “ADR” (automated dialogue replacement): replacing dialogue by recording it on a soundstage in sync with the original picture. Looping becomes necessary because of poor sound conditions or flawed performances during shooting, or a need for additional exposition.