Updated: 07/16/2005

 


Ultimately, digital image acquisition will definitely replace film for TV production. You don't need a crystal ball to know that.

-Ed Nassour, senior VP of postproduction,
20th Century Fox Television


New Insights in an Ongoing Debate:

 Film vs. Videotape

 

Which is better: film or videotape?

The fact is, each is superior in a number of ways; it depends on your needs. At the same time, we should acknowledge that much of the information about the "inferiority of video" is no longer valid. Even so, old beliefs persist. 

Let's look first at the advantages of film.


Advantages of Film

Since the TV Production modules concentrate on television, we tend to emphasize the advantages of video. But someone is bound to ask: "Why is most dramatic television still shot on film?"

There are several reasons.

After more than 100 years of 35mm film production, a rich and highly sophisticated tradition has grown up around film. Unlike video production where newcomers may quickly find themselves functioning as camerapersons and even in some cases as directors, the feature film tradition typically involves long, highly competitive apprenticeships.

Less motivated people tend to drop out in favor of those who are more talented, persistent, and dedicated.

Because of film's rich heritage, the production and postproduction processes have not suffered from a lack of talent or supporting industries. In Southern California alone, thousands of companies specialize in various aspects of film production.

Comparing the closing credits of a major film feature with those of a typical video production provides some measure of the differences that still exist between the two media. (Try sitting through the closing credits of Pearl Harbor or The Day After Tomorrow!)

For decades, film has enjoyed consistent worldwide standards. A 16mm film can be broadcast on any of the world's broadcast systems, regardless of the broadcast standard, and a 35mm film can be shown in almost any theater in the world.

Video, on the other hand, has not only progressed through numerous tape formats, but there are now a half-dozen incompatible broadcast standards being used in various parts of the world. For producers with an eye on international distribution, film has for decades been the obvious choice.

However, with the move to HDTV, producers now see a way of covering all the bases: a medium that can be used with any of the world's SDTV and HDTV standards — and converted to film for theatrical use. Specifically, many productions are now being shot on 1080/24p video, which can later be transferred to film if the need arises.

Relative Equipment Durability

It is commonly assumed that film equipment is more durable than video equipment because film cameras are mechanically much simpler. Although this may be true, professional video equipment—especially with the new solid-state recording media—has been used under quite extreme conditions without breaking down.

Because film cameras are not as mechanically complex, they are cheaper to maintain than video cameras. At the same time, color film stock stored or used in extreme temperatures can suffer problems such as color shifts.

Although video equipment used to be rather fragile and unreliable, this is no longer the case—at least with professional video equipment. In almost any application today, film vs. video durability is no longer a significant concern.


Technical Quality Compared

It is commonly believed that 35mm motion picture film, as viewed on television, looks better than video. If we are talking about artistic differences, film has a definite advantage for the historical reasons we've noted.

Although artistic differences between film and videotape are difficult to measure, purely technical differences are not. This brings us to the following statement.

    If production conditions are controlled and if comparisons are made solely on the basis of sharpness and color fidelity, the best 35mm film will be slightly inferior to the best video—assuming the latest professional-quality video equipment is used and the final result is broadcast.

As controversial as this statement might be with some film people, the reason becomes obvious when the production process for each medium is traced.

First, it is important to realize that if a signal from a video camera is recorded with the highest-quality process, no discernible difference will be noted between the picture coming from the camera and the picture that is later electronically reproduced.

With film intended for broadcast the process is far more complex.

First the image is recorded on negative film. The original negative film is used to make a master positive, or intermediate print. From the master positive a "dupe'' (duplicate) negative is created; and from that a positive release print is made. This adds up to a minimum of three generations.

At each step things happen: color and quality variations are introduced by film emulsions and processing, there is a general optical degradation of the image, and the inevitable accumulation of dirt and scratches on the film surface starts.
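The cumulative effect of these generations can be illustrated with a simple compounding calculation. Note that the 90 percent per-generation retention figure below is purely hypothetical, chosen only to show how losses multiply; real figures depend on the film stocks and lab work involved.

```python
# Hypothetical illustration: assume each duplication generation retains
# 90% of the previous generation's effective sharpness (made-up figure).
RETENTION = 0.90

stages = ["camera negative", "master positive", "dupe negative", "release print"]

quality = 1.0  # the camera negative is our 100% reference
for stage in stages:
    print(f"{stage}: {quality:.0%} of original sharpness")
    quality *= RETENTION
```

Even at a modest per-generation loss, three generations leave the release print noticeably softer than the camera negative — one reason the digital intermediate process discussed later in this module became attractive.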

After all of these steps, the film release print is projected into a video camera to convert it to an electronic signal—which is where the video signal started out in the first place. (We'll talk about video's brightness range limitations, etc., a little later.)

To understand the film-video difference we must also bear several other factors in mind. Film is theoretically capable of resolving several times more detail than standard video. But, since film loses much of its sharpness on the route from film camera to television camera, electronic image enhancement is routinely used to restore sharpness when the film is converted to video. Although image enhancement sharpens the overall look of the film image, subtle details, once lost, cannot be enhanced back into existence.

At the same time, video is becoming capable of resolving ever-greater levels of fine detail. Eastman Kodak has announced a CCD chip capable of holding 16,777,216 pixels per square inch, which is double the resolution of standard 35mm film. Another company, Foveon, has announced a relatively inexpensive CMOS-type chip that is not only capable of the same resolution as film, but has a tonal scale and brightness range that is reportedly equal to film.

But the sharpness of video isn't necessarily a plus. Many people consider the slightly softer look of film to be one of its advantages. For one thing, the soft ambiance surrounding the film image is subconsciously, if not consciously, associated with "Hollywood filmmaking." There are also subtle tonal and color shifts in film which, while not representing the true values of the original subject matter, are subconsciously associated with film and its historical heritage. At the same time, the slightly sharper image of video is associated with news and the live coverage of events — subject matter very much in contrast with the normal fare of feature films.


Coping With Brightness Ranges

Until recently, video cameras simply could not handle the brightness range of film. (Remember the 30:1 brightness range limitation of video?)

If film exposure is carefully controlled, a bright window in the background of a scene, for example, will not adversely affect the reproduction of surrounding tones. With the limited brightness range of tube-type video cameras, the same bright window would significantly darken surrounding tones. The same problem crops up with consumer and prosumer equipment that relies on automatic exposure circuitry.

As a result of early experience with professional tube-based video cameras, many producers concluded that film had a major advantage over video. And in this respect it definitely did. However, when the latest generation of professional HDTV 24p cameras is manually set up (as opposed to relying on automatic settings), brightness range capabilities end up almost identical to those of film.
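Contrast ratios are easier to compare when converted to photographic stops (each stop is a doubling of light). A quick sketch using the 30:1 video figure mentioned earlier, plus an assumed, purely illustrative 1,000:1 ratio standing in for film's wider latitude:

```python
import math

def stops(contrast_ratio: float) -> float:
    """Convert a contrast ratio (e.g., 30 for 30:1) into photographic stops."""
    return math.log2(contrast_ratio)

print(f"Tube-era video, 30:1   -> {stops(30):.1f} stops")    # about 4.9 stops
print(f"Assumed film, 1000:1   -> {stops(1000):.1f} stops")  # about 10 stops
```

The exact latitude of a given film stock varies widely; the point of the sketch is simply that a 30:1 limit translates to only about five stops of usable scene brightness.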

As we note here, the DALSA camera not only has a resolution four times greater than today's HDTV cameras, but its brightness range equals that of motion picture film stocks.

There is also a less obvious difference between film and video. With NTSC television the film-to-video conversion process requires some technical "fancy footwork" that results in the introduction of almost subliminal effects associated with the film image on TV. 

NTSC video is transmitted at 30 frames per second, and the frame rate for film is 24 per second. (The machine shown on the left converts film images to video.) Since there is no nice, neat math for fitting 24 film frames into 30 video frames, the only way to make the conversion is to regularly scan some film frames into extra video fields.

This results in a subtle high-speed jitter, a type of artifact that has become associated (if only subconsciously) with the film image on TV.
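The standard technique is known as 3:2 (or 2:3) pulldown: alternate film frames are scanned into two and then three interlaced video fields, so every four film frames become ten fields — five video frames. A minimal sketch of the arithmetic:

```python
# 3:2 pulldown: film frames are alternately held for 2 and 3 video fields.
# Every 4 film frames -> 2 + 3 + 2 + 3 = 10 fields = 5 interlaced frames,
# which turns 24 film frames into 30 video frames each second.
FIELDS_PER_FILM_FRAME = [2, 3, 2, 3]  # repeating pattern

film_fps = 24
groups_per_second = film_fps // len(FIELDS_PER_FILM_FRAME)  # 6 groups of 4
fields = sum(FIELDS_PER_FILM_FRAME) * groups_per_second     # 60 fields
video_fps = fields // 2                                     # 2 fields per frame

print(f"{film_fps} film frames -> {fields} fields -> {video_fps} video frames")
```

The frames that are held on screen for three fields instead of two are what produce the subtle judder noted above.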

With the SECAM and PAL broadcast standards used in non-NTSC countries the conversion process is easier.  Both of these video systems operate at 25 frames per second—very close to the 24 fps used in film.  The 1 fps difference is almost impossible to detect, so adjusting the film camera or projector rate to 25 fps is a common solution.
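Running 24 fps material at 25 fps speeds everything up by roughly 4 percent, which slightly shortens the running time (and raises the audio pitch a bit, something a transfer can correct). A quick check of that arithmetic, using a hypothetical 90-minute feature:

```python
film_fps, pal_fps = 24, 25

speedup = pal_fps / film_fps   # ~1.042, i.e., about a 4% speedup
runtime = 90                   # hypothetical 90-minute feature
pal_runtime = runtime * film_fps / pal_fps

print(f"Speedup factor: {speedup:.3f}")
print(f"A {runtime}-minute film runs {pal_runtime:.1f} minutes at 25 fps")
```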


DI - the Digital Intermediate Step

By 2005, major motion pictures were taking advantage of digital imaging as an intermediate step between the color negative film shot in the camera and the final release prints copied for use in theaters. (Here we are talking about films made for theatrical release.)

Scanning the film into digital form provides much more control over color correction and artistic color changes. And, of course, once in digital form, many special effects become possible.


Uncompressed Video

One of the quality compromises involved in HDTV has been the need to compress the signal. However, as the cost of digital recording and storage has decreased, we are seeing some production facilities move to uncompressed (4:4:4, 10 bit) video recording and editing. Silence Becomes You, released in 2005, was billed as the world's first uncompressed 4:4:4 feature production—shot with a video camera and later converted to film.

Once this is more widely adopted, we'll see a major jump in image quality and post-production speed and economy, making the switch to HDTV even more attractive.


Digital Cinema

So-called digital cinema or e-cinema (electronic cinematography) is rapidly gaining ground, especially since it is becoming almost impossible in theaters to distinguish between it and film. E-cinema is now preferred by many independent "filmmakers," and major "film" competitions now have more entries on video than on film.

The major weakness in the move to digital cinema has been the projectors. But the latest generation is based on projector imagers with 4-megapixel resolution — twice that of the previous generation. The detail possible with these projectors exceeds that of 35mm film projection.

Now the major stumbling block for digital cinema is the large initial investment in equipment — the projector and the associated computer. However, once this investment is made, major savings can be realized.

And, as Michael Goldman points out, major savings are also possible during production.


Digital imaging obviously saves us money, no question, especially in areas where we would normally shoot a lot of film....  Our crew can shoot more material in the same amount of time, and they can see what they are shooting instantly, in broadcast quality.  

Michael Goldman, Millimeter senior editor

Directors of Photography in film often resist moving to video equipment because "everything is different," and old habits and patterns of thinking are difficult to break.

For this reason, video camera manufacturers have made some of their cameras resemble film cameras in operation. The video camera shown here uses standard 35mm motion picture lenses, which means that directors of (film) photography do not have to abandon all that they have learned about those lenses.

Previously, we mentioned the almost subliminal effect that the NTSC film-to-video process creates. To make video look even more like film, even this "double-step" effect (resulting from the extra film fields being regularly added) can be electronically created. In fact, everything — right down to electronically generated random specks of "dust" — can be added to the video image!

This aside, the first practical step in creating a "film look" is the use of filters. A number of filters are commonly used to make video look like film (if that's your goal).

Film can also have a more saturated color appearance. With sophisticated video equipment this can be simulated by adjusting the color curves in a capable video editor. It can also be addressed in postproduction by running video through programs such as Photoshop, After Effects, or Chroma Match. Softening the image to smudge video's digital grid, and reducing the contrast, are additional steps toward making video look like film.

Of course, the question is why would you want to degrade the quality of one medium to match another?

Possibly it's a matter of what people get used to. When people first heard high-fidelity audio, many didn't like it. After decades of listening to music and voice on low-quality radio and phonograph speakers, they had come to accept that as "the standard" in audio quality, and anything else — even something better — didn't sound right.

The feature film 28 Days Later, released in mid-2003 and shot with video equipment, did very well at the box office.

By 2005 a number of feature films had been shot in high-definition video and then transferred to 35mm film for release in theaters.



Single-Camera, Multiple-Camera
Production Differences
 

Purely technical considerations aside, the primary underlying difference between film and video lies in the way each is shot.

Film is normally shot in a single-camera style, and video is normally shot  in the studio using a multiple-camera production approach.

In film each scene can be carefully set up, staged, lit, rehearsed, and shot. Generally, a number of takes are made of each scene and the best one is edited into the final production. As they strive for perfection in today's high-budget feature film productions, some directors re-shoot scenes many times before they are satisfied.  (Possibly the record is held by one well-known film director who reportedly shot the same scene 87 times.)

Quite in contrast, video is generally shot with several cameras covering several angles simultaneously. Instead of being optimized for one camera angle, the lighting must hold up for three or more angles at the same time. This means scenes are generally lit in a rather flat manner, which sacrifices dimension and form. And, outside of single-camera production, multiple takes in video are the exception rather than the rule.


Film and Videotape Costs 

The minute-for-minute cost of 16mm and 35mm film and processing is hundreds of times more than the cost of broadcast-quality video recording.

For example, Director of Photography Michael Caporale says, "We're realizing terrific economies by shooting with the 27V [high-definition video camera]. For Tattered Angel, tape stock was a mere $1,500 versus $104,000 for film and processing. Plus...the image quality and dynamic range [of the video] are truly impressive."

And, unlike film, tape is reusable, which results in even greater savings.

By replacing film with videotape and speeding the production process, George Lucas saved at least $3 million on the 2002 Attack of the Clones.

--Larry Thorpe, Senior VP, Sony Electronics


Offsetting the savings with video is the initial cost of video equipment. Depending on levels of sophistication, the initial investment in video production and postproduction equipment can easily be ten times the cost of film equipment. The cost of maintaining professional videotape equipment is also greater — although this is changing with the adoption of computer disk and solid-state recording.

On the other hand, there is a substantial cost savings in using video for postproduction (special effects, editing, etc.). For these and other reasons film productions intended for television are routinely transferred to videotape. This transfer can take place as soon as the film comes out of the film processor.

Reversal of the negative film to a positive image, complete with needed color correction, can be done electronically as the film is being transferred to videotape or computer disk. From this point on all editing and special effects are done by the video process. The negative film is then locked away in a film vault and kept in perfect condition.

Even for film productions intended for theatrical release, major time and cost savings can be realized by transferring the film to videotape for editing. Once edited, the videotape is then used as a "blueprint'' for editing the film.


Will Video Replace Film?

So will video soon replace film for primetime TV production?

Yes, eventually, just as it will eventually replace film in motion picture work. The move is well underway.

But right now "Hollywood" has a tremendous investment in film technology. Plus, top creative personnel still typically come from a film background. And there is also this:

Film often looks better because film people have more experience and understand their medium better. Film shooters understand their tools and how to bring out the subtleties.

Videographers must be prepared to learn the language that film shooters have built over the last 100 years. It's a language made up of camera movements, filtering techniques, subtleties of focus, and depth of field. And it's a language coming into the video world through the gateway of high-definition television.

-HD pioneer Pierre de Lespinois




© 2005, All Rights Reserved